INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XIV, Issue I, January 2025
tracking algorithm checks whether the marker center lies above, below, left, or right of the center of the webcam frame. Based on this offset, the system sends updated angle commands to the Arduino, which drives the pan and tilt servos. The coordinates and size of the detected marker are used to compute the marker's center. The maximum distance-measurement range depends on the environment and on camera specifications such as resolution (megapixels) and low-light performance.
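The center-offset logic described above can be sketched in Python. The dead-zone width, step size, and function names below are illustrative assumptions and are not taken from the paper:

```python
# Sketch of the pan/tilt decision: compare the marker center with the
# frame center and step the servo angles toward it. Dead-zone and step
# values are assumed for illustration.

def marker_center(corners):
    """Average the four detected marker corner points (x, y) to get the
    marker center."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    return (sum(xs) / 4.0, sum(ys) / 4.0)

def pan_tilt_step(center, frame_w, frame_h, dead_zone=20, step=2):
    """Return (pan_delta, tilt_delta) in degrees that move the marker
    toward the frame center; both are zero inside the dead zone."""
    dx = center[0] - frame_w / 2.0   # positive: marker right of center
    dy = center[1] - frame_h / 2.0   # positive: marker below center
    pan = -step if dx > dead_zone else (step if dx < -dead_zone else 0)
    tilt = -step if dy > dead_zone else (step if dy < -dead_zone else 0)
    return pan, tilt

# Example: marker centered horizontally but above frame center in a
# 640x480 frame, so only the tilt angle is adjusted.
c = marker_center([(300, 180), (340, 180), (340, 220), (300, 220)])
print(pan_tilt_step(c, 640, 480))
```

In the full system, the returned deltas would be added to the current servo angles and written to the Arduino over a serial link each frame; the fixed step size keeps servo motion smooth at the cost of slower convergence than a proportional controller.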
The system is highly adaptable and can work with ARToolkitPlus or ARTag markers in place of the ArUco approach. It has practical applications in robotics, augmented reality, and drones.
References
1. Mustafah, Y.M., Noor, R., Hasbi, H., and Azma, A.W. (2012). Stereo vision images processing for real-time object
distance and size measurements. 2012 International Conference on Computer and Communication Engineering (ICCCE),
659-663.
2. Hossain, M. A., and Mukit, M. (2015). A real-time face to camera distance measurement algorithm using object
classification. 2015 International Conference on Computer and Information Engineering (ICCIE), 107–110.
https://doi.org/10.1109/CCIE.2015.7399293
3. Ye, Y., Tsotsos, J., and Harley, E. (2000). Tracking a person with a pre-recorded image database and a pan, tilt, and zoom
camera. Machine Vision and Applications, 12(1), 32–43. https://doi.org/10.1007/s001380050122
4. Acuna, R., and Willert, V. (2018). Dynamic markers: UAV landing proof of concept. 2018 Latin American Robotic
Symposium, 2018 Brazilian Symposium on Robotics (SBR), and 2018 Workshop on Robotics in Education (WRE), 496–
502. https://doi.org/10.48550/arXiv.1709.04981
5. Saez, J. M., Lozano, M. A., Escolano, F., et al. (2020). An efficient, dense, and long-range marker system for the guidance of the visually impaired. Machine Vision and Applications, 31, 57. https://doi.org/10.1007/s00138-020-01097-y
6. Kato, H. (2002). ARToolKit: Library for vision-based augmented reality. IEICE Technical Report, 101(652, PRMU2001 222-232), 79–86.
7. Kato, H., and Billinghurst, M. (1999). Marker tracking and HMD calibration for a video-based augmented reality
conferencing system. Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR ’99),
85–94. https://doi.org/10.1109/IWAR.1999.803809
8. Olson, E. (2011). AprilTag: A robust and flexible visual fiducial system. Proceedings of the IEEE International Conference
on Robotics and Automation (ICRA 2011), 3400–3407. https://doi.org/10.1109/ICRA.2011.5979561
9. Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. J., and Marín-Jiménez, M. J. (2014). Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognition, 47(6), 2280–2292. https://doi.org/10.1016/j.patcog.2014.01.005
10. Dandil, E., and Çevik, K. K. (2019). Computer vision-based distance measurement system using stereo camera view.
2019 3rd International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), 1–4.
https://doi.org/10.1109/ISMSIT.2019.8932817
11. Jun, J., Yue, Q., and Qing, Z. (2010). An extended marker-based tracking system for augmented reality. Proceedings of
the 2010 Second International Conference on Modeling, Simulation and Visualization Methods (WMSVM), 94–97.
https://doi.org/10.1109/WMSVM.2010.52
12. Ababsa, F., and Mallem, M. (2004). Robust camera pose estimation using 2D fiducials tracking for real-time augmented
reality systems. VRCAI ’04. https://doi.org/10.1145/1044588.1044682
13. Latifah, A., Saripudin, Aulawi, H., and Ramdhani, M. (2018). Pan-tilt modelling for face detection. IOP Conference Series: Materials Science and Engineering, 434, 012204. https://doi.org/10.1088/1757-899X/434/1/012204
14. Torkaman, B., and Farrokhi, M. (2012). Real-time visual tracking of a moving object using pan and tilt platform: A Kalman
filter approach. 20th Iranian Conference on Electrical Engineering (ICEE2012), 928–933.
https://doi.org/10.1109/IranianCEE.2012.6292486
15. Intel. (2008, October). Intel Open Source Computer Vision Library, v1.1pre. http://sourceforge.net/projects/opencvlibrary/
16. Chakravorty, T., Bilodeau, G., and Granger, E. (2020). Robust face tracking using multiple appearance models and graph relational learning. Machine Vision and Applications, 31, 23. https://doi.org/10.1007/s00138-020-01071