INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,  
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)  
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XIV, Issue X, October 2025  
Development of Autonomous Agricultural Vehicle as New Trends  
of Agricultural Robotics  
1Punit Kumar Chaubey, 2Umakant Singh, 3Sanjeev Kumar Pathak
1Associate Professor Dept. of CSE, 2Assistant Professor Dept. of CSE, Bansal Institute of Engineering and Technology,  
Lucknow.  
2Assistant Professor Dept. of CSE, United University, Prayagraj.  
3Assistant Professor Dept. of CSE-AI, Bansal Institute of Engineering and Technology, Lucknow, Uttar Pradesh.  
Abstract: CCTV cameras are widely used to monitor traffic, but they often come with limitations that still force officials to check  
traffic flow manually. In this work, we introduce a simple and automatic method to measure traffic volume and vehicle speed by  
analyzing pixel patterns from CCTV video. Our approach begins by marking vertical and horizontal reference lines on each lane  
and collecting pixel information from these lines in every video frame. When a vehicle crosses these lines, the change in pixel  
brightness clearly reveals its movement. By studying these brightness changes, we can automatically detect vehicles and calculate  
key traffic parameters accurately.  
Keywords: CCTV image big data; pattern change graph analysis; vehicle detection.
I. Introduction  
Video Image Detection Systems (VIDSs) and CCTV cameras are commonly employed for monitoring traffic conditions. Previous  
studies show that VIDSs can identify vehicles, measure traffic flow, and estimate vehicle speed. Despite these advantages, current  
VIDS setups face certain challenges. Since cameras are mounted at fixed and often limited viewing angles, only a portion of the  
roadway is captured, reducing the accuracy and number of vehicles detected. A similar need for improvement exists in the  
agricultural sector. Although many modern tools help ease farming activities, farmers still spend considerable time manually  
inspecting large crop fields. To reduce this effort, there is a growing demand for an agricultural vehicle that can move autonomously  
between crop rows and handle multiple operations efficiently. An ideal system should navigate accurately without drifting off its  
path, avoid obstacles, and choose the most efficient route across the field. This paper presents a prototype of an autonomous  
agricultural vehicle designed specifically for row-based crop layouts. The system integrates sensors with a NodeMCU controller to  
achieve real-time path detection and obstacle avoidance. The sensors continuously monitor the route to be followed, identify any  
obstacles in the path, and guide the vehicle back to its intended track if deviations occur. With these capabilities, the autonomous  
vehicle can systematically cover the entire field, significantly reduce inspection time, and enhance overall farming efficiency with  
minimal human involvement.  
According to the Food and Agriculture Organization (FAO), the global population is projected to rise to nearly 9.1 billion by 2050.  
Meeting the food requirements of this growing population will demand at least a 70% increase in food production, with developing  
nations needing to nearly double their current output. However, traditional farming practices, combined with the steady decline in  
available agricultural labor, are making farming less efficient and economically unsustainable. In this scenario, the development of  
intelligent and autonomous agricultural machinery becomes crucial for enhancing both the quantity and quality of farm produce.  
Modern agriculture must also address critical issues such as excessive use of agrochemicals, energy conservation, environmental  
pollution, and the impacts of climate change. Automation has already shown promising results in improving agricultural  
productivity. This paper presents a comprehensive review of existing research on agricultural automation and highlights the global  
advancements made toward integrating autonomous systems into farming operations.  
Index Terms - Autonomous vehicle, NodeMCU, Arduino IDE, Sensors, RHEA.
II. Background
India is an agriculture-dominated country [1]. As the population grows, the need for crop production increases [2], so more output must be achieved in less time. This time reduction becomes possible when machinery is introduced into the agricultural field. The idea of mechanizing agriculture is not new: farmers have long used different kinds of equipment for different processes. Initially they only added a few techniques to the traditional methods of irrigation, cultivation, cropping and so on, but nowadays they prefer the equipment available in the market, which lets them work efficiently and effectively.
Looking at the history of agriculture, one can see that it has evolved from a purely manual occupation into a high-level business employing a wide range of tools and equipment [3] with both human and technological intervention. Driven by social change and population growth, researchers have been moving toward autonomous systems that can do the work more suitably and at low cost. As early as the 1920s, systems were developed that could drive a vehicle across a field to carry out multiple tasks [4,5]. Automatic, remote-controlled, wired and human-controlled vehicles followed in the 1950s and 1960s [6]. Combinations of automatic and computer-assisted equipment with sensors, such as image-processing sensors that
give vision to the machine as well as to the operator, appeared in the 1980s. Such systems also provided guidance, and in the same period an orange-harvesting robot driven by a programmed controller was developed [7]. In the following decade the situation changed again: the needs of people and of farmers became different, and nobody wanted to spend the whole day working in the field. Development and advancement in agriculture therefore became necessary, and the sector has been turning steadily into a business ever since. There was also a requirement for a machine that could travel over the agricultural field effectively, with additional accessories to ease the work and reduce the time required. Thus the agricultural vehicle came into consideration: a vehicle able to run over the crop field without causing problems while travelling and inspecting it. The agricultural sector ranked among the most pressing domains to address with high precision. The demand was for a vehicle that could improve fertility, accuracy and operational safety while reducing time and cost. All of this became possible only because of advances in electronics, computers and computing technologies. Guidance technologies such as mechanical guidance, radio navigation, optical guidance and ultrasonic guidance were developed and incorporated into vehicles to perform the desired work [8,9].
After these experiments came robotic systems that made things autonomous: autonomous vehicles were developed that could run over the agricultural field and perform multiple tasks in a single pass, reducing time and cost without the involvement of anybody [10]. Everything became autonomous. In the agricultural sector, the biggest task for a crop field is inspection. Many products for this purpose are already available in the market, but a product is needed that reduces inspection time and, with some enhancement, performs more than inspection alone; researchers continue to work on this [11]. They have introduced many kinds of vehicles, including aerial and ground vehicles and robots. For effective work, an agricultural vehicle should be compact and autonomous, and intelligent enough to control its navigation in every kind of field, including in-line crop fields. This is only possible by applying artificial intelligence to the system [12].
From the beginning, then, researchers have been trying to build a vehicle that performs work in the agricultural field. First came remote-controlled vehicles, then automatic vehicles, and now autonomous vehicles. A remote-controlled vehicle, as the name suggests, is controlled through a wired remote, so a person has to be present in the field to drive it. The next generation, the automatic vehicle, is controlled wirelessly: the operator must concentrate fully, but no physical presence is needed. In an autonomous vehicle, however, no person is required [13]; the system performs the specific work carefully according to the instructions given through code, and the vehicle carries enough code to make it intelligent in its decision making. Many kinds of integrated circuit boards are available in the market for this purpose, such as the Arduino UNO and the Raspberry Pi. Each board can be connected to the Arduino IDE, where a program can be flashed and the desired output obtained as written in the IDE. Here a NodeMCU board is used. In any working field, the interaction of an autonomous vehicle should be systematic and appropriate [14]. The scientific literature reports many studies on autonomous robots for agricultural applications [15][16]. Agricultural robots are made for cropping, irrigation, crop-health monitoring, inspection and much more, but they are often static and do not move by themselves. This paper describes a self-operating vehicle that runs in the field without any involvement of manpower [17], chooses its own path without distraction using only a few sensors, and avoids obstacles present in the field. It can make a 180-degree turn and select the optimum path again. All of this is achieved by code flashed onto the NodeMCU board. The vehicle needs a battery backup so that it can run without being connected to a laptop, computer or any wire.
National Overview of Agricultural Automation  
The agricultural machinery market in India has experienced significant expansion, with an estimated compound annual growth rate  
(CAGR) of more than 10% during 2013-2018. Mechanization levels vary sharply across the country: states such as Haryana and
Punjab exhibit high levels of automation, while many northeastern states still rely heavily on traditional farming methods. Factors  
such as rapid urbanization, increasing population, and the growth of non-agricultural sectors have contributed to reduced farm labor  
availability, ultimately affecting agricultural productivity. The Indian farm equipment industry, valued at nearly USD 6.5 billion, has  
shown strong and consistent growth in recent years. Studies suggest that the adoption of agricultural automation can enhance overall  
productivity by up to 30% while reducing input costs by roughly 20%. These trends highlight the critical role of mechanization and  
automation in strengthening India’s agricultural sector.  
Challenges for Automation of Agriculture  
Land holdings in India are generally small and fragmented, making it difficult to use large or advanced machinery.  
Farmers often face challenges related to the affordability and financing of modern farm equipment.  
The procurement process for agricultural machinery is often inefficient, and after-sales service quality remains poor in many  
regions.  
There is an excessive dependence on tractors, while other useful types of machinery remain underutilized.  
The agriculture sector was expected to employ nearly 205 million people by 2019-20.
Around 60% of the rural population is engaged in agriculture and allied activities.  
Farm labour availability is projected to decline by approximately 26% by the year 2050.  
Most agricultural fields are small in size, making them unsuitable for operating large and heavy farm machines.  
III. Review of Literature  
To achieve a truly autonomous agricultural system, the two major functional components (autonomous navigation and autonomous task execution) must be combined into a unified hardware and software architecture. This integrated system should share sensors,
planning modules, and decision-making processes for both movement and field operations, while also minimizing the amount of  
hardware required without compromising performance. The architecture must be designed to accommodate a wide range of sensors,  
actuators, and commercial agricultural devices developed by different research teams. In addition, it should support multiple standard  
communication protocols commonly used in modern agricultural technologies. A modular design is essential to ensure smooth  
interaction between sensors and devices, efficient organization of perception and processing units, and flexible actuation control,  
especially given the diversity of available technologies.  
As an initial step toward this goal, this paper concentrates on developing an architectural framework for mobile autonomous vehicles  
that can work cooperatively as a fleet in agricultural environments. For such systems to be practical and effective, features such as  
hardware reliability, plug-and-play capability, and ease of programming are critical. Alongside these, modularity, expandability,  
ergonomic design, ease of maintenance, and cost-effectiveness are also vital to encourage wider adoption of autonomous systems in  
real-world agricultural settings.  
The proposed configuration incorporates these essential characteristics, while additional considerations are explored in subsequent  
sections. The objective is to offer manufacturers of agricultural machinery practical design guidelines for automating next-generation  
equipment, particularly in the rapidly growing field of precision agriculture, where durable and efficient solutions are in high demand.  
Figure 1: General frameworks of a fully autonomous crop operation. (a) Core components of an agricultural vehicle guidance system: navigation sensor outputs, motion (kinematic and dynamic) models, perception and environment characterization, localization (absolute and relative position and heading), row detection, the navigation planner and navigation execution, path following, communication systems, and actuator systems (throttle control). (b) Fundamental elements of autonomous field implements: the implement model, perception and localization systems, agricultural knowledge, switching and synchronization, and actuator systems (valve and actuator control).
The research described in this paper forms part of the RHEA project, an FP7 initiative funded by the European Commission. The  
primary objective of RHEA is to develop a new generation of autonomous agricultural vehicles capable of performing both chemical  
and mechanical crop management. The goal is to minimize the use of agricultural inputs, reduce environmental pollution, ensure  
better crop quality and safety, and ultimately lower production costs.  
To achieve these outcomes, the project focuses on several key research areas:  
(a) advanced perception technologies for assessing crop conditions, including reliable crop-row detection;  
(b) innovative actuation mechanisms for precise application of fertilizers and herbicides, as well as targeted weed removal; and  
(c) the creation of a fleet of compact, safe, reconfigurable, and complementary robotic units designed to operate cooperatively across  
entire agricultural fields.  
Supporting this scientific work are additional technological developments in:  
(d) robust communication and localization systems for coordinating robot fleets;  
(e) improved simulation platforms and collaborative graphical user interfaces; and  
(f) next-generation fuel cell solutions to provide clean, efficient power (see Figure 2).  
Figure 2: The RHEA fleet (ground mobile units and implements).
A straightforward approach to designing a fully autonomous agricultural system is to integrate the two subsystems described earlier, an autonomous vehicle and an autonomous implement, into a coordinated operational unit. Achieving this integration requires a
communication structure in which a Main Controller combines the behavior of both subsystems into a unified control strategy,  
allowing the entire system to function as a single robotic entity. In this configuration, the autonomous agricultural system can be  
organized into three primary modules: the vehicle, the implement, and the controller.  
The Vehicle: The vehicle module is responsible for managing the motion of the implement, ensuring that its position and orientation  
are accurately controlled according to the type of crop and the specific agricultural operation being performed. Since the vehicle  
typically carries or tows the implement, it also supplies the necessary power, which makes it essential for the vehicle to include  
standard mechanical interfaces (e.g., three-point hitch), electrical generation systems, and hydraulic pumps.  
Commercial agricultural tractors already incorporate these components, making them a practical and efficient platform for conversion  
into autonomous vehicles. Leveraging commercially available tractors accelerates system development by reducing the need for time-  
intensive tasks such as chassis construction, manual assembly, mechanical testing, and vehicle certification. This approach also  
improves system reliability, as key elements such as the engine, transmission, steering, braking systems, and housing have been
extensively validated under real-world operating conditions.  
Alongside these considerations, system safety, robustness, and operational efficiency remain essential factors when designing the  
structure of the autonomous vehicle subsystem.  
The vehicle selected for the RHEA project was the CNH Boomer-3050, a 51-hp (37.3-kW), 1200-kg compact agricultural tractor.  
Its modified cabin, cleared of all original interior components, served as the housing for the onboard computing systems  
responsible for perception, communication, localization, safety, and actuation. Several external components, such as the vision cameras, laser sensors, GPS antennas, communication antennas, and emergency stop buttons, were mounted outside the cabin to
ensure proper operation. The complete system is organized into the following subsystems:  
Weed Detection System: A machine-vision-based module designed to identify weed patches within the crop field.
Crop Row Detection System: A vision-based guidance system that assists in maintaining accurate vehicle alignment with crop  
rows.  
Obstacle Detection System: A forward-facing laser range finder used to sense obstacles in the vehicle’s path.  
Communication System: Equipment enabling connectivity between the operator station, autonomous units, and portable  
devices used by field personnel.  
Two-Antenna GPS System: A dual-antenna setup used for precise vehicle positioning and orientation during field operations.  
Inertial Measurement Unit (IMU): A sensor module that supports the GPS system by providing additional motion data for  
improved localization accuracy.  
Vehicle Controller: The subsystem responsible for computing steering, throttle, clutch, and braking commands needed for  
accurate path tracking. These actuations are typically accessed through the vehicle’s CAN bus interface.  
Central Controller: The main decision-making system that gathers data from all perception modules and determines the  
appropriate actions for the actuation subsystems.  
Auxiliary Power Supply: A fuel-cell-based energy unit monitored by the central controller to ensure continuous, clean, and  
efficient power delivery.  
Figures 3(a) and 3(b) show the original CNH Boomer T3050 and its modified autonomous version, respectively. In the redesigned  
model, the cabin has been reduced to accommodate the embedded computing systems, while additional components, including the fuel cell, a roof-mounted solar panel, an antenna support bar, and the internal equipment layout, are clearly visible. The antenna bar
and equipment distribution are further detailed in Figures 3(c) and 3(d).  
Implements: Implements are external devices designed to perform specific crop-related actions, such as herbicide and pesticide  
application or mechanical and thermal weed control. Modern precision agriculture demands that these components operate  
independently so that treatments can be applied only where necessary. To achieve this, nozzles, burners, and other functional elements  
often incorporate positioning systems that enhance targeting accuracy. Programmable logic controllers (PLCs) and onboard  
computers are typically used to manage these components and synchronize their operation with the autonomous vehicle. Within the  
RHEA project, three distinct implements have been developed:  
Boom Sprayer  
The herbicide boom sprayer [23], intended for cereal crops (see Figure 4), features a 5.5-m boom equipped with 12 independently  
actuated nozzles spaced at 0.5-m intervals. Mounted on the vehicle, the implement carries two herbicide tanks with capacities of 200  
L and 50 L, enabling mixing of different formulations. The Main Controller governs both the herbicide flow and the boom’s  
folding/unfolding mechanism.  
Mechanical-Thermal Weed Control Implement  
The mechanical-thermal system [25], used for flame-resistant crops such as maize, garlic, and onion (see Figure 4(b)), includes four  
pairs of burners mounted on a frame designed to cover four adjacent rows. The vehicle tows the implement and controls its lateral  
alignment relative to the vehicle’s path. Burner intensity is dynamically regulated based on data from the machine-vision-based Weed  
Detection System, which estimates weed density as the percentage of vegetation within a standard area unit (typically 0.25 m × 0.25  
m). The vehicle controller also manages the folding/unfolding system of this implement.  
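To make the weed-density figure concrete, the short C++ sketch below shows one way such a percentage could be computed from a binary vegetation mask; the mask representation, cell size and the weedDensity() helper are illustrative assumptions, not details taken from the RHEA implementation. For a 0.25 m x 0.25 m cell, the cell size in pixels would follow from the camera's ground resolution.

// Hedged sketch (not the RHEA code): weed density per cell as the percentage of
// "vegetation" pixels in a binary mask. Names and representation are assumptions.
#include <vector>
#include <cstddef>

// rows x cols binary mask: 1 = vegetation pixel, 0 = soil/background.
using Mask = std::vector<std::vector<int>>;

// Percentage of vegetation pixels inside one square cell of the mask.
double weedDensity(const Mask& mask, std::size_t top, std::size_t left,
                   std::size_t cellPx) {
    std::size_t vegetation = 0, total = 0;
    for (std::size_t r = top; r < top + cellPx && r < mask.size(); ++r) {
        for (std::size_t c = left; c < left + cellPx && c < mask[r].size(); ++c) {
            vegetation += mask[r][c];
            ++total;
        }
    }
    return total ? 100.0 * static_cast<double>(vegetation) / total : 0.0;
}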
Figure 3: Description of RHEA mobile unit modifications. (a) Initial commercial tractor. (b) Final RHEA mobile unit after modifications. (c) External equipment mounted on the mobile unit (communication antennas, camera, GPS antennas, obstacle detection system, solar panel). (d) Internal equipment distribution inside the mobile unit's cabin (Main Controller, GPS, implement interfaces).
Figure 4: Implements controlled by the RHEA system: (a) boom sprayer, (b) flame hoe, and (c) canopy sprayer.
System Architecture
For any autonomous system there are three steps: (a) building the mechanical structure, (b) building the control system, and (c) executing the specific work. The system architecture of the project is shown in Fig. 5; it describes how the system devices are physically connected to each other.
Seven ultrasonic sensors are used: four ultrasonic sensors track the path and one ultrasonic sensor detects obstacles. Four DC motors are used, where the left-side front and rear wheels are connected in the same way, and the right-side front and rear wheels are connected in the same way, to the NodeMCU as shown in the block diagram. All sensors are connected to the microcontroller as input components providing environmental data, while the motor driver is connected to the microcontroller as an output that steers the wheels as required.
Every connection is made through the input/output pins of the NodeMCU board. Another two ultrasonic sensors are used for taking the next row after the end of the current row; they are combined with a compass sensor [14] to detect the next path to be followed [18]. The literature covers front-wheel-drive and rear-wheel-drive configurations, but in this project four motors are connected to the four wheels for proper balance as well as for maintaining the speed of the vehicle.
Fig. 5: System Architecture hardware  
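As a rough, hypothetical illustration of how the wiring just described could be expressed in code, the Arduino-style fragment below groups the sensor and motor-driver pins on a NodeMCU; the pin choices are assumptions made only for this sketch and are not taken from the paper (only five of the seven ultrasonic sensors are shown).

// Hedged wiring sketch with hypothetical pin assignments. NodeMCU (ESP8266)
// pin aliases D0-D9 come from the ESP8266 Arduino core. Four side-mounted
// HC-SR04 sensors track the rows, one front sensor detects obstacles, and the
// L293D drives the left and right wheel pairs as two channels.
const int TRIG_PIN         = D0;   // shared trigger line (an assumption, to save pins)
const int ECHO_LEFT_FRONT  = D1;
const int ECHO_LEFT_REAR   = D2;
const int ECHO_RIGHT_FRONT = D3;
const int ECHO_RIGHT_REAR  = D4;
const int ECHO_OBSTACLE    = D5;

// L293D direction inputs: one pair per side (front and rear wheels of a side
// are wired together, as described in the text).
const int LEFT_IN1  = D6;
const int LEFT_IN2  = D7;
const int RIGHT_IN1 = D8;
const int RIGHT_IN2 = D9;   // hypothetical; D9/D10 map to RX/TX on some boards

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_LEFT_FRONT, INPUT);
  pinMode(ECHO_LEFT_REAR, INPUT);
  pinMode(ECHO_RIGHT_FRONT, INPUT);
  pinMode(ECHO_RIGHT_REAR, INPUT);
  pinMode(ECHO_OBSTACLE, INPUT);
  pinMode(LEFT_IN1, OUTPUT);
  pinMode(LEFT_IN2, OUTPUT);
  pinMode(RIGHT_IN1, OUTPUT);
  pinMode(RIGHT_IN2, OUTPUT);
}

void loop() {
  // Navigation logic is sketched in the later fragments.
}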
The autonomous vehicle consists of many hardware components; some of the hardware used in this project is shown in the figures given below:
Fig.6: NodeMCU  
The system has two front wheels and two rear wheels. The right-side front and rear wheels are connected to the NodeMCU in the same way, and likewise for the left side. The wheels are controlled using ultrasonic sensors nos. 2, 3, 4 and 5. These sensors have a threshold limit: when the measured values are under the threshold, the vehicle moves straight; if both left-side sensors return greater values, it takes a right turn, otherwise it takes a left turn. The distances from the plants are kept almost equal on both sides.
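This rule can be written compactly as a decision function. The sketch below follows the behaviour stated above; the 30 cm threshold and all names are assumptions made for illustration.

// Hedged sketch of the threshold rule; threshold value and names are illustrative.
enum SteerCommand { GO_STRAIGHT, TURN_LEFT, TURN_RIGHT };

const long SIDE_THRESHOLD_CM = 30;   // assumed row-distance threshold

SteerCommand steerFromSideSensors(long leftFront, long leftRear,
                                  long rightFront, long rightRear) {
  bool allUnder = leftFront  < SIDE_THRESHOLD_CM && leftRear  < SIDE_THRESHOLD_CM &&
                  rightFront < SIDE_THRESHOLD_CM && rightRear < SIDE_THRESHOLD_CM;
  if (allUnder)
    return GO_STRAIGHT;                               // both rows close enough
  if (leftFront >= SIDE_THRESHOLD_CM && leftRear >= SIDE_THRESHOLD_CM)
    return TURN_RIGHT;                                // both left sensors read greater values
  return TURN_LEFT;
}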
Fig. 7: Ultra-Sonic Sensor  
The whole system is mounted on a chassis made of lightweight material, 5 mm thick fibreglass. The chassis has holes drilled according to requirement, and all components are fitted together in this chamber.
Fig. 8: Motor Driver (L293D)  
The NodeMCU is a single-board microcontroller based on the ESP8266, so additional tasks can also be performed with it. It provides 13 general-purpose input/output pins, so any input or output device can be connected to any of these 13 pins. Many motor drivers are available in the market, but since both wheels on each side are driven in the same manner, the motors are connected through an L293D motor driver.
Fig.9: Geared Motor  
Front Drive:
Two HC-SR04 ultrasonic sensors are mounted on the right side of the vehicle, one at the front and one at the rear; similarly, two ultrasonic sensors are mounted on the left side. When all four sensors read values under the limit, the vehicle moves forward; when the sensors on one side read greater values, the vehicle turns to the other side. This continues until the whole field is covered. Three ultrasonic sensors are mounted at the front of the vehicle. The centre ultrasonic sensor detects obstacles present in the track, upon which the vehicle automatically turns back to the starting point of the row and selects another row for further motion. The other two sensors are combined with an LSM303 compass sensor.
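For reference, a typical HC-SR04 reading routine looks like the hedged sketch below; the timing values are the usual datasheet figures, not numbers quoted in this paper.

// Hedged sketch of reading one HC-SR04 sensor (pins are the hypothetical ones
// defined in the earlier wiring sketch).
long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);       // 10 us trigger pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  // Echo pulse width in microseconds; 30 ms timeout if nothing is in range.
  long duration = pulseIn(echoPin, HIGH, 30000);
  if (duration == 0) return -1;      // no echo received within the timeout
  // Sound travels roughly 0.0343 cm/us; divide by 2 for the round trip.
  return duration * 0.0343 / 2;
}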
Fig.10: Wheel  
Turning action at the end of a row:
When the four side-mounted ultrasonic sensors read values greater than the threshold limit, this signals the end of the row. Then, through the combined action of the compass sensor and the other ultrasonic sensors, the vehicle makes a 90-degree turn and then moves forward and backward to find the next row. It follows all rows in the same way until the end of the field. The flowchart [18] for the system architecture summarizes this behaviour.
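One possible shape for the end-of-row turn, assuming a helper that returns the compass heading in degrees and the motor helpers sketched after the motor description below, is given here; it is an illustrative sketch, not the authors' exact routine.

// Hedged sketch of the 90-degree turn. readHeadingDegrees() stands in for
// whatever LSM303 compass library is used (an assumption); turnRight() and
// stopMotors() are the motor helpers sketched later.
float readHeadingDegrees();   // assumed wrapper around the LSM303 compass
void  turnRight();
void  stopMotors();

// Smallest signed difference between two headings, in degrees.
float headingDelta(float from, float to) {
  float d = to - from;
  while (d > 180.0f)  d -= 360.0f;
  while (d < -180.0f) d += 360.0f;
  return d;
}

// Rotate in place until the heading has changed by roughly 90 degrees.
void turnNinetyDegrees() {
  float start = readHeadingDegrees();
  while (fabs(headingDelta(start, readHeadingDegrees())) < 90.0f) {
    turnRight();
    delay(20);                 // small step, then re-check the compass
  }
  stopMotors();
}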
Each wheel is connected to a 6 V, 0.8 A, 100 rpm geared motor, and these motors are controlled through the L293D motor driver, which is driven by the NodeMCU microcontroller. The program and the action of the motors depend entirely on the input data received from the sensors.
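The motor primitives referenced in the earlier sketches could be implemented over the L293D direction pins roughly as follows; the pin constants are the hypothetical ones from the wiring sketch above.

// Hedged sketch of the L293D motor primitives used by the steering and turning
// sketches; pin names come from the earlier (assumed) wiring fragment.
void driveSide(int in1, int in2, bool forward) {
  digitalWrite(in1, forward ? HIGH : LOW);
  digitalWrite(in2, forward ? LOW : HIGH);
}

void stopSide(int in1, int in2) {
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
}

void moveForward() { driveSide(LEFT_IN1, LEFT_IN2, true);  driveSide(RIGHT_IN1, RIGHT_IN2, true);  }
void turnLeft()    { driveSide(LEFT_IN1, LEFT_IN2, false); driveSide(RIGHT_IN1, RIGHT_IN2, true);  }
void turnRight()   { driveSide(LEFT_IN1, LEFT_IN2, true);  driveSide(RIGHT_IN1, RIGHT_IN2, false); }
void stopMotors()  { stopSide(LEFT_IN1, LEFT_IN2);         stopSide(RIGHT_IN1, RIGHT_IN2);         }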
IV. Research Methodology  
Implementation of the Main Controller: The Evolution of the RHEA Computing System
The computing system installed on the mobile units is required to interact with several subsystems, each operating on different  
platforms such as Windows, Linux, and QNX. These subsystems also rely on software created in multiple programming languages,  
including C++, .NET, and Python. In the initial design, all these components were connected through a common Ethernet network,  
where a single computer functioned as the Main Controller. This controller communicated with peripheral devices using serial  
connections or Ethernet (802.3 LAN) through a network switch. To manage the entire communication process, a Network Manager  
application was also needed on the Main Controller.  
As part of the system improvement, the first major step toward centralization involved incorporating the Weed Detection System  
(WDS) directly into the Main Controller. Since the vision camera used in the WDS complies with the GigE Vision standard, which supports high-speed video transmission, the Main Controller's dual Gigabit Ethernet ports enabled direct communication with the camera. Using LabVIEW tools for camera configuration and data acquisition made this integration easier and removed the need for
a separate vision-processing computer.  
However, this integration introduced two significant challenges. The first challenge was the reuse of the original acquisition software,  
which had been developed in C++ for Windows 7. The second was assessing whether the Main Controller could handle the additional  
processing load required by the vision algorithms. The software-reuse issue was addressed through LabVIEW's ability to interface with third-party tools, allowing external scientific libraries, such as C code, DLLs, and Code Interface Nodes (CINs), to be called from within LabVIEW. The strategy adopted was to convert the existing C++ vision code into a DLL so it could be directly loaded
and executed on the Main Controller.  
Before creating the DLL, the source code needed careful examination to identify and remove any system-specific calls, especially  
those dependent on Windows kernel libraries. Such calls could lead to conflicts when the code runs under LabVIEW's Real-Time
Operating System (RTOS). After adapting the code to be compatible with the RTOS environment, the necessary functions were  
packaged into appropriate C-based structures, following the design process originally outlined in the initial system architecture.  
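As an illustration of the kind of wrapper this implies (not the RHEA source code), a C++ vision routine can be exposed through a plain-C interface and built as a Windows DLL that LabVIEW's Call Library Function Node can load; the function name and signature below are hypothetical.

// Hedged illustration: wrapping C++ vision code behind a plain-C interface so
// it can be built as a Windows DLL and called from LabVIEW. Names and
// parameters are hypothetical.
#include <cstdint>

extern "C" {

// Processes one grayscale frame and writes a per-cell weed-density percentage
// into densityOut (cells = number of grid cells along the boom).
__declspec(dllexport) int32_t DetectWeeds(const uint8_t* frame,
                                          int32_t width, int32_t height,
                                          double* densityOut, int32_t cells) {
    if (!frame || !densityOut || cells <= 0) return -1;      // basic argument check
    // ... call into the existing C++ vision classes here ...
    for (int32_t i = 0; i < cells; ++i) densityOut[i] = 0.0;  // placeholder result
    return 0;                                                 // 0 = success
}

}  // extern "C"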
Figure 11: General schema of the robot-fleet topology for the RHEA project.
Figure 12: Overall hardware architecture designed for the autonomous mobile robot developed under the RHEA project (low-level controller, GPS antenna, RS-232 links, ground mobile unit controller, Ethernet switch, wireless router, laser scanner, and Gigabit Ethernet camera link).
The system centralization examples discussed earlier demonstrate how complex subsystems can be integrated efficiently. In addition  
to these, several other subsystems can be centralized more easily by leveraging plug-and-play interfaces. Technologies such as  
Ethernet networking via WLAN modules and switches, RS-232 communication for devices like laser sensors and inertial  
measurement units, CAN bus for industrial communication, ISO modules, and analog or digital I/O modules for low-level actuation  
make integration straightforward. The final configuration, including the main external sensing components, is depicted in Figure 10.  
To manage this hardware setup, a simple and structured software architecture was developed using LabVIEW. This architecture  
ensures that every subsystem connects smoothly to the main decision-making module, which handles high-level system behavior.  
The overall design, shown in Figure 11, is organized into three hierarchical software levels.  
The first level, identified by yellow blocks, contains driver modules that handle direct communication with sensors, actuators, and  
other hardware components, including external user interfaces. These modules act as the bridge between physical devices and the  
software system.  
Arduino IDE Software Used for Controlling Devices:
This system is built using a combination of hardware and software. The hardware components are programmed and controlled  
through the Arduino Integrated Development Environment (IDE). Although the Arduino IDE supports C++, it does not allow the use  
of every feature of standard C/C++ because of hardware limitations. To simplify interaction with physical components, the Arduino  
environment provides ready-made helper functions and a Serial Monitor for easy input and output operations.  
A microcontroller can be thought of as a compact computer embedded within a single Integrated Circuit (IC). It typically includes a  
CPU, RAM, ROM, and various input/output controllers, all packed into one chip. These compact devices are ideal for embedded
applications such as mobile phones, game controllers, remote controls, and industrial machines, where efficient and reliable operation  
is required.  
The term “Arduino” refers not only to the hardware but also to the company and the open-source project behind it. Arduino produces  
microcontroller-based development boards that are widely used in education, research, and commercial applications. Because the  
hardware designs are open-source, users can freely modify, customize, and build innovative projects according to their needs.  
The core Arduino boards are designed with simplicity in mind. They include both analog and digital I/O pins that allow users to  
connect sensors, actuators, and other electronic components. As projects grow in complexity, additional expansion boards, known as "shields", can be stacked on top of the main board. These boards also provide serial and parallel interfaces, including standard
USB ports, allowing programs written on a computer to be uploaded directly to the microcontroller.  
On the software side, Arduino supports programming in both C and C++. Essentially, any language that can be compiled into machine
code for the microcontroller can be used. The Arduino IDE, built in Java and inspired by Processing and Wiring, offers an easy-to-  
use code editor with options for compiling and uploading programs at the click of a button. It also features a message area, console  
output, and a simple toolbar.  
The Arduino IDE has its own programming structure and uses libraries specific to different boards. Each board may require different  
functions or libraries, so code must be written and uploaded accordingly. Once the code is compiled, it can be burned onto the  
appropriate board, enabling it to perform the required task.  
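A minimal sketch showing the structure the IDE expects (a setup() that runs once and a loop() that runs forever), with output visible in the Serial Monitor, might look like this; the baud rate and message are arbitrary choices.

// Minimal example of the Arduino sketch structure; values printed here appear
// in the Serial Monitor mentioned above.
void setup() {
  Serial.begin(115200);          // open the serial link used by the Serial Monitor
  pinMode(LED_BUILTIN, OUTPUT);  // on-board LED, available on NodeMCU boards
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);
  Serial.println("status: running");
  delay(500);
  digitalWrite(LED_BUILTIN, LOW);
  delay(500);
}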
Fig. 13: Arduino IDE screen
The Arduino IDE serves as the primary platform for developing and testing various projects using Arduino boards. It provides its  
own structured programming environment that allows users to write code, upload it to the hardware, and observe different  
performance outputs based on the programmed functions. In addition to the Arduino IDE, other software tools and applications, such as Blynk, can also be used to enhance project capabilities.
Applications like Blynk can be installed on a mobile device, enabling real-time monitoring of the operations carried out by the  
vehicle or hardware system. These mobile applications provide live feedback, display sensor data, and even allow remote control  
of the vehicle. With such tools, users can conveniently manage, track, and operate their projects directly from their smartphones,  
making the entire system more interactive and accessible.  
V. Conclusion  
In this work, an autonomous vehicle has been developed that is capable of making decisions on its own based on the surrounding  
conditions. Artificial intelligence is integrated into the system so the vehicle can detect obstacles, take appropriate turns to avoid  
them, and then return to its original path to ensure full coverage of the field. Initial experiments were conducted in a garden to test  
the guidance algorithm designed for row-structured crop fields. The plants were arranged with an inner row spacing of 50 cm, and  
an ultrasonic sensor was used to detect plant positions and correct any lateral deviation of the vehicle.  
To demonstrate the practical implementation of a functioning autonomous fleet using the Main Controller proposed in this study, a  
series of evaluations were carried out in a real agricultural test field as part of the RHEA project [40]. Multiple integration and  
performance tests were conducted, confirming the system’s efficiency and its ability to incorporate new components with minimal  
effort. The results are organized as follows: Sections 6.1 and 6.2 present both quantitative and qualitative findings related to the  
reduction of hardware components and the simplification of software development within a single autonomous agricultural  
platform. Section 6.3 reports the performance of a collision-avoidance algorithm, highlighting the benefits of hardware optimization  
for a fleet of autonomous robots designed for agricultural applications.  
References  
1. Kesarwani, Sonali, Mishra,Devesh, Srivastava, Anshuka and Agrawal, K.K., “Design and Development of Automatic Soil  
Moisture Monitoring with Irrigation System for Sustainable Growth in Agriculture”, International Conference on  
Sustainable Computing in Science, Technology & Management (SUSCOM) 2019.  
2. Mondal, Anindita and Mishra, I.S., “Building a Low-Cost Solution using Wireless Sensor Network for Agriculture  
Application", IEEE, 2017.
3. Tamaki K. Agriculture and Robots. Tokyo Univ of Agric, 2006; 50(4): 83-94.
4. Willrodt F L. Steering attachment for tractors. 1924; U.S. Patent No. 1506706.  
5. Sissons R. Plowing in circles saves time. The Prairie Farmer, 1939; 111(20): 7.  
6. Morgan K E. A step towards an automatic tractor. Farm Mech, 1958; 10(13): 440-441.
7. Harrell R C, Adsit P D, Pool T A, Hoffman R. The Florida Robotic Grove Lab. ASAE Transactions, 1990; 33: 391-399.
8. Reid J F, Zhang Q, Noguchi N, Dickson M. Agricultural automatic guidance research in North America. Computers and  
Electronics in Agriculture, 2000; 25: 155-167.
9. Tillett N D. Automatic guidance sensors for agricultural field machines: a review. Journal of Agricultural Engineering  
Research, 1991; 50(33): 167-187.
10. Hague T, Marchant J A, Tillet N D. Ground based sensing systems for autonomous agricultural vehicles, Computers and  
Electronics in Agriculture, 2000; 25(1-2): 11-28.
11. Marchant J A, Brivot R. Real-time tracking of plant rows using a Hough transform. Real-Time Imaging, 1995; 1(5): 363-  
71.  
12. Hague T, Tillett N D. A bandpass filter-based approach to crop row location and tracking. Mechatronics, 2001; 11(1): 1-  
12.  
13. V., Sandeep, K., Gopal, S., Naveen, A., Amudhan and Kumar, L.S., “Globally Accessible Machine Automation Using  
Raspberry Pi, Based on Internet of Things”, International Conference on Advances in Computing, Communications and  
Informatics (ICACCI), 2015.  
14. Celen, I.H., Onler,E. and Kilic,E., “A Design of an Agricultural Robot to Navigate Between Rows”, International  
Conference of Electrical, Automation and Mechanical Engineering (EAME), 2015.  
15. Bakker, T., Van Asselt, K., Bontsema, J. and Van Straten, G., “Systematic design of an autonomous Platform for Robotic  
Weeding”, Journal of Terramechanics, 47, pp. 63-73, 2010.  
16. Comba, L., Gay, P. and Ricauda Aimonino, D., “Robotics and Automation for crop management: trends and prospective”,  
International Conference on Work Safety and Risk Prevention in Agrofood and Forest Systems”, Ragusa, Italy, 2010.  
17. Anthony, James Bautista and Samuel,Oliver Wane, “ATLAS Robot: A teaching tool for Autonomous Agriculture Mobile  
Robotics", The International Conference on Control, Automation & Information Science (ICCAIS), IEEE, pp. 264-269,
2018.  
18. Wang, Fu Jaun and Zhang, Bin, “Path Tracking Control for four wheel differentially steered vision robot” IEEE, pp. 1608-  
1611.  
19. Zhan, Xiuxi, Ji, Qianyu, Wei,Shuyi, Zhang,Lingchum and Zhao,Ziwei, “The Design of underground crop digging robot  
and its control method", 28th Chinese Control and Decision Conference (CCDC), IEEE, pp. 5980-5981, 2016.
20. Bauer, Robert, Leung, Winnie and Barfoot,Tim, “Experiment and simulation Results of Wheel-Soil Interaction for  
Planetary Rover”, IEEE, 2005.  
21. Peleg, “Distributed coordination algorithms for mobile robot swarms: new directions and challenges,” in Proceedings of  
the 7th International Workshop on Distributed Computing (IWDC ’05), A. Pal, A. Kshemkalyani, R. Kumar, and A. Gupta,  
Eds., Springer, 2005.  
22. Punit Kumar Chaubey, "How IoT and Big Data Are Driving Smart Traffic Management and Smart Cities", International Journal of Engineering Research & Technology (IJERT), pp. 147-152, IJERTV11IS070084, Vol. 11, Issue 07, July 2022.