AAN Project - Saxion
1. Saxion and The RAAK-MKB Autonomous Agricultural Navigation (AAN) Project
To address the challenge of sustaining a growing global population, this project targets the development of future-proof methods for food production. Unlike traditional arable farming that relies on heavy machinery, chemical pesticides, and artificial fertilizers, this research emphasizes the use of lighter and smaller agricultural robots, aiming to reduce soil compaction and eliminate the need for plowing.
In the Netherlands, several agricultural robots are under development, but challenges remain before these robots can be widely utilized. Collaborating with six pioneers in the field, Saxion is facilitating this research, with the focus of this project on the software needed to enable robots to drive autonomously. For driving, the Robot Operating System (ROS) is utilized, a framework used worldwide by major robotics companies. The project involves modeling, simulating, and testing the robots, with the developed software already made publicly available as an open-source project during development. This approach accelerates the time-to-market for new agricultural robots and contributes to more sustainable and efficient food production practices.
In short, the AAN project focuses on exemplary configurations of a navigation stack in general and on deploying it in agricultural use cases specifically. The focus now lies on the question: "How can the ROS 2 framework be implemented to achieve autonomous navigation in several user scenarios for agricultural robots?"
When looking at the final deployment and testing in the agricultural setting, three use cases are evaluated:
• Open field navigation, which includes GNSS localization and field coverage path planning.
• Following a row of crops or trees, in which the robot uses relative localization inside of the row and switches to absolute localization to move to the next row.
• Precision docking, to facilitate docking onto rails or to charge batteries.
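To give a flavor of the field coverage path planning in the first use case, a back-and-forth (boustrophedon) sweep over a rectangular field can be sketched as follows. This is a minimal illustration with made-up dimensions, not the planner used in the project:

```python
# Minimal boustrophedon coverage sketch: sweep a rectangular field
# in parallel rows, alternating direction on each row.
# All dimensions are illustrative, not taken from the AAN project.
def coverage_waypoints(width, height, row_spacing):
    """Return (x, y) waypoints covering a width x height field."""
    waypoints = []
    x, direction = 0.0, 1
    while x <= width:
        # Drive the full row; flip direction on alternate rows.
        y_start, y_end = (0.0, height) if direction > 0 else (height, 0.0)
        waypoints.append((x, y_start))
        waypoints.append((x, y_end))
        x += row_spacing
        direction *= -1
    return waypoints

print(coverage_waypoints(2.0, 5.0, 1.0))
# → [(0.0, 0.0), (0.0, 5.0), (1.0, 5.0), (1.0, 0.0), (2.0, 0.0), (2.0, 5.0)]
```

A real coverage planner additionally accounts for the robot footprint, turning radius and obstacles; in the project this is handled inside the navigation stack.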
During the research, ROS 2 Humble is used, and Gazebo Fortress is the selected simulator. Gazebo is often used in combination with ROS 2, and this support, together with its realistic physics and sensor simulation, makes switching between simulation and real hardware more straightforward.
The project was supported by Regieorgaan SIA project RAAK-MKB Autonomous Agricultural Navigation (RAAK.MKB16.016).
2. Asimovo
One great benefit of simulation is the freedom to design worlds and the ease of testing robot applications in them. The Asimovo platform not only makes this more achievable, but also provides a way to showcase and share the project with the broader community.
The version of the AAN project available on Asimovo showcases the simulation part of the work. The simulation is divided into three ROS 2 packages, which launch the world, the robot and the navigation solution.
Users can launch the project simply by clicking RUN in the upper part of the screen, which starts every component necessary for the simulation. In the background, the solution launches some platform-specific elements, as well as a launchfile that has been prepared to start the packages of the AAN project. Users can modify this launchfile or start the packages separately according to their preferences. The last chapter of this case study describes how to launch the packages of this project separately.
For more information about the Asimovo platform, please visit https://github.com/asimovo-platform/docs/wiki
3. The world
This world was created to test the solutions developed during the project. It provides an efficient description of a generic farm environment with several distinct areas. It includes a docking station next to a farmhouse, with a QR code on the wall at the robot's starting position. Furthermore, there is a grass field, a cabbage field, a stack of wood logs and a field designed for open farming. With this world, all three use cases described above can be tested: the cabbage field is suitable for testing the following of a row of crops or trees, the open farming field provides an area to test open navigation, and the station with the QR code provides an opportunity to test precision docking.
4. The robot model
The robot used in this work is a simple three-wheeled rover, with two motorized wheels at the back and one caster wheel at the front. Using this simple model allows the initial focus to be on the navigation stack rather than the specifics of the robot used. The rover can later be replaced with the more complex robot models of the partners to test their capabilities in simulation before field testing.
The kinematic models of these robots have also been created in the scope of this project. These kinematic models are used by the controller to compute actuator commands and platform motion estimations.
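For a differential-drive rover like the one used here, the kinematic model reduces to two simple mappings between wheel speeds and the platform twist. The sketch below illustrates this; the wheel radius and separation are assumed values, not the project's actual parameters:

```python
# Differential-drive kinematics sketch.
# Geometry values are illustrative, not taken from the project's robot.
WHEEL_RADIUS = 0.1       # m (assumed)
WHEEL_SEPARATION = 0.4   # m (assumed)

def inverse_kinematics(v, omega):
    """Body twist (v [m/s], omega [rad/s]) -> (left, right) wheel speeds [rad/s]."""
    w_left = (v - omega * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    w_right = (v + omega * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    return w_left, w_right

def forward_kinematics(w_left, w_right):
    """(left, right) wheel speeds [rad/s] -> body twist (v [m/s], omega [rad/s])."""
    v = WHEEL_RADIUS * (w_right + w_left) / 2.0
    omega = WHEEL_RADIUS * (w_right - w_left) / WHEEL_SEPARATION
    return v, omega
```

The controller uses the inverse mapping to turn a commanded twist into actuator commands, and the forward mapping to estimate platform motion from measured wheel speeds (odometry).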
5. The navigation
Based on available work and best practices for robot navigation in ROS 2, the navigation stack is defined to have the components as depicted in the figure below. This includes control based on real-life or simulated hardware, localization and navigation.
As shown in the image below, the navigation framework itself consists of the following main elements working together:
BT Navigator server: The plugin running here accepts the incoming task and controls the actions of the plugins in the other servers based on a behavior tree. It is essentially a ROS action server: it receives a request from the client and reports back on the progress of the navigation task. It sends tasks to the planner, controller and behavior servers to execute their part of the navigation task.
Planner server: In the planner server, planner plugins can be loaded that accept requests from the BT Navigator plugin. The plugin plans a global path between the robot’s current pose and the requested goal pose(s).
Controller server: Like the planner server, the controller server allows multiple controller plugins to be configured at the same time, and the behavior tree will send requests to use a specific controller.
Behavior server: The plugins running in this server are called by the behavior tree when they are needed. They include any action not related to normal path planning and control, such as recovery actions (e.g. back up, spin) or task-specific actions (e.g. take a picture, dock at a charging station).
Smoother server: The smoother server can run plugins that apply smoothing to paths or velocity profiles while taking into account the available costmaps and robot kinematics and constraints.
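As an illustration of how these servers are wired together, a Nav2 parameter file loads one or more plugins into each server. The excerpt below is an abbreviated sketch using standard Nav2 plugins from the Humble distribution; the project's actual configuration ships with its packages and will differ:

```yaml
planner_server:
  ros__parameters:
    planner_plugins: ["GridBased"]
    GridBased:
      plugin: "nav2_navfn_planner/NavfnPlanner"
controller_server:
  ros__parameters:
    controller_plugins: ["FollowPath"]
    FollowPath:
      plugin: "dwb_core::DWBLocalPlanner"
behavior_server:
  ros__parameters:
    behavior_plugins: ["spin", "backup"]
    spin:
      plugin: "nav2_behaviors/Spin"
    backup:
      plugin: "nav2_behaviors/BackUp"
smoother_server:
  ros__parameters:
    smoother_plugins: ["simple_smoother"]
    simple_smoother:
      plugin: "nav2_smoother::SimpleSmoother"
```

The BT Navigator then refers to these plugin names from its behavior tree when dispatching planning, control and recovery actions.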
6. Results
Several outcomes are considered results of this project. First, the work and the software base that have been created are an important result in themselves. The project will be shared open source on GitHub and the Asimovo platform. Furthermore, various workshops will be held about navigation solutions with ROS 2, sharing the findings with the community. Publications will follow, and finally, proposals for continuing the research will be made.
7. Launching the project
To start the project you first need to create a workspace. Navigate to the Workspaces page inside the project and click on Create Workspace. Name the workspace as you like, then select the ROS 2 Humble | Gazebo Fortress option for the Platform Configuration, and a Medium machine for Hardware. You can also add a description and tags to the new workspace. Since this is a demo project, there is no need to run it in the background.
On the next page, select the following assets (by default these should be the only assets you see):
smart_diffbot
aan_farm_world
aan_navigation_clients
Then click on Generate Launchfile and afterwards, on Save & Start simulation. This starts up a workspace by instantiating a container based on the image selected in the first step.
After the workspace has loaded, follow the steps below to start everything up. You will need multiple terminal windows, a Gazebo viewer and a Foxglove window. If you are not using the prepared launchfile or the RUN option, the ROS packages can also be launched separately; currently this is the recommended way of launching this project.
First, build the ROS 2 workspace by running the following command in a terminal window in the /home/asimovo folder:
colcon build
Then, source the workspace in every terminal window you use, with the command:
source /home/asimovo/install/setup.bash
To launch the websocket connection so that the Gazebo viewer can visualize the world:
ros2 launch launch/system.launch
The following command launches the world:
ros2 launch aan_farm_world launch_world.launch.py headless:=true
With the next command the robot is launched:
ros2 launch smart_diffbot_bringup main.launch.py costmap_path:=/home/asimovo/assets/aan_farm_world-1.0.0/aan_farm_world/world/costmap/aan_farm.yaml camera:=true x:=30 y:=3
The open field navigation and field coverage can be started with:
ros2 run aan_navigation_clients field_cover_client
To start the row following:
To command the rover to return to its docking station:
ros2 run aan_navigation_clients docking_client
Authors
Veronika Bojtar - Asimovo
Kees van Teeffelen - Smart Mechatronics And RoboTics (SMART) Research Group of Saxion University of Applied Sciences