Commit 820553dd authored by Tejasri's avatar Tejasri 🎯

Merge branch 'aastha-master-patch-54534' into 'master'

Added Project Details

See merge request agriculture/mobile-autonomous-cart-with-guided-vison-for-agriculture!1
parents 1cd51559 d886b534
@@ -16,6 +16,149 @@ This involves multi-sensor fusion mechanisms and vision-guided autonomous learning
- TeraBee follow sensor
- 3D Camera
## Notes:
## Project Details:
**Detailed project requirements will be added soon.**
1. About our project and the structure in which the autonomous vehicles will be moving for their respective tasks: here is a drone video of our farm - Drone Footage of our under-construction central building - YouTube. Here is also the video of the central packaging house, to which the AGV has to transport the produce from the Cravo greenhouse.
2. Attached are an image and a video of a trolley prototype that we have already built, and our intent is to use the sensors on this trolley.
Our plan is to build an autonomous trolley platform that can be used to transport materials, perform autonomous spraying, etc.
This is another version we had planned and for which we are building the prototype.
You can have a look at the video attached to this email of the prototype we have already built.
3. Attached is an image showing how the entire structure will be enabled with line followers and barcodes for autonomous navigation. This is how we plan the navigation path for the robots: follow the line + QR codes. The robot carries a camera that captures floor images, which are used to detect QR codes and extract the line's parameters. A fuzzy decision-maker is designed to correct the deviation that occurs while navigating between QR codes. Each QR code gives the current position and the direction to the neighbouring QR codes.
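To make the line + QR idea above concrete, here is a minimal Python sketch. The gains, node names, and route table are illustrative assumptions, not our actual implementation, and the "fuzzy" rule base is reduced to its simplest possible form:

```python
# Hypothetical sketch of the line-follow + QR navigation logic.
# All thresholds, gains, and QR payloads below are assumptions.

def line_deviation_correction(offset_m, angle_deg):
    """Tiny fuzzy-style rule base: map lateral offset (metres, positive =
    line is to the robot's right) and line angle (degrees) to a steering
    command in [-1, 1]."""
    # "Centred" membership: close enough to the line, drive straight
    if abs(offset_m) < 0.02 and abs(angle_deg) < 2.0:
        return 0.0
    # Weighted blend of the two error terms (gains are assumed values)
    k_offset, k_angle = 4.0, 0.05
    steer = k_offset * offset_m + k_angle * angle_deg
    # Saturate to the trolley's steering range
    return max(-1.0, min(1.0, steer))

def next_waypoint(qr_payload, route):
    """Each QR code identifies the current node; look up which neighbour
    to head for next."""
    return route.get(qr_payload)

# Assumed route: greenhouse rows A1..A3, then the packaging house
route = {"A1": "A2", "A2": "A3", "A3": "PACKHOUSE"}
print(line_deviation_correction(0.10, 5.0))  # steer right to rejoin the line
print(next_waypoint("A2", route))            # A3
```

A real version would replace the two error terms with proper fuzzy membership functions and take the offset/angle from the floor-camera line detector.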
4. TeraRanger Tower Evo - Sensor Integration.
The product that we have bought is the TeraRanger Tower Evo - Evo 60m x 8.
This will be the first set of sensors that we need to integrate and test with our trolley. Technically, the plan is to mount the sensors at the top of the trolley, high enough that they can see 360 degrees around it and predict collisions. This sensor is purely for collision detection.
About the product - Teraranger Tower Evo | Solid State LiDAR System | Anti-Collision.
You can find the instruction manual to set up the sensor here - TeraRanger Tower Evo user manual.
Here is a document on placement of the sensor for ground-based applications - TeraRanger Tower Evo user manual.
Task 1: Connect the power for this board. Below is the documentation on what needs to be done.
Task 2: Connect the device to the computer over USB and install the vendor's GUI app to start getting readings.
Task 3: Mount it on the trolley and work out how to simulate obstacle avoidance and translate it to the trolley, so that it stops when an obstacle appears and, once the obstacle is removed, moves forward, backward, or to the side to avoid it.
If you go to the page and scroll to the bottom, you will find all the documentation, installation files, GitHub repositories with the ROS software, etc. The link is Teraranger Tower Evo | Solid State LiDAR System | Anti-Collision.
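Task 3's stop/avoid behaviour could look something like the following Python sketch. This is not the official TeraRanger API: the sector layout, safety distance, and command strings are assumptions, and real readings would come from the Tower Evo's USB/ROS output:

```python
# Illustrative anti-collision logic for 8 Evo 60m rangers, one per
# 45-degree sector around the trolley (sector 0 assumed forward-facing).
# Distances are in metres; a negative reading means "no target".

STOP_DIST = 0.8  # assumed safety radius in metres

def collision_command(distances):
    """Return a drive command given the 8 ranger readings."""
    # Sectors with a valid reading inside the safety radius
    blocked = [i for i, d in enumerate(distances) if 0 <= d < STOP_DIST]
    if not blocked:
        return "forward"
    if 0 in blocked or 1 in blocked or 7 in blocked:  # front arc blocked
        if 4 in blocked:                              # rear also blocked
            return "stop"
        return "reverse"
    return "forward"  # obstacle only to the side/rear: keep going

print(collision_command([5.2, 3.1, 6.0, 4.4, 2.9, 6.1, 5.5, 4.8]))  # forward
print(collision_command([0.4, 3.1, 6.0, 4.4, 2.9, 6.1, 5.5, 4.8]))  # reverse
```

On the real trolley the "move to the other side" behaviour from Task 3 would pick a steering direction from which side sectors are clear, rather than just reversing.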
5. Terabee Follow-Me - Sensor Integration.
This sensor is for the trolley to follow a human; the trolley is supposed to follow the line but also to follow the human. It is a master-and-slave concept.
You can read more about this sensor here - Terabee Follow-Me - Terabee.
This is the basic use case of the sensor.
Here is the specification sheet - Follow-Me-Specification-Sheet.pdf.
Here is the user manual - Terabee-Follow-Me-user-manual.pdf.
Task 1: Refer to the user manual (Terabee-Follow-Me-user-manual.pdf) and figure out the physical connection among the slaves and between the master and slave devices, and also how to mount the slaves to the trolley.
Task 2: Integrate the output from the Terabee Follow-Me sensors into the trolley's drive system, so that using the Follow-Me data together with the follow lines it knows how to reach the master safely, and also work out how it can use the Evo LiDAR to identify any obstacles.
Please scroll to the bottom of the page Terabee Follow-Me - Terabee, where you will find all the relevant documents, software, and GitHub repo.
Below are some comments from the Terabee developer on the Follow-Me system; we thought they would be helpful.
Many thanks for your interest in the Terabee Follow-Me system. We had some feedback from early users about the ability of the system to differentiate the front and the rear of a mobile robot with reference to the Follow-Me system.
To overcome this we have developed a ROS package that adds a direction confidence to the output information of the system.
The output of the direction confidence can take three different values:
- 0.0 to 1.0 -> in the initial phase of movement (while the front/back solution is being determined). This gain is a constant parameter, defined by the user.
- +1 -> after the solution has been determined and the remote has been located at the front of the system.
- -1 -> after the solution has been determined and the remote has been located at the back of the system.
The system attempts to detect the position of the remote at startup by accumulating distance values over a short, user-defined period of time. While the robot is moving forward, growing distance values indicate that the person is behind the system, and decreasing values indicate that the person is in front of it. To make proper use of the feature, the person must stand without major movement during the start operation. This time can be defined in the package configuration parameters.
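The start-up direction check described in that paragraph can be sketched in a few lines of Python. This is our reading of the developer's description, not the actual ROS package code, and the sample distances are made up:

```python
# Sketch of the Follow-Me start-up direction check: accumulate remote
# distances while the robot creeps forward and infer front/back from
# the trend. Return values mirror the direction-confidence outputs.

def locate_remote(distances):
    """distances: readings sampled while moving forward during the start
    phase. Growing range -> remote is behind (-1); shrinking range ->
    remote is in front (+1); too little data -> still undecided (0.0)."""
    if len(distances) < 2:
        return 0.0  # not enough samples yet: stay in low-confidence phase
    trend = distances[-1] - distances[0]
    return -1 if trend > 0 else 1

print(locate_remote([2.0, 2.2, 2.5, 2.9]))  # growing -> remote behind -> -1
print(locate_remote([3.0, 2.7, 2.4]))       # shrinking -> in front -> +1
```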
Please find the package here:
Password: ROS
The installation instructions are provided as a README file inside the package.
The following configuration options are available (set in the launch file):
- timeout_start_op - specifies how long the robot will move at reduced speed before determining the direction.
- speed_reduction_factor - specifies how the robot's velocity should be scaled in reduced-speed mode.
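For Task 2, the direction-confidence output and the two launch-file options above might gate the trolley's follow speed roughly as follows. The parameter names mirror the options listed; everything else (the function, base speed, sign convention) is an assumption for illustration:

```python
# Sketch: gate the trolley's follow speed on the Follow-Me direction
# confidence. Parameter names mirror the ROS package's launch options;
# the rest is assumed glue code, not the package's real API.

timeout_start_op = 3.0        # seconds of reduced-speed start phase
speed_reduction_factor = 0.3  # velocity scale during that phase

def follow_speed(base_speed, direction_confidence, elapsed_s):
    """During the start phase (confidence strictly between -1 and +1)
    creep at reduced speed; once the remote is located, drive at full
    speed toward it (+1 = in front, -1 = behind)."""
    if elapsed_s < timeout_start_op and -1 < direction_confidence < 1:
        return base_speed * speed_reduction_factor
    return base_speed * (1 if direction_confidence >= 1 else -1)

print(follow_speed(0.5, 0.5, 1.0))  # start phase: 0.5 * 0.3 = 0.15 m/s
print(follow_speed(0.5, 1, 4.0))    # remote in front: full 0.5 m/s forward
print(follow_speed(0.5, -1, 4.0))   # remote behind: 0.5 m/s in reverse
```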
6. 3D TOF Camera
We got this sensor just to identify hand gestures, so that the person the trolley is following can give hand gestures for the trolley to stop, move, or go back.
You can refer to the product details at 3D TOF Camera | Compact Sensor | 3D TOF Technology.
Below is the purpose of the sensor, but we are focusing only on hand gestures for movement.
Task 1: This sensor is ready for developer use out of the box, so we need to set it up to capture the initial 3D data and work out how to identify hand gestures.
Task 2: Set up hand-gesture recognition for go, stop, and move backward, and pass that data on to the trolley's driving system.
Scroll down to the bottom of this page - 3D TOF Camera | Compact Sensor | 3D TOF Technology - and you will find the documentation, video tutorials (for Linux and Windows), software, and a GitHub repo with sample code and a ROS package.
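The glue for Task 2 could be as simple as a lookup from classifier labels to drive commands. The gesture names and command strings below are assumptions; the actual labels would come from whatever recogniser is built on the TOF camera's 3D output:

```python
# Hypothetical mapping from recognised hand gestures to trolley drive
# commands. Labels and command strings are illustrative assumptions.

GESTURE_TO_COMMAND = {
    "open_palm": "stop",
    "thumbs_up": "go",
    "wave_back": "move_backward",
}

def gesture_command(gesture, default="hold"):
    """Translate a classifier label into a drive command; an unknown or
    unrecognised gesture leaves the trolley in its current state."""
    return GESTURE_TO_COMMAND.get(gesture, default)

print(gesture_command("open_palm"))  # stop
print(gesture_command("shrug"))      # hold (unknown gesture)
```

Keeping unknown gestures mapped to a no-op default matters here: a misclassification should never translate into an unintended movement command.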
All the sensors are currently with Vandan, and he can ship them out to you when you are ready. We need about 5 days for the sensors to reach you.