For the first test we used the wide offroad wheels. The wheels have a diameter o
|
|
##### Fig. 2 - Codesnippet from PilotSquare.java
|
|
|
|
|
|
### Results:
|
|
|
|
|
|
According to the program (see code below) the robot should travel 20 cm before turning 90 degrees. This cycle is repeated four times in order to perform a square. After driving, the robot ended up 5 mm to the right and 7 mm in front of its starting position.
|
|
|
|
|
|
```
|
|
|
for(int i = 0; i < 4; i++)
|
|
|
{
|
|
|
pilot.travel(20);
|
|
|
show(poseProvider.getPose());
|
|
|
Delay.msDelay(1000);
|
|
|
|
|
|
pilot.rotate(90);
|
|
|
show(poseProvider.getPose());
|
|
|
Delay.msDelay(1000);
|
|
|
}
|
|
|
```
|
|
|
##### Fig. 3 - Codesnippet from PilotSquare.java.
|
|
|
|
|
|
![IMG_0665](http://gitlab.au.dk/uploads/group-22/lego/ffe240c5cc/IMG_0665.JPG)
|
|
|
|
|
|
##### Fig. 4 - First testrun: Ended 5mm to the right and 7mm in front of its starting position.
|
|
|
|
|
|
In this experiment the robot should have travelled a total of 20*4 = 80 cm. Using the Pythagorean theorem, the drift is √(5² + 7²) ≈ 8.6 mm, so the robot travelled 8.6 mm more than intended. This gives an error of (80.86/80*100)-100 = 1.075 %.
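The drift arithmetic above can be reproduced in a small standalone snippet. This is our own illustration in plain Java, not part of the robot code, and the method name is ours:

```java
// Hypothetical helper reproducing the drift-error arithmetic: the straight-line
// drift from the start point is added to the target distance and compared to it.
public class DriftError {
    static double errorPercent(double targetCm, double rightCm, double forwardCm) {
        // Extra distance travelled = straight-line drift from the start point.
        double driftCm = Math.hypot(rightCm, forwardCm);
        return ((targetCm + driftCm) / targetCm) * 100.0 - 100.0;
    }

    public static void main(String[] args) {
        // 5 mm = 0.5 cm right, 7 mm = 0.7 cm forward, target 4 * 20 cm = 80 cm
        System.out.printf("%.3f%%%n", errorPercent(80.0, 0.5, 0.7)); // prints 1.075%
    }
}
```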
|
|
|
|
In order to minimise the error we replaced the wide offroad wheels with slim whe
|
|
|
|
|
### Results:
|
|
|
![after 50cm square drive](http://gitlab.au.dk/uploads/group-22/lego/49b44fef0e/after_50cm_square_drive.JPG)
|
|
|
|
|
|
##### Fig. 5 - Robot with slim wheels - end position after run.
|
|
|
|
|
|
|
|
|
This time the robot ended 1 mm to the right and 1.5 mm in front of the initial position, revealing an error of (80.18/80*100)-100 = 0.225 %. The robot’s pose values show that the robot has travelled 1.6 mm too far and 1.7 mm to the right (see fig. 6).
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 09.32.02](http://gitlab.au.dk/uploads/group-22/lego/1283b932a5/Sk%C3%A6rmbillede_2015-06-01_kl._09.32.02.png)
|
|
|
|
|
|
##### Fig. 6 - Screendump of the robot screen with values.
|
|
|
|
|
|
These values are very close to the ones we noted on the paper grid layout.
|
|
|
|
There are two dominant errors [1] which can occur within the differential-drive m
|
|
As our Test 2 revealed the importance of a precisely measured wheelDiameter, we decided to use a caliper to get these values as accurately as possible. These measurements would also reveal any deviations between wheels of the same type.
|
|
|
|
|
|
![IMG_8677](http://gitlab.au.dk/uploads/group-22/lego/e449d916ab/IMG_8677.JPG)
|
|
|
|
|
|
##### Fig. 7 - With the use of a caliper, we got an accurate measurement of the wheel diameter.
|
|
|
|
|
|
The wide offroad wheels were measured to be 5.6 + (1/10)*0.1 = 5.61 centimetres, but they likely compress slightly under the weight of the robot, something that could not be measured with a standard caliper while the wheels were mounted on the robot.
|
|
|
|
Since the true track width is measured from the centre of one wheel to the centr
|
|
To calculate the true track width, we had to calculate the average wheel thickness. This was done by measuring the outer and inner track widths, subtracting them and dividing by two.
|
|
|
|
|
|
![outer track width](http://gitlab.au.dk/uploads/group-22/lego/8863569861/outer_track_width.jpg)
|
|
|
|
|
|
##### Fig. 8 - Outer track width.
|
|
|
|
|
|
Avg. wheel thickness:
|
|
|
![Skærmbillede 2015-06-01 kl. 09.42.53](http://gitlab.au.dk/uploads/group-22/lego/a94a8f240e/Sk%C3%A6rmbillede_2015-06-01_kl._09.42.53.png)
|
|
|
|
|
|
##### Fig. 9 - Average wheel thickness.
|
|
|
|
|
|
|
|
|
The point of the wheel where the diameter is largest is naturally at the centre of the wheel, at 0.31 cm / 2 = 0.155 cm.
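The track-width arithmetic can be sketched in plain Java. The outer and inner readings below are placeholder values chosen so that the thickness comes out at the 0.31 cm mentioned above (the real readings are in the figures); only the formulas follow the text:

```java
// Sketch of the true-track-width calculation. The caliper readings are
// placeholders, NOT our measured values; only the formulas mirror the report.
public class TrackWidth {
    public static void main(String[] args) {
        double outerCm = 11.50; // hypothetical outer track width reading
        double innerCm = 10.88; // hypothetical inner track width reading

        // Average wheel thickness: (outer - inner) / 2
        double thicknessCm = (outerCm - innerCm) / 2.0; // 0.31 with these values

        // The wheel centre sits half a thickness from either edge, so the
        // centre-to-centre (true) track width is inner + thickness:
        double trueTrackCm = innerCm + thicknessCm;

        System.out.printf("thickness = %.3f cm, true track width = %.3f cm%n",
                thicknessCm, trueTrackCm);
    }
}
```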
|
|
|
The True Track Width with the slim wheels was therefore calculated as follows:
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 09.45.12](http://gitlab.au.dk/uploads/group-22/lego/fe2c58f523/Sk%C3%A6rmbillede_2015-06-01_kl._09.45.12.png)
|
|
|
|
|
|
##### Fig. 10 - True Track Width calculation.
|
|
|
|
|
|
### Testing changes in orientation using extended pointer:
|
|
|
By attaching an extended pointer, changes in orientation should be easier to detect. Initial result:
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 09.47.57](http://gitlab.au.dk/uploads/group-22/lego/2f8db38051/Sk%C3%A6rmbillede_2015-06-01_kl._09.47.57.png)
|
|
|
|
|
|
##### Fig. 11 - Screendump of the robot screen with values after extended version.
|
|
|
|
|
|
The extended arm with two pointers made it easier to align the robot to the line on the ground.
|
|
|
|
To calibrate the robot's travel distance according to the wheel diameter we modif
|
|
pilot.travel(50);
|
|
|
show(poseProvider.getPose());
|
|
|
```
|
|
|
|
|
|
##### Fig. 12 - Codesnippet from PilotSquare.java
|
|
|
|
|
|
|
|
|
This code should drive the robot forward 50 cm. To get accurate results the robot was placed on paper with a grid layout and markers every 25 cm. This way we know that if the robot is calibrated correctly, it will stop after two lines, corresponding to 50 cm.
|
|
|
|
|
|
|
|
|
![IMG_0677](http://gitlab.au.dk/uploads/group-22/lego/c6c2a8e30e/IMG_0677.JPG)
|
|
|
|
|
|
##### Fig. 13 - The robot starting position.
|
|
|
|
|
|
### Initial drive (diameter 3.0):
|
|
|
The initial drive revealed a travelled distance of 51 cm, which is 1 cm more than targeted. This result is also confirmed by the pose() values shown below, which indicate a value larger than 50.
|
|
|
|
|
|
![IMG_0679](http://gitlab.au.dk/uploads/group-22/lego/299f6cf5de/IMG_0679.JPG)
|
|
|
|
|
|
##### Fig. 14 - Ending position of the robot after test run.
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 09.59.50](http://gitlab.au.dk/uploads/group-22/lego/08b4fb0c39/Sk%C3%A6rmbillede_2015-06-01_kl._09.59.50.png)
|
|
|
|
|
|
##### Fig. 15 - Screendump of the robot after test run.
|
|
|
|
|
|
As the robot drove too far, we modified the wheelDiameter value from 3.0 to 3.1 (cm) and observed the change in travelled distance.
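As a rule of thumb (our own, not from the lab manual), the travelled distance scales linearly with the assumed wheel diameter, so a first-order corrected value can be estimated before fine tuning. Note that tyre compression and slip mean the empirically tuned value can end up differing from this estimate, as our experiments below show:

```java
// First-order wheelDiameter correction (our own rule of thumb, not project
// code): actual travel = commanded * (realDiameter / assumedDiameter), so
// scaling the assumed diameter by measured/target gives a starting guess.
public class DiameterCalibration {
    static double correctedDiameter(double assumedCm, double measuredCm, double targetCm) {
        return assumedCm * measuredCm / targetCm;
    }

    public static void main(String[] args) {
        // Robot drove 51 cm when asked for 50 cm with wheelDiameter = 3.0:
        System.out.printf("%.2f%n", correctedDiameter(3.0, 51.0, 50.0)); // prints 3.06
    }
}
```

The estimate (3.06) lands between the values tried below, which is all a linear model can promise; the fine tuning against the grid paper remains necessary.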
|
|
|
|
|
|
#### Wheel diameter at 3.1:
|
|
|
With this value the robot travelled too short a distance (see fig. 16).
|
|
|
![IMG_0678](http://gitlab.au.dk/uploads/group-22/lego/2c9eeb3955/IMG_0678.JPG)
|
|
|
|
|
|
##### Fig. 16 - The robot endpoint with 3.1 Wheel diameter.
|
|
|
|
|
|
#### Wheel diameter at 3.025:
|
|
|
|
|
|
After some fine tuning we discovered that a wheelDiameter value of 3.025 produced the correct travel distance (see fig. 17).
|
|
|
![Skærmbillede 2015-06-01 kl. 10.04.44](http://gitlab.au.dk/uploads/group-22/lego/33822843f2/Sk%C3%A6rmbillede_2015-06-01_kl._10.04.44.png)
|
|
|
|
|
|
##### Fig. 17 - Screendump of the Robot with values.
|
|
|
|
|
|
![IMG_0681](http://gitlab.au.dk/uploads/group-22/lego/ca29af600f/IMG_0681.JPG)
|
|
|
|
|
|
##### Fig. 18 - Ending position with wheel diameter at 3.025.
|
|
|
|
|
|
### Causes for different Odometry
|
|
|
|
To make the robot turn 180 degrees we modified the for-loop in PilotSquare.java.
|
|
pilot.rotate(180);
|
|
|
show(poseProvider.getPose());
|
|
|
```
|
|
|
|
|
|
##### Fig. 19 - Codesnippet from PilotSquare.java
|
|
|
|
|
|
With the value trackWidth correctly calibrated we expect the robot to turn exactly 180 degrees.
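The reason trackWidth governs the turn can be made explicit. For an in-place rotation each wheel drives an arc of (θ/360)·π·trackWidth, so each wheel must itself rotate θ·trackWidth/wheelDiameter degrees. This is the standard differential-drive relation; the sketch below uses illustrative numbers, not our measured values:

```java
// Standard differential-drive turn relation (illustrative numbers only):
// wheel arc = (theta/360) * pi * trackWidth, wheel rotation = arc / (pi * d),
// which simplifies to theta * trackWidth / wheelDiameter degrees per wheel.
public class TurnMath {
    static double wheelDegrees(double robotDegrees, double trackWidthCm, double wheelDiameterCm) {
        return robotDegrees * trackWidthCm / wheelDiameterCm;
    }

    public static void main(String[] args) {
        // e.g. a 180 degree in-place turn with a hypothetical 11.19 cm track
        // width and the tuned 3.025 cm wheel diameter:
        System.out.printf("%.1f wheel degrees%n", wheelDegrees(180.0, 11.19, 3.025));
    }
}
```

Because the commanded wheel rotation is directly proportional to trackWidth, an error of a few percent in trackWidth produces the same few percent of error in every turn.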
|
|
|
|
|
|
The results showed the robot turning 180 degrees and stopping with the pointer touching the line. Looking at the pose result V, we see that it is practically as close to 180 degrees as it can be, meaning that the trackWidth value was calibrated properly from the start.
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 10.15.35](http://gitlab.au.dk/uploads/group-22/lego/53ebbb1e9d/Sk%C3%A6rmbillede_2015-06-01_kl._10.15.35.png)
|
|
|
|
|
|
##### Fig. 20 - Screendump from the robot screen with values.
|
|
|
|
|
|
![90 turn start](http://gitlab.au.dk/uploads/group-22/lego/88f4af5140/90_turn_start.JPG)
|
|
|
|
|
|
##### Fig. 21 - Starting position of the robot before 90 degree turn.
|
|
|
|
|
|
![90 turn finished](http://gitlab.au.dk/uploads/group-22/lego/2f3dfa8f34/90_turn_finished.JPG)
|
|
|
|
|
|
##### Fig. 22 - Finish position of the robot after 90 degree turn.
|
|
|
|
|
|
In [4] it is described that "With proper adjustment of these parameters, errors in distance traveled and angle of rotation can be held to 2% or perhaps less". We will see if this can be achieved.
|
|
|
|
To obtain an increased travel distance we made the following changes to the for-
|
|
```
|
|
|
pilot.travel(50); //instead of 25
|
|
|
```
|
|
|
|
|
|
##### Fig. 23 - Codesnippet of the for-loop in the PilotSquare.java
|
|
|
|
|
|
## Results
|
|
|
The experiments showed the robot stopping within 3 mm of the line after driving in a square with side = 50 cm.
|
|
|
The robot travelled a total distance of 200 cm. The pose values show that the robot has travelled 0.4 cm too far. This reveals an error of (200.4/200*100)-100 = 0.2 %, which is very low and acceptable.
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 10.40.37](http://gitlab.au.dk/uploads/group-22/lego/ed32d850c4/Sk%C3%A6rmbillede_2015-06-01_kl._10.40.37.png)
|
|
|
|
|
|
##### Fig. 24 - Screendump of the robot screen after run.
|
|
|
|
|
|
## Position tracking by means of particle filters
|
|
|
|
|
|
To estimate the influence of non-systematic errors on the position of the vehicle obtained by odometry we will use the method of Monte Carlo localization or particle filter localization, [2]: "The algorithm uses a particle filter to represent the distribution of likely states, with each particle representing a possible state, i.e. a hypothesis of where the robot is". We will use the algorithm in the special case where only the movements of the vehicle are modelled in the motion update step of the algorithm [2], and all particles are initially set to the known starting position. An example of the resulting particle set after each move of a non-sensing vehicle can be seen in Fig. 25. The details of the stochastic motion model used in Fig. 25 can be found in [2].
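A minimal motion-only particle filter of this kind can be sketched as follows. This is our own illustration, not the project's Particle.java: the exact noise model (additive zero-mean Gaussian on the travelled distance and on the heading change) and all names are assumptions:

```java
import java.util.Random;

// Motion-only particle cloud: every particle starts at the known pose and
// each move adds Gaussian noise to the distance and the heading change.
public class MotionParticles {
    static final Random RNG = new Random(42);

    static class Particle {
        double x, y, headingDeg;

        // Travel distanceCm along the current heading, then turn turnDeg,
        // both perturbed by Gaussian noise (standard deviations in cm / deg).
        void applyMove(double distanceCm, double turnDeg,
                       double distanceNoise, double angleNoiseDeg) {
            double d = distanceCm + RNG.nextGaussian() * distanceNoise;
            x += d * Math.cos(Math.toRadians(headingDeg));
            y += d * Math.sin(Math.toRadians(headingDeg));
            headingDeg += turnDeg + RNG.nextGaussian() * angleNoiseDeg;
        }
    }

    public static void main(String[] args) {
        int n = 100;
        Particle[] cloud = new Particle[n];
        for (int i = 0; i < n; i++) cloud[i] = new Particle(); // known start pose (0,0,0)

        // Drive a 50 cm square (travel, then rotate 90 degrees, four times)
        // with the default noise factors distanceNoise=0.2 and angleNoise=4.
        for (int side = 0; side < 4; side++)
            for (Particle p : cloud) p.applyMove(50, 90, 0.2, 4);

        // The spread of the cloud around the start visualises the accumulated
        // non-systematic error after one lap.
        double sx = 0, sy = 0;
        for (Particle p : cloud) { sx += p.x; sy += p.y; }
        System.out.printf("mean x = %.2f cm, mean y = %.2f cm%n", sx / n, sy / n);
    }
}
```

With zero noise every particle returns exactly to the origin; raising either noise factor widens the cloud, which is the effect tabulated in the experiments below.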
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 10.42.53](http://gitlab.au.dk/uploads/group-22/lego/51f222e026/Sk%C3%A6rmbillede_2015-06-01_kl._10.42.53.png)
|
|
|
|
|
|
##### Fig. 25: The distribution of likely positions for a non-sensing vehicle after moving for several steps.
|
|
|
|
|
|
## Task
|
|
|
Investigate if the model of non-systematic random odometry errors also models the errors of the base vehicle driving a sequence of travel and rotation steps.
|
By repeating a square-driving sequence we will measure the extent of the randoml
|
|
|
|
|
### Changing the distance noise
|
|
|
![Skærmbillede 2015-06-01 kl. 10.44.54](http://gitlab.au.dk/uploads/group-22/lego/6fb5274750/Sk%C3%A6rmbillede_2015-06-01_kl._10.44.54.png)
|
|
|
|
|
|
##### Fig. 26: The four experiments with changing the distance noise.
|
|
|
|
|
|
The four experiments show that when the noise factor is increased, the spread increases as well. Each point represents a possible position of the robot after driving the square. However, the noise value should be tuned to the robot’s actual accuracy: our previous experiments revealed that the robot was able to travel a square with 50 cm sides with millimetre accuracy. With this in mind, a distance noise factor above 0.2 may be inappropriate.
|
|
|
|
|
|
### Changing the angular noise
|
|
|
![Skærmbillede 2015-06-01 kl. 10.46.12](http://gitlab.au.dk/uploads/group-22/lego/600af11772/Sk%C3%A6rmbillede_2015-06-01_kl._10.46.12.png)
|
|
|
|
|
|
##### Fig. 27: The four experiments with changing the angular noise
|
|
|
|
|
|
This experiment reveals results similar to the distance noise experiment above: once the noise factor increases, the spread increases equally. It is worth noting that the angle noise factor is significantly higher than the distance noise factor. This is because the robot has less trouble staying on track when it only drives forward; whenever it turns, more factors come into play, e.g. accuracy of the turning angle and friction while turning, resulting in a larger spread of points. However, an angle noise factor of 32 seems far too high compared to how the robot behaves on a track. As mentioned earlier, the robot can complete a square route with high accuracy, which makes an angle noise factor of less than 4 more appropriate.
|
|
|
|
|
|
### Sub Task
|
|
|
Look into the motion model used in [2] to get inspiration for experimenting with the model used in applyMove.
|
|
|
|
|
|
|
|
|
The motion model from [2] uses theta to calculate the heading of the robot at its starting position. In our applyMove model from Particle.java we don’t use the theta parameter. This means we don’t get any noise on the angle of the robot when it is driving in a straight line, as can be seen in the figure below (see fig. 29). Only when turning do we get noise on the heading of the robot, and thus the particles spread out as seen in the previous table (see fig. 27).
|
|
|
|
|
|
```
float ym = (move.getDistanceTraveled()*((float)Math.sin(Math.toRadians(pose.getHeading()))));
float xm = (move.getDistanceTraveled()*((float)Math.cos(Math.toRadians(pose.getHeading()))));
```
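A hypothetical variant that also perturbs the heading on straight moves, in the spirit of the motion model in [2], could look like this. All names and the noise handling are ours, not Particle.java's:

```java
import java.util.Random;

// Hypothetical applyMove variant that adds heading noise even on straight
// moves, as the model in [2] does. The pose is reduced to three doubles here.
public class NoisyMove {
    static final Random RNG = new Random();

    // Returns {x, y, headingDeg} after a noisy straight move of `distance`.
    static double[] applyMove(double x, double y, double headingDeg,
                              double distance, double distanceNoise, double angleNoiseDeg) {
        double d = distance + RNG.nextGaussian() * distanceNoise;
        // The heading is perturbed BEFORE projecting, so even a straight move
        // spreads the particles off the line (unlike our current model).
        double h = headingDeg + RNG.nextGaussian() * angleNoiseDeg;
        return new double[] {
            x + d * Math.cos(Math.toRadians(h)),
            y + d * Math.sin(Math.toRadians(h)),
            h
        };
    }

    public static void main(String[] args) {
        double[] pose = applyMove(0, 0, 0, 50, 0.2, 4);
        System.out.printf("x = %.2f, y = %.2f, heading = %.2f%n", pose[0], pose[1], pose[2]);
    }
}
```

With this change the particle cloud would fan out sideways during straight driving instead of staying on the line.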
|
|
|
|
|
|
##### Fig. 28 - Codesnippet from Particle.java.
|
|
|
|
|
|
![Skærmbillede 2015-06-01 kl. 10.50.47](http://gitlab.au.dk/uploads/group-22/lego/8c7ecb627b/Sk%C3%A6rmbillede_2015-06-01_kl._10.50.47.png)
|
|
|
|
|
|
##### Fig. 29 - As long as the robot drives straight, the points stay on the line with no angular spread.
|
|
|
|
|
|
### Particle filter experiments conclusion
|
|
|
Our two experiments revealed that once the noise factors increase, the spread of particles increases equally. Comparing these results to how our robot behaves when travelling in a square, we can conclude that noise factors above the default values of distanceNoise=0.2 and angleNoise=4 are inappropriate: once values like wheelDiameter and trackWidth are adjusted properly, the robot drives and turns with high accuracy. We can also conclude that the angular noise should be larger than the distance noise, as the robot is influenced by more factors when turning than when driving straight.
|
To make a robot that travels along a fixed route using dead reckoning [3], while
|
|
The logic behind the task is to imagine a public space in which the robot has to navigate a predefined route. In such a situation the robot should be able to detect and avoid people on its path and navigate to a fixed point without crashing into anyone.
|
|
|
|
|
|
![banegaard](http://gitlab.au.dk/uploads/group-22/lego/17eb9ec552/banegaard.jpg)
|
|
|
|
|
|
##### Fig. 30 - Picture of Aarhus Main Station.
|
|
|
|
|
|
Our strategy is to make the robot drive from point A towards point B in a straight line (see fig. 31).
|
|
|
If the robot encounters an object it will save its pose value in the forward direction, turn 90 degrees, drive a short predetermined distance, turn 90 degrees in the opposite direction, and then return to the path. When it has passed the object it will compare the current pose value with the one saved before encountering the object. From this comparison it knows how much distance it needs to cover to get to point B.
|
|
|
It is worth noting that this is just one example of a solution to the problem of navigating around dynamic objects. We will only drive in a straight line, and to keep things simple we will only handle one predefined object on the path.
|
|
|
|
|
|
![plan](http://gitlab.au.dk/uploads/group-22/lego/f151686264/plan.png)
|
|
|
|
|
|
##### Fig. 31 - Our plan to make the robot drive from A to B in a straight line.
|
|
|
|
|
|
Compared to a real-life scenario, you could argue that most people occupy a certain amount of space in a room, which justifies a predefined hardcoded avoidance sequence. However, it would be more reasonable for the robot to use its sensors while avoiding an object, to compensate for unknown factors. For example, it would be crucial to detect whether it has to avoid just one person, a large group of people, or a person pushing a stroller.
|
|
|
|
oldPoseX = poseProvider.getPose().getX();
|
|
show(poseProvider.getPose());
|
|
|
}
|
|
|
```
|
|
|
|
|
|
##### Fig. 32 - Code from PilotSquare.java shows how an ultrasonic sensor is used along with the poseProvider to make the robot avoid an object it encounters while still being able to reach its desired destination.
|
|
|
|
|
|
The code shows that as long as the x value of the current pose is less than the predefined travelDistance, the robot will drive forward towards its destination. While driving forward, the ultrasonic sensor takes readings in order to detect upcoming objects. If an object is detected, it triggers the if statement containing the code to avoid the object. After completing the series of avoidance steps, the oldPoseX value is updated with the current pose’s x value. Then the loop starts over and the robot travels the remaining distance towards the destination.
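The distance bookkeeping itself needs no hardware and can be illustrated in isolation. The values and names below are purely illustrative; the real code reads x from poseProvider.getPose().getX():

```java
// Illustration (no hardware): forward progress is whatever x the odometry
// reports, so after a detour the remaining distance is simply target - x.
public class DetourMath {
    static double remaining(double targetCm, double currentPoseX) {
        return targetCm - currentPoseX;
    }

    public static void main(String[] args) {
        double target = 200;       // point B is 200 cm ahead (illustrative)
        double xBeforeObject = 80;  // odometry x when the object was detected
        double detourForward = 30;  // forward distance covered while skirting the object
        double xAfterDetour = xBeforeObject + detourForward;
        System.out.println(remaining(target, xAfterDetour) + " cm left"); // prints 90.0 cm left
    }
}
```

Because the detour's sideways legs cancel out, only the forward component of the avoidance sequence affects the loop condition.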
|
|
|
|
|
|
The robot was able to drive along a path with a fixed length and avoid the object encountered in various positions.
|
|
|
|
|
|
[![image alt text](http://img.youtube.com/vi/aCHZVuBc0l0/0.jpg)](http://www.youtube.com/watch?v=aCHZVuBc0l0)
|
|
|
|
|
|
##### Fig. 33 - Video showing how the robot car is avoiding a bucket on the route.
|
|
|
|
|
|
This proves that a robot can navigate around dynamically placed objects while following a predefined route. In spite of the successful result, the experiment raised some concerns which should be addressed through further experimentation:
|
|
|
|