|
|
# Group 8
|
|
|
|
|
|
## Lab Notebook 10
|
|
|
|
|
|
**Date:** 2/6 - 2015
|
|
|
|
|
|
**Duration:** 14-16 hours
|
|
|
|
|
|
**Participants:**
|
|
|
|
|
|
- Frederik Jerløv
|
|
|
- Casper Christensen
|
|
|
- Mark Sejr Gottenborg
|
|
|
- Peder Detlefsen
|
|
|
|
|
|
## Goal
|
|
|
The goal is to track our position using odometry and the tacho counters.
|
|
|
|
|
|
## Plan
|
|
|
Our plan is to follow the exercises exactly as described.
|
|
|
|
|
|
## Odometry
|
|
|
First we made our vehicle run the `PilotSquare.java`[2] to see how accurate our vehicle was. Below is shown an image of one of the runs.
|
|
|
|
|
|
![Image showing a run of PilotSquare.java](http://i.imgur.com/Rf6mntq.jpg)
|
|
|
|
|
|
We can easily see that the initial wheel diameter and track width are not optimal for our vehicle. As expected, the endpoint does not match the starting point, since we have not calibrated yet. In the next section we will try to improve on hitting the same spot.
|
|
|
|
|
|
## Calibrate the wheel diameter and the track width
|
|
|
In this section we describe how we reduced the systematic errors by calibrating the wheel diameter and the track width. To make a more accurate vehicle we need to pass the correct wheel diameter and track width to the `DifferentialPilot` constructor. The easy solution would be to measure the distances with a ruler, but this is not very precise. For instance, we use rubber wheels, so the more weight we put on the vehicle, the more the wheels compress. It is also not clear where the track width should be measured from: the inside of each wheel, the middle, or the outside?
|
|
|
|
|
|
### Calibrate wheel diameter
|
|
|
So instead we first tweaked the wheel diameter. We wrote a new test program, `PilotStraight.java`[3], which makes the vehicle drive straight ahead for 25 cm. Below is an image of one of the runs.
|
|
|
|
|
|
![Image showing a run while calibrating wheel diameter](http://i.imgur.com/6DEzj9t.jpg)
|
|
|
|
|
|
We noticed our vehicle drifted slightly to the right, see picture below.
|
|
|
|
|
|
![Run of straight test with right drift, width 5.5](http://i.imgur.com/yiGniuz.jpg)
|
|
|
|
|
|
So instead of using the same diameter for both wheels, we set each wheel diameter individually in the constructor. First we got the vehicle to drive straight by reducing the `rightWheelDiameter` value until the stopping points were evenly distributed. Then we multiplied both diameters by a common factor to get it to stop at the correct distance.
|
|
|
|
|
|
The magic numbers for our vehicle ended up being:
|
|
|
`leftWheelDiameter`: 5.5385, `rightWheelDiameter`: 5.522388
|
|
|
See image below for the distribution.
|
|
|
|
|
|
![Several runs of straight test, width 5.5385 5.522388](http://i.imgur.com/X9RXcDU.jpg)
|
|
|
|
|
|
After the calibration was done we also tried increasing the distance to 50 cm, and the vehicle stopped at the 50 cm mark.
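The scaling step above can be written down explicitly. Below is a small standalone sketch (the class and method names are our own invention, not part of the lesson code): if a commanded travel of `commanded` cm actually covers `measured` cm, the true wheel diameter is the configured diameter multiplied by `measured / commanded`.

```Java
// Standalone sketch, not part of the lesson code; names are invented.
public class WheelCalibration {

    // The pilot turns the wheel commanded / (pi * diameter) revolutions,
    // and each revolution covers pi * trueDiameter cm, so
    // measured = commanded * trueDiameter / diameter, i.e.
    // trueDiameter = diameter * measured / commanded.
    public static double correctedDiameter(double diameter,
                                           double commanded,
                                           double measured) {
        return diameter * measured / commanded;
    }

    public static void main(String[] args) {
        // Example: parameter 5.5 cm, commanded 25 cm, robot stopped at 25.7 cm.
        System.out.println(correctedDiameter(5.5, 25.0, 25.7)); // ~5.654
    }
}
```

The same factor can then be applied to both wheels once the drift has been evened out.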
|
|
|
|
|
|
### Calibrate track width
|
|
|
Just as for the wheel diameter, we wrote a new program, `PilotTurn.java`[4], which makes the robot turn 90 degrees. Below is an image of one of the runs.
|
|
|
|
|
|
![Image showing a run while calibrating track width](http://i.imgur.com/2uLO9NE.jpg)
|
|
|
|
|
|
It turned out to be a lot easier to get it to turn 90 degrees, so after a few runs we ended up with the magic number for the track width:
|
|
|
`trackWidth`: 16.5
|
|
|
See image below for distribution.
|
|
|
|
|
|
![Several runs of turn test, track width 16,5](http://i.imgur.com/nQ6xWp7.jpg)
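The same kind of correction applies to the track width. During a rotate, each wheel travels an arc of `trackWidth * angle / 2`, so if a commanded turn of `commanded` degrees actually measures `measured` degrees, the true track width is the configured value multiplied by `commanded / measured`. A standalone sketch (names invented, not from the lesson code):

```Java
// Standalone sketch, not part of the lesson code; names are invented.
public class TrackCalibration {

    // Each wheel moves an arc of track * commanded / 2, so the actual turn is
    // measured = track * commanded / trueTrack, i.e.
    // trueTrack = track * commanded / measured.
    public static double correctedTrackWidth(double track,
                                             double commanded,
                                             double measured) {
        return track * commanded / measured;
    }

    public static void main(String[] args) {
        // Example: parameter 16.0 cm, commanded 90 deg, robot turned 87.5 deg.
        System.out.println(correctedTrackWidth(16.0, 90.0, 87.5)); // ~16.457
    }
}
```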
|
|
|
|
|
|
### Back to Pilot Square
|
|
|
Now that we had calibrated the wheel diameter and track width to reduce the systematic errors, we ran `PilotSquare.java`[2] again. The endpoint is shown in the image below.
|
|
|
|
|
|
![Image showing a run of PilotSquare.java after calibration](http://i.imgur.com/lKBJKGG.jpg)
|
|
|
|
|
|
The endpoint differs a little from the starting point, which we would expect due to non-systematic errors, but it is much better than our first run. Next we tried increasing the travel distance in `PilotSquare.java` by changing `pilot.travel(25)` to `pilot.travel(50)`. See the image below for the result.
|
|
|
|
|
|
![Image showing a run of PilotSquare.java after calibration and travel 50 instead of 25](http://i.imgur.com/oHoy60W.jpg)
|
|
|
|
|
|
This error is a bit larger, but still within the acceptable range.
|
|
|
|
|
|
## Position tracking by means of particle filters
|
|
|
Before making any changes to `applyMove()` and the noise factors we simply tried to run it. The image below shows the distribution after one of the runs.
|
|
|
|
|
|
![PilotRoute test without improvements](http://i.imgur.com/7buCqFw.png)
|
|
|
|
|
|
We see that the variance is high. We would like to improve on this by estimating our noise factors and borrowing some ideas from [5].
|
|
|
|
|
|
### Estimate distanceNoiseFactor
|
|
|
When estimating the distance noise factor we did 6 runs with the robot programmed to travel straight for 50 cm. The stopping points (x, y) in cm were: (50, 0.9), (50, 0.5), (50, 0.5), (50, 0.1), (50, 0), (50, -0.2).
|
|
|
The average absolute deviation was then: (0.9 + 0.5 + 0.5 + 0.1 + 0 + 0.2) / 6 = 0.367
|
|
|
This gave us a distance noise factor of: 0.367 / 50 = 0.00733 = 0.733 %
|
|
|
|
|
|
### Estimate angleNoiseFactor
|
|
|
When estimating the angle noise factor we also did 6 runs, but this time the robot was programmed to rotate 90 degrees. The actual degrees turned, rounded to the nearest half degree, were: 87.5, 89, 89.5, 89.5, 90, 90.
|
|
|
The average absolute deviation was then: (2.5 + 1 + 0.5 + 0.5 + 0 + 0) / 6 = 0.75
|
|
|
This gave us an angle noise factor of: 0.75 / 90 = 0.00833 = 0.833 %
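Both estimates follow the same recipe: the mean absolute deviation over the runs divided by the commanded magnitude. A standalone helper (our own sketch, not from the lesson code) reproduces the two numbers:

```Java
import java.util.Arrays;

// Standalone sketch, not part of the lesson code; names are invented.
public class NoiseFactors {

    // Noise factor = mean absolute deviation of the stopping points divided
    // by the commanded magnitude (distance in cm or angle in degrees).
    public static double noiseFactor(double[] absDeviations, double commanded) {
        return Arrays.stream(absDeviations).average().orElse(0.0) / commanded;
    }

    public static void main(String[] args) {
        double[] distDev  = {0.9, 0.5, 0.5, 0.1, 0.0, 0.2};  // cm, 50 cm runs
        double[] angleDev = {2.5, 1.0, 0.5, 0.5, 0.0, 0.0};  // deg, 90 deg runs
        System.out.println(noiseFactor(distDev, 50.0));   // ~0.00733
        System.out.println(noiseFactor(angleDev, 90.0));  // ~0.00833
    }
}
```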
|
|
|
|
|
|
Should the angle noise be a percentage, as with the distance noise? A quick test with 360 and 720 degree turns confirmed that increasing the number of degrees rotated also increases the deviation from the target rotation, meaning a percentage should be fine.
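To illustrate why a percentage matches this observation, here is a quick standalone simulation (not our actual test program) of the proportional model, where the heading error is `angle * factor * gaussian`. The spread of the error grows linearly with the commanded angle, as we saw on the robot:

```Java
import java.util.Random;

// Standalone sketch, not part of the lesson code; names are invented.
public class AngleNoiseScaling {

    // Sample standard deviation of the heading error under the proportional
    // noise model: error = angle * factor * gaussian.
    public static double sampleStd(double angle, double factor, int n, long seed) {
        Random rand = new Random(seed);
        double sum = 0.0, sumSq = 0.0;
        for (int i = 0; i < n; i++) {
            double e = angle * factor * rand.nextGaussian();
            sum += e;
            sumSq += e * e;
        }
        double mean = sum / n;
        return Math.sqrt(sumSq / n - mean * mean);
    }

    public static void main(String[] args) {
        double f = 0.00833;
        // Spread is ~0.75 deg for a 90 deg turn and four times that for 360 deg.
        System.out.println(sampleStd(90, f, 200_000, 42));
        System.out.println(sampleStd(360, f, 200_000, 42));
    }
}
```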
|
|
|
|
|
|
### Experimenting with the model
|
|
|
Below is a snippet of how the particle locations and angles are calculated in [5].
|
|
|
|
|
|
```C++
z[i].theta += t + r*ANGLE_P_NOISE*noise() + ANGLE_NOISE*noise() + t*ANGLE_R_NOISE*noise();
z[i].x += r*cos(z[i].theta) + R_NOISE*noise() + r*R_P_NOISE*noise();
z[i].y += r*sin(z[i].theta) + R_NOISE*noise() + r*R_P_NOISE*noise();
```
|
|
|
|
|
|
This is different from how the standard `applyMove()` method works in Lejos. The heading calculation above also depends on the distance traveled `r`, which is not the case in `applyMove()`; this is because the move function in [5] supports traveling and rotating in the same move. Furthermore, the heading calculation in [5] multiplies the angle turned `t` by an angle noise factor, whereas Lejos only adds a fixed amount of angle noise to the angle turned. This means the standard `applyMove()` gives a fixed angle deviation no matter how much we turn, even though our tests showed that the deviation increases with the angle turned. Lastly, the code above uses the distance traveled `r` when calculating the distance noise, while Lejos uses the distances traveled along the x- and y-axes separately. This leads to errors, since there will be no deviation on the y-axis when traveling along the x-axis and vice versa. Below is the Lejos `applyMove()`[2] code:
|
|
|
|
|
|
```Java
public void applyMove(Move move, float distanceNoiseFactor, float angleNoiseFactor)
{
    float ym = (move.getDistanceTraveled() * (
            (float) Math.sin(Math.toRadians(pose.getHeading()))));
    float xm = (move.getDistanceTraveled() * (
            (float) Math.cos(Math.toRadians(pose.getHeading()))));

    pose.setLocation(new Point(
            (float) (pose.getX() + xm + (distanceNoiseFactor * xm *
                    rand.nextGaussian())),
            (float) (pose.getY() + ym + (distanceNoiseFactor * ym *
                    rand.nextGaussian()))));
    pose.setHeading((float) (pose.getHeading() + move.getAngleTurned() +
            (angleNoiseFactor * rand.nextGaussian())));
    pose.setHeading((float) ((int) (pose.getHeading() + 0.5f) % 360));
}
```
|
|
|
|
|
|
### PilotRoute after improving `applyMove()`
|
|
|
After we had estimated the noise factors we changed them in `ParticleSet.java`[7] to:
|
|
|
|
|
|
```Java
private float distanceNoiseFactor = 0.00733f; //0.2f;
private float angleNoiseFactor = 0.00833f; //4f;
```
|
|
|
|
|
|
The angle noise factor is now a percentage instead of a constant that just gets added to the angle turned. Below is the improved `applyMove()`[6] method, which now also uses `move.getDistanceTraveled()` to calculate the distance noise and `move.getAngleTurned()` to calculate the angle noise.
|
|
|
|
|
|
```Java
public void applyMove(Move move, float distanceNoiseFactor, float angleNoiseFactor)
{
    float ym = (move.getDistanceTraveled() * (
            (float) Math.sin(Math.toRadians(pose.getHeading()))));
    float xm = (move.getDistanceTraveled() * (
            (float) Math.cos(Math.toRadians(pose.getHeading()))));

    pose.setLocation(new Point(
            (float) (pose.getX() + xm + (distanceNoiseFactor *
                    move.getDistanceTraveled() * rand.nextGaussian())),
            (float) (pose.getY() + ym + (distanceNoiseFactor *
                    move.getDistanceTraveled() * rand.nextGaussian()))));
    pose.setHeading((float) (pose.getHeading() + move.getAngleTurned() +
            (angleNoiseFactor * move.getAngleTurned() * rand.nextGaussian())));
    pose.setHeading((float) ((int) (pose.getHeading() + 0.5f) % 360));
}
```
|
|
|
|
|
|
The image below shows a new run using the newly estimated noise factors together with the new `applyMove()` method.
|
|
|
|
|
|
![PilotRoute test with improvements](http://i.imgur.com/sWeBG7a.png)
|
|
|
|
|
|
We can now see that the variance is much lower.
|
|
|
|
|
|
### PilotGUI improvements
|
|
|
Since our robot ended up stopping within 1-2 cm of the intended stopping point when running PilotRoute, there was no way of seeing the distribution of points on the PilotMonitor; they were too close to each other. To fix this we changed our unit of measurement from centimeters to millimeters in PilotRoute, which made the monitor 10 times more accurate. Furthermore, we increased the number of particles from 10 to 1000 and decreased their size to 1 to get results closer to those in [5]. Finally, we wanted to show the particles for each individual step, even after the robot had reached its destination. To do this we saved a copy of the particles array each time the GUI was updated and stored it in an ArrayList of ParticleSets. This list could then be used to paint all the particles that had been calculated throughout the route.
|
|
|
|
|
|
![PilotRoute with improved GUI](http://i.imgur.com/SaFOKfD.png)
|
|
|
|
|
|
We later discovered that the image above was not entirely accurate, since another set of particles was added on top of the existing particles after each rotate. This meant that on every turn there were two sets of particles, making it appear as if the robot was more certain of its position than it actually was. Here is a new picture where this has been fixed by only keeping the particles after the rotate:
|
|
|
|
|
|
![PilotRoute with improved GUI and fix](http://i.imgur.com/CCcAnyF.png)
|
|
|
|
|
|
The code used is shown below[8]:
|
|
|
|
|
|
```Java
public void update(ArrayList<Point> route, boolean isRotate)
{
    this.route = route;
    if (isRotate) {
        // Remove last move since rotate will appear on top.
        particleSets.remove(particleSets.size() - 1);
    }
    particleSets.add(new ParticleSet(particles));
    repaint();
}
```
|
|
|
|
|
|
## Position tracking while avoiding objects
|
|
|
We have started working on a behavior-based position tracking robot that avoids any objects in its way, but unfortunately we did not finish it. We will finish it together with Lab Lesson 11.
|
|
|
|
|
|
## Conclusion
|
|
|
In this lab lesson we have worked with localization of the vehicle by means of odometry, using the Lejos `DifferentialPilot`[2] and `OdometryPoseProvider`[2]. After calibrating, we reduced the systematic errors to a minimum.
|
|
|
Then we used the `PilotMonitor`[2] to estimate our endpoint. It turned out that the particle estimation and the actual end position matched well.
|
|
|
Lastly, we should have tried position tracking while avoiding objects. We did not finish this; it will be done together with the next lab lesson.
|
|
|
|
|
|
## References
|
|
|
[1], Exercise material lab 10 (http://legolab.cs.au.dk/DigitalControl.dir/NXT/Lesson10.dir/Lesson.html)
|
|
|
|
|
|
[2], Given code (http://legolab.cs.au.dk/DigitalControl.dir/NXT/Lesson10.dir/LessonPrograms.zip)
|
|
|
|
|
|
[3], PilotStraight code (https://gitlab.au.dk/lego-group-8/lego/blob/master/lesson10/NXT/PilotStraight.java)
|
|
|
|
|
|
[4], PilotTurn2 code (https://gitlab.au.dk/lego-group-8/lego/blob/master/lesson10/NXT/PilotTurn.java)
|
|
|
|
|
|
[5], Monte Carlo Localization, Particle2Dmotion (http://commons.wikimedia.org/wiki/File:Particle2dmotion.svg)
|
|
|
|
|
|
[6], New applyMove in Pilot.java (https://gitlab.au.dk/lego-group-8/lego/blob/master/lesson10/PC/Particle.java)
|
|
|
|
|
|
[7], New ParticleSet.java (https://gitlab.au.dk/lego-group-8/lego/blob/master/lesson10/PC/ParticleSet.java)
|
|
|
|
|
|
[8], New PilotGui.java (https://gitlab.au.dk/lego-group-8/lego/blob/master/lesson10/PC/PilotGUI.java)
|
|
\ No newline at end of file |