To detect the white edge of the arena plate, we wanted 2 light sensors in front of the robot.
We wanted to use the ultrasonic sensor to find our opponent without having to turn the robot. The best way to do that was to place the ultrasonic sensor on a motor, so it could be rotated independently.
![small_IMAG0691](http://gitlab.au.dk/uploads/lego-group-3/lego/c4aedf3d8a/small_IMAG0691.jpg)
Image 2: Partial build of the original idea.
We started building our robot, but we had not gotten far before we checked the weight and discovered it was far too heavy, so we had to cut down on a lot of things.
After cutting down, we had only small walls and no wall behind the robot. We also removed the light sensor at the back of the robot and made some other small optimizations, and we managed to get below 1 kg.
![small_IMAG0692](http://gitlab.au.dk/uploads/lego-group-3/lego/a6438303cf/small_IMAG0692.jpg)
Image 3: The ultrasonic version of the robot.
### Programming the robot
#### Line detection
The first behavior we implemented was detecting the line and turning away from it. The detection compares the measured light value with the midpoint between the calibrated black and white values: if the measured value is lower we see black, if higher we see white. The robot then turns on the spot in a direction dependent on which sensor was activated; if both sensors are activated, the robot first drives a little backwards.
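As an illustration, a threshold check and turn of this kind could look like the sketch below in leJOS Java. The sensor ports, motor assignments, calibration values and timings are placeholders and not the values from our actual code (see [8]).

```java
import lejos.nxt.LightSensor;
import lejos.nxt.Motor;
import lejos.nxt.SensorPort;

// Illustrative line-detection loop; ports, calibration values and timings
// are placeholders, not taken from our actual robot code.
public class LineDetectSketch {
    static final int BLACK = 35;                      // calibrated reading on the black mat
    static final int WHITE = 55;                      // calibrated reading on the white edge
    static final int THRESHOLD = (BLACK + WHITE) / 2; // midpoint between black and white

    public static void main(String[] args) throws InterruptedException {
        LightSensor left = new LightSensor(SensorPort.S1);
        LightSensor right = new LightSensor(SensorPort.S2);

        while (true) {
            boolean leftWhite = left.readValue() > THRESHOLD;
            boolean rightWhite = right.readValue() > THRESHOLD;

            if (leftWhite && rightWhite) {
                // Both sensors see the edge: back off a little first.
                Motor.A.backward();
                Motor.B.backward();
                Thread.sleep(400);
            } else if (leftWhite || rightWhite) {
                // Turn on the spot, away from the sensor that saw white.
                if (leftWhite) { Motor.A.forward(); Motor.B.backward(); }
                else           { Motor.A.backward(); Motor.B.forward(); }
                Thread.sleep(300);
            } else {
                // No edge in sight: keep driving forward.
                Motor.A.forward();
                Motor.B.forward();
            }
        }
    }
}
```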
#### Ultrasonic Behavior
As distance readings from the ultrasonic sensor block the thread, we implemented this behavior in a separate thread, from which the *Behavior* class could request the latest reading. Using the *NXTRegulatedMotor* class, we pointed the ultrasonic sensor forward, then 45 degrees to the right and finally 45 degrees to the left. Using the distance values from the three angles, we turned the robot towards the direction with the shortest distance.
However, we found that the motor-mounted ultrasonic sensor was simply too slow. It took about one second to complete the full sweep of forward, 45 degrees right and 45 degrees left, meaning that the estimate of the opponent's position will always be far too outdated for us to use it for anything constructive. One second is a lot of time in a Lego sumo match.
Because of the disappointing results, we chose not to use the ultrasonic sensor.
The implementation is found on GitLab[9].
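For illustration, a minimal sketch of such a scanning thread is shown below. The class name, sensor and motor ports, and sweep angles are our own placeholders and do not reflect the actual code in [9].

```java
import lejos.nxt.MotorPort;
import lejos.nxt.NXTRegulatedMotor;
import lejos.nxt.SensorPort;
import lejos.nxt.UltrasonicSensor;

// Sketch of a scanner thread that sweeps the ultrasonic sensor between
// -45, 0 and +45 degrees and remembers the direction of the shortest distance.
public class EnemyScannerSketch extends Thread {
    private final UltrasonicSensor sonar = new UltrasonicSensor(SensorPort.S3);
    private final NXTRegulatedMotor head = new NXTRegulatedMotor(MotorPort.C);
    private final int[] angles = { -45, 0, 45 };
    private volatile int bestAngle = 0;   // latest estimate, read by the behavior thread

    public int getBestAngle() {
        return bestAngle;
    }

    @Override
    public void run() {
        while (!isInterrupted()) {
            int shortest = Integer.MAX_VALUE;
            int shortestAngle = 0;
            for (int angle : angles) {
                head.rotateTo(angle);               // blocks until the sensor points there
                int distance = sonar.getDistance(); // blocking read, 255 = nothing seen
                if (distance < shortest) {
                    shortest = distance;
                    shortestAngle = angle;
                }
            }
            bestAngle = shortestAngle;              // a full sweep took roughly one second
        }
    }
}
```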
#### Bumpers
By discarding the motor and the ultrasonic sensor, we suddenly had a bit more weight to play with, so we could remount our back wall with touch sensors. And because we no longer needed a sensor port for the ultrasonic sensor, we were able to separate the front wall from the three other walls. The change is pictured below.
![IMAG0695](http://gitlab.au.dk/uploads/lego-group-3/lego/befac979c4/IMAG0695.jpg)
Image 4: The robot after removing the ultrasonic sensor and adding touch sensors in the back.
With this design, however, we have another problem. We only know whether there is something in front of us, which is what we want, or whether we have been touched on one of the three sides, which we do not want. Then we got the idea that we might be able to distinguish the walls from each other by small differences in the raw values from the touch sensors. This is possible if the sensors are inaccurate, i.e. they do not all produce the same value, but precise, i.e. each sensor keeps reproducing the same value.
We tested all the unused touch sensors in LegoLab and found that the raw value returned by SensorPort.readRawValue() ranged from 181 to 185, with the readings from each individual sensor being fairly consistent. We were able to pair touch sensors to get raw values of 98, 99 and 100. Readings could vary by +/- 3, but most readings from the paired sensors were the same. Although detecting which bumper is pressed using the raw values may sometimes be imprecise, it is still better than random guessing.
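A rough sketch of how this classification could be done is shown below. The sensor port, the unpressed threshold and the mapping from raw value to wall are assumptions for the example; only the calibrated values 98, 99 and 100 come from our measurements.

```java
import lejos.nxt.SensorPort;

// Sketch of classifying which wall bumper is pressed from the raw sensor value.
// The port, the unpressed limit and the wall-to-value mapping are placeholders.
public class BumperClassifierSketch {
    public enum Wall { NONE, LEFT, RIGHT, BACK }

    private static final SensorPort WALL_PORT = SensorPort.S4;
    private static final int UNPRESSED_LIMIT = 600;   // unpressed reads close to 1023
    private static final int[] CALIBRATED = { 98, 99, 100 };
    private static final Wall[] WALLS = { Wall.LEFT, Wall.RIGHT, Wall.BACK };

    public static Wall pressedWall() {
        int raw = WALL_PORT.readRawValue();
        if (raw > UNPRESSED_LIMIT) {
            return Wall.NONE;                         // nothing pressed
        }
        // Pick the wall whose calibrated value is closest to the reading;
        // with readings of +/- 3 this can occasionally misclassify.
        int best = 0;
        for (int i = 1; i < CALIBRATED.length; i++) {
            if (Math.abs(raw - CALIBRATED[i]) < Math.abs(raw - CALIBRATED[best])) {
                best = i;
            }
        }
        return WALLS[best];
    }
}
```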
We found that we were surprisingly good at detecting which sensors had been activated. Before testing we considered using an average over the latest measurements, an idea we discarded because our experiments showed it was unnecessary. We even considered connecting the front bumper to the side and rear bumpers in order to free up a sensor port, but we ended up keeping them separate, as adding another sensor would push the robot above the weight limit.
We decided that the best reaction when the front bumper is pressed is to drive forward. When a side bumper is pressed we turn towards the opponent, attempting to stay as near to the center as possible and preventing it from pushing us over the edge. When the back bumper is pressed we use the latest white-line detection to decide which way to turn, turning away from the line.
#### Motor speed
We tried setting the motor speed to the maximum possible. This resulted in the robot driving over the edge, as its momentum was too large for it to stop and turn in time. We solved this by setting a lower cruising speed. While an opponent is detected in front of the robot, we set the motor speed to the maximum, to get maximum force against the opponent.
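As a small illustration, switching between the two speeds could look like the sketch below; the concrete cruising speed and the use of Motor.A and Motor.B are placeholders, not our actual values.

```java
import lejos.nxt.Motor;

// Sketch of switching between a moderate cruising speed and full speed;
// the speed value and motor assignments are placeholders.
public class SpeedControlSketch {
    private static final int CRUISE_SPEED = 400;             // deg/s, low enough to stop at the edge
    private static final int MAX_SPEED = (int) Motor.A.getMaxSpeed();

    public static void cruise() {
        Motor.A.setSpeed(CRUISE_SPEED);
        Motor.B.setSpeed(CRUISE_SPEED);
    }

    public static void pushOpponent() {
        Motor.A.setSpeed(MAX_SPEED);                          // full force while pushing
        Motor.B.setSpeed(MAX_SPEED);
    }
}
```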
We also found that we could get more force by making the robot drive the other way. One reason was that when the front bumper was pressed, the wheels rubbed slightly against our construction; another was that our weight distribution was better suited to driving “backwards”. We therefore modified the robot to drive the other way, which was the final modification to the robot and can be seen in the picture below.
![IMAG0696](http://gitlab.au.dk/uploads/lego-group-3/lego/5a1736ce12/IMAG0696.jpg)
Image 5: Final version of the robot, after moving the light sensor to the other end and making that the front of the robot.
We ended up with the following behavior priority queue:
EXIT > Front touch > Sides touched > Avoid Edge > Cruise
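Below is a sketch of how such a priority ordering could be wired up with the standard leJOS subsumption classes; our robot uses the Arbitrator and Behavior classes from the course material instead, and the behaviors shown here are simplified placeholders (the full robot would use all five behaviors in the order listed above).

```java
import lejos.nxt.Button;
import lejos.nxt.Motor;
import lejos.robotics.subsumption.Arbitrator;
import lejos.robotics.subsumption.Behavior;

// Sketch of wiring behaviors up in priority order with the leJOS subsumption
// API; the behaviors below are simplified stand-ins for our real ones.
public class PrioritySketch {

    // Lowest-priority behavior: always willing to run, just cruises forward.
    static class Cruise implements Behavior {
        private volatile boolean suppressed;
        public boolean takeControl() { return true; }
        public void suppress() { suppressed = true; }
        public void action() {
            suppressed = false;
            Motor.A.forward();
            Motor.B.forward();
            while (!suppressed) Thread.yield();   // keep cruising until suppressed
            Motor.A.stop();
            Motor.B.stop();
        }
    }

    // Highest-priority behavior: stop the program when ESCAPE is pressed.
    static class Exit implements Behavior {
        public boolean takeControl() { return Button.ESCAPE.isDown(); }
        public void suppress() { }
        public void action() { System.exit(0); }
    }

    public static void main(String[] args) {
        // The last behavior in the array has the highest priority, so the full
        // robot would use: { cruise, avoidEdge, sideTouch, frontTouch, exit }.
        Behavior[] behaviors = { new Cruise(), new Exit() };
        new Arbitrator(behaviors).start();
    }
}
```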
# Conclusion
We have been very limited by the weight restriction, because of our large number of sensors and especially the RCX cables. We found that a motor-mounted ultrasonic sensor is simply too slow to use in a sumo fight; a solution could have been to use two ultrasonic sensors, but we did not have the weight or sensor ports to spare. Most interesting is our experience with using multiple touch sensors connected to the same port through RCX cables. It was surprising how well we could distinguish the sensors solely from small differences in the measured raw value.
[6], Arbitrator -
http://www.legolab.daimi.au.dk/DigitalControl.dir/NXT/Lesson10.dir/Arbitrator.java
[7], Dynamic behavior -
http://www.legolab.daimi.au.dk/DigitalControl.dir/NXT/Lesson10.dir/Behavior.java
[8], Source code of our sumo robot -
https://gitlab.au.dk/lego-group-3/lego/blob/master/Week9/SumoBot.java
[9], Source code for the ultrasonic sensor distance detection -
https://gitlab.au.dk/lego-group-3/lego/blob/master/Week9/EnemyDetector.java