To implement observable behaviors on a vehicle fitted with three kinds of sensor ...
|
|
|
|
|
## Plan
|
|
|
We will attempt to complete the exercises within 6 hours, since we have not managed to complete any of the previous lessons in less time than that.
|
|
|
|
|
|
Camilla will write the code, Emil will take notes, and Ida and Nicolai will be in charge of performing and visually documenting the experiments.
|
|
|
|
|
|
## Results
|
|
|
### Rebuilding the robot
|
We rebuilt the robot to use four sensors, of which the two touch sensors were placed ...
|
|
### Observing the Avoid behavior
|
|
|
We were provided with the program ***AvoidFigure9_3.java*** [4], which implements a behavior that tries to avoid obstacles using the ultrasonic sensor. Figure 2 shows a diagram of the robot's overall behavior when running it. We ran the program to observe how the robot actually behaved.
|
|
|
|
|
|
|
|
|
![The avoid behavior](https://gitlab.au.dk/LEGO/lego-kode/raw/master/week9/img/fig93.PNG)
|
|
|
*Figure 2: Diagram of the avoid behavior. The image is originally Figure 9.3 in [2].*
|
|
|
|
|
|
|
|
|
The program worked quite well. The robot successfully avoided small obstacles registered by the ultrasonic sensor, and when approaching a large obstacle or a corner, it seemed like the robot was attempting to find a way around the obstacle by scanning from side to side, turning its body and increasing the angle of its left turns with each try. When we looked into the program code, we saw that this observation was correct, albeit a little too specific: the robot increases its turn angle in the direction where it measures the largest distance to any obstacle - i.e. it is not necessarily the angle of the left turns that is increased. Video 1 shows the robot avoiding different obstacles (symbolized by Ida's arm) with the avoid behavior enabled, and at the end demonstrates that the touch sensors are indeed disabled.
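From our reading of the code, the turn selection can be sketched roughly as below. The class name, method signature, and values are our own illustrative assumptions, not the actual code from ***AvoidFigure9_3.java***:

```java
// Sketch of the turn logic we inferred from AvoidFigure9_3.java.
// All names and values here are illustrative assumptions, not the original code.
public class AvoidTurnSketch {

    /**
     * Returns a positive angle for a left turn and a negative angle for a
     * right turn. The magnitude grows by 'increment' degrees on every
     * consecutive call, and the direction is chosen as the side where the
     * ultrasonic sensor measured the most free space.
     */
    public static int nextTurnAngle(int leftDistance, int rightDistance,
                                    int previousAngle, int increment) {
        int magnitude = Math.abs(previousAngle) + increment;
        return (leftDistance >= rightDistance) ? magnitude : -magnitude;
    }
}
```

This matches what we observed: only when the left side keeps measuring the larger distance does the left-turn angle keep growing.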
|
|
|
|
|
|
[![Robot running avoid behavior](http://img.youtube.com/vi/4Xa1ZYIT-Rs/0.jpg)](https://www.youtube.com/watch?v=4Xa1ZYIT-Rs)
|
|
|
*Video 1: The robot running AvoidFigure9_3.java, implementing the avoid behavior*
|
|
|
|
|
|
|
|
|
#### Incorporating a 180 degree escape turn
|
|
|
|
|
|
As prescribed in the lesson plan, we modified ***AvoidFigure9_3.java*** [5] to make the robot perform a 180-degree turn when the front distance and both side distances were below the threshold value.
|
|
|
|
|
|
We tried changing the program by making the robot drive backwards a little when encountering an obstacle, check whether it was facing a big obstacle or a corner, and then spin around 180 degrees (by making one motor drive forward and the other drive backwards). Initially we made the robot perform the 180-degree turn for 1 second (1000 ms), which wasn't enough, but when we changed it to 2 seconds (2000 ms) it spun approximately 180 degrees. This behavior can be seen in video 2, where the robot turns 180 degrees when measuring a low distance in all three checks, but not when measuring a low distance in only some of them. Since we start backing up and turning directly after looking to the right, the robot doesn't go straight backwards relative to its original direction, but the overall behavior is still correct, since it doesn't spend too much time facing large obstacles. The skew might even help if the robot were to end up between two surfaces large enough to trigger the spin, a situation that could otherwise make it drive back and forth between the two obstacles.
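The condition and timing we ended up with can be sketched as follows. The class, names, and the threshold parameter are our own illustrative assumptions, not the actual constants from the modified program:

```java
// Sketch of our added 180-degree escape turn: only when the front, left
// and right distance checks are ALL below the threshold do we back up and
// spin in place. Names and the method shape are illustrative assumptions.
public class EscapeTurnSketch {

    // 1000 ms turned too little; roughly 2000 ms gave us about 180 degrees.
    public static final int SPIN_MS = 2000;

    /** True when the robot faces a big obstacle or a corner on all sides. */
    public static boolean shouldSpin(int frontDistance, int leftDistance,
                                     int rightDistance, int threshold) {
        return frontDistance < threshold
            && leftDistance < threshold
            && rightDistance < threshold;
    }
}
```

In the real program, a true result is followed by driving both motors in opposite directions for SPIN_MS milliseconds.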
|
|
|
|
We were provided with the program ***RobotFigure9_9.java*** [6] which implements ...
|
|
![Behavior control network](https://gitlab.au.dk/LEGO/lego-kode/raw/master/week9/img/fig99.PNG)
|
|
|
*Figure 3: Diagram of the behavior control network of section 9.5 in [2], excluding the Escape behavior. The original diagram, including the Escape behavior, is Figure 9.9 in [2].*
|
|
|
|
|
|
|
|
|
Running ***RobotFigure9_9***, the robot would stop and turn to both sides every time it was bothered by either an obstacle or light, in the same manner as with ***AvoidFigure9_3***. It was hard to distinguish whether it was reacting to light or an obstacle, as the responses looked the same (without watching the LCD screen, that is). Later, a closer look at the code confirmed that the behavior implementations were similar, in that they both steered toward the higher value (of either light reading or distance). The behavior of the initial implementation of ***RobotFigure9_9*** can be seen in video 3, where the robot avoids the walls and follows the increased light source of Ida's mobile phone.
|
|
|
|
|
|
[![Avoid follow cruise](http://img.youtube.com/vi/2qxAJpQgZMM/0.jpg)](https://www.youtube.com/watch?v=2qxAJpQgZMM)
|
|
|
*Video 3: The robot running with three behaviors: Cruise, Follow, and Avoid*
|
Furthermore, observing the robot running with the Follow behavior, we became aware ...
|
|
![Pale hand and dark shirt](https://gitlab.au.dk/LEGO/lego-kode/raw/master/week9/img/handshirt.PNG)
|
|
|
*Figure 4: Pale hand and t-shirt made of dark fabric. The hand reflected too much light back to the sensor, causing it not to register that the torch light was blocked, while the dark fabric successfully blocked the torch light.*
|
|
|
|
|
|
|
|
|
We speculate that this might lead to unintuitive behavior. For instance, if the robot gets close to a white wall, the Follow behavior might cause it to be drawn to the wall before the Avoid behavior kicks in and drives it away from the surface. In a dark room, where the ambient light set point is very low, it might even 'follow' a white wall from a significant distance, though we aren't sure how far from a white wall it needs to be for the reflection to be readable. However, we concluded that this would not affect the robot's Follow behavior in other cases, as it would just pick up on the red color spectrum of the ambient light or the mobile torch and still measure a higher value.
|
|
|
|
|
|
Additionally, we discovered a slight nuisance during closer observations of the Follow behavior. When left alone, the robot would constantly stop and check its sides for light sources, even when not given any significant light source from a torch, as shown in video 4.
|
|
|
|
|
|
[![Follow keeps checking](http://img.youtube.com/vi/--ZTLgyoltc/0.jpg)](https://www.youtube.com/watch?v=--ZTLgyoltc)
|
|
|
*Video 4: The Follow behavior constantly checks for light, without being given stimuli*
|
|
|
|
|
|
|
|
|
This was caused by the Follow behavior being programmed to trigger a 'search' for light as soon as the light sensor measured a light level higher than its initial reading from when the program started. Because of slight instability in the ambient lighting around the light sensor (especially while the robot was driving around), any time the light value fluctuated above the initial reading, it would result in a Follow 'check'. For the remainder of the lab exercise, we would launch the programs while holding the robot up in the air, with the light sensor facing the skylights. Thereby, the initial ambient light reading would be slightly higher than what the sensor measured when driving around horizontally on the tables in the building, causing it to only execute the Follow behavior when given noticeably interesting light sources, such as our torch. Alternatively, we could simply have modified the program to only trigger when measuring a light reading 'X' higher than the initial reading, but we opted for the quick hack instead, which worked very well.
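The alternative fix we mention, triggering only on readings some margin 'X' above the startup baseline, could look something like this sketch. The class and names are hypothetical, not part of ***RobotFigure9_9***:

```java
// Sketch of the margin-based trigger we considered instead of the
// hold-the-robot-toward-the-skylights hack. All names and values are
// illustrative assumptions, not the actual program's code.
public class FollowTriggerSketch {

    private final int baseline; // light reading taken once, at program start
    private final int margin;   // 'X': how much brighter than baseline counts as a light source

    public FollowTriggerSketch(int baseline, int margin) {
        this.baseline = baseline;
        this.margin = margin;
    }

    /** True only when the reading clearly exceeds the ambient baseline. */
    public boolean shouldCheck(int lightReading) {
        return lightReading > baseline + margin;
    }
}
```

With a suitable margin, small fluctuations in ambient light while driving would no longer trigger a Follow 'check'.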
|
|
|
|
|
|
|
|
|
### Implementing the Escape behavior
|
|
|
|
|
|
The next step in our experiments with a behavior-controlled robot was to implement a new behavior to use in ***RobotFigure9_9***. As suggested in the lesson plan, we used the pseudo-code on page 305 in [2] as inspiration for our own Escape behavior. This behavior should use the two touch sensors mounted on the front of the robot to avoid driving into small objects that might not be seen by the ultrasonic sensor.
|
|
|
|
|
|
|
|
|
Initially we basically copied the aforementioned pseudo-code (without the flags). This meant that the left and right turns were naively implemented by powering up one motor and leaving the other idle. This resulted in the robot driving even further into the detected object instead of avoiding it, as it turned too slowly. Instead we made the unused motor back up while the other drove forward, to make the robot turn more rapidly. This seemed to work, even though the run was affected by the other behaviors, as the avoid behavior would be triggered before the escape behavior. We therefore turned off the other behaviors to observe whether the escape behavior worked as expected. The result of the rapidly turning escape behavior with no avoid behavior can be seen in video 5.
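Our counter-rotation fix can be sketched as pure decision logic. The class name, the power value, and the array convention are our own illustrative choices, not the actual implementation in [6]:

```java
// Sketch of the Escape turn fix: instead of powering one motor and leaving
// the other idle (which turned too slowly), the opposite motor is reversed
// so the robot spins in place. Names and values are illustrative assumptions.
public class EscapeBehaviorSketch {

    /** Returns {leftPower, rightPower}; a negative value means backwards. */
    public static int[] motorPowers(boolean leftBumped, boolean rightBumped, int power) {
        if (leftBumped && rightBumped) {
            return new int[] { -power, -power }; // hit head-on: back straight out
        }
        if (leftBumped) {
            return new int[] { power, -power };  // left bumper hit: spin right
        }
        if (rightBumped) {
            return new int[] { -power, power };  // right bumper hit: spin left
        }
        return new int[] { power, power };       // nothing hit: keep cruising
    }
}
```

Reversing one motor doubles the differential between the wheels, which is what made the turn fast enough to clear the obstacle.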
|
|
|
|
|
|
[![Implemented Escape behavior](http://img.youtube.com/vi/nINQb163pyQ/0.jpg)](https://www.youtube.com/watch?v=nINQb163pyQ)
|
|
|
*Video 5: Robot avoiding obstacles using an Escape behavior via the TouchSensor and with no Avoid behavior.*
|
|
|
The final implementation of our Escape behavior can be found in [6].
|
|
|
|
|
|
### Motor for light sensor
|
|
|
|
|
|
After implementing the additional behavior for our behavior control, we were asked to mount the robot's light sensor on an additional motor, as seen in the lesson plan picture. This installation would enable the follow behavior to check for better light sources without turning the actual robot. To do this we started out by rebuilding the robot. The new look can be seen in figure 5, where the light sensor is now attached to a third motor.
|
|
|
|
|
|
![Rebuilt motorized LightSensor](http://i.imgur.com/Ek1SRyh.jpg?1)
|
|
|
*Figure 5: Rebuilt robot mounting the LightSensor on a motor*
|
|
|
|
|
|
|
|
|
The follow behavior implementation now needed an update to use the new motor attachment. Initially we simply replaced all calls to car.forward or car.backwards with the corresponding calls to the new motor, thereby using the same motor power values and millisecond delays. This, however, resulted in a very dramatic reaction where the robot nearly strangled itself in the attached wires, as seen in video 6, where a motor power of 70 and a delay of 500 ms were used.
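The change can be sketched with a minimal motor abstraction. The real leJOS motor class has a different API, so the SimpleMotor interface and all names here are our own stand-ins, and the delays are the ones we experimented with:

```java
// Sketch of the updated Follow 'look': the sweep now drives a dedicated
// sensor motor instead of the two drive motors. SimpleMotor is a stand-in
// abstraction; it is NOT the actual leJOS motor class.
interface SimpleMotor {
    void forward();
    void backward();
    void stop();
}

public class LightSweepSketch {

    /** One 'look': swing the sensor aside, sweep past centre, return home. */
    public static void look(SimpleMotor sensorMotor, int ms) {
        try {
            sensorMotor.forward();
            Thread.sleep(ms);        // 500 ms here was far too much for the light motor
            sensorMotor.backward();
            Thread.sleep(2L * ms);   // pass the centre to look the other way
            sensorMotor.forward();
            Thread.sleep(ms);        // back to the centre position
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        sensorMotor.stop();
    }
}
```

Keeping the drive motors out of the sweep is the whole point: the robot body stays put while only the sensor turns.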
|
|
|
|
|
|
[![Suicidal ChokeBot](http://img.youtube.com/vi/7dXc0_qPSww/0.jpg)](https://www.youtube.com/watch?v=7dXc0_qPSww)
|
|
|
*Video 6: Motorized LightSensor doesn't stop turning.*
|
|
|
|
|
|
|
|
|
After observing the dramatic turn of the light sensor, we realized that the turn angle was simply too big. The time spent turning and the motor power given to the two drive motors were meant to turn the entire robot, whereas we now only had to slightly shift the light sensor directly, and additionally didn't have the entire weight of the robot loading the motor (and thereby demanding more motor power to turn). There were two ways of fixing this: changing the motor power or the delay. We decided on changing the delay from the initial 500 ms to 100 ms, as the speed of the rotating light sensor seemed about right. This didn't work as expected, as the angle still seemed wrong.
|
|
|
|
|
|
At this point, we realized that a GUI for transmitting motor power and delay parameters while the program was running would have eased the search for the best-fitting parameter values. However, we judged that implementing such a GUI and fitting it to the existing programs would take longer than simply trying different values. So, we switched off all other behaviors to ease the observation of the follow behavior and started testing different combinations of motor power and delay values, beginning by varying the motor power with relatively small values.
|
|
|
|