We wrote a program, ***SoundHater.java*** [6], as the inhibitory version of ***SoundLover.java***.
|
*Video 5: The robot running SoundHater.java*
|
|
|
|
|
|
|
|
#### Dancing robot
|
|
|
|
|
|
Finally, we made the program ***TinyDancer.java*** [7], where we initially made the robot turn left if the sensor measured a sound level above 50 percent of the total spectrum, and turn right if the value was less than 50. The robot constantly turned left, as the sound level caused by the robot itself was too high. This can be seen in Video 6.
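The turn decision in ***TinyDancer.java*** boils down to a simple threshold rule. The sketch below is a hypothetical reconstruction (class and method names are our own, not taken from the actual program), with the decision separated from the leJOS motor calls so it can be checked in isolation:

```java
// Hypothetical sketch of TinyDancer's turn decision. Readings are the
// sound sensor's percentage of the total spectrum (0-100).
public class TinyDancerLogic {

    /** Returns "left" for readings above the threshold, "right" otherwise. */
    public static String decideTurn(int soundPercent, int threshold) {
        return soundPercent > threshold ? "left" : "right";
    }

    public static void main(String[] args) {
        // With the original threshold of 50, the robot's own motor noise
        // (readings above 50) keeps it turning left.
        System.out.println(decideTurn(72, 50)); // prints "left"
        System.out.println(decideTurn(30, 50)); // prints "right"
    }
}
```

This also makes the observed failure mode visible: any self-generated noise above the threshold locks the decision to "left".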
|
|
|
|
|
|
[![TinyDancer, 1st attempt](http://img.youtube.com/vi/gjiWjaYU4eA/0.jpg)](https://www.youtube.com/watch?v=gjiWjaYU4eA)

*Video 6: First attempt at running the TinyDancer program*
|
|
|
|
|
We then tried increasing the values of the threshold for turning left or right to compensate for the sound generated by the robot itself.
|
|
|
|
|
*Video 7: Our robot showing some serious moves with the modified TinyDancer program*
|
|
|
|
|
|
|
|
|
|
|
|
### Vehicle 2
|
|
|
|
|
|
|
|
|
|
For this vehicle design, the robot was rebuilt again, this time with two light sensors instead of the sound sensor, as shown in Figure 3.
|
|
|
|
|
|
![Robot with light sensors](https://gitlab.au.dk/LEGO/lego-kode/raw/master/week8/img/firefry1.PNG)
|
|
|
|
*Figure 3: Robot fitted with two light sensors*
|
|
|
|
|
|
|
|
|
|
We created a program, ***LightHater.java*** [8], which behaves similarly to ***SoundLover.java*** - the difference being that we now have two sensors, mapping one sensor's light readings to the power of one motor and the other sensor's readings to the power of the other motor. We created Braitenberg's Vehicle 2a (see Figure 4) by simply mapping the value measured by the left sensor to the left motor, and the value from the right sensor to the right motor. As we have programmed for an excitatory behaviour, high light levels on the left sensor will induce a high power on the left motor and as a result cause the robot to drive *away* from light sources. The resulting behaviour turned out to be that the robot mostly stood still, as the ambient light values in the room did not map to a motor power high enough to get the motors to actually move (the measured values and resulting motor power would lie around 57). However, when stimulated by a bright light source, such as the torch on our phone, the robot accurately drives away from it, as shown in Video 8. If we wanted the robot to properly drive around and watch out for light, we could do a small calibration by adding some extra base power to each motor, to get the robot to actually drive around in ambient lighting.
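The excitatory mapping is essentially an identity scaling from light reading to motor power. The sketch below is our own illustrative reconstruction (class, method, and constant names are hypothetical, and the base power value is an assumption, not a measurement), assuming the sensor reading is a 0-100 percentage:

```java
// Hypothetical sketch of LightHater's excitatory mapping (vehicle 2a):
// each sensor's light percentage (0-100) is used directly as the power
// of the motor on the same side. BASE_POWER illustrates the extra base
// power calibration discussed above; its value is an assumption.
public class LightHaterLogic {

    static final int BASE_POWER = 20; // assumed value, not measured

    /** Raw excitatory mapping: brighter light -> more power. */
    public static int excitatoryPower(int lightPercent) {
        return lightPercent;
    }

    /** Same mapping with a base offset added, capped at 100. */
    public static int calibratedExcitatoryPower(int lightPercent) {
        return Math.min(100, lightPercent + BASE_POWER);
    }

    public static void main(String[] args) {
        // Ambient light around 57% gives power ~57, which was too low to
        // move our robot; a base offset pushes it past the stall point.
        System.out.println(excitatoryPower(57));           // prints 57
        System.out.println(calibratedExcitatoryPower(57)); // prints 77
    }
}
```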
|
|
|
|
|
|
[![Light haters gonna hate](http://img.youtube.com/vi/R5ao7_WGzs8/0.jpg)](https://www.youtube.com/watch?v=R5ao7_WGzs8)
|
|
|
|
|
|
|
|
*Video 8: The robot moving away from bright light*
|
|
|
|
|
|
|
|
|
|
#### Light Lover
|
|
|
|
Next, we constructed vehicle 2b by simply swapping the connecting wires of the two sensors (the difference is illustrated in Figure 4, taken from the lesson plan), such that which sensor causes which motor to drive is flipped - i.e. the robot will drive towards light instead of away from it.
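The difference between the 2a and 2b wirings can be sketched as follows; this is a hypothetical illustration (all names are our own), not code from the actual programs:

```java
// Hypothetical sketch of the 2a (direct) vs 2b (crossed) wiring:
// given the two light percentages, return {leftPower, rightPower}.
public class VehicleWiring {

    /** Vehicle 2a: each sensor drives the motor on its own side. */
    public static int[] directPowers(int leftLight, int rightLight) {
        return new int[] { leftLight, rightLight };
    }

    /** Vehicle 2b: each sensor drives the motor on the opposite side. */
    public static int[] crossedPowers(int leftLight, int rightLight) {
        return new int[] { rightLight, leftLight };
    }

    public static void main(String[] args) {
        // Bright light on the left: 2a powers the left motor more, so the
        // robot veers right, away from the light; 2b powers the right
        // motor more, so the robot turns left, towards the light.
        System.out.println(java.util.Arrays.toString(directPowers(90, 40)));
        System.out.println(java.util.Arrays.toString(crossedPowers(90, 40)));
    }
}
```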
|
|
|
|
|
|
![Vehicles 2a and 2b](https://gitlab.au.dk/LEGO/lego-kode/raw/master/week8/img/2a2b.PNG)
|
|
|
|
*Figure 4: Illustration of vehicle 2a and vehicle 2b*
|
|
|
|
|
|
|
|
Unfortunately, we did not record a video of the robot running this behaviour.
|
|
|
|
|
|
|
|
|
|
We did not create similar programs with inhibitory connections, but they would be reasonably trivial to obtain simply by inverting the values used as parameters for the motor powers. This would cause the robot to stand still when measuring bright light and to drive towards darker (low-light) areas.
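A minimal sketch of such an inversion (hypothetical names, assuming 0-100 percentage readings as elsewhere in this report):

```java
// Hypothetical sketch of an inhibitory connection: the reading is
// inverted before being used as motor power, so bright light lowers
// the power instead of raising it.
public class InhibitoryLogic {

    /** Inhibitory mapping: brighter light -> less power. */
    public static int inhibitoryPower(int lightPercent) {
        return 100 - lightPercent;
    }

    public static void main(String[] args) {
        System.out.println(inhibitoryPower(95)); // bright: near standstill
        System.out.println(inhibitoryPower(10)); // dark: near full power
    }
}
```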
|
|
|
|
|
|
|
|
#### Several robots in the dark
|
|
|
|
We had planned to have several robots running ***LightHater.java*** move about in a dark room while wearing lights on their rears. However, as we were working on a Friday afternoon and no other groups were present in the Suze building, it was not possible to coordinate this, so we decided to merely discuss the expected result for a group of robots in a dark room running each of the following four variants. The results from this discussion are summarized in Table 1.
|
|
|
|
|
|
| Situation | Explanation |
| ------------- | ----------- |
| 2a Excitatory | Every time a robot gets close to another robot's rear, it sees more light and increases the power of the motor on the same side, so the robots drive away from each other, slowing down as they get separated. |
| 2a Inhibitory | The robots stop when getting close to another robot, as the high light reading results in less motor power. They power up their motors again when the other robot moves away and the sensors see less light, so the robots ultimately follow each other. |
| 2b Excitatory | Same behaviour as 2a Inhibitory: the robots power up the opposite motor when seeing more light, so they drive towards the light on the other robots' rears and thereby follow each other. |
| 2b Inhibitory | Same behaviour as 2a Excitatory: the robots power up their motors when seeing less light and thereby drive away from the light on the rears of the other robots. |
|
|
|
|
|
|
|
|
*Table 1: Expected behaviour of several light-bearing robots running variants of the LightHater program*
|
|
|
|
|
|
|
|
In our experiments so far we have included the entire range of raw light values from 0 to 1023 in our mapping calculations. This has not been a massive problem given the light environment we were in, but could be improved by adapting our calculations to include the maximum and minimum light values sensed during execution of our programs in this specific environment, and using these values as the mapping range in our calculations, similar to what Tom Dean suggests in [2].
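Such a calibrated mapping could look like the sketch below. It is our own illustration of the idea (all names are hypothetical), where the observed minimum and maximum stand for the extreme raw values sensed during execution:

```java
// Hypothetical sketch of calibrating the light-to-power mapping to the
// observed range of raw readings (0-1023 overall), instead of always
// using the full raw range.
public class CalibratedMapping {

    /** Maps a raw reading to a 0-100 power using the observed range. */
    public static int calibratedPower(int raw, int observedMin, int observedMax) {
        if (observedMax <= observedMin) {
            return 0; // no usable range observed yet
        }
        // Clamp so readings outside the observed range stay within 0-100.
        int clamped = Math.max(observedMin, Math.min(observedMax, raw));
        return 100 * (clamped - observedMin) / (observedMax - observedMin);
    }

    public static void main(String[] args) {
        // If raw readings in a room only span 400-700, mapping against the
        // full 0-1023 range wastes most of the power range; the calibrated
        // mapping spreads those readings over the whole 0-100 scale.
        System.out.println(calibratedPower(550, 400, 700)); // prints 50
        System.out.println(calibratedPower(700, 400, 700)); // prints 100
    }
}
```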
|
|
|
|
|