## Exercise 1: Self-balancing robot with light sensor

#### Task
Use the LEGO model suggested by Philippe Hurbain [2], and the Java program from Brian Bagnall [3], to start experimenting with a self-balancing robot. Under the heading Usage, Philippe Hurbain describes the conditions under which the NXTway works best. Try to follow some of his advice.
#### Plan
We first mounted the light sensor on the car as shown in the picture below, placed just above the ground surface. Last week we tested the light sensor and its readings on different surfaces and under different light settings, and these findings made it easy for us to choose the conditions for this exercise.
![fixed-axle](http://gitlab.au.dk/uploads/group-22/lego/9f18edf68a/fixed-axle.jpg)

##### Fig. 2 - Our robot with the fixed-axle improvement of Hurbain's model.
We had a hard time making the robot balance using the PID controller. We managed to make it balance for a few seconds, but even though we spent much time tweaking the PID values we did not succeed in keeping it balanced for longer (see fig. 3). First we conducted an experiment on a white surface in a bright room. We found that the light sensor was influenced by small changes in the light setting, e.g. our shadows. To overcome this we moved into a dark room with no light and tried the robot on a white paper surface. After some tweaking of the values we made the robot balance for about 2 seconds.
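The balance loop follows the standard PID pattern: the error is the light reading minus the reading at the balance point, and the motor power is a weighted sum of the error, its accumulated integral, and its rate of change. A minimal, hardware-free sketch (the gains, offset, and simulated readings below are illustrative, not our actual tuned values):

```java
// Minimal PID controller sketch for a light-sensor balancer.
// Gains and the simulated readings are illustrative only.
public class PidSketch {
    double kp, ki, kd;          // proportional, integral, derivative gains
    double integral, prevError; // controller state carried between steps

    PidSketch(double kp, double ki, double kd) {
        this.kp = kp; this.ki = ki; this.kd = kd;
    }

    // One control step: error = (light reading - balance offset),
    // dt = loop period in seconds; returns the motor power to apply.
    double step(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }

    public static void main(String[] args) {
        PidSketch pid = new PidSketch(2.0, 0.1, 0.5);
        int offset = 45; // light value measured at the balance point
        int[] readings = {45, 47, 50, 48, 44, 42, 45};
        for (int r : readings) {
            double power = pid.step(r - offset, 0.01);
            System.out.println("reading=" + r + " power=" + power);
        }
    }
}
```

On the NXT, the readings would come from the light sensor and the power would be sent to both wheel motors; tuning amounts to searching for gains where the oscillation around the offset stays bounded.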
[![image alt text](http://img.youtube.com/vi/k6nARWC63OY/0.jpg)](http://www.youtube.com/watch?v=k6nARWC63OY)
In the datalogger class we had to replace FileOutputStream with BufferedOutputStream.
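Replacing a bare FileOutputStream with a BufferedOutputStream amounts to wrapping the stream, so that many small writes are batched into fewer, larger writes to flash. A minimal `java.io` illustration (the file name and the logged values are arbitrary):

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class LoggerSketch {
    public static void main(String[] args) throws IOException {
        // Buffering batches the many per-sample writes into fewer
        // underlying writes, which is far cheaper than writing each
        // sample straight through an unbuffered FileOutputStream.
        BufferedOutputStream out =
            new BufferedOutputStream(new FileOutputStream("log.csv"));
        for (int i = 0; i < 100; i++) {
            out.write((i + "\n").getBytes());
        }
        out.close(); // close() flushes the remaining buffer to the file
    }
}
```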
![SejwayLight measurements](http://gitlab.au.dk/uploads/group-22/lego/ea5153b6c8/SejwayLight_measurements.png)

##### Fig. 4 - SejwayLight assisted balance measurements.
As can be seen in the graph in fig. 4, the light sensor reading oscillates around the balance point. It is important to note that the balance point changes depending on the surrounding environment, so the best place to make the robot balance is a room with no windows and a constant light level, free from outside influence.
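Because the balance point drifts with the ambient light, one way to cope is to recalibrate the offset at startup: hold the robot upright, sample the sensor a number of times, and use the average as the balance point. This is a sketch of that idea, not necessarily what the program we used does; the simulated readings stand in for calls to the light sensor:

```java
public class OffsetCalibration {
    // Average n light readings taken while the robot is held at balance.
    static double calibrateOffset(int[] samples) {
        long sum = 0;
        for (int s : samples) sum += s;
        return (double) sum / samples.length;
    }

    public static void main(String[] args) {
        // Simulated readings; on the NXT these would come from the
        // light sensor while the robot is held upright.
        int[] samples = {44, 45, 46, 45, 45};
        System.out.println("offset = " + calibrateOffset(samples));
    }
}
```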
---
We implemented the NXT GUI in our code with five parameters so that we could adjust them while holding the robot in balance.
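On the NXT the adjustment is done with the brick's buttons and LCD; stripped of the hardware, the selection/adjustment logic behind such a menu can be sketched as below. The class name `ParamMenu` and the initial values are hypothetical, not taken from our actual program:

```java
public class ParamMenu {
    // The five adjustable parameters: PID gains, balance offset, and SCALE.
    String[] names = {"P", "I", "D", "offset", "SCALE"};
    double[] values = {2.0, 0.1, 0.5, 45.0, 1.0};
    int selected = 0;

    // Cycle to the next parameter (e.g. on a right-button press).
    void next() { selected = (selected + 1) % names.length; }

    // Nudge the selected parameter (e.g. on up/down-button presses).
    void adjust(double delta) { values[selected] += delta; }

    public static void main(String[] args) {
        ParamMenu menu = new ParamMenu();
        menu.next();       // move selection from P to I
        menu.adjust(0.05); // bump the I gain
        System.out.println(menu.names[menu.selected] + " = "
                + menu.values[menu.selected]);
    }
}
```

The control loop reads `values` every iteration, so changes take effect immediately while the robot is held at balance.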
#### Results
During the testing of our robot we used the GUI (see fig. 5) to adjust the PID values as well as the offset and SCALE values. We found it necessary to be able to adjust the offset, as its value would change a little just from pressing the enter button to confirm a selection.
![PID_gui](http://gitlab.au.dk/uploads/group-22/lego/56fed5f205/PID_gui.png)

##### Fig. 5 - Control GUI used to adjust the PID values as well as offset (o) and SCALE (s).