>#### Activity duration:
> 4+8+6+5+4 hours
>
>#### Overall goals
>In this lab session we will try to make two different self-balancing LEGO robots. To do this we use three different sensors: a light sensor, a color sensor and a gyro sensor. The exercises for this week are as follows:
> * Exercise 1: Self-balancing robot with light sensor
> * Exercise 2: Choice of parameters on-the-fly
> * Exercise 3: Self-balancing robot with color sensor
> * Exercise 4: Self-balancing robot with gyro sensor
>
>#### Additional goal of this week
> * An explanation of the balance theory
>
>---
>
>## Exercise 1: Self-balancing robot with light sensor
>In the datalogger class we had to replace FileOutputStream with BufferedOutputStream.
>
>```java
> dos.writeFloat(i); // in order to send decimal numbers.
> dos.flush();
>
> String dString = dField.getText();
> int d = new Integer(dString).intValue();
> dos.writeInt(d);
> dos.flush();
>
> String oString = oField.getText();
> int o = new Integer(oString).intValue();
> dos.writeInt(o);
> dos.flush();
>
> String sString = sField.getText();
> int s = new Integer(sString).intValue();
> dos.writeInt(s);
> dos.flush();
> }
>```
>
> ##### Fig. 5 - Code from the PCcarController.java program where the I value needed to be changed to fix a Bluetooth connection problem.
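>A detail worth noting about the code in Fig. 5 is that the NXT side has to mirror it exactly: DataOutputStream and DataInputStream carry no type information, so a writeFloat on the PC must be matched by a readFloat at the same position on the robot, or every later value is misaligned. The sketch below demonstrates the pairing using in-memory streams in place of the Bluetooth connection; the class and method names are our own for illustration, not from the lab code.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class ParamProtocolDemo {

    // PC side: same order and types as in Fig. 5 - I as a float, then d, o, s as ints.
    public static byte[] encode(float i, int d, int o, int s) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream dos = new DataOutputStream(buf);
            dos.writeFloat(i);
            dos.writeInt(d);
            dos.writeInt(o);
            dos.writeInt(s);
            dos.flush();
            return buf.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e); // cannot happen with an in-memory stream
        }
    }

    // NXT side: must read back in exactly the same order and with the same types.
    public static float[] decode(byte[] data) {
        try {
            DataInputStream dis = new DataInputStream(new ByteArrayInputStream(data));
            float i = dis.readFloat();
            int d = dis.readInt();
            int o = dis.readInt();
            int s = dis.readInt();
            return new float[] { i, d, o, s };
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        float[] p = decode(encode(0.5f, 12, 540, 80));
        System.out.println("I=" + p[0] + " d=" + (int) p[1]
                + " o=" + (int) p[2] + " s=" + (int) p[3]);
    }
}
```

>On the real robot the two stream objects would be obtained from the leJOS Bluetooth connection instead of the byte-array buffers.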
>
>---
>
>## Exercise 3: Self-balancing robot with color sensor
>
>#### Task
>Use the idea and structure of the BlackWhiteSensor to program a class ThreeColorSensor that can detect three colors: black, green and white. Make a test program that investigates the usefulness of the class.
>The NXT Segway with rider, [7], uses a color sensor to measure the position of the robot. Try that as an alternative to the light sensor. Perhaps the physical structure of this robot is also better than the structure in [3] and [4], so it may be worth trying.
>
>#### Plan
>
>We built our Segway with rider according to the instructions on nxtprograms.com [7]. The only change we made to the build was replacing the “head” (ultrasonic sensor) with a gyro sensor to be used in a later exercise. We also ran experiments with both the color sensor and the light sensor to see if there would be any difference. We did not intend to use the third motor during this exercise.
>
>We implemented a third possibility, green, in the BlackWhiteSensor class. The program still uses the light sensor to determine the light values from the table, and with these corrections to the code we could sample black, white and green.
>In Lab session 1 [1] we used the light sensor to measure the values of different colors - green had a value between black and white. From this we determined that we needed two new thresholds in the code: one between white & green and one between black & green. White gives the highest values and black the lowest, so a reading between the two thresholds must be green.
>
>By calibrating the colors before the car starts running it is possible to find the median between the colors and use it as a threshold, which leaves room for some error. We also rewrote some of the code to give us the raw values of the light sensor, as the readings for black and green were quite close when using the readValue() method. This gave us slightly more precise readings and a somewhat larger error margin.
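>The two-threshold scheme above can be sketched as a small pure classifier: each threshold is the midpoint between two calibrated sample values, and a reading is black, green or white depending on which side of the thresholds it falls. The class name and the example calibration values are made up for illustration; the lab's actual ThreeColorSensor reads and calibrates the sensor itself.

```java
public class ThreeColorClassifier {

    public static final int BLACK = 0;
    public static final int GREEN = 1;
    public static final int WHITE = 2;

    private final int blackGreenThreshold; // midpoint of calibrated black and green samples
    private final int greenWhiteThreshold; // midpoint of calibrated green and white samples

    // The three arguments are raw light readings sampled during calibration.
    public ThreeColorClassifier(int blackValue, int greenValue, int whiteValue) {
        this.blackGreenThreshold = (blackValue + greenValue) / 2;
        this.greenWhiteThreshold = (greenValue + whiteValue) / 2;
    }

    // Black gives the lowest readings and white the highest, so anything
    // between the two thresholds is taken to be green.
    public int classify(int rawValue) {
        if (rawValue < blackGreenThreshold) {
            return BLACK;
        } else if (rawValue > greenWhiteThreshold) {
            return WHITE;
        }
        return GREEN;
    }

    public static void main(String[] args) {
        // Hypothetical calibration values; real ones come from sampling the sensor.
        ThreeColorClassifier c = new ThreeColorClassifier(300, 450, 700);
        System.out.println(c.classify(320)); // a reading near the black sample
    }
}
```

>Using midpoints keeps the error margin symmetric: a sample may drift up to half the distance to its neighbouring color before it is misclassified.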
>
>#### Results
>
>During the initial testing the program worked as intended: the robot drove right when the light sensor read a low (dark) value and left when it read a high (light) value. The problem occurred when, within a few seconds, the light sensor settled on the border between the black line and the white space around it. There it read a value midway between the high and low color thresholds, which was too similar to the light value for green. The robot therefore interpreted the border as green and stopped, as it was intended to do only when reading the real color green.
>
> ![3](http://gitlab.au.dk/uploads/group-22/lego/a4841d7341/3.png)
>
>We tested the robot with the color sensor in a similar way to exercise 1. We used the GUI to change the values while the program was running to get a more precisely balancing robot. To eliminate outside interference we did the test in a closed dark room with a white, shiny surface. The program from exercise 1 was modified to use a color sensor and to read the normalizedLightValue for the initial offset and the normVal. The program can be found at sejwayColor.java [??].
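>The role of that initial reading can be sketched as follows: the normalizedLightValue sampled at start-up becomes the offset (the balance point), and the controller then acts on the difference between the current normVal and that offset. This is an illustrative reduction with names of our own, not the full sejwayColor.java program.

```java
public class BalanceError {

    private final int offset; // normalizedLightValue sampled while the robot is held balanced

    public BalanceError(int initialReading) {
        this.offset = initialReading;
    }

    // The controller acts on the deviation from the balance point:
    // zero when upright, growing in magnitude as the robot tilts.
    public int error(int normVal) {
        return normVal - offset;
    }

    public static void main(String[] args) {
        BalanceError b = new BalanceError(500); // hypothetical start-up reading
        System.out.println(b.error(523));
    }
}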
>
>From this exercise we can conclude that the light sensor is not very useful for navigating by following lines of specific colors, as the way the light sensor works means that different situations - such as the border between black and white, and the actual color green - can produce nearly identical readings.
>
>---
>