TODO: fix other formatting issues that cannot be done in Google Docs
TODO: sort out references (to code, etc.)
TODO: insert images
## **Plan**
The discussion regarding the Y-shaped lines led to a suggestion: rather than having the robot turn on the platform, the shape might be utilized by having the robot go down the "leg" of the Y and then drive backwards up to the next platform, in the hopes of saving time. The idea is illustrated in Figure 1. There was debate as to whether this was doable (or practical). We might try it out when we get further into the project.
![Y-backing](https://gitlab.au.dk/LEGO/lego-kode/raw/master/week1011/img/Y-back_2.PNG)
*Figure 1: Diagrammatic presentation of the robot moving down the Y and then backing up.*
Now we hit a new obstacle. When hitting the second platform on the way down, our light sensors were hitting the floor, which meant they weren't reading light values correctly and would lose track of the black line. As a result, we had to rebuild the robot to place the light sensors further up. When initially placing the robot at the top of the platform, we were now able to get it to drive successfully all the way down to the green zone. However, we were no longer able to get the robot to act appropriately when driving up the track. The newly increased height of the light sensors meant that when driving up the ramps and onto a platform, the sensors would be raised so far above the ground that they weren't reading any reflected light. As a result they read 'black', causing the robot to think it was at the Y-section and should turn.
We were at an impasse trying to solve this problem. Lowering the light value below which readings would be considered 'black' on both sensors meant that the robot was no longer recognizing the Y-sections either. We briefly attempted to simply accept the reading when driving onto a platform as 'black', and coded a slight drive onto the platform followed by finding the next black line. The problem then became that no threshold for what was accepted as 'black' on both sensors resulted in reliable behavior. Either the robot wouldn't react to one or more of the four 'black' triggers (the two platforms and the two Y-sections), both on the way up and the way down, or it would trigger at times it wasn't supposed to at all. Video [TODO: video number] shows this problem, where the robot triggers two left turns when hitting the top of the second ramp, as a result of somehow still seeing black after hitting the line on the platform.
[TODO: Fix to markdown video format]
*Video X: The robot erroneously triggering two left turns at the top of the second ramp*
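For illustration, here is a minimal sketch of the kind of dual-sensor trigger logic we were tuning. The threshold, sensor ports and sleep duration are hypothetical placeholders, not the actual values from our program:

```java
import lejos.nxt.LightSensor;
import lejos.nxt.Motor;
import lejos.nxt.SensorPort;

// Hypothetical sketch of the dual-sensor 'black' trigger discussed above.
// Threshold, ports and timings are illustrative only.
public class BlackTriggerSketch {
    static final int BLACK_THRESHOLD = 35; // light percentage below which a reading counts as 'black'

    public static void main(String[] args) throws InterruptedException {
        LightSensor left = new LightSensor(SensorPort.S1);
        LightSensor right = new LightSensor(SensorPort.S2);

        while (true) {
            boolean bothBlack = left.readValue() < BLACK_THRESHOLD
                             && right.readValue() < BLACK_THRESHOLD;
            if (bothBlack) {
                // Accept the joint reading as a platform edge or Y-section:
                // drive slightly onto the platform, then resume line
                // following to find the next black line.
                Motor.A.forward();
                Motor.B.forward();
                Thread.sleep(700); // the "slight drive" onto the platform
            } else {
                // ... ordinary line-following reactions on single-sensor readings ...
            }
            Thread.sleep(10); // sampling delay
        }
    }
}
```

The difficulty described above lies entirely in choosing `BLACK_THRESHOLD`: too high and ramp crests trigger it, too low and the Y-sections are missed.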
We spent a long time fiddling with these triggers, but were never able to get the robot to behave reliably at all turns, both on the way up and down.
As an alternative to the above approach, which was beginning to seem complex, we began simultaneous work on a simpler approach, with the additional intention of focusing more on the track completion time.
The two light sensors' only purpose in this robot was to make sure we knew on which side of the robot the line was. We took advantage of their relative positions by keeping a state of which sensor last saw the black line. Partially knowing our position (or at least which side of the line we're on) means that we can make good use of the integral part of a PID controller, since that enables us to drive smoothly in a (sine-like) curve instead of the erratic behavior we get from reacting every time we sense black. The controller mainly uses the integral and proportional parts of PID, since those are the parts we thought the robot would benefit the most from. The robot remembers which sensor last saw black and decreases the power of the respective motor exponentially until it sees black again. This way we can have both motors running with a lot of power continuously.

A very naive implementation worked pretty well, and it didn't seem to benefit much from more advanced techniques, though that might have something to do with the light in the room, since the readings became more unstable. An improvement we tried was to note how many consecutive readings of black a sensor made and decrease the turn rate proportionally with the number of readings, since we want the turns to be as smooth as possible when we are close to following the direction of the line. This change didn't appear to improve anything in the testing we did. Another approach was to introduce more states so that we also react while sensing black, where the simple robot only starts turning after sensing black. This worked alright most of the time, but it made the robot a lot more complex without much to show for it. Because of that, and since time was definitely an issue for us at this point, we continued with the simple implementation.

This simple robot reached the second plateau a couple of times without any turning mechanism, mainly because it was lucky and avoided the black tape 'cross'. An improvement we considered was to deterministically avoid the cross, but we didn't look into that due to lack of time. What we ended up doing was to approximate what the tachometer would read at the different plateaus. The turns were approximated and hard-coded as well. This took a lot of tuning and calibration, but let us reach the top relatively easily. Getting down never happened, since the light readings (especially going down) became very unstable at night. Considering our experiences driving up to the top, getting back down wouldn't take that much effort, since we already had experience in driving down.

This robot is implemented in the program **_OldPIDTest.java_**. Some of the attempted improvements can be found in **_PIDTest.java_**. The program runs a loop with a sampling time of a few ms. In each sample, it first checks whether the tachometer is above the threshold at which it is supposed to do something, and if so does exactly that; otherwise it drives using the LineFollower functionality. The thresholds are approximated from observations and testing. The first turn should happen at approximately 2200 on the tachometer, which is supposed to be when the robot hits the first plateau. The next threshold is a bit different, according to new approximations stemming from observation and testing. As mentioned, this works all the way to the top when the stars align. The LineFollowing functionality works by taking a light reading on each light sensor. These readings lead into different cases, where the previous state is taken into consideration as well.

An improvement to this implementation could be to separate the functionality into behaviors, such as LineFollowing and Turning, with an arbitrator, since it could be useful to have the light sensing running while a hard-coded turn was happening. This way the robot would know where to look for the line after a turn, since this was mainly the place where the approximations failed: the robot would either over- or understeer and end up placed differently from where it thought it was. The robot drove fairly fast and did reach the top every once in a while, and even seemed consistent at times, but the approximations combined with external factors, such as battery power affecting the motor speeds, made it increasingly unstable the further up the ramp it got. The video [TODO: video ref number] shows the robot reaching the second plateau, but failing due to an oversteer. The different improvements we considered could probably have made it a lot more stable.
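To make the loop structure concrete, here is a minimal sketch of the approach. Apart from the ~2200 tacho count for the first plateau mentioned above, all constants, port assignments and the decay factor are hypothetical; the actual logic lives in **_OldPIDTest.java_**:

```java
import lejos.nxt.LightSensor;
import lejos.nxt.Motor;
import lejos.nxt.SensorPort;

// Sketch of the sampling loop described above: a tacho threshold triggers a
// hard-coded turn; otherwise the robot line-follows by remembering which
// sensor last saw black and exponentially decaying that side's motor speed.
public class TachoLineFollowSketch {
    static final int BLACK_THRESHOLD = 35;
    static final int FIRST_PLATEAU_TACHO = 2200; // approximated from testing
    static final int BASE_SPEED = 700;           // degrees/second

    public static void main(String[] args) throws InterruptedException {
        LightSensor left = new LightSensor(SensorPort.S1);
        LightSensor right = new LightSensor(SensorPort.S2);
        Motor.A.resetTachoCount();
        boolean lineOnLeft = true;   // state: which sensor last saw black
        int innerSpeed = BASE_SPEED; // decaying speed of the 'line side' motor

        while (true) {
            if (Motor.A.getTachoCount() > FIRST_PLATEAU_TACHO) {
                // Hard-coded, hand-tuned platform turn would go here.
                break;
            }
            if (left.readValue() < BLACK_THRESHOLD)  { lineOnLeft = true;  innerSpeed = BASE_SPEED; }
            if (right.readValue() < BLACK_THRESHOLD) { lineOnLeft = false; innerSpeed = BASE_SPEED; }

            // Decay the motor on the side that last saw black, so the robot
            // curves smoothly back towards the line instead of jerking.
            innerSpeed = (int) (innerSpeed * 0.95);
            Motor.A.setSpeed(lineOnLeft ? innerSpeed : BASE_SPEED); // A = left motor (assumed)
            Motor.B.setSpeed(lineOnLeft ? BASE_SPEED : innerSpeed);
            Motor.A.forward();
            Motor.B.forward();
            Thread.sleep(5); // sampling time of a few ms
        }
    }
}
```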
[TODO: Fix to markdown video format & number]
https://www.youtube.com/watch?v=w5JwM_dEYVQ
*Video X: OldPIDTest reaching the second plateau, but failing due to an oversteer*

[TODO EMIL: code-specific section - tacho counter, PID, behaviors as an improvement, e.g. so that we know where the line is while turning]
### Introducing DifferentialPilot
https://www.youtube.com/watch?v=EdtbRaXm6NA
*Video 4: SimplePilot doing well but eventually running off the track*
We had foreseen the issues with keeping the robot going in the right direction and not driving off the track: expecting to be able to place the robot completely aligned with the track, and expecting it not to diverge from this direction later on, is unrealistic in a real-world environment. Aside from human motor control and eye-measuring skills not being precise enough, irregularities in the surface and the tires of the robot, as well as imbalances in the robot's construction, all introduce drift in its course.
The robot may have had trouble getting safely up the ramps using this approach, but it was able to clear the platforms a lot faster compared to line following, so we decided to try and combine the two approaches.
Initially, we included code from **_GodBot.java_** [TODO ref] in a copy of **_SimplePilot.java_** [TODO ref], **_PilotedCarLineFollower.java_** [TODO ref], to drive the robot up the ramp instead of using pilot.forward(). The [API for DifferentialPilot](http://www.lejos.org/nxt/nxj/api/lejos/robotics/navigation/DifferentialPilot.html) states that the results of other objects making calls to the motors while DifferentialPilot is in use are unpredictable. We had trouble figuring out how to separate the line-following control from the differential pilot's control, to be able to switch between them. Our first attempt was to instantiate the DifferentialPilot within the if-statement that captures the case where the robot reads black on both sensors, so that it would only run in this isolated case and would be terminated upon exit from the if-statement's scope. The robot had no trouble getting up the first ramp using line following and individual control of the motors, and it performed the turn on the first platform - but after this, when exiting the if-statement, the robot started randomly turning and racing around. We spent several hours trying out different approaches to getting the differential pilot to work along with the GodBot line following, but did not succeed.
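In simplified form, the pattern we attempted looked roughly like the sketch below. The wheel diameter and track width are placeholder values, not our robot's measurements, and the surrounding line-following loop is omitted:

```java
import lejos.nxt.Motor;
import lejos.robotics.navigation.DifferentialPilot;

// Simplified sketch of our scoping attempt: the pilot exists only inside the
// both-sensors-black case, in the hope of isolating its motor control from
// the line follower's direct Motor.A/Motor.B calls.
public class ScopedPilotSketch {
    // Called from the line-following loop when both sensors read black.
    static void turnOnPlatform() {
        DifferentialPilot pilot = new DifferentialPilot(5.6, 12.0, Motor.A, Motor.B);
        pilot.setRotateSpeed(90);
        pilot.rotate(180); // turn around on the platform
        pilot.stop();
        // The pilot reference dies here, but Motor.A/Motor.B are the same
        // shared regulated-motor objects, which is presumably why resuming
        // direct motor control afterwards behaved so unpredictably.
    }
}
```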
##### DifferentialPilot line following
Deciding to take a break from attempting the combined approach, we resorted to implementing line following using the DifferentialPilot's arc-method, in the class **_PilotedLineFollower.java_** [TODO ref]. It proved difficult to properly steer the robot according to divergence in the black/white readings of the sensors. Though this was most likely due to a limited overview of how to utilize our error measurements in the arguments to **_arc()_**, and/or increasing tiredness and frustration levels, we temporarily gave up on creating a satisfactory implementation. Instead, we began an implementation using a switcher to switch between behaviors implemented in an interface, described in the section below. After some debate regarding that solution, we continued the attempt at implementing line following using DifferentialPilot. We experimented with different parameters for **_arc()_** and introduced a turn utilizing **_rotateRight()_**, turning until aligned with the bottom bar of the Y on the platform, in order to let the robot more easily find its way back on track after making a platform turn. We also made other minor additions. We managed to get the robot to the second platform, successfully performing the intermediate turn operation on the first platform (see Video 5). At some point, however, the robot began failing to detect the black end line - perhaps due to poor lighting in the later hours of the day, perhaps due to the sensors being raised (as they were sometimes scraping on the track), or perhaps simply due to low battery. As black sensing was still working for the other two ongoing implementations, we decided to leave the DifferentialPilot at this.
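A rough sketch of what arc-based following can look like is shown below. The wheel measurements, speeds, radii and dead band are all hypothetical, and the steering direction follows the DifferentialPilot API's sign convention, where a positive radius arcs left; **_PilotedLineFollower.java_** contains the actual attempt:

```java
import lejos.nxt.LightSensor;
import lejos.nxt.Motor;
import lejos.nxt.SensorPort;
import lejos.robotics.navigation.DifferentialPilot;

// Sketch of arc()-style line following: steer towards whichever sensor
// currently reads darker, using signed arc radii.
public class ArcFollowSketch {
    public static void main(String[] args) {
        DifferentialPilot pilot = new DifferentialPilot(5.6, 12.0, Motor.A, Motor.B);
        LightSensor left = new LightSensor(SensorPort.S1);
        LightSensor right = new LightSensor(SensorPort.S2);
        pilot.setTravelSpeed(20); // cm/s

        while (true) {
            int error = left.readValue() - right.readValue();
            if (error < -5) {
                pilot.arcForward(10);  // left is darker: arc left (positive radius)
            } else if (error > 5) {
                pilot.arcForward(-10); // right is darker: arc right (negative radius)
            } else {
                pilot.forward();       // roughly centered on the line
            }
        }
    }
}
```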
https://www.youtube.com/watch?v=LqK0Zw65I-c
*Video 5: PilotedLineFollower succeeding in performing the turn on the first platform*
##### Switching between navigation types
### Discussion of alternative approaches
Several times during the work of this lab session we discussed using behaviors and an arbiter to separate each behavior from the others. Instead, we used very long classes with a lot of if-statements inside a surrounding while-loop. This additionally means that we are currently relying on **_Thread.sleep()_** to make sure that each desired action is executed for the duration we want: for example, while turning 180 degrees we have to sleep so that the turn isn't disrupted by reading white along the way. A better structure would be to handle behaviors so that one behavior runs to completion before a lower-priority behavior can take over. Although the current approach is not optimal, we didn't feel like implementing the behavior-based approach, as we wanted to focus on getting the robot to climb the track.
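For reference, a minimal sketch of the behavior-based structure we discussed but never built, using leJOS's subsumption API. The concrete behaviors and the trigger predicate are placeholders:

```java
import lejos.subsumption.Arbitrator;
import lejos.subsumption.Behavior;

// Sketch of the discussed behavior-based design: the platform turn runs its
// action to completion, so no Thread.sleep() is needed to protect the
// 180-degree turn from stray white readings.
public class BehaviorSketch {
    public static void main(String[] args) {
        Behavior followLine = new Behavior() {
            public boolean takeControl() { return true; } // default, lowest priority
            public void action() { /* steer towards the line until suppressed */ }
            public void suppress() { /* stop steering */ }
        };
        Behavior platformTurn = new Behavior() {
            public boolean takeControl() { return bothSensorsBlack(); }
            public void action() { /* perform the full 180-degree platform turn */ }
            public void suppress() { }
        };
        // Highest array index has highest priority: the platform turn
        // preempts line following whenever both sensors read black.
        new Arbitrator(new Behavior[] { followLine, platformTurn }).start();
    }

    static boolean bothSensorsBlack() { return false; } // placeholder predicate
}
```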
Furthermore, we discussed using an ultrasound sensor mounted to the front of the robot, facing down, as a means of discovering when the robot was passing over a platform edge, from the increase in measured distance. We decided against exploring this much further, as our past experiences with the ultrasound sensor indicated that it was far too inaccurate at very short distances for us to get readings that we could reliably use to detect the platforms.
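Had we pursued it, the detection itself would have been simple; the sketch below shows the idea. The port and the 5 cm jump threshold are hypothetical, and the sensor's poor accuracy at very short range is exactly the weakness that made us drop it:

```java
import lejos.nxt.SensorPort;
import lejos.nxt.UltrasonicSensor;

// Sketch of the rejected idea: a downward-facing ultrasonic sensor whose
// distance reading jumps when the front of the robot passes a platform edge.
public class EdgeDetectSketch {
    public static void main(String[] args) {
        UltrasonicSensor sonar = new UltrasonicSensor(SensorPort.S3);
        int baseline = sonar.getDistance(); // distance (cm) to the ramp surface

        while (true) {
            if (sonar.getDistance() > baseline + 5) {
                // Platform edge detected: trigger platform handling here.
            }
        }
    }
}
```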
## **Conclusion**
The precision line follower, **_PIDGod.java_** [TODO ref], had a lot of success getting to the top of the platform, and once we raised the light sensors it was able to drive down as well. However, the raised sensors read black whenever they were too far off the ground while driving onto the platforms on the way up, so we were never able to get a reliable reaction to all of our desired turning triggers, and we could never get it to drive both up and down with a single build and program.
The other PID line follower approach, **_OldPIDTest.java_**, wasn't very consistent in reaching the top, but did it a lot faster. It was intended to be a simple solution, but it became obvious that the implementation could easily grow complex. At times it performed consistently, but it could also be very unstable. It wasn't entirely clear which factors affected the performance, but it seemed like battery level in particular, and the resulting motor power, affected both the tacho counter and the line following. The low level of ambient light also seemed to affect our light readings when attempting to drive down. With a few very good results and a relatively simple implementation, it definitely shows promise and serves as a fine proof of concept of what we initially attempted to achieve.
Our attempt to use the **_DifferentialPilot_** class illustrated the inconsistencies that the real world brings into robot programming: although **_DifferentialPilot_** is very effective at controlling the robot's movements precisely, the simple approach in **_SimplePilot.java_** is not enough to make a functioning hill-climbing robot, as external factors bring the robot off a perfectly straight course. The work on **_PilotedLineFollower.java_**, which adds line following to SimplePilot, showed promise, but due to time constraints we ended our experiments with it before having implemented a fully functional up-down robot.