We performed an initial run with the LineFollower program as it was in Lesson 1. Following the line went fine going up the first ramp, but then the robot lost track of the line and nearly toppled over the edge. Going down, the robot followed the line for a while before turning into a wall. Later in our experiments, we realized that the inexplicable turns were caused by the robot’s rear support wheel.
The LineFollower program from Lesson 1, when attempting to correct the robot’s course, turns the robot by running only one motor, which we also observed when we ran the program. Already at the very beginning of our experiments, we had discussed modifying the program so that the robot would always run both motors. The purpose of this modification was to increase the forward motion along the desired path, so that less time would be wasted in the turns.
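The idea can be sketched as follows. This is an illustrative sketch only - the power values and names are stand-ins, not taken from our actual program: instead of stopping one motor entirely during a correction, the turning side is merely slowed down, preserving forward motion.

```java
// Illustrative sketch of the two-motor correction: the original
// LineFollower stopped one motor to turn; here the motor on the
// turning side is only slowed down, never stopped.
public class TwoMotorCorrection {
    static final int FULL_POWER = 80; // hypothetical cruise power
    static final int TURN_POWER = 30; // reduced, but never zero

    // Returns {leftPower, rightPower}: slow the motor on the side that
    // should fall back, keep the other at full power.
    static int[] motorPowers(boolean onBlack) {
        return onBlack
                ? new int[] { TURN_POWER, FULL_POWER }  // steer one way while on the line
                : new int[] { FULL_POWER, TURN_POWER }; // steer back when off it
    }

    public static void main(String[] args) {
        int[] p = motorPowers(true);
        System.out.println(p[0] + "," + p[1]);
    }
}
```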
Running the InitialClimber program resulted in the robot driving exceptionally slowly up the ramp. As the robot drove onto the platform, it kept following the line, but instead of driving all the way to the end of the platform, it started turning in the middle, where the three black lines of the Y-shape meet. This can be seen in Video 2.
*Video 2: Initial run of InitialClimber, where the robot surprisingly turns in the middle of the first platform.*
We reasoned that the premature turn in the middle of the platform was caused by the two sensors coincidentally each being positioned above its own "arm" of the Y, both reading black at the same time and thereby triggering the turn. Instead of moving the sensors further apart to stop them from seeing black in the middle of the platform, we decided to decrease the turn, as turning in the center instead of at the edge would decrease the total time spent on the track and thereby improve our solution. We decreased the turn time and tried again.
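The trigger itself can be sketched roughly like this (the black threshold and turn time are stand-ins; the actual turn time was tuned by trial runs): both sensors reading below the black threshold at once is treated as "we are on a crossing", which starts a timed turn.

```java
// Sketch of the platform-turn trigger: both sensors below the black
// threshold at once starts a timed turn. All values are illustrative.
public class PlatformTurnTrigger {
    static final int BLACK_THRESHOLD = 40; // hypothetical light value
    static final int TURN_TIME_MS = 500;   // decreased after the premature-turn observation

    static boolean bothBlack(int leftReading, int rightReading) {
        return leftReading < BLACK_THRESHOLD && rightReading < BLACK_THRESHOLD;
    }

    public static void main(String[] args) {
        // In the middle of the platform, the sensors can sit on separate arms of the Y:
        System.out.println(bothBlack(30, 35)); // both dark -> turn for TURN_TIME_MS
        System.out.println(bothBlack(30, 60)); // only one dark -> normal line following
    }
}
```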
[![Robot reaching second platform](http://img.youtube.com/vi/Use9_ANpYOA/0.jpg)](https://www.youtube.com/watch?v=Use9_ANpYOA)
*Video 3: Robot reaching second platform with the InitialClimber program*
We were at an impasse trying to solve this problem. Lowering the light value below which a reading would be considered ‘black’ on both sensors meant that the robot no longer recognized the Y-sections either. We briefly attempted to simply accept the reading when driving onto a platform as ‘black’, adding code to drive slightly onto the platform and find the next black line. The problem then became that no threshold for what counted as ‘black’ on both sensors resulted in reliable behavior: either the robot wouldn’t react to one or more of the four ‘black’ triggers (the two platforms and the two Y-sections) - both on the way up and the way down - or it would trigger at times when it wasn’t supposed to at all. Video 4 shows this problem, where the robot triggers two left turns when hitting the top of the second ramp, as a result of somehow still seeing black after hitting the line on the platform.
[![PilotedLineFollower turning robot at first platform](http://img.youtube.com/vi/B-lPjFO3Bh0/0.jpg)](https://www.youtube.com/watch?v=B-lPjFO3Bh0)
*Video 4: PilotedLineFollower succeeding in performing the turn on the first platform*
The two light sensors’ only purpose in this robot was to let us know on which side of the robot the line was. We took advantage of their relative positions by keeping a state recording which sensor last saw the black line. Partially knowing our position (or at least which side of the line we’re on) means that we can make good use of the integral part of a PID controller, since it enables us to drive smoothly in a (sine-like) curve instead of the erratic behavior we get from reacting every time we sense black. The controller mainly uses the integral and the proportional parts of PID, since those are the parts we thought the robot would benefit the most from. The robot remembers which sensor last saw black and decreases the power of the respective motor exponentially until it sees black again. This way, we can have both motors running with a lot of power continuously.

A very naive implementation worked pretty well, and it didn’t seem to benefit much from more advanced techniques, though that might have something to do with the light in the room, since the readings became more unstable. One improvement we tried was to count how many readings of black a sensor made and decrease the turn rate proportionally with the number of readings, since we want the turns to be as smooth as possible when we are close to following the direction of the line. This change didn’t appear to improve anything in the testing we did. Another approach was to introduce more states so that we also react while sensing black, where the simple robot only starts turning after sensing black. This worked all right most of the time, but it made the robot a lot more complex without much to show for it. We therefore continued with the simple implementation, since time was definitely an issue for us at this point. This simple robot reached the second plateau a couple of times without any turning mechanism, mainly because it was lucky and avoided the black tape ‘cross’.
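The core of the simple implementation can be sketched like this. The constants and names are illustrative stand-ins (the real tuning lived in **_OldPIDTest.java_**): remember which sensor last saw black, and while both sensors read white, decay that side’s motor power exponentially each sample so the robot curves smoothly back towards the line.

```java
// Sketch of the state-based follower: the motor on the side that last
// saw black loses power exponentially each sample until that sensor
// sees black again; the other motor keeps full power throughout.
// All constants are illustrative.
public class StateFollower {
    static final int FULL_POWER = 90;
    static final double DECAY = 0.85; // per-sample decay factor

    enum Side { LEFT, RIGHT }

    Side lastBlack = Side.LEFT; // which sensor last saw black
    int samplesSinceBlack = 0;

    // Power for the motor on the side that last saw black.
    int turningPower() {
        return (int) Math.round(FULL_POWER * Math.pow(DECAY, samplesSinceBlack));
    }

    // One control sample: update the state from the two sensor readings.
    void sample(boolean leftBlack, boolean rightBlack) {
        if (leftBlack || rightBlack) {
            lastBlack = leftBlack ? Side.LEFT : Side.RIGHT;
            samplesSinceBlack = 0; // back on the line: reset the decay
        } else {
            samplesSinceBlack++;   // still on white: keep decaying
        }
    }
}
```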
An improvement we considered was to deterministically avoid the cross, but we didn’t look into that due to lack of time. What we ended up doing was to approximate what the tachometer would read at the different plateaus. The turns were approximated and hard-coded as well. This took a lot of tuning and calibration, but let us reach the top relatively easily. Getting down never happened, since the light readings (especially going down) became very unstable at night. Considering our experience driving up to the top, getting back down shouldn’t take that much effort, since we already had experience driving down. This robot is implemented in the program **_OldPIDTest.java_**. Some of the attempted improvements can be found in **_PIDTest.java_**.

The program runs a loop with a sampling time of a few ms. In each sample, it first checks whether the tachometer is above the threshold at which it is supposed to do something, and if so does exactly that; otherwise it drives using the LineFollower functionality. The thresholds are approximated from observations and testing. The first turn should happen at approximately 2200 on the tachometer, which is supposed to be when the robot hits the first plateau. The next threshold is a bit different, according to new approximations stemming from observation and testing. As mentioned, this works all the way to the top when the stars align. The LineFollowing functionality works by taking a light reading on each light sensor. These readings lead into different cases, where the previous state is taken into consideration as well. An improvement to this implementation could be to separate the functionality into behaviors, such as LineFollowing and Turning, with an arbitrator, since it could be useful to have the light sensing running while a hard-coded turn was happening. This way the robot would know where to look for the line after a turn, since this was mainly the place where the approximations failed.
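The loop’s decision structure can be sketched as follows. Only the first threshold (roughly 2200 at the first plateau) comes from our approximations; the second value and all names are placeholders.

```java
// Sketch of the per-sample decision in the loop: fire the next
// hard-coded turn once the tachometer passes its threshold, otherwise
// fall through to a line-following step. The first threshold (~2200)
// is from our approximations; the second is a placeholder.
public class TachoTurnScheduler {
    static final int[] TURN_THRESHOLDS = { 2200, 4800 };
    private int nextTurn = 0;

    // Returns the index of the hard-coded turn to perform now,
    // or -1 meaning "keep line-following this sample".
    int step(int tachoCount) {
        if (nextTurn < TURN_THRESHOLDS.length && tachoCount >= TURN_THRESHOLDS[nextTurn]) {
            return nextTurn++;
        }
        return -1;
    }
}
```

In the real program, the returned index would select one of the hard-coded turns, and the -1 branch would run a line-following step before sleeping for the few-ms sampling time.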
The robot would either over- or understeer and end up placed differently from where it thought it was. The robot drove fairly fast and did reach the top every once in a while, and even seemed consistent at times, but the approximations, combined with external factors such as battery power affecting the motor speeds, made it increasingly unstable the further up the ramp it got. Video 5 shows the robot reaching the second plateau, but failing due to an oversteer. The improvements we considered would probably have made it a lot more stable.
In most runs, the robot succeeded in passing the second platform onto the third ramp, but would then run into the wall or off the track because it had drifted too far off course. The turn on the top platform as well as the final turn were therefore tested separately, by commenting out sections of the code and placing the robot appropriately on the track.
*Video 6: SimplePilot doing well but eventually running off the track*
Deciding to take a break from attempting the combined approach, we resorted to implementing line following using the DifferentialPilot’s arc-method, in the class **_PilotedLineFollower.java_**. It proved difficult to steer the robot properly according to divergence in the black/white readings of the sensors. Though this was most likely due to a limited overview of how to utilize our error measurements in the arguments to **_arc()_**, and/or increasing tiredness and frustration levels, we temporarily gave up on creating a satisfactory implementation. Instead, we began an implementation using a switcher to switch between behaviors implemented in an interface. This is described in the section below.

After some debate regarding this solution (to be found in the section below), we continued the attempt at implementing line following using DifferentialPilot. We experimented with different parameters for **_arc()_** and introduced a turn utilizing **_rotateRight()_** to turn until aligned with the bottom bar of the Y on the platform, in order to let the robot more easily find its way back on track after making a platform turn. We also made other minor additions. We managed to get the robot to the second platform, successfully performing the intermediate turn operation on the first platform (see Video 7). At some point, however, the robot began failing to detect the black end line - perhaps due to poor lighting in the later hours of the day, perhaps due to the sensors being raised as they were sometimes scraping on the track, or perhaps simply due to low battery. As black sensing was still working for the other two ongoing implementations, we decided to leave the DifferentialPilot at this.
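One way to feed the sensor divergence into the arc-method - sketched here as a pure function, since we never settled on parameters that worked - is to map the difference between the two light readings to a signed turn radius. The scale constant and the sign convention below are our assumptions, not values from **_PilotedLineFollower.java_**.

```java
// Sketch: map the divergence of the two light readings to a signed
// turn radius, e.g. for a DifferentialPilot arc call. The scale
// constant and the sign convention are assumptions for illustration.
public class ArcSteering {
    static final double SCALE = 1000.0;   // hypothetical tuning constant
    static final double STRAIGHT = 1.0e9; // huge radius ~ drive straight

    static double arcRadius(int leftReading, int rightReading) {
        int error = leftReading - rightReading; // positive: left side brighter
        if (error == 0) {
            return STRAIGHT; // readings agree: no correction needed
        }
        // Larger divergence -> smaller radius -> sharper correction;
        // the sign of the error selects the turn direction.
        return SCALE / error;
    }
}
```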