This is the code for our design pattern: the robot will draw concentric squares. I left the comments in so it is easier to understand (although the pattern and code are simple enough that the comments are not strictly needed).

float _initialX, _initialY, _finalX, _finalY;
float _originX, _originY;
int _windowHeight, _windowWidth;
color[] _colors = {#CC0000, #000099, #00FF00}; // Colors are red, blue, and green respectively
float _lineDistanceX, _lineDistanceY;
float _percentHeight, _percentWidth; // Control the start point of the line as a percentage away from the edge
int _max; // Upper bound for the color array index

/**
 * setup() initializes all variables that will be used for the line drawing and loads the
 * default stroke color, a random choice between red, blue, and green.
 */
void setup()
{
  _windowHeight = 250;
  _windowWidth = 230;
  _percentHeight = 0.05;
  _percentWidth = 0.05;
  _initialX = _windowWidth * _percentWidth;   // Sets the origin of the maze at 5% of the screen width
  _initialY = _windowHeight * _percentHeight; // Sets the origin of the maze at 5% of the screen height
  _originX = _initialX;
  _originY = _initialY;
  _finalX = _initialX + 1;
  _finalY = _initialY;
  _lineDistanceY = _windowHeight * _percentHeight;
  _lineDistanceX = _windowWidth * _percentWidth;
  _max = 3;
  size(_windowWidth, _windowHeight);
  background(255);
  smooth();
  stroke(_colors[int(random(0, _max))]);
}

void draw()
{
  if (_finalX == (_windowWidth - _lineDistanceX) && _finalY != (_windowHeight - _lineDistanceY))
  {
    if (_initialY == _finalY)
    {
      _initialX = _finalX;
      stroke(_colors[int(random(0, _max))]);
    }
    _finalY++;
  }
  else if (_finalX == _originX && _finalY == (_windowHeight - _lineDistanceY))
  {
    if (_initialY == _finalY)
    {
      stroke(_colors[int(random(0, _max))]);
      _initialX = _finalX;
    }
    if (_lineDistanceY <= (_windowHeight * 0.50))
    {
      _finalY--;
    }
  }
  else if (_finalX <= _initialX && _finalY == (_windowHeight - _lineDistanceY))
  {
    if (_initialX == _finalX)
    {
      stroke(_colors[int(random(0, _max))]);
      _initialY = _finalY;
    }
    if (_lineDistanceX <= (_windowWidth * 0.50))
    {
      _finalX--;
    }
  }
  else if (_finalY == (_originY + 10) && _finalY != _initialY)
  {
    stroke(_colors[int(random(0, _max))]);
    _originX = _originX + 10;
    _originY = _finalY;
    _initialY = _finalY;
    _lineDistanceX = _lineDistanceX + 10;
    _lineDistanceY = _lineDistanceY + 10;
  }
  else if (_finalX >= _initialX && _finalY == _initialY)
  {
    if (_lineDistanceX <= (_windowWidth * 0.50))
    {
      _finalX++;
    }
  }
  else if (_finalX == _initialX && _finalY <= _initialY)
  {
    if (_lineDistanceY <= (_windowHeight * 0.50))
    {
      _finalY--;
    }
  }

  line(_initialX, _initialY, _finalX, _finalY);
}


Sorry for the late information. I have built a total of four walls, each 8 feet long (about 2.4 meters; the ultrasonic sensor's limit is 3 meters, although I have gotten readings from about 3.3 meters). The walls are made of MDF, because the original plan to make them of concrete had an obvious problem (transportation, due to the weight) and a not-so-obvious one: the porous texture of concrete causes an uneven distribution of paint and improper readings from the IR sensor. Although the concrete walls would have cost less, I still think the MDF walls provide better overall usability. When I tested them, the ultrasonic reading error was about 1% (at 100 cm, the sensor read 101 cm). I did not test the IR sensor, but considering that no corrections were made from the IR sensor during the initial run, I think it is acceptable. The paint I used to coat the walls is an oil-based, high-gloss white enamel; I was going to use a water-based paint, but everywhere I read about it, that paint was recommended for its light and heat absorption, which is not what we need here. I also made four barriers so that everyone can adapt this setup to fit their purpose. Next I will give the specifications of the MDF used.

 

The MDF is 1/2 inch thick (I believe thicker boards would not have provided much more benefit), and the texture is, of course, smooth. The walls and barriers are 6 inches high (the robot is only 4 1/16 inches high). The walls come in 4-foot segments (connected by hinges to make 8 feet), and the barriers are 1 foot long (the robot is 6 3/8 inches wide).


To be perfectly honest, I think that aside from the microcontroller and the sensors, everything else was of very poor quality. The metal body is very malleable, some of the plastic components came warped, and the design of the robot itself leaves much to be desired. I do like the Arduino board very much; it provides a lot of I/O (although it came very close to being insufficient) and is very easy to program and debug. The motors of the robot are very flimsy and prone to lag (at least in my experience). The ultrasonic sensor, although very good, would in my opinion have been better built ourselves from separate components, because even with a 200 ms delay between readings I still get a lot of timeouts when trying to read from it. With all of these issues, some good does come of them, because these are problems we would face in a real-world scenario. Because of these problems, I think the overall experience is more rewarding: overcoming the defects is half the fun, and it also leads to better code, because you have to account for more setbacks.
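
To give a concrete picture of how I work around the timeouts, here is a minimal Arduino-style sketch: bound the echo measurement with a timeout, retry a few times with the 200 ms delay mentioned above, and report a "no reading" value if every attempt fails. The trigger/echo pins and the HC-SR04-style interface are assumptions made for the example; they are not necessarily the exact sensor or wiring in our kit.

const int TRIG_PIN = 9;                         // assumed trigger pin
const int ECHO_PIN = 10;                        // assumed echo pin
const unsigned long ECHO_TIMEOUT_US = 30000UL;  // ~5 m round trip; pulseIn() returns 0 on timeout
const int MAX_RETRIES = 3;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

// Returns the measured distance in cm, or -1 if every attempt timed out.
long readDistanceCm() {
  for (int attempt = 0; attempt < MAX_RETRIES; attempt++) {
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);               // 10 us pulse starts a measurement
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);

    unsigned long echoUs = pulseIn(ECHO_PIN, HIGH, ECHO_TIMEOUT_US);
    if (echoUs > 0) {
      return echoUs / 58;                       // roughly 58 us per cm, round trip
    }
    delay(200);                                 // wait before retrying, as described above
  }
  return -1;                                    // give up and report "no reading"
}

void loop() {
  long distance = readDistanceCm();
  if (distance >= 0) {
    Serial.println(distance);
  } else {
    Serial.println("timeout");
  }
  delay(200);
}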

The Atmel AVR is unique because of its design and the fact that its architecture was designed with the C programming language in mind. It also has a wide operating-voltage range, which makes it ideal for most embedded projects. It uses a reduced instruction set and the Harvard architecture, a computer architecture in which the storage and the signal pathways for instructions and data are physically separated. Another great feature does not deal directly with the AVR itself but with the way you develop for it: although there are different models, there is one specific set of tools to deal with all of them. This is good because it makes the learning curve much smoother. They are also inexpensive compared with the alternatives.

This only applies to FAU students (in case someone else stumbles upon this blog):

We are required to read these books for class, and I found out that you can now save the PDF of each book so you always have it with you. To save the PDF I used the free PDF-XChange; it lets me save from online sources. I think Adobe Reader may support this as well, but I have not tried it.

Be sure you are on the campus WiFi (using the fau network, not fauguest). First go to http://fau.catalog.fcla.edu/fa.jsp; if you are connected to the network it will bypass all the required authentication, otherwise you will need to log in. Then search for “Atmel AVR Primer”; the first link is the book for the class (the second link seems to be pretty useful too, but I cannot find part II of it in the FAU catalog) and should take you to a PDF version of the book. If you are using PDF-XChange, the left-most button on the toolbar is the save icon; click on it and save. There are other ways of doing it too, depending on what browser you are using; some browsers have a resource sniffer that can rebuild PDFs.

For the second book, which is split into two parts, follow the same steps as before, except now search for “Arduino Microcontroller”; the first two links are the two parts of the book. Follow the same steps as before, and you will have all of the eBooks at your disposal. If you have any complications, get my email address from the Blackboard directory and send me an email.

We built our robot on Thursday, and I must say I was disappointed. The only thing we did was screw the parts together, which was difficult simply because the quality of the robot itself is extremely poor. The holes did not align, the screws were very easily damaged, and the metal was very flimsy and bent easily. The instructions given by the T.A. on how to build the robot were very clear and concise, so there was no misunderstanding; the only problem was the robot. The pictures below show the robot and a future part that will need to be attached and have connections soldered.

After reading the NSF proposal, the problem that seems to plague us the most is localization. Localization limits all functionality of the robot (except its ability to communicate with the controlling Android device, to a certain extent of course), and some of the methods we’ve talked about in class produce doubtful returns. I believe the answer lies in echolocation (ultrasound, and my personal area of interest). At first glance it may seem unreliable, because you will need to account for environmental noise, interference, and the overall margin of error. I don’t believe power consumption will be a problem, because sound will only be emitted and received at certain times, and even then power use should be minimal.

Now, my way of implementing the solution is naive, but I believe it could be effective. One problem you will notice in my implementation, which I have attempted to solve but which may require more “setup” of the robots prior to use, is at what height the ultrasound should be transmitted, because at the level of the robot there will be many obstacles. One solution is a mechanism that can raise the transmitter when needed, but then again, how would the height be determined? So this solution, to me anyway, is not feasible. The other solution depends on how the robots communicate with each other. Assuming they have full communication abilities (meaning not just a flashing LED), then at the start of a run the beacons would send a signal that localization is starting, and each beacon would emit its sound to check its surroundings. From there, a sequence would begin to identify which robot is where (to make what I mean easier to follow, I will scan a drawing later). Most of the heavy calculations would be done by the beacons themselves, and after they have compiled the data, the robots would receive a “map” with the resulting locations of all identified objects in the field.
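
To make the idea of that “map” a bit more concrete, here is a rough sketch of the kind of record the beacons might compile and broadcast back to the robots. The field names, the distance/bearing representation, and the fixed array sizes are all assumptions I am making for illustration, not a settled design.

#include <cstdint>

// One detected object as seen from a single beacon.
struct Detection {
    uint8_t beaconId;    // which beacon made the measurement
    float   distanceCm;  // echo distance from that beacon
    float   bearingDeg;  // direction the beacon was facing when it heard the echo
};

// One entry in the map that is broadcast to every robot.
struct MapEntry {
    int8_t    robotId;   // -1 if the object is an unidentified obstacle
    Detection seenBy[4]; // the same object as measured by each of the four beacons
};

// The full map: a short list of entries, small enough to transmit
// once the initial localization pass is complete.
struct FieldMap {
    MapEntry entries[16];
    uint8_t  count;
};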

So the sequence would be as follows: Beacon1 sends a sound pulse and records objects at the measured distances, followed by Beacon2, Beacon3, and Beacon4. From there the beacons begin requesting the other robots to emit the sound, one robot at a time. The beacons listen for the sound as a robot emits it; then they try to verify whether the results of the ping (as transmitted by the robot after it has emitted and received the sound) match any of the objects detected in the original sweep. If they do, an ID is attached to the object that was pinged, and the procedure continues down the list of robots. At the end of the procedure the beacons would know the location of all the robots and some of the possible obstacles, and with this information the next step would be to create a “map” of IDs and distances and transmit it back to the robots in the field. After every completed movement there would need to be a “ping” operation to ensure the map remains accurate; this ping would not be as extensive as the initial setup, since it would only be done by the robot that moved, with the results transferred to the beacons for processing.
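
Here is a small, host-side C++ sketch of the matching step described above: the beacon keeps the distances it measured during its initial sweep, and when a robot reports its own ping distance, the beacon looks for the stored detection that agrees within some tolerance and tags it with that robot's ID. The 5 cm tolerance and the one-distance-per-object simplification are assumptions; a real implementation would likely combine measurements from several beacons before deciding.

#include <cmath>
#include <cstdio>
#include <vector>

struct SweepHit {
    float distanceCm;   // distance measured by the beacon during the initial sweep
    int   robotId;      // -1 until a robot's ping is matched to this detection
};

// Match a robot's reported ping distance to the closest unidentified detection.
// Returns the index of the matched detection, or -1 if nothing is close enough.
int matchPing(std::vector<SweepHit>& hits, float reportedCm, int robotId,
              float toleranceCm = 5.0f) {
    int   best = -1;
    float bestErr = toleranceCm;
    for (size_t i = 0; i < hits.size(); ++i) {
        if (hits[i].robotId != -1) continue;               // already identified
        float err = std::fabs(hits[i].distanceCm - reportedCm);
        if (err <= bestErr) {
            bestErr = err;
            best = static_cast<int>(i);
        }
    }
    if (best != -1) hits[best].robotId = robotId;          // attach the robot's ID
    return best;
}

int main() {
    // Distances Beacon1 recorded during its initial sweep (cm).
    std::vector<SweepHit> seen = {{120.0f, -1}, {212.0f, -1}, {87.0f, -1}};

    // Robot 2 emits its ping and reports that it measured roughly 118 cm.
    int idx = matchPing(seen, 118.0f, 2);
    if (idx >= 0)
        std::printf("Detection %d identified as robot %d\n", idx, seen[idx].robotId);
    else
        std::printf("No detection matched; treat it as a new obstacle\n");
    return 0;
}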

I have only thought out the software side of this tactic; I have not really looked into the hardware's abilities and limitations.