Up until this year, the Hydrobot has really been more of a fancy hydrogen-powered remote control car than a robot. However, during the 2003/04 school year the power supply was finally made to work reliably and all sorts of mechanical and electrical issues were sorted out. It was time to make the robot behave like a robot!
We made the decision to have me add the sensors over the summer. This was not because the club members are incapable of doing so. Rather, it was because the club has spent two years working on the hardware, and we want them to be able to focus on the programming and behavioral aspects of the robot this year.
The pan/tilt sensor array has a Devantech SRF04 ultrasonic range detector, a CMUcam, and a Sharp GP2D12 infrared range detector. The array can pan 180° and tilt about 45°. The tilt range could be improved with a little re-engineering and some mount modifications, and I hope to sic a student or two on that this year.
All parts for the sensor array were made by hand using simple tools. The frame was made out of common brass stock available at hardware or hobby stores, 6.35mm by 1.57mm and 6.35mm by 0.89mm (1/4" by .062" and 1/4" by .035" for you stone-agers). The servo mounting platforms are made out of PCB scraps, 1.60mm in thickness.
As you can see in the picture, I made the box frame for the mount with the thinner brass stock. In order to mount the SRF04, I had to make the cutouts to accept the transducers. The SRF04 is mounted with two layers of electrical tape on the bracket. Otherwise, the bracket might short out the traces on the board.
The servo lever was soldered to the frame, then the hole was drilled for the bolt. The two thicker "arms" were placed in a small hobby vise, then twisted with a crescent wrench. The connecting rod between the tilt servo and the lever is simply a length of 12 gauge copper wire. Once it was all mounted, I tweaked the mounting arms and frame until it was all level/plumb and straight, or at least somewhat so.
In the "if I could do it over again" department, I wish I had used much thinner and more supple wire for the sensor array, and that I had put plugs on the ends instead of soldering the wires directly to the sensors. The overly heavy wire slows down movement and makes it harder for the servos to reach the far end of their throw. With enough use, I am sure we will eventually run into connection problems and have to replace the wires at that point. For now, it works!
Once the array was mounted on the robot, I played around a bit with the SRF04. It's a cool sensor, but there are certain angles that it does not like. This is to be expected from the nature of the beast, not any slam on the sensor. One of the angles it does not like is when the array is tilted as low as it can go. We want to be able to look for a sharp drop-off, say the edge of a table.
So, we decided to try the Sharp GP2D12 infrared range detector as a complement to the SRF04. As usual, my first move was to call the manufacturer and see if I could get them donated. After a little looking and calling around I was given a number in San Jose to try. Once I was connected to the right person, I started my spiel: "Hi, I'm working with the Mendocino High School on a hydrogen fuel cell powered robot…" when I was interrupted with "What high school did you say?". It turns out that the person on the other end of the line was Correy Robinson, class of 1987!! What a case of serendipity! Correy very promptly sent us six GP2D12 sensors. Thanks to both Correy and Sharp!
The GP2D12 sensor outputs a voltage that varies with distance, but the relationship is not linear. I was struggling with the problem, so I consulted Tracy Allen of Electronically Monitored Ecosystems, Basic Stamp guru and math genius, although he will tell you his math is so-so. Ha! Anyway, he wrote a formula based on the data sheet and I tried it out, but it didn't work very well.
To figure it out, he asked for a set of readings at every 10cm, which I took very carefully and sent to him. Jon Williams of Parallax also took some readings from a sensor he had. It became obvious that there is some variation between individual sensors. So, Tracy came up with a formula and a PBasic routine to calibrate each sensor, and it works a treat! It will be really handy to be able to calibrate the sensors for different lighting conditions and surfaces. Thanks a bunch, Tracy! The sample code that resulted can be downloaded HERE. You can also find a page with more information about using one of these sensors HERE.
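Tracy's actual formula and PBasic routine are in the download above. As an illustration of the general idea, here is a Python sketch of one common way to calibrate a GP2D12: assume the model reading ≈ m/distance − b, fit m and b for each individual sensor from readings taken at known distances, then invert the model. The sample readings below are invented placeholders, not our measured data.

```python
# Sketch of per-sensor GP2D12 calibration (not Tracy's PBasic routine).
# Model: reading = m * (1/distance) - b, which is linear in 1/distance,
# so an ordinary least-squares line fit recovers m and b.

def fit_gp2d12(samples):
    """Least-squares fit of reading = m*(1/distance) - b."""
    xs = [1.0 / d for d, _ in samples]
    ys = [r for _, r in samples]
    n = len(samples)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = m * (sx / n) - (sy / n)      # intercept of the fitted line is -b
    return m, b

def reading_to_cm(reading, m, b):
    """Invert the model: distance = m / (reading + b)."""
    return m / (reading + b)

# Hypothetical (distance_cm, sensor_reading) pairs every 10cm -- not real data:
samples = [(10, 520), (20, 270), (30, 190), (40, 145),
           (50, 120), (60, 105), (70, 93), (80, 85)]
m, b = fit_gp2d12(samples)
```

Because m and b are fitted per sensor, the same routine also lets you re-calibrate for different lighting conditions and surfaces, as mentioned above.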
Even calibrated, the sensors work best in the first half of their range. I found the 10uF cap called for in the data sheet too low; by experimenting I determined that about 33uF seems best. More makes no difference, and with less the jitter increases. One problem with the sensor is that when there is no target in range, wacky readings can result, and they may look like legitimate numbers, falling within the sensor's 10 to 80cm range. This will be a challenge to work with. As with the SRF04, it has angles it doesn't like, but we are hoping that between the two we will do OK.
The Hydrobot currently has three of the GP2D12 sensors mounted, one on the pan/tilt array, and two mounted on the side of the robot body. The purpose of the two on the side is to determine when the robot is parallel to a wall. We hope this will be useful in maze solving and room mapping.
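With two side sensors, checking for "parallel" boils down to comparing their readings: equal distances mean the side of the robot is parallel to the wall, and the difference gives the angle. A quick Python sketch of the geometry, using a made-up mounting separation of 30cm (substitute the real spacing between the two sensors):

```python
import math

# Estimate the angle between the robot's side and a wall from the two
# side-mounted sensors.  baseline_cm is the fore-aft distance between the
# sensors (30cm is a placeholder, not the Hydrobot's actual spacing).

def wall_angle_deg(front_cm, rear_cm, baseline_cm=30.0):
    """0 degrees means parallel; sign tells which way the robot is angled."""
    return math.degrees(math.atan2(front_cm - rear_cm, baseline_cm))
```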
Side-mounted Sharp GP2D12 sensors
NOTE: In order to use the CMUcam in the manner described below, you have to modify it slightly and probably void the warranty of the 'cam from Parallax. I'm sure they do not recommend you do so, and I say if you try it, you are on your own! However, it has worked fine for me.
To modify the CMUcam, all you have to do is remove the strap across the jumper terminals that select the baud rate on the 'cam. With no jumpers, the default is 115200, just what we want. I decided to put a SIP header on there so I can easily select baud rates if I want to. So far, though, it has worked fine at 115200, and I don't think the SIP header is needed. While Parallax modestly claims 115200 to be too fast for the 2p40, we have had no problems at that speed in this application. This is not to say that I would expect it to work under all circumstances.
The output of the CMUcam is connected directly to the EB500 module, but the input is connected to the Stamp. This bypasses the Stamp on the way out, dumping the CMUcam data directly to the PC. In operation, the PC passes commands to the Stamp, which in turn passes them to the 'cam. This arrangement means that no accidental commands can reach the 'cam from other data passing through.
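The data flow can be illustrated with a toy Python model. The class names and whitelist here are my own illustration of the arrangement, not the Hydrobot's actual PBasic code; GV, TC, GM, and DF are a few of the CMUcam's real serial commands.

```python
# Toy model of the wiring described above: commands flow PC -> Stamp ->
# CMUcam, while CMUcam output goes straight out over the EB500 link and
# never passes back through the Stamp.

class FakeCam:
    """Stands in for the CMUcam's command input."""
    def __init__(self):
        self.received = []
    def send(self, line):
        self.received.append(line)
        return "ACK"

class Stamp:
    """Forwards only recognized CMUcam commands; drops everything else,
    so stray traffic can never reach the 'cam."""
    CAM_COMMANDS = {"GV", "TC", "GM", "DF"}   # get version, track color,
                                              # get mean, dump frame
    def __init__(self, cam):
        self.cam = cam

    def on_pc_data(self, line):
        parts = line.strip().split()
        cmd = parts[0] if parts else ""
        if cmd in self.CAM_COMMANDS:
            return self.cam.send(line)
        return None     # not a 'cam command: never forwarded
```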
So far, I am beating my head silly trying to get a pic from the CMUcam to PC, or rather I can get the data there, but I can't manage to display it yet. I'll either figure it out or have "QWERTY" permanently tattooed on my forehead!
As yet, very little practical testing and use of the sensor array has taken place. The idea is to pursue maze solving and room mapping. This is the job of the club members, and it will be very interesting to see what uses they put the array to. Whatever happens, it is going to be cool to see the robot evolve from a fancy remote control car to a real robot!