Tuesday, May 28, 2013

Robot Arm: Part 1

In Part 0, I covered the motivation for this project as well as a general outline for how I will proceed. In this part, I'll try to cover the hardware modifications and basic electronics I've used to control the robot arm.

To start, how does the arm work once it is assembled? There are 5 DC motors that control the different joints: one to control the base swivel, three to bend the arm joints up and down, and one to open and close the gripper. Each motor is driven by applying a voltage across its leads, positive for one direction, negative for the other. There is a hand controller with rocker switches to apply these voltages in either direction. The whole thing is powered by 4 D-size batteries in series. Each cell provides about 1.5V, but the motors are not driven with the full 6V possible with all of them in series. Instead, 'ground' is taken as halfway up this stack, so that the hand controller can apply +/- 3V to each motor depending on the position of the switches.

I've chosen to use an Arduino Pro Mini as the on-board controller for this project, so the first step is to find a way to supply each motor with the proper voltage and current using 5V logic. The Arduino IO pins can't provide enough current to move the motors, so I use a couple of H-bridge drivers instead. Each driver can apply a given voltage to two DC motors in either direction based on input from a microcontroller. With five motors to control, I hook up 3 of these drivers to a protoboard and wire the control lines to the Arduino IO pins.
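Each H-bridge channel is controlled by two logic inputs whose combination selects forward, reverse, or off. Here's a rough sketch of that mapping as a pure function (the pin naming and the -1/0/+1 command convention are my own for illustration, not from any particular driver's datasheet):

```cpp
#include <cassert>

// One H-bridge channel takes two logic inputs; the pair of states
// determines whether the motor runs forward, reverse, or coasts.
struct BridgePins {
    bool in1;  // first driver input
    bool in2;  // second driver input
};

// Map a signed command (-1 = reverse, 0 = stop, +1 = forward)
// to the two driver input states. Both inputs low lets the motor
// coast; opposite states select the direction.
BridgePins driveCommand(int direction) {
    BridgePins p{false, false};             // default: motor off
    if (direction > 0)      p.in1 = true;   // forward
    else if (direction < 0) p.in2 = true;   // reverse
    return p;
}
```

On the Arduino side, each of these booleans just becomes a digitalWrite to the corresponding driver input pin.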


It's a mess of wires, but some of that is to allow the arm to move freely.


Instead of sticking with the original 3V for each motor, I've gone ahead and allowed for the full 6V of the batteries to be used. The danger here is probably overheating the motors, so I'll have to be cautious. If this turns out to be a problem, I can always get the Arduino to pulse the drivers instead of having them continuously on.

With this step done, I have full control over the movement of the arm. I can either program the Arduino to move the arm in some regular way, or use my laptop to send commands and have the Arduino interpret them and drive the motors. The issue here is that I have no way of knowing where the arm is at a given moment. There is no feedback from the motors telling the Arduino how far each one has moved, so I need to figure that out myself.
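For the laptop-command route, the Arduino just needs to parse short messages arriving over serial. A minimal sketch of what such a parser could look like (the "M&lt;motor&gt;&lt;direction&gt;" protocol here is entirely made up for illustration, not the one the project actually uses):

```cpp
#include <cassert>
#include <string>

// Parse a hypothetical command of the form "M<index><dir>", e.g. "M2+"
// runs motor 2 forward and "M4-" runs motor 4 in reverse. Returns false
// on malformed input so garbage on the serial line is ignored.
bool parseCommand(const std::string& cmd, int& motor, int& direction) {
    if (cmd.size() != 3 || cmd[0] != 'M') return false;
    if (cmd[1] < '0' || cmd[1] > '4') return false;   // motors 0-4
    if (cmd[2] != '+' && cmd[2] != '-') return false;
    motor = cmd[1] - '0';
    direction = (cmd[2] == '+') ? 1 : -1;
    return true;
}
```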

There are a few ways I can think of off the top of my head that will allow me to know what position the arm is in:
1) Dead-reckoning - If you know the position at one point and you know how long you've run one of the motors, you can estimate how far it has moved. This still requires knowing the position once, and imperfections in the gearing will cause this method to be very inaccurate.
2) Accelerometers - Each arm segment makes some angle relative to gravity, and knowing each angle allows you to find the arm position. In theory this is fine, but it seems like overkill. Also it does not work for rotation around the base.
3) Potentiometers - Each joint is connected to a potentiometer so that as the arm bends, the potentiometer is turned, causing a variable resistance corresponding to the angle. The Arduino measures this as an analog voltage.
4) Rotary Encoder - Open up the gear boxes and attach a rotary encoder to each motor that sends a signal for every few degrees of gear rotation. Add up the clicks and you get how much the motor has turned. Unfortunately the gear boxes are a pain to open.

The third option seemed like the most reasonable, so I bought a couple of 10K potentiometers and glued one to each joint. I'm a fan of using hot glue for projects that don't need to last forever, but if the potentiometers start to come loose I can always switch to epoxy. Each was wired to the main board as a voltage divider with the variable lead being read by the Arduino analog inputs.
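Reading a joint angle then comes down to linearly mapping the 10-bit analog value between two calibration points. Something like this, where the endpoint values are placeholders (each joint needs its own pair, measured by moving the joint to known angles):

```cpp
#include <cassert>

// Convert a 10-bit Arduino analog reading (0-1023) to a joint angle
// in degrees by linear interpolation between two calibration points:
// (adcLow -> angleLow) and (adcHigh -> angleHigh).
double readingToAngle(int reading,
                      int adcLow, double angleLow,
                      int adcHigh, double angleHigh) {
    double frac = double(reading - adcLow) / double(adcHigh - adcLow);
    return angleLow + frac * (angleHigh - angleLow);
}
```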


Sensor for base rotation. The black sticks are left over from the arm kit.


Second joint. The wire lashing is for safety, or something.


Third joint.


Last sensor. Not the best position, but it works well enough.


With everything hooked up, I check the sensors by letting a single motor turn back and forth while printing the analog input value to the serial port:


Analog reading directly from the Arduino every 10 ms. Bottom axis is around 18 seconds total. I rotated the base from one end to the other, so this is the total range I can get for this sensor.

As you can see, the sensor readings look clean and fairly stable. With some calibration, I'm able to convert these analog values to the angle of each joint. With some simple forward kinematics, I can figure out the 3D position of the claw with reasonable accuracy.
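The forward kinematics step can be sketched like this: treat each joint angle as an absolute angle from horizontal, sum the segment projections in the arm's vertical plane, then rotate that plane around the base axis. The segment lengths below are placeholders, not measurements of the actual toy:

```cpp
#include <cassert>
#include <cmath>

struct Point3 { double x, y, z; };

// Forward kinematics sketch: base rotation plus three joint angles
// (radians, each measured from horizontal) to a 3D claw position.
// Segment lengths l1..l3 are illustrative, not the real arm's.
Point3 clawPosition(double base, double a1, double a2, double a3,
                    double l1 = 10.0, double l2 = 10.0, double l3 = 8.0) {
    // Radial reach and height within the arm's vertical plane.
    double r = l1 * std::cos(a1) + l2 * std::cos(a2) + l3 * std::cos(a3);
    double z = l1 * std::sin(a1) + l2 * std::sin(a2) + l3 * std::sin(a3);
    // Rotate that plane around the base swivel axis.
    return { r * std::cos(base), r * std::sin(base), z };
}
```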

The next part of this project will be sorting out the input method for controlling the arm. I'm hoping to have some sort of physical input using my own arm, but we'll see how things work out!


Maybe after some clever programming I will remove the dunce hat.

Robot Arm: Part 0



The other week, my two siblings sent me a robot arm toy for my birthday. I have a bad history of cutting, burning, and shocking my hands while working on my hobbies, so the idea was that I could use this robot hand instead of my own to save some skin. At first this seemed a little silly, but it got me thinking. Could I figure out a way to use this toy as a useful proxy of my own hand? The range would be limited, the carrying capacity would be small, and the joints wouldn't match my own arm, but there are still some instances where having a hand that doesn't spout blood when something goes wrong would be useful. One such instance is when dealing with the spinning propellers of a drone. In order to test the stability algorithm that keeps my drone in the air, I would really like to poke one side of the frame to see how it reacts. Putting your body anywhere near a large drone is a pretty bad idea (as I now know), so having a remotely controlled arm would be immensely useful.
So I've started a new project to modify this toy arm to mimic my real arm movement. I see this project happening in at least 3 parts. First, I need to modify the arm so it can be controlled by my own code. Next, I need to find a way of using the position of my own arm as input for a controller. Finally, I need to come up with an algorithm that translates my own motions to ones the robot arm can handle.

Through all of this I will try to stick to cheap(ish) electronics parts, because if I subject this device to the situations I've put my own hands into, it won't have a long life anyways.

Saturday, February 2, 2013

Arduino Intervalometer (Hardware)



Summary:
I design and build an Arduino-powered intervalometer for my camera, but mostly I learn that circuit design is non-trivial.

Motivation:
I do a decent amount of photography, and have always enjoyed doing time-lapses. To trigger my camera at regular intervals, I've been using a cheap intervalometer that has a few simple functions. One issue I've always had with this particular one is that you can't specify both the interval between shots and the exposure time. One reason for needing this would be when doing nighttime time-lapses, where the exposure time might need to be greater than 30s (the longest the camera can be set to on its own).

So I decided to build my own intervalometer using an Arduino-compatible microcontroller to trigger the shutter release at arbitrary intervals and for arbitrary exposure times.

Build Details:
After doing a little research, I found that my camera can be triggered easily using a 2.5mm audio cable. The three wires in the audio cable are for ground, auto-focus, and shutter release. Shorting either of the last two to the ground wire will cause the camera to perform the desired function. I've never wanted to redo my focus after each shot of a time-lapse, so I didn't do anything with the auto-focus wire. This very basic circuit shows how I controlled the shutter release using the microcontroller:
Shutter gets shorted to ground when Logic goes high.

Now, I could have essentially stopped there. With a 555 timer and some potentiometers I could have had an intervalometer with fully adjustable intervals and exposure times, but that would just be too easy. And boring. I wanted some kind of visual output from the circuit as well as the ability to run more advanced interval routines. To handle user input, visual output, and triggering the camera, I decided to use an ATMega328P (the brain of an Arduino Uno) that I had lying around. It has plenty of IO pins, can use the Arduino bootloader (to make programming easy), and can even run on an internal 8MHz oscillator, eliminating the need to add an external oscillator to the circuit.
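The core of what the firmware has to decide — whether the shutter line should be held closed at a given moment — can be stated as a pure function of elapsed time. This is just a sketch of the idea, not the actual firmware:

```cpp
#include <cassert>

// Given milliseconds since the sequence started, the interval between
// shots, and the exposure length, decide whether the shutter line
// should be held closed. Each interval starts with the shutter closed
// for the exposure duration; exposure must be shorter than the interval.
bool shutterClosed(unsigned long elapsedMs,
                   unsigned long intervalMs,
                   unsigned long exposureMs) {
    return (elapsedMs % intervalMs) < exposureMs;
}
```

With this, a 45-second exposure inside a 60-second interval is no different from any other combination, which is exactly the flexibility the store-bought intervalometer lacked.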

I found some neat 8-character, 8-segment displays at my local electronics shop for $1.50 apiece, so I grabbed a couple in case they turned out useful for future projects. They have 16 lines: 8 common cathode lines for the segments and 8 common anode lines for the characters. I decided to source the segments directly from the ATMega and select which character to drain through a 3-8 line decoder and some transistors. In all, the display needed 12 IO lines from the microcontroller (8 source, 3 line selection, 1 enable/disable for the decoder).
This one was actually broken. Good thing I bought a couple.
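Multiplexing a display like this comes down to two lookups per refresh step: which segment byte to source for the character being shown, and which 3-bit code to hand the decoder so it sinks that character's line. A sketch of those lookups (the bit ordering is an assumption on my part, not the actual board wiring):

```cpp
#include <cassert>
#include <cstdint>

// Segment patterns for digits 0-9, bit order a,b,c,d,e,f,g,dp from
// the least significant bit. The wiring order is assumed here.
const uint8_t DIGIT_SEGMENTS[10] = {
    0b00111111,  // 0
    0b00000110,  // 1
    0b01011011,  // 2
    0b01001111,  // 3
    0b01100110,  // 4
    0b01101101,  // 5
    0b01111101,  // 6
    0b00000111,  // 7
    0b01111111,  // 8
    0b01101111   // 9
};

// Segment byte to source from the 8 segment IO lines for a digit.
uint8_t segmentsFor(int digit) { return DIGIT_SEGMENTS[digit]; }

// 3-bit character-select code for the 3-8 line decoder.
uint8_t decoderSelect(int slot) { return uint8_t(slot & 0x07); }
```

Loop over the 8 character slots fast enough and persistence of vision makes them all appear lit at once, which is exactly what the test sketch full of '0's was checking.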

I'm still very much a novice at designing circuits and putting them on boards, but I've been experimenting with a few different methods. Usually I just solder everything onto a protoboard and hope I can make all the necessary connections with solder bridges and jumper wires. This time I decided to try something that might make my life a little simpler: these pre-wired protoboards. Every pad is connected to each neighboring pad, and the connections can be severed with an X-Acto knife. The goal was to plan out which connections to keep, cut the other connections, then solder all the components in place and be done. So first, I planned the circuit and layout.
It's almost like I know what I'm doing. Time for the layout..

This is not going well. Time to use a computer.

This makes sense... Right?

Drew in the lines I didn't want to cut. Everything else must go!

This was extremely tedious.

Almost forgot the back. Instead of attempting to duplicate a reversed pattern, I just cut everything. One side should be enough.
Everything soldered on, minus the display. Camera cable is the bottom right.

After attaching the display lines to the top of the board, it was time to test everything. I loaded a simple sketch onto the ATMega that would enable the segments making up a '0' while looping through the characters fast enough to have them all appear on at the same time. I switched it all on and nothing happened. Well, nothing I intended to happen happened. The ATMega did start to heat up a bit though.. Turns out there were a couple shorts in the board due to me not severing the preset connections properly. After spending some quality time with an ohmmeter to check connectivity, I fixed all of the unintentional shorts, and luckily none existed in hard to reach places. Finally, the display could light up.
Well that's not right.

This time the problem was with a burnt out display segment and a mistake in my board layout. The display anodes are connected to a group of transistors at the top of the board, but I forgot to connect all of the emitters to ground. Added an extra wire, replaced the display with one of the extras I bought, and finally the display is working.
Fancy!

At this point I stopped. There's no user input, but I have full control of the camera shutter and the 8-character display. I've just started working on the code to run everything, so at some point I'll make another post about the firmware and how I will deal with user input. For now, I've made a Github repository for the intervalometer code as I work on it.

Insights:
1) The pre-wired protoboards are a pain to use. I need to either make my own PCB, or get them printed elsewhere. If the circuit is complicated enough, it will be worth the money.
2) Programming the ATMega currently requires removing it from the board, placing it on a breadboard, and using another Arduino to upload the sketch. Next time I should add an ICSP header to the board itself to make programming easier.

Saturday, January 12, 2013

Embedded Halloween Costume




Summary:
Used Halloween as an excuse to get myself back up to speed on basic electronics and Arduinos. Ended up with something with flashy lights that looked great in the dark.

Motivation:
After finishing my grad school qualifying exams, I took a few days off work to get myself back into electronics. Halloween was coming up, so I could kill two birds with one stone by making some kind of light-up costume.

Build Details:
After searching around the internet for ideas, I found a few sites offering instructions on how to make an Iron Man costume using LEDs and miscellaneous hardware parts. I decided to model my costume after this image and include a microcontroller to control the LED brightness. There were four main components that needed to be built: the chest light, the hand light, the controller circuit, and the arm brace.
Goatee not included.

The chest light was made by punching holes in a circular piece of cardboard and gluing some blue and white LEDs into the holes. The LEDs were connected in series in groups of three or four, so there were a couple wires coming out of the chest unit that would need to be hooked up to power. I didn't care much about the number of wires needed to power everything, more bare circuitry would make it look a little more authentic anyways. To diffuse the light, I added another layer of cardboard and masking tape to the front to get the outline I wanted.
Cardboard and duct tape, because Halloween is only one night.

Poor white balance on my part making the blue LEDs look purple.

Looks much better with the lights off.

The hand light was made similarly with 7 LEDs (mix of blue and white) on a cardboard backing. For some reason I decided it was not interesting enough to take pictures of while building, so you'll have to use your imagination.

With both lights built, it was time to figure out the controller circuit. I wanted the microcontroller to have complete control over the brightness of the hand and chest units independently, so I wired up a simple circuit that would drive the LEDs with transistors and allow pulse-width modulated output from an ATtiny84 to set the brightness.
Testing the circuit.

Final board. Not pretty, but it worked.

After burning the Arduino bootloader on to the ATtiny I uploaded a sketch that would slowly modulate the brightness of the chest light like a heartbeat. When the button is pushed, the chest dims, the hand flashes for a few seconds, then the chest comes back on with a faster heartrate that slows back down over time. The full code can be found here.
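The heartbeat effect boils down to mapping elapsed time to a PWM duty value. Here's a guess at roughly what such a curve could look like — the constants and shape are illustrative, not the code from the costume (the real code is linked above):

```cpp
#include <cassert>
#include <cmath>

// A rough heartbeat curve for the chest LEDs: a raised-cosine pulse at
// the start of each beat, idling at a dim glow in between. Returns a
// PWM duty value in 0-255. Shape and constants are illustrative only.
int heartbeatBrightness(unsigned long ms, unsigned long beatPeriodMs) {
    const int idle = 20;                      // dim glow between beats
    unsigned long t = ms % beatPeriodMs;
    unsigned long pulse = beatPeriodMs / 3;   // pulse is a third of the beat
    if (t >= pulse) return idle;
    double phase = double(t) / double(pulse); // 0..1 across the pulse
    double wave = 0.5 * (1.0 - std::cos(2.0 * 3.14159265358979 * phase));
    return idle + int(wave * (255 - idle));
}
```

Speeding the heartbeat up after the button press is then just a matter of shrinking beatPeriodMs and letting it relax back over time.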

I'm a fan of making my Halloween costumes super-cheap, so the entire arm brace was made out of cardboard, duct tape, and an old motorcycle piece or two. While duct tape obviously isn't the best choice for structural stability, the costume only had to last a few hours of one day.
All assembled.

Bright enough to provide dramatic lighting.

Insights:
1) The ATtiny84 is a great alternative to a full-blown Arduino board when you want something cheap and small. I'll probably use them again in future projects.
2) The lights look nice in the dark, but were far too dim in normal lighting. It might have been worth it to significantly increase the number of LEDs and just be prepared to change the battery every now and then.

Thursday, January 3, 2013

Water Level Analysis

The water level sensor has been steadily collecting data for more than 4 years now. Well, not so steadily. There are small gaps in the data due to things like internet connectivity issues, power outages, code bugs, you name it. Then there are larger data gaps, usually due to a physical malfunction of the sensor. Not to say it was poorly set up or is poorly maintained; in fact, it has lived through a good number of storms.

As mentioned in the previous post about the build details, two sets of data are held in the online database. One is high temporal resolution data from only the most recent few days, and the other is coarser data for as long as the project has been going. For this post I will be showing this second data set and hopefully pointing out some interesting features. No build details or motivation, just good ol' analysis.

(Click to Embiggen)

Without any processing, this is what it looks like for the first 1000 days. After this, it's mostly gaps in the data due to my own negligence. But even in this 'nicer' part, there are some obvious flaws. The first is the large gap around day 200. I'll get back to that one. Next is the spiderweb of lines going back and forth through time near day 400. At some point around that time there was a bug in determining the timestamp to associate with a data point, so data was being sent around in time. A quick sweep through the data to remove non-sequential data points would probably clear that up. Next is some anomalous data near days 800 and 1000 showing the water level suddenly dropping to a constant -2.7 feet. After conversion, that is the level returned if the ultrasonic sensor consistently reads the maximum value it can. I'll be honest, I don't remember why that happened. Oh well!

Now that we've pointed out all of the flaws, let's look at the clean parts.
The first couple of days of data sure look clean. The most obvious trend is the daily tides, which cause about a foot of variation every day.
More normal daily variations, but suddenly we hit the large gap from the first plot. Even before the gap, there is a long trend of rising water level. Right around the day 175 marker on this plot, Hurricane Ike made landfall a few miles away. The damage to pretty much everything on the island was severe, so getting the water level sensor back up and running was not the number one priority. And so, large data gap.

But back to the cleaner parts of the data, we can also look at tide patterns and how storms play into the mix.
This is the longest portion of uninterrupted data I have, so I'll spend some time analyzing it. As before, we see the daily tide variation causing a foot of variation, but there is something else on top of this signal causing low amplitude nodes every 14 days or so. These low amplitude tides are called neap tides, while the large tides that occur in between are spring tides (this wiki article explains it more). Near day 125 is a storm that rolled through and messed up the nice tide pattern from the weeks before. As seen in both this storm and the hurricane from before, it's possible to predict incoming storms by a day or so just by keeping track of the water level.

The daily and monthly tidal variations are pretty obvious by looking at these time-series plots, but periodic features such as these should also be visible when looking as a function of frequency. The neap/spring tide pattern mimics a beat, so we might expect to see two peaks of power separated by a frequency splitting corresponding to 14 days. To look for this, I apodized the data from the previous plot using a reasonable-looking Gaussian, took a Fourier Transform, and plotted the logarithm of the magnitude.
The whole spectrum, log power. Clearly I record too many data points per day.

Looking at the low frequency power, there are two significant peaks located at 11 uHz and 22 uHz. These frequencies correspond to 25.3 hours and 12.6 hours, respectively. Not quite the frequency splitting I was looking for, but nice to see the daily tides showing up strong. Now at this point in my analysis, I nearly gave up thinking the frequency splitting was hidden under the noise. Maybe it takes more than just a few weeks of clean data to measure this subtle sun-moon interaction.

On a whim, I decided to check out the two small peaks on top of the left larger peak. Considering the scale of noise in the rest of the spectrum, I wouldn't really want to assign much meaning to these. But maybe.. By eye, the two peaks have frequencies 10.76 uHz and 11.57 uHz, so a difference of 0.81 uHz, which is a period of... 14.3 days! Exactly what was expected. Unfortunately, the scientist in me is saying this is just a cute coincidence due to noise and we shouldn't take this seriously. Still, kind of cool to see.
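The arithmetic behind that last check is quick to verify: a frequency splitting between two peaks corresponds to a beat period of 1/Δf.

```cpp
#include <cassert>
#include <cmath>

// Convert a frequency splitting (in Hz) between two spectral peaks
// into the corresponding beat period, in days.
double splittingToPeriodDays(double deltaFreqHz) {
    double periodSeconds = 1.0 / deltaFreqHz;
    return periodSeconds / 86400.0;  // 86400 seconds per day
}
```

Plugging in the 0.81 uHz splitting gives about 14.3 days, matching the neap/spring cycle seen in the time series.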

Wednesday, January 2, 2013

Water Level Sensor


Summary:
Used an ultrasonic sensor, 555 timer, netbook, and PHP skills to create a live online database of the water level in Galveston Bay.


This is a project started by my father and me in Spring 2008, with modifications and additions being made intermittently through the years. It is by far the longest running project I've ever done. The setup has been recording water level data in Galveston, TX almost continuously since the beginning (minus a hurricane or two), and the most recent few days of data can be viewed live on the internet:

Most recent 6 days of data. Sometimes looks weird if I've recalibrated something, or if a hurricane has wiped out our house. The dashed blue line is roughly the height of the bulkhead.

Motivation:
Measure the water level in Galveston Bay and display the results on the internet. Simple enough..

Build Details:
1) Sensor
The first issue was how to actually measure the water level. We have a house with a boat slip, so direct access to the water isn't a problem. One method would have been to have some sort of sensor float on top of the water or run down the height of the bulkhead. Exposure to salt water would cause problems for any kind of sensor we could think of, so we opted for remote sensing. We mounted an ultrasonic sensor a few feet above the water pointed down to measure the distance between the sensor and the top of the water. The first sensor we used had a range of 6 to 255 inches with one inch resolution. The connections needed to control the sensor (Vin, GND, TX, RX) were made through an ethernet cable running from the mounted sensor all the way back into our house (~50' of cable) where the rest of the electronics were. We did this to limit the amount of circuitry exposed to the elements.


2) Controller
The second part was figuring out how to trigger the sensor to take a measurement and how to read the result. All that is needed to trigger a measurement on the sensor we chose is a single pulse at least 20us long on the RX line. I wired up a simple astable 555 timer circuit that would send such a pulse 4 times per second, enough to capture the change in water level due to individual waves moving by. Once the sensor has made a measurement, the result is sent back on the TX line as regular serial data in the format R###, with the number corresponding to the distance between sensor and water. The TX line runs straight into an adjacent netbook for processing.
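For reference, the standard astable 555 frequency is f = 1.44 / ((R1 + 2·R2)·C). The component values below are one plausible combination that lands near 4 pulses per second, not necessarily the ones on the actual board:

```cpp
#include <cassert>
#include <cmath>

// Astable 555 output frequency from the two timing resistors (ohms)
// and the timing capacitor (farads): f = 1.44 / ((R1 + 2*R2) * C).
double astableFrequency(double r1, double r2, double c) {
    return 1.44 / ((r1 + 2.0 * r2) * c);
}
```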

3) Processing
Running on the netbook is a short C# code that reads the incoming serial data and parses it. After converting the R### message from the sensor into a water height using a known calibration, the code stores the result into a buffer. After collecting about 100 measurements (25 seconds of data), the mean and standard deviation of the buffered data is computed. Since the measurements are being taken a few times per second, the standard deviation is taken as an approximate wave height and stored along with the mean water level. Once a few of these water level / wave height pairs are collected, the code sends them off to the internet for logging.
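The parsing and reduction steps are straightforward; here's a sketch of the same logic in C++ rather than the original C# (the calibration from inches to water height is omitted):

```cpp
#include <cassert>
#include <cmath>
#include <string>
#include <vector>

// Parse the sensor's "R###" message into a distance in inches.
int parseReading(const std::string& msg) {
    return std::stoi(msg.substr(1));  // drop the leading 'R'
}

// Reduce a buffer of readings to a mean water level and a standard
// deviation, the latter serving as an approximate wave height.
void meanAndStdDev(const std::vector<double>& buf,
                   double& mean, double& stddev) {
    mean = 0.0;
    for (double v : buf) mean += v;
    mean /= buf.size();
    double var = 0.0;
    for (double v : buf) var += (v - mean) * (v - mean);
    stddev = std::sqrt(var / buf.size());
}
```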

4) Logging
One measurement every 25 seconds is still a lot of data to store if you are interested in keeping track of years of data. The data set sent from the netbook gets stored in two different databases. The first stores every measurement received, but clears out any data that is over 7 days old. This way, recent data can be viewed at high resolution and only requires about 25k data points. The second database stores 15 minute averages of the incoming data, so even years of data only needs 100k data points. In all, both databases combined only take up a few megabytes of storage.

5) Plotting
To generate live plots of the most recent few days of data, I wrote a somewhat lengthy PHP script that would read data from the database and output a png file plotting the data. Instead of using a library for plotting things, I for some reason hand-wrote how to set up the axes, autoscale them, draw grids, print labels, and plot the water level line by line. While this allowed for a significant amount of customization, in hindsight I think we could have survived with a pre-built library like Google Chart Tools. Still, the plot works well and allows the user to set both the image dimensions and the time extent to display.

To this day, the water level plots have been useful for figuring out water conditions, tide patterns, and incoming storms. At some point I'll write a post showing some of the long-term recorded data and how it can be analyzed. All kinds of interesting features can be seen, like the interplay between solar and lunar tides and the effects of incoming storms.



Insights:
1) Weather-proofing an outdoor sensor is difficult, but not impossible. Over 4.5 years the sensor has only been compromised once, and that was from a hurricane that was able to take someone else's boat and place it on our lawn.
2) The 555 timer is too simple and static, while the netbook is too complicated and overpowered. A simpler solution would be to use a single Arduino board to trigger the sensor, process the measurements, and send data to the internet.

Monday, December 31, 2012

Informed Battleship AI




Summary:
Set out to measure where people typically place ships in Battleship, possibly found pattern? Mostly learned about online survey biases.

This project is from a little after the Multitouch Screen, in Fall 2007 at the beginning of my sophomore year of college. A friend and I were considering algorithms to use in a Battleship AI and started to wonder if people tend to place ships in common locations. It seemed reasonable to assume corners would be frequently populated, but we didn't want to just build in a few heuristics based on guesswork. We decided to measure how often spots on the Battleship board were populated by ships by recording how people placed ships when playing.

Motivation:
To build an 'informed' Battleship AI, we wanted to measure how frequently spaces on the board are populated.

Build Details:
To record how people place ships, I wrote a simple web-app that presented the user with a blank board and allowed them to click to place ships. Once all of the ships are placed, the app would submit the placement mask to a database that would keep track of every entry as well as a composite board that combined every submission into a heat map.

Blank board presented to the user. Full of AJAX-y goodness.

Populated board with score that meant little to nothing.

I never had the energy to actually write a working Battleship AI, so after the ship placement was submitted, the app would return a score based on how much the submitted pattern differed from the current composite map. A low score was supposed to indicate that an AI using the recorded results would quickly find your ships by following a look-up table.

To collect data, I sent out a link to the app to college friends. We got maybe 40 submissions from that, but we wanted much more. After submitting a link to reddit, the number of submissions shot up above 600, enough to start looking for trends. The heat map below shows the frequency of a spot being occupied by a ship, with black being 10% and white being 25%.

Strangely symmetric?

First thing of note is how the upper left and bottom right corners are populated nearly a quarter of the time, while the other corners are populated significantly less often. The whole board shows rough symmetry around a line connecting top left to bottom right, which seems strange to me now. Before I make any grand claims about significant patterns in Battleship placement, I would redo the experiment with a few changes:
1) Verify the code to make sure the board isn't accidentally getting flipped every now and then (I wasn't the best programmer back when I did this).
2) Log the IP address of a user to eliminate duplicate submissions.
3) Create an interface that is easier to use and mimics ship placement on a real board (the click-to-rotate might have biased the ship orientations).

Insights:
1) Asking people to fill out an online survey is asking for people to mess with your results.
2) If you want people to pretend to play Battleship, make an interface that mimics Battleship.