Saturday, April 1, 2017

Astrophotography with the Raspberry Pi Camera - A Cheapskate's Guide to Solar System Photography

I recently bought a Celestron 127SLT telescope with the goal of doing some solar system observation and photography. I had a Raspberry Pi camera and a B+ board kicking around from a previous robotics project, and decided to give it a try. It's not an ideal camera for this sort of thing, not by a long shot - but I already had it. The v1 camera can now be had for $13. It was worth a try.

The results were better than I expected. Here are some of them; details on how I did it are below.

Copernicus Crater, The Moon


 Edge of a Mare, The Moon

How the Camera Is Mounted

A friend printed this mount for me. It was easy to assemble and everything fits well. It allows you to plug the camera in like an eyepiece. If you simply replace the eyepiece with your camera, you are doing prime focus photography, which uses the telescope as a giant camera lens and focuses the image directly on your camera's sensor. To do this, you remove the lens that came on the camera. The magnification you get depends on the sensor size and the telescope's focal length. The Pi camera has a very small sensor, and the 127SLT has a long focal length of 1500mm, resulting in fairly high magnification. Shorter focal length scopes will produce less magnification.
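For a rough sense of what that magnification means, you can estimate the field of view from the sensor size and focal length. The sensor dimensions below are approximate figures for the v1 camera, not measurements:

```python
import math

# Rough field-of-view estimate for prime focus: the sensor acts like
# film at the telescope's focal plane. The dimensions below are the
# approximate active area of the Pi v1 camera sensor - treat them as
# ballpark figures.
sensor_w_mm, sensor_h_mm = 3.76, 2.74
focal_length_mm = 1500.0  # Celestron 127SLT

def fov_degrees(sensor_mm, focal_mm):
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

fov_w = fov_degrees(sensor_w_mm, focal_length_mm)
fov_h = fov_degrees(sensor_h_mm, focal_length_mm)
# Roughly 0.14 x 0.10 degrees. The Moon is about half a degree across,
# so it overfills the frame many times over - hence the crater close-ups.
```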

I mounted the camera first to check fit, and then carefully unscrewed the lens to remove it. There is a small dab of glue securing the lens - remove it with an X-Acto knife before unscrewing. Grab the black square camera body with one pair of pliers and gently twist the lens with hemostats. You don't want the camera body to transfer those forces to the PCB, or it may be torn off the board. Alternatively, there are printable spanner sets for this purpose on Thingiverse.

I then mounted the camera assembly to a printed case holding a Raspberry Pi B+.

I powered the Pi off a USB battery pack intended for charging phones - even a small one provides several hours of juice.

Software on the Raspberry Pi

In order to frame the picture and to focus, you need to be able to see a preview of what the camera is seeing. The best way I found to do this was using RPi-Cam-Web-Interface. I set it to stream 640x480 previews at 40% quality, which was pretty snappy.  You can then take pictures and video over the web interface. It's really slick. The installation instructions are excellent.

In order to be able to use it away from my home access point, I also set up the Pi to act as a wireless access point. That's completely optional, but is convenient if you intend to operate the system away from home. Then you just connect to the Pi with your phone, tablet, or computer and control the camera from there. You can adjust contrast, exposure compensation, ISO, shutter speed, etc.

Capturing Data and Best Camera Settings

I experimented with both video and still capture. Without fail, images made from stacked video turned out much better than still frames. The noise from the camera is high enough that even lunar shots benefit from stacking frames to improve signal to noise ratio.

The capture mode you select matters a great deal. I had best results using modes that result in binning. This combines data from adjacent pixels into one, which reduces resolution but is more sensitive and gives you a better signal to noise ratio. For Jupiter I used 640x480, which results in 4x4 binning, and 1296x730, which results in 2x2 binning.
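As a sanity check, the binning factor follows from the ratio of the full sensor width (2592 pixels for the v1 camera) to the requested capture width:

```python
# Approximate binning factor for a requested capture width, assuming
# the v1 camera's full 2592x1944 sensor resolution. The firmware picks
# the actual sensor mode; this just shows why 640x480 implies heavier
# binning than 1296x730.
FULL_WIDTH = 2592

def binning_factor(capture_width):
    return round(FULL_WIDTH / capture_width)

# 1296-wide modes bin 2x2; 640-wide video modes bin 4x4.
```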

Autoexposure will not be ideal - it will almost certainly overexpose. Try using -2 EV compensation and manually dialing the contrast in.


I used AutoStakkert or Registax 6 to stack frames, and did wavelet sharpening in Registax 6. I first ran the video through PIPP to crop the video and make alignment easier.

For comparison, stacked frames resulted in the picture on the left, and wavelet processing resulted in the final image on the right. The right image is also flipped left to right, since my telescope does that with a diagonal installed.

I have also gotten good results stitching lunar panoramas from video shot on the Pi and my SLR with Microsoft's Image Composite Editor, which is flat out awesome.

I usually do finishing touches on the lunar pictures with Silver Efex Pro, part of the now-free Nik Collection.

Attaching to the Telescope - Prime Focus

For lunar photography this is the way to go, unless you want extreme closeups. Just plug the camera in like you would an eyepiece. You could probably gain a small amount of efficiency by removing the mirror diagonal, but if a 3D printed component broke, I wanted it falling into my diagonal, not my telescope.

Eyepiece Projection for Planetary Imaging

For higher magnification you can use Eyepiece Projection (EP). An eyepiece is used to project the image onto the sensor. An adjustable tube allows you to increase the magnification by increasing the distance from the eyepiece to the camera sensor. I used a 25 mm eyepiece and the extension tubes shown below, along with this adapter. The same hardware can accept an SLR with T-ring, or a dedicated astro camera. Note: with EP, you focus by adjusting the telescope's focuser, and adjust magnification by sliding the camera in and out. I didn't actually see that written down anywhere and had to determine it experimentally. :-)
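The textbook eyepiece-projection formula makes the spacing/magnification relationship concrete. The 100 mm spacing below is an illustrative number, not a measurement of this rig:

```python
# Standard eyepiece-projection arithmetic: projection magnification is
# (spacing - eyepiece focal length) / eyepiece focal length, and the
# effective focal length is the telescope focal length times that.
def ep_magnification(eyepiece_fl_mm, eyepiece_to_sensor_mm):
    return (eyepiece_to_sensor_mm - eyepiece_fl_mm) / eyepiece_fl_mm

telescope_fl = 1500.0  # 127SLT
m = ep_magnification(25.0, 100.0)   # 25 mm eyepiece, illustrative 100 mm spacing
effective_fl = telescope_fl * m     # 4500 mm - triple the prime focus value
# Sliding the camera farther out increases the spacing, and with it the
# magnification - exactly the behavior observed at the scope.
```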

Update: Using my new ZWO astro camera, I have gotten MUCH better results with a 2x Barlow (Celestron Omni). I have not tried it with the Pi camera, but you may well get better results. I tried the ZWO camera with both setups, and the Barlow was far superior. This shot of Jupiter at opposition was taken with the 2x Barlow.


The low light sensitivity, extremely small sensor, and high noise combine to make the RPi cam a far from ideal solution, but it does work and is a very fun way to dip your toe into astrophotography. You are limited to 6 second exposures, so you won't be imaging nebulas, but the moon and the planets are fun.

It was enough to convince me I wanted to spend the money on a dedicated astronomy camera, so I bought a ZWO ASI290MC. I will keep the Pi scope camera though - it might be fun to play with streaming video from the scope over the internet.

Sunday, January 22, 2017

SOLD, Thanks! - Siskiwit Bay 17' strip built touring kayak for sale w/ Accessories

I am selling a strip built kayak that I built in 2008. It is built from plans - the design is Bryan Hansel's Siskiwit Bay. I built it for camping trips and day paddling, but changing circumstances have made that difficult, and I don't want to see it sit. It was built for adventure.

Want more detail? All these photos are available at full resolution in a Google Photos album.

 It has plenty of space for camping gear in the front and rear compartments, and has front and rear bulkheads. It also is a wonderful boat to paddle on lakes just to relax.

The boat is 17' long and weighs 58 lbs. The deck is cedar, and the hull is pine. It uses a pretty standard fiberglass and epoxy overlay inside and out, as outlined in Nick Schade's book. The glass and epoxy were of very good quality, since I knew I'd be paddling it out into the woods. Extra layers were applied to the hull scrape zone and the rear deck, where you sit to get in and out. I'm a pretty big guy, and it carries me and camping gear with ease.

It has a seat back, hatch hold-downs and adjustable foot braces from Pygmy Boats. It includes a 1000D Cordura nylon cockpit cover I made, as well as a commercially made skirt. It also includes a Thermarest seat for the cockpit, which is comfortable and serves to keep you low and stable. Paddles, a very nice Kokatat life jacket, a bilge pump and a stern light are available, though not included. Details below.

The boat has been stored inside since it was built, with the exception of a couple overnight camping trips. It got 5 coats of spar varnish after completion, but it's due for another application now. Wood boats require care and some routine maintenance - every so often you have to varnish the wood to prevent yellowing of the epoxy when exposed to sunlight. It has not been paddled much over the last couple years, so has not seen much UV exposure.

A lot of effort was made to make the exterior look as good as I could make it, and it gets compliments most times it's on the water. I didn't put a lot of effort into making the inside look good - my logic was that it would generally be hidden by the cockpit cover or the spray skirt and paddler, and it was a tradeoff to save a few months of build time. I've included pictures of the cockpit below so that you can determine if you are comfortable with that tradeoff. I understand that your time is valuable, and I don't want anyone to be surprised.

There are also some surface scratches on the bottom of the hull and on the sides where the top and bottom halves meet. A light sanding and a skim coat of epoxy will easily fix them. Pictures of those are below as well. I'd recommend fixing the scratches, waiting the recommended time for the epoxy to cure to avoid blush, and then varnishing.

The boat is registered in the state of Ohio. I am happy to provide you a bill of sale to facilitate transfer and registration. Please understand that I cannot provide a warranty - wood boats require inspection and maintenance, and determining that it is safe to paddle is your responsibility. That being said, I have spent a great deal of time paddling it without trouble, as have my friends and members of my family. It was a great project and has served me well. I hope you enjoy it.


Kayak with skirt and cockpit cover - $650
Werner paddle - $65 ($50 if bought with boat)
Adventure Paddle - $25 ($15 if bought with boat)
Kokatat Bahia XXL life jacket - $60 ($50 if bought with boat)
Bilge pump - $10
Stern light - $10

Front Hatch

Rear Hatch
Cockpit coaming detail

Light Scratches

Light Scratches

Sunday, October 16, 2016

3.5" SPI LCD experiments on Raspberry Pi Zero

A while back my local MicroCenter got some Pi Zeros in stock. I bought several, along with a couple microSD cards. I ordered a micro-HDMI adapter and an OTG USB hub, and I was in business.

I was poking around on Amazon and found that a number of vendors are selling low cost TFT LCDs that work over the hardware SPI port. A number of developers have written kernel drivers for them, including the very popular Notro FBTFT drivers  and SWKim01's Waveshare drivers. Prices range from $11 on up, with plenty being available for around $20.

I didn't have a project in mind, but it seemed like a neat thing to be able to do with a $5 computer, so I selected this one, a 3.5" LCD that uses the Waveshare driver. This unit is designed to plug right on top of the headers on the Pi Zero. This is convenient, but blocks a lot of the IO pins. The vendor provides a customized distribution of Raspbian for download, but I preferred to work on the standard one. Luckily, one of the reviewers left an outstanding review with all the details needed to get it going on a standard Raspbian machine.

The Zero doesn't have any IO pin headers installed, so I added enough to plug in the LCD.

The LCD plugs in and stands off a good distance from the Pi. If I wanted a thinner setup, I'd probably buy an LCD that does not plug into the headers and hand wire the connections.

When you next start up the Pi, don't expect video yet. You need to do some configuration to have it use the correct driver for the framebuffer and direct the console there. All of this information came from the terrific writeup in an Amazon review here. I'm not breaking any new ground at all, but wanted to duplicate it in case the review goes away.

1) Edit /boot/cmdline.txt to have the following arguments:

dwc_otg.lpm_enable=0 console=tty1 console=ttyAMA0,115200 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline rootwait fbcon=map:10 fbcon=font:ProFont6x11 logo.nologo
2) Add the following to the end of /boot/config.txt:

dtoverlay=waveshare35a

3) Download a copy of waveshare35a-overlay.dtb from swkim01's waveshare github site and copy it to /boot/overlays/waveshare35a.dtbo:

cp waveshare35a-overlay.dtb /boot/overlays/waveshare35a.dtbo

If, like me, you have a minimal installation without X Windows installed, you can reboot now - you're done. The framebuffer console should scroll on the LCD. I have not tested the section below yet, but the Amazon comment goes into the detail for configuring X Windows:

(update: I have tested X, and it works great. Notes on preventing screen blanking are below)

"4. edit /usr/share/X11/xorg.conf.d/99-fbturbo.conf to contain:

Section "Device"
Identifier "Allwinner A10/A13 FBDEV"
Driver "fbturbo"
Option "fbdev" "/dev/fb1"
Option "SwapbuffersWait" "true"
EndSection

5. create a file named 99-calibration.conf under /etc/X11/xorg.conf.d to contain:

Section "InputClass"
Identifier "calibration"
MatchProduct "ADS7846 Touchscreen"
Option "Calibration" "3932 300 294 3801"
Option "SwapAxes" "1"
EndSection
6. reboot and enjoy!" - Amazon Commenter Willie

Further useful stuff:

To disable screen blanking (framebuffer console only):

Add the following argument in /boot/cmdline.txt:

consoleblank=0

Mine now looks like this, all on one line:

dwc_otg.lpm_enable=0 console=tty1 console=ttyAMA0,115200 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline rootwait fbcon=map:10 fbcon=font:ProFont6x11 consoleblank=0

To disable X Windows screen blanking:

add the following to /etc/lightdm/lightdm.conf in the section [SeatDefaults]

xserver-command=X -s 0 -dpms

Framebuffer Console Tools

With the framebuffer working, you can use programs like fbi to display images, or other utilities to play video. I found that video did work, but was limited by the SPI throughput to a couple of frames per second, so I'm not convinced that's very useful. For displaying images in a slide show, this command works well:

sudo fbi -T 2 -d /dev/fb1 -a -t 5 -noverbose /home/pi/images/*

It will display all the images in the passed directory in an infinite loop with delay of 5 seconds.

Saturday, October 31, 2015

Raspberry Pi Wifi Rover on Dagu Rover 5 Chassis


This is the second revision of my Wifi controlled rover. The first used an old Android phone and an IOIO board as a way to learn some Android programming and figure out how to control a vehicle over the network. It worked pretty well - a friend drove it across the internet from 40 miles away, and I learned a lot.

I did figure out that I didn't like Android for a robotics platform, since it is so highly optimized for GUI use. Keeping a program running at high priority in the background on an Android device isn't trivial, because it's not designed for that. You have to assume your program can be interrupted and resumed at any time. I decided that even though the phone came with all sorts of cool sensors and was extremely compact, I wanted the control over what was happening that comes with Linux.

My intent is to use this rover as a testbed for systems I will eventually install in an underwater remotely operated vehicle. It's also a lot of fun to drive. I wanted to document the overall design here - it has been done lots and lots of times, but a detailed writeup might be useful to someone.

So here we go. This is intended to show you one possible way to do it, and the thought processes that went into the design. It can all be improved.


Dagu Rover 5 chassis. This thing is pretty awesome, but mine occasionally sheds a track, a problem I have not been able to fix. I understand this was fixed in versions later than mine.

A motor controller capable of handling the current of all four motors. The stall current on each motor is 2.5A, and there are four of them.

A Raspberry Pi B+, camera module, and wifi dongle. I will probably upgrade the dongle to something with a real antenna soon, as range is limited.

A 2200 mAh LiPo battery from my quadcopter, providing 12.6V. LiPo batteries are a significant fire hazard if not handled properly. The rover currently has no method to automatically kill power when pack voltage drops - consider a safer battery chemistry, like NiMH, if you are unfamiliar with the risks inherent in big unprotected LiPo packs. An excellent guide is here.

At the very least, a low voltage alarm like those commonly used in RC aircraft is a must.

A battery eliminator circuit like those used in RC aircraft to generate a nice steady 5 volts from the 12.6V LiPo pack to power the Raspberry Pi and motor board.

Network Design

The rover starts up an access point and also starts two servers on the Raspberry Pi. Each listens on a different port. Full details on the software configuration are below.

An Android device connects to the access point and is issued an IP address. The IP address of the rover is fixed - it's acting just like the router for your home internet connection. This makes it very easy to take the rover somewhere and run it with no additional infrastructure. However, it makes it harder to run over the internet. If that's your goal, it's better to connect your rover to an existing Wifi network so that traffic can be routed to it from anywhere. I may add a switch later that allows me to flip between these modes.

My ROV will be designed to take to places with no infrastructure, and I wanted to be able to easily take the rover to show friends, so I chose to make the rover the access point.

I currently have the rover configured to act as an access point and hand out IP addresses in the 192.168.42.x range, with no DNS or default gateway. The rover itself has a fixed address on that subnet.

Software Design

A traditional robotics paradigm is the "Sense, Think, Act" cycle. A robot takes input from its sensors, performs processing on it to identify the best course of action, and then commands the robot's actuators to do something. The process then repeats.

We're not building a robot in the typical sense. That's because a human is in the loop, making the decisions based on sensor input. I wanted to make sure that the platform could be used as a robot, just by changing the software on the server, but right now I'm interested in building a reasonably robust remotely operated vehicle rather than something autonomous.

On reflection, I decided that a remotely operated vehicle can follow the same sense-think-act cycle. The primary difference is that the thinking is done off-vehicle, by the human operator.
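That split can be sketched as a loop in which the "think" step is just whatever command last arrived from the operator. The sense() and act() functions below are stubs, not the rover's real I/O:

```python
# Structural sketch of the tele-operated sense-(remote)think-act cycle.
# sense() and act() are placeholders standing in for real hardware I/O.
def sense():
    return {"battery_v": 12.4}      # fake sensor reading

log = []

def act(command):
    log.append(command)             # placeholder for driving the motors

def control_cycle(operator_commands):
    # "Think" happens off-vehicle: the loop simply applies each command
    # received from the human operator, sampling sensors every cycle
    # (the telemetry would be sent back to the console).
    for command in operator_commands:
        telemetry = sense()
        act(command)

control_cycle(["forward", "forward", "rotateLeft", "stop"])
```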

I wanted to be able to send back sensor data from the rover, such as video, voltage levels, accelerometers, GPS data, etc. and display them on a simple console. So on the network, the command traffic would look like:

rover sends current sensor data to console
console sends back commands (turn on motors, etc)

Video would be handled on a separate connection, on its own thread.

Currently, I'm not sending back any data from sensors. I will detail plans for that in the "Next Steps" below.

The server sets the appropriate IO pins, which drives the motor controller board. My rover has 4 motors, each controlled by a direction line and an enable line.

If the timeout value is exceeded, the server shuts down the motors, resets, and waits for another connection.

The Python program at the end of this post implements this. Sending the full string defining the direction is horrendously inefficient - in the next revision of the client program I'll reduce that to, say, a single character. I originally did it this way to aid in debugging the client, and never got around to fixing it.
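For illustration, a single-character encoding might look like the sketch below - the letter assignments are hypothetical, not what the client actually uses:

```python
# One possible compact encoding for the next revision: one character per
# direction plus a zero-padded throttle, instead of the full
# "true,forward,75" string. The letter choices here are hypothetical.
ENCODE = {"stop": "s", "forward": "f", "reverse": "b",
          "rotateLeft": "l", "rotateRight": "r"}
DECODE = {v: k for k, v in ENCODE.items()}

def encode(direction, throttle):
    return "%s%03d" % (ENCODE[direction], throttle)   # e.g. "f075"

def decode(msg):
    return DECODE[msg[0]], int(msg[1:])
```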

Client Program

The client I wrote is fairly simple. It rapidly makes HTTP requests to get an updated JPEG image from the rover, and updates the screen. A separate thread sends commands and gets a fake sensor value back. It attempts to reconnect when the connection is lost.

Doing the video this way is crude and eats a lot of network bandwidth compared to something like H.264, but it's easy to implement and actually works pretty well at 320x240 and 640x480.

Low(er) Lag Video Streaming on the Raspberry Pi

There are a number of tutorials for using the raspi-still command to grab a still frame and shove it across the network via a couple methods. These work well for a stream that can tolerate a lag, but it results in a delay of up to a second and the framerates are low. This is due to an inherent delay in the raspi-still program - it's not designed for that.

I got much better results using the Video for Linux (V4L) driver and MJPG-Streamer. It took some doing - you first have to compile the V4L driver. Good instructions are available here and here.

I ran into a problem getting mine to compile. I got an error, "undefined reference to symbol 'clock_gettime'". The solution was found here.

A great tutorial for compiling mjpg-streamer is here.

While compiling mjpg-streamer, I ran into a problem specific to kernel version 3.18. The solution was found here.

I use this command to launch the video server on port 6001.

/usr/local/bin/mjpg_streamer -i "/usr/local/lib/ -n -f 15 -q 80 -r 320x240" -o "/usr/local/lib/ -p 6001 -w /usr/local/www"

Access Point Configuration

One way to turn your Raspberry Pi into an access point is to use hostapd and dhcpd.

The Edimax WiFi dongle is not supported by the stock hostapd binary that you get with apt-get install. Dave Conroy has figured out how to make it work - he has a great document describing the process here (starts at the Prerequisites section). I used that to get it working, and some of the configuration options described on Adafruit's tutorial. My dhcpd.conf  file and hostapd.conf file are below.


# Sample configuration file for ISC dhcpd for Debian

# The ddns-updates-style parameter controls whether or not the server will
# attempt to do a DNS update when a lease is confirmed. We default to the
# behavior of the version 2 packages ('none', since DHCP v2 didn't
# have support for DDNS.)
ddns-update-style none;


default-lease-time 600;
max-lease-time 7200;

# If this DHCP server is the official DHCP server for the local
# network, the authoritative directive should be uncommented.

# Use this to send dhcp log messages to a different log file (you also
# have to hack syslog.conf to complete the redirection).
log-facility local7;

subnet netmask {
 option broadcast-address;
 default-lease-time 600;
 max-lease-time 7200;
 option domain-name "local";
}
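For reference, a minimal hostapd.conf for this kind of setup looks roughly like the sketch below. The SSID, channel, and passphrase are placeholders, and `driver=rtl871xdrv` is the driver name used by the patched hostapd build for the Edimax chipset - check Conroy's writeup for the exact options your dongle needs.

```
interface=wlan0
driver=rtl871xdrv
ssid=RoverAP
hw_mode=g
channel=6
# this enables the 802.11n speeds and capabilities
ieee80211n=1
wmm_enabled=1
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=ChangeMe
```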


The following commands are in a small script, /home/pi/startAP, to start the dhcp server and hostapd:

sudo service isc-dhcp-server start
sudo hostapd /etc/hostapd/hostapd.conf &

Automatic startup

There are a number of ways to do this, but I decided the simplest way was to make small scripts to start each subsystem and then launch them from /etc/rc.local. I appended these commands to /etc/rc.local:

/home/pi/startAP &
/home/pi/ &
/home/pi/startVidServer &

Next Steps

I intend to add an Arduino that can communicate via USB to gather sensor data such as pack voltage. Ideally, the Arduino could control power to the Raspberry Pi to allow a complete shutdown. Even if you shut down the Raspberry Pi via a shutdown -h, it will still draw significant power while halted. That's not good - you need to be able to kill power when the pack is dead. I intend to design this and test it prior to using it in the ROV.

It needs some big honkin' bright lights. Just because.

A Sharp IR sensor or ultrasonic range finder would be cool to have and would allow for simple autonomous behavior, as well as being useful to a human operator.

Control server code

#!/usr/bin/env python

## example command set: true,stop,75
import socket
import syslog
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)

GPIO.setup(18, GPIO.OUT)  # enable line, driven with PWM for throttle
GPIO.setup(23, GPIO.OUT)
GPIO.setup(24, GPIO.OUT)
GPIO.setup(22, GPIO.OUT)
GPIO.setup(27, GPIO.OUT)

pwm = GPIO.PWM(18, 1000)
pwm.start(0)

## pin 22 = back left, true = reverse
## pin 23 = front left, true = reverse
## pin 24 = back right, true = reverse
## pin 27 = front right, true = reverse

def setDirection(backLeft, frontLeft, backRight, frontRight):
    GPIO.output(22, backLeft)
    GPIO.output(23, frontLeft)
    GPIO.output(24, backRight)
    GPIO.output(27, frontRight)

syslog.syslog('Rover: Server starting....')

host = ''
port = 6000
backlog = 1
size = 4096

while 1:
    syslog.syslog("Rover: Waiting for connection...")
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((host, port))
    s.listen(backlog)
    client, address = s.accept()
    client.settimeout(2.0)  # stop the motors if commands quit arriving
    syslog.syslog("Rover: Got client connection...")

    while 1:
        try:
            clientReq = client.recv(size)
        except socket.timeout:
            syslog.syslog("Rover: Command timeout - stopping motors.")
            break
        except socket.error:
            syslog.syslog("Rover: Socket error")
            break

        if clientReq == "":
            syslog.syslog("Rover: Connection broken.")
            break

        parsedCommands = clientReq.split(',')
        ## parsedCommands[0] is motorEnabled
        ## parsedCommands[1] is direction
        ## parsedCommands[2] is integer 0-100 representing throttle

        value = 0
        if parsedCommands[0] == "true":
            if parsedCommands[1] == "forward":
                setDirection(False, False, False, False)
                value = int(parsedCommands[2])
            elif parsedCommands[1] == "reverse":
                setDirection(True, True, True, True)
                value = int(parsedCommands[2])
            elif parsedCommands[1] == "rotateRight":
                setDirection(False, False, True, True)
                value = int(parsedCommands[2])
            elif parsedCommands[1] == "rotateLeft":
                setDirection(True, True, False, False)
                value = int(parsedCommands[2])

        pwm.ChangeDutyCycle(value)

        response = str(5) + "\n"  # fake sensor value for now
        client.send(response)

    pwm.ChangeDutyCycle(0)  # safe the motors before waiting again
    syslog.syslog("Rover: Shutting down server socket.")
    client.close()
    s.close()
Saturday, June 27, 2015

Sonar Development: Towards Something That Works In Water

Bathymetric survey data from a NOAA ship.

After a few months of distractions to prepare for a new kid, build a quadcopter, and work a bunch, I'm trying to get my sonar project moving forward. As documented in previous articles, I've got a simple digital sonar working in air. It was a simple way to test the echo detection algorithms. I'm convinced if I can figure out a way to use piezo transducers to transmit sound in the water, I can make it work.

Previous installments:

Audible Frequency Chirp Sonar on the Stellaris Launchpad

Initial Experiments - Sonar in air with a conferencing speaker mic and Python
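The echo detection in those posts boils down to matched filtering: correlate the transmitted pulse against the received buffer and take the peak. Here is a self-contained sketch with a simulated echo; the sample rate and chirp parameters are illustrative values, not the project's actual ones.

```python
import math

def crosscorr_delay(tx, rx):
    # Slide the transmitted pulse across the received buffer and return
    # the offset with the highest correlation (the echo delay in samples).
    best_off, best_score = 0, float("-inf")
    for off in range(len(rx) - len(tx) + 1):
        score = sum(t * rx[off + i] for i, t in enumerate(tx))
        if score > best_score:
            best_off, best_score = off, score
    return best_off

# A short rising-frequency pulse as the transmit signal.
fs = 8000.0
tx = [math.sin(2 * math.pi * (500 + 50 * i) * (i / fs)) for i in range(64)]

# Simulated receive buffer: silence, then an attenuated echo at sample 300.
rx = [0.0] * 1024
for i, t in enumerate(tx):
    rx[300 + i] += 0.4 * t

delay = crosscorr_delay(tx, rx)
speed_of_sound = 343.0  # m/s in air, for the bench tests
distance = (delay / fs) * speed_of_sound / 2  # round trip -> one way
```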

So the next challenge is how to mount the piezo element and efficiently couple the sound to the surrounding water. One way to do this appears to be to pot the transducer in a potting compound that closely matches the density of water.
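To see why the match matters, compare the fraction of acoustic pressure reflected at the boundary for matched and unmatched media. The impedance figures below are rough illustrative values (water is about 1.48 MRayl; the potting compound number is a ballpark guess):

```python
# Fraction of acoustic pressure reflected at a boundary between two
# media, from the standard impedance-mismatch formula. Impedances are
# in MRayl; the potting value is an illustrative guess, and air shows
# the worst case (why a bare transducer couples so poorly to water).
def reflection_coefficient(z1, z2):
    return abs((z2 - z1) / (z2 + z1))

water = 1.48
potting = 1.6      # rough ballpark for a well-matched compound
air = 0.0004

matched = reflection_coefficient(potting, water)   # small: most energy couples
unmatched = reflection_coefficient(air, water)     # ~1: nearly total reflection
```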

This article from NOAA on building hydrophones for listening to whales details one way to pot a transducer, and also includes a high gain amplifier circuit. My current plan is to build one, and figure out how to get the ADC on the Launchpad reading the audio. From there, I can make a transmit circuit. The challenges will likely be in acoustic coupling, transducer selection, and getting enough power into the water to travel a reasonable distance.

I considered using piezo disks, but I found that getting any sort of output from them at all requires them being mounted at either their edges or nodal points in a resonant cavity known as a Helmholtz chamber. I don't think I can manufacture one to the precision needed for the small size. I'm going to work first with cylindrical piezo units as used in the hydrophone above.

I intend to try one with a resonant frequency in the audible range - that's not going to result in very good resolution, but should be easier to debug since I can hear it and use PC audio equipment to measure it. Once that works, I'll switch to higher frequencies.

The next step is to build a functioning hydrophone with a piezo element, get it working with the op-amp, and get that feeding into the ADC of the Launchpad. That will complete the receiver side, and test the methods of acoustically coupling the transducer to the water.

Sunday, June 14, 2015

F450 Quadcopter Mods: Walkera G-2D Gimbal and Xiaomi Yi Camera

I built my F450 with aerial video in mind. Once I got it flying, it was time to select a camera and gimbal. 

The camera needs to be able to record at high framerates to reduce the "jello" effect of rolling shutter. If you try to strap a cheap keychain camera to the frame of your quad, it is very likely that the result will be a garbled mess of distortion. This is because the CMOS sensors in those cameras scan each frame into memory over a small period of time. Vibration causes the frame to move as it is being captured. 
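A back-of-envelope calculation shows how little motion it takes. The readout time and vibration rate below are illustrative values, not measurements of any particular camera:

```python
# Back-of-envelope rolling-shutter skew: rows are read out over a finite
# time, so rotation during readout shears the frame. Both numbers below
# are illustrative, not measured.
readout_time_s = 0.030        # ~30 ms to scan the full frame
vibration_rate_deg_s = 20.0   # camera oscillating at 20 deg/s

skew_deg = vibration_rate_deg_s * readout_time_s
# About 0.6 degrees of shear between the top and bottom rows of a single
# frame - enough to turn straight lines wobbly ("jello").
```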

Additionally, even if you get the vibration under control, the rapid movements in all directions as the quad flies around will make you ill. It's not a lot of fun to watch. 

The solution is a camera that can record at 60 fps and a motorized gimbal to compensate for the motion of the quadcopter and keep the camera level. There are gimbals that use servo motors, but the best use brushless motors, which are quiet and smooth. They nearly instantly compensate for the motion in pitch and roll that occurs from pilot inputs and wind gusts.

I selected the Xiaomi Yi camera. This has the same imaging sensor as a GoPro without some of the frills, and is much less expensive. They are currently available on Amazon Prime for $88. They don't come with a case, or even a lens cap. The Android version of the app is rather untrustworthy looking - it is currently distributed from a file sharing site I normally associate with pirated software, rather than from the company's website. "Here! Run this random APK from the Internet on your phone! It will be fine!"

Yeah. I dug out an old phone that doesn't have access to any of my important stuff and used that. I used the app to set up the video mode (60 fps at 1080p) and timelapse mode (still frame every 3 seconds). You can toggle between these modes with the camera's button - you really only need the app once.

I also ordered a Walkera G-2D 2-axis gimbal. This only compensates for pitch and roll, but uncommanded yaw motions don't seem to be much of a problem. I am extremely pleased with this gimbal for the money. It has an onboard regulator, so you can run it straight off your 3S LiPo pack. I connected it to the main power line on the quad and it fired right up. It supports the use of auxiliary channels on your receiver to aim it in roll and/or pitch, but it doesn't require it - you can set the tilt and roll angle with a couple of trim pots and leave it alone, and it requires no connection to your receiver. It even comes with a small tool to adjust the pots, along with the needed Allen keys. It worked right out of the box, and bolted directly onto the lower frame of the F450, aligning nicely with the slots on the lower frame. I secured it with 4 bolts.

One note: the gimbal is not designed for the Xiaomi Yi and the existing mount doesn't fit. I found that the frame could easily be removed, a 1/4" cardboard shim cut to level off the mounting plate, and a large zip tie easily secures the camera to the gimbal. There is probably a more dignified way, but that works just fine.

I am really pleased with this combo. I am still seeing some vibration in the video that I want to eliminate, but it's by far the best video I've gotten from an RC model so far. More to come on the vibration problem as I work it out. (Update on how to fix this below)

Here are a couple of still frames of a local park, shot in timelapse mode.

And some video....

Video Test Flight 3 - Xiaomi Yi and Walkera G-2D Gimbal on F450 from Jason Bowling on Vimeo.

Update on the vibration problem, and a note about the camera:

1) The vibration was improved by replacing the vibration dampeners that came with the gimbal with more rigid ones from HobbyKing. The dampeners it comes with are too soft.

2) Additional improvements were made by inserting soft foam earplugs into all four vibration dampeners.

3) The lens rectification function on the Xiaomi Yi makes the edges of the video very blurry. Once I fixed the vibration, the edges were still bad. I turned the lens rectification off, and it's much better. Here's a test flight with these improvements.

TestFlightNoFisheyeCompensation from Jason Bowling on Vimeo.