AspireOne Petcam

Have a spare AspireOne sitting around not being actively used? With built-in wireless and a webcam, it can easily substitute for a dedicated wireless webcam. As the prices of these netbooks continue to drop, they may even be the better deal, while providing the additional features of a display and a full operating system.

I’ll be looking at using an AspireOne netbook with Ubuntu 9.04 Netbook Remix installed; other Linux distributions should also work. For Windows netbooks I like the program Yawcam, which provides lots of functionality along with a GUI.

A quick search for ‘webcam’ in the Synaptic Package Manager shows a variety of available programs that support webcam capture with FTP uploads. I tried these, but had little success:

  • camstream – the program hung when run
  • camgrab – the AspireOne camera wasn’t found
  • camorama – could not connect to the webcam
  • camE – came the closest of the ‘webcam’ apps: it captured images to disk and connected to the FTP server, but failed to complete uploads. I was not able to resolve the upload failure.
  • vgrabbj – failed to connect to the webcam
  • webcamd – sort of worked: it captured images and uploads ran, but it seemed to have upload troubles and is low on options and/or documentation.

The best program I found for webcam monitoring on Linux is Motion. Install it using the Synaptic Package Manager; this will place a sample configuration file at /etc/motion/motion.conf
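
If you prefer the command line to Synaptic, the equivalent install on Ubuntu should be:

sudo apt-get install motion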

Since I plan to occasionally use the netbook for casual surfing, and want to keep an eye on the camera operation so I can easily stop it for privacy, I’ll run it as a user rather than in daemon mode. Copy the sample configuration to your user directory at /home/<user>/.motion/motion.conf
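
Something like the following should do it:

# create the per-user config directory and copy in the sample config
mkdir -p ~/.motion
cp /etc/motion/motion.conf ~/.motion/motion.conf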

I changed just a few settings to support uploading a snapshot every 60 seconds. I also disabled the motion-triggered images and videos for now. I’ll include these in a future setup, but for now those captures would quickly fill the netbook’s small drive if left running.

Edit the motion.conf file in your user ~/.motion directory and modify the following settings:

# turn off daemon mode, I'll run it in a shell
daemon off
# optionally use a larger image size
width 640
height 480
# turn off motion capture images
output_normal off
# turn off motion video
ffmpeg_cap_new off
# take a picture every 60 seconds
snapshot_interval 60
# reuse the jpg image file name lastsnap
snapshot_filename lastsnap
# run an FTP upload script after taking a picture
on_picture_save /home/<user>/ftppicture %f

Motion does not include any FTP functionality, but it does provide a set of events that can be used to run external scripts. on_picture_save allows you to specify a script that will be run after each image is taken and stored to the local drive. Here we will run the script ftppicture, the contents of which are listed below:

#!/bin/sh
# FTP upload script run by Motion's on_picture_save event.
# Motion passes the full path of the saved image as $1.
HOST='your.host.ip.addr'
USER='username'
PASSWD='password'

# strip the directory portion, leaving just the file name
filename=${1##*/}

ftp -n "$HOST" <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
cd cams
put $1 $filename
delete aspireone.jpg
rename $filename aspireone.jpg
quit
END_SCRIPT
exit 0

Fill in the constants at the top with your FTP server login information. This script will connect to the server, upload the new lastsnap.jpg, delete the old aspireone.jpg from the FTP site, then finally rename the freshly uploaded lastsnap.jpg to aspireone.jpg on the server.
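
One step that is easy to miss: the script must be executable or Motion won’t be able to run it. Assuming the location configured above:

chmod +x /home/<user>/ftppicture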

From a terminal, run the program with ‘motion’. Motion includes a web server, so you can view the live camera image while adjusting its position by opening a web browser to http://localhost:8081

It will now take and upload an image every 60 seconds. Stop the program with CTRL-C in the terminal and run ‘motion’ again to restart it.

rsync, basic is best

I’ve tried countless backup programs over the years in search of the best solution for my needs. Virtually all of them do the standard full, incremental or differential backups. This is fine for archival purposes, since you can recover all documents lost after a failure.

The downside is that, unless you do a full backup each time, recovering from a failure means piecing together the last state of the files by combining the full backup with the incrementals. This can require a lot of manual merging: if files were moved between incremental backups, the recovered file system ends up with two copies of them. Similarly, obsolete files that were purposely deleted will be restored, since file deletions are not recorded by incremental or differential backups.

What I want is essentially a full backup or mirror each time, so that the backup always represents an exact copy of what I’m backing up. Then in the case of a failure, restoring is a simple file copy to a new disk, or in an emergency the backup can be used directly since it is identical to the original.

Backing up terabytes of data is still too slow and costly to keep a sequence of full backups. Mirrors can be kept in sync more quickly, but if you accidentally delete a file and it gets mirrored, you lose it in the backup as well. What would be ideal is mirroring where any deleted or changed files are kept in a side folder: the main mirror folder always contains an exact copy of the original, but any previous revisions or deleted files can still be recovered. This is similar to what Time Machine does for OS X, but rsync and related projects can do this for Linux and other platforms.

Wading through the large number of options for rsync can be intimidating. It’s a powerful tool, and so can be disastrous if used incorrectly: you don’t want to accidentally swap source and destination with the --delete option, for example. Try out the options in a sandbox first until you see how it works. A good first step may be to use a graphical front end such as grsync, where all the options are clearly labeled with context help.
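
Another safe way to experiment is rsync’s dry-run flag. For example (the sandbox paths here are made up), this lists what would be copied or deleted without actually touching any files:

# -n (--dry-run) only shows the planned actions, -v lists each file
rsync -r -t -v -n --delete /tmp/sandbox/src/ /tmp/sandbox/dst/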

The mirroring command I’d use to backup a drive called nas1 would be as follows:

rsync -r -t --progress --delete -b --backup-dir=/mnt/backup/$(date +%F)/nas1 /mnt/nas1/ /mnt/backup/nas1

  • -r to recurse through the directories
  • -t to preserve the timestamps of the original files
  • --progress to display progress as it runs
  • --delete to remove deleted source files from the destination. This makes it a true mirror, identical to the source
  • -b to back up changed and deleted destination files instead of discarding them
  • --backup-dir where to move those backups, in a date-specific folder. Files on the mirror which are being replaced or deleted will be moved here instead, so they can be recovered if needed.
  • /mnt/nas1/ the drive being backed up (the trailing slash copies its contents rather than creating a nested nas1 folder)
  • /mnt/backup/nas1 the folder to place the mirror image into

The result is the drive ‘nas1’ mirrored into the folder ‘nas1’ on the backup drive, with previous versions and deleted files moved to a date-specific folder such as 2009-04-20/nas1 on that same drive.
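
Recovery is then just a copy in the other direction. As a hypothetical example (the file path is made up), restoring a file that was replaced or deleted during the 2009-04-20 run:

cp /mnt/backup/2009-04-20/nas1/docs/report.txt /mnt/nas1/docs/report.txt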

Barcamp Boston 4

I recently attended Barcamp Boston 4, an “un-conference”. There was a nice mix of discussions and presentations, covering technical topics as well as the business of technology.

The big board of sessions

Some of the sessions I was able to attend:

  • Good introductory presentation about Git by @qrush
  • QR Codes, mostly covering the ZXing project on Android
  • Discussions on selling iPhone apps, Drupal development
  • Comparing Amazon vs. Google cloud hosting services
  • A review of tools for Ruby development
  • A review of three frameworks (django, rails and Zend) and a discussion on client-side web testing
  • Discussions on freelancing, entrepreneurship, and Co-working
  • Some microcontroller sessions, mostly around Arduino development
  • Got introduced to LOLCode; there really is such a thing...

There were a lot of interesting sessions to take in; definitely worth a try if you haven’t been to one.

http://www.barcampboston.org/