Tuesday, October 1, 2013

PiCam Update

After a four-month detour writing a web server and playing with GPIO stuff, I have been getting back to the original project, which was to make a free-standing webcam.  Last weekend I packed up the pi (complete with its whole GPIO spaghetti tangle) in a small box with the webcam and a 4400mAh cell phone battery (which, in testing, gets me about 6-8 hours of continuous use).  The photo below was taken in the living room (my wife happened to be in the frame at the time).




I bungeed the whole setup to the carrier on my bike, set it to record, and took it around the block.  Unfortunately the camera didn't like the sunlight, and all of the outdoor footage was washed out.  I'm also (still) having trouble getting the webcam to work reliably with the unpowered USB hub, and don't have a way (as of yet) to power my active USB hub remotely.  Once I get the kinks worked out I'll post the bike cam footage.

I did get it to successfully record video inside the house with the powered USB hub, but it's more stunning footage of the ceiling fan.  Truly riveting stuff - before viewing be sure to push back a few feet from the monitor and to hold on to your skull with both hands to keep your mind from being BLOWN RIGHT OUT OF YOUR HEAD.


Friday, August 23, 2013

OliWeb with PHP

After getting the CGI mechanism working with OliWeb, it wasn't that much of a stretch to add PHP support.  I've just posted updates to GitHub for the as-yet-barely-tested capability.

I already know that the CGI QUERY_STRING is not compliant (I'm sending the full request plus arguments in the env variable instead of just the stuff to the right of the '?', which is what the standard calls for).  I'm not sure whether this will cause problems for PHP, but it will be easy enough to test out (fixing the QUERY_STRING CGI compliance has been on my to-do list for a long time anyway...).

https://github.com/m2ware/OliWeb

[UPDATE] There was really no excuse other than laziness for not fixing the QUERY_STRING to be CGI-compliant and I wanted to make sure that PHP wouldn't have any trouble with it, so I went in and fixed that.  I added a PHP script and an HTML submit form to test proper passing / parsing of arguments through the CGI interface into PHP, which appears to work just fine!
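
For reference, the gist of the fix is just splitting the request URI at the first '?' and exporting only the right-hand side, per the spec.  A minimal sketch (illustrative names, not necessarily the actual OliWeb method):

#include <string>
#include <cstdlib>

// Split a "GET /cgi-bin/test.php?foo=1&bar=2" style URI into the script path
// and the CGI-compliant QUERY_STRING (everything right of the first '?').
static void setQueryStringEnv(const std::string &requestUri)
{
    std::string scriptPath = requestUri;
    std::string queryString;          // stays empty if no '?' is present

    size_t qPos = requestUri.find('?');
    if (qPos != std::string::npos)
    {
        scriptPath  = requestUri.substr(0, qPos);
        queryString = requestUri.substr(qPos + 1);
    }

    // Per the CGI spec, QUERY_STRING holds only the part after the '?'
    setenv("QUERY_STRING", queryString.c_str(), 1);
    setenv("SCRIPT_NAME", scriptPath.c_str(), 1);
}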

Thursday, July 18, 2013

Chirpy Pi (or - Robot Rooster)

It's been a bit since I've posted, but I've spent the last few weeks messing around with the GPIO pins on the Pi.

I pulled Gordon's code and used it to do some "hello world" LED stuff.  After that, I spent some time figuring out how to read/write the pins in C from Gordon's examples and some others floating around the web.  

Using a couple of bipolar junction transistors I had around, an infrared LED, and the buzzer from a smoke alarm, I've turned the pi into a (very vocal) chirping light detector.  It's got the following components:


  • BJT amplifier with an IR phototransistor at the base, attached to a GPIO pin configured as an input
  • Shell script using Gordon's cmd-line pin reader that looks for a change in the pin state between 100ms sleeps...
  • ...which invokes a pulse train generator that I wrote in C that turns a pin on and off with programmable frequency and duration (sketched just after this list)...
  • ...which drives another BJT circuit that makes the buzzer chirp away
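
For the curious, the pulse train generator is roughly along these lines - a bare-bones sketch assuming Gordon's wiringPi library and pin numbering, with the pin, frequency, and duration coming in as command-line arguments:

// Rough sketch of the pulse train generator, assuming Gordon's wiringPi library.
// Usage: pulsetrain <pin> <frequency Hz> <duration ms>   (GPIO access needs root)
#include <cstdlib>
#include <wiringPi.h>

int main(int argc, char **argv)
{
    if (argc < 4) return 1;
    int pin        = atoi(argv[1]);
    int freqHz     = atoi(argv[2]);
    int durationMs = atoi(argv[3]);
    if (freqHz <= 0 || durationMs <= 0) return 1;

    wiringPiSetup();                       // wiringPi pin numbering
    pinMode(pin, OUTPUT);

    int halfPeriodUs = 500000 / freqHz;    // half of one square-wave period, in us
    int cycles = (durationMs * 1000) / (2 * halfPeriodUs);

    for (int i = 0; i < cycles; i++)
    {
        digitalWrite(pin, HIGH);
        delayMicroseconds(halfPeriodUs);
        digitalWrite(pin, LOW);
        delayMicroseconds(halfPeriodUs);
    }
    return 0;
}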



The photo detector circuit is pretty sensitive, so I put the whole contraption (pi + case + breadboard circuitry) in a cardboard box with a hole cut for the phototransistor to "see" out of.  With that configuration, I can point it at a window and it will chirp any time someone walks in front of it.

I've also made a couple of longer pulse train variants that I can invoke from the web server (via CGI) remotely so I can make it chatter at the wife and kids when I'm out of town for work (I think the kids are probably more amused than my wife is...).




Wednesday, May 1, 2013

Sodding Git


I went ahead and put the OliWeb code in Git and attached an MIT boilerplate license to it.  Feel free to use it if interested!  CGI parameter passing doesn't work yet (on my to-do list).  After a few minor fixes, I've gotten it running on several Linux platforms including 64-bit (Intel) Ubuntu.  If you happen to use it on another Linux distro, shoot me a note and let me know if there are any issues (or if it works beautifully out of the box).

Git Page Here:

https://github.com/m2ware/OliWeb

Git location:

https://github.com/m2ware/OliWeb.git

Friday, April 5, 2013

OliWeb and IvySox

One of the goals that I've had for the pi project is to run a CGI-enabled web server so that I can connect from a browser and invoke shell scripts or programs remotely.  I wanted something very small and lightweight that would serve up content, invoke scripts, and potentially embed video / snapshot content from the camera.

So which software packages were considered for this job?  None of them, actually.  I know there are probably seven different things out there that would suit my purposes, filling every possible niche between command-line socket listener and Apache, but I wanted to write my own.  I've always wanted to learn how low-level socket programming works but haven't had a really good excuse to invest the time until now.  While this has been a bit of a quixotic (quixidiotic?) detour from the main thrust of the project (to the tune of about 2 weeks), I'll borrow in my defense the brilliantly-tautological, debate-silencing quip of my former roommate Gary: "what's fun for me is fun for me".

It turns out that socket programming is really not all that difficult, apart from all the weird old C-language structs with quasi-polymorphic casting, old-school error handling, and inscrutably-abbreviated variable names.  Having resources like Beej's Guide to Network Programming available also helped tremendously - it's a great and very readable resource with solid examples, without which this would have been significantly more painful.  I'm ordering one of his hard copies and making a PayPal donation out of gratitude, and suggest you do the same.

So this project yielded two main artifacts.  The first is IvySox, which is a C++ wrapper for the socket stuff with, you know, modern, readable variable names and method calls.  It provides simple interfaces for things like "listen on this port", "accept inbound connection", "receive message", "send message" and the like, and has some helper classes for managing connections that mask all the scary C structs.  It's pretty messy right now, and I need to do a whole bunch of stuff like adding more methods for client-side comms (since I was building a web server, it's pretty server-side focused), adding full IPv6 support (it's partway there, but I haven't verified that it's 100% compatible), and general code cleanup and pruning of all the abortive false-start stubs that are still hanging around.
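
For anyone who hasn't seen the raw material IvySox is wrapping, the server side boils down to the classic socket/bind/listen/accept dance.  A stripped-down, IPv4-only sketch (error handling mostly omitted - see Beej for the real treatment):

#include <cstring>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main()
{
    // Make a TCP socket and bind it to port 8077 on all interfaces
    int listenFd = socket(AF_INET, SOCK_STREAM, 0);

    sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(8077);

    bind(listenFd, (sockaddr *)&addr, sizeof(addr));
    listen(listenFd, 10);                        // allow up to 10 pending connections

    for (;;)
    {
        // Block until a client connects, read its request, send a canned reply
        sockaddr_in client;
        socklen_t clientLen = sizeof(client);
        int connFd = accept(listenFd, (sockaddr *)&client, &clientLen);

        char buffer[2048];
        recv(connFd, buffer, sizeof(buffer), 0);

        const char *reply = "HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok";
        send(connFd, reply, strlen(reply), 0);
        close(connFd);
    }
}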

The web server itself is a class called OliWeb, which uses IvySox interfaces to listen on a chosen port, receive inbound connections, and serve up files.  It's lean and mean, can invoke CGI scripts or programs and display their (stdout) output, logs all inbound traffic, etc.  OliWeb started out single-threaded, but I bit the bullet and refactored to multithread with a single detached pthread servicing each inbound HTTP request.  Here it is!  This is the home page (index.html) as viewed from Safari on my iPhone (I'm not clearing any shelf space for Placards Proclaiming Greatness in the Field of Web Design just yet):


Here's the snapshot interface where I've got a CGI script set up to take a pic (using fswebcam at the moment) from the attached webcam (sitting on a bedside table, taking thrilling pics of a bookshelf and the just-glimpsed reflection of the ceiling fan... anyone for starting a meme that involves surreptitious midday photos of household furniture and partial views of idle home appliances?). 


Yesterday a co-worker and I pounded it with a bunch of these in parallel:

while true; 
do curl http://10.10.48.141:8077/snapshot/snapshot.jpeg > /dev/null; 
done

while simultaneously using the browser and navigating around.  It's holding up great... now.  One snag to be aware of when setting up a server (which I learned the hard way) is that a client disconnecting mid-stream (like hitting CTRL-C on the above script, or hitting "X" or stop in your browser while it's still loading images) will instantly and unceremoniously terminate your server.  After a little internet research, we discovered that your program has to provide a handler to intercept SIGPIPE signals (lucky #13!).  The always helpful (if not particularly friendly) Stack Overflow pretty much sums it up - handle it as a no-op and go on about your business (I'm detecting the error after the signal now, though, so I don't continue trying to push bits down the broken pipe).
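
The fix itself is basically a one-liner at startup, plus checking the return value of send() so a dead connection ends just that one request instead of the whole server.  Something like this (one way to do it - ignoring the signal outright rather than installing an empty handler):

#include <cstddef>
#include <csignal>
#include <sys/socket.h>

// Call once at startup: a client hanging up mid-send now makes send() return
// -1 (errno EPIPE) instead of killing the whole process with SIGPIPE.
void ignoreSigpipe()
{
    signal(SIGPIPE, SIG_IGN);
}

// In the send loop, give up on just this client when the pipe breaks:
bool sendAll(int socketFd, const char *data, size_t length)
{
    size_t sent = 0;
    while (sent < length)
    {
        ssize_t n = send(socketFd, data + sent, length - sent, 0);
        if (n < 0) return false;    // broken pipe (or other error): stop pushing bits
        sent += (size_t)n;
    }
    return true;
}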

For this particular application, detached pthreads work great.  I tried to go down a more complicated path involving a managed thread pool with join logic and monitoring and what have you, but at the end of the day it's a lot easier to fire and forget: just pthread_create() the handler threads in a detached state and have them off themselves when they're done:

void OliWeb::handleInboundRequest()
{
    // Accept inbound request from socket listener
    InboundRequest *request = new InboundRequest();
    request->oliWebPtr = (void *)this;
    request->socketNumber = ivySox.acceptInbound(&request->inbound);

    // Create a detached thread to handle inbound request
    pthread_t aThread;
    pthread_attr_t threadAttribute;
    pthread_attr_init(&threadAttribute);
    pthread_attr_setdetachstate(&threadAttribute, 
                                PTHREAD_CREATE_DETACHED);
    int result = pthread_create( &aThread, &threadAttribute, 
                                 threadEntryPoint, (void *) request);
    pthread_attr_destroy(&threadAttribute);
    writeLog("Thread launch result = " + toString(result));
}


void *threadEntryPoint(void *requestVoid)
{
    InboundRequest *request = (InboundRequest *) requestVoid;
    OliWeb *oliWeb = (OliWeb *)request->oliWebPtr;
    oliWeb->threadRequestHandler(request);
    cout << "Deleteding request handler!!" << endl;
    delete request;
    cout << "EXITING THREAD!!" << endl;
    pthread_exit(NULL);
}

I've still got a weird thing going on with parallel handler threads right now where one of them closing its connection after sending its message seems to kill all of the other threads that are still mid-stream, which I've "solved" (i.e. bottlenecked) with a mutex lock on all send operations.  Figuring this out is on my (akin-to-reorganizing-my-sock-drawer-in-level-of-current-interest) to-do list.
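
For what it's worth, the "bottleneck" is nothing fancier than a process-wide mutex around send() - roughly like this (illustrative names, not the actual OliWeb code):

#include <cstddef>
#include <pthread.h>
#include <sys/socket.h>

// Shared across all handler threads; serializes sends as a stopgap until the
// cross-thread connection-close issue gets properly diagnosed.
static pthread_mutex_t sendMutex = PTHREAD_MUTEX_INITIALIZER;

ssize_t guardedSend(int socketFd, const char *data, size_t length)
{
    pthread_mutex_lock(&sendMutex);
    ssize_t result = send(socketFd, data, length, 0);
    pthread_mutex_unlock(&sendMutex);
    return result;
}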

I wanted an extensible XML configuration file as well, so I looked at a number of available options and picked TinyXML-2 which is just about perfect for my purposes (light! fast! reasonably-compliant!).  It's overkill right now since I've only got like 5 configuration options in total (like where is the root web directory, where does the CGI-BIN live, what port to use, and two other ones that I can't remember offhand), but I wanted to have it "just in case" and may have need of a lightweight XML parser for other aspects of the pi project.
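
Reading the config with TinyXML-2 amounts to just a few lines.  Something like this (element names here are made up for illustration, not necessarily what's in the repo):

#include <string>
#include <tinyxml2.h>

// Loads a hypothetical config along the lines of:
//   <OliWeb><Port>8077</Port><RootDir>/var/www</RootDir></OliWeb>
bool loadConfig(const std::string &fileName, int &port, std::string &rootDir)
{
    tinyxml2::XMLDocument doc;
    if (doc.LoadFile(fileName.c_str()) != tinyxml2::XML_SUCCESS)
        return false;   // keep compiled-in defaults if the file is missing or broken

    tinyxml2::XMLElement *root = doc.FirstChildElement("OliWeb");
    if (!root) return false;

    if (tinyxml2::XMLElement *e = root->FirstChildElement("Port"))
        e->QueryIntText(&port);
    if (tinyxml2::XMLElement *e = root->FirstChildElement("RootDir"))
        if (e->GetText()) rootDir = e->GetText();

    return true;
}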

"But wait!" (one might say), "Since you're already doing a ground-up web server with the sockets and the pthreads and the what have you, why not go whole-hog and write your own XML parser while you're at it?".  To which I would reply "I already did one like six years ago.  It was an STL-based thing that was also light and fast, but kinda sloppy and basically swiss cheese from a compliance perspective.  So instead of starting from scratch or trying to polish that turd, I just decided to grab one of the abundant readily-available options.  And, really, I don't have to defend my choices to you, voice in my head - see 'Gary', above."

So: awesome.  I'm really pleased with how it came out and it allows me to do pretty much anything on the Pi remotely by invoking CGI scripts.  Detour over (cleanup notwithstanding) - I can now go back to futzing with motion .conf files to try to get the video streaming bit to work.

Saturday, March 30, 2013

Camera problems

After two weeks of messing with this, I still can't get consistent function from the webcam with any software.  I've tried motion, mplayer, vlc, and webcam so far.  I have managed to get a few successful image captures from motion, but it mostly just complains about not being able to connect to the camera and other complainy stuff.

I'd like to see the platform get better out-of-the-box support for webcams... it was actually easier to write my own web server for the pi (which I did last week, complete with CGI support!) than to get the stupid camera to work.

Most of my time has been spent with motion, which a co-worker has running on an Ubuntu box at work 24-7.  I've been through every darned setting in the massive config file like 8 times and can get different (wrong) behavior, but none of it amounts to streaming video or even functioning snapshot capture.

Monday, March 25, 2013

Find my pi

Now that I've got the pi up and running, I don't necessarily want to plug it in to keyboards and mice and monitors all the time... it's easier just to feed it 5V and log in from the laptop via ssh which is what I'm doing right now.

I've been doing this by using a broadcast ping from my Mac terminal app.  This just involves pinging the whole subnet by setting (typically) the last byte of the ping target to 255 (all bits = 1).  

Then you get responses from all the devices on your local area network.  Unfortunately, you have to try them all until you find the right one.  On my home network the pi tends to always get the same address (192.168.1.124), which is handy.

To enable the pi to answer broadcast pings (Raspbian ignores them by default), you can use this little helper script on the pi:

echo '..'

echo '# Disable ignore broadcast request'
echo 'net.ipv4.icmp_echo_ignore_broadcasts = 0'

read -n1

sudo nano /etc/sysctl.conf

Adding the second echoed line (sans quotes and echo) to /etc/sysctl.conf enables the broadcast ping response; run 'sudo sysctl -p' or reboot for the change to take effect.

Detecting the pi automatically on the network is one of the problems I plan to tackle.  Being able to connect to it with minimal fanfare no matter where I am is a goal.





Bubble Face

So right now I'm mostly working on the plumbing aspects of the camera project, but I've also been working on some prototyping related to facial recognition in Matlab.  I made this video on YouTube illustrating the "bubble face" algorithm I've built to identify the bounding region around human faces in a robust fashion. 



It can be summed up as:

  1. Grayscale
  2. Shrink
  3. Belligerent bells (difference of Gaussian HPF)
  4. High/low edge detection
  5. Morphological filter (1x contraction, 1x dilation)
  6. Bubble Face

The bubble face algorithm itself blows up an ellipse like a balloon, with the individual nodes getting "tied up" by edge features.  Every node is attached elastically to its nearest neighbors, though, so the balloon can "pull through" features that don't fit.  It stops inflating when it detects something "reasonably elliptical" (currently based on the coefficient of variation of the anchor point radii tied to edge features - the red points) and relaxes for a bit (which smooths out the distortion) before inflating again.  It does 3 cycles of this and tends to do a pretty good job of bounding facial features as long as it originates in a location near-ish to the center of the face.
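
The prototype is pure Matlab, but just to make the first five steps concrete, here's roughly what they would look like in C++ with OpenCV (kernel sizes and thresholds below are placeholders, not my tuned values):

#include <opencv2/opencv.hpp>

// Rough C++/OpenCV analogue of the preprocessing stages; the Matlab prototype
// is the real thing, and the parameter values here are placeholders.
cv::Mat preprocessForBubbleFace(const cv::Mat &input)
{
    cv::Mat gray, small, blurNarrow, blurWide, dog, edges;

    // 1. Grayscale, 2. Shrink
    cv::cvtColor(input, gray, cv::COLOR_BGR2GRAY);
    cv::resize(gray, small, cv::Size(), 0.25, 0.25);

    // 3. Difference-of-Gaussians high-pass filter
    cv::GaussianBlur(small, blurNarrow, cv::Size(3, 3), 0);
    cv::GaussianBlur(small, blurWide, cv::Size(11, 11), 0);
    dog = blurNarrow - blurWide;

    // 4. High/low edge detection (simple threshold on the filtered image)
    cv::threshold(dog, edges, 10, 255, cv::THRESH_BINARY);

    // 5. Morphological filter: one contraction (erode) then one dilation
    cv::erode(edges, edges, cv::Mat());
    cv::dilate(edges, edges, cv::Mat());

    return edges;   // edge map that the bubble inflation step gets tied to
}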

I'm a long way off from thinking about porting this to the pi, but using pi to do some image and audio signal processing will be something I'll get to once I've got a working platform.


Setup

So I've had the pi for about a week now.  I purchased it from Amazon and had it delivered (Prime) at the vanguard of a 4-day blitz of nubbins, dongles, adapters, bit buckets, and encasements.  Here it is!  In all its pint-sized glory...


[Pictured:

  • IOGEAR 4-port USB 2.0 hub (passive power)
      • Logitech QuickCam (pictured at bottom)
      • SanDisk 16GB thumb drive
      • Wireless dongle for ancient Logitech mouse (not pictured)
  • Old Kindle micro-USB power supply (5V)
  • 32GB (class 4... Zzzzz) SanDisk SD card
  • EdiMax WiFi dongle (hiding underneath hub connector)

Keyboard and monitor not currently attached - in "SSH-only" mode.]




[So that's the monitor (next to my MacBook) on the kitchen table.  Up to this point I've been about 50/50 between tiling terminal windows in the built-in X-win environment and SSH'ing in from either the MacBook or the iPad mini (with SSH client and bluetooth keyboard - great for tweaking on things from bed at 2am or during meetings)]




I went with the latest "wheezy" Raspbian distribution (hard float, recommended distro) for a few reasons:

  1. Being new to the platform and wanting to play with lots of different peripherals, I suspect the "vanilla" setup will be easiest and best-supported online for troubleshooting
  2. Where possible, I want to enable the broadest-possible audience to follow/replicate/improve upon my results
  3. I don't have any compelling reason (yet) to do anything else (well, hard float issues aside.  More on that later...)

The first couple of days I basically spent getting the thing up and running and getting the various pieces of hardware to show up / mount / fulfill each one its unique and special purpose.

Below is a rough record of what I did to get stuff up and running.  Before trying to use my bumbling as a template for your own endeavors, I'd recommend giving a look to the basic setup guide.

Step 1: Image the SD Card

The first step was of course imaging the flash disk (for a couple of bucks extra you can order a pre-loaded card on Amazon or from one of these guys... no judgment!).  I first tried RPI-SD Card Builder, but for reasons that I could not ascertain and without any sort of useful hints or error messages it kept crapping out on my MacBook after roughly 15 minutes of chugging.  

I didn't try all that hard to troubleshoot - instead tried out the Pi Filler app, which worked on the first try (YMMV).

[As an aside - Pi Filler's IvanX also has a "finder" app to locate your pi on the network without having to plug in a keyboard and other helpful features.  I'd like to see really helpful tools like this get rolled into the standard distribution - more on this later.  Okay now.  Just a teaser.  I've set up a shell script on my Mac shell to do broadcast ping (response to which on Raspbian is disabled by default / out of the box, natch) and then I try them one at a time with SSH until I get a hit, but am working to make this happen automagically. ]

Thanks, helpful free tool providers of the interwebs!  I hope to follow in your sterling footsteps and provide something of value to the community as well.

Step 2: .....

Step 3: Profit!

After imaging the disk, it was pretty much just a matter of plugging in the USB hub, the HDMI-to-DVI cable to attach the monitor (just borrowing the Dell flatscreen from the home desktop), a USB keyboard (also borrowed from the home desktop), and a Kindle 5V micro-USB charger (we have what must be dozens of these and the like in every drawer in the house, accumulated over the past decade).  That's it!

The system boots into the one-time setup screen.

[I highly recommend doing what I didn't do and following the instructions to update raspi-config before using.  The reason I didn't was pure laziness... I didn't want to drag pi+wires+monitor halfway across the house and re-set up in range of a wired ethernet connection.  In point of fact, I have never used the ethernet jack on my pi - it's been wireless from day 1.  It hasn't cost me anything... but really - doing it the right way may spare you some headaches downstream.]

 Here's what I touched:
  1. Turned off the "desktop on boot" option (personal preference, but I always prefer to start in a shell and startx only when I think it will be helpful or necessary, thank you very much - and this goes double for a little guy like the pi).
  2. SSH on (you didn't get a marvelously-portable Raspberry Pi just to have it shackled to keyboards and monitors and the like all the time, right?)
  3. Set timezone
  4. Changed my password
[OMINOUS PAUSE...]

From there it cheerfully booted up in seconds with a perky scroll of crisp, hi-res Linux-y-looking startup commands under the approving gaze of a plump little raspberry overlord (I call him Baron von Raspby).  While this experience may not carry with it the high-polish commercial sheen of the anthropo/mythical ritual unboxing of the latest Apple iThing, it still got my little hobbyist's heart aflutter watching it boot up for the first time...   

Step 3 Revisited: A Cautionary Tale

Fun fact: did you know that our friends across the pond have their very own keyboard layout?  And that the pi comes with this layout (in my experience, 100% of the time) out of the box?  It's called the "GB" keyboard layout, which I understand to be shorthand for Good Bangers (and mash).  While the folks in "Great Britain" speak the King's English, for those of us in the colonies, know this: the moment you venture beyond the letters and the space bar and go "off alpha", you're in a twisted shadow-world of through-the-looking-glass proportions.

So with a quick web search I found the appropriate system file to edit (can't remember offhand but I'll post it later) and changed "GB" to "US" so I could get things like my pipe [|] and hashes [#] and whatnot back to their rightful places on the keyboard.  Problem solved.  

So, on reboot, of course, I found myself locked out of the pi.  I'm a little slow sometimes, so it took me a few minutes to realize that my password, which (I don't think I'm compromising my pi's security too much to say this on the internet) contains one or more particular non-alphanumeric characters, was not what I thought it was when I originally entered it with the GB keyboard layout.  So I had to go back and look at the GB keyboard layout on Google Images to figure out which characters I had actually typed in place of the ones I thought I was typing when I set my password.

So, the punchline is: if you are in the US, just go with the default password (or at least stick to alphanumeric) until you change the keyboard setup.  After you change the keyboard to 'US', go nuts with your 

pi@raspberrypi ~ $ passwd

Summary

So, as long as you don't shoot yourself in the foot like I did (minor detour), the setup is largely painless and for the most part foolproof if you follow the yellow brick road.  I'll post more on the setup of the peripherals (WiFi, webcam, thumb drive, etc.) later on.

(Photo by Silver Screen Collection/Getty Images)

Salutations

Hi!  I'm starting this blog to chronicle the development of a hobby project using the Raspberry Pi platform that I'm referring to as the "Pi Eye".

What is the Pi Eye?

Simply put, the Pi Eye is a standalone streaming webcam on the rPi platform.  The plan is to use as many freely-available open-source components as possible (except when I get a wild-eyed urge to build something ground-up like, say, a web server - more on that to follow), to build something cool and useful, and to learn some things along the way.  I'm sharing the experience so that hopefully others can benefit from my stubbed toes and blind wanderings (as well as to get input and advice from the community of users).

A quick tidbit on background: I'm a moderately-skilled programmer and Linux user.  My background and schooling are in digital signal processing, but I have been in analytical business applications software (SaaS / Cloud "Predictive Analytics" software) for the past 10 years.  I'm most familiar with C/C++ and C# and have used Visual Studio for most of my professional career, but have also used Java, Eclipse, and Linux-based development.  My focus has always been on mathematical algorithms and I do a lot of work and prototyping in Matlab, for what that's worth.

I make no claims to being an expert in any of these languages or platforms, but part of the fun of this project for me is getting back into Linux-based dev (something I haven't done much of professionally for upwards of 5 years).  I'm kind of ecstatic about the pi platform and its possibilities, and hopefully can convince a couple of folks to give it a look.

Another objective is to build some tools for the platform to provide a smoother "on-ramp" for non-*[nix/nux]-hax0rs.

Hopefully a few folks in the hobbyist/pi/enthusiast communities will find some of this interesting (or, more selfishly, provide me with some tips and pointers).  I'll make all of the code freely available as well under some TBD minimally-restrictive license.