It doesn’t pay to put a Raspberry Pi camera out directly facing the great British outdoors for more than a season, even if you can keep the water out of it. I had an RPi Model B and camera doing just that, and groused about the lens crazing problem, where there seems to be some sort of microbial attack on the lens after a season outdoors.
That didn’t respond to pretty aggressive scrubbing with isopropyl alcohol (IPA), and the Pi lenses are proprietary.
The back side of the lens, not facing the elements, looks fine.
They are not standard M12 CCTV lenses, so I got to buy another camera board, and used Sugru and a cut-down glass microscope slide to try and keep it intact. I can buy aftermarket RPi-compatible cameras using M12 CCTV lenses now, but then it wouldn’t fit in the PICE case.
Now I have convinced myself that I can get a version of the OpenEEG hardware to run into EEGmir, I want to see if I can reproduce one of the Cade-Blundell filters. I have an analogue simulation from earlier, and I want to see if I can reproduce this in EEGmir. The filter specification protocol in EEGmir is the same as in Fiview from Jim Peters’ site[ref]they use the same underlying library, fidlib[/ref], and since that displays the transfer function it looks like a good place to start.
A tale of Linux graphical display woe…
The Windows version doesn’t run, beats me why. So I try it on Linux. My most powerful Linux computer is an Intel NUC, but because Debian is hair-shirt purist and therefore snippy about NDAs and proprietary drivers, I think it doesn’t like the graphics drivers – it was tough enough to get the network port working, and the X server and VNC are deeply borked on it. If something is stuffed on Linux then it’s reload from CD and start again, because I haven’t got enough life left to trawl through fifty pages of line noise telling me what went wrong, so I’m stuck with the command line. So I try fiview on the Pi, and this fellow sorts me out on TightVNC and the Pi, which is a relief – trying to get a remote graphical display on a Linux box seems to be an endless world of hurt, and I only have a baseband video monitor on the Pi console.
Simulating the 9Hz Blundell filter
I already have SDL 1.2 on the Pi, so it goes. Let me try the 9Hz channel, which was the highest Q of the Cade-Blundell filters. If you munge the order and bandwidth specs you get fc=9Hz BW=1.51.
Converting that to Fiview-speak, that is
fiview 256 -i BpBe2/8.22-9.72
which in plain English means simulate a sampling rate of 256Hz, bandpass Bessel 2nd order IIR between 8.22Hz and 9.72Hz. So let’s hit it.
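As a cross-check, a roughly equivalent filter can be sketched in Python with scipy – this is not fidlib itself, and scipy’s Bessel normalisation differs slightly, but scipy’s bandpass design doubles the prototype order in the same way, so order 2 here gives the same four poles as BpBe2:

```python
# Sketch: approximate the fiview spec "BpBe2/8.22-9.72" at fs=256Hz
# using scipy's digital Bessel bandpass design (bilinear transform,
# as fidlib uses). Not fidlib itself - expect small differences in
# normalisation, but the shape of the response should match.
import numpy as np
from scipy import signal

fs = 256.0                       # sampling rate, Hz
b, a = signal.bessel(2, [8.22, 9.72], btype='bandpass', fs=fs)

# Spot-check the magnitude response: stopband, near centre, stopband
freqs = [2.0, 9.0, 50.0]
w, h = signal.freqz(b, a, worN=freqs, fs=fs)
for f, mag in zip(freqs, np.abs(h)):
    print(f"{f:5.1f} Hz  gain {mag:.4f}")
```

The gain near 9Hz should come out close to 1, with heavy attenuation at 2Hz and 50Hz, matching the Fiview plot.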
Unfortunately the amplitude axis is linear, which is bizarre. Maybe, mindful of their 10-bit (1024-level) resolution, OpenEEG didn’t want to see the horror of the truncation noise and hash. I can go on Tony Fisher’s site (he wrote the base routines Jim Peters used in fiview) and have another bash.
Running the analogue filter with the same linear frequency display I get
which shows the same response[ref]It’s not strictly identical because of the increasing effect of the frequency warping of the bilinear transformation as the frequency approaches fs/2. But in practice, given the fractional bandwidth of the filters, the warping only shows up as a subtly different shape in the tails of the upper stopband; I struggle to see it here.[/ref]. H/T to the bilinear transformation for that. I had reasonable confidence this would work – I did once cudgel my brain through this mapping of the imaginary axis of the s-plane onto the unit circle when I did my MSc. Thirty summers have left their mark on the textbook and faded the exact details in my memory 😉 But I retained enough to know I’d get a win here.
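The warping is easy to quantify: the bilinear transform maps a digital frequency f to an effective analogue frequency of (fs/π)·tan(πf/fs). A quick sketch shows how tiny the error is down at 9Hz when fs/2 is 128Hz, and how large it gets as you approach fs/2:

```python
# How much does bilinear-transform frequency warping move the band
# edges of the 9Hz filter at fs = 256Hz? The transform maps digital
# frequency f to an effective analogue frequency (fs/pi)*tan(pi*f/fs).
import math

fs = 256.0
for f in (8.22, 9.72, 100.0):   # the two band edges, plus a high frequency
    warped = (fs / math.pi) * math.tan(math.pi * f / fs)
    print(f"{f:6.2f} Hz -> {warped:7.3f} Hz  (error {warped - f:+.3f} Hz)")
```

At the band edges the shift is a few hundredths of a hertz, which is why the digital response is visually indistinguishable from the analogue one; at 100Hz the warping would be enormous.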
For the last year or so I’ve been trying to make a timed-start recorder using a Raspberry Pi and the Wolfson/Cirrus audio card. I was able to make it work, but never to eliminate some rattiness in terms of overruns on record – I confess I couldn’t hear them, but it didn’t give me a good feeling. Then I added up the costs –
£25 – Cirrus Audio card
£27 – Raspberry Pi B+
£10 – case and odds and sods to make it work
£20 – PCB, time and bits to make a preamp to get from mic to line level
so I’m looking at £80 to get off the ground, and that gives me a seriously power-hungry SD audio recorder, although I can use a timer to save the power drain for active service.
Alternatively, if I could crack the remote control for them, I could go on ebay and get a secondhand Olympus LS10, or one of the similar models (LS-5, LS-11, LS-12, LS-14), and use my own LS10 to start with. I can feed a mic straight into the LS10 – no extra preamp required – and the audio spec is good.
Reverse engineering the Olympus remote control protocol
This cost me £90 on ebay, and it turned out I didn’t need it. You get the info for free, but then I got a natty nearly-new LS-14 with an RS30 remote control, so I’m not too unhappy. Unfortunately the RS30 doesn’t work with my Olympus LS10, don’t know why. I’d have been hacked off if I’d just got the RS30[ref]I’ve just been onto the Olympus RS30 website, and the list of models it is compatible with includes the LS-3, LS-5, LS-11, LS-12, LS14, LS-20M and LS100, so perhaps my LS10 was never compatible with it, and Olympus have changed their mind since writing the LS10 manual, which says on p65 “Exclusive remote control RS30W (scheduled for Spring 2008)”[/ref]. It works a treat with the LS14 it came with, and on their own an RS30 seems to go for £50, so I got an okay deal.
The connector is an evil little 2.5mm four-pole jack, and these are a bear to solder.
I can’t help wondering if life would be easier using a three-pole jack, since only sleeve and ring are needed. Now I didn’t like that battery in dashanna’s version – I mean who the heck would make a wired remote for a machine offering you a 3.3V supply on the tip of the plug and demand you go fit a battery in your remote? It’s just not a clean engineering solution at all. But apparently it works.
So I rigged the cable in series with the RS30 and sniffed the signals. Of the TRRS the tip had 3.3V, the second ring seemed open circuit, the first ring had the wanted signal and the sleeve was ground. Presumably the IR receiver and LED driver are powered off the 3.3V on tip. The signal on the first ring rests high at 3.3V.
In practice you can ignore the second pulse; for all I know it could be an ack back to the receiver to light the LED. I tried using a couple of diodes to pull the signal down to 1.2V, but that didn’t start recording. I then figured this is one of those analogue resistor-chain remotes, so I looked for what resistor would give me ~1.5V. Turns out if you replace the 1.5V battery in dashanna’s schematic with 100k you get about 1.5V, and the recorder starts recording. You don’t need the second pulse at all, and the debouncing seems to be done in the recorder – it takes a little while, up to about half a second, to start recording. I guess that means inside the recorder there’s a 100k resistor to the 3.3V rail in series with the first ring.
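That guess can be sanity-checked with the potential-divider sum: if there really is a 100k pull-up to the 3.3V rail inside the recorder (my assumption, not anything from a service manual), hanging 100k from the ring to ground leaves the ring sitting at half rail, close enough to the ~1.5V a button on a resistor-chain remote would present:

```python
# Sanity check on the resistor-chain theory: assume a 100k pull-up
# to the 3.3V rail inside the recorder (a guess, not documented),
# and see what voltage various external resistors to ground would
# put on the ring contact of the jack.
def ring_voltage(r_ext_ohms, r_pullup_ohms=100e3, v_rail=3.3):
    """Potential divider: v_rail * r_ext / (r_pullup + r_ext)."""
    return v_rail * r_ext_ohms / (r_pullup_ohms + r_ext_ohms)

for r in (47e3, 100e3, 220e3):
    print(f"{r/1e3:5.0f}k -> {ring_voltage(r):.2f} V")
```

100k gives 1.65V on this model, which squares with the recorder accepting “about 1.5V” as the record command.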
That works with both the LS10 and the new LS14, although the RS30 only works with the LS14. So now all I need do is mod the timer to pull down a couple of pins, one through 100k. If I make the stop command an open-drain pin to the ring, and the rec command a normal pin resting high via 100k to the ring, and pull the relevant pin down for 100ms, I should be good to go.
The problem is still the same as it was this time last year – the birds get up before I do in the Spring and I can only be one place at a time. Automatic recording devices let me scout locations in parallel.
A timed field recorder needs to be cheap, because somebody might nick it, it needs to be weather-resistant because it’ll be stuck outside, and it needs to be low-power, because 13A mains sockets are rare outside. Oh and it needs to be standalone, and not part of some cloud, because mobile Internet is ratty and expensive.
tl;dr the hardware performance is good but the software support is dire. You can make this work, but it isn’t fun at all. If you can use something like a USB stereo audio-in board then do that rather than use this Cirrus Logic Audio Card, particularly if you have mains power available. I like the Behringer UCA202, and it works with the Pi.
A Raspberry Pi and a Wolfson audio card sort of fitted the bill, but the Wolfson Audio card is no more. I say sort of, because I’m still looking at about £70 for a Pi[ref]HiFi World clocks it in at £220![/ref], the audio card and enough odds and sods to power it. You can buy a Zoom H1 for about £80, although there’s still a bit more cost in powering it for long periods, keeping the water out and making up some gizmo to pretend to be you pressing the big REC button early in the morning.
But with the Pi I get to drive the recorder via cron and ssh, and transfer the files via the internet or mobile data in some places. Even if I don’t get a case, though they are to be had for the Pi/CL Audio card combination…
I have one of these Ravpower iSmart USB batteries, and it works a treat when used as the manufacturer intended – to power a mobile phone or an iPod (4th gen touch in my case). No complaints whatsoever.
I constructed a remote GPS module with a MAX232 RS232 chip, and all of this wants to run off 5V – the MAX232 is specced at 4.5V to 5.5V, and the GPS is probably more tolerant. So the obvious thing to do is to cut up a USB cable, use the USB A plug and wire the power to my device from that. No need for a regulator, job done, and indeed the GPS fires up. Dandy. No need for 5V regulators, no need to mess about with undervolt cutoff, 5V power straight out of the box – what’s not to like?
An intelligently managed battery
The USB battery gives me a USB-chargeable device and integrated power management – you can’t overcharge these or run them flat, and as someone who has just trashed a LiPo battery by leaving it connected overnight and flattening it, I appreciate that thought. Until I find out that
iSmart is too darn smart
and decides my device isn’t drawing enough power and pulls the plug after a couple of minutes. Damn. My GPS draws a hefty 50-60mA, depending on whether the unbelievably bright LED the Chinese makers decided to fit is on or not.
I took my Radio Amateur’s Exam (RAE) in 1978 – I’d been interested in electronics as a child, but I was never going to be able to afford any gear, and I thought a technical interest would add a little bit of colour to my application to do Physics at Imperial College. My grandfather had been a radio amateur and he gave me an old homebrew crystalled 2m AM rig. But when I fired it up, and my Physics teacher, who was a radio ham, looked for the signal at his home about 500 yards away, there was nothing there, and I didn’t have the skill or gear to know what to do. I had a multimeter but no ’scope. I could do the RAE with a general electronics background, revising the licensing terms on the train up from southeast London to City and Guilds, which was taken in what I think was the University College London building in Malet Street. Imposing joint, I think it looked like this.
I got into Imperial. Didn’t do anything with the pass for over ten years until I came to Suffolk, where there were a few radio amateurs in the group I joined, and I got my amateur licence in 1991 – next year I will be eligible for the QCWA 😉 Initially I used a modified Tait PMR rig on 2m, but getting crystals cut got old quickly because it was dear, so I bought a secondhand FT290. However, I was living only 15m above sea level in the town, I never got my head round all this propagation malarkey, and not having Morse meant I had to stay at 6m and up, ISTR. I stuck with 2m, and I was never going to be doing this working-the-world thing without HF[ref]I’ve simplified things a lot – I had technical and engineering skills but no talent for operating[/ref], and was always a second-class licensee ’cuz only Real Men used HF.
The Internet and Amateur radio
Packet radio and the early TCP/IP over KISS modems were interesting, and how I learned some of how routeing went. Then the Internet happened and basically ate amateur radio’s lunch – well, what was left of it after GSM mobile and SMS became widespread. It’s difficult for anybody born after 1990 to realise just how poor communications were, but in the end, when you just want to get in touch, a modern mobile phone has solved most of the problems amateur radio had uniquely addressed, if we leave out the self-training and experimentation lark. Amateur radio had been doing okay with data communications and AX.25 packet, but then that Berners-Lee chap invented the Web, and broadband showed up. It looked like game over, and, well, as for so many people, life and work kinda gets in the way.
A different era, and different applications
Recently I had a use for APRS, so I took another look, and I like what a new generation have done with amateur radio – they’ve grabbed it by the short and curlies and dragged it into the 21st century, working with modern networking and tech rather than harking back to the golden days of Morse and tubes. I have nothing against Morse or tubes, and indeed, now that I don’t have to pass Morse to get on HF, I am messing about with it.
Wouldn’t it be nice if I could take a picture of a bird as it passed through an invisible beam of light? The idea’s not original, these things exist, but they are quite dear, so I am experimenting with making these.
The most obvious way is a light source and a photocell, and indeed many years ago at secondary school I developed an analogue circuit[ref]people normally consider monostables as digital but mine was built using discrete transistors and resistors, and the time delay was infinitely variable, as it would be with a CMOS 4538, so I consider it analogue[/ref] using OC71 transistors scavenged off postwar computer boards to make up monostable multivibrators for the delay elements and one with the black scraped away from the housing to act as a phototransistor.
This gonzo technology of 40 years ago triggered the flash for the source negatives used in the animation – you set a very slow drip, and as the drop passes the photocell it triggers the delay. By increasing the delay between the drop passing and the flash going off you get the progressive animation, assuming each drop makes a similar pattern.
This was done with a manual camera – a new Canon AE1, ISTR, that one of the other kids had. But the trick is to do all this in the dark: click the camera onto Bulb and use the trigger to trip the electronic flash, which responds within milliseconds and has a short duration of about a millisecond if you reflect some of the flash back into the photocell of the flash (to turn it off as early as possible).
So there’s nothing incredibly hard about doing this in controlled conditions, in a darkened room. If I were doing it again, I’d do it the same basic way, using a phototransistor and a CD4538 CMOS dual monostable rather than a discrete monostable – one half to give the delay, controllable with a pot, and the other to make a pulse off the falling edge to go into an NPN transistor to trip the flash. There’s no need to muck about with PIC microcontrollers or Arduinos, though you could do it that way if you really have to, for a higher cost plus the aggravation of writing code, plus the jitter of the Arduino sampling the sensor/responding to the interrupt. In high-speed photography sub-milliseconds matter.
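Picking the delay pot is straightforward, since the CD4538’s output pulse width is, to a first approximation, T = R×C (check the datasheet for the valid R and C ranges). The component values below are illustrative, not the ones from my original circuit:

```python
# CD4538 monostable: the pulse width is approximately T = R * C.
# Rough values for a drip-photography delay stage - a 1M pot with a
# 1uF timing cap sweeps the delay from tens of ms up to ~1 second.
# (Illustrative values, not the ones in the original circuit.)
def pulse_width_s(r_ohms, c_farads):
    """First-order CD4538 timing approximation: T = R * C."""
    return r_ohms * c_farads

c = 1e-6                       # 1uF timing capacitor
for r in (10e3, 100e3, 1e6):   # pot positions: 10k, 100k, 1M
    print(f"R = {r/1e3:6.0f}k -> delay {pulse_width_s(r, c) * 1e3:7.1f} ms")
```

A 1M pot with a 1µF cap spans roughly 10ms to 1s, a sensible range for the drop-animation delays described above.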
Everything gets harder outdoors
Outdoors you have massive and variable amounts of light from the sun, distances are longer, there’s just a whole lot more hurt all round.