Tuesday, 3 January 2017

Sigfox Weather Station Grapher

A couple of months ago I attended a workshop on Sigfox, an IoT network provider. At the end of the session, we were generously given the development board we had practiced on so we could continue experimenting.

The Arduino-based SmartEverything board is loaded with sensors. One of the examples we played with was a simple weather station program that read temperature, pressure and humidity from the sensors and sent that every 10 minutes to the Sigfox network.
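For context: a Sigfox uplink message can carry at most 12 bytes, so the three readings have to be packed into a compact binary payload. Here is a purely illustrative Python sketch of that kind of packing (the actual Arduino example is written in C/C++ and may use a completely different layout):

import struct

# Purely illustrative encoding; the real example program may pack things differently.
def encode_weather(temperature_c, pressure_hpa, humidity_pct):
    payload = struct.pack(
        "<hHB",                           # 2 + 2 + 1 = 5 bytes, well under the 12-byte limit
        int(round(temperature_c * 100)),  # e.g. 21.34 degC -> 2134
        int(round(pressure_hpa * 10)),    # e.g. 1013.2 hPa -> 10132
        int(round(humidity_pct)),         # e.g. 45 %
    )
    return payload.hex()                  # hex string handed to the Sigfox send function

print(encode_weather(21.34, 1013.2, 45))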

It was a nice little introduction, but checking out the result meant logging into the Sigfox backend and looking at the incoming messages as text in a list. I thought I'd try to present the same information in a more graphical way and hopefully make it fun to play with. Here is the result:



How does it work?

The SmartEverything board runs the weather station program. Weather messages are sent regularly to the Sigfox backend over their network.

A Raspberry Pi runs a simple Node.js server that:
  • serves the grapher webpage (HTML and scripts basically) to any browser visiting it
  • forwards the AJAX calls made by the browser to the Sigfox backend
And finally, the user goes to the webpage hosted on the Raspberry Pi (which can be on a LAN at home or over the internet, depending on how all this was set up/deployed), and the grapher should appear!
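To give an idea of how little the server has to do, here is a rough Python/Flask sketch of those two jobs (the real project uses Node.js; the Sigfox API URL, route and credentials below are placeholders):

import requests
from flask import Flask, Response, send_from_directory

app = Flask(__name__)

SIGFOX_API = "https://backend.sigfox.com/api"   # placeholder base URL
API_LOGIN = "your-api-login"                    # placeholder credentials
API_PASSWORD = "your-api-password"

@app.route("/")
def index():
    # Serve the grapher webpage (HTML and scripts) from a local 'static' folder
    return send_from_directory("static", "index.html")

@app.route("/messages/<device_id>")
def messages(device_id):
    # Forward the browser's AJAX call to the Sigfox backend and relay the answer
    # (this keeps the API credentials off the browser)
    r = requests.get(f"{SIGFOX_API}/devices/{device_id}/messages",
                     auth=(API_LOGIN, API_PASSWORD))
    return Response(r.text, mimetype="application/json")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)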




Where can I get it?

The whole project is hosted on GitHub here: https://github.com/jbitoniau/sigfox-weather-station-grapher. You'll find all the information for setting it up there.



Sunday, 31 July 2016

Near real-time remote monitoring of home electric consumption with a Raspberry Pi


Just out of curiosity, I wanted to get a better and more precise idea of the amount of electricity I use at home every day. At which periods of the day do I use the most? Are there any particularly power-hungry appliances?

Just for the fun of it, I wanted a system as real-time as possible, that makes it easy to access the stored data, and while we're at it, that lets me do so remotely!

So I came up with this Raspberry Pi based setup:


The first thing was to measure the consumption at the electric meter. A safe way was to simply detect the flashes that my meter emits for each watt hour consumed.


I found a nice article explaining how to read the value of a photoresistor connected to the Raspberry Pi GPIO in Python. After logging these values and analyzing them, I came up with a little function that identifies the sudden changes in value that correspond to a flash.

To make sure it was working decently, I connected an LED to the Pi and flashed it whenever I detected a flash on the electric meter (that's the green one; the blue one is the WiFi dongle :) )
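Here is a rough sketch of that detection loop in Python, assuming the photoresistor is read with the usual RC charge-time trick (the Pi has no analog input); the pin numbers and threshold are illustrative, and the LED simply mirrors each detected flash:

import time
import RPi.GPIO as GPIO

SENSOR_PIN = 18         # photoresistor + capacitor
LED_PIN = 23            # feedback LED
FLASH_THRESHOLD = 0.4   # relative drop in charge time that counts as a flash

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

def read_light_level():
    # Discharge the capacitor, then measure how long it takes to charge back
    # through the photoresistor: more light means a shorter charge time
    GPIO.setup(SENSOR_PIN, GPIO.OUT)
    GPIO.output(SENSOR_PIN, GPIO.LOW)
    time.sleep(0.01)
    GPIO.setup(SENSOR_PIN, GPIO.IN)
    start = time.time()
    while GPIO.input(SENSOR_PIN) == GPIO.LOW:
        pass
    return time.time() - start

previous = read_light_level()
while True:
    level = read_light_level()
    if level < previous * (1.0 - FLASH_THRESHOLD):
        # Sudden drop in charge time: the meter's LED just flashed (one watt-hour)
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.05)
        GPIO.output(LED_PIN, GPIO.LOW)
    previous = level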


The next step was simply to count these flashes over periods of, say, 5 seconds and send the results to a Google Spreadsheet using the gspread Python library.

It turns out that updating the spreadsheet isn't super quick (it takes a couple of seconds), so sending data every 5 seconds was a good compromise for my totally overkill real-time requirement.
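The counting/upload loop then looks something like this (a sketch using a recent version of gspread; count_flashes_during is a hypothetical helper built on the detection code above, and the spreadsheet name and credentials file are placeholders):

import datetime
import gspread

gc = gspread.service_account(filename="credentials.json")
worksheet = gc.open("Home electricity").sheet1

PERIOD_S = 5   # one flash = 1 Wh, counted over 5-second periods

while True:
    count = count_flashes_during(PERIOD_S)        # hypothetical helper
    timestamp = datetime.datetime.now().isoformat()
    # Each append_row call takes a couple of seconds, hence the 5-second period
    worksheet.append_row([timestamp, count])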

Once everything was in place, I did some more tests like this one:



And then I let the system run for a few days...

Here is a 3-day graph where I collapsed the data points into watt-hours per minute. Working with the per-5-second data was too heavy for the grapher (that'd be 17280 points per day!). I also discovered that a Google Spreadsheet has a limit on the number of cells in a document (400 000, I think). How disappointing :) !
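Collapsing the raw counts into watt-hours per minute is just a matter of summing them by minute, along these lines (assuming the samples have been read back as (datetime, count) pairs):

from collections import defaultdict

def watt_hours_per_minute(samples):
    per_minute = defaultdict(int)
    for timestamp, count in samples:
        minute = timestamp.replace(second=0, microsecond=0)
        per_minute[minute] += count   # 1 flash = 1 Wh
    return sorted(per_minute.items())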


I tried to keep notes of what I was doing during these 3 days. Here is my attempt at explaining the different bits:


The Python code is available on GitHub. It's pretty rough around the edges but maybe it can give people some ideas!




Sunday, 16 August 2015

Piano Trainer

I've been taking piano lessons for a little less than a year. As I had never played music before that, my score-reading skills are still pretty poor. This gets a bit frustrating, especially when discovering a new piece of music: I spend ages reading the notes.

To get better at it, I wrote a little training program.


It's very simple: it randomly displays a note on a grand staff, waits for a key to be pressed on the piano (using MIDI), checks the result and displays an OK/Fail! message. It then continues with a new note until you realize it's dinner time. Here it is in action:
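The core of it boils down to a loop like the one below, sketched in Python with the mido library (the actual application on GitHub may be structured quite differently):

import random
import mido

LOWEST_NOTE, HIGHEST_NOTE = 36, 84   # roughly the range drawn on the grand staff

with mido.open_input() as port:      # default MIDI input port
    while True:
        target = random.randint(LOWEST_NOTE, HIGHEST_NOTE)
        print(f"Play MIDI note {target}")
        for msg in port:             # wait for the next key press
            if msg.type == "note_on" and msg.velocity > 0:
                print("OK!" if msg.note == target else "Fail!")
                break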


The application logs some statistics about the user's performance into a CSV file. Basically, for each note drawn, it measures the time the user took to press a key. It looks like this:

#DateTime;Count;NoteToFindNum;NoteToFindName;AnsweredNoteNum;AnsweredNoteName;AnswerTimeInMs;OK
dim. août 16 15:07:16 2015;1;65;Fa 4;65;Fa 4;2954;0;
dim. août 16 15:07:18 2015;2;48;Do 3;48;Do 3;1374;0;
dim. août 16 15:07:20 2015;3;79;Sol 5;79;Sol 5;1409;0;
dim. août 16 15:07:23 2015;4;65;Fa 4;65;Fa 4;2237;0;
dim. août 16 15:07:24 2015;5;65;Fa 4;65;Fa 4;463;0;
...

After importing that into a spreadsheet program, it's possible to draw some cool graphs showing the effect of the training (hopefully positive!).
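The CSV can also be crunched directly with a few lines of Python, for example to get the average answer time per note (the file name is hypothetical; the column layout matches the excerpt above):

import csv
from collections import defaultdict

times = defaultdict(list)
with open("piano_trainer_stats.csv", newline="") as f:   # hypothetical file name
    reader = csv.reader(f, delimiter=";")
    next(reader)                                         # skip the '#...' header line
    for row in reader:
        if len(row) < 7:
            continue
        note_name, answer_time_ms = row[3], int(row[6])
        times[note_name].append(answer_time_ms)

for note, values in sorted(times.items()):
    print(f"{note}: {sum(values) / len(values):.0f} ms on average")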

The code is available on GitHub. The application can use the PC keyboard as input. But if you have a digital piano, all you need is a MIDI adapter like this one (it cost me about 5 euros).


Note that another, very similar, Piano-Trainer program exists on GitHub, made by Philipp Otto. It's web-based and it's pretty cool! However it's probably aimed at more experienced music readers as it deals with chords rather than single notes like mine.

Friday, 11 April 2014

Oculus Rift on the Raspberry Pi

I already approached the subject a while ago, when I got the Oculus Rift SDK compiled for the Raspberry Pi and successfully accessed the head-orientation.

This time, I wanted to render simple 3D geometry for the Rift using the Pi. Is this adorable little machine powerful enough to support the Rift?

As explained in the official Oculus SDK document, rendering for the Oculus Rift implies several things:
  • Stereo rendering: rendering the 3D scene twice, once for each eye, side by side
  • Distortion correction: distorting the rendered image in such a way that viewing it through the Rift lenses makes it look correct
  • Chromatic aberration correction: this is a bonus step that aims at reducing color fringes introduced by the lenses

Accelerated 3D on the Raspberry Pi means using OpenGL ES, much like on any mobile platform these days. The Pi supports OpenGL ES 2.0, which is enough for the shaders that implement the corrections described above. In fact, the SDK comes with "regular" OpenGL shaders that work perfectly on OpenGL ES 2.0. Smashing!
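The heart of the distortion correction is a small radial warp of the texture coordinates. Transcribed to Python for readability (the real thing is a GLSL fragment shader, and the coefficients below are illustrative placeholders rather than values read from the headset):

def warp(u, v, lens_center=(0.5, 0.5), k=(1.0, 0.22, 0.24, 0.0), scale=1.0):
    # Work in coordinates centred on the lens centre
    dx, dy = u - lens_center[0], v - lens_center[1]
    r_sq = dx * dx + dy * dy
    # Radial scaling: points far from the lens centre are pushed further out,
    # which cancels the pincushion distortion introduced by the Rift lenses
    factor = k[0] + k[1] * r_sq + k[2] * r_sq ** 2 + k[3] * r_sq ** 3
    return (lens_center[0] + scale * dx * factor,
            lens_center[1] + scale * dy * factor)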

So after some learning and testing, I finally got "Rift-correct" rendering on the Pi. The scene is extremely simple as it's just a rotating cube floating in front of the user (the head tracking is there too). And here is what it looks like. Note that I removed the lenses of the Rift in order to film its screen.



Now for some benchmarks
All the tests were done on a Release build using the official vertex and fragment shaders. No attempt was made to optimize anything (I'm not sure there's much to be done honestly).

  • Rendering the scene twice (no correction shaders): 16 ms/frame
  • Rendering the scene in a texture and rendering the texture on a quad filling the screen (no fancy correction shaders): 27 ms/frame
  • Same thing with distortion correction shader: 36 ms/frame
  • Same thing with distortion and chroma correction instead: 45 ms/frame
Note: the render-target texture used in the tests was exactly the size of the screen (1280x800). Because of the pinching effect of the distortion shader, it should be larger than that, about 2175x1360 (see the "Distortion Scale" section of the Oculus doc). Unfortunately this was too much for the Pi. As a result, a good part of the FOV is lost (the visible pink border). I haven't tried to find out what the maximum texture size is, so I stuck to a scale of 1.

Conclusion
The good news is that using the Rift with the Pi can be done! However, don't expect amazing results: with distortion correction on (the minimum needed to see the scene correctly), the frame rate on the simplest possible scene is about 27 fps. At first glance, this doesn't seem that bad. But when head-tracking is on, it does feel choppy and uncomfortable. Indeed, the Oculus team says that 60 fps is a minimum (in fact I believe the next development kit will have a screen with an even higher refresh rate than that).

Getting the code
The code of this little experiment can be found at https://github.com/jbitoniau/RiftOnThePi
Building instructions are provided there.

Enjoy!



Edit - 11/26/2014
Since I did this test, things have moved a lot on the Oculus Rift front. It seems they're now providing rendering code that does much cleverer things, like vertex-shader-based distortion correction with a precomputed map (instead of using complex pixel shader code). Hopefully this should be very beneficial for the Pi.

Thursday, 27 February 2014

Pan-Tilt FPV using the Oculus Rift

In a previous experiment, I used the head-orientation of the Oculus Rift to drive two servos moving an FPV camera. That was a good start but not very useful as the FPV video feed wasn't displayed in the Rift.

After putting more work into this project, I finally got a functional FPV or tele-presence system that makes the most of the Rift's immersivity (if that's a word). The system relies on various bits and pieces that can't possibly be better explained than with a diagram!


The result
The result is indeed surprisingly immersive. I initially feared that the movement of the FPV camera would lag behind, but it's not the case: the servos react quickly enough. Also, the large field of view of the Rift is put to good use with the lens I used on the FPV camera.



Some technical notes
The wide-FOV lens I use on the FPV camera causes significant barrel distortion in the captured image. After calibrating the camera (using Agisoft Lens), I implemented a shader to correct this in real time.
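For illustration, the same kind of correction can be done on the CPU with OpenCV (the project does it in a GPU shader instead; the camera matrix and distortion coefficients below are placeholders for the values produced by the calibration):

import cv2
import numpy as np

camera_matrix = np.array([[700.0,   0.0, 320.0],
                          [  0.0, 700.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.30, 0.10, 0.0, 0.0])   # k1, k2, p1, p2 (placeholders)

frame = cv2.imread("captured_frame.png")          # one frame grabbed from the USB dongle
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
cv2.imwrite("undistorted_frame.png", undistorted)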

I use Ogre 3D and Kojack's Oculus code to produce the type of image expected by the Rift. In the Ogre 3D scene, I simply create a 3D quad mapped with the captured image and place it in front of the "virtual" head. Kojack's Rift code takes care of rendering the scene on two viewports (one for each eye). It also performs another distortion-correction step which, this time, compensates for the Rift lenses in front of each eye. Lastly, it provides me with the user's head orientation, which translates further down the chain into servo positions for moving the FPV camera.

As the camera is physically servo-controlled only on yaw and pitch, I apply the head-roll to the 3D quad displaying the captured image (in the opposite direction). This actually works really well (thanks Mathieu for the idea!). I'm not aware of any commercial RC FPV system that does that.
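In code, the idea boils down to splitting the head orientation in two, something like this (a tiny Python sketch; set_servo_positions and set_quad_roll are hypothetical helpers standing in for the Pololu and Ogre calls):

def apply_head_orientation(yaw, pitch, roll):
    # The physical pan-tilt only follows yaw and pitch...
    set_servo_positions(yaw, pitch)
    # ...so the quad displaying the video is counter-rotated to fake the roll
    set_quad_roll(-roll)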

And ideas for future developments...
One of the downsides of the system is the poor video quality. This comes from several things:
  • the source video feed is rather low resolution,
  • the wireless transmission adds some noise
  • the analog to digital conversion is performed with a cheap USB dongle
Going fully-digital could theoretically solve these problems:
  • for example, using the Raspberry Pi camera as a source: the resolution and image quality would be better. It is also much lighter than the Sony CCD. It doesn't have a large FOV though (but this can be worked around)
  • transmitting over WiFi would avoid using a separate wireless system. But what kind of low-latency codec to use then? Also range is an issue (though a directional antenna and tracking could help)
  • the image manipulated by the receiving computer would directly be digital, so no more composite video capture step.
Another problem with the current system is that the receiver end relies on a PC. It would be far more transportable if it could run on a small computer like the Raspberry Pi (which could probably be carried on the user's belt).

I should also get rid of the Pololu Maestro module on the transmitter end as I've already successfully used the Raspberry Pi GPIO for generating PWM signals in the past.

Lastly, it would be fantastic to capture with two cameras and use the Rift stereoscopic display.

So still some room for improvement! Any advice welcome.

The receiver-end (A/V receiver on the left, Wifi-Pi on the right)

Friday, 21 February 2014

Video latency investigation Part 2

In a previous article I investigated the latency of different video capture and display systems. Wireless video transmission was left aside at that time.

Since then, I got my hands on a brand new RC video transmission kit and also a very old CRT TV...

Portable CRT TV (composite input)
TS832 and RC832 5.8 GHz AV transmitter & receiver


I (re)did some of the tests on the LCD TV I used in the previous article, just to check the reproducibility of the previous measurements. Unfortunately its composite input died when I started the tests, so I switched to the SCART input using an adapter.

Here are the results:
Setup | Average latency in ms (4 measures)
Sony CCD + LCD TV (composite) (same test as prev. article) | 41
Sony CCD + LCD TV (SCART) | 31
Sony CCD + AV Transmission + LCD TV (SCART) | 31
Sony CCD + CRT TV | 15

Some more conclusions:

  • The latency induced by a CRT display is incredibly small!
  • The 5.8 GHz AV transmission kit doesn't add any measurable latency to the system
  • Weirdly enough (at least on this particular LCD TV), the latency decreases by about 10 ms when using the SCART input instead of the composite one.

Thursday, 20 February 2014

Raspberry Pi Flight Controller Part 2 : Single-axis "acro" PID control

After dealing with inputs/outputs it's time for some basic flight control code!

The best explanation I've found about quadcopter flight control is Gareth Owen's article called "How to build your own Quadcopter Autopilot / Flight Controller". Basically, you can control a quadcopter in two different ways:

  • In "Acrobatic" or "Rate" mode: the user's input (i.e. moving the sticks on the transmitter) tells the controller at what rate/speed the aircraft should rotate on each axis (yaw, pitch and roll). In this mode the user is constantly adjusting the quadcopter rotational speed which is a bit tricky but allows acrobatic maneuvers, hence it's name! The "acro" controller is the easiest you can implement and it only requires a gyroscope in terms of hardware. The basic KK board I was using previously is an Acro controller.
  • In "Stabilized" mode: this time the user's inputs indicate the angles on each axis that the aircraft should hold. This is far easier to pilot: for example: if you center the sticks, the aircraft levels. Technically, a stabilized flight controller internally uses an Acro flight controller. In terms of hardware, in addition to the gyroscope, this controller needs an accelerometer (to distinguish up from down), and optionally a magnetometer (to get an absolute reference frame).
So let's start at the very beginning: an Acro flight controller working on a single axis. Here's the kind of loop behind such a controller:
  • Get the latest user's inputs:
    • The desired rotation rate around the axis (in degrees per second)
    • The throttle value (in my case, unit-less "motor-power" between 0 and 1)
  • Get a reading from the gyroscope (a rotation rate in degrees per second)
  • Determine the difference between the measured rotation rate and the desired one: this is the error (still in degrees per second)
  • Use the PID magic on this error to calculate the amount of motor-power to apply to correct the error. This is the same unit as the throttle.
  • Add this correction to the throttle value and send the result to one motor. Subtract the correction from the throttle value and send the result to the second one (the [0..1] value is converted into a pulse width in microseconds, i.e. a PWM signal that the ESCs understand)
And repeat this loop as frequently as possible (for me: the flight control runs at 125 Hz, the gyro reading at 250 Hz, and the user input reading at 50 Hz).
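Here is a rough Python sketch of one iteration of that loop (read_user_input, read_gyro_rate and send_motor_pulses are hypothetical helpers wrapping the RC receiver, the gyroscope and the ESC outputs; the gains are placeholders to be tuned):

class RatePid:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.previous_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.previous_error) / dt
        self.previous_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def to_pulse_width_us(motor_power):
    # Convert a [0..1] motor-power into a pulse width the ESCs understand
    motor_power = min(max(motor_power, 0.0), 1.0)
    return 1000 + int(motor_power * 1000)   # 1000..2000 microseconds

pid = RatePid(kp=0.002, ki=0.0, kd=0.0)     # placeholder gains
DT = 1.0 / 125.0                            # flight control at 125 Hz

def control_step():
    desired_rate, throttle = read_user_input()   # desired rate in deg/s, throttle in [0..1]
    measured_rate = read_gyro_rate()             # deg/s
    error = desired_rate - measured_rate
    correction = pid.update(error, DT)           # in motor-power units
    send_motor_pulses(to_pulse_width_us(throttle + correction),
                      to_pulse_width_us(throttle - correction))

And this is what the result looks like: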


Graph colors:

  • green: gyroscope reading
  • blue: user's "desired" angular rate 
  • red: throttle value
Notice how the quadcopter rotates at constant speed when the desired angular rate stays at the same non-zero value.