Delving the depths of computing,
hoping not to get eaten by a wumpus

By Timm Murray

Arduino/Raspberry Pi -- TWI

2013-10-14


NOTE: This post was moved over from WumpusUAV.com. The Indiegogo campaign was not successful, so I’m copying some key posts from there to here and shutting the site down.

NOTE 2: I later found that these instructions are flawed. You need to convert between the Raspberry Pi’s 3.3V signaling and the Arduino’s 5V signaling, or get a 3.3V Arduino. I’ve ordered the SparkFun Bi-Directional Logic Level Converter and will report back once I’ve received it.

Two Wire Interface (TWI) is a simple means of communication between two systems. “I2C” is another name for basically the same thing. On the Arduino side, it is implemented by the Wire library. On Raspberry Pi under Perl, we will use HiPi::BCM2835::I2C.

Hardware Setup

Take a look at the Raspberry Pi version 2 header pinout. With the header on the top-left, the I2C pins are the second and third pins on the bottom row. The Arduino pins depend on the board; see the link above to the Wire library for details.

The SDA (Serial Data Line) and SCL (Serial Clock Line) pins must be connected between the two boards. We also need to connect the Raspberry Pi’s GND and +5V pins to the Arduino’s GND and Vin pins, respectively.

(If the Arduino is powered externally, then only the GND wire needs to be connected to the Raspberry Pi.)

When using Raspbian on the Raspberry Pi, you need to load the kernel modules i2c_dev and i2c_bcm2708. The second one is blacklisted by default in /etc/modprobe.d/raspi-blacklist.conf, so remove it from there. It’s also handy to add these modules to /etc/modules so that they get loaded on startup.

Add your user to the i2c group if you want to be able to connect without being root.

Raspberry Pi Programming

On Raspberry Pi revision 2, the I2C pins on the main header are actually the second I2C device. The first sits on a secondary ribbon header, which is currently used for the Raspberry Pi camera accessory. Thus, we use the device /dev/i2c-1.

Each slave (which will be the Arduino board) needs an address between 4 and 127. The currently connected devices can be scanned with i2cdetect -y 1.

Example Perl program:

#!perl
use v5.14;
use HiPi::BCM2835::I2C qw( :all );
use constant ADDR       => 0x28;
use constant DEVICE     => '/dev/i2c-1';
use constant BUSMODE    => 'i2c';
use constant SLAVE_ADDR => 0x04;
use constant REGISTER   => 0x00;

my $DATA = 0x05;

#say "Baudrate: " . HiPi::Device::I2C->get_baudrate;


my $dev = HiPi::BCM2835::I2C->new(
        peripheral => BB_I2C_PERI_1,
        address    => SLAVE_ADDR,
);

say "Sending [$DATA]";
$dev->i2c_write( REGISTER, $DATA );
my @recv = $dev->bus_read( REGISTER, 1 );
say 'Got [' . join( ", ", @recv ) . ']';

Arduino

We initialize the Wire library with the slave address, then register callback functions with onReceive and onRequest.

The loop() function will simply delay.

A complete example:

#include <Wire.h>

#define SLAVE_ADDR 0x04


uint8_t last_read_byte = 42;

void setup()
{
    Wire.begin( SLAVE_ADDR );
    Wire.onReceive( read_event );
    Wire.onRequest( write_event );
}

void loop()
{
    delay( 1000 );
}

void read_event( int len )
{
    // Drain everything the master sent; keep the last byte
    while( Wire.available() ) {
        last_read_byte = Wire.read();
    }
}

void write_event()
{
    Wire.write( last_read_byte );
}

References

http://neophob.com/2013/04/i2c-communication-between-a-rpi-and-a-arduino/


Underappreciated Perl Code: TAP's YAMLish Syntax

2013-10-07


If you write tests using Test::More, you may have seen the cmp_ok() sub output something like:

$ perl -MTest::More -E 'plan tests => 1; cmp_ok( 1, "==", 2, "fail" )'
1..1
not ok 1 - fail
#   Failed test 'fail'
#   at -e line 1.
#          got: 1
#     expected: 2
# Looks like you failed 1 test of 1.

This is a textual output of the file and line number of the tests, as well as what failed. If you wanted to write a TAP parser for a report, you could parse the comments and get that file location. But then you’d be parsing comments, and those aren’t supposed to be parsed by computers (SKIP and TODO being exceptions in TAP for hysterical raisins). Plus, Test::More makes no guarantees about the format of those comments, nor should it.

Fortunately, the TAP protocol has an official extension for parsable information about a test, called “YAMLish”. As the name implies, it’s a small subset of YAML, specifically the one supported by YAML::Tiny. This makes it easy to implement in other languages.

Test::More doesn’t seem to support outputting YAMLish, but we can get the functionality with TAP::Parser::YAMLish::Writer. We can write up a subroutine for our tests to handle YAMLish:

#!/usr/bin/perl
use v5.14;
use warnings;
use Test::More tests => 1;
use TAP::Parser::YAMLish::Writer;
use DateTime;


# Need to have write_yaml() on the same line so its line number output is correct
cmp_ok( 1, '==', 2, "Fail" ); write_yaml( 1, 2, { foo => 'bar' } );


my $yw = undef;
sub write_yaml
{
    my ($expected, $actual, $extensions) = @_;
    $yw = TAP::Parser::YAMLish::Writer->new
        unless defined $yw;
    my ($pack, $filename, $line) = caller;

    my $dt = DateTime->now;
    my $date = $dt->iso8601();

    my %fields = (
        datetime   => $date,
        file       => $filename,
        line       => $line,
        expected   => $expected,
        actual     => $actual,
        extensions => $extensions,
    );

    $yw->write( \%fields, \*STDOUT );
    return 1;
}

The keys datetime, file, line, expected, actual, and extensions are defined directly on the wiki page for YAMLish. The extensions key is a hashref that can hold custom information. The datetime key is in either ISO8601 or HTTP date format.

Output:

$ perl yamlish_example.pl 
1..1
not ok 1 - Fail
#   Failed test 'Fail'
#   at yamlish_example.pl line 9.
#          got: 1
#     expected: 2
---
actual: 2
datetime: 2013-10-06T16:55:14
expected: 1
extensions:
  foo: bar
file: yamlish_example.pl
line: 9
...
# Looks like you failed 1 test of 1.
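As an aside, the YAMLish subset is small enough that a non-Perl consumer can work with it easily. Here is a rough emitter in Python (my own sketch, not part of any TAP library; it handles only scalar values and nested hashes, with the sorted-key, two-space-indent framing seen in the output above):

```python
def write_yamlish(fields, indent=0):
    """Emit a dict in the YAMLish subset: '---' ... '...' framing,
    sorted keys, two-space indentation for nested hashes."""
    lines = [] if indent else ["---"]
    for key in sorted(fields):
        value = fields[key]
        pad = "  " * indent
        if isinstance(value, dict):
            lines.append(f"{pad}{key}:")
            lines.extend(write_yamlish(value, indent + 1))
        else:
            lines.append(f"{pad}{key}: {value}")
    if indent == 0:
        lines.append("...")
    return lines

print("\n".join(write_yamlish({
    "file": "yamlish_example.pl",
    "line": 9,
    "extensions": {"foo": "bar"},
})))
```

Going the other direction (parsing) is just as small a job, which is the whole point of restricting the format to the YAML::Tiny subset.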

But this is awkward and ought to be wrapped up by a CPAN module. The requirement for write_yaml() to be on the same line is particularly bad. Duplicating your actual/expected values in the call to write_yaml() is no good, either.

It’d be nice if Test::More did this for us, or barring that, a drop-in replacement. After a quick search, I can’t seem to find anything like that. Any takers? :)


My Application to Star Fleet Corp of Engineers

2013-10-06


Today, I fixed something by reversing its polarity.


Killing Procrastination

2013-09-30


Interesting blog post about how to kill procrastination. tl;dr: Instead of “buckets” (like TODO lists) that you just throw things into, have something that lights a fire, such as a regular prompt to keep you moving.


How UAV::Pilot got Real Time Video, or: So, Would You Like to Write a Perl Media Player?

2013-09-26


Real-time graphics isn’t something people normally do in Perl, and certainly not video decoding. Video decoding is too computation-intensive to be done in pure Perl, but that doesn’t stop us from interfacing to existing libraries, like ffmpeg.

The Parrot AR.Drone v2.0 has an h.264 video stream, which you get by connecting to TCP port 5555. Older versions of the AR.Drone had their own encoding mechanism, which their SDK docs refer to as “P.264”, and which is a slight variation on h.264. I don’t intend to implement the older version. It’s for silly people.

Basics of h.264

Most compressed video works by taking an initial “key frame” (or I-frame), which is the complete data of the image. This is followed by several “predicted frames” (or P-frames), which hold only the differences compared to the previous frame. If you think about a movie with a simple dialog scene between two characters, you might see a character on camera not moving very much except for their mouth. This can be compressed very efficiently with a single big I-frame and lots of little P-frames. Then the camera switches to the other character, at which point a good encoder will choose to put in a new I-frame. You could technically keep going with P-frames, but there are probably too many changes to keep track of to be worth it.
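The keyframe/delta idea can be sketched in a few lines. This toy Python codec (my own illustration; real h.264 adds motion compensation, entropy coding, and much more) stores the first frame whole and each later frame as just its changed pixels:

```python
def encode(frames):
    # First frame is an "I-frame": stored in full.
    encoded = [("I", list(frames[0]))]
    prev = list(frames[0])
    for frame in frames[1:]:
        # "P-frame": only the (index, new_value) pairs that changed.
        diff = [(i, new) for i, (old, new) in enumerate(zip(prev, frame))
                if old != new]
        encoded.append(("P", diff))
        prev = list(frame)
    return encoded

def decode(encoded):
    frames, current = [], []
    for kind, data in encoded:
        if kind == "I":
            current = list(data)
        else:
            current = list(current)
            for i, value in data:
                current[i] = value
        frames.append(current)
    return frames

video = [[1, 1, 1, 1], [1, 9, 1, 1], [1, 9, 2, 1]]
assert decode(encode(video)) == video
```

A long run of P-frames is cheap, but one corrupted diff breaks every frame until the next I-frame arrives, which is exactly why encoders insert fresh I-frames periodically.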

Since correctly decoding a P-frame depends on getting all the frames back to the last I-frame right, it’s a good idea for encoders to throw in a new I-frame on a regular basis for error correction. If you’ve ever seen a video stream get mangled for a while and then suddenly correct itself, it’s probably because it hit a new I-frame.

(One exception to all this is Motion JPEG, which, as the name implies, is just a series of JPEG images. These tend to have a higher bitrate than h.264, but are also cheaper to decode and avoid having errors affect subsequent frames.)

If you’ve done any kind of graphics programming, or even just HTML/CSS colors, then you know about the RGB color space. Each of the Red, Green, and Blue channels gets 8 bits. Throw in an Alpha (transparency) channel, and things fit nicely into a 32-bit word.

Videos are different. They use the “YCbCr” color space, a term that is sometimes used interchangeably with “YUV”. The “Y” is luma, while “Cb” and “Cr” are the blue-difference and red-difference chroma channels, respectively. There are a bunch of encoding variations, but the most important one for our purposes is YUV 4:2:0.

The reason is that YUV can do a clever trick where it sends the Y channel for every pixel, but only sends the U and V channels once per 2×2 block of pixels. So where RGB has 24 bits per pixel (or 32 for RGBA), YUV 4:2:0 averages only 12.

The h.264 profiles we care about store frames internally in planar YUV 4:2:0, which corresponds to SDL::Overlay‘s flag of SDL_YV12_OVERLAY.

Getting Data From the AR.Drone

As I said before, the AR.Drone sends the video stream over TCP port 5555. Before getting the h.264 frame, a “PaVE” header is sent. The most important information in that header is the packet size. Some resolution data is nice, too. This is all processed in UAV::Pilot::Driver::ARDrone::Video.

The Video object can take a list of objects that do the role UAV::Pilot::Video::H264Handler. This role requires a single method to be implemented, process_h264_frame(), which is passed the frame and some width/height data.

The first object to do that role was UAV::Pilot::Video::FileDump, which (duh) dumps the frames to a file. The result could be played on VLC, or encoded into an AVI with mencoder. This is as far as things got for UAV::Pilot version 0.4.

(In theory, you should have been able to play the stream in real time on Unixy operating systems by piping the output to a video player that can take a stream on STDIN, but it never seemed to work right for me.)

Real Time Display

The major part of version 0.5 was to get the real time display working. This meant brushing up my rusty C skills and interfacing to ffmpeg and SDL. Now, SDL does have Perl bindings, but they aren’t totally suitable for video display (more on that later). There are also two major bindings to ffmpeg on CPAN: Video::FFmpeg and FFmpeg. Neither was suitable for this project, because they both rely on having a local file that you’re processing, rather than having frames in memory.

Fortunately, the ffmpeg library has an excellent decoding example. Most of the xs code for UAV::Pilot::Video::H264Decoder was copy/pasted from there.

Most of that code involves initializing ffmpeg’s various C structs. Some of the most important lines are codec = avcodec_find_decoder( CODEC_ID_H264 );, which gets us an h.264 decoder, and c->pix_fmt = PIX_FMT_YUV420P;, which tells ffmpeg that we want to get data back in planar YUV 4:2:0 format. Since h.264 stores in this format internally, this will keep things fast.

In process_h264_frame(), we call avcodec_decode_video2() to decode the h.264 frame and get us the raw YUV array. At this point, the YUV data is in C arrays, which are nothing more than a block of memory.

High-level languages like Perl don’t work on blocks of memory, at least not in ways that the programmer is usually supposed to care about. They hold variables in a more sophisticated structure, which in Perl’s case is called an ‘SV’ for scalars (or ‘AV’ for array, or ‘HV’ for hashes). For details, see Rob Hoelz’s series on Perl internals, or read perlguts for all the gory details.

If we wanted to process that frame data in Perl, we would have to iterate through the three arrays (one for each YUV channel). As we go, we would put the content in an SV, then push that SV onto an AV. Those AVs can then be passed back from C into Perl code. The function get_last_frame_pixels_arrayref() handles this conversion, if you really want to do that. Protip: you really don’t want to do that.

Why? Remember that YUV 4:2:0 sends Y for every pixel, and U and V once per 2×2 block, for an average of 1.5 bytes per pixel, and therefore 1.5 SVs per pixel (again, on average). If we assume a resolution of 1280×720 (720p), then there are 921,600 pixels, or about 1.4 million SVs to create and push. You would need to do this 25-30 times per second to keep up with a real time video stream, on top of the video decoding and whatever else the CPU needs to be doing while controlling a flying robot.
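A quick sanity check on those numbers (Python used purely as a calculator here; both chroma layouts are shown, since the decoder code above requests PIX_FMT_YUV420P, i.e. 4:2:0):

```python
pixels = 1280 * 720                   # 921,600 pixels at 720p
fps = 30                              # typical real-time frame rate

# Y arrives for every pixel; U and V are subsampled.
svs_422 = pixels + 2 * (pixels // 2)  # 4:2:2 -> 2 samples per pixel
svs_420 = pixels + 2 * (pixels // 4)  # 4:2:0 -> 1.5 samples per pixel

print(svs_422)        # 1843200 SVs per frame
print(svs_420)        # 1382400 SVs per frame
print(svs_420 * fps)  # 41472000 SVs per second, and that's the cheaper layout
```

Tens of millions of SV allocations per second, before any actual work, is what rules this approach out.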

This would obviously be too taxing on the CPU and memory bandwidth. My humble laptop (which has an AMD Athlon II P320 dual-core CPU) runs up to about 75% CPU usage in UAV::Pilot while decoding a 360p video stream. That laptop is starting to show its age, but it’s clear that the above scheme would not work even on newer and beefier machines.

Fortunately, there’s a little trick that’s hinted at in perlguts. The SV struct is broken down into more specific types, like SViv. The trick is that the IV type is guaranteed to be big enough to store a pointer, which means we can store a pointer to the frame data in an SV and then pass it around in Perl code. This means that instead of well over a million SVs per frame, we make just one, holding a pointer to the frame struct.

This trick is pretty common in xs modules. If you’ve ever run Data::Dumper on a XML::LibXML node, you may have noticed that it just shows a number. That number is actually a memory address that points to the libxml2 struct for that particular DOM node. The SDL bindings also do this.

The tradeoff is that the data can never be actually processed by Perl, just passed around between one piece of C code to another. The method get_last_frame_c_obj() will give you those pointers for passing around to whatever C code you want.

This is why SDL::Overlay isn’t exactly what we need. To pass the data into the Perl versions of the overlay pixels() and pitches() methods, we would have to do that whole conversion process. Then, since the SDL bindings are a thin wrapper around C code, it would undo the conversion all over again.

Instead, UAV::Pilot::SDL::Video uses the Perl bindings to initialize everything in Perl code. Since SDL is doing that same little C pointer trick, we can grab the SDL struct for the overlay the same way. When it comes time to draw the frame to the screen, the module’s xs code gets the SDL_Overlay C struct and feeds in the frame data we already have. Actual copying of the data is done by the ffmpeg function sws_scale(), because that’s the solution I found, and I freely admit to cargo-culting it.

At this point, it all worked, I jumped for joy, and put the final touches on UAV::Pilot version 0.5.

Where to go From Here

I would like to be able to draw right on the video display, such as to display nav data like the one in this video:

http://www.youtube.com/watch?v=ipFo8YPCs-E

Preliminary work is done in UAV::Pilot::SDL::VideoOverlay (a role for objects that draw things on top of the video) and UAV::Pilot::SDL::VideoOverlay::Reticle (which implements that role and draws a reticle).

The problem I hit is that you can’t just draw on the YUV overlay using standard SDL drawing commands for lines or such. They come up black and tend to flicker. Part of the reason appears to go back to YUV only storing the UV channels on every other pixel, which screws up 1-pixel wide lines 50% of the time. The other reason is that hardware accelerated YUV overlays are rather complicated. Notice that linked discussion thread goes back to 2006, and things don’t appear to have gotten better until maybe just recently with the release of SDL2.

The video frame could be converted to RGB in software, but that would probably be too expensive in real time. The options appear to be to either work it out with SDL2, or rewrite things in OpenGL ES. OpenGL would add a lot more boilerplate code, but could have side benefits for speed on top of just plain working correctly.

Once you can draw on the screen, you could do some other cool things like doing object detection and displaying boxes around those objects. Image::ObjectDetect is a Perl wrapper around the opencv object detection library, though you’ll run into the same problem of copying SVs shown above. Best to use the opencv library directly.


What if Perl OO was a Core Feature?

2013-09-20


Over on Reddit /r/perl, there’s a rather blatant troll complaining about the lack of OO as a core feature in Perl. The tone there is clearly not constructive and not worth responding to further, but I feel compelled to answer a question: what would be improved if OO were a core feature, rather than built out of existing components of the language?

Personally, I think the fact that OO can be built this way is a demonstration of flexibility. It also allows you to build your own OO system that suits your needs. If the standard kind of blessed data structure doesn’t work for you, try Inside-Out Objects. This also had hidden benefits later on; when roles/traits/mixins became all the rage, Perl just added them via CPAN modules. There was no long, drawn out, design-by-committee bikeshedding discussion like there was for Java.

If you really wanted to, you could even build an object system without bless() or packages. The constructor would return a hash, which is filled with sub references:

sub new_object
{
    my %object = (
        foo => sub { 'foo' },
        bar => sub { 'bar' },
        baz => sub { 'baz' },
    );
    return %object;
}

my %obj = new_object;
say $obj{foo}->(); # Prints 'foo'

Inheritance is a matter of a different constructor calling the original constructor, then filling in different subrefs:

sub new_inherit_object
{
    my %object = new_object();
    $object{foo} = sub { 'override foo' };
    $object{qux} = sub { 'qux' };
    return %object;
}

my %obj = new_inherit_object;
say $obj{foo}->(); # Prints 'override foo'

(No, I don’t really suggest doing this. It’s just an interesting exercise in building an object system using limited language components. Although this isn’t too far from how JavaScript builds objects.)

Getting back to the usual bless() system, there is a drawback: the learning curve. Think of all the things you have to grasp before you can understand Perl OO. There are subroutines, passing arguments to subroutines, and putting those subroutines into packages. Any bless‘d object is going to be a complex data structure, and before you can figure those out, you need to grasp references. Even for somebody experienced in another programming language, that’s a lot to ask before you can get them up to speed on object-oriented programming. It’s even harder for somebody with no programming background at all.

Moose somewhat alleviates this. It can hide a lot of the details of the complex data structure, though obviously you’re going to need to dig into that at some point. The constructors in Moose can also do sensible things for you. It’s a lot better than Java, where the constructors tend to have a lot of this.foo = foo; statements to set private member variables from the arguments.

What Moose can’t fix is subroutines. Sub signatures are a sore spot for Perl. I think it really is an embarrassment that we still have this problem in 2013. And we need signatures down pat before we can even start thinking about multi-method dispatch.


Perl SDL GUI Layout Engine

2013-09-19


As it turns out, SDLx::App only supports having one window at a time. It’s effectively a singleton. This was a problem for UAV::Pilot, because I wanted to draw the navigation output and video in separate windows.

The fix I have now is to implement a simple layout engine, where you can specify whether you want the widget placed at the top or bottom of the window. I didn’t want to add complications like left or right. That’s good enough for now; the video always goes on top, the nav on bottom. If you were watching the AR.Drone’s 720p video stream on a 720p monitor, it might be an issue, but I’m not going to worry about it for now.
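The stacking rule is simple enough to show in a few lines. A hypothetical Python sketch of the same top/bottom placement (the names and behavior here are my assumption for illustration, not UAV::Pilot’s actual code):

```python
def layout(window_height, widgets):
    """Place (name, height, anchor) widgets: "top" widgets stack downward
    from y=0, "bottom" widgets stack upward from the window's bottom edge."""
    placed, top_y, bottom_y = {}, 0, window_height
    for name, height, anchor in widgets:
        if anchor == "top":
            placed[name] = top_y
            top_y += height
        else:
            bottom_y -= height
            placed[name] = bottom_y
    return placed

print(layout(480, [("video", 360, "top"), ("nav", 100, "bottom")]))
# {'video': 0, 'nav': 380}
```

A full layout engine adds nesting, left/right packing, and resize negotiation on top of this, which is exactly the complication I wanted to avoid.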

Where it could be an issue is when UAV::Pilot starts implementing other types of UAVs. I’m contemplating a rover, which makes the current nav output of roll/pitch/yaw rather pointless. But it’ll still need the battery output, and maybe add throttle and steering.

That all leads into breaking the current nav output into individual widgets with a complicated layout engine, like the sort you might see in Gtk+ or KDE.

As far as I can tell, Perl has nothing like this for SDL. There’s SDLx::GUI and SDLx::Widget. Both of these are limited, and look like the respective authors have left them by the wayside. Which I totally understand. I left Gopher::Server like that.

I’m tempted to write my own, but that seems like a full project in itself. Integrating with Gtk2 might work, but I’m not sure how well that will go with SDL. Even making Perl bindings for a C layout library seems like a full project.

For now, I can ignore the issue. Implementing just a rover won’t introduce too much redundancy.

Any takers on this one? :)


Underappreciated Perl Code -- Test::More::subtest()

2013-09-16


Consider a long test that you can break into logical subsections. You could make a flat output with the standard series of TAP ok messages, but Test::More gives a more sophisticated alternative:

use Test::More tests => 3;
 
pass("First test");
 
subtest 'An example subtest' => sub {
    plan tests => 2;
 
    pass("This is a subtest");
    pass("So is this");
};
 
pass("Third test");

This works into the TAP:

1..3
ok 1 - First test
    1..2
    ok 1 - This is a subtest
    ok 2 - So is this
ok 2 - An example subtest
ok 3 - Third test

Notice that the subtest has its own test count (“1..2”).

Now, what if we decide that we could speed up this test by fork()ing off and running in parallel, or by using some kind of event interface where we can’t predict what order the callbacks will run in? Either way, the TAP output of each process or event would get mixed up with the others.

TAP is actually a defined protocol with lots of features that should see more use. One of the proposals for a new version of the protocol is Test Groups, which looks like this:

1..3
ok 1
1..2 2 a block
1..3 2.1 another block
ok 2.1.1
ok 2.1.2
ok 2.1.3
ok 2.1 # end of another block
ok 2.2
ok 2 # end of a block
1..3 3 a third block
ok 3.1
ok 3.2
not ok 3 # end of a third block, planned for 3 but only ran 2 tests

Which isn’t the prettiest solution, I’ll admit, but it does solve the interleaving problem. The parser can take the test numbers in any order and put them back together.
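To see why the dotted numbers fix interleaving, here is a toy reassembler (a Python sketch of my own, not an official parser for the proposal) that takes results in arrival order and sorts them back into tree order:

```python
import re

def reassemble(lines):
    """Collect interleaved TAP-group results keyed by dotted test number,
    then emit them in canonical order regardless of arrival order."""
    results = {}
    for line in lines:
        m = re.match(r"(not ok|ok) ([0-9.]+)", line)
        if m:
            number = tuple(int(part) for part in m.group(2).split("."))
            results[number] = (m.group(1) == "ok")
    return [(".".join(map(str, num)), passed)
            for num, passed in sorted(results.items())]

interleaved = ["ok 2.1.2", "ok 1", "not ok 2.1.1", "ok 2.1.3"]
print(reassemble(interleaved))
# [('1', True), ('2.1.1', False), ('2.1.2', True), ('2.1.3', True)]
```

Because every result carries its full path in the tree, no amount of process or event interleaving loses the structure.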

The other proposal along these lines is Test Blocks, which are similar to what Test::More::subtest() does:

TAP version 14
1..4
begin 1 Object creation
  1..2
  ok 1 Object created OK
  ok 2 Object isa Flunge::Twizzler
end 1 Object creation
ok 2 Clone OK
begin 3 Methods
  1..4
  ok 1 has twizzle method
  ok 2 has burnish method
  ok 3 has spangle method
  not ok 4 has frob method
end 3 Methods
ok 4 Resources released

But this doesn’t solve the interleaving problem.

I had mentioned these in a Lightning Talk at YAPC::NA 2012, and immediately got myself warnocked. So this is me standing up and asking to get this moving again :)


UAV::Pilot Presentation

2013-09-11


Last night, I gave a presentation for the Madison Perl Mongers group on UAV::Pilot. The video is now up:

https://www.youtube.com/watch?v=rQgheFfc_As

This is also the announcement for the WumpusUAV Indiegogo project, which aims to create a new, cheap, hackable UAV platform. More information is on WumpusUAV.com.


UAV/FOSS -- ArduPilot

2013-09-10


ArduPilot is a FOSS autopilot based around Arduino. It has different firmware builds to support helicopters, multicopters, planes, cars, and boats.

Since I’ve been mostly focusing on multicopters, I’ll stick with that. A basic, fully assembled quadcopter kit will run you about $600. This does not include a telemetry module for controlling from a computer, or an RC radio for controlling manually. The US-band telemetry module will run another $85. A cheap 2.4GHz RC radio can go for $50-75, though if you’re serious, you’ll probably want to run at least $150-250. Then there’s the battery, which goes for about $70.

That also doesn’t come with on-board video, which is another $190, and uses a secondary radio on 5.8GHz. The standard OSD module sold on 3D Robotics does not have HD resolution.

If you’re looking for something to play around with, the AR.Drone will cost about a third the price.

I don’t mean to be all negative about the ArduPilot. Clearly, ArduPilot does something much more serious than the AR.Drone. This is a platform you can hack. Change out motors, platforms, hexcopters, octocopters, camera gimbals, everything. It’s also designed with a GPS module that can be used to instruct the UAV to fly to a spot and fly back.

ArduPilot has a documented control protocol. At least, I think that doc is still relevant. The wiki page there says they’ve moved, but I couldn’t find anything more up to date on the protocol description. In any case, I’d love to implement this in UAV::Pilot someday.

The impression I’ve been getting is that if you just want to mess around (nothing wrong with that), buy the AR.Drone. If you want to get serious, buy the ArduPilot. Somehow, though, I think there should be a platform that starts as cheap as the AR.Drone but lets you work your way up.



Copyright © 2024 Timm Murray
CC BY-NC

Opinions expressed are solely my own and do not express the views or opinions of my employer.