Sub-surface broadcast


I thought I’d come up to periscope depth to drop a few words about what I’ve been up to. WordPress is a tricky vehicle to publish with, and it usually devolves into extended configuration and Linux spelunking, but I think I’m through the worst of it. If I can just ignore that blinking terminal window…

Anyway, I’ve been pretty much off Twitter the past couple of weeks, only dropping in to retweet a funny animal picture or cryptic sigil. It’s a funny thing, Twitter. Everybody’s so wrapped up in their little moments while there are big things happening in the world at large all around them. The juxtaposition of tech foibles, social injustices, Untappd posts and the terribleness in Paris was too much for my feeble mind to cope with, so I shut it all down. This morning’s the first time I’ve had Twitter open on my desktop machine in a week or two. Didn’t really miss it, though I’ll keep it around for DMs and the occasional drive-by.


Camera System: Check.

Last weekend I was at the Waterloo Wellington Flight Centre attending ground school for UAV pilots. In the eyes of Transport Canada, I am a certified airman. Industry Canada has licensed me to operate an aviation radio. I ordered my Big Book of Canada Aerodromes and received it in the mail on Friday with giddy enthusiasm. Yes, it’s an actual book with paper in it. No, I don’t know why it’s not on the internet. Because reasons, I’m sure. Anyway, it was a lot of fun, and the pilots at the WWFC were all wonderful people, full of stories from the world of aviation backed up with deep knowledge. If you would like to go legit, I recommend their program whole-heartedly. Or you could just keep flying your drone illegally, which is what basically every hobbyist is doing in the lower Canadian latitudes in controlled and restricted airspaces. Which is everywhere.

There will probably be a comeuppance. You’ve been warned.

And now a short teaser about a project I’ve been working on for lo this past year. That’s it, really. That’s the whole teaser. It’s really close to being done in a form that I’m almost happy with and I hope to have an exciting announcement later this week. Maybe Wednesday if the stars all line up.

no it’s not an app. or a startup. geez.

UAV Preflight Checklist

This morning I came across this list for new pilots in my feeds and thought it’d be useful to share my own pre-flight checklist. Before any flight, I always run through a quick checklist to make sure I’m not breaking any rules and that my quadcopter’s in good working shape. It’s short, takes a few minutes and potentially saves me from a costly mistake.

Portable fly #followmode #drones

  1. Check the weather. I don’t just rely on The Weather Network / Channel; I find the closest airport and pull a METAR (Meteorological Terminal Aviation Routine Weather Report) feed using one of several web services (see the sketch after this list). These are auto-generated every hour or half-hour and provide localized reports. What to watch for: pay particular attention to the wind speed information. Anything higher than 15 knots and we’re grounded. No flying! METAR reports use knots, or nautical miles per hour (lots of funny units in aviation), but 15 kts is about 28 km/h. Other issues: visibility, fog, cloud or rain can ground you. One pro-tip that can get you flying happens to apply to both photography and hot-air ballooning: the best conditions are always early in the morning around sunrise and later in the evening around sunset. Better light too. I’ll record the METAR report in my log book.
  2. Location > 9 km from an airport? This is key. In Canada and the US, there’s a 5 mile / 9 km exclusion zone around any airport. I take this one pretty seriously as I don’t want to be the jackass who gets civilian quadcopters grounded for everyone because of a near miss with an airplane. Check FlightAware for your location to get a local aeronautical map; a rough distance check is also included in the sketch after this list. Other considerations for location: are you on private property? If so, do you have permission? Trees? Powerlines? I do a good walk-around of a site before I take off to get a feel for any obstacles I need to be aware of. I learned this the hard way once when I crashed hard during some free flying after failing to note a row of small trees. There is no grosser feeling than watching your quad collide with a tree at over 50 km/h and then crash into concrete.
  3. Batteries. Before a flight, I’ll charge the battery packs for my quad, my transmitter’s battery pack and my phone which I use for telemetry and auto-pilot programming. Don’t forget the camera! I fly with a GoPro Hero3+ Black and it chews through batteries like nobody’s business. If I’m flying with more than 2 packs on my ‘copter, I need at least 2 charged batteries for the GoPro. Don’t forget memory cards either.
  4. Tools. I keep a go bag with my controller and a bunch of tools to do field repairs and tune-ups: hex keys for my gimbal and the drone’s body screws, wrenches for the props, batteries for my camera and quad, backups for my transmitter and memory cards. I usually bring a multi-tool as well. You’ll need these for…
  5. Props, legs and arms. Give everything a little wiggle. Most of the screws on my copter for the arms and motors are held in place with Loctite, and the props are self-tightening. In most cases this is a formality, but always check your machine before a flight.
  6. People. Are there any people around? Dogs or other animals? Be aware of them. Flying over crowds is a strict no-no, and in Canada you have to keep a 100 ft. minimum distance from any group of people. Dogs and other animals can become excited or scared by loud electric motors. I recently flew at a wedding where there were horses and they just didn’t like the thing. I had to keep well back to avoid spooking them and, in the end, didn’t really get the shots we wanted. Still better than having a freaked-out workhorse running rampant through a crowd of people.
  7. Power. GoPro on, correct mode? Start recording. Battery in, check the flight controller’s tones. Wait for gimbal calibration and GPS connection.
  8. Transmitter and Telemetry. Check controller sticks and switch positions. Turn it on. Check gimbal tilt response to verify connection. Turn on telemetry radio and start ground station app (I use Tower on my Android phone).
  9. If I have a pre-programmed auto flight plan, I’ll load it on my phone and send it to the drone. At a minimum, I’ll verify that the home location is accurate when I arm the drone.
  10. Lift off and hover check. Before any mission, I always do a brief take-off and loiter test to make sure the craft is holding position and everything’s flying correctly.
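
For the weather and airport-distance checks above, here’s a rough sketch of the kind of helper script I have in mind. It’s illustrative only: the NOAA METAR text endpoint, the CYTZ station ID and the launch-site coordinates are assumptions for the example, not part of my actual workflow, and the airport coordinates are approximate.

```ts
// Hypothetical pre-flight helper (not my actual tooling): pull the latest METAR,
// check the wind against a personal limit, and sanity-check distance to an airport.
// The NOAA text URL, the CYTZ station ID and all coordinates are assumptions.

const METAR_URL =
  "https://tgftp.nws.noaa.gov/data/observations/metar/stations/CYTZ.TXT";

const MAX_WIND_KT = 15;   // personal wind limit from the checklist
const EXCLUSION_KM = 9;   // 9 km / ~5 mile airport exclusion zone

// Knots to km/h: 1 kt = 1.852 km/h, so 15 kt is roughly 28 km/h.
const ktToKmh = (kt: number): number => kt * 1.852;

// Great-circle (haversine) distance between two lat/long points, in km.
function distanceKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius, km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Fetch the raw METAR text and pull out the wind group (e.g. "24015G22KT").
async function windCheck(): Promise<void> {
  const res = await fetch(METAR_URL);
  const metar = (await res.text()).trim();
  const wind = metar.match(/(\d{3}|VRB)(\d{2,3})(G(\d{2,3}))?KT/);
  if (!wind) {
    console.log("No wind group found in:", metar);
    return;
  }
  const speedKt = parseInt(wind[2], 10);
  const gustKt = wind[4] ? parseInt(wind[4], 10) : speedKt;
  console.log(`Wind ${speedKt} kt (${ktToKmh(speedKt).toFixed(0)} km/h), gusts ${gustKt} kt`);
  console.log(gustKt > MAX_WIND_KT ? "Grounded. No flying!" : "Wind is within limits.");
}

// Rough distance check against Billy Bishop (CYTZ); both points are approximate.
const siteLat = 43.662, siteLon = -79.331;          // hypothetical launch site
const airportLat = 43.6275, airportLon = -79.3962;  // CYTZ, approx.
const km = distanceKm(siteLat, siteLon, airportLat, airportLon);
console.log(`${km.toFixed(1)} km from CYTZ; the exclusion zone is ${EXCLUSION_KM} km.`);

windCheck();
```

None of this replaces reading the full METAR or the aeronautical chart; it just catches the obvious no-go conditions before I load the car.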

And we’re good to go! In flight, I keep a strict 150 m altitude limit and a close eye on my battery voltage. My ground control software, Tower, has a nice feature where it gives audio read-outs of altitude and voltage every 10 seconds or so, so I don’t have to take my eyes off the sky. After a flight, I record everything in my log book.

This sounds complicated, but after you do it enough times, it becomes routine. Stick to the list and you can’t go wrong.

Happy flying!

The Incredible Resolution of the Olympus OM-D E-M1

And by “resolution”, I mean, I am making the resolution to take more pictures with this camera in 2015. See what I just did?

I like cameras.

Olympus OM-D E-M1

OM-D E-M1 Action shot with Sony RX1

Cameras have always had a special place in my heart. Designed to be held in your hand, a good camera feels like it belongs there. As imaging instruments, they capture a scene by gathering photons through a focusing mechanism – usually a lens – and stacking them on a sensor. At least that’s what happens in a digital camera.

Think about that for a second. Photons.

When you take a picture, you are literally capturing a moment in time. My inner physics nerd freaks out a bit when I think about this too deeply. Photographs may well be our best proof of time’s existence. Sidebar: if you want to read about the elusiveness of proving time, Dan Falk’s In Search of Time: Journeys Along a Curious Dimension has nothing to do with photography but is pretty interesting.


Man hovering outside of GDC. Shot with the OM-D E-M1 and a Voigtlander 17.5mm F0.95. Manual focus.

Show a picture to one or more people who were in that place at that time and they will tell you, “Oh yeah, I remember that.” The image can take you back there.

Humans have been taking pictures (or, if you’re more serious about photography and want to sound like a prat, “making photographs”) for almost 200 years now. In that time, we’ve seen a couple of technologies come and go, though the death of film is somewhat up for debate. For the sake of argument, I’m going to claim it’s over.

As 2015 closes in, we’re seeing mirrorless cameras finally usurp the dominance of the once-ubiquitous DSLR. The tech is moving quickly too. Late last year, Sony released the first interchangeable-lens full-frame mirrorless cameras. They’d previously managed a proof of concept with the RX1, a camera that can produce stunning images if you can put up with its incredibly poor performance and quirky controls and happen to like shooting at a 35mm focal length (I do). The A7 series is an impressive line of cameras that, one year later, is already seeing its first revision in the form of the A7 mark II. Reviews are starting to come in and most of them have people gushing over it. “It boots up in under 2 seconds!” enthused one reviewer. “It focuses pretty fast!” wagged another.


If you want a camera that’s properly fast, look no further than the still-amazing Olympus OM-D E-M1. Behind that mouthful of letters is a camera that screams capability. It takes pictures so fast it will blow your doors off. It will melt your face with its incredible electronic viewfinder.

It will boot up and shoot about six frames before the A7m2 has powered-up. (not actually tested with science!)

That’s right.

“But you can’t get a good image with that tiny sensor.”


Yes you can. This thing produces really sharp 16MP images.

sony a7r + voigtlander 15mm f4.5

Deb’s Sony A7R with Voigtlander 15mm Heliar M-mount lens on a Metabones adapter. Shot with OM-D E-M1 and 12mm F2 Olympus prime lens.

Olympus is making some really excellent cameras these days. Rumor has it they’re coming out with a new E-M5 in the spring with some kind of crazy sensor-shift technology that boosts output to an “effective” 40MP. Who cares? The E-M1 is the bomb. And the old E-M5 is still a plenty capable camera.


“Oli” on an Oly E-M5, Voigtlander 17.5mm prime at F0.95.

This post is my commitment to get out and shoot more pictures.

GDC 2014

I was fortunate to be able to attend the Game Developers Conference in San Francisco this year. Thanks to our organizers and IT staff for all the hard work they put into making everything run so smoothly.


My GDC14 Flickr Set

This was the first year Mozilla had an actual booth on the show floor, and we put it to good use demoing our Developer Tools alongside some fun games. We showed off our new Canvas Debugger (should be landing next week!), the Shader Editor, our Debugger and our other inspection tools. People were really receptive to the Canvas tool. The Shader Editor got a fair number of positive comments as well. I was also able to show off our Network panel as a temporary solution for inspecting game assets like textures.

Another well-received demo was a setup where I paused my desktop JS Debugger when receiving a device orientation event on my phone. I loaded the three.js DeviceOrientation demo in my phone’s browser (Firefox for Android). I then connected the phone via USB to my laptop and launched our remote tools via the “connect” option. Opening the Events panel, I was able to pick “deviceorientation” as a category; selecting it caused execution on the phone to pause immediately, with my desktop debugger showing the exact location.
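
To make that concrete, here’s a minimal sketch of the kind of handler the Events panel latches onto. It isn’t the three.js demo’s actual code, just an illustrative page-side listener: pausing on the “deviceorientation” category stops script right inside a handler like this.

```ts
// Illustrative only: any page that listens for device orientation will have a
// handler along these lines, and that's where the remote debugger pauses.
window.addEventListener("deviceorientation", (event: DeviceOrientationEvent) => {
  // alpha/beta/gamma describe the device's rotation in degrees around its three axes.
  const { alpha, beta, gamma } = event;
  console.log(`alpha: ${alpha}, beta: ${beta}, gamma: ${gamma}`);
});
```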

Debugging device events is easy to do on a mobile device. I was also able to demo our Shader Editor running on mobile which was pretty cool. Editing shaders in real-time running on a remote device is some real science fiction level stuff.

Having the kind of immediate feedback for WebGL (and soon WebAudio) that our tools provide is kind of a big deal for people who aren’t used to living in a dynamic environment like a web browser. There is lots of opportunity in this space to make tools for game developers that are fun to use and interactive. You can literally program your game while playing it.

This feels like a tipping point for games on the web. There are now multiple engine developers offering the Web as a bona fide deployment target. Three big engines have reduced their pricing to the point of being effectively free for most developers, and that happened just this week. This is a big deal and I think we’re going to start seeing a lot of game publishers shipping games to the web very soon.

We also weren’t the only booth showing off HTML5-related game technology. Nintendo is shipping a “Web Framework” built around a bundled WebKit shell for deployment on the Wii U and had a pretty sizeable installation to show it off. Unity is also making that a deployment target. Various other booths were demoing HTML5 games and tech.

In the emerging technology department, head-mounted displays were in full evidence. Sony just announced a new piece of head-gear for the PS4 and there were some other vendors kicking around similar technologies. At this point, it seems obvious that head-mounted displays are going to be very real, very soon. The lines of people at Oculus’ displays were a constant stream of humanity.



(This post is not really Mozilla-related, so if you’re not interested in open source flight controllers and software, you can stop reading here. There are some parallels, though, and I draw a connection later on, so I did decide to push this to Planet.)

A few weeks ago, a friend said, “hey, 3D Robotics has a sale on two of their drones right now.” This is exactly the kind of thing friends shouldn’t say to friends who are of a certain disposition – a highly-suggestible technology geek with a love of flying things and photography.

So I did a little research. And a little more research. And then a little more research… I did a lot of research and ultimately decided that the drone I wanted was not one of the two drones 3DR had on sale. The deal was a free GoPro Hero 3+ Black camera or 200 dollars off, I think. I looked hard at the Y6, but as a multicopter noob I found it a bit intimidating as a first vehicle. The drone I wanted was the IRIS, which was still in “Developer Preview” mode but shipped ready to fly and promised to be a good platform to learn on. They’ve since closed orders on the machine and are shipping the consumer version sometime in December.

That weekend I bought a Heli-max 1SQ Vcam and proceeded to crash it around inside the house. I nearly lost it after an exhilarating 10-minute flight around the park that ended with a crash into a tree.

I was totally hooked.

I ordered my IRIS that Sunday night. Astoundingly, it arrived direct from the assembly plant in Tijuana that Thursday. I was all set for a 2-4 week waiting period, but was denied the wait. Unfortunately, the weather would force me to wait before I could take it out for its maiden flight. Since getting my quadcopter, I’ve become even more obsessed with the weather and frequently ping Billy Bishop airport for METAR weather codes since they’re nearby.

First flight happened on a blustery Wednesday afternoon. We went to Woodbine Park in some pretty windy conditions, but I was dying to get some airtime with this thing. I did a few quick test launches before boxing everything back up and hurrying to the car because of the wind. First flight was a success!

That weekend the weather cleared and we had a really nice day on Saturday. That’s when I did my first real test flight.

A word about setup. There is a fairly lengthy bit of documentation, plus a set of instruction pages, to work through.

The one thing you really have to do before you can make use of autopilot is calibrate your compass! Mine was still carrying the calibration from the factory in Tijuana.

If you watch the first video above, I switch to autopilot around the 56-second mark and it has a hard time hitting the waypoint (I later learned that you need to set a reasonable acceptance radius on each waypoint in the flight planning software). At 2:45 it flew out of bounds over some trees. At 4:50 it did a slingshot spiral out over the pumping station, which was pretty spectacular. Each time, I was able to cancel autopilot and manually fly it back to the ground.

So that was educational. I spent the rest of the weekend calibrating and then recalibrating. The onboard flight software is particularly sensitive to this calibration, and if you do it incorrectly, as I did the first time, it won’t even let you arm the craft for take-off and will just flash a yellow warning light at you. The documentation is not exactly helpful in figuring out what the problem is, so I had to go back to initial configuration and guess at what had gone wrong. Since calibrating it properly, control has improved and waypointing works flawlessly.

I’ve also learned a bit about using a GoPro camera. The Hero 3+ now has a “Protune” option for video that drastically improves stability and cuts down on rolling-shutter artifacts. It takes pretty excellent video now. I’ve posted a couple of other videos showing my first successful auto-piloted flights, including landing.

This stuff is a lot of fun, and I’m experiencing some of the “developer preview” speed bumps along the way, which I’m more than happy to absorb. I find the documentation pretty good, though the flight planning software screenshots (either Mission Planner on Windows or APM Planner on Mac) rarely match what’s in the instructions.

I also experienced a bit of déjà vu trying to figure out which site has the right documentation. It feels a lot like Mozilla in some ways, where you have a bunch of inputs and a bunch of outputs and it’s up to the user to figure out how it all fits together. Programming the radios and PIDs is still a bit of a mystery, though the presets the IRIS came with are excellent right out of the box.

The other gem in all of this is the Android flight planning software. There are two options, but I’ve been running Droid Planner and am pretty happy with it. The other is called AndroPilot. They seem to have different capabilities and are both on GitHub.

I still have a lot of learning to do to get the controls tuned to my liking. Fortunately, the software is all open source. Now if the wind would just die down, I could go out for a flight.



I sat in on three sessions at Google I/O 2013 yesterday.

Memory Lane with Chrome DevTools and Gmail was the first.

The presenters showed off their Heap Tracking Profiler and Memory Tracking tool in the Timeline and explained how to use them to track down a leaky DOM node. It was a practical application of how to use a developer tool to solve a particular problem.

One interesting takeaway that surprised the presenters during their research: always allocating more memory (caching) in a large application like Gmail is not a panacea for slow performance. Having a large heap actually slows down the garbage collector, and your performance suffers. It’s a fine balance.

What surprised me was how they analyzed their problem by tracking a user with a known high memory problem for three days. The Google team constantly monitors their apps’ performance via the window.performance API and can single out hotspots in the population.
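
As a rough illustration of that kind of in-page monitoring (my sketch, not Google’s code): Chrome exposes a non-standard performance.memory object alongside the standard window.performance API, and sampling it periodically is enough to spot users whose heap keeps growing.

```ts
// Sketch only: sample Chrome's non-standard performance.memory heap counters
// and log them; a real app would report these to an analytics endpoint so
// outliers in the population can be singled out.
function sampleHeap(): void {
  const mem = (performance as any).memory; // non-standard, Chrome-only
  if (!mem) return;
  const usedMB = mem.usedJSHeapSize / (1024 * 1024);
  const totalMB = mem.totalJSHeapSize / (1024 * 1024);
  console.log(`JS heap: ${usedMB.toFixed(1)} MB used of ${totalMB.toFixed(1)} MB`);
}

setInterval(sampleHeap, 60_000); // once a minute is plenty for trend-spotting
```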

The next talk I sat in on was about Chrome Apps. The presenter, Erik Kay, showed off some of the “immersive” experiences of Chrome Apps and the different ways they could interact with the hardware on a Chromebook. The talk included a demo of a small thermal printer being hooked up and controlled over USB, which garnered some applause.

The Chrome Web Store lets you buy apps for Chrome.

The only real mention of Android was that they were using PhoneGap and Cordova to provide their compatibility layer. Same for iOS. There will be compatibility issues with deploying on iOS, but it seems surprising that they would pursue this completely separate technology for Android. Surely they could ship a full version of the Chrome Runtime and deal with hardware incompatibilities directly.

The questions from the room were interesting. One man (not a Mozillian) asked about WebRTC compatibility across the different platforms, pointedly repeating the question of whether or not he’d be able to use WebRTC in an app on iOS. The answer: only when their WebView supports it.

Another man asked something about interoperability between B2G and ChromeRT. Erik said that there is “no forcing function yet to drive standardization”.

I think my biggest takeaway from this talk was that people wearing Google Glass look like dorks.

My second biggest takeaway: I was very surprised that there was zero mention of the Google Play Store for Chrome Apps.

Last talk I attended was a Fireside Chat with the Blink Team. While I was expecting an actual fire and was disappointed there wasn’t one, the team bravely took questions from an audience confused about feature-detection, unprefixed CSS and market fragmentation.

Dan Buchner asked the panel something about standardization, and I felt a little bad for the Blink team, who had a whole chunk of slides talking about how they’re going to be good citizens. (If you want to participate, you should join the project’s mailing list.)

I was interested in their public dashboard, which shows a spreadsheet of features in progress. Time will tell how their Intent to Implement and Intent to Ship broadcasting will work from an open-source point of view, but they are currently claiming that a third of their intents to implement are coming from outside of Google.

I wanted to meet Paul Irish after the talk but Steven Shankland showed up and pushed me out of the way. When he was done I did get to meet him, but I think Buchner had made him angry or something. Maybe he was just tired. I dunno.


frank's demise

I have had a number of iPods over the years. Starting with my iPod 3G, then a 5G (whose screen is pictured above), a 1st-gen iPod Touch and lately an iPod 7G Classic with a 160GB drive in it. I’ve loved them all, but it really feels like “Device as a Music Player” is done. Apple’s shift from iPods to iPhones started that downward trend.

But it’s gone further than just the obsolescence of the dedicated player. Music storage itself has become another quaint notion. Services like Rdio and Pandora (still not available in Canada) have replaced saved music for many people. And video, too, is something streamed rather than “owned”.

I blame Apple. The company that started the shift to digital music has failed to innovate. The 256kbit AAC DRM file is now the pinnacle of purchasable audio and it’s not nearly good enough. Marketing phrases such as “Mastered for iTunes+” really mean “We’ve destroyed any dynamic range this recording might have had”. Sure, the Compact Disc wasn’t perfect, but mastering for that format certainly left a lot more room for the engineer to play with.

iTunes itself has become something of an abomination. More interested in selling you things than in maintaining and organizing your library, it’s frustrating to use if you have any amount of content. Ironically, I think the Apple Remote software available on the iPad may be my preferred interface for the new iTunes. It somehow feels more connected to my library than iTunes itself.

iTunes 11 is the Apple Maps of media software.

I think the field is ripe for the picking. Someone could come along and ship some music library software that doesn’t suck. I would pay for it. Bonus points if it recognizes and consolidates libraries from around my home network. And if it could stream to my devices while I’m out and about, automagically compressing my music on the fly, well, that’d be keen.

Android Photography

Android’s come a long way since my first fumblings a few years ago with a Dell Streak (“steak”). Both software and hardware have improved immensely since then. Sure, there are still some things that make you shake your head and question reality, but for the most part, Android 4.2 is a great platform.

In most cases, the problem you face when looking for a piece of software for a particular task is that there are too many options to choose from. So, I’m going to share a few of my favorite photo apps with you. (spoiler: none of them are Instagram).

Stock Android Camera

Yes, it’s built in on the Nexus phones. It was a big deal with the release of 4.2 and has a fairly unusual interface: radial menus for most control functions and a strange focus indicator. It’s a decent app, but I tend to fumble with it. The radial menus and icons are harder to figure out at a glance than they need to be, and the control I want is never on the menu I go to first.

OK, so I’m not a huge fan, but one nice feature in the Gallery app is a new set of built-in photo filters that are on par with just about any other camera app’s. Combined with Android’s impressive sharing capabilities, it lets you edit some great pictures without installing anything else.

galaxy glitch

Camera Zoom FX

This was the first alternative camera app I bought for my Galaxy Nexus. I bought it for the built-in effects, but it turns out to be a very capable camera app in its own right. With a bunch of setup options for guidelines, stability indicators, and horizon leveling, it’s already more useful than the built-in app. Add in some additional shot controls like timers, timelapse and voice-activation and you’ve got a stew goin’.

How are the effects? Delicious. You can grunge up any decent photo and make it look like you’re shooting a $10 Lomo Diana without the embarrassment of actually handling one.



Apparently this is one of the most popular Android photography apps, according to the Android Photographers group on Flickr. I tried it and it’s decent. I find some of the control settings a little hard to get to, though, since they’re buried in the settings menu. If I need exposure compensation, I don’t want to leave the camera’s live view to do it.

Nice effects though and good level of control if you can dig into it.

construction at the river


Shot Control

If there were a camera app for nerds, this would be it. Every control is right there on the screen. There are so many controls that the developers made a curious choice: the live view from the camera is in a corner of the screen. This real-time resizing comes at the cost of performance. It’s slow. Yes, there are lots of controls to fiddle with, and there are probably going to be times when I pull this out because I need a very specific bunch of settings. God help me when that’s true.


Hey, remember Snapseed? It used to be iOS-only, and it’s by the very capable crew at Nik Software. Then they shipped it for Android. And then Google bought it. You can tell because it has a G+ icon in the top strip.

Don’t let its checkered past confuse you though. This is one of the few apps built by a company with a pedigreed history of producing professional image-editing software. The controls are specific and very finely tunable. You should be able to produce some great photos with this. Only problem? It doesn’t have camera software built in, so you’ll be relying on one of the others to take the actual shot.


I like flickr. It’s no secret. Their app has finally gotten some love as well.

The app ain’t bad, but the built-in effects are slow. You’re better off using one of the other image editors and saving the flickr app for the upload itself. It also shows up as a sharing option in the Android sharing menu, for smooth, comfortable convenience. Nice!


A late addition from the Android Photographers group. I just installed it and it looks like a very capable camera app. I’ll have to play with it a bit, but it might just be a thing.