The Incredible Resolution of the Olympus OM-D E-M1

And by “resolution”, I mean I am making the resolution to take more pictures with this camera in 2015. See what I just did?

I like cameras.


OM-D E-M1 Action shot with Sony RX1

Cameras have always had a special place in my heart. Designed to be held in your hand, a good camera feels like it belongs there. As an imaging instrument, it captures a scene by gathering photons through a focusing mechanism – usually a lens – and collecting them on a sensor. At least that’s what happens in a digital camera.

Think about that for a second. Photons.

When you take a picture, you are literally capturing a moment in time. My inner physics nerd freaks out a bit when I think about this too deeply. Photographs may well be our best proof of time’s existence. Sidebar: if you want to read about the elusiveness of proving time, Dan Falk’s In Search of Time: Journeys Along a Curious Dimension (Amazon.ca link) has nothing to do with photography but is pretty interesting.


Man hovering outside of GDC. Shot with the OM-D E-M1 and a Voigtlander 17.5mm F0.95. Manual focus.

Show a picture to one or more people who were in that place at that time and they will tell you, “Oh yeah, I remember that.” The image can take you back there.

Humans have been taking pictures (or, if you’re more serious about photography and want to sound like a prat, “making photographs”) for almost 200 years now. In that time, we’ve seen a couple of technologies come and go, though the death of film is somewhat up for debate. For the sake of argument, I’m going to claim it’s over.

As 2015 closes in, we’re seeing mirrorless cameras finally usurping the dominance of the once-ubiquitous DSLR. The tech is moving quickly too. Late last year, Sony released the first interchangeable-lens full-frame mirrorless cameras. They’d previously managed a proof of concept with the RX1, a camera that can produce stunning images if you can put up with its incredibly poor performance and quirky controls and happen to like shooting at a 35mm focal length (I do). The A7 series is an impressive line of cameras that, one year later, is already seeing its first revision in the form of the A7 Mark II. Reviews are starting to come in, and most of them have people gushing over it. “It boots up in under 2 seconds!” enthused one reviewer. “It focuses pretty fast!” wagged another.

Balls.

If you want a camera that’s properly fast, look no further than the still-amazing Olympus OM-D E-M1. Behind that mouthful of letters is a camera that screams capability. It takes pictures so fast it will blow your doors off. It will melt your face with its incredible electronic viewfinder.

It will boot up and shoot about six frames before the A7m2 has powered up. (Not actually tested with science!)

That’s right.

“But you can’t get a good image with that tiny sensor.”

Balls.

Yes you can. This thing produces really sharp 16MP images.


Deb’s Sony A7R with Voigtlander 15mm Heliar M-mount lens on a Metabones adapter. Shot with OM-D E-M1 and 12mm F2 Olympus prime lens.

Olympus is making some really excellent cameras these days. Rumor has it they’re coming out with a new E-M5 in the spring with some kind of crazy sensor-shift technology that boosts output to an “effective” 40MP. Who cares? The E-M1 is the bomb. And the old E-M5 is still a plenty capable camera.


“Oli” on an Oly E-M5, Voigtlander 17.5mm prime at F0.95.

This post is my commitment to get out and shoot more pictures.

GDC 2014

I was fortunate to be able to attend the Game Developers Conference in San Francisco this year. Thanks to our organizers and IT staff for all the hard work they put into making everything run so smoothly.


My GDC14 Flickr Set

This was the first year Mozilla had an actual booth on the show floor, and we put it to good use demoing our Developer Tools alongside some fun games. We showed off our new Canvas Debugger (should be landing next week!), the Shader Editor, and our Debugger and other inspection tools. People were really receptive to the Canvas tool, and the Shader Editor got a fair number of positive comments as well. I was also able to show off our Network panel as a temporary solution for inspecting game assets like textures.

Another well-received demo was a setup where I paused my desktop JS Debugger when a device orientation event was received on my phone. I loaded the three.js DeviceOrientation demo in my phone’s browser (Firefox for Android), then connected the phone via USB to my laptop and launched our remote tools via the “connect” option. Opening the Events panel, I picked “deviceorientation” as a category; selecting it caused execution on the phone to pause immediately, with my desktop debugger showing the exact location.
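For the curious, the handler the debugger lands in is nothing exotic. Here’s a minimal, purely illustrative sketch (assumed for this post, not the actual three.js demo code) of the kind of listener the Events panel breaks on:

    // Hypothetical listener, just to show what breaking on "deviceorientation" pauses.
    // The real three.js demo feeds these readings into its camera controls.
    window.addEventListener('deviceorientation', function (event) {
      // alpha, beta and gamma describe the device's rotation, in degrees.
      console.log('orientation:', event.alpha, event.beta, event.gamma);
      // With "deviceorientation" checked in the Events panel, execution on the
      // phone pauses right here and the desktop debugger shows this location.
    });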

Debugging device events like this is easy to do on a mobile device. I was also able to demo our Shader Editor running on mobile, which was pretty cool. Editing shaders in real time on a remote device is some real science-fiction-level stuff.

Having the kind of immediate feedback for WebGL (and soon WebAudio) that our tools provide is kind of a big deal for people who aren’t used to living in a dynamic environment like a web browser. There is lots of opportunity in this space to make tools for game developers that are fun to use and interactive. You can literally program your game while playing it.

This feels like a tipping point for games on the web. There are now multiple engine developers offering the Web as a bona fide deployment target. Just this week, three big engines reduced their pricing to the point of being effectively free for most developers. This is a big deal, and I think we’re going to start seeing a lot of game publishers shipping games to the web very soon.

We also weren’t the only booth showing off HTML5-related game technology. Nintendo is shipping a “Web Framework” built around a bundled WebKit shell for deployment on the Wii U and had a pretty sizeable installation to show it off. Unity is also making that a deployment target. Various other booths were demoing HTML5 games and tech.

In the emerging-technology department, head-mounted displays were in full evidence. Sony just announced a new piece of headgear for the PS4, and there were some other vendors kicking around similar technologies. At this point, it seems obvious that head-mounted displays are going to be very real, very soon. The lines of people at Oculus’ booth were a constant stream of humanity.

gg;hf.

#io13


I sat in on three sessions at Google I/O 2013 yesterday.

Memory Lane with Chrome DevTools and Gmail was the first.

The presenters showed off their Heap Tracking Profiler and Memory Tracking tool in the Timeline and explained how to use them to track down a leaky DOM node. It was a practical application of how to use a developer tool to solve a particular problem.

One interesting takeaway that surprised the presenters during their research: always allocating more memory (caching) as a way to improve performance in a large application like Gmail is not a panacea. A large heap actually slows down the garbage collector, and your performance suffers. It’s a fine balance.

What surprised me was how they analyzed the problem: by tracking a user with a known high-memory issue for three days. The Google team constantly monitors their apps’ performance via the window.performance API and can single out hotspots in the population.
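I don’t know what Google’s actual monitoring code looks like, but a rough sketch of the kind of number you can pull out of window.performance would be something like the following (the /metrics endpoint is invented for illustration; only the Navigation Timing fields are standard):

    // Hypothetical load-time beacon built on the standard Navigation Timing API.
    window.addEventListener('load', function () {
      var t = window.performance.timing;
      var pageLoadMs = t.loadEventStart - t.navigationStart;
      // Fire-and-forget report so numbers can be aggregated across real users.
      new Image().src = '/metrics?pageLoadMs=' + pageLoadMs;
    });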

The next talk I sat in on was about Chrome Apps. The presenter, Erik Kay, showed off some of the “immersive” experiences of Chrome Apps and the different ways they can interact with the hardware on a Chromebook. The talk included a demo of a small thermal printer being hooked up and controlled over USB, which garnered some applause.

The Chrome Web Store lets you buy apps for Chrome.

The only real mention of Android was that they were using PhoneGap and Cordova to provide their compatibility layer. Same for iOS. There will be compatibility issues with deploying on iOS, but it seems surprising that they would pursue this completely separate technology for Android. Surely they could ship a full version of the Chrome Runtime and deal with hardware incompatibilities directly.

The questions from the room were interesting. One man (not a Mozillian) asked about WebRTC compatibility across the different platforms, pointedly repeating the question of whether or not he’d be able to use WebRTC in an app on iOS. The answer: only when their WebView supports it.

Another man asked something about interoperability between B2G and ChromeRT. Erik said that there is “no forcing function yet to drive standardization”.

I think my biggest takeaway from this talk was that people wearing Google Glass look like dorks.

My second biggest takeaway: I was very surprised there was zero mention of the Google Play Store for Chrome Apps.

The last talk I attended was a Fireside Chat with the Blink Team. While I was expecting an actual fire and was disappointed there wasn’t one, the team bravely took questions from an audience confused about feature detection, unprefixed CSS, and market fragmentation.

Dan Buchner asked the panel something about standardization, and I felt a little bad for the Blink team, who had a whole chunk of slides talking about how they’re going to be good citizens. (If you want to participate, you should join blink-dev@chromium.org.)

I was interested in ChromeStatus.com/features, which shows a spreadsheet of in-progress features. Time will tell how their Intent to Implement and Intent to Ship broadcasting will work from an open-source point of view, but they currently claim that a third of their intents to implement come from outside of Google.

I wanted to meet Paul Irish after the talk, but Steven Shankland showed up and pushed me out of the way. When he was done, I did get to meet Paul, but I think Buchner had made him angry or something. Maybe he was just tired. I dunno.

iPods

frank's demise

I have had a number of iPods over the years, starting with my iPod 3G, then a 5G (whose screen is pictured above), a first-generation iPod Touch, and lately a 7th-generation iPod Classic with a 160GB drive in it. I’ve loved them all, but it really feels like “Device as a Music Player” is done. Apple’s shift from iPods to iPhones started that downward trend.

But it’s gone further than just the obsolescence of the dedicated player. Music storage itself has become another quaint notion. Services like Rdio and Pandora (still not available in Canada) have replaced saved music for many people. And video, too, is now streamed rather than “owned”.

I blame Apple. The company that started the shift to digital music has failed to innovate. The 256kbps AAC file is now the pinnacle of purchasable audio, and it’s not nearly good enough. Marketing phrases such as “Mastered for iTunes” really mean “We’ve destroyed any dynamic range this recording might have had”. Sure, the Compact Disc wasn’t perfect, but mastering for that format certainly left a lot more room for the engineer to play with.

iTunes itself has become something of an abomination. More interested in selling you things than in maintaining and organizing your library, it’s frustrating to use if you have any significant amount of content. Ironically, I think the Apple Remote software available on the iPad may be my preferred interface for the new iTunes. It may feel more connected to my library than iTunes itself does.

iTunes 11 is the Apple Maps of media software.

I think the field is ripe for the picking. Someone could come along and ship music library software that doesn’t suck. I would pay for it. Bonus points if it recognizes and consolidates libraries from around my home network. And if it could stream to my devices while I’m out and about, automagically compressing my music on the fly, well, that’d be keen.

Android Photography

Android’s come a long way since my first fumblings a few years ago with a Dell Streak (“steak”). Both software and hardware have improved immensely since then. Sure, there are still some things that make you shake your head and question reality, but for the most part, Android 4.2 is a great platform.

In most cases, when you’re looking for a piece of software for a particular task, the problem you’re faced with is that there are too many options to choose from. So, I’m going to share a few of my favorite photo apps with you. (Spoiler: none of them are Instagram.)

Stock Android Camera

Yes, it’s built in on the Nexus phones. It was a big deal with the release of 4.2 and has a fairly unusual interface: radial menus for most control functions and a strange focus indicator. It’s a decent app, but I tend to fumble with it. The radial menus and icons are harder to figure out at a glance than they need to be, and the control I want is never on the menu I go to first.

OK, so I’m not a huge fan, but one nice feature in the Gallery app is a new set of built-in photo filters that are on par with just about any other camera app’s. Combined with Android’s impressive sharing capabilities, the Gallery lets you edit and share some great pictures without installing anything else.

galaxy glitch

Camera Zoom FX

This was the first alternative camera app I bought for my Galaxy Nexus. I bought it for the built-in effects, but it turns out to be a very capable camera app in its own right. With a bunch of setup options for guidelines, stability indicators, and horizon leveling, it’s already more useful than the built-in app. Add in some additional shot controls like timers, time-lapse, and voice activation, and you’ve got a stew goin’.

How are the effects? Delicious. You can grunge up any decent photo and make it look like you’re shooting a $10 Lomo Diana without the embarrassment of actually handling one.

sale

Vignette

Apparently this is one of the most popular Android photography apps, according to the Android Photographers group on Flickr. I tried it and it’s decent. I find some of the control settings a little hard to get to, though; they’re buried in the settings menu. If I need exposure compensation, I don’t want to leave the camera’s live view to do it.

Nice effects, though, and a good level of control if you can dig into it.

construction at the river


Shot Control

If there were a camera app for nerds, this would be it. Every control is right there on the screen. There are so many controls that the developers made a curious choice: the live view from the camera sits in a corner of the screen. This real-time resizing comes at the cost of performance. It’s slow. Yes, there are lots of controls to fiddle with, and there will probably be times when I pull this app out because I need a very specific bunch of settings. God help me when that’s true.

Snapseed

Hey, remember Snapseed? It used to be iOS-only, and it’s by the very capable crew at Nik Software. Then they shipped it for Android. And then Google bought them. You can tell because it has a G+ icon in the top strip.

Don’t let its checkered past confuse you, though. This is one of the few apps built by a company with a pedigreed history of producing professional image-editing software. The controls are specific and very finely tunable. You should be able to produce some great photos with this. Only problem? It doesn’t have camera software built in, so you’ll be relying on one of the other apps to take the actual shot.

Flickr

I like Flickr. It’s no secret. Their app has finally gotten some love as well.

The app ain’t bad, but the built-in effects are slow. You’re better off editing in one of the other image editors and using the Flickr app just for uploading. It also shows up as a sharing option in the Android sharing menu for smooth, comfortable convenience. Nice!

ProCapture

A late addition from the Android Photographers group. I just installed it and it looks like a very capable camera app. I’ll have to play with it a bit, but it might just be a thing.