If you roll with Android and are jealous of Instagram’s Hyperlapse app…

Earlier today I finally bit the bullet and released an early version of Gallus on the Google Play Store – https://play.google.com/store/apps/details?id=com.yafla.gallus&hl=en. Gallus is, as the title hints, a video stabilization/hyperlapse application for Android, created specifically to fill a gap in the Android space.

UI-wise I started with a very focused application, with the same sort of commitment to a singular purpose as seen in Instagram’s Hyperlapse product. Quite recently, though, I decided to add some basic video management (still quite spartan at this point), primarily out of concern that apps like this let the user unknowingly accumulate a lot of video files; it was an effort to surface them.

I continue to work on improving that ancillary GUI (such as the video details page, which is currently a spartan kick-out to an external video manager), but for now the app is available for perusal by anyone casually interested in stabilization or time compression/expansion.

The complexity of the application is in the algorithms and approach required to make this possible on Android. I could write volumes about the issues, but in short: some devices have poor-accuracy sensors, which makes the stabilization of limited use. Further, some devices have varying frame delays, and while I built a very efficient pathway to minimize this overhead, on some devices it reduces the value proposition (on Intel Android tablets, for instance, it seems to be particularly bad).
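To give a sense of the timing problem, here is a minimal sketch (illustrative names and structure, not the actual Gallus pipeline) of interpolating the gyroscope track to a frame’s timestamp, shifted by a configurable per-device sensor offset like the 0ms one mentioned below:

```java
import java.util.List;

/** One gyroscope reading (rad/s) with its SensorEvent timestamp in ns. */
final class GyroSample {
    final long timestampNs;
    final float x, y, z;

    GyroSample(long timestampNs, float x, float y, float z) {
        this.timestampNs = timestampNs;
        this.x = x;
        this.y = y;
        this.z = z;
    }
}

final class FrameAlignment {
    /**
     * Linearly interpolates the gyroscope track to a frame timestamp,
     * shifted by a per-device offset, so each frame is paired with the
     * rotation the device actually had when the frame was exposed.
     */
    static float[] rotationAtFrame(long frameTimestampNs, long sensorOffsetNs,
                                   List<GyroSample> samples) {
        long target = frameTimestampNs + sensorOffsetNs;
        for (int i = 1; i < samples.size(); i++) {
            GyroSample a = samples.get(i - 1);
            GyroSample b = samples.get(i);
            if (b.timestampNs >= target) {
                // Clamp in case the frame time falls before the first sample.
                float t = Math.max(0f, (target - a.timestampNs)
                        / (float) (b.timestampNs - a.timestampNs));
                return new float[] {
                        a.x + t * (b.x - a.x),
                        a.y + t * (b.y - a.y),
                        a.z + t * (b.z - a.z) };
            }
        }
        // Frame time falls after the last sample; hold the last reading.
        GyroSample last = samples.get(samples.size() - 1);
        return new float[] { last.x, last.y, last.z };
    }
}
```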

EDIT 2015-03-23: a separate code path is now in effect for Android 5.x+ devices with the 5.x camera integration (which excludes the Nexus 4, but includes the Nexus 5 and 6). This dramatically improves the stabilization and functionality/options on those devices, so keep a lookout for version 1.0.11 if you haven’t been amazed by the results on your device thus far. Put your socks on tight, because they’re about to be blown off.
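As a sketch of what such a device check can look like (the hardware-level gating here is an assumption on my part, though it matches the Nexus 4 being excluded while the Nexus 5 and 6 qualify):

```java
import android.content.Context;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.os.Build;

final class CameraPathChooser {
    /**
     * Decides whether the newer 5.x camera path is worth taking: the
     * camera2 API must exist at all, and the device must report better
     * than LEGACY support. LEGACY devices merely shim the old camera API,
     * so they stay on the original code path.
     */
    static boolean useNewCameraPath(Context context, String cameraId) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) {
            return false;
        }
        try {
            CameraManager manager =
                    (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
            CameraCharacteristics chars =
                    manager.getCameraCharacteristics(cameraId);
            Integer level =
                    chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
            return level != null
                    && level != CameraCharacteristics
                            .INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY;
        } catch (Exception e) {
            return false;
        }
    }
}
```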

This example was with a Nexus 4 and a sensor offset of 0ms. I’ve used a 1x sample, but the benefit of smoothing is dramatically more obvious when you’re running at faster playback rates. The subject matter is miserable (dreary slush and a gravel drive), I didn’t add any complementary perky and inspirational music, and the results are hardly perfect: there is still motion (the algorithm is not zeroing; it smooths out small, generally unintentional motion, but it still intentionally allows what is perceived as normal motion), and the Nexus cameras just generally aren’t super impressive, but it gives an idea. While taking this I made absolutely no effort to be either smooth or otherwise, and instead tried to walk and hold the camera as normally as possible (though when I have tried to be particularly smooth, it has, remarkably, almost always yielded the opposite result).

The algorithm right now, for those who might be interested, is basically a best-fitting least-squares line: it tries to draw a straight line through as many points as possible with none of them deviating more than the max-degrees offset, and once one does it starts a new line. This means that if you have a max-degrees offset of 5 degrees, and hold the camera on the horizon without going more than 5 degrees of rotation (or pitch, or heading) one way or the other, it will cancel out those sub-5-degree movements, presuming that the sensors are accurate. (If the sensors weren’t accurate and you had your device in a clamp, it would be “correcting” in error; in my testing this hasn’t been the case with most devices, though. The only exception is the GS3, which does seem to introduce sporadic noise into its sensors.)
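For a single axis (roll, pitch, or heading), the idea looks roughly like this in Java. This is a reconstruction from the description above, not the shipped code, and a real-time implementation would extend the fit incrementally rather than refitting the whole run at every step:

```java
/**
 * Greedily extends a least-squares line over an angle track until some
 * point in the current run deviates from the fit by more than maxDegrees,
 * then starts a new line. The fitted lines become the "intended" camera
 * path; the per-frame correction is the raw angle minus the fitted angle.
 */
final class SegmentedLineSmoother {

    /** Returns the smoothed (line-fitted) angle for each sample index. */
    static double[] smooth(double[] angles, double maxDegrees) {
        double[] out = new double[angles.length];
        int start = 0;
        while (start < angles.length) {
            int end = start + 1;                 // exclusive end of the run
            double slope = 0, intercept = angles[start];
            // Grow the run while the best-fit line stays within maxDegrees
            // of every point in it.
            while (end < angles.length) {
                double[] fit = leastSquares(angles, start, end + 1);
                if (maxDeviation(angles, start, end + 1, fit) > maxDegrees) {
                    break;                       // this point starts a new line
                }
                slope = fit[0];
                intercept = fit[1];
                end++;
            }
            for (int i = start; i < end; i++) {
                out[i] = slope * (i - start) + intercept;
            }
            start = end;
        }
        return out;
    }

    /** Least-squares fit of a[from..to) against x = 0..n-1: {slope, intercept}. */
    private static double[] leastSquares(double[] a, int from, int to) {
        int n = to - from;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += i;
            sy += a[from + i];
            sxx += (double) i * i;
            sxy += (double) i * a[from + i];
        }
        double denom = n * sxx - sx * sx;
        double slope = denom == 0 ? 0 : (n * sxy - sx * sy) / denom;
        double intercept = (sy - slope * sx) / n;
        return new double[] { slope, intercept };
    }

    /** Largest absolute deviation of a[from..to) from the fitted line. */
    private static double maxDeviation(double[] a, int from, int to, double[] fit) {
        double worst = 0;
        for (int i = from; i < to; i++) {
            double predicted = fit[0] * (i - from) + fit[1];
            worst = Math.max(worst, Math.abs(a[i] - predicted));
        }
        return worst;
    }
}
```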

I’ll clean up the market page soon (as is often the case for the non-code parts, it’s the bare minimum; after spending weeks reducing latency and optimizing algorithms, at some point I just need to move forward), but for those who read this blog, I thought you might find it interesting.

It does require Android 4.4 and an on-device gyroscope and accelerometer, owing to some of the APIs it makes use of (originally I supported lower versions, but the video pipeline was so long, and the performance so variable, that it was impossible to yield a good result); hopefully this doesn’t cut out many users. If you have a chance to try it out, I would love to hear your thoughts or experiences.
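For fellow developers, the hardware gating amounts to roughly this runtime check (a sketch with illustrative names; the actual Play Store filtering is declared in the manifest via minSdkVersion and uses-feature entries):

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Build;

final class Requirements {
    /** Runtime equivalent of the market requirements: 4.4+, gyro, accelerometer. */
    static boolean supported(Context context) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.KITKAT) {
            return false;
        }
        SensorManager sensors =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        return sensors.getDefaultSensor(Sensor.TYPE_GYROSCOPE) != null
                && sensors.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) != null;
    }
}
```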