Around 2015, I was asked by Circle of Confusion (the production company behind The Walking Dead television show on AMC) to build a companion app for a film they had made called Capture.
Capture was one of those projects that are simultaneously exhilarating and terrifying. One of the challenges of app development is trying to plan out the timeline and budget for a process that is absolutely full of unknowns. And some projects have many more unknowns than others.
In the case of the Capture app, the task was to develop an app with fairly typical film companion features, including a trailer, some character bios, photo stills, a plot description, and so on. So far, not many unknowns there.
But this app also needed to listen when it was opened and recognize audio cues. When a particular audio clip from the film was recognized, it would trigger some event: an incoming text message, an incoming phone call, an audio clip, a vibration, or a film clip playing full screen, for example.
Essentially, this was Shazam but for a specific set of movie clips.
The first step, naturally, in planning out a project like this is to figure out what the big challenges are, and start to look at how they might be solved.
On a project like this, there was obviously no way we were going to develop a proprietary algorithm for recognizing audio out in the wild. This is, coincidentally, an area I studied with some seriousness in graduate school: recognizing patterns in audio clips, images, and other assorted media using fuzzy algorithms built on wavelets. It was not something that would be feasible to do from scratch for a film companion app, to say the least!
Fortunately, we were able to find and license a C library that did just what we needed. The library needed an Objective-C wrapper to be usable in the iOS project, so that became step one in prototyping the app. Once that was done, the rest of the app could be built around our audio recognition engine. From there, we could focus on two tasks: processing the audio clips into data we could embed in the app, and building the system that would trigger the various events when the recognition engine fired off a notification that we had an audio match.
The result was one of the most satisfying apps to test — we spent hours playing clips from the film and watching our phones go crazy in response.
Capture is available for iOS — but of course, in order to fully experience it you need to also watch the film, which you can stream on Amazon.
If you need an app developed, reach out and let’s talk! You can contact me using the contact form on this site, via Skype at stromdotcom or by visiting my company website at https://glowdot.com.