Issue: December 1, 2011
Bringing kids' apps to life
By Michael Elman
MONTREAL — At Budge Studios, we specialize in producing fun apps for kids that feature their favorite characters and stories. Launched in 2010, we are still a relatively new company, but we have already created applications for multiple platforms featuring characters from Dora, SpongeBob, Happy Feet and other top children's properties.
Our production team focuses on creating immersive storytelling experiences for kids that combine reading, watching, and playing — all brought together with robust animation, illustration, sound and music. These features help the apps we design stand out among the many other interactive books and apps available. One recent example is Happy Feet Two: Erik's Adventure, an interactive gaming storybook experience for iPad, iPhone and iPod Touch.
The app development world moves quickly, and our productions need to do so as well. As we have grown the production studio here at Budge, we have invested in tools that we feel will help us get the job done well and efficiently.
With regard to video hardware, Budge has had two different and equally critical needs. The first is the development of the apps themselves. Our audio team is frequently called upon to craft sound effects and music for the apps. To do so, they require video sources to sync their sound to (they import the videos into Pro Tools). Once they have synced sound to all of the animations in the app, they edit and output the audio as individual files for the programmers to integrate into the game engine.
The issue is that, until now, there has not been an efficient way to capture video from iPads and iPhones. Our artists previously used our app development machine to capture video, which meant they had to interact with the app using a mouse. In addition to losing audio and a realistic sense of play, we couldn't assume that the frame accuracy matched what it would be on an actual smartphone or tablet.
Recently, however, Apple added the ability to output HDMI from the devices themselves. We immediately started looking for a device that could capture that HDMI output and convert it to digital video files. After some investigation into how we could perform a proper video capture from a target device, we found that we could use the Matrox MXO2 to capture the HDMI output of the iPad 2.
Upon learning that Matrox had already tested this capability successfully, we purchased and installed the MXO2 and began capturing video directly from the iPad 2 while maintaining audio and frame accuracy. It was an ideal solution for us. The artists just plug an iPad 2 into the MXO2, play the game, and seamlessly capture and live-encode the video at 720p60 — and it's fast. We no longer have development machines tied up with video capture, and staff at any level of technical knowledge can easily create clips from the iPad 2.
Our studio’s second need was related to our in-house marketing, as I believe it’s critical to create great trailers of our apps to post online for potential customers so they can see what the apps are all about. Smaller developers, in particular, need a better alternative to videotaping someone playing the app on a device.
Before the MXO2 system was in place, we were unable to capture great HD video of our apps being played, and we had to spend a lot of time creating "fake" game footage. With this system, we can capture someone playing the game for a few minutes, and the content is ready to send to the editors to work their magic. The MXO2 has helped us work faster and create better products — and now, with the new Thunderbolt connection, it is easy to move the unit around the studio so anybody can use it.
Other tools used by Budge to create these apps include Unity 3D, Cocos2D, Xcode, Objective-C, Pro Tools, Photoshop, Flash and After Effects.