Posted By Neale Van Fleet on February 13th, 2018
The intro video for our new soundboard app Farrago has been very well received, so I thought I’d discuss how we made it. If you haven’t seen the video yet, take just 73 seconds to watch it, then read on.
When we decided to make a video to accompany the release of Farrago, the concept we began with was very different from what we wound up with. The original video featured a montage of mostly still screenshots, with animations between them. A great deal of time was spent making ultra-high detail vector screenshots of the app, then compositing them in Apple’s Motion app.
A screenshot of the original video being animated
Unfortunately, though the resulting video looked gorgeous in 4K, I couldn’t shake the feeling that it was overly flat, even lifeless. Much work went into making Farrago a lively app that’s full of subtle motion, and the video simply wasn’t doing it justice.
Thankfully, a better idea arrived in an unexpected flash of inspiration. While I was assembling the video’s soundtrack, it struck me that the layout of my GarageBand-based project looked rather like Farrago itself, with colored sound file blocks showing waveforms. I wondered how the video’s audio track would look if I actually brought it into Farrago.
The GarageBand project that inspired the concept
After spending just a few minutes porting the audio over, I knew I could make a much more dynamic and effective video by simply recording my screen while I ran through a sequence of steps live in Farrago. Instead of disjointed screenshots, the new video would have a continuous flow from one feature to the next, with no cuts or editing. Crucially, it would show exactly how the app works.
I created a rough outline for the team to review, and everyone was immediately on board with the new concept. I unceremoniously dumped most of the work from the first video and started fresh. While we kept the voiceover largely unchanged, I had to decide exactly what we would show. Taking inspiration from the classic first level of Super Mario Bros., I decided the video would slowly reveal different parts of the app. By building on previous items step by step, the video gives the viewer a solid understanding of the whole product.
To pull this off, I had to treat Farrago as something of an instrument, practicing over and over to get the desired sequence just right: Click this tile, wait five seconds, drag in a new file, and so on. Every mouse movement and key press was deliberate, and when I messed up, I’d start again. I ran through the entire flow at least 50 times before I was ready to create the actual recording used in the final video.
Amusingly, when I set up my screen recording software, I realized I had an audio issue. I needed to get Farrago’s audio into my screen recorder, which only accepted audio from an input device like a microphone. Thankfully, Rogue Amoeba’s own software helped me work around this in mere seconds. Using Loopback, I was able to route audio from Farrago right into my screen capture app. With that solved, I was ready to record.
Loopback saves the day!
Once I had my raw recording, I dove back into Motion to assemble everything. I added zooming and panning to emphasize different parts of the app, but nothing else was changed. Everything in the video is a real action, done in real time, in one long shot. It took many takes to get right, but the final result was worth it.
In the end, I think it was our willingness to throw away the effort we had already expended on the first concept that helped the video turn out as well as it did. When an obviously superior idea came along, we were willing to pursue it, even though it meant additional work. The end result is a video that vividly shows how the app works, rather than just telling, and viewers have responded to that.