Music Video: Crosslegged — “Automatic”
Marc Loftus
June 12, 2023


BROOKLYN, NY — Keba Robinson (aka Crosslegged) released her second studio album, “Another Blue,” on January 27th. She has now followed it with a music video for the song “Automatic,” directed by Sophia Bennett Holmes and Cade Featherstone, who used AI technology to animate photos of the artist into uncanny, playful and poetic moving images.

"Automatic" is a love song at its core, and the video retains that sense of romanticism as Keba (@crosslegged_) and her clones sing and dance together while trying to woo each other with winks, kisses and flowers.

“We fed the software a series of photos we took, which then generated the 20 or so Kebas you see in the video,” explain the directing duo. “It was really important in the midst of AI panic to keep on reminding ourselves that it was something we could work with and something that could push us creatively. We also, foolishly, believed it would make for a speedy music video, but actually, it opened up its own set of bizarre challenges that were completely new and took twice as long.”

Sophia Bennett Holmes made a music video for Crosslegged in 2014, back when they were undergraduates at Cooper Union. With the new album release, Keba Robinson reached out once again to make a video for its first single, “Only In The.” For the next single, “Automatic,” the collaborators treated the video as an opportunity to move beyond their past narrative work.

“Every different face you see was generated from a different still,” explain the filmmakers. “There were probably only around 30 in the final cut, but we had literally hundreds in our folders because it was so difficult to predict how the software would interpret the images.”

The primary software the directors employed was Deep Nostalgia.

“It’s part of a site called MyHeritage, which is essentially an ancestry-locating website, but it also has this crazy section where you can feed photos in and it can reanimate them,” they explain. “It’s for people to see what their ancestors would look like in motion. There’s some other AI software floating about that does similar things, but this had a particular strangeness to it that drew us in.”

Because the Deep Nostalgia technology relies on text input, it was difficult to time the speech to the song’s lyrics and tempo. 

“We tried to match it by spelling things phonetically. For example, a lyric like ‘It’s cruel’ is sung to last over a few bars, but to replicate that, we would have to write, ‘It’s crooooooooo, ooowowoooo, elllllll.’ But even with that, the timing was never perfect, so we did take it into Adobe Premiere Pro to warp the speed to match the song a little more. We also eventually composited all the individual animations in After Effects to produce the final video.”
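The directors did this speed-warping in Premiere Pro, but the underlying retiming step is simple to sketch in other tools as well. Below is a minimal, illustrative Python snippet using ffmpeg's setpts filter to stretch or compress a generated (and mute) portrait clip so it spans the number of seconds a lyric occupies in the song; the file names and durations are hypothetical, not taken from the production.

```python
import subprocess

def retime_clip(src: str, dst: str, src_duration: float, target_duration: float) -> None:
    """Stretch or compress a silent clip so it lasts target_duration seconds.

    Illustrative stand-in for the speed-warp the directors describe doing in
    Premiere Pro; here the retiming is done with ffmpeg's setpts video filter.
    """
    ratio = target_duration / src_duration  # >1 slows the clip down, <1 speeds it up
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src,
            "-filter:v", f"setpts={ratio}*PTS",  # rescale presentation timestamps
            "-an",  # the animated portraits are mute; the song is laid over them later
            dst,
        ],
        check=True,
    )

# Example: a lyric that spans 3.2 seconds in the song, but whose generated
# animation came out at only 2.1 seconds.
retime_clip("keba_clip.mp4", "keba_clip_retimed.mp4", src_duration=2.1, target_duration=3.2)
```

The retimed clips would then be layered and composited, as the directors did with their individual animations in After Effects.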

Initially, the filmmakers thought AI would relieve them of numerous responsibilities.

“It will animate for us, sing for us, green screen for us, and we’ll just throw it together,” they thought, “but the tasks around that became huge. Learning what input it needed, generating that input, how many photos we had to run through for one to be ‘right,’ tidying up the mistakes...”

The project was produced by @gummyfilms. You can follow the directors on Instagram at @cade_featherstone and @sophiabennettholmes.