The Era of Data Orchestration
Jason Lohrey


Sponsored News

It has been over a quarter of a century since Bill Gates wrote "Content is King" - the essence of that essay is as relevant today as it was then, particularly when applied to the visual content production industry. The number of pathways from producer to consumer has increased dramatically in the intervening period. It has never been easier to consume content wherever and whenever we choose, and we are converging on content produced specifically for each of us.

Our expectations for production quality have increased with every passing year - that is obvious when I try to enthuse my teenage kids about Ridley Scott's Blade Runner from 1982 or Terry Gilliam's Brazil from 1985. For the foreseeable future (unless there comes a time when people yearn for older analogues), every new production that pushes the boundaries of audience experience permanently raises the bar for every show that follows. We have hit an inflection point where the level of complexity exceeds what humans can manage unaided. We need systems to orchestrate the amount (number and size) of data essential to creating even more engaging content.

Content production is a team effort that leverages the creativity and energy of people who are increasingly globally distributed. The minutiae of visual effects are essential - modelling every hair, blade of grass, or water droplet to ever higher levels of fidelity and realism requires access to specialists who may live and work anywhere. Moving data between distributed teams over wide area networks can be costly, and moving it over large distances incurs network latencies that affect delivery times. We should send only the data that collaborators require to work, and no more. For oversight and governance, we need to know where everything is (or was) at any point in time.

Every increase in fidelity produces even greater amounts (both number and size) of data to manage - over time the scale will rise from hundreds of millions of assets today to billions and then trillions generated for a finished product. Could you manage a project with a trillion components involving a geographically distributed team of 10,000 people? It is only a matter of time before we reach those scales.

Put simply, the number of parts is increasing dramatically. This is a classic optimisation problem: we need to get the right data to the right people at the right time. We are in an era in which we need vastly better systems for orchestrating those flows of data to ensure they happen securely, reliably and at the lowest possible cost. It was that realisation that drove Arcitecta to create solutions to address this problem space.

Data orchestration is a branch of data management that focusses on the flow of data from one place, person, or team to another. That flow might be one-to-one, one-to-many, or many-to-many. This is not unique to visual content production - it is a requirement in many disciplines - but because visual content production often involves the collaboration of many people and resources, there is much to be gained by paying attention to those flows.

Let's look at an example. A successful production company headquartered in Germany has many clients and significant work on the books. To handle the load, they decide to outsource parts of their content production pipeline to a company in Australia. The time zone differences are significant, so it is going to be challenging to coordinate the flow of project data to avoid unnecessary idle time. Whilst the client is supportive, some of the work is sensitive, and they want the production company to ensure that all sensitive work is done in Germany only. The production company will need to ensure that only non-sensitive data is sent to Australia, and that there is an audit trail of where that data is and who has access to it.

After six months, they decide to outsource more work to another production company in Canada - some of the work that was being done in Australia is now diverted to Canada. The Canadians can start immediately, but they need to be primed with enough data to get going. The remainder of the data will be sent as time and bandwidth permit. The Canadians decide that they would also like to reciprocate and outsource some of their project work to Germany.
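The policy in this scenario - sensitive assets stay in Germany, and every transfer is logged for audit - can be sketched as a simple metadata-driven routing rule. This is a minimal illustration only; the `Asset` class, `route` function, and site codes are hypothetical and are not Arcitecta's or Mediaflux's API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical asset record: metadata on the asset drives where it may flow.
@dataclass
class Asset:
    asset_id: str
    sensitive: bool
    location: str  # current site code, e.g. "DE"

ALLOWED_SITES = {"DE", "AU", "CA"}

# Audit trail: (timestamp, asset, from, to, outcome) for oversight and governance.
audit_log = []

def route(asset: Asset, destination: str) -> bool:
    """Send an asset to a partner site only if policy allows it."""
    if destination not in ALLOWED_SITES:
        return False
    # Policy from the scenario: sensitive work is done in Germany only.
    if asset.sensitive and destination != "DE":
        audit_log.append((datetime.now(timezone.utc), asset.asset_id,
                          asset.location, destination, "DENIED"))
        return False
    audit_log.append((datetime.now(timezone.utc), asset.asset_id,
                      asset.location, destination, "SENT"))
    asset.location = destination
    return True
```

With rules like this, attempting to route a sensitive asset to Australia is refused and recorded, while non-sensitive assets flow freely - and because every decision is logged, the company always knows where everything is (or was).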

As the number of people and sites grows, it becomes increasingly difficult to keep track of everything and ensure that data flows precisely where and when it is needed so that people can work efficiently without downtime. If that sounds familiar, then you need metadata-driven systems to orchestrate your data.

It shouldn't matter where these processes occur - joint solutions from Arcitecta and Dell Technologies deliver data where it's needed at the right time. Arcitecta's pioneering metadata and data orchestration tools and Dell Technologies' powerful, industry-trusted infrastructure enable a globally distributed edge that stays simple and performant, no matter the complexity of your workflows.

Jason Lohrey is the CTO of Arcitecta (www.arcitecta.com), which has created its own comprehensive data management platform called Mediaflux.