NAB 2026: AI-Powered Video Creation with Avid and Google
Avid and Google teamed up for a tech preview at NAB 2026 to demonstrate new generative video functionality for Avid Media Composer. This workflow tool brings us to the front of the lens, where we’ve moved one step closer to building realistic content with Gen AI.
I saw a demo that showed a young woman in a room, whose actions we could direct with text prompts. We made her cry, greet another person, push her sunglasses up onto her head, and leave the room, and we could use other natural-language prompts to create additional actions. The woman moved naturally against a very realistic-looking background. It was an impressive demo.
The demo leveraged the following technologies:
- Google Gemini models
- Avid Media Composer
The two companies just announced a multi-year partnership to embed Google’s Gemini models and Vertex AI into Media Composer, Avid’s video editing suite, and Content Core, the company’s AI-driven media management platform.
Postproduction teams can source media context through natural-language search. The integration lets editors run semantic (and potentially complex, multi-concept) searches to find a piece of content, with the aim of reducing manual processing and the time required for media discovery. I've seen many demos from a wide range of companies where you can search by keyword or content sentiment; I'd say that's now table stakes. What really matters is the quality of the results, and Google is known for finding things online, so that's a big plus.
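To make the idea of semantic, multi-concept search concrete, here is a minimal sketch of how such a search ranks clips by meaning rather than exact keywords. The `embed` function is a toy stand-in for a real embedding model (such as one served from Vertex AI); the clip data and all function names are hypothetical, not part of any announced Avid or Google API.

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a real embedding model: hashes each word
    # into a small fixed-size vector. A production system would
    # call a hosted embedding endpoint instead.
    vec = [0.0] * 64
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    return vec

def cosine(a, b):
    # Cosine similarity between two vectors; 0.0 for empty vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(clips, query, top_k=3):
    # Rank clip descriptions by semantic similarity to a
    # natural-language query and return the best matches.
    q = embed(query)
    scored = sorted(
        clips,
        key=lambda c: cosine(embed(c["description"]), q),
        reverse=True,
    )
    return scored[:top_k]

clips = [
    {"id": "A001", "description": "woman crying in a dimly lit room"},
    {"id": "A002", "description": "crowd cheering at a stadium"},
    {"id": "A003", "description": "sunset over a quiet beach"},
]
print(search(clips, "woman crying in a room", top_k=1)[0]["id"])
```

With a real embedding model, the same ranking step would also surface clips whose descriptions share no words with the query but express the same concepts, which is what distinguishes semantic search from keyword search.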
The next part is filling in the gaps, and it's the "wow" part of this announcement. Agentic AI workflows with digital assistants can autonomously manage complex tasks such as matching visual styles, identifying emotional cues in raw footage, and streamlining metadata logging.
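At its core, an agentic workflow routes a natural-language request to the right tool and executes it without step-by-step supervision. The sketch below is a deliberately simplified illustration: a keyword router stands in for the LLM-driven decision step, and the tool names are hypothetical, not actual Avid or Google functions.

```python
# Hypothetical editing "tools" the agent can invoke. In a real
# system each would wrap an actual media operation; here they
# just report what they would do to a clip.
def match_style(clip):
    return f"{clip}: visual style matched to reference"

def detect_emotion(clip):
    return f"{clip}: emotional cues tagged"

def log_metadata(clip):
    return f"{clip}: metadata logged"

TOOLS = {
    "style": match_style,
    "emotion": detect_emotion,
    "metadata": log_metadata,
}

def run_agent(request, clip):
    # Route a natural-language request to the matching tool.
    # A production agent would let an LLM (e.g. a Gemini model)
    # choose the tool and its arguments instead of keyword matching.
    for keyword, tool in TOOLS.items():
        if keyword in request.lower():
            return tool(clip)
    return f"{clip}: no matching tool"

print(run_agent("Match the visual style of this shot", "A001"))
```

The point of the "agentic" framing is that the editor states intent once and the system decides which of these operations to chain, rather than the editor invoking each one manually.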
The idea is that you can generate B-roll or use this as a way to visualize ideas. Avid has not yet released information on output formats or resolution, but the tools do support Avid's native editing file format, DNxHD (Digital Nonlinear Extensible High Definition). The Avid demo staff said content-generation duration is based on your Google account type. Avid and Google are still working on pricing, and the release date is TBD.
“By embedding agentic AI directly into the tools video editors live in, we’re moving beyond simple automation,” said Anil Jain, Global Managing Director, Strategic Industries GTM, Google Cloud. “With Avid Media Composer and Google Cloud, an editor can now collaborate with an intelligent agent to create assets on the fly and handle the heavy lifting of matching styles and filling timelines, enabling them to focus on storytelling instead of infrastructure.”
While the initial intention is to generate B-roll and assist with visualization, now that we have tools to create video, we've let the genie out of the bottle. A contact of mine had his picture taken at the Google photo booth, and he showed me how, from that single image, they were able to generate video of him moving. The result looked exactly like a real person.
The question now is, should we have labels on video—like nutrition labels on food—that say how much is real imagery and how much is AI-generated?