Introducing the Uni-1.1 API
The new Uni-1.1 API allows creators to integrate Luma’s video generation model directly into their custom pipelines and applications. This update focuses on improved visual quality and more precise control over cinematic motion.
Luma recently released the Uni-1.1 API, giving developers and studios a direct interface for building applications on top of the Dream Machine video generation model. The release shifts the tool from a standalone web playground to an infrastructure component for professional production environments: creators can now automate video creation tasks and integrate generative video into existing software suites.
What's new
The Uni-1.1 API update introduces several technical improvements aimed at professional output. The model features what Luma describes as "directable intelligence," which translates to better adherence to complex prompts and more predictable camera movements. The visual fidelity has been tuned for "production-ready aesthetics," focusing on reducing common AI artifacts and improving the consistency of textures and lighting across generated clips.
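To make those motion controls concrete, here is a rough sketch of what a motion-aware request payload could look like. This is an illustration only, not Luma's published schema: every field name below (prompt, cameraMotion, durationSeconds, seed) is an assumption for demonstration purposes.

```typescript
// Hypothetical request shape -- field names are illustrative assumptions,
// not Luma's documented schema.
interface GenerationRequest {
  prompt: string;            // natural-language scene description
  cameraMotion?: string;     // directable-motion hint, e.g. "slow dolly-in"
  durationSeconds?: number;  // clip length
  seed?: number;             // fix the seed to make iterations comparable
}

const request: GenerationRequest = {
  prompt: "Rain-soaked neon street, cyclist weaving through traffic at dusk",
  cameraMotion: "low-angle tracking shot",
  durationSeconds: 5,
  seed: 42,
};
```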
Key features of the API include:
- High-speed generation: Optimized processing times for rapid prototyping and iterative workflows.
- Directable motion: Enhanced control over how subjects and cameras move within the 3D space of the frame.
- Scalable infrastructure: The ability to trigger multiple generations simultaneously via code rather than through manual web uploads (see the sketch after this list).
- Improved character consistency: Better retention of physical traits across different shots (see the provider's announcement).
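As a rough sketch of that code-driven workflow, the snippet below submits one generation over HTTPS and returns a job id to poll. The base URL, route, auth scheme, and response fields are all assumptions; the authoritative versions live in Luma's API documentation.

```typescript
// Minimal single-request sketch (Node 18+, built-in fetch). Endpoint path,
// headers, and response shape are assumptions, not confirmed API details.
const API_BASE = "https://api.lumalabs.ai"; // hypothetical base URL

async function generateClip(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch(`${API_BASE}/v1/generations`, { // hypothetical route
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ prompt, model: "uni-1.1" }), // assumed field names
  });
  if (!res.ok) throw new Error(`Generation request failed: ${res.status}`);
  const job = await res.json();
  return job.id; // assumed: the API returns a job id to poll for completion
}
```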
How it fits your workflow
For filmmakers and VFX artists, the Uni-1.1 API serves as a bridge between creative concepting and technical execution. Instead of manually prompting on a website, a technical director can script the generation of hundreds of b-roll variations or background plates directly within a project management tool. This makes Dream Machine a viable option for pre-visualization (previz) where specific timing and camera angles are non-negotiable.
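A sketch of that batch scripting, under the same assumptions as the earlier snippet (it reuses the hypothetical generateClip() helper): a base prompt is fanned out into lighting-and-lens variations and submitted in small batches to respect a presumed rate limit.

```typescript
// Fan one base prompt out into many b-roll variations. Batching logic and
// limits are illustrative; check the provider's real rate-limit policy.
async function generateVariations(
  basePrompt: string,
  modifiers: string[],
  apiKey: string,
): Promise<string[]> {
  const jobIds: string[] = [];
  const batchSize = 5; // conservative guess at a concurrency ceiling
  for (let i = 0; i < modifiers.length; i += batchSize) {
    const batch = modifiers.slice(i, i + batchSize);
    const ids = await Promise.all(
      batch.map((mod) => generateClip(`${basePrompt}, ${mod}`, apiKey)),
    );
    jobIds.push(...ids);
  }
  return jobIds; // poll these, then pull the finished clips
}

// Example: 3 lighting passes x 2 lenses = 6 background-plate variations
const modifiers = ["golden hour", "overcast", "night, sodium vapor"].flatMap(
  (light) => ["35mm lens", "85mm lens"].map((lens) => `${light}, ${lens}`),
);
```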
Editors and motion designers can use the API to build custom plugins for tools like Adobe Premiere Pro or After Effects, allowing them to generate assets without leaving their timeline. In a professional setting, this replaces the fragmented workflow of downloading files from a browser and re-importing them into a project. It competes directly with the API offerings from Runway and Pika, though Luma’s focus on cinematic motion and spatial physics gives it a distinct edge for creators who prioritize realistic movement over abstract stylization.
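Whatever the host application, the plugin's job reduces to one handoff: getting the finished file somewhere the editing software can see it. The sketch below, still under hypothetical endpoint and field-name assumptions, polls a job until it completes and writes the MP4 into a watched asset folder.

```typescript
// Poll a (hypothetical) job-status route, then save the finished clip into
// the project's asset folder so the NLE can import it without a browser step.
import { writeFile } from "node:fs/promises";

async function downloadWhenReady(jobId: string, apiKey: string, outDir: string) {
  for (;;) {
    const res = await fetch(`${API_BASE}/v1/generations/${jobId}`, { // assumed route
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    const job = await res.json();
    if (job.state === "completed") {               // assumed status field
      const clip = await fetch(job.assets.video);  // assumed asset-URL field
      await writeFile(
        `${outDir}/${jobId}.mp4`,
        Buffer.from(await clip.arrayBuffer()),
      );
      return;
    }
    await new Promise((resolve) => setTimeout(resolve, 5_000)); // poll every 5 s
  }
}
```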
Sound designers can also benefit by using the API to generate visual references for foley work or atmospheric scoring. By automating the creation of visual cues, teams can sync their audio production to a generated visual guide much earlier in the creative process.
What it costs / how to try it
Access to the Uni-1.1 API is available through Luma’s developer portal. Pricing is typically usage-based, structured around tiers or credits, and interested creators can find full documentation and sign-up details on the Luma Labs website.
Read the original announcement on Luma Dream Machine ↗