AI startup Runway launches app to bring users video-to-video generative AI



Runway, an AI startup that helped develop the AI image generator Stable Diffusion, launched its first mobile app yesterday to give users access to Gen-1, its video-to-video generative AI model. The app is currently only available on iOS devices.

With the new app, users can record a video on their phone and generate an AI video within minutes. They can also transform any existing video in their library using text prompts, images or style presets.

Users can select from Runway’s list of presets, like “Cloudscape,” or transform their video to look like claymation, a charcoal sketch, watercolor art, paper origami and more. They can also upload an image or type an idea into the text box.

The app then generates four previews for users to select from. Once they pick the one they like most, it takes a few minutes to produce the final result; in our testing, generation usually took about 60 seconds and sometimes closer to two minutes.

Naturally, as with any AI generator, the results aren’t perfect and often look distorted or just plain strange. The concept of AI video generators may seem silly, even gimmicky, but as the tech evolves and improves, it could become genuinely useful. Content creators, for example, could use it to spice up their social media posts.

Regardless, we found that Runway’s mobile app was easy to use and overall fun to mess around with.

Below is one example we came up with, using a clip of Michael Scott from “The Office.” The text prompt we entered was “realistic puppet.”

(Warning: the result is terrifying.)

We also tried “3D animation,” which turned out alright.

Granted, there are a few other caveats besides glitches and warped faces.

The free version comes with 525 credits, and users can only upload videos up to five seconds long. Each second of video uses 14 credits, so the free allotment works out to roughly 37 seconds of generated footage, or about seven five-second clips.

In the future, Runway plans to add support for longer videos, co-founder and CEO Cristóbal Valenzuela told TechCrunch. The company will keep improving the app and launching new features, he added.

“We’re focused on improving efficiency, quality and control. In the coming weeks and months, you’ll see all kinds of updates, from longer outputs to higher-quality videos,” Valenzuela said.

Also, note that the app doesn’t generate nudity or copyright-protected work, so you can’t create videos that mimic the style of popular IP.

Runway’s new mobile app has two premium plans: Standard ($143.99/year) and Pro ($344.99/year). The Standard plan gives you 625 credits/month, plus premium features like 1080p video, unlimited projects and more. The Pro plan offers 2,250 credits/month and access to all of Runway’s 30+ AI tools.

A month after releasing Gen-1 in February, Runway rolled out its Gen-2 model. Arguably a step up from text-to-image models like Stable Diffusion and DALL-E, Gen-2 is a text-to-video generator, letting users generate videos from scratch.

Runway has slowly begun to roll out access to its closed beta for Gen-2, Valenzuela told us.

The app currently supports only the Gen-1 model; however, Gen-2 will soon become available, along with Runway’s other AI tools, such as its image-to-image generator.

Meta and Google have both unveiled text-to-video generators of their own, called Make-A-Video and Imagen Video, respectively.

Runway has been developing AI-powered video-editing software since it launched in 2018. The company offers a variety of tools within its web-based video editor, such as frame interpolation, background removal, blur effects, audio cleanup and removal, and motion tracking, among many others.

The tools have helped influencers and even movie and TV studios reduce the time they spend editing and creating videos.

For instance, the visual effects team behind “Everything Everywhere All at Once” used Runway’s tech to help create the scene in the film where Evelyn (Michelle Yeoh) and Joy (Stephanie Hsu) are in a multiverse where they’ve been turned into moving rocks.

Plus, the graphics team behind CBS’ “The Late Show with Stephen Colbert” used Runway to trim down hours of editing to only five minutes, according to art director Andro Buneta.

Runway also operates Runway Studios, its entertainment and production division.


