Review: Skyglass: Movie Effects Camera App

When you're doing live streaming productions these days, you often see a virtual background or background replacement, whether it's greenscreen, bluescreen, or even just the background blur in Teams or Zoom. But there's a new tool that lets you do the same kind of motion-tracked, perspective-matched background replacement that feature films are doing on LED soundstages, like the Volume used on The Mandalorian. There, the background is replaced with video that's created on the fly and tracks the camera's movement: as the perspective changes, the background projected behind the person changes with it. You can do that with an LED volume, but you can also do it with a greenscreen and tracking software on the camera. The real trick now comes with cell phones that have tracking built in, whether through multiple cameras that assess depth or through LiDAR built into the device itself, so the phone can tell when you pan left and right, tilt up and down, or move from side to side.
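
Skyglass doesn't publish how its tracking works, but on iPhones this kind of six-degree-of-freedom camera pose is exactly what Apple's ARKit provides by fusing the cameras, motion sensors, and LiDAR where available. Here's a minimal sketch of the general technique, not Skyglass's actual code: take the tracked pose of the physical camera each frame and copy it onto the virtual camera that renders the background.

```swift
import ARKit
import SceneKit

// Minimal sketch (hypothetical class, not Skyglass's implementation):
// drive a SceneKit camera with ARKit's tracked device pose so a rendered
// background shifts perspective as the physical phone moves.
final class TrackedBackgroundRenderer: NSObject, ARSessionDelegate {
    let session = ARSession()
    let virtualCamera = SCNNode()

    override init() {
        super.init()
        virtualCamera.camera = SCNCamera()
        session.delegate = self

        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh  // uses LiDAR where available
        }
        session.run(config)
    }

    // Called every frame with the fused camera pose (cameras + IMU + LiDAR).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Copy the physical camera's pose onto the virtual camera, so pans,
        // tilts, and side-to-side moves change the rendered background.
        virtualCamera.simdTransform = frame.camera.transform
    }
}
```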

Subscribing to the Skyglass App

Full disclosure: I saw Skyglass mentioned in an NAB news release in 2023. I paid for the service, and I am using it just as any end user would be able to at this point in time.

Because this is a software app on a mobile device, your experience is subject to which device you have versus which device I have, to your internet capability (it uses a live connection), and to any updates released after this video was made. It requires a $25/month subscription. I'm going to use ethernet for my connection here, so I have an ethernet adapter, and I'm screencasting out to an Apple TV to monitor the output. You could go completely wireless using Wi-Fi and screencasting; I'm using an ethernet connection into my cell phone to get a little more reliability.

Tracking and Keying with Skyglass

When I launch Skyglass, it immediately sends the clean output from the mobile device to the feed on your right. In the video, you see me holding the phone out in front of me. As I move the camera, you can see that the background tracks with me; the app uses the camera's sensors to assess how far and how fast I'm moving. I have a greenscreen behind me, but I have not told the app to use it, so you can see a few keying mistakes around my hands. If you use a greenscreen, you can have the app key on it, but what it's doing in this video is keying just me. If I move away from the greenscreen entirely, it is still pulling a clean key.
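
Skyglass hasn't said how its keying works, but on-device person segmentation of this kind ships in Apple's Vision framework, so here's a hedged sketch of the general technique: pulling a person matte from a single camera frame with no greenscreen at all.

```swift
import Vision
import CoreVideo

// Sketch of greenscreen-free keying via person segmentation. This
// illustrates the technique generally, not Skyglass's actual pipeline.
func personMask(for frame: CVPixelBuffer) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced            // trades accuracy for latency
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
    try handler.perform([request])

    // The result is a single-channel matte: white = person, black = background.
    return request.results?.first?.pixelBuffer
}
```

Compositing that matte over the tracked background completes the effect; running a neural matte on every frame is also one plausible source of the processing delay discussed next.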

If I tell it to key on the greenscreen, the key around me will be cleaner, but then you're limited to just the greenscreen area. This is what it looks like on the phone itself: you can see I'm doing the key, and up at the top right it says "Tap here to switch environments." You can record the video internally, or you can use it for live switching. You can also see there's a delay between me speaking and the video being returned from the mobile device; there is a delay in the processing. You can spin the camera around to the rear cameras, and I'm not there. Touching the icon at the top opens a whole set of camera controls, including focus lock, white balance, exposure compensation, and color filters.
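
Those controls correspond to standard capture settings on iOS. As a rough sketch of what focus lock, white balance lock, and exposure compensation look like at the API level (assuming you already have a configured AVCaptureDevice; this is illustrative, not Skyglass's code):

```swift
import AVFoundation

// Rough sketch of the named camera controls on an AVCaptureDevice:
// focus lock, white balance lock, and exposure compensation.
func lockCamera(_ device: AVCaptureDevice, exposureBias: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    if device.isFocusModeSupported(.locked) {
        device.focusMode = .locked               // focus lock
    }
    if device.isWhiteBalanceModeSupported(.locked) {
        device.whiteBalanceMode = .locked        // white balance lock
    }
    // Exposure compensation, clamped to the device's supported range.
    let bias = max(device.minExposureTargetBias,
                   min(device.maxExposureTargetBias, exposureBias))
    device.setExposureTargetBias(bias, completionHandler: nil)
}
```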

Below that is where you tap to select your greenscreen. You can see that my greenscreen doesn't quite match all the way around, and when I reach the edge of the greenscreen, that's also the edge of the keying. That's the tradeoff of greenscreen mode: you can get a better edge, but you're limited to the greenscreen area. So we can turn that off and go back to the AI background assessment, which is part of where the delay comes from. Up at the top is where you can access their backgrounds, such as a cool cityscape. Let me spin the camera back around. Now, as I move the camera up and down or around, you can see how well this thing tracks with the built-in camera. And again, if I point all the way up, you can see I'm not locked into just the greenscreen.
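
Chroma keying on a greenscreen, by contrast, is cheap and deterministic, which is why its edge is cleaner: any green-dominant pixel simply becomes transparent. A sketch of the classic approach using Core Image's CIColorCube filter (the thresholds here are illustrative guesses, not Skyglass's values):

```swift
import CoreImage

// Build a 64x64x64 color cube that makes green-dominant pixels transparent.
// The dominance thresholds below are illustrative, not Skyglass's values.
func chromaKeyFilter() -> CIFilter? {
    let size = 64
    var cube = [Float](repeating: 0, count: size * size * size * 4)
    var offset = 0
    for b in 0..<size {                        // blue is the outermost axis
        for g in 0..<size {
            for r in 0..<size {
                let rf = Float(r) / Float(size - 1)
                let gf = Float(g) / Float(size - 1)
                let bf = Float(b) / Float(size - 1)
                // Treat pixels where green clearly dominates as background.
                let alpha: Float =
                    (gf > 0.4 && gf > rf * 1.4 && gf > bf * 1.4) ? 0 : 1
                cube[offset]     = rf * alpha  // premultiplied RGB
                cube[offset + 1] = gf * alpha
                cube[offset + 2] = bf * alpha
                cube[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let data = Data(bytes: cube, count: cube.count * MemoryLayout<Float>.size)
    let filter = CIFilter(name: "CIColorCube")
    filter?.setValue(size, forKey: "inputCubeDimension")
    filter?.setValue(data, forKey: "inputCubeData")
    return filter
}
```

A lookup table like this runs per pixel with no neural network involved, which is why greenscreen mode can key faster and tighter, while the AI mode works anywhere but adds latency.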

Choosing a Background

I'm very impressed with how well this thing tracks. But will you be using a cyberpunk 2032 background in your live streaming? Maybe not, but you are not limited to those. You can do a cityscape. This one I found particularly useful: it is a full set. One of the features here is that it actually lets you set the height of the camera in the set. You can see it set to five feet seven inches, which is my height, and when you tap on this, it lets you adjust the height, as I'm showing you in this interface.

Now let me actually pull it up. Here I am on the set, and when I tap this, I can make myself higher, or I can put the virtual camera lower to match the actual placement of the camera facing me. If the camera is at five feet, you want the environment's camera at five feet too. You don't want to be tilting up from a low angle while the virtual camera acts as if it's higher; you don't want a disparity between the virtual environment and the real environment.
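
In rendering terms, matching heights just means overriding the virtual camera's vertical position. A tiny sketch, assuming the environment's floor sits at y = 0 and heights are in meters (five feet seven inches is about 1.70 m):

```swift
import simd

// Sketch: pin the virtual camera's height above the environment's floor
// (assumed at y = 0) so the rendered perspective matches the physical
// camera's real-world height. 5 ft 7 in is roughly 1.70 m.
func matchCameraHeight(_ pose: simd_float4x4,
                       physicalHeightMeters: Float) -> simd_float4x4 {
    var adjusted = pose
    adjusted.columns.3.y = physicalHeightMeters  // override vertical position
    return adjusted
}

// Per frame: take the tracked pose and force it to the measured height.
let tracked = matrix_identity_float4x4           // stand-in for a real pose
let rendered = matchCameraHeight(tracked, physicalHeightMeters: 1.70)
```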

Moving Around a Virtual Environment

Another really cool feature is the ability to move around the environment. You could use this app just to create stills in a virtual environment, moving around the set so you can frame those stills exactly where you want them. Let me show you how to do that.

Right here it has a little game controller icon. I'm going to tap that, and I get two controls: the left one lets me move around the environment up, down, left, and right, and the other lets me move forward and back. Let me show you what that is like. Here's the environment again. I'm using the rear cameras, so there's no person in front of it to key, but this is an interesting angle.
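
Under the hood, those two controls amount to translating the virtual camera within its own coordinate frame. A minimal sketch of that math (the names are mine, and it assumes the usual ARKit/SceneKit convention that the camera looks down -Z):

```swift
import simd

// Sketch: translate a virtual camera in its own frame from two joystick
// axes. `strafe` and `forward` are joystick values in -1...1; `dt` is the
// frame time. Assumes the camera looks down -Z (ARKit/SceneKit convention).
func moveCamera(_ pose: inout simd_float4x4,
                strafe: Float, forward: Float,
                speed: Float, dt: Float) {
    let right = simd_make_float3(pose.columns.0.x, pose.columns.0.y, pose.columns.0.z)
    let ahead = -simd_make_float3(pose.columns.2.x, pose.columns.2.y, pose.columns.2.z)
    let delta = (right * strafe + ahead * forward) * speed * dt
    pose.columns.3 += simd_make_float4(delta, 0)  // move, keep orientation
}
```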

But what if I wanted to have two people and a different angle? I can move to the left, rotate to the right, then zoom in (or walk in) and move over to the left again. Now I have the reverse-angle shot, and this is outputting 1080p from my phone. I could literally grab that frame and use it as the background for a standing talk on a virtual set.

What if I wanted a different angle? I can move over here, walk down here so that top headlines element is out of frame, move back a little, pan left, and back up. There we go: someone could be standing right in that space. And of course, this is tracking, so if I just want to look around, I can move my cell phone and change the view. It is a full virtual environment: ceiling, floor, and everything.

The app also says it will load Unreal environments, which means you can create your own virtual environment and use it on the phone itself. And as with most iTunes subscriptions, you purchase it once and can have it on multiple devices. That is indeed a very cool feature if you plan on using this for multiple cameras in an actual environment.

Generating Backgrounds

Also built into the system is generative AI, which means you can generate your own backgrounds. The level of service I'm paying for lets me generate 200 backgrounds. We're going to click on one: "A gothic cathedral interior with soaring arches." You can pick a style such as digital painting, anime, fantasy lands, sci-fi, cyberpunk, modern computer animation, dreamlike oil painting, sky interior views, pen and ink, technical drawing, cartoon claymation, super art, or holographic. I'm going to go with realistic.

Or we could choose "indoor modern office" for an office conference room, and it generates that for me. I can look around the room, and that is not bad at all. The windows don't line up exactly, but if you're focused on the content in the scene, that's not going to be as big of a deal.

Let's try another one. I've got an outdoor modern patio area, and here is my patio furniture. The patio furniture looks good, and it leads to another cove area in the background. That works as long as you're not paying too much attention to what look to be two different kinds of green bushes, a diving board that goes off into the bushes, a couch that does the same, and trees that don't move... I guess you'll have to go through multiple prompts to make sure the diving board doesn't end up in the bushes. But that's generally what AI does right now: it gives you these odd decisions.

But the tools to make your video look better and better are getting easier to use and more affordable. You don't need to start with anything complex: just a light, a greenscreen, and a cell phone, and just like that you can have a fantastic-looking video, move around the environment, and do a whole lot more with just a phone.
