Streaming Media


NAB 2024: PTZOptics Talks Hive Studio and Remote Camera Control

PTZOptics Director of Technology Matthew Davis discusses Hive Studio, PTZOptics' new cloud-based remote camera control solution, in this interview with Streaming Media's Shawn Lam at NAB 2024.

Shawn Lam: I'm here with Matthew Davis, he's the director of technology for PTZOptics, and today you've got a big announcement. It's coming in June and it's the Hive, right? Tell us a little bit about that.

Matthew Davis: We are super excited about the launch of this brand new product. We've been thinking about this for countless years at this point, and finally have something we think you are all going to absolutely love. What we've created is basically a remote control solution for cameras. Now, this isn't just focused on our own cameras; we support as many cameras as possible, currently over 400 unique models. So what we've got here is a nice solution for a variety of producers out in the wild.

Hive Studio is going to be launching very soon, and when it does, it's going to have three different tiers available to end users. The tiers start with a free basic model, where you get one camera you can control so you can experience the platform a little bit. You can jump up to a three-camera solution or an unlimited-camera solution depending on your needs.

PTZOptics Hive

Shawn Lam: And so these tiers are available on a subscription model?

Matthew Davis: They're subscription, they'll come in a monthly or yearly flavor depending on people's needs, budgets, all sorts of things like that.

Shawn Lam: So you're controlling remote cameras with software. Give us a bit of an overview.

Matthew Davis: So you will have a couple of ways to approach the system. There'll be a nice little agent you can run on your PCs; it'll connect everything on your network and make everything really easy. The same agent will allow you to operate the system even without the cloud. So you get the same interface to control everything regardless of whether you're cloud-based or not at the moment. Now, the neat thing, once you go cloud-based, is that those cameras, if they're PTZOptics, don't even need an agent. They don't need a PC in the middle. You can literally send one of our cameras to the other side of the world, they plug it into the internet, and it will show up right in your studio, in the web interface.
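PTZOptics cameras are controllable via Sony's VISCA protocol over IP, so the job of an agent like the one Davis describes largely comes down to packaging commands like the one below. This is a hypothetical sketch, not the Hive agent's actual code; the camera IP is a placeholder, and the UDP port 1259 is the one PTZOptics documents for VISCA-over-UDP:

```python
import socket

def visca_pan_tilt(pan_speed: int, tilt_speed: int, direction: str) -> bytes:
    """Build a VISCA Pan-Tilt Drive command: 8x 01 06 01 VV WW PP TT FF."""
    dirs = {
        "left":  (0x01, 0x03),
        "right": (0x02, 0x03),
        "up":    (0x03, 0x01),
        "down":  (0x03, 0x02),
        "stop":  (0x03, 0x03),
    }
    pp, tt = dirs[direction]  # pan-direction byte, tilt-direction byte
    return bytes([0x81, 0x01, 0x06, 0x01, pan_speed, tilt_speed, pp, tt, 0xFF])

def send_command(cmd: bytes, host: str, port: int = 1259) -> None:
    """Fire a VISCA command at a camera over UDP (no reply handling here)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(cmd, (host, port))

# Pan left at a moderate speed, then stop (camera IP is illustrative):
# send_command(visca_pan_tilt(0x08, 0x08, "left"), "192.168.1.50")
# send_command(visca_pan_tilt(0x00, 0x00, "stop"), "192.168.1.50")
```

A cloud service adds a relay layer on top of this, but the command vocabulary reaching the camera is the same.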

Shawn Lam: And if they're one of the other 400 cameras that you have already programmed in there, it's just a bit of simple programming, right?

Matthew Davis: Yes. If you have a Windows or Mac PC, you run a little agent on there, and it will accomplish the same thing for you.

Beyond just that, there's a nice low level of switching that people will get in here. I mean, you've got a bunch of cameras; you need some way to do something with them, and that's why we have switching. The software gives you an NDI output, so you can bring that into your existing vMix systems, your OBS, your TriCasters, and leverage the equipment you already have. The platform will also handle recording and auto tracking, so if your cameras don't already have auto tracking built in, the platform will add that for you.

PTZOptics Hive

Shawn Lam: And that's kind of one of the neat things I saw when I was watching the demo: it supports cameras that don't natively have auto tracking built in, and it's not only just really quick movements, right? I mean, you guys call it "cinema movement."

Matthew Davis: We've done this before, and I appreciate you identifying that, because we could just deploy an auto tracking that does all these robotic movements. Instead, we're trying to bring to the table something that makes it very easy for each end user to produce something of a high quality. We all know what TV production and movie production look like, and while we all want to attain that, it can be a little bit unrealistic sometimes, either with the experience or the equipment that you have on hand. But with something like Hive Studio in the middle, you're allowing people to bridge that gap very easily.

Shawn Lam: When you're looking at auto tracking, what are some of the human elements that you've incorporated into this technology to make it feel like there is a human operator on it and we're able to control it a little bit better or make decisions to say I want left or right or center bias?

Matthew Davis: So I was very fortunate, actually: thanks to NAB, I got introduced to a producer near the very beginning of us launching all of this. He shared what they would use in their television studios and the styles of production they expected when they had a human operator. He was like, "No, no, no, I see the movement you've got going on. It works, but here's how you ease it. Here's how you smooth it. Here's what I would expect my camera operator to do."

And literally as soon as I had somebody kind enough to share how they did it, well guess what? There's no reason we can't bake it into a platform for end users where they don't even need to be aware that they're doing these cinematic movements. It's really making it happen for them.

Shawn Lam: That's excellent. So let's talk a little bit about the recording capabilities. Once we get this footage up into the cloud, now we have the ability to stream it, but can we ISO record each individual input?

Matthew Davis: Exactly. And the beauty of the system is that if you have existing on-premise equipment, you can be doing it in the cloud, you can be doing it with your existing equipment, and you can even be doing it on the remote production end--not true ISO recording per channel, but at least a single recording coming out at that point.

Shawn Lam: Alright, let's go have a look.

Matthew Davis: So what we're looking at here is actually Hive Studio operating in a local mode. What's going on here? We have cameras that are local to this network, so in theory we're not using any cloud resources; nobody would have to pay for anything in this type of mode. One of the neat things I was talking about before is the ability to support multiple studios, so from this one interface I can actually jump around between them. We're going to have some fun here; let's see if Paul's home office is actually live. Some of the neat features we've got here: there's a click-to-center, you can make it different sizes, and you can zoom in on different areas. If we come back out, you'll see that we also have things like the nice cineframe or the center movement. So we saw what that movement did.

Shawn Lam: And that was the really fast one, whereas the cine movement is the smooth one--the human operator imitation.

Matthew Davis: Yes. One of the things that we have heard from clients--as much as I strove to get all these cinematic movements, whether it's in the cameras or this platform, ultimately we had a lot of clients that were like, "No, no, no, no. I do a lot more scene switching. I just want it to move quickly to where it needs to go."

Shawn Lam: Are there timing controls or is it a preset duration of that movement?

Matthew Davis: It is a preset duration, but you can alter the time span by adjusting the movement speeds. You also have more basic control capabilities. So here we get a nice little arrow: the further out you go, the closer you get to the maximum velocity. This one has a pretty slow speed set for it, it looks like. If we come over here to our controls section, this in theory is everything that you would natively find in the onscreen display. Every adjustment, whether it's image or motion control, is exposed through this web interface. You do not need to go jumping between a camera's web interface and separate control or management platforms--just one spot to do everything from. One of the other nice aspects is the ability to easily color balance multiple cameras, using one interface to compare between them.
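The drag-out arrow Davis describes is a classic proportional control: drag distance maps to velocity, capped at the camera's maximum speed. A hypothetical sketch of that mapping (the 200-pixel full-scale value is illustrative; the cap of 24 matches VISCA's typical maximum pan speed of 0x18):

```python
def drag_to_speed(drag_px: float, full_scale_px: float = 200.0,
                  max_speed: int = 24) -> int:
    """Map how far the on-screen arrow is dragged to a pan speed,
    saturating at the camera's maximum velocity."""
    fraction = min(abs(drag_px) / full_scale_px, 1.0)
    return round(fraction * max_speed)
```

A short drag gives a slow, controllable crawl; dragging past full scale simply pins the camera at its top speed, which is the behavior visible in the demo.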

We can quickly set these into the available modes and then go into manual. I'm going to make this look real funky, but as you can see, you're able to make these live adjustments on the fly. If some random light comes into your scene, you can actually adjust for it without a problem; as we know, it's not always predictable when you go into live production. So one of the other nice aspects of the system is the ability to compare not just between two PTZOptics cameras or two Sony cameras: one of the larger things we hear is that balancing between an existing Sony, an existing Panasonic, and our PTZOptics can be a very difficult thing. Once you've got this unified interface with all the same controls, actually getting them in line becomes an easy feat for almost any beginner.

Shawn Lam: And that's big because so many camera manufacturers have very poor camera color controls. They just don't play well with others.

Matthew Davis: Yes, and one of the nice things that we've found is that as we gain control over these cameras, we found ways where we could implement better control over their color aspects than they were offering to their own end users.

Shawn Lam: Let's go through an end-to-end workflow. So you start off with your PTZ camera, you connect with your Hive Cloud, and then what?

Matthew Davis: That's where the magic really happens. You've got all this stuff floating around in the cloud, but where do you get it? How do you direct it? What do you do with it? Within the Hive platform itself, you've got a couple of options. The platform will have an NDI output, so you can bring it into existing workflows with TriCasters, OBS, vMix, no problem, or the large number of NDI-supporting products out there. You also have the option of using on-premise equipment for mixing. Just because your camera may be remote and your producer may be remote, there's no reason the local equipment somebody's already invested in can't still be used for all the recording, just as it always was; you're simply handling control through Hive at that point. Going beyond that, if you start getting into the recording capabilities, you're starting to look a little bit more at the post-production end of this.

Shawn Lam: And then the streaming. Is that done locally or is that done in the cloud, from the cloud?

Matthew Davis: That can actually be done either way. It has a cloud mode and a local mode. So you don't need to eat up your cloud resources, so to speak, if you don't need them. And you can pump right out from the agent that will work live on your PC, regardless of having a cloud connection or not. But hey, if you've got the cloud connection, you can do all these extra bells and whistles is the nice thing.

Shawn Lam: What's a typical use case for the Hive Cloud?

Matthew Davis: We envisioned a handful of use cases, it seems like, between education and the church markets, and I actually haven't heard from anyone yet who hasn't said, "Oh wait, I have a place for that." One was a very large-scale, end-user sales type of scenario, where they saw amazing benefits in allowing producers to make their talent look very good as they presented different products. But we're finding a lot of interest, oddly, in the educational market--way more than I originally thought. Beyond that, we've always had remote producers. People that are creating studio-in-a-box products have also been very intrigued by this, with the idea that they can take one of our cameras, put it in a box, ship it off, and there's nothing else for them to do.

Shawn Lam: Let's talk a little bit about latency. So how much time does it take to get up into the Hive Cloud and then back down?

Matthew Davis: So, I like to give this as an example for most people. I can get into the numbers, but the system has allowed me, from over 3,000 miles away, to properly track an individual by hand. There were no missteps, no problems; it had a very natural feel to it. As far as latency goes, depending on the scenario, right now what we're seeing is probably less than 300 milliseconds, and that's for that 3,000-mile journey to occur.
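That 300 ms figure is a plausible number to sanity-check: a 3,000-mile path has a hard physical floor set by the speed of light, and everything above that floor is capture, encoding, cloud relay, decoding, and display. A quick back-of-the-envelope check:

```python
MILES = 3000
KM_PER_MILE = 1.609
C_KM_PER_S = 299_792  # speed of light in vacuum, km/s

# One-way propagation floor for a 3,000-mile path.
one_way_ms = MILES * KM_PER_MILE / C_KM_PER_S * 1000  # ~16 ms

# Light in optical fiber travels roughly 1.5x slower than in vacuum.
fiber_one_way_ms = one_way_ms * 1.5  # ~24 ms

# So raw propagation accounts for only a small slice of the ~300 ms;
# the rest is the capture/encode/relay/decode pipeline.
```

In other words, sub-300 ms glass-to-glass over that distance leaves a comfortable budget for the video pipeline while staying responsive enough for hand-tracking, which matches Davis's demo experience.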

Shawn Lam: That's fast.

Matthew Davis: Yeah, it's really fast. And we've seen lower, but I'm not going to go claiming any numbers there for anyone yet.

Shawn Lam: Thank you very much, Matt.

Related Articles
Adobe Senior Product Manager Kylee Peña discusses and demos new audio workflow updates in Premiere Pro designed to save time and reduce mouse usage in this interview with Streaming Media's Marc Franklin in the Adobe booth at NAB 2024.
Blackmagic Design Director of Sales Operations Bob Caniglia discusses key features of Blackmagic's new URSA Cine 12K, DaVinci Resolve 19, 4K switchers, and more in this interview with Streaming Media's Marc Franklin from the Blackmagic Design booth at NAB 2024.
JVC's Alicia Reed discusses key features of JVC's KY-PZ510 and KY-PZ540 PTZ cams--including extra-narrow zoom, super-wide angle, and smart auto-tracking--as well as JVC's new 12-channel vMix Studio Switcher with NDI, SRT, and SDI support in this interview with Streaming Media's Marc Franklin from the JVC booth at NAB 2024.
Shure's Russ Shelfman discusses new audio production developments from Shure including the new MoveMic, SLX-D Portable mic, the MV7+ radio and podcast mic, and more in this interview with Streaming Media's Marc Franklin from the Shure booth at NAB 2024.
In this interview from the Panasonic booth at NAB 2024, Panasonic Connect Director of Product Management Chris Merrill and Streaming Media's Marc Franklin discuss Panasonic's new 4K UE30 PTZ cam, the Lumix S5 II/X, SMPTE 2110 support, and new developments in Panasonic's KAIROS live production platform.
In this interview from the Western Digital booth at NAB 2024, Streaming Media Producer's Marc Franklin talks with Western Digital Director of Product Marketing Christina Garza about WD's latest 4TB SanDisk SDXC cards and 24TB hard drives.
Allen & Heath Marketing Specialist Richard Starr gives viewers a close-up look at Allen & Heath's CQ Series digital audio mixers with their touchscreen and physical controls, automatic mic mixer, presets for conferences, garage bands, and more, and remote operation capabilities in this interview with Streaming Media's Shawn Lam from the Allen & Heath booth at NAB 2024.
Among the key features of vMix 27 are Zoom integration, enabling remote streaming producers to bring in (theoretically) an unlimited number of remote guests, vMix Senior Systems Engineer Heath Barker reports in this interview with Streaming Media's Shawn Lam in the vMix booth at NAB. Barker also does a quick hands-on demo of how the feature works.
In this interview from the Blackmagic Design booth at NAB 2024, Blackmagic Design's Bob Caniglia and Streaming Media's Shawn Lam discuss how Blackmagic is enabling producers to convert 4K and HD signals to SMPTE 2110 so they can move content across IP networks, with their new open-source 2110 IP codec and new 10 gig port-equipped Blackmagic Design cameras that support it like the PYXIS 6K and the URSA Cine 12K.
Exciting new and (mostly) AI-driven tools and services from NAB 2024 that solve very specific problems, from shooting great iPhone footage to automatically creating short clips to providing live low-latency translation and captioning to creating customized radio programming to building purpose-driven social communities.
On the show floor at NAB 2024, Shawn Lam of Streaming Media and SLV Live interviews Atomos CEO Jeromy Young about the new Atomos Ninja Phone, which turns an iPhone 15 Pro or Pro Max into a 1600nit, 10-bit, 2,000,000:1 contrast ratio, 460ppi, HDR OLED, ProRes monitor-recorder for any pro HDMI camera.
Anthony Burokas introduces streaming producers to a brand new BirdDog in the new X1 PTZ cam, which departs from past models with a new design featuring a Wi-Fi antenna, Halo Tally, AI Tracking, and an industry-first e-ink display for confidence monitoring.