
Streaming the Universe: A Q&A with GM & Head of NASA+ Rebecca Sirmons


After launching on November 8, 2023, and the sunsetting of NASA TV (NASA’s linear channel) in August 2024, NASA+ has become the official destination for all NASA content, whether you are watching on Amazon Prime, Netflix, or NASA’s website and app. NASA is currently planning to stream the first crewed launch to the Moon in over 50 years, Artemis II (estimated viewership 25M), followed by what is expected to become the world’s largest live streaming event, the Artemis III moon landing (estimated viewership 250M).

In this Q&A, I spoke with General Manager and Head of NASA+ Rebecca Sirmons about how her team built NASA+ and the infrastructure, gear, ops, and workflows that launch live streams at this massive scope and scale.

Rebecca Sirmons, General Manager, Head of NASA+, NASA’s official home for streaming live and original content

What is NASA+ and how does it differ from NASA’s previous streaming efforts?

NASA has been streaming for a while now. When we launched NASA+ in 2023, I like to say we weaned ourselves off the linear network. Now that we’ve cut the cord, I’m very proud of us. We’re fully streaming the universe—everything from space flights and rocket launches to the International Space Station and original content.

NASA+ launched in 2023

Who is your audience?

Our mandate in the 1958 Space Act states that we have to reach the widest possible audience, which means that no matter where you are in the world, you can watch NASA.

What is the Artemis program?

Artemis is us going back to the moon. For Artemis I last year, we went out and tested the rocket itself. There was no crew aboard. Artemis II is very similar to Apollo 8, where we test. We’re making sure the crew is healthy, the vehicle is working properly, and that everything is working to get us ready for the moon landing, which is Artemis III. We don’t have a true launch date yet until we go through the proper procedures to make sure everything’s all good.

NASA’s Artemis III moon landing mission

How do you measure ROI and success as a government organization?

650 million people watched the moon landing on CBS in 1969 with Walter Cronkite. I doubt we will hit that exact number again. For now, we are spec’ing around 250 million [for our Artemis III moon landing]. We have to think outside the box and work with industry partners. Success on the NASA side means we can guarantee you a great feed, but we’ll need help sharing it with the world.

Does that mean you’re distributing through partners?

Yes, we have partnerships with Netflix and Amazon Prime. We do YouTube, and we work with VOD partners like Kanopy.

NASA+ on Netflix

Do you distribute through partners for load-balancing reasons?

Exactly. You do it for load-balancing reasons. A lot of people are doing it now because we all realize that streaming is hard—especially streaming from space. In the private sector, we’re always worried about the last mile of delivery. With space, I’m also worried about the first million miles.

Do you get feedback from viewers about the latency?

No, because it’s physics. We’ve literally pushed latency as far as it can go. Latency for NASA is the speed of light, which from the moon is 1.3 seconds. Take that, ground-based jumps! Ultimately, you’ll get two to three, or possibly four, seconds of delay.
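The 1.3-second figure quoted above is straightforward to verify from first principles. Here is a minimal sketch, assuming the average Earth–Moon distance of roughly 384,400 km (the actual delay varies as the Moon moves through its orbit):

```python
# Back-of-the-envelope check of the one-way light delay quoted above.
# 384,400 km is the average Earth-Moon distance; the real figure varies.
SPEED_OF_LIGHT_KM_S = 299_792.458   # km/s, exact by definition
MOON_DISTANCE_KM = 384_400          # average Earth-Moon distance

def one_way_delay_s(distance_km: float) -> float:
    """Propagation delay in seconds for a signal traveling distance_km."""
    return distance_km / SPEED_OF_LIGHT_KM_S

print(f"{one_way_delay_s(MOON_DISTANCE_KM):.2f} s")  # ~1.28 s
```

That irreducible ~1.3 s floor is why the remaining engineering effort goes into the ground-based portion of the delay.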

Which areas of the agency are involved in running NASA+?

At NASA Headquarters, we manage NASA+ overall operations, programming, content development, and strategic partnerships.

All technical operations for NASA+ are handled out of Marshall Space Flight Center in Alabama. This includes the NASA+ Engineering Hub and virtual master control.

At the Kennedy Space Center in Florida, we launch live broadcasts, including the upcoming Artemis II Live Launch.

Johnson Space Center in Texas is the point of origin for all live streaming from space activities. This includes ISS live broadcasts (downlinks and spacewalks), and space operations live broadcasts, among them the upcoming live broadcasts for Artemis II (lunar flyby, splashdown) and the upcoming Artemis III moon landing. 

What does the workflow look like?

Live events could be launched out of any of the centers. Primary live events come out of Kennedy Space Center (live launch broadcasts) and Johnson Space Center (live streaming from space).

Technical infrastructure is interconnected across all the centers so that we can operate via a virtual master control room. We are cloud-based, which makes us nimble.

Live streaming from space is switched internally, and one feed is shared with the world. It’s one more hop from there to Marshall, and then it goes out to our core via SRT feeds. We have our HLS that we get out to Akamai for our apps. For our internal NASA+ feeds, we have clean feeds that we get out to CNN and other partners.
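The chain just described—one internal switch, a hop to Marshall, SRT to the core, then fan-out to HLS and clean feeds—can be sketched as a simple routing table. The hop names below come from the interview, but the exact topology and hop counts are assumptions for illustration, not NASA’s actual architecture:

```python
# Illustrative sketch of the distribution chain described above.
# Hop names are taken from the interview; the structure is an assumption.
COMMON_PATH = ["internal switch (space feed)", "Marshall (SRT to core)"]

OUTPUTS = {
    "NASA+ apps": COMMON_PATH + ["HLS via Akamai"],
    "partner clean feeds": COMMON_PATH + ["clean feed to CNN and other partners"],
}

for destination, path in OUTPUTS.items():
    print(f"{destination}: {' -> '.join(path)}")
```

The design point worth noting is that every output shares the same short common path; only the final delivery hop differs per audience.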

After many, many years of building an infrastructure, we can continuously stream in 4K. Live streaming from the ISS (in Low Earth Orbit) is mapped out: the station has to be positioned over certain relay satellites to bring the signal down.

At Marshall, we have the ultimate remote production streaming team. They run our virtual master control room during a live event. For redundancy, we have encoders set up across several centers.

Who are your streaming leads?

Megan Cruz, Live EP, NASA+ 

  • Location: Kennedy Space Center 
  • Currently producing: Artemis II and other live broadcasts out of KSC

Nilufar Ramji, Live EP, NASA+ 

  • Location: Johnson Space Center
  • Currently producing: Artemis II and other broadcasts out of JSC

Jori Kates, Director NASA+ 

  • Location: NASA Headquarters
  • Currently directing: Artemis II live broadcast out of Kennedy Space Center 

Lee Erickson, Head of Streaming Tech NASA+ 

  • Location: Marshall Space Flight Center
  • Currently: Head of all technical execution and workflows for NASA+ 

How do teams coordinate to synchronize a live event?

Our brilliant live EP Nilufar Ramji leads the charge on the production side. Then we have our distribution side. We do a NASA+ tech call, where our app team connects with our web team.

We work with Lee Erickson, our head of streaming tech, and his team to coordinate with the centers and make sure that we’re all set to pull off a live broadcast. For every event, we build a workflow. We always have multiple redundancies, typically three for each live event.

What codec are you using?

We use H.265/HEVC.

What are you streaming?

For Artemis II, we have cameras outside of the vehicle and cameras inside the spacecraft, so you’ll be able to see the astronauts live, and we’ll be able to see the moon from outside. We’ll have a 24/7 feed as well as our live broadcast. We’re going to have three key broadcasts. First is the rocket launch. Then there’s the lunar flyby, where you’ll see the moon, which is going to be pretty exciting—we’ll be live streaming the moon, and it’ll be the first time that we’ll be experiencing this in over 50 years. The third, of course, is splashdown, when we’ll bring our crew home.

Artemis III is the thing that I am honestly the most excited about, to provide the world one of the greatest feeds of all time, live streaming in 4K from the moon. That is a difficult task. When Artemis goes around the moon, they’re going to be on the dark side of the moon. We will not have signal for a period of time, because it’s just physics.

What makes the moon landing broadcast so challenging?

We’re going to try to really provide people imagery that they’ve never seen before. I want people to be able to see the science and be able to see what it looks like. Back in 1969, it was a little blurry, but it was pretty incredible. For this broadcast, we’re talking about deep blacks and deep whites.

There’s no color on the moon?

No, there's no color. It's difficult to wrap your mind around it.

How do you manage the camera feeds? Do astronauts point the cameras?

Yes, they’ll use a handheld camera. We’re actually going to see what we can do to make sure that one is live streaming as well.

We’ll also have live cameras on the helmets. We have live cameras set up on the craft itself or the lander, so there are different points of view. There’s a lot of integration. And the great thing is, it’s not just about sharing this imagery for the public around the world. All of these cameras serve different purposes at any different point in time. It’s really fascinating. You’ll be able to see some things that you’ve never seen before.

Did you build the aspects of your workflow in-house, or are you using commercial products?

For our overall workflow, it’s all homegrown. Lee Erickson, our head of streaming, built it based on the tools that we already have. They built the infrastructure, the workflow, and a way of doing it that’s sustainable. It’s cloud-based, so we can go from Kennedy Space Center to Johnson Space Center. If Goddard Space Flight Center in Maryland needs to run the broadcast from their location, or if we need to run a broadcast from headquarters, we can.

A schematic of the NASA+ streaming workflow. Image credit: Lee Erickson 

When developing NASA+, how did you approach the build-versus-buy decision?

When I first got here, I needed to see how everything was currently working in the NASA TV infrastructure. I had to figure out who was doing what, and how everything was being communicated. I put together a network operations manual that covered how we work with each other.

The big reveal is that we already had a streaming infrastructure. One of the main things we stopped doing was using satellite. We moved into AWS for a virtual master control room. That was a big move. It helps us now move quicker. We can turn things around on a dime. This is due to the long experience of our engineers at NASA, and the fact that they’re always willing to learn and to grow.

Let’s talk about hardware versus software. What’s your approach for encoding?

In the private sector, I would use software-based encoders. At NASA+, knowing what I know now about radiation testing, extreme environments, it’s hardware-based encoders all the way. They don’t take as much power, they can be more ruggedized, and the likelihood of them succeeding is higher.

Going to the moon, you are going on the most epic camping trip. You have to bring everything. You have to pack it all. You can’t go back. You can’t say, “Oh, can you fix this encoder?” We have to have reliable equipment. It’s risk mitigation, but also radiation.

How does radiation affect your equipment?

If you watch one of our feeds, you’ll say, “Why does it look like that? Why does that camera look like a Christmas light in the distance?” That’s from all the pixels being fried from radiation. We’re lucky if a camera can stay over there for a couple of years.

We try to put all of our encoders and our cameras through radiation testing. These are all the restraints that we have to deal with. Weight is another—things can’t be heavy. There are many things to think about.

How do you optimize latency while streaming?

We drastically reduced the number of hops that we have on the ground-based side. It just goes from Johnson Space Center straight to Marshall. That’s our hop.

From the ISS, and eventually Artemis, it really has to do with bringing down the most minimal feed and reducing hops. That’s the most basic way for us to minimize any sort of latency on our end.
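Putting the pieces together: the propagation floor from the Moon plus the ground-side stages should land in the two-to-four-second range quoted earlier. Here is a rough glass-to-glass budget; the 1.3 s propagation figure comes from the interview, while the per-stage ground numbers are illustrative assumptions, not NASA’s measured values:

```python
# Rough end-to-end latency budget. Only the 1.3 s propagation figure is
# from the interview; the other stage values are assumed for illustration.
BUDGET_S = {
    "space-to-ground propagation": 1.3,   # Moon light delay, per the interview
    "ground hop (JSC -> Marshall)": 0.1,  # assumed
    "encode + SRT contribution": 0.5,     # assumed
    "HLS packaging + CDN": 1.5,           # assumed, low-latency HLS range
}

total = sum(BUDGET_S.values())
print(f"estimated end-to-end delay: {total:.1f} s")
```

With these assumed stage values the total comes out inside the two-to-four-second window, which is consistent with the delay figure given earlier in the interview.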

With NASA+, are you taking the same video feeds that would’ve been captured just for internal uses and using them externally?

Everything that’s pulled down for public use goes through our control. We pick the feeds, and we put it into one distributed feed to be able to tell the story in a clear way. Usually, this has to do with things like, “Oh, that shot is a little off. Let’s go to this shot.”

But the one thing that I’m very proud of—and I think it helps set NASA apart in today’s environment—is that we’re 100% transparent. We are not manipulating imagery. It’s not AI-generated. What you see is what you get.

Do you have a problem with people thinking the footage is AI-generated, or do they tend to trust what they’re seeing?

I didn’t realize until I [started at NASA] that so many people did not believe in the moon landing. To me, that’s what makes our job even more important. The camera’s mission is to show us the reality of what's going on.

Do you have live content 24/7?

No. We have a FAST channel on Amazon Prime. We do not have a 24/7 NASA TV-type channel anymore.

What personas do you target with your content?

I’m a little biased, and I always like to claim my bias. I have a daughter who’s six years old. She’s part of what I think of as the Artemis generation. It’s that inner kid in us all that makes us all excited about space. I want to double down on the Artemis next generation to inspire people to be astronauts, scientists, or streaming engineers in space.

The other side of that, of course, is the educator, the STEM-obsessed person, people who say, “Hey, let's watch this broadcast from NASA.” We do have quite a few of those, which is why we work with Kanopy.

Ultimately, it’s the fandom, the type of person that does Reddit, does Twitch, is interested in Star Wars—the sci-fi nerds.

How are people watching NASA+?

Thirty-five percent of our audience is accessing NASA+ directly on the web. The rest are watching on our app and via our partners, mostly on TV and Roku. Partners only release so much information, but I’ll say that about 20% of that comes from Amazon Prime and Netflix. With more partners on board, obviously I want to grow that number.

Can you monetize NASA content?

No. We cannot make money. It’s a free service.

To clarify, does money exchange hands with partners like Netflix?

No money is exchanging hands with any partner or anything like that. There needs to be paperwork involved to do distribution, but if it weren’t required, we would just hand our partners a feed and leave it up to them to decide what feed they want to take in and air. It’s the same as us giving CNN a feed.

How does your on-demand content perform?

Our videos on demand do exceptionally well. Now it’s almost half and half as far as success rate. For Cosmic Dawn, we took 25 years of archival footage to tell the story of the development of the James Webb Space Telescope. We created these amazing images that we all fell in love with over the pandemic. That story was incredible.

The great thing about NASA+ is we already have in-house creatives here. We have the scientists, but we also have people on hand to be able to tell these stories. So this was really a labor of love. But it was also that we had all this material already here, and we just organized it in a way to make an excellent documentary that has done incredibly well.

Do you plan to integrate VR or AR to visualize future missions?

I would love to, but not right now. I hate to say it like that, but what we have in our hands currently is plenty.

What happens to your archives? Is it all available?

Yes, and in fact our archive is called Avail.

Is NASA imagery in the public domain?

Yes. You can see our brand partnership guidelines at [go2sm.com/partner]. We don’t make money and people can’t make money off of us.
