What if I told you that the Super Bowl was NOT broadcast from Hard Rock Stadium in Miami Gardens, Florida - as Fox Sports would have you believe - but from an AWS S3 bucket in Northern Virginia? Would you believe me?
Last year's game on CBS was watched by over 100 million viewers across all platforms. Digital and streaming viewership continues to grow by leaps and bounds, while traditional TV-based game-watching continues its steady decline. With that much viewership, how does a broadcast provider handle unpredictable and growing demand for live-streamed content? With AWS, that's how!
In a recent interview with FierceVideo, Liz Carrasco, CTO, and Chris Xiques, VP of Video Technology, at CBS Interactive gave an overview of the streaming architecture design for Super Bowl LIII and the challenges of handling demand and getting delivery partners on board. As no single Content Delivery Network (CDN) vendor would sign up to handle the forecasted volume of traffic, they cooked up their own in-house solution for signal acquisition and encoding and built a multi-vendor CDN for load balancing and redundancy. “...we really had no choice but to do that. We actually did go to a couple of the bigger CDNs and said, ‘Guys, we’re looking to get 30 or 35 terabits from you on game day,’ and it was just crickets!” (FierceVideo).
At the core of this 'Super CDN' was AWS. "We used Elemental encoders; I think that’s pretty standard across the industry for our contribution feeds. For origin, in order to use multi-CDN effectively you have to have a common origin that all the CDNs can go to and pull the segments from and do their delivery. The Amazon folks had a new suite of products and one of them is called MediaStore, which acts as an optimized S3 origin that we were trafficking all of these encoded segments to" (FierceVideo).
Now do you believe me? :-)
So how does this whole Cloud...Broadcasting...Streaming thing work?
Well, putting CBS's Frankenstein broadcast rig aside, the basic nuts and bolts of broadcast streaming on AWS are pretty straightforward and a lot more turnkey than they used to be. The following diagram and narrative give a super-simplified overview.
At #1 we have the live video capture point we know and love - a traditional camera source at a live event or TV station. Also at #1 (not pictured) is the digital encoder, software that turns the source video into a digital stream of 1's and 0's. Great encoder options include AWS Elemental Live, Wirecast, XSplit Broadcaster, Flash Media Live Encoder, or the open source Open Broadcaster Software (OBS) package.
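To make that encoder step concrete, here is a minimal, hypothetical sketch of a software contribution encoder: ffmpeg (invoked from Python) compresses a source to H.264/AAC and pushes it as an RTMP stream to an ingest endpoint. The endpoint URL, stream key, and bitrates are placeholders for illustration, not a recommendation.

```python
import subprocess

# Hypothetical ingest endpoint and stream key - in practice these would come
# from whatever is receiving the contribution feed (e.g. a MediaLive RTMP input).
INGEST_URL = "rtmp://192.0.2.10:1935/live"
STREAM_KEY = "primary-feed"

# ffmpeg reads the source in real time, encodes video/audio, and pushes the
# result as an RTMP (FLV container) stream to the ingest endpoint.
subprocess.run([
    "ffmpeg",
    "-re", "-i", "source.mp4",                      # read source at its native frame rate
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "3000k",
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv",                                    # RTMP expects an FLV container
    f"{INGEST_URL}/{STREAM_KEY}",
], check=True)
```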
We hit AWS land at #2 with AWS Elemental MediaLive, which can ingest two input feeds and transcode that content into two adaptive bitrate (ABR) HTTP Live Streaming (HLS) streams as output. MediaLive essentially encodes live video streams in real time, compressing a large live video source into smaller versions for faster distribution.
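For a sense of what the MediaLive ingest side looks like in practice, here is a minimal boto3 sketch (not CBS's actual setup); the region, names, and CIDR range are assumptions for illustration.

```python
import boto3

medialive = boto3.client("medialive", region_name="us-east-1")

# 1. An input security group controls which encoder IPs may push to the input.
sg = medialive.create_input_security_group(
    WhitelistRules=[{"Cidr": "203.0.113.0/24"}]  # hypothetical encoder network
)

# 2. An RTMP_PUSH input gives the on-site encoder redundant endpoints to push to;
#    the MediaLive channel then transcodes the feed into an ABR ladder.
rtmp_input = medialive.create_input(
    Name="stadium-contribution-feed",
    Type="RTMP_PUSH",
    Destinations=[{"StreamName": "primary"}, {"StreamName": "backup"}],
    InputSecurityGroups=[sg["SecurityGroup"]["Id"]],
)

# The channel itself (which attaches this input and defines the output
# renditions, e.g. 1080p/720p/480p HLS) carries a large EncoderSettings payload,
# so it is usually created from the console or a template rather than by hand.
print(rtmp_input["Input"]["Destinations"])  # the RTMP URLs the encoder pushes to
```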
At #3 we have AWS Elemental MediaPackage, which ingests the MediaLive output and can package the live stream into various formats - HLS, Dynamic Adaptive Streaming over HTTP (DASH), Microsoft Smooth Streaming (MSS), and Common Media Application Format (CMAF) - delivered from four distinct MediaPackage custom endpoints. Alternatively, if your encoder of choice is already putting the content into the desired format for your target devices, you can direct the MediaLive output to a high-performance, scalable origin like AWS Elemental MediaStore (as CBSi did for the Super Bowl LIII broadcast).
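Here is a hedged boto3 sketch of that packaging step, with assumed channel and endpoint names: one MediaPackage channel ingests the MediaLive output, and separate origin endpoints re-package it per delivery format.

```python
import boto3

mediapackage = boto3.client("mediapackage", region_name="us-east-1")

# One channel ingests the ABR HLS output coming from MediaLive.
channel = mediapackage.create_channel(
    Id="big-game-channel",
    Description="Ingests the ABR HLS output from MediaLive",
)

# One origin endpoint per delivery format; MSS and CMAF endpoints follow the
# same pattern with MssPackage / CmafPackage settings.
hls_endpoint = mediapackage.create_origin_endpoint(
    ChannelId="big-game-channel",
    Id="big-game-hls",
    HlsPackage={"SegmentDurationSeconds": 6, "PlaylistWindowSeconds": 60},
)
dash_endpoint = mediapackage.create_origin_endpoint(
    ChannelId="big-game-channel",
    Id="big-game-dash",
    DashPackage={"SegmentDurationSeconds": 6, "ManifestWindowSeconds": 60},
)

# These endpoint URLs become the origins the CDN pulls from.
print(hls_endpoint["Url"], dash_endpoint["Url"])

# Alternative: if the encoder already emits the final format, a MediaStore
# container can act as the common origin instead (the route CBSi described):
# boto3.client("mediastore").create_container(ContainerName="big-game-origin")
```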
At #4 we have an Amazon CloudFront distribution, which can be configured to use the MediaPackage custom endpoints (or MediaStore endpoints) as its origin. CloudFront is AWS's globally distributed content delivery network (CDN) of proxy servers, which caches content at the edge for low-latency, high-throughput video delivery, improving access speed for viewing or downloading content.
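And a minimal boto3 sketch of that CDN step, assuming a hypothetical origin domain: a CloudFront distribution configured with the MediaPackage endpoint as a custom origin.

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

# Hypothetical: the domain would come from the MediaPackage endpoint URL above.
ORIGIN_DOMAIN = "abc123.mediapackage.us-east-1.amazonaws.com"

distribution = cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),   # any unique string
        "Comment": "Live stream distribution",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [{
                "Id": "mediapackage-origin",
                "DomainName": ORIGIN_DOMAIN,
                "CustomOriginConfig": {
                    "HTTPPort": 80,
                    "HTTPSPort": 443,
                    "OriginProtocolPolicy": "https-only",
                },
            }],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "mediapackage-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            "TrustedSigners": {"Enabled": False, "Quantity": 0},
            # Forward query strings (some manifest requests use them) and keep
            # a short minimum TTL so live manifests stay fresh at the edge.
            "ForwardedValues": {
                "QueryString": True,
                "Cookies": {"Forward": "none"},
            },
            "MinTTL": 0,
        },
    }
)

print(distribution["Distribution"]["DomainName"])  # players point at this URL
```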
CloudFront delivers the live stream to viewers with low latency and high transfer speeds in device-appropriate formats such as HLS, DASH, or MSS. These protocols "improve the user experience by delivering video as it is being watched, generally fetching content a few seconds ahead of when it will be needed" (AWS). Playback starts quickly, fast-forwarding is efficient, and the overall user experience is smooth.
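As a toy illustration of that "fetch a few seconds ahead" behavior (not how production players are written), a client can simply poll the media playlist and buffer only the next few segments; the playlist URL below is a hypothetical placeholder for a CloudFront-served stream.

```python
import time
import urllib.request

# Hypothetical CloudFront-served HLS media playlist.
PLAYLIST_URL = "https://d111111abcdef8.cloudfront.net/out/v1/index_720p.m3u8"

def newest_segments(playlist_text, base_url, count=3):
    """Return URLs of the most recent media segments listed in the playlist."""
    names = [line for line in playlist_text.splitlines()
             if line and not line.startswith("#")]
    return [base_url + name for name in names[-count:]]  # live playlists append newest last

while True:
    playlist = urllib.request.urlopen(PLAYLIST_URL).read().decode()
    base = PLAYLIST_URL.rsplit("/", 1)[0] + "/"
    for segment_url in newest_segments(playlist, base):
        urllib.request.urlopen(segment_url).read()   # buffer a few seconds ahead of playback
    time.sleep(6)  # roughly one segment duration before refreshing the playlist
```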
Clearly, this simplified, small-scale design looks a lot different at scale. In CBS's case, with that much content and that many viewers, additional considerations came into play.
"We also used a couple vendors to do origin shielding so that that one common origin didn’t get overwhelmed by requests from the edge CDN. And then we used four different CDN vendors to do delivery along with a CDN decisioning vendor that allowed us to create a fairly sophisticated app that allowed us to tailor exactly how much bandwidth we were using from each of the CDNs to make sure we didn’t overwhelm any of them or exceed any of our bandwidth commitment arrangements we made with them".
It's amazing how much technology goes into producing a smooth user experience - one that many viewers will only enjoy for a few minutes. In the case of last year's game, a lop-sided snoozer that was effectively over by halftime, many viewers, particularly transient mobile users, tuned out early; fascinating all the same.
It's interesting how, before Cloud became a thing, global or even coast-to-coast broadcasting was extremely complex and even more costly, requiring it to be done "via satellite". When was the last time you heard THOSE words? :-) Needless to say, it was only for those blessed with deep pockets. Today, with a few clicks, anyone, anywhere can broadcast anything at any time...and isn't that just awesome?
If you're in the broadcasting business, contact BreezeIT today to learn how our expertise with Cloud can deliver simple, secure and stress-free solutions for your mission success.