Blackbird: Revolutionizing Cloud-Based Video Production with Rapid Content Access
Discover how Blackbird's unique cloud-based video production platform transforms content creation with rapid access, seamless editing, and efficient delivery.
Sports and Entertainment Case Study for Social Media with Blackbird
Added on 10/01/2024

Speaker 1: So first, what is Blackbird? Blackbird is a cloud-based video production platform. It is unique technology, based on the Blackbird codec, which, as mentioned, is a proprietary working codec, and it is served up in a SaaS environment. The benefit of that is that it gives you rapid access to content. It gives you mass visibility of content without building a huge infrastructure, or any infrastructure, to do this type of work. And of course, the ability to manipulate content in a browser and deliver it wherever you see fit. We'll talk a little bit more about that: we really don't define the endpoints, you define the endpoints, we just embrace them.
So first, just a little bit of a view. Blackbird is widely used throughout the world in production companies. Locally, the Scott Brothers, the Property Brothers, are using it daily. Other companies like Our House Media use it, more in the post-production environment. Broadcasters like the BBC and ITV use it, and numerous others, too many to mention. It is also used in news, corporate and education. But today we're really going to be talking about sport, and we're going to be focusing just across the border on Buffalo, talking about the Sabres and the Bills, and about organizations trying to bring multiple files together for delivery to their owned and operated platforms and into their Go app. That became a problem: they deal with the Knicks, the Rangers, the Islanders, basically any New York-based team, and by the time a goal or a basket happened they were at about 45 minutes per clip to get it out properly, because they had to stop records, marry SRT files together and do a final delivery. What Blackbird was able to do was take live feeds with embedded metadata, serve them up in less than six seconds to an edit environment, and push the entire package out within 45 seconds. So we took them from 45 minutes to 45 seconds to get something onto their owned and operated platforms.
For IMG, it was absolutely about fast turnaround; delivering live to multiple endpoints was a major part of the infrastructure they had to achieve. For Deltatre, it was all about decoupling and hosting everything in public cloud, and we'll certainly get into that a little bit as well. For Deltatre, it really was about decoupling the ingest from the editorial. That was a really big deal, because what they were interested in was internationalization of talent and not having to put people on location or on premise to deal with that type of thing. And of course the Sabres, which we're going to get into; I'm not going to talk about it right now because we're going to actually show you a little bit of a demonstration of it.
So let's talk a little bit about the tech stack, only two slides on it, and then we'll go into a testimonial and right into the tech itself. Blackbird is based on something called Edge technology. The interesting thing about Edge, in our world, is that at the most basic level it acts as a transcode engine. It can take any number of inputs: they can be live inputs, they can be file-based inputs, they can be metadata, or a combination of all of those. We can work with watch folder technology, and we can work with streams or feeds, either of those. The form factors that those sit on could be pretty well anything you want.
You could virtualize it if you want. You could run it on this computer, on an Android phone, in Azure, in AWS, on Linux. So it's operating system agnostic and cloud agnostic, and we're pushing to other cloud platforms; it acts as a transcode engine for any number of inputs. What we do is essentially transcode into the Blackbird working codec, and that's how this stack works.
It also does a number of other things. It will act as a conform engine because, as well as transcoding, it saves or references the high bitrate content; you're going to need to get back to that high bitrate content at the end, so we can conform against it as well. It's very, very scriptable. We use Python to script. That means we can do things like enforce a specific directory structure that content needs to follow, or take metadata and inject it into the file. Because it's Python, it's open and scalable: we can do pretty well anything we want with how content is presented and what kind of metadata we inject into it. As I said, live sources: RTMP, HLS, SDI. File-based sources: we can source a wide variety of those. And then metadata: closed captioning is certainly a major topic, and other log or time-based data files can be injected into the platform as well. A perfect example comes from the people we just heard from, Veritone: being able to take time-based data from Veritone and inject that into Blackbird as well.
So that's the Edge. Once content hits the Edge, we create that codec, and what happens to it? This is the first part, where we're talking about streams or SDI in sport; it could be file-based video as well. We transcode that in, it makes its way into the Edge, and it then makes its way up into our private cloud, the Blackbird cloud. What you see are the words Blackbird Forte and Blackbird Ascent. Those are just two different interfaces: one is a more complex, full-featured interface, the other is a much simpler, streamlined, easy clipping interface. For SDI workflows, the latency is about six seconds from the time content hits our transcode engine until it appears in the clipping interface. If you want to go to other platforms, such as Avid, Adobe Premiere, Resolve or any of those other applications, we can give you a very rich bi-directional AAF, by the way; you can actually start on those platforms and come back to the Blackbird side. So that's the stack, that's what it looks like in a PowerPoint document, and we'll certainly show it.
Deltatre, just one of the things they're doing with our platform is Game Pass for the NFL. It's all based on Azure, so our Edges reside in Azure. We essentially access a lot of storage at that point, pull in all the content and clip

Speaker 2: it up and then deliver it.
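
The talk mentions that the Edge is scripted in Python for things like enforcing directory structures and injecting metadata, but it does not show what that scripting looks like. Below is a minimal, hypothetical sketch of the kind of watch-folder automation being described: it polls a drop folder, pairs each media file with an optional JSON metadata sidecar, and hands both to a placeholder ingest call. The folder path, sidecar convention and `submit_to_edge` function are illustrative assumptions, not Blackbird's actual API.

```python
# Hypothetical watch-folder sketch; submit_to_edge stands in for whatever
# ingest/transcode call a real Blackbird Edge deployment exposes.
import json
import time
from pathlib import Path

WATCH_DIR = Path("/mnt/ingest/incoming")   # assumed drop folder
MEDIA_EXTS = {".mxf", ".mov", ".mp4"}

def submit_to_edge(media: Path, metadata: dict) -> None:
    """Placeholder for the actual Edge ingest call (not a real API)."""
    print(f"Transcoding {media.name} with metadata: {metadata}")

def sidecar_for(media: Path) -> dict:
    """Read an optional JSON sidecar (log notes, caption info, etc.)."""
    sidecar = media.with_suffix(".json")
    return json.loads(sidecar.read_text()) if sidecar.exists() else {}

def poll_watch_folder(interval: float = 5.0) -> None:
    """Poll the drop folder and submit each new media file exactly once."""
    seen: set[Path] = set()
    while True:
        for media in sorted(WATCH_DIR.iterdir()):
            if media.suffix.lower() in MEDIA_EXTS and media not in seen:
                submit_to_edge(media, sidecar_for(media))
                seen.add(media)
        time.sleep(interval)

if __name__ == "__main__":
    poll_watch_folder()
```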

Speaker 3: We saw interviews that occur as part of the live coverage, which you can very quickly clip up and push either to your owned and operated platform or into social. And then there are longer clips where you're making more of a highlights edit, so if you're reducing, say, a 90-minute game to five minutes, we do those kinds of workflows. If you want anything like a full-featured editor, you need to be able to scrub the content and move frame by frame through it. Blackbird is a game changer because it gives you that decoupling of the ingest and the editing. So you can use your existing OTT digital workflow, and you can have a remote, distributed editing team working on those streams, which is obviously very cost efficient. You can choose where you want the editing done; it doesn't necessarily need to be co-located with the actual event itself.

Speaker 1: So that's Deltatre. Our testimonials are only 90 seconds, so you don't have to sit through six-minute videos or anything like that. The one we're really focusing on today is Pegula Sports and Entertainment. They do all the highlights, the pre-game and post-game shows, and the in-show clipping for the Buffalo teams. Their workflow is based on an on-premise Edge, very much like MSG. They host it in a Linux environment, and they've actually decoupled the editorial. When you watch the testimonial, they're just sitting in the stands, pretty well clipping in the stands, using public Wi-Fi. That's the infrastructure: log on to the KeyBank Center public Wi-Fi and start clipping up the event. So they're doing very much the same thing: highlights, a 45-minute game down to short-form highlight clips. Closed captioning is a major part of their workflow because they need to be able to deliver it for compliance. And rather than me talking about it, we'll have someone from the Sabres talk about it. I'm Chris Rendak. I am the director of media and content for the Buffalo Sabres.

Speaker 4: I kind of oversee how, as an organization, we distribute content. We first discovered Blackbird in the summer of 2017. A law had passed in the United States that basically said that any video on the web that had previously aired on broadcast needed a closed caption solution. Eventually we found Blackbird as a great solution to help with this, and it actually expanded what we were able to do as well. We produce our own broadcast with the Sabres, so there's a lot of collaboration between our broadcast team and our digital team, and we end up clipping a lot of segments from the broadcast that end up on web and social media. Having Blackbird is a great option because we're able to seamlessly transition from the broadcast to the digital clip within minutes. One of the great features of Blackbird is the quick turnaround time: anything that airs on our broadcast, we're able to take within minutes and turn around for our web and social channels. So the ability to capture things in real time has been a huge help for us. As we continue to build our digital footprint, expand who's going to use the software and how we're going to use it, and really strategically implement new things in our digital arsenal, Blackbird is a tool that's right at the forefront for us.

Speaker 1: So the way you know this is a live stream, by the way, is that you can see the chevrons; we present everything as a growing file, so you'll start to see these grow in just a second. This is a pregame show, a live show that they do every day. Now, what's really interesting is that this is being served up over the network, with all the content hosted in our private data centers. That's the first thing. The second thing is that the codec itself has some unique abilities; it's about as opposite an experience as you can imagine from H.264, especially when served up from the cloud. Being able to play it backwards at double speed, with absolute frame accuracy so I can step through frame by frame, is one of the unique aspects of the Blackbird codec. We actually capture every single thumbnail, so when you zoom in it's not so spectacular, but when you zoom out you can actually see where the cut points take place during a live event.
We're obviously bringing all the metadata in. In this case, one of the things we're bringing in, which will open up over here, is all the closed captioning. You'll be able to see it as metadata, you'll be able to see it as a file, and you'll be able to pass it along as part of a publish, because you're going to have to deliver it as well. The nice thing about pulling in the closed caption metadata is that it becomes searchable. So if I'm looking for a specific word, say the word 'little', just like that, it allows me to go and search. If I want to pull a clip or a variety of clips based on that and start to create a timeline, I've just done a search on those details, and anything that had the word 'little' in it is automatically pulled into a string-out sequence, just like that. It is that easy to use the software.
This is actually the more advanced version. We have a simpler version for people who use products like SnappyTV, who just want basically a source monitor to publish; it works quite well for that. The interface itself is not so different from other editors out there. You still have a trim mode, you can still build layers, you can still build composites, but there's a lot of automation built into the application. So when you're building branding, we can build in pre-rolls and post-rolls that are automatically added to the sequence as part of the publish, so the editor only has to think about the content. Obviously, things like alpha channel graphics support, animation import and additional subtitle tracks are all there. As a matter of fact, one of the things you'll notice when you look at the interface is that it's not just video and audio; there are metadata tracks associated with it as well. Obviously, there's the ability to trim, and it's all non-destructive editorial. Obviously, you have things like transitions, so if I want to throw a dissolve in there and trim it, it dissolves just like that. There's nothing in here that's going to be a surprise to an editor. But what we've done is make it highly available, meaning that whoever has a laptop here, if I give you a login, you're on the platform. And the interesting thing about the codec itself is that it runs in JavaScript, so you just run it in Chrome like any other web application. You just fire up your Chrome browser or your Firefox browser or your Safari browser and you start working away at it.
At the end of the clip, of course, what you're going to want to do is publish to a variety of destination points. And we're certainly going to talk about publishing, because it's not just about the editing, it's about the delivery, and it's also about the mass visibility you have within the application. At this point, in their world, they're delivering 1080p MP4s and .scc files that get taken into another platform, and multi-bitrate distribution is done from there. But they could also go directly to their Twitter, or use the AAF into Premiere if they need to get it out to the next platform. So it is a hub that can also link into other platforms. Certainly, we'll be around during lunch.
So let's talk a little bit about delivery. As we said, you can deliver to a lot of different social platforms. I neglected to put Instagram on here, specifically because Instagram does not have a public delivery API. We can, of course, deliver square video out of here, but you would have to put it into another application to deliver to Instagram. We can create the high bitrate conform and delivery from that content; as you can see here, we have high bitrate delivery. And the thing that's really interesting is that box over to the right: we can also do very unique, what they call bespoke, deliveries. So if you need XML with very specific publish options delivered into those environments, we can do that. We do that with the Sabres and MSG, where they deliver into specific applications within their environments. Brightcove, of course, is another application we can deliver to, or Azure or AWS as well.
So in summary, to give you a little bit of the overview and the need: speed to screen. One of the things we always talk about is don't break your current infrastructure. Don't change for us; we'll slip neatly into your infrastructure without destroying what you already have, and we're just going to look at what you're doing and deliver a codec based on that. The other thing, of course, is embrace limited bandwidth: I'm using public Wi-Fi to do this, and I often do this at a Starbucks. Decouple ingest from edit. And certainly don't transport your high bitrate; leave it where it is and we'll just find it and proxy off of that. Thank you. I'm certainly happy to take some questions if you have any, on the stack or on anything you've seen here today. And if you don't want to shout it out now, we'll certainly be around during lunch as well.
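
As an illustration of the caption-search workflow in the demo (search the closed captions for a word, then pull every hit into a string-out sequence), here is a small hypothetical sketch. The cue structure, sample text and two-second handles are assumptions for illustration, not Blackbird's internal data model.

```python
# Hypothetical caption search feeding a string-out; the cue data below is
# invented sample material, not real broadcast captions.
from dataclasses import dataclass

@dataclass
class Cue:
    start: float  # seconds from the start of the growing file
    end: float
    text: str

def find_hits(cues: list[Cue], term: str) -> list[Cue]:
    """Return every caption cue containing the search term."""
    term = term.lower()
    return [c for c in cues if term in c.text.lower()]

def string_out(hits: list[Cue], handle: float = 2.0) -> list[tuple[float, float]]:
    """Turn matching cues into (in, out) ranges with a little handle room."""
    return [(max(0.0, c.start - handle), c.end + handle) for c in hits]

cues = [
    Cue(12.0, 14.5, "a little give and go in the slot"),
    Cue(95.2, 97.0, "great save by the goaltender"),
    Cue(301.4, 304.0, "he needed just a little more room"),
]
for tc_in, tc_out in string_out(find_hits(cues, "little")):
    print(f"clip {tc_in:.1f}s -> {tc_out:.1f}s")
```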
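
And as a rough sketch of the kind of bespoke delivery described for the Sabres and MSG (a 1080p MP4 plus an .scc caption sidecar, wrapped in a publish manifest), the XML element names below are invented for illustration only; a real integration would follow whatever schema the receiving platform specifies.

```python
# Hypothetical publish manifest for a bespoke delivery; the schema shown
# here is illustrative, not a real Blackbird or broadcaster format.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def build_manifest(title: str, video: str, captions: str) -> bytes:
    """Wrap the MP4 and SCC file references in a simple delivery manifest."""
    delivery = ET.Element("delivery")
    ET.SubElement(delivery, "title").text = title
    ET.SubElement(delivery, "published").text = datetime.now(timezone.utc).isoformat()
    ET.SubElement(delivery, "video", {"format": "mp4", "resolution": "1080p"}).text = video
    ET.SubElement(delivery, "captions", {"format": "scc"}).text = captions
    return ET.tostring(delivery, encoding="utf-8", xml_declaration=True)

print(build_manifest(
    title="Post-game highlights",
    video="highlights_1080p.mp4",
    captions="highlights.scc",
).decode("utf-8"))
```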
