
Voices of Video
Explore the inner workings of video technology with Voices of Video: Inside the Tech. This podcast gathers industry experts and innovators to examine every facet of video technology, from decoding and encoding processes to the latest advancements in hardware versus software processing and codecs. Alongside these technical insights, we dive into practical techniques, emerging trends, and industry-shaping facts that define the future of video.
Ideal for engineers, developers, and tech enthusiasts, each episode offers hands-on advice and the in-depth knowledge you need to excel in today’s fast-evolving video landscape. Join us to master the tools, technologies, and trends driving the future of digital video.
The Secret Sauce of Sub-Second Streaming (THEO Player - now Dolby)
What happens when engineering brilliance meets streaming challenges? Peter Jan Spielmans, co-founder and CTO of Theo Technologies, takes us behind the scenes of video player technology and revolutionary low-latency streaming.
Most viewers never think about the video player - it's the invisible interface between complex streaming technology and the viewing experience. Yet this critical component determines not just how video looks and performs, but provides essential analytics about user behavior. Peter Jan explains how TheoPlayer evolved from addressing the fragmentation of streaming protocols to becoming a comprehensive solution handling everything from DRM protection to quality adaptation.
The conversation shifts to an exciting innovation in streaming technology: HESP (High Efficiency Streaming Protocol). Unlike traditional protocols that struggle to balance latency with quality, HESP delivers sub-second latency - around 800 milliseconds - while maintaining broadcast-quality video. This breakthrough comes from a clever dual-stream approach that decouples GOP size from latency, enabling rapid channel changes without sacrificing compression efficiency.
For interactive applications like sports betting, webinars, and live auctions, this ultra-low latency creates possibilities previously unattainable with conventional streaming methods. Peter Jan outlines how TheoLive makes implementing this technology remarkably straightforward - simply provide an RTMP or SRT feed, and the service handles everything from transcoding to delivery and playback.
The discussion also explores the evolving codec landscape, with insights into how organizations are implementing hybrid encoding ladders that leverage HEVC for higher-quality renditions while maintaining H.264 compatibility. Peter Jan shares valuable perspective on when commercial players make sense versus open-source alternatives, highlighting the hidden costs of maintaining complex integrations with analytics, advertising, and DRM systems.
Whether you're a video engineer weighing technology options or a product manager seeking to understand streaming innovations, this episode provides both technical depth and practical guidance for navigating today's video delivery challenges.
Stay tuned for more in-depth insights on video technology, trends, and practical applications. Subscribe to Voices of Video: Inside the Tech for exclusive, hands-on knowledge from the experts. For more resources, visit Voices of Video.
Jan Ozer:Welcome to NetInt's Voices of Video. Today we talk with Peter Jan Spielmans, co-founder and CTO of Theo Technologies, the developer of TheoPlayer and the Theo Live low-latency streaming service. We'd love to talk about codecs and CDNs and encoding ladders when we talk about streaming, but it's the video player, like TheoPlayer, that the viewer interfaces with, and that largely controls the viewer experience and certainly the analytics that we get back from the viewer. Of course, there are many different options to obtain a player. There's open source, there's commercial, and Peter Jan will talk about choosing between open source and commercial and also the factors to consider when choosing a commercial player. Then we'll turn to low latency, which is a streaming mode of interest to many video engineers.
Jan Ozer:Theo Live is a service that uses a unique protocol called HESP. Peter Jan will detail general approaches to low latency and compare those to HESP, the types of applications or productions that need low latency, and then how to produce a low-latency production with HESP and Theo Live. If we have time, we'll cover what Peter Jan is seeing regarding codec usage from the player analytics he's getting back from his customers. Peter Jan, thanks for joining us. Why don't we start with a quick overview of your background and your history before Theo?
Peter Jan Spielmans:Before Theo, I actually did not work for that long. I actually started as a software engineer after studying software engineering and then after about a year we said like we can do something else, let's start a company. That's basically it. There's no more magic behind it.
Jan Ozer:What was the big idea behind Theo?
Peter Jan Spielmans:Originally it wasn't called Theo yet, and the big idea was actually aggregating content. We saw that there was a lot of good content out there. YouTube was there, but this was before a lot of the other services, and we basically figured, how can we make this easier to discover? So, sort of a recommendation engine, aggregation idea.
Jan Ozer:Okay, so how did this evolve into a player and a live streaming service?
Peter Jan Spielmans:Especially back in the day, there were a lot of walled gardens, so it was not very easy to aggregate all of that content and to get some money in. We actually started doing some consultancy work, helping others bring their streams live. So basically, getting the cable from the OB van, plugging it into a server, getting that stuff out there. And very soon we noticed that just making it play everywhere was more complex than you wanted. So that's why we started thinking, can't we make this simple? Can't we just remove the need for Flash, remove the need for Silverlight, stream it in one protocol? Because it was Adobe HDS, Microsoft Smooth Streaming, the whole shebang, basically. And that's where the idea came from.
Jan Ozer:And what about HESP and Theo Live? When did those come into being and what was the big idea behind those?
Peter Jan Spielmans:After we had our first player customers, at one point in time we were working together with Periscope, so now, yeah, part of X, I would say.
Peter Jan Spielmans:And I don't know if you even remember, but I think it was in 2015 or 2016 that I did a talk at Streaming Media West together with somebody from Twitter. I know you were there, Jan. Back then we were actually working on the first low-latency HLS, the LHLS approach, together with Twitter, and we actually made the player for that. And we started thinking, can't we also improve this? Because improving on the HLS protocol is absolutely possible, but for us it was a bit of repurposing something that was built for something else in the past. And we figured, if we took a blank sheet, what could we do to push user experience, to push quality forward, to make sure that user experience just in general improves, through latency, channel changes, all that kind of stuff?
Jan Ozer:What types of applications are you seeing migrating toward Theo Live and using the low-latency technologies?
Peter Jan Spielmans:If we're honest, today most of the services and most of the use cases really benefiting from low latency are still what I would call the user engagement segment. So that's very often things like interactive TV shows, but also, and mainly, sports betting, and things like webinars as well; if you're mass distributing those, they get quite some benefit out of it too. You could see these as still niche use cases, because it's not premium content being streamed as a TV channel. I don't see the big value there yet. It'll get there, but today it's mostly the user engagement kind of streams.
Jan Ozer:What's the latency that you're delivering in those types of applications?
Peter Jan Spielmans:With Theo Live we are actually delivering, well, sub-second latency in the end. On average it's like 800 milliseconds. But because with Theo Live you can actually tune how low you want the latency to be, we have customers who tune it to like 1.5 or 2 seconds, depending on where in the world they are actually delivering. If they are delivering globally, well, then it's not always achievable to go to like 800 milliseconds; to a shaky network connection in Brazil, for example, that's going to be hard.
Jan Ozer:Okay, and low-latency HLS and DASH are practically in the four-to-six-second range. Is that accurate? Is that what you're seeing?
Peter Jan Spielmans:It's absolutely accurate. It really depends also on the scale. You can go very low with low-latency HLS and low-latency DASH as well; I've seen very impressive demos by other people in the industry. But in my experience, once you really start going to scale, hundreds of thousands of people being live at that point in time, it just becomes very complex to do that with low-latency HLS or low-latency DASH, and you end up in a more realistic scenario with broadcast latency: six seconds, eight seconds, that kind of ballpark.
Jan Ozer:Let's dig into the protocols. You've got a PowerPoint slide for us that lets us compare your technology, HESP, with DASH and HLS and some others.
Peter Jan Spielmans:I'll actually start with sharing a different slide, which is one that a lot of people probably know. This specific one was actually made by Nicholas Weil; I think he presented it at Segments, the SVTA conference. But historically, I think most people know this kind of slide from the people at Wowza. I think they made one of the first ones really showing this.
Jan Ozer:The classic slide.
Peter Jan Spielmans:It's the classic slide. It showcases, similar to what I said before, the real interactive streams which benefit from low latency, but it also shows that low-latency HLS and low-latency DASH are really around that broadcast latency. If you really need to go lower, well, then you have to look for other alternatives. And yeah, that's what we wanted to do with HESP as well: really make that sub-second latency range possible. And the other thing that probably is relevant, because that's the thing that really kicks in when you start looking at which protocol or what kind of service you should use: at that point in time it's not just about latency, or at least that's my opinion.
Peter Jan Spielmans:It's about how much does it really cost to get this out to the audience that you want to serve? How is the picture quality? Are there trade-offs that you need to take? If you take, for example, HLS and DASH, these protocols are very, very good at delivering a high-quality stream to a massive audience, but they are compensating on the latency. On the other hand, if you go to low-latency HLS or low-latency DASH, well, very often you are trading in a bit: shortening GOP sizes, all those kinds of things. Well, stuff you know way better than I do, but that's an important trade-off that needs to be made there.
Jan Ozer:So just zooming in on the quality-versus-bandwidth trade-off for low-latency HLS or DASH: you're saying the reduced GOP size is going to restrict the quality, or are there other factors you're referring to?
Peter Jan Spielmans:And often the GOP sizes are also part of the trade-off with channel change times. That's at least what we are seeing. So for a lot of the solutions where they want to make sure that you can tune in fast, I've seen a lot of people move toward, yeah, GOP sizes of a second. In my experience, at least, that's cutting it a bit short. I don't know what your experience says on that, but for most content, a GOP size smaller than two seconds starts impacting bitrate versus quality.
Jan Ozer:And somehow you're avoiding that. You're still using a larger GOP size. Is that why your quality is better on the slide than low-latency DASH and low-latency HLS?
Peter Jan Spielmans:So what we can actually do with HESP, and I don't have a slide on how it works exactly, but with HESP you actually have two streams. There's a stream which does only keyframes, from which we can collect a keyframe to inject into the normal stream at any point in time, and this gives us the ability to change channels very quickly, but also to change qualities very quickly. So if there is a need for an ABR switch, we can execute that in a very short amount of time. But because of that, we've decoupled the latency and the channel change time from the GOP size, and this is basically the secret sauce of HESP, let's say. Well, it's not secret, it's publicly published at the IETF, and it actually allows us to do a very nice thing: to completely decouple GOP sizes from segment sizes or anything else that you're used to in today's popular streaming protocols.
Jan Ozer:I wrote about low latency technology, so there's a pretty good description of HESP on the NetInt website, as well as low-latency DASH and HLS and WebRTC. So there are two streams. What are the names for the streams?
Peter Jan Spielmans:The normal, the baseline stream, is what we call the continuation stream, and that's actually a stream which could be identical to low-latency HLS or DASH; it's a CMAF-based stream. And then there's the initialization stream, and that's the special one, basically, which allows us to select a single frame as a keyframe at any point in time.
Jan Ozer:That's the all-I-frame stream, and then the other stream, the continuation stream, can be done with whatever GOP size you want, typically two to four seconds.
Peter Jan Spielmans:Yes. Or I've even seen somebody implementing it without a fixed GOP size. So really looking at scene changes, really looking at the most optimal bandwidth usage, with occasionally, I think if he was reaching 10 seconds or something, he was inserting a keyframe just to make sure the GOP didn't become too long. But that was a very interesting idea, to be honest.
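(A rough TypeScript sketch of the two-stream idea described above; the interfaces and names are illustrative assumptions, not THEO's actual implementation. It shows how a keyframe fetched from the initialization stream lets playback, a channel change, or an ABR switch start immediately, while the long-GOP continuation stream preserves compression efficiency.)

interface Frame { data: Uint8Array; timeMs: number; isKeyframe: boolean; }

interface HespSource {
  // "Initialization stream": keyframe-only rendition, one keyframe addressable per frame time.
  initializationFrame(timeMs: number): Promise<Frame>;
  // "Continuation stream": the normal CMAF stream, with an arbitrary (long) GOP size.
  continuationFrames(fromMs: number): AsyncIterable<Frame>;
}

interface Decoder { push(frame: Frame): void; }

// Join a channel (or switch rendition) at an arbitrary time without waiting for a GOP boundary.
async function joinAt(src: HespSource, dec: Decoder, timeMs: number): Promise<void> {
  dec.push(await src.initializationFrame(timeMs)); // decoding starts from this injected keyframe
  for await (const frame of src.continuationFrames(timeMs)) {
    dec.push(frame); // then continue on the efficient long-GOP continuation stream
  }
}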
Jan Ozer:And looking at WebRTC, what are the restrictions on quality and bandwidth for services like that?
Peter Jan Spielmans:It really depends on how you implement WebRTC. The implementation that I usually see is that you do a single encode and then you distribute it toward the entire audience. So that's not how you would do WebRTC in, say, a video conference call, but I think that's fine. There, the problem is actually the channel change time, as well as the way the network really works. A complaint that we often hear is that, well, WebRTC is made to drop packets. It's made so that it can actually drop frames occasionally. But if you drop a frame, well, you need a new keyframe to basically restart, and that often pushes these services to reduce the GOP size so significantly that quality starts becoming an issue.
Jan Ozer:When you talk about feature completeness, what are the other features lacking in the typical WebRTC implementation that you're seeing?
Peter Jan Spielmans:One of the big ones is listed above that as well: it's DRM. But this slide is a bit older; I hope we are getting there. I don't think we're really there yet. It's not really standardized yet or available across the board, but that's an important one. But also, I mean, WebRTC is strong in metadata carriage, but it's not very strong in things like, for example, subtitles and all those kinds of things. I've once been told you don't have a product until you have subtitles, and, to be honest, I fear that, especially for the premium use cases, like the premium content, that's absolutely a thing. Accessibility, subtitles: it's very important if you really want to go after that kind of segment.
Jan Ozer:But that's going to be available on a service provider by service provider basis, yes or no? Some services, I think, do provide captions, others don't. Is that accurate?
Peter Jan Spielmans:That's accurate, but the problem is that it's not standards-based. So as a result, and it's the same with the DRM, I believe that anything can probably be built. The question is, how portable is it toward other vendors or toward other solutions?
Jan Ozer:Tell us about the production schema. What do you need on the initiation side if you're going to use HESP with your Theo Live service?
Peter Jan Spielmans:Well, if you're going to use HESP together with Theo Live, what you basically need is to provide us with an SRT or an RTMP feed. The Theo Live product, we see it as an end-to-end video API. We just take in whatever feed you have available, and we will give you a player embed that you can drop anywhere: website, native app, whatever is needed. Well, that's our strength, right, the player side, so we allow you to basically drop it anywhere, and that's it. It's fully API-driven. You start and stop it whenever you want, but production-wise, we try to make it as simple as possible.
Jan Ozer:So it's kind of an end-to-end service. You scale up as needed, you provide the CDN type delivery services, the player. Basically, I send you a stream and you take care of the rest.
Peter Jan Spielmans:That's the idea behind it.
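(To make that workflow concrete, here is a purely hypothetical TypeScript sketch; the endpoint, field names, and embed markup are invented for illustration and are not the actual Theo Live API. The flow it shows is the one described above: create a channel, point an RTMP or SRT encoder at the ingest URL, and drop the returned player embed into a page or app.)

interface Channel { ingestUrl: string; embedUrl: string; }

// 1. Create a channel via a (hypothetical) REST API and read back ingest and embed info.
async function createChannel(apiToken: string): Promise<Channel> {
  const res = await fetch('https://api.example.com/channels', {   // placeholder URL
    method: 'POST',
    headers: { Authorization: `Bearer ${apiToken}` },
  });
  return res.json() as Promise<Channel>;
}

// 2. Point your encoder's RTMP or SRT output at channel.ingestUrl.
// 3. Embed the returned player wherever the audience watches:
function playerEmbed(channel: Channel): string {
  return `<iframe src="${channel.embedUrl}" allow="autoplay" width="960" height="540"></iframe>`;
}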
Jan Ozer:Tell me about device support. I guess that should also be a strength of yours. But if I'm going to use the HESP service, what devices can I support on the playback side?
Peter Jan Spielmans:Well, basically everything, but I need to add one small asterisk. When you look at player support, TheoPlayer supports it almost everywhere already today. So wherever we have standard support for HLS and DASH, we cover HESP on those platforms as well, with one exception being Roku. We have an internal POC for Roku, but it's not as low latency as we want yet. I think we hit like three to four seconds, which is not the target that we want. I know it's better than most, well, than any other protocol on Roku, but it's not something that we are bringing to production yet.
Jan Ozer:Talk to me about monitoring capabilities. When I'm producing a live event, I want to know at the time how the signal is getting through, what audience engagement is. What type of analytics do you provide within the Theo Live service?
Peter Jan Spielmans:We don't call it analytics, because that's not one of the things that we really focus on, but of course we do have all of the monitoring that we deem necessary for live production. So we do have insights, for example, on how good is the signal strength coming through? Are there any frames being dropped? Are all the frame rates okay? Is the audio there? All of what we call the basics we absolutely have on the ingest side. Similarly, on the egress side, we have insights on what is the average latency being delivered, what types of devices is your audience usually using, are there any stalls happening, what kinds of qualities are people getting. But this is more what we call the operational metrics, and anybody can actually add whatever analytics solution they want on top of Theo Live as well.
Jan Ozer:So HESP is, I guess, a group standard. It's not an ISO or similar standard, is it?
Peter Jan Spielmans:No. So we started to work on it in 2015, 2016, somewhere around there, but a few years ago we actually started the HESP Alliance together with Synamedia, and we've been evolving the standard from within that, and we've published it toward the IETF as a draft standard. So it's not an official RFC. Who knows, maybe one day we get there. I don't know how long it took for HLS to become an RFC.
Jan Ozer:Are there royalties involved with using this technology? I know that the organization's page talks about royalties. Give us a high-level view, tell us where people can go to get more details, and tell us how that applies if I use Theo Live.
Peter Jan Spielmans:If you use Theo Live, there's nothing to be concerned about; that's something that we will take care of. If you would use HESP directly, yes: within the HESP Alliance there is a pool that was started to make sure that if people want to claim royalties, they can just join that pool, and that pool is focused on the player side as well. So for us, developing the player side, that's where the royalties would need to come from. We try to make it simple for people. But, yeah, all of the details are basically on the HESP Alliance website, so that's probably the best source for this.
Jan Ozer:What's the URL for that? HESP.org, or?
Peter Jan Spielmans:I think it's hespalliance.org.
Jan Ozer:And who are the other service providers? I mean, you're not the only provider of HESP-driven live streaming, are you, or are there others?
Peter Jan Spielmans:No. I mean, within the Alliance we have a bunch of other people or companies who have already implemented it. So Synamedia, which I already mentioned, they have services around it available, but also, for example, Scalestream and Ceeblue; they actually demoed it at IBC a few weeks ago. They have solutions which are available end to end. And similarly, for example, DRM partners like EZDRM and BuyDRM have sample streams up and running as well, with DRM then also included.
Jan Ozer:What about other player vendors at this point?
Peter Jan Spielmans:Not yet. We are actually hoping that others will start developing players for this as well. But from Theo's perspective, well, we obviously have TheoPlayer as a player which is available.
Jan Ozer:So let's switch gears and talk about TheoPlayer. At a high level, what do you see as the primary functions of the player?
Peter Jan Spielmans:It depends on how you define player. If you look at a lot of the open source players, what they define as the player is actually just a streaming pipeline. You give it a stream, it renders it out on the screen, and it does some things around subtitles, some things around multiple audio tracks, and that's about it. If I talk with customers, what they see as a player, well, that includes the UI, it includes integrations with analytics, with DRM, with advertisements, and all of that kind of stuff as well. So, in my opinion, and that's also the scope of TheoPlayer, all of those things are part of the player as well.
Jan Ozer:What are the big decision points, open source versus commercial? A lot of people use open source and develop some of the features you talked about themselves. If you're talking to a major corporate customer, what are the pros you see of commercial as compared to open source?
Peter Jan Spielmans:I usually ask them a bunch of questions, and the first question that I think any company should ask itself is: is this really differentiating you, if you are basing it on open source and building everything else around it yourself? Very often you don't really get a competitive edge by integrating an analytics solution, or building a very complex UI yourself, or doing any of that kind of what I would call repetitive baseline work that others have done already hundreds of thousands of times, and in a lot of cases it doesn't generate you any additional revenue if you build it yourself. So for me, that's usually the first question that people need to ask themselves. And the next question is usually about manpower. Do you really have all of the people in-house to build all of this, to add the integrations with DRM, the ads, the analytics, to do the maintenance on it? And, yeah, all of the budget that's needed for that as well.
Peter Jan Spielmans:And then there's support. Sooner or later you hit an issue caused by some kind of limitation. If that limitation is with a vendor of yours, I mean, you get on the phone and you yell, and normally it gets fixed, or you switch to a different vendor. Well, I think the point is usually that it should get fixed. But if that happens with an open source solution, well, you can't really call anybody, you can't really yell at them, and if you submit a ticket, usually the answer is, well, we're open for pull requests. And then you need to dig in, dive in, and understand how that beast is working, and that's knowledge that, I mean, we're trying to hire people that know these kinds of things, but that knowledge is extremely rare.
Jan Ozer:What about the compatibility side? At the most basic level, the player is in charge of making sure the video plays reliably on a platform. How much time do you devote to that within your engineering team, and how does that compare to an open source type of player?
Peter Jan Spielmans:Most open source players and most in-house developed video players usually have it a bit easier. They follow the standard very strictly, or they follow their own stack very strictly, and they know exactly what they will get. We don't know. I mean, we have hundreds of different customers and they all do something which is slightly unique, and as a result we have to be very, very robust, very redundant, and that's one of the things that drains a lot of time for us. But on the other hand, when you look at it, I mean, you mentioned the player is very responsible for user experience. It is also the most visible part. If something goes wrong somewhere in your streaming pipeline, the player can probably accommodate for it a bit and try to smooth the user experience. But if the player itself goes wrong, then you can have an amazing pipeline and everything will, yeah, it will just be destroyed from a user experience perspective.
Jan Ozer:What industries have you been particularly successful in penetrating with your player?
Peter Jan Spielmans:That's a lot of different industries. If I really look at it, I think there are a few major verticals. One very clear one obviously is the telcos and the operators, the cable companies that everybody was historically already working with to distribute their content; for example, companies like Swisscom and Telecom Argentina are customers of ours. Also, obviously, the broadcasters, companies like TV2 or Rai, trying to tap new markets, going direct to consumer, trying to cut out a bit of the telcos doing that; that's an interesting story. And also, of course, OTT platforms, sometimes linked to the broadcasters, sometimes linked to the operators, but think of companies like Peacock.
Peter Jan Spielmans:Also a lot of major sports leagues. Usually they're a bit protective about their brands, so we can't always name them publicly, but if you name a few major sports brands, probably a few of those are customers of ours. And then even, well, corporates: NASDAQ, CERN, these are customers of ours as well. Those look like very different use cases because they're, of course, not doing premium content, but we're really covering the spectrum from subscription-based services to FAST channels, advertisement-based services, and even the legislation-mandated European Parliament kind of things, where the stream is obviously free but usually not watched that often.
Jan Ozer:What percentage of your customers are DRM protected?
Peter Jan Spielmans:That's actually the bulk of them, so most customers do have DRM protection on there. Obviously it's required once you get some kind of premium content, or at least for most of the rights holders it's required. If I would need to make a guess, I would think that's probably 70 to 80%.
Jan Ozer:Is DRM as complicated as it looks, between the different families of DRM that you have to use for different targets, or is there an easy button you can push?
Peter Jan Spielmans:It's a good question. Today, in my opinion, it's not that hard anymore. Four years ago, five years ago, yes. But, for example, for Theo Live we implemented this as a checkbox. You just check the box and your stream is DRM protected. That's the level that we think it can get down to if you really want to.
Jan Ozer:Well, do I have to choose a certified provider like EZDRM or BuyDRM?
Peter Jan Spielmans:With Theo Live? No. Of course, if you would want to set it up yourself, yes, then you get one of those providers. They, to date, take care of most of the complexity, and players like us, I mean, we have all of those integrated, so you just load it up and it's done.
Jan Ozer:So if I check the box in your player, you're going to handle the DRM and you're going to send me an invoice, which is fine. I mean, I know I've got to pay and I might as well. As long as it's simple, I don't really care.
Peter Jan Spielmans:That's the goal. Yes, Try to make it as easy as possible. Streaming is hard already, yeah.
Jan Ozer:I mean, which are the major DRMs that you're supporting? Why don't you give us a two-minute overview of which DRMs, on which platforms, you're supporting?
Peter Jan Spielmans:Off the top of my head: obviously all of the Google platforms, Android, Android TV, Fire TV, Chrome and similar, meaning all of the Chromium-based browsers these days, including Edge, plus a lot of the smart TV platforms; all of those will do Widevine. A lot of the older smart TVs and obviously the Windows platforms will all do PlayReady as well. And Apple will always be Apple; that will probably always be FairPlay. The more interesting thing these days is that, if you approach it the right way, you can actually start combining all of those with CBCS DRM. The only disadvantage you have is the old smart TVs, and then I'm thinking, oh, not that old, but smart TVs that you bought a year or two ago in the store; those will not do CBCS DRM yet. But the difference between Widevine, PlayReady, and FairPlay, for me, is more and more becoming a brand compatibility kind of thing. The real question, I think, will soon be: is it going to be CTR or CBCS encryption? And soon it will probably all become CBCS.
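(A small TypeScript sketch, using the standard browser EME API, of how a player can probe which DRM key system a device exposes; the key system identifiers are the commonly used ones for Widevine, PlayReady, and FairPlay, and the codec string is just an example, not a statement about how TheoPlayer does this internally.)

const keySystems = [
  'com.widevine.alpha',      // Chrome/Edge/Chromium, Android, many smart TVs
  'com.microsoft.playready', // Windows, Xbox, many older smart TVs
  'com.apple.fps',           // Safari / Apple devices (FairPlay)
];

async function availableKeySystems(): Promise<string[]> {
  const config: MediaKeySystemConfiguration[] = [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.64001f"' }],
  }];
  const found: string[] = [];
  for (const ks of keySystems) {
    try {
      await navigator.requestMediaKeySystemAccess(ks, config);
      found.push(ks); // this key system is usable on the current platform
    } catch {
      // key system not supported here
    }
  }
  return found;
}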
Jan Ozer:And what are you seeing in terms of CMAF versus HLS and DASH? I mean, how quickly is CMAF making an impact in the analytics you're getting back from your customers?
Peter Jan Spielmans:So, CMAF itself, of course HLS and DASH are fully compatible with that, so that's great. But if I look at, for example, HLS itself, like how many segments have become CMAF compared to how many segments are still transport stream: most of the VOD archives are still transport stream, and I don't expect that to change anytime soon, even though it could be very easy to migrate those. It's just a cost. But today I think most of the customers are using fragmented MP4 and CMAF already. So it's an evolution, but especially on the live side, I think it's moving in the right direction.
Jan Ozer:I was going to ask what trends you're seeing on the live streaming side. Mostly CMAF at this point?
Peter Jan Spielmans:Mostly CMAF. I am noticing a trend toward more HLS compared to DASH as well, which I found interesting, the reason for that probably being the mandate from Apple, or at least the tight coupling from Apple with HLS on their platforms. But beyond that, yeah, I mean, I don't really see any big advantages between HLS or DASH. Well, HESP-wise, of course, that's CMAF-compatible as well. So, I mean, of course I'm cheering that one day that will also become a standard that everybody is using, but we'll see.
Jan Ozer:So let's finish off with a look at codecs. What type of analytics do you get back from your customers on which codecs they're using?
Peter Jan Spielmans:We don't harvest the analytics ourselves, of course, so we leave that up to our customers, but obviously we do get insights from our customers.
Jan Ozer:What are you seeing?
Peter Jan Spielmans:Historically, of course, everything is H.264, and that's still very much the case. But especially on smart TVs these days, more and more companies and more and more customers are looking at mixed ABR ladders. HEVC is definitely on the rise for smart TVs; let's see if the recent lawsuits with Netflix and others will change that or not, who knows? And AV1, actually, surprisingly, is also getting a little bit more traction over the last year. It's not that commonly deployed yet; I actually still see VP9 a bit more than AV1. But it is a clear trend that those codecs are also on the rise.
Jan Ozer:Give us a percentage of AV1 and tell us who's using it, if there's any concentration you can identify.
Peter Jan Spielmans:I would probably think that, of the bulk of video that we are doing, it's probably still less than a percent for us.
Jan Ozer:Your comment on hybrid encoding ladders raised a question: how much detail do you know about what people are doing on the hybrid side? If I'm offering H.264 and HEVC, do I do it in two separate ladders, like Apple recommends, or do I have a hybrid ladder that's got H.264 in the bottom rungs and HEVC in the top rungs? What are people doing?
Peter Jan Spielmans:We see both. In the past most customers did separate ladders, but of course it's not always economically interesting to really do that. So these days we're seeing more and more companies switching toward HEVC for the higher rungs and H.264 for the lower rungs. Not every platform allows for it, so that's an asterisk to make, but on most platforms today you can seamlessly switch between H.264 and HEVC. So that's a very relevant change that we've seen.
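(For illustration, such a hybrid ladder can be expressed in a single HLS multivariant playlist, with H.264 on the lower rungs and HEVC on the upper rungs; the bitrates, rendition paths, and codec strings below are made-up examples, not any customer's configuration.)

#EXTM3U
#EXT-X-VERSION:6
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2"
h264_360p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
h264_720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=4500000,RESOLUTION=1920x1080,CODECS="hvc1.2.4.L123.B0,mp4a.40.2"
hevc_1080p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=9000000,RESOLUTION=3840x2160,CODECS="hvc1.2.4.L150.B0,mp4a.40.2"
hevc_2160p/index.m3u8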
Peter Jan Spielmans:And on those platforms where it's not possible to do a seamless switch, what we do as a player, or what we at least attempt as a player and try to provide as a possibility for our customers, is that we start with whatever the best codec is for the current bandwidth and the current device, and if we see that it would be possible to switch toward the other codec to get better quality, or because we need to switch down, at that point in time we can actually make that switch, depending on customer configuration. If a customer configures it as, I want to stay with H.264 and that's it, then we will not dynamically switch. But if they say, yes, you're allowed to switch dynamically, even though it might degrade user experience because there will be a black screen inserted during the switch, or the switch will be very noticeable, that's an option that we provide for those devices that don't allow you to switch smoothly.
Jan Ozer:How much of that is 4K and how much of that is 1080p?
Peter Jan Spielmans:Most of the time, when it's about HEVC or AV1, the discussion almost always starts with 4K. For 1080p, yeah, I see a lot of H.264 still.
Jan Ozer:What are you seeing, 10-bit versus 8-bit, and HDR versus SDR? And if you're not getting data back, then let me know. But how much 10-bit usage is there outside of the premium content field?
Peter Jan Spielmans:Outside of the premium content field, I think the value will be zero, at least from what I know. If I look at the premium content side, for example services like Peacock, obviously they serve HDR as well; they do the whole Dolby Vision, Dolby Atmos, all that kind of stuff they have in there as well.
Jan Ozer:Got a question in about Theo Live. You talked about maintaining low latency with large audience sizes. What audience sizes are you talking about? What's the largest production, in terms of viewers, you've achieved with Theo Live, and what latency was that?
Peter Jan Spielmans:I would need to check. I know that there's a very big one coming up in a few weeks that a lot of people are very happy about, but also a bit, well, keen to monitor as well. I think today it's, yeah, tens of thousands, hundreds of thousands, that kind of ballpark we've seen already, and that's usually at latencies of, let's say, 800 milliseconds to about a second. That's the latency that we see there. It depends a little bit on location, on device. There are always some users who go to like a second and a half, and there are always some users who are a bit lower than the 800 milliseconds as well.
Jan Ozer:Usually services talk about synchronizing those so everybody's at the same place. How does that work with Theo Live?
Peter Jan Spielmans:So you set what the target latency is that you want to have. If, instead of saying go as low as possible, you set it to go to a second or a second and a half, at that point in time all of the players will try to synchronize themselves, because they will all try to achieve that same latency. If you just say go as fast as possible, yeah, it's of course not synchronized, but it's as fast as possible.
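(A purely hypothetical TypeScript sketch of the idea just described: give every player the same fixed latency target and they converge on it, or ask for the minimum and accept that viewers drift apart. The parameter names are invented for illustration and are not the actual Theo Live player API.)

interface LatencyConfig {
  mode: 'minimal' | 'fixed';
  targetLatencySeconds?: number; // only used in 'fixed' mode; every player chases this value
}

const synchronizedViewers: LatencyConfig = { mode: 'fixed', targetLatencySeconds: 1.5 };
const lowestPossibleLatency: LatencyConfig = { mode: 'minimal' };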
Jan Ozer:What are your customers typically doing? I mean, if I'm an auction house or a gambling house, how synchronized do I have to be?
Peter Jan Spielmans:Most of the betting people will basically tune it and try to synchronize around a second, a second and a half; that's at least the experience that I have today. Simply to level the playing field a little bit, and to make the integration with the metadata slightly easier as well, because in those cases it's highly important that all of the metadata is in sync. But, yeah, that's more or less the ballpark that I see there.
Jan Ozer:A couple of questions about origination streams with HESP and Theo Live. What are the recommendations in terms of configuration for the origination stream? So GOP size, B-frames, profiles, codecs?
Peter Jan Spielmans:B-frames and low latency: always a bad idea. That's not just my opinion, I hope. So that's something that I would not recommend for the origination stream. Beyond that, it really depends on what kind of output you want to get. So what we see is, when people want to output a 1080p stream, yeah, it doesn't make a lot of sense to send us a 4K feed. Similarly, there are customers who want to output 720p, very common as well; sometimes it's even, what is it, 576p for some of the betting, or when they don't have rights to go higher than that, and they run it at like 2 megabits or something. Yeah, then don't send us a 16 megabit stream. It makes a lot more sense to take like a 2 megabit or 4 megabit style stream, send that to us, and then we can take it from there. And obviously, frame rates: don't force us to transform 25 frames per second to 30, or vice versa. That's, of course, not something you should be doing.
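(As a concrete example, a contribution encode along those lines might look like the following ffmpeg command line; this is only a sketch of the recommendations above. The source, ingest URL, and bitrates are placeholders, the resolution and frame rate should already match what you want delivered, and -bf 0 disables B-frames while -g 50 at 25 fps gives a two-second GOP on the contribution feed.)

ffmpeg -i <your_source> \
  -c:v libx264 -preset veryfast -tune zerolatency \
  -bf 0 -g 50 -r 25 \
  -b:v 4M -maxrate 4M -bufsize 8M \
  -c:a aac -b:a 128k -ar 48000 \
  -f flv "rtmps://<ingest-host>/live/<stream-key>"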
Jan Ozer:Question about HEVC versus H.264: any preference?
Peter Jan Spielmans:We take both, but most people still send us H.264 today, which is fine. We can take either.
Jan Ozer:And I guess the last question is about hardware versus software encoders on the origination side. I mean, how many people are sending you a stream from Wirecast or OBS, versus some of the hardware encoders that are out there?
Peter Jan Spielmans:That's a good question. I do know that there's a lot of people on the event side that are still using things like OBS or Wirecast. I think most of the more serious content that we have, I mean they have dedicated devices for this kind of contribution. So probably well, a part is still going to be in software, but a nice part is probably in hardware as well.
Jan Ozer:Which protocols are you seeing being streamed to you RTMP and SRT or what are you seeing as the mix now?
Peter Jan Spielmans:Most of it is still RTMPS, and the reason for that appears to be relatively straightforward. With SRT there are a lot of good tools, but very often there's not a lot of flexibility in how big you want the buffer sizes to be, and as a result we sometimes see that using SRT actually adds latency on top of RTMP. So if the network connection is stable, there's not always a real added benefit to using SRT.
Jan Ozer:One other question popped up. You were one of the first implementers of LCEVC. What are you seeing on that front? Are you seeing increased adoption? Is it about to explode, or what's your sense of what's happening with that codec?
Peter Jan Spielmans:It's an interesting story, I think. Are we seeing an increased interest? Absolutely. Are we seeing a lot of adoption today? I think that the answer is unfortunately no, but interest-wise that's definitely something which is increasing.
Jan Ozer:What does that mean? I mean, does that mean it's about to pop or you just still don't know that it's going to be successful or not?
Peter Jan Spielmans:It's difficult to say. If there is one thing still holding it back, I think it's the DRM question, which, especially for most of our customers, makes it very difficult. You can't do DRM with LCEVC today, or at least definitely not hardware-based DRM, and for most of the premium content, that's still a limitation. For some of the user-generated content it's obviously not an issue, and as a result I do see a lot more interest coming from that corner. But I know a lot of the big telcos and some of those types of customers as well, they've looked at it, they're interested in it, they want to test with it. But once they hit the DRM wall, yeah, that's usually when interest goes to sleep again, until that gets solved as well.
Jan Ozer:Well, give us a couple of websites. I guess you have Theo Technologies. What's your website? Is it theoplayer.com?
Peter Jan Spielmans:Yes, theoplayer.com is still the place where you can find, well, almost all of the information. Hespalliance.org is probably a good source for HESP kind of information. Those are at least the two places where I'm most active.
Jan Ozer:this was, uh, this was a lot of fun and pretty interesting stuff, so thanks for agreeing to chat with us.
Peter Jan Spielmans:It was a pleasure for me as well.
Jan Ozer:This episode of Voices of Video is brought to you by NetInt Technologies.
Peter Jan Spielmans:If you are looking for cutting-edge video encoding solutions, check out NetInt's products at netint.com.