There could be a #4 "historically, streaming could use protocols with unreliable delivery and with limited or no retransmission" (which is somewhat related to #1 and #2). For example, there have been media streaming protocols built on UDP rather than TCP, so packets that are lost are not automatically retransmitted. The idea is that for a real-time stream transmission, older frames are no longer considered relevant (as they would not be rendered at all if they were received late), so there is typically no benefit in retransmitting a dropped packet.
That means you could get drop-outs when data gets lost in transmission, but the overall data consumption of the protocol wouldn't go up as a result.
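A toy sketch of that trade-off (hypothetical frame numbers, not any real protocol): the receiver renders whatever arrives in time and silently discards anything stale, instead of requesting a retransmit.

```python
# Toy model of an unreliable media stream: packets carry a frame number,
# the network may drop or reorder them, and the receiver never asks for
# a retransmission -- a frame arriving after a newer one has already been
# rendered is simply discarded as stale.

def render_stream(packets):
    """packets: iterable of frame numbers in arrival order."""
    rendered = []
    latest = -1
    for frame in packets:
        if frame > latest:          # still in the future: render it
            rendered.append(frame)
            latest = frame
        # else: stale frame, drop silently (no retransmission request)
    return rendered

# Frames 0-5 sent; frame 2 is lost, frame 1 arrives late after frame 3.
arrivals = [0, 3, 1, 4, 5]
print(render_stream(arrivals))      # -> [0, 3, 4, 5]  (frame 1 arrived too late)
```

You get a brief drop-out where frames 1 and 2 should have been, but no extra bytes are spent recovering them.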
Not all that long ago, this prompted lots of debate about QoS and prioritization and paid prioritization and network neutrality and stuff. People were arguing that media streams needed higher priority on the Internet than downloads (and other asynchronous communications). Effectively, different Internet applications were directly competing with one another, yet they had very different degrees of tolerance to delays, packet reordering, and packet loss. Wouldn't ISPs have to intervene to prioritize some applications over others?
I remember reading from Andrew Odlyzko that this controversy was mostly resolved in an unexpected way: faster-than-realtime streams with buffering (as the network was typically faster overall than what was needed for a given level of media quality, you could use TCP to reliably download frames that were still in the future with respect to what would be played, and then buffer those locally). This is indeed the scenario depicted in this article.
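The buffering argument can be sketched numerically (made-up rates, not measured from any real service): as long as the average download rate beats the playback rate, a buffer built up in advance rides out even a total outage.

```python
# Why faster-than-realtime TCP delivery resolved the QoS debate:
# if the average download rate exceeds the playback rate, the client's
# buffer absorbs network hiccups and playback never stalls.

def simulate(download_per_tick, playback_per_tick=1.0, ticks=60):
    """download_per_tick: seconds of video fetched in each tick.
    Returns the number of ticks where playback stalled."""
    buffered = 0.0
    stalls = 0
    for fetched in download_per_tick[:ticks]:
        buffered += fetched
        if buffered >= playback_per_tick:
            buffered -= playback_per_tick   # play one tick of video
        else:
            stalls += 1                     # buffer empty: visible rebuffering
    return stalls

# Download at 1.5x realtime, with a 10-tick total outage in the middle:
rates = [1.5] * 25 + [0.0] * 10 + [1.5] * 25
print(simulate(rates))  # -> 0: the buffer built up beforehand covers the gap
```

With 25 ticks of 1.5x delivery the client has banked 12.5 seconds of video, so a 10-second outage never reaches the screen.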
What about actual live events? My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay relative to realtime, specifically to allow for significant buffering on the client, and then using reliable TCP faster-than-realtime downloads of the "near future" of the video content. Since these streams are purely unidirectional, users don't have a way to notice that they're not literally live. (I don't understand how this interacts with the typical ability to start watching almost instantly, with no visible buffering delay, though.)
There's a big difference for bidirectional conversation, like phone calls, because there even tiny delays are extremely psychologically noticeable. It appears that Zoom, for instance, is still using unreliable UDP streams for call content, which allows skips or dropouts, but keeps the latency relatively low so that it feels comparatively more like a face-to-face interactive conversation.
> My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay relative to realtime,
Yeah. The rule of thumb with Twitch used to be 11 seconds. You can still measure this because many streams replay the chat in the stream as an overlay for both being able to see when the streamer has seen your message and for archival purposes to preserve the chat in VODs.
> don't understand how this interacts with the typical ability to start watching almost instantly, with no visible buffering delay, though.
There's a buffer on the CDN (which they have anyways because they're recording the VOD) and you start playback at the point t seconds back.
There is a low-latency mode for Twitch now that I think is enabled by default. The actual delay is reported in the "stats" view under advanced settings, probably coming from timestamps in the video metadata. There are third-party Twitch clients I've used with the option to delay the chat to match the video using this information, which is useful if you don't want chat popping off to spoil you.
Good point. I didn't think of that, but that's right.
I have seen significant delays in that situation, so maybe a better way to say this is "using text rather than voice for feedback makes the delays introduced this way less psychologically noticeable" (because they are noticeable in the situation you mention).
On the grasping hand, sometimes the feed is deliberately delayed, such as when the streamer is showcasing some kind of competitive activity where their own information could be used against them.
> What about actual live events? My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay relative to realtime, specifically to allow for significant buffering on the client, and then using reliable TCP faster-than-realtime downloads of the "near future" of the video content. Since these streams are purely unidirectional, users don't have a way to notice that they're not literally live. (I don't understand how this interacts with the typical ability to start watching almost instantly, with no visible buffering delay, though.)
For TV, the last time I worked on a system like this, the clients received data the same way as non-live streams: HTTP streaming (HLS or DASH), where you fetch playlists and small video files and the player stitches them all together. There's buffering along the pipe; the 30-60s total delay (which you'll notice if you watch sports and chat with someone who has cable and is watching the same thing) is cumulative, so you don't see a one-minute startup delay; you just near-instantly get dropped into something that's already quite a bit behind.
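A rough illustration of the playlist-plus-segments mechanism (the playlist below is invented, not captured from any real service): the player repeatedly fetches a small text playlist and then the short segment files it lists.

```python
# Minimal sketch of the HLS side of live streaming: the client polls a
# text playlist (.m3u8) and downloads the small media segments it lists.
# This sample playlist is made up for illustration.

SAMPLE_PLAYLIST = """\
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:1042
#EXTINF:2.000,
segment_1042.ts
#EXTINF:2.000,
segment_1043.ts
#EXTINF:2.000,
segment_1044.ts
"""

def segment_urls(playlist_text):
    """Return the media segment entries (non-comment, non-blank lines)."""
    return [line for line in playlist_text.splitlines()
            if line and not line.startswith("#")]

print(segment_urls(SAMPLE_PLAYLIST))
# -> ['segment_1042.ts', 'segment_1043.ts', 'segment_1044.ts']
```

For a live stream the server keeps appending new segments to the playlist, and the number of segments the player stays behind the newest one is exactly the latency being discussed here.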
Not sure what Twitch does. The over-the-network video-game streaming services are obviously completely different from TV land; they couldn't get away with that there. But for TV, the expense of doing better isn't seen as worth it.
Twitch is HLS, but they've tightened the buffers and shortened the segments (2s is standard), so latencies down to a couple of seconds are common. It's quite impressive, tbh.
No need for notifications, even: you can literally hear latency varying by up to 30 seconds by listening for the cheers during an important game in a block of flats on a warm summer night.
> My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay
This used to be the case, and may still be for some streamers, but mostly when I watch it's less than a couple of seconds of delay with the low-latency mode enabled in the browser.
> ...you could use TCP to reliably download frames that were still in the future with respect to what would be played, and then buffer those locally
I was mucking around with my network recently, with Netflix playing in the background. Rebooted the router, and to my utter surprise, the stream continued to play uninterrupted for the entire (30+ seconds) time it takes my network stack to reinitialize. I did not realize how aggressively the providers buffer, but it completely papered over the lack of internet service for the window.
With YouTube on desktop, you can right-click and turn on "Stats for Nerds" to watch the buffer health in realtime, to see how much buffer it has decided your connection needs and how often it refills it.
>Since these streams are purely unidirectional, users don't have a way to notice that they're not literally live.
Depending on the delay, this can cause problems when switching from delayed streaming to real life. For example, watching the countdown on a rocket launch via streaming then going outside to watch the actual launch. Usually, for me, a few seconds delay is OK, because I can't see the rocket until about 30 seconds after liftoff due to trees. But when I have a better view of the launch pad the delay can become an issue.
I remember having countless depressing conversations about this all the way back in the very early 2000s when potential clients wanted us to program a video "streaming" system that did not allow downloading, and fruitlessly trying to convince them that streaming was downloading--there's no meaningful technical difference. People were convinced that "streaming" was some weird distinct mode that the Internet could be converted into, and that you just need to program harder to do it.
Streaming does not by default save to device. There are ways around it; these are pointless to invest too much in fighting. Making streaming good and reasonably priced makes legit customers out of pirates.
You will never defeat piracy through technology, only through economics.
I remember going into local browser cache folders and pulling out YouTube videos in full. Am I remembering wrong or did in fact the #1 video streaming platform simply just download the videos to your hard drive, same as you would have with right click save? Only difference is the default folder it goes to.
But that's not accurate. It does download the content to the device, but only stores it in the browser cache as chunks; otherwise you could not buffer or replay.
I remember a line from the 2002 movie Big Fat Liar involving a school assignment, that went something like, "And don't even think about downloading something from the internet; I want that essay hand-written".
One reason I remember it is the implicit assumption that one couldn't transcribe a digital essay onto physical paper.
The other reason is because it seemed strange to call it "downloading" when I was imagining a web page. Aside from the possibility that it was downloading a non-HTML document file, "downloading" didn't feel right for "visiting a web page", even though of course it is downloading in a more technical sense.
Like "streaming", downloading a web page into your browser's memory isn't saving it to long term storage.
Large CD-collection, large DVD/BD collection. I go to the cinema etc.
Problem is a lot of media is missing in the mainstream, much of the less popular stuff is kept alive on the private trackers. My philosophy nowadays is that it's become an archival endeavor that I want to contribute to.
I wish we could all pay an internet fee for free access to all media, as proposed by Stallman in Free Software, Free Society, but we're going backwards, towards fragmentation like cable TV back in the 90s with ever-increasing prices (watch Black Mirror S07E01 "Common People" for an ironic take on this).
I do, but still routinely resort to the high seas. Many reasons:
Forcing me to download 1080p when 4K is stream-only (how does that make sense, especially when your CDN is too far away to effectively stream 4K)
Downloads that expire after a few weeks
Sad lack of localized subtitles (Netflix is the worst offender here and Apple is the best. While there are some third-party browser plugins for this, they obviously don't work on TVs)
Can't add multiple subtitle tracks (i.e. watching the same movie with parents who understand neither the movie's original language nor English subtitles)
Free national content that's only accessible locally. As an expat who spends quite a bit of time with our tiny community abroad, I find it ridiculous that we can't easily access content my kids could watch to learn my language. Such a simple step.
Cherry on the cake: how come pirate sites are even better at search + filtering?
People willing to put in an effort to acquire and lovingly conserve the media of interest to them are often the people who really care about that media. They are often the most eager to support the creators of different media.
Which is extremely logical and obvious, if one can quickly lift one's head up above the "anti-piracy" propaganda of the major copyright-wielding creativity-killing companies spewing out the same drivel year after year.
Conversely, I have found that the same people who will happily equate "having a spotify subscription" to "supporting artists", who do things like attack people who jailbreak their kindle or whatever, these people are often the greatest thoughtless vibers when it comes to media. Try asking someone like that to name a piano player, or a bassist, or a drummer. Ask them to name three directors.
They'll know celebrities, not musicians or actors. They'll know to attack "pirates" on cue, but have no conception where their money goes every month when their subscriptions are billed.
On capable devices, actual downloading is even supported as a USP by most providers for offline/travel scenarios.
Besides that there are even more externalities that differentiate them:
Client and User requirements and targeted devices, therefore mass adoption and market penetration.
Downloading requires comparatively expensive hardware, usually in quite complicated setups, for a TV-like experience. It also requires the user to do active file management (which includes deleting files at some point, or buying more expensive local infrastructure); to become a mass-market consumer thing, this needs to be externalized.
A streaming client is way cheaper to build and market, since it doesn't need any relevant amount of non-volatile memory to speak of. That the user experience is easier to sell is also quite obvious, as witnessed by the golden last decade; it's only now getting tainted by encroaching advertising, platform proliferation, etc.
(Music) Streaming being a "rented" download is the analogy I used to use back in the day.
e.g. the "rented" downloads can be removed from file system at any time by the service you've "rented" from, while a "purchased" or "owned" download is only removed by the person who purchased it.
Streaming:
1) Remote: the server provides chunks of data, each with a predefined length. Even for text, you have to identify each chunk in order to process it (the `Transfer-Encoding: chunked` HTTP header, for example).
> There is no local streaming on the remote computer.
2) Local: a storage system can stream any length of data from the storage drive to RAM.
Downloading:
1) Remote: the client requests chunks of data, of any length. You don't need to identify the chunks; you can append the downloaded data without any extra processing.
> There is local streaming: the remote computer actually streams data from its storage to RAM.
2) Local: copying from a peripheral device is also called downloading; I've seen a "Downloading" label in a microcontroller burner.
Presenting, storing, or deleting the data is your choice either way. And not only with streams: you can also watch, listen to, or read downloaded content without storing it on an actual drive, or without even finishing the download. These are all actions that have nothing to do with the techniques behind the terms.
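For the `Transfer-Encoding: chunked` framing mentioned above, here's a minimal decoder sketch (standard chunk syntax: a hex length line before each chunk, terminated by a zero-length chunk; trailers and chunk extensions are ignored for simplicity):

```python
# Decode an HTTP chunked-encoded body: each chunk is prefixed by its
# length in hex followed by CRLF, and a zero-length chunk ends the stream.

def parse_chunked(body: bytes) -> bytes:
    """Decode a chunked-encoded body into the raw payload."""
    payload = b""
    pos = 0
    while True:
        eol = body.index(b"\r\n", pos)
        size = int(body[pos:eol], 16)        # chunk size, in hex
        if size == 0:
            break                            # terminating zero-length chunk
        start = eol + 2
        payload += body[start:start + size]
        pos = start + size + 2               # skip the chunk's trailing CRLF
    return payload

wire = b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n"
print(parse_chunked(wire))  # -> b'Wikipedia'
```

This is the "identify each chunk to process it" step: the receiver has to parse the length lines, whereas a plain download can just append bytes as they arrive.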
At Koofr[1] one of the most requested features was an option to prevent downloading files from public links. We didn't want to lie to our users so we added a "Hide download button" option because that's the only thing you can do. You can hide the download button but you can never really prevent the download.
>> rights holders have engaged in a fundamentally-doomed arms race of implementing copy-protection strategies
Not entirely true. They simply haven't succeeded in creating an industry-standard secure pipeline to the pixels on the display. Aside from the "analogue hole", eventually all of the gaps will be plugged, the same way we use secure sockets today. All media devices (including home HDTV/8K/etc.) will extend the chain of trust farther into the pipeline. A set of signed apps and hardware will be required to watch any DRM films on HDTV, with each stage using authenticated encryption, completely annihilating any MITM siphoning of the video.
So it's not doomed, just moving slowly, but it absolutely WILL arrive. I know, because I'm working on secure embedded video codec hardware, and our customers are targeting this.
At some point you hit the pixel driver with a bunch of bits; unless your pipeline involves digital signing of copyrights in everyone's future cyber eyeballs, it will always be possible to get the video if you have hardware access.
And the article goes over how there is already an industry standard for the encryption pipeline that goes all the way to monitors and television sets themselves and how you can get a cheap device which just pretends to be a TV and passes on an unencrypted HDMI out.
The end goal is end-to-end protection with online verification. As far as I can tell, we are already halfway there. The highest level of Widevine protection in use today essentially involves the streaming server having a private encrypted conversation directly with your GPU. That includes a certificate that can expire due to age and be revoked due to suspicion of tampering. If anything is not up to snuff, you'll get downgraded content at best and a ban at worst.
The next logical step is to extend this process down the chain to include every device from the GPU to the display.
In order to make a fake TV work, you'd likely need to take a real TV and hack it. That's going to get increasingly difficult and various watermarking techniques will likely allow it to be identified and blacklisted anyway.
There are still people watching television on 1980s hardware. Full HD televisions have been essentially feature-complete for over 20 years and should remain relevant for another 20, since the vast majority of broadcasts are still 480p and 720p. There are now hundreds of millions of 4K and 8K televisions and projectors with expected service lives and lifecycles extending into the 2050s.
Bricking those devices en masse is a PR disaster and invites legal scrutiny from regulators, and any individual service suddenly requiring special hardware is shooting itself in the face financially.
> since the vast majority of broadcasts are still 480p and 720p.
I don’t think I’ve seen anything below 1080p on Xfinity cable in the USA for at least 10 years. Even older content is typically upscaled at the broadcast source (e.g. Seinfeld reruns)
Are you referring to over-the-air broadcasts? Or cable/satellite broadcasts?
All it takes is one person to figure out how to get the bits out, and then the only other potential solution would be to make devices that cannot play unencrypted content.
>I know, because I'm working on secure embedded video codec hardware, and our customers are targeting this..
Why? Or more specifically, why you, doing that?
You can say no, you know. To solve your problem, you're building, for some of the least scrupulous people on the planet (Hollywood types), the primitives for a guaranteed technologically enforceable tyranny. Remember that just because someone says they won't do something with a thing doesn't mean the heel turn isn't coming. Sometimes you just don't build things, because people can't be trusted with them.
So, why are you doing it?
You might think it's just harmless bits now... But today's harmless bits are tomorrow's chain links. Seriously asking. Might help me out of a mental hang up I'm trying to work through.
Plenty of streamers show the chat on screen and talk with people in the chat. This is not true.
This can be a problem, for example when sports fans receive out-of-band notification of a goal before they see it happen on their "live" stream.
On the other hand, that will get you plenty of job working for those people.
Sailing the high seas since Napster.
I give thanks to:
BitTorrent - Private trackers - Subsonic API - Navidrome - invidious - yt-dlp - Infuse - mpv
One can always subscribe to all major video and audio platforms, visit cinema's and concerts, buy merchandise, but still sail the high seas..
I'm caricaturing, but these are my experiences.
[1] https://koofr.eu