The push for lossless seems more like pushback on low bitrate and reduced dynamic range by avoiding compression altogether. Not really a snob thing as much as trying to avoid a common issue.
The video version is getting the Blu-ray, which is significantly better than streaming in specific scenes. For example, every scene that I have seen with confetti on any streaming service is an eldritch horror of artifacts, but fine on physical media, because streaming compression just can’t handle that kind of fast-changing detail.
It does depend on the music or video, though; the vast majority are fine with compression.
My roommate always corrects me when I make this same point, so I’ll pass it along: Blu-rays are compressed using H.264/H.265, just less aggressively than streaming services.
Higher bitrate though, innit
Significantly so: streaming is 8–16 Mbps for 4K, whereas 4K discs can exceed 100 Mbps.
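Back-of-the-envelope on what that gap means for a streamer’s bandwidth bill (bitrates here are the rough figures from this thread, not official specs):

```python
# Rough data usage per hour of video at a given average bitrate.

def gb_per_hour(mbps):
    # megabits/second * 3600 seconds, / 8 bits per byte, / 1000 MB per GB
    return mbps * 3600 / 8 / 1000

print(gb_per_hour(15))   # typical 4K stream  -> 6.75 GB/h
print(gb_per_hour(100))  # 4K disc territory  -> 45.0 GB/h
```

Serving disc-quality bitrates would multiply a streamer’s per-viewer data costs by roughly 6–7x, which is the economic pressure the rest of this thread is arguing about.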
🤓☝️ Many older Blu-rays also used VC-1.
Or worse. I think it was the original Ninja Turtles movie that I owned on DVD, and the quality of it kind of sucked. Years later I got it on Blu-ray and I swear they just ripped one of the DVD copies to make the Blu-ray disc.
Sadly, that basically feels like what happened with The Fellowship of the Ring’s theatrical cut Blu-ray, too. It just doesn’t look that great.
Then the extended edition has decent fidelity but some bizarro green-blue color grading.
People don’t like hearing this, but streaming services tune their encoders for properly calibrated TVs. Very few people have properly calibrated TVs. In particular, people really like to up the brightness and contrast.
A lot of scenes that look like mud are that way because you really aren’t supposed to be able to distinguish between those levels of blackness.
That said, streaming services should have seen the 1000 comments like the ones here and adjusted already. You don’t need Blu-ray levels of bits to make things look better in those dark scenes; you need to tune your encoder to allow it to throw more bits into the void.
Lmao, I promise streaming services and CDNs employ world-class experts in encoding, both in tuning and development. They have already pored over maximizing quality vs cost. Tuning your encoder to allow more bits in some scenes by definition ups the average bitrate of the file, unless you’re also taking bits away from other scenes. Streaming services have already found a balance of video quality vs storage/bandwidth costs that they are willing to accept, which tends to be around 15 Mbps for 4K. That will unarguably provide a drastically worse experience on a high-enough-quality TV than a 40 Mbps+ Blu-ray. Like, day and night in most scenes and even more in others.
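A toy sketch of that zero-sum point (all numbers invented): with a fixed average bitrate the total bit budget is fixed, so weighting dark scenes more heavily has to pull bits from everything else.

```python
# Toy bit allocation under a fixed total budget (numbers invented).
# "Weights" stand in for how strongly an encoder favors each scene.

def allocate(total_mb, weights):
    s = sum(weights)
    return [total_mb * w / s for w in weights]

# Three scenes: bright action, dialogue, dark scene.
base    = allocate(500, [3.0, 1.0, 1.0])  # default tuning
boosted = allocate(500, [3.0, 1.0, 3.0])  # favor the dark scene

# The total never changes -- the dark scene's gain comes out of the others.
print(base)  # [300.0, 100.0, 100.0]
print(boosted)
```

Under the boosted weighting every other scene loses bits, because the 500 MB budget is unchanged; the only way to avoid that trade is to raise the budget, i.e. the average bitrate.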
Calibrating your TV, while a great idea, can only do so much against low-bitrate encodes and the fake HDR that services bake in solely to trigger the HDR popup on your TV and trick it into upping the brightness, rather than to actually improve color accuracy/vibrancy.
They don’t really care about the quality, they care that subscribers will keep their subscriptions. They go as low quality as possible to cut costs while retaining subs.
Blu-rays don’t have this same issue because there are no storage or bandwidth costs to the provider, and people buying Blu-rays are typically more informed, have higher-quality equipment, and care more about image quality than your typical streaming subscriber.
I fail to see where TV calibration comes in here tbh. If I can see blocky artifacts from a low bitrate, they will show up on any screen unless you turn the brightness down so far that nothing is visible.
Blocky artifacts typically appear in low-light situations. There are cases where it’s just blocky due to not having enough bits (high-motion scenes), but there are plenty of cases where low-light tuning is where you’d end up noticing the blockiness.
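One way to see why shadows give it away first (step size and pixel values invented for illustration): the same quantization step is a much bigger relative error on a dark value than on a bright one, and vision is roughly sensitive to relative luminance differences.

```python
# Toy illustration: a fixed quantization step hurts dark values far more
# in relative terms than bright ones. Real codecs quantize transform
# coefficients rather than raw pixels, but the intuition carries over.

def quantize(value, step):
    return (value // step) * step

step = 16.0
dark, bright = 20.0, 200.0

dark_err   = (dark - quantize(dark, step)) / dark        # 4/20  = 20%
bright_err = (bright - quantize(bright, step)) / bright  # 8/200 = 4%
print(dark_err, bright_err)
```

Same step size, five times the relative damage in the shadows, which is where the banding and blocking become visible first.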
Blocky artifacts are the result of poor bitrates. On streaming services it’s due to over-compressing the stream, which is why you see it when part of a scene is still or during dark scenes. It’s due to the service cheaping out and sending UHD video at 720p bitrates.
Look, this is just an incorrect oversimplification of the problem. It’s popular on the internet but it’s just factually incorrect.
Here’s a thread discussing the exact problem I’m describing
https://www.reddit.com/r/AV1/comments/1co9sgx/av1_in_dark_scenes/
The issue at play for streaming services is that they have a general pipeline for encoding. I mean, it could be described as cheaping out, because they don’t do enough QA spot-checking and special-casing of encodes to make sure the quality isn’t trash. But it’s really not strictly a “not enough bits” problem.
The thing is, dynamic range compression and audio file compression are two entirely separate things. People often conflate the two by thinking that going from WAV or FLAC to a lossy format like MP3 or M4A means the track becomes more compressed dynamically, but that’s not the case at all. An MP3 and a FLAC version of the same track will have the same dynamic range.
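To make the distinction concrete, here’s a minimal sketch of what dynamic range compression actually is: a gain curve applied to the signal, usually at mixing/mastering time (threshold and ratio values here are arbitrary, not from any real tool). A lossy encoder does nothing of the sort; it discards perceptually masked detail and leaves the loud/quiet relationship intact.

```python
# Minimal hard-knee compressor on normalized samples in [-1, 1].
# Threshold/ratio are arbitrary example settings.

def drc(sample, threshold=0.5, ratio=4.0):
    mag = abs(sample)
    if mag <= threshold:
        return sample                                 # quiet parts pass through
    out = threshold + (mag - threshold) / ratio       # loud parts pulled down
    return out if sample >= 0 else -out

quiet, loud = 0.2, 1.0
print(drc(quiet), drc(loud))  # 0.2 0.625 -- the loud/quiet gap shrinks
```

That shrinking gap between loud and quiet is the “loudness war” complaint; it happens in the master, so the FLAC and the MP3 made from the same master inherit exactly the same dynamics.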
And yes, while audible artifacts can be a thing with very-low-bitrate lossy compression, once you get to 128 kbps with a modern lossy codec it becomes pretty much impossible to hear in a blind test. Hell, even 96 kbps Opus is audibly transparent for the vast majority of listeners.