• 0 Posts
  • 142 Comments
Joined 2 years ago
Cake day: June 15th, 2023


  • The actual paper presents the findings differently. To quote:

    Our results clearly indicate that the resolution limit of the eye is higher than broadly assumed in the industry

    They go on to use the iPhone 15 (461 ppi) as an example, saying that at 35 cm (1.15 feet) it has an effective “pixels per degree” of 65, compared to “individual values as high as 120 ppd” in their human perception measurements. You’d need the equivalent of an iPhone 15 at 850 ppi to hit that, which would be a tiny bit over 2160p/UHD.

    Honestly, that seems reasonable to me. It matches my intuition and experience that for smartphones, 8K would be overkill, and 4K is a marginal but noticeable upgrade from 1440p.

    If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish

    Three paragraphs in and they’ve moved the goalposts from HD (1080p) to 1440p. :/ Anyway, I agree that 2.5 meters is generally too far from a 44" 4K TV. At that distance you should think about stepping up a size or two. Especially if you’re a gamer. You don’t want to deal with tiny UI text.

    It’s also worth noting that for film, contrast is typically not that high, so the difference between resolutions will be less noticeable — if you are comparing videos with similar bitrates. If we’re talking about Netflix or YouTube or whatever, they compress the hell out of their streams, so you will definitely notice the difference, if only by virtue of the different bitrates. You’d be much harder-pressed to spot the difference between a 1080p Blu-ray and a 4K Blu-ray, because 1080p Blu-rays already use a sufficiently high bitrate.
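For anyone who wants to run this kind of check at their own screen size and viewing distance, the pixels-per-degree arithmetic is short. Here's a sketch using the standard small-angle visual-angle formula (the helper names and the 44" QHD example numbers are mine, for illustration; the paper may compute ppd slightly differently):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance.

    One degree of visual angle spans 2 * d * tan(0.5 deg) inches
    at distance d (small-angle geometry).
    """
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

def display_ppi(diag_in: float, res_w: int, res_h: int) -> float:
    """Pixel density from diagonal size and native resolution."""
    return math.hypot(res_w, res_h) / diag_in

# 44" QHD (2560x1440) TV viewed from 2.5 m (~98.4 inches)
tv = pixels_per_degree(display_ppi(44, 2560, 1440), 98.4)
print(f"{tv:.0f} ppd")  # ~115 ppd: above the classic 60 ppd rule of
                        # thumb, below the paper's reported 120 ppd peak
```

Swap in your own diagonal, resolution, and distance to see which side of the threshold your setup lands on.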




  • Thanks for posting the solution!

    If you happen to be using a BTRFS or XFS file system, you might want to try duperemove. It will help you reclaim usable disk space without deleting any files, by using those filesystems’ built-in support for data deduplication and copy-on-write. In other words, it will make duplicate files point to the same data on disk, but still work as individual files. Files will appear and function exactly the same, and editing one copy will not change another (unlike with hard links, for example). That way it won’t interfere with cases like Flatpak or Python virtual environments where you really need multiple copies of the same files.



  • I very much enjoyed the start but steadily lost interest.

    There’s some good stuff in Discovery all the way through, don’t get me wrong. But they kind of flipped the script in a way I did not appreciate.

    Most of classic Trek showed us a future with a largely functional society, mostly full of good people who were ready and willing to deal with occasional corruption.

    Lots of newer Trek, and especially Discovery, showed us a future where society is largely dysfunctional and corruption is the norm. Almost everyone in the series who isn’t a main character (plus a couple who are) is a piece of shit. Even the “good guys” frequently encourage or at least tolerate clearly evil behavior as long as it serves their ends. But it’s okay because…friendship I guess?!?

    The writers’ hearts are in the right place, but the writing is generally bad. I think this generation of writers is incapable of imagining a better world, which, sure, is understandable, given how thoroughly corrupt our current society is. But it’s deeply depressing. It lacks soul.

    SNW is better in this regard. But you’ll probably want to watch season 1 of Discovery first since there’s some crossover.



  • Still good if you want ROM support or are willing to wait a few months to pick one up for dirt cheap.

    GrapheneOS only supports Pixels, and LineageOS only officially supports a few more models. If you filter the official LineageOS devices list to 2024/2025 models, you’ll see Pixels, the Moto G 5G, and the OnePlus 12R. That’s it. Options are similarly limited for Calyx, /e/OS, and others. So with most other recent phones, you’re stuck with all the stock bloat and spyware, or with unofficial community builds.

    Also, they’re dirt cheap in practice in the US. MSRP is a joke. For most of the year, you could get an unlocked, brand new Pixel 9 for less than the MSRP of the low-end 9a. If memory serves, it dropped under $400 at times.

    Aside from that, they kind of suck. I wouldn’t even compare them to high-end phones. They are mid-range phones masquerading as high-end. Credit to Google’s marketing department, I guess.





  • Generally speaking, xz provides higher compression.

    None of these are well optimized for images. Depending on your image format, you might be better off leaving those files alone or converting them to a more modern format like JPEG-XL. Supposedly JPEG-XL can further compress JPEG files with no additional loss of quality, and it also has an efficient lossless mode.
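If you want to sanity-check the “xz compresses harder” claim on your own data, Python’s standard library wraps all three formats (gzip/DEFLATE, bzip2, and xz/LZMA), so a quick comparison is a few lines. The sample data here is synthetic and illustrative, not a real benchmark:

```python
import bz2
import gzip
import lzma

# Synthetic but compressible sample data; results on real files will differ.
data = b"".join(b"record %d: lorem ipsum dolor sit amet\n" % i for i in range(20000))

sizes = {
    "gzip": len(gzip.compress(data, compresslevel=9)),
    "bzip2": len(bz2.compress(data, compresslevel=9)),
    "xz": len(lzma.compress(data, preset=9)),
}
for name, size in sorted(sizes.items(), key=lambda kv: kv[1]):
    print(f"{name}: {size} bytes ({size / len(data):.1%} of original)")
```

Keep in mind xz typically trades that extra ratio for more CPU time and memory, especially at the higher presets.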

    Do any of them have the ability to recover from a bit flip or at the very least detect with certainty whether the data is corrupted or not when extracting?

    As far as I know, no common compression algorithms feature built-in error correction, nor does tar. This is something you can do with external tools instead.

    For validation, you can save a hash of the compressed output. MD5 is a weak hashing algorithm, but it’s still generally fine (and widely used) for this purpose. SHA-256 is much more robust if you’re worried about deliberate malicious forgery, and not just random corruption.

    Usually, you’d just put hash files alongside your archive files with appropriate names, so you can manually check them later. Note that this will not provide you with information about which parts of the archive are corrupt, only that it is corrupt.
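As a concrete sketch of the “hash file alongside the archive” approach (the filenames are just examples, and the tiny throwaway file stands in for a real tarball):

```python
import hashlib

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large archives needn't fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a throwaway file; in practice `archive` would be your tarball.
archive = "backup.tar.xz"  # example filename
with open(archive, "wb") as f:
    f.write(b"pretend this is an archive\n")

# Same line format sha256sum uses, so the archive can later be verified
# with plain `sha256sum -c backup.tar.xz.sha256` and no special tools.
with open(archive + ".sha256", "w") as f:
    f.write(f"{file_sha256(archive)}  {archive}\n")
```

The two-space separator matters: that’s the format `sha256sum -c` expects, which keeps the whole scheme compatible with standard tooling.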

    For error correction, consider par2. Same idea: you give it a file, and it creates a secondary file that can be used alongside the original for error correction later.

    I also want the files to be extractable with just the Linux/Unix standard binutils

    That is a key advantage of this method. Adding a hash file or par file does not change the basic archive, so you don’t need any special tools to work with it.

    You should also consider your file system and media. Some file systems offer built-in error correction. And some media types are less susceptible to corruption than others, either due to physical durability or to baked-in error correction.