• BananaTrifleViolin@lemmy.world

    After being forced to standardise on USB-C and take responsibility for some of the e-waste it produces, Apple has finally relented.

    They fought tooth and nail against the EU regulations to force a charging standard. I don’t care if they upsell cables to some people; most people will reuse what they have, and that’s the whole point of the regulations.

    Regulation works.

    • MrSpArkle@lemmy.ca

      They had transitioned most of their devices to USB-C, save the iPhone, before the EU legislation went into effect.

      Apple caught shit for going USB-C only on their laptops years ago.

      • nilloc@discuss.tchncs.de

        They switched back to the much more durable MagSafe (3?) connector. I have three MagSafe MacBooks and one USB-C model. The only one I have charging issues with is the USB-C one, and it’s the newest by two years.

      • sugar_in_your_tea@sh.itjust.works

        Exactly, and it’s still kind of annoying years later on my work laptop (2019 MacBook Pro). I got a USB hub and now I have all those other ports, but that wouldn’t have been necessary if they’d just given me an HDMI and a USB-A port. The newer M-series MacBook Pros went back to having HDMI, which is really nice.

        I wish everything I had used the same port, but I’m not going to go out and repurchase everything just to standardize on one plug.

        • GamingChairModel@lemmy.world

          HDMI is a dogshit standard and everyone should’ve moved over to DisplayPort or Thunderbolt over the USB-C form factor.

          • sugar_in_your_tea@sh.itjust.works

            Nah, it’s totally fine, and it’s ubiquitous. Ideally, I get both, so if I’m connecting to a TV or something, I can use HDMI, and if I’m connecting to a monitor, I can use DP.

            • GamingChairModel@lemmy.world

              Are people connecting their laptops to TVs frequently enough that this should be built into every single unit shipped? I can’t imagine the percentage of users who actually use their HDMI ports is very high.

              • sugar_in_your_tea@sh.itjust.works

                Yes? Someone in my group connects to our work TV pretty much every day for our morning meeting, and I connect to a monitor at home and at work multiple times every day. Yeah, I guess you could ensure that every TV supports streaming and have a USB-C hub at every desk, but that sounds odd compared to just adding an HDMI port or something.

                • GamingChairModel@lemmy.world

                  You use HDMI for all those use cases? Thunderbolt seems like a much better fit for workstation docks, and DisplayPort is generally better for computer monitors and the resolutions/refresh rates useful for that kind of work. The broad base of cables and HDMI displays only supports HDMI 2.0, which caps out at 4K60. By the time HDMI 2.1 hit the market, Thunderbolt and DisplayPort Alt Mode had been out for a few years, so it would’ve made more sense to just upgrade to Thunderbolt rather than getting an all-new HDMI lineup.
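
                  Rough sketch of the bandwidth arithmetic behind that 4K60 cap. The link rates below are the usual published nominal figures; the little script is only illustrative, assumes 8-bit RGB, and ignores blanking, chroma subsampling, and DSC:

                  ```python
                  # Back-of-the-envelope check of which links can carry a given video mode.
                  # Rates are nominal usable data rates after line encoding; real limits also
                  # depend on blanking, chroma subsampling, and DSC.
                  LINKS_GBPS = {
                      "HDMI 2.0 (18 Gbps TMDS)": 14.4,
                      "HDMI 2.1 (48 Gbps FRL)": 42.6,
                      "DisplayPort 1.4 / USB-C DP Alt Mode (HBR3 x4)": 25.92,
                  }

                  def mode_gbps(width, height, hz, bits_per_pixel=24):
                      """Approximate uncompressed payload in Gbit/s (8-bit RGB, no blanking)."""
                      return width * height * hz * bits_per_pixel / 1e9

                  for name, w, h, hz in [("4K60", 3840, 2160, 60),
                                         ("4K120", 3840, 2160, 120),
                                         ("3440x1440 @ 144 Hz", 3440, 1440, 144)]:
                      need = mode_gbps(w, h, hz)
                      ok = [link for link, cap in LINKS_GBPS.items() if cap >= need]
                      print(f"{name}: ~{need:.1f} Gbit/s -> {ok or 'needs compression/subsampling'}")
                  ```

                  Uncompressed 4K60 needs roughly 12 Gbit/s, which fits within HDMI 2.0’s ~14.4 Gbit/s of usable bandwidth, while 4K120 or a high-refresh ultrawide does not.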

                  • sugar_in_your_tea@sh.itjust.works

                    Yep!

                    Thunderbolt only works for workstations if the monitor supports it, and none of my monitors at home do. My gaming PC doesn’t have USB-C out on the GPU, so even if my monitors supported it, I couldn’t use it. I do use DisplayPort for my gaming PC, but the monitor for my home office doesn’t have it.

                    I do have Thunderbolt at work, but it’s super finicky (sometimes I have to unplug and replug it a few times before it registers), and I’d honestly rather just use HDMI because it pretty much always works for me.

                    DisplayPort is only better than HDMI if your monitor needs more bandwidth than HDMI can carry, and HDMI covers all the resolutions and refresh rates I use (basically, the only things it can’t handle are high-res ultrawide screens or high-res, high-refresh screens). I don’t need high refresh for my work computer (I just use 1080p/60; I’m just dealing with text), so I’m well within that range. At work I use a high-res ultrawide, which is nice I guess, and I use Thunderbolt there. My coworkers, however, use HDMI with a dongle just fine on similar screens (the ones that don’t support Thunderbolt).

                    “just upgrade to Thunderbolt”

                    Yeah, I’m not going to throw out perfectly good hardware just to unify cables somewhat.

                    Adding an HDMI port really isn’t a big deal. Apple did that with the M-series chips after going USB-C only on the previous generation, so HDMI isn’t obsolete in any way. I only ever use two USB-C ports at a time anyway, and I’d honestly rather have a USB-A and an HDMI port on the other side than more USB-C ports. Variety > quantity IMO.

              • Honytawk@lemmy.zip

                Definitely.

                People who never connect their laptop to a second screen are in the minority.

                I’ve never encountered anyone who hasn’t done so, including Mac users.

                • GamingChairModel@lemmy.world

                  To a second screen, sure. But I’m saying that DisplayPort and Thunderbolt are so much better and are generally supported by more computer monitors (though probably fewer TVs). I’d be surprised if many people are using HDMI in particular.

          • Honytawk@lemmy.zip

            DisplayPort has worse connectors than HDMI. They break so regularly that I switched back to HDMI after every single one of those cables died.

    • 𝕸𝖔𝖘𝖘@infosec.pub

      Not if they add a chip to the official Apple cable that the iPhone/iPad/iWhatever checks for, refusing to properly charge or transfer data without it. At that point, a generic USB-C cable will only work for a short time before the device rejects it, forcing you to bin it and buy a new one, which negates the benefits of the regulation. Regulations do work, but they have to be thorough, and this one doesn’t cover all the corners.

      Edit: changed “when” to “if”; the original wording was causing confusion about what I meant.
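
      To make the worry above concrete, here is a purely hypothetical sketch of that kind of cable gating. It is not Apple code and not a documented mechanism; the CableIdentity type, negotiate_charge function, and allow-list are invented for illustration (0x05AC is Apple’s real USB vendor ID, everything else is made up):

      ```python
      # Hypothetical sketch of the cable-gating behaviour described above: the
      # device reads an identity/auth chip in the cable and limits charging
      # when it doesn't recognise it. Not Apple code, not a documented mechanism.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class CableIdentity:
          vendor_id: int                 # reported by the cable's marker chip
          signed_cert: Optional[bytes]   # only "authenticated" cables carry one

      APPROVED_VENDORS = {0x05AC}  # hypothetical allow-list (0x05AC = Apple)

      def negotiate_charge(cable: Optional[CableIdentity]) -> str:
          """Illustrative decision logic for a newly attached cable."""
          if cable is None:
              return "no chip: fall back to slow 'unsupported accessory' charging"
          if cable.vendor_id in APPROVED_VENDORS and cable.signed_cert:
              return "full-speed charging and data"
          return "reject or throttle: nudge the user toward an 'approved' cable"

      # A generic third-party USB-C cable without an auth chip hits the throttle branch:
      print(negotiate_charge(CableIdentity(vendor_id=0x1A2B, signed_cert=None)))
      ```

      That is the loophole being worried about: the port is standard, but the behaviour behind it wouldn’t be.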