• eltimablo@kbin.social · 10 months ago

    This is the stupidest idea I’ve ever heard. You don’t fine a bank for getting robbed. This reeks of frontend engineer idiocy, which is ironically the exact type of idiocy that tends to cause breaches like this.

    • demesisx@infosec.pub · edited · 10 months ago

      Every time some corporatist replies to me, they’re always kbin.

      Your analogy falls apart with even a cursory thought about the difference between banks (which are required to carry insurance that makes a customer whole again after a loss) and corporations that throw all of their customers’ data onto a portal lacking basic protections. Once that personal data is compromised, there is no way to make the customer whole, and no amount of fines will EVER right that wrong. In a properly regulated, just society, a bank would ABSOLUTELY be fined back to the Stone Age if it left its customers’ cash in the middle of a town square, for example.

      Be better, you corporate cuck.

      • eltimablo@kbin.social · 10 months ago

        Ok then, how about considering that this will only serve to benefit the big tech companies, because they’re the ones that can afford the fines? Between cleanup costs and lawsuits, a breach is usually already enough to put a smaller company out of business. Why make it easier for the big tech companies to maintain power?

        • demesisx@infosec.pub · edited · 10 months ago

          Did you even read my comment? I specifically mentioned that the size of the fine could be tied to their market cap.

          If you worked in cybersecurity, you’d know that there are best practices in place and that it is a WELL UNDERSTOOD FIELD. The main advice everyone gives is to never roll your own cryptography…and that is EXACTLY what many of the hacked companies did.

          Taking a shortcut and hiring shitty devs who just pull in some random NPM package for security and call it a day is exactly why there are so many breaches. Just as bridges must be built to withstand double or triple their rated load, there should be STANDARDS in place whose violation is subject to fines.
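          The “never roll your own cryptography” advice above can be illustrated with a short sketch in Python: instead of inventing a homemade hashing scheme, lean on vetted standard-library primitives (PBKDF2 for key stretching, a cryptographically secure salt, constant-time comparison). The function names here are illustrative, not from any comment in this thread.

```python
import hashlib
import hmac
import secrets

ITERATIONS = 600_000  # illustrative PBKDF2 work factor; tune for your hardware

def hash_password(password: str, salt: bytes = None):
    """Derive a password hash with PBKDF2-HMAC-SHA256, a vetted stdlib primitive."""
    if salt is None:
        salt = secrets.token_bytes(16)  # cryptographically secure random salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)

# Usage: store (salt, digest); never store the plaintext password.
salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))
print(verify_password("wrong guess", salt, digest))
```

          The point is not that this snippet is bulletproof, but that every security-critical step here is delegated to a reviewed, standardized primitive rather than improvised.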

          Companies like Google would basically have to build SUPER SECURE technologies lest they be bankrupted by a breach.

          In conclusion, please try to remove your tongue from your exploitative employer’s backside.

          • eltimablo@kbin.social · 10 months ago

            I did miss that, but again, it’s additional fines on top of an almost guaranteed lawsuit, for something that may not even be the company’s fault. If a company got owned by a Heartbleed exploit back when it was first announced and a fix wasn’t available yet, should it be held responsible for that? What about when it gets hit by a vuln that’s been stockpiled for a couple of years and purposely has no fix due to interference from bad actors? There are a lot of situations where fining someone for getting breached doesn’t make sense.

            • demesisx@infosec.pub · 10 months ago

              You make great points, but my final point is this: if a company simply cannot guarantee protection of user data, it shouldn’t be trusted with user data in the first place.

              • eltimablo@kbin.social · 10 months ago

                And I’ll counter with this: no system is perfect, especially when major parts are made by non-employees. Mistakes can and do happen because corporations, regardless of size, are made up of humans, and humans are really good at fucking up.

                • demesisx@infosec.pub · 10 months ago

                  I’m not trying to get the last word, I swear! 🤣

                  Go back to my bridge analogy and test that against what you just said.

                  Your comment equates to: “oh well, that bridge falling killed thousands of people. At least we were able to allow them to fail in the crucible of the free market!”

                  • eltimablo@kbin.social · edited · 10 months ago

                    Your bridge analogy falls apart because standards already exist (FIPS, among others) that are shockingly insecure despite having been updated relatively recently, and yet we still have breaches. If the standards were effective, places like AmerisourceBergen, the country’s largest pharmaceutical distributor, wouldn’t be supplementing them with additional safeguards. No standard is going to work perfectly, or even particularly well, for everyone. Bridges still fall down.

                    EDIT: Alternatively, there would need to be a provision that allows companies to sue the government if they get breached while following their standards, since it was the government that said they were safe.