cross-posted from: https://lemmy.ml/post/14100831

"No, seriously. All those things Google couldn’t find anymore? Top of the search pile. Queries that generated pages of spam in Google results? Fucking pristine on Kagi – the right answers, over and ov

  • Optional@lemmy.world (OP) · +81/-2 · 9 months ago

    Even after all that payola, Google is still absurdly profitable. They have so much money, they were able to do an $80 billion stock buyback. Just a few months later, Google fired 12,000 skilled technical workers. Essentially, Google is saying that they don’t need to spend money on quality, because we’re all locked into using Google search. It’s cheaper to buy the default search box everywhere in the world than it is to make a product so good that, even if we tried another search engine, we’d still prefer Google.

    It’s been easily 15 years since I thought Google search was good.

    • foggy@lemmy.world · +21/-1 · edited · 9 months ago

      It was not long after the SSL thing that it became actively garbage. That was what, 2018?

      But yeah, it’s been bad since at least 2012.

        • foggy@lemmy.world · +28/-1 · 9 months ago

          Google stopped indexing all websites without SSL certificates in July 2018.

          For example, darklyrics.com is a website I and many others grew up using as a resource for understanding lyrics. They’ve stubbornly not gotten an SSL certificate because they transact zero data beyond band-name searches. However, without an SSL certificate, they do not show up in Google search results.

          This is one of literally millions of examples. Some are more reasonable than others, but it was still a massive blow to the efficacy of their search.

          • SkyNTP@lemmy.ml · +35 · 9 months ago

            They’ve stubbornly not gotten an SSL certificate because they transact zero data beyond band-name searches.

            Even if sites do not store user account data, such as passwords, ALL websites, and I mean ALL, handle user data, because merely accessing pages (URLs) is user data.

            Stubbornness is not a good reason not to set up SSL. Encryption should always be on, all the time, for everything.

            • Bogasse@lemmy.ml · +14 · edited · 9 months ago

              And it’s not only about user data; it would also expose the website to content spoofing on public Wi-Fi, which would, for example, allow an attacker to inject phishing content into the website.

              SSL encrypts the data you’re sending, but it also ensures that you’re communicating only with who you think you are. Without SSL you can’t be confident of either.
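As an aside, both of those guarantees are visible in how a standard TLS client is configured. This is just an illustrative sketch using Python's standard library defaults, not anything specific to the sites discussed above:

```python
import ssl

# Python's default TLS client settings encode both guarantees:
# the peer's certificate must chain to a trusted CA (confidentiality
# is pointless if you encrypt to an impostor), and the certificate's
# hostname must match the one we intended to contact (identity).
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: certificate is validated
print(ctx.check_hostname)                    # True: hostname must match the cert
```

Disabling either check (as plain HTTP effectively does for both) is what opens the door to the spoofing scenario described above.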

              • pixxelkick@lemmy.world · +1 · 9 months ago

                If a website has literally no login system, there’s nothing to phish.

                There is honestly no reason to use SSL on a static website that has no login system and just displays some content.

                I.e., a static blog or the like, where the only content on the website is just “look at this stuff, okay, thank you!”

                • Bogasse@lemmy.ml · +1 · 9 months ago

                  That’s still my point: for example, you could inject your own login system (“create an account to keep track of your favorite artists”, or some shiny new feature). From there you can get people’s personal information, potentially a password they reuse on other services.

                  A URL is something the general public will trust; if the content can be messed with, you can repurpose the website’s reputation. I took phishing as an example, but even my not-so-creative, non-expert brain can think of other uses: asking for donations, propaganda, advertising, censorship, …

          • AnActOfCreation@programming.dev · +17/-1 · 9 months ago

            Hmm, I hate Google as much as the next guy and am actively trying to de-Google myself, but I’m not sure I can get behind the outrage here. Certificates are free and easy to obtain with Let’s Encrypt, so there’s really no excuse for sites not to support encrypted traffic these days. I’m sure Google does lots of things to delist the small guys and promote their big payers, but I don’t think this is one of them.

            • foggy@lemmy.world · +1/-11 · 9 months ago

              Free certificates expose your subdomains. It’s not more secure if you don’t transact data in a meaningful way, as in the example I provided.
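For context: the exposure comes from Certificate Transparency (CT) logs, which publicly record every certificate a CA issues, free or paid. Anyone can search those logs, e.g. via the public crt.sh service. A minimal sketch (the `%.domain` wildcard query format is an assumption based on crt.sh's search interface):

```python
from urllib.parse import urlencode

def crtsh_query_url(domain: str) -> str:
    """Build a crt.sh search URL; '%.domain' matches every logged subdomain."""
    return "https://crt.sh/?" + urlencode({"q": f"%.{domain}", "output": "json"})

print(crtsh_query_url("example.com"))
# → https://crt.sh/?q=%25.example.com&output=json
```

One common mitigation is a single wildcard certificate (`*.mydomain.com`), which keeps individual subdomain names out of the logs while still being free via Let’s Encrypt.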

              I don’t mean to insinuate that the example I provided is the majority of cases. In the majority of cases, I do support sites with SSL being indexed higher than sites without it. But the “this website is not secure” interstitial, with the required “Advanced” click followed by the “Continue anyway” click…

              Idk

              Especially in 2018. When we look at it from today’s perspective, it’s very easy to agree. And I do agree. But in 2018 it was not this way. Anyone who was a web developer with a bunch of clients, such as myself, was all of a sudden in a very interesting hot seat. Not only did I need to try to upsell my clients, but I needed to convince them that not doing so was quite literally at their peril. It was difficult, and in certain cases it was impossible.

              • AnActOfCreation@programming.dev · +7/-1 · 9 months ago

                If your subdomains being public is a security issue, then I’d argue something else is wrong. Otherwise you’re relying on security through obscurity.

                But I appreciate the insight and I see how this was a harder sell back when it happened. Thanks!

                • foggy@lemmy.world · +1/-3 · 9 months ago

                  Not necessarily. Let’s say you’re a known contributor to a closed-source project. You don’t want people knowing you have a locally hosted GitLab instance at gitlab.mydomain.com, for example.

                    • ReveredOxygen@sh.itjust.works · +4/-1 · 9 months ago

                      If that’s the case, you shouldn’t have one on your domain. If someone wants to know your subdomains, they can still brute-force them.
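To make that concrete, here's a minimal sketch of what such a brute force looks like: try resolving common names under the domain and keep whatever answers. The domain and wordlist are placeholders; real tools use wordlists of thousands of entries.

```python
import socket

def enumerate_subdomains(domain: str, wordlist: list[str]) -> list[str]:
    """Try to resolve each candidate subdomain; return the ones that exist."""
    found = []
    for label in wordlist:
        host = f"{label}.{domain}"
        try:
            socket.gethostbyname(host)  # raises if the name doesn't resolve
            found.append(host)
        except OSError:
            pass  # NXDOMAIN or lookup failure: subdomain not found
    return found

# '.invalid' is reserved (RFC 2606), so nothing here should resolve
print(enumerate_subdomains("example.invalid", ["www", "mail", "gitlab"]))
```

The point being: hiding a hostname from certificate logs doesn't keep it secret from anyone willing to guess.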

              • unautrenom@jlai.lu · +3 · 9 months ago

                Expose your subdomains as in having all of them bundled into one certificate?

                AFAIK, you absolutely can request different certs for each subdomain (in fact, that’s what I’ve been doing for a while).

              • Bogasse@lemmy.ml · +1 · edited · 9 months ago

                While I agree the issue you raise does make sense in some situations, it deviates from the initial concern: if you don’t want your domain listed in a DNS record, you certainly don’t want it indexed by a search engine :p