Microsoft’s LinkedIn will update its User Agreement next month with a warning that it may show users generative AI content that’s inaccurate or misleading.
LinkedIn thus takes after its parent, which recently revised its Service Agreement to make clear that its Assistive AI should not be relied upon.
LinkedIn, however, has taken its denial of responsibility a step further: it will hold users responsible for sharing any policy-violating misinformation created by its own AI tools.
The relevant passage, which takes effect on November 20, 2024, reads:
Generative AI Features: By using the Services, you may interact with features we offer that automate content generation for you. The content that is generated might be inaccurate, incomplete, delayed, misleading or not suitable for your purposes. Please review and edit such content before sharing with others. Like all content you share on our Services, you are responsible for ensuring it complies with our Professional Community Policies, including not sharing misleading information.
In short, LinkedIn will provide features that can produce automated content, but that content may be inaccurate. Users are expected to review and correct false information before sharing it, because LinkedIn won’t be held responsible for any consequences.
The platform’s Professional Community Policies direct users to “share information that is real and authentic” – a standard to which LinkedIn is not holding its own tools.
I mean, like everyone else, I don’t read people’s LinkedIn content. I just use it to keep in touch with people I used to work with and as an online resume.
Yeah, who is actively participating on LinkedIn? Especially to the point where this is an issue?
I used it for a while to find a job recently. It’s all recruiters contacting you who have no idea what your skill set is, so they just end up wasting your time.
I’m always getting recruiters telling me that I’m a perfect match for some software development, 3D artist, or AI engineer job. I’ve never worked in those industries in my life; I work in cybersecurity, and the skill sets aren’t really transferable.
But they just see “complicated computer stuff” and assume that all complicated computer stuff they don’t understand is interchangeable with all other complicated computer stuff they don’t understand.
It would be like asking a structural engineer to become an architect. On the surface that makes sense, but once you spend 4 seconds thinking about it you realize it doesn’t work.
Also for some bizarre reason they always seem to be in Dubai
This is where AI was always headed.
Honestly, the AI information might be better than most of the dog shit insights people post on that platform.
I had to create an email filter to stop getting emails about that dumb shit.
Socials and the Internet in general would be a much better place if people stopped believing and blindly resharing everything they read, AI-generated or not.
Expect more of this stuff in the future. What’s the point of generating thousands of articles if you have to spend thousands of man-hours vetting and proofing the damned things? Is this even considered ‘work’ anymore?
Later: "Why is nobody using our software anymore?? ;_; "
Has anyone seen the comments on popular stories on LinkedIn lately? The site is overrun by AI scammers.
LinkedIn is fucking impossible to read. I don’t know anyone who actually does anything other than update their resume.
I would like to join politicians and corporations in divorcing the conventional relationship between my actions and their consequences.
Where do I sign up?
do people share shit on linkedin when it isn’t part of their job to share shit on linkedin?
It’s weirdly used as a normal social media platform by a ton of people I’ve worked with over the years. I have no idea why, tbh, but they’re out there.
LOL TIL
whatever. it’s stupid and it sucks balls, but it’s better than instatwitsnapbooktok
It really isn’t. Lots of weirdos came out during the pandemic. It’s pretty much a cringier Facebook now. The only difference being you have to be on FB for some neighbourhood groups and you have to be on LI for your unimportant job at a multi-billion dollar company.
sorry, i didn’t mean to give the impression that i give a rat’s ass whether or which social medias are better than others
Last time I checked into that it was all worthless circle jerking
I stopped using LinkedIn because it totally turned into Facebook. Everyone is just posting memes, motivation quotes or soccer.
I still don’t know why anyone USES linkedin. It was a shit company built by hacking Windows and sending out emails in other people’s names to try to build their user base. The fact that Microsoft actually bought the company that hacked their operating system just shows how little moral value is present in any of this.
i use it to msg people i used to msg on facebook before i deleted it
I think their mobile apps were in on the contact snooping too; it wasn’t just Windows.
Wonderful, and yet I’m not surprised…
Seems sensible. Check the output of AI tools before posting. You’d be pretty stupid not to proofread it at a minimum.
Wow Microsoft is still a dick… which is totally on brand.
Lol fool me once
“AI is bullshit, and you’re dumb for using it” is what they are saying. It’s amazing.
“AI can bullshit, and you’re dumb if you don’t verify it.”
I’m always surprised at the number of people who expect an algorithm, built by rawdogging literally half the internet, to be an arbiter of truth.
I think the massive push for it by every single company gives the layman the impression that “everyone uses it, so it must be good”, combined with most people simply not caring enough to think too hard about it.
Kind of an aside, but I’m really hoping for a technology plateau of some sort, so that people actually have a chance to look at everything and ditch all the crap.
And then another period of growth from there.
If companies don’t trust their own AI on their own sites, then they’re pushing a shitty unvetted algorithm and hiding behind the word “AI” to avoid accountability for their own software bugs. If we want AI to be anything other than trash, companies need to be held accountable just like with any other software they produce.