Somehow I feel sad for this AI model. All the others are trained on authentic content, and this boy gets socialised on the most shallow content imaginable. Poor, socially awkward AI.
LinkedIn is a business card / CV storage site, where you can find a job.<p>If it were just a bunch of linked profiles with a job matching function, it would still be LinkedIn.<p>But of course, you can't work at a place that does something that mundane without suggesting something that makes you look like Facebook or Twitter. You have to at least give people some sort of reason to see what their old colleagues are up to.<p>Nobody really wants to read the LinkedIn feed, so it's perfectly acceptable that it gets flooded with AI generated content. In effect, the content on LinkedIn is that picture of a happy family on your insurance brochure. You can't not have a photo of something on that kind of marketing document, and you can't be a social network without some sort of doom-scrollable content.<p>This is just a cheap way to generate some wallpaper.
<i>> I recommend opting out now</i><p>Little point. It'll be like Facebook's opt-out and only cover things you post/update going forward. Everything you've already posted has already been slurped into the training set, and it won't be taken out and the model(s) retrained.<p>The only way to show disapproval of this sort of behaviour that they'll feel is to stop using services that use auto-opt-in for anything, and not enough people are likely to do that for it to be effective.
I wouldn’t have ever thought that LinkedIn feed content was written by real people if I hadn’t met some of them in real life.<p>It’s a low enough bar that I think AI content will fit right in.
I can get to that setting (when logged in) at <a href="https://www.linkedin.com/mypreferences/d/settings/data-for-ai-improvement" rel="nofollow">https://www.linkedin.com/mypreferences/d/settings/data-for-a...</a>
I would suggest that if LinkedIn is training their AI models on user data and content, users should get a copy of said model free of charge.<p>That, or LinkedIn should at least be compelled to ask explicit permission for model training. None of this Darth Vader stuff where they "altered the deal".
It really pisses me off that privacy-invading features (and push notifications) are enabled by default. I've gone in and disabled virtually everything; it's reasonable to deduce that if I value my privacy enough to do that, then I probably don't want new enabled-by-default things in that category.<p>I'm curious whether LI has scraped data <i>before</i> giving people the opportunity to disable the feature.
I met one of the PMs building this. She was working on natural-language unified search for the feed. I noticed it's gotten way better in the last few weeks. Instead of using Google to search [first name][last name]["linkedin"], now I can reliably type my query into LinkedIn's search bar and get the correct result. I'm a fan.
Someone filled in <a href="https://www.linkedin.com/not-applicable" rel="nofollow">https://www.linkedin.com/not-applicable</a> today on their job app, and I have to admit, that was clever. I don't personally like that we make that field required.
Not only content, but, more importantly, also "personal data".<p>> <i>When this setting is on LinkedIn and its affiliates may use your personal data and content you create on LinkedIn for that purpose.</i><p>I'm guessing that "personal data" means they're building models that give AI-based systems (as one avenue) access to the huge database of personal information entrusted to LinkedIn.<p>And even contemporary LLMs make this much more accessible, for more casual use, by more people.<p>Presumably this sharing of data for training is already happening, and (of course) the new "preference" defaulted to ON, even for people who'd previously opted out of related privacy settings (e.g., "Profile visibility outside LinkedIn" was OFF).<p>A ton of LinkedIn users are private individuals (not public figures). They're only on LinkedIn because they want continued employment, that's where the recruiters are, and many employers and other opportunities (including YC?) require LinkedIn profiles.<p>Given LinkedIn's dominant role, with many citizens required to use LinkedIn for something as basic as employment, meaning people have to share personal information with LinkedIn, maybe it's time for US regulators to set rules on how that information may be used and shared by LinkedIn.
Social media websites could use AI to simply generate posts. I mean, why not? User engagement is all that counts; it doesn't matter by what means and at what moral cost, right?<p>I don't mean fake users (although I wouldn't put it past corporate greed to try faking users). It could be sold as a helpful feature, like summaries of workplace happenings, news, world events, or discussions on the platform in the feeds. Of course, they would need to be filtered for ethical alignment with the social media company, as well as community safety, naturally... Certain political opinions may be less safe than others, and so on...
I pay for LinkedIn Premium, and I was just "opted in".<p>This disregard for your customers is very typical now. Not even a "how do you do" or a popup informing me of this change to my "preferences".
This is so weird, how is this legal? No other type of company just tacks on stuff to agreements and contracts and says “you want this” so how come US tech companies are always getting away with it?
Hi LinkedIn AI, please write some python code for a quick sort.<p>LinkedIn AI: I am proud and humbled to be promoted to the level of senior qsort code writer, and wish to thank my amazing colleagues at LinkedIn HQ for their tremendous support over the last 18 months. It is with great regret that I have moved on from writing bubble sorts. Please click this link to apply to see an industry analysis of quick sort code.
I missed this somehow on the HN front page yesterday, but this morning (US Eastern) it went from front page to buried before California wakes up.<p>> <i>136. LinkedIn is now using everyone's content to train their AI tool (twitter.com/racheltobac) 387 points by lopkeny12ko 17 hours ago | unvote | flag | hide | 221 comments</i>
Here is the best way to opt-out of LinkedIn crap, both of the company and of its users: <a href="https://www.wikihow.com/Delete-a-LinkedIn-Account" rel="nofollow">https://www.wikihow.com/Delete-a-LinkedIn-Account</a>
Please no... trash in, trash out!
Also, you don’t need to train anything, you can generate a very “successful” LinkedIn post easily:<p><a href="https://viralpostgenerator.taplio.com/" rel="nofollow">https://viralpostgenerator.taplio.com/</a>
If you don't want them to train models with the data you give them, don't give them data. They should be able to train whatever they want with it without regulation and there's no reason to request permission.
From one of the tweets in the thread:<p>> LinkedIn seems to have auto enrolled folks in the US, but hearing from folks in the EU that they are not seeing this listed in their settings (likely due to privacy regulations).<p>Honestly, GDPR looks like a godsend! It came just at the right time!
Can’t help but wonder how much of what’s posted to LinkedIn today is already the output of an LLM. So their AI tool will, in the limit, be trained on the output of other AI tools…
I am surprised that people are surprised by this. I assume that any social or professional network is using my data for training and selling ads. And I share things accordingly.
Dear LinkedIn, I don't care about your new shiny AI. Fix your primary features first: the jobs tab doesn't show anything for me at any company, your job search is amateur (I could have implemented it better), and the website and application are always laggy, overheating my iPhone or M1 after a couple of minutes.
You know, for once I'm not even that mad about something like this. Mostly because I literally never do anything on LinkedIn other than check my messages there once a month.<p>I'd love to see the slew of AI-generated garbage, since it'll be <i>completely indistinguishable</i> from regular LI "content"!
There will never be a technological solution to such problems. The only way to fight corporate greed is regulation through strong legislation. Thanks to the GDPR, in the EU (extended to the European Economic Area) and in Switzerland, LinkedIn can't use their users' personal data to train their AI. It is made clear in their FAQ [1]:<p>> <i>Note that we do not currently train content-generating AI models from members located in the EU, EEA, or Switzerland.</i><p>Anyway, the best move is still to just get out of this platform [2]. LinkedIn has a history full of dark patterns and really bad behavior concerning personal data. At some point they even impersonated their users by mailing their contacts (sometimes shadily scraped) in their name, without the impersonated user's consent or knowledge.<p>[1] <a href="https://www.linkedin.com/help/linkedin/answer/a5538339" rel="nofollow">https://www.linkedin.com/help/linkedin/answer/a5538339</a><p>[2] <a href="https://www.wikihow.com/Delete-a-LinkedIn-Account" rel="nofollow">https://www.wikihow.com/Delete-a-LinkedIn-Account</a>
Speaking from my own experience, LinkedIn does not seem to have any more introspective text content than, say, Facebook.<p>On the commercial/influencer side, many have already taken the AI route, using LLMs to help write or spice up their posts. For paid users, the site has even offered to help write your bio or certain types of posts for the past few quarters.<p>Maybe the posts of yesteryear, and the comments sections, really do seem like a "valuable" source to them. It would be a bit scarier if this covered video and photos too, though beyond the headshots there has already been a lot of AI content in the tech space lately.
Again, why is there an expectation for a company to do X and not Y with data you give them for free? They can do pretty much anything they want including not securing it. As nearly every single US company does.
I'm EU based, and I don't see this option under my settings. Maybe it is currently being rolled out only to US users, or hopefully our legal framework around privacy prevents such disgraceful practices.
Looking forward to the AI that is just really excited to tell you it has a new IT certification, or a promotion from mediocre middle-manager position X to mediocre middle-manager position Y.
Opt-out is a powerful design choice, but in this case it's a clear misuse.<p>When everyone agreed to LinkedIn's terms, no one agreed at the time to have their personal data used to train AI.
Prepare to be flooded with trite aphorisms, vacuous top-10 lists, queasiness-inducing personal announcements and 'acceptance' speeches, and other toxic positivity.
Yes, finally a ThoughtLeaderAI. Will a Turing test be able to differentiate between the current CVs of LinkedIn users and the ones generated by ThoughtLeaderAI?
All LinkedIn content already feels AI-generated, the transformer being B and C players regurgitating other people's motivational stories and "humbled" announcement posts, of course with a selfie attached for no reason other than the algorithm.
This is your kindly reminder that if you're not the customer, then you are the product, with an added caveat that, when it comes to social networks, you are always the product.<p>I say that as a happy product, uhh, user of every social network out there.
Y'all will never convince me that the majority of content on LinkedIn hasn't been machine generated for <i>years</i> now. Some of the SEO-optimised corporate-speak drivel on there makes ChatGPT look like Shakespeare.
I mean, if all of the LinkedIn "content" were generated by an AI, no one would really notice, and it will be. It's just an online CV / proposal / interview booking website; the rest is just some funny guy's attempt to make it look like Facebook. It's actually strange to me that they did not attempt to copy Instagram stories, Tinder swipe cards, or any of those once-popular Clubhouse audio rooms. Maybe they still want all of them...