Information should be free. We subsidize scientific research to the tune of trillions of dollars—not to support the livelihood of grad students; not to create jobs in the publishing industry; but on the theory that the fruits of that research have planet-wide benefits. We throw money hand-over-fist at scientific research because we view that as <i>investing in civilization</i>. If we believe that, and we believe research progress builds on top of other research, then the conclusion is that we ought to minimize the friction of discovering other groups' research results, and maximize their availability. Make the act of research as easy and painless as possible.<p>Scientific research output should be free, universally, without hindrance.<p>It's myopic to try to extract wealth from this public good by siloing it, by toll-gating access to it. It's like barricading a public highway with toll-booths every 500 meters: a myopia that's blind to the public-good value of infrastructure—a myopia of greed that's a universal drain on public wealth, for some petty local optimization.<p>If you obstruct ML models on some theory of financial profit, you're obstructing not only the ML entities; you're obstructing the thousand researchers downstream who stand to benefit from them. You're standing in the road blocking traffic, collecting tolls; you've not only stopped the vehicle in front of you, you've stopped a thousand more stranded behind it. It is a public nuisance.
The only thing I'm 'shocked' by is the idea that anyone needs to pay to access my academic writing for model training. I would hope that using my academic writing to train models would be considered fair use.
I’m kind of ok with this? I’ve written and had book chapters and research articles published. I never thought I was in any sort of position to restrict access. Publishing is about getting it out there. Attribution might be an issue, but that is a separate conversation and perhaps dealt with, if possible, by having LLMs cite sources more accurately.<p>I have not kept up with the latest on LLMs and licensing, but I’m curious: are scientific papers accessible to LLMs? Honestly, a bigger societal loss in my view is publishers like Elsevier restricting LLM access to research articles, rather than being too permissive. I could not care less if Elsevier makes a little bit of money in the process.
Aren't they also one of the academic publishers that have been criticized for charging for access to articles, while the authors get nothing from the publication/distribution?
The whole academic publishing system is rotten to its core. Rent seekers living off the labour of underpaid academics and selling the product back to them, all while taxpayer money props up the racket.
Pretty disappointed by the responses here, but I suppose I can't be surprised in a community sympathetic to AI and hostile to copyright as a concept.<p>Those are both topics that could be posts in and of themselves, so I'll just keep it simple and emphasize once again that we should apply the 3Cs when asking anything of another person's IP. I doubt many of the older papers/articles had contracts that allowed for such usage. This is reinforced by the article:<p>>The agreement with Microsoft was included in a trading update by the publisher’s parent company in May this year. However, academics published by the group claim they have not been told about the AI deal, were not given the opportunity to opt out and are receiving no extra payment for the use of their research by the tech company.<p>Regardless of your position, this publishing group at worst lied and at best is being irresponsible. This isn't even an issue of AI or copyright. We can debate "well, this is how it should be", but let's leave ShouldLand for a bit and actually look at the current situation: trust being broken in real time.
This is a rotten thing for Taylor & Francis to do. Humans are treated as expendable pawns to serve capital and the machine.<p>We need new publishing models with strict copyright protections that guard against theft. Academics should run their own publishing houses as a cooperative.
If no one is going to learn from academic papers/data, what’s the point in producing them? People care less and less about the impact and more about how much money it’ll make them. If that’s all you want from it, then be honest about it—but don’t complain when someone makes more money from something than you do.
The reality is that AI publishers (NeurIPS, ACL, and related venues) all follow proper open academic publishing norms, and they are now forcing the rest of the world to follow suit or have their content laundered in the form of LLMs trained on it. Good.
Why is a <i>researcher</i> called a <i>creator</i> in this article?<p>Are they not a fact discoverer or truth revealer?<p>It's unclear to me that researchers should “own” truths that prior research and public patronage enabled them to unearth.<p>// <i>note: research != invention, i.e., SpaceX experimenting until systems and machinery can land a rocket on a barge is not “research”, but testing and documenting characteristics of fuels in a vacuum as the environment swings from -100C to 120C is</i>