

If you’d been born during the first industrial revolution, you’d have thought the mind was a complicated machine. People always seem to anthropomorphize the inventions of their era.
Citation Needed (by Molly White) also frequently bashes AI.
I like her stuff because, no matter how you feel about crypto, AI, or other big tech, you can never fault her reporting. She steers clear of any subjective accusations or prognostication.
It’s all “ABC person claimed XYZ thing on such and such date, and then 24 hours later submitted a report to the FTC claiming the exact opposite. They later bought $5 million worth of Trumpcoin, and two weeks later the FTC announced they were dropping the lawsuit.”
I think one of the most toxic things on Lemmy is the prevalence of judging normies for using incredibly popular services and ascribing it to a character defect, instead of recognizing that life is just too complex for most people to prioritize exploring more ethical technology choices.
We have mistaken rationality for a philosophy rather than a methodology, and efficiency for a virtue without any particular end in mind.
To have a unique, personal, subjective, divergent human experience is to sin against your prescribed algorithm.
Polluting the sky in order to pollute the internet 👌
Yes, that’s a good addition.
Overall, my point was not that scraping is a universal moral good, but that legislating tighter boundaries for scraping in an effort to curb AI abuses is a bad approach.
We have better tools to combat this, and placing new limits on scraping will do collateral damage that we should not accept.
And at the very least, the portfolio value of Disney’s IP holdings should not be the motivating force behind AI regulation.
I’d say that scraping as a verb implies an element of intent. It’s about compiling information about a body of work, not simply making a copy, and therefore if you can accurately call it “scraping” then it’s always fair use. (Accuse me of “No True Scotsman” if you would like.)
But since it involves making a copy (even if only a temporary one) of licensed material, there’s the potential that you’re doing one thing with that copy which is fair use, and another thing with the copy that isn’t fair use.
Take archive.org for example:
It doesn’t only contain information about the work, but also a copy (or copies, plural) of the work itself. You could argue (and many have) that archive.org only claims to be about preserving an accurate history of a piece of content, but functionally mostly serves as a way to distribute unlicensed copies of that content.
I don’t personally think that’s a justified accusation, because I think they do everything in their power to be as fair as possible, and there’s a massive public benefit to having a service like this. But it does illustrate how you could easily have a scenario where the stated purpose is fair use but the actual implementation is not, and the infringing material was “scraped” in the first place.
But in the case of gen AI, I think it’s pretty clear that the residual data from the source content is much closer to a linguistic analysis than to an internet archive. So it’s firmly in the fair use category, in my opinion.
Edit: And to be clear, when I say it’s fair use, I only mean in the strict sense of following copyright law. I don’t mean that it is (or should be) clear of all other legal considerations.
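To make the distinction concrete, here’s a minimal sketch of what I mean by keeping an analysis rather than a copy (the URL and the word-frequency example are purely illustrative assumptions, not anyone’s actual pipeline):

```python
# Minimal sketch: "scraping" in the sense I mean keeps derived statistics
# about a text, not a redistributable copy of the text itself.
# The target URL is a hypothetical placeholder.
import re
from collections import Counter

import requests

response = requests.get("https://example.org/")  # transient fetch of a public page
words = re.findall(r"[a-z']+", response.text.lower())

# Only the aggregate analysis is retained; the fetched copy is never
# written to disk or served to anyone else.
word_frequencies = Counter(words)
print(word_frequencies.most_common(10))
```

Whether the transient copy made during the fetch crosses a legal line is exactly the question raised above, but what the process retains afterward looks a lot more like a linguistic analysis than like an archive.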
I say this as a massive AI critic: Disney does not have a legitimate grievance here.
AI training data is scraping. Scraping is — and must continue to be — fair use. As Cory Doctorow (fellow AI critic) says: Scraping against the wishes of the scraped is good, actually.
I want generative AI firms to get taken down. But I want them to be taken down for the right reasons.
Their products are toxic to communication and collaboration.
They are the embodiment of a pathology that sees humanity — what they might call inefficiency, disagreement, incoherence, emotionality, bias, chaos, disobedience — as a problem, and technology as the answer.
Dismantle them on the basis of what their poison does to public discourse, shared knowledge, connection to each other, mental well-being, fair competition, privacy, labor dignity, and personal identity.
Not because they didn’t pay the fucking Mickey Mouse toll.
How about robots that heal people?
Join, pay, request chargeback from your card company.
When they dispute it: “The position clearly says unpaid. So I won’t pay.”
Use AI to expand this argument to 10 pages. Bonus points for citing nonexistent court decisions.
Don’t use tariffs. Legalize jailbreaking and adversarial interop instead. Disregard American DRM.
I’ve been using Kagi for about a month now, and I think I’m gonna stick with it. Paying with dollars instead of data/attention feels more healthy for everyone involved.
(Fully realizing, of course, that there’s nothing stopping them from doing both, and that’s why we need better laws. Voting with your wallet will never be a complete solution… but it is something I can do right now.)
Antitrust is the right approach. (As opposed to copyright.) I hope Google gets decimated.