

It never was. Hallucinations are in no way unique to LLMs.
A contrarian isn’t one who always objects - that’s a conformist of a different sort. A contrarian reasons independently, from the ground up, and resists pressure to conform.
I think you might be right that the term incel has gone through some concept creep over time. What I’d call “classical inceldom” definitely had a fatalistic core - people who believed that nothing they did could change their circumstances. In those spaces, self-improvement wasn’t just seen as pointless, it was actively discouraged. There’s a strong crabs-in-a-bucket mentality, where even small expressions of hope - like saying a waitress smiled at you - are treated as betrayal. That kind of remark gets torn down because it suggests there is hope, and hope runs against the entire premise of the community.
So while I don’t necessarily disagree with how you’re framing things, I think it’s important we clarify what version of incel we’re each talking about. Otherwise, it’s easy to talk past one another while thinking we’re arguing about the same thing.
I see at least three different ways people use the term incel, and mixing them up leads to a lot of noise in these discussions. First, there’s the literal definition - someone who is involuntarily celibate but doesn’t necessarily hold resentment or misogynistic views. They might even be actively trying to improve their situation, through social development, fitness, or other personal changes.
Then there’s what I’d call the ideological definition - this is closer to the original online “incel community” form: people who believe they’re permanently locked out of dating and sex due to genetics or physical traits, and who often adopt a fatalistic worldview and resentment towards women because of it. That group tends to see looksmaxxing as a waste of time or a cope, because they’ve already written themselves off as hopeless. That was the version I was referring to in my comment.
And then there’s the slur usage, which is probably the most common in everyday discourse now: “incel” as a catch-all insult aimed at men with unpopular, toxic, or misogynistic views, whether or not they’re actually celibate or share any ideological connection to the incel community. This version often gets applied to manosphere types, Andrew Tate fans, or anyone viewed as a reactionary online. But here’s the irony - many of those guys despise incels and distance themselves from that label. Likewise, I’d say most ideological incels don’t align with Tate’s worldview either. Tate’s core message is about self-improvement to gain status and sexual access, while incels - at least in their blackpilled form - tend to reject the idea that improvement is even possible.
So yeah, I agree there’s a lot of grift and posturing in these online spaces, but we’re not talking about a single, coherent group. That was the point of my original comment: looksmaxxing isn’t inherently tied to incel ideology. In fact, it contradicts the most fatalistic version of it. Conflating them flattens out the distinctions between self-improvement, toxic ideology, and hopelessness - and I think that matters if we want to criticize these cultures without just throwing buzzwords around.
How does one promote involuntary celibacy? And what on earth does this have to do with looksmaxxing? Those are entirely different things for entirely different audiences. The core idea of “incel ideology” is that there’s nothing you can do about it.
Isn’t human intelligence exactly what most people mean by “general intelligence”? It becomes ASI (Artificial Superintelligence) once its capabilities surpass those of humans - which I’d argue would happen almost immediately.
I 100% agree with the first point, but I’d make a slight correction to the second: it’s debatable whether an LLM can truly use what we call “logic,” but it’s undeniable that its output is far more logical than that of not only the average Lemmy user, but the vast majority of social media users in general.
A widely accepted definition among philosophers and scientists is “the fact of felt experience,” which is basically how Thomas Nagel defined it in his essay “What Is It Like to Be a Bat?”
“An organism has conscious mental states if and only if there is something that it is like to be that organism - something it is like for the organism.”
First, one needs to define consciousness. What I mean by it is the fact that it feels like something to exist from a subjective perspective - that there are qualia to experience.
So what I hear you asking is whether it’s conceivable that it could feel like something to be an AI system. Personally, I don’t see why not - unless consciousness is substrate-dependent, meaning there’s something inherently special about biological “wetware,” i.e. brains, that can’t be replicated in silicon. I don’t think that’s the case, since both are made of matter. I highly doubt there’s consciousness in our current systems, but at some point, there very likely will be - though we’ll probably start treating them as conscious beings before they actually are.
As for the idea of “emulated consciousness,” that doesn’t make much sense to me. Emulated consciousness is real consciousness. It’s kind of like bravery - you can’t fake it. Acting brave despite being scared is bravery.
I’ve had no issues with my adblocker for over half a year. It works flawlessly in both my desktop and mobile browsers.