I do not agree with this at all. Some of the smartest people I know have severe dyslexia. And those are not just extremes, all of us exist on a spectrum where we have strengths and weaknesses, and not all of us can be literary geniuses.
The fact that capitalism promotes mediocre bootlickers to positions of power has nothing to do with LLMs as a technology. Of course it will be exploited by these exact same people - all the more reason why we shouldn’t give them a monopoly on what’s genuinely a transformative technology.
There’s a world of difference between having lesser skills or ability and offloading it all to a machine so you don’t have to be bothered. Namely, effort.
It’s not the fact that people can’t write well that bothers me, it’s that people barely care to even try to write passably anymore. We’re going backwards, not forwards. The ability to communicate concisely is not some niche skill. It’s a fundamental part of being a social animal. And people are leaning more and more on machines to do it for them.
At what point does the language start influencing the thought?
Well, I think to most of us, language is extremely closely tied to our actual thoughts. So verbal expression is at the very least part of the thinking process.
I don’t know, maybe I’m just not faced with the abuses of LLMs the way you are? I don’t regularly encounter people who clearly skipped the effort and just let an LLM do the thinking for them. (It happens, and it’s problematic, but in my experience it’s rare.) And it’s possible that’s just because my bubble hasn’t caught up yet.
Working in IT, what I’ve seen so far has been terrifying enough on a technical level, but the effect on the way people think is so, so much worse.
It’s like the joke people make about how, before smartphones, you could rattle off a dozen phone numbers by heart, but now you can’t even remember your immediate family’s? You’ve offloaded that part of your brain to the machine. So have I, almost everyone has. And when you’re without your phone for whatever reason and need to get a hold of someone, you’re boned outside of maybe one or two people.
But what happens as more and more of these tasks get reduced to queries and the thinking part starts to atrophy? As we offload more and more to the machine. Like why even read at all if you can just have the machine read it for you while you listen in your AirPods? And what happens when you eventually can’t even verify whether what the voice in your ear is saying is correct and not just a digital hallucination?
Anyways, not trying to be argumentative. It’s just that, through the lens of what I experience day to day, it’s extremely concerning how quickly people are losing their ability to do things without leaning on AI, and more importantly, how quickly they’re forgetting how to do things without it.
I think you’re right. It’s a bit of a dance with the devil as far as your own abilities are concerned. If we could have exoskeletons that would make us 40x stronger, would our bodies atrophy in the same way, and would we accept it?
And yet, I wouldn’t argue against the objective utility of an exoskeleton.