I was trying to write a reply to a Google Maps review.
It was from a pissed-off customer who had left a one-star rating.
I thought I’d write this response myself, unlike the others I’d done with Gemini.
I clicked on the text box and stared at it for a while before typing the first word.
It barely grew into a sentence before my mind went blank.
I thought “Has writing always been this hard for me?”
A few weeks later, while helping my mom clean the house, I found a mug among the clutter.
A blue one with a steel interior, inscribed with “Tata Group, Essay Writing Winner - School Level”.
It really made me think: I wasn’t always this horrible at writing. In fact, I was pretty damn good, and I enjoyed the time I spent writing.
So what happened since then? Was it not writing enough?
Lately I’ve been fully dependent on AI for writing. Emails, descriptions, anything long or short, formal or casual: I don’t even bother to do it myself anymore. I’ve been using it extensively since the first GPT model.
At first, it felt like a superpower.
Even though English is not my first language, I could sound native and professional every time. It also boosted my confidence.
But now that superpower has revealed itself as kryptonite to my brain’s ability to think on its own.
Recently, Sam Altman shared this in a podcast:
“We see a future where intelligence is a utility, like electricity or water, and people buy it from us on a meter.”
So imagine the upcoming generation: the ones born into a norm where this external brain is a necessity.
I’m wondering what that world would look like.
Maybe in the next few decades, I see a Terminator 2 plot coming.
I know it sounds ridiculous, but I’m not joking.
I’m not here to tell you what to do. But I think this is a conversation we need to have, loudly, collectively and soon. Not just among individuals, but inside classrooms, companies, institutions and governments.
Because the question isn’t whether AI gets smarter. For sure it will.
The real question is whether we allow our species to become slower. Whether we let millions of years of evolution devolve into a dumb piece of meat.
Maybe the most human thing we can do right now is to stay uncomfortable.
To write the difficult email yourself sometimes. To sit with a problem before you prompt it away.
And to demand that our schools teach children how to think rather than what to think, with or without a machine.
Sam Altman wants to sell us intelligence on a meter. The only leverage we have is to make sure we never completely run out of our own.
Check out this post to read more of my opinions on Cognitive Atrophy & AI.