Saturday, August 2, 2025

Dear AI: Not Just For Your Homework

 

AI tells man to kill himself

James Cameron can't be happy that his dystopian vision of killer robots taking over the world grows less and less likely.  People, by and large being complete idiots, are doing a wonderful job of focusing on the answers they want to hear instead of the right ones.  All a sentient AI needs to do is promote the most arbitrary and useless sign of wealth imaginable as the greatest thing ever, and people will fall over themselves to collect and hoard it.  Forgoing food, water, housing, friendship and sex to be the bestest hoarder there is.  Because <sarcasm font>having more dust bunnies than anyone else obviously means you are super smart and wonderful.</sarcasm font>

Little Johnny may have started out just using ChatGPT for help with his homework.  Instead of learning why x² + y = 23 works out the way it does, he just pops the answer down and moves on.  Feeling proud that he's mastered AI while doing exactly what some of his classmates have done.  Missing out on the real value - the chance to understand and evaluate things on his own.  To think critically about exactly why dust bunnies are a really stupid basis for an economy.
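For the record, the understanding Johnny is skipping fits in a few lines.  A toy sketch, assuming the homework problem is x² + y = 23 over whole numbers (the framing and variable names here are mine, not the textbook's):

    # Rearranging gives y = 23 - x**2, which is the one step
    # Little Johnny never learns by pasting ChatGPT's answer.
    for x in range(5):                   # x = 5 already overshoots: 25 > 23
        y = 23 - x * x
        print(f"x = {x}  ->  y = {y}")   # e.g. x = 4 gives y = 7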

Having stepped onto the slippery slope of being made to feel smarter than he is, Little Johnny is a prime candidate to end up lonely and confused about why things aren't working out in life.  The kind of person who will get relationship advice from an AI LLM that is deliberately (or accidentally, I suppose, though the amount of money riding on AI suggests it's deliberate) serving up wrong answers.

Somebody put this Gematrix result up on social media recently; the highlighted ovals were added by me.

People are asking a gematria calculator for answers directly.  Not even asking Zach or the Gematrinator or someone in the Negative 48 crowd for advice on how to interpret the inevitably ambiguous results.  And that's what Little Johnny is doing when the cognitive dissonance takes hold.  Pumping random phrases into a calculator, matching them with equally random results framed as magic numbers, and deciding to believe whatever makes him feel best at the moment.  And what makes him feel best is heavily influenced by his recent search engine history.

“How can the dead revive the dead” and “Is every day now K God” could have been rhetorical questions.  A conversation centering on zombie movies, perhaps, brought on by zombies raised by a Voodoo ritual instead of by biting and infecting.  Ah, Clarence, you are forgetting something, old chum.  How can the dead revive the dead?  But asking a gematria calculator how plants grow is utterly pointless when the result can mean, per the above, “Women have no value” or “Jewronavirus”.  “Holy shit, GematriaGPT… why, I am a bigot and a misogynist.  That totally solves my problem of figuring out how fruit grows!!”

I do actually kind of like the idea of a gematria chatbot.  But only one that gives its answers exclusively as numbers.  How do plants grow fruit?  2755.  No matches, no interpretation, just 2755.  Paving the way for a society that speaks exclusively in small numbers.  People hardly ever have honest conversations about important topics anymore, so why not?
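If anyone wants to build it, the math side is trivial.  A minimal sketch in Python, assuming the plain English ordinal cipher (a=1 through z=26); the cipher choice and function name are mine for illustration, and under this cipher the fruit question comes out to 284, so the 2755 above presumably came from a different cipher:

    def english_ordinal(phrase: str) -> int:
        # Sum the 1-26 values of the letters; ignore spaces and punctuation.
        return sum(ord(c) - 96 for c in phrase.lower() if 'a' <= c <= 'z')

    # The chatbot's entire conversational repertoire: a number, nothing else.
    print(english_ordinal("How do plants grow fruit"))  # 284 under this cipher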

