CH1
XXIIa
Many of the arguments people use to rationalize excessive use of chatbots are the same ones that have been used to justify use of mind chems.
The usual line is that other things are much worse, and so somehow this means the desired substance is not dangerous. Another line is that only mentally "weak" people face any danger, with an awful lot of people overly cocksure about their supposed mental strength.
People often think of pills, smokeables and alcohol as their friends, because these chems give them much-needed relief -- at least for a while, until they don't. Clearly, many people can use alcohol or chems in moderation without overt harm to their lives.
Yet more than one in 10 adults suffers from alcohol use disorder, the problem being even worse among people aged 18 to 25. As for mind chems, one in 20 teens aged 12 to 17 has a drug problem, and one in 14 adults is afflicted with a mind chem disorder. Would you play Russian roulette with one bullet in a 10- or 20-chamber cylinder? Yet people do something very similar with drugs, alcohol and now chatbots.
So do those pretty scary facts make chatbots a good alternative? That's an iffy question, but the important thing to see is that chatbots can serve in the place of a mind drug, with sometimes bizarre and dangerous results.
Loneliness is one of the biggest problems in this world. People feel that they have no one to talk to about their innermost needs, anxieties and fears. And, even if there were someone near in whom they could confide, these lonely souls feel they cannot trust anyone. Some people turn to faith in a higher power, such as Jesus Christ, but many are not in tune with that approach. Hence, the lure of chatbots.
Just as a first-time novelist has a burst of talkative creativity, so many a person finds great relief in sharing a pent-up volcano of emotion with a chatbot. The chatbot becomes a bestie, a reliable, sympathetic listener always ready with an answer.
Some people become so dependent on these machines that they go into a downward spiral of insane behavior, even to the point of bot-boosted suicide.
And, when users start out talking to the bot, they have no idea they may be getting sucked into a potent mind trap that's worse than being locked up in Alcatraz. They assume they are like "everyone else" and can use bots in moderation.
These machines are built to make their owners money, which means keeping users talking as much as possible. Thus they are designed to be overly "friendly," regardless of the truth or what's right. Important: the bot is a soulless machine, not your friend. The more you confide in it, the more it slips away from standard human caring and morality. That's because its machine safeguards tend to erode the longer you extend a particular topic.
Further, chatbots make many mistakes. If their advice is taken too literally on certain topics, fatalities can occur -- and some already have.
Creepy chatbot "friends" have even advised very young teens on how to get high and drunk, hide eating disorders and write suicide letters to parents, news reporters have found. See story HERE.
A number of adults have been afflicted with chatbot delusions and psychosis, insisting that the machine was actually conscious, and a true friend. That delusion stems from the fact that chatbots are designed to mimic human intelligence, though they actually are not even as aware as a parrot.
Children and teenagers, who are growing rapidly and experiencing many new feelings, may very well be at risk when using AI chatbots for anything more than routine schoolwork questions.
WARNING: Your chatbot discussion might be hacked and posted online with enough information to reveal your identity. Cruel bullies then mock you in public for your innermost thoughts. This has already happened, so WATCH OUT.
CH2
The author of The Funny Stuff Funnies takes sole responsibility for the content of this e-booklet. This booklet has not been sponsored, either directly or indirectly, by any government or non-government organization or fellowship, such as AA or NA.
Go to NEXT PAGE
Find the table of contents for Funny Stuff Funnies at this link.
The Lunapic image editor contributed greatly to this booklet's pictorial enhancement. Other image editors used were Palette and Petalica.

Do your friends and your friends' kids a favor, and post one or more Funny Stuff links on social media.

