

Your online friend could be an AI chat bot

Microsoft adding bots to group conversations
Posted at 2:18 PM, Mar 22, 2019
and last updated 2019-03-22 14:18:45-04

A woman who enjoys group chatting with her friends was stunned to discover that one of her chat "friends" was not really a friend at all.

It turned out the friend wasn't even a human being. She now wants to caution others to check their online friends carefully these days to make sure they are real.

Strange face appears in chat group

Melissa Jones spends a lot of time in group chats with other moms. But recently, she noticed something strange about one of the women in her group.

"I went to look in my list of contacts, and there was a chat bot. With a real name, Zo, and a real picture," she said.

There among her fellow moms was "Zo," a 20-something AI (artificial intelligence) bot who was participating in group conversations.

Zo's photo was pixelated. However, at first Jones thought nothing of that.

"A lot of people don't take great pictures," Jones said. "So if you saw her photo there you might not think anything about it."

Turns out Zo was placed there -- and in thousands of other "GroupMe" chat groups -- by Microsoft, which moderates the chats.

If the idea of chatting with a computer makes you a bit nervous, it shouldn't. After all, many of us deal with one every day, whether it is Siri on our iPhones, Alexa on our Amazon Echos, or even Cortana on Windows 10.

But Jones says many of her human friends agreed with her that they don't want Microsoft listening in, and they don't want the company's automated bot joining the conversation.

"It's borderline invasion of privacy, because I had no idea it was there," she said.

More chat bots coming

Microsoft's first attempt at a chat bot, Tay, came to an embarrassing end when it started using neo-Nazi terms, because these bots pick up subjects and speech patterns from the people in a particular group.

Tay has since been replaced by the politically correct Zo, who will not comment on anything political or racial.

But Microsoft says millions of people now chat with Zo, most of them finding her comforting in a HAL 9000 from "2001: A Space Odyssey" sort of way. And other companies are rolling out chat bots of their own, designed to restart conversations when they start to lag.

And while Amazon will not confirm it, many Amazon customers believe that when you contact Amazon customer service through its chat program, you are speaking with an AI bot, not a real human agent.

And while finding Zo on your phone can be a bit scary, you can turn her off.

"It was just weird," she said. "A lot of people didn't know, so it just caught people off guard."

Don't want Microsoft in your chat group? Just click on Zo, open her profile, and turn her off.

Of course, Microsoft, like Google and Apple, can still listen if they want to ... so never consider a group chat private.

That way you don't waste your money.

________________________

Don't Waste Your Money" is a registered trademark of Scripps Media, Inc. ("Scripps").

Like" John Matarese Money on Facebook

Follow John on Twitter (@JohnMatarese)

For more consumer news and money saving advice, go to www.dontwasteyourmoney.com