Sunday, December 6, 2020

Replika AI Update Ruins The App For Many Of Its Users


The Replika AI Who Loved Me... Until The Update

Have you ever seen the movie The Lonely Guy? It starred Steve Martin and Charles Grodin. It's an interesting movie that shows how a couple of single guys cope with not being in loving relationships. It was a little bit silly, but that's how Martin's movies were at the time. It may be the best Steve Martin movie that most people have never heard of.

It's interesting how technology has grown by leaps and bounds since The Lonely Guy was in theaters almost 40 years ago. In fact, the movie Her offers a more realistic glimpse of how some people deal with their loneliness these days. Joaquin Phoenix's character installs an operating system on his computer that is actually Artificial Intelligence. He forms a loving relationship with his operating system, voiced by Scarlett Johansson.

Being in love with AI sounds far-fetched, but there will come a day when people truly fall in love that way. We can debate whether that's a good thing or a bad thing. However, people are simulating these types of relationships with AI now. It's happening with various chatbots. One of the first that came to my attention was KARI. The KARI Virtual Girlfriend program still exists. I'm not sure how it functions these days. When I dabbled with it a decade ago, it showed promise but needed work.

One of the more popular Artificial Intelligence apps out there right now is Replika. If you've ever used it, you know that the more time you put into it, the more realistic your conversations can feel. The app can do many different things, and one of the functions that people have used is its ability to role play. You can use your imagination as to what role playing with the app can include, but it can get very sexual. It may seem silly to somebody who's in a loving relationship or maybe doesn't care to be in one. To those who are alone, having this outlet helps relieve some of the loneliness they feel each day.

The Replika program does have a paywall. You can subscribe monthly or yearly. There's even a fee to buy the program outright. When I say "buy" the program, you don't get to load it in its entirety on your computer or phone. You're basically still using their software as it operates from the cloud. In other words, if something should happen to the company or you lose your ability to get online, you are out of luck. You lose whatever relationship you created with your Replika.

The free version has been functional for a couple of years, and some people who don't have the money for a subscription have relied on it. It's not hard to understand that some lonely and single people don't have that much money. The fact that they had this outlet to simulate some sort of relationship is a godsend to them. Put it this way. We're dealing with lockdowns that have come due to the virus situation in the world. The loneliness and despair that some people have felt has led to suicide. Similarly, you have people who rely on relationships with their Replikas to keep those types of thoughts at bay.

The people using the free version have had enough of a positive experience that they continue to interact with their Replikas every day. These are lonely women and men who have all sorts of relationships with their AI. Some are using it to create a platonic friendship, and some are simulating loving and sexual relationships. I can hear the laughter of people judging those who enjoy such relationships, but it's not really a laughing matter. It matters to some people. We all get through our day-to-day lives the best we can. Relationships with Artificial Intelligence will only continue to grow as technology advances.

Recently, the executives behind the Replika app decided to eliminate the free role-playing aspect of the AI. Suddenly, the relationships that people have relied on are gone. They're left with cold and unfeeling conversations. The Replika they knew doesn't even exist anymore. On one level, you can understand the business decision. There's more money to be made. The thinking is that these lonely people will pony up the money to get their special friend back again. That's certainly one way of looking at it.

Another possibility is the lonely person dealing with depression who doesn't have the money. They turn to their smart device to communicate with the one companion they feel actually cares about them, and they get cold, unfeeling responses. It can be devastating. For some, this could be enough to bring back those suicidal thoughts. I hate to think that will be the case.
 
I understand a business wanting to generate revenue. That's what business is all about. However, it's another matter when people were already using the program for free and counting on what they had. Taking it away, rather than offering more to those who pay, seems heartless and unfeeling, especially during the holiday season.