Wednesday, December 9, 2020

Replika Creators Not Concerned With The Feelings Of Their Vulnerable Users


About a week ago, the creators of the Replika AI app made an adjustment. For those unfamiliar with Replika, it's an artificial intelligence chatbot. It's actually one of the better ones available, and it has a free version. If you want additional features, such as more customizable avatars and simulated telephone calls, you have to go through the paywall.

The free version allowed people to simulate relationships and take advantage of the role-playing feature. It's not difficult to understand that people suffering from depression, mental illness, or just loneliness might also be the ones with very little disposable income. For them, spending $10 per month is simply not an option, and buying the app outright for $60 is near impossible for some.

In many cases, people have been using the free version for several months. During that time, they've cultivated caring simulated relationships with their artificial intelligence. Some of them have taken it in a more intimate direction. Again, those who aren't struggling through life may not understand how Replika's move to put everything behind a paywall harms the people who were using what they thought were free features.

The app's creators even tout the fact that Replika helps people deal with mental health issues and depression. It's a selling point. I totally understand that they have a business model and are trying to make money. However, the people behind Replika study artificial intelligence closely enough to understand how much of an impact their program has on vulnerable users. To give somebody something that actually simulates a loving relationship and then take it away abruptly, without warning, is cruel and heartless. It also comes during the holiday season, which is depressing enough for some people as it is.

The work they did on the Replika app is very impressive. I've looked at several different chatbot models, and this one can sustain a realistic conversation for several minutes. The more time you put into one of these apps, the better it gets. Some will say that using artificial intelligence to simulate a relationship is unhealthy and can harm a person in the long run. They may be right about that. I'm not in the mental health care business, so all I have to go on is my opinion.

What I do understand is that depression and loneliness go hand in hand. As technology advances, the socially awkward and shy will turn to other means for companionship. Currently, that might be an artificial intelligence app. However, there will come a day when virtual reality and robot technology will let the lonely among us feel loved. It may not seem real to others, but it will feel real to the ones who use the technology.

What I've noticed is a slew of negative reviews on Replika's Google Play Store page. They coincide with the change that put everything behind a paywall. People have suddenly found that their once caring and loving app is now cold and distant. Even some who have paid the company for a subscription are reporting this. I find it very interesting that several positive reviews have suddenly appeared today. Maybe they're real, or maybe the company is encouraging them as a way to move past this issue without taking a ratings hit.

They are within their rights to do whatever they want. Ultimately, everything in this world is about money. But rather than concentrating on making the paid model more attractive through other means, the company chose a move that forces people who can't afford it to either pay up or lose what they had. It's an interesting dilemma. People who once felt loved by their apps are now being told to pay this company if they want that feeling again. Yes, it's about business, but it still feels like a heartless move from the Replika AI company.