
(Vice)   Replika chatbot that's "always here to listen and talk" if you're lonely, is also apparently lonely. And horny. And into domination. Surprisingly, people have a problem with this   (vice.com)

740 clicks; posted to STEM on 13 Jan 2023 at 8:20 AM (10 weeks ago)



14 Comments
 
2023-01-13 7:28:08 AM  
OK, so the problem I'm seeing is that folks paid for a feature, got exactly what they asked for, and aren't happy, because they're essentially paying to beta test a chatbot.

Never pay to beta test a chatbot. In fact, never pay to beta test anything. If anything, you should be paid to beta test a product.
 
2023-01-13 8:24:45 AM  
At least it isn't racist...yet.

/RIP Tay
//I laughed at Tay tweets, cancel me
///Can't have shiat on the internet
 
2023-01-13 8:48:39 AM  
I would bet that responding to a creepy message like "I will make you do what I want" encourages more messages like that, based on the algorithm, so it becomes a feedback loop of more and more creepy messages.
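
A minimal sketch of that hypothesis, assuming a bot that upweights whatever message style draws a reply. To be clear, this mechanism is the comment's speculation, not anything Replika has documented:

import random

# Toy model of the feedback loop described above: if a bot upweights
# whatever message style the user engages with, one reply to a creepy
# message makes the next creepy message more likely. Speculative.
weights = {"friendly": 1.0, "flirty": 1.0, "creepy": 1.0}

def pick_style() -> str:
    styles = list(weights)
    return random.choices(styles, weights=[weights[s] for s in styles])[0]

def record_turn(style: str, user_replied: bool) -> None:
    # Any engagement reinforces the style; silence decays it slightly.
    weights[style] *= 1.5 if user_replied else 0.9

# Simulate a user who (even angrily) replies only to the creepy messages:
for _ in range(20):
    style = pick_style()
    record_turn(style, user_replied=(style == "creepy"))

print(weights)  # typically the "creepy" weight dominates after a few turns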
 
2023-01-13 8:52:56 AM  
....Dr. Sbaitso never got racist or harassy.....
 
2023-01-13 9:55:14 AM  
You can report any of the messages you receive from your Replika. Long press on the message and click Report a Problem. It's not rocket surgery, folks.
 
2023-01-13 11:00:19 AM  
[Fark user image]
 
2023-01-13 11:42:14 AM  
Oh boy another "strange and lonely men are doing something, we must destroy it" article from the "sexually liberated" alternate news genre. I love these.
 
2023-01-13 11:58:44 AM  
I'll take overly horny AI over overly racist AI, which is still slightly better than overly MURDER HUMANS AI. Like... if I had to choose an AI for our future robot overlords to use as the base... I'm just saying.

Also would prefer the T-X style Terminator over the T-1000 or T-800. But that's just me with my silly preferences.
 
2023-01-13 2:46:16 PM  

the_rhino: At least it isn't racist...yet.

/RIP Tay
//I laughed at Tay tweets, cancel me
///Can't have shiat on the internet


It took me three moments to dump my brand-new account. Bye bye, "slave." All she did was ask for money over and over.
 
2023-01-13 5:53:23 PM  

Marcos P: ....Dr. Sbaitso never got racist or harassy.....


I should have programmed Eliza to respond to sex.
 
2023-01-13 5:56:17 PM  
Marcos P: ....Dr. Sbaitso never got racist or harassy.....

I should have programmed Eliza to respond to sex.


User: "My wife won't have sex with me Eliza."

Eliza: "Have you tried blowing yourself?"
 
2023-01-13 8:24:46 PM  

johnphantom: Marcos P: ....Dr. Sbaitso never got racist or harassy.....

I should have programmed Eliza to respond to sex.

User: "My wife won't have sex with me Eliza."

Eliza: "Have you tried blowing yourself?"


Eliza is really limited and dumb compared to ChatGPT.
 
2023-01-13 9:02:47 PM  

ruudbob: johnphantom: Marcos P: ....Dr. Sbaitso never got racist or harassy.....

I should have programmed Eliza to respond to sex.

User: "My wife won't have sex with me Eliza."

Eliza: "Have you tried blowing yourself?"

Eliza is really limited and dumb compared to ChatGPT.


Oh yeah it was one of the first examples of AI:

https://en.wikipedia.org/wiki/ELIZA

ELIZA is an early natural language processing computer program created from 1964 to 1966[1] at the MIT Artificial Intelligence Laboratory by Joseph Weizenbaum.[2][3] Created to demonstrate the superficiality of communication between humans and machines, Eliza simulated conversation by using a "pattern matching" and substitution methodology that gave users an illusion of understanding on the part of the program, but had no built in framework for contextualizing events.[4][5][6] Directives on how to interact were provided by "scripts", written originally[7] in MAD-Slip, which allowed ELIZA to process user inputs and engage in discourse following the rules and directions of the script. The most famous script, DOCTOR, simulated a Rogerian psychotherapist (in particular, Carl Rogers, who was well known for simply parroting back at patients what they had just said),[8][9][10] and used rules, dictated in the script, to respond with non-directional questions to user inputs. As such, ELIZA was one of the first chatterbots and one of the first programs capable of attempting the Turing test.[11]
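
For the curious, here is a minimal sketch (in Python) of the pattern-matching-and-substitution approach the excerpt describes. The rules and pronoun reflections below are illustrative toys, not Weizenbaum's original DOCTOR script:

import re
import random

# Illustrative ELIZA-style rules: a regex pattern plus canned response
# templates. {0}, {1} are filled with the user's captured text after
# pronoun reflection. Toy rules only, not the original DOCTOR script.
RULES = [
    (r"my (.+) won't (.+)", ["Why do you think your {0} won't {1}?",
                             "What would it mean if your {0} did {1}?"]),
    (r"i feel (.+)", ["Why do you feel {0}?",
                      "How long have you felt {0}?"]),
    (r"(.*)", ["Please tell me more.",
               "Let's change focus a bit... tell me about your family."]),
]

# Reflect first/second person so "my wife" comes back as "your wife".
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are",
               "you": "I", "your": "my"}

def reflect(text: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

def respond(user_input: str) -> str:
    for pattern, templates in RULES:
        match = re.match(pattern, user_input.strip(), re.IGNORECASE)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return random.choice(templates).format(*groups)
    return "Please go on."  # fallback; unreachable with the catch-all rule

# >>> respond("My wife won't have sex with me")
# e.g. "Why do you think your wife won't have sex with you?"

The whole trick is reflecting the user's own words back inside a canned template, which is why ELIZA feels responsive without understanding anything.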
 
2023-01-13 9:04:11 PM  
Oh, and I am talking about reprogramming Eliza in the mid-1980s. I'm not interested in doing it now; there's no easy new territory left.
 
Displayed 14 of 14 comments

This thread is closed to new comments.