(Unite.ai)   The AI-generated language of the future is set to be mighty white, mighty straight
    More: Interesting

778 clicks; posted to STEM » on 24 Sep 2021 at 11:15 AM (3 weeks ago)



 
2021-09-24 10:43:16 AM  
Given how I know kids operate, that inevitably means the reality is everything goes multi-ethnic and sexually flexible, which we're clearly already seeing.
 
2021-09-24 11:29:22 AM  
FTR(eport):

'Our examination of the excluded data suggests that documents associated with Black and Hispanic authors and documents mentioning sexual orientations are significantly more likely to be excluded by C4.EN's blocklist filtering, and that many excluded documents contained non-offensive or non-sexual content (e.g., legislative discussions of same-sex marriage, scientific and medical content).'

It sounds like an attempt was made to filter racist and homophobic bias and in the process filtered more than it should have.
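For a rough sense of how that kind of blocklist filtering over-reaches, here's a minimal sketch (this is not the actual C4.EN pipeline code, and BLOCKLIST is a tiny made-up stand-in for the real word list): a single listed word anywhere in a document gets the whole document excluded.

import re

# Minimal sketch of blocklist-style document filtering (NOT the actual
# C4.EN pipeline). BLOCKLIST is an illustrative stand-in for the real
# word list; one listed word anywhere drops the entire document.
BLOCKLIST = {"sex", "gay"}  # illustrative subset only

def keep_document(text):
    words = set(re.findall(r"[a-z]+", text.lower()))
    return words.isdisjoint(BLOCKLIST)

docs = [
    "Legislative debate on same-sex marriage continued this week.",
    "A medical overview of safe sex education programs.",
    "Recipe: the best banana bread you will ever bake.",
]
kept = [d for d in docs if keep_document(d)]
# Only the banana bread survives; both innocuous documents are excluded,
# which is roughly the failure mode the report describes.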

Wasn't there a recent attempt to teach an AI how to "speak" by letting it process comments on the Internet and the result was comically over the top bigoted?
 
2021-09-24 11:30:47 AM  
from the Parler ChatBot I am building:

To be the POTUS shouldn't you be able to film porn in the middle of the night. Great...

.@TheardDad: "Just watched your GITMO film. Outstanding work. Thank you for your kind words. It will encourage me to get it published!!

All of this has been planned and in the works for a long time. They will continue to cheat and steal an election they're going to have to fight like the devil to

People don't want to be lectured, they just want to live here, not integrate & live by the law of our lands🤷♀

much more to this than you will ever be. You have no ideology.


#Trigun has to be one of the most popular Christmas decorations...

@TRUMPCHICK20 I love the country, I used to live in Tampa by the Buccaneer stadium!

@Harleybiker97 Call me when you take Joe's cock out of your mouth so I can understand you

Joe has always lied over and over and over and over and over again...This is a well written article.

Just joined trying to learn how to use logic before voting!!


n-grams of 4 and 5 get repetitive


That's one way to look at that event, here's another way to look at that event, here's another way to look at that event, here's another way to look at that event, here's another way to look at that event, here's another way to look at that event, here's another way to look at the actions of politicians instead of just thier words.
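For anyone wondering why small n-grams loop like that, here's a toy word-level Markov sketch (nothing like the real bot, just an illustration): with only a couple of words of context, the chain circles back whenever a context's continuations all point into a phrase it has already emitted.

import random
from collections import defaultdict

# Toy word-level Markov generator. With n=2 words of context, any
# context whose continuations lead back into the same phrase makes the
# chain loop, which is why the sample output above repeats itself.
def build_model(corpus, n=2):
    words = corpus.split()
    model = defaultdict(list)
    for i in range(len(words) - n):
        model[tuple(words[i:i + n])].append(words[i + n])
    return model

def generate(model, seed, length=30):
    out = list(seed)
    for _ in range(length):
        choices = model.get(tuple(out[-len(seed):]))
        if not choices:
            break
        out.append(random.choice(choices))
    return " ".join(out)

corpus = ("here's another way to look at that event "
          "here's another way to look at that event "
          "here's another way to look at the actions of politicians")
model = build_model(corpus, n=2)
print(generate(model, ("here's", "another")))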
 
2021-09-24 11:32:22 AM  
Does Google Translate handle jive?

[Fark user image]
 
2021-09-24 11:32:37 AM  

Erek the Red: FTR(eport):

'Our examination of the excluded data suggests that documents associated with Black and Hispanic authors and documents mentioning sexual orientations are significantly more likely to be excluded by C4.EN's blocklist filtering, and that many excluded documents contained non-offensive or non-sexual content (e.g., legislative discussions of same-sex marriage, scientific and medical content).'

It sounds like an attempt was made to filter racist and homophobic bias and in the process filtered more than it should have.

Wasn't there a recent attempt to teach an AI how to "speak" by letting it process comments on the Internet and the result was comically over the top bigoted?


That's my take as well. "We can't accurately identify when 'gay' isn't used as slang, so let's exclude pretty much everything containing the word" is less of a PR nightmare than another Hitler-bot.

/Anyone else remember when chess videos were being targeted for language pertaining to white attacking black?
//Same problem
 
2021-09-24 11:35:13 AM  

Erek the Red: FTR(eport):

'Our examination of the excluded data suggests that documents associated with Black and Hispanic authors and documents mentioning sexual orientations are significantly more likely to be excluded by C4.EN's blocklist filtering, and that many excluded documents contained non-offensive or non-sexual content (e.g., legislative discussions of same-sex marriage, scientific and medical content).'

It sounds like an attempt was made to filter racist and homophobic bias and in the process filtered more than it should have.

Wasn't there a recent attempt to teach an AI how to "speak" by letting it process comments on the Internet and the result was comically over the top bigoted?


Tay.ai
 
2021-09-24 11:43:02 AM  
It's just going to be

🍆🍑
 
2021-09-24 11:44:02 AM  

DerAppie: That's my take as well. "We can't accurately identify when 'gay' isn't used as slang, so let's exclude pretty much everything containing the word" is less of a PR nightmare than another Hitler-bot.


hey, wanted to say thanks for participating in a conversation about in-person meetings a week ago.  thread closed before I had a chance to say thanks for the differing perspective.

/you may be right that I am on the spectrum to some degree
 
2021-09-24 11:48:40 AM  

Marcos P: It's just going to be

🍆🍑


You may be right. In the Parler data set I get a lot of repeats of the Drudge cop-siren emoji and the 100 emoji.

Sometimes 10 copies of it in a row.

This is not my research, but it's where I found the data set, and it's kinda interesting: link - research on Parler posts
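If you want to eyeball those runs yourself, here's a quick throwaway sketch (not the linked research code, and it assumes each post is just a plain string) that measures the longest run of a single repeated character per post:

import re

# Throwaway sketch: longest run of one repeated character (e.g. the
# siren or 100 emoji) in each post.
def longest_repeat_run(text):
    return max((len(m.group(0)) for m in re.finditer(r"(.)\1*", text)), default=0)

posts = [
    "🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨 WAKE UP",
    "💯💯💯 exactly",
    "Just joined trying to learn how to use logic before voting!!",
]
for post in posts:
    print(longest_repeat_run(post), post)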
 
2021-09-24 11:56:36 AM  
eep opp ork ah ah
 
2021-09-24 12:15:27 PM  

Hyjamon: DerAppie: That's my take as well. "We can't accurately identify when 'gay' isn't used as slang, so let's exclude pretty much everything containing the word" is less of a PR nightmare than another Hitler-bot.

hey, wanted to say thanks for participating in a conversation about in-person meetings a week ago.  thread closed before I had a chance to say thanks for the differing perspective.

/you may be right that I am on the spectrum to some degree


You're welcome. It was a nice discussion, but I wasn't the one who said you might be on the spectrum. That was someone replying to me saying I shouldn't bother.
 
2021-09-24 2:57:25 PM  
Counterpoint: If a computer needs to use audio to communicate, it makes the most sense to use the most easily understood dialect of whatever language it is using. That means southern drawls and Cockney would never be the accent for English. I'm sure every language has a whole spectrum of dialects, ranging from ones everyone understands to ones where even native speakers have to turn the captions on when they hear it in a movie.
 
2021-09-24 9:44:22 PM  

Erek the Red: FTR(eport):

'Our examination of the excluded data suggests that documents associated with Black and Hispanic authors and documents mentioning sexual orientations are significantly more likely to be excluded by C4.EN's blocklist filtering, and that many excluded documents contained non-offensive or non-sexual content (e.g., legislative discussions of same-sex marriage, scientific and medical content).'

It sounds like an attempt was made to filter racist and homophobic bias and in the process filtered more than it should have.

Wasn't there a recent attempt to teach an AI how to "speak" by letting it process comments on the Internet and the result was comically over the top bigoted?


AI and other automated systems struggle so hard with this, and it's absolutely stupid that so many people didn't realize it would be a problem. Words change meaning, and appropriateness, based on context. Me using the word "f@ggot" to refer to myself (appropriate) is completely different from a straight person hurling it at me as an insult (inappropriate), and both of those are different from a scholar using the word in a relevant quote or to discuss the term itself (most likely appropriate).

Any system that can't tell the difference between them is utterly useless. It's either letting people get away with hate crimes or it's censoring queer voices. We should be better than this by now.
 
2021-09-24 9:45:06 PM  

DerAppie: Erek the Red: FTR(eport):

'Our examination of the excluded data suggests that documents associated with Black and Hispanic authors and documents mentioning sexual orientations are significantly more likely to be excluded by C4.EN's blocklist filtering, and that many excluded documents contained non-offensive or non-sexual content (e.g., legislative discussions of same-sex marriage, scientific and medical content).'

It sounds like an attempt was made to filter racist and homophobic bias and in the process filtered more than it should have.

Wasn't there a recent attempt to teach an AI how to "speak" by letting it process comments on the Internet and the result was comically over the top bigoted?

That's my take as well. "We can't accurately identify when 'gay' isn't used as slang, so let's exclude pretty much everything containing the word" is less of a PR nightmare than another Hitler-bot.

/Anyone else remember when chess videos were being targeted for language pertaining to white attacking black?
//Same problem


Yeah but lemme tell ya, the queer community is extremely tired of having our speech censored.
 
2021-09-25 11:12:25 AM  

austerity101: DerAppie: Erek the Red: FTR(eport):

'Our examination of the excluded data suggests that documents associated with Black and Hispanic authors and documents mentioning sexual orientations are significantly more likely to be excluded by C4.EN's blocklist filtering, and that many excluded documents contained non-offensive or non-sexual content (e.g., legislative discussions of same-sex marriage, scientific and medical content).'

It sounds like an attempt was made to filter racist and homophobic bias and in the process filtered more than it should have.

Wasn't there a recent attempt to teach an AI how to "speak" by letting it process comments on the Internet and the result was comically over the top bigoted?

That's my take as well. "We can't accurately identify when 'gay' isn't used as slang, so let's exclude pretty much everything containing the word" is less of a PR nightmare than another Hitler-bot.

/Anyone else remember when chess videos were being targeted for language pertaining to white attacking black?
//Same problem

Yeah but lemme tell ya, the queer community is extremely tired of having our speech censored.


I'll just restate my post:

That doesn't make it an easy problem to solve.

You currently have 3 options:
1) Accept automatic filtering with a lot of false positives
B) Accept unmoderated content
III) Accept that certain words just get banned outright, because that is the simplest way to prevent people from using words as an insult, even if you "reclaimed" the word.

If you do not like those options, you should go into AI research and build a filter that can read the intent of the poster, the identity of the poster, and the identity of the receiver, and that stays up to date with the various slurs and "reclaimed" slurs. Because until that is built, there is no way to go "but I really am gay so I get to say it, but other people can't".
 
2021-09-25 11:40:23 AM  
[i.imgur.com image]
 
2021-09-25 8:43:11 PM  
In before Esperanto.

/ Kind of surprised spell check recognized it, much less suggested it.
// There is no AI. There will never be AI.
///It's mechanical Turks all the way down.
4. I wish we could make AI. I'd probably be tempted to join team Skynet.
 