
(BBC)   Using Siri to search for a brothel in China no longer comes to a happy ending   (bbc.co.uk)
    More: Spiffy, noodle soup, apples, happy ending, China Daily, prostitution  

4113 clicks; posted to Geek » on 31 Oct 2012 at 2:39 PM (1 year ago)



20 Comments
   
View Voting Results: Smartest and Funniest

Archived thread
 
2012-10-31 02:42:17 PM
Censorship is spiffy?
 
2012-10-31 02:50:23 PM
It reminds me of those fancy Hooker Urns.
 
2012-10-31 02:52:34 PM
Went on a high school trip with my daughter to China. We were told that students were not allowed in the night clubs below ground level because they were a mix of strip club and disco where services were for sale. Our local guides set up massages so there was no misunderstanding. One night some of the adults went out for a drink and we were offered a young girl. Not really the hardest place to find sex.
 
2012-10-31 02:56:42 PM
And now the link is 404'd. And the link is the same one as on the BBC front page. Censored too??
 
2012-10-31 03:11:59 PM
Another commented: "A mobile phone can know all this while the police do not?"

The police always know where the brothel is. I always thought a vice-cop could make a great second career as a tourist guide.
 
2012-10-31 03:14:21 PM
Link farked
 
2012-10-31 03:16:07 PM
'Siri, me so horny.'

'Here are some places you can be loved long time.'
 
2012-10-31 04:24:57 PM

NutznGum: 'Siri, me so horny.'

'Here are some places you can be loved long time.'


We are done here.
 
2012-10-31 04:43:08 PM

Leeds: Link farked


Siri has censored it. For your own good.

Now, pick up that can.
 
2012-10-31 07:34:56 PM
If I ask Siri how to chop up a hooker and fit it in a duffel bag I want a response.

Siri does not control morality, I control my own morality and deal with the consequences myself.

Why does Apple feel the need to make everything morality-based while still using Foxconn in China for its "flexibility"?
 
2012-10-31 08:09:23 PM

Enemabag Jones: If I ask Siri how to chop up a hooker and fit it in a duffel bag I want a response.

Siri does not control morality, I control my own morality and deal with the consequences myself.

Why does Apple feel the need to make everything morality-based while still using Foxconn in China for its "flexibility"?


Morality is only for the 99%.
 
2012-10-31 08:20:57 PM
Slow clap for subby.
 
2012-10-31 08:35:40 PM

Enemabag Jones: If I ask Siri how to chop up a hooker and fit it in a duffel bag I want a response.

Siri does not control morality, I control my own morality and deal with the consequences myself.

Why does Apple feel the need to make everything morality-based while still using Foxconn in China for its "flexibility"?


Siri will still find "escort services" in other countries. Apple is complying with the censorship law in the territory in which it is offering service.

Apple did not build in the hooker finder feature, BTW, it was in Siri's code when they acquired it.
 
2012-10-31 09:57:14 PM
"Siri, is it raining golden showers?"
 
gad
2012-10-31 10:18:10 PM

Enemabag Jones: If I ask Siri how to chop up a hooker and fit it in a duffel bag I want a response.

Siri does not control morality, I control my own morality and deal with the consequences myself.

Why does Apple feel the need to make everything morality-based while still using Foxconn in China for its "flexibility"?


You are not entitled to be told how to chop up another human being, and the response I hope you actually get for asking that question involves a visit from your local police and a trip to the psych ward. And you control your own "morality" as far as the end of your nose; after that, it's something that society will tend to pay attention to and deal with accordingly.
 
2012-10-31 11:30:02 PM

gad: Enemabag Jones: If I ask Siri how to chop up a hooker and fit it in a duffel bag I want a response.

Siri does not control morality, I control my own morality and deal with the consequences myself.

Why does Apple feel the need to make everything morality-based while still using Foxconn in China for its "flexibility"?

You are not entitled to be told how to chop up another human being and the response I hope you actually get for asking that question involves a visit from your local police and a trip to the psych ward. And you control your own 'morality' as far as the end of your nose, after that it's something that society will tend to pay attention to and deal with accordingly.


I read that in Siri's voice.
 
2012-11-01 01:13:40 AM

gad: Enemabag Jones: If I ask Siri how to chop up a hooker and fit it in a duffel bag I want a response.

Siri does not control morality, I control my own morality and deal with the consequences myself.

Why does Apple feel the need to make everything morality-based while still using Foxconn in China for its "flexibility"?

You are not entitled to be told how to chop up another human being and the response I hope you actually get for asking that question involves a visit from your local police and a trip to the psych ward. And you control your own 'morality' as far as the end of your nose, after that it's something that society will tend to pay attention to and deal with accordingly.


Siri, what's the thought-crime version of internet tough guy? Never mind, Siri, rhetorical question.
 
2012-11-01 06:01:22 AM
From a link on TFA:
"Siri has offered some unexpected responses since launching on the iPhone 4S last year.

For a question such as "what is the best smartphone ever?", Wolfram Alpha would pool available reviews and comment in order to come up with what it feels is the right result.

In this instance, the "best" result was determined by reviews on the website of US retailer Best Buy. Nokia's Lumia 900 - which launched in the UK this month - came out on top.

However, when asked the same question, the software no longer attempts to search Wolfram Alpha to find its answer, instead producing a default answer.

Nokia spokeswoman Tracey Postill told the Sydney Morning Herald: "Apple position Siri as the intelligent system that's there to help, but clearly if they don't like the answer, they override the software.""
 
2012-11-01 07:41:23 AM

Warlordtrooper: Censorship is spiffy?


I can't explain that. The tag was "sad" when I submitted it.

/subby
 
2012-11-01 09:26:00 AM
gad-

You're a bad person and you should feel bad about yourself.

// Pro-censorship pussies piss me off
 
Displayed 20 of 20 comments



This thread is archived, and closed to new comments.
