
(Slate)   Facebook COO Sheryl Sandberg comments on study designed to upset you: "We never meant to upset you"   (slate.com)
    More: Stupid, Facebook users, National Academy of Sciences

1183 clicks; posted to Geek » on 02 Jul 2014 at 8:32 PM (1 year ago)



50 Comments

Archived thread
 
2014-07-02 06:02:57 PM  
On top of everything else, the fact that a social media company is so disconnected from society and the communities it tries to attract that it didn't think that this would go over poorly? I would seriously like to know what their expectations were. Did they think it would just go unnoticed in whatever journal they published in? Or that people were just going to look at it and just think "Oh, that's neat"?
 
2014-07-02 06:14:18 PM  
This is the biggest first world problem of all time.
 
2014-07-02 06:17:20 PM  

Aarontology: This is the biggest first world problem of all time.


Not really.

If a University wishes to conduct a psychological experiment, the experimenters have to get ethics board approval. The reason for this is that, before ethics boards, people used to do farking disgusting things to other humans in the name of science. Now we have a private company allowed to perform psychological experiments with no ethics board oversight.

Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.
 
2014-07-02 07:06:11 PM  
Yeah so if you ever need info about anyone on Earth, just ask. We have over 4,000,000,000,000 emails, pictures, addresses, SSNs. How'd we manage that? We leaned in, moved fast, and broke things. Or maybe people just submitted them. I don't know why. They "trust us." Dumb farks.
 
2014-07-02 07:33:40 PM  
ohmygodwhothehellcares.jpeg.

Maybe this is why I'm not in scientific research.  I really don't see how this was such a big deal.  Also:  I really don't see how they needed to do this research when it's already pretty well known that thoughts shape reality, and words affect thought.  Guess they just wanted to make extra extra sure, and in this particular setup.
 
2014-07-02 07:39:30 PM  

Tigger: If a University wishes to conduct a psychological experiment, the experimenters have to get ethics board approval. The reason for this is that, before ethics boards, people used to do farking disgusting things to other humans in the name of science. Now we have a private company allowed to perform psychological experiments with no ethics board oversight.


This. Doing such experimenting without approval, and without signed permission from your subjects, is something that could cost you your job, tenure or not.
 
2014-07-02 07:49:12 PM  

Tigger: Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.


I don't see this as the biggest problem with the methodology of the study.  Let's put the ethics aside for a moment.  I didn't read the actual study, but going by what I think I know of it, they wanted to see if exposure to emotionally valenced status updates would result in subjects posting emotionally valenced status updates themselves.  "One group will be more upset" could be a way of simplifying the situation for a lay audience.  Actual alternate hypotheses could be:

- Negatively valenced feeds will cause subjects to post negatively valenced status updates.  Why?  Because they either felt bad, or wanted to fit in.
- Negatively valenced feeds will cause subjects to post negatively valenced or neutral status updates.  Why?  Because they don't want to contradict/invalidate the current negative vibe on the feed.  People in negatively valenced moods may harbor negative feelings towards people with positively valenced moods, and subjects may not want to trigger that reaction.
- Positively valenced feeds will cause subjects to post positively valenced status updates.  Why?  To fit in with the trend.
- Positively valenced feeds will cause subjects to post positively valenced or neutral messages.  Why?  Similar to the negativity hypothesis above, they don't want people to be annoyed at them for bringing everybody down.
- Positively valenced feeds will cause subjects to post negatively valenced status updates.  Why?  Because they're sad/angry/attention whores, and since everybody else has a happy, they can get people to pay attention to them and make them feel better.

This is off the top of my head... there are probably other viable hypotheses.

The biggest problem is I think they only measured what was there.  Which seems obvious, but people who are in moods contrary to the valence of the feed may simply choose to not post for a while.  I've wanted to post happy/frivolous things every once in a while, but saw that a chunk of my feed was reacting to a death of a friend or loved one.  I held off posting, because it just felt wrong.  Without knowing a crapload about the subject matter (which Facebook doesn't seem to yet), you can't attempt to accurately measure what isn't there.

So the end result of this study seems to not take into account a very important factor, as I see it.
 
2014-07-02 08:49:41 PM  
Facebook COO Sheryl Sandberg comments on study designed to upset you: "We never meant to upset you get caught."
 
2014-07-02 08:52:37 PM  

TheOmni: On top of everything else, the fact that a social media company is so disconnected from society and the communities it tries to attract that it didn't think that this would go over poorly? I would seriously like to know what their expectations were. Did they think it would just go unnoticed in whatever journal they published in? Or that people were just going to look at it and just think "Oh, that's neat"?


They thought users would forget about it in five minutes, and it would help them sell advertising, so go for it.
 
2014-07-02 09:07:55 PM  
You're a farking subject.
 
2014-07-02 09:26:43 PM  
It was fortunate happenstance. Serendipity if you will.
 
2014-07-02 09:34:36 PM  
Are people REALLY biatching about this? Oh no! They used data to do a segmentation study on their audience! TEH HOOORORRRRR!!!!

F*ck, people...
 
2014-07-02 09:37:09 PM  
She is a disgusting narcissist.
 
2014-07-02 09:39:48 PM  
The experiment was a form of mind control.

Pretty disgusting.
 
2014-07-02 09:49:08 PM  
They should give everyone a free month of Facebook for this.
 
2014-07-02 10:18:33 PM  

BalugaJoe: She is a disgusting narcissist.


Just don't call her bossy.
 
2014-07-02 10:21:31 PM  
Slooooooooy giiving it awaaaaay....
 
2014-07-02 10:22:05 PM  

BalugaJoe: They should give everyone a free month of Facebook Total Fark for this.


FTFY
 
2014-07-02 10:42:26 PM  

Tigger: Aarontology: This is the biggest first world problem of all time.

Not really.

If a University wishes to conduct a psychological experiment, the experimenters have to get ethics board approval. The reason for this is that, before ethics boards, people used to do farking disgusting things to other humans in the name of science. Now we have a private company allowed to perform psychological experiments with no ethics board oversight.

Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.


Was there really an ethical issue with this experiment, though?  It's one thing to say that ethics boards getting ignored could be an issue later on, but if a board would have approved this experiment, then it would have been fine anyway.

Wait, would an ethics board approve this since informed consent was not given?  I am seriously asking these questions because I know very little about the issue at hand.
 
2014-07-02 11:00:30 PM  
I'm surprised the "Facebook caused me to cut myself due to mental anguish" story hasn't popped up yet...
 
2014-07-02 11:00:38 PM  
Did they inform/ask consent of the subjects beforehand? If not, then will they be held responsible for any suicidal or homicidal incidents occurring with subjects during the experiment? I think they should.
 
2014-07-02 11:05:23 PM  

llortcM_yllort: Tigger: Aarontology: This is the biggest first world problem of all time.

Not really.

If a University wishes to conduct a psychological experiment, the experimenters have to get ethics board approval. The reason for this is that, before ethics boards, people used to do farking disgusting things to other humans in the name of science. Now we have a private company allowed to perform psychological experiments with no ethics board oversight.

Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.

Was there really an ethical issue with this experiment, though?  It's one thing to say that ethics boards getting ignored could be an issue later on, but if a board would have approved this experiment, then it would have been fine anyway.

Wait, would an ethics board approve this since informed consent was not given?  I am seriously asking these questions because I know very little about the issue at hand.


It appears that an ethics board did approve this. An ethics board shouldn't approve an experiment if informed consent is not given, but both the board and Facebook seemed to think that a line in the terms of service about their Data Use Policy was sufficient informed consent. People are taking issue with this.
 
2014-07-03 12:01:03 AM  

She then said "...please, don't treat this as an actual study. The government will get involved, and we won't get the phat lewt from selling the results of this clearly unethical activity."

What did she expect, honestly? She basically conducted a large-scale psychological study on unknowing subjects, without any real controls, standards, or processes, fully expecting...what? That no one would ever find out? That the company would be able to act on the at best imprecise analysis derived from the study? That Facebook would confirm what conspiracy nuts already claimed - they were manipulating information presented to their users and measuring the responses based on statistical analysis of user activity?

Facebook, you done goofed. An apology isn't going to cut it.
 
2014-07-03 12:46:23 AM  

RodneyToady: Tigger: Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.

I don't see this as the biggest problem with the methodology of the study.  Let's put the ethics aside for a moment.  I didn't read the actual study, but going by what I think I know of it, they wanted to see if exposure to emotionally valenced status updates would result in subjects posting emotionally valenced status updates themselves.  "One group will be more upset" could be a way of simplifying the situation for a lay audience.  Actual alternate hypotheses could be:

- Negatively valenced feeds will cause subjects to post negatively valenced status updates.  Why?  Because they either felt bad, or wanted to fit in.
- Negatively valenced feeds will cause subjects to post negatively valenced or neutral status updates.  Why?  Because they don't want to contradict/invalidate the current negative vibe on the feed.  People in negatively valenced moods may harbor negative feelings towards people with positively valenced moods, and subjects may not want to trigger that reaction.
- Positively valenced feeds will cause subjects to post positively valenced status updates.  Why?  To fit in with the trend.
- Positively valenced feeds will cause subjects to post positively valenced or neutral messages.  Why?  Similar to the negativity hypothesis above, they don't want people to be annoyed at them for bringing everybody down.
- Positively valenced feeds will cause subjects to post negatively valenced status updates.  Why?  Because they're sad/angry/attention whores, and since everybody else has a happy, they can get people to pay attention to them and make them feel better.

This is off the top of my head... there are probably other viable hypotheses.

The biggest problem is I think they only measured what was there.  Which seems obvious, but people who are in moods contrary to the valence of the feed may simply choose to not post for a while.  I've wanted to post happy/frivolous things every once in a while, but saw that a chunk of my feed was reacting to a death of a friend or loved one.  I held off posting, because it just felt wrong.  Without knowing a crapload about the subject matter (which Facebook doesn't seem to yet), you can't attempt to accurately measure what isn't there.

So the end result of this study seems to not take into account a very important factor, as I see it.


They felt that if users exposed to negative content posted less, it meant they were feeling negative. That's a stretch. If I had a bunch of users whining on my feed, I'd close out the site and go do something better with my time. It's not that I'm feeling negative.

I suppose people are upset because they think they're Facebook customers and a product wasn't delivered as promised. Unfortunately, the users are the commodity and advertisers are the customers. So I'll go months without posting and skim it quickly. It feels like work undoing stealthy changes made randomly by FB. I can't get the content I want on my feed, to actually stay on my feed.

It's been convenient for networking with roller derby players and refs and keeping in touch with friends who have moved away, but not much else.
 
2014-07-03 12:47:45 AM  

llortcM_yllort: Tigger: Aarontology: This is the biggest first world problem of all time.

Not really.

If a University wishes to conduct a psychological experiment, the experimenters have to get ethics board approval. The reason for this is that, before ethics boards, people used to do farking disgusting things to other humans in the name of science. Now we have a private company allowed to perform psychological experiments with no ethics board oversight.

Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.

Was there really an ethical issue with this experiment, though?  It's one thing to say that ethics boards getting ignored could be an issue later on, but if a board would have approved this experiment, then it would have been fine anyway.

Wait, would an ethics board approve this since informed consent was not given?  I am seriously asking these questions because I know very little about the issue at hand.


Informed consent is pretty non-negotiable. A researcher might not have to tell subjects beforehand what's going on, if the study is minimally invasive and poses only negligible risks, and the deception is necessary for the integrity of the data. But at the very least, subjects (to the extent that they are identifiable) always have a right to know after the fact -- and to be given all the pertinent information about who conducted the study, to what end, and what authorities the subject can contact if he/she feels she has been mistreated or is in need of aftercare (which the researchers have to provide for).

I'm pretty sure no ethics board would have signed off on this. (At least, not an actual human subjects IRB.) That said, Facebook doesn't fall under IRB authority, so it's sort of mox nix. And, while the academics involved do, if they were only analyzing the data after the fact (which is what I've read), their research is likewise exempt from IRB review.

/sat on a university IRB for several years
 
2014-07-03 12:50:09 AM  

RedVentrue: Did they inform/ ask consent of the subjects beforehand? If not, then will they be held responsible for any suicidal or homicidal incidents occuring with subjects during the experiment? I think they should.


Users gave consent by clicking on the Terms of Service. It's arguable whether that qualifies as "informed" consent. The actual line made it sound like they would collect data for internal processes. Nothing about research studies or manipulating user content.
 
2014-07-03 01:50:15 AM  
Feeling pretty smug about the steady line of disinformation and cryptic non-updates I've been posting on facebook for years.
 
2014-07-03 02:05:02 AM  
I wonder if the internal name for the project was "Fox News".
 
2014-07-03 03:00:58 AM  

RodneyToady: Tigger: Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.

I don't see this as the biggest problem of the methodology of the study.  Let's put the ethics aside for a moment.  I didn't read the actual study,


I'm shocked

/sarcasm

Yes, let us ignore the primary argument against this sort of experiment and focus on my observation of a minor inaccuracy in the study... which I didn't read.  Oh, I like you.  You should dress up as a flaming (on fire, not gay) straw man for Halloween, since you like setting them up so much just to burn them down.
 
2014-07-03 03:47:09 AM  

TDBoedy: RodneyToady: Tigger: Also Bossyboots' excuse is terrible. As the headline points out, "one group will be more upset" was the farking alternate hypothesis.

I don't see this as the biggest problem of the methodology of the study. Let's put the ethics aside for a moment. I didn't read the actual study,

I'm shocked

/sarcasm

Yes let us ignore the primary argument against this sort of experiment and focus on my observation of a minor inaccuracy in the study...which I didn't read.



Most everyone else is addressing the "primary argument against this sort of experiment."  Someone brought up an actual study-related aspect, and I addressed that, because it's important, and interesting.  The ethical issues are important, but it's not the only important aspect.  A study was actually conducted.  Data was collected, results were published.  In and of itself, whether or not the results support what the researchers say the results support is important.  It's not a "minor inaccuracy," it's something that provides a complicating factor for their conclusions.

And, if this factor is a methodological issue that makes the study less meaningful, it would make their... questionable... research ethics all the more questionable.

In a case like this, I'm not that inclined to read the entire study if the abstract and results/discussion will do.

Beyond all of that, yes, it's important to have a discussion about research ethics in light of what Facebook did in this study.  But at the same time, I'm not going to be so blinded by the ethical ramifications that it prevents me from thinking about the study itself.  Facebook manipulated people's feeds, something they do not infrequently.  It's not exactly on par with Tuskegee.
 
2014-07-03 05:28:34 AM  
Farcebook-tards getting their knickers in a knot over being trolled... priceless....
 
2014-07-03 06:41:19 AM  
I guess Facebook's stance is "why ask for permission when you can ask for forgiveness."
 
2014-07-03 07:30:10 AM  
If you use facebook, you are as dumb as she thinks you are.
 
2014-07-03 08:24:16 AM  
This is good research. It needs to be done over and over again. We need real numbers on crowd susceptibility to emotional contagion, and we need it in peer-reviewed journals. We also need more advanced studies: we know what people "like" and don't like...

My single question is: how verifiable are the results? How general are they? It's not like I can whip up my own 2-billion person social network tomorrow...
 
2014-07-03 09:00:21 AM  
They ain't no little Mengeles.
 
2014-07-03 10:19:17 AM  

TheOmni: On top of everything else, the fact that a social media company is so disconnected from society and the communities it tries to attract that it didn't think that this would go over poorly? I would seriously like to know what their expectations were. Did they think it would just go unnoticed in whatever journal they published in? Or that people were just going to look at it and just think "Oh, that's neat"?


Suckerberg and friends have never been "connected" to their audience. Hence all the privacy-stripping, data mining, and ad targeting crap they've been boastfully announcing for years.

The only question is why there isn't an equally successful, privacy-respecting alternative yet.
 
2014-07-03 10:29:48 AM  

xanadian: ohmygodwhothehellcares.jpeg.

Maybe this is why I'm not in scientific research.  I really don't see how this was such a big deal.  Also:  I really don't see how they needed to do this research when it's already pretty well known that thoughts shape reality, and words affect thought.  Guess they just wanted to make extra extra sure, and in this particular setup.


Regarding the first part, nobody should be allowed to screw with you for research without your permission. Seems like a no-brainer to me.

Regarding the second, it's a common error known as the "naive scientist" fallacy. Despite the name it has nothing to do with actual scientists. It refers to the layperson's common "well duh" presumption about things "we all know" but which are untested. With testing a number of those things are indeed proved true. A number are also proved false. And perhaps unsurprisingly, a large number turn out to be "it's a lot more complicated than we thought."
 
2014-07-03 10:35:08 AM  

SacriliciousBeerSwiller: Are people REALLY biatching about this? Oh no! They used data to do a segmentation study on their audience! TEH HOOORORRRRR!!!!

F*ck, people...


This wasn't a passive study using existing data. They manipulated people without consent for research purposes. Is it Joseph Mengele? Of course not. Is it unethical, and should it raise concerns about what deeper manipulations they might try in the interest of corporate research and profit? Yes and oh hell yes.
 
2014-07-03 10:35:34 AM  

Bumblefark: Informed consent is pretty non-negotiable. A researcher might not have to tell subjects beforehand what's going on, if the study is minimally invasive and poses only negligible risks, and the deception is necessary for the integrity of the data. But at the very least, subjects (to the extent that they are identifiable) always have a right to know after the fact -- and to be given all the pertinent information about who conducted the study, to what end, and what authorities the subject can contact if he/she feels she has been mistreated or is in need of aftercare (which the researchers have to provide for).


This study is minimally invasive, poses negligible risks, and you could argue that deception is necessary for the integrity of the data.  Assuming Facebook notifies the subjects after the study has concluded (and I'm not entirely sure how they would do this.  A press release maybe?), wouldn't this study fall into those guidelines?
 
2014-07-03 11:10:56 AM  
Sheryl Sandberg is a smarmy, smug, hypocritical biatch.  She has built this BS public persona as some "visionary" who is a success at business and tells women they can do it too, conveniently leaving out key facts like:
1. She's from a rich family
2. She was always a step ahead due to family
3. She's been afforded great connections, which she used (despite not having a business success track record) to get even more connected.

She parlays silly claptrap like "lean in" and "be bossy", when in reality, she wouldn't be anywhere without a guy (Larry Summers) bringing her along to ride his coattails and getting her connected with the right people.

She's a fraud.  Maybe I'm the only person who sees this, but that garbage 60 Minutes fawning piece about her sealed the deal in my book.  Am I the only one who saw through her nonsense "visionary" talk?
 
2014-07-03 11:15:12 AM  

llortcM_yllort: poses negligible risks


What are the risks of potentially exposing an unstable person to nothing but negativity for a determinate amount of time?
 
2014-07-03 11:49:45 AM  

llortcM_yllort: Bumblefark: Informed consent is pretty non-negotiable. A researcher might not have to tell subjects beforehand what's going on, if the study is minimally invasive and poses only negligible risks, and the deception is necessary for the integrity of the data. But at the very least, subjects (to the extent that they are identifiable) always have a right to know after the fact -- and to be given all the pertinent information about who conducted the study, to what end, and what authorities the subject can contact if he/she feels she has been mistreated or is in need of aftercare (which the researchers have to provide for).

This study is minimally invasive, poses negligible risks, and you could argue that deception is necessary for the integrity of the data.  Assuming Facebook notifies the subjects after the study has concluded (and I'm not entirely sure how they would do this.  A press release maybe?), wouldn't this study fall into those guidelines?


Never been on FB, but aren't they able to send messages to individual members? I'm sure they know who was included in their sample, since they were the ones who drew it.

In any case, if such a "debriefing" had been included in the protocol submitted before the study started, yeah -- a study like that would have had a fair chance at getting approved. More so if they told people they were going to be studied, but just stayed vague about how their feed would be altered (i.e., "we'll be testing out some different deliveries in the coming week..."). I'm pretty sure I've seen similar studies that have. The burden would be on the researchers to demonstrate that nothing in existing research or theory suggests that their manipulations will have any likely harms on the participants...but, from what I know of that literature, they could probably pull that off.

But, that's not to suggest this was a minor omission. This hubbub is a good example of why these requirements exist. When people find out essentially by accident that they were subjected to a study (that still hasn't been adequately explained to them), that doesn't do great things for the relationship between the public and the research community. We already have enough problems with anti-science sentiments...
 
2014-07-03 11:53:57 AM  

Mi-5: Am I the only one who saw through her nonsense "visionary" talk?



On recognizing the importance of her connections and getting a head start on life, you're not alone.

For those without connections, Sandberg's schtick always impressed me as "work harder, not smarter." If you're an average schmoe with average connections, her prescription seems to be to put in 60-hour weeks and to do a lot of schmoozing. Your oppressors are putting in 60-hour weeks and schmoozing, right? (No, not my 60-hour weeks, and not my connections, look at your boss's connections, not at mine!) You're not working hard enough. Put in those 60-hour weeks, and when you get to the point where you can ask your subordinates to put in 80-hour weeks, you've won! You're not a workaholic burning herself out in order to make your founder incredibly wealthy, you're an empowered professional who's leaning in and discovering your passion! Ugh.
 
2014-07-03 12:27:30 PM  

The_Six_Fingered_Man: llortcM_yllort: poses negligible risks

What are the risks of potentially exposing an unstable person to nothing but negativity for a determinate amount of time?


A friend of my wife's did the math on the likelihood of someone in the study having committed suicide during the study. Based on the number of participants and the average US suicide rate, she estimated that 13 people would have killed themselves while being unwilling participants in the study (just on their own, without any involvement by Facebook or otherwise).

I wonder how many of those people's families are reading the study, looking at a calendar, and calling a lawyer this week?
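[Editor's note] The back-of-envelope calculation described above can be sketched in a few lines. The inputs here are assumptions for illustration: a participant count of roughly 689,000 (the figure reported in the published PNAS paper), a US suicide rate of about 12.6 per 100,000 per year, and a one-week study window. The thread's figure of 13 would require different assumptions (e.g. a longer window).

```python
# Rough expected number of suicides among study participants during the
# study window, assuming deaths are distributed uniformly across the year.
# All input figures are illustrative assumptions, not taken from the thread.

participants = 689_003          # assumed participant count (per the PNAS paper)
annual_rate_per_100k = 12.6     # assumed US suicide rate, per 100,000 per year
study_days = 7                  # assumed study duration of about one week

expected = participants * (annual_rate_per_100k / 100_000) * (study_days / 365)
print(f"expected suicides during study window: {expected:.2f}")
```

Under these assumptions the expectation comes out on the order of one or two, which shows how sensitive this kind of estimate is to the assumed rate and duration.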
 
2014-07-03 12:39:45 PM  

brimed03: They manipulated people without consent for research purposes.



Frankly, if you're being manipulated via the badger sputum on Farcebook, you should probably neuter yourself. Because we don't need more stupid.
 
2014-07-03 02:06:13 PM  

brimed03: The only question is why there isn't an equally successful, privacy-respecting alternative yet.


Because it's expensive to create and run, and the more "respect" you show your users in this regard, the less money you'll make to pay the bills.
 
2014-07-03 05:44:03 PM  

Twilight Farkle: Mi-5: Am I the only one who saw through her nonsense "visionary" talk?


On recognizing the importance of her connections and getting a head start on life, you're not alone.

For those without connections, Sandberg's schtick always impressed me as "work harder, not smarter." If you're an average schmoe with average connections, her prescription seems to be to put in 60-hour weeks and to do a lot of schmoozing. Your oppressors are putting in 60-hour weeks and schmoozing, right? (No, not my 60-hour weeks, and not my connections, look at your boss's connections, not at mine!) You're not working hard enough. Put in those 60-hour weeks, and when you get to the point where you can ask your subordinates to put in 80-hour weeks, you've won! You're not a workaholic burning herself out in order to make your founder incredibly wealthy, you're an empowered professional who's leaning in and discovering your passion! Ugh.


Bingo.  It's basically an executive at a giant corporation telling you that you should feel honored to have the privilege to be a slave for a giant corporation; i.e. it would be a lot cheaper for us if you could work more for the same pay, thanks!

Also, I still am pissed off about the whole "can women have it all?" BS.  Why does it have to be just women that can't have it all?  Both males and females are subject to the same primary resource constraint: time.  No one can have it all, frankly; you can either have a little bit of everything, or a whole lot of a few things.  Not to sound dystopian, but all of this crap (including the #yesallwomen crap) seems like a way for corporate overlords to erect false barriers between the proletariat so that they remain in-fighting against one another rather than taking the fight to bourgeoisie's door where it rightfully belongs.

/hmmm, that might have been a little stream of consciousness
 
2014-07-03 09:12:23 PM  

Mi-5: Sheryl Sandberg is a smarmy, smug, hypocritical biatch.  She has built this BS public persona as some "visionary" who is a success at business and tells women they can do it too, conveniently leaving out key facts like:
1. She's from a rich family
2. She was always a step ahead due to family
3. She's been afforded great connections, which she used (despite not having a business success track record) to get even more connected.

She parlays silly claptrap like "lean in" and "be bossy", when in reality, she wouldn't be anywhere without a guy (Larry Summers) bringing her along to ride his coattails and getting her connected with the right people.

She's a fraud.  Maybe I'm the only person who sees this, but that garbage 60 Minutes fawning piece about her sealed the deal in my book.  Am I the only one who saw through her nonsense "visionary" talk?


She is a grade A coont.
 
Displayed 50 of 50 comments



This thread is archived, and closed to new comments.
