Tech & The Web

Facebook Whistleblower Slams Platform, But Who’s to Blame?


Senators received an earful from a Facebook whistleblower who accused the social media giant of placing profit above public safety.

A former employee turned Facebook whistleblower testified before the Senate, urging senators not to trust the company.

Frances Haugen accused Mark Zuckerberg and his company of knowingly pushing products harmful to children and young adults to pursue endlessly growing profits.

“I am here today because I believe that Facebook’s products harm children, stoke division, and weaken our democracy,” she said during her opening remarks. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won’t solve this crisis without your help.”

What kind of help can Congress provide? That’s the question no one can really answer so far.

The trouble with younger users

I’m sure there are adolescents whose body image issues may get worse on Facebook and Instagram. We compare ourselves against our peers in real life. Why wouldn’t we fall into that same trap on social media? No one can reasonably expect otherwise.

The Facebook whistleblower said the company has research that confirms young girls have experienced body image concerns after looking at pictures of others on Instagram.

The answer is obvious. Facebook and Instagram should simply ban anyone under 18, right?

But wait a second. Facebook and Instagram do have an age limit. The company does not allow anyone under 13 to use its platforms. Thirteen still seems a bit young to me. Sixteen seems more reasonable.

Either way, you can argue that one can change their date of birth to appear older than they are.

But almost no one I know is willing to submit a photocopy of their driver’s license or birth certificate to the company to verify their age.

A Facebook executive appeared on CBS Mornings Wednesday morning. Anchor Nate Burleson asked what she would say to the parent of an 11-year-old girl who may suffer from body image issues because of pictures she sees on Instagram.

But wait: What’s an 11-year-old doing on Instagram?

The solution to the age of its users seems to fall on parents. I know several parents of teenagers who didn't allow their children to have an account until they were at least 16 or 17. (Yes, they periodically checked their children's devices to make sure they weren't trying to get away with anything, too.)

Is it fair to expect parents to watch social media, too? I don’t know that “fairness” is a realistic question here. Parents have a responsibility to protect their children. Unfortunately, if you accept that social media can cause harm, how can you not add social media as an area in which parents must look out for their kids?

Does Facebook try to divide us?

Since part of my real job involves managing social media accounts, I hear the “promoting division” complaint all the time. People accuse various pages of trying to “stir the pot” any time a page simply asks a question. The question doesn’t have to even be particularly controversial.

Something as simple as, “Do you love or hate candy corn?” can be an example, in some people’s minds, of attempts to divide us.

That tells me some people don’t understand how social media works.

To be a success on Facebook, you need interactions. Interactions include any of the seven "Reactions": like, love, care, haha, wow, sad or angry. Interactions also include comments and shares. Publishers — whether they are media outlets or bloggers — want clicks from Facebook. Those happen when someone clicks a link on a Facebook page to go to the publisher's website.

Of course, Facebook would prefer people stay on their platform, not a publisher’s site.

The easiest way to get interactions sometimes is to ask a question. Bloggers have been receiving that advice for years now: Ask a question to get comments. It rarely works on blogs, but it does still work on Facebook…some of the time.

I’ll give you an example.

I’d ask you to consider three posts, each with different questions and topics, as examples. The first receives 75 reactions, 125 comments and 10 shares. The second receives 50 reactions, 130 comments and 30 shares. The third receives 150 reactions, 60 comments and no shares.

Without a need to see the actual topics or the types of comments, what do we know?

That all three posts received 210 interactions each.
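The arithmetic behind that answer can be sketched in a few lines. This is only an illustration of the sum described above (reactions plus comments plus shares); the post data and function name are made up for the example, and Facebook's real metrics are not public.

```python
# Hypothetical example data matching the three posts described above.
posts = [
    {"reactions": 75, "comments": 125, "shares": 10},
    {"reactions": 50, "comments": 130, "shares": 30},
    {"reactions": 150, "comments": 60, "shares": 0},
]

def total_interactions(post):
    """Sum a post's reactions, comments, and shares."""
    return post["reactions"] + post["comments"] + post["shares"]

totals = [total_interactions(p) for p in posts]
print(totals)  # [210, 210, 210]
```

Different mixes of reactions, comments, and shares, same total: that's the point of the example.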

Depending on how well an “average” post on your page performs, that might be a good number.

Facebook’s non-human algorithm sees posts that get a lot of reactions and shows them to more people, hoping to increase the number of interactions.

I seriously doubt the algorithm attempts to examine the tone of the comments. I doubt any effort is made to determine when comments are getting out of control. That's what page owners, in their moderation role, are supposed to do.

When a page I manage asks a question, it doesn’t honestly matter to me whether I get 210 interactions that all agree or disagree. It does matter whether people are disrespectful and rude to one another or to the page itself. As a moderator, I have various tools at my disposal to address that, including banning people from being allowed to participate on the page at all.

The algorithm isn’t going to be able to do that. There’s too much content and too many interactions. Too many happen too quickly for an algorithm to keep up with.

A question on a controversial topic can prompt rudeness and bitter arguments. It can also provide at least some interesting debate. If you're a page owner, you're going to want to inspire discussions. You'll want to step in and stop personal attacks and other unacceptable behavior. But you'll want to have those interactions.

You’ll want people to treat each other with respect. You’ll want to build a community where there can at least be friendly discussion rather than constant yelling. But you’ll want to have those interactions.

You'll want to be a success on Facebook. But just like a football player, you'll have to play your game on your gridiron. If you step onto a baseball field, you'll have to play a different game. You'll have to play Facebook's game on Facebook.

Until Facebook changes its rules, that’s the game page owners have to deal with.

Ideally, people would simply behave better.

They won’t. Many actually think the snarkier they are, the more they get noticed.

But is that entirely Facebook’s fault? It seems to me that at least a good portion of the problem with Facebook falls upon Facebook’s users, not the platform itself.

We don’t have to pick fights over every little thing. We don’t have to smart off to complete strangers. It should go without saying, but we also shouldn’t feel the need to step onto someone else’s porch and start trouble, either.

It’s a cop-out to claim that the users should get a free pass and the platform should shepherd them into “proper” behavior. The people who would make that claim surely do not want to be shepherded. But that’s the problem: they don’t think they’re the problem, do they?

We like to bully big businesses; we tend to like underdogs. But some of the same people who want stricter government regulation of social media were slow to support it in other industries.

No matter how much regulation gets proposed, it comes down to this: if we want to improve Facebook, maybe we should improve ourselves first. Maybe we should check our anger, our outrage, and the various chips on our shoulders at the door before we log in.

Based on what the Facebook whistleblower said, did your opinion of Facebook and Instagram change?


Patrick is a Christian with more than 30 years' experience in professional writing, producing and marketing. His professional background also includes social media, reporting for broadcast television and the web, directing, videography and photography. He enjoys getting to know people over coffee and spending time with his dog.