WHISTLEBLOWER Frances Haugen, in her testimony before the US Senate, submitting documents to back her allegations, claimed that Facebook “amplifies hate, misinformation and political unrest.” For teenagers, whose minds are still malleable, this has terrible consequences.
The accusation is true, and I’ll explain here how I discovered this myself in my Facebook (FB) account, and how you can arrive at the same conclusion just by studying your own.
If you’ve been using FB, I’m sure that you have wondered how, out of 6 billion Facebook posts daily by users around the world, 152 million by Filipinos*, the platform provides you with only 20, at most 30, new posts in your news feed every day.
The answer is that Facebook employs artificial intelligence software that chooses which of those billions of posts appear in your news feed. First of all, because FB tracks your IP address, it easily determines that you are using FB from the Philippines. It therefore filters in posts mainly coming from the Philippines, or else you’d get bored with FB if you kept getting updates from Iceland or Uganda. Filling out the “About” section, e.g., marital status, gender, etc., gives it more parameters on what kind of posts you’d likely want.
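To make the idea concrete, here is a toy sketch in Python of the kind of coarse, first-pass filtering just described. Every name, field and rule here is invented for illustration; this is not Facebook's actual code, whose real criteria remain secret.

```python
# Toy sketch of coarse feed filtering by inferred country and profile interests.
# All names and fields are hypothetical illustrations, not Facebook's real system.

def coarse_filter(posts, user):
    """Keep posts likely to be relevant: same country as the user
    (inferred from the IP address), or matching a declared interest."""
    relevant = []
    for post in posts:
        # Prefer posts from the user's own country...
        if post["country"] == user["country"]:
            relevant.append(post)
        # ...but let through foreign posts matching "About"-section interests.
        elif post["topic"] in user["interests"]:
            relevant.append(post)
    return relevant

user = {"country": "PH", "interests": {"journalism"}}
posts = [
    {"id": 1, "country": "PH", "topic": "politics"},
    {"id": 2, "country": "IS", "topic": "geology"},      # Iceland, no match: dropped
    {"id": 3, "country": "US", "topic": "journalism"},   # matches an interest: kept
]
print([p["id"] for p in coarse_filter(posts, user)])  # [1, 3]
```

A real system would of course score rather than hard-filter, but the effect on what you see is the same: posts with no inferred connection to you simply never arrive.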
I noticed this years back when I started using FB (in 2010) since, unlike most people’s, my FB friends come from vastly different backgrounds, from different periods of my life.
These phases of my life in reverse chronological order include friends from my Lourdes School and Ateneo High School days, from my years with the communist movement and at UP, in journalism (Business Day, Manila Chronicle, Inquirer to the present), when I was a “seeker” in meditation and Eastern mysticism groups (I was with yoga, Zen and Osho groups here), in government (President Arroyo’s spokesman and chief of staff), and now as a columnist.
Of course, many of my FB friends are relatives, or relatives of relatives, as well as new acquaintances made where I live. I haven’t met most of my FB friends since, for a time, I routinely approved their friend requests, probably sent because they knew about me through my thrice-weekly column.
How does FB filter which of my friends’ posts appear in my news feed? FB keeps that, its “algorithm,” a secret that even the whistleblower was unable to uncover. However, I, or anyone, can easily detect that algorithm at work.
Many, if not most, of my FB friends from the communist movement, even those who have lived totally bourgeois lives, have retained their 1970s zeitgeist, their overall view of life and society, which since 2016 has taken on an anti-Duterte stance, with the usual Yellow set of biases.
What happened in my FB account is that I have seldom, perhaps never, clicked the “like” or “share” button on their anti-Duterte posts, since I have been, within the limits of my journalistic objectivity, a supporter of this administration.
The FB algorithm detected this, so that very few anti-Duterte posts appear in my news feed, while many pro-Duterte ones do (for example, from the now famous Facebook wit Darwin Cañete and the intriguing Maria Bratikova), since I “liked” or “shared” their posts on this political issue.
In fact, even before this whistleblower news broke, I had often wondered why posts from friends (even close ones) weren’t appearing in my timeline, so I went into their FB accounts. Lo and behold, they had been very active, but mostly posting anti-Duterte material which the FB algorithm decided I wouldn’t like, so it didn’t appear in my news feed.
The genius of the FB algorithm is that the posts it allows into my news feed aren’t, in my case, all pro-Duterte or all about politics every day. Otherwise, I’d get bored and stop using it. Once in a while, anti-Duterte posts appear, which I often comment on to debunk. Once in a while, of course, posts by FB friends from the different phases of my life appear, or posts from FB users around the world on something interesting to me. This FB algorithm, though, is, to use academic philosophy’s term, “transparent”: like glass, it is invisible, yet it is a powerful mechanism that chooses which posts appear in your timeline.
Why does FB do this? To keep me, to use the social media term, “engaged,” which means I interact with FB more (“liking,” “sharing” and making my own posts), so I stay in my FB account longer and get to see more paid ads, FB’s main source of profit, which totaled $29 billion in 2020. The more I am entertained, the longer I use FB, and the bigger Zuckerberg’s profits become.
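The mechanism I am describing, likes and shares feeding back into what you are shown, can be sketched in a few lines of Python. This is a minimal illustration under the assumption of a simple per-author “affinity” score; the author names are invented, and Facebook's real ranking is far more elaborate and undisclosed.

```python
# Minimal sketch of engagement-based feed ranking: every like or share
# raises the author's affinity score, and the feed shows the top scorers.
# Illustrative only; not Facebook's actual (secret) algorithm.
from collections import defaultdict

affinity = defaultdict(float)  # author -> learned preference score

def record_engagement(author, action):
    # A share is a stronger signal of interest than a like (an assumed weighting).
    affinity[author] += {"like": 1.0, "share": 2.0}[action]

def rank_feed(candidate_posts, top_n=3):
    # Surface only the posts by authors the user has engaged with most.
    return sorted(candidate_posts, key=lambda p: affinity[p["author"]],
                  reverse=True)[:top_n]

# The user likes and shares one author's posts, and likes another's once.
record_engagement("pro_duterte_wit", "like")
record_engagement("pro_duterte_wit", "share")
record_engagement("college_friend", "like")

posts = [
    {"author": "pro_duterte_wit", "text": "post A"},
    {"author": "anti_duterte_friend", "text": "post B"},  # never engaged: sinks
    {"author": "college_friend", "text": "post C"},
]
print([p["author"] for p in rank_feed(posts, top_n=2)])
# ['pro_duterte_wit', 'college_friend']
```

Notice that the never-engaged author drops out of the feed entirely, exactly the fate of my anti-Duterte friends’ posts in my own account.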
FB isn’t really “social networking.” It is a mix of entertainment and an information- and opinion-delivery system, designed by its algorithm to make me spend time on it.
On the surface, there’s nothing wrong with that: FB entertains, and it gets paid through ads. The colossal problem is this: such a filtering system magnifies and hardens an FB user’s views, which, as the whistleblower pointed out, is detrimental to a rational, democratic society and polarizes it.
For instance, my pro-Duterte views are supported and magnified as I keep getting that kind of political opinion and data. On the other hand, FB would have tracked, for instance, an Ateneo college student’s “like” of an anti-Duterte post, so that he gets mostly anti-Duterte views, with little chance of even seeing my pro-Duterte ones. A Red FB user would start to think that revolution is just around the corner, as most posts he gets foment revolution.
Investigators of the January 6 “People Power” at Capitol Hill are in fact finding that most of the participants were convinced revolution was just around the corner, as their FB accounts had been flooded with posts by like-minded idiots. The “pro-democracy” students rioting in Hong Kong many months ago thought that the Chinese Communist Party was about to fall, and that all Hongkongers were behind them.
Look at your FB timeline and news feed: Don’t you notice that most posts are agreeable or entertaining to you? Experiment: pick a topic you’re not at all interested in, say vegetarianism. Find a Facebook account dealing with it and click its posts, or better, “follow” it. Do this several times, and you’ll soon get a lot of posts on vegetarianism. The FB algorithm, though, seems clever enough to detect such probing of its filters.
Or go to an FB friend’s account and click many of his or her posts that you aren’t interested in or, in the case of political views, that are opposite to yours. Sooner or later, you’ll get such posts in your account.
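The feedback loop this experiment exposes can be shown with simple arithmetic. In the sketch below, a topic’s share of your feed is proportional to a weight that grows with every click; the starting weights and the one-point-per-click rule are assumptions for illustration, not Facebook’s real parameters.

```python
# Sketch of the click feedback loop: each click on a topic raises its
# weight, so its share of the feed grows. Numbers are illustrative only.
weights = {"politics": 5.0, "vegetarianism": 1.0}  # hypothetical starting interests

def feed_share(topic):
    """Fraction of the feed devoted to this topic (weight / total weight)."""
    return weights[topic] / sum(weights.values())

def click(topic):
    # Every click is read as a signal of interest.
    weights[topic] += 1.0

print(round(feed_share("vegetarianism"), 2))  # 0.17 before the experiment

# Click ten vegetarianism posts, as the experiment suggests...
for _ in range(10):
    click("vegetarianism")

# ...and vegetarianism now dominates the feed.
print(round(feed_share("vegetarianism"), 2))  # 0.69 after ten clicks
```

Ten clicks take a topic from a sixth of the feed to over two-thirds; that is the magnifying effect at work, whatever the topic happens to be, vegetarianism or revolution.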
What the whistleblower Haugen claimed is that FB’s own research long ago found these detrimental effects on users’ minds, but the company has chosen to do nothing about it, as it values profits more than people’s well-being.
The very bad news is that FB has become so crucial to our lives, and to the Western world’s, as was demonstrated when it suffered a six-hour outage that disrupted not just businesses that rely on it but, well, people’s psyches. Especially during the pandemic, we have relied on it to keep in touch with our friends, real or virtual, and to get information on what’s happening in the world and in the country.
This is what’s so despicable about FB CEO Mark Zuckerberg who issued a blanket denial of the accusations.
For God’s sake, any of FB’s 1.7 billion users, with a little of the study or experimentation I have outlined, will realize how it is molding his or her mind, magnifying hates and loves, through its algorithm. Only Zuckerberg, with his billions of dollars, has the resources, and the responsibility, to correct such threats to people’s rationality and society’s democracy. But it appears he doesn’t want to.
*76 million FB users as of 2020, multiplied by two (average posts per day)