13 November 2018

Are we marionettes?


Yesterday I watched a documentary about Facebook and learned that the company does not only collect data from its own platform, but also buys data from other companies and organizations. That means they have far more information about you than you suspect. They know things you do not know yourself, or never wished to share (such as data from credit-scoring companies that includes your shopping patterns). The trouble is not just that Facebook uses that data for its ads; FB also sells access to that data, in very insecure ways. Hence the scandal with Cambridge Analytica.
And if you think that problem is solved, think again. Yes, FB asked them to erase the data, and they complied, but only after Cambridge Analytica had run certain data-science analyses that let them keep the useful stuff as their own. That's why another company doing the same shaite has sprung up under a different name, with the same people as leaders.
But as you know, getting an ad to buy a book, shoes, tools, or pet supplies is not a big problem. The truth is, we have all learned how to ignore such ads. The problem is when Facebook and its data are used to manipulate you.

First, you have to stop fooling yourself that FB will solve that problem. They cannot. The root of the problem is the two main algorithms at the basis of their business model: an algorithm that controls what you see in the news feed, and an algorithm that chooses the audience for an ad. The Cambridge Analytica scandal was connected only with the ad algorithm, and FB offered a patch. They stopped masking political ads as elements of the news feed. Now we can tell that a political ad is an ad, instead of a post that you would supposedly like to see. But that is all they changed.
Oh, they did introduce some new rules for anyone trying to post a political ad, but they will not, and cannot, remove the underlying cause. It is tied to the source of their income.
The more significant problem that remains is connected with the news feed algorithm. That algorithm is designed to determine and show you the posts you are most likely to react to. You see in your news feed whatever the algorithm thinks will cause the most immediate emotional reaction from you. That's how FB gets you hooked on the feed, makes you come back, and charges for the ads that appear somewhere on your landing page.

FB needs an algorithm that provokes you emotionally and keeps you coming back, so that they can serve you more ads and earn more money. Your attention is what they sell to the companies and individuals who buy advertising space on FB. They will not change that. It would kill their company.

It is not surprising that 'extremization' of users on social media has emerged as a problem. It is not surprising that FB was used heavily to incite and sustain genocide in Myanmar. The more provoking a post is, the more emotionally outrageous, the more likely it is that you, the user, will react to it. The consequences of the algorithm for the well-being of users and society are irrelevant from FB's business point of view. And because FB is a publicly traded company whose primary duty is to increase profit for its shareholders, its business model will not change. It works. It fulfills its primary function. It brings money to shareholders. Thinking that any publicly traded company has any purpose other than increasing profit is simply naive.
So the algorithm pushes extreme posts and manipulates you emotionally so that you keep coming back, and at the same time it makes you more extreme. The algorithm does not want you to think critically, because if you think, you will not make knee-jerk reactions and keep returning to the news feed.
That's why the news you get from Facebook is most likely fake. If news does not come from a reputable source, it is a manipulative post made only to provoke your emotions and control your actions.

I learned about Facebook's news feed algorithm during 2016 and realized that I, too, had fallen victim to the manipulation. So I stopped getting my news from Facebook. I even stopped visiting Facebook on a regular basis. From several daily visits, I dropped to maybe once every two or three weeks.
Instead, I researched news sources and collected a whole bunch of reputable ones, domestic and international. And it works: this time I am not being manipulated. I use my brain, critical thinking, and several simple analytical methods to make sure that what I hear about is real.
There is another interesting consequence of my rare visits to FB. Now my news feed is full of posts from my contacts. The last two times I visited FB (to announce the publication of my sci-fi short story), I saw only one ad. A stark difference from my daily visits, when I saw mostly ads and maybe one post from my contacts.

Because the FB business model relies on emotional provocation, it discourages critical thinking. Hey, it discourages any thinking. And that is my biggest beef with FB. Social media platforms are making dummies of us.

Every user is manipulated on social media. Your attention is the commodity they sell. Any social media platform will adjust its design and what it shows you to keep you on the platform longer, so that it can serve you more ads. The moment you start thinking, you might break free and leave. And don't forget: the more you can control your emotions and use rational thought, the less likely you are to be manipulated. That's why they do not want you to use your brain.
If for some reason you desperately need to visit social media daily, next time I'll post a few tips on how to arm yourself against manipulation. But I have to warn you: using those tips will eventually reduce the time you spend on social media.

