mOp-Ed: Leveraging Addiction to Attack Freedom

by Mark Oppenheim

Facebook and other social media platforms sell their ability to direct our eyes to content. These platforms have been accused of promulgating, and profiting from, racism and disinformation shaped to sow harm and division in America and to feed authoritarianism around the world.

In response, social media companies and their spokespeople, most prominently Facebook CEO Mark Zuckerberg, argue that they provide neutral free-speech platforms, and they position themselves as defenders of freedom.

Directing our eyes to particular content and advertising is a purposeful act, not a neutral one, so the neutrality part of their argument is untrue. If you can purposefully build products to direct attention in particular directions, you can just as purposefully build products to direct attention away from racism and disinformation.

The algorithms at the core of social media platforms translate language into math in order to serve us content that triggers the brain's engagement and addiction responses. We are given an experience that changes our brain chemistry in ways we want to repeat, and then we are given the option to repeat it. The neutrality argument is not just untrue; it is disingenuous, because social media employees know their job is to steer our experience in ways that make us view, click and refresh pages so their companies can sell more advertising.
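To make that mechanic concrete, here is a minimal sketch in Python of an engagement-ranked feed. The field names, weights and scores are invented for illustration and are not any platform's real model; its only point is that the ranking objective is a human choice, not a neutral fact.

```python
# A toy illustration, not any platform's actual code: rank posts by a
# predicted-engagement score, so whatever best holds attention rises
# to the top of the feed. Field names and weights are hypothetical.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    predicted_clicks: float  # hypothetical model output: expected engagement
    outrage_score: float     # hypothetical feature that correlates with clicks


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts purely by predicted engagement.

    Nothing here is neutral: a person chose this objective, and a
    different objective (say, one that penalizes outrage instead of
    rewarding it) would produce a different feed.
    """
    return sorted(
        posts,
        key=lambda p: p.predicted_clicks + 0.5 * p.outrage_score,
        reverse=True,
    )


if __name__ == "__main__":
    feed = rank_feed([
        Post("calm explainer", predicted_clicks=0.2, outrage_score=0.1),
        Post("divisive hot take", predicted_clicks=0.6, outrage_score=0.9),
    ])
    for post in feed:
        print(post.text)  # the divisive post prints first
```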

But misdirection by certain social media executives is not the key problem we are facing.

We each face the same dilemma as the addicted smoker, the tobacco farmer, the advertising professional and the tobacco company employee, all of whom are well aware that smoking tobacco harms us. We all know that tobacco companies make their products as enticing and addictive as possible to consumers young and old, who suffer lifelong health consequences or die. The companies and their employees know this. And every one of us knows that these companies drag out both the revelation of truths about harm and the regulation of their activities.

Just as no tobacco company executive is selected primarily because they care about public health, no social media company leader, employee or spokesperson is chosen primarily because they care about civil society or free speech. Those employed have their jobs because they (we) know how to use tech to create products and experiences that generate profit; they (we) know how to work the levers of power to advance their company’s interests; and they (we) are willing to exercise our skills in this way.

It is up to all of us, the user, the employee and the executive, to decide whether everything that generates profit is worth doing. If some uses cause harm, then we must break our addiction and stop those uses. If we feel that what we are doing is harmful, then we need to stop, think about our role in promulgating harm, and do some things differently. We may need to place certain activities off-limits, including through the force of law and regulation.

This will be a balancing act with each path full of tradeoffs, but we can recognize harm when we see it, recognize our personal responsibility for that harm… and we can change our behavior.

Hi. My name is Mark and I am a social media user, creator and addict.

What comes next is up to us.

____

Mark Oppenheim is a nonprofit wonk who runs nonprofit search and media organizations.

mOp-Ed pieces reflect diverse opinions about the nonprofit world, and we welcome yours. If you would like to be a guest writer for mOppenheim.Org, please contact us for more information.
