
Algorithms, Social Power, and Coercion


[Image: symbol-covered plastic ribbons pulled in all directions out of a metal cage.]

I changed my professional name to get better search results (my first name is Elizabeth, my middle name is Cargile). I’ve carefully curated a LinkedIn presence, because I need the platform to get a job I care about.


Each of these decisions is working out as I hoped, yet each feels almost coerced. Why do they feel so fraught?


It feels like I’m stuck in something like a value hostage negotiation: there’s a direct conflict between my long-term values and the steps I need to take to realize them. I’m aware of that conflict, but that awareness doesn’t change my actions.


  • I want to create good content, but I worry about the number of impressions and why the algorithm seems to promote some posts but not others.

  • I want a good work/life balance, but I’m dedicating significant amounts of time and energy to a platform that gives me a good bit of anxiety.

  • I want to find a workplace where I can be valued for who I am as a person, but I’m making seemingly arbitrary decisions about how to present myself to appear hirable.

  • I want to make genuine connections with people, but I’m also making a large number of superficial connections to seem legitimate enough to get a job.


LinkedIn does make some valuable things possible: connecting more deeply with other people through in-depth text conversations and virtual meetings set up through the platform. I genuinely like and value the people I’ve met there.


Why do I feel like I’m stuck in a kind of hostage situation?


LinkedIn is the only real option I have for effective professional networking around jobs. To make it work for me, to secure my basic survival and a path to the good life, it seems I have to give up on at least some of my deeply held values. My wellbeing hangs in the balance, and the quickest route to long-term wellbeing seems to run through short-term sacrifices of it. I don’t know when I get to step off that treadmill.


These kinds of value hostage negotiations happen fairly often, especially in digital spaces where an impersonal, opaque algorithm determines how well we’re able to meet our basic goals. Think about TikTok creators whose income dramatically drops when the algorithm suddenly stops showing their content, or professional YouTubers who very carefully avoid making certain kinds of content, not because it’s inappropriate but because the platform won’t promote it.


If I just had to convince other people on LinkedIn that I was worth hiring through direct, human-to-human conversations, that would be relevantly different. There’s always some social pressure in those interactions, but they’re at least connections with real people. However, something feels weirdly coercive about the broader LinkedIn algorithm and structure: I feel like I have to feed the LinkedIn beast several times a week to keep the platform happy.


The problem isn’t merely the algorithm. As soon as you have a large, gatekeeping service that has a rough monopoly in an area, I think you’re going to start running into deep problems of autonomy, coercion, and responsibility, especially when algorithms are involved. The necessity of basic survival or access to a meaningful life can lead to greater addiction to apps and services and influence us to more readily give up on our values to appease the invisible algorithmic hand.


At the end of the day, I feel alienated, as if I lack the integrity to really stick to my values. The problem is, I can’t readily achieve those values without engaging in at least some of the things that contradict them. Either way, I have to give up on at least some of the things I deeply care about.


I’m not leaving LinkedIn anytime soon, but I am going to try to find ways to post the content I care about and limit my time feeding the LinkedIn beast (I’m no longer logged in on mobile, and I’ve changed the password to something long and complicated that I can’t remember).


The bigger point is that the moral issues algorithms raise are directly impacted by the social power they have. Concerns about algorithmic fairness take on a new dimension when that algorithm is an essentially inescapable tool for meeting basic needs.


We need to think about how to build algorithms that more closely track what we value, but we should also think about diversifying platforms and algorithms so we don’t get stuck with only one viable option. Otherwise, we’ll continue changing ourselves and giving up on our key values to appease impersonal machines.


Have you felt similarly about LinkedIn or another online platform? How do you stay dedicated to your core values in these kinds of value hostage negotiations?


Photo Credit: DeepMind


