Data Science in the Real World

The Ethical and Privacy Issues of Recommendation Engines on Media Platforms

Why we should pay more attention to the systems that serve up content.

Apr 24, 2019

Recommendation engines on media platforms are dominating our media decisions. Instead of letting the randomness of channel surfing decide our viewing fate, the choice is being made for us across digital platforms like YouTube, Facebook, and Spotify. According to a McKinsey report, 75% of Netflix viewing decisions come from product recommendations.

At face value, this equates to user convenience: the system recommends content that aligns with the interest profile it has built from your data. In reality, the dominance of recommendation systems raises serious ethical and privacy concerns.

Ethical Issue #1: Addictiveness

One ethical issue with recommendation systems is that they are built to be addictive. Their purpose is to capture and sustain user attention for prolonged periods of time. Take, for example, the autoplay features on YouTube and Netflix. Both queue up content tailored to your data profile and play it automatically to keep you hooked.

What’s the ethical dilemma with that? Isn’t that just a way to survive in this New Economy of Attention? At first blush, perhaps. But what about when global companies exploit human psychology to build addictive products, as Sean Parker, Facebook’s founding president, acknowledged in an interview with Axios:

“The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible.’”

Nir Eyal, who has taught at Stanford’s Graduate School of Business and wrote Hooked, a book on building habit-forming products, is calling for new ethical standards. Eyal argues that the same persuasion technique can be ethical or not depending on the circumstances: a “streak” on Duolingo that encourages people to keep learning a language seems acceptable, while a Snapchat “streak” meant to get teens to compulsively check the app falls into an ethical gray area. His test is this: if the platform is getting users to do something they don’t want to do, it’s no longer persuasion; it’s coercion.

While this may be a bigger issue in apps like Snapchat and Facebook, the concern is still pertinent to recommendation engines, which are often built to hook and sustain viewers’ attention. Netflix’s “Are You Still Watching?” prompt is one small counterweight that could be employed on other apps, such as YouTube, as well.

Ethical Issue #2: Extreme Content = Viewers’ Attention, But at What Cost?

The fight to capture and sustain users’ attention has led to another critical ethical issue with recommendation engines: the content served up may not actually be in users’ best interests and contributes to polarization. As Renée DiResta puts it in Wired, recommendation systems are “broken” and have become “The Great Polarizer.”

For example, YouTube’s algorithm tends to serve up increasingly radical content in an effort to keep you watching so that Google can make more money from ads. As Zeynep Tufekci puts it in The New York Times,

“YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”

Tufekci found that whether you started with slightly liberal- or conservative-leaning videos, YouTube ended up suggesting increasingly radical content, including supremacist and conspiracy-theory videos. Even nonpolitical videos, such as ones about vegetarianism, would lead to videos on veganism in an effort to keep the user hooked.

Tufekci ultimately compares YouTube to a fast-food restaurant: serving up volumes of sugary, salty food that makes us want to keep eating even though we’re already full.

This is an ethical issue for Google’s engineers because they know people go to YouTube for information on a wide variety of topics, and the algorithm can steer them toward inaccurate, radicalized, extreme videos.

Content creators are also affected, writes Guillaume Chaslot, an ex-Google engineer who has criticized the ethics of YouTube’s algorithm. Because increasingly radical content performs well, creators are incentivized to produce incendiary videos and posts to get eyeballs on their work.

“AI is not yet creating fake news and starting a war against the media per se, but it is incentivizing content creators and public figures to do so.”

The issue of recommendation systems leading to increasingly radical content isn’t limited to YouTube’s algorithm. Facebook is considering changing its News Feed recommendation system to promote Facebook Groups more heavily, which could be a mistake, according to BuzzFeed News, as these Groups can turn into microcosms of extremism.

Security researcher Renée DiResta told BuzzFeed News that Groups already act as a target for bad actors:

“Propagandists and spammers need to amass an audience, and groups serve that up on a platter. There’s no need to run an ad campaign to find people receptive to your message if you can simply join relevant groups and start posting.”

Facebook changing its algorithm to focus on Groups more, not less, is an ethical gray area. On one hand, people are posting more in Groups, so promoting them makes sense for capturing attention and for the performance of the platform. On the other, bad actors often take advantage of Groups specifically (BuzzFeed News also states that Russian hackers have repeatedly made Facebook Groups a focus), so the change could leave users more susceptible to those bad influences.

DiResta suggests a few ways to mitigate this issue. The first is to create a recommendation system that points people in the opposite direction, toward content that aims to de-radicalize the audience. Giving users more control over their algorithmic filtering is another option; on Twitter, for example, you can filter out content from low-quality accounts.

Another option is for algorithms to take into account the quality of the content and its source so that low-quality content isn’t recommended. This would “nudge” users toward higher-quality content, which DiResta likens to putting apples, rather than potato chips, at the front of the lunch line.
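
To make the “nudge” idea concrete, here is a minimal Python sketch of re-ranking candidate videos by a source-quality score. The field names (relevance, source_quality), the quality floor, and the 0.7/0.3 weighting are illustrative assumptions, not any platform’s actual formula.

```python
# Hypothetical sketch: re-rank candidate videos by blending relevance with
# a source-quality score. All field names and weights are invented here.

def rerank(candidates, quality_floor=0.3, relevance_weight=0.7):
    """Drop very low-quality sources, then blend relevance with quality."""
    kept = [c for c in candidates if c["source_quality"] >= quality_floor]
    return sorted(
        kept,
        key=lambda c: relevance_weight * c["relevance"]
                      + (1 - relevance_weight) * c["source_quality"],
        reverse=True,
    )

candidates = [
    {"title": "Measured explainer", "relevance": 0.6, "source_quality": 0.9},
    {"title": "Clickbait conspiracy", "relevance": 0.9, "source_quality": 0.1},
]
print([c["title"] for c in rerank(candidates)])  # only the explainer survives
```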

According to NBC News, a YouTube spokesperson said the company has shifted the algorithm to focus on “satisfaction” instead of “watch time,” so comments, likes, dislikes, shares, and surveys are taken into account. The algorithm has also been changed to give more weight to the authority of a video’s source.
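
The reporting describes this shift only at a high level, but a toy sketch helps show the idea: blending explicit feedback and survey responses into a ranking signal instead of optimizing raw watch time alone. Every signal name and weight below is an assumption made up for illustration; YouTube has not published its formula.

```python
# Toy illustration of a "satisfaction" signal vs. raw watch time.
# Signal names and weights are invented for this sketch.

def satisfaction_score(watch_minutes, likes, dislikes, shares, survey_rating):
    """Blend engagement signals instead of optimizing watch time alone."""
    engagement = likes - 2 * dislikes + 3 * shares      # explicit feedback
    survey = (survey_rating - 3) * 10                    # 1-5 survey, centered
    return 0.2 * watch_minutes + 0.5 * engagement + 0.3 * survey

# A long-watched but heavily disliked video can rank below a shorter,
# well-liked one under this kind of metric.
print(satisfaction_score(45, likes=2, dislikes=30, shares=0, survey_rating=2))
print(satisfaction_score(12, likes=40, dislikes=1, shares=5, survey_rating=5))
```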

Privacy Concern: Personal Data Collection

Personalized recommendations typically require collecting personal data for analysis, which exposes users to privacy violations. According to a 2018 research paper published in Engineering, the data “undesirably discloses the users’ personal interests to the recommender.” The data can also be sold to third parties without user consent; take, for example, the Facebook/Cambridge Analytica scandal. A third issue is that platforms can be hacked and users’ personal data leaked, which has happened to Facebook (and other platforms) several times.

It is essential for platforms to develop privacy-preserving practices to avoid such violations. The 2018 research paper recommends cryptographic and randomization techniques to protect private data. In tandem with these methods is user grouping, which clusters members by similar traits and removes personally identifying data. That way, the characteristics that matter to platforms and advertisers are retained without sacrificing individual privacy.
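
As a rough illustration of what those two ideas can look like in practice, here is a minimal Python sketch of (1) a randomized-response style perturbation that gives an individual preference plausible deniability, and (2) user grouping that reports only cohort-level aggregates with identifying fields dropped. The parameter values and cohort rule are assumptions made for the example, not the specific methods proposed in the paper.

```python
# Minimal sketch of two privacy-preserving ideas: randomized response and
# user grouping. Probabilities, field names, and the cohort-size rule are
# illustrative assumptions only.
import random
from collections import defaultdict

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true preference with probability p_truth, else a coin flip."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def group_users(users, cohort_size=3):
    """Bucket users by a shared trait and keep cohorts only if large enough."""
    cohorts = defaultdict(list)
    for user in users:
        cohorts[user["favorite_genre"]].append(user)
    # Return only aggregate counts; names and other identifiers never leave.
    return {
        genre: len(members)
        for genre, members in cohorts.items()
        if len(members) >= cohort_size
    }

users = [{"name": f"user{i}", "favorite_genre": "documentary"} for i in range(4)]
print(randomized_response(True))
print(group_users(users))  # {'documentary': 4} -- no individual identities
```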

Recommendation Engines: Dominating The Future of Media

As the media landscape is becoming increasingly dominated by social media and streaming platforms, the algorithms that form recommendation systems play a monumental role in what people watch. The creators of these systems need to understand and mitigate the ethical and privacy issues threatening consumers.


Digital marketer by day. Wedding band musician by night. Grad student specializing in audience analytics at University of Florida.