Monday, September 2, 2024

When "Less" Personalization is a Good Thing

Recommendation and personalization algorithms almost always use a user's past behavior as a guide when predicting which content to show. That is very useful--and creates ad efficiency--for sellers of specific products and services purchased by specific consumers.


But in some instances the practice is not entirely beneficial.


When algorithms leverage user data (search history, clicks, purchases, interactions, and dwell time) to tailor results and recommendations, they can also create echo chambers and filter bubbles around ideas, news, and information that people need as citizens, not as consumers.


That might not be an issue for advertisers selling niche, specialty, or inherently targeted products, or for the users who have those interests. Suppliers selling products for surfers--and surfers themselves--might not care at all about echo chambers or filter bubbles.


The issues are more acute for news and information related to citizens rather than consumers. In such cases, reliance on past behavior can mean that users are exposed to a limited range of information that aligns with their existing beliefs and preferences. Arguably, that is unhelpful for civic life.


So filter bubbles and echo chambers arguably are not much of an issue for advertisers. The same cannot be said for news and information providers whose products are supposedly designed to inform the public, deal with truth, and do so in fair and balanced ways.


Selected studies, listed as study name (date, publishing venue), with key conclusions:

"The Filter Bubble: What the Internet Is Hiding from You" (2011, book by Eli Pariser): Argues that personalization algorithms can isolate individuals from diverse perspectives, limiting exposure to diverse information, reinforcing pre-existing beliefs, and creating a "filter bubble."

"The Effect of Algorithmic Personalization on Political Polarization" (2018, Proceedings of the ACM on Web Science): Finds that algorithmic personalization can exacerbate political polarization by exposing users to content that reinforces their existing beliefs.

"Algorithmic Fairness in Recommender Systems" (2019, IEEE Transactions on Knowledge and Data Engineering): Examines the potential for bias in recommender systems and proposes techniques to mitigate bias.

"The Impact of Algorithmic News Personalization on Political Polarization" (2020, Proceedings of the ACM on Human-Computer Interaction): Investigates how algorithmic news personalization can affect political polarization and engagement.

"Through the Newsfeed Glass: Rethinking Filter Bubbles and Echo Chambers" (2022, NCBI): Most empirical research found little evidence of algorithmically generated informational seclusion; people online do engage with information opposing their beliefs.

"What Are Filter Bubbles and Digital Echo Chambers?" (2022, Heinrich Böll Foundation): The role of algorithmic curation in creating bias is limited; user vulnerability to a lack of diverse content depends more on motivation and the broader information environment.

"Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption" (2020, MIS Quarterly): Increased Facebook use was associated with increased information-source diversity and a shift toward more partisan sites in news consumption.

A scientific study from Wharton on personalized recommendations (summarized on Wikipedia): Found that personalized filters can create commonality, not fragmentation, in online music taste.

"How Algorithms Create and Prevent Filter Bubbles: A Theory of Refracted Selective Exposure" (2015, Journal of Communication): Algorithms can both reinforce and mitigate filter bubbles; the extent to which they do depends on the design of the algorithm and users' existing preferences.

"Breaking the Echo Chamber: Mitigating Selective Exposure to Extreme Content" (2017, Proceedings of the ACM): Echo chambers can be mitigated by introducing diverse content in algorithmic recommendations, though this depends on user engagement with such content.

"Exposure to Ideologically Diverse News and Opinion on Facebook" (2015, Science): Personalization on Facebook does expose users to some ideologically diverse content, but the overall effect is that users tend to see more content that aligns with their pre-existing views.

"Algorithmic Accountability: A Primer" (2016, Data & Society Research Institute): Algorithms often lack transparency, which makes it difficult to address issues like filter bubbles; greater accountability and transparency are needed to ensure diverse content exposure.

"Echo Chambers on Facebook" (2016, PLoS ONE): Users on Facebook are likely to be exposed to content that aligns with their own views, leading to the formation of echo chambers; both network structure and algorithmic sorting contribute.

"Polarization and the Use of Technology in Political Campaigns" (2018, Political Communication): Political campaigns' use of personalization algorithms can exacerbate polarization by targeting individuals with content that reinforces their existing political beliefs.

"Online Echo Chambers and the Effects of Selective Exposure to Ideological News" (2017, Public Opinion Quarterly): Selective exposure to ideological news through personalized algorithms can deepen echo chambers, leading to more polarized opinions among users.

"The Role of Personalization in Political Polarization" (2019, Digital Journalism): Personalization in news feeds can contribute to political polarization by filtering out dissenting viewpoints and reinforcing users' existing beliefs.

"Algorithmic Personalization and the Filter Bubble: A Literature Review" (2020, Internet Policy Review): A review of existing studies suggesting that while filter bubbles exist, their impact is variable and depends on individual behavior, platform design, and other factors.


What is not so clear is how algorithms might be redesigned to counteract such issues. In principle, algorithms could be deliberately designed not to respond so directly to user behavior, perhaps by introducing "serendipity" into recommendations (recommending content that is unrelated to a user's typical preferences).
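One simple way to sketch that "serendipity" idea: reserve a fraction of each recommendation slate for items drawn at random from outside the behavior-based ranking. This is a minimal illustration, not any platform's actual method; the function name and the `serendipity_rate` parameter are assumptions for the example.

```python
import random

def recommend_with_serendipity(personalized, catalog, n=10, serendipity_rate=0.2):
    """Blend top behavior-based picks with randomly chosen catalog items.

    `personalized` is a ranked list of item IDs from the usual
    behavior-based recommender; `catalog` is the full item pool.
    `serendipity_rate` is the fraction of slots given to random items.
    """
    n_random = int(n * serendipity_rate)
    # Keep most of the slate personalized as before...
    picks = personalized[: n - n_random]
    # ...then fill the remaining slots from outside the personalized picks.
    pool = [item for item in catalog if item not in set(picks)]
    picks += random.sample(pool, min(n_random, len(pool)))
    return picks
```

With a rate of 0.2 and a slate of ten, eight slots stay personalized and two go to items the user's history would never have surfaced.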


That might work better for social media and news content than for e-commerce; worse in the legal or medical domains; and arguably well for food, travel, and hospitality recommendations. For advertisers, serendipitous content might or might not help.


When the objective is the largest possible audience, the specific content might not matter much. If the objective is to reach a defined buying public, content will matter more.


And perhaps some elements of traditional journalism's emphasis on fairness and balance could help as well, such as the practice of showing both sides (or multiple viewpoints) and using multiple sources.
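That "show multiple viewpoints" norm can be expressed as a constraint on the recommendation slate: guarantee that each labeled viewpoint appears at least once before filling the rest by score. A minimal sketch, assuming hypothetical `score` and `viewpoint` fields (a real system would need a classifier to supply the viewpoint labels):

```python
def balanced_slate(articles, n=6, min_per_view=1):
    """Pick n articles while guaranteeing each labeled viewpoint appears.

    `articles` are dicts with assumed `score` and `viewpoint` fields
    (e.g. "left" / "right" / "center").
    """
    by_score = sorted(articles, key=lambda a: a["score"], reverse=True)
    by_view = {}
    for a in by_score:
        by_view.setdefault(a["viewpoint"], []).append(a)
    slate = []
    # First satisfy the per-viewpoint minimum with each view's best items...
    for ranked in by_view.values():
        slate.extend(ranked[:min_per_view])
    # ...then fill the remaining slots purely by score.
    rest = [a for a in by_score if a not in slate]
    slate.extend(rest[: n - len(slate)])
    return slate[:n]
```

The design choice here is a hard floor rather than a soft diversity bonus: even a user whose history is entirely one-sided still sees at least one item from each viewpoint.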


It might also be possible to enhance transparency and provide some measure of user control. For example, platforms could give users more control over their recommendations, such as the ability to opt out of personalized content or to request alternative viewpoints.
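Those two controls (opting out of personalization, requesting alternative viewpoints) could be modeled as explicit user preferences that the ranking function must honor. A sketch under stated assumptions: items carry hypothetical `base_score` (general quality or popularity) and `affinity` (match to past behavior) fields, and the class and function names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class RecommendationPrefs:
    personalized: bool = True           # user may opt out of behavior-based ranking
    alternate_viewpoints: bool = False  # user may request counter-attitudinal items

def rank(items, prefs):
    """Rank content according to the user's stated preferences."""
    if not prefs.personalized:
        # Opted out: ignore behavioral signals entirely.
        return sorted(items, key=lambda i: i["base_score"], reverse=True)
    ranked = sorted(items, key=lambda i: i["base_score"] + i["affinity"], reverse=True)
    if prefs.alternate_viewpoints:
        # Surface the least-familiar item first as an "alternative viewpoint."
        contrarian = min(ranked, key=lambda i: i["affinity"])
        ranked.remove(contrarian)
        ranked.insert(0, contrarian)
    return ranked
```

The point of the sketch is that both controls are cheap to implement once the behavioral signal is isolated in a single term of the score.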


In some cases a broader contextual approach might work: weighting the context of the current query more heavily than the long-term behavioral profile, and avoiding overly narrow personalization.


Of course, these sorts of techniques may run counter to the targeting features that have driven advertisers to highly personalized content and venues. What made personalized content and venues so compelling for advertisers was the belief that they provided a more efficient way to reach likely buyers of any product.


To the extent that content presentation relies less on past behavior, it might also reduce the "personalization" that advertisers prefer.


But that is less an issue--if an issue at all--for advertisers selling products and services. The problems are centered on news and information deemed important for people as citizens, not consumers.

