NEW ORLEANS – A study co-authored by Tulane University professor Eugina Leung has found that ninety percent of people unknowingly phrase online searches in ways that reinforce what they already believe—even when they’re trying to remain neutral.
The study, published in Proceedings of the National Academy of Sciences, tested nearly 10,000 participants and found that even artificial intelligence tools like ChatGPT can reinforce bias unless platforms rethink how search is designed.
Leung is Assistant Professor of Marketing at Tulane University’s A. B. Freeman School of Business. Her research explores how technology shapes consumer judgment, including a pattern she calls the “narrow search effect.”
According to the study, whether the search subject is caffeine or COVID-19, the simple act of typing into a search bar often pushes users deeper into digital echo chambers. In one experiment, participants assigned to search about caffeine’s health effects not only shifted their beliefs but also chose either a caffeinated or decaffeinated beverage in line with their search terms—a result that parallels consumer purchase decisions. Leung says the solution may be surprisingly simple: an algorithmic tweak that broadens the range of results.
Profit and the Algorithm
Current search engines are built on commercial models that prioritize advertising revenue and user retention. In practice, this means delivering highly relevant results quickly—results that often match what users expect rather than challenge them.
“The design of contemporary search engines is influenced by a commercial model that prioritizes user retention for advertising purposes. This is typically achieved by delivering highly relevant results with speed. Despite this, our research identifies incentives for modifying algorithms to provide broader results,” said Leung.
Her studies found that people did not judge broader or mixed results as less useful than those confirming their original queries. That suggests a modification would not harm the user experience. In fact, there may be an appetite for it.
“A majority of users (84%) would be interested in a feature like a ‘Search Broadly’ button,” said Leung. “This user interest, combined with competitive pressures from services like Microsoft’s New Bing, which has experimented with reformulating queries, and potential regulatory interest in mitigating belief polarization, could motivate search providers to implement such changes.”
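Conceptually, a “Search Broadly” feature could reformulate a one-sided query into more neutral phrasing before retrieving results. The sketch below is purely illustrative: the term list and the `broaden_query` function are hypothetical, not the study’s or any search engine’s actual algorithm.

```python
# Hypothetical sketch of a "Search Broadly" query reformulation.
# The directional-term list and rewriting rule are illustrative only;
# they do not reflect how any real search engine broadens queries.

DIRECTIONAL_TERMS = {
    "benefits of": "effects of",
    "dangers of": "effects of",
    "risks of": "effects of",
}

def broaden_query(query: str) -> str:
    """Rewrite one-sided phrasing into more neutral phrasing."""
    broadened = query.lower()
    for narrow, neutral in DIRECTIONAL_TERMS.items():
        broadened = broadened.replace(narrow, neutral)
    return broadened

# A query anchored to one side of the caffeine debate becomes neutral:
print(broaden_query("benefits of caffeine"))  # -> "effects of caffeine"
print(broaden_query("dangers of caffeine"))   # -> "effects of caffeine"
```

The point of the sketch is simply that both a pro- and an anti-caffeine query can map to the same neutral query, so both searchers would see the same balanced result set.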
Testing the Narrow Search Effect
Across 21 studies involving nearly 10,000 participants, Leung and her colleagues mapped how search queries can entrench bias. “Our research was conducted involving a total of 9,906 individuals,” she explained.
The bias may be hardwired into human cognition by evolutionary design. While psychological research suggests confirmation bias evolved as a mental shortcut for fast decision-making, the Tulane studies point to a more optimistic conclusion.
“In our studies, we found that people often possess a genuine interest in obtaining information that is simply obstructed by their method of query formulation. Critically, when we implemented algorithmic interventions to broaden the search results, participants updated their beliefs,” Leung said.
Implications for Business
The findings extend well beyond politics or health debates. They also matter for the way consumers shop online. “The ‘narrow search effect’ has a direct implication to how consumers search for products and services, which consequently affects business sales. Our findings show that people’s search queries limit their exposure to a full range of options,” Leung explained. For companies, that means the wording of a customer’s first search may determine whether they ever appear in the consideration set at all.
This dynamic, Leung argues, plays out every day in e-commerce, where brand preference can be reinforced by how a search is phrased. A query anchored to a specific brand or feature may narrow results to confirm that preference, pushing alternative competitors out of view.
Next Steps in Research
The team is now expanding its work into more politically charged areas, where the stakes for belief polarization are even higher. “We need to determine the conditions under which broadening results could inadvertently increase exposure to inaccurate content,” Leung said. Future studies will also test psychologically informed prompts for AI chatbots and examine how a “Search Broadly” feature performs in real-world settings.
Breaking the Cycle
Leung’s research points to a key insight: people may be more willing to consider broader information than they realize. The obstacle is not necessarily resistance but the way search engines are built and queries are framed. With algorithmic adjustments, search platforms could present users with a fuller picture—one that informs choices in politics, health, and business alike.
As Leung’s findings suggest, the challenge now is to design systems that help people step beyond their digital echo chambers, making it possible to update their thinking in light of broader, more balanced information.