Texas Attorney General Ken Paxton has been aggressive in fighting bad behavior by tech companies. Last year, he won a historic $1.375 billion settlement with Google to protect data privacy and security rights.

Paxton went after Google for secretly tracking people’s movements, their private searches and even their voice prints and facial geometry.

With that win behind him, Paxton should consider looking into Google’s shifty behavior that is damaging news gathering and giving the company an unfair advantage in the race to dominate AI.

The company is forcing publishers to give its AI bots access to their content or risk being dropped from Google search entirely. That leverage gives Google a significant advantage that could extend the company's monopoly into the realm of AI and further damage the news business's ability to generate original content.


Google is scraping and using data without paying for it, even as other AI companies are increasingly licensing content from publishers. (OpenAI has signed a deal with News Corp., Amazon with The New York Times, Perplexity with USA TODAY Inc., and so on.) AI companies are slowly beginning to pay for content because it turns out stealing comes with significant legal risk.

While this marketplace is still in its infancy, both the heavy weight that AI companies place on quality content from established publishers and publishers' need for new streams of revenue suggest that it will continue to grow at a rapid pace. Paying for the content their tools use will likely become standard practice for AI firms — all of them, that is, except Google.

Like other AI companies, Google uses scraping bots to gather the information it needs to train and operate its AI models. Publishers use bot-blocking technologies to try to protect their content from these scrapers, with varying degrees of effectiveness. But because of its monopoly position in search, Google can hold a digital gun to publishers' heads: give Google's AI bots access to your content, free and without a fight, or be dropped from Google search.

Google uses the same bot to scrape website data for both traditional search and AI, giving publishers an all-or-nothing choice. If they want to show up in search results, they also have to provide their content for features like AI Overviews — which have a noticeably detrimental effect on publisher traffic — and AI Mode, which is even worse. And because Google is an adjudicated monopolist with a 90% global search market share, blocking Googlebot and dropping out of Google search would make a publisher effectively invisible.

For decades, search engines and publishers have operated under an understanding that inclusion in search results helps both parties: Search engines could index a comprehensive catalog of web content, and publishers would benefit from the referral traffic.

Google’s AI has undercut this symbiotic relationship. A recent study estimated that AI answer engines deliver 91% fewer referrals to news websites than traditional search engines, and the trend is only getting worse. Every two scrapes by Google once led to one human referral; today that ratio is 18 scrapes for every referral.

Google recognizes the challenge this presents to publishers and claims to offer them tools to better control how their content is used. But these tools present a false choice. The nosnippet meta tag, for example, would cut clickthrough traffic by an estimated 45% if used, and the Google-Extended control does not apply to AI Overviews or AI Mode at all.

Google could do better. It just chooses not to. During Google’s trial on antitrust charges in search, an internal Google deck showed that the company considered a range of options to give publishers greater transparency and control over how their content is collected for traditional search versus generative AI-powered search. Ultimately, the company opted to do nothing.

Google’s policies are indefensible. They run squarely against the basic tenets of both U.S. copyright and antitrust law and should be revised immediately. In the absence of regulatory solutions, Google should, at a minimum, separate its bots and provide publishers with a real choice between search indexing and AI.

Texas has been aggressive in its efforts to protect its citizens from overreach by tech companies. By protecting the rights of news gatherers in the state, it would set a national precedent and help sustain the market for news based in reality, collected by humans.

Danielle Coffey is president and CEO of the News/Media Alliance, which represents 2,000 news and magazine media outlets worldwide.