You may have heard that the Supreme Court has agreed to take up an interesting Fourth Amendment case. Chatrie v. United States will examine whether the execution of a geofence warrant violates the Fourth Amendment. Applying the Fourth Amendment to new technologies like geofences—whatever those are—is certainly interesting. But the case has more to do with you than you might think.

The government uses geofence warrants to demand information from an online service provider based on where people were at a given time. But Chatrie’s implications are far broader. The case is about a law enforcement technique that could grow dramatically: requiring online service providers to search customer data for information to use against those same customers.

Soon the government may demand all manner of searches into customer records, including yours. A warrant could require data about who accessed services from given IP addresses, which of those customers used services at certain times, who used certain keywords, and whose data contains other content or metadata cues. The list of potential search parameters is as long as the data elements in customer databases.
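
To make that concrete, here is a minimal sketch of what such demands can look like once translated into queries against a provider's records. The table, schema, and every parameter value below are invented for illustration; nothing here is drawn from the Chatrie record or any real provider's systems.

```python
# Hypothetical illustration only: a toy provider-side "customer records" table
# and the kinds of queries a warrant of this sort might compel. The schema and
# all values are placeholders invented for the sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE records (
        customer_id TEXT,
        ip_address  TEXT,
        accessed_at TEXT,   -- ISO 8601 timestamp
        query_text  TEXT    -- e.g., a search term the customer typed
    )
""")

# Who accessed services from given IP addresses?
by_ip = conn.execute(
    "SELECT DISTINCT customer_id FROM records WHERE ip_address IN (?, ?)",
    ("203.0.113.7", "198.51.100.22"),
).fetchall()

# Which customers used services during a given window?
by_time = conn.execute(
    "SELECT DISTINCT customer_id FROM records "
    "WHERE accessed_at BETWEEN ? AND ?",
    ("2024-01-01T16:00:00", "2024-01-01T17:00:00"),
).fetchall()

# Who used certain keywords?
by_keyword = conn.execute(
    "SELECT DISTINCT customer_id FROM records WHERE query_text LIKE ?",
    ("%some suspect phrase%",),
).fetchall()
```

Every column in a table like that is a potential warrant parameter, which is the point: the technique is limited only by what the provider happens to store.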

The costs to businesses if the practice grows could be significant. In Chatrie, the government sent Google on three searches through customer data before demanding personally identifiable information about three customers, two of whom apparently were uninvolved. Deputized into searching through data they hold for adverse information about customers, tech companies might join banking in having double-digit percentages of their staff dedicated to “lawful access.”
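
For a sense of how a multi-pass demand narrows from a crowd of bystanders to a handful of named people, here is a toy sketch of the funnel. The stages, field names, and functions are my own invention for illustration, not the actual protocol Google followed in Chatrie.

```python
# Hypothetical funnel: each pass narrows the prior result set before any
# identities are disclosed. Invented for illustration; not Google's process.

def pass_one(records, area, window):
    """Pseudonymous device IDs seen in the area during the time window."""
    return {r["device_id"] for r in records
            if r["location"] in area and window[0] <= r["time"] <= window[1]}

def pass_two(records, device_ids, wider_window):
    """Expanded (still pseudonymous) movement data for a subset of devices."""
    return [r for r in records
            if r["device_id"] in device_ids
            and wider_window[0] <= r["time"] <= wider_window[1]]

def pass_three(accounts, device_ids):
    """Identifying account information for the final, narrowed set."""
    return [accounts[d] for d in device_ids if d in accounts]
```

Each pass is work the provider performs on the government's behalf, against its own customers, which is where the compliance costs come from.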

Then there are the costs in consumer trust. A significant percentage of people will avoid many online services or minimize their use of them—as they should. Too often in tech, what looks like paranoia turns out to be reasonable foresight and caution.

But what a gold mine for investigators if they can assign a set of “queries”—we won’t call them searches—to a private company and use the results without implicating the Fourth Amendment. Precisely tailored, uh, “information demands”—we won’t call them searches—would turn up only information indicative of criminality.

Nobody can have a reasonable expectation of privacy in criminal activity, so those “focused efforts to learn information”—we won’t call them searches—really aren’t searches. That was the logic the Court followed in United States v. Jacobsen (1984) and Illinois v. Caballes (2005). I critiqued that logic in a 2012 brief to the Supreme Court about whether government agents can bring a drug-sniffing dog to one’s front door without a warrant.

That case, Florida v. Jardines, is a good jumping-off point for a direction the Court could go with Chatrie. Swatting drug dogs at front doors on the nose with a rolled-up majority opinion, Justice Antonin Scalia examined the niceties of property law. There is a well-recognized practice of allowing people up the front walk for neighborly and community purposes—a “license” in the language of property. “But introducing a trained police dog to explore the area around the home in hopes of discovering incriminating evidence is something else. There is no customary invitation to do that,” Scalia wrote.

A strong argument can be made that businesses hold data for customers. Customer data is not a run-of-the-mill business record that government agents can rummage through without giving the affected people a voice. It is customers’ information—literally theirs, held for them by service providers. (I have written extensively about the treatment of personal information as property and featured a proposal to formalize that in a recent post.)

Terms of service and privacy policies divvy up rights to information—possession and use to service providers, the right to exclude (a.k.a. “privacy”) to customers.

As the “reasonable expectation of privacy” doctrine approaches its end, the Court is looking for ways to administer the Fourth Amendment. The principles rising to meet the challenge are textualism and property. The text of the Fourth Amendment and property principles can guide the Court toward legality, stability, and balance between security and privacy. Textualism can even administer knotty technologies like facial recognition, which is—let’s call it what it is—a search.

It would be foolish to predict that the Court will find its way in the Chatrie case, besieged by litigants and amici who in the main don’t see things as I do. With exceptions, they will want the Court—our super-sociologist Supreme Court—to decide what is reasonable in terms of privacy expectations. Most likely, Chatrie will produce a narrow, fact-based decision: “Golly, using geofence warrants in this particular circumstance is OK (or not). We leave you to guess at the application of this logic in future cases.”

But the Court could treat the issues differently. It could use the Fourth Amendment’s text and traditional legal tools. In me, foolishness and hope spring eternal.