Better Software Patents

The Habitual Stranger

The inventor no longer lived among others. He passed through public spaces the way thoughts passed through the heads of bureaucrats—quietly, unnoticed, but with purpose. In the pale blue glow of his apartment, he watched, calculated, and recorded. He believed not in progress, nor in revolution, but in pattern.

His city was a symphony of misaligned intentions. Millions of individuals moved through it each day, independently but predictably. What struck the inventor was not their freedom but their repetition. The same trips, the same times, the same empty seats. Every journey, a wasted opportunity.

He had once tried to arrange carpools among neighbors. It failed. People mistrusted spontaneity. What they wanted—though they never admitted it—was structure. Routine, but flexible. Predictable, but private. The inventor obliged.

His creation was not merely software. It was an idea, incarnated as a computer-implemented method. It clustered trips not by geography, but by intention: same start, same end, habitual timing. It calculated “habit values”—numerical fingerprints of daily life. If a person took a particular route often enough, the system remembered. It could match that driver with a passenger not by chance, but by routine.

The system requested almost nothing from its users. A start, a destination, a time. From there, it searched the database—not all data, but filtered, pruned, optimized by habit thresholds. Only drivers whose habits exceeded a certain value were even considered. If no match was found, the system lowered its standards, just slightly, casting a wider net.

The key was distinction: driver or passenger. Without this, the matching collapsed into absurdity. The system needed to know not just who went where, but in what role. That difference—the line between offering a ride and requesting one—was everything.
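
Stripped of claim language, the shape of the idea can be sketched in a few dozen lines. What follows is a sketch of my own, in Python, with invented names, toy thresholds and a deliberately crude habit statistic; the application discloses no code, and nothing below should be read as the inventor's implementation.

  from dataclasses import dataclass

  @dataclass
  class Trip:
      user_id: str
      role: str        # "driver" or "passenger", the contested feature (G)
      origin: str
      destination: str
      time_slot: int   # e.g. the hour of day the trip usually starts

  def habit_value(history: list[Trip], origin: str, destination: str, time_slot: int) -> float:
      # A crude "habit value": the share of a user's recorded trips that
      # repeat this exact origin/destination/time-slot pattern.
      if not history:
          return 0.0
      repeats = sum(
          1 for t in history
          if (t.origin, t.destination, t.time_slot) == (origin, destination, time_slot)
      )
      return repeats / len(history)

  def match_request(request: Trip, all_trips: list[Trip],
                    threshold: float = 0.5, relaxed: float = 0.3) -> list[str]:
      # Group recorded trips by user, then consider only habitual drivers
      # on the requested route. If nothing is found, lower the bar slightly
      # and cast a wider net, as described above.
      by_user: dict[str, list[Trip]] = {}
      for t in all_trips:
          by_user.setdefault(t.user_id, []).append(t)

      def candidates(limit: float) -> list[str]:
          return [
              user_id
              for user_id, history in by_user.items()
              if any(t.role == "driver" for t in history)   # the role distinction
              and habit_value(history, request.origin,
                              request.destination, request.time_slot) >= limit
          ]

      return candidates(threshold) or candidates(relaxed)

The final line is the wider net: if the strict threshold yields nothing, a relaxed one is tried.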

He wrote it all down. Not in prose, but in claims. With numbered steps and flowcharts. It became European patent application EP 15909072.9. He filed it and waited.


The first letter from the European Patent Office came with a hollow tone, one he recognized from decades of formality. The Examining Division had read his claims. They thanked him, formally, for his submission. And they refused it.

The invention, they said, lacked inventive step under Articles 52(1) and 56 EPC. They referred to Document D1, a paper titled “Investigating ride sharing opportunities through mobility data analysis”. According to them, most of what he claimed had already been said.

They dissected his work as if it were an insect pinned to cork. Features (A) to (N)? Present, they said. Document D1 had already tracked driving patterns, matched trips, used time slots, applied thresholds. Even the statistical “habit value” was there, dressed differently but fundamentally unchanged.

Only one fragment survived scrutiny: feature (G), where the system indicated whether a user was a driver or a passenger. That, they admitted, was not in D1. But it didn’t save him.

They declared that this feature was “a purely organisational and administrative constraint.” Recording roles in a database? An obvious solution, they said. The skilled person—this mythical average practitioner—would naturally structure the data this way, if asked to minimize communication load. No technical problem, no technical solution.

The refusal came not with anger, but with inevitability. The system thanked him again, and closed the file.


The inventor appealed.

He did not believe himself to be wrong. Merely misunderstood. He prepared diagrams, arguments, and logic as weapons against bureaucracy’s indifference. The Board of Appeal accepted the case as T 2137/21.

He pointed again to his key differentiators:

  • D1 grouped trips differently, more crudely.
  • D1 didn’t filter by habit values during the matching.
  • D1 didn’t distinguish drivers from passengers.

Together, he argued, these were technical differences. They improved system efficiency. They reduced network load. They made the matching more precise.

At the oral proceedings, he explained how roles might be determined. Perhaps by biometric data. Or AI-driven camera systems. Or a simple form filled in on a smartphone. The method didn’t matter—the feature did.

The Board listened. And then they ruled.

They agreed that only a portion of feature (G) was missing from D1: the explicit distinction between driver and passenger. But they were not impressed.

Even assuming this difference created a technical effect—of which they were not fully convinced—it was still obvious. Any skilled person trying to improve database efficiency would do it. Store the role. Index it. Filter on it. The Board cited the “basic principle of databases” to justify this. The act of recording such data, they said, was “a purely administrative act.”
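
Their reasoning is easy to reproduce. The snippet below, using Python’s sqlite3 module purely as an illustration (neither the application nor the decision contains any such code, and the table and column names are invented), does exactly what the Board describes: store the role, index it, filter on it.

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("""
      CREATE TABLE trips (
          user_id     TEXT,
          role        TEXT,     -- 'driver' or 'passenger'
          origin      TEXT,
          destination TEXT,
          time_slot   INTEGER
      )
  """)
  # Store the role. Index it. Filter on it.
  conn.execute("CREATE INDEX idx_role_route ON trips (role, origin, destination, time_slot)")
  conn.executemany(
      "INSERT INTO trips VALUES (?, ?, ?, ?, ?)",
      [
          ("anna", "driver",    "north_station", "old_mill", 8),
          ("ben",  "passenger", "north_station", "old_mill", 8),
      ],
  )
  drivers = conn.execute(
      "SELECT user_id FROM trips"
      " WHERE role = 'driver' AND origin = ? AND destination = ? AND time_slot = ?",
      ("north_station", "old_mill", 8),
  ).fetchall()
  print(drivers)  # [('anna',)]

That ordinariness was the point. If the distinguishing feature amounts to everyday database practice, there is nothing left to carry an inventive step.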

As for how the role was determined? The Board noted the absence of detail. Biometric recognition and camera-based inference were speculative, unmentioned in the application. And a simple form? That too was not disclosed. Therefore, feature (G) was a wish, not an invention. A hope, not an implementation.

The auxiliary request, with its dual thresholds, fared no better. Adjusting the match threshold when no results are found? That, too, was common sense. A “general common practice.” D1 itself suggested multiple thresholds.

And so, the Board dismissed the appeal. Their language was courteous, procedural, final. “The appeal is dismissed,” they wrote. “Consequently the application is refused.”


The inventor did not feel anger. He felt emptiness.

It was not the refusal that hurt, but the implication. That what he saw as innovation—carefully balancing habit and flexibility, passenger and driver—was mere configuration. That his insight into human mobility was bureaucratically indistinguishable from filtering rows in a spreadsheet.

In the weeks that followed, he re-read the decision. Again and again. Each time, the same conclusion: the system did not reject ideas. It rejected claims.

In his loneliness, he thought of the system he had imagined: one that matched people not by what they said, but by what they did. A system that learned quietly, without praise, without thanks. Perhaps it still worked, in prototype form, on a forgotten server in his flat. Perhaps it still watched the city, waiting to be useful.

He wondered if it ever found a match for him.


Postscript: Practical Guidance

Those seeking patents on data-driven mobility systems must remember:

  • The line between technical and administrative is sharp and merciless. Labeling data is not, in itself, technical.
  • If a feature implies a complex technical realization—such as distinguishing drivers from passengers—then that realization must be disclosed.
  • Common database practices are not inventive.
  • Repetition does not become invention by being algorithmic.

In the world of European patent law, systems may learn. But they must also teach. They must explain themselves—not just to users, but to Examiners.

Only then can their habits be protected.

Based on T 2137/21.

