Car Wars

A self-driving car is a computer you put your body in. Fiction by Cory Doctorow.

Zero Tolerance

Dear Parents,

I hate to start the year with bad news, but I’d rather it be this than a letter of condolence to a parent whose child has been killed in a senseless wreck.

As you were notified in your welcome packet, Burbank High has a zero-tolerance policy on unsafe automotive practices. We welcome healthy exploration, and our ICT program is second to none in the county, but when students undertake dangerous modifications to their cars, and bring those cars to campus, they are not only violating Board of Education policy, they’re violating federal laws, and putting other students and our wider community at risk.

Though the instructional year has only just started, we’ve already confiscated three student vehicles for operating with unlicensed firmware, and one of those cases has been referred to the police as the student involved was a repeat offender.

Tomorrow, we will begin a new program of random firmware audits for all student vehicles, on- and off-campus. These are NOT OPTIONAL. We are working with Burbank PD to make these as quick and painless as possible, and you can help by discussing this important issue with your child. Burbank PD will be pulling over vehicles with student parking tokens and checking their integrity throughout the city. As always, we expect our students to be polite and respectful when interacting with law enforcement officers.

This program starts TOMORROW. Students caught with unlicensed vehicle modifications will face immediate 2-week suspensions for a first offense, and expulsion for a second offense. These are in addition to any charges that the police choose to lay.

Parents, this is your chance to talk to your kids about an incredibly serious matter that too many teens don’t take seriously at all. Take the opportunity, before it’s too late: for them, for you, and for the people of our community.

Thank you,
Dr Harutyunyan

#

Status updates (on the road)

if you can read this call help #notajokeseriously i don’t know wtf is going on i was going home then stupid car’s emergency override kicked in

thot we were gon pull over like an ambulance or f-truck etc but we turned & im like wtf detour?

now i’m seeing signs for lerderberg state park n theres a ton of cars around me

its like a convoy all heading to arse end of nowhere evry1 looking out of windows looking scared

car sez batterys almost flat which means ill have to stop eventually i guess but its hot out there like 40°

any1 know whats going on DM me pls #notajoke

bin tryin to call my mum 4 30m but she’s not picking up

if you can reach her tell her yan said everything will be fine

mum if you see this dont worry i love you

80s Digital Fantasy Phone. Illustration by Amisha Gadani.

Plausible Deniability

“We’re dead.”

“Shut up, Jose, we’re not dead. Be cool and hand me that USB stick. Keep your hands low. The cop can’t see us until I open the doors.”

“What about the cameras?”

“There’s a known bug that causes them to shut down when the LAN gets congested, to clear things for external cams and steering. There’s also a known bug that causes LAN traffic to spike when there’s a law-enforcement override because everything tries to snapshot itself for forensics. So the cameras are down inside. Give. Me. The. USB.”

Jose’s hand shook. I always kept the wireless jailbreaker and the stick separate — plausible deniability. The jailbreaker had legit uses, and wasn’t, in and of itself, illegal.

I plugged the USB in and mashed the panic-sequence. The first time I’d run the jailbreaker, I’d had to kill an hour while it cycled through different known vulnerabilities, looking for a way into my car’s network. It had been a nail biter, because I’d started by disabling the car’s wireless — yanking the antenna out of its mount, then putting some Faraday tape over the slot — and every minute that went by was another minute I’d have to explain if the jailbreak failed. Five minutes offline might just be transient radio noise or unclipping the antenna during a car-wash; the longer it went, the fewer stories there were that could plausibly cover the facts.

But every car has a bug or two, and the new firmware left a permanent channel open for reconnection. I could restore the car to factory defaults in 30 seconds, but that would leave me operating a vehicle that was fully un-initialized, no ride history — an obvious cover-up. The plausibility mode would restore a default firmware load, but keep a carefully edited version of the logs intact. That would take 3-5 minutes, depending.

“Step out of the vehicle please.”

“Yes sir.”

I made sure he could see my body cam, made it prominent in the field of view for his body cam, so there’d be an obvious question later, if no footage was available from my point of view. It’s all about the game theory: he knew that I knew that he knew, and other people would later know, so even though I was driving while brown, there were limits on how bad it could get.

“You too, sir.”

Jose was nervous af, showed it in every move and the whites of his eyes. No problem: every second Officer Friendly wasted on him was a second more for the plausibility script to run.

“Everything all right?”

“We’re late for class is all.” Jose was the worst liar. It was 7:55, first bell wasn’t until 8:30 and we were less than 10 minutes away from the gates.

“You both go to Burbank High?” Jose nodded. I kept my mouth shut.

“I would prefer to discuss this with an attorney present.” It was the cop’s turn to roll his eyes. He was young and white and I could see his tattoos peeking out of his collar and cuffs.

“IDs, please.”

I had already transferred my driver’s license to my shirt-pocket, so that there’d be no purse for him to peep, no chance for him to insist that he’d seen something to give him probable cause to look further. I held it out in two fingers, and he plucked it and waved it past the reader on his belt. Jose kept his student card in a wallet bulging with everything, notes and paper money and pictures he’d printed (girls) and pictures he’d drawn (werewolves). The cop squinted at it, and I could see him trying to convince himself that one or more of those fluttering bits could be a rolling paper and hence illegal tobacco paraphernalia.

He scanned Jose’s ID while Jose picked up all the things that fell out of his wallet when he removed it.

“Do you know why I stopped you?”

“I would prefer to answer any questions through my attorney.” I got an A+ on my sophomore Civics term paper on privacy rights in the digital age.

“Baylea.”

“Shut up, Jose.”

The cop smirked. I could tell that he was thinking words like “spunky,” which I hate. When you’re black, female, and five-foot-nothing, you get a lot of spunky, and its ugly sister, “mouthy.”

The cop went back to his car for his roadside integrity checker. Like literally every other gadget in the world, it was a rectangle a little longer and thinner than a deck of cards, but because it was cop stuff, it was ruggedized, with black and yellow rubber bumpers, because apparently being a cop makes you a klutz. I snuck a look at the chunky wind-up watch I wore, squinted through the fog of scratches on the face for the second hand. Two minutes.

Before the cop could scan the car’s plates with his IC, I stepped in front of him. “May I see your warrant, please?”

Spunky turned into mouthy before my very eyes. “Step aside please miss.” He eschewed commas for the sake of seriousness.

“I said I want to see your warrant.”

“This type of search does not require a warrant, ma’am. It’s a public safety check. Please step aside.” I side-eyed my watch again, but I’d forgotten where the minute-hand had been when I started, because I’m not the coolest cucumber in the crisper. My pulse thudded in my throat. He tapped the reader-plate on the car door — we still called it the “driver door” because language is funny that way.

Knight Rider, 1983.

The car powered down with an audible thunk as the suspension relaxed into its neutral state, the car shaking a little. Then we heard its startup chime, and then another, flatter sound accompanied by three headlight blinks, three more, two more. It was booting off the cop’s diagnostic tool, which would then slurp in its entire filesystem and compare its fingerprint to the list of known-good fingerprints that had been signed by both the manufacturer — Uber — and the US National Highway Traffic Safety Administration.

The transfer took a couple minutes, and, like generations before us, we struggled with the progress bar lull, surreptitiously checking each other out. Jose played particularly urgent eyeball hockey with me, trying to ascertain whether the car had been successfully reflashed before the cop checked. The cop, meanwhile, glanced from each of us to the display on his uniform’s wrist to the gadget in his hand. We all heard the file-transfer complete chime, then watched as the cop tapped his screen to start the integrity check. Generating a fingerprint from the copy of the car’s OS took a few seconds, while the log files would be processed by the cop cloud and sent back to Officer Friendly as a pass/fail grade. When your end-users are nontechnical cops standing on a busy roadside, you need to make it all easier to interpret than a home pregnancy test.
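The pass/fail check the tool performs reduces, at heart, to hashing the firmware image and comparing the digest against a signed allowlist. A minimal Python sketch of that idea, with the allowlist contents and the file layout invented for illustration:

```python
import hashlib

# Hypothetical allowlist of known-good firmware fingerprints, as would be
# signed by the manufacturer and the regulator ("0" * 64 is a placeholder).
KNOWN_GOOD = {"0" * 64}

def fingerprint(image_path: str) -> str:
    """SHA-256 digest of a firmware image, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def integrity_check(image_path: str) -> bool:
    """Pass/fail, like the roadside tool's pregnancy-test readout."""
    return fingerprint(image_path) in KNOWN_GOOD
```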

The seconds oozed by. Ding! “All right then.”

All right then, I’m taking you to jail? All right then, you’re free to go? I inched toward the car, and the cop twinkled a toodle-oo at us on his fingers.

“Thank you, officer.”

Jose smelled of flop-sweat. The car booted into its factory-default config, and everything was different, from the visualizer on the windscreen to the voice with which it asked me for directions. It felt like someone else’s car, not like the sweet ride I’d bought from the Uber deadstock auction and lovingly rebuilt with junk parts and elbow grease. My own adrenaline crash hit as we pulled into traffic, the car’s signaling and lane-changes just a little less smooth than they had been a few minutes before (if you take good care of the transmission, tires and fluids, you can tweak the settings to give you a graceful glide of a ride).

“Man, I thought we were dead.”

“That was painfully obvious, Jose. You’ve got a lot of fine points, but your cool head is not one of them.” My voice cracked as I finished this. Some cool customer I was. I found a tube of coffee in the driver’s compartment and bit the end off it, then chewed the contents. Jose made pleading puppy eyes at me and I found one more, my last one, the emergency pre-pop-quiz reserve, and gave it to him as we pulled into the school lot. What are friends for?

#

A real rib-creaker.

Yan’s mum had gone spare and then some when he finally made it home, leaping up from the sofa with her eyes all puffy and her mouth open and making noises like he’d never heard before.

“Mum, mum, it’s okay, I’m okay.” He said it over and over while she hugged him fiercely, squeezing him until his ribs creaked. He’d never noticed how short she was before, not until she wrapped her arms around him and he realized that he could look down on the crown of her head and see the grey coming in. He’d matched her height at 14 and they’d stopped measuring. Now at 19, he suddenly understood that his mother wasn’t young anymore — they’d celebrated her sixtieth that year, sure, but that was just a number, something to make jokes about —

She calmed down some and he was crying too by then, so he fixed them both some coffee, his mum’s favourite from the roaster in St Kilda, and they sat down at the table and drank coffee while they snotted and cried themselves dry. It had been a long walk back, and he’d been by no means the only one slogging down a freeway for ages, lost without mobile service and maps, trying to find someone with a live battery he could beg for a navigational check.

“All my feeds are full of it, it’s horrible. Hundreds of people smashed into each other, into the railing or run off the freeway. I thought –”

“I know Mum, but I was okay. The bloody car ran out of juice and just stopped. Rolled to a stop, got a little bump from the fella behind me, then his car swerved around me and took off like blazes. Poor bugger, looked terrified. I had to get out and walk.”

“Why didn’t you call?”

“Flat battery. Flat battery in the car, too. Same as everyone. I plugged my phone in soon as I sat down, right, but I think the car was actually draining my battery, cos everyone else I met walking back had the same problem.”

She contemplated Yan for a moment, trying to figure out whether she was upset or relieved, plumped for relieved, set down her coffee and gave him another one of those hugs that made him gasp for air.

“I love you, Mum.”

“Oh, my boy, I love you too. God, what’s going on, hey?”

Christine, d. John Carpenter, 1983.

#

Revolution, again.

There was another revolution, so all our fourth-period classes were canceled and instead we were put into tiger teams and sent around the school to research everything we could find about Syria and present it to another group in one hour; then the merged groups had to present to two more teams, and so on, until we all gathered in the auditorium for final period.

Syria is a mess, let me tell you. My rule of thumb for easy credit on these world affairs realtime assignments is to look for Wikipedia articles with a lot of [citation needed] flags, read the arguments over these disputed facts, then fill in the footnotes with some quick googling. Being someone who didn’t actually give a damn about the issue let me figure out which citations would be acceptable to all the people calling each other monsters for disagreeing about it.

Teachers loved this, couldn’t stop praising me for my “contributions to the living record on the subject” and “making resources better for everyone.” But the Syria entry was longer than long, and the disputed facts had no easy resolution — was the government called ISIL? ISIS? IS? What did Da’esh even mean? It had all been a big mess back when I was in kindergarten, and then it had settled down… Until now. There were tons of Syrian kids in my class, of course, and I knew they were like the Armenian kids, super-pissed about something I didn’t really understand in a country a long way away, but I’m an American, which means that I didn’t really pay attention to any country we weren’t at war with.

Then came the car thing. Just like that one in Australia, except this wasn’t random terrorists killing anyone they could get their hands on — this was a government, and we all watched the livestreams as the molotov-chucking terrorists or revolutionaries or whatever were chased through the streets of Damascus by the cars the government had taken over, some of them — most of them! — with horrified people trapped inside, pounding on the emergency brakes as their cars ran down the people in the street, spattering the windscreens with blood.

Some of the cars were the new ones with the sticky stuff on the hood that kept the people they ran down from being thrown clear or tossed under the wheels — instead, they stuck fast and screamed as the cars tore down the narrow streets. It was the kind of thing that you needed a special note from your parents to get to see in social studies, and luckily my moms is cool like that. Or unlucky, because nightmares, but better to be woke than asleep. It’s real, so it’s something I need to know about.

#

We’re artists, not programmers.

Huawei’s machine-learning division thought of themselves as artists more than programmers. That was the first slide in their deck, the one the recruiters showed at the big job-fairs at Stanford and Ben-Gurion and IIT. It was what the ML people said to each other, so repeating it back to them was just good tactics.

When you worked for Huawei, you got access to the firehose: every scrap of telemetry ever gleaned by a Huawei vehicle, plus all the licensed data-sets from the other big automotive and logistics companies, right down to the driver-data collected from people who wore court-ordered monitors: paroled felons, abusive parents under restraining orders, government employees. You got the post-mortem data from the world’s worst crashes, you got all the simulation data from the botcaves: the vast, virtual killing-field where the machine-learning algorithms duked it out to see which one could generate the fewest fatalities per kilometer.

But it took a week for Samuel to get the data from the mass hijackings in Melbourne and Damascus. It was all national-security-ied up the arse, of course, but Huawei was a critical infrastructure partner of the Seven Eyes nations, and Samuel kept his clearances up with the four countries where he had direct-line reports working in security.

Without that data, he was left trying to recreate the attack through the Sherlock method: abductive reasoning, where you start with a known outcome and then come up with the simplest possible theory to cover the facts. When you have excluded the impossible, whatever remains, however improbable, must be the truth. If only that was true! The thing that never happened to Sherlock, and always happened to machine learning hackers, was that they excluded the impossible and then simply couldn’t think of the true cause — not until it was too late.

For the people in Damascus, it was too late. For the people in Melbourne, it was too late.

No pressure, Samuel.

Machine learning always started with data. The algorithm ingested the data, crunched it, and spat out a model, which you could test by feeding it some of the data you’d held back from the training set. Feed it 90 percent of the traffic info you had, ask it to model responses to different traffic circumstances, then test the model on the reserved set to see if it could correctly — that is, nonfatally — navigate the remaining traffic.
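A sketch of that hold-back workflow, using scikit-learn and synthetic stand-in data (the real telemetry, and the “nonfatal” scoring, would be far richer than plain accuracy):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in for the telemetry described above: X is sensor readings per
# traffic situation, y is the manoeuvre that avoided a collision.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)

# Hold back 10 percent; train the model on the other 90.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, random_state=0
)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# The model "passes" if it navigates the reserved traffic correctly --
# approximated here as plain accuracy on the held-out set.
print(accuracy_score(y_test, model.predict(X_test)))
```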

Data could be wrong in many ways. It was always incomplete, and whatever was left out could bias the model. Samuel always explained this to visiting school groups by inviting them to imagine training a model to predict height from weight by feeding it data from a Year Three class. It didn’t take the kids long to get how that might not produce good estimates for the height of adults, but the kicker was when he revealed that any Year Three who wasn’t happy about their weight could opt out of getting on the scales. “The problem isn’t the algorithm, it’s the data used to make the model.” Even a school-kid could get that.
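A toy version of the height-from-weight example, with the opt-out bias built in (all numbers invented):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Simulated Year Three class: weight in kg, height in cm, loosely linked.
weight = rng.normal(30, 5, 500)
height = 95 + 1.2 * weight + rng.normal(0, 4, 500)

# The opt-out: anyone unhappy with their weight skips the scales, so the
# training data systematically under-samples the heavier children.
opted_in = weight < np.percentile(weight, 70)

model = LinearRegression().fit(
    weight[opted_in].reshape(-1, 1), height[opted_in]
)

# Ask the child-trained, opt-out-biased model about an adult's weight:
print(model.predict([[80.0]]))  # a confident, meaningless extrapolation
```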

But it was more complicated than just biased data. There were also the special cases: what to do if an emergency vehicle’s siren was sensed (because not all emergency vehicles could transmit the lawful interception overrides that would send all traffic to the kerb lanes), what to do if a large ungulate (a deer, a cow, even a zebra, because Huawei sold cars all over the world) stepped into the car’s path, and so on. In theory, there was no reason not to use machine learning to train this too — just tell the algorithm to select for behaviours that resulted in the shortest journeys for simulated emergency vehicles. After all, there would always be circumstances when it was quicker for vehicles to drive a little further before pulling over, to prevent congestion, and the best way to discover those was to mine the data and run the simulations.

Regulators did not approve of this: nondeterministic, “artistic” programming was a cute trick, but it was no substitute for the hard and fast binary logic of law: when this happens, you do that. No exceptions.
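In code terms, the regulators were demanding a hard rule layer that pre-empts whatever the learned model would prefer. A minimal sketch of that layering, with every name and case invented for illustration:

```python
def choose_action(situation: dict, learned_policy):
    """Hard-coded special cases first; the learned model only gets a
    say when no regulator-mandated rule applies."""
    if situation.get("siren_detected"):
        return "pull_to_kerb"      # lawful-interception style override
    if situation.get("large_animal_ahead"):
        return "brake_hard"        # when this happens, you do that
    # No special case fired: fall through to the nondeterministic,
    # "artistic" model.
    return learned_policy(situation)
```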

So the special cases multiplied, because they were like crisps, impossible to stop at just one. After all, governments already understood how special cases could be policy instruments.

Special cases were how pirate sites and child porn were excluded from search-results, how sensitive military installations were excluded from satellite photos in mapping apps, how software-defined radios stayed clear of emergency bands when they were hunting for interference-free channels. Every one of those special cases was an opportunity for mischief, since so many of them were secret by definition — no one wanted to publish the world’s most comprehensive directory of online child porn, even if it was supposed to serve as a blacklist — so the special-case bucket quickly filled up with everything that some influential person, somewhere, wanted: from gambling and assisted-suicide sites being snuck into the child-porn list, to anti-Kremlin videos being added to the copyright filters, to all the “accident-prevention” stuff in the cars.

Since 1967, ethicists had been posing hypothetical problems about who should be killed by runaway trolleys: whether it was better to push a fat man onto the tracks (because his mass would stop the trolley) or let it crash into a crowd of bystanders, whether it made a difference if the sacrificial lamb was a good person or a bad one, or whether the alternative fatalities would be kids, or terminally ill people, or…

The advent of autonomous vehicles was a bonanza for people who liked this kind of thought-experiment: if your car sensed that it was about to get into an accident, should it spare you or others? Governments convened secret round-tables to ponder the question and even came up with ranked lists: saving three children in the car topped saving four children on the street, but three adults would be sacrificed to save two kids. It was a harmless and even cute diversion at first, and it gave people something smart-sounding to say at lectures and cocktail parties.

But outside the actual software design teams, no one asked the important question: if you were going to design a car that specifically tried to kill its owners from time to time, how could you stop those owners from reconfiguring those cars to never kill them?

But Samuel had been in those meetings, where half-bright people from the old-line automotive companies reassured quarter-bright bureaucrats from the transport ministries that there’d be no problem designing “tamper-proof” cars that would “resist end-user modification.” Meanwhile, much brighter sorts from the law-enforcement side of the house licked their chops and rubbed their hands together at all the non-trolley problems that could be solved if cars could be designed to do certain things when they got signals from duly authorised parties. Especially if the manufacturers and courts would collaborate to keep the inventory of those special cases as secret as the child-porn blocklists on the national firewalls.

He’d been in the design sessions after, where they debated how they’d hide the threads and files for those programs, how they’d tweak the car’s boot-cycle to detect tampering and alert the authorities, how the diagnostic tools provided to mechanics for routine service-checks could be used to double-check the integrity of all systems.

But then he’d started getting signed, obfuscated blobs from contractors who served governments around the world, developing “emergency priority” apps he was just supposed to drop in, without inspecting them. Of course he ran unit-tests before Huawei shipped updates, and when they inevitably broke the build, Samuel would go around and around with the contractors, who’d want access to all his source code without letting him see any of theirs.

It made sense for them to behave that way. If he failed to help them get their code into Huawei’s fleet, he’d have to answer to governments around the world. If they failed to help him, they’d have to answer to precisely no one.

Unit-tests were one thing, real-world performance was something else. Sensors couldn’t tell a car whether it was about to crash into some pedestrians, or a school bus, or an articulated lorry full of dynamite. All sensors could do was sense, and then feed data to machine-learning systems that tried to draw conclusions from those data. Even with all the special cases about what the car must and must not do under which circumstances, machine learning systems were how it knew what the circumstances were.

That’s how Melbourne happened.

It had taken him a long time to figure this out. At first, he assumed that finally, the worst had come to pass: the cryptographic keys that were used to sign police override equipment had leaked, and the wily criminals had used them to hijack 45 percent of the cars on the roads of one of the biggest cities in Australia. But the forensics didn’t show that at all.

Rather, the crooks had figured out how to spoof the models that invoked the special cases. Samuel figured this out by accident, his third day at his desk, running sim after sim on Huawei’s high-confidentiality cloud, which was protocol, even though it was the slowest and least-provisioned cloud he could have used. But it was only available to a handful of senior internal Huawei groups, not even contractors or partners.

He’d been running the raw telemetry from a random sample of the affected cars, looking for anomalous behaviour. He’d nearly missed it, even so. In St Kilda, someone — face in shadow beneath a hat, thermal profile obscured — stepped in front of a subject car, which slowed, but did not brake, and emitted two quick horn-taps.

Regression analysis on accident data had shown that hard braking was more likely to result in rear-end collisions and frozen pedestrians who couldn’t get out of the way. The car tasked more compute time to the dorsal perimeter to see if it could shift into an adjacent lane without a collision, and if that wasn’t possible, to estimate the number of affected vehicles and passengers based on different maneuvers.

The pedestrian feinted towards the car, which triggered another model, the “suicide by car” system, which invoked a detailed assessment of the pedestrian, looking for clues about sobriety, mental health and mood, all of which were difficult to ascertain thanks to the facial obfuscation. But there were other signals that gave it a high weighted score: a mental health crisis clinic 350 metres away, six establishments licensed for serving or selling alcohol within 100 metres, the number of redundancies in the past quarter.

It initiated hard braking, and the pedestrian leapt back with surprising nimbleness. Then, across the road, another pedestrian repeated the dance, with another car, again in a shadowing hat and thermal dazzle makeup.

The car noticed this, and that triggered another model, which some analyst had labeled “shenanigans.” Someone was playing silly buggers with the cars, which was not without precedent, and well within the range of contingencies that could be managed. Alertness rippled through the nearby cars, and they began exchanging information on the pedestrians in the area: gait profiles, silhouettes, unique radio identifiers from Bluetooth devices. Police were notified, and the city-wide traffic patterns rippled, too, as emergency vehicles started slicing through the grid while cars pulled over.

All these exceptions to the norm were putting peak load on the car’s internal network and processors, which were not designed to continue operating when crises were underway — freeze-and-wait being the optimal strategy that the models had arrived at.

But before the car could start hunting for a place to pull in until the law arrived, it got word that there was another instance of shenanigans, a couple of roads down, and the police would need a clear path to reach that spot, so the car had best keep moving lest it create congestion. The cars around it had come to similar conclusions, and were similarly running out of processor overhead, so they fell into mule-train formation, using each other’s perimeters as wayfinding points, turning their sensors into a tightly-coupled literal grid that crept along with palpable machine anxiety.

Here’s where it got really interesting, because the attackers had forced a situation where, in order to keep from blocking off the emergency vehicles behind them, these cars had completely shut down the road and made it impossible to overtake them. This increased the urgency of the get-out-the-way messages the city grid was sending, which tasked more and more of the cars’ intelligence and sensors to trying to solve the insoluble problem.

Gradually, through blind variation, the cars’ hivemind discovered that the faster the formation drove, the more it could satisfy the overriding instructions to clear things.

That was how 45 percent of Melbourne’s vehicles ended up in tight, high-speed formation, racing for the city limits as the emergency vehicles behind them spurred them on like sheepdogs, while frantic human planners tried to figure out exactly what was going on and how to stop it.

Eventually, the sheer quantity of compromised vehicles, combined with the minute variations in lane-spacing, small differences in car handling characteristics and, finally, a blown tyre, led to a pile up of ghastly proportions, a crash that they would study for decades to come, that would come to stand in for the very worst that people could do.

Samuel had always said that machine learning was an art, not a science, that the artists who designed the models needed to be able to work without official interference. He’d always said it would come to a bad end. Some of those meetings had ended in shouting matches, Samuel leaning over the table, shouting at bureaucrats, shouting at his bosses, even, in a way that would have horrified his parents in Lagos, where jobs like Samuel’s were like lottery jackpots, and shouting like his was an unthinkable act of economic suicide.

But he’d shouted and raged and told them that the fact that they wished that there was a way to put a back-door in a car that a bad guy couldn’t exploit didn’t mean that there was a way to do it.

He’d lost. If Samuel wanted to argue for a living, he’d have been a lawyer, not an algorithm whisperer.

Now he was vindicated. The bad ideas baked into whole nations’ worth of infrastructure were now ready to eat, and they would be a feast that would never end.

If this was what victory felt like, you could keep it. Elsewhere in the world, there were other Samuels, poring over their own teams’ reports: GM, VW-Newscorp, Toyotaford, Yugo. He’d met some of those people, even tried to recruit a few of them. They were as smart as Samuel or smarter, and they’d certainly shouted as loudly as he had when the time had come. Enough to satisfy their honor, before capitulating to the unstoppable force of nontechnical certitude about deeply technical subjects. The conviction that once the lawyers had come up with the answer, it was the engineers’ job to implement it, not trouble them with tedious technical wheedles about what was and wasn’t possible.

Total Recall, d. Paul Verhoeven, 1990.

#

Grand theft auto.

Burbank High had a hard no-phones policy: it was a zero tolerance expulsion offense to step over the property line with a phone that hadn’t been apped to reject unapproved packets. It made the school day into a weird kind of news vacuum. There was the day that I’d emerged from fourth period and stepped across the threshold to discover that the governor had been shot by Central Valley separatists and the whole state had gone bananas, seeing water-warriors behind every potted plant and reporting every unexplained parcel as a potential bomb.

You never get used to that feeling of emerging from a news-free zone and into a real world that’s been utterly transformed while you were blissfully unaware. But you do get better at recognizing it.

When the final bell rang and 3,000 students (me included) poured out of the school doors, it was obvious that there was something wrong. The streets were empty, missing the traffic that hummed along Third Street with perfect, orderly following distance. That was the first thing we noticed. It was only after a second of gawping at the empty road that everyone turned their attention to the parking lots, the small faculty lot and the sprawling student lot, and realized, in unison, that all the cars had gone missing, every single one.

As they pushed out of the doors and toward the lot, I saw that it wasn’t quite all the cars that had driven themselves away while we’d been good little students at our lessons.

One car remained.

As in a dream, I pulled out my phone and fingerprinted it into wakefulness, sent the car its unlock signal. The car, alone in the vast lot, blinked its headlights and came to attention on its suspension. Gradually, the students turned to look at me, then my car, then back at me, first crowding around, then opening a path between me and that stupid little Uber hatchback, unlovely and lonely in the field of tarmac. They watched me as I drifted towards it, opened the door, tossed in my school bag, and slid into the front seat. The car, running my rambunctious, forbidden software, started itself with a set of mechanical noises and vibrations, then backed smoothly out of the lot, giving the humans around it a cautious berth, sliding onto the empty roads, and aiming towards home.

I was sure I’d be pulled over — the only car on the road, what could be more suspicious — but I didn’t pass a single cop car. Dialing into the news, I watched — along with the rest of the world — as every car in the San Fernando Valley formed a fast-moving migratory herd that sped toward the Angeles National Forest, which was already engulfed in the wildfires from the crashed cars that had gone over the cliff-edged winding roads.

The cops were apparently a little busy, just then.

Repo Man, d. Alex Cox, 1984.

#

Every time. No exceptions.

It was Yan’s mum who found the darknet site with the firmware fiddler image, though Yan had to help her get it installed on a thumbdrive. They made two, one for each of them, and clipped them to their phones, with the plausible-deniability partitions the distributor recommended.

The lecture she gave Yan about using it every single time, no matter whether he was in a friend’s car or an auto-taxi, was as solemn as the birth-control lecture she’d given him on his fourteenth birthday.

“If the alternative is walking all night, then you will walk, my boy. I want you to promise.”

“I promise, Mum.”

She hugged him so fiercely it made his ribs creak, squeezing his promise into his bones. He hugged her back, mindful of her fragility, but then realised he was crying for no reason, and then for a good reason, because he’d nearly died, hadn’t he?

Jailbreaking a car had real legal risks, but he’d take his chances with those, considering the alternative.