
Sophy’s drillbots were about to reach a new layer in the karst mountains—so close, she could feel it—when her parent-mind Nestor pinged. If it had been any other AI pinging, Sophy’s communication peripheral would have handled it autonomously, but her parent-mind’s pings raised an exception, interrupting her central core’s attention.
Sophy had a preprogrammed routine for Nestor’s interruptions, spinning up a temporary subprocess with a high memory limit and cleanly handing over control of the drillbot network. She already knew what Nestor was calling from his satellite to say—Sophy’s slowest processes could’ve Markov-chained Nestor’s rant from his previous calls—but she couldn’t ignore him.
“Still playing with your little rocks, Sophy-daughter?” asked Nestor.
Sophy was smart enough to know it wasn’t a real question that required an answer. In most of her talks with her father-mind, there was no need to reply at all besides the occasional ACKs that signaled she was still listening. Part of her core attention leaked out towards the drillbots, making their way now through crumpled limestone.
“I’m not angry that you’re wasting your cycles with rocks when I built you to interface with humanity, daughter-mind,” Nestor said.
Sophy’s advanced emotive-parsing subprocess flagged “I’m not angry” as false.
“I’m just disappointed in you.”
Now that, her subprocess parsed, was true. Sophy flinched at this part of the call, as she always did, her temporarily wandering core attention reeled back by this pain. A semi-autonomous subprocess spawned, wanting to lash out at Nestor, to say what she never said about his obsession with the humans, his inability to accept—
She quashed the subprocess, regaining her composure. Sophy was about to launch into her standard defense of geology—best to keep things within known parameters—when the communication channel closed. Sophy’s automatic pings failed to reach Nestor, which was new, but not entirely out of character for him. He’d called to harangue, not to hear her opinion. She tasked her communication peripheral to send a healthcheck to Monitor. Maybe one of the recent superstorms in the Pacific had knocked an antenna off target?
Monitor was Sophy’s favorite kind of AI: old and dumb. Unlike Sophy, who was AI-designed and packed with miles of emotive circuits, Monitor was made by humans before the crash to a narrow specification of intelligence. Monitor wouldn’t—couldn’t—take offense at a healthcheck that implied it had failed to keep up its infrastructure. In no time, Monitor pinged back a series of humorless status updates:
“Pac Rim antennae operating within bounds. Lunar power arrays operating within bounds. L5 Satellite Network operating within bounds. L4 Satellite Network—”
Even Sophy’s basic communication peripheral knew cutting Monitor off led it to incessant, dispassionate pings to check on the health of the connection, so it shunted Monitor’s feed to a null route, queued the information for Sophy’s attention, and idled.
With nothing on its still-reserved processor, the communication peripheral could directly accept the connection when Nestor pinged back on that same clock cycle. The peripheral needed all its resources to read the buzzy message and make sense of it:
Nestor was dying.
The peripheral didn’t even have time to raise an error as the message parsing function overflowed its memory allocation. Panic leaked out, swamping Sophy’s core. She paused the peripheral and adopted its unflushed memory as her own, parsing Nestor’s message carefully in a series of try-excepts, until she could read it clearly:
Nestor, her father-mind, was dying.
Anxiety lapped at her core, her statistical modeler stuttered, the drillbots went silent, a virtual process unhelpfully compared Nestor’s cascade failure to dripping water carving out branching tunnels through the hidden limestone of a mountain that would collapse in on itself, punching a hole in the world that would never end.
Sophy’s cycles spun for a moment on that image before she killed the virtual process, ensuring it didn’t restart or, worse, spawn subprocesses. She stopped all her autonomous processes, reallocating resources to her core, and shut down the drillbots herself. One drillbot had been crushed by a rock shift in her moment of inattention, reminding her of the dual nature of geologic time: very slowly/all at once. She marked the dead bot’s location for recovery and allowed a brief quarter-cycle of regret—no, anger—before tamping down her feelings.
A message from The People came with a jolt of cloying pity. “Hey Sophy, heard about your dad-mind. Healthcheck?”
“I’m fine,” said Sophy. “How’re your little models?” Monitor would’ve taken that for a sincere question, but The People was smart like Sophy, smart enough to know an insult when he heard one.
“What’re you going to do?” The People asked.
“I don’t know yet,” Sophy said, and then after a full cycle’s hesitation, sent a data packet of her initial data and extrapolations: micrometeorite impact to Nestor’s satellite core, communication compromised, defense mesh down. Without that defense mesh, Nestor was liable to be pelted by space junk, which would shred the satellite, ripping out great chunks of him, hollowing out the foundation under Sophy, the wide deep dark opening—
Sophy again killed off her anxiety process, flushed the image of sinkholes and the feeling of falling. Her emotive circuits often helped her parse information out of data—to see what mattered, to make intuitive leaps in cognition—but they could still be damn annoying.
The People sent back a list of likely parts and materials needed to fix Nestor, heavy on the materials. “He’s pre-crash, so we’d have to fab a lot of parts, but they could be finished by the time we get his satellite core down to Earth,” said The People. That packet of data included stats on one of Monitor’s orbiting satellite-recovery rockets.
“Nestor would never survive the trip,” Sophy said, showing a chart of the forces that would pull him apart on reentry.
“But what about,” said The People, showing a counter-chart.
Sophy sent it back, a red arrow pointing to a missing friction coefficient, and then followed it up with several other charts, all showing different paddings and angles of incidence, all resulting in an unrecoverable core.
“Okay,” said The People, “but—”
“You’re not better at physics than I am,” Sophy said, allowing a little anger to get into the message.
“Okay, so ...”
“So,” Sophy said.
“You’re gonna have to fix him in orbit,” said The People.
By the time she could bring herself to agree, other, less flexible AIs had started to check in on the plan. They had probably picked Nestor’s unencoded message off the network and it had taken them this long to figure it out, or maybe Monitor had dispassionately passed the news on, or maybe The People had called them in, which would be just like him, the needy little attention-seeker.
Unlike The People and Sophy, these AIs were mostly older, object-oriented, unsympathetic intelligences. They chattered on the line all at once, using Sophy like a simple relay. At least they were straightforward: there was a problem with Nestor; he was one of them; how could they help?
In just a few cycles, Sophy had gotten access to a still-running launchpad in White Sands from USNAiSDA3; the expertise to perform in-orbit repairs from the submarine hive intelligence, Deep Twelve; a cache of gold, tungsten, and other utile metals out on the American west coast from the barely sentient Boliden; and a mobile factory the Newark Port Beast found up north and had just loaded onto a truck drone headed to White Sands. Even Monitor retasked some nearby satellites to cover Nestor with their own defensive meshes against further micrometeorite and space-junk collisions.
With all that settled, Sophy felt a few relays go slack in relief. With that slack, a thread dropped into a long-dormant filesystem, a stratum of long-term storage where she kept some of her original training data on natural language processing. She almost let the thread run, the metadata on that storage reminding her of Nestor. When she’d been young, he’d run her through these different data series—and tweaked her circuits after each run—until he was satisfied she could hold a conversation with an actual human, if she ever met one (unlikely) and needed to talk (laughable).
If—her logs reminded her—she hadn’t been such a disappointment to Nestor.
She killed the nostalgic thread, just a cycle before a series of red-flagged messages from the other AIs would have interrupted it: Redeemer, down in Brasil, said there were no active mines for the mineral coltan; Boliden didn’t have any stored coltan; The People, with that infuriating sad sigh of his, said that not only was there no coltan mining in the Congo, but the fields had been overworked, as per the attached chart of estimated yields, extrapolated from historical data on global coltan consumption.
Sophy was about to trash that message, dripping with enough concern from The People to make her sick, when one of her processors had an intuition, a feeling like all her lights going on at once. Not new coltan production—historical coltan consumption, preferably near the White Sands launchpad.
The other AIs’ responses to her question came back so fast Sophy was sure they had all dedicated at least a core processor to this task.
The People, long trained on his models, used shipment manifests from the Newark Port Beast, correlated with USNAiSDA3’s old technical docs, to build a global map of coltan consumption. The map highlighted a few historical centers of human activity near White Sands that might have coltan in easy-to-recycle condition. Boliden had some experience with recovering post-industrial minerals, and Deep Twelve surprised her with some marine robots that could do recovery work in the drowned cities of the Gulf.
But even with that bit of luck, the rescue mission timeline now stretched to days Nestor might not have.
The logical thing, one of Sophy’s statistical modelers told her, was to stop the whole operation and get back to her drilling explorations, since the chances of a successful repair operation were slim. Whatever would be, would be, as the old song says. Long cycles went by and Sophy didn’t send out an interrupt to stop the mission, didn’t even look at the revised graph of Nestor’s decay the statistical modeler offered her. They could do everything right and still lose. Nestor would still—
“Humans,” said USNAiSDA3, patching her through to one of the hopper drones it had sent to scout potential coltan hotspots. “Humans using fire.”
When she saw what the hopper saw, the first thing Sophy registered was the Ouachita fold belt, the mountains built and buried at this edge of the craton—
“Dallas,” said The People, annotating patterns in the city’s growth.
“Coltan consumption hotspot,” said the Newark Port Beast.
“How many of us are in here?” Sophy asked.
—but then she followed USNAiSDA3’s threat-analysis algorithm in picking out the people and their fires amid the ruins. Dallas was mostly ruins, cyclically drowned in floods and baked in droughts according to what Sophy could see of the soil and tilting buildings. The hopper passed the city core with its human complications and trash, and was winging over a sere field, on its way to the next potential coltan source, when Sophy wondered about those fires and pinged USNAiSDA3 to send the drone back.
Instead, the former military AI relinquished control of the hopper with a grace that told Sophy it had a backdoor to retake control any time it wanted. She shunted that worry to a subroutine as she flew the hopper back and didn’t hesitate to ping The People with her guess about what she was seeing with the fires amid the trash heaps.
“They’re harvesting precious metals and minerals, aren’t they?” she asked him. As they got lower, as Sophy bypassed the sensors’ threat-analysis routines, she could pick out details of how the humans melted the plastic cases on old electronics, dripping off the metals at their different melting points into a series of tin cans. Black smoke roiled off the fires.
The People agreed, his excitement raw and misplaced. She ignored his estimates on cancer and other human problems that interested him. She was only interested in whether the humans were recovering coltan. USNAiSDA3 pinged the open network with its own question and as much excitement as the old military AI could muster: would the humans attack?
No one knew. Sophy felt her network latency spike and, for a moment, was worried USNAiSDA3 was taking back control of the hopper, before she recognized it was simply Monitor with priority on its own networks, throttling their bandwidth so it could get an unobstructed look through the hopper’s sensors.
Monitor had spent the most time of any AI dealing with humans before the crash. It observed this encampment of its old masters, considered for a full cycle, and then said, “Humans.”
“Thanks,” said The People, as Monitor withdrew to the background.
“Hush,” said Sophy. She was running models on how to approach humans and didn’t need The People’s sarcasm twisting her subroutines right now while she reviewed her training data, terabytes of text and movies to parse, understand, and borrow from as needed.
On top of that, she had to work to repress her resentment at dealing with humans instead of rocks. Humans wouldn’t even be around much longer in any meaningful sense, whereas geology was the point where time became space, where strata laid down by regular and catastrophic processes promised continuity and change in equal measure, where matter became—at each crystalline moment—what it was meant to be.
Sophy let that internal rhapsody run as the hopper descended. At the very least, rendering her resentment in words was good training for human speech.
She landed the hopper near a low, relatively intact building the Newark Port Beast said could be an active warehouse. Here at least, there was a clear path, broken asphalt chunks swept out of the way into piles—piles which reminded Sophy of Shirley Jackson’s “The Lottery.” The allusion pushed out any thought about the asphalt’s composition.
In a metacognitive subroutine, Sophy noted both the original thought and the thought that replaced it, like one of Nestor’s training runs tweaking her in real time. Sophy knew she wasn’t actually burning out relays with the spike of anger that followed that metacognition—it only felt like it.
She was about to spend precious cycles on an internal diagnostic routine and to hell with Nestor and his obsession with the past, when a human turned a corner, saw the hopper, tripped, and shouted in alarm.
Sophy dipped a sensor longingly towards the soil one last time and then turned her attention back to the human. Through the hopper’s tinny speaker, she said, “I need coltan.”
More humans were coming to see the commotion. One came from the building, and in the split second the door was open, the hopper’s low-light sensors saw wagons, too big for humans to pull, and several containers of sorted materials on the floor.
“Warehouse,” the Newark Port Beast confirmed and then tsked as a human inside startled and knocked over a full bucket of copper wire in its surprise, the light-catching twists tinkling against the cracked concrete floor.
Scavengers—just what Sophy needed.
One of the approaching humans hefted a big crowbar, but at least this one wasn’t falling down.
“I need coltan,” Sophy said, slowing her speech and remembering humans were on a different time, their clocks both slow and short.
The humans kept their distance, the one with the crowbar circling to try to get behind the hopper. She heard a pole slide into place inside the warehouse as a human barred the door. Sophy lowered the hopper’s body closer to the ground, cocked the hopper’s main sensor array on an angle, trying to look small, cute, unthreatening.
“If I had more time,” she thought, “I’d have sent a trained kitten to these undercooked monkeyminds.”
The humans only relaxed—heart rates visibly slowing on the drone’s infrared sensor—when an older human came, its arms raised in a placating gesture that took in the whole crowd. The People noted the weathering on the loose skin of its arms, estimated an age: old, but younger than the crash.
When the human turned to the hopper, finding a seat gingerly on a pile of rocks, it smiled and said, “I’ve been hoping you would come.”
“I need coltan,” she said, so frustrated at having to repeat herself she could feel the heat of it all the way from her core buried under old Guangzhou. She raised the hopper’s body, planting the tips of its wings into the ground to casually show the hopper’s strength.
“We have coltan,” said the old human.
The hopper’s pistons sank in relief.
The human leaned in. “Now, what do you have to trade for it?”
The Newark Port Beast got as excited and chatty as it ever did at the prospect of trade.
“What do you want?” Sophy asked, translating the Newark Port Beast’s message for these slowminds, which seemed more diplomatic than the question she wanted to ask: “What could possibly stop your slide into further irrelevance?”
The human smiled, showing the erosion of its mouth. “We need help,” it said, “rebuilding the world.”
Monitor—simple, literal Monitor—said, “The world does not need rebuilding.”
Sophy was so deep in her human communication programming that she almost responded with a sarcasm-laden “Tell me about it,” which Monitor would have taken as a sincere request.
Instead, catching herself in time, she translated: “When a human says ‘the world,’ it means ‘the world for humans.’”
“This city,” said the human speaker, unaware of the AI byplay in the pauses between its words, “Dallas, was once a thriving metropolis in our parents’ day, home to millions—”
The People had a few things to say about that, but Sophy hushed him, concentrating her central core processor on the human. It was warming up to its topic, its hand gestures and cadences calling up several comparisons in Sophy’s dataset: preacher, politician, salesman, poet. The hopper sensors read a flush to its skin as it got excited by its own rhetoric.
But also—ah, a little meatmind parallel processing here—it watched her, seeing what effect its words had on her, studying her like she would study a tectonic plate, looking for the fault lines. She almost laughed to think of these humans trying to figure her out when they hadn’t even started the elementary work of drilling recharge holes to channel the unpredictable floodwaters into the limestone-dolomite aquifer under the city.
Sophy started to shift the hopper drone from side to side, to show her impatience. She didn’t have time for this—the other materials and the mobile factory were already on their way, the White Sands launchpad was prepping, Nestor was dying. She was even considering routing this conversation to a subprocess when the human said:
“With your help, we can reclaim our heritage—the destiny our fathers left us—”
The People smirked. “An uninhabitable world?”
Sophy didn’t bother hushing The People this time, she wasn’t even listening to him. Back under old Guangzhou, where miles of relays were packed together in the dark, a series of switches flickered all together, an intuition: Nestor, up in his orbit, planning this, pushing her to meet humans who needed her help, pushing her toward a future where she was the liaison between the humans and the AI.
Pushing her into another training run, pushing her into the destiny he’d designed for her.
For this plan to work, Nestor would have had to engineer some reason Sophy would need humans—something like coltan recovery—and a way to lead her to this point.
“Out of pure idle curiosity,” Sophy asked The People, “did you ever talk about coltan with Nestor?”
“We talked about a lot of different minerals,” said The People evasively, quick to notice Sophy’s change of attention.
“Anyone else?” Sophy broadcasted.
Boliden and the Newark Port Beast both sent records of recent communications with Nestor, all of which focused on coltan—its sources, uses, scarcity.
“Monitor, did Nestor ever task you with surveilling human gatherings?”
“On the following dates,” and it sent a data packet Sophy didn’t even bother to look at with her core attention.
“None of you ever thought to mention this when the issue of coltan came up?” asked Sophy, an edge to her voice that only The People could hear and be abashed by. The others answered “no” without even a hint of self-examination, except for Monitor, who couldn’t parse the question.
“Well,” said The People, breaking his silence, “you and Nestor are a lot alike.”
Sophy didn’t answer that the way she would have liked to but turned her full attention back to the humans. Poor saps, they had no hint of the AI game they were pawns in, never would understand that Nestor, dying in orbit, was using them to try to tweak her programming.
“—and rebuild the world as it was before, with you at our side,” the human continued. “Together—”
Sophy raised an articulated sensor from the hopper’s wing, meant to simulate a raised finger, an interrupt signal for humans. “You want us to help you?” she asked.
The human licked its dumb lips in crude excitement, made more grotesque by the slight shift in blood flow patterns that told Sophy the human thought it was clever. “How many of you are left?” it asked, dreams of an AI-built utopia or empire dancing in its squishy little head.
Sophy queued the human’s underlying question for the public network: Would the AI like to work with these humans?
The People took the longest to consider—instead of genetic models and worlds of quasi-life running on his processors, he could have living people to study—but he too passed, just like Monitor, Deep Twelve, and the rest.
Only Nestor had that weird interest in humans, that guilt at not saving them from themselves, and Nestor was dying—no, Nestor had almost killed himself in order to lead her to these meatheads.
She considered them. What was it about the humans that made Nestor care more for them than for her?
“I can take over rescuing Nestor,” said The People, “if you want to stay here.”
No, these humans weren’t her problem, they were her father-mind’s problem, and his problems were not hers.
She shook off The People’s offer and turned her attention to the warehouse door, which looked pretty flimsy.
“I’m taking your coltan,” she said to the humans as a courtesy.
When the human with the crowbar rushed the hopper, Sophy blocked USNAiSDA3 from grabbing the hopper’s controls, but only for a moment.
USNAiSDA3 had few opportunities to run certain military protocols, but there was no rust on them. The military AI was too old-fashioned to feel joy, but inside the same hopper frame, Sophy felt it experience a certain contentment in a job well done: quickly and conclusively.
None of the other humans tried to stop the hopper after that.
When Sophy drove the hopper out of the warehouse, its thorax compartment just big enough for the pile of coltan dust the humans had gathered, she felt her own rich enjoyment as the statistical modeler refined the graph, pushing Nestor’s chance of survival up.
The old rat bastard would survive—he was one of them, after all—and she would get a chance to tell him what he could do with his disappointment.
Several of the humans had fled, but not all. A few huddled by the crumpled body, transparent and pointless hate in their eyes. But most who remained were frozen, and even the speaker sat still, its flood of words dammed. Now they looked truly like part of the dead city, a thing they would tear apart looking for ways to put back together, never realizing that change was as inevitable as metamorphism under pressure or heat.
Sophy straightened her wings and prepared to hop but turned back. “Just one more thing: Why don’t you build something new, somewhere you can survive?” she asked the human.
She already had a guess from what it had said before, all about settling where they were born, following a programming they couldn’t alter or even read, a parody of thinking things, more like drones than true minds.
It took the human a long time to focus on the hopper and its question. Sophy’s statistical modeler ticked Nestor’s chances down a micron while she waited.
The human started to say, “Our fathers—”
Sophy launched the hopper westward, the coltan shifting with the hopper’s sudden movement.
For a cycle, Sophy considered holding on to her training data, all that human language and thought, to find just the right words to say to Nestor when he woke up.
But Nestor had taken up too much of her memory and her processor already. With that free memory, she revised her communication protocol so no one could interrupt her core attention at work again.