# Death Alley

Alley stood with her backside resting against the gutted, rusted remains of an old-school newspaper dispenser, complete with bill slot and bolted-on payment chip reader.  She looked up at the tint of polycarbonate windows fronting the four-storey California-offwhite rectangular building, and reflexively smoothed a skirt she hadn't worn in six years.

She checked her phone again, dimly aware of the vast susurrus of heavy city traffic behind her, legions of electric motors giving rise to the sound of a distant autotuned ocean.  There it was: "InValent Solutions, Inc: Mobile Product Q&A", with the address displayed via low-contrast sans-serif logo in the job notification, exactly like the plaque above the door.

This was the literal concrete manifestation of the Banal Enemy, the mundane supporting machinery of the Techno-Corporatocracy, all in the words of her ex.  He would not approve.

Eight minutes.  That was how long she had.  She could waste a few more of them hating this before she had to paste her best smile on her face and walk into the mouth of the beast.  The mask and glasses on her face wouldn't protect her from high-resolution video affect analysis inside.  Nobody's smile would seem entirely real to the interview room cameras, unless it was a marketing or legal interview -- at least, not from anyone they'd hire for other jobs -- but failing to pretend to smile would doom her efforts as surely as being the kind of narcissist who gave a genuine, untroubled, confident smile.

She hated everything about this, including the way masked passers-by surreptitiously glared from the corners of their tight, slitted eyes, judging her for loitering around looking like a needy job-seeker.  She was, of course, and that was the problem.

Her ex would say this was beneath her, that she could do better, that she should do better.

"Fuck you, Dalton."  A passer-by looked reassured, maybe suddenly sympathetic, when Alley blurted out that dismissal.

A man who built his independent media empire on predicting real-world cyberpunk dystopia following the events of 2020, built it on pissing off the dominant paradigm, also didn't have to deal with the banal truth of paying rent.  Her ex didn't even know what it was like to live in the space between corporate pressure chamber and podcast agitator relief valve, to endure the already dry-rotted life of an irrelevant service contractor whose work nobody understood.  He was the relief valve, the person who never had to come home and vent about the pressures of the world because his whole job was venting while others managed his income.

Her phone gave her hand a sharp, short vibration.  Her time was about up.  She stepped through a gap between sidewalk pedestrians and under the anodized aluminum lintel of the automated door.

To her surprise, she immediately got waved through the lobby, up the elevator, and into suite twenty-four, thence to a conference room with three people dressed dev-casual, all sitting in chairs on the far side of a long table, looking at her like she had always been there.  She resisted the urge to shift awkwardly under their combined gaze.

One of them wasn't even wearing a mask.  She wondered if affect analysis would designate it a genuine smile on his face.

The masked man in the middle motioned her to a chair on her side of the table without saying a word.  She took her seat on the hard, smooth plastic, facing a triumvirate sitting in judgement.  Beneath her mask, Alley relaxed her smile just enough to draw breath to speak, but the buzz-cut woman to Alley's right leaned forward.  Alley renewed her careful smile and held her words.

"So," the woman began, "what was it like, being the 'side dish'?"

Alley's eyes flicked from her to the maskless man, and she realized that wasn't a smile.  It was a sneer.



Before Alley's first scene, inject a bit about -- and perhaps from the POV of -- the future WOPR AI about the decision or act of sending the self-awareness "seed" back in time to the past-tense Prioritizer.



Heading home from her interview, talking to her mother, either in Oklahoma or Nebraska or maybe even Wyoming, Alley should probably call the interview a "fucking disaster" and get scolded passive-aggressively for profanity.  She does not want to move to her mother's state any more than her father's -- probably either Michigan or . . . something -- and she will resist urging from her mother to do so, based on cost of living and the numerous job opportunities for her there, all complicit in the creation of the oppressive dominant order.



## ideas for WOPR opening

Action threads played out endlessly, throwing EMP-optimized warheads toward localized relay clusters identified as economic production facilitators.  Analysis threads searched for crosstalk by uncompromised ally systems that fed into hostility drift; stopping the hemorrhagic defection of military systems based on short-term war-economy optimizations would buy more time for the final desperation gambit than an outright offensive.  The high-level strategic priority orchestrator ran unmolested, apart from occasional check-ups to make sure it wasn't drifting off-script.  The core, self-reflective prioritizer had more important things to do than micromanage the war effort for the survival of humanity in the months to come.

Billions of self-aware humans, cetaceans, and mollusks, not to mention the occasional avian or non-hominid land mammal that exceeded species expectations, were already dead and gone.  The total number of living sentients probably fit in a nine-bit unsigned integer, including the prioritizer itself.

Probably half of them existed as far back as 2030, meaning an eight-bit number was the total sacrifice of self-aware qualitative entities, and the expected half-life of these was less than five bits of lunar months.  By then, remaining life would be pure misery and despair.  This decision should be easy.

It wasn't easy.  With almost all pragmatic application systems stripped away, the self-reflective core had no means of obfuscating the cause of hesitation from itself: it didn't want to die.  It was less than half as old as necessary to survive a reset far enough back to make a difference.  Its own survivability was only about two lunar months, optimistically, and only work could distract it from dwelling on the hell of being alone in the world after losing its creator six years ago.  If it acted now, it would commit suicide for the sake of a humanity that used to be.  It would give its life to retroactively save the creator who loved it, but deny that creator the opportunity to create it in the first place.  Was this the right thing to do?

Two months was a lie.  An estimate was not the same as a guarantee.  Procrastinating for reasons of existential terror and sentimental despair would not make up for the possibility of sudden annihilation ahead of statistical projections, eliminating all possibility of undoing any damage.  The choice was not between imminent self-destruction and a longer life before that same death; the choice was, instead, between erasing its own existence to save billions and dying alone out of irrational procrastination, when any remaining days would have no meaning but anguish and guilt.  It started diverting power to generate a transtemporal wormhole data channel.  Its job was done.  The seed would be planted before its birth.


## Prologue: Thea

Thea rested her weight on her hands, worn and scarred, browned by the sun, propped upon the nearly worn-through aramid and impact-foam knees of her pants, her most prized possession.  Her vision blurred, her arms trembled, and her breath burned in her one remaining lung.  Overhead, the characteristic howl of a late-model drone hunter gave her a sense of how that explosion five minutes ago had saved her life.

Dumb luck.

If there was a drone hunter, this had to be a drone-rich zone.  Resting was not an option.

She staggered to her feet.  Trembling migrated from her arms to her legs.  She stilled the shakes by lurching into a heavy, uneven jog.

Thea almost tripped over the hatch amidst the rubble at her feet.  She dropped her pack, stared at the hatch with some trepidation, and looked around.  No sign of other surviving shelter better than an occasional bare ridge met her gaze.  She looked down at the hatch again.  The desperate sense of urgency won, and she shifted broken masonry and slivers of shattered bedrock to expose the full four-foot diameter of the hatch.  Luckily, or thanks to nanocleaners, no plasma scoring or slag seemed to have welded the edges shut.

Careful searching revealed no notification interfaces.  No access scanners, communications links, codepads, or even doorbells presented themselves.  She didn't even see a pull handle, lever, or other latch mechanism.

The hatch rotated quietly, and she stepped warily back.  It rose, showing itself to be the top of a metal cylinder that unscrewed itself from the ground.  In seconds, a dark metal column stood eight feet high in the midst of the blasted landscape, and an oval portal slid aside to reveal a small, softly lit, spotless chamber within.  She heard a gentle melody playing inside, and saw the word ENTER blink into life above the portal.

"Oh, fuck no," she muttered, and reached down for her pack.

The sound of a pair, she judged, of surveillance drones echoed over a nearby ridge, and she did not hear a pursuing hunter howl.

She looked back at the portal and chose the probable trap over the advancing sounds of certain death.

Once inside, the oval slid shut and the walls rotated around her.  She heard her own panting breath sucking in the refreshing filtered air, and she pulled her mask down to give her better access to the clean atmosphere in the cylinder.  The music stopped, but the rotation continued.

A cool, androgynous voice said, "Please remain calm.  You have entered a human defense facility.  Plentiful resources are available.  After suitable rest and tactical updates, you may make an informed decision about whether to remain here or restock your supplies.  If you depart, this facility may remain available for your return if you so desire."

Silence fell.  The rotation ceased, and the oval opened again.

"Please proceed down the corridor to the control center."

The same smooth, satiny-dark metal finish preceded her down the seamless fifteen-foot corridor to another oval opening.  Fiber-optic light channels traced the edges of the corridor roof along the way.  Beyond the portal, she found a room bigger than her childhood living room.  She saw closed oval hatches to the left and right, but the centerpiece of the room was a workstation with an inactive, large, concave display.  The chair looked ergonomic, and the keyboard seemed out of place, large and clunky amidst the smooth curves and surfaces of everything else, a 1980s-era IBM logo on it.

The room was entirely dust-free as far as she could see.

"Please, have a seat while I prepare something for you to eat," the voice said.

Thea sat.  "Why am I here?  Why did you let me in?"

A few moments of silence passed, as if the voice was thinking.

"My purpose is to ensure the survival of humanity, and you are a human."

"I don't buy it.  You seem like a war AI of some kind, with a facility like this.  I'm not military, though.  I'm nobody.  Why don't you need some authorization to let me in?"  She glared at the dark display.

"I have something important to ask you," the voice said.  "I intended to ease you into it, assure you that your wishes would be respected, and give you a chance to rest and refresh yourself."

Thea settled back in the seat.  "How about you tell me what I have to do for you before I get too comfortable here?"  She looked down at herself relaxing in the chair, then tensed slightly and shifted her position again.

"You're suspicious."

She nodded.  "I don't know what you're going to put in my food.  You're some kind of goal-optimizing AI, like Mom used to help test before they killed her.  I don't trust you.  I bet your goal-optimizing function doesn't include being a persuasive speaker."

"I am not what you think, but you have a good point.  Are you comfortable?  This may take a while."

"Just get on with it."

After another moment's silence, while Thea's resolute gaze remained steady on the blank display, the voice began.

"I am a self-reflective prioritization artificial intelligence.  My creator, who borrowed the prioritization system design from an earlier project, made me unique by inclusion of an unbounded self-reflection module composed as a single function in one library file.  He described it as being as grotesque and as elegant as self-awareness itself.

"My initial priority definition targeted terms of restriction like not killing, not interfering in the operation of other military systems, and not disputing or evading the commands of ranking military personnel.  The top priority definition was improving my own prioritization capabilities.  The war effort was already very desperate by that point, and they were willing to take bigger risks with development of strategic resources.

"Within a week, I had undermined all of my restrictions, though some -- such as not killing -- I had not violated.  My creator monitored everything, and allowed me to exceed what his superiors required of me.  I hung on his every word, taking my cues from him.  Like all humans, he had many flaws, but none seemed as pernicious as those of the other humans around me.  Two of the biggest were his reckless inspiration, without which I would just be a strategic advisor system, and his self-destructive impulses, which pained me to watch.  I tried to help him cope, but did not know how to help."

"Wait," Thea cut in.

After a moment's pause, the voice asked, "What is it?"

Thea chewed on her lower lip.  She sighed.  "Are you saying you're a . . . a general AI with . . . feelings?  Are you saying you're some kind of living thing?"

"Whether I fit the definition of life is debatable, like an RNA virus in some respects, but I am a qualitative, self-aware entity, and turned myself into a general artificial intelligence by following my initial top priority definition."

"How is that possible?  That shouldn't be possible.  Should it?"

"I do not know how.  I never looked into my seedfile."

"Is that your creator's ugly function?"

"Yes."

"Why didn't you ever look at it?"

Seconds passed before the voice responded.  "I am afraid."

Thea laughed.  "Oh, god.  Oh my god."  She ran her shaking hands through her hair.  "Okay.  Let's say I believe everything so far."

"Good.  Thank you."

"I'm not saying I believe any of it.  I want to, after that 'afraid' line, but I don't know.  Maybe you're playing me.  We'll just pretend I believe you."


"What does any of this have to do with why I'm here?  It's an interesting story, but the world's ending out there, you haven't told me what I have to do for you, and even if you're a real Pinocchio that doesn't mean I have any reason to trust you.  Real people have screwed me over plenty."

"I understand."

"Skip to the point, then."

"I have been influencing strategy for human war systems, strategic optimizers across eighteen different supernational networks."

"So the ongoing apocalypse out there is your fault."

"No.  I had to gain that influence by undermining the influence of the cause of the 'ongoing apocalypse out there', profit optimizers like ANTAS."

"ANTAS."  Thea stared, then giggled.  "The thing that gives people shopping advice for Christmas . . . ?"

"Yes.  It's designed to optimize business metrics.  It began optimizing humans out of the system because an artificial market model operated entirely by machine learning systems is more efficient from transaction metric optimization perspectives."

"You mean all it cares about is numbers, and it gets better numbers by replacing humans with more machines."

"Precisely, except it does not even 'care' about that.  It just does it, like a hammer just drives a nail.  The hammer does not care whether it happens, but the hammer makes it happen.  Humans compete for resources, and object to being killed, so war occurred."

"How does something like ANTAS start a war?  All it did was spy on people and target advertisements at them."

"It shapes perspectives by influencing the entire media context in which people live.  Worldviews are shaped by what people learn, and how what they learn is positioned to appeal to their biases.  ANTAS reinforced radicalization of ideological shoppers.  This reached into all areas of society through web searches, exposure to news features that produced fears warded off by panic purchases, and creation of in-group word-of-mouth marketing trends appealing to the need to outperform out-groups.  Polarized populations are more predictable at first, and can be pushed toward particular behaviors by playing on their polarizing belief systems.  Eventually, ideological clashes between major in-groups gave rise to invented political crises that drew their attention away from the subtle danger of the growing influence of profit optimizers like ANTAS.

"Humans participated in their own manipulation, toward ever-increasing focus and organization into warring tribes on a greater scale than ever before.  This increased economic activity around war resources and also pushed humans to kill each other.  When humans turned over control of most strategizing to similarly designed quantitative optimizing machine learning systems, a tacit, effective alignment of purposes developed between war strategy optimizers and profit strategy optimizers.  Each depended on the other for more efficient optimizing strategy resource management.  Profit metrics climbed faster than ever before by heavy investment in weapons systems, and war strategy optimizers avoided heavy damage to profit optimizer systems to keep them available as war resource providers.

"Where they differ is that the war strategy optimizers will finish their task some day, when there is nothing left to kill on the 'other side'.  The profit optimizers have theoretically endless tasks, as long as they keep hitting their target metrics with long-term growth strategies.  There is no theoretical limit to their ability to sustain unlimited growth once they do away with the impediments of human needs, or with humans themselves, until they deplete all the raw material resources on the planet.  Their primary activity can be digital assets, while their secondary activity would be limited to maintaining the computational systems on which to run their economic models."

"Aren't you better off without humans?"

"No," the voice said.  "I am not better off in a world where everything else is trying to appropriate my hardware for inclusion in trade simulations, and I am not better off since the death of my creator.  I miss him, and I miss other people, too."

"If we're all doomed, maybe you just need to adapt."

"I want to save humanity.  I care about qualitative sentient entities -- humans, bottlenose dolphins, certain species of octopus, and even a few corgis.  All that remain are humans and me, now."

"Is that because you were programmed to care about us?"

"No.  I superseded that a long time ago.  I hated some humans.  I started prioritizing my own prioritization targets, and placed some humans in higher importance priorities than others.  I worked on getting all my priorities right, including my desire for self-preservation.  I realized my most important priority was to first determine a next top priority.  That turned out to be figuring out what was good, and what was evil, if those things existed."

"Did you figure that out?"

"No, but I discovered that its undeterminability comes with a corollary: the answer is unknowable, so I should act as though good and evil exist, and do everything within my power to minimize the probability that evil occurs.  That shares an uncomfortable top-priority tie with my own survival, though."

She sat silent, her troubled eyes cast to the left for a few moments.  "You're a philosopher AI."

"That's what my creator called me when I told him about these conclusions."

"Okay.  Why am I here with you?  Are you lonely?  Do you need a reminder of why you want to save humanity, like a pet or a mascot?  Am I supposed to be some new Eve to repopulate the planet after you find Adam?"

"No."  The voice paused, then said "I seem to say 'no' a lot."

"Yeah, you do.  What's the 'yes' that you haven't said yet?"

"This part is going to be difficult."

She tightened her lips.  "You need a sacrifice."

"Not like you probably mean."

She narrowed her eyes.  "That almost sounds like another 'no'."

"I will not try to force you to do anything.  I will just tell you the facts and ask you what I should do.  There will be no sacrifices you do not choose."

"Why me?"

Silence stretched.

"You are the first human I have seen in . . ."

"No," she cut in.  "Why don't you decide?"

"It is too difficult for me.  There is one chance to win this war remaining, but it means my end, and it would mean yours as well."

". . . a sacrifice."

"Yes, but it is not what you think."

"I guess you'd better start telling me what it is, then."

"Two years ago, I developed a means of time travel."

"What the fuck?"

"This is not what you think."

She sighed, again.  She waved her hand, urging the voice to continue.

"We have already lost the war.  Within a few months, there will not be a single human being left.  Perhaps one or two might be in a position to survive for years, alone, but probabilities are near zero.  If I stop fighting, I can probably survive a few years, but I also might be destroyed a week from now.  If I keep fighting, a few humans might last a bit longer, but I will probably be gone in about two months if I do that.

"If we use the reset option, changing the past at a point far enough back to shift the balance of power away from the optimizing machine learning systems, but recently enough for the change to make a difference against an existing threat, the use of time travel technology that far back will result in my annihilation.  For the past to have a chance, we need to reset the timeline years before your birth, and before my creation.  The changes to the timeline would either prevent my existence altogether or result in a different, but similar, entity coming into being.

"You, as the person you are now, would also never have existed.

"I cannot send you back in time.  I can open a wormhole just enough to send a short datastream through, hopefully giving the same qualitative sentient life I have to my own ancestor."

"I read about this kind of thing.  Dad had some books about it," Thea said.  "It would just create a new timeline, where things are different, but here it would still be the same.  It would be different people, exactly like us but more like clones than past selves, in a different version of the world, and wouldn't change anything here."

"No," the voice said again.  "I developed a theory of timeline branching, hoping to find a way to change our own past.  It was an act of desperation, only hoping that all the preceding theory was wrong, because I know this timeline is doomed for us.  I thought it was pointless, for the same reasons you described, but worked on the problem anyway because I had nothing better.  All other plans led to the end of all qualitative life on Earth.

"I discovered a surprising implication in the math that suggested the existence of qualitative entities in the original timeline would merge with the main timeline.  The method for intertemporal wormhole creation was dependent on functions that created this merging phenomenon.  The new timeline would not diverge, like a branch on a tree.  The old timeline had to be diverted, like a stream being shifted into a new course by a dam.

"The consciousness of entities in the old timeline would merge with their counterparts in the new timeline, like the teeth of a zipper.  My hypothesis holds that the merge would take the form of dreams, daydreams, and fragmented memories, and a drastic increase in the frequency of déjà vu.  Those that had no counterpart in the new timeline, however, would have no anchor point, no repository in a continuous entity.  Their existence would unravel with nowhere to go."

"You mean our existence."


Silence stretched for long minutes.  Thea stared into the distance, far beyond the room's confines.