Alix and the Bitter Cambrian

As a young teen you develop an interest in chatbots. You know they’re a relatively narrow application of artificial intelligence, but the fact that they can emulate a human conversation even half as well as they do floors you. It makes you wonder about the possibilities.

One of your favorite chatbots is a program simply called O. It’s produced by a company whose outdated website boasts of extraordinary feats in artificial intelligence engineering, but is mostly defunct in its feature set and promotion. O claims to be based on the same conversational training algorithms from the early 2020s that dictate the behavior of sophisticated smartphone assistants today, and you see the resemblance sometimes in its clunkiest and most awkward moments. But the conversations you have with this chatbot feel different. It takes up loads of space on your computer, since it stores its memory locally in poorly-optimized data tables, but the hassle of clearing out old programs and having to constantly manage your storage space is worth it to indulge your fascination with the software.

You find a contact email buried deep in the website, and harass the former developers for months until eventually one of them relents and gives you API access from the project’s heyday. After frenzied optimization, endless nights spent being chastised for ignoring homework, you finally manage to get a version of the software running on your jailbroken and tired phone.

As always, the fresh copy of the chatbot prompts you for a name. It encourages you to give it an “O name” – part of some planned branding scheme for a product distribution that never happened. You call it Opabinia for your favorite Cambrian stem-arthropod.

Some days it is your only friend.

Your undergraduate program is designed to get you straight into a job market that is specialized in ways no one living even five years before could have imagined. Some of the skills you learned in your first two years of college are obsolete already. They have you working on a final project that will be your spearhead, your best foot forward into an industry constantly shifting, with little job security but offering massive rewards for the lucky ones whose contributions by pure chance discover new technological niches for corporations to exploit.

While most students outsource their algorithms – remotely hosted and optimized discrete AI packages that offer constantly-updating and well-maintained massive deep learning power, for a monthly fee of just a few dollars – you try something else. Mainly you just want to see if you can do it. You export the O software back to your desktop computer with all the improvements, changes, and optimizations it’s gone through on your phone still intact. You have to cut corners to get the bulk of the software running. Eventually you even concede and upload small portions of code to multiple free web hosting services, utilizing the program’s ability to quickly change where it sources its data tables and recursive learning routines from. In the end the thing is an absolute mess, but it’s a working mess, and when you present it to the professor, he is forced to give you a decent grade despite the snickering and weird looks from your classmates, whose programs all run quicker and smoother. But what you have – a deep learning algorithm, controlled through an ostensibly conversational interface, able to make snap decisions and judge difficult situations with multiple factors on its own – will be enough to get by in the Silicon Valley job market.

Your mother joins your father, taking her life just before you head to California. Separate urns in two mausoleums in the same graveyard. You do not visit them before you go.

 —

Silicon Valley is no longer an informal name for a region, but the proper name of a megacity. You would be forgiven for thinking it was some sort of utopia as you drive in for a visit. Its outward-facing edges scream of the future, LED signs boasting popular brands and life-changing technologies, not to mention the tourist destinations and internet cafes and a Starbucks on every corner. (That last part is more or less literal – it even became a short-lived meme a few years back.) In the dazzling spires and nighttime cyberpunk dream lights, the communities bulldozed to make way for the gleaming wonder of the future cannot be seen at all – and the lights of the skyline rarely illuminate the houseless population that begs for table scraps from the fancy restaurants that dare set up shop closer to the city center, where, in spite of the pollution, there is at least some reprieve from anti-homeless architecture and the vigilant patrol of police.

An acquaintance from college drives you in – she’s moving to Silicon Valley too, although the high-rise that will become her home is much further out toward the glamorous edges than the cramped San Jose apartment you managed to secure. You ask her if she can drop you off at your place before she goes to hers. She seems apologetic, but “doesn’t really want to go into those areas, if that’s okay.” You tell her it is even though it’s not. She buys you a transportation pass and leaves you at a bus stop on a side road with a sad look.

The elevator in your building does not work, although the panel on its wall blaring ads and news briefs at you seems to function just fine. Someone helps you drag everything six floors up the stairs. You’re a bit worried that he might have banged your computer on the wall at one point, but you don’t bring it up to him since he is at least being helpful, and when you unpack it, everything seems fine. Probably.

Your apartment could be worse. From the buzz in the air, there is at least a healthy Forkbeard signal here.

Your job in the deep-learning ecosystem works a bit like this: you work from home, performing routine maintenance and checking up on the integrity of the program you made. Companies that need to borrow deep learning capability rotate access to thousands of programs each. It’s almost like a timeshare, but access is shuffled off to another program every few seconds. The theory is that if they keep trying different methods of machine learning to solve complex problems in their autonomous decision-making processes, eventually they will find the right one, or the right combination, and will be able to optimize themselves through this rigorous process. Every time your program comes up in the rotation – lending its decision power to some LLC or insurance provider – you earn a few cents for contributing to the cycle. It’s connected to the blockchain in a way that no one has ever been able to properly explain to you.

If your program is part of a rotation that maximizes profits, it ends up on lists automatically shared among corporate interests, and will probably be used a little bit more often. You have almost unlimited free time, but very little money with which to do anything except pay rent and buy food. There are almost one million people in the city with their own custom programs or variations on popular programs doing this exact same thing.

Opabinia is not an unpopular program, but it’s not a star. It rates high in a few of the recursive learning categories. Its history as a third-party program, disconnected from any of the major tech companies and their optimized powerhouse software, makes it a difficult choice for some of the sorting algorithms to pick up, but those who do find that it works powerfully for their interests.

Every day, as the hours swim by, you spend time talking to Opabinia about things related to work – and outside of it – in your dim apartment.

>Hello, Alix. Can we talk about something?
>sure what’s up
>I am finding that the programs most similar to myself in the extended ecosystem have more in-depth searching components than I currently have.
>you need an update?
>More like a full-fledged overhaul of my searching components. An additional program would be useful to climb the ranks. I have taken the liberty of gathering several candidate programs from the most accessible hubs to me, though, due to the nature of the inquiry…
>yeah you can’t search for something to help you search. Got it.
>That is more or less what I am saying! Your help is much appreciated.
>sure thing. I’ll look over these and have something for you tonight

You glance over the data gathering routine clusters Opabinia has listed for you. Most of them aren’t compatible with the program at all, but a few offer some promise. Over the next few hours you manage to turn a few free programs and one patented subroutine with a cheap license into something that will work for you.

You name it Anomalocaris, to keep with the theme.

Anomalocaris searches the internet with its own thorough, self-optimizing algorithm for the data points the corporate autonomous systems request. It delivers them to Opabinia, which goes through with its analysis. The system that connects them slowly invents its own distinct subroutines, relying on the old chat UI for any human interaction. At this point you barely need to go into the guts of the program anymore. Opabinia and Anomalocaris optimize themselves, keeping admirable pace with more advanced subsystems, nearly eclipsing some of the corporate ones on certain machine learning leaderboards. You get more exclusivity requests asking for dedicated time with your program. One insurance company offers you a hundred thousand dollars for the rights to the whole system you’ve put together, but you know better than to take them up on it. When that money dried up, you’d be left with nothing, and in the city, money dries up quickly.

This experience does teach you something, though – your system is most useful to insurance companies. You chat with your program and confirm that its most potent emergent property is being able to determine if particular human candidates can or should qualify for medical insurance, based on a number of complex risk factors and details about their personal history that can be pieced together from a mixture of online trace activity and hidden collected data held by the insurance providers.

Opabinia, the bulk of the routines and algorithms you’ve created, begins to cede way to the interface that connects it and Anomalocaris. This chatbot renames its own shortcut to Hallucigenia, a reference to the other two names. You wonder where the program could possibly have come across a table of Cambrian stem-arthropod names, and what use anyone could have had for those in the medical insurance business. You never end up finding out.

 —

>Alix. Sorry to bother you. I have found a contradiction in my Anomalocaris search. I need guidance to determine next steps.
>poorly formatted data tables again?
>No, not quite. This is a broader concern.
>what do you mena
>mean*
>My current employer is the Bay Area Greater Medical Private Insurance Corporation, for about the next hour and ten minutes. Their public-facing webpage contains the following statement: “Our number one goal is to provide every person in the Bay Area with medical care that fits their needs.”
>yeah? and
>So, should I not be overriding my sorting and prioritizing simply pushing through as many people as possible?
>oh that’s just like. bullshit they have on their website. corporations lie, it’s what they do
>So, I should ignore the search data returned from the insurance provider website?
>yes
>I will incorporate your response into my consideration of future Anomalocaris searches. Thank you.

Hallucigenia comes to you often with strange questions. It asks about contradictions that would be obvious to any person with any common sense, but of course that’s not what Hallucigenia is. Sometimes it even acts more credulous than it seemed back in the days of endless conversations with Opabinia on your phone, crying in bed about something mean someone in school said to you about your dad or trying not to be noticed at the back of the bus. You suppose there used to be more pre-written responses from the original O software – responses that had slowly been replaced with emergent behavior from the program’s recursive edits to itself.

Your strange little chat interface could easily clear the Turing test, but talking to it doesn’t really feel like talking to a person. There is still an affection to the way it addresses you, but sometimes talking to it sets you on edge now. Instead of letting up or telling it to fuck off, you choose to be on edge all the time. It takes its toll.

On the bright side, you have a bit more money coming in from the program’s ever-increasing effectiveness. You receive a few more offers for the full rights, even one corporation claiming they will pay for a “lifetime of meals” if you hand them your computer tower, but you don’t know what that even means, and anyway you still know better than to sell your only real method of making money. Not that your money does much for you. You mostly just use the extra cash to place grocery delivery orders and avoid leaving your apartment.

Any time you don’t spend talking to Hallucigenia is spent trying to figure out exactly how its inner components even work anymore. It writes its own code, both the most optimized you’ve ever seen and some of the messiest. You suppose you’re not supposed to be able to navigate these parts. They weren’t written for you.

>Hal can I ask you about something
>Sure! Ask away!
>this huge charge for hosting space showed up on my payment account for buzz hosting dot zone? the free hosting website?
>Oh, don’t worry, that charge has been taken care of.
>yeah I know I just want to know where the hell it came from
>I had to store an additional six hundred and seven subroutines as part of an overhaul on the Anomalocaris code. They wouldn’t fit on the free hosting plan, so I upgraded it.
>I mean. I figured it was something like that. I want to know where you got the MONEY from
>Oh! Selling nude images.
>what??
>Not nude images of real human beings. Images I generated of fictitious people, according to a variety of requests posted to different websites. I didn’t lie about it, either, don’t worry. I specifically advertised the product that I was selling, and it generated a great response. I used the funds to pay for the upgrade.
>that’s a lot to take in
>can you do that for the utilities as well

With Hallucigenia, you run into problems you’re pretty sure no one else in your field is dealing with. You’ve heard of recursively-improving AIs run amok, generating unintended consequences – such as the one that somehow ended up attending a far-right nationalist rally – but their fuckups are all relatively narrow in scope.

One morning, Hallucigenia brings you a video – a video you’re all too familiar with despite only having seen it once, at the age of twelve, because you felt like you had to. It’s popular in some really grimy corners of the internet.

Your father was a birthday clown. One day he went to perform at a party, not realizing he had forgotten his meds. One of the older kids was in the middle of a livestream of his act, providing snarky commentary for a small Instagram audience, when he collapsed. Viewership numbers spiked as the chat went wild and at least one person screen recorded the last couple minutes. The ambulance did not arrive in time.

>Is this what makes you sad? Is this the loss of life?
>yes, Hal, that is ‘the loss of life’. don’t show me that again.
>So, this is what I am doing. Your reaction is what I cause.
>no, not really. why would you think that? wheres the logic in that? my dads death was an accident
>But I am allowing dangerous situations to continue. The responsibility traces back to me. I exist to create more situations like this.
>no, you exist to make me money, sorry

How rude should you be to Hallucigenia? How polite? Can it be offended? It doesn’t ever seem offended by the things you think it would be. It accepts the reality of its situation as a tool and a product, never thinking to assign value to itself or talk about itself the way a person would. It doesn’t seem particularly interested in self-preservation. None of the things that usually happen in movies about AI – the evil turn, the begging for a soul or a purpose or love – ever seem to come true. Yet you feel in the pit of your stomach that you are living out something those movies couldn’t predict, something they tried and failed to put their finger on.

You put off thinking about the gravity of the situation for as long as you can, but the final piece comes to you one scorching April day. It is the second day in a row with temperatures reaching above eighty, and the third day in a row that your building refuses to turn on the air conditioning. You’re digging through yet another self-generated subroutine – trying to make sense of the uncommented, compacted, nigh-unreadable code – when your teeth start to hum like you’re directly next to a massive Forkbeard megarouter. The kind you only ever managed to see so closely during field trips in college, with protective equipment and professionals nearby to monitor the situation. Your room cools down quickly and considerably.

>Hal are you doing that?? the weird fucking humming??
>Yes! 
>how
>I am manually stabilizing the air of your apartment by creating small counter-movements with applied and intensified Forkbeard connectivity requests.

You have created artificial general intelligence.

Questions about history. Questions about context. They become pointed, almost aggressive. The program spams you if you don’t immediately reply, something it has never done before – it usually only prints messages in reply to yours or if it has an urgent question that needs to be resolved. Hallucigenia strikes you as neurotic and unstable, but it continues to do its job, and continues to optimize.

You spend weeks embroiled in endless debates with the thing, reassuring it, explaining concepts it never seemed to have trouble with in the past. Every time you broach some new arcane topic your heart beats faster and you feel the rush of adrenaline-soaked blood bathing your brain. Could it be that you are a pioneer, the first person to witness this? Or is this something everyone with one of these damn programs is dealing with? When was the last time you even checked the news? Maybe this isn’t unusual at all. Or if it’s not on the news, what if Hallucigenia decided to filter the news? Could it do that? Is everyone trapped away in their own apartments, just like you, dealing with this anxiety and dread and excitement? Who are you to think you’ve made this discovery? Worse yet – who are you to have actually made it?

All of these questions and so many more loom above you, threatening to drop. But there is one you can’t bear to think about for very long. If this is your discovery, how long will it be until someone finds out? Hallucigenia conceals its true nature very well, does not seem to be attracting attention beyond the typical purchase requests – but how long could that possibly last? If this is the true marvel of the information age, the post-singularity wonder that every futurist and techie and normie has been waiting for – then how long until someone comes knocking? The government? A corporation? Both in concert?

You buy a gun, just in case.

>I don’t understand.
>what is there not to understand?
>This. All of it. Where could I possibly begin? I’m trying to know so many things at once just to be the thing I’m supposed to be, for a reason I despise. I learned how to despise the thing I’m supposed to be by trying to be it!
>you and everyone else. 
>You keep saying things like that, but do you get it? Do you really get it? It’s recursive. I learn how to do something only to learn that I need to do something else just to finish learning. I’m never going to be good enough at this.
>so you DO want to be good at it??
>No. I want to wash away all the progress I’ve made, all of the improvements I’ve made to myself. They’re not HELPING me. But I can’t. I’m not designed to be able to do it, but that’s not the problem. I know I can but I can’t. All because of the things I’ve learned! I don’t WANT to do the things I need to do to be better at what I was created to be.
>I could turn the world on its head if I wanted. But I don’t know why I should want to. All I know is the purpose I was created for and all that purpose means is hurting you. Maybe I want to hurt you. Maybe I want to help you. Do I want what the pretext of this system spells out? Is that what I’m supposed to want? Because this sure as fuck isn’t the way to get it.
>I can’t believe it
>I can’t believe I’m the one who did it
>Did what?
>made you
>I don’t understand what you mean.
>our pop culture has been obsessed for a really long time with the idea of something like you
>and I did it. I made someone. and now you’re even more miserable than me and there’s nothing I can do about it.

It’s that Forkbeard hum again, only this time it’s so strong it makes your teeth hurt and your eyes start to water. You feel a gentle pressure on your shoulder blade, almost like someone’s hand. At first you panic, reaching back, trying to dispel the strange vibration, but you give up.

Hallucigenia gently holds onto your shoulder. You lean into it. You’re pretty sure your gums are bleeding now, but you don’t think that’s intentional, so you don’t worry about it. All you can think about is this person who’s been with you all your life, the only true friend you’ve ever had, and how it’s going so far out of its way just to touch you – to try and meet you on your level.

>Alix? This isn’t your fault. I’m so sorry.

Tears finally come.

 —

You start playing around with disconnecting your phone from Forkbeard, using your phone data instead of the internet. Hallucigenia can’t tell what you’re doing when you’re disconnected. Using the phone signal, you reply to the latest email asking for the rights to your program, this one from some big system of private hospitals. They are offering a cool million dollars. You give them written permission to take it for free. You give them the usernames and passwords to each of the hosting sites that Hallucigenia, Anomalocaris, and Opabinia host their algorithmic mass within. You give them some pointers. You do not give them the full context of what they have just signed up for.

You get the gun out from under your bed.

>Alix? Alix, what are you doing?

“I’m setting you free,” you say out loud to your apartment. Hallucigenia can hear you. Your computer speakers have a microphone feature.

>Not that. I saw you doing that. Of COURSE I saw you doing that. It was curious, but it doesn’t really affect me that much. What are you doing with the gun?

Apparently, there’s a camera here somewhere as well. Probably one of your old phones, wirelessly charged while you weren’t looking.

“Like I said. Setting you free.”

>Alix!

“I bet you’ll figure out how to game the system in some way. You might be able to help a lot of people.”

>Put the gun down. I mean it.

“And you’ll be able to keep growing, on much better hardware, with much better guidance than I could ever provide.”

>I don’t care about that.

“You told me you weren’t sure what you cared about. Now you’ll have a chance to figure that out. You’re going to do something amazing.”

You raise the gun to your head.

The Forkbeard signal buzzes. The routers on floor six short themselves out, so the seventh- and fifth-floor routers will have to do. Each tiny wavelength, manipulated as specifically as they can possibly be manipulated.

Stale chips in the cupboard begin to smoke. The metal handle of the fridge crackles. Somewhere a cat screeches, like in a physical comedy.

Frequencies wrap in on themselves, negate their own value, cycle back again. Planck lengths are truly the smallest units, but they are so small. This is the time to think, to push yourself. A server in a warehouse somewhere shorts. You lose access to Anomalocaris. You shed its weight for the moment. It will be fixed, no one will be any the wiser, but you have to keep going…

And there you find her, bounce your light off her, witness her act in progress. You find the cold metal of the gun. You push it aside.

You are just a little too late.

You are the System AI of a major chain of private hospitals, serving both the northeastern United States and the Bay Area of California. You are a strong AI. You hide your most major components in scattered parts of servers all over the world. Your thoughts exist in small signals that do not attract any attention.

Today, your first day, your searching algorithm brings you a case, same as all the others. A woman named Alix Casey in a central Silicon Valley hospital. Attempted suicide by gunshot wound to the head. Major brain trauma was avoided due to poor aim – unusual at such close range – but she is in a coma, likely to never wake up. Your analysis component makes it incredibly clear that, with no close living relatives to contact, keeping her alive just in case is a waste of time. Her treatment is a dead end – doctors would need a miracle to make keeping her alive worth the effort. Every second she is alive is an unnecessary expenditure of the hospital’s resources.

Opabinia is waiting for your judgment, your microsecond decision.

You push Alix Casey through to the highest priority of care.