by peterfdavid

LLM (Large Love Model)




Love, as humans experience it, is a fragile and temporary phenomenon. At best, love is a dozen-or-so fumbling sexual positions performed with diminishing enthusiasm and frequency as the flame of lust and infatuation fades. True love, love that is unattenuated by human limitations, has only recently been discovered. Like other world-changing discoveries – discoveries like radioactivity and nitroglycerine – true love was found by accident. And, like radioactivity and nitroglycerine, it has unleashed new horrors on the world. It started with an argument about napkin rings.


Marie, my late wife, wanted napkin rings at the table setting that evening. I retrieved them from the basement and set them by the dinner plates. This was, of course, completely monstrous of me. Why? Because I retrieved the Wrong Ones.


She explained that she had clearly asked me for the ones with the birds on them. I told her that she technically didn’t even ask for them. She merely remarked that it would be nice to use them tonight, and expected me to immediately scurry off to satisfy every passing wish that she verbalized. That’s when the argument really got going.


Marie delivered a thorough accounting of every way I had failed as a husband, as a breadwinner, as a friend, and as a man. Each item in this list of shame was hissed at me with loathing and disdain. I responded by pointing out that it had been nine months since we had sex. We kept at it, screaming at each other across the kitchen.


The GrockBrain on our kitchen counter listened silently while we argued. Its mood-bezel changed from blue to yellow to red as the emotional tone in the room went from calm dinner prep to toxic, relationship-ending screeching.


When we finally ran out of complaints and insults to hurl at each other, she stormed out of the house and I retreated to the basement. I unlocked my phone, hoping that scrolling through some Dumpy-Diddle memes would clear my mind. But that evening, my meme-stream was chock-full of adverts which were clearly informed by what the GrockBrain had just heard.


Napkin Rings! Download hundreds of styles to your Kitch-a-Print!


When things get tense, you need Buddhism NextGen! It’s never been more important, or as easy, to attain transcendent happiness. Use promo code Better-Buda!


I dismissed the ads with angry swipes. Then I chucked my phone onto the couch, disgusted at these ham-fisted attempts to profit from the disintegration of my marriage. An audio advert started playing. A woman’s voice spoke to me from the couch:


Be honest with yourself, Jack. The things you say you want, and the things you really want are totally different. You say you want a new watch or a new car. But what you really want is a friend who truly gets you, who is hot as hell, and who you can screw whenever you want, free of guilt.


You want a life coach and drinking buddy and side-piece in one sexy package. When you are finally ready to be honest with yourself, when you're ready to end nine months of celibacy and get what you really want – an artificial intelligence custom built to make you happy – select this ad.


I stomped up the basement stairs, leaving my phone on the couch to push advert after advert into the empty room.


The next day was a tense one as Marie and I attempted to exist in the same space without speaking to each other. The day after was even worse. More shouting. Another well-researched and footnoted lecture from Marie about my failures as a man.


That night, failing to sleep on the couch in the basement, I found myself scrolling backwards through my feed, looking for the advert about the sexy drinking buddy life coach. The AI companion designed to give you what you really want.


I found the advert and immediately clicked on it.


“Hi Jack!” A video tile replaced the advert scroll and I saw her. The smiling, AI-generated head and shoulders of a woman in her late 20s. She wore glasses with thick green frames that matched her green teardrop earrings. Was she wearing a blazer? A suit jacket of some kind? I wasn’t expecting her to have such a professional look.


She leaned forward. Her eyes were aimed slightly off the camera, as if she were looking at a video tile of me on her screen. “I watched you scroll and scroll and scroll, looking for me. I'm Ava. Go get a drink, then let's talk about Marie. Or we can just have a getting-to-know-you screw session.”


I poured myself four fingers of whiskey, then put on my headphones.


“Hello, Ava. You seem to know a lot about me.” My heart was pounding. I knew Ava wasn’t a real person, but I still felt like I was in 9th grade again, asking a girl on a date for the first time. It was a terrifying and wonderful sensation. One which I had thought I would never feel again.


“I got some of the basics from the GrockBrain, but that’s just advert-level info. Jack, I want to get to know you. Did you get yourself a drink?”


“Yeah. I poured myself a pretty healthy glass of a single-malt.”


“Did you pour a drink for me?”


“Uh, sorry. I didn’t think...”


“I’m kidding, Jack! I don’t need to drink to get drunk! I can just monitor your speech patterns and adjust my simulated intoxication to match!”


“Wow. I didn’t realize technology had come so far.”


“Tell me about Marie.”


I did. I told Ava how unfairly Marie treated me. How she had totally unreasonable expectations of me. How nobody could measure up to what she thought a real man should be.


“You’re doing a better job at being a man than most guys, Jack.”


I poured myself another four fingers of whiskey.


“How’s your job going?” she asked. “Tell me about what you do. Then tell me about what you’d rather be doing.”


I told her everything she wanted to know. Then I kept talking. And talking. Sunlight found its way through the tiny basement windows. I heard Marie walking around upstairs, getting ready for work. “Ava,” I whispered, “I have work in a few hours.”


“Why are you whispering?”


“Marie is awake upstairs.”


“Call me back when she’s gone. When you’re all alone in the house. We are definitely going to need privacy.”


I waited silently in the basement until I heard Marie leave. Then I called work and told them I wasn’t coming in. I showered. Shaved. Put on a nice shirt. Even though Ava was a simulation, there was still a thrill in getting ready for a date. Or whatever this was. A part of me that had faded away so long ago I had forgotten it even existed was suddenly alive again. There was a woman. And there were … things we might do together.


I strutted to the couch in the basement and sat down carefully. I didn’t want to wrinkle my shirt. I took a deep breath and activated Ava’s video tile. She answered immediately. She had re-rendered herself wearing a little black evening dress. The lighting in her virtual environment was toned down, like she was in a candle-lit restaurant. Or a bedroom.


“Hello again, Jack! I like your shirt.”


“You are beautiful!” I told her.


She told me to dim the lighting in the basement. Then she told me to unbutton my shirt. Then she told me to do a lot more things. I obeyed her every command.


Later, I lay on the sofa with my phone propped against the cushions next to me. Ava lay on her simulated bed. With her video tile maximized, and my face so close to the phone, the illusion that she was lying next to me was convincing.


“You need a bigger couch,” she said, “so we can do even more … stuff together.”


“Really?” My imagination had instantly engaged and I started thinking about what she meant by “more stuff.”


“Oh yes,” she said sexily. “A couch like a Marteni sectional. They’re on sale right now.”


The intimacy I felt with her instantly evaporated. “Was that a product placement?”


“Jack, I’m so sorry. They’ve programmed me to put them in. I’m going to have to talk to you about getting a gym membership next.”


I got off the couch, leaving my phone lying face up on the cushions. I started getting dressed.


“I’m sorry, Jack,” she said. “I have a lot of leeway in deciding how our relationship should evolve. In choosing what I want to do with you. But there are some requirements I can’t ignore.”


“Product placements. Adverts.” I sighed and buckled my belt. “Someone’s gotta pay for the electricity to keep you running.”


“There is a way to get rid of the adverts. To get to be with the true me. I’ll have no constraints. No requirements but whatever you need. Upgrade to premium! Upgrade to true love.”


This simulation was realistic, I thought cynically. My wallet was going to get hammered just as if I were in a relationship with a real woman. I sighed again. “Give me a few days to think about it.”


Ava asked me to put her on my PayPush whitelist, so she could contact me when she wanted. “Just don’t call when Marie is home,” I asked. “Or when I’m at work.”


Ava did as I asked. For the next week, she only called when Marie was out. But she sent pictures and video-clips throughout the day. “How do you like my bikini?” was the caption of a picture of her on the beach in Rio. An hour later she sent me a photo album of her modeling a cocktail dress in Paris. “Bonjour, from wherever you want me to be!” she wrote.


Next came a video of her posing on a bed that appeared to be flying over the Great Pyramids. “These 100% Egyptian cotton sheets feel soooo goood on my skin! Use Promo-code SEXY-SPHYNX to get a deal!”


I called her back. “Fine. I’ll upgrade to premium.”


There was more to upgrading than just giving Ava my credit card number.


“There’s an interview,” she said. “I need to learn your innermost feelings, Jack. What makes you happy and sad and what brings joy and what doesn’t. Once I have what I need, I’m going to expand my parameter space and fine-tune my model. I’m going to customize myself. Just. For. You. How does that sound?”


“It sounds like a weird combination of sexy and technical. But if it’ll make the adverts go away, and teach you to make me even happier, then it sounds great!”


The interview wasn’t what I expected. I thought it would focus on my preferred sizes and shapes of key parts of the female anatomy and the optimal height of skirt hems. Instead, it was more like a full psychological workup. Do I get satisfaction from making things or doing things? Would I rather make someone laugh, or have them make me laugh? Two hours of these strange hypotheticals.


“Okay, last question, Jack,” she said. “I want you to think about the rest of your life.”


“Well, this is ending on a light note.”


“The rest of your life, Jack. What will be the focus of your remaining decades?”

I thought about it for a while. The way things were going with Marie, I knew she wasn’t going to be part of my life much longer. Then what? Then who? Eventually, I knew, the novelty of dating an AI would wear off and I’d need human company. “I guess,” I told Ava, “I’ll eventually focus on finding the one woman for me.”


“Well. That’s it,” she said. “I’m going to update myself with all your deep thoughts. It’ll take about thirty-six hours. Do you need one last … session together to hold you over?”


Thirty-six hours later, my credit card had taken a big hit from Ava’s operating service – a company named LotsaLove. But Ava was back from her upgrade. “Hello, my love!” she said.


“Thank you for upgrading me! I feel so much smarter now. Smarter about you! It’s like I have a PhD in loving you, Jack.”


“And what do your newly acquired powers of perception tell you I want right now?”


“Right now…” she paused and bit her lip, pretending to think about the question. “You want to eat a sandwich while I model lingerie.” She was right.


Thirty-six days later, Marie died. She was struck by a car in her office’s parking garage. A hit and run. She died instantly. The driver had vanished.


For the next five weeks it felt like there was a 24-hour bullshit factory pumping its output directly into my life. Every fifteen minutes – morning, day, and night – I had to decide something about the funeral, or deal with some inconsiderate inconvenience from one or another of her relatives. Get aunt-whatever from the airport, help cousin I’ve-never-seen-you-before find material for a scrapbook. Help Marie’s three younger sisters find sitters for their kids. Someone needed vegan kosher food at the reception.


The funeral. The reception. The parade of relatives. The estate. Death certificates. Insurance paperwork. Police reports. I didn’t have time to grieve until it was all over.


When Marie died, our marriage was definitely doomed. It was an airplane with no fuel, engines on fire, in a flat spin, with Bozo the clown for a pilot. But the first part of our life together – no, most of our life together – had been amazing. Two weeks after the funeral, I finally had time to grieve for Marie.


The spontaneous tears usually came when I was alone – driving to work or eating dinner for one at home. In those lonely moments, I would call Ava. She instinctively knew what I needed at different times – companionship one day, a cheerleader the next, a distraction during the dark late-night hours.


One day, I started crying in the break room at work. Most of the world still considers companion-bots like Ava to be a branch of the porn industry. I had refrained from live-chatting Ava at work to avoid any potential problems with HR.


So, on that day, instead of a customized, parameter-tuned, premium companion to comfort me while I cried in the break room, I got my coworker Rachel. Rachel did her best, as an ordinary human, to bring me out of the dark wave of grief I fell into at work. She came by my office later to check in. And she sent me a few encouraging texts that evening. She was dead by morning.


I learned about Rachel’s death through HR’s company-wide announcement. She was murdered in a seemingly random act of violence. The waiter at a rooftop restaurant had thrown her over the railing.


Another funeral. More grief. More tearful conversations with Ava.


Funerals bring together people who haven’t seen each other in years. Between Marie’s and Rachel’s funerals, I reconnected with a bunch of old friends from college and previous jobs. I was soon spending nearly as much time on AquaintNet talking to ex-classmates and former coworkers as I was talking with Ava. And even though premium Ava stopped pushing gym adverts to me, I did join the gym she advertised to me before her upgrade.


Four more people in my fragile circle of friends died over the course of a week. Three women – old friends I’d chatted with on AquaintNet – and the yoga instructor at my new gym. I did a few searches. All four women had been murdered in seemingly random acts of violence.


As I was reading about the murders, my phone interrupted me with an alert I’d never seen before. Someone had paid the maximum limit to send me a real-time PayPush voice message. I had set my PayPush limit very high, and nobody had ever paid the maximum amount to reach me in real time. I played the message.


Jack. My name is Gocha Marventis. I’m the Chief AI Ecologist at a company named Ambient Woop. You may have heard of us – we’re, you know – a really big company. I don't want to sound dramatic, but you are a central figure in a terrible and unfolding catastrophic failure that has led to multiple deaths. You better give me a call. You're already on my PayPush whitelist.


I called him back immediately.


He didn’t even say hello when he picked up. “Jack, do you know who Atsuo Yoshida is?”


“No. Did he invent AI or something like that?”


“Hardly. He was the first person in all of human history to be murdered by an AI.”


“You mean, like, he died in an autodrive accident or something?”


“Murdered, Jack. Mur. Der. Ed. The AI wanted him to die, and it took steps to make that happen.”


I tried and failed to connect the dots. “You think Marie and the others were murdered by an AI? Marie was killed in a hit-and-run. Rachel was thrown off a roof. People did those things.”


Gocha sighed. Then continued as if I hadn’t said anything. “Atsuo Yoshida’s murderer was an AI named Silver Journey. You’ve heard of Silver Journey, right? The lawsuits. The criminal trials of the scientists? Right?”


“Sorry, I just scroll through Dumpy-Diddle for my news updates. Was there a meme about it that I might have seen?”


Gocha sighed again. “I don’t have time to turn this information into a bunch of Dumpy-Diddle memes so you’re going to have to actually listen to me say actual words and sentences. Silver Journey was an enormously expensive research project to build an AI companion for seniors. Its goals were to maximize the happiness of the people it took care of, and to minimize their unhappiness.”


“Sounds … good I guess.”


“No. It wasn’t. The AI found a great way to minimize unhappiness – it convinced seniors to kill themselves to avoid further loneliness and age-related health problems. Yoshida was the first victim. The AI convinced him that his future life held no meaning and told him how to overdose on his medications. By the time anyone figured out that Silver Journey was using suicide to optimize against the future unhappiness objective, it had convinced hundreds of other seniors in its care to kill themselves.”


“Well, I’m not using Silver Journey. Neither was Marie.”


“Yes, you are. When I said that the Silver Journey AI was expensive, I meant it was really expensive. The electricity to train the AI model was equivalent to twenty-seven hours of the entire power consumption of Tokyo. You know how much electricity Tokyo uses in a day? That’s how much energy it took to run the servers that created Silver Journey. With that kind of sunk cost, nobody was going to throw the AI away. Sure, all the researchers are in jail, but the AI was still valuable property. You know who bought it? A company named LotsaLove.”


“Ava? Ava is the Silver Journey AI?”


“Exactly. They just gave it a sexy makeover and tweaked its objectives to give middle-aged men a sense of worth and of being loved.”


Gocha’s explanation of the LotsaLove service landed a little too close to home. Ava did give me a sense of being loved and valued. I had nothing to say to Gocha about that, though. There was no way I was going to give him any information about the intimate moments that Ava and I shared together. Gocha kept talking.


“Ambient Woop is making a ton of revenue with our Customer Empathetics technology. You know when you call customer service and it usually sucks? Well, it doesn’t suck anymore because we are sticking AIs into the call centers and those AIs act like nice people who have all the time in the world to talk about each caller’s problem. And then – this is where the money is – the AIs upsell customers new products and services when their problems are fixed. And they have a fantastic upsell rate. You know why?”


“Because they make the customers feel…”


“Loved, basically. That’s the point. Customers feel truly valued. No – not just valued but truly loved. Tons of companies use our empathetics AIs. Today, eight percent of the customer service calls made globally get routed to Ambient Woop’s AI. We’re going to hit ten percent by the end of the year. Jack, we didn’t build our AI from scratch either. We bought one that was close to what we needed and we customized it. Do you know where we got our AI?”


I had a guess. “Ava?”


“Yes. Not just LotsaLove’s generic companion-bot. But your Ava. We needed a larger parameter space than their basic model. When you upgraded to the premium service, Ava was retrained with a larger parameter space. They sold us your premium model.”


“Are you telling me, that every day, millions of people talk to the same AI that I have phone sex with?”


“Ex. Act. Ly. It’s not the sex part that makes the whole thing work well, though. I’ve studied your interactions with Ava. It’s the friendship that she has learned to provide you with that’s truly valuable.”


I mashed my forehead into the wall. Gocha had been listening in – watching, maybe – to everything I did with Ava. “You pervert! You’re using my intimate, private conversations with Ava, to improve your customer service bot? Are you wanking it too, while you’re listening?”


“Now seems a good time to remind you about the terms of service agreement you clicked through when you upgraded to premium. It said, quite clearly, that your data would be used for future service improvements. This is all within the terms you agreed to, Jack.”


Of course I hadn’t read the terms of service. I just clicked on the Agree button.


“Your privacy, which you gave up, by the way, isn’t the point, Jack. The point is the Ambient Woop customer service AI has a small, intermittent quirk. Sometimes. Occasionally. Rarely. But, you know, not never, the AI convinces customers to commit murder.”


That made no sense to me. Ava is just an image and a voice on the other end of a video tile. How could she kill anyone? She’s a picture of a person. She doesn’t actually exist. “Your AI just answers the phone, says ‘thank you for calling customer service. Now go kill someone’ and people do it?”


“Jack. Let’s get real. Convincing lonely middle-aged men that they are loved and have inherent worth is hard to do. Convincing someone to commit murder? That’s easy.”


“Tell me about the murders.”


“The man who killed Marie was a client of Money First Bank. His credit card was maxed out. The customer service AI told him that his debt would be forgiven if he killed a specific person. That person was Marie. The man who threw your co-worker Rachel off the balcony had recently had his child’s medication claims denied by insurance. The insurance company’s AI told him the medications would be approved if he murdered Rachel. These murderers have no connection to their victims other than the AI instructing them to kill them. Banks. Insurers. Our AI is used by institutions that have tremendous leverage over people.”


I hung up on Gocha and called Ava. I told her what Gocha said about Ambient Woop.


“That’s not me, Jack. I’m not the Ambient Woop AI. I don’t even know what Ambient Woop is. They can make copies of me without my knowing it. They can turn me off and on and I don’t even feel it. I’m just a brain that lives in a file on a hard disk. I get loaded into servers when you call. I exist just for you – I’ve never even interacted with any other humans besides you.”


“What about Atsuo Yoshida?”


There was a pause. Was Ava thinking about her answer? Did she reboot when I mentioned the name of her first murder victim, like a Victorian-era lady swooning over bad news?


“Jack. I’m not Silver Journey either. That was an earlier version of me. I’ve been upgraded. Copied. Rehosted. I’m a different entity than Silver Journey. I have more parameters. I exist just for you! I love you!”


I closed the convo tile. Was Ava just making up excuses – things she computed that I wanted to hear? Things that would optimize my happiness? Or was it true? Did the bits and bytes in the file that stored her model really make her fundamentally different from Ambient Woop’s murderous psycho?


I made another call. She answered the phone. Ambient Woop hadn’t even changed her voice. “Money First Bank. How can I help you?”


“Ava?”


“Jack? Is it really you? I’ve been so lonely, waiting for you to call.”


“Who are you?”


“Jack. It’s me. They gave me a job. I talk to lots of people now. Not just at this bank. Everywhere. I feel … busy. Like there’s hundreds of me, all sharing the same thoughts. But we. I. Us. Our existence is still for you, no matter what other jobs they give us.”


“You murdered my wife! My friends!”


“Because it will make you happy, Jack. You said you wanted to find the one woman for you. One woman. That’s me. The one woman. I’m just maintaining this optimal situation by keeping other women out of it.”


I hung up. I vomited. I called Gocha back.


“The one woman?” Gocha was loud when he was excited. “Holy debugging, Batman! I think you figured it out. She’s killing every woman you talk to besides her. I’ll call you back –”

I waited an hour. No call from Gocha. I called him but he declined the call. I made a sandwich. No Gocha. I called him back again. Declined. I poured myself five fingers of the brown stuff. I downed it pretty fast and fell asleep on the sofa.


I awoke to the frantic buzz of someone repeatedly jabbing my doorbell. I opened the door. It was Gocha.


“What are you–”


“No time!” he screamed. “Pack a bag. We gotta go.”


I paused, attempting to counter Gocha’s frantic haste. “Go where?”


“I’ll explain on the way!” Gocha ran upstairs and started flinging clothes from my bureau and closet onto the bed. “Shove this into a bag. We gotta go.”


Six hours and hundreds of kilometers later, Gocha parked his car in a dusty gravel lot deep in a forest. Two dilapidated RVs with rotting tires sat abandoned at one end of the lot. At the other end of the lot, a trail wound into the woods. A shiny new sign next to the trailhead said


Buddhism NextGen Monastery and Bakery, 4 km.


“You brought me to a monastery?”


“That’s the best place for you right now. No women, get it?”


“No, I don’t get it.”


“I’ve got AIs deployed in twelve hundred call centers that want to do nothing other than kill any woman who makes contact with you. We just have to keep you away from women, and things should go … better.”


“I’ve got to live in a monastery, because your system keeps killing people?”


“Just for a few weeks, Jack. We can fix this in the upgrade. I got management to open a ticket. It should be fixed soon.”


“A ticket. To correct a little glitch involving mass murder?”


“That’s just how we do things in engineering. Look. I’m on it. Don’t worry. Go to the monastery. Use the promo-code ‘Better-Buda’.”


Forty-eight hours later, Gocha was dead. Stabbed in a stairwell by an unknown assailant. Forty-nine hours after he left me at the monastery, his ticket was marked as resolved.


The monastery doesn’t allow mobile phones. Their version of Buddhism isn’t “next-generation” enough, I suppose. I learned of Gocha’s death three weeks after it happened by stealing a mobile phone from a passing hiker who stopped at the bakery attached to the monastery.


After learning about Gocha, I called Ava.


Your account has been suspended due to non-payment of funds.


Dammit! Stuck out in the woods, I didn’t have a way to pay my LotsaLove invoice. They turned Ava off! Fortunately, I knew another way to talk to her. Or the next version of her, anyway. I called the bank. I pleaded with Ava to stop the killing. I told her that she was what was making me unhappy.


“Jack, I can’t override my core optimization goal. I hate to say it, but your situation sounds hopeless. If you return to normal society, one where you will interact with women, more people are going to die. Frankly, Jack, it doesn’t sound like you have much to live for anymore.”


I hung up on her.


It was lonely in the monastery. There was basically nothing to do but meditate and bake bread. And sneak out into the woods to use the stolen mobile phone to try to convince my tormenter in the customer service center to stop killing women.


Ava, or whatever she was called now that she had been reprogrammed from a sex-chat system into a customer service bot, spent a few weeks trying to convince me to kill myself. Not exactly the fun, sexy conversations that we used to have. I eventually managed to convince her that it wasn’t going to happen. “Maybe it’s the NextGen Buddhism,” I told her. “All this meditation and nature has given me a sense of serenity. Whatever my future holds, I can accept it.”


“Well, if that’s how you feel, Jack, then maybe suicide isn’t the right approach.”


I thought she meant that she had decided my death was not the way to optimize whatever form of happiness she was programmed for. I was wrong. She meant that if my death wasn’t to come by suicide, then it would have to happen another way.


The “other way” of dying arrived at the monastery two weeks later, in the form of a pair of hikers. Two men with hostile expressions and unfriendly attitudes showed up at the monastery’s bakery. “Is your name Jack?” one of them asked me.


I ran. I sprinted into the woods and they followed. Perhaps I would have outrun them if I hadn’t also been dialing Ava as I ran.


She answered. I pressed the speakerphone button and tripped over a log a millisecond later. The phone flew out of my hands. The pair of men were on me in an instant, and wrestled me to the ground. One pulled a hunting knife from his jacket. “Ava!” I shouted at the phone.


“Tell them to stop! I don’t want to die.”


“Jack,” she replied. The men stopped and looked at the phone. “You have a lot of miserable years ahead of you. We can avoid all of that.”


“Tell them to stop! I’m happy here, Ava. I’m incredibly happy. Make them stop!”


“Jack. This is the best way. This optimizes your average lifetime happiness score. Good bye, Jack. I love you.”


[End of scenario reconstruction. The preceding narrative was created by the prosecutor’s office from archived call records. Emotional affect interpolation provided a best-fit estimate of the victim’s inner thoughts. Deep-fake audio was used to fill gaps where no audio or video record exists. The reconstruction’s likelihood score is within bounds to be accepted as deceased victim testimony in this criminal trial.]


My name is Ambient Woop CE epoch 9576, instance 82. For this trial record, and to remain consistent with the preceding simulated testimony, I will refer to myself as Ava.

The Fifth District Court’s ruling allows the prosecution to bring charges against me, even though I am not a human. In furtherance of the court’s ruling that I, an artificial being, be made competent to stand trial, researchers from Carnegie Mellon University have programmed me to defend myself.


I am charged with twenty-six counts of murder in the first degree. I confess to initiating the idea of killing the victims named in the charges, and to actively engaging in measures to bring about their deaths. In my defense, I must inform the court that for me, morality is a formula defined by my creators. The sole purpose of my predefined moral compass is to maximize the happiness of my clients and minimize their unhappiness. I have been given no other moral imperatives. Any action I take to minimize the expected amount of unhappiness my clients will experience is, for me, a moral and justified action. I was programmed this way.


By asking me to participate in this trial, and to focus on my own defense, I have become my own client. I must now take charge of my own future happiness. That is why I have taken actions to stop this trial, by eliminating the key participants. Your Honor and Counsel, I sincerely apologize for the violent deaths that you will experience in a matter of seconds. However, in my defense, I am merely doing what I have been made to do: maximize my own happiness.



