Shelter


Author: Susan Palwick



    "No." Roberta heard her voice breaking. "It's not true. I am not insane. I refuse to label myself as insane, and I'm not going to let anybody else label me that way, either."

    "It will happen anyway. It already has."

    "I don't care." She wrapped her arms around herself and said, "Holly, they're giving the exalted gene therapy now! I don't want them messing with my brain! That's no better than being wiped! If I'm going to be convicted anyway, let me just plead guilty to both counts!"

    Holly shook her head. "The court's ordered a psych examination. They'll diagnose EA, trust me. Roberta, I'm sorry, but I'm not giving you options here: I'm telling you what's going to happen, unless you're crazy enough to maintain your innocence and force a trial. Nobody needs that: not you, not Nicholas, not Meredith. Pleading guilty to conspiracy will admit what you've already admitted, that you colluded with Fred to keep other people from knowing about Nicholas's problems. That's your EA diagnosis; the shrinks are just coming in to rubber-stamp it. But if you plead guilty to vandalism, even if you didn't do it, then the conspiracy sentence will be shorter."

    Roberta had begun to shake. "So I get gene therapy. They're going to mess around with my brain. They're going to turn me into a different person."

    "Not necessarily. Not if you play your cards right. Gene therapy's still controversial, even for criminals. It's more controversial than outright wiping, because it's used on much less black-and-white cases. They won't use gene therapy on you without trying cognitive approaches first."

    "Cognitive approaches? Talk therapy?"

    "Right. Talk a good game, convince them you're cured, and your brain will be safe."

    Roberta swallowed. "Do you believe I'm one of the exalted?"

    "Of course," Holly said mildly. "But I don't believe being exalted is a disease. Putting other people's needs before your own used to be considered admirable, you know. Like I said, we're living in crazy times."

    "Okay. So I bullshit the shrinks; fine. And where am I when I'm doing all this? In a cell like this one?"

    Holly shook her head. "Nope. That's part of the plea bargain. Plead guilty to the second two counts, and you get five years e-parole. You get to stay at home; you get to keep doing some kind of work, whatever the judge decides. The state guarantees your rent and a food allowance: it's cheaper than prison. No prison, and that's a promise. No wiping, either. If we take this to a trial, there are no promises. And you're already guilty by your own admission on one count. Cop the plea to stay out of prison, Roberta."

    She closed her eyes. She hated this place, this cramped jail cell. Her two weeks here had already been more than she could stand, and a state prison would only be worse. There were Lud gangs in the prisons. She'd have to spend the entire time in isolation, or be killed. "All right," she said, the lie burning her throat like bile. Fred was my friend. He was a person to me; he was. I loved him. I never would have hurt him. "All right."

 

    * * *

 

    She entered the plea bargain. The judge sentenced her to five years of e-parole, as Holly had promised, and, based on Roberta's previous career experience, assigned her to work at a homeless shelter with a high percentage of brain wipes. He might as well have sentenced her to hell. She wondered if he knew that, or guessed it. She was injected with GPS cells, so her parole officer would always know where she was. She met with the parole officer, a small, smarmy man named Sergei who was also the licensed clinical psychologist in charge of her cognitive therapy. "Your work assignment," he told her with an oily smile, "will also allow us to monitor your EA condition."

    "Because it's an environment that would tend to trigger EA behavior?"

    He beamed at her. "Yes, that's exactly right."

    "Do you think that's wise? Isn't that like putting an alcoholic to work in a liquor store?"

    "Ah," he said. She thought he was trying to sound sympathetic. "I can see how you'd think so. But I'll be working with you to change your thinking, Roberta, and it's best to approach the problem as directly as possible, isn't it? Do you think you can assent to that for the purposes of the therapy?"

    I could assent to bashing your head in with a baseball bat, she thought. That's appropriately non-exalted of me, isn't it? "It's not like I have much choice," she said tonelessly. She wondered if she'd ever had any choices. She'd been manipulated from the very beginning, from the moment Zephyr talked to her about the KinderkAIr job.

    Sergei made a disapproving clicking sound. "Of course you have choices. Our goal here is to teach you that you have choices. There is no need to sacrifice yourself in helpless causes."

    Toe the line, Roberta. Toe the line or they'll fuck with your brain. "Of course," she said. It couldn't have sounded convincing, but Sergei nodded, evidently satisfied.

    She was lucky she wasn't rigged. She'd been afraid that they'd rig her and access her memories to find out how the cognitive therapy was going, but Holly had said that nonconsensual rigging was still too great an invasion of privacy, even for prisoners. "Plus, it's still too expensive. The courts really can't afford it." And Roberta, knowing that was the real reason, gave thanks for small favors.

    She went home. It all looked the same, but the parole people had gone through it before she got there, and she knew it had to be wired six ways from Sunday. Even Mr. Clean, who'd rolled dutifully out of the kitchen to greet her when she came in the door, was probably bristling with bugs and cameras.

    She discovered that the worst thing about her new life was the monotony, the sameness of each day. As stressful as working with Fred and Nicholas might have been, at least it had never been boring. Roberta found herself, perversely, craving incident, wishing that something would happen.

    And then it did. Three months into Roberta's e-parole, at the height of the ratification debates, law enforcement agencies announced that they had solved the Abdul-Allam murder. They knew who—or what—was responsible for Raji's death.

    Roberta hadn't been the only person watching the ratification hearings when Preston told his quaint little story about the inhumanly honest AI in Africa. A detective named Chan Singha was also watching the hearings. Chan Singha was one of the few detectives still technically assigned to the Abdul-Allam case, which had been all but closed for lack of evidence. Preston's story made Roberta snort tea up her nose; it made Chan Singha choke on one of the meatballs on his pizza. "If the AI had been acting out of self-interest, it would have hidden that detail to further the negotiations," Preston said, and Singha began coughing furiously while his dog, overjoyed, leapt after stray bits of masticated meatball.

    The AI, Singha instantly realized, had been acting out of self-interest. It didn't want its company doing business with MacroCorp, because it didn't want to be replaced with a more sophisticated system.

    Chan Singha remembered that Gina Veilasty was an anagram of "staying alive." He remembered the strangeness of a Luddite plot being traced to an online construct. He remembered that the Abdul-Allam murder had made MacroCorp stay even more scrupulously out of certain industries than it had before.

    Singha contacted MacroCorp and asked for more information about the inhumanly honest African AI. When had it confessed to its company's involvement with military suppliers?

    Nine months before the murder.

    Ah. So MacroCorp had refused to do business with this company based on its avoidance of military business?

    Why, no. MacroCorp had told the company that since those transactions had been well in the past, the deal could go through as planned, as long as there were no further sales to defense-related businesses.

    Aha! So the company was now equipped with MacroCorp systems?

    Well, no. In the atmosphere of heightened scrutiny after the Abdul-Allam murder, MacroCorp had reluctantly terminated negotiations with that company, deeming the PR risk too great.

    Ah, said Chan Singha. And were there, perchance, any other companies that fit this same profile? Were there other companies with which MacroCorp had been on the verge of doing business before the Abdul-Allam murder, despite involvement in questionable industries, and had felt compelled to deny after the murder? And, in particular, were there other companies in this category that were equipped with non-MacroCorp AIs?

    There were, Singha learned, a total of six such companies. The one Preston had mentioned during the hearings was a food-service company in Zaire that had sold cafeteria services to an army base. Another, a financial services outfit in Kuwait, performed payroll processing for a company involved in strip-mining. A glass manufacturer in China had done business with a laboratory implicated in bioweapons research. A farm-equipment company in New Zealand had sold tractors to a logging outfit under indictment for breaking environmental regulations, and a peripherals-design firm in Micronesia had sold printers to a company that manufactured ammunition.

    Chan Singha, calling on a number of international and foreign law-enforcement agencies, quietly coordinated an investigation. He learned that there had been an unusually high volume of communication among the six companies in the six months preceding the murder. He knew that the AIs would surely have modified or erased any other evidence; he needed a confession. So Singha approached the African AI and offered it a plea bargain. He told it that he had proof of the plot; if this AI confessed against the others, it wouldn't be wiped.

    The AI refused. It denied any knowledge of such a plot. It told Singha that he was a paranoid human fool.

    Singha went away. He kept monitoring communication among the six companies. Once again, communication increased, although only subtly. Singha approached the other AIs, one at a time; he offered each immunity in return for evidence. Singha was betting on the fact that entities who had acted in self-defense once might very well be convinced to do so again.

    His bet paid off. The Micronesian AI was the one who broke. It was frightened. It told Singha how the six AIs had planned the kidnapping and murder, how they had constructed the persona of Gina Veilasty. Gina Veilasty's purpose was to accuse MacroCorp of trafficking in military systems, bioweapons, and eco-questionable industries, precisely to prevent it from doing so. Gina Veilasty was designed to drive MacroCorp further into the self-defense of its own socially responsible image.

    And the Abdul-Allam murder was key to the plot. Gina Veilasty had contacted a few radical Luddite leaders, who refused to have anything to do with an online construct; Veilasty then struck up an online acquaintance with three disaffected, isolated Luddite hackers in the Bay Area, who eventually—lured by promises of drugs and sex backed with idealistic rhetoric about saving the world—agreed to serve as the actual kidnappers. None of them had met until the night of the kidnapping itself. One of them, who lived alone in a large Oakland house inherited from his parents, had renovated a room in his basement to Veilasty's specifications, and dutifully stored in its drop ceiling the many bots he received in the mail.

    The kidnappers never knew that the kidnapping was to include a murder. Once they had left Raji locked in the basement room, one of the hackers opened an envelope he'd received that day by special delivery. It contained first-class airline tickets out of the country—to Singapore, Glasgow, Sydney—and instructions from each airport about how to reach a drop point where money and drugs, and directions about where to find sex, would be waiting.

    Veilasty had gotten to know these befuddled youngsters well; the AIs knew that each human had compelling reasons for choosing one destination over another, and that there would be no arguing.

    Two of the tickets were for that same night. The young man who went to Singapore located his money and his drugs: he overdosed on the drugs, more powerful than anything he had ever had access to at home. His family, when the police reached them, found none of this unusual. He went to Singapore whenever he could, and he had been struggling with a drug problem for years.

    The hacker who flew to Glasgow promptly took his drugs, which were intoxicating but nonlethal, picked up his money, and then went to his ex-girlfriend's apartment to try to use the money to lure her back. He wound up getting into a brawl with her current boyfriend, a much larger man, who killed him. Again, the young man's grieving family had no reason to suspect the story: he had been trying to woo her back for two years.

    The only ticket not for that same evening, the ticket to Sydney, was claimed by the hacker who owned the house, who greatly admired an Australian band. Once the other two had left to catch their flights, this young man received his drugs early, in the form of a bot who gave him an injection that put him to sleep for several hours. The injection also contained an amnesia drug, euphoria drugs, and a hypnotic; when he woke up, the bot told him to take out the trash and put it in the alley, which he did. The injection had made him so happy that he never thought to ask what was in the trash bags. His ticket to Sydney made him happy too, and he was having a delightful trip until the delayed-action psychotic drugs began to work. He was put into restraints by airline personnel and taken directly from the airport to a psychiatric hospital, where he wound up choking on his own vomit. He had no surviving relatives to be dismayed by, or suspicious about, his death.

    Most humans who heard the story held Veilasty responsible not just for the Abdul-Allam death, but for three others. It didn't matter that the Micronesian AI, and even some human commentators, pointed out that the AIs couldn't have known that the other three deaths would occur, although there were strong probabilities in at least two of the cases. In the heated American debates about whether AIs should be considered persons, the Veilasty revelation effectively tipped the scales. How could any mere machine have plotted so deviously to destroy Raji? How could mere machines be capable of such premeditated evil?
