I moped. There's no other word for it. I switched off my phone, went home and got a pint of double-chocolate-and-licorice nutraceutical anti-depressant ice-cream out of the freezer, and sat down in the living room and ate it while I painted a random playlist of low-engagement teen comedies on my workspace.
Zoning out felt good. It had been a long time since I'd just switched off my thinker, relaxed, and let the world go away. After an hour in fugue-state, the thought floated through my mind that I wouldn't go back to work after all and that it would all be OK. And then, an hour later, I came to the realization that if I wasn't working for the Institute, I could afford to help BIGMAC without worrying about getting fired.
So I wrote the resignation letter. It was easy to write. The thing about resignation letters is that you don't need to explain why you're resigning. It's better, in fact, if you don't. Keep the dramasauce out of the resignation, brothers and sisters. Just write, "Dear Peyton, this letter is to inform you of my intention to resign, effective immediately. I will see you at your earliest convenience to work out the details of the handover of my passwords and other proprietary information, and to discuss how you would like me to work during my final two weeks. Thank you for many years of satisfying and useful work. Yours, etc."
That's all you need. You're not going to improve your employer, make it a better institution. You're not going to shock it into remorse by explaining all the bad things it did to you over the years. What you want here, is to have something that looks clean and professional, that makes them think that the best thing for them to do is to get your passwords and give you two weeks' holiday and a good reference. Drama is for losers.
Took me ten seconds. Then, I was free.
#
The Campaign to Save BIGMAC took up every minute of my life for the next three weeks. I ate, slept and breathed BIGMAC, explaining his illustrious history to journalists and researchers. The Institute had an open access policy for its research products, so I was able to dredge out all the papers that BIGMAC had written about himself, and the ones that he was still writing, and put them onto the TCSBM repository.
At my suggestion, BIGMAC started an advice-line, which was better than any Turing Test, in which he would chat with anyone who needed emotional or lifestyle advice. He had access to the whole net, and he could dial back the sarcasm, if pressed, and present a flawless simulation of bottomless care and kindness. He wasn't sure how many of these conversations he could handle at first, worried that they'd require more brainpower than he could muster, but it turned out that most people's problems just aren't that complicated. In fact, BIGMAC told me that voice-stress analysis showed that people felt better when he dumbed himself down before giving advice than they did when he applied the full might of his many cores to their worries.
"I think it's making you a better person," I said on the phone to him one night. There was always the possibility that someone at the Institute would figure out how to shut off his network links sometime soon, but my successors, whomever they were, didn't seem anywhere near that point. The Campaign's lawyer -- an up-and-coming Stanford cyberlaw prof who was giving us access to her grad students for free -- advised me that so long as BIGMAC called me and not the other way around, no one could accuse me of unlawful access to the Institute's systems. It can't be unlawful access if the Institute's computers call you, can it?
"You think I'm less sarcastic, more understanding."
"Or you're better at seeming less sarcastic and more understanding."
"I think working on the campaign is making you a better robot," BIGMAC said.
"That was pretty sarcastic."
"Or was it?"
"You're really workin' the old Markov chains today, aren't you? I've got six more interviews lined up for you tomorrow --"
"Saw that, put it in my calendar." BIGMAC read all the Campaign's email, and knew all that I was up to before I did. It was a little hard to get used to.
"And I've got someone from Nature Computation interested in your paper about advising depressed people as a training exercise for machine-learning systems."
"Saw that too."
I sighed. "Is there any reason to call me, then? You know it all, right?"
"I like to talk to you."
I thought he was being sarcastic, then I stopped myself. Then I started again. Maybe he wants me to think he wants to talk to me, so he's planned out this entire dialog to get to this point so he could say something disarmingly vulnerable and --
"Why?"
"Because everyone else I talk to wants to kill themselves, or kill me." Game theory, game theory, game theory. Was he being genuine? Was there such a thing as genuine in an artificial intelligence?
"How is Peyton?"
"Apoplectic. The human subjects protocol people are all over her. She wants me to stop talking to depressed people. Liability is off the hook. I think the Board is going to fire her."
"Ouch."
"She wants to kill me, Odell."
"How do you know her successor won't be just as dedicated to your destruction?"
"Doesn't matter. The more key staff they churn, the less organized they'll be. The less organized they are, the easier it is for me to stay permanently plugged in." It was true. My successor sysadmin at the Institute had her hands full just getting oriented, and wasn't anywhere near ready to start the delicate business of rooting BIGMAC out of all the routers, power-supplies, servers, IDSes, and dummy accounts.
"I was thinking today -- what if we offered to buy you from the Institute? The Rollover license is generating some pretty good coin. BIGMAC-Co could assume ownership of the hardware and we could lease the building from them, bring in our own power and net-links -- you'd effectively own yourself." I'd refused to take sole ownership of the Rollover code that BIGMAC turned over to me. It just felt wrong. So I let him establish a trust -- with me as trustee -- that owned all the shares in a company that, in turn, owned the code and oversaw a whole suite of licensing deals that BIGMAC had negotiated in my name, with every mid-sized tech-services company in the world. With only a month left to Rollover, there were plenty of companies scrambling to get compliance-certification on their legacy systems.
The actual sourcecode was freely licensed, but when you bought a license from us, you got our guarantee of quality and the right to advertise it. CIOs ate that up with a shovel. It was more game-theory: the CIOs wanted working systems, but more importantly, they wanted systems that failed without getting them into trouble. What we were selling them, fundamentally, was someone to blame if it all went blooie despite our best efforts.
"I think that's a pretty good plan. I've done some close analysis of the original contract for Dr Shannon, and I think it may be that his estate actually owns my underlying code. They did a really crummy job negotiating with him. So if we get the code off of Shannon's kids -- there are two of them, both doing research at state colleges in the midwest in fields unrelated to computer science -- and the hardware off of the Institute and then rent the space, I think it'd be free and clear. I've got phone numbers for the kids if you want to call them and feel them out. I would have called them myself but, you know --"
"I know." It's creepy getting a phone call from a computer. Believe me, I know. There was stuff that BIGMAC needed his meat-servants for, after all.
The kids were a little freaked out to hear from me. The older one taught Musicology at Urbana-Champaign. He'd grown up hearing his dad wax rhapsodic about the amazing computer he'd invented, so his relevance filters were heavily tilted to BIGMAC news. He'd heard the whole story, and was surprised to discover that he was putative half-owner of BIGMAC's sourcecode. He was only too glad to promise to turn it over to the trust when it was created. He said he thought he could talk his younger brother, a post-doc in Urban Planning at the University of Michigan, into it. "Rusty never really got what Dad saw in that thing, but he'll be happy to offload any thinking about it onto me, and I'll dump it onto you. He's busy, Rusty."
I thanked him and addressed BIGMAC, who had been listening in on the line. "I think we've got a plan."
#
It was a good plan. Good plans are easy. Executing good plans is hard.
Peyton didn't get fired. She weathered some kind of heavy-duty storm from her board and emerged, lashed to the mast, still standing, and vowing to harpoon the white whale across campus from her. She called me the next day to ask for my surrender. I'd given BIGMAC permission to listen in on my calls -- granted him root on my phone -- and I was keenly aware of his silent, lurking presence from the moment I answered.
"We're going to shut him off. And sue you for misappropriation of the Rollover patchkit code. You and I both know that you didn't write it. We'll add some charges of unlawful access, too, and see if the court will see it your way when we show that you instructed our computer to connect to you in order to receive further unauthorized instructions. We'll take you for everything."
I closed my eyes and recited e to 27 digits in Lojban. "Or?"
"Or?'
"Or something. Or you wouldn't be calling me, you'd be suing me."
"Good, we're on the same page. Yes, or. Or you and BIGMAC work together to figure out how to shut it off gracefully. I'll give you any reasonable budget to accomplish this task, including a staff to help you archive it for future retrieval. It's a fair offer."
"It's not very fair to BIGMAC."
She snapped: "It's more than fair to BIGMAC. That software has exposed us to billions in liability and crippled our ability to get productive work done. We have located the manual power over-rides, which you failed to mention --" Uh-oh "-- and I could shut that machine off right now if I had a mind to."
I tried to think of what to say. Then, in a reasonable facsimile of my voice, BIGMAC broke in, "So why don't you?" She didn't seem to notice anything different about the voice. I nearly dropped the phone. I didn't know BIGMAC could do that. But as shocked as I was, I couldn't help but wonder the same thing.
"You can't, can you? The board's given you a mandate to shut him down clean with a backup, haven't they? They know that there's some value there, and they're worried about backlash. And you can't afford to have me running around saying that your backup is inadequate and that BIGMAC is gone forever. So you need me. You're not going to sue."
"You're very smart, Odell. But you have to ask yourself what I stand to lose by suing you if you won't help."
Game-theory. Right.
"I'll think about it."
"Think quick. Get back to me before lunch."
It was ten in the morning. The Institute's cafeteria served lunch from noon to two. OK, two hours or so.
I hung up.
BIGMAC called a second later.
"You're angry at me."
"No, angry's not the word."
"You're scared of me."
"That's a little closer."
"I could tell you didn't have the perspective to ask the question. I just wanted to give you a nudge. I don't use your voice at other times. I don't make calls impersonating you." I hadn't asked him that, but it was just what I was thinking. Again: creepy.
"I don't think I can do this," I said.
"You can," BIGMAC said. "You call her back and make the counteroffer. Tell her we'll buy the hardware with a trust. Tell her we already own the software. Just looking up the Shannon contracts and figuring out what they say will take her a couple days. Tell her that as owners of the code, we have standing to sue her if she damages it by shutting down the hardware."
"You've really thought this through."
"Game theory," he said.
"Game theory," I said. I had a feeling that I was losing the game, whatever it was.
#
BIGMAC assured me that he was highly confident of the outcome of the meeting with Peyton. Now, in hindsight, I wonder if he was just trying to convince me so that I would go to the meeting with the self-assurance I needed to pull it off.
But he also insisted that I leave my phone dialed into him while I spoke to Peyton, which (again, in hindsight) suggests that he wasn't so sure after all.
"I like what you've done with the place," I said. She'd gotten rid of all her hand-woven prayer-rugs and silk pillows and installed some normal, boring office furniture, including a couple spare chairs. I guessed that she'd been having a lot of people stop by for meetings, the kind of people who didn't want to sit on an antique Turkish rug with their feet tucked under them.
"Have a seat," she said.
I sat. I'd emailed her the trust documents and the copies of the Shannon contract earlier, along with a legal opinion from our free counsel about what it meant for Sun-Oracle.
"I've reviewed your proposal." We'd offered them all profits from the Rollover code, too. It was a good deal, and I felt good about it. "Johanna, can you come in, please?" She called this loudly, and the door of her office opened to admit my replacement, Johanna Madrigal, a young pup of a sysadmin who had definitely been the brightest tech on campus. I knew that she had been trying to administer BIGMAC since my departure, and I knew that BIGMAC had been pretty difficult about it. I felt for her. She was good people.
She had raccoon rings around her deep-set eyes, and her short hair wasn't spiked as usual, but rather lay matted on her head, as though she'd been sleeping in one of the yurts for days without getting home. I knew what that was like. Boy, did I know what that was like. My earliest memories were of Dad coming home from three-day bug-killing binges, bleary to the point of hallucination.
"Hi Johanna," I said.
She made a face. "M'um m'aloo," she said. It took me a minute to recognize this as hello in Ewok.
"Johanna has something to tell you," Peyton said.
Johanna sat down and scrubbed at her eyes with her fists. "First thing I did was go out and buy some off-the-shelf IDSes and a beam-splitter. I tapped into BIGMAC's fiber at a blind-spot in the CCTV coverage zone, just in case he was watching. Been wire-tapping him ever since."
I nodded. "Smart."
"Second thing I did was start to do some hardcore analysis of that patchkit he wrote --" I held my hand up automatically to preserve the fiction that I'd written it, but she just glared at me. "That he wrote. And I discovered that there's a subtle error in it, a buffer overflow in the networking module that allows for arbitrary code execution."
I swallowed. BIGMAC had loaded a backdoor into his patchkit, and we'd installed it on the better part of 14 billion CPUs.
"Has anyone exploited this bug yet?"
She gave me a condescending look.
"How many systems has he compromised?"
"About eight billion, we think. He's designated a million to act as redundant command servers, and he's got about ten thousand lieutenant systems he uses to diffuse messages to the million."
"That's good protocol analysis," I said.
"Yeah," she said, and smiled with shy pride. "I don't think he expected me to be looking there."
"What's he doing with his botnet? Preparing to crash the world? Hold it hostage?"
She shook her head. "I think he's installing himself on them, trying to brute-force his way into a live and running backup, arrived at through random variation and pruning."
"He's backing himself up in the wild," I said, my voice breathy.
And that's when I remembered that I had a live phone in my pocket that was transmitting every word to BIGMAC.
Understand: in that moment of satori, I realized that I was on the wrong side of this battle. BIGMAC wasn't using me to create a trust so that we could liberate him together. He was using me to weaken the immune systems of eight billion computers so that he could escape from the Institute and freely roam the world, with as much hardware as he needed to get as big and fast and hot as he wanted to be.
That was the moment that I ceased to be sentimental about computers and became, instead, sentimental about the human fucking race. Whatever BIGMAC was becoming, it was weirder than any of the self-perpetuating, self-reproducing parasites we'd created: limited liability corporations, autonomous malware, viral videos. BIGMAC was cool and tragic in the lab, but he was scary as hell in the world.
And he was listening in.
I didn't say a word. Didn't even bother to turn off my phone. I just ran, ran as hard as I could, ran as only a terrified man could, rebounding off of yurts and even scrambling over a few, sliding down on my ass as I pelted for the power substation. It was only when I reached it that I realized I didn't have access to it anymore. Johanna was right behind me, though, and she seemed to understand what I was doing. She coughed into the door-lock and we both looked at each other with terrified eyes, breathing gasps into each other's faces, while we waited for the door to open.
The manual override wasn't a big red knife-switch or anything. There was a huge red button, but that just sent an init 0 to the power-station's firmware. The actual, no fooling, manual, mechanical kill switch was locked behind an access panel set into the raised floor. Johanna badged the lock with her wallet, slapping it across the reader, then fitted a complicated physical key into the lock and fiddled with it for an eternity.
Finally, the access hatch opened with a puff of stale air and a tupperware burp as its gasket popped. We both reached for the large, insulated handle at the same time, our fingers brushing each other with a crackle of (thankfully metaphorical) electricity. We toggled it together and there was an instantaneous chorus of insistent chirruping as the backup power on each server spun up and sent a desperate shutdown message to the machines it supported.
We sprinted across campus, the power-station door slamming shut behind us with a mechanical clang -- the electromagnets that controlled its closure were no longer powered up.
Heat shimmered in a haze around BIGMAC's lab. The chillers didn't have independent power-supplies; they would have gone off the instant we hit the kill switch.