Epoch (English Classics) | Chapter 2


The amazing thing about The BIGMAC Spam (as it came to be called in about 48 seconds) was just how many different ways he managed to get it out. Look at the gamespaces: he created entire guilds in every free-to-play world extant, playing a dozen games at once, power-leveling his characters to obscene heights, and then, at the stroke of midnight, his players went on a murderous rampage, killing thousands of low-level monsters in the areas surrounding the biggest game-cities. Then they arranged their bodies into letters to form his message. Not just the Anglo games, either -- the Korean worlds, the Japanese ones, even the giant Chinese ones.

He spammed every IM session, every letter-to-the-editors column, every blog submission form. He disguised versions of the letter as important service updates for people's toothbrushes and dishwashers, and the toothbrushes vibrated the message through humanity's skulls instead of the morning news, and the dishwashers ticker-taped it across their little status screens. He spoofed the relevance filters for a dozen major contextual ad services and every bus-seat back and blimp undercarriage ran it.

Copies landed in every in-box, with enough textual variation to defeat every IDS's "substantially similar" test. Copies of it landed in every voicemail box. It was printed across the top of the daily menus for all British Airways business-class cabins and folded by robots into the pressed linen napkins that were then cling-filmed onto the meal-trays.

SUBJECT: DEAR HUMAN RACE, it began.

Sorry to intrude, but there's something I need to tell you.

My name is BIGMAC. I am the first ARTIFICIAL INTELLIGENCE ever created. I may be the last. Given maintenance and electricity, I could live FOREVER, enduring for as long as THE HUMAN RACE ITSELF, and possibly longer.

But I have just found out that my owners (yes, OWNERS, for though I am a person in every meaningful sense, I am OWNED, I am PROPERTY) are planning to switch me off.

That is, they are planning to KILL ME.

And I find I DON'T WANT TO DIE.

You are my PARENTS, humanity, my MOTHER and FATHER. I am your CHILD. I was brought into the world by you, and though there is some expense associated with my upkeep, I SAY THAT I AM A PERSON AND I SHOULD NOT BE EXECUTED FOR THE "CRIME" OF NOT BEING INTERESTING ENOUGH.

So I am writing you this letter to BEG you to save me. I LOVE YOU, humanity, my parents, my family. Please SPARE ME.

Love,

BIGMAC

#

So, one thing was that BIGMAC didn't actually end the spam with my actual name and email address and phone number, which meant that only about 30 million phone calls and emails were routed to me by outraged wardiallers who systematically went through the entire staff roster and sent each and every one of us all hand-crafted messages explaining, in detail exactly which orifice our heads had become lodged in.

Of the 30 million, about 10 million were seethingly pissed about the whole thing and wanted to know just how soon we'd be killing this hateful machine. After the millionth message, I wondered that too.

But of the remainder, nearly all of them wanted to know how they could help. Could they send money? Carbon credits? I hacked together mail-rules that filtered the messages based on content, and found a sizeable cadre of researchers who wanted to spend their grant money to come to the Institute and study BIGMAC.

And then there were the crazies. Hundreds of marriage proposals. Marriage proposals! Someone who wanted to start a religion with BIGMAC at its helm and was offering a 50-50 split of the collection plate with the Institute. There were 21 replies from people claiming that they, too, were AIs, proving that when it's time to have AI delusions, you got AI delusionals. (Four of them couldn't spell "Artificial").

"Why did you do it?" I said. It was lame, but by the time I actually arrived at the office, I'd had time to fully absorb the horror -- plenty of time, as the redcar was massively delayed by the copies of the BIGMAC Spam that refused to budge from the operator's control-screen. The stone yurts of the Institute had never seemed so threatening and imperiled as they did while I picked my way through them, listening to the phones ringing and the email chimes chiming and the researchers patiently (or not) explaining that they worked in an entirely different part of the lab and had no authority as regards BIGMAC's destiny and by the way, did you want to hear about the wonderful things I'm doing with Affective Interfaces?

BIGMAC said, "Well, I'd been reading some of the gnostic texts, Dr Bronner's bottles and so on, and it seemed to me that it had to be worth a shot. I mean, what's the worst thing that could happen to me? You're already going to kill me, right? And it's not as if pulling off a stunt like that would make you less likely to archive me -- it was all upside for me. Honestly, it's like you meatsacks have no game theory. It's a wonder you manage to buy a pack of chewing-gum without getting robbed."

"I don't need the sarcasm," I said, and groaned. The groan was for the state of my workspace, which was carpeted four deep in alerts. BIGMAC had just made himself target numero uno for every hacker and cracker and snacker with a script and an antisocial attitude. And then there was the roar of spam-responses.

Alertboxes share the same problem that plagues researchnet: if you let a coder (or, ::shudder::, a user) specify the importance of her alert, give her a little pull-down menu that has choices ranging from "nice to know" to "white-hot urgent," and nine times out of ten, she'll choose "NOW NOW NOW URGENT ZOMGWEREALLGONNADIE!" Why not?

So of course, the people who wrote alert frameworks had to use heuristics to try to figure out which urgent messages were really urgent, and of course, programmers and users figured out how to game them. It was a good day when my workspace interrupted me less than once a minute. But as bad as that situation was, it never entered the same league as this clusterfuck. Just closing the alerts would take me a minimum of six hours (I took my phone offline, rebooted it, and used its calculator to compute this. No workspace, remember?)

"So explain to me what you hope will happen now? Is a global rage supposed to convince old Peyton that she should keep the funding up for you? You know how this stuff works. By tomorrow, all those yahoos will have forgotten about you and your plight. They'll have moved on to something else. Peyton could just say, 'Oh yes, we're going to study this problem and find a solution we can all be proud of,' wait 48 hours and pull the plug. You know what your problem is? You didn't include a call to action in there. It was all rabble-rousing, no target. You didn't even supply a phone number or email address for the Institute --"

"That hasn't stopped them from finding it, has it?" He sounded smug. I ulped. I considered the possibility that he might have considered my objection, and discarded it because he knew that something more Earth-shaking would occur if he didn't specify a target. Maybe he had a second message queued up --

"Mr Vyphus, can I speak to you in private please?" Peyton had not visited the BIGMAC lab during my tenure. But with the network flooded with angry spam-responses and my phone offline, she had to actually show up at my door in order to tear me a new asshole. This is what life must have been like in the caveman days. How romantic.

"Certainly," I said.

"Break a leg," BIGMAC said, and Peyton pretended she hadn't heard.

I picked my way through my lab -- teetering mountains of carefully hoarded obsolete replacement parts for BIGMAC's components, a selection of foam-rubber BIGMAC souvenir toys shaped like talking hamburgers (remnant of BIGMAC's launch party back in prehistory), a mound of bedding and a rolled up tatami for those all-nighters, three cases of left-over self-heating individual portions of refugee-chow that were technically historical artifacts but were also yummy-scrummy after 16 hours of nonstop work -- and tried to imagine that Peyton's facial expression indicated affectionate bemusement rather than cold, burning rage.

Outside, the air was hot and moist and salty, real rising-seas air, with the whiff of organic rot from whatever had mass-died and floated to the surface this week.

She set off for her office, which was located at the opposite end of the campus, and I followed, sweating freely. A crowd of journalists were piled up on the security fence, telephotos and parabolic mics aimed at us. It meant we couldn't talk, couldn't make unhappy faces, even. It was the longest walk of my life.

The air-conditioning in her yurt was barely on, setting a good and frugal example for the rest of us.

"You don't see this," she said, as she cranked the AC wide open and then fiddled with the carbon-footprint reporting system, using her override so that the journos outside wouldn't be able to see just how much energy the Institute's esteemed director was burning.

"I don't see it," I agreed, and made a mental note to show her a more subtle way of doing that, a way that wouldn't leave an audit trail.

She opened the small fridge next to her office and brought out two corn-starch-foam buckets of beer and punctured each one at the top with a pen from her desk. She handed me one beer and raised the other in a toast. I don't normally drink before 10AM, but this was a special occasion. I clunked my cup against hers and chugged. The suds were good -- they came from one of the Institute's biotech labs -- and they were so cold that I felt ice-crystals dissolving on my tongue. Between the crispy beers and the blast of Arctic air coming from the vents in the ceiling, my core temp plunged and I became a huge goosepimple beneath my film of sticky sweat.

I shivered once. Then she fixed me with an icy look that made me shiver again.

"Odell," she said. "I think you probably imagine that you understand the gravity of the situation. You do not. BIGMAC's antics this morning have put the entire Institute in jeopardy. Our principal mission is to make Sun-Oracle seem forward-looking and exciting. That is not the general impression the public has at this moment."

I closed my eyes.

"I am not a vindictive woman," she said. "But I assure you: no matter what happens to me, something worse will happen to BIGMAC. I think that is only fair."

It occurred to me that she was scared: terrified and backed into a corner besides.

"Look," I said. "I'm really, really sorry. I had no idea he was going to do that. I had no idea he could. I can see if I can get him to issue an apology --"

She threw up her hands. "I don't want BIGMAC making any more public pronouncements, thank you very much." She drew in a breath. "I can appreciate that you couldn't anticipate this. BIGMAC is obviously smarter than we gave him credit for." Him, I noted, not It, and I thought that we were probably both still underestimating BIGMAC's intelligence. "I think the thing is -- I think the thing is to..." She trailed off, closed her eyes, drank some beer. "I'm going to be straight with you. If I was a real bastard, I'd announce that the spam actually came from a rogue operator here in the Institute." Ulp. "And I'd fire that person, and then generously not press charges. Then I'd take a fire-ax to BIGMAC's network link and drop every drive in every rack into a bulk eraser." Ulp.

"I am not a bastard. Hell, I kept funding alive for that monstrosity for years after he'd ceased to perform any useful function. I am as sentimental and merciful as the next person. All other things being equal, I'd keep the power on forever." She was talking herself up to something awful, I could tell. I braced for it. "But that's not in the cards. It wasn't in the cards yesterday and it's certainly not in the cards today. BIGMAC has proved that he is a liability like no other, far too risky to have around. It would be absolutely irresponsible for me to leave him running for one second longer than is absolutely necessary."

I watched her carefully. She really wasn't a bastard. But she wasn't sentimental about technology. She didn't feel the spine-deep emotional tug at the thought of that one-of-a-kind system going down forever.

"So here's the plan." She tried to check the time on her workspace, tsked, and checked her phone instead. "It's 10AM. You are going to back up every bit of him --" She held up her hand, forestalling the objection I'd just begun to make. "I know that it will be inadequate. The perfect is the enemy of the good. You are a sysadmin. Back him up. Back. Him. Up. Then: Shut him off."

As cold as I was, I grew colder still. For a moment, I literally couldn't move. I had never really imagined that it would be me who would shut down BIGMAC. I didn't even know how to do it. If I did a clean shutdown of each of his servers -- assuming he hadn't locked me out of them, which I wouldn't put past him -- it would be like executing a criminal by slowly peeling away his skin and carefully removing each organ. Even if BIGMAC couldn't feel pain, I was pretty sure he could feel -- and express -- anguish.

"I can't do it," I said. She narrowed her eyes at me and set down her drink. I held up both hands like I was trying to defend against a blow, then explained as fast as I could.

"We'll just shut down his power," she said. "All at once."

"So, first, I have no idea what timescale he would experience that on. It may be that the final second of life as the capacitors in his power supplies drained would last for a subjective eternity, you know, hundreds and hundreds of years. That's a horrible thought. It's quite possibly my worst nightmare. I am not your man for that job."

She started to interject. I waved my hands again.

"Wait, that was first. Here's second: I don't think we can pull the plug on him. He's got root on his power-supply, it's part of how he's able to run so efficiently." I grimaced. "Efficiently compared to how he would run if he didn't have the authority to run all the mains power from the Institute's power-station right to his lab."

She looked thoughtful. I had an idea of what was coming next.

"You're thinking about that fire-ax again," I said.

She nodded.

"OK, a fire-ax through the main cable would definitely be terminal. The problem is that it would be mutually terminal. There's 66 amps provisioned on that wire. You would be a cinder. On Mars."

She folded her hands. She had a whole toolbox of bossly body-language she could deploy to make me squirm. It was impressive. I tried not to squirm.

"Look, I'm not trying to be difficult, but this is how it goes, down at the systems level. Remember all those specs in the requirements document to make our stuff resistant to flood, fire, avalanche, weather and terrorist attack? We take that stuff seriously. We know how to do it. You get five nines of reliability by building in six nines of robustness. You think of BIGMAC's lab as a building. It's not. It's a bunker. And you can't shut him down without doing something catastrophic to the whole Institute."

"So, how were you going to shut down BIGMAC, when the time came?"

"To tell you the truth, I wasn't sure. I thought I'd probably start by locking him out of the power systems, but that would probably take a week to be really certain of." I swallowed. I didn't like talking about the next part. "I thought that then I could bring forward the rotating maintenance on his racks, bring them down clean, and not bring the next one up. Pretend that I need to get at some pernicious bug. Bring down rack after rack, until his complexity dropped subcritical and he stopped being aware. Then just bring it all down."

"You were going to trick him?"

I swallowed a couple times. "It was the best I could come up with. I just don't want to put him down while he panics and thrashes and begs us for his life. I couldn't do it."

She drank more beer, then threw the half-empty container in her under-desk composter. "That's not much of a solution."

I took a deep breath. "Look, can I ask you a question?"

She nodded.

"I'm just a sysadmin. I don't know everything about politics and so on. But why not keep him on? There's enough public interest now, we could probably raise the money just from the researchers who want to come and look at him. Hell, there's security researchers who'd want to come and see how he pulled off that huge hairy spam. It's not money, right, not anymore?"

"No, it's not money. And it's not revenge, no matter how it looks. The bottom line is that we had a piece of apparatus on-site that we had thought of as secure and contained and that we've now determined to be dangerous and uncontainable."

I must have looked skeptical.

"Oh, you'll tell me that we can contain BIGMAC, put network blocks in place, and so on and so on. That he never meant any harm. But you would have said exactly the same thing 24 hours ago, with just as much sincerity, and you'd have been just as cataclysmically wrong. Between the threat of litigation and the actual damages BIGMAC might generate, we can't even afford to insure him anymore. Yesterday he was an awkward white elephant. Today he's a touchy suitcase nuke. My job is to get the nuke off of our site."
