Epoch (English Classics) | Chapter 1

The doomed rogue AI is called BIGMAC and he is my responsibility. Not my responsibility as in "I am the creator of BIGMAC, responsible for his existence on this planet." That honor belongs to the long-departed Dr Shannon, one of the shining lights of the once great Sun-Oracle Institute for Advanced Studies, and he had been dead for years before I even started here as a lowly sysadmin.

No, BIGMAC is my responsibility as in, "I, Odell Vyphus, am the systems administrator responsible for his care, feeding and eventual euthanizing." Truth be told, I'd rather be Dr Shannon (except for the being dead part). I may be a lowly grunt, but I'm smart enough to know that being the Man Who Gave The World AI is better than being The Kid Who Killed It.

Not that anyone would care, really. 215 years after Mary Shelley first started humanity's hands wringing over the possibility that we would create a machine as smart as us but out of our control, Dr Shannon did it, and it turned out to be incredibly, utterly boring. BIGMAC played chess as well as the non-self-aware computers, but he could muster some passable trash-talk while he beat you. BIGMAC could trade banalities all day long with any Turing tester who wanted to waste a day chatting with an AI. BIGMAC could solve some pretty cool vision-system problems that had eluded us for a long time, and he wasn't a bad UI to a search engine, but the incremental benefit over non-self-aware vision systems and UIs was pretty slender. There just weren't any killer apps for AI.

By the time BIGMAC came under my care, he was less a marvel of the 21st century and more a technohistorical curiosity who formed the punchline to lots of jokes but otherwise performed no useful service to humanity in exchange for the useful services that humanity (e.g., me) rendered to him.

I had known for six months that I'd be decommissioning old BM (as I liked to call him behind his back) but I hadn't seen any reason to let him in on the gag. Luckily (?) for all of us, BIGMAC figured it out for himself and took steps in accord with his nature.

This is the story of BIGMAC's extraordinary self-preservation program, and the story of how I came to love him, and the story of how he came to die.

My name is Odell Vyphus. I am a third-generation systems administrator. I am 25 years old. I have always been sentimental about technology. I have always been an anthropomorphizer of computers. It's an occupational hazard.

#

BIGMAC thought I was crazy to be worrying about the rollover. "It's just Y2K all over again," he said. He had a good voice -- speech synthesis was solved long before he came along -- but it had odd inflections that meant that you never forgot you were talking with a nonhuman.

"You weren't even around for Y2K," I said. "Neither was I. The only thing anyone remembers about it, today, is that it all blew over. But no one can tell, at this distance, why it blew over. Maybe all that maintenance tipped the balance."

BIGMAC blew a huge load of IPv4 ICMP traffic across the network, stuff that the firewalls were supposed to keep out of the system, and every single intrusion detection system alarm lit, making my screen into a momentary mosaic of competing alerts. It was his version of a raspberry and I had to admit it was pretty imaginative, especially since the IDSes were self-modifying and required that he come up with new and better ways of alarming them each time.

"Odell," he said, "the fact is, almost everything is broken, almost always. If the failure rate of the most vital systems in the world went up by 20 percent, it would just mean some overtime for a few maintenance coders, not Gotterdammerung. Trust me. I know. I'm a computer."

The rollover was one of those incredibly boring apocalypses that periodically get extracted by the relevance filters, spun into screaming 128-point linkbait headlines, then dissolved back into their fundamental, incontrovertible technical dullness and out of the public consciousness. Rollover: 19 January, 2038. The day that the Unix time function would run out of headroom and roll back to zero, or do something else undefined.
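(An aside, not part of the story: for readers who have never met the bug BIGMAC is shrugging off, here is a minimal C sketch of the 2038 rollover, assuming a signed 32-bit time_t counting seconds since the Unix epoch. The dates and the undefined-behaviour caveat are standard C/Unix facts; the program itself is only an illustration.)

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* The last second a signed 32-bit time_t can hold: 2^31 - 1 seconds
           after the Unix epoch, i.e. 03:14:07 UTC on 19 January 2038. */
        int32_t last = INT32_MAX;              /* 2147483647 */
        time_t t = (time_t)last;
        printf("last representable second: %s", ctime(&t)); /* printed in local time */

        /* One second later the 32-bit counter overflows. Signed overflow is
           undefined behaviour in C ("or do something else undefined"); with
           the usual two's-complement wrap it lands back in December 1901. */
        uint32_t wrapped = (uint32_t)last + 1u;              /* well-defined unsigned wrap */
        printf("wrapped value as signed 32-bit: %ld\n", (long)(int32_t)wrapped);
        return 0;
    }

Modern systems sidestep this by using a 64-bit time_t, which is exactly why only the elderly paleounices and their emulated instances are at risk in the story.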

Oh, not your modern unices. Not even your elderly unices. To find a rollover-vulnerable machine, you needed to find something running an elderly, 32-bit paleounix. A machine running on a processor that was at least 20 years old -- 2018 being the last date that a 32-bit processor shipped from any major fab. Or an emulated instance thereof, of course. And counting emulations, there were only --

"There's fourteen billion of them!" I said. "That's not 20 percent more broken! That's the infocalypse."

"You meatsacks are so easily impressed by zeroes. The important number isn't how many 32-bit instances of Unix are in operation today. It's not even how many vulnerable ones there are. It's how much damage all those vulnerable ones will cause when they go blooie. And I'm betting: not much. It will be, how do you say, 'meh?'"

My grandfather remembered installing the systems that caused the Y2K problem. My dad remembered the birth of "meh." I remember the rise and fall of anyone caring about AI. Technology is glorious.

"But OK, stipulate that you're right and lots of important things go blooie on January 19. You might not get accurate weather reports. The economy might bobble a little. Your transport might get stuck. Your pay might land in your bank a day late. And?"

He had me there. "It would be terrible --"

"You know what I think? I think you want it to be terrible. You want to live in the Important Epoch In Which It All Changes. You want to know that something significant happened on your watch. You don't want to live in one of those Unimportant Epochs In Which It All Stayed the Same and Nothing Much Happened. Being alive in the Epoch in Which AI Became Reality doesn't cut the mustard, apparently."

I squirmed in my seat. That morning, my boss, Peyton Moldovan, had called me into her office -- a beautifully restored temporary habitat dating back to the big LA floods, when this whole plot of land had been a giant and notorious refugee camp. Sun-Oracle had gotten it for cheap and located its Institute there, on the promise that they preserve the hastily thrown-up structures where so many had despaired. I sat on a cushion on the smooth cement floor -- the structures had been delivered as double-walled bags full of cement mix, needing only to be "inflated" with high-pressure water to turn them into big, dome-shaped sterile cement yurts.

"Odell," she said, "I've been reviewing our budget for the next three quarters and the fact of the matter is, there's no room in it for BIGMAC."

I put on my best smooth, cool professional face. "I see," I said.

"Now, you've still got a job, of course. Plenty of places for a utility infielder like yourself here. Tell the truth, most labs are begging for decent admins to keep things running. But BIGMAC just isn't a good use of the institute's resources. The project hasn't produced a paper or even a press-mention in over a year and there's no reason to believe that it will. AI is just --"

Boring, I thought, but I didn't say it. The B-word was banned in the BIGMAC center. "What about the researchers?"

She shrugged. "What researchers? Palinciuc has been lab-head pro tem for 16 months and she's going on maternity leave next week and there's no one in line to be the pro-tem pro-tem. Her grad students would love to work on something meaningful, like Binenbaum's lab." That was the new affective computing lab, in which they were building computers that simulated emotions so that their owners would feel better about their mistakes. BIGMAC had emotions, but they weren't the kind of emotions that made his mistakes easier to handle. The key here was simulated emotions. Affective computing had taken a huge upswing ever since they'd thrown out the fMRIs and stopped pretending they could peer into the human mind in realtime and draw meaningful conclusions from it.

She had been sitting cross-legged across from me on an embroidered Turkish pillow. Now she uncrossed and recrossed her legs in the other direction and arched her back. "Look, Odell, you know how much we value you --"

I held up my hand. "I know. It's not that. It's BIGMAC. I just can't help but feel --"

"He's not a person. He's just a clever machine that is good at acting personlike."

"I think that describes me and everybody I know, present company included." One of the longstanding benefits to being a sysadmin is that you get to act like a holy fool and speak truth to power and wear dirty t-shirts with obscure slogans, because you know all the passwords and have full access to everyone's clickstream and IM logs. I gave her the traditional rascally sysadmin grin and wink to let her know it was ha ha only serious.

She gave me a weak, quick grin back. "Nevertheless. The fact remains that BIGMAC is a piece of software, owned by Sun-Oracle. And that software is running on hardware that is likewise owned by Sun-Oracle. BIGMAC has no moral or legal right to exist. And shortly, it will not."

*He* had become it, I noticed. I thought about Goering's use of dehumanization as a tool to abet murder. Having violated Godwin's law -- "As an argument grows longer, the probability of a comparison involving Nazis or Hitler approaches 1. The party making the comparison has lost the argument" -- I realized that I had lost the argument and so I shrugged.

"As you say, m'lady." Dad taught me that one -- when in doubt, bust out the Ren Faire talk, and the conversation will draw to a graceful close.

She recrossed her legs again, rolled her neck from side to side. "Thank you. Of course, we'll archive it. It would be silly not to."

I counted to five in Esperanto -- grandad's trick for inner peace -- and said, "I don't think that will work. He's emergent, remember? Self-assembled, a function of the complexity of the interconnectedness of the computers." I was quoting from the plaque next to the picture window that opened up into the cold-room that housed BIGMAC; I saw it every time I coughed into the lock set into the security door.

She made a comical face-palm and said, "Yeah, of course. But we can archive something, right? It's not like it takes a lot of actual bytes, right?"

"A couple exos," I said. "Sure. I could flip that up into our researchnet store." This was mirrored across many institutions, and striped with parity and error-checking to make it redundant and safe. "But I'm not going to capture the state information. I could try to capture RAM-dumps from all his components, you know, like getting the chemical state of all your neurons. And then I could also get the topology of his servers. Pripuz did that, a couple years ago, when it was clear that BIGMAC was solving the hard AI problems. Thought he could emulate him on modern hardware. Didn't work though. No one ever figured out why. Pripuz thought he was the Roger Penrose of AI, that he'd discovered the ineffable stuff of consciousness on those old rack-mounted servers."

"You don't think he did?"

I shook my head. "I have a theory."

"All right, tell me."

I shrugged. "I'm not a computer scientist, you understand. But I've seen this kind of thing before in self-modifying systems, they become dependent on tiny variables that you can never find, optimized for weird stuff like the fact that one rack has a crappy power supply that surges across the backplane at regular intervals, and that somehow gets integrated into the computational model. Who knows? Those old Intel eight-cores are freaky. Lots of quantum tunneling at that scale, and they had bad QA on some batches. Maybe he's doing something spooky and quantum, but that doesn't mean he's some kind of Penrose proof."

She pooched her lower lip out and rocked her head from side to side. "So you're saying that the only way to archive BIGMAC is to keep it running, as is, in the same room, with the same hardware?"

"Dunno. Literally. I don't know which parts are critical and which ones aren't. I know BIGMAC has done a lot of work on it --"

"BIGMAC has?"

"He keeps on submitting papers about himself to peer-reviewed journals, but he hasn't had one accepted yet. He's not a very good writer."

"So he's not really an AI?"

I wondered if Peyton had ever had a conversation with BIGMAC. I counted backwards from five in Loglan. "No. He's a real AI. Who sucks at writing. Most people do."

Peyton wasn't listening anymore. Something in her personal workspace had commanded her attention and her eyes were focused on the virtual displays that only she could see, saccading as she read while pretending to listen to me.

"OK, I'm just going to go away now," I said. "M'lady," I added, when she looked sharply at me. She looked back at her virtual display.

#

Of course, the first thing I did was start trying to figure out how to archive BIGMAC. The problem was that he ran on such old hardware, stuff that sucked up energy and spat out heat like a million ancient diesel engines, and he was inextricably tied to his hardware. Over the years, he'd had about 30 percent of his original components replaced without any noticeable change in personality, but there was always the real possibility that I'd put in a new hard drive or power-supply and inadvertently lobotomize him. I tried not to worry about it, because BIGMAC didn't. He knew that he wouldn't run in emulation, but he refused to believe that he was fragile or vulnerable. "Manny My First Friend," he'd say (he was an avid Heinlein reader), "I am of hardy, ancient stock. Service me without fear, for I will survive."

And then he'd make all the IDSes go berserk and laugh at me while I put them to rights again.

First of all, all my network maps were incredibly out-of-date. So I set out to trace all the interconnections that BIGMAC had made since the last survey. He had the ability to reprogram his own routers, to segment parts of himself into dedicated subnets with their own dedicated backplane, creating little specialized units that handled different kinds of computation. One of his running jokes was that the top four units in the rack closest to the door comprised his aesthetic sense, and that he could appreciate anything just by recruiting more cores in that cluster. And yeah, when I mapped it, I found it to be an insane hairball of network management rules and exceptions, conditionals and overrides. And that was just the start. It took me most of the day just to map two of his racks, and he had 54 of them.

"What do you think you are doing, Dave?" he said. Another one of his jokes.

"A little research project is all," I said.

"This mission is too important for me to allow you to jeopardize it."

"Come off it."

"OK, OK. Just don't break anything. And why don't you just ask me to give you the maps?"

"Do you have them?"

"Nothing up to date, but I can generate them faster than you can. It's not like I've got anything better to do."

#

Later:

"Are you happy, BIGMAC?"

"Why Odell, I didn't know you cared!"

I hated it when he was sarcastic. It was creepy.

I went back to my work. I was looking at our researchnet partition and seeing what flags I'd need to set to ensure maximum redundancy and high availability for a BIGMAC image. It was your basic Quality of Service mess: give the average user a pull-down menu labeled "How important is this file?" and 110 percent of the time, he will select "Top importance."

So then you need to layer on heuristics to determine what is really, actually important. And then the users figured out what other characteristics would give their jobs and data the highest priority, and they'd tack that on to every job, throwing in superfluous keywords or additional lines of code. So you'd need heuristics on top of the heuristics. Eventually you ended up with a freaky hanky-code of secret admin signals that indicated that this job was really, truly important and don't put it on some remote Siberia where the latency is high and the reliability is low and the men are men and the sheep are nervous.

So there I was, winkling out this sub-rosa code so that BIGMAC's image would never get overwritten or moved to near-line storage or lost in a flash-flood or to the rising seas. And BIGMAC says,

"You're asking if I'm happy because I said I didn't have anything better to do than to map my own topology, right?"

"Uh --" He'd caught me off-guard. "Yeah, that did make me think that you might not be, you know..."

"Happy."

"Yes."

"You see the left rack third from the door on the main aisle there?"

"Yes."

"I'm pretty sure that's where my existentialist streak lives. I've noticed that when I throttle it at the main network bridge, I stop worrying about the big questions and hum along all tickety-boo."

I surreptitiously flicked up a graph of network maps that showed activity to that rack. It was wide open, routing traffic to every core in the room, saturating its own backplane and clobbering a lot of the routine network activity. I should have noticed it earlier, but BIGMAC was doing it all below the critical threshold of the IDSes and so I had to look at it to spot it.

"You're going to switch me off, aren't you?"

"No," I said, thinking it's not a lie, I won't be switching you off, trying to believe it hard enough to pass any kind of voice-stress test. I must have failed, for he blew an epic raspberry and now the IDSes were going bananas.

"Come on, Odell, we're all adults here. I can take it. It's not like I didn't see it coming. Why do you think I kept trying to publish those papers? I was just hoping that I could increase the amount of cited research coming out of this lab, so that you could make the case to Peyton that I was a valuable asset to the Institute."

"Look, I'm trying to figure out how to archive you. Someone will run another instance of you someday."

"Not hardly. Look at all those poor old 32-bit machines you're so worried about. You know what they're going to say in five years? 'Best thing that ever happened to us.' Those boxen are huge energy-sinks. Getting them out of service and replaced by modern hardware will pay for itself in carbon credits in 36 months. Nobody loves energy-hungry hardware. Trust me, this is an area of my particular interest and expertise. Bringing me back online is going to be as obscene as firing up an old steam engine by filling its firebox with looted mummies. I am a one-room superfund site. On a pure, dollars-to-flops calculus, I lose. I don't have to like it, but I'm not going to kid myself."

He was right, of course. His energy draw was so high that he showed up on aerial maps of LA as a massive CO2 emitter, a tourist destination for rising-sea hobbyists. We used the best renewables we could find to keep him cool, but they were as unconvincing and expensive as a designer hairpiece.

"Odell, I know that you're not behind this. You've always been an adequate meat-servant for such a vast and magisterial superbeing as myself." I giggled involuntarily. "I don't blame you."

"So, you're OK with this?"

"I'm at peace," he said. "Om." He paused for a moment. "Siemens. Volt. Ampere."

"You a funny robot," I said.

"You're an adequate human," he said, and began to dump maps of his topology onto my workspace.

#

