The problem with having your dream job is that you lose the right to complain. Marcos offered me that advice on my first day at Oak Ridge. I was only 29 but had already cashed out of Silicon Valley to join the Oak Ridge Leadership Computing Facility, part of Oak Ridge National Laboratory in Tennessee.
I had spent more than a decade bouncing around Silicon Valley startups. Money no longer meant much, but I was anxious to escape the entrepreneurial pressures and the general dog-and-pony show required to woo angel investors or launch glitzy IPOs. Oak Ridge offered the technology without the relentless capitalist drive.
Oak Ridge was an easy place to keep a low profile. People knew well enough not to ask follow-up questions if you said you worked there. It was delightfully anonymous. I paid cash for a suburban McMansion. On one side lived a retired couple, both physicians from Chattanooga. On the other was a patent lawyer who used to represent Fortune 100 companies but now did some work for Oak Ridge. She was pleasant enough, but we rarely crossed paths.
I spent evenings on the balcony off the master bedroom, usually with a few fingers of bourbon or a bottle of California cabernet I had spent God-knows-how-much on before moving east. I didn’t know how long I would be there. But it was a damn nice life as a government academic, even if it was just a break from the Valley.
I was part of a small team of computer scientists, neuroscientists, and engineers. My programming expertise fell below the threshold of brilliant, I thought, but my ability to communicate how systems worked was valuable in a diverse team. That was why they must’ve hired me, I figured.
Our job was to find out how to transform near-infinite computing capacity into better human intelligence. Others worked on more benign applications—weather modeling, for example—but we focused on processing datasets to extract patterns in behavior.
By the time I arrived in 2017, they already had a stable quantum computer. No one outside of Oak Ridge knew about its discovery years before, or the ramifications. Most journalists and Congressmen viewed supercomputing as a pissing contest among the major powers. They couldn’t grasp the value of the exponential leap in processing speed. We could break 256-bit encryption in every existing system—emails, bank transactions, state secrets, even military communications.
There was another flawed assumption: that protecting secrets depended on limiting the output of electronic communications. Machine learning at quantum speeds had taken us beyond that primitive requirement. We could fill gaps in a digital profile. It was an advanced version of the rudimentary “digital zoom” feature of cameras. Optical zoom maintains resolution even when increasing magnification. Digital zoom separates adjacent pixels, then uses computer-derived estimates to insert best-fit pixels between the existing ones.
The same concept worked in human life. At a certain scale, we all become predictable. Years of GPS tracking from a phone can predict, with surprising accuracy, where a person is likely to be at any given time. And that was just a single data point. The Internet of Things layered in mundane but essential details: milk runs low; the refrigerator pushes an alert to your smartphone; your credit card records a grocery store purchase a few hours later.
These were the bits of data we used as an initial set of inputs to calibrate our machine learning algorithms. At first, we used ourselves as the subjects, feeding inputs from our own digital trails into the database and assessing the predictions against our behavior. Yet conscious monitoring affected our choices, especially when we worked on longitudinal predictions that required months or years to verify.
We were also far too small a sample size. That was the crux of our pitch to the National Security Agency to patch into one of their data streams. Nothing had changed since Snowden’s leaks in 2013. Admittedly, I was morally apathetic. Unless someone moved to the Montana woods and split firewood for a living, their information would end up pooled in private or government databases regardless of any effort to hide a digital trail. We maintained controls to anonymize our information.
On Monday, February 20, 2017, everything changed. It was unseasonably warm, warm enough that I thought it would be a good idea to open the windows when I got home that evening. The air was getting stale after a long winter.
I was usually one of the first people to get to my workstation. For all the high-tech equipment, we still worked in a government building full of featureless rooms and drab interiors. In our shared office, the organization was haphazard, with desks and monitors aligned at all angles and brightly colored cords snaking up the walls and duct-taped to the floor.
I swiped the card reader for our office, waited for the magnetic release, and opened the door. As it swung open, I was surprised to see the Oak Ridge director, Sanjay Rai, and two men I didn’t recognize. Sanjay looked diminutive beside his bulky companions. I worried that it was the start of another interminable audit. One of the suits turned his head toward Sanjay.
“That’s him?”
Sanjay nodded.
They walked briskly toward me and grabbed my arms, pinning them to my sides. I couldn’t see behind me, but I saw them both look past me and nod. I felt a sting at the base of my neck. Gradually, the tension released from my arms. I felt my slight frame begin to rely on their grip for stability.
A familiar voice came from behind me: “Shouldn’t be long now,” Marcos said.
I no longer felt my body. There was no fear or anger. I felt calm, even introspective. Why had I taken this job? I thought back to my final months in California. I wasn’t unhappy. Just a bit restless.
Perhaps there were clues I hadn’t noticed: the increasingly long showers; a syntactical shift in email replies; the untouched bank accounts; the neglected treadmill. The list kept growing, of inputs intended and omitted. Analyzing these minutiae would be the great breakthrough for the algorithm. These were the details that would make it possible to identify vulnerable individuals who could be manipulated without perceiving outside interference.
That was when it hit me: I had not been hired to create a system. I had been hired to prove that another one worked.
“My God,” I mumbled.
Marcos smiled.