Part 4 (1/2)
In addition, the Air Force wanted a secure virtual private network that could mask the IP addresses behind all of this persona traffic. Every day, each user would get a random IP address to help hide "the existence of the operation." The network would further mask this persona work by "traffic mixing, blending the user's traffic with traffic from multitudes of users from outside the organization. This traffic blending provides excellent cover and powerful deniability."
This sort of work most interested HBGary Federal's Aaron Barr, who was carving out a niche for himself as a social media expert. Throughout late 2010 and early 2011, he spent large chunks of his time attempting to use Facebook, Twitter, and Internet chat to map the network of Exelon nuclear plant workers in the US and to research the members of Anonymous. As money for his company dried up and government contracts proved hard to come by, Barr turned his social media ideas on pro-union forces, getting involved in a now-controversial project with two other security firms.
But e-mails make clear that he mostly wanted to sell this sort of capability to the government. "We have other customers, mostly on offense, that are interested in Social Media for other things," he wrote in August 2010. "The social media stuff seems like low hanging fruit."
How does one use social media and fake "personas" to do anything of value? An e-mail from Barr on August 22 makes his thinking clear. Barr ponders "the best way to go about establishing a persona to reach an objective (in this case ft. belvoir/INSCOM/1st IO)."
The Army's Fort Belvoir, like any secretive institution, might be more easily penetrated by pretending to be an old friend of a current employee. "Make your profile swim in a large sea," Barr wrote. "Pick a big city, big high school, big company. Work your way up and in. Recreate your history. Start by friending high school people. In my case I am in the army so after you have amassed enough friends from high school, then start friending military folks outside of your location, something that matches the area your in, bootcamp, etc. Lastly start to friend people from the base, but start low and work your way up. So far so good."
Once the persona had this network of friends, "I will start doing things tricky. Try to manipulate conversations, insert communication streams, etc," said Barr. This sort of social media targeting could also be used to send your new "friend" documents or files (such as the Al-Qaeda poison document discussed above) [that] come complete with malware, or to direct them to specially-crafted websites designed to elicit some specific piece of information: directed attacks known as "spear phishing."
But concerns arose about obtaining and using social media data, in part because sites like Facebook restricted the "scraping" of user data. An employee from the link analysis firm Palantir wrote Barr at the end of August, asking, "Is the idea that we'd want to ingest all of Facebook's data, or just a targeted subset for a few users of interest?"
The more data that was grabbed from Facebook, the more chance a problem could arise. The Palantir employee noted that a researcher had used similar tools to violate Facebook's acceptable use policy on data scraping, "resulting in a lawsuit when he crawled most of Facebook's social graph to build some statistics. I'd be worried about doing the same. (I'd ask him for his Facebook data-he's a fan of Palantir-but he's already deleted it.)"
Still, the potential usefulness of sites like Facebook was just too powerful to ignore, acceptable use policy or not.
Feeling twitchy

While Barr fell increasingly in love with his social media sleuthing, Hoglund still liked researching his rootkits. In September, the two teamed up for a proposal to DARPA, the Defense Advanced Research Projects Agency that had been instrumental in creating the Internet back in the 1960s.
DARPA didn't want incrementalism. It wanted breakthroughs (one of its most recent projects is the "100-Year Starship Study"), and Barr and Hoglund teamed up for a proposal to help the agency on its Cyber Insider Threat (CINDER) program. CINDER was an expensive effort to find new ways to watch employees with access to sensitive information and root out double agents or disgruntled workers who might leak classified information.
So Barr and Hoglund drafted a plan to create something like a lie detector, except that it would look for signs of "paranoia" instead.
"Like a lie detector detects physical changes in the body based on sensitivities to specific questions, we believe there are physical changes in the body that are represented in observable behavioral changes when committing actions someone knows is wrong," said the proposal. "Our solution is to develop a paranoia-meter to measure these observables."
The idea was to take an HBGary rootkit like 12 Monkeys and install it on user machines in such a way that users could not remove it and might not even be aware of its presence. The rootkit would log user keystrokes, of course, but it would also take "as many behavioral measurements as possible" in order to look for suspicious activity that might indicate wrongdoing.
What sort of measurements? The rootkit would monitor "keystrokes, mouse movements, and visual cues through the system camera. We believe that during particularly risky activities we will see more erratic mouse movements and keystrokes as well as physical observations such as surveying surroundings, shifting more frequently, etc."
The rootkit would also keep an eye on what files were being accessed, what e-mails were being written, and what instant messages were being sent. If necessary, the software could record a video of the user's computer screen activity and send all this information to a central monitoring office. There, software would try to pick out employees exhibiting signs of paranoia, who could then be scrutinized more closely.
Huge and obvious challenges presented themselves. As the proposal noted:

Detecting insider threat actions is highly challenging and will require a sophisticated monitoring, baselining, analysis, and alerting capability. Human actions and organizational operations are complex. You might think you can just look for people that are trying to gain access to information outside of their program area of expertise. Yet there are legitimate reasons for accessing this information. In many cases the activity you might call suspicious can also be legitimate. Some people are more or less inquisitive and will have different levels of activity in accessing information outside their specific organization. Some of the behaviors on systems vary widely depending on function. Software developer behavior will be very different than an HR person or senior manager. All of these factors need to be taken into account when developing detection capabilities for suspicious activity. We cannot focus on just [whether] a particular action is potentially suspicious. Instead we must quantify the legitimate reasons for the activity and whether this person has a baseline, position, attributes, and history to support the activity.
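To make the proposal's reasoning concrete, here is a minimal, hypothetical sketch of that kind of per-user baselining: measure each user against their own history and surface only large deviations for human review. The feature names, thresholds, and scoring below are invented for illustration and are not taken from HBGary's actual design.

```python
# Illustrative sketch only: a toy per-user baseline/anomaly score of the kind the
# proposal describes. All field names and thresholds are hypothetical.
from statistics import mean, stdev

def zscore(value, history):
    """How far today's measurement sits from this user's own baseline."""
    if len(history) < 2:
        return 0.0
    mu, sigma = mean(history), stdev(history)
    return 0.0 if sigma == 0 else (value - mu) / sigma

def paranoia_score(today, baseline):
    """Combine per-feature deviations into a single score for an analyst to review.

    `today` maps feature names (e.g. 'files_outside_program') to today's count;
    `baseline` maps the same names to lists of historical counts for this user.
    """
    return sum(max(0.0, zscore(today[f], baseline.get(f, []))) for f in today)

# Usage: flag users whose combined deviation exceeds some review threshold.
baseline = {"files_outside_program": [2, 3, 1, 2, 4], "erratic_mouse_events": [5, 6, 4, 5, 7]}
today = {"files_outside_program": 14, "erratic_mouse_events": 9}
if paranoia_score(today, baseline) > 3.0:   # threshold chosen arbitrarily
    print("queue for closer human review")
```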
DARPA apparently did not choose to fund the plan.
Grey areas

The ideas got ever more grandiose. Analyzing malware, HBGary's main focus, wasn't enough to keep up with the hackers; Hoglund had a plan to get a leg up on the competition by getting even closer to malware authors. He floated an idea to sniff Russian GSM cell phone signals in order to eavesdrop on hackers' voice calls and text messages.
"GSM is easily sniffed," he wrote to Barr. "There is a SHIELD system for this that not only intercepts GSM 5.1 but can also track the exact physical location of a phone. Just to see what's on the market, check [redacted]... these have to be purchased overseas obviously."
The note concluded: "Home alone on Sunday, so I just sit here and sharpen the knife."
Barr, always enthusiastic for these kinds of ideas, loved this one. He wanted to map out everything that would be required for such an operation, including "personas, sink holes, honey nets, soft and hard assets... We would want at least one burn persona. We would want to sketch out a script to meet specific objectives."
And, he noted, "We will likely ride in some grey areas."
Back to basics

In January 2011, Barr had moved on to his research into Anonymous, research that would eventually do his company in. Over at HBGary, Hoglund continued his pursuit of next-gen rootkits. He had hit on a new approach that he called "Magenta."
This would be a "new breed of Windows-based rootkit," said a Magenta planning document, one that HBGary called a "multi-context rootkit."
The Magenta software would be written in low-level assembly language, one step up from the ones and zeroes of the binary code with which computers do their calculating. It would inject itself into the Windows kernel, and then inject itself further into an active process; only from there would the main body of the rootkit execute.
Magenta would also inject itself routinely into different processes, jumping around inside the computer's memory to avoid detection. Its command-and-control instructions, telling the rootkit exactly what to do and where to send the information, wouldn't come from some remote Internet server but from the host computer's own memory, where the control instructions had been separately injected.
"This is ideal because it's trivial to remotely seed C&C messages into any networked Windows host," noted Hoglund, "even if the host in question has full Windows firewalling enabled."
Nothing like Magenta existed (not publicly, at least), and Hoglund was sure that he could squeeze the rootkit code into less than 4KB of memory and make it "almost impossible to remove from a live running system." Once running, all of the Magenta files on disk could be deleted. Even the best anti-rootkit tools, those that monitored physical memory for signs of such activity, "would only be of limited use since by the time the responder tried to verify his results Magenta will have already moved to a new location & context."
Hoglund wanted to build Magenta in two parts: first, a prototype for Windows XP with Service Pack 3, an old operating system but still widely installed. Second, if the prototype generated interest, HBGary could port the rootkit "to all current flavors of Microsoft Windows."
Shortly thereafter, Anonymous broke into HBGary Federal's website, cracked Barr's hashed password using rainbow tables, and found themselves in a curious position; Barr was also the administrator for the entire e-mail system, so they were able to grab e-mail from multiple accounts, including Hoglund's.
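As a rough illustration of why that kind of attack works: if a site stores fast, unsalted hashes, an attacker can precompute hashes for likely passwords ahead of time and simply look a stolen hash up. Real rainbow tables compress that precomputed table with hash/reduction chains rather than storing every pair; the sketch below assumes a plain lookup table and an unsalted MD5-style scheme purely for illustration, not HBGary's actual configuration.

```python
# Minimal illustration of precomputed-hash lookup and why per-user salts defeat it.
# The wordlist and passwords here are toy values, not anything from the incident.
import hashlib

def md5_hex(password: str) -> str:
    return hashlib.md5(password.encode()).hexdigest()

# "Precomputed" table built once from a candidate wordlist.
wordlist = ["letmein", "hunter2", "qwerty123"]
lookup = {md5_hex(p): p for p in wordlist}

stolen_hash = md5_hex("hunter2")                  # stands in for a hash pulled from a database
print(lookup.get(stolen_hash, "not in table"))    # -> hunter2

# A per-user random salt breaks the shared table: the same precomputation
# no longer applies to every account, so each hash must be attacked separately.
def salted(password: str, salt: str) -> str:
    return hashlib.md5((salt + password).encode()).hexdigest()
```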
A world awash in rootkits

The leaked e-mails provide a tantalizing glimpse of life behind the security curtain. HBGary and HBGary Federal were small players in this space; indeed, HBGary appears to have made much of its cash with more traditional projects, like selling anti-malware defense tools to corporations and scanning their networks for signs of infection.
If rootkits, paranoia monitors, cartoons, and fake Facebook personas were being proposed and developed here, one can only imagine the sorts of classified projects underway throughout the entire defense and security industry.
Whether these programs are good or bad depends upon how they are used. Just as Hoglund's rootkit expertise meant that he could both detect and author rootkits, 0-day exploits and rootkits in government hands can be turned to many uses. The FBI has had malware like CIPAV (the Computer and Internet Protocol Address Verifier) for several years, and it's clear from the HBGary e-mail leak that the military is in wide possession of rootkits and other malware of its own. The Stuxnet virus, widely believed to have at least damaged Iranian nuclear centrifuge operations, is thought to have originated with the US or Israeli governments, for instance.
But the e-mails also remind us how much of this work is carried out privately and beyond the control of government agencies. We found no evidence that HBGary sold malware to nongovernment entities intent on hacking, though the company did have plans to repurpose its DARPA rootkit idea for corporate surveillance work. ("HBGary plans to transition technology into commercial products," it told DARPA.) And another document, listing HBGary's work over the last few years, included this entry: "HBGary had multiple contracts with a consumer software company to add stealth capability to their host agent."
The actions of HBGary Federal's Aaron Barr also serve as a good reminder that, when they're searching for work, private security companies are more than happy to switch from military to corporate clients, and they bring some of the same tools to bear.
When asked to investigate pro-union websites and WikiLeaks, Barr turned immediately to his social media toolkit and was ready to deploy personas, Facebook scraping, link analysis, and fake websites; he also suggested computer attacks on WikiLeaks infrastructure and that pressure be brought upon journalists like Glenn Greenwald.
His compatriots at Palantir and Berico showed, in their many e-mails, few if any qualms about turning their national security techniques upon private dissenting voices. Barr's ideas showed up in Palantir-branded PowerPoints and Berico-branded "scope of work" documents. "Reconnaissance cells" were proposed, network attacks were acceptable, "target dossiers" on "adversaries" would be compiled, and "complex information campaigns" involving fake personas were on the table.
Critics like Glenn Greenwald contend that this nexus of private and public security power is a dangerous mix. "The real issue highlighted by this episode is just how lawless and unrestrained is the unified axis of government and corporate power," he wrote last week.
Especially (though by no means only) in the worlds of the Surveillance and National Security State, the powers of the state have become largely privatized. There is very little separation between government power and corporate power. Those who wield the latter intrinsically wield the former.
The revolving door between the highest levels of government and corporate offices rotates so fast and continuously that it has basically flown off its track and no longer provides even the minimal barrier it once did. It's not merely that corporate power is unrestrained; it's worse than that: corporations actively exploit the power of the state to further entrench and enhance their power.
Even if you don't share this view, the e-mails provide a fascinating glimpse into the origins of government-controlled malware. Given the number of rootkits apparently being developed for government use, one wonders just how many machines around the globe could respond to orders from the US military. Or the Chinese military. Or the Russian military.
While hackers get most of the attention for their rootkits and botnets and malware, state actors use the same tools to play a different game, the Great Game, and it could be coming soon to a computer near you.
The RSA security conference took place February 14-18 in San Francisco, and malware response company HBGary planned on a big announcement. The firm was about to unveil a new appliance called "Razor," a specialized computer plugged into corporate networks that could scan company computers for viruses, rootkits, and custom malware, even malicious code that had never been seen before.
Razor "captures all executable code within the Windows operating system and running programs that can be found in physical memory," said HBGary, and it then "'detonates' these captured files within a virtual machine and performs extremely low level tracing of all instructions." Certain behaviors, rather than confirmed signatures, would suggest the presence of malware inside the company.
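In spirit, behavior-based detection of this kind scores what a sample does when it runs rather than matching known byte patterns. The toy sketch below uses invented behavior names, weights, and thresholds to illustrate that idea; it is not Razor's actual heuristic set.

```python
# Hedged sketch of behavior-based (rather than signature-based) flagging.
# Behavior names and weights are hypothetical, chosen only to show the scoring idea.
SUSPICIOUS_BEHAVIORS = {
    "writes_to_another_process_memory": 5,
    "installs_keyboard_hook": 3,
    "disables_security_service": 5,
    "contacts_hardcoded_ip": 2,
}

def score_trace(observed_behaviors):
    """observed_behaviors: behaviors recorded while the sample ran in a sandbox/VM."""
    return sum(SUSPICIOUS_BEHAVIORS.get(b, 0) for b in observed_behaviors)

trace = ["installs_keyboard_hook", "contacts_hardcoded_ip"]
verdict = "flag for analyst" if score_trace(trace) >= 5 else "likely benign"
print(verdict)
```

The appeal of this approach, as the article describes it, is that a never-before-seen sample can still trip the same behavioral wires even though no signature for it exists yet.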
The HBGary team headed over early to the RSA venue at the Moscone Center in order to set up their booth on the exhibition floor. Nerves were on edge. A week before, HBGary and related company HBGary Federal were both infiltrated by members of the hacker collective Anonymous, which was upset that HBGary Federal CEO Aaron Barr had compiled a dossier of their alleged real names. In the wake of the attack, huge batches of sensitive company e-mail had been splashed across the 'Net. HBGary employees spent days cleaning up the electronic mess and mending fences with customers.
On the RSA floor, a team put together the HBGary booth and prepared for the Razor announcement. CEO Greg Hoglund prepped his RSA talk, called "Follow the Digital Trail."
The HBGary team left for the night. When they returned the next morning, the opening day of the conference, they found a sign in their booth. It was from Anonymous.
"We had a lot to think about," HBGary's Vice President of Services, Jim Butterworth, told Ars. "We had just spent the previous week trying to clean things up and get ourselves back to normal, harden our systems, [and we] continued to hear the telephone calls and the threats-and I will add, these are very serious threats."