Cyber-Security Chief Resigns in Protest
Calls consolidation of power in National Security Agency 'bad strategy'
The official in charge of coordinating the U.S. government's cyber-security operations has quit, saying the expanding control of the National Security Agency over the nation's computer security efforts poses "threats to our democratic processes."
"Even from a security standpoint, it is unwise to hand over the security of all government networks to a single organization," said Rod A. Beckstrom, the head of the Department of Homeland Security's National Cyber Security Center (NCSC) when speaking to United Press International.
"If our Founding Fathers were taking part in this debate [about the future organization of the government's cyber-security activities], there is no doubt in my mind they would support a separation of security powers among different [government] organizations, in line with their commitment to checks and balances," he said.
In a letter to Homeland Security Secretary Janet Napolitano last week, Mr. Beckstrom said the NSA "dominates most national cyber efforts" and "effectively controls DHS cyber efforts through detailees, technology insertions and the proposed move" of the NCSC to an NSA facility at the agency's Fort Meade, Md., headquarters.
"I believe this is a bad strategy on multiple grounds," Mr. Beckstrom wrote in the letter, a copy of which was obtained by UPI. "The intelligence culture is very different than a network operations or security culture. In addition, threats to our democratic processes are significant if all top-level government network security and monitoring are handled by any one organization."
Greg Garcia, who was the Bush administration's first presidentially appointed head of cyber-security at DHS before leaving last December - and who worked with Mr. Beckstrom for nine months - told UPI that although he did not share Mr. Beckstrom's anxiety, "I recognize the cautionary flag he is raising."
Mr. Beckstrom's resignation - after less than a year in office - comes as the Obama administration moves to complete a 60-day review of the way cyber-security efforts are organized in the U.S. government. Successive administrations have wrestled with the complex problem of how to delineate and define the roles of various intelligence, military and security agencies in assuring the integrity of the nation's computer networks, the vast majority of which are owned and operated by the private sector and depend for their efficacy on their open and accessible, and therefore security-unfriendly, architecture.
"There's been a lot of duplication and not enough coordination," Jessica Herrera-Flanigan, a former senior congressional staffer on the House Homeland Security Committee, told UPI.
Mr. Garcia said there had been a "fairly collaborative partnership, not just between NSA and DHS, but ... with a whole lot of moving parts" and different agencies within the government.
"Clearly, both operationally and technologically, the intelligence community is a key element," he said of the sprawling and sometimes fractious collection of spy agencies that serve the U.S. government. But he said DHS' role had to be primary "from a legal standpoint and from a trust and privacy standpoint."
"Unlike the [Department of Defense] or the intelligence community, DHS has a statutory responsibility to work across all levels of federal, state and local government and the private sector," he said.
DHS has come under fire for its cyber-security work, with some criticizing an approach they saw characterized by turf squabbles and overlapping and contradictory lines of authority. Some, most recently including Director of National Intelligence Dennis C. Blair, have called for a greater role for U.S. intelligence agencies in cyber-security as a result of the 60-day review, which is being led by an official in Mr. Blair's office.
Mr. Garcia acknowledged what he called "growing pains" in DHS' cyber-security efforts but maintained it would be a mistake to shift primary responsibility for the issue away from the department.
"If there were a move," as a result of the 60-day review, "to centralize or focus cyber-security strategy on the intelligence community, that would jeopardize the relationship we [at DHS] built up over several years with the private sector."
Another Bush administration DHS official, former Assistant Secretary for Policy Stewart Baker, told UPI that although Mr. Beckstrom's criticism of the NSA's role was receiving more media attention, "I suspect his frustration was driven as much by the funding and organizational issues as by NSA."
In his resignation letter, Mr. Beckstrom wrote that "the NCSC did not receive appropriate support inside DHS. ... During the past year, the NCSC received only five weeks of funding, due to various roadblocks engineered within the department and by the Office of Management and Budget."
"Someone canceled all our contracts for office space, computers and furniture ... without telling us," he told UPI. "I never had a one-on-one meeting with the new secretary, although I reported directly to her ... and last year, there were only five weeks during which we had access to the money to make hires, rent office space and buy equipment we needed."
"He came from a very different background" than most federal officials, said Miss Herrera-Flanigan of Mr. Beckstrom, who was a Silicon Valley entrepreneur and author before joining the department.
"There was a big challenge" for him in negotiating "the dynamics between the different players" in the department, she said. "Unless he was given the authorities to coordinate across the department, it would have been a problem," she added.
Mr. Beckstrom said the center he ran, set up to oversee and coordinate all federal government cyber security activity, "has a role which is much coveted by others in government, so there were natural tensions."
"Rod is a friend and a remarkable talent," Mr. Baker said. "He understood Washington much better than most in Silicon Valley. His inability to move the bureaucracy shows how deep is the divide between government and the tech community."
Department spokeswoman Amy Kudwa told UPI that DHS "has a strong relationship with the NSA ... and is fully engaged with the 60-day cyber-security review. ... We look forward to our continued, positive working relationship with all our partners on outreach to the private sector as we strive to further secure our nation's cyber networks.
"We thank Rod for his service, and regret his departure," she concluded.
The NSA's public affairs office referred requests for comment to DHS.
The dangers of a high-information diet
NO ONE ever tells you how dangerous this stuff can be: they just go on pumping it out, hour after hour, day after day. You're consuming it right now, without a clue about the possible consequences. The worst thing is, evolution has predisposed your brain to crave it as much as your body craves fat and sugar. And these days - as with fat and sugar - you can get it everywhere.
That's because we live in the information age - and the stuff that risks doing the damage is information itself. As certain scientists and philosophers see it, the discovery and dissemination of knowledge is far from being an unqualified boon. We might be in danger of knowing too much. "Information can potentially be extremely dangerous," says philosopher Nick Bostrom, director of the Future of Humanity Institute at the University of Oxford. "The effects arising from knowledge can be momentous."
Humans are uniquely at risk because we have always craved information. Anthropologist Robin Dunbar and his colleagues at the University of Oxford suggest that this trait has almost certainly been bred into us during our evolutionary history. Evidence for this idea comes from the observation that in birds and primates, brain size is correlated with the ability to reason, to develop new feeding strategies and to survive extinction. "Clearly, the capacity to discover novel facts about the environment has a very ancient basis," Dunbar says.
For humans, new information has in the past brought a clear evolutionary advantage. The invention of spiked clubs, triremes, longbows, gunpowder and all the other military technologies can be traced to the discovery of new information. Each one enabled its inventors to steal a march on the competition. The information embodied in the laws of thermodynamics led to the development of efficient steam engines and, in short order, all the prosperity and exploitation of the British empire.
The question now rearing its head is whether we know too much. Does the recent explosion in available information, primarily thanks to the internet, mean the risks of all that knowledge could outweigh its benefits?
Statements like this come as a bit of a shock. After all, most of us take it as a given that the more we know - the more information we have at our disposal about the world around us - the better off we will be. As the philosopher Francis Bacon pointed out back in the 17th century, knowledge is power. The thing is, power can be put to bad uses as well as good. "Right now, for example, we're thinking about how to prevent the growing knowledge and power arising from biotechnology from being put to evil ends," Bostrom says.
Bostrom has coined a term for the danger that arises from knowledge: he calls it "information hazard". A case in point relates to the influenza virus that spread around the world in 1918, killing more than 50 million people. Now its genome has been made publicly available in the online GenBank database, and anyone with the right tools and skills can reconstruct it.
In an article in The New York Times in 2005, futurologist Ray Kurzweil and computer scientist Bill Joy, the former chief technology officer of Sun Microsystems, condemned the publication of the genome as "extremely foolish". Recreating the virus from this information would be easier than building an atomic bomb, they claimed. And once that was done, releasing it could cause far greater devastation.
Not everyone takes this pessimistic view. Geoffrey Smith, a virologist at Imperial College London, opposes censorship of the kind Kurzweil and Joy appear to be advocating. Risks from biotechnology have been exaggerated, he says, pointing out that the security threat posed by biotechnology research is reviewed at numerous stages, from funding onwards. Data should always be published, Smith reckons. "That removes the feeling that there's something secret going on."
The human craving for information makes censorship a particularly problematic response to any perceived information hazard, and openness is often the preferred option. As swine flu started to spread last year, for example, governments and bodies such as the World Health Organization were quick to make the public aware of the risks. Bitter experience has taught us the dangers of allowing the suspicion to take hold that the authorities are withholding information. People's appetite for facts goes into overdrive and it gets easier for false notions to gain credence. "This happened in the UK with the MMR vaccine," says Ian Pearson, a futurologist at the Futurizon consultancy in Switzerland. "The government created a situation where one lone scientist was able to cause mass panic, which has resulted in many kids catching measles - and, of course, a few have died."
Out of control
The fear that information is being kept secret causes havoc in other areas too. A run on a bank can be caused if people feel they can no longer trust those who control the information about the bank's ability to meet its debts, and here, too, gauging the appropriate response is tricky. If governments guarantee deposits - something that the UK government did in 2008 after a rush of savers withdrawing their money threatened to bring down the Northern Rock bank - that can create a further information hazard. "Sometimes banks refuse government assistance as it could be interpreted as a sign of weakness, leading to a further loss of public confidence," Bostrom says.
An information hazard is also confronting the health insurance industry. The advent of companies that offer genome scans has allowed individuals to assess their likelihood of succumbing to various ailments over the course of their lifetime. This threatens to upset the risk-sharing that is the cornerstone of insurance. "The insurance market only functions where neither the individuals nor the company can tell for certain who will actually need the insurance," Bostrom says.
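Bostrom's point about insurance can be sketched with a toy calculation (all numbers here are hypothetical, chosen only to illustrate the mechanism): when no one knows their individual risk, everyone pays the pool's average; once scans sort people into low- and high-risk groups, the low-risk drop out and the shared pool unravels.

```python
# Toy adverse-selection sketch with made-up numbers: why insurance
# depends on neither side knowing for certain who will claim.

def fair_premium(pool):
    """Premium covering the pool's average expected claim."""
    return sum(risk * claim for risk, claim in pool) / len(pool)

claim = 100_000  # payout if the illness occurs (hypothetical)

# Before genome scans: ten people share an unknown, 10% average risk.
blind_pool = [(0.10, claim)] * 10
p_blind = fair_premium(blind_pool)        # 10,000 each

# After scans: five learn they are low risk (2%), five high risk (18%).
# The average premium is unchanged...
informed = [(0.02, claim)] * 5 + [(0.18, claim)] * 5
p_informed = fair_premium(informed)       # still 10,000

# ...but a low-risk person now faces a 10,000 premium against an
# expected loss of only 2,000, so they opt out. Only high-risk
# customers remain, and the premium must rise to cover them.
survivors = [(0.18, claim)] * 5
p_survivors = fair_premium(survivors)     # 18,000

print(p_blind, p_informed, p_survivors)   # 10000.0 10000.0 18000.0
```

The numbers are arbitrary, but the pattern is general: any information that separates the pool into identifiable risk classes erodes the uncertainty that makes risk-sharing viable.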
Pearson sees problems like this as unavoidable. Secrecy, censorship and curtailing of scientific research are dangerous options, fuelling distrust of the censors, and depriving society of potentially beneficial discoveries: such "cures" are likely to turn out to be worse than the disease, he reckons. And if external censorship is bad, expecting self-censorship from people who are naturally inclined to satisfy their every craving is just unrealistic. "I don't think there is much scope for people self-regulating their information consumption effectively," Pearson says. "There is no evidence that they can limit their consumption in other areas."
That leaves us with a problem - and the search for a solution is under way. "Information hazards will become an increasingly critical area of inquiry," says George Dvorsky, a director of the US-based Institute for Ethics and Emerging Technologies.
A little knowledge is said to be a dangerous thing, but a little knowledge about the power and importance of knowledge itself might be more dangerous still. "This is an area we neglect at our peril," Pearson says.
- "Information Hazards: A typology of potential harms from knowledge", by Nick Bostrom (bit.ly/hHfhs); for a YouTube video of astronomer Martin Rees speaking at the 2008 TED conference in Monterey, California, see bit.ly/I7peH
Paul Parsons is a writer based in the UK, and the author of The Science of Doctor Who (Icon Books)