On the Cybersecurity Act of 2009

Making predictions is hard, especially about the future. In descending order, when predictions come true, it is likely because of: 1) some undisclosed foreknowledge of the event; 2) discernible writings on walls, patterns, trajectories, trends, or nigh inevitabilities; 3) pure random luck; 4) voices, visions, and other sorts of esoteric transmissions. The recent introduction of S.773, better known as the Cybersecurity Act of 2009, would suggest that last December’s article Quackery was the result of one or more of the above causes. I won’t say which, but I will admit that my neighbors have a black Labrador retriever.

The body of the Cybersecurity Act opens with 14 findings about how important and vulnerable our government and critical infrastructure (i.e. SCADA) information systems are, and how we lack “a coherent national strategy” for dealing with threats and incidents. That this is largely the same material we’ve been hearing and saying for as long as we’ve been in infosec should not dilute the message. What follows is not entirely the same toothless posturing that we’ve seen in the past… much of what is proposed is more than simply a new cause at which civil-liberties advocates (largely idle since the end of the Bush administration) will be able to disgorge their vitriolic righteous indignation, and will likely cause concern among productive people, as well.

Section 3 begins the proposals, starting with the President appointing a Cybersecurity Advisory Panel comprising members from industry, academia, government, and interest groups, whose overarching duty will be to advise the President on “matters relating to the national cybersecurity program and strategy” and to write reports at least every two years. It also offers up taxpayer dollars to cover non-Federal members’ travel expenses, ensuring that participants will always get to fly first-class.

Section 4 has the Secretary of Commerce working with the Office of Management and Budget to create a security dashboard (something like a cross between this and one of these or this) for all Federal Government and Department of Commerce information systems. As long as it’s not done by the same visualization virtuosos who brought us this, mandating the mythical single pane of glass will likely provide some benefit, so long as those who gaze upon it don’t tragically believe it to have the power to confer omniscience.

Section 5 proposes the Secretary of Commerce create Regional Cybersecurity Centers to “enhance the cybersecurity of small and medium sized businesses in United States” by disseminating “cybersecurity strategies, best practices, standards, and technologies” developed by the National Institute of Standards and Technology (NIST). Great idea, but as written, this section is trouble. First, it uses the term “best practices” which is immediately at least partially invalidating because “best practices,” in practice, are usually little more than tokenistic fantasies of the ill-informed or lazy. We should be encouraging understanding and critical thinking, not oblivious rote mimicry or distorted reinterpretations.

Next, since it doesn’t indicate that the training would be mandatory, it must be optional, and with “firefighting” being the normative mode of operation for most infosec people, it is likely that attendance will be low for non-mandatory training. Further, since there is no mention of a measurement of the effectiveness of the training (i.e. testing), it would be fair to assume that many of the people who do attend will merely be doing so either because their boss made them, or because they prefer a day in a classroom (or a vendor seminar, or a trade show, etc.) to a day at the office; not ideal conditions for learning.

In addition to training companies and enterprises, another of the activities of these funded, non-profit Centers is to “make loans, on a selective, short-term basis, of items of advanced cybersecurity countermeasures to small businesses with less than 100 employees.” Huh? Like a public library full of firewalls instead of books? What objective criteria will they use to select the gear that they will stock? Will they decline to stock the gear of those foolishly paranoid vendors who fecklessly try to avoid selling product to their competition? Will they offer both hardware and software? Will they offer technical support, or will the repeated burden (but only the one-time revenue) fall to the vendor? Will they charge late fees?

While not entirely analogous, this section does bring to mind the controversy stirring over the Obama administration’s recent move to reintroduce the “comparative effectiveness” method of evaluating medical treatments (as part of the American Recovery and Reinvestment Act of 2009). Looking at the debate between proponents (who say “such studies are essential to curbing the widespread use of ineffective treatments and to helping control health care costs”) and opponents (who invoke the tritely lame slippery slope warning that the “movement could lead to inadequate treatment for some patients and even the rationing of health care”), it’s reasonable to expect that this section will similarly elicit accusations of “socialist cybersecurity”. Despite the fact that this section forces nothing upon anyone, we should be prepared for some such melodramatic rhetoric.

Section 6 charges NIST with creating a research program to develop metrics and “automated tools” for measuring the economics of cybersecurity, including the measurement of risk and the cost of defense. I imagine the good people at NIST will look at this and say “You want what? Why don’t you just ask us to calculate how much Thursday weighs while you’re at it.” Not to say that measuring risk is not possible (e.g. risk = threat * (vulnerabilities – countermeasures) * impact), but making the transition from the abstract to the concrete (i.e. a representation that people expect… dollars) is painstakingly particular, and nearly impossible to make simultaneously accurate and automated.
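
To put that heuristic in concrete terms, here is a minimal Python sketch of it; the 0-10 scales and the example numbers are invented purely for illustration, and converting the output into the dollars people actually expect is precisely the hard, largely manual part.

    def heuristic_risk(threat, vulnerabilities, countermeasures, impact):
        """Toy version of: risk = threat * (vulnerabilities - countermeasures) * impact.

        All inputs are unitless judgment calls (say, on 0-10 scales); nothing here
        automates the judgment, which is the point.
        """
        exposure = max(vulnerabilities - countermeasures, 0)  # exposure can't go negative
        return threat * exposure * impact

    # Example: a moderately capable threat, more weaknesses than defenses, high impact.
    print(heuristic_risk(threat=6, vulnerabilities=8, countermeasures=5, impact=9))  # 162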

It’s easy to ask questions such as “how many servers do you have?”, “what is the estimated daily value of your Internet connection?”, and “do your workstations run up-to-date anti-virus software?”, and, for many, the answers will provide a better measurement of their assets and risks than they have ever had before. But what about the less-easy, ponderously imponderable considerations like “do you run any software written by a company that had one or more lazy, incompetent, disgruntled, or sleep-deprived-because-they-were-driven-by-their-capitalist-boss-to-meet-a-deadline employees on the development team, and/or that employed inadequate code-review procedures?” or something like “do you employ any servers whose CPUs have undocumented or otherwise unprotected interfaces to microcode or System Management Mode code updates that might be catastrophically re-written by an attacker sending a maliciously crafted packet over the network exploiting the interaction of simultaneous vulnerabilities in your network card driver and your operating system’s System Management Interrupt handler?” Really, can you blame China for developing Kylin or Loongson?

NIST is also asked to “establish standards for continuously measuring the effectiveness of a prioritized set of security controls that are known to block or mitigate known attacks.” It’s laudable that they had the sense to say “known attacks”, and while there is certainly value to preventing known attacks (e.g. even though it’s about 7 months old, given the number of unpatched systems it is still reasonable to block Conficker), it ignores a natural, thoroughly neutering sequence (a toy model of which follows the list):

  1. A vulnerability is discovered, and an attack is created. At this stage, there is no way to ensure detection or defensibility. Encouragingly, even some preventative security vendors get this, and are working to expose the problem.
  2. Once the attack becomes known, the specific attack becomes preventable, and the underlying vulnerability becomes remediable.
  3. Countermeasures will be created. As they are circulated over time, exploitation begins to drop.
  4. When sufficiently ineffective as to no longer provide adequate utility to its employers, the attack will be superseded (by variants and/or entirely new attacks).
  5. Variant species of the attack will be manufactured. Systems on which the underlying vulnerability has been remedied will not be exploitable, but systems merely protected by some form of prevention will likely again become exploitable. These systems will be condemned to a loop between step 2 and step 5 until the vulnerability is remedied, or until the attackers stop creating variants.
  6. The reentrant cycle starts over at step 1.
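
To make that loop concrete, here is a purely illustrative Python model of the lifecycle above; the stage names and the single vulnerability_remedied flag are my own simplifications of the sequence, not anything drawn from the bill.

    from enum import Enum, auto

    class Stage(Enum):
        UNKNOWN_ATTACK = auto()    # step 1: vulnerability found, attack created, not yet detectable
        KNOWN_ATTACK = auto()      # step 2: attack known; preventable and remediable
        COUNTERMEASURED = auto()   # steps 3-4: countermeasures circulate; utility drops
        VARIANT_RELEASED = auto()  # step 5: variants evade prevention-only defenses
        RETIRED = auto()           # the attack no longer works anywhere it matters

    def next_stage(stage, vulnerability_remedied):
        """Advance one step; remediated systems break the cycle, prevention-only systems loop."""
        if stage is Stage.UNKNOWN_ATTACK:
            return Stage.KNOWN_ATTACK
        if stage is Stage.KNOWN_ATTACK:
            return Stage.COUNTERMEASURED
        if stage is Stage.COUNTERMEASURED:
            return Stage.VARIANT_RELEASED
        if stage is Stage.VARIANT_RELEASED:
            # Step 5 back to step 2, unless the underlying vulnerability was actually fixed.
            return Stage.RETIRED if vulnerability_remedied else Stage.KNOWN_ATTACK
        return stage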

Further, it asks that the Institute establish standards for “measuring the software security using a prioritized list of software weaknesses known to lead to exploited and exploitable vulnerabilities” (such as CWE (Common Weakness Enumeration) and maybe CVE (Common Vulnerabilities and Exposures)), for a “…computer-readable language for completely specifying the configuration of software…” and “…security settings for operating system software and software utilities…” (like NIST’s FDCC (Federal Desktop Core Configuration), SCAP (Security Content Automation Protocol), or MITRE’s CCE (Common Configuration Enumeration), which attempts to map overlapping guidelines from NIST, NSA, and DISA), and for a “…computer-readable language for specifying vulnerabilities in software…” (OVAL (Open Vulnerability Assessment Language), or something akin to CVSS (Common Vulnerability Scoring System), CWSS (Common Weakness Scoring System), or Microsoft’s Exploitability Index).
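
For a sense of what such “computer-readable” scoring looks like in practice, here is a sketch of the CVSS v2 base-score arithmetic in Python. The metric weights come from the public CVSS v2 specification as I understand it; treat this as an illustration rather than a reference implementation.

    # CVSS v2 base-score sketch; metric weights per the public CVSS v2 specification.
    ACCESS_VECTOR = {"local": 0.395, "adjacent": 0.646, "network": 1.0}
    ACCESS_COMPLEXITY = {"high": 0.35, "medium": 0.61, "low": 0.71}
    AUTHENTICATION = {"multiple": 0.45, "single": 0.56, "none": 0.704}
    IMPACT = {"none": 0.0, "partial": 0.275, "complete": 0.660}

    def cvss2_base_score(av, ac, au, conf, integ, avail):
        impact = 10.41 * (1 - (1 - IMPACT[conf]) * (1 - IMPACT[integ]) * (1 - IMPACT[avail]))
        exploitability = 20 * ACCESS_VECTOR[av] * ACCESS_COMPLEXITY[ac] * AUTHENTICATION[au]
        f_impact = 0 if impact == 0 else 1.176
        return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

    # A remotely exploitable, unauthenticated, low-complexity, total-compromise flaw:
    print(cvss2_base_score("network", "low", "none", "complete", "complete", "complete"))  # 10.0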

Surprisingly absent from section 6a is an area that is at least as practically essential as the rest. Allow me to correctively propose 6a (8):

“INCIDENT RESPONSE METHODS AND PROCEDURES – The Institute shall establish standards for technological and procedural preparedness in response to the inevitable security events that will occur even on the best defended networks, ensuring the ability to effectively determine the scope and detail of the breach.”

Cynics might say this seems a bit self-serving, a forensics company suggesting that forensics provisions be incorporated into law. Some might even invoke the poetically censorious words of U.S. Supreme Court Justice Oliver Wendell Holmes (from Abrams v. United States):

“If you have no doubt of your premises or your power and want a certain result with all your heart you naturally express your wishes in law and sweep away all opposition.”

Holmes then exposes the folly of mandating ideas into law by explaining that:

“…the ultimate good desired is better reached by free trade in ideas — that the best test of truth is the power of the thought to get itself accepted in the competition of the market…”

I do not agree that the free market for ideas is always the most effective or beneficial; for proof, simply ask the typical 5-year-old if he’d rather have cotton candy or broccoli for dinner, or even the typical 35-year-old if he’d rather have potato chips or broccoli as a snack. Left to our own devices, we don’t always make the best decisions. Sometimes we need guidance, and there is no shame, freedom-robbing conspiracy, or overtly oppressive statism in such an admission. Yes, the suggestion might boost the sales of forensic technology vendors, but it (along with my recommendation to choose the broccoli) is entirely altruistic.

Section 6 next offers a prescription to achieve “representation in all international standards development related to cybersecurity“ and compliance with “standards based on risk profiles.” These read as conspicuous endorsements for a broader adoption of Common Criteria, while the focus on risk profiles seems a foreshadowing of the imminent transition from Evaluation Assurance Levels (EAL ratings) to Robustness assessments. This should, at the very least, be encouraging to the folks at Corsec and Infogard.

The final item in section 6 refers to section 6001(k) of the American Recovery and Reinvestment Act, which calls for a national broadband plan. It asks that the FCC “report on the most effective and efficient means to ensure the cybersecurity of commercial broadband networks, including consideration of consumer education and outreach programs.” Of course, the immediate concern here will be that this stretches the scope of the act from Federal and critical infrastructure into the private sector, but before anyone starts yelling about “nationalization” or “privacy invasion” or “economic or innovative suffocation”, consider that this is simply calling for a report and recommendations, not regulations. There is nothing wrong with the government helping to make private sector information systems more secure, so long as it doesn’t mandate security measures. Steering is good, rowing is bad, and this seems like some much-needed steering.

Section 7 is one of the more controversial bits. It asks that the Secretary of Commerce institute a “national licensing, certification, and periodic recertification program for cybersecurity professionals”. It goes on to mandate that within three years “it shall be unlawful for any individual to engage in business in the United States, or to be employed in the United States, as a provider of cybersecurity services to any Federal agency or an information system or network designated by the President, or the President’s designee, as a critical infrastructure information system or network, who is not licensed and certified under the program.”

This is not as oppressive or Machiavellian as some might make it out to be. Suffice it to say, we expect licensure in most professions that require any skill, or wherein malicious or supremely incompetent practitioners have the ability to kill their patronage.

Section 8 calls for a review of the NTIA (National Telecommunications and Information Administration) IANA (Internet Assigned Numbers Authority) contracts. Not too surprising, considering recent issues with ICANN and unrestricted generic top-level domains, as well as the concerns of Senators Snowe (R-Me) and Nelson (D-Fla) that “much of the progress ICANN has made could be jeopardized if its historic link to the United States is diminished” (yes, the same Snowe and Nelson who co-authored S.773).

Section 9 charges the Assistant Secretary of Commerce for Communications and Information to secure the foundationally critical DNS infrastructure against attacks, clearly a reference to DNSSEC. This is a much-needed move, and at first glance the 3-year timeline might seem a little lax (especially considering that Verisign and ccTLDs such as Puerto Rico, Mexico, and the Czech Republic are already running pilots); but considering that DNSSEC is at the intersection of PKI, crypto, national interests, and commercial interests, all at a global level, 3 years might not be enough time for resolution.
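
For the curious, here is roughly what checking a DNSSEC-signed zone looks like from the client side, sketched with the third-party dnspython library; the choice of the .cz zone (one of the early adopters mentioned above) and the resolver address are illustrative assumptions, not recommendations.

    import dns.dnssec
    import dns.message
    import dns.name
    import dns.query
    import dns.rdatatype

    zone = dns.name.from_text("cz.")  # .cz is one of the early DNSSEC adopters noted above

    # Ask a DNSSEC-aware resolver for the zone's DNSKEY records along with their signatures.
    request = dns.message.make_query(zone, dns.rdatatype.DNSKEY, want_dnssec=True)
    response = dns.query.udp(request, "8.8.8.8", timeout=5)  # resolver address is illustrative

    # The answer section should contain the DNSKEY rrset and the RRSIG covering it.
    dnskey = next(r for r in response.answer if r.rdtype == dns.rdatatype.DNSKEY)
    rrsig = next(r for r in response.answer if r.rdtype == dns.rdatatype.RRSIG)

    # Verify that the DNSKEY rrset is signed by one of its own keys (the zone apex);
    # dns.dnssec.validate raises ValidationFailure if the signature does not check out.
    dns.dnssec.validate(dnskey, rrsig, {zone: dnskey})
    print("DNSKEY rrset for", zone, "validates")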

Section 10 calls for the Secretary of Commerce to develop cybersecurity public awareness campaigns. No firm direction or dates. I’m imagining 1970’s-style public service announcements. Maybe they can get Bob Dorough to do the music.

Section 11 (particularly 11a) attempts to boil the ocean. Followed by freezing it into ice cubes, sublimating them, condensing the vapor, electrolyzing it with palladium, and then powering Navy vessels with the output. In other words, this one is biting off a bit much. In essence: Section 11a proposes the NSF (National Science Foundation) research ways to build near-perfect software and protocols, guarantee privacy of data-at-rest and data-in-motion, provide attribution for internet communications, and thwart insider threats. Wow.

Section 11b and 11c more realistically call for secure coding research and education. 11d asks that grants be awarded for academic innovations in the area of modeling cyber attacks and defenses, and 11e-11l make some modifications to the CyberSecurity Research and Development Act.

Section 12 allocates some tens of millions of dollars to scholarship programs “to recruit and train the next generation of Federal information technology workers and security managers” with preferential treatment to those who’ve participated in the challenge described in section 13.

Section 13 asks the Director of NIST to establish cybersecurity competitions to “attract, identify, evaluate, and recruit talented individuals for the Federal information technology workforce” and to stimulate innovation of technologies “that have the potential for application to the Federal information technology activities of the Federal Government.” Although not stated explicitly, one would expect that such competitions would include both offensive and defensive components, with offense  (i.e. “hack this”) being somewhat easier to measure and judge, but with defense being of greater value to the initiative. However, it’s worthwhile to recognize recent reports indicating the emerging value of offensive operations, and to consider the effect such positions might have on the nature of such competitions (and on cybersecurity technologies, in general):

“We are not comfortable discussing the question of offensive cyberoperations, but we consider cyberspace a war-fighting domain,” said Bryan Whitman, a Pentagon spokesman as reported by the New York Times. “We need to be able to operate within that domain just like on any battlefield, which includes protecting our freedom of movement and preserving our capability to perform in that environment.”

Section 14 designates the Department of Commerce to “serve as the clearinghouse of cybersecurity threat and vulnerability information.” Section 14c seems the most functionally interesting piece of 14, stating that “within 90 days after the date of enactment of this Act, the Secretary shall publish in the Federal Register a draft description of rules and procedures on how the Federal Government will share cybersecurity threat and vulnerability information.” Assigning this role to Commerce (rather than NIST or DHS (via NCSD or US-CERT)) seems designed to reinforce the idea that cybersecurity will not come at some economic expense that might threaten our non-negotiable American way of life.

But it is section 14b (1) that is the most concerning component of this section. It states that the Secretary of Commerce “shall have access to all relevant data concerning such networks [Federal Government and private sector owned critical infrastructure information systems and networks] without regard to any provision of law, regulation, rule, or policy restricting such access.” That “without regard” bit might be more than merely irresistibly delicious fodder for conspiracy theorist nutcases; in this case they might have a point. This should probably be toned down.

Section 15 calls for a risk management report, including a feasibility study on “(1) creating a market for cybersecurity risk management, including the creation of a system of civil liability and insurance (including government reinsurance); and (2) requiring cybersecurity to be a factor in all bond ratings.” I’ve talked about the potential role of insurance in infosec before, so it’s good to see (1), but the foreseeable difficulty of assessing and enforcing (2) is likely to limit its adoption and effectiveness.

Section 16 calls for a review and report on “the Federal statutory and legal framework applicable to cyber-related activities in the United States.” In other words, an exhaustive review of any acts or orders directly or indirectly cyber-related. Just one year?

Section 17 asks for a report “on the feasibility of an identity management and authentication program.” Yes, it’s the mark-of-the-beast law… the “with the appropriate civil liberties and privacy protections” verbiage fools no one.

Section 18 is, by far, the most troublesome section of the act. This is the one that has patriots issuing warnings that “Rockefeller is shutting down the Internet”. Section 18 gives the President certain powers and obligations, including that he “(2) may declare a cybersecurity emergency and order the limitation or shutdown of Internet traffic to and from any compromised Federal Government or United States critical infrastructure information system or network” and that he “(6) may order the disconnection of any Federal Government or United States critical infrastructure information systems or networks in the interest of national security.” First, even though it is inclusive of private-sector “critical infrastructure information system[s],” it is clear that this is not “the whole Internet”. Second, and more importantly, since this is not really currently feasible on anything but the smallest of scales, it seems that this is more a provisional tool in the event of a worst-case cyber-scenario than it is a potentially practicable commandeering of the Internet. Section 18 also says some other stuff, but no one notices.

Section 19 calls for a cyber-review every four years, starting in 2013, involving the Advisory Panel designated in section 3. This is not an agricultural report; this is a cybersecurity report… Quite a lot can happen in 4 years.

Section 20 calls for the Director of National Intelligence and the Secretary of Commerce to submit an annual cybersecurity report to Congress. Much better than quadrennial.

Section 21 encourages the President to work with foreign governments to create more cyber-bureaucracy, and to report on the initiatives to Congress.

Section 22 calls for the establishment of a Secure Products and Services Acquisition Board to work in conjunction with NIST and the OMB on devising standards for the “review and approval of high value products and services”. Of importance to software vendors (and static and dynamic code analysis tool vendors) is the piece that says “[the] Board may consider independent secure software validation and verification as [a] key factor for approval [of software].” It further says that “any proposal submitted in response to a request for proposals issued by a Federal agency shall demonstrate compliance” with the published standards.

Section 23 provides a definition of terms, including some disturbingly circular reasoning that basically says “critical infrastructure information systems are whatever the President says critical infrastructure information systems are.”

Will it pass? I won’t make a prediction on that one, but I will advise preparing for it.

Comments

Comment from jcaldwell
Time: 2009-06-30, 00:00

The U.S. government has a history of well-intentioned but poorly executed attempts to regulate information control on the Internet. I will not address the question of whether the Cybersecurity Act is a bill founded on good intentions.

I will remind readers of what the U.S. Congress did in 1998 with the Digital Millennium Copyright Act (DMCA). At that time, despite objections from well-informed academics and citizens regarding the potential and _expected_ ramifications, those concerns went unheeded. The DMCA was offered to prohibit the circumvention of technological protection measures used by copyright owners to control access to their works. It also banned devices whose primary purpose was to enable circumvention of technical protection systems.

The unintended consequences have proven to be material and bone-chilling. The IEEE, which publishes 30% of all computer science journals worldwide, required all contributing authors to indemnify the IEEE against any liabilities. This alienated large segments of the research community (IEEE, 2002) and limited published contributions. Foreign scientists have expressed concerns about travelling to conferences in the US following the arrest of Russian programmer Dmitry Sklyarov on DMCA violations (EFF, 2008). USENIX is now being urged to move its annual conference offshore (EFF, 2008), and other important conferences are no longer held in the US.

Is the Cybersecurity Act of 2009 a body of legislation that will make the DMCA of 1998 pale in comparison? This is an exercise left to the reader.

References:
Samuelson, P. (2003). Mapping the Digital Public Domain: Threats and Opportunities. Law & Contemporary Problems, 66(1), pp. 147-173.
Electronic Frontier Foundation. (2008). Unintended Consequences: Ten Years Under the DMCA. Retrieved June 29, 2009, from http://www.eff.org/wp/unintended-consequences-seven-years-under-dmca
IEEE. (2002). IEEE to Revise New Copyright Form to Address Author Concerns. Retrieved June 29, 2009, from http://www.ieee.org/portal/pages/newsinfo/dmca.html

