The 1980s marked the beginning of what many consider to be the Information Age. A quarter of a century into it, my waning hope that science might someday have a fighting chance against superstition is somewhat renewed by the fact that President-elect Obama is planning to appoint the first Chief Technology Officer for the United States. Beyond the immediate heartening we should feel from this gesture, I am further encouraged by a certain historical “coming of age” analogue: the regulation of the practice of medicine in the Industrial Age. Although past performance is no indication of future returns, history tends to repeat itself, mostly because human nature doesn’t change, at least not all that quickly.

In particular, I am reminded of the history of “nostrum remedium” (Latin for “our remedy”), better known as “patent medicines,” which were prominent from the 17th to 19th centuries. Patent medicines were the pile of liniments, tonics, tinctures, and salves that collectively came to be characterized as “snake oil”. More broadly, patent medicines fell into the general realm of “quackery” (a pejorative label for unscientific claims to certain knowledge, skills, capabilities, or attributes), short for “quacksalver”, one who “quacks”, or boasts, about his salves.

The long and infamous history of quackery is morbidly fascinating. Perusing the chronicles of some of the more egregious, shameful, and grievous offenses, a common response is to wonder how people could be so gullible as to fall for such chicanery. But consider the combined effect of such elements as:

  • The unavailability or expense of honest medical practitioners, practices, and goods relative to the availability and affordability of quackery (which doesn’t require such expenses as years of schooling, research and development, or testing)
  • The desperation of those suffering from some disorder to identify a treatment or remedy, particularly when conventional and scientific methods continue to produce ostensibly trivial, profit-obsessed commercial advances, but fail to yet provide cures for real maladies
  • The tendency for people to try to find shortcuts – it is much easier to take a pill or even undergo a surgical procedure than it is to exercise a degree of self-discipline. This is intensified by phenomena such as mass cultural “syndromization” (which dissolves that vestigial psychological nuisance, personal accountability) and insurance co-pays for semi-cosmetic procedures and fad drugs (which have the economic effect of making vital medical practice, treatment, and insurance even more expensive)
  • The extent, irrespective of intent, to which pseudoscience goes to simulate adherence to the scientific method (i.e. observable, testable, measurable, repeatable, modifiable, verifiable)
  • The “exceptionality of correctness” delusion – Typified by the culturally rampant concept of “I’m on a diet,” which is essentially as silly as saying “my network is on security”. You either have a nutritionally well-balanced diet or you don’t. Your information systems are either securely designed or they’re not. You can’t bolt on some piece of technology in pursuit of legitimate security any more than you can engage in some symbolic temporary deviation from an unhealthful diet in pursuit of fitness. Fitness is an ongoing process, a lifestyle. Security is also an ongoing process. But, alas, these systems are complex
  • The fact that complex systems are, well… complex – You couldn’t describe this to a goldfish, but you could tweet “Bacon good, bread bad” with 119 characters to spare.

So what could protect a quarry that well-nigh demands to be preyed upon against unscrupulous predators endowed with unlimited supplies of elixirs and avarice? Only the bane of every system of supply and demand: government regulation. The first signs of efforts to regulate the quack industry appeared in the early 19th century with the formation of the U.S. Pharmacopeia in 1820, followed by the Drug Import Act of 1848 to stop the flow of adulterated medicines coming in from Europe. But it wasn’t until Abraham Lincoln established the U.S. Department of Agriculture that there was a foundation for real improvement.

Irresistible Digression
Those who are “so scientifically illiterate” as to be inclined to indulge the supernatural might see auspiciously presaging similarities between President-elect Obama and Abraham Lincoln. But even though (much like divination) “Lincoln is a Rorschach test… Everybody finds themselves in Lincoln… Everybody finds what they want to find in Lincoln,” it’s still worth noting that just as Obama is planning to appoint our Nation’s first CTO, Lincoln in 1862 appointed the first national chemist to what became the Bureau of Chemistry, the precursor to the FDA. Coincidence? Hardly. Using some Tannaitic-period numerology, a simple gematria calculator, and far more time than I should have wasted, it’s simple to prove that this is no mere coincidence:

  • בראק הוסינ אובמה – Transliteration of “Barack Hussein Obama”. Gematria value 488
  • אברהמ לנכולנ – Transliteration of “Abraham Lincoln”. Gematria value 434
  • טכנולוגיה – The modern Hebrew word for “technology”. Gematria value 139
  • כימיה – The modern Hebrew word for “chemistry”. Gematria value 85
  • 488 – 434 = 54
  • 139 – 85 = 54
  • חמאה – The biblical Hebrew word for “butter”. Gematria value 54
  • Proof!
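For the skeptics, the “simple gematria calculator” is easy enough to sketch in a few lines of Python, using standard (mispar hechrachi) letter values and the transliterations exactly as spelled above:

```python
# Standard (mispar hechrachi) gematria values for the 22 Hebrew letters.
VALUES = dict(zip("אבגדהוזחטיכלמנסעפצקרשת",
                  [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50,
                   60, 70, 80, 90, 100, 200, 300, 400]))
FINALS = dict(zip("ךםןףץ", "כמנפצ"))  # final letter forms count as their base forms

def gematria(text: str) -> int:
    """Sum the letter values of the Hebrew letters in `text`, ignoring spaces."""
    return sum(VALUES[FINALS.get(ch, ch)] for ch in text if FINALS.get(ch, ch) in VALUES)

assert gematria("בראק הוסינ אובמה") == 488  # Barack Hussein Obama
assert gematria("אברהמ לנכולנ") == 434      # Abraham Lincoln
assert gematria("טכנולוגיה") == 139         # technology
assert gematria("כימיה") == 85              # chemistry
assert 488 - 434 == 139 - 85 == gematria("חמאה") == 54  # butter. Proof!
```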

Okay, so I took some liberties with the calculations… Partly because “chemistry” and “technology” weren’t big topics in the bible, although they were probably there in the way that microbes were. And if that seemed a strange illustrative detour, try this for perspective.

I’m from the government and I’m here to help
Anyhow, with Lincoln having set the ball in motion, a sequence of other milestones followed, frequently in response to what tends to be our greatest incitement to legislation: some sort of scare, outrage, or tragedy. For example:

  • In response to contaminated vaccines that caused the deaths of 22 children, the Biologics Control Act was passed in 1902, eventually giving rise to the Center for Biologics Evaluation and Research (CBER), which oversees biological, as contrasted with chemical, drugs.
  • In 1910, “Dr. Johnson’s Mild Combination Treatment for Cancer” made false curative claims, and even shamelessly attacked the effectiveness of legitimate treatments (a common ploy of pseudoscience). When brought to court, the “treatment” was found to not be in violation of the Pure Food Act since it was not misbranded. In response to this loophole, Congress in 1912 enacted the Sherley Amendment to the Pure Food and Drug Act making a drug illegal “…if its package or label shall bear or contain any statement, design, or device regarding the curative or therapeutic effect of such article or any of the ingredients or substances contained therein, which is false and fraudulent.”
  • Following the deaths of more than 100 patients caused by a treatment for infection distributed in a solvent that turned out to be toxic, the Federal Food, Drug, and Cosmetic Act was passed by Congress in 1938, giving authority to the Food and Drug Administration (FDA). It required that companies perform safety testing on their proposed drugs and submit the data to the FDA for review and approval before the drug could be brought to market. It also served as the foundation upon which a significant number of additional protective amendments stand.
  • In response to the 1950s thalidomide tragedy that caused more than 10,000 birth defects worldwide, Congress passed the Kefauver-Harris Amendment in 1962, requiring drug manufacturers to prove the effectiveness of their products before marketing them.

Conspicuously absent from the partial chronicling above is an event that deserves special attention: 1906’s Pure Food Act, which mandated that all food and drugs clearly and accurately list their contents. The Pure Food Act was the culmination of years of work by legislative, journalistic, and medical professionals who crusaded to expose the fraud and danger of patent medicines. Of particular interest was a scathing piece of muckraking journalism by Samuel Hopkins Adams titled “The Great American Fraud”, which exposed hundreds of dangerous patent medicines, products, and their hucksters, documented the deaths of hundreds of their victims, and revealed that the products contained mostly valueless inert ingredients, alcohol, and various toxic and addictive compounds. This multipart series from 1905 opened with the line “Gullible America will spend this year some seventy-five millions of dollars in the purchase of patent medicines.” (For reference, by today’s standards, $75 million would just barely pay for 90 minutes of interest on our national debt, but according to this Consumer Price Index calculator, $75 million in the year 1905 has the same “purchase power” as $1.8 billion in the year 2007. Still, this number is a fraction of analysts’ estimates of worldwide network security spending.)
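The purchase-power conversion above is nothing more than a ratio of consumer price indices. A minimal sketch, using approximate annual-average CPI-U figures (the 1905 value is a retrojected estimate, since the official BLS series only begins in 1913):

```python
def adjust_for_inflation(amount: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical dollar amount by the ratio of consumer price indices."""
    return amount * (cpi_now / cpi_then)

# Approximate annual-average CPI-U values (1982-84 = 100); the 1905 figure is a
# retrojected estimate predating the official BLS series.
CPI_1905 = 8.5
CPI_2007 = 207.3

adams_estimate = 75_000_000  # "some seventy-five millions of dollars", 1905
print(round(adjust_for_inflation(adams_estimate, CPI_1905, CPI_2007) / 1e9, 1))  # ~1.8 (billion)
```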

Leaving no stone within the ecosystem unturned, “The Great American Fraud” also described the “selectivity” of advertising, and the corrupt nature of the relationship between advertisers and publications:

“We see recorded only the favorable results:  the unfavorable lie silent.  How could it be otherwise when the only avenues of publicity are controlled by the heavy advertisers?  So while many of the printed testimonials are genuine enough, they represent not the average evidence, but the most glowing opinions which the nostrum vender can obtain, and generally they are the expression of a low order of intelligence.”

“If there is no limit to the gullibility of the public on the one hand, there is apparently none to the cupidity of the newspapers on the other… Pin a newspaper owner down to the issue of fraud in the matter, and he will take refuge in the plea that his advertisers and not himself are responsible for what appears in the advertising columns. Caveat emptor is the implied superscription above this department. The more shame to those publications which prostitute their news and editorial departments to their greed.”

Suffice it to say, the practical implications of such ethical considerations are not only timeless, but are even more relevant in today’s overabundance of and overdependence on the media for edification.

Regulate? Why?
While most legitimate practitioners and scientists in the medical industry presumably appreciate that regulation serves to separate the qualified from the unfit, sparing them the need to compete directly against the (often more attractive, and still, unfortunately, partly on the loose) riff-raff, the grass remains greener on the other side for some. In response to the general “regulation is bad” argument, I offer that this is not about regulation for the sake of bigger government; it’s about injecting some much-needed science into an increasingly critical system. There are instances where bad things happen because of calculated villainy (Enron), and our knee-jerk response is to suffocatingly over-regulate. And there are unintended tragedies, such as the 1937 Elixir Sulfanilamide incident – a mass diethylene glycol poisoning of a kind still troublesome today, and the catalyst for 1938’s Federal Food, Drug, and Cosmetic Act – which make regulation seem indispensable. The difference? Regulation designed to protect against premeditated bad guys will fail to thwart the devious and lawless, affecting only the good and law-abiding, whereas regulation designed to protect against accidental harm caused by ignorance, incompetence, negligence, or superstition can prevent misfortune.

Some might be inclined to say that it’s not fair to compare medicine to information security – one is a matter of life and death while the other is simply a matter of bits and bytes. But as we move further into the information age, and become more, and more, and more, and more dependent on our information systems, we’ve “optimized” ourselves into a position where our military, our public transportation, our communication systems, our hospitals, our power plants, and our emergency services are all susceptible to attacks against our overstretched, outstripped information systems; information systems that are inherently crippled by outdated protocols and a capitalist-driven mandate for backward compatibility; designed at a time when systems weren’t critical and everyone was friendly; held hostage by rapacious commercial interests who chant “openness”, “transparent connectivity”, and “ease-of-use” so as not to clog the pipes through which the money flows; and abetted by mountebanks and supernaturalists with a disdain for any scientific motions toward security. And unlike conventional warfare, these attacks can be launched remotely, anonymously, and with zero expense incurred by the attacker – in other words, by an enemy that is both invisible and inexhaustible. When faced with actual threats from a foe of mythical potency, how do we respond? By employing whatever parlor tricks and panaceas will create the illusion of security, just as long as they don’t hamper profitability. Regrettably, given current economic conditions and outlooks, we can probably expect the effect to worsen. <ahem>

Some say that regulation is not well-defined, thorough, or effective enough, and that too much continues to fall through the cracks, so that it’s not only an expense and hindrance, but inadequate, to boot. However, considering this a condemnation of regulation would be akin to asserting that “there are still crimes being committed on the streets, so since the police can’t stop them all, we’d better get rid of the police.” On the contrary, this is a call for better standards, and as history shows, ongoing relevance requires adaptability and evolution. But it does raise the important question: “how much testing is enough?” There are still plenty of instances of drugs being approved and later recalled because of insufficient testing. Why? Simply because not all conditions can be known in advance, and testing cannot be perfectly exhaustive or it will never be completed, meaning the product will never be brought to market, thus denying potential beneficial treatment to those who need it. Like the FDA, Quality Assurance (QA) departments in every information technology developer deal daily with this conundrum. Further complicating the task are such paradoxes as “customers demand more features, which increases complexity, which increases test cases, test time, and the overall potential for product failure” and “increasing economic pressures demand that we bring products to market sooner and more cost-effectively than the competition, which tempts cutting development corners, QA resources, and test time.” For the software development lifecycle itself, there are an overwhelming number of standards and frameworks available, and for the finished products there are better-recognized industry certifications like FIPS and Common Criteria to help ensure cryptographic integrity, to protect against attacks targeting development environments and supply chains, and to weed out fraudulent vendor claims. But the effectiveness of these methodologies and certifications is crippled by the fact that they are not universally understood or applied.

For the QA testing process itself, there is no answer to the question “how much testing is enough?” because there will always be unknown unknowns. Even if we defined some set of sane, minimally acceptable QA practices (e.g. peer code reviews, static code analysis, validation/sanitization testing, input-output comparison testing, stress/load testing, mutation testing), how could we ensure that vendors adhere to them unless regulated? Sure, economics suggests that those vendors who went to the expense of voluntarily producing and distributing such reports would earn a competitive advantage through the enhancement to consumer confidence that the practice would offer, but this naively presupposes that consumers know and care about such disclosures. Not to mention that as a strictly voluntary, unregulated practice there would be no assurance of the legitimacy of the claims. Imagine the value of Common Criteria evaluation assurances if self-certification were permitted rather than evaluation through a licensed testing lab.
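To make one of those minimally acceptable practices concrete, here is a sketch of validation/sanitization testing in Python. The `sanitize_username` function is hypothetical, invented purely for illustration; the point is the shape of the test suite, which must prove that hostile input is rejected rather than merely hope so:

```python
def sanitize_username(raw: str) -> str:
    """Hypothetical input validator: trim whitespace, accept only alphanumerics."""
    cleaned = raw.strip()
    if not cleaned or not cleaned.isalnum():
        raise ValueError(f"invalid username: {raw!r}")
    return cleaned

# A few of the cases a minimal validation/sanitization suite should cover:
assert sanitize_username("  alice42  ") == "alice42"          # benign input survives
for hostile in ["", "   ", "alice; DROP TABLE users", "<script>x</script>"]:
    try:
        sanitize_username(hostile)
    except ValueError:
        pass                                                  # hostile input rejected
    else:
        raise AssertionError(f"hostile input accepted: {hostile!r}")
```

Trivial as it looks, this is exactly the class of check whose presence or absence no consumer can observe from outside the vendor, which is the disclosure problem described above.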

Economics of regulation
Of course, there will be costs to properly fund such a regulatory effort, both direct and indirect. There is also the Economics 101 “law of unintended consequences” argument against any form of regulation, which basically states: more regulation = greater development/operating costs = decreased company profits = less incentive to innovate = fewer breakthroughs/advances brought to market = anti-capitalists kill babies. True as it is that “sometimes interference with a system committed to protect a victim only makes that victim weaker” (e.g. the illusion of security), this argument is typically invoked only when convenient to the rhetorician, and only to the extent that it serves his agenda – for example, against affirmative action, rent controls, minimum wage, and equal opportunity employment. Certain economists have even gone so far as to argue against child-labor laws, asking, “If child labor were legalized tomorrow, would you send your eight-year-old to the factories to bring home an extra $200 or so a month?” Absurdly, this asks the question only of an audience who, predictably, does not need the extra $200 a month – what about people for whom that $200 would mean the difference between paying the rent and eviction, feeding the family or going hungry? This sort of demagogic, one-dimensional logic would also suggest that “practicing moderate caloric intake could result in hunger, which could lead to binge eating and weight gain, therefore no one should eat in moderation lest they risk obesity.” And if that’s not ridiculously myopic enough, why not invite this law’s pretend disciples to apply it to the administration of antibiotics when they or their loved ones contract strep or some other life-threatening infection. After all, a well-known unintended consequence of antibiotics is that they create stronger, antibiotic-resistant pathogens. In other words, complex, multivariate systems are not black and white – sometimes applying this principle makes sense, sometimes it doesn’t.

It would be rationally and morally satisfying to see the discontinuation of the willy-nilly application of the “law of unintended consequences” to vital consumables. Consider what the state of medicine would be today without regulation… True, innovative new drugs might make it to market much more quickly and inexpensively, but then the average citizen would have to be a chemist to know which drugs, services, and procedures are safe and effective to use. Anti-regulationists will argue that no rational free-market enterprise would go to the lengths necessary to develop, manufacture, and distribute a product for financial gain only to then lose its market to the ill effects of an inferior product, such as reputational damage, customer migration, or manslaughter. But this line of reasoning, relying on economics in a vacuum, presupposes that entering into the purveyance of said product would be prohibitively expensive. This is why these economists’ examples often use industries or products where there are obvious, practical barriers to entry, such as automobiles or airplanes. What they ignore is that today, but for the aegis of regulation, there are no such barriers to medicine or information technology. Anyone with little more than an exam-cram style education can mix herbs, vitamins, or legal chemicals, can administer electrical stimulation or therapeutic massage (to be fair, cracking your bones does require 4,500+ classroom hours), can configure your mission-critical ESX server or router, and can use a melange of iptables, iproute2, spamassassin, and clamav to create a “UTM appliance.” But are these sufficiently safe and effective?

Buying anything on which critical systems depend (e.g. drugs, medical services, information technology goods and services) is not the same as buying a pair of jeans. While we can certainly expect free-market forces eventually to filter out a particularly poorly made or ugly pair of jeans, the difference is that, at worst, the fashion-illiterate might be ridiculed; they will surely not suffer real harm, such as having their identity stolen, their database breached, or their health or lives compromised or taken as a result of their illiteracy. Anti-regulationists will then further argue that there is plenty of information available today to allow consumers to educate themselves, and that anything short of this sort of freedom is tantamount to a nanny state. Seriously, even with the Internet, do most people have the resources, namely, time and expertise, to diagnose themselves? To decide on the correct treatment? To select the most appropriate and effective network protection? Intentionally or unintentionally, through advertising or testimonials, deception or ignorance, unmoderated forums will invariably at times contain bad, biased, or incomplete information. And for perspective on the cost of the protection offered by regulation, the FDA has an FY09 budget of $2.4 billion, the same figure as the cost of waging one week of the war in Iraq (the benchmark for downplaying the cost of anything we’d rather not pay for… The “B” word is reserved for monetary comparisons of 11 digits or more).

Finally, every process has its byproducts, so we could expect variations of the inevitable lobby backlash, scandals (real and alleged), pursuit by ambulance-chasers, and every other idiosyncratically human form of corruption and parasitism. But we’re used to this, mostly because human nature doesn’t change, at least not all that quickly.

Science cures
“There are none so credulous as sufferers from disease. The need is urgent for legislation which will prevent the raising of false hopes of speedy cures of serious ailments by misstatement of facts as to worthless mixtures on which the sick will rely while their disease progresses unchecked.” (William Taft, 1911, on the need for greater protection against patent medicines)

Is it really a stretch to liken the current state of information technology to a disease sufferer, desperate for a cure? Our quest for remedies and palliatives has spawned a new generation of quacksalvers making exaggerated, jargon-laden claims, and hawking goods and services of questionable worth. They are supported by symbiotic relationships of dubious ethicality with analysts and trade rags, overhyping by the media, and flagrant touting in commissioned reviews that present themselves as objective analyses. In some respects, many information security products are “real” just like snake oil is “real”; there might be some underlying validity at the component or principle level, but only thinly, and with limited practical value. And unlike medicine, information systems seem to derive little benefit from the placebo effect.

There are many legitimate and potentially beneficial instances of information security technologies, just as there are many legitimate and potentially beneficial drugs, but the safety and effectiveness of either of these technologies cannot be measured in isolation because they act on complex, dynamic, multivariate systems. The actual effects of many of the compounds taken as drugs are not known until they are metabolized by the various systems in the human body. Even then it is an iterative process as the “output” from one system (e.g. the liver) is circulated back to all other systems, where the process repeats until metabolism completes. And that does not even account for the subtle bioactive variances from one person to the next. To better understand this, technological advances are being developed to enhance our abilities to realistically model these systems, much the same way that HPC/supercomputing clusters are being employed to create “network situational awareness” models and visualizations in pursuit of network security. To some extent, the same recursive variability that exists in biological systems also exists in information technology systems, so without operative insertion into a live environment, it is no more possible to claim that “this firewall (or NAC, or DLP, etc.) will make me secure” than it is to say that “this drug will make me well”.  At best, we can attempt to reduce the risks of harm or ineffectiveness, first by scientifically proving the remedy to be safe and effective according to some generally agreed upon standard, and second, by ensuring that it is “used only as directed.”

Indeed, information technology is a profit-seeking, commercial enterprise, but as the lifeblood of the information age, isn’t it time we start taking information technology a little more seriously? Scientific rigor and discipline are not the nemeses of the free-market, and while not many relish the costs or ministrations of bureaucracy, the rational and objective monitoring, inspection, and supervision of critical systems can help to ensure the service and safety of those who depend upon them.



Comment from Wrieck
Time: 2009-01-08, 10:15

3900 words…. Hail my bromidic friend!

“We see recorded only the favorable results: the unfavorable lie silent…..” I find it interesting that this quote lies in the body of a work whose intent appears to be exactly what this quote describes.
Come see the amazing Joe the prestidigitator. Watch carefully his right hand while his left conceals shocking realities.
You have set yourself in the role of the very thing you claim to attempt to defeat; purveyor of snake oils.

As your novella seems an attempt to prove the idea that verbosity is an effective means to cow a population I will act on the one counter that brevity will empower.

Although I have many objections, I can script a counter to this in one sentence. What you describe is an attempt to create a system of unlimited growth within a closed system; both impossible and unhealthy. If I am too terse for the consumer of this content…well you are not forgot; in fact you are very much in my heart as I write this.

To quote Chief Dan George in his role as Lone Watie: “you drink it”.

Pingback from Worth A Glance » On the Cybersecurity Act of 2009
Time: 2009-05-31, 00:19

[…] better known as the Cybersecurity Act of 2009 would suggest that last December’s article Quackery was the result of one or more of the above causes. I won’t say which, but I will admit that […]
