2010  

January 2010

Taheri, Amir. The Persian Night. New York: Encounter Books, 2009. ISBN 978-1-59403-240-0.
With Iran continuing its march toward nuclear weapons and long-range missiles unimpeded by an increasingly feckless West, while simultaneously domestic discontent over the tyranny of the mullahs, economic stagnation, and stolen elections is erupting into bloody violence on the streets of major cities, this book provides a timely look at the history, institutions, personalities, and strategy of what the author dubs the “triple oxymoron”: the Islamic Republic of Iran which, he argues, espouses a bizarre flavour of Islam which is not only a heretical anathema to the Sunni majority, but also at variance with the mainstream Shiite beliefs which predominated in Iran prior to Khomeini's takeover; is anything but a republic in any usual sense of the word; and is motivated by a global messianic vision decoupled from the traditional interests of Iran as a nation state.

Khomeini's success in wresting control away from the ailing Shah without a protracted revolutionary struggle was made possible by support from “useful idiots” mostly on the political left, who saw Khomeini's appeal to the rural population as essential to gaining power and planned to shove him aside afterward. Khomeini, however, once in power, proved far more ruthless than his coalition partners, summarily putting to death all who opposed him, including many mullahs who dissented from his eccentric version of Islam.

Iran is often described as a theocracy, but apart from the fact that the all-powerful Supreme Guide is nominally a religious figure, the organisation of the government and distribution of power are very much along the lines of a fascist state. In fact, there is almost a perfect parallel between the institutions of Nazi Germany and those of Iran. In Germany, Hitler created duplicate party and state centres of power throughout the government and economy and arranged them in such a way as to ensure that decisions could not be made without his personal adjudication of turf battles between the two. In Iran, there are the revolutionary institutions and those of the state, operating side by side, often with conflicting agendas, with only the Supreme Guide empowered to resolve disputes. Just as Hitler set up the SS as an armed counterpoise to the Wehrmacht, Khomeini created the Islamic Revolutionary Guard Corps as the revolution's independent armed branch to parallel the state's armed forces.

Thus, the author stresses, in dealing with Iran, it is essential to be sure whether you're engaging the revolution or the nation state: over the history of the Islamic Republic, power has shifted back and forth between the two sets of institutions, and with it Iran's interaction with other players on the world stage. Iran as a nation state generally strives to become a regional superpower: in effect, re-establishing the Persian Empire from the Mediterranean to the Caspian Sea through vassal regimes. To that end it seeks weapons, allies, and economic influence in a fairly conventional manner. Iran the Islamic revolutionary movement, on the other hand, works to establish global Islamic rule and the return of the Twelfth Imam: an Islamic Second Coming which Khomeini's acolytes fervently believe is imminent. Because they brook no deviation from their creed, they regard Sunni Moslems, even the strict Wahabi sect of Saudi Arabia, as enemies who must be compelled to submit to Khomeini's brand of Islam.

Iran's troubled relationship with the United States cannot be understood without grasping the distinction between state and revolution. To the revolution, the U.S. is the Great Satan spewing foul corruption around the world, which good Muslims should curse, chanting “death to America” before every sura of the Koran. Iran the nation state, on the other hand, only wants Washington to stay out of its way as it becomes a regional power; that, after all, was pretty much the state of affairs under the Shah, with the U.S. his predominant arms supplier. But the U.S. could never adopt such a strategy as long as the revolution has a hand in policy, nor will Iran's neighbours, terrified of its regional ambitions, encourage the U.S. to keep its hands off.

There is a great deal of conventional wisdom about Iran which is dead wrong, and this book dispels much of it. The supposed “CIA coup” against Mosaddegh in 1953, for which two U.S. presidents have since apologised, proves to have been nothing of the sort (although the CIA did, on occasion, claim credit for it as an example of a rare success amidst decades of blundering), with the U.S. largely supporting the nationalisation of the Iranian oil fields against fierce opposition from Britain. But cluelessness about Iran has never been in short supply among U.S. politicians. Speaking at the World Economic Forum, Bill Clinton said:

Iran today is, in a sense, the only country where progressive ideas enjoy a vast constituency. It is there that the ideas I subscribe to are defended by a majority.

Lest this be deemed a slip of the tongue due to intoxication by the heady Alpine air of Davos, a few days later on U.S. television he doubled down with:

[Iran is] the only one with elections, including the United States, including Israel, including you name it, where the liberals, or the progressives, have won two-thirds to 70 percent of the vote in six elections…. In every single election, the guys I identify with got two-thirds to 70 percent of the vote. There is no other country in the world I can say that about, certainly not my own.

I suppose if the U.S. had such an overwhelming “progressive” majority, it too would adopt “liberal” policies such as hanging homosexuals from cranes until they suffocate and stoning rape victims to death. But perhaps Clinton was thinking of Iran's customs of polygamy and “temporary marriage”.

Iran is a great nation which has been a major force on the world stage since antiquity, with a deep cultural heritage and a vigorous population who, in exile from poor governance in the homeland, have risen to the top of demanding professions all around the world. Today (as for much of the last century) Iran is saddled with a regime which squanders its patrimony on a messianic dream which runs the very real risk of igniting a catastrophic conflict in the Middle East. The author argues that the only viable option is regime change, and that all actions taken by other powers should have this as the ultimate goal. Does that mean going to war with Iran? Of course not—the very fact that the people of Iran are already pushing back against the mullahs is evidence they perceive how illegitimate and destructive the present regime is. It may even make sense to engage with institutions of the Iranian state, which will be the enduring foundation of the nation after the mullahs are sent packing, but it is essential to send the Iranian people the message that the forces of civilisation are on their side against those who oppress them, and to use the communication tools of this new century (Which country has the most bloggers? The U.S. Number two? Iran.) to bypass the repressive regime and directly address the people who are its victims.

Hey, I spent two weeks in Iran a decade ago and didn't pick up more than a tiny fraction of the insight available here. Events in Iran are soon to become a focus of world attention to an extent they haven't been for the last three decades. Read this book to understand how Iran figures in the contemporary Great Game, and how revolutionary change may soon confront the Islamic Republic.

 Permalink

Bryson, Bill. The Life and Times of the Thunderbolt Kid. London: Black Swan, 2007. ISBN 978-0-552-77254-9.
What could be better than growing up in the United States in the 1950s? Well, perhaps being a kid with super powers as the American dream reached its apogee and before the madness started! In this book, humorist, travel writer, and science populariser extraordinaire Bill Bryson provides a memoir of his childhood (and, to a lesser extent, coming of age) in Des Moines, Iowa in the 1950s and '60s. It is a thoroughly engaging and charming narrative which, if you were a kid there, then, will bring back a flood of fond memories (as well as some acutely painful ones) and, if you weren't, will help you appreciate, as the author closes the book, “What a wonderful world it was. We won't see its like again, I'm afraid.”

The 1950s were the golden age of comic books, and whilst shopping at the local supermarket, Bryson's mother would drop him in the (unsupervised) Kiddie Corral where he and other offspring could indulge for free to their heart's content. It's only natural a red-blooded Iowan boy would discover himself to be a superhero, The Thunderbolt Kid, endowed with ThunderVision, which enabled his withering gaze to vapourise morons. Regrettably, the power seemed to lack permanence, and the morons so dispersed into particles of the luminiferous æther had a tedious way of reassembling themselves and further vexing our hero and his long-suffering schoolmates. But still, more work for The Thunderbolt Kid!

This was a magic time in the United States—when prosperity not only returned after depression and war, but exploded to such an extent that mean family income more than doubled in the 1950s while most women still remained at home raising their families. What had been considered luxuries just a few years before: refrigerators and freezers, cars and even second cars, single family homes, air conditioning, television, all became commonplace (although kids would still gather in the yard of the neighbourhood plutocrat to squint through his window at the wonder of colour TV and chuckle at why he paid so much for it).

Although the transformation of the U.S. from an agrarian society to a predominantly urban and industrial nation was well underway, most families were no more than one generation removed from the land, and Bryson recounts his visits to his grandparents' farm which recall what was lost and gained as that pillar of American society went into eclipse.

There are relatively few factual errors, but from time to time Bryson's narrative swallows counterfactual left-wing conventional wisdom about the Fifties. For example, writing about atomic bomb testing:

Altogether between 1946 and 1962, the United States detonated just over a thousand nuclear warheads, including some three hundred in the open air, hurling numberless tons of radioactive dust into the atmosphere. The USSR, China, Britain, and France detonated scores more.

Sigh…where do we start? Well, the obvious subtext is that the U.S. started the arms race and that other nuclear powers responded in a feeble manner. In fact, the U.S. conducted a total of 1030 nuclear tests, of which 215 were detonated in the atmosphere; these figures include all tests up until testing was suspended in 1992, with the balance conducted underground with no release of radioactivity. The Soviet Union (USSR) did, indeed, conduct “scores” of tests: 35.75 score, to be precise, for a total of 715 tests, 219 of them in the atmosphere—more than the U.S.—including Tsar Bomba, with a yield of 50 megatons. “Scores” indeed—surely the arms race was entirely at the instigation of the U.S.
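
For those inclined to check the arithmetic behind that understatement:

\[
\frac{715 \ \text{Soviet tests}}{20 \ \text{tests per score}} = 35.75 \ \text{score}.
\]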

If you grew up in the U.S. in the 1950s, or wish you had, you'll want to read this book. I had totally forgotten the radioactive toilets you had to pay to use (but kids could wiggle under the door to bask in their actinic glare), and the glories of automobiles you could understand piece by piece, which were your ticket to exploring a broad continent where every town, every city was completely different: not just another configuration of the same franchises and strip malls (and yet recall how exciting it was when they first arrived: “We're finally part of the great national adventure!”).

In the 1950s, when privation gave way to prosperity yet Leviathan had not yet supplanted family, community, and civil society, it was utopia to be a kid (although, having been there, then, I'd have deemed it boring; but if I'd been confined inside as present-day embryonic taxpayers in safetyland are, I'd probably have blown things up. Oh wait—Willoughby already did that, twelve hours too early!). If you grew up in the '50s, enjoy spending a few pleasant hours back there; if you're a parent of the baby boomers, exult in the childhood and opportunities you entrusted to them. And if you're a parent of a child in this constrained century? Seek to give your child the unbounded opportunities and unsupervised freedom to explore the world which Bryson and this humble scribbler experienced as we grew up.

Vapourising morons with ThunderVision—we need you more than ever, Thunderbolt Kid!

A U.S. edition is available.

 Permalink

February 2010

Churchill, Winston S. The World Crisis. London: Penguin, [1923–1931, 2005] 2007. ISBN 978-0-14-144205-1.
Churchill's history of the Great War (what we now call World War I) was published in five volumes between 1923 and 1931. The present volume is an abridgement of the first four volumes, which appeared simultaneously with the fifth volume of the complete work. This abridged edition was prepared by Churchill himself; it is not a cut and paste job by an editor. Volume Four and this abridgement end with the collapse of Germany and the armistice—the aftermath of the war and the peace negotiations covered in Volume Five of the full history are not included here.

When this work began to appear in 1923, the smart set in London quipped, “Winston's written a book about himself and called it The World Crisis”. There's a lot of truth in that: this is something between a history and a memoir of a politician in wartime. Description of the disastrous attempts to break the stalemate of trench warfare in 1915 barely occupies a chapter, while the Dardanelles Campaign, of which Churchill was seen as the most vehement advocate, and for which he was blamed after its tragic failure, makes up almost a quarter of the 850-page book.

If you're looking for a dispassionate history of World War I, this is not the book to read: it was written too close to the events of the war, before the dire consequences of the peace came to pass, and by a figure motivated as much to defend his own actions as to provide a historical narrative. That said, it does provide an insight into how Churchill's experiences in the war forged the character which would cause Britain to turn to him when war came again. It also goes a long way to explaining precisely why Churchill's warnings were ignored in the 1930s. This book is, in large part, a recital of disaster after disaster in which Churchill played a part, coupled with an explanation of why, in each successive case, it wasn't his fault. Whether or not you accept his excuses and justifications for his actions, it's pretty easy to understand how politicians and the public in the interwar period could look upon Churchill as somebody who, when given authority, produced calamity. It was not just that others were blind to the threat, but rather that Churchill's record made him a seriously flawed messenger on an occasion when his message was absolutely correct.

At this epoch, Churchill was already an excellent writer and delivers some soaring prose on occasion, but he had not yet become the past master of the English language on display in The Second World War (which won the Nobel Prize for Literature when it really meant something). There are numerous tables, charts, and maps which illustrate the circumstances of the war.

Americans who hold to the common view that “The Yanks came to France and won the war for the Allies” may be offended by Churchill's speaking of them only in passing. He considers their effect on the actual campaigns of 1918 as mostly psychological: reinforcing French and British morale and confronting Germany with an adversary with unlimited resources.

Perhaps the greatest lesson to be drawn from this work is that of the initial part, which covers the darkening situation between 1911 and the outbreak of war in 1914. What is stunning, as sketched by a person involved in the events of that period, is just how trivial the proximate causes of the war were compared to the apocalyptic bloodbath which ensued. It is as if the crowned heads, diplomats, and politicians had no idea of the stakes involved, and indeed they did not—all expected the war to be short and decisive, none anticipating the consequences of the superiority conferred on the defence by the machine gun, entrenchments, and barbed wire. After the outbreak of war and its freezing into a trench war stalemate in the winter of 1914, for three years the Allies believed their “offensives”, which squandered millions of lives for transitory and insignificant gains of territory, were conducting a war of attrition against Germany. In fact, due to the supremacy of the defender, Allied losses always exceeded those of the Germans, often by a factor of two to one (and even more for officers). Further, German losses were never greater than the number of new conscripts in each year of the war up to 1918, so in fact this “war of attrition” weakened the Allies every year it continued. You'd expect intelligence services to figure out such a fundamental point, but it appears the “by the book” military mentality dismissed such evidence and continued to hurl a generation of their countrymen into the storm of steel.
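
To make the attrition arithmetic concrete, here is a toy model (a minimal sketch with round, purely illustrative numbers, not figures from Churchill):

```python
# Toy attrition model: round illustrative numbers, not Churchill's data.
# Assume the defender's edge costs the Allies two men for every German,
# and that the annual German conscript class covers German losses.
allied_reserve = 10_000_000   # hypothetical pool of fit Allied manpower
german_reserve = 8_000_000    # hypothetical German pool
german_losses = 500_000       # German casualties per year of offensives
conscript_class = 600_000     # new German conscripts each year

for year in range(1915, 1918):
    allied_reserve -= 2 * german_losses            # 2:1 loss ratio
    german_reserve += conscript_class - german_losses
    print(year, f"Allied reserve: {allied_reserve:,}",
          f"German reserve: {german_reserve:,}")
# Each year the "war of attrition" continues, the Allied pool shrinks by
# a million men while the German pool actually grows.
```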

This is a period piece: read it not as a history of the war but rather to experience the events of the time as Churchill saw them, and to appreciate how they made him the wartime leader he was to be when, once again, the lights went out all over Europe.

A U.S. edition is available.

 Permalink

Carroll, Sean. From Eternity to Here. New York: Dutton, 2010. ISBN 978-0-525-95133-9.
The nature of time has perplexed philosophers and scientists from the ancient Greeks (and probably before) to the present day. Despite two and a half millennia of reflexion upon the problem and spectacular success in understanding many other aspects of the universe we inhabit, not only has little progress been made on the question of time, but to a large extent we are still puzzling over the same problems which vexed thinkers in the time of Socrates: Why does there seem to be an inexorable arrow of time which can be perceived in physical processes (you can scramble an egg, but just try to unscramble one)? Why do we remember the past, but not the future? Does time flow by us, living in an eternal present, or do we move through time? Do we have free will, or is that an illusion and is the future actually predestined? Can we travel to the past or to the future? If we are typical observers in an eternal or very long-persisting universe, why do we find ourselves so near its beginning (the big bang)?

Indeed, what we have learnt about time makes these puzzles even more enigmatic. For it appears, based both on theory and all experimental evidence to date, that the microscopic laws of physics are completely reversible in time: any physical process can (and does) go in both the forward and reverse time directions equally well. (Actually, it's a little more complicated than that: just reversing the direction of time does not yield identical results, but simultaneously reversing the direction of time [T], interchanging left and right [parity: P], and swapping particles for antiparticles [charge: C] yields identical results under the so-called “CPT” symmetry which, as far as is known, is absolute. The tiny violation of time reversal symmetry by itself in weak interactions seems, to most physicists, inadequate to explain the perceived unidirectional arrow of time, although some disagree.)
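
In symbols (a standard statement of the CPT theorem, in my notation, not the author's): if $\Theta$ denotes the combined CPT operation, then for the Lagrangian density $\mathcal{L}$ of any local, Lorentz-invariant quantum field theory,

\[
\Theta \, \mathcal{L}(t, \vec{x}) \, \Theta^{-1} = \mathcal{L}(-t, -\vec{x}),
\]

so the action, and hence the physics, is unchanged when all three reversals are applied together.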

In this book, the author argues that the way in which we perceive time here and now (whatever “now” means) is a direct consequence of the initial conditions which obtained at the big bang—the beginning of time, and the future state into which the universe is evolving—eternity. Whether or not you agree with the author's conclusions, this book is a tour de force popular exposition of thermodynamics and statistical mechanics, which provides the best intuitive grasp of these concepts of any non-technical book I have yet encountered. The science and ideas which influenced thermodynamics and its practical and philosophical consequences are presented in a historical context, showing how in many cases phenomenological models were successful in grasping the essentials of a physical process well before the actual underlying mechanisms were understood (which is heartening to those trying to model the very early universe absent a theory of quantum gravity).

Carroll argues that the Second Law of Thermodynamics entirely defines the arrow of time. Closed systems (and for the purpose of the argument here we can consider the observable universe as such a system, although it is not precisely closed: particles enter and leave our horizon as the universe expands and that expansion accelerates) always evolve from a state of lower probability to one of higher probability: the “entropy” of a system is (sloppily stated) a measure of the probability of finding the system in a given macroscopically observable state, and over time the entropy always stays the same or increases; except for minor fluctuations, the entropy increases until the system reaches equilibrium, after which it simply fluctuates around the equilibrium state with essentially no change in its coarse-grained observable state. What we perceive as the arrow of time is simply systems evolving from less probable to more probable states, and since they (in isolation) never go the other way, we naturally observe the arrow of time to be universal.
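
Stated compactly, this is Boltzmann's relation (standard statistical mechanics, added here for concreteness): a macrostate realisable by $\Omega$ microstates has entropy

\[
S = k_B \ln \Omega,
\]

and because a system wanders overwhelmingly toward macrostates compatible with more microstates, the coarse-grained entropy of a closed system (almost) never decreases.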

Look at it this way—there are vastly fewer configurations of the atoms which make up an egg as produced by a chicken: shell outside, yolk in the middle, and white in between, than there are for the same egg scrambled in the pan with the fragments of shell discarded in the poubelle. There are an almost inconceivable number of ways in which the atoms of the yolk and white can mix to make the scrambled egg, but far fewer ways they can end up neatly separated inside the shell. Consequently, if we see a movie of somebody unscrambling an egg, the white and yolk popping up from the pan to be surrounded by fragments which fuse into an unbroken shell, we know some trickster is running the film backward: it illustrates a process where the entropy dramatically decreases, and that never happens in the real world. (Or, more precisely, its probability of happening anywhere in the universe in the time since the big bang is “beyond vanishingly small”.)
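
A toy computation makes the disparity vivid. The cell counts below are arbitrary small stand-ins for Avogadro-scale numbers, chosen only so the calculation finishes instantly; the real disparity is vastly greater:

```python
# Count microstates for a toy "egg": N yolk molecules which may occupy
# either only a compact yolk region (K cells) or anywhere in the
# scrambled pan (M cells).  All three numbers are illustrative.
from math import comb, log10

M = 1_000_000  # cells available in the scrambled pan
K = 2_000      # cells in the compact, unbroken yolk region
N = 1_000      # yolk molecules

separated = comb(K, N)  # every molecule confined to the yolk region
scrambled = comb(M, N)  # molecules free to sit anywhere in the pan

print(f"log10(separated microstates) = {log10(separated):.0f}")  # ~600
print(f"log10(scrambled microstates) = {log10(scrambled):.0f}")  # ~3434
# The scrambled macrostate outnumbers the separated one by thousands of
# orders of magnitude: the arrow of time in miniature.
```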

Now, once you understand these matters, as you will after reading the pellucid elucidation here, it all seems pretty straightforward: our universe is evolving, like all systems, from lower entropy to higher entropy, and consequently it's only natural that we perceive that evolution as the passage of time. We remember the past because the process of storing those memories increases the entropy of the universe; we cannot remember the future because we cannot predict the precise state of the coarse-grained future from that of the present, simply because there are far more possible states in the future than at the present. Seems reasonable, right?

Well, up to a point, Lord Copper. The real mystery, to which Roger Penrose and others have been calling attention for some years, is not that entropy is increasing in our universe, but rather why it is presently so low compared to what it might be expected to be in a universe in a randomly chosen configuration, and further, why it was so absurdly low in the aftermath of the big bang. Given the initial conditions after the big bang, it is perfectly reasonable to expect the universe to have evolved to something like its present state. But this says nothing at all about why the big bang should have produced such an incomprehensibly improbable set of initial conditions.

If you think about entropy in the usual thermodynamic sense of gas in a box, the evolution of the universe seems distinctly odd. After the big bang, the region which represents today's observable universe appears to have been a thermalised system of particles and radiation very near equilibrium, and yet today we see nothing of the sort. Instead, we see complex structure at scales from molecules to superclusters of galaxies, with vast voids in between, and stars profligately radiating energy into space with a temperature less than three degrees above absolute zero. That sure looks like entropy going down: it's as if you left a pot of tepid water on the counter top overnight and, the next morning, found a village of igloos surrounding a hot spring. I mean, it could happen, but how probable is that?

It's gravity that makes the difference. Unlike all of the other forces of nature, gravity always attracts. This means that when gravity is significant (which it isn't in a steam engine or pan of water), a gas at thermal equilibrium is actually in a state of very low entropy. Any small compression or rarefaction in a region will cause particles to be gravitationally attracted to volumes with greater density, which will in turn reinforce the inhomogeneity, which will amplify the gravitational attraction. The gas at thermal equilibrium will, then, unless it is perfectly homogeneous (which quantum and thermal fluctuations render impossible) collapse into compact structures separated by voids, with the entropy increasing all the time. Voilà galaxies, stars, and planets.
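
The scale at which gravity wins this tug-of-war is quantified by the classical Jeans criterion (a standard result, added here for concreteness): a region of gas with sound speed $c_s$ and density $\rho$ collapses under its own gravity if it is larger than

\[
\lambda_J = c_s \sqrt{\frac{\pi}{G \rho}},
\]

while smaller perturbations merely oscillate as sound waves.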

As sources of energy are exhausted, gravity wins in the end, and as structures compact ever more, entropy increasing apace, eventually the universe is filled only with black holes (with vastly more entropy than the matter and energy that fell into them) and cold dark objects. But wait, there's more! The expansion of the universe is accelerating, so any structures which are not gravitationally bound will eventually disappear over the horizon and the remnants (which may ultimately decay into a gas of unbound particles, although the physics of this remains speculative) will occupy a nearly empty expanding universe (absurd as this may sound, this de Sitter space is an exact solution to Einstein's equations of General Relativity). This, the author argues, is the highest entropy state of matter and energy in the presence of gravitation, and it appears from current observational evidence that that's indeed where we're headed.
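
The claim that black holes hold vastly more entropy than what fell into them is quantitative: the Bekenstein–Hawking formula (standard physics, added here for concreteness) ties a hole's entropy to the area $A$ of its event horizon,

\[
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar},
\]

which works out to roughly $10^{77} k_B$ for a solar-mass black hole, versus about $10^{58} k_B$ for the Sun itself: collapse multiplies the entropy by some nineteen orders of magnitude.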

So, it's plausible the entire evolution of the universe from the big bang into the distant future increases entropy all the way, and hence there's no mystery why we perceive an arrow of time pointing from the hot dense past to cold dark eternity. But doggone it, we still don't have a clue why the big bang produced such low entropy! The author surveys a number of proposed explanations. Some invoke fine-tuning with no apparent physical explanation; others summon an enormous (or infinite) “multiverse” of all possibilities and argue that among such an ensemble we find ourselves in one of the vanishingly small fraction of universes like our own because observers like ourselves couldn't exist in the others (the anthropic argument); still others contend that the big bang was not actually the beginning and that some dynamical process which preceded it (which might then be considered a “big bounce”) forced the initial conditions into a low entropy state. There are many excellent arguments against these proposals, which are clearly presented. The author's own favourite, which he concedes is as speculative as all the others, is that de Sitter space is unstable against a quantum fluctuation which nucleates a disconnected bubble universe in which entropy is initially low. The process of nucleation increases entropy in the multiverse, and hence there is no upper bound at all on entropy, with the multiverse eternal in past and future, and entropy increasing forever without bound in the future and decreasing without bound in the past.

(If you're a regular visitor here, you know what's coming, don't you?) Paging Friar Ockham! We start out having discovered yet another piece of evidence for what appears to be a fantastically improbable fine-tuning of the initial conditions of our universe. The deeper we investigate this, the more mysterious it appears, as we discover no reason in the dynamical laws of physics for the initial conditions to have been so unlikely among the ensemble of possible initial conditions. We are then faced with the “trichotomy” I discussed regarding the origin of life on Earth: chance (it just happened to be that way, or it was every possible way, and we, tautologically, live in one of the universes in which we can exist), necessity (some dynamical law which we haven't yet figured out caused the initial conditions to be the way we observe them to have been), or (and here's where all the scientists turn their backs upon me, snuff the candles, and walk away) design. Yes, design. Suppose (and yes, I know, I've used this analogy before and will certainly do so again) you were a character in a video game who somehow became sentient and began to investigate the universe you inhabited. As you did, you'd discover there were distinct regularities which governed the behaviour of objects and their interactions. As you probed deeper, you might be able to access the machine code of the underlying simulation (or at least get a glimpse into its operation by running precision experiments). You would discover that compared to a random collection of bits of the same length, it was in a fantastically improbable configuration, and you could find no plausible way that a random initial configuration could evolve into what you observe today, especially since you'd found evidence that your universe was not eternally old but rather came into being at some time in the past (when, say, the game cartridge was inserted).

What would you conclude? Well, if you exclude the design hypothesis, you're stuck with supposing that there may be an infinity of universes like yours in all random configurations, and you observe the one you do because you couldn't exist in all but a very few improbable configurations of that ensemble. Or you might argue that some process you haven't yet figured out caused the underlying substrate of your universe to assemble itself, complete with the copyright statement and the Microsoft security holes, from a generic configuration beyond your ability to observe in the past. And being clever, you'd come up with persuasive arguments as to how these most implausible circumstances might have happened, even at the expense of invoking an infinity of other universes, unobservable in principle, and an eternity of time, past and future, in which events could play out.

Or, you might conclude from the quantity of initial information you observed (which is identical to low initial entropy) and the improbability of that configuration having been arrived at by random processes on any imaginable time scale, that it was put in from the outside by an intelligent designer: you might call Him or Her the Programmer, and some might even come to worship this being, outside the observable universe, which is nonetheless responsible for its creation and the wildly improbable initial conditions which permit its inhabitants to exist and puzzle out their origins.

Suppose you were running a simulation of a universe, and to win the science fair you knew you'd have to show the evolution of complexity all the way from the get-go to the point where creatures within the simulation started to do precision experiments, discover curious fine-tunings and discrepancies, and begin to wonder…? Would you start your simulation at a near-equilibrium condition? Only if you were a complete idiot—nothing would ever happen—and whatever you might say about post-singularity super-kids, they aren't idiots (well, let's not talk about the music they listen to, if you can call that music). No, you'd start the simulation with extremely low entropy, with just enough inhomogeneity that gravity would get into the act and drive the emergence of hierarchical structure. (Actually, if you set up quantum mechanics the way we observe it, you wouldn't have to put in the inhomogeneity; it will emerge from quantum fluctuations all by itself.) And of course you'd fine tune the parameters of the standard model of particle physics so your universe wouldn't immediately turn entirely into neutrons, diprotons, or some other dead end. Then you'd sit back, turn up the volume on the MultIversePod, and watch it run. Sure 'nuff, after a while there'd be critters trying to figure it all out, scratching their balding heads, and wondering how it came to be that way. You would be most amused as they excluded your existence as a hypothesis, publishing theories ever more baroque to exclude the possibility of design. You might be tempted to….

Fortunately, this chronicle does not publish comments. If you're sending them from the future, please use the antitelephone.

(The author discusses this “simulation argument” in endnote 191. He leaves it to the reader to judge its plausibility, as do I. I remain on the record as saying, “more likely than not”.)

Whatever you may think about the Big Issues raised here, if you've never experienced the beauty of thermodynamics and statistical mechanics at a visceral level, this is the book to read. I'll bet many engineers who have been completely comfortable with computations in “thermogoddamics” for decades finally discover they “get it” after reading this equation-free treatment aimed at a popular audience.

 Permalink

D'Souza, Dinesh. Life After Death: The Evidence. Washington: Regnery Publishing, 2009. ISBN 978-1-59698-099-0.
Ever since the Enlightenment, and to an increasing extent today, there has been a curious disconnect between the intellectual élite and the population at large. The overwhelming majority of human beings who have ever lived believed in their survival, in one form or another, after death, while materialists, reductionists, and atheists argue that this is nothing but wishful thinking and that there is no physical mechanism by which consciousness could survive the dissolution of the neural substrate in which it is instantiated, pointing to the lack of any evidence for survival after death. And yet a large majority of people alive today beg to differ. As atheist H. G. Wells put it in a very different context, they sense that “Worlds may freeze and suns may perish, but there stirs something within us now that can never die again.” Who is right?

In this slim (256-page) volume, the author examines the scientific, philosophical, historical, and moral evidence for and implications of survival after death. He explicitly excludes religious revelation (except in the final chapter, where some evidence he cites as historical may be deemed by others to be argument from scriptural authority). Having largely excluded religion from the argument, he explores the near-universality of belief in life after death across religious traditions and notes the common threads uniting them.

But traditions and beliefs do not in any way address the actual question: does our individual consciousness, in some manner, survive the death of our bodies? While materialists discard such a notion as absurd, the author argues that there is nothing in our present-day understanding of physics, evolutionary biology, or neuroscience which excludes this possibility. In fact, the complete failure so far to understand the physical basis of consciousness can be taken as evidence that it may be a phenomenon independent of its physical instantiation: structured information which could conceivably transcend the hardware on which it currently operates.

Computer users think nothing these days of backing up their old computer, loading the backups onto a new machine (which may use a different processor and operating system), and with a little upward compatibility magic, having everything work pretty much as before. Do your applications and documents from the old computer die when you turn it off for the last time? Are they reincarnated when you load them into the replacement machine? Will they live forever as long as you continue to transfer them to successive machines, or on backup tapes? This may seem a silly analogy, but consider that materialists consider your consciousness and self to be nothing other than a pattern of information evolving in a certain way according to the rules of neural computation. Do the thought experiment: suppose nanotechnological robots replaced your meat neurons one by one with mechanical analogues with the same external electrochemical interface. Eventually your brain would be entirely different physically, but would your consciousness change at all? Why? If it's just a bunch of components, then replacing protein components with silicon (or whatever) components which work in the same way should make no difference at all, shouldn't it?
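
For what it's worth, here is the backup analogy as a minimal sketch in code (the class names are invented for illustration): the same stored state, reconstituted on a different “substrate”, is indistinguishable from the outside.

```python
# Serialise a "self" (its memories) from one brain and restore it into a
# differently implemented one; external behaviour is identical.
import pickle

class MeatBrain:
    def __init__(self, memories):
        self.memories = list(memories)       # protein storage, say
    def recall(self):
        return sorted(self.memories)

class SiliconBrain:
    def __init__(self, memories):
        self.memories = set(memories)        # different internals entirely
    def recall(self):
        return sorted(self.memories)         # same external interface

old = MeatBrain(["first bicycle", "smell of rain", "Euler's identity"])
tape = pickle.dumps(old.memories)            # the "backup tape"

new = SiliconBrain(pickle.loads(tape))       # restored on new hardware
assert new.recall() == old.recall()          # behaviourally the same self
```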

A large part of what living organisms do is sense their external environment and interact with it. Unicellular organisms swim along the gradient of increasing nutrient concentration. Other than autonomic internal functions of which we are aware only when they misbehave, humans largely experience the world through our sensory organs, and through the internal sense of self which is our consciousness. Is it not possible that the latter is much like the former—something external to the meatware of our body which is picked up by a sensory organ, in this case the neural networks of the brain?

If this be the case, in the same sense that the external world does not cease to exist when our eyes, ears, olfactory, and tactile sensations fail at the time of death or due to injury, is it not plausible that dissolution of the brain, which receives and interacts with our external consciousness, need not mean the end of that incorporeal being?

Now, this is pretty out-there stuff, which might cause the author to run from the room in horror should he hear me expound it. Fine: this humble book reviewer spent a substantial amount of time contributing to a project seeking evidence for existence of global, distributed consciousness, and has concluded that such has been demonstrated to exist by the standards accepted by most of the “hard” sciences. But let's get back to the book itself.

One thing you won't find here is evidence based upon hauntings, spiritualism, or other supposed contact with the dead (although I must admit, Chicago election returns are awfully persuasive as to the ability of the dead to intervene in affairs of the living). The author does explore near death experiences, noting their universality across very different cultures and religious traditions, and evidence for reincarnation, which he concludes is unpersuasive (but see the research of Ian Stevenson and decide for yourself). The exploration of a physical basis for the existence of other worlds (for example, Heaven and Hell) cites the “multiverse” paradigm, and invites sceptics of that “theory of anything” to denounce it as “just as plausible as life after death”—works for me.

Excuse me for taking off on a tangent here, but it is, in a formal sense. If you believe in an infinite chaotically inflating universe with random initial conditions, or in Many Worlds in One (October 2006), then Heaven and Hell explicitly exist, not only once in the multiverse, but an infinity of times. For every moment in your life at which you may cease to exist, there is a universe somewhere out there, either elsewhere in the multiverse or in some distant region far from our cosmic horizon in this universe, with an observable universe identical to our own up to that instant which diverges thence into one which grants you eternal reward or torment for your actions. In an infinite universe with random initial conditions, every possibility occurs an infinite number of times. Think about it, or better yet, don't.

The chapter on morality is particularly challenging and enlightening. Every human society has had a code of morality (different in the details, but very much the same at the core), and most of these societies have based their moral code upon a belief in cosmic justice in an afterlife. It's self-evident that bad guys sometimes win at the expense of good guys in this life, but belief that the score will be settled in the long run has provided a powerful incentive for mortals to conform to the norms which their societies prescribe as good. (I've deliberately written the last sentence in the post-modern idiom; I consider many moral norms absolutely good or bad based on gigayears of evolutionary history, but I needn't introduce that into evidence to prove my case, so I won't.) From an evolutionary standpoint, morality is a survival trait of the family or band: the hunter who shares the kill with his family and tribe will have more descendants than the gluttonous loner. A tribe which produces males who sacrifice themselves to defend their women and children will produce more offspring than the tribe whose males value only their own individual survival.

Morality, then, is, at the group level, a selective trait, and consequently it's no surprise that it's universal among human societies. But if, as serious atheists such as Bertrand Russell (as opposed to the lower-grade atheists we get today) worried, morality has been linked to religion and belief in an afterlife in every single human society to date, then how is morality (a survival characteristic) to be maintained in the absence of these beliefs? And if evolution has selected us to believe in the afterlife for the behavioural advantages that belief confers in the here and now, then how successful will the atheists be in extinguishing a belief which has conferred a behavioural selective advantage upon thousands of generations of our ancestors? And how will societies which jettison such belief fare in competition with those which keep it alive?

I could write much more about this book, but then you'd have to read a review even longer than the book, so I'll spare you. If you're interested in this topic (as you'll probably eventually be as you get closer to the checkered flag), this is an excellent introduction, and the end notes provide a wealth of suggestions for additional reading. I doubt this book will shake the convictions of either the confirmed believers or the stalwart sceptics, but it will provide much for both to think about, and perhaps motivate some folks whose approach is “I'll deal with that when the time comes” (which has been pretty much my own) to consider the consequences of what may come next.

 Permalink

Benioff, David. City of Thieves. New York: Viking, 2008. ISBN 978-0-670-01870-3.
This is a coming of age novel, buddy story, and quest saga set in the most implausible of circumstances: the 872-day Siege of Leningrad and the surrounding territory. I don't know whether the author's grandfather actually lived these events and recounted them to him or whether it's just a literary device, but I'm certain the images you experience here will stay with you for many years after you put this book down, and that you'll probably return to it after reading it the first time.

Kolya is one of the most intriguing characters I've encountered in modern fiction, with Vika a close second. You wouldn't expect a narrative set in the German invasion of the Soviet Union to be funny, but there are quite a number of laughs here, which will acquaint you with the Russian genius for black humour when everything looks the bleakest. You will learn to be very wary around well-fed people in the middle of a siege!

Much of the description of life in Leningrad during the siege is, of course, grim, although arguably less so than the factual account in Harrison Salisbury's The 900 Days (however, note that the story is set early in the siege; conditions deteriorated as it progressed). It isn't often you read a historical novel in which Olbers' paradox figures!

 Permalink

March 2010

Sowell, Thomas. The Housing Boom and Bust. 2nd ed. New York: Basic Books, [2009] 2010. ISBN 978-0-465-01986-1.
If you rely upon the statist legacy media for information regarding the ongoing financial crisis triggered by the collapse of the real estate bubble in certain urban markets in the United States, everything you know is wrong. This book is a crystal-clear antidote to the fog of disinformation emanating from the politicians and their enablers in media and academia.

If, as five or six people still do, you pay attention to the legacy media in the United States, you'll hear that there was a nationwide crisis in the availability of affordable housing, that government moved to enable more people to become homeowners, that lack of regulation caused lenders to make risky loans and resell them as “toxic assets” which nobody could actually value, and that these flimsy pieces of paper were sold around the world as if they were really worth something.

Everything you know is wrong.

In fact, there never was a nationwide affordable housing crisis. The percentage of family income spent on housing nationwide fell in the nineties and oughties. The bubble market in real estate was largely confined to a small number of communities which had enacted severe restrictions upon development that reduced the supply of housing—in fact, of 26 urban areas rated as “severely unaffordable”, 23 had adopted “smart growth” policies. (Rule of thumb: whenever government calls something “smart”, it's a safe bet that it's dumb.)

But the bubble was concentrated in the collectivist enclaves where the chattering class swarm and multiply: New York, San Francisco, Los Angeles, Washington, Boston, and hence featured in the media, ignoring markets such as Dallas and Houston where, in the absence of limits on development, housing prices were stable.

As Eric Sevareid observed, “The chief cause of problems is solutions”, and this has never been better demonstrated than in the sorry sequence of interventions in the market documented here. Let's briefly sketch the “problems” and “solutions” which, over decades, were the proximate cause of the present calamity.

First of all, back in the New Deal, politicians decided the problem of low rates of home ownership and the moribund construction industry of the Depression could be addressed by the solution of government (or government sponsored) institutions to provide an aftermarket in mortgages by banks, which could then sell the mortgages on their books and free up the capital to make new loans. When the economy started to grow rapidly after the end of World War II, this solution caused a boom in residential construction, enabling working class families to buy new houses in the rapidly expanding suburbs. This was seen as a problem, “suburban sprawl”, to which local politicians, particularly in well-heeled communities on the East and West coasts, responded with the solution of enacting land use restrictions (open space, minimum lot sizes, etc.) to keep the “essential character” of their communities from being changed by an invasion of hoi polloi and their houses made of ticky-tacky, all the same. This restriction of the supply of housing predictably led to a rapid rise in the price of housing in these markets (while growth-oriented markets without such restrictions experienced little or no housing price increases, even at the height of the bubble). The increase in the price of housing priced more and more people out of the market, particularly younger first-time home buyers and minorities, which politicians proclaimed an “affordable housing crisis” and supposed, contrary to readily available evidence, to be a national phenomenon. They enacted solutions, such as the Community Reinvestment Act, regulation which effectively required lenders to meet quotas of low-income and minority mortgage lending, compelling lenders to make loans their usual standards of risk evaluation would have caused them to decline. Expanding the pool of potential home buyers increased the demand for housing, and with the supply fixed due to political restrictions on development, the increase in housing prices inevitably accelerated, pricing more people out of the market. Politicians responded to this problem by encouraging lenders to make loans which would have been considered unthinkably risky just a few years before: no down payment loans, loans with a low-ball “teaser” rate for the first few years which reset to the prevailing rate thereafter, and even “liar loans” where the borrower was not required to provide documentation of income or net worth. These forms of “creative financing” were, in fact, highly-leveraged bets upon the housing bubble continuing—all would lead to massive defaults in the case of declining or even stable valuations of houses.
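
To see why such loans amounted to leveraged bets on ever-rising prices, consider a minimal sketch of a teaser-rate reset using the standard fixed-payment amortisation formula (the loan figures are hypothetical round numbers, not data from the book):

```python
# Monthly payment on a 30-year mortgage at an introductory "teaser"
# rate, and again after it resets to the prevailing rate.
def monthly_payment(principal, annual_rate, months=360):
    """Standard fixed-payment amortisation formula."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

principal = 400_000   # hypothetical loan on a bubble-priced house

teaser = monthly_payment(principal, 0.02)   # 2% introductory rate
reset = monthly_payment(principal, 0.07)    # 7% prevailing rate

print(f"Teaser payment: ${teaser:,.0f}/month")   # about $1,479
print(f"Reset payment:  ${reset:,.0f}/month")    # about $2,661
# A borrower who qualified only at the teaser payment must refinance or
# sell at reset time, which works only so long as prices keep rising.
```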

Because any rational evaluation of the risk of securities based upon the aggregation of these risky loans would cause investors to price them accordingly, securities of Byzantine complexity were created, along with financial derivatives based upon them and what amounted to insurance provided by counterparty institutions, allowing them to receive high credit ratings from the government-endorsed rating agencies (whose revenue stream depended upon granting favourable ratings to these securities). These “mortgage-backed securities” were then sold all around the world, and ended up in the portfolios of banks, pension funds, and individual investors, including this scrivener (saw it coming; sold while the selling was good).
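
A minimal sketch (with made-up parameters, not the book's analysis) of why these pools defied rational valuation: if defaults are assumed independent, a large pool looks very safe, but a common driver such as a nationwide price decline fattens the tail dramatically:

```python
# Compare the chance that pool losses exceed 15% of loans when defaults
# are independent versus driven by a shared systemic factor.
import random

def tail_risk(n_loans=1000, p_default=0.05, correlated=False, trials=2000):
    """Fraction of trials in which more than 15% of the pool defaults."""
    bad = 0
    for _ in range(trials):
        if correlated:
            # A systemic factor shifts every loan's default odds together.
            p = p_default * (4.0 if random.random() < 0.1 else 0.5)
        else:
            p = p_default
        defaults = sum(random.random() < p for _ in range(n_loans))
        if defaults / n_loans > 0.15:
            bad += 1
    return bad / trials

print("Independent defaults:", tail_risk(correlated=False))  # ~0.00
print("Correlated defaults: ", tail_risk(correlated=True))   # ~0.10
```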

Then, as always happens in financial bubbles, the music stopped. Back in the days of ticker tape machines, you could hear the popping of a bubble. The spasmodic buying by the greatest fools of all would suddenly cease its clatter and an ominous silence would ensue. Then, like the first raindrops which presage a great deluge, you'd hear the tick-tick-tick of sell orders being filled below the peak price. And then the machine would start to chatter in earnest as sell orders flooded into the market, stops were hit and taken out, and volume exploded to the downside. So it has always been, and so it will always be. And so it was in this case, although in the less liquid world of real estate it took a little longer to play out.

As you'll note in these comments, and also in Sowell's book, the words “politicians” and “government” appear disproportionately as the subject of sentences which describe each step in how a supposed problem became a solution which became a problem. The legacy media would have you believe that “predatory lenders”, “greedy Wall Street firms”, “speculators”, and other nefarious private actors are the causes of the present financial crisis. These players certainly exist, and they've been evident as events have played out, but the essence of the situation is that all of them are creations and inevitable consequences of the financial environment created by politicians, who are now blaming others for the mess they created and calling for more “regulation” by politicians (as if, in the long and sorry history of regulation, it has ever made anything more “regular” than the collective judgement of millions of people freely trading with one another in an open market).

There are few people as talented as Thomas Sowell when it comes to taking a complex situation spanning decades and crossing the boundary of economics and politics, and then dissecting it out into the essentials like an anatomy teacher, explaining in clear as light prose the causes and effects, and the unintended and yet entirely predictable consequences (for those acquainted with basic economics) which led to the present mess. This is a masterpiece of such work, and anybody who's interested in the facts and details behind the obfuscatory foam emerging from the legacy media will find this book an essential resource.

Dr. Sowell's books tend to be heavily footnoted, with not only source citations but also expansions upon the discussion in the main text. The present volume uses a different style, with a lengthy “Sources” section, a full 19% of the book, listing citations for items in the text in narrative form, chapter by chapter. Expressing these items in text, without the abbreviations normally used in foot- or end-notes, balloons the length of this section and introduces much redundancy. Perhaps it's due to the publisher feeling a plethora of footnotes puts off the casual reader, but for me, footnotes just work a lot better than these wordy source notes.

 Permalink

Smith, Lee. The Strong Horse. New York: Doubleday, 2010. ISBN 978-0-385-51611-2.
After the attacks upon the U.S. in September 2001, the author, who had been working as an editor in New York City, decided to find out for himself what in the Arab world could provoke such indiscriminate atrocities. Rather than turn to the works of establishment Middle East hands or radical apologists for Islamist terror, he pulled up stakes and moved to Cairo and later Beirut, spending years there living in the community, meeting people from all walks of life: doormen, cab drivers, students, intellectuals, clerics, politicians, artists, celebrities, and more. This book presents his conclusions in a somewhat unusual form: it is hard to categorise—it's part travelogue; part collection of interviews; part survey of history; part exploration of Arab culture, art, and literature; and part geopolitical analysis. What is clear is that this book is a direct assault upon the consensus view of the Middle East among Western policymakers which, if correct (and the author is very persuasive indeed), condemns many of the projects of “democratisation”, “peace processes”, and integration of the nations of the region into a globalised economy to failure; it calls for an entirely different approach to the Arab world, one from which many feel-good Western diplomats and politically correct politicians will recoil in horror.

In short, Smith concludes that the fundamental assumption of the program whose roots can be traced from Woodrow Wilson to George W. Bush—that all people, and Arabs in particular, strive for individual liberty, self-determination, and a civil society with democratically elected leaders—is simply false: those are conditions which Western societies purchased over centuries, at the cost of great bloodshed and suffering, through the actions of heroes. This experience has never occurred in the Arab world, and consequently its culture is entirely different. One can attempt to graft the trappings of Western institutions onto an Arab state, but without a fundamental change in the culture, the graft will not take and before long things will be just as before.

Let me make clear a point the author stresses. There is not the slightest intimation in this book that there is some kind of racial or genetic difference (which are the same thing) between Arabs and Westerners. Indeed, such a claim can be immediately falsified by the large community of Arabs who have settled in the West, assimilated themselves to Western culture, and become successful in all fields of endeavour. But those are Arabs, often educated in the West, who have rejected the culture in which they were born, choosing consciously to migrate to a very different culture they find more congenial to the way they choose to live their lives. What about those who stay (whether by preference, or due to lack of opportunity to emigrate)?

No, Arabs are not genetically different in behaviour, but culture is just as heritable as any physical trait, and it is here the author says we must look to understand the region. The essential dynamic of Arab political culture and history, as described by the 14th century Islamic polymath Ibn Khaldun, is that of a strong leader establishing a dynasty or power structure to which subjects submit, but which becomes effete and feckless over time, only to eventually be overthrown violently by a stronger force (often issuing from desert nomads in the Arab experience), which begins the cycle again. The author (paraphrasing Osama bin Laden) calls this the “strong horse” theory: Arab populations express allegiance to the strongest perceived power, and expect changes in governance to come through violent displacement of a weaker existing order.

When you look at things this way, many puzzles regarding the Middle East begin to make more sense. First of all, the great success which imperial powers over the millennia, including the Persian, Ottoman, French, and British empires, have had in subduing and ruling Arabs without substantial internal resistance is explained: the empire was seen as the strong horse and Arab groups accepted subordination to it. Similarly, the ability of sectarian minorities to rule on a long-term basis in modern states such as Lebanon, Syria, and Iraq is explained, as is the great stability of authoritarian regimes in the region—they usually fall only when deposed by an external force or by a military coup, not due to popular uprisings.

Rather than presenting a lengthy recapitulation of the arguments of the book filtered through my own comprehension and prejudices, this time I invite you to read a comprehensive exposition of the author's arguments in his own words, in the transcript of a three-hour interview by Hugh Hewitt. If you're interested in the topics raised so far, please read the interview and then return here for some closing comments.

Is the author's analysis correct? I don't know—certainly it is at variance with that of a mass of heavy-hitting intellectuals who have studied the region for their entire careers, and, if correct, it means that much of Western policy toward the Middle East since the fall of the Ottoman Empire has been at best ill-informed and at worst tragically destructive. All of the debate about Islam, fundamentalist Islam, militant Islam, Islamism, Islamofascism, and so on, in Smith's view, misses the entire point. He contends that Islam has nothing, or next to nothing, to do with the present conflict. Islam, born in the Arabian desert, simply canonised, with a few minor changes, a political and social regime which had existed in Arabia for millennia before the Prophet, based squarely on rule by the strong horse. Islam, then, is not the source of Arab culture but a consequence of it, and its global significance is as a vector which carries Arab governance by the strong horse into other cultures where Islam takes root. The extent to which Arab culture is adopted depends upon the strength and nature of the preexisting local culture into which Islam is introduced: certainly the culture and politics of Islamic Turkey, Iran, and Indonesia are very different from those of Arab nations, and from each other.

The author describes democracy as “a flower, not a root”. An external strong horse can displace an Arab autocracy and impose elections, a legislature, and other trappings of democracy, but without the foundations of the doctrine of natural rights, the rule of law, civil society, free speech and the tolerance of dissent, freedom of conscience, and the separation of the domain of the state from the life of the individual, the result is likely to be “one person, one vote, one time” and a return to strong horse government as has been seen so many times in the post-colonial era. Democracy in the West was the flowering of institutions and traditions a thousand years in the making, none of which have ever existed in the Arab world. Those who expect democracy to create those institutions, the author would argue, suffer from an acute case of inverting causes and effects.

It's tempting to dismiss Arab culture as described here as “dysfunctional”, but (if the analysis be correct) I don't think that's a fair characterisation. Arab governance looks dysfunctional through the eyes of Westerners who judge it by the values their own cultures cherish, but then turnabout's fair play, and Arabs have many criticisms of the West which are equally well founded upon their own values. I'm not going all multicultural here—there's no question that by almost any objective measure (per capita income; industrial and agricultural output; literacy and education; treatment of women and minorities; public health and welfare; achievements in science, technology, and the arts) the West has drastically outperformed Arab nations, which would be entirely insignificant in the world economy absent their geological good fortune of sitting on top of an ocean of petroleum. But again, that's applying Western metrics to Arab societies. When Nasser seized power in Egypt, he burned with a desire to do the will of the Egyptian people. And like so many people over the millennia who tried to get something done in Egypt, he quickly discovered that the will of the people was to be left alone, and the will of the bureaucracy was to go on shuffling paper as before, counting down to retirement as they'd done for centuries. In other words, by their lights the system was working, and they valued stability over the risks of change. There is also what might be described as a cultural natural selection effect in action here. In a largely static authoritarian society, the ambitious, the risk-takers, and the innovators are disproportionately prone to emigrate to places which value those attributes, namely the West. This deprives those who remain of the élite which might improve the general welfare, leaving a population even more content with the status quo.

The deeply pessimistic message of this book is that neither wishful thinking, soaring rhetoric, global connectivity, precision guided munitions, nor armies of occupation can do very much to change a culture whose general way of doing things hasn't changed fundamentally in more than two millennia. While change may be possible, it certainly isn't going to happen on anything less than the scale of several generations, and then only if the cultural transmission belt from generation to generation can be interrupted. Is this depressing? Absolutely, but if this is the case, better to come to terms with it and act accordingly than live in a fantasy world where one's actions may lead to catastrophe for both the West and the Arab world.


Thor, Brad. The Last Patriot. London: Pocket Books, 2008. ISBN 978-1-84739-195-7.
This is a page-turning thriller which requires somewhat more suspension of disbelief than the typical book of the genre. The story involves, inter alia, radical Islam, the assassination of Mohammed, the Barbary pirates, Thomas Jefferson, a lost first edition of Don Quixote, puzzle boxes, cryptography, car bombs, the French DST, the U.S. president, and a plan to undermine the foundations of one of the world's great religions.

If this seems to cross over into the territory of a Dan Brown novel or the National Treasure movies, it does, and like those entertainments, you'll enjoy the ride more if you don't look too closely at the details or ask questions like, “Why is the President of the United States, with the resources of the NSA at his disposal, unable to break a simple cylinder substitution cipher devised more than two centuries ago?”. Still, if you accept this book for what it is, it's a fun read; this would make an excellent “airplane book”, at least as long as you aren't flying to Saudi Arabia—the book is banned in that country.

A U.S. edition is available.


Emison, John Avery. Lincoln über Alles. Gretna, LA: Pelican Publishing, 2009. ISBN 978-1-58980-692-4.
Recent books, such as Liberal Fascism (January 2008), have explored the roots and deep interconnections between the Progressive movement in the United States and the philosophy and policies of its leaders such as Theodore Roosevelt and Woodrow Wilson, and collectivist movements in twentieth century Europe, including Soviet communism, Italian fascism, and Nazism in Germany. The resurgence of collectivism in the United States, often now once again calling itself “progressive”, has made this examination not just a historical footnote but rather an important clue in understanding the intellectual foundations of the current governing philosophy in Washington.

Among those willing to set aside accounts of history written by collectivists (whether they style themselves progressives or “liberals”) and to look instead at contemporary sources and at analyses by genuine classical liberals, a candid look at progressivism and its consequences for liberty and prosperity has led to a dramatic reassessment of the place in history of Wilson and the two Roosevelts. In an academy and educational establishment still overwhelmingly dominated by collectivists this remains a minority view, but at least serious research into this dissenting reading of history is available to anybody interested in searching it out.

Far more difficult to find is a critical examination of the U.S. president who was, according to this account, the first and most consequential of all American progressives, Abraham Lincoln. Some years ago, L. Neil Smith, in his essay “The American Lenin”, said that if you wanted to distinguish a libertarian from a conservative, just ask them about Abraham Lincoln. This observation has been amply demonstrated by the recent critics of progressivism, almost all conservatives of one stripe or another, who have either remained silent on the topic of Lincoln or jumped on the bandwagon and praised him.

This book is a frontal assault on the hagiography of Sainted Abe. Present day accounts of Lincoln's career and the Civil War contain so many omissions and gross misrepresentations of what actually happened that it takes a book of 300 pages like this one, based in large part on contemporary sources, to provide the context for a contrary argument. Topics many readers well-versed in the conventional wisdom view of American history may encounter for the first time here include:

  • No constitutional provision prohibited states from seceding, and the common law doctrine prohibiting legislative entrenchment (one legislature binding the freedom of a successor to act) granted sovereignty conventions the same authority to secede as to join the union in the first place.
  • None of the five living former presidents at the time Lincoln took office (only one a Southerner) supported military action against the South.
  • Lincoln's Emancipation Proclamation freed only slaves in states of the Confederacy; slaves in slave states which did not secede, including Delaware, Maryland, Kentucky, and Missouri, remained in bondage. In fact, in 1861, Lincoln had written to the governors of all the states urging them to ratify the Corwin Amendment, already passed by the House and Senate, which would have written protection for slavery and indentured servitude into the Constitution. Further, Lincoln supported the secession of West Virginia from Virginia and its admission to the Union as a slave state. Slavery was not abolished throughout the United States until the adoption of the Thirteenth Amendment in December 1865, after Lincoln's death.
  • Despite subsequent arguments that secession was illegal, Lincoln mounted no legal challenge to the declarations of secession prior to calling for troops and initiating hostilities. Congress voted no declaration of war authorising Lincoln to employ federal troops.
  • The prosecution of total war against noncombatants in the South by Sherman and others, with the approval of Grant and Lincoln, not only constituted war crimes by modern standards, but was also prohibited by the Lieber Code governing the conduct of the Union armies, signed by President Lincoln in April 1863.
  • Like the progressives of the early 20th century who looked to Bismarck's Germany as their model, and present-day U.S. progressives who want to remodel their country along the lines of the European social democracies, Lincoln's Republican party had philosophical underpinnings which were Made in Germany, as did a number of its political and military figures and the voters who put it over the top in the states of the “old northwest”. The “Forty-Eighters”, supporters of the failed 1848 revolutions in Europe, emigrated in subsequent years to the U.S. where, as members of the European élite, they established themselves as leaders in their new communities. They were supporters of a strong national government, progressive income taxation, direct election of Senators, nationalisation of railroads and other national infrastructure, an imperialistic foreign policy, and secularisation of the society—all parts of the subsequent progressive agenda, and all achieved or almost so today. An estimate of the impact of the Forty-Eighters on the 1860 election (at the time, in many states immigrants who were not yet citizens could vote if they simply declared their intention to become naturalised) shows that they provided Lincoln's margin of victory in Illinois, Indiana, Iowa, Michigan, Minnesota, Ohio, and Wisconsin (although some of these races were close and may have gone the other way).

Many of these points will be fiercely disputed by Lincoln scholars and defenders; see the arguments here, follow up their source citations, and make up your own mind. What is not in dispute is that the Civil War and the policies advocated by Lincoln and implemented by his administration and its Republican successors fundamentally changed the relationship between the Federal government and the states. Before, the Federal government was the creation of the states, which voluntarily delegated to it limited and enumerated powers and retained the right to reclaim them by leaving the union; afterward, Washington became not a federal government but a national government in the 19th century European sense, with the states increasingly becoming administrative districts charged with carrying out its policies and having no recourse when their original sovereignty was violated. A “national greatness” policy was aggressively pursued by the central government, including subsidies and land grants for building infrastructure, expansion into the Western territories (with repeatedly broken treaties and genocidal wars against their native populations), and high tariffs to protect industrial supporters in the North. It was Lincoln who first brought European-style governance to America, and in so doing became the first progressive president.

Now, anybody who says anything against Lincoln will immediately be accused of being a racist who wishes to perpetuate slavery. Chapter 2, a full 40 pages of this book, is devoted to race in America before, during, and after the Civil War. Once again, you will learn that the situation is far more complicated than you believed it to be. There is plenty of blame to go around on all sides; after reviewing the four-page list of Jim Crow laws passed by Northern states between 1777 and 1868, it is hard to regard them as champions of racial tolerance on a crusade to liberate blacks in the South.

The greatest issue regarding the Civil War, discussed only rarely now, is why it happened at all. If the war was about slavery (as most people believe today), then why, among all the many countries and colonies around the world which abolished slavery in the nineteenth century, was it only in the United States that abolition required a war? If, however, the war is regarded not as a civil war (which it wasn't, since the southern states did not wish to conquer Washington and impose their will upon the union), nor as a “war between the states” (because it wasn't the states of the North fighting the states of the South, but rather the federal government seeking to impose its will upon states which no longer wished to belong to the union), but rather as an imperial conquest, waged as a war of annihilation if necessary, by a central government over a recalcitrant territory which refused to cede its sovereignty, then the war makes perfect sense, and is entirely consistent with the subsequent wars waged by Republican administrations to assert sovereignty over Indian nations.

Powerful central government, elimination of state and limitation of individual autonomy, imposition of uniform policies at a national level, endowing the state with a monopoly on the use of force and the tools to impose its will, grandiose public works projects funded by taxation of the productive sector, and sanguinary conflicts embarked upon in the interest of moralistic purity or national glory: these are all hallmarks of progressives, and this book makes a persuasive case that Lincoln was the first of their kind to gain power in the United States. Should liberty blossom again there, and the consequences of progressivism be candidly reassessed, there will be two faces to come down from Mount Rushmore, not just one.


Flynn, Vince. Consent to Kill. New York: Pocket Books, 2005. ISBN 978-1-4165-0501-3.
This is the sixth novel in the Mitch Rapp (warning—the article at this link contains minor spoilers) series. In the aftermath of Memorial Day (December 2009), a Saudi billionaire takes out a contract on Mitch Rapp, whom he blames for the death of his son. Working through a cut-out, an assassin (one of the most interesting and frightening villains in the Vince Flynn yarns I've read so far—kind of an evil James Bond) is recruited to eliminate Rapp, ideally making it look like an accident to avoid further retribution. The assassin is conflicted: on the one hand he respects Rapp, but on the other he is excited by the challenge of going after the hardest target of all and ending his career with not just a crowning victory but a financial reward large enough to get out of the game.

Things do not go as planned, and the result is a relentless grudge match as Rapp pursues his attackers like Nemesis. This is a close-up, personal story rather than a high-concept thriller like Memorial Day, and is more morality play than edge-of-the-seat page-turner. Once again, Flynn takes the opportunity to skewer politicians who'd rather excuse murderers than risk bad press. Although events and characters from earlier novels figure in this story, you can enjoy this one without having read any of the others.

Vince Flynn is acclaimed for the attention to detail in his novels, due not only to his own extensive research but a “brain trust” of Washington insider fans who “brief him in” on how things work there. That said, this book struck me as rather more sloppy than the others I've read, fumbling not super-geeky minutiæ but items I'd expect any editor with a sharp red pencil to finger. Below are some examples; while none are major plot spoilers, I've put them in a spoiler block just in case, but also for readers who'd like to see if they can spot them for themselves when they read the novel, then come back here and compare notes.

Spoiler warning: Plot and/or ending details follow.  
I'll cite these by chapter number, because I read the Kindle edition, which doesn't use conventional page numbers.

Chapter 53: “The sun was falling in the east, shooting golden streaks of light and shadows across the fields.” Even in CIA safe houses where weird drug-augmented interrogations are performed, the sun still sets in the west.

Chapter 63: “The presidential suite at the Hotel Baur Au Lac [sic] was secured for one night at a cost of 5,000 Swiss francs. … The suite consisted of three bedrooms, two separate living rooms, and a verandah that overlooked Lake Geneva.” Even the poshest of hotels in Zürich do not overlook Lake Geneva, seeing as it's on the other end of the country, more than 200 kilometres away! I presume he intended the Zürichsee. And you don't capitalise “au”.

Chapter 73: “Everyone on Mitch's team wore a transponder. Each agent's location was marked on the screen with a neon green dot and a number.” A neon dot would be red-orange, not green—how quickly people forget.

Chapter 78: “The 493 hp engine propelled the silver Mercedes down the Swiss autobahn at speeds sometimes approaching 150 mph. … The police were fine with fast driving, but not reckless.” There is no speed limit on German Autobahnen, but I can assure you that the Swiss police are anything but “fine” with people driving twice the speed limit of 120 km/h on their roads.
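
Since this howler turns on a unit conversion, here is a minimal Python check of the arithmetic (the 150 mph figure is the novel's; 120 km/h is the actual general Swiss motorway limit):

    # Check the novel's "approaching 150 mph" against the Swiss motorway limit.
    KM_PER_MILE = 1.609344            # definition of the international mile

    speed_kmh = 150 * KM_PER_MILE     # the novel's figure, in km/h
    limit_kmh = 120                   # general Swiss motorway speed limit

    print(f"150 mph = {speed_kmh:.0f} km/h, "
          f"or {speed_kmh / limit_kmh:.1f} times the {limit_kmh} km/h limit")
    # Prints: 150 mph = 241 km/h, or 2.0 times the 120 km/h limit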

Spoilers end here.  
The conclusion is somewhat surprising. Whether we're beginning to see a flowering of compassion in Mitch Rapp or just a matter of professional courtesy is up to the reader to decide.


April 2010

Todd, Emmanuel. Après la démocratie. Paris: Gallimard, 2009. ISBN 978-2-07-078683-1.
This book is simultaneously enlightening, thought-provoking, and infuriating. The author is known for having forecast the collapse of the Soviet Union in 1976 and, in 2002, the end of U.S. hegemony in the political, military, and financial spheres, as we are currently witnessing. In the present work, he returns his focus to Europe, and France in particular, and examines how the economic consequences of globalisation, the emergence of low-wage economies such as China and India in direct competition with workers in the developed West, the expansion of college education from a small fraction to around a third of the population, changes in the structure of the family due to a longer lifespan and marital customs, the near eclipse of Christianity as a social and moral force in Western Europe, and the collapse of traditional political parties with which individuals would identify over long periods of time have led to a crisis in confidence among the voting public in the élites who (especially in France) have traditionally governed them, escalating to a point where serious thinkers question the continued viability of democratic governance.

Dubiety about democracy is neither limited to the author nor to France: right-like-a-stopped-clock pundit Thomas Friedman has written admiringly of China's autocracy compared to the United States, Gaia theorist James Lovelock argues that “climate change” may require the West to “put democracy on hold for a while” while other ManBearPig fabulists argue that the “failure of democracy” on this issue requires it to give way to “a form of authoritarian government by experts”.

The take in the present book is somewhat different, drawing on Todd's demographic and anthropological approach to history and policy. He argues that liberal democracy, as it emerged in Britain, France, and the United States, had as a necessary condition a level of literacy among the population of between one third and two thirds. With a lower level of literacy the general population is unable to obtain the information they need to form their own conclusions, and if a society reaches a very high level of literacy without having adopted democratic governance (for example Germany from Bismarck through World War II or the Soviet Union), then the governing structure is probably sufficiently entrenched so as to manage the flow of information to the populace and suppress democratic movements. (Actually, the author would like to believe that broad-based literacy is a necessary and sufficient condition for democracy in the long run, but to this reader he didn't make the sale.)

Once democratic governance is established, literacy tends to rise toward 100% both because governments promote it by funding education and because the citizenry has an incentive to learn to read and write in order to participate in the political process. A society with universal literacy and primary education, but only a very small class with advanced education tends to be stable, because broad political movements can communicate with the population, and the élites which make up the political and administrative class must be responsive to the electorate in order to keep their jobs. With the broad population starting out with pretty much the same educational and economic level, the resulting society tends toward egalitarianism in wealth distribution and opportunity for advancement based upon merit and enterprise. Such a society will be an engine of innovation and production, and will produce wealth which elevates the standard of living of its population, yielding overall contentment which stabilises the society against radical change.

In the twentieth century, and particularly in the latter half, growing prosperity in developed nations led to a social experiment on a massive scale entirely unprecedented in human history. For the first time, universal secondary education was seen as a social good (and enforced by compulsory education and rising school-leaving ages), with higher (college/university) education for the largest possible fraction of the population becoming the ultimate goal. Indeed, political rhetoric in the United States presently advocates making college education available to all. In France, the number of students in “tertiary” education (the emerging term of art, which avoids calling it “superior” and thereby implying that those without it are inferior) burgeoned from 200,000 in 1950 to 2,179,000 in 1995, an increase of 990%, while the total population grew just 39% (p. 56). Since then, the rate of higher education has remained almost constant, with the number of students growing only 4% between 1995 and 2005, precisely the increase in population during that decade. The same plateau was achieved earlier in the U.S., while Britain, which began the large-scale expansion of higher education later, attained a comparable level only in recent years, so it's too early to tell whether the same ceiling will appear there.

The author calls this “stagnation” in education and blames it for a cultural pessimism afflicting all parts of the political spectrum. (He does not discuss the dumbing-down of college education which has accompanied its expansion and the attendant devaluing of the credential; this may be less the case on the Continent than in the Anglosphere.) These societies now have a substantial portion of their population, around one third, nominally equipped with an education previously reserved for a tiny élite, whose career prospects are limited simply because there aren't enough positions at the top to go around. At the same time, the educational stratification of society into a tiny governing class, a substantial educated class inclined to feel entitled to economic rewards for all the years of their lives spent sitting in classrooms, and a majority with a secondary education strikes a blow at egalitarianism, especially in France, where broad-based equality of results has been a central part of the national identity since the Revolution.

The pessimism created by this educational stagnation has, in the author's view, been multiplied to the point of crisis by what he considers to be a disastrous embrace of free trade. While he applauds the dismantling of customs barriers in Europe and supported the European “Constitution”, he blames the abundance of low-wage workers in China and India for what he sees as relentless pressure on salaries in Europe and the loss of jobs due to outsourcing of manufacturing and, increasingly, service and knowledge worker jobs. He sees this as benefiting a tiny class, maybe 1% of the population, to the detriment of all the rest. Popular dissatisfaction with this situation, and frustration in an environment where all major political parties across the ideological spectrum are staunch defenders of free trade, has led to the phenomenon of “wipeout” elections, where the dominant political party is ejected in disgust, only to be replaced by another which continues the same policies and in turn is rejected by the electorate.

Where will it all end? Well, as the author sees it, with Nicolas Sarkozy. He regards Sarkozy and everything he represents with such an actinic detestation that one expects the crackling of sparks and odour of ozone when opening the book. Indeed, he uses Sarkozy's personal shortcomings as a metaphor for what's wrong with France, and as the structural device for the book as a whole. And yet he is forced to come to terms with the fact that Sarkozy was elected with the votes of 53% of French voters after, in the first round, effectively wiping out the National Front, Communists, and Greens. And yet, echoing voter discontent, in the municipal elections a year later, the left was seen as the overall winner.

How can a democratic society continue to function when the electorate repeatedly empowers people who are neither competent to govern nor aligned with the self-interest of the nation and its population? The author sees only three alternatives. The first (p. 232) is the redefinition of the state from a universal polity open to all races, creeds, and philosophies to a racially or ethnically defined state united in opposition to an “other”. The author sees Sarkozy's hostility to immigrants as evidence for such a redefinition in France, but does not believe it will succeed in diverting the electorate's attention from a falling standard of living, which is due to globalisation, not to the immigrant population. The second possibility he envisions (p. 239) is the elimination, either outright or effectively, of universal suffrage at the national level and its replacement by government by unelected bureaucratic experts with authoritarian powers, along the general lines of the China so admired by Thomas Friedman. Elections would be retained for local officials, preserving the appearance of democracy while decoupling it from governance at the national level. Lest this seem an absurd possibility, as the author notes on p. 246, this is precisely the model emerging for continental-scale government in the European Union. Voters in member states elect members to a European “parliament” which has little real power, while the sovereignty of national governments is inexorably ceded to the unelected European Commission. Note that only a few member states allowed their voters a referendum on the European “constitution” or its zombie reanimation, the Treaty of Lisbon.

The third alternative, presented in the conclusion to the work, is the only one the author sees as preserving democracy. This would be for the economic core of Europe, led by France and Germany, to adopt an explicit policy of protectionism, imposing tariffs on imports from low-wage producers with the goal of offsetting the wage differential and putting an end to the pressure on European workers, the outsourcing of jobs, and the consequent destruction of the middle class. This would end the social and economic pessimism in European societies, realign the policies of the governing class with the electorate, and restore the confidence among voters in those they elect which is essential for democracy to survive. (Due to its centuries-long commitment to free trade and alignment with the United States, Todd does not expect Great Britain to join such a protectionist regime, but believes that if France and Germany were to proclaim such a policy, their economic might and influence in the European Union would be sufficient to pull in the rest of the Continent and build a Wirtschaftsfestung Europa from the Atlantic to the Russian border.) In such a case, and only in that case, the author contends, will what comes after democracy be democracy.

As I noted at the start of these comments, I found this book, among other things, infuriating. If that's all it were, however, I would neither have finished it nor spent the time to write such a lengthy review. The work is worth reading, if for nothing else, to get a sense of the angst and malaise in present-day Europe, where it is beginning to dawn upon the architects and supporters of the social democratic welfare state that it is not only no longer competitive in the global economy but also unsustainable within its own borders in the face of a demographic collapse and a failure to generate new enterprises and employment brought about by its own policies. Amidst foreboding that there are bad times just around the corner, and faced with an electorate which empowers candidates whom leftists despise for being “populist”, “crude”, and otherwise not the right kind of people, there is a tendency among the Left to claim that “democracy is broken”, and that only radical, transformative change (imposed from the top down, against the will of the majority, if necessary) can save democracy from itself. This book is, I believe, an exemplar of this genre. I would expect several such books authored by leftist intellectuals to appear in the United States in the first years of a Palin administration.

What is particularly aggravating about the book is its refusal to look at the causes of the problems it proposes to address through a protectionist policy. Free trade did not create the regime of high taxation, crushing social charges, inability to dismiss incompetent workers, short work weeks and long vacations, high minimum wages and other deterrents to entry level jobs, and regulatory sclerosis which have made European industry uncompetitive, and high tariffs alone will not solve any of these problems, but rather simply allow them to persist for a while within a European bubble increasingly decoupled from the world economy. That's pretty much what the Soviet Union did for seventy years, if you think about it, and how well did that work out for the Soviet people?

Todd is so focused on protectionism as panacea that he Panglosses over major structural problems in Europe which would be entirely unaffected by its adoption. He dismisses demographic collapse as a problem for France, noting that the total fertility rate has risen over the last several years back to around 2 children per woman, the replacement rate. What he doesn't mention is that this is largely due to a high fertility rate among Muslim immigrants from North Africa, whose failure to assimilate and enter the economy is a growing crisis in France, as in other Western European countries. The author dismisses this with a wave of the hand, accusing Sarkozy of provoking the “youth” riots of 2005 to further his own career, and arguing that the episode was genuinely a revolt of discouraged youth against the ruling class which had little to do with Islam or ethnic conflict. One wonders how much time Dr. Todd has spent in the “no go” Muslim banlieues of Paris and other large European cities.

Further, Todd supports immigration and denounces restrictionists as opportunists seeking to distract the electorate with a scapegoat. But how is protectionism (closing the border to products from low-wage countries) going to work, precisely, if the borders remain open to people from the Third World, many lacking any skills equipping them to participate in a modern industrialised society, and bringing with them, in many cases, belief systems hostile to the pluralism, egalitarianism, secularism, and tolerance of European nations? If the descendants of immigrants do not assimilate, they pose a potentially disastrous social and political problem, while if they do, their entry into the job market will put pressure on wages just as surely as goods imported from China.

Given Todd's record in predicting events conventional wisdom deemed inconceivable, one should be cautious in dismissing his analysis here, especially as it is drawn from the same kind of reasoning, based in demographics, anthropology, and economics, which informs his other work. If nothing else, it provides an excellent view of how more than fifty years' journey down the social democratic road to serfdom brings into doubt how long the “democratic” part, as well as the society itself, can endure.


Rand, Ayn. Atlas Shrugged. New York: Dutton, [1957, 1992] 2005. ISBN 978-0-525-94892-6.
There is nothing I could possibly add by way of commentary on this novel, a classic of twentieth century popular fiction, one of the most discussed books of the epoch, and, more than fifty years after publication, still (at this writing) in the top two hundred books by sales rank at Amazon.com. Instead, I will confine my remarks to my own reactions upon reading this work for the third time and how it speaks to events of the present day.

I first read Atlas Shrugged in the summer of that most eventful year, 1968. I enjoyed it immensely, finding it not just a gripping story, but also, as Rand intended, a thorough (and in some ways, too thorough) exposition of her philosophy as glimpsed in The Fountainhead, which I'd read a few years earlier. I took it as an allegorical story about the pernicious effects and ultimate consequences of collectivism and the elevation of altruism over self-interest and need above earned rewards, but viewed the world in which it was set and the events which occurred there much as I did those of Orwell's 1984 and Heinlein's If This Goes On—: a cautionary tale showing the end point of trends visible in the contemporary world. But the world of Atlas Shrugged, like those of Orwell and Heinlein, seemed very remote from that of 1968—we were going to the Moon, and my expectations for the future were more along the lines of 2001 than Rand's dingy and decaying world. Also, it was 1968, for Heaven's sake, and I perceived the upheavals of the time (with a degree of naïveté and wrongheadedness I find breathtaking at this remove) as a sovereign antidote to the concentration of power and oppression of the individual, which would set things aright long before productive people began to heed Galt's call to shed the burden of supporting their sworn enemies.

My next traverse through Atlas Shrugged was a little before 1980. The seventies had taken a lot of the gloss off the bright and shiny 1968 vision of the future, and, having run a small business for the latter part of that sorry decade, I had very much on my mind the encroachment of ever-rising taxes, regulation, and outright obstruction by governments at all levels; this, along with the monetary and financial crises created by those policies, plus a rising swamp of mysticism, pseudoscience, and the ascendant anti-human pagan cult of environmentalism, made it entirely plausible to me that the U.S. might tip over into the kind of accelerating decline described in the middle part of the novel. This second reading of the book left me with a very different impression than the first. This time I could see, from my own personal experience and in the daily news, precisely the kind of events foreseen in the story. It was no longer a cautionary tale but instead a kind of hitch-hiker's guide to the road to serfdom. Curiously, this reading of the book caused me to shrug off the funk of demoralisation and discouragement and throw myself back into the entrepreneurial fray. I believed that the failure of collectivism was so self-evident that a turning point was at hand, and the landslide election of Reagan shortly thereafter appeared to bear this out. The U.S. was committed to a policy of lower taxes, rolling back regulations, standing up to aggressive collectivist regimes around the world, and opening the High Frontier with economical, frequent, and routine access to space (remember that?). While it was hardly the men of the mind returning from Galt's Gulch, it was good enough for me, and I decided to make the best of it and contribute what I could to what I perceived as the turnaround. As a footnote, it's entirely possible that if I hadn't reread Atlas Shrugged around this time, I would have given up on entrepreneurship and gone back to work for the Man—so in a way, this book was in the causal tree which led to Autodesk and AutoCAD. In any case, although working myself to exhaustion and observing the sapping of resources by looters and moochers after Autodesk's initial public stock offering in 1985, I still felt myself surfing on a wave of unbounded opportunity and remained unreceptive to Galt's pitch in 1987. In 1994? Well….

What with the eruption of the most recent financial crisis, the veer toward the hard left in the United States, and increasing talk of productive people opting to “go Galt”, I decided it was time for another pass through Atlas Shrugged, so I started reading it for the third time in early April 2010 and finished it in a little over two weeks, including some marathon sessions where I just didn't want to put it down, even though I knew the characters, principal events, and the ending perfectly well. What was different, and strikingly so, from the last read three decades ago, was how astonishingly prescient this book, published in 1957, was about events unfolding in the world today. As I noted above, in 1968 I viewed it as a dystopia set in an unspecified future. By 1980, many of the trends described in the book were clearly in place, but few of their ultimate dire consequences had become evident. In 2010, however, the novel is almost like reading a paraphrase of the history of the last quarter century. “Temporary crises”, “states of emergency”, “pragmatic responses”, calls to “sacrifice for the common good” and to “share the wealth” which seemed implausible then are the topics of speeches by present day politicians and news headlines. Further, the infiltration of academia and the news media by collectivists, their undermining the language and (in the guise of “postmodernism”) the foundations of rational thought and objective reality, which were entirely beneath the radar (at least to me) as late as 1980, are laid out here as clear as daylight, with the simultaneously pompous and vaporous prattling of soi-disant intellectuals which doubtless made the educated laugh when the book first appeared now having become commonplace in the classrooms of top tier universities and journals of what purport to be the humanities and social sciences. What once seemed a fantastic nightmare painted on a grand romantic canvas is in the process of becoming a shiveringly accurate prophecy.

So, where are we now? Well (if you'll allow me to use the word) objectively, I found the splice between our real-life past and present to be around the start of chapter 5 of part II, “Account Overdrawn”. This is about 500 pages into the hardback edition of 1168 pages, or around 40%. Obviously, this is the crudest of estimates—many things occur before that point which haven't yet happened in the real world, and many afterward have already come to pass. Yet still, it's striking: who would have imagined piracy on the high seas to be a headline topic in the twenty-first century? On this reading I was also particularly struck by chapter 8 of part III, “The Egoist” (immediately following Galt's speech), which directly addresses a question I expect will soon intrude into the public consciousness: the legitimacy or lack thereof of nominally democratic governments. This is something I first wrote about in 1988, but never expected to actually see come onto the agenda. A recent Rasmussen poll, however, finds that just 21% of voters in the United States now believe that their federal government has the “consent of the governed”. At the same time, more than 40% of U.S. tax filers pay no federal income tax at all, and a majority receive more in federal benefits than they pay in taxes. The top 10% of taxpayers (by Adjusted Gross Income) pay more than 70% of all personal income taxes collected. This makes it increasingly evident that the government runs the risk of becoming, if it has not already become, a racket in which a non-taxpaying majority uses the coercive power of the state to shake down a shrinking taxpaying minority. This is precisely the vicious cycle which reaches its endpoint in this chapter, where the government loses all legitimacy in the eyes not only of its victims, but even of its beneficiaries and participants. I forecast that should this trend continue (and that's the way to bet), within two years we will see crowds of people in the U.S. holding signs demanding “By what right?”.

In summary, I very much enjoyed revisiting this classic; given that it was the third time through and I don't consider myself to have changed all that much in the many years since the first time, this didn't come as a surprise. What I wasn't expecting was how differently the story is perceived based on events in the real world up to the time it's read. From the current perspective, it is eerily prophetic. It would be amusing to go back and read reviews at the time of its publication to see how many anticipated that happening. The ultimate lesson of Atlas Shrugged is that the looters subsist only by the sanction of their victims and through the product of their minds, which cannot be coerced. This is an eternal truth, which is why this novel, which states it so clearly, endures.

The link above is to the hardbound “Centennial Edition”. There are trade paperback, mass market paperback, and Kindle editions available as well. I'd avoid the mass market paperback, as the type is small and the spines of books this thick tend to disintegrate as you read them. At current Amazon prices, the hardcover isn't all that much more than the trade paperback and will be more durable if you plan to keep it around or pass it on to others. I haven't seen the Kindle transfer; if it's well done, it would be marvellous, as any print edition of this book is more than a handful.


Hickam, Homer H., Jr. Back to the Moon. New York: Island Books, 1999. ISBN 978-0-440-23538-5.
Jerry Pournelle advises aspiring novelists to plan to throw away their first million words before mastering the craft and beginning to sell. (Not that writing a million words to the best of your ability and failing to sell them guarantees success, to be sure. It's just that most novelists who eventually become successful have a million words of unsold manuscripts in the trunk in the attic by the time they break into print and become well known.) When lightning strikes and an author comes from nowhere to bestseller celebrity overnight, there is a strong temptation, not only for the author but also for the publisher, to dig out those unsold manuscripts, perhaps polish them up a bit, and rush them to market to capitalise upon the author's newfound name recognition. Pournelle writes, “My standard advice to beginning writers is that if you do hit it big, the biggest favor you can do your readers is to burn your trunk; but in fact most writers don't, and some have made quite a bit of money off selling what couldn't be sold before they got famous.”

Here, I believe, we have an example of what happens when an author does not follow that sage advice. Homer Hickam's Rocket Boys (July 2005), a memoir of his childhood in West Virginia coal country at the dawn of the space age, burst onto the scene in 1998, rapidly climbed the New York Times bestseller list, and was made into the 1999 film October Sky. Unknown NASA engineer Hickam was suddenly a hot literary property, and pressure to “sell the trunk” was undoubtedly intense. Out of the trunk, onto the press, into the bookshops—and here we have it, still in print a decade later.

The author joined NASA's Marshall Space Flight Center in 1981 as an aerospace engineer and worked on a variety of projects involving the Space Shuttle, including training astronauts for a number of demanding EVA missions. In the Author's Note, he observes that, while initially excited to work on the first reusable manned spacecraft, he, like many NASA engineers, eventually became frustrated with going in circles around the Earth and wished that NASA could once again send crews to explore as they had in the days of Apollo. He says, “I often found myself lurking in the techno-thriller or science fiction area of bookstores looking unsuccessfully for a novel about a realistic spacecraft, maybe even the shuttle, going back to the moon. I never found it. One day it occurred to me that if I wanted to read such a book, I would have to write it myself.”

Well, here it is. And if you're looking for a thriller about a “realistic spacecraft, maybe even the shuttle, going back to the moon”, sadly, you still haven't found it. Now, the odd thing is that this book is actually quite well written—not up to the standard of Rocket Boys, but hardly the work of a beginner. It is tightly plotted, the characters are interesting and develop as the story progresses, and the author deftly balances multiple plot lines with frequent “how are they going to get out of this?” cliffhangers, pulling it all together at the end. These are things you'd expect an engineer to have difficulty mastering as a novelist. You'd figure, however, that somebody with almost two decades of experience going to work every day at NASA and daily contacts with Shuttle engineers and astronauts would get the technical details right, or at least make them plausible. Instead, what we have is a collection of laugh-out-loud howlers for any reader even vaguely acquainted with space flight. Not far into the book (say, fifty or sixty pages, or about a hundred “oh come on”s), I realised I was reading the literary equivalent of the Die Hard 2 movie, which the Wall Street Journal's reviewer dubbed “aviation for airheads”. The present work, “spaceflight for space cases”, is much the same: it works quite well as a thriller as long as you know absolutely nothing about the technical aspects of what's going on. It's filled with NASA jargon and acronyms (mostly used correctly) which lend it a feeling of authenticity, much like Tom Clancy's early books. However, Clancy (for the most part) gets the details right: he doesn't, for example, have a submarine suddenly jump out of the water, fly at Mach 5 through the stratosphere, land on a grass runway in a remote valley in the Himalayas, then debark an assault team composed of amateurs who had never before fired a gun.

Shall we go behind the spoiler curtain and take a peek at a selection of the most egregious and side splitting howlers in this yarn?

Spoiler warning: Plot and/or ending details follow.  
  • Apollo 17 landed in the Taurus-Littrow region, not “Frau [sic] Mauro”. Apollo 14 landed at Fra Mauro.
  • In the description of the launch control centre, it is stated that Houston will assume control “the moment Columbia lifted a millimeter off the Cape Canaveral pad”. In fact, Houston assumes control once the launch pad tower has been cleared.
  • During the description of the launch, the ingress team sees the crew access arm start to retract and exclaims “Automatic launch sequence! We've got to go!”. In fact, the ingress team leaves the pad before the T−9 minute hold, and the crew access arm retracts well before the automatic sequence starts at T−31 seconds.
  • There are cameras located all over the launch complex which feed into the launch control centre. Disabling the camera in the white room would still leave dozens of other cameras active which would pick up the hijinks underway at the pad.
  • NASA human spaceflight hardware is manufactured and prepared for flight under the scrutiny of an army of inspectors who verify every aspect of the production process. Just how could infiltrators manage to embed payload in the base of the shuttle's external tank in the manufacturing plant at Michoud, and how could this extra cargo not be detected anywhere downstream? If the cargo was of any substantial size, the tank would fail fit tests on the launch platform, and certainly some pad rat would have said “that's not right” just looking at it.
  • Severing the data cable between the launch pad and the firing room would certainly cause the onboard automatic sequencer to halt the countdown. Even though the sequencer controls the launch process, it remains sensitive to a cutoff signal from the control centre, and loss of communications would cause it to abort the launch sequence. Further, the fact that the shuttle hatch was not closed would have caused the auto-sequencer to stop due to a cabin pressure alarm. And the hatch through which one boards the shuttle is not an “airlock”.
  • The description of the entire terminal countdown and launch process suffers from the time dilation common in bad movie thrillers, where several minutes of furious activity occur while the bomb counts down its last ten seconds.
  • The intended crew of the shuttle remains trapped in the pad elevator when the shuttle lifts off. They are described as having temporary hearing loss due to the noise. In fact, their innards would have been emulsified by the acoustic energy of the solid rocket boosters, then cremated and their ashes scattered by the booster plume.
  • The shuttle is said to have entered a 550 mile orbit with the external tank (ET) still attached. This is impossible; the highest orbit ever achieved by the shuttle was around 385 miles on the Hubble deployment and service missions, and this was a maximum-performance effort. Not only could the shuttle not reach 550 miles on the main engines, the orbital maneuvering system (OMS) would not have the velocity change capability (delta-V) required to circularise the orbit at this altitude with the ET still attached. And by the way, who modified the shuttle computer ascent software to change the launch trajectory and bypass ET jettison, and who loaded the modified software into the general purpose computers, and why was the modified software not detected by the launch control centre's pre-launch validation of the software load?
  • If you're planning a burn to get on a trans-lunar injection trajectory, you want to do it in as low an Earth orbit as possible in order to get the maximum benefit from the burn. An orbit as low as that used by the later Apollo missions probably wouldn't work due to the drag of having the ET attached, but there's no reason you'd want to go as high as 550 miles; that's just wasting energy (see the back-of-the-envelope check following this list).
  • The “Big Dog” and “Little Dog” engines are supposed to have been launched on an Indian rocket, with the mission camouflaged as a failed communication satellite launch. But, whatever the magical properties of Big Dog, a storable propellant rocket (which it must be, since it's been parked in orbit for months waiting for the shuttle to arrive) with sufficient delta-V to boost the entire shuttle onto a trans-lunar trajectory, enter lunar orbit, and then leave lunar orbit to return to Earth would require a massive amount of fuel, be physically very large, and hence require a heavy lift launcher which (in addition to the Indians not possessing one) would not be used for a communications satellite mission. The Saturn S-IVB stage which propelled Apollo to the Moon was 17.8 metres long and 6.6 metres in diameter, and massed 119,000 kg fully fueled; it was boosting a stack less massive than a space shuttle, was used only for trans-lunar injection, not lunar orbit entry and exit, and burned higher performance hydrogen and oxygen propellants. Big Dog would not be a bolt-in replacement engine for the shuttle, but rather a massive rocket stage which could hardly be disguised as a communications satellite.
  • On the proposed “rescue” mission by Endeavour, commander Grant proposes dropping the space station node in the cargo bay in a “parking orbit”, whence the next shuttle mission could capture it and move it to the Space Station. But in order to rendezvous with Columbia, Endeavour would have to launch into its 28.7 degree inclination orbit, leaving the space station node there. The shuttle OMS does not remotely have the delta-V for a plane change to the 51 degree orbit of the station, so there is no way the node could be delivered to the station.
  • A first-time astronaut is a “rookie”, not “rooky”. A rook is a kind of crow or a chess piece.
  • Removing a space shuttle main engine (SSME) is a complicated and lengthy procedure on the ground, requiring special tools and workstands. It is completely impossible that this could be done in orbit, especially by two people with no EVA experience, working in a part of the shuttle where there are no handgrips or restraints for EVA work, and where the shuttle's arm (remote manipulator system) cannot reach. The same goes for attaching Big Dog as a replacement.
  • As Endeavour closes in, her commander worries that “[t]oo much RCS propellant had been used to sneak up on Columbia”. But it's the orbital maneuvering system (OMS), not the reaction control system (RCS) which is used in rendezvous orbit-change maneuvers.
  • It's “Chernobyl” (Чорнобиль), not “Chernoble”.
  • Why, on a mission where all the margins are stretched razor-thin, would you bring along a spare lunar lander when you couldn't possibly know you'd need it?
  • Olivia Grant flies from Moscow to Alma-Ata on a “TU-144 transport”. The TU-144 supersonic transport was retired from service in 1978 after only 55 scheduled passenger flights. Even if somebody put a TU-144 back into service, it certainly wouldn't take six hours for the flight.
  • Vice President Vanderheld says, “France, for one, has spent trillions on thermonuclear energy. Fusion energy would destroy that investment overnight.” But fusion is thermonuclear energy!
  • When the tethered landing craft is dropped on the Moon from the shuttle, its forward velocity will be 3,700 miles per hour, the same as the shuttle's. The only way for it to “hit the lunar surface at under a hundred miles per hour” would be for the shuttle to cancel its entire orbital velocity before dropping the lander and then, in order to avoid crashing into the lunar surface, do a second burn as it was falling to restore its orbital velocity. Imparting such a delta-V to the entire shuttle would require a massive burn, for which there would be no reason to have provided the fuel in the mission plan. Also, at the moment the shuttle started the burn to cancel its orbital velocity, the tether would string out behind the shuttle, not remain at its altitude above the Moon. (The sketch following this list also checks the 3,700 mile per hour figure.)
  • The Apollo 17 lunar module Challenger's descent stage is said to have made a quick landing and hence to have “at least half its propellant left”. Nonsense—while Cernan and Schmitt didn't land on fumes like Apollo 11 (and, to a lesser extent, Apollo 14), no Apollo mission landed with the tanks anywhere near half-full. In any case, unless I'm mistaken, residual descent engine propellant was dumped shortly after landing; this was certainly done on Apollo 11 (you can hear the confirmation on my re-mix of the Apollo 11 landing as heard in the Eagle's cabin), and I've never heard of its not being done on later missions.
  • Jack connects an improvised plug to the “electronic port used to command the descent engine” on Challenger. But there were no such “ports”—connections between the ascent and descent stages were hard-wired in a bundle which was cut in two places by a pyrotechnic “guillotine” when the ascent stage separated. The connections to the descent engine would be a mass of chopped cables which it would take a medusa of space Barney clips, plus information nobody aboard possesses, to reconnect.
  • Even if there were fuel and oxidiser left in the tanks of the descent stage, the helium used to pressure-feed the propellants to the engine would have been long gone. And the hypergolic combustion wouldn't make a “plume of orange and scarlet” (look at the Apollo 17 liftoff video), and without a guidance system for the descent engine, there would be no chance of entering lunar orbit.
  • The tether is supposed to be used to generate electrical power after the last fuel cell fails. But this is done far from the Earth, where the geomagnetic field (which falls off as the cube of the distance) is so feeble that the motional EMF induced across the length of the tether would be much too small to generate the required power.
  • Using the tether as an aerodynamic brake at reentry is absurd. The tether would have to dissipate the entire energy of a space shuttle decelerating from Mach 36 to Mach 25. Even if the tether did not immediately burn away (which it would), it would not have the drag to accomplish this in the time available before the shuttle hit the atmosphere (with the payload bay doors still open!). And the time between the tethered satellite entering the atmosphere and the shuttle hitting the stony blue would be a matter of seconds, far too little to close the payload bay doors.
  • “The space agency had gotten out of the operations business and moved into the forefront of research and development, handing over its scientific and engineering knowledge to American commercial space operators.” Now here we have an actually prophetic passage. Let's hope it comes to pass!
  • “[W]hen the sun goes down into the sea, just as it sinks out of sight, its rays flash up through the water. If you look fast, you'll see it—a green flash.” Well, no—actually the green flash is due to atmospheric refraction and has nothing to do with water.
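For those inclined to check the arithmetic behind the Big Dog, plane change, and lunar drop objections above, here is a minimal back-of-envelope sketch in Python. Every input in it (orbiter mass, specific impulse, delta-V budget, altitudes) is a round number I've assumed for illustration, not a figure taken from the novel or from NASA documentation:

    # Back-of-envelope orbital mechanics checks; all inputs are assumed
    # round numbers, good to perhaps ten percent.
    import math

    g0 = 9.81                      # standard gravity, m/s^2

    # 1. Big Dog: a storable propellant stage pushing a ~100 t orbiter
    #    through trans-lunar injection, lunar orbit entry, and lunar
    #    orbit departure (roughly 3.1 + 0.9 + 0.9 km/s).
    isp = 310.0                    # s, typical NTO/MMH engine (assumed)
    dv = 3100.0 + 900.0 + 900.0    # m/s, total delta-V budget
    orbiter = 100_000.0            # kg, loaded orbiter (assumed)
    ratio = math.exp(dv / (isp * g0))          # Tsiolkovsky mass ratio
    print(f"mass ratio {ratio:.2f}, propellant {orbiter*(ratio-1)/1000:.0f} t")
    # About 400 t of propellant: more than three fully fueled S-IVBs.

    # 2. Plane change from Columbia's 28.7 degree orbit to the
    #    station's 51 degree orbit, at low Earth orbit velocity.
    v_leo = 7730.0                 # m/s, circular velocity at ~300 km
    di = math.radians(51.0 - 28.7)
    print(f"plane change {2 * v_leo * math.sin(di / 2):.0f} m/s")
    # About 3,000 m/s, versus a few hundred for the entire OMS budget.

    # 3. Circular orbital velocity 100 km above the Moon: the speed the
    #    dropped lander starts with, and the delta-V the shuttle would
    #    have to shed and then regain for a gentle touchdown.
    mu_moon = 4.9048e12            # m^3/s^2, lunar gravitational parameter
    r = 1.7371e6 + 100.0e3         # m, lunar radius plus altitude
    v = math.sqrt(mu_moon / r)
    print(f"lunar orbital velocity {v:.0f} m/s = {v * 2.23694:.0f} mph")
    # About 1,630 m/s, or roughly 3,700 miles per hour.

However generously you round these numbers, the mission as described is short by whole rocket stages, not by margins.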

Apart from these particulars (and they are just a selection from a much larger assortment in the novel), the entire story suffers from what I'll call the “Tom Swift, let's go!” fallacy of science fiction predating the golden age of the 1930s. The assumption throughout this book is that people can design fantastically complicated hardware which interfaces with existing systems, place it into service with crews who have no training on the actual hardware and no experience of the demanding environment in which it will be used, cope with unexpected reverses on the fly (always having the requisite resources at hand to surmount the difficulties), and succeed in the end. Actually, I'm being unfair to Tom Swift in identifying such fiction with that character. The original Tom Swift novels always had him testing his inventions extensively before putting them into service, and modifying them based upon the test results. Not here: everything is not only good to go on the first shot, it is able to overcome disasters because the necessary hardware has always providentially been brought along.

Spoilers end here.  
If you've trudged through the spoiler block at my side, you may be exasperated and wondering why I'd spend so much time flensing such a bad novel. Well, it's because I'd hoped for so much and was sorely disappointed. Had the author not said the goal was to be “realistic”, I'd have put it down after the first fifty pages or so and, under the rules of engagement of this chronicle, you'd have never seen it here. Had it been presented as a “spaceflight fantasy”, I might have finished it and remarked about how well the story was told; hey, I give my highest recommendation to a story about a trip to the Moon launched from a 900 foot long cannon!

I'll confess: I've been wanting to write a back-to-the-Moon novel myself for at least thirty years. My scenario was very different (and I hereby place it into the public domain for scribblers more talented and sedulous than I to exploit): a signal with a complex encoding is detected originating from the Moon, at a site where no known probe has landed. The message is a number: “365”, “364”, “363”,… decrementing every day. Now, what would it take to go there and find out what was sending it before the countdown reaches zero? The story was to be full of standing in line to file forms to get rocket stages and capsules out of museums, back-channel discussions between Soviet and U.S. space officials, and eventual co-operation on a cobbled-together mission which would end up discovering…but then you'd have to have read the story. (Yes, much of this has been done in movies, but they all postdate this treatment.)

Since I'll probably never write that story, I'd hoped this novel would fill the niche, and I'm disappointed it didn't. If you know nothing about spaceflight and don't care about the details, this is a well-crafted thriller, which accounts for its many five star reviews at Amazon. If you care about technical plausibility, you can take this as either one of those books to hurl into the fireplace to warm you up on a cold winter evening or else as a laugh riot to enjoy for what it is and pass on to others looking for a diversion from the uncompromising physics of the real world.

Successful novelists, burn the trunk!

 Permalink

Landis, Tony R. and Dennis R. Jenkins. Experimental and Prototype U.S. Air Force Jet Fighters. North Branch, MN: Specialty Press, 2008. ISBN 978-1-58007-111-6.
This beautifully produced book covers every prototype jet fighter developed by the U.S. Air Force from the beginning of the jet age in the 1940s through the present day. Only concepts which at least entered the stage of prototype fabrication are included: “paper airplane” conceptual studies are not discussed, except in conjunction with designs which were actually built. The book is lavishly illustrated, with many photographs in colour, and the text is well written and almost free of typographical errors. As the title states, only Air Force prototypes are discussed—Navy and CIA development projects are covered only if Air Force versions were subsequently manufactured.

The first decade of the jet age was a wild and woolly time in the history of aeronautical engineering; we'll probably never see its like again. Compared to today's multi-decade development projects, many of the early jet designs went from contract award to flying hardware in less than a year. Between May 1953 and December 1956, no fewer than six operational jet fighter prototypes (F-100, F-101, F-102, F-104, F-105, and F-106) made their first flights. Among prototypes which never entered into serial production were concepts which illustrate the “try anything” spirit of the age. Consider, for example, the XP-81 which had a turboprop engine in the nose and a turbojet in the tail; the XF-84H with a turbine driven propeller whose blade tips exceeded the speed of sound and induced nausea in pilots and ground crews, who nicknamed it “Thunderscreech”; or the tiny XP-85 which was intended to be carried in the bomb bay of a B-36 and launched to defend the bomber should enemy interceptors attack.

So slow has been the pace of fighter development since 1960 that the first 200 pages of the book cover events up to 1960 and everything since occupies only forty pages. Recent designs are covered in the same detail as those of the golden age—it's just that there haven't been all that many of them.

If you enjoy this book, you'll probably also want to read the companion, U.S. Air Force Prototype Jet Fighters Photo Scrapbook, which collects hundreds of photographs of the planes featured in the main work which, although often fascinating, didn't make the cut for inclusion in it. Many photos, particularly of newer planes, are in colour, although some older colour shots have noticeably faded.

 Permalink

May 2010

Invisible Committee, The. The Coming Insurrection. Los Angeles: Semiotext(e)/MIT Press, [2007] 2009. ISBN 978-1-58435-080-4.
I have not paid much attention to the “anti-globalisation” protesters who seem to pop up at gatherings of international political and economic leaders, for example at the WTO Ministerial Conference in Seattle in 1999 and the Genoa G8 Summit in 2001. In large part this is because I have more interesting things with which to occupy my time, but also because, despite saturation media coverage of such events, I was unable to understand the agenda of the protesters, apart from smashing windows and hurling epithets and improvised projectiles at the organs of state security. I understood what they were opposed to, but couldn't for the life of me intuit what policies would prevail if they had their way. Still, as they are often described as “anarchists”, I, as a flaming anarchist myself, could not help but be intrigued by those so identified in the legacy media as taking the struggle to the street.

This book, written by an anonymous group of authors, has been hailed as the manifesto of this movement, so I hoped that reading it would provide some insight into what it was all about. My hope was in vain. The writing is so incoherent and the prose so impenetrable that I closed it with no more knowledge of the philosophy and programme of its authors than when I opened it. My general perception of the “anti-globalisation” movement was one of intellectual nonentities spewing inchoate rage at the “system” which produces the wealth that allows them to live their slacker lives and flit from protest to protest around the globe. Well, if this is their manifesto, then indeed that's all there is to it. The text is nearly impossible to decipher, being written in a dialect of no known language. Many paragraphs begin with an unsubstantiated and often absurd assertion, then follow it with successive verb-free sentence fragments which seem to be intended to reinforce the assertion. I suppose that if you read it as a speech before a mass assembly of fanatics who cheer whenever they hear one of their trigger words it may work, but one would expect savvy intellectuals to discern the difference in media and adapt accordingly. Whenever the authors get backed into an irreconcilable logical corner, they just drop an F-bomb and start another paragraph.

These are people so clueless that I'll have to coin a new word for those I've been calling clueless all these many years. As far as I can figure out, they assume that they can trash the infrastructure of the “system”, and all of the necessities of their day to day urban life will continue to flow to them thanks to the magic responsible for that today. These “anarchists” reject the “exploitation” of work—after all, who needs to work? “Aside from welfare, there are various benefits, disability money, accumulated student aid, subsidies drawn off fictitious childbirths, all kinds of trafficking, and so many other means that arise with every mutation of control.” (p. 103) Go anarchism! Death to the state, as long as the checks keep coming! In fact, it is almost certain that the effete would-be philosophes who set crayon (and I don't mean the French word for “pencil”) to paper to produce this work will be among the first wave of those to fall in the great die-off starting between 72 and 96 hours after that event towards which they so sincerely strive: the grid going down. Want to know what I'm talking about? Turn off the water main where it enters your house and see what happens in the next three days if you assume you can't go anywhere else where the water is on. It's way too late to learn about “rooftop vegetable gardens” when the just-in-time underpinnings which sustain modern life come to a sudden halt. Urban intellectuals may excel at publishing blows against the empire, but when the system actually goes down, bet on rural rednecks to be the survivors. Of course, as far as I can figure out what these people want, it may be that Homo sapiens returns to his roots—namely digging for roots and grubs with a pointed stick. Perhaps rather than flying off to the next G-20 meeting to fight the future, they should spend a week in one of the third world paradises where people still live that way and try it out for themselves.

The full text of the book is available online in English and French. Lest you think the Massachusetts Institute of Technology is a beacon of rationality and intelligence in a world going dark, it is their university press which distributes this book.

 Permalink

Flynn, Vince. Act of Treason. New York: Pocket Books, 2006. ISBN 978-1-4165-4226-1.
This is the seventh novel in the Mitch Rapp (warning—the article at this link contains minor spoilers) series. I packed this thriller as an “airplane book” on a recent trip. The novel was far more successful than the journey, which ended up as a 12 hour round trip from Switzerland to England and back when my onward flight was cancelled thanks to an unexpected belch from volcano Whatchamacallit. By the time I got home, I was already more than 350 pages into the 467 page paperback, and I finished it over the next two days. Like all Vince Flynn books, this is a page turner, although this time there's less action and more puzzling out of shadowy connections.

The book begins with a terrorist attack on the motorcade of a presidential candidate who, then trailing in the polls, is swept into office on a sympathy vote. Now, just before the inauguration of the new administration, Rapp captures the perpetrator of the attack and, as he and CIA director Irene Kennedy follow the trail of those who ordered the strike, they begin to suspect a plot that could shake the U.S. to its foundations and undermine the legitimacy of its government. Under a tight deadline as inauguration day approaches, Rapp and Kennedy have to find out the facts and take direct action to avert calamity.

Characters from earlier books in the series appear here, and references to events which occurred earlier in the timeline are made, but this book works perfectly fine as a stand-alone novel—you can pick up the Mitch Rapp saga here and miss little or nothing (although there will, inevitably, be spoilers for events in the earlier books).

 Permalink

Kennedy, Gregory P. The Rockets and Missiles of White Sands Proving Ground, 1945–1958. Atglen, PA: Schiffer Military History, 2009. ISBN 978-0-7643-3251-7.
Southern New Mexico has been a centre of American rocketry from its origin to the present day. After being chased out of Massachusetts due to his inventions' proclivity for making ear-shattering detonations and starting fires, Robert Goddard moved his liquid fuel rocket research to a site near Roswell, New Mexico in 1930 and continued to launch increasingly advanced rockets from that site until 1943, when he left to do war work for the Navy. Faced with the need for a range to test the missiles developed during World War II, in February 1945 the U.S. Army acquired a site stretching 100 miles north from the Texas-New Mexico border near El Paso and 41 miles east-west at the widest point, designated the “White Sands Proving Ground”: taking its name from the gypsum sands found in the region, also home to the White Sands National Monument.

Although established before the end of the war to test U.S. missiles, the first large rockets launched at the site were captured German V-2s (December 2002), with the first launched (unsuccessfully) in April 1946. Over the next six years, around seventy V-2s lifted off from White Sands, using the V-2's massive (for the time) one ton payload capacity to carry a wide variety of scientific instruments into the upper atmosphere and the edge of space. In the Bumper project, the V-2 was used as the booster for the world's first two stage liquid rocket, with its WAC Corporal second stage attaining an altitude of 248 miles: higher than some satellites orbit today (it did not, of course, attain anything near orbital velocity, and quickly fell back to Earth).

Simultaneously with launches of the V-2, U.S. rocketeers arrived at White Sands to test their designs—almost every U.S. missile of the 1940s and 1950s made its first flight there. These included research rockets such as Viking and Aerobee (the latter, first launched in 1948, remained in service until 1985, with a total of 1037 flown); the Corporal, Sergeant, and Redstone ballistic missiles; the Loki, Nike, and Hawk anti-aircraft missiles; and a variety of tactical missiles including the unguided (!) nuclear-tipped Honest John.

White Sands in the forties and fifties was truly the Wild West of rocketry. Even by the standards of fighter aircraft development in the epoch, this was by guess and by gosh engineering in its purest incarnation. Consider Viking 8, which broke loose from the launch pad during a static test when hold-down fittings failed, and was allowed to fly to 20,000 feet to see what would happen (p. 97). Or Viking 10, whose engine exploded on the launch pad and then threatened a massive explosion as leaking fuel, draining away, left a vacuum which was crumpling the tankage. An intrepid rocketeer was sent out of the blockhouse with a carbine to shoot a hole in the top of the fuel tank and allow air to enter (p. 100)—problem solved! (The rocket was rebuilt and later flew successfully.) Then there was the time they ran out of 90% hydrogen peroxide and were told the first Viking launch would have to be delayed for two weeks until a new shipment could arrive by rail. Can't have that! So two engineers drove a drum of the highly volatile and corrosive substance in the back of a station wagon from Buffalo, New York to White Sands to meet the launch deadline (p. 79). In the Nike program, people worried about whether its aniline fuel would be sufficiently available under tactical conditions, so they tried using gasoline as fuel instead—BOOM! Nope, guess not (p. 132). With all this “innovation” going on, they needed a suitable place from which to observe it, so the pyramid-shaped blockhouse had reinforced concrete walls ten feet thick with a roof 27 feet thick at the peak, designed to withstand a direct impact from a V-2 falling from an altitude of 100 miles. “Once the rockets are up, who cares where they come down?”

And the pace of rockets going up was absolutely frenetic, almost inconceivable by the standards of today's hangar queens and launch pad prima donnas (some years ago, a booster which sat on the pad for more than a year was nicknamed the “civil servant”: it won't work and you can't fire it). By contrast, a single development program, the Loki anti-aircraft missile, conducted a total of 2282 launches at White Sands in 1953 and 1954 (p. 115)—that's an average of more than three a day, counting weekends and holidays!

The book concludes in 1958 when White Sands Proving Ground became White Sands Missile Range (scary pop-up at this link), which remains a centre of rocket development and testing to this day. With the advent of NASA and massively funded, long-term military procurement programs, the cut, try, and run like Hell era of rocketry came to a close; this book covers that period which, if not a golden age, was a heck of a lot of fun for engineers who enjoy making loud noises and punching holes in the sky.

The book is gorgeous, printed on glossy paper, with hundreds of illustrations. I noted no typographical or factual errors. A complete list of all U.S. V-2, WAC Corporal, and Viking launches is given in appendices at the end.

 Permalink

Austen, Jane and Seth Grahame-Smith. Pride and Prejudice and Zombies. Philadelphia: Quirk Books, 2009. ISBN 978-1-59474-334-4.
Jane Austen's Pride and Prejudice is the quintessential British Regency era novel of manners. Originally published in 1813, it has been endlessly adapted to the stage, film, and television, and has been a staple of English literature classes from the Victorian era through post-post-modern de-deconstructionist decadence. What generations of litterateurs missed, however, is its fundamental shortcoming: there aren't any zombies in it! That's where the present volume comes in.

This work preserves 85% of Jane Austen's original text and names her as the primary author (hey, if you can't have a dead author in a zombie novel, where can you?), but enhances the original story with “ultraviolent zombie mayhem” seamlessly woven into the narrative. Now, some may consider this a travesty and desecration of a literary masterwork, but look at it this way: if F-14s are cool and tyrannosaurs are cool, imagine how cool tyrannosaurs in F-14s would be! Adopting this Calvinist approach allows one to properly appreciate what has been done here.

The novel is set in an early 19th century England afflicted for five and fifty years with the “strange plague” that causes the dead to rise and stagger across the countryside alone or in packs, seeking to kill and devour the succulent brains of the living. Any scratch inflicted by one of these creatures (variously referred to as “unmentionables”, “sorry stricken”, “manky dreadfuls”, “Satan's armies”, “undead”, or simply “zombies”) can infect the living with the grievous affliction and transform them into another compulsive cranium cruncher. The five Bennet sisters have been sent by their father to be trained in the deadly arts by masters in China and have returned a formidable fighting force, sworn by blood oath to the Crown to defend Hertfordshire against the zombie peril until the time of their marriage. There is nothing their loquacious and rather ditzy mother wants more than to see her five daughters find suitable matches, and she fears their celebrated combat credentials and lack of fortune will deter the wealthy and refined suitors she imagines for them. The central story is the contentious relations and blossoming romance between Elizabeth Bennet and Fitzwilliam Darcy, a high-born zombie killer extraordinaire whose stand-offish manner is initially interpreted as arrogance and disdain for the humble Bennets. Can such fierce and proud killers find love and embark upon a life fighting alongside one another in monster murdering matrimony?

The following brief extracts give a sense of what you're getting into when you pick up this book. None are really plot spoilers, but I've put them into a spoiler block nonetheless because some folks might want to encounter these passages in context to fully enjoy the roller coaster ride between the refined and the riotous.

Spoiler warning: Plot and/or ending details follow.  
  • From a corner of the room, Mr. Darcy watched Elizabeth and her sisters work their way outward, beheading zombie after zombie as they went. He knew of only one other woman in Great Britain who wielded a dagger with such skill, such grace, and deadly accuracy.

    By the time the girls reached the walls of the assembly hall, the last of the unmentionables lay still.

    Apart from the attack, the evening altogether passed off pleasantly for the whole family. Mrs. Bennet had seen her eldest daughter much admired by the Netherfield party. … (Chapter 3)

  • Elizabeth, to whom Jane very soon communicated the chief of all this, heard it in silent indignation. Her heart was divided between concern for her sister, and thoughts of going immediately to town and dispensing the lot of them.

    “My dear Jane!” exclaimed Elizabeth, “you are too good. Your sweetness and disinterestedness are really angelic; you wish to think all the world respectable, and are hurt if I speak of killing anybody for any reason! …” (Chapter 24)

  • But why Mr. Darcy came so often to the Parsonage, it was more difficult to understand. It could not be for society, as he frequently sat there ten minutes together without opening his lips; and when he did speak, it seemed the effect of necessity rather than choice. He seldom appeared really animated, even at the sight of Mrs. Collins gnawing upon her own hand. What remained of Charlotte would have liked to believe this change the effect of love, and the object of that love her friend Eliza. She watched him whenever they were at Rosings, and whenever he came to Hunsford; but without much success, for her thoughts often wandered to other subjects, such as the warm, succulent sensation of biting into a fresh brain. …

    In her kind schemes for Elizabeth, she sometimes planned her marrying Colonel Fitzwilliam. He was beyond comparison the most pleasant man; he certainly admired her, and his situation in life was most eligible; but to counterbalance these advantages, Mr. Darcy had a considerably larger head, and thus, more brains to feast upon. (Chapter 32)

  • “When they all removed to Brighton, therefore, you had no reason, I suppose, to believe them fond of each other?”

    “Not the slightest. I can remember no symptom of affection on either side, other than her carving his name into her midriff with a dagger; but this was customary with Lydia. …” (Chapter 47)

  • He scarcely needed an invitation to stay for supper; and before he went away, an engagement was formed, chiefly through his own and Mrs. Bennet's means, for his coming next morning to shoot the first autumn zombies with her husband. (Chapter 55)
  • You may as well call it impertinence. It was very little else. The fact is, you were sick of civility, of deference, of officious attention. You were disgusted with the women who were always speaking, and looking, and thinking for your approbation alone. I roused, and interested you because I was so unlike them. I knew the joy of standing over a vanquished foe; of painting my face and arms with their blood, yet warm, and screaming to the heavens—begging, nay daring, God to send me more enemies to kill. The gentle ladies who so assiduously courted you knew nothing of this joy, and therefore, could never offer you true happiness. … (Chapter 60)
Spoilers end here.  

The novel concludes with zombies still stalking England; all attempts to find a serum, including Lady Catherine's, having failed, and without hope for a negotiated end to hostilities. Successful diplomacy requires not only good will but brains. Zombies do not have brains; they eat them. So life goes on, and those who find married bliss must undertake to instruct their progeny in the deadly arts which defend the best parts of life from the darkness.

The book includes a “Reader's Discussion Guide” ideal for classroom and book club exploration of themes raised in the novel. For example:

10. Some scholars believe that the zombies were a last-minute addition to the novel, requested by the publisher in a shameless attempt to boost sales. Others argue that the hordes of living dead are integral to Jane Austen's plot and social commentary. What do you think? Can you imagine what this novel might be without the violent zombie mayhem?
Beats me.

Of course this is going to be made into a movie—patience! A comic book edition, set of postcards, and a 2011 wall calendar ideal for holiday giving are already available—go merchandising! Here is a chart which will help you sort out the relationships among the many characters in both Jane Austen's original novel and this one.

While this is a parody, whilst reading it I couldn't help but recall Herman Kahn's parable of the lions in New York City. Humans are almost infinitely adaptable and can come to consider nearly any situation normal once they've gotten used to it. In this novel zombies are something one lives with, one of the afflictions of mortal life like tuberculosis and crabgrass, and it is perfectly normal for young ladies to become warriors because that's what circumstances require. It gives one pause to think how many things we've all come to consider unremarkable in our own lives might be deemed bizarre and/or repellent by observers from another epoch or another culture.

 Permalink

White, Rowland. Vulcan 607. London: Corgi Books, 2006. ISBN 978-0-552-15229-7.
The Avro Vulcan bomber was the backbone of Britain's nuclear deterrent from the 1950s until the end of the 1960s, when ballistic missile submarines assumed the primary deterrent mission. Vulcans remained in service thereafter as tactical nuclear weapons delivery platforms in support of NATO forces. In 1982, the aging Vulcan force was months from retirement when Argentina occupied the Falkland Islands, and Britain summoned all of its armed services to mount a response. The Royal Navy launched a strike force, but given the distance (about 8000 miles from Britain to the Falklands) it would take about two weeks to arrive. The Royal Air Force surveyed their assets and concluded that only the Vulcan, supported by the Handley Page Victor, a bomber converted to an aerial refueling tanker, would permit it to project power to such a distant theatre.

But there were difficulties—lots of them. First of all, the Vulcan had been dedicated to the nuclear mission for decades: none of the crews had experience dropping conventional bombs, and the bomb bay racks to dispense them had to be hunted down in scrap yards. No Vulcan had performed aerial refueling since 1971, as its missions were assumed to be short-range tactical sorties, and the refueling hardware had been blanked off. Crews were sent out to find and remove refueling probes from museum specimens to install on the bombers chosen for the mission. Simply navigating to a tiny island in the southern hemisphere in this pre-GPS era was a challenge—Vulcan crews had been trained to navigate by radar returns from the terrain, and there was no terrain whatsoever between their launch point on Ascension Island and landfall in the Falklands, so boffins figured out how to adapt navigation gear from obsolete VC10 airliners to the Vulcan and make it work. The Vulcan had no modern electronic countermeasures (ECM), rendering it vulnerable to Argentine anti-aircraft defences, so an ECM pod from another aircraft was grafted onto its wing, fastened to a hardpoint which had never before been used on a Vulcan. Finding that hardpoint, and thereby knowing where to drill the holes, required dismantling the wing of another Vulcan.

If the preparations were remarkable, especially since they were thrown together in just a few weeks, the mission plan was audacious—so much so that one expects it would have been rejected as absurd if proposed as the plot of a James Bond film. Executing the mission to bomb the airfield on the Falkland Islands would involve two Vulcan bombers, one Nimrod marine patrol aircraft, thirteen Victor tankers, nineteen refuelings (including Victor to Victor and Victor to Vulcan), 1.5 million pounds of fuel, and ninety aircrew. And all of these resources, assembled and deployed in a single mission, managed to put just one crater in the airstrip in the Falkland Islands, denying it to Argentine fast jets, but allowing C-130 transports to continue to operate from it.

From a training, armament, improvisation, and logistics standpoint this was a remarkable achievement, and the author argues that its consequences, direct and indirect, effectively took the Argentine fast jet force and navy out of the conflict, and hence paved the way for the British reconquista of the islands. Today it seems quaint: you'd just launch a few cruise missiles at the airfield, cratering it and spreading area denial munitions, and that would be that, without risking a single airman. But they didn't have that option then, so they did their best with what was available, and this epic story recounts how they pulled it off with hardware on the edge of retirement, re-purposed for a mission its designers never imagined, flying a plan with no margin for error, on a schedule nobody could have imagined absent wartime exigency. This is a tale of the Vulcan mission; if you're looking for a comprehensive account of the Falklands War, you'll have to look elsewhere. The Vulcan raid on the Falklands was one of those extraordinary grand gestures which, like the Doolittle Raid on Japan, cast a longer shadow in history than their direct consequences implied. After the Vulcan raid, nobody doubted the resolve of Britain, and the resulting withdrawal of Argentine forces to defend the mainland almost certainly reduced the cost of retaking the islands from the invader.

 Permalink

Flynn, Vince. Protect and Defend. New York: Pocket Books, 2007. ISBN 978-1-4165-0503-7.
This is the eighth novel in the Mitch Rapp (warning—the article at this link contains minor spoilers) series. I usually wait a month or two between reading installments in this thriller saga, but since I'd devoured the previous volume, Act of Treason, earlier this month on an airline trip which went seriously awry, I decided to bend the rules and read its successor on the second attempt to make the same trip. This time both the journey and the novel were entirely successful.

The story begins with Mitch Rapp cleaning up some unfinished business from Act of Treason, then transitions into a thriller whose premises may play out in the headlines in the near future. When Iran's covert nuclear weapons facility is destroyed under mysterious circumstances, all of the players in the game, both in Iran and around the world, try to figure out what happened, who was responsible, and how they can turn events to their own advantage. Fanatic factions within the Iranian power structure see an opportunity to launch a proxy terror offensive against Israel and the United States, while those aware of the vulnerability of their country to retaliation for any attack upon those nations try to damp down the flames. The new U.S. president decides to use a back channel to approach the Iranian pragmatists with a deal to put an end to the decades-long standoff and reestablish formal relations between the nations, and dispatches the CIA director to a covert meeting with her peer, the chief of the Iranian Ministry of Intelligence and Security. But word of the meeting makes its way to the radical factions in Iran, and things go horribly wrong. It is then up to Mitch Rapp and his small team, working against the clock, to puzzle out what happened, who is responsible, and how to respond.

If you haven't read the earlier Mitch Rapp novels, you'll miss some of the context, particularly in the events of the first few chapters, but this won't detract in any way from your enjoyment of the story. Personally, I'd read (and I'm reading) the novels in order, but they are sufficiently stand-alone (particularly after the first few) that there's no problem getting into the series at any point. Vince Flynn's novels are always about the action and the characters, not preachy policy polemics. Nonetheless, one gets a sense that the strategy presented here is how the author's brain trust would like to see a confident and unapologetic West address the Iranian conundrum.

 Permalink

June 2010

Lanier, Jaron. You Are Not a Gadget. New York: Alfred A. Knopf, 2010. ISBN 978-0-307-26964-5.
In The Fatal Conceit (March 2005) Friedrich A. Hayek observed that almost any noun in the English language is devalued by preceding it with “social”. In this book, virtual reality pioneer, musician, and visionary Jaron Lanier argues that the digital revolution, which began in the 1970s with the advent of the personal computer and became a new foundation for human communication and interaction with widespread access to the Internet and the Web in the 1990s, took a disastrous wrong turn in the early years of the 21st century with the advent of the so-called “Web 2.0” technologies and “social networking”—hey, Hayek could've told you!

Like many technologists, the author was optimistic that with the efflorescence of the ubiquitous Internet in the 1990s, combined with readily-affordable computer power which permitted photorealistic graphics and high fidelity sound synthesis, a new burst of bottom-up creativity would be unleashed: creative individuals would be empowered to realise not just new art, but new forms of art, along with new ways to collaborate and distribute their work to a global audience. This Army of Davids (March 2006) world, however, seems to have been derailed or at least delayed, and instead we've come to inhabit an Internet and network culture which is darker and less innovative. Lanier argues that the phenomenon of technological “lock-in” makes this particularly ominous, since regrettable design decisions, whose drawbacks were not even perceived when they were made, tend to become entrenched and almost impossible to remedy once they are widely adopted. (For example, just look at the difficulties in migrating the Internet to IPv6.) With application layer protocols, fundamental change becomes almost impossible once a multitude of independently maintained applications rely upon them to intercommunicate.

Consider MIDI, which the author uses as an example of lock-in. Originally designed to allow music synthesisers and keyboards to interoperate, it embodies a keyboardist's view of the concept of a note, which is quite different from that, say, of a violinist or trombone player. Even with facilities such as pitch bend, there are musical articulations played on physical instruments which cannot be represented in MIDI sequences. But since MIDI has become locked in as the lingua franca of electronic music production, in effect the musical vocabulary has been limited to those concepts which can be represented in MIDI, resulting in a digital world which is impoverished in potential compared to the analogue instruments it aimed to replace.
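To make the lock-in concrete, consider what a MIDI “note” actually is on the wire. The sketch below—plain Python assembling standard MIDI 1.0 status bytes, specific to no particular synthesiser or library—shows that a note is nothing more than a key number and a velocity, and that pitch bend is a separate message applying to an entire channel, so two simultaneous notes on one channel cannot be bent independently: precisely the sort of articulation a violinist takes for granted.

    # A MIDI 1.0 "note" is just a key number (0-127) and a velocity
    # (0-127) on one of 16 channels. These helpers build the raw bytes.

    def note_on(channel, key, velocity):
        return bytes([0x90 | channel, key, velocity])

    def note_off(channel, key):
        return bytes([0x80 | channel, key, 0])

    def pitch_bend(channel, value):
        # value is 14 bits; 0x2000 means no bend. The bend applies to
        # every sounding note on the channel: there is no per-note bend.
        return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

    print(note_on(0, 60, 64).hex())           # middle C, mezzo-forte: 903c40
    print(pitch_bend(0, 0x2000 + 512).hex())  # bends all channel 0 notes at once

Everything a musician plays must first be crushed into that vocabulary; whatever cannot be expressed in it simply ceases to exist in the recorded work.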

With the advent of “social networking”, we appear to be locking in a representation of human beings as database entries with fields chosen from a limited menu of choices, and hence, as with MIDI, flattening down the unbounded diversity and potential of human individuals to categories which, not coincidentally, resemble the demographic bins used by marketers to target groups of customers. Further, the Internet, through its embrace of anonymity and throwaway identities and consequent devaluing of reputation, encourages mob behaviour and “drive by” attacks on individuals which make many venues open to the public more like a slum than an affinity group of like-minded people. Lanier argues that many of the pathologies we observe in behaviour on the Internet are neither inherent nor inevitable, but rather the consequences of bad user interface design. But with applications built on social networking platforms proliferating as rapidly as me-too venture capital hoses money in their direction, we may be stuck with these regrettable decisions and their pernicious consequences for a long time to come.
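As a wholly invented illustration of the sort of representation being criticised here, consider what a person reduces to in a typical social networking data model—every field a selection from a menu the site's designers drew up in advance (the field names and choices below are hypothetical, not drawn from any actual service):

    from dataclasses import dataclass
    from enum import Enum

    # Hypothetical fixed-menu fields of the kind the author deplores.
    class RelationshipStatus(Enum):
        SINGLE = "single"
        MARRIED = "married"
        ITS_COMPLICATED = "it's complicated"

    @dataclass
    class Profile:
        name: str
        age: int
        relationship: RelationshipStatus  # only what the menu anticipated
        favourite_band: str               # exactly one; no ambivalence allowed

    # A human being, flattened into marketer-friendly bins:
    alice = Profile("Alice", 29, RelationshipStatus.SINGLE, "The Beatles")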

Next, the focus turns to the cult of free and open source software, “cloud computing”, “crowd sourcing”, and the assumption that a “hive mind” assembled from a multitude of individuals collaborating by means of the Internet can create novel and valuable work and even assume some of the attributes of personhood. Now, this may seem absurd, but there are many people in the Silicon Valley culture to whom these are articles of faith, and since these people are engaged in designing the tools many of us will end up using, it's worth looking at the assumptions which inform their designs. Compared to what seemed the unbounded potential of the personal computer and Internet revolutions in their early days, what the open model of development has achieved to date seems depressingly modest: re-implementations of an operating system, text editor, and programming language all rooted in the 1970s, and creation of a new encyclopedia which is structured in the same manner as paper encyclopedias dating from a century ago—oh wow. Where are the immersive massively multi-user virtual reality worlds, or the innovative presentation of science and mathematics in an interactive exploratory learning environment, or new ways to build computer tools without writing code, or any one of the hundreds of breakthroughs we assumed would come along when individual creativity was unleashed by their hardware prerequisites becoming available to a mass market at an affordable price?

Not only have the achievements of the free and open movement been, shall we say, modest; the other side of the “information wants to be free” creed has devastated traditional content providers such as the music publishing, newspaper, and magazine businesses. Now among many people there's no love lost for the legacy players in these sectors, and a sentiment of “good riddance” is common, if not outright gloating over their demise. But what hasn't happened, at least so far, is the expected replacement of these physical delivery channels with electronic equivalents which generate sufficient revenue to allow artists, journalists, and other primary content creators to make a living as they did before. Now, certainly, these occupations are a meritocracy where only a few manage to support themselves, much less become wealthy, while far more never make it. But with the mass Internet now approaching its twentieth birthday, wouldn't you expect at least a few people to have figured out how to make it work for them and prospered as creators in this new environment? If so, where are they?

For that matter, what new musical styles, forms of artistic expression, or literary genres have emerged in the age of the Internet? Has the lack of a viable business model for such creations led to a situation the author describes as, “It's as if culture froze just before it became digitally open, and all we can do now is mine the past like salvagers picking over a garbage dump.” One need only visit YouTube to see what he's talking about. Don't read the comments there—that path leads to despair, which is a low state.

Lanier's interests are eclectic, and a great many matters are discussed here including artificial intelligence, machine language translation, the financial crisis, zombies, neoteny in humans and human cultures, and cephalopod envy. Much of this is fascinating, and some is irritating, such as the discussion of the recent financial meltdown where it becomes clear the author simply doesn't know what he's talking about and misdiagnoses the causes of the catastrophe, which are explained so clearly in Thomas Sowell's The Housing Boom and Bust (March 2010).

I believe this is the octopus video cited in chapter 14. The author was dubious, upon viewing this, that it wasn't a computer graphics trick. I have not, as he has, dived the briny deep to meet cephalopods on their own turf, and I remain sceptical that the video represents what it purports to. This is one of the problems of the digital media age: when anything you can imagine can be persuasively computer synthesised, how can you trust any reportage of a remarkable phenomenon to be genuine if you haven't observed it for yourself?

Occasional aggravations aside, this is a thoughtful exploration of the state of the technologies which are redefining how people work, play, create, and communicate. Readers frustrated by the limitations and lack of imagination which characterises present-day software and network resources will discover, in reading this book, that tremendously empowering phrase, “it doesn't have to be that way”, and perhaps demand better of those bringing products to the market or perhaps embark upon building better tools themselves.

 Permalink

Spira, S. F., Eaton S. Lothrop, Jr., and Jonathan B. Spira. The History of Photography As Seen Through the Spira Collection. Danville, NJ: Aperture, 2001. ISBN 978-0-89381-953-8.
If you perused the back pages of photographic magazines in the 1960s and 1970s, you'll almost certainly recall the pages of advertising from Spiratone, which offered a panoply of accessories and gadgets, many tremendously clever and useful, and some distinctly eccentric and bizarre, for popular cameras of the epoch. Spiratone was the creation of Fred Spira, a refugee from Austria after the Nazi Anschluss who arrived in New York almost penniless; his ingenuity, work ethic, and sense for the needs of the burgeoning market of amateur photographers built what started as a one-man shop into a flourishing enterprise, creating standards such as the “T mount” for interchangeable lenses which persists to the present day. His company was a pioneer in importing high quality photographic gear from Japan and instrumental in changing the reputation of Japan from a purveyor of junk to a top end manufacturer.

Like so many businessmen who succeed to such an extent they redefine the industries in which they participate, Spira was passionate about the endeavour pursued by his customers: in his case photography. As his fortune grew, he began to amass a collection of memorabilia from the early days of photography, and this Spira Collection finally grew to more than 20,000 items, covering the entire history of photography from its precursors to the present day.

This magnificent coffee table book draws upon items from the Spira collection to trace the history of photography from the camera obscura in the 16th century to the dawn of digital photography in the 21st. While the pictures of items from the collection dominate the pages, there is abundant well-researched text sketching the development of photography, including the many blind alleys along the way to a consensus of how images should be made. You can see the fascinating process by which a design, which initially varies all over the map as individual inventors try different approaches, converges upon a standard based on customer consensus and market forces. There is probably a lesson for biological evolution somewhere in this. With inventions which appear, in retrospect, as simple as photography, it's intriguing to wonder how much earlier they might have been discovered: could a Greek artificer have stumbled on the trick and left us, in some undiscovered cache, an image of Pericles making the declamation recorded by Thucydides? Well, probably not—the simplest photographic process, the daguerreotype, requires a silvered copper plate sensitised with iodine vapour and developed with mercury fumes. While the metals were all known in antiquity (along with glass production sufficient to make a crude lens or, failing that, a pinhole), elemental iodine was not isolated until 1811, just 28 years before Daguerre applied it to photography. But still, you never know….

This book is out of print, but used copies are generally available for less than the cover price at its publication in 2001.

 Permalink

Okrent, Daniel. Last Call: The Rise and Fall of Prohibition. New York: Scribner, 2010. ISBN 978-0-7432-7702-0.
The ratification of the Eighteenth Amendment to the U.S. Constitution in 1919, prohibiting the “manufacture, sale, or transportation of intoxicating liquors” marked the transition of the U.S. Federal government into a nanny state, which occupied itself with the individual behaviour of its citizens. Now, certainly, attempts to legislate morality and regulate individual behaviour were commonplace in North America long before the United States came into being, but these were enacted at the state, county, or municipality level. When the U.S. Constitution was ratified, it exclusively constrained the actions of government, not of individual citizens, and with the sole exception of the Thirteenth Amendment, which abridged the “freedom” to hold people in slavery and involuntary servitude, this remained the case into the twentieth century. While bans on liquor were adopted in various jurisdictions as early as 1840, it simply never occurred to many champions of prohibition that a nationwide ban, written into the federal constitution, was either appropriate or feasible, especially since taxes on alcoholic beverages accounted for as much as forty percent of federal tax revenue in the years prior to the introduction of the income tax, and imposition of total prohibition would zero out the second largest source of federal income after the tariff.

As the Progressive movement gained power, with its ambitions of continental scale government and imposition of uniform standards by a strong, centralised regime, it found itself allied with an improbable coalition including the Woman's Christian Temperance Union; the Methodist, Baptist and Presbyterian churches; advocates of women's suffrage; the Anti-Saloon League; Henry Ford; and the Ku Klux Klan. Encouraged by the apparent success of “war socialism” during World War I and empowered by enactment of the Income Tax via the Sixteenth Amendment, providing another source of revenue to replace that of excise taxes on liquor, these players were motivated in the latter years of the 1910s to impose their agenda upon the entire country in as permanent a way as possible: by a constitutional amendment. Although the supermajorities required were daunting (two thirds in the House and Senate to submit, three quarters of state legislatures to ratify), if a prohibition amendment could be pushed over the bar (if you'll excuse the term), opponents would face what was considered an insuperable task to reverse it: with 48 states, ratifying a repeal amendment would require the approval of 36, so a mere 13 dry states could block it indefinitely.

Further motivating the push not just for a constitutional amendment, but enacting one as soon as possible, were the rapid demographic changes underway in the U.S. Support for prohibition was primarily rural, in southern and central states, Protestant, and Anglo-Saxon. During the 1910s, population was shifting from farms to urban areas, from the midland toward the coasts, and the immigrant population of Germans, Italians, and Irish who were famously fond of drink was burgeoning. This meant that the electoral landscape following reapportionment after the 1920 census would be far less receptive to the foes of Demon Rum.

One must never underestimate the power of an idea whose time has come, regardless of how stupid and counterproductive it might be. And so it came to pass that the Eighteenth Amendment was ratified by the 36th state, Nebraska, on January 16th, 1919, with nationwide Prohibition to come into effect a year hence. From the outset, it was pretty obvious to many astute observers what was about to happen. An Army artillery captain serving in France wrote to his fiancée in Missouri, “It looks to me like the moonshine business is going to be pretty good in the land of the Liberty Loans and Green Trading Stamps, and some of us want to get in on the ground floor. At least we want to get there in time to lay in a supply for future consumption.” Captain Harry S. Truman ended up pursuing a different (and probably less lucrative) career, but was certainly prescient about the growth industry of the coming decade.

From the very start, Prohibition was a theatre of the absurd. Since it was enforced by a federal statute, the Volstead Act, enforcement, especially in states which did not have their own state Prohibition laws, was the responsibility of federal agents within the Treasury Department, whose head, Andrew Mellon, was a staunch opponent of Prohibition. Enforcement was always absurdly underfunded compared to the magnitude of the bootlegging industry and their customers (the word “scofflaw” entered the English language to describe them). Federal Prohibition officers were paid little, but were nonetheless highly prized patronage jobs, as their holders could often pocket ten times their salary in bribes to look the other way.

Prohibition unleashed the American talent for ingenuity, entrepreneurship, and the do-it-yourself spirit. While it was illegal to manufacture liquor for sale or to sell it, possession and consumption were perfectly legal, and families were allowed to make up to 200 gallons (which should suffice even for the larger, more thirsty households of the epoch) for their own use. This led to a thriving industry in California shipping grapes eastward for householders to mash into “grape juice” for their own use, being careful, of course, not to allow it to ferment or to sell some of their 200 gallon allowance to the neighbours. Later on, the “Vino Sano Grape Brick” was marketed nationally. Containing dried crushed grapes, complete with the natural yeast on the skins, you just added water, waited a while, and hoisted a glass to American innovation. Brewers, not to be outdone, introduced “malt syrup”, which with the addition of yeast and water, turned into beer in the home brewer's basement. Grocers stocked everything the thirsty householder needed to brew up case after case of Old Frothingslosh, and brewers remarked upon how profitable it was to outsource fermentation and bottling to the customers.

For those more talented in manipulating the law than fermenting fluids, there were a number of opportunities as well. Sacramental wine was exempted from Prohibition, and wineries which catered to Catholic and Jewish congregations distributing such wines prospered. Indeed, Prohibition enforcers noted they'd never seen so many rabbis before, including some named Patrick Houlihan and James Maguire. Physicians and dentists were entitled to prescribe liquor for medicinal purposes, and the lucrative fees for writing such prescriptions and for pharmacists to fill them rapidly caused hard liquor to enter the materia medica for numerous maladies, far beyond the traditional prescription as snakebite medicine. While many pre-Prohibition bars re-opened as speakeasies, others prospered by replacing “Bar” with ”Drug Store” and filling medicinal whiskey prescriptions for the same clientele.

Apart from these dodges, the vast majority of Americans slaked their thirst with bootleg booze, either domestic (and sometimes lethal), or smuggled from Canada or across the ocean. The obscure island of St. Pierre, a French possession off the coast of Canada, became a prosperous entrepôt for reshipment of Canadian liquor legally exported to “France”, then re-embarked on ships headed for “Rum Row”, just outside the territorial limit of the U.S. East Coast. Rail traffic into Windsor, Ontario, just across the Detroit River from the eponymous city, exploded, as boxcar after boxcar unloaded cases of clinking glass bottles onto boats bound for…well, who knows? Naturally, with billions and billions of dollars of tax-free income to be had, it didn't take long for criminals to stake their claims to it. What was different, and deeply appalling to the moralistic champions of Prohibition, is that a substantial portion of the population who opposed Prohibition did not despise them, but rather respected them as making their “money by supplying a public demand”, in the words of one Alphonse Capone, whose public relations machine kept him in the public eye.

As the absurdity of the almost universal scorn and disobedience of Prohibition grew (at least among the urban chattering classes, which increasingly dominated journalism and politics at the time), opinion turned toward ways to undo its increasingly evident pernicious consequences. Many focussed upon amending the Volstead Act to exempt beer and light wines from the definition of “intoxicating liquors”—this would open a safety valve, and at least allow recovery of the devastated legal winemaking and brewing industries. The difficulty of actually repealing the Eighteenth Amendment deterred many of the most ardent supporters of that goal. As late as September 1930, Senator Morris Sheppard, who drafted the Eighteenth Amendment, said “There is as much chance of repealing the Eighteenth Amendment as there is for a hummingbird to fly to the planet Mars with the Washington Monument tied to its tail.”

But when people have had enough (I mean, of intrusive government, not illicit elixir), it's amazing what they can motivate a hummingbird to do! Less than two years later, the Twenty-first Amendment, repealing Prohibition, was passed by the Congress, and on December 5th, 1933, it was ratified by the 36th state (appropriately, but astonishingly, Utah), thus putting an end to what had not only become generally seen as a farce, but also a direct cause of sanguinary lawlessness and scorn for the rule of law. The cause of repeal was greatly aided not only by the thirst of the populace, but also by the thirst of their government for revenue, which had collapsed due to plunging income tax receipts as the Great Depression deepened, along with falling tariff income as international trade contracted. Reinstating liquor excise taxes and collecting corporate income tax from brewers, winemakers, and distillers could help ameliorate the deficits from New Deal spending programs.

In many ways, the adoption and repeal of Prohibition represented a phase transition in the relationship between the federal government and its citizens. In its adoption, they voted, by the most difficult of constitutional standards, to enable direct enforcement of individual behaviour by the national government, complete with its own police force independent of state and local control. But at least they acknowledged that this breathtaking change could only be accomplished by a direct revision of the fundamental law of the republic, and that reversing it would require the same—a constitutional amendment, duly proposed and ratified. In the years that followed, the federal government used its power to tax (many partisans of Repeal expected the Sixteenth Amendment to also be repealed but, alas, this was not to be) to promote and deter all kinds of behaviour through tax incentives and charges, and before long the federal government was simply enacting legislation which directly criminalised individual behaviour without a moment's thought about its constitutionality, and those who challenged it were soon considered nutcases.

As the United States increasingly comes to resemble a continental scale theatre of the absurd, there may be a lesson to be learnt from the final days of Prohibition. When something is unsustainable, it won't be sustained. It's almost impossible to predict when the breaking point will come—recall the hummingbird with the Washington Monument in tow—but when things snap, it doesn't take long for the unimaginable new to supplant the supposedly secure status quo. Think about this when you contemplate issues such as immigration, the Euro, welfare state spending, bailouts of failed financial institutions and governments, and the multitude of big and little prohibitions and intrusions into personal liberty of the pervasive nanny state—and root for the hummingbird.

In the Kindle edition, all of the photographic illustrations are collected at the very end of the book, after the index—don't overlook them.

 Permalink

Beck, Glenn. The Overton Window. New York: Threshold Editions, 2010. ISBN 978-1-4391-8430-1.
I have no idea who is actually responsible for what in the authorship of this novel. Glenn Beck is listed as the principal author, but the title page says “with contributions from Kevin Balfe, Emily Bestler, and Jack Henderson”. I have cited the book as it appears on the cover and in most mentions of it, as a work by Glenn Beck. Certainly, regardless of who originated, edited, and assembled the words into the present work, it would not have been published nor have instantaneously vaulted to the top of the bestseller lists had it not been associated with the high profile radio and television commentator to whom it is attributed. Heck, he may have written the whole thing himself and generously given credit to his editors and fact checkers—it does, indeed, read like a first attempt by an aspiring thriller author.

It isn't at all bad. Beck (et al., or whatever) tends to be a bit preachy, and the first half of the novel goes pretty slowly. It's only after you cross the 50 yard line that you discover there's more to the story than you thought, that things and characters are not what they seemed to be, and that the choices facing the protagonist, Noah Gardner, are more complicated than they appeared.

The novel has been given effusive cover blurbs by masters of the genre Brad Thor and Vince Flynn. Still, I'd expect those page-turner craftsmen to have better modulated the tension in a story than we find here. A perfectly crafted thriller is like a roller coaster, with fear-inducing rises and terrifying plunges, but this is more like a lecture on constitutional government delivered whilst riding a Disneyland ride where most of the characters are animatronic robots there to illustrate the author's message. The characters just don't feel right. How plausible is it that a life-long advocate of liberty and conspiracy theorist would become bestest buddy with an undercover FBI agent who blackmailed him into co-operating in a sting operation less than 24 hours before? Or that a son who was tortured almost to death at the behest (and in the presence) of his father could plausibly be accepted as a minion in the father's nefarious undertaking? For the rest, we're going to have to go behind the spoiler curtain.

Spoiler warning: Plot and/or ending details follow.  
In chapter 30, Noah is said to have been kept unconscious for an entire weekend with a “fentanyl patch”. But fentanyl patches are used as an analgesic, not an anæsthetic. Although the drug was once used as a general anæsthetic, it was administered intravenously in this application, not via a transdermal patch.

The nuclear bomb “model” (which turns out to be the real thing) is supposed to have been purloined from a cruise missile which went missing during transport, and is said to weigh “eighty or one hundred pounds”. But the W-80 and W-84 cruise missile warheads weighed 290 and 388 pounds respectively. There is no way the weight of the physics package of these weapons could be reduced to such an extent while remaining functional.

The Mark 8 atomic bomb which comes on the scene in chapter 43 makes no sense at all. Where did it come from? Why was a bomb, of which only 40 were ever produced, all removed from service in 1957, carefully maintained in secret and off the books for more than fifty years? And why would the terrorists want two bombs, when the second would simply be vaporised when they set off the first? Perhaps I've missed something, but it's kind of like you're reading a spy thriller and in the middle of a gunfight a unicorn wanders through and everybody stops shooting until it passes, whereupon they continue the battle as if nothing happened.

Spoilers end here.  

Apart from the implausibility of the characters and the quibbles above, both of which I'm more than willing to excuse in a gripping thriller, the real disappointment here is that the novel ends about two hundred chapters before anything is actually resolved. This is a chronicle of the opening skirmish in a cataclysmic, protracted conflict between partisans of individual liberty and forces seeking to impose global governance by an élite. When you put the book down, you'll have met the players and understand their motives and resources, but it isn't even like the first volume of a trilogy where, regardless of how much remains to happen, there is usually at least the conclusion of a subplot. Now, you're not left with a cliffhanger, but neither is there any form of closure to the story. I suppose one has no option but to wait for the inevitable sequel, but I doubt I'll be reading it.

This is not an awful book; it's enjoyable on its own terms and its citations of real-world events may be enlightening to readers inattentive to the shrinking perimeter of liberty in this increasingly tyrannical world (the afterword provides resources for those inclined to explore further). But despite their praise for it, Vince Flynn and Brad Thor it's not.

 Permalink

Klein, Aaron with Brenda J. Elliott. The Manchurian President. New York: WND Books, 2010. ISBN 978-1-935071-87-7.
The provocative title of this book is a reference to Richard Condon's classic 1959 Cold War thriller, The Manchurian Candidate, in which a Korean War veteran, brainwashed by the Chinese while a prisoner of war in North Korea, returns as a sleeper agent, programmed to perform political assassinations on behalf of his Red controllers. The climax comes as a plot unfolds to elect a presidential candidate who will conduct a “palace coup”, turning the country over to the conspirators. The present book, on the other hand, notwithstanding its title, makes no claim that its subject, Barack Obama, has been brainwashed in any way, nor that there is any kind of covert plot to enact an agenda damaging to the United States, nor is any evidence presented which might support such assertions. Consequently, I believe the title is sensationalistic and in the end counterproductive. But what about the book?

Well, I'd argue that there is no reason to occupy oneself with conspiracy theories or murky evidence of possible radical connections in Obama's past, when you need only read the man's own words in his 1995 autobiography, Dreams from My Father, describing his time at Occidental College:

To avoid being mistaken for a sellout, I chose my friends carefully. The more politically active black students. The foreign students. The Chicanos. The Marxist professors and the structural feminists and punk-rock performance poets. We smoked cigarettes and wore leather jackets. At night, in the dorms, we discussed neocolonialism, Frantz Fanon, Eurocentrism, and patriarchy.

The sentence fragments. Now, certainly, many people have expressed radical thoughts in their college days, but most, writing an autobiography fifteen years later, having graduated from Harvard Law School and practiced law, might be inclined to note that they'd “got better”; to my knowledge, Obama makes no such assertion. Further, describing his first job in the private sector, also in Dreams, he writes:

Eventually, a consulting house to multinational corporations agreed to hire me as a research assistant. Like a spy behind enemy lines, I arrived every day at my mid-Manhattan office and sat at my computer terminal, checking the Reuters machine that blinked bright emerald messages from across the globe.

Now bear in mind that this is Obama on Obama, in a book published the same year he decided to enter Illinois politics, running for a state senate seat. Why would a politician feigning moderation in order to gain power, thence to push a radical agenda, explicitly brag of his radical credentials and background?

Well, he doesn't, because he's been an overt hard left radical with a multitude of connections to leftist, socialist, communist, and militant figures all of his life, from the first Sunday school he attended in Hawaii to the circle of advisers he brought into government following his election as president. The evidence of this has been in plain sight ever since Obama came onto the public scene, and he has never made an effort to cover it up or deny it. The only reason it is not widely known is that the legacy media did not choose to pursue it. This book documents Obama's radical leftist history and connections, but it does so in such a clumsy and tedious manner that you may find it difficult to slog through. The hard left in the decades of Obama's rise to prominence is very much like that of the 1930s through 1950s: a multitude of groups with platitudinous names concealing their agenda, staffed by a cast of characters whose names pop up again and again as you tease out the details, and with sources of funding which disappear into a cloud of smoke as you try to pin them down. In fact, the “new new left” (or “contemporary progressive movement”, as they'd doubtless prefer) looks and works almost precisely like what we used to call “communist front organisations” back in the day. The only difference is that they aren't funded by the KGB, don't seek Soviet domination, and don't report to masters in Moscow—at least as far as we know….

Obama's entire career has been embedded in such a tangled web of radical causes, individuals, and groups that following any one of them is like pulling up a weed whose roots extend in all directions, tangling with other weeds, which in turn are connected every which way. What we have is not a list of associations, but rather a network, and a network is a difficult thing to describe in the linear narrative of a book. In the present case, the authors get all tangled up in the mess, and the result is a book which is repetitive, tedious, and on occasions so infuriating that it was mostly a desire not to clean up the mess and pay the repair cost which kept me from hurling it through a window. If they'd mentioned just one more time that Bill Ayers was a former Weatherman terrorist, I think I might have lost that window.

Each chapter starts out with a theme, but as the web of connections spreads, we get into material and individuals covered elsewhere, and there is little discipline in simply cross-referencing them or trusting the reader to recall their earlier mention. And when there are cross-references, they are heavy handed. For example at the start of chapter 12, they write: “Two of the architects of that campaign, and veterans of Obama's U.S. senatorial campaign—David Axelrod and Valerie Jarrett—were discussed by the authors in detail in Chapter 10 of this book.” Hello, is there an editor in the house? Who other than “the authors” would have discussed them, and where else than in “this book”? And shouldn't an attentive reader be likely to recall two prominent public figures discussed “in detail” just two chapters before?

The publisher's description promises much, including “Obama's mysterious college years unearthed”, but very little new information is delivered, and most of the book is based on secondary sources, including blog postings the credibility of which the reader is left to judge. Now, I did not find much to quibble about, but neither did I encounter much material I did not already know, and I've not obsessively followed Obama. I suppose that people who exclusively get their information from the legacy media might be shocked by what they read here, but most of it has been widely mentioned since Obama came onto the radar screen in 2007. The enigmatic lacunæ in Obama's paper trail (SAT and LSAT scores, college and law school transcripts, etc.) are mentioned here, but remain mysterious.

If you're interested in this topic, I'd recommend giving this book a miss and instead starting with the Barack Obama page on David Horowitz's Discover the Networks site, following the links outward from there. Horowitz literally knows the radical left inside and out: the son of two members of the Communist Party of the United States, he was a founder of the New Left and editor of Ramparts magazine. Later, repelled by the murderous thuggery of the Black Panthers, he began to re-think his convictions and has since become a vocal opponent of the Left. His book, Radical Son (March 2007), is an excellent introduction to the Old and New Left, and provides insight into the structure and operation of the leftists behind and within the Obama administration.

 Permalink

Gingrich, Newt with Joe DeSantis et al. To Save America. Washington: Regnery Publishing, 2010. ISBN 978-1-59698-596-4.
In the epilogue of Glenn Beck's The Overton Window (June 2010), he introduces the concept of a “topical storm”, defined as “a state in which so many conflicting thoughts are doing battle in your brain that you lose your ability to discern and act on any of them.” He goes on to observe that:

This state was regularly induced by PR experts to cloud and control issues in the public discourse, to keep thinking people depressed and apathetic on election days, and to discourage those who might be tempted to actually take a stand on a complex issue.

It is easy to imagine responsible citizens in the United States, faced with a topical storm of radical leftist “transformation” unleashed by the Obama administration and its Congressional minions, combined with a deep recession, high unemployment, impending financial collapse, and empowered adversaries around the world, falling into a lethargic state where each day's dismaying news simply deepens the depression and sense of powerlessness and hopelessness. Whether deliberately intended or not, this is precisely what the statists want, and it leads to a citizenry reduced to a despairing passivity as the chains of dependency are fastened about them.

This book is a superb antidote for those in topical depression, and provides common-sense and straightforward policy recommendations which can gain the support of the majorities needed to put them into place. Gingrich begins by surveying the present dire situation in the U.S. and what is at stake in the elections of 2010 and 2012, which he deems the most consequential elections in living memory. Unless stopped by voters at these opportunities, what he describes as a “secular-socialist machine” will be able to put policies in place which will restructure society in such a way as to create a dependent class of voters who will reliably return their statist masters to power for the foreseeable future, or at least until the entire enterprise collapses (which may be sooner, rather than later, but should not be wished for by champions of individual liberty as it will entail human suffering comparable to a military conquest and may result in replacement of soft tyranny by that of the jackbooted variety).

After describing the hole the U.S. have dug themselves into, the balance of the book contains prescriptions for getting out. The situation is sufficiently far gone, it is argued, that reforming the present corrupt bureaucratic system will not suffice—a regime pernicious in its very essence cannot be fixed by changes around the margin. What is needed, then, is not reform but replacement: repealing or sunsetting the bad policies of the present and replacing them with ones which make sense. In certain domains, this may require steps which seem breathtaking to present day sensibilities, but when something reaches its breaking point, drastic things will happen, for better or for worse. For example, what to do about activist left-wing Federal judges with lifetime tenure, who negate the people's will expressed through their elected legislators and executive branch? Abolish their courts! Hey, it worked for Thomas Jefferson, why not now?

Newt Gingrich seeks a “radical transformation” of U.S. society no less than does Barack Obama. Unlike Obama's, however, his prescriptions, if not his objectives, are mostly relatively subtle changes at the margin which will shift incentives in such a way that the ultimate goal will become inevitable in the fullness of time. One of the key formative events in Gingrich's life was the fall of the French Fourth Republic in 1958, which he experienced first hand while his career military stepfather was stationed in France. This acquainted him both with the possibility of unanticipated discontinuous change when the unsustainable can no longer be sustained, and with the risk of a society with a long tradition of republican government and recent experience with fascist tyranny welcoming with popular acclaim what amounted to a military dictator as an alternative to chaos. Far better to reset the dials so that the society will start heading in the right direction, even if it takes a generation or two to set things aright (after all, depending on how you count, it's taken between three and five generations to dig the present hole) than to roll the dice and hope for the best after the inevitable (should present policies continue) collapse. That, after all, didn't work out too well for Russia, Germany, and China in the last century.

I have cited the authors in the manner above because a number of the chapters on specific policy areas are co-authored with specialists in those topics from Gingrich's own American Solutions and other organisations.

 Permalink

July 2010

Lewis, Michael. The Big Short. New York: W. W. Norton, 2010. ISBN 978-0-393-07223-5.
After concluding his brief career on Wall Street in the 1980s, the author wrote Liar's Poker, a memoir of a period of financial euphoria and insanity which he assumed would come crashing down shortly after his timely escape. Who could have imagined that the game would keep on going for two decades more, in the process raising the stakes from mere billions to trillions of dollars, extending its tendrils into financial institutions around the globe, and fuelling real estate and consumption bubbles in which individuals were motivated to lie to obtain money they couldn't pay back to lenders who were defrauded as to the risk they were taking?

Most descriptions of the financial crisis which erupted in 2007 and continues to play out at this writing gloss over the details, referring to “arcanely complex transactions that nobody could understand” or some such. But, in the hands of a master explainer like the author, what happened isn't at all difficult to comprehend. Irresponsible lenders (in some cases motivated by government policy) made mortgage loans, with an initial “teaser” rate of interest, to individuals who could not afford them. The only way the borrower could avoid default when the interest rate “reset” to market rates was to refinance the property, paying off the original loan. But since housing prices were rising rapidly, and everybody knew that real estate prices never fall, by that time the house would have appreciated in value, giving the “homeowner” equity in the house which would justify a higher grade mortgage the borrower could afford to pay. Naturally, this flood of money into the housing market accelerated the bubble in housing prices, and encouraged lenders to create ever more innovative loans in the interest of “affordable housing for all”, including interest-only loans, those with variable payments where the borrower could actually increase the principal amount by underpaying, no-money-down loans, and “liar loans” which simply accepted the borrower's claims of income and net worth without verification.
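
To get a sense for why everything hinged upon refinancing, here is a toy calculation using the standard mortgage amortisation formula; the loan size and rates are my own illustrative assumptions, not figures from the book.

    # Monthly payment on principal L at monthly rate r for n months:
    #   L * r / (1 - (1 + r) ** -n)
    # Loan size and rates below are illustrative assumptions only.
    def monthly_payment(principal, annual_rate, years=30):
        r, n = annual_rate / 12, years * 12
        return principal * r / (1 - (1 + r) ** -n)

    print(round(monthly_payment(300_000, 0.02)))  # ~1109 at a 2% teaser rate
    print(round(monthly_payment(300_000, 0.07)))  # ~1996 after reset to 7%

A payment which nearly doubles at reset is sustainable only if the house has appreciated enough to refinance, which is why the whole edifice rested on the assumption that prices never fall.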

But what financial institution would be crazy enough to undertake the risk of carrying these junk loans on its books? Well, that's where the genius of Wall Street comes in. The originators of these loans, immediately after collecting the loan fee, bundled them up into “mortgage-backed securities” and sold them to other investors. The idea was that by aggregating a large number of loans into a pool, the risk of default, estimated from historical rates of foreclosure, would be spread just as insurance spreads the risk of fire and other damages. Further, the mortgage-backed securities were divided into “tranches”: slices which bore the risk of default in serial order. If you assumed, say, a 5% rate of default on the loans making up the security, the top-level tranche would have little or no risk of default, and the rating agencies concurred, giving it the same AAA rating as U.S. Treasury Bonds. Buyers of the lower-rated tranches, all the way down to the lowest investment grade of BBB, were compensated for the risk they were assuming by higher interest rates on the bonds. In a typical deal, if 15% of the mortgages defaulted, the BBB tranche would be completely wiped out.
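
The serial-order loss allocation is easy to see in a toy “waterfall” computation; the tranche sizes here are my own illustrative assumptions (a three-slice deal), and defaults are equated with losses, i.e. zero recovery is assumed.

    # Allocate pool losses to tranches from most junior to most senior.
    # Sizes and the 15% loss scenario are illustrative assumptions.
    def allocate_losses(tranches, loss):
        """tranches: (name, size) pairs, ordered junior to senior."""
        for name, size in tranches:
            hit = min(size, loss)
            loss -= hit
            print(f"{name}: absorbs {hit} ({hit / size:.0%} of tranche gone)")

    # A 100-unit pool: the BBB slice takes the first 10 units of losses.
    allocate_losses([("BBB", 10), ("A", 20), ("AAA", 70)], loss=15)
    # BBB: absorbs 10 (100% of tranche gone)
    # A: absorbs 5 (25% of tranche gone)
    # AAA: absorbs 0 (0% of tranche gone)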

Now, you may ask, who would be crazy enough to buy the BBB bottom-tier tranches? This indeed posed a problem to Wall Street bond salesmen (who are universally regarded as the sharpest-toothed sharks in the tank). So, they had the back-office “quants” invent a new kind of financial derivative, the “collateralised debt obligation” (CDO), which bundled up a whole bunch of these BBB tranche bonds into a pool, divided it into tranches, et voilà, the rating agencies would rate the lowest risk tranches of the pool of junk as triple A. How to get rid of the riskiest tranches of the CDO? Lather; rinse; repeat.
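
Why the re-slicing passed muster, and why it was doomed, comes down to correlation. A minimal sketch, under assumptions of my own invention: if failures of the pooled BBB pieces are independent, as the rating models effectively assumed, the senior slice of the CDO is essentially never touched; if they all depend on the same housing market, it is exactly as risky as the junk it was assembled from.

    import random

    # Toy CDO: 100 BBB pieces, each failing with probability 5%; the
    # senior slice takes losses only if more than 30 pieces fail.
    def senior_loss_prob(correlated, p=0.05, trials=20_000):
        hits = 0
        for _ in range(trials):
            if correlated:
                # One shared factor: a housing bust sinks every piece at once.
                failures = 100 if random.random() < p else 0
            else:
                # Independent failures, as the rating models assumed.
                failures = sum(random.random() < p for _ in range(100))
            hits += failures > 30
        return hits / trials

    print(senior_loss_prob(correlated=False))  # ~0.0  : looks like AAA
    print(senior_loss_prob(correlated=True))   # ~0.05 : as risky as one BBB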

Investors worried about the risk of default in these securities could insure against them by purchasing a “credit default swap”, which is simply an insurance contract which pays off if the bond it insures is not repaid in full at maturity. Insurance giant AIG sold tens of billions of these swaps, with premiums ranging from a fraction of a percent on the AAA tranches to on the order of two percent on BBB tranches. As long as the bonds did not default, these premiums were a pure revenue stream for AIG, which went right to the bottom line.
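
To see how asymmetric such a trade is, take some illustrative numbers of my own (not the book's): collecting 0.15% per year on $20 billion of notional AAA protection brings in $30 million annually of apparently free money, while a default on the insured bonds obliges a payout of the full $20 billion, more than six centuries' worth of premiums.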

As long as the housing bubble continued to inflate, this created an unlimited supply of AAA rated securities, rated as essentially without risk (historical rates of default on AAA bonds are about one in 100,000), ginned up on Wall Street from the flakiest and shakiest of mortgages. Naturally, this caused a huge flow of funds into the housing market, which kept the bubble expanding ever faster.

Until it popped.

Testifying before a hearing by the U.S. House of Representatives on October 22nd, 2008, Deven Sharma, president of Standard & Poor's, said, “Virtually no one—be they homeowners, financial institutions, rating agencies, regulators, or investors—anticipated what is occurring.” Notwithstanding the claim of culpable clueless clown Sharma, there was a small cadre of insightful investors who saw it all coming, had the audacity to take a position against the consensus of the entire financial establishment—in truth a bet against the Western world's financial system—and the courage to hang in there, against gnawing self-doubt (“Can I really be right and everybody else wrong?”) and skittish investors, to finally cash out on the trade of the century. This book is their story. Now, lots of people knew well in advance that the derivatives-fuelled housing bubble was not going to end well: I have been making jokes about “highly-leveraged financial derivatives” since at least 1996. But it's one thing to see an inevitable train wreck coming and entirely another to figure out approximately when it's going to happen, discover (or invent) the financial instruments with which to speculate upon it, put your own capital and reputation on the line making the bet, persist in the face of an overwhelming consensus that you're not only wrong but crazy, and finally cash out in a chaotic environment where there's a risk your bets won't be paid off due to bankruptcy on the other side (counterparty risk) or government intervention.

As the insightful investors profiled here dug into the details of the fairy castle of mortgage-backed securities, they discovered that it wouldn't even take a decline in housing prices to cause defaults sufficient to wipe out the AAA rated derivatives: a mere stagnation in real estate prices would suffice to render them worthless. And yet even after prices in the markets most affected by the bubble had already levelled off, the rating agencies continued to deem the securities based on their mortgages riskless, and insurance against their default could be bought at nominal cost. And those who bought it made vast fortunes as every other market around the world plummeted.

People who make bets like that tend to be way out on the tail of the human bell curve, and their stories, recounted here, are correspondingly fascinating. This book reads like one of Paul Erdman's financial thrillers, with the difference that the events described are simultaneously much less probable and absolutely factual. If this were a novel and not reportage, I doubt many readers would find the characters plausible.

There are many lessons to be learnt here. The first is that humans, and therefore the financial markets in which they interact, frequently mis-estimate and incorrectly price the risk of low-probability outcomes: Black Swan (January 2009) events, and that investors who foresee them and can structure highly leveraged, long-term bets on them can do very well indeed. Second, Wall Street is just as predatory and ruthless as you've heard it to be: Goldman Sachs was simultaneously peddling mortgage-backed securities to its customers while its own proprietary traders were betting on them becoming worthless, and this is just one of a multitude of examples. Third, never assume that “experts”, however intelligent, highly credentialed, or richly compensated, actually have any idea what they're doing: the rating agencies grading these swampgas securities AAA had never even looked at the bonds from which they were composed, much less estimated the probability that an entire collection of mortgages made at the same time, to borrowers in similar circumstances, in the same bubble markets might all default at the same time.

We're still in the early phases of the Great Deleveraging, in which towers of debt which cannot possibly be repaid are liquidated through default, restructuring, and/or inflation of the currencies in which they are denominated. This book is a masterful and exquisitely entertaining exposition of the first chapter of this drama, and reading it is an excellent preparation for those wishing to ride out, and perhaps even profit from, the ongoing tragedy. I have just two words to say to you: sovereign debt.

 Permalink

Flynn, Vince. Extreme Measures. New York: Pocket Books, 2008. ISBN 978-1-4165-0504-4.
This is the ninth novel in the Mitch Rapp (warning—the article at this link contains minor spoilers) series and is perhaps the most politically charged of the saga so far. When a high-ranking Taliban commander and liaison to al-Qaeda is captured in Afghanistan, CIA agent Mike Nash begins an interrogation with the aim of uncovering a sleeper cell planning terrorist attacks in the United States, but is constrained in his methods by a grandstanding senator who insists that the protections of the Geneva Convention be applied to this non-state murderer. Frustrated, Nash calls in Mitch Rapp for a covert and intense debrief of the prisoner, but things go horribly wrong and Rapp ends up in the lock-up of Bagram Air Base charged with violence not only against the prisoner but also a U.S. Air Force colonel (who is one of the great twits of all time—one wonders even with a service academy ring how such a jackass could attain that rank).

Rapp finds himself summoned before the Senate Judiciary Committee to answer the charges and endure the venting of pompous gasbags which constitutes the bulk of such proceedings. This time, however, Rapp isn't having any. He challenges the senators directly, starkly forcing them to choose between legalistic niceties and defeating rogue killers who do not play by the rules. Meanwhile, the sleeper cell is activated and puts into motion its plot to wreak terror on the political class in Washington. With the authorities deprived of the information the Taliban captive could have supplied, the attack takes place, forcing politicians to realise that verbal virtuosity and grandstanding in front of cameras is no way to fight a war. Or at least for a moment, until they forget once again, and only as long as it is they who are personally threatened, not their constituents.

As Mitch Rapp becomes a senior figure and something of a Washington celebrity, Mike Nash is emerging as the conflicted CIA cowboy that Rapp was in the early books of the series. I suspect we'll see more and more of Nash in the future as Rapp recedes into the background.

 Permalink

Sowell, Thomas. Intellectuals and Society. New York: Basic Books, 2009. ISBN 978-0-465-01948-9.
What does it mean to be an intellectual in today's society? Well, certainly one expects intellectuals to engage in work which is mentally demanding, which many do, particularly within their own narrow specialities. But many other people perform work which is just as cognitively demanding: chess grandmasters, musical prodigies, physicists, engineers, and entrepreneurs, yet we rarely consider them “intellectuals” (unless they become “public intellectuals”, discussed below), and indeed “real” intellectuals often disdain their concern with the grubby details of reality.

In this book, the author identifies intellectuals as the class of people whose output consists exclusively of ideas, and whose work is evaluated solely upon the esteem in which it is held by other intellectuals. A chess player who loses consistently, a composer whose works summon vegetables from the audience, an engineer whose aircraft designs fall out of the sky are distinguished from intellectuals in that they produce objective results which succeed or fail on their own merits, and it is this reality check which determines the reputation of their creators.

Intellectuals, on the other hand, are evaluated and, in many cases, hired, funded, and promoted solely upon the basis of peer review, whether formal as in selection for publication, grant applications, or awarding of tenure, or informal: the estimation of colleagues and their citing of an individual's work. To anybody with the slightest sense of incentives, this seems a prescription for groupthink, and it is no surprise that the results confirm that supposition. If intellectuals were simply high-performance independent thinkers, you'd expect their opinions to vary all over the landscape (as is often the case among members of other mentally demanding professions). But in the case of intellectuals, as defined here, there is an overwhelming acceptance of the nostrums of the political left which appears to be unshakable regardless of how many times and how definitively they have been falsified and discredited by real world experience. But why should it be otherwise? Intellectuals themselves are not evaluated by the real world outcomes of their ideas, so it's only natural they're inclined to ignore the demonstrated pernicious consequences of the policies they advocate and bask instead in the admiration of their like-thinking peers. You don't find chemists still working with the phlogiston theory or astronomers fine-tuning geocentric models of the solar system, yet intellectuals elaborating Marxist theories are everywhere in the humanities and social sciences.

With the emergence of mass media in the 20th century, the “public intellectual” came into increasing prominence. These are people with distinguished credentials in a specialised field who proceed to pronounce upon a broad variety of topics in which their professional expertise provides them no competence or authority whatsoever. The accomplishments of Bertrand Russell in mathematics and philosophy, of Noam Chomsky in linguistics, or of Paul Ehrlich in entomology are beyond dispute. But when they walk onto the public stage and begin to expound upon disarmament, colonialism, and human population and resources, almost nobody in the media or political communities stops to ask just why their opinion should be weighed more highly than that of anybody else without specific expertise in the topic under discussion. And further, few go back and verify their past predictions against what actually happened. As long as the message is congenial to the audience, it seems like public intellectuals can get a career-long pass from checking their predictions against outcomes, even when the discrepancies are so great they would have caused a physical scientist to be laughed out of the field or an investor to have gone bankrupt. As biographer Roy Harrod wrote of eminent economist and public intellectual John Maynard Keynes:

He held forth on a great range of topics, on some of which he was thoroughly expert, but on others of which he may have derived his views from the few pages of a book at which he happened to glance. The air of authority was the same in both cases.
As was, of course, the attention paid by his audience.

Intellectuals, even when pronouncing within their area of specialisation, encounter the same “knowledge problem” Hayek identified in conjunction with central planning of economies. While the expert, or the central planning bureau, may know more about the problem domain than 99% of individual participants in the area, in many cases that expertise constitutes less than 1% of the total information distributed among all participants and expressed in their individual preferences and choices. A free market economy can be thought of as a massively parallel cloud computer for setting prices and allocating scarce resources. Its information is in the totality of the system, not in any particular place or transaction, and any attempt to extract that information by aggregating data and working on bulk measurements is doomed to failure both because of the inherent loss of information in making the aggregations and also because any such measure will be out of date long before it is computed and delivered to the would-be planner. Intellectuals have the same conceit: because they believe they know far more about a topic than the average person involved with it (and in this they may be right), they conclude that they know much more about the topic than everybody put together, and that if people would only heed their sage counsel much better policies would be put in place. In this, as with central planning, they are almost always wrong, and the sorry history of expert-guided policy should be adequate testament to its folly.
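
The cloud-computer analogy can be made concrete with a toy simulation (the numbers and the adjustment rule are my own illustration, not Sowell's or Hayek's): a thousand buyers each know only their own private valuation, nobody ever reports it to any central authority, and yet the feedback of price on excess demand converges on the clearing price which no planner was told enough to calculate.

    import random

    random.seed(1)
    # Each buyer's valuation is private information, never reported.
    valuations = [random.uniform(0, 100) for _ in range(1000)]
    supply = 400  # units available

    price = 50.0
    for _ in range(200):
        demand = sum(v > price for v in valuations)  # each agent acts locally
        price += 0.01 * (demand - supply)            # feedback on excess demand

    print(round(price, 1))  # ~60: the price at which about 400 buyers buy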

But it never is, of course. The modern administrative state and the intelligentsia are joined at the hip. Both seek to concentrate power, sucking it out from individuals acting at their own discretion in their own perceived interest, and centralising it in order to implement the enlightened policies of the “experts”. That this always ends badly doesn't deter them, because it's power they're ultimately interested in, not good outcomes. In a section titled “The Propagation of the Vision”, Sowell presents a bill of particulars as damning as that against King George III in the Declaration of Independence, and argues that modern-day intellectuals, burrowed within the institutions of academia, government, and media, are a corrosive force etching away the underpinnings of a free society. He concludes:

Just as a physical body can continue to live, despite containing a certain amount of microorganisms whose prevalence would destroy it, so a society can survive a certain amount of forces of disintegration within it. But that is very different from saying that there is no limit to the amount, audacity and ferocity of those disintegrative forces which a society can survive, without at least the will to resist.
In the past century, it has mostly been authoritarian tyrannies which have “cleaned out the universities” and sent their effete intellectual classes off to seek gainful employment in the productive sector, for example doing some of those “jobs Americans won't do”. Will free societies, whose citizens fund the intellectual class through their taxes, muster the backbone to do the same before intellectuals deliver them to poverty and tyranny? Until that day, you might want to install my “Monkeying with the Mainstream Media”, whose Red Meat edition translates “expert” to “idiot”, “analyst” to “moron”, and “specialist” to “nitwit” in Web pages you read.

An extended video interview with the author about the issues discussed in this book is available, along with a complete transcript.

 Permalink

Thor, Brad. Foreign Influence. New York: Atria Books, 2010. ISBN 978-1-4165-8659-3.
Thanks to the inexorable working of Jerry Pournelle's Iron Law of Bureaucracy, government agencies, even those most central to the legitimate functions of government and essential to its survival and the safety of the citizenry, will inevitably become sclerotic and ineffective, serving their employees at the expense of the taxpayers. The only way to get things done is for government to outsource traditionally governmental functions to private sector contractors, and recent years have seen even military operations farmed out to private security companies.

With the intelligence community having become so dysfunctional and hamstrung by feel-good constraints upon their actions and fear of political retribution against operatives, it is only natural that intelligence work—both collection and covert operations—will move to the private sector, and in this novel, Scot Harvath has left government service to join the shadowy Carlton Group, providing innovative services to the Department of Defense. Freed of bureaucratic constraints, Harvath's inner klootzak (read the book) is fully unleashed. Less than halfway into the novel, here's Harvath reporting to his boss, Reed Carlton:

“So let me get this straight,” said the Old Man. “You trunked two Basque separatists, Tasered a madam and a bodyguard—after she kicked your tail—then bagged and dragged her to some French farmhouse where you threatened to disfigure her, then iceboarded a concierge, shot three hotel security guards, kidnapped the wife of one of Russia's wealthiest mobsters, are now sitting in a hotel in Marseille waiting for a callback from the man I sent you over there to apprehend. Is that about right?”
Never a dull moment with the Carlton Group on the job!

Aggressive action is called for, because Harvath finds himself on the trail of a time-sensitive plot to unleash terror attacks in Europe and the U.S., launched by an opaque conspiracy where nothing is as it appears to be. Is this a jihadist plot, or the first volley in an asymmetric warfare conflict launched by an adversary, or a terror network hijacked by another mysterious non-state actor with its own obscure agenda? As Harvath follows the threads, two wisecracking Chicago cops moonlighting to investigate a hit and run accident stumble upon a domestic sleeper cell about to be activated by the terror network. And as the action becomes intense, we make the acquaintance of an Athena Team, an all-babe special forces outfit which is expected to figure prominently in the next novel in the saga and will doubtless improve the prospects of these books being picked up by Hollywood. With the clock ticking, these diverse forces (and at least one you'll never see coming) unite to avert a disastrous attack on American soil. The story is nicely wrapped up at the end, but the larger mystery remains to be pursued in subsequent books.

I find Brad Thor's novels substantially more “edgy” than those of Vince Flynn or Tom Clancy—like Ian Fleming, he's willing to entertain the reader with eccentric characters and situations even if they strain the sense of authenticity. If you enjoy this kind of thing—and I do, very much—you'll find this an entertaining thriller and a perfect “airplane book”, and will look forward to the next in the series. A podcast interview with the author is available.

 Permalink

August 2010

Lansing, Alfred. Endurance. New York: Carroll & Graf [1959, 1986] 1999. ISBN 978-0-7867-0621-1.
Novels and dramatisations of interplanetary missions, whether (reasonably) scrupulously realistic, highly speculative, or utterly absurd, often focus on the privation of their hardy crews and the psychological and interpersonal stresses they must endure when venturing so distant from the embrace of the planetary nanny state.

Balderdash! Unless a century of socialism succeeds in infantilising its subjects into pathetic, dependent, perpetual adolescents (see the last item cited above as an example), such voyages of discovery will be crewed by explorers: those pinnacles of the human species who volunteer to pay any price, bear any burden, and accept any risk to be among the first to see what's over the horizon.

This chronicle of Ernest Shackleton's Imperial Trans-Antarctic Expedition will acquaint you with real explorers, and leave you in awe of what those outliers on the bell curve of our species can and will endure in circumstances which almost defy description on the printed page.

At the very outbreak of World War I, Shackleton's ship, the Endurance, named after the motto of his family, Fortitudine vincimus: “By endurance we conquer”, sailed for Antarctica. The mission was breathtaking in its ambition: to land a party in the Vahsel Bay area of the Weddell Sea, which would cross the entire continent of Antarctica, proceeding to the South Pole with the resources landed from their ship, and then crossing to the Ross Sea with the aid of caches of supplies emplaced by a second party landing at McMurdo Sound. So difficult was the goal that Shackleton's expedition was attempting to accomplish that it was not achieved until 1957–1958, when the Commonwealth Trans-Antarctic Expedition made the crossing with the aid of motorised vehicles and aerial reconnaissance.

Shackleton's expedition didn't even manage to land on the Antarctic shore; the Endurance was trapped in the pack ice of the Weddell Sea in January 1915, and the crew were forced to endure the Antarctic winter on the ship, frozen in place. Throughout the long polar night, conditions were tolerable and morale was high, but much worse was to come. As the southern summer approached, the pack ice began to melt, break up, and grind floe against floe, and on 27th October 1915, the pressure of the ice against the ship became unsustainable and Shackleton gave the order to abandon ship and establish a camp on the ice floe, floating on the Weddell Sea. The original plan was to use the sled dogs and the men to drag supplies and the ship's three lifeboats across the ice toward a cache of supplies known to have been left at Paulet Island by an earlier expedition, but pressure ridges in the sea ice soon made it evident that such an ambitious traverse would be impossible, and the crew resigned themselves to camping on the ice pack, whose drift was taking them north, until its breakup would allow them to use the boats to make for the nearest land. And so they waited, until April 8th, 1916, when the floe on which they were camped began to break up and they were forced into the three lifeboats to head for Elephant Island, a forbidding and uninhabited speck of land in the Southern Ocean. After a harrowing six-day voyage, the three lifeboats arrived at the island, and for the first time in 497 days the crew of the Endurance were able to sleep on terra firma.

Nobody, not even the sealers and whalers operating off Antarctica, ever visited Elephant Island: Shackleton's crew were the first to land there. So the only hope of rescue was for a party to set out from there to the nearest reachable inhabited location, South Georgia Island, 1,300 kilometres across the Drake Passage, the stormiest and most treacherous sea on Earth. (There were closer destinations, but due to the winds and currents of the Southern Ocean, none of them were achievable in a vessel with the limited capabilities of their lifeboat.) Well, it had to be done, and so they did it. In one of the most remarkable achievements of seamanship of all time, Frank Worsley sailed his small open boat through these forbidding seas, surviving hurricane-force winds, rogue waves, and unimaginable conditions at the helm, arriving at almost a pinpoint landing on a tiny island in a vast sea with only his sextant and a pocket chronometer, the last remaining of the 24 the Endurance carried when it sailed from the Thames, worn around his neck to keep it from freezing.

But even then it wasn't over. Shackleton's small party had landed on the other side of South Georgia Island from the whaling station, and the state of their boat and prevailing currents and winds made it impossible to sail around the coast to there. So, there was no alternative but to go cross-country, across terrain completely uncharted (all maps showed only the coast, as nobody had ventured inland). And, with no other option, they did it. Since Shackleton's party, there has been only one crossing of South Georgia Island, done in 1955 by a party of expert climbers with modern equipment and a complete aerial survey of their route. They found it difficult to imagine how Shackleton's party, in their condition and with their resources, managed to make the crossing, but of course it was because they had to.

Then it was a matter of rescuing the party left at the original landing site on South Georgia, and then mounting an expedition to relieve those waiting at Elephant Island. The latter was difficult and frustrating—it was not until 30th August 1916 that Shackleton was able to take those he left on Elephant Island back to civilisation. And every single person who departed from South Georgia on the Endurance survived the expedition and returned to civilisation. All suffered from the voyage, but only stowaway Perce Blackborow lost the toes of a foot to frostbite; all the rest returned without consequences from their ordeal.

Bottom line—there were men on this expedition, and if similarly demanding expeditions in the future are crewed by men and women equal to their mettle, they will come through just fine without any of the problems the touchy-feely inkblot drones worry about. People with the “born as victim” self-image instilled by the nanny state are unlikely to qualify for such a mission, and should the all-smothering state manage to reduce its subjects to such larvæ, it is unlikely in the extreme that it would mount such a mission, choosing instead to huddle in its green enclaves powered by sewage and the unpredictable winds until the giant rock from the sky calls down the curtain on their fruitless existence.

I read the Kindle edition; unless you're concerned with mass and volume taking this book on a long trip (for which it couldn't be more appropriate!), I'd recommend the print edition, which is not only less expensive (neglecting shipping charges), but also reproduces with much higher quality the many photographs taken by expedition photographer Frank Hurley and preserved through the entire ordeal.

 Permalink

Suarez, Daniel. Daemon. New York: Signet, 2009. ISBN 978-0-451-22873-4.
Ever since “giant electronic brains” came into the public consciousness in the 1940s and '50s, “the computers taking over” has been a staple of science fiction, thrillers, and dystopian novels. To anybody who knows anything about computers, most of these have fallen in the spectrum from implausible to laughably bad, primarily because their authors didn't understand computers, and attributed to them anthropomorphic powers they don't possess, or assumed they had ways to influence events in the real world which they don't.

Here we have a novel that gets it right, is not just a thoughtful exploration of the interaction of computers, networks, and society, but a rip-roaring thriller as well, and, remarkably, is a first novel. In it, Matthew Sobol, a computer game designer who parlayed his genius for crafting virtual worlds in which large numbers of individuals and computer-generated characters interact (massively multiplayer online role-playing games) into a global enterprise, CyberStorm Entertainment, and a personal fortune in the hundreds of millions of dollars, tragically dies of brain cancer at the age of 34.

Shortly after Sobol's death, two CyberStorm employees die in bizarre circumstances which, when police detective Pete Sebeck begins to investigate them with the aid of itinerant computer consultant and dedicated gamer Jon Ross, lead them to suspect that they are murders orchestrated, for no immediately apparent motive, from beyond the grave by Sobol, and carried out by processes, daemons, running on Internet-connected computers without the knowledge of the systems' owners. When the FBI, called in due to their computer forensics resources, attempts to raid Sobol's mansion, things go beyond catastrophically wrong, and it appears they're up against an adversary which has resources and capabilities which are difficult to even quantify and potential consequences for society which cannot be bounded.

Spoiler warning: Plot and/or ending details follow.  
Or maybe not. Before long evidence emerges that the Daemon is a hoax: a scam orchestrated by Sebeck and his mistress, who conned Sobol, whose cognitive faculties were failing as his disease progressed, in order to make a fortune in the stock market as CyberStorm's stock collapsed. This neatly wraps up the narrative, which is just what the police, FBI, and NSA want, and Sebeck is quickly convicted and finds himself on death row for the murders he was accused of having orchestrated. Some involved in the investigation doubt that this ties up all the loose ends, but their superiors put the kibosh on going public with their fears for the time-tested reason of “avoiding public panic”.

Meanwhile, curious things are happening in the worlds of online gaming, offshore Internet gambling and pornography businesses, pillars of the finance sector, media outlets, prisons, and online contract manufacturing. The plague of spam comes to an end in a cataclysmic event which many people on the receiving end may find entirely justified. As analysts at NSA and elsewhere put the pieces together, they begin to comprehend what they're up against and put together an above top secret task force to infiltrate and subvert the Daemon's activities. But in this wired world, it is difficult to keep anything off the record, especially when confronted by an adversary which, distributed on computers around the world, reading all Web sites and RSS feeds, and with its own stream of revenue and human agents which it rewards handsomely, is able to exert its power anywhere. It's a bit like God, when you think about it, or maybe what Google would like to become.

What makes the Daemon, and this book, so devilishly clever is that, in the words of the NSA analyst on its trail, “The Daemon is not an Internet worm or a network exploit. It doesn't hack systems. It hacks society.” Indeed, the Daemon is essentially a role playing game engine connected to the real world, with the ability to reward those humans who do its bidding with real world money, power, and prestige, not virtual credits in a game. Consider how much time and money highly intelligent people with limited social skills currently spend on online multiplayer games. Now imagine if the very best of them were recruited to deploy their talents in the world outside their parents' basements, and be compensated with wealth, independence, and power over others. Do you think there would be a shortage of people to do the Daemon's bidding, even without the many forms of coercion it could bring to bear on those who were unwilling?

Ultimately this book is about a phase change in the structure of human society brought about by the emergence of universal high bandwidth connectivity and distributed autonomous agents interacting with humans on an individual basis. From a pure Darwinian standpoint, might such a system be able to act, react, and mobilise resources so quickly and efficiently that it would run rings around the strongly hierarchical, coercive, and low bandwidth forms of organisation which have characterised human society for thousands of years? And if so, what could the legacy society do to stop it, particularly once it has become completely dependent upon the technologies which now are subverting and supplanting it?

Spoilers end here.  
When I say the author gets it right, I'm not claiming the plot is actually plausible or that something like this could happen in the present or near future—there are numerous circumstances where a reader with business or engineering experience will be extremely sceptical that so many intricate things which have never before been tested on a full scale (or at all) could be expected to work the first time. After all, multi-player online games are not opened to the public before extensive play testing and revision based upon the results. But lighten up: this is a thriller, not a technological forecast, and the price of admission in suspension of disbelief is much the same as for other, more conventional thrillers. Where the book gets it right is that when discussing technical details, terminology is used correctly, descriptions are accurate, and speculative technologies at least have prototypes already demonstrated. Many books of this genre simply fall into the trap of Star Trek-like technobabble or endow their technological gadgets with capabilities nobody would have any idea how to implement today. In many stories in which technology figures prominently, technologically knowledgeable readers find themselves constantly put off by blunders which aren't germane to the plot but are simply indicative of ignorance or sloppiness on the part of the author; that doesn't happen here. One of the few goofs I noticed was in chapter 37 where one of the Daemon's minions receives “[a] new 3-D plan file … then opened it in AutoCAD. It took several seconds, even on his powerful Unix workstation.” In fact, AutoCAD has run only on Microsoft platforms for more than a decade, and that isn't likely to change. But he knows about AutoCAD, not to mention the Haas Mini Mill.

The novel concludes with a rock 'em, sock 'em action scene which is going to be awe inspiring when this book is made into a movie. Rumour is that Paramount Pictures has already optioned the story, and they'll be fools if they don't proceed with production for the big screen. At the end of the book the saga is far from over, but it ends at a logical point and doesn't leave you with a cliffhanger. Fortunately, the sequel, Freedom™, is already out in hardcover and is available in a Kindle edition.

 Permalink

Reich, Eugenie Samuel. Plastic Fantastic. New York: St. Martin's Press, 2009. ISBN 978-0-230-62384-2.
Boosters of Big Science, and the politicians who rely upon its pronouncements to justify their policy prescriptions, often cite the self-correcting nature of the scientific process: peer review subjects the work of researchers to independent and dispassionate scrutiny before results are published, and should an incorrect result make it into print, the failure of independent researchers to replicate it will inevitably call it into question and eventually cause it to be refuted.

Well, that's how it works in theory. Theory is very big in contemporary Big Science. This book is about how things work in fact, in the real world, and it's quite a bit different. At the turn of the century, there was no hotter property in condensed matter physics than Hendrik Schön, a junior researcher at Bell Labs who, in rapid succession, reported breakthroughs in electronic devices fabricated from organic molecules, including:

  • Organic field effect transistors
  • Field-induced superconductivity in organic crystals
  • Fractional quantum Hall effect in organic materials
  • Organic crystal laser
  • Light emitting organic transistor
  • Organic Josephson junction
  • High temperature superconductivity in C60
  • Single electron organic transistors

In the year 2001, Schön published papers in peer reviewed journals at a rate of one every eight days, with many reaching the empyrean heights of Nature, Science, and Physical Review. Other labs were in awe of his results, and puzzled because every attempt they made to replicate his experiments failed, often in ways which suggested that his published descriptions of the experiments were insufficient for others to replicate them. Theorists also raised their eyebrows at Schön's results, because he claimed breakdown properties of sputtered aluminium oxide insulating layers far beyond measured experimental results, and behaviour of charge transport in his organic substrates which didn't make any sense according to the known properties of such materials.

The experimenters were in a tizzy, trying to figure out why they couldn't replicate Schön's results, while the theorists were filling blackboards trying to understand how his incongruous results could possibly make sense. His superiors were basking in his ascendance into the élite of experimental physicists and the glory it reflected upon their laboratory.

In April 2002, while waiting in the patent attorney's office at Bell Labs, researchers Julia Hsu and Lynn Loo were thumbing through copies of Schön's papers they'd printed out as background documentation for the patent application they were preparing, when Loo noticed that two graphs of inverter outputs, one in a Nature paper describing a device made of a layer of thousands of organic molecules, and another in a Science paper describing an inverter made of just one or two active molecules, were identical, right down to the instrumental noise. When this was brought to the attention of Schön's manager and word of possible irregularities in Schön's publications began to make its way through the condensed matter physics grapevine, his work was subjected to intense scrutiny both within Bell Labs and by outside researchers, and additional instances of identical graphs re-labelled for entirely different experiments came to hand. Bell Labs launched a formal investigation in May 2002, which concluded, in a report issued the following September, that Schön had committed at least 16 instances of scientific misconduct, fabricating the experimental data he reported from mathematical functions, with no evidence whatsoever that he had ever built the devices he claimed to have, or performed the experiments described in his papers. A total of twenty-one papers authored by Schön in Science, Nature, and Physical Review were withdrawn, as well as a number in less prestigious venues.
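
In outline, the tell that Loo spotted by eye is easy to automate: two genuinely independent measurements may agree in overall shape, but their instrumental noise should not match point for point. Here is a minimal sketch of such a check in Python (my illustration, not anything from the book; the synthetic “inverter” curve, the noise level, and the smoothing window are all arbitrary assumptions):

    # Flag two "independent" data sets whose noise is suspiciously identical.
    # Independent instruments give uncorrelated residuals once the smooth
    # trend is removed; a copied (even rescaled) data set gives residuals
    # with correlation near 1.0.
    import numpy as np

    def noise_correlation(y1, y2, window=11):
        kernel = np.ones(window) / window
        r1 = y1 - np.convolve(y1, kernel, mode="same")   # residual noise
        r2 = y2 - np.convolve(y2, kernel, mode="same")
        return np.corrcoef(r1, r2)[0, 1]

    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 1.0, 500)
    curve = np.tanh(10.0 * (x - 0.5))              # idealised transfer curve
    run_a = curve + rng.normal(0.0, 0.02, x.size)  # honest measurement
    run_b = curve + rng.normal(0.0, 0.02, x.size)  # honest re-measurement
    faked = 1.05 * run_a                           # "new device": rescaled copy

    print(noise_correlation(run_a, run_b))  # near 0: independent noise
    print(noise_correlation(run_a, faked))  # near 1: identical noise, red flag

The detrending is crude, but it makes the point: residual noise correlation near 1.0 between supposedly independent experiments is precisely the sort of red flag which nobody in the review chain thought to look for.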

What is fascinating in this saga of flat-out fraud and ultimate exposure and disgrace is how completely the much-vaunted system of checks and balances of industrial scale Big Science and peer review in the most prestigious journals fell on its face at the hands of a fraudster in a junior position with little or no scientific track record who was willing to make up data to confirm the published expectations of the theorists, and figured out how to game the peer review system, using criticisms of his papers as a guide to make up additional data to satisfy the objections of the referees. As a former manager of a group of ambitious and rambunctious technologists, what strikes me is how utterly Schön's colleagues and managers at Bell Labs failed in overseeing his work and vetting his results. “Extraordinary claims require extraordinary evidence”, and Schön was making and publishing extraordinary claims at the rate of almost one a week in 2001, and yet not once did anybody at Bell Labs insist on observing him perform one of the experiments he claimed to have carried out, even after other meticulous experimenters in laboratories around the world reported that they were unable to replicate his results. Think about it—if a junior software developer in your company claimed to have developed a miraculous application, wouldn't you want to see a demo before issuing a press release about it and filing a patent application? And yet nobody at Bell Labs thought to do so with Schön's work.

The lessons from this episode are profound, and I see little evidence that they have been internalised by the science establishment. A great deal of experimental science is now guided by the expectations of theorists; it is difficult to obtain funding for an experimental program which looks for effects not anticipated by theory. In such an environment, an unscrupulous scientist willing to make up data that conforms to the prejudices of the theorists may be able to publish in prestigious journals and be considered a rising star of science based on an entirely fraudulent corpus of work. Because scientists, especially in the Anglo-Saxon culture, are loath to make accusations of fraud (as the author notes, in the golden age of British science such an allegation might well result in a duel being fought), failure to replicate experimental results is often assumed to be a failure by the replicator to precisely reproduce the circumstances of the original investigator, not to call into question the veracity of the reported work. Schön's work consisted of desktop experiments involving straightforward measurements of electrical properties of materials, which were about as simple as anything in contemporary science to evaluate and independently replicate. Now think of how vulnerable research on far less clear cut topics such as global climate, effects of diet on public health, and other topics would be to fraudulent, agenda-driven “research”. Also, Schön got caught only because he became sloppy in his frenzy of publication, duplicating graphs and data sets from one paper to another. How long could a more careful charlatan get away with it?

Quite aside from the fascinating story and its implications for the integrity of the contemporary scientific enterprise, this is a superbly written narrative which reads more like a thriller than an account of a regrettable episode in science. But it is entirely factual, and documented with extensive end notes citing original sources.


September 2010

Miller, Richard L. Under The Cloud. The Woodlands, TX: Two Sixty Press, [1986] 1991. ISBN 978-1-881043-05-8.
Folks born after the era of atmospheric nuclear testing, and acquainted with it only through accounts written decades later, are prone to react with bafflement—“What were they thinking?” This comprehensive, meticulously researched, and thoroughly documented account of the epoch not only describes what happened and what the consequences were for those in the path of fallout, but also places events in the social, political, military, and even popular culture context of that very different age. A common perception about the period is “nobody really understood the risks”. Well, it's quite a bit more complicated than that, as you'll understand after reading this exposition. As early as 1953, when ranchers near Cedar City, Utah lost more than 4000 sheep and lambs after they grazed on grass contaminated by fallout, investigators discovered the consequences of ingestion of Iodine-131, which is concentrated by the body in the thyroid gland, where it can lead not only to thyroid cancer but also to faster-developing metabolic diseases. The AEC reacted immediately to this discovery. Commissioner Eugene Zuckert observed that “In the present frame of mind of the public, it would only take a single illogical and unforeseeable incident to preclude holding any future tests in the United States”, and hence the author of the report on the incident was ordered to revise the document, “eliminating any reference to radiation damage or effects”. In subsequent meetings with the farmers, the AEC denied any connection between fallout and the death of the sheep and refused compensation, claiming that the sheep, including grotesquely malformed lambs born to irradiated ewes, had died of “malnutrition”.

It was obvious to others that something serious was happening. Shortly after bomb tests began in Nevada, the Eastman Kodak plant in Rochester, New York, which manufactured X-ray film, discovered that when a fallout cloud was passing overhead their film batches would be ruined by pinhole fogging due to fallout radiation, and that they could not even package the film in cardboard supplied by a mill whose air and water supplies were contaminated by fallout. Since it was already known that radiologists with occupational exposure to X-rays had mean lifespans several years shorter than the general public, it was pretty obvious that exposing much of the population of a continent (and to a lesser extent the entire world) to a radiation dose which could ruin X-ray film had to be problematic at best and recklessly negligent at worst. And yet the tests continued, both in Nevada and the Pacific, until the Limited Test Ban Treaty between the U.S., USSR, and Great Britain was adopted in 1963. France and China, not signatories to the treaty, continued atmospheric tests until 1974 and 1980 respectively.

What were they thinking? Well, this was a world in which the memory of a cataclysmic war which had killed tens of millions of people was fresh, which appeared to be on the brink of an even more catastrophic conflict, which might be triggered if the adversary developed a weapon believed to permit a decisive preemptive attack or victory through intimidation. In such an environment where everything might be lost through weakness and dilatory progress in weapons research, the prospect of an elevated rate of disease among the general population was weighed against the possibility of tens of millions of deaths in a general conflict, and the decision was made to pursue the testing. This may very well have been the correct decision—since you can't test a counterfactual, we'll never know—but there wasn't a general war between the East and West, and to this day no nuclear weapon has been used in war since 1945. But what is shocking and reprehensible is that the élites who made this difficult judgement call did not have the courage to share the facts with the constituents and taxpayers who paid their salaries and bought the bombs that irradiated their children's thyroids with Iodine-131 and bones with Strontium-90. (I'm a boomer. If you want to know just how many big boom clouds a boomer lived through as a kid, hold a sensitive radiation meter up to one of the long bones of the leg; you'll see the elevated beta radiation from the Strontium-90 ingested in milk and immured in the bones [Strontium is a chemical analogue of Calcium].) Instead, they denied the obvious effects, suppressed research which showed the potential risks, intimidated investigators exploring the effects of low level radiation, and covered up assessments of fallout intensity and effects upon those exposed. Thank goodness such travesties of science and public policy could not happen in our enlightened age! An excellent example of mid-fifties AEC propaganda is the Atomic Test Effects in the Nevada Test Site Region pamphlet, available on this site: “Your best action is not to be worried about fall-out. … We can expect many reports that ‘Geiger counters were going crazy here today.’ Reports like this may worry people unnecessarily. Don't let them bother you.”
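
To put a rough number on that legacy (my arithmetic, not the book's): strontium-90 decays with a half-life of about 28.8 years, so a childhood intake remains quite measurable decades later, even before considering how long the body retains it:

    # Fraction of a strontium-90 burden remaining after t years of physical
    # decay alone (biological elimination ignored; Sr-90 half-life ~28.8 y).
    HALF_LIFE = 28.8
    for years in (10, 25, 50):
        remaining = 0.5 ** (years / HALF_LIFE)
        print(f"after {years:2d} years: {remaining:.0%} remains")
    # after 10 years: 79% remains
    # after 25 years: 55% remains
    # after 50 years: 30% remains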

This book describes U.S. nuclear testing in Nevada in detail, even giving the precise path the fallout cloud from most detonations took over the country. Pacific detonations are covered in less detail, concentrating on major events and fallout disasters such as Castle Bravo. Soviet tests and the Chelyabinsk-40 disaster are covered more sketchily (fair enough—most details remained secret when the book was written), and British, French, and Chinese atmospheric tests are mentioned only in passing.

The paperback edition of this book has the hefty cover price of US$39.95, which is a lot for a book of 548 pages with just a few black and white illustrations. I read the Kindle edition, priced at US$11.99 at this writing, which is, on its merits, even more overpriced. It is a sad, sorry, and shoddy piece of work, which appears to be the result of scanning a printed edition of the book with an optical character recognition program and transferring it to Kindle format without any proofreading whatsoever. Numbers and punctuation are uniformly garbled, words are mis-recognised, random words are jammed into the text as huge raster images, page numbers and chapter headings are interleaved into the text, and hyphenated words are not joined while pairs of unrelated words are run together. The abundant end note citations are randomly garbled and not linked to the notes at the end of the book. The index is just a scan of that in the printed book, garbled, unlinked to the text, and utterly useless. Most public domain Kindle books sold for a dollar have much better production values than this full price edition. It is a shame that such an excellent work, into which the author put so much effort in research and storytelling, has been betrayed by this slapdash Kindle edition which will leave unwary purchasers feeling their pockets have been picked. I applaud Amazon's providing a way for niche publishers and independent authors to bring their works to market on the Kindle, but I wonder if their lack of quality control on the works published (especially at what passes for full price on the Kindle) might, in the end, injure the reputation of Kindle books among the customer base. After this experience, I know for sure that I will never again purchase a Kindle book from a minor publisher before checking the comments to see if the transfer merits the asking price. Amazon might also consider providing a feedback mechanism for Kindle purchasers to rate the quality of the transfer to the Kindle, which would appear along with the content-based rating of the work.


Walsh, Michael. Hostile Intent. New York: Pinnacle Books, 2009. ISBN 978-0-7860-2042-3.
Michael Walsh is a versatile and successful writer who has been a Moscow correspondent and music critic for Time magazine, and who has written a novel which is a sequel to Casablanca, four books about classical music, and the screenplay for what was, at the time, the highest rated original movie on the Disney Channel. Two of his books have been New York Times bestsellers, and his gangster novel And All the Saints won an American Book Award in 2004. This novel is the first of a projected series of five. The second, Early Warning, was released in September 2010.

In the present novel, the author turns to the genre of the contemporary thriller, adopting the template created by Tom Clancy, and used with such success by authors such as Vince Flynn and Brad Thor: a loner, conflicted agent working for a shadowy organisation, sent to do the dirty work on behalf of the highest levels of the government of the United States. In this case, the protagonist is known only as “Devlin” (although he assumes a new alias and persona every few chapters), whose parents were killed in a terrorist attack at the Rome airport in 1985 and who was raised as a covert instrument of national policy by a military man who has risen to become the head of the National Security Agency (NSA). Devlin works for the Central Security Service, a branch of the NSA which, in the novel, retains its original intent of being “Branch 4” of the armed forces, able to exploit information resources and execute covert operations outside the scope of conventional military actions.

The book begins with a gripping description of a Beslan-like school hostage attack in the United States in which Devlin is activated to take down the perpetrators. After achieving a mostly successful resolution, he begins to suspect that the entire event was simply a ruse to draw him into the open so that he could be taken down by his enemies. This supposition is confirmed, at least in his own justifiably paranoid mind, by further terrorist strikes in Los Angeles and London, which raise the stakes and further expose his identity and connections.

This is a story which starts strong but then sputters out as it unfolds. The original taut narrative of the school hostage crisis turns into mush with a shadowy supervillain who is kind of an evil George Soros (well, I mean an even more evil George Soros), a feckless and inexperienced U.S. president (well, at least that could never happen!), and Devlin, the über paranoid loner, suddenly betting everything on a chick he last met in a shoot-out in Paris.

Thrillers are supposed to thrill, but if set in the contemporary world or the near future (as is this book—the fall of Mugabe in Zimbabwe is mentioned, but everything is pretty much the same as the present), they're expected to be plausible as regards the technology used and the behaviour of the characters. It just doesn't do to have the hero, in a moment of crisis, when attacked by ten thousand AK-47 wielding fanatics from all directions, pull out his ATOMIC SPACE GUN and mow them down with a single burst.

But that's pretty much what happens here. I'll have to go behind the spoiler curtain to get into the details, so I'll either see you there or on the other side if you've decided to approach this novel freshly without my nattering over details.

Spoiler warning: Plot and/or ending details follow.  
  • We are asked to believe that a sitting U.S. president would order two members of his Secret Service detail to commit a cold blooded murder in order to frame a senator and manipulate his reelection campaign, and that the agents would carry out the murder. This is simply absurd.
  • As the story develops we learn that the shadowy “Branch 4” for which Devlin believes he is working does not, in fact, exist, and that Devlin is its sole agent, run by the director of NSA. Now Devlin has back-door access to all U.S. intelligence assets and databases and uses them throughout. How plausible is it that he wouldn't have figured this out himself?
  • Some people have cell phones: Devlin has a Hell phone. In chapter 7 we're treated to a description of Devlin's Black Telephone, which is equipped with “advanced voice-recognition software”, a fingerprint scanner in the receiver, and a retinal scanner in the handset. “If any of these elements were not sequenced within five seconds, the phone would self-destruct in a fireball of shrapnel, killing any unauthorized person unlucky enough to have picked it up.” Would you trust a government-supplied telephone bomb to work with 100% reliability? What if your stack of dossiers topples over and knocks off the receiver?
  • In several places “logarithm” is used where “algorithm” is intended. Gadgetry is rife with urban legends such as the computer virus which causes a hard drive to melt.
  • In chapter 12 the phone rings and Devlin “spoke into a Blu-Ray mouthpiece as he answered”. Blu-ray is an optical disc storage format; Bluetooth is the wireless peripheral technology. Besides, would an operative obsessed with security to the level of paranoia use a wireless headset with dubious anti-eavesdropping measures?
  • The coup de grâce of the series of terrorist attacks is supposed to be an electromagnetic pulse (EMP) attack against the United States, planned to knock out all electronics, communications, and electrical power in the eastern part of the country. The attack consists of detonating an ex-Soviet nuclear weapon raised to the upper atmosphere by a weather balloon launched from a ship off the East Coast. Where to begin? Well, first of all, at the maximum altitude reachable by a weather balloon, the mean free path of the gamma rays from the detonation through the atmosphere would be limited, as opposed to the unlimited propagation distance from an explosion in space well above the atmosphere. This would mean that any ionisation of atoms in the atmosphere would be a local phenomenon, which would reduce the intensity and scope of the generated pulse. Further, the electromagnetic pulse cannot propagate past the horizon, so even if a powerful pulse were generated at the altitude of a balloon, it wouldn't propagate far enough to cause a disaster all along the East Coast (see the back-of-the-envelope figures after this list).
  • In the assault on Clairvaux Prison, is it conceivable that an experienced special forces operator would take the mother of a hostage and her young son along aboard the helicopter gunship leading the strike?
  • After the fight in the prison, archvillain Skorenzy drops through a trap door and escapes to a bolt-hole, and at the end of the novel is still at large and presumed to be continuing his evil schemes. But his lair is inside a French maximum security prison! How does he get away? Say what you like about the French military, when it comes to terrorists they're deadly serious, right up there with the Mossad. Would a prison that housed Carlos the Jackal have a tunnel which would allow Skorenzy to saunter out? Would French officials allow the man who blew up a part of Los Angeles and brought down the London Eye with a cruise missile free passage?
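
On the electromagnetic pulse point, a line-of-sight calculation (mine, not the author's) puts numbers on the horizon objection. The distance to the horizon from a burst at altitude h over a spherical Earth is roughly the square root of 2Rh:

    # Line-of-sight ("radio horizon") distance from a burst at altitude h km:
    # d = sqrt(2 * R * h), with R the Earth's radius. A weather balloon tops
    # out around 35 km; a true high-altitude EMP shot would be hundreds up.
    from math import sqrt

    R = 6371.0                    # mean Earth radius, km
    for h in (35.0, 400.0):
        d = sqrt(2.0 * R * h)
        print(f"burst at {h:3.0f} km: horizon about {d:4.0f} km away")
    # burst at  35 km: horizon about  668 km away
    # burst at 400 km: horizon about 2258 km away

Even granting the balloon a generous 35 km ceiling, the pulse is geometrically confined to a circle some 670 km in radius, far short of blacking out the entire eastern part of the country.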
Spoilers end here.  
It's a tangled, muddled mess. It has its moments, but there isn't the build toward a climax and then the resolution one expects from a thriller. None of the characters are really admirable, and the author's policy preferences (with which I largely agree) are exhibited far too blatantly, as opposed to being woven into the plot. The author, accomplished in other genres, may eventually master the thriller, but I doubt I'll read any of the sequels to find out for myself.


October 2010

Sowell, Thomas. Dismantling America. New York: Basic Books, 2010. ISBN 978-0-465-02251-9.
Thomas Sowell has been, over his career, an optimist about individual liberty and economic freedom in the United States and around the world. Having been born in the segregated South, raised by a single mother in Harlem in the 1940s, he said that the progress he had observed in his own lifetime, rising from a high school dropout to the top of his profession, convinced him that America ultimately gets it right, and that opportunity for those who wish to advance through their own merit and hard work is perennial. In recent years, however, particularly since the rise and election of Barack Obama, his outlook has darkened considerably, almost approaching that of John Derbyshire. Do you think I exaggerate? Consider this passage from the preface:

No one issue and no one administration in Washington has been enough to create a perfect storm for a great nation that has weathered many storms in its more than two centuries of existence. But the Roman Empire lasted many times longer, and weathered many storms in its turbulent times—and yet it ultimately collapsed completely.

It has been estimated that a thousand years passed before the standard of living in Europe rose again to the level it had achieved in Roman times. The collapse of civilization is not just the replacement of rulers or institutions with new rulers and new institutions. It is the destruction of a whole way of life and the painful, and sometimes pathetic, attempts to begin rebuilding amid the ruins.

Is that where America is headed? I believe it is. Our only saving grace is that we are not there yet—and that nothing is inevitable until it happens.

Strong stuff! The present volume is a collection of the author's syndicated columns dating from before the U.S. election of 2008 into the first two years of the Obama administration. In them he traces how the degeneration and systematic dismantling of the underpinnings of American society which began in the 1960s culminated in the election of Obama, opening the doors to power to radicals hostile to what the U.S. has stood for since its founding and bent on its “fundamental transformation” into something very different. Unless checked by the elections of 2010 and 2012, Sowell fears the U.S. will pass a “point of no return” where a majority of the electorate will be dependent upon government largesse funded by a minority who pay taxes. I agree: I deemed it the tipping point almost two years ago.

A common theme in Sowell's writings of the last two decades has been how public intellectuals and leftists (but I repeat myself) attach an almost talismanic power to words and assume that good intentions, expressed in phrases that make those speaking them feel good about themselves, must automatically result in the intended outcomes. Hence the belief that a “stimulus bill” will stimulate the economy, a “jobs bill” will create jobs, that “gun control” will control the use of firearms by criminals, or that a rise in the minimum wage will increase the income of entry-level workers rather than price them out of the market and send their jobs to other countries. Many of the essays here illustrate how “progressives” believe, with the conviction of cargo cultists, that their policies will turn the U.S. from a social Darwinist cowboy capitalist society to a nurturing nanny state like Sweden or the Netherlands. Now, notwithstanding that the prospects of those two countries and many other European welfare states due to demographic collapse and Islamisation are dire indeed, the present “transformation” in the U.S. is more likely, in my opinion, to render it more like Perón's Argentina than France or Germany.

Another part of the “perfect storm” envisioned by Sowell is the acquisition of nuclear weapons by Iran, the imperative that will create for other states in the region to go nuclear, and the consequent possibility that terrorist groups will gain access to these weapons. He observes that Japan in 1945 was a much tougher nation than the U.S. today, yet only two nuclear bombs caused them to capitulate in a matter of days. How many cities would the U.S. have to lose? My guess is at least two but no more than five. People talk about there being no prospect of a battleship Missouri surrender in the War on Terror (or whatever they're calling it this week), but the prospect of a U.S. surrender on the carrier Khomeini in the Potomac is not as far fetched as you might think.

Sowell dashes off epigrams like others write grocery lists. Here are a few I noted:

  • One of the painful consequences of studying history is that it makes you realize how long people have been doing the same foolish things with the same disastrous results.
  • There is usually only a limited amount of damage that can be done by dull or stupid people. For creating a truly monumental disaster, you need people with high IQs.
  • Do not expect sound judgments in a society where being “non-judgmental” is an exalted value. As someone has said, if you don't stand for something, you will fall for anything.
  • Progress in general seems to hold little interest for people who call themselves “progressives”. What arouses them are denunciations of social failures and accusations of wrong-doing. One wonders what they would do in heaven.
  • In a high-tech age that has seen the creation of artificial intelligence by computers, we are also seeing the creation of artificial stupidity by people who call themselves educators.
  • Most people on the left are not opposed to freedom. They are just in favor of all sorts of things that are incompatible with freedom.
  • Will those who are dismantling this society from within or those who seek to destroy us from without be the first to achieve their goal? It is too close to call.

Since this is a collection of columns, you can read the book in any order you like (there are a few “arcs” of columns, but most are standalone), and pick it up and put it down whenever you like without missing anything. There is some duplication among the columns, but they never become tedious. As these are newspaper columns, there are no source citations or notes, and there is no index. What are present in abundance are Sowell's acute observations of the contemporary scene, historical perspective, rigorous logic, economic common sense, and crystal clear exposition. I had read probably 80% of these columns when they originally appeared, but gleaned many new insights revisiting them in this collection.

The author discusses the book, topics raised in it, and the present scene in an extended video interview, for which a transcript exists. A shorter podcast interview with the author is also available.


Flynn, Vince. Pursuit of Honor. New York: Pocket Books, 2009. ISBN 978-1-4165-9517-5.
This is the tenth novel in the Mitch Rapp (warning—the article at this link contains minor spoilers) saga, and the conclusion of the story which began in the previous volume, Extreme Measures (July 2010). In that book, a group of terrorists staged an attack in Washington D.C., with the ringleaders managing to disappear in the aftermath. In the present novel, it's time for payback, and Mitch Rapp and his team go on the trail not only of the terrorists but also of their enablers within the U.S. government.

The author says that you should be able to pick up and enjoy any of his novels without any previous context, but in my estimation you'll miss a great deal if you begin here without having read Extreme Measures. While an attempt is made (rather clumsily, it seemed to me) to fill the reader in on the events of the previous novel, those who start here will miss much of the character development of the terrorists Karim and Hakim, and the tension between Mitch Rapp and Mike Nash, whose curious parallels underlie the plot.

This is more a story of character development and conflict between personalities and visions than action, although it's far from devoid of the latter. There is some edgy political content in which I believe the author shows his contempt for certain factions and figures on the Washington scene, including “Senator ma'am”. The conclusion is satisfying although deliberately ambiguous in some regards. I appear to have been wrong in my review of Extreme Measures about where the author was taking Mike Nash, but then you never know.

This book may, in terms of the timeline, be the end of the Mitch Rapp series. Vince Flynn's forthcoming novel, American Assassin, is a “prequel”, chronicling Rapp's recruitment into the CIA, training, and deployment on his first missions. Still, it's difficult in the extreme to cork a loose cannon, so I suspect in the coming years we'll see further exploits by Mitch Rapp on the contemporary scene.


Mahoney, Bob. Damned to Heaven. Austin, TX: 1st World Publishing, 2003. ISBN 978-0-9718562-8-8.
This may be the geekiest space thriller ever written. The author has worked as a spaceflight instructor at NASA's Johnson Space Center in Houston for more than a decade, training astronauts and flight controllers in the details of orbital operations. He was Lead Instructor for the first Shuttle-Mir mission. He knows his stuff, and this book, which bristles with as many acronyms and as much NASA jargon as a Shuttle flight plan, gets the details right and only takes liberty with the facts where necessary to advance the plot. Indeed, it seems the author is on an “expanded mission” of his NASA career as an instructor to ensure that not only those he's paid to teach, but all readers of the novel know their stuff as well—he even distinguishes acronyms pronounced letter-by-letter (such as E.V.A.) and those spoken as words (like OMS), and provides pronunciation guides for the latter.

For a first time novelist, the author writes quite well, and there are only a few typographical and factual errors. Since the dialogue is largely air to ground transmissions or proceedings of NASA mission management meetings, it comes across as stilted, but is entirely authentic—that's how they talk. Character description is rudimentary, and character development as the story progresses almost nonexistent, but then most of the characters are career civil servants who have made it to the higher echelons of an intensely politically correct and meritocratic bureaucracy where mavericks or those even remotely interesting are ground down or else cut off and jettisoned. Again, not the usual dramatis personæ of a thriller, but pretty accurate.

So what about the story? A space shuttle bound for the International Space Station suffers damage to its thermal protection system which makes it impossible to reenter safely, and the crew takes refuge on the still incomplete Station, stretching its life support resources to the limit. A series of mishaps, which may seem implausible all taken together, but every one of which has actually occurred in U.S. and Soviet space operations over the last two decades, eliminates all of the rescue alternatives but one last, desperate Hail Mary option, which a flight director embraces, not out of boldness, but because there is no other way to save the crew. Trying to thwart the rescue is a malevolent force high in the NASA management hierarchy, bent on destroying the existing human spaceflight program in order that a better replacement may be born. (The latter might have seemed preposterous when the novel was published in 2003, but looking just at the results of NASA senior management decisions in the ensuing years, it's hard to distinguish the outcomes from those of having deliberate wreckers at the helm.)

The author had just about finished the novel when the Columbia accident occurred in February 2003. Had Columbia been on a mission to the Space Station, and had the damage to its thermal protection system been detected (which is probable, as it would have been visible as the shuttle approached the station), then the scenario here, or at least the first part, would have likely occurred. The author made a few changes to the novel post-Columbia; they are detailed in notes at the end.

As a thriller, this worked for me—I read the whole thing in three days and enjoyed the author's painting his characters into corner after corner and then letting them struggle to avert disaster due to the laws of nature, ambitious bureaucratic adversaries, and cluelessness and incompetence, in ascending order of peril to mission success and crew survival. I suspect many readers will consider this a bit much; recall that I used the word “geekiest” in the first sentence of these remarks. But unlike another thriller by a NASA engineer, I was never once tempted to hurl this one into the flame trench immediately before ignition.

If the events in this book had actually happened, and an official NASA historian had written an account of them some years later, it would probably read much like this book. That is quite an achievement, and the author has accomplished that rare feat of crafting a page-turner (at least for readers who consider “geeky” a compliment) which also gets the details right and crafts scenarios which are both surprising and plausible. My quibbles with the plot are not with the technical details but rather scepticism that the NASA of today could act as quickly as in the novel, even when faced with an existential threat to its human spaceflight program.


[Audiobook] Wolfe, Tom. I Am Charlotte Simmons. (Audiobook, Unabridged). New York: Macmillan Audio, 2004. ISBN 978-0-312-42444-2.
Thomas Sowell has written, “Each new generation born is in effect an invasion of civilization by little barbarians, who must be civilized before it is too late”. Tom Wolfe's extensively researched and pitch-perfect account of undergraduate life at an élite U.S. college in the first decade of the twenty-first century is a testament to what happens when the barbarians sneak into the gates of the cloistered cities of academe, gain tenure, and then turn the next generation of “little barbarians” loose into a state of nature, to do what their hormones and whims tell them to.

Our viewpoint into this alien world (which the children and grandchildren of those likely to be reading this chronicle inhabit, if they're lucky [?] enough to go to one of those élite institutions which groom them for entry into the New [or, as it is coming to be called, Ruling] Class at the cost of between a tenth and a quarter of a million dollars, often front-end loaded as debt onto the lucky students just emerging into those years otherwise best spent in accumulating capital to buy a house, start a family, and make the key early year investments in retirement and inheritance for their progeny) is Charlotte Simmons of Sparta, North Carolina, a Presidential Scholar from the hill country who, by sheer academic excellence, has won a full scholarship to Dupont University, known not only for its academic prestige, but also its formidable basketball team.

Before arriving at Dupont, Charlotte knew precisely who she was, what she wanted, and where she was going. Within days after arriving, she found herself in a bizarre mirror universe where everything she valued (and which the university purported to embody) was mocked by the behaviour of the students, professors, and administrators. Her discoveries are our discoveries of this alien culture which is producing those who will decide our fate in our old age. Worry!

Nobody remotely competes with Tom Wolfe when it comes to imbibing an alien culture, mastering its jargon and patois, and fleshing out the characters who inhabit it. Wolfe's talents are in full ascendance here, and this is a masterpiece of contemporary pedagogic anthropathology. We are doomed!

The audio programme is distributed in four files, running 31 hours and 16 minutes, and includes a brief interview with the author at the end. An Audio CD edition is available, as is a paperback print edition.


Shirer, William L. The Rise and Fall of the Third Reich. New York: Touchstone Books, [1959, 1960] 1990. ISBN 978-0-671-72868-7.
According to an apocryphal story, a struggling author asks his agent why his books aren't selling better, despite getting good reviews. The agent replies, “Look, the only books guaranteed to sell well are books about golf, books about cats, and books about Nazis.” Some authors have taken this too much to heart. When this massive cinder block of a book (1250 pages in the trade paperback edition) was published in 1960, its publisher did not believe a book about Nazis (or at least such a long one) would find a wide audience, and ordered an initial print run of just 12,500 copies. Well, it immediately went on to sell more than a million copies in hardback, and then another million in paperback (it was, at the time, the thickest paperback ever published). It has remained in print continuously for more than half a century, has been translated into a number of languages, and at this writing is in the top ten thousand books by sales rank on Amazon.com.

The author did not just do extensive research on Nazi Germany, he lived there from 1934 through 1940, working as a foreign correspondent based in Berlin and Vienna. He interviewed many of the principals of the Nazi regime and attended Nazi rallies and Hitler's Reichstag speeches. He was the only non-Nazi reporter present at the signing of the armistice between France and Germany in June 1940, and broke the news on CBS radio six hours before it was announced in Germany. Living in Germany, he was able to observe the relationship between ordinary Germans and the regime, but with access to news from the outside which was denied to the general populace by the rigid Nazi control of information. He left Germany in December 1940 when increasingly rigid censorship made it almost impossible to get accurate reporting out of Germany, and he feared the Gestapo were preparing an espionage case against him.

Shirer remarks in the foreword to the book that never before, and possibly never again, will historians have access to the kind of detailed information on the day-to-day decision making and intrigues of a totalitarian state that we have for Nazi Germany. Germans are, of course, famously meticulous record-keepers, and the rapid collapse and complete capitulation of the regime meant that those voluminous archives fell into the hands of the Allies almost intact. That, and the survival of diaries by a number of key figures in the senior leadership of Germany and Italy, provides a window into what those regimes were thinking as they drew plans which would lead to calamity for Europe and their ultimate downfall. The book is extensively footnoted with citations of primary sources, and footnotes expand upon items in the main text.

This book is precisely what its subtitle, “A History of Nazi Germany”, identifies it to be. It is not, and does not purport to be, an analysis of the philosophical origins of Nazism, an investigation of Hitler's personality, or a history of Germany's participation in World War II. The war years occupy about half of the book, but the focus is not on the actual conduct of the war but rather the decisions which ultimately determined its outcome, and the way (often bizarre) those decisions were made. I first read this book in 1970. Rereading it four decades later, I got a great deal more out of it than I did the first time, largely because in the intervening years I'd read many other books covering aspects of the period which Shirer's pure Germany-focused reportage does not explore in detail.

The book has stood up well to the passage of time. The only striking lacuna is that when the book was written the fact that Britain had broken the German naval Enigma cryptosystem, and was thus able to read traffic between the German admiralty and the U-boats, had not yet been declassified by the British. Shirer's coverage of the Battle of the Atlantic (which is cursory), thus attributes the success in countering the U-boat threat to radar, antisubmarine air patrols, and convoys, which were certainly important, but far from the whole story.

Shirer is clearly a man of the Left (he manages to work in a snarky comment about the Coolidge administration in a book about Nazi Germany), although no fan of Stalin, whom he rightly identifies as a monster. But I find that the author tangles himself up intellectually in trying to identify Hitler and Mussolini as “right wing”. Again and again he describes the leftist intellectual and political background of key figures in the Nazi and Fascist movements, and then tries to persuade us they somehow became “right wing” because they changed the colour of their shirts, even though the official platform and policies of the Nazi and Fascist regimes differed only in the details from those of Stalin, and even Stalin believed, by his own testimony, that he could work with Nazi Germany to the mutual benefit of both countries. It's worth revisiting Liberal Fascism (January 2008) for a deeper look at how collectivism, whatever the colour of the shirts or the emblem on the flags, stems from the same intellectual roots and proceeds to the same disastrous end point.

But these are quibbles about a monument of twentieth century reportage which has the authenticity of having been written by an eyewitness to many of the events described therein, the scholarship of extensive citations and quotations of original sources, and accessibility to the general reader. It is a classic which has withstood the test of time, and if I'm still around forty years hence, I'm sure I'll enjoy reading it a third time.


Codevilla, Angelo. The Ruling Class. New York: Beaufort Books, 2010. ISBN 978-0-8253-0558-0.
This slim volume (just 160 pages) is a somewhat expanded version of the author's much discussed essay with the same title which appeared in the July/August 2010 issue of The American Spectator. One of the key aspects of “American exceptionalism” over most of the nation's history has been something it didn't have but which most European and Asian nations did: a ruling class distinct from the general citizenry. Whether the ruling class was defined by heredity (as in Britain), or by meritocratic selection (as in France since the Revolution and Germany after Bismarck), most countries had a class of rulers who associated mostly with themselves, and considered themselves to uniquely embody the expertise and wisdom to instruct the masses (a word of which they tended to be fond) in how to live their lives.

In the U.S., this was much less the case. Before the vast centralisation and growth of the federal government in the New Deal and afterward, the country was mostly run by about fifty thousand people who got involved in grass roots public service: school boards, county commissions, and local political party organisations, from whom candidates for higher office were chosen based upon merit, service, and demonstrated track record. People who have come up by such a path will tend to be pretty well anchored to the concerns of ordinary citizens because they are ordinary citizens who have volunteered their time to get involved in res publica.

But with the grand centralisation of governance in Imperial Washington over the last century, a new kind of person was attracted to what used to be, and is still called, with exquisite irony, “public service”. These are people who have graduated from a handful of élite universities and law schools, and with the exception of perhaps a brief stint at a large law firm dealing mainly with the government, spent their entire careers in the public sector and its cloud of symbiotic institutions: regulatory agencies, appointed offices, elected positions, lobbying firms, and “non-governmental organisations” which derive their entire income from the government. These individuals make up what I have been calling, after Milovan Đilas, the New Class, but which Codevilla designates the Ruling Class in the present work.

In the U.S., entry to the ruling class is not, as it is in France, a meritocracy based on competitive examinations and performance in demanding academic institutions. Instead, it is largely a matter of who you, or your family, knows, what university you attended, and how well you conform to the set of beliefs indoctrinated there. At the centre of this belief system is the conviction that a modern nation is far too complicated to be governed by citizen-legislators chosen by ignorant rubes who didn't attend Harvard, Yale, Stanford, or one of the other ruling class feeder belts, but rather must be guided by enlightened experts like, well, themselves, and that all the ills of society can be solved by giving the likes of, well, themselves, more power over the population. They justify this by their reliance on “science” (the details of which they are largely ignorant), and hence they fund a horde of “scientists” who produce “studies” which support the policies they advocate.

Codevilla estimates that about a third of the U.S. population are either members of the ruling class (a small fraction), or aligned with its policies, largely due to engineered dependency on government programs. This third finds its political vehicle in the Democratic party, which represents their interests well. What about the other two thirds, which he dubs the “Country Class” (which I think is a pretty lame term, but no better comes immediately to mind)? Well, they don't have a political party at all, really. The Republican party is largely made up of ruling class people (think son of a president George W. Bush, or son of an admiral John McCain), and quickly co-opts outsiders who make it to Washington into the Imperial ruling class mindset.

A situation where one third of the population is dictating its will to the rest, and taxing a minority to distribute the proceeds to its electoral majority, in which only about a fifth of the population believes the federal government has the consent of the governed, and two thirds of the population have no effective political vehicle to achieve their agenda is, as Jimmy Carter's pollster Pat Caddell put it, pre-revolutionary. Since the ruling class has put the country on an unsustainable course, it is axiomatic that it will not be sustained. How it will end, however, is very much up in the air. Perhaps the best outcome would be a take-over of the Republican party by those genuinely representative of the “country party”, but that will be extremely difficult without a multitude of people (encouraged by their rulers toward passivity and resignation to the status quo) jumping into the fray. If the Republicans win a resounding victory in the elections of November 2010 (largely due to voters holding their noses and saying “they can't be worse than the current bums in office”) and then revert to ruling class business as usual, it's almost certain there will be a serious third party in play in 2012, not just at the presidential level (as the author notes, for a while in 1992, Ross Perot out-polled both the first Bush and Clinton before people concluded he was a flake with funny ears), but also in congressional races. If the Republicans are largely running in 2010 on a platform of, “Hey, at least we aren't the Democrats!”, then the cry in 2012 may be “We aren't either of those foul, discredited parties.”

As fiscally responsible people, let's talk about value for money. This book just doesn't cut it. You can read the original essay for free online. Although the arguments and examples therein are somewhat fleshed out in this edition, there's nothing essential you'll miss in reading the magazine essay instead of this book. Further, the 160 page book is padded—I can summon no kinder word—by inclusion of the full text of the Declaration of Independence and U.S. Constitution. Now, these are certainly important documents, but it's not like they aren't readily available online, nor that those inclined to read the present volume are unfamiliar with them. I think their presence is mostly due to the fact that were they elided, the book would be a mere hundred pages and deemed a pamphlet at best.

This is an enlightening and important argument, and I think spot-on in diagnosing the central problem which is transforming the U.S. from an engine of innovation and productivity into a class warfare redistributive nanny state. But save your money and read the magazine article, not the book.


McGovern, Patrick E. Uncorking the Past. Berkeley: University of California Press, 2009. ISBN 978-0-520-25379-7.
While a variety of animals are attracted to and consume the alcohol in naturally fermented fruit, only humans have figured out how to promote the process, producing wine from fruit and beer from cereal crops. And they've been doing it since at least the Neolithic period: the author discovered convincing evidence of a fermented beverage in residues on pottery found at the Jiahu site in China, inhabited between 7000 and 5800 B.C.

Indeed, almost every human culture which had access to fruits or grains which could be turned into an alcoholic beverage did so, and made the production and consumption of spirits an important part of their economic and spiritual life. (One puzzle is why the North American Indians, who lived among an abundance of fermentable crops, never did—there are theories that tobacco and hallucinogenic mushrooms supplanted alcohol for shamanistic purposes, but basically nobody really knows.)

The author is a pioneer in the field of biomolecular archæology and head of the eponymous laboratory at the University of Pennsylvania Museum of Archæology and Anthropology; in this book he takes us on a tour around the world and across the centuries exploring, largely through his own research and that of associates, the history of fermented beverages in a variety of cultures and what we can learn from this evidence about how they lived, were organised, and interacted with other societies. Only in recent decades has biochemical and genetic analysis progressed to the point that it is possible to determine from some gunk found at the bottom of an ancient pot not only that it was some kind of beer or wine, but also from what species of fruit and grain it was produced, how it was prepared and fermented, and what additives it may have contained and whence they originated. Calling on experts in related disciplines such as palynology (the study of pollen and spores, not of the Alaskan politician), the author is able to reconstruct the economics of the bustling wine trade across the Mediterranean (already inferred from shipwrecks carrying large numbers of casks of wine) and the diffusion of the ancestral cultivated grape around the world, displacing indigenous grapes which were less productive for winemaking.

While the classical period around the Mediterranean is pretty much soaked in wine, and it'd be difficult to imagine the Vikings and other North Europeans without their beer and grogs, much less was known about alcoholic beverages in China, South America, and Africa. Once again, the author is on their trail, and not only reports upon his original research, but also attempts, in conjunction with micro-brewers and winemakers, to reconstruct the ancestral beverages of yore.

The biochemical anthropology of booze is not exactly a crowded field, and in this account written by one of its leaders, you get the sense of having met just about all of the people pursuing it. A great deal remains to be learnt—parts of the book read almost like a list of potential Ph.D. projects for those wishing to follow in the author's footsteps. But that's the charm of opening a new window into the past: just as DNA and other biochemical analyses revolutionised the understanding of human remains in archæology, the arsenal of modern analytical tools allows us to reconstruct humanity's almost universal companion through the ages, fermented beverages, and, through them, to uncork the way in which those cultures developed and interacted.

A paperback edition will be published in December 2010.


Haisch, Bernard. The Purpose-Guided Universe. Franklin Lakes, NJ: Career Press, 2010. ISBN 978-1-60163-122-0.
The author, an astrophysicist who was an editor of the Astrophysical Journal for a decade, subtitles this book “Believing In Einstein, Darwin, and God”. He argues that the militant atheists who have recently argued that science is incompatible with belief in a Creator are mistaken and that, to the contrary, recent scientific results are not only compatible with, but evidence for, the intelligent design of the laws of physics and the initial conditions of the universe.

Central to his argument is the variety of “fine tunings” of the physical constants of nature. He lists ten of these in the book's summary, chosen from a longer list: quantities such as the relative masses of the neutron and proton, the ratio of the strength of the electromagnetic and gravitational forces, and the curvature of spacetime immediately after the Big Bang which, if they differed only slightly from their actual values, would have resulted in a universe in which the complexity required to evolve any imaginable form of life would not exist. But, self-evidently, we're here, so we have a mystery to explain. There are really only three possibilities:

  1. The values of the fine-tuned parameters are those we measure because they can't be anything else. One day we'll discover a master equation which allows us to predict their values from first principles, and we'll discover that any change to that equation produces inconsistent results. The universe is fine tuned because that's the only way it could be.
  2. The various parameters were deliberately fine tuned by an intelligent, conscious designer bent on creating a universe in which sufficient complexity could evolve so as to populate it with autonomous, conscious beings. The universe is fine tuned by a creator because that's necessary to achieve the goal of its creation.
  3. The parameters are random, and vary from universe to universe among an ensemble in a “multiverse” encompassing a huge, and possibly infinite number of universes with no causal connection to one another. We necessarily find the parameters of the universe we inhabit to be fine tuned to permit ourselves to exist because if they weren't, we wouldn't be here to make the observations and puzzle over the results. The universe is fine tuned because it's just one of a multitude with different settings, and we can only observe one which happens to be tuned for us.
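
To see why option (3) demands such a profligate ensemble, consider a toy Monte Carlo (my illustration; the number of constants and the width of the life-permitting window are made-up figures, chosen only to show the shape of the argument):

    # Toy model of possibility (3): draw random "universes" and count how
    # many land inside a narrow life-permitting window for every constant.
    import random

    N_CONSTANTS = 10     # fine-tuned dimensionless parameters (illustrative)
    WINDOW = 0.01        # each must land in a 1%-wide slice of its range
    TRIALS = 1_000_000   # universes drawn

    hits = sum(
        all(random.random() < WINDOW for _ in range(N_CONSTANTS))
        for _ in range(TRIALS)
    )
    print(f"life-permitting universes: {hits:,} of {TRIALS:,}")
    print(f"expected rate: {WINDOW ** N_CONSTANTS:.0e}")   # 1e-20 per draw

With ten constants each confined to a one percent window, a random draw succeeds about once in 10^20 universes, which is why the multiverse invoked to explain fine tuning must be so inconceivably vast.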

For most of the history of science, it was assumed that possibility (1)—inevitability by physical necessity—was what we'd ultimately discover once we'd teased out the fundamental laws at the deepest level of nature. Unfortunately, despite vast investment in physics, both experimental and theoretical, astronomy, and cosmology, which has matured in the last two decades from woolly speculation to a precision science, we have made essentially zero progress toward this goal. String theory, which many believed in the heady days of the mid-1980s to be the path to that set of equations you could wear on a T-shirt and which would crank out all the dial settings of our universe, now seems to some (but not all) of those pursuing it to point toward possibility (3): a vast “landscape” of universes, all unobservable even in principle, among which we find ourselves in one with wildly improbable properties because we couldn't exist in most of the others.

Maybe, the author argues, we should take another look at possibility (2). Orthodox secular scientists are aghast at the idea, arguing that to entertain it is to “abandon science” and reject rational inference from experimental results in favour of revelation based only on faith. Well, let's compare alternatives (2) and (3) in that respect. Number three asks us to believe in a vast or infinite number of universes, all existing in their own disconnected bubbles of spacetime, unable to communicate with one another, and undetectable by any imaginable experiment, without any evidence for the method by which they were created nor any idea of how it all got started. And all of this to explain the laws and initial conditions of the single universe we inhabit. How's that for taking things on faith?

The author's concept of God in this volume is not that of the personal God of the Abrahamic religions, but rather something akin to the universal God of some Eastern religions, as summed up in Aldous Huxley's The Perennial Philosophy. This God is a consciousness encompassing the entire universe which causes the creation of its contents, deliberately setting things up to maximise the creation of complexity, with the eventual goal of creating more and more consciousness through which the Creator can experience the universe. This is actually not unlike the scenario sketched in Scott Adams's God's Debris, which people might take with the seriousness it deserves had it been written by somebody other than the creator of Dilbert.

If you're a regular reader of this chronicle, you'll know that my own personal view is in almost 100% agreement with Dr. Haisch on the big picture, but entirely different on the nature of the Creator. I'll spare you the detailed exposition, as you can read it in my comments on Sean Carroll's From Eternity to Here (February 2010). In short, I think it's more probable than not that we're living in a simulation, perhaps created by a thirteen-year-old post-singularity superkid as a science fair project. Unlike an all-pervading but imperceptible Brahman or an infinitude of unobservable universes in an inaccessible multiverse, the simulation hypothesis makes predictions which render it falsifiable, and hence a scientific theory. Eventually, precision measurements will discover, then quantify, discrepancies due to round-off errors in the simulation (for example, an integration step which is too large), and—what do you know—we already have in hand a collection of nagging little discrepancies which look doggone suspicious to me.
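
To make the step-size point concrete, here is a minimal sketch (my own toy example, not anything from the book) of how a too-coarse integration step betrays itself. Integrating a frictionless harmonic oscillator with the forward Euler method makes the total energy, which the exact dynamics would conserve, drift at a rate set by the step size; inhabitants of such a simulation with sufficiently precise instruments could, in principle, measure the drift and infer the step:

```python
def euler_energy_drift(dt, steps):
    """Integrate a unit harmonic oscillator (x'' = -x) with forward Euler
    and return the relative drift in total energy, which the exact
    dynamics would conserve."""
    x, v = 1.0, 0.0
    e0 = 0.5 * (v * v + x * x)
    for _ in range(steps):
        x, v = x + v * dt, v - x * dt   # one forward Euler step
    return 0.5 * (v * v + x * x) / e0 - 1.0

# Integrate over the same total time span with successively finer steps:
# the spurious energy growth shrinks roughly in proportion to the step size.
for dt in (0.1, 0.01, 0.001):
    drift = euler_energy_drift(dt, int(100.0 / dt))
    print(f"dt = {dt:g}: relative energy drift = {drift:+.3e}")
```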

This is not one of those mushy “science and religion can coexist” books. It is an exploration, by a serious scientist who has thought deeply about these matters, of why evidence derived entirely from science suggests, to those with minds sufficiently open to entertain the idea, that the possibility of our universe having been deliberately created by a conscious intelligence, who endowed it with the properties that permit it to produce its own expanding consciousness, is no more absurd than the hypotheses favoured by those who reject that explanation, and is entirely compatible with recent experimental results which are difficult in the extreme to explain in any other manner. Once the universe is created (or, as I'd put it, the simulation is started), there's no reason for the Creator to intervene: if all the dials and knobs are set correctly, the laws discovered by Einstein, Darwin, Maxwell, and others will take care of the rest. Hence there's no conflict between science and evidence-based belief in a God which is the first cause of all which has happened since.

 Permalink

Roach, Mary. Packing for Mars. New York: W. W. Norton, 2010. ISBN 978-0-393-06847-4.
At the dawn of the space age, nobody had any idea what effects travel into space might have on living beings, foremost among them the intrepid pilots of the first ships to explore the void. No organism from the ancestral cell of all terrestrial life up to the pointiest-headed professor speculating about its consequences had ever experienced more than an instant of weightlessness, and that usually ended badly with a sudden stop against an unyielding surface. (Fish and human divers are supported by their buoyancy in the water, but they are not weightless: the force of Earth's gravity continues to act upon their internal organs, and might prove to be essential for their correct functioning.) The eye, for example, freed of the pull of gravity, might change shape so that it couldn't focus; it might prove impossible to swallow; digestion of food in the stomach might not work without gravity to hold the contents together at the bottom; urination might fail without gravity working on the contents of the bladder, etc., etc. The only way to be sure was to go and find out, and this delightful and witty book covers the quest to discover how to live in space, from the earliest animal experiments of the 1940s (most of which ended poorly for the animals, not due to travelling in space, but rather to the unreliability of the rockets and recovery systems to which they were entrusted) to present-day long-duration space station missions and research into the human factors of expeditions to Mars and the asteroids.

Travelling to space centres across the U.S., Russia, Europe, and Japan, the author delves into the physiological and psychological, not to mention the humorous and embarrassing, aspects of venturing into the vacuum. She boards the vomit comet to experience weightlessness for herself, tries the television-camera-equipped “aiming practice toilet” on which space shuttle astronauts train before their missions, visits subjects in multi-month bed rest experiments studying loss of muscle and bone mass on simulated interplanetary missions, watches cadavers being used in crash tests of space capsules, tastes a wide variety of overwhelmingly ghastly space food (memo to astronaut corps worldwide: when they hire veterinarians to formulate your chow, don't expect gourmet grub on orbit), and, speaking of grubby, digs into experiments on the outer limits of lack of hygiene, including the odorifically heroic Gemini VII mission in which Frank Borman and James Lovell spent two weeks in a space smaller than the front seat of a Volkswagen Beetle with no way to bathe or open the window, nor bathroom facilities other than plastic bags. Some of the air-to-ground communications from that mission which weren't broadcast to the public at the time are reproduced here, and are both revealing and amusing in a grody kind of way.

We also meet the animals who preceded the first humans into space, and discover that their personalities were more diverse than those of the Right Stuff humans who followed. You may know of Ham (who was as gung-ho and outgoing as John Glenn) and Enos (who could be as cold and contemptuous as Alan Shepard, and as formidable at hurling his feces at those within range as Nolan Ryan was with a baseball), but just imagine those who didn't fly, including Double Ugly, Miss Priss, and Big Mean.

There are a huge number of factoids here, all well documented, that even the most obsessive space buff may not have come across. For example: why does motion sickness make you vomit? It makes sense to vomit if you've swallowed something truly noxious such as a glass of turpentine or a spoonful of lima beans, but it doesn't make any sense when your visual and vestibular systems are sending conflicting signals, since emptying your stomach does nothing to solve the problem. Well, it turns out that functional brain imaging reveals that the “emetic brain” which handles the crucial time-sequencing of the vomit reflex just happens to be located next door, in the meat computer, to the area which integrates signals from the inner ear and visual system. When the latter receives crossed signals, it starts firing neurons wildly trying to make sense of them, the electro-chemical crosstalk leaks into vomit central next door, and it's a-hurling we will go. It turns out that, despite worries, most human organs work just fine in weightlessness, but some of them behave differently in ways to which space travellers must become accustomed. Consider the bladder—with gravity, the stretching of the wall of the bladder due to the weight of its contents is what triggers the urge to relieve oneself. But in weightlessness, the contents of the bladder, like other fluids, tend to cling to the walls due to surface tension, and the bladder fills up with no signal at all until it's completely full, at which point you have to go right now regardless of whatever you're doing or whether another crewmember is using the space toilet. Reusable manned spacecraft have a certain odour….

There may be nothing that better stimulates the human mind to think outside the box than pondering flight out of this world, and we come across a multitude of examples of innovative boffinology, both from the pages of history and from contemporary research. There's the scientist, one of the world's preeminent authorities on chicken brains, who suggested fattening astronauts up to be 20 kilograms obese before launch, which would allow them to fly 90-day missions without the need to launch any food at all. Just imagine the morale among that crew! Not to be outdone, another genius proposed, given the rarity of laundromats in space, that astronauts' clothes be made of digestible fibres, so that they could eat their dirty laundry instead of packaged food. This seems to risk taking “Eat my shorts!” beyond even the tolerance threshold of Bart Simpson. Then consider the people who formulate simulated astronaut poop for testing space toilets, and those who study farts in space. Or, better yet, don't.

If you're remotely interested in space travel, you'll find this a thoroughly enjoyable book, and your only regret when closing it will be that it has come to an end. Speaking of which, if you don't read them as you traverse the main text, be sure to read the extensive end notes—there are additional goodies there for your delectation.

A paperback edition will be published in April 2011.

 Permalink

Thor, Brad. The Lions of Lucerne. New York: Pocket Books, 2002. ISBN 978-0-7434-3674-8.
This was the author's first published novel, which introduced Scot Harvath, the ex-Navy SEAL around whose exploits his subsequent thrillers have centred. In the present book, Harvath has been recruited into the Secret Service and is in charge of the U.S. president's advance team and security detail for a ski trip to Utah which goes disastrously wrong when an avalanche wipes out the entire Secret Service field team except for Harvath, leaving the president missing and his daughter grievously injured. This shock is compounded manyfold when evidence indicates that the president has been kidnapped in an elaborate plot, which is soon confirmed by an incontrovertible communication from the kidnappers.

If things weren't bad enough for the seriously battered Harvath, still suffering from a concussion and “sprained body”, he finds himself framed as the person who leaked the security arrangements to the kidnappers and for the murder of two people trying to bring evidence regarding the plot to the attention of the authorities.

Harvath decides the only way he can clear his name is to get to the bottom of the conspiracy and rescue the president himself and so, grasping at the only thread of evidence he has, travels incognito to Switzerland, where he begins to unravel the details of the plot, identify the conspirators, discover where the president is being held, and devise a plan to rescue him. You don't often come across a Swiss super-villain, but there's one here, complete with an Alpine redoubt worthy of a Bond blackguard.

This is a first novel, and it shows. Thor's mastery of the craft of the thriller, both in storytelling and technical detail, has improved over the years. If I hadn't read two of the more recent books, I might have been inclined to give it up after this one, but knowing what's coming, I'll continue to enjoy books from this series. In the present story, we have a vast disparity between the means (an intricate and extremely risky plot to kidnap the U.S. president) and the ends (derailing the passage of an alternative energy bill like “cap and trade”), carried out by an international conspiracy so vast that its security would almost be certain to be quickly compromised, but which is, instead, revealed through a series of fantastically improbable coincidences. Scot Harvath is pursued by two independent teams of assassins who may be the worst shots in the entire corpus of bestselling thrillers. And the Swiss authorities simply letting somebody go who smuggled a gun into Switzerland, sprayed gunfire around a Swiss city (damaging a historical landmark in the process), and then broke into a secret Swiss military base doesn't sound like the Switzerland with which I'm acquainted.

Still, this is well deserving of the designation “thriller”, and it will keep you turning the pages. It only improves from here, but I'd start with one of the more recent novels.

 Permalink

November 2010

Ryan, Craig. Magnificent Failure. Washington: Smithsonian Books, 2003. ISBN 978-1-58834-141-9.
In his 1995 book, The Pre-Astronauts (which I read before I began keeping this list), the author masterfully explores the pioneering U.S. balloon flights into the upper atmosphere between the end of World War II and the first manned space flights, which brought both Air Force and Navy manned balloon programs to an abrupt halt. These flights are little remembered today (except for folks lucky enough to have an attic [or DVD] full of National Geographics from the epoch, which covered them in detail). Still less known is the story recounted here: one man's quest, fuelled only by ambition, determination, willingness to do whatever it took, persuasiveness, and sheer guts, to fly higher and free-fall farther than any man had ever done before. Without the backing of any military service, government agency, wealthy patron, or corporate sponsor, he achieved his first goal, setting an altitude record for lighter than air flight which remains unbroken more than four decades later, and tragically died from injuries sustained in his attempt to accomplish the second, after an in-flight accident which remains enigmatic and controversial to this day.

The term “American original” is over-used in describing exceptional characters that nation has produced, but if anybody deserves that designation, Nick Piantanida does. The son of immigrant parents from the Adriatic island of Korčula (now part of Croatia), Nick was born in 1932 and grew up on the gritty Depression-era streets of Union City, New Jersey in the very cauldron of the American melting pot, amid communities of Germans, Italians, Irish, Jews, Poles, Syrians, and Greeks. Although universally acknowledged to be extremely bright, his interests in school were mostly brawling and basketball. He excelled in the latter, sharing the 1953 YMCA All-America honours with some guy named Wilt Chamberlain. After belatedly finishing high school (bored, he had dropped out to start a scrap iron business, but was persuaded to return by his parents), he joined the Army where he was All-Army in basketball for both years of his hitch and undefeated as a heavyweight boxer. After mustering out, he received a full basketball scholarship to Fairleigh Dickinson University, then abruptly quit a few months into his freshman year, finding the regimentation of college life as distasteful as that of the Army.

In search of fame, fortune, and adventure, Nick next set his sights on Venezuela, where he vowed to be the first to climb Devil's Mountain, from which Angel Falls plummets 807 metres. Penniless, he recruited one of his Army buddies as a climbing partner and lined up sponsors to fund the expedition. At the outset, he knew nothing about mountaineering, so he taught himself on the Hudson River Palisades with the aid of books from the library. Upon arrival in Venezuela, the climbers learnt to their dismay that another expedition had just completed the first ascent of the mountain, so Nick vowed to make the first ascent of the north face, just beside the falls, which was thought unclimbable. After an arduous trip through the jungle, during which their guide quit and left the climbers alone, Nick and his partner made the ascent by themselves and returned to the acclaim of all. Such was the determination of this man.

Nick was always looking for adventure, celebrity, and the big score. He worked for a while as a steelworker on the high iron of the Verrazano-Narrows Bridge, but most often supported himself and, after his marriage, his growing family, by contract truck driving and, occasionally, unemployment checks. Still, he never ceased to look for ways, always unconventional, to make his fortune, nor did he fail to recruit associates and find funding for his schemes. Many of his acquaintances use the word “hustler” to describe him in those days, and one doubts that Nick would be offended by the honorific. He opened an exotic animal import business, and ordered cobras, mongooses, goanna lizards, and other critters mail-order from around the world for resale to wealthy clients. When buyers failed to materialise, he staged gladiatorial contests of both animal versus animal and animal versus himself. Eventually he imported a Bengal tiger cub which he kept in his apartment until it had grown so large it could put its paws on his shoulders, whereupon he traded the tiger for a decrepit airplane (he had earned a pilot's license while still in his teens). Offered a spot on the New York Knicks professional basketball team, he turned it down because he thought he could make more money barnstorming in his airplane.

Nick finally found his life's vocation when, on a lark, he made a parachute jump. Soon he had progressed from static-line beginner jumps to free fall and increasingly advanced skydiving, making as many jumps as he could afford and find the time for. And then he had the Big Idea. In 1960, Joseph Kittinger had ridden a helium balloon to an altitude of 31,333 metres and bailed out, using a small drogue parachute to stabilise his fall until he opened his main parachute at an altitude of 5,330 metres. Although this was, at the time (and remains to this day), the highest-altitude parachute jump ever made, skydiving purists do not consider it a true free-fall jump due to the use of the stabilising chute. In 1962, Eugene Andreev jumped from a Soviet balloon at an altitude of 25,460 metres and made a pure free-fall descent, stabilising himself purely by skydiving techniques, setting an official free-fall altitude record which also remains unbroken. Nick vowed to claim both the highest-altitude ascent and the longest free-fall jump for himself, and set about it with his usual energy and single-minded determination.

Piantanida faced a daunting set of challenges in achieving his goal: at the outset he had neither balloon, gondola, spacesuit, life support system, suitable parachute, nor any knowledge of or experience with the multitude of specialities whose mastery is required to survive in the stratosphere, above 99% of the Earth's atmosphere. Kittinger and Andreev were supported by all the resources, knowledge, and funding of their respective superpowers' military establishments, while Nick had—well…Nick. But he was not to be deterred, and immediately set out educating himself and lining up people, sponsors, and gear necessary for the attempt.

The story of what became known as Project Strato-Jump reads like an early Heinlein novel, with an indomitable spirit pursuing a goal other, more “reasonable”, people considered absurd or futile. By will, guile, charm, pull, intimidation, or simply wearing down adversaries until they gave in just to make him go away, he managed to line up everything he needed, including having the company which supplied NASA with its Project Gemini spacesuits custom tailor one (Nick was built like an NBA star, not an astronaut) and loan it to him for the project.

Finally, on October 22, 1965, all was ready, and Nick took to the sky above Minnesota, bound for the edge of space. But only a few minutes after launch, at just 7,000 metres, the balloon burst, probably due to a faulty seam in the polyethylene envelope, triggered by wind shear at that altitude. Nick rode down in the gondola under its recovery parachute, then bailed out at 3,200 metres, unglamorously landing in the Pig's Eye Dump in St. Paul.

Undeterred by the failure, Nick recruited a new balloon manufacturer and raised money for a second attempt, setting off for the stratosphere again on February 2, 1966. This time the ascent went flawlessly, and the balloon rose to an all-time record altitude of 37,643 metres. But as Nick proceeded through the pre-jump checklist and attempted to disconnect the oxygen hose that fed his suit from the gondola's supply, switching over to the “bail-out bottle” from which he would breathe during the descent, the disconnect fitting jammed, and he was unable to dislodge it. He was, in effect, tethered to the gondola by his oxygen line and had no option but to descend with it. Ground control cut the gondola's parachute from the balloon, and after a harrowing descent Nick and gondola landed in a farm field with only minor injuries. The jump had failed, but Nick had flown higher than any manned balloon ever had. But because the flight was not registered in advance as an official record attempt, the altitude attained, although undisputed, remains unofficial.

After the second failure, Nick's confidence appeared visibly shaken. Having all that expense, work, and risk come to nought due to a small detail with which nobody had been concerned prior to the flight underlined just how small the margin for error was in the extreme environment at the edge of space and, by implication, how the smallest error or oversight could lead to disaster. Still, he was bent on trying yet again, and on May 1, 1966 (since he was trying to break a Soviet record, he thought this date particularly appropriate), launched for the third time. Everything went normally until the balloon approached 17,375 metres, whereupon the ground crew monitoring the air-to-ground voice link heard what was described as a “whoosh” or hiss, followed by a call of “Emergen” from Nick, followed by silence. The ground crew immediately sent a radio command to cut the balloon loose, and the gondola, with Nick inside, began to descend under its cargo parachute.

Rescue crews arrived just moments after the gondola touched down and found it undamaged, but Nick was unconscious and unresponsive. He was rushed to the local hospital, treated without avail, and then transferred to a hospital in Minneapolis, where he was placed in a hyperbaric chamber and treatment for decompression sickness was administered, without improvement. On June 18th, he was transferred to the National Institutes of Health hospital in Bethesda, Maryland, where he was examined and treated by experts in decompression disease and hypoxia, but he never regained consciousness. He died on August 25, 1966, with an autopsy finding the cause of death to be hypoxia and ruptures of tissue in the brain due to decompression.

What happened to Nick up there in the sky? Within hours after the accident, rumours started to circulate that he was the victim of equipment failure: that his faceplate had blown out or that the pressure suit had failed in some other manner, leading to an explosive decompression. This story has been repeated so often it has become almost canon—consider this article from Wired from July 2002. Indeed, when rescuers arrived on the scene, Nick's “faceplate” was damaged, but this was just the sun visor which can be pivoted down to cover the pressure-retaining faceplate, which was intact and, in a subsequent test of the helmet, found to seal perfectly. Rescuers assumed the sun visor was damaged by impact with part of the gondola during the landing and, in any case, would not have caused a decompression however damaged.

Because the pressure suit had been cut off in the emergency room, it wasn't possible to perform a full pressure test, but meticulous inspection of the suit by the manufacturer discovered no flaws which could explain an explosive decompression. The oxygen supply system in the gondola was found to be functioning normally, with all pressure vessels and regulators operating within specifications.

So, what happened? We will never know for sure. Unlike a NASA mission, there was no telemetry, nor even a sequence camera recording what was happening in the gondola. And yet, within minutes after the accident occurred, many members of the ground crew came to a conclusion as to the probable cause, which those still alive today have seen no need to revisit. Such was their certainty that reporter Robert Vaughan gave it as the cause in the story he filed with Life magazine, which he was dismayed to see replaced with an ambiguous passage by the editors, because his explanation did not fit with the narrative chosen for the story. (The legacy media acted like the legacy media even when they were the only media and not yet legacy!)

Astonishingly, all the evidence (which, admittedly, isn't very much) seems to indicate that Nick opened his helmet visor at that extreme altitude, which allowed the air in the suit to rush out (causing the “whoosh”), forcing the air from his lungs (cutting off the call of “Emergency!”), and rapidly incapacitating him. The extended hypoxia and exposure to low pressure as the gondola descended under the cargo parachute caused irreversible brain damage well before the gondola landed. But why would Nick do such a crazy thing as open his helmet visor when in the physiological equivalent of space? Again, we can never know, but what is known is that he'd done it before, at lower altitudes, to the dismay of his crew, who warned him of the potentially dire consequences. There is abundant evidence that Piantanida violated the oxygen prebreathing protocol before high-altitude exposure not only on this flight, but on a regular basis. He reported symptoms completely consistent with decompression sickness (the onset of “the bends”), and is quoted as saying that he could relieve the symptoms by deflating and reinflating his suit. Finally, about as close to a smoking gun as we're likely to find, the rescue crew found Nick's pressure visor unlatched and rotated away from the seal position. Since Nick would have been in a coma well before he entered breathable atmosphere, it isn't possible he could have done this during the descent, and there is no way an impact upon landing could have performed the precise sequence of operations required to depressurise the suit and open the visor.

It is impossible to put oneself inside the mind of such an outlier in the human population as Nick, much less imagine what he was thinking and feeling while rising into the darkness above the dawn on the third attempt at achieving his dream. He was almost certainly suffering from symptoms of decompression sickness due to inadequate oxygen prebreathing, afflicted by chronic sleep deprivation in the rush to get the flight off, and under intense stress to complete the mission before his backers grew discouraged and the money ran out. All of these factors can cloud the judgement of even the most disciplined and best-trained person, and, it must be said, Nick was neither. Perhaps the larger puzzle is why members of his crew who did understand these things did not speak up, pull the plug, or walk off the project when they saw what was happening. But a personality like Nick's can sweep people along through its own primal power, for better or for worse; in this case, to tragedy.

Was Nick a hero? Decide for yourself—my opinion is no. In pursuing his own ego-driven ambition, he ended up leaving his wife a widow and his three daughters without a father they can remember, with only a meagre life insurance policy to support them. The project was basically a stunt, mounted with the goal of turning its success into money through sales of story, film, and celebrity appearances. Even had the jump succeeded, it would have yielded no useful aeromedical research data applicable to subsequent work, apart from the fact that it was possible. (In Nick's defence on this account, he approached the Air Force and NASA, inviting them to supply instrumentation and experiments for the jump, and was rebuffed.)

This book is an exhaustively researched (involving many interviews with surviving participants in the events) and artfully written account of this strange episode which was, at the same time, the last chapter of the exploration of the black beyond by intrepid men in their floating machines and a kind of false dawn precursor of the private exploration of space which is coming to the fore almost half a century after Nick Piantanida set out to pursue his black sky dream. The only embarrassing aspect to this superb book is that on occasion the author equates state-sponsored projects with competence, responsibility, and merit. Well, let's see…. In a rough calculation, using 2007 constant dollars, NASA has spent northward of half a trillion dollars, killing a total of 17 astronauts (plus other employees in industrial accidents on the ground), with all of the astronaut deaths due to foreseeable risks which management failed to identify or mitigate in time.

Project Strato-Jump, funded entirely by voluntary contributions, without resort to the state's monopoly on the use of force, set an altitude record for lighter than air flight within the atmosphere which has stood from 1966 to this writing, and accomplished it in three missions with a total budget of less than (2007 constant) US$400,000, with the loss of a single life due to pilot error. Yes, NASA has achieved much, much more. But a million times more?
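
For what it's worth, here is the arithmetic behind that closing question, using the round figures just quoted:

\[
\frac{\$5\times10^{11}\ \text{(NASA)}}{\$4\times10^{5}\ \text{(Strato-Jump)}} \approx 1.25\times10^{6},
\]

a cost ratio of roughly a million and a quarter to one.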

This is a very long review, so if you've made it to this point and found it tedious, please accept my excuses. Nick Piantanida has haunted me for decades. I followed his exploits as they happened and were reported on the CBS Evening News in the 1960s. I felt the frustration of the second flight (with that achingly so-near-and-yet-so-far view of the Earth from altitude, when he couldn't jump), then the dismay at the calamity on the third, then the long vigil ending with his sad demise. Astronauts were, well, astronauts, but Nick was one of us. If a truck driver from New Jersey could, by main force, travel to the black of space, then why couldn't any of us? That was the real dream of the Space Age: Have Space Suit—Will Travel. Well, Nick managed to lay his hands on a space suit, and travel he did!

Anybody who swallowed the bogus mainstream media narrative of Nick's “suit failure” had to watch the subsequent Gemini and Apollo EVA missions with a special sense of apprehension. A pressure suit is one of the few things in the NASA space program which has no backup: if the pressure garment fails catastrophically, you're dead before you can do anything about it. (A slow leak isn't a problem, since there's an oxygen purge system which can maintain pressure until you can get inside, but a major seam failure, or having a visor blow out or glove pop off is endsville.) Knowing that those fellows cavorting on the Moon were wearing pretty much the same suit as Nick caused those who believed the propaganda version of his death to needlessly catch their breath every time one of them stumbled and left a sitzmark or faceplant in the eternal lunar regolith.

 Permalink

Evans, M. Stanton. Blacklisted by History. New York: Three Rivers Press, 2007. ISBN 978-1-4000-8106-6.
In this book, the author, one of the lions of conservatism in the second half of the twentieth century, undertakes one of the most daunting tasks a historian can attempt: a dispassionate re-examination of one of the most reviled figures in modern American history, Senator Joseph McCarthy. So universal is the disdain for McCarthy among figures across the political spectrum, and so uniform is his presentation as an ogre in historical accounts, the media, and popular culture, that he has grown into a kind of legend used to scare people and intimidate those who shudder at being accused of “McCarthyism”. If you ask people about McCarthy, you'll often hear that he used the House Un-American Activities Committee to conduct witch hunts, smearing the reputations of innocent people with accusations of communism, that he destroyed the careers of people in Hollywood and caused the notorious blacklist of screen writers, and so on. None of this is so: McCarthy was in the Senate, and hence had nothing to do with the activities of the House committee, which was entirely responsible for the investigation of Hollywood, in which McCarthy played no part whatsoever. The focus of his committee, the Permanent Subcommittee on Investigations of the Government Operations Committee of the U.S. Senate, was on security policy and enforcement within, first, the State Department and, later, the Signal Corps of the U.S. Army. McCarthy's hearings were not focussed on smoking out covert communists in the government, but rather on investigating why communists and other security risks who had already been identified by investigations by the FBI and their employers' own internal security apparatus remained on the payroll, in sensitive policy-making positions, for years after evidence of their dubious connections and activities was brought to the attention of their employers, in direct contravention of the published security policies of both the Truman and Eisenhower administrations.

Any book about McCarthy published in the present environment must first start out by cutting through a great deal of misinformation and propaganda which is just simply false on the face of it, but which is accepted as conventional wisdom by a great many people. The author starts by telling the actual story of McCarthy, which is little known and pretty interesting. McCarthy was born on a Wisconsin farm in 1908 and dropped out of junior high school at the age of 14 to help his parents with the farm. At age 20, he entered a high school and managed to complete the full four year curriculum in nine months, earning his diploma. Between 1930 and 1935 he worked his way through college and law school, receiving his law degree and being admitted to the Wisconsin bar in 1935. In 1939 he ran for an elective post of circuit judge and defeated a well-known incumbent, becoming, at age 30, the youngest judge in the state of Wisconsin. In 1942, after the U.S. entered World War II following Pearl Harbor, McCarthy, although exempt from the draft due to his position as a sitting judge, resigned from the bench and enlisted in the Marine Corps, being commissioned as a second lieutenant (based upon his education) upon completion of boot camp. He served in the South Pacific as an intelligence officer with a dive bomber squadron, and flew a dozen missions as a tailgunner/photographer, earning the sobriquet “Tail-Gunner Joe”.

While still in the Marine Corps, McCarthy sought the Wisconsin Republican Senate nomination in 1944 and lost, but then in 1946 mounted a primary challenge to three-term incumbent senator Robert M. La Follette, Jr., scion of Wisconsin's first family of Republican politics, narrowly defeating him in the primary, and went on to win the general election in a landslide, with more than 61% of the vote. Arriving in Washington, McCarthy was perceived as a rather undistinguished moderate Republican back-bencher, and garnered little attention from the press.

All of this changed on February 9th, 1950, when he gave a speech in Wheeling, West Virginia, in which he accused the State Department of being infested with communists, and claimed to have in his hand a list of known communists who continued to work at State after their identities had been made known to the Secretary of State. Just what McCarthy actually said in Wheeling remains a matter of controversy to this day, and is covered in gruelling detail in this book. This speech, and encore performances a few days later in Salt Lake City and Reno, catapulted McCarthy onto the public stage, with intense scrutiny in the press and an uproar in Congress, leading to duelling committee investigations: those exploring the charges he made, and those looking into McCarthy himself, precisely what he said, where, and when, and how he obtained his information on security risks within the government. Oddly, from the outset, the focus within the Senate and executive branch seemed to be more on the latter than the former, with one inquiry digging into McCarthy's checkbook and his income tax returns and those of members of his family dating back to 1935—more than a decade before he was elected to the Senate.

The content of the hearings chaired by McCarthy is also often misreported and misunderstood. McCarthy was not primarily interested in uncovering Reds and their sympathisers within the government: that had already been done by investigations by the FBI and agency security organisations and duly reported to the executive departments involved. The focus of McCarthy's investigation was why, once these risks were identified, often with extensive documentation covering a period of many years, nothing was done, with those identified as security risks remaining on the job or, in some cases, allowed to resign without any note in their employment file, often to immediately find another post in a different government agency or one of the international institutions which were burgeoning in the postwar years. Such an inquiry was a fundamental exercise of the power of congressional oversight over executive branch agencies, but McCarthy (and other committees looking into such matters) ran into an impenetrable stonewall of assertions of executive privilege by both the Truman and Eisenhower administrations. In 1954, the Washington Post editorialised, “The President's authority under the Constitution to withhold from Congress confidences, presidential information, the disclosure of which would be incompatible with the public interest, is altogether beyond question”. The situational ethics of the legacy press is well illustrated by comparing this Post editorial with those of two decades later, when Nixon asserted the same privilege against a congressional investigation.

Indeed, the entire McCarthy episode reveals how well established, already at the mid-century point, the ruling class government/media/academia axis was. Faced with an assault largely directed at “their kind” (East Coast, Ivy League, old money, creatures of the capital) by an uncouth self-made upstart from the windswept plains, they closed ranks, launched serial investigations and media campaigns, covered up, destroyed evidence, stonewalled, and otherwise aimed to obstruct and finally destroy McCarthy. This came to fruition when McCarthy was condemned by a Senate resolution on December 2nd, 1954. (Oddly, the usual word “censure” was not used in the resolution.) Although McCarthy remained in the Senate until his death at age 48 in 1957, he was shunned in the Senate and largely ignored by the press.

The perspective of half a century later allows a retrospective on the rise and fall of McCarthy which wasn't possible in earlier accounts. Many documents relevant to McCarthy's charges, including the VENONA decrypts of Soviet cable traffic, FBI security files, and agency loyalty board investigations have been declassified in recent years (albeit, in some cases, with lengthy “redactions”—blacked out passages), and the author makes extensive use of these primary sources in the present work. In essence, what they demonstrate is that McCarthy was right: that the documents he sought in vain, blocked by claims of executive privilege, gag orders, cover-ups, and destruction of evidence were, in fact, persuasive evidence that the individuals he identified were genuine security risks who, under existing policy, should not have been employed in the sensitive positions they held. Because the entire “McCarthy era”, from his initial speech to condemnation and downfall, was less than five years in length, and involved numerous investigations, counter-investigations, and re-investigations of many of the same individuals, regarding which abundant source documents have become available, the detailed accounts in this massive book (672 pages in the trade paperback edition) can become tedious on occasion. Still, if you want to understand what really happened at this crucial episode of the early Cold War, and the background behind the defining moment of the era: the conquest of China by Mao's communists, this is an essential source.

In the Kindle edition, the footnotes, which appear at the bottom of the page in the print edition, are linked to reference numbers in the text with a numbering scheme distinct from that used for source references. Each note contains a link to return to the text at the location of the note. Source citations appear at the end of the book and are not linked in the main text. The Kindle edition includes no index.

 Permalink

Pournelle, Jerry. Fires of Freedom. Riverdale, NY: Baen Publishing, [1976, 1980] 2010. ISBN 978-1-4391-3374-3.
This book includes two classic Jerry Pournelle novels which have long been out of print. Baen Publishing is doing journeyman work bringing the back lists of science fiction masters such as Pournelle, Robert Heinlein, and Poul Anderson back to the bookshelves, and this is a most welcome addition to the list. The two novels collected here are unrelated to one another. The first, Birth of Fire, originally published in 1976, follows a gang member who accepts voluntary exile to Mars to avoid a prison sentence on Earth. Arriving on Mars, he discovers a raw frontier society dominated by large Earth corporations who exploit the largely convict labour force. Nobody has to work, but if you don't work, you don't get paid and can't recharge the air medal everybody wears around their neck. If it turns red, or you're caught in public not wearing one, good tax-paying citizens will put the freeloader “outside”—without a pressure suit.

Former gangster Garrett Pittston finds that Mars suits him just fine, and, avoiding the temptations of the big companies, signs on as a farmhand with a crusty Marsman who goes by the name of Sarge. At Windhome, Sarge's station, Garrett learns how the Marsmen claw an independent existence from the barren soil of Mars, and also how the unyielding environment has shaped their culture, in which one's word is a life or death bond. Inevitably, this culture comes into conflict with the nanny state of the colonial administration, which seeks to bring the liberty-loving Marsmen under its authority by taxing and regulating them out of existence.

Garrett finds himself in the middle of an outright war of independence, in which the Marsmen use their intimate knowledge of the planet as an ally against what, on the face of it, would appear to be overwhelming superiority of their adversaries. Garrett leads a bold mission to obtain the game-changing resource which will allow Mars to deter reprisals from Earth, and in doing so becomes a Marsman in every way.

Pournelle paints this story with spare, bold brush strokes: all non-essentials are elided, and the characters develop and events transpire with little or no filler. If Kim Stanley Robinson had told this story, it would probably have occupied two thousand pages and had readers dying of boredom or old age before anything actually happened. This book delivers an action story set in a believable environment and a society which has been shaped by it. Because the novel was originally published in the year of the Viking landings on Mars, there are a few things it gets wrong, but there are a great many others which are spot-on, and in some cases prophetic.

The second novel in the book, King David's Spaceship, is set in the CoDominium universe in which the classic novel The Mote in God's Eye takes place. The story occurs contemporaneously with The Mote, during the Second Empire of Man, when imperial forces from the planet Sparta are re-establishing contact with the worlds of the original Empire of Man, which have been cut off from one another, many reverting to primitive levels of technology and civilisation in the aftermath of the catastrophic Secession Wars.

When Imperial forces arrive on Prince Samual's World, its civilisation has recovered from disastrous post-collapse warfare and plague to around the technological level of 19th-century Earth. King David of the Kingdom of Haven, who hopes to unify the planet under his rule, forms an alliance with the Empire and begins to topple rivals and petty kingdoms while pacifying the less civilised South Continent. King David's chief of secret police learns, from an Imperial novel that falls into his hands, that the Empire admits worlds on different bases depending upon their political and technological evolution. Worlds which have achieved planetary government and an indigenous space travel capability are admitted as “classified worlds”, which retain a substantial degree of autonomy and are represented in one house of the Imperial government. Worlds which have not achieved these benchmarks are classed as colonies, with their local governmental institutions abolished and replaced by rule by an aristocracy of colonists imported from other, more developed planets.

David realises that, with planetary unification rapidly approaching, his days are numbered unless somehow he can demonstrate some kind of space flight capability. But the Empire enforces a rigid technology embargo against less developed worlds, putatively to allow for their “orderly development”, but at least as much to maintain the Navy's power and enrich the traders, who are a major force in the Imperial capital. Nathan McKinnie, formerly a colonel in the service of Orleans, a state whose independence was snuffed out by Haven with the help of the Navy, is recruited by the ruthless secret policeman Malcolm Dougal to lead what is supposed to be a trading expedition to the world of Makassar, whose own civilisation is arrested in a state like medieval Europe, but which is home to a “temple” said to contain a library of documents describing First Empire technology which the locals do not know how to interpret. McKinnie's mission is to gain access to the documents, discover how to build a spaceship with the resources available on Haven, and spirit this information back to his home world under the eyes of the Navy and Imperial customs officials.

Arriving on Makassar, McKinnie finds that things are even more hopeless than he imagined. The temple is in a city remote from where he landed, reachable only by crossing a continent beset with barbarian hordes, or a sea passage through a pirate fleet which has essentially shut down seafaring on the planet. Using no advanced technology apart from the knowledge in his head, he outfits a ship and recruits and trains a crew to force the passage through the pirates. When he arrives at Batav, the site of the temple, he finds it besieged by Islamic barbarians (some things never change!), who are slowly eroding the temple's defenders by sheer force of numbers.

Again, McKinnie needs no new technology, but simply knowledge of the Western way of war—in this case recruiting from the disdained dregs of society and training a heavy infantry force, which he deploys along with a newly disciplined heavy cavalry in tactical doctrine with which Cæsar would have been familiar. Having saved the temple, he forms an alliance with representatives of the Imperial Church which grants him access to the holy relics, a set of memory cubes containing the collected knowledge of the First Empire.

Back on Prince Samual's World, a Los Alamos style research establishment quickly discovers that they lack the technology to read the copies of the memory cubes they've brought back, and that the technology of even the simplest Imperial landing craft is hopelessly out of reach of their knowledge and manufacturing capabilities. So, they adopt a desperate fall back plan, and take a huge gamble to decide the fate of their world.

This is superb science fiction which combines an interesting premise, the interaction of societies at very different levels of technology and political institutions, classical warfare at sea and on land, and the difficult and often ruthless decisions which must be made when everything is at stake (you will probably remember the case of the Temple swordsmen long after you close this book). It is wonderful that these excellent yarns are back in print after far too long an absence.

 Permalink

Brandon, Craig. The Five-Year Party. Dallas: BenBella Books, 2010. ISBN 978-1-935251-80-4.
I suspect that many readers of Tom Wolfe's I Am Charlotte Simmons (October 2010) whose own bright college days are three or four decades behind them will conclude that Wolfe embroidered quite a bit upon the contemporary campus scene in the interest of telling an entertaining tale. In this book, based upon the author's twelve years of experience teaching journalism at Keene State College in New Hampshire and extensive research, you'll get a factual look at what goes on at “party schools”, which have de-emphasised education in favour of “retention”—in other words, extracting the maximum amount of money from students and their families, and burdening them with crushing loans which make it impossible for graduates to accumulate capital in those early years which, due to compounding, are so crucial. In fact, Charlotte Simmons actually paints a better picture of college life than that which awaits most freshmen arriving on campus: Charlotte's fictional Dupont University was an élite school, with at least one Nobel Prize winner on the faculty, and although corrupted by its high-profile athletic program, enforced genuine academic standards for the non-athlete student body and had real consequences for failure to perform.

Not so at party schools. First of all, let's examine what these “party schools” are. What they're not is the kind of small, private, liberal arts college parodied in Animal House. Instead, the lists of top party schools compiled annually by Playboy and the Princeton Review are overwhelmingly dominated by huge, taxpayer-supported, state universities. In the most recent set of lists, out of a total of twenty top party schools, only two were private institutions. Because of their massive size, state party schools account for a large fraction of the entire U.S. college enrollment, and hence are representative of college life for most students who do not enter the small number of élite schools which are feeders for the ruling class.

As with most “public services” operated by governments, things at these state institutions of “higher education” are not what they appear to be on the surface, and certainly not what parents expect when they send their son or daughter off on what they have been led to believe is the first step toward a promising career. The first lie is in the very concept of a “four-year college”: with today's absurd relaxation of standards for dropping classes, lighter class loads, and “retention” taking priority over selecting out those unsuited to instruction at the college level, only a minority of students finish in four years, and around half take more than five years to graduate, with only about 54% graduating even in six years. Apart from the wasted years of these students' lives, this means the price tag, and corresponding debt burden of a college education is 25%, 50%, or even more above the advertised sticker price, with the additional revenue going into the college's coffers and providing no incentive whatsoever to move students through the system more rapidly.

But the greatest scandal and fraud is not the binge drinking, widespread drug use, casual sex, or high rates of serious crime covered up by a campus disciplinary system more interested in preserving the reputation of the institution than in weeding out predators among the student body (although all of these are discussed in depth here), but rather the fact that at these gold-plated diploma-mill feedlots, education has been de-emphasised to the extent of being entirely optional. Indeed, only about one fifth of university budgets goes to instruction; all the rest disappears into the fat salaries of endlessly proliferating legions of administrators, country-club-like student amenities, and ambitious building programs. Classes have been dumbed down to the extent that it is possible to navigate a “slacker track” to a bachelor's degree without ever taking a single course more intellectually demanding than what was once considered junior high level, or without being able to read, comprehend, and write the English language with high school proficiency. Grade inflation has resulted in more than 90% of all grades being either A or B, with a B expected by students as their reward simply for showing up, with the consequence that grade reports to parents and transcripts for prospective employers have become meaningless and impossible to evaluate.

The National Survey of Student Engagement finds that only about 10% of U.S. university students are “fully engaged”—actually behaving as college students were once expected to in order to make the most of the educational resources available to them. Twice that percentage were “fully disengaged”: just there to party or pass the time, while the remainder, though not full-time slackers, weren't really interested in learning either.

Now these are very interesting numbers, and they lead me to a conclusion which the author never explores. Prior to the 1960s, it was assumed that only a minority of students, the highest-ranking in secondary school, would go on to college. With the mean IQ of bachelor's degree holders then ranging from 110 to 120, they necessarily came from around the top 10 to 15 percent of the population by intelligence. But now the idea seems to be that everybody should get a “college education”, and indeed today in the U.S. around 70% of high school graduates go on to some kind of college program (although a far smaller fraction ever graduate). Now clearly, a college education which was once suited to the most intelligent 10% of the population is simply not going to work for the fat middle of the bell curve which characterises the present-day college population. Looked at this way, the party school seems to be an inevitable consequence. If society has deemed it valuable that all shall receive a “college education”, then it is necessary to redefine “college education” as something the average citizen can accomplish and for which they can receive the requisite credential. Hence the elimination, or optional status, of actual learning, evaluation of performance, and useful grades. With universities forced to compete on their attractiveness to “the customer”—the students—they concentrate on amenities and lax enforcement of codes of conduct in order to keep those tuition dollars coming in for four, five, six, or however many years it takes.
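
To put rough numbers on that bell-curve argument, here is a quick sketch assuming the conventional model of IQ as normally distributed with mean 100 and standard deviation 15 (the cutoffs below are my own choices, purely for illustration):

```python
from math import erfc, sqrt

def fraction_above(iq, mean=100.0, sd=15.0):
    """Fraction of a normal(mean, sd) population scoring above `iq`."""
    z = (iq - mean) / sd
    return 0.5 * erfc(z / sqrt(2.0))   # upper-tail probability of the normal

for iq in (110, 115, 120):
    print(f"IQ > {iq}: top {fraction_above(iq):.1%} of the population")
```

Cutoffs of 115 to 120 correspond to roughly the top 16% down to the top 9% of the population, so a student body whose mean IQ sits in that range is drawn largely from the top decile or so, while a system enrolling 70% of high school graduates necessarily reaches deep into the middle of the distribution.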

A number of observers have wondered whether the next bubble to pop will be higher education. Certainly, the parallels are obvious: an overbuilt industry, funded by unsustainable debt, delivering a shoddy product, at a cost which has been growing much faster than inflation or the incomes of those who foot the bills. This look inside the ugly mass education business only reinforces that impression, since another consequence of a bubble is the normalisation and acceptance of absurdity by those inside it. Certainly one indication the bubble may be about to pop is that employers have twigged to the fact that a college diploma and glowing transcript from one of these rackets the author calls “subprime colleges” is no evidence whatsoever of a job applicant's literacy, knowledge, or work ethic, which explains why so many alumni of these programs are living in their parents' basements today, getting along by waiting tables or delivering pizza, while they wait for that lucky break they believe they're entitled to. This population is only likely to increase as employers in need of knowledge workers discover they can outsource those functions to Asia, where university degrees are much more rare but actually mean something.

Élite universities, of course, continue to provide excellent educational opportunities for the small number of students who make it through the rigorous selection process to get there. It's also possible for a dedicated and fully engaged student to get a pretty good education at a party school, as long as they manage to avoid the distractions, select challenging courses and dedicated professors, and don't have the bad fortune to suffer assault, rape, arson, or murder by the inebriated animals that outnumber them ten to one. But then it's up to them, after graduating, to convince employers that their degree isn't just a fancy credential, but rather something they've genuinely worked for.

Allan Bloom observed that “every age is blind to its own worst madness”, an eternal truth of which anybody who has been inside a bubble becomes painfully aware, usually after it unexpectedly pops. For those outside the U.S. education scene, this book provides a look into a bizarre mirror universe which is the daily reality for many undergraduates today. Parents planning to send their progeny off to college need this information, and should take to heart the author's recommendations on how to look under the glossy surface and discover the reality of the institution to which their son or daughter's future will be entrusted.

In the Kindle edition, end notes are linked in the text, but the index contains just a list of terms with no links to where they appear and is consequently completely useless.

 Permalink

December 2010

Davies, Paul. The Eerie Silence. New York: Houghton Mifflin Harcourt, 2010. ISBN 978-0-547-13324-9.
The year 2009 marked the fiftieth anniversary of the Nature paper by Cocconi and Morrison which began the modern era in the search for extraterrestrial intelligence (SETI). They argued that the optimal channel for technological civilisations in other star systems wishing to establish contact with neighbours in the galaxy would be narrowband microwave transmissions, perhaps pulse modulated in a pattern that would distinguish them from natural sources. Further, they demonstrated that radio telescopes existing at the time (which were modest compared to those already planned for construction in the near future) would suffice to send and receive such a signal over distances of tens of light years. The following year, Frank Drake used a 26 metre dish at the National Radio Astronomy Observatory to search for such signals from two nearby sun-like stars in Project Ozma.
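
To see why their claim was plausible even with 1959-class hardware, consider a rough link budget. The sketch below is mine: the dish sizes, transmitter power, distance, and system temperature are illustrative assumptions, not figures from the paper.

    # Rough interstellar link budget for a narrowband microwave beacon.
    # All parameters (dishes, power, distance, noise temperature) are
    # illustrative assumptions, not figures from Cocconi and Morrison.
    import math

    K_B = 1.380649e-23            # Boltzmann constant, J/K
    LY  = 9.4607e15               # one light year in metres

    def dish_gain(diameter_m, wavelength_m, efficiency=0.5):
        # Approximate gain of a parabolic dish antenna
        return efficiency * (math.pi * diameter_m / wavelength_m) ** 2

    wavelength = 0.21                              # 21 cm hydrogen line
    distance   = 12 * LY                           # a nearby sun-like star
    eirp       = 1e6 * dish_gain(100, wavelength)  # 1 MW into a 100 m dish
    flux       = eirp / (4 * math.pi * distance ** 2)  # W/m^2 at receiver
    area_rx    = 0.5 * math.pi * 13 ** 2           # 26 m dish, 50% efficiency
    p_signal   = flux * area_rx
    p_noise    = K_B * 20 * 1.0                    # T_sys = 20 K, 1 Hz channel

    print(f"signal {p_signal:.2e} W, noise {p_noise:.2e} W, "
          f"SNR = {p_signal / p_noise:.1f}")

Under those assumptions the beacon stands several times above the noise in a single 1 Hz channel at a dozen light years, which is the heart of the Cocconi-Morrison argument.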

Over the succeeding half-century, SETI has been an off and on affair, with a variety of projects pursuing different search strategies. Since the 1990s a low level of SETI activity has been maintained, both using radio telescopes to conduct targeted searches and piggybacking on other radio astronomy observations to conduct a sky survey for candidate signals. There is still a substantial “giggle factor” associated with “listening for ET”, and the funding and telescope time allocated to SETI are minuscule compared to other radio astronomy research. SETI has been a direct beneficiary of the exponential growth in computing power available at a given cost, and now employs spectrum analysers able to monitor millions or billions of narrowband channels simultaneously, largely eliminating the original conundrum of SETI: guessing the frequency on which the aliens would be transmitting. The Allen Telescope Array, now under construction, will increase the capability of SETI observations by orders of magnitude, and will continue to benefit from progress in microelectronics and computing.
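
The multichannel approach is easy to demonstrate in miniature: an N-point FFT splits a digitised band of width fs into N channels of width fs/N, so the number of channels rides directly on available computing power. In the toy sketch below (all parameters invented), a narrowband tone carrying one two-hundredth the power of the broadband noise still towers over its own channel.

    # Toy FFT channelisation: a faint narrowband tone buried in broadband
    # noise pops out once the band is split into millions of narrow channels.
    # All parameters are invented for illustration.
    import numpy as np

    fs = 100e6                   # 100 MHz of digitised bandwidth (assumed)
    n  = 2 ** 22                 # ~4.2 million samples

    rng   = np.random.default_rng(0)
    t     = np.arange(n) / fs
    tone  = np.sin(2 * np.pi * 21.4e6 * t)     # the "beacon", amplitude 1
    noise = rng.normal(scale=10.0, size=n)     # 200x the tone's power

    spectrum = np.abs(np.fft.rfft(tone + noise)) ** 2
    peak = int(np.argmax(spectrum))
    print(f"{len(spectrum):,} channels of {fs / n:.2f} Hz each")
    print(f"strongest channel at {peak * fs / n / 1e6:.4f} MHz")  # ~21.4000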

The one thing that all SETI projects to date have in common is that they haven't found anything. Indeed, the SETI enterprise, taken as a whole, may be the longest-pursued unsuccessful search for a phenomenon in the entire history of science. The reason people don't abandon it in disappointment is that detection of a signal from an intelligent extraterrestrial source would have profound consequences for understanding the human species' place in the cosmos, the prospects for long-term survival of technological civilisations, and potential breakthroughs in all fields of knowledge if an advanced species shares what it knows with beginners barely evolved from apes. Another reason the searchers persist is the knowledge that they've barely scratched the surface of the “search space”, having examined only a minuscule fraction of potential targets in the galaxy, and a limited range of potential frequencies and forms of modulation a communicating civilisation might employ to contact others. Finally, continued advances in electronics and computing are making it possible to broaden the scope of the search at a rapidly increasing rate on modest budgets.

Still, after fifty years of searching (intermittently) and finding nothing, it's worth taking a step back and thinking about what that result might mean. In this book, the author revisits the history of SETI programs to date, the assumptions and logic upon which the targets they seek were based, and argues that while conventional microwave searches for narrowband beacons should continue, it is time for a “new SETI”, based on the original mission—search for extraterrestrial intelligence, not just a search for narrowband microwave signals. “Old SETI” was very much based on assumptions about the properties of potential communicating civilisations grounded in the technologies of the 1950s. A great deal has happened since then technologically (for example, the Earth, as seen from deep space, has increasingly grown “radio dark” as high-power broadcast transmitters have been supplanted by optical fibres, cable television systems, and geosynchronous communication satellites which radiate little energy away from the Earth).

In 1959, the pioneers contemplating a SETI program based on the tools of radio astronomy mostly assumed that the civilisations whose beacons they hoped to discover would be biological organisms much like humans or their descendants, but endowed with the scientific and technological capabilities conferred by a much longer period of development. (For statistical reasons, it is vanishingly improbable that humans would make contact with another intelligent species at a comparable state of development, since humans have had the capability to make contact for less than a century, and if other civilisations are comparably short-lived there will never be more than one in the galaxy at any given time. Hence, any signal we receive will necessarily be from a sender whose own technological civilisation is much older than our own, and presumably more advanced and capable.) But it now appears probable that, unless human civilisation collapses, stagnates, or is destroyed by barbarism (I put the collective probability of these outcomes at around fifty-fifty), or some presently unenvisioned constraint puts a lid on the exponential growth of computing and communication capability, before long, probably within this century, our species will pass through a technological singularity which will witness the emergence of artificial intelligence with intellectual capabilities on the order of 10¹⁰ to 10¹⁵ times those of present-day humans. Biological humans may continue to exist (after all, the evolution of humans didn't end the dominance of the biosphere by bacteria), but they will no longer determine the course of technological evolution on this planet and beyond. Asking a present-day human to comprehend the priorities and capabilities of one of these successor beings is like asking a butterfly to understand Beethoven's motivations in writing the Ninth Symphony.

And yet, unless we're missing something terribly important, any aliens we're likely to contact are overwhelmingly probable to be such forbidding machine intelligences, not Romulans, Klingons, Ferengi, or even the Borg. Why would such super beings try to get our attention by establishing interstellar beacons? What would they have to say if they did contact us? Consider: how much effort does our own species exert in making contact with or carrying on a dialogue with yeast? This is the kind of gap which will exist between humans and the products of millions of years of teleological development.

And so, the author argues, while keeping a lookout for those elusive beacons (and also ultra-short laser pulses, which are an alternative mechanism of interstellar signalling unimagined when “old SETI” was born), we should also cast the net much wider, looking for the consequences of an intelligence whose motivations and capabilities we cannot hope to envision. Perhaps they have seeded the galaxy with self-reproducing von Neumann probes, one of which is patiently orbiting in the asteroid belt or at one of the Earth-Sun Lagrangian points waiting to receive a ping from us. (And speaking of that, what about those long delayed echoes anyway?) Maybe their wave of exploration passed by the solar system more than three billion years ago and seeded the Earth with the ancestral cell from which all terrestrial life is descended. Or maybe they left a different kind of life, perhaps in their garbage dumps, which lives on as a “shadow biosphere” to this day, undetected because our surveys for life don't look for biochemistry which is different from that of our own. Heck, maybe they even left a message!

We should also be on the lookout for things which don't belong, like discrepancies in isotope abundances which may be evidence of alien technology in distant geological time, or things which are missing. Where did all of those magnetic monopoles which should have been created in the Big Bang go, anyway? Or maybe they've moved on to some other, richer domain in the universe. According to the consensus model of cosmology, we have no idea whatsoever what more than 95% of the universe is made of. Maybe they've transcended their juvenile baryonic origins and decamped to the greener fields we call, in our ignorance, “dark matter” and “dark energy”. While we're pointing antennas at obsolete stars in the sky, maybe they're already here (and everywhere else), not as UFOs or alien invaders, but super-intelligences made of structures which interact only gravitationally with the thin scum of baryonic matter on top of the rich ocean of the universe. Maybe their galactic Internet traffic is already tickling the mirrors of our gravitational wave detectors at intensities we can't hope to detect with our crude technologies.

Anybody who's interested in these kinds of deep questions about some of the most profound puzzles about our place in the universe will find this book a pure delight. The Kindle edition is superbly produced, with high-resolution colour plates which display beautifully on the iPad Kindle reader, and that rarest and most welcome of attributes in an electronic book, an index which is properly linked to the text. The Kindle edition is, however, more expensive than the hardcover as of this writing.

 Permalink

Hiltzik, Michael. Colossus. New York: Free Press, 2010. ISBN 978-1-4165-3216-3.
This book, subtitled “Hoover Dam and the Making of the American Century”, chronicles the protracted, tangled, and often ugly history which led up to the undertaking, in the depths of the Great Depression, of the largest single civil engineering project ever attempted in the world up to that time; its completion ahead of schedule and only modestly over budget; and its consequences for the Colorado River basin and the American West, which it continues to influence profoundly to this day.

Ever since the 19th century, visionaries, ambitious politicians, builders and engineers, and more than a few crackpots and confidence men had dreamt of and promoted grand schemes to harness the wild rivers of the American southwest, using their water to make the barren deserts bloom and opening up a new internal frontier for agriculture and (with cheap hydroelectric power) industry. Some of the schemes, and their consequences, were breathtaking. Consider the Alamo Canal, dug in 1900 to divert water from the Colorado River to irrigate the Imperial Valley of California. In 1905, the canal, already silted up by the water of the Colorado, overflowed, creating a flood which submerged more than five hundred square miles of lowlands in southern California, creating the Salton Sea, which is still there today (albeit smaller, due to evaporation and lack of inflow). Just imagine how such an environmental disaster would be covered by the legacy media today. President Theodore Roosevelt, considered a champion of the environment and the West, declined to provide federal assistance to deal with the disaster, leaving it up to the Southern Pacific Railroad, who had just acquired title to the canal, to, as the man said, “plug the hole”.

Clearly, the challenges posed by the notoriously fickle Colorado River, known for extreme floods, heavy silt, and a tendency to jump its banks and establish new watercourses, would require a much more comprehensive and ambitious solution. Further, such a solution would require the assent of the seven states within the river basin: Arizona, California, Colorado, Nevada, New Mexico, Utah, and Wyoming, among the sparsely populated majority of which there was deep distrust that California would exploit the project to loot them of their water for its own purposes. Given the invariant nature of California politicians and subsequent events, such suspicion was entirely merited.

In the 1920s, an extensive sequence of negotiations and court decisions led to the adoption of a compact between the states (actually, under its terms, only six states had to approve it, and Arizona did not until 1944). Commerce Secretary Herbert Hoover played a major part in these negotiations, although other participants dispute that his rôle was as central as he claimed in his memoirs. In December 1928, President Coolidge signed a bill authorising construction of the dam and a canal to route water downstream, and Congress appropriated US$165 million for the project, the largest single federal appropriation in the nation's history to that point.

What was proposed gave pause even to the master builders who came forward to bid on the project: an arch-gravity dam 221 metres high, 379 metres long, and 200 metres wide at its base. Its construction would require 3.25 million cubic yards (2.48 million cubic metres) of concrete, and it would be, by a wide margin, the largest single structure ever built by the human species. The dam would create a reservoir containing 35.2 cubic kilometres of water, with a surface area of 640 square kilometres. These kinds of numbers had to instil a sense of “failure is not an option” even in the devil-may-care roughneck engineers of the epoch, if for no other reason than that they had a recent example of how the devil might care in the absence of scrupulous attention to detail. Just months before the great Colorado River dam was approved, the St. Francis Dam in California, built with the same design proposed for the new dam, suddenly failed catastrophically, killing more than 600 people downstream. William Mulholland, an enthusiastic supporter of the Colorado dam, had pronounced the St. Francis dam safe just hours before it failed. The St. Francis collapse was the worst civil engineering failure in American history, and arguably remains so to date. The consequences of a comparable failure of the new dam were essentially unthinkable.
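
For the metrically inclined, the quoted figures are easy to cross-check; the snippet below is nothing more than unit conversion.

    # Cross-check of the construction figures quoted above: pure unit
    # conversion, no assumptions beyond the exact yard-to-metre definition.
    CUBIC_YARD_TO_M3 = 0.9144 ** 3        # a yard is exactly 0.9144 m

    concrete_m3 = 3.25e6 * CUBIC_YARD_TO_M3
    print(f"3.25 million yd^3 = {concrete_m3 / 1e6:.2f} million m^3")  # 2.48

    reservoir_m3 = 35.2 * 1e9             # 35.2 km^3 expressed in m^3
    print(f"reservoir capacity: {reservoir_m3:.3g} m^3 of water")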

The contract for construction was won by a consortium of engineering firms called the “Six Companies” including names which would be celebrated in twentieth century civil engineering including Kaiser, Bechtel, and Morrison-Knudsen. Work began in 1931, as the Depression tightened its grip upon the economy and the realisation sank in that a near-term recovery was unlikely to occur. With this project one of the few enterprises hiring, a migration toward the job site began, and the labour market was entirely tilted toward the contractors. Living and working conditions at the outset were horrific, and although the former were eventually ameliorated once the company town of Boulder City was constructed, the rate of job-related deaths and injuries remained higher than those of comparable projects throughout the entire construction.

Everything was on a scale which dwarfed the experience of earlier projects. If the concrete for the dam had been poured as one monolithic block, it would have taken more than a century to cure, and the heat released in the process would have caused it to fracture into rubble. So the dam was built of more than thirty thousand blocks of concrete, each about fifty feet square and five feet high, cooled as it cured by chilled water from a refrigeration plant running through more than six hundred miles of cooling pipes embedded in the blocks. These blocks were then cemented into the structure of the dam with grout injected between the interlocking edges of adjacent blocks. And this entire structure had to be engineered to last forever and never fail.
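
The century-to-cure claim is easy to make plausible with a scaling argument: heat diffuses out of a slab on a timescale of order L²/α, where L is the half-thickness and α the thermal diffusivity. The diffusivity below is a typical handbook value for concrete and the geometry is grossly simplified, so treat this as an order-of-magnitude sketch only.

    # Order-of-magnitude cooling times: heat escapes a slab of half-thickness
    # L on a timescale of roughly L**2 / alpha.  The diffusivity is a typical
    # handbook value for concrete; slab geometry is a gross simplification.
    ALPHA = 7e-7                  # thermal diffusivity of concrete, m^2/s
    YEAR  = 3.156e7               # seconds per year

    def cooling_time_years(half_thickness_m):
        return half_thickness_m ** 2 / ALPHA / YEAR

    print(f"monolithic dam (L ~ 100 m): {cooling_time_years(100):.0f} years")
    print(f"one 5 ft lift (L ~ 0.76 m): {cooling_time_years(0.76) * 365:.0f} days")

Centuries for the monolith, days for a block: hence the blocks, and the chilled water to carry off the heat of hydration faster still.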

At the ceremony marking the start of construction, Secretary of the Interior Ray Wilbur surprised the audience by referring to the project as “Hoover Dam”: the first time such a project had been named after a sitting president, which many thought unseemly, notwithstanding Hoover's involvement in the interstate compact behind the project. After Hoover's defeat by Roosevelt in 1932, the new administration consistently referred to the project as “Boulder Dam”, and so commemorated it in a stamp issued on the occasion of the dam's dedication in September 1935. This was a bit curious as well, since the dam was actually built in Black Canyon, the geological foundations in Boulder Canyon having been found unsuitable to anchor the structure. For years thereafter, Democrats called it “Boulder Dam”, while Republican stalwarts insisted on “Hoover Dam”. In 1947, newly elected Republican majorities in the U.S. Congress passed a bill, signed by President Truman, officially naming the structure after Hoover, and so it has remained ever since.

This book provides an engaging immersion in a very different age, in which economic depression was tempered by an unshakable confidence in the future and in the benefits to flow from continental-scale collective projects, guided by wise men in Washington and carried out by roughnecks risking their lives in the savage environment of the West. The author discusses whether such a project could be accomplished today and concludes that it probably couldn't. (Since all of the rivers with such potential for irrigation and power generation have already been dammed, the question is largely moot, but it is relevant for grand-scale projects such as solar power satellites, ocean thermal energy conversion, and other engineering works of comparably transformative consequence for the present-day economy.) We have woven such a web of environmental constraints, causes for litigation, and a tottering tower of debt that a project such as Hoover Dam, without which the present-day U.S. southwest would not exist in its current form, could likely never be carried out today, and certainly not completed on schedule. Those who regard such grand earthworks as hubristic folly (a view to which the author tips his hat in the final chapters) might well reflect that history records the achievements of those who have grand dreams and bring them into existence, not of those who sputter out their lives in courtrooms or on trading floors.

 Permalink

Thor, Brad. Path of the Assassin. New York: Pocket Books, 2003. ISBN 978-0-7434-3676-2.
This, the second in the author's Scot Harvath saga, which began with The Lions of Lucerne (October 2010), starts with Agent Harvath, detached from the Secret Service and charged with cleaning up loose ends from events in the previous book, finding himself stalked and repeatedly preempted by a mysterious silver-eyed assassin who eliminates those linked to the plot he's investigating before they can be captured. Meanwhile, the Near East is careening toward war after a group calling itself the “Hand of God” commits atrocities upon Muslim holy sites, leaving a signature including the Star of David and the message “Terror for Terror”. Although the Israeli government denies any responsibility, there is substantial sympathy for these attacks within Israel, and before long reprisal attacks are mounted and raise tensions to the breaking point.

Intelligence indicates that the son of Abu Nidal has re-established his father's terrorist network and enlisted a broad coalition of Islamic barbarians in its cause. This is confirmed when a daring attack is mounted against a publicity stunt flight from the U.S. to Egypt, an attack Harvath is charged with defeating.

And now it gets a little weird. We are expected to believe that, in just weeks or months, a public relations agent from Chicago, Meg Cassidy, whose spontaneous bravery brought down the hijackers in Cairo, could be trained to become a fully-qualified Special Forces operative, possessing not only the physical stamina found only in the best of the best, but also command of a wide variety of weapons systems and technologies which veteran snake eaters spend years acquiring in the most demanding of conditions. This is at least as difficult to believe as the premise of G.I. Jane, and arguably more so, since in that fantasy the woman in question actually wanted to become a commando.

This is a pretty good thriller, but you get the sense that Thor was still mastering the genre when he wrote it. He realised that in the first novel he had backed his protagonist into a corner by making him a Secret Service agent, and he works that out here with the aid of a grateful president who appoints Harvath to a much more loose-cannon position in “Homeland Security”, a forbidding phrase which should make all of the dozens of lovers of liberty remaining in the United States shudder.

 Permalink

Cordain, Loren. The Paleo Diet. Hoboken, NJ: John Wiley & Sons, 2002. ISBN 978-0-470-91302-4.
As the author of a diet book, I don't read many self-described “diet books”. First of all, I'm satisfied with the approach to weight management described in my own book; second, I don't need to lose weight; and third, I find most “diet books” built around gimmicks with little justification in biology, and prone to prescribe regimes few people are likely to stick with long enough to achieve their goal. What motivated me to read this book was a talk by Michael Rose at the First Personalized Life Extension Conference in which he mentioned the concept and this book not in conjunction with weight reduction, but rather with the extension of healthy lifespan in humans. Rose's argument, which is grounded in evolutionary biology and paleoanthropology, is somewhat subtle and well summarised in this article.

At the core of Rose's argument, and that of the present book, is the observation that while the human genome is barely different from that of human hunter-gatherers a million years ago, our present-day population has had at most 200 to 500 generations to adapt to the very different diet which emerged with the introduction of agriculture and animal husbandry. From an evolutionary standpoint, this is a relatively short time for adaptation and, here is the key thing (argued by Rose, but not in this book), even if modern humans had evolved adaptations to the agricultural diet (as in some cases they clearly have, lactose tolerance persisting into adulthood being one obvious example), those adaptations cannot, by the very mechanism of evolution, select out diseases caused by the new diet which only manifest themselves after the age of last reproduction in the population. So, if eating the agricultural diet (not to mention the horrors we've invented in the last century) were the cause of late-onset diseases such as cancer, cardiovascular problems, and type 2 diabetes, then evolution would have done nothing to select out the genes responsible for them, since these diseases strike most people after the age at which they've already passed on their genes to their children. Consequently, while it may be fine for young people to eat grain, dairy products, and other agricultural-era innovations, folks over the age of forty may be asking for trouble by consuming foods which evolution hasn't had the chance to mold their genomes to tolerate. People whose ancestors shifted to the agricultural lifestyle much more recently, including many of African and aboriginal descent, have little or no adaptation to the agricultural diet, and may experience problems even earlier in life.
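
This is the effect evolutionary biologists call the “selection shadow”, and a toy simulation makes it vivid. The model below is my own illustration, not Rose's or Cordain's: haploid genetics, a single shot at reproduction, and made-up numbers.

    # Toy "selection shadow" model: an allele that halves survival to
    # breeding age is driven out, while an equally nasty allele whose harm
    # strikes only after reproduction merely drifts.  Haploid genetics and
    # all parameters are invented for illustration.
    import random

    def allele_frequency(early_onset, generations=300, pop=10_000, p0=0.2):
        p = p0
        for _ in range(generations):
            breeders = []
            for _ in range(pop):
                carrier = random.random() < p
                # early-onset harm: carriers reach breeding age half as often
                if carrier and early_onset and random.random() < 0.5:
                    continue
                breeders.append(carrier)
            p = sum(breeders) / len(breeders)
        return p

    random.seed(42)
    print(f"early-onset allele: {allele_frequency(True):.4f}")   # -> ~0
    print(f"late-onset allele:  {allele_frequency(False):.4f}")  # -> ~0.2

The early-onset allele is purged within a few dozen generations, while the late-onset allele simply drifts near its starting frequency: evolution never “sees” harm which strikes after the genes have already been passed on.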

In this book, the author doesn't make these fine distinctions but rather argues that everybody can benefit from a diet resembling that which the vast majority of our ancestors—hunter-gatherers predating the advent of sedentary agriculture—ate, and to which evolution has molded our genome over that long expanse of time. This is not a “diet book” in the sense of a rigid plan for losing weight. Instead, it is a manual for adopting a lifestyle, based entirely upon non-exotic foods readily available at the supermarket, which approximates the mix of nutrients consumed by our distant ancestors. There are the usual meal plans and recipes, but the bulk of the book is a thorough survey, with extensive citations to the scientific literature, of what hunter-gatherers actually ate, the links scientists have found between the composition of the modern diet and the emergence of “diseases of civilisation” among populations that have transitioned to it in historical times, and the evidence for specific deleterious effects of major components of the modern diet such as grains and dairy products.

Not to over-simplify, but you can go a long way toward the ancestral diet simply by going to the store with an “anti-shopping list” of things not to buy, principally:

  • Grain, or anything derived from grains (bread, pasta, rice, corn)
  • Dairy products (milk, cheese, butter)
  • Fatty meats (bacon, marbled beef)
  • Starchy tuber crops (potatoes, sweet potatoes)
  • Salt or processed foods with added salt
  • Refined sugar or processed foods with added sugar
  • Oils with a high omega 6 to omega 3 ratio (safflower, peanut)

And basically, that's it! Apart from the list above you can buy whatever you want, eat it whenever you like in whatever quantity you wish, and the author asserts that if you're overweight you'll soon see your weight dropping toward your optimal weight, a variety of digestive and other problems will begin to clear up, you'll have more energy and a more consistent energy level throughout the day, and that you'll sleep better. Oh, and your chances of contracting cancer, diabetes, or cardiovascular disease will be dramatically reduced.

In practice, this means eating a lot of lean meat, seafood, fresh fruit and fresh vegetables, and nuts. As the author points out, even if you have a mound of cooked boneless chicken breasts, broccoli, and apples on the table before you, you're far less likely to pig out on them than on, say, a pile of doughnuts, because the natural foods don't give you the immediate blood sugar hit the highly glycemic processed food does. And even if you do overindulge, the caloric density of the natural foods is so much lower that your jaw will get tired chewing or your gut will bust before you can go way over your calorie requirements.

Now, even if the science is sound (there are hundreds of citations of peer-reviewed publications in the bibliography, but then nutritionists are forever publishing contradictory “studies” on any topic you can imagine, and in any case epidemiology cannot establish causation) and the benefits of adopting this diet are as immediate, dramatic, and important for long-term health as claimed, a lot of people are going to have trouble with what is recommended here. Food is a lot more to humans and other species (as anybody who's had a “picky eater” cat can testify) than just molecular fuel and construction material for our bodies. Our meals nourish the soul as well as the body, and among humans shared meals are a fundamental part of our social interaction which evolution has doubtless had time to write into our genes. If you go back and look at that list of things not to eat, you'll probably discover that just about any “comfort food” you cherish runs afoul of one or more of the forbidden ingredients. This means that contemplating the adoption of this diet as a permanent lifestyle change can look pretty grim, unless or until you find suitable replacements that thread among the constraints. The recipes presented here are interesting, but still come across to me (not having tried them) as pretty Spartan. And recall that even Spartans lived a pretty sybaritic lifestyle compared to your average hunter-gatherer band. But, hey, peach fuzz is entirely cool!

The view of the mechanics of weight loss and gain and the interaction between exercise and weight reduction presented here is essentially 100% compatible with my own in The Hacker's Diet.

This was intriguing enough that I decided to give it a try starting a couple of weeks ago. (I have been adhering, more or less, to the food selection guidelines, but not the detailed meal plans.) The results so far are suggestive but, at this early date, inconclusive. The most dramatic effect was an almost immediate (within the first three days) crash in my always-pesky high blood pressure. This may be due entirely to putting away the salt shaker (an implement of which I have been inordinately fond since childhood), but whatever the cause, it's taken about 20 points off the systolic and 10 off the diastolic, throughout the day. Second, I've seen a consistent downward bias in my weight. Now, as I said, I didn't try this diet to lose weight (although I could drop a few kilos and still be within the target band for my height and build, and wouldn't mind doing so). In any case, these are short-term results and may include transient adaptation effects. I haven't been hungry for a moment, nor have I experienced any specific cravings (except the second-order kind for popcorn with a movie). It remains to be seen what will happen when I next attend a Swiss party and have to explain that I don't eat cheese.

This is a very interesting nutritional thesis, backed by a wealth of impressive research of which I was previously unaware. It flies in the face of much of the conventional wisdom on diet and nutrition, and yet viewed from the standpoint of evolution, it makes a lot of sense. You will find the case persuasively put here and perhaps be tempted to give it a try.

 Permalink

Flynn, Vince. American Assassin. New York: Atria Books, 2010. ISBN 978-1-4165-9518-2.
This is the eleventh novel in the Mitch Rapp (warning—the article at this link contains minor spoilers) series. While the first ten books chronicled events in sequence, the present volume returns to Rapp's origins as an independent assassin for, but not of (officially, at least) the CIA. Here, we revisit the tragic events which predisposed him to take up his singular career, his recruitment by rising anti-terrorist “active measures” advocate Irene Kennedy, and his first encounters with covert operations mastermind Thomas Stansfield.

A central part of the story is Rapp's training at the hands of the eccentric, misanthropic, paranoid, crusty, profane, and deadly in the extreme Stan Hurley, to whom Rapp has to prove, in the most direct of ways, that he isn't a soft college boy recruited to do the hardest of jobs. While Hurley is an incidental character in the novels covering subsequent events, he is centre stage here, and Mitch Rapp fans will delight in getting to know him in depth, even if they might not be inclined to spend much time with the actual man if they encountered him in real life.

Following his training, Rapp deploys on his first mission and immediately demonstrates his inclination to be a loose cannon, taking advantage of opportunities as they present themselves and throwing carefully scripted and practiced plans out the window at the spur of the moment. This brings him into open conflict with Hurley, but elicits a growing admiration from Stansfield, who begins to perceive that he may have finally found a “natural”.

An ambitious mission led by Hurley to deny terrorists their financial lifeblood and bring their leaders out into the open goes horribly wrong in Beirut when Hurley and another operative are kidnapped in broad daylight and subjected to torture in one of the most harrowing scenes in all the literature of the thriller. Hurley, although getting on in years for a field operative, proves “tougher than nails” (you'll understand after you read the book) and a master at getting inside the heads of his abductors and messing with them, but ultimately it's up to Rapp, acting largely alone, adopting a persona utterly unlike his own, and risking everything on the hope of an opportunity, to come to the rescue.

I wasn't sure how well a Rapp novel set in the context of historical events (Beirut in the early 1990s) would work, but in this case Flynn pulls it off magnificently. If you want to read the Rapp novels in story line sequence, this is the place to start.

 Permalink

Burns, Jennifer. Goddess of the Market. New York: Oxford University Press, 2009. ISBN 978-0-19-532487-7.
For somebody who built an entire philosophical system founded on reason, and insisted that even emotion was ultimately an expression of rational thought which could be arrived at from first principles, Ayn Rand has inspired passion among readers, disciples, enemies, and critics, in fields ranging across literature, politics, philosophy, religion, architecture, music, economics, and human relationships, as few modern writers have. Her two principal novels, The Fountainhead and Atlas Shrugged (April 2010), remain among the best selling fiction titles more than half a century after their publication, with in excess of ten million copies sold. More than half a million copies of Atlas Shrugged were sold in 2009 alone.

For all the commercial success of her works, which made this refugee from the Soviet Union, writing in a language she barely knew when she arrived in the United States, wealthy before her fortieth birthday, her work was generally greeted with derision by the literary establishment, reviewers in major newspapers, and academics. By the time Atlas Shrugged was published in 1957, she saw herself primarily as the founder of an all-encompassing philosophical system she named Objectivism, and her fiction as a means to demonstrate the validity of her system and communicate it to a broad audience. Academic philosophers, for the most part, did not even reject her work but simply ignored it, deeming it unworthy of their consideration. And Rand did not advance her cause by refusing to enter into the give and take of philosophical debate, insisting instead that her system was self-evidently correct and had to be accepted as a package deal with no modifications.

As a result, she did not so much attract followers as disciples, who looked to her words as containing the answers to all of their questions, and whose self-worth was measured by how close they came to, as it were, the fountainhead whence they sprang. Some of these people were extremely bright and went on to distinguished careers in which they acknowledged Rand's influence on their thinking. Alan Greenspan was a member of Rand's inner circle in the 1960s, making the case for a return to the gold standard in her newsletter, before becoming the maestro of paper money decades later.

Although her philosophy claimed that contradiction was impossible, her life and work were full of contradictions. While arguing that everything of value sprang from the rational creativity of free minds, she created a rigid system of thought which she insisted her followers adopt without any debate or deviation, and banished them from her circle if they dared dissent. She claimed to have created a self-consistent philosophical and moral system which was self-evidently correct, and yet she refused to debate those championing other systems. Her novels portray the state and its minions in the most starkly negative light of perhaps any broadly read fiction, and yet she detested libertarians and anarchists, defended the state as necessary to maintain the rule of law, and exulted in the success of Apollo 11 (whose launch she was invited to observe).

The passion that Ayn Rand inspires has coloured most of the many investigations of her life and work published to date. Finally, in this volume, we have a more or less dispassionate examination of her career and œuvre, based on original documents in the collection of the Ayn Rand Institute and a variety of other archives. Based upon the author's Ph.D. dissertation (and with the wealth of footnotes and source citations customary in such writing), this book makes an effort to tell the story of Ayn Rand's life, work, and their impact upon politics, economics, philosophy, and culture to date, and her lasting legacy, without taking sides. The author is neither a Rand follower nor a confirmed opponent, and pretty much lets each reader decide where they come down based on the events described.

At the outset, the author writes, “For over half a century, Rand has been the ultimate gateway drug to life on the right.” I initially found this very off-putting, and resigned myself to enduring another disdainful dismissal of Rand (to whose views the vast majority of the “right” over that half century would have taken violent exception: Rand was vehemently atheist, opposing any mixing of religion and politics; a staunch supporter of abortion rights; opposed to the Vietnam War and conscription; and, although she rejected the legalisation of marijuana, she cranked out most of her best-known work while cranked on Benzedrine), but as I read the book the idea began to grow on me. Indeed, many people in the libertarian and conservative worlds got their introduction to thought outside the collectivist and statist orthodoxy pervading academia and the legacy media by reading one of Ayn Rand's novels. This may have been the moment at which they first began to, as the hippies exhorted, “question authority”, and to investigate other sources of information and ways of thinking about and looking at the world. People who grew up with the Internet will find it almost impossible to imagine how difficult this was back in the 1960s, when even discovering the existence of a dissenting newsletter (amateurishly produced, irregularly issued, and with a tiny subscriber base) was entirely a hit or miss matter. But Ayn Rand planted the seed in the minds of millions of people, a seed which might sprout when they happened upon a like mind, or a like-minded publication.

The life of Ayn Rand is simultaneously a story of an immigrant living the American dream: success in Hollywood and Broadway and wealth beyond even her vivid imagination; the frustration of an author out of tune with the ideology of the times; the political education of one who disdained politics and politicians; the birth of one of the last “big systems” of philosophy in an age where big systems had become discredited; and a life filled with passion lived by a person obsessed with reason. The author does a thorough job of pulling this all together into a comprehensible narrative which, while thoroughly documented and eschewing enthusiasm in either direction, will keep you turning the pages. The author is an academic, and writes in the contemporary scholarly idiom: the term “right-wing” appears 15 times in the book, while “left-wing” is used not at all, even when describing officials and members of the Communist Party USA. Still, this does not detract from the value of this work: a serious, in-depth, and agenda-free examination of Ayn Rand's life, work, and influence on history, today, and tomorrow.

 Permalink

O'Rourke, P. J. Don't Vote—It Just Encourages the Bastards. New York: Atlantic Monthly Press, 2010. ISBN 978-0-8021-1960-5.
P. J. O'Rourke is one of the most astute observers of the contemporary scene who isn't, I believe, taken as seriously as he deserves to be simply because his writing is so riotously funny. In the present book, he describes the life-changing experience which caused him to become a conservative (hint: it's the same one which can cause otherwise sane adults to contemplate buying a minivan and discover a new and distasteful definition of the word “change”), and explores the foundations of conservatism in a world increasingly dominated by nanny states, an out-of-touch and increasingly inbred ruling class, and a growing fraction of the electorate dependent upon the state and motivated to elect politicians who will distribute public largesse to them, whatever the consequences for the nation as a whole.

This is, of course, all done with great wit (and quite a bit of profanity, which may be off-putting to the more strait-laced kind of conservative), but there are a number of deep insights you'll never come across in the legacy media. For example, “We live in a democracy, rule by the people. Fifty percent of people are below average intelligence. This explains everything about politics.” The author then moves on to survey the “burning issues of our time” including the financial mess, “climate change” (where he demolishes the policy prescriptions of the warm-mongers in three paragraphs occupying less than a page), health care, terrorism, the collapse of the U.S. auto industry, and foreign policy, where he brings the wisdom of Kipling to bear on U.S. adventures in the Hindu Kush.

He concludes, in a vein more libertarian than conservative, that politics and politicians are, by their very nature, so fundamentally flawed (Let's give a small number of people a monopoly on the use of force and the ability to coercively take the earnings of others—what could possibly go wrong?) that the only solution is to dramatically reduce the scope of government, getting it out of our lives, bedrooms, bathrooms, kitchens, cars, and all of the other places its slimy tendrils have intruded, and, for those few remaining functions where government has a legitimate reason to exist, that it be on the smallest and most local scale possible. Government is, by its very nature, a monopoly (which explains a large part of why it produces such destructive outcomes), but an ensemble of separate governments (for example, states, municipalities, and school districts in the U.S.) will be constrained by competition from their peers, as evidenced by the demographic shift from high tax to low tax states in the U.S. and the disparate economic performance of highly regulated states and those with a business climate which favours entrepreneurship.

In all, I find O'Rourke more optimistic about the prospects of the U.S. than I am. The financial situation is simply intractable, and decades of policy implemented by both major political parties have brought the U.S. near the tipping point where a majority of the electorate pays no income tax, and hence has no motivation to support policies which would reduce the rate of growth of government, not to speak of actually shrinking it. The government/academia/media axis has become a self-reinforcing closed loop which believes things very different from the general populace, of which it is increasingly openly contemptuous. It seems to me the most likely outcome is collapse, not reform, with the form of the post-collapse society difficult to envision from a pre-discontinuity perspective. I'll be writing more about possible scenarios and their outcomes in the new year.

This book presents a single argument; it is not a collection of columns, and consequently it is best read front to back. I would not, however, recommend reading it in one sitting: a chapter a day, or every few days, works better. In too large doses, the hilarity of the text may drown out the deeper issues being discussed. In any case, this book will leave you not only entertained but enlightened.

A podcast interview with the author is available in which he concedes that he does, in fact, actually vote.

 Permalink