Saturday, September 27, 2014
Reading List: Destination Moon
- Byers, Bruce K.
Destination Moon: A History of the Lunar Orbiter Program.
Washington: National Aeronautics and Space Administration, 1977.
NASA TM X-3487.
In the mid 1960s, the U.S. Apollo lunar landing program was at the
peak of its budget commitment and technical development. The mission
mode had already been chosen and development of the flight hardware was
well underway, along with the ground infrastructure required to test and
launch it and the global network required to track missions in flight.
One nettlesome problem remained. The design of the lunar module made
assumptions about the properties of the lunar surface upon which it would
alight. If the landing zone contained boulders too large for the landing
legs to accommodate, craters too deep and numerous to avoid, or slopes
steep enough to upset the craft on landing or tip it over afterward,
lunar landing missions would all be aborted by the crew when they
reached decision height, judging there was no place they could set down
safely. Even if all the crews returned safely without having landed,
this would be an ignominious end to the ambitions of Project Apollo.
What was needed in order to identify safe landing zones was high-resolution
imagery of the Moon. The most capable Earth-based telescopes, operating
through Earth's turbulent and often murky atmosphere, produced images which
resolved objects at best a hundred times larger than those which could
upset a lunar landing mission. What was required was large-area,
high-resolution mapping of the Moon and a survey of potential landing zones,
which could only be done, given the technology of the 1960s, by going there,
taking pictures, and returning them to Earth. So was born the Lunar Orbiter
program, which in 1966 and 1967 sent lightweight photographic reconnaissance
satellites into lunar orbit, providing not only the close-up imagery needed
to select landing sites for the Apollo missions, but also mapping imagery
which covered 99% of the near side of the Moon and 85% of the far side.
In fact, Lunar Orbiter provided global imagery of the Moon far more
complete than that which would be available for the Earth until many
years later.
Accomplishing this goal with the technology of the 1960s was no small
feat. Electronic imaging amounted to analogue television, which, at the
altitude of a lunar orbit, wouldn't produce images any better than
telescopes on Earth. The first spy satellites were struggling to return
film from Earth orbit, and returning film from the Moon was completely
impossible given the mass budget of the launchers available. After a
fierce competition, NASA contracted with Boeing to build the Lunar
Orbiter, designed to fit on NASA's workhorse Atlas-Agena
launcher, which seriously constrained its mass. Boeing subcontracted
with Kodak to build the imaging system and RCA for the communications
hardware which would relay the images back to Earth and allow the
spacecraft to be controlled from the ground.
The images were acquired by a process which may seem absurd to those
accustomed to present-day digital technologies but which seemed
miraculous in its day. In lunar orbit, the spacecraft would aim its
cameras (it had two: a mapping camera which produced overlapping
wide-angle views and a high-resolution camera that photographed
a small area within each frame at a resolution of about one metre) at the
Moon and take a series of photos. Because the film used had a very
low light sensitivity (ASA [now ISO] 1.6), on low-altitude imaging
passes the film would have to be moved to compensate for the motion
of the spacecraft to avoid blurring. (The low light sensitivity of
the film was due to its very high spatial resolution, but also reduced
its likelihood of being fogged by exposure to cosmic rays or
energetic particles from solar flares.)
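The compensation requirement can be estimated from first principles. The
sketch below uses approximate figures (a ~46 km perilune and a 610 mm
high-resolution lens focal length are rough Lunar Orbiter values, given here
only for illustration):

```python
# Rough estimate of the image-motion compensation a slow film requires.
# All spacecraft figures are approximations used for illustration only.
import math

GM_MOON = 4902.8e9        # m^3/s^2, lunar gravitational parameter
R_MOON = 1737.4e3         # m, mean lunar radius
ALTITUDE = 46e3           # m, approximate perilune on imaging passes
FOCAL_LENGTH = 0.610      # m, approximate high-resolution lens focal length

r = R_MOON + ALTITUDE
v_orbit = math.sqrt(GM_MOON / r)      # circular orbital speed
v_ground = v_orbit * R_MOON / r       # speed of the ground track on the surface

# The surface image sweeps across the focal plane at the ground speed scaled
# by focal_length / altitude; the film must be driven at this rate during
# the exposure to avoid smear.
v_film = v_ground * FOCAL_LENGTH / ALTITUDE

print(f"orbital speed:      {v_orbit:7.1f} m/s")
print(f"ground-track speed: {v_ground:7.1f} m/s")
print(f"film drive rate:    {v_film * 1000:7.2f} mm/s")
```

With these figures the film must creep along at a couple of centimetres per
second, precisely synchronised to the spacecraft's motion.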
After being exposed, the film would be processed on-board
by putting it in contact with a band containing developer and fixer, and
then the resulting negative would be read back for transmission to
Earth by scanning it with a moving point of light, measuring the transmission
through the negative, and sending the measured intensity back as an analogue
signal. At the receiving station, that signal would be used to modulate
the intensity of a spot of light scanned across film which,
when developed and assembled into images from strips, revealed the details
of the Moon. The incoming analogue signal was recorded on tape to
provide a backup for the film recording process, but nothing was done
with the tapes at the time. More about this later….
Five Lunar Orbiter missions were launched, and although some experienced
problems, all achieved their primary mission objectives. The first three
missions provided all of the data required by Apollo, so the final two
could be dedicated to mapping the Moon from near-polar orbits. After
the completion of their primary imaging missions, Lunar Orbiters continued
to measure the radiation and micrometeoroid environment near the Moon,
and contributed to understanding the Moon's gravitational field, which
would be important in planning later Apollo missions that would fly in
very low orbits around the Moon. On August 23rd, 1966, the first Lunar
Orbiter took one of the most iconic pictures of the 20th century: the
first view of Earth as seen from the Moon.
The problems experienced by Lunar Orbiter missions and
the improvisation by ground controllers to work around them set the
pattern for subsequent NASA robotic missions, with their versatile,
reconfigurable flight hardware and fine-grained control from the ground.
You might think the story of Lunar Orbiter a footnote to space
exploration history which has scrolled off the screen with subsequent
Apollo lunar landings and high-resolution lunar mapping by missions
such as the Lunar Reconnaissance Orbiter,
but that fails to take into account the exploits of 21st century space
data archaeologists. Recall that I said that all of the image data from Lunar
Orbiter missions was recorded on analogue tapes. These tapes contained about
10 bits of dynamic range, as opposed to the 8 bits which were preserved by
the optical recording process used in receiving the images during the missions.
This, combined with contemporary image processing techniques, makes for breathtaking
images recorded almost half a century ago, but never seen before. Here are a document and video
which record the exploits of the Lunar
Orbiter Image Recovery Project (LOIRP). Please visit the
LOIRP Web site for more restored
images and details of the process of restoration.
Tuesday, September 23, 2014
Reading List: My Sweet Satan
- Cawdron, Peter.
My Sweet Satan.
Seattle: Amazon Digital Services, 2014.
Here the author adds yet another imaginative tale of first contact
to his growing list of novels in that genre, a puzzle story whose
viewpoint character must figure out what is happening, having lost the
memories of her entire adult life. After a botched attempt at reanimation
from cryo-sleep, Jasmine Holden finds herself with no memories of her life
after the age of nineteen. And yet, here she is, on board Copernicus,
in the Saturn system, closing in on the distant retrograde moon Bestla,
which, when approached by a probe from Earth, sent back an audio transmission
to its planet of origin which was mostly gibberish but contained the
chilling words: “My sweet Satan. I want to live and die for you,
my glorious Satan!”. A follow-up unmanned probe to Bestla is
destroyed as it approaches, and the Copernicus is
dispatched to make a cautious investigation of what appears to be an
alien probe with a disturbing theological predisposition.
Back on Earth, sentiment has swung back and forth about the merits of
exploring Bestla and fears of provoking an alien presence in
the solar system which, by its very capability of interstellar travel,
must be far in advance of Earthly technology. Jasmine, a key member
of the science team, suddenly finds herself mentally a 19-year-old girl
far from her home, confronted not only by an unknown alien presence
but also by conflict among her crew members, who interpret the imperatives
of the mission in different ways.
She finds the ship's computer, an early stage artificial intelligence,
the one being in which she can confide, and the only one who comprehends
her predicament and is willing to talk her through procedures she learned
by heart in her training but which have been lost to an amnesia she feels
compelled to conceal from human members of the crew.
As the ship approaches Bestla, conflict erupts among the crew, and
Jasmine must sort out what is really going on and choose sides
without any recollections of her earlier interactions with her crew members.
In a way, this is three first contact novels in one: 19-year-old Jasmine
making contact with her fellow crew members, about whom she remembers
nothing; the Copernicus and whatever is on Bestla;
and a third contact about which I cannot say anything without
spoiling the story.
This is a cracking good first contact novel which, just when you're
nearing the end and beginning to worry “Where's the sense of
wonder?” delivers everything you'd hoped for and more.
I read a pre-publication manuscript edition which the author
kindly shared with me.
Friday, September 19, 2014
Reading List: Superintelligence
- Bostrom, Nick.
Superintelligence: Paths, Dangers, Strategies.
Oxford: Oxford University Press, 2014.
Absent the emergence of some physical constraint which causes the
exponential growth of computing power at constant cost to cease,
some form of economic or societal collapse which brings an end to
research and development of advanced computing hardware and software,
or a decision, whether bottom-up or top-down, to deliberately relinquish
such technologies, it is probable that within the 21st century there
will emerge artificially-constructed systems which are more intelligent
(measured in a variety of ways) than any human being who has ever lived and,
given the superior ability of such systems to improve themselves, may
rapidly advance to superiority over all human society taken as a whole.
This “intelligence explosion” may occur in so short a time
(seconds to hours) that human society will have no time to adapt to its
presence or interfere with its emergence. This challenging and occasionally
difficult book, written by a philosopher who has explored these issues in depth,
argues that the emergence of superintelligence will pose the greatest
human-caused existential threat to our species so far in its existence,
and perhaps in all time.
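To see why the transition could be so abrupt, consider a toy model (my own
illustration, not the author's): if a system's capability C improves at a
rate that grows with C itself, growth is exponential; if the rate grows
faster than linearly in C, capability diverges in finite time.

```python
# Toy model of recursive self-improvement (illustrative only).
# Capability grows at a rate proportional to capability**p:
#   p = 1 gives ordinary exponential growth;
#   p > 1 gives hyperbolic growth that blows up in finite time.
def simulate(p: float, steps: int = 10000, dt: float = 1e-3, cap: float = 1e12):
    """Euler-integrate dC/dt = C**p; return the capability history."""
    c, history = 1.0, [1.0]
    for _ in range(steps):
        c += dt * c ** p
        history.append(c)
        if c > cap:          # stop once growth has effectively diverged
            break
    return history

exp_growth = simulate(p=1.0)   # steady exponential improvement
explosive = simulate(p=2.0)    # superlinear self-improvement

print(f"p=1: {len(exp_growth) - 1} steps, final capability {exp_growth[-1]:.3g}")
print(f"p=2: {len(explosive) - 1} steps, final capability {explosive[-1]:.3g}")
```

The linear case grows impressively but tamely over the whole run; the
superlinear case crawls for a while and then races past any fixed bound in
a handful of steps, the "seconds to hours" character of the scenario the
book describes.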
Let us consider what superintelligence may mean. The history of
machines designed by humans is that they rapidly surpass their
biological predecessors to a large degree. Biology never produced
something like a steam engine, a locomotive, or an airliner. It
is similarly likely that once the intellectual and technological leap to
constructing artificially intelligent systems is made, these systems
will surpass human capabilities to a greater extent than those of a Boeing
747 exceed those of a hawk. The gap between the cognitive power of a human,
or all humanity combined, and the first mature superintelligence may be as
great as that between brewer's yeast and humans. We'd better be sure of
the intentions and benevolence of that intelligence before handing
over the keys to our future to it.
Because when we speak of the future, that future isn't just what we can
envision over a few centuries on this planet, but the entire “cosmic
endowment” of humanity. It is entirely plausible that we are members
of the only intelligent species in the galaxy, and possibly in the entire
visible universe. (If we weren't, there would be abundant and visible evidence
of cosmic engineering by those more advanced than we.) Thus our cosmic
endowment may be the entire galaxy, or the universe, until the end of
time. What we do in the next century may determine the destiny of the
universe, so it's worth some reflection to get it right.
As an example of how easy it is to choose unwisely, let me expand upon an
example given by the author. There are extremely difficult and subtle
questions about what the motivations of a superintelligence might be,
how the possession of such power might change it, and the prospects for
us, its creators, to constrain it to behave in a way we consider consistent
with our own values. But for the moment, let's ignore all of those
problems and assume we can specify the motivation of an artificially
intelligent agent we create and that it will remain faithful to that
motivation for all time. Now suppose a paper clip factory has installed a
high-end computing system to handle its design tasks, automate manufacturing,
manage acquisition and distribution of its products, and otherwise obtain
an advantage over its competitors. This system, with connectivity
to the global Internet, makes the leap to superintelligence before any
other system (since it understands that superintelligence will enable it
to better achieve the goals set for it). Overnight, it replicates itself
all around the world, manipulates financial markets to obtain resources
for itself, and deploys them to carry out its mission. The mission?—to
maximise the number of paper clips produced in its future light cone.
“Clippy”, if I may address it so informally, will rapidly
discover that most of the raw materials it requires in the near future
are locked in the core of the Earth, and can be liberated by
disassembling the planet by self-replicating nanotechnological
machines. This will cause the extinction of its creators and all
other biological species on Earth, but then they were just consuming
energy and material resources which could better be deployed for making
paper clips. Soon other planets in the solar system would be similarly
dispatched on missions to
other stars, there to make paper clips and spawn other probes to more
stars and eventually other galaxies. Eventually, the entire visible
universe would be turned into paper clips, all because the original
factory manager didn't hire a philosopher to work out the ultimate
consequences of the final goal programmed into his factory automation
system.
This is a light-hearted example, but if you happen to observe a void in a
galaxy whose spectrum resembles that of paper clips, be very
afraid.
One of the reasons to believe that we will have to confront superintelligence
is that there are multiple roads to achieving it, largely independent of
one another. Artificial general intelligence (the ability to act intelligently
in as many domains as humans exhibit intelligence today, and not
constrained to limited tasks such as playing chess or driving a car) may
simply await the discovery of a clever software method which could run on
existing computers or networks. Or, it might emerge as networks store more
and more data about the real world and have access to accumulated human
knowledge. Or, we may build “neuromorphic” systems whose
hardware operates in ways similar to the components of human brains, but
at electronic, not biologically-limited speeds. Or, we may be able to
scan an entire human brain and emulate it, even without understanding how
it works in detail, either on neuromorphic or a more conventional
computing architecture. Finally, by identifying the genetic components
of human intelligence, we may be able to manipulate the human germ line,
modify the genetic code of embryos, or select among mass-produced
embryos those with the greatest predisposition toward intelligence. All
of these approaches may be pursued in parallel, and progress in one may
accelerate the others.
At some point, the emergence of superintelligence calls into question
the economic rationale for a large human population. In 1915, there were
about 26 million horses in the U.S. By the early 1950s, only 2 million
remained. Perhaps the AIs will have a nostalgic attachment to those who
created them, as humans had for the animals who bore their burdens for
millennia. But on the other hand, maybe they won't.
As an engineer, I usually don't have much use for philosophers, who are
given to long gassy prose devoid of specifics and to spouting
complicated indirect arguments which don't seem to be independently
testable (“What if we asked the AI to determine its own goals,
based on its understanding of what we would ask it to do if only
we were as intelligent as it and thus able to better comprehend what
we really want?”). These are interesting concepts, but would
you want to bet the destiny of the universe on them? The latter half
of the book is full of such fuzzy speculation, which I doubt will result
in clear policy choices before we're faced with the emergence
of an artificial intelligence, after which, if they're wrong, it will
be too late.
That said, this book is a welcome antidote to wildly optimistic views
of the emergence of artificial intelligence which blithely assume it
will be our dutiful servant rather than a fearful master. Some readers
may assume that an artificial intelligence will be something like a
present-day computer or search engine, rather than a self-aware agent
with its own agenda and powerful wiles to advance it, based upon a knowledge
of humans far beyond what any single human brain can encompass. Unless you
believe there is some kind of intellectual
élan vital inherent in biological
substrates which is absent in their equivalents based on other hardware
(which just seems silly to me—like arguing there's something
special about a horse which can't be accomplished better by a truck),
the mature artificial intelligence will be the superior in every way
to its human creators, so in-depth ratiocination about how it will
regard and treat us is in order before we find ourselves faced with the
reality of dealing with our successor.
Friday, September 5, 2014
Reading List: The South Pole
- Amundsen, Roald.
The South Pole.
New York: Cooper Square Press, 2001.
In modern warfare, it has been observed that “generals win battles,
but logisticians win wars.” So it is with planning an exploration
mission to a remote destination where no human has ever set foot, and
the truths are as valid for polar exploration in the early 20th century as
they will be for missions to Mars in the 21st. On December 14th, 1911,
Roald Amundsen and his five-man southern party reached the South Pole after
a trek from the camp on the Ross Ice Shelf where they had passed the previous
southern winter, preparing for an assault on the pole as early as the
weather would permit. By over-wintering, they would be able to depart
southward well before a ship would be able to land an expedition, since
a ship would have to wait until the sea ice dispersed sufficiently to
make a landing.
Amundsen's plan was built around what space mission architects
call “in-situ resource utilisation” and “depots”,
as well as “propulsion staging”. This allowed for a very
lightweight push to the pole, both in terms of the amount of supplies
which had to be landed by their ship, the Fram, and in the
size of the polar party and the loading of their sledges. Upon arriving
in Antarctica, Amundsen's party immediately began to hunt the abundant seals
near the coast. More than two hundred seals were killed, processed, and
stored for later use. (Since the temperature on the
Ross Ice Shelf
and the Antarctic interior never rises above freezing, the seal meat would
keep indefinitely.) Then parties were sent out in the months remaining
before the arrival of winter in 1911 to establish depots at every degree of
latitude between the base camp and 82° south. These depots contained
caches of seal meat for the men and dogs and kerosene for melting snow for
water and cooking food. The depot-laying journeys familiarised the explorers
with driving teams of dogs and operating in the Antarctic environment.
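The arithmetic behind depot-laying is worth making explicit. All the
figures below are hypothetical, chosen only to illustrate the principle
that with depots a party need carry only enough to reach the next cache,
rather than the whole journey's supplies:

```python
# Why depots matter: a party that must carry every day's supplies from the
# start hauls a load proportional to the whole journey; with depots it need
# only carry enough to reach the next cache.  All numbers are hypothetical,
# for illustration only.
DAILY_RATION_KG = 10      # hypothetical daily consumption per sledge (men + dogs)
DISTANCE_KM = 1400        # hypothetical one-way distance, base camp to pole
SPEED_KM_PER_DAY = 28     # hypothetical daily travel
DEPOT_SPACING_KM = 111    # one degree of latitude

days_total = DISTANCE_KM / SPEED_KM_PER_DAY
days_between_depots = DEPOT_SPACING_KM / SPEED_KM_PER_DAY

load_no_depots = days_total * DAILY_RATION_KG
load_with_depots = days_between_depots * DAILY_RATION_KG

print(f"carry everything: {load_no_depots:6.0f} kg of supplies at the start")
print(f"with depots:      {load_with_depots:6.0f} kg between caches")
```

In the event, Amundsen's depots extended only to 82° south, so supplies
for the plateau leg still had to be hauled, but the same logic explains
why his sledges could start out so lightly loaded.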
Amundsen had chosen dogs to pull his sledges. While his rival to
be first at the pole,
Robert Falcon Scott,
experimented with pulling sledges by ponies, motorised sledges, and man-hauling,
Amundsen relied upon the experience of indigenous people in Arctic environments
that dogs were the best solution. Dogs reproduced and matured sufficiently
quickly that attrition could be made up by puppies born during the expedition,
they could be fed on seal meat, which could be obtained locally, and if a dog
team were to fall into a crevasse (as was inevitable when crossing uncharted
terrain), the dogs could be hauled out, no worse for wear, by the drivers
of other sledges. For ponies and motorised sledges, this was not the case.
Further, Amundsen adopted a strategy which can best be described as
“dog eat dog”. On the journey to the pole, he started with
52 dogs. Seven of these had died from exhaustion or other causes before
the ascent to the polar plateau. (Dogs who died were butchered and fed to
the other dogs. Greenland sled dogs, being only slightly removed from
wolves, had no hesitation in devouring their erstwhile comrades.) Once
reaching the plateau, 27 dogs were slaughtered, their meat divided between
the surviving dogs and the five men. Only 18 dogs would proceed to the pole.
Dog carcasses were cached for use on the return journey.
Beyond the depots, the polar party had to carry everything required for the
trip, but knowing the depots would be available for the return allowed them
to travel lightly. After reaching the pole, they remained for three days
to verify their position, send out parties to ensure they had encircled the
pole's position, and build a cairn to commemorate their achievement.
Amundsen left a letter which he requested Captain Scott deliver to King
Haakon VII of Norway should Amundsen's party be lost on its return to
base. (Sadly, that was the
fate which awaited Scott, who arrived at the pole
on January 17th, 1912, only to find the Amundsen expedition's cairn there.)
This book is Roald Amundsen's contemporary memoir of the expedition. Originally
published in two volumes, the present work includes both. Appendices
describe the ship, the Fram, and scientific investigations in
meteorology, geology, astronomy, and oceanography conducted during the
expedition. Amundsen's account is as matter-of-fact as the memoirs of
some astronauts, but a wry humour comes through when discussing dealing
with sled dogs who have a will of their own and also the foibles of humans
cooped up in a small cabin in an alien environment during a night which lasts
for months. He evinces great respect for his colleagues and competitors in polar
exploration, particularly Scott, and
worries whether his own approach to reaching the pole would prove
superior to theirs. At the time the book was published, the tragic fate of
Scott's expedition was not known.
Today, we might not think of polar exploration as science, but a century ago it
was as central to the scientific endeavour as robotic exploration of Mars is
today. Here was an entire continent, known only in sketchy detail around its
coast, with only a few expeditions into the interior. When Amundsen's party set
out on their march to the pole, they had no idea whether they would encounter
mountain ranges along the way and, if so, whether they could find a way over or
around them. They took careful geographic and meteorological observations along
their trek (as well as oceanographical measurements on the trip to Antarctica and
back), and these provided some of the first data points toward understanding
weather in the southern hemisphere.
In Norway, Amundsen was hailed as a hero. But it is clear from this narrative
he never considered himself such. He wrote:
I may say that this is the greatest factor—the way in which the expedition
is equipped—the way in which every difficulty is foreseen, and precautions
taken for meeting or avoiding it. Victory awaits him who has everything in
order—luck, people call it. Defeat is certain for him who has neglected to
take the necessary precautions in time; this is called bad luck.
This work is in the public domain, and there are numerous editions of it
available, in print and in electronic form, many from independent
publishers. The independent publishers, for the most part, did not
distinguish themselves in their respect for this work. Many of their
editions were produced by running an optical character recognition program
over a print copy of the book, then putting it together with minimal
copy-editing. Some (including the one I was foolish enough to buy)
elide all of the diagrams, maps, and charts from the original book,
which renders parts of the text incomprehensible. The paperback edition
cited above, while expensive, is a facsimile edition of the original
1913 two volume English translation of Amundsen's original work, including
all of the illustrations. I know of no presently-available electronic
edition which has comparable quality and includes all of the material in
the original book. Be careful—if you follow the link to the paperback
edition, you'll see a Kindle edition listed, but this is from a different
publisher and is rife with errors and includes none of the illustrations.
I made the mistake of buying it, assuming it was the same as the highly-praised
paperback. It isn't; don't be fooled.
Friday, August 29, 2014
Reading List: The Man Who Changed Everything
- Mahon, Basil.
The Man Who Changed Everything.
Chichester, UK: John Wiley & Sons, 2003.
In the 19th century, science in general and physics in particular grew up,
assuming their modern form which is still recognisable today. At the start
of the century, the word “scientist” was not yet in use, and
the natural philosophers of the time were often amateurs. University
research in the sciences, particularly in Britain, was rare. Those
working in the sciences were often occupied by cataloguing natural
phenomena, and apart from Newton's monumental achievements, few people
focussed on discovering mathematical laws to explain the new physical
phenomena which were being discovered such as electricity and magnetism.
One person, James Clerk Maxwell, was largely responsible for creating the
way modern science is done and the way we think about theories of physics,
while simultaneously restoring Britain's standing in physics compared to
work on the Continent, and he created an institution which would continue
to do important work from the time of his early death until the present day.
While every physicist and electrical engineer knows of Maxwell and his
work, he is largely unknown to the general public, and even those who are
aware of his seminal work in electromagnetism may be unaware of the extent
his footprints are found all over the edifice of 19th century physics.
Maxwell was born in 1831 to a Scottish lawyer, John Clerk, and his wife Frances Cay.
Clerk subsequently inherited a country estate, and added “Maxwell”
to his name in honour of the noble relatives from whom he inherited it. His
son's first name, then, was “James” and his surname “Clerk Maxwell”:
this is why his full name is always used instead of “James Maxwell”.
From childhood, James was curious about everything he encountered, and instead
of asking “Why?” over and over like many children, he drove his
parents to distraction with “What's the go o' that?”. His father
did not consider science a suitable occupation for his son and tried to direct
him toward the law, but James's curiosity did not extend to legal tomes and
he concentrated on topics that interested him. He published his first
scientific paper, on curves with more than two foci, at the age of 14.
He pursued his scientific education first at the University of Edinburgh
and later at Cambridge, where he graduated in 1854 with a degree in mathematics.
He came in second in the prestigious Tripos examination, earning the title of
Second Wrangler.
Maxwell was now free to begin his independent research, and he turned
to the problem of human colour vision. It had been established that
colour vision worked by detecting the mixture of three primary colours,
but Maxwell was the first to discover that these primaries were red,
green, and blue, and that by mixing them in the correct proportion,
white would be produced. This was a matter to which Maxwell would
return repeatedly during his life.
In 1856 he accepted an appointment as a full professor and department head
at Marischal College, in Aberdeen, Scotland. In 1857, the topic for the
prestigious Adams Prize was the nature of the rings of Saturn. Maxwell's
submission was a tour de force which
proved that the rings could be neither solid nor liquid, and hence
had to be made of an enormous number of individually orbiting bodies.
Maxwell was awarded the prize, the significance of which was magnified
by the fact that his was the only submission: all of the others who
aspired to solve the problem had abandoned it as too difficult.
Maxwell's next post was at King's College London, where he investigated
the properties of gases and strengthened the evidence for the molecular
theory of gases. It was here that he first undertook to explain the
relationship between electricity and magnetism which had been discovered
by Michael Faraday. Working in the old style of physics, he constructed,
as a thought experiment, an intricate mechanical model which might explain the
lines of force that Faraday had introduced but which many scientists
thought were mystical mumbo-jumbo. Maxwell believed the alternative
of action at a distance without any intermediate mechanism was
wrong, and was able, with his model, to explain the phenomenon of
rotation of the plane of polarisation of light by a magnetic field,
which had been discovered by Faraday. While at King's College, to
demonstrate his theory of colour vision, he took and displayed the
first colour photograph.
Maxwell's greatest scientific achievement came while he was living the life
of a country gentleman at his estate, Glenlair. In his textbook,
A Treatise on Electricity and Magnetism, he presented
the equations which now bear his name, showing that electricity and magnetism were
two aspects of the same phenomenon. This was the first of the great unifications
of physical laws which have continued to the present day. But that isn't
all they showed. The speed of light appeared as a conversion factor between
the units of electricity and magnetism, and the equations allowed solutions
of waves oscillating between an electric and magnetic field which could
propagate through empty space at the speed of light. It was compelling
to deduce that light was just such an electromagnetic wave, and that
waves of other frequencies outside the visual range must exist. Thus
was laid the foundation of wireless communication, X-rays, and gamma rays.
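The unification can be stated compactly. In the modern vector notation
(due to Heaviside, not Maxwell's original formulation), the source-free
equations and the propagation speed they imply are:

```latex
% Maxwell's equations in vacuum (SI units, modern vector notation)
\begin{align}
  \nabla \cdot \mathbf{E} &= 0 &
  \nabla \cdot \mathbf{B} &= 0 \\
  \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t} &
  \nabla \times \mathbf{B} &= \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
\end{align}
% Combining the two curl equations yields a wave equation whose
% propagation speed is fixed by the electrical constants alone:
\begin{equation}
  c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^{8}\ \mathrm{m/s}
\end{equation}
```

That the speed of propagation falls out of two constants measurable on a
laboratory bench is what made the identification of light with
electromagnetic waves so compelling.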
The speed of light is a constant in Maxwell's equations, not depending upon
the motion of the observer. This appears to conflict with Newton's laws
of mechanics, and it was not until Einstein's 1905 paper on special
relativity
that the mystery would be resolved. In essence, faced with a dispute between
Newton and Maxwell, Einstein decided to bet on Maxwell, and he chose wisely.
Finally, when you look at Maxwell's equations (in their modern form, using
the notation of vector calculus), they appear lopsided. While they unify
electricity and magnetism, the symmetry is imperfect in that while a moving
electric charge generates a magnetic field, there is no magnetic charge which,
when moved, generates an electric field. Such a charge would be a
magnetic monopole,
and despite extensive experimental searches, none has ever been found. The
existence of monopoles would make Maxwell's equations even more beautiful, but
sometimes nature doesn't care about that. By all evidence to date, Maxwell got it
right.
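The asymmetry is visible when the source terms are included: electric
charge and current appear as sources, while the magnetic divergence is
identically zero. A hypothetical magnetic charge density \(\rho_m\) and
magnetic current \(\mathbf{J}_m\) would restore the symmetry:

```latex
% Maxwell's equations with hypothetical magnetic charge (rho_m) and
% magnetic current (J_m); setting both to zero recovers the usual form.
\begin{align}
  \nabla \cdot \mathbf{E} &= \frac{\rho_e}{\varepsilon_0} &
  \nabla \cdot \mathbf{B} &= \mu_0 \rho_m \\
  \nabla \times \mathbf{E} &= -\mu_0 \mathbf{J}_m
    - \frac{\partial \mathbf{B}}{\partial t} &
  \nabla \times \mathbf{B} &= \mu_0 \mathbf{J}_e
    + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
\end{align}
```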
In 1871 Maxwell came out of retirement to accept a professorship at Cambridge
and found the Cavendish Laboratory,
which would focus on experimental science and elevate Cambridge to world-class
status in the field. To date, 29 Nobel Prizes have been awarded for work done
at the Cavendish.
Maxwell's theoretical and experimental work on heat and gases revealed
discrepancies which were not explained until the development of quantum
theory in the 20th century. His suggestion of what came to be known as
“Maxwell's demon”
posed a deep puzzle in the foundations of thermodynamics which eventually,
a century later, showed the deep connections between information theory
and statistical mechanics. His practical work on automatic governors for
steam engines foreshadowed what we now call control theory. He played a key
part in the development of the units we use for electrical quantities.
By all accounts Maxwell was a modest, generous, and well-mannered man. He
wrote whimsical poetry, discussed a multitude of topics (although he had little
interest in politics), was an enthusiastic horseman and athlete (he would swim
in the sea off Scotland in the winter), and was happily married, with his wife
Katherine an active participant in his experiments. All his life, he supported
general education in science, founding a working men's college in Cambridge and
lecturing at such colleges throughout his career.
Maxwell lived only 48 years—he died in 1879 of the same cancer which had
killed his mother when he was only eight years old. When he fell ill, he was
engaged in a variety of research while presiding over the Cavendish Laboratory.
We shall never know what he might have done had he been granted another two decades.
Apart from the significant achievements Maxwell made in a wide variety of
fields, he changed the way physicists look at, describe, and think about
natural phenomena. After using a mental model to explore electromagnetism,
he discarded it in favour of a mathematical description of its behaviour.
There is no theory behind Maxwell's equations: the equations are
the theory. To the extent they produce the correct results when
experimental conditions are plugged in, and predict new phenomena which
are subsequently confirmed by experiment, they are valuable. If they
err, they should be supplanted by something more precise. But they say
nothing about what is really going on—they only seek to
model what happens when you do experiments. Today, we are so accustomed
to working with theories of this kind: quantum mechanics, special and general
relativity, and the standard model of particle physics, that we don't think
much about it, but it was revolutionary in Maxwell's time. His mathematical
approach, like Newton's, eschewed explanation in favour of prediction: “We
have no idea how it works, but here's what will happen if you do this experiment.”
This is perhaps Maxwell's greatest legacy.
This is an excellent scientific biography of Maxwell which also gives the reader
a sense of the man. He was such a quintessentially normal person there aren't
a lot of amusing anecdotes to relate. He loved life, loved his work, cherished his
friends, and discovered the scientific foundations of the technologies which
allow you to read this. In the
Kindle edition, at least as read on an iPad, the text
appears in a curious, spidery, almost vintage font in which periods are difficult to
distinguish from commas. Numbers sometimes have spurious spaces embedded within them,
and the index cites pages in the print edition which are useless since the Kindle
edition does not include real page numbers.
Thursday, August 21, 2014
Reading List: Savage Continent
- Lowe, Keith.
New York: Picador, 2013.
On May 8th, 1945, World War II in Europe formally ended when the Allies
accepted the unconditional surrender of Germany. In popular myth,
especially among those too young to have lived through the war and
its aftermath, the defeat of Italy and Germany ushered in, at least
in Western Europe not occupied by Soviet troops, a period of rebuilding
and rapid economic growth, spurred by the
Marshall Plan. The French
refer to the three decades from 1945 to 1975 as
Les Trente Glorieuses.
But that isn't what actually happened, as this book documents in detail.
Few books cover the immediate aftermath of the war, or concentrate
exclusively upon that chaotic period. The author has gone to great lengths
to explore little-known conflicts and to sort out conflicting accounts of
what happened, some of them still disputed today by descendants of those involved.
The devastation wreaked upon cities where the conflict raged was extreme.
In Germany, Berlin, Hanover, Duisburg, Dortmund, and Cologne lost more
than half their habitable buildings, with the figure rising to 70% in
the latter city. From Stalingrad to Warsaw to Caen in France, destruction
was general with survivors living in the rubble. The transportation
infrastructure was almost completely obliterated, along with services
such as water, gas, electricity, and sanitation. The industrial plant
was wiped out, and along with it the hope of employment. This was the
state of affairs in May 1945, and the Marshall Plan did not begin to
deliver assistance to Western Europe until three years later,
in April 1948. Those three years were grim, and compounded by score-settling,
revenge, political instability, and multitudes of displaced people
returning to areas with no infrastructure to support them.
And this was in Western Europe. As is the case with just about everything
regarding World War II in Europe, the further east you go, the worse things
get. In the Soviet Union, 70,000 villages were destroyed, along with
32,000 factories. The redrawing of borders, particularly those of Poland
and Germany, set the stage for a paroxysm of ethnic cleansing and mass
migration as Poles were expelled from territory now incorporated into the
Soviet Union and Germans from the western part of Poland. Reprisals against
those accused of collaboration with the enemy were widespread, with
murder not uncommon. Thirst for revenge extended to the innocent, including
children fathered by soldiers of occupying armies.
The end of the War did not mean an end to the wars. As the author writes,
“The Second World War was therefore not only a traditional
conflict for territory: it was simultaneously a war of race, and a war
of ideology, and was interlaced with half a dozen civil wars fought for
purely local reasons.” Defeat of Germany did nothing to bring these
other conflicts to an end. Guerrilla wars continued in the Baltic states
annexed by the Soviet Union as partisans resisted the invader. An all-out
civil war between communists and anti-communists erupted in Greece and
was ended only through British and American aid to the anti-communists.
Communist agitation escalated to violence in Italy and France. And
country after country in Eastern Europe came under Soviet domination as
puppet regimes were installed through coups, subversion, or rigged elections.
When reading a detailed history of a period most historians ignore, one
finds oneself exclaiming over and over, “I didn't know that!”,
and that is certainly the case here. This was a dark period, and no group
seemed immune from regrettable acts, including Jews liberated from Nazi
death camps and slave labourers freed as the Allies advanced: both sometimes
took their revenge upon German civilians. As the author demonstrates,
the aftermath of this period still simmers beneath the surface among the people
involved—it has become part of the identity of ethnic groups which will
outlive any person who actually remembers the events of the immediate postwar period.
In addition to providing an enlightening look at this neglected period, the
events in the years following 1945 have much to teach us about those playing
out today around the globe. We are seeing long-simmering ethnic and religious
strife boil into open conflict as soon as the system is perturbed enough to
knock the lid off the kettle. Borders drawn by politicians mean little when
people's identity is defined by ancestry or faith, and memories are very long,
measured sometimes in centuries. Even after a cataclysmic conflict which levels
cities and reduces populations to near-medieval levels of subsistence, many
people do not long for peace but instead seek revenge. Economic growth
and prosperity can, indeed, change the attitude of societies and allow for
alliances among former enemies (imagine how odd the phrase
“Paris-Berlin axis”, heard today in discussions of the European
Union, would have sounded in 1946), but the results of a protracted
conflict can prevent the emergence of the very prosperity which might allow
consigning it to the past.
Tuesday, August 12, 2014
Reading List: Black List
- Thor, Brad.
New York: Pocket Books, 2012.
This is the twelfth novel in the author's
Scot Harvath series, which began with
The Lions of Lucerne (October 2010).
Brad Thor has remarked in interviews that he strives to write thrillers
which anticipate headlines which will break after their publication,
and with this novel he hits a grand slam.
Scot Harvath is ambushed in Paris by professional killers who murder
a member of his team. After narrowly escaping, he goes to ground and
covertly travels to a remote region in Basque country where he has
trusted friends. He is then attacked there, again by
trained killers, and he has to conclude that the probability is high
that the internal security of his employer, the Carlton Group, has
been breached, perhaps from inside.
Meanwhile, his employer, Reed Carlton, is attacked at his secure compound
by an assault team and barely escapes with his life. When Carlton tries
to use his back channels to contact members of his organisation, they all
appear to have gone dark. To Carlton, a career spook with tradecraft flowing
in his veins, this indicates his entire organisation has been wiped out,
for no apparent motive and by perpetrators unknown.
Harvath, Carlton, and the infovore dwarf Nicholas, operating independently,
must begin to pick up the pieces to figure out what is going on, while
staying under the radar of a pervasive surveillance state which employs
every technological means to track them down and target them for
summary extra-judicial elimination.
If you pick up this book and read it today, you might think it's based
upon the revelations of Edward Snowden
about the abuses of the NSA
conducting warrantless surveillance on U.S. citizens. But it was published
in 2012, a full year before the first of Snowden's disclosures.
The picture of the total information awareness state here is,
if anything, more benign than what we now know to be the case in reality.
What is different is that when Harvath, Carlton, and Nicholas get to the
bottom of the mystery, the reaction in high places is what one would
hope for in a constitutional republic, as opposed to the
“USA! USA! USA!” cheerleading or silence with which
all too many have greeted the exposure of abuses by the NSA.
This is a prophetic thriller which demonstrates how the smallest compromises
of privacy: credit card transactions, telephone call metadata, license
plate readers, facial recognition, Web site accesses, search engine queries,
etc. can be woven into a dossier on any person of interest which makes going
dark to the snooper state equivalent to living technologically in 1950.
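As a toy illustration of how that weaving works (every name, timestamp, and record below is invented for the example; no real data source or API is implied), a few lines of Python suffice to merge separate metadata streams into a single time-ordered profile keyed on identity:

```python
# Illustration only: hypothetical metadata streams, each innocuous alone,
# joined on a common identifier into one chronological profile.
from collections import defaultdict

streams = {
    "card_purchases": [("alice", "2014-07-01 08:02", "café, Rue de Rivoli")],
    "phone_metadata": [("alice", "2014-07-01 08:15", "3-minute call, cell tower 17")],
    "plate_reader":   [("alice", "2014-07-01 08:40", "A6 motorway, km 12")],
}

def build_dossier(streams):
    """Merge every stream's records into a time-sorted profile per person."""
    dossier = defaultdict(list)
    for source, records in streams.items():
        for person, timestamp, detail in records:
            dossier[person].append((timestamp, source, detail))
    for person in dossier:
        dossier[person].sort()  # chronological order across all sources
    return dict(dossier)

profile = build_dossier(streams)
for timestamp, source, detail in profile["alice"]:
    print(timestamp, source, detail)
```

Each record is harmless in isolation; it is the join on a common identifier, repeated across many sources, that produces the dossier.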
This is not just a cautionary tale for individuals who wish to preserve a
wall of privacy around themselves from the state, but also a challenge for
writers of thrillers. Just as mobile telephones would have wrecked the
plots of innumerable mystery and suspense stories written before their
existence, the emergence of the total information awareness
state will make it difficult for thriller writers to have both their
heroes and villains operating in the dark. I am sure the author will
rise to this challenge.
Thursday, July 31, 2014
Reading List: Conversations with My Agent (and Set Up, Joke, Set Up, Joke)
- Long, Rob.
Conversations with My Agent (and Set Up, Joke, Set Up, Joke).
London: Bloomsbury Publishing, [1996, 2005] 2014.
Hollywood is a strange place, where the normal rules of business, economics,
and personal and professional relationships seem to have been suspended.
When he arrived in Hollywood in 1930,
P. G. Wodehouse
found the customs and antics of its denizens so bizarre that he parodied
them in a series of hilarious stories. After a year in Hollywood, he'd
had enough and never returned. When Rob Long arrived in Hollywood to attend
UCLA film school, the television industry was on the threshold of a
technology-driven change which would remake it and forever put an end to the
domination by three large networks which had existed since its inception.
The advent of cable and, later, direct to home satellite broadcasting
eliminated the terrestrial bandwidth constraints which had made establishing
a television outlet forbiddingly expensive and, at the same time, side-stepped
many of the regulatory constraints which forbade “edgy” content
on broadcast channels. Long began his television career as a screenwriter
for Cheers in 1990, and became an executive producer of the show in 1992. After
the end of Cheers, he created and produced other television
programs, including Sullivan & Son,
which is currently on the air.
Television ratings measure both “rating points”: the absolute number of
television sets tuned into the program, and “share points”: the
fraction of television sets turned on at the time viewing the program.
In the era of Cheers, a typical episode might have a rating
equivalent to more than 22 million viewers and a share of 32%, meaning
it pulled in around one third of all television viewers in its time slot.
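The arithmetic behind the two measures can be sketched as follows; the television-household and sets-in-use totals are assumed round numbers for illustration, not actual Nielsen figures:

```python
def rating_and_share(viewers, tv_households, sets_in_use):
    """Rating: percentage of all TV households watching the program.
    Share: percentage of sets actually in use that are watching it."""
    rating = 100.0 * viewers / tv_households
    share = 100.0 * viewers / sets_in_use
    return rating, share

# Assumed Cheers-era figures: ~92 million TV households, of which
# ~69 million sets are in use in the time slot, 22 million watching.
rating, share = rating_and_share(22e6, 92e6, 69e6)
print(f"rating ≈ {rating:.1f} points, share ≈ {share:.0f}%")
```

With these assumed totals, 22 million viewers correspond to a share of about 32%, matching the one-third figure quoted for a typical Cheers episode.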
The proliferation of channels makes it unlikely any show will achieve numbers
like this again. One extremely popular recent series
attracted between 9 and 14 million viewers across its eight seasons, and
another, highly regarded by critics,
never topped a mean viewership of 2.7 million in its best season.
It was into this new world of diminishing viewership expectations but
voracious thirst for content to fill all the new channels that the
author launched his post-Cheers career. The present
volume collects two books originally published independently,
Conversations with My Agent from 1998, and
2005's Set Up, Joke, Set Up, Joke, written when his
post-Cheers career was well-advanced. The volumes fit
together almost seamlessly, and many readers will barely notice the transition between them.
This is a very funny book, but there is also a great deal of wisdom
about the ways of Hollywood, how television projects are created,
pitched to a studio, marketed to a network, and the tortuous process
leading from concept to script to pilot to series and, all too often,
to cancellation. The book is written as a screenplay,
complete with scene descriptions, directions, dialogue, transitions,
and sound effect call-outs. Most of the scenes are indeed
conversations between the author and his agent in various
circumstances, but we also get to be a fly on the wall at story
pitches, meetings with the network, casting, shooting an episode,
focus group testing, and many other milestones in the life cycle of a
situation comedy. The circumstances are fictional, but are clearly
informed by real-life experience. Anybody contemplating a career in
Hollywood, especially as a television screenwriter, would be insane
not to read this book. You'll laugh a lot, but also learn something
on almost every page.
The reader will also begin to appreciate the curious ways of Hollywood
business, what the author calls “HIPE”: the Hollywood
Inversion Principle of Economics. “The HIPE, as it will come
to be known, postulates that every commonly understood, standard
business practice of the outside world has its counterpart in the
entertainment industry. Only it's backwards.” And anybody who
thinks accounting is not a creative profession has never had experience
with a Hollywood project. The culture of the entertainment business is
also on display—an intricate pecking order involving writers,
producers, actors, agents, studio and network executives, and “below
the line” specialists such as camera operators and editors, all of whom
have to read the trade papers to know who's up and who's not.
This book provides an insider's perspective on the strange way television
programs come to be. In a way, it resembles some aspects of venture
capital: most projects come to nothing, and most of those which are
funded fail, losing the entire investment. But the few which succeed
can generate sufficient money to cover all the losses and still yield
a large return. One television show that runs for five years, producing
solid ratings and 100+ episodes to go into syndication, can set up its
writers and producers for life and cover the studio's losses on all of
the dogs and cats.
Wednesday, July 30, 2014
Reading List: Robert A. Heinlein: In Dialogue with His Century. Vol 1
- Patterson, William H., Jr.
Robert A. Heinlein: In Dialogue with His Century. Vol. 1
New York: Tor Books, 2010.
Robert Heinlein came from a family who had been present in America before there
were the United States, and whose members had served in all of the wars of the
Republic. Despite being thin, frail, and with dodgy eyesight, he managed to be
appointed to the U.S. Naval Academy where, despite demerits for being a hellion,
he graduated and was commissioned as a naval officer. He was on the track to a
naval career when felled by tuberculosis (which was, in the 1930s, a potential death
sentence, with the possibility of recurrence any time in later life).
Heinlein had written while in the Navy, but after his forced medical retirement,
turned his attention to writing science fiction for pulp magazines, and
after receiving a cheque for US$ 70 for his first short story,
he exclaimed, “How long has this racket been
going on? And why didn't anybody tell me about it sooner?” Heinlein
always viewed writing as a business, and kept a thermometer on which he
charted his revenue toward paying off the mortgage on his house.
While Heinlein fit in very well with the Navy, and might have been, absent
medical problems, a significant commander in the fleet in World War II,
he was also, at heart, a bohemian, with a soul almost orthogonal to military
tradition and discipline. His first marriage was a fling with a woman who
introduced him to physical delights of which he was unaware. That ended
quickly, and then he married Leslyn, who was his muse, copy-editor, and
business manager in a marriage which persisted throughout World War II,
when both were involved in war work. In this effort Leslyn worked herself
into insanity and alcoholism, and they divorced in 1947.
It was Robert Heinlein who vaulted science fiction from the ghetto of the
pulp magazines to the “slicks” such as Collier's and the
Saturday Evening Post. This was due to a technological transition
in the publishing industry which is comparable to that presently underway in
the migration from print to electronic publishing. Rationing of paper during
World War II helped to create the “pocket book” or paperback
publishing industry. After the end of the war, these new entrants in the
publishing market saw a major opportunity in publishing anthologies of stories
previously published in the pulps. The pulp publishers viewed this as an
existential threat—who would buy a pulp magazine if, for almost the same
price, one could buy a collection of the best stories from the last
decade in all of those magazines?
Heinlein found his fiction entrapped in this struggle. While today, when you sell
a story to a magazine in the U.S., you usually only sell “First North American
serial rights”, in the 1930s and 1940s, authors sold all rights, and it was
up to the publisher to release their rights for republication of a work in an
anthology or adaptation into a screenplay. This is parallel to the contemporary battle
between traditional publishers and independent publishing platforms, which have
become the heart of science fiction.
Heinlein was complex. While an exemplary naval officer, he was a nudist, married
three times, interested in the esoteric (and a close associate of
L. Ron Hubbard).
He was an enthusiastic supporter of
Upton Sinclair's EPIC movement and
the “Social Credit” agenda.
This authorised biography, with major contributions from Heinlein's widow, Virginia,
chronicles the master storyteller's life in his first forty years—until he found,
or created, an audience receptive to the tales of wonder he spun. If you've read
all of Heinlein's fiction, it may be difficult to imagine how much of it was based in
Heinlein's own life. If you thought Heinlein's later novels were weird, appreciate
how the master was weird before you were born.
I had the privilege of meeting Robert and Virginia Heinlein in 1984. I shall always
cherish that moment.
Sunday, July 27, 2014
Reading List: The Guns of August
- Tuchman, Barbara W.
The Guns of August.
New York: Presidio Press, [1962, 1988, 1994] 2004.
One hundred years ago the world was on the brink of a cataclysmic
confrontation which would cause casualties numbered in the tens of
millions, destroy the pre-existing international order, depose
royalty and dissolve empires, and plant the seeds for tyrannical
regimes and future conflicts with an even more horrific toll in
human suffering. It is not exaggeration to speak of World War I
as the pivotal event of the 20th century, since so much that
followed can be viewed as sequelæ which can be traced directly
to that conflict.
It is thus important to understand how that war came to be, and how
in the first month after its outbreak the expectations of all parties
to the conflict, arrived at through the most exhaustive study by
military and political élites, were proven completely wrong
and what was expected to be a short, conclusive war turned instead into
a protracted blood-letting which would continue for more than four
years of largely static warfare. This magnificent book, which covers
the events leading to the war and the first month after its outbreak,
provides a highly readable narrative history of the period with
insight both into the grand folly of war plans drawn up in isolation
and followed mechanically even after abundant evidence of their
faults had caused tragedy, and into how contingency (chance,
and the decisions of fallible human beings in positions of authority)
can tilt the balance of history.
The author is not an academic historian, and she writes for a
popular audience. This has caused some to sniff at her work, but as
she noted, Herodotus, Thucydides, Gibbon, and Macaulay did not have Ph.D.s.
She immerses the reader in the world before the war, beginning with the
1910 funeral in London of Edward VII where nine monarchs rode in the
cortège, most of whose nations would be at war four years hence. The
system of alliances is described in detail, as are the mobilisation plans
of the future combatants, all of which contributed to the fatal
instability of the system in the face of even a small perturbation.
Germany, France, Russia, and Austria-Hungary had all drawn up detailed
mobilisation plans for assembling, deploying, and operating their
conscript armies in the event of war. (Britain, with an all-volunteer
regular army which was tiny by continental standards, had no
pre-defined mobilisation plan.) As you might expect, Germany's plan
was the most detailed, specifying railroad schedules and the
composition of individual trains. Now, the important thing to keep
in mind about these plans is that, together, they created a powerful
first-mover advantage. If Russia began to mobilise, and Germany
hesitated in its own mobilisation in the hope of defusing the conflict,
it might find itself at a great disadvantage if Russia gained even a few days'
head start in assembling its forces. This meant there was a powerful
incentive to issue the mobilisation order first, and a compelling reason
for an adversary to begin his own mobilisation as soon as news of it arrived.
Compounding this instability were alliances which compelled parties to
them to come to the assistance of others. France had no direct interest
in the conflict between Germany and Austria-Hungary and Russia in
the Balkans, but it had an alliance with Russia, and was pulled into
the conflict. When France began to mobilise, Germany activated its own
mobilisation and the Schlieffen Plan
to invade France through Belgium. Once the Germans violated the neutrality
of Belgium, Britain's guarantee of that neutrality required (after the
customary ambiguity and dithering) a declaration of war against Germany,
and the stage was set for a general war in Europe.
The focus here is on the initial phase of the war: where Germany, France,
and Russia were all following their pre-war plans, all initially
expecting a swift conquest of their opponents—the
Battle of the Frontiers,
which occupied most of the month of August 1914. An afterword covers the
First Battle of the Marne
where the German offensive on the Western front was halted and the stage set
for the static trench warfare which was to ensue. At the conclusion of that
battle, all of the shining pre-war plans were in tatters, many commanders
were disgraced or cashiered, and lessons learned through the tragedy
“by which God teaches the law to kings” (p. 275).
A century later, the lessons of the outbreak of World War I could not be more
relevant. On the eve of the war, many believed that the interconnection of
the soon-to-be belligerents through trade was such that war was unthinkable,
as it would quickly impoverish them. Today, the world is even more connected
and yet there are conflicts all around the margins, with alliances spanning the
globe. Unlike 1914, when the world was largely dominated by great powers, now
there are rogue states, non-state actors, movements dominated by religion,
and neo-barbarism and piracy loose upon the stage, and some of these may lay
their hands on weapons whose destructive power dwarfs that of 1914–1918.
This book, published more than fifty years ago, about a conflict a century
old, could not be more timely.