Friday, February 20, 2015
Reading List: A Force of Nature
- Reeves, Richard. A Force of Nature. New York: W. W. Norton, 2008.
In 1851, the
Crystal Palace Exhibition
opened in London. It was a showcase of the wonders of industry and culture of
the greatest empire the world had ever seen and attracted a multitude of
visitors. Unlike present-day “World's Fair” boondoggles, it
made money, and the profits were used to fund good works, including
endowing scholarships for talented students from the far reaches of the
Empire to study in Britain. In 1895, Ernest Rutherford, hailing from
a remote area of New Zealand and a recent graduate of Canterbury College in
Christchurch, won a scholarship to study at Cambridge. Upon learning of
the award in a field of his family's farm, he threw his shovel in the
air and exclaimed, “That's the last potato I'll ever dig.” It was.
When he arrived at Cambridge, he could hardly have been more out of place.
He and another scholarship winner were the first and only graduate students
admitted who were not Cambridge graduates. Cambridge, at the end of the
Victorian era, was a clubby, upper-class place, where even those pursuing
mathematics were steeped in the classics, hailed from tony public schools,
and spoke with refined accents. Rutherford, by contrast, was a rough-edged
colonial, bursting with energy and ambition. He spoke with a bizarre
accent (which he retained all his life) which blended the Scottish brogue
of his ancestors with the curious intonations of the antipodes. He
was anything but the ascetic intellectual so common at
Cambridge—he had been a fierce competitor at rugby, spoke
about three times as loud as was necessary (many years later, when the
eminent Rutherford was tapped to make a radio broadcast from
Cambridge, England to Cambridge, Massachusetts, one of his associates
asked, “Why use radio?”), and spoke vehemently on any and
all topics (again, long afterward, when a ceremonial portrait was
unveiled, his wife said she was surprised the artist had caught him with
his mouth shut).
But it quickly became apparent that this burly, loud New Zealander was
extraordinarily talented, and under the leadership of J. J. Thomson
he began original research in radio, but soon abandoned the field to
pursue atomic research, which Thomson had pioneered with his
discovery of the electron. In 1898, with Thomson's recommendation,
Rutherford accepted a professorship at McGill University in
Montreal. While North America was considered a scientific backwater in
the era, the generous salary would allow him to marry his fiancée,
whom he had left behind in New Zealand until he could find a position which
would support them.
At McGill, he and his collaborator Frederick Soddy studied the
radioactive decay of thorium, discovered that radioactive decay was
characterised by a unique half-life, and found that the radiation emitted
was composed of two distinct components, which he named alpha and beta
radiation. He later named the most penetrating product of nuclear
reactions gamma radiation.
Rutherford was the first to suggest, in 1902, that radioactivity resulted from
the transformation of one chemical element into another—something
previously thought impossible.
In 1907, Rutherford was offered, and accepted, a chair of physics at
the University of Manchester, where, with greater laboratory resources
than he had had in Canada, he pursued the nature of the products of
radioactive decay. By a clever experiment, he identified
alpha radiation (or alpha particles, as we now call them) with the
nuclei of helium atoms—nuclear decay was heavy atoms being
spontaneously transformed into a lighter element and a helium nucleus.
Based upon this work, Rutherford won the Nobel Prize in Chemistry
in 1908. For a person who considered himself first and foremost an
experimental physicist and who was famous for remarking, “All
science is either physics or stamp collecting”, winning the
Chemistry Nobel had to feel rather odd. He quipped that while he
had observed the transmutation of elements in his laboratory, no
transmutation was as startling as discovering he had become a
chemist. Still, physicist or chemist, his greatest work was yet to come.
In 1909, along with Hans Geiger
(later to invent the Geiger counter) and Ernest Marsden,
he conducted an experiment in which high-energy
alpha particles were directed against a very thin sheet of gold foil.
The expectation was that few would be deflected and those only slightly.
To the astonishment of the experimenters, some alpha particles were
found to be deflected through large angles, some bouncing directly back
toward the source. Rutherford later remarked, “It was almost as incredible
as if you fired a 15-inch [battleship] shell at a piece of tissue paper
and it came back and hit you.” It took two years before Rutherford
fully understood and published what was going on, and it forever changed
the concept of the atom. The only way to explain the scattering results
was to replace the early model of the atom with one in which a diffuse
cloud of negatively charged electrons surrounded a tiny, extraordinarily
dense, positively charged nucleus (that word was not used
until 1913). This experimental result fed directly into the development
of quantum theory and the elucidation of the force which bound the
particles in the nucleus together, which was not fully understood until
more than six decades later.
In 1919 Rutherford returned to Cambridge to become the head of the
Cavendish Laboratory, the most prestigious position in experimental
physics in the world. Continuing his research with alpha emitters, he
discovered that bombarding nitrogen gas with alpha particles would
transmute nitrogen into oxygen, liberating a proton (the nucleus of
hydrogen). Rutherford simultaneously was the first to deliberately
transmute one element into another, and also to discover the proton.
In 1921, he predicted the existence of the neutron, completing the
composition of the nucleus. The neutron was eventually discovered by
James Chadwick in 1932.
Rutherford's discoveries, all made with benchtop apparatus and a
small group of researchers, were the foundation of nuclear physics.
He not only discovered the nucleus, he also found or predicted its
constituents. He was the first to identify natural nuclear transmutation
and the first to produce it on demand in the laboratory. As a teacher
and laboratory director his legacy was enormous: eleven of his students
and research associates went on to win Nobel prizes. His students
John Cockcroft and Ernest Walton built the
first particle accelerator
and ushered in the era of “big science”. Rutherford not
only created the science of nuclear physics, he was the last person to
make major discoveries in the field by himself, alone or with a few
collaborators, and with simple apparatus made in his own laboratory.
In the heady years between the wars, there were, in the public mind,
two great men of physics: Einstein the theoretician and Rutherford
the experimenter. (This perception may have understated the contributions
of the creators of quantum mechanics, but they were many and less well
known to the public.)
Today, we still revere Einstein, but Rutherford is less remembered (except
in New Zealand, where everybody knows his name and achievements). And
yet there are few experimentalists who have discovered so much in
their lifetimes, with so little funding and the simplest apparatus.
Rutherford, that boisterous, loud, and restless colonial, figured out
much of what we now know about the atom, largely by himself, through a
multitude of tedious experiments which often failed, and he should
rightly be regarded as a pillar of 20th century physics.
This is the thousandth book to appear since I began to keep the
reading list in January 2001.
Wednesday, February 18, 2015
Reading List: Tools for Survival
- Rawles, James Wesley. Tools for Survival. New York: Plume, 2014.
Suppose one day the music stops. We all live, more or less, as
part of an intricately-connected web of human society. The water that
comes out of the faucet when we open the tap depends (for the vast majority
of people) on pumps powered by an electrical grid that spans a continent.
So does the removal of sewage when you flush the toilet. The typical
city in developed nations has only about three days' supply of food on hand
in stores and local warehouses and depends upon a transportation
infrastructure as well as computerised inventory and payment systems
to function. This system has been optimised over decades to be
extremely efficient, but at the same time it has become dangerously
fragile against any perturbation. A financial crisis which disrupts
just-in-time payments, a large-scale and protracted power outage due to
a solar flare or EMP attack, disruption of data networks by malicious
attacks, or social unrest can rapidly halt the flow of goods and services
upon which hundreds of millions of people depend, rarely giving a thought
to what life might be like if one day they weren't there.
The author, founder of the essential SurvivalBlog
site, has addressed such scenarios
in his fiction,
which is highly recommended. Here the focus is less speculative,
and entirely factual and practical. What are the essential skills and
tools one needs to survive in what amounts to a 19th century
homestead? If the grid (in all senses) goes down, those who
wish to survive the massive disruptions and chaos which will result
may find themselves in the position of those on the American frontier
in the 1870s: forced into self-reliance for all of the necessities
of life, and compelled to use the simple, often manual, tools which
their ancestors used—tools which can in many cases be fabricated
and repaired on the homestead.
The author does not assume a total collapse to the nineteenth century.
He envisions that those who have prepared to ride out a discontinuity
in civilisation will have equipped themselves with rudimentary
solar electric power and electronic communication systems. But at
the same time, people will be largely on their own when it comes to
gardening, farming, food preservation, harvesting trees for firewood
and lumber, first aid and dental care, self-defence,
metalworking, and a multitude of other tasks. As always, the
author stresses, it isn't the tools you have but rather the skills
between your ears that determine whether you'll survive. You may
have the most comprehensive medical kit imaginable, but if nobody
knows how to stop the bleeding from a minor injury, disinfect the
wound, and suture it, what today is a short trip to the emergency
room might be life-threatening.
Here is what I took away from this book. Certainly, you want to have
on hand what you need to deal with immediate threats (for example,
firefighting when the fire department does not respond, self-defence
when there is no sheriff, a supply of water and food so you don't become
a refugee if supplies are interrupted, and a knowledge of sanitation
so you don't succumb to disease when the toilet doesn't flush). If you have
skills in a particular area, for example, if you're a doctor, nurse, or
emergency medical technician, by all means lay in a supply of what you
need not just to help yourself and your family, but your neighbours.
The same goes if you're a welder, carpenter, plumber, shoemaker, or smith.
It just isn't reasonable, however, to expect any given family to acquire
all the skills and tools (even if they could afford them, where would they
put them?) to survive on their own. Far more important is to make the
acquaintance of like-minded people in the vicinity who have the diverse
set of skills required to survive together. The ability to build and maintain such
a community may be the most important survival skill of all.
This book contains a wealth of resources available on the Web (most
presented as shortened URLs, not directly linked in the Kindle edition)
and a great deal of wisdom with which I find little or nothing to
disagree. For the most part the author uses quaint units like inches,
pounds, and gallons, but he is writing for a mostly American audience. Please
take to heart the safety warnings: it is very easy to kill or gravely
injure yourself when woodworking, metal fabricating, welding,
doing electrical work, or felling trees and processing lumber. If
your goal is to survive and prosper whatever the future may bring,
it can ruin your whole plan if you kill yourself acquiring the
skills you need to do so.
Monday, February 9, 2015
Reading List: The Testament of James
- Suprynowicz, Vin. The Testament of James. Pahrump, NV: Mountain Media, 2014.
The author is a veteran newspaperman and was arguably the most libertarian
writer in the mainstream media during his long career with the
Las Vegas Review-Journal. He earlier turned his hand to
fiction with The Black Arrow (May 2005),
a delightful libertarian superhero fantasy. In the present volume he
tells an engaging tale which weaves together mystery, the origins of
Christianity, and the curious subculture of rare book collectors and dealers.
Matthew Hunter is the proprietor of a used book shop in Providence,
Rhode Island, dealing both in routine merchandise and in rare volumes
obtained from around the world and sold to a network of collectors who
trust Hunter's judgement and fair pricing. While Hunter is on a trip to Britain,
an employee of the store is found dead under suspicious circumstances,
while waiting after hours to receive a visitor from Egypt with a
manuscript to be evaluated and sold.
Before long, a series of curious, shady, and downright intimidating people
start arriving at the bookshop, all seeking to buy the manuscript which,
it appears, was never delivered. The person who was supposed to bring it
to the shop has vanished, and his brothers have come to try to find him.
Hunter and his friend Chantal Stevens, ex-military, who has agreed to help
out in the shop, find themselves in the middle of the quest for one of
the most legendary, and long considered mythical, rare books of all time, The
Testament of James, reputed to have been written by
James the Just,
the (half-)brother of Jesus Christ. (His precise relationship to Jesus is
a matter of dispute among Christian sects and scholars.) This Testament
(not to be confused with the
Epistle of James
in the New Testament, also sometimes attributed to James the Just), would
have been the most contemporaneous record of the life of Jesus, well
predating the Gospels.
Matthew and Chantal seek to find the book, rescue the seller, and get to the
bottom of a mystery dating from the origin of Christianity. Initially
dubious that such a book might exist, Matthew concludes that so many people
would not be trying so hard to lay their hands on it if there weren't
something to it.
A good part of the book is a charming and often humorous look inside the
world of rare books, one with which the author is clearly well-acquainted.
There is intrigue, a bit of mysticism, and the occasional libertarian
zinger aimed at a deserving target. As the story unfolds, an alternative
interpretation of the life and work of Jesus and the history of the early
Church emerges, which explains why so many players are so desperately
seeking the lost book.
As a mystery, this book works superbly. Its view of “bookmen”
(hunters, sellers, and collectors) is a delight. Orthodox Christians (by which I mean
those adhering to the main Christian denominations, not just those called
“Orthodox”) may find some of the content blasphemous, but before
they explode in red-faced sputtering, recall that one can never be sure about the
provenance and authenticity of any ancient manuscript. Some of the language
and situations are not suitable for young readers, but by the standards of
contemporary mass-market fiction, the book is pretty tame. There are essentially
no spelling or grammatical errors. To be clear, this is entirely a work of fiction:
there is no Testament of James apart from this book, in which
it's an invention of the author. A bibliography of works providing alternative
(which some will consider heretical) interpretations of the origins of
Christianity is provided. You can read an
excerpt from the novel
at the author's Web log; continue to follow the links in the excerpts to read
the first third—20,000 words—of the book for free.
Friday, January 30, 2015
Reading List: The Case of the Displaced Detective Omnibus
- Osborn, Stephanie. The Case of the Displaced Detective Omnibus. Kingsport, TN: Twilight Times Books, 2013.
This book, available only for the Kindle, collects the first four novels
of the author's Displaced Detective series. The individual
books included here are The Arrival, At Speed,
The Rendlesham Incident, and
Endings and Beginnings.
Each pair of books, in turn, comprises a single story, the first
two The Case of the Displaced Detective and the
latter two The Case of the Cosmological Killer. If you
read only the first of either pair, it will be obvious that the
story has been left in the middle with little resolved. In the trade
paperback edition, the four books total more than 1100 pages, so
this omnibus edition will keep you busy for a while.
Dr. Skye Chadwick is a hyperspatial physicist and chief scientist of
Project Tesseract. Research into the multiverse and brane world
solutions of string theory has revealed that our continuum—all of
the spacetime we inhabit—is just one of an unknown number adjacent
to one another in a higher dimensional membrane (“brane”), and
that while every continuum is different, those close to one another in
the hyperdimensional space tend to be similar. Project Tesseract,
a highly classified military project operating from an underground laboratory
in Colorado, is developing hardware based on advanced particle physics
which allows passively observing or even interacting with these other
continua (or parallel universes).
The researchers are amazed to discover that in some continua characters
which are fictional in our world actually exist, much as they were
described in literature. Perhaps Heinlein and Borges were right in
speculating that fiction exists in parallel universes, and maybe
that's where some of authors' ideas come from. In any case, exploration
of Continuum 114 has revealed it to be one of those in which Sherlock
Holmes is a living, breathing man. Chadwick and her team decide to
investigate one of the pivotal and enigmatic episodes in the Holmes
literature, the fight at Reichenbach Falls. As Holmes and Moriarty
battle, it is apparent that both will fall to their death. Chadwick
acts impulsively and pulls Holmes from the brink of the cliff, back
through the Tesseract, into our continuum. In an instant, Sherlock Holmes,
consulting detective of 1891 London, finds himself in twenty-first
century Colorado, where he previously existed only in the stories of
Arthur Conan Doyle.
Holmes finds much to adapt to in this often bewildering world, but then
he was always a shrewd observer and master of disguise, so few people
would be as well equipped. At the same time, the Tesseract project
faces a crisis, as a disaster and the subsequent investigation reveal
the possibility of sabotage and an espionage ring operating within
the project. A trusted, outside investigator with no ties to the
project is needed, and who better than Holmes, who owes his life to it?
With Chadwick at his side, they dig into the mystery surrounding the project.
As they work together, they find themselves increasingly attracted
to one another, and Holmes must confront his fear that emotional
involvement will impair the logical functioning of his mind upon
which his career is founded. Chadwick, becoming
a talented investigator in her own right, fears that a deeper than
professional involvement with Holmes will harm her own emerging career.
I found that this long story started out just fine, and indeed I recommended
it to several people after finishing the first of the four novels
collected here. To me, it began to run off the rails in the second
book and didn't get any better in the remaining two (which begin with
Holmes and Chadwick an established detective team, summoned to help with
a perplexing mystery in Britain which may have consequences for all
of the myriad continua in the multiverse). The fundamental problem is
that these books are trying to do too much all at the same time. They
can't decide whether they're science fiction, mystery, detective procedural,
or romance, and as they jump back and forth among the genres, so little
happens in the ones being neglected at the moment that the parallel
story lines develop at a glacial pace. My estimation is that an
editor with a sharp red pencil could cut this material by 50–60%
and end up with a better book, omitting nothing central to the story and
transforming what often seemed a tedious slog into a page-turner.
Sherlock Holmes is truly one of the great timeless characters in literature.
He can be dropped into any epoch, any location, and, in this case, anywhere
in the multiverse, and rapidly start to get to the bottom of the situation
while entertaining the reader looking over his shoulder. There is nothing
wrong with the premise of these books and there are interesting ideas and
characters in them, but the execution just isn't up to the potential of the concept.
The science fiction part sometimes sinks to the techno-babble level of
Star Trek (“Higgs boson injection beginning…”).
I am no prude, but I found the repeated and explicit sex scenes a bit
much (tedious, actually), and they make the books unsuitable for younger
readers for whom the original Sherlock Holmes stories are a pure delight.
If you're interested in the idea, I'd suggest buying just the first book
separately and see how you like it before deciding to proceed, bearing in mind
that I found it the best of the four.
Saturday, January 10, 2015
Reading List: Enlightening Symbols
- Mazur, Joseph. Enlightening Symbols. Princeton: Princeton University Press, 2014.
Sometimes an invention is so profound and significant, yet apparently
obvious in retrospect, that it is difficult to imagine how people
around the world struggled over millennia to discover it, and how
slowly it diffused from its points of origin into general use.
Such is the case for our modern decimal system of positional
notation for numbers and the notation for algebra and other
fields of mathematics which permits rapid calculation and
transformation of expressions. This book, written with the extensive source citations
of a scholarly work yet accessible to any reader familiar with
arithmetic and basic algebra, traces the often murky origins of
this essential part of our intellectual heritage.
From prehistoric times humans have had the need to count things,
for example, the number of sheep in a field. This could be
done by establishing a one-to-one correspondence between the
sheep and something else more portable such as
one's fingers (for a small flock), or pebbles kept in a sack.
To determine whether a sheep was missing, just remove a
pebble for each sheep; any that remain in the sack
indicate how many are absent. At a slightly more abstract
level, one could make tally marks on a piece of bark or clay
tablet, one for each sheep. But all of this does not imply
number as an abstraction independent of individual items of
some kind or another. Ancestral humans don't seem to have
required more than the simplest notion of numbers: until the
middle of the 20th century several tribes of Australian
aborigines had no words for numbers in their languages at all,
but counted things by making marks in the sand. Anthropologists
discovered tribes in remote areas of the Americas, Pacific
Islands, and Australia whose languages had no words for numbers
greater than four.
With the emergence of settled human populations and the
increasingly complex interactions of trade between villages
and eventually cities, a more sophisticated notion of numbers
was required. A merchant might need to compute how much
of one good to exchange for another and to keep records
of his inventory of various items. The earliest known
written records of numbers are Sumerian cuneiform clay
tablets dating from around 3400 B.C.
These tablets show number symbols formed from two distinct
kinds of marks pressed into wet clay with a stylus. While
the smaller numbers seem clearly evolved from tally marks,
larger numbers are formed by complicated combinations of the
two symbols representing numbers from 1 to 59. Larger numbers
were written as groups of powers of 60 separated by spaces.
This was the first known instance of a positional number system,
but there is no evidence it was used for complicated calculations—just
as a means of recording quantities.
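The Sumerian grouping by powers of 60 is easy to reproduce. A short Python sketch (my own illustration, not from the book):

```python
def to_base_60(n):
    """Split a non-negative integer into base-60 groups,
    most significant first, as on the cuneiform tablets."""
    if n == 0:
        return [0]
    groups = []
    while n > 0:
        groups.append(n % 60)
        n //= 60
    return groups[::-1]

# 3661 = 1*60**2 + 1*60 + 1
print(to_base_60(3661))  # [1, 1, 1]
```

Each group would then be written with the tablets' combinations of the two wedge marks.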
Ancient civilisations (Egyptian, Hebrew, Greek, Chinese, Roman, and the
Aztec and Maya in the Western Hemisphere) all invented
ways of writing numbers, some sophisticated and capable of
representing large quantities. Many of these systems were
additive: they used symbols, sometimes derived from
letters in their alphabets, and composed numbers by writing
symbols which summed to the total. To write the number 563,
a Greek would write φξγ, where φ=500, ξ=60, and
γ=3. By convention, numbers were
written with letters in descending order of the value they
represented, but the system was not positional. This made
the system clumsy for representing large numbers, requiring
letters with accent marks to represent thousands and an
entirely different convention for ten thousands.
How did such advanced civilisations get along using number systems
in which it is almost impossible to compute? Just imagine a
Roman faced with multiplying MDXLIX by XLVII
(1549 × 47)—where do you start?
You don't: all of these civilisations used some form of
mechanical computational aid: an abacus, counting rods, stones
in grooves, and so on to actually manipulate numbers. The
Sun Zi Suan Jing, dating from fifth-century China, provides
instructions (algorithms) for multiplication, division, and
square and cube root extraction using bamboo counting sticks
(or written symbols representing them). The result of the
computation was then written using the numerals of the language.
The written language was thus a way to represent numbers, but
not compute with them.
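The difficulty is easy to demonstrate: to multiply Roman numerals, the practical first step, then as now, is to convert them into a positional value, which is exactly the job the abacus performed mechanically. A Python sketch (mine, purely illustrative):

```python
ROMAN = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

def roman_to_int(s):
    """Read a Roman numeral with the subtractive convention:
    a smaller symbol before a larger one is subtracted (IX = 9)."""
    total = 0
    for i, ch in enumerate(s):
        value = ROMAN[ch]
        if i + 1 < len(s) and ROMAN[s[i + 1]] > value:
            total -= value
        else:
            total += value
    return total

# The multiplication our hypothetical Roman faced:
print(roman_to_int('MDXLIX') * roman_to_int('XLVII'))  # 1549 * 47 = 72803
```

Notice that the multiplication itself happens only after leaving the Roman notation entirely.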
Many of the various forms of numbers and especially computational
tools such as the abacus came ever-so-close to stumbling on the
place value system, but it was in India, probably before the
third century B.C., that a positional
decimal number system, including zero as a place holder, with
digit forms recognisably ancestral to those we use today,
emerged. This was a breakthrough in two regards. Now, by
memorising tables of addition, subtraction, multiplication,
and division and simple algorithms once learned by schoolchildren
before calculators supplanted that part of their brains, it was
possible to directly compute from written numbers. (Despite
this, the abacus remained in common use.) But, more profoundly,
this was a universal representation of whole numbers.
Earlier number systems (with the possible exception of that
invented by Archimedes in
The Sand Reckoner
[but never used practically]) either had a limit on the largest number
they could represent or required cumbersome and/or lengthy conventions
for large numbers. The Indian number system needed only ten symbols
to represent any non-negative number, and only the single
convention that each digit represents the count of the
power of ten corresponding to its position.
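That single positional convention is easy to state in code. A minimal Python sketch (my own illustration, not from the book):

```python
def digits_to_value(digits, base=10):
    """Interpret a list of digits, most significant first, under the
    positional convention: each digit counts the power of the base
    corresponding to its position."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

print(digits_to_value([5, 6, 3]))           # 563
print(digits_to_value([1, 1, 1], base=60))  # 3661, the Sumerian grouping
```

The same three lines of loop serve any base, which is precisely the universality the Indian system achieved.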
Knowledge diffused slowly in antiquity, and despite India being on
active trade routes, it was not until the 13th century that Fibonacci
(Leonardo of Pisa) introduced the new number system, which had been
transmitted via Islamic scholars writing in Arabic, to Europe in his
1202 book Liber Abaci. This book not only introduced the new number
system, it provided instructions for a variety of practical
computations and applications to higher mathematics. As revolutionary
as this book was, in an era of hand-copied manuscripts, its
influence spread very slowly, and it was not until the
16th century that the new numbers became almost universally used.
The author describes this protracted process, about which a great deal
of controversy remains to the present day.
Just as the decimal positional number system was becoming established
in Europe, another revolution in notation began which would
transform mathematics, how it was done, and our understanding of
the meaning of numbers. Algebra, as we now understand it, was known
in antiquity, but it was expressed in a rhetorical way—in words.
For example, proposition 4 of book 2 of Euclid's Elements states:
If a straight line be cut at random, the square of the whole
is equal to the squares on the segments and twice the
rectangle contained by the segments.
Now, given such a problem, Euclid or any of those following in
his tradition would draw a diagram and proceed to prove from
the axioms of plane geometry the correctness of the statement.
But it isn't obvious how to apply this identity to other
problems, or how it illustrates the behaviour of general
numbers. Today, calling the segments a and b, we'd write
(a + b)² = a² + 2ab + b² and proceed symbolically.
Once again, faced with the word problem, it's difficult to know where to begin,
but once expressed in symbolic form, it can be solved by applying rules of
algebra which many master before reaching high school. Indeed, the process of
simplifying such an equation is so mechanical that computer tools are readily
available to do so.
Or consider the following brain-twister posed in the 7th century
A.D. about the Greek mathematician Diophantus,
the father of algebra:
how many years did he live?
“Here lies Diophantus,” the wonder behold.
Through art algebraic, the stone tells how old;
“God gave him his boyhood one-sixth of his life,
One twelfth more as youth while whiskers grew rife;
And then one-seventh ere marriage begun;
In five years there came a bounding new son.
Alas, the dear child of master and sage
After attaining half the measure of his father's life
chill fate took him.
After consoling his fate by the science of numbers for
four years, he ended his life.”
Oh, go ahead, give it a try before reading on!
Today, we'd read through the problem and write a system of two
simultaneous equations, where x is the age of Diophantus
at his death and y the number of years his son lived:
x = x/6 + x/12 + x/7 + 5 + y + 4
y = x/2
Plug the second equation into the first, do a little algebraic symbol
twiddling, and the answer, 84, pops right out. Note that not only are
the rules for solving this equation the same as for any other, with a
little practice it is easy to read the word problem and write down the
equations ready to solve. Go back and re-read the original problem and
the equations and you'll see how straightforwardly they follow.
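The substitution can also be checked by brute force. A short Python sketch (the function name and search approach are mine, purely illustrative):

```python
from fractions import Fraction

def diophantus_age():
    """Search for the age x satisfying the epitaph's conditions:
    x = x/6 + x/12 + x/7 + 5 + y + 4, with the son's years y = x/2.
    Fractions keep the arithmetic exact."""
    for x in range(1, 200):
        y = Fraction(x, 2)
        if Fraction(x, 6) + Fraction(x, 12) + Fraction(x, 7) + 5 + y + 4 == x:
            return x
    return None

print(diophantus_age())  # 84
```

Only 84 satisfies the constraints, matching the algebraic solution.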
Once you have transformed a mass of words into symbols, they invite you
to discover new ways in which they apply. What is the solution of the
equation x+4=0? In antiquity many would have said the
equation is meaningless: there is no number you can add to four to
get zero. But that's because their conception of number was too
limited: negative numbers such as −4 are completely valid and
obey all the laws of algebra. By admitting them, we discovered
we'd overlooked half of the real numbers. What about the solution
to the equation x² + 4 = 0? This was again considered
ill-formed, or imaginary, since the square of any real number, positive
or negative, is positive. Another leap of imagination, admitting the
square root of minus one to the family of numbers, expanded the
number line into the complex plane,
yielding the answers ±2i as
we'd now express them, and extending our concept of number into one which
is now fundamental not only in abstract mathematics but also science and
engineering. And in recognising negative and complex numbers, we'd
come closer to unifying algebra and geometry by bringing rotation
into the family of numbers.
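These extensions are easy to verify numerically. A Python sketch using the standard cmath module (illustrative only):

```python
import cmath

# Both roots of x**2 + 4 = 0 lie off the real line:
for x in (2j, -2j):
    assert x**2 + 4 == 0

# cmath extends the square root to negative arguments:
print(cmath.sqrt(-4))  # 2j

# Multiplying by i rotates a number 90 degrees in the plane;
# two quarter-turns reverse direction:
assert 1j * 1j == -1
```

The last assertion is the algebraic face of the geometric fact that rotation had been brought into the family of numbers.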
This book explores the groping over centuries toward a symbolic
representation of mathematics which hid the specifics while revealing
the commonality underlying them. As one who learned mathematics
during the height of the “new math” craze, I can't recall
a time when I didn't think of mathematics as a game of symbolic
transformation of expressions which may or may not have any
connection with the real world. But what one discovers in reading
this book is that while this is a concept very easy to brainwash
into a 7th grader, it was extraordinarily difficult for even some
of the most brilliant humans ever to have lived to grasp in the
first place. When Newton invented calculus, for example, he always
expressed his “fluxions” as derivatives with respect to time, and
did not write of the general derivative of a function of arbitrary variables.
Also, notation is important. Writing something in a more expressive
and easily manipulated way can reveal new insights about it. We benefit
not just from the discoveries of those in the past, but from those who
created the symbolic language in which we now express them.
This book is a treasure chest of information about how the language of
science came to be. We encounter a host of characters along the way,
not just great mathematicians and scientists, but scoundrels, master
forgers, chauvinists, those who preserved precious manuscripts and those
who burned them, all leading to the symbolic language
in which we so effortlessly write and do mathematics today.
Friday, January 2, 2015
Reading List: The Strangest Man
- Farmelo, Graham.
The Strangest Man.
New York: Basic Books, 2009.
Paul Adrien Maurice Dirac was born in 1902 in Bristol, England. His father,
Charles, was a Swiss-French immigrant who made his living as a French teacher at a
local school and as a private tutor in French. His mother, Florence (Flo), had
given up her job as a librarian upon marrying Charles. The young Paul and his
older brother Felix found themselves growing up in a very unusual, verging upon
bizarre, home environment. Their father was as strict a disciplinarian at home
as in the schoolroom, and spoke only French to his children, requiring them to
answer in that language and abruptly correcting them if they committed any
faute de français. Flo spoke to the
children only in English, and since the Diracs rarely received visitors at home,
before going to school Paul got the idea that men and women spoke different
languages. At dinner time Charles and Paul would eat in the dining room,
speaking French exclusively (with any error swiftly chastised) while Flo,
Felix, and younger daughter Betty ate in the kitchen, speaking English.
Paul quickly learned that the less he said, the fewer opportunities for error
and humiliation, and he traced his famous reputation for taciturnity to his childhood experiences.
(It should be noted that the only account we have of Dirac's childhood
experience comes from himself, much later in life. He made no attempt to
conceal the extent he despised his father [who was respected by his
colleagues and acquaintances in Bristol], and there is no way to know
whether Paul exaggerated or embroidered upon the circumstances of his upbringing.)
After a primary education in which he was regarded as a sound but
not exceptional pupil, Paul followed his brother Felix into the
Merchant Venturers' School, a Bristol technical school ranked
among the finest in the country. There he quickly distinguished
himself, ranking near the top in most subjects. The instruction
was intensely practical, eschewing Latin, Greek, and music in favour
of mathematics, science, geometric and mechanical drawing, and
practical skills such as operating machine tools. Dirac learned
physics and mathematics with the engineer's eye to “getting the
answer out” as opposed to finding the most elegant solution
to the problem. He then pursued his engineering studies at
Bristol University, where he excelled in mathematics but struggled in other subjects.
Dirac graduated with a first-class honours degree in engineering, only
to find the British economy in a terrible post-war depression, the
worst economic downturn since the start of the Industrial Revolution.
Unable to find employment as an engineer, he returned to Bristol University
to do a second degree in mathematics, where it was arranged he could skip
the first year of the program and pay no tuition fees. Dirac quickly
established himself as the star of the mathematics programme, and also
attended lectures about the enigmatic quantum theory.
His father had been working in the background to secure a position at
Cambridge for Paul, and after cobbling together scholarships and a
gift from his father, Dirac arrived at the university in October 1923
to pursue a doctorate in theoretical physics. Dirac would already have seemed
strange to his fellow students. While most were scions of the upper
class, classically trained, with plummy accents, Dirac knew no Latin or
Greek, spoke with a Bristol accent, and approached problems as an
engineer or mathematician, not a physicist. He had hoped to study
Einstein's general relativity, the discovery of which had first interested
him in theoretical physics, but his supervisor was interested in
quantum mechanics and directed his work into that field.
It was an auspicious time for a talented researcher to undertake
work in quantum theory. The “old quantum theory”,
elaborated in the early years of the 20th century, had explained
puzzles like the distribution of energy in heat radiation and the
photoelectric effect, but by the 1920s it was clear that nature
was much more subtle. For example, the original quantum theory failed
to explain even the spectral lines of hydrogen, the simplest atom.
Dirac began working on modest questions related to quantum theory, but
his life was changed when he read
Heisenberg's 1925 paper which is now
considered one of the pillars of the new quantum mechanics. After
initially dismissing the paper as overly complicated and artificial,
he came to believe that it pointed the way forward, dismissing Bohr's
concept of atoms like little solar systems in favour of a probability
density function which gives the probability an electron will be observed
in a given position. This represented not just a change in the model
of the atom but the discarding entirely of models in favour of a
mathematical formulation which permitted calculating what could be
observed without providing any mechanism whatsoever explaining how it worked.
After reading and fully appreciating the significance of Heisenberg's work,
Dirac embarked on one of the most productive bursts of discovery in
the history of modern physics. Between 1925 and 1933 he published one
foundational paper after another. His Ph.D. in 1926, the first granted
by Cambridge for work in quantum mechanics, linked Heisenberg's theory to
the classical mechanics he had learned as an engineer and provided a framework
which made Heisenberg's work more accessible. Scholarly writing did not
come easily to Dirac, but he mastered the art to such an extent that his
papers are still read today as examples of pellucid exposition. At
a time when many contributions to quantum mechanics were rough-edged
and difficult to understand even by specialists, Dirac's papers were, in
the words of Freeman Dyson, “like exquisitely carved marble statues
falling out of the sky, one after another.”
In 1928, Dirac took the first step to unify quantum mechanics and special
relativity in the Dirac equation.
The consequences of this equation led Dirac to predict the existence
of a positively-charged electron, which had never been observed. This
was the first time a theoretical physicist had predicted the existence of a
new particle. This particle, the positron,
was observed in debris from
cosmic ray collisions in 1932. The Dirac equation also interpreted the
spin (angular momentum) of particles as a relativistic phenomenon.
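The equation itself is strikingly compact. In modern covariant notation (not the form in which Dirac originally wrote it), with natural units ℏ = c = 1:

```latex
i\gamma^\mu \partial_\mu \psi - m\psi = 0
```

Here ψ is the electron's four-component wave function and the γ^μ are 4×4 matrices. It was the equation's negative-energy solutions which forced the prediction of the positively-charged anti-electron.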
Dirac, along with Enrico Fermi, elaborated the statistics of particles
with half-integral spin (now called Fermi–Dirac statistics); the behaviour
of ensembles of one such particle, the electron, is essential to the devices
you use to read this article. He took the first steps toward a relativistic
theory of light and matter and coined the name, quantum electrodynamics,
for the field, but never found a theory sufficiently simple and beautiful
to satisfy himself. He published
The Principles of Quantum Mechanics
in 1930, for many years the standard textbook on the subject and still read
today. He worked out the theory of magnetic monopoles (not detected to
this date) and speculated on the origin and possible links between the
large numbers in physics and cosmology.
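The Fermi–Dirac statistics mentioned above reduce, in practice, to a simple occupancy formula for a single-particle state. A minimal sketch in Python (the function name and sample numbers are my own illustration, not from the book):

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupancy of a single-particle state of energy E for fermions
    (particles of half-integral spin, such as electrons) at chemical
    potential mu and temperature kT, all in the same energy units."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

# Occupancy never exceeds 1 (the Pauli exclusion principle), which is
# why electrons fill energy bands and semiconductor devices work at all.
print(fermi_dirac(0.5, 0.5, 0.025))   # state at the chemical potential: 0.5
```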
The significance of Dirac's work was recognised at the time. He was elected
a Fellow of the
Royal Society in 1930,
became the Lucasian Professor of
Mathematics (Newton's chair) at Cambridge in 1932, and shared the Nobel
Prize in Physics for 1933 with Erwin Schrödinger. After rejecting
a knighthood because he disliked being addressed by his first name, he was
awarded the Order of Merit in 1973. He is commemorated by a plaque in
Westminster Abbey, close to that of Newton; the plaque bears his name and
the Dirac equation, the only equation so honoured.
Many physicists consider Dirac the second greatest theoretical physicist of the
20th century, after Einstein. While Einstein produced great leaps of intellectual
achievement in fields neglected by others, Dirac, working alone, contributed
to the grand edifice of quantum mechanics, which occupied many of the
most talented theorists of a generation. You have to dig a bit deeper into the
history of quantum mechanics to fully appreciate Dirac's achievement, which
probably accounts for his name not being as well known as it deserves.
There is much more to Dirac, all described in this extensively-documented scientific
biography. While declining to join the British atomic weapons project during
World War II because he refused to work as part of a collaboration, he spent
much of the war doing consulting work for the project on his own, including
inventing a new technique for isotope separation. (Dirac's process proved less
efficient than those eventually chosen by the Manhattan Project and was not
used.) As an extreme introvert, nobody expected him to ever marry, and he
astonished even his closest associates when he married the sister of his
fellow physicist Eugene Wigner, Manci, a Hungarian divorcée with two
children by her first husband. Manci was as extroverted as Dirac was reserved,
and their marriage in 1937 lasted until Dirac's death in 1984. They had two
daughters together, and lived a remarkably normal family life. Dirac, who
disdained philosophy in his early years, became intensely interested in the
philosophy of science later in life, even arguing that mathematical beauty,
not experimental results, could best guide theorists to the best expression
of the laws of nature.
Paul Dirac was a very complicated man, and this is a complicated and occasionally
self-contradictory biography (but the contradiction is in the subject's life,
not the fault of the biographer). This book provides a glimpse of a unique
intellect whom even many of his closest associates never really felt they knew.
Wednesday, December 31, 2014
Books of the year: 2014
Here are my picks for the best books of 2014, fiction and nonfiction.
These aren't the best books published this year, but rather the best
I've read in the last twelvemonth. The winner in both categories is
barely distinguished from
the pack, and the runners-up are all worthy of reading. Runners-up appear
in alphabetical order by their author's surname.
Reading List: How Ronald Reagan Changed My Life
- Robinson, Peter.
How Ronald Reagan Changed My Life.
New York: Harper Perennial, 2003.
In 1982, the author, a recent graduate of Dartmouth College who had spent
two years studying at Oxford, then remained in England to write a novel,
re-assessed his career prospects and concluded that, based upon experience,
novelist did not rank high among them. He sent letters to everybody he
thought might provide him leads on job opportunities. Only William F.
Buckley replied, suggesting that Robinson contact his son, Christopher,
then chief speechwriter for Vice President George H. W. Bush, who might
know of some openings for speechwriters. Hoping at most for a few pointers,
the author flew to Washington to meet Buckley, who was planning to leave
the White House, creating a vacancy in the Vice President's speechwriting
shop. After a whirlwind of interviews, Robinson found himself, in his
mid-twenties, having never written a speech before in his life, at work
in the Old Executive Office Building, tasked with putting words into the
mouth of the Vice President of the United States.
After a year and a half writing for Bush, two of the President's speechwriters
quit at the same time. Forced to find replacements on short notice, the
head of the office recruited the author to write for Reagan: “He hired
me because I was already in the building.” From then through 1988,
he wrote speeches for Reagan, some momentous (Reagan's June 1987 speech
at the Brandenburg gate, where Robinson's phrase, “Mr. Gorbachev,
tear down this wall!”, uttered by Reagan against vehement objections
from the State Department and some of his senior advisers, was a pivotal
moment in the ending of the Cold War), but also many more for less
epochal events such as visits of Boy Scouts to the White House, ceremonies
honouring athletes, and the dozens of other circumstances where the President
was called upon to “say a few words”. And because the media were
quick to pounce on any misstatement by the President, even the most
routine remarks had to be meticulously fact-checked by a team of researchers.
For every grand turn of phrase in a high profile speech, there were many
moments spent staring at the blank screen of a word processor as the deadline
for some inconsequential event loomed ever closer and
wondering, “How am I supposed to get twenty minutes out of that?”
But this is not just a book about the life of a White House speechwriter
(although there is plenty of insight to be had on that topic). Its
goal is to collect and transmit the wisdom that a young man, in his first
job, learned by observing Ronald Reagan masterfully doing the job to which
he had aspired since entering politics in the 1960s. Reagan was such a
straightforward and unaffected person that many underestimated him. For
example, compared to the hard-driving types toiling from dawn to dusk who
populate many White House positions, Reagan never seemed to work very hard.
He would rise at his accustomed hour, work for five to eight hours at his
presidential duties, exercise, have dinner, review papers, and get to bed on time. Some
interpreted this as his being lazy, but Robinson's fellow speechwriter, Clark
Judge, remarked “He never confuses inputs with output. …
Who cares how many hours a day a President puts in? It's what a
President accomplishes that matters.”
There are lessons aplenty here, all illustrated with anecdotes from the
Reagan White House: the distinction between luck and the results from persistence
in the face of adversity seen in retrospect; the unreasonable effectiveness and
inherent dignity of doing one's job, whatever it be, well; viewing life not
as background scenery but rather an arena in which one can act,
changing not just the outcome but the circumstances one encounters; the power
of words, especially those sincerely believed and founded in comprehensible,
time-proven concepts; scepticism toward the pronouncements of “experts”
whose oracle-like proclamations make sense only to other experts—if it
doesn't make sense to an intelligent person with some grounding in the basics,
it probably doesn't make sense period; the importance of marriage, and how the
Reagans complemented one another in facing the challenges and stress of the
office; the centrality of faith, tempered by a belief in free will and the
importance of the individual; how both true believers and pragmatists, despite
how often they despise one another, are both essential to actually getting things
done; and that what ultimately matters is what you make of whatever
situation in which you find yourself.
These are all profound lessons to take on board, especially in the
drinking-from-a-firehose environment of the Executive Office of the President, and in one's
twenties. But this is not a dour self-help book: it is an insightful, beautifully
written, and often laugh-out-loud funny account of how these insights were
gleaned on the job, by observing Reagan at work and how he and his administration
got things done, often against fierce political and media opposition. This is one of those
books that I wish I could travel back in time and hand a copy to my twenty-year-old
self—it would have saved a great deal of time and anguish, even for a
person like me who has no interest whatsoever in politics. Fundamentally, it's
about getting things done, and that's universally applicable.
People matter. Individuals matter. Long before Ronald Reagan was a
radio broadcaster, actor, or politician, he worked summers as a lifeguard.
Between 1927 and 1932, he personally saved 77 people from drowning. “There
were seventy-seven people walking around northern Illinois who wouldn't have been there
if it hadn't been for Reagan—and Reagan knew it.” It is not just a
few exceptional people who change the world for the better, but all of those
who do their jobs and overcome the challenges with which life presents them.
Learning this can change anybody's life.
Mr. Robinson is now the host of Uncommon Knowledge and co-founder of Ricochet.com.
Saturday, December 27, 2014
Tom Swift and His Submarine Boat updated, EPUB added
All 25 of the public domain Tom Swift novels have been posted in the Tom Swift and His Pocket Library
collection. I am now returning to the earlier novels, upgrading them to use the more modern typography of those I've done in recent years. The fourth novel in the series, Tom Swift and His Submarine Boat
, has now been updated. Several typographical errors in the original edition have been corrected, and Unicode text entities are used for special characters such as single and double quotes and dashes.
An EPUB edition of this novel is now available which may be downloaded to compatible reader devices; the details of how to do this differ from device to device—please consult the documentation for your reader for details.
It's delightful to read a book which uses the word “filibuster” in its original sense: “to take part in a private military action in a foreign country”, but somewhat disconcerting to encounter Brazilians speaking Spanish! The diving suits which allow full mobility on the abyssal plain two miles beneath the ocean surface remain as science-fictional as when this novel was written almost a century ago.
Wednesday, December 24, 2014
Reading List: Hidden Order
- Thor, Brad.
Hidden Order.
New York: Pocket Books, 2013.
This is the thirteenth in the author's
Scot Harvath series, which began with
The Lions of Lucerne (October 2010).
Earlier novels have largely been in the mainstream of the “techno-thriller”
genre, featuring missions in exotic locations confronting shadowy adversaries
bent on inflicting great harm. The present book is a departure from this
formula, being largely set in the United States and involving institutions
considered pillars of the establishment such as the Federal Reserve
System and the Central Intelligence Agency.
A CIA operative “accidentally” runs into a senior intelligence
official of the Jordanian government in an airport lounge in Europe,
who passes her disturbing evidence that members of a now-disbanded CIA
team of which she was a member were involved in destabilising
governments now gripped with “Arab Spring” uprisings and
next may be setting their sights on Jordan.
Meanwhile, Scot Harvath, just returned from a harrowing mission on the high
seas, is taken by his employer, Reed Carlton, to discreetly meet a new client:
the Federal Reserve. The Carlton Group is struggling to recover from the devastating
blow it took in the previous novel,
Black List (August 2014), and its boss is
willing to take on unconventional missions and new clients, especially ones
“with a license to print their own money”. The chairman of the
Federal Reserve has recently and unexpectedly died and the five principal
candidates to replace him have all been kidnapped, almost simultaneously,
across the United States. These people start turning up dead, in
circumstances with symbolism dating back to the American revolution.
Investigation of the Jordanian allegations is shut down by the CIA hierarchy,
and has to be pursued through back channels, involving retired people who
know how the CIA really works. Evidence emerges of a black program that
created weapons of frightful potential which may have gone even blacker and
deeper under cover after being officially shut down.
Earlier Brad Thor novels were more along the “U-S-A! U-S-A!”
line of most thrillers. Here, the author looks below the surface of
highly dubious institutions (“The Federal Reserve is about as federal
as Federal Express”) and evil that flourishes in the dark,
especially when irrigated with abundant and unaccountable funds. Like
many Americans, Scot Harvath knew little about the Federal Reserve other
than that it had something to do with money. Over the course of his investigations
he, and the reader, will learn many disturbing things about its dodgy history
and operations, all accurate as best I can determine.
The novel is as much police procedural as thriller, with Harvath teamed with
a no-nonsense Boston Police Department detective, processing crime scenes
and running down evidence. The story is set in an unspecified near future
(the Aerion Supersonic Business
Jet is in operation). All is eventually revealed in the end, with a
resolution in the final chapter devoutly to be wished, albeit highly unlikely
to occur in the cesspool of corruption which is real-world Washington. There
is less action and fancy gear than in most Harvath novels, but interesting
characters, an intricate mystery, and a good deal of information of which
many readers may not be aware.
A short prelude to this novel
is available for free for the Kindle. It provides the background of
the mission in progress in which we first encounter Scot Harvath in
chapter 2 here. My guess is that this chapter was originally part
of the manuscript and was cut for reasons of length and because it
spent too much time on a matter peripheral to the main plot. It's
interesting to read before you pick up Hidden Order, but
if you skip it you'll miss nothing in the main story.