Thursday, January 19, 2017

Probability Pipe Organ Updated to HTML5

Twenty years ago I posted The Probability Pipe Organ as part of the “Introduction to Probability and Statistics” documenting the RetroPsychoKinesis Project (RPKP) experiments online. The pipe organ illustrates how the results of a series of experiments involving random values approach the binomial distribution as the number of experiments increases.
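
The idea is easy to demonstrate in a few lines of code. Here is a minimal sketch in modern Swift (purely illustrative; the actual page is implemented in JavaScript on an HTML5 canvas, and the parameters below are arbitrary) which tallies the number of one bits in each of many runs of random bits and compares the histogram to the binomial distribution:

    import Foundation

    let n = 32          // random bits per experiment
    let runs = 100_000  // number of experiments

    // Count the number of one bits in each experiment.
    var histogram = [Int](repeating: 0, count: n + 1)
    for _ in 0..<runs {
        var ones = 0
        for _ in 0..<n where Bool.random() { ones += 1 }
        histogram[ones] += 1
    }

    // Binomial probability C(n, k) / 2^n, computed in log space.
    func binomial(_ n: Int, _ k: Int) -> Double {
        var logC = 0.0
        for i in 0..<k { logC += log(Double(n - i)) - log(Double(i + 1)) }
        return exp(logC - Double(n) * log(2.0))
    }

    // As runs increases, the observed fractions converge on the expected ones.
    for k in 0...n {
        print(String(format: "%2d  observed %.5f  expected %.5f",
                     k, Double(histogram[k]) / Double(runs), binomial(n, k)))
    }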

This page was originally implemented as a Java applet. When the Java language was launched, the accompanying hype claimed “Write once. Run everywhere.” After experience with several implementations of Java on different platforms, I added “Yeah, right.” to the slogan. Still, at the time, Java was the only practical way to include complex interaction and simulation in Web pages, so I used it for the pipe organ and online RPKP experiments.

In subsequent years, Java suffered severe bloat of the language and standard libraries, creating a security perimeter so large that Java embedded within a Web browser presented substantial security risks. So much so that many modern browsers disable Java by default and, if they support it at all, require the user to manually install it and keep it up to date. This poses a barrier to what were intended to be easily-accessible Web resources.

While there is much to dislike about HTML5, its canvas element and multimedia support, along with JavaScript, allow interaction and animation within standards-compliant documents without third-party browser add-ons. I am in the process of replacing all Java applets on the Fourmilab site with HTML5; the probability pipe organ is the pathfinder for this project.

The HTML5 version is now the default. Users with older browsers, or those with Java installed who wish to compare, can still access the original Java implementation.

Posted at 12:37 Permalink

Monday, January 16, 2017

Reading List: The Kingdom of Speech

Wolfe, Tom. The Kingdom of Speech. New York: Little, Brown, 2016. ISBN 978-0-316-40462-4.
In this short (192 page) book, Tom Wolfe returns to his roots in the “new journalism”, of which he was a pioneer in the 1960s. Here the topic is the theory of evolution; the challenge posed to it by human speech (because no obvious precursor to speech occurs in other animals); attempts, from Darwin to Noam Chomsky, to explain this apparent discrepancy and preserve the status of evolution as a “theory of everything”; and the evidence collected by linguist and anthropologist Daniel Everett among the Pirahã people of the Amazon basin in Brazil, which appears to falsify Chomsky's lifetime of work on the origin of human language and the universality of its structure. A second theme contrasts theorists and intellectuals such as Darwin and Chomsky with “flycatchers” such as Alfred Russel Wallace, Darwin's rival for priority in publishing the theory of evolution, and Daniel Everett, who work in the field—often in remote, unpleasant, and dangerous conditions—to collect the data upon which the grand thinkers erect their castles of hypothesis.

Doubtless fearful of the reaction if he suggested the theory of evolution applied to the origin of humans, in his 1859 book On the Origin of Species, Darwin only tiptoed close to the question two pages from the end, writing, “In the distant future, I see open fields for far more important researches. Psychology will be securely based on a new foundation, that of the necessary acquirement of each mental power and capacity by gradation. Light will be thrown on the origin of man and his history.” He needn't have been so cautious: he fooled nobody. The very first review, five days before publication, asked, “If a monkey has become a man—…?”, and the tempest was soon at full force.

Darwin's critics, among them Max Müller, German-born professor of languages at Oxford, and Darwin's rival Alfred Wallace, seized upon human characteristics which had no obvious precursors in the animals from which man was supposed to have descended: a hairless body, the capacity for abstract thought, and, Müller's emphasis, speech. As Müller said, “Language is our Rubicon, and no brute will dare cross it.” How could Darwin's theory, which claimed to describe evolution from existing characteristics in ancestor species, explain completely novel properties which animals lacked?

Darwin responded with his 1871 The Descent of Man, and Selection in Relation to Sex, which explicitly argued that there were precursors to these supposedly novel human characteristics among animals, and that, for example, human speech was foreshadowed by the mating songs of birds. Sexual selection was suggested as the mechanism by which humans lost their hair, and the roots of a number of human emotions and even religious devotion could be found in the behaviour of dogs. Many found these arguments, presented without any concrete evidence, unpersuasive. The question of the origin of language had become so controversial and toxic that a year later, the Philological Society of London announced it would no longer accept papers on the subject.

With the rediscovery of Gregor Mendel's work on genetics and subsequent research in the field, a mechanism which could explain Darwin's evolution was in hand, and the theory became widely accepted, with the few discrepancies set aside (as the Philological Society had done with language) as things we weren't yet ready to figure out.

In the years after World War II, the social sciences became afflicted by a case of “physics envy”. The contribution to the war effort by their colleagues in the hard sciences in areas such as radar, atomic energy, and aeronautics had been handsomely rewarded with prestige and funding, while the more squishy sciences remained in a prewar languor along with the departments of Latin, Medieval History, and Drama. Clearly, what was needed was for these fields to adopt a theoretical approach grounded in mathematics, which had served so well for chemists, physicists, and engineers, and appeared to be working for the new breed of economists.

It was into this environment that, in the late 1950s, a young linguist named Noam Chomsky burst onto the scene. Over its century and a half of history, much of the work of linguistics had been cataloguing and studying the thousands of languages spoken by people around the world, much as entomologists and botanists (or, in the pejorative term of Darwin's age, flycatchers) travelled to distant lands to discover the diversity of nature and try to make sense of how it was all interrelated. In his 1957 book, Syntactic Structures, Chomsky, then just twenty-eight years old and working in the building at MIT where radar had been developed during the war, said all of this tedious and messy field work was unnecessary. Humans had evolved (note, “evolved”) a “language organ”, an actual physical structure within the brain—the “language acquisition device”—which children used to learn and speak the language they heard from their parents. All human languages shared a “universal grammar”, on top of which all the details of specific languages so carefully catalogued in the field were just fluff, like the specific shape and colour of butterflies' wings. Chomsky invented the “Martian linguist”, who was to become a fixture of his lectures: a visitor who, he claimed, would upon arriving on Earth quickly discover the unity underlying all human languages. No longer need the linguist leave his air-conditioned office. As Wolfe writes in chapter 4, “Now, all the new, Higher Things in a linguist's life were to be found indoors, at a desk…looking at learned journals filled with cramped type instead of at a bunch of hambone faces in a cloud of gnats.”

Given the alternatives, most linguists opted for the office, and for the prestige that a theory-based approach to their field conferred, and by the 1960s, Chomsky's views had taken over linguistics, with only a few dissenters, at whom Chomsky hurled thunderbolts from his perch on academic Olympus. He transmuted into a general-purpose intellectual, pronouncing on politics, economics, philosophy, history, and whatever occupied his fancy, all with the confidence and certainty he brought to linguistics. Those who dissented he denounced as “frauds”, “liars”, or “charlatans”, including B. F. Skinner, Alan Dershowitz, Jacques Lacan, Elie Wiesel, Christopher Hitchens, and Jacques Derrida. (Well, maybe I agree when it comes to Derrida and Lacan.) In 2002, with two colleagues, he published a new theory claiming that recursion—embedding one thought within another—was a universal property of human language and component of the universal grammar hard-wired into the brain.

Since 1977, Daniel Everett had been living with and studying the Pirahã in Brazil, originally as a missionary and later as an academic linguist trained and working in the Chomsky tradition. He was the first person to successfully learn the Pirahã language, and documented it in publications. In 2005 he published a paper in which he concluded that the language, one of the simplest ever described, contained no recursion whatsoever. It also lacked past and future tenses, words for relations beyond parents and siblings, gender, numbers, and many other features found in other languages. But it was the absence of recursion which falsified Chomsky's theory, which pronounced recursion a fundamental part of all human languages. Here was a field worker, a flycatcher, braving not only gnats but anacondas, caimans, and just about every tropical disease in the catalogue, knocking the foundation from beneath the great man's fairy castle of theory. Naturally, Chomsky and his acolytes responded with their customary vituperation (this time, the epithet of choice for Everett was “charlatan”). Just as they were preparing the academic paper which would drive a stake through this nonsense, Everett published Don't Sleep, There Are Snakes, a combined account of his thirty years with the Pirahã and an analysis of their language. The book became a popular hit and won numerous awards. In 2012, Everett followed up with Language: The Cultural Tool, which rejects Chomsky's view of language as an innate and universal human property in favour of the view that it is one among a multitude of artifacts created by human societies as a tool, and necessarily reflects the characteristics of those societies. Chomsky now refuses to discuss Everett's work.

In the conclusion, Wolfe comes down on the side of Everett, and argues that the solution to the mystery of how speech evolved is that it didn't evolve at all. Speech is simply a tool which humans used their big brains to invent to help them accomplish their goals, just as they invented bows and arrows, canoes, and microprocessors. It doesn't make any more sense to ask how evolution produced speech than to ask how it produced any of those other artifacts not made by animals. He further suggests that the invention of speech proceeded from the initial use of sounds as mnemonics for objects and concepts, then progressed to more complex grammatical structure, but I found little evidence in his argument to back the supposition, nor is this a necessary part of viewing speech as an invented artifact. Chomsky's grand theory, like most theories made up without grounding in empirical evidence, is failing both by being falsified on its fundamentals by the work of Everett and others, and by the failure, despite half a century of progress in neurophysiology, to identify the “language organ” upon which it is based.

It's somewhat amusing to see soft science academics rush to Chomsky's defence, when he's arguing that language is biologically determined as opposed to being, as Everett contends, a social construct whose details depend upon the cultural context which created it. A hunter-gatherer society such as the Pirahã, living in an environment where food is abundant and little changes on time scales from days to generations, doesn't need a language as complicated as that of an agricultural society with division of labour, and it shouldn't be a surprise to find their language is more rudimentary. Chomsky assumed that all human languages were universal (able to express any concept), in the sense David Deutsch defined universality in The Beginning of Infinity, but why should every people have a universal language when some cultures get along just fine without universal number systems or alphabets? Doesn't it make a lot more sense to conclude that people settle on a language which, like any other tool, gets the job done? Wolfe then argues that the capacity for speech is the defining characteristic of human beings, and enables all of the other human capabilities and accomplishments which animals lack. I'd consider this not proved. Why isn't the definitive human characteristic the ability to make tools, and language simply one among a multitude of tools humans have invented?

This book strikes me as one or two interesting blog posts struggling to escape from a snarknado of Wolfe's 1960s-style verbal fireworks, including Bango!, riiippp, OOOF!, and “a regular crotch crusher!”. At age 85, he's still got it, but I wonder whether he, or his editor, questioned whether this style of journalism is as effective when discussing evolutionary biology and linguistics as in mocking sixties radicals, hippies, or pretentious artists and architects. There is some odd typography, as well. Grave accents are used in words like “learnèd”, presumably to indicate it's to be pronounced as two syllables, but then occasionally we get an acute accent instead—what's that supposed to mean? Chapter endnotes are given as superscript letters while source citations are superscript numbers, neither of which is easy to select on a touch-screen Kindle edition. There is no index.

Posted at 01:48 Permalink

Friday, January 13, 2017

Reading List: Planck

Brown, Brandon R. Planck. Oxford: Oxford University Press, 2015. ISBN 978-0-19-021947-5.
Theoretical physics is usually a young person's game. Many of the greatest breakthroughs have been made by researchers in their twenties, just having mastered existing theories while remaining intellectually flexible and open to new ideas. Max Planck, born in 1858, was an exception to this rule. He spent most of his twenties living with his parents and despairing of finding a paid position in academia. He was thirty-six when he took on the project of understanding heat radiation, and forty-two when he explained it in terms which would launch the quantum revolution in physics. He was in his fifties when he discovered the zero-point energy of the vacuum, and remained engaged and active in science until shortly before his death in 1947 at the age of 89. As theoretical physics editor for the then most prestigious physics journal in the world, Annalen der Physik, in 1905 he approved publication of Einstein's special theory of relativity, embraced the new ideas from a young outsider with neither a Ph.D. nor an academic position, extended the theory in his own work in subsequent years, and was instrumental in persuading Einstein to come to Berlin, where he became a close friend.

Sometimes the simplest puzzles lead to the most profound of insights. At the end of the nineteenth century, the radiation emitted by heated bodies was such a conundrum. All objects emit electromagnetic radiation due to the thermal motion of their molecules. If an object is sufficiently hot, such as the filament of an incandescent lamp or the surface of the Sun, some of the radiation will fall into the visible range and be perceived as light. Cooler objects emit in the infrared or lower frequency bands and can be detected by instruments sensitive to them. The radiation emitted by a hot object has a characteristic spectrum (the distribution of energy by frequency), with a peak which depends only upon the temperature of the body. One of the simplest cases is that of a black body, an ideal object which perfectly absorbs all incident radiation. Consider an ideal closed oven which loses no heat to the outside. When heated to a given temperature, its walls will absorb and re-emit radiation, with the spectrum depending upon its temperature. But the equipartition theorem, a cornerstone of statistical mechanics, predicted that the absorption and re-emission of radiation in the closed oven would result in an ever-increasing peak frequency and energy, diverging to infinity: the so-called ultraviolet catastrophe. Not only did this violate the law of conservation of energy, it was an affront to common sense: closed ovens do not explode like nuclear bombs. And yet the theory which predicted this behaviour, the Rayleigh-Jeans law, made perfect sense based upon the motion of atoms and molecules, correctly predicted numerous physical phenomena, and was correct for thermal radiation at lower frequencies.

At the time Planck took up the problem of thermal radiation, experimenters in Germany were engaged in measuring the radiation emitted by hot objects with ever-increasing precision, confirming the discrepancy between theory and reality, and falsifying several attempts to explain the measurements. In December 1900, Planck presented his new theory of black body radiation and what is now called Planck's Law at a conference in Berlin. Written in modern notation, his formula for the energy emitted by a body of temperature T at frequency ν is:

B_ν(T) = (2hν³ / c²) × 1 / (e^(hν / kBT) − 1)

This equation not only correctly predicted the results measured in the laboratories, it avoided the ultraviolet catastrophe: the exponential term imposes an effective cutoff on the highest frequency radiation which can be emitted by an object at a given temperature. This meant that the absorption and re-emission of radiation in the closed oven could never run away to infinity, because negligible energy could be emitted above the limit imposed by the temperature.
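
One can see the cutoff at work by simply evaluating the formula. The following sketch in Swift is my own illustration (not from the book), using rounded CODATA values for the constants; note how the exponential term crushes emission at frequencies well above the peak:

    import Foundation

    let h  = 6.62607e-34   // Planck's constant, J·s
    let kB = 1.38065e-23   // Boltzmann's constant, J/K
    let c  = 2.99792458e8  // speed of light, m/s

    // Spectral radiance B_ν(T) from Planck's law.
    func planck(_ nu: Double, _ T: Double) -> Double {
        return (2 * h * pow(nu, 3) / (c * c)) / (exp(h * nu / (kB * T)) - 1)
    }

    let T = 5800.0  // roughly the temperature of the Sun's photosphere, K
    for nu in [1e13, 1e14, 1e15, 1e16] {
        print(String(format: "ν = %.0e Hz   B = %.3e", nu, planck(nu, T)))
    }
    // Radiance rises toward the peak near 3.4e14 Hz, then collapses:
    // at 1e16 Hz it is some thirty orders of magnitude below the peak.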

Fine: the theory explained the measurements. But what did it mean? More than a century later, we're still trying to figure that out.

Planck modeled the walls of the oven as a series of resonators, but unlike earlier theories in which each could emit energy at any frequency, he constrained them to produce discrete chunks of energy with a value determined by the frequency emitted. This had the result of imposing a limit on the frequency due to the available energy. While this assumption yielded the correct result, Planck, deeply steeped in the nineteenth century tradition of the continuum, did not initially suggest that energy was actually emitted in discrete packets, considering this aspect of his theory “a purely formal assumption.” Planck's 1900 paper generated little reaction: it was observed to fit the data, but the theory and its implications went over the heads of most physicists.

In 1905, in his capacity as editor of Annalen der Physik, he read and approved the publication of Einstein's paper on the photoelectric effect, which explained another physics puzzle by assuming that light was actually emitted in discrete bundles with an energy determined by its frequency. But Planck, whose equation manifested the same property, wasn't ready to go that far. As late as 1913, he wrote of Einstein, “That he might sometimes have overshot the target in his speculations, as for example in his light quantum hypothesis, should not be counted against him too much.” Only in the 1920s did Planck fully accept the implications of his work as embodied in the emerging quantum theory.

The equation for Planck's Law contained two new fundamental physical constants: Planck's constant (h) and Boltzmann's constant (kB). (Boltzmann's constant was named in memory of Ludwig Boltzmann, the pioneer of statistical mechanics, who committed suicide in 1906. The constant was first introduced by Planck in his theory of thermal radiation.) Planck realised that these new constants, which related the worlds of the very large and very small, together with other physical constants such as the speed of light (c), the gravitational constant (G), and the Coulomb constant (ke), allowed defining a system of units for quantities such as length, mass, time, electric charge, and temperature which were truly fundamental: derived from the properties of the universe we inhabit, and therefore comprehensible to intelligent beings anywhere in the universe. Most systems of measurement are derived from parochial anthropocentric quantities such as the temperature of somebody's armpit or the supposed distance from the north pole to the equator. Planck's natural units have no such dependencies, and when one does physics using them, equations become simpler and more comprehensible. The magnitudes of the Planck units are so far removed from the human scale they're unlikely to find any application outside theoretical physics (imagine speed limit signs expressed in a fraction of the speed of light, or road signs giving distances in Planck lengths of 1.62×10−35 metres), but they reflect the properties of the universe and may indicate the limits of our ability to understand it (for example, it may not be physically meaningful to speak of a distance smaller than the Planck length or an interval shorter than the Planck time [5.39×10−44 seconds]).
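
To make this concrete, here is a brief sketch in Swift (my own illustration, using rounded CODATA values; ħ is h divided by 2π) which derives the Planck length and Planck time cited above from the fundamental constants:

    import Foundation

    let h    = 6.62607015e-34      // Planck's constant, J·s
    let G    = 6.67430e-11         // gravitational constant, m³ kg⁻¹ s⁻²
    let c    = 2.99792458e8        // speed of light, m/s
    let hbar = h / (2 * Double.pi) // reduced Planck constant, J·s

    let planckLength = sqrt(hbar * G / pow(c, 3))  // ≈ 1.62e-35 m
    let planckTime   = sqrt(hbar * G / pow(c, 5))  // ≈ 5.39e-44 s

    print(String(format: "Planck length: %.3e m", planckLength))
    print(String(format: "Planck time:   %.3e s", planckTime))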

Planck's life was long and productive, and he enjoyed robust health (he continued his long hikes in the mountains into his eighties), but it was marred by tragedy. His first wife, Marie, died of tuberculosis in 1909. He outlived four of his five children. His son Karl was killed in 1916 in World War I. His two daughters, Grete and Emma, both died in childbirth, in 1917 and 1919. His son and close companion Erwin, who survived capture and imprisonment by the French during World War I, was arrested and executed by the Nazis in 1945 for suspicion of involvement in the Stauffenberg plot to assassinate Hitler. (There is no evidence Erwin was a part of the conspiracy, but he was anti-Nazi and knew some of those involved in the plot.)

Planck was repulsed by the Nazis, especially after a private meeting with Hitler in 1933, but continued in his post as the head of the Kaiser Wilhelm Society until 1937. He considered himself a German patriot and never considered emigrating (and doubtless his being 75 years old when Hitler came to power was a consideration). He opposed and resisted the purging of Jews from German scientific institutions and the campaign against “Jewish science”, but when ordered to dismiss non-Aryan members of the Kaiser Wilhelm Society, he complied. When Heisenberg approached him for guidance, he said, “You have come to get my advice on political questions, but I am afraid I can no longer advise you. I see no hope of stopping the catastrophe that is about to engulf all our universities, indeed our whole country. … You simply cannot stop a landslide once it has started.”

Planck's house near Berlin was destroyed in an Allied bombing raid in February 1944, and with it a lifetime of his papers, photographs, and correspondence. (He and his second wife Marga had evacuated to Rogätz in 1943 to escape the raids.) As a result, historians have only limited primary sources from which to work, and the present book does an excellent job of recounting the life and science of a man whose work laid part of the foundations of twentieth century science.

Posted at 01:08 Permalink

Monday, January 9, 2017

The Autodesk File: Thirty-fifth Anniversary Edition

This year marks the thirty-fifth anniversary of several key events in the history of Autodesk:
  • January 12, 1982: Working Paper proposing a new company
  • January 30, 1982: Original organisation meeting at my house in Mill Valley, California
  • April 26, 1982: Incorporation of Autodesk, Inc. in California
  • September 20, 1982: First AutoCAD bug reported by a customer (in a pre-release copy)
  • November 29, 1982: Introduction of AutoCAD at COMDEX, Las Vegas
To celebrate Autodesk's thirty-fifth anniversary, I have prepared the Fifth Edition (2017) of The Autodesk File. Except for correction of a few typographical errors, the content is identical to that of the 1994 fourth edition, but the book has been entirely reformatted and updated to contemporary Web standards. The typography uses Unicode text entities, and should be much easier on the eye. Each chapter is now a single document, instead of being broken into sections and subsections, and is easier to read without incessant clicking on navigation buttons. All of the AutoCAD sample drawings used as illustrations have been re-made from their original PostScript plot files at higher resolution. The pop-up windows for footnotes (which were irritating and ran afoul of some browser pop-up blockers) have been replaced by [Footnote] icons which display the footnote when clicked. Cross-references are indicated by an [Ref] icon which navigates to the cited page when clicked. A navigation bar at the left provides instant access to all chapters, and highlights the current chapter regardless of how you arrived there. The Fifth Edition is compatible with most modern desktop browsers. The Safari browser on iOS mobile devices (iPad, iPhone) has a serious flaw in scrolling text within a window which has remained uncorrected for years. On these devices, you can read the iOS work-around edition, which contains a device-specific fix for the problem. All previous Web editions of the book remain available from its main directory page.

Posted at 21:31 Permalink

Friday, December 30, 2016

Books of the year: 2016

Here are my picks for the best books of 2016, fiction and nonfiction. These aren't the best books published this year, but rather the best I've read in the last twelvemonth. The winner in both categories is barely distinguished from the pack, and the runners up are all worthy of reading. Runners up appear in alphabetical order by their author's surname. Each title is linked to my review of the book.

Fiction:

Winner:
Runners up:

Nonfiction:

Winner:
Runners up:

Posted at 13:29 Permalink

Thursday, December 15, 2016

Reading List: On the Shores of Titan's Farthest Sea

Carroll, Michael. On the Shores of Titan's Farthest Sea. Cham, Switzerland: Springer International, 2015. ISBN 978-3-319-17758-8.
By the mid-23rd century, humans have become a spacefaring species. Human settlements extend from the Earth to the moons of Jupiter, and Mars has been terraformed into a world with seas, where people can live on the surface and breathe the air. The industries of Earth and Mars are supplied by resources mined in the asteroid belt. High-performance drive technologies, using fuels produced in space, allow this archipelago of human communities to participate in a system-wide economy, constrained only by the realities of orbital mechanics. For bulk shipments of cargo, it doesn't matter much how long they're in transit, as long as regular deliveries are maintained.

But whenever shipments of great value traverse a largely empty void, they represent an opportunity to those who would seize them by force. As in the days when wooden ships returned treasure from the New World to the Old on the home planet, space cargo en route from the new worlds to the old is vulnerable to pirates, and an arms race is underway between shippers and buccaneers of the black void, with the TriPlanet Bureau of Investigation (TBI) finding itself largely a spectator, confined to tracking down the activities of criminals within the far-flung human communities.

As humanity expands outward, the frontier is Titan, Saturn's largest moon, and the only moon in the solar system to have a substantial atmosphere. Titan around 2260 is much like present-day Antarctica: home to a variety of research stations operated by scientific agencies of various powers in the inner system. Titan is much more interesting than Antarctica, however. Apart from the Earth, it is the only solar system body to have natural liquids on its surface, with a complex cycle of evaporation, rain, erosion, rivers, lakes, and seas. The largest sea, Kraken Mare, located near the north pole, is larger than Earth's Caspian Sea. Titan's atmosphere is half again as dense as that of Earth, and with only 14% of Earth's gravity, it is possible for people to fly under their own muscle power.

It's cold: really cold. Titan receives around one hundredth as much sunlight as the Earth, and the mean temperature is around −180 °C. There is plenty of water on Titan, but at these temperatures water is a rock as hard as granite, and it is found in the form of mountains and boulders on the surface. But what about the lakes? They're filled with a mixture of methane and ethane, hydrocarbons which can exist in either gaseous or liquid form at the temperatures and pressure found on Titan. Driven by ultraviolet light from the Sun, these hydrocarbons react with nitrogen and hydrogen in the atmosphere to produce organic compounds that envelop the moon in a dense layer of smog and rain out, forming dunes on the surface. (Here “organic” is used in the chemist's sense of denoting compounds containing carbon, and does not imply they are of biological origin.)

Mayda Research Station, located on the shore of Kraken Mare, hosts researchers in a variety of fields. In addition to people studying the atmosphere, rivers, organic compounds on the surface, and other specialties, the station is home to a drilling project intended to bore through the ice crust and explore the liquid water ocean believed to lie below. Mayda is an isolated station, with all of the interpersonal dynamics one expects to find in such environments along with the usual desire of researchers to get on with their own work. When a hydrologist turns up dead of hypothermia—frozen to death—in his bed in the station, his colleagues are baffled and unsettled. Accidents happen, but this is something which simply doesn't make any sense. Nobody can think of either a motive for foul play or a suspect. Abigail Marco, an atmospheric scientist from Mars and friend of the victim, decides to investigate further, and contacts a friend on Mars who has worked with the TBI.

The death of the scientist is a mystery, but it is only the first in a series of enigmas which perplex the station's inhabitants who see, hear, and experience things which they, as scientists, cannot explain. Meanwhile, other baffling events threaten the survival of the crew and force Abigail to confront part of her past she had hoped she'd left on Mars.

This is not a “locked station mystery” although it starts out as one. There is interplanetary action and intrigue, and a central puzzle underlying everything that occurs. Although the story is fictional, the environment in which it is set is based upon our best present day understanding of Titan, a world about which little was known before the arrival of the Cassini spacecraft at Saturn in 2004 and the landing of its Huygens probe on Titan the following year. A twenty page appendix describes the science behind the story, including the environment at Titan, asteroid mining, and terraforming Mars. The author's nonfiction Living Among Giants (March 2015) provides details of the worlds of the outer solar system and the wonders awaiting explorers and settlers there.

Posted at 01:15 Permalink

Friday, December 9, 2016

Reading List: American Individualism

Hoover, Herbert. American Individualism. Introduction by George H. Nash. Stanford, CA: Hoover Institution Press, [1922] 2016. ISBN 978-0-8179-2015-9.
After the end of World War I, Herbert Hoover and the American Relief Administration he headed provided food aid to the devastated nations of Central Europe, saving millions from famine. Upon returning to the United States in the fall of 1919, he was dismayed by what he perceived to be an inoculation of the diseases of socialism, autocracy, and other forms of collectivism, whose pernicious consequences he had observed first-hand in Europe and in the peace conference after the end of the conflict, into his own country. In 1920, he wrote, “Every wind that blows carries to our shores an infection of social disease from this great ferment; every convulsion there has an economic reaction upon our own people.”

Hoover sensed that, in the aftermath of a war which left some collectivists nostalgic for the national mobilisation and top-down direction of the economy under “war socialism”, and amid growing domestic unrest (steel and police strikes, lynchings and race riots, and bombing attacks by anarchists), it was necessary to articulate the principles upon which American society and its government were founded. These principles, he believed, were distinct from those of the Old World, and the deliberate creation of people who had come to the new continent expressly to escape the ruinous doctrines of the societies they left behind.

After assuming the post of Secretary of Commerce in the newly inaugurated Harding administration in 1921, and faced with massive coal and railroad strikes which threatened the economy, Hoover felt a new urgency to reassert his vision of American principles. In December 1922, American Individualism was published. The short book (at 72 pages, more of a long pamphlet), was based upon a magazine article he had published the previous March in World's Work.

Hoover argues that five or six philosophies of social and economic organisation are contending for dominance: among them Autocracy, Socialism, Syndicalism, Communism, and Capitalism. Against these he contrasts American Individualism, which he believes developed among a population freed by emigration and distance from shackles of the past such as divine right monarchy, hereditary aristocracy, and static social classes. These people became individuals, acting on their own initiative and in concert with one another without top-down direction because they had to: with a small and hands-off government, it was the only way to get anything done. Hoover writes,

Forty years ago [in the 1880s] the contact of the individual with the Government had its largest expression in the sheriff or policeman, and in debates over political equality. In those happy days the Government offered but small interference with the economic life of the citizen.

But with the growth of cities, industrialisation, and large enterprises such as railroads and steel manufacturing, a threat to this frontier individualism emerged: the reduction of workers to a proletariat or serfdom due to the imbalance between their power as individuals and the huge companies that employed them. It is there that government action was required to protect the other component of American individualism: the belief in equality of opportunity. Hoover believes, and supports, intervention in the economy to prevent the concentration of economic power in the hands of a few, and to guard, through taxation and other means, against the emergence of a hereditary aristocracy of wealth. Yet this poses its own risks,

But with the vast development of industry and the train of regulating functions of the national and municipal government that followed from it; with the recent vast increase in taxation due to the war;—the Government has become through its relations to economic life the most potent force for maintenance or destruction of our American individualism.

One of the challenges American society must face as it adapts is avoiding the risk of utopian ideologies imported from Europe seizing this power to try to remake the country and its people along other lines. Just ten years later, as Hoover's presidency gave way to the New Deal, this fearful prospect would become a reality.

Hoover examines the philosophical, spiritual, economic, and political aspects of this unique system of individual initiative tempered by constraints and regulation in the interest of protecting the equal opportunity of all citizens to rise as high as their talent and effort permit. Despite the problems cited by radicals bent on upending the society, he contends things are working pretty well. He cites “the one percent”: “Yet any analysis of the 105,000,000 of us would show that we harbor less than a million of either rich or impecunious loafers.” Well, the percentage of very rich seems about the same today, but after half a century of welfare programs which couldn't have been more effective in destroying the family and the initiative of those at the bottom of the economic ladder had that been their intent, and an education system which, as a federal commission was to write in 1983, “If an unfriendly foreign power had attempted to impose on America …, we might well have viewed it as an act of war”, a nation with three times the population seems to have developed a much larger unemployable and dependent underclass.

Hoover also judges the American system to have performed well in achieving its goal of a classless society with upward mobility through merit. He observes, speaking of the Harding administration of which he is a member,

That our system has avoided the establishment and domination of class has a significant proof in the present Administration in Washington. Of the twelve men comprising the President, Vice-President, and Cabinet, nine have earned their own way in life without economic inheritance, and eight of them started with manual labor.

Let's see how that has held up, almost a century later. Taking the 17 people in equivalent positions at the end of the Obama administration in 2016 (President, Vice President, and heads of the 15 executive departments), we find that only 1 of the 17 inherited wealth (I'm inferring from the description of parents in their biographies) but that precisely zero had any experience with manual labour. If attending an Ivy League university can be taken as a modern badge of membership in a ruling class, 11 of the 17 (65%) meet this test (if you consider Stanford a member of an “extended Ivy League”, the figure rises to 70%).

Although published in a different century in a very different America, much of what Hoover wrote remains relevant today. Just as Hoover warned of bad ideas from Europe crossing the Atlantic and taking root in the United States, the Frankfurt School in Germany was laying the groundwork for the deconstruction of Western civilisation and individualism, and in the 1930s, its leaders would come to America to infect academia. As Hoover warned, “There is never danger from the radical himself until the structure and confidence of society has been undermined by the enthronement of destructive criticism.” Destructive criticism is precisely what these “critical theorists” specialised in, and today in many parts of the humanities and social sciences even in the most eminent institutions the rot is so deep they are essentially a write-off.

Undoing a century of bad ideas is not the work of a few years, but Hoover's optimistic and pragmatic view of the redeeming merit of individualism unleashed is a bracing antidote to the gloom one may feel when surveying the contemporary scene.

Posted at 22:58 Permalink

Wednesday, December 7, 2016

Reading List: Paper

Kurlansky, Mark. Paper. New York: W. W. Norton, 2016. ISBN 978-0-393-23961-4.
One of the things that makes us human is our use of extrasomatic memory: we invent ways to store and retrieve things outside our own brains. It's as if when the evolutionary drive which caused the brains of our ancestors to grow over time reached its limit, due to the physical constraints of the birth canal, we applied the cleverness of our bulging brains to figure out not only how to record things for ourselves, but to pass them on to other individuals and transmit them through time to our successors.

This urge to leave a mark on our surroundings is deep-seated and as old as our species. Paintings at the El Castillo site in Spain have been dated to at least 40,800 years before the present. Complex paintings of animals and humans in the Lascaux Caves in France, dated around 17,300 years ago, seem strikingly modern to observers today. As anybody who has observed young children knows, humans do not need to be taught to draw: the challenge is teaching them to draw only where appropriate.

Nobody knows for sure when humans began to speak, but evidence suggests that verbal communication is at least as old as drawing, and may have appeared well before the first evidence of it. Once speech appeared, it was not only possible to transmit information from one human to another directly but, by memorising stories, poetry, and songs, to create an oral tradition passed on from one generation to the next. No longer need what one individual learned in a lifetime die with them.

Given the human compulsion to communicate, and how long we've been doing it by speaking, drawing, singing, and sculpting, it's curious we only seem to have invented written language around 5000 years ago. (But recall that the archaeological record is incomplete and consists only of objects which survived through the ages. Evidence of early writing is from peoples who wrote on durable material such as stone or clay tablets, or lived in dry climates such as that of Egypt where more fragile media such as papyrus or parchment would be preserved. It is entirely possible writing was invented much earlier by any number of societies who wrote on more perishable surfaces and lived in climates where they would not endure.)

Once writing appeared, it remained the province of a small class of scribes and clerics who would read texts to the common people. Mass literacy did not appear for millennia, and would require a better medium for the written word and a less time-consuming and costly way to reproduce it. It was in China that the solutions to both of these problems would originate.

Legends date Chinese writing from much earlier, but the oldest known writing in China is dated around 3300 years ago, and was inscribed on bones and turtle shells. Already, the Chinese language used six hundred characters, and this number would only increase over time, with a phonetic alphabet never being adopted. The Chinese may not have invented bureaucracy, but as an ancient and largely stable society they became very skilled at it, and consequently produced ever more written records. These writings employed a variety of materials: stone, bamboo, and wood tablets; bronze vessels; and silk. All of these were difficult to produce, expensive, and many required special skills on the part of scribes.

Cellulose is a main component of the cell wall of plants, and forms the structure of many of the more complex members of the plant kingdom. It forms linear polymers which produce strong fibres. The cellulose content of plants varies widely: cotton is 90% cellulose, while wood is around half cellulose, depending on the species of tree. Sometime around A.D. 100, somebody in China (according to legend, a courtier named Cai Lun) discovered that through a process of cooking, hammering, and chopping, the cellulose fibres in material such as discarded cloth, hemp, and tree bark could be made to separate into a thin slurry of fibres suspended in water. If a frame containing a fine screen were dipped into a vat of this material, rocked back and forth in just the right way, then removed, a fine layer of fibres with random orientation would remain on the screen after the water drained away. This sheet could then be removed, pressed, and dried, yielding a strong, flat material composed of intertwined cellulose fibres. Paper had been invented.

Paper was found to be ideal for writing the Chinese language, which was, and is today, usually written with a brush. Since paper could be made from raw materials previously considered waste (rags, old ropes and fishing nets, rice and bamboo straw), water, and a vat and frame which were easily constructed, it was inexpensive and could be produced in quantity. Further, the papermaker could vary the thickness of the paper by adding more or less pulp to the vat and by the technique used in dipping the frame, and could produce paper with different surface properties by adding “sizing” material such as starch to the mix. In addition to sating the appetite of the imperial administration, paper was adopted as the medium of choice for artists, calligraphers, and makers of fans, lanterns, kites, and other objects.

Many technologies were invented independently by different societies around the world. Paper, however, appears to have been discovered only once in the eastern hemisphere, in China, and then diffused westward along the Silk Road. The civilisations of Mesoamerica, such as the Mayans, Toltecs, and Aztecs, extensively used what was described as paper prior to the Spanish conquest, but it is not clear whether this was true paper or a material made from reeds and bark. So thoroughly did the conquistadors obliterate the indigenous civilisations, burning thousands of books, that only three Mayan books and fifteen Aztec documents are known to have survived, and none of these are written on true paper.

Paper arrived in the Near East just as the Islamic civilisation was consolidating after its first wave of conquests. Now faced with administering an empire, the caliphs discovered, like the Chinese before them, that many documents were required and the new innovative writing material met the need. Paper making requires a source of cellulose-rich material and abundant water, neither of which are found in the Arabian peninsula, so the first great Islamic paper mill was founded in Baghdad in A.D. 794, originally employing workers from China. It was the first water-powered paper mill, a design which would dominate paper making until the age of steam. The demand for paper continued to grow, and paper mills were established in Damascus and Cairo, each known for the particular style of paper they produced.

It was the Muslim invaders of Spain who brought paper to Europe, and paper produced by mills they established in the land they named al-Andalus found markets in the territories we now call Italy and France. Many Muslim scholars of the era occupied themselves producing editions of the works of Greek and Roman antiquity, and wrote them on paper. After the Christian reconquest of the Iberian peninsula, papermaking spread to Italy, arriving in time for the awakening of intellectual life which would be called the Renaissance and produce large quantities of books, sheet music, maps, and art: most of it on paper. Demand outstripped supply, and paper mills sprang up wherever a source of fibre and running water was available.

Paper provided an inexpensive, durable, and portable means of storing, transmitting, and distributing information of all kinds, but was limited in its audience as long as each copy had to be laboriously made by a scribe or artist (often introducing errors in the process). Once again, it was the Chinese who invented the solution. Motivated by the Buddhist religion, which values making copies of sacred texts, in the 8th century A.D. the first documents were printed in China and Japan. The first items to be printed were single pages, carved into a single wood block for the whole page, then printed onto paper in enormous quantities: tens of thousands in some cases. In the year 868, the first known dated book was printed, a volume of Buddhist prayers called the Diamond Sutra. Published on paper in the form of a scroll five metres long, each illustrated page was printed from a wood block carved with its entire contents. Such a “block book” could be produced in quantity (limited only by wear on the wood block), but the process of carving the wood was laborious, especially since text and images had to be carved as a mirror image of the printed page.

The next breakthrough also originated in China, but had limited impact there due to the nature of the written language. By carving or casting an individual block for each character, it was possible to set any text from a collection of characters, print documents, then reuse the same characters for the next job. Unfortunately, by the time the Chinese began to experiment with printing from movable type in the twelfth and thirteenth centuries, it took 60,000 different characters to print the everyday language and more than 200,000 for literary works. This made the initial investment in a set of type forbidding. The Koreans began to use movable type cast from metal in the fifteenth century and were so impressed with its flexibility and efficiency that in 1444 a royal decree abolished the use of Chinese characters in favour of a phonetic alphabet called Hangul which is still used today.

It was in Europe that movable type found a burgeoning intellectual climate ripe for its adoption, and whence it came to change the world. Johannes Gutenberg was a goldsmith, originally working with his brother Friele in Mainz, Germany. Fleeing political unrest, the brothers moved to Strasbourg, where around 1440 Johannes began experimenting with movable type for printing. His background as a goldsmith equipped him with the required skills of carving, stamping, and casting metal; indeed, many of the pioneers of movable type in Europe began their careers as goldsmiths. Gutenberg carved letters into hard metal, forming what he called a punch. The punch was used to strike a copper plate, forming an impression called the matrix. Molten lead was then poured into the matrix, producing individual characters of type. Casting letters in a matrix allowed producing as many of each letter as needed to set pages of type, and for replacement of worn type as required. The roman alphabet was ideal for movable type: while the Chinese language required 60,000 or more characters, a complete set of upper and lower case letters, numbers, and punctuation for German came to only around 100 pieces of type. Accounting for duplicates of commonly used letters, Gutenberg's first book, the famous Gutenberg Bible, used a total of 290 pieces of type. Gutenberg also developed a special ink suited for printing with metal type, and adapted a press he acquired from a paper mill to print pages.

Gutenberg was secretive about his processes, likely aware he had competition, which he did. Movable type was one of those inventions which was “in the air”—had Gutenberg not invented and publicised it, his contemporaries working in Haarlem, Bruges, Avignon, and Feltre, all reputed by people of those cities to have gotten there first, doubtless would have. But it was the impact of Gutenberg's Bible, which demonstrated that movable type could produce book-length works of quality comparable to those written by the best scribes, which established the invention in the minds of the public and inspired others to adopt the new technology.

Its adoption was, by the standards of the time, swift. An estimated eight million books were printed and sold in Europe in the second half of the fifteenth century—more books than Europe had produced in all of history before that time. Itinerant artisans would take their type punches from city to city, earning money by setting up locals in the printing business, then moving on.

In early sixteenth century Germany, the printing revolution sparked a Reformation. Martin Luther, an Augustinian monk, completed his German translation of the Bible in 1534 (he had earlier published a translation of the New Testament in 1522). This was the first widely-available translation of the Bible into a spoken language, and reinforced the Reformation idea that the Bible was directly accessible to all, without need for interpretation by clergy. Beginning with his original Ninety-five Theses, Luther authored thirty publications, which it is estimated sold 300,000 copies (in a territory of around 14 million German speakers). Around a third of all publications in Germany in the era were related to the Reformation.

This was a new media revolution. While the incumbent Church reacted at the speed of sermons read occasionally to congregations, the Reformation produced a flood of tracts, posters, books, and pamphlets written in vernacular German and aimed directly at an increasingly literate population. Luther's pamphlets became known as Flugschriften: “flying writings”. One such document, written in 1520, sold 4000 copies in three weeks and 50,000 in two years. Whatever the merits of the contending doctrines, the Reformation had fully embraced and employed the new communication technology to speak directly to the people. In modern terms, you might say the Reformation was the “killer app” for movable type printing.

Paper and printing with movable type were the communication and information storage technologies the Renaissance needed to express and distribute the work of thinkers and writers across a continent, who were now able to read and comment on each other's work and contribute to a culture that knew no borders. Interestingly, the technology of paper making was essentially unchanged from that of China a millennium and a half earlier, and printing with movable type hardly different from that invented by Gutenberg. Both would remain largely the same until the industrial revolution. What changed was an explosion in the volume of printed material and, with increasing literacy among the general public, the audience and market for it. In the eighteenth century a new innovation, the daily newspaper, appeared. Between 1712 and 1757, the circulation of newspapers in Britain grew eightfold. By 1760, newspaper circulation in Britain was 9 million, and would increase to 24 million by 1811.

All of this printing required ever increasing quantities of paper, and most paper in the West was produced from rags. Although the population was growing, their thirst for printed material expanded much quicker, and people, however fastidious, produce only so many rags. Paper shortages became so acute that newspapers limited their size based on the availability and cost of paper. There were even cases of scavengers taking clothes from the dead on battlefields to sell to paper mills making newsprint used to report the conflict. Paper mills resorted to doggerel to exhort the public to save rags:

The scraps, which you reject, unfit
To clothe the tenant of a hovel,
May shine in sentiment and wit,
And help make a charming novel…

René Antoine Ferchault de Réaumur, a French polymath who published in numerous fields of science, observed in 1719 that wasps made their nests from what amounted to paper they produced directly from wood. If humans could replicate this vespidian technology, the forests of Europe and North America could provide an essentially unlimited and renewable source of raw material for paper. This idea was to lie fallow for more than a century. Some experimenters produced small amounts of paper from wood through various processes, but it was not until 1850 that paper was manufactured from wood in commercial quantities in Germany, and 1863 that the first wood-based paper mill began operations in America.

Wood is about half cellulose, while the fibres in rags run up to 90% cellulose. The other major component of wood is lignin, a cross-linked polymer which gives it its strength and is useless for paper making. In the 1860s a process was invented where wood, first mechanically cut into small chips, was chemically treated to break down the fibrous structure in a device called a “digester”. This produced a pulp suitable for paper making, and allowed a dramatic expansion in the volume of paper produced. But the original wood-based paper still contained lignin, which turns brown over time. While this was acceptable for newspapers, it was undesirable for books and archival documents, for which rag paper remained preferred. In 1879, a German chemist invented a process to separate lignin from cellulose in wood pulp, which allowed producing paper that did not brown with age.

The processes used to make paper from wood involved soaking the wood pulp in acid to break down the fibres. Some of this acid remained in the paper, and many books printed on such paper between 1840 and 1970 are now in the process of slowly disintegrating as the acid eats away at the paper. Only around 1970 was it found that an alkali solution works just as well when processing the pulp, and since then acid-free paper has become the norm for book publishing.

Most paper is produced from wood today, and on an enormous, industrial scale. A single paper mill in China, not the largest, produces 600,000 tonnes of paper per year. And yet, for all of the mechanisation, that paper is made by the same process as the first sheet of paper produced in China: by reducing material to cellulose fibres, mixing them with water, extracting a sheet (now a continuous roll) with a screen, then pressing and drying it to produce the final product.

Paper and printing are technologies so simple, based upon such readily-available materials, and so potentially revolutionary that they inspire “what if” speculation. The ancient Egyptians, Greeks, and Romans each had everything they needed—raw materials, skills, and a suitable written language—so that a Connecticut Yankee-like time traveller could have explained to artisans already working with wood and metal how to make paper, cast movable type, and set up a printing press in a matter of days. How would history have differed had one of those societies unleashed the power of the printed word?

Posted at 21:39 Permalink

Friday, December 2, 2016

Floating Point Benchmark: Swift Language Added

I have posted an update to my trigonometry-intense floating point benchmark which adds Swift to the list of languages in which the benchmark is implemented. A new release of the benchmark collection including Swift is now available for downloading.

Swift is a general purpose programming language developed by Apple for application programming on all of their platforms (macOS, iOS, tvOS, and watchOS). In addition, Swift has been ported to Linux and is now developed as an open source project. Swift is intended as a successor to Objective-C, the main development language for Apple systems. It cleans up the syntax of C and eliminates security risks such as null pointers, errors in memory management, out-of-range subscripts, overflows, and type conversion errors. Memory management is automatic, using a reference counting scheme, with explicit declaration of weak references to avoid memory leaks due to circular references. Functions are first-class objects, and rudimentary support for functional programming (for example, map, reduce, and lazy evaluation) is provided.
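
To make these features concrete, here is a minimal Swift sketch (the types and values are invented for illustration and are not taken from the benchmark code) showing a weak reference breaking a retain cycle under reference counting, together with map, reduce, and lazy evaluation:

    // Illustrative only: hypothetical Owner and Gadget types.
    class Owner {
        var gadgets: [Gadget] = []
    }

    class Gadget {
        // Declared weak so the back-reference to Owner does not form
        // a retain cycle, which reference counting could never reclaim.
        weak var owner: Owner?
        let size: Double
        init(size: Double, owner: Owner) {
            self.size = size
            self.owner = owner
        }
    }

    let owner = Owner()
    owner.gadgets = [1.0, 2.0, 4.0].map { Gadget(size: $0, owner: owner) }

    // Functions as first-class objects: map and reduce over a collection.
    let total = owner.gadgets.map { $0.size }.reduce(0, +)

    // Lazy evaluation: squares are computed only as they are demanded.
    let firstBigSquare = (1...1_000_000).lazy.map { $0 * $0 }.first { $0 > 1000 }

    print(total, firstBigSquare ?? 0)   // prints "7.0 1024"

Without the weak declaration, Owner and Gadget would hold strong references to one another, and neither would ever be deallocated.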

The relative performance of the various language implementations (with C taken as 1) is as follows. All language implementations of the benchmark listed below produced identical results to the last (11th) decimal place.

Language            Relative Time   Details
C                        1          GCC 3.2.3 -O3, Linux
Visual Basic .NET        0.866      All optimisations, Windows XP
FORTRAN                  1.008      GNU Fortran (g77) 3.2.3 -O3, Linux
Pascal                   1.027      Free Pascal 2.2.0 -O3, Linux
                         1.077      GNU Pascal 2.1 (GCC 2.95.2) -O3, Linux
Swift                    1.054      Swift 3.0.1, -O, Linux
Rust                     1.077      Rust 0.13.0, --release, Linux
Java                     1.121      Sun JDK 1.5.0_04-b05, Linux
Visual Basic 6           1.132      All optimisations, Windows XP
Haskell                  1.223      GHC 7.4.1 -O2 -funbox-strict-fields, Linux
Ada                      1.401      GNAT/GCC 3.4.4 -O3, Linux
Go                       1.481      Go version go1.1.1 linux/amd64, Linux
Simula                   2.099      GNU Cim 5.1, GCC 4.8.1 -O2, Linux
Lua                      2.515      LuaJIT 2.0.3, Linux
                        22.7        Lua 5.2.3, Linux
Python                   2.633      PyPy 2.2.1 (Python 2.7.3), Linux
                        30.0        Python 2.7.6, Linux
Erlang                   3.663      Erlang/OTP 17, emulator 6.0, HiPE [native, {hipe, [o3]}]
                         9.335      Byte code (BEAM), Linux
ALGOL 60                 3.951      MARST 2.7, GCC 4.8.1 -O3, Linux
Lisp                     7.41       GNU Common Lisp 2.6.7, Compiled, Linux
                        19.8        GNU Common Lisp 2.6.7, Interpreted
Smalltalk                7.59       GNU Smalltalk 2.3.5, Linux
Forth                    9.92       Gforth 0.7.0, Linux
COBOL                   12.5        Micro Focus Visual COBOL 2010, Windows 7
                        46.3        Fixed decimal instead of computational-2
Algol 68                15.2        Algol 68 Genie 2.4.1 -O3, Linux
Perl                    23.6        Perl v5.8.0, Linux
Ruby                    26.1        Ruby 1.8.3, Linux
JavaScript              27.6        Opera 8.0, Linux
                        39.1        Internet Explorer 6.0.2900, Windows XP
                        46.9        Mozilla Firefox 1.0.6, Linux
QBasic                 148.3        MS-DOS QBasic 1.1, Windows XP Console
Mathematica            391.6        Mathematica 10.3.1.0, Raspberry Pi 3, Raspbian

Posted at 16:37 Permalink

Monday, November 28, 2016

Floating Point Benchmark: Mathematica Language Added

I have posted an update to my trigonometry-intense floating point benchmark which adds Wolfram's Mathematica (or, if you like, “Wolfram Language”) to the list of languages in which the benchmark is implemented. A new release of the benchmark collection including Mathematica is now available for downloading.

The relative performance of the various language implementations (with C taken as 1) is as follows. All language implementations of the benchmark listed below produced identical results to the last (11th) decimal place.

Language            Relative Time   Details
C                        1          GCC 3.2.3 -O3, Linux
Visual Basic .NET        0.866      All optimisations, Windows XP
FORTRAN                  1.008      GNU Fortran (g77) 3.2.3 -O3, Linux
Pascal                   1.027      Free Pascal 2.2.0 -O3, Linux
                         1.077      GNU Pascal 2.1 (GCC 2.95.2) -O3, Linux
Rust                     1.077      Rust 0.13.0, --release, Linux
Java                     1.121      Sun JDK 1.5.0_04-b05, Linux
Visual Basic 6           1.132      All optimisations, Windows XP
Haskell                  1.223      GHC 7.4.1 -O2 -funbox-strict-fields, Linux
Ada                      1.401      GNAT/GCC 3.4.4 -O3, Linux
Go                       1.481      Go version go1.1.1 linux/amd64, Linux
Simula                   2.099      GNU Cim 5.1, GCC 4.8.1 -O2, Linux
Lua                      2.515      LuaJIT 2.0.3, Linux
                        22.7        Lua 5.2.3, Linux
Python                   2.633      PyPy 2.2.1 (Python 2.7.3), Linux
                        30.0        Python 2.7.6, Linux
Erlang                   3.663      Erlang/OTP 17, emulator 6.0, HiPE [native, {hipe, [o3]}]
                         9.335      Byte code (BEAM), Linux
ALGOL 60                 3.951      MARST 2.7, GCC 4.8.1 -O3, Linux
Lisp                     7.41       GNU Common Lisp 2.6.7, Compiled, Linux
                        19.8        GNU Common Lisp 2.6.7, Interpreted
Smalltalk                7.59       GNU Smalltalk 2.3.5, Linux
Forth                    9.92       Gforth 0.7.0, Linux
COBOL                   12.5        Micro Focus Visual COBOL 2010, Windows 7
                        46.3        Fixed decimal instead of computational-2
Algol 68                15.2        Algol 68 Genie 2.4.1 -O3, Linux
Perl                    23.6        Perl v5.8.0, Linux
Ruby                    26.1        Ruby 1.8.3, Linux
JavaScript              27.6        Opera 8.0, Linux
                        39.1        Internet Explorer 6.0.2900, Windows XP
                        46.9        Mozilla Firefox 1.0.6, Linux
QBasic                 148.3        MS-DOS QBasic 1.1, Windows XP Console
Mathematica            391.6        Mathematica 10.3.1.0, Raspberry Pi 3, Raspbian

The implementation of the benchmark program is completely straightforward: no tricks intended to improve performance are used, and no optimisations such as compiling heavily-used functions are performed. The program is written in functional style, with all assignments immutable. The only iteration is that used to run the benchmark multiple times; tail recursion is used elsewhere. The code which puts together the summary of the computation (evaluationReport[]) is particularly ugly, but is not included in the benchmark timing.
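
For readers unfamiliar with this style, the following sketch illustrates the pattern. It is written in Swift rather than Mathematica, and step() is a made-up computation standing in for the benchmark's actual inner loop; the point is iteration expressed as tail recursion over immutable bindings rather than a mutating loop:

    // Illustrative sketch only: step() is a hypothetical stand-in for
    // one pass of the benchmark's real computation.
    func step(_ x: Double) -> Double {
        return x + 1.0
    }

    // Run the computation `count` times by tail recursion: all state
    // is passed forward as arguments, and nothing is ever mutated.
    func iterate(count: Int, state: Double) -> Double {
        if count == 0 {
            return state
        }
        return iterate(count: count - 1, state: step(state))
    }

    print(iterate(count: 1000, state: 0.0))   // 1000.0

(Swift does not guarantee tail-call elimination, so this is purely an illustration of the style, not a recipe for unbounded iteration.)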

To compare performance with native C code, I ran the C language version of the benchmark three times for about five minutes each on the Raspberry Pi 3 platform and measured a mean time per iteration of 14.06 microseconds. I then ran the Mathematica benchmark three times for five minutes and computed a mean time per iteration of 5506 microseconds. The C code thus runs around 391.6 times faster than Mathematica.

Note that the Raspberry Pi 3 runs Mathematica very slowly compared to most other desktop platforms. When I ran the identical benchmark in the Wolfram Cloud, it ran at about 681.7 microseconds per iteration, around eight times faster.

It is, of course, absurd to use a computer mathematics system to perform heavy-duty floating point scientific computation (at least without investing the effort to optimise the most computationally-intense portions of the task), so the performance measured by running this program should not be taken as indicative of the merit of Mathematica when used for the purposes for which it is intended. Like the COBOL implementation of the benchmark, this is mostly an exercise in seeing if it's possible and comparing how easily the algorithm can be expressed in different programming languages.

I have also added timings for the C implementations of the fbench and ffbench programs when run on the Raspberry Pi 3.

Posted at 23:42 Permalink