Recent Updates

  • feedwordpress 09:01:03 on 2019/01/08 Permalink
    Tags: Blade Runner, prediction

    “Memories, you’re talking about memories”*… 



    It’s natural, here at the lip of a new year, to wonder what 2019 might hold.  And it’s bracing to note that Blade Runner (released in 1982) is one of 14 films set in what was then the future and is now this very year, the one on which we’re embarking.

    But lest we dwell on the dark prognostications those films tend to portray, we might take heart from Jill Lepore’s wonderfully entertaining review of predictions, “What 2018 looked like fifty years ago,” and recent honoree Isaac Asimov’s 1983 response to the Toronto Star‘s request for a look at the world of 2019.

    Niels Bohr was surely right when he observed that “prediction is difficult, especially about the future.”

    * Rick Deckard (Harrison Ford), Blade Runner


    As we contend with the contemporary, we might spend a memorial moment honoring two extraordinary explorers who died on this date.  Marco Polo, whose coda to his remarkable travelogue was “I did not tell half of what I saw,” passed away on this date in 1324.

    A page from “Il Milione” (aka “Le Livre des Merveilles,” or “The Book of Wonders”)… and, in English, “The Travels of Marco Polo”

    And Galileo Galilei, the Italian physicist, philosopher, and pioneering astronomer, rose to his beloved heavens on this date in 1642.  Galileo (who, readers will recall, had his share of trouble with authorities displeased with his challenge to Aristotelian cosmology) died insisting “still, it [the Earth] moves.”

    Draft of Galileo’s letter to Leonardo Donato, Doge of Venice, in which he first recorded the movement of the moons of Jupiter– an observation that upset the notion that all celestial bodies must revolve around the Earth.

  • feedwordpress 09:01:29 on 2019/01/07 Permalink
    Tags: artificial intelligence, George Dyson, Nikola Tesla

    “‘Now I understand,’ said the last man”… 



    All revolutions come to an end, whether they succeed or fail.

    The digital revolution began when stored-program computers broke the distinction between numbers that mean things and numbers that do things. Numbers that do things now rule the world. But who rules over the machines?

    Once it was simple: programmers wrote the instructions that were supplied to the machines. Since the machines were controlled by these instructions, those who wrote the instructions controlled the machines.

    Two things then happened. As computers proliferated, the humans providing instructions could no longer keep up with the insatiable appetite of the machines. Codes became self-replicating, and machines began supplying instructions to other machines. Vast fortunes were made by those who had a hand in this. A small number of people and companies who helped spawn self-replicating codes became some of the richest and most powerful individuals and organizations in the world.

    Then something changed. There is now more code than ever, but it is increasingly difficult to find anyone who has their hands on the wheel. Individual agency is on the wane. Most of us, most of the time, are following instructions delivered to us by computers rather than the other way around. The digital revolution has come full circle and the next revolution, an analog revolution, has begun. None dare speak its name.

    Childhood’s End was Arthur C. Clarke’s masterpiece, published in 1953, chronicling the arrival of benevolent Overlords who bring many of the same conveniences now delivered by the Keepers of the Internet to Earth. It does not end well…

    George Dyson explains that nations, alliances of nations, and national institutions are in decline, while a state perhaps best described as “Oligarchia” is on the ascent: the Edge New Year’s Essay, “Childhood’s End.”

    (For a different perspective on the same dynamics, see, e.g., Kevin Kelly’s The Inevitable.)

    * Arthur C. Clarke, Childhood’s End


    As we ponder the possibility of posterity, we might spare a thought for Serbian-American electrical engineer and inventor Nikola Tesla; he died on this date in 1943.  Tesla is probably best remembered for his rivalry with Thomas Edison:  Tesla invented and patented the first AC motor and generator (cf. Niagara Falls); Edison promoted DC power… and went to great lengths to discredit Tesla and his approach.  In the end, of course, Tesla was right.

    Tesla patented over 300 inventions worldwide, though he kept many of his creations out of the patent system to protect their confidentiality.  His work ranged widely, from technology critical to the development of radio to the first remote control.  At the turn of the century, Tesla designed and began planning a “worldwide wireless communications system” that was backed by J.P. Morgan…  until Morgan lost confidence and pulled out.  “Cyberspace,” as described by the likes of William Gibson and Neal Stephenson, is largely prefigured in Tesla’s plan.  On Tesla’s 75th birthday in 1931, Time put him on its cover, captioned “All the world’s his power house.”  He received congratulatory letters from Albert Einstein and more than 70 other pioneers in science and engineering.  But Tesla’s talent ran far, far ahead of his luck.  He died penniless in Room 3327 of the New Yorker Hotel.



  • feedwordpress 09:01:22 on 2019/01/06 Permalink
    Tags: Georg Cantor, Peter Carruthers, set theory

    “Control of consciousness determines the quality of life”*… 



    Peter Carruthers, Distinguished University Professor of Philosophy at the University of Maryland, College Park, is an expert on the philosophy of mind who draws heavily on empirical psychology and cognitive neuroscience. He outlined many of his ideas on conscious thinking in his 2015 book The Centered Mind: What the Science of Working Memory Shows Us about the Nature of Human Thought. More recently, in 2017, he published a paper with the astonishing title of “The Illusion of Conscious Thought.”…

    Philosopher Peter Carruthers insists that conscious thought, judgment and volition are illusions. They arise from processes of which we are forever unaware.  He explains to Steve Ayan the reasons for his provocative proposal: “There Is No Such Thing as Conscious Thought.”

    See also: “An Anthropologist Investigates How We Think About How We Think.”

    * Mihaly Csikszentmihalyi, Flow: The Psychology of Optimal Experience


    As we think about thought, we might spare one for Georg Ferdinand Ludwig Philipp Cantor; he died on this date in 1918.  Cantor was the mathematician who created set theory, now fundamental to math.  His proof that the real numbers are more numerous than the natural numbers implies the existence of an “infinity of infinities”… a result that generated a great deal of resistance, both mathematical (from the likes of Henri Poincaré) and philosophical (most notably from Wittgenstein).  Some Christian theologians (particularly neo-Scholastics) saw Cantor’s work as a challenge to the uniqueness of the absolute infinity in the nature of God – on one occasion equating the theory of transfinite numbers with pantheism – a proposition that Cantor, a devout Lutheran, vigorously rejected.

    These harsh criticisms fueled Cantor’s bouts of depression (retrospectively judged by some to have been bipolar disorder); he died in a mental institution.



  • feedwordpress 09:01:09 on 2019/01/05 Permalink
    Tags: stratigraphy

    “Eloquence is a painting of the thoughts”*… 



    Fish and loaves fresco from the Catacombs of St. Callixto, Rome, c. 200. Christian iconography appeared in the first third of the third century. It quickly developed a clear vocabulary—an image of a fisherman represented Jesus Christ and the apostles, a fish under a breadbasket represented communion, and the superimposed Greek letters Χ (chi) and Ρ (rho), sometimes called the Christogram or monogram of Christ, represented Christ himself (Χ and Ρ are the first two letters in the Greek word for Christ, Christos). Early Christians used these and other symbols in mural paintings, catacomb frescoes, and sarcophagi carvings to label deceased Christians. Sixteen popes are buried in the catacombs of San Callixto, located on the Appian Way in Rome.


    Although Éric de Grolier, the so-called Father of Information Systems in France, coined the term infographic in 1979, the history of the graphical representation of information stretches back much further. The history of the visualization of information is intrinsically tied to the history of human cognition, of technology, and of art and design. Human beings have used visuals for so many things: to communicate ideas and stories; to represent space, time, and the cosmos; to extrapolate and compare sets of data; to show connections and disparities; to teach complex concepts or succinctly display information. Visualizations—maps, diagrams, graphs—make arguments for how we should understand the world, and thereby teach us how to understand, organize, and make sense of complicated reality. These simplified versions of the world allow us to see things that are usually unseen: the borders between political jurisdictions, the hierarchy of an organization, or the relationship between the mortal plane and the afterlife…

    A fascinating history of the visual expression of ideas: “Instead of Writing a Thousand Words, Part One: Ideas, Part Two: Maps, and Part Three: Data.”

    * Blaise Pascal


    As we show, not tell, we might recall that it was on this date in 1796, at the Swan Inn in Dunkerton (England), that William Smith, a self-educated geologist, wrote in a single sentence his discovery of the mode of identifying strata by the organized fossils respectively imbedded therein (the theory of stratigraphy)– now an axiomatic fact of modern geological knowledge.  He went on to publish (in 1799) the first large-scale geological map of the area around Bath, Somerset.



  • feedwordpress 09:01:06 on 2019/01/04 Permalink
    Tags: Alan Greenspan, Federal Reserve

    “Efficiency is doing things right; effectiveness is doing the right things.”*… 



    Eliminating waste sounds like a reasonable goal. Why would we not want managers to strive for an ever-more-efficient use of resources? Yet as I will argue, an excessive focus on efficiency can produce startlingly negative effects, to the extent that superefficient businesses create the potential for social disorder. This happens because the rewards arising from efficiency get more and more unequal as that efficiency improves, creating a high degree of specialization and conferring an ever-growing market power on the most-efficient competitors. The resulting business environment is extremely risky, with high returns going to an increasingly limited number of companies and people—an outcome that is clearly unsustainable. The remedy, I believe, is for business, government, and education to focus more strongly on a less immediate source of competitive advantage: resilience. This may reduce the short-term gains from efficiency but will produce a more stable and equitable business environment in the long run…

    Roger Martin‘s eloquent argument for a longer-term perspective and for robustness as a primary goal: “The High Price of Efficiency.”

    [image above: source]

    * Peter Drucker


    As we take the long view, we might recall that it was on this date in 2000 that Alan Greenspan was nominated for his fourth term as Chairman of the Federal Reserve.  An acolyte of Ayn Rand, he oversaw an “easy money” Fed that, many suggest, was a leading cause of the dotcom bubble (which began later that year) and the subprime mortgage crisis (which led to the Great Recession, and which occurred within a year of his departure from the Fed).



  • feedwordpress 09:01:53 on 2019/01/03 Permalink
    Tags: Andrew Wiles, Fermat, Fermat's Last Theorem, Josiah Wedgwood, manufacturing, number theory

    “I have had my results for a long time, but I do not yet know how to arrive at them”*… 



    Andrew Wiles gave a series of lectures cryptically titled “Modular Forms, Elliptic Curves, and Galois Representations” at a mathematics conference in Cambridge, England, in June of 1993. His argument was long and technical. Finally, 20 minutes into the third talk, he came to the end. Then, to punctuate the result, he added:

    => FLT

    “Implies Fermat’s Last Theorem.” The most famous unverified conjecture in the history of mathematics. First proposed by the 17th-century French jurist and spare-time mathematician Pierre de Fermat, it had remained unproven for more than 350 years. Wiles, a professor at Princeton University, had worked on the problem, alone and in secret in the attic of his home, for seven years. Now he was unveiling his proof.

    His announcement electrified his audience—and the world. The story appeared the next day on the front page of The New York Times. Gap, the clothing retailer, asked him to model a new line of jeans, though he demurred. People Weekly named him one of “The 25 Most Intriguing People of the Year,” along with Princess Diana, Michael Jackson, and Bill Clinton. Barbara Walters’ producers reached out to him for an interview, to which Wiles responded, “Who’s Barbara Walters?”

    But the celebration didn’t last. Once a proof is proposed, it must be checked and verified before it is accepted as valid. When Wiles submitted his 200-page proof to the prestigious journal Inventiones Mathematicae, its editor divvied up the manuscript among six reviewers. One of them was Nick Katz, a fellow Princeton mathematician.

    For two months, Katz and a French colleague, Luc Illusie, scrutinized every logical step in Katz’s section of the proof. From time to time, they would come across a line of reasoning they couldn’t follow. Katz would email Wiles, who would provide a fix. But in late August, Wiles offered an explanation that didn’t satisfy the two reviewers. And when Wiles took a closer look, he saw that Katz had found a crack in the mathematical scaffolding. At first, a repair seemed straightforward. But as Wiles picked at the crack, pieces of the structure began falling away…

    How mistakes– first Fermat’s, then Wiles’– reinvigorated a field, then led to fundamental insight: “How Math’s Most Famous Proof Nearly Broke.”

    * Karl Friedrich Gauss


    As we ponder proof, we might spare a thought for Josiah Wedgwood; he died on this date in 1795. An English potter and businessman (he founded the Wedgwood company), he is credited, via his technique of “division of labor,” with the industrialization of the manufacture of pottery– and, via his example, much of British (and thus American) manufacturing.

    Wedgwood was a member of the Lunar Society, the Royal Society, and was an ardent abolitionist.  His daughter, Susannah, was the mother of Charles Darwin.



  • feedwordpress 09:01:34 on 2019/01/02 Permalink
    Tags: Giacomo Gastaldi, Isaac Asimov, World History

    “There is nothing new in the world except the history you do not know”*… 



    Map of the globe with a focus on trade and expansion, c. 1565, based on an earlier map by Giacomo Gastaldi. Image credit: Library of Congress


    As we look forward to 2019 and beyond, we might do well to pause and take a look back…

    This animation shows how humans have spread and organized themselves across the Earth over the past 200,000 years. The time lapse starts with the migration of homo sapiens out of sub-Saharan Africa 200,000 years ago, with a few thousand years passing every second. As the agricultural revolution gets underway and the pace of civilization quickens, the animation slows down to hundreds of years per second and eventually, as it nears modern times, 1-2 years per second…

    See also time lapse animations of the history of Europe from the fall of Rome to modern times and human population through time. (via Open Culture)

    * Harry S. Truman


    As we listen for the rhymes, we might wish the happiest of birthdays to Isaak Yudovich Ozimov– aka Isaac Asimov– who was born on this date in 1920.  A biochemistry professor, he is better remembered as an author– more specifically, as one of the greatest science fiction authors of his time (imaginer of “The Foundation,” coiner of the term “robotics,” and author of “The Three Laws of Robotics”).  But Asimov was extraordinarily prolific; he published over 500 books– including (in addition to sci fi) 14 books of history, several mysteries, a great deal of popular science, even a worthy volume on Shakespeare– and wrote an estimated 9,000 letters and postcards.



  • feedwordpress 09:01:51 on 2019/01/01 Permalink
    Tags: Julian Calendar, schadenfreude

    “It is not enough that I succeed, others must fail”*… 



    Who said “it is not enough that I succeed, others must fail”? According to Tiffany Watt Smith, in this spry book, it might have been Gore Vidal or Genghis Khan. According to the internet it is either La Rochefoucauld or Somerset Maugham. Having thought about it a bit, it might actually have been me, or perhaps it was Watt Smith herself. The point is that it doesn’t really matter, since taking pleasure in another’s misfortune turns out to be a pungent but free-floating feeling that pops up everywhere. The flavours might change – as an academic cultural historian, Watt Smith is far from suggesting that emotions are universal across time and place – but there is something familiar to us all about the odd stab of pleasure we get when an enemy or even, God help us, a friend, stumbles.

    So it is odd that the English language does not have a word for this grubby little pleasure – instead we have to borrow from the German and call it Schadenfreude (literally “damage-joy”)…

    Kathryn Hughes considers that delicious feeling of satisfaction at the “epic fails” of somebody else in a review of Tiffany Watt Smith’s Schadenfreude: The Joy of Another’s Misfortune: “Damage-joy.”

    * see above


    As we try not to snicker, we might recall that it was on this date in 45 B.C.E. that the Julian Calendar came into effect.  It was the predominant calendar in the Roman world, most of Europe, and in European settlements in the Americas and elsewhere, until it was refined and gradually replaced by the Gregorian calendar, promulgated in 1582 by Pope Gregory XIII.

    (The Julian day system remains useful for some scientific, especially astronomical, purposes, as it provides a linear count of days from a fixed starting point; it was introduced by Joseph Scaliger in 1583.  Julian Day 0 is defined as noon on Monday, January 1, 4713 B.C.E. (in the Julian Calendar).  Regardless of leap years and calendar changes by the Romans or Pope Gregory, the Julian day number enables the easy calculation of the number of days between two dates: one simply takes the difference of their Julian day numbers. This is useful, say, for astronomers’ calculations of the dates of eclipses.  Thus, the Julian day number of a day is defined as the number of days since noon GMT on 1 Jan 4713 B.C.E. in the Proleptic Julian Calendar, and each Julian day runs from noon to noon.)
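    That difference-of-day-numbers arithmetic is easy to sketch in code. Here is a minimal Python sketch (not astronomer-grade software), which relies on the fact that Python’s proleptic-Gregorian ordinal from `date.toordinal()` differs from the Julian day number by a constant offset; the offset value used below is an assumption cross-checked against the well-known epoch JDN 2451545 = January 1, 2000:

    ```python
    from datetime import date

    # Offset between Python's date ordinal (0001-01-01 = ordinal 1, proleptic
    # Gregorian) and the Julian day number (days counted from noon GMT on
    # 1 Jan 4713 BCE).  Assumed constant; checked against JDN 2451545 below.
    JDN_OFFSET = 1721425

    def julian_day_number(d: date) -> int:
        """Julian day number of the Julian day beginning at noon GMT on date d."""
        return d.toordinal() + JDN_OFFSET

    def days_between(a: date, b: date) -> int:
        """Elapsed days between two dates: just a difference of day numbers."""
        return julian_day_number(b) - julian_day_number(a)

    print(julian_day_number(date(2000, 1, 1)))  # 2451545 (the J2000.0 epoch)
    # Days elapsed since the Gregorian reform took effect:
    print(days_between(date(1582, 10, 15), date(2019, 1, 1)))
    ```

    Because each day number is a plain integer, leap years and calendar reforms drop out of the subtraction entirely, which is exactly why the scheme suits eclipse-style calculations.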



  • feedwordpress 09:01:51 on 2018/12/21 Permalink
    Tags: Around the World in Eighty Days, Christmas movies, Die Hard

    “Now I have a machine gun. Ho ho ho”*… 


    Ok, enough bickering and fighting. Let’s settle this once and for all in the only way I know how – going into a topic in way too much detail.

    As we prepare to enter the year 32 ADH (a.k.a. After Die Hard), the world is gripped by a constantly nagging question.

    No, it’s not “Why does everyone call Hans Gruber and his gang ‘terrorists’ when they were clearly bank robbers?”

    Today we’re going to use data to answer the question “Is Die Hard a Christmas movie?”

    Along the way, we’re going to test Die Hard’s Christmas bona fides against all movies in US cinemas for the past thirty years, using a variety of methods…

    Stephen Follows tackles a perennial poser: “Using data to determine if Die Hard is a Christmas movie.”

    [Image above: source… which also weighs in on the Die Hard question.]

    * Hans Gruber (Alan Rickman), reading what John McClane (Bruce Willis) had on a dead terrorist’s shirt


    As we just say Yippie-Ki-Yay, we might recall that it was on this date that Phileas Fogg completed his circumnavigation of the globe in Jules Verne’s Around the World in Eighty Days.  (As the book was published in 1873, the putative year of the journey was 1871 or 1872.)

    In 1888, American journalist Nellie Bly convinced her editor to let her attempt the feat.  She completed her round-the-world journey in 72 days.


    First edition of Verne’s tale


    Your correspondent is headed into his annual Holiday hiatus; regular service will resume on or around January 2…  Meantime, many thanks to all for reading– and Happy Holidays!


  • feedwordpress 09:01:26 on 2018/12/20 Permalink
    Tags: historians

    “History is past politics and politics present history”*… 


    Vintage compass lies on an ancient world map.

    A recent study confirms a disturbing trend: American college students are abandoning the study of history. Since 2008, the number of students majoring in history in U.S. universities has dropped 30 percent, and history now accounts for a smaller share of all U.S. bachelor’s degrees than at any time since 1950. Although all humanities disciplines have suffered declining enrollments since 2008, none has fallen as far as history. And this decline in majors has been even steeper at elite, private universities — the very institutions that act as standard bearers and gate-keepers for the discipline. The study of history, it seems, is itself becoming a relic of the past.

    It is tempting to blame this decline on relatively recent factors from outside the historical profession. There are more majors to choose from than in the past. As a broader segment of American society has pursued higher education, the promising job prospects offered by other fields, from engineering to business, have no doubt played a role in history’s decline. Women have moved in disproportionate numbers away from the humanities and towards the social sciences. The lingering consequences of the Great Recession and the growing emphasis on STEM education have had their effects, as well.

    Yet a deeper dive into the statistics reveals that history’s fortunes have worsened not over a period of years, but over decades. In the late 1960s, over six percent of male undergraduates and almost five percent of female undergraduates majored in history. Today, those numbers are less than 2 percent and 1 percent. History’s collapse began well before the financial crash.

    This fact underscores the sad truth of history’s predicament: The discipline mostly has itself to blame for its current woes. In recent decades, the academic historical profession has become steadily less accessible to students and the general public — and steadily less relevant to addressing critical matters of politics, diplomacy, and war and peace. It is not surprising that students are fleeing history, for the historical discipline has long been fleeing its twin responsibilities to interact with the outside world and engage some of the most fundamental issues confronting the United States…

    Hal Brands and Francis J. Gavin suggest that “The historical profession is committing slow-motion suicide.”

    [Image above: source]

    * The motto of the Johns Hopkins History Department (attributed to 19th century Oxford historian Edward Augustus Freeman by some scholars, and to 19th century Cambridge historian Sir John Robert Seeley by others)


    As we look to the past, we might recall that it was on this date in 1803 that the Louisiana Purchase was consummated, when the U.S. took formal possession of 828,000 square miles of territory from France (an area that includes all or part of 15 current U.S. states and 2 Canadian provinces).  Americans had originally sought to purchase only the port city of New Orleans and its adjacent coastal lands; but Napoleon, cash-strapped by his war with England, offered a (much) larger parcel– and the U.S. quickly agreed.


    The modern continental United States, with the Louisiana Purchase overlaid



