Tagged: computing

  • feedwordpress 08:01:36 on 2019/04/18 Permalink
    Tags: Big Data, computing, database, relational database, Ted Codd

    “Big Data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it”*… 


    [image: "A Day in Data"]

    You’ve probably heard of kilobytes, megabytes, gigabytes, or even terabytes.

    These data units are common everyday amounts that the average person may run into. Units this size may be big enough to quantify the amount of data sent in an email attachment, or the data stored on a hard drive, for example.

    In the coming years, however, these common units will begin to seem more quaint – that’s because the entire digital universe is expected to reach 44 zettabytes by 2020.

    If this number is correct, it will mean there are 40 times more bytes than there are stars in the observable universe…
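
    The comparison is easy to sanity-check. Here is a minimal Python sketch, assuming the commonly cited order-of-magnitude estimate of roughly 10^21 stars in the observable universe (the star count is an outside estimate, not a figure from the article, so treat the "40 times" as order-of-magnitude only):

        # Back-of-the-envelope check of the bytes-to-stars claim above.
        ZETTABYTE = 10**21                       # bytes in one zettabyte (SI)
        digital_universe_2020 = 44 * ZETTABYTE   # projected digital universe, in bytes
        stars = 10**21                           # rough estimate; published figures vary

        print(f"bytes per star: {digital_universe_2020 / stars:.0f}")  # ~44, i.e. "roughly 40 times more"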

    The stuff of dreams, the stuff of nightmares: "How Much Data is Generated Each Day?"

    * Dan Ariely

    ###

    As we revel in really, really big numbers, we might spare a thought for Edgar Frank “Ted” Codd; he died on this date in 2003.  A distinguished computer scientist who did important work on cellular automata, he is best remembered as the father of computer databases– as the person who laid the foundation for relational databases, for storing and retrieving information in computer records.

    [image: Edgar F. Codd] source

     

     
  • feedwordpress 09:01:00 on 2019/02/16 Permalink
    Tags: computing, John Mauchly, library of the future, Presper Eckert

    “It is likely that libraries will carry on and survive, as long as we persist in lending words to the world that surrounds us, and storing them for future readers”*… 


    [image: library]

    Many visions of the future lie buried in the past. One such future was outlined by the American librarian Charles Ammi Cutter in his essay “The Buffalo Public Library in 1983”, written a century earlier, in 1883.

    Cutter’s fantasy, at times dry and descriptive, is also wonderfully precise:

    The [library], when complete, was to consist of two parts, the first a central store, 150 feet square, a compact mass of shelves and passageways, lighted from the ends, but neither from sides nor top; the second an outer rim of rooms 20 feet wide, lighted from the four streets. In front and rear the rim was to contain special libraries, reading-rooms, and work-rooms; on the sides, the art-galleries. The central portion was a gridiron of stacks, running from front to rear, each stack 2 feet wide, and separated from its neighbor by a passage of 3 feet. Horizontally, the stack was divided by floors into 8 stories, each 8 feet high, giving a little over 7 feet of shelf-room, the highest shelf being so low that no book was beyond the reach of the hand. Each reading-room, 16 feet high, corresponded to two stories of the stack, from which it was separated in winter by glass doors.

    The imagined structure allows for a vast accumulation of books:

    We have now room for over 500,000 volumes in connection with each of the four reading-rooms, or 4,000,000 for the whole building when completed.

    If his vision for Buffalo Public Library might be considered fairly modest from a technological point of view, when casting his net a little wider to consider a future National Library, one which “can afford any luxury”, things get a little more inventive.

    [T]hey have an arrangement that brings your book from the shelf to your desk. You have only to touch the keys that correspond to the letters of the book-mark, adding the number of your desk, and the book is taken off the shelf by a pair of nippers and laid in a little car, which immediately finds its way to you. The whole thing is automatic and very ingenious…

    But for Buffalo, book delivery is a cheaper, simpler, and perhaps less noisy affair.

    …for my part I much prefer our pages with their smart uniforms and noiseless steps. They wear slippers, the passages are all covered with a noiseless and dustless covering, they go the length of the hall in a passage-way screened off from the desk-room so that they are seen only when they leave the stack to cross the hall towards any desk. As that is only 20 feet wide, the interruption to study is nothing.

    Cutter’s fantasy might appear fairly mundane, born out of the (stereo)typical neuroses of a librarian: the prevention of all noise (through the wearing of slippers), the halting of the spread of illness (through good ventilation), and the avoidance of disorder in the collection (through technological innovations)…

    Far from a wild utopian dream, today Cutter’s library of the future appears basic: there will be books and there will be clean air and there will be good lighting. One wonders what Cutter might make of the library today, in which the most basic dream remains perhaps the most radical: for them to remain in our lives, free and open, clean and bright.

    More at the original, in Public Domain Review: “The Library of the Future: A Vision of 1983 from 1883.”  Read Cutter’s essay in its original form at the Internet Archive.

    Pair with “Libraries of the future are going to change in some unexpected ways,” in which IFTF Research Director (and Boing Boing co-founder) David Pescovitz describes a very different future from Cutter’s, and from which the image above was sourced.

    * Alberto Manguel, The Library at Night

    ###

    As we browse in bliss, we might recall that it was on this date in 1946 that the most famous early computer– the ENIAC (Electronic Numerical Integrator And Computer)– was dedicated.  The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943, as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and Presper Eckert of the University of Pennsylvania, where it was built.  The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world.  ENIAC’s basic clock speed was 100,000 cycles per second; today’s home computers run at several billion cycles per second.

     source

     

     
  • feedwordpress 09:01:26 on 2019/01/14 Permalink
    Tags: computing, Elements, Euclid, incompleteness theorems

    “The laws of nature are but the mathematical thoughts of God”*… 


    [image: euclid]

    2,300 years ago, Euclid of Alexandria sat with a reed pen–a humble, sliced stalk of grass–and wrote down the foundational laws that we’ve come to call geometry. Now his beautiful work is available for the first time as an interactive website.

    Euclid’s Elements was first published in 300 B.C. as a compilation of the foundational geometrical proofs established by the ancient Greek. It became the world’s oldest continuously used mathematical textbook. Then in 1847, mathematician Oliver Byrne rereleased the text with a new, watershed use of graphics. While Euclid’s version had basic sketches, Byrne reimagined the proofs in a modernist, graphic language based upon the three primary colors to keep it all straight. Byrne’s use of color made his book expensive to reproduce and therefore scarce, but Byrne’s edition has been recognized as an important piece of data visualization history all the same…

    Explore elemental beauty at “A masterpiece of ancient data viz, reinvented as a gorgeous website.”

    * Euclid, Elements

    ###

    As we appreciate the angles, we might spare a thought for Kurt Friedrich Gödel; he died on this date in 1978.  A logician, mathematician, and philosopher, he is considered (along with Aristotle, Alfred Tarski– whose birthday this also is– and Gottlob Frege) to be one of the most important logicians in history.  Gödel had an immense impact upon scientific and philosophical thinking in the 20th century.  He is, perhaps, best remembered for his Incompleteness Theorems, which led to (among other important results) Alan Turing’s insights into computational theory.

    Kurt Gödel’s achievement in modern logic is singular and monumental – indeed it is more than a monument, it is a landmark which will remain visible far in space and time. … The subject of logic has certainly completely changed its nature and possibilities with Gödel’s achievement.

    – John von Neumann

    [image: Kurt Gödel] source

     

     
  • feedwordpress 09:01:37 on 2018/12/02 Permalink
    Tags: computing, Enrico Fermi, quantum computing

    “As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.”*… 


    [image: quantum computing]

    Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it.

    We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex manmade systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.

    Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.

    It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.

    In light of all this, it’s natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view. It’s based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work…

    Mikhail Dyakonov makes “The Case Against Quantum Computing.”

    * Albert Einstein

    ###

    As we feel the need for speed, we might recall that it was on this date in 1942 that a team of scientists led by Enrico Fermi, working inside an enormous tent on a squash court under the stands of the University of Chicago’s Stagg Field, achieved the first controlled nuclear fission chain reaction… laying the foundation for the atomic bomb and later, nuclear power generation.

    “…the Italian Navigator has just landed in the New World…”
    – Coded telephone message confirming first self-sustaining nuclear chain reaction, December 2, 1942.

    Illustration depicting the scene on Dec. 2, 1942 (Photo copyright of Chicago Historical Society)

    source

    Indeed, exactly 15 years later, on this date in 1957, the world’s first full-scale atomic electric power plant devoted exclusively to peacetime uses, the Shippingport Atomic Power Station, reached criticality; the first power was produced 16 days later, after engineers integrated the generator into the distribution grid of Duquesne Light Company.

     source

     

     
  • feedwordpress 08:01:51 on 2018/10/18 Permalink
    Tags: Analytical Engine, computing, history of computing

    “The future is already here – it’s just not evenly distributed”*… 


    [image: future]

    Security, transportation, energy, personal “stuff”– in 2018, the staff of Popular Mechanics asked leading engineers and futurists for their visions of future cities, and built a handbook to navigate this new world: “The World of 2045.”

    * William Gibson (in The Economist, December 4, 2003)

    ###

    As we take the long view, we might spare a thought for Charles Babbage; he died on this date in 1871. A mathematician, philosopher, inventor, and mechanical engineer, Babbage is best remembered for originating the concept of a programmable computer. Anxious to eliminate inaccuracies in mathematical tables, he first built a small calculating machine able to compute squares.  He then produced prototypes of portions of a larger Difference Engine. (Georg and Edvard Scheutz later constructed the first working devices to the same design, and found them successful in limited applications.)  In 1833 he began his programmable Analytical Machine (AKA the Analytical Engine), the forerunner of modern computers, with coding help from Ada Lovelace, who created an algorithm for the Analytical Machine to calculate a sequence of Bernoulli numbers– for which she is remembered as the first computer programmer.
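
    To give a flavor of how a machine can tabulate squares, here is a minimal Python sketch of the method of finite differences that the Difference Engine mechanized– an illustration of the technique only, not a model of Babbage’s hardware:

        # For f(x) = x^2 the second difference is the constant 2, so once the
        # differences are set up, every further square needs only additions.
        def tabulate_squares(n):
            value, first_diff, second_diff = 0, 1, 2   # 0^2, (1^2 - 0^2), constant
            squares = []
            for _ in range(n):
                squares.append(value)
                value += first_diff        # next square, by addition alone
                first_diff += second_diff  # next first difference
            return squares

        print(tabulate_squares(8))  # [0, 1, 4, 9, 16, 25, 36, 49]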

    Babbage’s other inventions include the cowcatcher, the dynamometer, the standard railroad gauge, uniform postal rates, occulting lights for lighthouses, Greenwich time signals, the heliograph, and the ophthalmoscope.  A true hacker, he was also passionate about cyphers and lock-picking.

     source

     

     
  • feedwordpress 08:01:34 on 2018/09/09 Permalink
    Tags: computing, small business

    “A buyer with disproportionate power”*… 



    Chickens are seen at a poultry farm at Hartbeesfontein, a settlement near Klerksdorp, in the North West province

    Imagine the farm that raised the chicken that produced the meat that sits in your sandwich: a few workers, thousands of birds, tens of thousands of pounds of white and dark meat, work that starts before dawn and ends after dusk, uncertain revenue, slim profits. There are thousands of these small farms in the United States, and they benefit from millions of dollars of taxpayer support each year.

    Chicken is America’s favorite protein, after all. Family farms are one of its most prized institutions. And farming is tough business. According to one estimate, a new, hangar-like chicken house costs something like $300,000 to build, and more to maintain and upgrade. “A farmer has to invest over $1 million just to get set up—a lot of debt to carry when you’re paid on average between 5 cents and 6 cents per pound of chicken produced,” Sally Lee of the Rural Advancement Foundation International-USA has found. Even when a chicken-growing operation is established, financial success is far from a sure thing. Given those realities—and given the American love for and support of the family farm—generous taxpayer subsidies seem not just sensible, but vital.

    But a government report released this spring calls into question whether all those family chicken farms are really family chicken farms, and whether those taxpayer dollars might be better spent elsewhere. The Small Business Administration’s inspector general looked at poultry growers, and found that many of them are tied-and-bound contractors—so controlled by their agreements with giant food corporations that they no longer act like independent entities. Why offer them taxpayer support meant for the little guy?…

    What your chicken dinner says about wage stagnation, income inequality, and economic sclerosis in the United States: “The Rise of the Zombie Small Businesses.”

    For a consideration of the effects of corporate concentration on wages: “More and more companies have monopoly power over workers’ wages. That’s killing the economy.”

    * Monopsony: 1) (economics) A market situation in which there is only one buyer for a product; also, such a buyer. [from 1930s] 2) (economics) A buyer with disproportionate power.  -Wiktionary

    ###

    As we cogitate on (real) competition, we might recall that it was on this date in 1947 that fabled computer scientist Grace Hopper (see here and here), then a programmer on Harvard’s Mark II Aiken Relay Calculator, found and documented the first computer “bug”– an insect that had lodged in the works.  The incident is recorded in Hopper’s logbook alongside the offending moth, taped to the logbook page: “15:45 Relay #70 Panel F (moth) in relay. First actual case of bug being found.”

    This anecdote has led to Hopper being pretty widely credited with coining the term “bug” (and ultimately “de-bug”) in its technological usage… but the term actually dates back at least to Thomas Edison…

    [image: bug]

    Grace Hopper’s log entry

     

     
  • feedwordpress 08:01:41 on 2018/04/25 Permalink
    Tags: computing

    “Man is not born to solve the problem of the universe, but to find out what he has to do; and to restrain himself within the limits of his comprehension”*… 


    [image: model flame front]

    Half a century ago, the pioneers of chaos theory discovered that the “butterfly effect” makes long-term prediction impossible. Even the smallest perturbation to a complex system (like the weather, the economy or just about anything else) can touch off a concatenation of events that leads to a dramatically divergent future. Unable to pin down the state of these systems precisely enough to predict how they’ll play out, we live under a veil of uncertainty.

    But now the robots are here to help…

    In new computer experiments, artificial-intelligence algorithms can tell the future of chaotic systems.  For example, researchers have used machine learning to predict the chaotic evolution of a model flame front like the one pictured above.  Learn how– and what it may mean– at “Machine Learning’s ‘Amazing’ Ability to Predict Chaos.”
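
    The sensitivity in question is easy to see in miniature. Here is a minimal Python sketch using the logistic map, a standard toy chaotic system (far simpler than the flame-front model the researchers studied): two trajectories that start a billionth apart soon disagree completely.

        # Iterate the logistic map x -> r*x*(1-x) from two nearly identical starts.
        r = 3.9                   # parameter in the chaotic regime
        x, y = 0.5, 0.5 + 1e-9    # initial conditions one part in a billion apart
        for step in range(1, 51):
            x, y = r * x * (1 - x), r * y * (1 - y)
            if step % 10 == 0:
                print(f"step {step:2d}: |x - y| = {abs(x - y):.9f}")
        # The gap grows roughly exponentially until it is as large as the values themselves.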

    * Johann Wolfgang von Goethe

    ###

    As we contemplate complexity, we might recall that it was on this date in 1961 that Robert Noyce was issued patent number 2981877 for his “semiconductor device-and-lead structure,” the first patent for what would come to be known as the integrated circuit.  In fact, another engineer, Jack Kilby, had separately and essentially simultaneously developed the same technology (Kilby’s design was rooted in germanium; Noyce’s, in silicon) and had filed a few months earlier than Noyce… a fact that was recognized in 2000, when Kilby was awarded the Nobel Prize– in which Noyce, who had died in 1990, did not share.

    Noyce (left) and Kilby (right)

     source

     

     

     
  • feedwordpress 09:01:14 on 2018/01/23 Permalink
    Tags: computing, James Gosling, Java, Sun Microsystems

    “Unless it wants to break faith with its social function, art must show the world as changeable. And help to change it.”*… 


    [image: from “Social Decay”]

    Andrei Lacatusu, a self-taught digital artist from Romania, created this series of digital artworks called “Social Decay.”

    Learn more at “Artist Imagines The Decay Of Social Media Companies“: see the full set at Lacatusu’s Behance page.

    [TotH to the always-illuminating Pop Loser]

    * Ernst Fischer

    ###

    As we contemplate a post-social media world, we might recall that it was on this date in 1996 that the first version of the Java programming language was released by Sun Microsystems; the language, created by James Gosling, had been in use since 1995 as part of Sun’s Java Platform.  Its ability to “write once, run anywhere” made Java ideal for Internet-based applications.  As the popularity of the Internet soared, so did the usage of Java.

     source

     

     
  • feedwordpress 09:01:41 on 2017/12/10 Permalink
    Tags: Computer programming, computing, flash, flash photography

    “A flash of revelation and a flash of response”*… 



    “A Cellar Dive in the Bend,” c.1895, by Richard Hoe Lawrence and Henry G. Piffard

    All photography requires light, but the light used in flash photography is unique — shocking, intrusive and abrupt. It’s quite unlike the light that comes from the sun, or even from ambient illumination. It explodes, suddenly, into darkness.

    The history of flash goes right back to the challenges faced by early photographers who wanted to use their cameras in places where there was insufficient light — indoors, at night, in caves. The first flash photograph was probably a daguerreotype of a fossil, taken in 1839 by burning limelight…

    In its early days, a sense of quasi-divine revelation was invoked by some flash photographers, especially when documenting deplorable social conditions. Jacob Riis, for example, working in New York in the late 1880s, used transcendental language to help underscore flash’s significance as an instrument of intervention and purgation. But it’s in relation to documentary photography that we encounter most starkly flash’s singular, and contradictory, aspects. It makes visible that which would otherwise remain in darkness; but it is often associated with unwelcome intrusion, a rupturing of private lives and interiors.

    Yet flash brings a form of democracy to the material world. Many details take on unplanned prominence, as we see in the work of those Farm Security Administration photographers who used flash in the 1930s and laid bare the reality of poverty during the Depression. A sudden flare of light reveals each dent on a kitchen utensil and the label on each carefully stored can; each photograph on the mantel; each cherished ornament; each little heap of waste paper or discarded rag; each piece of polished furniture or stained floor or accumulation of dust; each wrinkle. Flash can make plain, bring out of obscurity, the appearance of things that may never before have been seen with such clarity…

    Find illumination at “A short history of flash photography.”

    * J.M. Coetzee, Disgrace

    ###

    As we glory in the glare, we might send elegantly-calculated birthday greetings to Augusta Ada King-Noel, Countess of Lovelace (née Byron); she was born on this date in 1815.  The daughter of the poet Lord Byron, she was the author of what can reasonably be considered the first “computer program”– making her one of the “parents” of the modern computer.  She worked in collaboration with her long-time friend and thought partner Charles Babbage (known as “the father of computers”), in particular in connection with Babbage’s work on the Analytical Engine.

    Ada, Countess of Lovelace, 1840

    source

     

     

     
  • feedwordpress 08:01:19 on 2017/09/29 Permalink
    Tags: computing, software, Word

    “It is not enough for code to work”*… 



    It’s been said that software is “eating the world.” More and more, critical systems that were once controlled mechanically, or by people, are coming to depend on code. This was perhaps never clearer than in the summer of 2015, when on a single day, United Airlines grounded its fleet because of a problem with its departure-management system; trading was suspended on the New York Stock Exchange after an upgrade; the front page of The Wall Street Journal’s website crashed; and Seattle’s 911 system went down again, this time because a different router failed. The simultaneous failure of so many software systems smelled at first of a coordinated cyberattack. Almost more frightening was the realization, late in the day, that it was just a coincidence…

    Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break… Software failures are failures of understanding, and of imagination…
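
    The contrast is worth one line of arithmetic. A minimal Python sketch of the postwar, parts-based reasoning the passage describes, with illustrative numbers (not figures from the article):

        # With independent parts, redundancy multiplies small failure probabilities.
        p_one_engine = 1e-5            # illustrative chance one engine fails on a flight
        p_both = p_one_engine ** 2     # two independent engines
        print(f"one engine:  {p_one_engine:.0e}")
        print(f"two engines: {p_both:.0e}")   # 1e-10: redundancy buys reliability

        # Software breaks this arithmetic: a second copy of a buggy program fails
        # exactly when the first does, so duplication buys nothing against a
        # failure of understanding.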

    Invisible– but all too real and painful– problems, and the attempts to make them visible: “The Coming Software Apocalypse.”

    * Robert C. Martin, Clean Code: A Handbook of Agile Software Craftsmanship

    ###

    As we Code for America, we might recall that it was on this date in 1983 that Microsoft released its first software application, Microsoft Word 1.0.  For use with MS-DOS compatible systems, Word was the first word processing software to make extensive use of a computer mouse. (Not coincidentally, Microsoft had released a computer mouse for IBM-compatible PCs earlier in the year.)  A free demo version of Word was included with an issue of PC World– the first time a floppy disk was included with a magazine.

     source

     

     