Debugging the Mythology: Was Ada Lovelace Really the First Computer Programmer?


Ada Lovelace – in the background, an illustration of the Difference Engine. Image: Alexander Bertram-Powell / Alamy

Resembling the myths of the ancient Greeks, fictionalized science stories feature brilliant geniuses making spectacular breakthroughs. With the exception of Marie Skłodowska Curie, virtually all of these intellectual heroes are men. Campaigners have therefore searched energetically for role models to encourage female students to choose a career in science. None can yet compete with Curie, but among the British pioneers the leading contender for celebrity status is the 19th-century aristocrat Ada Lovelace.

Wrongly dubbed “the first computer programmer,” Lovelace had her iconic reputation confirmed in the late 1970s, when the United States Department of Defense immortalized her as Ada, a specialized programming language. Does she deserve such prestige? Rich, glamorous and charming, Lovelace was a visionary mathematician who grasped the potential of computers with remarkable foresight, speculating that they might one day compose poetry or music. But that is very different from compiling a working program that tells a computer how to process information and make its own decisions.

A remarkable woman in her own right, Lovelace was raised to celebrity status by her partnership with two men. The first was her father, the poet Lord Byron, who abandoned the family a few months after Ada’s birth and died in Greece when she was eight years old. Dominated by her autocratic mother, the child was subjected to a rigorous education in mathematics, interspersed with long periods of lying motionless on a board. Desperate to escape this oppressive upbringing, Ada Byron eloped with a tutor but later married a viscount. Subsequently becoming the Earl and Countess of Lovelace, the couple dedicated their lives to study and reading, traveling between their various homes.

The flamboyant daughter of Britain’s most famous poet, Lovelace was quickly adopted into London’s elite intellectual circles. Her distinguished acquaintances included Charles Dickens, Michael Faraday and Mary Somerville, although her fairy-tale story did not have a happy ending: she gambled away much of her fortune on horse racing and died of cancer when she was only 36 years old. As a teenager she had met the talented but temperamental inventor Charles Babbage, and they established a close and mutually beneficial relationship.

A frequent guest of the family, Babbage tactfully encouraged Lovelace’s scientific interests and helped her broaden her mathematical expertise; at the same time, he gained a young and influential admirer with contacts in the right places. Babbage assiduously promoted his prestigious protégée, describing her rapturously to Faraday as “that Enchantress who has cast her magic spell around the most abstract of Sciences” [mathematics]. But other scientists were more skeptical: the astronomer John Herschel commented scathingly that she had no clear grasp of the mathematics herself.

Like Lovelace, Babbage has acquired a heroic status he does not fully deserve. Although he was certainly a brilliant and inspired inventor, this so-called “father of computing” had little impact on the birth or development of modern digital computers. He began to acquire his posthumous fame in 1971, the centenary of his death, when Maurice Wilkes – who had led the Cambridge team behind EDSAC, an early electronic computer – hailed him as a “pioneer of computing.” The suggestion was eagerly taken up: by restoring a forgotten ancestor to glory, the nascent computer industry could acquire a glorious national history at a stroke. Alan Turing was not even a candidate – his portrait now graces the £50 note, but at the time he was viewed with suspicion as a homosexual who had met a tragic end, and his triumphs at Bletchley Park were still largely unknown.

Ironically, Wilkes also accused Babbage of holding back progress by failing to deliver finished products and thereby discouraging his successors. Babbage’s first project was his Difference Engine, intended to be a giant calculating machine capable of producing long tables of numbers automatically and accurately. The government gave him funding to build the device, hoping it would ultimately prove cheaper than hiring human “computers,” many of whom were women.

After about ten years, the money and enthusiasm ran out before the machine was fully operational, and Babbage turned his attention to a much more ambitious project: his Analytical Engine. Although it was never completed, it was based on the unprecedented concept that the machine would “eat its own tail”: it could modify its own behavior, using the results of its calculations to determine its next step. This internal decision-making is a fundamental characteristic of computers, although Babbage’s machine worked in the familiar decimal digits zero to nine rather than in the binary of electronic equipment. Relying on complex systems of interacting cogwheels, the engine degenerated into a technological nightmare that consumed vast amounts of time and energy to no avail.
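To make the “eating its own tail” idea concrete, here is a minimal modern sketch – written in Python and in no way a reconstruction of Babbage’s design – of a calculation whose own intermediate results decide what happens next, which is the kind of internal decision-making the Analytical Engine was meant to perform.

```python
# A minimal illustration (not Babbage's design): each intermediate result
# decides whether the loop continues and what the next operands will be.
def gcd(a, b):
    while b != 0:          # the outcome of the previous step controls the branch
        a, b = b, a % b    # the result of one calculation feeds the next
    return a

print(gcd(1071, 462))  # -> 21
```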

Grace Hopper — digital revolutionary. Image: Granger Historical Picture Archive / Alamy

Lovelace became involved while the venture was still an exciting path to the future. Babbage invited her to translate a paper on his Analytical Engine written in French by the mathematician Luigi Menabrea, who later became Prime Minister of Italy. After nine months of work under Babbage’s direction, her published version included a commentary twice as long as Menabrea’s original. It was attributed to her, but – as Herschel suggested – Babbage may have contributed to it; it is impossible to know how much. Most famous of all, one of her additional notes, Note G, presents a table for calculating the so-called Bernoulli numbers, which have great mathematical significance. Although she was solely responsible for it, the table is not a program: it shows the steps that would take place inside a programmed machine, if one had existed.
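For readers curious about what Note G’s table was actually producing, here is a rough modern sketch – a short Python routine, not a reconstruction of Lovelace’s table or her method – that generates the same Bernoulli numbers from the standard recurrence relation.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    # Standard recurrence: sum over j of C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1.
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(8))  # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30; odd terms beyond B_1 are 0
```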

Heroes are made, not born. If computer scientists believe they need a 19th-century ancestor, perhaps Herman Hollerith should supplant Babbage? To compile the US Census, Hollerith invented his eponymous punched cards, which were still in use a hundred years later – and he also founded a company that became the international giant IBM.

And as a female role model, the American mathematics graduate Grace Hopper seems far more appropriate than a fickle Victorian socialite from London. A US Navy officer during the Second World War who eventually rose to the rank of rear admiral, this programming pioneer gave her name to a powerful supercomputer. Hopper revolutionized the digital world by insisting that instead of forcing people to communicate in symbolic code, computers should learn to speak English. She also left a permanent mark on the English language: the term “debugging” took hold after a moth that had flown into the machine’s circuitry was removed.

