8 facts every computer programmer should know

The first known use of the word “computer” dates back to 1613, in a book called The Yong Mans Gleanings by the English writer Richard Braithwaite. In it he writes: “I haue read the truest computer of Times, and the best Arithmetician that euer breathed, and he reduceth thy dayes into a short number.” (The odd spelling is his, not ours; it was a simpler time.)

Braithwaite was referring to a person who did calculations, or computations. Today, most of that work is done by the boxes of metal and plastic sitting on our desks.

But how did a box of cables and circuits get so smart? Here, we go through the annals of history and the lesser-known corners of computer programming to find eight facts every computer programmer should know.

1. The first “pre-computers” were powered by steam

In 1801, a French weaver and merchant, Joseph Marie Jacquard, invented a loom that used punched wooden cards to control the pattern woven into the fabric.

Fast forward to the 1830s, and the world marveled at a device the size of a house, powered by six steam engines. It was invented by Charles Babbage, the father of the computer, and he called it the analytical engine.

Babbage used punch cards to make the monstrous machine programmable. The machine consisted of four parts: the mill (analogous to the CPU), the store (analogous to memory and storage), the reader (input) and the printer (output).

It was the reader that made the analytical engine innovative. Borrowing the card-reading technology of the Jacquard loom, it used three different types of punch cards: operation cards, number cards and variable cards.

Credit: Sydney Padua

Babbage unfortunately never managed to build a working version, due to persistent conflicts with his chief engineer. It seems that even back then, managers and engineers didn’t always see eye to eye.

2. The first computer programmer was a woman

In 1843, Ada Lovelace, a British mathematician, published an English translation of an article on the analytical engine written by Luigi Menabrea, an Italian engineer. To her translation, she added her own detailed notes.

Credit: Curious expedition

In one of her notes, she described an algorithm for the analytical engine to calculate the Bernoulli numbers. Since that algorithm is considered the first written specifically for implementation on a computer, she has been cited as the first computer programmer.
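
Purely as a modern illustration of what that means (a minimal Python sketch using a standard recurrence, not Lovelace’s actual program; the function name bernoulli is our own), the first few Bernoulli numbers can be computed like this:

    from fractions import Fraction
    from math import comb

    def bernoulli(m: int) -> Fraction:
        # Standard recurrence: sum_{k=0}^{n} C(n+1, k) * B_k = 0, with B_0 = 1.
        b = [Fraction(1)]
        for n in range(1, m + 1):
            b.append(-sum(comb(n + 1, k) * b[k] for k in range(n)) / (n + 1))
        return b[m]

    print(*(bernoulli(m) for m in range(8)))
    # 1 -1/2 1/6 0 -1/30 0 1/42 0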

Did Lovelace go on to a life of TED Talks (or whatever the Victorian equivalent was)? Sadly, no: she passed away at the age of 36, but luckily her legacy lives on.

3. The first computer “bug” was named after a real bug

While the term “bug”, in the sense of a technical error, was coined by Thomas Edison as far back as 1878, it took nearly 70 years for someone else to popularize it.

In 1947, Grace Hopper, a computer scientist and later a rear admiral in the US Navy, recorded the first computer “bug” in her logbook while working on a Mark II computer.

A moth was found stuck in a relay, interfering with the machine’s operation. Before being taped into her logbook, the moth had been “debugged” from the system.

In the log entry, she wrote: “First actual case of bug being found.”

Credit: Computer History Museum

4. The first digital computer game never made any money

What is considered the ancestor of today’s action video games, and the first digital computer game, was not particularly successful.

In 1962, it took Steve Russell, a computer programmer at the Massachusetts Institute of Technology (MIT), and his team 200 man-hours to create the first version of Spacewar!

Using the front-panel test switches, the game allowed two players to take control of two tiny spaceships. Your mission was to destroy your opponent’s spaceship before they destroyed yours.

As well as dodging your opponent’s shots, you also had to avoid the little white dot in the center of the screen, which represented a star. If you ran into it, boom! You lost the battle.

Russell wrote Spacewar! on a PDP-1, an early interactive minicomputer from Digital Equipment Corporation (DEC) that used a CRT display and a keyboard. Significant improvements were added later in the spring of 1962 by Peter Samson, Dan Edwards and Martin Graetz.

Credit: Computer History Museum
Dan Edwards (left) and Peter Samson (right) playing Spacewar! on the PDP-1

Although the game was a big hit on the MIT campus, Russell and his team never made any money from it. They never copyrighted it; they were hackers who had built it to show off to their friends, so they shared the code with everyone who requested it.

5. The computer virus was originally designed without any harmful intent

In 1983, Fred Cohen, better known as the inventor of defense techniques against computer viruses, designed a parasitic application that could “infect” computers. He called it a computer virus.

This virus could take hold of a computer, make copies of itself, and spread from machine to machine via a floppy disk. The virus itself was benign and created only to prove it was possible.

He later created a benign virus he called the compression virus. It was designed to find uninfected executables, compress them with the user’s permission, and attach itself to them.

6. You are more likely to be killed by wolves than to have a SHA-1 collision in Git

Git, a popular distributed revision control system, uses the Secure Hash Algorithm 1 (SHA-1) to identify revisions and to detect data corruption or tampering.

In version control, a commit makes a set of pending changes permanent. One of the ways Git lets its users specify a commit is by using a short SHA-1.
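
As a rough illustration of where those SHA-1 values come from (a minimal Python sketch of the idea, not Git’s actual implementation; the helper name git_object_id is made up for this example), Git hashes an object’s type, size and content, and the familiar short form is just a prefix of the full 40-character digest:

    import hashlib

    def git_object_id(obj_type: str, content: bytes) -> str:
        # Git hashes a header of the form "<type> <size>\0" followed by the raw content.
        header = f"{obj_type} {len(content)}".encode() + b"\0"
        return hashlib.sha1(header + content).hexdigest()

    full_id = git_object_id("blob", b"hello world\n")
    short_id = full_id[:7]  # the kind of abbreviation `git log --oneline` shows
    print(full_id)
    print(short_id)

Any prefix of at least four characters will do, as long as it is unambiguous within the repository.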

In its note on SHA-1, the Git documentation acknowledges that many people worry they will at some point have two objects in their repository that hash to the same SHA-1 value. This is what is called a SHA-1 collision.
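
For a sense of scale, a back-of-the-envelope birthday-problem estimate (just an approximation we sketch here, not something Git itself computes) shows how small that probability is, even for an enormous repository:

    HASH_BITS = 160  # SHA-1 digests are 160 bits long

    def collision_probability(num_objects: int) -> float:
        # Birthday-problem approximation: p ≈ n(n-1) / 2^(bits+1), valid for small p.
        return num_objects * (num_objects - 1) / 2 ** (HASH_BITS + 1)

    # Even a repository with a billion objects is astronomically safe.
    print(collision_probability(10 ** 9))  # roughly 3.4e-31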

Below is Git’s note regarding the SHA-1 collision:

Credit: Git

7. If computer programming were a country, it would be the third most diverse in languages spoken

Papua New Guinea has approximately 836 indigenous languages, which makes it the number one country in terms of linguistic diversity. Second on the list is Indonesia, with more than 700, followed by Nigeria, with over 500 indigenous languages.

All the notable programming languages known to man, in both current and historical use, add up to 698. If it were a country, computer programming would be in the bronze medal position. We don’t recommend that you try to learn them all.

8. An image from Playboy magazine is the most widely used test image for all kinds of image processing algorithms

The image of Lena Söderberg has been a standard test image in the field of image processing since 1973. It was cropped from the centerfold of the November 1972 issue of Playboy magazine.

Credit: Dwight Hooker / Playboy Magazine

David Munson, Editor-in-Chief of the Institute of Electrical and Electronics Engineers (IEEE) Transactions on Image Processing, gave two reasons why the image is so popular:

a. It has a good mix of details, flat areas, shading, and textures that do a good job of testing the capabilities of any image processing software.

b. The Lena image is that of an attractive woman, so it is not surprising that the (predominantly male) image processing research community gravitated toward it.

Want to know another amazing fact?
Right now, you can get three computer programming related packages at incredibly discounted prices on TNW Deals:

1. The Raspberry Pi 2 Complete Starter Kit (85% off)
2. The complete Learn to Code 2015 pack (94% off)
3. Python programming bootcamp (96% off)


