By Roy Mathur, on 2012-02-27, for The Independent Daily, Mauritius (in which an edited version appeared)
Every week I talk to you about technological goodies like smartphones, games consoles and other gadgets, but where did it all start? Well, this week I'm taking you back in time, right to the very beginning of human history and then far into the future. I'm going to attempt, in this short column, to present you with probably the shortest history of computing ever.
Number-remembering systems and calendars began to appear many thousands of years ago. Simple devices ranged in size from scratchings on bone and cave walls to giant structures like Stonehenge in England.
These were later replaced by more efficient manual calculators like the abacus, though not all these early machines were so simple. The immensely complex Antikythera Mechanism, built in Greece around 100 BC, shows that even the ancients were capable of amazing feats of sophisticated engineering.
Gradually, mechanical calculating machines became even more complex, culminating in the fantastic, though never completed, Difference Engine of Charles Babbage in the early 19th century.
World War II saw another boost to computing power, especially with the building of code-breaking machines. The earliest of these were electro-mechanical devices built from electrical relays, while the later Colossus, constructed at Bletchley Park in England, used valves and was one of the first electronic computers.
Other notable events from this period included the construction of the military ENIAC computer in the USA. The computing sense of the word "bug" was also popularised around this time, when US naval officer Grace Hopper's team found a moth stuck inside the Harvard Mark II computer.
With the adoption of transistors from the mid-1950s onwards, computers became much smaller. Instead of filling an entire room, they could now fit comfortably inside one. Compact machines like IBM's 1401 became very popular.
The invention of the microchip in the 1960s heralded a huge explosion in the spread of computers and even further miniaturisation. More importantly for us, these small pieces of etched silicon circuitry enabled companies like Apple and IBM to produce more affordable personal computers, which started to enter our homes in the 1970s and 80s.
The internet as we know it today is really a massive collection of connected computer networks, starting with the American military's ARPANET in the 1960s. By the 1970s many other American networks, including NASA's, had joined. By the mid-1980s, the TCP/IP network protocols and the Domain Name System (DNS) were established. Then, at CERN (the European particle physics laboratory) in 1990, Tim Berners-Lee created the first web browser and web server, and the rest is history.
As you can see, the modern computer age is still in its infancy. There are many exciting times ahead of us: truly intelligent robots, living biological computers, and much faster and more powerful machines are only a small taste of what we can look forward to. In fact, those born in the 20th century should really give themselves a pat on the back and puff out their chests with pride when they realise that they were born in a century that saw computers, spaceflight and the internet become commonplace.
And finally, if you think all this talk of computing and the internet is far removed from our everyday lives out here in the middle of the Indian Ocean, you would be wrong. Even our small country has a very important part to play: AfriNIC (the African Network Information Centre) is based at Cyber City. AfriNIC is the regional internet registry for Africa, the organisation that allocates internet number resources, such as IP addresses, for the continent.
So there you have it, the history of computers and the internet in one easy-to-digest chunk. Of course, I had to leave out a lot, but I hope that your interest has been piqued.
Welcome to the future.
Sources: NASA, Neatorama, W3.org, Wikipedia, Zakon.org