
Top 25 days in computing history


domaingenius


The path to modern-day computing is longer than many suspect, and strewn with interesting nuggets of information. These include:

- the inventor of e-mail can't remember when he got it working
- Pac-Man was modelled on a pizza and called Puck-Man until vandals forced a name change
- the first hard drive had a 5MB capacity and could only be moved by a fork-lift truck
- in 1980, The Times reported with wonder that a word processor could be bought for £3,500
- Deep Blue's chess victory over Garry Kasparov was described as a 'psychological triumph'.

For full details of these milestones, and other gems from the Times Archive, read on...

December 23, 1834: Charles Babbage announces the analytical engine

[Image: the difference engine assembled by the Science Museum from Babbage's designs]

Babbage had started work on a difference engine in 1821, believing that a mechanical calculating device could produce mathematical tables far more quickly and accurately than human mathematicians. During 1834, with his first engine still incomplete, he came up with the idea of an even more ambitious machine: the analytical engine, which could be programmed with a variety of calculations. According to the Science Museum in London:

The designs for the Analytical Engine include almost all the essential logical features of a modern electronic digital computer. The engine was programmable using punched cards. It had a ‘store’ where numbers and intermediate results could be held and a separate ‘mill’ where the arithmetic processing was performed. The separation of the ‘store’ (memory) and ‘mill’ (central processor) is a fundamental feature of the internal organisation of modern computers.
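The 'store' and 'mill' map directly onto the memory and processor of today's machines. As a loose illustration only (a toy sketch in modern Python, not a model of Babbage's actual mechanism), a program held as data can drive a 'mill' that reads operands from a 'store' and writes results back:

```python
# Toy illustration of Babbage's architecture: a separate 'store' (memory)
# and 'mill' (arithmetic unit), driven by a program held as data.
# A sketch for intuition, not a reconstruction of the real engine.
class ToyEngine:
    def __init__(self, size=8):
        self.store = [0] * size                 # the 'store': numbers and intermediate results

    def mill(self, op, a, b):                   # the 'mill': where arithmetic happens
        return a + b if op == "add" else a * b

    def run(self, program):
        for op, src1, src2, dest in program:    # each instruction: read, compute, write back
            self.store[dest] = self.mill(op, self.store[src1], self.store[src2])

engine = ToyEngine()
engine.store[0], engine.store[1] = 6, 7
engine.run([("mul", 0, 1, 2)])                  # the program is data, like a punched card
print(engine.store[2])                          # 42
```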

Babbage continued working on his designs until his death in 1871, but the limitations of Victorian engineering and his own awkward personality prevented him from building any of them. The Science Museum successfully assembled a difference engine from Babbage’s designs in November 1991 (pictured above).

- From the Times Archive, 1870: “It is difficult, perhaps, to make the nature of such abstruse inventions at all clear to the popular and untechnical reader” - The life and times of Mr Charles Babbage


October 22, 1925: The transistor is patented

Although Julius Edgar Lilienfeld never put his design into practice, the device he patented in 1925 was the forefather of each of the trillions of transistors in circulation today. Development was slow, but in the 1950s transistors began to replace the vacuum tubes of early computers, leading to smaller and more reliable devices.

- From the Times Archive, 1954: “The use of transistors in the ordinary radio set is probably still far off.”

January 1, 1939: Hewlett-Packard is founded, giving birth to Silicon Valley

Bill Hewlett and Dave Packard, friends from Stanford University, decided to go into business together in the late 1930s. In 1938, they started making oscillators, used to test sound equipment, in a small shed behind the house at 367 Addison Avenue, Palo Alto, now an unlikely destination for a steady stream of tech-minded pilgrims seeking out the birthplace of Silicon Valley. The oscillator was a success, leading Hewlett and Packard to formalise their partnership at the beginning of 1939, deciding the order of their names with the toss of a coin. The company grew quickly, introducing new products and an open style of management that came to characterise the tech companies that followed it to the leafy strip of cities south of San Francisco. HP is now the world’s biggest maker of desktop computers.

[Image: an HP advert from 1970]


November 25, 1943: Colossus comes to life

When Alan Turing set out his plans for a universal computing machine, he was told that it would have to be bigger than St Paul’s Cathedral and could never be built. British determination to crack German codes during the Second World War provided Turing with the opportunity and support to get his designs up and running. At first, relatively primitive machines called bombes were used to unpick the German Enigma codes, running through combinations of cipher keys more quickly than the Germans believed possible. When the Mark I Colossus clattered into life at the end of 1943, it accelerated the process still further. Its contribution, which prefigured the vast role that information technology would come to play in warfare, was deemed to have shortened the war by two years.

Of less immediate interest during the dark days of the Second World War was the historical significance of the machine itself: Turing and his team had succeeded where Babbage had failed, building a computer that was both automatic and programmable. Ten Colossus machines had been built by the end of the war, but none survived. Churchill ordered them all destroyed and their blueprints burnt to preserve the secrets of Bletchley Park.

- From the Times Archive, 1977: “It is a relief to those who worked at Bletchley Park during the war to learn from your report in The Times of October 13 that the veil of enforced secrecy is being lifted.” - A letter responding to reports that documents relating to the codebreaking operation at Bletchley Park will be published in 1977


February 14, 1946: ENIAC is unveiled


ENIAC, or the Electronic Numerical Integrator and Computer, also emerged from the Second World War. In 1943, John Mauchly and J Presper Eckert began work on a machine designed to speed up the production of firing tables for the US Army. The room-sized computer that they built had 18,000 vacuum tubes compared with Colossus’s 1,500, and was a much more flexible and powerful machine. According to the ENIAC Museum, information processed by the computer contributed to weather prediction, atomic energy calculations, studies of cosmic rays and random numbers, and other scientific projects. Unlike the secret Colossus, the technical achievement of ENIAC was publicly recognised, according to The First Computers, by Raúl Rojas and Ulf Hashagen:

ENIAC captured the imagination of the public, not only because of its sheer size, but, more importantly, because of its lightning speed. Addition (or subtraction) of two 10-digit numbers was accomplished at an unprecedented rate of 5,000 per second. This was about 1,000 times faster than any other computing machine was capable of up to that point, with similar accuracy.

When ENIAC was turned off for the last time in 1955, it was estimated to have performed more calculations in a decade than the whole human race had managed before it.
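Taking the quoted figure at face value, a rough back-of-envelope calculation (assuming, unrealistically, round-the-clock operation) shows the scale involved:

```python
# Back-of-envelope only: assumes non-stop running, which ENIAC never achieved.
rate = 5_000                       # additions per second, per the quote above
decade = 10 * 365 * 24 * 60 * 60   # seconds in ten years
print(f"{rate * decade:.2e}")      # ~1.58e+12: on the order of a trillion additions
```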

- From the Times Archive, 1946: “In the field of memory alone, it seemed likely that man was to be provided with vastly greater and speedier access to the inherited knowledge of the ages than he was able to command at present.” - A report on ENIAC and other early computers

December 1954: Casio’s prototype desktop calculator

The Model 14-A was described as compact despite being more than a metre wide and weighing 140kg. What was revolutionary about it was its all-electronic operation, which distinguished it from mechanical predecessors that traced their roots back to the abacus. The 14-A kicked off a round of febrile innovation that became known as the calculator wars, during which companies competed to bring out cheaper and lighter models. Within 15 years, handheld calculators were disposable consumer goods.


September 4, 1956: The launch of the IBM 305 RAMAC

The first hard drive arrived in the 305 RAMAC, meaning that the computer could store data digitally. Not much data, though – the one-ton machine, seen in an archive photograph being loaded into an aircraft on a forklift truck, could store a little less than 5MB of information. Thumb-sized USB drives will now store more than a thousand times that quantity of data.

- From the Times Archive, 1958: IBM announces the RAMAC’s first visit to the UK


October 29, 1969: The dawning of the internet era

Who invented the internet and when remains the subject of lively debate, in part because separate strands of research contributed to the ability to connect computers in a network. One breakthrough among many came in 1969, when researchers at the US military’s Advanced Research Projects Agency used telephone lines to connect two computers, one in Los Angeles and the other at the Stanford Research Institute, near San Francisco. The experiment, which laid the foundations of ARPANET and then the internet, worked at the second attempt. The first log-in crashed the system, according to the Computer History Museum in California.

November-December 1971: The first e-mail is sent

Like the internet and many other areas of computing history, the origin of e-mail is the subject of doubts and competing claims. It all depends on what you mean by e-mail. The first message to travel between two computers appears to have been sent by Ray Tomlinson, an American computer programmer, in late 1971. He can’t remember the exact date, and at first said it may have been in 1972 – the later date is still often given for the invention of e-mail. The precise details of the first message sent are also lost to history. Seemingly a man of immense modesty, Mr Tomlinson says on his website that no one really cared about the origins of e-mail until the mid Nineties, by which time he had forgotten many of the details and was rather alarmed by people’s hunger for information:

The seven years since have been curious, indeed. The zeal of humans to discover the origins of things that affect their lives is almost frightening. One reporter even asked what I ate for supper the evening on which I successfully sent the first network email. Can you remember what you ate in 1971? I certainly can't.

Contrary to some reports, he did not invent the @ sign, although he was the first to use it as a way of addressing messages.
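The convention he settled on is still how every e-mail address is read: the part before the @ names the user, the part after names the machine. A minimal illustration in modern Python (the address itself is invented):

```python
# Tomlinson's convention, unchanged since 1971: user@host.
# The address below is made up for illustration.
address = "ray@bbn-tenexa"
user, _, host = address.partition("@")
print(user)   # 'ray': the mailbox on the destination machine
print(host)   # 'bbn-tenexa': the machine the message should travel to
```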

April 16, 1977: Apple II heralds the age of the home computer


Steve Jobs and Steve Wozniak had built the first Apple computer a year earlier, but the hand-built Apple I only ever sold in tiny numbers. Its successor, with a moulded plastic case, built-in keyboard and expansion slots, moved Apple into the computing mainstream. According to Paul E Ceruzzi in A History of Modern Computing:

Jobs and Wozniak … did not invent the personal computer, as the legend often goes. But the Apple II came closest to Stewart Brand’s prediction that computers would not only come to the people, they would be embraced by the people as a friendly, nonthreatening piece of technology that could enrich their personal lives.

Many customers were attracted by VisiCalc, a spreadsheet application that was available only with the Apple II, and which broadened the computer’s appeal to businesses as well as homes and schools. The Apple II was updated several times during the Eighties and total sales for the model range topped five million.

- From the Times Archive, 1980: “A word-processing system is now available at under £3,500.” That’s more than £10,000 once inflation is accounted for - Mighty micro and the small business

May 22, 1980: The birth of Pac-Man

The world’s most famous computer game began life as Puck-Man, a Japanese arcade game created by Namco (now Namco Bandai). Worrying that “Puck” would prove too tempting to teenage vandals (a P is easily altered to an F), the company changed the name to Pac-Man for the English-speaking world. It proved immensely popular in the US, spreading to every computer platform that would support it and appealing to a broader demographic than previous arcade games. Toru Iwatani, the inventor of the game, described his moment of inspiration in Programmers at Work, by Susan Lammers:

The story I like to tell about the origin of Pac-Man is that one lunchtime I was quite hungry and I ordered a whole pizza. I helped myself to a wedge and what was left was the idea for the Pac-Man shape.

April 3, 1981: The first portable computer

It looked more like a suitcase than a modern laptop, weighed 10kg and had only a five-inch screen, but the Osborne 1 has a strong claim to be the first portable computer. As such, it represents an early acknowledgement that computers would not be tied to homes, schools and offices – and the first step towards ubiquitous computing.


August 12, 1981: IBM launches the “PC”

IBM’s entrant into the home computing market was not technologically advanced, but it nevertheless became the template for the vast majority of early computers and lent its name to an entire class of product. This was partly because the IBM PC was not a closed system – it offered the potential for future add-ons and upgrades – but mainly because the openness of the system allowed competitors to produce cut-price clones of the original PC. The IBM PC and its clones so dominated the market that most new products opted for IBM compatibility.

This was also the first time that an IBM computer had incorporated other companies’ technology: the processor came from Intel and the operating system from a small, West Coast software company called Microsoft. Two years later, Microsoft was claiming victory “in the war to become the de facto standard operating system for the new 16-bit generation of microcomputers.”

December 1, 1981: The BBC Microcomputer

An unapologetically Anglocentric one, this. Little known outside the UK, the BBC Micro gave most British children of the Eighties their first taste of computing. It was built by Acorn, which went on to make a temporarily successful line of home computers, to complement a BBC series on computer literacy. Cheap, robust and advanced for its day, the Micro became the mainstay of computing education for most of the Eighties. The BBC planned for an initial run of 12,000 computers, but more than 1.5 million were eventually sold.

Click here for a profile of one of the designers

January 24, 1984: Apple Mac popularises the graphical user interface

Apple didn’t invent the graphical user interface – Xerox got there first – but its Macintosh range of computers demonstrated that this was the way people wanted to interact with computers. The Mac came with a mouse, which was used to control the computer via icons and menus, rather than with typed commands. A similar system had been used with Apple’s more expensive Lisa model, but that computer’s high price and sluggish performance limited its appeal.

The first Mac was announced to the world during the 1984 Super Bowl, with an advert designed to contrast Apple with the monolithic IBM (a theme to which it would return with Microsoft as the bogeyman in its Mac versus PC campaign).

- The Times explains the mouse in 1983: “The mouse is a palm-sized box which the user pushes around his desk or flat surface beside the micro.” - A mouse to get you out of a hole

November 13, 1990: Tim Berners-Lee writes the first web page

Having submitted a detailed proposal for the world wide web the day before, Tim Berners-Lee wasted no time in getting down to business and creating page one. Now often used as a synonym for the internet, the web more properly refers to the system by which information is stored, viewed and transferred over the infrastructure of the internet. It grew out of an information management system called ENQUIRE, built by Mr Berners-Lee and his colleagues at the European Organisation for Nuclear Research in Switzerland in 1980. As with so many other strands of computing history, the lead time between initial inspiration and widespread adoption was measured in decades.
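That division of labour – documents stored on one machine, transferred over the internet, viewed on another – can still be seen in miniature today. A short Python sketch (it needs a network connection; info.cern.ch now hosts CERN’s restoration of the first website):

```python
# The web in miniature: fetch a stored document over the internet.
# info.cern.ch hosts CERN's restoration of the first website.
from urllib.request import urlopen

with urlopen("http://info.cern.ch") as response:
    html = response.read().decode("utf-8", errors="replace")

print(html[:200])   # the raw HTML that a browser would render as a page
```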

March 14, 1993: Mosaic opens up the web

Mosaic was the breakthrough browser that liberated the web from purely academic circles and introduced it to the wider world. Versions for Windows PCs and Apple Macs broadened the reach of the web still further. According to The Internet: A Historical Encyclopedia:

Mosaic introduced the concepts of a bookmark and a window history, both of which allowed users to navigate through the web pages they had already visited. Perhaps the most important feature of Mosaic, however, was the ‘image’ tag, which allowed image and text to appear on the page at the same time. While these features are commonplace today, to users of the Web circa 1993, they were extraordinary.
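The tag the encyclopedia describes survives in essentially the same form today. As a small illustration (the file names and wording are invented), a short Python script can generate the kind of mixed text-and-image page that Mosaic made possible:

```python
# A minimal page of the kind Mosaic could render: text and an image together.
# File names and wording are invented for illustration.
page = """<html>
  <body>
    <p>Some text, with a picture on the same page:</p>
    <img src="photo.gif">
  </body>
</html>"""

with open("page.html", "w") as f:
    f.write(page)   # open page.html in any browser to see text and image combined
```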

Mosaic was succeeded by Netscape Navigator, which was eventually defeated by Microsoft’s Internet Explorer during the browser wars of the late Nineties.

March 16, 1995: The first Wiki is announced

With the infrastructure of the web in place, the process of populating it with information could begin. An early indication of the democratic, collaborative approach that would characterise internet culture came in 1995, when Ward Cunningham set up a database that could be edited and enlarged by anyone with access to it.

On March 16, he e-mailed a colleague inviting him to be among the first to visit what he called the WikiWikiWeb. Mr Cunningham named it after the Hawaiian word for ‘quick’ because he hoped that the system would speed up the transfer of ideas between programmers. Six years later, Wikipedia applied the concept to a grander ambition, and today the site contains more than ten million articles.
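The underlying idea is small enough to sketch in a few lines of modern Python (names and functions are illustrative only, not Cunningham’s actual code): one shared store of pages, where reading, creating and revising are open to everyone.

```python
# A minimal sketch of the wiki idea: one shared, editable store of pages.
# Function and page names are illustrative, not Cunningham's actual code.
pages = {}  # title -> text, open to every visitor

def edit(title, text):
    pages[title] = text          # creating and revising are the same operation

def view(title):
    return pages.get(title, "This page does not exist yet. Edit it!")

edit("WikiWikiWeb", "A database that can be edited and enlarged by anyone.")
print(view("WikiWikiWeb"))
print(view("PatternLanguage"))   # a missing page is an invitation to write it
```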

July 4, 1996: Hotmail arrives

For many people outside the tech world, Hotmail was the first clue that the internet was going to be very, very useful. A free, web-based mail system that allowed instant contact with anyone who could access a computer, wherever in the world he or she may be, was enough to win over the “it’ll never catch on” brigade. The Business Plan Workbook by Barrow, Barrow and Brown describes in fascinating detail how Sabeer Bhatia and Jack Smith hid their idea from venture capitalists until they had established enough trust to know that they wouldn’t be ripped off, and then walked away from offers that didn’t meet their expectations. It was a bold move from a pair of business novices, as Mr Bhatia remembers:

Only in Silicon Valley could two 27-year-old guys get $300,000 from men they had just met. Two 27-year-old guys with no experience with consumer products, who had never started a company, who had never managed anybody, who had no experience even in software – Jack and I were hardware engineers.

May 11, 1997: Machine takes on man, and wins


Garry Kasparov, the Russian grandmaster, had previously beaten IBM’s supercomputer, Deep Blue, but something went wrong for humankind during the rematch. Mr Kasparov was not graceful in defeat, accusing Deep Blue of cheating. IBM denied that the computer had benefited from human assistance. After the match, Raymond Keene, the Times Chess Correspondent, commented on the unexpected dynamic of the contest:

It was not so much the machine's strength which defeated Kasparov, as the knowledge that it was a tireless opponent which would exploit any error which he did make. This led Kasparov to try uncharacteristic openings, of which he did not have a perfect grasp. Paradoxically, IBM's victory was more of a psychological triumph than a technical one.

November 18, 1997: Wi-Fi standards laid down

The ugly name of IEEE 802.11 heralded the next evolution of the internet: wireless networks that would accelerate the spread of the web and raise the prospect of a truly universal connection.

September 7, 1998: Google founded


Larry Page and Sergey Brin were surprised to find that google.com was an unclaimed domain name, after they’d come up with the idea of naming their search engine after the impossibly large-sounding number, 10 to the power of 100. The next day they discovered that they’d been thinking of googol, and that googol.com was taken.
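The number itself is easy to demonstrate, if not to imagine:

```python
# A googol: 10 to the power of 100.
googol = 10 ** 100
print(len(str(googol)))   # 101 digits: a 1 followed by a hundred zeros
```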

Within a few years Google had become the front door to the web for millions of people who knew no other way to find what they wanted, as well as a gatekeeper that could make or break a company with its search rankings. Google Earth, Google Maps and Google Books demonstrated the company’s ambition to move beyond web search, and its capacity to combine playfulness with unrivalled functionality.


June 1, 1999: Shawn Fanning releases Napster

Computing and the internet have had a profound effect on almost every business, but few have been affected more dramatically than the music industry. The digital revolution kicked off with Shawn Fanning sending 30 of his friends a program that allowed them to swap their music online. A year later, more than 25 million people had downloaded Napster and a year after that, the site was bogged down in legal action as record labels tried to stop their former customers haemorrhaging to the site. Napster was soon offline, but it spawned a crop of decentralised, peer-to-peer file-sharing networks that proved more resistant to lawsuits.
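Napster’s design helps to explain both its explosive growth and its legal vulnerability: a central server kept an index of who had which songs, while the files themselves travelled directly between users’ machines. A much-simplified sketch of that lookup in Python (peer and file names are invented):

```python
# Napster-style lookup, much simplified: a central index maps titles to the
# peers sharing them; the actual file transfer is peer-to-peer.
# Peer and file names are invented for illustration.
index = {}  # title -> set of peers currently sharing it

def share(peer, titles):
    for title in titles:
        index.setdefault(title, set()).add(peer)

def search(title):
    return index.get(title, set())   # ask the central server...

share("peer-a", ["song1.mp3"])
share("peer-b", ["song1.mp3", "song2.mp3"])
print(search("song1.mp3"))           # ...then download from any listed peer
```

Shut down the central index and the whole network goes dark, which is why the decentralised successors mentioned above, with no single server to sue, proved so much harder to stop.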

The record labels have kept up legal efforts to quell file-sharing, trying to force internet service providers to hand over the names of the most prolific offenders, but they have also been forced into the digital market themselves. This year, almost a decade after Napster shot a gaping hole in their business model, the labels have started to experiment with alternative ways of making money from their artists. Mobile phone subscription deals, in which customers pay a nominal monthly fee for access to the music industry’s back catalogue, are emerging as the favoured model, but many of the details remain unresolved. The music industry’s battle for survival is far from over.

January 1, 2000: The world continues

The millennium bug fails to bring down computer networks and Western civilisation.

February 15, 2005: YouTube comes online

Part of the web 2.0 wave of sites built around community participation, YouTube has demonstrated the surprising public appetite for grainy amateur video, and the less surprising demand for illicitly uploaded TV shows. Networks were initially hostile to the video-sharing site, but most media companies have now stopped complaining about copyright infringement and started seeing the site as a promotional tool. Professional programme-makers are now producing videos specifically for YouTube, while some amateurs have been elevated to the ranks of the semi-professional by their devoted fans. Like many web 2.0 sites, it may struggle to make money for its corporate overlords (in this case, Google), but YouTube is in the process of creating a new medium of entertainment.

July 11, 2008: Apple launches the iPhone App Store

The first-generation iPhone set new standards for phone usability and desirability, but the second version of the handset marked a greater leap forward for computing. The App Store – an area of iTunes devoted to iPhone programs created by independent developers but approved by Apple – turned the iPhone into a flexible computing platform with the potential to grow with the imagination of its users.

The iPhone also represents a culmination of the various strands of development outlined here, with computers becoming progressively smaller, cheaper, more powerful and capable of an ever-increasing range of tasks. Machines that began as inflexible calculating tools have moved steadily into broader business applications, and then communication and entertainment. They’ve spread from large enterprises to schools, homes, briefcases and pockets, and as they’ve got smaller, they’ve also got cooler.

The first computers were large industrial machines; today’s are disposable consumer goods, driven as much by the demands of fashion as by technical improvements. The iPhone is most certainly a fashion item, but it is also a complex computer, telephone, media player and web browser, which marks the pinnacle of the movement towards computing machines that have become progressively less intimidating.

Punch cards gave way to floppy disks and CD-ROMs, which are themselves on the way to obsolescence as more and more data is delivered via the internet. Even the keyboard may be dying out, at least in the form that we know it. The touchscreen interface and gesture-based control of the iPhone is likely to find its way into desktop computers in the next few years. Even Bill Gates says so…
 