50 years of business computing: 1978-2028

Steve Jobs unveiled Apple’s iMac in 1998.

Image: Apple

Another year in computing is drawing to a close, and with it we’re reminded of major information technology events that happened throughout the past few decades. Join us for this trip down memory lane—with assistance from the Computer History Museum’s timeline and the ComputerHope.com timeline—and let’s also consider where our field may be in a decade.

SEE: Photos: 20 most important tech inventions of the past 100 years (TechRepublic)

Computing in 1978 (this reporter was 4 years old) was starting to revolutionize the desktop. Apple, Commodore, and Tandy/Radio Shack were already selling successful microcomputers to business and home users alike. But in 1978 Dan Bricklin and Bob Frankston began building VisiCalc, the spreadsheet widely regarded as the first “killer app”: serious business software, no mainframe required. It changed the lives of everyone from corporate executives to small-business owners.

Plenty of other milestones happened in 1978: the introduction of 5.25-inch disks and drives, ending the era of 8-inch drives that outweighed their host computers; the formation of the Hayes modem company; TCP/IP taking its modern form around the same time the first bulletin board systems went online; MicroPro’s WordStar making word processing available to the masses; and Digital Equipment Corp. launching its VAX series of minicomputers, giving IBM a serious, less expensive competitor.

SEE: Apple’s first employee: The remarkable odyssey of Bill Fernandez (cover story PDF) (TechRepublic)

In 1988, Steve Jobs (having been forced out of Apple) formed NeXT. One of its computers, a simple black box nicknamed the Cube, wound up being used by Tim Berners-Lee to develop a piece of software he called WorldWideWeb. (Later, NeXT’s operating system, NeXTSTEP, became the basis for Mac OS X when Apple acquired the company.) IBM-compatible personal computing got the Creative Labs Sound Blaster card for better audio, while academia started using Wolfram’s Mathematica software to solve new kinds of problems.

The year 1998 left us plenty of corporate memories: Google, Mozilla, PayPal, and VMware were all founded. Apple introduced the iMac and broke the curse of the beige box. Compaq acquired DEC, Congress passed the Digital Millennium Copyright Act, and Microsoft’s Internet Explorer passed Netscape Navigator in browser market share. This reporter got his first job in technology journalism, covering the nascent field of computer-telephony integration along with a new fad called voice over IP.

SEE: How the ‘PayPal Mafia’ redefined success in Silicon Valley (cover story PDF) (TechRepublic)

The year 2008 saw the birth of GitHub, as well as Microsoft’s release of Hyper-V. But the big headlines were consumer products: the HTC Dream became the first Android phone, Apple launched the App Store, and Google debuted the Chrome browser.

As for 2018: IBM agreed to acquire Red Hat, Linus Torvalds decided it was time to be nice to people, Meltdown and Spectre arrived, the EU’s General Data Protection Regulation (GDPR) took effect, US net neutrality rules were repealed, and solid-state drives reached multi-terabyte capacities. GitHub weathered the largest DDoS attack on record, Microsoft released its first custom Linux kernel, and then Microsoft bought GitHub. Pretty much every enterprise IT company started talking about artificial intelligence, blockchain, containers, and non-volatile memory.

SEE: How we learned to talk to computers, and how they learned to answer back (cover story PDF) (TechRepublic)

What’s ahead for 2028? Predictions are hard to make, but anyone can see what is already in development. Biological computing lets processors evolve rather than be engineered, and quantum computing expands a bit’s possible states beyond 0 and 1 (on and off) into superpositions, states best described as “maybe” or “both.” What if a processor ran artificial intelligence software in quantum states atop biological hardware? Would such a machine be a machine anymore? Would it be unhackable, out of our control, or simply not work at all? Will Moore’s Law continue? We may essentially be entering an era of retina-level computing, where machines advance faster than the human brain can keep up. And yet, people will still pay too much for incremental improvements to Apple products.
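For readers who want a concrete picture of “maybe” and “both,” here is a minimal Python sketch, purely a classical simulation rather than real quantum hardware, of a single qubit held in an equal superposition. Any one measurement still yields a plain 0 or 1; only the statistics over many runs reveal the underlying state.

    import math
    import random

    # A classical bit is 0 or 1. A qubit's state is a pair of amplitudes
    # (alpha, beta) with alpha^2 + beta^2 = 1; measurement returns 0 with
    # probability alpha^2 and 1 with probability beta^2.
    def measure(alpha, beta):
        assert abs(alpha**2 + beta**2 - 1.0) < 1e-9, "state must be normalized"
        return 0 if random.random() < alpha**2 else 1

    # The "both" state: an equal superposition of 0 and 1.
    alpha = beta = 1 / math.sqrt(2)

    # 1,000 measurements split roughly 50/50, which is how the
    # superposition shows itself.
    results = [measure(alpha, beta) for _ in range(1000)]
    print("zeros:", results.count(0), "ones:", results.count(1))

This toy uses only real amplitudes; actual qubits carry complex amplitudes and relative phases, which is where quantum algorithms get their power.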
