The History Of Computers During My Lifetime – The 1990’s
by Jason Patterson
(jason@pattosoft.com.au)
The IBM RS/6000
In the early 1980’s IBM had been right at the forefront of RISC research with the IBM 801 project led by John Cocke, but it took IBM a long time to produce a successful RISC product. IBM’s first commercial RISC, the RT PC, was a flop, but IBM kept trying. Its second commercial RISC product, the RISC System/6000 introduced in 1990, was a good one. The RS/6000 workstation’s multi chip POWER1 processor was the first superscalar RISC processor, and racked up speed records in many areas. With over 100 instructions, the POWER architecture barely qualified as a RISC (it even included string operations), but its price was very competitive. IBM used its influence to push for third party software support, and as a result many CAD and scientific applications were ported to RS/6000 workstations running AIX, IBM’s version of UNIX.
Microsoft Windows 3
As its name implies, Windows 3 was not the first release of Microsoft’s Windows graphical user interface for PCs. Windows had originally been released in 1985. However, in the past Windows had looked ugly, run slowly and attracted very little support from third party software developers.
Windows 3 was different. It was still 16-bit, but the user interface was completely revamped to mimic the look and feel of IBM’s OS/2 Presentation Manager, with its 3D sculpted buttons. The 640 Kb memory limit was broken (sort of), resulting in better performance and finally giving PCs the chance to run large graphical applications. Multiple programs could be run simultaneously, and although this wasn’t true preemptive multitasking it was a big step forward. Virtual memory was also provided.
Most important of all, however, was that at its big launch in May 1990, Microsoft was able to parade an impressive lineup of major software vendors with applications which ran under Windows 3. Among these were versions of the Microsoft Word word processor and Microsoft Excel spreadsheet, which went on to dominate the personal word processing and spreadsheet markets on both Microsoft Windows and the Apple Macintosh.
The bottom line was that a PC running Windows 3 was now almost as easy to use as an Apple Macintosh. Because of this, Windows swept through the PC world like wildfire, and within a year nearly everyone was running it on their PCs. In 1992, version 3.1 was released, which added TrueType fonts and provided better stability, further narrowing the gap between Windows and the Mac. This was followed by Windows 3.11 (marketed as Windows for Workgroups), which added networking support and closed the gap even more.
Apple Sues Microsoft
Apple actually began court proceedings against Microsoft in 1988, when Microsoft released Windows 2. However, it wasn’t until Windows 3 was released, and Apple immediately expanded its claim to include Windows 3, that the media began to pay major attention to this case. In essence, Apple was arguing that Microsoft Windows breached copyright by being too similar to the Macintosh user interface.
The case ended up taking many years and going through several appeals. The final decision, which came when Apple’s last appeal was rejected in early 1995, was that copyright had not been breached. Some people saw this as a good decision because it promoted competition, while others saw it as a terrible decision because it reduced the incentive to develop innovative new technology. This fundamental question is still under debate today, and probably will be forever.
The AMD 386DX
The AMD 386 was the first successful x86 processor that wasn’t built by Intel, and as such it started an x86 processor price war. When Intel’s original 16 MHz 386 was introduced in 1985, it cost $299. Five years later, it was still commanding the relatively high price of $171, and the 33 MHz version fetched $214. AMD’s 40 MHz 386DX was released in March 1991 at $281, but within a year its price had plunged 50% to $140. The prices of PCs followed the chip prices down, and fell by as much as $1000. As a result, the market for PCs running Windows expanded by over 33%.
Apple Macintosh System 7
Although not as much of a leap as users had hoped for, the System 7 release of Apple’s Macintosh operating system in May 1991 introduced a few new features. Cooperative multitasking had been available through the optional MultiFinder of System 6, but it was made a standard feature of System 7. Virtual memory was also introduced, as was the innovative balloon help system. But the biggest improvement was the TrueType scalable outline fonts. Even though they didn’t reach Microsoft Windows until version 3.1 (1992), TrueType fonts almost single handedly created the low cost inkjet printer market. Suddenly PostScript was no longer necessary for high quality printing, ending the domination of laser printers.
Virtual Reality
In the early 1990’s, the availability of high powered 3D graphics workstations from vendors such as Silicon Graphics allowed all sorts of interesting uses for interactive 3D graphics to be developed. Computer aided design (CAD) and 3D animation for special effects were two relatively obvious uses, but a more interesting and exciting use was the concept of virtual reality (VR).
Virtual reality allowed a user to be placed within a virtual world which he or she could explore from an arbitrary point of view. Early uses of VR included walkthroughs of buildings which had not yet been constructed, simulations of environments which were too expensive or too dangerous to train in for real (e.g. outer space), and multi player virtual reality games. Other more sophisticated uses were found within existing scientific visualization fields such as chemical engineering. In a sense, virtual reality had actually been around for ages in the form of flight simulation.
To make the virtual world convincing to the user, a number of special devices were invented. The most important of these was the stereoscopic head mounted display, which tracked the position and orientation of the user’s head and displayed a different image to each eye to trick the user’s binocular vision into perceiving depth in the 3D scene. The head mounted display also fed separate audio signals to each ear to produce stereo sound. When these devices were driven by appropriate software running on a computer with fast enough 3D graphics, the user could become totally immersed in a realistic virtual environment.
To interact with the virtual environment, the user wore a dataglove on one hand, or sometimes both. This allowed the position of the user’s hands to be tracked and drawn within the virtual environment, complete with shape and gesture information. Different gestures were used to perform actions and interact with objects in the virtual world. Some lower cost VR systems used a hand held device with buttons rather than a glove, while other high end VR systems added simple voice command recognition. One extremely expensive glove device even provided a degree of tactile feedback (touch).
The cost of a head mounted display and a dataglove was very high, so low cost systems gave up on the idea of true immersive virtual reality and concentrated on giving the perception of depth to 3D scenes, usually by synchronizing the display of a normal monitor with a pair of shutter glasses which blanked out each eye in succession.
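Either way, the underlying depth trick is simple to express in code. The fragment below is a minimal sketch in C, with a tiny vector type and a hypothetical render_view() routine standing in for whatever the workstation’s real 3D library provided; the essential idea is just that the two eye cameras are offset from the tracked head position by half the interpupillary distance along the head’s “right” vector.

    /* Minimal sketch of stereoscopic view setup for a head mounted display.
     * render_view() is a hypothetical stand-in for a real 3D graphics API. */
    #include <stdio.h>

    typedef struct { double x, y, z; } vec3;

    static vec3 add(vec3 a, vec3 b)     { vec3 r = { a.x + b.x, a.y + b.y, a.z + b.z }; return r; }
    static vec3 scale(vec3 v, double s) { vec3 r = { v.x * s, v.y * s, v.z * s }; return r; }

    /* Pretend renderer: a real system would set the camera and draw the scene here. */
    static void render_view(const char *eye, vec3 pos, vec3 dir)
    {
        printf("%s eye: camera at (%.3f, %.3f, %.3f) looking along (%.1f, %.1f, %.1f)\n",
               eye, pos.x, pos.y, pos.z, dir.x, dir.y, dir.z);
    }

    int main(void)
    {
        /* These values would normally come from the head tracker every frame. */
        vec3 head_pos  = { 0.0, 1.7, 0.0 };    /* metres */
        vec3 look_dir  = { 0.0, 0.0, -1.0 };   /* direction the user is facing */
        vec3 right_dir = { 1.0, 0.0, 0.0 };    /* the head's "right" vector */
        double ipd     = 0.064;                /* typical eye separation, about 64 mm */

        /* Each eye views the scene from a slightly different position; the brain
         * fuses the two images and perceives depth. */
        vec3 left_eye  = add(head_pos, scale(right_dir, -ipd / 2.0));
        vec3 right_eye = add(head_pos, scale(right_dir,  ipd / 2.0));

        render_view("left",  left_eye,  look_dir);
        render_view("right", right_eye, look_dir);
        return 0;
    }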
The Alpha Architecture – 64-bit Arrives
Digital’s Alpha architecture, announced in 1992, was the first true 64-bit architecture. It was designed for a 15 to 25 year lifespan, and aimed to be the major replacement for Digital’s VAX architecture, which had dominated the minicomputer and server markets during the 1980’s. Alpha was a clean, pure RISC architecture designed to accommodate a 1000 fold increase in performance over its lifetime (10 fold by clock rate, 10 fold by superscalar execution, and 10 fold by multiprocessing). Learning from experience, DEC engineers and architects led by Richard Sites and Richard Witek carefully analyzed and avoided any obvious limits to future performance when they designed the Alpha. And since it was 64-bit from the beginning, kludgy 64-bit extensions were unnecessary (unlike MIPS, SPARC and the PowerPC).
At the launch of the Alpha architecture DEC also announced the first Alpha implementation, the 21064. It immediately jumped to the top of the performance table, clocking in at almost twice the performance of its competitors (for the 200 MHz version). It was a dual issue superscalar processor that contained just 1.68 million transistors, which wasn’t many compared to 3.1 million for Sun’s SuperSPARC, 2.8 million for the PowerPC 601, and 23 million for the POWER2. In many ways the Alpha 21064 was the antithesis of IBM’s POWER designs, which achieved high performance through instruction level parallelism at the expense of a large transistor count and a slower clock. The Alpha 21064 concentrated on the original RISC idea of simplicity and a higher clock rate, but that also had its drawback – very high power consumption.
One of Alpha’s major goals was to replace the VAX architecture. To make the VAX to Alpha transition easier for existing VAX customers, DEC provided a translator which converted binary VAX programs into binary Alpha programs. The resulting Alpha programs ran more slowly than if they had been recompiled for an Alpha from source code, but much faster than if they were emulated at runtime using standard interpretation. The Alpha was so much faster than the VAX that there was no performance loss, and usually a substantial performance gain, for translated programs. The same approach was also available for users of the MIPS based DECstations. Three years later, a SPARC to Alpha version of the translator was built in an attempt to encourage the large installed base of SPARCstation users to move to the Alpha platform.
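The performance argument for translation over interpretation is easy to see in miniature. The C sketch below uses an invented toy instruction set (nothing to do with the real VAX or Alpha encodings): the interpreter has to decode and dispatch every instruction each time it executes, while the “translated” routine shows what a binary translator effectively produces – the same work expressed directly as native code, with no per-instruction dispatch left at run time.

    /* Toy illustration of interpretation versus binary translation.
     * The "guest" instruction set here is invented purely for the example. */
    #include <stdio.h>

    enum { OP_LOAD_IMM, OP_ADD, OP_HALT };

    typedef struct { int op, reg, value; } insn;

    /* Interpretation: decode and dispatch every instruction, every time it runs. */
    static int interpret(const insn *prog)
    {
        int regs[4] = { 0 };
        for (int pc = 0; ; pc++) {
            insn i = prog[pc];
            switch (i.op) {
            case OP_LOAD_IMM: regs[i.reg] = i.value;        break;
            case OP_ADD:      regs[i.reg] += regs[i.value]; break;
            case OP_HALT:     return regs[0];
            }
        }
    }

    /* What a binary translator effectively emits for the same guest program:
     * straight-line native code, with no decode loop at all. */
    static int translated(void)
    {
        int r0 = 2;
        int r1 = 40;
        r0 += r1;
        return r0;
    }

    int main(void)
    {
        insn prog[] = {
            { OP_LOAD_IMM, 0, 2 },
            { OP_LOAD_IMM, 1, 40 },
            { OP_ADD,      0, 1 },    /* r0 += r1 */
            { OP_HALT,     0, 0 },
        };
        printf("interpreted: %d\n", interpret(prog));   /* 42 */
        printf("translated:  %d\n", translated());      /* 42 */
        return 0;
    }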
Linux – A Free UNIX
In 1991, a talented programmer named Linus Torvalds, inspired by a small educational version of UNIX called Minix, began writing his own UNIX-like kernel, which he called Linux. By mid 1993, Linux had completely dropped its Minix roots and was becoming quite a usable version of UNIX. It was adopted with great enthusiasm by other programmers on the Internet, and began to spread like wildfire. It soon became the fastest growing version of UNIX, mainly because it was free. Linux ran primarily on x86 based PCs, and it actually ran pretty well even on a slow 386 with 4 Mb of RAM and a 40 Mb hard disk.
Linus and his followers proudly described Linux as a “hacker’s system” (i.e. a programmer’s system) because it relied heavily on freely available software which had been written by other programmers. Its graphics system was the X Window System, which was freely available from MIT. For its GUI it used a collection of freely available window managers and other GUI components, as well as the Athena and OpenLook styles, which were also freely available. Because of its popularity, some companies even began to sell versions of Motif for Linux.
Most of the other programs which people actually used (the shells, the compilers, the utility commands etc.) came from the GNU Project – a free software project started by Richard Stallman in the 1980’s. Stallman was a talented programmer, but he was also a little unusual. He passionately believed that all software should be free and should come with source code so that other programmers could extend it, and that computing professionals should only make money through consulting. The ultimate goal of the GNU Project was to create a completely free UNIX-like operating system called GNU (which stood for GNU’s Not Unix – a recursive acronym!).
The PowerPC Architecture
1993 saw the introduction of the PowerPC RISC architecture, a modified version of IBM’s successful POWER architecture used in its RS/6000 workstations. PowerPC was the result of an alliance between Apple (who recognized the need to drop the aging 68000 architecture in favor of a RISC), IBM (who were dissatisfied with the PC market after losing control of it to Microsoft and clone vendors), and Motorola (who had manufactured the 68000 series and wanted to keep making chips for Apple’s machines). The AIM (Apple, IBM, Motorola) alliance, which started in 1991, was one the computing industry could scarcely believe would work. After all, IBM and Apple had been bitter enemies just a few years earlier.
Unlike the expensive multi chip POWER processors, the PowerPC architecture was aimed at low cost single chip microprocessor implementations. As a result, some of the excess baggage of POWER was eliminated or replaced. At the same time, to better accommodate the future, optional 64-bit extensions were added. Unfortunately, this meant that the PowerPC architecture was not totally forward or backward compatible with POWER, causing headaches for compiler writers.
The first PowerPC processor, the PowerPC 601, was released in 1993, two years after the announcement of the PowerPC architecture. It was a three issue superscalar processor which offered high performance at a low cost. It soon made its way into the lower half of IBM’s RS/6000 workstation line. A year later it also appeared in Apple’s first Power Macintosh computers.
Late 1993 also saw IBM’s POWER2 processor succeed the POWER1 as the processor in the high end RS/6000 machines. POWER2 was an expensive and very aggressively superscalar processor. This resulted in very high complexity and an impressive 23 million transistors spread over eight chips! The complexity was well targeted and was quite effective, but it also limited the clock rate – an interesting tradeoff considering that the highly parallel 71.5 MHz POWER2 was faster than the 200 MHz DEC Alpha 21064 (but the POWER2 was also much more expensive).
The Apple Newton
The Apple Newton, released in August 1993, was the first popular hand held personal digital assistant (PDA). The Newton’s primary input device was a stylus pen, and it relied heavily on printed handwriting recognition and pen based navigation for its user interface. It was aimed squarely at mobile business professionals, and had a built-in notepad, calculator, to-do list, calendar and address book for organizing personal and business affairs. Using an optional wired or wireless modem, it could send faxes or hook up to the Internet to send and receive email. It even had a version of the popular Quicken financial software to help organize personal and business expenses.
Although it weighed less than 1 lb and was only the size of a small notepad, the Newton had roughly the processing power of an Intel 80486. It used a 20 MHz Acorn ARM RISC processor because its low cost, high speed and low power consumption made it ideal for the Newton’s relatively demanding handwriting recognition based user interface. The Newton also had a 336×240 pixel reflective LCD display, 640 Kb of RAM, 1 Mb of non-volatile RAM, and 4 Mb of ROM containing its pen based operating system and built-in applications. The Newton communicated with other Newtons and normal desktop computers through an infra-red signaling system, and used credit card sized plug-in cards for expansion devices.
The Newton cost $699, and 50,000 units were sold in the first 10 weeks. Unfortunately, the first generation of Newtons had very poor handwriting recognition, and they were poorly received as a result.
A Standard UNIX – COSE & CDE
Since the great divide of the early eighties, UNIX had suffered from a severe lack of standardization between platforms. The great divide had split UNIX into two distinct families – BSD from Berkeley and System V from AT&T. Although they had basically the same functionality, there were sufficient differences between BSD and System V that virtually every application of any substance needed to be modified to work on “the other” system. On top of this, the X Window System had even bigger problems stemming from the different GUIs available – Athena, OpenLook, Motif, DECwindows etc.
Product differentiation forces within the UNIX market had meant that practically no two systems were alike. Sun used BSD and OpenLook, HP used System V and Motif, Digital used BSD and DECwindows, and so on. All up, the UNIX market was a mess and end users were frustrated by the differences between “UNIX” systems. However, in 1993 the looming threat of Microsoft’s Windows NT forced the UNIX vendors to finally see the light. Convinced that a divided UNIX market would never fight off Windows NT, but a united one might, they quickly agreed to standardize.
A consortium consisting of Sun, HP, IBM, Digital, AT&T Bell Labs, Novell and SCO all agreed on a single Common Operating Software Environment (COSE) and a Common Desktop Environment (CDE). The basis of COSE was Spec-1170, a UNIX API specification based on AT&T’s UNIX System V Release 4 (SVR4), which combined most of the BSD and System V functionality into a single version of UNIX. CDE sat on top of this, and consisted of the X Window System with the Motif user interface and a desktop manager based on HP’s Visual User Environment (VUE). Finally, Sun’s desktop utilities were converted to Motif and became the utilities supplied with CDE.
Over the next couple of years Sun moved to Solaris (SVR4) and gradually dropped OpenLook. Similarly, DEC moved to OSF/1 and gradually dropped DECwindows. Other vendors acted likewise, and by the end of 1995 all the major UNIX variants were COSE/CDE systems, with the exception of Silicon Graphics and Linux. Silicon Graphics adopted SVR4 and Motif, but used its own desktop manager rather than CDE’s VUE based one. Linux, of course, couldn’t adopt Motif because Motif was not free, but it broadly followed the SVR4 conventions, and users could buy Motif separately if they wanted it.
The Intel Pentium
The Intel Pentium processor began shipping in late 1993, and swept through the PC industry faster than any of Intel’s previous processors. Although Intel’s 80486 (1989) included a built-in FPU and was much faster than the 80386, it was the Pentium that introduced the next leap forward in the x86 microarchitecture: superscalar pipelines. Skeptics said a CISC architecture couldn’t do it, but the Pentium proved otherwise. The Pentium contained 3.1 million transistors and initially ran at 60 MHz. It was called the Pentium rather than the 80586 to avoid confusion with the copycat names of x86 processors from AMD and NexGen (such as the AMD386 and Nx586).
Although it dominated the PC world, the Pentium had a checkered career. In November 1994, a mathematics professor, Thomas Nicely of Lynchburg College, found a serious precision error in the Pentium’s floating point divide operation. This meant that all previous floating point calculations involving division were now suspect if they had been performed on a Pentium. Even worse, it emerged that Intel had known about the flaw for several months, but had decided to say nothing. In the end, a month of intense market pressure forced Intel to offer free replacement of flawed Pentiums with a reimplementation of the processor which didn’t have the flaw.
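The flaw was easy to demonstrate once it became public. The most widely circulated test case was the division 4195835 / 3145727: the expression in the C snippet below should come out as zero (give or take the very last bit of rounding), but on a flawed Pentium it came out as 256, because the hardware quotient was wrong from about the fifth significant digit onwards.

    /* The widely circulated Pentium FDIV test case.
     * A correct FPU gives (essentially) zero; the flawed Pentiums gave 256. */
    #include <stdio.h>

    int main(void)
    {
        double x = 4195835.0;
        double y = 3145727.0;
        double r = x - (x / y) * y;   /* y is one of the divisors that triggered the bug */
        printf("x - (x / y) * y = %g\n", r);
        return 0;
    }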
In 1994 and 1995 a typical Pentium based PC had a Pentium processor running at between 60 and 120 MHz, 4 to 16 Mb of RAM, a couple of hundred Mb of disk space, 8-bit 640×480 “SuperVGA” graphics, a 14″ color monitor, a CD-ROM drive, and ran Windows 3.1. It typically cost around $1800 to $2500, depending on the specific configuration.
Apple’s Power Macintosh
Apple’s Power Macintosh computers, introduced in March 1994, marked the successful transition of RISC into the mainstream personal computer market.
Unlike the painful CISC to RISC transitions which had occurred in the UNIX market, Apple handled the transition from its existing base of 68000 based Macintoshes to the new PowerPC based ones beautifully. To provide complete backward compatibility with existing 68000 software, the Macintosh operating system was augmented to emulate the 68040 processor. This emulation required no user intervention, and worked even for many device drivers. And because the new PowerPC processors were so fast, the emulation overhead was tolerable. Naturally, native PowerPC applications ran faster than emulated 68000 ones, but there was only a slight loss of performance for 68000 based software when running on a PowerMac, compared to a 68040 based Macintosh. The concept worked so well that within the first year over a million PowerMacs had been sold, rocketing the PowerPC up to the top of the RISC architectures in terms of importance.
The initial PowerMac 6100 had a 60 MHz PowerPC 601 processor, 8 Mb of RAM, 16-bit 640×480 graphics, 16-bit stereo sound, a 250 Mb hard disk, a CD-ROM drive and built-in ethernet. Its 14″ color monitor was a unique design with the speakers mounted in an angled panel below the display. In this configuration, the PowerMac 6100 cost $2289. The more expensive 7100 and 8100 models had faster processors (66 and 80 MHz) and were more expandable.
Efficient emulation of the 68040 was absolutely critical to the success of the PowerMac, since even some parts of the Macintosh operating system were still 68000 code when the PowerMac was first released (much of the code had been written in assembly language back in the mid 1980’s). About a year after the introduction of the PowerMac, a start-up company called Connectix announced SpeedDoubler – a much faster 68040 emulator for the PowerMac based on dynamic compilation. Users were quick to adopt it. Recognizing the importance of emulation performance, Apple soon changed their 68040 emulator to use dynamic compilation.
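The idea behind dynamic compilation is to translate each block of 68000 code the first time it is executed, and then reuse the translated version on every later pass, so the expensive decoding work is paid once per block rather than once per instruction executed. The C sketch below shows only the bookkeeping, and is purely illustrative – a real dynamic compiler generates machine code at run time, whereas here the “translations” are ordinary C functions standing in for it.

    /* Sketch of a dynamic-translation cache: translate a guest code block the
     * first time it runs, then reuse the cached translation from then on. */
    #include <stdio.h>

    #define CACHE_SIZE 16

    typedef int (*native_block)(int);        /* a translated block of guest code */

    static native_block cache[CACHE_SIZE];   /* indexed by (toy) guest address */
    static int translations_done;

    /* Stand-ins for the native code a real translator would emit. */
    static int block_at_0(int x) { return x + 1; }
    static int block_at_1(int x) { return x * 2; }

    static native_block translate(int guest_pc)
    {
        translations_done++;                 /* the expensive step, done once per block */
        return (guest_pc == 0) ? block_at_0 : block_at_1;
    }

    static int run_block(int guest_pc, int arg)
    {
        if (cache[guest_pc] == NULL)         /* first execution: translate and cache */
            cache[guest_pc] = translate(guest_pc);
        return cache[guest_pc](arg);         /* later executions: straight to native code */
    }

    int main(void)
    {
        int v = 5;
        for (int i = 0; i < 1000; i++) {     /* a hot loop re-executes the same blocks */
            v = run_block(0, v) % 100;
            v = run_block(1, v) % 100;
        }
        printf("result = %d, blocks translated = %d\n", v, translations_done);   /* only 2 */
        return 0;
    }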
The World Wide Web
Although the Internet had been around for many years, it was the introduction of the World Wide Web which made the Internet popular. The Web offered a simple, friendly, graphical way of browsing for information or entertainment. Millions of electronic storefronts suddenly sprang up for the tens of millions of Web “surfers” to look at. The Internet boomed, and just about everyone who owned a computer wanted to connect. Modem sales skyrocketed.
The World Wide Web was based on the hypertext idea. Information was stored as formatted hypertext in the HTML format, which browser tools such as Mosaic and Netscape could fetch from across the Internet and display to the user. HTML was soon augmented to allow pictures, then video and audio, and even 3D graphics and virtual reality. Eventually, the Web was even able to achieve intelligent interactive content via Sun’s Java language.
End users loved the Web because the user interface was a simple point and click style (just click on the hypertext links). As such, it was much easier to use than ftp and telnet. The user base grew very rapidly, doubling every few weeks. Internet cafes appeared in shopping malls so that even people without a computer could surf the Web. Then the media got on board, and once that happened there was no stopping it.
The current opinion is that the Web has started the next big boom in the computer industry – the widespread use of networking for both entertainment and commerce.
IBM’s OS/2 Warp
After the first two largely failed launches of IBM’s OS/2 operating system for PCs, the third attempt, released in October 1994 and marketed as OS/2 Warp, finally put OS/2 on the map. IBM backed it with a big new marketing push (the “solutions for a small planet” campaign), which concentrated on OS/2’s networking support and Internet access tools, plus its technical advantages of multitasking, true 32-bit operation, and backward compatibility with applications written for Microsoft Windows 3.x. In the first five months IBM sold 1.7 million copies of OS/2 Warp, firmly establishing it as the second most popular PC operating system behind Microsoft Windows.
Macintosh Clones, Finally
In April 1995, six months after Apple agreed to license its Macintosh operating system for cloning purposes, the first Macintosh clones appeared. Onlookers from the PC world could have been forgiven for thinking that the new clone makers didn’t understand what cloning was all about – price. Radius’s VideoVision workstation was a souped up PowerMac 8100 aimed at the high end video editing market, and had a price tag of almost $30,000. At the other extreme, Cutting Edge’s Quotro 850 was based on the out of date Motorola 68040 processor. Only Power Computing’s offerings fit the traditional image of clones: cheaper machines using off the shelf components, with flexible configurations and quick delivery.
Power Computing’s first two models, the Power 80 and Power 100, were roughly equivalent to the PowerMac 7100 and 8100, but they used low cost PC components and enclosures wherever possible to keep the cost down, including using a standard PC monitor. As a result, they looked like IBM PCs from the outside. The Power 100 model was priced at $3,349, about $1000 cheaper than a similarly equipped PowerMac 8100.
Microsoft Windows 95
After at least eighteen months of pre-release hype, Microsoft finally released Windows 95 on August 24th 1995. The associated marketing campaign was nothing short of amazing. It was a massive global multimedia marketing hype-fest including TV, radio, newspapers, magazines, billboards and just about everything else. Ads for Windows 95 were everywhere! Such a barrage had hardly ever been seen in the history of marketing.
Technically, Windows 95 added several important pieces of functionality to the Windows environment. The filesystem could now support filenames longer than 8 characters, because Windows 95 was a free standing operating system which didn’t sit on top of DOS (unlike Windows 3.x). Nested folders were also supported (finally!). Windows 95 also had full networking support, including tools for accessing the Internet and Microsoft’s own proprietary network (MSN, the Microsoft Network). The native Windows 95 API and most of the operating system were 32-bit, which resulted in improved performance. True preemptive multitasking was provided for native 32-bit Windows 95 applications (but not for 16-bit Windows 3.x applications). And finally, the look of the user interface components was altered to make them more stylish.
All up, Windows 95 was probably closer to Windows NT than to Windows 3.1, marking Microsoft’s clear intention for NT to be the successor operating system once the x86 architecture began to fade from the scene.
Toy Story
Toy Story, released in late 1995 by Pixar, was the first feature film to have been fully generated using 3D computer graphics. It marked the coming of age of 3D graphics, which had previously only been used for short special effects sequences lasting a few seconds within traditional films using actors, cameras and so on.
The animators who created Toy Story used high end Silicon Graphics workstations to prepare the animations. More than 400 3D models and 2000 texture maps were used, and the two main characters (Woody and Buzz) each had over 700 animation controls, including 212 on Woody’s face and 58 on Woody’s mouth alone. The modeling and animation preparation took over ten man years to complete. No motion capture was used in the entire film – everything was animated by hand. The final frames were then rendered by a “rendering farm” of 117 multiprocessor Sun SPARCstation 20s. It took 800,000 machine hours to render the 114,200 frames of the 79 minute film.