Historical Operating Systems: Xerox GlobalView

Author’s Note: The demonstrations in this article are based on Xerox GlobalView 2.1, the final release of the operating system, and use a software collection available from the links here: http://toastytech.com/guis/indexlinks.html

Xerox is not a name which one would usually associate with computing, being far better known for their photocopying enterprise. For this reason, it is somewhat bizarre to look at the history of Xerox and realise that, through their PARC (Palo Alto Research Center), Xerox were one of the most revolutionary computer designers of all time. Their first design, the Alto, was introduced in 1973 and provided a functioning GUI, complete with WYSIWYG word processing and graphical features, more than ten years before the first comparable developments by any other company. Indeed, the Alto embodied the concept of the personal computer several years before even the Apple II, the Atari 8-bit family and the Radio Shack TRS-80 arrived in that sector, at a time when most computers still had switches and blinkenlights on their front panels.

The Alto was never sold as a commercial product, instead being distributed throughout Xerox itself and to various universities and research facilities. Xerox released their first commercial GUI workstation, the Xerox 8010 (better known as the Star), in 1981, but by that stage, they had demonstrated their technology to many other people, including Apple’s Steve Jobs and Microsoft’s Bill Gates. Microsoft and Apple would soon release their own GUI operating systems, based heavily on the work of Xerox PARC’s researchers, and would ultimately compete to dominate the market for personal computer operating systems while Xerox’s work remained a footnote in their success.

The Xerox Star was relatively unsuccessful, selling in the tens of thousands. Part of the reason, despite the Star’s technical advantages, was the price: a single Star workstation cost approximately $16,000 in 1981, $6,000 more than the similarly unsuccessful Apple Lisa and more than $10,000 more than the Macintosh 128K when that was released in 1984. Consequently, the people who could have made the most immediate use of a GUI operating system, including graphic designers, typically couldn’t afford it, while those who could afford it were more likely in the market for computers suited to data processing, like VAX minicomputers or IBM System/3 midrange computers.

Nevertheless, Xerox continued to market the Star throughout the early 1980s. In 1985, the expensive 8010 workstation was replaced with the less expensive and more powerful 6085 PCS, built on a different hardware platform. The operating system and application software were rewritten as well for better performance, and renamed ViewPoint. By this stage, though, the Apple Macintosh was severely undercutting even its own stablemate, the Lisa, let alone Xerox’s competing offering. Meanwhile, GUI operating environments were beginning to pop up elsewhere, with the influential Visi On office suite already available on IBM-compatible PCs and Microsoft Windows due to arrive at the end of the year, not to mention the release of the Commodore Amiga and the Atari ST.

Eventually, Xerox stopped producing specialised hardware for their software and ported it to IBM PC-compatible computers – and to Sun Microsystems’ Solaris – in a form called GlobalView. Since the Xerox Star and ViewPoint software was written in a language called Mesa – later an influence on Niklaus Wirth’s Modula-2 language and, more distantly, Java – GlobalView originally required an add-on card to provide the Mesa environment, but in its final release it ran as a layer on top of Windows 3.1, 95 or 98 via an emulator.

As a consequence of running in this emulated environment, Xerox GlobalView 2.1 is not a fast operating system. It takes several minutes to boot on the VirtualBox installation of Windows 3.1 which I used for the process, most of which seems to be I/O-bound, since the host operating system otherwise runs about as fast as Windows 3.1 can on any computer. The booting process is also rather sparse and cryptic, with the cursor temporarily replaced by a set of four digits, the meaning of which is explained only in difficult-to-find literature on GlobalView’s predecessors.

Once the booting process is complete, one of the first things you may notice is that the login screen doesn’t hide the fact that Xerox fully intended this system to be networked among several computers. This was a design decision that persisted from the original Star in 1981, and further back still to the Alto. Since I don’t have a network to use the system with, I simply entered an appropriate username and password and continued on, whereupon the system booted up like any other single-user GUI operating system.

Looking at screenshots of the Xerox Star and comparing them with the other early GUI systems that I have used, I can imagine how amazing something like the Xerox Star looked when it was released in 1981. It makes the Apple Lisa look vaguely dismal in comparison, competes very well with the Apple Macintosh in elegance and blows the likes of Visi On and Microsoft Windows 1.0 out of the water. Xerox GlobalView retains that same look, but by 1996 the lustre had faded, and GlobalView looks rather dated and archaic in comparison to Apple’s System 7 or Windows 95. Nevertheless, GlobalView still has a well-designed and consistent GUI.


Astounding in 1981, but definitely old-fashioned by 1996.

GlobalView’s method of creating files is substantially different from that used by modern operating systems and instead closely resembles the method used by the Apple Lisa. Instead of opening an application, creating a file and saving it, there is a directory containing a set of “Basic Icons”, which comprise blank documents for the various types available, including word processor documents, paint “canvases” and new folders. This is similar to the “stationery paper” model used by the Lisa Office System, although GlobalView doesn’t extend the office metaphor that far.

Creating a new document involves chording (pressing both left and right mouse buttons at the same time) a blank icon in the Basic Icons folder, selecting the Copy option and clicking the left mouse button over the place where you wish to place the icon. Once the icon has been placed, the new document can be opened in much the same way that it may be opened on any newer PC operating system. By default, documents are set to display mode and you need to actually click a button to allow them to be edited.

GlobalView can be installed as an environment by itself, but is far more useful when you install the series of office applications that come with it. As with any good office suite, there is a word processor and a spreadsheet application, although since the Xerox Star pre-dated the concept of computerised presentations, there is no equivalent to Microsoft’s PowerPoint included. There is also a raster paint program, a database application and an email system, among others.

It’s difficult to talk about GlobalView without considering its historical line of descent, and it’s clear that while the Xerox Star presented a variety of remarkable advances in GUI design, by 1996, GlobalView was being developed to placate the few remaining organisations who had staked their IT solutions on Xerox’s offerings in the past. The applications no longer held any sort of advantage over the competition. In many cases, they feel clunky – the heavy reliance on the keyboard in the word processor is one example, made more unfriendly to the uninitiated by not following the standard controls that had arisen on IBM PC compatibles and Macintoshes. Still, considering the historical context once again, these decisions feel idiosyncratic rather than clearly wrong.


The paint program isn’t too bad, though.

Using GlobalView makes me wonder what might have become of personal computing if Xerox had marketed their products better – if in fact they could have marketed them better. Of course, even by the standards of the operating systems that were around by the last version of GlobalView, the interface and applications had dated, but that interface had once represented the zenith of graphical user interface design. Like the Apple Lisa, the Xerox Star and its successors represent a dead-end in GUI design and one that might have led to some very interesting things if pursued further.

Historical Operating Systems: The Apple Lisa and the Lisa Office System (a.k.a. 7/7)

While the Macintosh is by far the most successful of Apple’s personal computing endeavours, it was not the first of their systems to introduce a graphical user interface. The Apple Lisa, an expensive computer designed for a professional environment, pre-dated it by a year. However, while the original Macintosh line immediately became an icon of the 1980s, the Lisa was not a sales success and now languishes in obscurity. From a computer history point of view, the Lisa is still a very interesting machine despite its failure in the marketplace, with a desktop paradigm distinctly different from that of rival computers, including the Macintosh, and one that has not been replicated – in a complete form, at least – to this day.

The Apple Lisa was the first entirely new computer from Apple since their ill-fated Apple III, which had been aimed at the business market rather than the home market occupied by the Apple II. The Apple III had suffered from instability and poor hardware engineering, including a worrying set of heat-related problems which caused chips to work loose from their sockets and the computer to fail. Its operating system, Apple SOS, was also not compatible with the Apple DOS versions that were popular on the Apple II, which made it more difficult for users to run their pre-existing software and for developers to port their programs over to the more sophisticated Apple III hardware.

The Lisa project had started before the Apple III was released, and it had originally been intended as an improvement on the Apple II design. However, when Steve Jobs visited Xerox PARC in 1979, he was inspired by their nascent GUI technology which Xerox themselves had not successfully been able to market. Apple adapted the GUI technology for their own purposes as the Lisa’s development progressed.

The Lisa was released in January 1983, and as only the second commercially released computer with a GUI operating system, after Xerox’s Star workstation, it presented the public with a vision of computing completely different from anything seen before. At a time when the vast majority of personal computers presented a text-based interface to their users, usually starting with a BASIC interpreter, the GUI of the Lisa was revolutionary. What’s more, unlike the Xerox Star, which was intended to be bought as part of a networked collection of servers, clients and peripherals, the Lisa was designed as a single-user workstation for individual business users, making it rather more affordable than the Xerox machine.


The Apple Lisa, complete with the ProFile external hard drive.

The hardware of the Lisa centred around the Motorola 68000 processor, which would later become popular among many of its competitors in the mid-to-late 1980s cycle of personal computers, including the Amiga, Atari ST, Sharp X68000 and the Macintosh itself. However, while most of those later computers ran their 68000 at approximately 8 MHz, the Lisa’s processor was clocked at a rather less impressive 5 MHz, which might have compared favourably against the MOS 6502 and Zilog Z80 processors of home computers, but was mediocre in a graphical workstation and let down the Lisa’s performance, leaving it feeling sluggish at times.

Also provided was one megabyte of RAM, substantially more than provided in any home computer of the time and more than the maximum addressable by the IBM PC, along with two 5.25” floppy drives which used a proprietary disc format. An optional accessory was an external hard drive of either 5 MB or 10 MB capacity, which connected to the Lisa’s parallel port interface. This allowed for the installation of the operating system, along with a decent amount of storage for documents.

The operating system itself, the Lisa Office System (later renamed 7/7), was the most interesting component of the Apple Lisa project. The Lisa OS was an advanced operating system for 1983, with many components that looked ahead of their time when compared to the text-based home computers of the early 1980s. Aside from the GUI itself, which would quickly be copied by the computer companies that survived into the mid-1980s, the Lisa OS provided cooperative multitasking, which would not be present in the Macintosh until the release of System Software 5 in the late 1980s, and virtual memory, which the Macintosh only got in the 1990s with the release of System 7.

The Lisa Office System was delivered with seven application programs (thus explaining the later name of 7/7 for the Lisa OS). Comprising a large set of office-oriented software, it could be considered one of the first office suites ever designed. The suite was formed of a spreadsheet program, a vector graphics program, a graph and chart generator, a list-based database program, a project management program, a terminal application and a word processor, covering most of the usual domains of the office suite. It’s interesting to see how close this set of software is to the office suites of today; some needs in computing have changed far more in scale than in the general presentation of the programs used to meet them.

One of the most peculiar and distinctive things about the Lisa Office System was its approach to how files were created and manipulated. Whereas Microsoft Windows and Unix or Unix-like systems take an application-centric approach to computing, where one finds an application and uses it to create and manipulate a document, the Lisa Office System used a document-centric approach. Files were created by double-clicking generic “paper” icons, representing the tearing of a page from a pad of stationery. The files could then be individually named and manipulated.


No, I’ve never claimed to be a good artist – working on a vector graphics illustration with LisaDraw.

There were other distinctive parts of this document-centric model, like the fact that you could not quit applications as you would on a Windows or Unix-like operating system; instead, the applications remained open as long as the application floppy was present in the drive. Files did not have to be manually saved; they could be, as an optional choice, but the default behaviour was to save files in the background. When the power button was pressed or the floppy disc was ejected, the system would immediately begin cleaning up, meaning that files were preserved and the system was left clean for the next boot or application load. While some parts of this model have since been adopted by other systems, no system has really tried to emulate the operation of the Lisa, even though so many hobbyist-focused and open-source operating systems have attempted to recreate the models of, for example, AmigaOS, BeOS and Atari TOS.

Perhaps there are some reasons for that, though. The Lisa was not particularly successful in the marketplace; while the Apple II sold in the millions and the Macintosh became a roaring success in the years to come, the Lisa sold in the tens of thousands. The price didn’t help; it was difficult for companies to justify the cost of a Lisa versus the likes of the IBM PC. Neither did the performance of the Lisa Office System.

The Macintosh, released a year later with a faster clock speed and a substantially slimmer operating system, often feels quite slow when compared to other operating systems of that general period; the Lisa, with a more sophisticated operating system, suffers from speed issues due to its slower processor. To be fair to the Lisa Office System, the slowness doesn’t manifest itself in the same way as it might in a Windows or Unix system: the system doesn’t feel like it locks up, and therefore it does feel clean in that respect. However, it does feel like a limitation that could and should have been avoided by using a more powerful processor.

Another limitation of the system comes from the difficulty of programming for it. Relatively little software was written for the Lisa, and most of what existed was included with the Lisa Office System. As the Lisa Office System covered all of the obvious bases when it came to office software, and the system was certainly not a hacker’s machine, there were few places for third-party developers to stake their claim. The Lisa Office System was also written in Pascal, a language no longer commonly considered appropriate for systems programming. In order to program for the Lisa, one was forced to use a separate operating system known as the Lisa Workshop, developing software on one Lisa while running it on another. This seems like it would have been tedious and prohibitively expensive for software developers, particularly in the 1980s, when many programming studios were rather smaller than they are today.

There are also some other limitations of the Lisa that go beyond hardware or software performance. For a system which seems like it would have been ideal for desktop publishing, the Lisa came with a bizarrely limited selection of typefaces and fonts. While the Macintosh came with several typefaces from the very beginning, gaining the Helvetica and Times typefaces (or at least similarly named clones) in later revisions, the Lisa had just two typefaces – not really all that impressive for a system with a graphical user interface.

However, for all of those limitations, I don’t feel the Lisa deserved quite the ignominy that it received. After another couple of hardware revisions – the last being the Macintosh XL, so named because the addition of a Macintosh ROM and the MacWorks software gave it compatibility with Macintosh programs, although it was still lumbered with the slow 5 MHz processor of the other Lisa computers – the Lisa was dropped by Apple in 1986. Among its users had been NASA, whose extensive use of LisaProject led Apple to develop MacProject, released alongside the first Macintosh in 1984. The remaining Lisa stock was purchased by Sun Remarketing, who ended up assisting Apple in dumping several thousand of the unsold computers in landfill in order to claim a tax write-off.

What I’ve taken away from my brief experience of emulating the Lisa software is that it feels novel, yet it also has many of the utilities of a modern office system. The document-centric stationery system feels different – not necessarily better or worse – from the more popular application-centric model, and the entire operating system feels very tightly integrated with the application software. I’m not sure I would want to use such a system in my own endeavours – I tend more towards programmers’ tools and the hacker’s sensibilities – but as an office system, one could certainly do worse than the Lisa, and it would probably be more intuitive to an office user than Windows once the system was sufficiently explained. If you can teach a neophyte with little interest in computing how to use Microsoft Office, then you could certainly teach them how to use the Lisa Office System – and they might actually be more productive on it as well!

Historical Operating Systems: MS-DOS 6.0 and Microsoft Windows 3.1

In 1996, when I was eight years old, I got my first PC. With a 25 MHz 486SX processor, 8 MB of RAM and an 80 MB hard disc, it wasn’t exactly cutting-edge, but it worked, and more importantly, it was mine. Personal computers belonging exclusively to eight-year-old children may be commonplace today, but they were a rare sight in 1996. I remember some friends of mine with 286s and old monochrome Macintoshes, but my 486 was of a considerably newer pedigree.

Windows 95 had been released only about half a year before I acquired my 486, so plenty of older computers still ran the 16-bit versions of Windows. My own computer ran MS-DOS 6.0 and Windows 3.1, which were only three years old and still had software being made for them. Considering that I was interested first and foremost in experimenting and in learning how to use “serious” software such as office suites, Windows 3.1 was still very much useful to me back then.

MS-DOS was then in its twilight years, and its command-line nature contrasted strongly with most other personal computer operating systems of the time, most of all with the Macintosh. MS-DOS first appeared in 1981, when IBM requested an operating system for their IBM PC 5150. Microsoft responded by purchasing 86-DOS, an unofficial clone of the popular CP/M operating system written for Intel 8086 processors, which was supplied to IBM as PC DOS and later licensed to clone PC manufacturers as MS-DOS when they started attracting attention.

Microsoft continued to develop MS-DOS during the 1980s and early 1990s, riding the wave of success that came from being associated with the growing popularity of the x86 platform. As time progressed, some limitations of the original design of MS-DOS began to show more prominently, particularly in light of the competition from Apple and Commodore. MS-DOS was a single-tasking operating system, capable of processing only one task at a time (with an exception for certain programs, usually device drivers, called TSRs, or Terminate and Stay Resident programs). The command-line interface of MS-DOS made this less of an issue than it was on the earliest versions of Apple’s System Software, or on Atari TOS, but it was still a limitation. The fact that device drivers were necessary for a fair number of devices – serial mice, CD-ROM drives, et cetera – didn’t help either.

Another, more frustrating limitation came from IBM’s original PC design, which reserved the memory addresses between 640KB and the 1MB limit (the top of the 20-bit address space of the Intel 8086 and 8088) for ROMs, video adapters and other devices. In 1981, this wasn’t an issue; most personal computers came with 64KB or less of RAM, and the days of using 640KB of RAM looked to be quite far in the future. The limitation started hitting home around the late 1980s, when more than 1MB of memory became fairly accessible in a home computer. Unfortunately, bound by the requirement for backwards compatibility with older IBM PC compatibles, Microsoft was forced into a rather inelegant solution. The 640KB of memory with the lowest addresses became known as “conventional memory”, while memory beyond the 1MB mark had to be reached either as “extended” memory, addressed directly on 286 and later processors, or as “expanded” memory, bank-switched through a page frame in the upper memory area.

All of this seemed like a bodge job even in the early 1990s, and it wasn’t helped by Intel’s decision to use segmented memory addressing in the 8086. Memory below the 640KB barrier would be fought over viciously by programs, particularly by games, which pushed conventional memory to its limits. The usual way of resolving these issues was to modify the files which specified the parameters of the MS-DOS startup sequence, moving everything possible into the 384KB of upper memory between the conventional memory barrier and the 1MB mark. Indeed, many gamers became dab hands at modifying the AUTOEXEC.BAT batch file and the CONFIG.SYS configuration file to free as much conventional memory as possible. If you were good, you might get about 600KB of conventional memory free, which would be enough for most games, but some just pushed the envelope that bit further than others.
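As a sketch of what this tuning looked like, here is the kind of CONFIG.SYS and AUTOEXEC.BAT pairing a gamer of the period might have arrived at. HIMEM.SYS, EMM386.EXE and MSCDEX.EXE were the real Microsoft-supplied components; the CD-ROM and mouse driver file names and paths here are hypothetical examples.

```
REM --- CONFIG.SYS: load DOS and drivers out of conventional memory ---
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM DOS=HIGH,UMB puts DOS itself into the high memory area and
REM makes upper memory blocks available to DEVICEHIGH and LOADHIGH
DOS=HIGH,UMB
REM Hypothetical CD-ROM driver, loaded into upper memory
DEVICEHIGH=C:\DRV\CDROM.SYS /D:MSCD001
FILES=30
BUFFERS=20

REM --- AUTOEXEC.BAT: LH (LOADHIGH) pushes TSRs into upper memory ---
LH C:\DOS\MSCDEX.EXE /D:MSCD001
LH C:\DRV\MOUSE.COM
SET PATH=C:\DOS;C:\WINDOWS
```

MS-DOS 6.0 even shipped a MemMaker utility that automated much of this juggling, though enthusiasts often still managed to beat its results by hand.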

In any case, the limitations of MS-DOS were well known by 1988, when Microsoft released Windows/286, the 286-compatible version of Microsoft Windows 2.1. Windows itself dated back to 1985, when Microsoft released Windows 1.01 after several delays. Compared to the elegance of the Mac System Software or the power and flexibility of AmigaOS, Windows 1.0 looked like it had been hacked together. Nevertheless, it provided a GUI for IBM PC compatibles, along with support for co-operative multitasking and multiple windows open at once (although, due to legal concerns with Apple, the windows were tiled rather than overlapping, even though the GUI engine allowed for it).

Microsoft continued to refine the Windows environment, with the jump from Windows 1.0 to Windows 2.0 improving the graphical environment considerably, and Windows 2.1 gaining the ability to use memory beyond the conventional 640KB. By Windows 3.0, the graphical environment looked much more elegant than it had at the start, and while I would argue that it still did not look as good as the Mac System Software it was competing with, it did at least have some endearing properties.

When I got my computer, it came with a box of floppy discs, including a full set of MS-DOS and Windows 3.1 installation discs. MS-DOS 6.0 came on three floppy discs, while my distribution of Windows 3.1 came on seven, plus one disc bizarrely marked ENH 1, the purpose of which I never managed to figure out. I recall installing Windows 3.1 being remarkably easy; MS-DOS 6.0 was somewhat more difficult, but well within my capabilities from an early stage. The package of MS-DOS and Windows came with a large manual, from which I would often read not only to learn what to do but also for amusement. (Looking back, I seem to have had a rather skewed sense of what constituted fun when I was younger.)

Using Windows 3.1 was also rather easy, with a simple, consistent GUI that worked well enough for the requirements of office staff, home users and others alike. Given that it was possible to set up the AUTOEXEC.BAT file so that the computer would boot straight into Windows after the initial loading of MS-DOS, it was quite possible to use the computer without ever leaving the Windows environment. I, on the other hand, spent a decent proportion of my time in the MS-DOS environment, not only for playing games – some of which wouldn’t play nicely with Windows 3.1 – but also for exploration.

Efficient use of MS-DOS requires knowledge of the hierarchical nature of its directory structure, as is the case for many command-line interfaces. There were a number of simple commands which one was advised to learn as quickly as possible, such as dir for listing the contents of a directory, cd for changing directories and copy for copying files from one destination to another. For whatever reason, command-line interfaces proved rather friendly to me, but that was probably not the case for most other computer users.
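A short session with those three commands might have looked something like this; the directory listing is a hypothetical example, abbreviated for clarity:

```
C:\>dir
 Volume in drive C is MSDOS_6
 Directory of C:\

DOS          <DIR>      31-03-93
GAMES        <DIR>      04-07-96
AUTOEXEC BAT       214  04-07-96
CONFIG   SYS       168  04-07-96

C:\>cd games

C:\GAMES>copy notes.txt a:\
        1 file(s) copied
```

Once those basics were second nature, the rest of the command set tended to follow the same verb-then-arguments pattern.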


I got used to seeing a lot of this during my time with MS-DOS.

While most of the commands found in the MS-DOS distribution are straightforward system tools, there are still enough utilities included with the operating system that you’re not left completely against a brick wall when starting out: the MS-DOS Editor for text files, the MS Anti-Virus suite (revealing an unfortunate legacy of the IBM PC platform) and the Microsoft QBasic interpreter, which presented a sophisticated form of BASIC with proper control statements and support for functions. I was particularly glad of the QBasic interpreter when I was younger; it was the first platform on which I learned how to program. Nevertheless, things do seem a bit sparse for an operating system of the early 1990s.
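To give a flavour of what made QBasic feel sophisticated next to the line-numbered BASICs of the 8-bit era, here is a small, hypothetical example using a named function and structured control flow rather than line numbers and GOTO:

```basic
DECLARE FUNCTION Factorial& (n AS INTEGER)

' Structured FOR loop calling a named, typed function: no line
' numbers, no GOTO, unlike the BASICs 8-bit home computers booted into.
DIM i AS INTEGER
FOR i = 1 TO 5
    PRINT i; "! ="; Factorial&(i)
NEXT i

END

FUNCTION Factorial& (n AS INTEGER)
    DIM k AS INTEGER
    DIM result AS LONG
    result = 1
    FOR k = 2 TO n
        result = result * k
    NEXT k
    Factorial& = result
END FUNCTION
```

In the QBasic editor, the FUNCTION block lived in its own window, which made the structured style feel quite natural even to a beginner.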

On the other hand, Windows 3.1 feels rather more generous with its included software – as befits an operating system distributed on seven floppy discs. Aside from the system tools, there was also a basic word processor, a graphical text editor, a media player for videos and music, a sound recorder for WAV files and a basic paint program, among other applications. Many of these applications were updated and distributed with later versions of Windows as well. This ensured that you could do some sort of useful work without requiring external packages, although as Microsoft was then, as now, selling the Microsoft Office suite, they were careful not to put too many features into the included software.


Enough software to at least get you going.

By the time of MS-DOS 6.0, the operating system was using the FAT16 file system, with support for partitions of up to 2GB. This wasn’t much of a concern for me at the time, as I had my 80MB hard drive back in 1996 and didn’t get a computer with a hard drive in excess of 2GB until 2000, by which time Windows was using the FAT32 file system. A more pressing concern was the 8.3 filename format which MS-DOS restricted you to: files could only have a name of eight characters followed by a three-character extension. This led me to become quite imaginative with filenames, but it seemed to me like an arbitrary and senseless restriction, especially as Windows 95 allowed filenames of up to 255 characters.

Considering the size of hard drives in the days of Windows 3.1, it’s also bizarre that Microsoft chose to include so many files in “raw” formats, such as BMP and WAV files. My 80MB hard drive was sufficient to install MS-DOS and Windows completely without any problems, but sometimes I’d be pushed for hard drive space when I installed too many games. One way to quickly claw back space was to remove some of the large BMP and WAV files included with the OS, but I consider myself fortunate that I didn’t really see the potential for multimedia until I started using Windows 95 – or for videos until Windows XP was released. Otherwise, there really would have been some problems!

Looking back on it with experience, it’s clear that MS-DOS 6.0 and Windows 3.1 didn’t quite match up to their competitors. The Macintosh’s system software looked prettier; AmigaOS (which was just about dead by the time I got into computing) was technically superior in many regards, including its multitasking model; and OS/2 was more robust (to say nothing of Unix). There were also some particular limitations of the operating system which seemed questionable even at the time: the restrictions on filename length, the division of memory into conventional and extended memory (even if this was a solution to a problem that Microsoft didn’t create) and the difficulty posed by device drivers in MS-DOS.

Nevertheless, MS-DOS 6.0 and Windows 3.1 worked. They didn’t always work particularly well; there were some notable flaws and foibles, not all of them discussed above. They weren’t the peak of technical development, but they didn’t have to be. All they had to be was good enough for the majority of computer users, and given that I didn’t have any need then for the robustness or the sophistication of some other operating systems, they were good enough for me.

RAK’s OS Adventures Double Bill – Atari TOS & SymbOS

Author’s Note – Since I didn’t write an article two weeks ago, I’ve decided to try to make up for that somewhat by writing a set of short articles on a couple of operating systems that I’ve recently acquainted myself with. Hope you enjoy!

Historical Operating Systems – Atari TOS

As I’ve indicated before ([1], [2], [3]), the mid-1980s transition from 8-bit to 16-bit processors in personal computers marked a turning point, from computers which would boot up into a BASIC interpreter to ones which incorporated full graphical user interfaces, and towards the development of multimedia programs such as WYSIWYG word processors, desktop publishing packages and graphics manipulation suites. Many of the competitors in the preceding 8-bit computer wars had been lost to market pressures, leaving only a few, primarily American and British, companies to fight it out in the late 1980s.

Atari was one of these competitors, and one of the few console-producing companies to survive the Great Video Game Crash of 1983, despite two Atari 2600 games being partially responsible for the crash: the poor-quality port of Pac-Man, with its flickering sprites, and the turgid, over-produced E.T. the Extra-Terrestrial. Nevertheless, the crash of the video game console market led Atari’s parent company, Warner Communications, to sell off Atari.

Atari’s computer division was responsible for the Atari 400/800 computers which had been strong competitors against the Apple II and Tandy Radio Shack TRS-80 computers, and had maintained sales against the Commodore 64. With strong sales in Europe, where the Commodore 64 had a harder time competing against the low-price British computers such as the Sinclair ZX Spectrum, this part of Atari’s operations was still desirable.

One man to whom Atari looked particularly desirable was Jack Tramiel. Tramiel had founded Commodore in the 1950s, first repairing and selling typewriters, then moving on to calculators before branching into the nascent computer market. Commodore’s PET and VIC-20 models had been highly successful, but it was the Commodore 64 which would forever cement the company’s name in computer history. Tramiel’s market tactics involved rapidly dropping the price of the Commodore 64, undercutting every other computer in the North American market. These tactics would end up making the Commodore 64 the single most successful home computer model ever produced and wiping out many of its competitors.

It has been suggested that one of Tramiel’s main targets was Texas Instruments, who had almost ruined Commodore during the 1970s, and the continuous price-dropping of the Commodore 64 did end up wiping Texas Instruments from the home computer market in the process. Nevertheless, this aggressive pricing strategy came at the cost of profits, and in January 1984, Jack Tramiel was unceremoniously booted out of Commodore.

All of this made Atari a rather tempting acquisition for Tramiel, who bought the company in 1984. Atari originally had their eyes on the Amiga, but after losing a protracted battle with Commodore for the Amiga chipset following Tramiel’s takeover, Atari had to develop a replacement. The computer they developed was the Atari ST, or Sixteen/Thirty-Two (a reference to the Motorola 68000’s 16-bit external bus and 32-bit internal architecture). Arriving on the market two months before the Amiga, the Atari ST would remain competitive through the late 1980s, particularly in Europe, where the home computer market moved more slowly towards IBM PC-compatibles, but like the Amiga and most other competitors, it would lose ground into the 1990s.

The operating system of the Atari ST was Atari TOS, built around a licensed version of the Digital Research GEM environment. Originally shipped on floppy disc, it was later incorporated onto a ROM chip, similar to the contemporary RISC OS for the Acorn Archimedes. Unfortunately, Atari TOS feels like the sparsest of the operating systems of the late-1980s 16/32-bit microcomputers, even considering that it fits on a ROM chip rather than the floppy discs used for most other contemporary operating systems. While RISC OS had a powerful BBC BASIC interpreter that could be used to build graphical applications, and AmigaOS and Mac OS at least came with calculators or clocks, Atari TOS really only came with the basic OS utilities and not much more.


At least graphically, Atari TOS matched up to early versions of AmigaOS and RISC OS – although these graphics would remain into the 1990s.

What’s more, Atari TOS was a single-tasking operating system right up until the MiNT-based MultiTOS package shipped in the 1990s; this at a time when AmigaOS had possessed pre-emptive multitasking since 1985, Mac OS had gained cooperative multitasking in 1987, RISC OS had offered cooperative multitasking since its release and Microsoft Windows was giving even x86-based MS-DOS computers multitasking capabilities. Given that I have complained about the slow adoption of multitasking in Mac OS and the cooperative multitasking model of RISC OS, it stands to reason that I do not regard Atari TOS as an especially sophisticated operating system, even by the standards of the 1980s.

It’s a shame, in a way, since the Atari ST itself seems to have been a sophisticated yet inexpensive computer that could compete well against the Macintosh and Amiga in terms of pure grunt. Plenty of games were cross-developed between the Amiga and Atari ST, the common Motorola 68000 microprocessor helping matters. The Atari ST was also the first platform for which the Allegro game library was developed (hence the name, originally derived from Atari Low-Level Game Routines). The operating system, on the other hand, did not match the sophistication of the computer it ran on. While Mac OS and the early versions of Microsoft Windows led to greater development down the line, AmigaOS was a sophisticated operating system ahead of its time, and RISC OS is still being developed today for a processor architecture that is very much alive, Atari TOS doesn’t really seem to have any particular historical significance beyond the fact that it was there.

RAK’s OS Adventures – SymbOS

From there, I move on to an operating system which is far more sophisticated than its hardware requirements would indicate. The 8-bit computer wars of the early 1980s typically revolved around two types of processor. The MOS Technology 6502 was used in the Commodore 64, the Apple II and the BBC Micro, among others, and could indirectly trace its lineage to the Motorola 6800. The Zilog Z80, on the other hand, was used in the Tandy Radio Shack TRS-80, the Sinclair ZX Spectrum, the Amstrad CPC and PCW ranges and the MSX computers, and used an extended version of the instruction set architecture of the Intel 8080.

In certain regards, the MOS 6502 is a superior processor; while Z80 processors typically ran at a clock rate of approximately 4MHz in most early-1980s microcomputers, MOS 6502 processors could be competitive at a quarter of that clock rate. In other regards, the Z80 has the advantage; with more registers and a more sophisticated instruction set, it is a friendlier processor to program for. Both processors are still sold today, mostly for embedded systems; Zilog still exists and has a strong market selling Z80 and Z80-compatible processors, some quite a bit more powerful than the 1980s models (RS Components currently lists a 20MHz Zilog Z180 at €11.40 for an individual processor), while Western Design Center, established by one of the 6502’s original designers, still sells 6502-compatible processors.

No matter how popular these processors may have been in their day, they seem rather old-hat even by the standards of the Intel 80286, the Motorola 68000 or the ARM2, let alone today’s multi-gigahertz Intel processors. Nevertheless, sometimes sophisticated programming can trump raw power, as seen with SymbOS, an operating system for Z80-based computers.

SymbOS is the project of a German programmer, known by the pseudonym Prodatron, and is developed for the Z80-based Amstrad CPC, MSX and Amstrad PCW computers. The Amstrad computers were particularly popular in Germany, where they sold in numbers comparable to the Commodore 64 in that market, while the MSX computers were developed as a standard for Japanese microcomputers. These computers typically came with 64 or 128kB of RAM and Z80 processors clocked at 4MHz – only moderately powerful for 1984 or 1985, when these computers were released.

All of that makes SymbOS look especially impressive when you consider what you get out of it. SymbOS provides full pre-emptive multitasking of the same level of sophistication as AmigaOS, but on a computer with half the RAM and a processor a generation older. It provides a full GUI system, similar in appearance to that of Microsoft Windows 95, similar in sophistication to Atari TOS or Mac System 6 – and more sophisticated than Microsoft Windows 1.0. It can run sound and video applications. It can even support 128GB FAT32 hard drives, something Microsoft Windows itself couldn’t do until FAT32 arrived with Windows 95 OSR2.


It even looks pretty nifty as well.

Yes, the OS was first released in 2006, twenty years after the release of the computers it runs on. Yes, it’s not the only graphical user interface for an 8-bit computer – GEOS, for instance, was originally developed for the Commodore 64. But it’s got pre-emptive multitasking, sound and video capabilities and support for large hard drives, all on an 8-bit processor originally developed in the 1970s, and with as little as 128kB of RAM (although more is suggested for best performance). It makes me wonder: if this sort of operating system can be developed for an 8-bit computer, could some of the 32/64-bit operating systems of today be improved if there were generally more proficient programmers developing them?

Historical Operating Systems Reborn – RISC OS and the Raspberry Pi

The early-1980s 8-bit microcomputer battle brought the personal computer from a hobbyist’s plaything to a genuinely useful device for general use, and was fought by a host of companies. Most of these companies were from the United States, such as Commodore, IBM, Apple and Atari, but various British companies played a significant part including Sinclair, Amstrad and Acorn. By the mid-1980s, many of the smaller competitors had fallen by the wayside, and even the once-strong Sinclair Research had been bought up by Amstrad.

The big players who remained decided to produce more powerful machines using newer processors than the MOS 6502 and Zilog Z80 8-bit processors common in the early 1980s. Commodore bought up the Amiga Corporation, which had designed an eponymous computer; Apple designed the Macintosh; Atari developed the Atari ST; and IBM continued to develop their IBM PC platform. Most of these computer designs, with the notable exception of the IBM PC, were based around the Motorola 68000 processor. As Amstrad decided to focus on their PCW series of word processors, discontinuing the disappointing Sinclair QL, this left Acorn alone in the British market to fight the battle of the post-8-bit era.

Acorn decided to take a different approach from the American companies, focusing on the educational segment rather than the business, desktop publishing and multimedia markets pursued by Commodore, Apple, Atari and IBM. Instead of using the Motorola 68000 processor familiar from other computers of the time, Acorn decided to design their own processor, using the then-novel RISC architectural principles to develop the Acorn RISC Machine processor, better known as ARM.

In 1987, Acorn released the Archimedes. The ARM2 processor which Acorn used proved to be a great advantage for the Archimedes, with a simple, power-efficient design which nevertheless performed calculations about twice as quickly as a 68000 at the same clock speed. Allied to the ARM processor was Acorn’s Arthur operating system, which came on a ROM chip similar to the Amiga’s Kickstart ROM. Arthur, on balance, was on par with, or not far behind, the Commodore Amiga’s famously advanced OS, and ahead of the single-tasking operating systems used by Atari and IBM.


The Acorn Archimedes – one of the several advanced computers of the late 1980s.

Unfortunately for Acorn, the Archimedes was not a particular sales success. Its focus on the educational market had come at the cost of the multimedia coprocessors available in the Amiga and Atari ST, leading to a system that was too expensive and not good enough at gaming for a home audience. Meanwhile, the business market became consumed by IBM and the various clones which arose from the easily-reverse-engineered BIOS of the IBM PC and its successors. Nevertheless, Acorn persisted and continued to develop new machines with more advanced operating systems. Arthur was updated, becoming RISC OS in the process, keeping to the same general structure but gaining new features.

Eventually, Acorn fell by the wayside, suffering a similar ignominious fate to Commodore and Atari as the personal computer market gradually became dominated by IBM-compatible computers with Intel processors. Apple managed to cling onto life during some very slim years, moving to the PowerPC architecture along the way, but eventually gave in and took up Intel x86 processors as well, moving their BSD-derived Mac OS X operating system over to the new architecture.

Acorn has had one significant lasting legacy, however – the intellectual property for the ARM processor was divested into a new company, ARM Holdings, which collaborated with Apple to continue developing the ARM architecture for Apple’s own devices. Today, ARM is the most popular 32-bit processor architecture in the world, underpinning everything from smartphones and tablets to embedded processors inside other devices.

RISC OS has survived as well, with the intellectual property for the Acorn computers sold to Castle Technology Ltd., a small British company which has continued to develop ARM-based personal computers using RISC OS. A small but dedicated community grew up around the company, much like the remnants of the Amiga or Atari ST communities, and has continued to support the OS.

Now, we have the Raspberry Pi. The inexpensive, credit-card-sized computer has been a massive success, demonstrating a far simpler, more hackable approach to computing than has become usual today. Something that has been a pleasant surprise is how readily the RISC OS community has decided to support the Raspberry Pi.

Given that until recently every computer I owned had an Intel processor, I had never had an opportunity to try RISC OS on anything but an emulator. However, I sometimes despair at the sheer homogeneity of the personal computer market, even though I have contributed to it for many years. Now, I have been granted a chance to try an operating system natively on modern hardware that isn’t part of the Microsoft Windows, Mac OS X or Linux families.

My first impression on booting up RISC OS 5 was that it really does boot as astoundingly quickly as others said it would. Frankly, this shouldn’t have been a surprise; not only is RISC OS still designed with the StrongARM processors of the Acorn Risc PC in mind, it is still developed to fit a 6MB ROM image, and is therefore extraordinarily well tuned for its environment. I had used RISC OS on the ArcEm emulator about four years ago, so I knew that RISC OS was slim and fast in the early 1990s, but it’s nice to see that this behaviour persists today. The same sort of responsiveness applies to the shutdown process as well: RISC OS has instant shutdowns. There’s none of this behaviour where a shutdown takes almost as long as the boot process – as soon as you click the Shutdown option, short of certain file operations being in progress, the computer is immediately ready to shut down.

After about ten to fifteen seconds, the GUI environment had booted up. Two things were quickly apparent. The first is that the environment was immediately responsive as soon as it had finished loading, unlike contemporary Windows or Linux desktop environments, which, depending on the number of background processes set to start, can leave you waiting a minute or more for full responsiveness.

The second thing is that the RISC OS GUI environment is, in fact, very pretty. Mac OS X and iOS are often held up as being the exemplars of pretty environments, but I’d argue that RISC OS is, in its own ways, marginally prettier. Much of what Mac OS X does to ensure its pretty environment is down to impressive, shiny graphics and high-resolution displays, whereas RISC OS manages to look good at 640×480 on a simple non-high-definition television screen.

A lot of this is down to the inherent design philosophy of RISC OS. The original Arthur OS for the Archimedes was the first operating system to incorporate a dock, or in RISC OS parlance, an icon bar. The icon bar distinguishes between application icons, set to the right-hand side of the icon bar, and storage devices, set to the left-hand side of the bar. This helps to create a distinct divide between applications and devices which store applications and data. In comparison, the Mac OS X dock can occasionally look a bit untidy and busy when you load up too many applications at once.

Another detail in RISC OS’s favour in the design stakes is the high-quality anti-aliasing technology that has been a part of the operating system since 1989. The renderer is designed, as are some of the more recently designed competing technologies available in other operating systems, to render type accurately at the cost of readability, but frankly, even at the 640×480 resolution I have been using, the typefaces still look clean and legible, which helps make the interface look clean and stylish.


RISC OS – stylish even at low resolutions, even better in high definition.

Enough about the style – how about the substance? It turns out that you get quite a few things even from your 6MB ROM image, including the full GUI environment, a text editor, a vector graphics program, a simple scientific calculator and a BBC BASIC interpreter.

Of course, it seems awfully odd and antediluvian to be supporting a BASIC interpreter in 2013, but BBC BASIC was one of the most sophisticated BASIC interpreters of its time, and its move to RISC OS extended it with the capacity to write full, multitasking GUI applications. BBC BASIC is also one of the most optimised and rapid interpreted languages on any platform, proving sufficiently quick for the entire Arthur GUI to be written in it. The interpreter also includes support for inline ARM assembly language, providing a low-level programming environment inherent to the system. Few other operating systems have any inherent capacity for programming, and while Linux, Mac OS X and other Unix and Unix-like operating systems typically offer programmability through their command shells, that isn’t going to fit in 6MB along with a GUI environment.

Unfortunately, when it comes to other applications, RISC OS currently looks a bit sparse. Given that the operating system has been maintained by a single, small company and kept alive mainly by hobbyists, this is to be expected, but you’re certainly not going to have the wealth of software that you have on Linux or Mac OS X, let alone Windows. This may improve if the community grows with the popularity of the Raspberry Pi, but it will prove difficult to use RISC OS for most serious work right now.

From a technical perspective, RISC OS is a very different beast to the three most popular desktop operating systems. Microsoft Windows comes from a lineage that incorporates elements of CP/M, OpenVMS and so on, while Mac OS X and Linux are obviously derived from Unix. RISC OS doesn’t derive from either lineage – or from any other apparent one. Directory paths are delineated by full stops rather than slashes, for instance. Disc formatting uses the proprietary ADFS system first developed for the BBC Micro. Files don’t have extensions by default; the file type is instead recorded as a twelve-bit file type number (conventionally written as three hexadecimal digits) stored separately as metadata, and when extensions are used, perhaps on files imported from another operating system, the extension is delineated from the name by a forward slash.
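As an illustration of the path conventions just described, here is a rough Python sketch (the example path and disc name are made up for illustration) of how a RISC OS-style path might be translated into Unix-style form, with ‘.’ as the directory separator, ‘$’ as the root and ‘/’ marking an imported extension:

```python
# Rough sketch: translating RISC OS path conventions into Unix-style ones.
# The path and disc name below are hypothetical examples.

def riscos_to_unix(path: str) -> str:
    """Convert a RISC OS-style path to a Unix-style one (illustrative only)."""
    # Strip a filing-system prefix such as "ADFS::HardDisc4." if present
    if "::" in path:
        _fs, _, path = path.partition("::")
        path = path.partition(".")[2]  # drop the disc name before the first '.'
    parts = path.split(".")            # '.' is the directory separator
    if parts and parts[0] == "$":      # '$' marks the root directory
        parts = parts[1:]
    # '/' separates a name from an imported extension, so swap it back to '.'
    parts = [p.replace("/", ".") for p in parts]
    return "/" + "/".join(parts)

print(riscos_to_unix("ADFS::HardDisc4.$.Documents.Report/txt"))
# prints /Documents/Report.txt
```

This is only a toy translation, of course; a real interoperability layer would also have to map the file type metadata onto extensions.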

One of the most distinctive details of RISC OS is how it deals with applications. Application names always begin with an exclamation mark, and RISC OS applications more closely resemble directories in other operating systems than the executable files of Windows or Linux. In fact, RISC OS applications are extraordinarily modular in nature – you never have to “install” an application on RISC OS as you would in Windows, and you can just drag an application icon onto the icon bar to open it.

Another particularly distinctive detail of RISC OS comes from the way it handles the mouse. Acorn designed the Archimedes with a three-button mouse from the very start, and each of the buttons on the mouse has a very individual function. Unlike Windows, Mac OS X or Linux – or most other desktop GUI systems – RISC OS has a separate Menu button set to the middle button, and therefore applications are not expected to have a program-specific menu bar, a Ribbon interface or anything like that. The middle button performs menu tasks in every application, including those normally done by the right mouse button in Windows or Linux.

The other two button functions are Select, set to the left mouse button and performing tasks similar to the left mouse button in other desktop operating systems, and Adjust, set to the right mouse button. Adjust performs various functions, ranging from an alternate way to perform various tasks in most programs to an alternate menu for some application icons.

There are some places where RISC OS betrays its Eighties origin, though, and not necessarily in a good way. RISC OS uses cooperative multitasking rather than the pre-emptive multitasking common in operating systems from Unix to Microsoft Windows to AmigaOS and others besides. I have, in the past, been quite disparaging about the use of cooperative multitasking in any operating system, including RISC OS, and using RISC OS, it’s clear that it is an underlying disadvantage of the system.

I’m quite fond of pushing my systems to the limit when it comes to multitasking – it’s common for me to have a web browser, a word processor, a music player, a PDF reader and the file manager for my operating system all open at one time, with other tasks perhaps happening in the background. With a pre-emptive multitasking system, the programs are given a fair share of the computer’s free time, only occasionally locking up because one task is a bit too greedy with the clock cycles. With a cooperative multitasking system, it’s more difficult to run multiple applications at once, since one program that is badly designed or simply resource-heavy can lock up the system until it resolves. Using RISC OS for multimedia applications at the same time as performing a processor-heavy task is therefore a potential no-go area, which is a pity considering how smoothly the system runs on a single task.
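The failure mode described above can be sketched with a toy cooperative scheduler in Python – purely illustrative, since real RISC OS tasks yield control via Wimp_Poll rather than anything like Python generators. Each task runs until it chooses to yield, so a single long burst of work stalls every other task:

```python
# Toy cooperative scheduler: tasks are generators that run until they
# choose to yield. One greedy task holds up everyone else, which is the
# core weakness of cooperative multitasking described above.

log = []

def well_behaved(name, steps):
    for i in range(steps):
        log.append(f"{name}: step {i}")
        yield  # hand control back to the scheduler voluntarily

def greedy(name, work):
    total = sum(range(work))  # long burst of work before yielding even once
    log.append(f"{name}: burst done, total={total}")
    yield

def cooperative_scheduler(tasks):
    queue = list(tasks)
    while queue:
        task = queue.pop(0)
        try:
            next(task)          # run the task until it yields...
            queue.append(task)  # ...then send it to the back of the queue
        except StopIteration:
            pass                # task has finished

cooperative_scheduler([
    well_behaved("editor", 3),
    greedy("renderer", 1_000_000),  # every other task waits for this burst
    well_behaved("player", 3),
])
print("\n".join(log))
```

A pre-emptive kernel would interrupt the greedy task mid-burst; here, the editor and player simply wait until it deigns to yield.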

Mostly, though, I like how different RISC OS 5 feels to other contemporary operating systems. Certain technical details, such as the obsolete cooperative multitasking model, make it difficult to recommend for everyday use right now, while the relative lack of applications also works against it. However, being allied to the Raspberry Pi could well give RISC OS a renewed lease of life, especially in the educational sector where it would be perfect for demonstrating that not every operating system is, or even has to be, the same as Windows or Mac OS X. In that sense, the OS could come full circle – from its educational roots right back around to them again.

Historical Operating Systems: Version 7 Unix

The story of Unix is one of the most interesting in computing history, a story of how a little pet project from one computer researcher in AT&T’s Bell Labs spread to become one of the pillars on which modern computer systems are built. All stories, of course, must begin somewhere, and the story of Unix begins in 1969, shortly after Bell Labs dropped out of the Multics project, fearing the “second-system effect” that had caused the likes of IBM’s OS/360 series of operating systems to be delivered late and bloated. In the wake of this, one of the Multics researchers, Ken Thompson, decided to port a game that he had written for the Multics system to something more cost-effective than the General Electric GE-645 mainframe that the game originally ran on.

In the process of porting the game, Space Travel, to a little-used PDP-7 minicomputer belonging to Bell Labs, Thompson and another former Multics researcher, Dennis Ritchie, ended up implementing an operating system which took influence from Multics while trying to avoid some of its pitfalls. This was the first version of what was originally called Unics, named by Brian Kernighan as an allusion to Multics; it featured some very modern features for an operating system for any system of that era, let alone one implemented on a cheap minicomputer. Among the features of the initial Unics system were a full user-accessible pre-emptive multitasking system, a hierarchical file system and a command-line interpreter. As research into Unics continued, more features were included, making it into a truly multi-purpose operating system with multi-user capacity, and Unics was renamed Unix.

Such features were possible on a computer with limited resources because of a certain minimalism in the operating system design, which relied on the modularity of several small utility programs to provide capabilities then uncommon even in mainframe operating systems. AT&T agreed to allow Thompson, Ritchie and the other Unix researchers to port the operating system to a more powerful PDP-11/20 minicomputer in exchange for the development of a typesetting system for the Unix system; it was on this platform that the operating system was rewritten in Ritchie’s C programming language, thus becoming one of the first operating systems to be written in a high-level language. C owed – and still owes – its success to much the same principles as Unix, namely a minimalistic, modular flexibility that belies its simplicity and allows complex techniques to be performed with simple components.
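That modularity principle – complex behaviour built from small, single-purpose tools – can be illustrated (anachronistically, in Python rather than period C) as a pipeline of tiny functions, equivalent in spirit to `cat file | grep usr | wc -l`:

```python
# Small single-purpose "tools" composed into a pipeline, in the spirit
# of the Unix modularity described above (illustrative sketch only).

def cat(lines):
    yield from lines                           # emit input unchanged

def grep(pattern, lines):
    return (l for l in lines if pattern in l)  # keep matching lines only

def wc_l(lines):
    return sum(1 for _ in lines)               # count lines, like `wc -l`

records = ["usr alice", "usr bob", "sys daemon"]
# Equivalent in spirit to: cat file | grep usr | wc -l
print(wc_l(grep("usr", cat(records))))         # prints 2
```

Each piece is trivial on its own; the power comes from composition, which is exactly what made such capabilities affordable on a minicomputer.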

Under the terms of an anti-trust lawsuit against the Bell System, AT&T was prohibited from commercialising any computer technology it developed; in fact, it was obliged to licence its technology to any organisation which wished to use it. Therefore, Unix was distributed to universities, commercial firms and the United States government under a licence which distributed the Unix source code along with the binaries; this gave many computer science students a look at the innards of an operating system in a way that was impossible with many other operating systems. The implementation of Unix in a high-level language like C allowed it to be reimplemented on systems other than the original PDP-11; this was a demonstration of flexibility formerly unknown in the world of computing. The availability of the source code allowed Unix to be easily reverse-engineered, leading to a continuing development of new versions even as AT&T were finally allowed to commercialise it in 1982. In 1983, both Ken Thompson and Dennis Ritchie received the ACM Turing Award, the highest honour in computer science, for their joint development of Unix and of C, an award well-deserved for an operating system to which many modern operating systems, such as Mac OS X, Linux, Solaris and others, owe their lineage.

Version 7 Unix, developed in 1979, was the last version of Research Unix to see wide distribution; its influence is still felt in all Unix and Unix-like operating systems today. Version 7 was the first version of Unix to include the Bourne shell, which succeeded the Thompson shell found in previous versions and allowed greater programmability in a manner resembling ALGOL 68; it also included several new features such as awk, yacc, Make and the first instance of the malloc() routine now included in the C standard library. All of this made for a powerful programming environment by the standards of the late 1970s.

Back in 2002, Caldera International released all of the Unix versions up to Version 7 and its 32-bit VAX port, Unix 32V, under a free software licence allowing free use of all of the source code, as well as distributing the original disc images. As I always explore the environments of any operating system I write an article about, I used one of these disc images with a PDP-11/45 simulator on SIMH with 256KB of memory. The first thing I noticed when I booted the simulator into Version 7 Unix was how usable it was by modern standards. OK, the shell lacks any of the modern trappings like command history, aliases or even a display of the current directory on the command prompt (that’s what the pwd command is for!), the text editor is the infamously terse and cryptic ed and the C compiler uses the historical, pre-ISO standard K&R dialect of C, but the operating system still shares enough features with a modern Linux or Unix command line for me to use my previous knowledge to reasonable effect.

Version 7 Unix - Source Code

Version 7 Unix – back when programmers could do more with 256KB of memory than some modern programmers can do with 4 gigabytes.

The basic utilities of the early Research Unix versions did seem to require you to be a programmer of some sort to get any real use out of them; programming tools on the disc images of Version 7 Unix include compilers for C and FORTRAN 77, an interpreter for a variant of BASIC, the Ratfor preprocessor for FORTRAN and an assembler for native code. yacc, lex and other additional programming tools round out the Unix programming environment.

Editing your source files requires the use of ed, a simple line editor which can still be found in modern Unix and Linux systems, but which is seldom used, having been displaced by the likes of vim, GNU Emacs and GNU nano. The terse, cryptic syntax of ed was once infamous; almost all commands are entered using a single alphanumeric character for the command, plus some numbers and symbols as arguments, while the editor itself has two modes, similar to vi, except that there is no way of telling them apart by sight. Like many of the early Unix utility programs, ed was designed for the limitations of teletypes; in this case, it really shows.

As Unix was allowed to expand for purposes including typesetting, it should be evident that some of the other tools on Unix were developed with that in mind. The likes of troff and nroff were designed for typesetting on a C/A/T typesetter, a device which allowed typesetting without expensive devices like Monotype or Linotype hot metal typesetters. By 1979, the C/A/T typesetter was becoming obsolete, but Brian Kernighan had not yet completed his device-independent version of troff by the time that Version 7 Unix had been released; the version of troff used in Version 7 Unix was a version written in C by its original author, Joe Ossanna.

Not all of Version 7 Unix’s programs were serious in nature, just as not all of a modern desktop operating system’s programs are serious. As Unix was originally designed as an environment for porting a game onto a new computer, it is to be expected that Unix has a few games on it. The games included a chess game, a game of Hangman and the famous Hunt the Wumpus – with the source code full of goto statements!

Hunt the Wumpus

Killed the Wumpus on the first try!

A fair amount of the source code for these programs is available as-is on the disc image I used, including many of the utility programs, a few of the games and the mathematical libraries. Comparing these bits of source code with the likes of the GNU Coreutils, the modern descendants of the old Unix programs, one notices that the Version 7 Unix utilities are a lot sparser – although one might argue that they are more elegant – than the GNU utilities. The GNU echo utility in version 8.15 of the Coreutils is 275 lines long and handles features such as interpreting backslash escape sequences, including hexadecimal character codes; the Version 7 Unix echo command is barely 20 lines long and has a single flag controlling whether it should print a newline at the end of its output. One may argue that the GNU echo command is far more flexible, and it is, but one might also argue that the Version 7 Unix echo command more closely resembles the original intent of Unix. Such arguments begin “holy wars”, though, and as I don’t really have a strong enough grasp on the utility of such commands to truly judge them, I’ll leave the argument there.
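For a sense of just how little the Version 7 echo does, here is a hedged Python sketch of its behaviour – arguments joined by single spaces, with a lone -n flag suppressing the trailing newline. This mimics the documented V7 semantics rather than being a translation of the original C:

```python
import sys

def v7_echo(args):
    """Sketch of V7 echo semantics: join arguments with single spaces;
    a leading -n suppresses the trailing newline. No escape processing,
    no other options."""
    args = list(args)
    newline = True
    if args and args[0] == "-n":
        newline = False
        args = args[1:]
    out = " ".join(args)
    return out + "\n" if newline else out

sys.stdout.write(v7_echo(["hello", "world"]))   # prints "hello world"
sys.stdout.write(v7_echo(["-n", "no trailing newline"]))
```

The entire behaviour fits in a dozen lines; everything GNU echo adds on top of this is, depending on your outlook, either flexibility or bloat.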

What is clear, though, is that Version 7 Unix looks modern and familiar enough to clearly be the ancestor of many of today’s operating systems. It may not have a flashy graphical user interface like modern Unix and Linux variants, but when you get into the guts of those modern operating systems, you find something that looks very much like the same Research Unix systems that the likes of Ken Thompson, Dennis Ritchie and Brian Kernighan were programming on over thirty years ago. The code is different, especially in the GNU-derived operating systems, where it was realised that to replace Unix, you must first replicate it faithfully, but the utilities have the same style of usage.

Even more of an influence on the world of computing was the C programming language, which underpins not only Version 7 Unix but almost every serious operating system still in use; by underpinning Unix, C proved itself a serious contender in the systems programming field at a time when operating systems implemented in high-level languages were limited to mainframes. As system resources have grown, C’s minimalist modularity and flexibility have proven up to the task of scaling to modern computer systems. There truly is no better memorial for Dennis Ritchie than the language he invented back in 1972, and there will be no better memorial for Ken Thompson than the operating system which changed the world of computing utterly.

Historical Operating Systems – Apple’s System 6

Apple’s approach to the computer market has always rested on its hardware, with a deeply integrated approach to its operating system software which is still obvious even with the Intel-based hardware of today’s Macintosh computers and the Unix-based operating systems used across almost all Apple hardware. In the 1980s, when Apple was one of the biggest customers for Motorola 68k processors and relied on its own proprietary hardware, this integrated approach was even more obvious, with a stylistic approach which stood apart from competitors such as Commodore’s Amiga, and especially from the distinctly ugly early versions of Microsoft Windows found on the myriad IBM PC-compatible machines on the market.

System 6 was the penultimate major revision of the Macintosh System Software designed for Motorola 68k processors. It was released in 1988, the year after the release of the Macintosh SE, an integrated all-in-one unit which resembled the original Macintosh, and the Macintosh II, the first Macintosh with a separate monitor and the first able to display colour images. Ultimately, like its predecessors, System 6 descended from the system software of the original Macintosh 128k, although it had developed considerably since then.

The system software of the Macintosh 128k was graphically impressive, but with a single-tasking approach closer to that of the command-line operating systems it competed against, it wasn’t especially impressive technically. Given the obvious capacity for a GUI to run more than one program at a time – demonstrated on the Macintosh’s predecessor, the Lisa – this was a somewhat disappointing state of affairs. It was only rectified with the release of System Software 5 in 1987 and its MultiFinder extension, which allowed cooperative multitasking for the first time.

Because System 5 and System 6 could run on older machines, this multitasking was extended back to the Macintosh Plus and the 512Ke. Cooperative multitasking was not the optimal way of running more than one program at a time: it relied too heavily on each application voluntarily ceding CPU time, and led to problems such as entire networks slowing down because somebody held down a mouse button for too long. The Amiga, released two years previously, used pre-emptive multitasking, a technically superior option, although one which required more complicated underlying software and imposed a slight additional load on the CPU. Apple’s operating systems, on the other hand, would rely on cooperative multitasking all the way up to Mac OS 9, and even Mac OS X had to retain the capability for the sake of backwards compatibility.
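The weakness of the cooperative model is easy to demonstrate: each task must volunteer to give up the processor, so a single task that never yields starves all the others. Below is a minimal sketch of a cooperative round-robin scheduler – purely illustrative, and nothing to do with Apple’s actual MultiFinder implementation:

```python
from collections import deque

def cooperative_scheduler(tasks):
    """Round-robin over generator-based tasks.  Each task runs until
    it voluntarily yields; a task that never yielded would starve the
    rest -- the core weakness of cooperative multitasking."""
    queue = deque(tasks)
    log = []
    while queue:
        task = queue.popleft()
        try:
            log.append(next(task))  # run the task until its next yield
            queue.append(task)      # put it back at the end of the queue
        except StopIteration:
            pass                    # the task has finished
    return log

def worker(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"

print(cooperative_scheduler([worker("A", 2), worker("B", 2)]))
# ['A:0', 'B:0', 'A:1', 'B:1']
```

A pre-emptive scheduler, by contrast, interrupts tasks on a timer whether they cooperate or not, which is why it needs more complicated underlying machinery.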

Where System 6 lacked technical sophistication, however, it arguably made up for it in style. Where its GUI contemporaries, such as AmigaOS and RISC OS, started off with garish colour palettes and only later developed styles more comfortable to the eye, System 6 continued the slow evolution of the interface style that would persist on Apple’s platforms all the way up to the adoption of Mac OS X. With the Chicago typeface across the persistent top menu bar, the Trash can and the iconic Happy Mac and Sad Mac icons, the familiar elements of the standard Mac OS interface had already been set in place for the next ten years.

A typical screenshot from a Macintosh SE – which would still look familiar to users of Mac OS 9.

System 6 came on four 800KB floppy discs, or later on two 1.4MB floppy discs, and could be booted directly from the floppy drive, while also having the capacity to be installed on both internally- and externally-mounted hard drives. As the older Plus and 512Ke models weren’t provided with hard drives as standard, and as System 6 officially dropped support for the slow and expensive Hard Disk 20, which had been supplanted by SCSI-based drives in the newer computers, the ability to boot the operating system from the floppy drive gave older Macintosh users the features of the newer system software without having to buy completely new machines.

Even from floppy disc, System 6 is notable for its rapid loading time. Even on the likes of the Macintosh Plus or SE, the operating system will load to a fully working state in about thirty seconds, more quickly than a modern PC or Mac running the likes of Windows or Mac OS X. This rapid loading was helped by the extensive use of 68k assembly language in the operating system, unlike its successors, which leaned more heavily on high-level languages and were not as well optimised for their individual platforms. System 6 was obviously very tightly integrated with the Macintosh architecture – a situation which would have suited Apple right down to the ground.

Ultimately, though, that tight integration with one processor architecture and one line of hardware couldn’t persist forever. While System 7 would run more slowly than System 6 on any 68k-based Macintosh, and could no longer be booted from a single floppy disc, it was portable to the PowerPC processors which Apple had planned for its computers since 1991 and finally adopted in 1994. System 7 also had support for 32-bit memory addressing, rather than the 24-bit addressing of System 6, which limited it to a scant 8 megabytes of RAM. While System 6 would run smoothly on a late-1980s or early-1990s Macintosh, its limitations weren’t in keeping with Moore’s Law.

Today, System 6 is one of the revisions of classic Mac OS provided for free by Apple on its support website. The North American versions of System 6.0.3, 6.0.5 and 6.0.8 are provided as compressed StuffIt archives which can be expanded into four 800K or two 1.4MB floppy images. These images, once extracted, will work in various Macintosh emulators, as well as giving users of old Macintosh systems a way to keep their machines going even if their original system discs stop working. It’s difficult to find many other operating systems from this time period – both AmigaOS and RISC OS are tightly controlled by their owners, while Windows versions predating Windows 3.1 are barely worth trying – so this is an interesting chance to see what GUIs looked like on the non-PC-compatible side of the fence.


Historical Computer Systems – The EDSAC

At the beginning of the computer age, there were various computer systems of differing construction and operational principles. Among these were the Z3, an electromechanical computer designed by the German civil engineer Konrad Zuse, which read its programs from punched 35mm film; the electronic Atanasoff–Berry Computer designed by John Atanasoff; Howard Aiken’s Harvard Mark I; and the famous ENIAC, designed by John Mauchly and J. Presper Eckert. While these machines varied in design and operation, they were all built with one specific task in mind: to make calculation quicker.

Before George Stibitz demonstrated the relay-based electromechanical Complex Number Calculator, one of the earliest electrical digital calculators, complex calculation was done using slow mechanical desk calculators. With the near-simultaneous developments of Zuse, Atanasoff, Aiken and Britain’s Alan Turing, which came just in time to assist the war effort in the Second World War, the world had seen the beginnings of a technology which would eventually revolutionise society. First, though, the computer had to become more than an elaborate calculator.

In the years following the Second World War, work by Turing, Eckert and Mauchly, along with John von Neumann, the Hungarian-born polymath, led to the development of the stored-program computer using what became known as the von Neumann architecture. The release of von Neumann’s paper, First Draft of a Report on the EDVAC, led to the development of a number of stored-program computers around the world. The first of these was the experimental SSEM, or Small-Scale Experimental Machine, a prototype computer developed in Manchester and designed to test the Williams tube, an early form of computer memory. This, in turn, influenced the development of the EDSAC at the University of Cambridge.

The EDSAC, or Electronic Delay Storage Automatic Calculator, was the first practical electronic stored-program computer. Completed in May 1949, the EDSAC took input from punched tape and printed output through a teleprinter, both methods which would remain in use long after the EDSAC itself. Output could also be displayed on a series of CRT monitors, an interesting capability which would play a role in a historic piece of software for the machine.

A picture of the room-filling EDSAC at the University of Cambridge.

In the years preceding and during the operation of the EDSAC, a number of computer scientists and mathematicians were devising applications for computers which would go beyond the military and academic mathematics tasks that computers had been performing to that point. Alan Turing was conceiving of artificial intelligence, others were considering the mathematics of fields such as chemistry and biology, and a few people had even considered experimental computer games. One of these was A. S. Douglas, a mathematician at Cambridge who, as part of his doctoral thesis on human-computer interaction, designed one of the first computer games.

OXO, or Noughts and Crosses, to give it its full title, was, as the name suggests, a simulation of tic-tac-toe designed to test the EDSAC’s capacity for doing things other than routine mathematics. Being a game with simple rules, it wasn’t very sophisticated by today’s standards, but it worked – something notable when discussing a computer made less than a decade after the very first electronic computers.
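The rules really are simple enough to fit in a handful of lines. The sketch below checks a noughts-and-crosses board for a winner – purely illustrative, and nothing to do with Douglas’s actual EDSAC code, which drew its grid on one of the machine’s CRTs:

```python
# The eight winning lines of a 3x3 board, as index triples
WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """board: a 9-element list of 'X', 'O' or ' '.
    Returns the winning mark, or None if nobody has won yet."""
    for a, b, c in WIN_LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

print(winner(["X", "X", "X",
              "O", "O", " ",
              " ", " ", " "]))   # prints: X
```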

The EDSAC design led to other applications as well. J. Lyons & Co., a now-defunct restaurant, food manufacturing and hotel company in Britain, whose products included the popular Ready Brek cereal, had invested in the EDSAC project. From the EDSAC design came LEO I (Lyons Electronic Office), one of the first commercial computers ever produced, and the first computer to be used for business applications. LEO I efficiently ran through inventory and payroll jobs, first for J. Lyons & Co. themselves and later for Ford UK. A LEO I computer was also used by the Met Office before it acquired a computer of its own in 1959.

The EDSAC as originally designed ran at a clock speed of 0.5MHz and had 512 17-bit words of mercury delay line memory. This was later expanded to 1024 words – just over 2kB of RAM, in modern parlance. All of this delivered approximately 700 operations per second – moderately quick for its day, but beatable by even the least sophisticated modern microcontroller. Given that the EDSAC required 12kW of electrical power to operate, it demonstrates just how far we’ve come in the sixty-one years since the EDSAC ran its first program.
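That “just over 2kB” figure follows directly from the word count and the word length:

```python
words = 1024            # memory size after the expansion
bits_per_word = 17      # EDSAC's word length
total_bits = words * bits_per_word
total_bytes = total_bits / 8
print(total_bytes)      # 2176.0 -- just over 2kB
```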

The EDSAC’s historical value is unquestionable, as the first practical von Neumann computer to be built. However, there are other reasons for the EDSAC to be historically interesting. The first is its applications outside of pure and applied mathematics, such as OXO and the business applications of the derived LEO I. The second is the existence of a fully-featured simulator for the EDSAC, made in 1996 by the computer scientist and historian Martin Campbell-Kelly. While it doesn’t give the experience of a heated room full of vacuum tubes and paper tape readers, it’s still an interesting insight into the programming techniques of a first-generation computer.

Reports from the start of the year also suggest that a replica of the EDSAC is to be built at Bletchley Park (home of the Colossus code-breaking computers during the Second World War). As few components remain from first-generation computers, and only a single complete machine is in existence – Australia’s CSIRAC, also known as the first computer to play digital music – this will prove one of the few chances to see first-generation technology in action. (EDIT 10/09/2013: As pointed out in the comments by Robert Dowell, the WITCH, a British computer also from the first generation, was painstakingly restored to working condition by The National Museum of Computing at Bletchley Park, and was made operational in 2012.)

The EDSAC was later superseded by EDSAC 2, another vacuum tube computer, which served until the mid-1960s. During its time, the EDSAC was responsible for accelerating several mathematical fields, as well as forming a bedrock for the British computer industry. LEO Computers Ltd., formed to sell LEO business computers, eventually amalgamated with English Electric, which in turn led to the formation of ICL, historically one of Britain’s most successful computer manufacturers. The von Neumann architecture underpinning the EDSAC would go on to underpin almost every computer which followed.

FROM THE ARCHIVE: Esoteric Operating Systems – OS/360 and its successors

Author’s Note: First of all, I know this is going to be a case of tl;dr for most people, but a hell of a lot of work went into this. Apart from a whole load of research, I spent a week trying to install and use OS/360 MVT in order to get a feel for what computing was like in the 1960s, so I do know some of what I’m talking about when it comes to OS/360. Trust me when I say: That shit is hard, and the sort of work that went into it was far beyond the scope of most computer users.

Secondly, it’s not a perfect piece. There are probably a few too many tangents off towards other computers and operating systems, but I was making a chronicle of a series of forty-five-year-old operating systems which defied the odds several times to still be in use today. It may not be the most riveting read in the world, but please at least give it a go.

Finally, if you’re actually interested in emulating an IBM mainframe, you can try the Hercules emulator, which is capable of emulating System/370, ESA/390 and z/Architecture mainframes. Keeping this maintained is what Tron Guy does in his spare time.

In 1964, IBM was the largest computer manufacturer in the world. Thomas Watson Jr. had brought the company forwards from mechanical tabulators into the electronic era, despite his father’s initial objections, and the company had gained much success with its 700/7000 series of scientific computers, the 650 model, which opened up computer access all around the United States, and the 1401, their highly successful business model. But despite this success, IBM found itself in a risky position. The computer models that IBM sold at that time were incompatible with each other, and customers were reluctant to upgrade because of the cost of moving over their programs and data.

The market needed standardisation, and IBM decided to risk its place at the top of a volatile computer industry, where many other companies had already failed, to design a standard system architecture which would serve businesses and scientists alike.

The venture was a success. IBM introduced the System/360 to the market in 1964, cementing its place at the top with hundreds of new customers, who were assured of the ability to expand as they needed. With a system architecture which would remain the standard in mainframes up to the present day, IBM managed to sweep many of its competitors from the market, establishing itself as the industry leader for years to come.

This picture might look a bit odd to some people. I mean, when is the last time you saw a programmer wearing a suit?

System/360 machines were highly sophisticated for their time, with some of the earliest hard disc drives providing megabytes of storage and near-instant access to data where competing machines only had the sequential storage of magnetic tape. Such an elaborate system required an equally elaborate operating system to be used correctly. Operating systems were a relatively new innovation, providing input/output routines without the necessity of programming them into every program, and IBM had already used them on some of its previous computers.

So, IBM planned two operating systems for its System/360 machines: OS/360 for punched-card batch processing, and the more advanced TSS/360 for time-sharing on more powerful System/360 computers. Both operating systems ran into development trouble, with TSS/360 never seeing a full release, but it was the development of OS/360 which became the most infamous.

Fred Brooks, one of the developers of the System/360 and project leader of the OS/360 project, released a book in 1975 detailing some of the problems which had emerged during the protracted development of the operating system. Named The Mythical Man-Month, the book set out some very important concepts in software engineering. These included an argument against the eponymous mythical man-month – the assumption that adding more man-power is always beneficial to a software project – with Brooks arguing that “adding manpower to a late software project makes it later”.

The most relevant of these concepts, however, was Brooks’ description of the so-called “second-system effect”, whereby IBM went from a series of small, efficient operating systems on its 700/7000 series to a large, late operating system on the System/360 in an attempt to include every feature left out of previous projects. As a result, OS/360 ended up as an extremely bloated OS for its time, requiring a lot of memory and expensive system resources at a time when programmers had to be highly efficient. Unfortunately, this principle has struck again in several later projects, including those by Microsoft (who are probably up to fourth-system effect by this stage!).

Because of OS/360’s high system requirements, IBM was forced to produce a second batch-processing system, named DOS/360 (for Disc Operating System, and not to be confused with the personal computer DOSes). This operating system lacked the multiprogramming support of OS/360, but was compatible with the lower-end business System/360 computers. Other interim measures existed in the BOS/360 (Basic Operating System) and TOS/360 (Tape Operating System) lines, but DOS/360 was the only one of these operating systems to become popular.

When OS/360 was finally released in 1966, it gave the System/360 the multiprogramming support it required. However, the operating system was designed for computer specialists, controlled through a language known as JCL (Job Control Language), designed to be quick for the computer to process rather than easy for humans to write. OS/360 was not an easy operating system to use: it was made for punched-card input and programming languages like COBOL and FORTRAN, with teams of perhaps a dozen programmers, keypunch operators and other technical staff.

Few computers at the time were any easier to use, with the first successful minicomputer only designed in 1961, but those machines, usually found in universities or small computing companies, could be programmed and operated by a single person. While the microcomputer revolution of the late 1970s and 1980s was still a long way away, the atmosphere around these simpler machines – including the TX-0 and TX-2 in MIT’s Lincoln Labs, and the recently produced DEC PDP-1 – was far more relaxed than that around the huge, complicated and expensive System/360s, to the point where the famous Sketchpad digital drawing program could be developed on the TX-2 and the seminal, groundbreaking computer game Spacewar! on the PDP-1.

In contrast, the stuffy atmosphere around IBM’s machines contributed to a somewhat oppressive image, and the languages used with them were derided by many of the early hackers for their inelegant syntax. There was no room to experiment with a System/360, because batch turnaround times demanded that programs be accurate first time. What is more, the social changes of the 1960s were creating a new generation of computer scientists, with more relaxed clothing styles and social mores than their predecessors, who had programmed in full suits and ties.

Meanwhile, as TSS/360 was cancelled due to even more protracted development than OS/360, replacement time-sharing systems were being programmed elsewhere. TSO (Time Sharing Option) gave users of the MVT variant of OS/360 a limited amount of time-sharing ability, and while this option would not become widely adopted by IBM’s business customers, who usually only required batch processing, it was more appreciated by scientists and military customers.

At the same time, IBM’s Cambridge Scientific Centre was developing a native time-sharing operating system. Running on the System/360-67 variant of the architecture, CP/CMS was a lightweight operating system specifically designed with scientific time-sharing in mind, and was built with an easier interface than the difficult OS/360. It wasn’t officially supported by IBM, being distributed in source form to those sites that desired it, mainly scientific rather than business customers.

By 1970, the success of the System/360 had easily paid for the risks undertaken and the protracted development of OS/360, and had helped IBM squeeze some of its rivals out of business. Where in the 1960s the mainframe producers had been colloquially named “Snow White and the Seven Dwarfs”, IBM had driven rivals to sell off their mainframe businesses in the face of System/360’s market dominance. Now, the mainframe manufacturers were called “IBM and the BUNCH”, “BUNCH” being formed from the initials of the five smaller manufacturers: Burroughs, UNIVAC, NCR, Control Data and Honeywell.

But things were not perfect for IBM. Incensed at the loss of the MULTICS project to General Electric, IBM needed to develop a new architecture in order to stay competitive. (In ironic circumstances, General Electric would be forced to sell its mainframe business to Honeywell in the late 1960s, and MULTICS would become a laughing stock as it was overtaken by a private project of Ken Thompson and Dennis Ritchie at Bell Laboratories.) So, with the developments made on the System/360 in mind, IBM developed its successor, the System/370. With the innovations of integrated circuit design and support for virtual memory included, among other improvements, the System/370 became more sophisticated than its predecessor, yet retained complete backwards compatibility with all of its code and even its operating systems.

This architectural improvement helped IBM to maintain its position at the top, and yet, unknown to it, the seeds of the demise of its core business had just been sown. One of these factors would come from a company co-founded by Robert Noyce. Noyce, one of the “Traitorous Eight” who had split off from Shockley Semiconductor, the company founded by the transistor’s co-inventor William Shockley, had decided to make his own way once more, founding a company called Intel. In 1971, an Intel team led by Ted Hoff produced the 4004, the first commercially available microprocessor – a fully-integrated “computer on a chip” – and this invention would have far-reaching consequences, as discussed later, which would change the face of computing forever.

Intel’s 4004 microprocessor led to a complete change in computing.

The other two factors in 1970 that would begin to erode IBM’s huge lead went hand-in-hand. The first was the rising prominence of the minicomputer, particularly in universities. DEC, developers of the PDP-1, had found much success with other PDP models, and while IBM had a few products present in this market, they failed to pay them much attention. The second factor was an operating system which just happened to be programmed on one of these PDP computers.

Developed at Bell Laboratories, Unix was an operating system first programmed in 1969 by Ken Thompson on a PDP-7 computer, after Bell Labs had withdrawn from the MULTICS project. Unix was just a pet project of Thompson’s, but it had several features which would strongly influence later and even modern operating systems. Most important of these was the revelation that a high-level (human-readable) programming language could be used to write an operating system, which would allow for Unix’s portability across system architectures. But there were more innovations which would help to cement its later success among computer scientists. Even from an early stage it had full pre-emptive multitasking, which made it perfect for time-sharing, and some of its design principles would prove instrumental in helping it avoid much of the second-system effect and bloat of operating systems such as OS/360 and MULTICS.

Neither of these factors, however, would show its relevance until later on. The System/370 came into being having to account for three major lines and a few minor lines of operating system from its predecessor, and so each of these lines was brought forward with new variants. The low-end DOS/360, originally designed as a stop-gap, became DOS/VS. CP/CMS would eventually become VM/370 (Virtual Machine). IBM had failed to anticipate the success of this operating system, which became very popular within the scientific market, partly for its ability to virtualise itself perfectly – one of the very first practical uses of full virtualisation.

OS/360 had two specific variants to take account of. The less elaborate MFT variant was replaced by OS/VS1, while the more sophisticated MVT variant was replaced by OS/VS2, better known as MVS/370 (Multiple Virtual Storage). Unlike the development of OS/360, the development of the System/370 operating systems went quite smoothly, particularly as the code could be tested on System/360 machines using CP/CMS.

With these operating systems, IBM would manage to see off its competitors during the 1970s as well, even with increasing pressure from DEC’s PDP and VAX series minicomputers, many now running Unix, and the efforts of Cray Research, who would build the most powerful supercomputers in the world until the late 1980s. But the seeds of change were germinating, and the microprocessor was coming into its own. The Motorola 6800, MOS Technology 6502 and Zilog Z80 gave cheap processors to those interested in designing their own computers, and after the successes of the MITS Altair 8800 and Apple I, a large number of companies entered the microcomputer market.

With companies such as Apple and Commodore selling millions of computers in America, and Acorn Computers and Sinclair Research doing the same in Europe, IBM realised that their market was at stake. Many of the traditional customers of mainframes would have no use for one with readily-available personal computers, and so, IBM would have to build a personal computer of their own if they wanted to remain relevant in the computing market.

Dispensing with its normal in-house design process, IBM assembled a team to build a personal computer rapidly. Taking the Intel 8088 processor and using a form of BASIC from Microsoft, the company which had created the BASIC variant for the MITS Altair 8800, IBM produced the IBM 5150, better known as the IBM PC.

The IBM PC 5150, predecessor of the PCs most people use today.

The IBM PC was a huge success. Appealing to business customers who had previously bought Apple and Commodore computers, the IBM PC lent huge legitimacy to the personal computer market and set the standard for personal computers to the present day. But while IBM had once again maintained its position as the largest computer manufacturer in the world, its mainframe business appeared to be approaching obsolescence, as some companies attempted to replace their large, water-cooled, room-sized machines with considerably less powerful but substantially smaller personal computers. What’s more, because the IBM PC was built from commercially-available components rather than hardware designed by IBM in-house, only its BIOS was proprietary, and that was soon reverse-engineered, leading to a huge number of clone systems.

Nevertheless, IBM designed a new mainframe architecture, designated ESA/390, at a time when workstations were obtaining amounts of RAM only found in mainframes before then, and when IBM’s position in the personal computer market looked increasingly weak from competing clone manufacturers. The problems with operating systems had long been solved, with time-sharing available across the whole range, and several releases of MVS and VM through the lifespan of the System/370. ESA/390 had an astounding level of backwards compatibility, able to run programs from the very beginning of the System/360’s lifespan.

About halfway through the life of ESA/390, there was a sudden resurgence in mainframe use. Spurred on by companies discovering new uses for their mainframes, and a general shift in IBM’s tactics which led to its adoption of open-source operating systems, the mainframe defied the odds to survive right through the 1990s. New operating systems were devised to maintain the three lines developed during the late 1960s. MVS was updated to become OS/390, which adopted many modern operating system concepts, while DOS/VS became VSE. The VM line, which IBM had largely ignored during the 1960s and 1970s, became more important as customers increasingly supported thousands of users on their mainframes. To supplement these three lines, IBM’s new commitment to open source led to its adoption of Linux, usually used in conjunction with other operating systems on the same machine.

Today’s mainframe is very different from the mainframe of 1964, at the introduction of the System/360. Apart from the far superior power of today’s machines, they are used according to completely different paradigms. The mainframe of 1964 usually ran batch jobs from punched cards and magnetic tape, and some machines couldn’t even perform more than one task at a time. Today’s mainframe is usually used by hundreds or thousands of people at a time, through Web interfaces, and its strength lies in high input/output throughput and extremely high reliability.

A System z10 machine, large, hugely reliable and capable of serving hundreds of people at once.

IBM is still the world’s largest manufacturer of mainframe computers, holding 90% of the market with its System z line. Linux is gaining an increasing share of use on these mainframes – not often the main operating system, but becoming more popular. The three lines of operating systems from the 1960s still remain, with z/OS the successor of the OS/360 line, z/VSE the successor of DOS/360, and z/VM the descendant of the CP/CMS project. Each of these operating systems has adopted large numbers of modern computing concepts, yet maintains extreme backwards compatibility. The COBOL code of the 1960s, written for machines using punched cards and teletypes, will still work with little or no modification today.

However, IBM is no longer the largest computer manufacturer in the world. Pushed out of the desktop computer market by clone manufacturers it could not undercut, and eventually selling its line of high-quality but expensive ThinkPad laptops to the Chinese company Lenovo, IBM now focuses on other markets, from its server and supercomputing lines to the development of the POWER processor, most recently used in today’s games consoles. Hewlett-Packard, a humble maker of electronic test instruments in 1964, is now the largest computer company in the world, and one of the largest companies of any kind.

The mainframe might not have the glamour of the supercomputer, or even the appeal of a desktop computer, but it performs tasks today that are essential to our society. The operating systems for the IBM mainframe, despite their difficult start, have matured into stable platforms, ready for huge loads every day and able to run forty-year-old programs when today’s desktop operating systems often struggle with ten-year-old ones. There are lessons to be learned even today from the OS/360 project, particularly regarding the second-system effect (something Microsoft has been guilty of several times), but despite the difficulties experienced during its development, it has maintained its line through a period in which dozens of operating systems and computer platforms have died out, and that is something to be impressed by, regardless of your ultimate position on the future of the mainframe.

Historical Operating Systems: AmigaOS

With the 1980s came the microcomputer revolution, and with it a chance for a wide variety of manufacturers to try their hand at producing a machine for the rapidly expanding home computer market. Some machines proved very successful indeed, such as the IBM PC and the Sinclair ZX Spectrum, while others were destined to become cult classics, such as Acorn Computers’ BBC Micro, an educational computer built in conjunction with the BBC Computer Literacy Project, and the MSX standard, devised by Microsoft and ASCII Corporation to tap into the massive potential Japanese market. Yet others, finding that the market could not sustain such variety indefinitely, remained obscure even in their own time.

Most of these early home computers followed the same basic layout: a cheap 8-bit processor, often a MOS 6502 or a Zilog Z80, paired with anywhere from 2 to 128KB of RAM depending on the specification. Such computers regularly plugged into televisions and used a command-line interface built around a simple, crude variant of BASIC held on a ROM chip, many of these variants having been programmed by Microsoft. Then, in 1984, Apple released its Macintosh, and things started to change rapidly in the personal computer market.

The Macintosh’s graphical user interface was based on the work of Apple’s previous, more expensive workstation, the Lisa, which in turn took design cues from the Alto and Star machines of Xerox PARC. The earliest Macintosh variants were arguably too short of RAM and too held back by their single-tasking model to be particularly useful, but the machine introduced a far more user-friendly interface than the older command lines.

Commodore Business Machines was one of the lucky companies of the early 1980s, creating one of the iconic computers of the time: the Commodore 64. Relatively affordable, and with a generous amount of RAM, the Commodore 64 would go on to become the best-selling single model of computer of all time. By 1985, however, the machine was beginning to look a little long in the tooth to serve as the company’s flagship.

The original Amiga, later dubbed the Amiga 1000, was not originally designed by Commodore; it was developed by a group of discontented ex-Atari staff who formed a company named Amiga Corporation. Through several complicated deals involving Amiga Corporation, Atari and Jack Tramiel – the founder of Commodore, who had left the company and taken over Atari – Amiga Corporation was bought by Commodore Business Machines, and the first Amiga was released in 1985.

Looking closely at the image on the screen, it looks like something that my second PC could produce – in 1996.

With a Motorola 68000 processor (32-bit internally, with a 16-bit data bus) and 256KB of RAM as standard, it was an amazingly quick machine for the time. As the machine had originally been intended as a games console, it featured impressive graphics and sound capabilities, which put it far ahead of most of its contemporaries. It also featured a very impressive operating system, known as AmigaOS, which provided full pre-emptive multitasking at a time when the standard operating systems of its competitors were limited to single-tasking or co-operative multitasking.

It’s sometimes difficult to appreciate just how much more flexible and powerful pre-emptive multitasking is than the co-operative sort, especially if you’ve never used an operating system with co-operative multitasking. Pre-emptive multitasking underpins essentially all modern personal computer operating systems, and is what allows multimedia applications and background processing to run reliably.

Imagine that you’re playing a music or video file alongside another program. With a pre-emptive system, the operating system itself divides processor time among the programs, interrupting each one when its time slice expires. With a co-operative system, in contrast, it is up to the programs themselves to cede control of the processor to the other applications, and all it takes is one poorly-programmed application, or one a bit too selfish with its processor cycles, and your music file will start skipping – or worse, stop playing altogether. As I think you’ll agree, this can get rather annoying.
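The contrast can be sketched in a few lines of Python – a toy simulation invented for illustration, not Amiga code. Each task is a generator, and each `yield` marks a point where the task gives up the CPU: voluntarily in the co-operative case, by forced time slice in the pre-emptive one.

```python
# Toy model of the two scheduling styles. Tasks are Python generators;
# each value a task yields represents one slice of work it got done.

def round_robin(tasks):
    """Run tasks in turn; each keeps the CPU until its next yield."""
    order = []
    while tasks:
        task = tasks.pop(0)
        try:
            order.append(next(task))
            tasks.append(task)          # back of the queue
        except StopIteration:
            pass                        # task finished, drop it
    return order

def music(chunks):
    # A well-behaved audio player: one chunk of sound per scheduling slot.
    for i in range(chunks):
        yield f"music {i}"

def cooperative_hog():
    # Co-operative multitasking: a selfish task does ALL its work
    # before ceding the CPU, so the music skips in the meantime.
    yield ", ".join(f"hog {i}" for i in range(3))

def preemptive_hog():
    # Pre-emptive multitasking: the scheduler interrupts the hog after
    # every time slice (modelled here as a forced yield per unit of work).
    for i in range(3):
        yield f"hog {i}"

print(round_robin([music(3), cooperative_hog()]))
# the hog's entire burst lands in one slot, starving the music task
print(round_robin([music(3), preemptive_hog()]))
# music and hog chunks alternate evenly
```

In the first run the hog’s whole burst of work arrives between two music chunks – exactly the “skipping” described above – while in the second the forced yields keep the two tasks alternating fairly, which is all a pre-emptive scheduler really does.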

By providing full pre-emptive multitasking in 1985, AmigaOS was even further ahead of its contemporaries than the hardware had been with its lauded graphics and sound. Mac OS didn’t gain even co-operative multitasking until MultiFinder in 1987, and it took until 2000 and the development of Mac OS X for it to finally gain pre-emptive multitasking. The IBM PC platform didn’t get a mainstream pre-emptive system until OS/2 and, later, Windows 95, and while some earlier computers had support for various forms of UNIX, this was of limited utility: it had no standard GUI (the X Window System being notoriously bloated at the time) and ran slowly on the hardware.

AmigaOS consisted of two parts: Kickstart, a ROM containing most of the necessary base components of the operating system in a stunningly small amount of space, and Workbench, the GUI layer, originally supplied on a series of floppy discs. Such a dual-layer system may seem odd to more recent adopters of computer technology, but in the days of limited permanent storage it proved an ideal way to provide a complex operating system without compromise. It also allowed games to use all of the Amiga’s RAM without the GUI sitting resident and taking up precious memory; such games booted directly from the Kickstart kernel.

Aesthetically, the Workbench GUI of AmigaOS was arguably not as clean or attractive as Apple’s Mac OS to begin with, but it had the major advantage of colour output, which was not available on the Macintosh platform until 1987, and then only on the high-end Macintosh II machines with separate monitors. The Amiga’s ability to display 4096 colours was a major advantage in the gaming field for which the machine had originally been designed, and only the Atari ST, a similar sort of computer also built around a Motorola 68000, could really come close to the Amiga in graphical power.

The Mac OS interface may have been more elegant, but the Amiga had the decided power advantage.

Unfortunately for Commodore, though, a focus on computer gaming and multimedia power gave the machine a “toy-like” reputation which was not to serve them well at a time when computers were only just making their way into businesses. The original IBM PC could hardly be described as a graphical powerhouse, but it was developed by a company which had up to then absolutely dominated the computer market. IBM’s reputation for business machines meant that the IBM PC became a de facto standard in the workplace despite not being as powerful as some of its competitors, and at a time when the computer market was homogenising, IBM managed to secure a healthy share of the high-price end of the market. As such, at this early stage, the Amiga did not attain the success that its powerful hardware and advanced operating system suggested it deserved.

By 1987, the Amiga line-up had diversified with the introduction of the low-end Amiga 500 and the high-end Amiga 2000, and with them came a new market for the Amiga. Now capable of properly taking the fight to the Atari ST, the Amiga began to pull away from its less powerful competition at the low end of its market segment. The AmigaOS updates accompanying these early machines were of limited scope, but given the advanced base of the programming, the OS hardly needed them.

People were beginning to discover the potential of the Amiga as well: its powerful yet inexpensive graphics hardware allowed broadcasters who could not afford dedicated workstations to edit television shows. With applications outside the gaming market, the Amiga managed to carve out its own niche, although this was still relatively insubstantial compared to the office and desktop publishing markets dominated by the IBM PC and the Apple Macintosh respectively.

On the home market front, the Amiga may have had the legs on the Atari ST, but there was another competitor which held it back. Just as the IBM PC had managed to secure the office market, inexpensive IBM-compatible computers had acquired a significant share of the home market. The use of a relatively cheap Intel 8088 processor and an easily reverse-engineered BIOS in the IBM PC 5150 had led other companies to quickly sell their own cheaper variants of the PC architecture.

As the cross-compatibility between these machines and the IBM machines that occupied offices allowed people to bring their work home, the IBM architecture quickly gained a foothold in the home market as well. Computer gaming, the forte of the Amiga, was never as big a priority there at the time, and by the time it was, IBM-compatible machines with more powerful graphics hardware had bridged the gap between their previously feeble efforts and the advanced Amigas.

In 1990, the first significant change to AmigaOS came with the release of the Amiga 3000, a complete upgrade to the Amiga architecture. Workbench 2.0 presented users with a more fluid and unified interface than the somewhat messy and chaotic presentation of Workbench 1.x. The improved hardware gave the platform a new lease of life – if a short one – and some of the most technically advanced games of the time originated on the Amiga, including the incredible technical achievement of Frontier: Elite II, a space simulator designed by David Braben, later of Frontier Developments fame, which exhibited features that really made the most of the hardware.

This might not look like much now, but when I started using PCs in 1994, this was state-of-the-art.

To be honest, with the increasing domination of the IBM-compatible architecture and its rapidly improving graphics technology, the demise of Commodore four years later looked inevitable. Commodore hardly helped matters with some of its later products, though. The expensive CDTV, launched in 1991 and pitched more as a games console than any of Commodore’s previous machines, failed utterly when slotted into the market beside the far cheaper Nintendo and Sega consoles of the time, both of which had a far greater variety of game titles. The later CD32 was less expensive, but the SNES and Sega Mega Drive made a complete mockery of Commodore’s efforts.

Commodore didn’t seem to do any better marketing their computers than their games consoles. The replacements for the Amiga 500 were intended to give Commodore something to contest the low-end market, but their sales were blunted by a marketing disaster which gave the public the impression that new “super-Amigas” would soon be on the market. Customers held back, creating further problems for the struggling company.

Finally, in 1994, Commodore was finished, going bankrupt and selling the Amiga’s intellectual property to pay its tremendous debts. Along with the Amiga died the Commodore 64, which had amazingly lasted twelve years in a market that had accelerated considerably since its launch. Soon after came Windows 95 and the earliest 3D graphics accelerators, which would have nailed Commodore’s coffin shut if its poor decisions hadn’t already done so. The Amiga had some final moments of glory after Commodore was gone, though – it was involved in the editing of the acclaimed science-fiction series Babylon 5, for one thing.

Commodore may have been dead, but AmigaOS lived on – to some extent. Passing from company to company like its British contemporary, RISC OS, AmigaOS maintained a niche market of enthusiasts who were either unwilling to make the shift to the PC platform or wished to continue using their programs and games. The OS survives today, now at version 4.1 and marketed by a Belgian company named Hyperion Entertainment. The nostalgic can indulge themselves with UAE (the Universal Amiga Emulator), which emulates a wide variety of Amiga hardware, from the earliest A1000s to the A4000s produced in 1994. UAE, as befits an open-source emulator, is available on several operating systems, including Windows, Mac OS X and Linux.

Like the Acorn Archimedes, a British contemporary of the Amiga which was itself ahead of the IBM PCs and Apple Macintoshes of the time, the Amiga was a computer which deserved to do well. Poor marketing on the part of Commodore may have had its role, but perhaps a more likely explanation for its failure was that the market wasn’t quite ready for a multimedia computer – or one that was dominant at computer gaming.

What is perplexing, though, is that such an advanced operating system didn’t provide more inspiration to its competitors. Its mixture of efficient programming (even today, the Kickstart ROM is only 1MB!) and advanced multitasking could have lent more power to the PCs that took over, which only gained pre-emptive multitasking with Windows 95, a notoriously unreliable and bloated operating system. The relative homogeneity of the operating system market may have largely eliminated the problems of software compatibility, but at the cost of computing efficiency, and with mobile platforms becoming more prevalent, perhaps that’s something programmers should be taking a closer look at.