Pinball Challenge Deluxe – A Retrospective Review

Several years before DICE started making their wildly popular Battlefield series, they made pinball games. Starting out as an offshoot of The Silents, a Swedish Amiga demoscene group, the company then known as Digital Illusions released a stable of three pinball games published by the British company 21st Century Entertainment Ltd. First released for the Amiga and then ported to several other platforms, including MS-DOS, the SNES and the Atari Jaguar, Pinball Dreams, Pinball Fantasies and Pinball Illusions were well regarded by the contemporary video game press.

In 2002, the same year that DICE released the first game in the Battlefield series, DICE’s pinball games found their way onto the Game Boy Advance. Developed by another British company, Binary9 Studios, and published by Ubisoft, Pinball Challenge Deluxe incorporates all of the elements of the first two games in the series, Pinball Dreams and Pinball Fantasies. With eight tables covering a range of themes from horror to space travel, Pinball Challenge Deluxe has plenty to offer a pinball fan.

DICE’s pinball games tended towards a simulationist bent, with realistic ball physics and tables that looked and felt like they could easily make the transition to the physical domain. Binary9 did an expert job of replicating that on the Game Boy Advance, with the physics and table layouts fully intact. The developers did have to compensate for the lower resolution and smaller screen of the Game Boy Advance compared to the Amiga, with the game requiring considerably more scrolling on the playfield; aside from demanding more prediction to figure out where the ball will fall onto the flippers, though, their efforts do not detract from the fun of the game. Binary9 have even included some extra details on some of the tables that were not present in the original Amiga versions for the Original Chip Set, and the game retains its colourful and stylish presentation, which does a good job of capturing the essence of each table.

The music has been ported over properly as well. Originally composed by Olof Gustafsson and representing some of the best tracker music on the Amiga, the music is one of the highlights not only of the original DICE versions but also of the Game Boy Advance port. The music does retain the Amiga version’s tendency to cut out and restart from a certain point after certain sound effects, probably a consequence of the Amiga’s limited number of sound channels, but this is authentic to the originals and doesn’t detract from the quality of the music.

I find the controls to be a mixed bag. Binary9 successfully brought all of the controls from the Amiga version onto the handheld, including the flippers, spring control and a button to tilt the table vertically – along with a tilt sensor to regulate use of that feature – but the flipper controls are mapped to the shoulder buttons. Most pinball games I have played on Nintendo’s handheld systems have instead, or additionally, allowed the use of left on the D-pad and the A button; while the shoulder buttons work acceptably on the original model of Game Boy Advance and the Game Boy Micro, they are a bit uncomfortable on the Game Boy Advance SP models or the Nintendo DS in either of its GBA-compatible forms. Nevertheless, the controls are responsive and the mapping isn’t a deal-breaker.

Pinball Challenge Deluxe doesn’t add many elements that weren’t already present in the original games. The load times are substantially better than they were on the Amiga originals by virtue of the cartridge storage medium, and the options menu lets you decrease the volume of the sound effects and music, as well as choose how many balls you get per playthrough, from the original three up to five. I’m not particularly fond of the latter option, as I think it plays havoc with the authenticity of the original gameplay. On the other hand, the Pinball Fantasies tables retain the original feature whereby you can randomly receive an extra ball after losing your last one, based on the first digit of your score, and the game also saves three high scores per table.

Generally, though, despite the lack of extra features over the original games, Pinball Challenge Deluxe is a good conversion. It retains the same challenging yet rewarding simulation of real-world pinball, the colourful and stylish graphics and the outstanding music. While the diminished resolution and extra scrolling of the Game Boy Advance version mean that the Amiga versions remain what I would consider the definitive ones, the portability and quicker loading times make this a port worthy of praise.

Bottom Line: Pinball Challenge Deluxe does a good job of replicating what made the original Amiga games so much fun and maintains the strong simulation of pinball on a portable game system.

Recommendation: If you’re a pinball fan looking for fun on the go, take a look at it. It’s also a decent title for dipping your toes into the world of pinball, but don’t pay too much for it.

The First “PC Master Race” – Part 2: The Golden Age of the 8-Bit Home Computer (up to 1990)

At the end of Part 1 of this series, I discussed two releases in 1985 that would shape the video game markets of Europe and North America respectively in the coming years: The Commodore Amiga and the Nintendo Entertainment System. However, to understand how these systems fit historically into this divide, I must first fill a few gaps that were not addressed in Part 1, including the events that led to the development of both systems and the marketplace into which they emerged.

The state of the European 8-bit home computer market in 1985

The home computer as a market segment emerged in 1977 in the United States and by 1982, there were many home computer developers in the US competing for a piece of the pie. Among these were Apple, Tandy Radio Shack, Commodore Business Machines, Atari, Texas Instruments, Exidy and Timex Sinclair. Furthermore, there were several consoles with pretensions of becoming home computers through add-ons, including the ColecoVision console and Coleco Adam computer, the Intellivision with its Keyboard Component (delays to which would make the Intellivision a running joke in the media) and the Atari 5200 which, under the shell, was a stripped-down Atari 8-bit computer. By 1984, many of these companies had been marginalised or were in the process of leaving the market completely.

A major catalyst for this was Commodore’s release in late 1982 of the Commodore 64, which even at launch was comparatively affordable relative to its competition, but which soon dropped even further in price due to a price war instigated by Commodore’s ruthless founder, Jack Tramiel – partially as revenge for being driven out of the electronic calculator market by Texas Instruments in the past, and partially as a defence against the Japanese, who were expected to try to take over the computer market the same way they had taken over the calculator market. By June 1983, the list price of the Commodore 64 had been halved, from $595 at release to $300, while rebates and bundle deals drove the price down even further. A contributing factor to Commodore’s ability to cut prices so deeply was their ownership of MOS Technology, which produced the 6510 processor and the custom chips of the Commodore 64, while other computer manufacturers had to purchase their processors and chips from outside suppliers.

In any case, the combination of a low price and the technical sophistication of the Commodore 64’s custom chips meant that few other computer manufacturers could compete. Texas Instruments, against whom Tramiel held a particular grudge, had a torrid time with their TI-99/4A, which soon dropped to an unsustainable price of $99. Manufacturers which had entered the market in search of a quick buck balked at competing with a company so willing to drive down prices in pursuit of market share. Some companies managed to sustain themselves – Apple, for instance, were already transitioning to sophisticated GUI-driven systems and managed to keep the Apple ][ line going as a cash cow through their involvement with business and education – but in general, the Commodore 64 proved a difficult machine to compete against in the American market.

Naturally, there were plenty of objections to Commodore’s market practices: from retailers who saw them as predatory, from other companies whose systems matched up badly against the Commodore 64 – and even from inside the company itself! Irving Gould, chairman of the company and an investor who had provided money to keep Commodore afloat in the past, clashed badly with Tramiel over his race to the bottom in search of market share. This led to a power struggle which, in January 1984, saw Tramiel kicked out of the company he had created. With many Commodore personnel leaving to join Tramiel in his next venture, this would have serious knock-on effects, discussed in more detail below, but with Tramiel gone, Commodore looked to diversify and move on from the Commodore 64.

Several systems were released throughout 1984, but none would achieve anywhere near the success of the Commodore 64. The Commodore 128, a more sophisticated model with more memory and full backward compatibility with the Commodore 64, came the closest, with approximately 4.5 million sales over its lifetime, but was hamstrung by the fact that few developers saw a point in writing software specifically for it rather than for the older and more popular system with which it was compatible. The Commodore 16 and Plus/4 fared worse; designed as a range of computers to replace the Commodore 64, they were completely incompatible with it and, despite some success in Europe, a complete flop in the US market.

While Commodore was struggling, however, the Commodore 64 was still going strong. The worldwide sales leader until 1985 (when the IBM PC and its clones started to take off, catalysed largely by the platform’s attractiveness to businesses) and dominant in the low-end computer market in the US, it was also in a strong position in the European market, which had, as mentioned in Part 1 of the series, already adopted the home computer as the gaming platform of choice. The Sinclair ZX Spectrum, also mentioned before, was one of its main competitors in this role. Another system, not previously mentioned, would complete the trifecta of the most popular 8-bit gaming computers in Europe throughout the 1980s and early 1990s.

Amstrad, a British company founded by Sir Alan Sugar and until then focused on low-end consumer electronics, joined the home computer market in 1984 with the release of the Amstrad CPC. The CPC sat somewhere between the ZX Spectrum and Commodore 64 in terms of graphical and sound capabilities, came with an integrated tape drive and, unlike the ZX Spectrum and Commodore 64, was specifically designed around a separate monitor rather than plugging into a television set. At a time when households were likely to have only a single television set, this was a novel feature, freeing the TV for others to watch while the computer was used on its own monitor. Released at £199 with a green-screen monitor or £299 with a colour monitor, it was a reasonable prospect against the Commodore 64, then £195.95 on its own without the C2N Datasette tape deck, or the ZX Spectrum at £129 with 48KB of memory versus the CPC 464’s 64KB – and again without a tape deck included.

These three systems would end up trading blows right through to the early 1990s, with a lot of multiplatform releases which would target all three to different extents. The success of these systems caused other computer manufacturers to diminish, similar to the situation in the United States, with the Oric and Dragon systems mentioned in Part 1 soon going by the wayside and France’s Thomson systems being pushed out by foreign competitors.

A few other systems managed to carve out a slice of the pie. The BBC Micro was the platform of genesis for several important games, including the aforementioned Chuckie Egg and the seminal Elite. The MSX series, meanwhile, was a Japanese-developed line of computers peculiar among 8-bit systems in not being designed and manufactured by a single company; instead, it represented a standard based around off-the-shelf components and a Microsoft-designed BASIC ROM, whereby any manufacturer could license the ROM and build their own system. (In this way, it was similar to the IBM PC, which was readily cloned – but with the explicit permission of the standard’s designers.) Different countries had different preferences: the Commodore 64 was particularly embraced in Germany, for instance, while the Amstrad CPC became the most popular system in France and the MSX range was especially popular in the Netherlands thanks to Philips’ production of several MSX systems.

The Amstrad CPC 464, here branded with French-language markings.

This set of competitors represented the more affordable end of the market at the time. But 1985 was important for other reasons, as a new generation of systems arrived that would later become the focus of Europe’s game developers.

The Commodore Amiga and Atari ST: Beginnings of the 16-bit war

Despite being ousted from his own company, Jack Tramiel didn’t call it a day. He soon established Tramel Technology, with several former Commodore employees joining him, and by April 1984 was planning a new computer based around the Motorola 68000 CPU. Soon afterwards, he learnt that Warner Communications were looking to sell Atari. Atari had been the market leader in the console market prior to the North American video game crash of 1983 (as well as being one of the instigators of that crash) and therefore had the most to lose. By 1984, Atari was haemorrhaging money, losing an estimated $1 million per day and becoming a major drain on Warner’s resources.

In July 1984, Tramiel purchased Atari’s Consumer Division, including their home computer and console assets, and immediately got to work moulding the newly formed Atari Corporation into his own company. Using Atari’s stock of video game consoles as a means to stay afloat, Tramiel’s engineers continued to work on their new computer design. But a couple of months later, Tramiel’s son, Leonard, found a contract negotiated with the old Atari Inc. which was of particular relevance to a company looking for a new computer design.

Jay Miner, a designer of the custom chips used in the Atari 2600 and the Atari 8-bit family, had tried to convince Atari to invest in a design for a new computer and console architecture. When he was rebuffed, he left Atari in 1982 along with some other Atari staffers and founded a new company, initially called Hi-Toro but later renamed Amiga. Similar to Tramel Technology, they staked their future on the Motorola 68000 CPU as well. However, they had exhausted their venture capital by 1983 and were looking for a way to keep going, which led them back to the door of Atari. Atari agreed to fund Amiga for ongoing development work in exchange for a one-year exclusive deal to produce and sell the machine. However, before Amiga could complete the design, Atari went into freefall with the video game crash in 1983, leaving the future of the company in limbo.

While Tramiel was negotiating with Warner to buy Atari, Amiga was looking for alternative sources of funding and ended up going to Commodore. Commodore had already suffered heavily from brain drain after the departure of staff to Tramel Technology and were planning an injunction to stop Tramiel from releasing his computer, believing that the former Commodore employees had stolen trade secrets. Desperate for a new computer design after the failure of the Commodore 16 and Plus/4 and the relative failure of the Commodore 128, Commodore looked to buy Amiga outright and cancel Amiga’s contract with Atari. Things didn’t end simply – Tramiel returned the favour by seeking an injunction of his own against Amiga – but Commodore did manage to successfully buy Amiga.

It would not be an exaggeration to call Amiga’s first product, the Amiga 1000 released in July 1985, revolutionary. A multimedia PC before the term was even coined, the Amiga represented a huge jump over the previous generation with class-leading graphics that allowed 32 colours out of a palette of 4096 to be displayed in normal use (and more in special modes), one of the best sound chips ever made in the four-channel, 8-bit PCM Paula chip and a very sophisticated pre-emptive multitasking operating system which was close to ten years ahead of its time. (I have discussed AmigaOS in a previous article.) Yet, despite that, it was not absurdly expensive; at an introductory price of $1,295 (with a monitor for an extra $300), it decidedly undercut the Macintosh 512K priced at $2,795 (which had more memory than the Amiga’s 256KB, but a monochrome screen and a single-tasking OS).

Atari might not have secured the rights to the Amiga, but they did manage to finish their own computer a couple of months earlier, releasing the Atari ST in June 1985. The ST was not as sophisticated as the Amiga; while it used the same 68000 processor clocked about one megahertz faster than the Amiga’s chip, its graphical and sound capabilities were less impressive. It could display a maximum of 16 colours on screen out of a palette of 512, and used an off-the-shelf Yamaha derivative of the General Instrument AY-3-8910, with three channels that could produce square waves or white noise (a chip also used in the likes of the Amstrad CPC and MSX, along with later ZX Spectrum models). The computer retailed with 512KB of RAM for $799 with a monochrome monitor or $999 with a colour monitor.

In retrospect, the default sound capabilities of the Atari ST were disappointing, given that Atari had been developing an 8-channel additive synthesis chip known as the AMY, which would have been inexpensive yet provided a good answer to the sophistication of Amiga’s Paula chip. Instead, they went with a chip that was not even as good as the MOS Technology SID released three years before the ST. (To be fair, Atari did include built-in MIDI in/out ports, which made the machine popular with musicians, although using them required additional hardware.) The operating system wasn’t anywhere near as impressive as AmigaOS either, with a single-tasking GUI paradigm that was on par with other systems of the time but far outstripped by the pre-emptive multitasking of the Amiga. (I have also discussed Atari TOS previously.)

Nevertheless, the Atari ST became the bigger sales success early on, with a more approachable price for the European families who bought many of the early units. Commodore’s woeful marketing didn’t help either; Commodore apparently had no idea how to market their sophisticated machine – and no funding to do so in any case. (Interestingly, an Easter egg found in an early release of AmigaOS illustrated the Amiga engineers’ discontent at Commodore, with a message reading, “We made Amiga, they fucked it up”.) However, Atari wasn’t exactly in the healthiest of states either, with Tramiel’s ruthlessness with the Commodore 64 coming back to haunt him in some respects. Both systems would, over their lifetimes, appeal more to European audiences than to Americans, who were instead focusing on other platforms which would shape the future of computers and of video gaming.

The NES: Saviour of the American video game market, but a “cult classic” in Europe

Japan, like Europe, had not suffered heavily in the wake of the North American video game crash of 1983. With a strong domestic arcade market and its own set of personal computer platforms, including the NEC PC-8801, the Fujitsu FM-7 and the Sharp MZ and X1 systems, Japan was able to sustain its own market during the contraction in the US.

Nintendo were one of the notable successes of the Japanese market at the time. Having started developing video games in about 1975, with several arcade games, a few Pong clones and the Game & Watch series of handheld games, they had struck gold with Shigeru Miyamoto’s Donkey Kong in 1981. The first game to feature Mario (then a carpenter named Jumpman), Donkey Kong was a smash hit in the arcade and made it onto several home computers and consoles as well. With this success, Nintendo decided to develop their own video game console.

On the 23rd of July, 1983, Nintendo released the Famicom (or Family Computer) in Japan. Designed around a clone of the same MOS 6502 CPU architecture used in the Atari 8-bit systems, the Commodore 64 and the BBC Micro, the Famicom was more of an evolution than a revolution. It did have better graphics than anything else on the console market at the time, displaying 25 colours on screen at once out of a 54-colour palette along with a sophisticated sprite engine, but the CPU was comparable to that of an Atari 8-bit system, and the sound chip – with five channels comprising two pulse waves, one triangle wave, a noise generator and a delta-modulated PCM channel – was approximately on par with the Atari POKEY, General Instrument AY-3-8910 and Texas Instruments SN76489 chips found in other consoles and home computers, and couldn’t match the MOS Technology SID in the Commodore 64.

After a slightly slow start, the Famicom soon picked up momentum and became the best-selling console in Japan by 1984. Plans were drawn up with Atari to distribute the system in the US in a modified form. The name “Famicom” was a bit of a misnomer for a system designed first and foremost for video games, although an add-on package called Family BASIC, with a cartridge and keyboard peripheral, did allow it to be used as a somewhat limited computer through BASIC programming. The plans for the US would have made the name look rather more appropriate: the proposed Nintendo Advanced Video System was to come with an integrated keyboard, a cassette drive, a wireless joystick and a BASIC cartridge, making it as much a home computer as a video game console.

The Nintendo Advanced Video System, complete with peripherals.

Of course, these plans never came to pass. Atari delayed the initial 1983 deal to distribute the Famicom in North America after finding that Coleco were bundling their Adam computer with a port of Donkey Kong. Despite the port being unauthorised, Atari took it as a sign that Nintendo were working with a major competitor in the video game market. The deal collapsed as Atari’s CEO, Ray Kassar, was forced out shortly afterwards amid insider trading allegations. A later attempt to market the AVS as mentioned above also fizzled out, and in the wake of this, Nintendo decided to distribute the system themselves.

This would prove a fortuitous decision. Nintendo modified the Famicom further, with a front-loading zero insertion force cartridge slot meant to obfuscate the system’s purpose and evoke images of VCRs rather than video game consoles (although this would prove inferior to the card edge connector design of the Famicom and most other cartridge-based consoles), along with the R.O.B. (Robotic Operating Buddy) accessory designed to give the system a place on toy shelves. The resulting Nintendo Entertainment System was released on the 18th of October, 1985 in limited test markets in the US, before being distributed across the whole United States throughout 1986.

The NES would prove, after a shaky start similar to the Famicom’s, to be a massive sales success in the United States, reigniting America’s passion for video games. Of the 61.9 million NES/Famicom systems sold worldwide, more than half were sold in the Americas, and “Nintendo” would become a byword for video gaming in the US in the years to come.

The NES would not be so popular in Europe. Released in two batches, with continental Europe (apart from Italy) receiving the system on the 1st of September, 1986, while the UK, Ireland, Italy, Canada, Australia and New Zealand followed in 1987, the console entered a more challenging market, often with a baked-in preference for home computers. The official sales figure for the NES in regions other than Japan and the Americas is 8.5 million, and while it’s difficult to get a solid figure for anything more specific, it is clear that not all of that 8.5 million came from Western Europe.

While the NES had some degree of success in countries like France and Germany, video gamers in the United Kingdom were especially dismissive of the system on its release, and sales remained lukewarm even near the end of its lifespan. Some of the practices Nintendo had adopted when developing the NES for the United States were particularly ill-suited to the UK audience. Nintendo had deliberately targeted the system in the West towards younger children, with a harsh policy towards the depiction of violence, profanity or sexuality, which made it look a bit “kiddy” when it was introduced in the UK.

Furthermore, in an attempt to prevent the flood of low-budget shovelware, advergames and pornographic titles that had infested the Atari 2600, Nintendo instituted very tight control over publishing for the NES, mandating the use of a lockout chip that required Nintendo’s approval to produce. This, however, was antithetical to the British video game industry, which revolved so much around the bedroom coder and small teams of indie developers who lacked the financial backing to produce cartridges in the first place, let alone cartridges with a lockout chip. NES games were considerably more expensive than home computer games on cassette tape, and it was a difficult task to sell British audiences a device that could only be used for video games when, for the price of a single NES game, you could buy half a dozen computer games instead.

By virtue of sales late in its life, the NES would not be a total flop even in the UK, but it was hardly the saviour of the video game industry that it had become in the United States. Its success in the US will become more important later in this series, but for the time being, it served to illustrate the growing divergence in the video game industry between the US and Europe.

Sinclair – pulling defeat from the jaws of victory

The ZX Spectrum had proven to be a big success, with its aim of providing the cheapest possible colour computer resonating well with British buyers who appreciated its “cheap and cheerful” nature. Despite its limitations, such as the rubber-keyed chiclet keyboard and frequent attribute clash due to colour restrictions per on-screen tile, the Spectrum certainly did the trick as an affordable system for learning how to program and play games. However, not all of Sinclair’s ventures were so successful.

I find it interesting that, despite the limited utility of early computers for anything apart from entertainment, several computer manufacturers dismissed video gaming as an inappropriate use of their systems. Apple, despite Steve Jobs’ and Steve Wozniak’s history with Atari, actively sought to discourage video game developers early on. IBM didn’t even consider the possibility of video gaming on their systems until the development of the IBM PCjr, which unsuccessfully tried to straddle the ground between low-end systems like the Commodore 64 and the high-end business market IBM was already catering to. Sir Clive Sinclair was also famously dismissive of video gaming, having designed the ZX Spectrum to provide people with a platform for programming – but failing to see at the time that what a lot of buyers wanted to program were games.

Sinclair Research sought to follow up the ZX Spectrum in 1984 with the Sinclair QL (or Quantum Leap). Based around the Motorola 68008 – a version of the 68000 somewhat analogous to the IBM PC’s Intel 8088 in that its 8-bit data bus effectively halved the speed of the CPU – the QL did improve on the Spectrum in some respects, including the pre-emptive multitasking QDOS operating system released a year before AmigaOS, but it was hardly the great leap forward that its name suggested.

The QL was rushed to market, announced in January 1984 when it was far from ready for production and lacked even a working prototype. Even when the first customer deliveries arrived in April, the machines were found to be unreliable, with multiple bugs in the firmware and numerous issues with the proprietary Microdrive storage system, which aimed to provide a cheaper alternative to the floppy disk by using an endless loop of magnetic tape inside a cartridge. These issues were later resolved, but the poor early impression stuck with the system until it was discontinued in 1986. Today, the QL is arguably most notable for Linus Torvalds having owned one – and having had to write his own software for it due to the poor support the system received.

The QL wasn’t Sinclair Research’s only failure either. The portable Sinclair TV80 employed a flat-screen CRT with a side-mounted electron gun and a Fresnel lens to make the picture look larger than it was, but failed to sell enough units to recoup its development costs. Even this, however, was relatively low-key compared to the biggest flop in Sinclair’s history: the infamous C5.

Sir Clive had harboured an interest in electric vehicles since the 1950s, and by 1983, the success of the ZX Spectrum gave him the capital to set up his own electric vehicle company, Sinclair Vehicles Ltd. After in-depth research into the matter from the late 1970s onwards, Sinclair Vehicles released the C5 in 1985. It was a notorious flop: underpowered, slow and unsafe, with no weatherproofing – a big mistake in the frequently rainy climate of the UK. With both electric and pedal power, it was meant to bridge the gap between bicycles and cars, but it ended up alienating both sets of people and sold only 5,000 of the 14,000 units produced.

All of these financial failures came at the expense of the device that had made Sinclair’s reputation. While the Spectrum did receive an update in 1984 in the form of the Spectrum+, with a new injection-moulded keyboard to replace the original chiclet keys, it took Sinclair’s Spanish distributor to really push for an improved model. The ZX Spectrum 128 brought the memory up to 128KB overall (as the name implied), along with extra features such as an actual sound chip in the form of the AY-3-8912, an RS-232 serial port, an RGB monitor port and a better BASIC editor. Launched in Spain in September 1985, it wasn’t released in its major market of the UK until January 1986.

Faced with financial problems, Sir Clive sold the Sinclair brand and computer technology rights to Amstrad in April 1986. Amstrad continued not only to sell but to improve the Spectrum over the years, though their improvements introduced some incompatibilities with the older models, and the Spectrum would never again sell in the numbers it had in the period up to 1985. There was enough of an installed base to keep it relevant in the market, and it had been enough of a success for Clive Sinclair to be knighted in 1983, but momentum was shifting to the Commodore 64, the Amstrad CPC and later the Atari ST and Commodore Amiga.

Now, back to the games!

By 1985, there had already been several smash hits on the 8-bit home computers, both in Europe and in the United States. The European games like Elite, Chuckie Egg and Manic Miner have already been discussed in Part 1, but American games like Epyx’s Games series, Lode Runner and Impossible Mission deserve a mention as they started to make their way over to Europe and began to be ported to the ZX Spectrum, Amstrad CPC and BBC Micro.

For the most part, European game development continued in a similar vein to previous years, with a distinctly “bedroom coder” indie approach to a lot of titles, often with just one or two programmers working on a game in their own time. With programming tools accessible as soon as users turned on their computers, a steady flow of BASIC and assembly language listings reprinted in computer magazines, and books discussing programming in the various dialects of assembly language on the different systems, the home computers made it a far less daunting prospect to develop and publish a successful game than the video game consoles did.

As mentioned briefly in Part 1, and as with any creative field where the barrier to entry is low, this led to a lot of mediocre and poor-quality games, many of which aped what their developers saw other games doing. This included a large number of platformers and shoot-’em-ups trying – and failing – to emulate the arcade output of Japanese companies such as Sega, Konami, Capcom and Irem. However, the same low barrier to entry also allowed genuinely novel games to make their mark. Among them were 1985’s Paradroid, first developed by Andrew Braybrook for the Commodore 64 and incorporating elements of both the shoot-’em-up and puzzle genres; 1986’s The Sentinel, an esoteric and very original first-person puzzle game first developed by Geoff Crammond (later known for his series of Formula One racing simulators) for the BBC Micro; the bizarre isometric 1987 action-adventure/puzzle/platform game Head Over Heels, designed first for the ZX Spectrum; and the early fighting game International Karate, first developed for the ZX Spectrum in 1985 by System 3 and followed by its even more successful sequel, International Karate + (also known as IK+), in 1987.

Original game series were beginning to emerge as well, often from the platform game genre. The Miner Willy series – comprising Manic Miner, two official sequels in the form of the similarly popular Jet Set Willy in 1984 and the less successful Jet Set Willy II in 1985, and a couple of spin-off titles – had become a smash hit on the ZX Spectrum early on. The Monty Mole series started in 1984, also on the ZX Spectrum, and received sequels throughout the rest of the 1980s, including Monty on the Run and Auf Wiedersehen Monty, combining rather expansive multi-screen platforming worlds with a quirky sense of British humour. Similarly, the Dizzy series, first emerging in 1987, received a whole host of sequels up until 1992 and combined platforming with action-adventure elements.

Speaking of arcade games, the original titles on the market began to be joined by a host of ports of popular arcade titles, including Commando, Ghosts ‘n Goblins and its sequel Ghouls ‘n Ghosts, Green Beret, Bubble Bobble, OutRun and R-Type. These invariably did not match up particularly well to the arcade versions, not only because home computers were less powerful than the specialised, custom-built hardware of arcade machines, but also because the developers of the ports had little to no official support from the original developers. They were not provided with any overview of the internal workings of the games and often had to spend their own money at the arcades, watching the games being played in order to work out by deduction how they functioned. This was at least one area in which the contemporary consoles had an advantage, given that the original developers of the arcade games were also responsible for their console ports. Even so, some of the Commodore 64 ports of these games are notable for their astounding soundtracks, among the best on any 8-bit system.

As a matter of fact, very strong music was rapidly becoming a character trait of the Commodore 64. While early soundtracks on the system had tended to use the chip like other sound chips of the time, with a single waveform for each of the SID’s three voices, a trick was devised a few years after the Commodore 64’s release: rapidly changing waveforms on each voice gives the impression of having more channels available at a time, as sketched below. An early example of this technique came from Rob Hubbard, who popularised the style with the soundtrack to 1985’s Monty on the Run. That piece is influenced heavily by Charles Williams’ “Devil’s Galop”, the theme tune to the popular 1950s BBC radio serial Dick Barton, but includes its own original take on the material, with a sophisticated sound that was simply not possible in the same way on any other contemporary platform – and it lasts an unprecedented six minutes without looping, a veritable lifetime in an era when nearly all game music looped after a minute at most.
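
To make the trick concrete, here is a minimal sketch of the idea – written in Python purely for illustration, and emphatically not Hubbard’s actual 6502 player routine. A routine called once per 50Hz video frame rewrites a single SID voice’s registers so that one voice carries both a drum hit and a bassline; the register addresses and waveform bits match the real SID, while the write function, the frame layout and the pitch value are illustrative assumptions.

```python
# Conceptual sketch only (not a real C64 player): one routine per 50 Hz frame
# rewrites a single SID voice so that it plays two musical parts in turn.

FREQ_LO, FREQ_HI, CTRL = 0xD400, 0xD401, 0xD404      # SID voice 1 registers
GATE, PULSE, NOISE = 0x01, 0x40, 0x80                # control register bits

def write(addr, value):
    # Stand-in for poking a hardware register on a real Commodore 64.
    print(f"POKE ${addr:04X}, ${value:02X}")

def play_frame(frame, bass_pitch):
    """Runs once per video frame, as a raster interrupt handler would."""
    if frame % 8 == 0:
        # First frame of each beat: a burst of noise stands in for a drum hit.
        write(CTRL, NOISE | GATE)
    else:
        # Remaining frames: snap back to a pulse wave playing the bass note,
        # so the listener hears "drums" and "bass" from one voice.
        # (Envelope and pulse-width setup omitted for brevity.)
        write(FREQ_LO, bass_pitch & 0xFF)
        write(FREQ_HI, bass_pitch >> 8)
        write(CTRL, PULSE | GATE)

for frame in range(16):                   # two beats' worth of frames
    play_frame(frame, bass_pitch=0x0EA2)  # roughly 220 Hz on a PAL machine
```

Real players took the idea much further, flipping waveforms, pitches and pulse widths on every frame, but the principle of one voice doing several jobs is the same.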

There were plenty of highly acclaimed British composers – the aforementioned Rob Hubbard, who continued to push the limits of the SID after Monty on the Run, along with Martin Galway, Ben Daglish, Matt Gray and Tim Follin – but other great composers came from elsewhere, like Chris Hülsbeck and Ramiro Vaca from Germany and Jeroen Tel from the Netherlands. Between them, they explored the limits of the SID, often making a game worth buying for the music alone. Several of these musicians would continue to compose for later platforms, particularly Follin and Hülsbeck.

The Commodore 64 wasn’t the only 8-bit platform that could receive good music, though, as good composers could often adapt their pieces to the less sophisticated AY-3-8910 and SN76489. Tim Follin, an incredible musician who managed to compose brilliant pieces on every platform he ever touched, even made the primitive 1-bit beeper of the ZX Spectrum produce surprisingly sophisticated polyphonic music resembling (very buzzy) rock and orchestral pieces.
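
Since the Spectrum’s beeper is only ever fully on or fully off, the usual route to pseudo-polyphony – and what follows is a simplified sketch of the general principle, not a reconstruction of any actual Follin routine – is to interleave several square waves in time, giving each voice alternating slivers of the speaker. This Python sketch renders two interleaved square waves to a WAV file so the (very buzzy) two-note chord can be heard; the sample rate and note frequencies are arbitrary choices.

```python
# A simplified sketch of 1-bit "polyphony": the speaker output is on or off
# only, so two square waves share it by alternating ownership every sample.
# Not based on any actual Spectrum beeper engine; parameters are arbitrary.

import struct
import wave

RATE = 44100  # samples per second for the rendered WAV

def one_bit_duet(f1, f2, seconds=2.0):
    samples = []
    for n in range(int(RATE * seconds)):
        t = n / RATE
        # Alternate which "voice" drives the speaker on each sample; each
        # voice contributes its own on/off square wave when it has control.
        freq = f1 if n % 2 == 0 else f2
        on = (t * freq) % 1.0 < 0.5              # 50% duty square wave
        samples.append(32767 if on else -32768)  # 1-bit level as 16-bit PCM
    return samples

with wave.open("duet.wav", "w") as w:
    w.setnchannels(1)
    w.setsampwidth(2)   # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(b"".join(struct.pack("<h", s) for s in one_bit_duet(262, 330)))
```

Played back, the alternation is far too fast to hear as switching; the ear resolves it into both notes sounding at once, which is essentially how the Spectrum’s single channel was coaxed into chords.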

As the 1980s progressed towards their end, new platforms began to emerge and older ones became more affordable. 1987 saw the release of the Acorn Archimedes, the successor to the BBC Micro, incorporating a sophisticated 32-bit RISC CPU of the ARM architecture as well as a co-operative multitasking OS named RISC OS. (I discuss RISC OS here as installed on the more recent Raspberry Pi.) The Archimedes itself was not particularly popular; while it didn’t lack for hardware grunt, with its 8MHz ARM2 processor producing 4 MIPS – approximately three times that of the comparably clocked 68000s in the Amiga and Atari ST – it suffered from its reputation as an educational computer, where the BBC Micro had instead benefitted from that association. However, the Archimedes would have an impact that outlasted the computer series itself, as its power-efficient ARM processor would become the de facto standard in mobile devices such as PDAs, handheld games consoles and smartphones.

Also released in 1987 was the Amiga 500, a redesigned system based on the hardware of the Amiga 1000, but with more onboard memory and a form factor more in keeping with the 8-bit computers, with the keyboard integrated into the case. With a reduced price to make it more enticing to home users, the Amiga 500 would end up becoming the most popular system in the history of the platform, opening it up in particular to British and German audiences who would embrace the system in the years to come.

Finally, 1987 saw the European release of the Sega Master System. While not as popular as the home computers of the time, and a distant second worldwide behind the NES in its console generation – with many of its sales coming from Brazil, where Sega had shrewdly negotiated a contract for the system to be built and marketed by the Brazilian company Tec Toy in order to avoid huge import duties on foreign-made electronics – the Master System represented something very bizarre in terms of sales figures, as it was far more popular in Europe than in either the United States or its home market of Japan. Selling approximately 6.8 million units in Western Europe, it actually outsold the NES there, playing off a more mature image drawn from the arcade games which were Sega’s bread and butter at the time, as well as a more lax licensing policy than Nintendo’s. Sega had even licensed several of its arcade games for ports to the home computers, including After Burner, Space Harrier and the aforementioned OutRun.

Around the time of the Amiga and Atari ST releases in 1985, games were being made first and foremost for the 8-bit computers, later receiving graphical polish in ports to the more advanced 16/32-bit systems. Near the end of the decade, this situation began to reverse: games would be released on the Amiga or Atari ST first and later filter their way down to the 8-bit computers, if they were successful and simple enough to port appropriately. For example, 1989’s Shadow of the Beast by Psygnosis was developed with the Amiga in mind, using complex parallax scrolling and high-quality music to push the system to its limits, but still found its way onto the 8-bit systems soon after its release. Another 1989 title, Populous by Bullfrog Productions, was likewise released first on the Amiga and later ported to multiple other systems – which in this case did not include the 8-bit machines.

Big-name titles also started coming from countries other than Britain, which had seemed to dominate proceedings in Europe when it came to popular and retrospectively acclaimed titles on the 8-bit platforms. One developer from Germany, Manfred Trenz, is of particular note here. One of Trenz’s early projects was graphical work on a Commodore 64 game called The Great Giana Sisters, published by Rainbow Arts. The game was a polished but very obvious clone of Super Mario Bros., with a soundtrack composed by Chris Hülsbeck, but its similarities to Nintendo’s game brought the risk of legal action against Rainbow Arts, and the game disappeared from shelves. Nevertheless, Trenz was planning his own game, and in 1988 he developed Katakis, first released on the Commodore 64 and soon after ported to the Amiga by Factor 5, a group formed by five former employees of Rainbow Arts.

Katakis was again a fairly obvious clone of a pre-existing game – this time Irem’s arcade classic R-Type – and again, the threat of legal action loomed. However, in a bizarre turn of events, Activision Europe, who held the rights to port R-Type to the Amiga, found themselves without programmers to do the work and delivered an ultimatum to Factor 5: either develop the Amiga port of R-Type or face a lawsuit. Katakis, for what it was worth, was later retooled and re-released as Denaris in 1989.

It was Trenz’s next idea, however, that would really kickstart things for Rainbow Arts and Factor 5. Trenz turned his attention to the run-’n’-gun platforming genre and, in 1990, developed Turrican for the Commodore 64, with a prompt port to the Amiga by Factor 5. Best described as “Contra meets Metroid” (although apparently heavily influenced by the obscure Japanese arcade game Psycho-Nics Oscar), Turrican blended “don’t stop shooting” side-scrolling action with relatively open game worlds full of secrets containing power-ups and extra lives. The one-button Atari-compatible joystick interface on the Commodore 64 was a constraint, forcing the “up” direction to serve as jump, so Trenz solved the problem of aiming upwards with a secondary fire mode: holding the fire button while standing still generated a lightning whip which could be rotated through 360 degrees, a novel answer to the limits of the Atari joystick interface, as the sketch below illustrates.
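
As a rough illustration of how such a control scheme can hang together – this is a hypothetical Python reconstruction, not Trenz’s actual code, and the hold threshold is an invented parameter – consider an input handler where “up” doubles as jump and holding fire while stationary switches into the rotating whip mode:

```python
# Hypothetical sketch of a one-button joystick control scheme in the style
# described above; not taken from Turrican's actual source. With only one
# fire button, "up" must double as jump, and holding fire while standing
# still switches to an aimable beam instead of normal shooting.

from dataclasses import dataclass

@dataclass
class JoystickState:
    up: bool = False
    left: bool = False
    right: bool = False
    fire: bool = False

def update_player(stick: JoystickState, fire_held_frames: int) -> str:
    moving = stick.left or stick.right
    if stick.fire and not moving and fire_held_frames > 10:
        # Stationary with fire held: enter whip mode, where left/right now
        # rotate the beam through 360 degrees instead of walking.
        return "rotate lightning whip"
    if stick.up:
        return "jump"        # the only spare input left for jumping
    if stick.fire:
        return "shoot forwards"
    return "walk" if moving else "idle"

# Holding fire for more than 10 frames while standing still switches modes:
print(update_player(JoystickState(fire=True), fire_held_frames=12))   # whip
print(update_player(JoystickState(up=True), fire_held_frames=0))      # jump
```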

While the Commodore 64 version of the game was one of the most solid and technically advanced titles on the platform, it was the move to the Amiga where Turrican would really shine. With better graphics and an astounding retooled soundtrack by Chris Hülsbeck – probably his best work to that point, and an illustration of just how well he had made the transition from the Commodore 64 to the Amiga – Turrican stood out on the Amiga as one of the best run-’n’-gun games ever to come out of anywhere other than Japan. The game would receive multiple ports to the other 8-bit systems, to the Atari ST and to a few consoles which will receive more attention in the next part of this series, along with multiple sequels throughout the early 1990s.

By 1990, the Amiga had been thoroughly embraced by European software developers, having reached an affordable price for a system that, despite not receiving any significant upgrades – as will be elaborated on in Part 3 – still managed to impress. But there were new platforms on the horizon posing a major threat to the Amiga, while horrible mismanagement had already nearly killed the platform once and would remain a danger in the years to come. Meanwhile, the Americans couldn’t be ignored forever; having licked their wounds following the video game crash, they were ready to come back in a big way, and they favoured consoles over computers – and even then much preferred the IBM PC clones to the Amiga. The home computer market was on shaky ground and close to meeting its doom.

Part 3 will discuss the Amiga in more detail in the period between 1987, with the release of the Amiga 500, and 1994, with the demise of Commodore, along with the other platforms which would soon take the lustre away from the Amiga’s revolutionary design.

Historical Operating Systems – AmigaOS

With the 1980s came the microcomputer revolution, and with it, a chance for a wide variety of manufacturers to try their hand at producing a machine to compete in the rapidly expanding home computer market. Some machines proved very successful indeed, such as the IBM PC and the Sinclair ZX Spectrum, while others were destined to become cult classics, such as Acorn Computers’ BBC Micro, an educational computer built in conjunction with the BBC Computer Literacy Project, and the Microsoft-backed MSX standard, designed to tap into the massive potential Japanese market. Yet others, finding that the market could not sustain such variety indefinitely, remained obscure even in their own time.

Most of these early home computers followed the same basic layout. Based around a cheap 8-bit processor, often a MOS 6502 or a Zilog Z80, with an amount of RAM usually ranging from 2 to 128KB depending on the specification, such computers regularly plugged into televisions and used a command-line interface built around a simple, crude variant of BASIC carried on a ROM chip, many of those variants programmed by Microsoft. Then, in 1984, Apple released the Macintosh, and things started to change rapidly in the personal computer market.

With a graphical user interface based on the work of Apple’s previous, more expensive workstation model, the Lisa, which in turn took design cues from the Alto and Star machines from Xerox PARC, the Macintosh was arguably too short of RAM and too held back by its single-tasking model for its earliest variants to be particularly useful, but it introduced a far more user-friendly interface to the fray than the older command lines.

Commodore Business Machines was one of the lucky companies during the early 1980s, creating one of the iconic computers of the time: the Commodore 64. Relatively affordable, and with a generous amount of RAM, the Commodore 64 would go on to become the best-selling single computer model of all time. By 1985, however, the machine was beginning to look a bit long in the tooth to be sold as the company’s flagship model.

The original Amiga, later dubbed the Amiga 1000, was not originally designed by Commodore; it was developed by a group of discontented ex-Atari staff who formed a company named Amiga Corporation. Through several complicated deals involving Amiga Corporation, Atari and the ousted founder of Commodore, Jack Tramiel, Amiga Corporation was bought out by Commodore Business Machines, and the first Amiga was released in 1985.

Looking closely at the image on the screen, this looks like something that my second PC could produce – in 1996.

With a 16/32-bit Motorola 68000 processor and 256KB of RAM as standard, it was an amazingly quick machine for the time. As the machine had originally been intended as a games console, it featured impressive graphical and sound capabilities which put it far ahead of most of its contemporaries. It also featured a very impressive operating system, known as AmigaOS, giving full pre-emptive multitasking at a time when the standard operating systems of its competitors were limited to single-tasking or co-operative multitasking.

It’s sometimes difficult to appreciate just how much more flexible and powerful pre-emptive multitasking is compared with the co-operative sort, especially if you’ve never used an operating system with co-operative multitasking. Pre-emptive multitasking essentially underpins all modern personal computer operating systems, allowing for multimedia applications and for proper background processing.

Imagine that you’re playing a music or video file alongside another program. With a pre-emptive system, the operating system itself divides processor time between the programs. In contrast, with a co-operative system, it is up to the programs themselves to cede control of the processor to other applications, and all it takes is one poorly-programmed application, or one a bit too selfish with its processor cycles, and your music will start skipping – or, even worse, stop playing altogether. As I think you’ll agree, this can get rather annoying. A toy model of the difference appears below.
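
Here is that toy model – a few lines of Python generators standing in for tasks, nothing at all like AmigaOS’s actual exec kernel – showing why co-operative scheduling is at the mercy of every program’s good manners:

```python
# Toy co-operative scheduler: each task runs until *it* chooses to yield.
# This illustrates the concept only; it is not how AmigaOS works inside.

def music_player():
    while True:
        print("music: played one buffer of audio")
        yield                 # politely hands the CPU back to the scheduler

def selfish_task():
    while True:
        print("selfish: crunching numbers...")
        # If this `yield` were removed, next(task) below would never return
        # and the music task would be starved: the audio "skips" or stops.
        # A pre-emptive kernel would interrupt the task on a timer instead.
        yield

def cooperative_scheduler(tasks, steps=6):
    # Round-robin over the tasks, trusting each one to give the CPU back.
    while steps > 0:
        for task in tasks:
            next(task)        # runs the task until its next `yield`
            steps -= 1

cooperative_scheduler([music_player(), selfish_task()])
```

With pre-emptive multitasking, a timer interrupt does the yielding whether the program likes it or not – which is exactly what AmigaOS offered in 1985.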

By providing full pre-emptive multitasking in 1985, AmigaOS was even further ahead of its contemporaries than the Amiga was with its lauded graphical and sound capabilities. Mac OS wouldn’t gain even co-operative multitasking until MultiFinder in 1987, and it took until the development of Mac OS X at the turn of the millennium for it to finally gain pre-emptive multitasking. The IBM PC platform didn’t get a mainstream pre-emptive system until the development of OS/2 and Windows 95, and while some earlier systems had support for varying forms of UNIX, this was of limited utility, had no GUI (the X Window System being notoriously bloated at the time), and ran slowly on the hardware.

AmigaOS consists of two parts: the Kickstart ROM, which contained most of the necessary base components of the operating system in a stunningly limited amount of space, and Workbench, the GUI layer of the OS, originally supplied on floppy disks. Such a dual-layer system may seem odd to more recent adopters of computer technology, but in the days of limited permanent storage, it proved an ideal way to allow for a complex operating system without compromise. It also allowed games to use all of the Amiga’s RAM without the GUI resident and taking up precious memory; such games booted directly on top of the Kickstart kernel.

Aesthetically, the Workbench GUI of AmigaOS was arguably not as clean or attractive as Apple’s Mac OS to begin with, but it had the major advantage of colour output, which was not available on the Macintosh platform until 1987, and even then only on the high-end Macintosh II computers with their separate monitors. The Amiga’s ability to output graphics in 4096 colours was a major advantage in the gaming field the machine had originally been designed for, and only the Atari ST, a similar sort of computer also built around a Motorola 68000, could really come close to the Amiga in terms of graphical power.

The Mac OS interface may have been more elegant, but the Amiga had the decided power advantage.

Unfortunately for Commodore, though, a focus on computer gaming and multimedia power gave the machine a “toy-like” reputation which was not to serve them well at a time when computers were only just making their way into businesses. The original IBM PC could hardly be described as a graphical powerhouse, but it was developed by a company which had up to then absolutely dominated the computer market. IBM’s reputation for business machines meant that the IBM PC became a de facto standard in the workplace despite not being as powerful as some of its competitors, and at a time when the computer market was homogenising, IBM managed to secure a healthy share of the high-price end of the market. As such, at this early stage, the Amiga did not manage to attain the success that its powerful hardware and advanced operating system would suggest it deserved.

By 1987, the Amiga line-up had diversified with the introduction of the low-end Amiga 500 and the high-end Amiga 2000, and with them came a new market for the Amiga. Now capable of properly taking the fight to the Atari ST, the Amiga began to pull away from its less powerful competition at the low end of its market segment. The AmigaOS updates accompanying these early machines were of limited scope, but given how advanced the underlying programming already was, the OS hardly needed updating.

People were beginning to discover the potential of the Amiga as well, its powerful graphics hardware at an inexpensive price allowing broadcasters who could not afford more expensive workstations to edit television shows with it. With applications outside the gaming market, the Amiga managed to carve out its own niche, although this was still relatively insubstantial compared to the office and desktop publishing markets dominated by the IBM PC and the Apple Macintosh respectively.

On the home market front, the Amiga may have had the legs of the Atari ST, but another competitor held it back. Just as the IBM PC had secured the office market, inexpensive IBM-compatible computers had acquired a significant share of the home market. The use of a relatively cheap Intel 8088 processor and an easily reverse-engineered BIOS in the IBM PC 5150 had led other companies to quickly sell their own cheaper variants of the PC architecture.

As the cross-compatibility between these machines and the IBM machines that occupied offices allowed people to bring their work home, the IBM architecture quickly gained a foothold in the home market as well. Computer gaming, the forte of the Amiga, was never as big a priority for these machines at the time. By the time it was, IBM-compatible machines had bridged the gap between their previously slow efforts and the advanced Amigas with more powerful graphics hardware.

In 1990, the first significant change to AmigaOS came in conjunction with the release of the Amiga 3000, a complete upgrade to the Amiga architecture. Workbench 2.0 presented users with a more fluid and unified interface, in comparison to the somewhat messy and chaotic presentation of Workbench 1.x. The improved hardware of the Amiga 3000 gave the platform a new lease of life – if a short one – and some of the most technically advanced games of the time originated on the Amiga, including the incredible technical achievement of Frontier: Elite II, a space simulator designed by David Braben of Frontier Developments fame, exhibiting features which really made the most of the hardware.

This might not look like much now, but when I started using PCs in 1994, this was state-of-the-art.

To be honest, the demise of Commodore four years later looked inevitable given the increasing domination of the IBM-compatible architecture and its rapidly-improving graphics technology. Commodore hardly helped things with some of their later products, though. The CDTV, developed in 1990 and intended more as a games console than any of Commodore’s previous machines, failed utterly when slotted into the market beside the far less expensive Nintendo and Sega consoles of the time, both of which had a far greater variety of game titles. The later CD32 was less expensive, but the SNES and Sega Mega Drive made a complete mockery of Commodore’s efforts.

Commodore didn’t seem to do any better at marketing their computers than their games consoles. The replacements for the Amiga 500 were intended to give Commodore something to contest the low-end market, but their sales were blunted by a marketing disaster which gave the public the impression that new “super-Amigas” would soon be on the market. Customers held back, creating further problems for the struggling company.

Finally, in 1994, Commodore was finished, going bankrupt and selling the Amiga intellectual property to pay its tremendous debts. Along with the Amiga died the Commodore 64, which had amazingly lasted twelve years in a market that had accelerated considerably since its launch. Soon after came the release of Windows 95 and the earliest 3D graphics accelerators, which would have nailed Commodore’s coffin shut if their poor decisions hadn’t already done so. The Amiga had some final moments of glory after Commodore was gone, though – it was involved in the editing of the acclaimed science-fiction series Babylon 5, for one thing.

Commodore may have been dead, but AmigaOS lived on – to some extent. Passing from company to company like its British contemporary, RISC OS, AmigaOS maintained a niche market of enthusiasts who were either unwilling to make the shift to the PC platform or wished to continue using their programs and games. The OS survives today, now at version 4.1 and marketed by a Belgian company named Hyperion Entertainment. The nostalgic sorts can indulge themselves with UAE (Universal Amiga Emulator), which emulates a wide variety of Amiga hardware, from the earliest A1000s to the A4000s produced in 1994. UAE, as befits an open-source emulator, is available on several operating systems, including Windows, Mac OS X and Linux.

Like the Acorn Archimedes, a British contemporary which was itself ahead of the IBM PCs and Apple Macintoshes of the time, the Amiga was a computer which deserved to do well. Poor marketing on Commodore’s part may have played its role, but perhaps a more likely explanation for its failure is that the market wasn’t quite ready for a multimedia computer – or for one whose strength was computer gaming.

What is perplexing, though, is that the advanced operating system didn’t provide more inspiration to its competitors – its mixture of efficient programming (today, the Kickstart ROM is still only 1MB!) and advanced multitasking could probably have given more power to the PCs which took over, which only gained mainstream pre-emptive multitasking with Windows 95, a notoriously unreliable and bloated operating system. The relative homogeneity of the operating system market may have largely eliminated problems with software compatibility, but at the cost of computing efficiency, and with mobile platforms becoming more prevalent, perhaps that’s something programmers should be taking a closer look at.