Back again after a long wait…

I recognise that it’s been a while since I last wrote anything here, but I’ve been keeping myself busy and haven’t felt inclined to write. However, I have been working on a few things which seem rather relevant to the scope of this blog: I’ve got plenty of use out of the NAS that I mentioned in my last post, I’ve been working on electronics experiments and I’ve further built up my Raspberry Pi collection with a new addition.

Synology DS416 – Performance Benchmarking

To get some sort of understanding of the performance of my Synology NAS, which is connected through its two 1GbE ports to an 8-port Netgear GS108E switch, I decided to use IOmeter to pit it against the data storage drive in my computer, a 3TB 7,200rpm Seagate SATA disk. As artificial as IOmeter results can be, they still give a good idea of how the NAS compares with the SATA disk on which I keep most of my non-OS files. Using a maximum disk size of 1,048,576 sectors (512MiB) and leaving each test running for 20 minutes, I got the following results, where Z: corresponds to an iSCSI volume on the NAS and D: to the SATA disk:

4KiB; 50% read, 50% write; 50% random

Z: 692 IOPS, 2.84 MB/s, 23 ms latency
D: 185 IOPS, 0.76 MB/s, 86 ms latency

64KiB; 0% read, 100% write; 0% random

Z: 552 IOPS, 36.15 MB/s, 29 ms latency
D: 849 IOPS, 55.62 MB/s, 18 ms latency

64KiB; 100% read, 0% write; 0% random

Z: 1787 IOPS, 112.33 MB/s, 8.9 ms latency
D: 939 IOPS, 62.91 MB/s, 16.6 ms latency
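As a quick sanity check on these figures, throughput should be roughly IOPS multiplied by the block size. A small sketch (the helper name is my own) shows the reported numbers are internally consistent:

```python
# Approximate throughput (decimal MB/s) implied by an IOPS figure and a
# block size in KiB; useful for cross-checking IOmeter's reported numbers.
def throughput_mb_s(iops, block_kib):
    return iops * block_kib * 1024 / 1e6

print(round(throughput_mb_s(692, 4), 2))   # close to the 2.84 MB/s reported for Z:
print(round(throughput_mb_s(552, 64), 2))  # close to the 36.15 MB/s reported for Z:
```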

In the 4KiB test, which would appear to be the most representative of real-world use of the storage as a standard drive (although I would expect a higher proportion of reads and more random access in that circumstance), the NAS clearly had the advantage, with more than three times the IOPS and throughput and roughly a quarter of the latency. The NAS fared worse in the 64KiB write-only test, possibly as a consequence of the write penalty of the four-disk RAID 5 array, but it took the lead again in the 64KiB read-only test; these two tests might represent the sort of sequential writing and reading befitting the drives’ purpose as archival storage. The read-only performance of the NAS also lines up with the maximum network read throughput that I’ve seen.

I’ve also installed a Windows 7 VM on a separate 1TB iSCSI volume on the NAS, which I’ve used with VMware Workstation to give me some ability to play Windows games without having to reboot my system. There have been some issues with stuttering, which would be a deal-breaker in action games but which has been tolerable for the strategy and WRPG titles that I have been using it for, like Hearts of Iron III and Planescape: Torment.

Electronics experiments – a new collection of gear to try out!

The electronics experiments that I had been doing with my Raspberry Pi had been on the backburner for a long time. However, with Robot Wars back on the BBC recently and my younger brother interested in building a robot, I’ve decided to get back into electronics with the aim of learning enough to build a basic robot, at which point I can get my brother involved with various tasks. To that end, I’ve stocked up on a whole new list of components. At the moment, most of these are limited to things I can stick on a breadboard rather than motors and so on. The components I’ve picked up include MCP23017 I/O expanders to complement the MCP23008 that I already have, an alphanumeric LCD display, a few AY-3-8910 sound chips along with piezoelectric speakers, LM386 amplifiers and audio jacks, and a few crystal oscillators for the sound chips. I’m also expecting a delivery of SN76489 sound chips within the month.

I haven’t made much progress yet; I’ve done some experiments to confirm that the MCP23017 chips and LM386 amplifiers work, but I’ll need to learn how to solder before I can test the alphanumeric LCD, and the AY-3-8910 will take some time to understand before I can get it tested.
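For the MCP23017, a smoke test can be as simple as toggling one output pin over I2C. Here’s a minimal sketch of that idea, expressed as the register writes involved; the helper function and pin choices are illustrative, and the register addresses assume the chip’s power-on default register map (IOCON.BANK = 0) with the address pins grounded, giving I2C address 0x20:

```python
# MCP23017 register addresses with IOCON.BANK = 0 (the power-on default).
IODIRA = 0x00   # direction register for port A (1 = input, 0 = output)
OLATA  = 0x14   # output latch for port A

def blink_sequence(pin, cycles):
    """Build the list of (register, value) writes that toggle one port-A pin."""
    writes = [(IODIRA, 0x00)]             # set all of port A to outputs
    for _ in range(cycles):
        writes.append((OLATA, 1 << pin))  # drive the chosen pin high
        writes.append((OLATA, 0x00))      # and back low again
    return writes

# On the Pi itself, each (reg, value) pair would be sent with something like
# smbus2's SMBus(1).write_byte_data(0x20, reg, value).
print(blink_sequence(0, 1))
```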

Anyway, here’s a basic schematic using the LM386 amplifier:

Breadboard layout of the LM386 amplifier circuit.

Pin 6 on the LM386 IC is connected to a voltage source between 4V and 12V, while pin 4 is connected to ground. Pins 2 and 3 of the LM386 are hooked up to the ground and the positive line of the input from the audio jack respectively. Pins 1 and 8 can be bridged with a 10µF capacitor to raise the gain from the default of 20 up to 200 (or a resistor in series with the capacitor for values in between), but I decided to go without and use the internal gain of 20 just to test that the IC worked. Then, pin 5, the output, is connected to the positive terminal of a piezoelectric speaker, which produces sound, although the datasheet recommends various capacitors on the output in order to smooth out the sound and reduce noise. Pin 7 of the LM386 is a bypass pin, which can be decoupled to ground with a capacitor to reduce supply noise, but it goes unused in this circuit.
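To put the gain figure in context, here’s a back-of-the-envelope calculation (an illustrative sketch, not part of the circuit; the function name is my own). The LM386’s default voltage gain is 20, and bridging pins 1 and 8 with a capacitor raises it to 200 per the datasheet:

```python
# Ideal (no clipping, no load effects) output amplitude for a given
# input amplitude and voltage gain on the LM386.
def output_peak(v_in_peak, gain=20):
    return v_in_peak * gain

print(output_peak(0.05))            # a 50 mV peak input becomes ~1 V peak out
print(output_peak(0.05, gain=200))  # at gain 200 the same input would demand ~10 V
```

The second figure shows why the higher gain setting would clip hard on a single 9V supply, which is part of why the default gain of 20 was fine for this test.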

A new Raspberry Pi and plenty of toys for it

The Raspberry Pi Zero has, since its launch, been one of the most desirable and difficult-to-find models of the single-board computer. I’ve been interested in having one for the novelty of such a small yet capable computer, but its propensity for being sold out has put me off. However, the Raspberry Pi Foundation recently announced the Pi Zero W, a variant of the Pi Zero with on-board Wi-Fi and Bluetooth in the same form factor, priced at $10 rather than $5. Since adding Wi-Fi and Bluetooth capability to a standard Pi Zero would generally cost more than the $5 premium that the Pi Zero W carries over the Pi Zero, and would require awkward USB adapters, this appealed to me and I decided to pick one up along with the official case.

Along with the new Pi Zero, I’ve also picked up the standard and NoIR camera modules, a Sense HAT and a Gertduino add-on board to provide the capabilities of an Arduino Uno to my Raspberry Pis, especially the older Model B boards that I rarely use any more. As with the electronic components, I’ve only tested these enough to verify that they work, but I’m looking forward to using them when I find the time.

With respect to the Raspberry Pis that I already have, I’ve recently installed RetroPie on a spare microSD card for my Pi 3 Model B. I’ve been impressed by the Pi’s emulation performance; even PlayStation games have run smoothly, while older systems like the SNES and Mega Drive have worked excellently. I took the system for a spin with some of my friends on a retro gaming night; let’s just say that my Tekken skills could do with a bit of improvement! Installation of RetroPie is simple and accessible, too; after copying the OS image onto the microSD card, everything else is straightforward, and it’s possible to copy ROMs and disk/disc images across with a USB drive without any particular effort.

A Brief Update

Since my last update, I’ve found myself rather busy with family affairs, but nevertheless have still been working on some technologically-focused aims of my own. I’m looking towards studying and becoming a Red Hat Certified System Administrator by the end of the year, a certification which I hope will serve as a good stepping stone for some of my career ambitions in the future.

In my home environment, I’ve finally got the AMDGPU open-source drivers working on my openSUSE system and my Radeon R9 290; I haven’t had much of an opportunity to really push the drivers to check whether there has been any performance increase over the FGLRX drivers that I had been using, but it at least opens the door to using Vulkan in the future. I’ve started playing a few games as well, but have not been able to finish them; these include Planescape: Torment and Doom (2016). Maybe when I have a bit more time, I’ll be able to go a bit further with these particular games.

But the biggest news of the last three months on the technological side is my purchase of a Synology DS416 NAS for home storage. I’ve loaded it with four 4TB WD Red hard drives and set it up in RAID 5, for a total advertised capacity of 10.90TB. At present, I’ve only allocated 4TB of this space for backups, which I generally access through NFS on Linux but can also set up for SMB on Windows. With iSCSI capabilities on the system, and my own experience with iSCSI from my job doing technical support for a certain brand of enterprise iSCSI SAN arrays, I will likely have ample opportunity to set up the system as VM storage, either for KVM or as a practice environment for VMware ESXi. I’ll report more on this when I’ve had a chance to do performance benchmarking and to set up more environments on the system.

A Repudiation of Donald Trump

So, it happened: the United States made the biggest mistake in its 240-year history, cutting off its nose to spite its face and letting emotion override logic. Donald Trump has been elected as the 45th President of the United States. While there are still some potential challenges to this result, including recounts planned for three narrowly-decided swing states and the possibility, however unorthodox and contentious it would be, of the Electoral College voting against Trump, I think the US needs to resign itself to at least four years of the most unprecedentedly bad choice for President in its entire history. I fail to see even a single positive facet of Trump’s campaign, and an extensive list of negatives.

Trump is, by any reasonable definition of the word, actually a fascist. He has appointed a white supremacist, Steve Bannon (who is incidentally a far-right, echo-chamber bottom-of-the-barrel propagandist), as his Senior Counselor. His entire campaign was built around palingenetic populism (“Make America Great Again!”, “Build the wall!”) and backed by ultranationalists which any person I would consider reasonable would repudiate, rather than embrace. He has threatened to imprison his main opponent and to loosen libel laws so that he can sue media outlets like the New York Times with impunity. We should not be normalising Trump. We should not be rationalising Trump. We made the same mistakes in the past and suffered the consequences for it.

But there’s more. Trump has exhibited a horrendous amount of misogyny, culminating in sexual predation. He has also chosen a Vice President nominee with barbaric views on the LGBT community, who funds organisations who attempt gay conversion therapy, along with other individuals who seek to disenfranchise LGBT individuals.

One of the biggest complaints that Trump supporters made during the election was that Clinton was “crooked”, yet Trump has just settled a $25 million fraud suit against Trump University and has over 70 lawsuits pending against him. That’s before you get into Trump’s conflicts of interest, from his refusal to put his assets into a blind trust, to inviting his daughter (who will, incidentally, be one of the people looking after those assets) to a meeting with the Japanese Prime Minister, to appointing lobbyists and multi-billionaire tycoons with vested interests to positions within his cabinet.

Then there are the views of Trump and his cabinet on science. Trump is an open anti-vaxxer and climate change denier, contrary to the views of the vast majority of qualified scientists, and has already made moves to appoint cabinet members based on those misconceptions. Other members of his cabinet are creationists. All of this points to an incredibly hostile environment for scientists within the next few years.

And let’s not forget Trump’s narcissism, this being a man who stays up to engage in Twitter wars against former beauty queens and respectful criticism of his Vice President nominee. This is a man who is meant to represent the United States of America on the world stage. Does that seem like normal behaviour to you? Does that seem like the actions of a man you can trust talking to other world leaders?

Talking about foreign policy, Trump represents other dangers there as well. He seeks to hold NATO to ransom, while implicitly supporting the imperialism of Vladimir Putin. Putin has likely been salivating over the prospect of rolling T-90s straight into Riga, Tallinn and Vilnius with only nominal resistance. Furthermore, by promoting isolationism (a policy which has had an inauspicious history with respect to the United States), he has created an impetus for several other nations, including several European nations, South Korea, Japan and even Saudi Arabia, to build independent nuclear arsenals. This is contrary to US foreign policy for the last six decades.

Even the elements of Trump’s campaign that could most easily be spun into a positive carry suspicious undertones. He is an ostensibly successful businessman, but one who built that success on inherited money and whose companies have gone bankrupt six times. He is an ostensible political outsider, despite schmoozing with politicians for decades (including the Clintons) and immediately going against his campaign promise to “drain the swamp” by appointing political insiders to his cabinet positions.

The way to fix a broken window is not to burn down the whole house. And voting Trump is like not only burning down the house, but taking a dump on the remains. I can only hope that Europe takes note of this and does not make the same mistakes itself. But I am not holding out hope for that.


The First “PC Master Race” – Part 2: The Golden Age of the 8-Bit Home Computer (up to 1990)

At the end of Part 1 of this series, I discussed two releases in 1985 that would shape the video game markets of Europe and North America respectively in the coming years: The Commodore Amiga and the Nintendo Entertainment System. However, to understand how these systems fit historically into this divide, I must first fill a few gaps that were not addressed in Part 1, including the events that led to the development of both systems and the marketplace into which they emerged.

The state of the European 8-bit home computer market in 1985

The home computer as a market segment emerged in 1977 in the United States and by 1982, there were many home computer developers in the US competing for a piece of the pie. Among these were Apple, Tandy Radio Shack, Commodore Business Machines, Atari, Texas Instruments, Exidy and Timex Sinclair. Furthermore, there were several consoles with pretensions of becoming home computers with add-ons, including the ColecoVision console and Coleco Adam computer, the Intellivision with its Keyboard Component (delays to which would make the Intellivision a running joke in the media) and the Atari 5200 which, under the shell, was a stripped-down Atari 8-bit computer. By 1984, many of these companies had been marginalised or were in the process of leaving the market completely.

A major catalyst for this was Commodore’s release in late 1982 of the Commodore 64, which even at launch was comparatively affordable relative to its competition, but which soon dropped even further in price due to a price war instigated by Commodore’s ruthless founder, Jack Tramiel. This was partially revenge for having been driven out of the electronic calculator market by Texas Instruments, and partially a defence against the Japanese, who were expected to try to take over the computer market the way they had taken over the calculator market. By June 1983, the list price of the Commodore 64 had been halved, from $595 at release to $300, while rebates and bundle deals would drive the price down even further. A contributing factor to Commodore’s ability to cut prices so deeply was its ownership of MOS Technology, which produced the 6510 processor and the custom chips of the Commodore 64, while other computer manufacturers had to purchase their processors and chips from outside suppliers.

In any case, the combination of a low price and the technical sophistication of the Commodore 64’s custom chips meant that few other computer manufacturers could compete. Texas Instruments, against which Tramiel held a particular grudge, had a torrid time with their TI-99/4A system, which soon dropped to an unsustainable price of $99. Manufacturers that had entered the market in search of a quick buck balked at competing with a company so willing to drive down prices in search of market share. Some companies managed to sustain themselves, such as Apple, who were already transitioning to sophisticated GUI-driven systems and managed to keep the Apple ][ line going as a cash cow through their involvement in business and education, but in general, the Commodore 64 proved a difficult machine to compete against in the American market.

Obviously, there were plenty of objections to Commodore’s market practices: from retailers who saw them as predatory, from other companies whose systems matched up badly against the Commodore 64, and even from inside the company itself! Irving Gould, chairman of the company and an investor who had provided money to keep Commodore afloat in the past, clashed badly with Tramiel over his race to the bottom in search of market share. This led to a power struggle which, in January 1984, saw Tramiel kicked out of the company he had founded. With many Commodore personnel leaving to join Tramiel in his next venture, this would have serious knock-on effects, which will be discussed in more detail below; but with Tramiel gone, Commodore looked to diversify and move on from the Commodore 64.

Several systems would be released throughout 1984, but none would achieve anywhere near the success of the Commodore 64. The Commodore 128, a more sophisticated model with more memory and full backward compatibility with the Commodore 64, would come the closest to success with approximately 4.5 million sales over its lifetime, but it was hamstrung by the fact that few developers saw a point in developing software specifically for it rather than for the older and more popular system with which it was compatible. The Commodore 16 and Plus/4 models fared worse; designed as a range of computers to replace the Commodore 64, they were completely incompatible with it and, despite some success in Europe, a complete flop in the US market.

While Commodore was struggling, however, the Commodore 64 was still going strong. The worldwide sales leader until 1985 (when the IBM PC and its clones started to take off, catalysed largely by the platform’s attractiveness to businesses) and dominant in the low-end computer market in the US, it was also in a strong position in the European market, which had, as mentioned in Part 1 of the series, already adopted the home computer as the gaming platform of choice. The Sinclair ZX Spectrum, also mentioned before, was one of its main competitors in this role. Another system, not previously mentioned, would complete the trifecta of the most popular 8-bit gaming computers in Europe throughout the 1980s and early 1990s.

Amstrad, a British company founded by Sir Alan Sugar and then in the field of low-end consumer electronics, decided to join the home computer market in 1984 with the release of the Amstrad CPC. The Amstrad CPC sat somewhere between the ZX Spectrum and Commodore 64 in terms of graphical and sound capabilities, came with an integrated tape drive and, unlike the ZX Spectrum and Commodore 64, was specifically designed around a separate monitor rather than plugging into a television set. At a time when households were likely to have only a single television set, this was a novel feature, freeing the TV for people to watch while the computer was used on the separate monitor. Released at £199 with a green-screen monitor or £299 with a colour monitor, it was a reasonable prospect against the Commodore 64 (then £195.95 on its own, without the C2N Datasette tape deck) or the ZX Spectrum (£129 with 48KB of memory versus the CPC 464’s 64KB, and again without a tape deck included).

These three systems would end up trading blows right through to the early 1990s, with a lot of multiplatform releases which would target all three to different extents. The success of these systems caused other computer manufacturers to diminish, similar to the situation in the United States, with the Oric and Dragon systems mentioned in Part 1 soon going by the wayside and France’s Thomson systems being pushed out by foreign competitors.

A few other systems managed to carve out a slice of the pie. One was the BBC Micro, the platform of genesis for several important games, including the aforementioned Chuckie Egg and the seminal Elite. Another was the MSX series, a Japanese-developed range of computers peculiar among 8-bit systems for not being designed and manufactured by a single company; instead, it was a standard based around off-the-shelf components and a Microsoft-designed BASIC ROM, which any manufacturer could license in order to build their own system. (In this way, it was similar to the IBM PC, which was readily cloned, but with the explicit permission of the standard’s designers.) Different countries had different preferences: the Commodore 64 was particularly embraced in Germany, for instance, while the Amstrad CPC became the most popular system in France and the MSX range was especially popular in the Netherlands thanks to Philips’ production of several MSX systems.


The Amstrad CPC 464, here branded with French-language markings.

This set of competitors represented the more affordable end of systems at the time. But 1985 was important for other reasons, as a new generation of systems entered the market that would later become the focus of Europe’s game developers.

The Commodore Amiga and Atari ST: Beginnings of the 16-bit war

Despite being ousted from his own company, Jack Tramiel didn’t call it a day. He soon established Tramel Technology, with several former Commodore employees joining him, and by April 1984 was planning a new computer based around the Motorola 68000 CPU. Soon afterwards, he learnt that Warner Communications was looking to sell Atari. Atari had been the market leader in the console market prior to the North American video game crash of 1983 (as well as being one of the instigators of the crash) and therefore had the most to lose. By 1984, Atari was haemorrhaging money, losing an estimated $1 million per day and becoming a major drain on Warner’s resources.

In July 1984, Tramiel purchased Atari’s Consumer Division, including their home computer and console assets, and immediately got to work moulding the newly formed Atari Corporation into his own company. Using Atari’s stock of video game consoles as a means to stay afloat, Tramiel’s engineers continued to work on their new computer design. But a couple of months later, Tramiel’s son, Leonard, found a contract negotiated with Atari Inc. which was of particular relevance to a company looking for a new computer design.

Jay Miner, a designer of the custom chips used in the Atari 2600 and the Atari 8-bit family, had tried to convince Atari to invest in a design for a new computer and console architecture. When he was rebuffed, he left Atari in 1982 along with some other Atari staffers and founded a new company, initially called Hi-Toro but later renamed Amiga. Similar to Tramel Technology, they staked their future on the Motorola 68000 CPU as well. However, they had exhausted their venture capital by 1983 and were looking for a way to keep going, which led them back to the door of Atari. Atari agreed to fund Amiga for ongoing development work in exchange for a one-year exclusive deal to produce and sell the machine. However, before Amiga could complete the design, Atari went into freefall with the video game crash in 1983, leaving the future of the company in limbo.

While Tramiel was negotiating with Warner to buy Atari, Amiga was looking for alternate sources of funding and ended up going to Commodore. Commodore had already suffered heavily from brain drain after the departure of its staff to Tramel Technology and were planning an injunction to stop Tramiel from releasing his computer given their belief that the former Commodore employees had stolen trade secrets. Desperate for a new computer design after the failure of the Commodore 16 and Plus/4 and relative failure of the Commodore 128, Commodore looked to buy Amiga outright and cancel Amiga’s contract with Atari. Things didn’t end simply, as Tramiel returned the favour by seeking an injunction himself against Amiga, but Commodore did manage to successfully buy Amiga.

It would not be an exaggeration to call Amiga’s first product, the Amiga 1000 released in July 1985, revolutionary. A multimedia PC before the term was even coined, the Amiga represented a huge jump over the previous generation with class-leading graphics that allowed 32 colours out of a palette of 4096 to be displayed in normal use (and more in special modes), one of the best sound chips ever made in the four-channel, 8-bit PCM Paula chip and a very sophisticated pre-emptive multitasking operating system which was close to ten years ahead of its time. (I have discussed AmigaOS in a previous article.) Yet, despite that, it was not absurdly expensive; at an introductory price of $1,295 (with a monitor for an extra $300), it decidedly undercut the Macintosh 512K priced at $2,795 (which had more memory than the Amiga’s 256KB, but a monochrome screen and a single-tasking OS).

Atari might not have secured the rights to the Amiga, but they did manage to finish their own computer shortly before it, releasing the Atari ST in June 1985. The ST was not as sophisticated as the Amiga: while it used the same 68000 processor clocked about a megahertz faster, its graphical and sound capabilities were less impressive. It could display a maximum of 16 colours on screen from a palette of 512, and it used an off-the-shelf Yamaha derivative of the General Instrument AY-3-8910, with three channels producing square waves or white noise (a chip family also used in the likes of the Amstrad CPC and MSX, along with later ZX Spectrum models). The computer retailed with 512KB of RAM for $799 with a monochrome monitor or $999 with a colour monitor.

In retrospect, the default sound capabilities of the Atari ST were disappointing, given that Atari had been developing an 8-channel additive synthesis chip known as the AMY which would have been inexpensive, yet would have provided a good counter-argument to the sophistication of Amiga’s Paula chip. Instead, they went for a chip which was not even as good as the MOS Technology SID released three years before the ST. (Atari did, to be fair, include built-in MIDI in/out ports, which made the machine popular with musicians, though making use of them required external MIDI hardware.) As well as that, the operating system wasn’t anywhere near as impressive as AmigaOS, with a single-tasking GUI paradigm which was on par with other systems at the time but was far outstripped by the pre-emptive multitasking of the Amiga. (I also have discussed Atari TOS previously.)

Nevertheless, the Atari ST became the bigger sales success early on, with a more approachable price for the families in Europe who bought many of the early units. Commodore’s woeful marketing didn’t hurt either, with Commodore apparently having no idea how, and no funding in any case, to market their sophisticated machine. (Interestingly, an Easter egg found in an early release of AmigaOS illustrated the Amiga engineers’ discontent at Commodore, with a message reading, “We made Amiga, they fucked it up”.) However, Atari wasn’t exactly in the healthiest of states either, with Tramiel’s ruthlessness with the Commodore 64 coming back to haunt him to some extent. Both systems would, over their lifetimes, appeal more to European audiences than Americans, who were instead focusing on other platforms which would shape the future of computers and of video gaming.

The NES: Saviour of the American video game market, but a “cult classic” in Europe

Japan, like Europe, had not suffered heavily in the wake of the North American video game crash of 1983. With a strong domestic arcade market and its own set of personal computer platforms, including the NEC PC-8801, the Fujitsu FM-7 and the Sharp MZ and X1 systems, Japan was able to sustain its own market during the contraction in the US.

Nintendo were one of the notable successes of the Japanese market at the time. Having started developing video games in about 1975, with several arcade games, a few Pong clones and the Game & Watch series of handheld games, they had struck gold with Shigeru Miyamoto’s Donkey Kong in 1981. The first game to feature Mario (then a carpenter named Jumpman), Donkey Kong was a smash hit in the arcade and made it onto several home computers and consoles as well. With this success, Nintendo decided to develop their own video game console.

On the 23rd of July, 1983, Nintendo released the Famicom (or Family Computer) in Japan. Designed around a clone of the same MOS 6502 CPU architecture used in the Atari 8-bit systems, the Commodore 64 and the BBC Micro, the Famicom was more of an evolution than a revolution. It did have better graphics than anything else on the console market at the time, displaying 25 colours on screen at once from a 54-colour palette along with a sophisticated sprite engine, but the CPU was comparable to an Atari 8-bit system, and the sound chip, with its five channels comprising two pulse waves, one triangle wave, a noise generator and a delta-modulated PCM channel, was approximately on par with the Atari POKEY, General Instrument AY-3-8910 and Texas Instruments SN76489 chips found in other consoles and home computers, and couldn’t match the MOS Technology SID in the Commodore 64.

After a slightly slow start, the Famicom soon picked up momentum to become the best-selling console in Japan by 1984. Plans were drawn up with Atari to distribute the system in the US in a modified form. While the name “Famicom” was a bit of a misnomer for a system designed first and foremost for video games, there was an add-on package called Family BASIC, with a cartridge and keyboard peripheral, which allowed the system to be used as a somewhat limited computer through BASIC programming. The US plans would have made the name look far more appropriate: the planned Nintendo Advanced Video System was to come with an integrated keyboard, a cassette drive, a wireless joystick and a BASIC cartridge, which would have made it as much a home computer as a video game console.


The Nintendo Advanced Video System, complete with peripherals.

Of course, these plans never came to pass. Atari delayed an initial 1983 deal to distribute the Famicom in North America after finding that Coleco was illegally bundling its Adam computer with Donkey Kong; despite this being an unauthorised port, Atari took it as a sign that Nintendo was working with a major competitor in the video game market. The deal was cancelled after Atari’s CEO, Ray Kassar, was fired shortly afterwards over insider trading allegations. A later attempt to market the AVS, as mentioned above, also fizzled out, and in the wake of this, Nintendo decided to distribute the system themselves.

This would end up being a fortuitous decision. Nintendo modified the Famicom further, with a front-loading zero insertion force cartridge slot meant to obfuscate the system’s purpose and evoke images of VCRs rather than video game consoles (although this would prove inferior to the card edge connector design of the Famicom and most other cartridge-based consoles), along with the R.O.B. (Robotic Operating Buddy) accessory designed to give the system a place on toy shelves. The result, the Nintendo Entertainment System, was released on the 18th of October, 1985 in limited test markets in the US before being distributed across the whole United States throughout 1986.

The NES would prove, after a shaky start similar to that of the Famicom, to be a massive sales success in the United States, reigniting American passions with video games. Of the 61.9 million NES/Famicom systems sold worldwide, more than half of these were sold in the Americas and “Nintendo” would become a byword for video gaming in the US in the years to come.

The NES would not be so popular in Europe. It was released in two batches: continental Europe (apart from Italy) received the system on the 1st of September, 1986, while the UK, Ireland, Italy, Canada, Australia and New Zealand received it in 1987. The console entered a more challenging market, often with a baked-in preference for home computers. The official sales figure for the NES in regions other than Japan and the Americas is 8.5 million, and while it’s difficult to get a solid figure for anything more specific, it is clear that not all of that 8.5 million was down to Western Europe.

While the NES had some degree of success in countries like France and Germany, video gamers in the United Kingdom were especially dismissive of the system on its release and sales always remained lukewarm even near the end of its lifespan. Nintendo had implemented some practices when developing the NES for the United States that were particularly inappropriate for the UK audience. Nintendo had deliberately targeted the system in the West more towards younger children, with a harsh policy towards the depiction of violence, profanity or sexuality, which made it look a bit “kiddy” when it was introduced in the UK.

Furthermore, in an attempt to mitigate the flood of low-budget shovelware, advergames and pornographic titles that had infested the Atari 2600, Nintendo instituted very tight control over publishing for NES games and mandated the use of a lockout chip that required Nintendo’s approval to produce. This, however, was antithetical to the British video game industry, which revolved so much around bedroom coders and small teams of indie developers who lacked the financial backing to produce cartridges in the first place, let alone ones with a lockout chip. NES games were also considerably more expensive than home computer games on cassette tape, and it was difficult to sell British audiences a device that could only be used for video games when, for the price of a single NES game, you could buy half-a-dozen computer games instead.

By virtue of sales late in its life, the NES would not be a total flop even in the UK, but it was hardly the saviour of the video game industry that it had become in the United States. Its success in the US will become more important later in this series, but for the time being, it served to illustrate the growing divergence in the video game industry between the US and Europe.

Sinclair – pulling defeat from the jaws of victory

The ZX Spectrum had proven to be a big success, with its aim of providing the cheapest possible colour computer resonating well with British buyers who appreciated its “cheap and cheerful” nature. Despite its limitations, such as the rubber-keyed chiclet keyboard and frequent attribute clash due to colour restrictions per on-screen tile, the Spectrum certainly did the trick as an affordable system for learning how to program and play games. However, not all of Sinclair’s ventures were so successful.

I find it interesting that despite the limited utility of early computers apart from entertainment, there were several computer manufacturers that dismissed video gaming as an inappropriate use of their systems. Apple, despite Steve Jobs’ and Steve Wozniak’s history with Atari, actively sought to discourage video game developers early on. IBM didn’t even consider the possibility of video gaming on their systems until the development of the IBM PCjr, which unsuccessfully tried to straddle the ground between the low-end systems like the Commodore 64 and the high-end business market it was already catering for. Sir Clive Sinclair was also famously dismissive of video gaming, having designed the ZX Spectrum to provide people with a platform for programming, but failing to see at the time that what a lot of buyers wanted to program were games.

Sinclair Research sought to follow up the ZX Spectrum in 1984 with the Sinclair QL (or Quantum Leap). It was based around the Motorola 68008, a version of the 68000 somewhat analogous to the IBM PC’s Intel 8088 in that its 8-bit external data bus meant every 16-bit memory access took two fetches, roughly halving the effective speed of the CPU. The QL did improve on the Spectrum in some respects, including the pre-emptive multitasking QDOS operating system, released a year before AmigaOS, but it was hardly the great leap forward that its name suggested.

The QL was announced in January 1984 but was far from ready for production, lacking even a working prototype at launch. Even when the first customer deliveries arrived in April, the machines were found to be unreliable, with multiple bugs in the firmware and numerous issues with the proprietary Microdrive storage system, which aimed to provide a cheaper alternative to the floppy disk by using an endless loop of magnetic tape inside a cartridge case. These issues were later resolved, but the early impression of the system stuck with it until it was discontinued in 1986. Today, the system is arguably most notable for Linus Torvalds having owned one and having had to write his own software due to the poor support that the system received.

The QL wasn’t Sinclair Research’s only failure either. The portable Sinclair TV80 used a flat-screen CRT with a side-mounted electron gun and a Fresnel lens to make the picture look larger than it was, but failed to sell enough units to recoup its development costs. Even this, however, was relatively low-key compared to the biggest flop in Sinclair’s history: the infamous C5.

Sir Clive had held an interest in electric vehicles since the 1950s, and by 1983, the success of the ZX Spectrum gave him the capital to set up his own electric vehicle company, Sinclair Vehicles Ltd. After in-depth research into the matter from the late 1970s onwards, Sinclair Vehicles released the C5 in 1985. It was a notorious flop: underpowered, slow and unsafe, with no weatherproofing – a big mistake in the frequently rainy climate of the UK. With both electric and pedal power, it was meant to bridge the gap between bicycles and cars, but it ended up alienating both sets of people and only sold 5,000 of the 14,000 units produced.

All of these financial failures came at the expense of the device that had made Sinclair’s reputation. While the Spectrum did receive an update in 1984 in the form of the Spectrum+ with a new injection-moulded keyboard to replace the original chiclet keyboard, it took Sinclair’s Spanish distributor to really push for an improved model. The ZX Spectrum 128 added extra memory to the tune of 128kB overall (as the name implied) along with extra features such as an actual sound chip in the form of the AY-3-8912, an RS-232 serial port, an RGB monitor port and a better BASIC editor. Launched in Spain in September 1985, it wasn’t released in its major market of the UK until January 1986.

Faced with financial problems, Sir Clive sold the Sinclair brand and computer technology rights to Amstrad in April 1986. Amstrad continued to not only sell but improve the Spectrum over the years, though their improvements introduced some incompatibilities with the older models, and the Spectrum would never again sell in the numbers it had in the period up to 1985. There was enough of an installed base to keep it relevant in the market, and it had been enough of a success for Clive Sinclair to be knighted in 1983, but momentum was shifting to the Commodore 64, the Amstrad CPC and later the Atari ST and Commodore Amiga.

Now, back to the games!

By 1985, there had already been several smash hits on the 8-bit home computers, both in Europe and in the United States. The European games like Elite, Chuckie Egg and Manic Miner have already been discussed in Part 1, but American games like Epyx’s Games series, Lode Runner and Impossible Mission deserve a mention as they started to make their way over to Europe and began to be ported to the ZX Spectrum, Amstrad CPC and BBC Micro.

For the most part, European game development continued in a similar vein to previous years, with a distinct “bedroom coder” indie approach to a lot of the titles, often with just one or two programmers working on a game in their own time. Programming tools were accessible as soon as users turned on their computers, and there was a steady flow of resources from computer magazines reprinting BASIC and assembly language listings, along with books discussing programming in the various dialects of assembly language on the different systems. All of this made the home computers a far less daunting prospect than the consoles for developing and publishing a successful game.

As mentioned briefly in Part 1, and as with any creative field where the barrier to entry is low, this led to a lot of mediocre and poor-quality games, many of which aped what their developers saw other games doing. This included a large number of platformers and shoot-’em-ups trying – and failing – to emulate the arcade output of Japanese companies such as Sega, Konami, Capcom and Irem. However, the same low barrier to entry also allowed genuinely novel games to make their mark on the systems. Examples include 1985’s Paradroid, first developed by Andrew Braybrook for the Commodore 64 and incorporating elements of both the shoot-’em-up and puzzle genres; 1986’s The Sentinel, an esoteric and very original first-person puzzle game first developed by Geoff Crammond (later known for his series of Formula One racing simulators) for the BBC Micro; the bizarre isometric 1987 action-adventure/puzzle/platform game Head over Heels, designed first for the ZX Spectrum; and the early fighting game International Karate, first developed for the ZX Spectrum in 1985 by System 3 and followed by its even more successful sequel, International Karate + (also known as IK+), in 1987.

Original game series were beginning to emerge as well, often from the platform game genre. The Miner Willy series – comprising Manic Miner, two official sequels in the form of the similarly popular Jet Set Willy in 1984 and the less successful Jet Set Willy II in 1985, along with a couple of spin-off titles – had become a smash hit for the ZX Spectrum early on. The Monty Mole series started in 1984, also on the ZX Spectrum, and received sequels throughout the rest of the 1980s, including Monty on the Run and Auf Wiedersehen Monty, which combined rather expansive multi-screen platforming worlds with a quirky sense of British humour. Similarly, the Dizzy series, first emerging in 1987, received a whole host of sequels up until 1992 and combined platforming with action-adventure elements.

Speaking of arcade games, the original titles on the market began to be joined by a host of ports of popular arcade titles, including Commando, Ghosts ‘n Goblins and its sequel Ghouls ‘n Ghosts, Green Beret, Bubble Bobble, OutRun and R-Type. These invariably did not match up particularly well to the arcade versions. This was not only because of the less powerful hardware of home computers versus the specialised, custom-built hardware of arcade machines, but also because the developers of such ports had little to no official support from the original developers, were not provided with an overview of the internal workings of the games and often had to spend their own money watching the arcade game being played so that they could work out how it functioned by deduction. This was at least one area in which contemporary consoles had an advantage, given that the original developers of the arcade games were often also responsible for their console ports. However, some of the Commodore 64 ports of these games are notable for their astounding soundtracks, among the best on any 8-bit system.

As a matter of fact, very strong music was rapidly becoming a character trait of the Commodore 64. While early soundtracks on the system had tended to use the chip similarly to other sound chips of the time, with a single waveform for each of the SID’s three voices, a trick was devised a few years after the Commodore 64’s release: rapidly changing the waveform on each voice to give the impression of having more channels available at a time. An early exponent of this technique was Rob Hubbard, who popularised the style with his soundtrack to 1985’s Monty on the Run. This particular piece is influenced heavily by Charles Williams’ “Devil’s Galop”, the theme tune to the popular 1950s BBC radio serial Dick Barton, but it includes its own original take on the music, with a sophisticated sound that was simply not possible in the same way on any other contemporary platform. It also lasts an unprecedented six minutes without looping – a veritable lifetime in an era when nearly all game music looped after a minute at most.
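For the curious, the idea behind this waveform-switching trick can be sketched in ordinary Python rather than 6510 assembly. This is a simplified illustration only – real SID drivers poke chip registers once per video frame rather than generating samples, and all of the function names here are my own:

```python
SAMPLE_RATE = 44100
FRAME_RATE = 50  # PAL vertical blank rate, when music drivers typically update the SID

# Crude software stand-ins for the SID's selectable waveforms
def pulse(phase):
    return 1.0 if phase % 1.0 < 0.5 else -1.0

def sawtooth(phase):
    return 2.0 * (phase % 1.0) - 1.0

def triangle(phase):
    p = phase % 1.0
    return 4.0 * p - 1.0 if p < 0.5 else 3.0 - 4.0 * p

def render_voice(freq_hz, waveforms, n_frames):
    """One voice whose waveform is swapped on every frame, Hubbard-style."""
    samples_per_frame = SAMPLE_RATE // FRAME_RATE
    samples, phase = [], 0.0
    for frame in range(n_frames):
        wave = waveforms[frame % len(waveforms)]  # cycle timbres once per frame
        for _ in range(samples_per_frame):
            samples.append(wave(phase))
            phase += freq_hz / SAMPLE_RATE
    return samples

# Six frames of a single 440 Hz voice cycling through three timbres
out = render_voice(440.0, [pulse, sawtooth, triangle], 6)
```

Cycling timbres this quickly blurs them into a single richer-sounding voice, which is roughly how one SID channel could be made to sound like several.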

There were plenty of highly acclaimed British composers, such as the aforementioned Rob Hubbard, who continued to push the limits of the SID after Monty on the Run, Martin Galway, Ben Daglish, Matt Gray and Tim Follin, but other great composers came from elsewhere, like Chris Hülsbeck and Ramiro Vaca from Germany and Jeroen Tel from the Netherlands. Between them, they explored the limits of the SID, often making a game worth buying for the music alone. Several of these musicians would continue to compose for later platforms, particularly Follin and Hülsbeck.

The Commodore 64 wasn’t the only 8-bit platform that could receive good music, though, as good composers could sometimes modify their pieces to work on the less sophisticated AY-3-8910 and SN76489. Tim Follin, an incredible musician who would manage to compose brilliant pieces on every platform he ever touched, even managed to make the primitive 1-bit beeper of the ZX Spectrum produce surprisingly sophisticated polyphonic music resembling (very buzzy) rock and orchestral pieces.
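The core of such 1-bit beeper engines – squeezing apparent polyphony out of a speaker that can only be on or off – can be sketched as a toy Python model. This is my own simplification, not how any real Spectrum engine was written (those were carefully cycle-counted Z80 routines), but it shows the time-multiplexing idea:

```python
SAMPLE_RATE = 44100

def square_high(freq_hz, t):
    """Is a square wave of the given frequency in its high half-cycle at time t?"""
    return (t * freq_hz) % 1.0 < 0.5

def beeper_duet(freq_a, freq_b, seconds):
    """Two 'voices' multiplexed onto one 1-bit output by alternating
    which voice controls the speaker on every successive sample."""
    samples = []
    for n in range(int(seconds * SAMPLE_RATE)):
        t = n / SAMPLE_RATE
        # even samples follow voice A, odd samples follow voice B
        voice = square_high(freq_a, t) if n % 2 == 0 else square_high(freq_b, t)
        samples.append(1 if voice else 0)
    return samples

# A tenth of a second of an A/E dyad on a single-bit "speaker"
duet = beeper_duet(440.0, 659.3, 0.1)
```

Switched fast enough, the ear fuses the interleaved streams into two simultaneous notes, at the cost of the characteristic buzz the Spectrum’s beeper music is known for.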

As the 1980s progressed towards their end, new platforms began to emerge and older ones became more affordable. 1987 saw the release of the Acorn Archimedes, the successor to the BBC Micro, incorporating a sophisticated 32-bit RISC CPU from the ARM architecture as well as a co-operative multitasking OS named RISC OS. (I discuss RISC OS here as installed on the more recent Raspberry Pi.) The Archimedes was not particularly popular in itself; while it didn’t lack for hardware grunt – its 8 MHz ARM2 processor produced around 4 MIPS, approximately three times that of the comparably clocked 68000s in the Amiga and Atari ST – it suffered from its reputation as an educational computer, where the BBC Micro had instead benefitted from one. However, the Archimedes would have an impact that would outlast the computer series itself, as its power-efficient ARM architecture would become the de-facto standard in mobile devices such as PDAs, handheld games consoles and smartphones.

Also released in 1987 was the Amiga 500, a redesigned system based on the hardware of the Amiga 1000, but with more onboard memory and a form factor more in keeping with the 8-bit computers, with the keyboard integrated into the case. With a reduced price to make it more enticing to home users, the Amiga 500 would end up becoming the most popular system in the history of the platform, opening it up in particular to British and German audiences who would embrace the system in the years to come.

Finally, 1987 saw the European release of the Sega Master System. While not as popular as the home computers of the time, and a distant second behind the NES in its generation of consoles, the Master System represented something very bizarre in terms of sales figures: it was far more popular in Europe than in either the United States or its home market of Japan. Selling approximately 6.8 million units in Western Europe, it actually outsold the NES in Europe, playing off a more mature image drawn from the arcade games that were Sega’s bread and butter at the time, as well as a more lax licensing policy than Nintendo’s. Many of its remaining sales came from Brazil, where Sega had shrewdly negotiated a contract for the system to be built and marketed by the Brazilian company Tec Toy in order to avoid huge import duties on foreign-made electronics. Sega had even licensed several of its arcade games for ports to the home computers, including After Burner, Space Harrier and the aforementioned Out Run.

Around the time of the Amiga and Atari ST releases in 1985, games were being made first and foremost for the 8-bit computers, later receiving graphical polish in ports to the more advanced 16/32-bit systems. Near the end of the decade, this situation began to reverse: games would be released on the Amiga or Atari ST first and later filter their way down to the 8-bit computers if they were successful and simple enough to port appropriately. For example, 1989’s Shadow of the Beast by Psygnosis was developed with the Amiga in mind, using complex parallax scrolling and high-quality music to push the system to its limits, but still found its way onto the 8-bit systems soon after its release. Another 1989 title, Populous by Bullfrog Productions, was also first released on the Amiga and later ported to multiple other systems – which in this case did not include the 8-bit computers.

Big-name titles also started coming from countries other than Britain, which seemed to dominate proceedings in Europe when it came to popular and retrospectively acclaimed titles on the 8-bit platforms. One developer from Germany, Manfred Trenz, is of particular note here. One of Trenz’s early projects was doing graphical work on a Commodore 64 game called The Great Giana Sisters, which was published by Rainbow Arts. The game was a polished but very obvious clone of Super Mario Bros., with a soundtrack composed by Chris Hülsbeck, but its similarities to Nintendo’s game brought the risk of legal action against Rainbow Arts and the game disappeared from shelves. Nevertheless, Manfred Trenz was planning his own game and in 1988, he developed Katakis, first released on the Commodore 64 and soon after ported to the Amiga by Factor 5, a group formed by five former employees of Rainbow Arts.

Katakis was again a pretty obvious clone of a pre-existing game, this time Irem’s arcade classic R-Type, and again the threat of legal action loomed over it. However, in a bizarre turn of events, Activision Europe, who held the legal rights to port R-Type to the Amiga, found themselves without programmers for the job and delivered an ultimatum to Factor 5: either develop the Amiga port of R-Type or receive a lawsuit. Katakis, for what it was worth, was later retooled and re-released as Denaris in 1989.

It was Trenz’s next idea, however, that would really kickstart things for Rainbow Arts and Factor 5. Trenz turned his attention to the run-’n’-gun platforming genre and in 1990 developed Turrican for the Commodore 64, with a prompt port to the Amiga by Factor 5. Best described as “Contra meets Metroid” (although apparently heavily influenced by the obscure Japanese arcade game Psycho-Nics Oscar), Turrican blended “don’t stop shooting” side-scrolling action with relatively open game worlds full of secrets containing power-ups and extra lives. Somewhat limited by the one-button Atari-compatible joystick interface on the Commodore 64 and the need to bind the “up” direction to jump, Trenz solved the problem of aiming upwards with a secondary fire mode: holding the fire button while standing still generated a lightning whip that could be rotated through 360 degrees, a novel answer to the limited controls the Atari joystick interface permitted.

While the Commodore 64 version of the game was one of the most solid and technically advanced games on the platform, it was the move to the Amiga where it would really shine. With better graphics and an astounding retooled soundtrack by Chris Hülsbeck – probably his best work until that point, and an illustration of just how well he had made the transition from the Commodore 64 to the Amiga – Turrican stood out on the Amiga as one of the best run-’n’-gun games ever to come from anywhere other than Japan. The game would receive multiple ports to the other 8-bit systems, to the Atari ST and to a few consoles which will receive more attention in the next part of this series, along with multiple sequels throughout the early 1990s.

By 1990, the Amiga had really started to become embraced by European software developers, having reached an affordable cost for a system that despite not receiving any significant upgrades – as will be elaborated on in Part 3 – still managed to impress. But there were new platforms on the horizon, offering a major threat to the Amiga, while horrible mismanagement had already threatened to kill the Amiga and was a constant threat in the years to come. Meanwhile, the Americans couldn’t be ignored forever; having licked their wounds following the video game crash, they were ready to come back in a big way and they favoured consoles over computers and even then much preferred the IBM PC clones to the Amiga. The home computer market was on shaky ground and was close to meeting its doom.

Part 3 will discuss the Amiga in more detail in the period between 1987, with the release of the Amiga 500 and 1994, with the demise of Commodore, along with the other platforms which would soon take the lustre away from the Amiga’s revolutionary design.

Track & Field (NES) – A Retrospective Review

Author’s Note: Having spent most of the month following the Olympics, I thought the following review would be somewhat relevant.

First released by Konami in 1983 for the arcades and subsequently ported to a myriad of different platforms, Track & Field, as the name suggests, is a sports game revolving around track & field athletics events. One of the most notable ports of the game was the NES version, released in 1987 in America and re-released in Europe in 1992 under the title Track & Field in Barcelona, which included five of the six events from the arcade game and three events from the arcade sequel, Hyper Sports.

The gameplay of Track & Field is generally very simple, being played with three buttons: the A and B buttons usually represent one leg each, while a third button stands in for the “action” button on the original arcade machine. Using these buttons, the player is tasked with at least matching a qualifying time or score in order to proceed to the next event – for instance, by repeatedly pushing the run buttons in alternation in the running events, using the action button to jump over hurdles, or by using the run buttons to build up speed in the jumping and throwing events and the action button to set the angle of the jump or throw.

As mentioned above, there are eight events included in the game: the 100 metre dash and 110 metre hurdles for running events; the long jump, triple jump and high jump for jumping events; the javelin throw; skeet shooting; and archery. Of the events originally in the arcade version, only the hammer throw is missing, although personally, I would have preferred it over the awkward skeet shooting and archery events, which are not only dissimilar in several respects to the other events in the game, but also not actually track and field events. There is also nothing in the way of longer-distance running events, which makes sense given the game’s arcade roots, but which would, with something like a stamina bar, have been an interesting complement to the sprinting events. The other events are done very well, though, even if the running events are button-bashing affairs which will wear out your fingers – and your controllers.

The game can be played by one player versus the computer or two players, with two difficulty settings differing in the thresholds that players must reach in order to proceed. The game also gives you a choice of which event you want to start at. In the two-player mode, the players play head-to-head in the races and one after the other in the other events; if one player does not make the qualifying threshold, that player will be eliminated from the game and the other player will continue against the computer.

Graphically, the game is not the most impressive on the NES, but makes a good show of replicating the arcade game. The game also lacks the synthesised voices of the arcade version, but this shouldn’t be surprising given the sound hardware of the NES and the bleeps that the game does include are adequate for the purposes of the game.

Perhaps the most notable thing about the arcade version of Track & Field is that it set in stone the way in which subsequent titles in the same category were played, with its simple button-bashing controls. As a consequence, the game is still very playable and represents rather simple fun, as long as you can get your fingers or thumbs to cooperate with the speed at which you need to press the controls to succeed in most of the events. Strangely enough, I find that the events which don’t require quick fingers are the most troublesome; the skeet shooting and archery events feel out of place – even if the game series soon expanded beyond track and field to cover other Olympic sports – and can be particularly difficult to pass if you can’t get your timing just right. Given that there were events in both Track & Field and Hyper Sports in the arcade that would have fit better, I can’t see why they decided that those two events made a good fit for the structure of the game.

Aside from the criticisms I have regarding those events, the one downfall of the title is that the same simplicity that makes the game easy to pick up and good fun also lends it very little depth. There’s always the challenge of getting a higher score – and even setting world records if you’re good enough – but once you have the formula down, there’s not much else to learn. There are a few Easter eggs scattered around in various places, but ultimately the game sticks to its formula throughout. This is both a blessing and a curse, and as a consequence, the game is better suited to two-player sessions where you have another person to beat.

Bottom Line: Aside from a few stumbling blocks, Track & Field is simple, very easy to grasp and good fun – if your fingers are up to the button-bashing gameplay. However, it is also very formulaic and lacks depth, making it better in two-player mode.

Recommendation: This isn’t a title that is worth going out and spending a huge amount on a cartridge for, despite its fun factor, but if you can find it in a bargain bin somewhere or are willing to go down the emulation route, it’s a fun title which would be particularly good for short bursts of multiplayer gameplay.

The First “PC Master Race” – Part 1: The Start of the European Microcomputer Market (up to 1985)

While I have had my fair share of consoles, both home and handheld, through the years, I have always found myself predominantly drawn to my PCs as gaming platforms. From my first computer, with a 486SX running at 25MHz, a basic VGA graphics card and 8MB of RAM, to my current computer with an overclocked Core i5-4690K, an AMD Radeon R9 290 GPU and 16GB of RAM, each of my desktops has been used heavily for playing video games, even when they were not particularly suited to the games of the time.

Something I’ve noticed throughout the progression of PC specifications over the last fifteen or so years is that PCs have steadily become more compelling options against the dedicated video game consoles of their time. When I got my first computer in 1996, you could specify a computer that would outstrip the consoles of the day, but it came at a considerably higher price and took considerably more effort to get games running on than plug-and-play consoles like the PlayStation. Meanwhile, my PC with its standard VGA graphics card was more akin to the previous generation of consoles, like the SNES, in terms of graphical capability.

In 2016, not only does my current computer, which isn’t even at the pinnacle of PC graphics performance, far outstrip both the Sony PlayStation 4 and the Microsoft Xbox One, but it is possible in the United States to specify a computer that beats both consoles in graphical capability yet costs in the same region as them (well, OK, that’s if you don’t want a Windows OS). Despite the low price, such a PC would also have flexibility and adaptability unknown to the consoles, including the ability to use it for general-purpose computing, along with tens of thousands of commercial games available in just about every genre under the sun. At the same time, the current generation of consoles has been losing some of the traditional advantages of console platforms: split-screen multiplayer, which allowed multiple players to compete on a single console and a single screen, is disappearing, while the plug-and-play advantage of putting a disc or cartridge straight into the console and starting to play is being eroded by the necessity of multi-gigabyte bug-fixing patches.

Despite the improvements to the PC platform which have made it possible to specify a computer that will easily beat the consoles while also having the capacity to do things other than video gaming and media consumption, PCs are still lumbered with a reputation from the days when they genuinely were expensive, temperamental and difficult to set up. Furthermore, certain game developers, lured by the easy money of the console market, have allowed these misconceptions to be treated as gospel by their customers, focusing their games on the consoles and then following up with lazy PC ports which fail to take advantage of the superior graphical potential of the platform and which frequently feature control schemes and user interfaces that assume players are using console-style control pads.

The “PC Master Race” movement – named for a sly insult towards PC enthusiasts and their perceived elitism by the reviewer Yahtzee Croshaw of Zero Punctuation, later adopted as a term of endearment – seeks to spread awareness of the merits of PC gaming at what its members see as the first time in gaming history when PCs have surpassed consoles in every conceivable way for less money. But what if I were to tell you that there was another period when personal computers represented a very compelling alternative to consoles, when they became the preferred gaming platform for most of a continent and when the comparatively high prices of consoles were considered detrimental? The story starts in 1982…

The 8-bit micros take off in Europe

The period between 1981 and 1982 represents a turning point in the history of personal computers. Commercially viable computers had first gone on sale in 1977 in the United States, but none, even the long-lasting Apple ][, would have the market impact of the IBM PC, which would later form the standard for the modern personal computer; the Commodore 64, which would become the best-selling computer model of all time; and the Sinclair ZX Spectrum, one of the few platforms that stood toe-to-toe with the Commodore 64 and managed to hold its own. Each of these computers was created in a turbulent market where dozens of manufacturers worldwide were already jostling for position, and each managed not only to survive but to thrive as many other models dropped off the radar in later years.

The IBM PC was the first step into the personal computer market from what was then the largest computer company in the world, but it was at this point irrelevant to the gaming market – and will only be discussed in passing in this section. Commodore and Sinclair, on the other hand, both had precedent in the personal computer market, each with previous sales successes. Commodore had been one of the pioneers of personal computing in 1977 with the PET 2001, which competed with the Apple ][ and Tandy TRS-80, and followed it up with the first million-selling computer, the VIC-20, in 1980. Sinclair’s first releases, the ZX80 in 1980 and the ZX81 in 1981, were very limited even by the standards of the time, with a scant 1KB of RAM by default, but with release prices of £99.95 and £69.95 respectively, they represented an affordable entryway into hobbyist computing.


The Commodore 64 and 48K Sinclair ZX Spectrum: Two of the fiercest competitors in the 8-bit home computer market.

A notable characteristic of both the Commodore 64 and the ZX Spectrum was that they were particularly inexpensive. The Commodore 64 was released at a price of $595, which compared very well with the Apple ][+ at $1,330, the Atari 800 at $899.95 and the entry-level IBM PC at $1,265. Yet it was surprisingly sophisticated, with the 64 KB of RAM from which it got its name compared to 16 or 32 KB in most contemporaries, a graphics chip which was better than almost anything else on the market, and arguably the best sound chip of any 8-bit computer in the MOS Technology SID, with three voices, each capable of generating four different waveforms and each with its own ADSR (attack decay sustain release) envelope to further modify its output. Its only notable weakness was a comparatively slow processor, a MOS 6510 at 1.023 MHz (or 0.985 MHz in PAL regions), which might have matched the Apple ][ range but did not compare well to the 1.78 MHz processor in the Atari 8-bit computers.
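For readers unfamiliar with the term, the ADSR envelope each SID voice applies to its waveform can be sketched in a few lines. This is only an illustration of the general attack/decay/sustain/release idea; the function name and all the timing values here are made up for the example and don’t correspond to the SID’s actual register settings or rate tables.

```python
def adsr_level(t, attack=0.1, decay=0.2, sustain=0.6, note_off=1.0, release=0.3):
    """Return the envelope amplitude (0.0 to 1.0) at time t seconds.

    Illustrative parameters only: attack/decay/release are durations,
    sustain is a level, and note_off is when the 'key' is released.
    """
    if t < attack:                          # attack: rise linearly from 0 to peak
        return t / attack
    if t < attack + decay:                  # decay: fall from peak to sustain level
        frac = (t - attack) / decay
        return 1.0 - frac * (1.0 - sustain)
    if t < note_off:                        # sustain: hold while the note is down
        return sustain
    if t < note_off + release:              # release: fade from sustain to silence
        frac = (t - note_off) / release
        return sustain * (1.0 - frac)
    return 0.0                              # note has fully died away
```

Sampling this function over time traces the characteristic ramp-up, dip, plateau and fade-out that lets a single waveform imitate anything from a plucked string (short sustain) to an organ (long sustain).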

The ZX Spectrum was not as sophisticated: its graphics hardware lacked hardware sprites and had a more limited colour palette than the Commodore 64, and its simple one-channel beeper was significantly more limited than the SID on the Commodore machine. On the other hand, it was significantly cheaper on release, at £125 (approximately $220 in 1982) for the 16 KB model and £175 (approximately $310) for the 48 KB model. Both computers would soon become even cheaper, with Commodore engaging in a price war against its competitors in the United States which saw the Commodore 64 drop to $200 by 1983, and Sinclair decreasing prices on the Spectrum in response.

The low price of both computers is significant in the economic context of the time. The economic recession of the early 1980s had a greater effect on Europe than it did on the likes of the United States and Japan, and hit the United Kingdom especially hard, the country having experienced a string of crises throughout the 1970s. In particular, the exchange rate of the pound sterling dropped significantly in the period from 1980 to 1985, from an average of $2.33 in 1980 to $1.29 in 1985. While adoption of computers was slow between 1982 and 1983, with an estimated 600,000 microcomputers in the UK by the end of 1983, sales picked up significantly by 1984, by which time the state of the UK economy would dictate that less expensive computers were the most likely to succeed. There were similar situations across Western Europe, and as other European countries lacked a strong indigenous home computer market, consumers in these markets were inclined to buy American or British models.

The low price of the Commodore 64 is also significant for – and has been cited as a cause of – an event in the video game market which had a huge impact in the United States but little effect in other markets. The North American video game crash of 1983 has become legendary: the glut of consoles on the market at the time succumbed to the arrogance of the marketing executives pushing such systems, who seemed to believe that customers would eat up whatever shovelware the game developers could push out and come back for more, even as the rapid decrease in price of the Commodore 64 made it a compelling alternative.

As unsold copies of the overproduced E.T. The Extra-Terrestrial for the Atari 2600 were being buried in a landfill in New Mexico, causing a contraction in the North American video game market that would last until the 1985 release of the Nintendo Entertainment System, anyone in Europe would have been forgiven for not realising that the crash had happened at all. The games market in Europe was already based around personal computers, most notably the Commodore 64 and ZX Spectrum, but also several other predominantly British home computers such as the Acorn-designed BBC Micro, the Dragon 32/64 systems from Dragon Data and the Oric systems from Tangerine Computer Systems. The NES wouldn’t be released in Europe until 1986, and not in the UK until 1987, by which time the personal computer market had well and truly taken hold. Even by 1983, games like Manic Miner and Chuckie Egg, which would become known as some of the best games available on 8-bit platforms (and which, incidentally, exemplified the “one man programming in his bedroom” sensibilities of the European game development sphere), had already been released – and things were only just getting started.

The home computer market picked up significantly in the UK in 1984, as more than one million home computers were sold, more than doubling the number in the country. Exposure of the home computer was helped by the BBC’s Computer Literacy Project and television shows like The Computer Programme and Micro Live. For the former, the BBC had put their name to the BBC Micro, an expensive yet sophisticated computer designed and produced by Acorn Computers. While the top-of-the-line 32 KB BBC Micro Model B, at a release price of £400 in 1981 (approximately $800 at the time), was too expensive for most households, it did find its way into many schools and later sold a respectable 1.5 million units over its history.

The BBC Micro is not just significant for its role in the BBC’s efforts to spread computer literacy, however, as it also plays a large role in computer gaming history. In 1984, a pair of students at the University of Cambridge, David Braben and Ian Bell, would work together to release the seminal game Elite, creating a legacy that lives on to this day. Elite is one of the earliest sandbox games: a space simulator in which the player is given the freedom to play in any of a multitude of ways and in which there is no true victory condition. This contrasted heavily with the general pattern of games of the era, which were still generally simple, arcade-style affairs. Yet, despite this, Elite was very successful, soon spreading from the BBC Micro and the similar, more game-focused Acorn Electron to platforms ranging from the Commodore 64 and Spectrum to the Apple II, the Japanese MSX range and even the Taiwanese Tatung Einstein, and then to the next generation of home computers in the mid-1980s.


Elite on the BBC Micro: Wireframe 3D graphics and a universe to explore on less than 32 KB of memory.

Elite was arguably the most sophisticated game of its time and, being designed for a home computer with more memory than consoles would have until the Sega Mega Drive in 1988, was very much a PC-focused game. While it was eventually ported to a console – the NES – in 1991, this required additional hardware and memory mappers to make up for the limitations of the console. In any case, until that point, if you wanted to play Elite, you needed a PC of some sort.

In discussing the efforts of British coders during this period, I do not intend to ignore the fact that American studios were also developing sophisticated games for home computers at the time, including Richard Garriott’s Ultima series of role-playing games; and while the ZX Spectrum design only reached America in the form of the largely incompatible Timex Sinclair 2068, the Commodore 64 was wildly popular in the United States as well. However, few American games became sales successes in the UK or the rest of Europe, for reasons largely linked again to the economic downturn. European audiences predominantly bought their software on cassette tapes, which had excruciatingly long loading times even by the standards of the day but were cheap and could be used with a standard cassette player that was likely already in the home. American games, by contrast, were written for floppy disks, which granted greater capacities (as the whole game no longer had to be loaded into memory at once) and significantly improved loading times, but were more expensive and required the purchase of a floppy drive on top of the base package.

On the other hand, not having been scared off from the games industry by a market collapse as the Americans were, European coders felt free to exploit their home computers to the limits. Several elements made home computers much friendlier for hobbyist coders taking the step into commercial game development, including the use of rewritable media like cassettes and floppy disks rather than the cartridges of the consoles (although several home computers did have the capacity for cartridge-based games). As a result, a huge number of one-man projects were started and had the capacity to become commercially viable. This did, predictably, lead to a lot of dross mixed in with the good games, but it created a crucible for innovation and diversity the likes of which would rarely be seen in the industry again.

1985 saw the Western release of two systems that would, in the coming years, very much illustrate the differences between the American and European game markets. The Commodore Amiga 1000 was the most sophisticated home computer of its time, and while that model itself would not become particularly successful, successor machines such as the Amiga 500 would find far more success in Europe than in the United States from which they came. On the other hand, the Nintendo Entertainment System, derived from the Japanese Famicom (or Family Computer), would be seen as the saviour of the games industry in the United States but was far less successful in Europe. In the meantime, though, the 8-bit home computers had a lot more to offer… and the Germans had not yet shown their best.

Part 2 of this series will discuss the years leading up to 1990, which represented a golden age for the home computer in Europe – and how complacency, bad business decisions and the growing threat of the IBM PC would soon bring the supremacy of the personal computer to an end for several years afterwards.

FROM THE ARCHIVE: Battlezone – A Retrospective Game Review

“Let them have their ticker tape parades, their ‘space races’ and their commemorative packets of dehydrated ice cream. While Von Braun takes credit for his Redstone bottle rockets, I am finalising plans for an inter-planetary fleet that could plant an American flag on every rock and pebble in this solar system by the end of the decade. I will be watching the sunrise from Olympus Mons long before NASA takes their first steps on the moon.” – Dr. Wilhelm Arkin, Battlezone

Battlezone is a 1998 PC first-person vehicle shooter/real-time strategy hybrid, developed and published by Activision. Despite its innovative “field commander” concept, its ambitious and impressive gameplay and its interesting and varied range of settings, it remains an obscure title to this day.

The story of the game starts in 1952, when an investigation into a meteor shower near the Bering Strait leads to the discovery of a strange extra-terrestrial material, soon dubbed “bio-metal”. Further investigation reveals that weapons can easily be fashioned from it – weapons which appear to derive from some sort of “memory” in the material that lets it reshape into its previous forms. These weapons systems, shaped into vehicles resembling the tanks of Earth, have very promising properties, like the ability to counteract gravity, to redistribute damage over their entire bodies rather than taking damage at any specific point, and to draw on a single ammunition source for every sort of weapon.

With both the Americans and the Soviets having acquired samples of this bio-metal, it is clear that both sides will have investigated its properties and, with those properties being so promising, that both will compete for the further bio-metal believed to exist elsewhere in the solar system. With the bio-metal at their disposal, either side could drive through their opponents’ cities with impunity, ending the Cold War with a single stroke.

In order to collect this bio-metal, Dwight Eisenhower establishes a secret space organisation under the control of the National Security Agency, named the National Space Defence Force, or NSDF. Recruiting under the auspices of NASA, the NSDF, with overpowering weapons constructed from the bio-metal at their disposal, set forth to establish a lunar outpost and begin gathering as much of this strange alien material as possible.

However, the NSDF are not alone in space. The Soviets have earned a substantial lead thanks to their technical advantage in developing space technologies, and their Cosmo Colonist Army, or CCA, outnumbers the NSDF contingent and fields superior weapons systems. With the Soviets overpowering the NSDF, the commanders of the American forces must quickly make up for their slow start.

But many questions remain about the bio-metal. Where did it come from? What relation does it have to the seemingly alien structures located on the celestial bodies over which the NSDF and CCA pursue each other? With all of these questions and more, the future of humankind rests on the secrets of the bio-metal.

The plot of Battlezone is not necessarily the strongest I’ve ever seen, but it’s definitely in the upper echelon, and it is told exceptionally well through the game and the manual – which comes from the days when games shipped with a substantial manual in the box. I’m especially enamoured of the connections with the space missions undertaken by the Americans and the Soviets at that point in time, and I’m glad to finally see a space conspiracy which doesn’t suggest that man never landed on the moon. The angle of a secret space war raging while the American and Soviet populaces remain occupied with the political concerns of the era is also interesting, and overall it gives the game a nice political and military slant.

During the game itself, the plot is never made too elaborate, with little snippets of detail coming from the player character’s reminiscences during the loading screens and from the general feed of information running through the game. The plot never interferes with having a good time, which is an imperative of action game design.

Speaking of game design, the gameplay is a great strength of this game. The first-person shooter and the real-time strategy are not genres that you would necessarily expect to work well together, but Battlezone melds the two very well by making the player a field commander instead of a rear-echelon general. As such, the forces available are limited, and the clever commander will have to use them in the most appropriate fashion.

All of the commands can be accessed using a small selection of keys.

Battlezone is first and foremost a first-person shooter, with the player taking control of a number of the bio-metal vehicles and bringing them into combat against the enemy forces. These vehicles, due to their anti-gravity, carry a large amount of momentum, presenting them as moving targets at all times and keeping first-person shooter techniques such as strafe-running viable in this game.

Due to the concept of “one ammunition source for all weapons”, there are never any logistical problems where you lack ammunition for one weapon in particular, and logistical requirements are quickly resolved by the production buildings in the game. This sort of unrealism is acceptable in a game which never claims to be anything less than fantastical. Another unrealistic feature is the concept of equally distributed damage (EDD) armour, which renders the real-life tactic of flanking obsolete and generally simplifies the game.

The gameplay isn’t limited to vehicles, either. It is possible to hop out of a vehicle and transfer to another, or to bail out when your vehicle is destroyed and progress on foot, although infantry are far weaker than vehicles. To balance the game when the player is on foot, it is possible to use a high-powered sniper rifle to take out the pilots of enemy vehicles and commandeer their machines against them. This generally improves the survivability of the player, and there is one mission during the campaign where commandeering enemy vehicles is actually required. Unfortunately, that mission doesn’t use the device particularly well, and I prefer using it as a matter of expediency rather than as a necessity to progress through the campaign.

The other half of the game, the real-time strategy, takes a back seat to the action but is necessary to proceed, because the odds will definitely be against the lone wolf. The central unit in this part of the game is the Recycler, the fundamental construction unit from which all other units are derived. It is imperative to keep it alive: if it is destroyed, the game is lost. The Recycler creates the most basic units in the game, including the Scavenger, the game’s resource collector, and also produces the Factory, responsible for building advanced units; the Armoury, which produces weapons systems and provides long-range logistics; and the Constructor, which builds bases and also serves as a resource stop-off point.

The other production units all come into their roles nicely, with the Factory making everything from tanks and rocket tanks up to the Walker, a huge and extremely slow but very powerful attacking unit which can steamroll over the enemy if used correctly. The Armoury delivers replacement weapons systems, allowing players to customise their vehicles for different situations, and also delivers repair and ammunition supplies to any spot on the battlefield – though the further the spot is from the Armoury, the longer the delivery takes. The Constructor takes care of base building, providing power plants, fixed emplacements, and supply and repair facilities.

The resource in the game is the unit of bio-metal scrap, gathered by the Scavenger. The game goes to great lengths to make sure that the player never gets bogged down in the standard real-time strategy chore of “peasant watching”, by making the Scavenger units relatively autonomous and able to find scrap supplies easily by themselves.

Another way in which the game makes sure that you don’t get bogged down in the more monotonous aspects of many real-time strategies is by restricting the number of units of each type to ten. This means not only that your population is not dominated unnecessarily by resource collectors, but also that the “Zerg rush” that plagues most real-time strategy games is removed, replaced by a concept of tactics which transcends throwing everything you have at the opponent and hoping for more kills on your side than theirs.

The game further allows for the survival of the units under your control by letting them use the same repair and ammunition resupply facilities that you yourself can use, and allows a commander to reorganise their army by recycling units for their scrap value, ensuring that you need never be left with units that are no longer of any use.

The Recycle option makes up for that restrictive population limit.

I found this gameplay refreshing, and still do, especially after being crushed by the likes of an army of Zealots and Dragoons in the original StarCraft, but you can’t really appreciate the full complexity of the game in single-player. Unfortunately, due to a lack of internet access at the time of playing, I never tried the multiplayer, but I did investigate it, and it seems like an ideal experience for those looking for more tactics than most first-person shooters offer, without moving over to the hardcore difficulty of micromanaging every unit as in many RTS games.

For those looking for a quicker thrill than the strategy-based main game, there is a more traditional deathmatch style multiplayer mode, but the real meat of the game is definitely in the strategy game, and most of the multiplayer maps are set up with that in mind.

Unfortunately, this game isn’t perfect, and there are quite a few flaws lying around the place. It is a particularly buggy game and somewhat incompatible with modern operating systems. The most obvious bug that I found was an inability to get the hardware 3D renderer operating properly – no matter what system I tried it on, the game crashed. Windows 98, Windows Me – well, of course it was going to crash there! – Windows 2000, Windows XP: all were afflicted with this 3D renderer bug, meaning that I had to run the game in the decidedly inferior software rendering mode. I tried a multitude of graphics cards as well, and every one of them choked when hardware rendering was on.

This doesn’t render the game unplayable – and I was rather glad of the software renderer in an era when I only had a 1MB 2D graphics card and was running Windows 95 – but it is a disappointment, because the software renderer produces quite a few jaggy and grainy images, particularly noticeable on the pylons in the training area and on some of the vehicles.

This isn’t helped by the low maximum resolution. 640×480 was acceptable when I was running Windows 3.1 on my first computer, but it’s not exactly what I want to play games at in this day and age. System Shock played at 640×480 four years before this, and Half-Life went to 1024×768 and above that very year. I don’t normally complain about graphics, but it does seem a bit ridiculous considering that other first-person games that year could reach the sort of resolutions I typically use. Again, I wasn’t complaining when it was a matter of expediency with my ancient S3 graphics card – which, incidentally, I still own – but with the gift of hindsight, I can see that they left little room for posterity.

To be honest, though, it doesn’t matter that much. Battlezone is fun, it’s original and it’s clever. Gifted with a great plot and fantastic, intelligent gameplay, and bringing new ideas to the world of first-person shooters, Battlezone deserves far more attention than it received.