
The Evolution of 3D Graphics in Video Games (The Hunt for Photorealism)

A colleague shared this story about his first time playing a game on the PS5 – he had downloaded God of War and had just started to play the initial sequence when his father, who happened to be watching, exclaimed, “Why do they make movies anymore when games look so realistic?”

God of War (2018) looks stunning, but it is not exactly photorealistic. However, the reaction of our colleague’s father suggests that today’s major games could, perhaps, pass for live action to an untrained eye – clearly, games have come a very long way from the first 3D titles of the late 1990s. 

God of War (Courtesy Sony)

In this blog we will delve into the major milestones in the development of 3D game graphics, which have evolved from very basic and primitive effects to the near-photorealistic visuals we enjoy today. We will discuss the advent of true 3D games and then trace the evolution of how 3D games are rendered – or how developers kept pushing the envelope to make games look increasingly realistic, immersive and believable.

Before the advent of true 3D, games such as Doom (1993) and its numerous clones had faked the illusion of 3D using 2D game objects. The enormous success and technical innovations of Doom soon led to full 3D games – developers were keenly aware that gamers wanted true 3D titles and worked ceaselessly to create such experiences by designing and using new game engines.

The Advent and Rise of 3D Games

3D gaming rose to prominence not only because developers were striving to create 3D-capable engines, but also because hardware manufacturers were shipping the first true graphics cards, and the developers of games for the Nintendo 64 (1996) and the Sony PlayStation (1994) were trying to make true 3D games that would reach mainstream audiences. All these factors conspired to make 3D gaming prominent by the late 1990s.

Hardware Acceleration Reaches Consumers

Hardware acceleration is a process by which certain workloads are offloaded to specialised hardware capable of parallel processing, which can execute these demanding tasks more efficiently than a software application running on the CPU.

Early graphics cards were designed to support hardware acceleration for video game rendering and one of the first successful cards of this type was the Voodoo 1, made by the company 3dfx and launched in late 1996. By the end of 1997 it was the most popular card among developers and consumers, though 3dfx soon declined with the ascent of Nvidia, which would buy 3dfx Interactive in 2000.

Tech Pioneers Start Making True 3D Games

The first 3D games were the result of unceasing innovation by a handful of brilliant programmers at id Software and Epic Games. At id, John Carmack spearheaded the creation of the Quake engine in 1996, which featured real-time 3D rendering and support for 3D hardware acceleration. The engine used static light maps for stationary objects and environments, while moving bodies such as the player character and enemies were lit dynamically. Tim Sweeney of Epic Games introduced 3D graphic effects way ahead of their time with his Unreal Engine, which used clever tricks to simulate soft shadows, volumetric fog, dynamic lighting and more.

Quake (Courtesy id)

5th-Gen Consoles and Mainstream 3D Gaming

Advancements were not just restricted to PC hardware and games – consoles also gave a major push to the emergence of games with 3D graphics. The Nintendo 64’s hardware architecture powered true 3D games such as Super Mario 64 (1996) and The Legend of Zelda: Ocarina of Time (1998), and the PlayStation also had great-looking 3D games such as Gran Turismo (1997), a racing game that uses full 3D environments. Like the Nintendo 64, the PlayStation used custom hardware to make 3D graphics possible, and the enormous success of the PlayStation, the first console to sell more than 100 million units, propelled 3D games into the mainstream.

The Legend of Zelda: Ocarina of Time (Courtesy Nintendo)

By the late 1990s, both PC hardware and consoles were capable of supporting 3D games and there had been a decisive shift toward 3D gaming. The next challenge was to make such games look as realistic as possible. Both Carmack and Sweeney had experimented with various rendering techniques to make their blocky 3D games look more realistic, but those titles were a far cry from what we see today. Since the late 1990s, developers have continued to push 3D game rendering toward photorealism, and this endeavour continues to this day. This blog is hence a history of advancements in 3D rendering – it ends with real-time ray-tracing, but only time will tell what new avenues game developers will explore.

How Game Graphics Evolved Towards Realism

When developers strive for photorealism, they use every tool at their disposal to achieve it. In the following sections, we discuss the key innovations in game graphics, and how each of them dramatically increased the realism of 3D game rendering.

Normal Mapping – Detailing Optimised Game Models

Every 3D model (or mesh) in a game is composed of triangles, and will generally have gone through several iterations of optimisation to reduce its ‘polycount’, i.e., the number of triangles (or polygons) it has. During the design stage, however, high-poly models, containing lots of detail, are created using 3D design tools such as 3ds Max, Maya, ZBrush and others. Such high-res models can contain more than a million polygons and simply cannot be deployed in-game – the renderer would choke – but their details are required to make a scene believable. This is where normal mapping comes in – through a process known as baking, the detail of a high-res model is transferred to a ‘map’, or texture, which the game engine can use to give an optimised model the illusion of detail. Such normal maps can also convincingly mimic how the high-polygon model would respond to lighting, furthering the illusion that you are seeing a detailed in-game object, and not an optimised mesh linked to a normal map. The key benefit of a normal map is not just that it creates the impression of detail, but that it does so with highly-optimised, game-ready geometry.
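Conceptually, the renderer decodes a normal from the map for every pixel and uses that vector, rather than the flat triangle normal, in its lighting maths. The Python sketch below is purely illustrative (no engine works in plain Python, and real shaders also handle tangent space, filtering and more), but it shows the core idea:

```python
import math

def decode_normal(rgb):
    """Map an 8-bit RGB texel (0-255 per channel) to a unit normal in
    [-1, 1]^3: the standard encoding used by tangent-space normal maps."""
    n = [c / 255.0 * 2.0 - 1.0 for c in rgb]
    length = math.sqrt(sum(v * v for v in n))
    return [v / length for v in n]

def lambert(normal, light_dir):
    """Diffuse (Lambertian) intensity: the per-pixel normal decoded from
    the map, not the flat triangle normal, drives the shading."""
    return max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)

# The 'flat' texel (128, 128, 255) decodes to a normal pointing straight
# out of the surface; any other texel tilts the shading per pixel.
flat = decode_normal((128, 128, 255))
```

This is why a flat, low-poly wall lit by a moving torch can appear to have bricks and mortar: the normals stored in the texture vary per pixel even though the geometry does not.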

Normal Mapping Allows Detail to be Preserved in Highly Optimised Meshes (Courtesy Wikimedia Commons)

Nvidia’s GeForce 3 was the first card to support textures such as normal maps and specular maps – the former gave models a detailed appearance, and the latter controlled how shiny or glossy the model would look. A custom-made version of this card was used in Microsoft’s first Xbox, which used normal mapping extensively. The PS2 didn’t include support for such texture-mapping, but by the seventh generation of consoles, normal mapping was the norm across PCs and consoles. 

Implementing normal mapping and other textures in games was a major breakthrough, and normal maps are used to this day in games and other graphics pipelines. Today’s graphics processing units (GPUs) are capable of rendering far more polygons, but this capability is used in conjunction with normal mapping to make ultra-realistic scene assets.

The Transition to HD Gaming

HD-TV is an advancement in digital display technology that became available to consumers in the early 2000s and became widespread within a few years. The resolution of digital TV sets and monitors is measured by the number of pixels on the display, and in the early years of HD-TV, display resolutions ranged from 720p (921,600 pixels) to 1080p (over 2 million pixels).

Seventh-generation consoles such as Sony’s PS3 (2006) and Microsoft’s Xbox 360 (2005) supported both HD gaming and HD video playback. In 2005, Microsoft exec J Allard touted the Xbox 360 as the console that would usher in a new era of HD gaming, even though gaming in high resolutions had been possible on PCs for many years before these consoles, thanks to the power of dedicated PC graphics cards.

God of War Was One of the First Franchises to be Remastered in HD (Courtesy Sony)

However, it was the consoles, which could support gaming and double as home entertainment systems, that made HD gaming mainstream. Sony remastered many of its PS2 games to run on HD screens with the PS3. Many of these remasters, which now played at higher resolutions, had much sharper image quality and better-looking character models. Higher resolutions also decrease aliasing – the jagged edges that appear on rendered game models. This type of visual artefact can be quite distracting, and an HD display can help make for a more immersive experience by minimising aliasing artefacts. HD resolutions do not automatically imply photorealistic renders, but they can help bring out the detail in renders by achieving high image quality.

Advancements in Graphics Shaders

A shader is essentially a piece of code that runs on the GPU and contains specific instructions on how to render a 3D scene in pixels, or how to manipulate a 2D image before it’s shown on-screen. Shaders can tell the renderer how a 3D object should be lit, how it should be coloured, what it reflects and much, much more. Early graphics cards had fixed-function rendering pipelines, limiting the sort of effects that could be applied while rendering a scene. But the advent of cards with programmable shaders – the first of which was the GeForce 3 – utterly transformed 3D rendering. In fact, the use of normal and specular maps to texture an object requires a programmable shader pipeline. 

Within a few years, developers had written highly complex shaders. In 2007, a programmer working on Crytek’s engine developed screen-space ambient occlusion (SSAO), which darkens the creases, holes and dents of an object, and the areas where it is in contact with other objects, resulting in a more realistic scene that looks like it is responding to indirect or ‘ambient’ light. The shader was first used in Crysis (2007), a game now legendary for its demands on computer hardware. Crysis in fact contains more than 85,000 shaders, which all but melted the graphics hardware of the time – and contributed greatly to the realism of the game.
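The core trick of SSAO can be sketched in a few lines: for each pixel, sample nearby values in the depth buffer and darken the pixel in proportion to how many neighbours sit in front of it. The Python below is a toy version of that idea (the real shader works with view-space positions, normals and carefully distributed sample kernels, none of which are modelled here):

```python
import random

def ssao(depth, x, y, radius=2, samples=8, bias=0.05):
    """Toy screen-space ambient occlusion over a 2D depth buffer: count
    how many randomly sampled neighbours are closer to the camera than
    this pixel. More occluders means a darker ambient term."""
    h, w = len(depth), len(depth[0])
    occluded = 0
    for _ in range(samples):
        sx = min(max(x + random.randint(-radius, radius), 0), w - 1)
        sy = min(max(y + random.randint(-radius, radius), 0), h - 1)
        if depth[sy][sx] + bias < depth[y][x]:  # neighbour is in front
            occluded += 1
    return 1.0 - occluded / samples  # 1.0 = fully lit, 0.0 = fully occluded
```

A pixel on a flat wall gets an occlusion factor of 1.0 (no darkening), while a pixel at the bottom of a crease, surrounded by closer geometry, gets darkened, which is exactly the effect visible in Crysis’s nooks and corners.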

Crysis has Always Pushed the Envelope in Terms of Game Graphics (Courtesy Electronic Arts)

Crysis 2 (2011) used a screen-space reflection shader to render reflections on glossy or glass-like surfaces and objects. The SSR shader contributes a lot to a rendered scene, but has its limitations, and is used in conjunction with other techniques such as cube mapping (also implemented via a shader) to create realistic in-game reflections. 

Crysis 2’s Screen Space Reflection Shader (Courtesy Electronic Arts)

Deferred rendering was another major screen-space shading technique developed in the late 2000s. In essence, this technique allows a scene to be rendered with many more lights, by rendering the game geometry and the scene lighting in separate rendering passes. In traditional ‘forward’ rendering, increasing the number of lights can rapidly increase rendering times, but deferred rendering enables more lights and more realistic world lighting without impacting render times significantly. Games such as Dead Space (2008) and Killzone 2 (2009) were among the first to implement deferred rendering, and it has now become an industry standard.
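In code terms, deferred rendering splits shading into two passes: a geometry pass that writes per-pixel attributes into a set of buffers (the ‘G-buffer’), and a lighting pass that shades from those buffers alone. The Python below is a hypothetical, drastically simplified model of that structure (no real engine looks like this; the ‘lighting’ is a crude inverse-square falloff with no shadows):

```python
def geometry_pass(scene_pixels):
    """First pass: rasterise the geometry once, storing per-pixel
    attributes (albedo, normal, position) in a G-buffer instead of
    shading anything yet."""
    return [{"albedo": p["albedo"], "normal": p["normal"],
             "position": p["position"]} for p in scene_pixels]

def lighting_pass(gbuffer, lights):
    """Second pass: shade each pixel once per light using only the
    G-buffer, so cost scales with pixels x lights, not triangles x lights."""
    frame = []
    for px in gbuffer:
        intensity = 0.0
        for light in lights:
            d = [l - p for l, p in zip(light["pos"], px["position"])]
            dist2 = sum(c * c for c in d) or 1e-6  # inverse-square falloff
            intensity += light["power"] / dist2
        frame.append(tuple(min(c * intensity, 1.0) for c in px["albedo"]))
    return frame
```

The payoff is in the second pass: each light touches only pixels, never the scene’s triangles, which is why dozens of lights become affordable.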

Killzone 2 Was One of the First Games to Use Deferred Rendering (Courtesy Sony)

The shaders described above improve lighting, shadows and reflections and thereby help make a scene look more realistic and believable. However, shaders are capable of many more effects, such as anti-aliasing and contrast-adaptive sharpening, which allows games to be run at lower resolutions by intelligently sharpening the upscaled render. Such shaders can even be injected into games through custom programs like ReShade, which boasts an ever-growing library of shaders and supports a huge variety of games.

Physically-Based Rendering – a New Paradigm

The hunt for photorealism is punctuated by various paradigm shifts, and the advent of physically-based rendering (PBR) around the early 2010s is probably one of the most important. Graphics shaders had incrementally improved the look of games, striving for realism by improving how in-game scenes were rendered. Shaders could do everything from scene lighting to post-processing effects such as camera filters and depth of field. But these shaders were all working on models that were textured in what is now known as ‘traditional’ or ‘non-PBR’ workflows, where the diffuse and specular maps of game assets were painted by texture artists and did not generally reflect the real-world properties of such assets.

Traditional vs PBR Shading – Notice the Accurate Reflection on the Rifle’s Scope on the Right (Courtesy Marmoset)

PBR is crucial to photorealistic renders, because the associated texture sets and shaders accurately model how light interacts with in-game objects. In real life, a shiny gold crown or copper bracelet will look gold or orange – not because these metals are in any way ‘pigmented’, but because they absorb certain wavelengths of light and reflect others. PBR shaders model this accurately, giving a clear grey sheen to most metals, but giving coloured metals or alloys their characteristic tint based on the wavelengths they reflect.

Even non-metallic objects are textured so that they reflect a tiny bit of light, as they do in real life, and shiny non-metallics have a sheen based on their real-life properties. Reflective surfaces will accurately reflect in-game environments – physically-based textures even capture how reflective a body is based on the angle at which light hits it. PBR can hence improve the results from a screen-space reflection shader – if a smooth marble floor has a PBR-based texture, then it will be essentially opaque when you look straight down at it, but will reflect scene objects when you view it at a glancing angle.
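The angle-dependence described above is usually modelled with Schlick’s approximation to the Fresnel equations, a staple of PBR shaders. A minimal Python version:

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation: reflectivity rises from the base value f0
    (viewing the surface head-on, cos_theta = 1) toward 1.0 at grazing
    angles (cos_theta near 0)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# A dielectric like marble reflects only ~4% of light head-on (f0 = 0.04)...
head_on = fresnel_schlick(1.0, 0.04)
# ...but becomes almost mirror-like when viewed at a grazing angle.
grazing = fresnel_schlick(0.05, 0.04)
```

This single formula is a large part of why the marble floor in the example reads as matte underfoot but mirror-like across the room.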

In fact, texture artists can make use of detailed reference tables when they adopt the PBR workflow for the sake of physical accuracy – such charts provide base values for texturing various metals and non-metals. PBR texturing can make life easier for artists – earlier, they would put a good deal of effort into making a golden crown look appropriately golden, and then add surface details like dirt, dull cavities, scratches and more – now, they can add such realistic details while letting the renderer take care of making the object look ‘golden’.

Remember Me, a 2013 title published by Capcom, is credited as the first game to use physically-based rendering. Many studios soon transitioned to PBR, and held in-depth workshops to help texture artists adopt the new texturing pipeline.

Remember Me Was One of the First Games to Use Physically-Based Rendering (Courtesy Capcom)

The Advent of High-Dynamic Range Game Content

Both graphics shaders and physically-based rendering can work in synergy to enhance a scene by improving its lighting, shadows and reflections. The advent of high-dynamic range (HDR) TV in the mid-2010s utterly transformed this process by allowing games to output scenes with a greater dynamic luminance range, a wider colour range (known as the gamut), and more colour tones within this gamut (using higher bit depth). 

Skies and the sun could look tremendously bright. Shadows in dungeons could look very dark, and in horror games, these areas were that much scarier, because you could just make out a lurking shape in the shadow. Roses in a bouquet could each have a subtly different shade of red. The gradient of colours in the horizon during sunsets could look much smoother. 

Standard dynamic range (SDR) displays have a maximum luminance value of around 100 nits, while high-end HDR displays can get as bright as 4,000 nits. This means that both the brightest and darkest parts of the render are displayed without losing detail – on an SDR display, these parts would fall above or below the screen’s luminance range and look either bright white or pitch black. SDR displays can only show 8-bit colour information, or 256 levels of luminance for each colour channel. HDR displays have a bit depth of 10 bits per channel, resulting in 1,024 shades between the brightest white and the darkest black, and can support over a billion colour tones.
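The colour-depth figures above can be verified with a little arithmetic:

```python
def colour_tones(bits_per_channel):
    """Shades per channel, and total displayable tones across R, G and B."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

sdr_levels, sdr_tones = colour_tones(8)    # 256 levels, ~16.8 million tones
hdr_levels, hdr_tones = colour_tones(10)   # 1,024 levels, ~1.07 billion tones
```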

This is why such displays (when showing HDR content) can make colours really pop, enhance the overall contrast of the scene, and smooth the gradient between light and dark colours during a sunset, eliminating banding artefacts with a wider range of colour tones. 

Ironically, game engines had become capable of high dynamic range rendering (HDRR) by the early 2000s, but there were no consumer displays capable of showing such renders. Half-Life 2: Lost Coast (2005) was one of the first games to use HDRR, and many other games performed render calculations in high dynamic range, but then squeezed the result into standard dynamic range using a process called tone-mapping. Just as a normal map is used to capture the geometric details of a high-polygon model, tone-mapping is used to map an HDR-rendered frame onto a lower dynamic range. The result is better than what would have been generated without HDRR, but is not true HDR output.
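Tone-mapping can be as simple as the classic Reinhard operator, one of many curves used in practice (real games typically apply more elaborate, filmic curves):

```python
def reinhard(luminance):
    """Reinhard tone mapping: compress an unbounded HDR luminance into
    [0, 1) for a standard-dynamic-range display. Dark values survive
    almost untouched; bright values are squeezed toward 1.0."""
    return luminance / (1.0 + luminance)
```

Mid-grey passes through nearly unchanged, while a value a hundred times brighter is compressed into the tiny headroom just below 1.0, which is exactly how detail in both shadows and highlights survives on an SDR screen.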

High Dynamic Range Rendering in Half-Life 2: Lost Coast (Courtesy Valve)

Horizon Zero Dawn (2017), Shadow of the Tomb Raider (2018) and Middle-earth: Shadow of War (2017) are among several games that were released soon after HDR displays became widespread, and such games support true HDR output, drastically improving image quality. Tone mapping still remains part of the workflow when creating HDR content, but the far wider luminance and colour range of an HDR display results in content whose dynamic range is not clamped, making for highly realistic lighting and colour and better overall image quality.

Horizon Zero Dawn was One of the First Games to Output HDR Content for Supported Displays (Courtesy Sony)

Real-Time Ray Tracing

In May 2020, the BBC published an article on real-time ray-tracing titled ‘Get ready for the ‘holy grail’ of computer graphics,’ and there is probably no better indication of the importance and primacy of this revolutionary technique in present-day game graphics.

Ray-tracing had long been a part of CGI (computer-generated imagery) pipelines in film and television, but was implemented via offline rendering, and was prohibitively expensive – Toy Story 3 (2010) took an average of seven hours per frame, and Monsters University (2013) is said to have taken 29 hours per frame.

Toy Story 3, and Many Other Films, Use Ray-Traced Rendering (Courtesy Pixar)

As ray-tracing actually models the interaction of light with in-game objects, it works best with physically-based textures, which provide accurate data to the ray-tracing algorithm. In fact, the first major book-length publication on PBR refers to ray-tracing, and contextualises PBR as a new method to improve such ray-traced scenes with physically-accurate materials – ray-tracing in films and TV predates PBR. 

The first challenge in implementing ray-tracing in games is that it has to run in real time, not offline, and Nvidia’s first range of RTX cards, released in 2018, managed this feat. Real-time ray-tracing greatly enhances shadows, lights and reflections dynamically, and it works thus – the GPU shoots rays from the camera and then calculates how these rays bounce off in-game objects, scene lights and other scene elements (like water bodies) to determine how the scene should look.

A ray that bounces off an object and hits a scene light determines how that object is lit and where its shadow falls – if the object is close to another, then contact shadows are drawn on both. Objects that deflect rays onto glass-like surfaces will be reflected by such scene elements. Rays that move from a light source to coloured objects will take on the objects’ hue and bathe nearby geometry with coloured light. Since the camera and the player character move constantly in games, such calculations have to be performed countless times. Ray tracing is even capable of recursive reflections, like infinite mirrors, though such reflections may not be feasible for complex scenes.
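The fundamental query behind all of this is intersection: for each ray, find the nearest surface it hits. A minimal Python example for the simplest case, a ray against a sphere, shows the kind of calculation a ray tracer performs millions of times per frame (real GPUs accelerate this with dedicated hardware and bounding-volume hierarchies):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance along a normalised ray to its first hit on a
    sphere, or None if the ray misses. Solves the quadratic formed by
    substituting the ray equation into the sphere equation."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # the quadratic's 'a' is 1 for a unit direction
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None
```

A ray fired from the origin along +z hits a unit sphere centred at (0, 0, 5) at a distance of 4, while a ray fired along +y misses it entirely.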

Ray-Tracing Renders Ultra Realistic Reflections and Lighting in Spider-Man: Remastered (Courtesy Sony)

Ray-tracing greatly improves upon previous lighting solutions, like screen-space reflections, and a simple example can illustrate why. Imagine a third-person perspective scene in which the player character is facing a reflective glass shop front, near which are two barrels. Since the SSR shader can see the barrels (in the 2D render), it will paint their reflection on the glass. But the shader cannot see the front side of the player character, and thus cannot draw the appropriate reflection. Ray-tracing creates an accurate reflection of the whole scene by accounting for rays that hit the glass and then hit the player character (and vice versa), and also adds other off-screen objects to the reflection based on their position in the game world.

Cyberpunk 2077’s Ray-Tracing Patch Transforms the Lighting of Many In-Game Scenes (Courtesy CD Projekt Red)

Nevertheless, real-time ray-tracing has high performance costs, and only one game – the indie title Stay in the Light (2020) – currently applies it across the board. Other games use it in specific contexts and combine existing methods with ray-tracing to enhance a scene’s graphical fidelity. Metro Exodus (2019) was hailed as ‘the first AAA ray-tracing game,’ and Control, released a few months later, achieved widespread acclaim for its implementation of ray-tracing.

Stay in the Light is Completely Based on Ray-Traced Rendering (Courtesy Sunside Games)

It is no coincidence that Nvidia, AMD and Intel all came out with upscaling algorithms soon after the advent of real-time ray-tracing. Even the beefiest graphics card will slow to a crawl if it tries to render a game in native 4K with high-quality ray-tracing settings, and that’s where upscaling comes in – the GPU renders the image at a significantly lower resolution, which is then scaled up to near-native 4K. While Nvidia’s Deep Learning Super Sampling (DLSS) and Intel’s Xe Super Sampling (XeSS) use machine learning, AMD’s FidelityFX Super Resolution (FSR) does not, though it provides comparable results. Support for DLSS is available for both Control and Metro Exodus: Enhanced Edition, and upscaling algorithms benefit games that lack ray-tracing as well, giving them a significant performance boost – God of War on PC supports both DLSS and FSR.
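The upscalers named above are far more sophisticated than any simple filter (DLSS and XeSS use trained neural networks, FSR a hand-tuned spatial algorithm), but the pipeline position is the same: render at a low resolution, then enlarge for display. A nearest-neighbour enlargement in Python illustrates only that basic idea:

```python
def upscale_nearest(image, factor):
    """Enlarge a 2D image by an integer factor: each low-res pixel is
    simply repeated factor x factor times. Modern upscalers reconstruct
    detail far more intelligently, but occupy this same pipeline slot."""
    return [[image[y // factor][x // factor]
             for x in range(len(image[0]) * factor)]
            for y in range(len(image) * factor)]
```

Rendering at 1080p and upscaling to 4K means shading a quarter of the pixels, which is where the performance headroom for ray-tracing comes from.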

Conclusion

3D game graphics have evolved over more than two decades to create stunning visuals in present-day games. Developers have strived for realism, and the best implementations of 3D techniques work synergistically to create near-photorealistic, or even hyper-realistic, renders. HDR, which enhances colour, contrast and image quality, works best when PBR textures and ray-tracing accurately model how a scene interacts with light. Screen-space shaders are deployed alongside ray-tracing for performance gains, and normal mapping – by now a very old technique – is still critical in optimising scene geometry without losing detail. Some of these techniques represent paradigm shifts – PBR totally replaced traditional texturing workflows, and real-time ray-tracing may well replace screen-space effects completely as graphics cards add muscle to their ray-tracing capabilities.

So, have we reached photorealism in gaming yet? Not quite – even the latest games are near-photorealistic but still not indistinguishable from live-action video. Real-time ray-tracing is itself a very clever approximation of real life, or ‘ground truth’, as technologists like to call scenes observable to the human eye.

However, contemporary CGI, such as the Jibaro episode from Netflix’s Love, Death & Robots, shows that modern offline renders can pass for real life – in fact, the best CGI in live-action content is often in places where you don’t even expect it. This suggests that as computing capability increases and advanced ray-tracing methods such as path-tracing become feasible, we may edge closer to ground truth and true photorealism, even in gaming, where a life-like, interactive experience needs to be rendered at 60 frames per second.

Gameopedia offers custom solutions depending on your specific data requirements. Reach out to us for actionable insights about video game graphics and technology.


Remakes, Remasters and Next-Gen Upgrades (How Revived Games Thrive)

When Bluepoint Games was remaking Shadow of the Colossus (2005) for the PS4, they noticed a pattern about the birds in the central temple complex of the game. In Shadow, the protagonist Wander sets out from this temple to kill various giants, and is returned to the temple once his mission is complete. Bluepoint developers noticed that if Wander was heading out for his fourth giant, or Colossus, then four birds would be perched in the temple; if he was heading out for his 10th, then ten birds would appear. 

This detail had escaped Bluepoint’s notice when they were remastering the game for the PS3, and in any case, since a remaster mostly uses the original code, the pattern would have been reproduced automatically. But in a remake, which requires developers to recreate a game, pretty much from scratch, all these little touches have to be remade as well, and Bluepoint took great pains to ensure that the bird-and-colossus pattern, and various other details, made it to the remake. 

The result was a ‘precedent-setting’ game that not only introduced Shadow to a whole new generation of gamers, but also preserved and recaptured the experience of playing the ground-breaking original. Bluepoint’s remake has received widespread critical acclaim, with some claiming that the new Shadow is one of the best remakes of all time. The original Shadow of the Colossus is often cited as an example of how video games can be art, and Bluepoint’s painstaking reconstruction does justice to the game, its fans and its legacy. 

In this blog, we will discuss remakes, remasters and next-gen upgrades, all of which give old games a new lease of life on upgraded consoles and modern PC hardware. We will see how prominent remakes faithfully retain the unique features of the originals, how remasters greatly enhance the graphics of an older game for new hardware, and how the next-gen upgrade endows a game with improved graphical fidelity and performance on a new generation of consoles. 

Remakes and remasters are compelling business propositions today, especially because the gamers who played the original versions of classic games are older now and have more disposable income. Remakes and remasters are big money makers – digital revenue for prominent remakes nearly doubled between 2018 and 2020, and remake earnings surged in 2020 amidst widespread pandemic lockdowns. Moreover, remakes and remasters allow younger gamers to experience ground-breaking classics with all the graphical fidelity and streamlined gameplay of modern hardware.

Older Gamers Who Have Played Classic Games Create a Market for Remasters and Remakes

In fact, in 2021, a remaster – The Legend of Zelda: Skyward Sword HD – was more anticipated than many new titles, and it was also the best-selling game in the US at launch, during July 2021. Remakes of Resident Evil 2 (2019) and 3 (2020) had bigger launches than Resident Evil 7 (2017), a mainline entry in the franchise. One observer calls the new trend of remasters and remakes a ‘nostalgia gold rush’, underscoring how a longing for the past plays a crucial role in driving the success of a remake or a remaster.

In the following sections, we will discuss what remakes, remasters and next-gen upgrades are, and why they are made. 

How are Games Revived for New Generations?

Remakes, remasters and next-gen upgrades are all endeavours that revive an older game for new hardware and modern consoles. But what, exactly, do these terms mean, and why do studios and developers undertake such projects? We discuss both in the sections below.

What is a Video Game Remake?

A video game remake is a ground-up recreation of a classic game. It includes high-quality models, textures, animations and sounds, and is powered by a modern game engine that brings state-of-the-art lighting, reflections, shadows and other effects. 

Examples include Capcom’s Resident Evil 2 remake, released nearly 21 years after the original, and the Final Fantasy VII remake (2020). Both games were originally released on Sony’s first PlayStation console.

Final Fantasy VII Remake (Courtesy Square Enix)

What is a Video Game Remaster?

A video game remaster is essentially a much better-looking version of an older game. Taking advantage of modern hardware, a remaster adds a whole range of visual effects that were either unavailable to the original or hard to implement without performance costs, and also upgrades the game’s textures, models and animations. In general, remasters use much of the same code as the original, but can update it so that the game runs at higher resolutions and frame rates on new hardware. Many remasters are bundled into a single collection, and remasters of a single game can include all of its DLC in a single edition.

There are numerous examples of remasters across video game generations, including the Shadow of the Colossus remaster (2011) for the PS3, The Last of Us Remastered (2014) for the PS4, and Halo: The Master Chief Collection (MCC) for PC and the Xbox One consoles. The MCC continues to receive updates long after its initial release in 2014.

Halo: The Master Chief Collection (Courtesy Microsoft)

What is a Next-Gen Upgrade?

A next-gen patch updates a game to match the quality of titles released for the latest hardware. Such upgrades are usually meant for recent games – older games would need a remaster. Even a bare-bones next-gen upgrade will usually boost frame rates and performance and enable higher resolutions. Some developers may also provide high-res texture packs and greatly upgrade game graphics with features such as ray-tracing, support for upscaling algorithms and HDR rendering. Many games receive such upgrades for current-gen consoles.

Developers may also update remasters or remakes with next-gen features: examples include Resident Evil 2, a remake which received a ray-tracing patch on console and PC, and Crysis Remastered, which was updated to support Nvidia’s Deep Learning Super Sampling (DLSS), an AI-based upscaling technique.

Why are Video Games Remastered and Remade?

Remasters and remakes are not made purely for financial reasons, though such considerations may play a significant role in determining what game is to be remastered or remade. Good remakes and remasters evoke nostalgia, build excitement for new releases in the franchise and help developers improve their skills and industry cred.

Furthering a Game’s Legacy and Evoking Nostalgia

One of the key reasons for remastering or remaking a game is to evoke nostalgia among the many fans it garnered when the original was released. Remasters and remakes allow fans to revisit cherished virtual spaces while enjoying all the convenience and graphical fidelity of modern hardware. Such updated games can also attract entirely new audiences looking to discover why these titles became classics. 

Off-Setting the Risks of AAA Development

According to an NPD analyst, publishers can pursue the remastering trend to make money through lower-risk ventures. Remasters may sell fewer copies than a new game, but cost much less to make, and publishers can decide which games to remaster, knowing where the demand exists. In fact, Nintendo’s Super Mario 3D All-Stars, a compilation of older Super Mario games, was sold with a sixty-dollar price tag and became the second best-selling Switch title of 2020, despite being a time-limited release. An avid fan base looking to relive a beloved franchise had created a natural and profitable market for the Switch release, saving Nintendo the millions of dollars involved in making a new game from scratch. 

Super Mario 3D All-Stars (Courtesy Nintendo)

Creating Excitement for New Releases

Microsoft successfully built hype around Halo 5: Guardians (2015) by releasing the Master Chief Collection for Xbox One right before it. The MCC allowed many new gamers to experience the franchise’s history before they dived into Halo 5. Despite a troubled launch, the Master Chief Collection is now a well-regarded remaster and arguably the best way to experience the early adventures of John 117.

The Crash Bandicoot N. Sane Trilogy (2017), a collection of remastered Crash Bandicoot games, was a resounding success, and Crash Team Racing Nitro-Fueled (2019), a remake of the PS1 kart racer, followed soon after. Crash Bandicoot had long been a mascot for Sony, but the franchise had stagnated until the remaster and remake revived it. In 2020, Crash Bandicoot 4: It’s About Time marked the first all-new entry in the franchise in 12 years and proved a commercial and critical success.

Crash Bandicoot N. Sane Trilogy (Courtesy Activision)

Building a Developer’s Reputation and Capabilities

When Grim Fandango (1998) was remastered and released for multiple platforms in 2015 by Tim Schafer’s Double Fine Productions, it was praised by fans and critics and sold far more units than the original. Double Fine had already earned remarkable fan loyalty – its 2012 Kickstarter project ‘Double Fine Adventure’ broke records by raising a million dollars within 24 hours – and the Grim Fandango remaster only strengthened that bond with its audience. 

Grim Fandango Remastered (Courtesy Double Fine Productions)

Bluepoint Games has a splendid reputation thanks to its critically-acclaimed remasters of games in the God of War and Uncharted franchises, and its marvellous remakes of Shadow of the Colossus and Demon’s Souls (2020). Its remake of FromSoftware’s first ‘Souls-like’ game was a launch title for the PS5 and Bluepoint has been purchased by Sony. It now has the chance to make a first-party PlayStation game, and all its experience remaking and remastering Sony hits will no doubt help.

In the following sections, we will discuss just what it takes to remake, remaster or upgrade a game – each endeavour has its own challenges, and we delve into them below. 

The Video Game Remake - a Labour of Love

Remaking a video game from scratch is a major undertaking, given that the game being remade was released generations ago. The resulting remake must nevertheless capture the feel of the original faithfully, while updating the content to modern gameplay and graphics standards. As such, developers must keep one eye on the past and one on the future, striving to recreate every little detail of the original and to keep its gameplay elements and mechanics intact even as they update them to match modern controller setups. 

Prominent titles include Bluepoint’s great Shadow of the Colossus (SOTC) remake for the PS4 released in 2018, Capcom’s remakes of Resident Evil 2 and 3, and Square Enix’s Final Fantasy VII Remake. SOTC’s remake was released 13 years after the original, the RE 2 remake more than two decades after the original, and the FF7 remake 23 years after the original – these games are so old that they necessitated full remakes. 

This is why remakes are recreated on modern game engines, and not much of the original code makes it to the remake. Many of the environments, models and textures have to be made from scratch. Combat elements may need to be overhauled, as is the case with the Final Fantasy VII remake – the original had turn-based combat while the remake features a revamped real-time combat system and allows you to switch between the main player character and his companions to execute special moves. The end result recreates the flow of battle in the original, while introducing innovative gameplay mechanics.

FF VII Remake’s Combat System is Used to Control Multiple Characters (Courtesy Square Enix)

Bluepoint Games has justly earned the moniker ‘masters of the remaster’ because of their work on various critically acclaimed remasters and remakes, and their Shadow of the Colossus has been hailed as one of the best remakes ever – a closer look can tell us just what it takes to create a great remake of a beloved game. 

Shadow of the Colossus is a minimalist classic made by game creator Fumito Ueda for the PS2. In the game, the hero Wander explores desolate landscapes in search of mighty ‘Colossi’, whom he must kill in order to bring his lover back from death. The game’s areas offer no treasures, there are no low-level enemies to fight, no NPCs and no cities, towns or villages. Wander has only a sword and a bow to kill each Colossus, and rides a horse that doesn’t always respect his commands. The game is essentially sixteen boss battles, each unique and staged in its own desolate arena, made all the more memorable because Ueda shuns so many traditional gameplay tropes. SOTC was an instant classic, and a New Yorker article discusses the game’s status as a work of art. Remaking a game with such a formidable reputation was, well, a colossal challenge. 

For its PS4 remake, Bluepoint started with the updated code base from their PS3 remaster of the game, and used Ueda’s later title, The Last Guardian (2016), as a guideline for how the visuals should be updated. To recreate the forested and grassy areas of the original, they devised a foliage system in which grass and plants not only sway in the wind, but also bend and flatten as Wander runs through them. 

They painted details such as erosion, cracks and other damage onto the terrain and mountains to make them realistic, and retained the unique architecture and look of each structure – and of the Colossi themselves – while improving them with higher-quality textures and models. Animations were also revamped and look much more believable, especially in the battles with the Colossi. Even the fur on the Colossi sways and bends as Wander climbs up the giants to kill them. 

Crucially, Bluepoint used physically-based rendering (PBR), a texturing and rendering pipeline that accurately models the interaction of light with in-game objects and is especially effective in rendering reflective or glossy, metallic surfaces realistically. In a game that uses PBR, a gold crown and an iron sword won’t shine the same way – each metal’s sheen is based on its real-life characteristics – and even glossy or dull leather will look different based on how it interacts with light in real life. Bluepoint used a blend of PBR and traditional techniques to maintain a balance between modern photorealism and the stylised look of the original. The developers also strove to stay true to the lighting setup of the original, even as they introduced High-Dynamic Range (HDR) rendering with their Bluepoint engine. With HDR, the game has much brighter highlights (leading to glorious skies), much deeper blacks and a far greater range of colours.

The Shadow of the Colossus Remake’s Visuals Utterly Transcend the Original (Courtesy Sony)
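The metal/dielectric split at the heart of PBR can be illustrated with a small sketch. This is not Bluepoint’s code – just a minimal Python illustration of the standard ‘metalness workflow’, with made-up material values:

```python
# Minimal sketch of the PBR "metalness workflow" described above.
# All material values are illustrative; real engines evaluate this per pixel on the GPU.

def base_reflectivity(albedo, metalness):
    """F0: dielectrics reflect roughly 4% of light regardless of colour;
    metals tint their reflections with their own albedo."""
    dielectric_f0 = (0.04, 0.04, 0.04)
    return tuple(
        d * (1.0 - metalness) + a * metalness
        for d, a in zip(dielectric_f0, albedo)
    )

def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation: reflectivity rises towards 1 at grazing angles."""
    return tuple(f + (1.0 - f) * (1.0 - cos_theta) ** 5 for f in f0)

# A gold crown (metal), an iron sword (darker metal) and dull leather (dielectric):
gold    = base_reflectivity((1.00, 0.77, 0.34), metalness=1.0)
iron    = base_reflectivity((0.56, 0.57, 0.58), metalness=1.0)
leather = base_reflectivity((0.40, 0.26, 0.13), metalness=0.0)

print(gold)     # reflections tinted yellow by the metal's own colour
print(leather)  # uniform ~4% reflectance; its colour comes from diffuse light
```

This is why, in a PBR pipeline, gold and iron shine differently while leather barely shines at all – the split falls out of the material parameters rather than hand-tuned highlights.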

Bluepoint also brought performance improvements to the game. They broke up the world into manageable portions, rendering distant areas at lower levels of detail, and made countless optimisations to game assets so that they could offer a 60-frames-per-second (FPS) performance mode, and a locked 30-FPS quality mode at 4K HDR on the PS4 Pro. They also fixed the awkward controls and camera movement of the original, so that the game is playable no matter what frame rate you choose, while keeping the original control mapping as an option for older fans. With its devoted attention to detail and its commitment to reviving SOTC for a new generation, this remake counts as one of the best ever, and sets a standard for other developers to reach for. 
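Rendering distant areas at lower levels of detail is a standard technique, and its core logic is simple. A hypothetical sketch (the distance thresholds and mesh names are invented for illustration):

```python
# Hypothetical distance-based LOD (level-of-detail) picker: distant objects get
# cheaper meshes, freeing GPU time for a stable frame rate. Thresholds are illustrative.

LOD_LEVELS = [
    (50.0,  "colossus_lod0"),   # full detail when the camera is close
    (150.0, "colossus_lod1"),   # reduced polygon count
    (400.0, "colossus_lod2"),   # silhouette-quality mesh
]
FALLBACK = "colossus_impostor"  # flat billboard beyond the last threshold

def pick_lod(distance):
    """Return the first mesh whose distance threshold covers the object."""
    for max_dist, mesh in LOD_LEVELS:
        if distance <= max_dist:
            return mesh
    return FALLBACK

print(pick_lod(30.0))    # colossus_lod0
print(pick_lod(900.0))   # colossus_impostor
```

Real engines add hysteresis and blending so meshes don’t visibly ‘pop’ as the thresholds are crossed, but the selection itself is this cheap.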

Capcom also did an excellent job with the Resident Evil 2 remake, though the title is significantly different from the original, which was released for Sony’s first PlayStation and staged its 3D character models against pre-rendered backgrounds rather than using the full-3D rendering pioneered by id Software and Epic Games. Nevertheless, Capcom faithfully retained the original’s atmosphere – ‘jump-scare’ locations are recreated faithfully in the remake, and some in-game objects are placed exactly where they were in the original. However, some character roles are expanded and certain areas are reworked from scratch. 2019’s Resident Evil 2 is not a shot-for-shot recreation, but a reimagined version that remains true to the spirit of the original. 

The Resident Evil 2 Remake Captures the Spirit of the Original (Courtesy Capcom)

Capcom nevertheless fumbled the Resident Evil 3 remake, cutting out content and iconic locations, and scripting the behaviour of the enemy Nemesis, which took away the element of surprise that made him so terrifying in the original. 

A remake works if it remains faithful to the original while reimagining it at the same time. Not even the best graphical upgrades can assure success if the game veers too far away from the original. Introducing new elements while staying true to the source is a tough balancing act, and remakes of games such as Resident Evil 2, Shadow of the Colossus and Final Fantasy VII pull this off, while others fail to live up to expectations. Considering that the Final Fantasy VII remake cost $140 million – more than what it takes to make and market a major movie – developers must tread carefully when remaking old games for contemporary audiences. However, despite the manifest challenges involved, studios continue to pursue the remake trend – upcoming remakes include Resident Evil 4 (2005), Dead Space (2008), Silent Hill 2 (2001), Tom Clancy’s Splinter Cell (2002), and Prince of Persia: The Sands of Time (2003).

The Video Game Remaster – Old Games Get a New Look

A video game remaster is a lesser undertaking than a remake – it uses largely the same game code but greatly enhances the visuals by adding various graphical effects, and increasing resolution and performance by taking advantage of modern hardware. Many remasters are bundled into a single collection, giving users good value for their money. 

Remastering is an ideal choice for famous games released a generation or two ago. Generally, the time frame between a remaster and the original is shorter than that for a remake. Bioshock: The Collection (2016), a compilation of remastered Bioshock games, was released about three years after the last Bioshock title, Bioshock Infinite. Borderlands: The Handsome Collection (2015), which contains Borderlands 2 and Borderlands: The Pre-Sequel, was released a year after the latter title.

Sony has used remasters to fill out its catalogue since the PS3 generation. A Wikipedia page lists nearly 60 remasters for the PS3, many of which are bundles containing multiple remastered games. These games look much better than their PS2 or PS1 counterparts because the PS3 was one of the first HD consoles – the originals, remastered from SD (standard definition), look much sharper, with higher-quality texture detail and better character models. There are nearly 60 remaster titles (some of which bundle multiple games) for the PS4 as well, but many of these remasters are available on other consoles and PC too. Sony’s remasters are either branded as ‘Classics HD’ or come with labels like ‘Remastered in High Definition’. 

Remastered games come with a slew of visual upgrades. Assassin’s Creed 3 Remastered (2019) includes volumetric lighting (or ‘god-rays’), screen-space reflections (by which water bodies and other glass-like surfaces reflect nearby objects in the scene), improved shadow detail, realistic lighting from in-game light sources and increased view distance. The remaster also uses physically-based rendering along with upgraded textures and remade character models to make the game look photorealistic and supports 4K HDR rendering on PC, PS4 Pro, Xbox One X, and the current console generation.

Ray-Tracing Renders Ultra Realistic Reflections and Lighting in Spider-Man: Remastered (Courtesy Sony)

One of the better remaster editions available today is Halo: The Master Chief Collection, both in terms of the visual upgrades it brings to its games, and the sheer number of games included – six remastered games, with each title’s multiplayer receiving regular updates. When released in 2014 for the Xbox One, the MCC won IGN’s People’s Choice Award as the best remaster of the year, and the collection has only gotten better since. The MCC was first released for PC in 2019 (though only Halo: Reach was included at launch), but within a year, all games in the collection had been ported to PC. 

The visual upgrades are of such a quality that one can end up considering these remasters as remakes. Digital Foundry’s YouTube review of Halo 2: Anniversary, which is part of the MCC, straight up calls the game a remake. Halo 2: Anniversary features gorgeous pre-rendered cinematics that replace the original’s engine-based cutscenes, and uses real-time lighting and shadows, along with global illumination, to add realistic lights and shadows to both in-game objects and particle effects (such as explosions). Not all games in the collection were as comprehensively remastered as Halo 2, but every game comes with performance enhancements and supports increased resolutions. The collection is not without its flaws, however, and the multiplayer experience can be buggy, especially in the PC versions of Halo 2 and Halo 3.

Halo 2’s Remaster Overhauls Much of the Original’s Graphics (Courtesy Microsoft)

Remasters generally improve upon the original and become the ideal way to experience an older game, unless the game has been remade – Fumito Ueda had endorsed the PS3 remaster of Shadow of the Colossus as the definitive edition of the game before the remake was released. But even prominent companies like Blizzard and Rockstar Games can botch remasters so thoroughly that they become the target for relentless backlash from gamers. 

Grand Theft Auto: The Trilogy – Definitive Edition (2021), which bundles ground-breaking classics such as GTA III (2001), GTA: Vice City (2002) and GTA: San Andreas (2004), is an example of what happens when a publisher remasters games without taking care to respect player expectations. The release was buggy, the graphics lacklustre, character models still looked flat and unrealistic, and the Guardian’s critic called the remaster an ‘infuriating disappointment’. Gamesindustry.biz published an opinion piece arguing that companies shouldn’t release remasters just to make a fast buck but must cherish the creative history of the games they upgrade, and excoriated Rockstar for removing the original versions of these games from digital download stores when it released the remastered collection. The debacle forced Rockstar to apologise, even though they had not developed the remaster in-house – Grove Street Games, the studio behind the remaster, is fixing its various issues.

Rockstar Studios Received Backlash for Botching the Remaster of Seminal GTA Games (Courtesy Rockstar)

Another infamous example is Warcraft III Reforged (2020), which is a remaster of the original Warcraft III: Reign of Chaos (2002). It received an overwhelmingly negative reception and even led to the creation of a website that successfully petitioned for a refund to all those who bought the remaster by listing all the upgrades that Blizzard had promised, but not delivered. 

Warcraft III: Reforged – a Remaster – Failed to Deliver on its Promises (Courtesy Blizzard)

Remasters will likely succeed if developers stick to what is now a well-worn path – offering good value for money by improving the visuals and performance of the game, and bundling either its DLC, or multiple games, into one easily accessible package. 

As discussed above, remasters are inherently less risky than new games or costly remakes, and their target audience can be clearly identified. This is perhaps the reason why a badly-made remaster draws such backlash – it fails to live up to quite modest expectations.

The Next-Gen Upgrade – A Boost for Recent Games

Both remasters and remakes involve a good deal of effort, time and money, and must satisfy gamers familiar with the original and others looking to discover a classic. 

A next-gen upgrade doesn’t have to completely overhaul a game, and usually upgrades a game’s performance and visuals to a certain level – it can unlock 60-FPS modes on a title that was locked at 30 FPS in an earlier console generation, and can also use current-gen hardware to display a game at native or upscaled 4k resolutions. 

Some developers go the extra mile and provide higher-res textures, and offer support for ray-tracing and upscaling algorithms, and utterly transform their game’s look as a result – ray-tracing is an advanced graphics technology that results in near-perfect lighting, shadows and reflections, but it can also severely affect performance. It thus goes hand-in-hand with upscaling algorithms, which allow the game engine to render ray-traced scenes at much lower resolutions before upscaling the result. 
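The arithmetic behind pairing ray-tracing with upscaling is straightforward: because shading cost scales with pixel count, rendering internally at a fraction of the output resolution saves most of the work. A sketch with illustrative scale factors (these are not any vendor’s exact presets):

```python
# Why upscalers make ray-tracing affordable: render at a lower internal
# resolution, then upscale to the display resolution. Scale factors below
# are illustrative presets, not DLSS's or any other vendor's exact values.

PRESETS = {"quality": 1 / 1.5, "balanced": 1 / 1.7, "performance": 1 / 2.0}

def internal_resolution(out_w, out_h, preset):
    """Resolution the engine actually ray-traces and shades at."""
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

def pixel_savings(out_w, out_h, preset):
    """Fraction of per-pixel shading work avoided versus native rendering."""
    w, h = internal_resolution(out_w, out_h, preset)
    return 1.0 - (w * h) / (out_w * out_h)

w, h = internal_resolution(3840, 2160, "performance")
print(w, h)  # 1920 1080 - a 4K output is ray-traced at only 1080p internally
print(f"{pixel_savings(3840, 2160, 'performance'):.0%}")  # 75%
```

At the ‘performance’ setting, only a quarter of the output pixels are ever ray-traced; the upscaler (often aided by motion vectors and machine learning) reconstructs the rest.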

Capcom’s Resident Evil 2, Resident Evil 3 and Resident Evil 7: Biohazard all have free next-gen upgrades, which feature ray-tracing. id Software’s Doom Eternal (2020) got a ray-tracing patch for both PC and current-gen consoles, as did Hellblade: Senua’s Sacrifice (2017) – both games are also available on the Xbox and PC Game Pass.

Doom Eternal is Utterly Transformed by Ray-Traced Reflections (Courtesy Bethesda)

Next-gen upgrades make sense for recent games. By default, such games use modern rendering paradigms, such as physically-based rendering and HDR, and are built from scratch with the sort of graphics features that are introduced to an older game when it is remastered or remade. A cross-gen title, which is released late in the life-cycle of an older console generation and ‘straddles’ the boundary between older and current-gen consoles, is also an obvious candidate for a next-gen patch. 

Recent games and cross-gen titles may have run at lower resolutions and frame rates due to the limitations of console hardware when they were released, and the next-gen patch is meant to fix this and make the game run better on newer consoles. Adding support for ray-tracing and upscaling algorithms is a bonus – developers may be focussing on delivering such features for true next-gen games released exclusively for the PS5 or the Xbox Series X|S, rather than upgrading older games with such tech.

Given that a next-gen upgrade can be quite trivial compared to a remaster or a remake, one would expect an industry-wide standard for delivering them. This is far from the case. If you want a remake or a remaster, all you need to do is go to a shop or a digital storefront and buy it. But the upgrade path to a next-gen version of your game is absurdly convoluted today. 

Some developers participate in Xbox’s Smart Delivery program, which automatically downloads the game version best suited to your console regardless of what disc or digital edition you buy. Microsoft has promised Smart Delivery support for all first-party games, but third-party publishers aren’t obliged to participate. Ubisoft supports Smart Delivery for Assassin’s Creed: Valhalla (2020) and CD Projekt Red does the same for Cyberpunk 2077 (2020). However, EA came up with a ‘dual entitlement’ scheme where owners of Madden NFL 21 (2020) could claim a free next-gen upgrade for the Xbox Series X|S or PS5 before Madden NFL 22 was released, which makes little sense – why should one game’s next-gen upgrade be blocked after the franchise gets a new release?

Sony, meanwhile, seems to lack a coherent strategy for delivering upgrades, and does not feature any pro-consumer initiative meant to match up to Smart Delivery. Sony promised dual entitlement for a range of first-party cross-gen games for the PS5 – which meant that buying a cross-gen PS4 game would entitle you to a free PS5 upgrade – but did not include Horizon Forbidden West (2022) in this policy. After the ensuing backlash, the game’s PS4 version now comes with a free upgrade when played on a PS5, but a costlier PS5 version nevertheless exists. Consumers may well buy the PS5 version, not realising that buying the PS4 version would result in an identical download.

For cross-gen titles with dual entitlement, Sony simply leaves it up to the user to figure out the differences between the PS4 and PS5 versions of the game, and choose accordingly. An InputMag columnist calls the PS5 the ‘most confusing console on the market’, especially because it does not list the next-gen features of even its launch titles in the product description – you have to download these games (or watch YouTube videos of those who did) to find out the features available in quality and performance modes.

As mentioned above, Capcom released its Resident Evil next-gen upgrades for free, and CD Projekt Red did the same for Cyberpunk 2077, offering ray-tracing support on consoles with its next-gen patch. The company has also promised a next-gen upgrade for its seven-year-old open-world classic, The Witcher 3: Wild Hunt (2015) – the patch is expected to arrive before the end of 2022.

In fact, Microsoft has told third-party developers that they should offer Xbox Series X|S upgrades for free, and should refrain from branding such upgrades as new DLC. If the developer still decides to create a paid upgrade path, Microsoft recommends that owners of a last-gen version be offered a discount when they pay for the Xbox Series X|S version. Of course, no third-party studio or publisher is obliged to heed Microsoft’s suggestions, resulting in controversial releases such as Control Ultimate Edition.

505 Games’ policy regarding Control’s next-gen upgrade has been particularly egregious. Released in 2019, Control was one of the first games to implement advanced ray-tracing effects on the PC, and is one of the games NVIDIA uses to showcase the capabilities of its RTX cards. Console players, however, could not enjoy these features because the Xbox Series X|S and the PS5 were yet to be released. 

When the publisher did offer a next-gen console upgrade, they locked it behind a $40 ‘Ultimate Edition’, which had no new content when compared to the Digital Deluxe Edition many users had bought earlier, expecting a free next-gen upgrade for the extra money they had paid. In fact, the only upgrade path for the next-gen patch on console is to buy the Ultimate Edition, even if you have bought the base game and all the DLC earlier, and fans of the game are justifiably angry. 

Despite being simpler than a remaster or a remake, a next-gen patch is a far more convoluted upgrade path and can lead to considerable confusion and frustration. While developers strive to create beautiful and faithful remakes and remasters of older classics, there appears to be no industry-wide commitment to deliver the relatively simpler enhancements of a next-gen patch. Is this because gamers now expect these upgrades for free?

Conclusion

Remakes and remasters succeed or fail based on how well they uplift a game while remaining true to the source material, especially as nostalgia is a significant factor in determining such games’ sales. In the future, however, we may not see such remakes or remasters because of the prominence of live-service games, which aim to always keep pace with the latest graphics standards as part of their intent to keep gamers engaged for years. 

Also, the primacy and profitability of mobile games may make remakes and remasters less important in the future because the factors driving their creation may no longer be relevant. Shadow of the Colossus has arguably awed multiple generations of gamers, but will a mobile game like Angry Birds ever hold the same place in gamers’ hearts, especially to justify a remaster? 

Ray-tracing transforms the look of present-day games that implement it, and we may soon reach a point where there isn’t much of a difference between console generations, especially as consoles these days feature much the same architecture as PCs, while being optimised for gaming. This would imply that the ‘next-gen upgrade’ will suffice to update a game to a new console generation or a new line of PC hardware. 

Given how utterly confusing upgrade paths are today, publishers, developers and console manufacturers – especially Sony and Microsoft – may soon have to collaborate on establishing a standard by which each console automatically provides access to the most suitable version of any title that a gamer buys. Microsoft has already laid the groundwork with Smart Delivery, but such an initiative can work only if everyone agrees to implement it. Gamers will continue to be short-changed, especially when it comes to next-gen patches, until an industry-wide policy is established for such upgrades. 

Gameopedia offers custom solutions depending on your specific data requirements. Reach out to us for actionable insights on the remake-and-remaster trend in the gaming industry.


The Decline of Physical Games and The Rise of Digital Distribution

When Valve made the much-anticipated Half-Life 2 available on Steam in 2004, a deluge of users rushed to download the game or authenticate their physical copy, and Steam simply keeled over and crashed.

Valve had made Steam authentication mandatory for even physical copies of the game, so everyone who had bought the title, either through Steam or at a retailer, had to go through Valve’s client, and the company’s servers simply could not handle the load. Gamers who had expected the further adventures of Gordon Freeman ended up with pop-ups from the Steam client apologising for the lengthy delays. 

Steam has come a long way since then and is a fixture of millions of PCs now, and many major publishers, including Sony, currently release games on the platform. Steam is also home to thousands of indie games and is arguably the most prominent digital delivery service for PCs. 

The digital distribution of games – whether on PC, console or mobile – has grown prevalent thanks to growing internet speeds, higher broadband penetration, and many other factors. Consoles have long featured digital storefronts from which gamers can buy and download games, smartphone games are available via Google Play or the iOS App Store, and for the PC, digital games have practically become the norm as modern computers tend to lack optical drives. In turn, sales of physical game copies, or ‘retail’ editions, are on the decline, though they are yet to be fully supplanted by digital delivery. 

In this blog, we will delve into the history of game storage media, from cartridges to Blu-ray discs and chart the rise to prominence of digital distribution services for games. Game storage formats have played a significant role in the history of gaming, and even today’s major consoles provide support for physical copies. However, the industry is transitioning towards digital delivery as the principal mode for selling games – we will look into why this is the case.

What is a Digital Game?

A digital game refers to any title downloaded from a digital delivery service such as Steam, a console manufacturer’s online store, or smartphone app stores. Digital game data can be installed on internal hard disks, solid-state drives and even removable storage. Note that such games do not have to be paid for a la carte – subscription services such as the Xbox Game Pass or PlayStation Plus allow users to download many games and play them so long as they remain subscribed.  

What is a Physical Game?

A physical game copy refers to any game whose data is stored on physical media like cartridges, floppy disks, CDs, DVDs or Blu-ray discs. Today, such physical copies may not contain all the files required to play the game, and you may still have to download either a significant portion of game data or the latest patch from a digital delivery service.

In the past (especially during the time of cartridges), buying a game was a one-and-done deal, and you inserted the game cartridge into a slot to play and even save progress – cartridges featured a chip specially designed for game saves. Effectively, this meant that all game data – whether it was the game itself, or your progress, was saved in the cartridge, and not on a storage device in the console. 

Physical game copies (especially of console games) can be shared among friends, sold second-hand to others or traded in at a retailer. They can even allow access to a full game without necessitating an internet connection, though most contemporary games need downloadable patches and updates to work properly. People still buy physical games because of discounts at local retailers or e-commerce sites, and the advantages that physical copies offer. 

A Retail Display of Video Games at a Store in Geneva (Courtesy Wikimedia Commons)

In the next section, we will outline the evolution of physical game storage formats, from the cartridge to the Blu-ray disc. Like any form of media, gaming depended on physical storage before high-quality internet speeds made digital downloads a viable alternative. The capacity and efficiency of physical formats steadily increased and the transition from one storage format to another often marked a major inflection point for the gaming industry, as we will see below. 

A History of Physical Game Media

Physical copies of games have played an integral role in making gaming a popular and affordable pastime, especially since they were designed from the start to make home gaming viable. 

The following sections will discuss the role of cartridges and floppy disks in gaming from the ’70s to the ’90s, followed by CDs and DVDs (both of which Sony used to great effect in PlayStation consoles), and then Blu-ray, which Sony first used for the PlayStation 3 and which was integrated into Xbox consoles as well from the eighth console generation onwards. 

Storage Size Growth

Cartridges and Floppy Disks

The advent of cartridges marked a major shift for home gaming – no longer did consumers have to buy dedicated consoles for their favourite games, but could buy a single console and play multiple games on it simply by slotting a compatible game cartridge into the system. 

Fairchild Semiconductor pioneered the design of the first console with interchangeable game cartridges – the Fairchild Channel F (1976). Although overshadowed by the Atari 2600 (1977), Fairchild was the first to create a console with a microprocessor that loaded games from programmable cartridges.

The Fairchild Channel F (Courtesy Wikimedia Commons)

Designers at Fairchild knew that their sensitive cartridge circuit boards had to be capable of withstanding considerable abuse, like being left out in the sun, or being stepped upon. They encased their technology in hard, durable plastic, but also created an easy-to-use slotting mechanism, which would enable both the cartridges and the console to withstand multiple insertions and ejections. 

Cartridge design evolved considerably after Fairchild’s pioneering efforts. Successive iterations loaded graphics and other game data ever faster while using less system memory. By the time Nintendo released the N64 (1996), cartridges could store up to 64 MB of data – the first cartridges made by Atari and Fairchild could store just 32 KB at best. From the late ’70s to the early ’90s, the cartridge was the default storage medium for console games, especially because cartridges were hard to reverse engineer.

Game Cartridges for the Atari 2600 (Courtesy Flickr)

During roughly the same period, floppy disks – especially the 3.5-inch version introduced by Sony in 1980 – were used to store and exchange games and other programs for the PC. The 3.5-inch disk was designed to be durable and resilient, with a hard plastic casing and a sliding metal shutter that protected the magnetic disk inside. Floppies were an ‘almost viral’ medium for distributing shareware – especially portions of games like Doom (1993) and the Commander Keen series. 

Doom was also one of the most prominent games to be released on floppy disks, as were many other games of that period, such as Prince of Persia (1989). Such games could be shared even with friends who didn’t have the same type of PC, since the floppy disk drive had long been an industry standard. 

The Floppy Disk Set for the Full Release of Doom (Courtesy Internet Archive)

Both the floppy disk and the game cartridge would eventually be supplanted by the CD-ROM (Compact Disc Read Only Memory). We discuss the advent and rise of optical media in the following sections.

Optical Media

The use of optical media greatly expanded the possibilities of gaming by allowing games to be much larger. In a cartridge, game data is stored on a single chip on a circuit board, limiting capacity, and save games are stored on another chip. But almost the entire surface of an optical disc can be crammed with data in the form of microscopic pits – such discs are called ‘optical’ because a laser is used to read them (and, on writable discs, to burn the data in). A Blu-ray disc’s capacity is more than 50 times that of a CD because its pits are much smaller and more densely packed.
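To put the capacity jump in perspective, here is a quick back-of-the-envelope comparison in Python, using the commonly cited capacities of the formats discussed in this article (single-layer values except where noted):

```python
# Commonly cited capacities of the storage formats discussed in this article.
# The N64 cartridge figure is its maximum, not a typical game size.
CAPACITY_MB = {
    "N64 cartridge": 64,
    "CD-ROM": 700,
    "DVD (single layer)": 4_700,
    "Blu-ray (single layer)": 25_000,
    "Blu-ray (dual layer)": 50_000,
}

CD = CAPACITY_MB["CD-ROM"]
for name, mb in CAPACITY_MB.items():
    print(f"{name:>22}: {mb:>6} MB (~{mb / CD:.0f}x a CD-ROM)")
```

A dual-layer Blu-ray works out to roughly 71 times the capacity of a CD-ROM, consistent with the ‘more than 50 times’ figure above.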

Comparison of Storage Density in Optical Media (Courtesy Wikimedia Commons)

Games such as Final Fantasy VII (1997), Halo 2 (2004), The Last of Us (2013), and many others made full use of the storage space available to them, delivering quality content at the highest graphical fidelity possible at the time of their release. Support for optical media also helped consoles double as home entertainment systems, and the success of many consoles, across generations, was partly due to their ability to play movies and music as well. 

The CD-ROM Supersedes the Game Cartridge

The PlayStation (1994) was the first home gaming console to sell more than 100 million units, and this was at least partly due to its use of compact discs (CDs) as the storage format for its games. In 1985, Sony and Philips had jointly developed a technical standard by which CDs could hold any form of data, which led to the creation of the CD-ROM. With about 700 MB of storage space, CD-ROMs dwarfed the capacity of cartridges (which topped out at 64 MB) and allowed games to offer far more content than was possible earlier. CDs had other advantages too – they were cheaper to manufacture and reduced production times for games, leading to lower retail prices. 

Sony’s PlayStation Popularised the CD-ROM as a Game Storage Medium (Courtesy Wikimedia Commons)

According to a PC World article, Sony won the first console war it ever participated in largely because it used CDs. Thanks to the inherent advantages of this storage format, Sony was able to bring many third-party publishers into the fold, including Square (now Square Enix), whose hugely successful Final Fantasy games had traditionally been Nintendo exclusives and had served as system sellers for Nintendo consoles.

Square decided that Nintendo’s cartridge-based N64 would not suffice for Final Fantasy VII. It turned to Sony, and the end result was a gorgeous game with high-quality cinematics and pre-rendered backgrounds, all made possible because of the CD’s higher storage capacity and the PlayStation’s support for 3D graphics. Final Fantasy VII was hailed as the ‘game that sold the PlayStation’ – FF VII did for Sony what previous instalments of the franchise had done for Nintendo.

Final Fantasy VII (Courtesy Sony)

The PlayStation could also play audio and video CDs, making it one of the most versatile home entertainment systems of its time. The resounding success of the PlayStation led to the marginalisation of the cartridge, and home gaming consoles would thereafter use optical storage as the primary medium for selling games.

Final Fantasy VII’s Cinematics Were a Key Selling Point for the Game (Courtesy Sony)

DVDs Turn Consoles into Home Entertainment Systems

Much like the CD-ROM, the DVD (Digital Versatile Disc) would allow games to be much larger in size and feature more content. Halo: Combat Evolved (2001) spawned a huge franchise that became a major system seller for Xbox consoles, and required the higher storage capacity of a DVD. Halo 2, considered one of the greatest games of all time, is nearly seven times as large as the first Halo game.

Halo 2 (Courtesy Microsoft)

The larger-sized 3D games made for the PS 2 and the Xbox are recognisably modern – Gran Turismo 3 (2001) for the PS 2 features stunning graphics, as do the Halo games released for the Xbox, which used Microsoft’s DirectX technology and a graphics card made in collaboration with Nvidia for hardware acceleration. Higher graphical fidelity depends on high-resolution textures and game assets, which in turn require greater storage space – the shift to DVD enabled more content, and contributed to higher-quality graphics as well.

The PlayStation 2 (2000) and the first Xbox (2001) supported game DVDs and CDs, and both could interface with home theatre systems, essentially serving as DVD players. The PS 2 could play movies and music out of the box, with the gamepad controlling playback; you could also buy a remote for image adjustments and more playback features. Unlike the PS 2, the Xbox did not ship with DVD playback enabled: it required a DVD kit containing a remote control and an infrared sensor.

The Xbox Required a Special Kit for DVD Video Playback (Courtesy Wikimedia Commons)

Costing a mere $299 at launch, the PS 2 was actually cheaper than some of the standalone DVD players of that time. The console’s support for DVD video was welcomed at a time when the DVD was one of the best ways to experience movies at home, as it came with bonus features and extra materials like deleted scenes and interviews with cast and crew. Films like the lengthy Lord of the Rings trilogy made it to the DVD as extended editions. The PS 2 not only boasted a great library of game exclusives but also served as an affordable home entertainment system, allowing users to enjoy high-quality DVD editions of their favourite movies. 

The PS 2 is the best-selling home gaming console of all time, and one factor said to have contributed to this was its capacity to double as a DVD player – it even allowed you to play burned CDs and DVDs, i.e., pirated movies, music and games.

Blu-ray Powers Massive Games

The Blu-ray disc format allowed even larger game sizes thanks to its storage capacity of up to 50 GB. The PS 3 edition of The Last of Us (2013) – one of the finest games ever made – was around 26 GB in size, Uncharted 3: Drake’s Deception (2011) takes up 43.5 GB, and the PS 4 box set of The Last of Us Part II contained two Blu-ray discs to accommodate more than 70 GB of game data.

The Last of Us Part II (Courtesy Sony)

Game sizes also increased from one console generation to the next – Call of Duty: Ghosts (2013) was 10.9 GB on the PS 3 but mushroomed to more than 30 GB on the PS 4, mainly because the next-gen version came with higher-resolution texture sets, higher-polygon game assets, and higher-definition cutscenes and cinematics.

In fact, the Blu-ray disc format’s prevalence is at least partly due to gaming and the PlayStation 3. Sony created the prototype of the Blu-ray disc in 2000, but when the format was officially released in 2006, a format war ensued between Blu-ray and HD-DVD. By 2008, however, both the software and entertainment industries had settled on Blu-ray – a BBC article published that year argued that the format won because it was integrated with the PlayStation 3, which had sold more than 10 million units by early 2008 and could play Blu-ray video at a time when the first standalone Blu-ray players cost nearly double the PS3’s price. Microsoft had backed the rival HD-DVD format for the Xbox 360 (2005), and though rumours occasionally surfaced about a Blu-ray add-on for the console, the Xbox 360 never supported Blu-ray playback.

Eventually, Microsoft also adopted Blu-ray, which is supported by the PS 4 (2013), the PS 5 disc edition (2020), the Xbox One (2013) and its variants, and the Xbox Series X (2020). These consoles support DVD and Blu-ray video playback as well.

It must be noted that a built-in Blu-ray player did not propel the PS 3 to the success that PS 2 enjoyed with its DVD playback capability. Nintendo’s Wii (2006), which used a proprietary DVD-based disc format for physical copies and lacked any movie or music playback, won the console war of this generation, mainly due to its far wider appeal. 

In fact, by the late 2000s, observers were predicting that physical formats would decline thanks to a new ‘download era’ ushered in by high-speed, high-bandwidth internet connections and increased broadband penetration – in effect, supporting a physical format would not confer a significant advantage. In the next sections, we will discuss the rise to prominence of digital distribution for games, and the factors that have driven this transition.

The Prominence of Digital Delivery in Gaming

Digital delivery for games has grown more and more popular over the last decade, and the pandemic-period lockdowns accelerated this trend – in 2020, digital game sales surpassed physical sales for the first time, and 91% of the game industry’s revenue was digital (including full game downloads, in-app purchases, downloadable content and mobile game sales). By 2021, the number of unique console titles sold on digital platforms in the US had far surpassed games sold as physical copies – according to an NPD report, nearly 2,200 unique console titles were released on digital storefronts in the US, compared to 226 titles sold as physical copies. A year later, Sony reported that digital purchases constituted 80% of game sales in the first quarter of 2022.

Unique Digital Console Titles Far Surpassed Physical Copies in 2021 (Courtesy Ars Technica)

All this data suggests that digital delivery has become the primary, if not the sole, method for distributing games, and analysts believe that digital games will totally dominate the industry in less than a decade. Digital delivery is already the default for smartphones, which have never offered support for physical games, and we have covered mobile gaming and its main platforms in detail elsewhere. However, digital games did not rise to primacy overnight for other platforms such as PC and consoles, which had to transition from selling large-sized titles on high-capacity optical media to digital delivery. We will discuss the emergence and growth of digital distribution in such platforms below.

The Rise of Steam

Steam is by far the most successful digital distribution platform for PC games, though it was not the first – that honour goes to the now-defunct Stardock Central (2001). Steam was released in 2003 as a client that could easily update certain Valve games, especially the popular multiplayer shooter Counter-Strike (2000).

A 2002 survey conducted by Valve revealed that 75% of its users had broadband connections, convincing the company that digital delivery of games was a viable proposition. Valve’s first attempt to deliver a game digitally – Half-Life 2 – was not a resounding success, but the company continued to improve the Steam client and its support infrastructure, and within a few years major publishers such as id Software, Take-Two Interactive, Eidos, EA and many others decided to make their PC games available on Steam.

The Steam User Interface circa 2010 (Courtesy Valve and Reddit User 2muchrubik)

As the service grew more popular, some developers baulked at its restrictive terms of service, with EA deciding to launch Mass Effect 3 on its own Origin platform in 2012. Nevertheless, Steam remains the pre-eminent digital distribution platform for PC games – by 2021, it had 132 million monthly active users who could browse a library of over 50,000 unique titles, and revenue from games sold on Steam reached $6.6 billion that year. 

Other digital storefronts include GOG.com (2008) – which enforces a strict DRM-free policy – and the Epic Games Store (2018), which is attempting to challenge Steam’s primacy. To incentivise publishers, Epic takes a 12% cut of game sales, whereas Steam initially takes 30%, though its share decreases to 20% once a game’s revenue exceeds $50 million. Epic also does not mandate digital rights management (DRM) for all games, leaving the decision to each publisher. 

In 2020, Epic Games spent more than $400 million to secure games that would be exclusive to Epic, and unavailable on Steam, for at least a year. Subsequently, in December 2021, monthly active users peaked at 62 million, which is less than half of Steam’s 132 million monthly active users, and as of 2021, Epic has a total of 917 games, a fraction of Steam’s game catalogue. However, Steam took more than a decade to build its library and following, and the Epic Store has been around for less than five years. Epic is also known for freely giving away an estimated $17.5 billion worth of games, and even though its store went down for eight hours when it made Grand Theft Auto V free for a week in May 2020, Epic got seven million new users as a result of the giveaway. Steam might have a huge head-start, but Epic arguably has the momentum.

Epic Games Got 7 Million New Users After its GTA V Giveaway (Courtesy Rockstar Games)

The prominence of Steam, the rise of Epic and the death of the computer optical drive have made digital distribution the principal method of selling PC games.

Digital Stores for Consoles Evolve

Console makers, especially Microsoft, were quick to realise the potential of digital delivery – the original Xbox shipped with the Xbox Live (2002) service, whose principal function was to enable multiplayer gaming and the digital delivery of game content. In its early years, the platform offered premium downloadable content and add-ons for games, but could not offer full game downloads. In 2004, Microsoft created the Xbox Live Arcade platform, which offered small, quickly downloadable games from a range of developers – most of these games were either console classics or arcade-style titles. The platform served as an easy entry point for independent developers who could develop games and quickly release them as download-only titles. 

Paperboy, a 1985 Atari Game, on Xbox Live Arcade (Courtesy Flickr)

Sony launched the PlayStation Store in 2006, and as with Xbox Live, it initially offered downloadable content rather than full game downloads. The storefront first launched with the PlayStation 3, and six years later it started to offer a variety of full game downloads. 

The Wii Shop Channel (2006–2019), a digital distribution platform for the Nintendo Wii, supported the download of various applications, content, and even a web browser. WiiWare (2008), one of the Wii Shop Channel’s services, was Nintendo’s first foray into the digital delivery of full (small-sized) games, and as Microsoft did with Xbox Live Arcade, Nintendo promoted WiiWare as an avenue for small developers to publish innovative content digitally, avoiding the risks and commitments of retail distribution. The WiiWare service shut down when the Wii Shop Channel was discontinued; players can no longer purchase titles, though they can still download games they own to compatible devices.

By 2009, both Microsoft and Sony had started offering full game downloads – after its E3 press conference, Microsoft announced that games such as BioShock (2007), Assassin’s Creed (2007), and The Elder Scrolls IV: Oblivion (2006) would be offered as full downloads on its store for the Xbox 360. Meanwhile, Sony announced that full games could be downloaded over Wi-Fi to its handheld gaming device, the PSP Go (2009), which had 16 GB of storage and lacked a slot for physical games. By 2011, the PS 3 had also enabled full game downloads – at a time when game sizes were ballooning and broadband speeds were not quite up to the task. In 2013, Ars Technica reported that a major Steam game took about 4 hours to download, while a PS 3 game took more than 5 hours – and this was in the US, where broadband speeds have generally been high. Download times have, however, tended to decrease as internet speeds and broadband access have steadily improved.

Oblivion was One of the First AAA Games to be Delivered Digitally on the Xbox (Courtesy Bethesda Softworks)

The Nintendo eShop launched in 2011 for the 3DS and was made available on the Wii U at the console’s launch (2012), and keeping up with Sony and Microsoft, Nintendo’s eShop featured full game downloads, along with DLC, video content, updates and more. The first game to be released both on the eShop and as a retail copy was New Super Mario Bros. 2 (2012). 

New Super Mario Bros 2 (Courtesy Nintendo)

The majority of Nintendo’s retail releases are now available as digital downloads, and developers can also choose to publish digital-only titles on the eShop. This platform is the primary digital storefront for the Nintendo Switch, the hybrid handheld and home gaming console that has sold more than 100 million units.

By 2013, digital game sales had started to edge past physical sales in the US, and by 2018, physical sales accounted for only 17% of all game sales in the country. In the next section, we will discuss the factors that have contributed to the rise of digital distribution. 

Physical Game Sales Have Steadily Declined Since 2013

Why Have Physical Game Sales Declined?

There are many interconnected factors behind the decline of physical game sales. Higher internet speeds and broadband penetration have made video streaming services possible – they have also made large game downloads viable. The COVID pandemic made digital downloads popular during the shutdowns, and revenue from digital downloads has surpassed physical sales partly because mobile games, which are delivered digitally, have become the biggest segment of the gaming market. We discuss these and other factors below. 

  • Increasing penetration of high-bandwidth, high-speed, low-cost internet: Based on Ookla reports, nearly 50 countries have median download speeds greater than 100 megabits/second (Mbps). As of September 2021, the global average broadband download speed was 113.2 Mbps, up nearly 30 megabits from 2020. 83% of US households have access to broadband connections, and a majority of those homes have connection speeds of 15 Mbps or higher. With the advent of 4K texture packs, 4K video, high-quality audio and more, some contemporary games exceed 100 GB, with Call of Duty: Modern Warfare (2019) at a staggering 183 GB. A retail edition of that game would require at least four Blu-ray discs, making a high-speed internet connection a more viable delivery option than cumbersome disc editions. 
  • Digital storefronts control game prices and cut out the resale market: Game sales are significant sources of revenue for console makers and publishers. Cutting out the middleman – retailers who stock physical copies of games and trade-ins – and drawing people to digital storefronts would result in higher overall profits for console manufacturers and publishers. This has led console makers to offer more incentives for digital purchases – the Xbox Play Anywhere program, launched in 2016, allowed users to buy a digital Xbox game and play it on both the console and the PC. Both Sony and Microsoft now offer cheaper all-digital editions of consoles: the Xbox Series S and the PlayStation 5 Digital Edition. Microsoft is even working on a patent that will allow Series S owners to authenticate a physical game copy using an external disc drive and download it via the digital storefront. The PS 5 uses an advanced compression technique to decrease the size of game data and reduce the wait time to access a digitally downloaded game.
  • The pandemic made digital downloads more popular: Many observers have pointed out that lockdown restrictions spurred the purchase of digital copies, especially as gaming became a vital means of connecting with people while physical stores were shut down. In fact, the pandemic is considered a tipping point for digital game sales. By 2020, when the current generation of consoles was released, digital purchases accounted for nearly half of console game sales.
  • Cloud gaming and gaming subscriptions: Both Microsoft and Sony now feature robust digital delivery platforms and offer game subscription services whose libraries boast multiple exclusives, which can be downloaded and played at no extra cost for as long as you pay the monthly subscription fee. Such services also offer discounted prices at the digital storefront for many games, including those not part of the subscription, and their highest tiers feature cloud gaming. With 25 million users, the Xbox Game Pass accounts for a significant portion of Microsoft’s gaming revenue. In fact, the rise of subscription services, made possible by higher internet speeds, could even lead to a new console war fought over ‘subscription exclusives’.
  • Live-service games depend on downloads: The industry has seen a major shift towards live-service games, which seek to keep their audiences engaged for years by regularly adding new game content, updates, game-balancing patches and in-app purchases – and all such content is delivered digitally. In a world where many of the industry’s most prominent titles follow the games-as-a-service model, physical copies may simply not be a viable mode of distribution. Live-service games are also quite large in size – the free-to-play game Destiny 2, including all expansions and updates, currently requires 97 GB of storage space on a PS 5, and more than 100 GB of space on an Xbox Series X|S, and though a physical copy exists, a significant portion of the game content needs to be downloaded. Some of the most prominent games today depend on digital delivery.
  • The biggest sector in gaming uses digital delivery: Mobile games are delivered as apps through stores such as Google Play and the iOS App Store, and they are some of the most lucrative titles on the market. As of 2022, more than 60 mobile games have made more than $1 billion in lifetime sales, and mobile titles constitute the largest share (45%) of global games revenue. The mobile gaming market is expected to be worth nearly $140 billion by 2026, and as mobile games grow ever more popular and lucrative, revenue from digital sales (including full-game downloads, DLC, in-app purchases and more) will continue to surpass physical sales revenue. 
Considering that so many factors are contributing to the transition to digital delivery, one might believe that physical games are headed for extinction. This is, however, not quite the case. In 2018, gamesindustry.biz argued that physical game sales might be declining, but they still constituted an industry worth billions of dollars worldwide, with 75% of AAA games being sold as physical copies. A year later, it reported that physical console games accounted for 60% of sales value during the last quarter of 2019. These reports may reflect a pre-COVID reality, but even during the pandemic, in April 2020, more than a million physical games were sold in the UK, the highest figure since 2015. Physical games, while on the decline, are not yet finished.
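The arithmetic behind the first factor above can be sketched in a few lines of Python. The game size and speed figures are taken from the list; the conversion uses decimal gigabytes and ignores protocol overhead, so treat the result as a rough estimate:

```python
import math

def download_hours(size_gb: float, speed_mbps: float) -> float:
    """Hours to download size_gb gigabytes at speed_mbps megabits per second."""
    megabits = size_gb * 1000 * 8      # 1 GB = 1000 MB = 8000 megabits
    return megabits / speed_mbps / 3600

game_gb = 183        # Call of Duty: Modern Warfare (2019)
avg_speed = 113.2    # global average broadband speed, Sept 2021 (Mbps)

print(f"Download time: {download_hours(game_gb, avg_speed):.1f} hours")
print(f"Dual-layer Blu-ray discs needed: {math.ceil(game_gb / 50)}")
```

At the global average speed, the 183 GB game downloads in roughly 3.6 hours – far simpler to distribute than pressing and shipping a four-disc retail set.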

Conclusion

The inherent advantages of physical games – shareability, resale value and access to game data without the need for a connection – still hold value to gamers. The outrage over Microsoft’s attempt to implement an always-online DRM policy for even physical Xbox One games shows that full ownership – another perk of physical games – is valued by users.

However, the Nintendo Switch’s cartridges underscore the problems with physical copies. While Sony and Microsoft support Blu-ray disc editions for their consoles, the Switch uses a proprietary cartridge based on the SD card format for retail games – such cards can currently hold up to 32 GB of data, though 64 GB cartridges have been in the works for at least three years. The small capacity of these cartridges often requires users to download a significant part of a game, and they are costlier to manufacture than Blu-ray discs, resulting in some games costing more on the Switch than on PC or other consoles. These disadvantages may lead the portable Switch’s successors to eschew support for physical copies.

In fact, both retail distribution and digital delivery may be superseded by cloud gaming, which is expected to become a $20.9-billion industry by 2030. Amazon’s Luna and the cloud gaming solutions of Microsoft and Sony require only a fast connection to play a game, and if game streaming becomes prevalent, it can make retail copies, digital downloads and even game installation a thing of the past. Cloud gaming is considered the ‘killer use-case’ for burgeoning 5G networks, and the growth of 5G infrastructure can make game streaming a highly popular and easy way to play games. 

Some sources on the web continue to argue that physical games are still relevant, and others that they should remain relevant, given that access to a digital copy of a game can potentially end when the provider goes bust – this is one of many arguments used to justify breaking DRM on media. Some have even argued that the death of the Blu-ray format (as a medium for movies and games) does not bode well for gamers, especially those on a budget, who need to resell their games to afford new ones.

Opinions aside, the industry itself is moving away from physical copies quite decisively, propelled by steadily increasing internet speeds and a changing landscape dominated by highly lucrative live-service titles and mobile games. Physical copies may not die out completely, but they will likely be relegated to a (very) niche market in the years to come. 

Gameopedia offers custom solutions depending on your specific data requirements. Reach out to us for actionable insights on both digital delivery and the retail distribution of games.


Game Engines: All You Need to Know

On January 1, 1993, id Software issued a press release about their upcoming game, Doom, in which they made a few immodest claims – the game would herald a technical revolution in PC programming, it would push back the boundaries of what was thought possible on contemporary PCs, and would offer a host of technical ‘tour de forces to surprise the eyes’ while delivering smooth gameplay across a range of PC hardware. The press release also contained a quote from John Carmack, id’s technical director, who claimed that the Doom ‘engine’ was optimised to run at excellent frame rates. 

When it came out, Doom (1993) was all it had promised to be. It was played by millions of people and continues to be celebrated as a milestone in gaming and culture more than 25 years later. Many factors contributed to Doom’s massive success, but perhaps the most important was the ‘engine’ used to power the game, which not only allowed Carmack and his associates to push the limits of what was possible in a computer game, but also repeat such a feat with their subsequent games, deploying upgraded versions of their original engine. In effect, id Software created a new paradigm for game development – the use and reuse of a unified code base known as an engine to create new games that shared a common core. 

Engines such as the one mentioned in id’s press release now power the creation of a bewildering array of games for multiple platforms – in this blog, we will discuss what engines are, their early history, and the two most popular game engines in the world – the Unreal Engine and Unity. We will also delve into various first-party game engines used by major studios to create games for classic franchises such as Assassin’s Creed, Battlefield, Call of Duty and The Elder Scrolls. 

What is a Game Engine and Why It’s Vital

A game engine is a software framework primarily designed for the development of video games. Developers can use engines to construct games for consoles, PCs, mobile devices and even VR platforms.

Game engines often include core functionality such as a 2D or 3D renderer, a physics or collision engine, and support for sound, cinematics, scripting, animation, AI, and networked play. They also allow games to be deployed on multiple platforms – game engines are capable of platform abstraction. Both Unity and Unreal Engine support easy deployment of game-ready assets from their respective marketplaces, the Unity Asset Store and the Unreal Marketplace.

The great advantage of a game engine is that it allows game developers to reuse or adapt the engine to make new games, instead of starting from scratch – they are spared the chore of reinventing the wheel. By providing support out of the box for developing many aspects of a game, engines reduce cost, time and manpower investment, allowing game studios to remain competitive in a constantly-evolving industry in which production costs have steadily risen. They also foster innovation by making game development easier. 

In the following section, we will discuss how game engines transformed game development by allowing developers to use a single piece of software to create multiple games for multiple platforms. 

History of Game Engines

Strictly speaking, game engines did not exist prior to the ’90s. Before the advent of such software applications, games had to be designed from the ground up, and very little of the code base could be reused when deploying games on multiple platforms. Games for the Atari 2600 – a major game console released in the ’70s – were coded using assembly language specific to the console. Developers did reuse code, but there was no unified code base that could be extended to create multiple Atari 2600 games. 

Super Mario Bros (1985) is arguably the first console game to be made by reusing a critical piece of code – Shigeru Miyamoto’s team had written the code to enable smooth side scrolling for their 2D racing game Excitebike (1984), and their code was reused in the making of the first Super Mario Bros game, allowing Mario to smoothly move across the screen and even accelerate from a walk to a run. This was an important inflection point in game development, considering that much of the first Mario game had to be designed by hand, on graph paper, and the platform for which it was built, the Nintendo Entertainment System, used assembly language, limiting code reuse. Despite this, the developers managed to re-implement the Excitebike code for the Mario game.

Super Mario Bros (Courtesy Nintendo)

The ’80s also saw the release of several 2D game development kits – known as game creation systems – that enabled users to create specific games using pre-built assets: the Pinball Construction Set (1983), the Adventure Construction Set (1984) and the Shoot-‘Em-Up Construction Kit (1987) are examples of kits published during this time. Each kit was specialised for a specific type of game – users made pinball games with the Pinball Construction Set, while the Shoot-‘Em-Up kit enabled the creation of 2D shooters.

Pinball Construction Set (Courtesy Electronic Arts)

These kits allowed everyday users to create games using existing components, and often featured a Graphical User Interface (GUI) that further eased the process of creating a game. The Shoot-Em-Up kit even allowed users to share their creations as full game files, which could then be played on other systems that did not have the construction kit itself. No coding knowledge was required to make these games – this helped popularise such kits, but also limited the scope of the games that could be made with them.

Garry Kitchen’s GameMaker (1985), released by Activision for various home computer systems, was the first integrated development environment for making games, and can be considered a proto-game engine. GameMaker allowed users to create background graphics, movable objects known as sprites, sound effects and in-game music, and also included a game programming language that allowed the developer to code more features. GameMaker’s versatile programming language set it far apart from other game creation systems of the time.

The first game engine as we understand the term today was not the Doom engine, but an earlier code base created by id for the production of a new game trilogy in their Commander Keen franchise. From the outset, id recognised that creating a single piece of software that provided common functionality for multiple games was a more significant accomplishment than making any one game, and even tried to license the so-called ‘Keen engine’ (1991), but met with little success.

The reason why this software was called a game engine is largely because both John Carmack and John Romero of id are car enthusiasts. According to Romero, an engine is the heart of the car, while the game engine is the heart of the game – it powers the game, while the art and other game assets are comparable to the body of the car. After coining the term, id introduced it to the world with the Doom press release in 1993. 

The developer found a much more receptive audience for the Doom engine – after all, it had powered a massive hit, and was hence licensed by several companies, leading to the release of games such as Heretic (1994), Hexen: Beyond Heretic (1995) and Strife: Quest for the Sigil (1996). Developers using the Doom engine added their own graphics, characters, weapons and levels—the “game content” or “game assets” were unique to the developer, but the technology that powered the game was id’s engine.

Heretic (Courtesy GT Interactive)

The engine itself was revolutionary in many ways – it faked a 3D experience by adding height differences to the environment and used only 2D sprites. Even now, Doom is remembered for how fast-paced it was, but the original renderer Carmack wrote slowed to a crawl on most systems when rendering complex scenes. Carmack then researched academic papers and implemented a technique known as binary space partitioning – never used before in a game – to dramatically speed up the Doom engine’s renderer. In drastically simplified terms, binary space partitioning enabled the engine to prioritise the rendering of objects based on how close they were to the player, and ‘cull’ or not render areas that were hidden or too far away from the player, thereby shortening render times. This enabled Doom to be very responsive and run at high frame rates on the home computers of that time, without sacrificing graphical fidelity. 
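To make the culling idea concrete, here is a hypothetical, heavily simplified sketch in C – the one-dimensional tree, the node names and the function are all invented for illustration; a real BSP renderer partitions 2D or 3D space with arbitrary splitting planes:

```c
#include <stddef.h>

/* A toy 1-D binary space partition: each node splits the world at a
 * single x coordinate. Visiting the child on the viewer's side of the
 * split first yields a front-to-back draw order, which is what lets a
 * renderer skip (cull) geometry already hidden behind closer walls. */
typedef struct Node {
    float split;               /* position of the partitioning plane */
    struct Node *front, *back; /* sub-spaces on either side          */
    const char *name;          /* stand-in for geometry at this node */
} Node;

/* Walk the tree front-to-back relative to the viewer's position,
 * appending visited node names to 'order'. */
void draw_front_to_back(const Node *n, float viewer,
                        const char **order, int *count)
{
    if (n == NULL) return;
    const Node *nearer  = (viewer < n->split) ? n->front : n->back;
    const Node *farther = (viewer < n->split) ? n->back  : n->front;
    draw_front_to_back(nearer, viewer, order, count);
    order[(*count)++] = n->name;
    draw_front_to_back(farther, viewer, order, count);
}
```

A real engine would stop the traversal once the screen is filled, which is where the render-time savings come from.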

A few years after Doom, Carmack observed that the engine technology was id’s key value proposition and that game designing (on top of such engines) could well be managed by ‘a few reasonably clued-in players’. 

Ken Silverman took cues from Doom to create the Build (1995) engine at the age of 19. The Build engine refined the illusion of 3D with the ability to look up and down in games such as Duke Nukem 3D (1996), even though these titles were rendered, like Doom, on a 2D plane. Silverman also added tags to parts of the game world – walking over or into these areas would teleport the player, creating the impression of falling down a pit or passing through a tunnel.

Three years after Doom, id released the Quake engine (1996), which featured true real-time 3D rendering and support for 3D hardware acceleration. The engine also used dynamic lighting for moving objects and static lightmaps for stationary geometry. Quake (1996), which was made with the engine, is yet another milestone in gaming.

Quake (Courtesy GT Interactive)

Bethesda had earlier attempted a true 3D engine of its own, called XnGine (1995), which suffered from bugs and stability issues. Eventually, Bethesda achieved success with the engine by creating the huge, procedurally-generated world of The Elder Scrolls II: Daggerfall (1996), but would eventually abandon XnGine for NetImmerse (1997), the predecessor to Gamebryo. 

The developers at id then outdid themselves with id Tech 3 (1998), an upgraded version of the Quake engine that would later power games such as Medal of Honor: Allied Assault (2002) and Call of Duty (2003), and would be licensed to several developers. At the time, the engine was simply known as the Quake III Arena engine, after the game for which id had upgraded their original code base. It featured next-gen graphics using shaders – scripts that determined and enhanced the appearance of in-game objects, surfaces, areas and even character models with accurate shadows, light emission and reflections. Shader code also specified the sound effect associated with a surface, and was used to identify map sections such as water bodies. The engine was also one of the first to support curved surface geometry. All these innovations – especially the shader system – were computationally expensive and would have crippled the engine’s renderer, which is why the developers implemented the now legendary fast inverse square root algorithm. 

The fast inverse square root algorithm quadrupled the speed of calculations for pathfinding, lighting, reflections, and many other game-critical operations. In effect, Quake III Arena (1999) boasted far better performance – with high-quality, hardware-accelerated graphics – because a piece of code less than 20 lines long sped up a calculation performed millions of times per second. When id released the source code for the engine in 2005, the fragment attracted the attention of many programmers, mathematicians and developers because of the accuracy and speed with which it approximated inverse square roots – it had a worst-case error of about 1% while running roughly four times as fast as a regular inverse square root calculation. It remains nearly as fast, though not as accurate, as the dedicated hardware instructions modern processors now provide for the same purpose.
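The routine is short enough to reproduce. The version below is lightly adapted from the released Quake III Arena source – the memcpy calls replace the original’s pointer-cast type punning, which is undefined behaviour in modern C, but the logic is otherwise the same:

```c
#include <stdint.h>
#include <string.h>

/* Fast inverse square root, adapted from the Quake III Arena source.
 * The float's bits are reinterpreted as an integer, the "magic"
 * constant 0x5f3759df produces a surprisingly good first guess, and a
 * single Newton-Raphson iteration refines it to ~1% accuracy. */
float q_rsqrt(float number)
{
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);       /* read the float's bit pattern  */
    i = 0x5f3759df - (i >> 1);      /* magic initial approximation   */
    memcpy(&y, &i, sizeof y);       /* back to a float               */
    y = y * (1.5f - (x2 * y * y));  /* one Newton-Raphson iteration  */
    return y;
}
```

The halved exponent produced by the bit shift is what makes the initial guess work: a float’s exponent bits encode a power of two, and an inverse square root halves and negates that power.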

Quake III Arena (Courtesy Activision)

id created a thriving licensing business on the back of its pioneering game engine, which also powered the developer’s highly successful multiplayer title – Quake III Arena. It may hence seem like id lacked any serious competition at that time, but that was not the case – it was challenged by a relative newcomer – Epic Games’ Unreal Tournament (1999). Epic’s multiplayer title was made with the Unreal Engine, which had first been used to create Unreal (1998), a first-person shooter that had proved successful. The engine itself surpassed id’s technology in key areas, especially graphics, and in the next section, we will discuss the rise of the Unreal Engine, and Unity, another industry standard engine.

Unreal Tournament (Courtesy GT Interactive)

Unreal and Unity: The Industry Standard

The Unreal Engine and Unity are currently the industry standard – they provide the ‘digital infrastructure for many of the world’s most popular games’. We will discuss the history of both engines – the first version of the Unreal Engine is a milestone in the development of game engines, and drew praise even from the wizards of id, while Unity rose to prominence by making high-quality tools accessible to indie studios. We will also detail the features that both these engines now offer to developers. 

Unreal Engine

Tim Sweeney, the founder of Epic Games, was no laggard at programming. He coded 90% of the first Unreal Engine for Epic Games’ FPS title, Unreal, which would debut in 1998, although the engine itself was licensed to other developers by 1996. 

Initially, UE was designed for software (CPU-based) rendering, but would later be able to make use of dedicated graphics hardware. From the outset, it used 16-bit colour and supported visual effects such as soft shadows, dynamic lights, volumetric fog and texture filtering – many of these features would not only be praised by id’s Carmack, but also acknowledged as milestones that Tim Sweeney reached before id could get there. 

Unity (Courtesy Unity Technologies)

Carmack would remark that Unreal had broken the mould with the use of 16-bit real colour – developers would inevitably choose to work with a 16-bit colour palette going forward – and that the engine had raised the bar on what gamers would expect from future games. Carmack had always been one of Sweeney’s heroes – in fact, Sweeney equated Carmack’s innovations in the gaming field with Newton’s contributions to the study of physics. One can only imagine Sweeney’s reaction to Carmack’s praise. 

By late 1999, about sixteen projects had been developed using Epic’s engine, including Deus Ex (2000) and The Wheel of Time (1999). Unlike id Software, whose engine business only offered the source code, Epic provided support for licensees and met with them to discuss improvements to its game development engine. 

Unreal also provided tools that were more user-friendly for non-engineers, a crucial factor for the Deus Ex developers, whose team included many designers and whose goal for the game was to go beyond the standard FPS. 

Fast forward to today and Unreal still enjoys a reputation for user-friendly tools that greatly extend the capabilities of developers creating cutting-edge games. The current version of the Unreal Engine has been hailed as a game changer because of its feature set, is free to download and use (there are premium licensing options as well) and is known for its forgiving learning curve. The engine is free for internal projects, though users have to pay a 5% royalty if and when their product earns over $1 million.

A huge number of games are made with Unreal – there is an exhaustive list on Wikipedia that keeps track of them. Both Unreal Engine 4 and 5, the latest iterations of the engine, came with major improvements – Unreal Engine 4 introduced the Blueprints system, a versatile visual scripting language that allows developers to prototype game elements and gameplay mechanics quickly by connecting nodes and other basic building blocks. 

Released in April 2022, UE 5 features a host of innovations and is already being used by companies to make games – currently, we have access to some jaw-dropping gameplay footage for several upcoming UE 5 games, an interactive tech demo based on the Matrix universe, and footage of a fan-made Superman game, all of which demonstrate the engine’s myriad capabilities. 

UE 5’s tools have brought about some of the most photo-realistic gameplay footage we have yet seen, and we discuss some of the engine’s most important innovations below.

Nanite: This is a ‘virtualised geometry system’ that speeds up the creation of LODs (levels of detail). In any game, objects are rendered at decreasing levels of detail based on how far they are from the player (or how important they are in a game environment).  Before Nanite, developers had to author LODs by hand (the engine would then use the appropriate LOD for an object based on the player’s position), but Nanite allows one to import high-quality assets that are automatically rendered at the correct level of complexity with respect to the player’s point-of-view.
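As a rough illustration of the hand-authored workflow Nanite automates away, a distance-based LOD lookup might look like the following C sketch – the thresholds, mesh names and cull-when-too-far behaviour are invented for this example, not taken from any engine:

```c
#include <stddef.h>

/* One hand-authored LOD entry: which mesh variant to draw, and the
 * maximum camera distance at which it is used. */
typedef struct {
    const char *mesh;    /* name of the pre-built asset variant */
    float max_distance;  /* used up to this distance            */
} LodLevel;

/* Pick the first LOD whose threshold covers the given distance.
 * Entries must be ordered from most to least detailed. Returns NULL
 * when the object is beyond every LOD, i.e. culled entirely. */
const char *select_lod(const LodLevel *levels, int n, float distance)
{
    for (int i = 0; i < n; i++)
        if (distance <= levels[i].max_distance)
            return levels[i].mesh;
    return NULL;
}
```

Nanite’s contribution is that artists no longer author the `levels` table or the meshes behind it at all – the engine streams the right amount of triangle detail from a single high-quality source asset.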

Lumen: This utility is used to manage and rapidly update scene lighting – scenes change to reflect time of day accurately, new light sources are immediately integrated into the overall lighting profile, and even sudden flashes of light entering the shot affect the scene realistically. 

World Partition System: This utility greatly enhances the Unreal Engine’s functionality in terms of building open worlds. The system uses a grid to map an entire universe into manageable sub-level chunks, which can be loaded and unloaded as a player traverses the landscape. Such sub-levels can be developed by independent teams as well. 

UE 5 also has new tools for animation and sound, and can even be used for film and animation production (Unreal Engine 4 is already being used to create the environments for TV shows such as The Mandalorian). It allows for much larger file sizes, supporting even 12K texture files – an important feature considering that games are now played at increasingly higher resolutions. On the whole, UE 5 continues to power the creation of games and media with cutting-edge graphics, sound, animations and more.

Unity

When he created the first version of the Unreal Engine, Tim Sweeney was going up against his greatest hero, and the Unreal Engine became the de facto choice for AAA titles; once Epic made it free, many indie and AA studios adopted its rich toolset as well. Unity has quite a different story – from the outset it courted indie developers.

Unreal Engine (Courtesy Epic Games)

Unity is the brainchild of three developers – Nicholas Francis, Joachim Ante and David Helgason. The company was founded in Copenhagen and the engine began life as a graphics tool for Mac OS X. The team then recruited a diverse group of developers, all of whom brought their own ideas to the table, and the result was an engine applicable to a variety of use cases. Commercialising the product quickly became a priority, especially for use by indie developers, whose main pain point was having to reconstruct an engine for every new game concept. 

The creators of Unity felt that their engine would foster creators in the indie game space and democratise game development. They released their first version in 2005, and won an award from Apple for ‘Best Use of Mac OS X Graphics’ the next year. The developers soon provided support for Windows and browsers, and the engine grew more sophisticated, eventually allowing the founders to devote their attention full-time to what had essentially begun as a passion project. 

It was in 2008 that Unity skyrocketed in popularity, when it became the first engine to provide support for Apple’s new App Store – suddenly, a great many developers wanted to use Unity to make game apps for the App Store and the engine rapidly rose to prominence as iPhones became ever more popular. 

Unity 3.0 (2010) was an important inflection point as well – it featured Android support, provided advanced graphics features for desktop and console platforms, and a host of technical upgrades – essentially bringing high-end development tools to indie game makers at very affordable rates. Within two years, VentureBeat observed that ‘few companies had done as much for the flowering of independently produced games as Unity’. Unity has continued to deliver high-end tools to developers who would otherwise have no access to them, and by 2018, CEO John Riccitiello claimed that half of all the games on the market were made using Unity. 

Unity now supports over 25 platforms and has long been the darling of indie developers. It has an advanced physics engine, integrates with Autodesk asset-creation tools such as Maya and 3ds Max, and matches most of what UE offers – recently, it even introduced the Bolt system to compete with Unreal’s Blueprints. However, while even novice programmers can use Blueprints to develop the entire logic of a game, Bolt’s functionality does not extend that far.

Unity has a massive community of users, and learning it is a breeze because of the wealth of tutorials and guides available online – it is known for being beginner-friendly. Like UE, Unity is the engine behind a huge number of games and enables the creation of 2D, 3D, VR and AR content – it now powers lucrative mobile games such as the AR title Pokemon Go, and Call of Duty Mobile. 

According to a TechCrunch article, Unity didn’t seek to topple AAA game engines but succeeded in making – and selling – a product ideally suited to the budget and needs of independent studios. Unity’s market grew rapidly when it supported the App Store, and continues to expand thanks to the steady addition of new features. 

Though it first rose to prominence as an engine for iOS games, Unity has long been capable of producing AAA-quality titles across platforms. Both Unity 5 (2015) and Unity 2021 have introduced major updates to graphics, sound, lighting, animations and multiplayer, added visual effects such as volumetric fog and global illumination (which make environments more realistic), and integrated techniques such as deferred rendering (an optimised form of rendering that greatly improves framerates), all of which have contributed to the polish of new Unity games. 

The engine is still considered best for designing mobile games – the fastest-growing market in gaming – and was ranked as the most popular for mobile game development in a survey by Game Developer. By 2020, Unity had a 50% market share in mobile game development, and a 60% share in VR and AR game development. 

Given the overwhelming popularity of both Unreal Engine and Unity, they are the driving force behind a huge number of games – in fact, Unity has edged past Unreal as the most popular game engine because of its use by a growing number of small and medium-sized studios. Both Unreal and Unity come with very flexible licensing options, which have contributed to their widespread use in the gaming industry. As both Unity and Unreal are highly accessible, they are even taught in game design courses.

However, quite a few prominent games are made without using these two engines – major game studios built their own engines to make such titles. We will provide an overview of such engines in the following section. 

First-Party Game Engines

Like Unreal and Unity, first-party engines have powered very well-known games. However, these engines are inaccessible to the average user – developers can use them only if they work for the studio that built them, or partner with the studio or its parent company to develop games for it. Such first-party engines may even lack a name or a versioning system, and little can be found about the specific upgrades each version has received. For instance, it is known that Bethesda’s Creation Engine is a fork of the Gamebryo engine, but no public changelogs track the evolution of Gamebryo into the Creation Engine. 

Be that as it may, such first-party engines have powered some of the greatest games and game franchises ever, from Half-Life 2 (2004) and Skyrim (2011) to Battlefield, Call of Duty and Assassin’s Creed. In this section we will explore the first-party engines that have been critical in creating such titles and franchises.

id Tech

id Tech first debuted as the Doom engine in 1993 and led to a whole family of game engines based on id’s technology when id was still licensing its engine to other developers. Prior to id Tech 5, the engine had no official designation and was referred to either as the Doom or the Quake engine. The id Tech 5 version was an attempt to revive id’s engine licensing business and compete with the likes of Unreal, but then ZeniMax bought id Software in 2009 and decided to restrict the engine to id’s projects and those of sister studios owned by ZeniMax. Since id Tech 5, the engine has been proprietary, while previous versions have been released under the GPL licence. The latest version, id Tech 7, was used to make Doom Eternal (2020), and it is evident that the engine has come a long way from its predecessors.

Doom Eternal (Courtesy Bethesda Softworks)

Source

The engine behind some of the greatest games ever made, including Half-Life 2 and the Portal series, Source (2004) began life as an updated version of Valve’s GoldSrc engine, which was in turn a highly-modified version of the Quake engine. The Source engine is famous for its accurate simulation of physics and collisions.

Half-Life 2 (Courtesy Valve)

Source 2, which featured improved graphics, debuted with the release of Dota 2 Reborn (a remaster of Dota 2) in 2015 and then made a major mark with the release of the VR phenomenon Half Life: Alyx in 2020. Valve’s engine is now showing its age and its graphics look somewhat dated. Source lacks a version numbering scheme, and is improved with incremental updates.

Half-Life: Alyx (Courtesy Valve)

IW Engine

Created by developer Infinity Ward, this proprietary engine has powered the Call of Duty game franchise. Originally based on id Tech 3, the IW engine (2005) has received major updates at a steady rate, allowing CoD titles to have high-quality graphics and state-of-the-art features with each major release. Some have argued, however, that the latest CoD titles lack the quality of their predecessors, but even such naysayers were amazed by the next-gen graphics of Call of Duty: Modern Warfare (2019), which used a rebuilt version of the IW engine.

Call of Duty: Modern Warfare (Courtesy Activision)

Anvil

Anvil (2007) is Ubisoft’s proprietary engine, used primarily for the Assassin’s Creed games, a few Prince of Persia titles, and Tom Clancy’s Ghost Recon Wildlands (2017). Anvil is a powerful engine that made possible the open-world settings, traversal mechanics, and high-fidelity graphics of the original Assassin’s Creed in 2007, and continues to power Assassin’s Creed games, such as Origins (2017), Odyssey (2018) and Valhalla (2020). Consistent updates to Anvil have allowed Ubisoft to transition to fully open world settings for Assassin’s Creed games. 

Assassin’s Creed (Courtesy Ubisoft)

While Anvil was always the engine for the AC games, many of Ubisoft’s other titles were powered by Unreal Engine, until relations soured when Epic doubled the royalty fee Ubisoft had to pay. Ubisoft, in turn, decided to adapt Anvil for projects beyond the Assassin’s Creed franchise, and Ghost Recon Wildlands became the first title in the Ghost Recon franchise to be made using Ubisoft’s in-house engine. Anvil was also used to make the sports games Steep (2016) and Riders Republic (2021), and the proprietary engine will power the upcoming Prince of Persia: Sands of Time remake as well.

Frostbite

The Frostbite engine was first used in 2008 to create Battlefield: Bad Company – earlier Battlefield games were made using other engines. The engine has powered all Battlefield games since the release of Battlefield 3 in 2011. 

Created by EA developer DICE, Frostbite is one of the few first-party game engines to receive criticism from a developer that used it. EA subsidiary Bioware has publicly declared that the development of Dragon Age: Inquisition (2014), Mass Effect: Andromeda (2017) and Anthem (2019) was hamstrung because of EA’s directive to use Frostbite – an FPS-centric engine – for Bioware’s markedly different games. While Dragon Age: Inquisition did prove successful, Andromeda and Anthem tanked. 

Criticisms aside, Frostbite is still known for powering the games of major franchises such as FIFA, Madden and Battlefield, many of which have tight, annual release schedules.

Madden 23 (Courtesy Electronic Arts)

Creation Engine

The Creation Engine powered the development of Skyrim (2011), Fallout 4 (2015) and Fallout 76 (2018). A modified version of the Gamebryo engine, which was used to make Oblivion (2006), the Creation Engine enabled Bethesda to greatly extend the scope and breadth of Skyrim as compared to Oblivion, resulting in one of the greatest open-world games ever made. Bethesda also updated the engine to bring more realistic and high-quality graphics to Fallout 4. 

While the engine was able to meet the higher graphics standards for Fallout 4, Bethesda struggled to update the engine to support the multiplayer Fallout 76. During the making of Fallout 76, developers and QA testers were forced into crunch mode, attempting to create a multiplayer, live-service experience from a tool that had hitherto only powered single-player games. Fallout 76 had a disastrous start, in part because there were no NPCs at launch due to the Creation Engine’s limitations. The game has considerably improved since, but not without taking its toll on the staff involved.

Fallout 76 (Courtesy Bethesda Softworks)

Decima

Decima is a proprietary engine created by the Dutch developer Guerrilla Games and was first used for Killzone: Shadow Fall (2013), a PS4 launch title. The engine stood out because of the high-quality graphics featured in Killzone and was then used to make the PlayStation VR game Until Dawn: Rush of Blood (2016) – the engine is primarily associated with the development of PlayStation games. 

The engine was heavily modified to power Guerrilla Games’ ambitious Horizon Zero Dawn (2017) project, which featured a fully-open world teeming with animal automatons – a far cry from a ‘corridor shooter’ like Killzone, wherein players are funnelled along a narrow path from one game location to another.

Horizon Zero Dawn (Courtesy Sony Interactive Entertainment)

Throughout these years, Guerrilla had no official name for the engine, until they shared their tech with Hideo Kojima when he visited the studio – Kojima was then coming to terms with splitting from Konami and was profoundly grateful to Guerrilla for sharing the engine with him. Guerrilla, in turn, had great respect for Kojima and his team, and decided to name the engine ‘Decima’ after Dejima, a Japanese island where the Dutch and the Japanese traded during the 17th century. Kojima’s first game after parting ways with Konami – Death Stranding (2019) – was the result of a collaboration between Guerrilla and Kojima Productions, and both Horizon Zero Dawn and Death Stranding contain references to each other.

Conclusion

Since the ’90s, game engines have been the driving force behind just about every game released on the market. In our early history of game engines we see an unbroken line of innovation from John Carmack and John Romero to Tim Sweeney and the makers of Unity. We have also seen the steady rise to prominence of Unreal Engine and Unity as the two most popular game engines on the market, both having low barriers to entry. The number of games made using these engines is mind-boggling, and they are hence behind some of the best and worst titles ever made. 

Games made with an engine may vary in quality, but the game engine itself is a powerhouse that arguably allows developers to meet the cripplingly harsh annual release schedule for many major games, and also develop amazing titles when working under longer development cycles. By sparing developers the chore of reinventing the wheel, game engines not only make their lives easier, but also enrich the lives of gamers, who demand – and enjoy – games that constantly push the envelope in terms of graphics and gameplay. 

Gameopedia works with clients across the industry on custom requests and can provide in-depth data about game engines. Reach out to us for data that can yield novel insights about the software frameworks that power the creation of games.


Contemporary Trends in Online Multiplayer

In early 2012, a Kiwi soldier named Dean Hall released a mod for Bohemia Interactive’s tactical military sim Arma 2, creating an online multiplayer open world where players had to survive a zombie apocalypse. Named DayZ, the mod featured an unprecedented degree of realism – players had to eat, sleep and maintain a steady temperature, and the basic need to survive both the zombies, and the humans in the game world, became the sole focus of players. In DayZ you could either team up with others to stand a better chance of survival, or shoot and loot them for their gear, rations and medical supplies – if the game had any goal to speak of, it was to not die. 

The early 2010s continued the tradition of innovation in multiplayer – a trend we discussed in depth in our previous blog. In the 2010s, games such as Minecraft, GTA Online and Final Fantasy XIV would each offer their own spin on the multiplayer experience. Another trend to emerge in the first half of the decade was the shift to mobile multiplayer, where studios released innovative games that made meaningful use of mobile technology, such as Pokemon Go, with its augmented reality-based gameplay. The dominant trend in the latter half of the 2010s was the rise of the hero shooter and the battle royale, two genres that became wildly popular across platforms, from PC and console to mobile. Perhaps the most significant recent trend in multiplayer, however, has little to do with game development and everything to do with the state of our society – online multiplayer experienced tremendous growth during the pandemic lockdowns, which we will also discuss in this blog. 


2010-2016: Innovation and Mobile Multiplayer

In the first half of the 2010s, developers created innovative multiplayer games for consoles and PC, and also shifted toward multiplayer on mobile. These two trends – continued innovation in multiplayer and the shift toward mobile multiplayer – will be covered in this section.

Unique Multiplayer Titles Refresh the Genre

The two decades from 1990 to 2010 saw the emergence of genres native to multiplayer, but in the first half of the 2010s, developers would take multiplayer in directions that defied traditional expectations.

Minecraft (2011)

Minecraft (Courtesy Mojang)

Soon after the release of version 1.0 in 2011, Minecraft became a highly popular multiplayer game. Minecraft’s multiplayer is distinctive in that it allows players to collaborate on mining resources and work together to build increasingly complex and elaborate structures. While it has other multiplayer modes, Minecraft’s collaborative multiplayer was unlike anything seen before – no game had ever offered a mode where players simply worked together to build incredible structures, with no adversarial combat element at all. Minecraft does offer more traditional gameplay in Minecraft Realms, where you can team up with others and go on adventures, and in the PvP mode BedWars, where you defeat other players by destroying their respawn point – a bed. 


DayZ (2012)

DayZ (Courtesy Bohemia Interactive)

DayZ’s realism was unprecedented in gaming, let alone online multiplayer. Apart from having to eat and sleep, players were vulnerable to fractures, drinking contaminated water could cause cholera, and a zombie bite or bullet wound could send players into shock. The hostile conditions made for highly tense encounters with other players, who might simply choose to kill you, or decide on the spur of the moment to cooperate and team up with you. DayZ also features permadeath, meaning that no matter how much loot and experience you have accumulated, you restart from scratch when you die. This adds yet another layer of immersion to what is already a hyper-realistic survival sim. Games such as DayZ are referred to as PvPvE (player vs player vs environment) because they combine cooperation and competition set against challenges posed by the game world. 

Within the first three months of launch, DayZ had a million unique users and boosted sales of Arma 2, the base game required to play the mod, by 500 percent, leading the CEO of Bohemia Interactive to acknowledge that DayZ was the primary driver of Arma 2 sales. 

Bohemia Interactive soon hired Dean Hall to create a standalone version of the game, which was released on Steam Early Access in 2013; the alpha version sold a million copies by 2014. DayZ also influenced games like Rust and ARK: Survival Evolved, and had a lasting influence on the design of survival games.


Dota 2, GTA Online and Final Fantasy XIV: A Realm Reborn (2013)

In the span of a single year, gamers got three of the most enduringly popular online multiplayer games: the MOBA Dota 2, the MMORPG Final Fantasy XIV: A Realm Reborn, and GTA Online, which can be considered an evolution of what MMO experiences could offer, with its plethora of activities and challenges.

Valve released Dota 2 in 2013 and went on to host the most lucrative esports tournament in the world – The International – which now boasts a staggering prize pool of over $40 million. Valve took an innovative approach to funding the free-to-play game’s premier esports event, crowdfunding it through the sale of skins, cosmetic upgrades and battle passes. Dota 2 is hailed as one of the most complex, balanced and challenging MOBAs, and was ranked the best PC multiplayer game by IGN in 2013. 

Dota 2 (Courtesy Valve)

Set in the vast open world of Los Santos, GTA Online allows players to do just about anything they want, and Rockstar keeps releasing updates that expand the activities players can engage in, introduce quality-of-life upgrades, and even add single-player missions. In GTA Online, you can take part in vehicle races, heists and casino trips, run a criminal enterprise, buy homes, go to flight school, steal exotic cars, run around the city… and more – it’s a game containing countless games.

GTA Online (Courtesy Rockstar Games)

Final Fantasy XIV’s first iteration (2010) was a disaster, but Square Enix decided to resurrect the game rather than abandon it. The result, released in 2013, is now a major MMORPG. Naoki Yoshida, tasked with reinventing FFXIV, streamlined many of its MMORPG elements: the developers set a low level cap, inviting players to keep playing in search of loot and resources; the armoury system allowed players to change classes on the fly by equipping certain items; and Yoshida pushed hard to bring high-quality graphics to the MMO. The game also receives regular updates every 3.5 months, a cadence that keeps it fresh for gamers. Eurogamer called it one of the biggest games in the world as of 2022.

Destiny (2014)

Destiny (Courtesy Activision)

Destiny received quite a bit of criticism when it launched, and it took a novel approach to addressing those critiques – listening to players. Unlike other games, Destiny’s patches, updates and expansions were direct responses to player feedback, with Bungie acting on what the community wanted rather than setting an update schedule purely based on its own agenda for the game. On its release, Destiny called itself a ‘shared-world shooter’ – the PvE element constitutes the majority of the game, while the PvP zones are equally appealing. Destiny also features a unique networked mission architecture, something that Bungie has elaborated on in detail.

Destiny achieved a seamless blend of single-player, co-op and multiplayer, using a persistent world made up of public and private spaces. Multiplayer events might be triggered in a public space, or a friend might join in on co-op, while private spaces lock players into campaign goals. The two spaces flow together without interrupting the story. 

Destiny made nearly $500 million in pre-orders and day-one sales, amassed over 20 million players within a year of release, and was the best-selling new IP of 2014.


Rainbow Six Siege and Rocket League (2015)

On paper, Rocket League’s premise seems absurd – soccer (football) matches between player-controlled vehicles. After a low-key initial launch, the game gained a massive following thanks to its fast-paced, intensely competitive gameplay and sustained developer support. It was also offered for free on the PlayStation Plus service for about a month, increasing its visibility and making it the most downloaded PS4 game of 2016. The game received the Best Sports/Racing Game award at The Game Awards in 2015, and when it went free-to-play in 2020, it crossed one million concurrent players.

Rocket League (Courtesy Psyonix)

Rainbow Six Siege had a rough start but turned things around to become an important multiplayer FPS emphasising strategy, taking a cue from Arma 3’s tactical elements. Ubisoft achieved one of the industry’s most impressive turnarounds by adopting a games-as-a-service model for Siege, releasing a slew of content updates and patches that brought the game up to scratch and eventually garnered 25 million users. The game also maximised its appeal by morphing into a hero-based shooter, giving playable characters unique abilities. Rainbow Six Siege is now a major esport.

Rainbow Six Siege (Courtesy Ubisoft)

The Shift toward Mobile Multiplayer

Mojang realised early on that Minecraft would work very well as a mobile game and released the Pocket Edition in 2011. The mobile title had mostly the same feature set as the PC game and would go on to become one of the top-grossing mobile game apps. The shift to mobile multiplayer had already begun.

Clash of Clans (2012)

Clash of Clans (Courtesy Supercell)

One of the earliest successful mobile multiplayer games was Clash of Clans, which offered complex team-based gameplay on mobile devices. In its persistent world, the player is a village chief: raiding other villages for resources, unlocking new troops and bolstering your own village’s defences against attacks form the core gameplay. Players can also team up to form clans (of up to 50 players) and battle other clans, chat with friends and more. In 2021, Clash of Clans generated nearly $490 million in in-app purchase revenues, and it remains the second most popular game by daily user counts in the US.

Hearthstone (2014)

Hearthstone (Courtesy Blizzard)

The free-to-play PC and mobile game Hearthstone: Heroes of Warcraft achieved an unexpected degree of success, proving that a digital collectible card game could be just as successful as those played with physical cards. Blizzard used the lore, characters and other elements of the Warcraft franchise to full effect to create a fun, fast-paced card game with eye-popping graphics. The developers worked hard to recreate the feel of a real card game in the user interface, digitising assets from the earlier physical World of Warcraft Trading Card Game. The game’s success is attributable to faithfully adapting a traditional deck-building experience to a digital environment, keeping matches short, offering a variety of match types, releasing regular expansions with new cards, and even letting players admire their card collection with special views. Hearthstone reached 100 million players by 2018, had a user base of 23.5 million by 2020, and nearly 4 million players play it across platforms as of 2022. It has made more than $700 million since launch and has its own esports scene as well.

Honor of Kings (Arena of Valor, 2015)

Honor of Kings (Courtesy Tencent Games)

The mobile MOBA (multiplayer online battle arena) Honor of Kings is one of the most successful mobile games of all time, registering 100 million daily active users, becoming the first mobile game to make $10 billion in revenue, and reigning as the leading mobile game app in China. Ironically, it might never have come about if Riot Games had agreed to parent company Tencent’s request for a mobile version of League of Legends. The LoL creators did not want to dilute the game’s brand with a mobile knock-off, so Tencent turned to another subsidiary, TiMi Studio Group, for a new mobile MOBA IP. The result was Honor of Kings, a MOBA that features multiple competitive modes, a PvE mode and even a standalone mode for when the player is offline. An international version, known as Arena of Valor, was released for Western regions in 2016, with greatly altered heroes to fit the target market. The MOBA has many key virtues – you can easily set up battles with friends, and the fast-paced stand-offs last just 15-20 minutes. The game had all the key MOBA elements but was still easier to master than Dota 2 or League of Legends, creating a low barrier to entry, and its seamlessly integrated social elements kept players engaged with each other. The massive success of the MOBA led Riot Games to reassess its stance on a mobile version of LoL – it released League of Legends: Wild Rift, a modified version of LoL, for mobile in 2020. Since late 2021, Wild Rift has been drawing in about 15-20 million players each month.


Clash Royale and Pokemon Go (2016)

Clash Royale (Courtesy Supercell)

Mobile multiplayer fans would get their hands on not one, but two innovative mobile titles in 2016 – Supercell’s follow-up to Clash of Clans, Clash Royale, and Niantic’s revolutionary Pokemon Go, which used augmented reality as the basis of its gameplay. 

A gamesindustry.biz article asserts that Clash Royale’s innovative gameplay powered it to displace lacklustre titles from the top of the mobile grossing charts. Clash Royale cleverly combined aspects of tower defence, MOBAs and card-based battles to become one of the top-grossing games in the world within a month of its release. It offers accessible but deep gameplay, fast synchronous multiplayer matches that last only minutes and end in nail-biting stand-offs, and well-integrated social elements. It is hailed as a smart game that rewards strategy and delivers a complex, tactical experience on a small screen. The game crossed $3 billion in lifetime player spending by 2020 and, as of 2021, has been downloaded more than 500 million times.

While many of the games listed here brought interesting genres to mobile, Pokemon Go is altogether a different beast. Niantic used augmented reality to overlay Pokemon on real-life locations, visible to players through their phone camera; a swipe of a ‘Poke Ball’ would ‘capture’ the Pokemon. One of the biggest selling points of Pokemon Go was that players had to step outside to capture Pokemon – it was probably the first game to actually take place in real-life settings. The game garnered more than 100 million players on mobile phones within a month of its release.

After reaching a certain level, players can experience the game’s multiplayer aspects – they can battle at a Pokemon Gym and join one of three colour-coded teams: red (Valor), blue (Mystic) and yellow (Instinct). The three teams vie for control of the strategic Pokemon Gyms around the world – not only do the Gyms host raids, they also allow the Pokemon you station there to earn coins, which can be spent on upgrades and items at the in-game store. Updates to the Gym mechanic brought cooperative raiding and the chance to take down powerful Pokemon together, and the game continues to receive updates.

Pokemon Go (Courtesy Niantic)

Pokemon Go peaked at 232 million active players in 2016 and is still going strong – 71 million people played it in 2021, and it has been downloaded over 500 million times. The game had amassed $5 billion in lifetime revenue by 2021, much of it from the United States.

2016-2020: The Rise of Hero Shooters and Battle Royales

The first half of the decade saw the release of online multiplayer games so unique that it is difficult to imagine them sharing the same demographic. Of course, dedicated multiplayer fans would have played all these games to fully enjoy the variety on offer. In the latter half of the 2010s, online multiplayer would be characterised mostly by the hero shooter and battle royale genres, though some unusual games, such as Sea of Thieves and Among Us, would also make their mark. 


The Rise of the Hero Shooter

Overwatch (2016)

Overwatch (Courtesy Blizzard)

Overwatch has been imitated by multiple developers, and for good reason. A roster of distinctive hero characters is not new – Team Fortress (1996) also featured unique characters who performed different functions during competitive missions – but Overwatch took that concept and added MOBA elements to fashion the hero shooter as we know it today. The heroes of Overwatch have outrageous powers, and going one-on-one against any of them would be difficult, but the 6v6 structure keeps the game accessible: each team brings its own ridiculously overpowered band of heroes into play, each with special attacks that are easily learned by watching guides. Team-based play revolves around characters combining their powers in coordinated moves. Winning in Overwatch hence entails an intimate knowledge of the strengths and weaknesses of the 32-character roster, knowing which hero can totally neutralise an opponent’s pick, and managing your own weaknesses – your opponent knows as well as you do that every hero has a counter. Within a year, Overwatch had made more than $1 billion and had 30 million registered players, becoming Blizzard’s fastest-growing franchise. As of 2022, Overwatch draws in about 8-9 million players each month.


Multiple Games Adopt the Hero Shooter Formula

We noted above that Rainbow Six Siege grew in popularity once it became a hero shooter like Overwatch. But Siege was hardly Overwatch’s only imitator. According to a PC Gamer article, hero shooters have ‘become the de facto mould of what multiplayer shooters should look like in 2022.’

Rainbow Six Siege was one of the first games to adopt the hero roster formula, adding mercenaries and criminals to its array of ‘Operators’, and its roster now features far more female and trans characters. Each Operator, like a hero from Overwatch, has special abilities and even plays a critical role in PvE experiences. The game’s use of Operators ‘with more flair’ has been credited as one of the reasons for its turnaround after a bad launch. 

Even games that have never been hero shooters have taken cues from Overwatch. The latest iterations of Call of Duty have featured distinct playable characters, while Battlefield 2042 changed the series’ anonymous classes into unique personalities. Even CS:GO, one of the most traditional first-person shooters, now has unlockable skins that let you enter battle as a favourite agent.

Apex Legends would combine the hero shooter with the other prominent genre of the late 2010s – the battle royale. We will deal with Apex Legends in our discussion of the battle royale genre below.

Valorant (2020)

Valorant (Courtesy Riot Games)

The hero shooter Valorant was Riot Games’ answer to titles like CS:GO, Rainbow Six Siege and Overwatch – a highly accessible tactical shooter built around 5v5 matches in which one team tries to plant a bomb known as the ‘Spike’ and the other works to stop them. The first team to win 13 rounds is the victor. In unrated games, if both teams have 12 wins after 24 rounds, the 25th round serves as a ‘sudden death’ tie-breaker. In competitive games, if scores are tied after 24 rounds, a team has to win two consecutive rounds to secure victory.
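The round-win rules described above boil down to a small piece of logic, sketched below in Python. This is an illustrative simplification (the function name and signature are our own, not anything from Riot), but it captures the first-to-13, sudden-death and win-by-two conditions:

```python
def match_over(score_a: int, score_b: int, competitive: bool) -> bool:
    """Return True once a Valorant-style match has a winner.

    Simplified sketch: first to 13 rounds wins. In unrated play a
    12-12 tie goes to a single sudden-death round, so reaching 13
    always ends the match. In competitive play, a 12-12 tie triggers
    overtime, and a team must lead by two rounds to win.
    """
    leader = max(score_a, score_b)
    trailer = min(score_a, score_b)
    if leader < 13:
        return False          # nobody has reached the target yet
    if not competitive:
        return True           # unrated: reaching 13 wins, even at 13-12
    return leader - trailer >= 2  # competitive: win by two past 12-12
```

For example, `match_over(13, 12, False)` ends an unrated match, while `match_over(13, 12, True)` does not – the competitive match continues into overtime until one team leads by two.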

Valorant differs significantly from the games it draws inspiration from. Unlike the heroes of Overwatch, none of Valorant’s agents will survive a critical shot. Each player is an agent with distinct abilities – one built in, two others that can be bought, and an ‘ultimate’ that charges over multiple rounds. Valorant is a traditional shooter in that kills depend on your aim and skill with a weapon, but the special abilities impart intel, create killing zones and can even blind opponents, giving players a better chance at scoring kills. As such, mastering the game and winning the best of 25 rounds depends on both your shooting abilities and the skilful deployment of special powers. This combination makes for gameplay that blends control and chaos, and reviewers praised the game for breaking new ground. By January 2021, the game had overtaken CS:GO in earnings; it is now a major esport and has drawn in nearly 20 million players per month over the last year. 

Battle Royale takes Centre-Stage

Hero shooters such as Overwatch enjoyed massive popularity – until a new genre, the battle royale, began to eclipse them. One of the biggest phenomena in gaming – Fortnite – is a battle royale that has since morphed into a metaverse-like experience. We discuss the important battle royale games below.

PUBG: Battlegrounds and Fortnite (2017)

PUBG Mobile (Courtesy Tencent Games)

Like so many genres before it, the battle royale got its start as a mod – in fact, as a mod for a mod. DayZ fan Brendan Greene (known as PlayerUnknown) initially released a mod for DayZ in which players were thrown into a shrinking map and had to kill each other until one player remained. He then used Arma 3’s resources to create a total conversion built around battle royale, with an aircraft that dropped players onto a large map to fight it out. In 2016, Krafton invited Greene to create a standalone version, which became PUBG: Battlegrounds, earlier known as PlayerUnknown’s Battlegrounds. 

The simple premise of PUBG (and other battle royales) may well be the reason for its enduring popularity. A number of players (up to 100 in PUBG) are dropped into an area with no weapons and must hunt for arms in order to go up against other players. Weapons and other items can also be looted from killed opponents, and the playable area shrinks every few minutes, forcing players closer together and increasing the frequency of PvP encounters. Players can enter the drop zone individually or as teams, and must also choose the right moment to parachute from the aircraft. As players kill each other off, the last player or team standing wins. 
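That loop – drop in, shrink the zone, eliminate players until one remains – can be sketched in a few lines of Python. Everything here (the `Zone` class, the shrink rate, the `eliminate` callback) is a toy illustration of the genre’s structure, not anything taken from PUBG’s actual code:

```python
from dataclasses import dataclass


@dataclass
class Zone:
    """The shrinking safe zone that forces survivors together."""
    radius: float
    shrink_factor: float = 0.7  # illustrative shrink rate per phase

    def shrink(self) -> None:
        self.radius *= self.shrink_factor


def play_match(players, eliminate):
    """Run shrink phases until one player (or team) remains.

    `eliminate` is a callback: given the survivors and the current
    zone, it returns the players knocked out this phase.
    """
    zone = Zone(radius=1000.0)
    while len(players) > 1:
        zone.shrink()  # the map closes in, raising PvP frequency
        for player in eliminate(players, zone):
            players.remove(player)
    return players[0]
```

The shrinking radius is what gives the genre its pacing: early phases are about looting, while later phases guarantee confrontation because the survivors simply run out of room.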

At over 75 million copies sold for PC and consoles, PUBG is the best-selling game on PC and Xbox One, and the fifth-best-selling game of all time. Since 2020, the game has been drawing in a staggering 300-400 million players each month.

Fortnite Battle Royale – one of the most popular games in the world, and a cultural phenomenon in its own right – took direct inspiration from the success of PUBG. Epic Games realised that it could create a battle royale version of Fortnite with ease and released one after two months of in-house development. It had much the same gameplay as PUBG, though it added a building mechanic that let players construct structures to fend off enemy attacks and traverse the map. Within a year of its release, the free-to-play game had 125 million users and was making huge sums from microtransactions – more than $9 billion by 2019, and $5.8 billion for Epic in 2021 alone. It has been drawing in more than 250 million players a month since 2020.

Fortnite is more than just a successful game – it transcends gaming with its live concerts, crossover events, skins from other important media franchises and more. Its huge registered player base of 400 million allows it to experiment with metaverse-like experiences, as we have discussed elsewhere.

Fortnite (Courtesy Epic Games)

Battle Royale Becomes a Craze

The massive success of the battle royale mode spawned a number of imitators in both the console/PC and mobile spaces. The first IP to jump on the bandwagon was Call of Duty, which introduced a battle royale mode called ‘Blackout’ in Call of Duty: Black Ops 4 (2018). Supporting up to 100 players, Blackout took place in the largest map yet seen in a CoD title and made full use of the franchise’s fluid movement and controls to create a fast-paced battle royale that drew in even players already tired of the genre.

Not to be outdone, Electronic Arts introduced a battle royale mode called ‘Firestorm’ to Battlefield V (2018), and, like Activision, featured the battle royale in the biggest map to date in a Battlefield game. The battle royale mode supports 64 players, who can compete in squads of up to four players.

EA then followed this up with a new IP (based partly on Titanfall) that combined elements of the battle royale and the hero shooter in one addictive package: Apex Legends (2019). Apex Legends is a gorgeous game with an incredibly detailed map – knowing its ins and outs confers a significant tactical advantage. It introduced many other innovations, such as dropships, care packages (loot drops) and highly efficient non-verbal ping-based communication between teammates. Each hero brings a distinctive playstyle to player squads, and a steadily growing roster keeps the game fresh. In a little over two years, it had 100 million players, and it has made more than $2 billion as of mid-2022.

Activision responded with Call of Duty: Warzone (2020), in which the battle royale mode predominated. While it featured much the same elements as the battle royale mode in Black Ops 4, it also encouraged players to amass Cash – an in-game currency – to buy killstreaks and special items, and allowed up to 150 players in the free-for-all, in teams of up to four. Within a year, the game had 100 million users, and Warzone made nearly $4 billion in the first two years following its release. 

Mobile gaming, too, is now characterised by highly successful battle royale IPs: Fortnite, which began life as a cross-platform game released for PC, consoles and mobile; PUBG Mobile (2018), which crossed $7 billion in lifetime revenue by 2021; and Garena Free Fire (2017), which became the most downloaded mobile game of 2019 and has made more than $4 billion in the two years since its release. 

Even a Tetris game on the Nintendo Switch – Tetris 99 (2019) – has battle royale elements, as does the open-world racing sim Forza Horizon 5 (2021), which offers the ‘Eliminator’ battle royale mode. 


New Styles Emerge, Old Ones Return

The second half of the decade was not just an unbroken series of hero shooters and battle royales – those were merely the most popular genres around. 

2018 saw the release of three innovative multiplayer titles: A Way Out, Among Us and Sea of Thieves. The first featured only a two-player cooperative campaign, with no single-player version, and could be played in couch co-op or over the internet with a friend. A Way Out’s developer, Hazelight Studios, released another co-op game, It Takes Two, in 2021, building on its formula of telling a compelling story purely through multiplayer. 

Sea of Thieves is a rollicking pirate adventure with sea battles between player crews, and it went on to become highly popular, holding its own against battle royales and hero shooters at the height of their popularity. It sold five million copies on the PC platform Steam by 2021 and has drawn in 15-17 million users per month since 2020.

The multi-platform game Among Us (2018) was yet another innovative title, featuring asymmetric multiplayer – the game pits a team of crewmates against a smaller team of impostors, who look alike and work in the same area. The crewmates must complete all tasks in the allotted time or vote out all the impostors, while the impostors must sabotage crewmate activities, kill crewmates without being detected, or unleash a disaster that the crewmates cannot resolve in time. The game was largely ignored on release until pandemic shutdowns produced a massive spike in user counts – it amassed nearly half a billion players in 2020 and has drawn in nearly 400 million players per month since late 2020.
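What makes the multiplayer asymmetric is that the two sides have different win conditions. A minimal sketch of those conditions, in Python – the function names are our own, and the rule that impostors win on reaching numerical parity with the crew is the commonly cited version, not something stated in the text above:

```python
def crew_wins(tasks_done: int, total_tasks: int, impostors_left: int) -> bool:
    # Crewmates win by finishing every task or ejecting every impostor.
    return tasks_done >= total_tasks or impostors_left == 0


def impostors_win(impostors_left: int, crew_left: int,
                  sabotage_unresolved: bool) -> bool:
    # Impostors win via a sabotage the crew fails to resolve in time,
    # or by killing until they match the crew's numbers (parity rule).
    if impostors_left == 0:
        return False
    return sabotage_unresolved or impostors_left >= crew_left
```

The asymmetry is the whole game: each side races toward its own condition while trying to deny the other’s.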

The free multiplayer mode of Halo Infinite (2021), replete with classic Halo multiplayer elements and the new grapple-shot mechanic, became the most popular Xbox title on Steam within less than a day of its launch. In about a month, nearly 20 million players had joined the fray. 

Games like Escape from Tarkov (beta release 2017) and The Cycle: Frontier (2022) also indicate the emergence of a new multiplayer playstyle, where players are dropped into common zones but can choose freely between co-op and PvP – there is no need to be the last one standing, as each mission has its own objective quite apart from killing other players. Like DayZ, Tarkov and The Cycle are PvPvE games – both the environment and other players pose a challenge, but you can cooperate with the latter. 

The battle royale and hero shooter might be the biggest players in town, but game developers appear to have outgrown the need to copy the two genres.


The Pandemic and Multiplayer

Online multiplayer games – for PCs, consoles or mobile – draw in millions of players every month. It might be tempting to criticise them as addictive time-sinks, but the popularity of multiplayer during the pandemic tells a wholly different story. 

According to the BBC’s Life Project, online multiplayer became a social lifeline during the pandemic lockdowns and gamers successfully built supportive communities around the games they loved, forging strong friendships. Playing with friends online has also been studied as a healthy replacement for in-person contact when lockdowns prevent such interaction. 

Existing friendships have thrived and people have actually grown their network of friends during the pandemic via multiplayer, social gaming and connecting over the gamer-focussed Discord, a VOIP and instant-messaging platform – gaming gives people a way to share fun, light-hearted experiences during dark times. As a result, people have reported overwhelmingly positive experiences from gaming, especially thanks to its potential for social interaction via multiplayer. In fact, dedicated MMO players have reported feeling a strong sense of social identity, a higher sense of self-esteem and decreased feelings of loneliness even before the pandemic.

VentureBeat attributes the rapid evolution of the social aspects of mobile gaming to the pandemic, because such features allowed people to stay connected while socially distanced, as they played inexpensive but interesting mobile games together.

It is no surprise, then, that the gaming industry registered record gains during the pandemic, growing 12% to $139 billion in 2020 amidst widespread lockdowns. Despite a contracting PC and console market in the post-pandemic period, the overall industry is poised to grow at a compound annual rate of 11% through 2024 to hit a record $200 billion in worth.


Conclusion

Online multiplayer has grown massively this decade, with games supporting millions of players across platforms and mobile multiplayer coming into its own. While the pioneers of previous decades had to innovate just to make multiplayer viable (remember the code optimisations for QuakeWorld and Unreal Tournament), the developers of this decade have worked hard to make the most of a matured internet infrastructure. 

After several innovative entries in the multiplayer genre, and the prominence of hero shooters and battle royales such as Overwatch, PUBG and Fortnite, we are seeing an increase in variety, with titles such as The Cycle, Among Us and the instant success MultiVersus, which crossed 20 million players within a month of launching its open beta. 

The success of such games suggests that new online multiplayer games can offer a wide range of experiences and maintain huge fan bases – not participating in a zero-sum game for the players’ attention, but adding to an overall online multiplayer experience far greater than the sum of its parts. 

Gameopedia works with clients across the industry on custom requests and can provide in-depth data about online multiplayer games. Reach out to us for data that can yield novel insights about the billion-dollar online multiplayer gaming market.


The Rise of Online Multiplayer

Multiplayer games have always been a major aspect of gaming – whether played over the internet, across local area networks, or in-person via co-op modes. Such games are among the most popular titles in gaming, many have become massive esports, and multiplayer gaming itself has evolved steadily over the years to become increasingly complex and nuanced. In this blog we will discuss the history of multiplayer gaming from the 1980s to 2010, with a special focus on the two decades from 1990 to 2010, when multiplayer evolved rapidly and matured into a staple of the gaming experience.

 

What is Online Multiplayer and Why it’s Important

A multiplayer video game can be played by more than one person in the same game environment simultaneously – either locally, on the same computing system, or through networks shared by multiple systems. Online multiplayer refers to games played over the internet and networked multiplayer refers to games played on different systems through a local area network. In a multiplayer game, players or teams of players can compete with each other or cooperate toward a common goal. Multiplayer games involve a social element not found in single-player titles and can also offer a higher level of challenge as compared to playing against AI.

Modern multiplayer games often share certain common characteristics – various ‘modes’, which may involve competition or cooperation, a progression system with ‘unlockables’, a steady stream of new content (though this is more applicable to live-service games in general), a system by which players can communicate using voice and/or text, a dedicated server or a single terminal hosting the game, and more. 

A look at Steam stats reveals that multiplayer games are among the most-played, with thousands of daily users. There were 932 million online gamers by 2020 and as of 2022, 54% of the most active gamers worldwide play multiplayer games at least once a week, for seven hours on average. As of 2022, 83% of US gamers play with each other, either in person or online. In comparison, the figure was 65% in 2020. This spike is attributed to Covid-period lockdowns – in fact, the multiplayer game Among Us (2018) surged in popularity during the pandemic, amassing a user base of nearly half a billion players. The global online gaming market size was valued at $56 billion in 2021 and is expected to grow to $132 billion by 2030, at a compound annual growth rate of 10.2%. In the following sections, we will delve into the history and evolution of multiplayer and its rise to prominence among gamers.

 

The Origins and Early History of Online Multiplayer

The early years of online multiplayer saw the advent of multi-user dungeons, or MUDs, where multiple players engaged in text-based games through typing commands. This was followed by the arrival of the multiplayer FPS in the ’90s – legendary games such as Doom and Quake not only pioneered the FPS genre as we know it today, but also created multiplayer modes that allowed gamers to team up with or fight against each other. Over the course of the decade, as the internet became commonplace, many MMORPGs came to the fore, whose graphics brought to life the text-based experience of the early MUDs.

 

Early Years: The Multi-User Dungeon and the Internet

In 1978, two University of Essex students, Roy Trubshaw and Richard Bartle, created a multiplayer adventure that they would call ‘MUD’, or multi-user dungeon. The text-based game was a revelation that allowed you to live in a persistent fantasy world through the networked computers of the institute. A persistent world is a virtual environment that changes dynamically even when the player is logged off. The world continues to exist on the network, enabling other players to continue playing, thereby presenting new activities to any player who logs back in after a certain interval. Bartle and Trubshaw’s text-based world may not have had any graphics to speak of, but its dynamic persistent world gave the fantasy environment a life of its own, independent of player actions.

The Multi-User Dungeon Interface

A MUD can be text-based or may use storyboards to flesh out its world. MUDs combine elements of role-playing games, interactive fiction and online chat to create a real-time virtual world where players interact with the game world and each other using text-based commands. Bartle and Trubshaw’s game counts as one of the first of its kind and was also the first MUD to be playable on the internet, when the university connected its internal network to the ARPANET. Multi-user dungeons with persistent worlds would influence the MMORPGs to come.

In 1985, University of Virginia classmates John Taylor and Kelton Flinn created a MUD-like game called Island of Kesmai, a multiplayer adventure that used ASCII-based graphics. Considered a direct forerunner of subsequent MMORPGs, the game was available on the early CompuServe online service and allowed up to 100 players to play simultaneously.

 

The Emergence of Multiplayer FPS

Multiplayer gaming over networks came into its own in the 1990s, with the release of major first-person shooter titles such as Pathways into Darkness and the legendary Doom in 1993. The games’ multiplayer modes led to the birth of the LAN party – people coming together and creating a local area network to play multiplayer games together.

Doom was not just a revolutionary game – it also pushed multiplayer in new directions. The game supported four-player matches over a local network and two-player games over a modem or serial link, and players could either cooperate in Doom’s main campaign or fight each other in deathmatch mode. The DWANGO dial-up matchmaking service later brought these matches online, making Doom one of the first games in the world to offer online multiplayer via a matchmaking service. It was also a highly popular LAN party game, along with Pathways into Darkness and Marathon (1994), another first-person shooter. Multiplayer over LAN would remain prominent until the internet became more widespread and ushered in online multiplayer on a large scale.

Doom (Courtesy id)

The same year Doom was released, CERN made the software for the World Wide Web freely available – a move that would eventually result in our world of browsers, email, streaming services and internet-based multiplayer games played by millions of people simultaneously. Just three years later, id would release Quake, a major milestone in online multiplayer gaming.

Quake was not only one of the first games to feature full real-time rendering of 3D environments, with support for hardware 3D acceleration arriving in the later GLQuake port, it was also among the first to enable online multiplayer over TCP/IP. Multiplayer was easier than ever before: all one had to do was enter an IP address and connect with a friend or a server over the internet to play cooperatively or competitively. The multiplayer mode ran on dedicated servers, but Quake also allowed players to turn their own machines into custom servers.

In December 1996, id released QuakeWorld, an update to the Quake engine that introduced a network optimization called client-side prediction, enabling an online experience comparable to single-player even for players on high-latency connections. An IGN article describes the QuakeWorld update as the first successful large-scale implementation of online multiplayer. In 1997, id hosted a nationwide esports tournament in the US called Red Annihilation, featuring Quake; the winner, Dennis ‘Thresh’ Fong, took home a 1987 Ferrari 328 GTS cabriolet that belonged to John Carmack, the wizard behind Quake, Doom and many of id’s hit IPs.
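The idea behind client-side prediction can be sketched in a few lines: the client applies its own inputs immediately instead of waiting a full network round trip, and when the authoritative server state arrives, it snaps to that state and replays any inputs the server has not yet acknowledged. The sketch below is purely illustrative – the class and method names are invented for this example and bear no relation to QuakeWorld’s actual code.

```python
# Illustrative sketch of client-side prediction (not QuakeWorld's actual code).
# The client simulates its own inputs right away, then reconciles with the
# authoritative server state when it arrives, replaying unacknowledged inputs.

class PredictedClient:
    def __init__(self):
        self.position = 0.0   # locally predicted position (1D for simplicity)
        self.pending = []     # inputs not yet acknowledged by the server

    def apply_input(self, seq, move):
        """Apply a move locally at once instead of waiting a round trip."""
        self.pending.append((seq, move))
        self.position += move          # instant local feedback

    def on_server_update(self, ack_seq, server_position):
        """Snap to the authoritative state, then replay unacked inputs."""
        self.position = server_position
        self.pending = [(s, m) for s, m in self.pending if s > ack_seq]
        for _, move in self.pending:
            self.position += move

client = PredictedClient()
client.apply_input(1, 2.0)
client.apply_input(2, 1.5)
# The server has only processed input 1 so far and reports position 2.0:
client.on_server_update(ack_seq=1, server_position=2.0)
print(client.position)  # 3.5 – input 2 is replayed on top of the server state
```

The key property is that the player sees their own movement instantly, while the server remains the final authority – exactly the trade-off that made dial-up Quake feel playable.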

QuakeWorld (Courtesy id)

Multiplayer Grows in Variety

The late ’90s saw the arrival of yet another hugely popular multiplayer genre – the massively multiplayer online role playing game. In an MMORPG, players adopt the role of a character with distinct abilities, traits and weaknesses and take part in a huge persistent world filled with thousands – even millions – of concurrent players. Progression is a key aspect of the MMORPG, where player actions earn them points that they can then use to level up their skills. Like classic MUDs, the world of an MMORPG continues to change even when the player is offline. 

MMORPGs such as Meridian 59 (1996), Ultima Online (1997) and EverQuest (1999) emerged as internet technology matured in the ’90s. Still called ‘graphical MUDs’ in a nod to their origins in the MUDs of the ’80s, they featured persistent worlds with real interactions between online players and made their mark as a new genre of online multiplayer.

While Meridian 59 and Ultima Online helped establish the MMORPG, EverQuest built on the genre’s potential. The game offered a great degree of player choice, a huge world ready to explore and (for its time) high-quality graphics. The title boasted 10,000 active subscribers 24 hours after its launch, and within the year, it had 150,000 active subscribers. EverQuest continues to be played, with a user base of 66,000 subscribers.

EverQuest (Courtesy Sony Online Entertainment)

Around the same time, the real-time strategy (RTS) title StarCraft (1998) emerged as a major online multiplayer game. Blizzard’s StarCraft built on the popularity of the strategy genre, which had successful franchises such as Age of Empires, Command & Conquer and WarCraft, as well as the turn-based Civilization series – StarCraft would introduce sophisticated multiplayer gameplay to the genre.

StarCraft’s multiplayer mode was facilitated by Blizzard’s Battle.net, a free game hosting and matchmaking service that helped StarCraft – and other Blizzard games – reach huge audiences. By the time StarCraft’s expansion, Brood War, was released, the title had become a phenomenon in South Korea, which accounted for a third of StarCraft’s global sales, and spawned a professional esports scene that was broadcast over South Korean media. StarCraft’s masterful game balancing and potential for complex strategies enhanced its multiplayer greatly, allowing for immense variation in gameplay. The game’s success led to increased usage of Blizzard’s Battle.net service, which hosts tens of millions of active players across Blizzard’s library of games today.

The FPS genre then made a resounding comeback in 1999 with the release of Epic’s Unreal Tournament and id’s Quake III Arena, both of which made multiplayer the main gameplay mode. The first Unreal game, released in 1998, did not deliver good multiplayer, and improving the multiplayer code became a top priority, with Epic CEO Tim Sweeney even apologising for Unreal’s poor multiplayer. Epic intended to deliver the improved multiplayer as an expansion for Unreal, but then decided to make a standalone, multiplayer-focussed instalment called Unreal Tournament, which was hailed as one of the best multiplayer games of the year, along with Quake III Arena. Both games dispensed with plot-based single-player campaigns and featured single-player modes that merely pitted players against bots. Even now, critics cannot decide which offers the better experience – both are incredible in their own ways. The Unreal Engine, used to build Unreal Tournament, would go on to become an industry-standard game engine that would spawn a host of award-winning titles.

Unreal Tournament (Courtesy Epic Games)

Another important FPS multiplayer game released in 1999 was Counter-Strike, a Half-Life mod that would later be purchased by Valve after the title became a staple of LAN events and a hugely successful multiplayer experience. Like StarCraft, Counter-Strike would spawn its own esports scene.

 

The Development of Multiplayer in the 2000s

By the second half of the nineties, the internet and the World Wide Web had become commonplace. The web rose in prominence until the dotcom crash in 2000, and within a year, scores of dotcom companies had folded, wiping out trillions of dollars of investment.

RuneScape (2001) emerged from the ruins of the dotcom bubble to become one of the most enduring MMORPGs of all time. Playable in a browser, RuneScape was supported purely by ads until the crash, after which it pivoted to a freemium model in which premium users got access to more content. The browser-based MMORPG drew in droves of players, and continues to attract gamers today – nearly 17 million players are estimated to have played RuneScape, and in 2020, it reached its highest-ever concurrent user count, at more than 170,000. RuneScape’s success indicated that online multiplayer games could weather market crises, and the 2000s were marked by constant innovation in the field.

RuneScape (Courtesy Jagex)

Console Makers Enter the Fray

Microsoft launched Xbox Live in 2002, a dedicated service for online multiplayer that would become hugely popular with the release of Halo 2. While the first game in the Halo franchise (2001) was shipped before Xbox Live, its sequel, Halo 2 (2004), offered multiplayer modes with the Xbox’s unified online service. While many aspects of Halo 2 were lauded by gamers and critics alike, it is now known for ‘changing online multiplayer gaming forever’ and is considered the ‘game which showed the world how console multiplayer should be done’. 

Until Halo 2 launched, few gamers were using Xbox Live, although the Xbox itself offered sophisticated broadband compatibility at a time when the technology was still uncommon. By the time Halo 2 was released, broadband infrastructure had grown widespread, and Halo 2 could exploit the possibilities of Xbox Live to the fullest, creating an unprecedented online multiplayer experience on console. While Halo: Combat Evolved had become the killer app for the Xbox, Halo 2 became the killer app for Xbox Live, and made console-based online gaming straightforward and intuitive.

Halo 2 (Courtesy Microsoft)

Sony’s PlayStation 2 also offered online multiplayer with a separate network adapter, which was integrated later with the PS2 Slimline model. The console offered both dial-up and broadband-based connectivity and networked multiplayer using ethernet cables or a router network. Unlike Xbox Live, which functioned as a unified service for all Xbox games, providing online multiplayer for PS2 games was the responsibility of the publisher, who had to use third-party servers.

Sony would catch up with Xbox Live in 2006 with the PlayStation Network – a free, unified service for online multiplayer for the PlayStation 3 that also featured an online store from which to buy games digitally. Online gaming was free on the PS3, but required a subscription to the PlayStation Plus service on the PS4 and PS5. The introduction of online multiplayer for consoles would lead to many games offering the feature out of the box, making online multiplayer gaming a staple for gamers, despite the fact that both Microsoft and Sony charge for the service.

 

Modders Create a New Multiplayer Genre: MOBA

In 2003, a WarCraft III fan released a mod called Dota – Defense of the Ancients. The mod would spawn an entirely new genre in gaming, the multiplayer online battle arena (MOBA). In Dota, two teams of five player-controlled heroes battled across three lanes connecting the teams’ bases, the mission being to destroy the opposing team’s base. Soon, other modders were creating their own versions of the map, adding new heroes and items.

Eventually, modder Steve Feak would develop Dota Allstars, a version that would incorporate the best elements from multiple Dota iterations, and would become the most popular version of the mod. As the game was a modification of WarCraft III, modders could not add any original content (such as models/textures or characters) not provided in the modding resources released by Blizzard. Nor could Dota’s popularity result in any monetary gains for the modders. 

Dota AllStars (Courtesy Blizzard)

Steve Feak would hand over the reins of managing Dota to IceFrog, who would go on to collaborate with developers at Valve to release Dota 2 in 2013, one of the biggest esports in the world in terms of prize pools. Feak would himself be hired by Riot Games to develop the free-to-play MOBA League of Legends, one of the most popular esports in the world. A fan-made mod has spawned not one, but two major esports and changed gaming – especially online multiplayer gaming – forever.

 

Online Multiplayer Goes To War

Around the time when Dota was becoming a phenomenon, two publishers released two FPS titles that changed both online multiplayer and the FPS genre beyond recognition. In 2002, Electronic Arts released Battlefield 1942, and in 2003, Activision released Call of Duty, marking the start of a rivalry that has lasted nearly two decades. Both games had a World War II setting, and both of them fleshed out this conflict in masterful detail.

Both games excelled at multiplayer – the Encyclopaedia Britannica credits Call of Duty with breathing new life into the multiplayer FPS genre spawned by Quake and Unreal Tournament. The first Call of Duty title was a visceral experience, set in World War II, and featured immersive audio-visual effects – when the player is close to an explosion, sounds are muffled, there is a ringing noise in the ear (simulating tinnitus) and vision is blurred as well. The game also featured excellent enemy AI (NPCs are programmed to flank the player and move from cover to cover) and its multiplayer features could easily be modded by gamers themselves. Call of Duty: Modern Warfare (2007) would take the IP to modern settings and advance the multiplayer experience even further, with the introduction of killstreaks, where the player gains special abilities by killing opponents without dying. Staying alive while killing your enemies allowed you to call in UAV reconnaissance scans, airstrikes, and even attack helicopters.
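The killstreak mechanic described above is simple to express in code: a counter grows with each kill, unlocks a reward at certain thresholds, and resets on death. The sketch below is illustrative only – the thresholds (3 kills for a UAV, 5 for an airstrike, 7 for an attack helicopter) follow the commonly cited Modern Warfare rewards, but the class and reward names are this example’s invention, not Activision’s code.

```python
# A minimal sketch of killstreak logic: consecutive kills unlock rewards,
# and dying resets the streak. Thresholds are illustrative.

REWARDS = {3: "UAV recon", 5: "airstrike", 7: "attack helicopter"}

class Player:
    def __init__(self):
        self.streak = 0

    def on_kill(self):
        """Each kill without dying extends the streak and may unlock a reward."""
        self.streak += 1
        return REWARDS.get(self.streak)   # None if no reward at this count

    def on_death(self):
        """Dying resets the streak, so rewards must be re-earned."""
        self.streak = 0

p = Player()
unlocks = [p.on_kill() for _ in range(5)]   # five kills in a row
print(unlocks)   # [None, None, 'UAV recon', None, 'airstrike']
p.on_death()
print(p.streak)  # 0
```

The design insight is that the reward scales with risk: the longer a player survives while racking up kills, the more game-changing the unlocked ability becomes.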

Call of Duty (Courtesy Activision)

Battlefield 1942’s contributions are just as significant – its online multiplayer allowed for epic, chaotic battles fought by dozens of players in large, detailed maps. The game established the 64-player online gameplay of the series, set in environments with multiple vehicles that you could use while battling your foes. Its 22 maps were based on real-world World War II battlegrounds such as El Alamein, Iwo Jima and Stalingrad. The game’s numerous vehicles, including tanks, planes, carriers and even submarines, added to the chaos of multiplayer and resulted in innovative tactics. The game’s active modding community introduced various weapons, settings and themes to the title – the well-known Desert Combat mod added modern assault rifles, rocket launchers, helicopters, and planes, while total conversions such as Galactic Conquest attempted to turn the game into a Star Wars title.

Battlefield 1942 (Courtesy Electronic Arts)

The two franchises dominated online multiplayer during the 2000s, and have continued to remain popular, releasing a new game every year. The Call of Duty franchise’s popularity has grown dramatically in recent years, following the release of its first mobile title in 2019 and its free-to-play title Call of Duty: Warzone in early 2020 – the series’ user base grew from 70 million in 2018 to more than 250 million in 2020.

 

Massive Communities and the Proto-Metaverse

The success of multiplayer games from RuneScape to Call of Duty would set the stage for the behemoth that was World of Warcraft (2004). The MMORPG still boasts a huge player count and is known for its large expansion packs, complex lore and gameplay. It has been praised for its fluid combat, and the classic version of the game, as opposed to the retail version, is also known for being more challenging. Despite being around for nearly two decades, WoW remains very accessible, allowing new players to experience it on their own terms – the latest expansion, Shadowlands, even includes a tutorial phase, as the first release of WoW did. The game boasts a total of over 120 million registered players.

World of Warcraft (Courtesy Blizzard)

But World of Warcraft is not just about gaming – the game implemented many features that would later be associated with today’s nascent metaverse. WoW was not the first game to come up with player-driven economies, social gathering points or the sale of virtual real estate, but it was the first widespread game to make these features part of the gaming landscape. With its massive community and metaverse-like features, WoW can be considered a proto-metaverse, and we have argued elsewhere that Microsoft’s purchase of Activision Blizzard qualifies as a metaverse play, precisely because Activision Blizzard is used to handling an enormous global community and can help Microsoft get a head start on its metaverse initiatives.

Second Life (2003) is another title that can count as a proto-metaverse – it is a vast 3D virtual world and platform where people can interact with each other and with user-generated content in real time. Players, known as ‘residents’, create a digital avatar and freely explore the world, create their own content and even trade goods and services with the in-world currency, the Linden dollar – Second Life hence boasts a thriving in-world economy. The platform has a daily average of 200,000 users from 200 countries, and over 70 million users spread out over 27,334 regions in the world.

 

Second Life (Courtesy Linden Lab)

Unlike games, Second Life has no goals or objectives, and social interaction is the core aspect of the experience. Residents have married and even raised children, and created communities with unique customs. The platform actively fosters such interaction by ensuring that everyone in any part of it experiences the same thing – Second Life consists of an integrated space, not disparate instances.

The platform even created an early version of the non-fungible token – the digital assets in the world contain tags that record who made them, who owns them, what they cost and what a buyer can do with them. 
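The kind of provenance-and-permissions tag described above can be pictured as a small record attached to each asset. The sketch below is an illustration of the concept only – the field names and `sell` method are this example’s assumptions, not Linden Lab’s actual data schema.

```python
# An illustrative record of the provenance and permission tags described in
# the text. Field names are assumptions, not Linden Lab's real schema.
from dataclasses import dataclass, field

@dataclass
class VirtualAsset:
    name: str
    creator: str                 # who made it (never changes)
    owner: str                   # who currently owns it
    price_linden: int            # what it costs, in Linden dollars
    permissions: dict = field(default_factory=dict)  # what a buyer may do

    def sell(self, buyer: str):
        """Transfer ownership; the creator tag stays fixed, like provenance."""
        if not self.permissions.get("transfer", False):
            raise PermissionError("asset is no-transfer")
        self.owner = buyer

hat = VirtualAsset("Top Hat", creator="resident_a", owner="resident_a",
                   price_linden=250,
                   permissions={"transfer": True, "copy": False})
hat.sell("resident_b")
print(hat.creator, hat.owner)  # resident_a resident_b
```

The parallel with NFTs is the immutable creator field plus machine-readable transfer rules – ownership can change hands, but provenance and permitted uses travel with the asset.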

While the platform has been hailed as one of the longest-running experiments in a metaverse-like experience, creator Philip Rosedale is sceptical about present-day metaverse initiatives. Rosedale believes that a true metaverse would have to be built by its users rather than software companies, just as Second Life residents create digital assets to enhance their virtual world. Rosedale is also wary of the blockchain, and believes that the metaverse needs a centralised economy to prevent wealth disparity. Second Life is not just a proto-metaverse, it has yielded insight into the possible problems with current conceptions of the metaverse.

 

Multiplayer Matures, Becomes the Norm

By the late 2000s, online multiplayer was ubiquitous – and some of the best games of the decade were focussed on delivering memorable (and addictive) multiplayer experiences, supporting millions of connected players. 

Console games such as Halo 2 and its sequels thrived on the back of their multiplayer modes. Modders created an entire multiplayer genre – the MOBA – on their own. Games like Battlefield and Call of Duty raised the bar for what could be achieved in multiplayer FPS and became staples of the multiplayer gaming scene throughout the 2000s and beyond. And MMORPGs such as World of Warcraft, along with the social platform Second Life, led to the formation of massive online communities that persist to this date.

Throughout the course of the 2000s, long-running sports game franchises such as FIFA, Madden and NBA also started to offer robust multiplayer on multiple platforms and spawned their own esports communities and events. 

League of Legends was released towards the end of the decade and was one of the first MOBA games to launch as free-to-play and to employ a live-service model with continual updates, new heroes, and game-balancing patches. The League of Legends franchise, comprising multiple games, registered a staggering 180 million active players in October 2021. The desktop version is one of the highest-grossing free-to-play games as well, and its mobile version is one of the most popular mobile MOBA games as of 2022. Many multiplayer-focussed titles of the 2010s would take a cue from LoL and go free-to-play, deriving revenues from cosmetic upgrades and other microtransactions.

 

Conclusion

Online multiplayer began as text-based adventures and matured into massively multiplayer games with high-fidelity graphics that support millions of active players. This evolution was spurred in part by developments in internet technology, but was also the result of game developers pushing the limits of what could be achieved with the network infrastructures they had access to. The early MUDs depended on university networks and then the ARPANET, while Doom used a matchmaking service based on a dial-up connection until the arrival of the internet, which QuakeWorld, Unreal Tournament and Quake III Arena used to maximal effect with code bases optimised for online multiplayer. 

The efforts of these pioneers led to widespread online multiplayer in the 2000s, where millions of gamers could participate in MMORPGs, MOBAs, multiplayer FPS games and more. The development of online multiplayer, especially from the ’90s to the 2000s, is characterised by ceaseless innovation and pushing the limits of what can be achieved in a multiplayer experience over the internet.

In a subsequent blog, we will discuss current trends in online multiplayer – the shift toward mobile multiplayer, the rise to prominence of the hero shooter and battle royale genres, and how multiplayer rose in prominence during the pandemic period.



The Evolution of Open-World Games


An Introduction to Open World Games

On November 11, 2011, Bethesda released The Elder Scrolls V: Skyrim, and a week later, Mojang officially released version 1.0 of Minecraft. Both were open-world games whose enormous success would make them household names, and they marked a watershed moment in open-world gaming – some of the finest games in the genre would be released in the years following the success of Skyrim and Minecraft.

Open-world games such as these, and open-world franchises such as Grand Theft Auto, Red Dead Redemption and The Elder Scrolls are some of the most iconic and successful IPs in gaming history. In this blog we define open-world games and provide a brief history of the genre from its origins in Elite (1984) to its golden age in the second decade of the twenty-first century. We also cover many of the seminal open-world games – titles that took the genre to new heights.

What is an Open-World Game?

An open-world game features a non-linear game world design in which the player can freely traverse the environment, which consists of many different areas and structures that can be visited at any time. Players are not restricted to a specific path they must follow to reach a location, though certain areas can be accessed only after the player reaches a certain point in the game. The open world may be procedurally generated or pre-created.

Exploration is the heart of open-world design – players can access various locations, many of which feature their own biomes. Players can also often tackle missions in any order – many open-world games feature side quests and activities that one can pursue while the main questline remains unfinished. An open world contains many usable in-game objects, collectibles and loot, NPCs who react to what you do, and a game world that can change dynamically based on your actions. Many games feature activities that have nothing to do with character progression – GTA V has golfing, paragliding and more, while Skyrim allows you to play tag with children and read dozens of books. All these qualities of the open-world game engender a strong sense of immersion.

A Brief History of Open-World Games

The history of the open-world game can be traced back as early as 1976, although the first game we would recognize as a 3D open world is GTA III (2001). In this section, we will chart the history of open-world games from Elite (1984) to present-day open worlds such as Elden Ring’s.

Early History

Ars Technica considers the 1976 text adventure game Colossal Cave to be the first free-roaming game, though it credits the space sim Elite (1984) for being a game that would be recognizable as open world to modern audiences – its wireframe planets and spaceships may look primitive today but were cutting-edge at the time. Elite used procedural generation to create its universe and instil a sense of vastness.
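The principle behind Elite’s approach can be shown with a toy example: a seeded random number generator always produces the same sequence, so an entire galaxy can be regenerated on demand from a single seed number instead of being stored. Note that this sketch only demonstrates the general technique – Elite’s actual algorithm was different, and the syllable table below is invented for illustration.

```python
# A toy illustration of seeded procedural generation: a fixed seed always
# reproduces the same "galaxy", so a vast world needs almost no storage.
# (Elite's real 1984 algorithm differs; this just demonstrates the principle.)
import random

def generate_galaxy(seed, n_systems=8):
    rng = random.Random(seed)        # same seed -> same galaxy, every run
    syllables = ["la", "ve", "ti", "qu", "or", "an", "di", "so"]
    systems = []
    for _ in range(n_systems):
        # Build a pronounceable name and a map position from the seeded stream
        name = "".join(rng.choice(syllables) for _ in range(3)).capitalize()
        x, y = rng.randrange(256), rng.randrange(256)
        systems.append((name, x, y))
    return systems

# Regenerating with the same seed yields an identical galaxy:
assert generate_galaxy(1984) == generate_galaxy(1984)
# A different seed yields a different galaxy:
assert generate_galaxy(1984) != generate_galaxy(42)
```

This is why Elite could fit eight galaxies of 256 star systems each into a few kilobytes of 1980s memory: the world is an algorithm plus a seed, not a stored map.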

The Legend of Zelda (1986) emphasised exploration and non-linear gameplay in a world filled with discoverable locations. Though it did not pioneer open-world mechanics, it was one of the first games to make exploration a key aspect of gameplay. As an IGN article observes, Zelda featured simple combat and an overworld that led to dungeons, where you killed enemies and collected treasure, in a gameplay loop that rewarded exploration consistently. 

The Legend of Zelda (1986, Courtesy Nintendo)
The Legend of Zelda (1986, Courtesy Nintendo)

Turbo Esprit (1986) and Vette (1989) could be considered the forerunners of open-world racing or driving games such as Forza Horizon and Burnout Paradise. Turbo Esprit allowed you to choose between four cities, where you tailed drug runners and disabled their cars before they reached their destination. Vette was a one-on-one street racing game set in San Francisco, but you could drive in any direction if you preferred. Both games gave you a city you could drive around, and Turbo Esprit let you choose your own goals, without forcing you into any missions.

Nintendo followed up Zelda with the revolutionary 3D platformer Super Mario 64 (1996), which allowed players to freely explore a 3D world. 3D games had existed before, but Super Mario 64 offered an open-ended, free-roaming world. The game also introduced novel jumping mechanics that Mario could use to traverse various platforms, giving a sense of spontaneity to moving around the game world.

The first generation of Pokemon games (Red, Green, Blue and Yellow) also featured open worlds that could be explored as the player quested to master Pokemon battling and collect all the Pokemon in the world. Subsequent generations continued this open-world design philosophy and have been significant for open-world gaming on handheld consoles.

1998 saw the release of the seminal Legend of Zelda: Ocarina of Time, the first 3D game in the Zelda series, set in the expansive world of Hyrule, which Link navigates to defeat the evil king Ganondorf. Players could explore the open overworld and its dungeons, collecting weapons and other loot. Many of its side quests were optional, allowing players to chart their own journey to defeat Ganon.

The Rise of the Modern Open-World Game

Grand Theft Auto III (2001), which featured a 3D open world that could be fully explored, can be considered a precursor to the open-world game as we understand it today. Players could go inside some buildings and see the world from street level, kill NPCs and explore the possibilities of a fully-3D world while driving around in vehicles. NPCs would react to your actions, and criminal activity could attract the attention of the authorities. The environment was not just a setting but a real part of the gameplay.

GTA III was the best-selling game of 2001, a year filled with watershed games such as Metal Gear Solid 2, Halo, and Super Smash Bros. It has since sold 14.5 million copies, and inspired multiple clones over the years.

Grand Theft Auto III (Courtesy Rockstar Games)
Grand Theft Auto III (Courtesy Rockstar Games)

Bethesda released Morrowind in 2002 and Oblivion in 2006, both open-world titles that now rank among the greatest games ever made. Both offered a great degree of freedom, a hallmark of Bethesda’s open worlds. The games featured RPG elements such as character classes, skill progression and more, all of which made each player character unique to the player. Oblivion’s main quest-line and side-quests are some of the most memorable in the Elder Scrolls series.

Test Drive Unlimited (2006) was probably the first recognizable open-world driving game as we know the genre today. Need for Speed: Underground 2, released in 2004, had set its races in an open world, but Test Drive Unlimited made exploration the main gameplay element. Set on a Hawaiian island, the game turned driving into an ‘open-ended, self-determined and leisurely pursuit’. You could just go for a drive, and the island’s complex network of roads meant that each drive would be quite unlike any other. Like any great open-world game, Test Drive Unlimited let you choose what you did – you could race through the streets, cruise across the island taking in the sights, or even launch yourself off a hill. 

Soon after, the makers of the Burnout racing series released the free-roam Burnout Paradise (2008), which would go on to rank as one of the most influential open-world games and will be discussed below. Games such as these would lead to titles like Forza Horizon (2012), The Crew (2014), and multiple open-world Need for Speed releases.

In the second half of the 2000s, Ubisoft launched the highly successful Assassin’s Creed franchise, which places players in historical open-world settings. The first title in the series takes place in 12th-century Jerusalem, while Assassin’s Creed II (2009) and Assassin’s Creed: Brotherhood (2010) are set in Renaissance Italy. As the franchise’s in-house historian Maxime Durand says, “History is our playground” – the AC series is renowned for its attention to detail in fleshing out historical settings and has even created game modes that can be used in classroom settings. 

Assassin’s Creed II (Courtesy Ubisoft)

Red Dead Redemption (2010), Rockstar’s gritty, realistic western game, would mark the end of a decade that saw the rise of modern open-world gaming. The game is hailed as one of the best westerns ever made, thanks to its powerful quest-lines, strongly-written characters and its elegiac portrayal of the dying West.

The Golden Age

The golden age of open-world gaming can be said to have begun with the release of Skyrim and Minecraft in November 2011. Skyrim shipped 7 million copies within the first week of its release and has gone on to sell 30 million, making it the second-best-selling RPG of all time, while Minecraft has risen to even greater heights, becoming one of the best-selling games of all time. Both games appealed to demographics well beyond core gamers. 

Bethesda created one of its most fully-realised worlds with Skyrim, offering tons of activities and explorable locations in an epic, hand-crafted fantasy world inspired by Norse mythology. Minecraft, on the other hand, was an immense procedurally generated voxel-based sandbox world where you could give free rein to your creativity. Both games inspired game makers to take open-world gaming to new heights.

The Elder Scrolls V: Skyrim (Courtesy Bethesda)

The next few years would see a host of marvellous open-world games, including Grand Theft Auto V (2013), Middle-Earth: Shadow of Mordor (2014), The Witcher 3: The Wild Hunt (2015), The Legend of Zelda: Breath of the Wild (2017), Red Dead Redemption 2 (2018) and many more, such as Fallout 4 (2015) and Kingdom Come: Deliverance (2018). 

These games pushed the boundaries of the open world even further – they offered myriad ways to travel within the game world, featured superbly written questlines, and had beautiful locations that made exploration worthwhile. Many of them introduced mechanics that enriched gameplay, such as Shadow of Mordor’s Nemesis system, wherein enemy orcs rise and fall in their hierarchy based on their combat encounters with the player character. Horizon Zero Dawn (2017) created an open world quite unlike any other, setting the game on a post-apocalyptic Earth teeming with animal automatons. 
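To make the idea concrete: a Nemesis-style hierarchy can be thought of as a ranked list of enemies that is re-sorted whenever an encounter with the player changes an enemy’s standing. The sketch below is a deliberately simplified, hypothetical Python model – it is not Monolith’s actual implementation, and the class and method names are our own invention for illustration.

```python
from dataclasses import dataclass


@dataclass
class Orc:
    """One enemy in the hierarchy."""
    name: str
    rank: int = 0       # 0 is the top of the hierarchy
    victories: int = 0  # encounters won against the player


class Hierarchy:
    """Toy Nemesis-style hierarchy: orcs are promoted when they
    defeat the player and drop out entirely when they are slain."""

    def __init__(self, orcs):
        self.orcs = list(orcs)
        self._rerank()

    def _rerank(self):
        # More victories -> higher rank; ties keep their current order.
        self.orcs.sort(key=lambda o: -o.victories)
        for rank, orc in enumerate(self.orcs):
            orc.rank = rank

    def player_defeated_by(self, orc: Orc):
        """The player lost an encounter: the victor climbs the ranks."""
        orc.victories += 1
        self._rerank()

    def orc_slain(self, orc: Orc):
        """The player killed this orc: remove it and promote the rest."""
        self.orcs.remove(orc)
        self._rerank()
```

The real system layers far more on top of this – procedural names, grudges, scars and ambushes – but the core loop of encounters feeding back into a persistent ranking is what made enemies feel like recurring characters rather than disposable mobs.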

The Assassin’s Creed series also expanded the breadth and depth of its open worlds with titles such as Origins (2017), set in ancient Egypt, Odyssey (2018), set in ancient Greece, and Valhalla (2020), set in the Viking age. Like previous games in the franchise, these titles render historical settings in minute detail, but they also feature a slew of side quests in sprawling game worlds that can be traversed on both land and sea.

Death Stranding (2019) turned traversal into the game’s main challenge, and the beautiful samurai epic Ghost of Tsushima (2020) marked the end of a decade that saw some of the greatest open-world games ever made. 


Ghost of Tsushima (Courtesy Sony Interactive Entertainment)

Open-world titles remain prominent today, as games like Horizon Forbidden West (2022) and Elden Ring (2022) indicate. Halo Infinite (2021) is the first Halo game to incorporate open-world elements into its design, and Elden Ring is the first Souls-like game to go open-world. Upcoming games such as Hogwarts Legacy (2022), Forspoken (2023) and Starfield (2023) – Bethesda’s first new IP in 25 years – indicate that the genre is far from fading into irrelevance. 

In the next section, we will delve into some of the best and most influential open-world games, and how they pushed the genre to new heights. 

Seminal Open-World Games

The best open-world games rank among the greatest games ever made, and in this section, we take a closer look at some games that have delivered unique open-world experiences, enabling gamers to create their own journey through vast worlds that seem to have a life of their own, with a host of NPCs, missions, side-quests and locations offering endless possibilities. 

The table below contains Metacritic’s top twenty open-world games, in chronological order, culled from a list of its best-rated games of all time. We will discuss some of these games in greater detail based on the impact they had – and since many belong to larger series, we have chosen the one game from each franchise that most influenced the game industry or open-world gaming. Thus Skyrim is chosen over Morrowind or Oblivion, and GTA III over GTA IV or V – all of these titles rank among the best games ever made, but Skyrim and GTA III have had the most impact: the former brought Bethesda’s open world to a far wider audience, while the latter helped pave the way for the 3D open world as we know it today. Minecraft – the lowest-rated of the top twenty – is also discussed at length because it utterly changed what players could create in an open game world, as is The Witcher 3: The Wild Hunt, which shares Minecraft’s rating, for its brilliant story-telling and quest design. 

Table: Metacritic’s top twenty open-world games

The Legend of Zelda: Ocarina of Time (1998, 99/100)

This was the first Zelda title to feature 3D graphics – players explored Hyrule and its various locations and peoples on a quest to defeat the evil king Ganondorf. Distant landmarks on the horizon could actually be visited, secret locations could be broken into by destroying walls at weak points, and the novel ‘Z-targeting’ system allowed Link, the protagonist, to lock onto enemies or in-game objects. A single button performed various context-sensitive actions – Link would mount his horse Epona if she was nearby, or throw a bomb if he was holding one. Optional side quests yielded loot that Link could trade with various NPCs, and the game world had a life of its own. 

The Legend of Zelda: Ocarina of Time (Courtesy Nintendo)

Grand Theft Auto III (2001, 97/100)

Along with Zelda: Ocarina of Time, Grand Theft Auto III paved the way for the modern 3D open-world genre. Unlike the Zelda title, GTA III was set in a realistic game world – the fictional Liberty City, which draws inspiration from New York. Players could explore the game world freely between missions and tackle side-quests in any order while progressing through the main story. Driving around the streets of Liberty City gave an unprecedented sense of freedom, especially since you could drive off-road and launch yourself off ramps. Rockstar also raised the bar for voice acting by casting recognizable Hollywood names as major NPCs. GTA III was all about creating a virtual world that players wanted to return to again and again, and it spawned numerous imitators, such as Mafia (2002), The Getaway (2002), True Crime: Streets of LA (2003) and the Saints Row franchise. 

Burnout Paradise (2008, 88/100)

Strictly speaking, Burnout Paradise is a racing game set in the fictional open world of Paradise City. But it is so much more: it actively encouraged players to drive dangerously fast, destroy billboards and other city structures, hurtle down the wrong side of a flyover, launch themselves off ramps, cause all sorts of mayhem and finally crash in spectacular fashion. Even Metacritic makes note of this – ‘Burnout Paradise proves that crashing is awesome!’ 

The game also offered races, time trials, stunt runs and other activities at every intersection, but many gamers ignored these missions, choosing instead to explore the city, use the game’s social features and attempt increasingly absurd driving stunts. Player freedom and exploration – two of the defining characteristics of an open-world game – are central to the gameplay in Burnout Paradise. 

Burnout Paradise (Courtesy Electronic Arts)

The Elder Scrolls V: Skyrim (2011, 96/100)

You are the hero, you make the story. Your quests take you through points of interest, which lead to more quests, which in turn lead to other points of interest, and so on. Player freedom is nearly sacrosanct in Skyrim: you can postpone the main quest indefinitely and explore Skyrim entirely on your own terms, and the smallest detail, like a book on a shelf, can lead to a compelling new quest. Skyrim also allows you to customise your character’s looks, choose your race, and upgrade your skills and gear as you progress through the game world. To be sure, Morrowind and Oblivion had already pioneered such gameplay elements, but Skyrim implemented them on a far grander scale and brought the series’ open-world RPG mechanics to a far wider audience. As one critic puts it, many open-world games, including Kingdom Come: Deliverance, Dragon’s Dogma: Dark Arisen, The Witcher 3 and even The Legend of Zelda: Breath of the Wild, would draw inspiration from Skyrim – a mark of its resounding influence within the industry. 

Minecraft (2011, 93/100)

Minecraft is a massive procedurally generated open-world sandbox where players can unleash their creativity in ways that are quite impossible in other games. Players can build just about anything, from a log house to a working computer, in this voxel-based world, which comes with its own unique biomes. Players have built monumental structures in the game world, including giant golden bricks, replicas of real-life buildings, working calculators and computers, and much more. Minecraft became a word-of-mouth sensation soon after release and went on to become one of the best-selling games of all time. It not only fosters exploration but also allows you to remake the world however you see fit: players mine materials by hacking away parts of the game world, and resources harvested this way can be used to create structures of stunning complexity, making Minecraft the ultimate sandbox open-world game. 
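Procedural generation of the kind Minecraft popularised typically derives terrain from deterministic noise functions, so the same seed always produces the same world. The snippet below is a heavily simplified, illustrative Python sketch of that idea – not Mojang’s actual algorithm, which layers multiple octaves of Perlin-style noise in 3D. It builds a small heightmap from hashed value noise; all function names here are our own.

```python
import math


def hash2(x: int, z: int, seed: int = 42) -> float:
    """Deterministic pseudo-random value in [0, 1) for a grid point."""
    n = x * 374761393 + z * 668265263 + seed * 1442695040888963407
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 2**32


def smoothstep(t: float) -> float:
    """Ease the interpolation weight so terrain has no hard creases."""
    return t * t * (3 - 2 * t)


def value_noise(x: float, z: float) -> float:
    """Bilinearly interpolated value noise at a continuous coordinate."""
    x0, z0 = math.floor(x), math.floor(z)
    tx, tz = smoothstep(x - x0), smoothstep(z - z0)
    a, b = hash2(x0, z0), hash2(x0 + 1, z0)
    c, d = hash2(x0, z0 + 1), hash2(x0 + 1, z0 + 1)
    top = a + (b - a) * tx
    bot = c + (d - c) * tx
    return top + (bot - top) * tz


def heightmap(size: int, scale: float = 8.0, max_height: int = 32) -> list:
    """Generate a size x size grid of integer terrain heights."""
    return [[int(value_noise(x / scale, z / scale) * max_height)
             for x in range(size)]
            for z in range(size)]
```

Because every height derives from a hash of the seed and coordinates, chunks of the world can be generated lazily as the player explores – nothing needs to be stored until a block is modified, which is part of why procedurally generated worlds can be effectively limitless.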

Minas Tirith, a city from the Lord of the Rings, made in Minecraft (Courtesy Mojang)

The Witcher 3: The Wild Hunt (2015, 93/100)

The Witcher 3: The Wild Hunt (2015) is an open-world game that stands out for its depth and breadth – it focuses on brilliantly written quests and a fully realised player character, and does away with trivial side missions in favour of quests that are strange and interesting. The Witcher 3’s narrative quests are some of the most compelling in gaming, and exploring the world is a thrilling experience because of the sheer variety and depth of the quests you uncover.

The Witcher 3 is one of the few games that does not trade in narrative complexity for the freedom and non-linearity of the open world. As one of the lead developers said, “[Our] goal … was to not compromise on the quality of our quests and stories by having them take place in an open world.” Quests fit organically in the game world, allowing players to stumble upon them naturally. Many of these missions are interlinked, so even tackling them in non-linear fashion can allow players to experience all the connected activities, which together tell a cohesive story. 

The game is also gorgeous, inviting the player to explore its huge map, and with its attention to detail, brilliant quests and story-telling, The Witcher 3 is one of the greatest open-world games to date. 

The Witcher 3: The Wild Hunt (Courtesy CD Projekt Red)

The Legend of Zelda: Breath of the Wild (2017, 97/100)

The Legend of Zelda: Breath of the Wild (BOTW) represents open-world design at its finest. Almost everything in this game is optional; in fact, nothing stops you from challenging Ganon at Hyrule Castle the moment the game starts – you might lose at first, as you are not yet acquainted with how the game works, but may well succeed after a few attempts. The game does not compel you to discover all of its treasures, and you can explore the lovingly crafted game world at your own pace. It achieves true non-linearity by not forcing any narrative or questline upon exploration, and its open world is fully traversable – you can climb almost any surface, bridge a valley with a felled tree, fly through the air on a paraglider and even ride a fantasy motorcycle through Hyrule. Traversal can be challenging as well, requiring you to wear warm clothing in cold weather and sheathe metal weapons in the open during a thunderstorm (to avoid being struck by lightning). But such survival elements are implemented with a light touch and never devolve into repetitive chores. 

In many ways, this game is the culmination of all the open-world Zelda titles that led up to it – Breath of the Wild is the best-selling Zelda game of all time and is a huge system seller for the Nintendo Switch as well. 

Red Dead Redemption 2 (2018, 97/100)

Red Dead Redemption 2 (RDR 2), the second instalment in the Red Dead series, is a prequel to the original and one of the most cinematic open-world games ever made. Thanks to its exceptional writing, Arthur Morgan emerges as a fully realised character and one of the greatest protagonists in Rockstar’s history, and even side characters have compelling story arcs that make you care for them. The game has a bewildering number of easter eggs, including a UFO and a vampire, and players continue to comb the massive game world for hidden treasures to this day.

Red Dead Redemption 2 (Courtesy Rockstar Games)

Rockstar also limited player freedom in meaningful ways in RDR 2, requiring the player to rest, eat, wear protective clothing and even take care of their horse, and introducing serious consequences for breaking the law – being a criminal results in a bounty on your head that severely restricts your freedom of movement, and thereby limits your ability to gather resources, find new missions or explore the game world. In Red Dead Redemption 2, Rockstar subverts the lawless freedom of the GTA series to create a more immersive, realistic narrative that portrays the hard life of the American West. 

Red Dead Redemption 2 is among the top ten best-selling games of all time, and thanks to its powerful story, graphics, gameplay and open world, it is considered one of the greatest games ever made. 

Conclusion

When looking at the industry’s seminal open-world titles, one is struck by the variety in their design. All feature open worlds, non-linearity, traversal, adventure, discovery and exploration, but they differ radically in how they build their worlds. Player freedom is central to Skyrim but curtailed in Red Dead Redemption 2. GTA III lets you live the life of a gangster, while in BOTW and Skyrim you are the world’s saviour (of course, in Skyrim you can also be a vampire, a thief, an assassin or a werewolf – or all four at once – but you are still the only one who can rid the world of the world-eating dragon Alduin). RDR 2 is a tragic western, while Minecraft has no story at all, focusing instead on letting the player create anything in the game world. All these games have pushed the boundaries of open-world gaming, but they have also pulled it in different directions. 

It is more than likely that ambitious developers will continue to make and remake the open world, inspired by seminal titles and their own vision. Elden Ring, for example, eschews map markers and lets you discover the world with little guidance – it is the first Souls-like game to go open-world while retaining the difficulty of the Souls genre. In the coming years, we can expect more iterations on what is now a staple of the industry – the open-world game is not just a treasure trove for gamers but a well of inspiration for developers as well. 

In a subsequent blog, we will discuss the evolution of open-world games – how developers built on the efforts of pioneers to create open worlds of increasing complexity, depth and scope.

Gameopedia works with clients across the industry on custom requests and can provide in-depth data about open world games. Reach out to us for data that can yield novel insights about open-world gaming.

Read More

The Decline of Platform Exclusivity

The year was 1991, and Sony had just suffered a public humiliation at the hands of Nintendo. On May 28, Sony had announced that it was working with the gaming company to build a new console with a CD drive. Nervous about Sony’s intentions regarding the gaming market, Nintendo publicly rebuffed the company a day later, declaring that it would partner with Philips instead. 


Nintendo had inadvertently helped create a juggernaut that would crush its competitors until Microsoft joined the fray. In the ’90s, the console wars had all been about Sega and Nintendo, but with its first PlayStation console (1994), Sony outsold the Nintendo 64 by a huge margin, and with the PS2, it finished off Sega as a console player. Microsoft realised that a new entrant could disrupt the market, and released its Xbox to compete with the PS2 – and this is why we talk about Sony vs Microsoft today, rather than Nintendo vs Sega. Nintendo still thrives in the console market, largely by not competing directly with Microsoft or Sony, and appealing to a wider, casual demographic. 

In this article we will discuss how platform exclusives – games released solely for one platform – have been the deciding factor in every iteration of the console war until the ninth generation of consoles. We will also delve into the reasons why companies are now moving away from platform exclusivity toward an inclusive approach that involves PC ports, multi-platform game subscription libraries and cloud gaming solutions.

What Are Exclusive and Non-Exclusive Games?

An exclusive is simply a game that can be played on only one platform – no exceptions. Such games are locked to a single platform, such as the Xbox, the PlayStation or even the PC. Halo 5: Guardians (2015) – the best-selling first-party Xbox One game – can only be played on that console or the Xbox Series X|S. Shadow of the Colossus (2005) can be played on the PS2, its 2011 remaster on the PS3, and the 2018 remake on the PS4 and PS5 – in effect, the title is restricted to the PlayStation platform. Dota 2 (2013) and League of Legends (2009) – both major esports – are exclusive to PC. 

A timed exclusive is an exclusive that is released for other console platforms and/or PC after a specific timeframe lapses. Timed exclusives such as GTA III (2001) and GTA: Vice City (2002) are among the best-selling PS2 games of all time. Mass Effect (2007) was a timed exclusive for the Xbox 360 before being ported to PC in 2008 and to the PS3 in 2012, and Mass Effect 2 (2010) was a timed Xbox 360 exclusive ported to the PS3 in early 2011. Both Mass Effect titles are among the best-selling Xbox 360 games.

Both exclusives and timed exclusives can be called ‘platform exclusives’ – they are released for only a specific video game console or to one company’s console platform, and not available on any other platform, either permanently, or for a set duration. 

A console exclusive is available and playable on one console platform but not the other, while being available on PC or another non-console platform. Halo Infinite (2021) can be played on the Xbox Series X|S and the Xbox One but not on any PlayStation console, and Horizon Zero Dawn (2017) can be played on the PS4 and PS5 but not on any Xbox console. While both are available on PC, a capable gaming rig is required for the best experience. Like platform exclusives, console exclusives are meant to sway potential buyers toward their respective platforms – Halo fans will opt for the latest Xbox console, while fans of Sony exclusives will choose the latest PlayStation.

Many prominent games, such as The Elder Scrolls V: Skyrim (2011), The Witcher 3 (2015), GTA V (2013) and Elden Ring (2022), are non-exclusive – they offer vast worlds, great stories, engaging quests and more, and none of them is locked to a single platform. All are available on PC, Xbox and PlayStation, maximising the potential audience they can garner. 

In the next section, we will deal with how platform exclusives determined the victor in several iterations of the console wars, until Microsoft adopted a multi-platform ecosystem policy mid-way into the eighth generation of consoles, signalling a shift away from platform exclusivity. 

The Prominence and Decline of Exclusives in the Console Wars

Most video game consoles are sold at a loss for the first few years – the strategy being to gain consumers for the console’s library of games – eventually, console sales, along with sales of games available on it, may more than make up for the cost incurred in developing the console.

For multiple generations, the success of a console was largely dependent on its exclusive library – it was the deciding factor for someone looking to buy a console. Sega pulled in gamers with exclusives like Sonic, Nintendo did the same with Mario and Zelda, and Sony and Microsoft would continue this trend with a plethora of exclusives. Hence, any history of platform exclusives is inextricably linked with the console wars – in the sections below, we discuss such exclusives in the larger context of competing console makers.

Gen 1-3: Atari, Sega and Nintendo Come to the Fore

The first generation of home consoles often supported only one game, usually a variation of Pong, such as Pong Doubles, Quadrapong or Breakout. The success of consoles like the Magnavox Odyssey and Atari’s Home Pong series led to hundreds of inferior clones hitting the market, eventually precipitating a market crash in 1977. That same year, Nintendo released several dedicated home consoles – the Color TV-Game series – that supported multiple games: the first offered six ball-and-paddle games, a later model featured a racing game, and the last contained a port of Nintendo’s arcade game Computer Othello. With a bigger library, Nintendo’s Color TV-Game series outsold all others, at 1.5 million units.

Home consoles of the second generation used game cartridges, which spurred the development of multiple games for each console. Space Invaders (1980) became the killer app for the Atari 2600 (1977), quadrupling sales of the console. The ColecoVision (1982) boasted a successful port of Donkey Kong, licensed from Nintendo. The Atari 2600 had an extensive game library and dominated sales with 30 million units sold, while Mattel’s Intellivision sold 3 million units and the ColecoVision 2 million. The second generation would also end in a North American market crash, caused by market saturation and, with a few exceptions, poor game quality. 

Nintendo’s revolutionary NES (Nintendo Entertainment System, 1983) would dominate the third generation and revitalise the industry, selling more than 60 million units on the back of a top-notch library that harnessed the power of 8-bit processors. Titles like Super Mario Bros. (1985), Mega Man 2 (1988) and The Legend of Zelda (1986) set the standard for the generation – Zelda was a runaway bestseller, selling over 6 million copies, and both Zelda and Mario were system sellers. Phantasy Star and Alex Kidd in Miracle World were landmark titles for the Sega Master System, though neither Sega nor Atari could compete with the NES: the Atari 7800 sold fewer than 4 million units (helped in part by licensed conversions of Nintendo games) and the Sega Master System sold 13 million units.

Fourth Gen: Sonic vs Mario

By the fourth generation, Atari had exited the market owing to the 1983 crash and the rise of Nintendo and Sega, who became the primary combatants in the ensuing console war. For the Genesis (1988), Sega came up with the famous marketing slogan ‘Genesis does what Nintendon’t’, signalling that it would compete directly with Nintendo – especially with Sonic the Hedgehog (1991), a title that would give Mario a run for his money. Other notable Sega Genesis games include Streets of Rage 2 (1993), Phantasy Star IV (1994) and Castlevania: Bloodlines (1994).

Nintendo responded with the Super NES (1990) and more industry-defining games, such as Chrono Trigger (1995), Super Metroid (1994), Street Fighter II (1992) and The Legend of Zelda: A Link to the Past (1991), which is considered one of the greatest games of all time.

However, Sega held its own: Sonic the Hedgehog became the best-selling game of 1991, selling 2 million copies worldwide. Sega did not win the war – it sold over 30 million Genesis units to the SNES’s 49 million – but it had proved that it could go up against the industry giant. 

Fifth Gen: Sony Rises to the Top with Game-Changing Exclusives

Until the fifth generation, the major players in the console wars had been Atari (until the fourth generation), Sega and Nintendo. That would change with the advent of Sony’s PlayStation. 

The original PlayStation home console changed the game, quite literally, with Final Fantasy VII (1997). FF publisher Square (now Square Enix) had developed games exclusively for Nintendo until Sony convinced the company that its ambitions for FF VII could only be realised on the PlayStation, which used CDs rather than cartridges and supported the latest 3D graphics. Square took full advantage of the PlayStation’s capabilities, introducing full-motion-video cinematic cutscenes that became a major selling point for the game. 

Like FF VII, other Sony exclusives such as the original Tekken (1994), Resident Evil (1996), Crash Bandicoot (1996) and Metal Gear Solid (1998) capitalised on the new console’s technological capabilities. Sony became the platform of choice for third-party studios such as Capcom, Konami, Electronic Arts and Namco, which it had eagerly courted from the outset. From 1996 to 2000, the Crash Bandicoot games were exclusive to PlayStation consoles, and titles in the Metal Gear Solid series also began life as PS exclusives before being ported – they rank among the best games of all time.

Both Sega’s Saturn and the Nintendo 64 failed against a better-designed console with a host of third-party exclusives that set a new standard for gaming. Sony sold 102 million PlayStation units, Sega sold a mere 9.24 million Saturn units, and Nintendo sold 32.93 million N64 units. With its first gaming console, Sony had become the market leader.

Sixth Gen: Sony Exclusives Redefine Gaming, Halo Powers the Xbox

Sony’s PlayStation 2 (2000) also boasted a strong library of games. With titles such as Shadow of the Colossus, Ico (2001), Okami (2006), Gran Turismo 3: A-Spec (2001), Final Fantasy X (2001), God of War (2005), God of War 2 (2007), GTA III and GTA: Vice City, Sony continued to redefine what people could expect from a gaming experience. 

While both GTA III and GTA: Vice City have since been ported to other platforms, Sony entered into timed exclusivity agreements with the publisher, allowing it to gain a ‘stranglehold on the competition’. Sony’s exclusives helped make the PS2 the best-selling home gaming console of all time.

Microsoft’s first console, the Xbox (released 2001-02), was going up against an industry titan with few competitors. But the Xbox had a killer app: Halo. The original Xbox could not match the PS2’s success – Microsoft sold 24 million units as opposed to 158 million PS2 units – but Halo: Combat Evolved (2001) was the beginning of one of the biggest franchises in gaming and proved that Microsoft could be a contender in the console market. Microsoft has sold 6.43 million copies of Halo: Combat Evolved, and the Halo franchise has sold 81 million copies worldwide – it is still one of the biggest reasons to buy an Xbox console. 

The Dreamcast (1998) preceded both the PS2 and the Xbox, but it failed against the PS2 because of a lack of third-party content, and Sega ended production before the console even had a chance to compete with Microsoft’s. Sega ceased to be a player in the market – the Dreamcast was its last console. 

Seventh Gen: Wii Wins by Appealing to a Wider Audience

Microsoft closed the console sales gap in the next generation – the Xbox 360 eventually sold 84 million units while the PS3 sold 87.4 million. The Xbox 360 (2005) had a strong exclusive lineup, including Halo 3 (2007), which doubled sales of the console, and other hits such as Gears of War 2 (2008) and 3 (2011), Forza Motorsport 4 (2011) and Forza Horizon (2012).

The release of Metal Gear Solid 4 in 2008 boosted sales of the PS3, but as the final tally indicates, the seventh generation was a closely fought race between Microsoft and Sony – many developers found it hard to build for the PS3’s unique architecture, leading to an underwhelming exclusive lineup that gave the Xbox 360 an advantage.

The PS3 eventually caught up with ambitious exclusives such as The Last of Us (2013), God of War: Ascension (2013), Heavy Rain (2010) and Gran Turismo 5 (2010), but the generation was characterised by franchises such as Grand Theft Auto, BioShock, Assassin’s Creed, Elder Scrolls, Tomb Raider and Final Fantasy, whose third-party publishers released their biggest games on both consoles to reach a wider audience. 

The real winner of the seventh generation console war, however, was the Nintendo Wii (2006), whose Wii Remote could be used for both traditional input and motion sensing. Wii Sports (2006), a game bundle that recreated popular sports at home using the Wii’s motion detection, became the best-selling Nintendo game ever, and helped the Wii become the best-selling console of the seventh generation, at 102 million units. With its lineup of family-friendly exclusives such as Mario Kart Wii (2008), New Super Mario Bros Wii (2009) and Wii Play (2006), the Wii won by attracting a much wider, casual audience, which neither the PS3 nor the Xbox 360 targeted. 

Since the Wii, Nintendo has not so much fought a war as played its own game: the Switch (2017) is yet another innovative console – a hybrid handheld and home gaming platform. It has sold 103 million units as of February 2022, and its exclusives appeal to both core gamers and casual players. Many of its franchises, such as Mario, Zelda and Pokemon, are huge system sellers, and The Legend of Zelda: Breath of the Wild (2017) has powered Switch sales.

Eighth Gen: Sony Wins by Sticking to an Exclusive Strategy

The scales swung back toward Sony in the next generation of consoles – both the Xbox One and the PS4 were launched in late 2013, and Sony’s console would outsell the Xbox One by a huge margin, primarily by adding steadily to its library of top-notch AAA titles. Despite an underwhelming exclusive lineup at launch, the PS4 would go on to become the second-best-selling home console of all time after the PS2.

After a shaky start, Sony released a string of hit exclusives – Bloodborne (2015), Uncharted 4: A Thief’s End (2016), Horizon Zero Dawn (2017), God of War (2018), Marvel’s Spider-Man (2018), and Ghost of Tsushima (2020) – all of which were integral to PS4 sales.

Microsoft, on the other hand, was impaired by an identity crisis, as a ScreenRant article puts it. The tech giant pushed extras like the Kinect, which few cared for, and announced a controversial digital rights management (DRM) policy that required the console to check in online regularly even for offline play, and also limited the sharing of physical copies. Widespread backlash forced Microsoft to abandon these DRM measures.

Sony, meanwhile, offered a cheaper, more powerful system that also came with a solid library. With only a few heavy hitters like Forza Horizon 2 (2014) and Halo 5: Guardians (still not ported to PC), the Xbox One managed just 51 million units, less than half of the PS4’s 117 million. The Xbox One lost to the PS4 not only because it lacked a good roster of exclusives, but also because it tried to build an ecosystem hostile to gamers.

Microsoft Begins the Move Away from Platform Exclusives

When Microsoft launched its Xbox Play Anywhere program in 2016, it got DRM just right – it allowed users to buy a game once and play it ‘anywhere’ (meaning on both PCs and Xbox consoles, but not PS consoles). It also enabled gamers to carry over their saved games, addons and expansions with them when they resumed playing on a different platform.

The Play Anywhere service was enabled for a slew of games, including Gears of War 4 (2016), Forza Horizon 3 (2016) and ReCore (2016), and the program signalled a major shift from exclusivity to an ecosystem for both Xbox and PC players, even in the face of repeated criticism regarding the Xbox One’s lack of exclusives. The roster of Play Anywhere games has since grown steadily.

After the Play Anywhere program, Microsoft launched the successful Game Pass game subscription service for Xbox in 2017 and PC in 2019, and later added Xbox Cloud Gaming to the Game Pass. Microsoft’s cloud gaming solution would lead to speculation that consoles would soon become unnecessary.

Sony, on the other hand, would take a few years to realise that its time-honoured exclusive-first strategy for its consoles restricted it from reaching the wider audience that Microsoft had attracted with its content services. 

Why are More Games Non-Exclusive Today?

In this section we will look at the various factors that have contributed to the decline of platform exclusives – changing industry attitudes, the release of prominent platform exclusives on PC, and the advent of cloud gaming.

Changing Industry Attitudes

In a June 2020 interview with BBC Click, Xbox head Phil Spencer said: ‘Our strategy does not revolve around how many Xboxes I sell this year.’ He added that Microsoft was focused on delivering services through the Xbox Game Pass. 

Months before the launch of the Xbox Series X|S consoles, Spencer also insisted that the Xbox brand was not built around exclusives. In a July 2020 interview with gamesindustry.biz, the industry veteran said that the idea of locking people away from experiencing games was completely counter to what gaming meant to him.

He also characterised the Series X|S consoles as an upgrade rather than a complete departure from previous consoles, and promised backwards compatibility for thousands of games. Later that year, he announced that Microsoft would release all next gen titles for both Xbox and PC, with first-party titles arriving on the Game Pass subscription. 

Spencer – and Microsoft – could make such claims on the strength of the Xbox Game Pass. By 2020, many outlets were calling it a success, and even The Guardian took note when the service hit 10 million subscribers during the lockdown. With 25 million subscribers as of 2022, the Game Pass has helped Microsoft reach a far wider audience – especially as the Xbox cloud gaming solution is available on multiple platforms, including PC, console and mobile devices. The subscription service also accounts for a significant portion of Microsoft’s gaming revenue. A PC Gamer article aptly summarises Microsoft’s core business strategy: selling game content through subscriptions or direct purchases. The device on which you play does not matter – in fact, Microsoft can shun the platform exclusive, and disregard console sales, simply because it can afford to.

The PS5 and the Xbox Series X|S were released at roughly the same time in late 2020. Sony stuck with its policy of lining up a compelling set of exclusives, which led analysts to predict that the next-gen Xbox consoles would suffer poor sales because of Sony’s superior library. 

Despite the PS5 boasting a strong library at launch, the Xbox Series S actually outsold the Sony console in the 2021 November holiday season – the pandemic-period chip shortage and other factors had led to low PS5 stock, which led in turn to poor sales.

November is a critical month for the game industry: sales spike as Americans buy consoles and games to give as gifts. The Xbox Series S was readily available and also benefited from its low cost and the release of the much-anticipated Halo Infinite and Forza Horizon 5 (2021). Forza Horizon 5 broke Xbox records, passing 10 million players within its first week.

An exclusive strategy built around a console requires the console to be readily available, and the PS5 shortage may have impelled Sony toward a dramatic change in policy. In May 2022, SIE president Jim Ryan stated that the company’s focus on building a strong portfolio of narrative-rich, graphically beautiful single-player games had restricted it to a ‘rather narrow portion of the gaming market’.

A chart from a 2022 Sony presentation showing the rise in PC/Mobile Releases (Courtesy Sony)

By expanding to PC and mobile and offering live services, Sony could move from a single part of the market to ‘being present pretty much everywhere’. In a mid-term strategy meeting, Sony released a presentation detailing its move away from a ‘walled garden’ approach to releasing content on more platforms and mediums. Sony now wants around half of its games to be available on PC and mobile by 2025. 

For Microsoft, platform exclusives are no longer part of its business strategy, which now involves multi-platform subscriptions and streaming services. For Sony, moving into multiple platforms gives it the chance to be present in all gaming segments. In the next section, we will discuss how both industry giants have approached the PC ecosystem.

Microsoft and Sony Double Down on PC Gaming

Microsoft has led the charge in making all its games available on PC – the tech giant’s next-gen games will be available not only on the Microsoft Store, but on Steam as well. Microsoft has also added to its game library by acquiring companies such as Activision Blizzard and Bethesda, both of which have released prominent games on PC and other platforms. With the Game Pass’ cloud solution, such games are available to stream as well, on multiple devices (except PlayStation consoles).

Microsoft is also enticing publishers into the Windows Store ecosystem – in 2021, the tech giant cut its revenue share on Store games from 30% to 12%, encouraging more developers to create games for the digital storefront, which is, of course, available on Xbox consoles as well.

Sony is also catching up, releasing some of its exclusives on PC. In fact, releasing PC ports is seen as one of Sony’s strategies to reach a wider audience and give gamers a taste of the quality they can expect from a PS5 console. In August 2020, Sony released a PC port of the much-lauded Horizon Zero Dawn and ported Days Gone (2019) in 2021. It then made a stunning announcement later that year: 2018’s God of War, one of Sony’s greatest exclusives, would be available on PC by 2022. 

Sony now plans to release four more games on PC this year – Uncharted 4: A Thief’s End and Uncharted: The Lost Legacy will be bundled as the Legacy of Thieves Collection, though no release date has been confirmed yet, while Marvel’s Spider-Man Remastered (2020) will be available on August 12, and Marvel’s Spider-Man: Miles Morales (2020) in fall 2022. Sony’s announcements have led multiple media outlets to declare that exclusivity and the console wars are over for good.

A slide from a 2022 Sony presentation showing growth in revenue via PC ports (Courtesy Sony)

Sony is committed to its PC gambit – in 2021, it launched the PlayStation PC label on Steam and acquired the porting specialist Nixxes Software, later confirming that it would collaborate with Nixxes to bring more PS4 and PS5 titles to PC. In April 2022, Sony put out a job listing for ‘senior director of PC planning and strategy at PlayStation Global’, and a month later, the company forecast that its PC game sales would jump a whopping 275% to $300 million by the end of the next fiscal year.

It is apparent, however, that Sony is making PC ports of its platform exclusives a while after their release on PlayStation consoles, enabling the company to walk a fine line between drawing people toward its consoles, and reaching a wider audience with ports.

The Advent of Cloud Gaming

Cloud gaming services could actually bring exclusives back into play, if they follow a model similar to media streaming services such as Netflix, Prime Video and Disney+. ‘Exclusive’ media franchises such as Stranger Things on Netflix, and the Star Wars and Marvel franchises on Disney+ impel consumers to pay monthly subscriptions for multiple streaming services. 

Enabling gamers to play on any device with a strong network connection is a key value proposition for any cloud gaming service. Xbox’s cloud gaming solution (part of the highest Game Pass tier) is available on Android, Windows, iOS, iPadOS, Xbox One, Xbox Series X|S, and Steam Deck.

With its robust library, Sony’s PlayStation Plus service has been characterised as “the Game Pass alternative that many people believed Sony would never offer”, and as the highest tiers of both the Game Pass and PS Plus feature cloud gaming, gamesindustry.biz argues that a new war, now between competing streaming services, is about to begin. The PlayStation Plus cloud streaming service is available on PC – as such, no streamed exclusive can be considered a platform exclusive.

However, cloud gaming services are still at a nascent stage, and 5G networks – integral to the game streaming experience on mobile platforms – are far from ubiquitous. Moreover, the ‘deprioritization’ of Google Stadia has raised concerns about the viability of cloud gaming.

Conclusion: Is Platform Exclusivity Dying?

During the console wars, platform exclusives, timed or otherwise, played a big role in determining the success of a console. Current trends suggest that the once-mighty platform exclusive may be past its heyday. 

Microsoft appears to have decisively moved away from platform exclusives, especially with the Game Pass, its multi-platform Store, and its cloud gaming solution. But even Microsoft wants to keep some high-profile titles as console exclusives – Xbox chief Phil Spencer has all but announced that The Elder Scrolls VI will be playable only on Xbox or PC, and Starfield, Bethesda’s first new IP in 25 years, is exclusive to PC and Xbox. Responding to the ensuing backlash, Bethesda’s Todd Howard insisted that exclusivity would lead to a better product.

Sony seems unwilling to fully relinquish platform exclusives. It has many scheduled for the PS5, and revealed in 2021 that it had spent $329 mn on third-party exclusives for the console. But a year later, Sony announced that it would spend $300 mn to help first-party studios develop games and release them on multiple platforms, suggesting a shift toward multi-platform releases. Sony is committed to its PC strategy, but it wants to make a compelling case for the PS5 as well – by lining up third-party exclusives, and releasing first-party exclusives on PC a year or so after they arrive on the PS5, it can attract buyers to its consoles while reaching a wider audience with its games.

The platform exclusive might be dying – only for the service exclusive to take its place. Both the Xbox Game Pass and the PlayStation Plus service feature exclusive content that cannot be accessed anywhere else. Such exclusives may not be locked to a single platform – Game Pass exclusives can be downloaded to both PC and Xbox, and its streaming solution brings these exclusives to just about any device except a PlayStation console. The PS Plus’ streaming solution works on the PC as well. 

If subscriptions and game streaming solutions become prevalent, and turn into major sources of revenue, Microsoft and Sony may be at war yet again, on a new front, and may even face challengers offering innovative game content services. The decline of the platform exclusive may well usher in a new era of multi-platform gaming, but gamers would then have to decide between subscription services rather than consoles. Gameopedia’s data curation team amasses information about all sorts of gaming hardware, gaming services and games. Reach out to us to learn about our game data solutions and more.


Why the Gaming Industry is Spending Billions on Consolidation

The pandemic was a difficult time for the world, but a period of growth for gaming. The industry rose to prominence during COVID lockdowns – when millions turned to video games – and its biggest players are now consolidating at a rapid pace through high-profile mergers and acquisitions. 

In this article we will discuss some of the transformative gaming deals of the past to provide context for the current spate of M&A, and delve into the motivations behind today’s high-profile acquisitions – companies are looking to penetrate lucrative gaming platforms, racing to create the metaverse, enriching gaming with cutting-edge technology, and incorporating vast user bases built around successful franchises. We will conclude with a discussion of how the industry appears to be entering a period of mega-consolidation.

How Explosive Growth in Gaming is Powering Consolidation

The industry registered record growth during COVID – people restricted to their homes sought to entertain themselves and stay connected, and gaming enabled them to do both – as more users played multiplayer games, time spent on gaming and related activities increased by 39%. No surprise, then, that the gaming market was valued at $198.4 bn in 2021, and is expected to be worth nearly $340 bn by 2027, with mobile games revenue expected to cross $100 bn by 2023. The gaming industry is larger than the movie and music industries combined.

Video Game Revenue (Source: Statista)

The industry is capitalising on this growth by consolidating: according to investment banking firm Drake Star, M&A activity reached a record $85 bn in 2021 – three times the previous year’s value – as game companies participated in 1,159 deals, including 299 mergers and acquisitions. The value of deals announced in Q1 2021 was $18.2 bn, and deal value mushroomed to $90.4 bn in Q1 2022, on the back of Microsoft’s purchase of Activision Blizzard (the biggest ever acquisition in gaming) and Take Two’s deal with Zynga.

While industry observers agree that explosive growth is powering gaming consolidation today, they also argue that tech giants are fighting over the future of gaming through high-profile deals, and vying for supremacy in the industry. COVID transformed gaming forever, and the aggressive plays by tech powerhouses are the result of this tectonic shift. However, there is more to the consolidation today than just a bid for primacy or an investment in gaming’s future, as we will discuss below. 

Transformative Gaming Deals: Then and Now

Gaming deals then and now have powered the growth of companies via the acquisition of lucrative IPs. The key difference between the deals of the past and the M&A today is the large sums of money involved, and the complex agendas behind consolidation. Some acquisitions before the pandemic period also reveal similar motivations, foreshadowing the billion-dollar deals that characterise gaming acquisitions today. 

The IP-Centric Deals of the Past

Many of the transformative gaming deals in the two decades prior to COVID center around the acquisition of lucrative IPs. 

In 1998, Take Two bought BMG Interactive, then the makers of Grand Theft Auto, for $14.2 mn. The purchase eventually resulted in ‘the most lucrative game in history’ and the most profitable entertainment product of all time – GTA V. GTA’s ‘incredible staying power’ has yielded billions of dollars in revenue.

Grand Theft Auto (Courtesy Rockstar Games Inc)

In 2005 Bandai and Namco merged and became the third-largest gaming company in Japan after Nintendo and Sony, on the strength of their combined franchises. 

In 2011, Tencent became the majority shareholder of Riot Games, the makers of League of Legends, acquiring a 93% stake in the company for $400 million. Four years later it acquired the remaining 7%, owning League of Legends outright at a time when the game was rapidly becoming one of the world’s biggest esports. The game is hugely popular, especially in China, and League of Legends esports events are among the biggest in the world.

League of Legends (Courtesy Riot Games, Inc.)

In 2014, Microsoft acquired Mojang, the creator of Minecraft, for $2.5 bn – a prescient deal, considering that Minecraft would go on to become the best-selling game of all time, with 238 million copies sold across platforms as of 2021.

Amazon’s acquisition of Twitch in the same year for $970 mn is an example of a pre-COVID platform play – Amazon not only had the capacity to sustain Twitch’s rapid growth, but could also enter an entirely new platform: live streaming. On the back of Amazon’s infrastructure, Twitch was eventually able to support a massive number of viewers and broadcast high-profile esports tournaments. In 2021, the Dota 2 International finals drew 1.7 mn unique viewers on Twitch – 62% of the 2.7 mn viewers who tuned into the event.

Tencent’s 2016 acquisition of Supercell is not only one of the few high-stakes deals before COVID: it is also a platform play. The Chinese company made the third-largest gaming deal in history by paying $8.6 bn for a majority stake in Supercell, adding the Finnish developer’s flagship game Clash of Clans to its portfolio. The mobile title raked in nearly $490 mn in 2021 through in-app purchases. Tencent has consistently made the most of the mobile gaming platform – 60% of its $19.3 bn gaming revenue in 2019 came from mobile games, and in Q2 2021, the company’s mobile games raked in $6.3 bn – 30% of all gaming revenue.

The Billion-Dollar Deals Today

Gaming M&A today are notable for the massive, billion-dollar amounts in play, and how they have catapulted companies into market-leading positions. In the past, Tencent’s buyout of Supercell or Activision Blizzard’s $5.9-bn purchase of King were the exception rather than the norm, but today, even the low-profile deals – such as EA’s acquisition of smaller mobile gaming companies – run into billions of dollars. 

Major video gaming acquisitions in the post-pandemic era

In 2020, Microsoft purchased Zenimax for $7.5 bn, giving it control over some of the world’s greatest franchises, such as Fallout, The Elder Scrolls, Doom and others. 

Xbox acquires Bethesda (Courtesy: Microsoft)

Electronic Arts, the third-largest Western gaming company, has largely built its success on desktop and console titles. The studio bought into the mobile gaming market with the acquisition of Glu Mobile for $2.4 bn in April 2021, followed soon after by the purchase of Playdemic for $1.4 bn.

But it was in January 2022 that the gaming industry went on a purchasing spree. Take Two acquired Zynga for $12.7 bn. Microsoft bought Activision Blizzard for a whopping $68.7 bn. When Sony acquired Bungie for $3.6 bn, its stock surged by nearly 6%.

When Microsoft completed its purchase of Activision Blizzard, it announced that it was the third largest gaming company in the world by revenue, behind Sony and Tencent. Microsoft now owns 24 first-party studios, while Sony owns 19.

Xbox First-Party Studios (Courtesy Microsoft)

Various reports have led to speculation that Ubisoft may be the next studio to be acquired. A Bloomberg article reports that a few private equity firms have been scrutinizing the business, and employees have also claimed that various company divisions are under audit. In 2018, the studio fought off a hostile takeover play by French company Vivendi, but Ubisoft’s CEO has said in a recent earnings call that the company is open to offers for purchase, amidst plummeting stock prices. 

In the next section, we discuss the reasons why gaming companies are making such high-profile deals today.

Platforms, Metaverse, Tech and Users: Why the Industry is Consolidating

The industry’s M&A frenzy is not just about capitalizing on growth during the pandemic. By consolidating, companies are attempting to make their mark on new platforms, jump-start the metaverse, enrich gaming with new technologies and assimilate vast user bases spread out among multiple IPs and platforms.

Platform Plays: Staking a Claim in Lucrative Platforms

The high-profile deals made by Take Two, Microsoft and Sony do share a common thread – the acquisition of first-rate IPs. However, Take Two and Microsoft are also making multiple platform plays with their acquisitions.

Take Two grew on the back of console and PC gaming, powered by franchises such as Red Dead Redemption and Grand Theft Auto. It is now looking to grow on the mobile platform by acquiring Zynga. 

Take Two had unsuccessfully tried to build an in-house mobile gaming strategy by acquiring small companies such as PlayDots. By buying Zynga, it not only owns money spinners like FarmVille, but can also learn from Zynga’s expertise in making free-to-play mobile games. Take Two also plans to launch more of its franchises on mobile platforms – will we be seeing a mobile version of Red Dead Redemption in the coming years?

Microsoft’s acquisition of Activision Blizzard not only gives it ownership over World of Warcraft, but also mobile games such as Call of Duty Mobile. It also gains mobile games such as Candy Crush Saga, because Activision Blizzard acquired the social gaming company King in 2015. 

Microsoft makes about $250 million per month from its 25 million Xbox Game Pass subscribers, and Game Pass subscriptions account for 80% of the tech giant’s gaming revenue. With Activision, Microsoft now has 24 first-party Xbox studios and can offer even more content on a subscription service already known as the ‘Netflix of gaming’, reaping ever-higher profits from its Xbox division.

And if that’s not enough, Microsoft boasts a steadily improving cloud gaming solution, Xbox Cloud Gaming. By purchasing Activision Blizzard and Zenimax, the tech giant could offer a wider variety of IPs on the cloud, drawing more users into its game streaming service.  

The Chinese company Tencent has become the world’s largest gaming company by revenue largely on the back of its extensive portfolio of online games, targeted at its domestic audience. Games like PUBG Mobile and Honor of Kings regularly rank as top-grossing mobile titles, and as discussed above, its mobile segment has always been a rich source of revenue. 

Now, it may be looking into desktop and console gaming. In 2021, Tencent acquired 11 gaming companies, including Turtle Rock (developers of Back 4 Blood) and Sumo Group (developers of many Sonic racing games). This is a bid to enter the console and desktop platforms, according to investment banking firm Drake Star, which also expects the gaming giant to acquire many AAA studios and developers this year.

Platform plays are not restricted to high-profile deals – smaller acquisitions reflect similar objectives. Netflix is continually buying indie studios and adding to its gaming library in a bid to enter the mobile gaming sector, and it intends to use data from its gamers to enhance its video catalog as well. In a Gameopedia poll, 33% of the respondents expressed interest in playing games on the Netflix platform, while 26% said it depended on Netflix’s gaming library.

The Metaverse: Jump-Starting the New Internet

The metaverse is a combination of virtual and augmented reality where users fully inhabit a simulated world created by technology, interacting with each other within digital environments. It is considered the future of the internet and cyberspace. It is still in its infancy, and it could take more than a decade for a full-fledged metaverse to come into being.

Horizon Workrooms
Horizon Workrooms (Courtesy Meta)

When Meta (formerly Facebook) acquired Oculus in 2014 – years before the company had sold a single headset – it was staking an early claim on the metaverse. 

Meta has also acquired many successful VR game studios, such as Ready At Dawn (the makers of the award-winning Lone Echo) and Beat Games (makers of the highly successful Beat Saber). These acquisitions are not just about adding IPs – but about offering more reasons to participate in the metaverse and ‘Social VR’. 

Meta’s Oculus has also acquired many companies to accelerate development of the metaverse by improving the VR experience. In 2015, it acquired Surreal Vision, which specialises in recreating real-world 3D environments in virtual spaces. A year later it purchased The Eye Tribe, which specialises in eye-tracking technology that lets users control software with their gaze alone.

Meta’s metaverse play has actually hurt profits: Reality Labs, the division behind augmented and virtual reality, lost more than $10 bn in 2021, despite impressive sales of the Oculus Quest 2 headset that year. This has not deterred investment in the metaverse.

According to Bloomberg, even Microsoft’s acquisition of Activision Blizzard is a metaverse play, because the company’s new IPs give it access to massive gaming communities, which have come closest to creating metaverse-like experiences. Both Roblox Corp, publisher of Roblox, and Epic Games, publisher of Fortnite, have organized immersive in-game experiences such as live concerts. Epic Games pioneered crossover events featuring an eclectic mix of IPs through custom skins. Such events were attended by millions, turning Fortnite into a metaverse community rather than just a battle-royale F2P game.

Technology First: Betting on the Cutting Edge

Sony’s reasons for acquiring Bungie, however, have little to do with the metaverse and much to do with technical know-how: it wants in on Bungie’s capabilities in live-game services and its expertise with cross-platform play. Sony has not delved into live-service games, opting instead for narrative-driven titles and open-world franchises such as Horizon Zero Dawn. Buying Bungie gives Sony the chance to enter the live-service space and move past a console-only strategy – in fact, the company wants around half of its games on desktops and mobile devices by 2025, indicating a significant change in its approach to making and distributing games.

Bungie has mastered the live service model after years of developing the Destiny series, which has drawn 187 million unique users over the course of its existence. Bungie also has pioneered cross-saves and cross-platform play, and has deployed a variety of revenue models to keep Destiny competitive against other F2P and subscription titles. Bungie’s complex infrastructure supports almost a dozen different hardware platforms for Destiny 2, including Google Stadia. Sony could not have chosen a better tech partner. 

Bungie plans to extend Destiny beyond gaming as well – there is speculation that the company is working on a Destiny TV show or movie as part of its plans to create a Destiny Universe, just as Sony adapted The Last of Us into a TV show and Uncharted into a movie. Venturing into new mediums will draw more users toward Sony and Bungie’s franchises, and Sony may well lean on Bungie’s expertise in managing enormous player communities.

Square Enix has sold off its Western titles to focus on technologies such as blockchain, AI and the cloud. Its deal with Embracer Group involves the sale of 50 IPs, including classics like Tomb Raider, Deus Ex, Thief and Legacy of Kain, for just $300 mn. However, the company still has a strong library and has retained some of its Western IPs. Its decision to focus on new tech could enhance its remaining titles as well.

Prior to the sale of its IPs, Square Enix invested in the Ethereum-based game developer TSB Gaming, creator of The Sandbox, a virtual world built on the blockchain, which allows players to build, own and monetize their voxel gaming experiences.  

Square Enix’s president Yosuke Matsuda has emphasized the company’s interest in introducing blockchain technologies into gaming to incentivise gamers and modders, and the gaming firm may also look into including token economies in its titles. The sale of Western IPs will bankroll these endeavours. 

Gaining User Communities and Data

Consolidation buys access to millions of users spread out among the popular franchises of smaller studios, and these user bases yield significant data-driven insights. 

This enables bigger players to devise well-directed promotion strategies across an ever-increasing portfolio of games and first-party studios, thereby saving costs on marketing and ad campaigns. User acquisition and retention is a high priority across gaming platforms and a significant part of a game app’s marketing strategy. Microsoft not only gains lucrative IPs by buying Zenimax and Activision Blizzard, but it also buys into vast user bases and data across platforms such as desktops, consoles and mobile.

User acquisition is vital to the metaverse – big players are already taking notice of the huge user communities of Roblox and Fortnite and their metaverse-like experiences. Having access to data on these communities, their behaviour and their metaverse expectations might be a game-changer for tech giants who are looking to usher in the new internet. 


Gaining users and their data will hence allow gaming giants to fine-tune their consolidation agendas. Indeed, user data and insights into behaviour tie into all three objectives listed above: they will enable companies to maximize gains from new platforms, entice gamers into a new, as-yet-unpolished metaverse, and improve the user experience with new technology.

Impact of Consolidation on Gaming

Today’s gaming deals are not solely about acquiring high-performing IPs – companies have a complex agenda, ranging from platform plays to metaverse investments. In this section, we discuss how gaming could change as a result of such strategic moves.

More monopolies and exclusives: Microsoft’s acquisition of Activision Blizzard was not met with unanimous approval, and raised concerns that the tech company had suddenly risen to a position where it could dictate terms as a monopoly player with the largest number of first-party studios. According to some observers, Microsoft’s position of power over the gaming industry may also draw antitrust scrutiny. Such large players can also decide to keep lucrative IPs as exclusives – note that Xbox chief Phil Spencer has all but confirmed that The Elder Scrolls VI will be available only on Xbox and PC.

Tech giants vs smaller players: Big companies have made gaming acquisitions that will cement their place as market leaders. This will make it harder for smaller studios to compete, though not to create, as they will find buyers in an industry that hungers for new talent and new experiences – Netflix has bought many mobile game studios, and Meta has done the same for VR game studios. However, industry behemoths may not allow such creators to maintain their independence. Studios with successful franchises may succumb to acquisition after a few failed experiments hurt their bottom line, and the dominance of a few large players may stifle innovation and the creative freedom of developers working within the confines of a larger corporate structure. In effect, smaller studios can still create in an era of consolidation – but what they create may be dictated by their corporate paymasters rather than their creative ambition, leading to less variety in gaming content.

Cross-platform play as the norm: The gaming industry has traditionally not been very keen on providing cross-platform playability. However, this is likely to change because of the various platform plays reflected in today’s gaming M&A. Thanks to its acquisitions, Microsoft already boasts a sizeable collection of games playable via the Xbox Game Pass and its cloud platform. Sony is looking to enter the live-service market with the help of Bungie. Take-Two wants to launch mobile versions of its famous IPs. As the number of multi-platform franchises grows, cross-platform play could become the norm rather than the exception. At the very least, gamers can expect an enhanced cross-platform experience going forward.

Mobile gaming as default: Smartphones are already the most popular device for gaming, but the introduction of high-quality IPs on mobile platforms could make smartphones the default device for gamers, or at least provide an experience that transcends casual gaming. As cloud gaming on mobile matures, more subscription-based services featuring high-quality games will be accessible purely via smartphones – without any hardware apart from the mobile screen. In the future, a smartphone streaming to an HD screen or a headset may be all that you require for a high-fidelity, immersive playing experience. Gaming could hence become ‘mobile-native’. 

Forging the Metaverse and its community: Apart from its technological trappings, the metaverse is essentially an online interconnected space where individuals interact with each other. Acquiring gaming companies that have experience dealing with massive user communities, rather than just user bases (contrast WoW players vs Windows users), will enable tech giants to understand how to create the metaverse community. In many respects, WoW can be considered a successful proto-metaverse – from its earliest stages, it featured player-driven economies, social gathering points and virtual real estate. Technology companies are looking to integrate the immersive elements of gaming into the metaverse experience: Roblox, Epic Games’ Fortnite, and GTA Online already have metaverse-like platforms that incorporate player communities into their business models.

Conclusion: How Consolidation will Beget Consolidation

Microsoft’s purchase of Zenimax in 2020 sparked speculation that there would be more such major acquisitions in the future – and this has proved to be true. 

Tech companies and game industry giants have harnessed COVID-period growth to make billion-dollar consolidation deals. To many observers, consolidation is still very much underway, and experts suggest that we are entering an era of mega-consolidation, where consumer demand for cross-platform experiences will drive further M&A. Even tech firms without a presence in gaming may enter the market because of the industry’s massive potential – 26% of the world’s population plays games, and gaming is the most lucrative entertainment industry by a wide margin. 

Acquisitions lead to acquisitions, according to analyst Brandon Ross. Successful publishers and studios can expect to be bought – the industry may become the fief of a few large players, but studios that create quality games will not lack for customers, be they gamers or tech giants. 

Mergers and acquisitions are about spending money to make money. Big players can create or simply buy more content. With more content comes more players, and with more players arises the need for more and diverse content – a virtuous cycle where high-profile acquisitions constantly transform the gaming market so that it can keep pace with rising consumer expectations and the demand for more content. 

The consolidation in the gaming industry may eventually propel it toward a new normal, with a thriving metaverse and a tech-enriched gaming experience that transcends platform limitations. For gamers, developers, independent studios and gaming giants, that is a win-win situation.