How do you create an AAA game?

(Guilty Gear Xrd -SIGN-. On the left, the 3D model in the modelling software; on the right, how it looks in game)

Every year thousands of games are published, and players follow the videos, trailers, news and images posted on YouTube, forums and social media. We then dedicate hours and hours of our lives to playing them and enjoying the stories they tell, but few people know how a game begins: how an empty project slowly but surely takes shape until it ends up as a product offering hundreds or thousands of hours of fun. We want to shine a light on this creation process, because it can be almost as interesting as the game itself.

Currently, every AAA game is built from several parts: the graphic assets, the audio assets and the game engine on which everything else is mounted.

Graphics

The graphics are surely the part that makes the biggest first impression, and the one that has to be polished to the extreme, whether the goal is photorealistic images like those of Battlefield V or cel shading like that of Dragon Ball FighterZ. Poor graphics usually lead to an almost automatic rejection of the game by the public.

Artists have dozens of tools available to create a game’s models and textures, the best known being Maya, 3DS Max, ZBrush and Photoshop. In these programs they not only model and animate the 3D assets that are later imported into the graphics engine; they also work on lighting using normal maps, apply textures to the models, and write shaders, small programs that modify how the graphics card (GPU) processes vertices and pixels.
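To give an idea of what a pixel shader does with a normal map, here is a minimal sketch in plain C++ (not real shader code, and every name in it is a placeholder invented for illustration): the texel read from the normal map replaces the geometric normal, so flat geometry reacts to light as if it had fine relief.

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// A normal map stores each component in [0,1]; shaders remap it to [-1,1].
Vec3 decodeNormal(Vec3 texel) {
    return normalize({ texel.x * 2.0f - 1.0f,
                       texel.y * 2.0f - 1.0f,
                       texel.z * 2.0f - 1.0f });
}

// Basic Lambert (diffuse) lighting using the decoded normal instead of the
// geometric one: this substitution is the trick that fakes surface detail.
float lambert(Vec3 normalMapTexel, Vec3 lightDir) {
    Vec3 n = decodeNormal(normalMapTexel);
    return std::max(0.0f, dot(n, normalize(lightDir)));
}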

The animation of these models can be done by hand, frame by frame, which achieves spectacular results but takes a lot of time. Other methods include motion capture, recording real actors in spaces fitted with 3D cameras, or physical simulation.
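Whatever the source of the frames, hand animation or motion capture, at runtime the engine ends up interpolating between stored keyframes. Here is a minimal sketch, with made-up Keyframe and samplePosition names, of the linear blend between the two keyframes surrounding a given time.

#include <cstddef>
#include <vector>

struct Keyframe {
    float time;          // seconds into the animation
    float position[3];   // position of a joint at that instant
};

// Writes into 'out' the joint position at time 't' by linearly blending the
// two keyframes that surround it. Assumes the track is sorted and non-empty.
void samplePosition(const std::vector<Keyframe>& track, float t, float out[3]) {
    if (t <= track.front().time) {                       // before the first key
        for (int i = 0; i < 3; ++i) out[i] = track.front().position[i];
        return;
    }
    for (std::size_t k = 1; k < track.size(); ++k) {
        if (t <= track[k].time) {
            const Keyframe& a = track[k - 1];
            const Keyframe& b = track[k];
            float w = (t - a.time) / (b.time - a.time);  // blend weight 0..1
            for (int i = 0; i < 3; ++i)
                out[i] = a.position[i] * (1.0f - w) + b.position[i] * w;
            return;
        }
    }
    for (int i = 0; i < 3; ++i) out[i] = track.back().position[i]; // past the end
}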

(Ellen Page and Willem Dafoe in motion capture sessions for the game Beyond: Two Souls, developed by Quantic Dream for Sony PlayStation)

The soundtrack matters just as much

For audio, tools like Wwise, Fabric or Spatial Workstation are used. The sounds and music can seem like one of the least important parts of a game’s development, but people like Akira Yamaoka (Silent Hill), Nobuo Uematsu (Final Fantasy) or Gustavo Santaolalla (The Last of Us) would disagree, given that they have created some of the most famous soundtracks in gaming.

Once the graphics and audio have been created, they are handed over to the game engine, the element that makes everything come together and work, like the conductor of an orchestra. Of the engines available for licensing to individuals and other studios, two are the market leaders: Unreal Engine, from Epic Games (the creators of Fortnite), and Unity.

(Unreal 1/Epic Games)


Unreal Engine was born in 1998, when Epic developed the classic Unreal. In the age of the 3DFX Voodoo and Glide, Epic created a huge science-fiction FPS with spectacular graphics for its time, and with the game’s popularity it had little choice but to share the tool set, which became the seed of one of the most versatile engines around. Once Unreal was finished, other classics like Unreal Tournament and Unreal 2 were born from that engine, and it evolved little by little until it became a tool used by studios all over the world. The four versions released so far have given us games like BioShock, Deus Ex, Splinter Cell, Army of Two and Batman: Arkham Asylum.

(We recommend taking a look at the complete list of titles developed with the latest versions of Unreal Engine)

(Blacksmith, an impressive demo made with Unity)

For its part, Unity was born as a Mac-only game engine, but over time it has grown its user base and arrived on every device. It is currently the engine most widely used by indie developers and for mobile development. Among the games built with Unity are Overcooked, Hollow Knight, Gris, Cuphead and Yooka-Laylee (here too we recommend looking at the list of developed games, given that many indie gems have been made with it).

Big studios like EA and Ubisoft, and other teams developing AAA games, have created their own engines, with the same quality as Unreal Engine or Unity, or even better. Electronic Arts bought DICE in 2006 and with it all of its technology, including the Frostbite engine. With each game, EA has become more independent of Epic Games and Unreal Engine, focusing on developing DICE’s engine, which we have been able to enjoy in games like Battlefield 4, Battlefield V and Mass Effect.

(Assassin’s Creed / Ubisoft)

Ubisoft decided, with the first Assassin’s Creed, to create its own technology. At first it was called the Scimitar Engine, but it would later be renamed Anvil, an engine that is currently in its second version. Since its creation, all of Ubisoft’s studios have used it, in games like For Honor, Rainbow Six or Assassin’s Creed Odyssey. Only one of Ubisoft’s studios works with another engine: Massive Entertainment, the creators of The Division. When they began the titanic project of creating an online game with those characteristics (a TPS mixed with an RPG in an open online world), they decided to build their own technology from scratch, an engine called Snowdrop.



(The Division / Ubisoft - Massive Entertainment)

These engines are tools built on top of third-party APIs in order to play audio, draw graphics, read input from controllers or communicate over the internet for online games. The most famous are DirectX, created by Microsoft for all kinds of multimedia applications on Windows, and OpenGL/Vulkan, graphics APIs maintained by the Khronos Group, a consortium of companies such as Epic, nVidia, Apple, Valve and AMD, alongside audio APIs such as OpenAL.
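As a taste of how low level those APIs are compared with an engine, here is a minimal sketch that opens a window and clears it every frame using OpenGL through the GLFW library (our choice of windowing library for the example; engines use many different layers for this). Everything an engine draws ultimately passes through calls of this kind.

#include <GLFW/glfw3.h>   // windowing/input library that exposes OpenGL

int main() {
    if (!glfwInit()) return 1;                       // start the library
    GLFWwindow* window = glfwCreateWindow(1280, 720, "Tiny renderer", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);                  // OpenGL calls now target this window

    while (!glfwWindowShouldClose(window)) {
        glClearColor(0.1f, 0.1f, 0.2f, 1.0f);        // dark blue background
        glClear(GL_COLOR_BUFFER_BIT);                // erase the previous frame
        // ...an engine would draw the whole scene here...
        glfwSwapBuffers(window);                     // show the finished frame
        glfwPollEvents();                            // read input from controls
    }
    glfwTerminate();
    return 0;
}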

The process these engines follow to represent a game is very complicated, but in broad terms it consists of the following:

  • Small audio clips (gunshots, explosions, background noise…) are kept compressed in RAM so that playback is quick and immediate. Soundtracks, on the other hand, are usually streamed from disk, since they don’t need the same responsiveness as other sounds.

  • 3D models are rasterised from triangles into pixels using vertex shaders and then pixel shaders. The former transform the models from 3D into the 2D plane of a screen or television (see the first sketch after this list); the latter then give each pixel its colour.

  • Once all of the resources are loaded, small programs called scripts respond to each of the events that make up the game’s logic: the user presses something on the controller, data arrives from other online players, the game’s AI makes enemies attack the player... (see the second sketch after this list).
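A sketch of the first stage, the vertex side of rasterisation: a 3D position is multiplied by a projection matrix, divided by w (the perspective divide) and mapped to pixel coordinates. The matrix, the resolution and all of the names are illustrative choices, not taken from any particular engine.

#include <cmath>
#include <cstdio>

struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[16]; };   // column-major 4x4 matrix

// Classic symmetric perspective projection (OpenGL-style).
Mat4 perspective(float fovY, float aspect, float zNear, float zFar) {
    float f = 1.0f / std::tan(fovY * 0.5f);
    Mat4 p = {};
    p.m[0]  = f / aspect;
    p.m[5]  = f;
    p.m[10] = (zFar + zNear) / (zNear - zFar);
    p.m[11] = -1.0f;
    p.m[14] = (2.0f * zFar * zNear) / (zNear - zFar);
    return p;
}

Vec4 mul(const Mat4& a, Vec4 v) {
    return {
        a.m[0]*v.x + a.m[4]*v.y + a.m[8]*v.z  + a.m[12]*v.w,
        a.m[1]*v.x + a.m[5]*v.y + a.m[9]*v.z  + a.m[13]*v.w,
        a.m[2]*v.x + a.m[6]*v.y + a.m[10]*v.z + a.m[14]*v.w,
        a.m[3]*v.x + a.m[7]*v.y + a.m[11]*v.z + a.m[15]*v.w,
    };
}

int main() {
    Mat4 proj = perspective(1.0f, 16.0f / 9.0f, 0.1f, 100.0f);
    Vec4 vertex = { 1.0f, 0.5f, -5.0f, 1.0f };       // a point in camera space
    Vec4 clip = mul(proj, vertex);
    // Perspective divide: clip space -> normalised device coordinates [-1,1].
    float ndcX = clip.x / clip.w, ndcY = clip.y / clip.w;
    // Viewport transform: NDC -> pixel coordinates of a 1920x1080 screen.
    float px = (ndcX * 0.5f + 0.5f) * 1920.0f;
    float py = (1.0f - (ndcY * 0.5f + 0.5f)) * 1080.0f;
    std::printf("vertex lands on pixel (%.1f, %.1f)\n", px, py);
    return 0;
}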
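And a sketch of the last part, scripts reacting to events: here they are simply C++ callbacks paired with an event name, an invented stand-in for the Lua, C# or Blueprint scripts a real engine would run.

#include <cstdio>
#include <functional>
#include <string>
#include <utility>
#include <vector>

struct Event { std::string type; };  // e.g. "button_pressed", "enemy_spotted"

int main() {
    // Each script registers the event it cares about.
    std::vector<std::pair<std::string, std::function<void()>>> scripts = {
        { "button_pressed", [] { std::puts("player script: fire weapon"); } },
        { "enemy_spotted",  [] { std::puts("AI script: attack the player"); } },
    };

    // A pretend stream of events; in a real engine these would come from the
    // input system, the network layer and the AI each frame.
    std::vector<Event> frameEvents = { {"enemy_spotted"}, {"button_pressed"} };

    for (int frame = 0; frame < 2; ++frame) {          // two fake frames
        for (const Event& e : frameEvents)
            for (auto& s : scripts)
                if (s.first == e.type) s.second();     // dispatch to the script
        // ...update animation, physics and rendering here...
    }
    return 0;
}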

The second point, how models are represented on screen, is the part that has changed the most over the last few decades. Developers can now use vertex and pixel shaders to create all kinds of graphic effects, something that was impossible just over a decade ago, when the graphics pipeline was fixed. With this freedom you can create effects for photorealistic graphics such as subsurface scattering, parallax mapping or, possibly one of the greatest advances of recent years, Physically Based Rendering (PBR), which uses physical simulation to make the materials of a 3D model react to light.
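In practice, PBR means evaluating a physically based shading model for every pixel. Below is a minimal sketch of two of its usual ingredients, the GGX microfacet distribution and Schlick’s approximation of the Fresnel term; the function names are ours and the model is heavily simplified, so treat it as an illustration rather than any engine’s actual shader.

#include <algorithm>
#include <cmath>

// GGX / Trowbridge-Reitz normal distribution: how tightly the microscopic
// facets of a material are concentrated around the half vector.
// NdotH is the dot product of the normal and half vector; roughness is [0,1].
float ggxDistribution(float NdotH, float roughness) {
    const float pi = 3.14159265f;
    float a  = roughness * roughness;
    float a2 = a * a;
    float d  = NdotH * NdotH * (a2 - 1.0f) + 1.0f;
    return a2 / (pi * d * d);
}

// Schlick approximation of Fresnel: how much light reflects at grazing angles.
// f0 is the base reflectivity of the material (roughly 0.04 for non-metals).
float fresnelSchlick(float VdotH, float f0) {
    return f0 + (1.0f - f0) * std::pow(1.0f - std::clamp(VdotH, 0.0f, 1.0f), 5.0f);
}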



Recently, thanks to nVidia’s new RTX graphics cards, it has become possible to implement ray tracing in real time inside games, with Battlefield V being the first to do so. Ray tracing consists of tracing rays that simulate light and calculating their bounces in order to represent refraction and reflection on the materials of a 3D scene.
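The core of the technique is small enough to sketch: shoot a ray per pixel, find the nearest surface it hits, and bounce it to gather reflections. The toy ray/sphere intersection and mirror reflection below are purely illustrative and bear no relation to how the RTX hardware is actually programmed.

#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };
Vec3 operator-(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vec3 operator*(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray    { Vec3 origin, dir; };            // dir assumed normalised
struct Sphere { Vec3 center; float radius; };

// Returns the distance along the ray to the first hit, if there is one.
std::optional<float> intersect(const Ray& ray, const Sphere& s) {
    Vec3 oc = ray.origin - s.center;
    float b = dot(oc, ray.dir);
    float c = dot(oc, oc) - s.radius * s.radius;
    float disc = b * b - c;
    if (disc < 0.0f) return std::nullopt;       // the ray misses the sphere
    float t = -b - std::sqrt(disc);
    if (t < 0.0f) return std::nullopt;          // the hit is behind the origin
    return t;
}

// Mirror reflection of the ray direction around the surface normal:
// this is the ray that gets traced again to render reflections.
Vec3 reflect(Vec3 incoming, Vec3 normal) {
    return incoming - normal * (2.0f * dot(incoming, normal));
}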

One of the most important areas in recent years has been the development of online gaming. To offer matches against other players, games need servers responsible for passing information between the different players, and these can be dedicated servers or peer-to-peer (p2p). Dedicated servers are made available to users by the developers themselves and normally offer low latencies (though not always, which is why some games suffer from lag). With p2p, the player who creates the match hosts the session and shares it with everyone else in their game. The difference between the two is the cost of maintenance: dedicated servers represent a significant expense for companies, but they offer much better performance than the p2p alternative, so games that opt for them are often better regarded by users.
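At its simplest, a dedicated server is a program that receives a player’s state over the network and relays it to the others. Here is a toy, POSIX-only sketch; the port and the echo behaviour are invented for illustration, and real games add sessions, prediction, cheat protection and much more.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);          // UDP socket
    sockaddr_in addr {};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(7777);                      // arbitrary game port
    bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    char packet[512];
    sockaddr_in from {};
    socklen_t fromLen = sizeof(from);
    // Receive one state packet from a player and send it straight back; a real
    // server would forward it to every other player in the match instead.
    ssize_t n = recvfrom(fd, packet, sizeof(packet), 0,
                         reinterpret_cast<sockaddr*>(&from), &fromLen);
    if (n > 0)
        sendto(fd, packet, n, 0,
               reinterpret_cast<sockaddr*>(&from), fromLen);
    close(fd);
    return 0;
}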

This summary is just a brief outline of the very long process of creating an AAA game, a task that usually takes between three and four years and involves teams of 100 to 300 people, with some extreme cases of high-profile games involving up to 500.

