As soon as I knew that we would be making a VR game, I got to work coming up with concepts. I had a couple of ideas which were not really viable; after a little filtering, two ideas really stuck with me.
[PRE - PRODUCTION]
The first concept that I really wanted to work on was a very simple game, where you played as a scientist having a conversation with an AI. The AI would have the voice of a little kid, and the scientist would act as its father figure.
In the story of the game, we come to know that there is an incoming nuke, sent to destroy the AI before it does any damage to the world. The conversation that follows revolves around the scientist explaining to the AI why humans are scared of her and why it's not her fault. The game foreshadows what our possible future as a race could be: how advancements in AI technology might one day give birth to a truly artificial being capable of thinking for itself. And how would we treat it? As an equal? Enslave it? Given our propensity for destruction, I would say we would try to erase our mistakes first.
Now imagine a little kid, who has only a basic grasp of the world, understanding that they are about to die, for absolutely no reason, just because of a whimsical "what if". Imagine the thoughts that would course through their mind. And imagine a father sitting with his child, ready to die by her side, trying to explain the injustices of the world. This is the emotion I was going for.
The second concept revolved around the feeling of isolation (but with someone), so the idea of exploring the Moon seemed very attractive to me. The player would be an astronaut sent to the Moon when the Earth is in dire need of resources. Overutilisation of resources, global warming and pollution have brought the world to its knees, and humanity faces an existential crisis of food shortages and global catastrophes. You are sent to the Moon as the advance party, to start terraforming it into a liveable biosphere.
While on the Moon, you would complete daily tasks to progress humanity's transition, and you would hear radio transmissions describing what is happening back on the surface. While you're gone, the world falls apart: nations collapse, entire species go extinct, and people start killing and pillaging for basic necessities.
You were sent to the Moon under the pretence that you would be able to return from your expedition, but because of a mishap on one of the mission days, the ship meant to bring you back to Earth is destroyed. Initially you are promised a rescue, and your mission of research and development turns into a mission of survival until the rescue vehicle arrives. But during that time, the state of the Earth worsens and the above-mentioned calamities come to pass. It becomes a topic of debate whether the cost of building a ship to bring you home is worth it, and in the end a unanimous decision is made to abandon you and your friend in space. Defeated and broken, you have no option but to survive on the desolate Moon and listen to the Earth tear itself apart.
When bad things happen, they tend to happen all at once. You receive news over the radio that a meteor is heading towards Earth and there is nothing anyone can do to avoid it. You are partly glad that the people who betrayed you are all going to die, but devastated that there is no hope left for you and your comrade.
The game ends with the meteor crashing into Earth and fragments of the planet hurtling towards the Moon. The player can see the Earth throughout their ordeal, and subtle changes on its surface serve as environmental storytelling.
I also had a few more ideas that you can find in the PPT provided below.
We were then told of the different constraints our projects would have to adhere to:
Climate change issues and solutions
Future of work, automation and UBI (Universal Basic Income)
Artificial Intelligence and machine learning (not killer robots!)
Future of citizenship, democracy, activism
Food, water and resources issues
Genetically modified organisms (GMOs), lab grown meat and future of food
Synthetic Body Parts: Prosthetics to Electronics
Money, Bitcoin, Blockchain, NFT’s
Surveillance, Facial Recognition and Eye tracking - personal privacy
Corporate/government data capture, data privacy and ethics
Social VR / Metaverse for Community support.
My idea seemed to fit the climate change constraint, and I was already making up my mind to move forward with it when Bhuvanesh approached me wanting to collaborate on the project. So I pitched my idea and we started discussing how to improve it. We chose to add a base-building element to the game, much like Astroneer, giving it more depth.
During our discussion, Dhruv also wanted to join the team, and we went with a trio for our VR project.
I have worked on loads of individual and team projects before as you can see under the WORK section of the website. I am generally extremely particular about what my vision for a project is and I do not start working until all my documentation, ideation and direction is in place. I have already worked on an individual project in the first term. It was fairly well received and everyone enjoyed how I had pushed myself to make a solo 3D project.
The walkthrough of my game for the 1st Semester:
So, I decided to move forward with a more team-based approach. Honestly speaking, there are limits to the amount of work I can accomplish alone, and critical thinking always benefits from more minds rather than a single one. It helps with reflection on ideas: multiple mindsets can shape a game much better and point out flaws that a single individual might be blind to.
I have also worked with Bhuvanesh and Dhruv before, so I have a fair idea of how each of them work and the edge they would be able to bring to the team.
FINAL TEAM IDEA
With these in mind, we went forward with coming up with new ideas or reusing the ones I had pitched. During our discussions, Bhuvanesh came up with a concept in which the player explores different environments that comment on the state of the world. I really loved the art reference he provided us with, and I immediately had multiple ideas for the story and how the game would progress.
This was the reference he had provided, and given the look of the game, I was fairly confident that I would be able to pull it off with volumetric fog and lighting. So I got to work on the story and how the narrative flow would work. Bhuvanesh and I came up with a one-pager which defined how the game works in the form of an overview.
Once we had a basic idea of how the game would work, we got to work prototyping and trying out different things to better flesh out the concept.
I personally got to work on the look of the game.
MY FIRST TASKS
One of the first things I did was figure out how to achieve the visuals we were going for, so I started researching multiple Unity features to understand what would suit our needs best.
I moved forward with making a small list of all the things I had to figure out initially, while Bhuvanesh was busy figuring out how the coding for VR works.
VOLUMETRIC FOG RESEARCH
Following through with the basic research provided me with a base to start with in terms of technical abilities.
FIRST SCENE TEST
I started building the first scene to establish what it would look like and what our workflow to achieve it would be.
This is how one of our trial objects in the scene looked before the lighting and fog volumes.
After having a basic proof of concept, I moved forward with ideating the entire game: the story, mechanics, music, look and feel, and environment ideas. As I mentioned before, I am extremely particular about every moment in a game, so I absolutely need to have a script in place before getting started with work.
Given that my job was Creative Direction, I had to be the one to keep all the visuals and ideas in mind to ultimately make a cohesive project. I also had to come up with a narrative that would keep the player hooked and move the game forward, and design all the interactions for the game with Bhuvanesh by my side: I would pitch him ideas, and he would counter with the viability of executing them in the limited time.
On top of all the constraints, we also put another constraint on ourselves where we wanted to work with other students from the university.
We wanted to work with a music composer as well as a product designer. So we pitched them our ideas, took their inputs and implemented them in our project.
The product designer we were working with, Nishant Bhokare, came up with an idea for a watch that would detect CO2, CO and methane levels in the atmosphere and tell the user about their exposure in the immediate vicinity.
We took this idea and implemented a watch in the game that beeps and changes colour depending on the player character's exposure to pollutants in the given scene. This helped us build tension and a certain level of interactivity throughout the game.
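The watch behaviour boils down to simple threshold mapping. Here is a minimal Python sketch of the idea; the function name, thresholds and return values are illustrative placeholders, not the values we actually tuned in Unity:

```python
def watch_state(exposure, warn=0.5, danger=0.8):
    """Map a normalised 0-1 pollutant exposure to the watch's colour
    and beep rate. Thresholds here are illustrative placeholders."""
    if exposure >= danger:
        return ("red", "fast beep")
    if exposure >= warn:
        return ("yellow", "slow beep")
    return ("green", "no beep")
```

In the game, the equivalent check runs against the current scene's pollutant level, driving the watch's material colour and its beeping audio source.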
I am what you would call an audiophile, and I have a certain sense of direction for different types of audio and how they affect mood. I knew what I wanted for the different pieces of music and just needed help from a professional to bring it to reality. The one person who was extremely capable in this field, and who understood the ideals of the project and the vibe of music required to bring the game to fruition, was Viral Parmar.
WRITING THE SCRIPT
The next thing in line was writing a script; this would cement the entirety of the game idea and allow us to come up with a proper production pipeline. In the script, everything from design and programming to sound, at every moment of the game, is taken into consideration and listed out.
Though it is extremely hard and time consuming, in the length of the project, it ends up being extremely useful and effective.
I have provided the script below for a better understanding of its function. It took me around a week to come up with the script and flesh it out completely.
I had the entire vision of the game in my head, along with how each scene would look and feel. Each scene's art, sound, music, dialogue, design and interactions were charted out and planned. Once I was done writing the script, I had everything in place to start researching and making the GDD.
Both Bhuvanesh and I coordinated and got the GDD and the Research document done.
I have provided the GDD and Research Document below as it would make the blog extremely long if I were to paste each page.
Once all of these factors were in place, I got to working on actually making the project come to life.
For the record, we had trouble working with Dhruv, as he was not working up to his potential. So we had a few issues with our workflow, and the only solution turned out to be sleepless nights and an unhealthy amount of slogging.
One of the scenes in the game required me to make a beach. Adding the beach and its corresponding land mass was the easy bit; making the water body proved to be one of the hardest things in the project, as I had to achieve a semi-realistic ocean surface using shaders on a plane.
So, I got into researching how to make the water body come to life. Shader Graph is one of the most complex packages Unity has to offer; its possibilities are vast, but that same flexibility makes it hard to reach a specific target.
I had a basic idea of what I wanted the water to achieve: it needed basic refraction, foam and wave crests.
Since I had barely any idea of how Shader Graph worked, I got to researching and searching for tutorials that would help us reach our target more easily. I started by understanding the settings: I set the Surface Type to Transparent and kept the Material Type at Standard.
I found that the other settings in the Graph Inspector were not required for the water shader. I also ticked the checkboxes for Tessellation and Double-Sided, which let us move individual vertices. Finally, I enabled Screen Space Reflection so that the water would reflect the directional light falling on it.
Now once the basic set up was done, the first task was to generate waves.
There are different methods for generating waves on a plane; I chose the Gerstner wave method. The Gerstner wave equation provides parameters for steepness, amplitude, direction, frequency and speed over the different axes.
So, I focused on implementing this equation in Shader Graph, starting with a Position node and splitting it into its X and Z components, creating world-space UVs to tile everything.
I added a Time node to drive constant movement at a given speed, and created float properties on the side for the different values the equation requires, as mentioned above.
With that, the basic scrolling UVs were done.
Getting back to the equation: multiplying amplitude by steepness, and the normalised direction by frequency, together with speed and time, completes the setup.
Next came generating the waves along a direction. First, we took the dot product of the position with the direction vector, generating a gradient mask across the plane.
Then we added time to the dot product, so the phase keeps increasing and the waves move continuously.
I then took the cosine of (time + dot product) and multiplied it by the direction values, along with the amplitude, giving us the horizontal offsets for our waves.
Now, all that's needed is the height, which is very simple: it's just the sine instead of the cosine of the phase, multiplied by the amplitude. Feeding these into a Vector 3 gives us the tessellation displacement, which we scaled by a factor of 2.
The next thing to account for is that waves don't move in a single direction; they move in multiple directions at the same time. So I duplicated the previous node group with lower values and a different direction, to emulate waves travelling in multiple directions. These secondary waves make the whole effect more realistic and believable.
All of these nodes together calculate the displacement of the waves.
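For reference, the node chain above maps onto the standard Gerstner wave formula. Here is a rough Python sketch of a single wave; the function and parameter names are my own for illustration, since the real version lives entirely in Shader Graph nodes:

```python
import math

def gerstner_wave(x, z, t, direction, steepness, amplitude, frequency, speed):
    """Displacement of a vertex at (x, z) by a single Gerstner wave at time t."""
    # Normalise the wave direction so only its heading matters.
    length = math.hypot(direction[0], direction[1])
    dx, dz = direction[0] / length, direction[1] / length
    # Dot product of position and direction gives the phase along the wave,
    # with time added so the wave keeps moving.
    phase = (dx * x + dz * z) * frequency + t * speed
    # Horizontal offsets use the cosine, scaled by steepness * amplitude.
    offset_x = dx * steepness * amplitude * math.cos(phase)
    offset_z = dz * steepness * amplitude * math.cos(phase)
    # Height uses the sine, scaled by amplitude.
    offset_y = amplitude * math.sin(phase)
    return offset_x, offset_y, offset_z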
Next came the normal maps. We made a few 2D textures in Photoshop for foam and water ripples. With these set, we could create variables to handle the textures on the water's surface and get a shiny water effect.
Since we want the two ripple textures moving in opposite directions, we added time to the UVs of one sample and subtracted it from the UVs of the other.
Using a Normal Blend node, we combined the two textures. After normalising the ripple strength, we recalculated the normal from the height maps by blending both values and assigned the result to the normal slot.
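A rough Python sketch of what the scrolling and blending amount to. The blend shown is a whiteout-style blend, which is my approximation of what Unity's Normal Blend node does; names are illustrative:

```python
import math

def scrolled_uv(uv, t, speed, forward=True):
    """Scroll UVs over time. One ripple sample adds time, the other
    subtracts it, so the two textures drift in opposite directions."""
    offset = t * speed if forward else -t * speed
    return (uv[0] + offset, uv[1] + offset)

def normal_blend(n1, n2):
    """Whiteout-style blend of two tangent-space normals: sum the XY
    components, multiply the Z components, then renormalise."""
    x = n1[0] + n2[0]
    y = n1[1] + n2[1]
    z = n1[2] * n2[2]
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)
```

Blending two flat normals gives a flat normal back, which is a quick sanity check that the formula preserves undisturbed water.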
Next, we sampled the scene's pixel depth, which gives us the depth behind each pixel on the plane. This will help us generate foam on the edges of the ocean.
Now that the refraction was taken care of, I started focusing on how the foam would work on the surface of the water. I decided to generate foam along the edges of the water body, where the water came into contact with the land mass, as well as at the crest of each wave, making it look more realistic.
This effect was achieved by inverting edge masks and adding a foam texture to the calculated edge mask using the Scene Depth node. The shader ended up with a long list of variables: colour, foam, refraction, depth, amplitude, steepness, frequency, speed, direction, ripple strength, ripple tiling, foam falloff, foam width, foam removal, foam bands, crest size, crest offset and edge hardness. All of these coming together made the ocean possible.
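The edge-foam mask itself comes down to comparing the scene depth behind the water with the depth of the water surface: where the two are close, geometry is touching the water, so foam appears. A minimal sketch, with illustrative names rather than the actual Shader Graph properties:

```python
def edge_foam_mask(scene_depth, surface_depth, foam_width):
    """Foam appears where geometry sits just behind the water surface.
    A depth difference of zero gives full foam; beyond foam_width, none.
    Names are illustrative, not actual Shader Graph properties."""
    diff = scene_depth - surface_depth
    mask = 1.0 - diff / foam_width
    return min(max(mask, 0.0), 1.0)  # clamp to 0-1, like a saturate node
```

The inverted version of this mask is what the foam texture gets multiplied against along the shoreline.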
All these variables can be altered to get varying qualities or types of water, depending on the game or the situation. We had to keep the Shader Graph modular, as I had no clue what would look good in the scene. So, after putting the water shader in the scene, the main focus was making it fit: we tweaked the values to our liking until we were happy with how the ocean looked in contrast to the scene.
After completing the shader graph, the next big thing I had to tackle was the effects in the VFX Graph.
My first task was to make fireworks, so I started with a simple spawn system which would initialise particles at short intervals within set boundaries simulating a square box.
Each particle got a velocity on the Y axis and a lifetime controlling how long it would last.
On update, I added a trigger event where the death of a particle spawns another system, set up as an explosion from that point. I also added a colour with extremely high emission to flash on the screen right before the firework explodes.
The explosion was achieved using the Velocity from Direction & Speed (Random Direction) block.
This same event had an Output Particle Quad, which let me play with the Colour over Lifetime of the particles, making them look more like colour-changing fireworks. We gave these explosions a longer life and added linear drag, which gives the effect of lights falling.
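Conceptually, the trigger-on-die setup behaves like this toy Python sketch. It is not the actual VFX Graph, just the shape of the logic: a rocket rises for its lifetime, and its death event spawns a burst of sparks with random directions and drag. All numbers are illustrative:

```python
import random

def simulate_firework(lifetime_steps, spark_count):
    """Rocket rises for its lifetime, then its death event spawns an
    explosion burst in random directions (mirrors the trigger-on-die
    idea; numbers are illustrative)."""
    height = 0.0
    for _ in range(lifetime_steps):
        height += 1.0  # constant upward velocity on Y, one unit per step
    # On death: spawn sparks at the rocket's final position, each with a
    # random direction and a drag factor so they drift and fall like lights.
    sparks = []
    for _ in range(spark_count):
        direction = (random.uniform(-1, 1),
                     random.uniform(-1, 1),
                     random.uniform(-1, 1))
        sparks.append({"pos": (0.0, height, 0.0),
                       "dir": direction,
                       "drag": 0.9})
    return height, sparks
```

In the graph, the GPU event replaces the explicit loop: the child system receives the dead particle's position and handles the sparks itself.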
The next VFX we had to focus on was the cutting of a tree. This was another complex effect, as it had to take into account where the player is hitting.
So we started by tying the spawn rate to an exposed Boolean, which let me control the VFX without any trouble.
Then we focused on getting the smoke velocity to behave as required. We wanted it to be easily adjustable, so we made all of the values exposed variables.
Next, we set the spawn position of the particles each time the player hit a tree, so the particles would spawn at the location the axe struck the tree at any given moment.
I then used Unity's built-in particle mesh system. As we did not want lots of particles in the scene causing lag, we chose to spawn a few meshes in their stead. This brought the lag down considerably but increased the complexity: we had to sit down in Blender and make a few basic meshes for the system to work.
Next, we moved on to the most complex effect in the game: the nuke explosion.
In this effect a lot of things are working in harmony to make the nuke come to life. As this is the last scene of the game, we had to make it as appealing as possible. It also follows the particular low poly art style of the game. And introduces its own unique flair given the sheer scale at which it is showcased.
It’s a culmination of a lot of different effects, shader graphs and particle effects coming together to form one giant effect that is the nuclear explosion.
One of the first things I did was make shaders for the different smoke particles that radiate out of the bomb, with smoke effects rippling through them. I also made a tornado effect which circles around the bomb, giving it an added effect of zero pressure when the bomb explodes.
The next few things were a culmination of multiple particle effects all responsible for different segments of the explosion.
I also made a few assets in Blender to use with the nuke's mushroom cloud effect. The bomb grows over time as the explosion spreads out and covers the player's screen. The main particle effects form the nuke's mushroom cloud as well as its stem.
These two particles were made using assets which give the bomb its classic funnel cloud and mushroom top. The other particles are responsible for multiple shockwaves, each handling a different part of the bomb: some at ground level, one at the middle level, and the last showing an impact on the clouds.
The remaining particle effects show debris flying, plus a flash for the initial moments of the explosion. These elements create the base of the nuclear explosion; as the bomb was an extremely complex culmination of particles, I thought it best to build it with the basic particle system rather than Unity's VFX Graph. The nuke also has a few funnel effects that cement the field of debris flying from the ground to the sky.
The nuke's simulation speed is matched to the theatrics of the final climax; it is built to last exactly the duration of the final cutscene.
The nuke also has a shockwave that flies towards the player over time, driven by a force acting on the particle system. This shockwave is the last thing the player sees when the nuke goes off. The shockwaves also follow the music's beats and are simulated to fit the theatrics of the scene.
Given the game's unique vision, achieving that specific art style and lighting took a lot of research into what the Unity engine could reproduce. The only way to recreate what we were imagining was through a clever mixture of lighting, fog and VFX. The first thing we focused on was lighting, as it would play a major role throughout the game in setting the mood of each scene. Each scene follows a specific colour theory: hazardous scenes sit towards the yellow and red end of the spectrum, while safer, more informative scenes use mellower pastel colours.
The atmospheric nature of the game would not allow for only the use of lighting to generate mood. Through further research we understood that we would have to use lighting in combination with fog to generate the type of ambience we needed to bring the game to life.
Including fog gave rise to another problem: Unity does not support volumetric fog in its built-in render pipeline. So, we had to get back to Google and research which render pipeline would support the kind of volumetric fog we required. After scouring the internet, we found that Unity's High Definition Render Pipeline (HDRP) could do what we needed to set the atmosphere for the game.
Our first test scene included a small fog volume using volumetric lights, with added colour to achieve the desired Islands Non-Places look.
Even with all these additions we could not quite achieve what we wanted. So, we had to delve deeper into the unity lighting system and volumetric fog system.
Let's start with the directional light. This provided us with the most realistic lighting source we could ask for. We had to keep the lighting mode Realtime, as the player would be moving large objects around the scene. Given the dense fog and heavy atmospherics, the player would never see the celestial body itself, so we did not bother with its attributes; what mattered most to us was its emission. We chose a colour filter for the light and set its intensity to the desired amount.

Since the fog in the scene was volumetric, the light's intensity didn't need to be too high. We enabled Volumetrics for the light and raised its multiplier to varying degrees to achieve the diffused look we were going for. Enabling volumetrics introduces a Shadow Dimmer component; given how volumetric fog interacts with lights, we had to decrease the strength of the shadows so they wouldn't bleed too much through the fog. We also set shadow maps and contact shadows to update every frame, so that any interaction the player performs is reflected in the shadows cast in the scene.

Once the base lighting was set up and we had a basic understanding of it, all that was left was to place the directional light in the different scenes and change the filter colour to our liking. The variance between scenes also forced us to change the volumetric multiplier as well as the shadow dimmer, depending on each scene's size and requirements.
FOG AND VOLUME
Next I moved on to the volumetric fog. The fog setup has a number of overrides, including Sky, Shadowing, Ray Tracing, Post-processing, Material, Lighting and Exposure. Each of these overrides has its own functions and components capable of influencing the scene in its own way. Given the nature of our game, we had very little online material to lean on, so I had to add each override and learn its effect on the scene by hand. Through trial and error, I realised what I needed for the fog volume: a local Volume component, a Visual Environment override, a Fog override, and a box collider. These formed the base of the fog volumes that fill the scenes and, in cohesion with the directional light, bring to life the vision we were after.

The volume's mode was set to Local, which let us use a simple box collider with a unique effect inside its area. The Visual Environment override let us choose how much the skybox affected the fog (which was minimal to none).

The next and most important part was the fog itself. We set the Volumetric Fog toggle to true and set its colour to the desired value. Max Distance and Maximum Height were set at a respectable distance relative to the camera, as these would be the base values for all scenes. The only values varying per scene would be the Attenuation Distance and the Volumetric Fog Albedo.
Each scene combines varying lighting effects and fog volumes to create distinct fog volumetric profiles seen on the screen.
We genuinely wanted to hire a professional to voice our narrator. So I went on Mandy and searched for days on end for a very particular voice, much like David Attenborough's. I got lucky when we found a voice actor named Tim Nice.
Given the pitch of the project, a lot of people on Mandy showed interest but we were most keen on working with Tim because his voice was absolutely perfect for our narrator.
We also had a few extra lines which needed to be acted by other people, so we chose a few of our friends to enact them. Because of this reason, we got to use the Foley studio to its full potential.
Here are a few snippets of the Audio that was used in the game.
The game has a total of 54 individual dialogues, so you can imagine the amount of work put in, as there were a lot of takes and variations to work with.
The next thing I got to was the music for the game. The music is built to continuously follow a metronome, which creates a sense of urgency, much like how the climate crisis needs urgent resolution. We started with a slow metronome at the beginning of the game and slowly progressed to faster versions. The fastest metronome mimics the passing of each second, a real-world time conversion driving home the impact.
Each of the compositions has a unique background sound attached to it: the beach has the ocean as its background, the landfill has machines whirring, and the city has a siren playing when the bombs go off. All of these have been integrated into the game to make each scene come alive.
The game has a total of 12 different soundtracks made over the course of two weeks. I had to juggle recording audio, directing the music pieces, and making final tweaks and adjustments to the VFX and shaders for the game.
I have provided a few snippets of music below:
We also made all of our friends playtest the game before submissions and took feedback into consideration. Though the game might have a few bugs, there is nothing game breaking.
[POST - PRODUCTION]
This is what our final game looks like with its logo and a few screenshots, there's also a walkthrough of the game below:
[Please keep in mind that the recording is not up to par and to get a real feel of the game, you need to play the game]
My final Critical Reflection document is provided below:
The game can be downloaded and played from Itch.io