New year, new plans! I want to start writing more here, so I'm starting with a dev log. I plan on following a loose weekly release schedule. Should be fun! Since this is my first dev log here, first, a quick summary of the game:
In Decommissioned, you play as the last astronaut to visit the International Space Station, charged with successfully ending the station's 26-year mission by deorbiting it into the Indian Ocean. After powering down the station's aging systems and prepping to leave, a catastrophic failure disables your only means of escape. Now, with your oxygen failing and your orbit plunging deeper into the Earth's atmosphere, you must find a way to repair the systems that once kept you alive, bring them back online, and finally make it back home.
Key features:
- Immersive VR movement system - Players navigate the space station using their hands, just like real astronauts.
- Complex environmental systems - All space station systems are simulated and interoperate in a realistic manner. Players will have to manage power, water, oxygen, and other vital resources to keep themselves and the space station alive.
- Repair and replace gameplay - All systems aboard the station can fail. Players must use the limited resources on board the space station to keep the station running. As supplies run low, players will have to dismantle other systems to keep vital systems working, leading to tough choices about what stays and what goes. Removing the wrong system could have dire consequences!
But enough of that, on to last week!
Movement and Interaction
Work on the movement/interaction system took up most of the week. In the game, I want the player to be able to open compartments and hatches, and slide equipment out of the racks on the walls. The player opens and interacts with these items by grabbing handles attached to them.
I started with compartment doors this week. This may sound like a simple problem, and in a normal standing VR experience it would be. The fact that this game takes place in zero gravity greatly complicates things, since the player moves by grabbing handles as well. This means that the position and rotation of the compartment door is dependent on the player's hand position, and the player's hand position is dependent on the position and orientation of the compartment door. Instead of just using the player's hand position to control the movement of the compartment door, I now additionally have to take into account the velocity of the player, the angular velocity of the door, and whether or not the player has their other hand planted to provide leverage.
I don't have the math quite right yet. I have logic set up to allow the player to open the door if they have leverage, but it doesn't yet account for the velocity of the player or the door. I have some ideas of where to go from here, though, and hopefully I'll have them implemented this week. Once that's working, I should have a good framework to start on the hatches and sliding equipment, since they're all similar problems.
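For what it's worth, the leverage check is conceptually something like the sketch below. Every type and field name here is made up for illustration (this is not the game's actual code), and it has the same gap described above: the player's and door's velocities aren't factored in yet.

```csharp
using UnityEngine;

// Rough sketch of the zero-g leverage rule: pulling a handle without
// bracing should move the player, not the door; only a braced player
// can actually swing the door open.
public class CompartmentDoor : MonoBehaviour
{
    public HingeJoint hinge;     // the door's hinge (useSpring enabled)
    public Rigidbody playerBody; // hypothetical player rigidbody

    // Called each frame while a hand is gripping the door's handle.
    public void HandleHeld(Vector3 handPosition, bool otherHandPlanted)
    {
        if (otherHandPlanted)
        {
            // Player has leverage: drive the hinge toward the angle
            // implied by where the hand has pulled the handle.
            JointSpring spring = hinge.spring;
            spring.targetPosition = AngleTowards(handPosition);
            hinge.spring = spring;
        }
        else
        {
            // No leverage: the pull mostly accelerates the player
            // toward the door instead of opening it.
            Vector3 pull = transform.position - playerBody.position;
            playerBody.AddForce(pull.normalized, ForceMode.Acceleration);
        }
    }

    // Project the hand position onto the hinge plane and return the
    // corresponding door angle (details omitted).
    float AngleTowards(Vector3 handPosition) { return 0f; }
}
```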
Camera and Skybox
One of the first objects I put in the game was a DSLR, so the player could go around taking pictures. It was a good object for testing the system I built for controlling how the player's hands grip items, as well as my system for passing input from the Vive controllers to items. More importantly, it's fun!
Over the past few weeks I've had a nagging problem: the DSLR was not compositing in my skybox. The game renders with a two-camera system: one camera draws the space station, and the other draws the stars and the Earth. This lets me give a sense of scale to the planet without making it massive, and will also let me simulate the station orbiting the planet without actually moving the station. The DSLR, however, was only rendering the station.
Compositing cameras in Unity is normally pretty simple. Each camera has a "depth" property that controls the order in which it renders. Render the skybox camera first, then render your scene with a camera set to clear only the depth buffer, and the two cameras will draw on top of each other.
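The normal setup looks something like this (a minimal sketch; the component and field names are my own, not from the game):

```csharp
using UnityEngine;

// Illustrative two-camera compositing setup: the skybox camera renders
// first, then the scene camera draws on top, clearing only the depth
// buffer so the skybox's colors show through.
public class CompositeCameras : MonoBehaviour
{
    public Camera skyboxCamera; // draws the stars and the Earth
    public Camera sceneCamera;  // draws the space station

    void Start()
    {
        // Lower depth renders first.
        skyboxCamera.depth = 0;
        skyboxCamera.clearFlags = CameraClearFlags.Skybox;

        sceneCamera.depth = 1;
        // Clear depth only; keep the colors the skybox camera drew.
        sceneCamera.clearFlags = CameraClearFlags.Depth;
    }
}
```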
In this case it wasn't that simple, because I was using a render texture to get rendered frames from the DSLR onto the camera's viewfinder. It seems that Unity only writes the color buffer of the camera the render texture is attached to into that texture, leaving out colors from previously rendered cameras; the write appears to happen before composition occurs.
Here's how I fixed it:
- Create a render texture in code (cached, so I'm not allocating a new one every frame)
- Assign the render texture to the skybox camera
- Call Render() on the skybox camera
- Unassign the render texture from the skybox camera and assign it to the DSLR camera
- Call Render() on the DSLR camera
- Unassign the render texture from the DSLR camera, and then make a texture from it.
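The steps above look roughly like this in code (a sketch, not the game's actual implementation: field names, the texture size, and displaying the result on a viewfinder material are all my assumptions):

```csharp
using UnityEngine;

// Illustrative version of the render-texture compositing steps.
public class DslrViewfinder : MonoBehaviour
{
    public Camera skyboxCamera;
    public Camera dslrCamera;   // clearFlags set to Depth
    public Renderer viewfinder; // quad on the back of the DSLR

    RenderTexture composite; // cached so we don't allocate every frame

    void LateUpdate()
    {
        if (composite == null)
            composite = new RenderTexture(512, 512, 24);

        // 1. The skybox camera renders into the texture first.
        skyboxCamera.targetTexture = composite;
        skyboxCamera.Render();
        skyboxCamera.targetTexture = null;

        // 2. The DSLR camera renders next; clearing depth only, it
        //    composites over the skybox colors already in the texture.
        dslrCamera.targetTexture = composite;
        dslrCamera.Render();
        dslrCamera.targetTexture = null;

        // 3. Show the composited result on the viewfinder.
        viewfinder.material.mainTexture = composite;
    }
}
```

Here the render texture is displayed directly on a material; reading it back into a Texture2D is another option if you need a plain texture.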
In other words, render textures respect each camera's clear flags, and happily support compositing. You can see the results in the video at the start of this post!
This is hugely important for me, because I'm going to be using this technique often. The station will have multiple cameras monitoring its exterior, and they will all need to composite in the skybox too.
All told, I'm very excited by my progress this week. The DSLR compositing issue had been bothering me for weeks, so it feels really nice to have it figured out. I feel really close to cracking the compartment door problem as well. Next week should be an interesting one!