Home of the future (work in progress)
“Home of the future” is a home automation showcase demo for the Oculus Rift which is still in development. It was initially created as a demo for our partner in Dubai.
The interactive demo is powered by Unity 5. This blog post shares some of the insights and thoughts we gathered along the way with the great community Unity has.
For a home automation showcase, turning lights on and off is the most basic and essential functionality. This means all lighting and global illumination has to be done in realtime.
Because there can be many lights, we switched to deferred rendering, where performance is not affected by the number of lights alone, but by the number of lit objects (or rather, the rendered pixels of those objects) combined with the number of lights affecting those objects in view.
For physically based rendering with an HDR setup it’s important to switch to linear lighting. It gives much more accurate lighting results and the gamma space conversion is unnecessary in HDR as all rendering is performed into floating point buffers, which allow larger precision and larger values.
Using HDR lighting means color values can exceed 1, so there is no fixed range within which a pixel ends up being rendered to your screen. But because the screen expects a 0 to 1 range, any value above 1 would simply be clipped. This is where tone mapping comes in: it compresses the dynamic, non-fixed range from black up to the brightest color value currently on screen. This way the sun, for example, will be much brighter than a small LED light instead of equally bright, without blowing out your screen.
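As an illustration, the simple Reinhard operator shows how such a compression curve can work (a hypothetical sketch for clarity; Unity's tonemapping image effect offers more sophisticated curves):

```csharp
// Reinhard tone mapping: compresses any HDR value into the 0..1 range.
// c / (1 + c) maps 0 -> 0, 1 -> 0.5, and approaches 1 as c grows,
// so a bright sun and a small LED both fit on screen without clipping.
static float Reinhard(float hdrValue)
{
    return hdrValue / (1.0f + hdrValue);
}

// An LED at intensity 1 maps to 0.5; a sun at intensity 100 maps close to 1,
// yet the sun still ends up visibly brighter than the LED.
```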
Reflection probes manager
To make things shiny, we have to use reflection probes. The downside is that reflection probes are very expensive: each probe creates a 360 degree snapshot by rendering a cubemap from six cameras surrounding the probe.
But since the lights can change, the probes have to be updated to reflect the current lighting, and we also need more than one probe to get accurate reflections. Updating the probes once at startup therefore isn’t an option.
For VR it is important to stay above 70 frames per second, so updating them all at once every frame isn’t an option either, as the framerate would drop fast.
That’s where our reflection probe updater script comes in handy. It’s a small script which we place on every reflection probe. We can tell each probe to refresh every x seconds to follow the slow daylight changes, and trigger an immediate update when a light switch is toggled.
To avoid a performance drop we use the built-in option to render each side of the probe spread over several frames, and on top of that, the script makes the probes wait for each other to keep the performance impact at a minimum.
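A minimal sketch of what such an updater can look like (hypothetical names, not our exact script), assuming a ReflectionProbe set to Realtime with refresh mode ViaScripting:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Attach to every reflection probe GameObject.
[RequireComponent(typeof(ReflectionProbe))]
public class ReflectionProbeUpdater : MonoBehaviour
{
    public float updateInterval = 10f;  // seconds between slow daylight refreshes
    static bool probeBusy;              // shared flag so probes wait for each other
    ReflectionProbe probe;
    float nextUpdateTime;
    int renderId = -1;

    void Awake()
    {
        probe = GetComponent<ReflectionProbe>();
        // Spread the six cubemap faces over several frames.
        probe.timeSlicingMode = ReflectionProbeTimeSlicingMode.IndividualFaces;
    }

    void Update()
    {
        // Release the shared lock once our time-sliced render has finished.
        if (renderId != -1 && probe.IsFinishedRendering(renderId))
        {
            renderId = -1;
            probeBusy = false;
        }

        if (Time.time >= nextUpdateTime && !probeBusy)
            RequestUpdate();
    }

    // Called periodically, or directly when a light switch is toggled.
    public void RequestUpdate()
    {
        probeBusy = true;
        renderId = probe.RenderProbe();
        nextUpdateTime = Time.time + updateInterval;
    }
}
```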
We will eventually extend the script with trigger zones to update the probes based on your position as well. For example, when standing in the living room and not facing the hallway, we can update the hallway probes at a lower frequency and priority than the ones in the living room.
As we have to use realtime lighting anyway, we can also have a full day and night cycle with a moving sun. For this we use a simple script that changes the directional light rotation based on time. To speed things up, we multiply the elapsed time by the number of virtual seconds per real second (one minute in VR per real second, like GTA V, is a good, acceptable speed).
We use an AnimationCurve to tweak the directional light intensity based on the time of day. This is necessary to get a realistic sun, as morning sunlight has much more atmosphere to travel through before it reaches your eyes than a noon sun. The sky itself uses the default Unity 5 procedural skybox.
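A sketch of such a day/night script might look like this (the rotation values and class name are illustrative assumptions, not our exact implementation):

```csharp
using UnityEngine;

public class DayNightCycle : MonoBehaviour
{
    public Light sun;                           // the directional light
    public AnimationCurve sunIntensity;         // tweaked per time of day (0..24 h)
    public float virtualSecondsPerSecond = 60f; // one VR minute per real second

    float timeOfDay = 12f * 3600f;              // virtual seconds since midnight

    void Update()
    {
        // Advance virtual time, wrapping around at 24 hours (86400 s).
        timeOfDay = (timeOfDay + Time.deltaTime * virtualSecondsPerSecond) % 86400f;
        float hours = timeOfDay / 3600f;

        // Rotate the sun a full 360 degrees over one virtual day:
        // at 06:00 it sits on the horizon, at 12:00 it is overhead.
        sun.transform.rotation = Quaternion.Euler((hours / 24f) * 360f - 90f, 170f, 0f);

        // Morning/evening light travels through more atmosphere, so it is dimmer.
        sun.intensity = sunIntensity.Evaluate(hours);
    }
}
```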
All materials use physically based shading. Most of them are procedural PBR materials from the Allegorithmic substance database (a no-brainer to purchase when you want fast iteration times while developing physically correct content in Unity). The textures are tiled to fit the large surfaces.
We also pre-bake our world space ambient occlusion textures in Blender. But because the materials are tiled and substance materials carry their own local ambient occlusion, we can’t use the occlusion slot. We therefore use the secondary maps (which are intended for detail maps) to apply the ambient occlusion overlay. To compensate for the detail luminosity multiplication, we lower the luminosity of the base substance material, which already exposes a slider for this on all the materials. We also bake the procedural materials to avoid any loading performance overhead.
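Assigning the baked AO map to the secondary albedo slot could be sketched like this (a hypothetical snippet; the property name and keyword are those of Unity 5's Standard shader):

```csharp
using UnityEngine;

// Overlay a baked, object-space AO map via the Standard shader's secondary
// (detail) albedo slot, since the primary occlusion slot is occupied by the
// tiled substance's own local AO.
public class ApplyBakedAO : MonoBehaviour
{
    public Texture2D bakedAmbientOcclusion;  // baked in Blender

    void Start()
    {
        Material mat = GetComponent<Renderer>().material;
        mat.SetTexture("_DetailAlbedoMap", bakedAmbientOcclusion);
        mat.EnableKeyword("_DETAIL_MULX2");  // turn the detail overlay on

        // The detail albedo multiplies by 2x, so mid-grey is neutral;
        // the base material's luminosity slider compensates for the rest.
    }
}
```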
The ceiling RGB LED lights are constructed out of a single quad with a custom LED shader. The additive shader is a very simple multiplication of an emission multiplier, a color and a texture. We use the texture as a mask to simulate the individual LED lights, the multiplier to control and dim the emission, and the HDR color slot to control the saturation and hue.
The tablet uses a physically based shaded multi-material mesh. The screen part of the mesh has a render texture assigned to its emission slot to make the screen actually glow in the dark. The render texture is rendered by a separate camera which draws the Unity UI system. This way we can have all kinds of cool image effects and screen transitions on the tablet in the future.
To control the tablet we use a look cursor with a custom Unity event system input module, keeping all the UI hover and click events working out of the box.
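The gist of a look cursor can be sketched with a plain raycast that forwards pointer events (a simplified, hypothetical version; our actual module plugs into the event system as a proper input module):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Casts a ray straight out of the VR camera and forwards enter/exit/click
// events to whatever UI handlers the hit object implements.
public class GazeCursor : MonoBehaviour
{
    public Camera vrCamera;
    GameObject currentTarget;

    void Update()
    {
        Ray gaze = new Ray(vrCamera.transform.position, vrCamera.transform.forward);
        RaycastHit hit;
        GameObject target =
            Physics.Raycast(gaze, out hit, 10f) ? hit.collider.gameObject : null;

        var eventData = new PointerEventData(EventSystem.current);

        if (target != currentTarget)
        {
            // Send the same hover events a mouse would, so existing UI keeps working.
            if (currentTarget != null)
                ExecuteEvents.Execute(currentTarget, eventData, ExecuteEvents.pointerExitHandler);
            if (target != null)
                ExecuteEvents.Execute(target, eventData, ExecuteEvents.pointerEnterHandler);
            currentTarget = target;
        }

        // E.g. a gamepad button acts as the "click".
        if (currentTarget != null && Input.GetButtonDown("Fire1"))
            ExecuteEvents.Execute(currentTarget, eventData, ExecuteEvents.pointerClickHandler);
    }
}
```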
Home automation controllers
The home automation system uses an architecture similar to the ones in the real world.
Devices (lamps, roller shades, LED lights) all have their own controller with an internal circuit that receives incoming commands and dims, switches or otherwise controls the output. A global controller sends the commands to one or multiple devices.
The global controller has the ability to be controlled by bluetooth, wifi or through a web interface.
The tablet communicates with the global controller.
Because our interface uses an activity driven approach to control the devices (“sit on the couch” instead of “switch the light above the couch on and the one in the kitchen off”), we store all device states in a profile. The controller supports broadcasting, so we don’t have to switch each light off individually but can address a whole group of lights at once.
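The architecture can be sketched roughly like this (hypothetical class names, independent of any real home automation protocol):

```csharp
using System.Collections.Generic;

// Each device has its own controller that receives incoming commands.
public interface IDevice
{
    string Group { get; }                      // e.g. "living_room_lights"
    void Receive(string command, float value); // e.g. ("dim", 0.3f)
}

// The global controller addresses one device, a whole group, or a profile.
public class GlobalController
{
    readonly List<IDevice> devices = new List<IDevice>();

    public void Register(IDevice device) { devices.Add(device); }

    // Broadcasting: address every device in a group at once instead of
    // switching each light individually.
    public void Broadcast(string group, string command, float value)
    {
        foreach (var d in devices)
            if (d.Group == group)
                d.Receive(command, value);
    }

    // A profile such as "sit on couch" stores a desired state per group.
    public void ApplyProfile(Dictionary<string, float> profile)
    {
        foreach (var entry in profile)
            Broadcast(entry.Key, "dim", entry.Value);
    }
}
```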
Unity 5 also comes with a total recall digital mixer and a great audio architecture under the hood.
There are two sound sources at the front of the building for the seagulls and the ocean, and sound sources at the back of the building: one for the traffic on the street below and one for the ocean view.
Using a low pass filter we simulate the glass of the window filtering out all high frequencies, which results in a dampened sound. For the traffic and seagulls we stacked a high pass filter on top to filter out low-frequency recording artifacts. And there is a 3D sound source at every servo engine and device controller location, as most relays produce a clicking sound when switched.
All sources are routed to their own group. There’s a custom mix for daytime (more bird and traffic noise) and one for nighttime (less bird and traffic noise, and in the future we will hear crickets). Interpolation between these two snapshots depending on the time of day will be added later too. To top things off, reverb zones were added to simulate the difference in sound reflection between standing in the kitchen and standing in the living room.
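The planned snapshot interpolation could be sketched like this (a hypothetical example using Unity 5's AudioMixer snapshot API):

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Blends between the daytime and nighttime mixer snapshots by weight.
public class AmbienceMixer : MonoBehaviour
{
    public AudioMixer mixer;
    public AudioMixerSnapshot daytimeMix;    // more birds and traffic
    public AudioMixerSnapshot nighttimeMix;  // quieter, crickets later on

    // timeOfDay in hours, 0..24
    public void UpdateMix(float timeOfDay)
    {
        // Weight is 1 at noon and 0 at midnight: a simple cosine blend.
        float dayWeight = 0.5f - 0.5f * Mathf.Cos(timeOfDay / 24f * 2f * Mathf.PI);

        AudioMixerSnapshot[] snapshots = { daytimeMix, nighttimeMix };
        float[] weights = { dayWeight, 1f - dayWeight };
        mixer.TransitionToSnapshots(snapshots, weights, 0.1f);
    }
}
```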
That’s it for now. We hope to release this showcase demo soon, but there are still a lot of great features in the pipeline, so stay tuned!