Tag Archives: Virtual Reality

Point Cloud Gaming: Scanner Sombre in VR

Scanner Sombre is an exploration game which places the player in the depths of a pitch-black cave system with nothing to guide them except an experimental headset and a LiDAR-like sensor enabling them to see in the dark. I first saw Scanner Sombre back in April at the EGX Rezzed computer game trade fair. I immediately fell in love with the beautiful visual style, which renders the environment as a point cloud. The visual style links cleverly to the central game mechanic, by which the points representing the contours of the cave walls only appear through the player’s use of the scanning device, providing an eerily partial view of the environment.

Following the initial release for desktop PC in April, the game’s makers Introversion Software have just released a new VR version, now available on Steam for both Oculus Rift and HTC Vive. Having played the two I’d argue that players really have to try Scanner Sombre in VR to get the most out of the experience. Producer Mark Morris and designer Chris Delay touch on this in the following video, which discusses the process of transferring the desktop game to VR and the differences between the two. They also provide a very frank discussion of the factors contributing to the game’s poor sales relative to the runaway success of their earlier title Prison Architect.

One area that Mark and Chris discuss at length is narrative. The difficulty they identify is giving the player sufficient motivation to play, and the pressure they felt to fit a narrative to the experience part way through development. At the same time they are uncertain that a more developed narrative would have added anything. I’d tend to agree. The unusual visual style and game mechanic have a niche feel which some players will love and some will hate. I love the VR version of the game, but I can see how others might feel it is more of an extended demo.

While Scanner Sombre has not met its developers’ expectations for sales, I’ve found it a really enjoyable and atmospheric experience, particularly with the heightened sense of immersion provided by VR. If you’re interested in giving it a go you can currently pick it up for less than a fiver on Steam here.

One Man Game Jam: HTC Vive Basketball

HTC Vive Basketball

Last week CASA finally received the HTC Vive. Everyone in the office had great fun exploring Valve’s demo experience The Lab. During the week the Longbow emerged as a particular favourite and caused several of us to discuss which sports might work in VR as viable training simulations. Wanting to get to grips with the HTC Vive hand controllers I decided to take up the challenge by creating a basketball simulation for the Vive in Unity.

I started by downloading a SketchUp model of a basketball court from the 3D warehouse. The model had no walls and a lot of reversed faces so I quickly fixed it up in SketchUp with the aid of the S4U Material plugin, ThomThom’s Material Tools and ThomThom’s excellent CleanUp³ plugin. I also obtained a royalty free basketball model from TurboSquid.

Basketball Court SketchUp

As the Unity importer for SketchUp had failed the last time I used it, I exported the model from SketchUp in Collada format and converted it to FBX, out of habit, using the Autodesk FBX Converter. After importing the models into Unity I downloaded the SteamVR plugin and added the CameraRig prefab to my scene to handle the basic Vive interaction.

Basketball Court Unity

Trigger colliders were placed in the basketball hoops with a C# script attached to count the score. The Steam scripts for TestThrow and Teleporter were then added to the hand controllers and modified to enable the player to navigate the entire basketball court and to spawn and throw the ball. The ball physics were handled with a simple Unity physics material which was surprisingly effective.
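The hoop scoring logic described above can be sketched roughly as follows. This is a minimal illustration rather than the actual project code: it assumes a trigger collider sits inside each hoop and that the ball prefab carries a hypothetical "Basketball" tag.

```csharp
// Hypothetical sketch of the hoop score counter.
// Attach to a GameObject with a trigger collider placed inside the hoop.
using UnityEngine;

public class HoopScoreTrigger : MonoBehaviour
{
    public int score;

    void OnTriggerEnter(Collider other)
    {
        // Assumes the ball prefab is tagged "Basketball" (illustrative name).
        if (other.CompareTag("Basketball"))
        {
            score++;
            Debug.Log("Score: " + score);
        }
    }
}
```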

Using the Vive hand controller works well with two qualifications: firstly, it isn’t possible to apply backspin to the ball; secondly, there is a high risk of throwing the hand controller out of the window. Risk of breakage or injury aside, the final game is really challenging but great fun. I thought I’d actually got the drop on basketball games in VR but it looks like HelloVR are adding a basketball experience to their social VR platform Metaworld. Could be fun!

Virtual Architectures at the QEOP Smart Park Demonstrator

From Saturday 13th of February to Sunday 21st February the London Legacy Development Corporation (LLDC) held a 9 day ‘Smart Park’ demonstrator event at the ArcelorMittal Orbit tower on the Queen Elizabeth Olympic Park. Exhibits were provided by The Bartlett Centre for Advanced Spatial Analysis (CASA) and the UCL Interaction Centre (UCLIC).

As the event was held during the school half-term, the exhibits were specifically aimed at engaging youngsters. My contribution on behalf of CASA was an immersive virtual reality fly-through of a 3D model of the Olympic Park created using CityEngine and Unity with the Ordnance Survey’s MasterMap and Building Height data. In order to capture the imagination of visitors the tour was presented as a magic carpet ride. The Oculus Rift virtual reality headset was used to provide the sense of immersion while a specially prepared soundtrack and an electric fan were used to heighten the impression of flight by simulating rushing wind.

image1

During the course of the week the event was attended by over 1500 local, national and international visitors. In this way we were able to engage the public with the Olympic Park’s work to create a data rich 3D model of the park for the purpose of ‘smart’ day to day management and planning of the park. This also served as an opportunity for a gentle introduction to my own research into the use of augmented and virtual reality technologies for the purpose of providing spatial intelligence and facilitating real time decision making in a fun and engaging way. Further work will focus on the use of the LLDC’s 3D model of the park and various emerging interaction technologies as a means for interfacing with the site’s underlying operational data infrastructure.

image2

Also exhibiting on behalf of CASA was Sharon Richardson who is conducting research into sensing and evoking behavioural change in urban spaces through dynamic data sources and the Internet of Things. Sharon’s exhibit used a web cam and computer vision to sense the emotional state of visitors and present it back to them as a visualisation in real time. Going forward Sharon hopes to take control of the Park’s fountains to visualise the emotional state of the park.

Image-1

UCLIC’s contributions included their VoxBox and a robot, Roam.io. Developed in partnership with the Intel Collaborative Research Institute (ICRI), these provide novel and engaging interfaces for ‘soft sensing’ of visitor opinions.

Jointly these ongoing collaborations will investigate and contribute to aspects of the LLDC’s participation in a pan-European programme for Smart Sustainable Districts (SSD) which focuses on four primary areas of the park’s daily operation:

  • Resource Efficient Buildings – Focusing initially on the iconic London Aquatics Centre and Copper Box Arena, this workstream will create tools and approaches to enable low cost, low energy, low environmental impact management and maintenance of future ready non-domestic buildings.
  • Energy Systems – The energy systems workstream will create an efficient, smart, low carbon, resilient energy ecosystem, with specific focal points including optimisation of district energy systems, community engagement and benefits and increased renewable energy generation.
  • Smart Park / Future Living – Implementing user facing digital and data solutions that deliver financial and CO2 efficiencies and prioritise quality of life improvements for those who live, work and visit the Park.
  • Data Architecture and Management – Underpinning the three workstreams above, this involves implementing efficient and robust data management solutions that support the identification and trialling of innovative solutions and provide the foundation for improved Park operations, user experience and approaches that can be replicated by others, including through the London Data Store.

The SSD project is overseen by Climate-KIC, one of three Knowledge and Innovation Communities created in 2010 by the European Institute of Innovation and Technology (EIT).

Our thanks to the LLDC and the management of ArcelorMittal Orbit for a fun and eventful week!

Images courtesy of Twitter user Ben Edmonds @benjaminpeter77.

Stress Relief in VR for Urban Planners?

Couldn’t resist sharing this video for a new City Destruction prototype for HTC Vive developed by Canadian games company AlienTrap. Using the Vive controllers you can swing a wrecking ball at the environment or pick up vehicles and other objects to launch around the scene.

I wondered if this might be perfect stress relief for town planners. It would be great to see this in a larger scale environment with networked play. Sad to see the characters in the scene getting turned into little red splodges though. Maybe they’ll learn to fight back if the demo gets released.

The scoop on this came via Road to VR.

HTC Vive: Rethinking Reality

HTC Vive

On Monday at the Mobile World Congress HTC announced their new Vive VR headset. Developed in collaboration with games company Valve, best known for their games Half-Life and Portal, the headset is expected to be released before the end of the year, with developer kits available this Spring.

The teaser video isn’t giving anything away. However, the HTC VR ‘Re Vive’ website promises a number of features:

  • A separate 1,200 by 1,080 pixel screen for each eye
  • A screen refresh rate of 90 Hz
  • Accurate head tracking with a gyrosensor, accelerometer and…a laser position sensor (???)
  • Room scale tracking (15 feet sq) with a pair of base stations
  • VR game controllers with positional tracking

I’ve signed up for a demo as I’m particularly keen to find out more and experiment with the room scale tracking feature. It sounds very much like the equipment is aimed at the high-end gamer. If HTC and Valve can deliver though I don’t doubt that this will be a great piece of kit for creating all kinds of immersive and interactive experiences. Find out more at the HTC Re Vive website.

The making of the CASA Urban Roller Coaster

In May 2014 Virtual Architectures was invited by CASA to create a virtual reality exhibit for the Walking on Water exhibition that was partnered with Grand Designs Live at London’s ExCeL. While CASA spend a lot of time thinking very seriously about cities it was quickly decided that a fun and novel way to engage the 100,000 or so expected visitors would be an urban roller coaster ride using Oculus Rift.

The first tool chosen for this project was the Unity game engine because it provides a very simple means of integrating the Oculus Rift virtual reality headset into a real-time 3D experience. Initial tests were made in Unity with a pre-made roller coaster model downloaded from the Unity Asset Store. However, rather than simply place that roller coaster in an urban setting I wanted to create a track that would be unique to this experience and feel like it might have been part of the urban infrastructure. Due to time constraints it was not possible to model the urban scene from scratch. Instead I decided to generate it procedurally in Autodesk 3ds MAX using a great free script called ghostTown Lite.

Making_Of_CASA_Roller_Coaster_01

Although I like to use SketchUp for 3D modelling wherever possible, 3ds MAX was much better suited to this project as it allowed me to quickly generate the city scene, model the roller coaster track, and animate the path of the ride, all in one software package. After generating the urban scene I used the car from the Asset Store roller coaster as a guide for modelling my track in the correct proportions.

Making_Of_CASA_Roller_Coaster_02

The path of the ride through the city was modelled using Bezier splines, first in the Top view to get the rough layout and then in the Front and Left views to ensure the path would clear the buildings in my scene. The experience needed to be comfortable for users who may not have experienced virtual reality before, so it was agreed to exclude loop-the-loops on this occasion. It was also important to avoid bends that would be too sharp for the roller coaster to realistically follow. Once I was happy with the path I welded all the vertices in my splines so that the path could be used to animate the movement of the roller coaster car along the track later.

Making_Of_CASA_Roller_Coaster_03

Next, sections of track were added to the path I’d created using the 3ds MAX PathDeform (WSM) modifier. As the name suggests this modifier deforms selected geometry to follow a chosen path. Using this modifier massively simplified the process by allowing my pre-made sections of track to be offset along the length of the path and then stretched, rotated and twisted to fit together as seamlessly as possible. This was the most intricate and time consuming part of the project.

Making_Of_CASA_Roller_Coaster_04

In order to minimise the potential for motion sickness with the Oculus Rift I was careful to keep the rotation of the track as close to the horizontal plane as possible. Supporting struts were then arrayed along the path of the track and positioned in order to anchor it to the rest of the scene. When I was satisfied a ‘Snapshot’ was made of the geometry in 3ds MAX to create a single mesh ready for export to Unity. At this point the path deformed sections of track could be deleted as Unity does not recognise the modifier.

Making_Of_CASA_Roller_Coaster_05

To create the movement of the roller coaster car along the track a 3ds MAX dummy helper was constrained to the path I’d created earlier. This generated starting and ending key frames on the animation timeline. The roller coaster car model was then placed on the track and linked to the dummy helper. It is possible in 3ds MAX to have the velocity and banking of the dummy calculated automatically, but I found that this did not give a realistic feel. Instead I controlled both by editing the animation key frames, using a camera linked to the dummy for reference. This was time intensive but gave a better result. The city scene, roller coaster and animation were exported as a single FBX file, which is the preferred import format for 3D geometry in Unity.

Making_Of_CASA_Roller_Coaster_06

Having completed the track and animated the car it was time to assemble the final scene in Unity. First I generated a terrain using a great plugin called World Composer. This enables you to import satellite imagery and terrain heights from Bing Maps to give your backdrops a high degree of realism.

Making_Of_CASA_Roller_Coaster_07

The urban scene and roller coaster were then imported and a skybox and directional light were added. The scene was completed with various assets from the Unity Asset Store including skyscrapers, roof objects, vehicles, idling characters and a flock of birds.

Making_Of_CASA_Roller_Coaster_08Making_Of_CASA_Roller_Coaster_09

To prepare the Oculus Rift integration, the OVR camera controller asset from Oculus was placed inside and parented to the roller coaster car. In my initial tests with the Asset Store roller coaster I’d found that the OVR camera would drift from the forward-facing position. This would disorientate the user and contribute to motion sickness. As a quick fix I parented a cube to the front of the roller coaster car, turned off rendering of the cube so it would be invisible, and set the camera controller to follow the cube.
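In script form the quick fix might look roughly like this. It is purely illustrative — in the actual project the parenting was set up in the Unity editor — and assumes a hypothetical `anchor` field pointing at the invisible cube on the car:

```csharp
// Hypothetical sketch of the camera anchor quick fix.
// The cube is parented to the front of the car in the editor; this
// script hides it and keeps the camera rig locked to its transform.
using UnityEngine;

public class CameraAnchorFollow : MonoBehaviour
{
    public Transform anchor;  // the invisible cube on the roller coaster car

    void Start()
    {
        // Disable the cube's renderer so it is never drawn.
        anchor.GetComponent<Renderer>().enabled = false;
    }

    void LateUpdate()
    {
        // Snap the camera rig to the anchor after all other updates.
        transform.position = anchor.position;
        transform.rotation = anchor.rotation;
    }
}
```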

Making_Of_CASA_Roller_Coaster_10

In order to ensure the best possible virtual experience it is really important to keep the rendered frames per second as high as possible. As the Oculus Rift renders two cameras simultaneously, one for each eye, you need to aim for at least 60 fps in Unity to ensure the user experiences a frame rate of no less than 30 fps.

CASA Urban Roller Coaster

In order to achieve this I took advantage of occlusion culling in Unity Pro which prevents objects being rendered when they are outside the camera’s field of view or obscured by other objects.

Making_Of_CASA_Roller_Coaster_11

I also baked the shadows for all static objects in the scene, saving them as textures, which spares the processor from calculating them dynamically. The only objects casting dynamic shadows are the roller coaster car and the animated characters.

Finally, two simple JavaScript (UnityScript) scripts were added. The first starts the roller coaster and plays a roller coaster sound file when the ‘S’ key is pressed. The second closes the roller coaster application when the ‘Esc’ key is pressed.
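A rough C# equivalent of those two small scripts is sketched below (the originals were written in UnityScript, and the field names here are illustrative rather than the actual project code):

```csharp
// Hypothetical C# sketch of the ride controls: 'S' starts the ride
// and its sound, 'Esc' quits the application.
using UnityEngine;

public class RideControls : MonoBehaviour
{
    public Animation coasterAnimation;  // the imported path animation
    public AudioSource coasterSound;    // the roller coaster sound file

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.S))
        {
            coasterAnimation.Play();
            coasterSound.Play();
        }

        if (Input.GetKeyDown(KeyCode.Escape))
        {
            Application.Quit();
        }
    }
}
```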

UCLProvost

The reception of the CASA Urban Roller Coaster ride at Grand Designs Live was fantastic and I’m really pleased to have participated. It was a great project to work on and an excellent opportunity to learn new techniques in 3ds MAX and Unity. Having my first VR roller coaster under my belt I’m looking forward to building another truly terrifying one when I get the time, hopefully for the Oculus Rift DK2 which has just arrived at CASA.

On a last note I’d like to thank Tom Hoffman of Lake Erie Digital, whose excellent YouTube tutorials on creating roller coasters in 3ds MAX provided a great guide through the most difficult part of this challenging project.

Update 26/08/2014 – This article has been featured as a guest post on the Digital Urban blog here.

Update 05/09/2014 – Versions of the roller coaster for use with Oculus Rift DK1 are now available here for Windows and for Mac.

Virtual Architectures with CASA at Grand Designs Live

From Saturday the 3rd May (today) until Sunday 11th May CASA will be exhibiting at the Grand Designs Live event at London’s ExCeL. Virtual Architectures is pleased to have contributed an exciting virtual reality Urban Roller Coaster to the roster of high tech exhibits.

CASA Urban Roller Coaster

If you have the opportunity make sure you check out the CASA stand and give the Urban Roller Coaster and Oculus Rift virtual reality headset a try.