One Man Game Jam: HTC Vive Basketball

Last week CASA finally received the HTC Vive. Everyone in the office had great fun exploring Valve's demo experience The Lab. During the week the Longbow emerged as a particular favourite and prompted several of us to discuss which sports might work in VR as viable training simulations. Wanting to get to grips with the HTC Vive hand controllers, I decided to take up the challenge by creating a basketball simulation for the Vive in Unity.

I started by downloading a SketchUp model of a basketball court from the 3D Warehouse. The model had no walls and a lot of reversed faces, so I quickly fixed it up in SketchUp with the aid of the S4U Material plugin, ThomThom's Material Tools and ThomThom's excellent CleanUp³ plugin. I also obtained a royalty-free basketball model from TurboSquid.

Basketball Court SketchUp

As the Unity importer for SketchUp had failed the last time I used it, I exported the model from SketchUp in Collada format and, out of habit, converted it to FBX using the Autodesk FBX Converter. After importing the models into Unity I downloaded the SteamVR plugin and added the CameraRig prefab to my scene to handle the basic Vive interaction.

Basketball Court Unity

Trigger colliders were placed in the basketball hoops with a C# script attached to count the score. The SteamVR example scripts TestThrow and Teleporter were then added to the hand controllers and modified to let the player navigate the entire basketball court and to spawn and throw the ball. The ball physics were handled with a simple Unity physics material, which was surprisingly effective.
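
The scoring logic is simple enough to sketch outside the engine. In Unity it lives in a C# script's OnTriggerEnter handler, but the core idea reduces to detecting a downward crossing of the hoop's trigger volume. The sketch below uses Python, simple ballistic motion and illustrative names and numbers, not the project's actual code:

```python
# Minimal sketch of trigger-based scoring with simple ballistic motion.
# Names and numbers are illustrative, not from the actual Unity project,
# where this logic sits in a C# OnTriggerEnter handler.

G = -9.81  # gravitational acceleration, m/s^2

def simulate_throw(pos, vel, hoop_centre, hoop_radius, dt=0.01, t_max=5.0):
    """Integrate a thrown ball and return True if it drops down through
    the horizontal disc of the hoop (the 'trigger volume')."""
    x, y, z = pos
    vx, vy, vz = vel
    for _ in range(int(t_max / dt)):
        prev_y = y
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vy += G * dt  # gravity only acts vertically
        # Count a score only when crossing the hoop plane from above.
        if prev_y >= hoop_centre[1] > y and vy < 0:
            dx, dz = x - hoop_centre[0], z - hoop_centre[2]
            if dx * dx + dz * dz <= hoop_radius ** 2:
                return True
    return False

# Regulation-ish setup: release at 2 m, rim at 3.05 m, 4 m away.
scored = simulate_throw((0.0, 2.0, 0.0), (0.0, 5.955, 4.0),
                        hoop_centre=(0.0, 3.05, 4.0), hoop_radius=0.23)
airball = simulate_throw((0.0, 2.0, 0.0), (0.0, 5.955, 2.0),
                         hoop_centre=(0.0, 3.05, 4.0), hoop_radius=0.23)
```

A throw lobbed over the rim registers a score while a flat throw that falls short does not; in the Unity version the physics engine does the integration and the trigger collider does the crossing test.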

Using the Vive hand controller works well with two qualifications: first, it isn't possible to apply backspin to the ball; second, there is a high risk of throwing the hand controller out of the window. Risk of breakage and injury aside, the final game is really challenging but great fun. I thought I'd actually got the drop on basketball games in VR, but it looks like HelloVR are adding a basketball experience to their social VR platform Metaworld. Could be fun!

Royal Institution Coding Club: Drones and 3D Modelling

On Saturday the 5th of March the CASA Drone team and I ran workshops on drones and 3D modelling as part of the Royal Institution’s Coding Club for year 9 students.


The session began with an introduction by Flora and a discussion of the #drones4good movement and her previous collaboration with the ReMap Lima project, where drones were used to map illegal land grabbing on the outskirts of the Peruvian capital.


Richard then introduced the workshop with a look at FPV quadcopter racing and went on to show the students how to assemble the electronics needed to complete the wonderful 3D printed drone frames prepared specially for the event. Thanks are due to our director Andy Hudson-Smith for his perseverance with the 3D printer!


After flying the drones we moved through to the computer room, where I explained how 3D modelling links the 3D printed frames from the first part of the session with the kinds of 3D models used in movies, video games and virtual reality experiences, which we would demonstrate in the second half. After a brief demonstration of SketchUp the students were able to start creating their own 3D models.


To end the session Richard and I gave demonstrations of the drone simulator he had created in Unity and the CASA Virtual Reality Urban Roller Coaster I had previously made for the Oculus Rift using the Unity game engine.


We had a fantastic morning. Additional thanks are due to Martin Austwick, who facilitated on behalf of CASA, and to two CASA Master's students, Nathan Suberi and Anouchka Lettre, who kindly volunteered to help out (first, second and third from the left in the pic below).

drones4good--3d-models-casa-masterclass_24953639783_o

Finally, a massive thanks to Elpida Makrygianni from UCL Engineering, who organised the event.

Further details of the event can be found on the CASA Drone team blog post here. The full set of pictures from the workshop can be found here.

Virtual Architectures at the QEOP Smart Park Demonstrator

From Saturday 13th to Sunday 21st of February the London Legacy Development Corporation (LLDC) held a nine-day 'Smart Park' demonstrator event at the ArcelorMittal Orbit tower on the Queen Elizabeth Olympic Park. Exhibits were provided by The Bartlett Centre for Advanced Spatial Analysis (CASA) and the UCL Interaction Centre (UCLIC).

As the event was held during the school half-term, the exhibits were aimed at engaging youngsters. My contribution on behalf of CASA was an immersive virtual reality fly-through of a 3D model of the Olympic Park created using CityEngine and Unity with the Ordnance Survey's MasterMap and Building Height data. To capture the imagination of visitors the tour was presented as a magic carpet ride. The Oculus Rift virtual reality headset provided the sense of immersion, while a specially prepared soundtrack and an electric fan heightened the impression of flight by simulating rushing wind.


Over the course of the week the event was attended by over 1500 local, national and international visitors. This allowed us to engage the public with the Olympic Park's work to create a data-rich 3D model of the park for 'smart' day-to-day management and planning. It also served as an opportunity for a gentle introduction to my own research into the use of augmented and virtual reality technologies to provide spatial intelligence and facilitate real-time decision making in a fun and engaging way. Further work will focus on using the LLDC's 3D model of the park and various emerging interaction technologies as a means of interfacing with the site's underlying operational data infrastructure.


Also exhibiting on behalf of CASA was Sharon Richardson who is conducting research into sensing and evoking behavioural change in urban spaces through dynamic data sources and the Internet of Things. Sharon’s exhibit used a web cam and computer vision to sense the emotional state of visitors and present it back to them as a visualisation in real time. Going forward Sharon hopes to take control of the Park’s fountains to visualise the emotional state of the park.


UCLIC's contributions included their VoxBox and Roam.io, a robot. Developed in partnership with the Intel Collaborative Research Institute (ICRI), these provide novel and engaging interfaces for 'soft sensing' of visitor opinions.

Jointly these ongoing collaborations will investigate and contribute to aspects of the LLDC’s participation in a pan-European programme for Smart Sustainable Districts (SSD) which focuses on four primary areas of the park’s daily operation:

  • Resource Efficient Buildings – Focusing initially on the iconic London Aquatics Centre and Copper Box Arena, this workstream will create tools and approaches to enable low cost, low energy, low environmental impact management and maintenance of future ready non-domestic buildings.
  • Energy Systems – The energy systems workstream will create an efficient, smart, low carbon, resilient energy ecosystem, with specific focal points including optimisation of district energy systems, community engagement and benefits and increased renewable energy generation.
  • Smart Park / Future Living – Implementing user facing digital and data solutions that deliver financial and CO2 efficiencies and prioritise quality of life improvements for those who live, work and visit the Park.
  • Data Architecture and Management – Underpinning the other three workstreams, this involves implementing efficient and robust data management solutions that support the identification and trialling of innovative solutions and provide the foundation for improved Park operations, user experience and approaches that can be replicated by others, including through the London Data Store.

The SSD project is overseen by Climate-KIC, one of three Knowledge and Innovation Communities created in 2010 by the European Institute of Innovation and Technology (EIT).

Our thanks to the LLDC and the management of ArcelorMittal Orbit for a fun and eventful week!

Images courtesy of Twitter user Ben Edmonds @benjaminpeter77.

SpatialOS: What If We Had A Digital City?

In this video from Slush 2015 the CEO of tech company Improbable, Herman Narula, introduces the company's groundbreaking SpatialOS by asking the question 'What if we had a digital city?' The value of such a digital city lies in the possibility it provides for seeing into the future of the real one and answering the types of questions that start 'What if?' In this way the concerns motivating the creation of SpatialOS coincide with those of my own PhD in Near Real Time Urban Data Spaces here at CASA.

In his presentation Narula considers the impact that the introduction of autonomous vehicles might have on our cities. We might expect them to change patterns of usage on transport networks by making journeys more efficient, and this could have further implications for land uses associated with related phenomena such as parking. This in turn suggests an impact on economic activity which could, in turn, have implications for transport-related crime and policing. We can guess what the impacts might be, but what will this really look like?

The motivation behind SpatialOS is the ability to understand the consequences of complexity. In complex adaptive systems such as cities the interactions between different components of the system are non-linear. In contrast to linear systems, where effects are proportional and directly related to their causes, the relationships in non-linear systems are disproportionate, indirect and, therefore, extremely difficult to predict.
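
A textbook toy example (not anything from SpatialOS itself) shows why this matters: in the chaotic regime of the logistic map, two trajectories that start a billionth apart agree at first and then diverge completely, so even near-perfect knowledge of the present buys only short-range prediction:

```python
def logistic(x, r=4.0):
    """One step of the logistic map, a standard non-linear toy system."""
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    """Iterate the map from x0, returning the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

# Two runs whose starting points differ by one part in a billion.
a = trajectory(0.2, 50)
b = trajectory(0.2 + 1e-9, 50)

# For the first few steps the runs are indistinguishable; by around
# step 40 the tiny initial difference has been amplified until the
# trajectories are unrelated.
early_gap = abs(a[5] - b[5])
late_gap = max(abs(a[i] - b[i]) for i in range(40, 51))
```

This sensitivity to initial conditions is exactly what frustrates extrapolation from observed data and motivates simulating the system instead.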

While large amounts of data about cities can now be gathered in near real time for up-to-the-minute analysis, this provides no guarantee that currently observed patterns of behaviour will persist into the future. This limits our ability to answer questions of the kind 'What if?' What if we add a new station to the tube network? What if we route the trains differently? What if we build a new district? What will the impact of a new technology be? SpatialOS offers to help answer these questions by enabling us to digitally recreate and simulate the entire system.

The challenge Narula identifies is that of achieving a form of 'Strong Simulation', in which sufficient computational power is fully dedicated to the process of simulation. This is contrasted with 'Weak Simulation', which Narula characterised in a previous presentation as the necessity of sharing computation between simulation and other processes such as rendering. With SpatialOS, Strong Simulation is achieved by distributing the computational load across servers in the cloud.
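
The cloud distribution can be caricatured as spatial sharding: carve the simulated world into cells and hand each occupied cell to its own worker process. The sketch below is a deliberately naive illustration of that idea, with made-up entities, not Improbable's actual architecture:

```python
from collections import defaultdict

def assign_workers(entities, cell_size):
    """Toy spatial sharding: map each entity to the grid cell containing
    it, with one worker per occupied cell.  A naive illustration of
    distributing a simulation, not SpatialOS's real load-balancing
    scheme."""
    workers = defaultdict(list)
    for name, (x, y) in entities.items():
        cell = (int(x // cell_size), int(y // cell_size))
        workers[cell].append(name)
    return dict(workers)

# Hypothetical entities on a 2D city plane (coordinates in metres).
city = {"car_1": (12.0, 3.0), "car_2": (14.5, 4.0), "tram_1": (95.0, 40.0)}
shards = assign_workers(city, cell_size=50.0)
# The two nearby cars share a worker; the distant tram gets its own.
```

The hard parts a real system must solve, and this sketch ignores, are interactions that straddle cell boundaries and rebalancing as entities move.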

This is potentially groundbreaking technology for data scientists, but it also has amazing potential for the creation of virtual environments and video gaming. The presentation ends with a demonstration from Bossa Studios, whose game Worlds Adrift uses SpatialOS to provide a massively multiplayer online gaming environment.

What makes the game special is the way in which each individual object in the game environment is indefinitely persistent and affected by physics in real time. With over four million objects already simulated at the current stage of development, the potential of SpatialOS for both gaming and scientific simulation is massive!

 

VUCITY: Wagstaffs 3D Model of London Ten Months On

Back in February I posted the first video of VUCITY by Wagstaffs Design. Ten months on, the functionality of the product has improved considerably. Aimed at high-end developers, planners and architects, their 3D model visualises 80 square kilometres of London and incorporates highly detailed contextual models provided by Wagstaffs' partners Vertex Modelling.

In addition to interactive navigation, the model provides tools for rights-to-light and sunlight studies, London View Management Framework (LVMF) visualisation for protected views, search for existing, planned and consented development, transport information, and integration of live data overlays. At the recent MIPIM expo in London VUCITY also appeared with further functionality for intuitive navigation through gesture control.

In a recent feature on London Live, Wagstaffs director Jason Hawthorn explains how VUCITY has been created with the aid of gaming technology, in this case the Unity game engine, and outlines the company's future plans. VUCITY remains one to watch closely!

GSS 2015: Presenting INSIGHT on behalf of CASA


This past week I've had the good fortune to be in Genoa in northern Italy participating in the Global Systems Science (GSS) conference 2015, where I represented CASA on behalf of Dr Vassilis Zachariadis, who was unable to attend. This year the conference was organised jointly with the Genoa Festival of Science. The focus of the conference was the way in which global issues can be addressed using data analysis and systems science to support coordinated strategies through policy:

Philosophers have been reminding us for millennia that everything in the universe is connected to everything else in one enormous complex system. While this has always been true, until recently the propagation of states of the many sub-systems was so slow that they appeared to be disconnected. The advent of globalisation, the internet, and instantaneous propagation of system state have outrun human ability to understand the complexity of modern, global systems. In parallel, our ability to capture information about events in the real-world, to communicate, to store, and to analyse such information has made extraordinary progress. We are approaching a state where almost everything that determines how we live is capable of being observed and analysed. Global systems science (GSS) is a policy-related scientific program dedicated to research in applying systems science and large scale data analysis and models to global challenges including climate change, pandemics, sustainable growth, energy sufficiency, financial crisis, urbanisation and global conflict. These challenges are all ‘global’ and ‘borderless’, i.e. they are shared worldwide and they tightly connect policies across different sectors. They necessitate joint action and coordination across various networks of actors with different interests. The conference will elaborate on how systemic thinking, models and data can help address such coordination problems in policy, society, and economy.


CASA's contribution to the conference involved the demonstration of visualisation outputs relating to research conducted as part of the pan-European project INSIGHT (Innovative Policy Modelling and Governance Tools for Sustainable Post-Crisis Urban Development). The project investigates how information and communication technology (ICT), data science and complexity theory can help European cities formulate policies for sustainable urban development and economic recovery in the event of crisis. The objectives of the project are:

1. to investigate how data from multiple distributed sources, available in the context of the open data, big data and smart city movements, can be managed, analysed and visualised to understand urban development patterns;
2. to apply these data mining functionalities to characterise the drivers of the spatial distribution of activities in European cities, focusing on the retail, housing, and public services sectors, and paying special attention to the impact of the current economic crisis;
3. to develop enhanced spatial interaction and location models for retail, housing, and public services;
4. to integrate the new theoretical models into state-of-the-art simulation tools, in order to develop enhanced decision support systems able to provide scientific evidence in support of policy options for post-crisis urban development;
5. to develop innovative visualisation tools to enable stakeholder interaction with the new urban simulation and decision support tools and facilitate the analysis and interpretation of the simulation outcomes;
6. to develop methodological procedures for the use of the tools in policy design processes, and evaluate and demonstrate the capabilities of the tools through four case studies carried out in cooperation with the cities of Barcelona, Madrid, London, and Rotterdam.

The INSIGHT project which commenced in October 2013 is funded by the European Union’s Seventh Framework Programme and will conclude in 2016. Alongside CASA the project is run by a consortium including the Technical University of Madrid (UPM), Nommon Solutions and Technologies, the Technical University of Eindhoven (TU/e), the Institute for Cross-Disciplinary Physics and Complex Systems (IFISC) at the University of the Balearic Islands (UIB), and Barcelona City Council.

A number of CASA visualisations relating to the themes of the project were presented over the course of the conference. At the pedestrian level, the following animation by Vassilis demonstrates the output of an agent-based model involving the dynamic assignment of pedestrian route choice in response to crowding. The colour of each pedestrian reflects the deviation from their preferred walking speed:

  • Red = high deviation
  • Green = low deviation
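
The colour coding amounts to interpolating between green and red as the normalised deviation grows. The following is a guessed reconstruction for illustration, not the project's actual code:

```python
def deviation_colour(actual_speed, preferred_speed):
    """Map deviation from preferred walking speed onto a green-to-red
    RGB triple: pure green = walking at preferred speed, pure red =
    fully impeded.  An illustrative reconstruction, not the model's
    actual code."""
    if preferred_speed <= 0:
        raise ValueError("preferred speed must be positive")
    deviation = min(abs(preferred_speed - actual_speed) / preferred_speed, 1.0)
    return (deviation, 1.0 - deviation, 0.0)  # (R, G, B), each in [0, 1]

free_flow = deviation_colour(2.0, 2.0)   # unimpeded -> green
stuck = deviation_colour(0.0, 2.0)       # stopped by the crowd -> red
slowed = deviation_colour(1.0, 2.0)      # halfway -> olive
```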

More details on the methods used can be found in the following paper:

Zachariadis, V., Amos, J., Kohn, B. (2009) Simulating pedestrian route choice behaviour under transient traffic conditions.

Looking at the urban scale, Vassilis' second visualisation uses a cross-cluster analysis of geolocated tweets in London over the course of a day. In this way it becomes possible to see how urban activities vary in different locations across the city at different times of day. The different colours reflect estimated activity based on data concerning the known uses of premises associated with those locations:

  • Yellow = Retail
  • Red = Offices
  • Cyan = Rail
  • Green = Leisure

Further details of the analysis behind the visualisation can be found here.

By way of contrast Vassilis also provided an earlier visualisation demonstrating the origins and destinations of journeys by taxis equipped with GPS trackers in the San Francisco Bay Area over a period of 24 hours. In this visualisation green dots represent pickups while red dots represent drop offs:

Watched carefully, the visualisation shows how certain areas shift between acting predominantly as collectors or as distributors of taxi trips through the day. In this way it suggests a 'tempo of the city', as Vassilis notes in his original blogpost here.

Dr Jonathan Reades, now lecturing at King's, further develops this theme of tempo with 'Pulse of the City', his visualisation for CASA of Oyster card data showing the dynamics of passenger flows across London's tube network:

The next visualisation, by CASA researcher Joan Serras, presents the output of a model of activity-based transport demand created by Joan, Melanie Bosredon, Vassilis Zachariadis, Camilo Vargas-Ruiz, Thibaut Dubernet and Mike Batty. Using the MATSim software and TfL data, they simulated activity across a typical working day in London. Each person, or agent, within the model is plotted performing certain activities:

  • Dark Blue = Home
  • Light Blue = Work
  • Yellow = Shop
  • Green = Education
  • Pink = Leisure

The model was developed by CASA under the EUNOIA research project (Evolutive User-centric Networks for Intraurban Accessibility). The output of the model was visualised using the Via software provided by Senozon. Further details are available here.

At the national scale, Joan Serras' visualisation of public transport flows across the UK for a typical day in 2009 was also displayed. Presenting data on train, coach, tram, tube and ferry trips, together with available air trips for Scotland, the visualisation demonstrates the complexity of the networks and indicates the distinct transport geographies of particular regions.

Further information on the visualisation can be found here.

I also had the opportunity to display sneak previews of a visualisation presenting the application of land use modelling to Dubai along with an interactive model of the UK being developed by CASA and the Future Cities Catapult.

During the conference I had fantastic opportunities to attend plenary sessions on complexity by Geoffrey West from the Santa Fe Institute and on ‘The Promise of Urban Science’ by Steven E. Koonin of New York University. I also had a great time exploring Genoa’s unique topography and the labyrinthine knot of its old town. Many thanks to CASA and Vassilis for making the trip possible.

Finally, my thanks also to Luca Piavano and Francisco Oostrom from CeDint at UPM, who joined me at the INSIGHT stand to present their own great work. I owe them both a pint next time they are in London.

Sense of Presence: Epic’s documentary series on VR

In their new video documentary series 'Sense of Presence', Epic Games, the company behind Unreal Engine 4, offer a number of short films seeking to explain virtual reality.

In the first film in the series, 'What is Virtual Reality', they consider the new possibilities provided by the medium. The primary affordance of the technology is 360-degree visual tracking, in real time, within a computer-generated environment. This means that any environment that can be represented in three dimensions within the computer can be visually sensed by users with a virtual reality headset, or head-mounted display (HMD), as if they were there. Accompanying this is a sense of immersion which, in the words of Oculus founder Palmer Luckey, can 'make it feel like you're actually in a place you are not'. This sense of immersion is effectively achieved by fooling the brain, and its success can be seen in people's physical reactions to events in the virtual environment, such as ducking out of the way of flying objects or attempting to lean on virtual furniture.

In the second film, 'Building Virtual Reality', Epic look at the way gaming technology has been pivotal in the development of VR and the challenges encountered in trying to deliver immersive experiences. The first challenge is that achieving the stereoscopic effect of depth in 3D requires the computer to render two images rather than one. To keep the frame rate high enough to trick the eye into perceiving continuous motion, the computer needs to render twice as many frames per second (fps) as a game displayed on a single screen. If the frame rate drops too far, perceived motion in the headset stutters, the sense of immersion is lost, and the mismatch between the information the user's brain receives from their eyes and that from their inner ear causes motion sickness. Game engines, with their real-time rendering capabilities, are therefore ideal platforms for the creation of immersive VR experiences.
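
Back-of-envelope arithmetic makes the pressure concrete: both the Vive and the Rift refresh at 90 Hz, which leaves roughly 11 ms to produce each frame, and that budget must cover views for both eyes.

```python
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# A 60 Hz monitor allows ~16.7 ms per frame; a 90 Hz VR headset only ~11.1 ms.
desktop_budget = frame_budget_ms(60)
vr_budget = frame_budget_ms(90)

# With one image per eye, each view gets at most about half the budget
# (a rough upper bound, before any shared per-frame work is counted).
per_eye_upper_bound = vr_budget / 2
```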

The third film, 'Storytelling in Virtual Reality', looks at the way immersion within the virtual experience can promote emotional responses to a degree that existing game experiences do not. Because the user is active within the virtual scene and free to direct their attention as they desire, new techniques are required to solicit the user's attention.

In the final film, 'The Future of Virtual Reality', Epic consider the potential of VR beyond gaming, particularly for connecting people via telepresence: the possibility of being present somewhere you are not. This opens up possibilities for training, education and new forms of visualisation. The film highlights the way in which developers are still learning to work in the new medium. Virtual reality for Palmer Luckey is the 'final medium' insofar as it can simulate all of the media that preceded it: 'The ultimate goal is to make virtual reality as real as possible, because once you can do that, there's not really any need to perfect anything else'.

I appreciated the way the films largely, though not completely, avoided the juxtaposition of the virtual with the real, which I find unhelpful when discussing VR. The films provide a good summary of the way the new wave of VR views itself, and also of the way it wants to be perceived by the public ahead of next year's commercial releases. Alongside the hype there are plenty of exciting opportunities waiting to be realised in the coming years. It will be exciting to see what happens next!