CleanSpace: Mapping Air Pollution in London


Today I received a personal air quality sensor, the CleanSpace sensor tag. The device is a carbon monoxide (CO) sensor designed to be carried by the user and paired with the CleanSpace Android or iOS app via Bluetooth. While the sensor takes readings, the app provides real-time feedback to the user on local air quality. It also pushes the anonymised sensor readings to a cloud server, which aggregates them to create a map of air quality in London.

As well as providing data for analytics, the app is intended to encourage behaviour change. It does this by rewarding users with ‘CleanMiles’ for every journey made on foot or by bike. CleanMiles can then be exchanged for rewards with CleanSpace partner companies and retailers.

Another interesting aspect of the project is that the sensor tag is powered using Drayson Technologies’ Freevolt. This enables the device to harvest radio frequency (RF) energy from wireless and broadcast networks, including 3G, 4G and WiFi. In theory the device can operate continually without having its batteries recharged because it draws energy directly from its environment. In this way the CleanSpace tag provides a perfect test bed for Drayson’s method of powering low-energy IoT devices.

The project kicked off with a campaign on Crowdfunder last autumn which raised £103,136 in 28 days. The campaign was initiated shortly after the announcement of results from a study at King’s College which found that nearly 9,500 deaths per year could be attributed to air pollution. Two pollutants in particular were found to be responsible: fine PM2.5 particles from vehicle exhausts, along with toxic nitrogen dioxide (NO2) gas released through the combustion of diesel fuel on city streets. While the CleanSpace tag does not measure PM2.5 or NO2 directly, it is believed that recorded levels of CO can provide a suitable surrogate for other forms of air pollution, given their shared source in vehicle fuel emissions.

While the UK government is under pressure to clean up air pollution from the top down, Lord Drayson, who leads the CleanSpace project, argues that there is also a need for a complementary response from the bottom up:

“I think the effect of air pollution is still relatively underappreciated and there is work to do in raising awareness of the impact it has.”

“Yes, the government has a role to play, but this isn’t solely a government issue to tackle. The best way to achieve change, and for legislation and regulation to work, is for it to grow from and reflect the beliefs and behaviours of the general public as a whole.”

I’m looking forward to seeing what the device reveals about my own exposure to air pollution on my daily commute. It’ll also be interesting to see how my contribution fits into the broader map being built up by the CleanSpace user community. After collecting some data I’m keen to compare the app’s output with the data collected by the London Air Quality Network based at King’s College.

I’m a card-carrying walker. At the same time I’m struck by the paradox that every CleanMile walked or cycled is essentially a dirty mile for the user. I can see the device and app appealing massively to those who already walk and cycle, and who want to contribute to raising awareness of air pollution. However, with the sensor retailing at £49.99, the CleanMile rewards will have to be sufficiently compelling to encourage a wider base of new users to participate, especially if the project is expected to have a genuine impact on the way they commute. Of course, it has to start somewhere! It’s an exciting challenge and I’m looking forward to seeing how it goes.

Microsoft HoloLens: Hands On!

It’s taken a while but I finally had my first hands-on look at Microsoft HoloLens last night. The demonstration was given as part of the London Unity Usergroup (LUUG) meetup, during a talk by Jerome Maurey-Delaunay of Neutral Digital about their initial experiences of building demos for the device with Unity. Neutral are a design and software consultancy whose portfolio includes work with cultural institutions such as the Tate and the V&A, engineering and aviation firms like Airbus, and architectural practices such as Zaha Hadid Architects, whom they are currently assisting in developing virtual reality visualisation workflows.

During the break following the presentation I had my first chance to try the device out for myself. One of the great features of HoloLens is that it incorporates video capture straight out of the box. Although clips weren’t taken on the night, these videos from the Neutral Digital Twitter stream provide a good indication of my experience when I tested it:

After using VR headsets like the Oculus Rift and HTC Vive, the first thing you notice about the HoloLens is how unencumbered you feel. Where VR headsets enclose the user’s face to block out ambient light and heighten immersion in a virtual environment, the HoloLens is open, affording the user unhindered awareness of their surrounding [augmented] environment over which the virtual objects or ‘holograms’ are projected. The second thing you notice is that the HoloLens runs without a tether. Once applications have been transferred to the device it can be unplugged, leaving the user free to move about without worrying about tripping up or garrotting themselves.

Being able to see my surroundings also meant that I could easily talk face to face with Jerome and see the gestures he wanted me to perform in order to operate the device and manipulate the virtual objects it projected. Tapping forefinger and thumb together visualised the otherwise invisible virtual mesh that the HoloLens draws as a reference to anchor holograms to the user’s environment. A projected aircraft could then be walked around and viewed from any angle. Alternatively, holding forefinger and thumb together while moving my hand rotated the object in that direction.
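
Neutral didn’t share source code on the night, but the air-tap interaction maps directly onto Unity’s holographic input API of the period. The following is only a minimal sketch, assuming the Unity 5.4-era UnityEngine.VR.WSA.Input namespace; the class name and the tap response are illustrative rather than anything from their demos:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // Unity 5.4-era holographic input API

// Minimal sketch: recognise the HoloLens 'air tap' gesture and respond to it.
public class AirTapHandler : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            // React to the tap, e.g. toggle visibility of the spatial mesh.
            Debug.Log("Air tap detected");
        };
        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```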

Don’t be fooled by the simplicity of these demos. The ability of the HoloLens to project animated and interactive holograms that feel anchored to the user’s environment is impressive. I found the headset comfortable and appreciated being able to see my surroundings and interact easily with the people around me. At the same time, I wouldn’t say that I felt immersed in the experience in the sense discussed with reference to virtual reality. The ability to interact through natural gestures helped involve my attention in the virtual objects I was seeing, but the actual field of view available for projection is not as wide as the video captures from the device might suggest.

As it stands I wouldn’t mistake Microsoft’s holograms for ‘real’ objects, but then I’m not convinced that this is what we should be aiming for with AR. While one of the prime virtues of virtual reality technologies like the Oculus and Vive is their ability to provide a sense of ‘being there’, I see the strength of augmented reality elsewhere: in its potential for visualising complex information at the point of engagement, decision or action.

Kind thanks to Neutral Digital for sharing their videos via Twitter. Thanks also to the London Unity Usergroup meetup for arranging the talks and demo.

One Man Game Jam: HTC Vive Basketball

Last week CASA finally received the HTC Vive. Everyone in the office had great fun exploring Valve’s demo experience The Lab. During the week the Longbow emerged as a particular favourite and prompted several of us to discuss which sports might work in VR as viable training simulations. Wanting to get to grips with the HTC Vive hand controllers, I decided to take up the challenge by creating a basketball simulation for the Vive in Unity.

I started by downloading a SketchUp model of a basketball court from the 3D Warehouse. The model had no walls and a lot of reversed faces, so I quickly fixed it up in SketchUp with the aid of the S4U Material plugin, ThomThom’s Material Tools and ThomThom’s excellent CleanUp³ plugin. I also obtained a royalty-free basketball model from TurboSquid.

Basketball Court SketchUp

As the Unity importer for SketchUp had failed the last time I used it, I exported the model from SketchUp in Collada format and, out of habit, converted it to FBX using the Autodesk FBX Converter. After importing the models into Unity I downloaded the SteamVR plugin and added the CameraRig prefab to my scene to handle the basic Vive interaction.

Basketball Court Unity

Trigger colliders were placed in the basketball hoops with a C# script attached to count the score. The Steam scripts TestThrow and Teleporter were then added to the hand controllers and modified to let the player navigate the entire basketball court and to spawn and throw the ball. The ball physics were handled with a simple Unity physics material, which proved surprisingly effective.
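
The score script itself needs only a few lines. Here is a minimal sketch of the approach, assuming the ball prefab is tagged ‘Basketball’ and a trigger collider sits just below each rim; the tag and class name are mine, not from the project:

```csharp
using UnityEngine;

// Minimal sketch: attach to a trigger collider placed just below the hoop.
// Assumes the ball prefab carries the tag "Basketball" (illustrative name).
public class HoopScore : MonoBehaviour
{
    public int Score { get; private set; }

    void OnTriggerEnter(Collider other)
    {
        // Only count the ball when it is travelling downwards through the hoop,
        // so a ball tossed up through the net doesn't score.
        var ball = other.attachedRigidbody;
        if (other.CompareTag("Basketball") && ball != null && ball.velocity.y < 0f)
        {
            Score++;
            Debug.Log("Score: " + Score);
        }
    }
}
```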

Using the Vive hand controller works well, with two qualifications: firstly, it isn’t possible to apply backspin to the ball; secondly, there is a high risk of throwing the hand controller out of the window. Risk of breakage and injury aside, the final game is really challenging but great fun. I thought I’d actually got the drop on basketball games in VR, but it looks like HelloVR are adding a basketball experience to their social VR platform Metaworld. Could be fun!
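
For anyone wondering how the throw itself works, the modified TestThrow logic essentially hands the controller’s velocity to the ball at the moment the trigger is released. Here is a sketch using the 2016-era SteamVR plugin API, with the grab logic omitted and field names of my own choosing:

```csharp
using UnityEngine;

// Minimal sketch of a Vive throw: when the trigger is released, transfer the
// hand controller's motion to the held ball. Uses the 2016-era SteamVR API.
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class BallThrower : MonoBehaviour
{
    public Rigidbody heldBall; // assigned when the ball is grabbed (not shown)

    private SteamVR_TrackedObject trackedObj;

    void Awake()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void FixedUpdate()
    {
        var device = SteamVR_Controller.Input((int)trackedObj.index);

        if (heldBall != null && device.GetPressUp(SteamVR_Controller.ButtonMask.Trigger))
        {
            heldBall.isKinematic = false;
            heldBall.velocity = device.velocity;               // linear throw
            heldBall.angularVelocity = device.angularVelocity; // wrist spin
            heldBall = null;
        }
    }
}
```

Transferring the controller’s angular velocity gives some wrist spin, but it can’t reproduce the finger-driven backspin of a real shot, hence the first qualification above.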

Royal Institution Coding Club: Drones and 3D Modelling

On Saturday the 5th of March the CASA Drone team and I ran workshops on drones and 3D modelling as part of the Royal Institution’s Coding Club for Year 9 students.


The session began with an introduction by Flora and a discussion of the #drones4good movement and her previous collaboration with the ReMap Lima project, in which drones were used to map illegal land grabbing on the outskirts of the Peruvian capital.


Richard then introduced the workshop with a look at FPV quadcopter racing and went on to show the students how to assemble the electronics needed to complete the wonderful 3D printed drone frames they had prepared specially for the event. Thanks there to our director Andy Hudson-Smith for his perseverance with the 3D printer!


After flying the drones we moved through to the computer room, where I explained how 3D modelling formed a link between the 3D printed frames we had been using during the first part of the session and the kinds of 3D models used in the movies, video games and virtual reality experiences we would demonstrate in the second half. After a brief demonstration of SketchUp the students were able to start creating their own 3D models.


To end the session, Richard and I gave demonstrations of the drone simulator he had created in Unity and the CASA Virtual Reality Urban Roller Coaster I had previously made for the Oculus Rift with the Unity game engine.


We had a fantastic morning. Additional thanks are due to Martin Austwick, who facilitated on behalf of CASA, and to two CASA Master’s students, Nathan Suberi and Anouchka Lettre, who kindly volunteered to help out (first, second and third from the left in the pic below).

[Workshop group photo]

Finally, a massive thanks to Elpida Makrygianni from UCL Engineering who organised the event.

Further details of the event can be found on the CASA Drone team blog post here. The full set of pictures from the workshop can be found here.

Virtual Architectures at the QEOP Smart Park Demonstrator

From Saturday 13th February to Sunday 21st February the London Legacy Development Corporation (LLDC) held a nine-day ‘Smart Park’ demonstrator event at the ArcelorMittal Orbit tower on the Queen Elizabeth Olympic Park. Exhibits were provided by The Bartlett Centre for Advanced Spatial Analysis (CASA) and the UCL Interaction Centre (UCLIC).

As the event was held during the school half-term, the exhibits were specifically aimed at engaging youngsters. My contribution on behalf of CASA was an immersive virtual reality fly-through of a 3D model of the Olympic Park created using CityEngine and Unity with the Ordnance Survey’s MasterMap and Building Height data. To capture the imagination of visitors, the tour was presented as a magic carpet ride. The Oculus Rift virtual reality headset provided the sense of immersion, while a specially prepared soundtrack and an electric fan heightened the impression of flight by simulating rushing wind.
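
The fly-through itself needs little more than a camera rig moved along a fixed path. As a rough illustration of the idea in Unity (not the exhibit’s actual code; the waypoints and speeds are placeholders):

```csharp
using UnityEngine;

// Rough sketch: move a VR camera rig along a looping tour path at constant
// speed, turning gently to face the direction of travel.
public class MagicCarpetTour : MonoBehaviour
{
    public Transform[] waypoints; // hand-placed points above the park model
    public float speed = 10f;     // metres per second (placeholder value)
    public float turnRate = 1.5f;

    private int next = 0;

    void Update()
    {
        if (waypoints.Length == 0) return;

        Vector3 target = waypoints[next].position;
        transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);

        // Turn smoothly to face the direction of travel.
        Vector3 dir = target - transform.position;
        if (dir.sqrMagnitude > 0.01f)
        {
            transform.rotation = Quaternion.Slerp(
                transform.rotation, Quaternion.LookRotation(dir), turnRate * Time.deltaTime);
        }

        // Advance to the next waypoint once this one is reached.
        if (Vector3.Distance(transform.position, target) < 0.5f)
            next = (next + 1) % waypoints.Length;
    }
}
```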


During the course of the week the event was attended by over 1,500 local, national and international visitors. This allowed us to engage the public with the Olympic Park’s work to create a data-rich 3D model of the park for ‘smart’ day-to-day management and planning. It also served as an opportunity for a gentle introduction to my own research into the use of augmented and virtual reality technologies to provide spatial intelligence and facilitate real-time decision making in a fun and engaging way. Further work will focus on the use of the LLDC’s 3D model of the park and various emerging interaction technologies as a means of interfacing with the site’s underlying operational data infrastructure.


Also exhibiting on behalf of CASA was Sharon Richardson, who is researching how to sense and evoke behavioural change in urban spaces through dynamic data sources and the Internet of Things. Sharon’s exhibit used a webcam and computer vision to sense the emotional state of visitors and present it back to them as a real-time visualisation. Going forward, Sharon hopes to take control of the park’s fountains to visualise the emotional state of the park.


UCLIC’s contributions included their VoxBox and the robot Roam.io. Developed in partnership with the Intel Collaborative Research Institute (ICRI), these provide novel and engaging interfaces for ‘soft sensing’ of visitor opinions.

Jointly, these ongoing collaborations will investigate and contribute to aspects of the LLDC’s participation in a pan-European programme for Smart Sustainable Districts (SSD), which focuses on four primary areas of the park’s daily operation:

  • Resource Efficient Buildings – Focusing initially on the iconic London Aquatics Centre and Copper Box Arena, this workstream will create tools and approaches to enable low cost, low energy, low environmental impact management and maintenance of future ready non-domestic buildings.
  • Energy Systems – The energy systems workstream will create an efficient, smart, low carbon, resilient energy ecosystem, with specific focal points including optimisation of district energy systems, community engagement and benefits and increased renewable energy generation.
  • Smart Park / Future Living – Implementing user facing digital and data solutions that deliver financial and CO2 efficiencies and prioritise quality of life improvements for those who live, work and visit the Park.
  • Data Architecture and Management – Underpinning the three workstreams above, this workstream will implement efficient and robust data management solutions that support the identification and trialling of innovative solutions and provide the foundation for improved Park operations, user experience and approaches that can be replicated by others, including through the London Data Store.

The SSD project is overseen by Climate-KIC, one of three Knowledge and Innovation Communities created in 2010 by the European Institute of Innovation and Technology (EIT).

Our thanks to the LLDC and the management of ArcelorMittal Orbit for a fun and eventful week!

Images courtesy of Twitter user Ben Edmonds @benjaminpeter77.

SpatialOS: What If We Had A Digital City?

In this video from Slush 2015, Herman Narula, CEO of tech company Improbable, introduces the company’s ground-breaking SpatialOS by asking the question ‘What if we had a digital city?’ The value of such a digital city is the possibility it provides for seeing into the future of the real one and answering the types of question that start ‘What if?’ In this way the concerns motivating the creation of SpatialOS coincide with those of my own PhD in Near Real Time Urban Data Spaces here at CASA.

In his presentation Narula considers the impact that the introduction of autonomous vehicles might have on our cities. We might expect them to change patterns of usage on transport networks by making journeys more efficient, and this could have further implications for associated land uses such as parking. That in turn suggests an impact on economic activity, which could have implications for transport-related crime and policing. We can guess what the impacts might be, but what will this really look like?

The motivation behind SpatialOS is the ability to understand the consequences of complexity. In complex adaptive systems such as cities, the interactions between different components of the system are non-linear. In contrast to linear systems, where effects are proportional and directly related to their causes, the relationships in non-linear systems are disproportionate, indirect and therefore extremely difficult to predict.
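
A toy example makes the point. In the logistic map, a classic non-linear update rule, two trajectories that start one part in a million apart diverge completely within a few dozen steps. This is my own illustration, not drawn from the talk:

```csharp
using System;

// Toy illustration of non-linearity: the logistic map x' = r * x * (1 - x).
// Two starting values differing by one part in a million soon diverge
// completely, which is why non-linear systems resist long-range prediction.
class LogisticDemo
{
    static void Main()
    {
        const double r = 3.9;  // growth parameter in the chaotic regime
        double a = 0.500000;
        double b = 0.500001;   // a one-in-a-million perturbation

        for (int step = 1; step <= 50; step++)
        {
            a = r * a * (1 - a);
            b = r * b * (1 - b);
            if (step % 10 == 0)
                Console.WriteLine($"step {step}: a={a:F6} b={b:F6} diff={Math.Abs(a - b):F6}");
        }
    }
}
```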

While large amounts of data about cities can now be gathered in near real time for up-to-the-minute analysis, this provides no guarantee that currently observed patterns of behaviour will persist into the future. This limits our ability to answer questions of the kind ‘What if?’ What if we add a new station to the tube network? What if we route the trains differently? What if we build a new district? What will the impact of a new technology be? SpatialOS offers to help answer these questions by enabling us to digitally recreate and simulate the entire system.

The challenge Narula identifies is that of achieving a form of ‘Strong Simulation’, which leverages computational power fully dedicated to the process of simulation. This is contrasted with ‘Weak Simulation’, which Narula characterised in a previous presentation as having to share computation between simulation and other processes such as rendering. With SpatialOS, Strong Simulation is achieved by distributing the computational load across servers in the cloud.
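
Improbable haven’t published the internals, but the general idea of spatially distributing a simulation can be sketched very simply: partition the world into cells and map each cell’s entities to one of many workers. Everything below (cell size, hashing scheme, names) is a generic illustration rather than SpatialOS’s actual scheme:

```csharp
using System;
using System.Collections.Generic;

// Generic illustration of spatial work distribution (not SpatialOS's actual
// scheme): partition a 2D world into grid cells and assign each cell to one
// of N simulation workers so the load is spread across machines.
class SpatialPartitionDemo
{
    const float CellSize = 100f; // metres per cell (illustrative)

    static int WorkerFor(float x, float y, int workerCount)
    {
        int cellX = (int)Math.Floor(x / CellSize);
        int cellY = (int)Math.Floor(y / CellSize);
        // Hash the cell coordinates onto a worker. A real system would keep
        // neighbouring cells together so local interactions stay on one machine.
        int hash = (cellX * 73856093) ^ (cellY * 19349663);
        return (hash & 0x7fffffff) % workerCount;
    }

    static void Main()
    {
        var entities = new List<(string name, float x, float y)>
        {
            ("car-1", 12f, 40f), ("car-2", 950f, 40f), ("train-1", 480f, 610f)
        };

        foreach (var e in entities)
            Console.WriteLine($"{e.name} -> worker {WorkerFor(e.x, e.y, 8)}");
    }
}
```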

This is potentially ground-breaking technology for data scientists, but it also has amazing potential for the creation of virtual environments and video games. The talk ends with a demo from Bossa Studios, whose game Worlds Adrift uses SpatialOS to provide a massively multiplayer online gaming environment.

What makes the game special is the way each individual object in the game environment is indefinitely persistent and affected by physics in real time. With over four million objects already simulated at the current stage of development, the potential of SpatialOS for both gaming and scientific simulation is massive!

VUCITY: Wagstaffs 3D Model of London Ten Months On

Back in February I posted the first video of VUCITY by Wagstaffs Design. Ten months on, the functionality of the product has improved considerably. Aimed at high-end developers, planners and architects, their 3D model visualises 80 square kilometres of London and incorporates highly detailed contextual models provided by Wagstaffs’ partners Vertex Modelling.

In addition to interactive navigation, the model provides tools for rights-to-light and sunlight studies, visualisation of protected views under the London View Management Framework (LVMF), search across existing, planned and consented developments, transport information and integration of live data overlays. At the recent MIPIM expo in London, VUCITY also appeared with new functionality for intuitive navigation through gesture control.

In a recent feature on London Live, Wagstaffs director Jason Hawthorn explains how VUCITY has been created with the aid of gaming technology, in this case the Unity game engine, and outlines the company’s future plans. VUCITY remains one to watch closely!