Category Archives: QEOP

TfL JamCam Video Feeds Integrated into CASA ViLo

For a number of years TfL have been providing open access to feeds from over 170 traffic cameras, or ‘JamCams’, distributed at key locations across London’s road network. In addition to static images, each camera also provides a five-second video clip which is updated every five minutes. The feeds of these videos have now been incorporated into CASA’s ViLo platform.
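
For anyone who wants to experiment with the raw feeds, the cameras are exposed through TfL’s open Unified API. The snippet below is a minimal sketch in Python: the /Place/Type/JamCam endpoint and the imageUrl/videoUrl property keys are assumptions based on the public API and should be checked against the current TfL documentation.

```python
# Minimal sketch of pulling JamCam metadata from the TfL Unified API.
# The endpoint and property keys (imageUrl, videoUrl) are assumptions;
# verify them against the current TfL API documentation before use.
import requests

def fetch_jamcams():
    resp = requests.get("https://api.tfl.gov.uk/Place/Type/JamCam", timeout=30)
    resp.raise_for_status()
    cameras = []
    for place in resp.json():
        props = {p["key"]: p["value"] for p in place.get("additionalProperties", [])}
        cameras.append({
            "id": place["id"],
            "name": place["commonName"],
            "lat": place["lat"],
            "lon": place["lon"],
            "image_url": props.get("imageUrl"),
            "video_url": props.get("videoUrl"),  # short video clip, refreshed periodically
        })
    return cameras

if __name__ == "__main__":
    for cam in fetch_jamcams()[:5]:
        print(cam["name"], cam["video_url"])
```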

I’d been fascinated by the videos for some time. Every morning when I arrive at CASA I check out CASA’s London CityDashboard, which we have running in our reception area. The dashboard includes two static images from cameras chosen at random, along with a looped video feed from another in the top right.

I was always struck by the sense of ground truth the cameras seemed to offer for a particular place. At the same time I was frustrated that I couldn’t get a sense of the wider context: What’s just out of shot? How is each camera situated in its surroundings? What’s going on at the next nearest camera and the rest in the surrounding area? Incorporating the feeds from those cameras into ViLo provides a spatialised sense of context in a way that the dashboard can’t. The 3D models also help users understand the orientation of each camera in a way that a map might not. Finally, their incorporation in ViLo facilitates comparison with other spatialised streams of data.

Compared with other real-time feeds, like TfL’s bus arrival information, the traffic cameras provide a much richer sense of what is happening in an area, at least within the five-minute timescale of the video updates. Not only do we get a sense of the flow of traffic and any blockages, the cameras also provide a wider situational awareness of factors like local weather conditions and pedestrian footfall. In this way the information they provide offers a degree of validation for other data sets, which can be particularly useful when additional context is required for decision making by city officials and members of the public alike.

Thanks to Oliver O’Brien and Steven Gray for providing access to the TfL traffic camera data via CityDashboard and the Big Data Toolkit.

 


ViLo and the Future of Planning

Following our recent posts on CASA’s digital urban visualisation platform ViLo, the Future Cities Catapult, who collaborated with CASA on the project, have released a video discussing it in further detail. The project commenced in 2014 with the aim of identifying which combinations of urban data might be most valuable to urban planners, site operators and citizens. CASA research associates Lyzette Zeno Cortes and Valerio Signorelli discuss how ViLo was created using the Unity game engine in order to understand its potential for visualising information in real time.

Ben Edmonds from the London Legacy Development Corporation, which runs the Queen Elizabeth Olympic Park where ViLo has been tested, discusses how the platform was used to gather environmental data and qualitative data from park visitors in order to help understand and improve their experience of the park. Including real-time information on transport links, environmental factors and park usage by the public helps to build up an overview of the whole area so that it can be run more effectively.

Beyond this there is an expectation that use of the 3D model can be extended beyond the Olympic Park and implemented London-wide. This fits into a wider expectation for City Information Modelling (CIM). As Stefan Webb from the Future Cities Catapult describes it, this is the idea that a 3D model containing sufficient data can enable us to forecast the impact of future developments and changes on the functioning of both the physical and social infrastructure of the city.

Nature Smart Cities: Visualising IoT bat monitor data with ViLo

Over the past few weeks I’ve been collaborating with researchers at the Intel Collaborative Research Institute (ICRI) for Urban IoT to integrate data from bat monitors on the Queen Elizabeth Olympic Park into CASA’s digital visualisation platform, ViLo. At present we are visualising the geographic location of each bat monitor with a pin that includes an image showing the locational context of the sensor and a flag indicating the total number of bat calls recorded by that sensor on the previous evening. A summary box in the user interface indicates the total number of bat monitors in the vicinity and the total number of bat calls recorded the previous evening. Animated bats are also displayed above the pins to help users quickly identify which bat monitors have results from the previous evening to look at.
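
To give a flavour of the data handling behind those pins and the summary box, the sketch below shows one way the previous evening’s counts could be aggregated per sensor. It is purely illustrative: the record fields (sensor_id, timestamp) and the 18:00–06:00 “evening” window are assumptions, not the actual Nature Smart Cities schema.

```python
# Illustrative aggregation of bat-call records for the previous evening.
# Field names and the evening window are assumptions for this sketch,
# not the actual Nature Smart Cities data schema.
from collections import Counter
from datetime import datetime, date, time, timedelta

def previous_evening_counts(calls, today=None):
    """calls: iterable of dicts like {'sensor_id': 'echo-01', 'timestamp': datetime}."""
    today = today or date.today()
    start = datetime.combine(today - timedelta(days=1), time(18, 0))  # 6pm yesterday
    end = datetime.combine(today, time(6, 0))                         # 6am today
    per_sensor = Counter(c["sensor_id"] for c in calls
                         if start <= c["timestamp"] < end)
    return {
        "per_sensor": dict(per_sensor),           # drives the flag on each pin
        "total_calls": sum(per_sensor.values()),  # summary box: calls last evening
        "active_sensors": len(per_sensor),        # summary box: monitors reporting
    }
```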

The data being visualised here comes from custom-made ‘Echo Box’ bat monitors that have been specifically designed by ICRI researchers to detect bat calls from ambient sound. They have been created as part of a project called Nature Smart Cities, which intends to develop the world’s first open source system for monitoring bats using Internet of Things (IoT) technology. IoT refers to the idea that all sorts of objects can be made to communicate and share useful information via the internet. Typically IoT devices incorporate some sort of sensor that can process and transmit information about the environment and/or actuators that respond to data by effecting changes within the environment. Examples of IoT devices in a domestic setting would be Philips Hue lighting, which can be controlled remotely using a smartphone app, or Amazon’s Echo, which can respond to voice commands in order to do things like cue up music from Spotify, control your Hue lighting or other IoT devices, and of course order items from Amazon. Billed as a ‘“shazam” for bats’, the ICRI are hoping to use IoT technology to show the value of such approaches for sensing and conserving urban wildlife populations, in this case bats.

Each Echo Box sensor uses an ultrasonic microphone to record a 3 second sample of audio every 6 seconds. The audio is then processed and transformed into an image called a spectrogram. This is a bit like a fingerprint for sound, showing the amplitude of the sound across different frequencies over time. Bat calls can be clearly identified due to their high frequencies. Computer algorithms then analyse the spectrogram, comparing it with those of known bat calls in order to identify which type of bat was most likely to have made the call.
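
As a rough illustration of that step, the sketch below computes a spectrogram with SciPy and measures how much energy falls in a typical bat-call band. The sample rate, window sizes and the 20–120 kHz band are assumptions for illustration; the Echo Box’s own processing pipeline will differ in its details.

```python
# Minimal spectrogram sketch using SciPy; parameters are assumptions,
# not the Echo Box's actual processing pipeline.
import numpy as np
from scipy.signal import spectrogram

FS = 384_000  # assumed ultrasonic sampling rate (Hz)

def bat_band_fraction(audio, fs=FS, low_hz=20_000, high_hz=120_000):
    """Return (f, t, sxx) plus the fraction of spectral energy in the bat band."""
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)
    band = (f >= low_hz) & (f <= high_hz)
    total = sxx.sum()
    return f, t, sxx, (sxx[band].sum() / total if total > 0 else 0.0)

# Stand-in for a 3-second Echo Box sample: synthetic noise
audio = np.random.randn(3 * FS)
f, t, sxx, frac = bat_band_fraction(audio)
print(f"energy in 20-120 kHz band: {frac:.1%}")
```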

The really clever part from a technical perspective is that all of this processing can be done on the device using one of Intel’s Edison chips. Rather than having large amounts of audio transmitted back to a centralised system for storage and analysis, Intel are employing ‘edge processing’, processing on the device at the edge of the network, to massively reduce the amount of data that needs to be sent over the network back to their central data repository. Once the spectrogram has been produced the original sound files are immediately deleted as they are no longer required. Combined with the fact that sounds within the range of human speech, below 20kHz, are ignored by the algorithms that process the data, this ensures that the privacy of passersby is protected.
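
The privacy step could look something like the sketch below: frequency bins below 20 kHz are dropped before anything leaves the device, and only the reduced spectrogram is kept once the raw recording has been deleted. Again, this is an assumed illustration rather than the ICRI’s actual code.

```python
# Illustrative privacy filter (assumed sketch, not the Echo Box implementation):
# keep only spectrogram rows at or above 20 kHz so speech-range content is
# discarded on the device before the data is stored or transmitted.
import numpy as np

def privacy_filter(f, sxx, cutoff_hz=20_000):
    """Drop all frequency bins below the cutoff."""
    keep = f >= cutoff_hz
    return f[keep], sxx[keep, :]

# Toy example: frequency axis up to 192 kHz with random spectrogram values
f = np.linspace(0, 192_000, 513)
sxx = np.random.rand(513, 100)
f_hi, sxx_hi = privacy_filter(f, sxx)
print(sxx.shape, "->", sxx_hi.shape)  # speech-range rows removed
```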

This is a fascinating project and it has been great having access to such an unusual data set. Further work here can focus on visualising previous evenings’ data as a time series to better understand patterns of bat activity over the course of the study. We also hope to investigate the use of sonification by incorporating recordings of typical bat calls for each species in order to create a soundscape that complements the visualisation and engages with the core sonic aspect of the study.

Kind thanks to Sarah Gallacher and the Intel Collaborative Research Institute for providing access to the data. Thanks also to the Queen Elizabeth Olympic Park for enabling this research. For more information about bats on the Queen Elizabeth Olympic Park check out the project website: Nature Smart Cities.

Virtual Architectures at the QEOP Smart Park Demonstrator

From Saturday 13th February to Sunday 21st February the London Legacy Development Corporation (LLDC) held a nine-day ‘Smart Park’ demonstrator event at the ArcelorMittal Orbit tower on the Queen Elizabeth Olympic Park. Exhibits were provided by The Bartlett Centre for Advanced Spatial Analysis (CASA) and the UCL Interaction Centre (UCLIC).

As the event was held during the school half-term, the exhibits were specifically aimed at engaging youngsters. My contribution on behalf of CASA was an immersive virtual reality fly-through of a 3D model of the Olympic Park, created using CityEngine and Unity with the Ordnance Survey’s MasterMap and Building Height data. In order to capture the imagination of visitors the tour was presented as a magic carpet ride. The Oculus Rift virtual reality headset was used to provide the sense of immersion, while a specially prepared soundtrack and an electric fan were used to heighten the impression of flight by simulating rushing wind.


During the course of the week the event was attended by over 1500 local, national and international visitors. In this way we were able to engage the public with the Olympic Park’s work to create a data-rich 3D model of the park for the purpose of ‘smart’ day-to-day management and planning. This also served as an opportunity for a gentle introduction, in a fun and engaging way, to my own research into the use of augmented and virtual reality technologies for providing spatial intelligence and facilitating real-time decision making. Further work will focus on the use of the LLDC’s 3D model of the park and various emerging interaction technologies as a means of interfacing with the site’s underlying operational data infrastructure.


Also exhibiting on behalf of CASA was Sharon Richardson, who is conducting research into sensing and evoking behavioural change in urban spaces through dynamic data sources and the Internet of Things. Sharon’s exhibit used a webcam and computer vision to sense the emotional state of visitors and present it back to them as a real-time visualisation. Going forward, Sharon hopes to take control of the Park’s fountains to visualise the emotional state of the park.


UCLIC’s contributions included their VoxBox and a robot called Roam.io. Developed in partnership with the Intel Collaborative Research Institute (ICRI), these provide novel and engaging interfaces for ‘soft sensing’ of visitor opinions.

Jointly these ongoing collaborations will investigate and contribute to aspects of the LLDC’s participation in a pan-European programme for Smart Sustainable Districts (SSD) which focuses on four primary areas of the park’s daily operation:

  • Resource Efficient Buildings – Focusing initially on the iconic London Aquatics Centre and Copper Box Arena, this workstream will create tools and approaches to enable low cost, low energy, low environmental impact management and maintenance of future ready non-domestic buildings.
  • Energy Systems – The energy systems workstream will create an efficient, smart, low carbon, resilient energy ecosystem, with specific focal points including optimisation of district energy systems, community engagement and benefits and increased renewable energy generation.
  • Smart Park / Future Living – Implementing user facing digital and data solutions that deliver financial and CO2 efficiencies and prioritise quality of life improvements for those who live, work and visit the Park.
  • Data Architecture and Management – Underpinning the three workstreams above, this involves implementing efficient and robust data management solutions that support the identification and trialling of innovative solutions and provide the foundation for improved Park operations, user experience and approaches that can be replicated by others, including through the London Data Store.

The SSD project is overseen by Climate-KIC, one of three Knowledge and Innovation Communities created in 2010 by the European Institute of Innovation and Technology (EIT).

Our thanks to the LLDC and the management of ArcelorMittal Orbit for a fun and eventful week!

Images courtesy of Twitter user Ben Edmonds @benjaminpeter77.