
Digital Literacy in the context of Smart Cities

September 8th is UNESCO’s International Literacy Day. This year the theme is ‘Literacy in a digital world’:

At record speed, digital technologies are fundamentally changing the way people live, work, learn and socialise everywhere. They are giving new possibilities to people to improve all areas of their lives including access to information; knowledge management; networking; social services; industrial production, and mode of work. However, those who lack access to digital technologies and the knowledge, skills and competencies required to navigate them, can end up marginalised in increasingly digitally driven societies. Literacy is one such essential skill.

Just as knowledge, skills and competencies evolve in the digital world, so does what it means to be literate. In order to close the literacy skills gap and reduce inequalities, this year’s International Literacy Day will highlight the challenges and opportunities in promoting literacy in the digital world, a world where, despite progress, at least 750 million adults and 264 million out-of-school children still lack basic literacy skills.

International Literacy Day is celebrated annually worldwide and brings together governments, multi- and bilateral organizations, NGOs, private sectors, communities, teachers, learners and experts in the field. It is an occasion to mark achievements and reflect on ways to counter remaining challenges for the promotion of literacy as an integral part of lifelong learning within and beyond the 2030 Education Agenda.

Over the past few days I’ve been preparing for a conference talk this weekend, and it has become clear to me that digital literacy is of key importance in helping individuals engage with urban technologies and exercise digital agency. It is through digital literacy that people living in cities will be able to understand, and make informed decisions about, the use and impact of emerging technologies. Smartphones, the Internet of Things, driverless cars, drones, artificial intelligence and automation can be very daunting, and their implications are often unclear.

A common response to the perceived imposition of digital technologies is to try to disconnect. It is up to each individual to determine the extent to which they engage with such technologies, but ignoring them altogether is no solution. At the very least we have to provide the opportunity for those who are sufficiently capable to inform themselves, enabling them to assess more effectively the advantages and disadvantages of different technologies. We need to move away from the kind of binary thinking that leads to an all-or-nothing approach to technology. Fostering digital literacy is key to helping individuals and communities negotiate lives that are increasingly mediated by digital technologies.

At the conference on Monday afternoon I’ll be presenting my paper ‘Opening Urban Mirror Worlds: Possibilities for Participation in Digital Urban Dataspaces’. In this talk I’ll discuss some of the ways in which technologies like virtual and augmented reality might be used to give people access to urban data. I’ll also be part of a panel discussing ‘Engagement in the Smart City’. Further details can be found on the conference website: Whose Right To The Smart City.


Nature Smart Cities: Visualising IoT bat monitor data with ViLo

Over the past few weeks I’ve been collaborating with researchers at the Intel Collaborative Research Institute (ICRI) for Urban IoT to integrate data from bat monitors on the Queen Elizabeth Olympic Park into CASA’s digital visualisation platform, ViLo. At present we are visualising the geographic location of each bat monitor with a pin that includes an image showing the sensor’s locational context and a flag indicating the total number of bat calls recorded by that sensor the previous evening. A summary box in the user interface shows the total number of bat monitors in the vicinity and the total number of bat calls recorded the previous evening. Animated bats are also displayed above the pins to help users quickly identify which monitors have results from the previous evening to look at.
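As a rough illustration of the data behind those pins and the summary box, here is a minimal sketch of the kind of per-sensor aggregation involved. The record format, field names and sensor IDs are assumptions for illustration only, not the actual schema of ViLo or the Echo Box data feed.

```python
# Minimal sketch (assumed data format): count the previous evening's bat
# calls per sensor - the figures shown on each pin and in the summary box.
from collections import Counter
from datetime import date, timedelta

# Illustrative sample records: (sensor_id, date_the_call_was_recorded)
records = [
    ("echo-box-01", date(2017, 9, 7)),
    ("echo-box-01", date(2017, 9, 7)),
    ("echo-box-02", date(2017, 9, 7)),
]

def previous_evening_totals(records, today=None):
    """Return a Counter of calls per sensor for the previous evening."""
    today = today or date.today()
    yesterday = today - timedelta(days=1)
    return Counter(sensor for sensor, recorded_on in records if recorded_on == yesterday)

totals = previous_evening_totals(records, today=date(2017, 9, 8))
print(f"{len(totals)} monitors with activity, {sum(totals.values())} calls in total")
```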

The data being visualised here comes from custom-made ‘Echo Box’ bat monitors that have been specifically designed by ICRI researchers to detect bat calls from ambient sound. They were created as part of a project called Nature Smart Cities, which aims to develop the world’s first open source system for monitoring bats using Internet of Things (IoT) technology. IoT refers to the idea that all sorts of objects can be made to communicate and share useful information via the internet. Typically IoT devices incorporate some sort of sensor that can process and transmit information about the environment and/or actuators that respond to data by effecting changes within the environment. Examples of IoT devices in a domestic setting would be Philips Hue lighting, which can be controlled remotely using a smartphone app, or Amazon’s Echo, which can respond to voice commands in order to do things like cue up music from Spotify, control your Hue lighting or other IoT devices, and of course order items from Amazon. Billed as a ‘”Shazam” for bats’, the ICRI hopes to use IoT technology to show the value of similar systems for sensing and conserving urban wildlife populations, in this case bats.

Each Echo Box sensor uses an ultrasonic microphone to record a 3-second sample of audio every 6 seconds. The audio is then processed and transformed into an image called a spectrogram. This is a bit like a fingerprint for sound, showing the amplitude of the signal across different frequencies over time. Bat calls can be clearly identified due to their high frequencies. Computer algorithms then analyse the spectrogram, comparing it to those of known bat calls in order to identify which type of bat was most likely to have made the call.
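For readers curious what that step might look like in code, here is a minimal sketch of turning an audio clip into a spectrogram and flagging clips with energy above 20 kHz. The sample rate, window sizes and power threshold are assumptions for illustration; they are not the Echo Box’s actual parameters or classification method.

```python
# Minimal sketch (assumed parameters): build a spectrogram from a short
# audio clip and flag clips with significant ultrasonic energy.
import numpy as np
from scipy import signal

SAMPLE_RATE = 192_000  # assumed; ultrasonic recording needs a high sample rate

def compute_spectrogram(audio, sample_rate=SAMPLE_RATE):
    """Return frequencies (Hz), times (s) and power values for the clip."""
    freqs, times, spec = signal.spectrogram(
        audio, fs=sample_rate, window="hann", nperseg=1024, noverlap=512
    )
    return freqs, times, spec

def has_ultrasonic_energy(freqs, spec, threshold_hz=20_000, power_floor=1e-6):
    """Crude stand-in for call detection: any strong energy above 20 kHz?"""
    ultrasonic = spec[freqs >= threshold_hz, :]
    return bool(np.any(ultrasonic > power_floor))

# Example with synthetic data: a 3-second clip containing a 45 kHz tone.
t = np.arange(0, 3.0, 1.0 / SAMPLE_RATE)
clip = 0.1 * np.sin(2 * np.pi * 45_000 * t)
freqs, times, spec = compute_spectrogram(clip)
print(has_ultrasonic_energy(freqs, spec))  # True
```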

The really clever part from a technical perspective is that all of this processing is done on the device itself using one of Intel’s Edison chips. Rather than transmitting large amounts of audio back to a centralised system for storage and analysis, Intel are employing ‘edge processing’, processing on the device at the edge of the network, to massively reduce the amount of data that needs to be sent over the network to their central data repository. Once the spectrogram has been produced the original sound files are immediately deleted, as they are no longer required. Combined with the fact that sounds within the range of human speech (below 20 kHz) are ignored by the algorithms that process the data, this helps ensure that the privacy of passers-by is protected.
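The sketch below illustrates that edge-processing idea under the same illustrative assumptions as the snippet above: only a small summary record leaves the device, the sub-20 kHz portion of the spectrogram is discarded, and the raw audio is not retained. The summary fields are hypothetical, not the project’s actual payload.

```python
# Minimal sketch (hypothetical payload and parameters): an on-device step
# that keeps only ultrasonic frequencies, summarises the clip, and does
# not retain the raw audio.
import json
import numpy as np
from scipy import signal

def process_on_device(audio, sample_rate=192_000):
    freqs, _, spec = signal.spectrogram(audio, fs=sample_rate, nperseg=1024)

    # Privacy safeguard: drop everything in the human speech range (< 20 kHz).
    keep = freqs >= 20_000
    freqs, spec = freqs[keep], spec[keep, :]

    # Compact summary - a few hundred bytes instead of megabytes of audio.
    summary = {
        "call_detected": bool(np.any(spec > 1e-6)),
        "peak_frequency_hz": float(freqs[np.argmax(spec.max(axis=1))]),
    }

    del audio  # the original sound sample is discarded once summarised
    return json.dumps(summary)

# Example with a synthetic 3-second clip containing a 45 kHz tone.
t = np.arange(0, 3.0, 1.0 / 192_000)
print(process_on_device(0.1 * np.sin(2 * np.pi * 45_000 * t)))
```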

This is a fascinating project and it has been great having access to such an unusual data set. Further work here could focus on visualising previous evenings’ data as a time series to better understand patterns of bat activity over the course of the study. We also hope to investigate the use of sonification by incorporating recordings of typical bat calls for each species, creating a soundscape that complements the visualisation and engages with the core sonic aspect of the study.

Kind thanks to Sarah Gallacher and the Intel Collaborative Research Institute for providing access to the data. Thanks also to the Queen Elizabeth Olympic Park for enabling this research. For more information about bats on the Queen Elizabeth Olympic Park check out the project website: Nature Smart Cities.