Monthly Archives: February 2015

Pigeon Sim with Leap Motion

Of all the demos developed here at CASA, Pigeon Sim remains a firm favourite. Using a Kinect to capture body movements, the simulator allows the user to fly and flap their way around London in Google Earth. What I didn’t know is that it also has a mode for Leap Motion. Steven Gray, who teaches Big Data Analytics at CASA, demonstrates how to use it in the video below:

If you want to give it a try, the full Pigeon Sim code repository is available on GitHub. To run the Leap Motion version you should only need to download the contents of the ‘web_client’ folder. Open the ‘index.html’ file in your web browser and append ‘?enableLeap=1’ to the URL (e.g. web_client/index.html?enableLeap=1). Happy flapping!
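For context, that flag works like any other URL query parameter: the page reads it from the query string and switches Leap Motion input on. The snippet below is only a rough sketch of the idea, not code from the Pigeon Sim repository; everything apart from the ‘enableLeap=1’ parameter itself is hypothetical.

```typescript
// Illustrative sketch only: read the 'enableLeap' flag from the page URL.
// The function name and fallback behaviour are hypothetical, not from Pigeon Sim.
function isLeapEnabled(): boolean {
  const params = new URLSearchParams(window.location.search);
  return params.get("enableLeap") === "1";
}

if (isLeapEnabled()) {
  console.log("Leap Motion control requested"); // e.g. index.html?enableLeap=1
} else {
  console.log("Using the default (Kinect/keyboard) controls");
}
```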

Hands-on VR with Leap Motion

Leap Motion

Virtual Architectures is excited to have received our Leap Motion gestural sensor. The device works a little like the Microsoft Kinect, which tracks full body movements, but has a much higher resolution, making it suitable for tracking specific hand movements. Small enough to mount on the Oculus Rift VR headset, it provides a particularly intuitive interface for interacting with virtual environments.

Although we’ve yet to try any of the VR demos created for the device, first impressions are good. After downloading the sensor’s software we had demos from the Leap Motion app store running within minutes.
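Getting hand data into your own page is similarly quick. The sketch below uses the leapjs browser client (loaded as leap.js via a script tag) to log palm positions for every tracking frame. It assumes the Leap Motion service is running locally, and is a minimal illustration rather than code from any of the demos.

```typescript
// Minimal sketch with the leapjs browser client; Leap is the global exposed by leap.js.
declare const Leap: any; // typed loosely here for brevity

// Leap.loop() invokes the callback for each tracking frame from the local Leap service.
Leap.loop((frame: any) => {
  for (const hand of frame.hands) {
    // palmPosition is [x, y, z] in millimetres relative to the sensor
    const [x, y, z] = hand.palmPosition;
    console.log(`${hand.type} hand at x=${x.toFixed(0)}, y=${y.toFixed(0)}, z=${z.toFixed(0)}`);
  }
});
```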

With an affordable price tag of €96.99 for the VR Developer Bundle and integrations for both the Unity and Unreal game engines, Leap Motion is a compelling option for exploring alternative methods of user interaction in VR. Other projects allowing, I’m looking forward to experimenting with it in the coming months.

Google Earth Pro is Free to Download

I recently learned via the Google Earth Blog that Google Earth Pro is now free to download. I hadn’t used Google Earth for some time, so I decided to try out the Movie Maker feature, which is Pro only.

With Movie Maker it is possible to record live navigation with the mouse, although this doesn’t tend to give a very smooth or professional result. Using a specially designed 3D mouse such as the Space Navigator works much better. For my test I opted to make my video from a quick tour I created zooming into the Tower of London from orbit. This was achieved by selecting a sequence of points for the camera to visit, and Movie Maker was then used to convert that tour to video. It took about 30 minutes to render, with a fade in and fade out added in Adobe Premiere.

The attribution details at the bottom of the screen can be a little distracting because they change as the view moves between imagery from different satellites. It would be possible to crop these out, but Google does insist on particular guidelines covering permission to use and attribute its imagery. There are also occasional issues where parts of the 3D geometry flicker due to caching. Nevertheless, the results are very good for a quick and easy visualisation!

The combination of satellite imagery and 3D geometry is great for engaging viewers and giving a map context. Other features, such as recorded voice-overs and the ability to highlight areas of interest, can also be useful depending on the application. A comparison of features between the free and Pro versions of Google Earth can be found here, along with the link to download it.

VUCITY: Approaching real-time city information

On Tuesday I was invited to visit Wagstaffs Design in London to take a look at their latest product, VUCITY. Powered by the Unity game engine, VUCITY offers an interactive 3D model of London that can be deployed to a touch-screen table, video wall, tablet or desktop computer as required. The standalone application enables users to rotate and explore the entire scene and zoom down to the scale of individual buildings.

VUCITY Table

Currently VUCITY covers 80 square kilometres of Central London, from Earls Court in the west to the ExCeL exhibition centre in the east, and from Old Street in the north to Battersea in the south. The project is a joint venture between Wagstaffs and Vertex Modelling, who provide highly detailed 3D models of the London area. Created from high-resolution imagery, the models boast an average accuracy tolerance of 40mm compared with full measured surveys. This degree of accuracy is particularly important for the viewshed analysis and visualisation proposed as a possible application in Wagstaffs’ promotional video:

Over time Wagstaffs are seeking to integrate a range of data, including demographics and property prices over time, as well as live data streams for transport. The inclusion of real-time data is possibly the most exciting aspect of the project but also presents the biggest challenge. This is a fantastic project and I’ll be following it keenly. I thoroughly recommend heading over to the Wagstaffs website to check out VUCITY and their other great projects today.