We (Kostas, Daniel Lam and I) have finally finished an augmented-reality app for an exhibit at the “Almost Lost: London’s Buildings Loved and Loathed” exhibition organised by English Heritage at the Quadriga Gallery, Wellington Arch. The event ( http://goo.gl/Qq6twI ) is open until 2nd February, for anyone interested in the architectural conservation history of London.
Our work in the exhibit was to build an AR app which adds animations (smoke, people, carriages) to a physical model and also augments it with the present-day 3D buildings of the same area and RAF imagery. Screenshots of the app (in the making) are below. The press coverage of the event can be seen at http://goo.gl/DtPG60 and http://goo.gl/OvQvMT .
Credits: 3D model of Bloomsbury by Blom
A more refined visualization of the same data used in the last post. The video above is a demo of an interactive programme which can be downloaded here (32-bit or 64-bit).
To give a brief explanation, the visual canvas above shows a negative map (from OpenStreetMap) of central London with all the docks of Barclays Cycle Hire as orange dots. The name of a dock can be viewed by moving the mouse over its dot. One can view the number of trips made ‘from’ or ‘to’ a dock by clicking its dot to select it. Pressing the ‘t’ key toggles between the ‘start mode’ and the ‘end mode’. Two or more docks can be selected by dragging a box around them (just like in AutoCAD). Pressing the ‘r’ key deselects all the docks.
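The selection and mode state behind those key bindings can be sketched roughly as below. This is a plain-Java sketch with hypothetical names, not the actual code of the sketch:

```java
import java.util.HashSet;
import java.util.Set;

// Minimal model of the interaction state: 't' toggles between
// start ('from') and end ('to') mode, 'r' clears the selection,
// and clicking a dot adds that dock to the selected set.
public class DockSelection {
    private boolean startMode = true;            // true = 'from' trips, false = 'to' trips
    private final Set<Integer> selected = new HashSet<>();

    public void keyPressed(char key) {
        if (key == 't') startMode = !startMode;  // toggle start/end mode
        else if (key == 'r') selected.clear();   // deselect all docks
    }

    public void click(int dockIndex) { selected.add(dockIndex); }

    public boolean isStartMode() { return startMode; }
    public Set<Integer> selected() { return selected; }

    public static void main(String[] args) {
        DockSelection s = new DockSelection();
        s.click(3);
        s.keyPressed('t');
        System.out.println(s.isStartMode() + " " + s.selected().size()); // false 1
        s.keyPressed('r');
        System.out.println(s.selected().size()); // 0
    }
}
```

In the real Processing sketch the same state would be driven from `keyPressed()` and `mousePressed()` callbacks; box-selection would just add every dock whose dot falls inside the dragged rectangle.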
The major work here was deriving the dock information from the trip data. I solved the problem by creating an array of dock class objects, grown by serially reading the trip data and adding each dock to the array after checking whether it is already in the list. The comparison is done through the location co-ordinates to remove any ambiguity caused by names or IDs. Apart from that, building the interactivity was quite tricky to understand in the beginning.
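The grow-and-check step above can be sketched as follows. This is a plain-Java sketch with made-up field names and coordinates; the real sketch uses its own dock class:

```java
import java.util.ArrayList;
import java.util.List;

// Build the dock list by scanning trips serially and appending a dock
// only if no dock with the same coordinates is already in the list.
public class DockList {
    static class Dock {
        final String name;
        final double lat, lon;
        Dock(String name, double lat, double lon) {
            this.name = name; this.lat = lat; this.lon = lon;
        }
    }

    final List<Dock> docks = new ArrayList<>();

    // Compare by coordinates, not by name or ID, so that the same dock
    // recorded under inconsistent names does not create a duplicate.
    void addIfNew(String name, double lat, double lon) {
        for (Dock d : docks) {
            if (d.lat == lat && d.lon == lon) return; // already known
        }
        docks.add(new Dock(name, lat, lon));
    }

    public static void main(String[] args) {
        DockList list = new DockList();
        list.addIfNew("Hyde Park Corner", 51.5027, -0.1527);
        list.addIfNew("Hyde Park Cnr",   51.5027, -0.1527); // same dock, different name
        list.addIfNew("Waterloo",        51.5036, -0.1133);
        System.out.println(list.docks.size()); // 2
    }
}
```

The linear scan is O(n) per insertion, which is fine for a few hundred docks; a coordinate-keyed map would do the same job faster for larger lists.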
The video was made by recording the sketch frame by frame as .png images and stitching them up in Blender (thanks to Rex Harby). As usual, I request the readers to give it a try and share the results and problems in the comments section below. Also feel free to put in your suggestions and point out any mistakes.
Note: the first 20 seconds of the video don’t have much activity, since it is really too early in the day to cycle.
After an intensive term with R, it was time to move slightly away from the world of geographic data towards more abstract visualizations with Processing. The above video is the first attempt at doing so. The work was done as coursework for the Digital Visualization course with Dr. Martin Z Austwick, Centre for Advanced Spatial Analysis, UCL.
The base data is a CSV file extracted from the Barclays Cycle Hire feed provided by Transport for London on 25th December. It has information on each journey made with the bikes: the location and time of origin and destination, and the total duration of the journey. The whole CSV is loaded into Processing and parsed into an array of instance objects holding the location, time and whether each event is a ‘hire’ or a ‘return’. The array is then visualized according to a timer event which progresses through every frame.
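The parsing step can be sketched like this in plain Java: each journey row yields two events, a ‘hire’ at the origin and a ‘return’ at the destination. The column layout here is an assumption for illustration, not the exact format of the TfL feed:

```java
import java.util.ArrayList;
import java.util.List;

public class TripParser {
    // One visual event: a dot at (x, y), shown at the given minute,
    // green if it is a hire, red if it is a return.
    static class Event {
        final double x, y;
        final int minute;      // minutes after 00:00
        final boolean isHire;
        Event(double x, double y, int minute, boolean isHire) {
            this.x = x; this.y = y; this.minute = minute; this.isHire = isHire;
        }
    }

    // Assumed columns: startX,startY,startMinute,endX,endY,endMinute
    static List<Event> parse(List<String> rows) {
        List<Event> events = new ArrayList<>();
        for (String row : rows) {
            String[] f = row.split(",");
            events.add(new Event(Double.parseDouble(f[0]), Double.parseDouble(f[1]),
                                 Integer.parseInt(f[2]), true));   // hire at origin
            events.add(new Event(Double.parseDouble(f[3]), Double.parseDouble(f[4]),
                                 Integer.parseInt(f[5]), false));  // return at destination
        }
        return events;
    }

    public static void main(String[] args) {
        List<Event> events = parse(List.of("0.1,0.2,540,0.4,0.5,555"));
        System.out.println(events.size() + " " + events.get(0).isHire + " "
                           + events.get(1).minute); // 2 true 555
    }
}
```

The timer loop then amounts to incrementing a simulated-minute counter each frame and drawing every event whose minute has been reached.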
In the final visualization shown above, the dots represent the relative locations of the bike docks at which bikes are hired or returned, and the color of each dot shows whether the event is a hire (green) or a return (red). Since the animation is cumulative (it builds up as time progresses), brighter spots indicate more activity than darker ones, and the scale of the color from red to green shows whether a location is more of a destination or an origin for the trips. The timer shows the number of minutes elapsed from 00:00 Hrs.
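One way to read that color scale as code is below: brightness grows with total activity at a dock, and the hue slides from red (mostly returns) towards green (mostly hires). The scaling constants are illustrative only, not the values used in the sketch:

```java
public class DockColor {
    // Returns {r, g, b} in 0..255 for a dock with the given event counts.
    static int[] color(int hires, int returns) {
        int total = hires + returns;
        double brightness = Math.min(1.0, total / 50.0);          // more activity -> brighter
        double hireShare = total == 0 ? 0.5 : (double) hires / total;
        int g = (int) Math.round(255 * brightness * hireShare);    // origin-heavy -> green
        int r = (int) Math.round(255 * brightness * (1 - hireShare)); // destination-heavy -> red
        return new int[] { r, g, 0 };
    }

    public static void main(String[] args) {
        int[] origin = color(40, 10);   // mostly hires -> greener
        int[] dest   = color(10, 40);   // mostly returns -> redder
        System.out.println(origin[1] > origin[0]); // true
        System.out.println(dest[0] > dest[1]);     // true
    }
}
```

In the actual sketch the same effect falls out of additive drawing: overlapping translucent green and red dots accumulate frame by frame, so the blend at each dock emerges without computing the ratio explicitly.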
Though it is not very informative at present, the visualization shows the potential of Processing as a tool for presenting complex data sets in a more digestible form.