The visualisation is similar to what I did for the IRIS competition earlier, but the difference is in the backend. Instead of reading a preset data file and displaying it, this map has a MySQL database as its backend, queries it through PHP, and visualises the result. It also has a PHP-based POST mechanism that lets users send data to the database. The best part is that none of the data in the image above was collected or entered by me (except for my two data points); it was generated by people who individually entered their own locations.
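As a rough illustration of the POST mechanism, the client page could serialise a user's location into a URL-encoded body and submit it to the PHP script, which would read it from `$_POST` and insert it into MySQL. Everything below is a hedged sketch: the function name, field names, and endpoint `add_location.php` are my assumptions, not the actual schema.

```javascript
// Hypothetical helper: serialise a user-submitted location into the
// URL-encoded body a PHP endpoint could read from $_POST.
// Field names (name, lat, lng) are illustrative, not the real schema.
function buildLocationPayload(name, lat, lng) {
  const params = new URLSearchParams({
    name: name,
    lat: lat.toFixed(6), // fixed precision; six decimals is ~0.1 m
    lng: lng.toFixed(6),
  });
  return params.toString();
}

// The page could then submit it (endpoint name is an assumption):
// fetch('add_location.php', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
//   body: buildLocationPayload('London', 51.5074, -0.1278),
// });
```

On the server, the matching PHP would simply read `$_POST['name']`, `$_POST['lat']` and `$_POST['lng']` and run a parameterised INSERT.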
The functions currently available on this map are:
1) Speak to navigate – Click on the microphone button on the text box (or press Ctrl+Alt+.) and start speaking. The field will recognise when you stop speaking, analyse and interpret the sound as text, and, if it is a place, take you right to the place you just spoke.
2) Zoom and pan – You can use the same trick with some preliminary commands as well. The system currently understands:
“east direction”, “west direction”, “south direction” and “north direction” pan the map in the corresponding direction.
“zoom in” / “zoom out” zooms the map in or out.
3) Other commands:
“satellite” – switches the map to a satellite view.
“simple” – switches the map back to the simplified default look and feel shown above.
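The command handling above can be sketched as a small dispatcher: once the speech field has produced a transcript, match it against the known commands and, if nothing matches, fall through and treat the text as a place name for the geocoder. This is a minimal sketch assuming a map object with the Google Maps JavaScript API methods `panBy()`, `getZoom()`, `setZoom()` and `setMapTypeId()`; the function name, the 100-pixel pan step, and the fallback behaviour are my assumptions, not the actual implementation.

```javascript
// Hypothetical dispatcher for the recognised voice commands.
// `map` is assumed to expose the Google Maps JS API methods
// panBy(x, y), getZoom(), setZoom(z) and setMapTypeId(id).
function interpretCommand(map, transcript) {
  const text = transcript.trim().toLowerCase();
  const PAN = 100; // pixels to pan per command (illustrative value)
  switch (text) {
    case 'east direction':  map.panBy(PAN, 0);  return true;
    case 'west direction':  map.panBy(-PAN, 0); return true;
    case 'north direction': map.panBy(0, -PAN); return true;
    case 'south direction': map.panBy(0, PAN);  return true;
    case 'zoom in':  map.setZoom(map.getZoom() + 1); return true;
    case 'zoom out': map.setZoom(map.getZoom() - 1); return true;
    case 'satellite': map.setMapTypeId('satellite'); return true;
    case 'simple':    map.setMapTypeId('roadmap');   return true;
    default:
      // Not a command: the caller would hand the text to the
      // geocoder and pan to the resulting place instead.
      return false;
  }
}
```

Returning a boolean keeps the two behaviours cleanly separated: `true` means the utterance was consumed as a command, `false` means it should be geocoded as a place name.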
As usual, I request the readers to give it a try and share their results and any problems in the comments section below. Also feel free to put in your suggestions and point out any mistakes.
Our team (Daniel, Kostas and I) has just finished three entries for the competition An Eye on IRIS, which aims to produce visualisations of data on the esteem won by UCL researchers all around the world, for which we had been provided with a small subset of the IRIS database.