On Friday the Google Summer of Code Blog posted some very interesting news about touchEarth, created by Pawel Solyga.
touchEarth is an application he developed that lets you control Google Earth using two-finger gestures on a multi-touch table. touchEarth uses the Google Earth COM API to control some of Google Earth’s features, while all the multi-touch screen events are sent to touchEarth from touchlib (or OpenTouch) using the TUIO protocol.
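To give a rough sense of what "two-finger gestures driving the camera" involves: TUIO reports each finger as a cursor with normalized (x, y) coordinates, and a pinch gesture boils down to comparing finger spacing between frames. This is just an illustrative sketch of that math, not touchEarth's actual code; the function names are my own, and on the output side the real app would feed the result into the Google Earth COM API rather than print it.

```python
import math

def pinch_zoom_factor(prev, curr):
    """Given the previous and current positions of two fingers
    (each a pair of (x, y) tuples in TUIO's normalized 0..1 space),
    return the relative zoom factor implied by the change in finger
    spacing: > 1 means the fingers spread apart (zoom in), < 1 means
    they pinched together (zoom out)."""
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)
    before, after = spread(prev), spread(curr)
    if before == 0:
        return 1.0  # degenerate case: fingers started at the same spot
    return after / before

# Two fingers 0.1 apart spread to 0.2 apart -> roughly 2x zoom in.
factor = pinch_zoom_factor(
    prev=[(0.45, 0.5), (0.55, 0.5)],
    curr=[(0.40, 0.5), (0.60, 0.5)],
)
```

In the real pipeline, touchlib would deliver those cursor positions over TUIO (OSC over UDP), and a factor like this would be translated into a new camera range for Google Earth via its COM interface.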
Most of us have seen multi-touch interfaces before with the ESRI touch table, but allowing Google Earth to be controlled this way will get the technology into more places. Of course, I’m not exactly looking forward to everyone dragging their greasy hands across my MacBook Pro in a few years, but coupled with that LG Touch LCD screen, we could be seeing some really interesting implementations going forward.
Video after the jump.