Inspired by the interactive, educational AR Sandbox from UC Davis, we mashed up a Microsoft Kinect sensor running on Linux with Augmented Reality running in Unity to create a generative design project with a new type of interface. When you reshape the soil in the display, the Kinect captures the new terrain and a fresh cityscape is dynamically generated on top of it.
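
The core loop is simple: grab a depth frame from the Kinect, downsample it into a coarse heightmap, and hand that heightmap to the generative code. Below is a minimal sketch of the capture-and-downsample step in Python, assuming the OpenKinect libfreenect bindings (freenect.sync_get_depth); the grid size and the depth handling are illustrative choices, not the exact values from our build.

```python
import numpy as np
import freenect  # OpenKinect libfreenect Python bindings (assumed capture path)

GRID_ROWS, GRID_COLS = 24, 32  # illustrative heightmap resolution

def capture_heightmap():
    """Grab one Kinect depth frame and reduce it to a coarse heightmap."""
    depth, _timestamp = freenect.sync_get_depth()  # 480x640 array of raw depth readings
    depth = depth.astype(np.float32)

    # Block-average the 480x640 frame down to GRID_ROWS x GRID_COLS cells.
    block_h = depth.shape[0] // GRID_ROWS
    block_w = depth.shape[1] // GRID_COLS
    coarse = depth.reshape(GRID_ROWS, block_h, GRID_COLS, block_w).mean(axis=(1, 3))

    # Closer soil means a smaller depth reading, so invert to get height above the table.
    return coarse.max() - coarse

if __name__ == "__main__":
    print(capture_heightmap().shape)  # (24, 32)
```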

Inspiration

Our project was inspired by the "AR Sandbox", a National Science Foundation project from the University of California, Davis.
Interactive
The AR Sandbox is hard not to touch. Indeed, it is meant to be touched.
Educational
Hundreds of AR Sandboxes are used in educational settings to help the public understand the geosciences.
Challenging
There's some serious tech in there: Linux, a Microsoft Kinect sensor, and Augmented Reality.
Open Source
The AR Sandbox is open source, which means it is free to hack! We LOVE open source.

What We Accomplished in One Weekend!

No Previous Code
All new code was written this weekend.
It Works!
It's kind of amazing it works! There are a lot of systems in play: the Kinect, Linux, TCP/IP, Unity, Augmented Reality, and a sandbox. A minimal sketch of the TCP hand-off appears after this list.
Open Source
The work is open source, so we expect others to improve on it.
Surprised Ourselves
We're surprised we got it done.
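
The Kinect side runs on Linux while Unity consumes the heightmaps, so the two halves talk over TCP. The sketch below shows a simple length-prefixed sender in Python; the wire format (a 4-byte length header followed by raw float32 cells) is an assumption for illustration, not the exact protocol we shipped.

```python
import socket
import struct
import numpy as np

UNITY_HOST, UNITY_PORT = "127.0.0.1", 9000  # illustrative address for the Unity listener

def send_heightmap(sock, heightmap):
    """Send one heightmap frame as a 4-byte length header plus raw float32 cells."""
    payload = heightmap.astype(np.float32).tobytes()
    sock.sendall(struct.pack("!I", len(payload)) + payload)

if __name__ == "__main__":
    with socket.create_connection((UNITY_HOST, UNITY_PORT)) as sock:
        frame = np.random.rand(24, 32).astype(np.float32)  # stand-in for a captured heightmap
        send_heightmap(sock, frame)
```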

Future Improvements

Multiple AR Devices
Our work is built on Unity, which means we can easily deploy to at least FOUR other Augmented Reality devices.
Better Generative Design Building Algorithms
We can improve the building generator so it produces more varied and complex structures; a sketch of one possible approach appears after this list.
More Algorithms!
We can add algorithms for roads, parks, commercial districts, and entertainment zones.
SAND!
We can use SAND and mount a projector for added fidelity (and cleaner fingernails).
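
As a starting point for the building generator, here is a sketch of one possible rule set: classify each heightmap cell by height and emit a building (or park, or water) for Unity to instantiate. The thresholds and the jittered footprints are illustrative assumptions, not the algorithm we ran this weekend.

```python
import numpy as np

# Illustrative height thresholds (in heightmap units) separating zone types.
WATER_LEVEL, PARK_LEVEL, LOWRISE_LEVEL = 5.0, 20.0, 60.0

def generate_city(heightmap, rng=None):
    """Turn a coarse heightmap into a list of simple building/zone descriptions."""
    rng = rng or np.random.default_rng()
    city = []
    for (row, col), h in np.ndenumerate(heightmap):
        if h < WATER_LEVEL:
            city.append({"cell": (row, col), "type": "water"})
        elif h < PARK_LEVEL:
            city.append({"cell": (row, col), "type": "park"})
        else:
            # Taller terrain gets taller buildings, with a little jitter for variety.
            stories = int(h // 10) + int(rng.integers(0, 3))
            footprint = 0.5 + 0.4 * rng.random()  # fraction of the cell the building covers
            kind = "lowrise" if h < LOWRISE_LEVEL else "highrise"
            city.append({"cell": (row, col), "type": kind,
                         "stories": stories, "footprint": footprint})
    return city
```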