Augmented Reality Sandbox – a day of development

Last weekend, we collaborated with JBA Trust to run a hackathon built around their latest physical model – an Augmented Reality Sandbox.

Teams from different parts of the company gathered at our Belle Vue Mills office to innovate and have a bit of fun at the same time. Three teams were formed, and all three managed to prototype their idea and present it to the rest of the group within the day. Here’s a summary of what they came up with:

Team 1 – Justin and Paul

What did you develop?

Our team developed two iOS (Apple device) applications. The first provides a means of generating a custom command to run the sandbox software; the command is built up as the user adjusts a series of parameters in the app itself. The second lets the user create a new colour ramp for the contour shading using a colour picker.
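
As a rough sketch of the idea, the first app essentially assembles a launch command like the one below. The flag names follow the SARndbox documentation, but exactly which parameters the team's app exposes, and the code shown here, are assumptions rather than the actual implementation.

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Sketch of the idea: build a SARndbox launch command from UI parameters.
// The flags (-uhm, -fpv, -rer, -evr) follow the SARndbox documentation;
// which parameters the team's app actually exposes is an assumption.
std::string buildLaunchCommand(double minRainElevation,
                               double maxRainElevation,
                               double evaporationRate)
{
    std::ostringstream cmd;
    cmd << "./bin/SARndbox -uhm -fpv"   // height colour map, fixed projector view
        << " -rer " << minRainElevation << ' ' << maxRainElevation // rain elevation range
        << " -evr " << evaporationRate; // water evaporation rate
    return cmd.str();
}

int main()
{
    // Prints e.g. "./bin/SARndbox -uhm -fpv -rer -10 50 -evr 0.05"
    std::cout << buildLaunchCommand(-10.0, 50.0, 0.05) << '\n';
}
```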

How did you develop it?

We used Xcode on a Mac to design and code the user interfaces.

Why is it useful?

It is useful because the current method of changing the sandbox’s parameters is laborious and requires an understanding of the code. Exposing these parameters through a user interface lets people without any coding experience modify the behaviour of the software.

Team 2 – The Mermaids (Heather and Camilla)

What did you develop?

We modified the code so that the water flows more slowly in the parts of the sandbox where we had increased the attenuation.

How did you develop it?

We modified the sandbox code so that, instead of using a single global variable for the attenuation constant, it reads a location-specific attenuation value from a texture we created for that purpose.
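
The change itself lives in the sandbox’s GPU shader code, where the attenuation is sampled from a texture each simulation step. A minimal CPU-side sketch of the same idea, with hypothetical names, looks like this:

```cpp
#include <vector>

// Conceptual sketch only: the real SARndbox water simulation runs in GLSL
// shaders on the GPU, where the team sampled the attenuation from a texture.
// Here the same idea is shown on the CPU: a per-cell grid replaces the
// single global constant. All names are hypothetical.
struct WaterSim
{
    int width = 0, height = 0;
    std::vector<float> velocity;     // one velocity value per grid cell
    std::vector<float> attenuation;  // per-cell attenuation (was one global)

    // Before: every cell decayed by the same global factor.
    // After: each cell decays by its own, location-specific factor,
    // so water slows down more in high-attenuation regions.
    void applyAttenuation(float dt)
    {
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
            {
                int i = y * width + x;
                velocity[i] *= 1.0f - attenuation[i] * dt;
            }
    }
};
```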

Why is it useful?

This helps us make simulations where some areas of the sandbox have greater roughness than others.

Team 3 – Ed, Tasmin and Charlie

What did you develop?

A mode switch to change the function of the standard hand gesture, plus a system for sending commands (in this case a single “Toggle Mode” command) into the software from external processes.

How did you develop it?

We first added a ‘water’ mode that works with the existing rain functionality. We then created a new tool from a template and added a mode called ‘vegetation’, which we adapted to drain the water from the sandbox whenever a hand is present. Finally, the software checks each frame for any incoming messages (e.g. ‘Vegetation’) and acts on those it finds, as sketched below.
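
A minimal sketch of that per-frame check, assuming a named pipe as the message source (the pipe path, class name and command names here are illustrative, not the team’s actual code):

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <string>

// Hypothetical sketch: poll a named pipe once per frame for text commands
// such as "ToggleMode". The pipe path, class name and command names are
// illustrative assumptions, not the team's actual code.
class CommandPipe
{
    int fd;

public:
    explicit CommandPipe(const char* path)
    {
        // O_NONBLOCK so an empty pipe never stalls the render loop
        fd = open(path, O_RDONLY | O_NONBLOCK);
    }
    ~CommandPipe() { if (fd >= 0) close(fd); }

    // Returns one pending command, or an empty string if nothing arrived.
    std::string poll()
    {
        if (fd < 0) return std::string();
        char buf[256];
        ssize_t n = read(fd, buf, sizeof(buf) - 1);
        if (n <= 0) return std::string();          // no data this frame
        buf[n] = '\0';
        std::string cmd(buf);
        while (!cmd.empty() && (cmd.back() == '\n' || cmd.back() == '\r'))
            cmd.pop_back();                        // strip trailing newline
        return cmd;
    }
};

// Inside the per-frame update, e.g.:
//   static CommandPipe pipe("/tmp/sandbox_cmds");
//   if (pipe.poll() == "ToggleMode")
//       toggleHandGestureMode();                  // hypothetical handler
```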

Why is it useful?

Different modes for the hand gesture could be used to apply different parameters to certain parts of the model. For example, vegetation mode could be used to ‘paint’ different values for roughness and permeability across the model.

Once you can pipe text into the process to execute commands, it opens up a world of options for where that text comes from. External scripts in languages other than C++, network sockets and speech recognition systems are just a few examples.
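
Assuming the named pipe sketched above, for instance, a shell one-liner such as ‘echo ToggleMode > /tmp/sandbox_cmds’ (after creating the pipe once with ‘mkfifo /tmp/sandbox_cmds’) is enough to drive the sandbox, and a Python script or a speech recognition system could write to the same pipe just as easily.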

Great to be involved

Andrew Gubbin, Director, commented, “Our first hack was a great success. Three products were developed that we will be able to introduce to the sandbox in due course to expand its functionality.

“It was particularly pleasing to be able to involve staff from outside the software development teams. We look forward to the next hack involving an even wider cross-section of our core services, and we will investigate involving other offices through remote access. An excellent day, and thank you to all involved.”

Finally, congratulations go to the Mermaids, Heather and Camilla, whose idea was voted the winner of the day!

Want to know more?

To find out more about the Hackathon day and plans for the future please contact Sam Griffiths.

You can read more about the JBA Trust Augmented Reality Sandbox on their website. The sandbox, and their other physical models, are all available to hire for events and training opportunities. Simply contact Alex Scott for more information.


