Using your smartphone (any modern phone with a built-in accelerometer should work), visit the Cast Your Spell page created by Nick Strayer. (If you need to type the URL into your phone's browser directly, here's a shortlink: bit.ly/castspell.) Scroll down, click the "Press To Cast!" button, and then wave your phone like a wand, tracing one of the shapes shown.
The app will attempt to detect which of the four "spells" your gesture matches. It was pretty confident in its detection when I cast "Incendio", but your mileage may vary depending on your wizarding ability and the underlying classification model.
Nick Strayer described how he built this application in a presentation at Data Day Texas last month. The app itself was built with Shiny, using the shinysense package (on GitHub) to collect movement data from the phone's accelerometer. Nick then trained a convolutional neural network on his own casting-gesture data, using the keras package, to classify each gesture as one of the four "spells". (Interesting side note: because the CNN's features aren't tied to a fixed position in time, you can gesture in reverse and still pass the classification test.)
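To give a flavor of what the convolutional step in such a pipeline does, here is a minimal sketch (in Python with numpy, purely for illustration; the actual app was built in R with the keras package, and the window size, kernel, and variable names below are assumptions, not taken from Nick's code). A 1D convolution slides a small filter along the accelerometer time series, producing features that depend on local motion patterns rather than absolute position in the recording:

```python
import numpy as np

# Hypothetical accelerometer recording: 64 time samples of (x, y, z)
# motion, standing in for the window a phone sensor would stream.
rng = np.random.default_rng(0)
window = rng.normal(size=(64, 3))

def conv1d(signal, kernel):
    """Valid-mode 1D convolution, applied independently to each axis."""
    n, n_axes = signal.shape
    k = len(kernel)
    out = np.empty((n - k + 1, n_axes))
    for a in range(n_axes):
        for i in range(n - k + 1):
            out[i, a] = np.dot(signal[i:i + k, a], kernel)
    return out

# A simple smoothing filter; in a trained CNN, each learned filter
# plays this role, but its weights come from training data.
kernel = np.ones(5) / 5
features = conv1d(window, kernel)
print(features.shape)  # each axis shrinks by (kernel length - 1)
```

In a real CNN, many such filters are learned and stacked, followed by pooling and a dense layer that maps the pooled features to the four spell classes.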
It's almost like magic! For the complete details on how the Cast Your Spell app was constructed, see Nick Strayer's presentation at the link below.