Real Virtual Physics

I’ve been working on a math/physics exploration environment that uses the Vive VR headset’s tracking of real-world objects (particularly the hand controllers) to combine real things with virtual augmentation and real-time visualizations.

It’s currently playable (with some bugs) on the Vive in webVR Chromium at vihart.github.io/physics; the code is on github, or you could just watch the video demo.

Instructions:

There are two modes: experiment mode and analysis mode. You switch between them by squeezing the controllers. Leaving experiment mode currently causes a pause (during which the headset goes blank) while the objects get updated, because I haven’t fixed that or made it fast enough yet.
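
For the curious, the grip toggle could look something like this (a minimal sketch in TypeScript, assuming the webVR-era Gamepad API where the Vive grip is usually buttons[2]; not necessarily how the actual project does it):

```typescript
// Sketch: toggle between experiment and analysis mode on a grip squeeze.
// Assumes a Vive controller shows up in navigator.getGamepads() with an id
// like "OpenVR Gamepad" and that the grip is buttons[2] — both assumptions.

type Mode = "experiment" | "analysis";
let mode: Mode = "experiment";
const gripWasPressed: Record<number, boolean> = {};

function pollGrips(): void {
  for (const pad of navigator.getGamepads()) {
    if (!pad || !/OpenVR|Vive/i.test(pad.id)) continue;
    const pressed = pad.buttons[2]?.pressed ?? false;
    // Toggle only on the squeeze's rising edge so holding the grip
    // doesn't flip modes every frame.
    if (pressed && !gripWasPressed[pad.index]) {
      mode = mode === "experiment" ? "analysis" : "experiment";
      // The object updates that happen on this switch are what cause
      // the pause mentioned above.
    }
    gripWasPressed[pad.index] = pressed;
  }
}
```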

In analysis mode, you can grab and move most objects, except the points left by the controller movement. Those you can touch to see at what millisecond they were dropped, along with their xyz coordinates; the corresponding point turns green on both graphs. You can change the colors of the lights by grabbing them and touching the thumbpad, and pressing the thumbpad changes their intensity.
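
The points themselves are basically timestamped samples. Here’s a minimal sketch of that data structure and the touch behavior, with illustrative names rather than the project’s actual ones:

```typescript
// Sketch: each dropped point remembers when and where it was left, so
// touching it in analysis mode can report its time and xyz and highlight
// the matching entry on both graphs. Names here are illustrative.

interface Sample {
  timeMs: number;                      // milliseconds since the drop started
  position: [number, number, number];  // x, y, z in room coordinates
}

const samples: Sample[] = [];

function recordSample(timeMs: number, x: number, y: number, z: number): void {
  samples.push({ timeMs, position: [x, y, z] });
}

// Hypothetical handler for the controller touching the i-th dropped point.
function onTouchSample(i: number, highlight: (index: number) => void): void {
  const s = samples[i];
  console.log(`dropped at ${s.timeMs} ms, xyz = ${s.position.join(", ")}`);
  highlight(i); // e.g. turn the corresponding point green on both graphs
}
```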

To align the virtual space with the room, put your headset on the floor in the middle of the space and press x. You can also move the virtual space using the u, i, o, j, k, and l keys.
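
One way the x-key alignment could work is to treat the headset’s resting spot as the new origin (a minimal sketch; getHeadsetPosition() and the world-offset approach are assumptions, not the project’s code):

```typescript
// Sketch: when x is pressed, shift the virtual scene so the headset's
// current position (resting on the floor) becomes (0, 0, 0).

declare function getHeadsetPosition(): [number, number, number]; // hypothetical helper

let worldOffset: [number, number, number] = [0, 0, 0];

function recenter(headsetPos: [number, number, number]): void {
  // Negate the headset position to move that spot to the origin.
  worldOffset = [-headsetPos[0], -headsetPos[1], -headsetPos[2]];
}

window.addEventListener("keydown", (e) => {
  if (e.key === "x") recenter(getHeadsetPosition());
  // The u/i, o/j, k/l pairs would nudge worldOffset along each axis similarly.
});
```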

Inspiration:

This work was inspired by some of Alan Kay’s Squeak Etoys experiments in math and physics education, particularly the Galilean Gravity project, where children took video of actual objects falling and analyzed the frames to see the acceleration.[1] I thought: wouldn’t it be even better to see the “frames” in real time as the object falls? We could simply drop a Vive controller, track it, and draw its path in real time!
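
The core idea in code is just sampling the controller’s tracked position every frame and leaving a point behind (a minimal sketch, assuming the webVR-era Gamepad pose extension; the actual rendering is left out):

```typescript
// Sketch: record the falling controller's position every frame so the
// "frames" of the drop appear in real time. The pose extension isn't in
// the standard DOM types, so it is declared here for the sketch.

interface VRGamepad extends Gamepad {
  pose?: { position: Float32Array | null };
}

interface TrailPoint { timeMs: number; position: [number, number, number]; }
const trail: TrailPoint[] = [];

function sampleControllers(timeMs: number): void {
  for (const pad of navigator.getGamepads() as (VRGamepad | null)[]) {
    const pos = pad?.pose?.position;   // [x, y, z] when the controller is tracked
    if (!pos) continue;
    trail.push({ timeMs, position: [pos[0], pos[1], pos[2]] });
    // A small marker would be added to the scene at this position here.
  }
}

function loop(timeMs: number): void {
  sampleControllers(timeMs);
  requestAnimationFrame(loop);         // in-headset this would use the VR display's own rAF
}
requestAnimationFrame(loop);
```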

This led to other ideas, and more research. Obviously we should also be showing a graph of position over time, and speed on another graph, which we can use to help understand derivatives! I showed this to Alan and he pointed me to Ronald Thornton’s work using distance tracking to help students get a real, embodied sense of the difference between distance and acceleration.[2]
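
Speed then falls out of the same samples by finite differences, which is what the second graph would show. As a sanity check, a freely falling controller (g ≈ 9.8 m/s²) should show speed growing by roughly 0.98 m/s for every 100 ms between samples. A minimal sketch:

```typescript
// Sketch: estimate speed between consecutive position samples by
// finite differences — distance travelled divided by elapsed time.

interface TimedPoint { timeMs: number; position: [number, number, number]; }

function speeds(points: TimedPoint[]): number[] {
  const out: number[] = [];
  for (let i = 1; i < points.length; i++) {
    const dt = (points[i].timeMs - points[i - 1].timeMs) / 1000; // seconds
    const [x0, y0, z0] = points[i - 1].position;
    const [x1, y1, z1] = points[i].position;
    const dist = Math.hypot(x1 - x0, y1 - y0, z1 - z0);          // same units as tracking (metres for the Vive)
    out.push(dist / dt);                                          // units per second
  }
  return out;
}
```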

More generally, I’m reading up on previous work on embodied learning by education greats like Montessori and Papert.[3, 4] I believe a lot of that work can be taken much further with VR technology. Developing intuition for graphs, derivatives, representations, etc., is so important for both math and physics understanding, and what better way than to see it in real time as you move your body?

There’s plenty of low-hanging fruit here: instant-feedback math and physics experiences that are fun to play with. More soon, I hope.

Vi Hart

[1] I heard the behind-the-scenes version from Alan Kay in person, but he also describes this work in “Squeak Etoys, Children & Learning”: http://www.vpri.org/pdf/rn2005001_learning.pdf

[2] Prof Ronald K. Thornton at Tufts University, who shows this work in video form here: https://www.youtube.com/watch?v=iyKRLL67s3Q

[3] Maria Montessori, who developed the Montessori approach to education, which includes lots of embodied learning and movement: https://en.wikipedia.org/wiki/Montessori_education

[4] Seymour Papert, who wrote Mindstorms and developed LOGO, a programming language where children learn basic math/physics concepts by programming a “turtle” to draw lines. Children use their understanding of how their own bodies move (or actually act out the instructions) to better understand the mathematical instructions.