August 2016 – Ongoing
An educational and exploratory tool for understanding physics, using real-time data about real-world objects, displayed and manipulated from within VR. We use VR tracking technology to capture the physical motion of real objects falling, accelerating, swinging, and moving in any way, while also viewing metadata about these objects’ position over time, for an embodied understanding of motion, graphs, and derivatives.
At the moment there are two “toys” in addition to the more common graph representations, each highlighted in one of the two videos on the project.
Instructions for using Real Virtual Physics:
For now, make sure only one controller is powered on and in use.
To align the virtual space with the room, put your headset on the floor in the middle of the space and press X. You can also move the virtual space using the U, I, O, J, K, and L keys.
To toggle whether the website shows up in the headset (going “into VR mode”), use the button above the thumbpad on the controller, or press Enter, Space, or V.
By default you’ll start in analysis mode, where you can move objects with the controller and its trigger button.
There are two modes, experiment mode and analysis mode; you switch between them by squeezing the side buttons on the Vive controller. In experiment mode, the controller leaves a trail of red dots behind it, and the physics toys react to your movement. Leaving experiment mode currently causes a pause (during which the headset goes blank) while the objects get updated (hopefully we’ll make this faster soon).
When in analysis mode, you can grab and move most objects. Touch one of the points left by the controller’s movement to see the millisecond at which it was dropped and its xyz coordinates; the corresponding point turns green on both graphs.
The scene is currently lit by three lights, which appear as wireframe spheres in analysis mode. You can change a light’s color by grabbing it and touching the thumbpad, or change its intensity by pressing the thumbpad.
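The timestamped trail points described above could be modeled along these lines. This is an illustrative sketch, not the project’s actual code: the record shape, function names, and the touch radius are all assumptions.

```javascript
// Sketch of a trail point dropped in experiment mode: each dot stores its
// drop time in milliseconds and xyz position, so touching it in analysis
// mode can report both. (Hypothetical names; not the project's code.)
function makeTrailPoint(timeMs, x, y, z) {
  return { timeMs, x, y, z, highlighted: false };
}

// Find the trail point nearest the controller tip, within a touch radius
// (radius in meters is an assumed value).
function touchPoint(points, tip, radius = 0.03) {
  let best = null;
  let bestDist = radius;
  for (const p of points) {
    const d = Math.hypot(p.x - tip.x, p.y - tip.y, p.z - tip.z);
    if (d <= bestDist) {
      best = p;
      bestDist = d;
    }
  }
  if (best) best.highlighted = true; // the "turn green on both graphs" flag
  return best; // null if nothing is within reach
}
```

Touching a dot then amounts to passing the controller-tip position to `touchPoint` each frame and displaying the returned point’s `timeMs` and coordinates.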
This work was inspired by some of Alan’s Squeak Etoys experiments in math and physics education, particularly the Galilean Gravity project, where children took video of actual objects falling and analyzed the frames to see the acceleration.[1] I thought: wouldn’t it be even better to see the “frames” in real time as the object falls? We could simply drop a Vive controller while we track it and draw its path in real time!
This led to other ideas, and more research. Obviously we should also be showing the graph of position over time, with speed on another graph that we can use to help understand derivatives! I showed this to Alan, and he referenced Ronald Thornton’s work using distance tracking to help students get a real embodied sense of the difference between distance and acceleration.[2] Yoshiki managed to hunt down a very interesting paper evaluating this work as part of a curriculum.[3]
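The speed graph can be derived numerically from the tracked positions: speed is just the finite-difference derivative of position over time. A minimal generic sketch of that idea (not the project’s actual code; sample and field names are assumptions):

```javascript
// Given timestamped position samples {timeMs, x, y, z} from a tracked
// controller, compute a speed sample between each consecutive pair by
// finite differences: speed = distance traveled / elapsed time.
function speeds(samples) {
  const out = [];
  for (let i = 1; i < samples.length; i++) {
    const a = samples[i - 1];
    const b = samples[i];
    const dt = (b.timeMs - a.timeMs) / 1000; // seconds
    const dist = Math.hypot(b.x - a.x, b.y - a.y, b.z - a.z); // meters
    out.push({ timeMs: b.timeMs, speed: dist / dt }); // meters per second
  }
  return out;
}
```

For a dropped controller, plotting these values against time would show speed growing linearly, which is exactly the embodied view of the derivative that the graphs are meant to convey.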
Generally, I’m reading up on previous work involving embodied learning by education greats like Montessori and Papert.[4, 5] A lot of the work they’ve done can, I believe, be much improved through VR technology. Developing intuition for graphs, derivatives, representations, etc. is so important for both math and physics understanding, and what better way than to see it in real time as you move your body?
There’s plenty of low-hanging fruit here, in terms of physics and math instant-feedback that is fun to play with. More soon, I hope.
[1] I heard the behind-the-scenes version from Alan Kay in person, but he also describes this work in “Squeak Etoys, Children & Learning”: http://www.vpri.org/pdf/rn2005001_learning.pdf
[2] Prof. Ronald K. Thornton at Tufts University, who shows this work in video form here: https://www.youtube.com/watch?v=iyKRLL67s3Q
[3] Paper related to the above, by Ronald K. Thornton and David R. Sokoloff, “Assessing student learning of Newton’s laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula”
[4] Maria Montessori, who developed the Montessori approach to education, which includes lots of embodied learning and movement: https://en.wikipedia.org/wiki/Montessori_education
[5] Seymour Papert, who wrote Mindstorms and developed LOGO, a programming language where children learn basic math/physics concepts by programming a “turtle” to draw lines. Children use their understanding of how their own bodies move (or actually act out the instructions) to better understand the mathematical instructions.
Related blog posts: http://elevr.com/real-virtual-physics/