The idea of embodied knowledge seemed fuzzy to me at first, but after a bit of research and experience I'm convinced it's a rich area that is just now becoming ripe, thanks to the cheap, easy spatial tracking hardware designed for virtual reality. I love my conscious logical mind with its mathematical precision, but I also know that before we put ideas into precise words or mathematical language, there's a great deal of abstract thought that goes uncaptured and unexpressed.
Until now, we've had no way to capture embodied knowledge, the things our bodies understand intuitively without needing help from the conscious mind. Things like space, distance, and motion, as they are felt, rather than as numbers. The ability to reach out and grab the object you want without taking the time to remember what it's called or what it looks like. Our ability to get a feel for the behavior and properties of an object, how heavy it is, how big, how brittle, how it moves, how it falls, how it interacts with other objects, all completely unconsciously during the routine handling of things. The embodied sense of how things are, where things are, and when things are, in a fuzzy general picture that allows us to reason complexly and effectively, even if not mathematically precisely.
Recall what visual diagrams have done to augment our thinking: the ability to take in an entire graph at a glance, or to see things arranged spatially into categories as a metaphor for a non-spatial categorization, is now so common that it's easy to forget these things were invented. Take Venn diagrams, the seemingly simple and obvious tool for visualizing overlapping categories. They were introduced to the world by John Venn in his 1880 paper "On the Diagrammatic and Mechanical Representation of Propositions and Reasonings":
Taking written notes, too, is a very recent development in human history, given how long the human brain has been basically the same. But oh how it augments our memory, our planning abilities, and our reasoning abilities beyond those of our ancestors! The best part of these tools is that once you have them in your head, anyone can use them. I like to think there are undiscovered thinking tools right on the horizon, designed to help people reason more effectively by leveraging the expert knowledge of their own bodies, rather than requiring educational resources and experiences that are out of reach for many.
What really convinced me to pursue embodied cognition was our work last year on embodied physics understanding, in Real Virtual Physics. It’s easier to get a feel for what a graph means when it’s reacting in real time to your actual motions in actual space. The graphs in our project were the same old flat visual picture changing in real time, but the useful part is that you don’t need to visually multitask by comparing this picture to some other visual thing like seeing a ball bounce or watching video of an object fall. You know what the motion is because you’re the one moving the object. Your body knows. And once that body knowledge is connected to the abstract representation of motion, the hardware is no longer required. The idea is what matters.
I'm convinced that between VR technology's ability to track spatially in real time and our embodied knowledge, there are going to be some new ways to capture and work with abstract thought. What thinking tools, as obvious as the Venn diagram, have we not invented yet, just because we didn't have quick, easy sketching tools for 3D space? What can we jot down when we can jot down motions and behaviors, rather than only words and static pictures? Here in our fourth year of VR research, I feel ready to find out. We can track bodies. We can effortlessly sketch objects in space. And finally, there are enough tools out there for doing these things that we don't have to write them all ourselves!
Maybe some of these tools/representations will even transcend the technology and become available as pure thinking tools. But where to start?
As part of our embodiment research we’ve been reading “Philosophy in the Flesh”, by Lakoff and Johnson, which discusses the idea that logical schemas come from our embodied experience with the environment. We get a feel for the abstract categorical idea of inside/outside from our physical experiences going inside and outside things, putting objects inside and outside of physical containers, etc. From that experience, we’re able to reason about groups of objects unrelated by space/distance by abstractly visualizing them as grouped in space, using the container schema. A typical representation of this idea might look like this:
Not very exciting, this one-circle Venn diagram. We’re used to infographics that group objects, pictures, and logos in space. But I don’t take this idea or visualization for granted! Given that many attempts at the foundations of mathematics have to do with categories and sets, I’m extra interested in this theory of how we get the idea/understanding of categories in the first place. If bodies and actions in space have something to do with it, I want to know!
So, to better understand the idea of the container schema and embodied logic, and also to practice sketching ideas that are spatial or behavioral, I used the VR creation environment Anyland to quickly sketch 50 iterations of the container schema (following the art prompt of "do 50 iterations of something" that we've all been trying recently). Some are purely visual, some interactive, some demanding embodied action. A tour of the results is on YouTube:
I didn’t invent the new Venn diagram, but it was a good exercise to give myself a deeper understanding of the container schema concept and the many things it can apply to. After spending a couple days making the above, I’m seeing the world in terms of container schema. It’s making me see things differently, which is exactly what a good tool for thought should do. Perhaps as I continue trying out ways of sketching ideas, the feedback loop of creating/seeing will lead to something interesting.
P.S. "Container Schema" is publicly accessible in Anyland, so if you have a Vive you can go check it out for yourself!