Networked Gestures


posted in: Social VR

Ever since joining eleVR a few months ago, I have been exploring various apps to test whether virtual social art studios are feasible. Some of my favorite apps in this vein have been SculptrVR, Anyland, and a handful of the third-party apps within AltspaceVR. After spending a decent amount of time in all of these apps, however, I found my expressivity limited by the awkward gestures and interfaces required by simplistic VR controllers. Instead of adapting my virtual art studio behavior to inadequate interfaces, I wanted the ability to interact with my environments, objects, tools, and other users using more natural and embodied hand gestures. I wasn’t sure exactly what gestures I envisioned, but I knew that awkward controller buttons were limiting my more organic hand movements. I looked for creative apps that used more “free-form” input devices like the LeapMotion, but of all the apps I tested, none supported interactivity with hand-tracked inputs.

 

Using disembodied controllers to attempt to draw lines in SculpTogether within AltspaceVR.

 

Since there were no apps available to help me explore natural gesture inputs for creative interactions, I decided to prototype a simple app myself that would use hand-tracking input to interact with imported 3D sculpture models, all in a social VR setting.   By creating my own app, I figured I could explore a range of gestures based on my actual needs.  By creating the app as a social VR app, I could also test the natural gesture and interaction ideas with the other eleVR members, so we could not only play with personal interaction modes but also collaborative ones: an exploration of networked gestures.

I didn’t have any experience building social VR apps, so I settled on AltspaceVR as a platform because of the robustness of its built-in social features (from avatars to audio conferencing), its innovative, open platform for supporting third-party apps, and because its SDK seemed to have some minimal LeapMotion support.

After starting on the prototype, I discovered that although the AltspaceSDK had some support for LeapMotion input data, it had no support for taking that data and exposing it to third-party apps for use in interaction.  I decided to spend some time building in that support for the AltspaceSDK in general, so that not only would I have the functionality I needed for my prototype, but also so that the functionality would be available for other developers who might have their own novel interaction ideas.  When I had finished the new functionality, I gave a talk in VR during an AltspaceSDK meetup to teach other developers how to use it.

 

Touching

Once the LeapMotion interaction functionality for the AltspaceSDK was ready, I wrote a few simple apps to sketch out initial interaction ideas for networked gestures.  The first app I wrote was a touching app.  It loads a floating, red cube into a social VR space in AltspaceVR that you can walk up to and poke with your finger, which is hand tracked via Leap Motion.  When you touch the cube, it turns from red to green.  The directness of the gesture felt satisfying and promising.  From this test, I realized that an important factor for the feeling of immediacy was access to powerful hardware to reduce lag and support smooth animation.  Here is a short animation showing the poking gesture app being tested by users:

 

User poking a cube with hand-tracked finger inputs.
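The core of the touch test can be sketched as a simple per-frame containment check: if the tracked fingertip falls inside the cube's bounds, flip the color. This is a minimal sketch in plain JavaScript; the function and field names are illustrative placeholders, not the actual AltspaceSDK or LeapMotion API.

```javascript
// Illustrative sketch of the touching app's core logic (not the real
// AltspaceSDK API): a fingertip position tested against a cube's bounds.

function makeCube(center, size) {
  return { center, size, color: 'red' };
}

// Axis-aligned containment test for a fingertip point against the cube.
function fingertipInsideCube(tip, cube) {
  const half = cube.size / 2;
  return ['x', 'y', 'z'].every(
    axis => Math.abs(tip[axis] - cube.center[axis]) <= half
  );
}

// Called once per frame with the latest hand-tracking data.
function update(cube, fingertip) {
  if (fingertipInsideCube(fingertip, cube)) {
    cube.color = 'green';
  }
  return cube.color;
}

const cube = makeCube({ x: 0, y: 1.2, z: -0.5 }, 0.3);
console.log(update(cube, { x: 1, y: 1, z: 0 }));           // still 'red'
console.log(update(cube, { x: 0.05, y: 1.25, z: -0.45 })); // turns 'green'
```

In the real app this check runs every frame against live LeapMotion data, which is why low-lag hardware mattered so much for the feeling of immediacy.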

 

Pushing

Next, I wanted to test an interaction that was more social.   So, the second app I wrote was a pushing app.  This app again loads a floating cube, but this time it can be pushed around with your hands.  With multiple users, the interaction can then become social by pushing the cube back and forth, or attempting to move it collaboratively to a new location.  The motions felt organic and free-form, which was great.  However, they were also surprising because the fidelity was so high that I almost expected tactile feedback from the pushed cube.  Since it’s just a virtual cube, however, the interaction is only partly immersive.  The hybridity was an interesting result that I’d like to explore further in future networked gesture experiments.  However, watching other users push the cube around was extremely compelling because suddenly users’ avatar hands went from lifeless, stiff representations to palms and fingers that looked and felt alive.  Here are a couple of videos that show the pushing interaction with single and multiple users:

 

A single user pushing a cube
Two networked users pushing a cube
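The pushing behavior can be sketched as an overlap test plus a velocity-based nudge: when a tracked palm intersects the cube, the cube moves along the palm's direction of travel. In the real app the resulting position would also be synchronized to all connected users so everyone sees the same motion; the names below are illustrative, not the actual AltspaceSDK API.

```javascript
// Illustrative sketch of the pushing app's core logic (not the real
// AltspaceSDK API): a palm sphere nudging a cube it overlaps.

// Does the palm (a point with a radius) overlap the cube's bounds?
function overlaps(palm, cube) {
  const reach = cube.size / 2 + palm.radius;
  return ['x', 'y', 'z'].every(
    axis => Math.abs(palm[axis] - cube.center[axis]) <= reach
  );
}

// Nudge the cube along the palm's velocity; dt is frame time in seconds.
function push(cube, palm, velocity, dt) {
  if (overlaps(palm, cube)) {
    for (const axis of ['x', 'y', 'z']) {
      cube.center[axis] += velocity[axis] * dt;
    }
  }
  return cube.center;
}

const cube = { center: { x: 0, y: 1, z: -1 }, size: 0.4 };
const palm = { x: 0.25, y: 1, z: -1, radius: 0.08 };
push(cube, palm, { x: -0.5, y: 0, z: 0 }, 0.016);
console.log(cube.center);
```

With multiple users, the same logic simply runs per tracked hand, which is what lets two people push the cube back and forth or steer it together.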

 

I then adapted this pushing app to load one of my scanned art studio objects instead of a basic cube.  Currently in my studio, I am making hybrid sculptures out of cake and plaster.  I 3D-scanned one of these and put it into the pushing app.  While the model’s geometry wasn’t completely accurate, the basic interaction was exciting to see.  I was happy to watch virtual users interacting with objects from my art studio using their own personal gestures.  Here is a screenshot of the oversized cake object being touched virtually:

 

Users pushing a scanned art object with hand-tracked inputs.

 

Playing

Because the LeapMotion interaction functionality was made available to all AltspaceSDK developers, some users have started using my contributed API for their own experiments.  User kai has already created an app that lets you play a virtual grand piano with your LeapMotion-tracked hands.  He also added the ability to play the same virtual piano via a MIDI interface, so if multiple users are at the same piano, and some have LeapMotions and others have MIDI keyboards, they can collaborate on the virtual piano with completely different input interactions.  It’s a sort of hybrid piano duet, with both free-hand playing and actual piano playing.

Playing a virtual instrument with natural hand gestures.
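What makes the hybrid duet possible is that both input methods can converge on the same note event: a MIDI keyboard reports a note number directly, while a hand-tracked fingertip's position has to be quantized to a key first. Here is a minimal sketch of that convergence, using the standard MIDI-to-frequency formula; the fingertip mapping is my own illustrative guess, not kai's actual implementation.

```javascript
// Two input paths converging on one note representation. The
// fingertipToNote mapping is illustrative, not kai's actual code.

// Standard MIDI-to-frequency conversion (A4 = note 69 = 440 Hz).
function midiToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Quantize a fingertip x position (meters across a keyboard of
// keyboardWidth meters) onto one of keyCount keys from lowestNote up.
function fingertipToNote(x, keyboardWidth, keyCount, lowestNote) {
  const key = Math.min(
    keyCount - 1,
    Math.max(0, Math.floor((x / keyboardWidth) * keyCount))
  );
  return lowestNote + key;
}

// A MIDI keyboard and a LeapMotion hand can trigger the same pitch:
console.log(midiToFrequency(69));                    // 440
console.log(fingertipToNote(0.55, 1.0, 88, 21));     // a key near middle
```

Once both paths produce a note number, networking the piano reduces to broadcasting small note-on/note-off events, regardless of which device a user is playing on.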

 

 

Future Networked Gestures

These initial experiments into networked gesture interfaces have been exciting to both develop and play with.  Now that the basic functionality is there, I look forward to testing new natural interaction modes that I haven’t yet considered.  Hopefully, by playing with these simple initial apps more and more, our bodies can naturally lead us to new interaction ideas to try.  I also look forward to keeping track of what other apps the AltspaceVR community comes up with!

To read more and stay up to date with these ideas, head over to our Networked Gestures project page.

If you’d like to try out the touching and pushing apps for yourself, head on over to AltspaceVR, using your Vive and a LeapMotion attachment, and load these two URLs into any volumetric enclosure:

elevr.com/apps/touch
elevr.com/apps/push

…and let us know how it feels to you!

-Evelyn