Haptics & NullSpace VR
TRANSCRIPT
Haptic Feedback in VR with NullSpace VR
By Jonathan Palmer
What is this talk?
Haptic Feedback in AR/VR.
Best practices for creating haptic content
Creating finer quality haptics
Our Suit & Developer API
Who am I?
Jonathan Palmer
Lead Game Dev at NullSpaceVR
How do I deal with Haptics?
● Conceptual
● Software
● Practical
● Gameplay implications
Diving Right In
Understand the Body
● Normalizing Haptics
● Timing
● Processing Time
● Attention & Focus
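"Normalizing haptics" here means that the same design-time strength should feel comparable on every part of the body, since regions differ in sensitivity. A minimal sketch of that idea, with entirely illustrative sensitivity values (not NullSpace VR data):

```python
# Hypothetical per-region sensitivity table: 1.0 = baseline feel.
# All numbers are illustrative assumptions, not measured values.
REGION_SENSITIVITY = {
    "chest": 1.0,
    "upper_arm": 0.7,   # less sensitive, so output is boosted
    "forearm": 0.85,
    "shoulder": 0.9,
}

def normalize_intensity(raw: float, region: str) -> float:
    """Scale a 0..1 design intensity so it feels similar on every region."""
    sensitivity = REGION_SENSITIVITY.get(region, 1.0)
    return max(0.0, min(1.0, raw / sensitivity))
```

An unlisted region falls back to the baseline, and the result is clamped so a boosted value never exceeds the motor's usable range.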
Reusable Components
● System created haptics
● Code created haptics
● File defined haptics
System Created Haptics
● Gathering
● Traversal across body
● Locally Randomized
● Emanation
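A traversal effect is a base effect played across a chain of pads with a short delay between each, so the sensation sweeps over the body. A sketch of how a system might schedule one; the region names and the `(region, start_ms)` schedule format are assumptions, not the NullSpace VR API:

```python
# Hypothetical traversal generator: one base effect swept along a pad chain.
def traversal_schedule(regions, step_ms=80):
    """Return (region, start_ms) pairs for a sweep across `regions`."""
    return [(region, i * step_ms) for i, region in enumerate(regions)]

# Example: a sweep from the left forearm up to the chest.
sweep = traversal_schedule(
    ["left_forearm", "left_upper_arm", "left_shoulder", "chest"]
)
```

Emanation and gathering are the same idea with different orderings: regions sorted outward from an origin, or the schedule reversed so the sweep converges on it.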
Finer Quality Haptics
Principles of Animation
Squash & Stretch
Anticipation
Staging
Timing
Exaggeration
Principles of Haptics
Interpreted Simultaneity
Attention & Focus
Relative Scale
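"Interpreted simultaneity" refers to the fact that haptic events starting close together in time are perceived as a single event. A sketch of clustering event start times into perceived groups; the 50 ms window is an illustrative value, not a figure from the talk:

```python
# Hypothetical grouping pass: events within `window_ms` of a cluster's first
# event are assumed to be read by the user as one sensation.
def group_simultaneous(start_times_ms, window_ms=50):
    """Group start times into clusters likely perceived as single events."""
    groups = []
    for t in sorted(start_times_ms):
        if groups and t - groups[-1][0] <= window_ms:
            groups[-1].append(t)
        else:
            groups.append([t])
    return groups
```

A designer can use such a grouping to decide which effects must be spaced further apart to stay distinguishable.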
Best Practices of Haptics
Understand the Human Body
Timing is important
Reusable components
Use systems to create complex effects
Understand normalizing haptics
Nodal Graph for emanation and gathering effects
Pay attention!
Moving On
NullSpace VR’s Suit
16 Haptic Regions
Built-in Tracking
Closed Alpha focusing on content & integration
Consumer & VR Arcade targeted
Audio to Haptic
VR Headset agnostic
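One simple way to picture the audio-to-haptic feature is an amplitude-envelope pass: chunk the audio into frames and map each frame's peak amplitude to a motor intensity. This is a hypothetical sketch under that assumption, not the suit's actual pipeline; the frame size and 0..1 intensity scale are illustrative:

```python
# Hypothetical audio-to-haptic pass: samples in -1..1 become per-frame
# intensities in 0..1 by taking each frame's peak absolute amplitude.
def audio_to_haptic(samples, frame_size=256):
    """Downsample an audio buffer into one haptic intensity per frame."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples), frame_size)]
    return [min(1.0, max(abs(s) for s in frame)) for frame in frames if frame]
```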
NullSpace VR’s Developer API
Lightweight background engine (comparable to the Vive/Oculus runtimes)
Unity support
Unreal support coming soon
Code Defined Haptics
Create & play them in-line.
Combine effects to create compound effects.
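The shape of code-defined haptics: build small named effects in-line, then combine them into compound effects. The `HapticEffect` class and `sequence` helper below are illustrative assumptions, not the real NullSpace VR API:

```python
# Hypothetical in-line effect definition and combination.
class HapticEffect:
    def __init__(self, name, duration_ms, intensity):
        self.name = name
        self.duration_ms = duration_ms
        self.intensity = intensity

def sequence(*effects):
    """Play effects back to back; total length is the sum of the parts."""
    return {"kind": "sequence",
            "children": list(effects),
            "duration_ms": sum(e.duration_ms for e in effects)}

# A "double tap" built from one reusable click effect.
click = HapticEffect("click", 40, 0.8)
double_tap = sequence(click, click)
```

The point of the compound structure is reuse: one well-tuned base effect feeds many larger effects, rather than each being authored from scratch.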
File Defined
Multiple levels for construction
File Format
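One way to picture "multiple levels for construction" in a file-defined haptic is a layered schema: atomic effects at the bottom, patterns that place effects on regions in time, and a sequence that names the whole thing. The snippet below is an entirely hypothetical sketch, not NullSpace VR's actual file format:

```json
{
  "sequence": "double_tap",
  "patterns": [
    { "pattern": "tap", "region": "chest", "start_ms": 0 },
    { "pattern": "tap", "region": "chest", "start_ms": 120 }
  ],
  "effects": {
    "tap": { "waveform": "click", "duration_ms": 40, "intensity": 0.8 }
  }
}
```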
Tools for Easier Haptics
Nodal Graph
The Future of Haptics
NullSpaceVR will be open for crowdfunding early 2017!
Haptics as a field is moving forward
VR integration is getting easier
Thank you!
Questions?
www.NullSpaceVR.com
Feeldelity - /fēlˈdelədē/
Feeldelity (n) - The accuracy or exactness with which a user can understand and interpret the sensations they are experiencing.
* Not an absolute measurement
* Behaves like attention & based on context