3DTactileDraw
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2019 |
| Platform (The OS or software framework needed to run the tool.) | Unknown |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | ACM CHI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Virtual Reality |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Bespoke |
| Device Names (The hardware supported by the tool. This may be incomplete.) | HapticHead |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Head |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Keyframe, Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | Unknown |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
3DTactileDraw supports control of a 24-actuator array on the HapticHead helmet. The tool provides two major interfaces: a paint interface that permits drawing desired effects directly on the surface of a 3D head model, and a curve interface that provides per-actuator intensity keyframes on a timeline.
For more information about 3DTactileDraw, consult the CHI 2019 paper.
Actronika EvalKit
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2020 |
| Platform (The OS or software framework needed to run the tool.) | Windows, macOS, Linux |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (GPL 3) |
| Venue (The venue(s) for publications.) | N/A |
| Intended Use Case (The primary purposes for which the tool was developed.) | Hardware Control |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Actronika Unitouch |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Audio |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Dataflow |
| Storage (How data is stored for import/export or internally to the software.) | WAV |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
The Actronika EvalKit includes a basic dataflow interface where preset, parametrized effects can be adjusted and their output directed to different actuators. Other modes support directly playing back an audio file and trying other, more complex effects on the included haptic module.
For more information, consult the Actronika website.
AdapTics
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2024 |
| Platform (The OS or software framework needed to run the tool.) | Unity, Rust |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (GPL-3.0 and MPL-2.0) |
| Venue (The venue(s) for publications.) | ACM CHI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Prototyping, Haptic Augmentation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Ultraleap STRATOS Explore |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Hand |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time, Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Keyframe, Demonstration, Track |
| Storage (How data is stored for import/export or internally to the software.) | Custom JSON |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API, WebSockets |
Additional Information
AdapTics is a toolkit for creating ultrasound tactons whose parameters change in response to other parameters or events. It consists of two components: the AdapTics Engine and the AdapTics Designer. The Designer, built on Unity, allows adaptive tactons to be created using elements commonly found in audio-video editors, and in adaptive audio editing in particular. Tactons can be created freely or in relation to a simulated hand. The Designer communicates with the Engine over WebSockets; the Engine is responsible for rendering on the connected hardware. While only Ultraleap devices are supported as of this writing, the Engine is designed to support future hardware. The Engine can also be used directly through API calls in Rust or C/C++.
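Because the Engine exposes WebSocket connectivity, clients other than the Designer can drive it as well. The sketch below shows only the transport wiring in Java (java.net.http, Java 11+); the port and JSON message are invented placeholders, not the documented AdapTics protocol.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;

// Minimal client sketch: connect to a running AdapTics Engine and send one
// message. The endpoint and message schema are illustrative placeholders.
public final class AdapticsClientSketch {
    public static void main(String[] args) {
        WebSocket ws = HttpClient.newHttpClient()
                .newWebSocketBuilder()
                .buildAsync(URI.create("ws://localhost:8080"), // hypothetical port
                        new WebSocket.Listener() {
                            @Override
                            public CompletionStage<?> onText(WebSocket webSocket,
                                    CharSequence data, boolean last) {
                                System.out.println("engine: " + data);
                                return WebSocket.Listener.super.onText(webSocket, data, last);
                            }
                        })
                .join();

        // Hypothetical JSON message adjusting one adaptive tacton parameter.
        ws.sendText("{\"type\":\"set_parameter\",\"name\":\"intensity\",\"value\":0.8}", true);
        ws.sendClose(WebSocket.NORMAL_CLOSURE, "done").join();
    }
}
```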
To learn more about AdapTics, read the CHI 2024 paper or consult the AdapTics Engine and AdapTics Designer GitHub repositories.
Android API
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2008 |
| Platform (The OS or software framework needed to run the tool.) | Android |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (Apache 2.0) |
| Venue (The venue(s) for publications.) | N/A |
| Intended Use Case (The primary purposes for which the tool was developed.) | Haptic Augmentation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Android |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | N/A |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Sequencing, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | N/A |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API |
Additional Information
The Android API consists of preset VibrationEffect assets and developer-added compositions of the “click” and “tick” effects. Waveforms can also be created by specifying a sequence of vibration durations, or durations with associated amplitudes. Audio-coupled effects can also be generated using the HapticGenerator. There are significant differences in hardware and software support across Android devices and OS versions, including for basic features such as amplitude control.
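As a concrete illustration, the sketch below uses the public APIs named above: VibrationEffect.createWaveform for duration/amplitude sequences, and the primitive composition API (API level 30+) for “click” and “tick” compositions. The legacy Context.VIBRATOR_SERVICE lookup is used for brevity, and amplitude support is checked at runtime since not all devices provide it.

```java
import android.content.Context;
import android.os.VibrationEffect;
import android.os.Vibrator;

public final class VibrationExamples {
    // Waveform: timings alternate off/on durations in ms, paired with 0-255 amplitudes.
    public static void playWaveform(Context context) {
        Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        long[] timings = {0, 100, 80, 100, 80, 200};
        int[] amplitudes = {0, 255, 0, 128, 0, 255};
        if (vibrator.hasAmplitudeControl()) {
            vibrator.vibrate(VibrationEffect.createWaveform(timings, amplitudes, -1)); // -1: no repeat
        } else {
            vibrator.vibrate(VibrationEffect.createWaveform(timings, -1)); // on/off only
        }
    }

    // Composition: chain the built-in "click" and "tick" primitives (API 30+).
    public static void playComposition(Context context) {
        Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        vibrator.vibrate(VibrationEffect.startComposition()
                .addPrimitive(VibrationEffect.Composition.PRIMITIVE_CLICK, 1.0f)
                .addPrimitive(VibrationEffect.Composition.PRIMITIVE_TICK, 0.6f, 50) // 50 ms delay
                .compose());
    }
}
```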
For more information, consult the Android API documentation and Android Open Source Project (AOSP).
ANISMA
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2022 |
| Platform (The OS or software framework needed to run the tool.) | Windows, macOS, Linux |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (GPL 3) |
| Venue (The venue(s) for publications.) | ACM ToCHI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Hardware Control |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Skin Stretch/Compression |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Class |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Shape-Memory Alloy |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Track, Keyframe, Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | STL File |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
ANISMA is a toolkit for prototyping wearable haptic devices built with shape-memory alloys (SMAs). Users can position SMA types already defined in the software between two nodes and simulate how networks of SMAs would behave on the skin during actuation. Nodes to which SMAs are attached can be simulated either as adhered to the wearer’s skin or as free-floating. Once the layout is complete, ANISMA can be used to fabricate the design and to play back patterns on the actual hardware.
For more information, please consult the 2022 ACM ToCHI paper and the main GitHub repository.
Apple Notifications
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2013 |
| Platform (The OS or software framework needed to run the tool.) | iOS |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Proprietary |
| Venue (The venue(s) for publications.) | N/A |
| Intended Use Case (The primary purposes for which the tool was developed.) | Notifications |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | iPhone |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
New vibration patterns for notifications can be added on supported devices through the settings menu. The user taps out the desired pattern on the touchscreen, plays the recorded pattern back, and saves it for later use.
For more information, consult the Apple Support page.
Beadbox
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2015 |
| Platform (The OS or software framework needed to run the tool.) | Windows |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (MIT) |
| Venue (The venue(s) for publications.) | UAHCI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Accessibility |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Class |
| Device Names (The hardware supported by the tool. This may be incomplete.) | EmotiChair, Voice Coil |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Sequencing |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Track, Keyframe |
| Storage (How data is stored for import/export or internally to the software.) | MIDI |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
Beadbox allows users to place and connect beads across different tracks, each track representing a physical actuator. Each bead provides a visual representation of vibration frequency and intensity. Duration can be controlled by forming a connection between two beads. If the actuator, duration, or frequency changes between connected beads, Beadbox interpolates between these values during playback.
For more information about Beadbox, consult the UAHCI 2016 paper and the GitHub repository.
bHaptics Designer
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2016 |
| Platform (The OS or software framework needed to run the tool.) | Web |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Proprietary |
| Venue (The venue(s) for publications.) | N/A |
| Intended Use Case (The primary purposes for which the tool was developed.) | Virtual Reality, Gaming |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | bHaptics Devices |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Torso, Arm, Head, Hand, Foot |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Sequencing, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Track, Keyframe, Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | Internal |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API |
Additional Information
The bHaptics Designer allows for tactile patterns and animations to be created on the various bHaptics products worn on the body. Points of vibration can be set to move across the grid of the selected device with intensity changing from waypoint to waypoint. Individual paths can be placed into tracks on a timeline to create layered, more complex effects.
For more information, consult the bHaptics Designer website.
Cha et al. Authoring
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2007 |
| Platform (The OS or software framework needed to run the tool.) | Windows |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | IEEE WHC |
| Intended Use Case (The primary purposes for which the tool was developed.) | Haptic Augmentation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile, Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Bespoke |
| Device Names (The hardware supported by the tool. This may be incomplete.) | 76-Tactor Glove, Phantom |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Hand |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Visual |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Sequencing |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Keyframe, Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | MPEG-4 BIFS |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
The authoring tool described by Cha et al. is meant to create interactions to be broadcast using MPEG-4 Binary Format for Scenes (BIFS). Haptic effects are represented through different “nodes” that support moving a force-feedback device along a trajectory, guiding a force-feedback device to a specific position, and triggering vibration on a tactile array. The tool itself supports recording motion on a force-feedback device for use in these nodes and provides an interface for creating and aligning vibration effects with pre-existing video files.
For more information, consult the WHC’07 paper.
CHAI3D
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2003 |
| Platform (The OS or software framework needed to run the tool.) | Windows, macOS, Linux |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (BSD 3-Clause) |
| Venue (The venue(s) for publications.) | EuroHaptics |
| Intended Use Case (The primary purposes for which the tool was developed.) | Simulation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | omega.x, delta.x, sigma.x, Phantom, Novint Falcon, Razer Hydra |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Visual, Audio |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | N/A |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | N/A |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API, Device Template |
Additional Information
CHAI3D is a C++ framework for 3D haptics. Users can initialize a scene, populate it with virtual objects, and set the properties of those objects using built-in haptic effects, such as “viscosity” and “magnet”. It uses OpenGL for graphics rendering and OpenAL for audio effects. CHAI3D can be extended to support additional haptic devices using the included device template.
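Since CHAI3D itself is C++, the fragment below is only a language-neutral sketch (written in Java for consistency with the other examples here) of what a built-in effect like “viscosity” computes on each haptic servo tick: a force opposing the tool’s velocity, F = −b·v. It illustrates the effect model, not CHAI3D’s actual API.

```java
// Illustrative sketch of a "viscosity" force effect: the rendered force
// opposes the haptic tool's velocity, scaled by a damping coefficient.
public final class ViscosityEffectSketch {
    private final double damping; // N*s/m, assumed material parameter

    public ViscosityEffectSketch(double damping) { this.damping = damping; }

    // Called once per haptic servo tick (typically ~1 kHz).
    public double[] computeForce(double[] toolVelocity) {
        return new double[] {
            -damping * toolVelocity[0],
            -damping * toolVelocity[1],
            -damping * toolVelocity[2],
        };
    }

    public static void main(String[] args) {
        ViscosityEffectSketch effect = new ViscosityEffectSketch(2.0);
        double[] f = effect.computeForce(new double[] {0.1, 0.0, -0.05});
        System.out.printf("force = (%.2f, %.2f, %.2f) N%n", f[0], f[1], f[2]);
    }
}
```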
For more information on CHAI3D, please consult the website, the documentation, and the EuroHaptics abstract.
Cobity
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2022 |
| Platform (The OS or software framework needed to run the tool.) | Unity |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (MIT) |
| Venue (The venue(s) for publications.) | Mensch und Computer |
| Intended Use Case (The primary purposes for which the tool was developed.) | Virtual Reality |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Kinova Gen3 |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Generic Menu |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
Cobity is a Unity plugin for controlling a cobot in VR. The robot’s end effector and position tracking parameters can be modified within the plugin. The end effector can be moved in response to user movements, e.g., by hand tracking, and can be bound to the position of a virtual object in the scene.
For more information on Cobity, please consult the MuC’22 paper or the GitHub repository.
Component-based Haptic Authoring Tool
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2015 |
| Platform (The OS or software framework needed to run the tool.) | Unity |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | GRAPP |
| Intended Use Case (The primary purposes for which the tool was developed.) | Haptic Augmentation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Novint Falcon |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Audio, Visual |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Generic Menu |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
The component-based haptic authoring tool extends Unity to allow for set haptic effects, such as magnetism or viscosity, to be added to elements in the scene. These elements may already have visual or audio properties. The aim of the tool is to decrease the difficulty of adding haptic interaction to an experience.
For more information, consult the 2015 Computer Graphics Theory and Applications paper.
Compressables Haptic Designer
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2021 |
| Platform (The OS or software framework needed to run the tool.) | Web |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (MIT) |
| Venue (The venue(s) for publications.) | ACM DIS |
| Intended Use Case (The primary purposes for which the tool was developed.) | Hardware Control |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Skin Stretch/Compression |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Bespoke |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Compressables |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Head, Hand, Arm, Leg, Torso |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time, Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | Internal |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
The Compressables Haptic Designer is a web app for controlling the Compressables family of pneumatic wearables. Motor power limits can be set through the app, and gestures in the user interface let the user control compression and decompression in real time. Time-based effects can also be created and triggered by certain physical gestures.
For more information on Compressables or the Compressables Haptic Designer, please consult the DIS’21 paper or the GitHub repository.
D-BOX SDK
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2008 |
| Platform (The OS or software framework needed to run the tool.) | Windows |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Proprietary |
| Venue (The venue(s) for publications.) | N/A |
| Intended Use Case (The primary purposes for which the tool was developed.) | Simulation, Gaming, Haptic Augmentation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | D-BOX |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | N/A |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | N/A |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API |
Additional Information
The D-BOX LiveMotion SDK is used to create motion effects on a D-BOX chair in response to events, such as those in a simulation or game. Telemetry information concerning the user’s avatar, vehicles, or surrounding environment must be sent when updates occur. These data are used by a custom-built D-BOX Motion System to create haptic effects with limited latency between events occurring in the virtual environment and a response being felt.
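The SDK’s actual types and calls are not reproduced here; the fragment below is a fully hypothetical Java sketch of the push-style telemetry pattern the description implies, with invented names.

```java
// Hypothetical sketch of event-driven telemetry: the game pushes state
// changes, and the motion system derives effects from the update stream.
// None of these names come from the actual D-BOX LiveMotion SDK.
interface MotionSystem {
    void updateVehicleTelemetry(double speedMps, double yawRate, double accelZ);
}

final class GameLoopSketch {
    private final MotionSystem motion;

    GameLoopSketch(MotionSystem motion) { this.motion = motion; }

    // Called whenever the simulation state changes, so the motion platform
    // can respond with minimal latency.
    void onPhysicsTick(double speedMps, double yawRate, double accelZ) {
        motion.updateVehicleTelemetry(speedMps, yawRate, accelZ);
    }
}
```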
For more information, consult the D-BOX website.
DIMPLE
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2007 |
| Platform (The OS or software framework needed to run the tool.) | Windows, macOS, Linux |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (GPL 2) |
| Venue (The venue(s) for publications.) | NIME, Interacting with Computers |
| Intended Use Case (The primary purposes for which the tool was developed.) | Simulation, Music |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | omega.x, delta.x, sigma.x, Phantom, Novint Falcon, Razer Hydra |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Visual, Audio |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | N/A |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | N/A |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | Open Sound Control |
Additional Information
DIMPLE is a framework that connects visual, audio, and haptic simulations of a scene using OSC, with haptic support provided via CHAI3D. DIMPLE allows scenes to be constructed by a client, such as Pure Data, over OSC, and creates corresponding graphical and haptic representations using CHAI3D, ODE, and GLUT. The user can then connect data from events in these scenes (e.g., an object’s motion) to the audio synthesis environment of their choice.
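Any OSC-capable client can stand in for Pure Data. The Java sketch below uses the JavaOSC library (com.illposed.osc); the /world/... addresses and port follow the general shape of DIMPLE’s scene-construction messages but should be verified against the DIMPLE documentation, and JavaOSC’s class locations and constructor signatures vary between releases.

```java
import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortOut;

import java.net.InetAddress;
import java.util.Arrays;

// Sketch of driving DIMPLE from Java: build a scene object over OSC.
// Addresses and port are assumptions to check against DIMPLE's docs.
public final class DimpleClientSketch {
    public static void main(String[] args) throws Exception {
        OSCPortOut out = new OSCPortOut(InetAddress.getLoopbackAddress(), 7770);

        // Create a sphere named "ball" at the origin, then set its radius.
        out.send(new OSCMessage("/world/sphere/create",
                Arrays.asList("ball", 0.0f, 0.0f, 0.0f)));
        out.send(new OSCMessage("/world/ball/radius", Arrays.asList(0.02f)));

        out.close();
    }
}
```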
For more information on DIMPLE, please consult the NIME’07 paper, the 2009 Interacting with Computers article, and the GitHub repository.
DOLPHIN
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2021 |
| Platform (The OS or software framework needed to run the tool.) | Qt |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (GPL 3) |
| Venue (The venue(s) for publications.) | ACM Symposium on Applied Perception |
| Intended Use Case (The primary purposes for which the tool was developed.) | Psychophysics |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Ultraleap STRATOS Explore |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Hand |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Generic Menu |
| Storage (How data is stored for import/export or internally to the software.) | CSV |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | Device Template |
Additional Information
DOLPHIN is a framework with a design tool for creating ultrasound mid-air tactile renderings for perceptual studies. Users can create new classes to represent the geometries of shapes and the sampling strategies used to display them. Parameters of the shape and sampling strategy can be modified in the tool with the help of pressure and position visualizations. DOLPHIN also includes an interface to PsychoPy to aid in studies. While the framework currently only supports the STRATOS, the software is written so that support for new devices can be added in the future. A “reader” module is also available that can be included in other software to play back the renderings designed in DOLPHIN.
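The extension points described above can be pictured as a pair of interfaces. This Java rendering is hypothetical (DOLPHIN’s own class names and implementation language differ); it only conveys the division of labor between shape geometry and sampling strategy.

```java
import java.util.List;

// Hypothetical rendering of DOLPHIN-style extension points: a shape exposes
// its geometry, and a sampling strategy decides how the ultrasound focal
// point traverses that geometry over time.
interface Shape {
    // Point on the shape outline for a parameter t in [0, 1).
    double[] pointAt(double t);
}

interface SamplingStrategy {
    // Produce the timed sequence of focal-point positions for one period.
    List<double[]> sample(Shape shape, double periodSeconds, double sampleRateHz);
}
```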
For more information, please consult the SAP’21 paper or the GitLab repository.
DrawOSC and Pattern Player
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2015 |
| Platform (The OS or software framework needed to run the tool.) | iPad |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | ICMC |
| Intended Use Case (The primary purposes for which the tool was developed.) | Music |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Bespoke |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Ilinx Garment |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Arm, Leg, Torso |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | Unknown |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | Open Sound Control |
Additional Information
The DrawOSC and Pattern Player tools were used to compose tactile effects with the eccentric rotating mass (ERM) motors present on the arms, legs, and torso of the Ilinx garment. DrawOSC provides a visual representation of the body and allows the user to draw vibration trajectories that are played on the garment and to adjust a global vibration intensity parameter. The Pattern Player tool additionally allows for intensity to be controlled independently over these trajectories.
For more information about DrawOSC and the Pattern Player, consult the ICMC 2015 paper titled “Composition Techniques for the Ilinx Vibrotactile Garment”.
Feel Messenger
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2015 |
| Platform (The OS or software framework needed to run the tool.) | Android |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | ACM CHI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Communication |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Android |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Sequencing, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Generic Menu |
| Storage (How data is stored for import/export or internally to the software.) | Unknown |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
Feel Messenger is a haptically augmented text messaging application structured around families of “feel effects”. Feel widgets or “feelgits” are types of effects that can be varied by a set of parameters, or “feelbits”. These parameters are made available to users of Feel Messenger through a set of menus. New effects can also be created by playing pre-existing effects one after the other.
For more information, consult the CHI’15 WIP paper.
FeelCraft
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2014 |
| Platform (The OS or software framework needed to run the tool.) | Java, Python |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | ACM UIST, AsiaHaptics |
| Intended Use Case (The primary purposes for which the tool was developed.) | Collaboration, Haptic Augmentation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Vybe Haptic Gaming Pad |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Library, Description |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Generic Menu |
| Storage (How data is stored for import/export or internally to the software.) | Custom JSON |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API |
Additional Information
FeelCraft is a technical architecture in which libraries of haptic effects (feel effects) are triggered by events in pre-existing media applications through the FeelCraft plugin. The implemented example connects to the event system of a Minecraft server. Families of feel effects are expressed in software and controllable through sets of parameters that are made available to users through a menu interface. In this example, as in-game events occur (e.g., it begins to rain), an associated feel effect with the selected parameters is displayed to the person playing the game.
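The event-to-effect mapping can be pictured with a small sketch. All names below are invented for illustration; FeelCraft’s actual JSON schema and plugin API are not reproduced here.

```java
import java.util.Map;

// Hypothetical sketch of the FeelCraft pattern: a library of parameterized
// feel effects keyed by application events.
final class FeelEffectLibrarySketch {
    record FeelEffect(String family, Map<String, Double> parameters) {}

    private final Map<String, FeelEffect> eventMap = Map.of(
            "rain_start", new FeelEffect("rain", Map.of("intensity", 0.4, "duration_s", 3.0)),
            "explosion",  new FeelEffect("pulse", Map.of("intensity", 1.0, "duration_s", 0.5)));

    // Called by the plugin when the game reports an event.
    void onGameEvent(String event) {
        FeelEffect effect = eventMap.get(event);
        if (effect != null) {
            System.out.println("play " + effect.family() + " " + effect.parameters());
        }
    }
}
```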
For more information on FeelCraft, consult the UIST’14 demo and the relevant chapter of Haptic Interaction.
Feelix
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2020 |
| Platform (The OS or software framework needed to run the tool.) | Windows, macOS |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (MIT) |
| Venue (The venue(s) for publications.) | ACM ICMI, ACM NordiCHI, ACM TEI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Prototyping |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Class |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Brushless Motors |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time, Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Sequencing, Library |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Track, Demonstration, Dataflow |
| Storage (How data is stored for import/export or internally to the software.) | Feelix Effect File |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | API |
Additional Information
Feelix supports the creation of effects on a 1 DoF motor through two main interfaces. The first allows for force-feedback effects to be sketched out over either motor position or time. For time-based effects, user-created and pre-existing effects can be sequenced in a timeline. The second interface provides a dataflow programming environment to directly control the connected motor. Parameters of these effects can be connected to different inputs to support real-time adjustment of the haptic interaction.
For more information, consult the 2020 ICMI paper, the NordiCHI’22 tutorial, the TEI’23 studio, and the Feelix documentation.
Feellustrator
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2023 |
| Platform (The OS or software framework needed to run the tool.) | Unknown |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | ACM CHI |
| Intended Use Case (The primary purposes for which the tool was developed.) | Hardware Control, Collaboration |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Consumer |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Ultraleap STRATOS Explore |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Hand |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Time, Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Target-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Audio, Visual |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | DPC, Process, Library, Sequencing |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Track, Keyframe, Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | Custom JSON, CSV |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
Feellustrator is a design tool for ultrasound mid-air haptics that allows users to sketch out paths for sensations to follow, control how the focal point moves along each path, and combine paths over time to create more complex experiences. Audio and visual reference materials can be loaded into the tool, but cannot be modified once added. Hand-tracking support is present so that effects can be played relative to the user’s hand rather than floating freely in space.
For more information, please consult the CHI’23 paper.
ForceHost
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2022 |
| Platform (The OS or software framework needed to run the tool.) | Web, Faust |
| Availability (If the tool can be obtained by the public.) | Available |
| License (The type of license applied to the tool.) | Open Source (GPL 3) |
| Venue (The venue(s) for publications.) | NIME |
| Intended Use Case (The primary purposes for which the tool was developed.) | Music |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Bespoke |
| Device Names (The hardware supported by the tool. This may be incomplete.) | TorqueTuner |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Audio |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Demonstration |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | Open Sound Control |
Additional Information
ForceHost is a toolchain for embedded physical modelling of audio-haptic effects for digital musical instruments. It primarily supports the TorqueTuner device, but can optionally be used with ESP32 boards supporting audio I/O, a 1-DoF servo, and network connectivity. Network connectivity is necessary so that it can be connected to other audio synthesis programs and so users can access the editor GUI through a web application. This application allows users to create haptic effects at runtime by sketching and manipulating curves representing transfer functions. Based on Faust, ForceHost is also supported by a fork of Synth-a-Modeler and can be controlled with a lower-level API called haptic1D.
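Conceptually, a sketched transfer function is just a sampled curve that the control loop looks up from the current knob angle. The Java fragment below illustrates that idea with plain linear interpolation; it is a generic sketch, not ForceHost code, which compiles Faust programs instead.

```java
// Generic "sketch a curve, render a torque" illustration: a transfer function
// sampled at N points is looked up (with linear interpolation) from the
// current knob angle on each control tick.
public final class TransferFunctionSketch {
    private final double[] torqueCurve; // torque samples over one revolution

    public TransferFunctionSketch(double[] torqueCurve) { this.torqueCurve = torqueCurve; }

    // angle in [0, 2*pi) -> interpolated torque
    public double torqueAt(double angle) {
        double pos = (angle / (2 * Math.PI)) * torqueCurve.length;
        int i = (int) Math.floor(pos) % torqueCurve.length;
        int j = (i + 1) % torqueCurve.length;
        double frac = pos - Math.floor(pos);
        return torqueCurve[i] * (1 - frac) + torqueCurve[j] * frac;
    }

    public static void main(String[] args) {
        // Four "detents" per revolution, sketched as a coarse alternating curve.
        TransferFunctionSketch tf =
                new TransferFunctionSketch(new double[] {0, 1, 0, -1, 0, 1, 0, -1});
        System.out.println(tf.torqueAt(Math.PI / 4)); // torque between samples 1 and 2
    }
}
```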
For more information, please consult the NIME’22 paper or visit the GitLab repositories.
GAN-based Material Tuning
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 2020 |
| Platform (The OS or software framework needed to run the tool.) | iPad |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | IEEE Access |
| Intended Use Case (The primary purposes for which the tool was developed.) | Prototyping |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Vibrotactile |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Class |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Voice Coil |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | Hand |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Location-aware |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | None |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | N/A |
| Storage (How data is stored for import/export or internally to the software.) | None |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
The tuning method uses a generative adversarial network (GAN) to produce vibrations corresponding to materials with properties in between those provided in the dataset used to train the GAN. The application described in the paper was intended to test this method of haptic tuning with users.
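The core idea can be sketched independently of the trained network: interpolate between the latent codes of two known materials and ask the generator for the corresponding vibration. The Generator interface below is a stand-in; the paper’s actual architecture and conditioning are not reproduced.

```java
// Stand-in for the trained GAN's generator; the real model maps a latent
// material code to a vibration waveform.
interface Generator {
    double[] vibrationFor(double[] latentCode);
}

final class MaterialInterpolationSketch {
    // Linear interpolation between two latent material codes, t in [0, 1].
    static double[] interpolate(double[] a, double[] b, double t) {
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            out[i] = (1 - t) * a[i] + t * b[i];
        }
        return out;
    }

    // Example: a vibration for a material "halfway" between two trained codes.
    static double[] blend(Generator gan, double[] codeA, double[] codeB) {
        return gan.vibrationFor(interpolate(codeA, codeB, 0.5));
    }
}
```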
For more information about this method, consult the 2020 IEEE Access paper.
GENESIS
Tool Summary
| General Purpose Information | |
| --- | --- |
| Year of First Release (The year a tool was first publicly released or discussed in an academic paper.) | 1995 |
| Platform (The OS or software framework needed to run the tool.) | Windows, macOS, Linux |
| Availability (If the tool can be obtained by the public.) | Unavailable |
| License (The type of license applied to the tool.) | Unknown |
| Venue (The venue(s) for publications.) | ICMC, Symposium on Computer Music Multidisciplinary Research |
| Intended Use Case (The primary purposes for which the tool was developed.) | Music, Simulation |

| Hardware Control Information | |
| --- | --- |
| Haptic Category (The general types of haptic output devices controlled by the tool.) | Force Feedback |
| Hardware Abstraction (How broad the type of hardware support is for a tool.) | Bespoke |
| Device Names (The hardware supported by the tool. This may be incomplete.) | Transducteur gestuel rétroactif |
| Body Position (Parts of the body where stimuli are felt, if the tool explicitly shows this.) | N/A |

| Interaction and Interface Information | |
| --- | --- |
| Driving Feature (If haptic content is controlled over time, by other actions, or both.) | Action |
| Effect Localization (How the desired location of stimuli is mapped to the device.) | Device-centric |
| Media Support (Support for non-haptic media in the workspace, even if just to aid in manual synchronization.) | Audio |
| Iterative Playback (If haptic effects can be played back from the tool to aid in the design process.) | Yes |
| Design Approaches (Broadly, the methods available to create a desired effect.) | Process |
| Interaction Metaphors (Common UI metaphors that define how a user interacts with a tool.) | Dataflow |
| Storage (How data is stored for import/export or internally to the software.) | Unknown |
| Connectivity (How the tool can be extended to support new data, devices, and software.) | None |
Additional Information
GENESIS is a physical modelling system that uses the CORDIS-ANIMA framework to create virtual musical instruments. Users construct simulations by creating networks of different modules in a one-dimensional space and adjusting their parameters, then interact with them via a connected haptic device. Sounds produced by the model are played back. While earlier versions of GENESIS required non-realtime simulation of instruments, multiple simulation engines are now supported, including some that run in real time (e.g., GENESIS-RT).
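To give a flavor of the underlying mass-interaction approach, the sketch below steps a single mass connected to a fixed point by a spring-damper "interaction" using semi-implicit Euler integration. It is a minimal illustration of the CORDIS-ANIMA style, not GENESIS code, and all parameter values are invented:

```c
#include <stdio.h>

int main(void) {
    const float dt = 1.0f / 44100.0f;   /* audio-rate simulation step */
    const float m = 0.001f;             /* mass (kg) */
    const float k = 2000.0f;            /* spring stiffness (N/m) */
    const float c = 0.05f;              /* damping coefficient */
    float x = 0.01f, v = 0.0f;          /* initial displacement, velocity */
    for (int n = 0; n < 10; n++) {
        float f = -k * x - c * v;       /* spring-damper interaction force */
        v += (f / m) * dt;              /* semi-implicit Euler on the mass */
        x += v * dt;
        printf("step %d: x = %.6f\n", n, x);
    }
    return 0;
}
```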
Information on GENESIS is included in numerous places, including the 1995 International Computer Music Conference (ICMC) paper, the 2002 ICMC paper, the 2009 ICMC paper, and the 2013 International Symposium on Computer Music Multidisciplinary Research paper.
H-Studio
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback, Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | IMU Sensor |
Additional Information
H-Studio is a tool meant to add haptic effects, primarily motion effects, to a pre-existing video file. Its primary interface provides a preview of the original audio-visual content, tracks of the different parameters that can be edited in H-Studio, and a visual preview of a selected motion effect. Data to drive a motion effect can be input from a force-feedback device directly or from another source (e.g., an IMU), and motion effects can be played back in the tool to aid in further refinement.
For more information, consult the UIST’13 poster.
H3D API
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2004 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 2)
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom, Novint Falcon, omega.x, delta.x, sigma.x |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
H3D API is a framework that lets users design haptic scenes using X3D. Virtual objects, effects in space, and haptic devices can be specified using X3D and added to the scene graph. Visual properties are rendered using OpenGL, and haptic properties are rendered using the included HAPI engine. Users also have the option of adding new features or creating more complex scenes by programming directly in C++ or Python.
For more information on H3D API, please consult the H3D API website and its documentation page.
HAMLAT
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | Blender |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | EuroHaptics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | HAML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HAMLAT is an extension to Blender that adds menu items to control the static physical properties of modeled objects. These properties can be felt in the environment itself using a force-feedback device, and they can be imported to and exported from HAMLAT using the Haptic Applications Meta Language (HAML).
For more information, consult the EuroHaptics 2008 paper.
hAPI
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2018 |
Platformⓘ The OS or software framework needed to run the tool. | Java, C#, Python |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3)
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation, Education |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haply 2DIY |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
hAPI is a low-level API for controlling the Haply 2DIY device. In its basic form, it handles communication between the host and the device, mapping between the positions and forces in the device’s workspace and the angles and torques used by the hardware itself. hAPI can also be combined with physics engines such as Fisica for convenience.
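The workspace-to-joint mapping such an API performs is conventionally done with the device Jacobian, computing joint torques as tau = J^T f. The sketch below illustrates that computation with placeholder Jacobian entries; it is not hAPI's actual implementation, and the numbers do not reflect the 2DIY's kinematics:

```c
#include <stdio.h>

int main(void) {
    /* Placeholder Jacobian at the current pose (illustrative values only). */
    float J[2][2] = {{0.05f, 0.02f},   /* dx/dtheta1, dx/dtheta2 */
                     {0.01f, 0.06f}};  /* dy/dtheta1, dy/dtheta2 */
    float f[2] = {0.0f, -1.5f};        /* desired end-effector force (N) */
    float tau[2];
    for (int i = 0; i < 2; i++)        /* tau_i = sum_j J[j][i] * f[j] */
        tau[i] = J[0][i] * f[0] + J[1][i] * f[1];
    printf("joint torques: %.4f, %.4f N*m\n", tau[0], tau[1]);
    return 0;
}
```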
For more information on hAPI, please consult the GitLab repository, the 2DIY development kit page, and the hAPI/Fisica repository.
Haptic Icon Prototyper
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2006 |
Platformⓘ The OS or software framework needed to run the tool. | Linux |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | 1 DoF Knob |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Haptic Icon Prototyper consists of a waveform editor used to adjust the magnitude of a force-feedback effect over position or time. Waveforms can be refined by manipulating keyframes on their visualizations until a desired result is achieved. These waveforms can then be combined in a timeline, either by superimposing them to create a new tile or by sequencing them. The combination of these features is meant to support the user in rapidly brainstorming new ideas.
For more information, consult the 2006 Haptics Symposium paper.
Haptic Studio (Immersion)
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Proprietary
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TouchSense Devices |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | WAV, Haptic Studio Files |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Haptic Studio supports the creation of several kinds of base effects, either from scratch or from a WAV file. These base effects can be MagSweep (a continuous vibration shaped by an attack-sustain-release or ASR envelope), Periodic (regular pulses contained in an ASR envelope), or Waveform (loaded statically from a WAV file). Base effects can be assigned to a timeline, where they can be linked to different actuators, arranged in time, and have their applicable parameters modified. Individual effects (“components”) and timelines can be played back, refined, and exported when complete.
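As a rough sketch of what a MagSweep-style effect computes (the function names, timings, and levels below are invented for illustration), a periodic carrier is shaped by an ASR envelope:

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265f

/* Piecewise-linear attack-sustain-release envelope, evaluated at time t. */
static float asr(float t, float attack, float sustain_end, float release_end) {
    if (t < attack) return t / attack;                      /* attack ramp */
    if (t < sustain_end) return 1.0f;                       /* sustain */
    if (t < release_end)                                    /* release ramp */
        return 1.0f - (t - sustain_end) / (release_end - sustain_end);
    return 0.0f;
}

int main(void) {
    const float f0 = 160.0f;                 /* carrier frequency in Hz */
    for (int n = 0; n < 8; n++) {
        float t = n * 0.05f;                 /* sample the effect every 50 ms */
        float s = asr(t, 0.05f, 0.25f, 0.35f) * sinf(2.0f * PI * f0 * t);
        printf("t=%.2fs amplitude=%+.3f\n", t, s);
    }
    return 0;
}
```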
For more information, consult the Immersion website.
HapticLib
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | STM32 |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (ISC)
Venueⓘ The venue(s) for publications. | ACM SAP |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | ERM |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
HapticLib is a small library for deploying haptic patterns on embedded devices. It is able to control multiple actuators at a time and provides an abstraction over low-level hardware control.
For more information about HapticLib, consult the 2013 Symposium on Applied Perception paper or the project website.
Hapticon Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Psychophysics |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | 1 DoF Knob |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | CSV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Hapticon Editor is intended to create haptic icons for 1 DoF force-feedback devices, either by directly recording a user’s motions on the device or by combining waveforms. While recordings only map motion over time, waveforms can express force as a function of position or time. Common waveforms, such as sine waves, can be generated automatically, and custom ones can be created by manipulating their constituent keyframes.
For more information, consult the 2003 Haptics Symposium paper.
HapticPilot
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2023 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | ACM IMWUT |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | HapticPilot Glove |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HapticPilot allows users to sketch haptic patterns on their hands in virtual reality. When patterns are played back, an algorithm that accounts for differences in hand posture is used so that patterns feel the same even as the user moves. A glove containing 12 actuators and 13 accelerometers is used to track hand movement and render recorded patterns.
For more information about HapticPilot, consult the 2023 Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies paper.
HaptiDesigner
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2022 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | UAHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | HaptiBoard |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Torso |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HaptiDesigner is a toolkit for creating vibrotactile patterns, or Haptograms, on multiple actuators. Each Haptogram is composed of multiple frames that each specify which actuators are activated in that frame, the intensity of vibration, the duration, and the pause between the end of that frame and the start of the next. These patterns are stored in a local database so they can be modified and reused.
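A minimal sketch of such a frame sequence is shown below; the struct layout and field names are invented for illustration and do not reflect HaptiDesigner's actual schema:

```c
#include <stdint.h>
#include <stdio.h>

/* One Haptogram-like frame: active actuators, intensity, timing. */
typedef struct {
    uint32_t actuator_mask;  /* bit i set => actuator i vibrates this frame */
    uint8_t  intensity;      /* 0-255 drive level */
    uint16_t duration_ms;    /* active time */
    uint16_t pause_ms;       /* gap before the next frame */
} Frame;

int main(void) {
    Frame pattern[] = {
        {0x0003u, 200, 150, 50},  /* actuators 0 and 1 */
        {0x000Cu, 120, 300, 0},   /* actuators 2 and 3 */
    };
    for (int i = 0; i < 2; i++)
        printf("frame %d: mask=0x%04x intensity=%u for %u ms, pause %u ms\n",
               i, (unsigned)pattern[i].actuator_mask,
               (unsigned)pattern[i].intensity,
               (unsigned)pattern[i].duration_ms,
               (unsigned)pattern[i].pause_ms);
    return 0;
}
```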
For more information, please consult the 2022 Universal Access in Human-Computer Interaction paper or the GitHub repository.
Hassan et al. Texture Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | IEEE TIE |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Hassan, Abdulali, and Jeon created an affective authoring space for textures based on 25 models of real materials. These were positioned in an affective space defined by two axes, hard-soft and rough-smooth. New models can be generated by interpolating between the original data-driven models and played back on a voice-coil actuator, such as the Haptuator Mark II.
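One simple way to realize such interpolation is to derive blend weights for the surrounding models from their distances to the point the designer selects in the affective space. The sketch below uses inverse-distance weighting with invented coordinates; the paper's exact scheme may differ:

```c
#include <math.h>
#include <stdio.h>

/* A texture model placed in the 2D affective space (illustrative values). */
typedef struct { const char *name; float hard, rough; } Model;

int main(void) {
    Model models[] = {{"glass", 0.9f, 0.1f}, {"wood", 0.6f, 0.5f},
                      {"fabric", 0.2f, 0.7f}};
    float qh = 0.5f, qr = 0.4f;  /* selected point: hardness, roughness */
    float w[3], sum = 0.0f;
    for (int i = 0; i < 3; i++) {
        float dh = models[i].hard - qh, dr = models[i].rough - qr;
        w[i] = 1.0f / (sqrtf(dh * dh + dr * dr) + 1e-6f); /* closer => heavier */
        sum += w[i];
    }
    for (int i = 0; i < 3; i++)
        printf("%s weight: %.3f\n", models[i].name, w[i] / sum);
    return 0;
}
```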
For more information on this method, please consult the 2020 article in IEEE Trans. on Industrial Electronics.
HFX Studio
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2018 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | ACM VRST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback, Vibrotactile, Temperature |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Thalmic Myo, Subpack M2, Oculus Touch, Dyson Pure Cool Link |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Head, Torso, Arm, Leg, Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HFX Studio allows for authoring haptic effects directly on the body or attaching them to objects in a VR environment. Perceptual models are used to encode and render the desired effects to the extent supported by the connected hardware. This intermediate perceptual layer is intended to separate the design of haptic effects from the devices used to display them.
For more information, consult the VRST’18 paper.
HITPROTO
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2010 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium, Computers & Graphics
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Accessibility |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HITPROTO uses a visual programming interface to let users specify the content and behavior of an interactive haptic scene. Basic programming functionality, such as loops and conditional logic, is included in the environment. Basic haptic interactions (e.g., spring effects, guidance along a path) are provided to aid a user in creating haptic data visualizations.
For more information, consult the 2010 Haptics Symposium paper and the 2013 Computers & Graphics paper.
Hong et al. Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | ACM TEI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haptuator |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | No |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This authoring tool supports the creation of vibration patterns through finger tapping on a touchscreen. The duration of a touch is mapped to the duration of a vibration, while touch area, which is expected to be proportional to pressure, is mapped to intensity. A visualization of the vibrotactile pattern is shown to the user during use, but the pattern must be transferred to a separate computer to drive an attached actuator.
For more information, consult the TEI’13 paper.
Interhaptics Haptic Composer
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Proprietary
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Gaming |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback, Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | iPhone, Android, PlayStation DualSense, Razer Kraken, Xinput Controllers, Meta Quest, OpenXR Devices |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON, WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Interhaptics Haptic Composer focuses on the creation of different materials for VR. Vibration can be added to a material by adding different regular waveforms and constants together. The “texture” menu functions in the same way, except that the resulting waveform is rendered statically over position to create bumps and changes in elevation. The “stiffness” menu determines the amount of force returned given the amount of displacement into the material. A fourth “thermal” option exists, but cannot be modified at this time.
For more information, consult the Interhaptics website.
Lofelt Studio
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Proprietary
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation, Collaboration |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | iPhone, Android |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | WAV, Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Lofelt Studio automatically creates an initial vibrotactile experience from an audio file loaded into the tool. This can be refined through controls in the editor, such as menus for global parameters and keyframes for editing the waveforms directly. The same editing processes can be used to create effects from scratch. Vibrotactile effects can be sent to iOS and Android devices with the corresponding Lofelt Studio app installed.
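One common way to derive an initial vibrotactile track from audio is to rectify the signal and smooth it into an amplitude envelope. Lofelt's actual analysis is proprietary; the sketch below only illustrates the general idea, with an invented stand-in input:

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    const float fs = 1000.0f;     /* sample rate of the stand-in audio */
    const float alpha = 0.05f;    /* one-pole low-pass smoothing coefficient */
    float env = 0.0f;
    for (int n = 0; n < 12; n++) {
        float t = n / fs;
        float audio = sinf(2.0f * 3.14159265f * 80.0f * t); /* stand-in input */
        env += alpha * (fabsf(audio) - env);  /* rectified, then smoothed */
        printf("n=%2d envelope=%.3f\n", n, env);
    }
    return 0;
}
```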
For more information, consult the Lofelt website as archived on the Wayback Machine.
Macaron
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON, WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Macaron uses web audio to drive an actuator connected to the user’s computer. A library of vibrotactile effects is available for playback, visualized as waveforms expressing amplitude and frequency over time. These presets can be loaded into the editor pane, where keyframes can be added to the waveforms and modified. Alternatively, effects can be created from scratch without using a preset. Preset and custom effects can be played back over audio output to support an iterative design process.
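A minimal sketch of how keyframed amplitude and frequency curves can be rendered into a waveform is shown below; the keyframe values are invented, and a phase accumulator keeps the output continuous as the frequency changes. This illustrates the general technique, not Macaron's actual code:

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265f

int main(void) {
    const int fs = 1000;                       /* samples per second */
    const float amp0 = 0.2f, amp1 = 1.0f;      /* amplitude keyframes at 0 s, 1 s */
    const float freq0 = 60.0f, freq1 = 250.0f; /* frequency keyframes at 0 s, 1 s */
    float phase = 0.0f;
    for (int n = 0; n < fs; n++) {             /* render a 1 s clip */
        float u = (float)n / fs;               /* normalized time, 0..1 */
        float amp = amp0 + u * (amp1 - amp0);  /* linear keyframe interpolation */
        float freq = freq0 + u * (freq1 - freq0);
        phase += 2.0f * PI * freq / fs;        /* accumulate: no phase jumps */
        if (n % 125 == 0)
            printf("t=%.3fs amp=%.2f freq=%3.0fHz sample=%+.3f\n",
                   u, amp, freq, amp * sinf(phase));
    }
    return 0;
}
```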
For more information about Macaron, consult the Haptics Symposium paper and the GitHub repository.
Mango
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | C-2 Tactor |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Mango is a graphical tool for creating effects on vibrotactile arrays. A visualization of the layout of the actuators in the array is shown in the editor. Users can create “animation objects” with different positions and intensities and define paths for the motion of these objects over time. Parameters can also be adjusted over time through keyframes. A rendering algorithm transforms these tactile animations into actuator signals so that the authored vibrotactile experience is perceived by the user.
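A common model for placing a virtual tactile point between two physical actuators is the energy-preserving phantom sensation, which splits the point's intensity across its neighbors. The sketch below illustrates that model with invented values; it is not Mango's actual rendering code:

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    const float A = 1.0f;  /* intensity of the virtual animation object */
    /* t sweeps the virtual point from the left actuator (0) to the right (1). */
    for (float t = 0.0f; t <= 1.001f; t += 0.25f) {
        float a1 = A * sqrtf(1.0f - t);  /* energy-preserving split */
        float a2 = A * sqrtf(t);
        printf("position %.2f -> left %.3f, right %.3f\n", t, a1, a2);
    }
    return 0;
}
```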
For more information, consult the UIST’15 paper.
MHaptic
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2007 |
Platformⓘ The OS or software framework needed to run the tool. | C++ |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | IEEE CW |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haptic Workstation |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | No |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
MHaptic and its associated Haptic Scene Creator allow 3D models to be augmented for virtual reality so they can be interacted with using a bimanual force-feedback device. The underlying MHaptic library can be used directly, or the Creator can be used to load 3D models, generate haptic geometries for them, and manually adjust the resulting elements.
For more information about MHaptic and the Haptic Scene Creator, consult the 2007 International Conference on Cyberworlds paper.
mHIVE
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (BSD 3-Clause)
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Collaboration, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
mHIVE is a haptic instrument in which the user interacts with a tablet’s touchscreen to create vibrotactile output over audio. Most of the screen is devoted to a region where touch position along the x-axis is mapped to frequency and position along the y-axis to amplitude. Sine, square, sawtooth, and triangle waveforms can be selected in a menu below this. An attack-decay-sustain-release (ADSR) envelope can be modified by dragging the points of its visualization. Effects created with mHIVE can be recorded and played back later.
For more information, consult the 2014 Haptics Symposium paper and the GitHub repository.
miPhysics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2019 |
Platformⓘ The OS or software framework needed to run the tool. | Processing |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3)
Venueⓘ The venue(s) for publications. | HAID |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music, Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haply 2DIY |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
miPhysics is a physical modelling framework built on Processing. Users can specify a mass-interaction system in code and interact with it using a Haply 2DIY or another hAPI-compatible device. The avatar representing the end effector and the physical elements of the simulation are visualized in the sketch window, and sound synthesis is possible through Minim.
Example code is available on the miPhysics GitHub repository and more information on it is available in the HAID 2019 paper.
Multisensory 360° Video Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2018 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | WorldCIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation, Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Airflow system, Buttkicker LFE |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual, Olfactory |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The multisensory 360° video editor allows 360° video files to be loaded and augmented with additional sounds and with new haptic and olfactory content synchronized to specific times in the video. Low-frequency vibration and airflow from different directions around the viewer can be controlled using the application.
For more information about the editor, consult the 2018 World Conference on Information Systems and Technologies paper.
Neosensory SDK
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (Apache 2.0)
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Neosensory Buzz |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Neosensory SDK allows control of the four motors embedded in the Neosensory Buzz wristband. Vibrations can be triggered either by setting the desired amplitude of each motor directly, which persists until an update is sent, or by using another class to trigger a single point of vibration interpolated between the physical actuators.
For more information, consult the Neosensory SDK documentation.
OM Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown
Venueⓘ The venue(s) for publications. | ACM MobileHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | OM Wearables |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The OM Editor is an app for controlling OM Wearables: forearm sleeves, each containing three ERM motors. Users can create sequences of up to six vibration patterns of equal duration, selecting the vibration intensity and which actuators vibrate. These sequences can be annotated and stored.
For more information on the OM Editor or Wearables, please consult the MobileHCI’21 paper.
Pebble/Rebble
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2012 |
Platformⓘ The OS or software framework needed to run the tool. | Pebble |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Pebble |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Pebble smartwatches contain vibration motors that can be controlled through the C API, now maintained by the Rebble project. On-off vibration patterns are programmed through the Vibes API by specifying the duration of each on or off interval.
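For example, a custom pattern can be enqueued like so (standard Vibes API usage; the durations alternate on/off in milliseconds):
```
// A custom on/off vibration pattern using the Pebble Vibes API.
#include <pebble.h>

static const uint32_t segments[] = {200, 100, 400}; // on 200 ms, off 100 ms, on 400 ms

static void buzz(void) {
  VibePattern pattern;
  pattern.durations = segments;
  pattern.num_segments = 3;
  vibes_enqueue_custom_pattern(pattern);
}
```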
For more information, consult the Rebble project site.
Penn Haptic Texture Toolkit
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | HDAPI |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Non-Commercial Research Use Only |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Penn Haptic Texture Toolkit consists of texture and friction models recorded from 100 surfaces. The data used to create the models and sample code for displaying the textures on the Phantom Omni are included as part of the toolkit.
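Rendering on the Phantom goes through OpenHaptics’ HDAPI (the platform listed above). The skeleton below is generic HDAPI servo-loop boilerplate, not the toolkit’s own rendering code; a recorded texture or friction model would be evaluated where the output force is computed each servo tick:
```
#include <HD/hd.h>  // OpenHaptics HDAPI

// Generic HDAPI servo-loop skeleton (illustrative, not from the toolkit).
HDCallbackCode HDCALLBACK servoLoop(void* /*userData*/) {
    hdBeginFrame(hdGetCurrentDevice());

    HDdouble position[3];
    hdGetDoublev(HD_CURRENT_POSITION, position);

    HDdouble force[3] = {0.0, 0.0, 0.0};
    // ... evaluate a recorded texture/friction model here to fill `force` ...
    hdSetDoublev(HD_CURRENT_FORCE, force);

    hdEndFrame(hdGetCurrentDevice());
    return HD_CALLBACK_CONTINUE;
}
```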
For more information, consult the 2014 Haptics Symposium paper.
PhysVib
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ToH |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
PhysVib is an extension to the Android-based AndEngine physics engine. Certain objects in the engine are treated as being manipulated by the user (in the “haptic camera”), and collisions involving these objects generate vibrotactile feedback. This feedback is passed through audio output to an actuator attached to the Android device.
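PhysVib’s exact synthesis model is described in the paper; as a rough illustration of the general approach of turning a collision event into an audio-rate vibration, the sketch below maps an impulse to an exponentially decaying sinusoid (the carrier frequency and decay constant here are arbitrary assumptions):
```
#include <algorithm>
#include <cmath>
#include <vector>

// Illustrative only: render a collision impulse as an exponentially
// decaying sinusoid in an audio buffer that drives the actuator.
std::vector<float> collision_to_vibration(double impulse) {
    const double kPi = 3.14159265358979323846;
    const double freq_hz = 150.0;  // assumed carrier frequency
    const double decay = 30.0;     // assumed decay rate, 1/s
    const double sr = 48000.0;     // audio sample rate
    const double length_s = 0.2;

    double amp = std::min(1.0, impulse);  // crude impulse-to-amplitude mapping
    std::vector<float> buf(static_cast<size_t>(sr * length_s));
    for (size_t n = 0; n < buf.size(); ++n) {
        double t = n / sr;
        buf[n] = static_cast<float>(amp * std::exp(-decay * t)
                                        * std::sin(2.0 * kPi * freq_hz * t));
    }
    return buf;
}
```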
For more information, consult the 2016 Transactions on Haptics paper and the GitHub repository.
posVibEditor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE HAVE, IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | ERM |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The posVibEditor supports the creation of vibration patterns across multiple ERM motors. Vibration assets are created by manipulating keyframes in a visualization of vibration intensity over time. These assets, or provided templates, can be copied into a track interface to assign each one to a motor and a playback time. A “perceptually transparent rendering” mode adjusts the mapping of asset amplitude values to output voltages so that the authored effect is felt as intended.
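The paper describes the actual calibration; the gist of such a mapping can be sketched as inverting a measured perceived-magnitude-versus-voltage curve (the calibration numbers below are invented):
```
#include <algorithm>
#include <array>
#include <cstddef>

// Sketch of the idea only (the published calibration differs): invert a
// measured perceived-magnitude-vs-voltage curve so an authored amplitude in
// [0, 1] maps to the voltage producing that perceived intensity.
constexpr std::array<double, 5> kVolts     = {0.0, 1.0, 2.0, 3.0, 4.0};    // invented
constexpr std::array<double, 5> kPerceived = {0.0, 0.10, 0.35, 0.70, 1.0}; // invented

double voltage_for_amplitude(double amplitude01) {
    amplitude01 = std::clamp(amplitude01, 0.0, 1.0);
    for (std::size_t i = 1; i < kPerceived.size(); ++i) {
        if (amplitude01 <= kPerceived[i]) {
            // Linear interpolation within the surrounding calibration segment.
            double t = (amplitude01 - kPerceived[i - 1])
                     / (kPerceived[i] - kPerceived[i - 1]);
            return kVolts[i - 1] + t * (kVolts[i] - kVolts[i - 1]);
        }
    }
    return kVolts.back();
}
```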
For more information, consult the 2008 Workshop on Haptic Audio Visual Environments and Games paper and the 2009 WHC paper.
Printgets
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Pure Data, Raspberry Pi |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (LGPL 3) |
Venueⓘ The venue(s) for publications. | HAID |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Electroactive Polymer |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Printgets is a library for developing printed vibrotactile widgets driven by piezoelectric actuators and triggered by capacitive input devices. Input and output devices are connected to a computer, such as a Raspberry Pi, and the parameters mapping inputs to outputs can be adjusted in Purr Data, a fork of Pure Data, to support the development of tactile widgets.
For more information on Printgets, please consult the HAID 2020 paper or the GitLab repository.
SeeingHaptics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2019 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM MobileHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile, Temperature |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | None |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | N/A |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
SeeingHaptics allows users to attach haptic feedback of various types to virtual objects. These interactions are represented in the VR environment using visual icons. “Haptic listeners” can be attached to other devices, such as VR controllers, so that users experience the appropriate effect when near an object with an associated haptic effect. No output devices are supported out of the box; SeeingHaptics is instead intended to aid in planning a VR haptic experience.
For more information, consult the MobileHCI’19 paper.
Skinscape
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2001 |
Platformⓘ The OS or software framework needed to run the tool. | Pro Tools and Max/MSP |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | V1220 Transducer, Aura Systems Interactor Cushion |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm, Torso |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | AIFF |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Skinscape is meant to be a haptic equivalent to a MIDI sequencer. Keyboard inputs are recorded in Max/MSP and mapped to seven haptic actuators, six located on the arms and one on the lower back. The sequence is then exported and loaded into a traditional audio editing environment (e.g., Pro Tools) so that it can be combined with existing music.
For more information, consult Eric Gunther’s master’s thesis.
Syntacts (Standalone)
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | IEEE ToH |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | WAV, AIFF, CSV, Syntacts Signal File |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Syntacts is an API and graphical tool for controlling audio-driven vibrotactile arrays. Vibrotactile “signals” can be created from common waveforms and combined using operations such as sequencing and multiplication; complex envelopes can also be applied to them. The Syntacts GUI includes a spatializer for mapping signals to the desired virtual location on an array and a track-based sequencer to aid in performing these signal operations.
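Following the usage patterns shown in the Syntacts documentation (check the repository for the current API), building and playing such a signal from the C++ API looks approximately like this:
```
#include <syntacts>
#include <chrono>
#include <thread>
using namespace tact;

int main() {
    Session session;  // audio session driving the amplifier/array
    session.open();   // open the default audio output device

    // A 250 Hz sine carrier, amplitude-modulated by a 10 Hz square wave and
    // shaped by an attack-sustain-release envelope (times in seconds).
    Signal sig = Square(10) * Sine(250) * ASR(0.1, 0.3, 0.1);

    session.play(0, sig);  // render the signal on channel 0
    std::this_thread::sleep_for(std::chrono::duration<double>(sig.length()));
}
```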
For more information, consult the 2020 Transactions on Haptics paper and the GitHub repository.
Synth-A-Modeler Designer
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2012 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 2) |
Venueⓘ The venue(s) for publications. | Linux Audio Conference, EuroHaptics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation, Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon, FireFader |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | No |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | Faust DSP File, SAM Model File |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Synth-A-Modeler Designer is a tool for physical-modeling sound synthesis that can use force-feedback devices as inputs. Models are constructed graphically from mass-interaction and waveguide elements. A special “port” object provides one DoF of input through a supported haptic device, allowing physical control over the model. When a model is complete, the Designer can export it to a Faust DSP file that can be compiled to run on various targets, such as the Web, mobile devices, and desktop computers.
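As a toy illustration of the mass-interaction idea (not Synth-A-Modeler’s generated code, and with arbitrary constants), a single mass tied to ground by a linear spring-damper can be stepped at audio rate like so:
```
#include <cstdio>

// Toy mass-interaction model: one mass connected to ground by a
// spring-damper, stepped with semi-implicit Euler integration.
int main() {
    const double dt = 1.0 / 44100.0;               // audio-rate physics step
    const double m = 0.001, k = 2000.0, c = 0.05;  // mass, stiffness, damping
    double x = 0.001, v = 0.0;                     // initial displacement, velocity

    for (int n = 0; n < 10; ++n) {
        double f = -k * x - c * v;  // spring-damper force toward rest
        v += (f / m) * dt;
        x += v * dt;
        std::printf("%d: x = %+.6f\n", n, x);
    }
    return 0;
}
```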
For more information, consult the 2012 Linux Audio Conference paper, the 2016 EuroHaptics paper, and both the GitHub repository for the designer and the one for the compiler.
Tactile Brush
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2011 |
Platformⓘ The OS or software framework needed to run the tool. | Pure Data |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | C-2 Tactor |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Custom UDP |
Additional Information
The Tactile Brush is an algorithm for creating tactile animations on discrete vibrotactile arrays. It allows stationary or moving tactile objects to be placed within the limits of a tactile array: their motion paths and vibration intensities are used to calculate which actuators should be triggered to produce the intended effect, the intensity at which each should vibrate, and the onset time and duration of each vibration. A version of the algorithm was implemented as a Pure Data application.
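Two of the paper’s building blocks can be stated directly: the energy model for placing a phantom actuator between two physical ones, and the apparent-motion rule tying stimulus onset asynchrony (SOA) to duration. A sketch, with the constants as given in the CHI’11 paper to the best of this summary’s reading:
```
#include <cmath>

// Phantom-actuator energy model: a virtual actuator at normalized position
// beta (0..1) between two physical actuators, with desired virtual
// intensity Av, is rendered by splitting energy across the pair.
void phantom(double beta, double Av, double& A1, double& A2) {
    A1 = std::sqrt(1.0 - beta) * Av;  // first physical actuator
    A2 = std::sqrt(beta) * Av;        // second physical actuator
}

// Apparent motion: stimulus onset asynchrony (ms) between successive
// actuators as a function of each vibration's duration d (ms).
double soa_ms(double d) { return 0.32 * d + 47.3; }
```
Note that the square roots conserve energy: A1² + A2² = Av² regardless of the phantom’s position.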
For more information, consult the CHI’11 paper.
Tactile Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | macOS |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Make Controller |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
Tactile Editor is an application that allows users to create vibration patterns for motors. “Motor objects” represent the basic unit of these patterns and include parameters for physical motor assignment, start time, duration, and intensity. These objects can be assigned to different tracks to allow different vibrations to be layered together. Patterns can be tested by playing them back on a connected device. Sensor values input to the Editor can be used to trigger the start of different patterns.
For more information, consult Markus Jonas’s master’s thesis.
Tactile Glove Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2010 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE MultiMedia |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Tactile Glove |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | MPEG-4 BIFS |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This authoring tool allows the design of video-synchronized tactile effects rendered by the actuators along the fingers and palm of the tactile glove. Frames of the video are shown in the GUI, and tactile lines can be drawn across them that are mapped to the array on the glove. These tactile frames are grayscale, with pixel intensity mapped to vibration intensity. Previews of the surrounding video content are shown to aid in planning and synchronizing the effects to the existing audio-visual media.
For more information, consult the 2010 IEEE MultiMedia paper.
TactiPEd
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | INTERACT |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Generic VT |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
Users of TactiPEd first create a template file specifying the hardware device in use and its layout of actuators. This layout is then preserved in the main editor interface, where sequences of vibrotactile patterns can be created and assigned to different actuators. A playback mode allows users to quickly feel the changes made to amplitude, frequency, and timing in the tool.
For more information, consult the 2013 IFIP Conference on Human-Computer Interaction paper.
TactJam
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Electron |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT, CC-BY-4.0) |
Venueⓘ The venue(s) for publications. | ACM TEI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Collaboration, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TactJam Hardware |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Head, Arm, Hand, Torso, Leg, Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
TactJam consists of a hardware component, a client application, and a server. The hardware includes eight ERM motors connected to a board with a push button corresponding to each motor. In the client, users create effects by arranging dots representing the motors on a 3D model of a human. With the hardware connected over USB, patterns can be recorded into the client as they are played on the device itself. When a pattern is ready, a user can upload it to the TactJam server so that others may download and reuse it.
For more information about TactJam, consult the TEI’21 abstract, the TEI’22 paper, and the main GitHub repository.
TactTongue
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2023 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Accessibility, Virtual Reality, Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Electrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TactTongue |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Tongue |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Library, Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
TactTongue is an app and prototyping kit to explore electrotactile stimulation on the tongue. The parameters of the signal on each electrode can be directly controlled, or presets can be used as the basis for a design. A visualization of the resulting sensation on the tongue is shown in the application, and the pattern itself can be played on a connected TactTongue device.
For more information about TactTongue, consult the 2023 UIST paper and the GitHub repository.
TECHTILE Toolkit
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2012 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM VRIC, ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Education |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TECHTILE |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The TECHTILE Toolkit consists of a vibrotactile recorder and actuator that allows users to record real-world vibrations and then play them back. This is meant to permit the sharing of effects by users without strong technical backgrounds, including elementary school children.
For more information, consult the VRIC’12 article and the TECHTILE Toolkit website. The TECHTILE Toolkit was also used in the CHI’21 Extended Abstract.
TorqueTuner
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | libmapper |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | NIME |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control, Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TorqueTuner |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
TorqueTuner is a 1-DoF haptic module that can be used as a standalone device or connected to a digital musical instrument such as the T-Stick. It contains a set of embedded effects that can be modified through OSC messages routed through libmapper. TorqueTuner also sends information about its own state back over OSC, where it can serve as input to another process running elsewhere.
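As an illustration of driving such an OSC-controlled module, the sketch below uses liblo, a common C OSC library; the address, port, and parameter paths here are hypothetical placeholders, not TorqueTuner’s actual namespace (which is defined in its repository):
```
#include <lo/lo.h>  // liblo, a common OSC library

int main() {
    // Hypothetical address/port and OSC paths; TorqueTuner's real namespace
    // is defined by its firmware and libmapper mappings.
    lo_address dev = lo_address_new("192.168.4.1", "9000");
    lo_send(dev, "/torquetuner/effect", "i", 2);        // pick an embedded effect
    lo_send(dev, "/torquetuner/stiffness", "f", 0.8f);  // scale an effect parameter
    lo_address_free(dev);
    return 0;
}
```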
For more information on the TorqueTuner hardware or software environment, please consult the NIME’20 paper or the GitHub repository. The photograph of the TorqueTuner by M. Kirkegaard, M. Bredholt, C. Frisson, and M.M. Wanderley is licensed under CC BY 4.0.
TouchCon Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2009 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ICACT |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile, Temperature |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TouchCon Device |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
The TouchCon Editor is meant to design tactile effects for use in instant messaging. Effects can be created using multiple supported output devices, each described to the system using an XML file, by arranging individual sensations along a timeline. These can be sent to other users of TouchCon if they have the necessary hardware to display the effect.
For more information, consult the ICACT’09 paper.
Ubitile
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM NordiCHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ubitile |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Users of Ubitile wear a vibrotactile actuator and gyroscope on the index finger. Vibration patterns are created by moving this finger between three points: A, B, and C. The pitch angle between positions B and A controls intensity, the time spent travelling between positions A and B controls duration, and the time spent between positions B and C controls the gap between vibration units. Patterns are recorded and can be played back as desired.
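The gesture-to-parameter mapping can be sketched directly; the scaling constants below are illustrative, not taken from the paper:
```
#include <algorithm>

// Sketch of the Ubitile gesture mapping (constants are assumptions).
struct VibrationUnit {
    double intensity01;  // from the pitch angle between B and A
    double duration_s;   // from the A-to-B travel time
    double gap_s;        // from the B-to-C travel time
};

VibrationUnit map_gesture(double pitch_deg, double t_ab_s, double t_bc_s) {
    VibrationUnit u;
    u.intensity01 = std::clamp(pitch_deg / 90.0, 0.0, 1.0);  // assumes a 0-90° range
    u.duration_s = t_ab_s;
    u.gap_s = t_bc_s;
    return u;
}
```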
For more information, consult the 2016 Nordic Conference on Human-Computer Interaction paper.
VibEd
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | iConference |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Gaming, Prototyping, Accessibility |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Android, iPhone, Xbox |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
VibEd is designed to create vibrotactile feedback for games. Its waveform editing interface uses keyframes to control the duration and intensity of vibration patterns to be displayed on the selected output device. If a playback application is installed on the desired platform, the editor can send authored patterns to it at runtime.
For more information, consult the iConference 2016 paper.
vibrAteRial
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | NodeJS |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | bARefoot |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
VibrAteRial is designed to create underfoot material effects in VR for the bARefoot shoe system. Each bARefoot contains a pressure sensor and vibrotactile actuators. Using the authoring tool, designers create virtual materials by controlling vibration grains that trigger as the bARefoot wearer steps down on the shoe. Users can control the distribution of these grains as a function of pressure, and the frequency and amplitude of each grain. These can be sent to a bARefoot so that the material can be tested and refined.
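As a conceptual sketch of pressure-driven grain triggering (invented names and constants, not the tool’s actual code), assuming the grain firing rate scales linearly with normalized pressure:
```
#include <random>

// Each grain is a short burst with its own frequency and amplitude; the
// probability that a grain fires in a control tick rises with pressure,
// so harder steps yield denser grain patterns.
struct Grain { double freq_hz; double amp01; double length_s; };

bool grain_fires(double pressure01, double max_rate_hz, double tick_s,
                 std::mt19937& rng) {
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double rate = pressure01 * max_rate_hz;  // expected grains per second
    return u(rng) < rate * tick_s;           // per-tick Bernoulli approximation
}
```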
For more information on this or bARefoot, consult the UIST’20 paper and the Git repository.
VibScoreEditor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2009 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Brüel & Kjaer Model 4810, Voice Coil, Vibration Motor |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Keyframe, Score |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
VibScoreEditor uses music notation as a metaphor for vibrotactile editing. In this system, a “vibrotactile clef” determines the frequency and waveform assigned to each “pitch” occupying a position on the staff. Notes take a pitch by virtue of this position, an intensity indicated by a number inside the note head, and a duration set by the shape of the note. Two dynamics, crescendo and decrescendo, allow note intensity to increase or decrease gradually. By switching between clefs and arranging a series of vibrotactile notes, rests, and dynamic markings, users can create complex time-varying vibration patterns.
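The score metaphor maps onto a simple data model; the sketch below uses invented names to show how a clef and a note might combine into playable parameters:
```
#include <array>
#include <utility>

// Illustrative data model (names invented): a clef assigns a frequency and
// waveform to each staff position; a note adds intensity and duration.
enum class Waveform { Sine, Square, Sawtooth };

struct Clef {
    std::array<std::pair<double, Waveform>, 9> pitches;  // one per staff position
};

struct Note {
    int staff_position;  // selects (frequency, waveform) from the active clef
    int intensity;       // the number written inside the note head
    double beats;        // duration implied by the note shape
};
```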
For more information, consult the 2009 World Haptics Conference paper.
VibViz
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (BSD 3-Clause) |
Venueⓘ The venue(s) for publications. | IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Library, Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
VibViz is a library of vibrotactile effects with filtering and visualization tools for finding elements within it. Each effect is tagged with emotions (“angry”), metaphors (“heartbeat”), and possible uses (“alarm”) that can be used to filter the library. Two chart visualizations lay out the effects by duration and signal RMS, and by pleasantness and urgency. Additional filters select effects with a specific tempo, rhythm structure, and roughness. A complete list of the effects shows each one’s metaphor and usage tags along with a visualization of intensity over time. Selected effects are played back as audio output, allowing them to be displayed on a connected actuator.
For more information on VibViz, consult the 2015 World Haptics Conference paper, the VibViz website, and the GitHub repository.
VITAKI
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | Journal of HCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Gaming |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | VITAKI Controller |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
VITAKI supports controlling ERM motors in various configurations. A photo of the output device’s configuration can be loaded into VITAKI and annotated with the locations of each actuator. Waveforms, either preset or customized through a keyframe-based editor, are assigned to each actuator by placing them in the corresponding track. An additional mode changes the mapping between waveform values and the actual output voltages sent to the device.
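The sketch below illustrates the two steps this paragraph describes, evaluating a keyframed waveform and remapping its normalized values to output voltages, under assumed names and constants; it is not VITAKI’s actual code.

```java
// Hypothetical sketch of keyframe evaluation plus a remap from normalized
// waveform values to output voltages for an ERM motor.
public class KeyframeWaveform {
    // Keyframes as (time s, value 0..1) pairs, sorted by time.
    static double sample(double[][] keys, double t) {
        for (int i = 0; i + 1 < keys.length; i++) {
            double t0 = keys[i][0], t1 = keys[i + 1][0];
            if (t >= t0 && t <= t1) {
                double a = (t - t0) / (t1 - t0); // linear interpolation
                return keys[i][1] * (1 - a) + keys[i + 1][1] * a;
            }
        }
        return 0.0;
    }

    // Remap a normalized value to the motor's usable voltage range,
    // skipping the dead zone below which an ERM motor does not spin.
    static double toVoltage(double v, double vMin, double vMax) {
        return v <= 0 ? 0 : vMin + v * (vMax - vMin);
    }

    public static void main(String[] args) {
        double[][] env = {{0.0, 0.0}, {0.1, 1.0}, {0.5, 0.2}, {1.0, 0.0}};
        for (double t = 0; t <= 1.0; t += 0.25)
            System.out.printf("t=%.2f -> %.2f V%n", t, toVoltage(sample(env, t), 1.2, 3.0));
    }
}
```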
For more information, consult the 2014 Journal of Human-Computer Interaction article.
ViviTouch
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | MKV, Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
ViviTouch is meant to support prototyping of vibrotactile haptics aligned to audio-visual content. Haptic media is created through waveforms and filters that map the audio content at each moment in time to the vibrotactile channel. These filters, such as a low-pass filter, are meant to aid in synchronizing audio and haptic content. Effects and filters are assigned to different output channels, representing each actuator, and to different haptic tracks. Using multiple tracks allows effects and filters to be layered on the same actuator at the same time.
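As a concrete example of this kind of filter-based mapping, here is a minimal one-pole low-pass filter deriving a haptic channel from an audio signal; the cutoff and rates are illustrative, not taken from ViviTouch.

```java
// A minimal one-pole low-pass filter of the kind such pipelines use to
// derive a vibrotactile channel from audio.
public class LowPassMapper {
    public static void main(String[] args) {
        double sampleRate = 44100.0, cutoffHz = 120.0;
        // One-pole coefficient: y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        double alpha = 1.0 - Math.exp(-2.0 * Math.PI * cutoffHz / sampleRate);

        double[] audio = new double[256];
        for (int n = 0; n < audio.length; n++)          // toy input: 1 kHz tone
            audio[n] = Math.sin(2 * Math.PI * 1000 * n / sampleRate);

        double y = 0;
        double[] haptic = new double[audio.length];
        for (int n = 0; n < audio.length; n++) {
            y += alpha * (audio[n] - y);                // keep low frequencies
            haptic[n] = y;                              // drive the actuator channel
        }
        System.out.printf("peak of filtered channel: %.3f%n",
            java.util.Arrays.stream(haptic).map(Math::abs).max().orElse(0));
    }
}
```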
For more information, consult the 2014 World Haptics Conference paper.
Voodle
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2017 |
Platformⓘ The OS or software framework needed to run the tool. | NodeJS |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM DIS |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | CuddleBit |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
While Voodle is primarily meant to control 1-DoF robots called “CuddleBits”, it can also be used for haptic prototyping. The frequency and amplitude of a user’s voice are used to drive the output of the system. Each parameter is normalized and combined in a weighted average using a bias value set by the user, who can then add random noise and scale and smooth the resulting output. The mapping of voice input to motor output occurs in real time.
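A minimal sketch of that pipeline, assuming normalized pitch and loudness inputs; the constants and names are illustrative rather than Voodle’s actual implementation.

```java
import java.util.Random;

// Sketch of the mapping described above: normalized voice pitch and
// loudness are blended with a user-set bias, perturbed with noise, then
// scaled and smoothed before driving the motor.
public class VoiceToMotor {
    static double smoothed = 0;

    static double map(double pitchNorm, double loudnessNorm,
                      double bias, double noiseAmt, double gain, Random rng) {
        double blend = bias * pitchNorm + (1 - bias) * loudnessNorm; // weighted average
        blend += noiseAmt * (rng.nextDouble() * 2 - 1);              // random noise
        double target = Math.max(0, Math.min(1, gain * blend));      // scale and clamp
        smoothed += 0.1 * (target - smoothed);                       // exponential smoothing
        return smoothed;                                             // motor output 0..1
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int frame = 0; frame < 5; frame++)
            System.out.printf("%.3f%n", map(0.6, 0.8, 0.5, 0.05, 1.0, rng));
    }
}
```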
For more information, consult the DIS’17 paper and the GitHub repository.
VRML Plugin
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | VRML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This system is a Virtual Reality Modeling Language (VRML) plugin that adds force-feedback effects to a subset of VRML. This means that existing VRML scenes can automatically have haptic effects added to them. The scene can be explored using a PHANTOM device in the Netscape browser.
For more information about the VRML plugin, consult the 2003 Haptics Symposium paper.
Web-based MPEG-V Tool
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ISM, IEEE HAVE |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon, omega.x, delta.x, sigma.x, Geomagic Touch, Moog HapticMaster |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | MPEG-V, Collada |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Dong et al.’s authoring tool allows friction, spring, impulse, and shape forces to be added to 3D objects loaded into the editing environment. The resulting elements are exported using a proposed extension of MPEG-V that supports haptics.
For more information, consult the 2013 IEEE International Symposium on Haptic Audio Visual Environments and Games paper and the 2015 IEEE International Symposium on Multimedia paper.
Weirding Haptics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (AGPL 3) |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Oculus Touch |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Weirding Haptics is a design tool for, and within, virtual reality environments: vocalizations are recorded and mapped to vibration patterns felt when touching virtual objects. These patterns can have parameters, such as maximum amplitude, modulated by the position or velocity of contact. Several patterns can be layered to create complex effects that would be impossible to achieve with a single recording.
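The sketch below illustrates the core idea under stated assumptions: take a short-time RMS envelope of a recorded vocalization, then scale it by contact speed. All names are hypothetical, not taken from the tool.

```java
// Illustrative sketch: a recorded vocalization's amplitude envelope becomes
// a vibration pattern whose strength is modulated by touch speed.
public class VocalizationPattern {
    // Short-time RMS envelope of the recording (one value per frame).
    static double[] envelope(double[] samples, int frameLen) {
        double[] env = new double[samples.length / frameLen];
        for (int f = 0; f < env.length; f++) {
            double sum = 0;
            for (int i = 0; i < frameLen; i++) {
                double s = samples[f * frameLen + i];
                sum += s * s;
            }
            env[f] = Math.sqrt(sum / frameLen);
        }
        return env;
    }

    // Scale the pattern by contact speed so faster touches feel stronger.
    static double amplitudeAt(double[] env, int frame, double contactSpeed, double maxSpeed) {
        double scale = Math.min(1.0, contactSpeed / maxSpeed);
        return env[Math.min(frame, env.length - 1)] * scale;
    }

    public static void main(String[] args) {
        double[] recording = new double[1600];
        for (int i = 0; i < recording.length; i++)   // toy "vocalization"
            recording[i] = Math.sin(2 * Math.PI * i / 80.0) * (1.0 - i / 1600.0);
        double[] env = envelope(recording, 160);
        System.out.printf("frame 3 at slow touch: %.3f%n", amplitudeAt(env, 3, 0.2, 1.0));
    }
}
```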
For more information on Weirding Haptics, please consult the UIST’21 paper or the GitHub repository.
YouTube Haptic Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2010 |
Platformⓘ The OS or software framework needed to run the tool. | Java |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM MM |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Custom Jacket, Custom Arm Band |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Torso, Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual, Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This authoring tool allows users to annotate YouTube videos with time-synchronized vibrotactile content. The resulting augmented file can be played using a custom Java browser that supports the specified haptic devices. In the editing environment, the timing and actuation of a vibrotactile array can be set with the source YouTube video visible as a reference.
For more information, read the 2010 ACM Multimedia Conference paper.
AdapTics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2024 |
Platformⓘ The OS or software framework needed to run the tool. | Unity, Rust |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL-3.0 and MPL-2.0) |
Venueⓘ The venue(s) for publications. | ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ultraleap STRATOS Explore |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Demonstration, Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API, WebSockets |
Additional Information
AdapTics is a toolkit for creating ultrasound tactons whose parameters change in response to other parameters or events. It consists of two components: the AdapTics Engine and the AdapTics Designer. The Designer, built on Unity, allows adaptive tactons to be created using elements common in audio and video editors, and in adaptive audio editing in particular. Tactons can be created freely or in relation to a simulated hand. The Designer communicates with the Engine over WebSockets, and the Engine is responsible for rendering on the connected hardware. While only Ultraleap devices are supported as of this writing, the Engine is designed to accommodate future hardware. The Engine can also be used directly through API calls in Rust or C/C++.
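For a sense of what a WebSocket client talking to the Engine might look like, here is a minimal Java sketch; the URL, port, and message shape are assumptions made for illustration, not AdapTics’ actual protocol, which is documented in its repositories.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;

// Hypothetical sketch of a client driving an AdapTics-style engine over
// WebSockets. Endpoint and JSON payload are illustrative assumptions.
public class EngineClient {
    public static void main(String[] args) {
        WebSocket ws = HttpClient.newHttpClient()
            .newWebSocketBuilder()
            .buildAsync(URI.create("ws://localhost:8080"), new WebSocket.Listener() {})
            .join();
        // Update an adaptive parameter that a tacton was authored against.
        ws.sendText("{\"type\": \"set_parameter\", \"name\": \"speed\", \"value\": 0.7}", true)
          .join();
        ws.sendClose(WebSocket.NORMAL_CLOSURE, "done").join();
    }
}
```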
To learn more about AdapTics, read the CHI 2024 paper or consult the AdapTics Engine and AdapTics Designer GitHub repositories.
Android API
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (Apache 2.0) |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Android |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Android API consists of preset VibrationEffect assets and developer-added compositions of primitives such as the “click” and “tick” effects. Waveforms can also be created by specifying a sequence of vibration durations, or durations with associated amplitudes. Audio-coupled effects can be generated using the HapticGenerator. There are significant differences in hardware and software support across Android devices and OS versions, including for basic features such as amplitude control.
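Both styles look roughly like this in practice: a waveform pairing each duration (ms) with an amplitude (0–255), and a composition of primitives, which requires API level 30+ and device support.

```java
import android.content.Context;
import android.os.VibrationEffect;
import android.os.Vibrator;

public class VibrationExamples {
    static void play(Context context) {
        Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);

        // Waveform: wait 0 ms, vibrate 100 ms at half strength,
        // pause 50 ms, vibrate 200 ms at full strength; -1 = no repeat.
        long[] timings = {0, 100, 50, 200};
        int[] amplitudes = {0, 128, 0, 255};
        vibrator.vibrate(VibrationEffect.createWaveform(timings, amplitudes, -1));

        // Composition of primitive effects (subject to device support):
        // a full-strength click followed, 100 ms later, by a half-strength tick.
        vibrator.vibrate(VibrationEffect.startComposition()
            .addPrimitive(VibrationEffect.Composition.PRIMITIVE_CLICK, 1.0f)
            .addPrimitive(VibrationEffect.Composition.PRIMITIVE_TICK, 0.5f, 100)
            .compose());
    }
}
```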
For more information, consult the Android API documentation and Android Open Source Project (AOSP).
ANISMA
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2022 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | ACM ToCHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Skin Stretch/Compression |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Shape-Memory Alloy |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | STL File |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
ANISMA is a toolkit for prototyping wearable haptic devices using shape-memory alloys (SMAs). Users can place SMA types already defined in the software between pairs of nodes and simulate how networks of SMAs would behave on the skin during actuation. Nodes to which SMAs are attached can be simulated either as adhered to the wearer’s skin or as free-floating. Once the layout is complete, ANISMA can be used to fabricate the design and to play back patterns on the actual hardware.
For more information, please consult the 2022 ACM ToCHI paper and the main GitHub repository.
Apple Notifications
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | iOS |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Proprietary |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Notifications |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | iPhone |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
New vibration patterns for notifications can be added on supported devices through the settings menu. The user taps out the desired pattern on the touchscreen, plays the recorded pattern back, and saves it for later use.
For more information, consult the Apple Support page.
Beadbox
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | UAHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Accessibility |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | EmotiChair, Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | MIDI |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Beadbox allows users to place and connect beads across different tracks, each track representing a physical actuator. Each bead provides a visual representation of vibration frequency and intensity. Duration can be controlled by forming a connection between two beads. If the actuator, duration, or frequency changes between connected beads, Beadbox transitions between these values during playback.
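A sketch of that transition, with a hypothetical Bead type: parameters are linearly blended along the connection between two beads.

```java
// Sketch of Beadbox-style interpolation: when two connected beads differ
// in frequency or intensity, playback ramps between them over the
// connection's duration. The Bead type and fields are hypothetical.
public class BeadInterpolation {
    record Bead(double freqHz, double intensity) {}

    // Parameters at fraction a (0..1) along the connection from b0 to b1.
    static Bead at(Bead b0, Bead b1, double a) {
        return new Bead(b0.freqHz() + a * (b1.freqHz() - b0.freqHz()),
                        b0.intensity() + a * (b1.intensity() - b0.intensity()));
    }

    public static void main(String[] args) {
        Bead start = new Bead(100, 0.2), end = new Bead(250, 1.0);
        for (double a = 0; a <= 1.0; a += 0.5)
            System.out.println(at(start, end, a));
    }
}
```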
For more information about Beadbox, consult the UAHCI 2016 paper and the GitHub repository.
bHaptics Designer
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Proprietary |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Gaming |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | bHaptics Devices |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Torso, Arm, Head, Hand, Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The bHaptics Designer allows for tactile patterns and animations to be created on the various bHaptics products worn on the body. Points of vibration can be set to move across the grid of the selected device with intensity changing from waypoint to waypoint. Individual paths can be placed into tracks on a timeline to create layered, more complex effects.
For more information, consult the bHaptics Designer website.
Cha et al. Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2007 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile, Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | 76-Tactor Glove, Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | MPEG-4 BIFS |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The authoring tool described by Cha et al. is meant to create interactions to be broadcast using MPEG-4 Binary Format for Scenes (BIFS). Haptic effects are represented through different “nodes” that support moving a force-feedback device along a trajectory, guiding a force-feedback device to a specific position, and triggering vibration on a tactile array. The tool itself supports recording motion on a force-feedback device for use in these nodes and an interface for creating and aligning vibration effects with pre-existing video files.
For more information, consult the WHC’07 paper.
CHAI3D
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (BSD 3-Clause) |
Venueⓘ The venue(s) for publications. | EuroHaptics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | omega.x, delta.x, sigma.x, Phantom, Novint Falcon, Razer Hydra |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual, Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API, Device Template |
Additional Information
CHAI3D is a C++ framework for 3D haptics. Users can initialize a scene, populate it with virtual objects, and set the properties of those objects using built-in haptic effects, such as “viscosity” and “magnet”. It also uses OpenGL for graphics rendering and OpenAL for audio effects. CHAI3D can be extended to support additional haptic devices using the included device template.
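CHAI3D exposes these effects through its C++ API; as a language-neutral illustration (in Java, like the other sketches here), the following computes a simple “magnet”-style attraction force. The names are hypothetical, not CHAI3D symbols.

```java
// Toy magnet effect: inside the attraction radius, pull the haptic tool
// toward the object's center with stiffness k (force in newtons).
public class HapticSceneSketch {
    static double[] magnetForce(double[] tool, double[] center, double radius, double k) {
        double dx = center[0] - tool[0], dy = center[1] - tool[1], dz = center[2] - tool[2];
        double dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
        if (dist > radius || dist == 0) return new double[]{0, 0, 0};
        return new double[]{k * dx, k * dy, k * dz};
    }

    public static void main(String[] args) {
        double[] f = magnetForce(new double[]{0.05, 0, 0}, new double[]{0, 0, 0}, 0.1, 200);
        System.out.printf("magnet force: (%.1f, %.1f, %.1f) N%n", f[0], f[1], f[2]);
    }
}
```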
For more information on CHAI3D, please consult the website, the documentation, and the EuroHaptics abstract.
Cobity
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2022 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | Mensch und Computer |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Kinova Gen3 |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Cobity is a Unity plugin for controlling a cobot in VR. The robot’s end effector and position tracking parameters can be modified within the plugin. The end effector can be moved in response to user movements, e.g., by hand tracking, and can be bound to the position of a virtual object in the scene.
For more information on Cobity, please consult the MuC’22 paper or the GitHub repository.
Component-based Haptic Authoring Tool
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | GRAPP |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The component-based haptic authoring tool extends Unity to allow preset haptic effects, such as magnetism or viscosity, to be added to elements in the scene. These elements may already have visual or audio properties. The aim of the tool is to decrease the difficulty of adding haptic interaction to an experience.
For more information, consult the 2015 Computer Graphics Theory and Applications paper.
Compressables Haptic Designer
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | ACM DIS |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Skin Stretch/Compression |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Compressables |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Head, Hand, Arm, Leg, Torso |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Compressables Haptic Designer is a web app for controlling the Compressables family of pneumatic wearables. Motor power limits can be set through the app, and gestures in the user interface allow the user to control compression and decompression in real time. Time-based effects can also be created and triggered by certain physical gestures.
For more information on Compressables or the Compressables Haptic Designer, please consult the DIS’21 paper or the GitHub repository. The included graphic by S. Endow, H. Moradi, A. Srivastava, E.G. Noya, and C. Torres is licensed under CC BY 4.0.
D-BOX SDK
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Proprietary |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation, Gaming, Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | D-BOX |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The D-BOX LiveMotion SDK is used to create motion effects on a D-BOX chair in response to events, such as those in a simulation or game. Telemetry information concerning the user’s avatar, vehicles, or surrounding environment must be sent when updates occur. These data are used by a custom-built D-BOX Motion System to create haptic effects with limited latency between events occurring in the virtual environment and a response being felt.
For more information, consult the D-BOX website.
DIMPLE
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2007 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 2) |
Venueⓘ The venue(s) for publications. | NIME, Interacting with Computers |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation, Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | omega.x, delta.x, sigma.x, Phantom, Novint Falcon, Razer Hydra |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual, Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
DIMPLE is a framework that connects visual, audio, and haptic simulations of a scene using OSC, with haptics support provided via CHAI3D. DIMPLE allows scenes to be constructed over OSC by a client such as Pure Data, and creates corresponding graphical and haptic representations using CHAI3D, ODE, and GLUT. The user can then connect data from events in these scenes (e.g., an object’s motion) to the audio synthesis environment of their choice.
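To show what driving such a scene over OSC involves at the wire level, here is a minimal hand-rolled OSC sender; the address pattern and port are assumptions for illustration, so consult DIMPLE’s documentation for its actual namespace.

```java
import java.io.ByteArrayOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal OSC-over-UDP sender. OSC strings are null-terminated and padded
// to 4-byte boundaries; float arguments are big-endian IEEE 754.
public class OscSender {
    static byte[] padded(byte[] raw) {
        int len = (raw.length + 1 + 3) & ~3;   // null terminator + pad to 4
        byte[] out = new byte[len];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    static byte[] message(String address, float... args) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(padded(address.getBytes(StandardCharsets.US_ASCII)));
        StringBuilder tags = new StringBuilder(",");
        for (int i = 0; i < args.length; i++) tags.append('f');
        out.write(padded(tags.toString().getBytes(StandardCharsets.US_ASCII)));
        for (float f : args)
            out.write(ByteBuffer.allocate(4).putFloat(f).array()); // big-endian
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical address and port, for illustration only.
        byte[] msg = message("/world/sphere/position", 0.1f, 0.2f, 0.0f);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(msg, msg.length,
                InetAddress.getLoopbackAddress(), 7770));
        }
    }
}
```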
For more information on DIMPLE, please consult the NIME’07 paper, the 2009 Interacting with Computers article, and the GitHub repository.
DOLPHIN
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Qt |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | ACM Symposium on Applied Perception |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Psychophysics |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ultraleap STRATOS Explore |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | CSV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
DOLPHIN is a framework with a design tool for creating ultrasound mid-air tactile renderings for perceptual studies. Users can create new classes to represent the geometries of shapes and the sampling strategies used to display them. Parameters of the shape and sampling strategy can be modified in the tool with the help of pressure and position visualizations. DOLPHIN also includes an interface to PsychoPy to aid in studies. While the framework currently only supports the STRATOS, the software is written so that support for new devices can be added in the future. A “reader” module is also available that can be included in other software to play back the renderings designed in DOLPHIN.
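As an illustration of the geometry/sampling split, the sketch below samples points uniformly around a circle, the kind of traversal an ultrasound array renders as a mid-air shape; the class and parameter names are hypothetical, not DOLPHIN’s.

```java
// Sketch of a "shape + sampling strategy" split: a circle geometry sampled
// at a fixed count yields the focal-point positions an array would render.
public class CircleSampling {
    // Sample `count` points uniformly around a circle of radius r (metres).
    static double[][] sampleCircle(double r, int count) {
        double[][] pts = new double[count][2];
        for (int i = 0; i < count; i++) {
            double theta = 2 * Math.PI * i / count;
            pts[i][0] = r * Math.cos(theta);
            pts[i][1] = r * Math.sin(theta);
        }
        return pts;
    }

    public static void main(String[] args) {
        for (double[] p : sampleCircle(0.02, 8))
            System.out.printf("(%.4f, %.4f)%n", p[0], p[1]);
    }
}
```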
For more information, please consult the SAP’21 paper or the GitLab repository.
DrawOSC and Pattern Player
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | iPad |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ICMC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ilinx Garment |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm, Leg, Torso |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
The DrawOSC and Pattern Player tools were used to compose tactile effects with the eccentric rotating mass (ERM) motors present on the arms, legs, and torso of the Ilinx garment. DrawOSC provides a visual representation of the body and allows the user to draw vibration trajectories that are played on the garment and to adjust a global vibration intensity parameter. The Pattern Player tool additionally allows for intensity to be controlled independently over these trajectories.
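Since the tools communicate over Open Sound Control, a sketch of what driving the garment might look like from Python is shown below; the OSC addresses are assumptions, as the paper does not document the exact namespace.

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.0.10", 9000)  # garment host and port (assumed)

# Hypothetical addresses: a global intensity control and a per-motor level.
client.send_message("/ilinx/intensity", 0.6)
client.send_message("/ilinx/actuator/3", 0.8)
```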
For more information about DrawOSC and the Pattern Player, consult the ICMC 2015 paper titled “Composition Techniques for the Ilinx Vibrotactile Garment”.
Feel Messenger
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Android |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Feel Messenger is a haptically augmented text messaging application structured around families of “feel effects”. Feel widgets or “feelgits” are types of effects that can be varied by a set of parameters, or “feelbits”. These parameters are made available to users of Feel Messenger through a set of menus. New effects can also be created by playing pre-existing effects one after the other.
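A minimal sketch of the feelgit/feelbit structure, with hypothetical field names, might look like the following; sequencing two effects yields a new composite one, as described above.

```python
from dataclasses import dataclass

@dataclass
class Feelgit:
    """Hypothetical parameterized 'feel effect'; each field is a feelbit."""
    name: str
    duration_ms: int   # how long the effect lasts
    intensity: float   # vibration strength, 0.0-1.0
    repetitions: int   # pulse count

calm = Feelgit("heartbeat", duration_ms=800, intensity=0.5, repetitions=2)
urgent = Feelgit("heartbeat", duration_ms=400, intensity=1.0, repetitions=4)

# Playing pre-existing effects one after the other creates a new effect.
message_effect = [calm, urgent]
```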
For more information, consult the CHI’15 WIP paper.
FeelCraft
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Java, Python |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM UIST, AsiaHaptics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Collaboration, Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Vybe Haptic Gaming Pad |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Library, Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
FeelCraft is a technical architecture in which libraries of haptic effects (feel effects) are triggered by events in pre-existing media applications through the FeelCraft plugin. The implemented example connects to the event system of a Minecraft server. Families of feel effects are expressed in software and controlled through sets of parameters that are exposed to users in a menu interface. In this example, as in-game events occur (e.g., it begins to rain), an associated feel effect with the selected parameters is displayed to the person playing the game.
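A sketch of the event-to-effect dispatch this architecture implies is below; FeelCraft's real JSON schema and plugin API are not public, so the keys and helper function are hypothetical.

```python
import json

# Hypothetical feel effect library keyed by game event.
library = {
    "rain": {"family": "rumble", "intensity": 0.4, "duration_ms": 1500},
    "explosion": {"family": "pulse", "intensity": 1.0, "duration_ms": 300},
}

def on_game_event(event_name):
    """Plugin-side dispatch: look up the feel effect bound to a game event."""
    effect = library.get(event_name)
    if effect is not None:
        print(f"play {effect['family']} at intensity {effect['intensity']}")

on_game_event("rain")
print(json.dumps(library["rain"]))  # effects travel as JSON
```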
For more information on FeelCraft, consult the UIST’14 demo and the relevant chapter of Haptic Interaction.
Feelix
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | ACM ICMI, ACM NordiCHI, ACM TEI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Brushless Motors |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Demonstration, Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | Feelix Effect File |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Feelix supports the creation of effects on a 1 DoF motor through two main interfaces. The first allows for force-feedback effects to be sketched out over either motor position or time. For time-based effects, user-created and pre-existing effects can be sequenced in a timeline. The second interface provides a dataflow programming environment to directly control the connected motor. Parameters of these effects can be connected to different inputs to support real-time adjustment of the haptic interaction.
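As a rough illustration of the first interface, the sketch below treats a sketched force-feedback effect as keyframes of torque over motor position and interpolates between them; this is not Feelix's file format or API, just the underlying idea.

```python
import bisect

# A sketched effect as (angle in degrees, normalized torque) keyframes.
keyframes = [(0, 0.0), (45, 0.3), (90, -0.3), (180, 0.0)]
angles = [a for a, _ in keyframes]

def torque_at(angle):
    """Linearly interpolate the sketched curve at the motor's current angle."""
    i = bisect.bisect_right(angles, angle)
    if i == 0:
        return keyframes[0][1]
    if i == len(keyframes):
        return keyframes[-1][1]
    (a0, t0), (a1, t1) = keyframes[i - 1], keyframes[i]
    return t0 + (t1 - t0) * (angle - a0) / (a1 - a0)

print(torque_at(60.0))  # ≈ 0.1
```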
For more information, consult the 2020 ICMI paper, the NordiCHI’22 tutorial, the TEI’23 studio, and the Feelix documentation.
Feellustrator
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2023 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control, Collaboration |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ultraleap STRATOS Explore |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Library, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON, CSV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Feellustrator is a design tool for ultrasound mid-air haptics that allows users to sketch out paths for sensations to follow, control how the focal point will move along this path, and combine them together over time to create more complex experiences. Audio and visual reference materials can be loaded into the tool, but cannot be modified once added. Hand tracking support is present so that effects can be played relative to the user’s hand rather than floating freely in space.
For more information, please consult the CHI’23 paper.
ForceHost
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2022 |
Platformⓘ The OS or software framework needed to run the tool. | Web, Faust |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | NIME |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TorqueTuner |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
ForceHost is a toolchain for embedded physical modelling of audio-haptic effects for digital musical instruments. It primarily supports the TorqueTuner device, but can optionally be used with ESP32 boards supporting audio I/O, a 1-DoF servo, and network connectivity. Network connectivity is necessary so that it can be connected to other audio synthesis programs and so users can access the editor GUI through a web application. This application allows users to create haptic effects at runtime by sketching and manipulating curves representing transfer functions. Based on Faust, ForceHost is also supported by a fork of Synth-a-Modeler and can be controlled with a lower-level API called haptic1D.
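ForceHost effects are written in Faust, but the kind of angle-to-torque transfer function sketched in its editor can be illustrated with a short Python stand-in: a detent curve for a 1-DoF knob like the TorqueTuner, with all constants illustrative.

```python
def detent_torque(angle_deg, spacing=30.0, k=0.02):
    """Pull the knob toward the nearest detent (multiple of `spacing` degrees)."""
    # Signed distance in degrees from the nearest detent.
    offset = (angle_deg + spacing / 2) % spacing - spacing / 2
    return -k * offset

for a in (0.0, 5.0, 25.0):
    print(a, detent_torque(a))
```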
For more information, please consult the NIME’22 paper or visit the GitLab repositories.
GAN-based Material Tuning
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | iPad |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ACCESS |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The tuning method developed here uses a generative adversarial network (GAN) to produce vibrations corresponding to materials with properties in between those provided in the dataset used to train the GAN. The application described in the paper was intended to test this method of haptic tuning with users.
For more information about this method, consult the 2020 IEEE Access paper.
GENESIS
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 1995 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ICMC, Symposium on Computer Music Multidisciplinary Research |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music, Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Transducteur gestuel rétroactif |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
GENESIS is a physical modelling system that uses the CORDIS-ANIMA framework to create virtual musical instruments. By creating networks of different modules in a one-dimensional space and adjusting their parameters, simulations can be constructed and interacted with via a connected haptic device. Sounds produced by the model are played back. While earlier versions of GENESIS required non-realtime simulation of instruments, multiple simulation engines are now supported, including those that run in real time (e.g., GENESIS-RT).
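The CORDIS-ANIMA idea of building instruments from mass and link modules can be sketched in a few lines; the code below is a generic two-mass/one-spring network stepped at audio rate, not GENESIS syntax, and all parameter values are illustrative.

```python
# Two point masses (MAT-like modules) coupled by a spring-damper link
# (LIA-like module), integrated with semi-implicit Euler at audio rate.
dt = 1.0 / 44100
m, k, z = 1.0, 8e5, 2.0       # mass, stiffness, damping
x = [0.0, 0.001]              # one mass starts displaced ("plucked")
v = [0.0, 0.0]

for _ in range(5):
    f = k * (x[1] - x[0]) + z * (v[1] - v[0])  # force the link applies to mass 0
    a = [f / m, -f / m]
    v = [v[i] + a[i] * dt for i in range(2)]
    x = [x[i] + v[i] * dt for i in range(2)]
    print(x)                  # mass positions would be tapped as the sound output
```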
Information on GENESIS is included in numerous places, including the 1995 International Computer Music Conference (ICMC) paper, the 2002 ICMC paper, the 2009 ICMC paper, and the 2013 International Symposium on Computer Music Multidisciplinary Research paper.
H-Studio
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback, Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | IMU Sensor |
Additional Information
H-Studio is a tool meant to add haptic effects, primarily motion effects, to a pre-existing video file. Its primary interface provides a preview of the original audio-visual content, tracks of the different parameters that can be edited in H-Studio, and a visual preview of a selected motion effect. Data to drive a motion effect can be input from a force-feedback device directly or from another source (e.g., an IMU), and motion effects can be played back in the tool to aid in further refinement.
For more information, consult the UIST’13 poster.
H3D API
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2004 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 2) |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom, Novint Falcon, omega.x, delta.x, sigma.x |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
H3D API is a framework that lets users design haptic scenes using X3D. Virtual objects, effects in space, and haptic devices can be specified using X3D and added into the scene-graph. Visual properties are rendered using OpenGL and haptic properties are rendered using the included HAPI engine. Users also have the option of adding new features or creating more complex scenes by directly programming in C++ or Python.
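As an illustration of specifying haptics in X3D, the fragment below adds a surface node to a shape's appearance; the node names only approximate H3D's X3D extensions and should be checked against the H3D documentation before use.

```python
# Illustrative X3D fragment held in a Python string; node names approximate
# H3D's extensions (verify against the H3D documentation).
scene = """
<Scene>
  <Shape>
    <Appearance>
      <Material diffuseColor="0.7 0.2 0.2"/>
      <SmoothSurface stiffness="0.5"/>  <!-- haptic rendering property -->
    </Appearance>
    <Sphere radius="0.05"/>
  </Shape>
</Scene>
"""
print(scene)
```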
For more information on H3D API, please consult the H3D API website and its documentation page.
HAMLAT
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | Blender |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | EuroHaptics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | HAML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HAMLAT is an extension to Blender that adds menu items for controlling the static physical properties of modeled objects. These properties can be felt in the environment itself using a force-feedback device, and can be imported to and exported from HAMLAT using the Haptic Applications Meta Language (HAML).
For more information, consult the EuroHaptics 2008 paper.
hAPI
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2018 |
Platformⓘ The OS or software framework needed to run the tool. | Java, C#, Python |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation, Education |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haply 2DIY |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
hAPI is a low-level API for controlling the Haply 2DIY device. In its basic form, it handles communication between the host and the device, mapping between positions and forces in the device’s workspace and the angles and torques used by the hardware itself. hAPI can also be combined with physics engines such as Fisica for convenience.
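The read-compute-write loop hAPI enables might look like the sketch below; the class and method names here are stand-ins, not hAPI's actual signatures, and the device is stubbed so the example runs anywhere.

```python
class Device:
    """Stand-in for a hAPI device handle (names are hypothetical)."""
    def position(self):
        return (0.01, 0.02)          # end-effector position in metres (stubbed)
    def set_force(self, fx, fy):
        print(f"F = ({fx:+.2f}, {fy:+.2f}) N")

K = 200.0                            # virtual spring stiffness, N/m
dev = Device()
x, y = dev.position()
dev.set_force(-K * x, -K * y)        # spring pulling the handle to the origin
```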
For more information on hAPI, please consult the GitLab repository, the 2DIY development kit page, and the hAPI/Fisica repository.
Haptic Icon Prototyper
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2006 |
Platformⓘ The OS or software framework needed to run the tool. | Linux |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | 1 DoF Knob |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Haptic Icon Prototyper consists of a waveform editor used to adjust the magnitude of a force-feedback effect over position or time. Waveforms can be refined by manipulating keyframes on their visualizations until a desired result is achieved. These waveforms can then be combined in a timeline, either by superimposing them to create a new tile or by sequencing them. Together, these features are meant to support the user in rapidly brainstorming new ideas.
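Superimposing two waveform tiles into a new one, as the prototyper's timeline allows, reduces to a sample-wise sum; the sketch below uses illustrative rates and frequencies rather than the tool's actual units.

```python
import math

rate = 1000                                                # samples per second
t = [i / rate for i in range(200)]
base = [0.5 * math.sin(2 * math.pi * 25 * s) for s in t]   # 25 Hz carrier
swell = [0.3 * math.sin(2 * math.pi * 5 * s) for s in t]   # slow modulation
tile = [a + b for a, b in zip(base, swell)]                # superimposed tile
```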
For more information, consult the 2006 Haptics Symposium paper.
Haptic Studio (Immersion)
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Proprietary |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TouchSense Devices |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | WAV, Haptic Studio Files |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Haptic Studio supports the creation of several kinds of base effects either from scratch or from a WAV file. These base effects can be of types MagSweep (continuous vibration along an attack-sustain-release or ASR envelope), Periodic (regular pulses that are contained in an ASR envelope), or Waveform (statically loaded from a WAV file). They can be assigned to a timeline element where they can be linked to different actuators, arranged in time, and have their applicable parameters modified. Individual effects (“components”) and timelines can be played back, refined, and exported when complete.
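The attack-sustain-release envelope shared by MagSweep and Periodic effects can be written down directly; the function below is a sketch of that envelope shape, not Immersion's implementation.

```python
def asr_envelope(t, attack, sustain_level, sustain_time, release):
    """Effect magnitude at time t (seconds) under an ASR envelope."""
    if t < attack:
        return sustain_level * t / attack                  # ramp up
    if t < attack + sustain_time:
        return sustain_level                               # hold
    if t < attack + sustain_time + release:
        return sustain_level * (1 - (t - attack - sustain_time) / release)
    return 0.0

print(asr_envelope(0.25, attack=0.1, sustain_level=0.8,
                   sustain_time=0.3, release=0.2))
```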
For more information, consult the Immersion website.
HapticLib
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | STM32 |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (ISC) |
Venueⓘ The venue(s) for publications. | ACM SAP |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | ERM |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
HapticLib is a small library for deploying haptic patterns on embedded devices. It is able to control multiple actuators at a time and provides an abstraction over low-level hardware control.
For more information about HapticLib, consult the 2013 Symposium on Applied Perception paper or the project website.
Hapticon Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Psychophysics |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | 1 DoF Knob |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | CSV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Hapticon Editor is intended to create haptic icons for 1 DoF force-feedback devices by directly recording a user’s motions on the device and by combining waveforms. While recording only allows for mapping motion over time, waveforms can be used as functions of force over position and time. Common waveforms, such as sine waves, can be automatically generated, but custom ones can be created by manipulating their constituent keyframes.
For more information, consult the 2003 Haptics Symposium paper.
HapticPilot
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2023 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM IMWUT |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | HapticPilot Glove |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HapticPilot allows users to sketch haptic patterns on their hands in virtual reality. When patterns are played back, an algorithm that accounts for differences in hand posture is used so that patterns feel the same even as the user moves. A glove containing 12 actuators and 13 accelerometers is used to track hand movement and render recorded patterns.
For more information about HapticPilot, consult the 2023 Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies paper.
HaptiDesigner
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2022 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | UAHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | HaptiBoard |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Torso |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HaptiDesigner is a toolkit for creating vibrotactile patterns, or Haptograms, on multiple actuators. Each Haptogram is composed of multiple frames, with each frame specifying which actuators are activated, the intensity of vibration, the duration, and the pause between the end of that frame and the start of the next. These patterns are stored in a local database so they can be modified and reused.
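A Haptogram's frame structure maps naturally onto a list of tuples; the sketch below mirrors the fields described above (the actual tool stores patterns as XML in a database, with a schema that may differ).

```python
import time

haptogram = [
    # (active actuator IDs, intensity 0-255, duration ms, pause ms) - illustrative
    ([0, 1], 200, 120, 60),
    ([2],    255,  80, 40),
    ([0, 3], 150, 200,  0),
]

for actuators, intensity, duration, pause in haptogram:
    print(f"drive {actuators} at {intensity} for {duration} ms")
    time.sleep((duration + pause) / 1000)
```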
For more information, please consult the 2022 Universal Access in Human-Computer Interaction paper or the GitHub repository.
Hassan et al. Texture Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE TIE |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Hassan, Abdulali, and Jeon created an affective authoring space for textures based on 25 models of real materials. These were positioned on an affective space defined by two axes, hard-soft and rough-smooth. New models can be generated by interpolating between the original data-driven models and played back on a voice-coil actuator, such as the Haptuator Mark II.
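One simple way to blend models positioned on such a two-axis space is inverse-distance weighting, sketched below; the paper's actual interpolation scheme may differ, and the corner models here are placeholders.

```python
import math

# (hardness, roughness) coordinates -> model label; both are placeholders.
models = {
    (-1.0, -1.0): "soft-smooth",
    (-1.0,  1.0): "soft-rough",
    ( 1.0, -1.0): "hard-smooth",
    ( 1.0,  1.0): "hard-rough",
}

def blend_weights(query, eps=1e-6):
    """Inverse-distance weights for a query point on the affective plane."""
    dist = {label: math.hypot(query[0] - p[0], query[1] - p[1]) + eps
            for p, label in models.items()}
    total = sum(1 / d for d in dist.values())
    return {label: (1 / d) / total for label, d in dist.items()}

print(blend_weights((0.3, -0.5)))  # blend weights for a new point in the space
```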
For more information on this method, please consult the 2020 article in IEEE Trans. on Industrial Electronics.
HFX Studio
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2018 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM VRST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback, Vibrotactile, Temperature |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Thalmic Myo, Subpack M2, Oculus Touch, Dyson Pure Cool Link |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Head, Torso, Arm, Leg, Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HFX Studio allows haptic effects to be authored directly on the body or attached to objects in a VR environment. Perceptual models are used to encode and render the desired effects to the extent supported by the connected hardware. This intermediate perceptual layer is intended to separate the design of haptic effects from the devices used to display them.
For more information, consult the VRST’18 paper.
HITPROTO
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2010 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium, Computers & Graphics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Accessibility |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
HITPROTO uses a visual programming interface to let users specify the content and behavior of an interactive haptic scene. Basic programming functionality, such as loops and conditional logic, is included in the environment, along with common haptic interactions (e.g., spring effects, guidance along a path) to aid users in creating haptic data visualizations.
For more information, consult the 2010 Haptics Symposium paper and the 2013 Computers & Graphics paper.
Hong et al. Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM TEI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haptuator |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | No |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This authoring tool supports the creation of vibration patterns through finger tapping on a touch screen. Duration of a touch is mapped to the duration of a vibration while touch area is mapped to intensity. Touch area is expected to be proportional to pressure. A visualization of the vibrotactile pattern is previewed to users during use, but this pattern must be transferred to a separate computer to drive an attached actuator.
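The two mappings described above condense to a few lines; the maximum-area constant below is an assumption for illustration.

```python
def tap_to_vibration(touch_duration_ms, touch_area_px, max_area_px=900):
    """Touch duration -> vibration duration; touch area (pressure proxy) -> intensity."""
    intensity = min(touch_area_px / max_area_px, 1.0)
    return {"duration_ms": touch_duration_ms, "intensity": round(intensity, 2)}

print(tap_to_vibration(touch_duration_ms=140, touch_area_px=450))
```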
For more information, consult the TEI’13 paper.
Interhaptics Haptic Composer
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Proprietary |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Gaming |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback, Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | iPhone, Android, PlayStation DualSense, Razer Kraken, Xinput Controllers, Meta Quest, OpenXR Devices |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON, WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Interhaptics Haptic Composer focuses on the creation of different materials for VR. Vibration can be added to a material by summing regular waveforms and constants. The “texture” menu functions in the same way, except that the resulting waveform is rendered statically over position to create bumps and changes in elevation. The “stiffness” menu determines the amount of force returned for a given displacement into the material. A fourth “thermal” option exists but cannot be modified at this time.
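The vibration and stiffness channels described above can be sketched as two small functions; the waveform mix and spring constant are illustrative, not values from the tool.

```python
import math

def vibration(t):
    """Material vibration: a constant plus two regular waveforms, summed."""
    return (0.2
            + 0.5 * math.sin(2 * math.pi * 60 * t)
            + 0.1 * math.sin(2 * math.pi * 180 * t))

def stiffness_force(displacement_mm, k=0.8):
    """Force returned per millimetre of displacement into the material."""
    return k * displacement_mm

print(vibration(0.01), stiffness_force(2.5))
```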
For more information, consult the Interhaptics website.
Lofelt Studio
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Proprietary |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation, Collaboration |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | iPhone, Android |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | WAV, Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Lofelt Studio allows users to load an audio file into the tool, which automatically creates an initial vibrotactile experience. This can be refined in the editor through menus that control global parameters and keyframes that shape the waveforms directly; the same editing features can be used to create effects from scratch. Vibrotactile effects can be sent to iOS and Android devices with a corresponding Lofelt Studio app installed.
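A toy version of the audio-to-vibration step is sketched below: rectify the signal and low-pass it into an amplitude envelope. Lofelt's actual analysis is proprietary and considerably more sophisticated.

```python
import math

rate = 8000
# A decaying 440 Hz tone standing in for the loaded audio file.
audio = [math.sin(2 * math.pi * 440 * i / rate) * max(0.0, 1 - i / rate)
         for i in range(rate)]

alpha, env, envelope = 0.002, 0.0, []
for s in audio:
    env += alpha * (abs(s) - env)   # one-pole low-pass over the rectified signal
    envelope.append(env)            # drives actuator amplitude over time
```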
For more information, consult the Lofelt website as archived on the Wayback Machine.
Macaron
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON, WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Macaron uses web audio to drive an actuator connected to the user’s computer. A library of vibrotactile effects is available for playback, visualized as waveforms expressing amplitude and frequency over time. These presets can be loaded into the editor pane, where keyframes can be added to the waveforms and modified. Alternatively, effects can be created from scratch without a preset. Preset and custom effects can be played back over audio output to support an iterative design process.
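A minimal sketch of this keyframe model (assumed for illustration; Macaron’s actual source is in the GitHub repository): amplitude and frequency keyframes are interpolated over time, and a sine is synthesized by integrating the instantaneous frequency:

```python
import math

def interp(keys, t):
    """keys: sorted (time, value) pairs; piecewise-linear interpolation."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def render(amp_keys, freq_keys, duration, sr=44100):
    phase, out = 0.0, []
    for i in range(int(duration * sr)):
        t = i / sr
        phase += 2 * math.pi * interp(freq_keys, t) / sr
        out.append(interp(amp_keys, t) * math.sin(phase))
    return out

# Fade in and out while sweeping from 100 Hz to 250 Hz.
signal = render([(0, 0), (0.5, 1), (1, 0)], [(0, 100), (1, 250)], duration=1.0)
```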
For more information about Macaron, consult the Haptics Symposium paper and the GitHub repository.
Mango
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | C-2 Tactor |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Mango is a graphical tool for creating effects on vibrotactile arrays. A visualization of the layout of the actuators in the array is present in the editor. Users can create “animation objects” with different positions and intensities, and create paths to define the motion of these objects over time. Parameters can be adjusted over time as well through the use of keyframes. A rendering algorithm is used to transform these tactile animations into actuator signals so that the authored vibrotactile experience is perceived by the user.
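One plausible rendering step for such a pipeline (the paper describes Mango’s actual algorithm; the inverse-distance weighting and 4x2 layout below are assumptions for illustration) maps an animation object’s position and intensity to per-actuator amplitudes:

```python
import math

ACTUATORS = [(x, y) for x in range(4) for y in range(2)]  # hypothetical 4x2 grid

def render_object(p, intensity, eps=1e-6):
    """Spread an animation object's intensity across nearby actuators."""
    w = [1.0 / (math.dist(p, a) + eps) for a in ACTUATORS]
    total = sum(w)
    return [intensity * wi / total for wi in w]

amplitudes = render_object(p=(1.5, 0.5), intensity=0.8)
```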
For more information, consult the UIST’15 paper.
MHaptic
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2007 |
Platformⓘ The OS or software framework needed to run the tool. | C++ |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE CW |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haptic Workstation |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | No |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
MHaptic and its associated Haptic Scene Creator allow virtual reality 3D models to be augmented so that they can be manipulated using a bimanual force-feedback device. The underlying MHaptic library can be used directly, or the Creator can be used to load 3D models, generate haptic geometries for them, and manually adjust the resulting elements.
For more information about MHaptic and the Haptic Scene Creator, consult the 2007 International Conference on Cyberworlds paper.
mHIVE
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (BSD 3-Clause) |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Collaboration, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Keyframe, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
mHIVE is a haptic instrument in which the user interacts with a tablet’s touchscreen to create vibrotactile output over audio. Most of the screen is devoted to a region where touch position is mapped to frequency along the x-axis and to amplitude along the y-axis. Sine, square, sawtooth, and triangle waveforms can be selected in a menu below this region. An attack-decay-sustain-release (ADSR) envelope can be modified by dragging the points of its visualization. Effects created with mHIVE can be recorded and played back later.
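The ADSR stage can be sketched as a simple piecewise function (parameter values here are illustrative, not mHIVE’s defaults):

```python
def adsr(t, a=0.05, d=0.1, s=0.6, r=0.2, note_len=0.5):
    """Attack-decay-sustain-release gain at time t (seconds)."""
    if t < a:
        return t / a                          # attack: 0 -> 1
    if t < a + d:
        return 1 - (1 - s) * (t - a) / d      # decay: 1 -> s
    if t < note_len:
        return s                              # sustain
    if t < note_len + r:
        return s * (1 - (t - note_len) / r)   # release: s -> 0
    return 0.0

envelope = [adsr(i / 1000.0) for i in range(800)]  # sampled at 1 kHz
```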
For more information, consult the 2014 Haptics Symposium paper and the GitHub repository.
miPhysics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2019 |
Platformⓘ The OS or software framework needed to run the tool. | Processing |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | HAID |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music, Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Haply 2DIY |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
miPhysics is a physical modelling framework built on Processing. Users specify a mass-interaction system in code and interact with it using a Haply 2DIY or another hAPI-compatible device. The avatar representing the end effector and the physical elements of the simulation are visualized in the sketch window, and sound synthesis is possible through Minim.
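A generic mass-interaction update in the spirit of this modelling style (a sketch, not the miPhysics API): a point mass tied to a fixed anchor by a spring-damper, stepped with explicit Euler integration, where the device end effector could inject an external force:

```python
def step(x, v, k=50.0, c=0.5, m=1.0, dt=1e-3, external_force=0.0):
    """Advance one mass-spring-damper element by one time step."""
    f = -k * x - c * v + external_force
    v += (f / m) * dt
    x += v * dt
    return x, v

x, v, trajectory = 0.1, 0.0, []
for _ in range(1000):
    x, v = step(x, v)
    trajectory.append(x)
```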
Example code is available on the miPhysics GitHub repository and more information on it is available in the HAID 2019 paper.
Multisensory 360° Video Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2018 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | WorldCIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation, Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Airflow system, Buttkicker LFE |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual, Olfactory |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The multisensory 360° video editor allows such video files to be loaded and augmented with additional sounds, as well as with haptic and olfactory content synchronized to specific times in the video. Low-frequency vibration and airflow from different directions around the viewer can be controlled from the application.
For more information about the editor, consult the 2018 World Conference on Information Systems and Technologies paper.
Neosensory SDK
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (Apache 2.0) |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Neosensory Buzz |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Neosensory SDK allows control of the four motors embedded in the Neosensory Buzz wristband. Vibrations can be triggered either by setting the desired amplitude of each motor directly, which persists until an update is sent, or by using another class to render a single point of vibration interpolated between the physical actuators.
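The interpolated mode can be pictured as splitting intensity between the two motors nearest a continuous position (an illustrative sketch, not the SDK’s classes or method names):

```python
def point_to_motors(position: float, intensity: float, n_motors: int = 4):
    """Render a virtual point at position 0..n_motors-1 on a motor array."""
    amps = [0.0] * n_motors
    low = min(int(position), n_motors - 2)
    frac = position - low
    amps[low] = intensity * (1.0 - frac)
    amps[low + 1] = intensity * frac
    return amps

print(point_to_motors(1.25, 0.8))  # most energy on motor 1, the rest on motor 2
```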
For more information, consult the Neosensory SDK documentation.
OM Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM MobileHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | OM Wearables |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The OM Editor is an app for controlling OM Wearables: forearm sleeves, each containing three ERM motors. Users can create sequences of up to six equal-duration vibration patterns, selecting for each the intensity of vibration and which actuators vibrate. These sequences can be annotated and stored.
For more information on the OM Editor or Wearables, please consult the MobileHCI’21 paper.
Pebble/Rebble
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2012 |
Platformⓘ The OS or software framework needed to run the tool. | Pebble |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Pebble |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Pebble smartwatches contain vibration motors that can be controlled through the C API, now maintained by the Rebble project. On-off vibration patterns can be programmed using the Vibes API, where the duration of each on or off interval is specified in milliseconds.
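Such a pattern is just a list of alternating on/off durations, beginning with an “on” segment. A sketch of how it could be modeled and previewed off-device (the list literal mirrors the millisecond array a Pebble app would pass to the API):

```python
PATTERN_MS = [200, 100, 400]  # on 200 ms, off 100 ms, on 400 ms

def is_vibrating(pattern, t_ms):
    """Return whether the motor is on at time t_ms into the pattern."""
    elapsed, on = 0, True
    for segment in pattern:
        if t_ms < elapsed + segment:
            return on
        elapsed += segment
        on = not on
    return False

assert is_vibrating(PATTERN_MS, 250) is False  # inside the 100 ms gap
```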
For more information, consult the Rebble project site.
Penn Haptic Texture Toolkit
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | HDAPI |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Non-Commercial Research Use Only |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
The Penn Haptic Texture Toolkit consists of texture and friction models recorded from 100 surfaces. The data used to create the models and sample code for displaying the textures on the Phantom Omni are included as part of the toolkit.
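The toolkit’s texture models are auto-regressive (AR): at render time, noise is filtered through AR coefficients selected according to the current tool force and speed to produce an output vibration. A toy AR synthesis loop with made-up coefficients (not values from the toolkit):

```python
import random

def synthesize_texture(coeffs=(1.2, -0.5, 0.1), variance=0.01, n=1000):
    """Drive an AR(3) filter with white noise to produce a vibration trace."""
    hist = [0.0] * len(coeffs)
    out = []
    for _ in range(n):
        excitation = random.gauss(0.0, variance ** 0.5)
        sample = excitation + sum(a * h for a, h in zip(coeffs, hist))
        hist = [sample] + hist[:-1]
        out.append(sample)
    return out

acceleration = synthesize_texture()
```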
For more information, consult the 2014 Haptics Symposium paper.
PhysVib
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Android |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ToH |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
PhysVib is an extension to the Android-based AndEngine physics engine. Certain objects in the engine are treated as being manipulated by the user (in the “haptic camera”), and collisions involving these objects generate vibrotactile feedback. This feedback is passed through audio output to an actuator attached to the Android device.
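The output stage can be pictured as each collision impulse spawning a decaying sinusoid whose amplitude scales with impact strength (a sketch in the spirit of PhysVib; its actual vibration model is detailed in the paper):

```python
import math

def collision_transient(impulse, freq=150.0, decay=30.0, sr=8000, dur=0.2):
    """Exponentially decaying sinusoid triggered by a collision impulse."""
    return [impulse * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
            for t in (i / sr for i in range(int(sr * dur)))]

waveform = collision_transient(impulse=0.9)
```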
For more information, consult the 2016 Transactions on Haptics paper and the GitHub repository.
posVibEditor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE HAVE, IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | ERM |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The posVibEditor supports the creation of vibration patterns across multiple ERM motors. Vibration assets can be created by manipulating keyframes in a visualization of vibration intensity over time. These assets, or provided templates, can be copied into a track interface to assign each to a motor and a playback time. A “perceptually transparent rendering” mode adjusts the mapping from asset amplitude values to output voltages so that the authored effect is felt as intended.
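One way to picture perceptually transparent rendering (a sketch; the editor’s actual calibration procedure is described in the papers) is inverting a Stevens-style power law so that equal steps in authored amplitude produce equal steps in perceived intensity:

```python
def amplitude_to_voltage(a, v_max=3.0, exponent=0.6):
    """Map authored magnitude a in 0..1 to a drive voltage; the exponent is
    an assumed psychophysical constant for the target ERM motor."""
    return v_max * (a ** (1.0 / exponent))

voltages = [amplitude_to_voltage(a / 10) for a in range(11)]
```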
For more information, consult the 2008 Workshop on Haptic Audio Visual Environments and Games paper and the 2009 WHC paper.
Printgets
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Pure Data, Raspberry Pi |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (LGPL 3) |
Venueⓘ The venue(s) for publications. | HAID |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Electroactive Polymer |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Printgets is a library for developing printed vibrotactile widgets in which piezoelectric actuators are triggered by capacitive input devices. Input and output devices are connected to a computer, such as a Raspberry Pi, and the parameters mapping inputs to outputs can be adjusted using Purr Data, a fork of Pure Data. This interface is meant to support iterative development of tactile widgets.
For more information on Printgets, please consult the HAID 2020 paper or the GitLab repository.
SeeingHaptics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2019 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM MobileHCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile, Temperature |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | None |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | N/A |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
SeeingHaptics allows users to attach haptic feedback of various types to virtual objects, with the interactions represented in the VR environment by visual icons. “Haptic listeners” can be attached to other devices, such as VR controllers, so that users experience the appropriate effect when a listener comes near an object with an associated haptic effect. No output devices are supported out of the box; SeeingHaptics is intended to aid in planning a VR haptic experience.
For more information, consult the MobileHCI’19 paper.
Skinscape
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2001 |
Platformⓘ The OS or software framework needed to run the tool. | Protools and Max/MSP |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | V1220 Transducer, Aura Systems Interactor Cushion |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Arm, Torso |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | AIFF |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Skinscape is meant to be a haptic equivalent of a MIDI sequencer. Keyboard inputs are recorded in Max/MSP and mapped to seven haptic actuators, six located on the arms and one on the lower back. The sequence can then be exported and loaded into a traditional audio editing environment (e.g., ProTools) so that it can be combined with existing music.
For more information, consult Eric Gunther’s master’s thesis.
Syntacts (Standalone)
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | IEEE ToH |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | WAV, AIFF, CSV, Syntacts Signal File |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Syntacts is an API and graphical tool for audio-controlled vibrotactile haptics. Vibrotactile “signals” can be created from common waveforms and combined using operations such as sequencing and multiplication. Complex envelopes can also be applied to these signals. The Syntacts GUI includes a spatializer for mapping signals to virtual locations on an array and a track-based sequencer to aid in performing these signal operations.
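The composition model can be sketched generically (this is not the Syntacts API; see the repository for its actual bindings): signals are functions of time that can be multiplied, for example by an envelope, or sequenced one after another:

```python
import math

def sine(f):
    return lambda t: math.sin(2 * math.pi * f * t)

def envelope(d):
    return lambda t: max(0.0, 1.0 - t / d)  # linear fade over d seconds

def multiply(a, b):
    return lambda t: a(t) * b(t)

def sequence(a, dur_a, b):
    return lambda t: a(t) if t < dur_a else b(t - dur_a)

# A 175 Hz burst with a 0.3 s fade, followed by a steady 250 Hz tone.
sig = sequence(multiply(sine(175), envelope(0.3)), 0.3, sine(250))
samples = [sig(i / 44100) for i in range(44100)]
```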
For more information, consult the 2020 Transactions on Haptics paper and the GitHub repository.
Synth-A-Modeler Designer
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2012 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, macOS, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 2) |
Venueⓘ The venue(s) for publications. | Linux Audio Conference, EuroHaptics |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Simulation, Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon, FireFader |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | No |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Dataflow |
Storageⓘ How data is stored for import/export or internally to the software. | Faust DSP File, SAM Model File |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The Synth-A-Modeler Designer supports physical modeling for sound synthesis and can use force-feedback devices as inputs. Models are constructed from mass-interaction and waveguide elements through graphical building blocks. A special “port” object provides one degree of freedom of input through a supported haptic device, allowing physical control over the model. When a model is complete, the Designer can export it to a Faust DSP file that can be compiled to run on various targets, such as the Web, mobile devices, and desktop computers.
For more information, consult the 2012 Linux Audio Conference paper, the 2016 EuroHaptics paper, and both the GitHub repository for the designer and the one for the compiler.
Tactile Brush
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2011 |
Platformⓘ The OS or software framework needed to run the tool. | Pure Data |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | C-2 Tactor |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Custom UDP |
Additional Information
The Tactile Brush is an algorithm for creating tactile animations on discrete vibrotactile arrays. Stationary or moving tactile objects can be placed within the limits of a tactile array; their motion paths and vibration intensities are used to calculate which actuators should be triggered to produce the intended effect, the intensity at which each should vibrate, and the onset time and duration of vibration for each. A version of the algorithm was implemented in a Pure Data application.
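For a phantom sensation between two physical actuators, the paper uses an energy-summation model: a virtual vibration of intensity Av at normalized position beta is produced with A1 = sqrt(1 - beta) * Av and A2 = sqrt(beta) * Av. A direct transcription:

```python
import math

def phantom(beta: float, av: float):
    """Physical intensities producing a virtual point at position beta (0..1)."""
    return math.sqrt(1.0 - beta) * av, math.sqrt(beta) * av

a1, a2 = phantom(beta=0.25, av=1.0)  # phantom sits closer to the first actuator
```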
For more information, consult the CHI’11 paper.
Tactile Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2008 |
Platformⓘ The OS or software framework needed to run the tool. | macOS |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | N/A |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Make Controller |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
Tactile Editor is an application that allows users to create vibration patterns for motors. “Motor objects” represent the basic unit of these patterns and include parameters for physical motor assignment, start time, duration, and intensity. These objects can be assigned to different tracks to allow different vibrations to be layered together. Patterns can be tested by playing them back on a connected device. Sensor values input to the Editor can be used to trigger the start of different patterns.
For more information, consult Markus Jonas’s master’s thesis.
Tactile Glove Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2010 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE MultiMedia |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Tactile Glove |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | MPEG-4 BIFS |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This authoring tool supports the design of video-synchronized tactile effects rendered on the actuators along the fingers and palm of the tactile glove. Frames of the video are shown in the GUI, and tactile lines can be drawn across them that are then mapped to the array on the glove. These tactile frames are greyscale, with pixel intensity mapped to vibration intensity. Previews of the surrounding video content aid in planning and synchronizing the effects with the existing audio-visual media.
For more information, consult the 2010 IEEE MultiMedia paper.
TactiPEd
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | INTERACT |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Generic VT |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
Users of TactiPEd first create a template file specifying the hardware device they are using and its layout of actuators. This layout is then shown in the main editor interface, where sequences of vibrotactile patterns can be created and assigned to different actuators. A playback mode allows users to quickly feel the changes made to amplitude, frequency, and timing in the tool.
For more information, consult the 2013 IFIP Conference on Human-Computer Interaction paper.
TactJam
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Electron |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT, CC-BY-4.0) |
Venueⓘ The venue(s) for publications. | ACM TEI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Collaboration, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TactJam Hardware |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Head, Arm, Hand, Torso, Leg, Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Internal |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
TactJam consists of a hardware component, a client application, and a server. The hardware includes eight ERM motors connected to a board, with a push button corresponding to each motor. In the client, users create effects by arranging dots representing the motors on a 3D model of a human body. With the hardware connected over USB, patterns can be recorded into the client as they are played on the device itself. When a pattern is ready, a user can upload it to the TactJam server so that others may download and reuse it.
For more information about TactJam, consult the TEI’21 abstract, the TEI’22 paper, and the main GitHub repository.
TactTongue
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2023 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Accessibility, Virtual Reality, Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Electrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TactTongue |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Tongue |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | DPC, Library, Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
TactTongue is an app and prototyping kit to explore electrotactile stimulation on the tongue. The parameters of the signal on each electrode can be directly controlled, or presets can be used as the basis for a design. A visualization of the resulting sensation on the tongue is shown in the application, and the pattern itself can be played on a connected TactTongue device.
For more information about TactTongue, consult the 2023 UIST paper and the GitHub repository.
TECHTILE Toolkit
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2012 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM VRIC, ACM CHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Education |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TECHTILE |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
The TECHTILE Toolkit consists of a vibrotactile recorder and actuator that allows users to record real-world vibrations and then play them back. This is meant to permit the sharing of effects by users without strong technical backgrounds, including elementary school children.
For more information, consult the VRIC’12 article and the TECHTILE Toolkit website. The TECHTILE Toolkit was also used in the CHI’21 Extended Abstract.
TorqueTuner
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | libmapper |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (MIT) |
Venueⓘ The venue(s) for publications. | NIME |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control, Music |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TorqueTuner |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Open Sound Control |
Additional Information
TorqueTuner is a 1-DoF haptic module that can be used as a standalone device or connected to a digital musical instrument such as the T-Stick. It contains a set of embedded effects that can be modified through inputs sent using OSC through libmapper. TorqueTuner also sends information about its own state back through OSC, which can be used as input to another process running elsewhere.
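Because the effects are driven over OSC, the control loop can be sketched with a generic OSC library. This is a hedged example: the addresses, ports, and parameter names below are hypothetical, not TorqueTuner's documented signal names (see the repository for those):

```python
# Hypothetical OSC control of an embedded effect; all addresses and
# ports are invented for illustration.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.4.1", 8000)       # device address (assumed)
client.send_message("/effect/select", "detents")    # pick an embedded effect
client.send_message("/effect/stiffness", 0.75)      # tune one parameter

# TorqueTuner also reports its state; here we listen for a knob angle
# and could forward it to another process.
dispatcher = Dispatcher()
dispatcher.map("/knob/angle", lambda addr, angle: print(addr, angle))
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```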
For more information on the TorqueTuner hardware or software environment, please consult the NIME’20 paper or the GitHub repository. The photograph of the TorqueTuner by M. Kirkegaard, M. Bredholt, C. Frisson, and M.M. Wanderley is licensed under CC BY 4.0.
TouchCon Editor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2009 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ICACT |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile, Temperature |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | TouchCon Device |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
The TouchCon Editor is meant for designing tactile effects for use in instant messaging. Effects are created by arranging individual sensations along a timeline for any of several supported output devices, each described to the system by an XML file. Effects can be sent to other TouchCon users who have the hardware needed to display them.
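The device-template mechanism can be pictured with a small, invented XML description parsed by the standard library; the real TouchCon schema is not public, so every element and attribute name here is hypothetical:

```python
# Hypothetical device template in the spirit of TouchCon's XML files.
import xml.etree.ElementTree as ET

TEMPLATE = """
<device name="TouchCon Device">
  <actuator id="0" type="vibrotactile"/>
  <actuator id="1" type="temperature"/>
</device>
"""

root = ET.fromstring(TEMPLATE)
for actuator in root.findall("actuator"):
    # An editor would use these entries to populate its timeline tracks.
    print(root.get("name"), actuator.get("id"), actuator.get("type"))
```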
For more information, consult the ICACT’09 paper.
Ubitile
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM NordiCHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ubitile |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Users of Ubitile wear a vibrotactile actuator and gyroscope on their index finger. Vibration patterns are created by moving this finger between three points: A, B, and C. The pitch angle between positions B and A controls intensity, the time spent travelling between positions A and B controls duration, and the time spent between positions B and C controls the gap between vibration units. Patterns are recorded and can be played back as desired.
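A minimal sketch of that demonstration mapping, with an invented 90-degree normalization for the pitch angle:

```python
# Assumed mapping: pitch angle -> intensity, A->B travel time -> duration,
# B->C travel time -> gap. Normalization constants are assumptions.
def vibration_unit(pitch_deg, t_ab_s, t_bc_s):
    intensity = max(0.0, min(1.0, pitch_deg / 90.0))  # clamp to 0..1
    return {"intensity": intensity,   # how strongly the unit vibrates
            "duration": t_ab_s,       # how long it vibrates (s)
            "gap": t_bc_s}            # silence before the next unit (s)

print(vibration_unit(pitch_deg=45.0, t_ab_s=0.6, t_bc_s=0.2))
```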
For more information, consult the 2016 Nordic Conference on Human-Computer Interaction paper.
VibEd
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | iConference |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Gaming, Prototyping, Accessibility |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Android, iPhone, Xbox |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
VibEd is designed to create vibrotactile feedback for games. Its waveform editing interface uses keyframes to control the duration and intensity of vibration patterns displayed on the selected output device. If a playback application is installed on the target platform, the editor can send authored patterns to it at runtime.
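The keyframe mechanism reduces to interpolating an intensity envelope between (time, intensity) pairs; a sketch with assumed keyframe values and update rate:

```python
# Linear interpolation between keyframes, sampled at a fixed update rate.
import numpy as np

keyframes = [(0.0, 0.0), (0.2, 1.0), (0.5, 0.4), (1.0, 0.0)]  # (s, 0..1)
times, levels = zip(*keyframes)

rate = 200                                  # haptic ticks per second (assumed)
t = np.arange(0.0, times[-1], 1.0 / rate)
envelope = np.interp(t, times, levels)      # one intensity value per tick
```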
For more information, consult the iConference 2016 paper.
vibrAteRial
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2020 |
Platformⓘ The OS or software framework needed to run the tool. | NodeJS |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (GPL 3) |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Hardware Control |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | bARefoot |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Foot |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
VibrAteRial is designed to create underfoot material effects in VR for the bARefoot shoe system. Each bARefoot shoe contains a pressure sensor and vibrotactile actuators. Using the authoring tool, designers create virtual materials out of vibration grains that trigger as the wearer steps down on the shoe. Designers control the distribution of these grains as a function of pressure, as well as the frequency and amplitude of each grain. Designs can be sent to a bARefoot so the material can be tested and refined.
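One way to picture the grain model is a table of pressure thresholds, each with its own frequency and amplitude, fired as the measured pressure rises past them; all values below are invented:

```python
# Invented grain table: (pressure threshold 0..1, frequency Hz, amplitude 0..1).
GRAINS = [(0.2, 80.0, 0.3), (0.5, 120.0, 0.6), (0.8, 200.0, 1.0)]

def grains_fired(prev_pressure, pressure):
    """Grains whose threshold was crossed as pressure rose this frame."""
    return [(freq, amp) for thresh, freq, amp in GRAINS
            if prev_pressure < thresh <= pressure]

print(grains_fired(0.1, 0.6))  # -> [(80.0, 0.3), (120.0, 0.6)]
```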
For more information on this or bARefoot, consult the UIST’20 paper and the Git repository.
VibScoreEditor
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2009 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Brüel & Kjaer Model 4810, Voice Coil, Vibration Motor |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Keyframe, Score |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | Device Template |
Additional Information
VibScoreEditor uses music notation as a metaphor for vibrotactile editing. In this system, a “vibrotactile clef” determines the frequency and waveform assigned to each “pitch” occupying a position on the staff. A note’s position therefore sets its pitch, a number inside the note head sets its intensity, and the note’s shape sets its duration. Two dynamics, crescendo and decrescendo, gradually increase or decrease note intensity. By switching between clefs and arranging a series of vibrotactile notes, rests, and dynamic markings, users can create complex time-varying vibration patterns.
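A sketch of the clef idea, with invented frequencies and waveforms standing in for a real clef definition:

```python
# A "vibrotactile clef" as a staff-position lookup; all values invented.
CLEF_A = {0: (60.0, "sine"), 1: (120.0, "sine"), 2: (250.0, "square")}

def render_note(clef, position, intensity, duration_s):
    freq, waveform = clef[position]        # pitch comes from staff position
    return {"freq": freq, "waveform": waveform,
            "intensity": intensity,        # the number inside the note head
            "duration": duration_s}        # set by the note's shape

score = [render_note(CLEF_A, 0, 0.5, 0.25), render_note(CLEF_A, 2, 1.0, 0.5)]
```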
For more information, consult the 2009 World Haptics Conference paper.
VibViz
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2015 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (BSD 3-Clause) |
Venueⓘ The venue(s) for publications. | IEEE WHC |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Library, Description |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | WAV |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
VibViz is a library of vibrotactile effects paired with filtering and visualization tools for finding elements within it. Each effect carries tags for emotions (“angry”), metaphors (“heartbeat”), and possible uses (“alarm”) that can be used to filter the library. Two chart visualizations lay out the effects by duration and signal RMS, and by pleasantness and urgency. Additional filters select effects with a specific tempo, rhythm structure, or roughness. A complete list of the library’s effects shows each one’s metaphor and usage tags alongside a visualization of intensity over time. Selected effects are played as audio output, allowing them to be displayed on a connected actuator.
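The tag filter and the duration/RMS chart axis are straightforward to sketch over an in-memory library; the entries below are made up, not VibViz's dataset:

```python
# Tag filtering plus the RMS statistic used for one of the chart axes.
import math

LIBRARY = [
    {"name": "pulse", "tags": {"heartbeat", "calm"}, "samples": [0.0, 0.5, 0.0]},
    {"name": "buzz",  "tags": {"alarm", "angry"},    "samples": [1.0, -1.0, 1.0]},
]

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

for effect in (e for e in LIBRARY if "alarm" in e["tags"]):
    print(effect["name"], round(rms(effect["samples"]), 3))  # buzz 1.0
```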
For more information on VibViz, consult the 2015 World Haptics Conference paper, the VibViz website, and the GitHub repository.
VITAKI
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Windows, Linux |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | Journal of HCI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Gaming |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | VITAKI Controller |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
VITAKI supports controlling ERM motors in various configurations. A photo of the output device can be loaded into VITAKI and annotated with the location of each actuator. Waveforms, either preset or customized through a keyframe-based editor, are assigned by placing them in the track corresponding to an actuator. An additional mode changes the mapping between waveform values and the actual output voltages sent to the device.
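That remapping mode amounts to a transfer curve from normalized waveform values to drive voltages, useful for example to skip an ERM motor's dead zone; the curve points below are assumptions:

```python
# Piecewise-linear value-to-voltage curve; points are invented.
import numpy as np

CURVE_IN = [0.0, 0.1, 0.5, 1.0]       # normalized waveform values
CURVE_OUT = [0.0, 1.2, 2.0, 3.0]      # drive voltage (assumed motor range)

def to_voltage(value):
    return float(np.interp(value, CURVE_IN, CURVE_OUT))

print(to_voltage(0.05))  # 0.6 V: small values jump quickly past the dead zone
```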
For more information, consult the 2014 Journal of Human-Computer Interaction article.
ViviTouch
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2014 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Class |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Voice Coil |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process, Sequencing, Library |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track, Keyframe |
Storageⓘ How data is stored for import/export or internally to the software. | MKV, Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
ViviTouch is meant to support prototyping of vibrotactile haptics aligned with audio-visual content. Haptic media is created with waveforms and filters that map the audio content at each moment in time to the vibrotactile channel. These filters, such as a low-pass filter, are meant to aid in synchronizing audio and haptic content. Effects and filters are assigned to different output channels, each representing an actuator, and to different haptic tracks. Using multiple tracks allows effects and filters to be layered on the same actuator at the same time.
For more information, consult the 2014 IEEE Haptics Symposium paper.
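A filter of this kind can be sketched as a one-pole low-pass applied to the audio track, keeping only the low frequencies that map well to an actuator; the cutoff frequency is an assumption:

```python
# One-pole IIR low-pass from audio samples to a haptic envelope.
import math

def low_pass(samples, rate, cutoff_hz=100.0):
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / rate)
    out, y = [], 0.0
    for s in samples:
        y += alpha * (s - y)   # smooth toward each incoming sample
        out.append(y)
    return out

rate = 44100
audio = [math.sin(2 * math.pi * 440 * n / rate) for n in range(rate)]
haptic_track = low_pass(audio, rate)   # 440 Hz sits well above the cutoff
```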
Voodle
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2017 |
Platformⓘ The OS or software framework needed to run the tool. | NodeJS |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM DIS |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping, Communication |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | CuddleBit |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
While Voodle is primarily meant to control 1-DoF robots called “CuddleBits”, it can also be used for haptic prototyping. The frequency and amplitude of the user’s voice are used to drive the output of the system. Each parameter is normalized and combined in a weighted average, with the bias set by the user. The user can also add random noise to the system and scale and smooth the resulting output. The mapping from voice input to motor output occurs in real time.
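A hedged reconstruction of that pipeline (the constants and class shape are invented; consult the paper and repository for the real mapping):

```python
# Bias-weighted blend of normalized pitch and loudness, plus noise,
# scaling, and exponential smoothing. All constants are invented.
import random

class VoiceToMotor:
    def __init__(self, bias=0.5, noise=0.05, scale=1.0, smoothing=0.8):
        self.bias, self.noise = bias, noise
        self.scale, self.smoothing = scale, smoothing
        self._out = 0.0

    def step(self, pitch_norm, amp_norm):
        blended = self.bias * pitch_norm + (1.0 - self.bias) * amp_norm
        blended += random.uniform(-self.noise, self.noise)
        target = max(0.0, min(1.0, blended * self.scale))
        # ease toward the target so the motor output stays smooth
        self._out += (1.0 - self.smoothing) * (target - self._out)
        return self._out

mapper = VoiceToMotor(bias=0.7)
print(mapper.step(pitch_norm=0.4, amp_norm=0.9))
```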
For more information, consult the DIS’17 paper and the GitHub repository.
VRML Plugin
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2003 |
Platformⓘ The OS or software framework needed to run the tool. | Windows |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE Haptics Symposium |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Phantom |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | N/A |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| N/A |
Storageⓘ How data is stored for import/export or internally to the software. | VRML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This system is a Virtual Reality Modeling Language (VRML) plugin that adds force-feedback effects to a subset of VRML. This means that existing VRML scenes can automatically have haptic effects added to them. The scene can be explored using a PHANTOM device in the Netscape browser.
For more information about the VRML plugin, consult the 2003 Haptics Symposium paper.
Web-based MPEG-V Tool
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2013 |
Platformⓘ The OS or software framework needed to run the tool. | Web |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | IEEE ISM, IEEE HAVE |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Force Feedback |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Novint Falcon, omega.x, delta.x, sigma.x, Geomagic Touch, Moog HapticMaster |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | N/A |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Generic Menu |
Storageⓘ How data is stored for import/export or internally to the software. | MPEG-V, Collada |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | API |
Additional Information
Dong et al.’s authoring tool allows friction, spring, impulse, and shape forces to be added to 3D objects loaded into the editing environment. The resulting elements are exported using a proposed extension of MPEG-V that supports haptics.
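As a worked example of one of those force types, a spring force pulls the haptic cursor back toward an anchor point following Hooke's law, F = -k(x - x0); the gain below is an assumption:

```python
# Hooke's-law spring force toward an anchor point; k is invented.
def spring_force(pos, anchor, k=50.0):
    return tuple(-k * (p - a) for p, a in zip(pos, anchor))

print(spring_force(pos=(0.1, 0.0, 0.02), anchor=(0.0, 0.0, 0.0)))
# -> (-5.0, -0.0, -1.0): pulls the cursor back toward the origin
```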
For more information, consult the 2013 IEEE International Symposium on Haptic Audio Visual Environments and Games paper and the 2015 IEEE International Symposium on Multimedia paper.
Weirding Haptics
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2021 |
Platformⓘ The OS or software framework needed to run the tool. | Unity |
Availabilityⓘ If the tool can be obtained by the public. | Available |
Licenseⓘ The type of license applied to the tool. | Open Source (AGPL 3) |
Venueⓘ The venue(s) for publications. | ACM UIST |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Virtual Reality, Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Consumer |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Oculus Touch |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Target-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC, Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | None |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Weirding Haptics is a design tool for and within virtual reality: users vocalize the feel of a virtual object, and the recording is mapped to a vibration pattern felt when the object is touched. These patterns can have parameters such as maximum amplitude modulated by the position or velocity of contact. Several patterns can be layered over each other to create effects impossible to accomplish with a single recording.
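A minimal sketch of the contact-driven modulation, assuming a simple linear law (the tool's actual mapping options are richer than this):

```python
# Scale a recorded vocalization envelope by contact speed; the linear
# law and max_speed constant are assumptions.
def modulated_envelope(envelope, speed, max_speed=2.0):
    gain = min(1.0, speed / max_speed)     # faster contact -> stronger
    return [gain * level for level in envelope]

envelope = [0.2, 0.8, 1.0, 0.5]            # from a recorded vocalization
print(modulated_envelope(envelope, speed=1.0))  # half-strength playback
```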
For more information on Weirding Haptics, please consult the UIST’21 paper or the GitHub repository.
YouTube Haptic Authoring
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2010 |
Platformⓘ The OS or software framework needed to run the tool. | Java |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM MM |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Haptic Augmentation |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool.
| Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Custom Jacket, Custom Arm Band |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Torso, Arm |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device.
| Location-aware |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Visual, Audio |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect.
| DPC |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool.
| Track |
Storageⓘ How data is stored for import/export or internally to the software. | Custom XML |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
This authoring tool lets users annotate YouTube videos with time-synchronized vibrotactile content. The resulting augmented file can be played in a custom Java browser that supports the specified haptic devices. In the editing environment, the timing and actuation of a vibrotactile array are set with the source YouTube video visible as a reference.
For more information, consult the 2010 ACM Multimedia Conference paper.
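The annotation data amounts to timed actuator events keyed to video time; a hypothetical XML track in that spirit (the tool's real custom schema is not public, so every name here is invented):

```python
# Invented XML event track; all element/attribute names are hypothetical.
import xml.etree.ElementTree as ET

track = ET.Element("hapticTrack", video="example")
ET.SubElement(track, "event", start="1.50", duration="0.30",
              actuator="3", intensity="0.8")
ET.SubElement(track, "event", start="2.10", duration="0.15",
              actuator="7", intensity="0.4")
print(ET.tostring(track, encoding="unicode"))
```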