|General Purpose Information|
| Year of First Release |
The year a tool was first publicly released or discussed in an academic paper.
| Platform |
The OS or software framework needed to run the tool.
| Availability |
If the tool can be obtained by the public.
| License |
The type of license applied to the tool.
|Hardware Control Information|
| Haptic Category |
The general types of haptic output devices controlled by the tool.
| Hardware Abstraction |
How broad the type of hardware support is for a tool.
|Consumer (C-2 Tactors)|
|Interaction and Interface Information|
| Driving Feature |
If haptic content is controlled over time, by other actions, or both.
| Effect Localization |
How the desired location of stimuli is mapped to the device.
| Media Support |
Support for non-haptic media in the workspace, even if just to aid in manual synchronization.
| Iterative Playback |
If haptic effects can be played back from the tool to aid in the design process.
| Design Approaches |
Broadly, the methods available to create a desired effect.
|DPC, Process, Sequencing|
| Interaction Metaphors |
Common UI metaphors that define how a user interacts with a tool.
|Track, Keyframe, Demonstration|
Mango is a graphical tool for creating effects on vibrotactile arrays. The editor displays a visualization of the array's actuator layout. Users create “animation objects” with adjustable positions and intensities, and draw paths that define each object's motion over time; parameters can also be animated using keyframes. A rendering algorithm transforms these tactile animations into actuator signals so that the user perceives the authored vibrotactile experience.
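As an illustration of what such a rendering step might look like, the sketch below maps the position and intensity of a single animation object onto per-actuator amplitudes. This is a hypothetical minimal example, not Mango's actual algorithm: it assumes inverse-distance weighting over the k nearest actuators and an energy-summation model (the squared amplitudes sum to the squared target intensity), a common approach for phantom tactile sensations.

```python
import math

def render_point(actuators, pos, intensity, k=3):
    """Map a virtual animation-object position to per-actuator amplitudes.

    Hypothetical sketch: the k nearest actuators share the stimulus,
    weighted by inverse distance, with amplitudes scaled so that the
    sum of squared amplitudes equals intensity**2 (energy model).

    actuators: list of (x, y) positions of the physical actuators
    pos:       (x, y) position of the virtual animation object
    intensity: desired perceived intensity at `pos`
    """
    # Distance from the virtual point to every physical actuator.
    dists = sorted((math.dist(pos, a), i) for i, a in enumerate(actuators))
    nearest = dists[:k]

    # Inverse-distance weights; an actuator at the exact point dominates.
    eps = 1e-9
    inv = [1.0 / (d + eps) for d, _ in nearest]
    total = sum(inv)
    weights = [w / total for w in inv]

    # Energy-summation model: amplitude_i = intensity * sqrt(weight_i),
    # so sum(amplitude_i ** 2) == intensity ** 2.
    amps = [0.0] * len(actuators)
    for (_, i), w in zip(nearest, weights):
        amps[i] = intensity * math.sqrt(w)
    return amps
```

Sampling an animation object's path at the device's update rate and calling a function like this at each step would yield the time-varying actuator signals; the real renderer would also need to handle overlapping objects and per-actuator dynamic range.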
For more information, consult the UIST’15 paper.