Feellustrator
Tool Summary
| Metadata | |
|---|---|
| Release Yearⓘ The year a tool was first publicly released or discussed in an academic paper. | 2023 |
| Platformⓘ The OS or software framework needed to run the tool. | Unknown |
| Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
| Licenseⓘ The type of license applied to the tool. | Unknown |
| Venueⓘ The venue(s) for publications. | ACM CHI |
| Intended Use Caseⓘ The primary purposes for which the tool was developed. | Hardware Control, Collaboration |
| Hardware Information | |
|---|---|
| Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
| Abstractionⓘ How broad the type of hardware support is for a tool. | Consumer |
| Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ultraleap STRATOS Explore |
| Device Templateⓘ Whether support can be easily extended to new types of devices. | No |
| Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
| Interaction Information | |
|---|---|
| Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time, Action |
| Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Target-centric |
| Non-Haptic Mediaⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | Audio, Visual |
| Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
| Design Approachesⓘ Broadly, the methods available to create a desired effect. | Direct, Procedural, Library, Additive |
| UI Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Track, Keyframe, Demonstration |
| Storageⓘ How data is stored for import/export or internally to the software. | Custom JSON, CSV |
| Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Feellustrator is a design tool for ultrasound mid-air haptics that lets users sketch paths for sensations to follow, control how the focal point moves along each path, and combine paths over time to create more complex experiences. Audio and visual reference materials can be loaded into the tool, but cannot be modified once added. Hand tracking support is included so that effects can be played relative to the user's hand rather than floating freely in space.
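The core idea of moving a focal point along a sketched path is common to ultrasound mid-air haptics: the path is resampled into focal-point positions at the device's update rate. A minimal, hypothetical Python sketch of this resampling (not Feellustrator's actual API; all names, parameters, and values here are illustrative assumptions):

```python
import math

def sample_circle_path(radius_m=0.02, draw_hz=100.0, update_hz=16000.0):
    """Sample focal-point positions tracing a circular path above the array.

    radius_m:  circle radius in metres (illustrative value)
    draw_hz:   how many times per second the focal point completes the path
    update_hz: assumed device focal-point update rate

    Returns a list of (x, y, z) positions in metres, one per device update.
    """
    n = int(update_hz / draw_hz)  # samples per traversal of the path
    return [
        (radius_m * math.cos(2 * math.pi * i / n),
         radius_m * math.sin(2 * math.pi * i / n),
         0.2)  # fixed height of 20 cm above the array (illustrative)
        for i in range(n)
    ]

points = sample_circle_path()
```

Varying `draw_hz` changes how quickly the focal point traverses the path, which is one way tools like this control the perceived character of a sensation along a sketched path.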
For more information, please consult the CHI’23 paper.