Ubitile
Tool Summary
General Purpose Information | |
---|---|
Year of First Releaseⓘ The year a tool was first publicly released or discussed in an academic paper. | 2016 |
Platformⓘ The OS or software framework needed to run the tool. | Unknown |
Availabilityⓘ If the tool can be obtained by the public. | Unavailable |
Licenseⓘ The type of license applied to the tool. | Unknown |
Venueⓘ The venue(s) for publications. | ACM NordiCHI |
Intended Use Caseⓘ The primary purposes for which the tool was developed. | Prototyping |
Hardware Control Information | |
---|---|
Haptic Categoryⓘ The general types of haptic output devices controlled by the tool. | Vibrotactile |
Hardware Abstractionⓘ How broad the type of hardware support is for a tool. | Bespoke |
Device Namesⓘ The hardware supported by the tool. This may be incomplete. | Ubitile |
Body Positionⓘ Parts of the body where stimuli are felt, if the tool explicitly shows this. | Hand |
Interaction and Interface Information | |
---|---|
Driving Featureⓘ If haptic content is controlled over time, by other actions, or both. | Time |
Effect Localizationⓘ How the desired location of stimuli is mapped to the device. | Device-centric |
Media Supportⓘ Support for non-haptic media in the workspace, even if just to aid in manual synchronization. | None |
Iterative Playbackⓘ If haptic effects can be played back from the tool to aid in the design process. | Yes |
Design Approachesⓘ Broadly, the methods available to create a desired effect. | Process |
Interaction Metaphorsⓘ Common UI metaphors that define how a user interacts with a tool. | Demonstration |
Storageⓘ How data is stored for import/export or internally to the software. | Unknown |
Connectivityⓘ How the tool can be extended to support new data, devices, and software. | None |
Additional Information
Users of Ubitile wear a vibrotactile actuator and a gyroscope on their index finger. Vibration patterns are created by moving this finger between three positions: A, B, and C. The pitch angle between positions B and A controls intensity, the time spent travelling from A to B controls duration, and the time spent travelling from B to C controls the gap between vibration units. Patterns are recorded and can be played back as desired.
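The mapping above can be sketched in code. This is a minimal illustration, not the authors' implementation: the function names, the normalization by a maximum pitch angle, and the clamping range are all assumptions made for the example.

```python
# Illustrative sketch of Ubitile's gesture-to-vibration mapping.
# Scaling and clamping choices are assumptions, not from the paper.

def vibration_unit(pitch_deg, travel_ab_s, travel_bc_s, max_pitch_deg=90.0):
    """Map one A->B->C finger movement to (intensity, duration, gap).

    pitch_deg: pitch angle between positions B and A (controls intensity)
    travel_ab_s: seconds spent moving from A to B (controls duration)
    travel_bc_s: seconds spent moving from B to C (controls gap time)
    """
    # Normalize pitch to a 0.0-1.0 intensity (assumed scaling).
    intensity = max(0.0, min(1.0, pitch_deg / max_pitch_deg))
    duration = travel_ab_s   # vibration duration in seconds
    gap = travel_bc_s        # pause before the next unit, in seconds
    return (intensity, duration, gap)

def record_pattern(movements):
    """Build a playable pattern from a sequence of recorded movements."""
    return [vibration_unit(*m) for m in movements]
```

For example, a movement with a 45° pitch, 0.5 s of A-to-B travel, and 0.2 s of B-to-C travel would yield a unit at half intensity lasting 0.5 s, followed by a 0.2 s gap.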
For more information, consult the 2016 Nordic Conference on Human-Computer Interaction paper.