HapticGen

[Image: screenshot of HapticGen for identification]

Tool Summary

Metadata
Release Year: 2025
Platform: Python
Availability: Available
License: Open Source (MIT)
Venue: ACM CHI
Intended Use Case: Prototyping
Hardware Information
Category: Vibrotactile
Abstraction: Consumer
Device Names: Meta Quest
Device Template: No
Body Position: N/A
Interaction Information
Driving Feature: Time
Effect Localization: Device-centric
Non-Haptic Media: None
Iterative Playback: Yes
Design Approaches: Procedural, Description
UI Metaphors: None
Storage: WAV
Connectivity: None

Additional Information

HapticGen generates haptic waveforms from user-specified text prompts using a modified version of Meta’s AudioCraft project. The underlying model was fine-tuned on a haptic dataset derived automatically from an existing text-to-audio dataset and then curated by expert participants. In the design interface, users can specify the duration of a desired effect; all other aspects of the effect are left entirely to the model.
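Since HapticGen stores generated effects as WAV files, a minimal sketch of that storage step may help illustrate the pipeline's output format. This is not HapticGen's actual code: the `synth_placeholder` signal (an amplitude-modulated 200 Hz sine) is a hypothetical stand-in for the model's output, and the 8 kHz sample rate is an assumption, since the appropriate rate depends on the target actuator.

```python
# Hypothetical sketch: writing a vibrotactile waveform to a mono 16-bit
# PCM WAV file, the storage format HapticGen uses for generated effects.
import math
import struct
import wave

SAMPLE_RATE = 8000   # assumption: real rate depends on the actuator
DURATION_S = 1.0     # user-specified effect duration


def synth_placeholder(duration_s, rate=SAMPLE_RATE):
    """Stand-in for model output: a 200 Hz vibration with a linear fade."""
    n = int(duration_s * rate)
    samples = []
    for i in range(n):
        t = i / rate
        envelope = 1.0 - t / duration_s  # fade out over the effect
        samples.append(envelope * math.sin(2 * math.pi * 200 * t))
    return samples


def save_wav(path, samples, rate=SAMPLE_RATE):
    """Write float samples in [-1, 1] as mono 16-bit PCM."""
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(2)       # 16-bit
        wf.setframerate(rate)
        frames = b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples
        )
        wf.writeframes(frames)


samples = synth_placeholder(DURATION_S)
save_wav("effect.wav", samples)
```

A playback tool for vibrotactile hardware can then read the file back with the same `wave` module and stream the samples to the actuator.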

For more information, consult the CHI’25 paper and the GitHub repository.