HapticTrace is a desktop tool for recording and analyzing haptic-response events. It combines sensor data from phyphox, iPhone/iPad screen capture, and post-analysis tools. After recording, you can correlate the video with the measured signal, adjust the offset, and save the result for repeated analysis.
Platform: macOS only
Features:

- captures sensor data via `phyphox`
- provides live preview and records the iPhone/iPad screen
- supports recording modes: Sensors + Video, Sensors Only, Video Only
- displays the signal and spectrogram
- plays back recorded video on a shared timeline
- supports automatic and manual offset adjustment
- saves and loads sessions as `.zip`
- compares multiple sessions
Requirements:

- macOS
- Python 3
- a device running `phyphox`, reachable over the network
- an iPhone or iPad available to macOS as a screen capture source
- internet access on the first launch from source
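Before launching, it can help to confirm that the `phyphox` device really is reachable over the network. A minimal stdlib check (the function name and timeout are illustrative, not part of HapticTrace):

```python
import urllib.error
import urllib.request

def phyphox_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if the phyphox remote-access URL answers an HTTP request."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```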
The primary mode is synchronized recording of sensor data and on-screen context, followed by analysis.

A typical setup uses two mobile devices:

- Device 1 (phone) — sensor module with `phyphox`. Used to capture `accelerometer` and `gyroscope` data.
- Device 2 (phone) — target iPhone/iPad. Runs the application under test, plays haptics, and provides screen capture.
- Mac — host running HapticTrace. Connects to the sensor device over the network and to the iPhone/iPad over USB through the macOS screen capture stack.
- Open `phyphox` on the sensor device.
- Create or load an experiment with `accelerometer` and `gyroscope` enabled.
- Enable remote access in `phyphox` and obtain the experiment URL.
- Connect the iPhone/iPad to the Mac over USB as a screen capture source.
- Rigidly fix both devices relative to each other (for example, in a stacked "sandwich" setup):
  - iPhone/iPad — screen facing up
  - sensor device — screen facing down

During recording, the devices must not shift relative to each other. Any parasitic motion directly degrades measurement quality.
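One way to sanity-check the mounting is to record a few seconds of a quiet baseline and look at how far the acceleration magnitude deviates from 1 g: a large RMS deviation suggests the "sandwich" is not rigid. A stdlib-only sketch (the function name is illustrative, not part of HapticTrace):

```python
import math

def baseline_rms(samples: list[tuple[float, float, float]]) -> float:
    """RMS deviation of the acceleration magnitude from 1 g.

    samples: (ax, ay, az) accelerometer readings in m/s^2,
    taken while both devices are at rest.
    """
    g = 9.81
    devs = [math.sqrt(ax**2 + ay**2 + az**2) - g for ax, ay, az in samples]
    return math.sqrt(sum(d * d for d in devs) / len(devs))
```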
- Launch HapticTrace on the Mac.
- Connect the sensor device using the `phyphox` URL.
- Connect the iPhone/iPad as the video source.
- Start recording (`Play` button in the main window).
- Trigger haptics in the application under test (on the iPhone/iPad).
- Stop recording.
- Analyze the signal and align it with on-screen activity on the shared timeline.
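Automatic offset adjustment between two timelines is typically based on cross-correlation: slide one signal against the other and pick the lag with the strongest overlap. The brute-force sketch below illustrates the idea on plain Python lists; it is not HapticTrace's actual implementation.

```python
def estimate_offset(ref: list[float], sig: list[float]) -> int:
    """Return the integer lag (in samples) that best aligns sig to ref,
    found by brute-force cross-correlation over all overlapping lags."""
    best_lag, best_score = 0, float("-inf")
    n = len(ref)
    for lag in range(-len(sig) + 1, n):
        # Dot product of ref with sig shifted right by `lag` samples.
        score = sum(ref[i] * sig[i - lag]
                    for i in range(max(0, lag), min(n, lag + len(sig))))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```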
In addition to the primary synchronized workflow, HapticTrace also supports standalone recording modes:
- Sensors Only — capture sensor data without video
- Video Only — record screen output without sensors
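In Sensors Only mode, the data comes straight from phyphox's remote-access HTTP interface. The sketch below polls one output buffer; the buffer name (e.g. `accX`) depends on the experiment configuration, and the response shape assumed here (`{"buffer": {<name>: {"buffer": [...]}}}`) should be verified against your phyphox version.

```python
import json
import urllib.request

def parse_buffer(payload: dict, name: str) -> list[float]:
    """Extract one buffer's samples from an assumed /get response payload."""
    return payload["buffer"][name]["buffer"]

def fetch_buffer(base_url: str, name: str) -> list[float]:
    """Poll the full contents of one phyphox output buffer, e.g. "accX"."""
    with urllib.request.urlopen(f"{base_url}/get?{name}=full") as resp:
        return parse_buffer(json.load(resp), name)
```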
Run from source:

```sh
./run_app.sh --url http://<device_local_ip>:8080
```

On first launch, the script creates `.venv`, updates pip, and installs runtime dependencies automatically. Alternatively, launch via `run_app.command`.

Release build: open HapticTrace.app from the prepared release build.
- Project overview — start here
- Usage — continue with usage and workflow
- Development — local setup, dependencies, tests, and release builds
- Troubleshooting — common operational issues
- Third-Party Notices — third-party package notices
License: Apache-2.0