First Touch
Published:
The null backend served its purpose. For weeks, FeelIT 2.0 ran entirely on visual feedback — keyboard-driven pointer emulation, no force feedback, no physical sensation. Everything worked, but it was like building a musical instrument and never hearing it play.
Today we ran the first bounded native haptic pilot. Real force feedback through a physical device. The stylus pushes back when it touches a virtual surface. The material profiles — stiffness, damping, friction — are no longer just numbers in a JSON file. They’re physical sensations.
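To make "numbers in a JSON file" concrete: a common way to turn a stiffness/damping profile into a contact force is a spring-damper model. This is a minimal sketch of that idea; the field names and function are illustrative assumptions, not FeelIT 2.0's actual schema or code.

```python
import json

# Hypothetical material profile mirroring the stiffness/damping/friction
# fields mentioned above (names are illustrative, not FeelIT's schema).
profile = json.loads('{"stiffness": 800.0, "damping": 2.5, "friction": 0.3}')

def contact_force(penetration_m, velocity_m_s, p):
    """Classic spring-damper contact model: F = k*x - b*v.

    penetration_m: how far the stylus tip is inside the surface (>= 0)
    velocity_m_s:  tip velocity along the surface normal
    Returns the restoring force in newtons, pushing the tip back out.
    """
    if penetration_m <= 0.0:
        return 0.0  # no contact, no force
    return p["stiffness"] * penetration_m - p["damping"] * velocity_m_s

# 2 mm penetration, tip still moving inward at 1 cm/s
print(round(contact_force(0.002, -0.01, profile), 3))  # 1.625
```

A stiffer profile pushes back harder for the same penetration; the damping term resists motion, which is what makes a surface feel solid rather than springy.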
NullBackend → NativeBackend: same API, same workspaces, same tests
Only the force computation changed. Everything else — Braille cells, material profiles, scene controls — worked unchanged.
The pilot is bounded: limited workspace, limited force range, careful safety constraints. But it works. The day-one decision to make the haptic backend a pluggable abstraction meant that integrating real hardware required zero changes to the four workspaces, the Braille engine, or the material profiles; only the backend implementation changed.
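The backend swap described above can be sketched as a small interface that both implementations satisfy. All names here are assumptions for illustration, not FeelIT 2.0's actual API; the bounded pilot is modeled as a clamp on the commanded force.

```python
from abc import ABC, abstractmethod

class HapticBackend(ABC):
    """The one seam the rest of the system talks to."""
    @abstractmethod
    def render_force(self, fx: float, fy: float, fz: float) -> None:
        """Send a force vector (newtons) to the output device."""

class NullBackend(HapticBackend):
    """Visual-only mode: forces are computed but never physically rendered."""
    def render_force(self, fx, fy, fz):
        pass  # no hardware; callers are none the wiser

class BoundedNativeBackend(HapticBackend):
    """Pilot mode: real device, with the force range clamped for safety."""
    def __init__(self, max_force_n=3.0):
        self.max_force_n = max_force_n
        self.last_command = None

    def render_force(self, fx, fy, fz):
        clamp = lambda v: max(-self.max_force_n, min(self.max_force_n, v))
        self.last_command = (clamp(fx), clamp(fy), clamp(fz))
        # a real driver call would push self.last_command to the device here

# Swapping backends touches no workspace, Braille, or material-profile code.
backend: HapticBackend = BoundedNativeBackend(max_force_n=3.0)
backend.render_force(0.0, 0.0, -5.0)
print(backend.last_command)  # (0.0, 0.0, -3.0)
```

The clamp is the "bounded" part of the pilot: whatever the force computation produces, the device never sees a command outside the safety envelope.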
Version jumped to 2.18.000. The configuration review flow lets you inspect and adjust haptic parameters before committing to a session, which matters when those parameters translate directly into forces the user physically feels.
From v0.1.0 (Braille preview only) to v2.18.000 (native haptic execution) in one month. The project that was frozen for 14 years is moving fast. FeelIT 2.0 on GitHub.
