Summary
Thumb-based interaction is central to usability in virtual and augmented environments, yet accurately sensing the force a thumb applies remains a challenge. This project addresses that gap with a vision-based system that estimates both thumb position and applied force directly from egocentric video, without requiring a separate hand-localization step. By recognizing subtle thumb gestures from this first-person perspective, the system aims to improve interaction accuracy and enable more intuitive user experiences in immersive environments.