updated README
parent 0fb6efadd5
commit b8b8930649
1 changed file with 7 additions and 1 deletion

@@ -1,5 +1,6 @@
# KnuckleTouch
This repository contains the data set and scripts for the MuC '19 paper on "KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning". [YouTube](https://www.youtube.com/watch?v=akL3Ejx3bv8)
## Abstract
While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input, but recognizing knuckle gestures robustly and accurately remains challenging. We present a method to differentiate between 17 finger and knuckle gestures based on a long short-term memory (LSTM) machine learning model. Furthermore, we introduce an open-source approach that is ready to deploy on commodity touch-based devices. The model was trained on a new dataset that we collected in a mobile interaction study with 18 participants. We show that our method can achieve an accuracy of 86.8% on recognizing one of the 17 gestures and an accuracy of 94.6% in differentiating between finger and knuckle. In our evaluation study, we validated our models and found that LSTM gesture recognition achieved an accuracy of 88.6%. We show that KnuckleTouch can be used to improve input expressiveness and to provide shortcuts to frequently used functions.
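
To make the approach more concrete, here is a minimal sketch of the kind of LSTM sequence classifier the abstract describes, operating on padded sequences of per-frame touch features. The sequence length, per-frame feature count, layer sizes, and training setup are illustrative assumptions, not the configuration used in the paper or in this repository.

```python
# Minimal sketch of an LSTM classifier for touch gesture sequences.
# All shapes and hyperparameters are illustrative assumptions.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 17    # gesture classes reported in the paper
MAX_FRAMES = 50     # assumed maximum length of a padded touch sequence
NUM_FEATURES = 8    # assumed per-frame features (e.g., position, size, capacitance stats)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(MAX_FRAMES, NUM_FEATURES)),
    # Ignore zero-padded timesteps of variable-length touch sequences
    tf.keras.layers.Masking(mask_value=0.0),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random placeholder data purely to illustrate the expected tensor shapes
X_train = np.random.rand(256, MAX_FRAMES, NUM_FEATURES).astype(np.float32)
y_train = np.random.randint(0, NUM_CLASSES, size=256)
model.fit(X_train, y_train, epochs=5, batch_size=32)
```

The same model shape covers the binary finger-vs-knuckle task from the abstract by replacing the 17-way softmax with a 2-way output.
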
@@ -19,3 +20,8 @@ publisher = {ACM},
address = {New York, NY, USA},
}
</pre>
## Repository structure
* viewer/ Android example application; this requires a hacked LG Nexus 5
* python/ Evaluation code for all models described in the paper.
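
As an illustration of how such models might be scored, the sketch below derives the two metrics mentioned in the abstract, 17-gesture accuracy and finger-vs-knuckle accuracy, from a trained classifier's predictions. It is not the repository's actual evaluation code; the function name and the set of knuckle class indices are placeholders.

```python
# Hypothetical evaluation helper; the class-index layout is an assumption,
# not the convention used by the scripts in python/.
import numpy as np

def evaluate(model, X_test, y_test, knuckle_classes):
    """Return (17-gesture accuracy, finger-vs-knuckle accuracy)."""
    y_pred = model.predict(X_test).argmax(axis=1)

    # Accuracy over the full 17-class gesture set
    gesture_acc = float(np.mean(y_pred == y_test))

    # Collapse the gesture classes into a binary finger/knuckle label
    true_knuckle = np.isin(y_test, knuckle_classes)
    pred_knuckle = np.isin(y_pred, knuckle_classes)
    binary_acc = float(np.mean(true_knuckle == pred_knuckle))

    return gesture_acc, binary_acc
```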