From b8b8930649c36ba6d0e2cbb0904f1cb56fdc9f7c Mon Sep 17 00:00:00 2001
From: mayersn
Date: Wed, 7 Aug 2019 18:00:11 -0400
Subject: [PATCH] updated README

---
 README.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 18a0042..0d284e5 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,6 @@
 # KnuckleTouch
-This repository contains the data set and scripts for the MuC '19 paper on "KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning".
+This repository contains the data set and scripts for the MuC '19 paper on "KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning". [YouTube](https://www.youtube.com/watch?v=akL3Ejx3bv8)
+
 ## Abstract
 While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input, but recognizing knuckle gestures robustly and accurately remains challenging. We present a method to differentiate between 17 finger and knuckle gestures based on a long short-term memory (LSTM) machine learning model. Furthermore, we introduce an open-source approach that is ready to deploy on commodity touch-based devices. The model was trained on a new dataset that we collected in a mobile interaction study with 18 participants. We show that our method can achieve an accuracy of 86.8% in recognizing one of the 17 gestures and an accuracy of 94.6% in differentiating between finger and knuckle. In our evaluation study, we validated our models and found that the LSTM gesture recognition achieved an accuracy of 88.6%. We show that KnuckleTouch can be used to improve input expressiveness and to provide shortcuts to frequently used functions.
 
@@ -19,3 +20,8 @@
 publisher = {ACM},
 address = {New York, NY, USA},
 }
+
+## Repository structure
+
+* viewer/ Android example application; requires a hacked LG Nexus 5.
+* python/ Evaluation code for all models described in the paper.
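
Note for reviewers: the abstract added above describes an LSTM classifier over touch sequences. The sketch below is a minimal, hypothetical illustration of that kind of model, not code taken from the python/ directory; the sequence length, feature count, and layer sizes are assumptions, and only the 17-class output matches the paper.

```python
# Illustrative sketch only -- not the KnuckleTouch implementation.
# Assumes fixed-length sequences of flattened capacitive frames;
# SEQ_LEN, FEATURES, and the LSTM width are made-up placeholders.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 17   # finger + knuckle gestures, per the paper
SEQ_LEN = 50       # hypothetical frames per gesture
FEATURES = 64      # hypothetical flattened capacitive features per frame

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, FEATURES)),
    layers.LSTM(128),                                # temporal model of the touch sequence
    layers.Dense(NUM_CLASSES, activation="softmax"), # one probability per gesture class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Smoke test on random data shaped like (samples, SEQ_LEN, FEATURES).
x_dummy = np.random.rand(8, SEQ_LEN, FEATURES).astype("float32")
y_dummy = np.random.randint(0, NUM_CLASSES, size=(8,))
model.fit(x_dummy, y_dummy, epochs=1, verbose=0)
```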