added License and Readme

Sven Mayer 2019-06-19 10:43:05 -07:00
parent e734ab63c5
commit 1b60913a43
2 changed files with 41 additions and 1 deletions

LICENSE (new file, 21 additions)

@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2019 perceptualui.org

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README (20 additions, 1 deletion)

@@ -1,2 +1,21 @@
# knuckletouch
# KnuckleTouch
This repository contains the data set and scripts for the MuC '19 paper on "KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning".
## Abstract
While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input, but recognizing knuckle gestures robustly and accurately remains challenging. We present a method to differentiate between 17 finger and knuckle gestures based on a long short-term memory (LSTM) machine learning model. Furthermore, we introduce an open-source approach that is ready to deploy on commodity touch-based devices. The model was trained on a new dataset that we collected in a mobile interaction study with 18 participants. We show that our method achieves an accuracy of 86.8% in recognizing one of the 17 gestures and an accuracy of 94.6% in differentiating between finger and knuckle. In our evaluation study, we validated our models and found that the LSTM gesture recognition achieved an accuracy of 88.6%. We show that KnuckleTouch can be used to improve input expressiveness and to provide shortcuts to frequently used functions.
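
For orientation, the sketch below shows what a minimal LSTM-based gesture classifier of this kind could look like in Keras. It is an illustrative sketch only, not the repository's actual training script: the sequence length (50 frames), feature size (64 values per frame), and layer sizes are assumptions, while the 17 output classes follow the gesture set described above.

<pre>
# Illustrative sketch only: a minimal LSTM gesture classifier in Keras.
# Each sample is assumed to be a fixed-length sequence of flattened
# capacitive frames; shapes and layer sizes are assumptions, not the
# repository's actual configuration.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

TIME_STEPS = 50      # assumed number of capacitive frames per gesture
NUM_FEATURES = 64    # assumed size of each flattened frame
NUM_CLASSES = 17     # finger and knuckle gestures, as in the abstract

model = Sequential([
    LSTM(128, input_shape=(TIME_STEPS, NUM_FEATURES)),  # summarize the sequence
    Dropout(0.5),                                        # regularization
    Dense(NUM_CLASSES, activation="softmax"),            # one score per gesture
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data with the assumed shapes, just to show the expected input format.
x_train = np.random.rand(32, TIME_STEPS, NUM_FEATURES).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=(32,))
model.fit(x_train, y_train, epochs=1, batch_size=8)
</pre>
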
This work can be cited as follows:
<pre>
@inproceedings{Schweigert:2019:KTE,
title = {KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning},
author = {Schweigert, Robin and Leusmann, Jan and Hagenmayer, Simon and Weiß, Maximilian and Le, Huy Viet and Mayer, Sven and Bulling, Andreas},
doi = {10.1145/3340764.3340767},
year = {2019},
date = {2019-09-08},
booktitle = {Mensch und Computer},
series = {MuC '19},
location = {Hamburg, Germany},
publisher = {ACM},
address = {New York, NY, USA},
}
</pre>