File Download

There are no files associated with this item.


Detailed Information


DragID: A Gesture Based Authentication System

Author(s)
Min, Junyoung
Advisor
Chun, Seyoung
Issued Date
2014-08
URI
https://scholarworks.unist.ac.kr/handle/201301/71811
http://unist.dcollection.net/jsp/common/DcLoOrgPer.jsp?sItemId=000001754863
Abstract
The use of mobile computing devices with touch screens is becoming widespread, and sensitive personal information is often stored on these devices. Smart device users run applications that handle sensitive personal data, such as online banking. To protect this information, code-based screen-unlock methods have been used so far; however, these methods are vulnerable to shoulder-surfing and smudge attacks. To build a secure unlocking method, we propose DragID, a flexible gesture- and biometric-based user authentication system. Based on a model of the human hand, DragID authenticates users using six input sources of the touch screen. From these input sources, we build 25 fine-grained features such as the origin of the hand, finger radius, velocity, gravity, and perpendicularity. Because our method models the human hand, features such as radius or origin are difficult to imitate, which makes them useful for authentication. For classification we use a popular machine learning method, the support vector machine, which prevents attackers from reproducing the exact same drag patterns. In our experiments, we implemented DragID on a Samsung Galaxy Note 2, collected 147,379 drag samples from 17 volunteers, and conducted a real-world evaluation. Our method outperforms Luca's method, achieving a true positive rate of 89.49% and a false positive rate of 0.36%; with our sequence technique, the true positive rate rises to 92.33%.
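The classification step described above follows a standard pattern: drag samples are turned into fixed-length feature vectors and fed to a support vector machine that separates the legitimate user from others. The sketch below illustrates that pattern with scikit-learn; the feature values, data sizes, and SVM parameters are illustrative assumptions, not the thesis's actual pipeline.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: each row is one drag sample described by 25
# fine-grained features (e.g., finger radius, velocity, ...); the label
# is 1 for the legitimate user and 0 for other users.
X = rng.normal(size=(1000, 25))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# An RBF-kernel SVM is a common choice for this kind of biometric
# classification; the thesis does not state its kernel or parameters.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

# Evaluate with the same metrics the abstract reports: true positive
# rate (legitimate drags accepted) and false positive rate (impostor
# drags accepted).
pred = clf.predict(X_test)
tpr = np.sum((pred == 1) & (y_test == 1)) / max(np.sum(y_test == 1), 1)
fpr = np.sum((pred == 1) & (y_test == 0)) / max(np.sum(y_test == 0), 1)
print(f"TPR={tpr:.2%}, FPR={fpr:.2%}")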
Publisher
Ulsan National Institute of Science and Technology


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.