Systems and methods for providing a natural media painting application may receive user inputs through tablet stylus gestures. A user interface may detect stylus gestures that mimic the real-world actions of artists based on information collected during user manipulation of the stylus, and may map the gestures to various digital painting and image editing tasks that may be invoked and/or controlled using the gesture-based inputs. The collected information may include spatial and/or directional information, acceleration data, an initial and/or ending position of the stylus, an initial and/or ending orientation of the stylus, and/or pressure data. The stylus gestures may include translations, rotations, twisting motions, mashing gestures, or jerking motions. The application may perform appropriate painting and image editing actions in response to detecting and recognizing the stylus gestures, and the actions taken may depend on the work mode and/or context of the graphics application in which the stylus gesture was performed.
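The gesture pipeline described above (collect stylus state, classify a gesture, then dispatch a mode-dependent action) might be sketched as follows. All field names, thresholds, gesture labels, and the gesture-to-action table are illustrative assumptions for this sketch, not the actual implementation claimed by the system.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class StylusSample:
    """One snapshot of collected stylus state (positions, orientation,
    pressure, acceleration); all fields are assumed names, not a real API."""
    x: float        # position on the tablet surface
    y: float
    tilt: float     # orientation relative to the surface, degrees
    rotation: float # barrel rotation about the stylus axis, degrees
    pressure: float # normalized 0..1
    accel: float    # acceleration magnitude, arbitrary units

def classify_gesture(start: StylusSample, end: StylusSample) -> str:
    """Map the change between an initial and an ending stylus state to one
    of the gesture types named in the text; thresholds are made up."""
    dx, dy = end.x - start.x, end.y - start.y
    moved = hypot(dx, dy)
    if end.pressure > 0.9 and moved < 5:
        return "mash"         # hard press with little travel
    if end.accel > 8.0 and moved < 20:
        return "jerk"         # sudden acceleration over a short distance
    if abs(end.rotation - start.rotation) > 30:
        return "twist"        # barrel rotated about the stylus axis
    if abs(end.tilt - start.tilt) > 20:
        return "rotation"     # orientation change relative to the surface
    if moved > 10:
        return "translation"  # ordinary drag
    return "none"

# Hypothetical context-dependent mapping: the same gesture may invoke
# different actions depending on the application's work mode.
ACTIONS = {
    ("paint", "twist"): "rotate brush tip",
    ("paint", "mash"): "load more pigment",
    ("edit", "jerk"): "undo last stroke",
    ("edit", "translation"): "pan canvas",
}

def dispatch(mode: str, gesture: str) -> str:
    return ACTIONS.get((mode, gesture), "no-op")
```

For example, a hard press with almost no movement classifies as a mashing gesture, which in a painting mode might load more pigment onto the brush, while the identical gesture in another mode would map to a different action (or none).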