By assessing the precision of gestural interactions with touchscreen targets, the authors investigated how gesture type, target location, and scene visibility affect movement endpoints. Participants used a stylus to make visually guided and memory-guided pointing and swiping gestures toward targets arranged in a semicircle. The authors identified specific differences in aiming errors between swiping and pointing: participants overshot the target more when swiping than when pointing, and swiping endpoints showed a stronger bias toward oblique directions than pointing endpoints. As expected, the authors also found differences between the visually guided and the delayed, memory-guided conditions. Overall, each of the three factors studied influenced movement execution, and the findings indicate that the information used to guide movement appears to be gesture-specific.