Any way to detect the area under a touch?

I haven’t seen anything in the documentation or example code that does it, so I’m guessing the answer is no. Can you detect the entire area contacted by a touch? For example, if you pressed a thumb down, could you identify all the points under the thumb? The issue came up for me after I bought a stylus and tried to write notes in an app. It’s natural for your hand to rest on the drawing surface, but that gets interpreted as input.

You want to identify the wrist touches by the shape of the touch rather than by the position? Drawing and note-taking apps usually handle this by implementing a wrist-rest area at the bottom of the screen; I think they just ignore touches in that area. What you want sounds interesting, though I don’t know if it’s possible.

@Marksolberg unfortunately the low-level touch data (such as the shape of the touch) is not available on iOS. We only receive computed centroids for each touch (up to 11 touches), which we pass directly on to your Codea project through the touched method.

Writing apps have a section that ignores touches, as Spielkind says. They either let the user position this dead zone manually, or position it automatically based on the other touches. E.g., take the highest touch in the last 30 seconds and ignore any touches more than 100 pixels below that, or some such algorithm.
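A minimal sketch of that heuristic, written in Python for clarity (the 30-second window and 100-pixel margin are the illustrative numbers from above, not values from any real app; the `PalmRejector` class name is made up). It assumes Codea-style coordinates where y increases upward, so the "highest" touch is the one with the largest y:

```python
import time

WINDOW = 30.0   # seconds of touch history to keep
MARGIN = 100.0  # reject touches more than this far below the highest recent touch

class PalmRejector:
    """Track recent touch heights and flag likely wrist touches."""

    def __init__(self):
        self.history = []  # list of (timestamp, y) pairs for accepted touches

    def accept(self, y, now=None):
        """Return True if a touch at height y should be treated as real input."""
        now = time.time() if now is None else now
        # Drop touches that have aged out of the window.
        self.history = [(t, ty) for (t, ty) in self.history if now - t <= WINDOW]
        highest = max((ty for (_, ty) in self.history), default=None)
        # With no recent history, accept; otherwise reject anything that
        # lands well below the highest recent touch (probably the wrist).
        ok = highest is None or y >= highest - MARGIN
        if ok:
            self.history.append((now, y))
        return ok
```

In a Codea project the same logic would live inside `touched(touch)`, feeding each `touch.y` through a check like this before drawing. Stroke touches near the top keep the zone up-to-date, so the dead zone follows the writing position down the page.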