This isn’t exactly a bug, but it is confusing and seemingly undocumented. And it is probably an iOS deficiency, not Codea’s, but it’s still annoying.
os.time() gives the standard Unix epoch time in seconds (as one might expect everywhere but on Windows).
os.clock() gives CPU time since Codea was started, in seconds.
touch.timestamp has a value that looks like neither of these, which makes touch-time-relative math rather difficult. Based on how it changes, it appears to be in seconds, but the zero point is unclear (perhaps the time since the device was last rebooted?).
What is the basis and unit of measure of touch.timestamp, and how can it be converted to either the os.time() or the os.clock() format? Or is it in its own private, impenetrable world?
NOTE: Looking at the iOS documentation, it appears Codea may simply be passing through the iOS touch timestamp (which makes sense). That value is defined as a TimeInterval, but even Apple does not seem to document its basis, so timestamp is only good for comparing times relative to other touches.
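If the basis really is arbitrary but fixed while the app runs, one workaround is to anchor the first timestamp you see against a clock whose basis you do know. A minimal sketch, assuming Codea's ElapsedTime global (seconds since the app launched) and an illustrative variable name timestampOffset:

```lua
-- Hypothetical sketch: convert touch.timestamp to an app-relative basis
-- by sampling the offset between it and ElapsedTime on the first touch.
-- "timestampOffset" and "appTime" are names invented for this example.
local timestampOffset = nil

function touched(touch)
    if timestampOffset == nil then
        -- Both values are in seconds; their difference should stay
        -- (nearly) constant, so one sample is enough for relative math.
        timestampOffset = ElapsedTime - touch.timestamp
    end
    -- The touch time re-expressed as seconds since the app started:
    local appTime = touch.timestamp + timestampOffset
end
```

The same trick works against os.time() or os.clock() if you prefer those bases; the point is only that the unknown zero point cancels out once you subtract a sampled offset.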
It would be nice if:
(a) the value of the timestamp was more clearly defined in the Codea Reference (so people don’t have to repeat my experimentation and research to figure out exactly what is going on)
(b) the Codea touch included either the os.time() or the os.clock() value for the touch (though this isn't really that important, since a developer can grab those when first capturing the touch)
[Final note: This came from needing to add a delay while a user holds down a zoom button, to slow the zoom. It is certainly doable, just not in the ways I first attempted, until I sorted out which times were which.]
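For anyone hitting the same zoom-button case: since timestamps are at least consistent with each other, differences between them are meaningful even though the basis is unknown. A sketch of that throttling idea, with illustrative names (zoomInterval, lastZoomTime, zoomStep are not from Codea's API):

```lua
-- Hypothetical sketch: rate-limit zoom steps while a button is held,
-- using only differences between touch timestamps (basis cancels out).
local zoomInterval = 0.1   -- minimum seconds between zoom steps
local lastZoomTime = nil

function touched(touch)
    if touch.state == BEGAN or touch.state == MOVING then
        if lastZoomTime == nil or
           touch.timestamp - lastZoomTime >= zoomInterval then
            lastZoomTime = touch.timestamp
            -- zoomStep() would adjust the zoom level here
        end
    elseif touch.state == ENDED then
        lastZoomTime = nil   -- reset so the next press zooms immediately
    end
end
```

No conversion to os.time() or os.clock() is needed for this, which is what finally made the original problem tractable.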