Rolig Loon, a very active senior contributor to the
Scripting forum, recently noticed that the simulation
seems to know more than it should about where an object has rotated:
a script can take advantage of the object's current VISIBLE
rotation using the llDetectedTouchNormal function. How is this
possible?
Details and sample scripts
demonstrating the effects are in Rolig's forum thread.
As the discussion developed, it became
clear that llDetectedTouchNormal wasn't using information the sim
normally knows about objects, but rather information it gets from the
viewer doing the touching. It's that viewer's idea of where the
object has spun that is sent to the sim as part of the touch event
itself.
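A minimal sketch of how this might be observed (an illustration, not Rolig's actual script): the prim spins via llTargetOmega, which is animated client-side, so each viewer renders its own instantaneous rotation; the normal reported on touch reflects what the touching viewer was displaying.

```
default
{
    state_entry()
    {
        // Client-side smooth rotation: each viewer animates this
        // locally, so different viewers can show the prim at
        // different angles at any given moment.
        llTargetOmega(<0.0, 0.0, 1.0>, PI / 4, 1.0);
    }

    touch_start(integer num_detected)
    {
        // The surface normal reported here corresponds to the
        // rotation the *touching* viewer was rendering when the
        // click happened.
        llOwnerSay("Touched face normal: "
            + (string)llDetectedTouchNormal(0));
    }
}
```

Clicking the same face of the spinning prim from two different viewers can report different normals, because each viewer started animating the rotation at a different time.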
There are other llDetectedTouch*
functions, too, that use the touching viewer's representation of the
object's local, visible state. In my post to the thread, I
demonstrated how llDetectedTouchUV uses the instantaneous offset of
an active texture animation, another viewer-side property.
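The texture-animation case can be sketched the same way (again an illustration, not the script from the thread): llSetTextureAnim runs entirely viewer-side, and the UV returned on touch includes that viewer's current animation offset.

```
default
{
    state_entry()
    {
        // Texture animation is computed viewer-side: scroll one
        // frame smoothly across all faces at 0.25 cycles/second.
        llSetTextureAnim(ANIM_ON | LOOP | SMOOTH, ALL_SIDES,
            1, 1, 0.0, 1.0, 0.25);
    }

    touch_start(integer num_detected)
    {
        // The UV coordinates include the touching viewer's
        // instantaneous texture-animation offset.
        llOwnerSay("Touch UV: " + (string)llDetectedTouchUV(0));
    }
}
```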
This way a script can respond
to touch in a way that's consistent with how the object is shown in
the viewer doing the touching.
On further reflection, though, this
"consistency" is local to just that single viewer.
Different viewers of the same object see smooth rotation and texture
animation differently, for example depending on when the object first
rezzed into that viewer's scene. Hence a "curious"
consistency.
Reporter Qie Niangao
190701