Turns out this doesn't really affect many end-user scripts, because most folks just use whatever Experiences are already compiled, and many Experience scripts (such as the AVsitter auto-attach scripts I often praise) are embedded in object inventories rather than rezzed out on their own, so they escaped the damage.
New objects don't seem to be affected now, but it's not yet clear whether there's any way for the Lab to fix the already damaged objects. If you're interested, you can watch https://jira.secondlife.com/browse/BUG-227526 for updates.
Much wider attention is on the new
"Bakes on Mesh" (BoM) feature for attached mesh bodies and
heads. This enables direct wearing of skin-tight textures and alpha
masks, just as the old system avatars used, rather than the much more
complex and limited "applier" scripts and "alpha cuts"
HUDs that mesh attachments needed until now.
Much has been said about BoM already, but some things, I think, have been under-reported. First, in addition
to the oft-cited improvement in rendering efficiency, the feature
overcomes a constant source of user frustration and support problems:
the unpredictable interaction of multiple layers of blended alpha
textures. This comes up all the time when people try to wear makeup
and tattoos, say, on overlapping layers of their attached mesh. All
those problems go away when those overlapping transparent layers are
simply baked on top of each other into a single displayed surface
texture.
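To see why baking fixes the layer-order headaches: the viewer has to guess a draw order for overlapping blended-alpha surfaces at render time, and a wrong guess changes what you see. Baking instead flattens the stack once, ahead of time, in a known bottom-to-top order using the standard "over" compositing rule. A minimal sketch of that flattening step in Python (the pixel values are made up for illustration; this is not anything from the SL codebase):

```python
def over(top, bottom):
    """Porter-Duff 'over': composite one RGBA pixel onto another.

    Each pixel is (r, g, b, a) with components in 0.0-1.0.
    """
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    out_a = ta + ba * (1.0 - ta)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    out_rgb = tuple(
        (tc * ta + bc * ba * (1.0 - ta)) / out_a
        for tc, bc in zip((tr, tg, tb), (br, bg, bb))
    )
    return (*out_rgb, out_a)

def bake(layers):
    """Flatten a bottom-to-top stack of RGBA layers into one pixel."""
    result = layers[0]
    for layer in layers[1:]:
        result = over(layer, result)
    return result

# Skin, then a half-transparent tattoo, then sheer make-up:
skin   = (0.8, 0.6, 0.5, 1.0)
tattoo = (0.1, 0.1, 0.3, 0.5)
rouge  = (0.9, 0.2, 0.2, 0.25)
print(bake([skin, tattoo, rouge]))
```

Because the order is fixed when the bake happens, the tattoo always sits over the skin and under the rouge, every time, in every scene.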
This also completely eliminates a problem inherent in how Advanced Lighting is implemented: a limit on how many projected light sources can affect a blended alpha surface. If you've ever noticed make-up, for example, looking weirdly coloured depending on where it is in a scene with complicated lighting, you've seen the problem -- which you WON'T see anymore with baked-on textures.
Finally, a note about BoM and
Materials: if creators of BoM skin, make-up, tattoos, clothing, etc.
want to include "Materials" effects, they should simply
supply the normalmaps and/or specularmaps directly with their
products, so folks can manually compose their combined normal- and
specularmaps for the baked surface layer. These assets have zero
value as "stolen" content, so the creator loses nothing by
doing so, however much they may have been trained to "protect
IP" at all cost. Mesh avatar makers understand this and provide a way to apply those manually composed material maps. (At least Slink does: it was "first out the gate" with BoM, so I could test its existing built-in Materials applier on the skin layer, and a single-layer Omega applier for Materials will also work.)
The reason this process still involves
manual composition is that there's no simple rule for combining the
"bumpiness" of normalmaps layered on top of one another. An
under-shirt might be tight enough to reveal skin-layer bumpiness, but
its own ribbed collar may not be revealed under a tight jacket, for
example. It's still up to the user to decide which bumps should show
through, and that means combining the normalmaps individually. (Most end-users may not want to do this themselves, but creators of quality assembled outfits will surely follow this process for their customers.)
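The manual step boils down to a per-pixel decision: where a garment covers the detail underneath, keep the outer layer's normal; where it doesn't, let the under-layer's bumps show. One common way to do that by hand in an image editor is a masked blend of the two maps followed by renormalizing the result. A rough sketch of that math in Python (the pixel values and the simple linear-blend approach are my own illustration, not any particular tool's algorithm):

```python
import math

def unpack(px):
    """Map an 8-bit normalmap pixel (r, g, b) to a vector in [-1, 1]."""
    return tuple(c / 127.5 - 1.0 for c in px)

def pack(n):
    """Map a unit vector back to 8-bit pixel components."""
    return tuple(round((c + 1.0) * 127.5) for c in n)

def blend_normal(base_px, detail_px, weight):
    """Blend two normalmap pixels and renormalize.

    weight 0.0 keeps the base bump (detail hidden under a jacket);
    weight 1.0 lets the detail layer's bump show through.
    """
    bx, by, bz = unpack(base_px)
    dx, dy, dz = unpack(detail_px)
    x = bx * (1.0 - weight) + dx * weight
    y = by * (1.0 - weight) + dy * weight
    z = bz * (1.0 - weight) + dz * weight
    length = math.sqrt(x * x + y * y + z * z) or 1.0
    return pack((x / length, y / length, z / length))

flat = (128, 128, 255)   # "no bump" -- a flat normalmap pixel
rib  = (160, 128, 240)   # a ribbed-collar detail pixel
print(blend_normal(flat, rib, 0.0))  # jacket covers the collar
print(blend_normal(flat, rib, 1.0))  # collar shows through
```

This is exactly why there's no one-size-fits-all rule: the `weight` mask is a judgment call about which garment hides which bump, which is why quality outfit creators will do it per product rather than leave it to the customer.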
Reporter Qie Niangao
190902