Incremental improvements to Open Simulator

From Open Metaverse Wiki

The views expressed below are not necessarily those of the Open Metaverse Research Group, but are the author's alone.

Can we get to Open Simulator 2.0 by incremental improvements to what we have? Maybe. Here's an approach. These are intended to be reasonable goals that can be worked on independently by individuals or small groups.

These are topics for discussion at this point.

Goals

Try to get speed and quality up to GTA V level. That's technology over a decade old. UE5 demo quality would be nice but probably isn't feasible yet; Unreal Engine requires extensive scene preparation in Unreal Editor, which works on the whole scene, and metaverses don't work that way.

Networking

Safely extending UDP message types.

We're very constrained by the UDP format. It's hard to add new messages. Each end has to know exactly how long each message is. Unknown message types cannot just be ignored. Here's a way to fix that.

Add this new message type:

    // EncapsulatedMessage
    // simulator -> viewer
    // viewer -> simulator
    // reliable
    {
        EncapsulatedMessage High 31 NotTrusted Unencoded
        {
            Message Single
            {    Data    Variable    2    }    // contains any message, variable length
        }
    }

This wraps any other message and prefixes it with a length. Since the length is explicit, a receiver that doesn't recognize the inner message type can simply skip it. That makes extending the protocol much simpler.
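
A minimal decoding sketch, in C++ with hypothetical names, assuming the Variable 2 field carries the usual 2-byte little-endian length prefix on the wire:

    #include <cstddef>
    #include <cstdint>
    #include <optional>
    #include <vector>

    // Pull the inner message out of an EncapsulatedMessage body.  A receiver
    // that doesn't recognize the inner message type just discards the buffer
    // and moves on -- the outer length says exactly how much to skip.
    std::optional<std::vector<uint8_t>> decode_encapsulated(const uint8_t* body,
                                                            size_t body_len) {
        if (body_len < 2)
            return std::nullopt;             // truncated packet
        const size_t inner_len = body[0] | (static_cast<size_t>(body[1]) << 8);
        if (inner_len > body_len - 2)
            return std::nullopt;             // length field is wrong; drop it
        return std::vector<uint8_t>(body + 2, body + 2 + inner_len);
    }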

Improved retransmission policy

The current message retransmission policy is to wait a fixed length of time for an ACK, then retransmit the message. This is repeated a few times, then the networking system gives up. The fixed time is a bit too long. Suggestion: start with twice the measured round-trip time, then increase the timeout by a factor of 1.5x until retransmission either succeeds or the connection is dropped as broken.

Long retransmission stalls tend to break things. I suspect this is one of the problems with region crossings.
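
A sketch of that schedule, assuming the viewer keeps a measured round-trip time; the 2x starting point and 1.5x growth come from the text above, the retry cap is illustrative:

    #include <chrono>

    using std::chrono::milliseconds;

    // Start at twice the measured RTT, grow the timeout by 1.5x on each failed
    // attempt, then drop the circuit as broken after a fixed number of tries.
    struct RetransmitPolicy {
        milliseconds measured_rtt{100};   // kept up to date from ACK timing
        int          max_attempts{5};     // illustrative cap, not from the text

        // Timeout to use for attempt N (0 = the first retransmission).
        milliseconds timeout_for_attempt(int attempt) const {
            double t = 2.0 * measured_rtt.count();   // start at 2x RTT
            for (int i = 0; i < attempt; ++i)
                t *= 1.5;                            // back off by 1.5x each time
            return milliseconds(static_cast<long long>(t));
        }

        bool should_drop_circuit(int attempt) const {
            return attempt >= max_attempts;          // connection considered broken
        }
    };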

Concurrent message processing and rendering

Current viewers work like this:

  • Read some incoming messages, but not too many, to prevent overload.
  • Acknowledge incoming messages that were read.
  • Process updates.
  • Draw frame.

If the frame rate drops, so does the message acknowledgement rate. This is why frame rate affects ping time.

Suggested approach (used in Sharpview [1]):

  • Render thread:
    • Draw frame, repeat.
  • Network thread:
    • Read all incoming messages.
    • Acknowledge all incoming messages.
    • Put on queue for update thread.
  • Update thread:
    • Handle queued incoming messages.

Under overload, the incoming message queue could build up. In practice this doesn't seem to happen often, but it has to be monitored. The incoming queue could be split into two priority queues:

  • Object updates for objects near the camera.
  • Everything else.

That way, lag mostly affects background objects, and network problems are separated from rendering delays.
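
A rough sketch of that split in C++, with a hypothetical Message type and an illustrative 32m "near" radius; real code needs proper shutdown and back-pressure handling, reduced here to a close() call:

    #include <condition_variable>
    #include <cstdio>
    #include <deque>
    #include <mutex>
    #include <optional>
    #include <thread>

    // Hypothetical message type; a real viewer would carry a decoded LLUDP message.
    struct Message { float distance_from_camera = 0.0f; int id = 0; };

    // Two-priority queue: updates near the camera drain before everything else,
    // so under overload the lag mostly hits background objects.
    class UpdateQueue {
    public:
        void push(Message m, float near_radius = 32.0f) {
            std::lock_guard<std::mutex> lk(mu_);
            (m.distance_from_camera < near_radius ? near_ : far_).push_back(m);
            cv_.notify_one();
        }
        void close() {                          // no more messages will arrive
            std::lock_guard<std::mutex> lk(mu_);
            closed_ = true;
            cv_.notify_all();
        }
        std::optional<Message> pop() {          // blocks; empty result = shutdown
            std::unique_lock<std::mutex> lk(mu_);
            cv_.wait(lk, [&] { return closed_ || !near_.empty() || !far_.empty(); });
            if (near_.empty() && far_.empty())
                return std::nullopt;
            std::deque<Message>& q = near_.empty() ? far_ : near_;
            Message m = q.front();
            q.pop_front();
            return m;
        }
    private:
        std::mutex mu_;
        std::condition_variable cv_;
        std::deque<Message> near_, far_;
        bool closed_ = false;
    };

    int main() {
        UpdateQueue queue;

        // Network thread: read and ACK every incoming message right away, then
        // hand it off.  It never waits on rendering, so ping no longer tracks FPS.
        std::thread network([&] {
            for (int i = 0; i < 4; ++i)         // stand-in for the socket read loop
                queue.push(Message{i % 2 ? 10.0f : 200.0f, i});
            queue.close();
        });

        // Update thread: applies queued messages to the scene data structures.
        std::thread update([&] {
            while (auto m = queue.pop())
                std::printf("update %d (%.0fm from camera)\n", m->id,
                            m->distance_from_camera);
        });

        // The render thread (not shown) just loops drawing frames, independent
        // of both of the above.
        network.join();
        update.join();
    }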

IPv6 support

Each simulator needs an IP address, and IPv6 addresses are much easier and cheaper to get than IPv4 addresses. Only a few message formats need to change: the ones that carry IPv4 addresses used in setting up connections.

Content

Goal: allow more in-viewer editing by average users.

Mesh deduplication

All mesh cubes are mostly the same; duplicate meshes could be detected and stored and rendered as a single shared copy.

Standard mesh parts kit

  • Library of standard low-LI mesh parts, all open source.
  • These appear in build menus.
  • More can easily be added. They're just mesh objects with public UUIDs.
  • Lower LI for standard mesh parts, since there's only one copy of the mesh per region.
  • Rescalable, but not parameterized. Parameterization is hard. (See Archimatix Pro for Sinespace for a real parametric system.)

Everybody is going glTF

Adopt glTF as the standard format for both materials and meshes. That's where the industry is going. So is SL.

Avatars and clothing

Many options here. Needs much discussion.

Suggested goal: automatic clothing fitting, like Roblox now has.

Lights

Need more than just projector and point lights. Support for many short-range, no-shadow local lights (something glTF has, as point and spot lights with a defined range) would be a big win. Those could be used freely to illuminate interiors; with a reasonable range setting, they won't leak through walls and roofs. Clustered or tiled shading under Vulkan handles large numbers of such lights.
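
For illustration, a range-windowed falloff along the lines of the one suggested for glTF punctual (point and spot) lights; a light whose range is smaller than the room it sits in contributes exactly zero outside that range, even without shadow maps:

    #include <algorithm>
    #include <cmath>

    // Inverse-square attenuation, faded smoothly to exactly zero at the light's
    // range (similar to the falloff suggested for glTF punctual lights).  Many
    // of these can be evaluated per pixel with clustered or tiled shading.
    float range_attenuation(float range, float distance) {
        distance = std::max(distance, 0.01f);    // avoid blowing up at the source
        const float inv_square = 1.0f / (distance * distance);
        if (range <= 0.0f)
            return inv_square;                   // no range given: plain inverse-square
        const float window =
            std::clamp(1.0f - std::pow(distance / range, 4.0f), 0.0f, 1.0f);
        return window * inv_square;              // zero at and beyond 'range'
    }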

In-viewer content editing

With standard parts

See standard mesh parts kit above.

Custom parts

NVidia

NVidia Omniverse linkage is a possibility. It's available now for collaborative editing in Blender, and it connects different graphics programs; Blender, Photoshop, and a viewer could all be connected. Needs further research.

Unity

Unity Hub also allows for collaborative editing with Blender. There are some issues with portability of materials from Blender to Unity Hub, but porting materials back from Unity Hub to Blender is easy.

Inworld Blender

Since Blender is open source, merging Blender code into the OS viewer would allow for mesh creation inworld.

Blender Kit

The Blender Kit website, and similar sites, allow integrated free use of meshes and material textures made available by other creators under various license types.

Full hierarchy of prims

Child prims can have their own children. Have a real hierarchy like everybody else. Required for glTF, which assumes a hierarchy. Surprisingly, not that hard. See SL issue BUG-232445. [2]

This will also require creating OSSL script functions allowing for scripted movement of hierarchical linksets as a form of puppetry. See the Heirarchics scripts contributed to the SL LSL script library by Nexii Malthus (which seem to have been bunged up at a later point by someone else). This would enable much simpler manipulation of prims, without needing a lot of link messages or llGetLinkPrimitiveParams calls in every link just to determine the positions and rotations of other links before adding to the movement and rotation of child prims.
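
A sketch, with hypothetical types, of what a real hierarchy provides: a child prim's world position falls out of walking the parent chain, instead of every child script reconstructing it from link messages.

    #include <vector>

    struct Vec3 { float x = 0, y = 0, z = 0; };
    struct Quat { float x = 0, y = 0, z = 0, w = 1; };

    static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    // Rotate a vector by a unit quaternion: v' = v + 2*(u x (u x v + w*v)).
    static Vec3 rotate(Quat q, Vec3 v) {
        Vec3 u{q.x, q.y, q.z};
        Vec3 t = cross(u, add(cross(u, v), {q.w * v.x, q.w * v.y, q.w * v.z}));
        return {v.x + 2 * t.x, v.y + 2 * t.y, v.z + 2 * t.z};
    }

    // Hypothetical prim node: local transform relative to its parent, plus the
    // parent's index in a flat array (-1 means this prim is the root).
    struct PrimNode {
        int  parent = -1;
        Vec3 local_position;
        Quat local_rotation;
    };

    // World position of any prim: walk up the parent chain, composing transforms.
    // With a real hierarchy, moving one parent moves its whole subtree -- no link
    // messages or llGetLinkPrimitiveParams bookkeeping in every child script.
    Vec3 world_position(const std::vector<PrimNode>& prims, int index) {
        Vec3 pos = prims[index].local_position;
        for (int p = prims[index].parent; p != -1; p = prims[p].parent)
            pos = add(rotate(prims[p].local_rotation, pos), prims[p].local_position);
        return pos;
    }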

Immersion

Game/roleplay mode

An extension of no-fly mode. Turn on for game and roleplay areas.

  • First person camera.
  • No camming, but can turn head some to look around.
  • Limited click range.
  • Can't cam through walls and sit, so rooms with locked doors actually work.

Movement

More animation parameters

Speed, priority, etc. We need OSSL script functions that allow setting or varying the speed at which animations play. Animation priority should not apply to an entire animation across the whole armature, but should be settable per bone.

Similarly, functions that change prim positions, rotations, and scales should have rate parameters, just as llTargetOmega already does.
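
As a sketch of what such a rate parameter could mean (hypothetical names, modeled loosely on how llTargetOmega rates work for rotation), the simulator would step the value toward its target each physics frame:

    #include <cmath>

    struct Vec3 { float x = 0, y = 0, z = 0; };

    // Hypothetical server-side helper: advance a prim position toward a target
    // at a fixed rate in meters per second, the way a rate parameter on a
    // set-position script call might behave.  Called once per physics frame
    // with the elapsed time dt.
    Vec3 step_toward(Vec3 current, Vec3 target, float rate_m_per_s, float dt) {
        const float dx = target.x - current.x;
        const float dy = target.y - current.y;
        const float dz = target.z - current.z;
        const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        const float step = rate_m_per_s * dt;
        if (dist <= step || dist == 0.0f)
            return target;                          // arrives this frame
        const float k = step / dist;                // fraction of the way to move
        return {current.x + dx * k, current.y + dy * k, current.z + dz * k};
    }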

SL type puppetry.

(More)

Improved unsit

Sit script gets one last chance in the unsit event to put the avatar back to where they were when they sat down, or someplace safe.

There are smart sits where car doors open and the avatar gets inside, but then unsit leaves them standing on top of the car. Ugly. No more standing on tables by accident.

Projectiles

Goal: more reliable bullet rezzing. More parameters on rezzing, so that bullets don't need scripts. See SL BUG-233084. [3] Have a standard bullet mesh that's always available, to avoid asset fetch delays.

Inter-simulator issues

Region crossings

They have to Just Work. Tough technical problem. Worth solving.

Edge object duplication

Stationary solid objects within a few meters of a region edge should have a copy of their physics model duplicated in the adjacent region, to prevent falling through bridges and walking through walls.
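
A sketch of how a simulator might pick the objects to mirror, assuming a classic 256m region and a hypothetical object record; keeping the mirrored physics shapes in sync is the real work and is not shown:

    #include <vector>

    // Hypothetical static-object record: axis-aligned footprint of the physics
    // shape in region-local coordinates (classic 256m x 256m region assumed;
    // varregions would substitute the actual size).
    struct StaticObject {
        float min_x, min_y, max_x, max_y;
        bool  is_physical;                 // physical (moving) objects excluded
    };

    constexpr float kRegionSize = 256.0f;
    constexpr float kEdgeMargin = 4.0f;    // "within a few meters" of an edge

    // True if this object's footprint comes within kEdgeMargin of any region
    // edge, i.e. its physics shape should also be registered with the neighbor
    // so avatars and vehicles crossing the border don't fall through it.
    bool needs_edge_mirror(const StaticObject& o) {
        if (o.is_physical)
            return false;                  // only stationary solids are mirrored
        return o.min_x < kEdgeMargin || o.min_y < kEdgeMargin ||
               o.max_x > kRegionSize - kEdgeMargin ||
               o.max_y > kRegionSize - kEdgeMargin;
    }

    // Collect the set a simulator would push to its neighbors.
    std::vector<const StaticObject*> edge_objects(const std::vector<StaticObject>& all) {
        std::vector<const StaticObject*> out;
        for (const StaticObject& o : all)
            if (needs_edge_mirror(o))
                out.push_back(&o);
        return out;
    }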

Distant scenery

Take orthographic pictures of each region once in a while from all 4 sides and above. Do this with at least noon and midnight lighting, and maybe sunset and sunrise. The viewer displays those like a sim surround past the edge of all regions currently visible, so you see distant mountains. When in the air, you see the "above" picture for areas out of range. This is essentially how Google Earth does its big world.

Object impostors might also be generated to represent objects beyond the lowest LOD, for any object that would still appear as more than a few pixels in size.

Rendering Priority

SL and OS have had, for the longest time, problems properly prioritizing the rendering of objects by distance and occlusion. For instance, one can stand there watching a whole scene render for 20-40 seconds or more, then suddenly a wall appears immediately in front of you. Occlusion queries, or a few rays cast from the camera, can determine which objects are hidden behind others, and the rendering and download order can be prioritized from that data.
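
One way to express that priority, as a hedged sketch with hypothetical fields: order rendering and download work by estimated on-screen size, knocked down when the last occlusion test said the object was hidden.

    #include <algorithm>
    #include <vector>

    // Hypothetical per-object data the viewer already tracks in some form.
    struct SceneObject {
        float bounding_radius;      // meters
        float distance_to_camera;   // meters
        bool  occluded;             // result of the last occlusion query / ray test
    };

    // Rough priority: projected angular size, heavily discounted when the object
    // was occluded last frame.  The wall right in front of the camera scores far
    // above the scenery it hides, so it renders first instead of 40 seconds later.
    float load_priority(const SceneObject& o) {
        const float dist = std::max(o.distance_to_camera, 0.1f);
        const float projected = o.bounding_radius / dist;      // ~ apparent size
        return o.occluded ? projected * 0.05f : projected;     // occluded work waits
    }

    // Sort a work queue (mesh fetches, texture decodes, draw order) best-first.
    void sort_by_priority(std::vector<SceneObject>& queue) {
        std::sort(queue.begin(), queue.end(),
                  [](const SceneObject& a, const SceneObject& b) {
                      return load_priority(a) > load_priority(b);
                  });
    }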

Script Spheres of Influence

Just as we have Levels of Detail on the rendering of both mesh and primitives, and rendering ought to be prioritized by distance and occlusion, execution of script functions should be prioritized by each script's proximity to users (and by occlusion), except for scripts using whole-region functions such as llRegionSay. If my avatar is outside a building with a very complex dance club floor and light show going on inside, and nobody is present in there, users who can't see that content should not be lagged by scripts doing work that is irrelevant to anything their viewers have to render.
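
A sketch of one possible scheduling weight, with hypothetical fields and only the distance half of the idea (occlusion would need information from the viewer side): scripts far from every avatar, other than whole-region ones, get a smaller share of script time.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Hypothetical per-script record kept by the simulator's script scheduler.
    struct ScriptInfo {
        float pos_x, pos_y, pos_z;     // region-local position of the scripted prim
        bool  uses_region_functions;   // e.g. llRegionSay: always full priority
    };
    struct Agent { float x, y, z; };

    // Scheduling weight in (0, 1]: full weight near any avatar, tapering off
    // with distance to the nearest one.  The 32m radius is illustrative.
    float script_weight(const ScriptInfo& s, const std::vector<Agent>& agents) {
        if (s.uses_region_functions)
            return 1.0f;                         // whole-region scripts are exempt
        float nearest = 1.0e9f;                  // effectively "nobody around"
        for (const Agent& a : agents) {
            const float dx = s.pos_x - a.x, dy = s.pos_y - a.y, dz = s.pos_z - a.z;
            nearest = std::min(nearest, std::sqrt(dx * dx + dy * dy + dz * dz));
        }
        const float full_radius = 32.0f;
        // 1.0 inside the radius, then 1/d falloff -- distant scripts still run,
        // just with a smaller share of the frame's script time.
        return full_radius / std::max(nearest, full_radius);
    }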