Official / Dev. Blog / ShowRoom
Joined: Jan, 2021
Total Post: 73
nrgmaster
ShowRoom
Posted on: Mar. 01 2021 02:44 pm
I'm preparing a new series of tutorials to help people get started with Objects and Assets. For the last week I've been preparing a modular PBR library for the show rooms; it's still a work in progress, but it's going to allow me to create a tour for all the Objects and Assets, create individual tutorials for each type of Object and Asset, and also get some cool screenshots ;). The modular library and all assets will be bundled with the tours/tutorials so people can use them inside their own projects.


The current state of the modular library.


Some shots of the StandardMesh Room


And the FracturedMesh Room


nrgmaster
Re: ShowRoom
Posted on: Mar. 20 2021 06:44 pm
I'm basically done with all the rooms for showcasing the Objects, which now total 10. It's finally going to be a longer tour than I initially planned, but hey! The renderer currently has the following capabilities implemented:

- Deferred Rendering/Shading

- Image Based Lighting (IBL)

- Physical Based Rendering (PBR)

- Ambient Occlusion

- Bloom

- Depth Of Field (DOF)

- FXAA

Nothing too fancy... I still need to finalize the Screen Space Reflections (SSR), then the room scripts. I will try to take some shots today and post them ASAP.


nrgmaster
Re: ShowRoom
Posted on: Mar. 22 2021 01:47 am
Starting work on the LightSource rooms, I've added the ability for the LightType World to store 2nd-degree Spherical Harmonics components. These consist of 9 vec3 values that can then be used in your shader to approximate the radiance of an object, avoiding cubemap lookups. They can be calculated inside the editor (see the Renderer tutorial project for more info) or by other external tools.
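For reference, projecting a radiance function onto those 9 coefficients is just a weighted sum over directions on the sphere. Here's a small Python sketch (my own illustration, not the editor's code) using the same basis constants and sign convention as the evaluation shader, with deterministic Fibonacci-sphere sampling:

```python
import math

def fibonacci_sphere(n):
    # Deterministic, near-uniform directions on the unit sphere (y is "up").
    pts = []
    golden = math.pi * (3.0 - math.sqrt(5.0))
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - y * y)
        th = golden * i
        pts.append((math.cos(th) * r, y, math.sin(th) * r))
    return pts

def sh2_basis(x, y, z):
    # The same 9 basis terms (and signs) the evaluation shader uses.
    return (
         0.282095,
        -0.488603 * z,
         0.488603 * y,
        -0.488603 * x,
         1.092548 * x * z,
        -1.092548 * z * y,
         0.315392 * (3.0 * y * y - 1.0),
        -1.092548 * x * y,
         0.546274 * (x * x - z * z),
    )

def project_sh2(radiance, samples):
    # coeff[i] = integral of radiance * basis_i over the sphere,
    # approximated as a uniform sum (solid-angle weight 4*pi / N).
    coeffs = [0.0] * 9
    w = 4.0 * math.pi / len(samples)
    for d in samples:
        L = radiance(d)
        for i, b in enumerate(sh2_basis(*d)):
            coeffs[i] += L * b * w
    return coeffs

def eval_sh2(coeffs, d):
    # Mirrors the shader's per-pixel reconstruction.
    return sum(c * b for c, b in zip(coeffs, sh2_basis(*d)))
```

In a real pipeline you would feed `project_sh2` the cubemap texel directions and colors instead of an analytic function; the 9 resulting vec3 values are exactly what `SetSphericalHarmonics` expects per index.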


Two new APIs have been added specifically for the LightSource:

bool SetSphericalHarmonics( const unsigned char index, const vec3 component )

vec3 GetSphericalHarmonics( const unsigned char index )


The shader code is also very simple:

#define GAMMA vec3( 2.2 )

uniform vec3 _sh2[ 9 ];

uniform float metallic; // metallic value carried in the alpha channel

uniform mat3 gfx_ViewMatrixInverseTranspose;

varying vec3 normal;

void main()
{
	vec3 N = normalize( gfx_ViewMatrixInverseTranspose * normal );

	N = N.xzy; // If the cubemap up vector is Y, flip it to Z.

	vec3 sh2 = vec3(
		 0.282095 * _sh2[0] +

		-0.488603 * N.z * _sh2[1] +
		 0.488603 * N.y * _sh2[2] +
		-0.488603 * N.x * _sh2[3] +

		 1.092548 * N.x * N.z * _sh2[4] +
		-1.092548 * N.z * N.y * _sh2[5] +
		 0.315392 * ( 3.0 * N.y * N.y - 1.0 ) * _sh2[6] +
		-1.092548 * N.x * N.y * _sh2[7] +
		 0.546274 * ( N.x * N.x - N.z * N.z ) * _sh2[8] );

	gl_FragColor = vec4( pow( sh2, GAMMA ), metallic );
}



nrgmaster
Re: ShowRoom
Posted on: Mar. 23 2021 03:26 pm
Yesterday I implemented the alpha mechanism in order to finish the region room. I've also added the ability to connect and re-use FBO attachments, so you can write over color and re-use depth among multiple FBOs. This is ideal when dealing with alpha or particles, since you can write depth and color in the solid geometry pass, then simply re-use the two attachments to draw semi-transparent geometry back to front.
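As a mental model of why sharing the opaque depth attachment matters, here's a tiny Python sketch (illustrative only, not engine code): transparent surfaces are sorted back to front and blended with the standard "over" operator, and any surface that fails the depth test against the re-used opaque depth is skipped:

```python
def over(dst_rgb, src_rgba):
    # Standard "over" alpha blend: src drawn on top of dst.
    a = src_rgba[3]
    return tuple(src_rgba[i] * a + dst_rgb[i] * (1.0 - a) for i in range(3))

def composite_transparent(opaque_rgb, opaque_depth, surfaces):
    # surfaces: list of (view_depth, (r, g, b, a)), drawn farthest first.
    color = opaque_rgb
    for depth, rgba in sorted(surfaces, key=lambda s: s[0], reverse=True):
        if depth >= opaque_depth:
            continue                  # fails the re-used opaque depth test
        color = over(color, rgba)
    return color
```

The GPU version does the same thing implicitly: the solid pass fills the depth attachment, and the alpha pass reads it back through the shared attachment instead of re-rendering it.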

Now all windows connecting the rooms are see-through, and the alpha geometry for the region room works as expected:


And the windows between rooms:



nrgmaster
Re: ShowRoom
Posted on: Mar. 25 2021 08:56 pm
Over the last few days I fixed a couple of bugs which actually slowed me down wrapping up the renderer. However, yesterday I managed to fix the last few renderer issues (such as double-sided alpha) and implement a quick and dirty SSR. That effect is a pain to tweak, but I got something that is basically scalable for mobile. I wanted the effect to be subtle rather than implementing it all the way and ending up taking too much rendering time. At this point it actually takes half the time compared to SSAO, since I halve the SSR buffer and blur pass. Below are a few screenshots and the current implementation (probably still WIP).



Vertex Shader:

attribute lowp vec2 POSITION;

varying lowp vec2  texcoord0;

void main()
{
	texcoord0 = POSITION.xy * 0.5 + 0.5;

	gl_Position = vec4( POSITION.xy, 0.0, 1.0 );
}



Fragment Shader:

// https://github.com/congard/algine/blob/master/resources/shaders/SSR.frag.glsl

// SSR based on tutorial by Imanol Fotia

// http://imanolfotia.com/blog/update/2017/03/11/ScreenSpaceReflections.html

uniform sampler2D SAMPLER0; // composite (rgb + metallic)
uniform sampler2D SAMPLER1; // normal (rgb + roughness)
uniform sampler2D SAMPLER2; // depth

uniform mat4 _view;
uniform mat4 _viewinv;
uniform mat4 _projection;
uniform mat4 _projectioninv;

uniform int   binarySearchCount; // 4
uniform int   rayMarchCount;     // 64
uniform float rayStep;           // 0.015
uniform float falloff;           // 7.250
uniform float depth_bias;        // 0.125
uniform float minStep;           // 6.125

varying vec2 texcoord0;

vec3 get_eye_pos( const in vec2 texcoord )
{
	float depth = texture2D( SAMPLER2, texcoord ).r;

	vec4 pos = _projectioninv *
			   vec4( texcoord * 2.0 - 1.0,
					 depth    * 2.0 - 1.0,
					 1.0 );

	return pos.xyz / pos.w;
}


float get_eye_depth( const in vec2 texcoord )
{
	float depth = texture2D( SAMPLER2, texcoord ).r * 2.0 - 1.0;

    return ( ( depth * _projectioninv[2][2] ) +
			 _projectioninv[3][2] ) /			
		   ( ( depth * _projectioninv[2][3] ) +
			 _projectioninv[3][3] );
}


vec4 get_projected_coord( const in vec3 hitCoord )
{
	vec4 projectedCoord = _projection * vec4( hitCoord, 1.0 );

    projectedCoord.xy /= projectedCoord.w;
    projectedCoord.xy = projectedCoord.xy * 0.5 + 0.5;
    
    return projectedCoord;
}


vec2 binarySearch( inout vec3 dir,
				   inout vec3 hitCoord )
{
    vec4 projectedCoord;

    for( int i = 0; i < binarySearchCount; ++i )
    {
        projectedCoord = get_projected_coord( hitCoord );        

        float depth = hitCoord.z - get_eye_depth( projectedCoord.xy );

        dir *= 0.5;

		hitCoord += depth > 0.0 ? dir : -dir;
    }

    return projectedCoord.xy;
}


vec2 raycast( in	vec3 dir, 
			  inout vec3 hitCoord )
{   
    vec2 coords = vec2( -1.0 );
    
    dir *= rayStep;
    
    for( int i = 0; i < rayMarchCount; ++i )
    {
        hitCoord += dir;

        vec4 projectedCoord = get_projected_coord( hitCoord );
        
        float depth = hitCoord.z - get_eye_depth( projectedCoord.xy );

        if( ( dir.z - depth ) < depth_bias && depth < 0.0 )
        {
            coords = binarySearch( dir, hitCoord );
            break;
        }
    }

    return coords;
}


vec3 hash3( in vec3 a )
{
    a = fract( a * vec3( 0.8 ) );
    a += dot(a, a.yxz + 19.19 );
    return fract( ( a.xxy + a.yxx ) * a.zyx );
}


void main()
{
	vec4 composite = texture2D( SAMPLER0, texcoord0 );

	gl_FragColor = vec4( composite.rgb, 0.0 );


	vec4 normal = texture2D(SAMPLER1, texcoord0 );

	normal.xyz = normalize( normal.xyz );


    vec3 viewPos = get_eye_pos( texcoord0 );

    vec3 worldPos = vec3( vec4( viewPos, 1.0 ) * _viewinv );
    
    vec3 jitter = hash3( worldPos ) * normal.a * normal.a;


    vec3 reflected = reflect( normalize( viewPos ), normal.xyz );

    vec3 hitPos = viewPos;

	vec2 coords = raycast( jitter + reflected * max( -viewPos.z, minStep ), hitPos );

	if( coords.xy == vec2( -1.0 ) )
	{ return; }


	float D = length( viewPos - hitPos );

	float A = ( 1.0 / ( 1.0 + ( D * D ) ) );

    vec2 dCoords = smoothstep( 0.2, 0.6, abs( vec2(0.5) - coords.xy ) );

    float edge_factor = clamp( 1.0 - ( dCoords.x + dCoords.y ), 0.0, 1.0 );

	//  * -reflected.z
	float mul = clamp( A * falloff, 0.0, 1.0 ) * composite.a * edge_factor;

    gl_FragColor = vec4( texture2D( SAMPLER0, coords.xy ).rgb,
    					 mul );
}
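To make the ray march + binary search pair easier to follow outside GLSL, here's a small Python sketch of the same logic against a hypothetical depth lookup (a flat wall at eye-space z = -4); the step, count and bias values are arbitrary stand-ins for the uniforms above:

```python
def scene_depth(p):
    # Hypothetical stand-in for get_eye_depth():
    # a flat wall at eye-space z = -4 everywhere.
    return -4.0

def binary_search(d, hit, steps=8):
    # Mirrors binarySearch(): halve the step and move toward the surface.
    for _ in range(steps):
        delta = hit[2] - scene_depth(hit)
        d = [c * 0.5 for c in d]
        sign = 1.0 if delta > 0.0 else -1.0
        hit = [h + sign * c for h, c in zip(hit, d)]
    return hit

def raymarch(origin, direction, step=0.1, count=64, bias=0.05):
    # Mirrors raycast(): march until the ray lands behind the depth
    # buffer, then refine the hit point with the binary search.
    d = [c * step for c in direction]
    hit = list(origin)
    for _ in range(count):
        hit = [h + c for h, c in zip(hit, d)]
        delta = hit[2] - scene_depth(hit)
        if (d[2] - delta) < bias and delta < 0.0:
            return binary_search(d, hit)
    return None
```

The shader does exactly this in view space, except the "depth lookup" is a projected texture fetch and a miss returns (-1, -1) instead of None.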




nrgmaster
Re: ShowRoom
Posted on: Mar. 25 2021 09:43 pm
Current numbers from the profiler at (about) 1080p:



nrgmaster
Re: ShowRoom
Posted on: Apr. 03 2021 01:32 am
Wrapping up more rooms, I spent quite a bit of time working on the Miscellaneous Object room. That room demonstrates the possible usage of the Simple Text Object, the Route Object (path and navigation) and the Empty Object.

I ended up adding a bit more for Text, since its usage was (I feel) pretty limited. Offset and vertex color can now be set on a per-character basis, allowing more flexibility and new ranges of effects. In addition, a new variable is available to control the amount of characters to be displayed. You can either manually assign a curve to control the property or control it from code. This way you can create "conversation"-like dialogs and more...
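The character-count idea is simple enough to sketch outside the engine. Here's a hedged Python illustration (names are mine, not the actual API) of driving the visible character count from a normalized time and an optional easing curve:

```python
def visible_text(text, t, duration, curve=lambda u: u):
    # curve maps normalized time [0, 1] to the fraction of characters
    # shown; the default is linear, but any easing (or an editor-authored
    # curve) fits here.
    u = min(max(t / duration, 0.0), 1.0)
    count = int(round(curve(u) * len(text)))
    return text[:count]
```

Calling this every frame with an increasing `t` produces the classic "typewriter" dialog effect; swapping the curve changes the pacing without touching the text.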


For the Route, I've coded the orange bot to follow a predefined path imported from DAE. And the blue bot has a simple AI script attached that waits for a few seconds, then finds its way through the navigation map to the active camera location.


For Empties, there's not much that can be done except to display that it is a transformation in space. However, the "Misc." scene has other Empties around to track the user location as well as the end target of the blue bot.


Once the Object ShowRooms are out, check them out. I tried to keep things as simple as possible; to get access to a specific "feature" or ... simply click on the Object while in edit mode.


nrgmaster
Re: ShowRoom
Posted on: Apr. 03 2021 07:22 pm
The Ray Object room is done; here's a few shots of the room overview.


For each type of Ray, the mesh on the stand scans in a random direction and displays the closest hit for the Scene Geometry, World Physics and Navigation Map. Here's the info that you can expect to get when hitting geometry:


When hitting Physics Object(s):


And finally, hitting the collision Navigation Map:



Updates have been made to the StandardMesh rooms (based on user comments) to display the triangle count for each LOD or Adaptable mesh, scaling with the LOD currently displayed. Here are a few shots of the added feature; like this it is much clearer, on top of the color change based on the currently selected LOD. First LOD:


Second LOD:


Third LOD:

etc...

I've also added multiple knobs to tweak the final output. These knobs control the brightness, contrast, exposure, gamma, filmic grain, saturation and the amount of vignette applied on screen.
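For a feel of how such knobs typically compose, here's a rough Python sketch of a per-pixel grading chain (my own illustration; the engine's actual order of operations may differ, and filmic grain is left out since it needs a noise source):

```python
def grade(rgb, exposure=1.0, brightness=0.0, contrast=1.0,
          gamma=2.2, saturation=1.0, vignette=0.0, uv=(0.5, 0.5)):
    # One plausible ordering: exposure -> saturation -> contrast/brightness
    # -> vignette -> gamma encode.
    r, g, b = (c * exposure for c in rgb)
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b

    # Vignette darkens with squared distance from the screen centre.
    dx, dy = uv[0] - 0.5, uv[1] - 0.5
    vig = 1.0 - vignette * (dx * dx + dy * dy) * 4.0

    out = []
    for c in (r, g, b):
        c = luma + (c - luma) * saturation        # saturation toward luma
        c = (c - 0.5) * contrast + 0.5 + brightness
        c = max(c * vig, 0.0)
        out.append(c ** (1.0 / gamma))            # gamma encode
    return tuple(out)
```

With all knobs at their defaults and gamma set to 1.0 the function is an identity, which is a handy sanity check when wiring such a chain into a post-process shader.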



nrgmaster
Re: ShowRoom
Posted on: Apr. 05 2021 02:07 pm
The LightSource room is ready. Nothing too fancy; it uses the current data available when using LightSource Objects to draw punctual lights. All types of LightSource available within the NRGeditor are exposed and displayed here. You can learn how to use them by clicking around the room and checking their properties; remember also to check out the shaders attached to the materials connected around the room. By the way, the room is located on the west side of the FracturedMesh room...


World light (basically feeding it the 2nd-degree Spherical Harmonics data and using the world normal to compute the SH2 color).


Sun light (basically a directional light).


Lamp for omni-directional lighting.


Spot (cone of light).


At the back of the room I also placed a bunch of cool IES lights just to demonstrate their effects, and how easy it is to implement them by simply importing the .ies file into the NRGeditor.



nrgmaster
Re: ShowRoom
Posted on: Apr. 15 2021 06:27 am
A lot of bugs have been fixed, and the API as well as the editor are more and more performant and stable. Working on the Probe room, every time I passed in front of that column down there I kept seeing flickers regardless of the FXAA method I used (either 2 or 3 with high settings). It had been on the TODO list for a while, but this time that was it: I've implemented configurable multisample anti-aliasing for all FBO attachment types.


A single inexpensive "blit" is required after drawing to the MSAA buffers, and it can be done like this in a single script (either in a separate pass or at the end of another FBO pass).



function OnEnd()
	-- The destination framebuffer.
	local dst = event.object.data.camera.framebuffer

	-- The source (MSAA) framebuffer.
	local src = _G.activeCamera.framebuffer

	dst:BlitBegin()

	-- Resolve depth first; not linear for the depth attachment.
	dst:Blit( src, Graphics.ClearBuffer.fDepthBufferBit, 0, 0, false )

	-- Resolve each color attachment with linear filtering.
	for i = 0, 3 do
		dst:Blit( src, Graphics.ClearBuffer.fColorBufferBit, i, i, true )
	end

	dst:BlitEnd()
end



It took me a few days to finish the Probe Object room, since I refactored the whole Probe API to support projected textures. Like that, it can be used to create projector effects, decals and point light shadows, to give a few examples.
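Under the hood, projecting a texture from a Probe boils down to transforming a world position by the probe's view-projection matrix and remapping the result to texture space. Here's a small Python sketch of that mapping (my own illustration; the function name and row-major matrix convention are assumptions, not the actual API):

```python
def project_uv(world_pos, view_proj):
    # view_proj: the probe's view-projection matrix, 4x4 row-major.
    x, y, z = world_pos
    v = [sum(row[i] * p for i, p in enumerate((x, y, z, 1.0)))
         for row in view_proj]
    if v[3] <= 0.0:
        return None                      # point is behind the projector
    # Perspective divide, then remap NDC [-1, 1] to texture space [0, 1].
    return (v[0] / v[3] * 0.5 + 0.5,
            v[1] / v[3] * 0.5 + 0.5)
```

In a shader this is the usual "multiply by the light matrix, divide by w, bias by 0.5" dance; the behind-the-projector check is what keeps decals and projectors from painting backwards.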

For the render-to-texture functionality of the Probe Objects, I have a drone in the SoundSource room (which is just nearby) that captures the scene offscreen from the drone's point of view and displays it on a screen in the room.


For the render-to-texture cubemap, I simply render the world-space normals of the room; and for the projected cubemap functionality, I fake it by projecting a star field rotating like at the planetarium (kinda cheap, I know, but I wanted to keep things simple and just show the potential).

Projected Planar and 2D Render to Texture (rendering the SoundSource room from the drone POV):


Projected Cubemap (basically just a simple cubemap lookup; running out of ideas...):


Cubemap Render to Texture (world normals of the Probe Room):
