Potential bug WebGL RenderTexture bound to texture source
Hi there,
I'm developing my website using Noesis in Unity, and one of the things I'm trying to achieve is to draw a shader as the application background.
The approach is simple. On the Unity side, I'm calling:

Code:
Graphics.Blit(null, RenderTexture, mat);

where mat is a material with my custom shader.
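For context, a call like this typically runs once per frame from a small driver component. A hypothetical sketch (the component and field names are mine, not from the project):

```csharp
// Hypothetical sketch of the Unity-side setup; names are illustrative.
using UnityEngine;

public class ShaderBackground : MonoBehaviour
{
    public RenderTexture target; // the RenderTexture later bound in Noesis
    public Material mat;         // material using the custom background shader

    void Update()
    {
        // Render the full-screen shader into the RenderTexture every frame.
        // A null source means the shader only uses its own inputs.
        Graphics.Blit(null, target, mat);
    }
}
```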
On the Noesis side, I'm binding the "Source" property to a TextureSource exposed through the DataContext. Here's what the DataContext side looks like:
Code:
public TextureSource RenderTextureShader
{
    get
    {
        // Access the render texture through a singleton (which is not null);
        // I explicitly specified the script execution order to ensure this
        var rt = AppServices.Instance.ShaderBackgroundRT;
        var tex = Noesis.Texture.WrapTexture(rt);
        return new TextureSource(tex);
    }
}
This works perfectly in the editor, but when I build for WebGL it only works on the integrated GPU. When I turn on hardware acceleration in the browser (I tested Chrome, Firefox, and Edge; the behaviour is the same), the texture appears transparent: I see the background color of the Grid that contains the Image, even though the shader always returns an alpha of 1.0.
To be sure it wasn't a problem on the Unity side, I displayed the render texture with an unlit shader on a quad, and it shows up correctly in all scenarios.
I thought it might be the Graphics API not behaving consistently between integrated and dedicated GPUs, so I switched to WebGL 1 (GL ES 2.0), but the issue is the same.
To summarize:
- The RenderTexture appears transparent
- Works fine in the editor and on integrated GPUs, fails on dedicated GPUs
- Same behaviour in Edge, Firefox, and Chrome
- Same behaviour on two PCs (Intel/NVIDIA)
- Same behaviour on Unity 2019.3 and 2019.4
- Same behaviour on WebGL 1.0 and 2.0
- No error/warning message in the console
How could I fix this?
Thanks in advance!
Re: Potential bug WebGL RenderTexture bound to texture source
This is weird. Are you sure the texture doesn't contain pixels with alpha < 1.0? As a test, could you disable the Blit and instead use a static texture with the same content (grabbing the output of your shader into a texture at editor time)?
By the way, in the latest version of Noesis you can directly expose Unity textures in the data context.
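A minimal sketch of that newer approach, assuming a NoesisGUI version whose TextureSource constructor accepts a UnityEngine.Texture directly (the class and property names are illustrative, not from the project):

```csharp
// Sketch only: assumes recent NoesisGUI versions where TextureSource can
// wrap a UnityEngine.Texture directly, without Noesis.Texture.WrapTexture.
public class BackgroundViewModel
{
    readonly UnityEngine.RenderTexture _rt;

    public BackgroundViewModel(UnityEngine.RenderTexture rt)
    {
        _rt = rt;
    }

    // Bound from XAML: <Image Source="{Binding Background}"/>
    public Noesis.TextureSource Background => new Noesis.TextureSource(_rt);
}
```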
Re: Potential bug WebGL RenderTexture bound to texture source
I've also seen a warning in the browser console which I didn't notice at first; it only appears when the rendering fails, so the issue is almost certainly there (the warning and the offending code are quoted below). First, here's the fragment code:
Code:
fixed4 frag (v2f i) : SV_Target
{
    fixed2 uv = i.uv;
    fixed3 col;
    uv -= fixed2(0.5, 0.5);
    uv *= 5.0;
    col = rdr(uv);
    return fixed4(col.x, col.y, col.z, 1.0);
}
413161aa-968e-457e-8e89-e89e49cd241e:8 WebGL: INVALID_ENUM: getInternalformatParameter: invalid internalformat
_glGetInternalformativ @ 413161aa-968e-457e-8e89-e89e49cd241e:8
Binding to a RenderTexture without the Blit results in the same behaviour (same warning as above).
Binding to a regular Unity texture throws the same warning, but the texture shows.
Binding to the RenderTexture directly does the same thing, but thanks for the tip!
The culprit of the warning is here (as reported by the browser), but I don't know if it's Emscripten-generated code or Noesis land:
Code:
function _glGetInternalformativ(target, internalformat, pname, bufSize, params) {
    if (bufSize < 0) {
        GL.recordError(1281); // GL_INVALID_VALUE
        return;
    }
    // This is the call that causes the warning
    var samples = GLctx["getInternalformatParameter"](target, internalformat, 32937 /* GL_SAMPLES */);
    if (!samples) {
        GL.recordError(1280); // GL_INVALID_ENUM
        return;
    }
    switch (pname) {
    case 32937:
        ...
Re: Potential bug WebGL RenderTexture bound to texture source
Do you know what format this texture is?
Re: Potential bug WebGL RenderTexture bound to texture source
The RenderTexture is R8G8B818_UNorm.
The Texture is RGB8 UNorm.
I don't know what this specific value means but I could investigate if it's what you're asking.
Re: Potential bug WebGL RenderTexture bound to texture source
Formats without alpha have been problematic in the past. Could you try the same format but with alpha (like R8G8B8A8 or something similar)?
Re: Potential bug WebGL RenderTexture bound to texture source
Whoops, that was a typo on my part: there is alpha, the RenderTexture's format is R8G8B8A8_UNorm.
Re: Potential bug WebGL RenderTexture bound to texture source
Oops, yes, sorry. What about the texture?
Re: Potential bug WebGL RenderTexture bound to texture source
The texture is RGB8 UNorm, as reported by the Unity inspector.
I did a test that fixed the issue: I hadn't noticed that the Depth buffer was enabled for the RenderTexture. Setting it to "No depth buffer" made it work, even though the warning is still there.
I don't know what's going on, since I'm not writing to the depth buffer at all in the shader. Do you use depth information to display the textures?
Re: Potential bug WebGL RenderTexture bound to texture source
When Noesis is rendered into a RenderTexture, you need to enable the Depth buffer because that also enables the Stencil buffer, and we use stencil for applying masks.
But in this case, yes, you don't need it at all. I'm not sure what Unity is doing when sampling a RenderTexture with depth, but that is definitely what is causing the error.
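The two configurations discussed in this thread can be sketched when creating the RenderTexture in code. This is a hedged sketch (sizes and method names are illustrative); the depth parameter follows Unity's RenderTexture constructor, where 24 bits of depth also provide a stencil buffer:

```csharp
using UnityEngine;

public static class RenderTextureSetup
{
    // Case 1: the RenderTexture is only sampled by Noesis (this thread's
    // scenario). No depth buffer is needed, which is what fixed the
    // transparent-texture issue here.
    public static RenderTexture CreateSampledOnly(int width, int height)
        => new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32);

    // Case 2: Noesis itself renders INTO the RenderTexture. The 24-bit
    // depth buffer also provides the stencil buffer Noesis uses for masks.
    public static RenderTexture CreateNoesisTarget(int width, int height)
        => new RenderTexture(width, height, 24, RenderTextureFormat.ARGB32);
}
```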