Nir Hasson
Topic Author
Posts: 53
Joined: 10 Nov 2013, 21:20
Contact:

Video Rendering

09 Jan 2014, 16:21

Hi,

I would like to integrate my video encoder system with NoesisGUI.
I already managed to generate static images content at runtime.
For now I want to keep an image source updated from the video stream.
What is the best way to achieve that goal? Can I hint to the image source object that it should handle dynamic updates? I guess that somewhere in the pipeline the low-level texture resource should be created with dynamic usage flags for best performance.

Thanks,
 
jsantos
Site Admin
Posts: 2899
Joined: 20 Jan 2012, 17:18

Re: Video Rendering

10 Jan 2014, 18:51

Hi Nir,

You are right, using an image source is quite inefficient (a texture is created for each frame). The best way is to create the texture yourself (you are in charge of updating it) and use a texture source. TextureSource needs an ITexture2D. You can wrap an ITexture2D around a DirectX or OpenGL handle:
DX9Texture2D::DX9Texture2D(NsSize width, NsSize height, NsSize levels, IDirect3DBaseTexture9* texture);
GLTexture2D::GLTexture2D(NsSize width, NsSize height, NsSize levels, GLuint handle);
The problem with this approach is that you have to handle multiple platforms in your code. Another alternative would be using our implementation of DynamicTextures, but that API is meant to be used only by our render thread. We would have to investigate how to expose it properly.
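A rough sketch of this approach under D3D9 (not compilable standalone; the device handling and the frame-update loop are assumptions, only the DX9Texture2D constructor and TextureSource come from the posts above):

```cpp
// Sketch: create the native texture yourself, with dynamic usage so it can
// be updated every frame (assumes a D3D9 device 'device' already exists)
IDirect3DTexture9* d3dTex = 0;
device->CreateTexture(width, height, 1, D3DUSAGE_DYNAMIC, D3DFMT_A8R8G8B8,
    D3DPOOL_DEFAULT, &d3dTex, 0);

// Wrap the native handle so noesisGUI can use it (1 mip level)
Ptr<ITexture2D> wrapped = *new DX9Texture2D(width, height, 1, d3dTex);

// Expose it to the UI through a TextureSource and assign it to an Image;
// from here on only the native texture needs to be updated each frame
// (LockRect/UnlockRect on the D3D9 side)
Ptr<TextureSource> source = *new TextureSource(wrapped.GetPtr());
```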
 
Nir Hasson
Topic Author

Re: Video Rendering

12 Jan 2014, 20:38

Just want to be sure that I understand you -
In order to achieve a dynamic texture effect I have to do the following:
1. Create the low level GPU resource myself (D3D9, GL, etc.)
2. Create an instance of Noesis::Render::DX9Texture2D or Noesis::Render::GLTexture2D (depending on the rendering API in use), wrapping the native resource created in step 1.
3. Manage the updates of the low level texture resource.
4. Use the created Noesis::Render::ITexture2D as an image source.

Is that right?

I can also render the entire video myself and place it behind a placeholder element in the Noesis GUI, if I can get the element's actual screen position.
How can I get the actual screen position of an element? Can I use VisualTreeHelper::GetContentBounds to do that?

I think exposing your dynamic texture would be great. You could expose some kind of Lock/Unlock methods.
As for the rendering thread, do you access GPU resources on a separate thread? I think this is only allowed if the device was created with the special multithread flag under the D3D9 render system.

Thanks.
 
jsantos
Site Admin

Re: Video Rendering

13 Jan 2014, 21:05

> Just want to be sure that I understand you -
> In order to achieve a dynamic texture effect I have to do the following:
> 1. Create the low level GPU resource myself (D3D9, GL, etc.)
> 2. Create an instance of Noesis::Render::DX9Texture2D or Noesis::Render::GLTexture2D (depending on the rendering API in use), wrapping the native resource created before.
> 3. Manage the updates of the low level texture resource.
> 4. Use the created Noesis::Render::ITexture2D as an image source.
>
> Is that right?
Yes, that's right.
> I can also render the entire video myself and place it behind a placeholder element if I can get its actual screen position.
> How can I get the actual screen position of an element? Can I use VisualTreeHelper::GetContentBounds to do that?
Yes, you can use this alternative, although it is quite limited: you can't overlap the video with another element, can't alpha blend the video, etc.

The origin of an element in screen coordinates can be calculated using:
visual->PointToScreen(Point(0,0));
> I think exposing your dynamic texture would be great. You could expose some kind of Lock/Unlock methods.
> As for the rendering thread, do you access GPU resources on a separate thread? I think this is only allowed if the device was created with the special multithread flag under the D3D9 render system.
Our architecture allows having IRenderer::Update and IRenderer::Render in separate threads. That is optional for the client; we guarantee that the GPU is only accessed in the Render method of the IRenderer interface. Our frameworks (XamlPlayer, Launcher, etc.) are organized that way (you can view the timing of each job by hitting F2 in XamlPlayer).

Having said that, we already have an API for dynamic texture:
NS_INTERFACE IRenderSystem: public Core::Interface
{
    /// Creates a dynamic 2D texture
    /// \param width Width in pixels of the texture
    /// \param height Height in pixels of the texture
    /// \param format Member of the SurfaceFormat enumerated type
    virtual Ptr<ITexture2D> CreateDynamicTexture(NsSize width, NsSize height,
        SurfaceFormat format) = 0;
};

NS_INTERFACE IRenderContext: public Core::Interface
{
    /// Locks a dynamic texture and returns a pointer to the locked region. This pointer
    /// is writable until UnlockTexture() is called
    virtual void* LockTexture(ITexture2D* texture) = 0;

    /// Unlocks a previously locked dynamic texture
    virtual void UnlockTexture(ITexture2D* texture) = 0;
};
Although it is not very efficient: when you lock the texture, the memory is copied to an internal buffer (held in the command buffer) that is used later in the render job to do the real lock on the GPU. If you are not doing the render in a separate thread, this is unnecessary overhead. A better API would be to pass a delegate in charge of filling the texture, invoked directly from the render job.
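Assuming these interfaces were exposed to client code (the post notes they are currently internal to the render thread), per-frame usage might look like this sketch; renderSystem, renderContext and decodedFrame are hypothetical names, not part of the quoted API:

```cpp
// Sketch (not compilable standalone): upload one decoded RGBA video frame
// through the dynamic texture API quoted above. The exact SurfaceFormat
// member name is an assumption.
Ptr<ITexture2D> tex = renderSystem->CreateDynamicTexture(width, height,
    SurfaceFormat_RGBA8);

// Each frame: lock, copy the decoded pixels, unlock
void* dst = renderContext->LockTexture(tex.GetPtr());
memcpy(dst, decodedFrame, width * height * 4); // assumes tightly packed RGBA
renderContext->UnlockTexture(tex.GetPtr());
```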

Instead of going directly through the IRenderSystem, I think a better approach would be creating a new UserControl that updates the texture inside its render method. But we would need to extend the API for this case too... if this is critical for you, please file a bug and we will try to prioritize it.

For now, I would try the first approach I described. If it works, we can start improving the solution.
 
Nir Hasson
Topic Author

Re: Video Rendering

16 Jan 2014, 13:27

Thanks for the detailed reply and the clarifications about threading usage.

One important thing I forgot to mention is that our current video solution supports different color spaces: the regular RGBA space and the YCbCrA color space.
The regular case will work with the suggested solutions, but it is not as optimized as the YCbCr path, since the entire decoding is performed on the CPU.

However, when working with the YCbCrA color space, the video system holds up to 4 different texture sources, one for each channel, and the actual color decoding is performed on the GPU.
In that case, integrating a video with Noesis GUI will require additional low-level intervention at the shader level, and I'm not sure how it can be implemented while keeping your clean design approach.
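For reference, the per-pixel decode such a shader performs can be illustrated with a plain CPU-side conversion. This is a generic BT.601 video-range conversion, not code from either system:

```cpp
#include <algorithm>
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Round and clamp a float to the 0..255 byte range
static uint8_t clamp8(float v)
{
    return static_cast<uint8_t>(std::min(255.0f, std::max(0.0f, v + 0.5f)));
}

// Convert one video-range BT.601 YCbCr sample to RGB.
// On the GPU this same arithmetic runs in a pixel shader, sampling the
// per-channel textures described above.
Rgb YCbCrToRgb(uint8_t y, uint8_t cb, uint8_t cr)
{
    const float yf = 1.164f * (y - 16);
    const float cbf = cb - 128.0f;
    const float crf = cr - 128.0f;
    return Rgb{
        clamp8(yf + 1.596f * crf),
        clamp8(yf - 0.392f * cbf - 0.813f * crf),
        clamp8(yf + 2.017f * cbf)
    };
}
```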

For now I use videos mainly as backgrounds, so either way will work.
In the future videos might be integrated at the UI level, and I'll wake up this thread again if needed.

Thanks again for your help :)
 
jsantos
Site Admin

Re: Video Rendering

17 Jan 2014, 00:56

> However, when working with the YCbCrA color space, the video system holds up to 4 different texture sources, one for each channel, and the actual color decoding is performed on the GPU.
> In that case, integrating a video with Noesis GUI will require additional low-level intervention at the shader level, and I'm not sure how it can be implemented while keeping your clean design approach.
We integrated a plugin in Unity that worked that way. The way it is done in Unity is very similar to the first solution I gave you: we pass the low-level handle of the texture from Unity to noesisGUI. That way you can do whatever you want to generate that texture.

I don't know how this could be integrated into our architecture. We will investigate it. Apart from that, one of the things we would like to implement in the future is support for video playback.
 
Nir Hasson
Topic Author

Re: Video Rendering

17 Jan 2014, 11:00

I think that supporting video playback through some of the most common third-party in-game video libraries would be the best solution.

That way the application developer will have full control over video playback and specific needs (threading, alpha channel, audio playback, etc.), and you will keep the details of the video implementation out of your scope.

I personally use OrenVideo, a new video solution with high performance and native Alpha channel support.
http://www.orenvid.com/

And of course there is Bink out there.
http://www.radgametools.com/

I'm not so familiar with Unity, but as you said, you have already integrated a video plugin for it.
 
jsantos
Site Admin

Re: Video Rendering

17 Jan 2014, 15:06

Thanks for those links Nir!

The problem with those solutions is the cost for the final user of noesisGUI: we are not yet strong enough to negotiate a license to integrate them into our product. Anyway, we will evaluate all the options whenever we have time for this.
 
Nir Hasson
Topic Author

Re: Video Rendering

18 May 2014, 14:15

Progress Update:

Working on some additional video capabilities, I had to implement the deep integration method proposed above using the Noesis texture interfaces.
When I used Noesis::Render::DX9Texture2D as the input to a texture source, the displayed image was flipped on the vertical axis.
I managed to overcome this issue by creating my own class derived from Noesis::Render::DX9Texture2D and overriding the IsInverted method to return false there.

I think the IsInverted implementation should always return false at the Noesis::Render::DX9Texture2D level, and that users should create custom texture classes only in special cases.
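For completeness, if overriding IsInverted were not an option, the same visual result could be obtained by flipping the CPU-side frame before uploading it. A generic sketch, unrelated to the Noesis API itself:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Flip a tightly packed RGBA image vertically, in place, by swapping rows
void FlipVertical(std::vector<uint8_t>& pixels, int width, int height)
{
    const int stride = width * 4;      // bytes per row (RGBA)
    std::vector<uint8_t> tmp(stride);  // scratch row for the swap
    for (int top = 0, bottom = height - 1; top < bottom; ++top, --bottom)
    {
        uint8_t* a = pixels.data() + top * stride;
        uint8_t* b = pixels.data() + bottom * stride;
        std::memcpy(tmp.data(), a, stride);
        std::memcpy(a, b, stride);
        std::memcpy(b, tmp.data(), stride);
    }
}
```

This costs an extra pass over every frame, so the IsInverted override (or a flip in the shader) is the cheaper option.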
 
jsantos
Site Admin

Re: Video Rendering

19 May 2014, 20:02

I found this in the implementation:
NsBool DX9Texture2D::IsInverted() const
{
    // For now, external handles always come from Unity. And Unity textures are inverted
    return mIsExternal;
}
mIsExternal is true when the handle is not created by us (exactly your scenario), and in that case we assume the texture is inverted. This is wrong; we should have a parameter indicating it.

Could you please file a bug about this?

Thanks!
