- KeldorKatarn
- Posts: 237
[Unity] A couple of questions
I already posted this in the Unity forums thread, but since the response is probably also interesting to everybody here... here it goes:
I've got a couple of questions: What are XAML files exactly in the Unity world? Are they some kind of custom asset that sits in the Assets folder and can be loaded by your panel component? If so, can those XAML files be dynamically replaced at runtime, even with XAML files loaded from the user's hard disk?
The use case would be that my main XAML has a ContentControl somewhere that is data-bound to a UserControl that I specify, and that a modder could potentially replace in a user mod directory to change the look of the UI.
Another thing: you write in your docs that images are not yet treated as resources the way fonts are. What if an image is data-bound? Let's say my XAML has a data binding for the image source path that is bound to a string property in my ViewModel. Can I still specify any path on the user's hard disk, or even on the web, to make the XAML dynamically load that image and display it?
Use cases would again be modding, and/or the user being able to use a custom avatar image or a squadron badge or stuff like that.
The new Unity GUI comes with the possibility to assign shaders to any sprite control. That makes me ask: Is there any control you have that could render something with a custom Unity shader? A canvas control maybe? Or could I even assign a shader to the entire UI, or to any control, for post-processing? Unity showed effects like refraction and lighting-influenced shaders in their GUI demos. How easy, or possible at all, would that be with your XAML solution?
And finally, can I render any Unity mesh in your UI? From what I can remember you can render 3D meshes in WPF, although I have never done that. Is that possible in your UI, and how would one go about it?
What about dialog/popup windows, is that possible?
How does the UI handle mouse clicks? Does it prevent them from trickling through? Meaning: if I click on a control that occludes a GameObject in 3D space, will that 3D object still get a MouseDown event, or will the GUI correctly swallow it? If so, based on what 'shape', the bounding rect of the control?
Thanks
Re: [Unity] A couple of questions
Thanks for replicating the questions in our forum. I think it is better to discuss these things here.
I've got a couple of questions: What are XAML files exactly in the Unity world? Are they some kind of custom asset that sits in the Assets folder and can be loaded by your panel component? If so, can those XAML files be dynamically replaced at runtime, even with XAML files loaded from the user's hard disk?
The correct way to do this would be to create a new kind of asset in Unity for the XAMLs. But that is not allowed in Unity; asset types are hard-coded. What we do is keep the .xaml files in the project as normal files and detect changes with an AssetPostprocessor that copies the optimized XAML (a binary version) to the /StreamingAssets folder.
This implies that XAMLs cannot be loaded at runtime, because the parser is not available in the game. This is usually not a problem, because everything that can be done in a XAML can be replicated in code.
The use case would be that my main XAML has a ContentControl somewhere that is data-bound to a UserControl that I specify, and that a modder could potentially replace in a user mod directory to change the look of the UI.
As long as the mod is done inside Unity, that is no problem. The preprocessed XAML (stored in the StreamingAssets folder) is what should be distributed, and it is the part that your game will understand.
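A minimal sketch of the setup being discussed (the property and control names are illustrative, not taken from any real project or the Noesis API):

```xml
<!-- Main XAML: the skinnable region is just a ContentControl bound to a
     ViewModel property that exposes whichever UserControl is active. -->
<Grid xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation">
    <ContentControl Content="{Binding HudSkin}"/>
</Grid>
```

In the ViewModel, a hypothetical `HudSkin` property would hold an instance of the default UserControl, and a mod loader could assign a replacement UserControl at startup; the binding takes care of swapping the visuals.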
Another thing: you write in your docs that images are not yet treated as resources the way fonts are. What if an image is data-bound? Let's say my XAML has a data binding for the image source path that is bound to a string property in my ViewModel. Can I still specify any path on the user's hard disk, or even on the web, to make the XAML dynamically load that image and display it?
Use cases would again be modding, and/or the user being able to use a custom avatar image or a squadron badge or stuff like that.
This is already being done by several games (image modding). In Unity there is an ImageSource that allows you to set a Unity Texture to be displayed by the XAML. It is explained at the end of this document: http://www.noesisengine.com/docs/Gui.Co ... orial.html
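A rough, non-runnable sketch of what the avatar use case could look like: load a user-supplied image from disk at runtime and hand it to the XAML through a bound property. `TextureSource` here is a stand-in name for whatever ImageSource wrapper the Noesis Unity integration actually provides; check the document linked above for the real type.

```csharp
using System.IO;
using UnityEngine;

public class AvatarViewModel
{
    // Bound from XAML, e.g.: <Image Source="{Binding Avatar}"/>
    public object Avatar { get; private set; }

    public void LoadAvatar(string path)
    {
        // e.g. a PNG dropped into a mod or user-data folder
        byte[] bytes = File.ReadAllBytes(path);

        // Unity can decode PNG/JPG bytes into a texture at runtime
        var tex = new Texture2D(2, 2);
        tex.LoadImage(bytes);

        // Hypothetical Noesis wrapper exposing the Unity texture to XAML
        Avatar = new TextureSource(tex);
    }
}
```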
The new Unity GUI comes with the possibility to assign shaders to any sprite control. That makes me ask: Is there any control you have that could render something with a custom Unity shader? A canvas control maybe? Or could I even assign a shader to the entire UI, or to any control, for post-processing? Unity showed effects like refraction and lighting-influenced shaders in their GUI demos. How easy, or possible at all, would that be with your XAML solution?
For now this is not possible. You cannot assign custom shaders to XAML (though custom effects are on the roadmap). However, screen post-process effects done by Unity do affect our GUI. You can also render our GUI to a texture and then use it as a normal sprite.
And finally, can I render any Unity mesh in your UI? From what I can remember you can render 3D meshes in WPF, although I have never done that. Is that possible in your UI, and how would one go about it?
Yes, you can render Unity3D cameras inside a panel. The trick is using a Render Texture as an intermediate step. It is shown here: viewtopic.php?f=12&t=294
What about dialog/popup windows, is that possible?
Yes, you can create dialogs; "virtual dialogs", because they are rendered inside the main window created by the Unity player.
How does the UI handle mouse clicks? Does it prevent them from trickling through? Meaning: if I click on a control that occludes a GameObject in 3D space, will that 3D object still get a MouseDown event, or will the GUI correctly swallow it? If so, based on what 'shape', the bounding rect of the control?
This is handled by using Hit Testing. It is explained in this tutorial (http://www.noesisengine.com/docs/Gui.Co ... orial.html), in the HitTest section.
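A sketch of the pattern the tutorial describes: gate 3D picking on a UI hit test. The actual NoesisGUI hit-test call is not shown here; `GuiOccludes` is a placeholder to be filled in from the HitTest section of the tutorial.

```csharp
using UnityEngine;

public class ScenePicker : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        Vector2 pos = Input.mousePosition;
        if (GuiOccludes(pos))
            return; // the GUI swallowed the click

        // Only reaches the 3D scene when no UI element is under the cursor
        Ray ray = Camera.main.ScreenPointToRay(pos);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
            Debug.Log("Clicked 3D object: " + hit.collider.name);
    }

    bool GuiOccludes(Vector2 screenPos)
    {
        // Placeholder: ask the XAML view whether any visual element
        // lies under the given screen position (see the HitTest docs).
        return false;
    }
}
```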
- KeldorKatarn
- Posts: 237
Re: [Unity] A couple of questions
The use case would be that my main XAML has a ContentControl somewhere that is data-bound to a UserControl that I specify, and that a modder could potentially replace in a user mod directory to change the look of the UI.
As long as the mod is done inside Unity, that is no problem. The preprocessed XAML (stored in the StreamingAssets folder) is what should be distributed, and it is the part that your game will understand.
Trouble with this is that "done inside Unity" in this case means "done in Unity Pro, with a license of NoesisGUI installed, created as part of an asset bundle". And that's no longer 'modding'. Anything that cannot be done with Unity Free is no longer modding. No modder will buy a 1500-euro license plus a several-hundred-euro license for a GUI framework just to mod a UI.
It's really sad to realize that a moddable game with Unity is next to impossible to do...
How does the UI handle mouse clicks? Does it prevent them from trickling through? Meaning: if I click on a control that occludes a GameObject in 3D space, will that 3D object still get a MouseDown event, or will the GUI correctly swallow it? If so, based on what 'shape', the bounding rect of the control?
This is handled by using Hit Testing. It is explained in this tutorial (http://www.noesisengine.com/docs/Gui.Co ... orial.html), in the HitTest section.
Yes, I found that a bit after asking the question... That solution is far from ideal though, even though here the blame probably goes to Unity. Having to know about the XAML view in some part of the game code is not exactly clean separation of concerns. It's not exactly an MVVM violation, since technically anything derived from MonoBehaviour is also part of the view when sticking to that architecture, but still... Other UI frameworks seem to have this problem as well. It's a shame Unity apparently doesn't offer any API to let a UI framework swallow up those input events so they don't penetrate the UI as if it wasn't there. Having to put this hit test in dozens of classes is far from ideal and not what I call separation of concerns. In effect it means that every single GameObject that can potentially interact with the mouse needs to know about the XAML UI, which in turn means I have a dependency on the implementation of my UI in every single GameObject that is clickable... and if I choose to switch to a different UI I'm in a world of pain. This can be remedied to some degree by introducing the hit test as a service injected via dependency injection, but still... very ugly...
Re: [Unity] A couple of questions
Trouble with this is that "done inside Unity" in this case means "done in Unity Pro, with a license of NoesisGUI installed, created as part of an asset bundle". And that's no longer 'modding'. Anything that cannot be done with Unity Free is no longer modding. No modder will buy a 1500-euro license plus a several-hundred-euro license for a GUI framework just to mod a UI.
We provide the XAML buildtool inside our SDK. For now, this SDK can be downloaded if you bought noesisGUI in the Asset Store and verified your account here (sending us the invoice number).
It's really sad to realize that a modable game with Unity is next to impossible to do...
So I would say the current state of this is: modding can be done.
Yes, I found that a bit after asking the question... That solution is far from ideal though, even though here the blame probably goes to Unity. Having to know about the XAML view in some part of the game code is not exactly clean separation of concerns. It's not exactly an MVVM violation, since technically anything derived from MonoBehaviour is also part of the view when sticking to that architecture, but still... Other UI frameworks seem to have this problem as well. It's a shame Unity apparently doesn't offer any API to let a UI framework swallow up those input events so they don't penetrate the UI as if it wasn't there. Having to put this hit test in dozens of classes is far from ideal and not what I call separation of concerns. In effect it means that every single GameObject that can potentially interact with the mouse needs to know about the XAML UI, which in turn means I have a dependency on the implementation of my UI in every single GameObject that is clickable... and if I choose to switch to a different UI I'm in a world of pain. This can be remedied to some degree by introducing the hit test as a service injected via dependency injection, but still... very ugly...
Yeah, I agree with you. The solution is far from ideal. The problem is that EventType.Ignore only makes sense for Unity GUI, not for 3D objects. We are open to suggestions about this. For now, we didn't find a better solution...
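The dependency-injection remedy mentioned above could be sketched roughly like this (the interface and class names are made up for illustration, not part of any real API):

```csharp
// Hide the concrete UI framework behind a small service interface,
// so clickable GameObjects depend on an abstraction instead of on
// NoesisGUI directly.
public interface IUiHitTestService
{
    // True if the UI occludes the given screen position.
    bool BlocksPointer(float screenX, float screenY);
}

// The Noesis-backed implementation lives in exactly one place;
// switching UI frameworks later means replacing only this class.
public class NoesisHitTestService : IUiHitTestService
{
    public bool BlocksPointer(float screenX, float screenY)
    {
        // Placeholder for the actual NoesisGUI hit-test call.
        return false;
    }
}
```

Each clickable GameObject then receives an `IUiHitTestService` (via whatever DI container the project uses) and never references the UI framework itself, which confines the coupling the poster complains about to a single class.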
Thanks!
- jswigartPlayful
- Posts: 5
Re: [Unity] A couple of questions
Sorry to dig this old thread up, but it feels better than making a new thread since my question is related to this.
First off, I was wondering if the "moddable UI" situation has changed since this question was asked.
Secondly, I wanted to ask for clarity on one of the last responses.
We provide the XAML buildtool inside our SDK. For now, this SDK can be downloaded if you bought noesisGUI in the Asset Store and verified your account here (sending us the invoice number).
So I would say the current state of this is: modding can be done.
Are you saying that a licensee of Noesis can deploy a XAML buildtool with their shipped game, which can be invoked by the game on an end user's machine in order to facilitate dynamic compiling of XAML content to a Unity-usable form?
If so, what does this tool produce?
Re: [Unity] A couple of questions
The old Noesis buildtool is no longer used; it was deprecated a long time ago. Now we load XAML at runtime. This should cover all UI modding scenarios. But if you have any doubts, please download the latest version and let us know whether the current architecture is enough for your requirements. Thanks!
I am going to close this to avoid confusion. 😀