KeldorKatarn
Topic Author
Posts: 193
Joined: 30 May 2014, 10:26

Workflow to use Unity asset types in Blend unclear

23 Apr 2021, 19:44

I'm sorry for all my questions, but I'm really having a hard time understanding how to use Noesis in combination with Unity. The integration feels really rough to me.

Here's what I'm trying to do. I'm trying to create a scene that is basically a map editor for a 2D game. So I have a 2D hexagon grid as the normal Unity part of the scene which registers clicks and all that. That works fine.

Now... I use Noesis to create the UI; so far a very simple top bar with just a combobox and a togglebutton to start painting terrain. The painting itself is Unity, that's not important, and that part works anyway. I'm just writing this so you know what my use case is.

The parts that I cannot get to work are these:

A) Databinding

So I'm trying to select a TerrainType to paint from a ComboBox. TerrainType is a custom asset deriving from ScriptableObject. It has a property of type RandomTile, which derives from the Unity type Tile and has a List of Sprites that are assigned in the inspector; one is then selected randomly for display at each grid position where the terrain type is used.

What I want to do is create a DataTemplate for my combobox that shows the name of the asset, a simple string like "Forest" or "Hills"; I'm not worrying about localization right now.
And below that text I want the first of the list of Sprites that the TerrainType's Tile defines, or rather the texture of that sprite.

So I'm databinding like this:
<DataTemplate DataType="assets:TerrainType">
  <StackPanel>
    <TextBlock HorizontalAlignment="Center" Text="{Binding Name, Mode=OneWay}" />
    <Image HorizontalAlignment="Center" Source="{Binding Tile.Sprites[0].texture, Mode=OneWay}" />
  </StackPanel>
</DataTemplate>
Now this doesn't work. I'm assuming I'm correct in binding the image source to the texture and not to the sprite itself (which doesn't work either).
I'm reading that XAML assets need to have dependencies, and I'm not exactly sure how that is supposed to work. I cannot specify what the textures are: these assets are referenced by a Zenject dependency injection installer on another GameObject and injected into the viewmodel. Referencing them directly in the XAML asset defeats the purpose of layer separation; the view cannot know which textures it will have to display, that depends on the databinding.

The TerrainTypes ARE referenced in the scene on an object, but the XAML asset shows in the inspector that it doesn't know about any dependencies. At this point I have no clue how to get this to work. Manually specifying the dependencies is an absolute no-go; that defeats the entire purpose of using MVVM in the first place. I want the View to display a databound texture, no matter what it is. If that doesn't work, then I can forget using Noesis for this altogether.

B) Blendability

I still have a massive problem understanding how this Blend Workflow is supposed to work.
To make all this blendable I basically need to give every viewmodel an interface and create a completely separate DesignViewModel for the blend project.
So far so good, that's fine, I do that in WPF sometimes also, depending on how important design time is.

However I have ZERO clue how this is supposed to work in Unity. I am databinding against unity types. Against assets deriving from UnityEngine base classes.
That means the design ViewModels need to understand Unity, including every derived type I create. But if I include those files in the Blend project, I end up needing to reference tons of extensions I use in those files: attributes that get used in the inspector, other extensions I might use for Unity productivity purposes, logging stuff, etc.
I cannot simply wrap all that in #if NOESIS clauses, or my code will be unreadable and cluttered with UI-specific clauses in modules that have nothing to do with UI.

So how exactly am I supposed to make these design-time viewmodels in a WPF/Blend project? I don't understand how that is supposed to work.
For example, if I bind against Tile, a Unity base class, I cannot create an interface for it and replace it with a stub: the XAML will use that namespace in the DataTemplate as the DataType, so I cannot just fake it. It's even worse if I want to bind a texture. I cannot create a fake texture in a WPF application. How am I supposed to preview that at design time, for blendability?

I can see how Noesis works in the C# world, but I still don't understand how you do this together with Unity. A WPF application doesn't understand Unity types, and I have no idea how to write design-time data for my use cases, since I bind against those Unity types to display stuff...


I'd appreciate some insight here from people who do this extensively. The blendability workflow and the use of Unity types are the biggest hurdles for me so far.
That Noesis<->Unity integration point really gives me headaches and makes it hard for me to remain productive.
I'm trying to use Noesis because, frankly, the UI solutions from Unity itself have so far been one terrible choice after another. The newest UI Elements is, in my opinion, also a mess in terms of productivity, and XAML is far superior to their custom-XML-plus-custom-CSS approach, but that's beside the point.

But all the productivity and flexibility I was hoping to get out of this has so far not materialized. The XAML part is easy for me; I'm a senior WPF design engineer and I develop WPF controls and entire UIs for a living, so I know my way around that side of things, no problem. But as soon as the Unity integration gets involved, everything falls apart for me and I keep running into walls. So sorry if my posts sound frustrated. I think Noesis is an amazing thing, and Unity should have bought or contracted it instead of all their half-baked solutions. I just need help in understanding how this is supposed to smoothly integrate. To me it feels incredibly unintuitive, and I run into hurdles at almost every step. So far I've invested two weeks in trying to get a simple toolbar to work and setting up a basic application framework.

I appreciate any help someone can give me.
Last edited by KeldorKatarn on 26 Apr 2021, 19:31, edited 1 time in total.
 
jsantos
Site Admin
Posts: 3918
Joined: 20 Jan 2012, 17:18
Contact:

Re: Workflow to use Unity assets unclear

26 Apr 2021, 13:24

A) Databinding
I recreated your scenario as follows and it is working fine here. Please note that support for sprites is coming with NoesisGUI 3.1 (#1902), but in this case it works because we are just using the whole texture contained inside the sprite.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Tilemaps;

namespace Testing
{
    public class TerrainTile
    {
        public List<Sprite> Sprites { get; set; }
    }

    public class TerrainType : Tile
    {
        public string Name { get; set; }
        public TerrainTile Tile { get; set; }
    }
}
<Grid
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:assets="clr-namespace:Testing">
  <Grid.Resources>
    <DataTemplate DataType="assets:TerrainType">
      <StackPanel>
        <TextBlock HorizontalAlignment="Center" Text="{Binding Name, Mode=OneWay}" />
        <Image HorizontalAlignment="Center" Source="{Binding Tile.Sprites[0].texture, Mode=OneWay}" />
      </StackPanel>
    </DataTemplate>
  </Grid.Resources>
  <ContentControl Width="300" Height="300" Content="{Binding}"/>
</Grid>
using System.Collections.Generic;
using UnityEngine;

public class TileBindingBehavior : MonoBehaviour
{
    public Texture2D texture;

    void Start()
    {
        NoesisView view = GetComponent<NoesisView>();
        Testing.TerrainType vm = (Testing.TerrainType)ScriptableObject.CreateInstance(typeof(Testing.TerrainType));
        vm.Name = "Forest";
        vm.Tile = new Testing.TerrainTile
        {
            Sprites = new List<Sprite>()
            {
                Sprite.Create(texture, new Rect(0.0f, 0.0f, texture.width, texture.height), new Vector2())
            }
        };
        view.Content.DataContext = vm;
    }
}
In pure MVVM approaches like this you don't need to inject extra dependencies into XAML assets, as everything is handled by Unity.
B) Blendability
There are many ways to solve this. First, I recommend reading our Workflow Tutorial, because it gives a few important hints about using Design-Time DataContexts. The DataBinding sample is a good example of this pattern.

I know this is obvious, but let me stress it: you are not going to be able to render with Unity inside Blend. So you need a clear separation between both worlds. With Design-Time DataContexts you don't need #ifdef at all; in approaches where you share DataContexts, at least a few #ifdefs are going to be necessary to match types, for example: UnityEngine.Texture <-> ImageSource
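To make the Design-Time DataContext idea concrete, here is a minimal XAML sketch (the DesignTerrainViewModel name and the Testing namespace are hypothetical): the `d:` attributes are only seen by Blend, so Blend instantiates a design-only class while Unity assigns the real DataContext at runtime.

```xml
<Grid
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    xmlns:vm="clr-namespace:Testing"
    mc:Ignorable="d"
    d:DataContext="{d:DesignInstance Type=vm:DesignTerrainViewModel,
                                     IsDesignTimeCreatable=True}">
  <!-- Binds against whatever DataContext is active: the design instance
       inside Blend, or the real viewmodel set from a MonoBehaviour in Unity. -->
  <TextBlock Text="{Binding Name}" />
</Grid>
```

Because mc:Ignorable="d" strips the design-time attributes at runtime, the same XAML file loads unchanged in both worlds.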
>> However I have ZERO clue how this is supposed to work in Unity
It takes a bit of time to understand all the details of the interactions between Noesis, Unity, and Blend/WPF. Please be patient; this is already being used by many studios to create AAA games. And please help us with your feedback to make it better.

Thank you!
 
KeldorKatarn
Topic Author
Posts: 193
Joined: 30 May 2014, 10:26

Re: Workflow to use Unity assets unclear

26 Apr 2021, 18:32

>> I recreated your scenario as follows and it is working fine here. Please, note that support for sprites is coming with NoesisGUI 3.1 (#1902) but in this case it works because we are just using the whole texture contained inside the sprite.
That binding doesn't work for me. I bound to Tile.Sprites[0].texture just like you did, but nothing shows up in the combobox; the texture binding does not work for me, and the combobox item remains empty aside from the TextBlock. So there has to be something different here. The binding you're suggesting is exactly what I have, and it's not rendering the texture.
>> I know this is clear, but let me remark it: you are not going to be able to render with Unity inside Blend. So you need to have a clear separation between both worlds, with Design-Time DataContexts you don't need #ifdef at all. In approaches where you share DataContexts, then at least a few #ifdef are going to be necessary to match types, for example: UnityEngine.Texture <-> ImageSource
And this is exactly where my problem lies. Obviously I cannot render with Unity in WPF; my problem is how to avoid this. I know what a design-time DataContext is and how it works.
As I said, I do this for a living every day. However, I do not understand how to avoid my data types.

The problem with the DataTemplate is not the DataContext; the problems arise with the dataTYPEs.

Let's say I write a Unity ViewModel which exposes certain Unity-specific types that I databind against. Let's say I want to display a Vector3Int. Or a Texture. Or a custom type derived from a Unity base class like ScriptableObject.

I'd need a datatemplate like this:
<DataTemplate DataType="{x:Type unity:Vector3Int}" />
That doesn't compile, since those types don't exist in the Blend world. Now of course I can create a design-time ViewModel that exposes something completely different, but that doesn't work for me. I need my design-time viewmodels to have the exact same interface, so I have both IntelliSense support and refactoring robustness. If that goes out the window, the entire point of Blend is lost. I need to be sure that whatever I see in Blend IntelliSense actually exists in the Unity world, not a workflow where I manually have to keep every viewmodel pair in sync.

Could I work around this? Sure. I could NOT share the same interface... or not put the interface in my project at all and instead replace it with a different one in the same namespace, or whatever.
Or I could implement a design-time type called Tile in the same namespace that just does something completely different. In my case I'd need a design-time TerrainType which exposes a property Tile, which has a type with a property Sprite, which is a type with a property texture, which somehow has a type that can work as an image source.
That's a huge pain in the butt and can break at any time whenever their APIs or behavior run out of sync. Not to mention that I am basically faking half the Unity namespace just to work in Blend.

Also, as soon as I don't share the same identical interface, all refactoring capability goes out the window. IntelliSense is worthless, since I have to manually guarantee that my two viewmodels don't go out of sync. A bad workflow yet again, and a huge amount of stuff that can break without getting noticed.

At that point Blend has become completely useless, and I might as well just write the XAML and try it immediately in a Unity scene, where I can use replacement viewmodels that actually can contain the correct data types.

Do you understand my problem? I have ZERO issues understanding the Unity side of things and certainly no issues understanding the WPF side of things.
I just have absolutely no clue how to map the two, since by definition I cannot use the same data types. My issue is not replacing the DATA with design-time data; that's the normal MVVM workflow. I have an issue with replacing the data TYPES, the Unity-specific ones. In my opinion, if I cannot use those, or easily replace them with something that has the same interface and behavior and is refactoring-robust, then the entire Blend part of development becomes useless and just additional work that provides no benefit.

In my experience, most XAML is written by hand, not designed in Blend. In fact, I've never worked at a company that ever used Blend. Super-complicated shapes and animations are very rarely used, and even then we usually just have designers create those, export them to XAML, add them as resources, and keep working in a text editor.

The only benefit Blend or the Visual Studio designer can offer is a very quick preview of the changes made, plus runtime change support: I can change a width or a color and it immediately shows up in my running test application. That doesn't seem to work in Unity yet, so that costs time. Honestly, if Noesis had hot-reloading and I could just keep running my scene and live-preview my changes, I wouldn't need Blend at ALL. So far it provides just a major pain in the neck, and its only benefit is what a hot-reload in Unity would solve immediately.
In a Unity test scene I could use PROPER fake viewmodels that actually use the same data types and can use my asset classes. THAT would be a good workflow.

The time I lose is EXCLUSIVELY on this Blend workflow. I keep moving projects back and forth so they don't clutter my project folder; I had to move where Visual Studio puts the /obj/ folder, which only works by manually editing the project file; I have to recreate a ton of fake stuff to get even the smallest thing working; and I still run into problems all the time. Right now my Blend project is completely neglected and not synced anymore, because it simply doesn't compile. I specified the DataTemplate for my combobox to use the type "RandomTile", and that type is not known in the Blend world and cannot be known there. Neither is Texture or Sprite.

At this point I have a good framework running in Unity. I have the Caliburn.Micro viewmodel base classes and conductors, I have dependency injection integration, I communicate with the UniRx MessageBroker, I have the lightweight UniTask library for asynchronous ops integrated into the Caliburn types... everything works nicely and smoothly.
But the Blend side doesn't even compile...

So will I continue using Noesis in Unity? Yes, absolutely. But I cannot see myself using this Blend workflow mess; it's just a major pain in the butt. Maybe I'm completely missing something here, but I STILL don't understand how this Blend workflow is ever supposed to work. As soon as you create DataTemplates for Unity types, it breaks.


Sorry for the wall of text, but I don't think it was clear what my actual issue is. You keep explaining design-time DataContexts to me; that is NOT the issue.
Last edited by KeldorKatarn on 26 Apr 2021, 18:48, edited 1 time in total.
 
jsantos
Site Admin
Posts: 3918
Joined: 20 Jan 2012, 17:18
Contact:

Re: Workflow to use Unity assets unclear

26 Apr 2021, 18:41

>> I recreated your scenario as follows and it is working fine here. Please, note that support for sprites is coming with NoesisGUI 3.1 (#1902) but in this case it works because we are just using the whole texture contained inside the sprite.
>> That Binding doesn't work for me. I bound to Tile.Sprites[0].texture just like you did, but nothing shows up in the combobox. The texture binding does not work for me. The combobox items remains empty aside from the TextBlock. So there has to be something different here. The binding you're suggesting is exactly what I have and it's not rendering the texture.
Could you please try the example (XAML + C#) I wrote above and tell me if it works for you? Exactly the same example, without modifications.
 
KeldorKatarn
Topic Author
Posts: 193
Joined: 30 May 2014, 10:26

Re: Workflow to use Unity assets unclear

26 Apr 2021, 19:01

Edit: Never mind the texture binding, I got that to work. The issue was that I had a custom tile with a Sprites list, but the TerrainType property was returning the base type "Tile", which only has a single "sprite" property. The base class doesn't have the "Sprites" property; that's why the binding failed.
That was my bad. It works now.
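For anyone hitting the same wall, the fix amounts to narrowing the property's declared type so the binding path can resolve Sprites. A sketch reconstructed from the type descriptions earlier in this thread (names as described there, exact members assumed):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Tilemaps;

// RandomTile as described in the first post: a Tile with a list of sprites.
public class RandomTile : Tile
{
    public List<Sprite> Sprites;
}

public class TerrainType : ScriptableObject
{
    public string Name;

    // Declaring this property as the base type Tile is what broke the
    // binding: the path "Tile.Sprites[0].texture" cannot resolve "Sprites",
    // because the base class only exposes a single "sprite" property.
    // Declaring the derived type makes the list visible to the binding:
    public RandomTile Tile;
}
```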
screenshot.png
But the Blend issue still remains.
Last edited by KeldorKatarn on 26 Apr 2021, 19:19, edited 5 times in total.
 
jsantos
Site Admin
Posts: 3918
Joined: 20 Jan 2012, 17:18
Contact:

Re: Workflow to use Unity assets unclear

26 Apr 2021, 19:16

Could you try the following minimal scene I created for the example posted above?



PS: We will answer the second part of your post later.
 
KeldorKatarn
Topic Author
Posts: 193
Joined: 30 May 2014, 10:26

Re: Workflow to use Unity assets unclear

26 Apr 2021, 19:18

See above, the binding issue was resolved. I'm still hoping for some insights on the Blend workflow, though.
(In fact, my binding error was caused by the fact that I didn't have proper IntelliSense support in the XAML designer, thanks to the project not compiling and not recognizing my custom asset type... which just proved my point about this workflow not being ideal for me right now.)
 
KeldorKatarn
Topic Author
Posts: 193
Joined: 30 May 2014, 10:26

Re: Workflow to use Unity asset types in Blend unclear

26 Apr 2021, 20:00

Basically the only solution I have right now is creating something like this:
namespace VacuumBreather.Montreal.Gameplay.Assets
{
    using System;
    using System.Windows.Media.Imaging;

    public class TerrainType
    {
        public string Name { get; } = "Test Name";

        public FakeTile Tile { get; } = new FakeTile();

        public class FakeTile
        {
            public FakeSprite sprite { get; } = new FakeSprite();

            public class FakeSprite
            {
                public FakeSprite()
                {
                    var uri = new Uri(@"\Blend\Textures\hexForest.png", UriKind.Relative);
                    texture = new BitmapImage(uri);
                }

                public BitmapImage texture { get; }
            }
        }
    }
}
Since I need that type and its entire hierarchy to exist. Otherwise my Blend project doesn't compile with the DataTemplate specifying its type:
        <DataTemplate DataType="{x:Type assets:TerrainType}">
            <StackPanel>
                <TextBlock HorizontalAlignment="Center" Text="{Binding Name, Mode=OneWay}" />

                <Image
                    Width="50"
                    Height="50"
                    HorizontalAlignment="Center"
                    Source="{Binding Tile.sprite.texture, Mode=OneWay}" />
            </StackPanel>
        </DataTemplate>
This being the issue: <DataTemplate DataType="{x:Type assets:TerrainType}">
Am I going to need to do this for every Unity type I end up using?
 
jsantos
Site Admin
Posts: 3918
Joined: 20 Jan 2012, 17:18
Contact:

Re: Workflow to use Unity asset types in Blend unclear

27 Apr 2021, 11:18

>> That doesn't compile since those types don't exist in Blend world.
Yes, and they never will. We tried adding references to Unity assemblies from Blend in the past, and it doesn't work. In the end you have the following options:

1. Avoid Unity types in your DataContext. With a few exceptions, this is what we are doing in our examples. Note that for a few fundamental types like Texture, you will need an #ifdef like this:
#if UNITY_5_OR_NEWER
using ImageSourceType = UnityEngine.Texture2D;
#else
using ImageSourceType = System.Windows.Media.ImageSource;
#endif
2. Strict implementation of the MVVM pattern (Model = Unity Type, View = XAML, ViewModel = Noesis Types)

3. Have different implementations for Unity and Blend. If you correctly encapsulate this, you can avoid a lot of fake classes. I will show you later, in my next post.
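To illustrate how options 1 and 3 can combine, here is a minimal sketch (TerrainTypeViewModel is a hypothetical name): the alias above lets one ViewModel source file compile in both the Unity and the Blend project, with only the conditional using directives differing.

```csharp
// Compiled by both the Unity project and the Blend project; only the
// alias changes, so the public surface stays identical in both worlds.
#if UNITY_5_OR_NEWER
using ImageSourceType = UnityEngine.Texture2D;
#else
using ImageSourceType = System.Windows.Media.ImageSource;
#endif

public class TerrainTypeViewModel
{
    public string Name { get; set; }

    // Unity code assigns a Texture2D here; the Blend project assigns a
    // BitmapImage (which derives from ImageSource). The XAML binds to
    // "Image" either way, so no fake type hierarchy is needed.
    public ImageSourceType Image { get; set; }
}
```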
>> Not to mention that I am basically faking half the Unity namespace just to work in Blend.
I don't think you need half the Unity namespace. In fact, you should only need a few things.
>> The only Benefit Blend or the Visual Studio designer can offer is a very very quick preview of the changes made, and runtime change support. I can change a width or color and it immediately shows up in my running test application. That doesn't seem to work in Unity yet, so that costs time. Honestly if Noesis had hot-reloading ...
Hot-reloading is implemented (it is off by default in 3.0). Please read more about it in our Unity Tutorial.
>> The time I lose is EXCLUSIVELY on this Blend workflow. I keep moving projects back and forth so they don't clutter my project folder, I had to move where Visual Studio puts the /obj/ folder which only works by manually editing the project file,
Have you tried this template for Visual Studio? That's the recommended structure, and the one we follow in all our examples.
 
jsantos
Site Admin
Posts: 3918
Joined: 20 Jan 2012, 17:18
Contact:

Re: Workflow to use Unity asset types in Blend unclear

27 Apr 2021, 11:21

>> Am I going to need to do this for every Unity type I end up using?
Yes, but you can keep it simpler. For example:

For Blend:
public class TerrainType
{
    public string Name { get; } = "Test Name";
    public BitmapImage Image { get; } = new BitmapImage(new Uri(@"\Blend\Textures\hexForest.png", UriKind.Relative));
}
For Unity:
public class TerrainType
{
    public string Name { get; }
    public Texture2D Image { get { return Tile.sprite.texture; } }
    public Tile Tile { get; }
}
and binding like this:
Source="{Binding Image, Mode=OneWay}" 
