3D Transforms
NoesisGUI handles 3D effects the same way a 3D engine does: rotations, scales and translations can be applied to any XAML element, and a projection matrix is set at the root of the scene to provide the perspective effect.
Element Transforms
In Noesis, 3D effects can be achieved by using the Transform3D property. Transform3D applies a 3D transform matrix to a XAML element, making it possible to create effects where two-dimensional UI appears to exist in 3D space relative to the user. Transform3D behaves much like RenderTransform, but allows transforms in three-dimensional space rather than only in two dimensions.
There are two subclasses of Transform3D that can be used to populate the Transform3D property: CompositeTransform3D and MatrixTransform3D. Both represent affine 3D transforms and can be used to position individual elements in 3D space. CompositeTransform3D groups the 3D scale, rotation and translation transforms to apply, while MatrixTransform3D exposes all the transformation matrix values, which makes it very convenient for binding transformations to properties of the model.
NOTE
While these properties are not part of WPF, NoesisGUI implements them as an extension in the 'noesis' namespace.
Here is an example that uses CompositeTransform3D to achieve a 3D effect in the UI:
<Grid
  xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
  xmlns:noesis="clr-namespace:NoesisGUIExtensions;assembly=Noesis.GUI.Extensions">
  <StackPanel Orientation="Horizontal">
    <Rectangle Width="300" Height="200" Fill="CornflowerBlue" />
    <Rectangle Width="300" Height="200" Fill="CadetBlue" Margin="10">
      <noesis:Element.Transform3D>
        <noesis:CompositeTransform3D RotationY="-30" TranslateZ="-75" CenterX="150" />
      </noesis:Element.Transform3D>
    </Rectangle>
    <Rectangle Width="300" Height="200" Fill="OrangeRed">
      <noesis:Element.Transform3D>
        <noesis:CompositeTransform3D TranslateZ="-150" />
      </noesis:Element.Transform3D>
    </Rectangle>
  </StackPanel>
</Grid>
NOTE
Transform3D does not affect the order in which elements are drawn. Elements further away from the viewer along the Z-axis might still be rendered above elements that are closer. In this case, the 'Canvas.ZIndex' attached property and the position of elements in the XAML visual tree can be used to manage the drawing order of elements in your UI.
View Projection
Views by default apply a perspective transform to the root element to provide a common viewport for all the elements. Under this perspective transform, elements further away from the user appear to shrink towards a common vanishing point. This effect preserves coordinates in the Z=0 plane, where UI elements reside by default.
To change this default projection matrix, SetProjectionMatrix is exposed as part of the IView interface. With 'width' and 'height' being the dimensions of the view, the projection matrix maps coordinates to a homogeneous clip space bounded by [0, width] in the x-direction and [0, height] in the y-direction. In the z-direction, the limits are 0 for the front plane and 1 for the back plane. To seamlessly inter-mix UI elements with other 3D game objects, the same projection matrix must be used for both the UI and the 3D world.
The following table compares the clip spaces used by OpenGL and Direct3D with the one used by NoesisGUI. Take this into account when adapting projection matrices from other systems.
| System    | X Range     | Y Range      | Z Range   |
|-----------|-------------|--------------|-----------|
| OpenGL    | [-1...+1]   | [-1...+1]    | [-1...+1] |
| Direct3D  | [-1...+1]   | [-1...+1]    | [0...1]   |
| NoesisGUI | [0...width] | [0...height] | [0...1]   |
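For example, a view-projection matrix built for Direct3D clip space can be adapted by appending a remap from the [-1...+1] NDC range to the [0, width] x [0, height] range described above. The sketch below is a minimal illustration, assuming a row-vector (v * M) convention with row-major storage; the header names, the 16-float Matrix4 constructor and the exact SetProjectionMatrix signature are assumptions and may differ in your SDK version.

#include <NsGui/IView.h>
#include <NsMath/Matrix.h>

// Remaps a view-projection matrix from Direct3D clip space (x, y in [-1...+1] with y up,
// z in [0...1]) to Noesis clip space (x in [0...width], y in [0...height] with y down,
// z in [0...1]). 'viewProj' holds 16 row-major floats using the row-vector convention
static Noesis::Matrix4 ToNoesisClipSpace(const float viewProj[16], float width, float height)
{
    // Homogeneous remap: x' = (x + w) * width / 2, y' = (w - y) * height / 2, z' = z, w' = w
    const float remap[16] =
    {
        0.5f * width,  0.0f,           0.0f, 0.0f,
        0.0f,          -0.5f * height, 0.0f, 0.0f,
        0.0f,          0.0f,           1.0f, 0.0f,
        0.5f * width,  0.5f * height,  0.0f, 1.0f
    };

    // result = viewProj * remap, so the remap is applied after the camera projection
    float m[16] = { };
    for (int row = 0; row < 4; ++row)
    {
        for (int col = 0; col < 4; ++col)
        {
            for (int k = 0; k < 4; ++k)
            {
                m[row * 4 + col] += viewProj[row * 4 + k] * remap[k * 4 + col];
            }
        }
    }

    // Assumes a row-major, 16-float Matrix4 constructor; adapt to your math types if needed
    return Noesis::Matrix4(
        m[0],  m[1],  m[2],  m[3],
        m[4],  m[5],  m[6],  m[7],
        m[8],  m[9],  m[10], m[11],
        m[12], m[13], m[14], m[15]);
}

// Shares the game camera's projection with the UI so elements mix consistently with the 3D world
void SetViewProjection(Noesis::IView* view, const float cameraViewProj[16], float width, float height)
{
    view->SetProjectionMatrix(ToNoesisClipSpace(cameraViewProj, width, height));
}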
Multi-Pass Stereo
Noesis also offers the possibility of changing the projection matrix at render time. This is useful, for example, when rendering in Virtual Reality, where each eye needs to be rendered with a different perspective.
// Render Scene to Eye Buffers
for (int eye = 0; eye < 2; ++eye)
{
    // UI Offscreen textures
    uiView->GetRenderer()->RenderOffscreen(eyeMtx);

    // Clear and set up rendertarget
    DIRECTX.SetAndClearRenderTarget(color, depth);
    DIRECTX.SetViewport(eyeX, eyeY, eyeWidth, eyeHeight);

    // Render scene
    roomScene->Render(&prod, 1, 1, 1, 1, true);

    // Render UI
    uiView->GetRenderer()->Render(eyeMtx);

    // Commit rendering to the swap chain
    pEyeRenderTexture[eye]->Commit();
}
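The snippet leaves out how 'eyeMtx' is obtained. One possible way to compute it, assuming a hypothetical engine helper 'GetEyeViewProjection' that returns each eye's view-projection as 16 row-major floats in Direct3D clip space, is to reuse the ToNoesisClipSpace sketch from the previous section at the top of the loop body:

// Hypothetical: query this eye's view-projection from the engine or VR runtime
float eyeViewProj[16];
GetEyeViewProjection(eye, eyeViewProj);

// Convert it to Noesis clip space before handing it to the renderer
Noesis::Matrix4 eyeMtx = ToNoesisClipSpace(eyeViewProj, eyeWidth, eyeHeight);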
NOTE
We provide an example using OculusSDK as part of our C++ SDK.
Single-Pass Stereo
Instead of rendering each eye separately, Single-Pass Stereo renders both eyes at the same time. This is more efficient for the CPU because both eyes share the work required to traverse and cull the visual tree.
const Matrix4& projection = GetCullingMatrix();
const Matrix4& leftEyeProjection = GetLeftEyeMatrix();
const Matrix4& rightEyeProjection = GetRightEyeMatrix();
// UI Offscreen textures
uiView->GetRenderer()->RenderOffscreen(projection, leftEyeProjection, rightEyeProjection);
// Render scene
roomScene->Render(projection, leftEyeProjection, rightEyeProjection);
// Render UI
uiView->GetRenderer()->Render(projection, leftEyeProjection, rightEyeProjection);
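GetCullingMatrix, GetLeftEyeMatrix and GetRightEyeMatrix are placeholders for engine code. As a rough sketch of what they could return, reusing the hypothetical helpers and the ToNoesisClipSpace remap from the sections above: the culling matrix is a single view-projection whose frustum encloses both eyes, so the visual tree is traversed and culled only once, while the per-eye matrices are the regular left and right eye view-projections.

// Hypothetical engine/VR-runtime helpers, all returning 16 row-major floats in D3D clip space
float cullVP[16], leftVP[16], rightVP[16];
GetCombinedCullingViewProjection(cullVP);   // frustum enclosing both eyes
GetEyeViewProjection(0, leftVP);            // left eye
GetEyeViewProjection(1, rightVP);           // right eye

// Remap to Noesis clip space with the helper from the View Projection section
Matrix4 projection = ToNoesisClipSpace(cullVP, eyeWidth, eyeHeight);
Matrix4 leftEyeProjection = ToNoesisClipSpace(leftVP, eyeWidth, eyeHeight);
Matrix4 rightEyeProjection = ToNoesisClipSpace(rightVP, eyeWidth, eyeHeight);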