How do I save/render an <Image> element as a file using the managed SDK?

Posted: 13 Feb 2019, 20:40
by darthmaule2
I have this BitmapSource property:
public BitmapSource RecalledImage { get; set; }
The above property is populated by reading an image from disk into a buffer "bytes" and then calling Create:
RecalledImage = Noesis.BitmapSource.Create(file.Width, file.Height, 96, 96, bytes, file.Width * 3, BitmapSource.Format.BGR8);
This image is displayed in the UI like this:
<Image x:Name="imageElement" Source="{Binding Path=RecalledImage}"/>
That all works fine. But now suppose I manipulate the &lt;Image&gt; element, transform it, add overlays to it, whatever... and then want to save that back to a file.

How would I do that? I can't find any managed example that does anything like that, although I've found multiple suggestions to "render it to a texture", usually referencing the native SDK.

I'm currently just testing this in Windows 10, starting with the HelloWorld example, which I've added to.

Re: How do I save/render an <Image> element as a file using the managed SDK?

Posted: 16 Feb 2019, 02:04
by jsantos
I am not sure I am following here. When you say "save an &lt;Image&gt; element", do you mean you are modifying the Image yourself? Or do you mean that the Image is composed with more elements in the XAML and you want to capture those effects (you want to capture the render of a XAML)?

Re: How do I save/render an <Image> element as a file using the managed SDK?

Posted: 19 Feb 2019, 20:34
by darthmaule2
I guess the answer to your second option would get me pretty far.

Using WPF, I have a function "BlendBitmapSources" where I combine multiple BitmapSource images by rendering one on top of the other, and another one "CreateZoomedImage" where I transform a BitmapSource and then render that. The RenderTargetBitmap class facilitates these operations in WPF. Is there a way to do this with the managed Noesis SDK?
        private BitmapSource BlendBitmapSources(BitmapSource backgroundSource, BitmapSource overlaySource)
        {
            if ((backgroundSource.PixelWidth != overlaySource.PixelWidth) ||
                (backgroundSource.PixelHeight != overlaySource.PixelHeight))
            {
                throw new ArgumentException("Source images must have the same dimensions");
            }

            int width = backgroundSource.PixelWidth;
            int height = backgroundSource.PixelHeight;

            Grid renderGrid = new Grid();
            renderGrid.HorizontalAlignment = HorizontalAlignment.Left;
            renderGrid.VerticalAlignment = VerticalAlignment.Top;
            renderGrid.Width = width;
            renderGrid.Height = height;

            Image backgroundImage = new Image();
            backgroundImage.Stretch = Stretch.Fill;
            backgroundImage.Source = backgroundSource;

            Image overlayImage = new Image();
            overlayImage.Stretch = Stretch.Fill;
            overlayImage.Source = overlaySource;

            // overlay is drawn on top of the background
            renderGrid.Children.Add(backgroundImage);
            renderGrid.Children.Add(overlayImage);

            renderGrid.Measure(new Size(width, height));
            renderGrid.Arrange(new Rect(new Size(width, height)));

            RenderTargetBitmap renderTargetBitmap = new RenderTargetBitmap(width, height, 96, 96, PixelFormats.Pbgra32);
            renderTargetBitmap.Render(renderGrid);

            return renderTargetBitmap;
        }

        private BitmapSource CreateZoomedImage(BitmapSource sourceImage, double zoomFactor, Point zoomCenter)
        {
            int sourceWidth = (int)(sourceImage.PixelWidth / zoomFactor);
            int sourceHeight = (int)(sourceImage.PixelHeight / zoomFactor);
            int sourceLeft = (int)zoomCenter.X - sourceWidth / 2;
            int sourceTop = (int)zoomCenter.Y - sourceHeight / 2;

            CroppedBitmap bitmap1 = new CroppedBitmap();
            bitmap1.BeginInit();
            bitmap1.Source = sourceImage;
            bitmap1.SourceRect = new Int32Rect(sourceLeft, sourceTop, sourceWidth, sourceHeight);
            bitmap1.EndInit();

            // zoomed image must have the same dimensions as the source image
            double scaleX = (double)sourceImage.PixelWidth / sourceWidth;
            double scaleY = (double)sourceImage.PixelHeight / sourceHeight;

            TransformedBitmap bitmap2 = new TransformedBitmap();
            bitmap2.BeginInit();
            bitmap2.Source = bitmap1;
            bitmap2.Transform = new ScaleTransform(scaleX, scaleY);
            bitmap2.EndInit();

            return bitmap2;
        }
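
For reference, in WPF the BitmapSource returned by either helper above can be written to disk with one of the BitmapEncoder classes (this is the "save to a file" half of the question). A minimal sketch; the path argument and method name are just examples:

        // Encodes any BitmapSource (e.g. the RenderTargetBitmap returned by
        // BlendBitmapSources) as a PNG file.
        // Requires System.IO and System.Windows.Media.Imaging.
        private void SaveBitmapSourceToFile(BitmapSource source, string path)
        {
            PngBitmapEncoder encoder = new PngBitmapEncoder();
            encoder.Frames.Add(BitmapFrame.Create(source));

            using (FileStream stream = new FileStream(path, FileMode.Create))
            {
                encoder.Save(stream);
            }
        }

JpegBitmapEncoder or BmpBitmapEncoder can be swapped in the same way if PNG is not the desired format.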

Re: How do I save/render an <Image> element as a file using the managed SDK?

Posted: 21 Feb 2019, 12:34
by jsantos
Yes, there is a way to do that but it is not as convenient as RenderTargetBitmap. I think we should implement that class in Noesis. Right now, the way to do that in Noesis would be:
1. Create a render texture in the current renderer. If you are using OpenGL, create it using OpenGL (API not part of Noesis)
2. Bind that render target (API not part of Noesis)
3. Create a Noesis View setting the root to the desired element
4. Render that view (described in …)
5. Create a TextureSource from the texture and use it
I think this could also be implemented using VisualBrushes (which are partially supported in Noesis), but I am not 100% sure.
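
A rough C# sketch of the five steps above, assuming an OpenGL renderer. This is not a definitive implementation: the GL calls are OpenTK-style placeholders for whatever binding you use, and the Noesis method names (GUI.CreateView, View.Renderer, etc.) should be checked against the API reference of your SDK version:

        // Hypothetical sketch: render a XAML element into a GL render texture.
        // GL.* stands in for your OpenGL binding; exact Noesis signatures
        // may differ between SDK versions.
        private void RenderElementToTexture(Noesis.FrameworkElement root, int width, int height)
        {
            // 1-2. Create and bind a render target (API not part of Noesis)
            int fbo = GL.GenFramebuffer();
            int texture = GL.GenTexture();
            // ... allocate 'texture' storage and attach it to 'fbo'
            //     as the color attachment ...
            GL.BindFramebuffer(FramebufferTarget.Framebuffer, fbo);

            // 3. Create a view with the desired element as root
            Noesis.View view = Noesis.GUI.CreateView(root);
            view.SetSize(width, height);

            // 4. Render the view into the bound framebuffer
            view.Update(0.0);
            view.Renderer.UpdateRenderTree();
            view.Renderer.Render();

            // 5. Wrap the GL texture so it can be used as an Image source;
            //    wrapping an existing native texture is SDK-version dependent
        }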

Could you please create a ticket for the RenderTargetBitmap implementation?

Re: How do I save/render an <Image> element as a file using the managed SDK?

Posted: 23 Feb 2019, 11:39
by darthmaule2
RenderTargetBitmap feature request:

CroppedBitmap and TransformedBitmap feature request:

Re: How do I save/render an <Image> element as a file using the managed SDK?

Posted: 25 Feb 2019, 15:55
by jsantos
Thank you!