displaying 16 bit grayscale images at runtime
Hi
How can I display 16-bit grayscale images, chosen from a directory at run time, in a Noesis application?
Re: displaying 16 bit grayscale images at runtime
Please let us know which SDK you are using.
Re: displaying 16 bit grayscale images at runtime
Hi
I am using the C++ SDK.
Re: displaying 16 bit grayscale images at runtime
The images are read from files, processed with our image processing library, and then need to be displayed (preferably in a Noesis Image container).
Re: displaying 16 bit grayscale images at runtime
The easiest way is using BitmapSource::Create:
Code:
class BitmapSource: public ImageSource
{
public:
    enum Format
    {
        Format_BGRA8,
        Format_BGR8,
        Format_RGBA8,
        Format_RGB8
    };

    /// Creates a new BitmapSource from an array of pixels that are stored in memory
    static Ptr<BitmapSource> Create(int32_t pixelWidth, int32_t pixelHeight, float dpiX,
        float dpiY, const uint8_t* buffer, int32_t stride, Format format);
};
Re: displaying 16 bit grayscale images at runtime
Thanks for the reply.
We have 10-bit medical monitors. I am trying to display 16-bit grayscale images as 10-bit, without converting to RGB8.
Is there a way to display the image as single-channel 10-bit grayscale?
Re: displaying 16 bit grayscale images at runtime
The alternative is creating the native texture yourself. For example, for D3D11 you need to create an ID3D11Texture2D and then wrap it into a Noesis::Texture using WrapTexture.
Then you can create a TextureSource that can be used as a normal ImageSource.
Not sure if this is worth the effort though.
Code:
struct NS_RENDER_D3D11RENDERDEVICE_API D3D11Factory
{
    static Noesis::Ptr<Noesis::Texture> WrapTexture(ID3D11Texture2D* texture, uint32_t width,
        uint32_t height, uint32_t levels, bool isInverted, bool hasAlpha);
};