Using XAML in a holographic space

I am currently trying to add some UI to my holographic app. My whole application is written in DirectX, and I will not be using Unity.

The main function of the app is to display a 3D volume in holographic space and interact with it through gestures such as rotation and translation.

To improve the interaction, my idea is to create multiple floating windows that add further manipulations and functions.

In an ideal world, I would like to draw in my floating windows (each of which would be a floating plane or billboard) an interface written in XAML, or directly rendered from some kind of Windows.UI.Xaml.UIElement.

So basically I want to render a XAML element into a Texture2D, and I tried the following:

await CoreApplication.MainView.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, async () =>
{
    RenderTargetBitmap renderBitmap = new RenderTargetBitmap();
    await renderBitmap.RenderAsync(PanoramicImage);
    XamlPixel = await renderBitmap.GetPixelsAsync();
});


The problem is that RenderTargetBitmap.RenderAsync blocks indefinitely, so execution never reaches the GetPixelsAsync call.

Do you have any ideas?

Thanks in advance.

1 answer


I've been working for quite a while on my own UWP HoloLens application in DirectX. We solved the "UI problem" with Direct2D: you can render a custom texture into a render target and then use it as a shader resource view for the UI element. I think there is still no way to render XAML controls in holographic space, so you will need to write your own UI elements.
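A minimal sketch of the Direct2D interop described above, assuming an existing Direct3D 11 device (created with the D3D11_CREATE_DEVICE_BGRA_SUPPORT flag, which Direct2D interop requires), a Direct2D factory, and a DirectWrite text format. The function name CreateUiTexture and the "Rotate" label are illustrative, not from the answer, and error handling is omitted:

```cpp
#include <d3d11.h>
#include <d2d1.h>
#include <dwrite.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D11ShaderResourceView> CreateUiTexture(
    ID3D11Device* d3dDevice, ID2D1Factory* d2dFactory,
    IDWriteTextFormat* textFormat, UINT width, UINT height)
{
    // 1. Create a texture bindable both as a D2D render target and as an SRV.
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;  // Direct2D needs a BGRA format
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    ComPtr<ID3D11Texture2D> texture;
    d3dDevice->CreateTexture2D(&desc, nullptr, &texture);

    // 2. Wrap the texture's DXGI surface in a Direct2D render target.
    ComPtr<IDXGISurface> surface;
    texture.As(&surface);
    D2D1_RENDER_TARGET_PROPERTIES props = D2D1::RenderTargetProperties(
        D2D1_RENDER_TARGET_TYPE_DEFAULT,
        D2D1::PixelFormat(DXGI_FORMAT_B8G8R8A8_UNORM,
                          D2D1_ALPHA_MODE_PREMULTIPLIED));
    ComPtr<ID2D1RenderTarget> d2dTarget;
    d2dFactory->CreateDxgiSurfaceRenderTarget(surface.Get(), &props, &d2dTarget);

    // 3. Draw the "UI" with Direct2D / DirectWrite.
    ComPtr<ID2D1SolidColorBrush> brush;
    d2dTarget->CreateSolidColorBrush(D2D1::ColorF(D2D1::ColorF::White), &brush);
    d2dTarget->BeginDraw();
    d2dTarget->Clear(D2D1::ColorF(0.f, 0.f, 0.f, 0.6f));  // translucent panel
    const wchar_t label[] = L"Rotate";
    d2dTarget->DrawText(label, ARRAYSIZE(label) - 1, textFormat,
        D2D1::RectF(0.f, 0.f, (FLOAT)width, (FLOAT)height), brush.Get());
    d2dTarget->EndDraw();

    // 4. Expose the result to the 3D pipeline as a shader resource view;
    //    sample it on the billboard quad like any other texture.
    ComPtr<ID3D11ShaderResourceView> srv;
    d3dDevice->CreateShaderResourceView(texture.Get(), nullptr, &srv);
    return srv;
}
```

Redrawing only when the UI state changes (rather than every frame) keeps the per-frame cost to a single textured quad.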

As Microsoft describes, the only way to use XAML controls is to switch between holographic space and a normal 2D view:



Microsoft Viewmodel
