Reputation: 605
I'm loading a XAML UserControl with XamlReader and then trying to convert it to an image with RenderTargetBitmap.
It works fine with a simple UserControl.
But if the control contains this Viewport3D, it doesn't work: the image is created without the 3D part:
<Viewport3D>
    <ModelVisual3D>
        <ModelVisual3D.Content>
            <GeometryModel3D>
                <GeometryModel3D.Geometry>
                    <MeshGeometry3D
                        Positions="-0.5 0.5 -0.5, 0.5 0.5 -0.5,
                                   -0.5 0 0.5, 0.5 0 0.5"
                        TriangleIndices="0 2 1, 1 2 3"
                        TextureCoordinates="0 0, 1 0, 0 1, 1 1" />
                </GeometryModel3D.Geometry>
                <GeometryModel3D.Material>
                    <DiffuseMaterial>
                        <DiffuseMaterial.Brush>
                            <VisualBrush>
                                <VisualBrush.Visual>
                                    <Button>Hi</Button>
                                </VisualBrush.Visual>
                            </VisualBrush>
                        </DiffuseMaterial.Brush>
                    </DiffuseMaterial>
                </GeometryModel3D.Material>
                <!-- Non-affine matrix transform. -->
                <GeometryModel3D.Transform>
                    <MatrixTransform3D />
                </GeometryModel3D.Transform>
            </GeometryModel3D>
        </ModelVisual3D.Content>
    </ModelVisual3D>
    <!-- Light sources. -->
    <ModelVisual3D>
        <ModelVisual3D.Content>
            <Model3DGroup>
                <AmbientLight Color="#404040" />
                <DirectionalLight Color="#C0C0C0" Direction="0 -2 -1" />
            </Model3DGroup>
        </ModelVisual3D.Content>
    </ModelVisual3D>
    <!-- Camera. -->
    <Viewport3D.Camera>
        <PerspectiveCamera Position="0 0.2 1"
                           LookDirection="0 0 -1.5"
                           UpDirection="0 1 0"
                           FieldOfView="100" />
    </Viewport3D.Camera>
</Viewport3D>
My method to convert a UserControl to an image:
private static byte[] ConvertUserControlToPng(UserControl userControl)
{
    // Lay out the control at its declared size.
    userControl.Measure(new Size(userControl.Width, userControl.Height));
    userControl.Arrange(new Rect(new Size(userControl.Width, userControl.Height)));

    // Render at 300 DPI (WPF sizes are in 1/96-inch units).
    var renderTargetBitmap = new RenderTargetBitmap((int)(userControl.Width * 300 / 96),
                                                    (int)(userControl.Height * 300 / 96),
                                                    300, 300, PixelFormats.Pbgra32);
    renderTargetBitmap.Render(userControl);

    BitmapEncoder encoder = new PngBitmapEncoder();
    encoder.Frames.Clear();
    encoder.Frames.Add(BitmapFrame.Create(renderTargetBitmap));

    byte[] result;
    using (var stream = new MemoryStream())
    {
        encoder.Save(stream);
        result = stream.ToArray();
    }
    return result;
}
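For context, the method above is called roughly like this (a sketch; the file names are placeholders, and loading via XamlReader.Parse with explicit Width/Height on the root element is an assumption):
// Sketch (assumptions): XamlReader.Parse is used and the root element
// declares explicit Width/Height; file names are placeholders.
// Requires System.Windows.Markup (XamlReader) and System.IO (File).
string xaml = File.ReadAllText(@"D:\MyControl.xaml");
var userControl = (UserControl)XamlReader.Parse(xaml);
byte[] png = ConvertUserControlToPng(userControl);
File.WriteAllBytes(@"D:\MyControl.png", png);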
Am I missing something?
Upvotes: 1
Views: 1260
Reputation: 1440
I was able to get something visible in a PNG saved to a file by adding a call to UpdateLayout() after the calls to Measure and Arrange:
var viewport = (Viewport3D)XamlReader.Parse("... xaml ....");

viewport.Measure(new Size(viewport.Width, viewport.Height));
viewport.Arrange(new Rect(new Size(viewport.Width, viewport.Height)));
viewport.UpdateLayout();

var renderTargetBitmap = new RenderTargetBitmap((int)viewport.Width * 300 / 96,
                                                (int)viewport.Height * 300 / 96,
                                                300, 300, PixelFormats.Pbgra32);
renderTargetBitmap.Render(viewport);

BitmapEncoder encoder = new PngBitmapEncoder();
encoder.Frames.Clear();
encoder.Frames.Add(BitmapFrame.Create(renderTargetBitmap));

// Dispose the file stream so the file is flushed and closed.
using (var stream = new FileStream(@"D:\Test.png", FileMode.Create, FileAccess.Write))
{
    encoder.Save(stream);
}
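Applied to the original ConvertUserControlToPng method, the change amounts to one extra line after Arrange (a sketch of the same fix):
userControl.Measure(new Size(userControl.Width, userControl.Height));
userControl.Arrange(new Rect(new Size(userControl.Width, userControl.Height)));
userControl.UpdateLayout(); // force the pending layout pass so the Viewport3D content is rendered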
Upvotes: 2
Reputation: 1165
My guess is that some of the rendering is being virtualized. Maybe not, but the main thing I notice in your code above is that it renders the UserControl rather than the Viewport3D control itself. When you run the code, does it complete at all, or does it throw an exception?
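If the wrapper is the issue, a quick test (a sketch, assuming the Viewport3D is the UserControl's direct content and has explicit Width/Height) would be to render it directly:
// Sketch (assumption): bypass the UserControl and render the Viewport3D itself.
var viewport = (Viewport3D)userControl.Content;
viewport.Measure(new Size(viewport.Width, viewport.Height));
viewport.Arrange(new Rect(new Size(viewport.Width, viewport.Height)));
var bitmap = new RenderTargetBitmap((int)(viewport.Width * 300 / 96),
                                    (int)(viewport.Height * 300 / 96),
                                    300, 300, PixelFormats.Pbgra32);
bitmap.Render(viewport);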
Upvotes: 0