In Unity, I have two images in the form of textures that I am merging together (one overlaying the other). I do this in a compute shader and put the result on a RenderTexture. I want this RenderTexture to be everything that the output camera sees.
I have found articles saying to use the ReplacementShader property of the camera, but I couldn't get it to work properly.
Currently I have simply put the RenderTexture onto a UI RawImage that covers the whole canvas, so that the entire camera view is filled. This, however, has a lot of lag and is obviously a suboptimal solution.
So how does one output the RenderTexture or the compute shader result directly onto the camera? Thanks.
Upvotes: 3
Views: 7074
You could probably use OnRenderImage, an event function that Unity calls after a Camera has finished rendering, which allows you to modify the Camera's final image, together with Graphics.Blit, which copies a source texture into a destination render texture with a shader, and do something like e.g.
// This script goes onto the corresponding Camera
using UnityEngine;

public class RenderReplacement : MonoBehaviour
{
    public RenderTexture replacement;

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // To overwrite the entire screen
        // (cast needed to disambiguate from the Blit(Texture, Material) overload)
        Graphics.Blit(replacement, (RenderTexture)null);

        // Or to overwrite only what this specific Camera renders
        //Graphics.Blit(replacement, dest);
    }
}
Where dest is the destination RenderTexture. Set this to null to blit directly to the screen (see the API description for more information).
Note that, as mentioned in the API, OnRenderImage is called after this Camera has already finished rendering. So, to make this more efficient (since we basically throw away that render), make sure the camera isn't rendering anything at all by disabling all layers and e.g. letting it render only a single-color background.
Upvotes: 6