Lionlake

Reputation: 147

Rendering complete camera view (16:9) onto a texture in Unity3d

I was playing around with Unity's render textures, where you can render a camera's view onto a texture. However, I noticed that it doesn't render the camera's entire view, only a square slice of it.

[Screenshot: only a square slice of the camera's view is rendered onto the texture]

What I'm trying to do is get the entire view of the camera (16:9 aspect ratio) rendered onto a texture (also 16:9 aspect ratio). Right now it only seems to be able to project a square slice of its view onto a square surface. Is there any kind of solution to this?

Upvotes: 5

Views: 23539

Answers (2)

Sylker Teles

Reputation: 569

It's quite simple; no code required. You need the same values for your RenderTexture size (x, y) and for the RectTransform (Width/Height) you are rendering the texture to (the one with the RawImage component attached). Say you want a standard 16:9 HD aspect ratio: set your RenderTexture size to 1280x720 and your RectTransform Width and Height to 1280x720 as well. You can then scale the RectTransform to whatever size you need to fit the UI layout you are designing.
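The same setup can also be done from a script if you prefer. A minimal sketch, assuming you wire up the camera and RawImage references yourself (the SetupRenderView class name and field names are just placeholders):

using UnityEngine;
using UnityEngine.UI;

public class SetupRenderView : MonoBehaviour
{
    public Camera viewCamera;   // camera whose view should be rendered
    public RawImage uiImage;    // RawImage on the UI canvas that displays the texture

    void Start()
    {
        // 1280x720 keeps the texture at the same 16:9 ratio as the camera.
        RenderTexture rt = new RenderTexture(1280, 720, 24);
        viewCamera.targetTexture = rt;
        uiImage.texture = rt;

        // Match the RectTransform to the texture size, then scale it as needed.
        uiImage.rectTransform.sizeDelta = new Vector2(1280, 720);
    }
}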

Upvotes: 2

Rafal Wiliński

Reputation: 2390

With 'RenderTexture' you can specify your texture size: http://docs.unity3d.com/ScriptReference/RenderTexture.Create.html

It should go like this:

Camera camera = GameObject.Find("Main Camera").GetComponent<Camera>();
int resWidth = Screen.width;
int resHeight = Screen.height;

RenderTexture rt = new RenderTexture(resWidth, resHeight, 24); // Create a new RenderTexture
camera.targetTexture = rt; // Assign it to the camera
Texture2D screenShot = new Texture2D(resWidth, resHeight, TextureFormat.RGB24, false); // Texture2D to copy the result into

camera.Render(); // Render the camera's view into the RenderTexture

RenderTexture.active = rt;
screenShot.ReadPixels(new Rect(0, 0, resWidth, resHeight), 0, 0); // Copy pixels from the active RenderTexture into the Texture2D
screenShot.Apply(); // Upload the copied pixels

camera.targetTexture = null;
RenderTexture.active = null; // Clean up
Destroy(rt); // Free memory
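
Note that if the screen or Game view isn't exactly 16:9, Screen.width and Screen.height won't give you a 16:9 texture. A small variation that forces the texture to the 16:9 ratio regardless of screen shape (the 1280 base width is just an example value):

int resWidth = 1280;
int resHeight = resWidth * 9 / 16; // 720, always 16:9
RenderTexture rt = new RenderTexture(resWidth, resHeight, 24);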

Upvotes: 9
