Alex Stone

Reputation: 47354

Can I use view transforms/OpenGL to turn a flat image into an Oculus Rift-compatible dual image?

A while ago I saw the code for "Live effects cam" linked here on Stack Overflow. That app transforms an ordinary camera feed using OpenGL and can make the image look warped in a number of ways. I'm interested in whether I can transform an ordinary, flat camera feed or image using OpenGL on iOS to create a pair of images suitable for viewing on the Oculus Rift.

In other words, is it possible to use code like that to appropriately transform a flat image and trick the eye into thinking it is 3D when viewed through the Oculus Rift?

Below is an example of what I'm trying to achieve. I assume that in this case the rendering engine creates two viewports and renders two images at slightly different angles, but I only have a single image or camera feed.


Upvotes: 1

Views: 1321

Answers (1)

Jherico

Reputation: 29240

You can take simple 2D images and display them in the Rift, but there will not be any sense of depth to the images. Additionally, unless the images were rendered or captured with a fairly large field of view, they will only take up a small portion of the screen.

The simplest way to do this is to convert the image into an OpenGL texture, render it onto a simple piece of rectangular geometry in an offscreen buffer, and then pass the offscreen render(s) to the Oculus SDK for distortion and display on the Rift. Alternatively, if you know the exact field of view of the image, you can create an Oculus distortion matrix specifically to match it, and then pass the image texture directly to the SDK.
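For illustration, here is a minimal desktop C++/OpenGL sketch of that first approach (image to texture, per-eye offscreen buffer). The function names, the use of GLEW, and the overall structure are my own assumptions, not the SDK's API; the hand-off to the Oculus SDK is only indicated in comments because its exact calls depend on the SDK version you target.

```cpp
// Sketch only: assumes a current OpenGL context and GLEW already initialized.
#include <GL/glew.h>

// Upload RGBA pixel data as a texture (pass nullptr to just allocate storage).
GLuint uploadImageTexture(const unsigned char* pixels, int width, int height) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return tex;
}

// One offscreen color buffer per eye; these are what the Oculus SDK distorts.
GLuint createEyeFramebuffer(int width, int height, GLuint* colorTexOut) {
    GLuint colorTex = uploadImageTexture(nullptr, width, height); // empty storage
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTex, 0);
    // In real code, verify glCheckFramebufferStatus(GL_FRAMEBUFFER)
    // == GL_FRAMEBUFFER_COMPLETE before using the FBO.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    *colorTexOut = colorTex;
    return fbo;
}

// Render the flat image as a textured quad ("virtual screen") for one eye.
void renderEye(GLuint fbo, GLuint imageTex, int width, int height) {
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, width, height);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... set this eye's view/projection, then draw a rectangle a fixed
    //     distance in front of the camera, sampling imageTex.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    // The two per-eye color textures are then handed to the Oculus SDK,
    // which applies the lens distortion and presents them on the Rift.
}
```

Because both eyes see the same flat quad, the result reads as a picture floating in space rather than true stereo depth, which matches the caveat above.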

This link is to a C++ example that uses OpenCV to capture images from a webcam, copy them to OpenGL and then render them within a 3D scene for subsequent display on the Oculus Rift.
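The capture-to-texture step in an example like that can look roughly like the sketch below. This only mirrors the general approach (grab a webcam frame with OpenCV, upload it to an OpenGL texture), not the linked project's actual code, and updateTextureFromWebcam is a name invented here for illustration.

```cpp
#include <opencv2/opencv.hpp>
#include <GL/glew.h>

// Pull one frame from the webcam and copy it into an existing GL texture.
bool updateTextureFromWebcam(cv::VideoCapture& cap, GLuint tex) {
    cv::Mat frame;
    if (!cap.read(frame) || frame.empty()) {
        return false;                      // no frame available this tick
    }
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // OpenCV rows aren't always 4-byte aligned
    // OpenCV delivers BGR data; let OpenGL reorder it during upload.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, frame.cols, frame.rows, 0,
                 GL_BGR, GL_UNSIGNED_BYTE, frame.data);
    return true;
}
```

Usage would be along the lines of: open the camera once with `cv::VideoCapture cap(0);`, then call `updateTextureFromWebcam(cap, tex)` once per rendered frame before drawing the textured quad for each eye.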

Upvotes: 1
