creimers

Reputation: 5305

Location based AR with Three.js (+ React) - camera configuration?

I want to augment the image of a stationary webcam with location based markers. This is to be added to an existing React app that uses three.js (through react-three-fiber) in other parts already, so these technologies are to be reused.

While it is quite easy to calculate the positions of the markers (their locations are known) relative to the camera (location also known), I'm struggling with the configuration of the camera in order to get a good visual match between the "real" objects and the AR markers.

I have created a codesandbox with an artificial example that illustrates the challenge.

Here's my attempt at configuring the camera:

const camera = {
    position: [0, 1.5, 0],
    fov: 85,
    near: 0.005,
    far: 1000
};

const bearing = 109;  // degrees

<Canvas camera={camera}>
    <Scene bearing={bearing}/>
</Canvas>
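One source of mismatch could be the guessed field of view. If the webcam's focal length and sensor height are known, the vertical FOV that three.js expects can be derived from pinhole camera geometry instead of hard-coding a value. The numbers below are made-up placeholders; real ones would come from the camera's datasheet:

```javascript
// Vertical field of view from pinhole camera geometry:
// fov = 2 * atan(sensorHeight / (2 * focalLength)), in degrees.
const focalLengthMm = 3.6;   // hypothetical webcam focal length
const sensorHeightMm = 2.7;  // hypothetical sensor height

const fovDegrees =
  (2 * Math.atan(sensorHeightMm / (2 * focalLengthMm)) * 180) / Math.PI;
// ≈ 41.1 degrees for these example values
```

The result would then replace the hard-coded `fov: 85` above.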

Further down, in the scene component, I'm rotating the camera according to the bearing of the webcam like so:

...

// Rotate around the y axis by the negative bearing (bearing is measured
// clockwise from north, three.js rotation is counter-clockwise),
// converting degrees to radians.
const rotation = { x: 0, y: bearing * -1, z: 0 };
camera.rotation.x = (rotation.x * Math.PI) / 180;
camera.rotation.y = (rotation.y * Math.PI) / 180;
camera.rotation.z = (rotation.z * Math.PI) / 180;

...

Any tips/thoughts on how to configure the camera for a good match between the three.js boxes and real-life objects?

Upvotes: 0

Views: 2354

Answers (1)

Pauli

Reputation: 76

As a GIS developer I can give a few hints on this issue:

  • Most 3D graphics APIs (such as OpenGL / WebGL, which three.js wraps) expect all coordinates to be in a linear Cartesian space, but here the input data (the marker and camera locations) is given as latitude and longitude (degrees as floating-point numbers). These are spherical coordinates that need to be transformed first.
  • So to convert our spherical lat/lon coordinates to a linear space, we first need to decide which coordinate system to work in. Many coordinate reference systems exist for various purposes, but for simplicity we will use the so-called Web Mercator projection (identifier EPSG:3857) as our target reference system. This transforms our values from degrees to meters.
  • There are a few ways to achieve this conversion, for example by using a JS library known as Proj4js or some helper functions of web mapping libraries such as Leaflet or OpenLayers.
  • Example: var xy = proj4( "EPSG:4326", "EPSG:3857", [ lon, lat ] );
  • Once we have the coordinates in meters, it is important to express the marker locations relative to the camera location; otherwise we would get very large coordinate values. So my advice is to put the camera at the scene origin (0, 0, 0) and position the markers by their delta vectors.
  • Example: var position = new THREE.Vector3( camXY[ 0 ] - markerXY[ 0 ], 0.0, markerXY[ 1 ] - camXY[ 1 ] );
  • You can then rotate the camera according to the given heading and pitch, and it should more or less line up if the field of view is correct.
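Putting the steps above together, here is a minimal self-contained sketch. The EPSG:4326 → EPSG:3857 conversion is written out by hand (the standard spherical Web Mercator formula) instead of calling Proj4js, and the coordinates are made-up example values:

```javascript
const R = 6378137; // Web Mercator sphere radius in meters

// Convert lon/lat in degrees (EPSG:4326) to Web Mercator meters (EPSG:3857).
function toWebMercator(lon, lat) {
  const x = (R * lon * Math.PI) / 180;
  const y = R * Math.log(Math.tan(Math.PI / 4 + (lat * Math.PI) / 360));
  return [x, y];
}

const camLonLat = [13.405, 52.52];    // assumed camera location
const markerLonLat = [13.41, 52.521]; // assumed marker location

const camXY = toWebMercator(camLonLat[0], camLonLat[1]);
const markerXY = toWebMercator(markerLonLat[0], markerLonLat[1]);

// Camera sits at the scene origin; the marker is placed by its delta vector,
// following the convention from the bullet above.
const markerPosition = [
  camXY[0] - markerXY[0],
  0.0,
  markerXY[1] - camXY[1],
];
```

With the camera at the origin, `markerPosition` can be passed straight to a mesh's `position`, and the heading/pitch rotation from the question is applied on top.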

Upvotes: 6
