Reputation: 1332
I'm creating a .obj file importer at runtime in Unity3D. The problem is that when I import a .obj file, the x coordinate comes in flipped; by "flipped" I mean that if the x coordinate of a vertex in the file says -12, then Unity reads 12, and vice versa.
The logged coordinates look like this:
-42.4 -6.608938 -1.6
-42 -6.579293 -1.6
-42.4 -6.652683 -1.2
-42.4 -6.608938 -1.6
Whereas my original .obj file had vertices like this:
v 42.4000015258785 -6.60893774032594 -1.60000002384146
v 42.4000015258785 -6.65268325805652 -1.20000004768452
v 42.0000000000008 -6.57929277420054 -1.60000002384146
v 42.0000000000004 -6.57929277420055 -1.60000002384106
So, the question is, how can I prevent Unity from changing the signs of my coordinates?
Upvotes: 1
Views: 325
Reputation: 3777
I found a similar question on gamedev.stackexchange.com. From one of the answers there:
The actual OBJ file format specification declares that, "A right-hand coordinate system is used to specify the coordinate locations."
Unity uses a left-hand coordinate system.
Conversion from right-handed to left-handed is accomplished by negating the coordinates on any axis (it doesn't matter which one).
Unity negates the X-coordinates to convert the right-handed OBJ data to left-handed data.
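Since you're writing your own importer rather than using Unity's built-in one, you can apply (or skip) the same conversion yourself when you parse the vertex lines. Here is a minimal sketch of the idea in Python pseudocode terms (the function name and structure are illustrative, not any Unity API) — negate X when reading right-handed OBJ data into a left-handed engine:

```python
# Sketch: converting right-handed OBJ vertices to a left-handed
# convention by negating X. Illustrative only, not Unity's importer.

def parse_obj_vertices(obj_text, flip_x=True):
    """Parse 'v x y z' lines; optionally negate X for left-handed use."""
    vertices = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":
            x, y, z = (float(p) for p in parts[1:4])
            if flip_x:
                x = -x  # right-handed -> left-handed
            vertices.append((x, y, z))
    return vertices

obj = "v 42.4000015258785 -6.60893774032594 -1.60000002384146"
print(parse_obj_vertices(obj))
# [(-42.4000015258785, -6.60893774032594, -1.60000002384146)]
```

One caveat: if you negate an axis, the triangle winding order is mirrored, so you also need to reverse the vertex order of each face when building the mesh, or the faces will appear inside-out. If you want Unity to show the same signs as the file, parse with `flip_x=False` and accept that the model is mirrored relative to the right-handed original.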
Upvotes: 3