12ksins

Reputation: 307

Calculating cubic Bézier curve offset using normal vector doesn't produce correct offset

I'm trying to offset a cubic Bézier curve by doing the following:

  1. Find the x and y derivatives to get the tangent vector (dx, dy).
  2. Rotate the vector 90° to (-dy, dx) to get the normal.
  3. Make it a unit vector by dividing by its magnitude.
  4. Multiply by the desired offset and add the result to the point on the curve.

from math import sqrt

def cubic_offset(P0, P1, P2, P3, t, dist):
    initx, inity = cubic(P0, P1, P2, P3, t)   # point on the curve
    dx, dy = cubic_dt(P0, P1, P2, P3, t)      # this is the tangent vector

    normx, normy = -dy, dx                    # rotate 90 degrees
    mag = sqrt(normx**2 + normy**2)
    normx, normy = normx/mag, normy/mag       # normalize

    return initx + dist*normx, inity + dist*normy
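
For anyone reproducing this: the cubic and cubic_dt helpers aren't shown above, so here is a minimal sketch of the standard point and derivative formulas they correspond to (not necessarily my exact implementations):

def cubic(P0, P1, P2, P3, t):
    # Sketch: standard Bernstein form (actual implementation not shown above)
    # (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3
    s = 1 - t
    return (s**3*P0[0] + 3*s**2*t*P1[0] + 3*s*t**2*P2[0] + t**3*P3[0],
            s**3*P0[1] + 3*s**2*t*P1[1] + 3*s*t**2*P2[1] + t**3*P3[1])

def cubic_dt(P0, P1, P2, P3, t):
    # Sketch: first derivative, 3(1-t)^2 (P1-P0) + 6(1-t)t (P2-P1) + 3t^2 (P3-P2)
    s = 1 - t
    return (3*s**2*(P1[0] - P0[0]) + 6*s*t*(P2[0] - P1[0]) + 3*t**2*(P3[0] - P2[0]),
            3*s**2*(P1[1] - P0[1]) + 6*s*t*(P2[1] - P1[1]) + 3*t**2*(P3[1] - P2[1]))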

However, this isn't working, especially near extrema:

[Image: red marks an extremum point on the curve]

I also saw this blog post, https://observablehq.com/@s-silva/bezier-curve-offsets, which confirms this process works and provides a preview that doesn't produce errors.

Our code matches almost perfectly; however, his works and mine doesn't.

My derivative also appears correct, as I'm using it to get the extrema and the bounding box, which works perfectly fine.
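
(For context, this is roughly how the extrema check works; a sketch of the standard approach rather than my exact code. Each coordinate of the derivative is a quadratic in t, and its roots in [0, 1] are the curve's extrema.)

from math import sqrt

def extrema_1d(p0, p1, p2, p3):
    # Hypothetical helper, not from my actual code: roots of the
    # derivative's quadratic a*t^2 + b*t + c in [0, 1]; the coefficients
    # come from the expanded form of the derivative.
    a = 3*(p3 - 3*p2 + 3*p1 - p0)
    b = 6*(p2 - 2*p1 + p0)
    c = 3*(p1 - p0)
    if a == 0:  # quadratic degenerates to a line
        return [-c/b] if b != 0 and 0 <= -c/b <= 1 else []
    disc = b*b - 4*a*c
    if disc < 0:
        return []
    roots = [(-b - sqrt(disc))/(2*a), (-b + sqrt(disc))/(2*a)]
    return [t for t in roots if 0 <= t <= 1]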

Upvotes: 0

Views: 196

Answers (1)

12ksins

Reputation: 307

Solved:

Because the offset was only wrong around extrema, where the derivative is close to 0, I figured it could have something to do with floating-point arithmetic.

I copied the derivative that's on Wikipedia:

3(1-t)^2(P1 - P0) + 6t(1-t)(P2 - P1) + 3t^2(P3 - P2)

and it worked perfectly fine, even though it is mathematically equivalent to my derivative:

3t^2(P3 - 3P2 + 3P1 - P0) + 6t(P2 - 2P1 + P0) + 3(P1 - P0)
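
To double-check that the two forms really are the same polynomial, a quick symbolic comparison (sketched with sympy; not part of my original code) expands their difference to zero:

import sympy as sp

# Symbolic check that the Wikipedia form and the expanded form are equal
t, P0, P1, P2, P3 = sp.symbols('t P0 P1 P2 P3')
wiki = 3*(1 - t)**2*(P1 - P0) + 6*t*(1 - t)*(P2 - P1) + 3*t**2*(P3 - P2)
mine = 3*t**2*(P3 - 3*P2 + 3*P1 - P0) + 6*t*(P2 - 2*P1 + P0) + 3*(P1 - P0)
print(sp.expand(wiki - mine))  # prints 0: the forms are algebraically identical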

Apparently, when computed in Python, the average difference between them (over 2,000,000 evaluations, with P values between 0 and 700) is 0.049 for t between 0 and 1, and a staggering 43.2 for t between 0 and 100.

Any explanation of how this error is produced would be most appreciated, as I'm not that knowledgeable in such things.

EDIT:

Turns out it's not floating point that's to blame, but tuples! Or maybe something else, I don't know; any help would be appreciated.

from random import randint, uniform

def dt1(P0, P1, P2, P3, t):
    # Wikipedia form of the derivative
    return (3*(1-t)**2*(P1[0] - P0[0]) + 6*(1-t)*t*(P2[0] - P1[0]) + 3*t**2*(P3[0] - P2[0]),
            3*(1-t)**2*(P1[1] - P0[1]) + 6*(1-t)*t*(P2[1] - P1[1]) + 3*t**2*(P3[1] - P2[1]))

def dt2(P0, P1, P2, P3, t):
    # expanded form (my original derivative)
    return (t**2*3*(P3[0] - 3*P2[0] + 3*P1[0] - P0[0]) + t*6*(P2[0] - 2*P1[0] + P0[0]) + 3*(P1[0] - P0[0]),
            t**2*3*(P3[1] - 3*P2[1] + 3*P1[1] - P0[1]) + t*6*(P2[1] - 2*P1[1] + P0[0]) + 3*(P1[1] - P0[0]))

def dt1_(P0, P1, P2, P3, t):
    # scalar (single-coordinate) version of the Wikipedia form
    return 3*(1-t)**2*(P1 - P0) + 6*(1-t)*t*(P2 - P1) + 3*t**2*(P3 - P2)

def dt2_(P0, P1, P2, P3, t):
    # scalar (single-coordinate) version of the expanded form
    return t**2*3*(P3 - 3*P2 + 3*P1 - P0) + t*6*(P2 - 2*P1 + P0) + 3*(P1 - P0)

reps = 1_000_000

# with tuples, t between 0 and 1
total = 0
for i in range(reps):
    Points = [(randint(0, 700), randint(0, 700)) for _ in range(4)]
    t = uniform(0, 1)
    dt_1 = dt1(*Points, t)
    dt_2 = dt2(*Points, t)
    total += dt_1[0] - dt_2[0]
    total += dt_1[1] - dt_2[1]

print(f"for t between 0 and 1, with tuples, the average difference is {total/(reps*2)}")

# with tuples, t between 0 and 100
total = 0
for i in range(reps):
    Points = [(randint(0, 700), randint(0, 700)) for _ in range(4)]
    t = uniform(0, 100)
    dt_1 = dt1(*Points, t)
    dt_2 = dt2(*Points, t)
    total += dt_1[0] - dt_2[0]
    total += dt_1[1] - dt_2[1]

print(f"for t between 0 and 100, with tuples, the average difference is {total/(reps*2)}")

# without tuples, t between 0 and 1
total = 0
for i in range(reps*2):
    Points = [randint(0, 700) for _ in range(4)]
    t = uniform(0, 1)
    dt_1 = dt1_(*Points, t)
    dt_2 = dt2_(*Points, t)
    total += dt_1 - dt_2

print(f"for t between 0 and 1, without tuples, the average difference is {total/(reps*2)}")

# without tuples, t between 0 and 100
total = 0
for i in range(reps*2):
    Points = [randint(0, 700) for _ in range(4)]
    t = uniform(0, 100)
    dt_1 = dt1_(*Points, t)
    dt_2 = dt2_(*Points, t)
    total += dt_1 - dt_2

print(f"for t between 0 and 100, without tuples, the average difference is {total/(reps*2)}")

Results:

for t between 0 and 1, with tuples, the average difference is 0.10319855785147072
for t between 0 and 100, with tuples, the average difference is -21.841299912204903
for t between 0 and 1, without tuples, the average difference is -3.370475946951057e-17
for t between 0 and 100, without tuples, the average difference is -1.1170571903237891e-12

Upvotes: -1
