BigD4J

Reputation: 83

Python: Approximating ln(x) using Taylor Series

I'm trying to build an approximation of ln(1.9) that is accurate to ten digits (so 0.6418538862).

I'm using a simple function I've built from the series for ln[(1 + x)/(1 - x)].
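
For reference, that series is

ln[(1 + x)/(1 - x)] = 2 * (x + x^3/3 + x^5/5 + ...),   valid for |x| < 1,

and solving (1 + x)/(1 - x) = 1.9 for x gives x = 0.9/2.9, which is where that constant in the code comes from.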

Here is my code so far:

# function for the series of ln[(1 + x)/(1 - x)]:
# 2 * (x + x^3/3 + x^5/5 + ...), summed over the odd powers up to n

def taylor_two(x, n):
    i = 1
    taySum = 0
    while i <= n:
        taySum += pow(x, i) / i   # add the next odd-power term
        i += 2
    return 2 * taySum

x = 0.9 / 2.9   # from (1 + x)/(1 - x) = 1.9

print(taylor_two(x, 12))

print(taylor_two(x, 17))

What I need to do now is rework this so that it reports the number of terms needed to approximate ln(1.9) to the 10 digits above, displays the value the series gives for that many terms, and also shows the error.

I assume I need to wrap my function in a loop somehow, but how can I get it to stop iterating once it has reached the 10 digits needed?

Thank you for your help!

Upvotes: 3

Views: 5454

Answers (1)

Roland Smith

Reputation: 43563

The principle is:

  • Look at how much each term adds to the result.
  • Stop when that addition is smaller than 1e-10. Because this is an alternating series whose terms shrink in absolute value, the truncation error is bounded by the first omitted term, so the result is then accurate to 10 decimals.

You're using the following formula, right?

ln(1 + x) = x - x^2/2 + x^3/3 - x^4/4 + ...   (valid for -1 < x ≤ 1)

(Note the validity range!)

def taylor_two():
    x = 1.9 - 1   # ln(1.9) = ln(1 + x) with x = 0.9
    i = 1
    taySum = 0
    while True:
        addition = pow(-1,i+1)*pow(x,i)/i
        if abs(addition) < 1e-10:
            # the next term no longer affects the 10th decimal; stop
            break
        taySum += addition
        # print('value: {}, addition: {}'.format(taySum, addition))
        i += 1
    return taySum

Test:

In [2]: print(taylor_two())
0.6418538862240631

In [3]: print('{:.10f}'.format(taylor_two()))
0.6418538862
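
If you also want the number of terms and the error that the question asks for, a minimal sketch of the same loop extended to report both could look like this (it uses math.log(1.9) only as a reference value for the error, and the name taylor_two_report is just for illustration):

import math

def taylor_two_report(tol=1e-10):
    # Sum the ln(1 + x) series for x = 0.9 until the next term drops below tol.
    # Returns the approximation, the number of terms summed, and the absolute
    # error measured against math.log(1.9).
    x = 1.9 - 1
    i = 1
    taySum = 0
    while True:
        addition = pow(-1, i + 1) * pow(x, i) / i
        if abs(addition) < tol:
            break
        taySum += addition
        i += 1
    terms = i - 1   # the term with index i was never added
    error = abs(taySum - math.log(1.9))
    return taySum, terms, error

value, terms, error = taylor_two_report()
print('{:.10f} after {} terms, error {:.2e}'.format(value, terms, error))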

Upvotes: 5
