Reputation: 119
Fair warning: this could be confusing, and the code I have written is more of a mind map than finished code.
I am trying to implement the Newton-Raphson method to solve equations. What I can't figure out is how to write this
equation in Python, to calculate the next approximation (xn+1) from the last approximation (xn). I have to use a loop to get closer and closer to the real answer, and the loop should terminate when the change between approximations is less than the variable h.
How do I terminate the loop when the approximations are no longer changing?
def derivative(f, x, h):
    deriv = (1.0/(2*h))*(f(x+h)-f(x-h))
    return deriv

def solve(f, x0, h):
    xn = x0
    prev = 0
    while (approx - prev > h):
        xn = xn - (f(xn))/derivative(f, xn, h)
    return xn
Upvotes: 2
Views: 31373
Reputation: 319
If the code under try: raises a ZeroDivisionError, Python will skip the rest of that block and execute the code under except ZeroDivisionError: instead. If you'd also like to handle some other exception XYZ with its own code, add an additional except XYZ: block.
try:
    nextX = lastX - newY / derivative(function, lastX, h)
except ZeroDivisionError:
    return newY
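Putting that snippet in context, here is a minimal self-contained sketch of how the try/except could sit inside the solver loop. The names (lastX, nextX, newY, h) are taken from the snippet above; returning lastX on a zero derivative is one possible choice, not the only one:

```python
def derivative(f, x, h):
    # central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

def solve(f, x0, h):
    lastX = x0
    nextX = lastX + 10 * h          # different from lastX so the loop starts
    while abs(lastX - nextX) > h:
        lastX = nextX
        newY = f(lastX)
        try:
            nextX = lastX - newY / derivative(f, lastX, h)
        except ZeroDivisionError:
            # flat (numerically zero) derivative: give up and
            # return the best estimate found so far
            return lastX
    return nextX
```

With a flat function the except branch fires on the first step; with a well-behaved function the loop converges normally.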
Upvotes: 2
Reputation: 46405
Here is an implementation of an N-R solver expanding on what you wrote above (complete and working). I added a few extra lines to show what is happening...
def derivative(f, x, h):
    return (f(x+h) - f(x-h)) / (2.0*h)  # might want to return a small non-zero value if this == 0

def quadratic(x):
    return 2*x*x - 5*x + 1  # just a function to show it works

def solve(f, x0, h):
    lastX = x0
    nextX = lastX + 10*h  # different from lastX, so the loop starts OK
    while abs(lastX - nextX) > h:  # this is how you terminate the loop - note the use of abs()
        newY = f(nextX)                  # just for debug... see what happens
        print("f(", nextX, ") =", newY)  # print out progress... again just debug
        lastX = nextX
        nextX = lastX - newY / derivative(f, lastX, h)  # update the estimate using N-R
    return nextX

xFound = solve(quadratic, 5, 0.01)  # call the solver
print("solution: x =", xFound)      # print the result
output:
f( 5.1 ) = 27.52
f( 3.31298701299 ) = 6.38683083151
f( 2.53900845771 ) = 1.19808560807
f( 2.30664271935 ) = 0.107987672721
f( 2.28109300639 ) = 0.00130557566462
solution: x = 2.28077645501
Edit - you could also check the value of newY and stop when it is "close enough to zero" - but usually you keep this going until the change in x is <= h (you can argue about the value of the = sign in a numerical method - I prefer the more emphatic < myself, figuring that one more iteration won't hurt).
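That alternative stopping criterion could look like the following sketch (solve_by_residual, tol, and max_iter are names I'm assuming here; max_iter is just a safety cap):

```python
def solve_by_residual(f, x0, h, tol=1e-6, max_iter=100):
    # Newton-Raphson, but stopping when |f(x)| is "close enough to zero"
    # rather than when the change in x falls below h.
    x = x0
    for _ in range(max_iter):   # cap iterations in case it never converges
        if abs(f(x)) <= tol:
            break
        deriv = (f(x + h) - f(x - h)) / (2.0 * h)  # same central difference as above
        x = x - f(x) / deriv
    return x
```

The trade-off: a residual test guarantees f is small at the answer, while the change-in-x test guarantees the estimate itself has stopped moving; for steep or flat functions the two can disagree.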
Upvotes: 7
Reputation: 4079
How do I write the code for the equation?
The Newton-Raphson method actually finds the zeroes of a function. To solve an equation g(x) = y, pass the function g(x) - y to the solver, so that when the passed function evaluates to zero, g(x) = y.
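For instance, to solve x**2 = 9 you would hand the solver lambda x: g(x) - y. A small self-contained sketch (the newton helper and its parameter names are assumed here, not from the question):

```python
def newton(f, x0, h=1e-4, steps=50):
    # basic Newton-Raphson with a central-difference derivative;
    # h doubles as the derivative step and the stopping tolerance
    x = x0
    for _ in range(steps):
        step = f(x) / ((f(x + h) - f(x - h)) / (2.0 * h))
        x -= step
        if abs(step) <= h:
            break
    return x

g = lambda x: x * x
y = 9
root = newton(lambda x: g(x) - y, x0=5)  # finds x with g(x) = y, i.e. x near 3
```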
How do I terminate the loop when the approximations are not changing anymore?
Your code already does that: when the two approximations are equal to each other, the difference will be 0, which is less than h. However, current_approx - prev_approx > h should be current_approx - prev_approx >= h, since you want it to terminate when the difference is less than h. I also improved the variable names.
def derivative(f, x, accuracy):
    deriv = (1.0/(2*accuracy))*(f(x+accuracy)-f(x-accuracy))
    return deriv

def solve(f, approximation, h):
    current_approx = approximation
    prev_approx = float("-inf")  # so the while loop runs the first time
    while current_approx - prev_approx >= h:
        prev_approx = current_approx
        current_approx = current_approx - (f(current_approx))/derivative(f, current_approx, h)
    return current_approx
Upvotes: 0
Reputation: 234725
You'll be lucky to get convergence, since your derivative is not exact to the limits of numerical precision.
Aside from not guarding against division by zero, there is nothing wrong with your implementation of the Newton-Raphson step: that is your statement xn = xn - (f(xn))/derivative(f, xn, h).
But since you're using an approximate derivative, you should switch to a different root-finding scheme once you have the root bracketed; as far as the Newton-Raphson part is concerned, that is your termination condition. A good choice is Brent's method, which will always find a root once it is bracketed, even without a derivative.
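The bracketed fallback could be sketched as follows. For simplicity this uses plain bisection rather than Brent's method (Brent converges faster but needs more bookkeeping; SciPy provides it as scipy.optimize.brentq). It assumes f(a) and f(b) have opposite signs, i.e. the root is already bracketed:

```python
def bisect(f, a, b, tol=1e-10):
    # repeatedly halve the bracket [a, b] that is known to contain a root
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:
            b = m           # sign change in [a, m]: root lies there
        else:
            a, fa = m, fm   # otherwise the root lies in [m, b]
    return 0.5 * (a + b)
```

Unlike Newton-Raphson, this cannot diverge: the bracket shrinks by half every iteration regardless of how inaccurate any derivative estimate is.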
Upvotes: 0