Reputation:
I am trying to make a program where, if I input an angle in degrees, it converts it to radians and then calculates its sine, cosine and tangent using a Taylor series. Note that I am not supposed to use the math module.
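The degree-to-radian conversion happens earlier in the program; roughly, it looks like this (pi hard-coded, since the math module is off-limits, and precision defaulted to 10):

precision = 10                      # number of decimal places I want printed
pi = 3.141592653589793              # hard-coded because I cannot use the math module
deg = float(input("Enter the angle in degrees: "))
rad = deg * pi / 180                # degrees to radians
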
def sinAns(rad):
    radPlaceHolder = rad          # running sum of the series, starting with the first term
    counter = 0
    numberForFacto = 1
    tOld = rad
    tNew = 0
    tPlaceHolder = rad            # most recent term of the series
    pr = precision + 1            # precision is defined globally earlier in the program
    tDiff = 1
    sinAns = 0
    while abs(tDiff) > .5 * 10 ** (-pr):
        tOld = tPlaceHolder
        # next term of sin(x) = x - x^3/3! + x^5/5! - ...
        tNew = (-1 * tOld * rad * rad) / ((numberForFacto + 1) * (numberForFacto + 2))
        tPlaceHolder = tNew
        counter += 1
        numberForFacto += 2
        tDiff = abs(tOld) - abs(tNew)
        radPlaceHolder += tNew
    sinAns = radPlaceHolder
    return sinAns
This is how my code for calculating sine looks.
print("{0:}{1:{2}f}".format("sin= ", sinAns(rad), precision))
This is the line that prints the value later in the program, and I already have precision defaulted to 10. But when I run the program, it displays 0.500000 instead of using the precision I set. Even if I change the precision to any other value between 2 and 10, it ALWAYS shows 0.500000.
Any idea how to fix this? I have been tweaking it for an hour and still haven't got a clue.
Upvotes: 2
Views: 84
Reputation: 4537
You are missing the point (the . before the nested precision field):
print("{0}{1:.{2}f}".format("sin= ", sinAns(rad), precision))
# ^
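Without the dot, the nested {2} field is read as the minimum field width, so the float falls back to Python's default of 6 decimal places, which is why you always see 0.500000. With the dot, {2} becomes the precision. A quick standalone check (using 0.5 as a stand-in for sinAns(rad)):

precision = 10
value = 0.5  # stand-in for sinAns(rad)
print("{0}{1:{2}f}".format("sin= ", value, precision))   # width 10, default 6 decimals: sin=   0.500000
print("{0}{1:.{2}f}".format("sin= ", value, precision))  # 10 decimals: sin= 0.5000000000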
Upvotes: 2