Reputation: 67
class VariableRateLoan(Loan):
    def __init__(self, term, rateDict, face):
        super(VariableRateLoan, self).__init__(term, rateDict, face)
        self._term = float(term)
        self._face = float(face)
        if type(rateDict) is not dict:
            print('Please enter the rate as a dict')
        else:
            self._rateDict = rateDict

    def getRate(self, T):
        keyList = list(self._rateDict.keys())
        if T > self._term:
            print('Term entered cannot exceed the total terms of this loan!')
        for i, v in enumerate(keyList):
            if keyList[i] <= T < keyList[i + 1]:
                return self._rateDict[keyList[i]]
            else:
                return self._rateDict[max(self._rateDict.keys())]
Below is my test program:
def main():
    rate_dict = {0: 0.03, 5: 0.05, 11: 0.07}
    loan1 = VariableRateLoan(120, rate_dict, 100000)
    print(loan1.getRate(7))
So basically, if I enter T=7 for getRate, I should get the rate that starts at T=5, which is 0.05, but it returns 0.07 instead. If I enter anything below 5, it correctly gives me 0.03, the rate that starts at T=0. I'm unsure what is wrong with my code.

keyList[i] <= T < keyList[i+1] (this is the logic I am trying to achieve: if T falls in the range between two keys, return the value stored under the lower key)

One thing to note: I don't sort the keys, because every dictionary passed in starts at 0 and ends at a later term.
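The unexpected 0.07 can be reproduced with the loop in isolation (a minimal sketch using the same rate_dict; `get_rate` is a standalone stand-in for the method):

```python
def get_rate(rate_dict, T):
    key_list = list(rate_dict.keys())  # [0, 5, 11] in insertion order
    for i, v in enumerate(key_list):
        if key_list[i] <= T < key_list[i + 1]:
            return rate_dict[key_list[i]]
        else:
            # This else belongs to the if, not the for: it runs on the
            # FIRST iteration whose comparison fails and returns immediately.
            return rate_dict[max(rate_dict.keys())]

rate_dict = {0: 0.03, 5: 0.05, 11: 0.07}
print(get_rate(rate_dict, 7))  # 0.07: at i=0, 0 <= 7 < 5 is False, so the else fires
print(get_rate(rate_dict, 3))  # 0.03: at i=0, 0 <= 3 < 5 is True
```

For T=7, the first comparison `0 <= 7 < 5` is False, so the else branch returns the max key's rate before the loop ever reaches i=1.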
Upvotes: 0
Views: 63
Reputation: 734
It may not be the most elegant solution, but it should be simple to read:
def getRate(self, T):
    if T > self._term:
        print("Term entered cannot exceed the total terms of this loan!")
    required_key = max(term for term in self._rateDict if term <= T)
    return self._rateDict[required_key]
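For larger rate schedules, the same lookup can be done with a binary search via the standard-library bisect module (a sketch as a standalone function, `get_rate_bisect`; it assumes the keys include 0, so the index never goes negative):

```python
from bisect import bisect_right

def get_rate_bisect(rate_dict, T):
    # Sort once so the lookup also works for dicts whose keys
    # were not inserted in ascending order.
    keys = sorted(rate_dict)
    # bisect_right returns the count of keys <= T; step back one
    # index to land on the largest key not exceeding T.
    idx = bisect_right(keys, T) - 1
    return rate_dict[keys[idx]]

rate_dict = {0: 0.03, 5: 0.05, 11: 0.07}
print(get_rate_bisect(rate_dict, 7))   # 0.05
print(get_rate_bisect(rate_dict, 12))  # 0.07
```

The `max(... if term <= T)` scan above is O(n) per lookup, which is fine for a handful of rate steps; bisect brings it to O(log n) after the one-time sort.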
Upvotes: 2