Reputation: 1
I need to solve a problem from a Python course for beginners.
You are offered either a one-time payment of $1,000,000, or 1 cent that doubles every day for 30 days.
Write a program that calculates the amount produced by the doubling, to determine which option pays more.
My attempts:
i = 0.01
for i in range(31):
    day = i
    day_1 = day * (2 ** 2)
    print(i)
I don't know how to tell the program to calculate the amount for each day up to the 30th, other than writing a separate line for each day, which makes the code very long. I think there must be a construct that does this more concisely.
Upvotes: -3
Views: 85
Reputation: 703
First, I don't think using i as your starting value is a good move, since i is commonly used as a loop iterator. Try this instead:
value = 0.01          # day 1: one cent
step = 1
while step < 30:      # doubles 29 times, i.e. through day 30
    value *= 2
    step += 1
print(value)
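If you also want to see the day-by-day growth and the comparison with the one-time offer, here is a minimal sketch built on the same loop (the day numbering and the comparison against 1,000,000 are my reading of the question, not part of the original answer):

value = 0.01                  # day 1: one cent
for day in range(2, 31):      # days 2 through 30
    value *= 2                # each day pays double the day before
    print(f"Day {day}: ${value:,.2f}")

# compare the final daily payment with the one-time offer
print("doubling wins" if value > 1_000_000 else "one-time payment wins")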
Upvotes: -1
Reputation: 268
You can use this code:
start = 0.01             # day 1: one cent
temp = start
for i in range(30):      # 30 doublings
    temp *= 2
print(temp)
This code has a start value and sets a temp variable to it. For each of the 30 days it doubles the value in temp, and then prints the final amount when done.
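If the task is read as the total received across all 30 days rather than the day-30 payment alone (one possible reading of the problem statement; the question doesn't say which), a small variation keeps a running sum. Working in whole cents avoids float rounding:

payment = 1                  # day 1 payment in cents; integers avoid float rounding
total = 0
for day in range(1, 31):     # days 1 through 30
    total += payment         # collect today's payment
    payment *= 2             # tomorrow pays double
print(f"Total over 30 days: ${total / 100:,.2f}")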
Upvotes: 0