Reputation: 51
My task is to calculate the distance between a rectangle and the point (0, 0) in a coordinate system and print a particular answer. If the rectangle is nearer than 100m (the unit of the system is meters, 1 unit = 1 meter), it should print "100m"; if the distance is less than 200m, it should print "101m", and so on...
I've learned that I can use the Pythagorean theorem to get the distance between two coordinates. I implemented it into my program (in Python) but I have got some trouble with the output.
Let's try an example. A rectangle with the corners (-400, 200); (-300, 200); (-300, 300); (-400, 300) is about 360m away from the point (0, 0). The right output would be "103m".
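For reference, one way to get the shortest distance from the origin to an axis-aligned rectangle is to clamp the origin's coordinates into the rectangle's ranges and then apply the Pythagorean theorem. This is a sketch under my own assumptions (the question doesn't show how the distance is computed; the function name and signature are made up):

```python
import math

def distance_to_rect(x_min, y_min, x_max, y_max):
    # Clamp the origin (0, 0) into the rectangle's x- and y-ranges;
    # the clamped point is the point of the rectangle closest to the origin.
    nearest_x = min(max(0, x_min), x_max)
    nearest_y = min(max(0, y_min), y_max)
    # Pythagorean theorem: straight-line distance from (0, 0) to that point.
    return math.hypot(nearest_x, nearest_y)

print(distance_to_rect(-400, 200, -300, 300))  # about 360.56
```

The nearest point here is the corner (-300, 200), so the distance is sqrt(300² + 200²) ≈ 360.56m, matching the 360m in the example.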
Somebody asked something like this before, and the answer was to divide the distance by 100 (integer division) and insert the result into "10{}":
print("10{}m".format(distance//100))
Actually, this works for everything below 1000. But if the coordinates were (-4000, 2000); (-3000, 2000); (-3000, 3000); (-4000, 3000), the distance would be about 3605m and the output should be "136m", yet my code prints "1036m" instead.
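To make the failure concrete, here is a small sketch (the distances are hard-coded from the examples above) showing that the "10{}" trick is string concatenation, not addition:

```python
# Works for distances below 1000m: 360 // 100 = 3, glued after "10" -> "103m"
print("10{}m".format(360 // 100))   # 103m

# Breaks at 1000m and above: 3605 // 100 = 36, glued after "10" -> "1036m"
print("10{}m".format(3605 // 100))  # 1036m, not the expected 136m
```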
I hope you can understand my question!
Upvotes: 4
Views: 453
Reputation: 879691
print("{:d}m".format(100+(distance//100)))
For example,
In [16]: distance = 50; "{:d}m".format(100+(distance//100))
Out[16]: '100m'
In [17]: distance = 360; "{:d}m".format(100+(distance//100))
Out[17]: '103m'
In [18]: distance = 3605; "{:d}m".format(100+(distance//100))
Out[18]: '136m'
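A quick usage sketch tying the examples together (treating `distance` as an `int` is an assumption here; `"{:d}"` only accepts integers, so a float from a distance calculation would first need `int(distance)`):

```python
for distance in (50, 360, 3605):
    # Arithmetic addition carries properly: 100 + 36 = 136, unlike the
    # string concatenation "10" + "36" -> "1036".
    print("{:d}m".format(100 + (distance // 100)))
# 100m
# 103m
# 136m
```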
Upvotes: 3