pceccon

Reputation: 9844

Floats print differently when shown in dictionary - python

I'm creating a dictionary with this simple code:

import math

bins = 10  # ten bins, matching the ten keys printed below
pixel_histogram = {}
min_value = -0.2
max_value = 0.2
interval_size = (math.fabs(min_value) + math.fabs(max_value)) / bins

for i in range(bins):
    key = min_value + (i * interval_size)
    print key
    pixel_histogram[key] = 0
    print pixel_histogram

But I'm a little surprised because these are the values my prints show:

#Printing keys
-0.2
-0.16
-0.12
-0.08
-0.04
0.0
0.04
0.08
0.12
0.16

#Printing the dictionary
{0.0: 0, 
-0.08000000000000002: 0, 
0.15999999999999998: 0, 
-0.16: 0, 
0.12: 0, 
-0.12000000000000001: 0, 
0.08000000000000002: 0, 
-0.04000000000000001: 0, 
-0.2: 0, 
0.03999999999999998: 0}

I can't figure out why the values are displayed differently or how I could solve this. Any help would be appreciated. Thanks.

Upvotes: 3

Views: 1176

Answers (2)

Alok--

Reputation: 724

Python's print statement uses str() on the item being printed. For floating-point values, str() rounds the output to a limited number of significant digits (12 in Python 2).

When printing a dictionary, the print statement calls str() on the dictionary object. The dictionary, in its __str__() implementation, uses repr() on the keys. For floating-point values, repr() prints enough digits to uniquely identify the stored value, which is more than str() shows.

The reason dictionaries use repr() and not str() for keys is that you almost certainly want print {'1': 1} to display differently from print {1: 1}.
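
For example, reproducing the question's i == 3 key (a minimal sketch, assuming Python 2 and bins = 10 as in the question):

key = -0.2 + 3 * 0.04    # same value as min_value + 3*interval_size in the question

print str(key)           # -0.08                 (str: rounded display)
print repr(key)          # -0.08000000000000002  (repr: enough digits to round-trip)
print {key: 0}           # {-0.08000000000000002: 0}  (dict __str__ applies repr to keys)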

Upvotes: 8

Marcin

Reputation: 49826

The values are the same. In the first case, they have simply been rounded to a smaller number of decimal places for display. Surely you are familiar with the rounding of numbers?
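
To see that both displays refer to the same stored number, you can format one of the keys to different precisions (a small sketch, assuming Python 2 and the i == 3 key from the question):

key = -0.2 + 3 * 0.04               # the key the question's loop produces for i == 3
print '%.2f' % key                  # -0.08                 (rounded to two decimal places)
print '%.17f' % key                 # -0.08000000000000002  (the same value, more places)
print key == -0.08000000000000002   # True: both displays name the same stored float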

Upvotes: -3
