czy

Reputation: 513

ValueError: invalid literal for int() with base 2: '1.0'

I am using a genetic algorithm to optimize something, and binary code is used to encode the variables.

When the variables, represented as sequences of binary digits, were decoded into their corresponding values, the error ValueError: invalid literal for int() with base 2: '1.0' was returned.

Specifically,

child = '1.0'
V = int(child, 2)
ValueError: invalid literal for int() with base 2: '1.0'

Could you please give me some clues on how to fix this error?

Upvotes: 0

Views: 1245

Answers (1)

Mark Ransom

Reputation: 308130

You can't convert a string that contains a decimal point to an integer. You need to remove the decimal point and everything after it.

child = '1.0'
V = int(child.split('.')[0], 2)
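For instance, the same split can be applied to a whole population of binary-string children before decoding; the names and values below are purely illustrative, not taken from the question:

children = ['1.0', '101.0', '11']   # hypothetical GA children encoded as binary strings
decoded = [int(c.split('.')[0], 2) for c in children]  # drop the '.' and what follows, parse the rest as base 2
print(decoded)  # [1, 5, 3]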

Upvotes: 1
