user4826496

Reputation:

Why does JavaScript accept a decimal as an integer?

I have this html:

<input type='number' id='length' step='0.1' min='0' max='5'>Length

and this Javascript

num=document.getElementById('length').value;
if(num==1 || 2 || 3 || 4|| 5){
num='0'+num;
}

My problem is this: I only want the code inside the brackets to execute when the number from the input is an integer, but it also runs when it gets 0.8 or some other decimal. Any idea why? How do I fix it? Thanks.

Upvotes: 4

Views: 279

Answers (4)

Madness

Reputation: 2726

To make sure num is a whole number, without having to list every possibility, use:

if (num % 1 == 0)
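
Applied to your code it could look something like this (a minimal sketch; note that .value gives you a string, which the % operator coerces to a number, and an empty field would also pass the check, hence the extra guard):

num = document.getElementById('length').value;
if (num !== '' && num % 1 == 0) {
    num = '0' + num;
}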

Upvotes: 7

Ted

Reputation: 570

You should do

if (num == 1 || num == 2 || num == 3 || num == 4 || num == 5)

Otherwise the expression only compares num with 1; the remaining operands (2, 3, 4, 5) are non-zero numbers, and any non-zero number in JS is considered true, so the condition always succeeds.
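
For instance, a quick illustration (my own example, not from the question):

if (2) {
    console.log('always runs');   // any non-zero number is truthy
}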

Upvotes: 2

Leo

Reputation: 13848

Why:

num==1 || 2 || 3 || 4|| 5

is equivalent to:

(num==1) || 2 || 3 || 4|| 5

so if num is "1" (.value always returns a string), the expression evaluates to true; otherwise it evaluates to 2, which is also a truthy value. Either way, your if statement always succeeds.
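
You can see this by logging what the condition actually evaluates to (illustrative value, assuming the input was 0.8):

num = "0.8";                                 // .value is always a string
console.log(num == 1 || 2 || 3 || 4 || 5);   // logs 2, a truthy value, so the if runs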

How to fix:

// implicitly converts the string type into number type before comparison
// returns true if it is an integer-like string
num == Math.floor(num) 

So you could do it like this:

if (num == Math.floor(num) && num > 0 && num < 6) {
    // an integer-like string that meets the requirement [1, 5]
}

But remember, num is still a string at this point. If you want a number, do:

num = +num
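
Putting it together, a minimal sketch (assuming the same id='length' input from the question):

num = document.getElementById('length').value;     // always a string
if (num == Math.floor(num) && num > 0 && num < 6) {
    num = +num;                                     // now a number in [1, 5]
}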

Upvotes: 2

FlokiTheFisherman

Reputation: 234

You have to edit the if statement:

if (num == 1 || num == 2 || num == 3 || num == 4 || num == 5)

Upvotes: 0
