Reputation: 3692
I am creating a Regular Expression in JavaScript which should look for both numbers (at least 1) and letters (at least 1), with the total length being between 6 and 10. I came across some unexpected behavior.
My regex - /^[a-z+\d+]{6,10}$/g
This doesn't work properly because, inside a character class, it checks for letters or digits, but not BOTH. Therefore, I would expect "123456" to fail: while it contains 6 characters and has at least 1 digit, it does not include at least 1 letter.
However, in the code snippet below, when I store the regex in the rgx variable and call .test() on it, it somehow correctly returns false, as shown in the second console.log statement. But on the very next line, when I use the same regex literal directly with .test(), it returns true.
let rgx = /^[a-z+\d+]{6,10}$/g;
// works fine
console.log(rgx.test("abcd12"));
// returns false
console.log(rgx.test("123456"));
// same regex returns true
console.log(/^[a-z+\d+]{6,10}$/g.test("123456"));
What's going on here?
Upvotes: 0
Views: 192
Reputation: 49945
Try
let rgx = /^[a-z+\d+]{6,10}$/g;
let rgx2 = /^[a-z+\d+]{6,10}$/g;
console.log(rgx.test("abcd12")); //true
console.log(rgx2.test("123456")); //true
It's because the JavaScript RegExp object is stateful when the 'g' (or 'y') flag is set: if you re-use the same RegExp object, each subsequent match starts from its lastIndex property, which points at the end of the previous match. When a match fails, lastIndex is reset to 0. A fresh regex literal has lastIndex 0, which is why the inline version behaves differently from the re-used variable.
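The effect is easy to see by inspecting lastIndex directly. A minimal sketch, using a simpler pattern than the one in the question:

```javascript
const re = /\d+/g;

console.log(re.test("abc123")); // true; the match "123" ends at index 6
console.log(re.lastIndex);      // 6 — the next search starts here
console.log(re.test("abc123")); // false; nothing to match at or after index 6
console.log(re.lastIndex);      // 0 — a failed match resets lastIndex
```

This is also why the question's snippet behaves as it does: after rgx.test("abcd12") succeeds, rgx.lastIndex is 6, so the next .test("123456") starts searching from the end of the string and fails. Dropping the g flag (it serves no purpose with ^...$ anchors and .test()) avoids the problem entirely.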
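As for the original goal (at least 1 letter, at least 1 digit, 6-10 characters total): a character class alone can't require "both", but lookaheads can. A sketch of one way to write it — note that the stray + signs inside the original class [a-z+\d+] also made literal "+" characters valid, so they are dropped here:

```javascript
// (?=.*[a-z]) requires at least one letter, (?=.*\d) at least one digit,
// and [a-z\d]{6,10} enforces the allowed characters and overall length.
// No 'g' flag — .test() with anchors doesn't need it.
const rgx = /^(?=.*[a-z])(?=.*\d)[a-z\d]{6,10}$/;

console.log(rgx.test("abcd12")); // true
console.log(rgx.test("123456")); // false — no letter
console.log(rgx.test("abcdef")); // false — no digit
```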
Upvotes: 4