Reputation: 23498
I'm writing an algorithm to decode base64. Near the very end of the code below, if I change:
Binary.substr((FirstChar - 1) >= 0 ? (I - 1) : 0);
to
Binary.substr((I - 1) >= 0 ? (I - 1) : 0);
it throws std::out_of_range. However, if I leave it alone, it works fine.
The entire code is as follows:
#include <iostream>
#include <bitset>
#include <algorithm>
#include <string>
#include <cstddef>

static const std::string Base64Chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

std::string DecodeBase64(std::string Data)
{
    std::string Binary = std::string();
    std::string Result = std::string();
    // Walk backwards past any '=' padding, then expand each
    // base64 character into its 6-bit binary representation.
    for (std::size_t I = Data.size(); I > 0; --I)
    {
        if (Data[I - 1] != '=')
        {
            std::string Characters = Data.substr(0, I);
            for (auto it = Characters.begin(); it != Characters.end(); ++it)
                Binary += std::bitset<6>(Base64Chars.find(*it)).to_string();
            break;
        }
    }
    // Reassemble the bit string into 8-bit chunks and convert each to a char.
    for (std::size_t I = 0; I < Binary.size(); I += 8)
    {
        int FirstChar = I;
        std::string str = Binary.substr((FirstChar - 1) >= 0 ? (I - 1) : 0);
        Result += static_cast<char>(std::bitset<8>(str).to_ulong());
        if (I == 0) ++I;
    }
    return Result;
}

int main()
{
    std::cout << DecodeBase64("aGVsbG8gdGhlcmUgbm9vYg==");
}
It is weird because I assigned I to FirstChar right before calling substr, so they should hold the exact same value. Any ideas why this is happening?
Upvotes: 2
Views: 154
Reputation: 726479
This is because I is of type std::size_t, which is unsigned. When I is zero, I - 1 wraps around to a very large positive number, so substr is called with a position far past the end of the string and throws std::out_of_range.
Converting I to an int, which happens in the assignment to FirstChar, fixes the problem: FirstChar is signed, so FirstChar - 1 can become negative and the comparison works as intended.
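Here is a minimal sketch of the wraparound (variable names mirror the question; the printed value assumes a typical 64-bit platform):

#include <iostream>
#include <cstddef>

int main()
{
    std::size_t I = 0;
    // unsigned arithmetic wraps: I - 1 is the largest std::size_t value, not -1
    std::cout << I - 1 << '\n';         // 18446744073709551615 on 64-bit platforms
    int FirstChar = static_cast<int>(I);
    // signed arithmetic behaves the way the question expects
    std::cout << FirstChar - 1 << '\n'; // -1
}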
Rewriting I - 1 >= 0 as the equivalent I >= 1 avoids the unsigned subtraction entirely and fixes the problem:
Binary.substr(I >= 1 ? (I - 1) : 0);
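As an aside, the comparison can be dropped altogether: std::bitset<8> takes its value from the first eight characters of the string it is given, so the loop can slice each 8-bit chunk directly. A sketch, keeping the rest of the function as posted:

for (std::size_t I = 0; I < Binary.size(); I += 8)
    Result += static_cast<char>(std::bitset<8>(Binary.substr(I, 8)).to_ulong());

This removes the need for FirstChar, the ternary, and the if (I == 0) ++I; adjustment.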
Upvotes: 6