Reputation: 145
I've missed posting here; you always teach me a lot!
I have an assignment to write my own two functions that convert an integer to its ASCII string representation and back again, for an embedded system, and they have very specific requirements:
Here are the provided function declarations that I should build on; they are put in a data.h file:
#include <stdint.h>
uint8_t my_itoa(int32_t data, uint8_t * ptr, uint32_t base);
int32_t my_atoi(uint8_t * ptr, uint8_t digits, uint32_t base);
These functions will be used for basic data manipulation. Here is how they are going to be called from another file, course1.c, which includes data.c:
digits = my_itoa( num, ptr, BASE_16);
value = my_atoi( ptr, digits, BASE_16);
and:
digits = my_itoa( num, ptr, BASE_10);
value = my_atoi( ptr, digits, BASE_10);
There are certain required features for each function:
for my_itoa:
for my_atoi:
After researching this thread: Writing my own atoi function
I was able to write this code in my data.c:
#include <stdlib.h>
#include <string.h>

int main(){
    return 0;
}

uint8_t my_itoa(int32_t data, uint8_t * ptr, uint32_t base)
{
    return *ptr;
};
int32_t my_atoi(uint8_t * ptr, uint8_t digits, uint32_t base)
{
    const char* str= ptr;
    uint8_t len = strlen(str);
    str = (uint8_t*) malloc(len * sizeof(uint8_t));
    while (*str != '\0')
    {
        uint8_t a;
        a = *str - '0';
        *ptr = a;
        str++;
        ptr++;
    }
    str = str - len;
    ptr = ptr - len;
    return *ptr;
};
As I understand it, this part removes the null character:
a = *str - '0';
This code has a main problem: when I compile it as shown, I get errors in my_atoi that the pointer initialization and the pointer assignment differ in signedness:
src/data.c: In function ‘my_atoi’:
src/data.c:20:19: error: pointer targets in initialization differ in
signedness [-Werror=pointer-sign]
const char* str= ptr;
^~~
src/data.c:22:6: error: pointer targets in assignment differ in
signedness [-Werror=pointer-sign]
str = (uint8_t*) malloc(len * sizeof(uint8_t));
^
And when I edit it to this:
uint8_t len = strlen(str);
I get an error that the pointer target in passing argument 1 of strlen differs in signedness, along with a note that the strlen declaration in string.h expects a const char *:
src/data.c: In function ‘my_atoi’:
src/data.c:21:19: error: pointer targets in passing argument 1 of
‘strlen’ differ in signedness [-Werror=pointer-sign]
int len = strlen(str);
^~~
In file included from src/data.c:3:0:
/usr/include/string.h:384:15: note: expected ‘const char *’ but
argument is of type ‘unsigned char *’
extern size_t strlen (const char *__s)
^~~~~~
src/data.c:22:6: error: pointer targets in assignment differ in
signedness [-Werror=pointer-sign]
str = (char*) malloc(len * sizeof(char));
^
There is also the fact that I need to get rid of strlen and string.h altogether. I looked at this thread: Converting ASCII to Hex and vice versa - strange issue, but it was not very useful; I don't actually understand the code inside it.
Another challenge is the third input parameter, which should be BASE_10, BASE_16, BASE_2, etc. How can the variable:
uint32_t base
which is an integer, store a value like "BASE_10" so that I can compare it in an if conditional? And how do the conversions to the different bases work, and how should I handle them?
I'm stuck on several fronts: how to approach this challenge, where to learn more, what the maximum string size requirement is, and how to handle signed data. I'm seeking help. Sorry for this long question, but I needed to include all the factors.
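My current guess (not confirmed by the assignment text) is that BASE_10 and friends are just plain integer macros defined somewhere in the course headers, something like:

```c
/* Hypothetical definitions -- the real ones would come from the course
 * headers. BASE_10 is just the integer 10, not a string. */
#define BASE_2  2
#define BASE_10 10
#define BASE_16 16
```

If that's the case, base simply holds 2, 10, or 16 and can be compared or used in arithmetic directly.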
EDIT: after editing the code like this:
uint8_t* str= ptr;
uint8_t len = strlen((const char *)str);
following this thread: cast unsigned char * (uint8_t *) to const char *
I have no errors now, but I still need to completely get rid of string.h.
Upvotes: 1
Views: 482
Reputation: 154592
What?
str = (uint8_t*) malloc(len * sizeof(uint8_t));
while (*str != '\0')
This my_atoi() code makes little sense. The code allocates memory, whose contents are unknown, then immediately tries to test them?
The memory allocation is not needed, and then strlen() is not needed either.
Character to decimal value
"I understand this part removes null character: a = *str - '0';" --> Not quite. When *str is a digit character, *str - '0' converts the character's encoded value (usually ASCII) to its numeric value (like 0 to 9).
*str - '0' is insufficient for hex digits like 'a' and 'F'.
Alternative
Instead, iterate through ptr and scale a running sum.
The code below does not meet 2 of the coding goals, which I leave for the OP as sub-problems to handle:
* string functions or libraries shouldn't be used.
* the function needs to handle signed data.
#include <ctype.h>
#include <limits.h>
#include <string.h>

// Convert 1 digit character to its numeric value
static int value(int ch) {
    if (isdigit(ch)) {
        return ch - '0';
    }
    if (isxdigit(ch)) {
        ch = tolower(ch);
        const char *xdigits = "abcdef";
        return (int)(strchr(xdigits, ch) - xdigits) + 10;
    }
    return INT_MAX; // non-digit
}
int32_t my_atoi(uint8_t * ptr, uint8_t digits, uint32_t base) {
    int32_t sum = 0;
    while (*ptr) {
        sum *= base;
        sum += value(*ptr++);
    }
    return sum;
}
Better code would detect non-numeric input, overflow, no conversion, etc. Other considerations not shown include handling a '-' sign and validating that the base is in range.
#include <ctype.h>
#include <limits.h>
#include <stdbool.h>
#include <stdint.h>

int32_t my_atoi(uint8_t * ptr, uint8_t digits, uint32_t base) {
    int32_t sum = 0;
    while (isspace(*ptr)) ptr++;  // skip leading white space
    if (*ptr == '+') ptr++;       // Allow optional leading `+`
    bool digit_found = false;
    while (*ptr) {
        unsigned digit = value(*ptr++);
        if (digit >= base) {
            return INT_MIN; // TBD signal unexpected digit
        }
        if (sum >= INT_MAX/base && (sum > INT_MAX/base || digit > INT_MAX%base)) {
            // Overflow
            return INT_MIN; // TBD signal OF
        }
        sum *= base;
        sum += digit;
        digit_found = true;
    }
    if (*ptr) {
        sum = INT_MIN; // TBD signal unexpected junk at the end
    }
    if (!digit_found) {
        sum = INT_MIN; // TBD signal no digits
    }
    return sum;
}
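As a hedged sketch of the '-' handling mentioned above (parse_signed is a hypothetical helper name, not part of the assignment API): record the sign up front, accumulate the magnitude, and negate at the end. Overflow detection is trimmed for brevity.

```c
#include <ctype.h>
#include <stdint.h>

/* Hypothetical helper: minimal leading-sign handling for a base-N parse.
 * Record the sign, accumulate the magnitude, negate at the end.
 * Overflow checks omitted for brevity. */
static int32_t parse_signed(const uint8_t *ptr, uint32_t base) {
    while (isspace(*ptr)) ptr++;               // skip leading white space
    int negative = 0;
    if (*ptr == '-') { negative = 1; ptr++; }  // remember the sign
    else if (*ptr == '+') { ptr++; }

    int32_t sum = 0;
    while (*ptr) {
        int ch = tolower(*ptr);
        int digit = (ch >= '0' && ch <= '9') ? ch - '0'
                  : (ch >= 'a' && ch <= 'f') ? ch - 'a' + 10
                  : -1;
        if (digit < 0 || (uint32_t)digit >= base) break; // stop at non-digit
        sum = sum * (int32_t)base + digit;
        ptr++;
    }
    return negative ? -sum : sum;
}
```

The same idea slots into my_atoi(): do the sign check once before the digit loop, then apply it to the final sum.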
my_itoa()
For help on my_itoa(), review What is the proper way of implementing a good "itoa()" function?. It does have a nice answer in there - somewhere.
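For orientation only, here is a rough sketch of the usual approach from that link: generate digits least-significant-first into the buffer, append the sign, reverse in place, and null-terminate. It assumes the return value is the character count including the null terminator (check your assignment spec), that base is valid, and it leaves error handling to the OP.

```c
#include <stdint.h>

/* Sketch, not a finished solution: digits come out least significant
 * first, so build the string backwards and reverse it at the end. */
uint8_t my_itoa(int32_t data, uint8_t *ptr, uint32_t base) {
    uint8_t i = 0;
    // 0u - (uint32_t)data avoids overflow when data == INT32_MIN
    uint32_t value = (data < 0) ? 0u - (uint32_t)data : (uint32_t)data;

    do {                                   // emit digits, least significant first
        uint32_t d = value % base;
        ptr[i++] = (uint8_t)(d < 10 ? '0' + d : 'a' + d - 10);
        value /= base;
    } while (value != 0);

    if (data < 0) ptr[i++] = '-';          // sign goes last, reversed below

    for (uint8_t lo = 0, hi = i - 1; lo < hi; lo++, hi--) { // reverse in place
        uint8_t tmp = ptr[lo];
        ptr[lo] = ptr[hi];
        ptr[hi] = tmp;
    }
    ptr[i++] = '\0';
    return i;                              // length including the terminator
}
```

A 34-byte buffer covers the worst case: 32 binary digits plus a sign plus the terminator.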
Upvotes: 2