Reputation: 63
I've been working on a shell writing assignment in C and I am currently stuck with something that I believe should be quite simple to fix, but I just can't see where the fault is. I've extracted the core parts of my code to demonstrate the problem.
The idea is that I get a string from the user which gets divided into separate strings, then inserted into an array which is then passed to execvp for execution.
You can see that I have hardcoded some values into the arguments array. If I use that with execvp, like execvp(arguments[0], arguments), it executes "ls -l" and works perfectly fine.
However, the problem appears when I let tokenizeInput handle a string that I enter, hardcoded in this example as "ls -l". It is quite straightforward: the tokenizer splits the string on the delimiter " " and assigns each token to my array of char pointers. I print out the values to check that they are correct, which they seem to be, and then I return the array.
So back in main, I receive the array that has been through the tokenizer. Its contents shouldn't be any different from the contents of the hardcoded arguments array. But when I use it, I just get the error "No such file or directory". I've printed out the array in main and I can't spot anything wrong with it.
I've been reading the manuals for execvp and strtok, and I'm pretty sure that both the array and the strings inside it are null terminated, just like execvp requires.
I don't have a clue what I'm doing wrong; any pointers in the right direction are appreciated. I hope it's not something silly.
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;string.h&gt;
#include &lt;unistd.h&gt;

char **tokenizeInput(char *input);

int main(int argc, char *argv[]) {
    /* This works fine! */
    char *arguments[3]; /* three slots: two strings plus the terminating NULL */
    arguments[0] = "ls";
    arguments[1] = "-l";
    arguments[2] = NULL; /* null terminating */

    /* This does not work... */
    char **inputs;
    inputs = tokenizeInput("ls -l");
    execvp(inputs[0], inputs);
    perror("Error");
    return 0;
}
char **tokenizeInput(char *input) {
    char newInput[70];
    strcpy(newInput, input);
    char **array = malloc(4 * sizeof(newInput)); /* memory pls */
    int i = 0;
    array[i] = strtok(newInput, " ");
    while (array[i] != NULL) {
        printf("Putting string: %s into array[%i]\n", array[i], i);
        array[++i] = strtok(NULL, " "); /* returns next token each loop */
    }
    printf("First argument: %s. ", array[0]);  /* should be "ls" */
    printf("Second argument: %s. ", array[1]); /* should be "-l" */
    printf("Third argument: %s. ", array[2]);  /* should be NULL */
    return array;
}
Upvotes: 1
Views: 709
Reputation: 977
The problem is that you copy the input string into an array allocated locally on the stack (the newInput array). The strtok() function simply returns pointers into that newInput array.
Everything works fine inside your tokenizeInput() function, but when you return from the function, the stack frame gets popped and then reused for other things, so your data may get overwritten.
Try mallocing the newInput buffer instead. Don't forget to free your memory when you're done, though!
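A minimal sketch of that fix, keeping the question's 4-slot argument array (so at most three tokens plus the terminating NULL) but moving the copied string to the heap so the token pointers stay valid after the function returns:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char **tokenizeInput(char *input) {
    /* Heap copy: unlike a stack array, this outlives the function. */
    char *newInput = malloc(strlen(input) + 1);
    strcpy(newInput, input);

    char **array = malloc(4 * sizeof(char *)); /* 3 tokens + NULL */
    int i = 0;
    char *tok = strtok(newInput, " ");
    while (tok != NULL && i < 3) {
        array[i++] = tok;       /* pointers into the heap buffer */
        tok = strtok(NULL, " ");
    }
    array[i] = NULL;            /* execvp requires a NULL-terminated array */
    return array;
}
```

Note that newInput is never freed here; in the shell use case that hardly matters, since a successful execvp() replaces the process image anyway. If you do need to free it, you must keep the original buffer pointer around, because array[0] only coincides with it when the input doesn't start with a delimiter.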
Edit: rather than using malloc(), you could use strdup() so you don't have to worry about checking the input length. But unless you're sure the first character is not a space (i.e. that array[0] actually points to the start of the duplicated buffer), the free() call may get a bit messy. Alternatively, you could strdup() each word that goes into the array and then free the individual elements when you're done.
Upvotes: 2