Reputation: 467
I've been creating some basic system health checks, and one of the checks includes a yum repo health status that uses one of Chef's tools called 'knife'. However, when I try to awk a column, I get
can't read "4": no such variable.
Here is what I am currently using:
read -s -p "enter password: " pass
/usr/bin/expect << EOF
spawn knife ssh -m host1 "sudo bash -c \"yum repolist -v| grep Repo-name| awk '{print $4}'\" "
expect {
-nocase password: { send "$pass\r" }; expect eof }
}
EOF
I've tried other variations as well, such as replacing the awk single quotes with double curly braces, escaping the variable, and even setting the variable to the command, and keep getting the same negative results:
awk {{print $4}}
awk '{print \$4}'
awk '\{print \$4\}'
awk {\{print \$4\}}
Does anyone know how I can pass this awk column selector variable within a spawned knife ssh command that sends the variable to the ssh host?
Upvotes: 1
Views: 2481
Reputation: 437288
Donal Fellows' helpful answer contains great analysis and advice.
To complement it with an explanation of why replacing $4 with \\\$4 worked:
The goal is ultimately for awk to see $4 as-is, so we must escape for all intermediate layers in which $ has special meaning.
I'll use a simplified command in the examples below.
Let's start by temporarily eliminating the shell layer, by quoting the opening here-doc delimiter as 'EOF', which makes the content behave like a single-quoted string; i.e., the shell treats the content as a literal, without applying expansions:
expect <<'EOF' # EOF is quoted -> literal here-doc
spawn bash -c "awk '{ print \$1 }' <<<'hi there'"
expect eof
EOF
Output:
spawn bash -c awk '{ print $1 }' <<<'hi there'
hi
Note that $1 had to be escaped as \$1 to prevent expect from interpreting it as an expect variable reference.
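For contrast, here is what the unescaped form does; expect tries to substitute a Tcl variable named 1, which fails in the same way as the $4 in the question (a minimal sketch of the failure mode):
expect <<'EOF'
# unescaped $1: expect/Tcl attempts variable substitution on "1",
# which fails with an error of the same form as the question's:
#   can't read "1": no such variable
spawn bash -c "awk '{ print $1 }' <<<'hi there'"
expect eof
EOF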
Given that the here-doc in your question uses an unquoted opening here-doc delimiter (EOF), the shell treats the content as if it were a double-quoted string; i.e., shell expansions are applied.
Because the shell now expands the script first, we must add an extra layer of escaping for it, by prepending two extra \ to \$1:
expect <<EOF # EOF is unquoted -> shell expansions happen
spawn bash -c "awk '{ print \\\$1 }' <<<'hi there'"
expect eof
EOF
This yields the same output as above.
Based on the rules for parsing an unquoted here-doc (which, as stated, is parsed like a double-quoted string), the shell turns \\ into a single \ and \$1 into a literal $1, combining to the literal \$1 that the expect script needs to see.
(Verify with echo "\\\$1" in the shell.)
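Running that verification might look like this (a one-line sketch; the output is shown as a comment):
echo "\\\$1"    # inside double quotes, \\ becomes \ and \$ becomes $
# prints: \$1  -- the literal text the expect script needs to see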
As you can see, the multiple layers of quoting (escaping) can get confusing. One way to avoid problems is to:
use a quoted here-doc, so that the shell doesn't interpret it in any way, letting you focus on expect's quoting needs;
pass any shell variable values via command-line arguments, and reference them from inside the expect script as expressions (either directly or by assigning them to an expect variable first):
expect -f - 'hi there' <<'EOF'
set txt [lindex $argv 0]
spawn bash -c "awk '{ print \$1 }' <<<'$txt'"
expect eof
EOF
Text hi there is passed as the 1st (and only) command-line argument, which can be referenced as [lindex $argv 0] in the script.
(-f - simply tells expect explicitly to read its script from stdin, which is necessary here to distinguish the script from the arguments.)
set txt ... creates the expect variable $txt, which can then be used unquoted or as part of double-quoted strings.
To create literal strings in expect, use {...} (the equivalent of the shell's '...').
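Putting these pieces together for the command in your question might look roughly like the sketch below. It is only an illustration: it borrows the simplified, brace-quoted knife command from the other answer, keeps the host1 and Repo-name placeholders from the question, and passes the password as a command-line argument instead of expanding it inside the here-doc:
read -s -p "enter password: " pass
expect -f - "$pass" <<'EOF'
# the password arrives as the 1st command-line argument, not via shell expansion
set pass [lindex $argv 0]
# braces keep the remote command literal, so awk's $4 needs no escaping here
spawn knife ssh -m host1 {sudo yum repolist -v | grep Repo-name | awk '{print $4}'}
expect {
    -nocase "password:" { send "$pass\r"; exp_continue }
    eof
}
EOF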
Upvotes: 1
Reputation: 137567
This line:
spawn knife ssh -m host1 "sudo bash -c \"yum repolist -v|grep Repo-name|awk '{print $4}'\""
has many layers of quoting (Tcl/Expect, ssh, bash, awk), and it's quoting of different types. Such things are usually pretty nasty and can require using rather a lot of backslashes to persuade values to go through the outer layers unmolested. In particular, Tcl and the shell both want to get their hands on variables whose uses start with $ and continue with alphanumerics. (Backslashes that go in deep need to be themselves quoted with more backslashes, making the code hard to read and hard to understand precisely.) Getting $4 all the way through to awk here takes three backslashes:
spawn knife ssh -m host1 "sudo bash -c \"yum repolist -v|grep Repo-name|awk '{print \\\$4}'\""
However, there's one big advantage available to us: we can put much of the code into braces at the outer level as we are not actually substituting anything from Tcl in there.
spawn knife ssh -m host1 {sudo bash -c "yum repolist -v|grep Repo-name|awk '{print \$4}'"}
The thing inside the braces is conventional shell code, not Tcl. And in fact we can probably simplify further, as neither grep nor awk needs to be elevated:
spawn knife ssh -m host1 {sudo bash -c "yum repolist -v"|grep Repo-name|awk '{print $4}'}
Depending on the sudo configuration, you might even be able to do this (which I'd actually rather people did on systems I controlled anyway, rather than giving out general access to root shells):
spawn knife ssh -m host1 {sudo yum repolist -v|grep Repo-name|awk '{print $4}'}
And if my awk is good enough, you can get rid of the grep like this:
spawn knife ssh -m host1 {sudo yum repolist -v|awk '/Repo-name/{print $4}'}
This is starting to look more manageable. However, if you want to substitute a Tcl variable for Repo-name, you need a little more work that reintroduces a backslash, but it is now all much tamer than before, as there are fewer layers of complexity to create headaches.
set repo "Repo-name"
spawn knife ssh -m host1 "sudo yum repolist -v|awk '/$repo/{print \$4}'"
In practice, I'd be more likely to get rid of the awk entirely and do that part in Tcl code, as well as setting up key-based direct access to the root account so the sudo can be avoided, but that's getting rather beyond the scope of your original question.
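As a rough, hypothetical sketch of that last idea (it assumes $pass has already been set, e.g. from a command-line argument as in the other answer, and redoes the grep/awk part with Tcl's string match, regexp and lindex):
set repo "Repo-name"
spawn knife ssh -m host1 "sudo yum repolist -v"
expect -nocase "password:" { send "$pass\r" }
match_max 100000   ;# allow the whole repolist output to be buffered
expect eof
# the grep/awk part in Tcl: keep lines mentioning the repo, print their 4th field
foreach line [split $expect_out(buffer) "\n"] {
    if {[string match "*$repo*" $line]} {
        puts [lindex [regexp -all -inline {\S+} $line] 3]
    }
}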
Upvotes: 3