n.nasa

Reputation: 551

Escaping JSON string inside Jenkinsfile shell

I am trying to escape a JSON string inside a Jenkinsfile, but it is a little more complicated than the examples I have checked online.

sshagent(credentials: ['keypair']) {
            sh """
                ssh ansible@ansible-server \
                '
# vars
src_env="${params.source_env}"
dest_env="${params.destination_env}"
src_prefix="${params.source_s3_prefix}"
dest_prefix="${params.destination_s3_prefix}"
region="${region}"
ch="${ch}"
aws_account="${aws_account}"

# Run ECS task
echo "---- Run ECS task ----"
task_id=$(aws ecs run-task \
  --cluster cluster-\${ch}-\${dest_env} \
  --task-definition \$td_id \
  --region \${region} \
  --overrides {\\\"containerOverrides\\\": [{\\\"name\\\": \\\"s3-sync\\\", \\\"environment\\\": [{\\\"name\\\": \\\"SRC_ENV\\\", \\\"value\\\": \\\"$${src_env}\\\"}, {\\\"name\\\": \\\"DEST_ENV\\\", \\\"value\\\": \\\"$${dest_env}\\\"}, {\\\"name\\\": \\\"SRC_PREFIX\\\", \\\"value\\\": \\\"$${src_prefix}\\\"}, {\\\"name\\\": \\\"DEST_PREFIX\\\", \\\"value\\\": \\\"$${dest_prefix}\\\"}]}], \\\"taskRoleArn\\\":  \\\"arn:aws:iam::$${aws_account}:role/$${ch}-$${dest_env}\\\"} \
  --query 'tasks[0].taskArn' \
  --output text | cut -d'/' -f 2)

                '
            """
}

So, the problem is to escape the following JSON string:

--overrides {\\\"containerOverrides\\\": [{\\\"name\\\": \\\"s3-sync\\\", \\\"environment\\\": [{\\\"name\\\": \\\"SRC_ENV\\\", \\\"value\\\": \\\"$${src_env}\\\"}, {\\\"name\\\": \\\"DEST_ENV\\\", \\\"value\\\": \\\"$${dest_env}\\\"}, {\\\"name\\\": \\\"SRC_PREFIX\\\", \\\"value\\\": \\\"$${src_prefix}\\\"}, {\\\"name\\\": \\\"DEST_PREFIX\\\", \\\"value\\\": \\\"$${dest_prefix}\\\"}]}], \\\"taskRoleArn\\\":  \\\"arn:aws:iam::$${aws_account}:role/$${ch}-$${dest_env}\\\"} \

Any help is appreciated. I have tried lots of different ways, but nothing seems to work.

Upvotes: 0

Views: 4906

Answers (2)

n.nasa

Reputation: 551

Based on Vasiliki Siakka's answer, I built the JSON parameter beforehand and escaped the double quotes in the JSON string so that they are not stripped by the ssh command. The solution that worked for me is as follows:

sshagent(credentials: ['keypair']) {

  // ch is defined somewhere in the pipeline
  // aws_account is also defined somewhere in the pipeline

  def overrides = [
    containerOverrides: [
      [
        environment: [
          [
            name: "SRC_ENV",
            value: params.source_env
          ],
          [
            name: "DEST_ENV",
            value: params.destination_env
          ],
          [
            name: "SRC_PREFIX",
            value: params.source_s3_prefix
          ],
          [
            name: "DEST_PREFIX",
            value: params.destination_s3_prefix
          ]
        ],
        name: "s3-sync"
      ]
    ],
    taskRoleArn: "arn:aws:iam::${aws_account}:role/${ch}-${params.destination_env}"
  ]

  // Serialize the overrides map to JSON, then escape every double quote so the
  // quoting survives the extra layer of shell interpretation inside the ssh command
  def parsed_overrides = groovy.json.JsonOutput.toJson(overrides).replace("\"", "\\\"")

  sh """
      ssh ansible@ansible-server \
      '
# Run ECS task
echo "---- Run ECS task ----"
task_id=\$(aws ecs run-task \
  --cluster cluster-${ch}-${params.destination_env} \
  --task-definition \$td_id \
  --region ${region} \
  --overrides "${parsed_overrides}" \
  --query 'tasks[0].taskArn' \
  --output text | cut -d'/' -f 2)
      '
    """
}
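
For illustration, here is a minimal sketch of what the escaping step produces (the values are made up):

def sample = [name: "SRC_ENV", value: "dev"]
def json = groovy.json.JsonOutput.toJson(sample)
// json    == {"name":"SRC_ENV","value":"dev"}
def escaped = json.replace("\"", "\\\"")
// escaped == {\"name\":\"SRC_ENV\",\"value\":\"dev\"}

When the remote shell parses the double-quoted --overrides argument, it strips the backslashes again, so the AWS CLI should end up receiving the original, valid JSON.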

Upvotes: 6

Vasiliki Siakka

Reputation: 1283

I suggest that instead of trying to evaluate all the variables after you ssh, you compute the ssh command as part of the shell script evaluation.

In Groovy, when you use double quotes (single or triple), anything that follows a $ will get evaluated/replaced. So, you can just add a $ in front of every variable that's used in your shell script and the variable will get evaluated before the shell script even runs. You also need to escape any $ that you don't want evaluated as part of the string processing.
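
For example, here is a minimal sketch of the difference (the variable name and command are only for illustration):

def region = "eu-west-1"
sh """
  echo region is ${region}    # interpolated by Groovy before the shell runs -> eu-west-1
  echo host is \$HOSTNAME     # the \$ is escaped, so the shell expands HOSTNAME at run time
"""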

For the JSON blob specifically, I suggest creating a map outside of the shell script that contains the overrides, and then converting that variable to a JSON string. That way you don't have to worry about using the correct number of \ (although you can use single quotes inside a triple-quoted string without escaping them) and your pipeline file is more readable.

Here's how I'd write it:

sshagent(credentials: ['keypair']) {

  // ch is defined somewhere in the pipeline
  // aws_account is also defined somewhere in the pipeline

  def overrides = [
    containerOverrides: [
      [
        environment: [
          [
            name: "SRC_ENV",
            value: params.source_env
          ],
          [
            name: "DEST_ENV",
            value: params.destination_env
          ],
          [
            name: "SRC_PREFIX",
            value: params.source_s3_prefix
          ],
          [
            name: "DEST_PREFIX",
            value: params.destination_s3_prefix
          ]
        ],
        name: "s3-sync"
      ]
    ],
    taskRoleArn: "arn:aws:iam::${aws_account}:role/${ch}-${params.destination_env}"
  ]


  sh """
    ssh ansible@ansible-server \
    '
# Run ECS task
echo "---- Run ECS task ----"
task_id=\$(aws ecs run-task \
  --cluster cluster-${ch}-${params.destination_env} \
  --task-definition \$td_id \
  --region ${region} \
  --overrides ${groovy.json.JsonOutput.toJson(overrides)} \
  --query 'tasks[0].taskArn' \
  --output text | cut -d'/' -f 2)
    '
  """
}

Note that I didn't escape any of the variables that I want the string processing to replace; the only thing I escaped is

task_id=\$(aws ecs run-task \

so that the aws ecs run-task ... command runs when the ssh script is executed (and not when the string is evaluated).

Upvotes: 2
