deralbert

Reputation: 1053

Deploy artifacts to physical machine with Azure Pipelines

I have a project that is built with an Azure pipeline; the artifacts are published both to a file share and to Azure DevOps. A self-hosted agent is currently used for the pipeline that builds my project. Now I want to deploy my artifacts to a physical computer in my company's network. The artifacts are installation files that support a "silent installation" mode, so I can install everything by executing the corresponding files with PowerShell.
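
For reference, the kind of silent installation call I mean looks roughly like this (the installer names and the /S switch are placeholders; the actual switch depends on the installer technology):

# Hypothetical example: run an installer silently and wait for it to finish.
# 'ServiceSetup.exe' and '/S' are placeholders; use the switch your installer documents.
Start-Process -FilePath '.\ServiceSetup.exe' -ArgumentList '/S' -Wait

# MSI packages are usually installed silently through msiexec instead:
Start-Process -FilePath 'msiexec.exe' -ArgumentList '/i MainApp.msi /qn /norestart' -Wait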

My question is: what should I do to achieve this? My idea is to simply create a release pipeline (I use Microsoft's classic UI editor for pipelines) and add the necessary tasks. In the end it would look something like this:

  1. Copy artifacts from the computer on which the product was built (A) to the computer on which the product is to be installed (B).
  2. Perform the installation of services that are required for the main application.
  3. Perform the installation of the main application.

Is this the right course of action, or should I choose something else?

One more question: it is not entirely clear to me how I should divide the deployment stages, because step 1 is still done by the agent on computer A, while steps 2 and 3 would have to be done by another agent. Hence, an agent would also have to be present on computer B (in total I would need two agents: one for the build pipeline on computer A and one on the target computer B). Is that correct?

Upvotes: 1

Views: 3886

Answers (2)

Leo Liu

Reputation: 76996

I think you are very close to the answer. I am not sure whether the advice I provide is the best, because it is partly a matter of taste, but you can check my suggestions below.

We could achieve this with just the one agent on computer A.

In this case, we do not need to create a new agent on computer B, and there is also no need to divide the deployment into separate stages.
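
For illustration only, here is a rough sketch of what a script run by the agent on computer A could look like, assuming PowerShell Remoting (WinRM) is enabled on computer B; the host name, paths and installer names are placeholders:

# Runs on the agent machine (computer A), e.g. from a PowerShell task in the release pipeline.
# 'COMPUTER-B', 'C:\Deploy' and the installer names are placeholders; WinRM must be enabled on B.
$session = New-PSSession -ComputerName 'COMPUTER-B'

# Make sure the target folder exists on B.
Invoke-Command -Session $session -ScriptBlock {
    New-Item -ItemType Directory -Path 'C:\Deploy' -Force | Out-Null
}

# 1. Copy the build artifacts from A to B.
#    In a classic release pipeline, $env:SYSTEM_ARTIFACTSDIRECTORY points at the downloaded artifacts.
Copy-Item -Path "$env:SYSTEM_ARTIFACTSDIRECTORY\drop\*" -Destination 'C:\Deploy' -ToSession $session -Recurse -Force

# 2./3. Run the silent installers on B: service prerequisites first, then the main application.
Invoke-Command -Session $session -ScriptBlock {
    Start-Process -FilePath 'C:\Deploy\ServiceSetup.exe' -ArgumentList '/S' -Wait
    Start-Process -FilePath 'C:\Deploy\MainAppSetup.exe' -ArgumentList '/S' -Wait
}

Remove-PSSession $session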

Upvotes: 1

Paul Stoner

Reputation: 1522

We have a web application with a similar requirement: we need to include location-specific files when we deploy the application to a given location.

I'm not sure this is the best route, but we incorporated Ansible into our deployment pipeline. Our Ansible playbook uses the hosts (inventory) file, which contains variables for the location and the jar version to be deployed.

Again, our situation is different, but the same approach should work.

For instance, you may be able to do something like this...

Your Ansible inventory (hosts) file may look like this

[servers]
xxx.xxx.xxx.xxx

[all:vars]
app_name=mfgweb
env=qa
app_version=2020.03.4

# If your artifact is a jar file
jar_group=com.somegroup
# note: no extension on the jar name
jar_name=some_jar
jar_version=2019.11.3

Your playbook may look something like this

---
- hosts: localhost
  connection: local

  tasks:
  - name: Find old working directories
    find:
        paths: /tmp
        patterns: 'ansible.*'
        file_type: directory
        age: 2d
    register: tmp_dirs

  - name: Cleanup old working directories
    file:
        path: "{{ item }}"
        state: absent
    loop: "{{ tmp_dirs.files | map(attribute='path') | list }}"

  - name: Create working directory
    tempfile:
        state: directory
        suffix: work
    register: workdir

  - name: Download artifact
    maven_artifact:
        group_id: "{{ jar_group }}"
        artifact_id: "{{ jar_name }}"
        version: "{{ jar_version }}"
        repository_url: "{{ artifacts_url }}"
        username: "{{ artifacts_user }}"
        password: "{{ artifacts_password }}"
        verify_checksum: never
        dest: "{{ workdir.path }}/{{ app_name }}/"

This is just a sample, but you would be able to install the required software in your playbook using yum and/or pip, as sketched below.
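
For example, hypothetical extra tasks appended to the playbook above might look like this (the package names are placeholders):

  # Hypothetical tasks: the package names are placeholders.
  - name: Install required system packages
    yum:
        name: some-required-package
        state: present

  - name: Install required Python packages
    pip:
        name: some-required-library
        state: present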

Upvotes: 2
