How to use Azure Artifacts with dotnet in a Jenkins pipeline?

You probably want to have a single PAT (Personal Access Token) for Azure DevOps in Jenkins Credentials and use it not only for getting source code from Git, but also for restoring NuGet packages during the build, right?

So, you need the withCredentials construct together with the VSS_NUGET_EXTERNAL_FEED_ENDPOINTS environment variable, used like this:

steps {
  script {
    withCredentials(bindings: [usernamePassword(credentialsId: '...', passwordVariable: 'PAT', usernameVariable: 'PAT_USERNAME')]) {
      env.VSS_NUGET_EXTERNAL_FEED_ENDPOINTS = "{\"endpointCredentials\": [{\"endpoint\":\"https://YourOrgName.pkgs.visualstudio.com/_packaging/YourFeedName/nuget/v3/index.json\", \"username\":\"build\", \"password\":\"${PAT}\"}]}"
      sh "dotnet ..."
    }
  }
}

This greatly simplifies a Jenkins CI setup with Azure DevOps, because you have far fewer potential credential expirations to monitor.

Generalized declarative Jenkinsfile

My simplified Jenkinsfile template for most of my projects looks like this:

def STAGE_URL='...'
def STAGE_DEPLOY_HOSTNAME='...'
...

def PROD_URL='...'
def PROD_DEPLOY_HOSTNAME='...'
...

pipeline {
  agent {
    label '... && ... && ... && ...'
  }
  options {
    skipDefaultCheckout true
  }
  stages {
    stage('Configure') {
      steps {
        script {
          if (env.BRANCH_NAME == 'master') {
            currentBuild.displayName = "1.0.${BUILD_NUMBER}"
          }
          else {
            currentBuild.displayName = "non-master-build-${BUILD_NUMBER}"
          }
        }
      }
    }
    stage('Checkout') {
      steps {
        // deleteDir()   /* clean up our workspace before checkout */
        checkout scm
      }
    }
    stage('Build') {
      steps {
        // ...
      }
    }
    stage('Test') {
      steps {
        // ...
      }
    }
    stage('Deploy') {
      when {
        branch 'master'
      }
      stages {
        stage('Stage') {
          stages {
            stage('Service deploy') {
              options {
                lock('...-stage-deploy')
              }
              steps {
                // ...
              }
            }
          }
        }
        stage('Prod') {
          input {
            message "Deploy service to production?"
            submitter 'Administrators'
          }
          stages {
            stage('Service deploy') {
              options {
                lock('...-prod-deploy')
              }
              steps {
                // ...
              }
            }
          }
        }
        stage('Package publishing') {
          input {
            message "Publish client libraries?"
            submitter 'Administrators'
          }
          steps {
            // ...
          }
        }
      }
    }
  }
}

Why is ‘Deploy’ nested so deeply?

In a declarative pipeline, unfortunately, we can’t combine ‘when’, ‘input’ and ‘lock’ in a reasonable way (check the branch, then ask for confirmation, then lock for deployment) using fewer levels.

What is the ‘Checkout’ stage for?

You may need to build in a clean workspace but keep it for investigation after the build. So, a post-build cleanup condition is not applicable, and you have to clean up before checkout instead.

What are the definitions for?

You can parametrize things with def ...='…' definitions so values are not spread all over the file. I don’t insist they have to be a URL or a hostname.

Fast way to set up OpenSSH on Windows Server (versions below 2019)

Install:

Set-ExecutionPolicy Bypass -Scope Process -Force; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
choco install openssh
cd "C:\Program Files\OpenSSH-Win64\"
.\install-sshd.ps1
sc.exe config sshd start= auto

Change the default shell (I assume you want PowerShell 6):

New-ItemProperty -Path "HKLM:\SOFTWARE\OpenSSH" -Name DefaultShell -Value "C:\Program Files\PowerShell\6\pwsh.exe" -PropertyType String -Force

Generate keys for remote access:

  1. Log in as the proper user.
  2. Execute ssh-keygen.exe.
  3. Copy $Env:UserProfile\.ssh\id_rsa to the external machine (the SSH client).
  4. Move or add $Env:UserProfile\.ssh\id_rsa.pub to $Env:UserProfile\.ssh\authorized_keys.
  5. Fix authorized_keys permissions:
$acl = Get-Acl $Env:UserProfile\.ssh\authorized_keys
$usersid = New-Object System.Security.Principal.Ntaccount("Everyone")
$acl.PurgeAccessRules($usersid)
$acl | Set-Acl $Env:UserProfile\.ssh\authorized_keys

For localized Windows: there’s no “Everyone” group, but something in your language. Use the GUI, or see the sketch below.
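
Alternatively, here is a small sketch (my assumption, not part of the original steps) that avoids the localized group name by using the well-known SID S-1-1-0, which resolves to “Everyone” in whatever display language Windows uses:

# resolve "Everyone" by its well-known SID, independent of the display language
$everyone = (New-Object System.Security.Principal.SecurityIdentifier("S-1-1-0")).Translate([System.Security.Principal.NTAccount])
$acl = Get-Acl $Env:UserProfile\.ssh\authorized_keys
$acl.PurgeAccessRules($everyone)
$acl | Set-Acl $Env:UserProfile\.ssh\authorized_keys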

Debug (if things go wrong):

((New-Object System.Net.WebClient).DownloadFile('https://download.sysinternals.com/files/PSTools.zip', 'C:\PSTools.zip'))
Expand-Archive -LiteralPath 'C:\PSTools.zip' -DestinationPath 'C:\pstools'
sc.exe stop sshd
C:\pstools\PsExec64.exe -s sshd.exe -d

Making a SQL Server database an Azure DevOps artifact

Why do it?

Perhaps your project has a database-first approach, so you need an instance of your database for debugging and testing the code.

Nowadays teams can be spread across different regions with no ability to share a single database instance (which would also be bad practice, because it introduces a dependency and accelerates drift).

If your production database doesn’t evolve much in its schema and has a moderate size, it is a good candidate to be handled as an artefact.

Database artefacts are just right for:

  • Reproducing bugs by deploying a certain version of the database.
  • Testing migrations by redeploying a version multiple times.

A good case for making an artefact from a production database is when its schema is managed by some centralized software (not the one being developed) and you just have to adapt to database changes.

What do you need to minimise burden?

To keep the burden low, you need:

  1. Docker
  2. VSTS CLI
  3. sqlcmd (Mac / Linux)
  4. bcp
  5. SqlPackage

You can use SQL Server Data Tools (SSDT) (requires Visual Studio) instead of SqlPackage for manual extraction of a database schema.

Avoid DOMAIN\... format for database users

The thing that can surprise a developer when deploying a database in their local environment is a SqlPackage error like:

Error SQL72014: .Net SqlClient Data Provider: Msg 15401, Level 16, State 1, Line 1 Windows NT user or group 'DOMAIN\ReportJobs' not found. Check the name again.

The command CREATE USER [DOMAIN\...] WITHOUT LOGIN; for some reason (even on the Linux version of SQL Server) expects a Windows NT user or group…

So, don’t hesitate to fix a production database with commands like:

ALTER USER [DOMAIN\ReportJobs] WITH NAME = [ReportJobs]
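
To see which principals would cause this error, a quick check (a sketch; replace the connection placeholders with your actual values) run against the production database lists Windows users and groups:

# list Windows users ('U') and Windows groups ('G') defined in the database
sqlcmd -S server-address -d database-name -U login -P password -Q "SELECT name, type_desc FROM sys.database_principals WHERE type IN ('U', 'G')"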

Create feed, PAT and authorize

The database artefact is going to be a Universal Package. You need a feed in your project (create it in Azure DevOps Artifacts).

You need to create an Azure DevOps Personal Access Token (PAT) for accessing the feed. The Packaging (Read & write) scope is enough. Copy the token value from DevOps, then paste it into the CLI command:

vsts login --token token

How to create an artefact from a production database?

Prepare

A local folder named artifacts/artifact-name/ in the root of your project is going to be used, so artifacts/ should be in .gitignore.

Scripts

./tools/make-db-artifact.ps1

The tool extracts the database schema and exports the data of selected tables from the production database to files in artifacts/artifact-name/.

Replace artifact-name, server-address, database-name and table names with actual values:

param (
    [Parameter(Mandatory=$true)]
    [String]
    $login,

    [Parameter(Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [SecureString]
    $password
)

$ErrorActionPreference = "Stop"

# convert a secure password to plain text
$BSTR = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($password)
$unsecurePassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($BSTR)

#
# PREPARE DIRECTORY
#

$path = "./artifacts/artifact-name"

if (!(Test-Path -Path $path)) {
    mkdir $path
}
else {
    Remove-Item $path\*
}

#
# EXTRACT SCHEMA
#

SqlPackage `
    /a:Extract `
    /ssn:"server-address" `
    /sdn:"database-name" `
    /su:"$login" `
    /sp:"$unsecurePassword" `
    /tf:"$path/database-name-schema.dacpac" `
    /p:ExtractApplicationScopedObjectsOnly=true `
    /p:ExtractReferencedServerScopedElements=false `
    /p:IgnoreUserLoginMappings=true `
    /p:ExtractAllTableData=false

#
# EXPORT DATA
#

function BulkCopy([string]$tableName)
{
    bcp [database-name].[dbo].[$tableName] out $path/$tableName.bcp -N -S server-address -U $login -P $unsecurePassword
}

# tables (include only the ones you need)
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
...

./tools/publish-db-artifact.ps1

Replace org-name, feed-name, artifact-name with actual values:

param (
    [Parameter(Mandatory=$true, ValueFromRemainingArguments=$true)]
    [ValidatePattern('^\d{1}\.\d{1}\.\d{1,5}$')]
    [String]
    $version
)

vsts package universal publish -i https://dev.azure.com/org-name/ --feed feed-name --name artifact-name --version $version --path ./artifacts/artifact-name

Run the “make” tool and then the “publish” tool provided above.
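
For example (the login and the version here are just placeholders; the password parameter is a SecureString, so PowerShell will prompt for it):

# prompts for the production database password (SecureString)
./tools/make-db-artifact.ps1 -login extract-login
# publishes ./artifacts/artifact-name as version 1.0.0
./tools/publish-db-artifact.ps1 -version 1.0.0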

Manual approach

An extraction can also be done with SQL Server Data Tools:

  1. Right-click the database in SQL Server Object Explorer.
  2. Use schema-only or schema-and-data extraction and uncheck user login mappings.
  3. Wait until it’s done.

I recommend extracting “schema only” and using the bcp tool for tables. If you opt for “schema and data” and your database has foreign keys, you probably have no choice but to export all the data, which can be huge.

You can upload a package version with the command:

vsts package universal publish -i https://dev.azure.com/org-name/ --feed feed-name --name artifact-name --version version --path ./artifacts/artifact-name

How to deploy a database in a development environment?

Scripts

docker-compose.yml

I assume your development environment is started with docker-compose, so there is a docker-compose.yml where you put an arbitrary sa password:

version: "3"
services:
  sqlserver:
    image: mcr.microsoft.com/mssql/server:2017-latest-ubuntu
    container_name: db
    ports:
      - 11433:1433
    environment:
      ACCEPT_EULA: 'Y'
      SA_PASSWORD: 'SomeArbitraryPassw0rd'

In this example I map the dockerized SQL Server to port 11433.
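
Before deploying anything, a quick probe (a sketch using the port and sa password from the compose file above) confirms that SQL Server in the container is accepting connections:

# should print the SQL Server version if the container is reachable
sqlcmd -S tcp:localhost,11433 -U sa -P 'SomeArbitraryPassw0rd' -Q "SELECT @@VERSION"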

./tools/get-db-artifact.ps1

To get an artefact back from Azure DevOps Artifacts into ./artifacts/artifact-name/, use a script like:

param (
    [Parameter(Mandatory=$true, ValueFromRemainingArguments=$true)]
    [ValidatePattern('^\d{1}\.\d{1}\.\d{1,5}$')]
    [String]
    $version
)

$ErrorActionPreference = "Stop"

vsts package universal download -i https://dev.azure.com/org-name/ --feed feed-name --name artifact-name --version $version --path ./artifacts/artifact-name

./tools/deploy-db-artifact.ps1

I use the sa password (from docker-compose) in the deployment script, which looks like:

$password = "SomeArbitraryPassw0rd"

$ErrorActionPreference = "Stop"

$path = "./artifacts/artifact-name"

#
# PREPARE DATABASE
#

sqlcmd -S tcp:localhost,11433 -U sa -P $password -i $PSScriptRoot/sql/db-artifact-pre-deploy.sql

#
# PUBLISH SCHEMA
#

SqlPackage /a:Publish /tcs:"Server=tcp:localhost,11433;Database=database-name;User Id=sa;Password=$password;" /sf:"$path/database-name-schema.dacpac"

#
# IMPORT DATA
#

function BulkCopy([string]$tableName)
{
    bcp dbo.$tableName in $path/$tableName.bcp -N -S tcp:localhost,11433 -d database-name -U sa -P $password -q
}

# tables
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
BulkCopy table-name
...

#
# POST-DEPLOY
#

sqlcmd -S tcp:localhost,11433 -U sa -P $password -i $PSScriptRoot/sql/db-artifact-post-deploy.sql

I have added ./tools/sql/db-artifact-pre-deploy.sql and ./tools/sql/db-artifact-post-deploy.sql scripts to the example; they are usually needed to:

  1. Check if the database exists, then drop and recreate it (pre-deploy; see the sketch after this list).
  2. Create a login and a corresponding user for the software you develop.
  3. Update sequences and other things that depend on the table data.
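
The exact contents depend on your database, but a minimal sketch of the pre-deploy part could look like this (shown inline via -Q just for illustration; the real script lives in ./tools/sql/db-artifact-pre-deploy.sql, and $password is the same sa password as in the deploy script):

# drop and recreate the database so the artifact can be deployed repeatedly
sqlcmd -S tcp:localhost,11433 -U sa -P $password -Q "
    IF DB_ID('database-name') IS NOT NULL
    BEGIN
        ALTER DATABASE [database-name] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE [database-name];
    END;
    CREATE DATABASE [database-name];"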

Rollout step-by-step

On a developer’s machine there should be, again: Docker, VSTS CLI, sqlcmd, bcp and SqlPackage. He or she has to be logged in to Azure DevOps:

vsts login --token token

To get an environment up, he or she needs to:

  • Start containers:
docker-compose up
  • Get an artefact:
./tools/get-db-artifact.ps1
  • Deploy an artefact:
./tools/deploy-db-artifact.ps1

Fun of being at Global DevOps Bootcamp

A few days ago I attended Global DevOps Bootcamp. During registration through the meetup platform I was quite inattentive, so I didn’t read the event program.

When I arrived at the event and found people watching an introductory video, I sat down with them and started to notice that everyone had a backpack or a laptop, while I was the only person with a small bag containing my glasses, wallet and smartphone. That was becoming suspicious…

It turned out to be a hackathon! I joined team-04 as an assistant.

Working from a smartphone is, you know, also fun. And it helps: I googled a lot. I also found out that my experience with Azure and Azure DevOps wasn’t so bad, and I could give relevant advice.

We weren’t the best team, but we gained new experience! In the end, we got stuck on finding malicious code in dependencies at challenge 6 of 8.

Conclusion

Participating in a hackathon doesn’t require extreme mental effort from you. It’s just fun for those who want to learn something in a game-like way.

Now I know what Global DevOps Bootcamp is, and I’ll attend next year, but that time with a laptop! 😉

Setting up OpenSSH Server on Windows Server 2012 R2

I wasn’t lucky enough this week.

I have OpenSSH installed on my Windows Server 2012 R2 using Desired State Configuration and Chocolatey (more about that) with the following configuration:

Configuration SSHServerFeature
{
    param (
        [String]$NodeName
    )
    
    Import-DscResource -Module cChoco

    Node $NodeName
    {
        cChocoInstaller installChoco
        {
            InstallDir = "c:\choco"
        }
        cChocoPackageInstaller installOpenSSH
        {
            Name                 = 'openssh'
            Ensure               = 'Present'
            DependsOn            = "[cChocoInstaller]installChoco"
            AutoUpgrade          = $True
        }
    }
}

Pretty simple, isn’t it? Too simple: after I had it installed, the SSH service wasn’t there.

I dug deeper into the DSC resource and added this property to installOpenSSH:

Params               = '"/SSHServerFeature /KeyBasedAuthenticationFeature"'

But no luck. So I installed the service with C:\Program Files\OpenSSH-Win64\install-sshd.ps1.

Now the OpenSSH Server service wasn’t starting. Its complaints were so unclear that I had to research how to debug sshd on Windows.

The article that helped me a lot is Troubleshooting Steps in the GitHub repo for Win32-OpenSSH. But a frustrating surprise was Win32-OpenSSH bug #1304, which made it impossible to just follow the troubleshooting steps, because sshd -d under a user account fails to fork an unprivileged child. Oh, gosh…

So, after all the manual fiddling, the ultimate way to get OpenSSH Server working was:

  1. Install PsExec by Mark Russinovich.
  2. Delete the keys from C:\ProgramData\ssh to avoid any possible issues with the ACLs I had changed. Of course, if you already have clients connecting to your server, don’t do this; read that article instead.
  3. Run: C:\pstools\PsExec64.exe -s ssh-keygen -A
  4. Run: C:\pstools\PsExec64.exe -s sshd.exe -d

After the client logs off from SSH, sshd started with -d exits and writes something like this to the console:

Received disconnect from 10.20.21.28 port 38572:11: disconnected by user
Disconnected from 10.20.21.28 port 38572
debug1: do_cleanup
debug1: do_cleanup
sshd.exe exited on CRM1 with error code 255.

It’s OK: when it’s started as a service, it works fine.

What I recommend to know about DSC resources

If you started to learn Microsoft’s approach to Configuration as Code and found out what Desired State Configuration is, you probably tried to scrutinize the PowerShell DSC resources code on GitHub and didn’t get it.

For those who need to know how it works and how to find out resource parameters, I recommend reading the following articles (in the order given):

  1. Understanding CONFIGURATION keyword … from PowerShell Team.
  2. Hungry for more Windows PowerShell Desired State Configuration Resources? from PowerShell Team.
  3. Anatomy of a PowerShell DSC Resource from Dr Scripto.

They helped me a lot.

Now, imagine you found some DSC resource module, for instance cChoco (in PowerShell Gallery, on GitHub), and you need to know what properties its resources have. The Get-DscResource command can be super helpful, applied like this:

(Get-DscResource cChocoPackageInstaller).Properties | Format-Table

.. which outputs something like:

Name                 PropertyType   IsMandatory Values
----                 ------------   ----------- ------
Name                 [string]              True {}
AutoUpgrade          [bool]               False {}
chocoParams          [string]             False {}
DependsOn            [string[]]           False {}
Ensure               [string]             False {Absent, Present}
Params               [string]             False {}
PsDscRunAsCredential [PSCredential]       False {}
Source               [string]             False {}
Version              [string]             False {}

It says the parameters Name, AutoUpgrade, chocoParams, DependsOn, Ensure, Params, PsDscRunAsCredential, Source and Version are available to you when you build a configuration like:

Configuration SSHServerFeature
{
    param (
        [String]$NodeName
    )
    
    Import-DscResource -Module cChoco

    Node $NodeName
    {
        cChocoInstaller installChoco
        {
            InstallDir = "c:\choco"
        }
        cChocoPackageInstaller ...
        {
            Name                 = '...'
            Ensure               = 'Present'
            DependsOn            = "[cChocoInstaller]installChoco"
            AutoUpgrade          = $True
        }
    }
}

By the way, if you install the OpenSSH server, don’t forget about Params with a value like '"/SSHServerFeature /KeyBasedAuthenticationFeature"', otherwise you’ll end up in the same story as I did.
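
Putting it together with the configuration shown earlier, the package resource block would look like this:

        cChocoPackageInstaller installOpenSSH
        {
            Name                 = 'openssh'
            Ensure               = 'Present'
            DependsOn            = "[cChocoInstaller]installChoco"
            AutoUpgrade          = $True
            Params               = '"/SSHServerFeature /KeyBasedAuthenticationFeature"'
        }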

Bitbucket and Jenkins surprises

I did everything as instructed and learned a few things:

  1. Webhooks were not working (Jenkins ver. 2.164.3 and Bitbucket Cloud). Instead, I had to rely on periodic ‘Scan Multibranch Pipeline Triggers’.
  2. Jenkins builds for PRs can be made from the wrong commits.

The second one came as a real surprise!

I have a branch jenkins-test-3 forked from master and a branch jenkins-test-4, also forked from master.

Before the pull request from jenkins-test-4 to jenkins-test-3 there were these commits:

  • c013c79... — the last commit in jenkins-test-3.
  • dc05341... — the last commit in jenkins-test-4; it was made in master after jenkins-test-3 was forked, but before jenkins-test-4 was.

The pull request from jenkins-test-4 to jenkins-test-3 got its own commit 95a42f5.... In the Bitbucket UI we see the CI build passed, but here is what’s in the Jenkins log:

Branch indexing

Loading trusted files from base branch jenkins-test-3 at c013c792c09f81b7a178c0fa77343e65e3c02871 rather than 95a42f534118ee8584ab34ef0dd6195f6078e6ee
....

And yes, it really built c013c79... for that PR.

It’s the Bitbucket Branch Source Plugin that fetches the source code from the repository and chooses which commit to build.

Azure DevOps Service Hook actions for Jenkins CI

When configuring a service hook in Azure DevOps for Jenkins CI, you have to choose an action: Trigger generic build or Trigger Git build.

Which one to choose?

Trigger generic build

This one forces you to choose a build. For a multibranch pipeline, it would be a combination of Jenkins job and branch, like <job-name>/master.

This is OK for a freestyle job with only one branch under CI.

Trigger Git build

If you are setting up CI for features, fixes and pull requests, this action is the recommended one. The Jenkins Git plugin with the ‘Discover branches’ behaviour in a job takes care of the rest.

It’s enough to create a single service hook with this action for all repositories in a project.