Imagine we have Jenkins or another CI/CD tool that executes PowerShell on an agent machine (Linux in my case). The tool writes its (standard) output to the build log.
PowerShell Core is installed on the target machine and registered as an SSH subsystem in /etc/ssh/sshd_config with a line like this (a typical registration; the pwsh path may differ per distribution):
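```
# PowerShell as an SSH subsystem, for remoting over SSH
Subsystem powershell /usr/bin/pwsh -sshs -NoLogo
```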
If something goes wrong with the invoked command, it can be frustrating to see an error in the build log like this:
NotSpecified: (:String) [], RemoteException
script returned exit code 1
Even using -ErrorAction Stop (to convert error output to an exception) doesn't help, because the output of the invoked command is not a PowerShell error output.
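One common workaround is to check the native exit code yourself and throw explicitly; a minimal sketch, where the build command is just a placeholder:

```powershell
# Run the native command; its stderr does not become a PowerShell error,
# so inspect the exit code and fail the session explicitly.
dotnet build ./src/MySolution.sln   # placeholder command
if ($LASTEXITCODE -ne 0) {
    throw "Command failed with exit code $LASTEXITCODE"
}
```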
You probably want to have one PAT (Personal Access Token) for Azure DevOps in Jenkins Credentials and use it not only for getting source code from Git, but also for getting NuGet packages during the build, right?
So, you need the withCredentials construct with the environment variable VSS_NUGET_EXTERNAL_FEED_ENDPOINTS, used roughly like this (a sketch; the credential id, organization and feed names are placeholders):
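```groovy
// Assumes a 'Secret text' credential with id 'azure-devops-pat'.
withCredentials([string(credentialsId: 'azure-devops-pat', variable: 'PAT')]) {
    withEnv(['VSS_NUGET_EXTERNAL_FEED_ENDPOINTS={"endpointCredentials":[' +
             '{"endpoint":"https://pkgs.dev.azure.com/myorg/_packaging/myfeed/nuget/v3/index.json",' +
             '"username":"build","password":"' + env.PAT + '"}]}']) {
        // The Azure Artifacts credential provider reads the JSON above.
        sh 'dotnet restore'
    }
}
```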
In a declarative pipeline, unfortunately, we can't combine 'when', 'input' and 'lock' in a reasonable way (check for the branch, then ask for confirmation, then lock for deploy) using fewer levels of nesting.
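The nesting it forces looks roughly like this (a sketch; the stage names, lock resource and deploy command are placeholders):

```groovy
pipeline {
    agent any
    stages {
        stage('Production') {
            when { branch 'master' }                        // level 1: check the branch
            stages {
                stage('Confirm') {
                    input { message 'Deploy to production?' }  // level 2: confirmation
                    steps { echo 'Confirmed' }
                }
                stage('Deploy') {
                    options { lock('production') }          // level 3: lock for deploy
                    steps { sh './deploy.sh' }
                }
            }
        }
    }
}
```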
What is the ‘Checkout’ stage for?
You may need to build in a clean workspace, but keep it for investigation after the build. So a post-build cleanup is not applicable, and you have to clean up before checkout, as in the sketch below.
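A minimal sketch of such a stage, assuming the Workspace Cleanup plugin provides the cleanWs step:

```groovy
stage('Checkout') {
    steps {
        cleanWs()        // wipe out whatever the previous build left behind
        checkout scm     // then fetch fresh sources
    }
}
```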
Why use definitions?
You can parameterize things with def … = '…' so that values aren't spread all over the file. I don't insist it has to be a URL or a hostname, for example:
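```groovy
// Hypothetical definitions at the top of a Jenkinsfile:
def deployHost   = 'deploy-01.example.com'
def solutionFile = 'src/MySolution.sln'
```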
Perhaps your project has a database-first approach, so you need an instance of your database for debugging and testing your code.
Nowadays teams can be spread over different regions, with no ability to share a single database instance (which would also be a bad practice, because it introduces a dependency and accelerates drift).
If your production database doesn't evolve much in its schema and has a moderate size, it is a good candidate to be handled as an artefact.
Database artefacts are just right for:
Reproducing bugs by deploying a certain version of a database.
Testing migrations by redeploying a version multiple times.
A good case for making an artefact from a production database is when its schema is managed by some centralized software (not the one you develop yourself) and you just have to adapt to database changes.
You can use SQL Server Data Tools (SSDT) (requires Visual Studio) instead of SqlPackage for manual extraction of a database schema.
Avoid DOMAIN\... format for database users
The thing that can surprise a developer when deploying a database in their local environment is a SqlPackage error like:
Error SQL72014: .Net SqlClient Data Provider: Msg 15401, Level 16, State 1, Line 1 Windows NT user or group 'DOMAIN\ReportJobs' not found. Check the name again.
The command CREATE USER [DOMAIN\...] WITHOUT LOGIN; for some reason (even on the Linux version of SQL Server) expects a Windows NT user or group…
So don't hesitate to fix the production database with commands like:
ALTER USER [DOMAIN\ReportJobs] WITH NAME = [ReportJobs]
Create a feed and a PAT, and authorize
The database artefact is going to be a Universal Package, so you need a feed in your project (create it in Azure DevOps Artifacts).
You need to create an Azure DevOps Personal Access Token (PAT) for accessing the feed; the Packaging (Read & write) scope is enough. Copy the token value from DevOps, then paste it into the CLI command:
vsts login --token token
How to create an artefact from a production database?
Prepare
A local folder named artifacts/artifact-name/ in the root of your project is going to be used, so artifacts/ should be in .gitignore.
Scripts
./tools/make-db-artifact.ps1
The tool extracts the schema of the database and exports the data of selected tables from a production database to files in artifacts/artifact-name/.
Replace artifact-name, server-address, database-name and the table names with actual values. A sketch of what the script may look like:
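```powershell
# A sketch of ./tools/make-db-artifact.ps1; names are placeholders,
# and the authentication flags depend on your setup (-T vs. -U/-P).
$ErrorActionPreference = 'Stop'

$artifactDir = "$PSScriptRoot/../artifacts/artifact-name"
New-Item -ItemType Directory -Force -Path $artifactDir | Out-Null

# Extract the database schema into a .dacpac
SqlPackage /Action:Extract `
    /SourceServerName:server-address `
    /SourceDatabaseName:database-name `
    /TargetFile:"$artifactDir/database-name.dacpac"

# Export the data of selected tables in bcp native format
$tables = @('dbo.Customers', 'dbo.Orders')   # table names are placeholders
foreach ($table in $tables) {
    bcp "database-name.$table" out "$artifactDir/$table.bcp" -S server-address -n -T
    if ($LASTEXITCODE -ne 0) { throw "bcp failed for $table" }
}
```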
If you opt for manual extraction with SSDT instead:
Right-click the database in SQL Server Object Explorer.
Choose “schema only” or “schema and data” extraction and uncheck user login mappings.
Wait until it’s done.
I recommend extracting “schema only” and using the bcp tool for the tables. If you opt for “schema and data” and have foreign keys in your database, you probably have no choice but to export all the data, which can be huge.
You can upload a package version with a command like this (the instance, feed, name and version are placeholders; the exact options may differ between VSTS CLI versions):
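```powershell
vsts package universal publish `
    --instance https://myorg.visualstudio.com `
    --feed my-feed `
    --name artifact-name `
    --version 0.0.1 `
    --path ./artifacts/artifact-name `
    --description "Production database snapshot"
```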
I have added the ./tools/sql/db-artifact-pre-deploy.sql and ./tools/sql/db-artifact-post-deploy.sql scripts to the example; they are usually needed to (a sketch of the pre-deploy script follows the list):
Check if the database exists, drop and recreate it (pre-deploy).
Create a login and a corresponding user for the software you develop (post-deploy).
Update sequences and other things according to the table data (post-deploy).
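```sql
-- A sketch of db-artifact-pre-deploy.sql; the database name is a placeholder.
IF DB_ID(N'MyDatabase') IS NOT NULL
BEGIN
    ALTER DATABASE [MyDatabase] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE [MyDatabase];
END;
CREATE DATABASE [MyDatabase];
```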
Rollout step-by-step
On a developer's machine there should be, again: Docker, the VSTS CLI, sqlcmd, bcp and SqlPackage. He or she has to be authorized to Azure DevOps, with the same login command as before:
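```powershell
# The token value is a placeholder, as before.
vsts login --token <your-PAT>
```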
A few days ago I visited Global DevOps Bootcamp. During registration through the meetup platform I was quite inattentive, so I didn't read the event program.
When I arrived at the event and found people watching some introductory video, I sat down with them and then started to notice that everyone had a backpack or a laptop, while I was the only person with a small bag containing my glasses, wallet and smartphone. That was becoming suspicious…
Finally, it turned out to be a hackathon! I joined team-04 as an assistant.
Working from a smartphone is, you know, also fun. And it helps: I was googling a lot. I also found out that my experience with Azure and Azure DevOps wasn't so bad; I could give relevant advice.
We weren't the best team, but we gained new experience! In the end, we got stuck on finding malicious code in dependencies, in challenge 6 of 8.
Conclusion
Participating in a hackathon doesn't require extreme mental effort from you. It's just fun for those who want to learn something in a game-like style.
Now I know what Global DevOps Bootcamp is, and I'll visit it next year, but that time with a laptop! 😉
I have OpenSSH installed on my Windows Server 2012 R2 using Desired State Configuration and Chocolatey (more about that), with a configuration along these lines (a sketch; the node name and install directory are assumptions):
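```powershell
# A sketch using the cChoco resource module; not necessarily the exact original.
Configuration SshServer {
    Import-DscResource -ModuleName cChoco

    Node 'localhost' {
        cChocoInstaller InstallChoco {
            InstallDir = 'C:\choco'   # install directory is an assumption
        }
        cChocoPackageInstaller OpenSsh {
            Name      = 'openssh'
            Ensure    = 'Present'
            # Enables the server and key-based authentication features:
            Params    = '"/SSHServerFeature /KeyBasedAuthenticationFeature"'
            DependsOn = '[cChocoInstaller]InstallChoco'
        }
    }
}
```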
Delete the keys from C:\ProgramData\ssh to avoid any possible issues with the ACLs I had changed. Of course, if you already have clients connecting to your server, don't do this; read that article instead.
Run: C:\pstools\PsExec64.exe -s ssh-keygen -A
Run: C:\pstools\PsExec64.exe -s sshd.exe -d
After a client logs off from SSH, sshd started with -d exits and writes to the console something like:
Received disconnect from 10.20.21.28 port 38572:11: disconnected by user
Disconnected from 10.20.21.28 port 38572
debug1: do_cleanup
debug1: do_cleanup
sshd.exe exited on CRM1 with error code 255.
It’s ok. When it’s started as a service, it works fine.
Now, imagine you found some DSC resource module, for instance cChoco (in the PowerShell Gallery, on GitHub), and you need to know what properties its resources have. The Get-DscResource command can be super helpful, applied like this:
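```powershell
# List the module's resources, then show the full syntax of one of them
Get-DscResource -Module cChoco
Get-DscResource -Name cChocoPackageInstaller -Module cChoco -Syntax
```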
It says that the parameters Name, AutoUpgrade, chocoParams, DependsOn, Ensure, Params, PsDscRunAsCredential, Source and Version are available to you if you build a configuration like this (a hypothetical fragment; the package name and version are made up):
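```powershell
cChocoPackageInstaller InstallGit {
    Name        = 'git'
    Ensure      = 'Present'
    Version     = '2.22.0'
    AutoUpgrade = $false
}
```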
By the way, if you install the OpenSSH server, don't forget about Params with a value like '"/SSHServerFeature /KeyBasedAuthenticationFeature"', otherwise you'll get into the same story as I did.
I did everything as described and made some discoveries:
Webhooks don't work (Jenkins ver. 2.164.3 and Bitbucket Cloud). Instead, I had to rely on the periodic ‘Scan Multibranch Pipeline Triggers’.
Jenkins builds for PRs can be made using the wrong commits.
The second one came as a real surprise!
I had two branches, jenkins-test-3 and jenkins-test-4, both forked from master.
Before the pull request from jenkins-test-4 to jenkins-test-3, there were these commits:
c013c79... — the last commit in jenkins-test-3.
dc05341... — the last commit in jenkins-test-4, made in master after jenkins-test-3 was forked but before jenkins-test-4 was forked.
The pull request from jenkins-test-4 to jenkins-test-3 created its own commit 95a42f5.... In the Bitbucket UI we see that the build passed in CI, but here is what's in the Jenkins log:
Branch indexing
Loading trusted files from base branch jenkins-test-3 at c013c792c09f81b7a178c0fa77343e65e3c02871 rather than 95a42f534118ee8584ab34ef0dd6195f6078e6ee
....