Friday, April 6, 2018

Transforming Unicorn files in a release pipeline using YmlTransform



When you use automated deployments to different environments that also need changes in Unicorn files (without having to update the Sitecore configuration manually), you can use YmlTransform. This is a small tool that takes a JSON input file and updates the Unicorn files before they are copied to the server. Combined with the Replace Tokens task this is a very powerful solution.

Before we publish the web application to Azure, our deployment process runs a Replace Tokens task over the *.ymltransform file, followed by the YmlTransform command-line step.

First you'll need to create a JSON file containing the fields you want to replace; the tool currently supports shared and language fields. In our case we call it unicorn.ymltransform, with the following content:
[
    {
        "FieldId": "379de7bc-88f2-42ae-8d4a-50dd0b8796ea",
        "Languages": "",
        "Path": "/sitecore/content/Home/Item1",
        "Type": "Shared",
        "Value": "#{ApiUrl1}#"
    },
    {
        "FieldId": "379de7bc-88f2-42ae-8d4a-50dd0b8796ea",
        "Languages": "*",
        "Path": "/sitecore/content/Home/Item2",
        "Type": "Shared",
        "Value": "#{ApiUrl2}#"
    },
    {
        "FieldId": "86ee9731-e7fb-47c9-bab6-5cb282c3a920",
        "Languages": "*",
        "Path": "/sitecore/content/Home/Item3",
        "Type": "Shared",
        "Value": "#{OtherSetting}#"
    }
]

When you pass this file through the Replace Tokens task (run over your *.ymltransform files), the #{...}# tokens are replaced with the actual values from the VSTS variables (which could also be loaded from a key vault).
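If you want to see what that step does without VSTS, a minimal local stand-in for the Replace Tokens task could look like this in PowerShell (the token names and values are the ones from the example above; the drop path is a placeholder):

# Minimal local stand-in for the Replace Tokens task: substitute #{Name}# tokens
# in every *.ymltransform file under the artifact folder. Values below are examples.
$tokens = @{
    ApiUrl1      = 'https://apiurl1.com'
    ApiUrl2      = 'https://apiurl2.com'
    OtherSetting = 'HelloWorld'
}

Get-ChildItem -Path '.\drop' -Filter '*.ymltransform' -Recurse | ForEach-Object {
    $content = Get-Content -Path $_.FullName -Raw
    foreach ($key in $tokens.Keys) {
        $content = $content.Replace("#{$key}#", $tokens[$key])
    }
    Set-Content -Path $_.FullName -Value $content
}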

In the next step you can run a command with the following settings to transform the actual unicorn files:
ymltransform.exe -p "App_Data/unicorn" -r -t "unicorn.ymltransform" 

The output of this command:
2018-04-06T06:28:47.6709328Z Updating file D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item1.yml section Shared id 379de7bc-88f2-42ae-8d4a-50dd0b8796ea to https://apiurl1.com
2018-04-06T06:28:47.6711423Z Transformed: D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item1.yml
2018-04-06T06:28:47.7732198Z Updating file D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item2.yml section Shared id 379de7bc-88f2-42ae-8d4a-50dd0b8796ea to https://apiurl2.com
2018-04-06T06:28:47.7735015Z Transformed: D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item2.yml
2018-04-06T06:28:47.9817169Z Updating file D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item3.yml section Shared id 86ee9731-e7fb-47c9-bab6-5cb282c3a920 to HelloWorld
2018-04-06T06:28:47.9819775Z Transformed: D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item3.yml

Now that the yml files contain the correct information for your environment, you can upload them and run a Unicorn sync (automatically).
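Unicorn also ships remote scripting support (a Unicorn.psm1 module plus MicroCHAP.dll) that you can call from the release pipeline to trigger that sync; a hedged sketch, where the module path, URL, shared secret and configuration name are placeholders and the details may differ per Unicorn version:

# Hedged sketch of triggering a remote Unicorn sync from the release pipeline.
# The module path, URL, shared secret and configuration name are placeholders.
Import-Module .\Unicorn.psm1

Sync-Unicorn -ControlPanelUrl 'https://my-site.azurewebsites.net/unicorn.aspx' `
             -SharedSecret '<shared secret from your Unicorn configuration>' `
             -Configurations @('Project.Content')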

The full source code is available on GitHub:
https://github.com/luuksommers/ymltransform 

Happy transforming!
Luuk

Friday, March 23, 2018

Using the Sitecore bootloader to add Redis to a Sitecore 8.2 XP0 installation

The Sitecore bootloader is a nifty, somewhat underdocumented feature that can be used to easily extend a Sitecore installation. When you install EXM using the web deploy packages, you'll notice that it has to add some connection strings to ConnectionStrings.config, but you cannot modify that file without restarting the application. So I wondered: how does that work?
It turns out the bootloader scans directories and can apply XDT transforms to any config file. Just by placing a file called ConnectionStrings.config.xdt on the Azure Web App in the folder App_Data\Transforms\Redis\Xdts\App_Config, the bootloader matches the filename without the .xdt extension and runs the transformation. In other words, the path below Xdts should mirror the folder structure of your site root.
My Redis package does exactly the same as the EXM package: it's a web deploy package containing this file, so we can add Redis to Sitecore without modifying the Sitecore installation beforehand.

The content of the xdt file:
<?xml version="1.0"?>
<connectionStrings xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
 <add name="redis.sessions" connectionString="Redis Connection String" xdt:Transform="Insert" />
</connectionStrings>
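To see what the bootloader effectively does with such a file, here is a minimal PowerShell sketch that applies the same XDT transform by hand; it assumes Microsoft.Web.XmlTransform.dll (from the Microsoft.Web.Xdt NuGet package) is available locally and uses example paths:

# Apply an XDT transform to a config file, roughly what the bootloader does for you.
# The assembly path and file paths below are examples.
Add-Type -Path '.\Microsoft.Web.XmlTransform.dll'

$configPath    = 'C:\inetpub\wwwroot\App_Config\ConnectionStrings.config'
$transformPath = 'C:\inetpub\wwwroot\App_Data\Transforms\Redis\Xdts\App_Config\ConnectionStrings.config.xdt'

$doc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
$doc.PreserveWhitespace = $true
$doc.Load($configPath)

$transform = New-Object -TypeName Microsoft.Web.XmlTransform.XmlTransformation -ArgumentList $transformPath
if ($transform.Apply($doc)) {
    $doc.Save($configPath)   # write the transformed config back
} else {
    Write-Error "XDT transform failed for $configPath"
}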

By adding a parameter called 'Redis Connection String' to the web deploy package, we can push the actual Redis connection string in at deployment time:
{
  "name": "[concat(variables('singleWebAppNameTidy'), '/', 'MSDeploy')]",
  "type": "Microsoft.Web/sites/extensions",
  "location": "[parameters('location')]",
  "apiVersion": "[variables('webApiVersion')]",
  "properties": {
    "addOnPackages": [
      {
        "packageUri": "[parameters('redisMsDeployPackageUrl')]",
        "setParameters": {
          "Application Path": "[variables('singleWebAppNameTidy')]",
          "Redis Connection String": "[concat(reference(resourceId('Microsoft.Cache/Redis', 
                   variables('redisCacheNameTidy')), variables('redisApiVersion')).hostName, ':', reference(resourceId('Microsoft.Cache/Redis', 
                   variables('redisCacheNameTidy')), variables('redisApiVersion')).sslPort, ',password=', listKeys(resourceId('Microsoft.Cache/Redis', 
                   variables('redisCacheNameTidy')), variables('redisApiVersion')).primaryKey, ',ssl=True,abortConnect=False')]"
        }
      }
    ]
  }
}
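Deploying the ARM template that contains this resource (and with it the MSDeploy add-on package) can be done with the AzureRM PowerShell module; a small example where the resource group and file names are placeholders:

# Deploy the ARM template containing the MSDeploy addOnPackages resource.
# Resource group name, template and parameter file are placeholders.
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName 'my-sitecore-rg' `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterFile '.\azuredeploy.parameters.json' `
    -Verbose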

During the first hit of the site, the bootloader scans the App_Data\Transforms directory and then runs an installer in a separate process on the Web App to apply the transformations.

Supporting code:
https://github.com/luuksommers/sitecore-xp0-redis

Happy caching!
Luuk

Monday, April 25, 2016

First steps towards an awesome build and deploy pipeline

So we moved to the new scripted build system in VSTS, with GitHub as our main code repository. But how do we handle builds and deployments? In this post I'll show the tips and tricks we used to get it working. Please note that this setup is continuously improving, but it will get you started.

Code setup

First, our code setup. In the root of our project we have a couple of folders so that code doesn't get mixed up with other tools:
  • src (for sources)
  • build (for build scripts, see versioning below)
  • tools (for external tools)

Build setup

For the build server we've created the following steps. See the specific configurations below each name.

Build Tab

Delete files
Contents: **\<Namespace>*.nupkg (this is because we don't do a clean checkout each time)

Powershell
Script Filename: build/ApplyVersionToAssemblies.ps1

NuGet Installer
Path to Solution: **\*.sln

Visual Studio Build
Solution: **\*.sln
MSBuild Arguments: /p:RunOctoPack=true

Visual Studio Test
Test Assembly: **\bin\$(BuildConfiguration)\*test*.dll; -:**\xunit.runner.visualstudio.testadapter.dll; -:**\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll;-:**\<NameSpace>.TestUtils.dll

NuGet Publisher
Path/Pattern to nupkg: **\bin\**\MeteoGroup.RouteGuard*.nupkg
NuGet Server Endpoint: Octopus Deploy (see Octopus Deploy, below)

Triggers Tab

The triggers fire only for the master, feature and hotfix branches; all other branches are not built automatically. The branch prefixes for feature and hotfix follow the GitFlow naming, so if we ever switch to GitFlow, the naming is at least the same.


General Tab

Build Number Format: <YourProjectName>_2.$(Year:yy)$(DayOfYear)$(rev:.r)

Version Updater

The version-updating script is a nifty little thing that uses a regex to apply the build number as the version of your assemblies. With the assemblies properly versioned, OctoPack will use that same version, and so will your deployment; the awesome part is that everything is connected.
It creates versions like 2.16109.1.0 (for master branch builds) and 2.16109.1-feature-<FEATURENAME> for feature builds. The versions have to be SemVer 1 compatible due to limitations in the NuGet 2.0 protocol. Save the PowerShell script below in the build directory as ApplyVersionToAssemblies.ps1.


##-----------------------------------------------------------------------
## <copyright file="ApplyVersionToAssemblies.ps1">(c) Microsoft Corporation.
## This source is subject to the Microsoft Permissive License.
## See http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx.
## All other rights reserved.</copyright>
##-----------------------------------------------------------------------
# Look for a 0.0.0.0 pattern in the build number. 
# If found use it to version the assemblies.
#
# For example, if the 'Build number format' build process parameter is
# $(BuildDefinitionName)_$(Year:yyyy).$(Month).$(DayOfMonth)$(Rev:.r)
# then your build numbers come out like this:
# "Build HelloWorld_2013.07.19.1"
# This script would then apply version 2013.07.19.1 to your assemblies.

# Enable -Verbose option
[CmdletBinding()]

# Regular expression pattern to find the version in the build number 
# and then apply it to the assemblies
$BuildVersionRegex = "\d+\.\d+\.\d+"
$FileVersionRegex = "\d+\.\d+\.\d+\.\d+"
$VersionTagRegex = "refs\/heads\/(\w*)\/([\w-]*)"

# If this script is not running on a build server, remind user to 
# set environment variables so that this script can be debugged
if(-not ($Env:BUILD_SOURCESDIRECTORY -and $Env:BUILD_BUILDNUMBER))
{
    Write-Error "You must set the following environment variables"
    Write-Error "to test this script interactively."
    Write-Host '$Env:BUILD_SOURCESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:BUILD_SOURCESDIRECTORY = "C:\code\FabrikamTFVC\HelloWorld"'
    Write-Host '$Env:BUILD_BUILDNUMBER - For example, enter something like:'
    Write-Host '$Env:BUILD_BUILDNUMBER = "Build HelloWorld_0000.00.00.0"'
    exit 1
}

# Make sure path to source code directory is available
if (-not $Env:BUILD_SOURCESDIRECTORY)
{
    Write-Error ("BUILD_SOURCESDIRECTORY environment variable is missing.")
    exit 1
}
elseif (-not (Test-Path $Env:BUILD_SOURCESDIRECTORY))
{
    Write-Error "BUILD_SOURCESDIRECTORY does not exist: $Env:BUILD_SOURCESDIRECTORY"
    exit 1
}
Write-Verbose "BUILD_SOURCESDIRECTORY: $Env:BUILD_SOURCESDIRECTORY"

# Make sure there is a build number
if (-not $Env:BUILD_BUILDNUMBER)
{
    Write-Error ("BUILD_BUILDNUMBER environment variable is missing.")
    exit 1
}
Write-Verbose "BUILD_BUILDNUMBER: $Env:BUILD_BUILDNUMBER"
Write-Verbose "BUILD_SOURCEBRANCH: $Env:BUILD_SOURCEBRANCH"

# Get and validate the version data
$VersionData = [regex]::matches($Env:BUILD_BUILDNUMBER,$BuildVersionRegex)
switch($VersionData.Count)
{
   0        
      { 
         Write-Error "Could not find version number data in BUILD_BUILDNUMBER."
         exit 1
      }
   1 {}
   default 
      { 
         Write-Warning "Found more than instance of version data in BUILD_BUILDNUMBER." 
         Write-Warning "Will assume empty version tag."
      }
}

$VersionTagData = [regex]::matches($Env:BUILD_SOURCEBRANCH,$VersionTagRegex)
switch($VersionTagData.Captures.Groups.Count)
{
   0 {}
   3 
      {
        $VersionTag = $VersionTagData.Captures.Groups[1].value + '-' + $VersionTagData.Captures.Groups[2].value
      }
   default 
      { 
         Write-Error "Invalid version tag data in BUILD_SOURCEBRANCH." 
      }
}

$NewVersion = $VersionData[0].value
Write-Verbose "Version: $NewVersion"
if($VersionTag){
    Write-Verbose "VersionTag: $VersionTag"  
}

# Apply the version to the assembly property files
$files = gci $Env:BUILD_SOURCESDIRECTORY -recurse -include "*Properties*","My Project" | 
    ?{ $_.PSIsContainer } | 
    foreach { gci -Path $_.FullName -Recurse -include AssemblyInfo.* }
if($files)
{
    Write-Verbose "Will apply $NewVersion to $($files.count) files."

    foreach ($file in $files) {
        $filecontent = Get-Content($file)
        attrib $file -r
        $FileVersion = $NewVersion + ".0"
        $filecontent -replace $FileVersionRegex, $FileVersion | Out-File $file

        if($VersionTag) {
            Add-Content $file "`n[assembly: AssemblyInformationalVersion(`"$NewVersion-$VersionTag`")]"
            Write-Verbose "$file.FullName - version tag applied"
        }
        else {
            Write-Verbose "$file.FullName - version applied"
        }
    }
}
else
{
    Write-Warning "Found no files."
}
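To debug the script outside the build server, set the environment variables it checks for and run it locally; a quick example with made-up values:

# Run ApplyVersionToAssemblies.ps1 locally; the paths and build number below are made up.
$Env:BUILD_SOURCESDIRECTORY = 'C:\code\MyProject'
$Env:BUILD_BUILDNUMBER      = 'MyProject_2.16109.1'
$Env:BUILD_SOURCEBRANCH     = 'refs/heads/feature/my-new-feature'

$VerbosePreference = 'Continue'   # show the Write-Verbose output
.\build\ApplyVersionToAssemblies.ps1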

Octopus Deploy

You've probably already heard of Octopus Deploy: your build server builds, and Octopus deploys.
In the previous step you saw that we created a service endpoint in VSTS. You can add it by clicking the Settings button and adding a service.


With the versioning in place everything works fine in Octopus, and because we have names like 'feature' or 'hotfix' in the package version, you can even set up channels to quickly deploy hotfix patches to production while allowing feature packages to be deployed only to your development environment. But this is something we still need to set up (maybe in a future blog post).

Because the applications themselves don't know the release version or environment name, we've created a variable set called 'Default Environment' with two keys: Version and Environment.
All applications that need this information can now use the version and environment name (in our case we use it to log the version to Logstash).
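In an Octopus PowerShell deployment step, those values are available through $OctopusParameters; a small sketch, assuming the two keys are called Version and Environment as above:

# Read the 'Default Environment' variable set inside an Octopus PowerShell step.
# Assumes the keys are named Version and Environment.
$version     = $OctopusParameters['Version']
$environment = $OctopusParameters['Environment']

Write-Host "Deploying version $version to $environment"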

When you want to update more than one project, octo.exe is there to help you: with this super simple tool you can create releases and deploy multiple projects at once. For example:

@echo off
set SERVER=http://<YOUR OCTOPUS SERVER>/
set APIKEY=<API KEY>
set PACKAGEVERSION=<PACKAGE VERSION FROM VSTS, when not Feature/Hotfix, add .0 to it>
set TO=Development
set RELEASENOTES="<RELEASE NOTES>"

octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Api"
octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Application"
octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Data Ingestor"
octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Product Worker"

octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Api"
octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Application"
octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Data Ingestor"
octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Product Worker"

You'll be amazed by the nicely colored logging that comes out of this beauty. Now a 5 o'clock deployment is nothing more than a click away (but don't do it :))!

We're still in the process of improving the flow of the deployment, but I like the progress we made so far. If you have any tips or questions, let me know in the comments below!

Happy deploying!
Luuk

Wednesday, October 7, 2015

Tail with cmder (or powershell)

With PowerShell it's really easy to tail log files. To make it even easier, I've added a 'Tail with PowerShell' context menu entry in Windows. Just save the lines below as a .reg file and run it.


Btw, mine is called 'Tail with cmder' just because cmder is awesome!

Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\*\shell\Tail with cmder\command]
@="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -NoExit -Command Get-Content -Wait -Tail 10 '%1'"

Cheers,
Luuk

Monday, September 28, 2015

Automatic update of assembly version using TFS2013

To successfully implement Octopus Deploy you need a unique version number for each build. If you don't want to edit the assembly info manually, this can be a real pain in the ***. With the following trick you can automatically generate version numbers using TFS Build Server 2013.

What I did was create a BuildCommon.targets file that automatically finds the AssemblyInfo files and updates their version numbers to match the build number generated by TFS; check this file in to your code tree. In our case the file is named BuildCommon.targets and is placed next to the solution root:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0">

<!--
    Defining custom Targets to execute before project compilation starts.
-->
<PropertyGroup>
    <CompileDependsOn>
        CommonBuildDefineModifiedAssemblyVersion;
        $(CompileDependsOn);
    </CompileDependsOn>
</PropertyGroup>

<!--
    Creates modified version of AssemblyInfo.cs, replaces [AssemblyVersion] attribute with the one specifying actual build version (from MSBuild properties), and includes that file instead of the original AssemblyInfo.cs in the compilation.

    Works with both, .cs and .vb version of the AssemblyInfo file, meaning it supports C# and VB.Net projects simultaneously.
-->
<Target Name="CommonBuildDefineModifiedAssemblyVersion" Condition="'$(VersionAssembly)' != ''">
    <!-- Find AssemblyInfo.cs or AssemblyInfo.vb in the "Compile" Items. Remove it from "Compile" Items because we will use a modified version instead. -->
    <PropertyGroup>
        <VersionAssembly>$([System.Text.RegularExpressions.Regex]::Replace($(VersionAssembly), `[\w|\D]+_`, ``, System.Text.RegularExpressions.RegexOptions.IgnoreCase))</VersionAssembly>
    </PropertyGroup>
    <ItemGroup>
        <OriginalAssemblyInfo Include="@(Compile)" Condition="(%(Filename) == 'AssemblyInfo') And (%(Extension) == '.vb' Or %(Extension) == '.cs')" />
        <Compile Remove="**/AssemblyInfo.vb" />
        <Compile Remove="**/AssemblyInfo.cs" />
    </ItemGroup>
    <!-- Copy the original AssemblyInfo.cs/.vb to obj\ folder, i.e. $(IntermediateOutputPath). The copied filepath is saved into @(ModifiedAssemblyInfo) Item. -->
    <Copy SourceFiles="@(OriginalAssemblyInfo)"
          DestinationFiles="@(OriginalAssemblyInfo->'$(IntermediateOutputPath)%(Identity)')">
        <Output TaskParameter="DestinationFiles" ItemName="ModifiedAssemblyInfo"/>
    </Copy>
    <!-- Replace the version bit (in AssemblyVersion and AssemblyFileVersion attributes) using regular expression. Use the defined property: $(VersionAssembly). -->
    <Message Text="Setting AssemblyVersion to $(VersionAssembly)" />
    <RegexUpdateFile Files="@(ModifiedAssemblyInfo)"
                Regex="Version\(&quot;(\d+)\.(\d+)(\.(\d+)\.(\d+)|\.*)&quot;\)"
                ReplacementText="Version(&quot;$(VersionAssembly)&quot;)"
                />
    <!-- Include the modified AssemblyInfo.cs/.vb file in "Compile" items (instead of the original). -->
    <ItemGroup>
        <Compile Include="@(ModifiedAssemblyInfo)" />
    </ItemGroup>
</Target>

<UsingTask TaskName="RegexUpdateFile" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
        <Files ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
        <Regex ParameterType="System.String" Required="true" />
        <ReplacementText ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
        <Reference Include="System.Core" />
        <Using Namespace="System" />
        <Using Namespace="System.IO" />
        <Using Namespace="System.Text.RegularExpressions" />
        <Using Namespace="Microsoft.Build.Framework" />
        <Using Namespace="Microsoft.Build.Utilities" />
        <Code Type="Fragment" Language="cs">
            <![CDATA[
            try {
                var rx = new System.Text.RegularExpressions.Regex(this.Regex);
                for (int i = 0; i < Files.Length; ++i)
                {
                    var path = Files[i].GetMetadata("FullPath");
                    if (!File.Exists(path)) continue;

                    var txt = File.ReadAllText(path);
                    txt = rx.Replace(txt, this.ReplacementText);
                    File.WriteAllText(path, txt);
                }
                return true;
            }
            catch (Exception ex) {
                Log.LogErrorFromException(ex);
                return false;
            }
        ]]>
        </Code>
    </Task>
</UsingTask>

</Project>

Then change the build number format to:
$(BuildDefinitionName)_0.1.$(Year:yy)$(DayOfYear)$(Rev:.r)

and the MSBuild arguments:
/p:CustomAfterMicrosoftCommonTargets="$(TF_BUILD_SOURCESDIRECTORY)\src\BuildCommon.targets" /p:RunOctoPack=true /p:OctoPackPublishApiKey=API-123465 /p:OctoPackPublishPackageToHttp=http://octopus-server/nuget/packages /p:VersionAssembly=$(TF_BUILD_BUILDNUMBER)
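The somewhat odd [\w|\D]+_ regex in the targets file only strips the build definition name (everything up to the last underscore) from $(TF_BUILD_BUILDNUMBER); you can check it quickly in PowerShell with a made-up build number:

# Demonstrates the prefix-stripping regex from BuildCommon.targets; the build number is an example.
$buildNumber = 'MyDefinition_0.1.15271.3'
[System.Text.RegularExpressions.Regex]::Replace($buildNumber, '[\w|\D]+_', '')
# Output: 0.1.15271.3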

This should result in unique assembly versions for each build.

Many thanks to the creators of these posts for helping me create this:
http://www.lionhack.com/2014/02/13/msbuild-override-assembly-version/
http://blog.casavian.eu/blog/2014/04/23/increment-version-for-changed-assemblies-only-first-part/
http://blogs.msdn.com/b/visualstudio/archive/2010/04/02/msbuild-property-functions.aspx

Cheers,
Luuk

Tuesday, September 15, 2015

Install Sentry (an open source error logger) on Azure using Docker containers

Start with an Azure VM: Docker on Ubuntu Server (create one on http://portal.azure.com). Create an account at http://hub.docker.com so you can pull containers. When the VM is fully loaded, log in to your Azure VM with PuTTY or any other SSH client and type the following commands:
$ docker login
$ docker search redis ### (optional search for redis)
$ docker pull redis
$ docker pull postgres
$ docker pull sentry
$ docker run -d --name sentry-redis redis
$ docker run -d --name sentry-postgres -e POSTGRES_PASSWORD=yourpassword -e POSTGRES_USER=sentry postgres
$ docker run -d --name sentry -p 8080:9000 --link sentry-redis:redis --link sentry-postgres:postgres sentry
$ docker run -it --rm --link sentry-postgres:postgres --link sentry-redis:redis sentry sentry upgrade
$ docker run -d --name sentry-celery1 --link sentry-redis:redis --link sentry-postgres:postgres sentry sentry celery worker
$ docker run -d --name sentry-celery-beat --link sentry-redis:redis --link sentry-postgres:postgres sentry sentry celery beat
For me, the initial user didn't have enough rights, so I created an additional user using:
$ docker run -it --rm --link sentry-redis:redis --link sentry-postgres:postgres sentry sentry createsuperuser
To make the web portal accessible, you'll have to open port 8080 in Azure, using Settings - Endpoints on the VM.
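If you prefer scripting over the portal, the classic (service management) Azure PowerShell cmdlets can add the endpoint as well; a hedged sketch, assuming a classic VM and placeholder service/VM names:

# Hedged sketch: open public port 8080 on a classic Azure VM; service and VM names are placeholders.
Get-AzureVM -ServiceName 'my-docker-service' -Name 'my-docker-vm' |
    Add-AzureEndpoint -Name 'Sentry' -Protocol tcp -LocalPort 8080 -PublicPort 8080 |
    Update-AzureVM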

All kudos for this post go to https://hub.docker.com/_/sentry/ for the excellent description; I've only created this post to add the additional docker pull / Azure steps. I don't know if this is how you'd want to run it in production, but at least you have a very easy test environment.

Now you can compare it with other error loggers and monitoring tools.

What is your experience with error loggers and monitoring tools and which one would you recommend?

Cheers,
Luuk

Friday, July 3, 2015

Unhandled Exception: System.InvalidOperationException: Cannot dispose the build manager because it is not idle.

Today we got this really annoying error when building on TFS2010:


Long story short: it wasn't our build server causing this error, but the TFS server itself... it was out of disk space.

So please check this first before blaming everything else except TFS :)

Cheers,
Luuk