Tuesday, May 31, 2022

Build and deploy .NET Framework / .NET Core to IIS with GitLab and some awesome templates

In this blog I'll explain the way we do our builds with GitLab. It took some time to get here, so I'm happy to share it with you. First, the ground rules:
  1. You use GitLab
  2. You have GitLab runners on windows machines on powershell (core)
  3. Make use of merge requests to merge code changes to the default branch
  4. Create release tags in the form of semver '1.0.0'
  5. The default branch always gets deployed to your development IIS server
  6. After creating a release in GitLab, a production build is generated
  7. Manual trigger starts the deployment to your production IIS server
  8. There is a solution file in the root of the project
  9. There is only 1 deployable webproject in your solution
  10. The dotnet-* templates reside in a shared repository
The challenges we faced were mostly with the msdeploy command, which behaves really funky under PowerShell (I dare you to do a duckduckgo search on this and see what you'll find). I really like the way everything is parameterized: the core just works, and if you need a bit more, you adjust only where needed without breaking all the defaults in the build and deploy pipelines. This makes the templates a lot easier to maintain when environments change and forces a standardized way of working.
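For illustration, the invocation pattern below avoids most of the quoting trouble. This is a minimal sketch, not the actual template code: the msdeploy path is the usual install location, and $PackagePath, $ComputerName and $SiteName are placeholder variables.

# Minimal sketch: calling msdeploy.exe from PowerShell without the shell
# mangling the quoted arguments. All variables here are placeholders.
$msdeploy = "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe"
$arguments = @(
    "-verb:sync",
    "-source:package='$PackagePath'",
    "-dest:auto,computerName='$ComputerName'",
    "-setParam:name='IIS Web Application Name',value='$SiteName'"
)
# The call operator with an argument array passes each element as a separate
# argument, sidestepping most quoting issues of a single interpolated string.
& $msdeploy $arguments
if ($LASTEXITCODE -ne 0) { throw "msdeploy failed with exit code $LASTEXITCODE" }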

I also like the validation of the release tag numbering using the validate_tag job in the .pre stage. It only runs when a tag is set that is not in the form of a semver, so you can never build a production package without a proper version number.
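As a sketch, such a job could look roughly like this (illustrative only; the actual template may differ in naming and the exact regex):

# Sketch of a semver guard job in the .pre stage; runs only for non-semver tags.
validate_tag:
  stage: .pre
  rules:
    - if: '$CI_COMMIT_TAG && $CI_COMMIT_TAG !~ /^\d+\.\d+\.\d+$/'
  script:
    - Write-Host "Tag '$env:CI_COMMIT_TAG' is not a valid semver like 1.0.0"
    - exit 1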

The .gitlab-ci.yml is the CI file for your end project; it sets URLs and can customize some configuration. The dotnet-framework.v1.gitlab-ci.yml and dotnet-core.v1.gitlab-ci.yml files are the base templates you can inherit in the .gitlab-ci.yml.
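A consuming .gitlab-ci.yml then stays tiny. A minimal sketch (the project path and URLs are placeholders for your own setup):

# Sketch of an end project's .gitlab-ci.yml inheriting the shared template.
include:
  - project: 'shared/ci-templates'                 # the shared template repository
    file: 'dotnet-framework.v1.gitlab-ci.yml'      # or dotnet-core.v1.gitlab-ci.yml

variables:
  DEVELOPMENT_URL: 'https://dev.example.com'       # development IIS server
  PRODUCTION_URL: 'https://www.example.com'        # production IIS server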

Below are all the gists needed to get you going or just to get inspired. If you have any feedback, I'm happy to learn about your experiences. Comment on the gist or this blog!

Happy deploying,
Luuk

Wednesday, February 24, 2021

Umbraco modelsbuilder line endings

Every time the models builder detects a change in LiveAppData mode, the generated file is touched.

Git uses auto line endings, meaning CRLF on Windows, while the Umbraco models builder uses LF (https://github.com/modelsbuilder/ModelsBuilder.Original/blob/v4/dev/src/Our.ModelsBuilder/Building/CodeWriterBase.cs#L65). This is seen as a change, but when running git add, the line endings are corrected by git and the change is dropped.

This is very annoying, but easy to fix with a .gitattributes file. As a base I've used the one in the UmbracoCms repository, and extended it with the path to the models builder output:
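The relevant addition looks roughly like this (a sketch, assuming the default ~/App_Data/Models output folder):

# Generated by the Umbraco ModelsBuilder, which writes LF line endings
App_Data/Models/*.cs text eol=lf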

Tuesday, December 4, 2018

Remove format on paste in SXA Experience Editor

The experience editor uses the contenteditable feature. To remove formatting on paste you cannot use the Telerik RTE settings, because the Telerik RTE isn't used there.



To solve, I've created a new Editing Theme in the medialibrary\Base Themes:



Upload the editor.js file containing:

window.$xa(document).ready(function () {
    $('[contenteditable]').on('paste', function (e) {
        e.preventDefault();
        var text = '';
        // Grab the plain-text clipboard content, falling back to the
        // IE-specific window.clipboardData when needed
        if (e.clipboardData || e.originalEvent.clipboardData) {
            text = (e.originalEvent || e).clipboardData.getData('text/plain');
        } else if (window.clipboardData) {
            text = window.clipboardData.getData('Text');
        }
        // Insert the unformatted text at the caret position
        if (document.queryCommandSupported('insertText')) {
            document.execCommand('insertText', false, text);
        } else {
            document.execCommand('paste', false, text);
        }
    });
});

Then add My Editing Theme to the Editing Theme of your site:




Now you can paste from Word without keeping the formatting. Happy content editors, happy developers!

Cheers, Luuk

Wednesday, November 21, 2018

Kendo UI Invalid template message

Just in case you get:

Invalid template:'<?xml version="1.0" encoding="UTF-8" standalone="yes"?> <cp:coreProperties xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:dcmitype="http://purl.org/dc/dcmitype/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><dc:creator>${creator}</dc:creator><cp:lastModifiedBy>${lastModifiedBy}</cp:lastModifiedBy><dcterms:created xsi:type="dcterms:W3CDTF">${created}</dcterms:created><dcterms:modified xsi:type="dcterms:W3CDTF">${modified}</dcterms:modified></cp:coreProperties>' Generated code:'var $kendoOutput, $kendoHtmlEncode = kendo.htmlEncode;with(data){$kendoOutput='<?xml version="1.0" encoding="UTF-8" standalone="yes"?>\r\n<cp:coreProperties xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:dcmitype="http://purl.org/dc/dcmitype/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><dc:creator>'+($kendoHtmlEncode(creator))+'</dc:creator><cp:lastModifiedBy>'+($kendoHtmlEncode(lastModifiedBy))+'</cp:lastModifiedBy><dcterms:created xsi:type="dcterms:W3CDTF">'+($kendoHtmlEncode(created))+'</dcterms:created><dcterms:modified xsi:type="dcterms:W3CDTF">'+($kendoHtmlEncode(modified))+'</dcterms:modified></cp:coreProperties>';}return $kendoOutput;' 

With Firefox I got the message 'Content Security Policy: The page’s settings blocked the loading of a resource at eval (“script-src”).'

Kendo compiles templates to JavaScript at runtime (effectively an eval), which a strict CSP blocks. So in your Content Security Policy, add 'unsafe-eval' to the script-src directive.
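For example, an amended policy could look like this (illustrative; keep the rest of your existing policy as-is):

Content-Security-Policy: script-src 'self' 'unsafe-eval'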

Cheers,
L

Tuesday, November 6, 2018

Extend Sitecore SXA metadata with hreflang and better favicon support (pt. 2 - hreflang)

When you're creating a multilingual site, you probably want to tell the search engines where to find the content in the other languages. The way to do this is using hreflang.

Using the hreflang rendering from MetadataExtended, this is really easy peasy!

You'll only have to add the hreflang rendering to the metadata partial design manually (like with the FaviconExtension in the previous post); the rest is in the config.

The idea is that you don't use the language cookie, but always use the language from the url. To do so we need to enable the custom link manager and add a pipeline to remove the language cookie:

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <linkManager defaultProvider="sitecore">
      <providers>
        <add name="ExtendedLinkProvider" 
             type="SXA.Feature.MetadataExtended.Providers.ExtendedLinkProvider, SXA.Feature.MetadataExtended" 
             cacheExpiration="5" 
             addAspxExtension="false" 
             alwaysIncludeServerUrl="false" 
             lowercaseUrls="true" 
             encodeNames="true" 
             languageEmbedding="always" 
             languageLocation="filePath" 
             shortenUrls="false" 
             useDisplayName="true">
        </add>
      </providers>
      <patch:attribute name="defaultProvider">switchableLinkProvider</patch:attribute>
    </linkManager>
    <pipelines>
      <httpRequestProcessed>
        <processor type="SXA.Feature.MetadataExtended.Pipelines.HttpRequestProcessed.LanguageCookieRemover, SXA.Feature.MetadataExtended" resolve="true" />
      </httpRequestProcessed>
    </pipelines>
  </sitecore>
</configuration>

Then enable the ExtendedLinkProvider in your SXA site, and disable language cookie support. Browse to /sitecore/content/{tenant}/{site}/Settings/Site Grouping/{site} and update the Link Provider Name. You'll also need to set the disableLanguageCookie to true:


The ExtendedLinkProvider does a bit of magic: it adds the language to the url for all languages except 'en'. You can always modify this behavior, but I like it this way.

To get the right localized url, one that uses the display name instead of the item name (see the 'useDisplayName' setting in the LinkManager config above), you have to do something slightly weird. I would expect that wrapping the call in a LanguageSwitcher would be enough to pick up the display name in that language. Too bad it doesn't do that: you also need to fetch the item in the target language. But the language item by itself doesn't add the language in front of the url, so wrapping it in a LanguageSwitcher does the magic:

// Illustrative wrapper around the relevant calls; the actual implementation
// lives in the MetadataExtended repository.
public string GetLocalizedUrl(Item item, string languageCode)
{
    var language = Language.Parse(languageCode);

    // We have to get the language item for the right display name
    var languageItem = item.Database.GetItem(item.ID, language);

    // And use the LanguageSwitcher to get the right language in the url
    using (new LanguageSwitcher(language))
    {
        return LinkManager.GetItemUrl(languageItem);
        // NOTE: The actual code has an option to return the full server url,
        // which is needed for hreflang
    }
}

With this, you'll get the hreflang:
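It renders roughly like this (illustrative URLs; note that 'en' gets no language prefix and the other languages use their display names):

<link rel="alternate" hreflang="en" href="https://www.example.com/products/" />
<link rel="alternate" hreflang="nl" href="https://www.example.com/nl/producten/" />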


If you have any questions or issues, please drop me a message below or file a ticket on:
https://github.com/luuksommers/sitecore-sxa-metadataextended

[dutch]Computer ze![/dutch]

Luuk

Saturday, November 3, 2018

Extend Sitecore SXA metadata with hreflang and better favicon support

Sitecore SXA only supports a simple favicon OOTB. Because we have a full-blown, mobile-friendly website, we want to support all the native icon formats as generated by https://realfavicongenerator.net/. For this I've created the MetadataExtended feature for SXA.

To use it add the _FaviconExtended template to the Settings template that has been generated for your website:


After adding this template, you'll get a lot more options in the favicon section of your settings:


Here you can upload all the files that are generated by https://realfavicongenerator.net/.

Please note that the site.webmanifest and browserconfig.xml contain a link to an icon. These files are not automatically generated in the initial version, so update them with the right media path.
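For example, the icon entries in site.webmanifest could end up like this (a sketch; the name and media paths are placeholders for your own uploads):

{
  "name": "My Site",
  "icons": [
    { "src": "/-/media/project/site/android-chrome-192x192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/-/media/project/site/android-chrome-512x512.png", "sizes": "512x512", "type": "image/png" }
  ],
  "theme_color": "#ffffff",
  "background_color": "#ffffff",
  "display": "standalone"
}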

Now all we need to do is add the right rendering to the metadata partial design (/sitecore/content/{tenant}/{site}/Presentation/Partial Designs/Metadata). You can do this by adding the rendering to /sitecore/content/{tenant}/{site}/Presentation/Available Renderings and inserting it with the Experience Editor, or directly via the Presentation Details. And voilà: on the favicon checker, all is green!


In the next post I'll explain the hreflang extension.

All code is available on https://github.com/luuksommers/sitecore-sxa-metadataextended

Happy faviconning,

Luuk

Friday, June 1, 2018

Patching Sitecore web.config on Azure using VSTS WebApp deploy

With Sitecore, it's a best practice not to copy the web.config into your project but to use transformations. This is a really nice idea, but when you deploy Sitecore to Azure using the ARM templates and apply your solution on top of it, you cannot edit the web.config. With the solution below I want to show how to apply config transformations on Azure using the VSTS Web App Service Deploy task. The method is kinda easy once you know how to do it.

First, add the code from configtransform to your App_Data\tools folder; the binaries and post-deployment command are stored there. Second, add a web.azure.config to the project with the build action set to Content. Lastly, update the release definition and add the post-deployment task to the Web App Service Deploy.



The line in the postdeploy.cmd that does the magic is:
"%WEBROOT_PATH%\App_Data\tools\configtransform\SlowCheetah.Xdt.exe" "%WEBROOT_PATH%\web.config" "%WEBROOT_PATH%\web.azure.config" "%WEBROOT_PATH%\web.config"

To see all other available environment variables run 'set' from the kudu console.

Of course the source is available on GitHub: https://github.com/luuksommers/sitecore-azure-configtransform

Happy transforming!
Luuk

Friday, April 6, 2018

Transforming Unicorn files in a release pipeline using YmlTransform



When you use automated deployments to different environments that also need changes in Unicorn files (without having to update the Sitecore configuration manually), you can use YmlTransform. This is an easy tool that uses an input JSON file to update Unicorn files before copying them to the server. Together with the ReplaceTokens task, this is a very powerful solution.

Our deployment process looks like this before we publish the web application to Azure.

First you'll need to create a JSON file containing the fields you want to replace; it currently supports shared and language fields. In our case we call it unicorn.ymltransform, with the following content:
[
    {
        "FieldId": "379de7bc-88f2-42ae-8d4a-50dd0b8796ea",
        "Languages": "",
        "Path": "/sitecore/content/Home/Item1",
        "Type": "Shared",
        "Value": "#{ApiUrl1}#"
    },
    {
        "FieldId": "379de7bc-88f2-42ae-8d4a-50dd0b8796ea",
        "Languages": "*",
        "Path": "/sitecore/content/Home/Item2",
        "Type": "Shared",
        "Value": "#{ApiUrl2}#"
    },
    {
        "FieldId": "86ee9731-e7fb-47c9-bab6-5cb282c3a920",
        "Languages": "*",
        "Path": "/sitecore/content/Home/Item3",
        "Type": "Shared",
        "Value": "#{OtherSetting}#"
    }
]

When you run the ReplaceTokens task over your *.ymltransform files, the #{...}# tokens are replaced with the actual values from the VSTS variables (which could also be loaded from a Key Vault).

In the next step you can run a command with the following settings to transform the actual unicorn files:
ymltransform.exe -p "App_Data/unicorn" -r -t "unicorn.ymltransform" 

The output of this command:
2018-04-06T06:28:47.6709328Z Updating file D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item1.yml section Shared id 379de7bc-88f2-42ae-8d4a-50dd0b8796ea to https://apiurl1.com
2018-04-06T06:28:47.6711423Z Transformed: D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item1.yml
2018-04-06T06:28:47.7732198Z Updating file D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item2.yml section Shared id 379de7bc-88f2-42ae-8d4a-50dd0b8796ea to https://apiurl2.com
2018-04-06T06:28:47.7735015Z Transformed: D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item2.yml
2018-04-06T06:28:47.9817169Z Updating file D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item3.yml section Shared id 86ee9731-e7fb-47c9-bab6-5cb282c3a920 to HelloWorld
2018-04-06T06:28:47.9819775Z Transformed: D:\a\r1\a\TestProject\drop\artifacts\Website\App_Data/unicorn\Project\serialization\Content\Home\Item3.yml

Now that the yml files contain the correct information for your environment, you can upload them and run a Unicorn sync (automatically).

The full sourcecode is available on Github:
https://github.com/luuksommers/ymltransform 

Happy transforming!
Luuk

Friday, March 23, 2018

Using the Sitecore bootloader to add Redis to a Sitecore 8.2 XP0 installation

The Sitecore bootloader is a nifty, somewhat under-documented little thing that can be used to easily extend a Sitecore installation. When you install EXM using the web deploy packages, you'll notice that it has to add some connection strings to ConnectionStrings.config, but you cannot modify that file without restarting the application. So I was wondering: how does that work?
It appears that the bootloader scans directories and can apply XDT transforms to all configs. Just by placing a file called ConnectionStrings.config.xdt on the Azure Web App in the folder App_Data\Transforms\Redis\Xdts\App_Config, the bootloader will match it to the file of the same name without the .xdt extension and run the transformation. So the path after Xdts should match the folder structure from your site root.
The Redis package does exactly the same: I've created a WebDeploy package containing this file, so we can deploy Redis to Sitecore without any modifications to the Sitecore installation beforehand.
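So the package layout on the Web App looks like this (everything below Xdts mirrors the site root):

App_Data
  Transforms
    Redis
      Xdts
        App_Config
          ConnectionStrings.config.xdt   (applied to App_Config\ConnectionStrings.config)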

The content of the xdt file:
<?xml version="1.0"?>
<connectionStrings xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
 <add name="redis.sessions" connectionString="Redis Connection String" xdt:Transform="Insert" />
</connectionStrings>

By adding a parameter called 'Redis Connection String' to the web deploy package, we can now push the Redis connection string to it from the ARM template:
{
  "name": "[concat(variables('singleWebAppNameTidy'), '/', 'MSDeploy')]",
  "type": "Microsoft.Web/sites/extensions",
  "location": "[parameters('location')]",
  "apiVersion": "[variables('webApiVersion')]",
  "properties": {
    "addOnPackages": [
      {
        "packageUri": "[parameters('redisMsDeployPackageUrl')]",
        "setParameters": {
          "Application Path": "[variables('singleWebAppNameTidy')]",
          "Redis Connection String": "[concat(reference(resourceId('Microsoft.Cache/Redis', 
                   variables('redisCacheNameTidy')), variables('redisApiVersion')).hostName, ':', reference(resourceId('Microsoft.Cache/Redis', 
                   variables('redisCacheNameTidy')), variables('redisApiVersion')).sslPort, ',password=', listKeys(resourceId('Microsoft.Cache/Redis', 
                   variables('redisCacheNameTidy')), variables('redisApiVersion')).primaryKey, ',ssl=True,abortConnect=False')]"
        }
      }
    ]
  }
}

During the first hit of the site, the bootloader will scan the App_Data\Transforms directory. It then runs an installer in a separate process on the Web App to apply the transformation.

Supporting code:
https://github.com/luuksommers/sitecore-xp0-redis

Happy caching!
Luuk

Monday, April 25, 2016

First steps towards an awesome build and deploy pipeline

So we moved to the new awesome scripted build from VSTS, with GitHub as our main code repository. But how do we handle the builds and deployments? In this post I will show all the tips and tricks we used to get it working. Please note that this is a continuously improving process, but it will get you started.

Code setup

First our code setup. In the root of our project we have a couple of folders to not mix up code with other tools. So we have the following folders defined:
  • src (for sources)
  • build (for build scripts, see versioning below)
  • tools (for external tools)

Build setup

For the build server we've created the following steps. See the specific configurations below each name.

Build Tab

Delete files
Contents: **\<Namespace>*.nupkg (this is because we don't do a clean checkout each time)

Powershell
Script Filename: build/ApplyVersionToAssemblies.ps1

NuGet Installer
Path to Solution: **\*.sln

Visual Studio Build
Solution: **\*.sln
MSBuild Arguments: /p:RunOctoPack=true

Visual Studio Test
Test Assembly: **\bin\$(BuildConfiguration)\*test*.dll; -:**\xunit.runner.visualstudio.testadapter.dll; -:**\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll;-:**\<NameSpace>.TestUtils.dll

NuGet Publisher
Path/Pattern to nupkg **\bin\**\MeteoGroup.RouteGuard*.nupkg
NuGet Server Endpoint: Octopus Deploy (see Octopus Deploy, below)

Triggers Tab

The triggers fire only for the master, feature and hotfix branches. All other branches are not automatically built. The branch prefixes for feature and hotfix are based on the GitFlow naming, so if we ever want to use GitFlow, the naming is at least the same.


General Tab

Build Number Format: <YourProjectName>_2.$(Year:yy)$(DayOfYear)$(rev:.r)

Version Updater

The version-updating script is a nifty little thing that uses a regex to set the build's version number in your assemblies. And with the assemblies properly versioned, your OctoPack will also use the proper version, and so will your deployment. This brings the awesomeness that everything is connected!
It will create versions like 2.16109.1.0 (for master branch builds) and 2.16109-feature-<FEATURENAME> for feature builds. The versions should be SemVer 1 compatible due to limitations in the NuGet 2.0 protocol. Save the PowerShell script below in the build directory as ApplyVersionToAssemblies.ps1.


##-----------------------------------------------------------------------
## <copyright file="ApplyVersionToAssemblies.ps1">(c) Microsoft Corporation.
## This source is subject to the Microsoft Permissive License.
## See http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx.
## All other rights reserved.</copyright>
##-----------------------------------------------------------------------
# Look for a 0.0.0.0 pattern in the build number. 
# If found use it to version the assemblies.
#
# For example, if the 'Build number format' build process parameter 
# $(BuildDefinitionName)_$(Year:yyyy).$(Month).$(DayOfMonth)$(Rev:.r)
# then your build numbers come out like this:
# "Build HelloWorld_2013.07.19.1"
# This script would then apply version 2013.07.19.1 to your assemblies.

# Enable -Verbose option
[CmdletBinding()]

# Regular expression pattern to find the version in the build number 
# and then apply it to the assemblies
$BuildVersionRegex = "\d+\.\d+\.\d+"
$FileVersionRegex = "\d+\.\d+\.\d+\.\d+"
$VersionTagRegex = "refs\/heads\/(\w*)\/([\w-]*)"

# If this script is not running on a build server, remind user to 
# set environment variables so that this script can be debugged
if(-not ($Env:BUILD_SOURCESDIRECTORY -and $Env:BUILD_BUILDNUMBER))
{
    Write-Error "You must set the following environment variables"
    Write-Error "to test this script interactively."
    Write-Host '$Env:BUILD_SOURCESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:BUILD_SOURCESDIRECTORY = "C:\code\FabrikamTFVC\HelloWorld"'
    Write-Host '$Env:BUILD_BUILDNUMBER - For example, enter something like:'
    Write-Host '$Env:BUILD_BUILDNUMBER = "Build HelloWorld_0000.00.00.0"'
    exit 1
}

# Make sure path to source code directory is available
if (-not $Env:BUILD_SOURCESDIRECTORY)
{
    Write-Error ("BUILD_SOURCESDIRECTORY environment variable is missing.")
    exit 1
}
elseif (-not (Test-Path $Env:BUILD_SOURCESDIRECTORY))
{
    Write-Error "BUILD_SOURCESDIRECTORY does not exist: $Env:BUILD_SOURCESDIRECTORY"
    exit 1
}
Write-Verbose "BUILD_SOURCESDIRECTORY: $Env:BUILD_SOURCESDIRECTORY"

# Make sure there is a build number
if (-not $Env:BUILD_BUILDNUMBER)
{
    Write-Error ("BUILD_BUILDNUMBER environment variable is missing.")
    exit 1
}
Write-Verbose "BUILD_BUILDNUMBER: $Env:BUILD_BUILDNUMBER"
Write-Verbose "BUILD_SOURCEBRANCH: $Env:BUILD_SOURCEBRANCH"

# Get and validate the version data
$VersionData = [regex]::matches($Env:BUILD_BUILDNUMBER,$BuildVersionRegex)
switch($VersionData.Count)
{
   0        
      { 
         Write-Error "Could not find version number data in BUILD_BUILDNUMBER."
         exit 1
      }
   1 {}
   default 
      { 
         Write-Warning "Found more than instance of version data in BUILD_BUILDNUMBER." 
         Write-Warning "Will assume empty version tag."
      }
}

$VersionTagData = [regex]::matches($Env:BUILD_SOURCEBRANCH,$VersionTagRegex)
switch($VersionTagData.Captures.Groups.Count)
{
   0 {}
   3 
      {
        $VersionTag = $VersionTagData.Captures.Groups[1].value + '-' + $VersionTagData.Captures.Groups[2].value
      }
   default 
      { 
         Write-Error "Invalid version tag data in BUILD_SOURCEBRANCH." 
      }
}

$NewVersion = $VersionData[0].value
Write-Verbose "Version: $NewVersion"
if($VersionTag){
    Write-Verbose "VersionTag: $VersionTag"  
}

# Apply the version to the assembly property files
$files = gci $Env:BUILD_SOURCESDIRECTORY -recurse -include "*Properties*","My Project" | 
    ?{ $_.PSIsContainer } | 
    foreach { gci -Path $_.FullName -Recurse -include AssemblyInfo.* }
if($files)
{
    Write-Verbose "Will apply $NewVersion to $($files.count) files."

    foreach ($file in $files) {
        $filecontent = Get-Content($file)
        attrib $file -r
        $FileVersion = $NewVersion + ".0"
        $filecontent -replace $FileVersionRegex, $FileVersion | Out-File $file

        if($VersionTag) {
            Add-Content $file "`n[assembly: AssemblyInformationalVersion(`"$NewVersion-$VersionTag`")]"
            Write-Verbose "$file.FullName - version tag applied"
        }
        else {
            Write-Verbose "$file.FullName - version applied"
        }
    }
}
else
{
    Write-Warning "Found no files."
}

Octopus Deploy

You've probably already heard of Octopus Deploy. Your build server builds, and Octopus deploys.
In the previous step you've seen that we've created a service endpoint in VSTS. You can add it by clicking the Settings button and adding a service.


With the versioning in place, everything will work fine in Octopus. And because we have names like 'Feature' or 'Hotfix' in the package, you can even set up channels to quickly deploy hotfix patches to production and allow feature packages to be deployed only on your development environment. But this is something we still need to set up (maybe in a future blog post).

Because the applications themselves don't know the release version / environment name, we've created a variable set called 'Default Environment' and added two keys to it: Version and Environment.
All applications that need them can now use the version and the environment name (in our case we use them to log the version to LogStash).

When you want to update more than one project, octo.exe is there to help you. With this super simple tool you can create releases and deploy multiple projects at once. For example:

@echo off
set SERVER=http://<YOUR OCTOPUS SERVER>/
set APIKEY=<API KEY>
set PACKAGEVERSION=<PACKAGE VERSION FROM VSTS, when not Feature/Hotfix, add .0 to it>
set TO=Development
set RELEASENOTES="<RELEASE NOTES>"

octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Api"
octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Application"
octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Data Ingestor"
octo create-release --server %SERVER% --releasenotes=%RELEASENOTES% --apiKey %APIKEY% --packageversion %PACKAGEVERSION% --project "<Project Name> Product Worker"

octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Api"
octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Application"
octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Data Ingestor"
octo deploy-release --server %SERVER% --apiKey %APIKEY% --releaseNumber %PACKAGEVERSION% --deployto %TO% --waitfordeployment --project "<Project Name> Product Worker"

You'll be amazed by the nicely colored logging that comes out of this beauty. Now a 5 o'clock deployment is nothing more than a click away (but don't do it :))!

We're still in the process of improving the flow of the deployment, but I like the progress we made so far. If you have any tips or questions, let me know in the comments below!

Happy deploying!
Luuk

Wednesday, October 7, 2015

Tail with cmder (or powershell)

With PowerShell it's really easy to tail log files. To make it even easier, I've added a 'Tail with PowerShell' context menu in Windows. Just save the next lines as a .reg file and run it.


Btw, mine is called 'Tail with cmder' just because cmder is awesome!

Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\*\shell\Tail with cmder\command]
@="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -NoExit -Command Get-Content -Wait -Tail 10 '%1'"

Cheers,
Luuk

Monday, September 28, 2015

Automatic update of assembly version using TFS2013

To successfully implement Octopus Deploy you need unique version numbers for each build. If you don't want to manually edit the assembly info every time, this can be a real pain in the ***. With the following trick you can automatically generate version numbers using TFS Build Server 2013.

What I did was create a BuildCommon.targets file that automatically finds the AssemblyInfo files and updates the version number to match the build number generated by TFS; check this file into your code tree. In our case the file is named BuildCommon.targets and is placed next to the root of the solution:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0">

<!--
    Defining custom Targets to execute before project compilation starts.
-->
<PropertyGroup>
    <CompileDependsOn>
        CommonBuildDefineModifiedAssemblyVersion;
        $(CompileDependsOn);
    </CompileDependsOn>
</PropertyGroup>

<!--
    Creates modified version of AssemblyInfo.cs, replaces [AssemblyVersion] attribute with the one specifying actual build version (from MSBuild properties), and includes that file instead of the original AssemblyInfo.cs in the compilation.

    Works with both, .cs and .vb version of the AssemblyInfo file, meaning it supports C# and VB.Net projects simultaneously.
-->
<Target Name="CommonBuildDefineModifiedAssemblyVersion" Condition="'$(VersionAssembly)' != ''">
    <!-- Find AssemblyInfo.cs or AssemblyInfo.vb in the "Compile" Items. Remove it from "Compile" Items because we will use a modified version instead. -->
    <PropertyGroup>
        <VersionAssembly>$([System.Text.RegularExpressions.Regex]::Replace($(VersionAssembly), `[\w|\D]+_`, ``, System.Text.RegularExpressions.RegexOptions.IgnoreCase))</VersionAssembly>
    </PropertyGroup>
    <ItemGroup>
        <OriginalAssemblyInfo Include="@(Compile)" Condition="(%(Filename) == 'AssemblyInfo') And (%(Extension) == '.vb' Or %(Extension) == '.cs')" />
        <Compile Remove="**/AssemblyInfo.vb" />
        <Compile Remove="**/AssemblyInfo.cs" />
    </ItemGroup>
    <!-- Copy the original AssemblyInfo.cs/.vb to obj\ folder, i.e. $(IntermediateOutputPath). The copied filepath is saved into @(ModifiedAssemblyInfo) Item. -->
    <Copy SourceFiles="@(OriginalAssemblyInfo)"
          DestinationFiles="@(OriginalAssemblyInfo->'$(IntermediateOutputPath)%(Identity)')">
        <Output TaskParameter="DestinationFiles" ItemName="ModifiedAssemblyInfo"/>
    </Copy>
    <!-- Replace the version bit (in AssemblyVersion and AssemblyFileVersion attributes) using regular expression. Use the defined property: $(VersionAssembly). -->
    <Message Text="Setting AssemblyVersion to $(VersionAssembly)" />
    <RegexUpdateFile Files="@(ModifiedAssemblyInfo)"
                Regex="Version\(&quot;(\d+)\.(\d+)(\.(\d+)\.(\d+)|\.*)&quot;\)"
                ReplacementText="Version(&quot;$(VersionAssembly)&quot;)"
                />
    <!-- Include the modified AssemblyInfo.cs/.vb file in "Compile" items (instead of the original). -->
    <ItemGroup>
        <Compile Include="@(ModifiedAssemblyInfo)" />
    </ItemGroup>
</Target>

<UsingTask TaskName="RegexUpdateFile" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
        <Files ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
        <Regex ParameterType="System.String" Required="true" />
        <ReplacementText ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
        <Reference Include="System.Core" />
        <Using Namespace="System" />
        <Using Namespace="System.IO" />
        <Using Namespace="System.Text.RegularExpressions" />
        <Using Namespace="Microsoft.Build.Framework" />
        <Using Namespace="Microsoft.Build.Utilities" />
        <Code Type="Fragment" Language="cs">
            <![CDATA[
            try {
                var rx = new System.Text.RegularExpressions.Regex(this.Regex);
                for (int i = 0; i < Files.Length; ++i)
                {
                    var path = Files[i].GetMetadata("FullPath");
                    if (!File.Exists(path)) continue;

                    var txt = File.ReadAllText(path);
                    txt = rx.Replace(txt, this.ReplacementText);
                    File.WriteAllText(path, txt);
                }
                return true;
            }
            catch (Exception ex) {
                Log.LogErrorFromException(ex);
                return false;
            }
        ]]>
        </Code>
    </Task>
</UsingTask>

</Project>

Then change the build number format to:
$(BuildDefinitionName)_0.1.$(Year:yy)$(DayOfYear)$(Rev:.r)

and the MSBuild arguments:
/p:CustomAfterMicrosoftCommonTargets="$(TF_BUILD_SOURCESDIRECTORY)\src\BuildCommon.targets" /p:RunOctoPack=true /p:OctoPackPublishApiKey=API-123465 /p:OctoPackPublishPackageToHttp=http://octopus-server/nuget/packages /p:VersionAssembly=$(TF_BUILD_BUILDNUMBER)

This should result in unique assembly versions for each build.

Many thanks to the creators of these posts for helping me create this:
http://www.lionhack.com/2014/02/13/msbuild-override-assembly-version/
http://blog.casavian.eu/blog/2014/04/23/increment-version-for-changed-assemblies-only-first-part/
http://blogs.msdn.com/b/visualstudio/archive/2010/04/02/msbuild-property-functions.aspx

Cheers,
Luuk

Tuesday, September 15, 2015

Install Sentry (an open source error logger) on Azure using Docker containers

Start with an Azure VM: Docker on Ubuntu Server (create one on http://portal.azure.com). Create an account at http://hub.docker.com so you can pull containers. When the VM is fully loaded, log in with PuTTY or any other ssh client to your Azure VM and type the following commands:
$ docker login
$ docker search redis ### (optional search for redis)
$ docker pull redis
$ docker pull postgres
$ docker pull sentry
$ docker run -d --name sentry-redis redis
$ docker run -d --name sentry-postgres -e POSTGRES_PASSWORD=yourpassword -e POSTGRES_USER=sentry postgres
$ docker run -d --name sentry -p 8080:9000 --link sentry-redis:redis --link sentry-postgres:postgres sentry
$ docker run -it --rm --link sentry-postgres:postgres --link sentry-redis:redis sentry sentry upgrade
$ docker run -d --name sentry-celery1 --link sentry-redis:redis --link sentry-postgres:postgres sentry sentry celery worker
$ docker run -d --name sentry-celery-beat --link sentry-redis:redis --link sentry-postgres:postgres sentry sentry celery beat
For me, the initial user didn't have enough rights, so I created an additional user using:
$ docker run -it --rm --link sentry-redis:redis --link sentry-postgres:postgres sentry sentry createsuperuser
To make the web portal accessible, you'll have to open the port in Azure, using Settings - Endpoints:

All kudos for this post go to https://hub.docker.com/_/sentry/ for the excellent description; I've only created this post to add the additional docker pull / Azure stuff. I don't know if this is how you want to run it in production, but at least you have a very easy test environment.

Now you can compare this with other error loggers.

What is your experience with error loggers and monitoring tools, and which one would you recommend?

Cheers,
Luuk

Friday, July 3, 2015

Unhandled Exception: System.InvalidOperationException: Cannot dispose the build manager because it is not idle.

Today we got this really annoying error when building on TFS2010:


Long story short: it wasn't our build server causing this error, but the TFS server itself... it was out of disk space.

So please check this first before blaming everything else except TFS :)

Cheers,
Luuk

Wednesday, March 4, 2015

What I've learned from reading RESTful Web APIs

I've recently finished reading the book RESTful Web APIs by Leonard Richardson, Mike Amundsen, Sam Ruby.


Wish I'd read this book before building an API. To summarize the things I would have done differently (and you probably should have too):

  • Use standard naming conventions for properties, from for example http://schema.org/docs/schemas.html
  • Don't use application/json but a custom format like application/vnd.sameproblemmorecode.blog+json
  • Make better use of the default HTTP headers (e.g. the WWW-Authenticate and Link header)
  • Return errors as described in https://tools.ietf.org/html/draft-nottingham-http-problem-06 (since standardized as RFC 7807; see the sketch after this list)
  • Create hypermedia links in the HTTP headers to describe possible links. These links should also have standardized names, from for example http://www.iana.org/assignments/link-relations/link-relations.xhtml
  • If time allows, even create hypermedia profiles (this allows the server to change without breaking clients). One of the writers is also writing a book on how to create hypermedia-driven clients for this.
  • Make sure to reuse as many standards as possible; we don't need another new standard. This enables us to reuse webcomponents (or at least parts of them) between projects.
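As a sketch, a problem-details error response looks like this (the type URL and fields are illustrative):

HTTP/1.1 404 Not Found
Content-Type: application/problem+json

{
  "type": "https://example.com/probs/post-not-found",
  "title": "Blog post not found",
  "status": 404,
  "detail": "No post exists with id 42."
}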
Hope this helps.

Cheers,
Luuk

Sunday, November 2, 2014

My Development Setup

Last week my PC got upgraded. This blog post serves as a reference for all the stuff I do to personalize Visual Studio and Fiddler.

First thing I do is disable 'Automatically adjust visual experience based on client performance' and 'Enable rich client visual experience', but keep 'Use hardware graphics acceleration if available' enabled. Speed is everything baby!

Then I customize the toolbar and add BC. Pro tip: remove all toolbar buttons you never use.

I always install the following plugins:

Set up the Rename Visual Studio Window Title plugin (to see which branch I'm working in):

Set up Scrum Power Tools (I use this for code reviews and work item shortcuts in the toolbar):
  1. Assign the work item and backlog items to Shortcut #1 and Shortcut #2
  2. Customize the toolbar and add the button for Shortcut #1 and Shortcut #2 to the standard toolbar

This is what my final toolbar looks like:


I also use Fiddler for API debugging. For APIs it's really important to see the HTTP method. To add this column, enable / add the following block in the Rules > Customize Rules file:
public static BindUIColumn("Method", 60)
function FillMethodColumn(oS: Session): String {
   return oS.RequestMethod;
}

When you also retrieve large binary blocks, Fiddler can really slow down when you accidentally click on one. The very powerful Customize Rules file also has a solution for this. Add the following code inside the OnPeekAtResponseHeaders function. It will drop large response bodies, which otherwise slow Fiddler down.
// This block enables streaming for files larger than 5mb
if (oSession.oResponse.headers.Exists("Content-Length"))
{
  var sLen = oSession.oResponse["Content-Length"];
  var iLen: Int32 = 0;
  if (!isNaN(sLen)){
    iLen = parseInt(sLen);
    if (iLen > 5120000) {
      oSession.bBufferResponse = false;
      oSession["ui-color"] = "brown";
      oSession["log-drop-response-body"] = "save memory";
    }
  }
}

Monday, September 22, 2014

Major update for the SBQueueManager

Also having problems managing the Service Bus for Windows Server?
With the latest update of the SBQueueManager you can handle it all.

The improvements include:

  • Topic support
  • Subscription support
  • New user right to manage queues and topics
  • Update queue and topic support
  • Less crashes
  • More feedback
Check it out!


And of course: the source is available on GitHub: https://github.com/luuksommers/SBQueueManager

Happy managing!
Luuk

Tuesday, July 8, 2014

Automatically deploy services using TFS2010

All credits for this post go out to Hrusikesh Panda and Mike Hadlow (see references below).

To deploy Windows services with TFS2010 there are two major challenges: separating the binaries on a per-project basis, and automatically stopping and starting a service without changing the build workflow. Sounds hard? It isn't!

If you don't want any blabla and want to download the sample solution directly, go to GitHub.

First add the following files to your Solution folder and make sure they will be committed to TFS:
DeployApplication.targets (this is a copy of the WebDeploy targets from Microsoft with an extra change to copy the .config file).
DeployService.targets (this stops the service, copies the files and starts the service, it also contains the location on where to put the files)
safeServiceStart.bat (helper to start a remote service)
safeServiceStop.bat (helper to stop a remote service)
You can simply add these files to the solution items (right-click the solution and click Add Existing Item).

Edit the DeployService.targets for the right paths on the remote machine. The directory where the service is located should be available to the user running the build via a standard Windows share (\\<servername>\<project directory>\<servicename>); the servername is determined during the build and can be configured for each build quality.

Copy the Deploy.targets into the project you want to deploy, unload the project file and edit it with Visual Studio. Look up the following line:
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
And add this line below:
<Import Project="Deploy.targets" Condition="'$(DeployOnBuild)'=='True'" />
Save the file and reload the project

Once you've reloaded the project, edit the Deploy.targets (just double click on it) and change the service name and service location. Commit all changes to TFS.

The next step is to install the service on the remote machine (using installutil, or whatever method you like; in the sample you can install the service by running it with the -I command line argument). The automated deployment will only start and stop the service; it is not able to install or delete it. (In my case the service runs as a domain user, and therefore I prefer to pre-install the service with the right credentials, or else you'd have to enter the credentials in the deployment script.)

To enable automatic deployment, create a new Build Configuration on your TFS2010 server and set it up with a scheduled trigger (or whatever trigger you want). In the process section add the following MSBuild arguments:
 /p:DeployOnBuild=True /p:DeploymentServerName=\\<server-to-deploy-to>
If you also want to configure transformations (change the .config file during deployment), have a look at SlowCheetah; it works without a problem with this configuration. To make sure SlowCheetah only runs during deployment, I've edited the project file again and added AND '$(DeployOnBuild)'=='True' to the following line:
<Import Project="$(SlowCheetahTargets)" Condition="Exists('$(SlowCheetahTargets)') AND '$(DeployOnBuild)'=='True'" Label="SlowCheetah" />

(If you don't see this line, make sure you've installed the SlowCheetah extension and added the transformations by right-clicking a .config file and clicking Add Transform)

References:
http://mrchief.calepin.co/deploying-windows-service-via-msbuild
http://mikehadlow.blogspot.nl/2009/06/tfs-build-publishedwebsites-for-exe-and.html
https://github.com/luuksommers/TFSDeployedService

Happy deploying!

Luuk

Monday, March 24, 2014

Running Transmission as a Windows service

I've recently upgraded my netbook server to a full-blown Core i3 system with Windows 8.1. Transmission, the torrent client of my choice, can very easily be installed as a service under Windows.

Follow these steps:

  1. Download the client and cygrunsrv (both .zip) from: http://sourceforge.net/projects/transmissiondaemon/files/
  2. Extract the transmission executables to a folder (in my case c:\apps\transmission)
  3. Extract cygrunsrv to the same folder
  4. Start a command prompt as administrator and go to the transmission directory (cd c:\apps\transmission)
  5. Execute cygrunsrv --install TransmissionBT --path c:/Apps/transmission/transmission-daemon.exe -a "--pid-file transmission-daemon.pid --foreground" -e HOME=c:/Apps/transmission/ -e TRANSMISSION_WEB_HOME=c:/Apps/transmission/web --stdout c:/Apps/transmission/log/transmissionbt.log --stderr c:/Apps/transmission/log/transmissionbt.error.log
To make the whole setup beyond awesome, install the Chrome plugin '.torrent to transmission'. Now you can just right-click every .torrent or magnet link and send it straight to Transmission.

Happy leeching!

Luuk 

Sunday, October 27, 2013

Bind WPF controls to attributes using Caliburn Micro

To bind, for example, a DecimalUpDown control to a RangeAttribute specified in a domain model, or a maxlength to a StringLengthAttribute, you need to change the automatic binding of Caliburn Micro.

The example below is for a DecimalUpDown, but you can use this for all kinds of fun stuff. In my GitHub working example you can also see a Textbox.

First we need to add an ElementConvention for our DecimalUpDown in the Bootstrapper's Configure method. Override this method and add the following convention (edit 2014-03-26: added an attribute check to prevent binding errors):
ConventionManager.AddElementConvention<DecimalUpDown>(DecimalUpDown.ValueProperty, "Value", "ValueChanged").ApplyBinding =
(viewModelType, path, property, element, convention) =>
{
    if (!ConventionManager.SetBindingWithoutBindingOrValueOverwrite(viewModelType, path, property, element, convention, DecimalUpDown.ValueProperty))
        return false;

    if (property.GetCustomAttributes(typeof (RangeAttribute), true).Any())
    {
        if (!ConventionManager.HasBinding(element, DecimalUpDown.MaximumProperty))
        {
            var binding = new Binding(path) {Mode = BindingMode.OneTime, Converter = new RangeMaximumConverter(), ConverterParameter = property};
            BindingOperations.SetBinding(element, DecimalUpDown.MaximumProperty, binding);
        }

        if (!ConventionManager.HasBinding(element, DecimalUpDown.MinimumProperty))
        {
            var binding = new Binding(path) {Mode = BindingMode.OneTime, Converter = new RangeMinimumConverter(), ConverterParameter = property};
            BindingOperations.SetBinding(element, DecimalUpDown.MinimumProperty, binding);
        }
    }

    return true;
};
As you can see, the binding uses a RangeMaximumConverter and a RangeMinimumConverter. These are fairly simple with the AttributeConverter base class:
public sealed class RangeMaximumConverter : AttributeConverter<RangeAttribute>
{
    public override object GetValueFromAttribute(RangeAttribute attribute)
    {
        return attribute.Maximum;
    }
}
And the AttributeConverter base class:
public abstract class AttributeConverter<T> : IValueConverter
    where T : Attribute
{
    public object Convert(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
    {
        var property = parameter as PropertyInfo;

        if (property == null)
            return new ArgumentNullException("parameter").ToString();

        if (!property.IsDefined(typeof(T), true))
            return new ArgumentOutOfRangeException("parameter", parameter,
                "Property \"" + property.Name + "\" has no associated " + typeof(T).Name + " attribute.").ToString();

        return GetValueFromAttribute((T)property.GetCustomAttributes(typeof(T), true)[0]);
    }

    public object ConvertBack(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
    {
        throw new NotSupportedException();
    }

    public abstract object GetValueFromAttribute(T attribute);
}
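For reference, the view-model side could then look like this (a minimal sketch; the class, property name and range are made up):

public class EditorViewModel : PropertyChangedBase
{
    private decimal _amount;

    // [Range] comes from System.ComponentModel.DataAnnotations. It is picked up
    // by the convention above and bound to DecimalUpDown.Minimum / Maximum,
    // while the Value property binds to the control by name.
    [Range(0, 100)]
    public decimal Amount
    {
        get { return _amount; }
        set { _amount = value; NotifyOfPropertyChange(() => Amount); }
    }
}

A DecimalUpDown named Amount in the view then gets its Value, Minimum and Maximum wired up without any explicit XAML bindings.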
You can find a working example on GitHub.

Happy coding,
Luuk