Moving vRealize Automation blueprints between environments with vRealize Suite Lifecycle Manager 1.2

When large enterprises deploy a cloud management platform such as VMware vRealize Automation, they often run a number of different environments. To follow best practice, blueprints and orchestration scripts are created in development, tested in another environment, and only then transported into production. This prevents broken blueprints from making it into production and ensures customers can continue to consume resources in the usual manner.

One of the biggest challenges facing IT teams is how to move artifacts (blueprints, scripts, custom properties, etc.) between environments. Whilst it is possible to leverage the power of vRealize Orchestrator and vRA's Event Broker to create a workflow to manage this lifecycle, doing so requires a lot of work and thorough testing.

In 2016 VMware released the vRealize Code Stream Management Pack for IT DevOps, commonly referred to as "Houdini" by those who found its full title a bit of a mouthful. This enabled customers to promote code between environments without having to craft their own solution. However, it wasn't without its issues. For example, custom properties used by composite blueprints had to exist in the target environment before a blueprint could be moved. Software components also had to be moved beforehand, which could be cumbersome for blueprints that made use of many of them.

Houdini is dead, long live Blackstone

Thankfully, these issues are a thing of the past with the introduction of VMware vRealize Suite Lifecycle Manager 1.2 (vRSLCM for short), which was released on 12 April 2018.

Note: vRSLCM is a huge product that automates the installation, upgrading, patching, configuration and health management of your entire vRealize Suite of products. It has numerous capabilities; in this post, however, I will focus purely on content management between environments.

VMware has integrated Houdini into vRSLCM and improved it significantly. Recognizing that moving artifacts individually can be cumbersome, vRSLCM gathers up all dependencies prior to transportation, ensuring blueprints work successfully upon reaching their target environment.

vRSLCM also introduces source control, enabling you to export your artifacts to GitLab so that they can be versioned. This is extremely handy, as it enables you to re-import an earlier version of a blueprint if later changes need to be rolled back.

Getting Started

Log in to your GitLab environment and create a user for vRSLCM:

Once created, impersonate the vRSLCM account and click Settings. On the left-hand side, select Access Tokens:

Give the token a name and tick the api scope:

Make a note of the token and store it in a safe place.
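
Before handing the token to vRSLCM, it is worth a quick sanity check. The snippet below is my own optional verification step (not a vRSLCM feature), written in Python against GitLab's standard /api/v4/user endpoint; the server address is the one from my lab and the token placeholder is the value you just recorded, so substitute your own.

import requests

GITLAB_URL = "https://gitlab.nl.mdb-lab.com"  # replace with your GitLab server
TOKEN = "<insert token here>"                 # the access token recorded above

# GET /api/v4/user returns the account the token belongs to; a 401 here
# means the token is invalid or has insufficient scope.
resp = requests.get(f"{GITLAB_URL}/api/v4/user",
                    params={"private_token": TOKEN})
resp.raise_for_status()
print("Token belongs to:", resp.json()["username"])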

Repository

You should already have your projects and repository set up and configured. In my environment, the project is marked as private. As it was my first project, I know the project ID is 1.

If you try to view the project using the following URL:

https://gitlab.nl.mdb-lab.com/api/v4/projects/1

It should fail with a 404 error. However, if you append the access token using:

?private_token=<insert token here>

it will return data (using Firefox in this example):

This verifies that the access token works as expected.
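
The same check can be scripted rather than performed in the browser. Below is a minimal Python sketch, assuming the requests library is available; it uses the lab URL and project ID 1 from the example above and passes the token via the same private_token parameter.

import requests

GITLAB_URL = "https://gitlab.nl.mdb-lab.com"
PROJECT_ID = 1                      # my first project; adjust for yours
TOKEN = "<insert token here>"

# Without the token, a private project returns 404.
anon = requests.get(f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}")
print("Without token:", anon.status_code)

# With the token appended, the project details are returned.
auth = requests.get(f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}",
                    params={"private_token": TOKEN})
print("With token:", auth.status_code, auth.json().get("name"))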

Source Control

Log in to vRSLCM and select Content Management, followed by Content Settings and then Source Control Access. Click Add Source Control Server, select GitLab as the type, and enter your server's address. Finally, click Submit.

Once submitted, click the pencil icon to the right of your server to edit it. Paste in the access token you recorded earlier and click Submit:

Note: Your GitLab installation must be available over HTTPS. If not, the connection between vRSLCM and GitLab will fail at a later stage.
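
If you want to confirm this up front, the short sketch below (my own check, not part of either product) opens an HTTPS connection to the GitLab server with certificate verification enabled; an untrusted or self-signed certificate will surface here as an SSLError.

import requests

GITLAB_URL = "https://gitlab.nl.mdb-lab.com"  # replace with your GitLab server

try:
    # Default verification is on, so this fails if the certificate is not trusted.
    requests.get(GITLAB_URL, timeout=10)
    print("HTTPS connection OK")
except requests.exceptions.SSLError as err:
    print("Certificate problem:", err)
except requests.exceptions.RequestException as err:
    print("HTTPS connection failed:", err)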

Endpoints

In this example we will be using five endpoints:

Click Content Management, Endpoints, then New Endpoint:

Choose your destiny…

Note: Orchestration endpoints must be added before Automation endpoints.

Configure your server details as appropriate, ensuring the following:

  • A tag to describe the environment
  • Port 8281 for an external vRO server

Once complete click Test Connection:

Once done click Next followed by Submit. Repeat for your remaining orchestration endpoints.
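
If Test Connection fails for an external vRO endpoint, it can help to rule out basic connectivity first. The sketch below is my own pre-flight check (not part of vRSLCM), using a hypothetical hostname vro.example.com; any HTTP response from the vRO REST API on port 8281, even a 401, shows the service is listening.

import requests
import urllib3

urllib3.disable_warnings()  # the vRO appliance often uses a self-signed certificate

VRO_HOST = "vro.example.com"  # hypothetical; replace with your Orchestrator server

try:
    resp = requests.get(f"https://{VRO_HOST}:8281/vco/api",
                        verify=False, timeout=10)
    print("vRO reachable on 8281, HTTP status:", resp.status_code)
except requests.exceptions.ConnectionError as err:
    print("vRO not reachable on 8281:", err)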

Follow the same procedure for creating your automation endpoints, remembering to select the correct orchestration endpoint you created previously:

Finally, create a source control endpoint for GitLab:

Capturing Content

In the following example I have chosen to promote my newly created Kubernetes blueprint from Development to Production. The blueprint and all dependencies will also be stored in source control.

This is a relatively simple blueprint that consists of the following:

  • Composite blueprint (Automation-CompositeBlueprint)
  • Software components (Automation-Software)
  • Property definition (Automation-PropertyDefinition)

All of the above exist only in the Development environment, not in Production.

Click on Content Management followed by Content, then select Add Content:

Select Check In and then Proceed. Choose the content you wish to transport:

Click Next. Select your GitLab server:

Finally click Submit.

Pipeline

To monitor the capturing process, click Content Management, Content Pipelines, followed by Executions. This will show the execution you just initiated:

Once complete, everything should be green:

Success

If we log in to GitLab as our vRSLCM user, we can see the commits we just made:
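
As an alternative to browsing the web UI, the commits can also be read back through the GitLab API. The sketch below is a minimal example, assuming the same lab URL, project ID and token used earlier.

import requests

GITLAB_URL = "https://gitlab.nl.mdb-lab.com"
PROJECT_ID = 1
TOKEN = "<insert token here>"

# List the most recent commits vRSLCM pushed to the repository.
resp = requests.get(
    f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/repository/commits",
    params={"private_token": TOKEN, "per_page": 5},
)
resp.raise_for_status()
for commit in resp.json():
    print(commit["created_at"], commit["short_id"], commit["title"])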

Deploying Content

The process of deploying content is even simpler.

Under Content, we can see everything that was captured previously:

Click on the composite blueprint:

On the right-hand side, click Deploy.

Here you can choose to release to all configured endpoints, or select one based on a tag. I have a number of environments, each with multiple tenants, so for this example I will select the endpoint based on the "Prod" tag:

Click Proceed.

Monitor the pipeline execution as before:

Once it has completed, verify it has arrived in the target environment:

Together with the software components:

That’s it! The blueprint and all dependencies have now been migrated to production.

Final Thoughts

Bear in mind that moving content between environments needs careful consideration. For example, if you have a template in Development that uses the Guest User Agent (gugent), then that template, and any blueprint that references it, will point to that specific vRA environment.

To work around this, you will either need to name your templates so they are environment agnostic, or alter the blueprint when it arrives in the target environment.

The former is the easiest path, but it may require a dedicated templates datastore for each environment. In addition, having all templates named identically, with no obvious way to differentiate between them, could lead to increased administration costs. It would not be difficult to automate the latter using vRO and the Event Broker, but it would, of course, require time to produce a working solution.

For more information on vRealize Suite Lifecycle Manager, check out https://blogs.vmware.com/management/2018/03/vrealize-suite-lifecycle-manager-1-2-introducing-content-management-integrated-marketplace.html.

Comments

  1. I'm assuming you modified the OOTB pipelines in vRSLCM? The capture vRA element never populated to GitLab.

    Also, I couldn't retrieve composite blueprints; however, property definitions, groups, subscriptions… all worked.

  2. I believe I set up GitLab correctly. I'm using my own account instead of a service account; however, I did set up access tokens under my account.

    In regards to composite blueprints, I don't see any in the dropdown for Composite-Blueprint. I get a "No corresponding content found" notification.

    • Make sure the connection account you use to talk from vRSLCM to vRA has the permissions to request blueprints.

      As a test, use the account you log on to vRA with to request catalog items.

      • So the account you log in to vRA with can see composite blueprints in the catalog – great.

        Now configure an endpoint in LCM using that same account. After that, can you select the blueprint from the dropdown?
