March 21, 2022 | By Frederic Lavigne | 5 min read

How to use the IBM Cloud plugin for Packer with the Instance Metadata service to create a custom image with pre-installed IBM Cloud CLI and automatically configured Log Analysis and Monitoring agents.

As you build a workload on top of virtual servers, you may have to configure all servers to comply with your enterprise policies and specific regulations. This would mean manipulating configuration files, locking down iptables, enabling or disabling services, updating or removing software packages, installing your application files, securing the environment and so on. The result is a hardened virtual server.

To build these hardened servers, you have several options. One option is to create a virtual server instance, connect to the instance, install all required software, perform any configuration changes and, finally, take a snapshot of the virtual server. From there, you can create new identical instances from this snapshot.

This approach works, but it can be brought to the next level with automation and custom images. When talking about automation and infrastructure provisioning, Terraform and Ansible automatically come to mind. When it comes to virtual servers, Packer is to images what Terraform is to your infrastructure.

In this blog post, I’m going to use Packer to create a virtual server image and then create a new virtual server instance from this image. The image will include a pre-installed IBM Cloud CLI plus the IBM Cloud Log Analysis and IBM Cloud Monitoring agents. In addition, a script running at boot time will automatically configure the agents when a new instance is created. The auto-configuration uses the Instance Metadata service to authenticate against IAM and retrieves the Log Analysis and Monitoring credentials from IBM Cloud Secrets Manager:

Architecture of the solution described in this blog post.

Let’s annotate the diagram above to better understand the overall flow:

Architecture annotated with the sequence of steps.

The flow goes as follows:

  1. Define a trusted profile with read access to Secrets Manager, and create the secrets containing the credentials to access the Log Analysis and Monitoring services.
  2. Create one VPC and one subnet that will be used by Packer to build the image.
  3. Run Packer to create the custom image.
  4. Create a virtual server instance from the custom image.
  5. The instance starts, retrieves credentials and initializes the Log Analysis and Monitoring agents.

Use trusted profiles to authenticate compute resources

Trusted profiles allow you to give an identity to your compute resources. By linking a compute resource to a trusted profile, you can grant the compute resource access to IBM Cloud services without having to hardcode any credentials on your compute resource. Used in combination with the Instance Metadata service, the trusted profile allows a compute resource to obtain an IAM API token. With this token, your code running on the compute resource can make authenticated calls to other IBM Cloud services it has been granted access to.
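The two-step token exchange described above can be sketched from inside an instance. This is a hedged sketch based on the VPC Instance Metadata API; the `version` date is illustrative, and the exact response fields should be checked against the current API documentation:

```shell
# 1. Obtain an instance identity token from the link-local metadata endpoint.
#    This endpoint is only reachable from within the instance itself.
TOKEN=$(curl -s -X PUT "http://169.254.169.254/instance_identity/v1/token?version=2022-03-01" \
  -H "Metadata-Flavor: ibm" | jq -r '.access_token')

# 2. Exchange the identity token for an IAM token. Because the instance is
#    linked to a trusted profile, the IAM token carries that profile's access.
IAM_TOKEN=$(curl -s -X POST "http://169.254.169.254/instance_identity/v1/iam_token?version=2022-03-01" \
  -H "Authorization: Bearer ${TOKEN}" | jq -r '.access_token')
```

No API key ever touches the instance; the only secret material is the short-lived IAM token minted at runtime.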

In the diagram, in Step 1, a secret group named “observability” is created in an existing instance of Secrets Manager. This secret group holds two key-value secrets containing the credentials to access the Log Analysis and Monitoring services. These credentials will be used by the agents running on virtual servers. A trusted profile is configured with policies to access the Secrets Manager instance and the observability secret group. Any virtual server instance linked to this trusted profile will be able to retrieve the credentials stored in this secret group:

A trusted profile with policies to access a Secrets Manager instance and a secret group.
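Step 1 can be sketched with the IBM Cloud CLI. Treat the command and flag names below as illustrative; they follow the `ibmcloud iam` CLI but may differ slightly across versions, and the profile name is my own placeholder:

```shell
# Create the trusted profile that virtual server instances will assume.
ibmcloud iam trusted-profile-create observability-profile \
  --description "Read observability secrets"

# Grant the profile reader access on the Secrets Manager service.
# (Scoping the policy down to just the "observability" secret group is
# possible with additional resource attributes; shown at service level here.)
ibmcloud iam trusted-profile-policy-create observability-profile \
  --roles SecretsReader --service-name secrets-manager
```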

Automate the process of building custom images with Packer

Packer is similar to Terraform in that it relies on plain text files describing the actions to take to build a custom image (in the same way you define your target state in Terraform template files). It uses the same HCL syntax (HashiCorp Configuration Language). It also supports plugins to integrate with cloud providers, and a plugin is available for IBM Cloud. To create VPC custom images, the Packer plugin requires you to provision a minimal environment consisting of a VPC and a subnet. This is what Step 2 captures in the diagram.
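The minimal build environment from Step 2 amounts to two CLI calls. Resource names and the zone below are illustrative:

```shell
# A VPC and one subnet are all the Packer plugin needs to stage its build.
ibmcloud is vpc-create packer-vpc
ibmcloud is subnet-create packer-subnet packer-vpc \
  --zone us-south-1 \
  --ipv4-address-count 256
```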

In Step 3 of the diagram, Packer reads a configuration file (vm.pkr.hcl in this example) and builds the image according to the instructions in the file. It creates a temporary instance in the VPC from Step 2. It runs a script on the instance to pre-install the IBM Cloud CLI and other dependencies. It also copies some files that will be used when a new instance is created. Finally, Packer captures a custom image of the temporary instance and deletes the instance once done. This whole process of creating an instance, waiting for it to be available, running scripts, capturing an image and deleting the temporary instance is automated by Packer. It is a really good approach to continuously build images from a CI/CD pipeline:

Packer uses a temporary instance to create a custom image from a configuration file.
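Kicking off the build is a two-command affair. How the plugin picks up the API key depends on the template; the environment variable name below is an assumption, so check the sample's variable declarations:

```shell
# Never commit a real API key; this is a placeholder.
export IBM_API_KEY="<your-api-key>"

packer init vm.pkr.hcl   # downloads the IBM Cloud plugin declared in the template
packer build vm.pkr.hcl  # creates the temporary instance and captures the image
```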

Instantiate a virtual server instance from a custom image and automatically configure Log Analysis and Monitoring agents

When you create a virtual server instance from the command line, the API or Terraform, you can specify a default trusted profile to attach to the instance. In Step 4, when creating an instance from the image captured in Step 3, the trusted profile defined in Step 1 is linked to the instance.
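Step 4 might look like the following from the CLI. The flag names for the trusted profile and metadata service are illustrative and may differ across CLI versions; names, profile and zone are placeholders:

```shell
# Create an instance from the custom image and attach the trusted profile
# from Step 1 as its default identity. The metadata service must be enabled
# for the boot-time script to obtain tokens.
ibmcloud is instance-create my-instance packer-vpc us-south-1 cx2-2x4 packer-subnet \
  --image my-custom-image \
  --default-trusted-profile observability-profile \
  --metadata-service enabled
```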

During the instance boot phase, a custom service (using scripts installed in Step 3) uses the Instance Metadata service to authenticate with IAM and retrieve credentials from Secrets Manager. The credentials are then injected into the pre-installed Log Analysis and Monitoring agents. From there, the instance completes its boot sequence:

Boot sequence of the virtual server instance created from the custom image.
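The boot-time configuration can be sketched as below. The Secrets Manager endpoint URL, secret ID and agent configuration path are placeholders, and the response shape assumes the v1 REST API for key-value secrets; the sample's actual scripts are the reference:

```shell
# IAM token minted at boot via the Instance Metadata service.
IAM_TOKEN="<iam-token-from-metadata-service>"
SM_URL="https://<instance-id>.us-south.secrets-manager.appdomain.cloud"
SECRET_ID="<log-analysis-secret-id>"

# Read the key-value secret holding the Log Analysis ingestion key.
INGESTION_KEY=$(curl -s "${SM_URL}/api/v1/secrets/kv/${SECRET_ID}" \
  -H "Authorization: Bearer ${IAM_TOKEN}" \
  | jq -r '.resources[0].secret_data.payload.ingestion_key')

# Hand the key to the pre-installed agent (file path is illustrative),
# then let the boot sequence continue.
echo "LOGDNA_INGESTION_KEY=${INGESTION_KEY}" >> /etc/logdna.env
```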

In the code, you will notice that the on-boot.sh script is executed before cloud-init. This is an arbitrary choice made in the sample, although it has a nice side effect: the cloud-init logs get sent to Log Analysis. Depending on your own scenario, you may prefer to configure the Observability agents only after cloud-init is complete. Note also that the on-boot.sh script reconfigures the agents at every reboot.

Try it yourself

Packer brings an essential tool to capture the definition of your custom images as code, just like Terraform does for your infrastructure. The image-building capability can easily be integrated in your CI/CD pipeline and approval flow. And when it comes to your apps, a trusted profile allows you to use a compute resource (a virtual server, for example) as a principal when authenticating against IAM.

The source code for this blog post can be found here along with instructions on how to test it in your own account and entry points to drill down in the code.

If you have feedback, suggestions or questions about this post, please reach out to me on Twitter (@L2FProd).

