April 10, 2020 By Henrik Loeser 4 min read

A hands-on example demonstrating how you can increase (multi)cloud security.

Both Key Protect and Hyper Protect Crypto Services on IBM Cloud are widely regarded as highly secure services for managing the lifecycle of encryption keys. But did you know that both can also serve as vaults for credentials? Their fine-grained access control and audit logging come in handy when you need to protect external credentials in a multicloud environment.

In this blog post, I am going to provide some hands-on insights on how to use Key Protect as such a vault. In the example, I demonstrate how to encode JSON-structured credentials and upload them to Key Protect (an example could be the key ID and secret to access AWS S3 storage for importing data into IBM Cloud). I will then show a Cloud Function that retrieves the access data and makes it available.

Encode credentials as a standard key

Key Protect and Hyper Protect Crypto Services are solutions that help manage encryption keys on IBM Cloud. Both integrate with many cloud services, and you can generate new keys or bring or keep your own encryption keys (BYOK/KYOK). Basically, any base64-encoded string can be imported as a so-called standard key. This allows you to store custom credentials, not just encryption keys.

The following shows a JSON object with a userID and password that could be stored:

{
  "USERID": "my-user-id",
  "PASSWORD": "my-password"
}

What you want to do, in this case, is to encode the JSON object to a base64 string and store (create) the resulting string as a new key in Key Protect. Assuming the JSON object is stored in a variable creds in a bash script, the following lines would encode the object and then create the key:

encoded=$(base64 -w 0 - <<< "${creds}")

ibmcloud kp create MY_CREDS --standard-key --key-material "${encoded}" -i my-key-protect-instance-id --output json

You can find the full script with comments in the linked Gist on GitHub. The script makes use of the IBM Cloud CLI with the plugin for Key Protect, as well as the standard base64 command. Some required values are read from environment variables.

Upon completion, the script will return an output similar to the following. It is the metadata for the new key, showing its ID and the resource identifier (CRN):

Sample output for new key with key ID and CRN.
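If you prefer Python over bash, the encoding step of the script can be sketched as follows (the credential values are placeholders):

```python
import base64
import json

# Placeholder credentials; any JSON-serializable object works
creds = {"USERID": "my-user-id", "PASSWORD": "my-password"}

# Serialize to JSON and base64-encode the result; the encoded string
# is the key material to store as a standard key in Key Protect
encoded = base64.b64encode(json.dumps(creds).encode()).decode()
print(encoded)
```

The value of encoded is then what gets passed as --key-material to ibmcloud kp create.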

Security configuration

You need to set up user roles and permissions for access to the Key Protect instance and the managed keys (credentials). Key Protect has a special ReaderPlus role. Similar to the Reader role, it allows you to retrieve information about keys. In addition, users with the ReaderPlus role can access the actual key material of standard keys (i.e., the stored credentials). This means that the ReaderPlus role is needed for all users and service IDs that need to access the credentials.

The Writer role is required to create/store new keys. Note that it is possible to grant access to individual keys, further enhancing security by fine-tuning the need-to-know access.

In the introduction, I promised to provide a Cloud Function action for retrieving the credentials from the vault. Cloud Functions organize objects such as actions and packages in namespaces. These namespaces are governed by IAM concepts (hence, IAM namespaces). Each namespace maps to an IAM service ID. For the action to be able to access the key in Key Protect, we need to grant that service ID the ReaderPlus service access role. Optionally, access could be limited to just the keys (credentials) needed to perform the designed tasks.

You can grant the access directly to the service ID. The best practice, however, is to first create an access group, which manages the relationship between its member users and service IDs and the assigned privileges. The following screenshot shows an access group “KP Access” with the ReaderPlus role assigned for a specific Key Protect instance.

ReaderPlus service access role to retrieve key material.
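As a rough sketch of what such a grant looks like under the hood, the following builds a request body for the IAM Policy Management API (POST https://iam.cloud.ibm.com/v1/policies). The role CRN and attribute names shown here are assumptions chosen to illustrate the structure, not values copied from a live policy; verify them against the IAM documentation before use.

```python
import json

def readerplus_policy(iam_id, kp_instance_id):
    """Sketch of an IAM access policy granting ReaderPlus on one
    Key Protect (service name "kms") instance to one subject."""
    return {
        "type": "access",
        "subjects": [{"attributes": [
            {"name": "iam_id", "value": iam_id},
        ]}],
        "roles": [
            # Assumed CRN of the Key Protect ReaderPlus service role
            {"role_id": "crn:v1:bluemix:public:kms::::serviceRole:ReaderPlus"},
        ],
        "resources": [{"attributes": [
            {"name": "serviceName", "value": "kms"},
            {"name": "serviceInstance", "value": kp_instance_id},
        ]}],
    }

policy = readerplus_policy("iam-ServiceId-1234", "my-key-protect-instance-id")
print(json.dumps(policy, indent=2))
```

In practice, the IBM Cloud console or the CLI performs this policy creation for you when you assign the role to the access group.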

Retrieve credentials from the vault

Once access to Key Protect is configured, the designated users or apps can access the stored credentials. I implemented a Cloud Function action in Python that retrieves the credentials and passes them on as a JSON object. Subsequent actions in a sequence can then use the credentials to access an external service (e.g., an Amazon AWS S3 storage bucket or an email service).

All it takes is an IAM access token and information about the Key Protect instance and the key in question. This information can be passed in or obtained from the execution environment:

import base64
import json

import requests

def getKeyFromKeyProtect(access_token, kpInstId, kpKeyId):
    # Replace REGION with the instance's region, e.g., us-south
    url = "https://REGION.kms.cloud.ibm.com/api/v2/keys/%s" % kpKeyId
    headers = { "accept": "application/vnd.ibm.kms.key+json",
                "bluemix-instance": kpInstId,
                "Authorization": "Bearer " + access_token }
    response = requests.get(url, headers=headers).json()
    # The stored credentials are the base64-encoded key payload
    kpPayload = response["resources"][0]["payload"]
    return json.loads(base64.b64decode(kpPayload).decode())
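The access token itself can be obtained from the IAM token service by exchanging an IBM Cloud API key. The following helper only prepares the request, using the documented IAM token endpoint and the apikey grant type; sending it, e.g., with requests.post(url, headers=headers, data=body), is left to the caller:

```python
from urllib.parse import urlencode

IAM_TOKEN_URL = "https://iam.cloud.ibm.com/identity/token"

def build_token_request(api_key):
    """Prepare the form-encoded body and headers for exchanging an
    IBM Cloud API key for an IAM access token. The caller POSTs
    body to IAM_TOKEN_URL and reads access_token from the JSON
    response."""
    body = urlencode({
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": api_key,
    })
    headers = {
        "Content-Type": "application/x-www-form-urlencoded",
        "Accept": "application/json",
    }
    return IAM_TOKEN_URL, headers, body
```

Keeping the token exchange separate from the key retrieval makes it easy to reuse the same token for several Key Protect calls before it expires.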

You can find the source for the whole Cloud Function action in this Gist on GitHub. Of course, you can use similar code in a (containerized) app on Cloud Foundry, Kubernetes, or OpenShift.

Conclusions

It is fairly easy to use Key Protect as a vault for credentials. A common use case is the secure management of external credentials (e.g., in a multicloud scenario). By making use of the IBM Cloud IAM capabilities, it is possible to grant users or service IDs (read) privileges on a need-to-know basis. All access is logged for compliance.

If you have feedback, suggestions, or questions about this post, please reach out to me on Twitter (@data_henrik) or LinkedIn.
