January 14, 2020 By Henrik Loeser 4 min read

Utilizing IAM capabilities for enhanced security.

In the past, I have blogged about how to access a Db2 database on IBM Cloud from your local machine: I cataloged the cloud host and its database, then connected to it using the command line processor (CLP) by passing a username and password.

Last year, I was delighted to read that Db2 on Cloud and Db2 Warehouse on Cloud now support authentication by IBMid, API key, or access token, a capability called IAM (Identity and Access Management) support. I recently had some time to actually use it, and these are the steps I took to connect from my local machine to IBM Db2 on Cloud using either an API key or a token.

Db2 client 

In order to connect to a Db2 database, I needed a Db2 client. Because I work more and more with cloud environments and containers, I typically do not have a Db2 server or client installed. Thus, for testing, I installed the IBM Data Server Runtime Client and, separately, also used the official IBM Db2 Warehouse client container. As it turned out, that was a good choice.
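For illustration, using the client container looks roughly like the sketch below; the image name and run options are assumptions on my part and may differ from the official instructions for the Db2 Warehouse client container.

# Rough sketch only: image name and options are assumptions, check the official docs
docker pull ibmcom/db2wh_ce
docker run -it --name db2client ibmcom/db2wh_ce /bin/bash
# inside the container, the Db2 CLP is then available as "db2"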

Once available, I checked the CLP syntax for CONNECT, and both environments returned the following:

$ db2 ? connect
CONNECT [USER username [{USING password
[NEW new-password CONFIRM confirm-password] |
CHANGE PASSWORD}]]


CONNECT RESET


CONNECT TO database-alias
[IN {SHARE MODE | EXCLUSIVE MODE [ON SINGLE DBPARTITIONNUM]}]
[[USER username [{USING password
[NEW new-password CONFIRM confirm-password] |
CHANGE PASSWORD}]] |
[ACCESSTOKEN accesstoken] |
[APIKEY apikey]]
NOTE: From the operating system prompt, prefix commands with 'db2'.
      Special characters MAY require an escape sequence (\), for example:
      db2 \? change database
      db2 ? change database xxx comment with \"text\"

So the CLP supports APIKEY and ACCESSTOKEN, even though this is not reflected in the documentation for all Db2 product editions.

Cataloging the database 

To make the remote database known to CLP, I first needed to catalog the remote host (TCPIP NODE), then the actual database:

db2 catalog tcpip node Db2oCfra remote db2host-fra02-xxx.services.eu-de.bluemix.net server 50001 security ssl

The above adds information about a node named Db2oCfra to the local node directory. The important part is to use port (SERVER) 50001 and to explicitly specify that it uses SSL. For security reasons, IAM support is restricted to SSL connections only. If port 50000 is used or if you forget to specify “security ssl”, you are going to run into all kinds of error messages. Trust me on that…
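If the node entry was created with the wrong port or without SSL, it can simply be removed and cataloged again:

db2 uncatalog node db2ocfra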

The following command adds the remote database “bludb” with the alias “fradb” to the local database directory. Note that the authentication type is set to GSSPLUGIN to allow for login via IBMid:

db2 catalog db bludb as fradb at node db2ocfra authentication gssplugin
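To verify that both entries were created as intended, the local directories can be listed:

db2 list node directory
db2 list database directory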

During my attempts to make IAM-based authentication work, I ran into a couple of problems. The command ipclean helped me to reset/refresh the communication infrastructure and make sure I didn’t run into caching issues.
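In my case, that meant ending the CLP back-end process first and then running the utility. A rough sketch (exact usage may vary by Db2 version):

db2 terminate
ipclean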

Connect to Db2 using an API key

With the above setup in place, I added my IBM Cloud user ID to my Db2 instance. This can be done in the browser console (which I tested) or through the REST API. Next, I generated a new API key using the IBM Cloud CLI.
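The key can be created with a command like the following; the key name, description, and output file are just examples:

ibmcloud iam api-key-create db2-test-key -d "API key for Db2 on Cloud tests" --file db2-test-key.json

With the new API key in hand, I ran the tests to connect to the database using it: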

db2 connect to fradb APIKEY myIBMCloudplatformApiKey

It succeeded in the client container, but failed in my local Db2 client environment. What was different? I found that the SSL support was not set up. I needed to install GSKit (through my package manager or, alternatively, by downloading it), then configure SSL support for use with the CLP:

  1. Create a key database (and stash file) to hold the certificates:
    gsk8capicmd_64 -keydb -create -db "db2clientssl.kdb" -pw "myClientPassw0rdpw0" -stash
  2. Add the root certificate to the key database. The certificate is part of the Db2 client install:
    gsk8capicmd_64 -cert -add -db "db2clientssl.kdb" -pw "myClientPassw0rdpw0"  -file sqllib/cfg/DigiCertGlobalRootCA.arm
  3. Configure the instance to use the key database and stash file for SSL-based client connections:
    db2 update dbm cfg using SSL_CLNT_KEYDB /home/hloeser/db2clientssl.kdb SSL_CLNT_STASH /home/hloeser/db2clientssl.sth
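To double-check that the client instance picked up the new settings, the relevant configuration parameters can be displayed:

db2 get dbm cfg | grep -i ssl_clnt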

With the above configured, I was able to successfully connect from my Db2 client environment.

Connect to Db2 using a bearer access token

An access token is a bearer token (actually a JSON Web Token, JWT) that can be generated from an API key:

curl -k -X POST --header "Content-Type: application/x-www-form-urlencoded" --header "Accept: application/json" --data-urlencode "grant_type=urn:ibm:params:oauth:grant-type:apikey" --data-urlencode "apikey=myIBMCloudplatformApiKey" "https://iam.cloud.ibm.com/identity/token"

The returned token can then be used to authenticate against Db2:

db2 connect to fradb ACCESSTOKEN eyHereIsAVeryLongStringblablabla
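Because the token is a very long string, it is easier to script the two steps. Here is a rough sketch that fetches a fresh token and connects in one go; it assumes the jq tool is installed to pull the access_token field out of the JSON response:

TOKEN=$(curl -s -X POST --header "Content-Type: application/x-www-form-urlencoded" --header "Accept: application/json" --data-urlencode "grant_type=urn:ibm:params:oauth:grant-type:apikey" --data-urlencode "apikey=myIBMCloudplatformApiKey" "https://iam.cloud.ibm.com/identity/token" | jq -r .access_token)
db2 connect to fradb accesstoken "$TOKEN"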

The token is valid for 60 minutes after it has been issued, so keep that in mind when testing. An established connection is not impacted by an expiring token, but trying to connect with an expired token results in the following error message:

SQL30082N  Security processing failed with reason "28" ("EXPIRED SERVER CREDENTIAL").  SQLSTATE=08001

Conclusion 

Getting to a proper (security) configuration takes baby steps in the right order. With good instructions and some experience, it is accomplished quickly and easily. Once done, it is possible to connect to Db2 using either an IBM Cloud platform API key or a generated access token, replacing the traditional username and password. This could be used to share an automatically expiring token (for a test instance) or to pass an API key that gets revoked at a later point. The IAM support not only adds more authentication options, but also new items to the security toolbox. See “Secure Credential Brokering in Analytic Jobs on Cloud Data Sources” on how they could be used with the SQL Query service.

If you have feedback, suggestions, or questions about this post, please reach out to me on Twitter (@data_henrik) or LinkedIn.

But wait, there is more… 

A collection of error messages

For the sake of documentation, and so that others searching for these errors find help in this blog post, here are some error messages I ran into while setting up my environments:

  • SQL30081N: A communication error has been detected. Communication protocol being used: “SSL”.  Communication API being used: “SOCKETS”.  Location where the error was detected: “”.  Communication function detecting the error: “sqlccSSLSocketSetup”.  Protocol specific error code(s): “414”, “*”, “*”.  SQLSTATE=08001
  • SQL10013N: The specified library “GSKit Error: 100” could not be loaded.  SQLSTATE=42724
  • SQL30082N: Security processing failed with reason “25” (“CONNECTION DISALLOWED”).  SQLSTATE=08001
  • SQL30082N: Security processing failed with reason “24” (“USERNAME AND/OR PASSWORD INVALID”).  SQLSTATE=08001
  • SQL30082N: Security processing failed with reason “27” (“INVALID SERVER CREDENTIAL”).  SQLSTATE=08001
  • SQL30081N: A communication error has been detected. Communication protocol being used: “TCP/IP”.  Communication API being used: “SOCKETS”.  Location where the error was detected: “xxx.yyy.yyy.xxx”.  Communication function detecting the error: “recv”.  Protocol specific error code(s): “104”, “*”, “0”.  SQLSTATE=08001