February 8, 2019 By Phil Alger 4 min read

Migrating your data to IBM Cloud Databases for Redis

If you’re moving your data over to IBM Cloud Databases for Redis, you’ll need to take some steps to successfully migrate all of your data. We’ve got you covered. In this post, we’ll show you a quick way to start migrating your data across to Databases for Redis, whether your database is on premises or in the cloud.

If you’re a Redis user, you already know why Redis is a great database for quickly storing and retrieving in-memory data. If you’re thinking about moving into the cloud or transitioning from Compose for Redis (or another cloud provider) to Databases for Redis, we’d like to guide you through the migration process. The good thing is that there isn’t much for you to do if you’re migrating from your local or Compose for Redis database to Databases for Redis.

Let’s talk about how to migrate.

The migration script

Migrating to Databases for Redis involves running a simple Python script that we’ve made available on GitHub. The script will copy all the keys from your source database over to your Databases for Redis deployment. You’ll want to download the script and make sure you have Python 3.x installed. If you’re on macOS, you can use Homebrew to install it by running brew install python3, which will give you the latest version.

Note that you will also need the Python dependencies listed in the comments of the script.
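To give you a sense of what the script does under the hood, the core of the copy is a scan of the source database that replays each key’s serialized value, along with its expiry, onto the destination. Here’s a minimal sketch of that loop using the redis-py client; it illustrates the technique rather than reproducing the script itself, and the connection values are placeholders:

import redis  # pip install redis

# Placeholder connections; the real script takes these values from the command line.
src = redis.Redis(host="<source host>", port=6379,
                  password="<source password>", ssl=True)
dst = redis.Redis(host="<destination host>", port=6379,
                  password="<destination password>", ssl=True,
                  ssl_ca_certs="<destination ca certificate path>")

for key in src.scan_iter(count=1000):  # iterate keys without blocking the server
    ttl = src.pttl(key)                # remaining TTL in milliseconds; -1 means no expiry
    data = src.dump(key)               # serialized value of the key
    if data is None:
        continue                       # key disappeared on source during the scan
    try:
        # RESTORE with a TTL of 0 stores the key without an expiry
        dst.restore(key, ttl if ttl > 0 else 0, data)
    except redis.exceptions.ResponseError:
        pass                           # key already exists on the destination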

Next, we recommend creating a migration window to let your users know that you’ll be doing some maintenance. That way you’ll have some time to migrate all your data over to your new Databases for Redis deployment. If you’re using Redis as a key-value store with expire times on keys, rest assured that these expiry times will be copied over to your new database. We tested this with 10 million keys in our database, which didn’t take much time to migrate; how long it takes for you will depend mostly on your bandwidth.

Getting the destination and source database credentials

Now, you’ll need the credentials of both the source database and your Databases for Redis deployment. You can get the credentials for the Databases for Redis deployment by clicking on your database from the IBM Cloud resources panel. Then click the Service credentials link in the left-hand menu, which will take you to the Service credentials view. From there, you can create a new credential by clicking New credential, or use any credentials that you’ve already created.

Another way to get this information is to use the IBM Cloud CLI. With the cdb plugin, you’d run:

ibmcloud cdb deployment-connections <Redis deployment name>

This will provide you with your Databases for Redis connection URI that includes the hostname and port. To get the decoded CA certificate for the database, you’d run:

ibmcloud cdb deployment-cacert <Redis deployment name>
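The command prints the decoded certificate, so on a Unix-like shell you can save it straight to a file; the file name here is just an illustration:

ibmcloud cdb deployment-cacert <Redis deployment name> > ~/dbredisCA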

You’ll need that certificate file to connect to the database later. If you don’t know the password for your Redis deployment, you’ll need to either get it from your generated service credentials or create a new password by running:

ibmcloud cdb deployment-user-password <Redis deployment name> admin <new password>

If your destination database is running Redis 6 or greater, you will need both a username and a password.
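If you want to sanity-check the destination credentials before migrating, here’s a minimal sketch using the redis-py client; the host, port, username, password, and certificate path below are placeholders taken from the examples in this post:

import os
import redis  # pip install redis

dst = redis.Redis(
    host="000.000.databases.appdomain.cloud",  # from deployment-connections
    port=99999,
    username="admin",             # Redis 6+ uses ACL usernames; omit on older versions
    password="dbredispassword1",
    ssl=True,
    ssl_ca_certs=os.path.expanduser("~/dbredisCA"),  # the decoded CA certificate file
)
print(dst.ping())  # True if the connection and credentials check out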

With this information, we have what we need for the destination. For the source, let’s say that we want to migrate our data from Compose for Redis to our new Databases for Redis deployment. To do that, you’d have to get the host, port, and password for your Compose for Redis database. You can get those by following the same steps that we covered above by clicking on your Compose for Redis database from the IBM Cloud resource panel and then creating or using your service credentials.

Running the script and migrating data

Now that you have all the credentials for both databases (Compose and Databases for Redis), we’ll show you how to run the script. We’ve named the Python script file pymigration.py. All you need to do now is run it from your terminal using the credentials you gathered above:

python pymigration.py <source host> <source password> <source port> 
<destination host> <destination password> <destination port> 
<destination ca certificate path> --sslsrc --ssldst

If your destination database is running Redis 6 or greater, format the destination password argument as username:password (for example, admin:dbredispassword1). On versions earlier than Redis 6, you can simply provide the password.

Since we’re copying data from a Compose for Redis database, you’ll need to add the --sslsrc flag if your Compose for Redis database is SSL/TLS enabled. If it isn’t, don’t add the flag. This flag tells the script that it’s connecting to an SSL/TLS-enabled source database. You also need to add --ssldst, since the destination database is Databases for Redis, which is also SSL/TLS enabled. Supplementary flags you could add are --db and --flush. Using --db, you can indicate the database your keys are copied from, which will also be the database they’re copied into in your Databases for Redis deployment. The --flush flag flushes the destination database before importing the keys from the source database. If you want to keep things fresh in your Databases for Redis deployment, --flush will delete all the keys first and then import the new keys from your source database.
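For example, to copy keys from database 0 and flush the destination before the import, the invocation might look like this (the --db value here is just an illustration):

python pymigration.py <source host> <source password> <source port> <destination host> <destination password> <destination port> <destination ca certificate path> --sslsrc --ssldst --db 0 --flush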

Running the above script using Compose for Redis as the source for the data migration and Databases for Redis as the destination for the migrated data, we’d get something like:

python pymigration.py portal0.0000.composedb.com composepassword1 88888 000.000.databases.appdomain.cloud dbredispassword1 99999 ~/dbredisCA --sslsrc --ssldst

10000000 keys: 100% |###################################################| Time: 0:00:00
Keys disappeared on source during scan: 0
Keys already existing on destination: 0

As you can see from the results, we copied 10 million keys from Compose for Redis to Databases for Redis. No keys were deleted on the Compose for Redis database. If we add a new key to the Compose for Redis deployment and attempt to migrate the data again, we’ll see that the Keys already existing on destination count changes to 10000000, since the original 10 million keys already exist on that database.

10000000 keys: 100% |###################################################| Time: 0:00:00
Keys disappeared on source during scan: 0
Keys already existing on destination: 10000000

Conclusion

Migrating your data couldn’t be simpler. After your migration, all you need to do is swap out your application’s database connection strings with your Databases for Redis connection string and credentials. That’s all it takes!
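For instance, with the redis-py client, that swap can be as small as changing the URI your application connects with. The URI below is a placeholder in the rediss:// scheme; substitute the credentials from your own deployment:

import os
import redis

r = redis.from_url(
    "rediss://admin:<password>@000.000.databases.appdomain.cloud:99999/0",
    ssl_ca_certs=os.path.expanduser("~/dbredisCA"),  # CA certificate saved earlier
)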
