July 31, 2020 | By Vidyasagar Machupalli | 5 min read

This post shows how to automatically assign a floating IP to a newly created VSI by monitoring Activity Tracker events and using Cloud Functions to interact with the VPC API.

Reserving a floating IP for one or two VSIs sounds easy. But what about tens of VSIs provisioned in your Virtual Private Cloud (VPC)? Ever thought of auto-assigning a floating IP on the fly as soon as a new VSI is provisioned in your VPC?

In this post, you will use the IBM Cloud Activity Tracker with LogDNA service to track how users and applications interact with IBM Cloud Virtual Private Cloud (VPC). You will then create a view and an alert in Activity Tracker with LogDNA that filters VSI-creation logs. The matching logs are passed to an IBM Cloud Functions Python action as JSON, and the action reserves and binds a floating IP to the newly provisioned VSI (instance) using the instance ID in the payload.

IBM Cloud Activity Tracker with LogDNA records user-initiated activities that change the state of a service in IBM Cloud. You can use this service to investigate abnormal activity and critical actions and to comply with regulatory audit requirements. In addition, you can be alerted about events as they happen. In simple words, the service logs lines associated with changes (events) to the cloud and alerts on matching log lines.

With IBM Cloud Functions, you can use your favorite programming language to write lightweight code that runs app logic in a scalable way. You can run code on-demand with HTTP-based API requests from applications or run code in response to IBM Cloud services and third-party events. The Functions-as-a-Service (FaaS) programming platform is based on the open source project, Apache OpenWhisk. A web action is accessible through a REST interface without the need for credentials.
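For orientation, here is the shape of a Python action as Cloud Functions (Apache OpenWhisk) expects it: an entry point named main that receives the invocation parameters as a dictionary and returns a dictionary; for a web action, the returned statusCode and body become the HTTP response. This is a minimal illustration only, not code from this tutorial's repository.

# Minimal Cloud Functions Python action, for illustration only.
def main(params):
    # For a web action, params carries the parsed JSON body of the request.
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "body": {"message": "Hello, " + name},
    }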

Get started

If you don't have the IBM Cloud CLI installed on your machine, follow the instructions in the IBM Cloud CLI documentation. A GitHub repository with companion scripts and a detailed README.md is provided so you can set up and create the required resources and walk through the use case smoothly.

Clone or download the code from the repository:

     git clone https://github.com/IBM-Cloud/vpc-instance-extension
     cd vpc-instance-extension

Before running the scripts, load the environment variables with your configuration information into the terminal or command prompt. A template, .env.template, with predefined environment variables is provided for this purpose. Provide the IAM_API_KEY of the user who will be executing the scripts, as the resources will be created under that user's account.

The shell scripts (.sh) are at the heart of the repository and are numbered in the order of execution. You will start by installing required plugins and tools on your machine.

00-prereqs.sh: Installs the IBM Cloud CLI plugins: infrastructure-service/VPC infrastructure (is), Cloud Functions (fn), and Schematics. Additionally, it checks for Docker and jq.

Provision the IBM Cloud service

You will now provision the IBM Cloud service required for this use case.

01-services.sh: The script checks whether an IBM Cloud Activity Tracker with LogDNA service instance with the paid 7-day event search plan exists in your account. If there is none, it asks for your permission to provision one.

You may have to re-run the script to create an access group, add the required policies, and add users to the access group. Every user that accesses the IBM Cloud Activity Tracker with LogDNA service in your account must be assigned an access policy with an IAM user role defined. Add the email IDs of the users associated with your IBM Cloud account to the .env file and source the file.

Create a Python action

In this section, you will create the following:

  • A namespace: Namespaces contain Cloud Functions entities, such as actions and triggers, and belong to a resource group. You can let users access your entities by granting them access to the namespace.
  • A Python action to reserve and bind a floating IP to a newly provisioned VSI.

02-functions.sh: The script uses the contents of the functions folder. 

functions/
┣ __main__.py
┣ helper.py
┣ init.sh
┗ requirements.txt

The Python (.py) files use the vpc-python-sdk to check for existing unbound floating IPs, create a new floating IP if there are none, and bind an existing or new floating IP to the instance.
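The sketch below shows, in simplified form, what that logic can look like with the vpc-python-sdk (the ibm_vpc package). It is an illustration under assumptions, not the repository's exact code: the helper name reserve_floating_ip, the naming of the new floating IP, and the check that an unbound floating IP carries no target field are choices made here for clarity.

# Simplified sketch of the floating IP logic; not the repository's exact code.
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_vpc import VpcV1

def reserve_floating_ip(service, instance_id):
    """Bind an unbound floating IP (creating one if needed) to the instance."""
    instance = service.get_instance(instance_id).get_result()

    # Look for an existing floating IP that is not attached to any target.
    unbound = [
        fip for fip in service.list_floating_ips().get_result()["floating_ips"]
        if "target" not in fip
    ]

    if unbound:
        fip = unbound[0]
    else:
        # No free floating IP: reserve a new one in the instance's zone.
        fip = service.create_floating_ip(
            floating_ip_prototype={
                "name": "fip-" + instance["name"],
                "zone": {"name": instance["zone"]["name"]},
            }
        ).get_result()

    # Attach the floating IP to the instance's primary network interface.
    service.add_instance_network_interface_floating_ip(
        instance_id=instance_id,
        network_interface_id=instance["primary_network_interface"]["id"],
        id=fip["id"],
    )
    return fip["address"]

# Typical client setup (the region endpoint is an example):
# service = VpcV1(authenticator=IAMAuthenticator("<IAM_API_KEY>"))
# service.set_service_url("https://us-south.iaas.cloud.ibm.com/v1")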

The script internally calls another script, init.sh, that pulls the ibmfunctions/action-python-v3.7 container image, installs the dependencies mentioned in requirements.txt, and creates a virtual environment (virtualenv). All of this is done using the pulled container image, without the hassle of installing anything on your machine. Once done, the code in the Python files and the created virtualenv are zipped, and a Python 3.7 action is created using the functions.zip file. This is one of the many ways to package your Python code.

The script also creates a secured web action. When you create a web action, the result is a URL that can be used to trigger the action from any web app. In this case, you will use the URL in the Activity Tracker with LogDNA service.
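As an aside, any HTTP client can call such a URL. The snippet below is purely illustrative; the URL format and payload are placeholders, and a secured web action additionally requires the X-Require-Whisk-Auth header to match the secret configured on the action.

# Illustrative only: triggering a secured web action over HTTPS.
import requests

WEB_ACTION_URL = "https://us-south.functions.cloud.ibm.com/api/v1/web/<namespace>/default/<action>.json"  # placeholder

resp = requests.post(
    WEB_ACTION_URL,
    json={"lines": []},                             # placeholder payload
    headers={"X-Require-Whisk-Auth": "<secret>"},   # secret set on the secured web action
    timeout=30,
)
print(resp.status_code, resp.text)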

Configure Activity Tracker with LogDNA

You will create a LogDNA view and an alert from the view. Views are saved shortcuts to a specific set of filters and search queries. You can see the list of views in the Views pane on the left of the LogDNA UI.

Check the README.md in the repository for detailed steps on how to configure Activity Tracker with LogDNA.

Once configured, the Activity Tracker with LogDNA service sends an alert when a new VSI is provisioned. The alert includes a JSON payload with two important elements:

  1. The matches element includes the number of log lines matching the query in the search filter.
  2. The lines element includes details of one or more instances, such as the instance ID, that are passed to the Python action. Using this information, the action reserves and binds a floating IP to the instance (see the sketch after this list).
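To make the flow concrete, here is a hedged sketch of how an action entry point could pull instance IDs out of such an alert payload and hand them to the floating IP helper sketched earlier. The key names inside each line object are assumptions for illustration; the real parsing lives in __main__.py and helper.py in the repository.

# Illustrative sketch of handling the LogDNA alert payload in the action.
def main(params):
    handled = []
    for line in params.get("lines", []):
        # The instance ID of the new VSI is assumed here to sit under a
        # target/id field of the Activity Tracker event; adjust to the
        # actual event structure.
        instance_id = line.get("target", {}).get("id")
        if instance_id:
            # reserve_floating_ip() is the hypothetical helper sketched above.
            handled.append(instance_id)
    return {
        "statusCode": 200,
        "body": {"matches": params.get("matches"), "handled": handled},
    }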

Test the flow by provisioning VPC resources

You will test the complete flow by provisioning VSIs in a VPC.

03-vpc.sh:  The script uses IBM Cloud Schematics to provision the VPC resources from Terraform files in the vpc-tutorials GitHub repo. With Schematics, you can enable Infrastructure as Code (IaC) by codifying your IBM Cloud resources with Terraform and using Schematics workspaces to start automating the provisioning and management of your resources.

After successful execution, the script provisions a VPC, a subnet, and two VSIs, as specified in your .env file.

Run the below command to see the VSIs without floating IPs assigned:

ibmcloud is instances

You can also confirm the instance creation by navigating to the LogDNA view instance-extension.

To check whether the action was invoked successfully and to see the action logs, run the below command:

ibmcloud fn activation logs $(ibmcloud fn activation list | awk 'FNR == 2 {print $3}')

Re-run the ibmcloud is instances command to see the VSIs with floating IPs. To provision more VSIs, update the TF_VAR_instance_count variable in the .env file, source the .env, and re-run the 03-vpc.sh script.

Clean up

04-cleanup.sh: The script deletes everything you created for this sample except the Activity Tracker with LogDNA service, because only one instance of that service is allowed per region. If you wish to delete it, you can do so from the IBM Cloud resource list.
