August 9, 2024 By Ben Ball 3 min read

The Domain Name System (DNS) is rightly celebrated as the protocol that makes the internet possible. DNS is often called the phonebook of the internet because it translates domain names into the IP addresses browsers need to load internet resources. It usually works well enough that most network teams barely pause to think about it.
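To make the phonebook analogy concrete, a lookup is reproducible in a few lines. Here is a minimal sketch using Python's standard library, with timing included since latency is the theme of this post. Note that this goes through the OS resolver (caches included), so it measures end-to-end lookup latency rather than any one DNS provider's performance; the helper name is our own.

```python
import socket
import time

def resolve(hostname: str) -> tuple[str, float]:
    """Translate a hostname to an IPv4 address and time the lookup.

    Uses the OS resolver, so the answer may come from a local cache;
    this measures end-to-end latency, not authoritative performance.
    """
    start = time.perf_counter()
    ip = socket.gethostbyname(hostname)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return ip, elapsed_ms

if __name__ == "__main__":
    ip, ms = resolve("localhost")
    print(f"localhost -> {ip} ({ms:.2f} ms)")
```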

Yet DNS also has the dubious distinction of being the protocol that breaks the internet or slows it down. “It’s always DNS” isn’t just a meme or a joking explanation for downtime that appears on Reddit threads. It’s an expression of reality: the blame for poor application performance is often traced back to issues with underlying DNS systems.

The challenge of DNS performance measurement

Most network teams understand that the way they deploy authoritative DNS matters, but only in a conceptual way. They assume that managed DNS services offer faster response times and improve application performance, but there's little hard data to back that up. Surprisingly, most network teams make decisions about a critical system that powers their entire business using guesswork and gut-level intuition.

Some network teams use sites like DNSPerf to benchmark the performance of managed authoritative DNS solutions, but few delve into what the numbers on that site mean or how they're calculated. Nor can they use that data to compare managed DNS solutions with self-hosted DNS.
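What a site like DNSPerf publishes is, at root, repeated query timings reduced to summary statistics. A rough sketch of that kind of benchmark follows; it assumes nothing about DNSPerf's actual methodology, and the resolver is injected as a callable so the same harness works against any lookup function.

```python
import statistics
import time
from typing import Callable

def benchmark(resolver: Callable[[str], object],
              hostname: str, runs: int = 20) -> dict[str, float]:
    """Time repeated lookups and summarize latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        resolver(hostname)
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return {
        "min": samples[0],
        "median": statistics.median(samples),
        "p95": samples[min(len(samples) - 1, int(len(samples) * 0.95))],
        "max": samples[-1],
    }

# Example: benchmark the OS resolver (cached results included).
# import socket
# print(benchmark(socket.gethostbyname, "example.com"))
```

Tail percentiles (p95 and above) matter more than averages here: a DNS lookup sits on the critical path of every page load, so the occasional slow answer is what users actually feel.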

This lack of reliable, comparative data on managed DNS solutions has been a known gap for a long time. Here at IBM®, we’re constantly asked about how our IBM NS1 Connect® solution for managed DNS is different from all the other offerings in the authoritative DNS market. Network teams want to know if it is really going to make a difference in the performance of their applications or websites.

A joint IBM-Catchpoint study

Given all the questions we get from customers and prospects on this issue, we decided it was time to fill this gap ourselves. We reached out to Catchpoint, our long-time collaborator and an expert in network performance measurement, to see what we could learn.

That outreach led both IBM and Catchpoint down a fascinating road. We found that too much of the data on authoritative DNS performance floating around the internet lacked statistical rigor. It wasn’t representative of different geographies or market verticals. The data had too many technical asterisks by it. It didn’t provide an apples-to-apples comparison because there are just so many ways to configure DNS.

After running through various research options and operational pathways, we decided to go big and design our own study of DNS performance from scratch. IBM built a list of over 2,500 of the largest companies in the world, representing DNS usage across geographies, market verticals and authoritative DNS deployment types. Catchpoint created a performance test that stripped out the various DNS accelerators and intermediate layers, yielding a common baseline measurement applicable to any enterprise.
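This post doesn't publish Catchpoint's test internals, but "stripping out intermediate layers" generally means querying a zone's authoritative nameservers directly with recursion disabled, so no recursive resolver or cache sits in the path. A hypothetical sketch of building such a query by hand, following the DNS wire format defined in RFC 1035 (the hostname is illustrative):

```python
import random
import struct

def build_query(hostname: str, qtype: int = 1) -> bytes:
    """Build a raw DNS query packet (qtype 1 = A record).

    Flags are 0x0000: the recursion-desired bit is NOT set, so an
    authoritative server answers from its own zone data only.
    """
    header = struct.pack(
        ">HHHHHH",
        random.randint(0, 0xFFFF),  # transaction ID
        0x0000,                     # flags: standard query, RD=0
        1, 0, 0, 0,                 # 1 question, no other records
    )
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # class IN
    return header + question

# Sending it would look like:
#   sock.sendto(build_query("example.com"), (authoritative_ip, 53))
# over a UDP socket, timing the round trip to the response.
```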

We ran tests during the busiest season of the year for most businesses, November and December, to capture the period when most enterprises need DNS performance the most. The data was captured over the course of several weeks, so we could see the ebb and flow of internet traffic over time rather than relying on less representative spot tests.

What we found

When the results came back, they were very clear. Like most network teams, we had assumptions about the relative performance of managed DNS solutions versus self-hosted systems. Those assumptions were not only confirmed; the gap turned out to be much larger than we realized. Spoiler alert: managed DNS solutions are faster. A lot faster.

We were also naturally curious about the performance of the IBM NS1 Connect solution against other authoritative DNS offerings. We got some fascinating data in this area as well, some of which surprised us.

Beyond the comparative performance of DNS solutions, we also learned a lot about differences in internet performance between different geographies and market verticals. It turns out that where you’re accessing the internet can have just as much to do with performance as which company’s applications or websites you’re accessing.

If you’re currently hosting your own authoritative DNS and are thinking about moving to a managed DNS solution, or if you’re trying to compare vendors on the market, you’re going to want to see this data. The study is also worth a look if you simply want to benchmark your own DNS performance against a reliable standard.

Read our white paper