July 18, 2019 By Rebecca Hardy 2 min read

By 2025, worldwide data is expected to grow to 175 zettabytes. Translation: 175 trillion gigabytes. That’s a staggering number by any standard — and likely unfathomable in 1967, when a team of IBM engineers led by David L. Noble began work on a project that would change computing forever.

For context, 1967 was the same year the Beatles released Sgt. Pepper’s Lonely Hearts Club Band. The most popular piece of consumer tech was a portable radio that took 14 hours to recharge. Clouds were for rain, not data. And people still used paper punch cards to feed information to computers. Noble and his team set out to create something that could do the work of thousands of punch cards.

Origin story

The first floppy disks were 8-inch behemoths that were difficult to handle and store, and their exposed magnetic surfaces picked up dust easily. Noble and his team addressed this by developing a slim, durable envelope lined with a fabric that wiped dust off the disk as it spun. And thus, the floppy disk as we know it was born. Over the years, the floppy disk shrank from the original 8 inches, to 5 ¼ inches, to 3 ½ inches.

The golden age of floppy disks

In 1977, when the Apple II was released with two 5 ¼-inch floppy drives, it was a massive consumer breakthrough. Companies could write software and operating systems and easily distribute them by mail or in stores. The personal computer was increasingly accessible and user friendly. Floppy disks made information portable — documents could be saved on a floppy disk and opened on another computer, and people could share those disks with each other. Floppy disks continued to enjoy the warm glow of the spotlight until the new millennium.

“The floppy disk provided the first genuinely easy way to transfer files,” says IBM Systems’ Worldwide Storage and SDI Flash Technical Enablement Manager Roger Kasten. “Back in the ’80s and ’90s, we often depended on ‘sneaker-net’ to transfer data between computers. All you had to do was put data on a floppy, walk over to a colleague or friend, hand the floppy over and allow that person to copy the data to their system.”

Then, in 2011, all floppy disk manufacturing ceased.

“In the early 2000s, the USB thumb drive appeared,” says Kasten. “Over the next several years, it largely eliminated the floppy as a cheap and easy data transfer device.”

And then… the cloud

The traditional floppy disk only held about a megabyte of information. That’s minuscule by today’s standards. But remember, in 1967 people were still using paper punch cards to feed information to computers. The floppy disk did the work of thousands of these cards. More importantly, it sparked innovation that changed the way we look at storage today.
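The “thousands of punch cards” claim holds up to a quick back-of-the-envelope check. A standard punch card stored 80 characters, about 80 bytes; taking the article’s round figure of a megabyte for a floppy disk (the assumed numbers here, not figures from the piece itself), the arithmetic is a one-liner:

```python
# Rough check of how many 80-character punch cards one floppy replaces.
# Assumptions: 80 bytes per card, ~1 MB (1,000,000 bytes) per floppy.
CARD_BYTES = 80
FLOPPY_BYTES = 1_000_000

cards_per_floppy = FLOPPY_BYTES // CARD_BYTES
print(cards_per_floppy)  # 12500
```

So a single disk stood in for a stack of more than ten thousand cards.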

“Fast forward to 2019, and we see cloud storage services have largely replaced USB thumb drives as the data transfer method of choice,” says Kasten. “And the imminent arrival of 5G technology means we’re about to see cloud take an even greater role in data transfers.”

Technology is always moving forward, and today’s throwback might lead to tomorrow’s innovation.

Connect with IBM Services on Twitter and LinkedIn and tell us what #retrotech you’d like to see covered next.
