Unlimited vs. Storage-Based Cloud Storage

Which one is right for your enterprise?

When deciding on your preferred method of cloud storage, you will inevitably face a choice between storage-based and unlimited plans. While storage-based "pay as you go" pricing might seem like a way to save some dollars now, it's important to evaluate what it will end up costing in the long run.

Storage-Based Pricing: The Hidden Costs
Storage-based plans require you to pay for a limited amount of storage space. If you choose this type, you usually determine an appropriate amount of storage for your organization's current circumstances, knowing that you can buy an additional chunk of storage as your company expands. Organizations will select this route as a short-term cost-saving measure, claiming that it reduces capital expenses since you pay for only the storage amount you immediately need. However, this approach can end up actually costing more than an unlimited storage plan over time. Consider that in addition to purchasing the storage plan, the organization will need to engage in some due diligence, including:

  • A multi-year storage needs analysis. According to the International Data Corporation (IDC), we will have produced up to 40 zettabytes (that's 40,000,000,000,000,000,000,000 bytes!) of data by 2020. The same study found that most businesses experience between 40 and 60 percent growth in data volume annually. In short: data doesn't shrink, and you need a plan for its expansion, whether that means continuing to buy more storage as needed or attempting a data cleanup operation every time you get close to your limit. Also, don't forget to plan for backing up all that data, which includes not just the total amount of current data but also every version of that data that has ever existed. Estimating the amount of storage you need now and in the future involves a lengthy calculation: analyzing how much data you have today, how much new data arrives over time and the rate of change for your data (a simple projection of this kind is sketched after this list).
  • A risk mitigation and communications plan. Your data cap will almost certainly be approached at some point: how will administrators be notified when the limit is close to being reached? What is the plan once that notification arrives - buy more storage, or spend time purging, cleaning up and relocating data? How quickly can this plan be executed, not just with the current number of employees but with the number of employees the company is expected to have in the future? Failure to plan for when your storage limit is reached could result in data loss, or in productivity grinding to a halt while things get sorted out.
  • Additional budgeting, approval and purchasing decisions. Depending on the organization, the approval process might not be arduous, and more storage can be purchased quickly when needed. Or every purchase might need to go through a formal procurement process and several levels of management before it's approved, leaving end users unable to do their jobs as usual, or vulnerable to data loss, while management decides on a course of action.
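
To give a rough sense of what the storage-needs analysis involves, here is a minimal sketch in Python. It simply compounds an annual growth rate and multiplies by a retained-version factor to approximate the backup footprint; the function name and all of the numbers (2 TB today, 50 percent annual growth, 3 retained versions, a 5-year horizon) are illustrative assumptions, not figures from this article or the IDC study.

    # Rough multi-year storage projection (illustrative assumptions only).

    def project_storage_tb(current_tb, annual_growth, years, backup_versions=1):
        """Estimate storage needed each year.

        current_tb      -- data volume today, in terabytes (assumed figure)
        annual_growth   -- yearly growth rate, e.g. 0.5 for 50% (IDC cites 40-60%)
        years           -- how many years ahead to plan
        backup_versions -- average retained versions per object (assumption)
        """
        projection = []
        data = current_tb
        for year in range(1, years + 1):
            data *= 1 + annual_growth           # primary data keeps growing
            total = data * backup_versions      # retained versions multiply the footprint
            projection.append((year, round(data, 1), round(total, 1)))
        return projection

    if __name__ == "__main__":
        # 2 TB today, 50% annual growth, 3 retained versions, 5-year horizon.
        for year, primary, with_backups in project_storage_tb(2, 0.5, 5, 3):
            print(f"Year {year}: {primary} TB primary, {with_backups} TB incl. backups")

Even this toy projection shows the point of the article: at 50 percent annual growth a 2 TB footprint roughly quintuples within four years, so any fixed storage cap has to be revisited, re-budgeted and re-purchased on a recurring basis.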

Each of these items carries a monetary cost and an opportunity cost, and all of them rest on IT's estimates of growth. But what happens to all this planning if the organization acquires a company, brings a design team in-house or falls under a temporary spending hold? All the research and planning might need to be redone, which means more time and money spent. Is it all really worth it?

Unlimited Storage Pricing = Simplicity
With an unlimited storage plan for your data, you pay one consistent rate and store all of your information, no matter how much, in one convenient location. That's it.

Eventually, the money it takes to continuously monitor, manage and acquire more storage will outpace the cost of holding an unlimited storage account. Adding to the long-term benefits of an unlimited storage package is the ability to plan company finances more easily and predictably: organizations know the exact cost of adding each user and can budget appropriately, no matter how much data each user produces.

It is easy to see that unlimited data storage plans are simpler, more convenient and more cost-effective than storage-based pricing structures. Though it may sound too good to be true, unlimited storage is increasingly available at affordable prices and, over time, has the potential to save those few dollars everyone is looking for in their budgets.

More Stories By Mat Hamlin

Mat Hamlin is the Director of Product at Spanning where he oversees the development of the industry’s most trusted, highest-ranked enterprise-class backup and recovery products for cloud applications including Google Apps and Salesforce.
