Data Demands of DevOps | Part 2

The Data as a Service solution: new technologies deliver data at the speed of DevOps

In Part 1 of this article, we explored how data all too often becomes the critical bottleneck in software development, delaying projects and undermining the benefits of DevOps tools and processes. In Part 2, we'll look at the emerging category of Data as a Service solutions, which turn data from a drag to a driver.

Data as a Service
In order to deliver on the promise of DevOps and hit continuous release targets for even the largest, most complex and integrated applications, companies need solutions that provide the same flexibility for data as for code bases, and the same automation and repeatability for data as for configurations. They need Data as a Service. DaaS solutions offer a single, integrated platform that serves up faithful copies of source data as easily as code or configurations, with sophisticated features to enable collaboration, project agility, and strong governance.

Provision and Branch
The most fundamental capability of any DaaS solution is the ability to deliver multiple copies of data promptly and with sophisticated automation. In order to deliver true DevOps capabilities, data standup should take no more time and effort than containerized code delivery or automated configuration: a few keystrokes and a few minutes.

DaaS solutions often deliver this capability through sophisticated block sharing and virtual files. Instead of moving data from system to system, such solutions keep a single repository of record, and then create virtual data instances by pointing to the correct set of blocks within the repository. That allows data provisioning to occur rapidly and automatically, and decouples time and effort from the size of the data set.
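
To make the block-sharing idea concrete, here is a minimal sketch in Python, assuming a hypothetical engine rather than any particular vendor's implementation: a "virtual copy" is just a list of references to immutable blocks stored once in a shared repository, so provisioning scales with the number of pointers, not the size of the data.

```python
# Minimal sketch of block sharing (hypothetical engine, not a real product):
# a "virtual copy" is only a list of references to immutable blocks held once
# in a shared repository, so provisioning cost scales with the number of
# pointers, not with the size of the data set.

class BlockRepository:
    def __init__(self):
        self._blocks: dict[int, bytes] = {}   # the single copy of record
        self._next_id = 0

    def store(self, data: bytes) -> int:
        """Add one block and return its id."""
        block_id = self._next_id
        self._blocks[block_id] = data
        self._next_id += 1
        return block_id

    def ingest(self, data: bytes, block_size: int = 4096) -> list[int]:
        """Split source data into blocks; return the ordered block ids."""
        return [self.store(data[i:i + block_size])
                for i in range(0, len(data), block_size)]


class VirtualCopy:
    """A provisioned data instance: just pointers into the repository."""
    def __init__(self, repo: BlockRepository, block_ids: list[int]):
        self._repo = repo
        self._block_ids = list(block_ids)     # copies references, not data

    def clone(self) -> "VirtualCopy":
        """Provision another copy almost instantly, regardless of data size."""
        return VirtualCopy(self._repo, self._block_ids)

    def read(self) -> bytes:
        return b"".join(self._repo._blocks[i] for i in self._block_ids)


repo = BlockRepository()
prod_ids = repo.ingest(b"production data set " * 1000)  # repository of record
dev_copy = VirtualCopy(repo, prod_ids)       # "provisioned" in a blink
test_copy = dev_copy.clone()                 # another environment, no bulk copy
```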

Of course, the ability to provision a full initial copy of the data is not enough. For developers and testers to achieve the flexibility they need, the DaaS solution must be able to branch the data as easily as code. A DevOps-ready DaaS solution lets end users spin off additional copies of the data they are working on, with whatever adjustments they have made, or copies of production data as of multiple points in time. With this capability, developers and testers can keep code and data in sync, even as they pursue parallel projects, working on different versions or tests.
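
Continuing the same hypothetical sketch, branching works as copy-on-write: a write repoints only the affected block, so a developer's branch diverges from production without duplicating the data set. The class and method names here are illustrative only.

```python
# Continuing the sketch: copy-on-write lets a branch diverge from its parent
# by repointing individual blocks, never by duplicating the whole data set.

class BranchableCopy(VirtualCopy):
    def write_block(self, index: int, data: bytes) -> None:
        """Change one logical block: store a new block, repoint only that slot."""
        self._block_ids[index] = self._repo.store(data)   # parent stays untouched

    def branch(self) -> "BranchableCopy":
        """Spin off a new copy that starts from this copy's current state."""
        return BranchableCopy(self._repo, self._block_ids)


feature_branch = BranchableCopy(repo, prod_ids)           # branch off "production"
feature_branch.write_block(0, b"masked customer record")  # diverges by one block
hotfix_branch = feature_branch.branch()                   # branch of a branch, still cheap
```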

Bookmark and Share
DevOps isn't just about self-sufficiency; it's also about sophisticated collaboration. Without a DaaS solution, data can often become the bottleneck to efficient collaboration.

For example, suppose a QA staffer is reviewing a new piece of code for bugs. We're in a DevOps workplace, so this isn't dull, automated testing; perhaps it's an advanced scenario test, or a complicated A/B test setup. Let's say the tester finds a bug. He sends a note outlining the bug to the developer he has been working closely with. The developer uses the note, automated configuration tools, and so on to get her code into the same state, but she's unable to reproduce the bug. She lets the tester know she can't find it. The tester verifies the bug, and together, dev and test confirm that their code is in the same state. So the difference must be in the data.

With a legacy solution, there are two options. Either the developer files an ops ticket to get her data into the right state, a process that could take days or weeks and might fail repeatedly, depending on how the tester got his data, or she takes over the tester's data set. That lets her run down the bug quickly, at the cost of preventing the tester from doing any work at all. Either way, the process is broken. And if we imagine that the code being tested is part of a major push, a daily feature cadence, or even a crucial patch for a bug running rampant in production, it becomes clear how disruptive this data management task can be.

With a DaaS solution, users can save data in any state and share a copy of that data with any other user, with the same few clicks they would use to share code. Developers and testers don't contend for the same data. They can even skip the process of checking whether the problem is a data mismatch. Instead, they share data readily for every task, as easily and naturally as they share code or underlying hardware resources.
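
In the same hypothetical sketch, a bookmark is nothing more than a named, frozen set of block pointers, and sharing it provisions the recipient a fresh branch of that exact state.

```python
# Still within the sketch: a bookmark freezes a copy's block pointers under a
# name, and sharing it hands another user a fresh branch of that exact state.

bookmarks: dict[str, list[int]] = {}

def bookmark(name: str, copy: VirtualCopy) -> None:
    """Record the exact data state the copy points at right now."""
    bookmarks[name] = list(copy._block_ids)

def share(name: str) -> BranchableCopy:
    """Provision another user their own branch of the bookmarked state."""
    return BranchableCopy(repo, bookmarks[name])

# The tester bookmarks the state that reproduces the bug...
bookmark("bug-repro", feature_branch)
# ...and the developer gets her own copy of that state, no ticket required.
dev_repro = share("bug-repro")
```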

Refresh and Reset
Along with initial environment setup and collaborative debugging, test cycles are some of the most voracious consumers of data in the software development lifecycle. With legacy data delivery methods, testers often have to wait many hours for data to be provisioned to their test environment, in order to run a fifteen-minute test. This creates a very low ceiling on the number of test cycles available in a day, and can prevent the early detection and collaborative resolution of issues that are the keys to DevOps quality.

A DaaS solution can refresh an environment in minutes, accelerating the test cycle by a factor of ten. Top-tier solutions can do even more. A refresh repopulates the test environment with data from production, but a strong DaaS solution can simply rewind the data state to the one immediately before the test. That means any changes to the data need to be made only once. A test cycle characterized by long waits for data and repeated set-up activities becomes one in which each test is followed by a rapid, effortless reset, and any data set-up is performed just once.
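
Rounding out the hypothetical sketch, a reset is simply the restoration of a bookmark taken before the test ran; the cost is a pointer swap, not a reload from production.

```python
# Last piece of the sketch: a reset restores a bookmark taken before the test
# ran, i.e., a pointer swap rather than a fresh load from production.

def rewind(copy: BranchableCopy, name: str) -> None:
    """Restore a copy to a previously bookmarked state."""
    copy._block_ids = list(bookmarks[name])

def run_test(env: BranchableCopy, case: str) -> None:
    """Stand-in for a real test; real tests may mutate the data under test."""
    env.write_block(0, f"mutated by {case}".encode())

test_env = BranchableCopy(repo, prod_ids)
bookmark("pre-test", test_env)            # data set-up captured exactly once

for case in ["scenario_a", "scenario_b", "scenario_c"]:
    run_test(test_env, case)
    rewind(test_env, "pre-test")          # reset in seconds, regardless of data size
```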

Governance
The DevOps movement drives cross-functional collaboration to meet the needs of both developers and operations staff, and a good DaaS solution serves both sets of stakeholders. The capabilities above outline some of the benefits a DaaS solution can provide to Dev and Test teams, but the solution should meet Ops needs as well.

To do that, it needs a distinct set of permissions and management interfaces, so that Ops can carefully manage existing infrastructure and resources even as Dev and Test staff spin up their own environments as needed. A well-designed DaaS tool will not only save Ops time and effort by automating some of the dullest and most repetitive data-delivery tasks; it will also provide a full view of the team's resources for optimal management.
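
As an illustration only, and not any particular product's permission model, the sketch below shows how role-scoped operations keep Dev and Test self-sufficient while Ops retains control of sources, retention, and capacity.

```python
# Illustrative role-scoped guard: Dev and Test get self-service operations,
# while source ingestion, retention, and capacity stay with Ops.

ROLE_PERMISSIONS = {
    "developer": {"provision", "branch", "bookmark", "refresh"},
    "tester":    {"provision", "branch", "bookmark", "refresh", "rewind"},
    "ops":       {"provision", "branch", "bookmark", "refresh", "rewind",
                  "ingest_source", "set_retention", "set_quota", "view_usage"},
}

def authorize(role: str, action: str) -> None:
    """Raise if a role attempts an action outside its permission set."""
    if action not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role '{role}' may not perform '{action}'")

authorize("developer", "branch")          # allowed: self-service branching
authorize("ops", "set_retention")         # allowed: Ops manages the estate
try:
    authorize("tester", "ingest_source")  # blocked: source data is Ops-controlled
except PermissionError as err:
    print(err)
```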

Conclusion
The growing acceptance of the DevOps philosophy, and the maturing ecosystem of associated tools, promise to revolutionize software development across industries, replacing outdated processes and models with collaborative teams that can truly deliver business value at digital speed. Data as a Service solutions will be a key component of this revolution, enabling the full stack of environment creation, sharing, and management, and doubling the overall pace of project delivery.

More Stories By Louis Evans

Louis Evans is a Product Marketing Manager at Delphix. He is a subject-matter expert developing content, surveys and best practices pertinent to the DevOps community. Evans is also a speaker at DevOps focused industry events. He is a graduate of Harvard College, with a degree in Social Studies and Mathematics.
