Shots Across the Data Lake

Big Data Analytics Range War

Range Wars
The settling of the American West brought many battles between ranchers and farmers over access to water. The farmers claimed land near the water and fenced it to protect their crops. But the farmers' fences blocked the ranchers' cattle from reaching the water. Fences were cut; shots were fired; it got ugly.

About a century later, with the first tech land rush of the late 1980s and early '90s - before the Web - came battles between those who wanted software and data to be centrally controlled on corporate servers and those who wanted it distributed to workers' desktops. Oracle and IBM versus Microsoft and Lotus. Database versus Spreadsheet.

Now, with the advent of SoMoClo (Social, Mobile, Cloud) technologies and the Big Data they create, have come battles between groups on different sides of the "Data Lake" over how it should be controlled, managed, used, and paid for. Operations versus Strategy. BI versus Data Science. Governance versus Discovery.  Oversight versus Insight.

The range wars of the Old West were not a fight over property ownership, but rather over access to natural resources. The farmers and their fences won that one, for the most part.

Those tech battles in the enterprise are fights over access to the "natural" resource of data and to the tools for managing and analyzing it.

In the '90s and most of the following decade, the farmers won again. Data was harvested from corporate systems and piled high in warehouses, with access controlled for selected users who milled it into Business Intelligence.

But now, in the era of Big Data analytics, it is not looking so good for the farmers. The public cloud, open source databases, and mobile tablets are all chipping away at the centralized command-and-control infrastructure down by the riverside. And new cloud-based Big Data analytics solution providers like BigML, Yottamine (my company) and others are putting unprecedented analytical power in the hands of the data ranchers.

A Rainstorm, Not a River
Corporate data is like a river - fed by transaction tributaries and dammed into databases for controlled use in business irrigation.

Big Data is more like a relentless rainstorm - falling heavily from the cloud and flowing freely over and around corporate boundaries, with small amounts channeled into analytics and most draining to the digital deep.

Many large companies are failing to master this new data ecology because they are trying to do Big Data analytics in the same way, with the same tools as they did with BI, and that will never work. There is a lot more data, of course, but it is different data - tweets, posts, pictures, clicks, GPS, etc., not RDBMS records - and different analytics - discovery and prediction, not reporting and evaluation.

Successfully gleaning business value from the Big Data rainstorm requires new tools and maybe new rules.

Embracing Shadows
These days, tech industry readers frequently see the term "Shadow IT," referring to business people using new technologies to process and analyze information without the help of "real IT" - SoMoClo by another, more sinister name. Traditionalists see it as a threat to corporate security and stability; modernists see it as a boon to cost control and competitiveness.

But it really doesn't matter which view is right. Advanced analytics on Big Data takes more computing horsepower than most companies can afford. Jobs like machine learning on the Twitter firehose can take hundreds or even thousands of processor cores and terabytes of memory (not disk!) to build accurate and timely predictive models.

Most companies will have no choice but to embrace the shadow and use AWS or some other elastic cloud computing service, along with new, more scalable software tools, to do effective large-scale advanced analytics.
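
As a rough illustration of what embracing the shadow can look like in practice - not a prescription, and with the AMI ID, instance type, region, and tags all hypothetical placeholders - here is a minimal sketch using the AWS boto3 SDK to rent a short-lived block of memory-heavy instances for a training job and release them when it finishes:

```python
import boto3

# Hypothetical sketch: provision short-lived, memory-heavy capacity on AWS
# for a one-off model-training job, then release it when the job is done.
ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",      # placeholder: your own analytics machine image
    InstanceType="r5.24xlarge",  # example memory-optimized type (96 vCPUs, 768 GiB RAM)
    MinCount=4,
    MaxCount=4,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "ml-training"}],
    }],
)
instance_ids = [i["InstanceId"] for i in response["Instances"]]

# ... run the distributed training job on these instances, then tear them down
# so you only pay for the hours the model actually needed.
ec2.terminate_instances(InstanceIds=instance_ids)
```

The point of the sketch is the elasticity: capacity on this scale is rented by the hour, which is exactly the kind of spend that rarely fits a traditional data-center budget.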

Time for New Rules?
Advanced Big Data analytics projects, the ones of a scale that only the cloud can handle, are being held back by reservations over privacy, security and liability that in most cases turn out to be needless concerns.

If the data to be analyzed were actual business records for customers and transactions, as in the BI world, those concerns would be reasonable. But more often than not, advanced analytics does not work that way. Machine learning and other advanced algorithms do not look at business data. They look at statistical information derived from business data, usually in the form of an inscrutable mass of binary truth values that is only actionable to the algorithm. That is what gets sent to the cloud, not the customer file.
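
As a rough illustration of that derivation step - not any vendor's actual pipeline, and with the field names and hashing scheme invented for the example - the sketch below uses feature hashing to turn a customer record into an anonymous vector of binary truth values. Only the derived bits would leave the premises; the original fields never do.

```python
import hashlib

def binarize(record, num_bits=64):
    """Derive an anonymous binary feature vector from a business record.

    Each field/value pair is hashed onto a fixed-size bit vector (feature
    hashing), so the output carries no recoverable customer data.
    """
    bits = [0] * num_bits
    for field, value in record.items():
        digest = hashlib.sha256(f"{field}={value}".encode()).hexdigest()
        bits[int(digest, 16) % num_bits] = 1
    return bits

# Hypothetical customer record -- only the derived bits go to the cloud.
customer = {"state": "WA", "plan": "gold", "churned": "no"}
print(binarize(customer))
```

A vector of ones and zeros like this is all the learning algorithm needs, and it is meaningless to anyone who intercepts it without the original records and the hashing scheme.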

If you want to do advanced cloud-scale Big Data analytics and somebody is telling you it is against the rules, you should look at the rules.  They probably don't even apply to what you are trying to do.

First User Advantage
Advanced Big Data analytics is sufficiently new and difficult that not many companies are doing much of it yet.  But where BI helps you run a tighter ship, Big Data analytics helps you sink your enemy's fleet.

Someday, technologies like high-performance statistical machine learning will be ubiquitous, and the business winners will be the ones who use the software best. But right now, solutions are still scarce, and the business winners are the ones willing to use the software at all.

More Stories By Tim Negris

Tim Negris is SVP, Marketing & Sales at Yottamine Analytics, a pioneering Big Data machine learning software company. He occasionally authors software industry news analysis and insights on Ulitzer.com and is a 25-year technology industry veteran with expertise in software development, databases, networking, social media, cloud computing, mobile apps, analytics, and other enabling technologies.

He is recognized for his ability to rapidly translate complex technical information and concepts into compelling, actionable knowledge. He is also widely credited with coining the term and co-developing the concept of the “Thin Client” computing model while working for Larry Ellison in the early days of Oracle.

Tim has also held a variety of executive and consulting roles in numerous start-ups and several established companies, including Sybase, Oracle, HP, Dell, and IBM. He is a frequent contributor to a number of publications and sites focusing on technologies and their applications, and has written a number of advanced software applications for social media, video streaming, and music education.
