Shots Across the Data Lake

Big Data Analytics Range War

Range Wars
The settling of the American West brought many battles between ranchers and farmers over access to water. The farmers claimed land near the water and fenced it to protect their crops. But the farmers' fences blocked the ranchers' cattle from reaching the water. Fences were cut; shots were fired; it got ugly.

About a century later, with the first tech land rush of the late 1980s and early '90s - before the Web - came battles between those who wanted software and data centrally controlled on corporate servers and those who wanted them distributed to workers' desktops. Oracle and IBM versus Microsoft and Lotus. Database versus Spreadsheet.

Now, with the advent of SoMoClo (Social, Mobile, Cloud) technologies and the Big Data they create, have come battles between groups on different sides of the "Data Lake" over how it should be controlled, managed, used, and paid for. Operations versus Strategy. BI versus Data Science. Governance versus Discovery.  Oversight versus Insight.

The range wars of the Old West were not a fight over property ownership, but rather over access to natural resources. The farmers and their fences won that one, for the most part.

Those tech battles in the enterprise are fights over access to the "natural" resource of data and to the tools for managing and analyzing it.

In the '90s and most of the following decade, the farmers won again. Data was harvested from corporate systems and piled high in warehouses, with access controlled and limited to selected users who milled it into Business Intelligence.

But now, in the era of Big Data Analytics, it is not looking so good for the farmers. The public cloud, open source databases, and mobile tablets are all chipping away at the centralized command-and-control infrastructure down by the riverside. And new cloud-based Big Data analytics solution providers like BigML, Yottamine (my company) and others are putting unprecedented analytical power in the hands of the data ranchers.

A Rainstorm, Not a River
Corporate data is like a river - fed by transaction tributaries and dammed into databases for controlled use in business irrigation.

Big Data is more like a relentless rainstorm - falling heavily from the cloud and flowing freely over and around corporate boundaries, with small amounts channeled into analytics and most draining to the digital deep.

Many large companies are failing to master this new data ecology because they are trying to do Big Data analytics the same way, and with the same tools, as they did BI - and that will never work. There is a lot more data, of course, but it is different data - tweets, posts, pictures, clicks, GPS, etc., not RDBMS records - and different analytics - discovery and prediction, not reporting and evaluation.

Successfully gleaning business value from the Big Data rainstorm requires new tools and maybe new rules.

Embracing Shadows
These days, tech industry readers frequently see the term "Shadow IT," referring to business people using new technologies to process and analyze information without the help of "real IT" - SoMoClo by another, more sinister name. Traditionalists see it as a threat to corporate security and stability; modernists see it as a boon to cost control and competitiveness.

But it really doesn't matter which view is right. Advanced analytics on Big Data takes more computing horsepower than most companies can afford. Jobs like machine learning from the Twitter Firehose will take hundreds or even thousands of processor cores and terabytes of memory (not disk!) to build accurate and timely predictive models.
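
To make that concrete, here is a minimal sketch, in Python with scikit-learn, of the streaming, memory-bounded style of learning such jobs demand. It is an illustration only, not any vendor's actual pipeline; the two-line "firehose" and its labels are hypothetical stand-ins for a real feed.

    # A minimal sketch of firehose-style streaming learning (assumes scikit-learn).
    from sklearn.feature_extraction.text import HashingVectorizer
    from sklearn.linear_model import SGDClassifier

    # Feature hashing maps words straight to vector positions, so no
    # vocabulary is ever stored - memory stays flat as the stream grows.
    vectorizer = HashingVectorizer(n_features=2**20, alternate_sign=False)
    model = SGDClassifier(loss="log_loss")

    def stream_batches():
        # Stand-in for a real firehose consumer; yields (texts, labels).
        yield (["cloud outage reported", "great quarter for acme"], [1, 0])
        yield (["servers down again", "earnings beat estimates"], [1, 0])

    # partial_fit learns one mini-batch at a time, keeping the working set
    # in memory (not on disk), which is exactly the scale pressure at issue.
    for texts, labels in stream_batches():
        X = vectorizer.transform(texts)
        model.partial_fit(X, labels, classes=[0, 1])

The design point is that both the vectorizer and the model are incremental: nothing in the loop grows with the size of the stream except the model's fixed-size weights, which is what makes renting a large pool of cores and memory, rather than a large disk farm, the right shape of infrastructure.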

Most companies will have no choice but to embrace the shadow, using AWS or another elastic cloud computing service, along with new, more scalable software tools, to do effective large-scale advanced analytics.

Time for New Rules?
Advanced Big Data analytics projects - the ones of a scale that only the cloud can handle - are being held back by reservations over privacy, security, and liability that, in most cases, turn out to be needless.

If the data to be analyzed were actual business records for customers and transactions, as they are in the BI world, those concerns would be reasonable. But more often than not, advanced analytics does not work that way. Machine learning and other advanced algorithms do not look at business data; they look at statistical information derived from business data, usually in the form of an inscrutable mass of binary truth values that is actionable only to the algorithm. That is what gets sent to the cloud, not the customer file.
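
As a toy illustration of that point - a hypothetical Python sketch, not any vendor's actual encoding - consider hashing a customer record into a fixed-length bit vector. The vector is what would travel to the cloud; the names, amounts, and account numbers stay home.

    # Hypothetical sketch: derive binary features from a record before upload.
    import hashlib

    def binarize(record, n_bits=16):
        # Map each field=value pair to a position in a fixed-length bit vector.
        bits = [0] * n_bits
        for field, value in record.items():
            digest = hashlib.sha256(f"{field}={value}".encode()).digest()
            bits[digest[0] % n_bits] = 1
        return bits

    customer = {"state": "WA", "plan": "gold", "tenure_years": 7}
    print(binarize(customer))
    # Prints something like [0, 1, 0, 0, 1, ...] - statistically useful
    # to a learning algorithm, meaningless to anyone who intercepts it.

A learner sees only which bits co-occur with which outcomes; nothing in the vector is readable as a customer record, which is why the usual data-handling rules so often turn out not to apply.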

If you want to do advanced cloud-scale Big Data analytics and somebody is telling you it is against the rules, you should look at the rules.  They probably don't even apply to what you are trying to do.

First User Advantage
Advanced Big Data analytics is sufficiently new and difficult that not many companies are doing much of it yet.  But where BI helps you run a tighter ship, Big Data analytics helps you sink your enemy's fleet.

Someday, technologies like high-performance statistical machine learning will be ubiquitous, and the business winners will be the ones who use the software best. But right now, solutions are still scarce, and the business winners are the ones willing to use the software at all.

More Stories By Tim Negris

Tim Negris is SVP, Marketing & Sales at Yottamine Analytics, a pioneering Big Data machine learning software company. He occasionally authors software industry news analysis and insights on Ulitzer.com and is a 25-year technology industry veteran with expertise in software development, database, networking, social media, cloud computing, mobile apps, analytics, and other enabling technologies.

He is recognized for his ability to rapidly translate complex technical information and concepts into compelling, actionable knowledge. He is also widely credited with coining the term and co-developing the concept of the “Thin Client” computing model while working for Larry Ellison in the early days of Oracle.

Tim has also held a variety of executive and consulting roles in numerous start-ups and several established companies, including Sybase, Oracle, HP, Dell, and IBM. He is a frequent contributor to a number of publications and sites focusing on technologies and their applications, and has written a number of advanced software applications for social media, video streaming, and music education.
