
@BigDataExpo: Article

Shots Across the Data Lake

Big Data Analytics Range War

Range Wars
The settling of the American West brought many battles between ranchers and farmers over access to water. The farmers claimed land near the water and fenced it to protect their crops. But the farmers' fences blocked the ranchers' cattle from reaching the water. Fences were cut; shots were fired; it got ugly.

About a century later, with the first tech land rush of the late 1980s and early '90s - before the Web - came battles between those who wanted software and data to be centrally controlled on corporate servers and those who wanted it distributed to workers' desktops. Oracle and IBM versus Microsoft and Lotus. Database versus Spreadsheet.

Now, with the advent of SoMoClo (Social, Mobile, Cloud) technologies and the Big Data they create, have come battles between groups on different sides of the "Data Lake" over how it should be controlled, managed, used, and paid for. Operations versus Strategy. BI versus Data Science. Governance versus Discovery.  Oversight versus Insight.

The range wars of the Old West were not a fight over property ownership, but rather over access to natural resources. The farmers and their fences won that one, for the most part.

Those tech battles in the enterprise are fights over access to the "natural" resource of data and to the tools for managing and analyzing it.

In the '90s and most of the following decade, the farmers won again. Data was harvested from corporate systems and piled high in warehouses, with controlled access by selected users who milled it into Business Intelligence.

But now, in the era of Big Data Analytics, it is not looking so good for the farmers. The public cloud, open source databases, and mobile tablets are all chipping away at the centralized command-and-control infrastructure down by the riverside. And new cloud-based Big Data analytics solution providers like BigML, Yottamine (my company) and others are putting unprecedented analytical power in the hands of the data ranchers.

A Rainstorm, Not a River
Corporate data is like a river - fed by transaction tributaries and dammed into databases for controlled use in business irrigation.

Big Data is more like a relentless rainstorm - falling heavily from the cloud and flowing freely over and around corporate boundaries, with small amounts channeled into analytics and most draining to the digital deep.

Many large companies are failing to master this new data ecology because they are trying to do Big Data analytics in the same way, with the same tools, as they did BI, and that will never work. There is a lot more data, of course, but it is different data - tweets, posts, pictures, clicks, GPS, etc., not RDBMS records - and different analytics - discovery and prediction, not reporting and evaluation.

Successfully gleaning business value from the Big Data rainstorm requires new tools and maybe new rules.

Embracing Shadows
These days, tech industry content readers frequently see the term "Shadow IT," referring to how business people are using new technologies to process and analyze information without the help of "real IT" - SoMoClo by another, more sinister name. Traditionalists see it as a threat to corporate security and stability; modernists see it as a boon to cost control and competitiveness.

But it really doesn't matter which view is right. Advanced analytics on Big Data takes more computing horsepower than most companies can afford. Jobs like machine learning from the Twitter Firehose will take hundreds or even thousands of processor cores and terabytes of memory (not disk!) to build accurate and timely predictive models.
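To see why memory, not disk, is the constraint, a rough back-of-envelope calculation helps. The tweet volume and feature width below are illustrative assumptions for the sake of arithmetic, not figures from the article or any vendor:

```python
# Back-of-envelope sizing (illustrative numbers only):
# one day of full-firehose tweets, each hashed into a fixed-width
# feature vector, held in memory as dense double-precision values
# for an in-memory learning algorithm.

tweets_per_day = 500_000_000   # assumed daily firehose volume
features = 4_096               # assumed hashed feature width
bytes_per_value = 8            # double precision

dense_bytes = tweets_per_day * features * bytes_per_value
print(f"dense feature matrix: {dense_bytes / 1e12:.0f} TB")  # -> 16 TB
```

Even with sparse encodings shaving that down by an order of magnitude or two, the working set lands far beyond a typical corporate server's RAM, which is the point: this class of job belongs on elastic cloud infrastructure.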

Most companies will have no choice but to embrace the shadow and use AWS or some other elastic cloud computing service, and new, more scalable software tools to do effective large scale advanced analytics.

Time for New Rules?
Advanced Big Data analytics projects, the ones of a scale that only the cloud can handle, are being held back by reservations over privacy, security and liability that in most cases turn out to be needless concerns.

If the data to be analyzed were actual business records for customers and transactions as it is in the BI world, those concerns would be reasonable.  But more often than not, advanced analytics does not work that way.  Machine learning and other advanced algorithms do not look at business data. They look at statistical information derived from business data, usually in the form of an inscrutable mass of binary truth values that is only actionable to the algorithm.  That is what gets sent to the cloud, not the customer file.
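The point about derived features can be made concrete. The toy encoder below is an invented illustration (not Yottamine's or any product's actual scheme): it hashes a record's field/value pairs into a fixed-width vector of truth values, so only the vector, never the raw customer record, would need to leave the building:

```python
import hashlib

def to_binary_features(record: dict, n_bits: int = 64) -> list:
    """Hash a record's field=value pairs into a fixed-width 0/1 vector.

    Only this derived vector is shipped to a cloud training job; the
    raw names, ZIP codes, and amounts stay on-premises. (Toy sketch --
    real pipelines use library feature hashers and far wider vectors.)
    """
    bits = [0] * n_bits
    for field, value in record.items():
        digest = hashlib.sha256(f"{field}={value}".encode()).digest()
        bits[digest[0] % n_bits] = 1  # set the bucket this pair hashes to
    return bits

# Hypothetical customer record, for illustration only.
customer = {"name": "Jane Doe", "zip": "98101", "churned": "no"}
vector = to_binary_features(customer)
print(sum(vector), "of", len(vector), "bits set")
```

A model trained on such vectors can still predict churn, but anyone intercepting the upload sees only an inscrutable bit pattern, which is why blanket data-residency objections often do not apply.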

If you want to do advanced cloud-scale Big Data analytics and somebody is telling you it is against the rules, you should look at the rules.  They probably don't even apply to what you are trying to do.

First User Advantage
Advanced Big Data analytics is sufficiently new and difficult that not many companies are doing much of it yet.  But where BI helps you run a tighter ship, Big Data analytics helps you sink your enemy's fleet.

Someday, technologies like high-performance statistical machine learning will be ubiquitous, and the business winners will be the ones who use the software best. But right now, solutions are still scarce and the business winners are the ones willing to use the software at all.

More Stories By Tim Negris

Tim Negris is SVP, Marketing & Sales at Yottamine Analytics, a pioneering Big Data machine learning software company. He occasionally authors software industry news analysis and insights on Ulitzer.com and is a 25-year technology industry veteran with expertise in software development, databases, networking, social media, cloud computing, mobile apps, analytics, and other enabling technologies.

He is recognized for his ability to rapidly translate complex technical information and concepts into compelling, actionable knowledge. He is also widely credited with coining the term and co-developing the concept of the "Thin Client" computing model while working for Larry Ellison in the early days of Oracle.

Tim has also held a variety of executive and consulting roles in numerous start-ups and several established companies, including Sybase, Oracle, HP, Dell, and IBM. He is a frequent contributor to a number of publications and sites, focusing on technologies and their applications, and has written a number of advanced software applications for social media, video streaming, and music education.
