Using Log Data Streams for Real-Time Analytics By @MattKiernan | @DevOpsSummit #DevOps

Part 1: The definition and benefits of using log data streams and real-time analytics for some common IT Ops use cases

Using Log Data Streams for Real-Time Analytics: Part 1
By Matt Kiernan

Analytics tools are often focused on analyzing historical data: taking a sample of data from past events, you can perform calculations to determine what happened during that period and report on your findings. Monitoring tools, by contrast, are more often thought of in terms of real-time data, reporting raw metrics as they are recorded. Somewhere between these two types of tools sits Real-Time Analytics: the practice of performing constant monitoring and analysis in real time, delivering both raw metrics and up-to-the-second actionable insights.
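To make that distinction concrete, here is a minimal Python sketch of the real-time analytics idea: a rolling window that keeps only the most recent response times parsed from a log stream and recomputes its summary as each new event arrives. The class name, window size, and sample latencies are illustrative assumptions, not taken from the article or any particular tool.

```python
from collections import deque
import statistics
import time

class RollingLatency:
    """Keep only the last few minutes of response times and recompute
    summary statistics as each new event streams in."""

    def __init__(self, window_seconds=300):
        self.window_seconds = window_seconds
        self.samples = deque()  # (timestamp, latency_ms) pairs

    def add(self, latency_ms):
        now = time.time()
        self.samples.append((now, latency_ms))
        # Drop samples that have fallen out of the time window.
        while self.samples and now - self.samples[0][0] > self.window_seconds:
            self.samples.popleft()

    def summary(self):
        values = [v for _, v in self.samples]
        return {
            "count": len(values),
            "median_ms": statistics.median(values) if values else None,
        }

# Hypothetical latencies parsed from an incoming log stream.
window = RollingLatency(window_seconds=300)
for latency in [120, 95, 400, 150]:
    window.add(latency)
    print(window.summary())  # the insight is current as of this event
```

The point of the design is that nothing waits for a batch job: every arriving event both updates the raw data and refreshes the derived insight immediately.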

The need for Real-Time Analytics has grown steadily as IT infrastructures evolve into more advanced systems, often distributed across thousands of instances that automatically scale up or down based on immediate demand.

In Logentries' latest article, Using Log Data Streams for Real-Time Analytics, we explore four real-world situations where Real-Time Analytics is necessary. As noted in the article, many common challenges can occur when data is not real-time, such as when working with timestamps:

To further illustrate what real-time analytics means, let's start by comparing it to the more familiar approach of batch data processing. While batch processing can still append new data to an existing set, it does so in batches rather than as a continuous stream. Batch processing comes with several disadvantages compared to real-time streaming. For example, if the data being processed doesn't include timestamps, every event in a batch will be assigned the same timestamp: the date and time the batch process ran.
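As a rough illustration of that timestamp problem, the Python sketch below contrasts the two ingestion styles; the ingest_batch and ingest_stream helpers are hypothetical, not part of any real pipeline. In the batch case every event inherits the time the batch ran, while in the streaming case each event is stamped as it arrives.

```python
import time

def ingest_batch(events):
    """Batch ingestion: every event in the batch is stamped with the
    time the batch job ran, not the time the event actually occurred."""
    batch_time = time.time()
    return [{"message": e, "timestamp": batch_time} for e in events]

def ingest_stream(event):
    """Streaming ingestion: each event is stamped the moment it arrives,
    preserving the real order and spacing of what happened."""
    return {"message": event, "timestamp": time.time()}

# Batch: events that may have happened minutes apart all share one timestamp.
print(ingest_batch(["login failed", "retry", "login failed"]))

# Stream: each event gets its own arrival time as it comes in.
for line in ["login failed", "retry", "login failed"]:
    print(ingest_stream(line))
```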

Batch processing also makes it impossible to generate immediate alerts from events as they occur. How effective can a system alert be if you'll still experience several minutes of downtime before you even receive it? Tools that are genuinely real-time can deliver information within seconds of an event occurring, alerting you to the warning signs leading up to an issue and improving your chances of identifying, diagnosing, and resolving problems before they negatively impact end users.
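As a minimal sketch of that streaming-alert idea, and not a description of the Logentries product, the snippet below follows a log file much like `tail -f` and raises an alert within seconds of a matching line being written. The log path, error pattern, and alert function are placeholder assumptions.

```python
import re
import sys
import time

ERROR_PATTERN = re.compile(r"ERROR|Exception|timeout", re.IGNORECASE)

def follow(log_file):
    """Yield new lines appended to a log file, similar to `tail -f`."""
    log_file.seek(0, 2)  # start at the end of the file
    while True:
        line = log_file.readline()
        if not line:
            time.sleep(0.5)  # wait briefly for new data
            continue
        yield line

def alert(line):
    """Placeholder notification; a real setup would page or post to chat."""
    print(f"ALERT ({time.strftime('%H:%M:%S')}): {line.strip()}", file=sys.stderr)

if __name__ == "__main__":
    with open("/var/log/app.log") as f:  # hypothetical log path
        for line in follow(f):
            if ERROR_PATTERN.search(line):
                alert(line)  # fires within seconds of the event being written
```

Because each line is evaluated as it arrives, the warning signs that precede an outage surface immediately instead of waiting for the next batch run.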

Want to learn more about which situations demand Real-Time Analytics and what to look for in a Real-Time Analytics tool? Download Logentries' free whitepaper, Using Log Data Streams for Real-Time Analytics.

More Stories By Trevor Parsons

Trevor Parsons is Chief Scientist and Co-founder of Logentries. Trevor has over 10 years' experience in enterprise software and, in particular, has specialized in developing enterprise monitoring and performance tools for distributed systems. He is also a research fellow at the Performance Engineering Lab Research Group and was formerly a Scientist at the IBM Center for Advanced Studies. Trevor holds a PhD from University College Dublin, Ireland.
