Best Practices for Cloud Logging and Data Security By @TrevParsons | @DevOpsSummit [#DevOps]

Best Practices for Cloud Logging, Security, & Data Protection

This article was originally published on the Logentries Blog.

When we first founded Logentries in 2010, a lot of people thought Viliam Holub (co-founder, CTO, and the brain behind processing billions and billions of log events in real time) and I were crazy. The common response was:

"People are not going to send their logs to the cloud... logs might contain very sensitive data..."

Like typical stubborn founders, we persevered in spite of this, and today we have more than 35,000 users across 100 countries. Our customers range from Fortune 100 companies to individual developers, across almost every vertical: SaaS, healthcare, financial services, commerce, and many others.

So, why do companies now trust sending log data to a cloud-based service? I'd like to share some of the reasons we have found our customers are using secure, cloud-based logging.


  • For on-prem workloads: Your logs may be a lot safer in the cloud than they are on-premises. Cloud vendors like AWS, Google, and Microsoft invest heavily in security and have hardcore security teams looking out for you; the level of security provided in Amazon's cloud platform, for example, is described in more detail here. Furthermore, your logging provider should be looking out for you by encrypting data on the wire (see the short forwarding sketch after this list), stripping sensitive data out before it leaves your network, and encrypting data at rest. It's unlikely that your homegrown logging solution has your data locked down quite so securely, at least not without significant investment.
  • Sending logs to a remote location can be a MORE secure option: With cloud-based logging, your logs are stored remotely from your running systems. This is a recommended security best practice: if your system is compromised, an attacker will often delete the local logs to remove any evidence of their activity. Storing your logs remotely defeats this, because a redundant copy of the log data remains at Logentries.
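
A minimal sketch of what "encrypted on the wire" forwarding can look like: it opens a TLS connection to a remote collector and sends a single log line. The endpoint (logs.example.com:6514), the token, and the line format are placeholders for illustration, not the actual Logentries protocol; in practice your logging agent or syslog daemon handles this for you.

```python
import socket
import ssl

# Placeholder endpoint and token; substitute your provider's real values.
COLLECTOR_HOST = "logs.example.com"
COLLECTOR_PORT = 6514  # a common port for syslog over TLS
TOKEN = "00000000-0000-0000-0000-000000000000"


def send_log_line(message: str) -> None:
    """Send one log line over a TLS-wrapped TCP connection."""
    context = ssl.create_default_context()  # verifies the server's certificate
    with socket.create_connection((COLLECTOR_HOST, COLLECTOR_PORT)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=COLLECTOR_HOST) as tls_sock:
            # Many hosted logging services accept one "<token> <message>" pair per line.
            tls_sock.sendall(f"{TOKEN} {message}\n".encode("utf-8"))


if __name__ == "__main__":
    send_log_line("user login succeeded for account 42")
```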

As you consider moving your log management and analytics to the cloud, here are five best practices to look for from a log management service to ensure security:

  1. Website integrity: A good indicator of how seriously a company takes its security is how it deals with website integrity; for cloud services, the website is often the gateway to your data. At Logentries, for example, we redirect all HTTP requests to our website to HTTPS. This protects the integrity of the Logentries website by using SSL authentication between the customer and the Logentries web interface, and the Logentries service must present a valid SSL certificate to each customer to initiate this link. Perfect Forward Secrecy is also used on our web servers for HTTPS. In addition to the usual confidentiality and integrity properties of HTTPS, forward secrecy adds a further property: if an adversary records all of a user's encrypted traffic and later cracks or steals Logentries' private keys, they should not be able to use those keys to decrypt the recorded traffic at some point in the future. (A short sketch of a forward-secret TLS configuration appears after this list.)
  2. What if I have sensitive data in my logs? While it's usually not a good idea to write PII or other sensitive data to your logs, it is not always avoidable, and in some cases it happens inadvertently or through oversight. The ability to search for and filter out, redact, or obfuscate sensitive data in your logs is therefore a key requirement for many organizations. The Logentries DataHub has been designed with this in mind, so that you can easily filter out and redact any sensitive data before it leaves your network. It was designed in conjunction with a number of our customers who have data protection and security requirements, in particular those who require PCI, HIPAA, or similar audits. (A minimal redaction sketch appears after this list.)
  3. Is your data encrypted on the wire? Sending your logs in the clear is rarely a good idea. Data sent to a cloud logging service should travel over SSL so that it is encrypted on the wire; check that your log forwarding agent, collector, or syslog setup is configured to support this (the forwarding sketch above shows the idea).
  4. Is your data encrypted at rest? Data at rest should also be encrypted; that is, any data you send to a cloud service should be encrypted while it sits on disk in the cloud environment. Ask your logging provider whether this is the case and what encryption they use. (A small illustration of encryption at rest appears after this list.)
  5. Where does your data reside? Check where your logs actually reside. Are they in a SOC 2 compliant data center? How is it protected? What jurisdiction is it in, and what are the data protection policies in that jurisdiction?
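
For item 1, here is a short sketch of what a forward-secret TLS configuration can look like at the socket layer. The certificate and key paths are placeholders, and this is an illustration of the concept rather than Logentries' actual server configuration.

```python
import ssl

# Illustrative server-side TLS context limited to forward-secret key exchange.
# "server.crt" and "server.key" are placeholder paths.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.load_cert_chain(certfile="server.crt", keyfile="server.key")

# For TLS 1.2, allow only ECDHE suites so each session uses an ephemeral key
# exchange; TLS 1.3 suites are forward-secret by design.
context.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")

# A server wrapped with this context yields sessions that cannot be decrypted
# later from a traffic capture, even if the long-term private key leaks.
```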
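
For item 2, a minimal redaction pass you could run before log lines leave your network. The patterns and field names are illustrative only and say nothing about how the Logentries DataHub is implemented.

```python
import re

# Illustrative patterns; real deployments tune these to their own data.
REDACTIONS = [
    (re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b"), "[REDACTED-CARD]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"), "[REDACTED-EMAIL]"),
    (re.compile(r"(password|api_key)=\S+", re.IGNORECASE), r"\1=[REDACTED]"),
]


def redact(line: str) -> str:
    """Strip obvious sensitive values from a log line before forwarding it."""
    for pattern, replacement in REDACTIONS:
        line = pattern.sub(replacement, line)
    return line


print(redact("login ok user=jo@example.com password=hunter2 card=4111 1111 1111 1111"))
# login ok user=[REDACTED-EMAIL] password=[REDACTED] card=[REDACTED-CARD]
```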
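
And for item 4, a small illustration of what encryption at rest means, using the Fernet recipe from the third-party cryptography package. It shows the concept only (ciphertext, not plaintext, is what ends up on disk) and is not a description of how any particular provider stores data.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustration only: in production the key would live in a KMS or secret
# store, never next to the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

log_batch = b"2015-03-02 10:14:07 user=42 action=login status=ok\n"

# "Encrypted at rest" means ciphertext is what actually sits on disk.
with open("logs.enc", "wb") as fh:
    fh.write(cipher.encrypt(log_batch))

# Reading the data back requires the key.
with open("logs.enc", "rb") as fh:
    assert cipher.decrypt(fh.read()) == log_batch
```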

Want to learn more or chat with a cloud logging expert? Get started with a free 30-day trial here, or contact us at [email protected]

More Stories By Trevor Parsons

Trevor Parsons is Chief Scientist and Co-founder of Logentries. Trevor has over 10 years' experience in enterprise software and, in particular, has specialized in developing enterprise monitoring and performance tools for distributed systems. He is also a research fellow at the Performance Engineering Lab Research Group and was formerly a Scientist at the IBM Center for Advanced Studies. Trevor holds a PhD from University College Dublin, Ireland.
