DevOps and SQL Review By @Datical | @DevOpsSummit [#DevOps]

Adopting DevOps patterns is part of the constant crusade to bring high-quality products to market faster

Automating SQL Review to Save Time and Money

I’ve spent the majority of my tech career in startups. I love the fast pace, the opportunity to learn new things, and the sense of accomplishment that comes from bringing a successful new product to market. I began my career in Quality Assurance. In startups, you rarely enjoy the low ratio of Developers to QA Engineers that you might in a large enterprise. As a QA engineer in a startup, your inbox is always far fuller than your outbox. You are the last gate before the next release, so you’re always under the microscope. In an early stage startup you are most likely also the “Customer Support” team, so when an issue is hit in production you become VERY popular.

As someone in that position, I always kept an eye out for the right tools to lighten my load without sacrificing my own personal quality standards. This is how I came across FindBugs about 10 years ago. The first time I ran it and shared the output with the development engineers on my team, they felt the tool emitted more false positives or “nitpicky” patterns than true bugs. But over time, as we tweaked and extended the checks to cover our specific needs and correlated the FindBugs data with actual counts of bugs found in test and production, FindBugs became an integral part of our nightly and on-demand builds. The reports were an excellent early indicator of potential issues and allowed developers to rectify misdeeds before we used up testing cycles or troubleshooting time in operations. The developers on my team also committed fewer and fewer infractions, as the daily reminders from our build system helped them turn bad habits into safer, better performing, more stable code. Release cycles shortened, product quality improved, and customer satisfaction rose, proving that an ounce of prevention really is worth a pound of cure.

As Enterprise IT embraces agile development practices and adopts DevOps patterns in the constant crusade to bring high-quality products to market faster, DBAs are really starting to feel the pinch. The description above of a QA Engineer in a software startup is apt. With more frequent releases, the DBA’s inbox of SQL scripts to write, review, modify, or optimize is always fuller than her outbox. The DBA is the last line of defense for data quality, data security, and data platform performance and is therefore under constant scrutiny. When there is a production outage, the DBA is among the first called to respond.

One of the most time-consuming tasks for the Fortune 50 DBAs we work with is SQL review. Some DBAs spend 70% of their time manually reviewing SQL scripts. They are checking for the same things in SQL that tools like FindBugs look for in Java code: code patterns that indicate logical problems, security flaws, performance issues, and non-compliance with internally defined best practices or externally mandated regulations.
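The flavor of these checks is easy to sketch. The rules below are hypothetical examples for illustration only, not the actual checks performed by FindBugs, Datical DB, or any particular tool:

```python
import re

# Hypothetical pattern-based SQL review rules (illustrative, not any
# real tool's rule set): each rule is a regex paired with a message.
RULES = [
    (r"(?i)\bdelete\s+from\s+\w+\s*;", "DELETE without WHERE clause"),
    (r"(?i)\bselect\s+\*", "SELECT * hides schema dependencies"),
    (r"(?i)\bgrant\s+all\b", "Overly broad privilege grant"),
]

def review_sql(script: str) -> list[str]:
    """Return the list of rule violations found in a SQL script."""
    findings = []
    for pattern, message in RULES:
        if re.search(pattern, script):
            findings.append(message)
    return findings
```

A real engine would parse the SQL rather than pattern-match text, but the principle is the same: codify the reviewer's checklist once, then run it on every script automatically.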

It’s clear that DBAs need a tool that does for them what FindBugs did for my team a decade ago. Static analysis for SQL is nothing new, but current offerings only go so far. Typically, they evaluate SQL statements with no contextual sensitivity. This omission severely limits the productivity and quality gains that can be achieved, because so much of Database Lifecycle Management is being aware of Who is doing What, Where and When. For example, an organization may allow privilege grants and INSERT statements in a TEST environment, but never allow such activity in an automated session in PROD. Any static analysis tool for SQL must take environmental parameters into consideration.
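The GRANT-in-TEST-but-not-PROD example above can be made concrete. This is a minimal sketch of environment-aware validation under assumed environment names and policies, not Datical DB's actual rule syntax:

```python
# Hypothetical per-environment policy: statement verbs that are
# forbidden in each environment (TEST permits everything, PROD is
# strictest). Names and policies are illustrative assumptions.
FORBIDDEN_IN = {
    "PROD": ("GRANT", "INSERT"),
    "STAGE": ("GRANT",),
    "TEST": (),
}

def validate(statement: str, environment: str) -> bool:
    """Return True if the statement is allowed in the given environment."""
    verb = statement.strip().split()[0].upper()
    return verb not in FORBIDDEN_IN.get(environment, ())
```

The same statement passes in TEST and fails in PROD, which is exactly the contextual sensitivity that statement-only analyzers lack.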

Also complicating matters is the nature of database ‘versioning.’  While your application is packaged, versioned and replaced wholesale from release to release, the database schema that supports your application is persistent and evolves over time.  What’s more, external compliance standards (SOX or PCI DSS for example) and internal audit requirements often dictate that incremental changes to the database be rigidly controlled and tracked in a well-defined process. This means the DBA must also confirm (through manual process and reviewing SQL for the appropriate comments) that the change can be traced to its cause and the application of the change can be traced through each environment.

The Datical DB Rules Engine was designed and implemented to satisfy the unique set of challenges posed by SQL review and static analysis. Here are just a few of the ways Datical DB enables safe and sane acceleration through static analysis.

  • Models Make for Powerful Evaluation – Datical DB abstracts the application schema into a strictly defined and validated object model, which makes authoring powerful rules fast and straightforward. Once written, rules are enforced every time a Forecast or Deploy is performed on any database in the lifecycle.
  • Environmentally Aware Change Validation – The model includes information about the client environment and the various database instances in your application’s lifecycle. Your rules can simultaneously allow maximum flexibility in early stage environments and maximum security in sensitive ones.
  • Easily Confirm Internal & External Audit Requirements – In Datical DB, everything you need to remain in compliance with external and internal audit requirements is tied tightly to individual changes in the Data Model.  Manual review to confirm auditability of change is replaced with automated checks that are executed every time you (or your automation frameworks) Forecast or Deploy.
  • Automatically Validate What’s Important to YOU – Datical DB lets you customize analysis to cover internal best practices such as naming conventions, SQL dos and don’ts, and object dependency management.
  • Automate The Boring Stuff. Get Back To The Fun Stuff – Like many static analysis tools for code, Datical DB integrates into your build and deployment systems in a few mouse clicks. Now every time you build or promote an application, Rules validations are performed and a report is generated for dissemination throughout the organization. Your DBAs, having considerably reduced the time spent reading SQL on screen, can concentrate on more strategic projects and problems.
  • Better Coding Means Fewer Bugs – DBAs author rules and share them with development. Development then has a codified repository of what is and is not acceptable in their organization to work against. Catching bugs before they escape DEV saves time and money.
  • Increasing Operations Involvement In Database Development – The Rules Engine is tightly integrated with Datical DB Forecast.  This feature allows you to simulate database change without actually altering the target database.  When DBAs share their Rules with Operations, Operations can run nightly Forecasts against STAGE or PROD to ensure that what’s currently in DEV or TEST will comply with the stricter validations performed downstream, once again finding problems earlier in the lifecycle when they are cheaper and easier to fix.

More Stories By Pete Pickerill

Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin’s technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high profile acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful startups and the companies that acquired them including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by Symantec, Inc.) and Phurnace Software.
