Artifact Repository in Continuous Delivery | @DevOpsSummit #DevOps #APM

We explain why tools like Maven use an artifact repository, and why anyone designing a continuous delivery process should too

Why You Need an Artifact Repository for Continuous Delivery
By Ron Gidron

Both prospects and customers often ask me why they need an artifact repository. Some assume that because their favorite CI (Continuous Integration) server, such as Jenkins, already stores the output of each build, there's no need to add an artifact repository to their existing tool chain. Others simply wonder why they need such a repository at all.

In this blog post I'll discuss why it's essential for any continuous delivery and deployment project to version everything, and why artifact repositories such as Artifactory or Nexus are great choices for managing binary and other artifacts.

What repositories do
Let's start with some basics: Artifact repositories manage collections of artifacts (binaries, or really any type of file) and metadata in a defined directory structure. They are typically used by software build tools such as Maven (in the Java world) as sources for retrieving and storing needed artifacts. But there is really no limit to what you can store in an artifact repository. Some examples:

  • Any type of binary
  • Source archives
  • Flash archives
  • Documentation bundles

Why use a repository?
Artifact repositories are great at managing multilevel dependencies, much better than the old text file with a list that developers update and maintain. This dependency management is critical for reducing errors and ensuring the right pieces make it into each build/deployment/release, especially in large-scale business applications.
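To see why a hand-maintained list breaks down, consider a toy resolver that walks a dependency graph transitively, the way a repository-aware build tool does. This is only a sketch; the artifact names are illustrative, not from any real project:

```python
# Toy transitive-dependency resolver: given an artifact, collect
# everything it pulls in, directly or indirectly, with deduplication.
# A flat text list maintained by hand can't keep up with this graph
# as it grows; a repository tracks it as metadata.
DEPENDENCIES = {
    "my-app":        ["web-framework", "db-driver"],
    "web-framework": ["http-core", "json-lib"],
    "db-driver":     ["http-core"],
    "http-core":     [],
    "json-lib":      [],
}

def resolve(artifact, seen=None):
    """Depth-first walk of the dependency graph."""
    if seen is None:
        seen = set()
    for dep in DEPENDENCIES.get(artifact, []):
        if dep not in seen:
            seen.add(dep)
            resolve(dep, seen)
    return seen

print(sorted(resolve("my-app")))
# → ['db-driver', 'http-core', 'json-lib', 'web-framework']
```

Note that "http-core" appears only once in the result even though two artifacts depend on it; real resolvers additionally reconcile conflicting versions, which is exactly the bookkeeping you don't want in a text file.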

Repositories also support the notion of snapshot and release versions, where snapshots are intermediate versions of an artifact (usually marked with a date and timestamp attached to the version number) and release versions are those that are marked for "official" release. The metadata that describes each artifact and its dependencies is also great for governance and security.
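The snapshot-versus-release distinction can be sketched as follows, using the common Maven-style convention of appending a `yyyyMMdd.HHmmss` stamp and a build number to snapshot versions (the version numbers here are made up):

```python
# Sketch of snapshot vs. release naming. A snapshot is an intermediate
# build, stamped so every build is uniquely addressable; a release is
# the bare, immutable version number.
from datetime import datetime, timezone

def snapshot_version(base, build_number, now=None):
    """Return base version tagged with a UTC date/timestamp and build number."""
    now = now or datetime.now(timezone.utc)
    stamp = now.strftime("%Y%m%d.%H%M%S")
    return f"{base}-{stamp}-{build_number}"

def release_version(base):
    return base  # a release is just the bare version, never overwritten

fixed = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
print(snapshot_version("1.4.0", 7, fixed))  # → 1.4.0-20240115.103000-7
print(release_version("1.4.0"))             # → 1.4.0
```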

How repositories work
Artifact repositories use a standard addressing mechanism for accessing artifacts, which greatly simplifies automation. It also makes it easy to parameterize the search for, and retrieval of, versioned artifacts, often via a REST call whose URL maps directly onto the directory structure... OK, now I'm geeking out more than is necessary for this blog entry!
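That addressing mechanism is worth one small illustration. In the standard Maven repository layout, an artifact's coordinates map deterministically to a URL path: dots in the group ID become slashes, followed by the artifact ID, the version, and the file name. The base URL below is a placeholder, not a real repository:

```python
# Map Maven-style coordinates (groupId, artifactId, version) to the
# standard repository path. Because the mapping is deterministic,
# automation can compute the URL instead of searching for it.
def artifact_url(base_url, group_id, artifact_id, version, ext="jar"):
    group_path = group_id.replace(".", "/")
    return (f"{base_url}/{group_path}/{artifact_id}/{version}/"
            f"{artifact_id}-{version}.{ext}")

url = artifact_url("https://repo.example.com/releases",
                   "org.apache.commons", "commons-lang3", "3.12.0")
print(url)
# → https://repo.example.com/releases/org/apache/commons/commons-lang3/3.12.0/commons-lang3-3.12.0.jar
```

This is exactly the kind of URL a deployment script or REST call can construct from a handful of parameters, which is what makes versioned retrieval so easy to automate.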

Basically, if you're designing a continuous delivery and automated deployment process, whether for a single application, a department, or your entire IT landscape, we highly recommend you take a look at an artifact repository and make sure to version everything.

More Stories By Automic Blog

Automic, a leader in business automation, helps enterprises drive competitive advantage by automating their IT factory - from on-premise to the Cloud, Big Data and the Internet of Things.

With offices across North America, Europe and Asia-Pacific, Automic powers over 2,600 customers including Bosch, PSA, BT, Carphone Warehouse, Deutsche Post, Societe Generale, TUI and Swisscom. The company is privately held by EQT. More information can be found at www.automic.com.
