Test-Driven Development Is Not About Testing

I am always on the lookout for good questions to ask candidates in an interview. Not the "How many oranges can I fit in this room?" kind of nonsense (the stock response to which is apparently "with or without us standing in it?"). Nor the picky, encyclopedic type such as "In the javax.obscure.DustyCorner class, which method throws a FullyDocumentedException?" (If you do not respond with "I would check the Javadocs", on the grounds that you actually know, you really ought to get out more.)

Instead, I like the sort of technical question that allows candidates to demonstrate real insight; where they can show not only technical depth and breadth, but also a mature understanding of the software development process. So I was delighted when a colleague offered me a perfect interview question, namely: "What is the point of test-driven development?"

Test-driven development (TDD) has grown out of the Agile software movement (www.agilealliance.org) and Extreme Programming (XP) in particular. Extreme Programming stipulates a set of best practices that collectively encourage core values such as feedback and simplicity. The feedback occurs in the form of tests, by delivering in short iterations, and by the simple expedient of talking to one another. The simplicity comes from the process of refactoring - ruthlessly - and from only delivering exactly what the software has to do right now.

Kent Beck, the original champion of XP, has extracted the essence of its development practices and named it test-driven development. And so to the model interview answer. The point of TDD is to drive out the functionality the software actually needs, rather than what the programmer thinks it probably ought to have. The way it does this seems at first counterintuitive, if not downright silly, but it not only makes sense, it also quickly becomes a natural and elegant way to develop software.

We start by writing some client code as though the code we want to develop already existed and had been written purely to make our life as easy as it could possibly be. This is a tremendously liberating thing to do: by writing a model client for our code, in the form of a test, we can define programmatically the most suitable API for our needs. In addition, we assert the behavior we want.
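
For example, a model client for a hypothetical MovieQueue class might look like the JUnit test below. The class name and its methods are invented purely for illustration; nothing on the other side of this API exists yet.

import junit.framework.TestCase;

// A model client for the hypothetical MovieQueue class. None of this
// code exists yet: the test defines the API we wish we had and
// asserts the behavior we want from it.
public class MovieQueueTest extends TestCase {

    public void testNewQueueIsEmpty() {
        MovieQueue queue = new MovieQueue();
        assertTrue(queue.isEmpty());
    }

    public void testAddingAMovieGrowsTheQueue() {
        MovieQueue queue = new MovieQueue();
        queue.add("The Seven Samurai");
        assertEquals(1, queue.size());
        assertFalse(queue.isEmpty());
    }
}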

Obviously this won't even compile, and this is the counterintuitive part - the code that will sit on the other side of the API doesn't even exist yet! The next stage is to write the minimum amount of code to get the test compiling. That's all, just a clean compile, so you can run the test (which at this stage will fail). IDEs such as IntelliJ IDEA or the open source Eclipse will generate missing classes and implement missing methods for you. Now, and only now, you write the application code to satisfy the test. The final piece of the puzzle is to refactor the code so it's as simple as it can be. This then becomes your development rhythm: write a test, write some code, refactor.
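
Continuing the hypothetical MovieQueue sketch, the application code that satisfies those tests can be almost embarrassingly small - just enough to pass, with nothing speculative added:

import java.util.ArrayList;
import java.util.List;

// Just enough code to make the tests compile and then pass.
// No speculative features, no extra methods "for later".
public class MovieQueue {

    private final List<String> movies = new ArrayList<String>();

    public boolean isEmpty() {
        return movies.isEmpty();
    }

    public void add(String title) {
        movies.add(title);
    }

    public int size() {
        return movies.size();
    }
}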

Writing the test before you write the code focuses the mind - and the development process - on delivering only what is absolutely necessary. In the large, this means that the system you develop does exactly what it needs to do and no more. This in turn means that it is easy to modify to make it do more things in the future as they are driven out by more tests.
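
Suppose, for instance, a new requirement arrives for our hypothetical MovieQueue: the same movie must not be queued twice. The rhythm stays the same - the requirement arrives as a test.

import junit.framework.TestCase;

// New behavior arrives as a new test first. The existing tests keep
// guarding everything the queue already does while we change it.
public class MovieQueueDuplicateTest extends TestCase {

    public void testAddingTheSameMovieTwiceOnlyQueuesItOnce() {
        MovieQueue queue = new MovieQueue();
        queue.add("Brazil");
        queue.add("Brazil");
        assertEquals(1, queue.size());
    }
}

Making it pass might be as small as swapping the ArrayList for a LinkedHashSet, and the earlier tests will tell us immediately if that change breaks anything we already rely on.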

We keep the tests we wrote and run all of them, often, to make sure the system does everything it is supposed to do (and to alert ourselves immediately if we break any existing functionality). However, the extremely useful test suite we've created is very much a secondary benefit of the TDD process.
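
In JUnit 3.x style, for example, the hypothetical tests above can be bundled into a single suite so the whole lot runs in one go, after every small change:

import junit.framework.Test;
import junit.framework.TestSuite;

// Bundle every test so the entire suite runs in one go, often -
// after each small change, not just before a release.
public class AllTests {

    public static Test suite() {
        TestSuite suite = new TestSuite();
        suite.addTestSuite(MovieQueueTest.class);
        suite.addTestSuite(MovieQueueDuplicateTest.class);
        return suite;
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}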

So when you're sitting in an interview and someone asks you about test-driven development, remember that it's not about the tests; it's about seeing how little you actually need to do and how cleanly you can do it! If someone asks you to fill a room with oranges? Well, I'll leave that to you.

More Stories By Dan North

Dan North has been writing software for 12 years, and is a programmer and coach for ThoughtWorks (www.thoughtworks.com), a software development consultancy, where he encourages people to write tests.

Most Recent Comments
Kamal Mettananda 06/13/06 08:49:03 AM EDT

Good article.
TDD is one of the ways that can be used to solve the issue of unclear requirements. There has always been a gap, plus or minus, between what the customer wanted and what programmers developed. Most of the time programmers get lost inside all the requirements and write the code according to their own understanding.

But if the test-based mindset is there, the solution gets close to the requirements. The hardest thing to achieve is to develop the correct test set. If the programmers who got confused by the requirements write the tests as well, then the goal will not be achieved.

William 01/13/05 06:38:58 PM EST

I am a 'code everything that MIGHT be needed in the next 100 years' type of programmer. To date, I have completed one program in the last 15 years of programming (this way).
It dawned on me that there is a better way. Having taught for several years, I taught the spiral and waterfall models. The department head hated XP programming. However, from personal experience, I have taken a hard look at the techniques and models I use. Here is what I found:
* Gather your requirements, and note dependencies.
* Do a preliminary system design. System design and foreseeable future changes MUST play a large part of the initial [API] design. This part of the project takes anywhere from a couple of hours to several weeks, depending on functionality and requirements.
* Code from the user interface back, testing as you go.

This is another view of TDD. Code what is needed, not what might be. I know, because I have thrown away more working and tested code than I have kept (just to keep dead weight out of the source).

treyst1 12/04/04 02:29:35 PM EST

I agree with CW: "agile development" has become a trigger word for knee-jerk reactions. At its simplest, agile software development focuses on continuous care of design: what's not to like about that?

Walter 12/04/04 01:49:37 PM EST

Nice article -- thoughtful comments.
My view of the value of TDD is in driving the requirements to completion. It is especially useful if the tests are validated by the stakeholders. However, no small number of tests - and in many cases no finite set of tests - can be a complete specification. So there is always one more bug.

Arup 12/02/04 11:59:00 AM EST

The word Agile itself means being able to adapt, to change according to the context. The kind of methodology used should depend on the project requirements; there is no one-strategy-suits-all magic formula.

Over the last few years I have seen us making exactly the same kind of mistakes with XP that we made with earlier methodologies, by mandating a particular way of development.

TDD, however, gave me another interesting benefit. I am a person who by nature has a habit of over-engineering application designs. TDD helps me curb that and make designs which are much simpler.

Eivind Eklund 08/02/04 10:05:30 AM EDT

The last post (talking about more work) is missing the point. It is *more code to type up front*, but the aim is to create LESS work.

I find that "2x to 3x more code than the individual methods being tested" roughly matches my own experience. The efficiency in refactoring and the cleanliness of the code produced, however, turn development of the same functionality into less work (if you measure in terms of development hours). It also makes the code much more pleasant to work with afterwards.

Eivind

Simon 08/02/04 08:14:03 AM EDT

Well, TDD was all over TechEd in the .NET arena and is built straight into the tools; hell, NUnit will do it right now. So having the tools to help is a bonus in doing this kind of stuff, certainly in the MS arena. In fact, in the new Team System, rules can be put in place so that code can't even be checked in without a valid test run.

Thing is, moving away from the technical: you mentioned that it is to keep the programmer from writing a bunch of stuff he feels is necessary, so only things with a test get through.

Who do you think is writing the tests? Who is auditing the results of those tests? Just how good are the tests in the first place?

It all sounds good, but, as usual with pretty much any XP method, you are going to have to put in shedloads of resource to ensure not only that the code is correct but that the test code is to spec as well. With a test for practically every function, you could more than double or triple the amount of code to write.

People wonder why corporations are shipping development overseas; with this much effort for every project, it had better come cheap.

Kevin 05/05/04 11:56:14 PM EDT

Nice article - short and to the point. TDD is definitely about design. But the take-away here needs to be that some amount of design MUST be done up front (in direct contrast to what some are saying). If you don't think about the interface (i.e., contract) that your software must provide, you can't even begin to write your test client.

Also, on an unrelated note, I agree with what Sparky says regarding the dangers inherent in a lack of forethought.

Sparky 04/29/04 08:10:52 PM EDT

CW: Where in my comments did I attack any abbreviations or acronyms?

I simply disagreed with the author's statements that you can start coding with only a scant amount of forethought and expect that you will be better able to satisfy all future requirements as a result of that behavior.

Nowhere did I blame his statements on any abbreviations. I happen to like abbreviations... =8>j)

Regards,
sparky

CW 04/29/04 04:49:49 PM EDT

I'm always amazed that when those otherwise innocent abbreviations, like XP or TDD, are mentioned, it brings the naysayers out of the woodwork. Nowhere in this article did the author say "x must be done y way". Yet when agile development is mentioned, to give background to this piece, many people lose all cogent reasoning and respond emotionally with "that type of development doesn't work". In fact, nowhere in agile methodology is restrictive guidance documented; quite the contrary, individual developers (or teams) are supposed to TAILOR the agile FRAMEWORK to fit their needs.

This is an excellent article about perspective. Folks who construe it any other way are either managers who do nothing but embrace abbreviations and acronyms without understanding them, or the "legacy" type of people who complain about everything at work without ever offering any solutions!

Sparky 04/21/04 02:40:44 PM EDT

Dan North's statement that "the system you develop does exactly what it needs to do and no more... This in turn means that it is easy to modify to make it do more things in the future" is absolutely not true in most cases. In fact, some functionality can never be retrofitted "in the future" if not taken into consideration from the ground up.

It's a bit irresponsible to suggest that "you should just start coding and everything will eventually work itself out and fall into place," except for a relatively trivial development effort.

(Has anyone seen "The Prototype That Wouldn't Die"? =8>\)

- sparky

Jörg 04/20/04 04:08:28 PM EDT

Really a nice article.

I am not an XP programmer, but I am not ashamed to use refactoring or unit testing where it makes sense. I think it's good practice to refactor code to make it both easier to understand and easier to modify.

Wayne 04/16/04 08:20:40 PM EDT

Refactoring sounds like one of those things you can do easily; the sort of thing you do when you don't want too much of a challenge, but you still want to do something useful to your code. The trouble is that it's a bit of a bore when you need to do it. So you tend to leave the code unrefactored and it starts to become a mess. That's where refactoring tools shine. When you see a need to refactor, the tools take away the drudgery so you can concentrate on the interesting part: improving the code.

I've only tried refactoring tools on toy applications, and I can already see that they are going to be a must-have feature in my next development toolset.

Aiden 04/02/04 08:29:31 PM EST

Dan, thanks for the interesting article.

I basically agree with Stuart's post, except I don't really get which "tools" are necessary for doing refactoring or unit testing. Do you need more than a simple text editor and compiler?

Nor do I get why it is easier to refactor due to a lack of type declarations. In my experience, the more code the compiler checks for you, the less likely you are to introduce errors.

Eivind Eklund 02/12/04 09:59:09 AM EST

Stuart's idea that Test-Driven Development was not feasible "years ago" due to the lack of refactoring tools represents a misunderstanding. Refactoring is perfectly feasible today without tools - ASSUMING YOU USE THE RIGHT LANGUAGE. I personally use a TDD style wherever I can. I can't do it all through my day job yet, due to short-short-short (bi-hourly or similar) deadlines and complex interdependencies. I do not use any particular tools for refactoring. However, I mostly program in languages (Ruby for love, Perl for money) that do not have type declarations. This cuts my code size to between 1/2 (Perl) and 1/4 (Ruby), and makes refactoring much easier.

I'll not claim to have done test-driven development before it became reasonably well known (I haven't - I have used tests, but not consistently, and not test first). I have, however, done things that are some form of refactoring. I have just not used it as consistently as I do now (all the time, and not as a mix of restructuring and refactoring), nor did it form as distinct a concept in my head as it does now.

Eivind.

Stuart 01/05/04 09:16:08 AM EST

Reading some of the above comments makes me yelp 'Ouch!', as if someone had smacked me around the head a couple of times (with the exception of Mr. Putman's).

What most people have failed to realise is the point Dan is making: TDD is not about testing - it's about design. Design does not have to be completed up front, refactoring should be done ruthlessly (if it isn't broken, let's pile up some more duplicate code, which isn't broken), and double negatives prove a point ('An ambiguous API can not never be implemented cleanly').

The truth of the matter is that although it is not a complete software development process for large-scale projects, TDD will provide you with the simplest solution to your software problem. As the project grows in complexity, the tests are a handy extra when it comes to refactoring at a low level. As the user changes his mind about what the software is supposed to do, new tests can be written, old tests can be deleted, code can be changed, and the whole lot refactored.

Is all this a waste of extra effort? Of course not. How many times have you worked on a mature project only to have a clever business analyst / user come up and say: we kind of need our software to do this one little simple thing. A 'simple' thing in business terms can sometimes turn your project upside down. Write a test, write some code and refactor. Try doing that without a test suite and you're gonna have fun.

And if anyone else mentions that they used to program this way years and years ago... please go out onto a busy street and shout, 'I am a fool!'. There simply weren't the tools available to program this way before, especially with regard to refactoring.

Final note: all of this presumes a developer knows how to write a test - something that was not particularly clear to me for a while, and which took some time to learn. Try to think of testing behaviour, not functionality.

Ben Kuehlhorn 11/25/03 02:09:40 PM EST

I agree that XP is just another development methodology.

Test-Driven Development works well when the API is well defined. An ambiguous API can not never be implemented cleanly. Critical design must be completed before writing a test case or a line of code. The author writes a sample of code to use the API. A lot of detail must be specified for the test to make sense. Any correction or clarification needs to be fixed twice: test case and code. A lot of effort would be wasted without sufficient design.

When does testing end? Do I have to write test cases for my test cases? At some point you should just write the code from a clean, complete design.

Hal Peters 11/14/03 04:27:10 PM EST

Interesting article. It seems some of the concepts from the "old" Top-Down Development and Prototyping approaches have been refactored and are now called XP (Extreme Programming), ASD (Agile Software Development), GEP (Good-Enough Programming), GP (Guerilla Programming), or whatever. How much these improve on more "professional" approaches is uncertain; just ask NASA about doing things quickly, cheaply, etc.

Where do USER requirements, pseudocode and decision tables fit into this "new" extremely agile way of doing things? Is Test-Driven Development (TDD) appropriate for industrial-strength enterprise applications? YES! If it is managed by professional project managers, it can improve on the horrific success rate of current initiatives. It has been estimated that currently, business users are happy with our computer programming results in only 1 of 3 projects. Be honest: is YOUR experience, or your clients' environment, any better?

The point is that with .NET, Macromedia Flash MX 2004, ActionScript 2.0, Java, Coffee Beans, Chocolate, and Vanilla all evolving... a whole new set of challenges exists. It is imperative that teams learn to embrace tried and true (i.e., proven) development techniques, a prerequisite to building more complex applications. We must do things better and smarter. The goal is to improve the process!

As a closet Methodologist, I say it does not matter so much which Guideline you follow, as long as you and your fellow teammates use (and agree on) ONE!

It must be remembered that with TDD, one needs to keep not only the tests, but the regression test data as well. Resources will now need to be planned for maintenance of these new additions to a project's software assets.

Refactoring, in general, seems counterintuitive to me. Years of coding experience (starting with FORTRAN II on the IBM 1620) bring to mind these two adages: "let sleeping dogs lie" and "if it is not broken, don't fix it".

David Putman 11/12/03 07:58:09 AM EST

I sort of saw this article as a clean and concise description of a valuable programming technique which can be used anytime, anywhere. I didn't see any hard and fast rules, just some good, clear, general guidelines.
I know Dan mentioned that the origins of TDD were in Agile Software Development and XP, but I can't see anywhere that he says you *have* to be doing these to do TDD. Ditto pair programming.
I think the previous comments missed the point about feedback, never mind the point of the article.

Chris L. 11/11/03 04:46:22 PM EST

I agree with the previous comment. XP/Agile can make some developers and business users hyper, overexcited. They think it is a silver bullet for their problems. XP/Agile, like any other methodology, has its pros and cons. It has proved somewhat successful in small-scale, isolated applications and prototypes. It is not practical in large, enterprise-level projects. It is obvious that "daily refactoring" simply won't work if there are lots of dependencies between different systems, which are controlled by different departments or different outsourcing parties. Pair programming is useful when the situation fits and would not gain much under some other scenarios... etc. The whole list deserves its own topic.
I am just curious why XP/Agile advocates do not address its shortcomings, so that in turn it can be promoted in the right way...

Robert Cresswell 11/06/03 04:49:18 AM EST

This is an exploration of some programming techniques and I am sure it's very useful as a teaching aid. It falls into the same trap that most methods do (RUP, XP, Agile) by saying: this is the way, follow the way. I have been programming a very long time and I have used methods like this and even more extreme ones, testing code fragments. But the point is my methods are agile: I customise my development techniques to suit the problem or the type of work I am addressing.
Please have pity: every time you guys give hard and fast rules, I have to deal with the fools who think it's the one and only true way to work. Agile methods just say: go back to the ways you always worked before the academics started telling you how to program.
