Digital Transformation Asset Management | @CloudExpo #DX #Cloud #Analytics

From virtual machines to chatbots to Bitcoin, physical has become last century’s modus operandi

Today’s businesses run in the virtual world. From virtual machines to chatbots to Bitcoin, physical has become last century’s modus operandi. Dealing with this type of change in business even has its own buzzword – Digital Transformation. From an information technology operations point of view, this shift manifests as organizations increasingly placing applications, virtual servers, storage platforms, networks, managed services and other assets in multiple cloud environments.

Managing these virtual assets can be far more challenging than managing traditional physical assets in your data center. Cost management and control also differ vastly from their physical-asset equivalents. Challenges abound around tracking and evaluating cloud investments, managing their costs and increasing their efficiency. Managers need to track cloud spending and usage, compare costs with budgets and obtain actionable insights that help set appropriate governance policies.

The cloud computing operational expenditure (OPEX) model demands a holistic management approach capable of monitoring and taking action across a heterogeneous environment, one that will almost certainly include cloud services from multiple vendors and managed service providers. Enterprises also need to manage services from a consumption point of view, looking at each service from the particular application down to the specific IT service resources involved, such as storage or a database. Key goals enterprises must pursue to succeed in this new model include:

  • Obtaining ongoing visibility into the actual, current cloud inventory;
  • Viewing current and projected costs versus industry benchmarks;
  • Establishing and enforcing governance control points using financial and technical policies;
  • Receiving and proactively responding to cloud cost and operational variances and deviations;
  • Gaining operational advantages through advanced analytics and cognitive computing capabilities;
  • Simulating changes to inventory, spend goals and operational priorities before committing;
  • Managing policies through asset tagging across providers and provider services; and
  • Identifying and notifying senior managers about waste and opportunities for cost savings.

Accomplishing these goals across a hybrid IT environment will also require timely, accurate and consistent information delivery to the organization’s CIO, CFO, IT Financial Controller and IT Infrastructure and Operations Managers. Ideally, this information would be delivered via a “single pane of glass” dashboard.
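To make the idea of such a dashboard a little more concrete, the sketch below aggregates per-asset cost records from several providers into the kind of cross-provider summary a single pane of glass would display. The CostRecord structure, provider names and figures are hypothetical and purely illustrative; an actual brokerage platform would populate this data from each provider's billing and inventory APIs.

```python
from collections import defaultdict
from dataclasses import dataclass, field

# Hypothetical, simplified record of one cloud asset's monthly cost.
@dataclass
class CostRecord:
    provider: str            # e.g. "aws", "azure", "ibm-cloud"
    service: str             # e.g. "vm", "storage", "database"
    monthly_cost: float      # current month's cost in USD
    tags: dict = field(default_factory=dict)  # e.g. {"cost-center": "marketing"}

def single_pane_summary(records):
    """Aggregate spend by provider and by cost-center tag."""
    by_provider = defaultdict(float)
    by_cost_center = defaultdict(float)
    for r in records:
        by_provider[r.provider] += r.monthly_cost
        by_cost_center[r.tags.get("cost-center", "untagged")] += r.monthly_cost
    return by_provider, by_cost_center

if __name__ == "__main__":
    inventory = [
        CostRecord("aws", "vm", 1200.0, {"cost-center": "marketing"}),
        CostRecord("azure", "database", 800.0, {"cost-center": "finance"}),
        CostRecord("ibm-cloud", "storage", 450.0),  # untagged asset surfaces below
    ]
    by_provider, by_cost_center = single_pane_summary(inventory)
    print("Spend by provider:   ", dict(by_provider))
    print("Spend by cost center:", dict(by_cost_center))
```

Even a toy roll-up like this makes the untagged asset and its unattributed spend immediately visible, which is exactly the kind of signal the tagging and visibility goals above are meant to surface.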

One path toward gaining these capabilities is a cloud services brokerage platform such as IBM® Cloud Brokerage Managed Services – Cost and Asset Management. This “plug and play” service helps manage spending and assets across hybrid clouds by visualizing data in a way that keeps the focus on asset performance.

Through the use of predictive analytics, it can also provide insight-based recommendations that help prioritize changes according to their expected level of impact. Analytics also makes it possible to recalibrate costs by comparing planned versus actual operational expenditures. The built-in cloud service provider catalog, pricing, and matching engines can also help organizations find alternative providers more easily. Using IBM Watson® cognitive capabilities, IBM Cloud Brokerage Managed Services – Cost and Asset Management will also highlight cloud best practices and expected results based on IBM’s rich knowledge base of cross-industry cloud transition experience.
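As a minimal illustration of the planned-versus-actual comparison, the sketch below flags any service whose monthly spend deviates from its budget by more than a chosen tolerance. The budget figures, the 10% tolerance and the flagging logic are assumptions made purely for illustration; they are not a description of how the IBM service implements its analytics.

```python
def flag_cost_variances(planned, actual, tolerance=0.10):
    """Compare planned vs. actual monthly spend per service and flag
    deviations larger than the given tolerance (default 10%)."""
    flagged = []
    for service, budget in planned.items():
        spend = actual.get(service, 0.0)
        variance = (spend - budget) / budget if budget else float("inf")
        if abs(variance) > tolerance:
            flagged.append((service, budget, spend, variance))
    return flagged

# Hypothetical planned vs. actual OPEX for one month, in USD.
planned = {"compute": 10_000, "storage": 3_000, "managed-db": 2_000}
actual  = {"compute": 12_500, "storage": 2_900, "managed-db": 2_050}

for service, budget, spend, variance in flag_cost_variances(planned, actual):
    print(f"{service}: planned ${budget:,.0f}, actual ${spend:,.0f} "
          f"({variance:+.0%}) -- review recommended")
```

Here only the compute line crosses the 10% threshold, so it alone would surface as a recommendation to investigate or recalibrate.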

Operating a business from a virtual IT platform is different.  That is why advanced cost and asset management skills, capabilities and tools are needed.  According to Gartner, more than US$1 trillion in IT spending will be directly or indirectly affected by the shift to cloud during the next five years. This makes cloud computing one of the most disruptive forces of IT spending since the early days of the digital age.  You and your organization can be ready for these tectonic changes by implementing the straightforward five-step process supported by IBM Cloud Service Brokerage capabilities:

  1. Establish governance thresholds and policies for services;
  2. Connect the advanced management platform across all cloud service accounts;
  3. Track the costs of the services, including recurring and usage-based costs;
  4. Enforce compliance on costs and asset usage using the purpose-built cost analytics engines (a simplified illustration follows this list); and
  5. Simulate and optimize control and compliance actions to better control your costs.
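To ground steps 1 and 4, here is a minimal sketch of what a governance check against tagged assets might look like: every asset must carry a cost-center tag, and the combined recurring and usage-based spend per cost center must stay under its policy ceiling. The policy values, asset records and field names are invented for this example; a brokerage platform would evaluate far richer financial and technical policies against live inventory data.

```python
from collections import defaultdict

# Hypothetical governance policy: monthly spend ceiling per cost center (USD).
POLICY = {"marketing": 5_000, "finance": 8_000}

# Hypothetical assets with recurring and usage-based cost components.
assets = [
    {"id": "vm-001",  "tags": {"cost-center": "marketing"}, "recurring": 1_500, "usage": 700},
    {"id": "db-007",  "tags": {"cost-center": "finance"},   "recurring": 3_000, "usage": 5_500},
    {"id": "blob-09", "tags": {},                           "recurring": 200,   "usage": 50},
]

def check_compliance(assets, policy):
    """Report untagged assets and cost centers that exceed their spend ceiling."""
    violations = []
    spend = defaultdict(float)
    for asset in assets:
        center = asset["tags"].get("cost-center")
        if center is None:
            violations.append(f"{asset['id']}: missing cost-center tag")
            continue
        spend[center] += asset["recurring"] + asset["usage"]
    for center, total in spend.items():
        ceiling = policy.get(center)
        if ceiling is not None and total > ceiling:
            violations.append(f"{center}: ${total:,.0f} exceeds ceiling ${ceiling:,.0f}")
    return violations

for violation in check_compliance(assets, POLICY):
    print("VIOLATION:", violation)
```

Run against this toy inventory, the check reports the untagged storage asset and the finance cost center's $8,500 of combined spend against its $8,000 ceiling, the same categories of waste and overspend the process above is designed to catch and escalate.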

This post was brought to you by IBM Global Technology Services. For more content like this, visit IT Biz Advisor.

© Copyright Kevin L. Jackson 2016


More Stories By Kevin Jackson

Kevin Jackson, founder of the GovCloud Network, is an independent technology and business consultant specializing in mission critical solutions. He has served in various senior management positions including VP & GM Cloud Services NJVC, Worldwide Sales Executive for IBM and VP Program Management Office at JP Morgan Chase. His formal education includes MSEE (Computer Engineering), MA National Security & Strategic Studies and a BS Aerospace Engineering. Jackson graduated from the United States Naval Academy in 1979 and retired from the US Navy earning specialties in Space Systems Engineering, Airborne Logistics and Airborne Command and Control. He also served with the National Reconnaissance Office, Operational Support Office, providing tactical support to Navy and Marine Corps forces worldwide. Kevin is the founder and author of “Cloud Musings”, a widely followed blog that focuses on the use of cloud computing by the Federal government. He is also the editor and founder of “Government Cloud Computing” electronic magazine, published at Ulitzer.com.
