
Cloud Advisory Council Announces the Formation of Open BigCloud Group

The Cloud Advisory Council, a not-for-profit organization dedicated to developing and enabling next-generation cloud architecture, today announced the formation of the Open BigCloud group. The Open BigCloud group will bring together industry, academia, and research communities to shape the future of open cloud computing and Big Data.

The objective of the Open BigCloud group is to advance understanding of how the revolutionary paradigm of cloud computing can be applied at extreme scales of computing, data processing, and integration with instruments. These issues constitute the cornerstone of future innovation and require a broad spectrum of expertise spanning architectures, networking, DevOps, and software systems.

“The Cloud Advisory Council is excited to lead the efforts in cloud computing and Big Data research,” said Eli Karpilovski, Chairman of the Cloud Advisory Council. “Our goal is to create an ecosystem of active contributing members where together we can establish ground-breaking results. We look forward to seeing the long-term impact of our results.”

“As open technologies such as Open Compute and OpenStack expand their footprint into the enterprise and academia, our goal is to build an open foundation across industry, academia, and research to help navigate the intersections of high performance computing (HPC), cloud computing, and Big Data,” said Paul Rad, Director of Cloud and Big Data at the University of Texas at San Antonio and Vice President of Open Research, Rackspace. “The BigCloud Foundation intends to enhance the international conversation by investigating Cloud and Big Data from a workload perspective.”

"The Nimbus Project at Argonne National Laboratory pioneered Infrastructure-as-a-Service cloud computing. Now that OpenStack and other open source efforts provide a solid open source platform, we have refocused our efforts on helping the scientific community by exploring the cloud computing paradigm at extreme scales; we will leverage Nimbus for BigCloud activities," said Kate Keahey, creator of Nimbus and cloud scientist at Argonne National Laboratory.

The move aims to spur innovation and fuel a new ecosystem of collaboration spanning research, academia, and established enterprise players. Together with Internet2 and the Cloud Advisory Council, the group shares a vision for a new innovation model that transforms how research and industry collaborate.

“To facilitate discovery across the disciplines, it is imperative that our scientists have the computational resources and knowledge to effectively explore theories and validate hypotheses. To this end, open platforms have allowed for dynamic innovation, fundamentals-based user education, and cost controls in a dynamic and evolving ICT environment,” said Paul Brenner, University of Notre Dame Center for Research Computing. “The OBC collaboration will provide a foundation for commercial innovation while leveraging contributions from open academic discovery; working on common platforms, market competitors can share lower base infrastructure costs and invest instead in new technical discoveries.”

The Open BigCloud group's objectives are as follows:

  • Understand, document and define requirements as well as barriers to adoption for applications relying on BigCloud capabilities
  • Develop relevant prototypes in the context of open source projects, such as OpenStack and Open Compute, to demonstrate and integrate innovation
  • Create connections between communities seeking to experiment with innovative computing resources and providers supporting such experimentation, in particular fostering a tighter connection between academia and research on one hand and industry on the other

"The Internet2 community applauds the creation of the Open BigCloud Group," said Steve Wolff, Internet2 chief technology officer. "The research and education community increasingly relies upon advances in cloud computing to meet the requirements of data-intensive research, especially as compute, storage, and network infrastructure are converging. We look forward to working with the Open BigCloud Group as it brings together researchers across academia, industry, and government to collaboratively accelerate the development of innovative cloud technologies."

The group will foster a “virtual innovation game room” community, interacting via symposiums and workshops, social media, and special interest group activities that disseminate new solutions and proposals for large-scale cloud implementations. An initial community meeting, giving rise to the BigCloud collaboration, has already taken place at Argonne National Laboratory (http://www.rackspace.com/blog/cloud-hpc-and-open-technologies-converge-to-fuel-research-innovation/). The second BigCloud Symposium will take place May 7-8 at UTSA (http://utsa.edu/today/2014/04/openbigcloud.html) and will bring together participants from industry, research centers, and academia to discuss issues arising in cloud computing in the Big Compute and Big Data context.

About the Cloud Advisory Council

The Cloud Advisory Council is a not-for-profit organization whose mission is to develop the next-generation cloud architecture, to provide cloud designers and IT managers with the tools needed to enable computing in the cloud, to strengthen the qualification and integration of cloud solutions, and to provide best practices. The Cloud Advisory Council is led by a broad coalition of industry practitioners and corporations. For more information, visit www.cloudadvisorycouncil.com, or follow us on Twitter: @Cloud_Advisory.

More Stories By Business Wire

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
