Big Data, Open Data and Cloud Strategy

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud

The Big Data and Cloud market has been growing at a staggering pace. Data has become too large and unwieldy to be handled by relational database systems alone, and there is a need to provision and manage elastic, scalable systems effectively. Information technology is undergoing a major shift driven by new paradigms and a variety of delivery channels; social networks and the proliferation of devices such as tablets and phones are among the drivers of these technologies. Social business and collaboration continue to develop, enhancing productivity and interaction. There has been a large void in the Big Data area and a need for solutions that can manage it. Part of the problem has been that so much focus went to user interfaces that few organizations were thinking about the core: data. Now, with the proliferation of large, unstructured data, it is important to extract and process large data sets from different systems expeditiously. To deliver strategic business value, organizations need the capability to process Big Data and the analytics for enhanced decision making. In addition, systems that process Big Data can rely on the Cloud to rapidly provision and deploy elastic, scalable systems.

The key elements of a comprehensive strategy for Big Data, Open Data and Cloud include conducting a cost-benefit analysis, hiring resources with the right skills, evaluating requirements for data and analytics, developing a sound platform that can process and analyze large volumes of data quickly, and building strong analytic capabilities to answer important business questions. A sound strategy also includes assessing existing and future data, services and applications, as well as projected growth. In addition, there should be a focus on ensuring that the infrastructure can support and store unstructured as well as structured data. Data protection, including security and privacy, is a very important part of the strategy: as data sets grow more complex, data can be compromised at the end points or while in transit, so proper security controls have to be developed to address these risks. Organizations also need to develop policies, practices and procedures that support an effective transition to these technologies.

As part of the strategic transition to Big Data and Cloud, it is important to select a platform that can handle such data, parse through records quickly and provide adequate storage. With the high velocity of data coming through systems, in-memory analytics and fast processing are key capabilities the platform should support, along with good application development tooling and the ability to manage, provision and monitor systems effectively. The platform should offer components and connectors for Big Data so that integrated solutions can be built. From a development perspective, open source software such as Hadoop, Hive, Pig and R is being leveraged for Big Data. Hadoop was developed as a framework for the distributed processing of large data sets that scales out as data grows, and it can handle data from diverse systems, whether structured, unstructured or media. NoSQL databases are being used by organizations to store data that is not structured. In addition, some vendors offer proprietary Hadoop distributions. The choice between a proprietary and an open source solution depends on many factors and requires a thorough assessment.
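The distributed processing model that Hadoop popularized can be illustrated with a minimal map-reduce sketch in plain Python. This is not Hadoop itself, just the pattern: a map phase that emits key-value pairs and a reduce phase that aggregates them, here counting words across input records. The function names and sample data are illustrative.

```python
from collections import defaultdict
from typing import Iterable, Iterator, Tuple


def map_phase(records: Iterable[str]) -> Iterator[Tuple[str, int]]:
    """Map: emit a (word, 1) pair for every word in every input record."""
    for record in records:
        for word in record.lower().split():
            yield word, 1


def reduce_phase(pairs: Iterable[Tuple[str, int]]) -> dict:
    """Reduce: sum counts per key, as a Hadoop reducer would after the shuffle."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)


lines = ["big data needs big platforms", "data at scale"]
counts = reduce_phase(map_phase(lines))
# counts: {'big': 2, 'data': 2, 'needs': 1, 'platforms': 1, 'at': 1, 'scale': 1}
```

In a real Hadoop job the mappers and reducers run in parallel across a cluster, with the framework handling partitioning, shuffling and fault tolerance; the per-record logic, however, stays this simple.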

Systems that process Big Data need the Cloud for rapid provisioning and deployment. The elastic, scalable nature of the Cloud supports the storage and management of massive amounts of data. Data can be kept in a Cloud-based storage solution, or database adapters can be used to pull it from existing databases into Hadoop, Pig and Hive. Vendors also offer data transfer services that move Big Data to and from the Cloud. The Cloud adds dynamic computing, elasticity, self-service and measured usage, among other qualities, enabling rapid provisioning and on-demand access. Cloud solutions may offer lower life cycle costs based on usage, and their monitoring capabilities can lay out a holistic view of usage, cost assessments and chargeback information. All of this enhances an organization's ability to plan and react to changes based on performance and capacity metrics.
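The measured-usage and chargeback idea can be made concrete with a small sketch. The rates and field names below are purely hypothetical, not any vendor's actual pricing; the point is that metered consumption translates directly into a per-tenant bill.

```python
def chargeback(usage_hours: float, storage_gb: float,
               compute_rate: float = 0.12, storage_rate: float = 0.02) -> dict:
    """Compute a simple pay-per-use bill from metered consumption.

    Rates are illustrative placeholders (dollars per compute-hour and
    per GB-month), not real cloud pricing.
    """
    compute_cost = usage_hours * compute_rate
    storage_cost = storage_gb * storage_rate
    return {
        "compute": round(compute_cost, 2),
        "storage": round(storage_cost, 2),
        "total": round(compute_cost + storage_cost, 2),
    }


# A department that ran 100 compute-hours and stored 500 GB this month:
bill = chargeback(usage_hours=100, storage_gb=500)
# bill: {'compute': 12.0, 'storage': 10.0, 'total': 22.0}
```

Fed with real monitoring data, a report like this per team or application is exactly the "holistic view of usage, cost assessments and chargeback information" the paragraph above describes.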

Open Data initiatives should be built on strong foundations of technologies such as Shared Services, Big Data and Cloud. Initiatives are underway around Open Data that drive the development and deployment of innovative applications. Making data accessible enables the development of new products and services, so the data should be made available in a standardized manner that developers can use quickly and effectively. Open Data maximizes value creation built on existing structured and unstructured data.

An Open Data strategy and its initiatives should define specifically what data will be made available, based on the utility of that information. Just providing massive dumps of data that are hard to use is not the solution; there has to be proper processing that extracts useful information from the data. The published data should support automated processing so that custom applications can be built on it, and it should be renderable in formats such as HTML and XML. This can promote not just traditional applications but also mobile applications. There has to be strong emphasis on security and privacy, since any errors made in exposing the data can compromise important information. A comprehensive strategy for Big Data, Cloud and Open Data will enable a smooth transition and big wins!
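Publishing the same records in multiple machine-readable formats, as the paragraph above recommends, takes only the standard library. The sketch below renders a hypothetical open-data record set as both JSON and XML; the field names and tag names are invented for illustration.

```python
import json
import xml.etree.ElementTree as ET


def to_json(records: list) -> str:
    """Render a list of record dicts as JSON for automated consumption."""
    return json.dumps(records, indent=2)


def to_xml(records: list, root_tag: str = "dataset",
           row_tag: str = "record") -> str:
    """Render the same records as XML, one element per record."""
    root = ET.Element(root_tag)
    for rec in records:
        row = ET.SubElement(root, row_tag)
        for key, value in rec.items():
            ET.SubElement(row, key).text = str(value)
    return ET.tostring(root, encoding="unicode")


# A hypothetical open-data record set:
records = [{"agency": "transport", "riders": 1200}]
json_out = to_json(records)
xml_out = to_xml(records)
# xml_out: <dataset><record><agency>transport</agency><riders>1200</riders></record></dataset>
```

Serving both formats from one canonical data set is what lets web, mobile and third-party developers each consume the data in the form their tooling handles best.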

(This article has been extracted from, and refers to, a blog post. All views and information expressed here are the author's own and do not represent the positions of any other person or organization.)

More Stories By Ajay Budhraja

Ajay Budhraja has over 24 years in Information Technology, with experience in areas such as executive leadership, management, strategic planning, enterprise architecture, system architecture, software engineering, training, methodologies, networks and databases. He has provided senior executive leadership for nationwide and global programs and has implemented integrated enterprise IT solutions.

Ajay has a Master's in Engineering (Computer Science), a Master's in Management and a Bachelor's in Engineering. He is a Project Management Professional certified by the PMI and also holds CICM, CSM, ECM (AIIM) Master, SOA, RUP, SEI-CMMI, ITIL-F and Security+ certifications.

Ajay has led large-scale projects for big organizations and has extensive IT experience in telecom, business, manufacturing, airlines, finance and government. He has delivered internet-based technology solutions and strategies for e-business platforms, portals, mobile e-business, collaboration and content management. He has worked extensively in application development, infrastructure development, networks and security, and has contributed significantly in the areas of enterprise and business transformation, strategic planning, change management, technology innovation, performance management, Agile management and development, Service-Oriented Architecture and Cloud.

Ajay has led organizations as a senior executive. He is the Chair of the Federal SOA COP and Chair of Cloud Solutions, a MidTech Leadership Steering Committee member, and has served as President of DOL-APAC and AEA-DC and as Co-Chair of the Executive Forum at the Federal Executive Institute SES Program. As adjunct faculty, he has taught courses for several universities. He has received many awards, authored articles and presented papers at worldwide conferences.
