
Big Data, Open Data and Cloud Strategy

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud

The Big Data and Cloud market has been growing at a staggering pace. Data volumes have become too large for relational database systems alone to handle, creating a need to provision and manage elastic, scalable systems effectively. Information technology is undergoing a major shift driven by new paradigms and a variety of delivery channels, with social networks and the proliferation of devices such as tablets and phones accelerating the change. Social business and collaboration continue to mature, enhancing productivity and interaction. At the same time, there has been a significant gap in solutions that can manage Big Data: many organizations focused so heavily on user interfaces that they gave little thought to the core - the data itself. With the proliferation of large and unstructured data sets, organizations must be able to extract and process data from disparate systems expeditiously. Delivering strategic business value requires both the capability to process Big Data and the analytics to support better decision making. In addition, systems that process Big Data can rely on the Cloud to rapidly provision and deploy elastic, scalable infrastructure.

The key elements of a comprehensive strategy for Big Data, Open Data and Cloud include conducting a cost-benefit analysis, hiring resources with the right skills, evaluating data and analytics requirements, building a platform that can process and analyze large volumes of data quickly, and developing strong analytic capabilities to answer important business questions. A sound strategy also assesses existing and future data, services and applications, along with projected growth, and ensures that the infrastructure can store and support both structured and unstructured data. Data protection, including security and privacy, is equally important: as data sets grow more complex, data can be compromised at the end points or in transit, so proper security controls must be developed to address these risks. Organizations also need policies, practices and procedures that support an effective transition to these technologies.

As part of the strategic transition to Big Data and Cloud, it is important to select a platform that can handle such data, parse through records quickly and provide adequate storage. Given the high velocity of data flowing through systems, in-memory analytics and fast processing are key capabilities the platform should support, along with strong application development features and the ability to manage, provision and monitor systems effectively. The platform should also offer components and connectors for Big Data so that integrated solutions can be built. From a development perspective, open source software such as Hadoop, Hive, Pig and R is being leveraged for Big Data. Hadoop was developed as a framework for the distributed processing of large data sets and for scaling upward, and it can handle data from diverse systems, whether structured, unstructured or media. NoSQL databases are being used to store data that is not structured. In addition, there are vendors who offer proprietary Hadoop-based solutions. The choice between a proprietary and an open source solution depends on many factors and requires a thorough assessment.
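To make the distributed-processing model concrete, here is a minimal sketch of a MapReduce word count written as Hadoop Streaming scripts in Python. The file names, sample input and local test pipeline are illustrative assumptions, not drawn from any specific deployment.

```python
#!/usr/bin/env python
# mapper.py -- emits a (word, 1) pair for every word read from stdin.
# Hadoop Streaming exchanges records as tab-separated text lines.
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print("%s\t%d" % (word.lower(), 1))
```

```python
#!/usr/bin/env python
# reducer.py -- sums the counts for each word.
# Hadoop sorts mapper output by key, so identical words arrive together.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t", 1)
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print("%s\t%d" % (current_word, current_count))
        current_word, current_count = word, int(count)

if current_word is not None:
    print("%s\t%d" % (current_word, current_count))
```

The same pair of scripts can be tested locally without a cluster (cat input.txt | python mapper.py | sort | python reducer.py), which is one reason the streaming model lowers the barrier to entry for Big Data development.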

Systems that process Big Data need the Cloud for rapid provisioning and deployment. The elastic and scalable nature of the Cloud supports the storage and management of massive amounts of data. Data can be ingested directly into a cloud-based storage solution, or database adapters can be used to pull it from databases into Hadoop, Pig and Hive. Vendors also offer data transfer services that move Big Data to and from the Cloud. The Cloud adds dynamic computing, elasticity, self-service and measured service, among other characteristics, enabling rapid provisioning and on-demand access. Cloud solutions may offer lower life-cycle costs based on usage, and their monitoring capabilities can provide a holistic view of usage, cost assessments and chargeback information. All of this information enhances an organization's ability to plan and react to change based on performance and capacity metrics.
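As one illustration of pairing Big Data processing with cloud storage, the sketch below stages a local data extract in Amazon S3 using the boto3 library; S3 is just one representative vendor option, and the bucket name and file paths are hypothetical.

```python
# Minimal sketch: staging a local dataset in cloud object storage so that
# elastic compute (e.g., a Hadoop cluster) can process it on demand.
# Assumes AWS credentials are already configured; names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Push a local extract into the bucket.
s3.upload_file("daily_extract.csv",
               "example-bigdata-bucket",
               "raw/daily_extract.csv")

# Simple inventory calls like this are the starting point for the
# usage monitoring and chargeback reporting described above.
response = s3.list_objects_v2(Bucket="example-bigdata-bucket", Prefix="raw/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```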

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud. Initiatives already under way around Open Data are driving the development and deployment of innovative applications. Making data accessible enables the creation of new products and services, and the data should be made available in a standardized manner so that developers can utilize it quickly and effectively. Open Data maximizes the value created on top of existing structured and unstructured data.

An Open Data strategy should define specifically what data will be made available, based on the utility of that information. Simply providing massive dumps of data that are hard to use is not the answer; there has to be proper processing that extracts useful information from the data. The published data should support automated processing so that custom applications can be developed, and it should be renderable in formats such as HTML and XML. This promotes not just traditional applications but a greater number of mobile applications as well. Strong emphasis must be placed on security and privacy, since any errors can compromise important information once the data is made accessible. A comprehensive strategy for Big Data, Cloud and Open Data will enable a smooth transition and achieve big wins!
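As a small illustration of data that supports automated processing, the sketch below consumes a JSON feed from a hypothetical open-data endpoint and filters it; the URL and field names are assumptions made for the example.

```python
# Sketch of consuming a standardized open-data feed.
# The endpoint URL and the record fields are hypothetical.
import requests

resp = requests.get("https://data.example.gov/api/permits.json", timeout=30)
resp.raise_for_status()
records = resp.json()

# Because the feed is machine-readable, filtering and aggregation
# for a custom or mobile application take only a few lines.
approved = [r for r in records if r.get("status") == "approved"]
print("Approved permits:", len(approved))
```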

(This article has been extracted from and refers to a blog post. All views and information expressed here do not represent the positions or views of anyone else or any organization.)

More Stories By Ajay Budhraja

Ajay Budhraja has over 23 years in information technology, with experience in areas such as executive leadership, management, strategic planning, enterprise architecture, system architecture, software engineering, training, methodologies, networks and databases. He has provided senior executive leadership for nationwide and global programs and has implemented integrated enterprise information technology solutions.

Ajay has a Master's in Engineering (Computer Science), a Master's in Management and a Bachelor's in Engineering. He is a Project Management Professional certified by the PMI and also holds CICM, CSM, ECM (AIIM) Master, SOA, RUP, SEI-CMMI, ITIL-F and Security+ certifications.

Ajay has led large-scale projects for large organizations and has extensive IT experience in telecom, business, manufacturing, airlines, finance and government. He has delivered Internet-based technology solutions and strategies for e-business platforms, portals, mobile e-business, collaboration and content management. He has worked extensively in application development, infrastructure development, networks and security, and has contributed significantly in the areas of enterprise and business transformation, strategic planning, change management, technology innovation, performance management, Agile management and development, Service Oriented Architecture, and Cloud.

Ajay has led organizations as a senior executive; he is Co-Chair of the Federal SOA COP and has served as President of DOL-APAC and AEA-DC, and as Co-Chair of the Executive Forum at the Federal Executive Institute SES Program. As adjunct faculty, he has taught courses for several universities. He has received many awards, authored articles and presented papers at worldwide conferences.
