
The Facebook Open Graph Announcement: What’s Not to “Like”

Facebook made an audacious and smart positioning move to grab the social media high ground

Now that the collective tech/social media world is coming off the sugar rush of Facebook’s big Open Graph announcements of last week, I thought I’d take a fresh look. I should say that I am 41 years old and an active user of Facebook. I certainly see the powerful implications of sharing subjective information across a social network (graph). What I am not 100% sure about is whether anyone is stepping back and questioning the viability of their approach.

Facebook made an audacious and smart positioning move to grab the social media high ground, but what are the real benefits to consumers? To Brands? To Facebook? Did Facebook really just “Win The Web”, as the New York Times proclaimed?

Not New News
First of all, last week’s thunderstruck, gushing adoration among industry insiders and the press seems somewhat odd, given that Facebook introduced the Open Graph concept at its Developer Garage on October 28, 2009. Here’s an article by Nick O’Neill that outlines what he rightly categorized, at the time, as “part of a broader move by Facebook”.

From Facebook at the time: “The Open Graph API will allow any page on the Web to have all the features of a Facebook Page…it will show up on that user’s profile and in search results, and that page will be able to publish stories to the stream of its fans.”

So, why then the wild enthusiasm and assertions? Perhaps the youthful, Zuckerberg-led Facebook team best embodies the promise of social networks to finally dethrone some of the entrenched ad-supported superpowers like Google.

A Google Killer?
Is this move so profound that it will allow Facebook to collect enough information to power a “social search engine” and, as such, topple Google? There is no doubt that Facebook is in a position to learn, store and categorize opt-in personal preferences of individuals and utilize them to great advertising sales advantage. But there are major differences between these companies.

Google makes $23 billion annually by giving people useful personal and business tools and serving effective, unobtrusive contextual ads in exchange for the use of those tools. Google increased Gmail users by 43% last year, and their list of services is impressive and growing. Google also has direct connections with the local retail points of sale that are so important for tracking incremental purchases and, as I have written about, is well along the path toward deciding on the right way to collect, measure, and capitalize on these metrics.

Facebook, on the other hand, is not useful. Yes, I said it. Facebook is fun, interesting, and a novel, socially relevant way to correspond with others who share some commonality. But useful? No. Over 37% of all people signed up for Facebook are inactive. That’s 150 million of them. Facebook made an estimated $650 million last year and certainly has a lot of traffic, but they have not yet capitalized in a way that comes even close to challenging Google. The fundamental value proposition Facebook offers consumers is different. Try not checking your Facebook page for a week and see what happens. Facebook is a nice-to-have and, as such, needs to be incrementally more thoughtful about what they do and how they do it.

Size Matters
There were other implied assertions Facebook made last week that I question. Namely, that growth and unique appeal can coexist. Facebook rocketed to popularity by mimicking the same voyeuristic appeal as the original printed freshman facebooks most of us used to peruse the social landscape back in college. But after freshman year, the book became irrelevant. Why? Because it was the small size of the graph that made the details of the graph highly relevant. If the network grows and becomes indistinct, it loses its effectiveness, and the stream of information becomes cloudy and irrelevant in the context of a broader network, no longer carrying the wow of that initial relevance.

For brands, the “fan page” acts as a tighter circle of consumer interaction and an opt-in sub-network within the broader context of the web. Consumers have to “become a fan”, and the thoughtful act of doing this makes the sub-network powerful and relevant to the brand and others within it. Facebook’s switch to the “Like” button was designed to make it easier for people to convey their preferences. This also has the potential negative side effect of broadening the input stream of consumers to specific sub-networks and clouding the waters by making the size of the pool exponentially larger and, as such, less meaningful. The more the merrier for Facebook, as this grows the audience to whom they will serve ads and pads their knowledge about every Facebook user. But it could dilute the opt-in pool for brands and clog the feedback loops.

The Like Button Is Too Easy
The sharing of subjective opinions and preferences based on real-world interactions with products and services is the real power of social media (and location-based marketing). Ratings and reviews are currently the best example of how consumers interact with real places and share input, but it does not take much imagination to see that real-world interaction with a wider range of products and services is coming soon.

Providing this subjective input takes a minute or two, and this fact (especially when consumers are mobile) serves as a deterrent to flip or casual positive or negative inputs. The “Like” button allows instant input, with less thought, all designed to rapidly fill Facebook’s master database. Great for Facebook and their advertising machinery plans, but the user experience (in the form of people’s news feeds) could well become clogged with a deluge of “likes” that become less impactful in direct proportion to how often the too-easy “Like” button is used.

Personal Preference Profile Probes
What Facebook has announced is very smart, but it requires compliance by companies and brands. They are essentially telling any company that has a web page dedicated to something someone would “Like” to infuse Facebook code into that page, with specific metadata tags that categorize the real-world product shown. This is very good for Facebook, but it essentially means web pages need to insert little “probes” under their skin that feed a stream of data back to the Facebook mothership. Will companies and brands do this?
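For context, the “probes” are ordinary HTML metadata tags plus the button embed itself. A brand’s product page would carry markup roughly like the following (a sketch based on the Open Graph protocol as announced; the page, product name, URLs, and `USER_ID` are hypothetical, and the exact tags and Like button embed have varied over time):

```html
<html xmlns:og="http://ogp.me/ns#" xmlns:fb="http://www.facebook.com/2008/fbml">
<head>
  <title>Acme Trail Runner 2</title>
  <!-- Open Graph tags: tell Facebook what real-world thing this page represents -->
  <meta property="og:title" content="Acme Trail Runner 2"/>
  <meta property="og:type"  content="product"/>
  <meta property="og:url"   content="http://www.example.com/shoes/trail-runner-2"/>
  <meta property="og:image" content="http://www.example.com/img/trail-runner-2.jpg"/>
  <!-- Ties the page to a Facebook account so it can publish to fans' streams -->
  <meta property="fb:admins" content="USER_ID"/>
</head>
<body>
  <!-- The "Like" button itself: every click flows back to Facebook -->
  <iframe src="http://www.facebook.com/plugins/like.php?href=http%3A%2F%2Fwww.example.com%2Fshoes%2Ftrail-runner-2"
          scrolling="no" frameborder="0" style="border:none; width:450px; height:80px"></iframe>
</body>
</html>
```

Every page tagged this way becomes a node in Facebook’s graph: the metadata categorizes the product, and each click of the button feeds a preference signal back to Facebook.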

They might, but they also might realize that they are turning over the keys to the kingdom to the same barbarian at the gate, who will then come back and charge them advertising fees based on the personal preference profile metrics they delivered on a silver platter. Facebook could also do things in the future with this “holy grail” (the personal preference profile) that we cannot conceive of currently. My point is that brands should not jump on this before they carefully weigh the volume of valuable opt-in metrics they will be delivering to Facebook against the benefits.

Content websites should be careful too, as Facebook is sure to sell advertising based on consumer preferences for something they read. Again, very good for Facebook, but it could mean a thorny editorial/sales line in the sand gets crossed: readers could get hit with ads for products tied to an element of the content that does not resonate with them, or the ad could seem to imply a paid connection between the editorial content and the advertiser.

Three Things Not Announced: Location, Location, Location
In a surprise to many (including me), Facebook made no mention last week of location-based marketing and framed their announcements around web-based open graph linkages and, more specifically, the integration of “like” button code on product pages to tap the power of personal preference aggregation. While the “visionary” open graph high ground move got the press, the real pot of gold lies at the end of the point-of-sale rainbow, reached by linking marketing to incremental tracked sales. Brands make money by selling more products in stores, period.

It was widely speculated that the reason Facebook did not announce checkin functionality, QR codes, or NFC to link updates with real-world physical locations last week was that they might buy Gowalla or Foursquare. We now know that Facebook was about to launch a “door sticker” campaign to reach out directly to merchants and is using, of all things, SMS short codes to track consumer interaction and link it to location.

I personally think this is just the beginning and Facebook will dive headlong into the location-verified Proof Of Presence Metrics game soon. But can they pull it off? A simple location-enabled “Like, with comments option” might not be the right move here. It is too flip, too fast, too easy. Again, good for Facebook as they seek to remove friction from the aggregation of personal profile preferences for who, what, and where, but I am not sure members of the social graph want to hear about every checkin and every store, venue, or brand that those in their network simply “Like”.

My Friends All Like Different Things
I know the people in my social network, and I am certainly more interested in hearing their preferences and opinions than the blanket ads I see every day, foisted upon me by those charged with selling the products. This, of course, is the power of social networks to shape consumer behavior. But I also have a solid majority of pals who are not on Facebook. The two I reached both gave me the same answer, which was, essentially, “Facebook is stupid. It’s full of asinine egocentric banter and takes way too much time to deal with.” I sympathize, and often have to weed through posts about spilled milk (literally) and inane random thoughts.

But I also use Facebook for business and have made an effort to be a fan only of pages conveying important, relevant information. I, for one, do not intend to fill my feed with all my “Likes” and hope those who fill my feed will hold off too. Aside from the obvious volume implications, I am not going to be swayed by the fact that someone “Likes” anything. Now, if they took the time to write a review, or checked in on Gowalla and stopped to rave about something, and this was posted with intent, I’d be inclined to take a look. But the click of a “Like” button is too fast, too flip, and too easy, and we all like many, many different things, for different reasons.

I know brands and companies have a different Facebook opportunity to potentially take advantage of, but the people making these social media marketing decisions are usually personal Facebook users as well.

Considering The Implications
Facebook has the traffic and the momentum to do some powerful things. I just hope the collective Social Media/LBS/Mobile world can stop for a minute and consider the positive and negative implications of not just the “open graph”, but also the site-integrated Facebook metadata tags that, if implemented, will feed consumer preferences back to the now-warming Facebook ad engine. Agencies should consider this move carefully on behalf of their clients. I hope Facebook users consider the long- and short-term implications of sharing so much about their personal product preference profiles with Facebook, the privacy issues this raises, and the effect of potentially having volume and size dull down the interactions with others within their network. And I hope Facebook considers the user experience implications and treats the heavy crop of rich, real-time, opt-in metrics they hope to reap with consideration. Easy is not always good.

More Stories By Wilson Kerr

Wilson has 11+ years of experience in the Mobile and Location Based Services (LBS) space. Recently, he became Director of Business Development and Sales for Unbound Commerce, a Boston-based mobile commerce solution provider. He has deep expertise in the areas of mobile commerce, social media, branded location integration, and branded content licensing, and is knowledgeable in a broad range of navigation technologies. Wilson has worked with top tier brands, content providers, device manufacturers, and application developers, including Nokia, Unbound Commerce, Tele Atlas/TomTom, The Travel Channel, Langenscheidt Publishing, Intellistry, Parking In Motion, GPS-POI-US, and others. Wilson is a blogger on all things location-based, edits the LBS topic page on Ulitzer, teaches a Social Media 101 class, and has served as a panelist and speaker at Mobile LBS conferences and networking events. Wilson has held positions in Business Development, Sales/Marketing, and Digital Licensing at The North Face, Outdoor Intelligence, Fishing Hot Spots Maps, Tele Atlas North America/TomTom and, most recently, Unbound Commerce. Wilson left Tele Atlas to start Location Based Strategy, LLC in 2007. Company Website: http://www.LBStrategy.com. Twitter: @WLLK
