Using Prefetch as a Proactive Approach
By Ashish Kumar

Paul the Octopus was known as an animal oracle who could predict the results of football matches; he made accurate predictions for the matches played in the 2010 FIFA World Cup.

What if a browser could do the same thing? What if it anticipated the next page the user was going to visit and downloaded it in advance? Downloading critical resources ahead of time like this could significantly improve page load performance.

One might say, ‘isn’t that what the cache is for?’ Well yes, it is. Caching helps browsers avoid making expensive HTTP calls by delivering needed resources from the disk cache. However, it presents challenges on modern websites, and there are several situations where caching may not help:

  • First-time visitors: The cache only helps if the user has visited the site before. On a first visit, caching has not yet kicked in.
  • Expired cache: 69% of resources have no cache headers or are cacheable for less than a day. If the user visits a page a second time and the cache has expired, an HTTP request is still needed to check for a fresh copy. Even if the web server replies that the cached copy is still valid, the network delay of that round trip can make the page load slowly.
  • Purged cache: If the browser caches resources for every website it visits, it may purge one site’s cache to make room for another’s.
  • Cleared cache: The cache gets cleared frequently, both by users themselves and by tools on the computer such as antivirus programs.

To address the issues caching leaves open, we need to do something more; this is where pre-browsing, also known as predictive browsing or prefetching, comes into play.

Prefetching defined
Websites today are driven by dynamic content and multimedia. A single webpage may load multiple images and videos from different sources, which can result in long wait times and a poor user experience.

Modern browsers, such as Chrome, address this challenge by downloading content in advance. Content that the user is most likely to access next is downloaded in the background and becomes instantly available when the user needs to interact with it. This performance optimization technique is called prefetching, and it can eliminate DNS lookup times, TCP connection setup, TLS handshakes, and more.

A few methods in practice today include:

  • DNS Prefetching
  • Link Prefetching
  • Prerendering

Types of prefetching
1. DNS Prefetching: A domain lookup can take anywhere from one millisecond to several seconds, depending on the caching done at the different levels involved in the DNS resolution process. DNS prefetching reduces lookup times by resolving in advance the different domains, such as Google Analytics or social media domains, from which the webpage loads its resources.

Browsers such as Chrome and Firefox enable DNS prefetching by default; this allows a page to resolve, in advance, the different domains that deliver content (such as external JS, images, etc.) on the page. Although modern desktop browsers support DNS prefetch, it is still not available on mobile browsers and older IE versions. Also, if an external file references other domains, the browser will not prefetch those until the external file has been downloaded and parsed. In such cases, we can explicitly ask the browser to do a DNS prefetch by adding a dns-prefetch hint to the page’s <head> tag, as shown below:

<link rel="dns-prefetch" href="http://abc.com">

Now, when the browser parses the page and comes across this HTML tag, it immediately performs the DNS lookup in the background for the specified domain. When a resource from this domain is later requested, only the download itself remains; no DNS lookup is needed.
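
For pages that pull content from several third parties, one hint per domain can be declared in the <head>. Here is a minimal sketch; aside from the Google Analytics domain mentioned above, the hostnames are hypothetical stand-ins for whatever domains your page actually uses:

<!-- Resolve third-party domains early (hostnames are illustrative) -->
<link rel="dns-prefetch" href="//www.google-analytics.com">
<link rel="dns-prefetch" href="//fonts.gstatic.com">
<link rel="dns-prefetch" href="//cdn.example.com">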

Let’s take the example of a user visiting Google Search and searching for “Flipkart iPhone 7.” It is very likely that the user will click the www.flipkart.com link in the search results. In this case, Google can prefetch DNS for this domain, saving the DNS lookup time when the user visits www.flipkart.com.

[Snapshot: DNS prefetch implemented on Google’s search results pages.]

2. Link Prefetching: This technique downloads entire resources in advance, including fonts, images, and scripts. If we are confident that the user will navigate to flipkart.com and we know the site’s critical resources, then we can download those resources ahead of time using prefetch:

<link rel="prefetch" href="http://www.flipkart.com/critical.js">

Currently, the implementation is not consistent across browsers, and the spec is vague, so you may see different behavior, such as:

  • Firefox downloads one prefetch at a time, whenever it is idle. Chrome, on the other hand, downloads up to 10 resources in parallel.
  • The Android browser and Firefox (desktop and mobile) start prefetching only after “window.onload” fires, but Chrome starts immediately, which can compete with the current page’s resource downloads for TCP connections.

It is best to prefetch only those resources that are critical to the next page and cacheable.
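
As a sketch of what that might look like: the file names below are hypothetical, and the as and crossorigin attributes are optional hints that some browsers honor rather than a required part of the technique:

<!-- Prefetch critical, cacheable assets for the likely next page (file names are hypothetical) -->
<link rel="prefetch" href="http://www.flipkart.com/critical.js" as="script">
<link rel="prefetch" href="http://www.flipkart.com/hero-banner.jpg" as="image">
<!-- Fonts are fetched with CORS, so crossorigin is needed for the prefetched copy to be reused -->
<link rel="prefetch" href="http://www.flipkart.com/brand-font.woff2" as="font" crossorigin>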

3. Prerendering: This technique goes a step further and downloads the entire webpage, caching it in the background. With prerendering, the browser builds the DOM tree, applies CSS, executes JS, and so on.

In the same example that we considered above, if we are certain that the user’s next action will be to visit www.flipkart.com, then we can prerender the entire webpage like this:

<link rel="prerender" href="http://www.flipkart.com/">

When the browser encounters this hint, it loads the webpage and all the necessary resources in a hidden tab. If the user visits the specified URL, the current tab is replaced with the hidden one, making the webpage visible instantly. Google Search has had this feature for years under the name Instant Pages, and Microsoft recently announced a similar prerendering feature for Bing on IE11.
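
Since prerender support varies across browsers, one hedge is to declare both hints for the same URL and let each browser apply the strongest one it understands. This is a sketch of the pattern, not a guaranteed fallback chain:

<!-- Browsers that support prerender use it; others may still honor the prefetch hint -->
<link rel="prerender" href="http://www.flipkart.com/">
<link rel="prefetch" href="http://www.flipkart.com/">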

[Image: browser support matrix for the pre-browsing techniques mentioned above.]

It is very important to proactively monitor webpages to verify whether these performance optimization techniques are actually working. Catchpoint’s digital experience intelligence platform helps you monitor them actively (synthetic monitoring) and passively (real user measurement).
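
For a quick client-side spot check, independent of any monitoring platform, the Resource Timing API can hint at whether a prefetched asset was later served from cache. A minimal sketch, assuming a hypothetical critical.js was prefetched earlier:

<script>
// Sketch: inspect how a (hypothetical) prefetched resource was actually fetched.
window.addEventListener("load", function () {
  performance.getEntriesByType("resource").forEach(function (entry) {
    if (entry.name.indexOf("critical.js") !== -1) {
      // transferSize of 0 usually indicates a cache hit; note it is also 0 for
      // cross-origin responses that lack a Timing-Allow-Origin header.
      console.log(entry.name,
                  "duration: " + entry.duration.toFixed(1) + " ms",
                  "transferSize: " + entry.transferSize);
    }
  });
});
</script>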


