Web Performance: 25+ Experts Share Their Advice and Mistakes

By Brian Jackson
Updated on April 10, 2024

We had a lot of great feedback from the community on our previous post, web performance experts to follow online. We thought we would take it a step further and ask those experts to share their advice, the common mistakes they see people making when it comes to web performance, and which optimizations you should spend time prioritizing.

Web performance questions

We asked each expert to answer the following two questions.

  1. If there was only one web performance optimization you could focus on, what would it be?
  2. What are some common mistakes you see people making when it comes to web performance?

Expert answers

The people on this list appear in no particular order.

Tim Evko


Lead Frontend Engineer at BaubleBar. Podcaster. Creator of RICG-responsive-images. Writer on CSS-Tricks, Smashing Magazine, SitePoint.

Answer 1 - Advice

Over-engineering applications, specifically the overuse of frameworks and libraries to perform menial tasks. This most commonly leads to decreased speed, poorer mobile support, and heavier web pages.

Answer 2 - Mistakes

Lack of progressive enhancement, failure to properly manage dependencies, not implementing a responsive image solution, and ignoring third party content sizes.

Una Kravets

@Una / una.im

Frontend Developer at IBMDesign.

Answer 1 - Advice

Images! Developers often focus on improving scripting performance, but they need to realize that the bulk of their performance woes come from media content. The most dramatic performance improvements can come from optimizing content, using proper file formats, and leveraging progressive rendering.

Answer 2 - Mistakes

Mindlessly adding frameworks to websites for small features is unfortunately a trend that we keep seeing and that keeps driving up web page sizes. Performance needs to be thought about in the beginning of a product's lifecycle, not at the end.

Jeff Atwood

@codinghorror / blog.codinghorror.com

Co-founder of Stack Exchange and Discourse.

Answer 1 - Advice

HTTP/2 adoption across the board - huge improvements for everyone.

Answer 2 - Mistakes

Failing to run basic web performance checks using freely available web performance tools like Google PageSpeed.

Dean Hume

@DeanoHume / deanhume.com

Software developer. Author of Fast ASP.NET Websites, a book aimed at improving the performance of high transaction websites.

Answer 1 - Advice

If there was only one optimization to focus on it would be images! According to the HTTP archive they make up over 60% of the weight of a web page, so it makes sense to focus there.

Answer 2 - Mistakes

Some of the most common mistakes are really simple ones to make. They include not optimizing images and not turning Gzip on; these are such quick wins that everyone should do them!

Patrick Meenan

@patmeenan / webpagetest.org

Creator of WebPageTest, Chrome Engineer at Google.

Answer 1 - Advice

Deliver the minimum bits needed for the user experience first. This may be cheating a bit since it technically includes image optimization, compression, async JS, lazy loading images, and most of the typical optimizations, but I think it works as a useful lens when looking at a site and to guide the optimization work. As you look through a waterfall (preferably aligned with the filmstrip of the page loading), look at each request and see if it is necessary yet or if it is competing with something later that is needed for the experience (and if it is as optimized as it could be). That will help keep the focus on the user experience and will lead to a much better result than tracking something like the onload time.

Answer 2 - Mistakes

For the most part, all of the issues I see are still classic optimization 101 issues (even on larger sites).

Horrible performance on cheap hosting. This isn't usually a problem for medium/large sites, but the long tail of the web is usually hosted on really cheap shared hosting plans where the servers are horribly oversubscribed. It's not unusual to see backend times well over 5 seconds. For everything that isn't just a hobby site, it's worth investing the few dollars that even a low-end VPS costs (though at times that takes more technical skill than the site owners have).

Poor image compression. This spans everything from serving 24-bit PNGs for photos to high-resolution, maximum-quality JPEGs instead of a reasonable quality level. Browser-specific formats would be nice, but even getting regular image compression working consistently is a challenge, particularly for content sites where editors can upload article photos.

Explosion of third party dependencies. Chat bots, analytics, retargeting, ads, social widgets, etc. The number of domains that are relied on for assembling pages these days is exploding and is going to eliminate a lot of the optimizations and benefit that HTTP/2 provides.

Lots of external scripts and CSS, loaded in the head, synchronously. Tag managers loaded synchronously. They completely obscure the content from the browser's preload scanner and cause huge blocking delays as they document.write more blocking third party scripts.

Sara Soueidan

@SaraSoueidan / sarasoueidan.com

Frontend developer and SVG advocate.

Answer 1 - Advice

Image optimization. Optimize and use responsive images (the <picture> element and the srcset attribute). I'm almost always on a slow connection, and the amount of time it takes images to load on most websites is just horrid.
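Generating the srcset candidates can be automated. As a rough sketch, here is a hypothetical helper (the function name and the "name-width.jpg" file-naming scheme are assumptions for illustration, not part of any particular toolchain) that builds a srcset value from a list of pre-rendered widths:

```javascript
// Hypothetical helper: build a srcset value from a base file name and a list
// of widths. The "name-<width>.jpg" naming scheme is an assumption.
function buildSrcset(baseName, widths) {
  return widths.map((w) => `${baseName}-${w}.jpg ${w}w`).join(', ');
}

// The browser picks the smallest adequate candidate, e.g. in:
// <img src="hero-960.jpg" srcset="..." sizes="100vw" alt="Hero image">
console.log(buildSrcset('hero', [480, 960, 1920]));
```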

Answer 2 - Mistakes

Not optimizing images, and loading too many unused font faces/variations of a font. I find that these two are the most common causes of long page loading times in the vast majority of websites I browse on the web.

Anselm Hannemann

@helloanselm / helloanselm.com

Frontend developer creating solid, scalable code architectures. Curator of the WDRL Newsletter.

Answer 1 - Advice

The most important thing about web performance is to have a proper technical concept before starting a project. If you keep the principles of the web and browsers in mind, such as the fetching, parsing, and loading order of resources, that is the biggest improvement to performance you can get, and it's far more effective than micro-optimizations like minification. Invest in delivering your resources to the browser as fast as possible, and, for example, don't serve styles via JavaScript unless you have a strong reason to do so as a performance improvement.

Answer 2 - Mistakes

People tend to do a lot of micro-optimizations, but because we have so many things we can optimize these days, they miss out on the really important performance improvements and make the whole application more complex due to those micro-optimizations.

Patrick Sexton

@PatrickSexton / varvys.com

Creator of GetListed.org and now runs Varvys, free SEO and web perf tools.

Answer 1 - Advice

Images. Optimizing and deferring images will speed up any web page with very little pain.

Answer 2 - Mistakes

I think the most common mistake is when frameworks are used for things that can be done without them. I have seen people add jQuery for some crazy reasons, like just to move an image up a couple of pixels or some other simple task. Frameworks for CSS are also all too common. We seem to forget that we are making web pages for the user, not for the convenience of the designer or coder. The "convenience" of frameworks is why web pages are getting larger and larger, and it is by far the most common problem in web performance today.

Christian Heilmann

@codepo8 / christianheilmann.com

Developer Evangelist - Works at Microsoft on Edge.

Answer 1 - Advice

Images. Images are by far the biggest problem we have on the web. Learn about optimization, using different image formats, set up your CMS to allow for responsive images, stop using an image for everything.

Answer 2 - Mistakes

A lot of people do a good job optimizing their main product, but then they clog it up with third-party scripts, includes, and services that aren't performance optimized. You don't need a button for each social network - stop using those. It is still fashionable to have scrolling (parallax) pages, but people forget to tell the browser, with will-change or transforms, to create its own layer for the fixed part. This is incredibly important. Most importantly, people test their product on high-end hardware with fast connections. Spend some money buying mid- to low-range devices and test on those. Throttle your connections in developer tools. Your setup is very much not what your end users have.

Peter Cooper


Publisher-in-chief at CooperPress, which publishes Web Operations Weekly. Software developer and code experimenter.

Answer 1 - Advice

The user experience. While numbers provide an important way to measure the effects of optimizations, ultimately it's (usually) real people who use our sites and the most important thing is their perception and experience. If including an extra element helps them perceive the site as loading faster, even if it's not, by the numbers, that element is welcome IMHO.

Answer 2 - Mistakes

As an industry, we have a tendency to swing too far one way or another when different trends become popular. I'm starting to get the feeling a focus on web performance is coming, temporarily, at the expense of thinking about Web architecture and overall design of sites. A page may load quickly, but if it's not working in the context of a cohesive site that does what the end user wants, all that speed is a waste of time.

Another possible issue is when people focus on making their homepages load as quickly as possible, but don't invest as much effort elsewhere. Thanks to social media, the importance of the homepage is falling, and users are increasingly being driven directly to content or product pages. Focus where the users are coming into your site, not on the "headline" locations that the CXO types might be worried about.

Kent Alstad

@kentalstad / cloud.army

VP of Acceleration at Radware. Tech innovator & performance researcher with a (seemingly) simple goal: to make the Internet faster.

Answer 1 - Advice

It is hard for me to narrow the focus to a single optimization. I think there are some key areas to focus on: images are a large part of most applications, so delivering the best image for each user context and caching the images at the right locations for reuse usually results in big wins. Fixing or decoupling third-party component performance from your application's user experience keeps application SLAs in your control. And lastly, think about HTTP/2-related optimizations that move many traditional HTML transformation optimizations to the network layer, where the optimizations are less intrusive for developers and often more effective.

Answer 2 - Mistakes

Failing to realize that measuring performance, and the related optimization improvements, is often just as difficult (both technically and conceptually) as making the improvements themselves. Sort of a twist on "you can't manage what you can't measure." For example, you may make a change and see RUM stats take a turn for the worse when you are only 40% confident that you are 5% slower with your new optimized code. It turns out that measurement, and education around what is required for sound statistical decision making, are not simple things to communicate, and not everyone understands the results and measurements at the level required to make good decisions.

To address this problem you can agree on measurement criteria up front so you can work effectively through the cycles of making improvements and then measuring them. Another common mistake is letting developers react to performance problems with their old familiar tool, namely writing or rewriting code. The first response should always be to measure and isolate the problem. Most of the time the problems are in the frontend code and are unrelated to server-side code performance, and yet I see a lot of server code rewritten in the name of performance.

I see third-party resources confusing a lot of performance teams. Often marketing and other non-dev-ops departments control the addition of new third-party beacons or dependencies, and when those third parties fail or get slow, the developers are not prepared or informed enough to fix the problems. Sites that are concerned with performance and rely on third parties need a strategy for how third-party slowdowns or failures will be handled.
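One way to implement such a strategy is to cap how long the page will wait for any third party. This sketch is my own illustration (not a technique described above): it races a third-party promise against a timeout and falls back to a safe default so a slow beacon or widget can't stall the experience.

```javascript
// Race a third-party resource against a timeout; resolve with a fallback
// value if the third party is too slow. A sketch, not production code.
function withTimeout(promise, ms, fallback) {
  const timer = new Promise((resolve) => setTimeout(() => resolve(fallback), ms));
  return Promise.race([promise, timer]);
}

// Simulate a third-party widget that takes 300 ms; we only give it 50 ms.
const slowWidget = new Promise((resolve) =>
  setTimeout(() => resolve('widget loaded'), 300)
);

withTimeout(slowWidget, 50, 'widget skipped').then((result) => {
  console.log(result); // 'widget skipped' - the page moves on without the widget
});
```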

Maximiliano Firtman

@firt / firt.mobi

Mobile+web development & consulting. Author of Programming the Mobile Web & jQuery Mobile, from O'Reilly.

Answer 1 - Advice

Apply extreme techniques for mobile, including delivering above-the-fold (ATF) content in 1 second, whatever it takes.

Answer 2 - Mistakes

  • Not measuring on cellular networks and real mobile devices.
  • Treating responsive web design as the goal to achieve and losing sight of the real goal: performance.
  • Underestimating mobile browsers and web views.
  • Overusing web fonts.

Andreas Grabner

@grabnerandi / apmblog.dynatrace.com

Andreas Grabner is a software geek and blogs about performance at Dynatrace.

Answer 1 - Advice

On every page/feature think about "what's the bare minimum we need for this to be functional and usable" -> then get rid of the rest!

Answer 2 - Mistakes

  • "Works in my browser" attitude -> just because you use Chrome or Safari doesn't mean that the whole world uses these browsers!
  • Give the app to your mom and see if she can work with it without asking you what to do next to complete the task.
  • Never testing with low bandwidth or different devices -> that would reveal too-large (in size and dimensions) images.
  • Devs often don't think about the Total Cost of a Feature for a single user and the main use cases in their app: how much data do we ask users to download over their mobile connection, and how much do we need to pay the CDN for that? How much CPU and storage do we need to pay our cloud provider for?
  • Dependency on too many third-party services/components.
  • No proper cache layers (browser, CDN, web server, and app server).
  • Wrong sizing of connection pools on web servers, app servers, and DB servers.
  • Lack of load and performance testing before going live!
  • Functional green doesn't mean ready to deploy. Extend your test horizon from "browser only" to include the whole delivery chain.
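The "Total Cost of a Feature" point lends itself to back-of-the-envelope arithmetic. The traffic numbers and CDN price below are made-up assumptions purely for illustration:

```javascript
// Rough monthly CDN bill for delivering one feature's bytes to every visitor.
// All inputs here are illustrative assumptions, not real prices or traffic.
function monthlyCdnCost(pageViews, featureBytes, pricePerGB) {
  const gbTransferred = (pageViews * featureBytes) / (1024 ** 3);
  return gbTransferred * pricePerGB;
}

// A 300 KB third-party widget, 10 million page views/month, $0.08 per GB:
const cost = monthlyCdnCost(10_000_000, 300 * 1024, 0.08);
console.log(`~$${cost.toFixed(2)}/month just to deliver this one feature`);
```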

Jason Grigsby

@grigs / cloudfour.com

Co-Founder of Cloud Four. Passionate about the mobile web.

Answer 1 - Advice

Gzip. It has a huge impact on transfer size and is insanely simple to enable on most web servers. Yet there is still a shocking number of websites that don't use Gzip, and there is no excuse for it. So while it may not be glamorous, Gzip is the first thing people should do.

Answer 2 - Mistakes

The biggest mistakes come from not working on a culture of performance and instead assigning one or two individuals. Lara Hogan has done a great job of describing the difference between having a janitor cleaning up messes versus having an organization focused on performance. You want the latter, but most companies do the former.

Denys Mishunov

@mishunov / mishunov.me

Frontend developer at FastName and author at Smashing Magazine.

Answer 1 - Advice

Definitely, it would not be performance optimization itself, but the *perception* of performance. More often than not, developers overlook the benefits that psychology provides them with. Psychology can give simple yet effective tools for managing the user's perception of performance. This is not to say that we should ignore performance optimization in code, assets, and so on; we definitely should do that as well in order to save server time, loading time, and bandwidth, and honor the other attributes of the "responsible web". But optimizing all of that takes time. Psychology buys developers precious time to do it all while keeping users satisfied right here, right now.

Answer 2 - Mistakes

Developers concentrate too much on milliseconds, bytes, and the number of requests. While these are an essential part of what we call "performance" these days, they do not constitute a performant site on their own. First of all we should think about how users perceive our sites, not about how many bytes we can save. Only after users' perception is under control can we talk about milliseconds, bytes, and the number of requests.

Philip Tellis

@bluesmoon / tech.bluesmoon.info

Chief Architect at SOASTA and Co-Founder of LogNormal.

Answer 1 - Advice

Make UIs smooth.

Answer 2 - Mistakes

Not measuring. Too often people fix what they think is a problem based on a bunch of rules without first measuring to set a baseline and then quantify any improvements or regressions.

Further expert opinions

We've expanded our survey to get even more expert input on performance. In doing so, we asked questions about what web developers should and shouldn't do to improve the speed of their websites.

Chris Coyier

@chriscoyier / css-tricks.com

Web designer and developer. Built CSS-Tricks and co-founded CodePen.


There are some performance best practices that are a lot of work if you have to implement them entirely manually, but that can be done better, with little work, if automated.

Say for example you have a WordPress site. WordPress hooks you up with responsive images (<img srcset>) stuff out of the box, which is wonderful for performance. But install the Jetpack plugin, and you can, with the flip of a switch, also serve your images from the WordPress CDN (and optimize them at the same time). That's powerful stuff for almost no effort. Another switch and you are lazy loading your images with Jetpack. Again a huge performance boon for little work.

I love nerding out about performance tweaks you can make, but my favorites are always the ones with big impact and little effort.

Stefan Judis


Frontend developer and curator of perf-tooling.today.


One old practice immediately coming to mind is image spriting. HTTP/2 adoption today is pretty good, and thanks to multiplexing, several HTTP requests can share one TCP connection; the cost of extra requests decreases, and assets can be cached individually. Say a final goodbye to huge image sprites!

Another outdated practice is icon fonts. Icon fonts have several downsides - they suffer from FOIT (Flash of Invisible Text), harming user experience, and when you're not careful they're very harmful to accessibility. SVG is supported cross-platform today and is definitely the way to go!

And the last practice coming to mind is focusing too much on metrics that don't reflect user experience. Page load time is one of the metrics that doesn't tell the full story. For metrics that better reflect user experience, try looking at time to first paint, time to interactive, and the Speed Index provided by WebPageTest. They are all valuable metrics that give you more information on how your users experience your site.

In the end, web performance optimization is always about shipping as little as possible, as fast as possible. Following these two principles, I think you'll mostly be good to go.

Suggestions for improving web performance:

The tooling around web performance has evolved drastically over the last year, so my number one suggestion is to get to know great tools like Lighthouse from Google, Sonarwhal from Microsoft, and WebPageTest. They all show possible improvements while providing useful resources to find information about best practices.

Try these tools and read, learn, improve!

Peter Hedenskog

@soulislove / sitespeed.io

Part of the performance team at Wikimedia and creator of Sitespeed - a set of open source tools that makes it easy to monitor and measure the performance of your website.


Make sure you don't follow the old YSlow advice "Put JavaScripts at the Bottom". You should always async/defer loading your JavaScript files. And never ever be dependent on JavaScript to render your page.

Before you start optimizing your website you need to make sure you continuously measure the performance of your site (if you don't do that already). Measure from real users (RUM) and do synthetic testing. Make sure the tools you use are GDPR compliant (if you have users that are based in the EU) so that you can continue to use them after May 25th, 2018. When you measure and feel confident in the metrics you can start to optimize.

Stefan Baumgartner

@ddprrt / fettblog.eu

Web developer/web lover at Dynatrace and co-host at the German Workingdraft podcast.


To optimize your website for performance, there is no way around HTTP/2. Activate HTTP/2 on your servers, then look closely at how the transport of your assets has changed. Chances are that you get immediate benefits from multiplexed streams over a single TCP connection. Then start to tear your website's resources apart. Don't transport everything at once; just deliver what's needed on that particular page. Stop concatenating files just for the sake of saving one TCP connection. Be picky and remember: the best request is no request.

Jem Young


Software engineer at Netflix and recent speaker at the #PerfMatters conference on the topic of "Modern Performance in the Year of the Dog".


Stop thinking a framework or library will solve your performance problems. Measure, identify problem areas, and address any issues. True performance comes from taking a holistic view of your website and empathizing with your user base.

Start treating mobile as a first-class citizen. The next 4 billion people joining the internet will be on mobile devices, so when you think about performance, think mobile performance.

Léonie Watson

@LeonieWatson / tink.uk

Accessibility engineer, W3C Web Platform WG co-chair, and recent speaker at the #PerfMatters conference on the topic of "there is more to performance than meets the eye".


More things than you think affect Time To Interaction (TTI). When someone is running an Assistive Technology (AT) like a screen reader, browsers behave differently and this has an impact on performance. As well as creating the DOM, the browser creates an accessibility tree and, depending on the browser, gives the screen reader different ways to access the information in the accessibility tree and present that information to the user. With large pages, this can add several seconds to the TTI for screen reader users, even in the latest versions of Chrome and Firefox.

Brian Jackson


Blogger at Woorkup, and developer of the perfmatters WordPress performance plugin.


I still see many users combining their JavaScript and CSS. In a lot of cases, this is no longer needed because most sites are now running over HTTPS (or should be) and utilizing HTTP/2, which has better support for parallelism. In fact, combining things like this can result in a slower site.

Another outdated practice or misconception I see a lot is thinking a CDN won't have a huge impact on the performance of your site and that it's something you'll get to eventually. If you're serving visitors globally, a CDN is essential to help speed up the delivery of your assets (depending on the locations, I've seen up to a 70% decrease in load times). A CDN is not optional; it should be a normal part of a web developer's stack.

Developers need to stop just including external scripts or services on a whim and determine just how much each might impact the performance of a site. Take Font Awesome for example. While it's probably one of the most popular ways to easily include font icons on your site, it's better to determine which icons you're actually using. If your site is only utilizing 10 out of the hundreds of icons... repackage your icon fonts with a tool like IcoMoon instead. I've seen this drop the size of the file down by over 90%!

In other words... don't just grab the CDN-hosted script because it's the popular thing to do. Take a few moments and determine if that is the best way. A lot of times, hosting something on your own CDN is better, as it eliminates another DNS lookup and you'll have more control over caching of the file. If you take a performance-driven approach to these things, it can quickly result in a much faster site.

Suggestions for improving web performance:

  1. Stop trying to save money by going with cheap web hosting. A lot of times when it comes to hosting you get what you pay for. Your hosting provider is the backbone of your site and probably plays one of the most important roles in just how fast your site will load. If you'd rather spend your time growing your business and site, go with a managed host, especially if you don't have the knowledge to troubleshoot server side related issues.
  2. If you're using PHP, upgrade as fast as you can to the latest versions. Not only will PHP 5.6 be EOL this year in terms of support and security, but PHP 7.2 has been shown to handle up to 3x as many requests. It's also important to note that in regards to WordPress, HHVM is no longer being supported.
  3. Optimize your images. This might sound like a broken record to many, but 50% of an average web page's weight is still made up of images. Finding a good balance of compression and quality is essential for every website.
  4. If you're struggling with WordPress performance you may need to go beyond the typical troubleshooting steps. I recommend utilizing amazing enterprise tools like New Relic which can help pinpoint slow queries or bad code. Check things such as autoloaded data in your wp_options table, corrupt transients, cache-hit ratios, CRON jobs, etc. I can't tell you how many times I've seen corrupt transients and large wp_options tables with unnecessary autoloaded data bring WordPress sites to a crawl.

There is also great software which can help larger and more dynamic WordPress sites such as WooCommerce, community sites, etc. Redis, in terms of caching, allows for the reuse of cached objects rather than requiring the MySQL database to be queried a second time for the same object. This can help decrease load time on the database. ElasticSearch is another one that helps speed up WordPress search by providing an additional indexed layer which is quicker to search than a MySQL query against the database.

Sergey Chernyshev

@sergeyche / sergeychernyshev.com

Web technologist with passion for web performance and open source. Organizer of Meetup: NY Web Performance and WebPerfDays NY.


Don't use technical metrics to represent real users' experience.

The most common approach these days is to measure so-called "page load" time, calculated from the start of navigation to the time when the browser's onLoad event fires. Unfortunately, this event, like many other "technical" events inside the browser, only represents the inner workings of the page, not the experience real users have at that point, because the page is often in an incomplete state not meaningful to the user.

This applies to other technical metrics currently used, from Time To First Byte (TTFB), which measures the time to generate the page on the server, to Time To First Paint, which is a bit better as it represents when the page stops being completely unusable, but still does not really tell us when the experience becomes useful.

There are some automatable metrics emerging, like First Contentful Paint (FCP), which attempts to measure when parts of the DOM are first painted, and Time To Interactive (TTI), which attempts to represent when the browser's CPU is idle enough after First Contentful Paint for the user to start scrolling and interacting with the page.

These are better automated metrics that tools can capture without developer involvement, which is why you see more information about them: tool providers like Google can work with them in bulk without working with individual sites.

The best option, however, is for you, as the developer of a specific site, to measure specific events in your application, like the Time to First Tweet on Twitter or the slightly more generic Pinner Wait Time on Pinterest. Metrics like these will let you track user experience as it relates to your product and the business KPIs you are trying to improve.

You can use the W3C User Timing API and, hopefully in the future, the upcoming Element Timing API (currently supported only by SpeedCurve in their synthetic tools) to report what matters on your site and track it in relation to your performance efforts.

Suggestions for improving web performance:

At this point in time, so many technical solutions and tricks exist to make your site faster, and so many frameworks and technical approaches to make that happen - from traditional server-side rendered pages with inline critical CSS and asynchronously loaded JS, to Single-Page Applications (SPAs) that use the browser history API to update the URL and contents of the page without full deconstruction and reconstruction of the DOM on each navigation, all the way to Progressive Web Apps (PWAs) that use Service Workers to progressively enhance your pages, caching the application shell and data locally and only requesting changes over the network when necessary.

All of these are commendable methods of speeding up the technology, but in my opinion the main solution to slowness lies outside the technical realm itself. We need to solve the overall disregard for performance in the product development lifecycle, which leaves it until the product is built and the resulting app is slow and disappointing, requiring post-development optimization.

The only way for us to change our ways is to start "Designing Speed" the same way we design visual aspects of user experience and branding and the way we design technical solutions for new features.

Only by making speed a first-class citizen in product and technology conversations can we make sure it is built into each product. This is extremely important because, unlike other functional features, speed is much harder to add or remove after development is done, and the cost of rework is sometimes prohibitively high.

This is not unlike the responsive design process, which made us all see sense regarding multi-device support after we neglected it for so long, or accessibility initiatives, which are, unfortunately, still quite neglected by our industry. As Tammy Everts pointed out in her talk about hunting the performance metric unicorn, we need to scale empathy within our organizations, and to do so it is critical to inject speed design into the organization's workflow, together with general education about web performance.

There are not that many resources available about it yet, but you can read my article where I propose Progressive Storyboards as a visual technique, and check out speedpatterns.com, a new and upcoming catalog of speed design patterns (feel free to contribute as it is being built).

Vitaly Friedman

@smashingmag / smashingmagazine.com

Co-founder of Smashing Magazine, a leading online magazine dedicated to design and web development.


I think at this point it's critical to look into what you can achieve with service workers for advanced performance optimization, but in terms of low-hanging fruit, I'd definitely explore optimizing web font loading and deferring/dealing with third-party scripts. Most of the time they slow down the entire experience massively.

Beyond that, obviously examining and deferring JavaScript in general, with code splitting via webpack, for example, has become extremely important for every website with a heavy JavaScript bundle. You might find some useful ideas in a checklist I published recently.
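As a sketch of what code splitting can look like, here is a webpack configuration (illustrative values, not a recommendation for every project) that splits vendor code into its own long-cacheable chunk:

```javascript
// webpack.config.js (sketch): split node_modules code into a separate "vendors"
// chunk so application changes don't invalidate the vendor bundle's cache.
module.exports = {
  entry: './src/index.js',
  output: { filename: '[name].[contenthash].js' },
  optimization: {
    splitChunks: {
      chunks: 'all', // split both initial and dynamically imported chunks
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
        },
      },
    },
  },
};
```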

Harry Roberts

@csswizardry / csswizardry.com

Consultant Frontend Architect: Google, UN, BBC, Kickstarter, Etsy.


There are some very specific things that I would consider anti-patterns that developers should avoid doing. The first two both pertain to loading JavaScript in ways that were previously thought to be beneficial to performance, but actually happen to be a net loss.

The first is using document.write to instantiate a new script. This can block the parser for whole seconds at a time. Chrome has already begun intervening on slow connections, effectively blocking its usage:

Based on instrumentation in Chrome, we've learned that pages featuring third party scripts inserted via document.write() are typically twice as slow to load than other pages on 2G.

- Google Developers

The second is the use of asynchronous JavaScript snippets to load subsequent files. This is still a surprisingly common practice (even used by Google Analytics) despite better alternatives being available. The issue with a snippet like the one below is that the reference to Google Analytics is actually just a JavaScript string, not an actual reference to a file. This means that the URL is not discovered until the browser has parsed and executed this script block. If the URL were in a regular script tag's src attribute, then the browser's secondary parser (the lookahead pre-parser, or preload scanner) could find the reference much sooner, making everything much faster.

    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-xxxxxxx-x', 'auto');
    ga('send', 'pageview');

Both of these are oddly specific, but I do wish people would stop using them: their time is done.
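The preload-scanner point above can be made concrete with a toy illustration. This is a hypothetical sketch, not a real browser component: it scans raw markup for literal src attributes the way a lookahead pre-parser would, and never executes script, so the URL hidden inside the snippet's JavaScript string is invisible to it.

```javascript
// Toy "preload scanner": the pre-parser only tokenizes markup, it never
// executes JavaScript, so only literal src attributes are discoverable early.
const markup = `
  <script>
    var a = document.createElement('script');
    a.src = 'https://www.google-analytics.com/analytics.js'; // hidden in a JS string
  </script>
  <script async src="https://www.google-analytics.com/analytics.js"></script>
`;

// Collect URLs that appear as literal src attributes in the raw markup.
const discoverable = [...markup.matchAll(/src="([^"]+)"/g)].map((m) => m[1]);

console.log(discoverable);
// Only the static tag's URL is found; the snippet's URL stays invisible
// until the script block is actually parsed and executed.
```

The static `script async src` form gives the scanner something to find, which is exactly why it is the better alternative.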

Suggestions for improving web performance:

  • Move over to HTTP/2: It currently enjoys about 80% global support.
  • Begin thinking about offline: ServiceWorker is wonderful, and it doesn't need to be a complex addition.
  • Consider the next billion users: there are lots of people coming online who don't enjoy the same connectivity that we enjoy in the West.

Matt Shull

@TheMattShull / mattshull.com

Data Science Program Manager at Thinkful and product/performance consultant.


HTTP/2 is allowing us to move away from things like CSS sprites for icons and logos, which is really nice. I've also been working with teams that rely on GIFs for instructional animations, encouraging them to use videos instead; in a lot of cases, the GIFs are larger than the videos! Finally, if you're working with blurred or single-color images, you can reduce their file size by using CSS filters. CSS has made a number of improvements that allow us to create performance-focused experiences.
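The GIF-to-video swap mentioned above usually comes down to a few video attributes. A hypothetical markup sketch (the file names are placeholders):

```html
<!-- A muted, looping, autoplaying video behaves like a GIF but is far smaller.
     File names here are placeholders. -->
<video autoplay loop muted playsinline>
  <source src="instructions.webm" type="video/webm">
  <source src="instructions.mp4" type="video/mp4">
</video>
```

The `muted` attribute is what allows most browsers to autoplay the video without user interaction, and listing WebM before MP4 lets supporting browsers pick the smaller format.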

Suggestions for improving web performance:

Lighthouse reports are a great place to start looking for ways to improve performance. Seeing unused JavaScript and CSS through Lighthouse can be really eye-opening. With the teams I consult for on performance, we've been moving towards user-centric metrics like First Meaningful Paint, First Interactive, and Consistently Interactive, while still monitoring the (usually) tried-and-true metrics of Time to First Byte, Speed Index, and Page Size. One of the most interesting observations I've made working with a few teams is that these metrics can differ drastically depending on network speed. In one particular case I noticed a positive correlation between Consistently Interactive and page weight on a 3G connection or slower, but not at faster network speeds. Measure all the things, on all the speeds.
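The correlation Matt describes is easy to check once you have the measurements. Below is a sketch using a standard Pearson correlation; the sample numbers are made up for illustration and are not real measurements.

```javascript
// Pearson correlation coefficient between two equal-length samples.
function pearson(xs, ys) {
  const mean = (a) => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(xs), my = mean(ys);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < xs.length; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx += (xs[i] - mx) ** 2;
    dy += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Hypothetical sample: page weight (KB) vs. Consistently Interactive (ms)
// measured on a throttled 3G connection.
const weightKB = [310, 540, 890, 1200, 1650];
const ttiMs3g = [2100, 3400, 5200, 7100, 9400];

console.log(pearson(weightKB, ttiMs3g).toFixed(2)); // strongly positive on this sample
```

Running the same analysis on measurements taken at faster network speeds is how you would confirm that the correlation weakens, as Matt observed.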

Aaron Gustafson

@AaronGustafson / aaron-gustafson.com

Web standards advocate at Microsoft. Author of Adaptive Web Design.


I'm hopeful that most developers have made the switch from icon fonts to SVG icons. Not only are SVGs far smaller, they're also more flexible and better supported than icon fonts.

My top performance-related suggestions are really the same ones I've been harping on since we were on dial-up:

  1. Get rid of any extraneous/unnecessary images, scripts, and CSS. Make them fight for their place in your pages.
  2. Optimize your images! First, choose the right format. Provide more performant alternatives like WebP to browsers that support them. Scale them to the sizes you need. Use a tool (or three) to compress the heck out of them. Mark them up as adaptive images (picture and/or srcset and sizes as appropriate) to deliver the best, smallest image possible to your users.
  3. Minify your text-based assets, including your HTML, as much as possible. If you can, pre-compress your files using both Gzip and Zopfli to improve server-side performance.
  4. Concatenate common resources. Even though HTTP/2 streams files faster, combining commonly-referenced files like your global CSS and JavaScript is still a good idea.

One final, newer suggestion is to add a Service Worker, which enables you to handle caching more elegantly and speed up page rendering. For browsers that support this worker type (all current browser versions do), the performance benefits to your users will be huge.
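The caching idea above often boils down to a cache-first strategy. The sketch below is hypothetical: the Cache API and network fetch are passed in as parameters so the logic is visible and testable outside a browser; the trailing comment shows roughly how it would be wired into a real service worker's fetch event.

```javascript
// Cache-first: serve from cache when possible, otherwise fetch and store.
// `cache` stands in for the browser's Cache API, `networkFetch` for fetch.
async function cacheFirst(request, cache, networkFetch) {
  const cached = await cache.match(request);
  if (cached) return cached;                    // cache hit: no network at all
  const response = await networkFetch(request);
  await cache.put(request, response.clone());   // store a copy for next time
  return response;
}

// In sw.js this would be wired up roughly as:
// self.addEventListener('fetch', (e) =>
//   e.respondWith(caches.open('v1').then((c) => cacheFirst(e.request, c, fetch))));
```

Cache-first suits static assets that rarely change; for HTML documents a network-first or stale-while-revalidate strategy is usually safer.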


As you can see above, there is a lot of great advice from people who deal with web performance optimization on a daily basis. One thing to note is the importance of image optimization: almost 50% of the responses emphasized images and how they relate to page weight. Another factor a couple of experts mentioned is "perceived performance." Sometimes we over-analyze the data and speed tests when we need to step back and look at things from the user's perspective.

You can also subscribe to our web performance experts list on Twitter.
