15+ Experts Share Their Web Performance Advice

By Cody Arsenault
Updated on September 3, 2022

A couple of years ago, we reached out to a number of web performance experts in the community and asked them two questions: which performance tip would they recommend focusing on, and what are some common performance mistakes?

The web performance advice they provided was top notch and extremely useful to the rest of the performance-driven community. That's why we wanted to reach out to these web performance experts again and get their updated insights.

Web performance questions

For this performance advice post, we focused on the current landscape of web performance. Therefore, we asked questions related to what web developers should and shouldn't do to improve website speed.

This post reveals which practices some of the best web performance experts believe to be outdated as well as their top suggestions for people looking to optimize their site.

Expert answers

The experts on this list appear in no particular order.

Stefan Judis


Frontend developer and curator of perf-tooling.today.


One old practice that immediately comes to mind is image spriting. HTTP/2 adoption today is pretty good, and thanks to multiplexing, several HTTP requests can share one TCP connection; the cost of latency decreases while assets can be cached adequately. Say goodbye to huge image sprites for good!

Another outdated practice is icon fonts. Icon fonts have several downsides - they suffer from FOIT (Flash of Invisible Text), harming user experience, and when you're not careful, they're very harmful to accessibility. SVG is supported cross-platform today and is definitely the way to go!

And the last practice that comes to mind is focusing too much on metrics that don't reflect user experience. Page load time is one of the metrics that doesn't tell the full story. For metrics that better reflect user experience, look at time to first paint, time to interactive, and the Speed Index provided by WebPageTest. These are all valuable metrics that give you more information on how your users experience your site.
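For illustration, the paint metrics mentioned here can be read in the browser via the Performance API. A minimal sketch (the `getPaintMetrics` helper is hypothetical; in a browser you would pass in `window.performance`):

```javascript
// Minimal sketch: collect paint timings from the Paint Timing API.
// getPaintMetrics is a hypothetical helper; in a browser, call it as
// getPaintMetrics(window.performance).
function getPaintMetrics(perf) {
  const metrics = {};
  for (const entry of perf.getEntriesByType('paint')) {
    // Entry names are 'first-paint' and 'first-contentful-paint';
    // startTime is milliseconds since navigation start.
    metrics[entry.name] = entry.startTime;
  }
  return metrics;
}
```

Logging these values (or sending them to your analytics backend) is a quick way to start tracking user-centric metrics in the field.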

In the end, web performance optimization is always about shipping as little as possible, as fast as possible. Follow these two principles and you'll mostly be good to go.

Suggestions for improving web performance:

The tooling around web performance has evolved drastically over the last year, so my number one suggestion is to get to know great tools like Lighthouse from Google, Sonarwhal from Microsoft, and WebPageTest. They all show possible improvements while providing useful resources with information about best practices.

Try these tools and read, learn, improve!

Chris Coyier

@chriscoyier / css-tricks.com

Web designer and developer. Built CSS-Tricks and co-founded CodePen.


There are some performance best practices that are kind of a lot of work if you have to apply them entirely manually, but that can be done better and with little effort if automated.

Say for example you have a WordPress site. WordPress hooks you up with responsive images (<img srcset>) stuff out of the box, which is wonderful for performance. But install the Jetpack plugin, and you can, with the flip of a switch, also serve your images from the WordPress CDN (and optimize them at the same time). That's powerful stuff for almost no effort. Another switch and you are lazy loading your images with Jetpack. Again a huge performance boon for little work.
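Under the hood, lazy-loading plugins like the one mentioned above typically rely on IntersectionObserver. A rough sketch of the idea (not Jetpack's actual code; it assumes images are marked up with a `data-src` placeholder attribute):

```javascript
// Sketch: lazy loading images with IntersectionObserver, the technique
// most lazy-loading plugins use. Assumes markup like <img data-src="…">.
// The ObserverClass parameter exists so the code stays testable; in a
// browser you would pass window.IntersectionObserver.
function lazyLoadImages(doc, ObserverClass) {
  const observer = new ObserverClass((entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real URL once visible
      obs.unobserve(img);        // each image only needs loading once
    }
  });
  for (const img of doc.querySelectorAll('img[data-src]')) {
    observer.observe(img);
  }
  return observer;
}
```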

I love nerding out about performance tweaks you can make, but my favorites are always the ones with big impact and little effort.

Peter Hedenskog

@soulislove / sitespeed.io

Part of the performance team at Wikimedia and creator of Sitespeed - a set of open source tools that makes it easy to monitor and measure the performance of your website.


Make sure you don't follow the old YSlow advice to "put JavaScript at the bottom". You should always async/defer load your JavaScript files, and never ever depend on JavaScript to render your page.
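In static markup this is just an attribute, e.g. `<script defer src="app.js"></script>`. Scripts injected at runtime can be made non-blocking the same way; a hedged sketch (the helper name and URL are illustrative):

```javascript
// Sketch: injecting a script without blocking the HTML parser.
// In static markup, <script defer src="…"> or <script async src="…">
// achieves the same thing; this helper is only illustrative.
function loadScriptAsync(doc, src) {
  const script = doc.createElement('script');
  script.src = src;
  script.async = true; // execute when downloaded, without blocking parsing
  doc.head.appendChild(script);
  return script;
}
```

The difference between the two attributes: `async` executes as soon as the file arrives, while `defer` preserves document order and waits until parsing is done.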

Before you start optimizing your website, you need to make sure you continuously measure the performance of your site (if you don't already). Measure from real users (RUM) and do synthetic testing. Make sure the tools you use are GDPR compliant (if you have users based in the EU) so that you can continue to use them after May 25th, 2018. Once you measure and feel confident in the metrics, you can start to optimize.

Denys Mishunov

@mishunov / mishunov.me

Frontend developer, speaker, and author at Smashing Magazine.


First of all, I don't think there are that many outdated practices in general. Any practice in our industry has served some particular purpose, and at some point in time, it served it well. Yes, some of these techniques eventually get outdated, but when it comes to performance, any practice we would otherwise call "outdated" still helps make a site/project/app faster. So I cannot really think of any truly outdated practice right now.

In general, web performance is an interesting matter in that it's different for each particular project. For example, for a lot of projects with poor server infrastructure and HTTP/1.1, the number of server requests (one of the metrics affecting the performance of a website) might be crucial, and it could happen that doing some heavy computations on the client rather than communicating with such a server gives better performance results. At the same time, for projects running on HTTP/2, this widely-used parameter might not play as significant a role, because more requests can be delivered simultaneously without blocking the page.

That being said, even though web performance practices from the past can still do their job pretty well, there are some metrics we should consider moving away from these days, as we now have more descriptive and practical alternatives. Like the "load" event for window, for example. Or "DOMContentLoaded". At some point, these were the only metrics we had, but nowadays they can easily give a false impression about performance because they do not take the current state of the technology into consideration. More information on the modern state of performance metrics can be found in Tammy Everts' talk The Hunt for the Unicorn Performance Metric.

In web performance, one size doesn't necessarily fit all: you should measure your own project and your own optimizations to achieve the best result. And it might easily be that some "outdated" practice is exactly what gives you the best result.

Suggestions for improving web performance:

The only suggestion I would make: take it easy. I don't mean don't bother about performance, of course. I mean take a small step, measure, take another small step, measure again, and so on. Never try applying several optimizations in one chunk. Otherwise, at best, you risk not knowing what actually led to the performance improvement; at worst, you might improve one thing while causing regressions in a couple of others, and end up with worse performance overall.

Also, always start with analysis: you need to identify the easy solutions and the more complex ones early in the optimization process. Always begin with the easy ones before you start digging into fancy techniques like server-side rendering. As practice shows, easy solutions like image optimization, Resource Hints (probably my favorite "low-hanging fruit" optimization), and deferred resource loading will get you quite far, giving you some breathing room before you get to heavyweights like Service Workers.

So, when it comes to performance, small and easy steps will get you farther than you might think.

Stefan Baumgartner

@ddprrt / fettblog.eu

Web developer/web lover at Dynatrace and co-host at the German Workingdraft podcast.


To optimize your website for performance, there is no way around HTTP/2. Activate HTTP/2 on your servers, then look closely at how the transport of your assets has changed. Chances are you'll get immediate benefits from multiplexed streams over a single TCP connection. Then start to tear your website's resources apart. Don't transport everything at once; just deliver what's needed on that particular page. Stop concatenating files just for the sake of saving one TCP connection. Be picky and remember: the best request is no request.

Maximiliano Firtman

@firt / firt.mobi

Mobile+web development & consulting. Author of Programming the Mobile Web & jQuery Mobile, from O'Reilly.


I'm not sure there are too many outdated practices, as the basics of web performance are still the same. Maybe CSS sprites make no sense anymore, but most of the practices, even bundling all JavaScript important for rendering into one file, are still relevant on HTTP/2. There are some techniques now under discussion, such as the usage of LQIP (Low Quality Image Placeholders) and JPEG compressive images.

Suggestions for improving web performance:

  • Analyze network opportunities: Brotli, HTTP/2, QUIC
  • Help the browser as much as possible: DNS Prefetch, Preconnect hint, Preload only on important assets
  • Code splitting for big apps, but bundling in one file per group
  • Be careful with repaints and the usage of active listeners
  • Use Reactive Web Performance to adapt the experience for different scenarios based on Client Hints and other APIs
  • Use next-generation image formats and ideas: the Zopfli compressor for PNGs, Guetzli for JPEGs, WebP, and videos instead of animated GIFs, among other ideas.
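The browser hints in the second point above are plain `<link>` elements (e.g. `<link rel="preconnect" href="https://cdn.example.com">`). As a hedged sketch, they can also be added at runtime; the helper name is illustrative:

```javascript
// Sketch: resource hints as <link> elements. These usually live in
// static HTML; adding them from script is shown here for illustration.
function addHint(doc, rel, href, asType) {
  const link = doc.createElement('link');
  link.rel = rel;        // 'dns-prefetch', 'preconnect', or 'preload'
  link.href = href;
  if (asType) link.as = asType; // rel=preload requires an "as" type
  doc.head.appendChild(link);
  return link;
}

// e.g. addHint(document, 'preload', '/fonts/main.woff2', 'font');
```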

Peter Cooper


Publisher-in-chief at CooperPress which publishes StatusCode Weekly. Software developer and code experimenter.


Outdated practices:

  • Stop trying to make bad features faster, and focus on getting rid of features entirely. Your pages and apps often don't need to be as heavy as they are.
  • The speed of the underlying language behind your services is a secondary consideration nowadays.
  • A lot of classic JavaScript micro-optimizations have become obsolete as engines continue to improve.
  • Resource inlining, asset concatenation, and splitting assets across subdomains are less important than ever due to HTTP/2.

Suggestions for improving web performance:

  • Get on top of HTTP/2, particularly if your pages/apps have to remain complex in terms of assets, scripts, etc.
  • See if you can go "static". The result will be much faster load times (even if it's at the expense of longer build times) and you'll find it easier to use global CDNs, aggressive caching, and follow other performance best practices.

Dean Hume

@DeanoHume / deanhume.com

Software developer. Author of Fast ASP.NET Websites, a book aimed at improving the performance of high transaction websites.


This is going to sound a little crazy, but I don't really believe there are many (if any) outdated web performance practices. If you think back to Steve Souders' amazing book High Performance Web Sites and his 14 rules for faster loading websites, all of those rules are still applicable today. It's kind of reassuring to know that these rules have stood the test of time! While we have some newer techniques, these original rules are still the foundation.

Suggestions for improving web performance:

  • Get a service worker on your site! With a few lines of code, you can have caching in place that will produce lightning fast response times. The best part is that even if a user visits your site in a browser that doesn't support service workers, it will simply fall back. It's a no-brainer!
  • If you haven't already done so, upgrade your site to use HTTP/2. HTTP/2 uses multiplexing, which allows your browser to fire off multiple requests at once on the same connection and receive the responses back in any order. Compared to HTTP/1.1, this means no multiple connections and much faster response times.
  • If your server supports Brotli - I highly recommend enabling it. Brotli is a compression algorithm that compresses data and can be more effective than Gzip and Deflate in certain circumstances. Zopfli is a good solution for resources that don't change much and are designed to be compressed once and downloaded many times. In fact, at settled.co.uk, we noticed a 10% improvement in the size of the files that we compressed using Brotli.
  • Use responsive images! According to the HTTP archive, images make up around 50% of the average web page. By using responsive images, you can tailor the image sizes to suit the browser's viewport and in turn, save on the total download size of your web pages. You can save your users bandwidth and ensure speedy response times at the same time!
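Registering a service worker, as the first point suggests, really is only a few lines. A minimal sketch with the fallback behavior described above (the `/sw.js` path is illustrative):

```javascript
// Sketch: service worker registration with graceful fallback.
// The '/sw.js' path is illustrative; browsers without support simply
// keep loading pages over the network as before.
function registerServiceWorker(nav, scriptUrl) {
  if (!nav || !('serviceWorker' in nav)) {
    return Promise.resolve(null); // no support: fall back silently
  }
  return nav.serviceWorker.register(scriptUrl);
}

// In a browser: registerServiceWorker(navigator, '/sw.js');
```

The caching logic itself then lives inside the worker script, where you decide which fetches to serve from the cache.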

Jem Young


Software engineer at Netflix and recent speaker at the #PerfMatters conference on the topic of "Modern Performance in the Year of the Dog".


Stop thinking a framework or library will solve your performance problems. Measure, identify problem areas, and address any issues. True performance comes from taking a holistic view of your website and empathizing with your user base.

Start treating mobile as a first-class citizen. The next 4 billion people joining the internet will be on mobile devices, so when you think about performance, think mobile performance.

Léonie Watson

@LeonieWatson / tink.uk

Accessibility engineer, W3C Web Platform WG co-chair, and recent speaker at the #PerfMatters conference on the topic of "there is more to performance than meets the eye".


More things than you think affect Time To Interaction (TTI). When someone is running an Assistive Technology (AT) like a screen reader, browsers behave differently and this has an impact on performance. As well as creating the DOM, the browser creates an accessibility tree and, depending on the browser, gives the screen reader different ways to access the information in the accessibility tree and present that information to the user. With large pages, this can add several seconds to the TTI for screen reader users, even in the latest versions of Chrome and Firefox.

Brian Jackson


Blogger at Woorkup, and developer of the perfmatters WordPress performance plugin.


I still see many users combining their JavaScript and CSS files. In a lot of cases, this is no longer needed because most sites are now running over HTTPS (or should be) and utilizing HTTP/2, which has better support for parallelism. In fact, combining things like this can result in a slower site.

Another outdated practice, or misconception, I see a lot is thinking a CDN won't have a huge impact on the performance of your site and that it's something you'll get to eventually. If you're serving visitors globally, a CDN is essential to help speed up the delivery of your assets (depending on the locations, I've seen up to a 70% decrease in load times). A CDN is not optional; it should be a normal part of a web developer's stack.

Developers need to stop including external scripts or services on a whim and determine just how much each might impact the performance of a site. Take Font Awesome, for example. While it's probably one of the most popular ways to easily include font icons on your site, it's better to determine which icons you're actually using. If your site is only utilizing 10 out of the hundreds of icons, repackage your icon font with a tool like IcoMoon instead. I've seen this drop the file size by over 90%!

In other words... don't just grab the CDN hosted script because it's the popular thing to do. Take a few moments and determine if that is the best way. A lot of times, hosting something on your own CDN is better as it reduces another DNS lookup and you'll have more control over caching of the file, etc. If you take a performance-driven approach to these things it can quickly result in a much faster site.

Suggestions for improving web performance:

  1. Stop trying to save money by going with cheap web hosting. A lot of times when it comes to hosting you get what you pay for. Your hosting provider is the backbone of your site and probably plays one of the most important roles in just how fast your site will load. If you'd rather spend your time growing your business and site, go with a managed host, especially if you don't have the knowledge to troubleshoot server side related issues.
  2. If you're using PHP, upgrade as fast as you can to the latest versions. Not only will PHP 5.6 be EOL this year in terms of support and security, but PHP 7.2 has been shown to handle up to 3x as many requests. It's also important to note that in regards to WordPress, HHVM is no longer being supported.
  3. Optimize your images. This might sound like a broken record to many, but 50% of an average web page's weight is still made up of images. Finding a good balance of compression and quality is essential for every website.
  4. If you're struggling with WordPress performance you may need to go beyond the typical troubleshooting steps. I recommend utilizing amazing enterprise tools like New Relic which can help pinpoint slow queries or bad code. Check things such as autoloaded data in your wp_options table, corrupt transients, cache-hit ratios, CRON jobs, etc. I can't tell you how many times I've seen corrupt transients and large wp_options tables with unnecessary autoloaded data bring WordPress sites to a crawl.

There is also great software that can help larger and more dynamic WordPress sites, such as WooCommerce shops, community sites, etc. Redis, in terms of caching, allows cached objects to be reused rather than requiring the MySQL database to be queried a second time for the same object. This can help decrease load on the database. Elasticsearch is another one; it speeds up WordPress search by providing an additional indexed layer that is quicker to search than a MySQL query against the database.

Anselm Hannemann

@helloanselm / helloanselm.com

Frontend developer creating solid, scalable code architectures. Curator of the amazing WDRL Newsletter.


While I'm not sure this fits 'outdated' very well, I think it's important to highlight that you shouldn't focus on making specific parts of your website fast, but on the overall experience. I see a lot of services that render incredibly fast once loaded, thanks to virtual DOM and other cool techniques, but the same pages take several seconds to load the initial layout because of the heavy render-blocking JavaScript application they serve. Instead, we should try to use code-splitting, serve a very small bundle for the initial application experience, and asynchronously load additional features later on.

Suggestions for improving web performance:

First of all, I think one of the most important yet often forgotten ways to optimize page load performance is caching. Especially with immutable caching, we can noticeably speed up the rendering of a page for returning visitors with relatively little effort.

Second to that, serving assets as quickly as possible is key to good performance. Whether you have a JavaScript-driven application or an HTML document, you should indicate downloadable assets such as your scripts, images, and CSS via preload hints, or even consider HTTP/2 Server Push, to ensure that the content most important for rendering the initial experience is served as quickly as possible.

Third, refactor your code and leave out everything you don't need. This might include libraries or some over-engineered code you wrote previously. You might even want to consider not transpiling your JavaScript anymore depending on what browsers you need to support.

Sergey Chernyshev

@sergeyche / sergeychernyshev.com

Web technologist with passion for web performance and open source. Organizer of Meetup: NY Web Performance and WebPerfDays NY.


Don't use technical metrics to represent real user's experience.

The most common approach these days is to measure so-called "page load" time, calculated from the start of navigation to when the browser's onLoad event fires. Unfortunately, this event, like many other "technical" events inside the browser, only represents the inner workings of the page, not the experience real users have at that point, because the page is often in an incomplete state that is not meaningful to the user.

This applies to other technical metrics currently in use, from Time To First Byte (TTFB), which measures the time to generate the page on the server, to Time To First Paint, which is a bit better as it represents when the page stops being completely unusable for the user, but still doesn't really tell us when the experience becomes useful.

There are better automatable metrics emerging, like First Contentful Paint (FCP), which attempts to measure when parts of the DOM are first painted, and Time To Interactive (TTI), which attempts to represent when the browser's CPU is idle enough after First Contentful Paint for the user to start scrolling and interacting with the page.

These are better automated metrics that tools can capture without developer involvement, which is why you see more information about them; tool providers like Google can work with them in bulk without working with individual sites.

The best option, however, is for you, as the developer of a specific site, to measure specific events in your application, like Twitter's Time to First Tweet or Pinterest's slightly more generic Pinner Wait Time. Metrics like these let you track user experience as it relates to your product and the business KPIs you are trying to improve.

You can use the W3C User Timing API and, hopefully in the future, the upcoming Element Timing API (currently supported only by SpeedCurve in their synthetic tools) to report what matters on your site and track it in relation to your performance efforts.
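As a sketch of the idea, a product-specific metric like "time to first tweet" boils down to a pair of marks and a measure via the User Timing API (the mark and metric names here are illustrative, not from any real product):

```javascript
// Sketch: a custom product metric via the W3C User Timing API.
// Mark names are illustrative; in a browser, perf is window.performance.
function measureFeature(perf, name, startMark, endMark) {
  perf.measure(name, startMark, endMark); // records a PerformanceMeasure entry
  const entries = perf.getEntriesByName(name);
  const last = entries[entries.length - 1];
  return last ? last.duration : null;
}

// Usage in app code, once the feature has rendered:
//   performance.mark('tweet-visible');
//   measureFeature(performance, 'time-to-first-tweet',
//                  'navigationStart', 'tweet-visible');
```

The resulting entries can be picked up by RUM tools and tracked over time like any other metric.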

Suggestions for improving web performance:

At this point in time, many technical solutions and tricks exist to make your site faster, and many frameworks and technical approaches to make that happen: from traditional server-side rendered pages with inline critical CSS and asynchronously loaded JavaScript, to single-page applications (SPAs) that use the browser history API to update the URL and page contents without fully tearing down and rebuilding the DOM on each navigation, all the way to progressive web apps (PWAs) that use Service Workers to progressively enhance pages, caching the application shell and data locally and only requesting changes over the network when necessary.

All of these are commendable methods of speeding up the technology, but in my opinion, the main solution to slowness lies outside the technical realm itself. We need to address the overall disregard for performance in the product development lifecycle, which leaves it until after the product is built, when the resulting app is slow and disappointing and requires post-development optimization.

The only way for us to change our ways is to start "Designing Speed" the same way we design visual aspects of user experience and branding and the way we design technical solutions for new features.

Only by making speed a first-class citizen in product and technology conversations can we make sure it is built into each product. This is extremely important because, unlike other functional features, speed is much harder to add or remove after development is done, and the cost of rework is sometimes prohibitively high.

This is not unlike the responsive design process, which made us all see sense regarding multi-device support after we neglected it for so long, or accessibility initiatives, which are unfortunately still quite neglected by our industry. As Tammy Everts pointed out in her talk about hunting the performance metric unicorn, we need to scale empathy within our organizations, and to do so, it is critical to inject speed design into the organization's workflow, together with general education about web performance.

There are not that many resources available about this yet, but you can read my article in which I propose Progressive Storyboards as a visual technique, and check out speedpatterns.com, a new and upcoming catalog of speed design patterns (feel free to contribute as it is being built).

Aaron Gustafson

@AaronGustafson / aaron-gustafson.com

Web standards advocate at Microsoft. Author of Adaptive Web Design.


I'm hopeful that most developers have made the switch from icon fonts to SVG icons. Not only are SVGs far smaller, they're also more flexible and better supported than icon fonts.

My top performance-related suggestions are really the same ones I've been harping on since we were on dial-up:

  1. Get rid of any extraneous/unnecessary images, scripts, and CSS. Make them fight for their place in your pages.
  2. Optimize your images! First, choose the right format. Provide more performant alternatives like WebP to browsers that support them. Scale them to the sizes you need. Use a tool (or three) to compress the heck out of them. Mark them up as adaptive images (picture and/or srcset and sizes as appropriate) to deliver the best, smallest image possible to your users.
  3. Minify your text-based assets, including your HTML, as much as possible. If you can, pre-compress your files using both Gzip and Zopfli compression to improve server side performance.
  4. Concatenate common resources. Even though HTTP/2 streams files faster, combining commonly-referenced files like your global CSS and JavaScript is still a good idea.

One final suggestion that's newer is to add a service worker, which enables you to handle caching more elegantly and speed up page rendering. For browsers that support this new worker type (which includes all current browser versions), the performance benefits to your users will be huge.

Vitaly Friedman

@smashingmag / smashingmagazine.com

Co-founder of Smashing Magazine, a leading online magazine dedicated to design and web development.


I think at this point it's critical to look into what you can achieve with service workers for advanced performance optimization, but in terms of low-hanging fruit, I'd definitely explore optimizing web font loading and deferring/dealing with third-party scripts. Most of the time, they slow down the entire experience massively.

Beyond that, obviously, examining and deferring JavaScript in general, with code splitting via webpack, for example, has become extremely important for every website with a heavy JavaScript bundle. You might find some useful ideas in a checklist I published recently.

Harry Roberts

@csswizardry / csswizardry.com

Consultant Frontend Architect: Google, UN, BBC, Kickstarter, Etsy.


There are some very specific things I would consider anti-patterns that developers should avoid. The first two both pertain to loading JavaScript in ways that were previously thought to be beneficial to performance but actually turn out to be a net loss.

The first is using document.write to instantiate a new script. This can block the parser for entire seconds at a time. Chrome has already begun intervening on slow connections and is effectively blocking its usage:

Based on instrumentation in Chrome, we've learned that pages featuring third party scripts inserted via document.write() are typically twice as slow to load than other pages on 2G.

- Google Developers

The second is the use of asynchronous JavaScript snippets to load subsequent files. This is still a surprisingly common practice (even used by Google Analytics) despite better alternatives being available. The issue with a snippet like the one below is that the reference to Google Analytics is actually just a JavaScript string, not an actual reference to a file. This means the URL is not discovered until the browser has parsed and executed this script block. If the URL were in a regular script tag's src attribute, then the browser's secondary parser (the lookahead pre-parser, or preload scanner) could find the reference much sooner, making everything much faster.

    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-xxxxxxx-x', 'auto');
    ga('send', 'pageview');

Both of these are oddly specific, but I do wish people would stop using them: their time is done.

Suggestions for improving web performance:

  • Move over to HTTP/2: It currently enjoys about 80% global support.
  • Begin thinking about offline: ServiceWorker is wonderful, and it doesn't need to be a complex addition.
  • Consider the next billion users: there are lots of people coming online who don't enjoy the same connectivity that we enjoy in the West.

Matt Shull

@TheMattShull / mattshull.com

Data Science Program Manager at Thinkful and product/performance consultant.


HTTP/2 is allowing us to move away from using things like CSS sprites for icons and logos, which is really nice. I've also been working with teams that rely on GIFs for instructional animations, helping them use videos instead. In a lot of cases, the GIFs are larger than the videos! Finally, if you're working with blurred images or single-color photos, you can reduce file size by using CSS filters. CSS has made a number of improvements that allow us to create performance-focused experiences.

Suggestions for improving web performance:

Lighthouse reports are a great place to start looking for ways to improve performance. Seeing unused JavaScript and CSS through Lighthouse can be really eye-opening. With the teams I work with for performance consulting, we've been moving towards user-centric metrics like First Meaningful Paint, First Interactive, and Consistently Interactive, as well as monitoring the (usually) tried-and-true metrics of Time to First Byte, Speed Index, and page size. One of the most interesting observations I've made working with a few teams is that these metrics can differ drastically depending on network speed. In one particular case, I noticed a positive correlation between Consistently Interactive and page weight on a 3G connection or slower, but not on faster network speeds. Measure all the things, on all the speeds.


In conclusion, there are a variety of different ways you can optimize your website's performance. However, a few key suggestions noted by multiple web performance experts above include:

  • Use HTTP/2
  • Implement SVG icons instead of icon fonts
  • Cache your static content
  • Add service workers

Thanks to all of the web performance experts who participated! Taking this advice and implementing best practices or removing outdated ones on your own site is a step towards a faster web experience for all. Let us know in the comments section below what you found was the best piece of advice.



