Monday, March 15, 2021
Google PageSpeed is an open-source web development tool for auditing webpages. It runs online or within your Chrome browser's developer tools panel. If you're running it within your browser, it also goes by the name Google Lighthouse.
The online half, PageSpeed, is found here.
The Chrome half, Lighthouse, is built into the Chrome browser. Hit CTRL-SHIFT-I or F12 (if you're on a PC) to load 'DevTools', then navigate to the 'Audit' panel.
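If you prefer the command line, the same audits can also be run through the Lighthouse npm package. A minimal sketch, assuming you have Node.js installed (the URL is a placeholder):

```bash
# Install the Lighthouse CLI globally (requires Node.js)
npm install -g lighthouse

# Audit a page and open the HTML report when it finishes
lighthouse https://www.example.com --view

# Or limit the run to specific categories
lighthouse https://www.example.com --only-categories=performance,seo --view
```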
Yes … and no. The data you'll get back is the same. But I lean on the in-browser version more, because PageSpeed's online results are different every time I run it, even back to back, even when I make zero changes to the site I'm auditing.
I've looked up why on various forums and haven't found an official answer (i.e., from a Google rep.), despite other users noticing the same thing.
There are five categories to audit: Performance, Accessibility, Best Practices, SEO and Progressive Web App.
Two 'devices' are auditable: mobile and desktop.
I suggest you pay careful attention to your mobile performance scores. Google prioritizes mobile performance, and your organic rank will improve if your website is snappy on mobile devices.
They include an SEO audit tab, but it’s minimal. It covers almost nothing, and what it does cover isn’t SEO.
Here's what I mean: the SEO portion of this auditing tool has very little to do with actual SEO. SEO is about understanding keywords, writing SEO-optimized content and conducting competitor research. This tool covers none of that.
If you are interested in learning about what keywords are and how you can identify them, I’ve written an article titled How To Do Organic Keyword Research. That said, this auditing tool is still extremely valuable because it includes a relatively detailed performance report that you’ll use as a part of your SEO efforts.
Each category gets a score from zero to one hundred. While it's wonderful to see all green when you finish an audit, you'll often see a mix of green and orange.
Scores are compared against the historical data Google has collected from other top-performing websites. They've seen that top-performing websites load in around 1.5 seconds. You'll see green if your site loads in under 2s, orange if it takes between 2s and 4s, and red if it takes over 4s.
Note: some websites will never see all green, and not every website needs to be. However, optimize as best you can.
Performance looks at six metrics to calculate page load speed. Each metric is measured in time, and a good target is under two seconds. However, some web pages won't manage a sub-two-second result for every one of these values.
First Contentful Paint (FCP) marks when the first block of text or image is visible to the person loading the site.
Improving your FCP score depends on what's slowing it down. That requires understanding how the DOM works, which is a little too in-depth a topic for this post. I'll come back to it in a later article.
Speed Index measures how quickly your content is visually displayed during page load. Lighthouse captures a video of the page loading, measures the visual progression between frames, then uses a JavaScript module to calculate the index score.
Now, your page speed is rather finicky to work with. If you’re using WordPress with a lot of plugins, your page speed will probably be slow. This is because plugins rely on a lot of JavaScript to function, and JavaScript is the speed culprit.
Side Note: There are plugins available that improve your load time by managing other plugins, and that's the go-to solution for many people. My opinion is to use as few plugins as possible and write as much of the code yourself as possible. Many of the features plugins provide end up being much easier to code than you think.
Another performance killer is render-blocking resources. Lighthouse splits them into three main kinds: scripts, stylesheets and imports.
Each time you load one of these, the browser needs to download and process another file before it can render the page.
WordPress, as impressive as it is, slows down very quickly with all the fancy themes and plugins people install, because each one adds another set of scripts, stylesheets and imports.
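As a rough sketch of what fixing a render-blocking resource looks like (the file names here are made up), deferring a script and preloading a non-critical stylesheet are the two most common moves:

```html
<!-- Before: both of these block rendering until they're downloaded and processed -->
<link rel="stylesheet" href="/css/site.css">
<script src="/js/site.js"></script>

<!-- After: defer the script so it runs once the document has finished parsing -->
<script src="/js/site.js" defer></script>

<!-- Load non-critical CSS without blocking the first paint -->
<link rel="preload" href="/css/site.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/css/site.css"></noscript>
```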
Time to Interactive (TTI) is how long it takes for your web page to load and become fully interactive. The delay between when the page is displayed and when it's functional is important to minimize, because a visitor might mistakenly think your website's broken and leave.
Improving your TTI typically involves getting rid of unnecessary JavaScript (I'm looking at you, plugins). If you can't get rid of it, you can often minimize the impact with asset cleaning plugins.
Asset cleaning plugins try to 'minify' your files and combine them into fewer, smaller downloads. Some will also let you avoid loading certain assets entirely. This sometimes breaks things, so it won't work in every case, but most do a good job.
I'm skipping First CPU Idle because, while it's still displayed as a metric, Google deprecated it in Lighthouse 6.0.
Instead, they recommend focusing your efforts on TTI. Max Potential First Input Delay (FID) measures the longest time a user might need to wait before the website acts on their input.
For example, your website is loading and the user immediately clicks your catchy button. If the browser is still busy loading the page, there will be a delay between the click and the response. That annoys people. This metric helps you avoid that.
Accessibility audits thirty-one metrics.
Assistive technology relies on the conditions in the list above to make sure the people who use it can access your content.
This also covers content-blocking issues, like a website's content not loading because of geographical location (looking at you, YouTube).
It also covers navigation menus not loading properly on tablets or mobile phones.
An interesting, often overlooked design issue is poor contrast. This happens when someone selects a background color that's too close to the font color. Think of black text on a dark grey background.
If you're interested in taking a deep dive into this, check out the Web Content Accessibility Guidelines.
There are fifteen audits within this category.
Best Practices is an attempt to bring a certain standard to web development. I'm not going to expand on many of these; however, there's one worth pointing out.
It's important to use HTTPS on every website. This includes websites that aren't e-commerce.
HTTPS encrypts the data transfer between you and your users. Contact and quote forms are a fantastic example of why you want HTTPS on your site. There’s no reason to allow a hacker to see the contact details being provided by your customers.
This is important enough that Google Chrome flags non-secure sites (standard HTTP-only sites), asking if users want to be directed back to search results just in case it’s a scam site.
Titles show up in search results. If you don't set a manual title, you'll end up with whatever default or blank title came with your website. For example, WordPress defaults the page title to the one you typed in when creating a page or post.
Meta descriptions are the summaries provided in search results so users can decide whether they want to click your link. WordPress does its best to capture the text from the first paragraph of your page, but I prefer to type it out myself.
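If you're hand-writing the markup (or filling in a field your theme or SEO plugin outputs), both of these are just tags in the page's head. The wording below is purely illustrative:

```html
<head>
  <!-- The clickable headline shown in search results -->
  <title>How To Do Organic Keyword Research | My Blog</title>

  <!-- The summary shown under the headline; write it yourself rather than
       letting the CMS scrape your first paragraph -->
  <meta name="description" content="A beginner-friendly walkthrough of finding and choosing organic keywords for your website.">
</head>
```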
HTTP status code errors point to resources being inaccessible to Google's robots. Pages with these errors might not get indexed by Google (depending on the error).
Links have a title attribute you can set. These are used by assistive technology and are standard practice to include.
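For what it's worth, the attribute looks like this (the URL and wording are placeholders):

```html
<!-- The title attribute gives assistive technology (and hover tooltips) extra context -->
<a href="/contact" title="Go to the contact page">Get in touch</a>
```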
Pages can be blocked from being indexed when robots.txt denies permission. There are times when you want to block certain pages from being indexed, but the audit won't know whether the block is intentional.
Robots.txt tells search engines which parts of your website they should crawl and which to avoid, which in turn shapes what ends up in search results. There are times when you don't want a few webpages to show up there.
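A minimal robots.txt might look something like the sketch below. The paths are hypothetical, and remember the audit can't tell an intentional block from an accidental one:

```
# Rules for every crawler
User-agent: *
# Example: keep crawlers out of the admin area and internal search results
Disallow: /wp-admin/
Disallow: /?s=
# Help crawlers find everything else
Sitemap: https://www.example.com/sitemap.xml
```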
Image elements that are informational need descriptive alt text. The robots won't know whether an image is informative or decorative, so deal with it on a case-by-case basis.
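In practice that means the alt attribute: describe informational images, and leave the attribute present but empty for purely decorative ones so screen readers skip them. The file names here are placeholders:

```html
<!-- Informational: describe what the image conveys -->
<img src="/img/lighthouse-report.png" alt="Lighthouse performance report showing a score of 92">

<!-- Decorative: alt is present but empty -->
<img src="/img/divider-flourish.svg" alt="">
```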
Hreflang tells search engines which version of a webpage they should include within search results for a given language or region.
Canonical tags are important. If you need to have duplicate content on your website, this tag tells the search engine that this page is the one that should receive credit. Without it, Google might penalize you for having duplicate content.
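Both hreflang and canonical are just link tags in the page's head. A hedged sketch with made-up URLs:

```html
<head>
  <!-- Tell search engines which language/region versions of this page exist -->
  <link rel="alternate" hreflang="en-ca" href="https://www.example.com/en-ca/pricing/">
  <link rel="alternate" hreflang="fr-ca" href="https://www.example.com/fr-ca/pricing/">
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/pricing/">

  <!-- Of the duplicate or near-duplicate versions, this is the one that should get the credit -->
  <link rel="canonical" href="https://www.example.com/en-ca/pricing/">
</head>
```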
Font sizes need to be 12px or larger for most people to read. This is here so people won’t have to zoom in or pinch to zoom to read your text.
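The fix is usually a single CSS rule (shown inline here for brevity); 16px is a common, comfortable default that clears the 12px threshold:

```html
<style>
  /* Keep body text comfortably above the 12px legibility threshold */
  body {
    font-size: 16px;
    line-height: 1.5;
  }
</style>
```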
The platform you're reading this blog on right now is a Progressive Web App. It probably doesn't look any different from most other websites you've been on; however, the way it's coded makes it one.
Progressive Web Apps (PWA) are built and enhanced with modern APIs to deliver native-like capabilities, reliability, and installability while reaching anyone, anywhere, on any device with a single codebase. To help you create the best possible experience, use the core and optimal checklists and recommendations to guide you.
From Web.Dev, What Makes A Good Progressive Web App
The takeaways are installability, speed and security. This goes beyond a responsive website. It's a website that genuinely feels like a fully installed application. LinkedIn is a fantastic example. When you load it in a mobile browser, it looks and acts exactly as a downloaded application would.
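At the code level, the two pieces Lighthouse looks for are a web app manifest and a service worker served over HTTPS. A bare-bones sketch; the file names are placeholders, and a real service worker needs its own caching logic:

```html
<head>
  <!-- The manifest describes the app's name, icons and colours for installation -->
  <link rel="manifest" href="/manifest.json">
  <meta name="theme-color" content="#1a1a2e">
</head>
<body>
  <script>
    // Register a service worker so the site can respond while offline and become installable
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/sw.js');
    }
  </script>
</body>
```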
It sounds odd. I know.
That covers it!