
Google’s Search Quality Evaluator Guidelines: Here’s What They Say About SEO & Content

Originally published December 2015 and completely updated October 2018.

Google is anything but transparent. As such, its algorithm’s inner workings have never been easy to interpret.

In fact, SEOs dedicate themselves to a sort of “algorithm watch.” They spend eons of time poring over search metrics. They write novel-length blog posts analyzing the changes they can only guess happened, and how these changes may or may not affect search rankings.

So, when Google threw everyone a bone, the SEO community latched on. Back in October of 2015, The SEM Post got a leaked copy of Google’s Search Quality Guidelines, and their interpreted version went viral.

In response, Google broke the internet by releasing the Search Quality Evaluator Guidelines in their entirety.

Since then, Google has released multiple updates of these guidelines. The most recent hit the internet on July 20, 2018, and we’ve updated this post to reflect all the major changes.

While Google’s Search Quality Evaluator Guidelines don’t lay out exactly what we need to know to rocket to the top of the rankings, they do provide some valuable information:

  • What kind of pages are viewed as high quality
  • Which factors influence high- and low-quality ratings (SUPER important, as these factors may be similar to how Google measures page quality for SERP rankings)

We’ve taken an inside look and studied the document as it relates to your SEO and on-page site content, including those fresh updates. 🔍

Without further ado, here’s a rundown of key points in this major SEO document for your online content writing and publishing.



What Are Google’s Search Guidelines All About?


Screenshot from page 4 of the Google Search Quality Evaluator Guidelines

Google’s search guidelines document is over 160 pages long and broken into an overview, three separate parts, and an appendix.

The major parts are as follows:

  • General Guidelines Overview
  • Part 1: Page Quality Rating Guideline
  • Part 2: Understanding Mobile User Needs
  • Part 3: Needs Met Rating Guideline
  • Appendix: Using the Evaluation Platform

In addition to focusing heavily on mobile search, Google’s search guidelines also focus on the importance of building trust and a good reputation for websites and/or content creators.

This isn’t hugely surprising – it’s simply a variation on what Google has been saying for years: The best websites are ones that deliver relevant, trustworthy, quality information to users.

We all know Google focuses heavily on experimentation and adjusting their algorithms to improve web quality. These guidelines provide specific instructions on what the Google engineers want people to do to improve individual site quality.

Needless to say, the Google search guidelines are dense. They cover everything from important definitions to duplicate landing pages and all the places in between.

For those of you who want to read through the guidelines on your own, you can find the link here. For everyone else, here’s the breakdown of key points we’ve found within them.

12 Key SEO Content Factors in the Google Search Quality Evaluator Guidelines

For SEOs who have dedicated themselves to keeping up with Google’s ever-changing algorithms, this document will serve mainly to reaffirm what you already know, with a few goodies thrown in here and there.

For SEO newbies, though, this document offers an expansive guide to Google’s preferences and the future of SEO. The guidelines lay out specifics about Google’s algorithms and how, exactly, SEOs can better predict changes to it in the future.

1. Beneficial Purpose

One of the newer additions to the guidelines is the concept of “beneficial purpose.” This term defines websites with pages created, first and foremost, for the user’s benefit.

On the other hand, many pages are created solely for the purpose of ranking on Google or are created with no intention of helping users. In Google’s eyes, these pages have zero beneficial purpose.

According to the guidelines (part one, section 3), raters are supposed to give these pages the lowest rating:

content must have beneficial purpose to rank well

“Websites or pages without any beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating.”

In stark contrast, pages with beneficial purpose are the very definition of high-quality:

“High-quality pages exist for almost any beneficial purpose, from giving information to making people laugh to expressing oneself artistically to purchasing products or services online.” – Part one, section 4.1

According to Google, high-quality pages not only have a beneficial purpose; they also achieve that purpose.

In other words, if you’re not writing to help your audience in some way, your page will have little overall value to the search engine. Thus, “beneficial purpose” is the ground-floor factor that affects your page quality.


2. Page Quality (E-A-T)

Page quality has always been a bit of a mystery. Google uses hundreds of ranking factors and it’s often unclear how they all relate to one another.

We’ve always known unique, relevant, well-written content helps produce a high-quality page, but the guidelines have some additional insights to offer on this topic.

According to the guidelines, it’s not just high-quality main content (MC) that matters. In fact, Google has created a name for what every high-quality page needs: E-A-T.


E-A-T stands for “Expertise, Authoritativeness, Trustworthiness,” and it may be one of the major factors Google is using to rank pages.


Screenshot via Google’s Guidelines, section 3.2

Pages that are expert, authoritative, and trustworthy will be viewed as higher-quality than those that aren’t.

But what does that mean, exactly?

A. High-Quality Pages

Google’s guidelines instruct quality raters to score pages on a scale of Lowest, Low, Medium, High, and Highest.


Via section 3.0

According to Section 4.1 of Part 1, high-quality pages possess the following characteristics:

  • A “satisfying amount” of high-quality MC, including a title that’s appropriately descriptive/helpful
  • “Satisfying website information” or information about the website’s owner/creator (shopping or transactional pages additionally need satisfying customer service information)
  • The page and its associated website have a high amount of E-A-T (Expertise, Authoritativeness, and Trustworthiness)
  • The website (or the MC creator) has a good reputation

It’s worth noting that Google doesn’t specify how much content a page needs to be considered “satisfying,” only that it depends on “the purpose of the page.”

Google provides this page as an example of high-quality content (partial screenshot):

google describes what quality content looks like

According to Google, this page has high-quality, humorous MC. Plus, the website has a positive reputation and displays expertise in farcical humor.

B. Low-Quality Pages

According to the Google search guidelines (part one, section 6.0), low-quality pages feature the following:

  • Poor, low-quality MC
  • An inadequate amount of E-A-T
  • Unsatisfying amounts of MC for the purpose of the page (a dense topic with little information, for example)
  • A page title that is essentially clickbait (“exaggerated or shocking”)
  • An author that doesn’t have the level of expertise needed to write about the topic
  • A website or content creator with a “mildly negative” or mixed reputation
  • Unsatisfying information about who created the content/who’s behind the website
  • Page content that distracts from the MC, like intrusive ads/interstitials

Google goes on to say that you can land yourself in low-quality content land by making things up, not editing material enough, buying papers, using obvious facts (“A German Shepherd is a dog”) or over-complicating simple facts.

Here’s an example Google provides of a low-quality page (partial screenshot):

website example of google's definition of low-quality content

According to Google, this page has low-quality MC, is lacking in E-A-T, and has a misleading page title.

Google also says that pages will be considered low-quality if they’re created “without adequate time, effort, expertise, or talent/skill.” This is a broad statement, but it’s safe to say that it encompasses everything from poorly designed and scraped content to content that’s written by unskilled or unknowledgeable writers.

The Google search guidelines close by saying that low-quality content is reason enough for a quality rater to grant you a low page rating.

The takeaway: Make sure you’re always creating content with a high level of E-A-T. If your site doesn’t have the E-A-T that raters are looking for, you need to dedicate some time and effort to increase it.

C. How Can You Increase E-A-T on Your Pages?

One of the main ways E-A-T standards were tweaked in the recent update to the guidelines: there’s now a bigger emphasis on the author/creator.

According to Larry Alton for ProBlogger, you can make sure your content meets current E-A-T standards in a few ways:

  • Enlist high-authority content contributors
  • Include author credentials alongside content (A.K.A. author bylines)
  • Update author bios and “About me” pages
  • Create publicly visible profile pages

All of these actions help establish your expertise, authoritativeness, and trustworthiness (and your contributors’, if you have them).

No matter what you choose to do, ensuring your E-A-T level is high is one of the best ways to earn high page rankings.

3. YMYL Content


Leaked copies of the guidelines have been making the rounds on the web since as early as 2007. The concept of YMYL (Your Money or Your Life) pages was first introduced during one of these leaks.

According to the full guidelines, these pages are the ones that Google pays the most attention to because they’re the ones that can most profoundly impact a person’s life.


Screenshot via Google’s 2018 Guidelines, section 2.3

Google says YMYL pages are the ones that can “impact the future happiness, health, financial stability, or safety of users.” These pages include:

  • Shopping or financial transaction pages
  • Medical information pages
  • Legal information pages
  • Financial information pages
  • News articles and/or public/official pages important for informing citizens
  • Any other topics that can deeply affect users’ lives, i.e. child adoption or car safety information

Because of their importance, these pages are held to the highest page quality standards.

They must be authoritative, factual, and written by experts.

4. Expert Reputation, Credentials and/or Experience

The guidelines make it clear that any content needs to be created in an authoritative and expert manner. While there are “expert” websites in all niches, including food, industry, fashion, law, and medicine, Google makes no bones about it: When “expert” content is needed, true experts need to write it.

This means the following:

  • Any high-quality medical advice that gets published needs to be written by individuals and communities with appropriate levels of medical accreditation.
  • Complex financial advice, tax advice, or legal advice needs to come from highly qualified, expert sources and must be updated and maintained on a regular basis to accommodate changing information, laws, and statutes.
  • Medical advice must be written in a professional fashion and, once published, must be edited, reviewed, maintained, and updated regularly in order to keep up with changing medical consensus and beliefs.
  • Pages that address topics that can cost consumers thousands of dollars (investment platforms, for example) or that can affect the health of a family or individual (parenting sites, mental health sites, etc.) must be written by expert/experienced sources that readers can trust.
  • Pages with scientific information must be written by people/organizations with relevant scientific expertise. For topics where scientific consensus exists, producers should represent that consensus accurately.
  • News articles need to be written with journalistic professionalism and contain factually accurate information.
  • Pages on specific hobbies, like horseback riding or hockey, must also be written by people who are knowledgeable about the topic and can provide sound advice.
  • Recent updates to the guidelines also stipulate that the content creator must have a positive reputation and adequate experience in relation to the topic about which they’re writing. In short, page authors/creators must also have a high level of E-A-T. (According to Stone Temple, two pages with basically the same information might be ranked differently based on the reputation and authority level of their authors.)

A. What Does It Take to Be an Expert Content Creator?

Now, upon reading all that, it’s likely you’ll wonder what constitutes an “expert.”

No, an expert doesn’t always have to be a credentialed, highly trained person (the exceptions: when they’re writing about medicine, law, finances, taxes, or other YMYL topics).

First-Person Experience

Google makes it clear that, in some cases, first-person experience can be a form of expertise, especially in settings where you don’t necessarily need formal training to have an extensive knowledge base, such as on hobby pages.

In fact, Google states that “for some unusual hobbies, the most expert advice may exist on blogs, forums, and other user-generated content websites.”

In these instances, what Google is looking for is a display of expertise.

  • Example 1: Say you have lived with diabetes for 22 years. You may be qualified to offer tips about coping with the disease (YMYL content) because you have extensive first-hand experience. However, at the same time, you would not be qualified to write a high-quality medical blog about the symptoms and onset of diabetes.
  • Example 2: On the hobby site The Spruce Crafts, expert crafters teach all kinds of techniques in informative blog posts. These are highly ranked because each writer has plenty of personal experience that qualifies them as experts. Take this post on “How to Knit the Garter Stitch”:

how to knit the garter stitch blog post

The author is an expert because of her years of personal experience. Her bio reflects this perfectly:

screenshot of expert author's biography

The Reputation of the Website/Creator

Finally, reputation plays a role in expertise, too.

There’s a whole section dedicated to this facet of expertise in the guidelines (under part one, section 2.6):

screenshot of google's guidelines on the reputation of a content creator

This information is not about how creators or websites describe their own credentials and expertise. It’s how the wider web (“reputable external sources”) views these things.

According to Google, these external sources that provide independent reputation information about a website or MC creator may include:

  • News articles
  • Wikipedia articles
  • Magazine articles
  • Blog posts
  • Ratings from independent organizations
  • Forum discussions
  • Customer reviews (for these, the content of the reviews matters as much as their number – a single negative or positive review is not a good source unless you have other reviews to compare it to)

B. Why Is Google So Stringent About Expertise?

The search engine wants to ensure deep, broad, important topics get the necessary treatment so searchers can find accurate, useful information about them.

If the search results served up low-quality, untrustworthy content constantly, we would quickly begin to distrust and stop using Google to fulfill our information needs.

  • Example 3: Most kids in the U.S. learn about World War II in school. However, it would be absurd to believe this type of broad knowledge qualifies anyone to write an informative page about what it was like to live through it.

In the end, it’s important to think about what constitutes an expert for different topics:

How much expertise do you need to possess to write about a subject in a way that’s useful and valuable to others?

How much expertise do you need about a topic so you don’t lead readers astray or negatively impact their lives?

5. Supplementary Content

The importance of supplementary content (such as sidebar tips) is one of the most interesting features of the Google search guidelines. This content is supportive because it provides additional information to users alongside the MC.

Supplementary content can also include links to similar articles or anything else that can help the reader understand your page’s information. Pages with high-quality, useful supplementary content may be generally ranked higher than those without.

Allrecipes has good examples of pages with supplementary content (SC). On their recipe pages, you get the ingredients and instructions (the MC) as well as photos, recommended recipes, user comments, reviews, and serving information (the SC).

screenshot detailing where supplementary content can be found on a website

6. Lowest-Quality Pages

Some pages receive the “lowest” rating from search quality evaluators on principle. These types of pages are created with the intent to misinform or deceive users or may potentially harm them or spread hate.

Here’s the full list of types of pages that automatically get rated as the lowest quality possible:

  • Pages that promote hate or violence towards other people (like a specific group)
  • Pages that encourage harming oneself or others
  • Malicious pages (scams, phishing, malware, etc.), or pages with a malicious/extremely negative reputation attached to the creator/website
  • Pages that could spread misinformation, including content that’s obviously inaccurate, YMYL content that contradicts the consensus of experts, and content that propagates debunked/unsubstantiated conspiracy theories
  • Pages meant to deceive users, including deceptive page design (ads that look like MC)
  • “Lack of purpose pages” that have no MC, MC that is “gibberish,” or content with no apparent purpose
  • “Pages that fail to achieve their purpose”
    • These have the lowest possible E-A-T
    • May include copied or auto-generated content
    • May have content that’s inaccessible or obstructed
    • May have unsatisfying information about the website/MC creator
    • May have unmaintained pages, hacked pages, defaced pages, or spam

Google’s example of a lowest-quality page is this deceptive site designed to imitate the ABC News homepage:

example of a page google ranks as lowest-quality

A. Copied Content

Google also specifies what they mean by “copied content” in this subsection (part one, section 7.2.4). Any page whose MC is copied with little or no added value will get the Lowest rating from a search evaluator.

What many people don’t know, however, is that Google doesn’t consider rewritten content original if it relies too heavily on its source. Google puts it like this in the guidelines:

screenshot of rating awarded to copied content

“The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.”

Content creators who like to “spin” content should thus tread carefully here.

7. Mobile Optimization

One of the first things SEOs who consult the Search Quality Evaluator Guidelines notice is that no less than a quarter of this huge document is dedicated to mobile search.

Check out this chart from “Part 2: Understanding Mobile User Needs”:

image showing the needs of a mobile user

The chart underscores just how much people turn to their mobile phones for different tasks.

These tasks vary from simple to complex. As such, the Google guidelines are careful to lay out information about how algorithms understand and interpret mobile queries.

This focus on clarifying search queries is indicative of Google’s leaning toward voice search, which is becoming a search optimization priority. (According to Gartner, by 2020, 30% of all searches will be voice searches.)

Mobile search is one of the most important trends in digital marketing right now. Every page on a website needs to be optimized for mobile platforms to do well in search (but you already knew that, right?).

8. User Experience: “Needs Met” Ratings

In the user experience portion of the Google search guidelines (Part 3: Needs Met Rating Guideline), we circle back to mobile platforms. In this section, Google asks raters to evaluate the results of various search queries.

For example, the guidelines ask raters to consider mobile user needs and how helpful the result is for those mobile users. This chart in the guidelines illustrates the rating scale, from “Fully Meets” all the way down to “Fails to Meet”:

google's needs-met ratings explained in a chart

These ratings help Google understand how search queries are related to user intent, and how their search results are measuring up. For example, if a lot of low-quality pages that “fail to meet” user needs are showing up for a certain query, Google obviously needs to work on delivering better, more relevant and useful results for that query.

9. E-A-T Versus Needs Met

The guidelines make a clear distinction between “needs met” ratings and page quality ratings. The difference is important to understand.

“Needs met” ratings are based on both the search query and the result, while page quality (E-A-T) ratings are only based upon the result and whether it achieves its purpose. This means that useless results for a particular query are always rated “fails to meet” – even if they have outstanding page quality ratings.

Think of it this way: A high-quality page with fantastic information about sea lions is useless to you if you actually want information about otters. If you searched for “otters” but got search results featuring pages about sea lions, your search needs would be unfulfilled.


Conversely, when considering page quality ratings, the search query is unimportant. This means that high E-A-T pages can still have low “needs met” scores if they are deemed unhelpful for a query or do not fulfill a user’s search needs.

quality content that fails to meet your specific search needs fails the needs-met rating

According to Google’s guidelines, this page about sea lions would receive a high page quality rating, but may not necessarily receive a high “needs met” rating – that depends on the page’s relevance to the search query.

The guidelines also state that when a user is searching for very recent information (like breaking news, for instance) a site can earn a “fails to meet” rating if the content is stale or useless for the user’s particular query. This means pages appearing in search results for time-sensitive queries featuring content about past events, old products, or outdated information will be marked useless and given a “fails to meet” rating.

While fresh content is important, older content can have a high E-A-T rating without sacrificing usefulness. This is true for evergreen content and “timeless” information.

For example, users who search for information about Ronald Reagan will find biographical information useful, even if it was written many years ago. This is not true, however, for unmaintained or abandoned websites that feature infrequently updated or inaccurate content.

10. “Fails to Meet” Pages

“Fails to meet” content is a boat you don’t want to be in.

According to the guidelines, “fails to meet” content is helpful and satisfying to virtually nobody. The content results are unrelated to the query, filled with incorrect facts, or in dire need of additional supporting information. Because of these things, this content doesn’t meet a user’s search intent or need.

The guidelines go on to state that content may also be marked “fails to meet” when it is low-quality, stale, outdated, or impossible to use on a mobile device. The guidelines also specify that it is possible for sites to earn in-between ratings.

Here are a few examples of “fails to meet” content results for different queries:

examples of content that fails to meet user expectations

As you can see, in the second example (for the query “American beauty”), the result is actually directly related/relevant to the topic of the search. However, because the result has unsatisfying content, it gets the lowest possible “needs met” rating.

11. Clickbait

In the updated guidelines, Google makes plenty of references to clickbait. Specifically, they don’t want to see it. Ever.

That’s because clickbait builds up a user’s expectations and then fails them spectacularly. This leaves the user dissatisfied, confused, and frustrated/annoyed, all things Google does not want to be associated with its search results.

In the section on “Low-Quality Main Content” (part one, section 6.3), the guidelines specifically mention that raters should pay attention to a page’s title, as it “should describe the content.” If the title doesn’t properly do that or creates unrealistic expectations of the MC, Google says the page should be rated “Low.”

Here is Google’s example of a clickbait title that helps the page in question earn a low “needs met” rating:

example of content with clickbait headline

“Planet Nibiru has appeared in the sky and DOOMSDAY is on the way” – clickbait much?
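Google leaves clickbait judgments to human raters, but the idea of a title that creates unrealistic expectations of the MC can be sketched as a toy heuristic. To be clear, this is purely our own illustration; the phrase list and the all-caps rule are assumptions, not anything from the guidelines:

```python
import re

# Toy heuristic (NOT from Google's guidelines): flag titles with
# shouty all-caps words or sensational trigger phrases.
SENSATIONAL = {"doomsday", "shocking", "you won't believe", "miracle"}

def looks_clickbaity(title: str) -> bool:
    lowered = title.lower()
    # Sensational trigger phrases are an instant flag.
    if any(phrase in lowered for phrase in SENSATIONAL):
        return True
    # Words of 4+ letters written entirely in caps read as shouting.
    shouty = re.findall(r"\b[A-Z]{4,}\b", title)
    return len(shouty) >= 1
```

A real editorial check relies on human judgment, of course; the point is simply that shouty caps and doomsday phrasing are exactly the kinds of signals a rater notices instantly.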

12. Medium-Quality Pages

In the guidelines, we have seen that raters may rank page quality anywhere from highest to lowest.

Google defines each rating and which characteristics exemplify that rating. One of the most interesting is the definition of “medium” quality pages (part one, section 8).

Google states that there are two types of medium pages:

  • Nothing is wrong with the page, but then again, there’s nothing special about it, either.
  • The page has high-quality characteristics mixed with some low-quality characteristics.

The first type of medium-quality page goes straight to the heart of what it takes to stand out in content. You can do everything right SEO-wise, but if there is nothing unique or special about your page/your content, you can’t expect to rank well.

From Google, here is an example of a medium-quality page. The website is a trusted source, but the content is merely “okay”:

example of content that gets a medium-quality rating from google

4 Major Takeaways from the Updated Google Search Guidelines

Among the biggest takeaways from the guidelines are the importance of mobile optimization and of producing and publishing content written by experts.

1. The Need for Expert Content Is HUGE

As Google made clear with their discussions on both E-A-T and YMYL, the need for expert content is huge.

Google values pages with high levels of expertise, authority, and trustworthiness. Websites and content creators that champion these things by hiring and staffing expert writers will be rewarded for their efforts. This is especially true for YMYL pages.

Because YMYL pages are so important and have big potential to positively or negatively affect a reader’s life, Google puts them under heavy scrutiny. That means websites that specialize in these pages absolutely need to hire expert writers and content creators. The price of not doing this is too high for both websites and readers alike.

Fortunately, when websites hire expert writers to improve their page’s E-A-T and to write important YMYL pages, more than likely, they will enjoy both higher rankings in Google’s index and a position as an industry leader.

2. Reputation Matters

The recent updates to Google’s Search Evaluator Guidelines underline the importance of website/MC creator reputation when determining page quality.

Google exhaustively goes over the different ways reputation can affect a page’s quality and stipulates the best ways to research this vital factor. For example, the guidelines recommend using third-party websites and sources to research websites and content creators/authors.

A few they particularly mention include Wikipedia, the Better Business Bureau, Yelp, Amazon reviews, and Google Shopping.

Here’s the section mentioning the power of Wikipedia. Google calls it a “good source” and, throughout the doc, treats links from Wikipedia to other sites as a quality signal:

screenshot of google's guideline on website reputation

Google respects these sites’ opinions of other sites and will consider content low or high-quality based on BBB ratings, Wikipedia links and claims, and outside reviews/evaluations.

3. You Must Be Mobile-Friendly

Sites that aren’t mobile-friendly have a 0% chance of ranking well. Obviously, Google cares more now than ever about mobile-friendly pages – after all, nearly a quarter of their search evaluator guidelines are dedicated to mobile user needs.



Image via Google Search Guides

Great content isn’t enough, so be sure that your entire website is optimized for mobile users.

4. You Must Create Content That Benefits Users

Think of the new “beneficial purpose” concept in these guidelines as a huge flag planted in your SEO landscape.

It’s clear that Google is looking at it as the main determiner of a page’s quality. If a page has no apparent beneficial purpose for users, it automatically gets a low rating from search evaluators. That tells us a lot about Google’s user-first mentality, and also how we should be treating each and every piece of content we create.

Plus, the concept is reflected across Google’s other guidelines, including the brief but pointed Quality Guidelines in Search Console Help:

screenshot of basic principles to follow when creating content

Take this as a sign that you should be asking yourself, “What’s the beneficial purpose of this page?” for each content piece you create.

To Be SEO-Savvy, Don’t Stop at Reading This Blog Post

My favorite SEO and content marketing resources include Backlinko (Brian Dean), BuzzSumo, Moz, and Content Marketing Institute. You can also subscribe to our Write Blog for the latest in content marketing, SEO and content writing.

Look up industry content marketing and SEO authors, too, for some must-read books. For a few solid marketing reads, I recommend anything by Ryan Holiday, Jonah Berger, Ann Handley, Joe Pulizzi, and Mark Schaefer.

I’ve also written two books on content marketing and copywriting, and a course on content strategy as well as SEO writing that you might find useful.

Dr. Seuss said it best:

“The more that you read, the more things you will know. The more that you learn, the more places you’ll go.” – Dr. Seuss

Google meta description length

Google Has Increased Meta Description Snippet Length: What It Means, Plus What to Do Next

No, your eyes are not deceiving you.

Meta descriptions in Google search results ARE longer.

These descriptions show up right underneath the link to each search result. Google calls them “snippets,” and they’re a big deal.

We’ll go more into defining how it’s a “big deal” soon, but here’s a look at the old length vs. the new:

The change in meta description length happened during the last month of 2017 (less than a month ago).

Here’s a chart from RankRanger showing the SERP changes:

So, first, why do Google meta descriptions matter so much?

They are instrumental in describing the page that’s linked. Reading the snippet can thus help searchers understand whether the search result is relevant to what they need.

If the meta description is optimized, clear about what to expect from the content, AND enticing (SEO, clarity, creativity), searchers are more likely to click on it. Click-through rates on the organic result climb. That’s a lot to do in one meta.

You can see why great meta descriptions are so important.

Here at EW, we write meta descriptions all the time for our clients. It’s a fine art, because you have to cram the essence of what a page is about into a limited number of characters. Then, you have to make it sound awesome.

With the character limit increased, this gives us a little more room to be creative and really speak to the reader. In turn, this gives you a higher chance of getting clicks and conversions for your content.

Let’s discuss the change, including exactly when it happened, and why this is great news for your business.

Google meta description length

Why Google Upped the Character Limit for Meta Description Snippets

Search Engine Land was able to confirm the meta description change with Google in December.

Here’s what Google said:

“We recently made a change to provide more descriptive and useful snippets, to help people better understand how pages are relevant to their searches. This resulted in snippets becoming slightly longer, on average.”

The snippets grew from around 160 characters to an average of 230 characters.

The official maximum character count allowed is now 320.
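If you want a quick guard against overlong drafts, a few lines of Python can trim a description to the reported limit at a word boundary. Note that 320 characters is the approximate cap reported at the time, and Google actually truncates snippets by display width rather than a hard character count, so this is a rough sketch, not a guarantee:

```python
MAX_SNIPPET_CHARS = 320  # approximate upper bound reported in late 2017

def fit_meta_description(text: str, limit: int = MAX_SNIPPET_CHARS) -> str:
    """Trim a draft meta description to the limit at a word boundary,
    appending an ellipsis when truncation occurs."""
    text = " ".join(text.split())  # collapse stray whitespace
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit - 1)  # last word boundary inside the limit
    if cut == -1:
        cut = limit - 1
    return text[:cut].rstrip() + "…"

draft = "Discover how to write meta descriptions that earn clicks. " * 8
print(len(fit_meta_description(draft)) <= MAX_SNIPPET_CHARS)  # True
```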

Why Does the Change Matter?

When you’re trying to write a description of a page for the search results, double the amount of space makes a huge difference.

Let’s be clear, though: The meta description has no effect on your page ranking. This was true before and it’s true now.

Instead, this snippet of descriptive text is for users, for their benefit.

As we said, the snippet could make a user want to click on your page in search results over a competing page. It may sound more tempting because it’s a better description, it’s persuasive, or both.

That said, Google won’t always use your meta description in the snippet.

Depending on the user’s search query, the search engine may instead pull snippets of content from your page.

John Mueller goes into this in detail on a recording of a Google Hangout that streamed on December 12, soon after the changes occurred. This topic starts up at about the 29:41 mark:

Here’s the TL;DR:

  • Meta descriptions are important to get right.
  • They help describe your pages for users.
  • Google will sometimes (but not always) pull your meta description to use in the snippet that shows up with your link in search results.
  • Google will pull your meta description if they think it’s a more accurate or relevant summary than any text they could pull from your content.
  • If your description is accurate, relevant, concise, and well-written, you may have a better chance of nabbing click-throughs.

Most importantly, Google still recommends including a meta description on each page of your site.

Google highlights the importance of high-quality descriptions, specifically:

“Besides the benefit to you when you create good descriptions for each of your web pages, it’s simply a good usability practice to follow.”

What to Do Moving Forward

You get that you should be creating a unique, high-quality meta description for each page on your site.

But what about the descriptions you already have in place? Should you go back and lengthen them just because you can?

Not necessarily.

Don’t Lengthen Old Descriptions – Unless They’re Critical Pages

According to Google, they’re still looking for relevance and conciseness when they consider text to use in snippets.

Lengthening your meta descriptions won’t necessarily make them better in either of these areas.

Instead, think of this change as a chance to make your meta descriptions going forward even better. You have a little more wiggle room for creativity and persuasiveness to sprinkle into a highly relevant summary of your page.

One exception would be critical pages of your site – the most important content pieces, landing pages, etc. that get the most search traffic. Moz, in particular, recommends going back to these and reoptimizing the meta descriptions.

Don’t just lengthen them though – rewrite them with the new limits in mind. You may come up with something completely different, but even better than before.

What Are Best Practices for Meta Description Creation?

For meta descriptions, striking the balance between appealing to users and still giving a great summary can be tricky.

Because it can be an art form, here are some best practices to follow to help guide you:

  1. Always include the focus keyword and the top secondary keyword in the description. This helps establish relevance right off the bat.
  2. Use the focus keyword as early in the description as possible.
  3. Use action-oriented words to describe the benefits to users if they click on your page. For example, start with words like “discover,” “find,” or “explore” – i.e. “Discover how to write fantastic meta descriptions.”
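The three tips above can be turned into a quick self-check script. This is an illustrative heuristic, not a Google-endorsed formula; the “early” threshold (first third of the description) and the action-word list are my own assumptions:

```python
ACTION_WORDS = {"discover", "find", "explore", "learn", "get"}  # illustrative, not exhaustive

def check_meta_description(desc: str, focus_kw: str, secondary_kw: str) -> dict:
    """Heuristic checks mirroring the three best practices.
    The 'early' threshold (first third) is an assumption, not a Google rule."""
    lower = desc.lower()
    kw_pos = lower.find(focus_kw.lower())
    words = lower.split()
    return {
        "has_focus_keyword": focus_kw.lower() in lower,
        "has_secondary_keyword": secondary_kw.lower() in lower,
        "keyword_is_early": kw_pos != -1 and kw_pos < len(desc) // 3,
        "action_oriented": words[0] in ACTION_WORDS if words else False,
    }

report = check_meta_description(
    "Discover how to write meta descriptions that boost click-through rates.",
    focus_kw="meta descriptions", secondary_kw="click-through")
print(all(report.values()))  # True
```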

Of course, this is just a primer on writing meta descriptions.

Great ones don’t always follow a formula, but they do accurately entice readers with hints about what’s waiting for them when they follow a link.

Not a Meta Description Wizard? No Worries

If creating snazzy meta descriptions that bring in the click-throughs is a bit daunting for you, Express Writers can help.

We regularly write descriptions that sing the exact tune searchers want to hear. If you need some assistance with wordsmithery, let us write your meta descriptions for you.

Google's Newest Panda 2016

Why Google’s Newest Panda Might Not Hit Till 2016

Anyone who is familiar with Google’s algorithm updates knows that they are extensive, frequent and often vague in terms of detail.

Although Google’s most recent update, Google Panda 4.2, was released in July of 2015, many SEOs believe that the newest major Panda update isn’t actually done rolling out.

Read on to learn more.

Google's Newest Panda 2016

Looking At the Current Panda 4.2 Update

When Panda 4.2 was introduced in July of this year, its express purpose was to reward quality content and downrank scraped, duplicate or low-quality content. From the get-go, it was clear that this Panda was a little slower moving than the others, but, even as the weeks wore on, very few SEOs expected so much time to pass before admins started to notice changes in their sites. Sites that were hit by Panda 4.1 have had to wait 10 months to redeem themselves and, because 4.2 is also painfully slow to roll out, it’s unclear whether the changes those previously affected sites made have actually been effective. Needless to say, this is a source of frustration in the SEO community.

panda gif

In addition to being amazingly slow to implement, Panda 4.2 is also impressively extensive. When Google released Panda 4.2, it stated that 2-3% of Google’s search queries would be affected; when you take into account that Google handles billions of daily searches, 2-3% equates to roughly 36 million affected queries. That makes it the broadest update in quite some time: between 2011 and May of 2014, no individual Google update affected more than 2.4% of search queries. Even though SEOs know that Panda 4.2 will affect millions of searches, the entire rollout has been so slow that nobody knows how, exactly, those millions of searches will eventually be affected.

In light of all that ambiguity, it’s tough to make any solid statements about Panda 4.2. The one thing SEOs do know for sure about this Panda update is that, as usual, quality content is the only safe place to be. That means avoiding things like keyword stuffing, ugly sites and exploitative SEO practices. Because Google has dedicated itself to rewarding high-class content and downranking sub-par content, sites that publish well-written, original content are less likely to be negatively affected by the mysterious new Panda updates.

While nobody quite knows just yet what Panda 4.2 will reward or punish, most SEOs believe that creating great content and avoiding bad SEO practices is a safe place to sit and wait for the changes to begin showing themselves.

Questions Regarding Panda 4.2’s Status

The main question people have about Panda 4.2 is “is it here?” The answer is “Yes – but only kind of.” Panda 4.2 was instated around July 18th, 2015, at which point Google said the changes would take place over “several months.” Obviously, that’s vague at best and absolutely unclear at worst. What we do know is that Google waited 10 months after Panda 4.1 to release its next update, and that a large portion of that delay was due to technical glitches and complicated system issues.

Additionally, while it’s not uncommon for a Google algorithm change to take a number of months to go into full effect, it’s clear that this one is going extra-slowly. This is actually the slowest Panda rollout to date and, as such, many SEOs suspect that the new changes are so complex that they can’t easily be instated in a matter of days.

Google, however, says this isn’t true. In numerous interviews, the company has stated that the new Panda rollout isn’t going slowly on purpose to confuse SEOs or to make life more difficult. Instead, the company says that this slow rollout is a preview of coming attractions: a time when Panda will be one large, continuously rolling and changing system incorporated into the company’s most fundamental algorithms. While the company acknowledges that it’s not there yet, it seems as if this ultra-slow Panda rollout is the first step in that direction.

Additionally, the company has assured SEOs that, while Google Panda 4.2 is a site-wide action, it is unlikely to affect each of a site’s pages in the same way, which is why many SEOs have yet to see any real or definitive changes. Sites that are bound to be dinged by Panda 4.2 may not actually know it until the entire rollout is complete – some time in the distant future. At that point, affected sites will need to wait until Google releases its next update in order to revamp their pages.

The Verdict: The Newest Panda In the Works May Not Hit Us Before 2016

Considering the fact that there were 10 long months between Panda 4.1 and Panda 4.2, it seems unlikely that the next update will happen any time before 2016.

For now, the best Google has to offer is that site owners should keep an eye on their analytics. By isolating organic traffic driven by Google and closely analyzing any large dips or boosts, site owners can begin to get an idea about whether the big Panda is here and, furthermore, whether it has begun to affect their sites or not.

What You Can do to Stay Sane During the Slow Roll Out

Although many SEOs feel a bit helpless during this time of Google ambiguity, there are a few things you can do to mitigate the uncertainty and ascertain whether Panda 4.2 has affected your site:

1. Frequent Site Audits. First, keep checking your site for any positive or negative impacts that seem different from the norm. Any such change is likely a strong indicator that Panda 4.2 has arrived. Unfortunately, once the changes are instated, it’s too late to do anything about your site; if you have content that is going to be punished by Google, it will be punished no matter what you do.

2. Stay Informed. With that in mind, it’s wise to continue keeping an eye on your site and visiting popular Google hangouts for any definitive news about Panda 4.2.

In the meantime, be extra aware of web spam and crappy content and avoid both like the plague. While the new Panda may be more mysterious than all of the others, SEOs need to take comfort in the fact that there will eventually be some answers.

While Google Panda 4.2 is officially here, it’s tough to know exactly how it’s going to affect sites just yet. While it’s not all that different from previous updates, Panda 4.2 promises to be increasingly tough on content mills, web spam and poorly written pieces.


In light of this, the solution is the same as it ever was: cling to high-quality, original content and avoid bad SEO.

That way, when the big, bad Panda does decide to show its face, your site stands a better chance of being prepared.

Panda gif via Tumblr

Google’s Latest Update: Why Your Site Content Must Be HTTPS

Around August 2014, Google announced that it would use HTTPS as a lightweight ranking signal.

When Google said this, not a whole lot of people took the news truly to heart.

Previous exploration of the statistics had shown that HTTPS rankings ran pretty much parallel to HTTP’s, with no real separation between them.

But now, suddenly, in the last week or so—HTTPS traffic has had a big ranking boost. Is this Google’s realization of a statement that Matt Cutts made in 2014 about wanting to see more rewards for sites using TLS?

Reviewing What SSL & HTTPS Content Is

To understand why this is an important issue, let’s define these two acronyms. No insult to anyone’s intelligence: it really is something that needs defining.

HTTPS stands for “Hypertext Transfer Protocol Secure,” which is, in layman’s terms, a more secure version of the HTTP we all know and love. Its major use is on sites that traffic in sensitive information, i.e. banks and other e-commerce pages. HTTPS is easy to spot online, as it’s usually signaled by a padlock icon right before the site name in the address bar.

SSL stands for Secure Sockets Layer; its successor is TLS (Transport Layer Security). SSL/TLS encrypts a connection (as opposed to a single file), ensuring that all data passing through the connection is secure and cannot be tampered with by external entities. Together, HTTPS and SSL/TLS provide a solid layer of security and a good deterrent to malicious entities.
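As a small illustration, Python’s standard-library ssl module (which, despite the name, speaks modern TLS) shows the verification defaults a secure client connection relies on. This is a conceptual sketch of the defaults, not a full TLS walkthrough:

```python
import ssl

# A default client-side TLS context. Python's ssl module implements TLS,
# though "SSL" survives in the module and API names for historical reasons.
ctx = ssl.create_default_context()

# By default, the peer must present a certificate that validates against
# the system's trusted CAs, and the certificate must match the hostname.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```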

What Does HTTPS Have To Do With SEO?

Until recently, HTTPS (if it was even considered relevant to SEO) provided a very minor rankings boost, typically smaller than other signals such as high-quality content. This seems to have changed. In the last few days, the share of HTTPS URLs on page one of search results increased dramatically compared to the previous ten days. This in itself is news and cause for concern. What is making these HTTPS sites rank so well all of a sudden? When we assess the possible causes, we are left to assume that either:

  1. Google’s algorithm updated to a point where HTTPS is now considered a lot more important to page ranking or
  2. A massive movement of one or more popular domains from HTTP to HTTPS.

Taking these two as our premises, we can now set out to get to the bottom of this crazy swing in page rankings.

Welcome to HTTPS, Wikipedia! Following the Statistics Trail

The first thing we have to check is whether a large domain shifted from HTTP to HTTPS that might account for such a huge (9.9%, in fact) change in page-one rankings for HTTPS pages. A cursory glance reveals that Wikipedia, a site that already accounts for a lot of page-one traffic, was slowly changing over to HTTPS. Since Wikipedia makes up a large share of page-one rankings, it may be safe to assume that its switch is what skewed our readings by such a large margin. The only way to tell whether Wikipedia’s change caused our statistics spike is to leave it out of consideration and see whether there is still an HTTPS gain rather than one due to Wikipedia’s massive bulk.

When we isolate our statistics to remove HTTP/HTTPS from the results (by considering them both as equal), we still see a change (although obviously less massive than before) when it comes to page-one rankings. This translates to the idea that HTTPS may be getting a boost in rankings from somewhere. We can see that having HTTPS as a protocol is beneficial to the user and maybe this provides a further clue as to whether this is just an anomaly or something more in-depth.

HTTP Content & Google’s Overall Perspective

Google has shifted its focus from being search-oriented to being user-oriented. It has realized that by catering to users first, it is building a trustworthy presence on the Internet. That starts with its algorithm changes. From the time Panda was released to Google’s mobile update a couple of months ago, we can see how Google is slowly making webmasters consider their audience. Gone are the days when a page’s ranking was based solely on the number of keywords stuffed into it. Now it’s all about user benefits, and HTTPS offers a lot of benefits to the average user. HTTPS is especially important in situations where sensitive information may be at risk.

HTTPS exists as a method of empowering the user by ensuring that all information that concerns him or her is unable to be broken into by a third party that is unaffiliated with either side of the connection. HTTP doesn’t allow for protection of a user’s account information or ID and if it is used on a login page then it can be vulnerable to penetration by third parties and makes for a great target for people to obtain information about a user. If this is a Google update, it is centered on the user (as most of the modern updates to their algorithm are) and rewards sites that put users first.

What This Means for Us as Content Publishers

We understand exactly how important any addition to Google’s algorithm is. Although it’s not a confirmed addition it has all the bells and whistles associated with a Google update. The only way we’ll know for sure if it’s a permanent Google update is when Wikipedia’s site finally settles down into HTTPS mode and we can observe the ripples on both the HTTPS and HTTP side of things. In any case, what we should be considering is how HTTPS can help our users since it’s likely that in the future whether your site is SSL-compatible or not may actually affect your search ranking a bit.

As a content publisher, your content is accessible to all users, and the information they submit to your site can be vulnerable to external penetration. That in itself is not too worrying if the information submitted is of a non-essential nature. Things get trickier when it comes to more sensitive information. HTTPS ensures that an external entity cannot spoof your address in the hopes of phishing information out of your visitors. Your audience trusts your site to the point where it will share some information unequivocally. Having HTTPS on your site ensures that no one abuses this trust between you and your audience, and makes your site more trustworthy overall: an important factor in your overall page ranking.

Three Main Ways HTTPS Works

HTTPS ensures a connection is secured on both ends so that an external source cannot harvest information passed over the connection for malicious purposes. How? It comes down to a three-step process:

  1. Encryption: Data passed from the client to the server and vice versa are encrypted to keep that information safe. This means that when a user is on a site, it is impossible for another user to “listen in on” or “eavesdrop” on the data being sent to and from the server.
  2. Data Integrity: This means that the data going to and coming from the server cannot be changed in transit. It stops “injection” attacks, where an external entity changes or edits data so as to make it unusable by the server.
  3. Authentication: This ensures that the server that the user is connected to belongs to the business they intend to deal with. It also stops “man-in-the-middle” attacks where another user spoofs the server in order to intercept data that is meant for the server which can then be decrypted.
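The “data integrity” idea above can be sketched with a message authentication code. This is a conceptual illustration using Python’s stdlib hmac module, not the actual TLS record protocol (TLS has its own record-level integrity mechanisms); the key and messages are made up for the example:

```python
import hmac, hashlib

# Both ends share a secret key; any tampering with the message changes
# the authentication tag, so the receiver can detect "injection" attacks.
key = b"shared-secret"
message = b"amount=100&to=alice"

tag = hmac.new(key, message, hashlib.sha256).digest()

# Receiver recomputes the tag and compares in constant time.
print(hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest()))  # True: intact

tampered = b"amount=9999&to=mallory"
print(hmac.compare_digest(tag, hmac.new(key, tampered, hashlib.sha256).digest()))  # False: altered
```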

How Do I Add HTTPS Support to My Site?

There are a number of ways to do this, but the most efficient method (and the one recommended by Google) is to configure server-side 301 redirects that send users from any HTTP page to its HTTPS equivalent. Alternatively, you can use a server that supports HTTP Strict Transport Security (HSTS), which sends browsers to the HTTPS site even when the user typed an HTTP address into the URL bar. It’s a more drastic measure, but it ensures you never serve unsecured content to your audience.
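To make the redirect approach concrete, here is a minimal Python sketch of what a server-side HTTP-to-HTTPS rule computes: a 301 status plus a Location header pointing at the same host and path over HTTPS, with an HSTS header attached. The helper names and the example URL are hypothetical; in practice you would set this in your web server’s config (Apache, Nginx, etc.) rather than in application code:

```python
from urllib.parse import urlsplit, urlunsplit

def https_location(http_url: str) -> str:
    """Rewrite an HTTP URL to its HTTPS equivalent: same host, path, and query."""
    parts = urlsplit(http_url)
    return urlunsplit(("https", parts.netloc, parts.path, parts.query, parts.fragment))

def https_redirect(http_url: str):
    """The (status, headers) a plain-HTTP request would receive under this rule."""
    return 301, {
        "Location": https_location(http_url),
        # HSTS: tells browsers to use HTTPS for this host for the next year.
        "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    }

status, headers = https_redirect("http://example.com/blog?page=2")
print(status, headers["Location"])  # 301 https://example.com/blog?page=2
```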

Why HTTPS May be Important in the Future

Methods of obtaining information through illicit means on the Internet are nothing new. From the early days of viruses that installed back doors on computers to the relatively modern practice of phishing, it is clear that we can’t ever wipe out the processes by which people are relieved of their valuable information online. And make no mistake: in the virtual world, information is as good as hard currency in the real world. To protect the users of our sites and to ensure that we remain a secure and trustworthy domain, we should consider setting up HTTPS on our servers.

It may take some work, but the overall benefits would be worth it in the long run, especially if this Google update pans out and further enforces HTTPS favor.