How to Rank Your Old Content

Do you know what percentage of your search traffic comes from your top 10 pages? When’s the last time you looked? Chances are, it’s a pretty big part of your total monthly website traffic. The older your website is and the more content it has, the smaller the share of your total traffic those top 10 pages tend to account for.

Generally, with smaller sites, the percentages are much higher: the top 10 pages make up the majority of search traffic simply because there isn’t much other material on the website. Does that mean you should just focus on your top 10 pages and ignore the rest, or should you focus only on generating new content?

Quality Matters Over Quantity

Many people seem to believe that when it comes to web content, the philosophy of more is better applies. There are people who crank out dozens of articles every week and sometimes publish more than one article a day on their blog. And over time, they see their traffic grow, but not by much. They spend all of this time writing just to realize the majority of the content they publish never even ranked. So what should you do in this situation?

Begin focusing on your old and outdated content to boost your traffic. Consider this: you can publish one new piece of content a week to keep things fresh, while your team updates older articles alongside the new work.

If you only ever publish new content, your top 10 pages will typically continue to make up a large portion of your search traffic. By updating your older content, you can grow your total search traffic and reduce your reliance on those top 10 pages.

Start with the Google Search Console

In Google Search Console, you have access to up to 16 months of data. Compare this month’s results to the same period last year: click on the date filter, then choose “Compare.” Next, set the older date range first and the current range second.
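
If you’d rather pull this comparison programmatically, the Search Console API exposes the same data. Here’s a minimal sketch, assuming you’ve already created an OAuth2 credentials object (`creds`) and verified the site; the site URL and date ranges are placeholders:

```python
# Minimal sketch: compare page-level clicks for two periods via the
# Search Console API. Assumes "creds" is an OAuth2 credentials object
# you've already set up; the site URL and dates are placeholders.
from googleapiclient.discovery import build

def clicks_by_page(service, site, start, end):
    body = {"startDate": start, "endDate": end,
            "dimensions": ["page"], "rowLimit": 1000}
    resp = service.searchanalytics().query(siteUrl=site, body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in resp.get("rows", [])}

service = build("webmasters", "v3", credentials=creds)
site = "https://www.example.com/"
this_year = clicks_by_page(service, site, "2019-06-01", "2019-06-30")
last_year = clicks_by_page(service, site, "2018-06-01", "2018-06-30")

# Pages whose clicks dropped year over year -- candidates for an update
for page, old in sorted(last_year.items(), key=lambda kv: -kv[1]):
    if this_year.get(page, 0) < old:
        print(page, old, "->", this_year.get(page, 0))
```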

Look for Pages That Used to Get a Lot of Traffic

On the generated report, look for articles that used to get a lot of traffic but get much less now. This way, you’ll see old content that Google used to love but no longer pays attention to.

Look for Pages That Never Got a Lot of Traffic

After we’ve located those, it’s time to find content on your site that Google has never loved. Go back into Search Console and look for pages that have a high impression count but never got any real clicks. The easiest way to find these pages is to choose a date range within the last month and look at each page from an impressions, clicks, and click-through rate perspective. Sort the click-through rate column in ascending order so the lowest percentages are at the top and the highest are at the bottom.
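
If you export that report to CSV, a few lines of pandas will do the sorting for you. A sketch, assuming columns named Page, Clicks, Impressions, and CTR; adjust the names to match your actual export:

```python
# Sketch: surface high-impression, low-CTR pages from a Search Console export.
# Column names and the impression threshold are assumptions; adapt as needed.
import pandas as pd

df = pd.read_csv("gsc_pages_last_month.csv")
df["CTR"] = df["CTR"].str.rstrip("%").astype(float)  # "1.2%" -> 1.2

candidates = (
    df[df["Impressions"] >= 1000]  # only pages Google already shows often
    .sort_values("CTR")            # lowest click-through rate first
)
print(candidates[["Page", "Impressions", "Clicks", "CTR"]].head(20))
```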

Generally, the pages at the top of the list have the most potential, because it means Google is ranking you but you just aren’t getting enough clicks. Most of the time, it isn’t just your title tag and meta description; it also has to do with the content on the page.

With this information, it’s time to create a list of pages that have the greatest potential.

Prioritize Your Content Updates

Most often, the pages with the highest potential are the ones that used to rank but no longer do. Google used to like them, which means that if you give those pages a bit of TLC, you can easily get Google to love them again.

The second group of pages may have potential, but not as much as the first. These are the ones that have a high impression count but a low click-through rate. These pages are more difficult to fix because they never really performed well.

Start Updating Old Content

Find the first article you want to update in Google Search Console and click on it. Then, click on “Queries.” If you see keywords that don’t rank in the top five but have a high impression count, go to your ranking article and check whether it’s actually relevant to those terms.

If it’s not, edit the article so that it at least includes that term and covers the topic.

For terms where you’re already ranking in the top five spots, use a keyword tool to get more keyword ideas. You should see long-tail variations of the keyword; edit the article to include the relevant long-tail phrases, and you should see some quick gains in traffic.

Beyond including the right keywords, update the content to make sure all of the information is still accurate, confirm the photos are current, and consider adding multimedia, such as embedded relevant YouTube videos. This will help increase your visitors’ time on site.

User Experience

Because user behavior is one of the biggest influencing factors in Google’s algorithm, you’ll need to optimize your newly updated content for these signals to help boost its rankings. Take time to optimize your title tags and meta descriptions.

Why? If everyone who searched a keyword on Google clicked on the second result instead of the first, Google would eventually learn that the second result is more relevant and should rank in the first spot. Eventually, Google would swap the rankings of the two pages.

By using persuasive copy and convincing people to click on your search listing rather than the competition, you’ll see your rankings climb.

Now that you’ve updated and optimized your content, promote it again. Update the published date or the last-updated date within your WordPress platform to signal to readers and search engines that your content has changed and is current and more relevant. Share it on social media to get it circulating again, and traffic will start coming in.

Build Links

Links are an important part of the ranking equation. You should create a new strategy or continue working on your old one to build more links to the newly updated content.

Once your website has 150 pages or more, consider focusing the majority of your time and effort on updating old content instead of creating new content.

If you have more than 1,000 pages, spend at least 80% of your time updating old content rather than writing new content.

The key to getting that old, outdated content ranking again is to focus on the content that used to rank but doesn’t anymore.

No More Support for Robots.txt Noindex

Google has officially announced that Googlebot will no longer obey robots.txt directives related to indexing. If you are a publisher relying on the robots.txt noindex directive, you have until September 1, 2019 to remove it and start using an alternative.

Why the Change?

Google will no longer support the directive because it was never an official one. In the past, they have supported it, but that will no longer be the case. This is a good time to take a look at your robots.txt file to see where you’re using the directive and what you can do to prepare yourself for when support officially ends.

Google Mostly Obeyed the Directive in the Past

As far back as 2008, Google somewhat supported this directive; both Matt Cutts and John Mueller have discussed it. In 2015, Perficient Digital decided to run a test to see how well Google obeyed the command. They concluded:

“Ultimately, the NoIndex directive in Robots.txt is pretty effective. It worked in 11 out of 12 cases we tested. It might work for your site, and because of how it’s implemented it gives you a path to prevent crawling of a page AND also have it removed from the index. That’s pretty useful in concept. However, our tests didn’t show 100 percent success, so it does not always work.

Further, bear in mind, even if you block a page from crawling AND use Robots.txt to NoIndex it, that page can still accumulate PageRank (or link juice if you prefer that term).

In addition, don’t forget what John Mueller said, which was that you should not depend on this approach. Google may remove this functionality at some point in the future, and the official status for the feature is ‘unsupported.’”

With the announcement from Google that noindex robots.txt is no longer supported, you cannot expect it to work.

The official tweet reads: “Today we’re saying goodbye to undocumented and unsupported rules in robots.txt. If you were relying on these rules, learn about your options in our blog post.”

In that blog post, they went on to say: “In the interest of maintaining a healthy ecosystem and preparing for potential future open source releases, we’re retiring all code that handles unsupported and unpublished rules (such as noindex) on September 1, 2019.”

What to Use Instead

Instead of using noindex in the robots.txt file, you can use noindex in robots meta tags. This directive is supported both in HTTP response headers and in HTML, making it the most effective way to remove URLs from the index when crawling is allowed.
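
As a concrete illustration, the HTML version is a single tag in the page’s head, and the header version can be set by any server or framework. A minimal sketch using Flask (the route and page are made up for illustration):

```python
# In HTML, the supported directive is a meta tag in the page's <head>:
#   <meta name="robots" content="noindex">
# The HTTP equivalent is the X-Robots-Tag response header. A Flask sketch:
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/internal-report")
def internal_report():
    resp = make_response("Crawlable, but should stay out of the index.")
    resp.headers["X-Robots-Tag"] = "noindex"  # supported, unlike robots.txt noindex
    return resp
```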

Other options include:

  • Using 404 and 410 HTTP status codes: Both of these status codes mean the page does not exist, which will drop the URLs from the Google index once they are crawled and processed.
  • Disallow in robots.txt: Search engines can only index pages they know about, so blocking a page from being crawled typically means it won’t be indexed. A search engine may still index a URL based on links from other pages without seeing the content itself, but Google says they aim to make those pages less visible in the future.
  • Password protection: Unless you use markup to indicate paywalled or subscription-based content, hiding a page behind a login generally removes it from Google’s index.
  • Search Console Remove URL tool: Use this tool to quickly and easily remove the URL from the Google search results temporarily.

Other Changes to Consider

All of this comes on the heels of an announcement that Google is working on making the robots exclusion protocol a standard, and this is likely the first change that’s coming. Google released its robots.txt parser as an open source project alongside this announcement.

Google has been looking to change this for years, and by standardizing the protocol, it can now move forward. In analyzing the usage of robots.txt rules, Google focused on how unsupported implementations such as nofollow, noindex, and crawl-delay affect things. Those rules were never documented by Google, so their usage in relation to Googlebot is low. These kinds of rules hurt a website’s presence in Google’s search results in ways webmasters likely don’t intend.

Take time to make sure you are not using the noindex directive in your robots.txt file. If you are, make sure to switch to one of the suggested methods before September 1st. It’s also a good idea to check whether you’re using the nofollow or crawl-delay commands; if you are, move to the supported methods for those directives going forward.

NoFollow

In the case of nofollow, instead of using the robots.txt file, you should use nofollow in the robots meta tags. If you need more granular control, you can use the rel="nofollow" attribute on individual links.

Crawl Delay

Some webmasters opt to use the crawl-delay setting when they have a lot of pages and many of them are indexed. The bot starts crawling the site and may generate too many requests in a short period of time, and the traffic peak can deplete hosting resources that are monitored hourly. To avoid problems like this, webmasters set a crawl delay of 1 to 2 seconds so bots crawl the website more moderately without causing load peaks.

However, Googlebot doesn’t take the crawl-delay setting into consideration, so you don’t need to worry about the directive influencing your Google rankings. You can safely keep using it to slow down other, more aggressive bots. It’s not likely you’ll experience issues as a result of Googlebot crawling, but if you want to reduce its crawl rate, the only place you can do that is in Google Search Console.
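
If you run your own crawler and want it to be one of the well-behaved bots that honors the directive, Python’s standard library can read it. A small sketch; the URLs and user-agent string are placeholders:

```python
# Sketch: honoring crawl-delay in a custom crawler. Googlebot ignores this
# directive, but polite third-party bots can respect it like so.
import time
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

delay = rp.crawl_delay("MyCrawler") or 1  # seconds; fall back if undeclared
for url in ["https://www.example.com/a", "https://www.example.com/b"]:
    if rp.can_fetch("MyCrawler", url):
        # ... fetch and process the page here ...
        time.sleep(delay)  # spread requests out to avoid load peaks
```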

Take a few minutes to look over everything in your robots.txt file and make the necessary adjustments ahead of the deadline. Though it may take a little time to sort through everything and execute the changes, it will be well worth it come September.

June 2019 Broad Core Update

In a move very unlike the search engine giant, Google announced a broad core update in June, before it actually happened. The update will continue to roll out for a while, and as of this writing, there’s no indication when it will be complete. It’s still too early to determine the full impact of the update, but it’s important to be aware of.

What is a Broad Core Update?

According to Google SearchLiaison on Twitter, Google releases at least one change every day geared toward making search results better. While some updates are focused on specific improvements, others are considered broad core changes. They say these updates are routine and take place multiple times a year.

When these updates occur, there’s nothing wrong with the pages that see drops. The drops reflect changes in the system that now benefit previously under-rewarded pages. There’s no “fix” to regain rankings lost to one of these broad core updates. Google suggests focusing your efforts on quality content creation, as this may help your site rise through the rankings relative to other pages.

My understanding is that a broad core update makes changes to the main search algorithm, which comprises at least 200 (and likely many more) ranking factors. A broad core update may adjust the order, importance, or weight of any of these factors in an effort to make overall search results better.

What Happened in June?

The update began June 3rd and continued until all data centers were updated; Google announced on June 8 that it had fully rolled out. Google doesn’t typically announce these sorts of changes before or as they occur, but the SEO community has been asking Google to flag these kinds of updates in advance so site owners can prepare and stay proactive. Google’s Danny Sullivan said in a tweet, “Nothing special or particular ‘big.’ It’s the usual type of core updates that we regularly do. We just wanted to be more proactive. Rather than people scratching their heads after-the-fact and asking ‘hmm?,’ we thought it would be good to just let folks know before it rolled out.”

With this update, as well as other broad core updates, we don’t see anything extraordinary. Google just wanted to alert the community so they knew what was going on and didn’t stress about any changes they may notice.

What the Data Says

In the wake of the update, many large data providers released reports about what it changed. RankRanger, Moz, SearchMetrics, and Sistrix have amassed large datasets around Google rankings, allowing them to see the bigger picture when it comes to algorithm updates and how they affect rankings.

RankRanger found that gambling, health, and financial websites were hit the hardest in terms of visibility loss. They noted that while many sites fluctuate in the search results, the fluctuation wasn’t as strong as seen in the past.

Dr. Pete Myers, the Marketing Scientist at Moz, shared his early findings on Twitter. He says that though it’s not an in-depth analysis, there was high flux across verticals, and unusually high flux for health and for food and groceries. Day-over-day flux was high, but 22 days in 2019 had been at or over that temperature. In a later update, he said there was some flux on the 4th, but by the 5th things were stabilizing; sites that had gained or lost big on the first day continued on the same trajectory, but at a smaller scale.

SearchMetrics found, at least in preliminary analysis, that many parts of the March broad core update were reversed. It appears Google shifted some factors too far toward brand and authority in March and used the June update to dial that back. The reason for this line of thinking is that many websites, especially in the medical niche, that lost visibility as a result of the March update gained it back. In other areas, though, they’re not seeing the same pattern. They also found that trusted aggregator sites, such as Yelp and YellowPages, were boosted.

Sistrix found that, compared to the previous days’ results, you could definitely see the impact of the core update. Sites like Mercola.com and the Daily Mail were among the hardest hit, whereas sites like the Mirror, HuffPost, and Healthline were among those with the biggest gains.

Bonus: Separate Algorithm Update Launches Alongside Broad Core

What’s interesting is that on June 4th, just one day after starting the broad core update, Google also released what they dubbed a site diversity update. The two updates landing so close together makes analysis a bit more confusing, but it’s nice to see Google providing more information about them.

In a June 6 tweet, Google announced they were making a change to search results because of feedback that many searches return several results from the same site. As a result of the change, you typically will not see more than two results from the same site in the top results. There are some instances, though, where the system determines it’s especially relevant to show more than two results from the same site.

The site diversity update treats subdomains as part of the root domain, so results from a subdomain will be counted the same as results from the root domain in terms of diversity. However, when deemed relevant, they may be treated separately.

At the end of the thread, Google made it known that the launch of the diversity update was separate from the core update that had also launched that week – different, unconnected releases.

I, for one, am glad to see Google keeping us up to date on what’s going on before it happens because it lets us keep our clients informed about the process and what they may be able to expect.

Getting Ready: SEO During an Economic Slump

In 2008, we saw the stock market crash, suffering one of the largest points losses in history – and holding the record for the largest drop until 2018. For many, starting a business during this time ended up being a mistake, but in the case of SEO and direct marketing, economic downturns make it possible to thrive.

If you’re paying attention to the economics experts, we’re due for a rather significant financial plunge – and it’s impossible, at this point, to determine whether it will send us into a recession, or just be a momentary slowdown that leaves as quickly as it came. The only thing we know for sure is that eventually our economy will decline – because it always does. And it will rebound, eventually, because it does that, too.

Why Does SEO Thrive When the Economy is Down?

Put simply, when there’s less money available to invest in your business, you want the money you do invest to work harder for you, going further than before. That means cutting the fat and focusing solely on the channels you know are working. When profits are down, you don’t focus on the long term – you focus on generating as many leads and sales as you possibly can, to keep your business in survival mode until the economy begins to climb again. And search keeps those leads and sales coming in, even if it’s not as effective as when it’s combined with branding efforts and a long-term strategy.

Preparing for SEO During Economic Downturns

Whether you’re working in-house or as an agency, there are things you must do to prepare for any economic slump. Not taking the time to get ready could mean that you’ll get cut, along with the rest of the marketing budget. When the economy starts going bad, the key to keeping all your decision-makers happy with their search marketing efforts is communicating the value you offer, along with effective campaign tracking.

Step one is to make sure that everyone involved is up-to-date on how the campaign is doing. If you’re not currently aware of customer lifetime value, invest in determining that number. Turn to attribution modeling if you haven’t already.

As an SEO expert, you’re only as valuable as the decision-makers perceive you to be, and ultimately, you’re aiming to make yourself irreplaceable. How do you do this? Communicate your value early and often, celebrating all wins. If someone offers to come in and do your job cheaper, remind decision-makers that the results you achieve more than make up for the additional cost of employing you.

To do this, you must be on the same wavelength as the decision-makers, and that means you must agree on what the results are.

Focus on Goal Setting

With a clear definition of what the results are, it’s easier to set SMART goals for what the results will be, and setting those goals is what makes you irreplaceable. Setting realistic goals that the decision-makers agree with sets you up for long-term success. Get the agreement in writing or via email. If there’s an unrealistic goal request, suggest that it become a “stretch goal” and set something more realistic, so you’re not pressured into something that’s impossible to deliver.

No Guarantees

Even if you do not hit a goal, you can salvage the relationship by communicating why you didn’t hit the goal and what you are changing in the future. When marketing budgets are being cut because the economy is down, having stated goals with numbers to support them can save your job.

The truth is that the economy will go south. It could happen tomorrow, next quarter, or at some point within the next two years.

But, because the leading economic experts are suggesting it’s going to happen sooner rather than later, SEOs need to prepare themselves – and get ready to thrive. Why? What SEO does for a business has a direct effect on the bottom line – and when the work you do improves it, you can solidify your place as a partner and keep your income on steady ground.

But to survive the decline, preparation is necessary. Start setting goals with the decision makers. Come to an agreement on what success looks like. And do the work to serve the client well. Do these things, and you’ll make it through the slump quite well.

Audio SEO: Podcasts in Search Results

Every year, Google hosts their I/O Conference, bringing together developers from around the world to learn from Google experts, and get an early look at the latest developer projects. One of the most interesting takeaways from the 2019 I/O Conference was the revelation that Google will be (and in some cases, already is!) showing podcasts in search results.

Early results already indicate that Google is indexing podcast content and providing audio clips in search results.

Is Google Able to Transcribe Audio Content?

Yes. Google has offered a speech-to-text service as part of the Google Cloud Platform since 2017, which we’ve already seen get some upgrades. Not long ago, Android Police found changes in source code that suggested Google was transcribing some podcasts on the Google Podcasts platform.
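
To get a feel for what machine transcription looks like, here’s a minimal sketch against the public Cloud Speech-to-Text API. The bucket URI is a placeholder, and this is the general-purpose API, not necessarily what Google uses internally for podcasts:

```python
# Sketch: transcribing a podcast episode with Cloud Speech-to-Text
# (pip install google-cloud-speech; requires a GCP project and credentials).
from google.cloud import speech

client = speech.SpeechClient()
audio = speech.RecognitionAudio(uri="gs://my-bucket/episode-042.flac")  # placeholder
config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.FLAC,
    language_code="en-US",
)

operation = client.long_running_recognize(config=config, audio=audio)
response = operation.result(timeout=900)  # long audio takes a while

for result in response.results:
    print(result.alternatives[0].transcript)
```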

Not only this, but Google sends automatic transcripts of voicemails on Google Pixel phones and to Google Voice numbers.

There’s also evidence of this in the search results, but with video. Google started testing suggested YouTube video clips in search results in April 2017. Starting with video made financial sense for Google: they own YouTube, which is much larger than Google Podcasts, so it was the smarter place to start. The feature works by matching search results against the audio portion of a video, so it stands to reason they can apply the same technology to audio files.

How Will Audio Appear in Search?

We can expect the starting points to be extensions of the Google podcast engine, including both automatic transcription and full-text and full-audio search. Both of these things are already in the works. Once you’re able to search within Google Podcasts, we can expect that to expand to general Google searches as well.

Right now, the question is whether Google will return audio content or transcribed text. In some situations, it may be better to return audio clips to better match user intent: if you’re searching for something you heard in a podcast, it makes for a much better experience to hear the audio than to sift through plain text to find it. The big advantage, however, is for voice devices such as Google Home and Amazon Echo. Being able to return audio results fills the content gap for these smart speakers and voice devices, and bridges into full podcasts along with other non-text content.

Should I Start a Podcast?

Well, that’s entirely up to you since starting a decent podcast requires a bit more time, effort, and planning than simply grabbing a microphone and recording what you have to say. It’s true we are in the middle of a small podcast revival, and it’s reasonable to think that audio search may cause that revival to grow even more.

Before you get too excited, remember Google will always gradually release changes and test them for a few weeks – maybe even a few months. If you’re planning to start a podcast, don’t do it just because Google’s going to start including them in results. Do it to serve your audience first and foremost – because that’s how you earn brownie points with Google anyway.

If you’ve already got a podcast and you want to make it search accessible, make sure you’ve added it to Google Podcasts and are entering the available metadata. If you’re not, go get started on that now, updating all the metadata you can.

All you’ll really need to get your content transcribed is a clean audio file in a format Google can easily process. That said, it’s important to consider how the audio content is structured, since completely free-form content may be harder for Google to parse and evaluate. Make sure your podcast theme is evident, along with the theme of each episode. Do you have a structure where a machine could separate questions from answers? Do you have concise takeaways, such as a summary at the end of each episode?

Audio SEO ultimately means we’ll need to be more deliberate and structured in our approach to audio. As Google continues to grow and evolve across devices, we need to be hyper-aware of the content that best fits our audience’s needs. Is the searcher looking for video, audio, or text? Each modality fits a different need and a different device or set of devices in the search landscape.

Are you excited about the potential audio SEO brings to the marketing landscape? Share your thoughts in the comments. I’d love to hear from you!

How to Use SERPs to Build Your Keyword List

One of the most important, yet frustrating jobs as an SEO is to develop keyword lists for clients. There’s a lot of time and effort that goes into producing a powerful keyword list, and having a good one can be the difference between seeing the big picture, or just a small piece of it.

This is a guide to my favorite way to build keyword lists, which relies on three search engine results page (SERP) features: the “People Also Ask” box, the “People Also Search For” box, and the “related searches” found at the bottom of the SERPs.

I’ll explain why you should use these features and how you can get your hands on all of the Google-vetted queries to create the ultimate keyword list to help your clients crush their competition.

Google Approved Search Terms

These features are keyword gold mines because all three of them link to new SERPs for terms semantically related to the original query. As such, they provide great insight into how users follow up, refine, and narrow their searches, revealing relevant topics that could easily be overlooked.

Google has put a lot of resources into understanding and mapping how topics and searches are linked, and these features are the direct result of that research. Google is literally showing you how and what everyone is searching, which is why these features are so useful.

“People Also Ask”

The “People Also Ask” box contains questions that are related to the original query, which expand to reveal answers Google has pulled from other websites.

These questions make great long tail keywords to add to your list and they are an amazing source of content inspiration. The numerous ways users express the same basic question can help you expand your topics. One piece of content could easily answer multiple questions, too.

It’s important not to fall down the rabbit hole: while the box used to provide anywhere from one to four question-and-answer combinations, many are now effectively infinite, expanding as you click and easily providing hundreds of options, which gives you a nearly endless number of queries to track.

Google does not always choose the questions based on actual search queries; many of them appear to be the result of machine learning, as Google works to understand real queries and surface relevant follow-ups to save users effort. It makes sense for us to be on those pages when users decide to take Google up on the offer.

Create a spreadsheet, and for each of your keywords that returns a “People Also Ask” box, keep track of the questions people also asked and note the result Google sourced each answer from. You will likely find a lot of duplicates; once you remove them, you can narrow your keyword list down to topically related queries to explore.

“People Also Search For”

The “People Also Search For” box isn’t new to the search engine results page, but the feature got an update in February that made it more useful. Now, instead of being attached only to a knowledge graph panel, the box also attaches itself to organic URLs and contains extra queries. You’ll get up to eight related searches on desktop and up to six on a mobile device. These queries are related to the URL that surfaces them; it’s Google’s way of saying that if you didn’t find what you were looking for, you can try these options instead.

This feature requires a little more work to find: you must click on an organic search result and then navigate back to the results page before it shows. Collecting these terms that way involves a lot of work and finger cramps. Fortunately for us, there is a bit of JavaScript code from Carlos Canterello that surfaces all of the boxes on a given SERP without the back-and-forth.

Or, if you feel like doing it yourself, you can pull the raw HTML from the SERPs using the STAT API and parse it yourself.
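
If you go the parse-it-yourself route, something like BeautifulSoup gets you most of the way. A sketch, with the caveat that Google’s markup changes constantly, so the CSS selector below is a placeholder you’ll have to adapt to the current HTML:

```python
# Sketch: pulling "People Also Search For" terms out of saved SERP HTML.
# The CSS selector is a placeholder -- inspect the live markup and adjust.
from bs4 import BeautifulSoup

with open("serp.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

terms = set()
for el in soup.select("div.people-also-search-for a"):  # placeholder selector
    text = el.get_text(strip=True)
    if text:
        terms.add(text)

print(sorted(terms))  # deduped, ready for the spreadsheet
```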

You’ll want to add all of this information to your spreadsheet. You’ll end up with a ton of options, and like with the questions asked, many will be duplicates. Once you remove them, you’ll have even more long-tail keywords and inspiration to work with.

Related Searches

The final place you can use Google to find keywords for your list is the eight related searches found at the bottom of the results page. When clicked, each becomes the search query of a new results page.

Manually collecting this data could be time-consuming, but there are tools that will help you collect these searches into your spreadsheet. Like the other features, you are bound to find many duplicates that you’ll need to remove.

Evaluating the Keywords

Google will definitely offer up good suggestions, but you want to be sure you’ve only got the most relevant keywords for your project, so you’ll want to do a keyword audit.

Combine all the queries into a master list, and remove any that don’t make sense. Load them into STAT or another SEO tool of your choice and keep an eye on them for a few days so you can vet them. Organize the new queries into groups based on the SERP feature they came from, so you can track which feature makes the best suggestions and keep the data organized. (You’ll appreciate it later!)
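
A sketch of that combine-and-dedupe step, assuming you’ve collected the three features into Python lists (paa_terms, pasf_terms, and related_terms are hypothetical variable names from the earlier steps):

```python
# Sketch: build the master list, dedupe it, and remember where each
# suggestion came from so you can compare the features later.
import pandas as pd

frames = [
    pd.DataFrame({"query": paa_terms, "source": "people_also_ask"}),
    pd.DataFrame({"query": pasf_terms, "source": "people_also_search_for"}),
    pd.DataFrame({"query": related_terms, "source": "related_searches"}),
]
master = (
    pd.concat(frames, ignore_index=True)
    .assign(query=lambda d: d["query"].str.strip().str.lower())
    .drop_duplicates(subset="query")
)
master.to_csv("master_keyword_list.csv", index=False)
print(master.groupby("source").size())  # which feature suggests the most?
```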

Now, with your search volume information in hand, remove keywords that returned no search volume. Doing so gets rid of clutter and lets you focus your efforts on queries that will actually bring traffic. It’s also a good idea to remove keywords with low search volume; you’re free to choose your own definition of “low,” but this helps you focus on the higher-value phrases. You can set the low-volume terms aside rather than delete them, if you prefer.

Now that you know how to use Google to do the keyword research, it’s time to figure out the next steps, which is dependent on your SEO strategy. If you need help with any or all of it, feel free to reach out.

Search Quality Evaluator Guidelines Updated

In May 2019, Google updated the Search Quality Evaluator Guidelines for the first time since July 2018. The new version adds more detailed instructions about content creator expertise and interstitial pages, and works “E-A-T” (Expertise, Authoritativeness, Trustworthiness) into the Page Quality section in certain areas. Quality raters are expected to follow the updated guidelines over the course of their work.

What’s Changed?

The document has grown by two pages, to a total of 166. Though it is longer, the table of contents and the majority of the guidelines are the same.

If you’re an advertiser that uses interstitial pages or ads, or an app developer, you should make sure your ads don’t limit a user’s ability to get to the main content on a page.

A paragraph that explicitly mentions content creator expertise emphasizes how important it is to vet the information included in your content.

E-A-T is now part of the Page Quality section, in the explanation column of tables in sections 15 and 17.

The revisions don’t substantially alter the majority of the guidelines or how quality raters evaluate websites, but they are impactful enough for Google to have updated the document. As such, content creators, advertisers, and marketers should be aware of the changes.

Why This Matters

The Search Quality Evaluator Guidelines are what humans use to evaluate websites and search engine results pages. Though they do not have a direct effect on rankings, the judgements they make do influence improvements to the Google search algorithm.

Adding E-A-T to the Page Quality section may indicate how Google wants the quality raters to approach content evaluation. The extra emphasis on interstitial pages within the Distracting Ads section suggests that advertisers and webmasters who make use of them may see lower ratings. The additional guidance about content creator expertise may mean lower quality content is under more scrutiny.

Quick Google Quality Guidelines Cheat Sheet

Some Websites are Held to Higher Standards

Google will place certain websites under more of a microscope than others. This is the case when the content affects a person’s wealth, health, or happiness. This means sites in the health, finance, and personal development space need to pay greater attention to content quality and accuracy.

If Your Site Doesn’t Look Trustworthy, Google Won’t Treat it as Trustworthy

If Google sees spam comments, clickbait-style advertisements, poor formatting, and other things that detract from your site’s trustworthiness, it will give the site a low quality rating because it doesn’t appear trustworthy.

Every Page Needs a Purpose

All pages on your website need to have a purpose. The high quality websites out there contain content that helps the user learn something, do something, or go somewhere. Each page should be focused on helping the user accomplish their goal. This means framing content around the user intent of the keyword – not just writing whatever you want as it pertains to the keyword.

High Quality Content Can Become Low Quality Content

If you create content you believe is high quality, but then never go back and update it to keep it fresh and current, it can and will become low quality content. This is especially important for medical and financial related topics, because the information changes in these areas often.

For content to be considered high quality, it should be written by, or supported by an expert. Financial and medical advice needs to be written by, or at least contain quotes from accredited experts. Real world experience is an acceptable measurement of expertise for other topics. Life experience is considered real-world expertise, so an author who has lived through something and shared their experience can be considered an expert.

Broken, Buggy, or Hard to Use Pages are Low Quality

If you want a high-quality website, you should be maintaining and monitoring it to ensure users consistently have a functional experience. If pages aren’t loading, aren’t easy to use, or are full of bugs, Google will penalize you.

Reputation Matters

If you have great content but a bad reputation, you’ll get a lower quality score. A bad reputation will bring the entire site down. If you have a bad reputation, it’s not the end of the world because there are things you can do to improve your online reputation. However, it’s not something that can be considered an instant fix. We offer online reputation management services to help you keep an eye on things and improve them when and if it becomes necessary.

To maintain a good reputation, it’s important to provide quality customer service. When someone takes the time to leave a review of your business online – whether on Google, Facebook, or another platform – you should always take time to respond, whether the review is positive or negative. Never go on the attack, even if the reviewer is wrong. Reach out and offer to take the conversation offline via phone, or off the review platform via email, to ensure it gets resolved.

To ensure your website gets a high quality rating any time a quality rater takes a look at it, you’ll want to:

  • regularly check your site for broken links and images (making repairs if any are discovered)
  • keep your content up to date and add new information when and where appropriate
  • remove spam comments
  • create author biographies that showcase experience
  • keep the user experience clean and remove any distracting ads
  • make your site as easy to use as possible

Ultimate Guide to Image Optimization

Images are a vital part of the online experience: visual content produces better recall, helps people follow instructions more reliably than text alone, and increases engagement on Facebook and Twitter.

But not all images we see online are created equal. Without proper optimization, huge file sizes can drag down load time.

Here’s a quick and easy guide to use to help you make sure you’re handling all your images the right way.

Why is Image Optimization Important?

Whether you’re using an existing image or part of the 30% of marketers creating your own visuals, optimization is important to page load time, which is crucial for user experience and plays a role in search engine ranking. Up to 40% of visitors will abandon your site if it takes longer than three seconds to load, and 50% of your visitors (or more) may be on slower mobile connections! That’s why you want to strike a balance between image quality and file size.

Beyond size and quality, image optimization also factors in the file name, alt attributes, thumbnails, and image sitemaps. Addressing all of these is key to getting the most from your images.

A Closer Look at Web Image Formats

There are many different image formats you can use, but for the sake of time, I’m only going to focus on the three most popular and widely used options you’ll see online.

JPEG/JPG

JPEG stands for Joint Photographic Experts Group, the organization that created the standard. It’s a lossy graphics format, and it can also use the JPG file extension.

PNG

PNG stands for Portable Network Graphics. It uses lossless compression, so no information is lost when images are compressed. The PNG format was created to address limitations of the GIF format and to provide an image format free of patent licensing. It supports image transparency, which is particularly useful on the web. You can choose between PNG-8, which handles a maximum of 256 colors, and PNG-24, which allows for millions of colors.

GIF

GIF, or Graphics Interchange Format, is an image format that allows for animation. Animated GIFs are everywhere these days, and especially easy to find on Facebook and Twitter. You can have single-image GIFs, or combine several images to create an animation. GIF compression itself is lossless, but because the format is limited to a 256-color palette, converting a photo to GIF can still lose color information.

Choosing the Best Format for the Task

Unfortunately, there’s no universal file format that makes the best choice for web content. JPG is best for still images and photography. GIFs can only display a maximum of 256 colors, so they are great for simple animations, graphics with flat colors, and graphics without gradients. PNG handles still images and transparency.

JPEG is best for images of natural scenes and photographs with smooth variations in intensity and color. PNG is best for images that need transparency and for images with text or sharp-edged objects (think logos). Use GIF for anything animated.

Let’s take a look at an image in each of the three file formats – kept at the same image size for easier comparison. We’re using a free stock photo from Canva at 300×300 px.

The JPG is 81 KB.

The PNG is 116 KB.

The GIF is 55 KB.

Not much of a visible difference to the eye, but massive differences in the image size. Now, let’s use compression to shrink the file size even more, and watch what happens to the image. In this case, the GIF would be the best to use, simply because it’s the smallest one.

Compression Tools

There are a number of image compression tools available to help you shrink your images without compromising quality.

I opt to use Compress.io any time I have a GIF to compress, because I can handle the three major file formats with the same tool. In addition to GIF, it also supports SVG files, which are on the rise for logos. Plus, before you download the compressed version, there’s a slide tool that lets you compare the original version to the compressed version.

Here are the same images again. I’m willing to bet you won’t be able to see a difference in any of the images.

The compressed JPG is 10.89 KB, for an 87% reduction in file size.

The compressed PNG is 39.93 KB, for a 67% reduction in file size.

The compressed GIF is 21.39 KB, for a 61% reduction in file size.

After compression, we have a significant reduction in all the file sizes, but the GIF is no longer the smallest file size. JPG is the clear winner here, with about half the file size.

Each image you work with online should be compressed so it takes up less space on your hosting plan and allows it to load faster for the end user.
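
If you’d rather script this than run files through a web tool one by one, Pillow can batch the job. A sketch; the folder names and quality setting are just starting points:

```python
# Sketch: batch-compress a folder of images with Pillow (pip install Pillow).
# Note: animated GIFs need save_all=True; this handles single-frame files.
from pathlib import Path
from PIL import Image

Path("compressed").mkdir(exist_ok=True)
for path in Path("images").iterdir():
    suffix = path.suffix.lower()
    if suffix not in {".jpg", ".jpeg", ".png", ".gif"}:
        continue
    img = Image.open(path)
    out = Path("compressed") / path.name
    if suffix in {".jpg", ".jpeg"}:
        img.save(out, "JPEG", quality=80, optimize=True)  # tune per image
    else:
        img.save(out, optimize=True)  # PNG/GIF: lossless optimize pass
    print(path.name, "->", out.stat().st_size // 1024, "KB")
```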

Image Names

To increase the chance your images show up in relevant Google Images searches, use clear, descriptive file names rather than the auto-generated ones from your camera. A photo of a red rose, for instance, could be rose.jpg, red-rose.jpg, or red-rose-in-hand.jpg.
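
If you’re renaming images in bulk, a tiny helper can turn a description into a clean filename. A sketch:

```python
# Sketch: turn a human description into a descriptive, hyphenated filename.
import re

def slugify_filename(description: str, ext: str = "jpg") -> str:
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{slug}.{ext}"

print(slugify_filename("Red Rose in Hand"))  # -> red-rose-in-hand.jpg
```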

Alt Attributes

Your Alt attributes are what screen readers use to describe images to those who have vision issues, making them critical for ADA compliance. Here’s what Google says about what makes a good alt tag:

“Google uses alt text along with computer vision algorithms and the contents of the page to understand the subject matter of the image. […] When choosing alt text, focus on creating useful, information‐rich content that uses keywords appropriately and is in context of the content of the page. Avoid filling alt attributes with keywords (keyword stuffing) as it results in a negative user experience and may cause your site to be seen as spam.”

Image Sitemaps

Images give Google a great deal of information about the content on your website. If you include an image sitemap, you can give Google additional details about the images, along with the URLs of images the crawler may not otherwise discover. Image sitemaps can contain URLs from other domains, so you can use a content delivery network (CDN) to host your images.
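
An image sitemap is ordinary sitemap XML with an extra image namespace. A minimal sketch using the standard library; all URLs are placeholders:

```python
# Sketch: generate a one-entry image sitemap. The image namespace is
# Google's documented sitemap-image extension; URLs are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", NS)
ET.register_namespace("image", IMG)

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "https://www.example.com/roses"
img = ET.SubElement(url, f"{{{IMG}}}image")
# A CDN-hosted image on another domain is fine here:
ET.SubElement(img, f"{{{IMG}}}loc").text = "https://cdn.example.com/red-rose-in-hand.jpg"

ET.ElementTree(urlset).write("image-sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```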

Focus on image optimization as part of your search engine optimization strategy, and you’ll reap many benefits.

User Intent: Crafting Content Based on What Users Want and Need

To be successful with your SEO campaigns today, the most important thing to consider is user intent. Behind every search, there is an intention. People are looking for something in particular when they search, whether it’s the answers to their problems, information about available services, or sources for a product they want. If you want people to be able to discover your business online, your content must be optimized for user intent.

Google’s algorithm has come a long way in its ability to recognize user intent, and that’s how it serves the most relevant content to the user. As a result, understanding user intent as you create content will improve the relevance of your website, and thus go a long way toward improving your SEO.

Understanding the Types of User Intent

To create an SEO strategy based on user intent, you must first be able to tell the difference between the types of user intent. Search Google with the terms your audience will be looking for, and based on what you see, you’ll be able to determine the type of content users want to see at the various stages of intent.

Informational

In the informational stage, users are trying to gather more information about a topic or product, but are not ready to buy. This is where you’ll find how-to posts and tutorials particularly useful. There are generally no ads on these types of searches because there isn’t a particular product to buy.

Navigational

In the navigational stage, the user is looking for content to help them consider their options, but still isn’t ready to buy. This is where you’ll find “best of” lists and the like, which help people make their decision. These pages may contain links to buy, but the sites aren’t pushing the sale too hard.

Transactional

At this stage, the user is ready to buy. They have their credit card in hand and are ready to purchase. They may search “where can I buy?” Most of the search results that will show up are from online stores that sell the product the user wants to buy. These sites are delivering product pages, because they know based on the nature of the query, the user already has the information they need.

Audit Your Existing Content

Now, take a look at whether your existing content matches the needs of your audience. You can do this with Google Analytics. Look at your top performing keywords. If you determine that your top performing keyword, “buy laptop” (a transactional keyword), leads to a blog post with tips on how to choose the best laptop for your needs, you’re not giving users what they want.

In this case, you should adjust your strategy so that people who search for “buy laptop” are taken to a product page instead of an informative blog post. This way, your users get what they are looking for, and you’ll get more conversions.

Look at your other blog posts. If users aren’t seeing them, start incorporating more informational keywords. And if your product pages aren’t showing up in the search results, add some transactional keywords to improve your ranking for that user intent.

Create Content with User Intent in Mind

Keywords are an important part of your SEO strategy, but they aren’t the only thing you need to consider. You must craft the content around the keywords, based on user intent. Because Google understands user intent, if your content doesn’t match the intent for your keyword, you won’t rank for it no matter how well other factors indicate that you should.

For instance, if you’re targeting the long-tail keyword phrase “how to bake a cherry pie” but you stray from that subject in your blog post and it ends up being more about how to grow a cherry tree, you won’t show up in any results related to baking. Instead, your content will show up in searches related to gardening, which doesn’t match your target audience. You’ll get traffic either way, of course, but without matching intent, that traffic won’t be targeted, which means conversions will suffer, and ultimately, so will your revenue.

Plan content for each type of user intent as this will help guide users through each stage of the buyer’s journey. That way, users will discover you in the informational stage, and they’ll find helpful content from you again once they move to the navigational phase. At this point, you’ll have built up some trust and credibility with the audience so you can nurture them all the way through buying your product or service.

Take a look at all the keywords you’re trying to rank for. Separate them into buckets based on intent, as in the sketch below. Create content that supports that intent, even if it means reworking or completely scrapping your existing content. Each time you conduct keyword research to find more potential ranking opportunities, make sure you first understand the intent of the keyword.
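
Here’s a rough first pass at that bucketing, using simple modifier lists. The word lists are illustrative only; check the actual SERP for each keyword before trusting the label:

```python
# Sketch: sort keywords into intent buckets with naive modifier matching.
# The modifier lists are assumptions -- expand them for your own niche.
INFORMATIONAL = ("how to", "what is", "why ", "guide", "tutorial")
NAVIGATIONAL = ("best", "top ", "review", " vs ", "compare")  # comparison stage
TRANSACTIONAL = ("buy", "price", "cheap", "deal", "coupon")

def intent_bucket(keyword: str) -> str:
    kw = f" {keyword.lower()} "
    if any(m in kw for m in TRANSACTIONAL):
        return "transactional"
    if any(m in kw for m in INFORMATIONAL):
        return "informational"
    if any(m in kw for m in NAVIGATIONAL):
        return "navigational"
    return "review by hand"

for kw in ["buy laptop", "how to bake a cherry pie", "best laptops"]:
    print(kw, "->", intent_bucket(kw))
```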

Knowing (and making use of) user intent will make your SEO strategy that much more successful. By following this advice, you’ll be serving your audience with highly useful content at the right stage of the journey, and will keep them coming to your site. It’ll go a long way toward ensuring those people become your customers, and start referring more customers to you.

If you want help with crafting a user-intent based SEO strategy, you can reach out to the team here at Sachs Marketing Group.

How to Create a Global SEO Strategy

If you’re nailing your business here in the United States and feel like it’s time to expand your brand to an international market, it’s time to create your global SEO strategy. Being successful with SEO in other countries takes more than translating your website and letting people select the language they prefer to read it in.

I don’t recommend this approach for just any brand. It’s generally only worth your time and effort if your analytics data reveals you have a steady amount of traffic coming from a foreign market or a spike in customers from a particular region or country.

Global SEO vs. Local SEO vs. General SEO

International SEO focuses on optimizing your content for a number of regions across the globe. Local SEO focuses on businesses with a brick-and-mortar presence in a local area, and relies on general SEO plus reviews, business information networks, and local directories. General SEO focuses on the basic factors that make it easy for search engines to crawl, index, and list your content. General factors include:

  • Keyword optimization
  • Site speed
  • Sitemap
  • User experience
  • Backlinks
  • Schema markup

You’ll use general SEO whether you’re focused on local or global SEO.

Let’s say you sell shoes and have business operations in Spain, Germany, and France, but your home market is the United States. Your international SEO optimizes for those countries, using the word “shoes” in their languages.

In Spain, you’d optimize for “zapatos”

In Germany, you’d optimize for “Schuhe”

And in France, you’d optimize for “des chaussures”

And of course, at home, in the United States you’d still optimize for “shoes.”

Relevance Remains Key

No matter the market, Google wants to ensure the content it serves is relevant to the user. On-page and off-page SEO, along with visitor behavior, are used to determine relevance. You won’t be successful if you copy and paste content across pages and simply translate it into each language. Each market has unique needs, and your site’s content needs to reflect that. What performs best in the United States may be completely irrelevant to your German market, so craft content for each region and market.

Now let’s get to the nuts and bolts of crafting your strategy.

Step One: Determine Potential in Target Countries

Estimate your international SEO potential by using tools such as Google Analytics and SEMRush Domain Analytics to determine how your website is ranked in other countries. In Google Analytics, go to Audience > Geo > Language or Location and look at the number of sessions from the countries you’re interested in targeting.

Step Two: Conduct a Competitive Analysis

Regardless of your international SEO potential, you need to know who you’re going up against in the new market. You may already know who your biggest competitors are in the country, but they may not be as successful as you think when it comes to their online presence. As such, you should focus on your organic competitors.

Step Three: International Keyword Research

After you determine who your main competitors are, determine the keywords they are indexed for, so you can choose the best ones you should use for your SEO efforts. This helps you get a bunch of keywords you can use to improve your content and gives you a decent chance at being able to compete against them. Repeat the process for a few competitors to make sure you have plenty to work with.

Step Four: Localizing Your Brand

With the international keywords in hand, it’s time to create content in each of the local languages. You shouldn’t just translate existing content and stuff it with keywords, because markets differ. If you’re selling fruit in a country with a cooler climate, content written for a warmer-climate market wouldn’t make sense to the audience even if it were translated.

It’s worth the investment to hire a native speaker to translate content you’ve written specifically for the audience – or to write the content itself. Native speakers are more familiar with the nuances of language and idioms, helping you avoid issues like using the wrong form of “you” (many languages require the polite form when speaking with someone you don’t know). Germans, in particular, are stricter about the formal “you” than Italians, for example.

Step Five: Address the Technical Aspects

Once everything else is planned, it’s time to handle the technical side of SEO, including:

Website structure: Are you targeting a language or a country? If a language, use the language-targeting approach; if a certain country or audience, use country targeting. Then determine whether you’ll use country-code TLDs, subfolders, or subdomains.

Server location: The further your server is from your audience, the slower the site loads. You can go with a content delivery network that covers multiple markets, a single provider with multiple server locations, or hosting on servers in key markets.

Correct hreflang implementation: Done correctly, hreflang ensures users in each of your target countries land on the right country or language version of your site. Done improperly, it can damage your rankings and user experience, since the annotations are meant to cross-reference pages that have similar content but target different audiences.
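
For illustration, here’s what generating the hreflang cluster for one page might look like. The locale-to-URL mapping is an assumption about how a site could structure things, not a prescription:

```python
# Sketch: emit the hreflang link tags for one page. Every locale version
# of the page should carry the full set, plus an x-default fallback.
LOCALES = {
    "en-US": "https://www.example.com/shoes",
    "es-ES": "https://www.example.com/es/zapatos",
    "de-DE": "https://www.example.com/de/schuhe",
    "fr-FR": "https://www.example.com/fr/chaussures",
}

def hreflang_tags(default_url: str) -> str:
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in LOCALES.items()]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
    return "\n".join(tags)

# Paste the result into the <head> of every locale version of the page.
print(hreflang_tags(LOCALES["en-US"]))
```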

An important thing to remember is data privacy regulations vary from one country to the next. Europe’s General Data Protection Regulation (GDPR) got a lot of media attention, but Africa and Asia also have their own guidelines in place, known as Personal Data Protection Guidelines for Africa (PDPGA) and Asia Pacific Data Regulation and Cyber Security Guide, respectively. When you build out your global SEO strategy, make sure you’re in compliance with all data privacy regulations in the markets where you’re conducting business.

If you’re ready to go international with your business, let’s talk. I can help you develop the right strategy to expand your brand into new markets and keep your customers happy.
