Digital Marketing

The Google Discovery Feed and Advertising

Word on the street is that the Google Discovery feed will become Google’s next advertising cash cow in 2020. It hasn’t attracted much attention since its beta launch last year, but Google is certainly excited about it. Ads in the personalized mobile news feed are expected to reach more than 800 million users in 2020. The company watched Facebook’s news feed ads grow as Google+ faded away. Tests have been conducted in the mobile feed, and Google is excited by the current results. We don’t know exactly what is impressive, but it’s certainly something to watch. Not everyone has seen the ads yet, even though testing initially began in late 2018. Google officially introduced Discovery campaigns at Google Marketing Live last May.

What is Discovery?

Discovery is a personalized service that shows news articles, stories, and topics based on the individual user’s search history. Browsing, app behavior, location history and settings, and stated interests also come into play. Discovery ads are centered around images and are similar to social creatives.

Visibility is Still a Concern

The biggest downside we can see is that advertisers cannot target ads to the feed alone. Discovery campaigns are part of Google’s machine-learning-powered campaign types that run across multiple surfaces automatically. Discovery ads will also serve users on Gmail and the YouTube home feed. Combined, these placements can reach hundreds of millions of users. While the ads cannot be targeted solely to the news feed, the campaign results do compensate for the lack of control and visibility. Many advertisers are willing to accept Discovery campaigns as they are for the sake of improved sales.

Which Advertisers Should Test Discovery?

Advertisers are starting to test Discovery campaigns with clients across a variety of businesses. It isn’t so much the product or service that matters; the goals do. Brand awareness, increased sales, and customer acquisition are goals that have been reached with major success. Even a goal as simple as increased site traffic tends to show new activity after a Discovery campaign. The point is, Discovery is a high-volume, low-CPC channel with the extra benefit of boosting brand awareness.

Targeting the Right Audience

Remarketing is an important step for businesses that want to bring previous customers back to their website and build brand loyalty. Brand loyalty is perfect for building stability and maintaining relationships with the target audience, and it eventually leads to new connections as users talk about the business. Is that possible with Discovery campaigns? One would initially think not, since there is little control over visibility. Yet many advertisers have been using Discovery campaigns for remarketing and have found them to be successful.

The reason is that Google Analytics allows you to build tag-based audiences that become part of the Discovery campaign. That creates a sense of control over who is seeing your ads. The best part is that tag-based audiences are more targeted and result in a lower CPA. The downside is that volume is lower than with general audiences, but that is to be expected with tag-based audiences.

Repurpose Your Creatives

You may have noticed carousel and single-image ads on Facebook. These are so successful there that Discovery campaigns are designed to support the same formats, which gives you more flexibility with your past and current creatives while designing new ones. Many advertisers have tested carousel and single-image ads on Discovery that were originally used on Facebook. Being able to take creatives from other social channels and repurpose them for Discovery campaigns adds freedom to the creative process, and it also helps maintain a consistent brand across the internet. It is recommended that you use multiple images in both landscape and square aspect ratios, and your ad should look native to the Discovery feed.

Want To See How Successful It Is?

The only way to truly see how successful your Discovery campaign is, is by reviewing key performance indicators (KPIs). They should be tied to your client’s goals, but you can also look at the impact of the Discovery campaigns on your broader customer base.

If you are running a remarketing campaign, CPA and ROAS will be the most reliable numbers to review. If the focus is on the top of the funnel, impressions versus sales and the customer path in Google Analytics will be more helpful. For goals like increased sales and traffic, focus on visitor counts, lurkers, completed purchases, and interactions with consumers on social media platforms.

What Does the Future Hold?

For now, those with a Google account manager can get into the beta. You can start small, think $50 per day, and scale from there to see its effectiveness. Setup can be tricky since the product is still in beta, and there are some frustrations with having to re-upload assets.

We are still waiting to see if Discover will be another successful path for Google. This time around, the company is moving slowly; we know this because of the relatively little chatter since the ads launched. Until numbers are released, we really don’t know the weight the news feed is pulling. We have heard success stories from many advertisers, and yet results can still vary. Advertisers can see channel data in Google Analytics source/medium reporting, and some who have done so see situations where half the campaign traffic shifted to another platform, like YouTube, where there is volume and success awaiting.

For some of you, it may be better to wait a little longer before doing a test run. Allow Google time to work out some kinks and improve the system. Be mindful that costs could rise as the product improves and once it is released to the broader advertising public. Now is a good time to set aside some budget to use when it is.


Google Explains How to Use Headings for SEO

Headings help readers who scan through to know what to expect when reading an article. Headings also play a heavy role in SEO presence, but not necessarily for the reason you may think. Google uses H1, H2, and other HTML headings in a specific way for displaying articles in searches on its platform. When Google speaks, we listen, because we want to make sure to incorporate its advice into our SEO strategy.

A Bit Of History On Headers

In the early 2000s, heading elements were among the ranking factors Google used to determine where to rank a page for a particular keyword or phrase. If you wanted to rank, one of the most important things to do was to use your keywords throughout the headings. For the past few years, this hasn’t been the case, but it’s still a common SEO practice today. It’s a habit for many, though if you look at top-ranked sites, you’ll likely see headings that don’t include keywords.

Google’s Thoughts on Headers and Keywords

John Mueller was recently asked about Google’s thoughts on the use of keywords in a heading and their ranking ability.

Mueller responded: “I think in general, headings are a bit overrated in the sense that it’s very easy to… get pulled into lots of theoretical discussions on what the optimal headings should be.”

Google no longer ranks pages by heading keywords; it looks at the heading together with the content that follows. Google examines the heading to ensure the subsequent information matches it, which is how headings should work.

H1 Headings Do Not Outrank H2 Headings

In the past, it was understood that headings were hierarchy-based: H1 was more important than H2, and H2 was more important than H3. The most important keywords would be placed in the highest-level heading, and lesser keywords in lower-level headings. Though this may have been the case 15 years ago, it doesn’t apply today, despite the fact that many people still approach headings this way.

Google explains that headings are irrelevant to rank. Should they still be used? Yes! They are essential for accessibility and for user experience.

What Is The Proper Use For Heading Tags?

Google stands by the idea that the best use of heading tags is for the reader. They indicate the information to follow or introduce a video or an image. Google doesn’t look for the keyword but at the quality of the text. It wants to rank a site because the information is relevant to the heading and draws the reader to look at more information. The right searches will find your information, not the keywords in the header.

Heading Tags Aren’t Ranked

Heading tags made the top lists of ranking factors for years, even though Google has made changes and has been transparent about them. If you do your own search and really study the results, you will notice the top results often don’t include a keyword in their headings. Google stands by providing search results based on what the content is about, and that is all.

John Mueller has said: “So it’s not so much that suddenly your page ranks higher because you have those keywords there. But suddenly it’s more, well, Google understands my content a little bit better and therefore it can send users who are explicitly looking for my content a little bit more towards my page.”

Mueller also took the time to explain the proper use of heading tags: “So obviously there’s a little bit of overlap there with regards to… Google understanding my content better and me ranking better for the queries that I care about. Because if you write about content that you want to rank for, which probably you’re doing, then being able to understand that content better does help us a little bit. But it’s not that suddenly your page will rank number one for competitive queries just because you’re making it very easy for Google to understand your content. So with that said, I think it’s useful to… look at the individual headings on a page but… don’t get too dug down into all of these details and variations and instead try to find a way to make it easy for people and for scripts to understand the content and kind of the context of things on your pages.”

What This Means For Websites

The great thing about understanding the way Google uses headings is that it encourages more quality content than the keyword stuffing we saw for several years. Quality content is much more beneficial to your audience than keyword-stuffed fluff, and Google wants to make its users happy. If people can’t find what they’re looking for in Google, then Google risks losing them to other search engines.

By worrying less about keywords in your content headings, you can ensure your content comes across with more authenticity. You’re less likely to be hit with penalties for keyword stuffing, and more likely to rank ahead of any competitors who are still relying on the old school tactics. Ultimately, you’ll earn a reputation for quality, which helps to build and strengthen relationships with your consumers.


January 2020 Google Core Update

On January 13th, 2020, Google announced that its first core update of the year would be rolling out that day. It’s important to remember that, as with any other core update, there’s nothing necessarily wrong with the pages that drop in rankings after an update. They are just being reassessed against content that has been published since the last update or content that was overlooked previously.

Widely noticeable effects are to be expected, which may include gains or drops, so paying attention to your rankings in the days and weeks following core updates is important. If you find that your rankings dropped, look at what is ranking ahead of your content and consider how you can create an even more comprehensive solution for searchers.

Compared to previous core updates, this one seems fairly substantial based on early signals and chatter. The core update we saw in September 2019 was slow to roll out and caused little upheaval. The January 2020 core update, however, seems much larger than the September update, according to what’s going on in the SEO community.

Core updates usually take a few days but can take up to two weeks to fully roll out, which is why it’s important to keep an eye on your analytics and traffic in the days and weeks after Google announces one.

Part of the core update included a change to the search engine results page layout, which now displays company favicons in desktop search results.

How Big are the Changes?

It may be a few more weeks before we can truly determine the scope of all of the changes, but it is safe to assume that Google is adjusting its trust in entire domains based on machine learning for these core updates. The more credibility and trust a domain has in its industry, the bigger the potential change when Google adjusts that value internally. We can see this clearly in the health and finance sectors, which at the start of the core update process have seen changes that are proportionately greater than others.

With the latest core update, Google has widened the circle of affected domains, and the trend continues. In the daily visibility index data for mobile search results from Sistrix, you can see the first reactions begin on the 15th and 16th of January.

Who Won as a Result of This Core Update?

Because this update focused on a domain-wide recalculation of Google trust, there are both winners and losers: either all of a domain’s content ranks better than before, or the domain drops a few positions. So far, the winners include the following domains:


Though many expected movement in the health sector, because there are rumors that Google is planning something in this area, the winners span a wide range of sectors and content types.

Which Domains Lost?

With winners, also come losers. The top losers so far are:

● BoxOfficeMojo

A lot of the focus in past core updates seemed to impact “Your Money, Your Life” (YMYL) domains, but what’s surprising about this round is the number of sites related to car buying. This may be related to the amount of finance-related information on those sites, such as pricing, financing, and insurance. An automated, data-based quality evaluation is behind the changes, and that can differ greatly from a human evaluation.

Domains that relate to YMYL topics have been re-evaluated by the algorithm, and are either gaining or losing visibility as a whole. Domains previously affected by core updates are more likely to be affected again in the future. The good news is, the absolute fluctuations seem to be declining with each update, as Google is becoming more certain of its assessment and doesn’t deviate much from the previous one.

We’ll have to wait and see what comes of an April 2020 core update. Though nothing has been announced, because the focus is still on the January update, it’s reasonable to expect updates on an almost quarterly basis, since we saw one in June 2019, September 2019, and November 2019 (the November update wasn’t announced but was confirmed by Google).


Google Has Changed How Local Search Results are Generated

Until recently, searches for local businesses needed to include a business name and location, which isn’t very helpful if you don’t know the exact business name. If you misspelled the name or completely butchered it, the business was likely not to show up at all. This led to a lot of user frustration that Google has finally addressed. Earlier in the year, Google announced some changes to search. In November, Google rolled out a tweak for local searches, and we think it’s a big relief for many.

November 2019 Local Search Update

It took the entire month to roll out the new update, and so far, it’s working well. Many attribute it to neural matching, but what is that? Google calls it a super-synonym system; we call it smart. Neural matching gives Google the chance to understand the meaning behind searches and match them to the local businesses most relevant to the search. This gives some businesses a chance to be found when they may not have shown up previously. It also means that some businesses that dominated a keyword will show up less.

Artificial Intelligence Behind The Searches

Artificial intelligence (AI) has been used for a lot of different things, and Google has been in the game for a while. Google Analytics is a perfect example of AI and its abilities: it knows a person’s browser, location, the websites they visit most, and how much time they spend on those websites. Google Maps also uses AI to help with travel. It gives you an ETA and alerts you to detours and traffic jams. Maps will even offer a new route to compensate for a missed turn or to avoid a traffic jam. The next step was to use AI in search to simplify and streamline the user’s experience, just as Google has always strived to do.

No More Multiple Listings For One Site

How many of us have typed in a search and seen six listings for Pinterest in the results? Or maybe we want to shop and we get a whole page of Amazon links. Annoying, isn’t it? That was one of the reasons the November 2019 local search update was implemented. Many businesses weren’t getting the chance to be seen because a few large websites took up most of the first page, and the likelihood of people making it to the second page isn’t very high. Users also don’t want to see just one business listed; they want the smorgasbord. Trust us, if they wanted to shop at Walmart or visit Pinterest, they would’ve gone straight to that site instead of Google. While the businesses that appear most often may not enjoy the shift, they will still be seen after the local businesses. We live in a society that is striving to get back to supporting local businesses, and Google seems to have jumped on board.

The Algorithm Effect

Many have wondered how this new update affects our usual searches. We all know that Google caters to the individual user: searches are based on past searches, and Google shows what it thinks the user wants to see. But the update now promotes more localized businesses, so how does that affect your future searches? Google says that results will shift a bit, but only to match the user’s new activity.

When searches surface new information users haven’t seen before, and users click through to those new pages, the algorithm shifts to match. Let’s say you are searching for an old cookie recipe your grandma used to make. While a recipe site will appear, other local businesses related to cookies and baking will appear as well. If you fall down the rabbit hole of looking at baking supplies from a local store and the new bakery opening down the street, your results will shift a bit. The more this happens with your searches, the more the algorithm shifts to fit you as a user. So what does this mean for a business that is no longer appearing as often in searches? It means engagement could slow down or be lost.

Good For The User, Bad For Business?

Google has always been about the user, not so much the business. While users will now see businesses they hadn’t seen before, some businesses won’t be seen as often. So what does a business do if it suddenly sees a drop in numbers? The same as always: adjust social media interactions, tweak marketing plans, and look for new and innovative ways to bring people to the site.

It isn’t a bad thing for business when Google rolls out these updates. Users have been asking for this change for a while, and Google listened. It can be frustrating for a business, but it’s a situation you have to roll with. It helps to re-think who you are interacting with; maybe you need to refocus and market more toward your local audience. While Google is streamlining the experience for the user, it is also working to get the best matches for businesses as well. If you don’t want to lose that fabulous market you have built on the other side of the world, you can still work to keep them on board.

These updates are barely a month old, and there will be a lot to witness over time. So far, the changes have had a positive response from users. As algorithms shift and AI gets smarter, we will really see how well the updates do. We believe they will make business with users better. Change is always a frustrating hurdle for businesses, but fortunately, Google has been talking about this one for a while. Hopefully, businesses have made flexible plans to keep their engagement and continue to grow.



Using Schema to Create Google Actions

Recently, Google announced that publishers can now create Google Actions from web content with schema markup.

Google Actions are a great way for brands to get more mileage out of their SEO strategy while offering another chance to reach searchers organically. Optimizing your website for newer SEO features such as Google Actions and Rich Results is becoming increasingly critical to keeping the Google algorithm happy.

Though the option is not available for every content type, this new capability is huge for those of us who are less technical.

What Are Google Actions?

Google Actions are apps that are designed for Google Assistant. They range from apps like the Domino’s delivery action to health and fitness apps, ride-hailing services, and even personality tests.

Actions work when the user prompts Google Assistant with a phrase like “OK Google, [Action].”

All Actions run in the cloud, even though users can access them on any device with Google Assistant enabled. Each Action is tied to a specific intent and is programmed with the corresponding fulfillment process to complete the request.

Google Actions and Schema

Schema refers to a type of microdata that provides Google with more context about the intent of any piece of content.

Adding schema markup to a web page can create a rich result, an enhanced description that appears on the first page of Google. Rich results include everything from “book now” buttons for local businesses to recipe instructions, events, and contact information.
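To make this concrete, here is a minimal sketch, in Python purely so it stays self-contained, of what a JSON-LD schema block looks like. The business name, address, and phone number are placeholder values we made up; only the `@context`/`@type` structure follows the schema.org vocabulary.

```python
import json

# Hypothetical local business details; the keys and @type values
# follow the schema.org LocalBusiness vocabulary.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
    },
}

# JSON-LD is embedded in the page inside a script tag of type
# application/ld+json, which search engines read but browsers ignore.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

The resulting snippet is what you would paste into the page’s HTML; Google’s Rich Results Test can then confirm the markup is readable.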

Search engines need to match content to search inquiries, and part of determining the quality of a search result depends on user intent.

Schema is a way for websites to alert search engines to the intent behind the content. It’s also required for websites that want to be eligible for Google’s rich results, which account for an increasingly large portion of the first page of search results.

Adding schema markup alone, of course, will not guarantee you land at position 0. You’ll have to follow Google’s recommendations perfectly and choose the right schema for the page you’re targeting. Your content must also be useful, engaging, and credible.

Google’s latest announcement brings schema to Google Actions, which offers another channel for you to earn some of your SEO share back.

For content creators, this means they can now create Google Actions whether or not they know their way around Dialogflow or the Actions console. Instead, Google will automatically generate an Action when publishers add specific markup to the eligible content types.

Content Types for Google Actions Schema

Using schema for content actions provides an opportunity for you to increase brand awareness in a format that has limited advertising opportunities. With schema markup, Google can create a variety of actions based on five types of content that you may publish on the web.


FAQ Pages

Google’s guidelines state that you can apply the FAQ schema to any page that features a list of questions and answers on nearly any subject. This means the option isn’t limited to an official FAQ page on your website; you can create FAQ pages for any resource or topic relevant to your business.

The FAQ schema, whether it’s linked to an action or not, allows brands that aren’t in position 0 to take up a ton of real estate on the search engine results pages.

As with other types of schema, your FAQ markup needs to match what’s on your website 100%. Otherwise, Google may hit you with a manual action. It’s also worth noting that FAQ content is purely informational in intent, and as such, you should not use the markup as a free advertising channel.

By converting your FAQ pages into Google actions, the Google Assistant reads your answers out loud when searchers enter a related voice query.
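As a rough sketch of what that markup involves, an FAQPage block pairs each question with an accepted answer. The questions and answers below are placeholders; the FAQPage/Question/Answer types are the ones schema.org defines for this rich result.

```python
import json

# Placeholder question/answer pairs; in practice these must match
# the visible FAQ content on the page exactly.
faqs = [
    ("What are your hours?", "We are open 9am to 5pm, Monday to Friday."),
    ("Do you offer delivery?", "Yes, within a 10-mile radius."),
]

# Build the FAQPage structure: one Question entity per pair, each
# carrying an acceptedAnswer of type Answer.
faq_page = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print(json.dumps(faq_page, indent=2))
```

Because every Q&A lives under `mainEntity`, adding more questions is just a matter of appending to the `faqs` list.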


Recipes

With recipe markup, you can promote your content through rich cards presented in Google Assistant, and users can learn about your content in the Assistant directory. You can use it to highlight nutritional information, ingredient lists, prep time, and images to get searchers interested in your food.

You can also use the recipe schema together with the guidance markup to give consumers a way to follow along with audio instructions for your recipes.

You’ll need to fill out a Google Form to get started with the feature. It requires only your name, company name, email, and domain.

You’ll need to be sure that your page features both the recipe and guidance markup to be eligible for the rich search results and as a Google action. You’ll also need to make sure that you’ve set up your structured data correctly.
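A hedged sketch of what such a page’s structured data might contain: the recipe name, ingredients, and steps below are invented placeholders, but the Recipe type with `recipeInstructions` expressed as HowToStep items follows the schema.org vocabulary that step-by-step guidance builds on.

```python
import json

# Hypothetical recipe; prepTime uses an ISO 8601 duration string.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Placeholder Chocolate Chip Cookies",
    "prepTime": "PT20M",
    "recipeIngredient": ["2 cups flour", "1 cup chocolate chips"],
    # Each instruction is its own HowToStep, which is what allows
    # an assistant to walk through the recipe one step at a time.
    "recipeInstructions": [
        {"@type": "HowToStep", "text": "Mix the dry ingredients."},
        {"@type": "HowToStep", "text": "Bake at 350F for 12 minutes."},
    ],
}

print(json.dumps(recipe, indent=2))
```

Splitting the instructions into discrete steps, rather than one text blob, is the detail that matters most here.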


Podcasts

Google announced last May that it would be adding podcasts to the search results with a new structured markup option.

For podcasters who were reliant on search features in Stitcher or Apple Podcasts, the option to improve discoverability in Google search results is huge.

With this markup, podcasters can improve their showing in the search results and on Google Podcasts, with individual episode descriptions and an embedded player for each episode on the first page. An additional new feature, deeper podcast search, allows users to search for actual audio directly inside the podcast using Google’s transcriptions.

Connecting podcasts to a Google Action elevates things to the next level because it makes it easy for users to find your podcast in the Assistant directory and play episodes from their smart speaker, Google Home display, or phone.

All you have to do is sign in to the Google Play portal, click “Add podcast” from the menu at the top right corner, add your RSS feed, apply the required tags, and then follow Google’s podcast markup guidelines to ensure you create an automatic Action.


News

Adding markup to your news content helps increase your visibility and gives users the option to consume your content via Google Assistant.

You can apply this schema to blog posts and news articles, though you will need to be registered as a publisher on Google News to take advantage of the option.

The news markup makes stories stand out visually in the search results pages. Features like the top stories carousel, the host carousel, visual stories, and headlines give publishers a larger piece of search real estate and the opportunity to attract more organic traffic to their sites.

To add voice compatibility to the list of features, you’ll have to choose between AMP and non-AMP formatting.

To turn your news content into a Google Action, you must sign up with Google Publisher. You must also have a dedicated news site that uses static, unique URLs with original content. Keep ads, sponsored content, and affiliate links to a minimum, and consider using news-specific XML sitemaps for easier crawling.
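For orientation, a minimal sketch of article markup might look like the following. The headline, date, author, and image URL are all placeholder values; the NewsArticle type and these property names come from the schema.org vocabulary.

```python
import json

# Hypothetical news story; headline, author, and image are placeholders.
news_article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Placeholder Headline for an Example Story",
    "datePublished": "2020-01-15T08:00:00+00:00",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "image": ["https://example.com/lead-image.jpg"],
}

print(json.dumps(news_article, indent=2))
```

Keeping the headline short and the publish date in ISO 8601 format are the two details most easily gotten wrong in practice.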

How-To Guides

You can use the how-to schema to mark up articles that contain instructional information showing readers how to do something new.

According to Google Developers, how-to markup applies to content where the main focus of the page is the how-to itself. In other words, it doesn’t count if your long-form article includes a short how-to section along with several other elements. The content has to read sequentially as a series of steps.

You cannot mark up offensive, violent, or explicit content. You must mark up each step in its entirety. You are not allowed to use this kind of markup for advertising purposes, and it does not apply to recipes, because they have their own schema. If applicable, include a list of materials and tools needed to complete the task, along with images.
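The rules above translate into a fairly simple structure. Here is a sketch with an invented task; the HowTo/HowToTool/HowToSupply/HowToStep types are the schema.org ones, while the task, tools, and step text are placeholders.

```python
import json

# Hypothetical how-to; note tools and supplies are listed separately
# from the ordered steps, per the schema.org HowTo structure.
how_to = {
    "@context": "https://schema.org",
    "@type": "HowTo",
    "name": "How to Change a Bicycle Tire",
    "tool": [{"@type": "HowToTool", "name": "Tire lever"}],
    "supply": [{"@type": "HowToSupply", "name": "Replacement inner tube"}],
    "step": [
        {"@type": "HowToStep", "text": "Remove the wheel from the frame."},
        {"@type": "HowToStep", "text": "Lever the tire off the rim."},
        {"@type": "HowToStep", "text": "Fit the new tube and remount the tire."},
    ],
}

print(json.dumps(how_to, indent=2))
```

Each step is marked up in its entirety as its own HowToStep, which is exactly what makes the content readable sequentially by an assistant.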

As voice search and smart devices become increasingly popular, they grow more valuable to the SEO landscape. Google Actions offer a new point of entry for brands looking to increase their visibility in the organic search results. This update makes Google Actions accessible to a greater range of marketers who may not otherwise be able to build an Action from scratch.


Annotations to Add Context to Google Analytics

Taking a periodic look at data inside Google Analytics is vital to understanding how well you are meeting your goals and objectives. However, as you look at the data, it can be challenging to remember exactly what was going on on certain days.

You may see a spike in traffic related to a specific campaign, or a decrease in traffic as a result of a local holiday or even a temporary server outage. Though it may be possible to open your calendar and match dates to activity, it’s not very likely you have the dates of every single campaign stored in a central location that is easy to access and review.

When you see something happening right now, you can quickly determine why it’s happening and what’s going on. As time passes, however, it’s easy to lose track of what was going on, and when it comes to analyzing your website, not readily having this information can present quite a problem. If we want to measure the impact of a circumstance on our site, we have to know what happened and when to make the proper analysis.

That’s where using annotations inside Google Analytics offers a wonderful benefit. Creating annotations will provide the context you need when it comes to data analysis. Over time, the annotations become more valuable because, as the data gets older, you are less likely to remember the circumstances of that particular campaign.

What are Annotations in Google Analytics?

An often-overlooked feature of Google Analytics is the ability to annotate your reports by date. You can click the arrow tab below the over-time graph on any report to display the annotations panel.

Annotations are small notes that let you record what was going on on a particular day in your Google Analytics dashboard.

You can create private annotations that are only visible to you when you log into your Google Analytics account. However, if you have Collaborate access to other Google Analytics accounts, you can create shared annotations that can be seen by anyone with access to the reporting view.

If you have annotations that you consider crucial or of higher importance, it is possible to star them, so they stand out a bit more. It is easy to keep up with who created what annotations as each annotation is associated with the email address that was used to create it. It is also possible to edit and delete annotations.

All you need to create annotations in a Google Analytics view is basic read and analyze access. Anyone who can access a view can annotate it.

The default visibility setting for annotations is shared. If you do not want anyone else to be able to see the annotation you are creating, select private.

Annotations are replicated among reports in the same view, which helps save time. For instance, if you create an annotation in the landing pages report, you’ll see the annotation icon appear in the all referrals report.

Annotations, however, are not replicated among views. If you and your team work with multiple views for the same Google Analytics property or website, make sure you’re clear about which view will house all of the shared annotations.

How to Add Annotations

  1. Look for the tab below the time graph on the report you wish to annotate.
  2. Click “+ Create new annotation.”
  3. Select the date for the annotation.
  4. Enter your note.
  5. Choose the visibility of the annotation. If you only have “Read and Analyze” access, you will only be able to create private annotations.
  6. Click “Save.”

Once the annotation has been saved, you will see a small icon on the timeline. This allows you to see that there is a note attached to that date.

How You Can Use Annotations

You can, and should, use annotations to keep track of anything that could influence website activity – either positively or negatively, including:

  • Marketing campaigns
  • Major website design and content changes
  • Industry developments such as Google algorithm updates
  • Website outages
  • Competitor activity
  • Weather
  • General news
  • Other time-specific factors that may affect website behavior

What Google Analytics Annotations Can’t Do (Yet!)

It’s worth noting that Google Analytics annotations could use a few improvements. We hope to see them come at some point in the future.

Annotations are not included when you export your reports. If you select the PDF export option, you can see the icon, but you do not see the details of the annotation.

It is only possible to create annotations for specific dates. There is not an option to include a time or create an annotation for a month, week, or custom date range.

There is no option to import annotations from a Google Calendar automatically; however, this would be an excellent option for those of us who are keeping an external timeline of all of our website and marketing activities.

Beyond creating a timeline directly within Google Analytics, you may want to record events in a separate spreadsheet or calendar so you can color-code, categorize, and add notes about status and follow-up.

For instance, you may note that two months after modifying your website’s navigation bar, you should check for changes to the conversion rate and page visits, along with other relevant metrics.

The advantage of keeping annotations within Google Analytics is that they provide context, with the annotation icons appearing directly in the reports. It may be easier for you to connect your data with the occurrences you have recorded.

To make your Google Analytics even more powerful, set up custom Intelligence Alerts that email you when a metric crosses a threshold for a specific period.

For instance, after you feature a product on your homepage, you can create an Intelligence Alert that generates an email if traffic to your product page increases by more than 10% compared to the previous week.

If you choose to use Intelligence Alerts to complement your annotations, keep in mind that an alert remains active until you delete it, so any factor may trigger it, not just the one you annotated. Alerts work independently of annotations, providing a quick and easy way to actively monitor key metrics on your website.


New Snippet Settings Make Controlling Listings Easier

Google has added new snippet settings that allow webmasters to control how Google search displays their listings. The settings work either through a set of robots meta tags or an HTML attribute.

New Meta Tags for Snippet Settings

“nosnippet” – This existing option has not changed. It allows webmasters to specify that they do not want any textual snippet shown for the page.

“max-snippet:[number]” – This new meta tag lets webmasters specify a maximum snippet length, in characters, for the page.

“max-video-preview:[number]” – This meta tag allows webmasters to specify a maximum duration, in seconds, for an animated video preview.

“max-image-preview:[setting]” – This new meta tag allows webmasters to specify a maximum size for image previews shown for images on the page. It can be set to none, standard, or large.

It is possible to combine the meta tags if you want to control both the max length of the text and the video.

Here is an example:

<meta name="robots" content="max-snippet:50, max-image-preview:large">

HTML Attributes

If you’d rather, you can use these as HTML attributes rather than meta tags. This allows you to prevent a specific part of an HTML page from being shown within the text snippet for the page.

Here’s an example:

<p><span data-nosnippet>Text</span> additional text….</p>

Other Search Engines

Bing and other search engines don’t currently support these new snippet settings. Because they are so new, even to Google, it could be a while before other search engines support them, if they ever do.

Google is Using These as Directives

Google says these are directives the search engine bots will follow, rather than hints that it will consider, but possibly ignore.

Can You Preview the New Snippets?

Unfortunately not. Right now, there isn’t a way to preview how the snippet settings will work in live search. The only thing to do is to implement them and wait for Google to show them. Use the URL inspection tool to speed up crawling, and once Google has crawled your site, you should be able to see the new snippet in the search results.

This will be live in mid-to-late October and will take time to fully roll out. It could take over a week before everyone sees it.

Getting Ready for the Update

Google has provided about a month’s heads-up to allow webmasters to implement the changes on their sites now and then see how they impact their listings in Google results when the update goes live.

Keep in mind that if you restrict Google from showing certain information, it may impact whether you show up in Featured Snippet results. It may also change how your search results look. Featured snippets require a certain minimum number of characters to be displayed, and if you opt to show fewer than that minimum, your pages may not qualify for the featured snippet position.

Content in structured data is eligible to be displayed as rich results in search. These kinds of results don’t conform to the limits declared in the meta robots settings, but they can be addressed more specifically by limiting or modifying the content in the structured data itself. For instance, if a recipe is included in structured data, the contents of that structured data may be presented in a recipe carousel in the search results. If an event is marked up with structured data, it may be presented as such within the search results. To limit that presentation, a publisher can limit the amount and type of content in the structured data.

Some of the special features in search depend on the availability of preview content, so limiting your previews may prevent your content from appearing in these areas. Featured snippets require a certain number of characters to be displayed.

This number can vary by language, which is why Google cannot provide an exact max-snippet length that guarantees appearing in this feature. If you do not want content to appear as a featured snippet, you can experiment with lower max-snippet values. Those who want a guaranteed way to opt out of featured snippets should use “nosnippet.”

It’s worth noting that you can also use these settings to control the size at which your images are shown in your AMP results. Publishers who do not want Google to use larger thumbnail images for their AMP pages can use the settings to specify a max-image-preview of standard or none.
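For instance, an AMP publisher who wants to keep thumbnails at the default size could add a tag like this (a sketch using the setting names from Google’s announcement):

```html
<!-- Limit image previews for this page to the standard thumbnail size -->
<meta name="robots" content="max-image-preview:standard">
```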

While these changes will not impact your overall Google web search rankings, they may affect whether your listing shows up with certain rich results or as a featured snippet. Google will still crawl, index, and rank your page the same way it did before.

One of the biggest requests from webmasters, site owners, and search engine optimizers has been more control over what Google shows for their listings in the search results. These new settings provide more flexibility over what does and does not appear in your search result snippet on Google.


Google Quality Raters Guidelines Updated

Google recently published new Quality Raters Guidelines. Within them, you’ll find significant changes to the “Your Money or Your Life” (YMYL) section along with new areas of focus. In this blog post, we’ll track the changes and how they may influence search engine optimization trends.

It’s important to note that many people use these guidelines for tips on how Google’s search algorithm works, but this is not the right approach.

The guidelines exist so that quality raters focus on certain signals and page properties when judging the quality of pages. Raters are not instructed to look for ranking signals. Google uses quality raters to see whether the content the algorithm is ranking meets quality standards, and that’s it.

The guidelines are written to assist third-party quality raters in rating web pages. They do not contain hints as to what the ranking signals within the algorithm are. They do, however, provide hints as to what kinds of quality issues the algorithm may be focusing on.

So far, the quality guidelines have been incredibly accurate for predicting trends in the algorithm. For example, the increased instruction on how to rate medical and financial sites coincided with algorithms designed to improve the relevance of those kinds of websites.

The last few Google core algorithm updates strongly affected news websites. A new news section has been added to the quality rater guidelines, which shows how the guidelines can reflect where past or future algorithms are focused. Even though there may not be hints about ranking signals in the guidelines, it may be possible to deduce algorithm trends.

What’s Changed

Though a lot of the guidelines have changed overall, an important section to pay attention to is the guidelines that appear in section 2.3.

Section 2.3 handles Your Money or Your Life (YMYL) topics. This change affects news and government-related topics. Before this update, the news topic was grouped in with public and official information pages.

Now the news topic is its own section providing guidance about how to judge and rate news pages. This is likely a response to the negative attention Google has received from politicians and pundits over how it handles news content. It may not be coincidental that the news and government/civics sections have been given greater emphasis within the new guidelines.

Topics are Now Emphasized Over Pages

Though it may seem minor, Google has shifted emphasis from the page itself to the topic of a page. The word “pages” has been removed in many places throughout the new guidelines, while the word “topic” has been added in many places.

Removing emphasis from the word “pages” refocuses the sentences on the newly added instances of the word “topic.” Take a look at this change in the YMYL section.

Old Version

“Some types of pages could potentially impact the future happiness, health, financial stability, or safety of users. We call such pages “Your Money or Your Life” pages, or YMYL. The following are examples of YMYL pages:”

New Version, with additions added:

“Some types of pages or topics could potentially impact a person’s future happiness, health, financial stability, or safety. We call such pages “Your Money or Your Life” pages, or YMYL. The following are examples of YMYL topics:”

This change may appear minor but it has the effect of emphasizing the topicality of a page as something to focus on.

“Some types of pages or topics…”

YMYL Rewritten

Section 2.3, which deals with YMYL, has been rewritten almost in its entirety. In previous versions, financial, medical, and shopping topics were the top three in that section. Now the top topics are News and Current Events and Civics, Government and Law.

Those topics are followed by Finance, Shopping, Health and Safety, the new Groups of People, and Other. The Other category has also been revised in this series of updates.

It’s worth noting that the Medical section has been demoted from third to fifth place and renamed Health and Safety.

New YMYL Content

The changes to the YMYL content are as follows:

  • News and current events: news about important topics such as international events, business, politics, science, technology, etc. Keep in mind that not all news articles are necessarily considered YMYL (e.g., sports, entertainment, and everyday lifestyle topics are generally not YMYL). Please use your judgment and knowledge of your locale.
  • Civics, government, and law: information important to maintaining an informed citizenry, such as information about voting, government agencies, public institutions, social services, and legal issues (e.g., divorce, child custody, adoption, creating a will, etc.).”

These are the newly revised sections, including the new Health and Safety section:

  • “Finance: financial advice or information regarding investments, taxes, retirement planning, loans, banking, or insurance, particularly webpages that allow people to make purchases or transfer money online.
  • Shopping: information about or services related to research or purchase of goods/services, particularly webpages that allow people to make purchases online.
  • Health and safety: advice or information about medical issues, drugs, hospitals, emergency preparedness, how dangerous an activity is, etc.
  • Groups of people: information about or claims related to groups of people, including but not limited to those grouped on the basis of race or ethnic origin, religion, disability, age, nationality, veteran status, sexual orientation, gender or gender identity.
  • Other: there are many other topics related to big decisions or important aspects of people’s lives which thus may be considered YMYL, such as fitness and nutrition, housing information, choosing a college, finding a job, etc. Please use your judgment.”

“Other” Topic Expanded

As shown above, the Other section has been expanded. By specifically calling out fitness and nutrition, college search, and job search, which are big topics in the affiliate marketing space, Google signals that these areas may receive additional scrutiny and emphasis in a future round of core algorithm updates.

It’s also a reasonable assumption that quality raters just hadn’t been focusing enough on these topics. The fact that Google includes them by name in the new guidelines suggests that these are areas Google is paying special attention to.

Housing information could relate to anything from real estate and home loans to home improvement. These niches also involve big money in the affiliate space. Because the topic of housing information is itself vague, it will be interesting to see whether housing-related niches are impacted in future broad core updates.

Section on Identifying Content Updated

Google added information to this section to include new guidance on news and shopping pages.

“News website homepage: the purpose is to inform users about recent or important events. (MC – News Homepage)

News article page: the purpose is to communicate information about an event or news topic. (MC – News Article)

Store product page: the purpose is to sell or give information about the product.

  • Content behind the Reviews, Shipping, and Safety Information tabs are considered to be part of the MC. (MC – Shopping Page)”

Author Information Section Updated

In section 2.5.2, the guidelines cover finding out who is responsible for a website and its content. The section remains largely intact but has one addition:

“Websites want users to be able to distinguish between content created by themselves versus content that was added by other users.”

This change relates to news magazines as well as any sites that accept guest articles or allow user-generated question-and-answer content.

This could mean that Google is reviewing guest posts and authors, and it may indicate that Google is focusing on identifying low-quality sites and excluding them. Sites that accept guest content are often full of spam links, so this may be one reason Google is paying closer attention to them.

Very High Quality Content Section Expanded

In section 5.1, where very high-quality main content is addressed, the section has been expanded to address the uniqueness and originality of the content. This section has a new focus on news sites, but it isn’t limited to sites in that niche. The standards of high quality apply to all sites, but in particular to YMYL sites.

The new section goes beyond the quality of text content, encouraging quality raters to judge the quality and originality of artistic content, including photography, images, and videos.

“A factor that often distinguishes very high quality MC is the creation of unique and original content for the specific website.

While what constitutes original content may be very different depending on the type of website, here are some examples:

  • For news: very high quality MC is original reporting that provides information that would not otherwise have been known had the article not revealed it. Original, in-depth, and investigative reporting requires a high degree of skill, time, and effort. Often very high quality news content will include a description of primary sources and other original reporting referenced during the content creation process. Very high quality news content must be accurate and should meet professional journalistic standards.
  • For artistic content (videos, images, photography, writing, etc.): very high quality MC is unique and original content created by highly skilled and talented artists or content creators. Such artistic content requires a high degree of skill/talent, time, and effort. If the artistic content is related to a YMYL topic (e.g., artistic content with the purpose of informing or swaying opinion about YMYL topics), YMYL standards should apply.
  • For informational content: very high quality MC is original, accurate, comprehensive, clearly communicated, professionally presented, and should reflect expert consensus as appropriate. Expectations for different types of information may vary. For example, scientific papers have a different set of standards than information about a hobby such as stamp collecting. However, all types of very high quality informational content share common attributes of accuracy, comprehensiveness, and clear communication, in addition to meeting standards appropriate to the topic or field.”

Very Positive Reputation Section Expanded

Section 5.2.3 referencing very positive reputation also got an update. The new content reads:

“For YMYL topics especially, careful checks for reputation are required. YMYL reputation should be based on evidence from experts, professional societies, awards, etc. For shopping pages, experts could include people who have used the store’s website to make purchases; whereas for medical advice pages, experts should be people or organizations with appropriate medical expertise or accreditation. Please review section 2.3 for a summary of types of YMYL pages/topics.”

We know that website reputation has been a part of the Quality Raters Guidelines for a long time. This doesn’t mean you should go out and join as many associations as possible and seek testimonials from your customers. But if you are creating YMYL content, it should be expert-driven and focused on quality.

Big Changes: Are More on the Horizon?

With this Quality Raters Guidelines update, we see a number of significant changes. It will be interesting to see how sites are affected by future core updates. If you want to read more of the guidelines and see the changes for yourself, you can download the Quality Raters Guidelines here.


How to Survive Algorithm Changes – Straight from Google

In case you aren’t aware, Google makes changes to its algorithm on a regular basis. In just one year, Google made 3,200 changes to its search algorithm. People often sweat the major announcements regarding algorithm changes, such as the August 2019 announcement about a Google core update. However, there’s really no need to stress out. With so many changes being made, keeping some common-sense advice in mind will help you organically achieve better rankings over time. Follow these tips for surviving algorithm changes, straight from Google’s own recommendations.

Quality of Content

As the saying goes, “Content is king.” This remains true. One rule that has always applied when it comes to SEO is that you should write for people, not search engines. There is some advice beyond this fundamental rule. Let’s take a look at some of the company’s suggestions.

First, they emphasize the need to provide original information, reporting, and research. Google looks for new and original content, not just a rewrite of what’s already out there. That means the search engine is looking for you to go further. In fact, Google’s official recommendation is that webmasters provide “a substantial, complete or comprehensive description of the topic.” When you take the time to give users a thorough answer to their question, they’re less likely to bounce to other pages. This will reflect positively on your rankings.

Along these lines, it’s also important that you go above and beyond the competition. Google encourages site owners to provide information that goes beyond the obvious. They want detailed research, along with an analysis of that information. Share data and insight from your own experience. This will set you apart. Be sure not to simply cut and paste data you come across, though. Take time to delve deeper into the content and share your own original thoughts on the information.

Not only is your content important in Google’s algorithm; headlines matter, too. Headlines should be accurate and descriptive; if they aren’t, readers won’t stay on the page. For the same reason, avoid the temptation to write sensational or outlandish headlines. A high bounce rate will hurt your rank.

You want to strive for content that people love. Google recommends making it so that folks will want to bookmark, share and recommend. In fact, they go a step beyond that and suggest that you write material that could be seen in a print publication such as a book or magazine. It can be tempting to dash off less than stellar content simply because online information is so readily available and easy to access. Shoot for professional, quality content and you’ll never go wrong.

Display of Expertise

Clearly, we can’t all be experts in everything. However, you should at the very least be knowledgeable about the subject you’re presenting. While it’s true that you’re writing for humans, search engines like Google also look for content that reflects a certain level of expertise. Your audience should at least have a modicum of trust in what you’re offering on the page. This will be reflected in the amount of time they remain there.

Google’s new recommendations involve providing your audience with evidence as to why they should trust you. Be sure you create an About Page that tells of your credentials and link to it throughout your site. Provide information about your background, linking out to works you’ve provided elsewhere. When you’re citing information that isn’t yours, be sure to list or link your source. You want to present yourself as an authority on the subject. You can do that with evidence of your knowledge and credentials, along with the fact that you are aware of other current and relevant sources.

Another issue that falls under the realm of expertise is the presentation of factual information. Be sure to fact check what you’re putting out there. You don’t want to unknowingly spread disinformation from questionable sources. It’s even worse if you are intentionally spreading “fake news.” Google will penalize you for this.

Comparative Information

Some of your ranking will be determined by how your site compares to others. Google’s newest recommendations advise webmasters to be sure they are providing substantial value in comparison to similar pages within the search results. So, it’s a good idea to research the top several results on the first page for your search term before writing your content. This will help ensure your content stands out and offers a new angle.

Google also wants you to write in ways that are useful for the folks who are visiting your site, rather than trying to guess what will rank better. Remember the very first rule of content creation is to write for real people, not for the search engines. Pay attention to what visitors respond well to on your site already. Give them more of that or expand upon what you’ve already offered. Over time, as you continue providing content of value, Google will reward your efforts with higher rankings.


Finally, the way your site looks and how it functions also matters to Google. Check for spelling and grammatical errors. In addition, your content should be easily readable. That means clear fonts and colors that make things easy to see are essential. Your graphics and videos should also look professional. Avoid just throwing things together or grabbing a free stock image. Avoid sharing too many ads, as this will make your page look cluttered and distract from the content. You also want to be certain your page looks good on mobile devices, as this is the preferred viewing method for much of the population. Consider the quality of your presentation in order to give yourself an advantage with regard to rank.

Keep these tips in mind as you move forward. These Google recommendations will help you to survive algorithm changes and maintain high rankings in searches.


No More Support for Robots.txt Noindex

Google has officially announced that Googlebot will no longer obey robots.txt directives related to indexing. If you are a publisher relying on the robots.txt noindex directive, you have until September 1, 2019 to remove it and start using an alternative.

Why the Change?

Google will no longer support the directive because it’s not an official one. In the past, they have supported it, but this will no longer be the case. This is a good time to take a look at your robots.txt file to see where you’re using the directive and what you can do to prepare before support officially ends.

Google Mostly Obeyed the Directive in the Past

Google has somewhat supported this directive as far back as 2008. Both Matt Cutts and John Mueller have discussed this. In 2015, Perficient Digital ran a test to see how well Google obeyed the command. They concluded:

“Ultimately, the NoIndex directive in Robots.txt is pretty effective. It worked in 11 out of 12 cases we tested. It might work for your site, and because of how it’s implemented it gives you a path to prevent crawling of a page AND also have it removed from the index. That’s pretty useful in concept. However, our tests didn’t show 100 percent success, so it does not always work.

Further, bear in mind, even if you block a page from crawling AND use Robots.txt to NoIndex it, that page can still accumulate PageRank (or link juice if you prefer that term).

In addition, don’t forget what John Mueller said, which was that you should not depend on this approach. Google may remove this functionality at some point in the future, and the official status for the feature is ‘unsupported.’”

With the announcement from Google that noindex robots.txt is no longer supported, you cannot expect it to work.

The official tweet reads: “Today we’re saying goodbye to undocumented and unsupported rules in robots.txt. If you were relying on these rules, learn about your options in our blog post.”

In that blog post, they went on to say: “In the interest of maintaining a healthy ecosystem and preparing for potential future open source releases, we’re retiring all code that handles unsupported and unpublished rules (such as noindex) on September 1, 2019.”

What to Use Instead

Instead of using noindex in the robots.txt file, you can use noindex in robots meta tags. This is supported in both the HTTP response headers and in HTML, making it the most effective way to remove URLs from the index when crawling is allowed.

Other options include:

  • Using 404 and 410 HTTP status codes: Both of these status codes mean the page does not exist which will drop the URLs from the Google index once they are crawled and processed.
  • Disallow in robots.txt: Search engines can only index pages they know about, so blocking a page from being crawled usually means it won’t be indexed. A search engine may still index URLs based on links from other pages without seeing the content itself, so Google says it aims to make those pages less visible in the future.
  • Password protection: Unless you use markup to indicate paywalled or subscription-based content, hiding a page behind a login generally removes it from Google’s index.
  • Search Console Remove URL tool: Use this tool to quickly and easily remove the URL from the Google search results temporarily.
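For the first alternative, a page-level noindex placed in the page’s head might look like this:

```html
<!-- Keeps the page crawlable but asks search engines to drop it from the index -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, the same rule can be delivered as an HTTP response header instead: `X-Robots-Tag: noindex`.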

Other Changes to Consider

All of this comes on the heels of an announcement that Google is working on making the robots exclusion protocol a standard and this is likely the first change that’s coming. Google released its robots.txt parser as an open source project alongside this announcement.

Google has been looking to change this for years, and by standardizing the protocol, it can now move forward. In analyzing the usage of robots.txt rules, Google focused on how unsupported implementations such as nofollow, noindex, and crawl-delay affect things. Those rules were never documented by Google, so their usage in relation to Googlebot is low. These kinds of rules hurt a website’s presence in Google search results in ways webmasters likely don’t intend.

Take time to make sure you are not using the noindex directive in your robots.txt file. If you are, make sure to choose one of the suggested methods before September 1. It’s also a good idea to check whether you’re using the nofollow or crawl-delay commands. If you are, switch to the supported methods for these directives going forward.
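A quick audit can be scripted. The helper below is a hypothetical example (not an official Google tool) that scans robots.txt text for the unsupported directives mentioned above and reports where they appear:

```python
# Hypothetical helper: scan robots.txt text for directives that
# Googlebot no longer supports (noindex, nofollow, crawl-delay).
UNSUPPORTED = {"noindex", "nofollow", "crawl-delay"}

def find_unsupported_directives(robots_txt: str):
    """Return (line_number, directive) pairs for unsupported rules."""
    findings = []
    for lineno, line in enumerate(robots_txt.splitlines(), start=1):
        rule = line.split("#", 1)[0].strip()  # ignore comments
        if ":" not in rule:
            continue
        directive = rule.split(":", 1)[0].strip().lower()
        if directive in UNSUPPORTED:
            findings.append((lineno, directive))
    return findings

sample = """User-agent: *
Disallow: /private/
Noindex: /old-page/
Crawl-delay: 2
"""
print(find_unsupported_directives(sample))  # → [(3, 'noindex'), (4, 'crawl-delay')]
```

Running this against your own file before the deadline makes it easy to spot which rules need to be migrated.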


In the case of nofollow, instead of using the robots.txt file, you should use nofollow in the robots meta tags. If you need more granular control, you can use the rel="nofollow" attribute on individual links.
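As a sketch, the page-level and link-level options look like this (the URL is a placeholder):

```html
<!-- Page-level: ask crawlers not to follow any links on this page -->
<meta name="robots" content="nofollow">

<!-- Link-level: mark one specific link as nofollow -->
<a href="https://example.com/" rel="nofollow">Example link</a>
```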

Crawl Delay

Some webmasters opt to use the crawl-delay setting when they have a lot of pages, many of which are interlinked. A bot that starts crawling the site may generate too many requests in a short period of time, and that traffic peak could deplete hosting resources that are monitored hourly. To avoid problems like this, webmasters set a crawl delay of 1 to 2 seconds so bots crawl the website more moderately without causing load peaks.
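A robots.txt sketch of that setup might look like this (note that Googlebot ignores Crawl-delay, though some other well-behaved crawlers honor it):

```
# Hypothetical robots.txt: throttle bots to one request every 2 seconds
User-agent: *
Crawl-delay: 2
```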

However, Googlebot doesn’t take the crawl-delay setting into consideration, so you shouldn’t worry about the directive influencing your Google rankings. You can safely use it if there are other aggressive bots you are trying to slow down. It’s not likely you’ll experience issues as a result of Googlebot crawling, but if you want to reduce its crawl rate, the only way to do so is from the Google Search Console.

Take a few minutes to look over everything in your robots.txt file and make the necessary adjustments ahead of the deadline. Though it may take a little time to sort through everything and execute the changes, it will be well worth it come September.
