In part 1 of our series, we covered the basics of search engines and the importance of website visibility. Once your site is accessible and being crawled correctly, search engines start to analyze it. As crawlers work through your pages, they are effectively asking: is this page even worth indexing? If so, how should it rank? And even if your site is already indexed, is your content being read favorably enough to rank well?
All of these questions help define the top goals for SEO. When it comes to technical SEO, those goals come down to quality signals. But what is a quality signal, and how does it affect your rankings? In this second part of the series, we will explain quality signals for search engines, cover best practices that increase positive signals, and highlight common mistakes that create negative ones.
So what is a Quality Signal?
This might sound a bit clichéd, but think back to the times when there was a new kid in grade school. Now imagine your new website is that new kid. At first, nobody knows you exist. That is, until you start to say or do things that get you noticed. Like crawlers, students weigh different factors when sizing someone up. Once the other students (crawlers) notice you, the analysis begins: What clothes do they wear? Do they have a good personality? How dated is their haircut? Are they trustworthy? Do they have good connections? Are they helpful to others? Are they popular? Do they have a reputation? Just like that new kid, your website gives off quality signals that tell search engines about your visibility, relevancy, trustworthiness, and popularity.
Quality signals are the criteria search engine crawlers use to determine whether a site provides content that is relevant, useful, and easy to read, all of which feed into ranking algorithms. Those algorithms weigh a great many ranking factors, including plenty that nobody outside the search engines knows about. Ultimately, it comes down to this: the more quality signals you satisfy, the higher you can rank, while negative quality signals can devalue your site and wreck your ranking status.
How do I send good signals?
When sending signals to search engines, you want to make sure that your site is:
- Visible – as covered in part 1 of this series, search engines should be able to discover, store, analyze, and retrieve your site’s content.
- Relevant – your content and rich descriptions should relate to your industry and match what users actually search for.
- Trustworthy – built on your reputation and authority, your site should deliver unique, factual content that leads users to recognize it as an industry leader.
- Popular – earned when your site naturally engages users so that they willingly link to and share your content.
One of the easiest ways to please crawlers is by providing high-quality content on the site that is reflected in your rich descriptions (page titles, meta-descriptions, headers, and body content) and can be easily understood by search engines as well as users.
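To make this concrete, here is a minimal sketch, using Python's standard-library `html.parser`, of the rich-description fields a crawler reads from a page. The class name and the sample page markup are our own illustration, not any search engine's actual code:

```python
from html.parser import HTMLParser

class RichDescriptionParser(HTMLParser):
    """Collects the 'rich description' fields a crawler reads first:
    the page title, the meta description, and any H1 headings."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1s = []
        self._in_title = False
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self._in_h1 = True
            self.h1s.append("")
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._in_h1:
            self.h1s[-1] += data

# A hypothetical, well-formed page for illustration.
page = """
<html><head>
  <title>Handmade Oak Desks | Example Furniture Co.</title>
  <meta name="description" content="Solid oak desks built to order.">
</head><body>
  <h1>Handmade Oak Desks</h1>
</body></html>
"""

parser = RichDescriptionParser()
parser.feed(page)
print(parser.title)             # Handmade Oak Desks | Example Furniture Co.
print(parser.meta_description)  # Solid oak desks built to order.
print(parser.h1s)               # ['Handmade Oak Desks']
```

If those three fields come back empty, duplicated, or off-topic for the page, you are already sending weaker signals than you need to.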
Common Mistakes That Create Negative Signals
With algorithms constantly changing to provide the best possible search experience, manipulative SEO tactics are increasingly penalized and outdated. For example, here are some of Google’s most recognized algorithm changes to date:
- Panda – designed to rank high-quality sites higher and demote low-quality ones, judging quality by the kinds of questions users might ask themselves when visiting a page.
- Penguin – targeted the manipulative tactic of building unnatural backlinks, reducing the trustworthiness and rankings of sites that used them.
- Hummingbird – a complete overhaul of Google’s algorithm, aimed at better understanding the intent behind a user’s search query and providing improved local search results.
Search engines will keep making algorithm updates to ensure users receive the best experience. The more changes that occur, the more opportunities webmasters have to send negative quality signals by mistake. Here are some common mistakes that do exactly that:
- Not implementing title tags.
- Using the same title tag across multiple pages.
- Having more than one meta description on a page.
- Not defining a canonical homepage.
- Using more than one H1 tag on a page.
- Leaving a page without an H1 tag.
- Putting an H1 tag around a logo or image.
- Using heading tags for styling in templates.
- Keyword stuffing.
- Having a “noindex” meta-robots tag on a page that should be indexed.
- Internal links using “rel=nofollow”.
- Hiding text for style purposes (ex: “display:none;”) or using negative text indent.
- Internal URLs that unnecessarily redirect.
- Improperly excluding or including pages in robots.txt.
- Duplicate content caused by missing redirects (the same content living at multiple URLs).
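Several of the on-page mistakes above can be caught automatically before a crawler ever sees them. Below is a rough sketch of a simple audit function; the `audit_page` name, the regex patterns, and the sample page are our own illustration, and a production tool would use a real HTML parser rather than regular expressions:

```python
import re

def audit_page(html: str) -> list[str]:
    """Flag a few of the negative signals listed above.
    A regex sketch for illustration only, not a full SEO audit."""
    problems = []

    # Missing or empty title tag.
    titles = re.findall(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    if not titles or not titles[0].strip():
        problems.append("missing title tag")

    # More than one meta description on the page.
    metas = re.findall(r'<meta[^>]+name=["\']description["\']', html, re.I)
    if len(metas) > 1:
        problems.append("multiple meta descriptions")

    # H1 count: exactly one is the goal.
    h1s = re.findall(r"<h1[\s>]", html, re.I)
    if len(h1s) == 0:
        problems.append("no H1 tag")
    elif len(h1s) > 1:
        problems.append("multiple H1 tags")

    # A noindex meta-robots tag on a page that should be indexed.
    if re.search(r'<meta[^>]+content=["\'][^"\']*noindex', html, re.I):
        problems.append("page is noindexed")

    return problems

# A hypothetical page with two of the mistakes from the list above.
bad_page = '<html><head></head><body><h1>A</h1><h1>B</h1></body></html>'
print(audit_page(bad_page))  # ['missing title tag', 'multiple H1 tags']
```

Checks like duplicate titles across pages, unnecessary redirects, or robots.txt misconfiguration need a site-wide crawl rather than a single-page scan, which is where a dedicated audit tool earns its keep.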
Check Your Quality Signals
Improving your quality signals can raise your search rankings and increase the number of users visiting your site. Most of the time, webmasters are unaware of the damage they are doing to a site. Fortunately, at ClickGiant our team of designers, developers, and content strategists works together to give clients a click-worthy user experience that is also search engine friendly. We check your site for any past, present, or potential issues that could send negative signals, and we eliminate them.