As an overview: in part 1 we talked about search engine basics and the value of making your site visible; in part 2 we covered the importance of quality signals and the common mistakes webmasters make that create negative quality signals. Now that we have covered the overall basics of search engines and their behavior, we will jump into the nitty-gritty of SEO and its fundamentals.

In part 2 we mentioned the importance of high-quality content for making sure your content is visible, relevant, trustworthy, and popular, but what do you need to do to follow these best practices? In part 3 of this series, we are going to focus on the content itself, share best practices, and cover things to avoid when dealing with content on your site.

What is Content Exactly?

We all talk about content like it’s a cookie-cutter answer, but when it comes to a website, what exactly constitutes content on a page? Web content is everything the user encounters on a website that contributes to the user’s experience. It can come in a variety of forms, including text, images, sounds, videos, and animations.

All of these types of web content play a role in the user experience. The question is, are you serving users high-quality content? Your website should be informing, educating, entertaining, and/or connecting users with content that is relevant to your industry. Whether the content takes the form of topics, ideas, facts, or statements, the ultimate goal is to provide content that makes users want to come back for more.

Although content can be used in multiple ways, it is important to consider the limitations of what search engines can crawl and understand.

What Content Can Search Engines See?

Search engines are improving day by day. Although their ultimate goal is to provide the best possible experience for the user, crawlers still have trouble understanding certain kinds of content. In fact, if non-text content does not include a textual alternative, search engines may be unable to understand it at all. Here are some best practices for making sure search engines can understand your content:

  • Images & animations (GIFs) – give each image alt text and a file name containing relevant, descriptive, keyword-rich text that describes the image in a natural manner. Purely decorative images are the exception and can use empty alt text.
  • Videos – adding a video transcription to the HTML creates additional textual content and gives users as well as search engines a better understanding of the video’s value. Link to individual video pages to allow for simple indexing. Mark up videos with rich metadata, such as Open Graph (Facebook) and Twitter Card tags, so search engines can index and present your videos effectively.
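As a rough sketch, the image and video guidance above might look like the following markup. The file names, URLs, and transcript text are illustrative placeholders, not recommendations for specific values:

```html
<!-- Meaningful image: descriptive file name plus natural alt text -->
<img src="/images/blue-widget-assembly.jpg"
     alt="Technician assembling a blue widget on the factory line">

<!-- Purely decorative image: empty alt text so crawlers skip it -->
<img src="/images/divider-flourish.png" alt="">

<!-- Open Graph video tags in the page head (illustrative URL) -->
<meta property="og:type" content="video.other">
<meta property="og:video" content="https://www.example.com/videos/widget-demo.mp4">

<!-- Transcript as crawlable on-page text -->
<section class="video-transcript">
  <h2>Video transcript</h2>
  <p>In this demo we walk through assembling the widget step by step…</p>
</section>
```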

So what does this mean for my content?

As long as the content your site values is in a textual format, set in a font and point size users can easily read, crawlers will not have any issues with it. Just make sure the copy on each page is clear, concise, and organized around the page’s unique topic. Remember to write your content for users rather than search engines. Naturally incorporate the relevant keywords that new users and repeat visitors alike might type when searching for your content. Consider the following tips when optimizing the body content of your pages:

  • Use a minimum of 400 words of body content per page.
  • Use keywords near the beginning of the content.
  • Avoid repeating keyword phrases excessively.

It is important to create content that is both unique and relevant. Updating and refreshing your content regularly is the best way to keep users engaged. Above all, do NOT duplicate content!

Why Can’t I Duplicate Content?

When content is repurposed from your own site or from other sites, it can create duplicate-content issues that send search engines negative quality signals (diminishing site authority, relevance, and trust) and can ultimately hurt your ability to rank. Any duplication of content, whether on-site or off-site, triggers these signals. Each piece of content should have one primary location, and any duplicate that appears at other URLs should not be indexed. Google can filter pages with duplicate content out of its index entirely, preventing them from ranking at all.
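When a piece of content must legitimately live at more than one URL, one common way to signal the primary location is a canonical link in the head of the secondary page, or a robots meta tag to keep it out of the index. The URL below is an illustrative placeholder:

```html
<!-- On the duplicate/secondary URL, point to the primary version -->
<link rel="canonical" href="https://www.example.com/original-article/">

<!-- Or keep the duplicate out of the index entirely -->
<meta name="robots" content="noindex">
```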

With algorithm updates such as Panda, providing users with the best experience through quality content is crucial. Quality content is unique and valuable, created rather than repurposed. Trust me, if creating unique content were easy, it wouldn’t be so valuable.

What About Cloaking?

It is important to shed some light on a once-common practice that some web developers still attempt today. When a developer shows search engines content that users do not see on the page, the content is cloaked. Search engines frown upon the practice and impose heavy penalties, up to blacklisting your site, if you are caught. Cloaking is a manipulative tactic that deliberately attempts to bait and switch search engines, usually by serving different content based on the HTTP User-Agent header or the visitor’s IP address.

But what if you want to tailor a page to each individual user, so that different users are served different content? For example, serving different content based on the user’s geographic location (Geo-IP). This is considered personalization. As long as the crawler receives the same content a user in its situation would, and the content is not chosen based on whether the visitor is a bot or a human, cloaking is not present.
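As a minimal sketch of the distinction, the decision below depends only on the visitor’s location, never on whether the visitor is a crawler. The country codes, messages, and function name are made up for illustration and do not come from any specific framework:

```python
# Minimal sketch of Geo-IP personalization done without cloaking.
# All names, country codes, and messages here are illustrative.

GREETINGS = {
    "US": "Free shipping across the United States!",
    "DE": "Kostenloser Versand innerhalb Deutschlands!",
}
DEFAULT_GREETING = "We ship worldwide."

def banner_for(country_code: str) -> str:
    """Pick a regional banner from the visitor's Geo-IP country.

    The decision depends only on location -- never on whether the
    visitor is a crawler -- so a bot and a human from the same
    country always see identical content. That is personalization.
    Branching on the User-Agent instead would be cloaking.
    """
    return GREETINGS.get(country_code, DEFAULT_GREETING)

# A crawler and a person from the same location get the same banner:
print(banner_for("DE"))  # → Kostenloser Versand innerhalb Deutschlands!
```

The design point is simply that the branch condition never mentions the visitor’s identity; any logic that inspects the User-Agent to decide what to serve crosses the line into cloaking.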

Make Sure Your Content is King

When creating content, it is important to think about the user’s experience and to answer any questions a user might have. The more unique your content, the more valuable your site will be to users and search engines alike. At ClickGiant, we pride ourselves on creating unique content for our clients and serving their target audiences with the best possible user experience. So just remember, when it comes to your site and its value, content is king!