Thursday, 31 March 2022

Video SEO Is Important for Your Content Marketing

Businesses are always looking for an edge on their competitors, a way to stand out from the same boring thing that everyone else is doing.

When it comes to content marketing, if you want to engage more users and give your pages a better shot at ranking highly on Google, you should get into video content.

Videos can explain the same concepts that regular written content does, but with the added benefits of visuals, graphics, effects, and, of course, someone talking.

Videos are a great example of the kind of multimedia content marketing that tends to land with users. Many people just prefer to consume content that way instead of reading words on a page.

Here’s the catch, though: as with any other SEO content designed to rank, you won’t just shoot to the top of everyone’s SERPs on Google or YouTube because you made a video.

Other businesses make videos, too. If you want to outrank them, you need to practice video SEO.

Let’s learn more about SEO for video, why your videos won’t rank without it, and what you need to do to optimize your video marketing efforts.


Google has been ranking video content more and more on the SERPs over the last few years, and not just because the largest search engine in the world owns the second-largest search engine in the world (YouTube).

We said in our multimedia blog post referenced above that the human brain is often claimed to process visual information 60,000 times faster than text. Whatever the exact figure, it rings true anecdotally: ask your friends and family if they’d prefer to read for 15 minutes or watch a 15-minute video.

I think the results will be skewed toward videos.

So, with so many people essentially telling search engines that they really like video content, those engines are showing video results more often.

Which videos, though? First of all, if you search Google often, you’ll notice it shows videos for the kinds of queries that seem to invite video results.

You’ll get more video results for “how-to” searches, for example. Even though “how to change a doorknob” can be explained in a blog post, a video lets users see the work being done in real time.

Even with that kind of filter, though, there’s always going to be a lineup of business competitors all trying to rank their videos first.

The ones at the top will attract more organic traffic, earn more views and engagement, and have better chances of securing customers’ business.

Including videos alongside written content on your website can also increase that engagement and keep users on your site longer. It’s a great signal for Google that you’re offering something worthwhile.

That’s why video SEO is so important.

Now, what factors matter when it comes to video SEO?

Direct Videos Toward the Right Audiences

Before you can use SEO to market your videos effectively online, you’ll need to think about who your audience is.

For instance, if you are making a video on changing a doorknob, you’re probably talking to beginners in home improvement. In that case, your content should use beginner-friendly language, explain every concept, and take its time.

If you’re making a video on one of the most advanced forms of heart disease for an academic-journal audience, you won’t need to explain any basics. You’ll actually be required to speak to the more advanced ideas around that subject.

The speaking that goes on in your video is going to be crucial, as well. That takes us to the next point.

Create Keyword-Rich Video Transcripts

Before you create your video, you’ll want to complete appropriate keyword research for it just like for any other SEO-based content.

You should then use those keywords in your video, and there’s a reason for that.

Using keywords in the video means they’ll appear in the transcript, which you should publish right alongside the video, either in its description or on the content page where the video will live.

Google crawls all that text when it indexes your videos, and keywords help it to understand what your video is about.

The added bonus of including a keyword-rich transcript with your video is that users who need to or prefer to read your content instead can do just that.
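As a rough sketch, here’s one way a transcript might sit on the same page as the video so crawlers (and readers) can get to it. The markup, video URL, and wording here are placeholders, not a required format:

    <!-- Hypothetical content page: the embedded video with its transcript directly below -->
    <section>
      <h2>How to Change a Doorknob</h2>
      <iframe src="https://www.youtube.com/embed/VIDEO_ID"
              title="How to Change a Doorknob" allowfullscreen></iframe>
      <h3>Video Transcript</h3>
      <p>Hi, everyone. Today I’ll show you how to change a doorknob.
         First, locate the two screws on the interior side of the knob...</p>
    </section>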

Create Engaging Thumbnails

I know I do this, and you probably do, too: when I search for videos, I click on one based on the thumbnail I see. That’s because thumbnails are (or should be) strong indicators of what’s in that video.

Thumbnails that are engaging, exciting, and relevant to the content are going to get more clicks.

Your thumbnails should include a few essential elements to stand out from all the others:

  • A person (people like seeing people)
  • The title of the video
  • Graphics showing relevance

The right thumbnail will stop people in their tracks on your video as opposed to someone else’s.

If I want to see a video showing me how to install a fence in my yard, I want the thumbnail to show people digging holes, not just a completed fence in the ground.

The digging one gives me an idea of the work involved. The other one just shows me the finished result that I don’t have yet.

You can upload custom thumbnails through YouTube Studio and design them in whatever tool or app you like.

Write Keyword-Infused Video Titles and Descriptions

Video SEO is a lot like on-page SEO for any other content. Videos need good titles and descriptions if you want Google to pick them up.

If you did keyword research for your video transcript, then those keywords will apply here, too.

For your video titles, choose the keywords with the highest search volume that best match the intent of the searcher.

It’s important to be smart with this. You can’t just stuff all these keywords into your video titles and separate them with pipes. It looks spammy and like you’re trying too hard.

Think of a keyword-based title that would be the most useful to your intended audience.

The same goes for descriptions. Use keywords naturally as in any other SEO content. Don’t overdo them. Think of the user, and write a detailed description that will help with understanding the video.

Be Logical About Embedded Videos

You can use all of these video SEO tips whether you’re creating videos on YouTube or any other platform.

When you decide you want to embed videos right on your website’s content pages, though, there are some other best practices to follow.

For instance, did you know that Google usually ranks only one video per website page, and it’s almost always the first one?

So, if you have a resource page on your website, and you include multiple videos on it, Google probably isn’t going to rank that page for any other videos but the first one. Make it count.

This is just a generally good practice to follow: if you want users and search engines to find the videos on your website, embed them at the top of the page, above the fold.

That way, no one has to take extra time to find them again.
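As a sketch of that idea (the URLs and copy are placeholders), the embed sits right under the page’s main heading, before the body text:

    <!-- Hypothetical page layout: video embedded above the fold, ahead of the article copy -->
    <main>
      <h1>How to Install a Fence</h1>
      <iframe src="https://www.youtube.com/embed/VIDEO_ID"
              title="How to Install a Fence" allowfullscreen></iframe>
      <p>Installing a fence starts with planning your post holes...</p>
    </main>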


Wednesday, 30 March 2022

What Is Robots.txt, and Why Is It Important for SEO?

A robots.txt file is an ASCII or plain text document made up of commands specifically meant to be read by search engine crawlers. Crawlers (sometimes called bots or spiders) are autonomous programs used by search engines like Google and Bing to find and “read” web pages.

Crawlers enable search engines to understand what kind of information is stored on a page and then index that page so it can be displayed in response to user queries. During indexing, the search engine’s algorithm sorts pages into an order that directly affects their SERP ranking.

The first thing crawlers do when visiting any website is download its robots.txt file. This gives you a chance to communicate with the crawler, explain how it should read your site, and differentiate which pages are important and which are unimportant.

Every search engine has its own crawler, and every crawler has its own identifying “user-agent” designation:

• Google: Googlebot
• Google Images: Googlebot-Image
• Bing: Bingbot
• Yahoo: Slurp
• DuckDuckGo: DuckDuckBot

Because crawlers are always scouring the internet for new pages, it’s important not only to have a robots.txt file but also to ensure that the file stays as up-to-date and accurate as possible. In a very real sense, robots.txt gives you the opportunity to take greater control over how your website is indexed, which has a huge impact on how your site’s pages will rank in search results.

Robots.txt Is Important for SEO

Allowing/Disallowing Certain Pages

A robots.txt file is an essential part of every website for a few different reasons. The first and most obvious is that it enables you to control which pages on your site do and do not get crawled.

This can be done with an “allow” or “disallow” command. In most cases, you’ll be using the latter more than the former, with the allow command really only being useful for overriding a disallow. Disallowing certain pages means that crawlers will exclude them when reading your website.
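Here’s a minimal sketch of those commands in a robots.txt file (the paths are placeholders for your own site’s URLs):

    # Rules for all crawlers
    User-agent: *
    # Exclude everything under /private/ from crawling...
    Disallow: /private/
    # ...except this one page, where Allow overrides the Disallow
    Allow: /private/shareable-page.html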

You might wonder why you would ever want to do that; after all, isn’t the whole point of SEO to make it easier for search engines, and therefore users, to find your pages?

Yes and no. Actually, the whole point of SEO is to make it easier for search engines and their users to find the correct pages. Virtually every website, no matter how big or small, will have pages that aren’t meant to be seen by anyone but you. Allowing crawlers to read these pages increases the likelihood of them showing up in search results in place of the pages you actually want users to visit.

Examples of pages you might want to disallow crawling include the following:

• Pages with duplicate content
• Pages that are still under construction
• Pages meant to be exclusively accessed via URL or login
• Pages used for administrative tasks
• “Pages” that are actually just multimedia resources (such as images or PDF files)

Additionally, for large websites with hundreds or even thousands of pages (for example, blogs or e-commerce sites), disallowing can also help you avoid wasting your “crawl budget.”

Since Google and other search engines can only crawl so many pages on a website, it’s important to make sure that your most important pages (i.e., the ones that drive traffic, shares, and conversions) are prioritized over less important ones.
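Putting the page types above into practice, a crawl-budget-friendly robots.txt might look something like this (all paths are hypothetical):

    User-agent: *
    # Administrative and under-construction areas
    Disallow: /admin/
    Disallow: /staging/
    # Duplicate content, e.g. printer-friendly versions of pages
    Disallow: /print/
    # Login-only account pages
    Disallow: /account/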

Allowing/Disallowing Certain Crawlers

Most of the time, you’ll be allowing or disallowing all crawlers from a certain page or pages. However, there may be instances where you want to target specific crawlers instead.

For instance, if you’re trying to cut down on image theft or bandwidth abuse, instead of disallowing a long list of individual media resource URLs, it makes more sense to simply disallow Googlebot-Image and other image-centric crawlers.
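A sketch of that approach, blocking Google’s image crawler from the whole site while leaving every other crawler untouched:

    # Keep Google's image crawler out entirely
    User-agent: Googlebot-Image
    Disallow: /

    # All other crawlers proceed as normal (an empty Disallow permits everything)
    User-agent: *
    Disallow: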

Another time you might want to disallow certain crawlers is if you’re receiving a lot of problematic or spammy traffic from one search engine more than another.

Spam traffic from bots and other sources isn’t likely to harm your website (although it can contribute to server overloads, a topic we’ll discuss a little later). However, it can seriously skew your analytics, inhibiting your ability to make accurate, data-based decisions.

Directing Crawlers to the XML Sitemap

Robots.txt files aren’t the only tool you have to funnel search engine crawlers toward the most important pages on your website. XML sitemaps serve a very similar function.

Additionally, XML sitemaps contain other pieces of useful information, including when pages were last updated, which pages search engines should prioritize, and how to locate important content that might otherwise be deeply buried.

All this makes an XML sitemap an extremely potent weapon in your SEO arsenal. Of course, just as those kids in The Blair Witch Project discovered the hard way, a map is only useful as long as you can actually find it.

Enter robots.txt. Since a crawler will read your robots.txt file before it does anything else, you can use it to point the crawler directly to your sitemap, ensuring that no time or resources are wasted.

This is especially helpful if you have a large website with tons of links per page, since without a sitemap, crawlers rely primarily on links to find their way. If your website has rock-solid interlinking (or very few pages), this might not be something you have to worry much about. Nevertheless, using robots.txt hand-in-hand with an XML sitemap is definitely recommended.
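The directive itself is a single line, conventionally placed at the top or bottom of the file (the URL here is a placeholder for your own sitemap’s location):

    # Tell crawlers where to find the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml

    User-agent: *
    Disallow: /admin/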

Protecting Against Web Server Overload

Okay, this one isn’t an “official” robots.txt directive, but it is one that several major search crawlers take heed of regardless. If anyone asks where you heard this, don’t tell them it was us.

By including a “crawl-delay” command in your robots.txt, you can control not only which pages crawlers read but the speed at which they read them. Normally, search engine crawlers are remarkably fast, bouncing from page to page to page much more quickly than any human could manage. That makes them extremely powerful and efficient.

It also makes them a liability, at least for sites with limited hosting resources.

The more traffic a website receives, the harder the server it’s hosted on has to work to display the site’s pages. When the rate of traffic exceeds the server’s ability to accommodate it, the result is an overload. That means page speed slowing to a crawl, as well as a sharp increase in 500, 502, 503, and 504 errors. Simply put, it means disaster.

Although it doesn’t happen often, search engine crawlers can contribute to server overloads by pushing traffic past the tipping point. If this is something you’re concerned about, you can actually command crawlers to slow down, delaying them from moving to the next page by anywhere from 1 to 30 seconds.
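For example, the delay is set per user-agent, in seconds. Keep in mind that support varies: Bing and Yahoo honor crawl-delay, while Googlebot ignores it.

    # Ask Bing's crawler to wait 10 seconds between page requests
    User-agent: Bingbot
    Crawl-delay: 10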
