How to Fix the "Indexed, though blocked by robots.txt" Warning

New webmasters will run into various kinds of index coverage issues in Google Search Console.

A few days ago I got a warning in Google Search Console: pages listed as "Valid with warnings".

That means the pages are valid (indexed), but the Google crawler cannot, or is not permitted to, crawl them, so a warning message appears.

The "Indexed, though blocked by robots.txt" warning status is not a serious problem that you have to fix.
This issue does need to be reviewed, though: is the page blocked by robots.txt really unnecessary, or not important enough to be crawled by Google?

If it's not so important, then it's better to leave it alone.

For users of the Blogger platform, this problem usually arises because a page, for example a label page, is blocked by the robots.txt file you are using. It can also be caused by a meta tag used in your blog template.

There are several ways to deal with the "Indexed, though blocked by robots.txt" warning; for more details, see the tutorial below.

Does the "Indexed, though blocked by robots.txt" Warning Need to Be Fixed Immediately?
It depends.

I say this because this index warning comes in several varieties.

If the problem you are experiencing is like mine, the affected URLs look like the examples below:
Example errors:
xxxxxxxxdotcom/search/label/Tutorial
xxxxxxxxxxxdotcom/search/label/Tutorial?&max-results=8

This is because I use the meta tag below in my blog template, whereas the robots.txt file I use doesn't block anything.
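The tag in question is a conditional noindex for archive, label, and search pages. In a Blogger template it looks along these lines (a sketch only; the exact condition syntax differs from template to template):

<b:if cond='data:blog.pageType == "archive" or data:blog.searchQuery or data:blog.searchLabel'>
  <meta content='noindex' name='robots'/>
</b:if>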

The meaning of the meta tag above is that every archive page, label page, and search results page will not be indexed by Google.

So when Googlebot tries to crawl your blog or website, this warning appears because Google is not allowed to crawl those pages.

In my opinion,

the archive, label, and search results pages on a blog do not need to be indexed, so I leave this warning alone.

However, if this happens on your post pages or product pages, then you really do need to fix the problem as quickly as possible.

That's why I say that whether the "Indexed, though blocked by robots.txt" warning is a problem depends on which page is affected.

So if the page is not that important to have indexed, you can simply ignore this warning. But if you want the page, post, product, or service to appear on Google, then fix the problem immediately.

How Do You Fix the "Indexed, though blocked by robots.txt" Warning?
There are two ways to make the warning status disappear from Google Search Console.

But before taking these steps, it's better to review for yourself which pages on your blog or website should and should not be indexed.

For example, label pages, search pages, and archive pages really should not be indexed.

The reason is that these are not original content pages; what they display changes automatically, and they also get fewer visits, so they don't need to be indexed.

But if you still want to get rid of the "Indexed, though blocked by robots.txt" warning status, please follow the steps below.

1. Double-check the robots.txt file you use
Please check the robots.txt file you are using: does it contain a Disallow rule like the one in the example below?
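For reference, the robots.txt that Blogger generates by default looks roughly like this (a sketch: the sitemap line carries your own domain, and Blogger has tweaked the file over time):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml

The Disallow: /search line is the one that blocks label and search pages.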
If there is a Disallow rule like that, please delete it or change it to Allow, as follows:

1. Log in to Blogger.com
2. Click Settings > Search preferences
3. Then, in the Custom robots.txt section, click Edit
4. If there is a Disallow: /search line, delete it

Disallow: means crawling is not permitted
/search means all pages under the .../search/... path

So,

Disallow: /search means that all pages under the /search path are not allowed to be crawled.
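As an illustration (using made-up addresses in the same style as the examples above), with Disallow: /search in place these URLs are blocked from crawling:

xxxxxxxxdotcom/search/label/Tutorial
xxxxxxxxdotcom/search?q=tutorial

while an ordinary post URL such as xxxxxxxxdotcom/2019/01/some-post.html can still be crawled.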

5. You can also change the Disallow rule to Allow, but in that case it's better to just delete it (see the before-and-after example following these steps)
6. Finally, click Save changes
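After the edit, the same file (again a sketch with a placeholder domain) would look like this, with nothing blocked:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml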

If you still use the old version of Google Search Console, please open the following URL:

https://www.google.com/webmasters/tools/robots-testing-tool

Then select the property (the blog) whose robots.txt file you want to change.

2. Re-check the meta tags you use
For WordPress and Blogger users, you should check the meta tags in the theme used on your blog or website.

If you don't want the "Indexed, though blocked by robots.txt" warning to appear, it's best to avoid using the noindex meta tag.


For Blogger users, please follow the steps below to delete the noindex meta tag.

1. Log in to Blogger.com
2. Click Theme > Edit HTML
3. Then search for the noindex code (use Ctrl + F to speed things up)
4. If you find it, delete the code, for example like this (see the note after these steps):
  •  <meta content='noindex' name='robots'/>
Delete all of that code.

5. Make sure the noindex code you delete is inside the <head> ... </head> section
6. When there is no more noindex code between <head> and </head>, click Save template
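A note for step 4: in many templates the noindex tag is wrapped in a conditional, so the block you find may look more like this (illustrative only; the condition in your template may differ). In that case, delete the whole <b:if> ... </b:if> block, not just the middle line:

<b:if cond='data:blog.pageType == "archive"'>
  <meta content='noindex' name='robots'/>
</b:if>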

Immediately Validate the Fix
If you have made the fixes using the two methods above, then immediately request validation of the fix.

That way, Google will re-check the fix report you submitted.

If the fixes you made are correct, the warning status for this index coverage issue will disappear from your Google Search Console.

To request validation of the fix, follow these steps:

1. Open the new Google Search Console
2. Then select your website or blog in the property selector
3. Then click Coverage > click Valid with warnings > then click the Warning status
4. Then click Validate Fix
5. Done; please wait for the results

That's the article on how to deal with the "Indexed, though blocked by robots.txt" warning. Please give it a try, and hopefully it's useful.
