This question has been flagged
1 Reply
1951 Views

Hello everyone, I have a problem with the indexing of my website on Google.


Here is my robots.txt:


User-agent: Googlebot
Disallow:

User-agent: googlebot-image
Disallow:

User-agent: googlebot-mobile
Disallow:

User-agent: MSNBot
Disallow:

User-agent: Slurp
Disallow:

User-agent: Teoma
Disallow:

User-agent: Gigabot
Disallow:

User-agent: Robozilla
Disallow:

User-agent: Nutch
Disallow:

User-agent: ia_archiver
Disallow:

User-agent: baiduspider
Disallow:

User-agent: naverbot
Disallow:

User-agent: yeti
Disallow:

User-agent: yahoo-mmcrawler
Disallow:

User-agent: psbot
Disallow:

User-agent: yahoo-blogs/v3.9
Disallow:

User-agent: *
Disallow:
Disallow: /cgi-bin/

Sitemap: exemple
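For reference, here is a minimal sketch (Python standard library only) of how these rules can be checked locally. The rules are condensed to the two groups that matter, and the example.com URLs are just placeholders. An empty "Disallow:" line means everything is allowed for that user agent:

from urllib.robotparser import RobotFileParser

# Condensed version of the rules above: Googlebot gets its own group,
# every other crawler falls under the wildcard group.
rules = [
    "User-agent: Googlebot",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /cgi-bin/",
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot matches its own group, whose empty Disallow allows everything.
print(rp.can_fetch("Googlebot", "https://example.com/"))             # True
print(rp.can_fetch("Googlebot", "https://example.com/cgi-bin/x"))    # True, the specific group takes precedence
# Other bots fall back to the wildcard group, which blocks /cgi-bin/.
print(rp.can_fetch("SomeOtherBot", "https://example.com/cgi-bin/x")) # False

So, by these rules, nothing should be blocked for Googlebot.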


Normally everything should be fine, but Google is showing me this error:


Crawl allowed?
Error
No: blocked by robots.txt
Page fetch
Error
Failed: Blocked by robots.txt


I don't understand what's going on. I've used this kind of robots.txt several times on other websites, but it doesn't work on the website I'm trying to get indexed.
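Since Search Console reports "blocked by robots.txt" even though the file above is permissive, one thing worth checking is what the live site actually serves at /robots.txt: it may not be the same file (for example a default or staging robots.txt that disallows everything). Here is a minimal diagnostic sketch using only the Python standard library; www.example.com is a placeholder for the real domain:

from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # placeholder: the site that is not being indexed

# 1. Print the robots.txt that the server actually serves right now;
#    compare it with the file you think is deployed.
print(urlopen(SITE + "/robots.txt").read().decode("utf-8"))

# 2. Ask Python's parser whether Googlebot may fetch the home page
#    according to that live file.
rp = RobotFileParser(SITE + "/robots.txt")
rp.read()
print(rp.can_fetch("Googlebot", SITE + "/"))

If this prints False, the robots.txt being served is not the one shown above, and that is what Google is reading.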


Thank you in advance for your help.

Best Answer
Hello, I have the same problem. Did you find a solution?

