
When you have an Odoo website and you want to use Google services such as “Shopping feed Optimization” or Google Ads, Google may need to crawl all of the website’s pages. An error can then occur with the default configuration of the robots.txt file in Odoo. This happened in v13; it may be fixed in later versions.


Okay, got it. Then how can I enable crawling for all the crawlers, not just Google?
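
For reference, standard robots.txt syntax can allow every crawler at once: a wildcard user agent with an empty Disallow admits all bots, not only Google’s. This is generic robots.txt behavior, not an Odoo-specific setting; compare it with the Googlebot-specific rules in the answer below:

“User-agent: *
Disallow:”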

Best Answer

Here is how to solve this problem.

We have to modify the current robots.txt file directly in Odoo.

To do so, activate the developer mode.

Then, in the general settings, go to Technical and open “Views” under the “User Interface” menu.
Search for a view named “robots” and edit its architecture.

By default, you have the following content:

“User-agent: *”
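
In v13, this line sits inside a QWeb template that also appends a Sitemap entry pointing at your site. The view’s architecture looks roughly like the following (a sketch; the exact markup can differ between versions):

“<template id="robots">User-agent: *
Sitemap: <t t-esc="url_root"/>sitemap.xml
</template>”

Only the User-agent line needs to change; keep the Sitemap line as it is.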

We are going to erase this line and add the following ones:

“User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:”

Save the changes, and Google should now be able to access all of your website’s pages.
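
After saving, the file served at /robots.txt should look roughly like this, assuming the view’s Sitemap line was kept (yourdomain.example is a placeholder for your own domain):

“User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:

Sitemap: https://yourdomain.example/sitemap.xml”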
