It sounds like there is a formatting problem in your robots.txt file: the errors reported by PageSpeed Insights and Search Console indicate that it contains unknown or invalid directives.
Here are a few things you can check to resolve this issue:
Make sure that your robots.txt file is located at the root of your website's domain (e.g. www.example.com/robots.txt). Crawlers request the file only at that exact path, so a file stored anywhere else will simply never be read.
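A quick way to confirm the file is actually being served at the root is to fetch it yourself. Here is a minimal sketch using Python's standard library; the domain is a placeholder for your own:

```python
import urllib.request
import urllib.error

URL = "https://www.example.com/robots.txt"  # placeholder domain

try:
    with urllib.request.urlopen(URL) as resp:
        # A 200 status and a text/plain content type mean crawlers
        # can find and read the file where they expect it.
        print(resp.status, resp.headers.get("Content-Type"))
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as err:
    # A 404 here means crawlers cannot find the file either.
    print("robots.txt not reachable:", err.code)
```

If the output contains HTML tags or redirects to an error page, that alone can explain the "unknown directive" errors.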
Check the syntax of your robots.txt file to ensure that it's properly formatted. Each directive should be on its own line, and the file should be saved as plain text (not HTML or any other format); stray markup or an unusual encoding is a common cause of "unknown directive" errors. A correctly formatted example follows the next step.
Verify that the directives in your robots.txt file are valid. The User-agent directive should name the crawler the rules apply to (e.g. Googlebot, or * for all crawlers), and each Allow and Disallow directive should be followed by a URL path (e.g. Allow: /blog/).
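For reference, here is a minimal, well-formed robots.txt; the paths and sitemap URL are placeholders:

```text
User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /blog/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

Comparing your file against a simple template like this, line by line, often reveals the offending directive quickly.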
If you're using Odoo 11 to manage your website, check whether any add-ons or customizations are affecting the robots.txt output. Odoo's website module typically serves /robots.txt dynamically from a controller and template rather than from a static file on disk, so changes need to be made through Odoo itself; editing a file in the web root will usually have no effect.
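If you do need to customize the output, the usual pattern is to override the /robots.txt route in a small module. The sketch below is illustrative only; the route decorator and response helper are standard Odoo HTTP APIs, but the rules are placeholders you would replace with your own:

```python
from odoo import http
from odoo.http import request


class WebsiteRobots(http.Controller):
    @http.route('/robots.txt', type='http', auth='public')
    def robots(self):
        # Serve a hand-written, known-good robots.txt as plain text.
        # Any add-on overriding this same route could be the source
        # of the malformed lines Search Console is complaining about.
        body = "User-agent: *\nDisallow: /web/\n"
        return request.make_response(
            body, headers=[('Content-Type', 'text/plain')])
```

Searching your custom add-ons for a route on '/robots.txt' is a fast way to find out whether one of them is generating the file.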
Once you've made any necessary changes, test the file with the robots.txt Tester in Google Search Console and resubmit it there; the tester highlights the exact line that triggers a parse error, which makes it easy to confirm the fix.
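You can also sanity-check the file locally with Python's standard urllib.robotparser before going back to Search Console. It silently skips lines it cannot parse, so if a rule you expect to apply doesn't, that points at a malformed line; the domain below is a placeholder:

```python
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")  # placeholder domain
parser.read()

# Spot-check a few paths against the rules the parser recognized.
for path in ("/blog/", "/private/"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"Googlebot {'may' if allowed else 'may not'} fetch {path}")
```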