
Hi,

I notice that if I run the server with odoo.py using the workers parameter, I also get an additional process using the openerp-gevent script, which runs on port 8072.

A couple of questions:

1. Can I change the gevent port number? If I want to run several instances of Odoo on one server, they will all try to use the same port for gevent.

2. Do I point users at the gevent port? If I do so, does this gevent process use the other worker processes? The main question is: how can I use the live chat and keep multi-process capability for my users?

Thanks


I have the same problem.

Author

Nobody knows?

--workers? It does not work; Odoo should fix this issue. It should be handled with a reverse proxy, etc.

When I increase the workers, why does it lose the global variable values?

Author Best Answer

I found the answer and I didn't want to leave the question unanswered, so here it is:

You start the server with --workers > 0 (how many depends on your hardware), so you get that many HTTP worker processes on port 8069. You will also have a couple of cron workers (configurable with --max-cron-threads) and one gevent/longpolling process on port 8072 (configurable with --longpolling-port).
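For reference, here is a minimal sketch of the relevant options in an Odoo configuration file; the file path and the exact values are just examples, adjust them to your hardware:

    ; /etc/odoo/openerp-server.conf (path is an example)
    [options]
    workers = 4               ; HTTP worker processes listening on xmlrpc_port
    max_cron_threads = 2      ; cron workers
    xmlrpc_port = 8069        ; main HTTP port
    longpolling_port = 8072   ; gevent/longpolling port, change it per instance

If you run several instances on the same server, give each one its own xmlrpc_port and longpolling_port.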

You have to set up a reverse proxy (apache2 or nginx will do the job) and map your 8069 port to the external port 80.

Here is the important part: you also have to reverse proxy your 8072 port to the external port 80, but only for the /longpolling location (in nginx this is done with a second location block).

That way your users use the workers normally, and the gevent process is hit only for fetching the bus messages.
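For illustration, a minimal nginx sketch of that mapping; the server name is a placeholder and the default 8069/8072 ports are assumed:

    upstream odoo {
        server 127.0.0.1:8069;
    }
    upstream odoo-longpolling {
        server 127.0.0.1:8072;
    }

    server {
        listen 80;
        server_name odoo.example.com;   # placeholder

        # bus/longpolling requests go to the gevent process
        location /longpolling {
            proxy_pass http://odoo-longpolling;
        }

        # everything else goes to the regular workers
        location / {
            proxy_pass http://odoo;
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }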

I will soon post a complete installation and configuration guide for v8.0, with all the details usually missing from the available guides.

Regards


Hello, maybe you have solved the part with gevent? I was able to launch Odoo with apache2 + WSGI and it is working, but im_chat is not working properly: I get a bus.Bus exception. I guess it has something to do with gevent. When I ran Odoo out of the box, the chat was working fine.

Did you try with WSGI?

Well, there is a number of workers specified in the WSGI file, but it seems to do nothing when you run Odoo using apache2. I tried adding a reverse proxy for longpolling, but I would always get a connection refused error. You can see what I have tried in this issue: https://github.com/odoo/odoo/issues/3793

Author

@NOD, do you really need WSGI? I do not run Odoo with WSGI; instead I just run odoo.py with the correct options, from a config file. If you do so, using --workers=2 (or more) will enable longpolling. You will see that Odoo listens on port 8072; you can change this with --longpolling-port=NNNN. Once Odoo is running, you can check which ports it is listening on with "netstat -lnput | grep python". If you are using the default ports, you will see it listening on 8069 and 8072. If you do, configure apache or nginx as a reverse proxy on those ports: 8069 for all communications, except /longpolling, which should go to the longpolling port.
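A short sketch of what that looks like on the command line (the config file path and worker count are just examples):

    # start Odoo in multi-process mode
    ./odoo.py -c /etc/odoo/openerp-server.conf --workers=2 --longpolling-port=8072

    # check which ports the python processes are listening on
    netstat -lnput | grep python
    # with default ports you should see 8069 (workers) and 8072 (gevent/longpolling)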

I use WSGI to be able to run Apache, and I use it to create subdomains, so I can access a database by entering something like db1.domain.com. Running Odoo normally, that was not possible (or I don't know how).

Hi Carlos, at first I was thinking like you: no need for WSGI and all the deployment stuff just to get Odoo running. Yet, once you start dealing with service failures, reverse proxy issues, etc., having a deployment configuration bound to your favorite web server is a great advantage.
Additionally, you can do some configuration with WSGI that you can't easily do with the default setup. I agree with @NOD.

Hi Carlos, any additional tip on this?

Best Answer

My question is: even when I set workers > 0, there is no gevent process listening on 8072.

And I found that only one line in the source code uses longpolling_port, in the __init__() of GeventServer, and GeventServer is only constructed instead of PreforkServer when Odoo is configured to use the gevent server.

So I am totally confused: which port will be used for longpolling in multi-process mode?
