Community mailing list archives
Re: Production Costing
Ray, this seems a really clean approach, with an optimal trade-off between effort and the confidence level of the management information, given the usual level of data capture.
Could you identify any hidden costs in repurposing the product concept for this purpose?
Valuation-wise, I see a problem with the valuation of non-stockable overhead (labor) in countries where production costs reside within a special chapter of the chart of accounts, or similar. How would you get non-stockable overhead onto the right accounts at the right time? It would be by making labor stockable, so that units of labor can be virtually moved around between the different accounts. This is somewhat ugly, as it definitively misuses the product concept. I think a reasonable intermediate development (before advanced data capture) would be to add an accounting enabler to the resources concept, allowing accounting moves based on (more precisely: estimated) resource usage in the BoM -> 'mrp_accounting'.
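As a rough illustration of what such an 'mrp_accounting' enabler could look like, here is a minimal sketch. All names, accounts, and rates are invented for illustration; this is not an actual Odoo module or API. It derives a journal entry from the estimated resource usage in a BoM, instead of making labor a stockable product:

```python
# Hypothetical sketch of the 'mrp_accounting' idea: post an accounting
# move based on the *estimated* resource usage recorded in the BoM.
# Account names, resources, and rates below are all invented.

# Estimated resource usage per finished unit, taken from the BoM
bom_resources = [
    {"name": "assembly labor", "hours": 0.5, "rate": 30.0},  # cost/hour
    {"name": "machine time",   "hours": 0.2, "rate": 80.0},
]

def overhead_move(qty_produced,
                  debit_account="Production",
                  credit_account="Absorbed overhead"):
    """Build a journal entry that moves estimated overhead onto the
    production account at the moment of manufacture, so non-stockable
    overhead lands on the right accounts at the right time."""
    amount = sum(r["hours"] * r["rate"] for r in bom_resources) * qty_produced
    return {"debit": (debit_account, amount),
            "credit": (credit_account, amount)}

# Producing 10 units would absorb (0.5*30 + 0.2*80) * 10 = 310.0
print(overhead_move(10))
```

The point of the sketch is only the mechanism: the move is driven by the BoM estimate, not by a stock move of a fake "labor" product.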
@Ana: I like that this approach keeps analytic accounts clean for analytic purposes, since it affects stock valuation and is therefore the legal view rather than the management view. See also my pad comment, which is based on this very same conceptualization.
I'll take on the task of making a consolidated discussion protocol on the pad in Fakrudeen's suggested syntax, to build a base / reference document for possible further use...
Hello Ray, thanks for these valuable comments. A statistical approach to COGS is certainly possible, and it's probably the only one that fits a stock Odoo system. But it may still not make the cut in many companies. For instance, our electrical connectors customer has full tracking via product lots. Tracking the real cost of each individual production order allows them to detect quality problems, track them down to specific lots, suppliers or operators, and take appropriate corrective action. So I believe that, depending on the situation, both approaches can be used. Of course it's always good to remember the simplest one, as you just did, and the trick of empirically tuning overhead costs is certainly a good one.
Now, just as you said, the "price of collecting" these real cost data is high indeed. And this is exactly where extra developments kick in: building interfaces (with things like sign-in/sign-out by operator, or propagation of real move quantities from consumed components to the original reservation moves...) to make that cost reasonable again, at least when the value of the analysis justifies it. Regards.
On Thu, Aug 21, 2014 at 12:45 AM, Ray Carnes - Implementation Strategy <email@example.com> wrote:
P.S. This approach can be handled out of the box.
I discussed this topic at length with a CFO at a client that wanted this and he (after discussing with his CPA friends) concluded:
The information required to do this accurately, and the price of collecting it (entering when machines were turned on, clocking in and out, pausing work, allowing for overtime, etc.), was high, and the data would need constant correction unless it was always 100% up to date and changed in real time as production costs (labor, capital, assets) changed.
He therefore decided to have several ‘overhead’ products that represented different ‘units’ of cost. According to the BoM, a certain quantity of overhead units would be added. Manufactured products that were simple to build (simple machinery, low labor) had fewer units than manufactured products that were more complex to build (many machines or more expensive machinery, more labor). He ended up with three units of overhead production costs, something like $10, $25 and $45.
The point was that units were allocated according to the overhead effort believed (educated estimate) to go into making manufactured products.
Each month, the ‘price’ of these overhead products would be adjusted up or down according to the actual expenses incurred. This way labor, electricity, machine leasing, rent, etc. could all be allocated ‘equitably’ across the production of all manufactured products, and as those costs changed, they would be retroactively included in COGS.
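A minimal numerical sketch of this monthly re-pricing. The three-tier $10/$25/$45 structure comes from the message above; the consumption quantities and the actual expense figure are hypothetical:

```python
# Sketch of the monthly overhead re-pricing described above.
# Tier prices follow the example in the thread; all other
# figures are invented for illustration.

# Overhead "products" with their current prices (USD per unit)
overhead_prices = {"small": 10.0, "medium": 25.0, "large": 45.0}

# Units of each overhead product consumed by production this month,
# as dictated by the BoMs of the products manufactured
units_consumed = {"small": 400, "medium": 120, "large": 60}

# Real overhead expenses incurred: labor, electricity, leasing, rent, ...
actual_overhead_expense = 10500.0

# Overhead absorbed at the current prices
absorbed = sum(overhead_prices[k] * units_consumed[k]
               for k in overhead_prices)  # 9700.0

# Scale every tier price by the same factor so that the absorbed
# amount matches the expenses actually incurred this month
factor = actual_overhead_expense / absorbed
new_prices = {k: round(p * factor, 2) for k, p in overhead_prices.items()}
print(new_prices)  # each tier adjusted up, since expenses exceeded absorption
```

One design choice worth noting: scaling all tiers by a single factor keeps the *relative* overhead weighting between simple and complex products fixed, which matches the "educated estimate" allocation described above; a company could equally re-estimate each tier independently.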
On Wed, Aug 20, 2014 at 11:45 PM, <firstname.lastname@example.org> wrote:
First, I want to say THANK YOU all!
Fakrudeen, your contribution (content and pad) seems a very good basis for discussion.
Raphael, as I understand you, this is so far private code, so you cannot share it, not even informally? It would be nice if we as a community could accompany your efforts in some way. Maybe the pad is a good reference point and a good tool for aggregating knowledge.
The entity that should take the decision to exercise its AGPL rights is the customer company that owns the code, not me. As the code was developed by another company that didn't use to publish things, I cannot afford to interfere in that kind of decision at this stage (probably later, once the project is safe with us). Moreover, as it stands it would be of little help, given all the customer-specific logic inside, given it's on 6.1, and given it was written by Python beginners. Now the positive point is that it does roughly what is described in the pad, and it seems to have been working fine for one year (one more precision: time of operation is recorded live by the manufacturing module, not processed from timesheets).
Time is really something we don't have, so it's not easy to interact with third parties on this these days. As for money, it hardly buys more of the required resource here, nor does it allow us to clone ourselves. So I'm afraid that, unless somebody has a silver bullet, you may hear back from us on that one, but you may need to be patient. On the other hand, if somebody does something else before we clean it up, that's a risk taken by the company, which didn't ask for a publicly maintained solution from the start.
I'll try to stay aligned with Ana and Pedro on that one; I already have a meeting scheduled with them, and will tell them roughly this.