Sheila Moran on Inventory Valuation

The following is a guest post from Sheila Moran, co-author of The Board Director and Audit Committee Members Guide to Fiduciary Responsibilities, about quick assurance on quantity and unit cost.

For many companies, inventory represents a large portion of total assets. For lenders, board directors, and other key stakeholders, obtaining assurance that inventory valuation is not inflated is crucial. Yet many companies lack a robust internal audit function and fail to fully utilize the computer-enforced controls and auditing available in their financial software systems. Year-end external audits, which test only a modest sample of parts, are unlikely to catch inflated inventory valuations that on an individual-part basis fall below scope. However, aggregate inventory distortions in excess of 10 percent of inventory are possible even when every part-level variance is below scope.

Targeted data-mining quickly and credibly validates unit cost and quantity on hand (QOH). An in-house programmer or a consultant can design a program to:

  • Repopulate FIFO (first-in-first-out) layers using receipts-only transactions. Using only receipt transactions eliminates adjustment transactions, which are more susceptible to manual-entry errors in unit cost.
  • Determine expected QOH: recalculating projected QOH will verify the accuracy of manual counts performed either during full inventory or cycle counts.
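The two calculations above can be sketched in a few lines. This is a minimal illustration, not production code: the `Txn` record, the `"RCPT"` transaction code, and the sample data are all hypothetical placeholders, since actual field names and codes vary by ERP system.

```python
from dataclasses import dataclass

@dataclass
class Txn:
    date: str        # ISO date, e.g. "2023-04-01"
    code: str        # hypothetical transaction code, e.g. "RCPT", "ISSUE", "ADJ"
    qty: float       # positive for receipts/adjust-in, negative for issues/adjust-out
    unit_cost: float # cost per unit (meaningful on receipts)

def expected_qoh(txns):
    """Expected quantity on hand: net of every transaction since part inception."""
    return sum(t.qty for t in txns)

def fifo_layers(txns, qoh):
    """Rebuild FIFO layers from receipts only, newest first, until they cover QOH."""
    layers = []
    remaining = qoh
    receipts = sorted((t for t in txns if t.code == "RCPT"),
                      key=lambda t: t.date, reverse=True)
    for r in receipts:
        if remaining <= 0:
            break
        take = min(r.qty, remaining)           # partial layer if QOH runs out
        layers.append((r.date, take, r.unit_cost))
        remaining -= take
    return layers  # newest layer first

txns = [
    Txn("2023-01-05", "RCPT", 100, 2.00),
    Txn("2023-02-10", "ISSUE", -60, 0.0),
    Txn("2023-03-01", "RCPT", 50, 2.20),
]
qoh = expected_qoh(txns)          # 100 - 60 + 50 = 90
layers = fifo_layers(txns, qoh)   # 50 units @ 2.20, then 40 units @ 2.00
```

Note that the oldest receipt contributes only a partial layer (40 of its 100 units), which is exactly the behavior described in step 2 below.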

Steps to validate inventory valuation include:

1. Determine expected QOH.  Use all transactions from the part's inception to arrive at a projected QOH.
2. Once expected QOH has been determined, populate the FIFO layers by pulling in receipts in reverse chronological order until the receipts included equal QOH.
3. Test findings:

  • Identify parts with significant variances between the validated and general ledger values.  Scope should be sufficient to capture at least 25 percent of total variance.
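One way to apply the 25-percent scoping rule is to rank parts by absolute variance and select from the top until the target share is covered. A minimal sketch, assuming variances have already been computed as validated value minus general ledger value per part (the part numbers and amounts are illustrative):

```python
def scope_parts(variances, target_share=0.25):
    """Select parts with the largest absolute variances until the selection
    covers at least target_share of the total absolute variance."""
    total = sum(abs(v) for v in variances.values())
    selected, covered = [], 0.0
    for part, v in sorted(variances.items(),
                          key=lambda kv: abs(kv[1]), reverse=True):
        if covered >= target_share * total:
            break
        selected.append(part)
        covered += abs(v)
    return selected

variances = {"A-100": 12000, "B-200": -3000, "C-300": 500, "D-400": -200}
# total absolute variance = 15,700; the 25% threshold is 3,925,
# so "A-100" alone (12,000) satisfies the scope
```

Ranking by absolute value matters: overstatements and understatements can otherwise net against each other and hide the true size of the distortion.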

For parts with significant QOH variances:

  • Recalculate expected QOH using transaction history.  If the QOH from the data-mining program can be recalculated, that's a strong signal that the program is pulling the right transaction codes. Because transaction codes vary widely among ERP systems, expect some trial and error at this stage.
  • Once expected QOH is confirmed, select a sample of parts to be counted.  Confirmation via cycle count provides assurance that the data-mined QOH is accurate.

For parts with significant unit cost variances:

  • Recalculate data-mined blended costs using transaction history. As with testing data-mined QOH, the wide variation in transaction codes among ERP systems means trial and error will be required at this stage.
  • Once the data-mined blended FIFO unit cost can be verified, select a sample of parts for which the receipts used to populate the FIFO layers are agreed to source documents (e.g., invoices or purchase orders). Confirmation of FIFO unit cost to source documents provides assurance on the data-mined unit costs.
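The blended FIFO unit cost referenced above is simply the quantity-weighted average cost across the rebuilt layers. A minimal sketch, assuming each layer is a `(date, quantity, unit_cost)` tuple as produced when repopulating FIFO layers from receipts:

```python
def blended_fifo_cost(layers):
    """Quantity-weighted average unit cost across the rebuilt FIFO layers."""
    total_qty = sum(qty for _, qty, _ in layers)
    if total_qty == 0:
        return 0.0
    return sum(qty * cost for _, qty, cost in layers) / total_qty

layers = [("2023-03-01", 50, 2.20), ("2023-01-05", 40, 2.00)]
# (50 * 2.20 + 40 * 2.00) / 90 = 190 / 90 ≈ 2.1111 per unit
```

Agreeing the individual layer costs (2.20 and 2.00 here) to invoices or purchase orders then ties the blended figure back to source documents.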

4. Correct general ledger:

  • Corrections to QOH: errors identified in QOH can be corrected with transactions that adjust QOH in or out.  These transactions operate in the same fashion as adjustments recorded during cycle or full inventory counts.
  • Corrections to unit cost:

Because a large share of the variances identified in the data-mining project are likely to fall below the scope of materiality, and most companies are unlikely to devote the resources needed to correct each part individually, record a journal entry that posts a loss (or gain) to income and a reduction (or increase) to inventory on the balance sheet.

For parts with significant variances, a company may choose to correct the unit cost by adjusting out all QOH and adjusting in QOH at either the blended FIFO cost (not text-book GAAP, but expedient) or at unit costs per receipts for all receipts included in the FIFO layers.
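The aggregate correcting entry described above can be sketched as follows. The account names and amounts are illustrative only; actual account titles will follow the company's chart of accounts.

```python
def valuation_adjustment(gl_value, validated_value):
    """Aggregate journal entry: adjust inventory to the validated value,
    booking the difference to income as a loss (or gain)."""
    diff = gl_value - validated_value
    if diff > 0:  # inventory overstated: book a loss, write inventory down
        return {"debit": ("Inventory valuation loss", diff),
                "credit": ("Inventory", diff)}
    # inventory understated: write inventory up, book a gain
    return {"debit": ("Inventory", -diff),
            "credit": ("Inventory valuation gain", -diff)}

entry = valuation_adjustment(gl_value=1_000_000, validated_value=930_000)
# inventory overstated by 70,000: debit the loss, credit inventory
```

Debits and credits are equal by construction, so the entry is always balanced regardless of the direction of the variance.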

This data-mining project is within reach for most organizations. Programmers can create the logic within an hour.  Tweaking the transaction codes could take a few attempts.  Once the program has been properly vetted, the program can provide credible results within 15 minutes, placing the task within reach as a routine procedure for even the busiest finance groups.

The data-mining audit of inventory valuation should be performed before and after every full physical inventory count, at year-end, and throughout the year as an internal audit procedure.  The effort of creating a data-mining program is well worth it.  With a data-mining program like this, there is no reason lenders and board directors should be denied assurance that a significant portion of most organizations' assets is properly valued.

Sheila Moran CPA, CFE is CFO at Professional Power Products and serves on the faculty of the Association of Certified Fraud Examiners. She is the co-author, along with Ron Kral, of The Board Director and Audit Committee Members Guide to Fiduciary Responsibilities.
