The Obsolescence of Cost Accounting Systems, Part 3
III. The Present-Day Crisis and Flawed Assumptions
A. The Flawed Assumptions Driving Today's Irrelevance
Information Deficiency
Today's management accounting information, driven by the financial reporting cycle, is too late, too aggregated, and too distorted to be relevant for managers' planning and control decisions [Johnson and Kaplan].
Managers now commonly rely on financial numbers alone [Johnson and Kaplan].
The Problem with Today’s Numbers: Too Late, Too Fuzzy, and Too Wrong
In the 19th and early 20th centuries, business managers created accounting systems specifically to help them run their factories and railroads efficiently. These internal systems provided quick, detailed, and relevant information. However, the sources explain that this changed after 1925 when external financial rules (like GAAP) took over.
Today’s management accounting reports are now deeply flawed for internal decision-making, as summarized by the statement:
Today's management accounting information, driven by the financial reporting cycle, is too late, too aggregated, and too distorted to be relevant for managers' planning and control decisions.
Too Late (Not Timely)
The fundamental issue here is timing. Management accounting systems are structured around the rhythm of financial reporting, which produces data monthly or quarterly.
Example: To produce a complete monthly or quarterly profit and loss (P&L) statement, accountants must wait for the period to close, calculate all the costs, and record all the necessary accruals. Even with fast closing procedures, this process means the final cost information for a given period appears well into the next period (for example, a monthly report appears mid-way through the following month).
This delay makes the information useless for immediate control. Production problems—involving labor, material, or machine use—happen daily, even hourly. If a manager needs to know immediately why a production line is inefficient, waiting three weeks for the official monthly variance report is too late to fix the problem.
Consequently, clever production managers often bypass the official accounting system entirely and create their own unofficial, real-time tracking systems—often just "back-of-an-envelope" data or simple spreadsheets on personal computers—to monitor performance promptly. The official system therefore fails the timeliness criterion.
Too Aggregated (Lacking Detail)
The term "aggregated" means the information is clumped together into large totals, lacking the specific detail needed to pinpoint a problem.
When reports are designed for senior management or external auditors, they focus on broad summary figures, like the total cost of goods sold or the divisional Return on Investment (ROI). This is because the overall financial statements do not require highly specific details about every single operation.
Example: The monthly profit and loss (P&L) statement often produces total cost variances for a whole division or a large cost center. However, these aggregated variances are often too broad to identify the source of a production problem. For instance, a manager might see an "unfavorable variance" for labor usage in their monthly report, but that single number doesn't tell them which machine, which product, or which day the inefficiency occurred.
The information is produced at too high a level to help with short-term, operational control.
Too Distorted (Factually Misleading)
The most severe failing is that the numbers are often distorted and misleading. This happens because the system is designed to satisfy external financial rules, not operational reality.
To value inventory for financial reports, fixed indirect costs (overhead, such as rent, salaries, and depreciation) must be "attached" to products. Since auditors typically accept simple, arbitrary methods for this allocation, most companies use measures like direct labor hours or dollars to distribute massive overhead costs across all products.
Example: A modern, automated factory has very low direct labor content (perhaps 10% of total product cost or less) but high overhead costs from expensive machinery and engineering support. If the accounting system allocates 80% of the overhead based on direct labor hours, the following occurs:
Simple Products: Products that require a lot of hands-on labor time will be assigned an unfairly large share of the overhead cost, making them appear too expensive and perhaps unprofitable.
Complex Products: Products that use high-tech machinery and engineering (high overhead demand) but little direct labor time will appear too cheap and highly profitable.
This systematic distortion (called cross-subsidy) means the "product costs" used internally are often wrong, leading managers to make misguided decisions on pricing, product sourcing, and product mix.
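The distortion described above can be sketched numerically. In this minimal example all figures (the overhead pool, the hours, the volumes) are hypothetical; it only illustrates how labor-based allocation overcosts labor-intensive products and undercosts machine-intensive ones:

```python
# Hypothetical two-product factory; all numbers are illustrative assumptions,
# not figures from Johnson and Kaplan.
total_overhead = 900_000.0  # annual overhead (machinery, engineering support)

products = {
    # name: (direct_labor_hours, machine_hours, units_produced)
    "simple":  (8_000.0, 1_000.0, 10_000),
    "complex": (2_000.0, 8_000.0, 10_000),
}

total_dlh = sum(p[0] for p in products.values())
total_mh = sum(p[1] for p in products.values())

def overhead_per_unit(name, basis):
    """Allocate the overhead pool by direct labor hours or by machine hours."""
    dlh, mh, units = products[name]
    share = dlh / total_dlh if basis == "labor" else mh / total_mh
    return total_overhead * share / units

for name in products:
    print(f"{name}: labor-based ${overhead_per_unit(name, 'labor'):.2f}/unit, "
          f"machine-based ${overhead_per_unit(name, 'machine'):.2f}/unit")
```

Under labor-based allocation the hands-on "simple" product absorbs the lion's share of overhead, while the machine-intensive "complex" product looks cheap; switching the allocation base to machine hours reverses the picture, which is exactly the cross-subsidy the text describes.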
The Consequence: Managers "Manage by the Numbers"
The result of this historical shift and resulting flawed information is stated clearly:
Managers now commonly rely on financial numbers alone.
Since the system designed for external reporting is the only source of standardized, comprehensive cost information, managers often rely on it for strategic decisions, despite its irrelevance.
This reliance leads to several negative outcomes:
Short-Term Focus: Managers’ horizons shrink to the short-term cycle of the monthly or quarterly financial statement. They may reduce valuable long-term investments (like R&D, maintenance, or training) to meet immediate profit targets and boost reported short-term earnings, compromising the firm's long-term health.
Dysfunctional Behavior: Executives who lack knowledge of their firm’s technology increasingly make decisions based solely on the projected impact on financial measures, such as earnings per share or ROI. They become isolated from the real value-creating operations of the organization.
Failure to Adapt: The academic community and management have perpetuated the idea that cost accounting's chief activity is valuing inventory for financial reports, ensuring that the use of flawed financial data for managerial decisions continues despite its irrelevance.
In essence, the system intended to measure efficiency became a reporting mechanism for external compliance, leading to a situation where management's "visible hand" attempts to guide the complex organization while blindfolded by irrelevant, distorted data.
Analogy: Imagine navigating a modern warship using a weather report from two weeks ago (too late), which only gives the average temperature of the entire ocean (too aggregated), and reports a favorable wind direction when you are actually sailing into a massive headwind (too distorted). The captain, relying solely on that information, will be making tactical decisions that guarantee failure, even though he is following the "official" rules of navigation. This is the challenge faced by managers using today’s outdated management accounting systems.
The Obsolete Cost Structure Assumption
Current systems rely on assumptions made 60 to 100 years ago that are undermined by modern business reality [Johnson and Kaplan].
Direct Labor is Fixed, Not Variable: Cost systems were designed when labor was the most costly input. Today, direct labor is a decreasing fraction of total product costs (often 10% or less) and is often more properly thought of as a fixed rather than a variable cost [Johnson and Kaplan].
Overhead Distortion: Overhead costs are now a much higher fraction of total costs, but they are still allocated based on the products' direct labor content (the smallest cost category) [Johnson and Kaplan]. This practice ignores the fact that many overhead expenses vary with activities other than direct labor [Johnson and Kaplan].
The Outdated Foundation of Cost Accounting
The internal accounting systems used by most companies today rely on assumptions made decades ago that simply no longer reflect modern business reality.
Current systems rely on assumptions made 60 to 100 years ago that are undermined by modern business reality [Johnson and Kaplan].
By 1925, almost every management accounting procedure used today, such as budgeting and standard costing, had been developed. These procedures were excellent for the manufacturing environment of that time, which was characterized by mass production of standardized goods and relatively high labor content.
However, the rapid changes in technology, global competition, and product complexity that followed World War II have effectively made the intellectual foundation of these systems obsolete. The rules that managers follow today were designed for factories where the cost structure and operational complexity looked very different from those of a modern automated plant.
The main problem is that the official accounting system is still trying to measure and control costs based on the economic reality of the 1920s, leading to product cost calculations that are too late, too aggregated, and too distorted for contemporary managers to use for planning and control.
The Changing Nature of Labor Costs
The first major assumption that has been undermined relates to the cost of human labor:
Direct Labor is Fixed, Not Variable: Cost systems were designed when labor was the most costly input. Today, direct labor is a decreasing fraction of total product costs (often 10% or less) and is often more properly thought of as a fixed rather than a variable cost [Johnson and Kaplan, 262, 276, 509].
Historically, management accounting systems were built around the idea that direct labor (the wages paid to workers directly involved in making a product) was the largest and most important cost, and it was primarily a variable cost. If the factory produced less, the company could cut labor hours or reduce its workforce, and those costs would go down directly.
Today, the situation has reversed:
Labor’s Share Has Shrunk: Due to automation and computer-integrated manufacturing (CIM), direct labor is now a decreasing fraction of total product costs. For many manufacturers, direct labor is 10% or less of the total product cost.
Labor is now "Fixed": A lot of the remaining direct labor is now considered a fixed cost. Management is often reluctant to lay off skilled workers (due to the difficulty and cost of finding replacements) except under dire circumstances. Therefore, direct labor wages often remain constant even when production volume temporarily drops, making them a fixed expense in the short run.
Because the official cost system was designed to meticulously track and control the flow of this variable labor cost—a small and increasingly fixed expense—it directs management's attention to the wrong area.
The Overhead Distortion Nightmare
While labor costs have shrunk, the cost category known as "overhead" has exploded, yet the methods for tracking it remain stuck in the past.
Overhead Distortion: Overhead costs are now a much higher fraction of total costs, but they are still allocated based on the products' direct labor content (the smallest cost category) [Johnson and Kaplan]. This practice ignores the fact that many overhead expenses vary with activities other than direct labor [Johnson and Kaplan].
The Allocation Flaw
Overhead includes all the indirect costs of running a factory, such as utilities, machine maintenance, engineering, and supervisory salaries.
Overhead is Massive: Overhead costs are now a much higher fraction of total costs than they were when the systems were designed.
Allocation is Arbitrary: Despite overhead being the largest cost category, it is often still allocated (spread) across products using the one measure that is readily available in the outdated system: the products' direct labor content.
Example of Distortion: In one company study, the overhead costs distributed to products represented 60% of the product's total cost, yet this enormous expense was allocated based on direct labor cost, which was only 11% of the total cost. Since the largest cost is being distributed by the smallest cost, the resulting calculation of the individual product's cost is likely wrong—so wrong that the sources suggest it is unlikely that even the first digit of the reported cost is correct.
This practice causes severe systematic bias and distortion in product costs, often leading to enormous cross-subsidies across product lines. This means cheap products appear expensive, and truly expensive products appear cheap.
Ignoring the Real Cost Drivers
The core flaw is that this simple allocation method ignores the fact that many overhead expenses vary with activities other than direct labor. This means the system fails to identify the real drivers of cost.
For instance: The cost of quality control, engineering, and machine setup does not change based on how many hours a direct laborer spends on a product. These costs are driven by the number of times a machine is set up (set-up hours), the complexity of the product design (number of components), or the time the machine runs (machine hours).
By forcing a cost system to allocate massive, complex overhead based on the shrinking, simple measure of direct labor, managers receive distorted data. This irrelevant cost information, designed purely to satisfy external financial reporting requirements, is then often misused for strategic decisions, such as pricing, leading to misguided choices that harm the firm's overall profitability.
Analogy: Relying on current management accounting systems to manage a modern factory is like trying to navigate a supersonic jet using a road map drawn for a horse-drawn carriage. The map's details are too slow (late), too generalized (aggregated), and focus intensely on the horse's feed costs (direct labor), while completely miscalculating the fuel consumption for the jet's engine (overhead), leading the pilot to wildly inaccurate destinations (misguided strategic decisions).
The "Costs Attach" Nightmare
Many companies use a simplistic system where material, labor, and overhead costs at each stage are combined into a single cost transferred to the next stage [Johnson and Kaplan].
This system, which generates a fully absorbed product cost, destroys all semblance of cost structure [Johnson and Kaplan].
It becomes impossible to estimate direct costs or make even a crude separation into fixed or variable costs [Johnson and Kaplan].
The Black Hole of Accounting: Losing Track of What Things Really Cost
This section describes a simplistic method of tracking costs that, while easy for bookkeeping, makes it impossible for managers to understand what is happening inside their company.
This practice is the culmination of decades where internal accounting systems were surrendered to the rules of external financial reporting, leading to the use of systems that are "too late, too aggregated, and too distorted" for managers' needs.
1. The Simplistic System of Cost "Attaching"
The first point addresses how this misleading system works:
Many companies use a simplistic system where material, labor, and overhead costs at each stage are combined into a single cost transferred to the next stage [Johnson and Kaplan, 201].
In manufacturing, products often move through multiple stages (e.g., from raw metal to component part, then to subassembly, and finally to a finished unit). When a product moves from one stage to the next, the costs incurred so far must be recorded and transferred.
This simplistic system relies on the idea that cost dollars "attach like barnacles to the physical flow of materials". At each stage of production, accountants combine three main cost categories and transfer the total to the next step:
Raw Material Cost (the cost of the initial input, like steel)
Direct Labor Cost (the wages paid to workers directly assembling the product)
Overhead Cost (the arbitrary allocation of indirect costs, like factory electricity or maintenance).
In this simplistic system, once these three costs are calculated for a stage, they are combined into a single total cost. This total then becomes the "material input cost" for the next stage of production.
Example: Imagine a valve being made in a factory.
Stage 1 (Drill, Face, Tap): The cost of the purchased raw part (£1.20) plus the cost of labor (£0.04) and overhead (£0.24) is calculated, totaling £1.48.
Stage 2 (Degrease, Remove Burrs): The cost of Stage 1 (£1.48) is treated as the "material cost" for Stage 2. The new labor and overhead costs for Stage 2 are added to this consolidated figure.
Final Assembly: When the valve reaches the final step, its "material cost" line includes all the previous stages' labor and overhead costs that have been rolled up.
This procedure, called full-absorption accounting or "cost attaching," ensures that all factory costs are eventually "attached" to the products and accounted for in the financial reports.
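The roll-up can be sketched in a few lines. The stage-1 figures (£1.20 raw part, £0.04 labor, £0.24 overhead) come from the valve example in the text; the stage-2 labor and overhead amounts are assumed for illustration:

```python
# Sketch of the "costs attach" roll-up from the valve example.
# Stage-1 figures are from the text; stage-2 figures are hypothetical.
def roll_forward(material_in, labor, overhead):
    """Combine the three categories into one total, which becomes the
    'material' input cost of the next stage -- structure is lost here."""
    return material_in + labor + overhead

stage1 = roll_forward(1.20, 0.04, 0.24)    # drill, face, tap -> £1.48
stage2 = roll_forward(stage1, 0.03, 0.18)  # degrease, remove burrs (assumed)

print(f"Stage 2 'material' cost: £{stage2:.2f}")
# The £1.48 entering stage 2 as 'material' silently contains £0.28 of
# stage-1 labor and overhead; no field records the true raw-material share.
```

Nothing in the transferred total distinguishes the original £1.20 of purchased material from the rolled-up labor and overhead, which is why the next section calls this the destruction of cost structure.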
2. The Destruction of Cost Structure
While this system satisfies external auditors by producing a fully absorbed product cost for inventory valuation, it immediately makes the information useless for managers:
This system, which generates a fully absorbed product cost, destroys all semblance of cost structure [Johnson and Kaplan, 201].
"Cost structure" refers to the breakdown of expenses into their foundational categories (material, labor, fixed overhead, variable overhead). Knowing the structure allows a manager to understand why a product costs what it does and how that cost might change.
When the system combines labor and overhead from previous stages into the "material" line of the next stage, that foundational structure is destroyed. The final cost figure is merely an aggregated total, hiding the true nature of the costs within.
Example of Lost Structure: If the final product sheet shows a "material cost" of £4.38 and total cost of £5.18, a manager cannot look at the £4.38 and know how much of that figure is actually:
Raw material (the true variable cost).
Labor from three steps ago (a semi-fixed cost).
Allocated plant rent (a fixed overhead cost).
The system, though "elegantly simple" for accountants performing inventory costing, obscures the real consumption of resources.
3. The Inability to Separate Costs for Decision-Making
The final consequence of this destruction of structure is that managers cannot obtain the specific, relevant cost data they need to make decisions:
It becomes impossible to estimate direct costs or make even a crude separation into fixed or variable costs [Johnson and Kaplan, 202, 168].
For good strategic decisions—like pricing, accepting a special order, or determining product mix—managers need to know two things about their costs:
Direct Costs (or Prime Costs): The costs directly traceable to a specific product (usually material and direct labor). If the "material cost" line contains rolled-up labor and overhead from previous stages, it is impossible to calculate this basic figure.
Fixed vs. Variable Costs:
Variable Costs change directly with production volume (e.g., raw materials). Managers must know these to calculate the true contribution margin on a product.
Fixed Costs do not change in the short run (e.g., factory building rent or salaries).
Because the "material" line of the final product includes costs that should be fixed (overhead from previous stages) alongside costs that are truly variable (raw material), the manager cannot make even a rough split between fixed and variable costs using the official accounting report.
If the manager cannot separate variable costs from fixed costs, they cannot correctly assess the true contribution margin of a product, leading to potentially "misguided decisions on product pricing, product sourcing, product mix, and responses to rival products". The resulting fully absorbed unit cost figure is, at best, useless for guiding the operational decisions of the company and is more likely misleading.
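A hypothetical decomposition of the £4.38 "material" line shows why this matters for decisions. The £4.38 figure is from the text; the split (£1.20 of true raw material) and the £5.00 selling price are assumptions for illustration:

```python
# Hypothetical decomposition of the £4.38 'material' line from the text.
# The £1.20 raw-material share and the £5.00 price are assumed figures.
selling_price = 5.00
reported_material = 4.38    # as shown on the final cost sheet
true_raw_material = 1.20    # actual purchased raw part (assumed)
# The remaining £3.18 is prior-stage labor and overhead hidden in 'material'.

# Margin as the official report implies:
reported_margin = selling_price - reported_material
# True contribution margin over the genuinely variable cost:
true_margin = selling_price - true_raw_material

print(f"reported: £{reported_margin:.2f}, true contribution: £{true_margin:.2f}")
```

A manager reading the official sheet would see a razor-thin margin and might drop or reprice the product, even though every unit sold actually contributes several pounds toward fixed costs and profit.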
In summary, this simplistic cost attaching system ensures compliance with external reporting but turns the internal cost ledger into a "black hole." All complexity is hidden, and the resulting numbers are unable to serve management purposes, forcing managers either to bypass the system or to "manage by the numbers" based on highly irrelevant data.
Historical Costs vs. Future Decisions
Financial accounting is designed to record historical events and mathematically assign monetary value to past activity [Smith (1999)].
The concepts of Historical Cost and the Matching Principle require allocating sunk costs (fixed overhead, such as plant and equipment depreciation) over future periods to products [Smith (1999)].
These sunk costs are unrecoverable and cannot be changed by day-to-day decisions [Smith (1999)]. Using such allocated costs to evaluate current performance or make decisions is illogical [Smith (1999)].
This explanation focuses on how financial accounting, unlike management accounting, is designed to look backward and why using its results for forward-looking decisions is illogical and often detrimental to a business.
The Backward-Looking Nature of Financial Records
Financial accounting, which results in public reports like the Balance Sheet and Income Statement, has a very specific purpose. It is focused on compliance, consistency, and recording what has already occurred.
Financial Accounting Records the Past
The fundamental nature of financial accounting is backward-looking:
Financial accounting is designed to record historical events and mathematically assign monetary value to past activity [Smith (1999), 433].
Financial reports are intended to present a record of the past in a consistent and fair format [Smith (1999), 433]. The theory behind financial accounting is valid for the purpose of reporting past activities [Smith (1999), 433]. It tracks the financial history of the firm by mathematically assigning monetary value to transactions that have already taken place [Smith (1999), 433, 434].
The purpose of financial accounting is to provide consistent and uniform information to stakeholders outside the company, such as stockholders, regulatory agencies, and lenders [Smith (1999), 435]. The rules governing financial accounting are known as Generally Accepted Accounting Principles (GAAP) [Smith (1999), 364, 425].
The Role of Historical Cost and Matching
Two core concepts ensure that financial reports accurately reflect past spending, but they are exactly what causes problems for managers trying to make current decisions:
The concepts of Historical Cost and the Matching Principle require allocating sunk costs (fixed overhead, such as plant and equipment depreciation) over future periods to products [Smith (1999), 435].
Historical Cost: When a company buys a major asset, such as a machine or a factory building, traditional accounting requires that the asset be recorded at its original purchase price (historical cost) [Smith (1999), 436].
Matching Principle: This principle requires that expenses (costs) be recognized in the same time period as the revenues they helped generate [Smith (1999), 435, 446].
To combine these concepts, accountants cannot expense the full cost of a machine the moment it is purchased. Instead, Historical Cost dictates the purchase price, and the Matching Principle requires that this cost be spread out over the estimated life of the asset and allocated to the products made during that time [Smith (1999), 435, 436]. This spreading of the asset’s cost over future periods is known as depreciation [Smith (1999), 436].
This process results in the complex allocation of fixed overhead (indirect costs) to every product [Smith (1999), 436]. These fixed overhead resources can include plant and equipment, maintenance staff, tooling, and plant supervision—all of which are long-term investments needed to support the business infrastructure [Smith (1999), 436].
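A minimal sketch of how Historical Cost and the Matching Principle turn a one-time purchase into a per-unit charge, using straight-line depreciation; all figures are hypothetical:

```python
# Straight-line depreciation sketch: a machine's historical cost is spread
# over its estimated life and attached to the units produced.
# All figures are hypothetical.
purchase_price = 100_000.0   # historical cost of the machine
useful_life_years = 10
units_per_year = 5_000

annual_depreciation = purchase_price / useful_life_years
depreciation_per_unit = annual_depreciation / units_per_year

print(f"£{annual_depreciation:.2f}/year -> £{depreciation_per_unit:.2f}/unit")
# Each unit now carries £2.00 of a cost that was sunk at purchase time and
# that no production decision today can change.
```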
The Illogical Use of Sunk Costs
The sophisticated systems created to accurately match historical costs to revenues for external reports produce figures that are inappropriate and often detrimental when used for internal operational decision-making.
Sunk Costs Cannot Be Changed
The costs being allocated in this process are, by definition, sunk costs:
These sunk costs are unrecoverable and cannot be changed by day-to-day decisions [Smith (1999), 436, 449].
A sunk cost is an expenditure already incurred that cannot be escaped, even by going out of business [Johnson and Kaplan, 154, 192]. Once a company buys a machine or pays last month's rent, that money is gone, and no daily action can bring it back [Smith (1999), 449].
For example, the cost of last month’s computing hardware, software, and staff salaries are unrecoverable and cannot be changed by today's operational choices [Smith (1999), 449]. Whether a department uses internal computing services extensively or not at all, the historical investment has been made and is gone [Smith (1999), 449].
The Illogical Practice
This leads directly to the core problem:
Using such allocated costs to evaluate current performance or make decisions is illogical [Smith (1999), 449].
The cost accounting rules force companies to allocate these fixed (sunk) costs to every unit of product [Smith (1999), 436]. However, this total product cost figure, known as fully absorbed product cost, is not useful for a manager trying to determine the true cost of making a decision today [Smith (1999), 369].
Example: If a manager is trying to decide whether to accept a special order that will not affect plant capacity, the relevant cost is the incremental cash outflow (like the cost of raw materials and variable labor) [Smith (1999), 421].
Financial Accounting View: The full-absorption cost system includes a slice of fixed overhead (e.g., depreciation on the machine, or a portion of the factory rent) in the unit cost of that product. If the manager uses this number, the product looks too expensive, and they might reject the order [Smith (1999), 449].
Logical Managerial View: Since the fixed costs (the sunk costs) have already been paid and will not increase because of the order, those costs are irrelevant to the decision [Smith (1999), 449]. Rejecting the order based on the allocated fixed cost is illogical because the company foregoes the opportunity to earn money above the true variable cost [Smith (1999), 398, 449].
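The special-order logic can be sketched as a comparison of the two views; all figures here are hypothetical:

```python
# Special-order sketch: all figures are hypothetical. The plant has spare
# capacity, so fixed (sunk) overhead will not change if the order is accepted.
full_absorbed_unit_cost = 10.00  # includes £4.00 of allocated fixed overhead
variable_unit_cost = 6.00        # materials and other truly variable costs
offer_price = 8.00
order_units = 1_000

# Full-absorption view: price < full cost, so the order looks like a loss.
full_cost_view = (offer_price - full_absorbed_unit_cost) * order_units

# Relevant-cost view: only the incremental cash costs matter.
incremental_view = (offer_price - variable_unit_cost) * order_units

print(f"full-cost 'loss': £{full_cost_view:.0f}; "
      f"actual cash gain: £{incremental_view:.0f}")
```

The allocated fixed overhead flips the sign of the decision: the official cost sheet says reject, while the incremental cash flows say the order adds money above true variable cost.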
In summary, the sophisticated financial accounting system, by allocating unrecoverable sunk costs to products, fails to distinguish between costs that can be changed by a current decision (relevant costs) and those that cannot (irrelevant costs) [Johnson and Kaplan, 154, 192]. Using this backward-looking data for forward-looking decisions often results in poor decisions because it ignores the actual incremental costs and the constraints of the business [Smith (1999), 418, 433].
Analogy: The financial accounting system acts like a historian. It is excellent at recording every pound and penny ever spent on building a ship (historical cost) and tracking the spread of that cost over the ship's lifetime (matching principle). But when the captain needs to decide whether to take on an extra container of freight right now, the historian's report about the ship's original construction cost is useless. The captain only needs to know the cost of fuel and labor for the specific trip (relevant costs), not the sunk cost of the entire ship itself. Relying on the total historical cost to make the daily freight decision is illogical.
B. Why Cost Accounting Cannot Be Trusted for Decisions
Ignoring the Constraint (The Limiting Factor)
The conventional process of relevant costing or selecting relevant information is insufficient unless the relevant costs or data are considered in light of the constraining resource [Smith (1999); Noreen, Smith, and Mackey (1995)].
Failing to recognize the effect of the limited resource guarantees that companies will often choose alternatives that are less than optimal or even detrimental [Smith (1999)].
The Theory of Constraints' principle of profit maximization is centered on the contribution margin per unit of the scarce resource [Noreen, Smith, and Mackey (1995)]. Traditional cost systems completely ignore this [Smith (1999)].
The Critical Missing Piece: Why Good Decisions Depend on Scarcity
Management accounting exists to help managers make sound decisions by focusing on information that is relevant to the choice at hand, rather than just historical reports [Smith (1999), 298, 333]. "Relevant information" is defined as the predicted future costs and revenues that will actually differ among alternative courses of action [Smith (1999), 234, 300].
However, identifying these relevant costs is not enough if a company ignores the fundamental physical reality of its business operations—that every system has a limit.
1. The Insufficiency of Conventional Relevant Costing
The conventional process of relevant costing or selecting relevant information is insufficient unless the relevant costs or data are considered in light of the constraining resource [Smith (1999), 297; Noreen, Smith, and Mackey (1995), 297].
A "constraining resource" (or scarce resource) is anything that limits the entire system from achieving its objective, such as maximizing profit [Noreen, Smith, and Mackey (1995), 509; Smith (1999), 316]. If a company did not have a constraint, its output would theoretically be unlimited [Smith (1999), 316].
Management accounting textbooks routinely teach that when a limiting factor exists, the basic assumptions underlying costs and revenues of a potential action change [Smith (1999), 234, 300]. This is because the capacity of that specific resource determines the potential profit the entire company can generate [Smith (1999), 399].
For example, a traditional manager reviewing a product line might see that Product A has a high gross margin per unit sold, while Product B has a low margin. Based on traditional relevant costing (focusing on marginal revenue vs. marginal cost), the manager would likely push Product A [Smith (1999), 398]. However, this traditional view becomes insufficient if Product A uses three hours of time on the only specialized, expensive machine that runs the entire factory (the constraint), while Product B uses only ten minutes on that same machine. The constraint changes what information is truly relevant [Smith (1999), 297].
2. The Danger of Ignoring the Limited Resource
When companies fail to factor in this reality, the consequences can be severely negative:
Failing to recognize the effect of the limited resource guarantees that companies will often choose alternatives that are less than optimal or even detrimental [Smith (1999), 400].
When managers focus on minimizing local unit cost or maximizing gross margin per sales dollar—without regard for the constraint—they are inevitably driven toward decisions that harm the whole company [Smith (1999), 245, 303].
Example of Detrimental Choice: In one case, a company focused on manufacturing highly complex electronic panels (16- and 20-layer boards) because they believed these high-end, low-volume products had higher gross margins [Smith (1999), 402]. They based this decision on traditional full-absorption costing or even Activity-Based Costing (ABC) [Smith (1999), 401].
However, the inner-layer production machine was the plant's capacity constraint [Smith (1999), 402]. The complex 20-layer boards were found to consume substantially more time on this crucial constraint machine than simpler 4-layer boards [Smith (1999), 403]. Worse, the 20-layer boards had a high quality failure rate (over 40%) [Smith (1999), 402]. By prioritizing the seemingly high-margin complex boards, the company was intentionally starving the constraint machine of its valuable capacity, thereby limiting the total throughput (sales) and net profit the entire organization could achieve [Smith (1999), 402, 403]. Their actions were less than optimal and drove the company toward declining profit [Smith (1999), 402].
3. The Theory of Constraints: Focusing on Leverage
The Theory of Constraints (TOC) provides the framework for making optimal choices by ensuring that the constraint is always the center of strategic evaluation:
The Theory of Constraints' principle of profit maximization is centered on the contribution margin per unit of the scarce resource [Noreen, Smith, and Mackey (1995), 471; Smith (1999), 233, 239]. Traditional cost systems completely ignore this [Smith (1999), 354, 399].
TOC argues that managing a complex environment is simplified if resources are focused around a few leverage points—the constraints [Smith (1999), 317, 421]. The process of continuous improvement is centered on five steps, including identifying the constraint and deciding how to exploit it (maximize its use) [Smith (1999), 238, 388].
To exploit the constraint, managers must calculate the contribution margin per unit of the scarce resource [Noreen, Smith, and Mackey (1995), 471]. This is the financial criterion for maximizing profits [Noreen, Smith, and Mackey (1995), 297].
How the Calculation Works:
Contribution Margin (Throughput): This is calculated as sales revenue minus truly variable costs (costs that change in a direct one-for-one relationship with volume, such as raw materials and sales commissions, but usually not direct labor, which is often a fixed cost) [Smith (1999), 239, 395].
Scarce Resource Unit: This is the time it takes the product to consume the capacity of the bottleneck resource (e.g., machine minutes, hours of specialized labor) [Noreen, Smith, and Mackey (1995), 472].
Prioritization: Products are prioritized based on maximizing the dollar contribution yielded for every minute the constraint is used [Noreen, Smith, and Mackey (1995), 471, 472].
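The three steps above can be sketched in a few lines of code. This is a minimal illustration, not the authors' method; the product names, prices, variable costs, and constraint times below are all invented for the example.

```python
# Hypothetical sketch of TOC product prioritization. All figures invented.

def contribution_per_constraint_minute(price, variable_cost, constraint_minutes):
    """Throughput (price minus truly variable cost) divided by the
    minutes the product consumes on the constraint resource."""
    return (price - variable_cost) / constraint_minutes

# (price, truly variable cost such as materials, minutes on constraint machine)
products = {
    "simple_panel":  (200.0,  80.0,  2.0),   # modest margin, light constraint use
    "complex_panel": (900.0, 400.0, 60.0),   # big unit margin, heavy constraint use
}

# Rank products by dollars yielded per minute of constraint time, best first.
ranked = sorted(
    products.items(),
    key=lambda kv: contribution_per_constraint_minute(*kv[1]),
    reverse=True,
)

for name, (price, vc, mins) in ranked:
    rate = contribution_per_constraint_minute(price, vc, mins)
    print(f"{name}: ${rate:.2f} per constraint minute")
# simple_panel yields $60.00/minute; complex_panel only $8.33/minute,
# even though its unit contribution margin ($500) is far larger.
```

Note how the ranking reverses the conclusion a gross-margin-per-unit comparison would give: the "rich" complex product is the worst use of the bottleneck's time.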
Example Revisited: When the electronic panel company calculated its product costs using the TOC formula, the results were dramatic [Smith (1999), 402]:
| Product Layer Count | Contribution per Constraint Minute (Yielded) |
| --- | --- |
| 4-Layer Panels | $77.93 |
| 20-Layer Panels | $3.73 |
The data clearly showed that the complex, 20-layer boards were the least valuable use of the constrained machine’s time, yielding only $3.73 per minute, compared to the simpler 4-layer boards, which yielded $77.93 per minute [Smith (1999), 403].
By relying on traditional cost systems—which allocate large amounts of fixed overhead and ignore the constraint—the company was making strategic decisions that were nearly opposite to what would maximize profit [Smith (1999), 252, 398]. The TOC approach provides the necessary framework to combine relevant cost information with operational reality to ensure optimal strategic choices [Smith (1999), 315].
Focus on Local Optimization
Cost accounting, by its nature, allows (and sometimes forces) a manager to think locally—a work center or a product in isolation—instead of globally [Noreen, Smith, and Mackey (1995)].
People will take actions to maximize their measures, and if their measure is focused on local optimization (like machine utilization or labor efficiency), it creates a conflict with the global goal of maximizing throughput [Smith (1999)].
Why Cost Accounting Encourages Local Thinking
The core problem with conventional management accounting systems today lies in their design. Historically, these systems evolved to meet the needs of financial reporting and often contain procedures that measure parts of the business in isolation, rather than focusing on the performance of the entire company.
Cost accounting, by its nature, allows (and sometimes forces) a manager to think locally—a work center or a product in isolation—instead of globally.
In short, "thinking cost allows... a manager to think locally—a work center in isolation, a product in isolation," while "thinking throughput forces a manager to think globally".
The Historical Root of Local Focus
The practice of thinking locally is deeply rooted in how cost systems developed. Historically, management accounting originated in the 19th century to help managers control the conversion of raw materials within their organizations. The initial focus was on measuring and monitoring the efficiency of internal processes—how well labor and materials were converted—not on measuring the overall "profit" of the entire enterprise.
For example, early American textile mills designed their systems to promote efficiency in converting raw materials into a single final product, concentrating on measures like conversion cost per pound. Engineers in metal-working firms also developed systems, such as Frederick W. Taylor's "scientific management," which focused narrowly on finding the "one best way" to use labor and material resources in specific tasks. This analytic approach focused on the "infinitely small" details of specific departments or processes.
Although some people argued for a "synthetic" approach to ensure that efficient parts added up to a profitable whole, the traditional emphasis remained on the efficiency of local processes.
The Modern Mechanism of Local Focus
Today, the local focus is reinforced by accounting rules that prioritize compliance with external financial reporting (absorption costing). When managers rely on the integrated, single cost system that was designed primarily to value inventory for outside auditors, this system typically allocates large fixed overhead costs based on single, local activities, such as direct labor hours.
This practice encourages managers to focus their cost-reduction efforts solely on the activities used as the allocation base, typically direct labor, even though it may be a very small fraction of total costs. This focus on reducing direct labor and increasing machine utilization is a classic example of the pervasive, ingrained "local cost world mentality".
Because of this system design, companies end up with reports that detail the "fully absorbed product cost" or labor efficiency for a single process, making it challenging to understand the financial consequences for the entire company.
The Conflict Between Local Efficiency and Global Throughput
The fundamental problem arises because the incentives built into these locally focused cost systems often directly contradict the objective of maximizing company profit, which is a global goal.
People will take actions to maximize their measures, and if their measure is focused on local optimization (like machine utilization or labor efficiency), it creates a conflict with the global goal of maximizing throughput.
The ultimate goal of a company, maximizing profit, is tied to three factors: increasing throughput (sales), decreasing operating expenses, and decreasing investment (inventory). Throughput is generally defined as sales revenue minus truly variable costs (like raw materials).
When a manager strives for local optimization—making their specific department look as efficient and low-cost as possible—they often take actions that are "detrimental" to the global goal of maximizing throughput for the whole system.
Example 1: Efficiency vs. On-Time Delivery
The conflict is immediate and chronic for managers on the shop floor. They are trapped between two opposing requirements:
To Ship On Time (Global Goal): The entire operation must run at the pace of the system's slowest resource (the constraint). This requires pacing raw material release and often means that non-constrained resources must slow down or stand idle, resulting in low labor efficiencies.
To Minimize Unit Cost (Local Goal): All production resources must operate at maximum efficiency to produce the least-cost unit.
A manager is forced to compromise between these two criteria. If they maximize output at non-constrained resources, they simply create excess work-in-process (WIP) inventory, increasing cycle time, tying up cash flow, and delaying the product—actions that directly decrease the system's global throughput. The conventional system thus creates incentives for the exact opposite of the behavior required to maximize global profit.
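The WIP dynamic described above can be shown with a toy simulation. This is a hedged sketch with invented rates, not data from the sources: an upstream work center that keeps itself "efficiently" busy at 10 units/hour feeds a constraint that can only process 6 units/hour.

```python
# Toy simulation: all rates are invented. Shipments are paced by the
# constraint, so releasing work faster than the constraint can process
# it only piles up work-in-process (WIP); it ships nothing extra.

def simulate(release_rate, constraint_rate, hours):
    wip = 0      # units queued in front of the constraint
    shipped = 0  # units through the constraint (throughput)
    for _ in range(hours):
        wip += release_rate               # upstream keeps itself busy
        done = min(wip, constraint_rate)  # constraint paces real output
        wip -= done
        shipped += done
    return shipped, wip

# "Efficient" upstream: 10 units/hour released against a 6 units/hour constraint
shipped_fast, wip_fast = simulate(release_rate=10, constraint_rate=6, hours=40)

# Release paced to the constraint's rate
shipped_paced, wip_paced = simulate(release_rate=6, constraint_rate=6, hours=40)

print(shipped_fast, wip_fast)    # same 240 units shipped, plus 160 units of WIP
print(shipped_paced, wip_paced)  # same 240 units shipped, zero excess WIP
```

Both policies ship exactly what the constraint allows; the only thing the high local efficiency buys is a growing pile of inventory, longer cycle times, and tied-up cash.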
Example 2: The Loom Factory and False Efficiency
This conflict is demonstrated by companies making suboptimal decisions to chase an efficiency measure:
A textile manufacturer had high-investment looms that were the operational constraint. An executive mandated cutting direct labor at the looms to increase "units per man-hour," a local efficiency measure.
The Flawed Result: Although the local measure of labor efficiency did increase, the total output (throughput) of the looms dropped, on-time delivery fell, and overtime increased in every other department.
The Illogical Decision: Because the vice president chose to maximize the output per unit of labor hour of the constraint, they were essentially trying to minimize their labor investment instead of maximizing the return on the plant's investment in the highly expensive looms. Maximizing least cost per unit of output is not the same as maximizing throughput dollars. The result was two months of dramatically deteriorating net profit because the decision focused on a local measure (labor input) rather than the global objective (maximizing the output of the constraint).
In summary, cost accounting systems focus managers on local measures that are often irrelevant for strategic decisions. Because these local measures, like efficiency or unit cost, reward actions that conflict with global flow, they result in departments being pitted against each other and a chronic state of conflict that ultimately reduces company profit and return on investment.
Measurement Distortion
The accounting figures generated are a mixture of current and historical costs, adjusted by estimates of asset useful life, making them unreliable for evaluating short-run performance [Smith (1999)].
Reported product costs are often given five or six significant digits, but the arbitrary allocation of overhead makes it likely that the first digit is wrong [Johnson and Kaplan].
Why Accounting Figures Are Unreliable for Short-Run Decisions
Management accounting is supposed to provide managers with information to help them fulfil organizational objectives [Smith (1999), 349]. However, the cost information generated by current systems is largely driven by financial accounting requirements, which are designed to record history rather than guide future action [Johnson and Kaplan, 7].
A Mixture of Past and Estimated Costs
The sources explain that the complexity and unreliability of current accounting figures stem from combining verifiable historical data with future guesses:
The accounting figures generated are a mixture of current and historical costs, adjusted by estimates of asset useful life, making them unreliable for evaluating short-run performance [Smith (1999), 401].
To produce financial reports, traditional cost accounting systems must record historical events and assign a monetary value to past activity [Smith (1999), 418]. This means when a company purchases a long-lived asset, such as a machine, its cost is recorded at the original price (historical cost) [Smith (1999), 421].
To satisfy the Matching Principle (which matches costs to the revenues they help generate), accountants must spread that historical cost over the asset’s assumed working life, a process called depreciation [Smith (1999), 421].
Example of the Mixture: A factory buys a machine for £100,000 and estimates it will last 10 years. Each year, £10,000 of the original, historical cost is recognized as an expense (depreciation). This depreciation is then classified as fixed overhead (sunk cost) and allocated to the products made that year [Smith (1999), 421].
The resulting product cost figure is therefore a mixture:
Current Costs: Things like the raw materials purchased this month.
Historical Costs Adjusted by Estimates: A piece of the machine's price from 10 years ago, allocated based on an estimate of how long the machine will be useful [Smith (1999), 401].
Why This Mix is Unreliable
This reliance on historical costs and estimates makes the product cost figure unreliable for evaluating short-run performance [Smith (1999), 401]. Managers need to know the true, incremental cost of an action to make a short-run decision (like accepting a special, urgent order) [Smith (1999), 333]. The allocated depreciation costs are sunk costs—money already spent that cannot be changed by day-to-day decisions—and are therefore irrelevant to current choices [Smith (1999), 422].
Using a figure contaminated by these irrelevant, historical allocations can lead to dysfunctional management decisions [Smith (1999), 401]. For instance, a manager might refuse a profitable short-run order because the reported unit cost (which includes the allocated sunk cost) makes the product appear too expensive [Smith (1999), 449].
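A small worked example makes the special-order trap concrete. Every figure here is invented for illustration; the point is only the mechanism: the reported "full" unit cost mixes a current variable cost with allocated depreciation, which is sunk either way.

```python
# Hypothetical special-order decision; all figures invented.

material_per_unit = 4.00        # truly variable cost, paid this month
allocated_depreciation = 3.50   # per-unit slice of a machine bought years ago
reported_unit_cost = material_per_unit + allocated_depreciation  # "full" cost: 7.50

offer_price = 6.00              # special order price per unit
units = 1_000

# Decision by reported cost: 6.00 < 7.50, so the order "loses money".
apparent_loss = (offer_price - reported_unit_cost) * units       # -1,500

# Decision by incremental cost: depreciation is sunk whether or not
# the order is accepted, so only the material cost is relevant.
incremental_profit = (offer_price - material_per_unit) * units   # +2,000

print(f"apparent result: {apparent_loss:+.2f}")       # -1500.00
print(f"actual cash result: {incremental_profit:+.2f}")  # +2000.00
```

The order that the full-cost figure says to refuse would actually add £2,000 of cash contribution, which is exactly the dysfunctional outcome described above.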
The Illusion of Precision and Arbitrary Allocation
The second critical issue is the false sense of accuracy given by these flawed numbers and the methods used to calculate them.
Reported product costs are often given five or six significant digits, but the arbitrary allocation of overhead makes it likely that the first digit is wrong [Johnson and Kaplan, 228, 240].
This means that accountants, using powerful computers, may report a product cost as, for example, £5.18134 (six significant digits) [Johnson and Kaplan, 227]. This precision suggests great reliability and truth, but the source materials argue that this perceived precision is misleading because the underlying calculations are fundamentally arbitrary [Johnson and Kaplan, 228, 240].
The Arbitrary Allocation of Overhead
The reason the precision is flawed lies in how indirect costs, or overhead, are assigned to products [Johnson and Kaplan, 228]. Overhead costs are now a much higher fraction of total costs than they were in the past [Johnson and Kaplan, 267].
In traditional systems, overhead is allocated (spread out) to products using simplistic and arbitrary measures, most commonly based on the amount of direct labor the product requires [Johnson and Kaplan, 8, 224, 227].
Example of Arbitrary Allocation and Distortion: In a typical cost system, overhead might represent 60% of the total cost attributed to a product, while direct labor represents only 11% [Johnson and Kaplan, 226, 227]. The entire 60% of overhead is allocated based on that small, 11% slice of direct labor [Johnson and Kaplan, 227].
The sources define this as arbitrary because it uses an allocation base (direct labor) that may have no actual relationship to the activity causing the overhead cost (e.g., machine maintenance, engineering support, or number of setups) [Johnson and Kaplan, 8, 233, 234].
Because the largest cost component (overhead) is distributed using the smallest and most convenient measure (direct labor), the resulting product cost calculation is systematically biased and distorted [Johnson and Kaplan, 8].
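The bias described above can be sketched numerically. These products, pools, and rates are invented for illustration: a large overhead pool is spread over labor hours, so a labor-light but overhead-hungry product looks cheap while a labor-heavy, overhead-light product looks expensive.

```python
# Sketch of labor-based overhead allocation; all figures invented.

overhead_pool = 600_000.0       # the dominant cost component
total_labor_hours = 10_000.0    # the small, convenient allocation base
rate_per_labor_hour = overhead_pool / total_labor_hours  # $60 per labor hour

def labor_based_cost(materials, labor_cost, labor_hours):
    """Reported unit cost: overhead follows labor hours, whatever
    actually drives the overhead."""
    return materials + labor_cost + labor_hours * rate_per_labor_hour

# Product X: little labor, but many setups and much engineering support
# Product Y: lots of labor, almost no real overhead consumption
x = labor_based_cost(materials=20.0, labor_cost=5.0,  labor_hours=0.5)  # 55.0
y = labor_based_cost(materials=20.0, labor_cost=30.0, labor_hours=3.0)  # 230.0

# Suppose setups and support actually cost $100/unit for X and $10/unit for Y:
true_x = 20.0 + 5.0 + 100.0   # 125.0 -- reported 55.0, badly understated
true_y = 20.0 + 30.0 + 10.0   # 60.0  -- reported 230.0, badly overstated

print(x, y, true_x, true_y)
```

The reported ranking (Y far dearer than X) is the opposite of the true one, which is how a system with six digits of precision can still get the first digit wrong.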
The First Digit is Wrong
The consequence of this arbitrary allocation is that the final cost number is highly unreliable:
Given the arbitrary allocation of overhead, it is unlikely that the first digit of the five [significant digits] is correct! [Johnson and Kaplan, 228].
If the true cost of making Product A is £8.00, but the accounting system reports £5.18134, the first digit is wrong. The precision (the figures after the decimal point) offers an illusion of accuracy that is entirely unsupported by the flawed allocation methodology [Johnson and Kaplan, 240]. This distorted, precise-looking number becomes the only available data on "product costs," creating a danger of misguided decisions on pricing, product sourcing, and product mix [Johnson and Kaplan, 8, 127].
In essence, the accounting system prioritizes producing a figure that satisfies external auditors by following formal rules—regardless of whether that figure accurately reflects the operational reality of the business [Johnson and Kaplan, 230].
Analogy: The current accounting system is like a student who is asked the time and replies, "I estimate the time is 3:42:15 PM, based on the shadow of the old sundial my grandfather made fifty years ago." The student has given you six significant digits of precision, mixing historical information with estimates, but because modern clocks are available and the sundial is poorly calibrated (arbitrary allocation), the first digit of the answer is probably wrong, making the entire number useless for scheduling your day.