By Anand Toprani*
(FPRI) — Few people would dispute that the United States is now embroiled in a “great power” competition, but this unanimity does not extend to the size of the defense budget. Proponents of greater spending argue that current expenditures remain inadequate to fight both a major war and a regional conflict while simultaneously modernizing U.S. forces. This is remarkable considering that the current budget, including Overseas Contingency Operations funding, is as large in real terms as it was during World War II, even though today’s Joint Force is only one-sixth the size. Advocates of greater spending also note that the defense share of national income has declined since the Korean War.
Supporters of increased budgets assume that each additional increment in defense spending will yield an equivalent increase in capability. Such a belief is no more tenable today than it was in the 1970s when reformers first sounded the alarm that lavish defense spending was, in fact, impairing U.S. capabilities. Their descendants have proposed to deliver superior defense at lower cost by appropriating a concept from public finance: austerity.
Austerity is the idea that cutting government spending encourages economic growth for two reasons. The first is the “Treasury View”: when governments spend beyond their revenues, they “crowd out” private investment by offering a higher rate of interest. The second is the assumption that when governments increase deficits, individuals and firms will restrict their investment or consumption because they expect higher taxes.
What does any of this have to do with defense spending? At first glance, the connection to austerity appears tenuous. Contemporary critics of high budgets draw largely from two intellectual traditions. The first derives from the studies of the “military-industrial complex” that emerged in the 1960s. These warned of the militarization of U.S. society, distorting effects on the economy, and immunity to oversight. The second comes from the works of dissident military officers and civilian analysts in the 1970s, notably the “Fighter Mafia.” They began by re-designing the F-15 and pushing development of the F-16 and A-10, then expanded their efforts to revamp the Pentagon’s procurement system and the services’ warfighting doctrines before running out of steam in the 1980s.
Contemporary jeremiads against the military-industrial complex focus on two aspects. The first is the Pentagon’s byzantine bureaucracy, which has come under renewed scrutiny for gargantuan accounting lapses. The second is the U.S. defense industry. Following the “Last Supper” of 1993, the defense industry underwent a remarkable consolidation with the number of major firms declining by 90% within five years. This consolidation counteracted much of the power that the Pentagon should have over contractors as a monopsonist (single buyer). In fact, the contractors act as “hidden monopolists” by buying up small manufacturers of components and overcharging the government.
The absence of competition, combined with the certainty of rich contracts, has had a corrosive effect on the defense industry that has spread across U.S. private industry. Many hoped that the F-35 would transform procurement by adopting best practices from the private sector. Whereas its designers estimated 20 years ago that each plane would cost $20 million (in nominal dollars), the current price tag for an F-35A is $90 million, and it costs $34,000 per flying hour compared to $22,000 for the F-16. The man-hours required to build each F-35A (over 40,000) are anywhere from double to triple those of comparable planes during World War II.
No one denies that the United States has witnessed a remarkable bloat in the cost of designing and producing weapons. Former Lockheed Chairman and CEO Norman Augustine once estimated that the cost of new military equipment increased four-fold every eleven years. In the private sector, improvements to new technologies drive costs down, but the opposite occurs in defense because “new technology opens vast new capability vistas which are then crammed into each new generation of a product.” This phenomenon led to Augustine’s quip that by “the year 2054, the entire defense budget will purchase just one tactical aircraft.”
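Augustine’s quip is just compound growth run to its logical end. The sketch below illustrates the arithmetic; the 1980 base year, $20 million starting unit cost, and flat $200 billion inflation-adjusted budget are illustrative assumptions, not figures from this article:

```python
# Illustrative sketch of Augustine's Law. The base year, starting unit
# cost, and (flat, inflation-adjusted) budget are assumptions chosen
# for illustration only.

def year_budget_buys_one_plane(base_year=1980,
                               base_cost=20e6,      # assumed unit cost
                               base_budget=200e9):  # assumed flat budget
    """Return the first year in which a single aircraft costs the entire
    budget, given a four-fold unit-cost increase every eleven years."""
    year = base_year
    while base_budget > base_cost * 4 ** ((year - base_year) / 11):
        year += 1
    return year

print(year_budget_buys_one_plane())  # under these assumptions: 2054
```

A four-fold increase every eleven years works out to roughly 13% annual growth in unit cost, so even modest real budget growth only delays the crossover rather than preventing it.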
He was not far off the mark. Since the 1950s, defense programs have run an average cost overrun of 46%. In a perverse form of Moore’s Law, the cost of today’s weapons is many multiples that of previous generations and has long outpaced the growth of the defense budget. The skyrocketing cost and complexity of each new generation of weapons systems have yielded a toxic brew of lower production and reliability, and are the most significant reason why readiness has declined steadily since the 1960s. Consider the Navy: its post-9/11 budget increase was the largest since the Korean War, yet today it has half as many ships as at the end of the Cold War.
One might counter that there is nothing special about the United States—weapons’ costs have exceeded the growth in national income since World War II in many developed countries. That said, the effects are more pernicious in the United States due to the belief that the U.S. military must have a qualitative advantage to compensate for any quantitative inferiority.
To counter this trend, defense “austerians” have put forward four arguments drawn implicitly from the case for austerity in public finance. The first is a variation of the “crowding out” effect: cutting defense will enable the United States to redirect its resources to other instruments of national power and abandon a costly, interventionist foreign policy. Even if the defense burden on the U.S. economy continues to shrink, there is still the problem of “opportunity costs”—which alternatives is the United States forsaking to continue purchasing gold-plated, unreliable aircraft? For roughly two-thirds the lifetime program cost of the F-35, the United States could, theoretically, provide clean drinking water for every human being, which would have incalculable humanitarian and national security benefits.
Second, Pentagon budgeteers argue that “tightening up a bit on the money” is the most effective way to stem the bloat in administrative overhead and program costs. The Pentagon bureaucracy, insiders conclude, has “built-in incentives” against controlling costs, since doing so might reduce future funding levels. Obviously, any cuts should be targeted rather than across-the-board—it would make little sense to trim funding to the Office of the Inspector General or Cost Assessment and Program Evaluation, since any cut there would be dwarfed by the lost savings those agencies identify.
Third, austerity will discourage the production of new weapons systems that rely on unproven technologies. This practice has had profoundly negative consequences for U.S. military readiness, and recent legislation has failed to halt it. Cost overruns forced the Navy to abandon production of the Littoral Combat Ship after 35 ships ($30 billion) and the Zumwalt-class destroyer after three ($23.5 billion). The Air Force cancelled the F-22 after a production run of 187 aircraft at a cost of $350 million per plane. Most recently, teething problems have indefinitely delayed deployment of the USS Gerald R. Ford (which cost almost three times more than the Nimitz-class carriers and four times more than the America-class amphibious-assault ships), and have increased the program cost of the F-35 by 90%, with only 11% of deployed fighters fully mission capable last year.
Defense austerians also claim that less complex systems will be easier to produce in bulk and use in combat. Fielding larger quantities of lower-tech weapons will not only reduce wear and tear in peacetime but will also prove vital in wartime against a peer opponent. As the new “workshop of the world,” China has outpaced the United States in the production of weapons, launching 32 warships in 2016-2017 compared to the United States’ 13. While some might argue that there is no reason to build a force large enough to wage a conventional conflict against a nuclear-armed China, it is equally plausible that a larger force would serve as an effective deterrent to war or keep any conflict limited.
Finally, austerians expect that less money will force the services to cooperate, embrace cost-effective technologies, and adopt strategies to concentrate resources where they are most needed. It was a dearth of resources, after all, that compelled the United States to adopt the recommendation of Admiral Harold Stark’s “Plan Dog” memorandum of November 1940 to concentrate forces in Europe rather than disperse them between Europe and the Pacific. Post-war budget cuts also encouraged U.S. leaders to embrace “creative” means of achieving national aims, such as the Marshall Plan.
The aims of austerity are laudable, but we should question its efficacy. One way to illuminate its potential shortcomings for defense is to examine what happened the first time the U.S. military had to sustain a global force posture in the midst of significant budget cuts in the late 1940s. The great challenge confronting defense planners after World War II was to reconcile the United States’ global commitments with the demands for retrenchment and demobilization. Defense expenditures dropped 90% in the three years following the war’s end. Unification of the War and Navy departments was supposed to help cushion the impact by eliminating duplication of effort. It had the opposite effect by encouraging inter-service squabbling. It also pitted civilian and military leaders against one another, serving as the kindling for the “Revolt of the Admirals,” when Navy officers fabricated evidence of corruption in an effort to discredit the Air Force, engineer the resignation of the Secretary of Defense, and thwart President Harry Truman’s policy of cost cutting.
Events such as the revolt are possible outcomes of any policy of austerity that tramples on service prerogatives. Three decades ago, one defense analyst concluded that the military services were the most powerful elements of the national security apparatus, and little has changed since. Not only do the services enjoy significant autonomy over their internal affairs, but they can also refight battles they have lost with the Pentagon or White House on Capitol Hill. If cuts must be made, then the services will insist on their right to make them. If past experience is any guide, besides preserving their relative budget shares, the services will save big-ticket items that align with their institutional preferences while neglecting less glamorous capabilities. Consider the fate of mine warfare within the Navy, which has never been properly resourced, and for which the United States is almost totally dependent on its allies. The services also tend to ignore options that have no constituency within any service or that might upset intra-service arrangements. Similarly, the Navy initially resisted the first submarine-launched ballistic missile, Polaris, because it would have redirected disproportionate funds to the submarine community.
There is no correlation between declining budgets and inter-service rivalry except when the burdens of such cuts fall disproportionately on one service. And yet, the latter is almost inevitable if civilian leaders decide that they must choose between competing programs. This is what happened after World War II, when U.S. political leaders and even some military officers believed that the most cost-effective way to support American security was by fielding a large force of heavy bombers. These planes would either deter Soviet aggression or enable the United States to respond to a Soviet invasion of Europe effectively until the country could mobilize. The only way to secure enough money for the Air Force’s bombers was by restricting the size of the Navy’s carrier aviation force, which infuriated the Navy because it compromised the service’s independence by depriving it of the right to use whatever weapons it deemed necessary to fulfill its missions.
Perhaps the strongest argument against austerity based on the experience of the 1940s is that it may force the United States to bet on a strategy ill-suited to unexpected contingencies. This is a dangerous proposition for two reasons. First, it is impossible to predict the sort of challenges a country will actually face. After 1945, the U.S. national security establishment was fixated on waging an unlimited war against the Soviet Union in Europe, but it got a limited war in Korea instead. Second, the alternatives that rival services may offer to cost-conscious civilian leaders might have selfish motivations. Many Air Force officers had misgivings about strategic bombing, but they swallowed their doubts because they reckoned that strategic bombing’s political utility outweighed its military shortcomings.
Cutting the budget and hoping for the best is not an effective way to generate superior outcomes. Under these circumstances, John F. Kennedy’s guidance to Robert McNamara in 1961 to develop a defense program on the basis of efficiency rather than arbitrary budget ceilings is still relevant. McNamara’s team found that letting the services decide how to spend their allocations was a recipe for an unbalanced force that preserved each service’s future budget share. Instead, we should retain McNamara’s system of “active management” by having civilian and military experts work together to determine the most efficient means of accomplishing the nation’s political aims.
We might also reconsider how to spend defense dollars in ways that maximize flexibility. The United States has, arguably, a choice between developing one of two kinds of forces. It can resource a “mobile strike force” to handle brushfires or limited conventional conflicts—in other words, a force similar to the one that we already have, but frontloaded toward maximum effort when the shooting starts. The other option would be to create a “cadre” force that can serve as the nucleus for a larger military capable of waging protracted warfare. Such a force would differ from today’s military by having a more modest forward posture and operations tempo. It would require different sorts of investments, such as a greater emphasis on remedying the shortage of skilled technicians and operators. One way to accomplish this might be to develop a large number of prototypes as well as the capability to scale up output. This was how the United States maintained its skill in airplane design and construction during the 1920s despite deep cuts in the defense budget following World War I.
James Forrestal long ago recognized that no one could predict the “form and character of any war of the future.” But once political support for growing defense budgets evaporates—which seems likely in the wake of the recent pandemic—there will be no escape from making hard choices about the allocation of our scarce material, human, and financial resources.
This article is excerpted from a longer one that was favorably peer-reviewed by Orbis: A Journal of World Affairs, but not published due to a backlog at the journal. The views that the author has expressed here are his own and not necessarily those of the U.S. Government or the Foreign Policy Research Institute. He does, however, wish to acknowledge the insights of Mark Blyth, whose work on austerity and political economy inspired this piece.
*About the author: Anand Toprani is an Associate Professor of Strategy at the U.S. Naval War College, Term Member of the Council on Foreign Relations, and author of Oil and Great Powers: Britain and Germany, 1914-1945, which received the 2020 Richard W. Leopold Prize from the Organization of American Historians.
Source: This article was published by FPRI