Ghost Fleet: A Review


I was late in getting to it, but I was pleasantly surprised by P.W. Singer and August Cole’s Ghost Fleet. It took some effort to get into, but the temporal leap the novel takes into the years after a second Pearl Harbor attack allows for some very interesting worldbuilding. The United States has been taken down a peg and enjoys little to none of its previous dominance. What does the post-hegemonic era look like for America? How, in the fabled era of “degraded ISR,” can American armed forces conduct operations? While we’re living through that transition now, Singer and Cole explore what that future might actually resemble.

Riddled throughout with trenchant criticisms of the current political-military-industrial complex (such as the “Big Two” defense contractors, numerous references to the failings of the F-35, and the Air Force’s institutional resistance to unmanned air-to-air platforms), the vision fleshed out in Ghost Fleet is not a flattering one for our current state of affairs. At times the references are a bit on the nose, but the degree of underlying wit makes up for it.

If nothing else, the opening sequence explains, even to the layman, the importance of sensor platforms and space-based assets, the US military’s dependence on them, and their exquisite vulnerability. Finite quantities of ship-launched missiles and other materiel become apparent in a way that can be challenging to discern in real-life operations. Our reliance on Chinese-produced microchips and other advanced technology becomes an easily exploitable Achilles’ heel, in a manner all too reminiscent of the Battlestar Galactica pilot miniseries.

A new techno-thriller is, of course, cause for comparison to Tom Clancy, and where this one far outshines him is in its willingness to critique technology and current trends in military procurement rather than laud them unreservedly, while crafting somewhat multi-dimensional characters (some of whom are not even white!). And as I’ve written before, even if wrong in the details, fiction like this broadens the aperture a bit and conveys the potentialities of future conflict. If not China, then Russia; if not the F-35, then perhaps the long-range strike bomber: things will go wrong, technologies will fail, and the United States may well be caught unawares. Hopefully, with novels such as Ghost Fleet illustrating the cost of unpreparedness, it will be possible to forestall the future it envisions.

The “Utility” of Low-Yield Nuclear Weapons

The B61 family of modifications

An article in the New York Times made the rounds last week, asserting that the new modification (“mod”) of the B61 nuclear gravity bomb was of a lower yield than its predecessors. It argued that lower-yield, precision weapons are destabilizing to nuclear strategy, and that their relatively limited destructive capabilities in turn render them more likely to be used than the multi-megaton, Cold War-era city-busters. The article also proclaimed this the death knell for the Obama Administration’s disarmament and arms control efforts, and evidence of a “new arms race” between nuclear powers.

The argument is well-intentioned, but misguided.

The article quotes the usual suspects – James Cartwright, Andrew Weber, William Perry – and offers some assertions that are patently false.

David Sanger and William Broad, the authors of this piece, focus solely on US weapons development and policy in a bubble, ignoring the larger context of the world in which nuclear weapons exist. As they and their interview subjects characterize it, the United States is the one upsetting the nuclear balance:

Already there are hints of a new arms race. Russia called the B61 tests “irresponsible” and “openly provocative.” China is said to be especially worried about plans for a nuclear-tipped cruise missile. And North Korea last week defended its pursuit of a hydrogen bomb by describing the “ever-growing nuclear threat” from the United States.

This, of course, ignores the fact that Russia has violated the Intermediate Nuclear Forces Treaty, China has refused to enter into any arms control arrangements and is busy expanding its own arsenal (including the production of new nuclear warheads and delivery vehicles; the former something that the United States still will not do), and North Korea has rejected carrot and stick approaches alike dating back several decades. If the Presidential Nuclear Initiatives in the aftermath of the Cold War – or the past 30 years of sanctions – were insufficient to dissuade Pyongyang from nuclear proliferation, it’s hard to envision what would.

Intelligence Collection and Systems Thinking

Methods of intelligence collection

That we don’t perform enough human intelligence collection is a standard refrain these days. As the saying goes, “we’ve traded spies for satellites.” A golden age of honeypots and tradecraft and dead drops was left behind at the dawn of the digital age. This is, purportedly, in keeping with the military establishment’s general overreliance on technology, stretching back to Rumsfeldian “transformation,” the ill-fated “revolution in military affairs” (RMA), and earlier. Conventional wisdom has it that this shift in emphasis was vindicated by the 1991 Gulf War, but it could also be argued that this was the war the US military—especially the “armor guys”—had been itching to fight since the partition of Germany. Rather than the harbinger of a new era, the Gulf War was instead the last gasp of the Cold War.

But what does this have to do with human intelligence?

Contrary to the emphasis placed on the “spy games” aspect of Cold War diplomacy, intrigue, and espionage, the period between 1936 and 1989 saw a vast increase in technical methods of intelligence and a relative devaluing of human collection (analysis, as always, has remained a predominantly human province). Some of these technical methods and their operators became lore unto themselves—Francis Gary Powers in his U-2 (imagery intelligence, or IMINT) and the codebreakers at Bletchley Park (signals intelligence, or SIGINT) come to mind—but most operated behind the scenes. And they certainly continue to do so to this day, recent disclosures notwithstanding.

The intelligence community has additionally seen a change in the way it structures its collection and analysis missions. During much of the Cold War, capabilities were duplicated across agencies. Thus, in addition to the Defense Mapping Agency that preceded the National Imagery and Mapping Agency and today’s National Geospatial-Intelligence Agency (NGA), the Central Intelligence Agency (CIA) had its own IMINT people in the form of the National Photographic Interpretation Center, the National Reconnaissance Office did its own thing with satellites, and so forth. While all of these organizations persist in one form or another, their functions have been streamlined, such that most IMINT now runs through NGA, much of the SIGINT community operates at the National Security Agency (NSA), et cetera. Gaps do exist, as do split missions, and the joint responsibility of the Defense Intelligence Agency (DIA) and CIA for HUMINT is one such example. But in general, we now have standardized methods and practices of intelligence collection, processing, exploitation, and analysis. The very concept of “all-source intelligence” would have been unthinkable during the Cold War—and still seems a novelty to many analysts in the intelligence community—because it would have meant someone was driving in your lane, and that would be unacceptable. Fortunately, this is no longer the case.
