What is “Systems Analysis”?

“Systems analysis,” as a concept, can be difficult to define and pin down. For much of my life, I assumed it was some sort of generic back-office IT function (see, for instance, the hundreds of man-on-the-street “American Voices” interviews in The Onion, which describe respondents in equal measure as ‘unemployed’ or ‘systems analyst’). But given the complexities of almost, well, everything in the modern era, an understanding of the logical underpinnings of systems analysis is critical.

Essentially, single variables cannot be considered in isolation. A new weapons platform or technological development or re-basing movement must be thought of in the context of existing technology, logistics capacity, weather, enemy reaction, enabling capabilities, fixed facilities, power projection, and so on, down an otherwise infinite fractal list of factors.

https://upload.wikimedia.org/wikipedia/en/thumb/e/e4/An_image_of_Strategist_Bernard_Brodie.jpg/220px-An_image_of_Strategist_Bernard_Brodie.jpg

Dr. Bernard Brodie, RAND Corporation (Wikimedia)

But all this is a long-winded introduction to Bernard Brodie’s hypothetical systems analysis example in Strategy in the Missile Age, which is one of the best, most succinct ways of describing just how complex this interplay is. Brodie, of course, had a front-row seat to this effort, as the RAND Corporation was the earliest home to a methodological approach to the field. Beginning on page 381 in the 1967 edition:

Let us consider, for example, the problem of choosing between two kinds of strategic bombers. Each represents in its design an advanced “state of the art,” but each also represents a different concept. In one, which we shall call Bomber A, the designers have sought to maximize range. They have therefore settled for a subsonic top speed in a plane of fairly large size. The designers of Bomber B, on the contrary, have been more impressed with the need for a high dash speed during that part of the sortie which involves penetration of enemy territory, and have built a smaller, shorter-ranged plane capable of a Mach 2 dash for a portion of its flight. Let us assume also that the price of the smaller plane is about two-thirds that of the larger.

Perhaps we can take both types into our inventory, but even then we should have to compare them to determine which we should get in the larger numbers. Let us then pick a certain number of specific targets in enemy territory, perhaps three hundred, and specify the destruction of these targets as the job to be accomplished. Since we know that both types can accomplish this job with complete success if properly supported and handled, our question then becomes: which type can do it for the least money?

We do not ask at this stage which type can do it more reliably, because within limits we can buy reliability with dollars, usually by providing extra units. Some performance characteristics, to be sure, will not permit themselves to be thus translated into dollars; for example, one type of plane can arrive over target somewhat sooner than the other type, and it is not easy to price the value of this advantage, but we shall postpone consideration of that and similar factors until later.

Let us assume that Bomber A has a cruising range of 6,000 miles, while Bomber B is capable of only 4,000 miles. This means that Bomber A has to be refueled only on its post-strike return journey, while Bomber B probably has to be refueled once in each direction. This at once tells us something about the number of “compatible” tankers that one has to buy for each type (“compatible” referring to the performance characteristics which enable it to operate smoothly with a particular type of bomber). Up to this point Bomber B has appeared the cheaper plane, at least in terms of initial purchase price, but its greater requirement in tankers actually makes it the more expensive having regard for the whole system. In comparing dollar costs, however, it is pointless to compare merely procurement prices for the two kinds of planes; one has to compare the complete systems, that is to say, the weapons, the vehicles, and the basing, protection, maintenance, and operating costs, and one must consider these costs for each system over a suitably long period of peacetime maintenance, say five years. These considerations involve us also in questions of manpower. We are in fact pricing, over some duration of time, the whole military structure required for each type of bomber.

https://upload.wikimedia.org/wikipedia/commons/7/70/B36-b-52-b-58-carswell.jpg

A B-36 Peacemaker, B-52 Stratofortress, and B-58 Hustler from Carswell AFB, TX en route to the former’s retirement in 1958. The B-52 would long outlive the more advanced Hustler. (Wikimedia)

Now we have the problem of comparing, through a process of “operations analysis,” how the two types fare in combat, especially the survival expectancy of each type of plane during penetration. In other words, we have to find out how much the greater speed (and perhaps higher altitude) of Bomber B is worth as protection. If the enemy depends mostly on interceptors, the bomber’s high speed and altitude may help a great deal; if he is depending mostly on guided missiles, they may help relatively little. Thus a great deal depends on how much we know about his present and projected defenses, including the performance characteristics of his major weapons.

If our Bomber A is relying mostly on a low altitude approach to target, which its longer range may just make possible (we are probably thinking in terms of special high efficiency fuels for wartime sorties), it may actually have a better survival expectation than its faster competitor. Also, we know that penetration capability is enhanced by increasing the numbers of bombers penetrating (again, a matter of money) or by sending decoys in lieu of extra bombers to help confuse the enemy’s radar and saturate his defenses. Perhaps we find that the faster plane would outrun the decoys, which again might tend to give it a lower penetration score than one would otherwise expect. But decoys are expensive too, in acquisition costs, basing, and maintenance, and involve additional operating problems. The faster plane may be less accurate in its bombing than the other, which again would involve a requirement for more aircraft and thus more money.

We have given just barely enough to indicate the nature of a typical though relatively simple problem in what has come to be known as “systems analysis.” The central idea is that no weapon can be considered independently of the other weapons and commodities that are used with it, that all endure through some period of time and require men to service them and to be trained in their use, that all these items involve costs, and that therefore relative costs of different systems, as considered against some common standard of function, are basic to the problem of choice between systems. Systems analysis, which brings what is modern to present-day strategic analysis, is mostly a post-World War II development.

The challenges herein are immense, which in part explains the explosion not only of defense research and development but also of the defense bureaucracy as a whole. It’s a sprawling, tangled mass that can in many ways only be understood in relation to itself. But systems analysis is at least an attempt to build that complexity into our assumptions and considerations.
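Brodie’s whole-system accounting lends itself to a back-of-the-envelope model. The sketch below is strictly illustrative: the unit prices, tanker ratios, survival probabilities, and operating factors are invented placeholders for demonstration, not Brodie’s data or any real program’s figures.

```python
import math

# Toy version of Brodie's bomber comparison. Every number here is an
# invented placeholder, not Brodie's data or any real program's figures.

def system_cost(unit_price, tankers_per_bomber, tanker_price,
                annual_ops_factor, survival_prob, targets=300, years=5):
    """Five-year cost of the whole system needed to cover a fixed target list.

    Buys enough bombers that the expected number surviving penetration
    covers every target ("buying reliability with dollars"), then adds
    compatible tankers and peacetime operating costs.
    """
    bombers = math.ceil(targets / survival_prob)
    tankers = bombers * tankers_per_bomber
    procurement = bombers * unit_price + tankers * tanker_price
    operating = procurement * annual_ops_factor * years
    return procurement + operating

# Bomber A: long range, subsonic, one refueling per sortie.
cost_a = system_cost(unit_price=9.0, tankers_per_bomber=1, tanker_price=3.0,
                     annual_ops_factor=0.10, survival_prob=0.85)

# Bomber B: two-thirds the sticker price, Mach 2 dash, two refuelings.
cost_b = system_cost(unit_price=6.0, tankers_per_bomber=2, tanker_price=3.0,
                     annual_ops_factor=0.10, survival_prob=0.80)

print(f"Bomber A whole-system cost: {cost_a:,.0f}")
print(f"Bomber B whole-system cost: {cost_b:,.0f}")
```

With these made-up numbers, the nominally cheaper Bomber B comes out more expensive once its doubled tanker requirement and operating tail are priced in, which is exactly the reversal Brodie describes.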

Using this technique is not only a way to compare technologies with like missions; it’s an excellent tool for use in wargame design. This too is in fact an iterative process, as the insights from a wargame itself might reveal further interrelationships, which might then be used to craft a more complex operating environment (or refine the mechanics used to select force lists), and so on ad infinitum.

Practicality aside, Brodie’s writing serves as an excellent primer to what systems analysis entails, and more broadly, to the change in strategic thought and analysis since the end of World War II.

Ghost Fleet: A Review


I was a bit late in getting to it, but I was pleasantly surprised by P.W. Singer and August Cole’s Ghost Fleet. It took a bit of effort to get into it, but the temporal leap the novel takes into the years after a second Pearl Harbor attack allows for some very interesting worldbuilding. The United States has been taken down a peg and enjoys little to none of its previous dominance. What does the post-hegemonic era look like for America? How, in the fabled era of “degraded ISR,” can American armed forces operate and conduct missions? While we’re living through that transition now, Singer and Cole explore what that future might actually resemble.

Riddled throughout with trenchant criticisms of the current political-military-industrial complex (such as the “Big Two” defense contractors, numerous references to the failings of the F-35, and the Air Force’s institutional resistance to unmanned air-to-air platforms), the vision fleshed out in Ghost Fleet is not a flattering one of our current state of affairs. At times the references are a bit on the nose, but the degree of underlying wit makes up for it.

If nothing else, the opening sequence helps explain even to the layman the importance of sensor platforms and space-based assets, the US military’s dependence on them, and their exquisite vulnerability. Finite quantities of ship-launched missiles and other matériel become apparent in a way that can be challenging to discern in real-life operations. Our reliance on Chinese-produced microchips and other advanced technology becomes an easily exploitable Achilles’ heel, in a manner all too reminiscent of the Battlestar Galactica pilot miniseries.

A new techno-thriller is, of course, cause for comparison to Tom Clancy, and where this one far outshines him is in its willingness to critique technology and current trends in military procurement rather than lauding them unreservedly, while crafting somewhat multi-dimensional characters (some of whom are not even white!). And as I’ve written before, even if wrong in the details, fiction like this helps broaden the aperture a bit and convey the potentialities of future conflict. If not China, then Russia; if not the F-35, then perhaps the long-range strike bomber: things will go wrong, technologies will fail, and the United States may well be caught unawares. Hopefully, with novels such as Ghost Fleet illustrating the cost of unpreparedness, it will be possible to forestall the future it envisions.

The “Utility” of Low-Yield Nuclear Weapons

The B61 family of modifications


An article in the New York Times made the rounds last week, asserting that the new modification (“mod”) of the B61 nuclear gravity bomb was of a lower yield than its predecessors, and arguing that lower-yield, precision weapons are destabilizing to nuclear strategy and that their relatively limited destructive capabilities in turn render them more likely to be used than the multi-megaton, Cold War-era city-busters. It was also proclaimed that this was the death knell for the Obama Administration’s disarmament and arms control efforts, and represented a “new arms race” between nuclear powers.

The argument is well-intentioned, but misguided.

The article quotes the usual suspects – James Cartwright, Andrew Weber, William Perry – and offers some assertions that are patently false.

David Sanger and William Broad, the authors of this piece, focus solely on US weapons development and policy in a bubble, ignoring the larger context of the world in which nuclear weapons exist. As they and their interview subjects characterize it, the United States is the one upsetting the nuclear balance:

Already there are hints of a new arms race. Russia called the B61 tests “irresponsible” and “openly provocative.” China is said to be especially worried about plans for a nuclear-tipped cruise missile. And North Korea last week defended its pursuit of a hydrogen bomb by describing the “ever-growing nuclear threat” from the United States.

This, of course, ignores the fact that Russia has violated the Intermediate-Range Nuclear Forces Treaty, China has refused to enter into any arms control arrangements and is busy expanding its own arsenal (including the production of new nuclear warheads and delivery vehicles; the former being something that the United States still will not do), and North Korea has rejected carrot and stick approaches alike dating back several decades. If the Presidential Nuclear Initiatives in the aftermath of the Cold War – or the past 30 years of sanctions – were insufficient to dissuade Pyongyang from nuclear proliferation, it’s hard to envision what would.

Diamond in the Techno-Thriller: The Value of Speculative Fiction

A few years ago, some sort of switch got flipped in my brain and all of a sudden I became far more capable of and willing to plow through half a dozen novels in a single stretch than to finish a single non-fiction book. Recently, equilibrium has been at least somewhat restored, but I continue to find myself immersed in fiction in a way that I rarely was before.

Some recent reading has included a pair of Larry Bond novels from the late 1980s and early 1990s, Vortex and Cauldron. Larry Bond is most famously, of course, the man who helped Tom Clancy game out many of his books’ wartime scenarios (and Bond co-wrote Red Storm Rising with Clancy). I hadn’t known Bond as an author in his own right, but recently read those two works of his in succession.

What’s wonderful about books like these is generally not their literary qualities, nor is it even the conduct or proposed sequence of events in particular conflicts. Can fiction, in fact, predict the future of warfare? Perhaps, but more interestingly, such books serve as a time capsule of the era in which they were written. Much of the “value added” from this is detailed (at times overly so) descriptions and explanations of the weaponry, arms systems, and military organization of the era. Furthermore, while not predictive in any meaningful way, these novels can help widen the Overton Window of the imagination, to at least consider a divergent future drastically different from our own.

With books set in the future, but now a dated future, it’s almost like reading alternate history. As of this writing, I’m reading The Third World War: August 1985, which is an account of World War III written in the past tense as a historical survey from the point of view of two years later (i.e., 1987). Of course, the book was actually published in 1979, along with a follow-up, The Third World War: The Untold Story, which was published two years later and dives deeper into specifics of nuclear postures, the war in the North Atlantic and the North Sea, Solidarity’s effect in Poland, and other issues. It is a look at a world that never was, but seemed frightfully close to being so. And from that perspective, it’s a chilling look at the prospective war facing the world of the past.

Obviously, these never came to pass, but when one considers what might have been, that can seem a blessing.

Nike’s Revenge: The Return of Urban Missile Defense

News last week that the US is contemplating area cruise missile defense – around US cities, against Russian missiles, no less – was enough to fill me with a certain sort of glee.

Given the problems we’ve had with defending against ballistic missiles and their far more predictable (and detectable) trajectories, the technological barriers to implementing an effective urban cruise missile defense are likely to be high. (On the other hand, if the Russian ICBM threat is as overhyped as State is playing it, then we’ve got some breathing room to develop such a system.) Of course, this wouldn’t be our first iteration of urban aerial defense.

Nike Hercules missiles on alert, 1970s.

The Nike program from the early Cold War was the US attempt to thwart Soviet nuclear attacks by shooting down bombers (at first, with Nike Ajax), eventually expanding to target ballistic missiles (Nike Hercules/Zeus). Surface-to-air missiles versus aircraft, essentially. By the 1970s, the threat of a massive, overwhelming ICBM salvo meant that some missiles had begun to be armed with nuclear warheads, most notably those of the separate SENTINEL/SAFEGUARD program with its 5-megaton W71. But the last of the Nike sites was decommissioned in 1974, and Safeguard only lasted a few months before being shut down in 1975. An excellent overview of all this has been published, of course, as an Osprey book.

Raytheon/Kongsberg HAWK-AMRAAM mobile launcher.

The proposed system is a little different – new active electronically scanned array (AESA) radars would enable F-16s to shoot down the cruise missiles, rather than relying on a ground-based interceptor. The fighters would be networked with some sort of barrage balloon-type airships carrying sensors, as well as radars and sensors at sea (too bad Washington must bid adieu to USS Barry – would that it might gain a second life from this effort). However, Raytheon is considering land-based versions of both the SM-6 and the AMRAAM, which would require some degree of construction for basing.

What’s most interesting to me is where exactly such systems might be deployed. In addition to the major SAC bases, various Nike anti-air batteries defended major industrial and population centers.

Nike sites in the continental United States

Reading a list of the 40 “defense areas” from the 1950s is like a snapshot of US heavy industry at its peak: Hartford, Bridgeport, Chicago-Gary, Detroit, Kansas City, St. Louis, Niagara Falls-Buffalo, Cincinnati-Dayton, Providence, Pittsburgh, and Milwaukee (among others) were all deserving of their own ring of air defenses. But what would the map look like now?

It’s both a rhetorical and a practical question. The importance of place has changed, and in some ways the country as a whole is more sprawling than it was during the Cold War. More places to defend, but also more targets for an enemy. As JCS Vice Chairman Sandy Winnefeld put it, “we probably couldn’t protect the entire place from cruise missile attack unless we want to break the bank. But there are important areas in this country we need to make sure are defended from that kind of attack.” One can already imagine the hearings and horsetrading that would accompany any discussion of which cities are worthy of protection – picture the East Coast interceptor site debate multiplied a hundredfold.

Hopefully SM or AMRAAM is a part of it, but if the new system of sensors and radars is indeed confined to mobile platforms – fighters, ships, balloons – we’ll be deprived of Nike’s wonderful legacy of ruins. Only one Nike site, that of Fort Barry in San Francisco’s Golden Gate Recreation Area, has been preserved and is open to the public. Others make for a nice, if haunting, walk. Others have found new purpose – it was only in recent years that I learned Drumlin Farm, a wildlife sanctuary run by the Massachusetts Audubon Society and frequent field trip destination as a kid, was once home to the control site of Launcher B-73 and its Ajax and later Hercules missiles.

Radar dome of Nike site Mike near Eielson AFB, AK.

Without the permanent physical infrastructure of a Nike-type program, our cruise missile defense initiative will be sorely lacking any vestiges for future generations to explore. And how ironic is it that one of the better reasons to support a missile defense initiative is the anticipation of its eventual decommissioning? At any rate, it’s likely that any modern system would leave a smaller footprint than Nike, with presumably fewer control sites having command of multiple, if not all, launch sites in a given area.

The land-based component might never be constructed. But at least with balloons in the skies overhead, we all might wake up one morning and wonder if we’ve been transported to some kind of alternate reality.

“Manhatan” (as it appeared on Fringe).

Pilots Are Special and Should Be Treated That Way

In the Secretary Gates v. General Moseley spat, the general’s comments on a place for manned and unmanned platforms jumped out (emphasis mine):

Moseley said there is no intentional bias against unmanned aircraft in the Air Force. There is a place for both manned and unmanned, he said. “Secretary Wynne got tired of hearing me say this when we were beaten up about not going all unmanned.” The reality is that there are few instances when the use of unmanned aviation is imperative. “One is when you believe the threat is so terrible that you’ll lose the human,” he said. “I believe the Air Force has never found that threat. We will penetrate any threat. We haven’t found a place we won’t go. So I don’t buy that one.”

The other is when human pilots are the limiting factor to the persistence of the machine. “I got that one,” said Moseley. “You leave the plane out there for 30 hours on a reconnaissance mission. That’s a valid one.”

Isn’t that backwards? In this era of declining budgets, a need for persistence (in general), and an almost total lack of contested airspace, isn’t the onus on the Air Force to prove a need for the use of human pilots? Why are we still considering manned aircraft the default, and only operating them remotely in particular circumstances?

I guess it seems as if there’s no overwhelming rationale for maintaining pilot primacy in the vast majority of missions. And I’m the first to admit I would try to have it both ways: if the airspace is too dangerous, then of course we’d want to minimize pilot risk and maximize unmanned platforms. If airspace is uncontested, then it would seem that the skill and reaction time of a human pilot is unwarranted.

Obviously, manned flight isn’t going anywhere, but to declare the Air Force as essentially a force of human-piloted aircraft, except for a few instances, seems to be ignoring the larger trends and gains to be had from unmanned aviation.

(Original link h/t Rich Ganske).

Flat Tops and Short Decks


Izumo’s commissioning on August 6, 2013

Japan unveiled its biggest warship since World War II on Tuesday, a $1.2 billion helicopter carrier aimed at defending territorial claims.

The move drew criticism from regional rival China, which accused its neighbor of “constant” military expansion.

The ceremony to showcase the 248-meter (810-feet) vessel came as Shinzo Abe’s conservative government, which took office last December, considers ditching the nation’s pacifist constitution and beefing up the military.

Japan plans to use the helicopter carrier, named Izumo and expected to go into service in 2015, to defend territorial claims following maritime skirmishes with China, which has demonstrated its own military ambitions in recent years.

This is via Nick Prime, who points out the theoretical possibility of fielding the F-35 on these. (Here’s another story).

Which, honestly, is what I thought was basically the only thing keeping the B variant alive. It’s not just the USMC that needs a VTOL-capable aircraft (or in the case of the JSF, “aircraft”), but a lot of our allies and partners in the region who have been investing in flattops like these (see: HMAS Canberra, ROKS Dokdo, etc.) with the possibility of flying such planes off of them. Even the Europeans are getting in on it – the French have a pretty good platform in the Mistral class, hence the brouhaha over the Russian acquisition of four of them.

And in that case, there had better be an aircraft that can use the short-decks. I mean, helicopters are great and all, but if we’re going to at least play bluewater navy and accept that power projection via the aircraft carrier is still a) relevant, and b) desirable, then doing it on the cheap is probably the best compromise. At this point you might as well assume that you’re going to lose them, so why not go for the more basic version? The Marines probably aren’t thrilled about a CAS aircraft that’s only 80% better than its predecessor (though certainly more than 80% more costly), but the key is that it’s not just for them. Our friends are getting in the game, and that’s not a bad thing.

Anyways, it is nice to see the JMSDF get a new flagship (that’s how they determine them, right? The biggest?). And one whose name has an interesting history, too.

 

UPDATE: Kyle Mizokami, as usual, has written excellent words on the Izumo. Short version: Japan’s going to have to go big or go home.

The Means of Consumption

PC sales are down. Way, way down.

What’s to blame? Zero Hedge says that in addition to lackluster sales and poor reception for Windows 8, we are, after all, still in a pretty severely depressed economy, and there’s just no end-user demand for new OSes or new computers in general. None of which is wrong. Windows 8, in particular, severely hamstrings Windows as an operating system, forcing it to suffer from the same limitations as a phone (which is just silly, especially when Windows 7 was a solid OS).

But the comments point out that we’ve really reached a point in modern computing power where most people just don’t need it. The rise of mobile and tablet devices has only compounded that. If the average person uses a machine just to tweet or surf the internet or check email or even just watch a movie, what’s the point of having multiple cores and more RAM than hard drives offered less than a decade ago? The smaller devices speak to that and obviate the need for real “computing” devices.

But two comments in particular caught my eye. The first:

[M]ost people don’t do physics simulations, train neural nets, backtest stock trading strategies and so on.

In tight times – why upgrade something that’s already better than most need?  Even I still use some  2 core relative clunkers (that were the hottest thing going when bought).  Because they do their job and are dead-reliable.

And the second:

[E]very manuf [sic] caught the disease it seems.  They don’t give a shit about their installed base, only new sales, and are just slavishly following the migration of most people to crap mobiles – crap if you need any real computing power and flexibility and multi-tasking.

I recently got a Nexus 10 – it’s cute, sometimes handy and so on.  But solve any real problem on it?  You must be joking, it’s just not there.  It’s great for consuming content, sucks for creating anything real – it’s a toy that probably does match the mainstream mentality – the “average guy” who half of people are even dumber than.  That ain’t me.  I’m a maker…I need real tools.

This is just the digital embodiment of a long-time trend. We don’t shape our environments how we used to – we don’t create; we only consume. We refine what exists without thinking bigger. And the sad part about something like the news about PC sales, which could conceivably serve as a wakeup call, is that it won’t matter. If any lesson is drawn, it will be that Windows 7 was fine, so why bother iterating new versions. But the real lesson is that there is at least some segment of humanity that’s trying to create and only needs the proper tools to do it. Possessing the means of consumption allows one only to consume (the Apple model); if we can repopularize “dual-use” technologies that don’t restrict content distribution but also enable its creation, well, then we might see innovation for all the right reasons.

Third Time’s a Charm, and By “a Charm” I Mean Exactly the Same

DPRK test, actual scale (Not actually) [image: Petey Santeeny]

Following the other night’s North Korean nuclear test, there was definitely enough anxiety to keep observers and analysts up for hours. But there are a couple factors at play allowing me to sleep pretty soundly. Hopefully they’ll help you do the same!

The first is the relatively small yield – yes, it’s larger than the first two tests, but that really doesn’t mean anything. A 10 kiloton (or 6-7 kt or 15 kt) nuclear weapon is nothing to sneer at, but as the world saw with Hiroshima and Nagasaki, a weapon of that kind isn’t much more effective than conventional explosives. The firebombings of Tokyo did more damage and took more lives than either nuclear blast in World War II.
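Part of the reason extra kilotons buy less destruction than intuition suggests is the standard cube-root scaling rule: blast radius grows roughly with the cube root of yield. A quick sketch (the cube-root rule is a common approximation, and the specific yields chosen here are just examples):

```python
# Cube-root scaling: blast radius grows roughly as yield**(1/3), so a
# thousandfold increase in yield gives only about ten times the radius.
def relative_blast_radius(yield_kt, reference_kt=10.0):
    """Approximate blast radius relative to a 10 kt reference weapon."""
    return (yield_kt / reference_kt) ** (1.0 / 3.0)

for kt in (10, 15, 100, 1000, 10000):
    print(f"{kt:>6} kt -> {relative_blast_radius(kt):.1f}x the 10 kt radius")
```

So even a megaton-class city-buster only extends the blast radius to several times that of a Hiroshima-scale weapon, which is part of why the firebombing comparison above holds up.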

They’ve also talked about switching their nuclear fuel from plutonium to highly enriched uranium, which is odd and something of a step back. The United States used to use HEU, but once we perfected plutonium processing techniques we stuck with that; it’s a much more effective fuel for a multi-stage thermonuclear explosion, and it’s unusual for anyone to switch away from it. If true, it could indicate a processing and/or supply issue, which would actually be a good sign: it would mean that they’re having trouble sourcing fissile material. So they may not even have the raw materials necessary to build many bombs.

The other part is a little up in the air and I’ve heard competing claims, but nothing I’ve read so far confirms (despite Pyongyang’s claims) that North Korea has successfully miniaturized a nuclear weapon – a prerequisite for mounting it onto an ICBM. Miniaturization is one of the most difficult steps in nuclear weapons development and requires increasing reaction efficiency. The small gain in yield this test provided makes me think that they definitely haven’t reached that step yet. I’m also not positive on the physics – and it might just be a coincidental concurrence rather than cause – but I believe the only miniaturized, ICBM-deliverable warheads in existence are thermonuclear, and a failure to demonstrate that technology definitely means something.

So, in short, I’m not worried yet. They can’t build very many bombs; the bombs they can build aren’t especially powerful; they have no missile with the range to reach the United States and even if they did they haven’t miniaturized a warhead sufficiently to mount on it; and their only means of delivering one of the few extant bombs is by bomber, which exist in low numbers and also don’t have the range to hit the US, much less reach here undetected. So we’re all safe over here for the foreseeable future.

I don’t know that this really changes anything strategically even in the region. We’ve known, the South Koreans have known, and the Japanese have known; it’s common knowledge that North Korea has some nuclear weapons. And that hasn’t led to regional proliferation or a move to oust the Kim regime or anything like that. I don’t see “just another test” making a dramatic difference on that front. Dr. Farley probably says it best: “Last night, North Korea expended a significant fraction of its fissile material to achieve nearly nothing, beyond possibly the irritation of Beijing and the strengthening of right-wingers in Japan and the United States.”

Yeah, great job there, Pyongyang.

The Challenge

Originally meant for a Facebook post, but it soon spiraled out of control. The subject is a piece by Jason Pontin in the MIT Technology Review: “Why We Can’t Solve Big Problems.”

“We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard; because that goal will serve to organize and measure the best of our energies and skills.”

Since Apollo 17‘s flight in 1972, no humans have been back to the moon, or gone anywhere beyond low Earth orbit. No one has traveled faster than the crew of Apollo 10. (Since the last flight of the supersonic Concorde in 2003, civilian travel has become slower.) Blithe optimism about technology’s powers has evaporated, too, as big problems that people had imagined technology would solve, such as hunger, poverty, malaria, climate change, cancer, and the diseases of old age, have come to seem intractably hard.

That’s not to say the article is entirely pessimistic about the future. In many cases it’s not so much a question of know-how as one of mere willpower.

I’ve written about this before; the common thread through all writing on the subject seems to be the Concorde. Humans could once buy a ticket to travel faster than the speed of sound. Those days now lie behind us.

And we’re running out of steam, too. Consider the troubled F-35 acquisition program (I hate holding up acquisitions as an example of anything, but here I am). It’s not even as advanced as the F-22, yet we still don’t have a combat-ready B variant (the Marine Corps has stood up an all-F-35B squadron consisting of exactly three aircraft). And of course, our most advanced aircraft, the F-22 and B-2, were meant to be procured in far greater numbers but went into the “death spiral” of rising costs and declining orders.

This is not a problem unique to “legacy” industries. Even the hyped new media and tech sectors are seeing their own trivialization. As Jeff Hammerbacher put it in a Businessweek article, “The best minds of my generation are thinking about how to make people click ads.” That does indeed suck.

I don’t know what the solution is, but this is hardly a matter of perception. There’s a reason we no longer live in an age of optimism, with the stars as the limit and a sense of awe and wonder at what tomorrow might bring. We’re stuck in a quagmire with little consequential technological progress, no political progress at all, and a generational rift that could just as easily be a referendum on moving into the 21st century. Other than Los Angeles, who’s building an urban heavy rail line? Who’s developing a faster way to travel? A better way to compute? A food replicator? A way to make money while also enhancing the common good?

The closest we’re getting right now is 3-D printing, and I have very high hopes for the field. Should it reach its true potential, global supply chains will be completely disrupted (and for the better). But it will have to go beyond mere plastics. And other than that, what’s on the horizon? What about today, beyond the tiny details, has changed in the last 30 years? What in that time has changed for the better?

I recently read Charles Stross’s Halting State, which deserves a more comprehensive treatment at some point, but which also has the following passage:

“Imagine you were a time-traveller from the 1980s, say 1984, and you stepped out of your TARDIS right here, outside, uh, West Port Books.” (Which tells you where you are.) “Looking around, what would you see that tells you you’re not in Thatcherland anymore?”

“You’re playing a game, right?”

“If you want it to be a game, it’s a game.” Actually it’s not a game, it’s a stratagem, but let’s hope she doesn’t spot it.

“Okay.” She points at the office building opposite. “But that…okay, the lights are modern, and there are the flat screens inside the window. Does that help?”

“A little.” Traffic lights change: Cars drive past. “Look at the cars. They’re a little bit different, more melted-looking, and some of them don’t have drivers. But most of the buildings—they’re the same as they’ve ever been. The people, they’re the same. Okay, so fashions change a little. But how’d you tell you weren’t in 1988? As opposed to ’98? Or ’08? Or today?”

“I don’t—” She blinks rapidly, then something clicks: “The mobile phones! Everyone’s got them, and they’re a lot smaller, right?”

“I picked 1984 for a reason. They didn’t have mobies then—they were just coming in. No Internet, except a few university research departments. No cable TV, no laptops, no websites, no games—”

“Didn’t they have Space Invaders?”

You feel like kicking yourself. “I guess. But apart from that…everything out here on the street looks the same, near enough, but it doesn’t work the same.”

Humanity possesses boundless reserves of optimism just waiting for the right conditions to be unleashed. But I fear we’re a long way away from that. We currently live in an age of in-between, a mere interlude of history, with our small times and small men and small problems. What’s next?