The idea of a technological “Singularity,” a point at which humankind essentially merges with technological systems to create a super-human intelligence, has become a hot topic. While the general concept has been around for the better part of the last century, today it is most often associated with inventor and transhumanist advocate Ray Kurzweil, whose book The Singularity is Near (pictured above… it’s the one on the left, by the way) represents the most comprehensive work on the subject.
However, looking a bit further back, there were others who undertook the analysis of such possibilities, and even brought the idea of technological singularity close to the realm of ufology through guilt-by-association. Before going any further, can you guess who the culprit may be? I’ll give you a hint: his last name starts with a “V” (and it’s not Vernor Vinge)…
Indeed, as the title above already suggests, it was none other than Jacques Vallee, ufologist and computer scientist, who expounded on ideas quite relevant to the technological Singularity in a scientific journal as early as 1975. That year, an essay co-authored with Professor Francois Meyer appeared in the journal Technological Forecasting and Social Change under the title “The Dynamics of Long-Term Growth” (and note that, among Vallee’s credentials, no mention of his UFO research is made). The ideas expressed closely parallel the notions of Singularity put forward by Kurzweil and others today, though the term “Singularity” itself was never used.
This article not only described concepts similar to what Kurzweil calls the “knee of the curve” (the faster-than-exponential growth of technology relative to human evolution), but also projected the year 2026 for what the authors called a “singular point.” Kurzweil predicts this point will be reached in the year 2029… still in the same ballpark, at least.
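To make the “singular point” idea concrete: the hallmark of faster-than-exponential (hyperbolic) growth is that it reaches infinity at a *finite* time, which is what makes projecting a specific date at all meaningful. Here is a minimal sketch in Python, assuming the simplest textbook hyperbolic law dN/dt = k·N² (this is not Vallee and Meyer’s actual model, and the parameters below are invented purely for illustration):

```python
# Illustrative sketch only (not Vallee and Meyer's model): hyperbolic
# growth, the simplest "faster-than-exponential" law, blows up at a
# finite "singular point" -- unlike exponential growth, which never does.
#
# dN/dt = k * N^2 has the closed-form solution N(t) = N0 / (1 - k*N0*t),
# which diverges as t approaches t* = 1 / (k * N0).

def hyperbolic(t, n0, k):
    """Closed-form solution of dN/dt = k*N^2 with N(0) = n0."""
    return n0 / (1.0 - k * n0 * t)

def singular_point(n0, k):
    """Finite time t* at which N(t) diverges."""
    return 1.0 / (k * n0)

# Hypothetical parameters, chosen only for illustration.
n0, k = 1.0, 0.01          # initial level and growth constant
t_star = singular_point(n0, k)   # here t* = 100.0

# As t approaches t*, N(t) grows without bound:
for t in (50, 90, 99, 99.9):
    print(t, hyperbolic(t, n0, k))
```

By contrast, plain exponential growth N(t) = N0·e^(kt) gets arbitrarily large but never diverges at any finite t, so only the faster-than-exponential case yields a datable “singular point.”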
Regarding the absence of the term Singularity for the sorts of concepts being described, I initially figured the term simply did not exist when Meyer and Vallee authored the essay in question. Indeed, for them it was essentially a new idea at the time, based on then cutting-edge computer science and models Vallee referenced from other peer-reviewed scientists of the day. However, looking back a bit further, one will find that the term had in fact been used as far back as 1958, courtesy of Stanislaw Ulam, who, according to Wikipedia, recalled the following conversation with John von Neumann in a written tribute:
“One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”
English mathematician I.J. Good touched on the related concept of an “intelligence explosion” in 1965, and science fiction writer Vernor Vinge (also a mathematician) went on to write about what he called “a kind of singularity” in an editorial for Omni magazine in 1983, arguably popularizing the subject in its modern sense. It’s interesting, however, that neither Wikipedia nor, so far as I know, Ray Kurzweil mentions Vallee and Meyer’s contribution to this subject, which seems to have predated Vinge’s ideas by nearly a decade.
Unlike the more optimistic attitudes toward the Singularity expressed by many in the transhumanist camp today, things get a bit dark toward the end of Vallee and Meyer’s treatise. In the “discussion” section, the authors note that, “the forecast of infinite growth in a finite time interval is absurd. All we can expect of these developments is that some damping effect will take place very soon. The only question is whether this will be accomplished through ‘soft regulation’ or catastrophe.” Wondering aloud here, what might “soft regulation” look like? Orwellian micro-management of a populace? Geo-economic collapse? (I say this partially in jest, though I’d like to know what the authors really meant by this statement.) Catastrophe, on the other hand, might represent the harmful after-effects of an EMP weapon, or even a coronal mass ejection from the Sun, which would cause similar effects globally (both are subjects I’ve addressed in the past here at TGR).
Vallee and Meyer’s final statement is perhaps the most cryptic: “It is clear that the rate of growth must eventually decrease. A discussion of the mechanism through which this decrease will take place is beyond the scope of the present study.” Vernor Vinge was less nebulous when he wrote for Omni back in January 1983 that, “To write a (science fiction) story set more than a century hence, one needs a nuclear war in between … so that the world remains intelligible.”
I’d be interested in hearing Vallee’s present-day thoughts on what this “mechanism” for potential decrease might be, given the current geo-economic forecast, NASA’s concern with the harmful potential of solar activity focused on the year 2013… or, as Vinge alluded, the ever-present concerns surrounding nuclear proliferation. Could it be that the sort of technological Singularity proposed by Kurzweil, Vallee, Vinge, and so many others may never be realized, due to conflicts that could await us in the future?
This article originally appeared at gralienreport.com on October 25, 2011, titled “Singular Semantics: Vallee and the Origins of Technological Singularity.”