Science Progress | Where science, technology, and progressive policy meet

Could the Organic Singularity Occur Prior to Kurzweil’s Technological Singularity?

Many of us have heard of the “Technological Singularity”[1] predicted by Ray Kurzweil—the hypothetical future emergence of greater-than-human intelligence in computer systems. Kurzweil looked at the famous “Moore’s Law,”[2] which observes that chip density, and with it computing power, doubles every couple of years, and concluded that such power inevitably would exceed that of the human brain. Kurzweil, a pioneer in several areas of technology, relied on a straightforward projection of improvements in computing and software and didn’t envision any radical transformation in computing techniques themselves. While others have argued that Kurzweil understated the technical challenges—especially the software needed to replicate human cognition—even today we see systems, and networks of systems, that far exceed the communications powers of any single human being. Google is one example, not to mention the Internet itself. Although these systems don’t fulfill Kurzweil’s prediction, they certainly do demonstrate that many human intellectual abilities can be exceeded by technical means.
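The arithmetic behind such projections is simple compounding. As a minimal sketch—not Kurzweil’s own calculation, and with the starting figures chosen purely for illustration—steady doubling can be expressed in a few lines of Python:

```python
# Illustrative sketch of Moore's Law-style growth: a quantity that
# doubles on a fixed schedule grows exponentially with elapsed time.
def projected_transistors(base_count, base_year, year, doubling_years=2.0):
    """Project a transistor count for `year`, assuming steady doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Starting from roughly 2,300 transistors on Intel's 4004 in 1971,
# a steady two-year doubling projects into the billions by the 2010s.
count_2011 = projected_transistors(2300, 1971, 2011)
```

The striking feature is that the conclusion barely depends on the starting point: with any plausible baseline, a few decades of doubling overwhelms the difference.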

Thus, for many of us the emergence of greater-than-human-intelligence computers is a foregone conclusion, even though at this point we don’t understand what intelligence really entails, how the brain actually performs many of its functions, or whether a software analogue of neural processing is possible. Our expectation is that human creativity is wonderfully adept and, in due course, will develop the knowledge necessary to create such machines. Already, we have witnessed the IBM supercomputer “Watson” offer a masterful performance on Jeopardy!, demonstrating an ability that recalls the famous “Turing Test.”[3] (Alan Turing, the noted mathematical theorist, proposed that a computer could be judged “intelligent” if its conversational responses could not reliably be distinguished from a human being’s—the responses need to be convincingly human, not necessarily correct.) Thus, we are further reassured that Kurzweil’s Technological Singularity is inevitable.

The conclusion that a greater-than-human silicon-based intelligence will emerge, however, does not preclude other developments arriving first. In the past we’ve seen favored technologies outpaced by newer ones; a good example is the canals of the 1800s, which were displaced by rapidly expanding rail lines.[4] The same kind of result may occur with the development of greater-than-human intelligence: we may face a biological creation, based on organic structures, that is more intelligent than a human being before a silicon-based version arrives.

Moore’s Law is under scrutiny today, as it appears increasingly uncertain whether the pace of advances in digital computing will continue. However, many analysts suggest we are experiencing similar exponential growth in our ability to manipulate biological mechanisms such as DNA.[5] Nearly everyone was surprised by the speed with which the human genome was sequenced, and many are similarly surprised by how quickly recombinant DNA varieties of agricultural products have reached the supermarket aisles. Similar advances are occurring throughout the biological fields, from our increasing understanding of the behavior of DNA to our rapid progress in understanding how the brain functions (although we still don’t understand how such “elementary” processes as memory actually occur).[6] Nonetheless, there’s good reason to expect continuing, if not accelerating, improvement in the biological sciences.
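Why a faster doubling rate matters so much can be made concrete with a small sketch. The doubling times below are assumptions chosen purely for illustration, not measured figures for either field:

```python
import math

# For a capability that doubles every `doubling_years`, how many years
# does it take to improve by a given overall factor?
def years_to_improve(factor, doubling_years):
    """Years for an exponentially improving capability to gain `factor`."""
    return doubling_years * math.log2(factor)

# A thousandfold gain takes ~20 years at a two-year doubling pace,
# but only ~10 years if the doubling time is halved.
chip_years = years_to_improve(1000, 2.0)
bio_years = years_to_improve(1000, 1.0)
```

The point is not the particular numbers but the compounding: halving the doubling time halves the wait for any given leap, however large.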

With that in mind, it’s long overdue for us to consider how quickly we might achieve an “Organic Singularity,” a synthetic biological structure or creature that is more intelligent than a human being. Having participated in the computing and biological fields for more than 35 years, I conclude it will occur, and, perhaps more importantly, that it will occur before a Technological Singularity occurs.

I’ve looked at the most important developments in biology, and I’m impressed, as no doubt most of you are, at how quickly we’ve been able to address the extraordinary complexities of organic life. If we begin at the time of Charles Darwin, it looks as though exponential growth is at work, with no end in sight. Moore had a simple metric—transistor density—but the biological field has no single measure that stands out. We might look at the number of scientific publications, human longevity, patents issued, and so on, but none is as satisfyingly simple as transistor density. Taken together, however, these measures give a powerful sense of increasing ability.

Additionally, we can take an inductive approach to the question. We keep discovering that extraordinarily complex organic structures have relatively simple developmental mechanisms. For example, many behavioral aspects of personality are influenced by a single chemical such as adrenaline.[7] Similarly, we have discovered “growth factors” that influence a wide range of capacities in the body.[8] Neurological researchers are identifying mechanisms that control the size and function of various components of the brain, including its architectural structure and the arrangement of its different types of neural cells.

As with the discovery and commercialization of recombinant human growth hormone,[9] it’s not unreasonable to assume we could gain control over various neurological growth control mechanisms in the near term. Moreover, these growth controls could be augmented with mechanical controls. We might couple the use of a neurological growth factor with removal of a section of the cranium to permit an expansion of the cortex. Even though vast gaps might remain in our understanding of how the cortex actually encodes information and processes stimuli, the innate plasticity of neural functions could enable an enlarged cortex to function in a coherent manner.[10] The self-organizing nature of neural networks is a crucial element that will enable an Organic Singularity to occur prior to a Technological Singularity.[11]

Of course, the ethical implications are staggering, to say nothing of the scientific, commercial, and political ones. Even if we assume that any experiments to increase neurological capabilities are performed on animal rather than human subjects, we’d still be on a slippery slope. How do we contend with a German Shepherd whose intelligence has been enhanced by 25 percent? If we start with dolphin tissue, how much more intelligent would such a creature have to be before it is accorded respect as a sentient being? Doesn’t the dividing line between human beings and other creatures become irremediably blurred? Furthermore, such changes could rapidly be exceeded by even more heroic efforts. Just as we proceeded from the single-cylinder engine to two-, four-, eight-, and 12-cylinder engines, and from single-core CPUs to multicore models, what would stop us from extending a bilateral brain into a quadrilateral one, or from creating arrays of organic processors linked by a network of spinal-cord connectors?

Extensive academic research along these lines is currently underway, and no doubt more is going on in private commercial labs. The temptation to pursue such research is compelling; the threat that another nation or commercial interest might win an “intelligence race” may come to be seen as being as dangerous as the nuclear arms race. Kurzweil recognized that once a computer smarter than a human being was developed, it could be employed to create ever-smarter machines. The same can occur with organic computers, which might then improve upon themselves at an even faster rate.

We need to address this challenge immediately. I’m not in favor of blindly restraining all such research—the potential improvements to the human condition could be profound. Such research, however, needs to be performed in the most open and ethical manner we can arrange. We also have to expect that its results may spread more easily than the knowledge of how to construct a nuclear weapon, since the technical means needed to employ organic techniques are far cheaper than those needed for nuclear weapons development. We must make sure the ramifications of an Organic Singularity are understood before it occurs, and that there is a beneficial way to introduce it to society, should we make that choice.

John Chelen is a lawyer with extensive technology, commercial, and government experience. He currently is involved in a transportation-related startup, as well as environmental and health data mining.


[1] Ray Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Penguin Books, 2006).

[2] Gordon E. Moore, “Cramming more components onto integrated circuits,” Electronics 38 (8) (1965).

[3] A.M. Turing, “Computing Machinery and Intelligence,” Mind 59 (236) (1950): 433–460.

[4] Noble E. Whiteford, History of the Barge Canal of New York State (Albany: J. B. Lyon Company, 1922).

[5] Elaine R. Mardis, “A decade’s perspective on DNA sequencing technology,” Nature 470 (2011): 198–203.

[6] Thomas Insel, “How Does Memory Work? The Plot Thickens,” NIMH Director’s Blog, February 10, 2011.

[7] Stanley Schachter and Jerome Singer, “Cognitive, Social, and Physiological Determinants of Emotional State,” Psychological Review 69 (5) (1962): 379–399.

[8] Thilini Fernando and others, “C. elegans ADAMTS ADT-2 regulates body size by modulating TGFβ signaling and cuticle collagen organization,” Developmental Biology 352 (1) (2011): 92–103. And: E. Vögelin and others, “Effects of local continuous release of brain derived neurotrophic factor (BDNF) on peripheral nerve regeneration in a rat model,” Experimental Neurology 199 (2) (2006): 348–353.

[9] John Wass and Raghava Reddy, “Growth hormone and memory,” Journal of Endocrinology 207 (2010): 125–126.

[10] J.M. Fuster, Cortex and Mind: Unifying Cognition (New York: Oxford University Press, 2003).

[11] A. Levina, J.M. Herrmann, and T. Geisel, “Dynamical synapses causing self-organized criticality in neural networks,” Nature Physics 3 (2007): 857–860.
