
An Introduction and an Objection to the Singularity

Posted on 12th Aug 2015

My junior year of high school, I faced a daunting task: the Junior Theme. Like a teenage version of a college thesis, the Junior Theme was a scary 8-12 page paper that each English class would spend the whole year researching and writing. If I remember correctly, it was worth a whopping 20% of our final grade. While other students frantically chose topics from a list of suggestions (poverty, politics, the economy, historical figures, etc.), I opted to create my own category in something that attracted me more: computers. So I went online and looked for the biggest (more to write about) and coolest book about computers that I could find. What I discovered would influence me years down the road.

The Singularity is Near, by Ray Kurzweil, is a compelling (and, true to my search criteria, very large) argument about the history and future of computer science and how the products of research in this field are changing, and will continue to change, our lives in unimaginable ways. Kurzweil is often referred to as the father of artificial intelligence for his work in the field and for his over-30-year track record of accurate technological predictions. In the 1980s, he invented technology that is now used by such artificial intelligence programs as Siri and Google Now.

The core thesis of Kurzweil's book is this: by the year 2045, computers will be so advanced that biological intelligence (humans) and artificial intelligence (robots) will be indistinguishable. This is a huge claim. Think about it: Kurzweil argues, with heaps of extraordinarily strong evidence, that 30 years from this moment you will not be able to tell a computer program from a human over text chat, over the phone, and probably in person. This is not just a prediction that computers will pass the Turing test (which Kurzweil predicts will happen in 2029); it is a prediction that in just 30 years we will have the capability to completely simulate a human brain on a computer in real time. The implications of this event are literally life-changing. From being able to upload and store your personality online to having an implant in your brain that puts the whole wealth of the Internet's knowledge at your fingertips in a second, the possibilities insinuated by the Singularity would make anyone scratch their head in disbelief. But it shouldn't be so hard to believe: the cell phone that is probably in your pocket right now holds more computing power than the Apollo 11 spacecraft by many orders of magnitude, at thousands of times less cost. Quick reminder that Apollo 11 flew to the moon less than 50 years ago.
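To put rough numbers on that comparison, here is a back-of-the-envelope sketch in Python. The figures are my own ballpark estimates from public sources, not from Kurzweil's book, so treat the output as illustrative only:

```python
# Rough comparison: Apollo Guidance Computer vs. a 2015 smartphone.
# All figures below are crude public estimates, used only to illustrate
# what "many orders of magnitude" means in practice.

import math

agc_instructions_per_sec = 85_000             # Apollo Guidance Computer, ~1969
phone_instructions_per_sec = 30_000_000_000   # a 2015 smartphone CPU, very roughly

agc_cost_usd = 150_000    # rough per-unit hardware figure, 1960s dollars
phone_cost_usd = 600      # typical 2015 smartphone price

speedup = phone_instructions_per_sec / agc_instructions_per_sec
print(f"Raw speedup: ~{speedup:,.0f}x "
      f"({math.log10(speedup):.1f} orders of magnitude)")
print(f"Cost ratio: the phone is ~{agc_cost_usd / phone_cost_usd:,.0f}x cheaper")
```

Even with these conservative guesses, the speedup comes out around five to six orders of magnitude, and that ignores inflation and the phone's GPU entirely.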

Now, this post wasn't meant to be an argument for the Singularity, although I do agree with many of the arguments Kurzweil has made. I'll leave the rest of the research up to you if you're interested enough to learn more; tons of information can be found on the Wikipedia page for the book or in the book itself (also see www.KurzweilAI.net). Instead, I want to discuss a compelling case made against the Singularity. I recently stumbled upon an article on Hacker News that makes a good argument for why we may all just be laughing at Kurzweil in 2045.

Maciej Cegłowski, a web designer from San Francisco, uses the aviation industry as an analogy for why the Singularity simply won't happen. Specifically, he says the industry in the 1960s and 1970s was so captivated by the exponential pace of aviation advancements over the previous 70 years that airlines very optimistically began taking reservations for flights to the moon. As you probably (hopefully) know, those reservations were never used. But how could we go from taming a completely new technology in 1903, one that visionaries had chased for hundreds of years, to the first commercial flight in 1914, to jet propulsion and interplanetary travel just a few decades later, and then to 40 years of stagnation? Aviation technology really hasn't changed all that much in the last 40 years, argues Cegłowski (be sure to see his article for some great illustrations). Some may object that of course we've made advancements in the last 40 years, but those advancements have been nowhere near as fundamentally ground-shattering as the progress that occurred in the first half of aviation history. How does this make sense?

According to the article, the technology simply got good enough. Once we had the Boeing 747 and jet propulsion, we hit the point of diminishing returns: the cost of developing the next best thing wasn't worth the benefits it offered over the current technology. Cegłowski says that because people are moving from desktops to laptops to cell phones to... smart watches?, there is no desire, and thus no push, to achieve microchips composed of 3D transistors with ultra-low power requirements and super high clock speeds, only a few carbon atoms across: "What people want from computers now is better displays, better battery life and above all, a better Internet connection."
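The whole dispute can be sketched numerically. A diminishing-returns story is a logistic (S-shaped) curve, while Kurzweil's story is an unbounded exponential, and the tricky part is that the two look nearly identical early on. Here is a toy illustration; the growth rate and ceiling are arbitrary values I picked just to make the shapes visible, not estimates from either author:

```python
# Toy illustration of the two worldviews: an exponential and a logistic
# (S-shaped, diminishing-returns) curve start out nearly indistinguishable,
# then diverge sharply once the logistic approaches its ceiling.

import math

def exponential(t, rate=0.25):
    # Kurzweil's picture: growth that never saturates.
    return math.exp(rate * t)

def logistic(t, rate=0.25, ceiling=100.0):
    # Cegłowski's picture: same early growth rate, but the technology
    # saturates once it is "good enough" (the ceiling).
    return ceiling / (1 + (ceiling - 1) * math.exp(-rate * t))

for t in range(0, 41, 5):
    print(f"t={t:2d}  exponential={exponential(t):10.1f}  "
          f"logistic={logistic(t):6.1f}")
```

Run it and both curves track each other closely for the first stretch, then the exponential keeps climbing past 22,000 while the logistic flatlines near 100. The uncomfortable implication is that you cannot tell which curve you are on just by looking backward at the data, which is exactly why this debate is hard to settle.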

Cegłowski goes on to offer several other examples as evidence that our computer science revolution is reaching the same point of diminishing returns that the aviation industry did. The difference between Cegłowski and Kurzweil essentially boils down to this: the web designer sees technological advancement as a product of consumer desire, whereas the "father of artificial intelligence" sees the technological revolution as a self-fulfilling prophetic beast, a passive phenomenon that doesn't require any interference, a literal machine.

My own opinion sides much more with Kurzweil. The information technology sector and the aviation sector are inherently different industries; we already have computer programs and robots that are self-improving, and it is simply not true that computers will ever be "good enough." Recent computing capabilities have ushered in the age of Big Data, and advances in physics, medicine, neuroscience, and many other fields are demanding more advanced computer hardware and software for simulations and data analysis that will revolutionize each of these industries within the decade. For example, IBM just acquired a healthcare analytics company in a billion-dollar deal to help its artificial intelligence system Watson (the very same that beat the best Jeopardy! players in the world) analyze medical images for diagnosis. Advances in the information technology industry show absolutely no signs of slowing down. With over 30 years of accurate technology predictions and loads of historical data showing smoothly exponential growth in computing capability over the last 120 years (smooth despite two world wars, a major depression, and many different technology paradigms), Kurzweil has much more ammo to back up his argument. That being said, Cegłowski's position has its merits.

 "Never trust anything that can think for itself if you can't see where it keeps its brain."

- Arthur Weasley