The Universe - Solved!

Which is our future? Which is our present?

Jim Elvidge holds a Master's Degree in Electrical Engineering from Cornell University. He has applied his training in the high-tech world as a leader in technology and enterprise management, including many years in executive roles for various companies and entrepreneurial ventures. He also holds 4 patents in digital signal processing and has written articles for publications as diverse as Monitoring Times and the IEEE Transactions on Geoscience and Remote Sensing. Beyond the high-tech realm, however, Elvidge has years of experience as a musician, writer, and truth seeker. He merged his technology skills with his love of music, developed one of the first PC-based digital music samplers, and co-founded RadioAMP, the first private-label online streaming-radio company. For many years, Elvidge has kept pace with the latest research, theories, and discoveries in the varied fields of subatomic physics, cosmology, artificial intelligence, nanotechnology, and the paranormal. This unique knowledge base has provided the foundation for his first full-length book, "The Universe-Solved!"

Website: http://www.theuniversesolved.com/


There is a school of thought that has gained significant credibility and momentum, largely as the result of Ray Kurzweil’s best-selling book “The Singularity is Near.” It is based on the belief that we will achieve a technological singularity in the coming decades (Kurzweil puts the date at 2045), a sort of AI-run-amok scenario. Closely related is the concept of Transhumanism, which predicts that, due to the same trends and forces, we will merge with artificial intelligence (AI) and evolve into a new species of cybernetic being that will be essentially immortal. This is a logical theory of the future, based on Moore’s Law, nanotechnology, and trends in AI. This paper will show that there is actually evidence that the accelerating trend may be slowing down, because we are reaching the limits of humanity’s ability to keep up with the pace of technology. That slowdown, in turn, is strong evidence for the theory of Programmed Reality.

Programmed Reality is a different, yet somewhat overlapping, theory that takes Moore’s Law to an alternative conclusion. That is, between advances in virtual reality simulations, gaming, and Brain-Computer Interfacing (BCI), we will eventually be able to create reality simulations of such high quality, from a “sensory experience” standpoint, that they will be indistinguishable from “real” reality. Kurzweil, Nick Bostrom, I, and others put this event at about 20 years from now, in the 2025-2030 range. Bostrom, Director of the Future of Humanity Institute at Oxford University, makes the logical argument in his paper “Are You Living in a Computer Simulation?” that we are most likely living in a simulation now. My book “The Universe – Solved!” explores significant additional classes of evidence that lend support to the idea that we may be living in a programmed reality, if not via a “Matrix”-like simulation, then by a program that drives the physical world. Either way, if true, it revolutionizes our concept of reality.

Rather than being a theory of the future, Programmed Reality is more a theory of everything: the past, the present, and the future. The Singularity would therefore occur within the construct of our Programmed Reality. This has some interesting implications for the likelihood of the Singularity actually happening; in fact, it also provides a test case for the validity of the whole theory. This paper explores the differences between the two views of the future and presents evidence, already available today, that not only do we live in a programmed reality, but that, if so, the Singularity will never occur.


Evidence for the Programmed Reality

It is worthwhile to briefly review the evidence that we are living in a programmed reality. There is much more detail in the book, but to summarize:

1. OUR DISCRETE WORLD – It takes an infinite amount of resources to create a continuous reality, but only a finite amount to create a quantized one. The computational mechanism of a computer is essentially the same as that of Quantum Mechanics: a sequence of states, with nothing existing or happening between the states. The resolution of a program is analogous to the spatial resolution of our reality, just at a different level. In fact, if you carry Moore’s Law forward (and it has held consistently over the past 40 years), computers will reach the Planck resolution in 2192. Not too far off. However, you don’t need to model reality all the way down to that level for the model to be indistinguishable from our reality. Let’s say you want to examine the guts of a tree. You cut it open, scrape off a few cells, and put them under a microscope, maybe an electron microscope.

To simulate this computationally, one doesn’t have to model every single tree down to the Planck level. Only the observed tree needs to be modeled, and then only the cells selected, and then only down to a resolution that matches the observational limitations of our measurement devices. The program can do that dynamically. And all quantum effects can be programmatically modeled without building a reality model to the Planck level. So, given Moore’s law and the limitations of “observational reality”, we should be able to create Virtual Realities that are indistinguishable from our current reality within 20 years or so. The very fact that our reality is quantized may be considered strong evidence that reality is programmed.
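
As a back-of-envelope illustration of the Moore’s Law extrapolation above, consider the following sketch. The starting resolution, starting year, and 18-month doubling period are my placeholder assumptions, not figures from the book; different assumptions shift the target date (the book’s own extrapolation arrives at 2192).

    import math

    PLANCK_LENGTH = 1.616e-35  # meters

    def planck_resolution_year(start_year=2008, start_resolution=1e-3,
                               doubling_period_years=1.5):
        """Estimate the year a simulation's spatial resolution reaches the
        Planck length, if resolution halves once per Moore's Law doubling."""
        doublings = math.log2(start_resolution / PLANCK_LENGTH)
        return start_year + doublings * doubling_period_years

    print(round(planck_resolution_year()))  # ~2166 under these assumptions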

2. THE SIMULATION TIMELINE – Various modern philosophers and scientists have posited that we are likely to be living in a simulation. The reasoning is that once we achieve a posthuman stage, we will almost certainly be able to create ancestor simulations. Due either to the sheer number of simulations that will then be run, or to how close we already are to that stage, it is more probable that we are inside such a simulation than that we are the civilization that has not yet built one. Again, there is no way to tell that we aren’t in a programmed reality.
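
The counting logic behind this argument can be made concrete with a toy calculation. The counts below are illustrative placeholders, not Bostrom’s figures:

    def fraction_simulated(real_civilizations=1, sims_per_civilization=1000):
        """If even one posthuman civilization runs many ancestor simulations,
        the vast majority of observers like us are simulated ones."""
        simulated = real_civilizations * sims_per_civilization
        return simulated / (simulated + real_civilizations)

    print(fraction_simulated())  # 0.999..., with these assumed counts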

3. THE FINE-TUNED UNIVERSE – The universe is unbelievably finely tuned for the physical existence of matter, let alone life. For example, universal constants cancel out all of the vacuum energy in the universe to an amazing accuracy of one part in 10 to the 115th power. Also, a deviation in the expansion rate of the early universe of 1 part in a billion in either direction would have caused the universe to immediately collapse, or to fly apart so fast that stars could never have formed. And there are many more such examples. The only explanations that mainstream science can come up with are that either an uncountably huge number of universes are spawned every second (the Everett interpretation of Quantum Mechanics), or an uncountably huge number of universes are already out there, most of which are entirely barren, unable to support life or even the formation of matter. Via the magic sleight of hand of the anthropic principle, we happen to be in the only perfect one. It certainly seems that Occam’s Razor heavily favors the simulation theory here.

4. THOSE PESKY ANOMALIES – The huge set of well-studied anomalies facing us in fields as varied as metaphysics, physics, philosophy, geology, anthropology, and psychology can all be explained only by the programmed reality model. The mathematics of coincidence, the perceived acceleration of society, out-of-place artifacts (OOPArts), the truth about the paranormal, quantum entanglement, the existence of oil – they all fit neatly into this hypothesis. No other theory can make that claim.


Putting all of this together, we can actually make some predictions about our future. Specifically, and for the purpose of the argument of this paper, Programmed Reality predicts the following:

Any impending civilization-ending trend will always reverse.

Here’s why: If we are indeed in a programmed reality, we can say one of two things about the course of the program. Either:

1. We will end this reality at some point and begin another one.

or…


2. We will continue to “play the game.”

In the first case, I would argue that the civilization-ending trend merely gets erased, which is an extreme example of reversal. In the second case, there is an incentive for the “programmers” to maintain a construct that allows us to continue to play. Such a construct is inconsistent with the idea of a civilization-ending event.

Another prediction is:

The Singularity will not occur.

The Singularity, while not strictly civilization-ending, is certainly the type of event that would destroy life, reality, and civilization as we know them. Therefore, by the argument above, it will not happen. If it does, then I would have to concede that Programmed Reality is not a likely description of our reality.


Evidence that the exponential trends are flattening…

There is actually evidence that the trends that predict the Singularity are slowing down.  The Appendix describes in detail how some of the predicted accelerating evolutionary events seem to be flattening out.  In fact, it seems that the next paradigm-shifting event may occur about 17 years after the last one (the WWW), give or take, whereas the most recent gap, between the PC and the WWW, was only about 13 years.  By that rationale, the pace of exponential technological evolution is slowing down.  Computer scientist and Virtual Reality pioneer Jaron Lanier thinks he knows why.  In his essay “One-Half of a Manifesto” (Wired, 2000), he argues that our inability to develop corresponding advances in software will, at least for now, prevent the Singularity from happening on a Moore’s Law pace.  One great quote from his demi-manifesto: “Just as some newborn race of superintelligent robots are about to consume all humanity, our dear old species will likely be saved by a Windows crash. The poor robots will linger pathetically, begging us to reboot them, even though they’ll know it would do no good.”[1]

I have been in the software industry for over 25 years and I must admit, I am also disheartened by the slow pace of software advances.  It seems that it takes almost as long to boot up, open a Word document, or render a 3D object on today’s blazingly fast PCs as it did 20 years ago on a machine running at less than 1% of today’s clock rate.  Kurzweil claims that we have simply forgotten how bad things used to be: “Jaron has forgotten just how unresponsive, unwieldy, and limited they were.”

So, I wondered, who is right?  Are there objective tests out there?  I found an interesting article in PC World that compared the boot-up time of a 1981 PC to that of a 2001 PC.[2]  Interestingly, the 2001 machine was over three times slower to boot (51 seconds) than its 20-year predecessor (16 seconds).  My 2007 ThinkPad: over 50 seconds.  Yes, I know that Vista is much more sophisticated than MS-DOS, and therefore consumes much more disk and memory and takes that much more time to load.  But really, are those 3D spinning doodads really helping me work better?

Then I found a benchmark comparison of the performance of six different versions of Word over the years.[3]  Summing five typical operations, the fastest version was Word 95, at 3 seconds.  Word 2007 clocked in at 12 seconds (in this test, all versions ran on the same machine).

In summary, software has become bloated.  Developers don’t think about performance as much as they used to, because memory and CPU speed are cheap.  Instead, the trend in software development is layers of abstraction and frameworks on top of frameworks.  Developers have become increasingly specialized (“I don’t do ‘Tiles’, I only do ‘Struts’”), and very few get the big picture.

What does this have to do with the Singularity?  Simply this: with some notable exceptions, software development has not come even close to following Moore’s Law in terms of performance or reliability.  Yet the Singularity predictions depend on it.


Infomania and the Flattening

But the most significant reason that the exponential pace of technological development is slowing down is that our poor brains are simply not able to keep up. Wikipedia defines Infomania as “the debilitating state of information overload, caused by the combination of a backlog of information to process (usually in email), and continuous interruptions from technologies like phones, instant messaging, and email.” As one who works in the high-tech industry, I am well aware of the changes that technology and globalization have made to the pace of typical business life. Not so long ago, people actually took lunch breaks, spent uninterrupted evenings and weekends with their families, paid attention in meetings, and focused on a few large-scale tasks on any given day.

Today, however, with the Blackberry on your belt, the cell phone in your pocket, and the wireless laptop under your arm, you are perpetually reachable, 24x7x365.  People email in their cars, instant message from their laptops during meetings, conduct phone conferences during lunch while working on a couple of tasks between sandwich bites, and in general process many more interrupts than they used to.  This behavior is not only condoned, it is expected in today’s world.

A study done by Intel in 1999 showed that the average worker in the high tech industry spends 2.5 hours per day processing email, much of it unnecessary.[4]  By 2006, another study by the same company showed that the number had risen to 4 hours per day.  In addition, the time that it takes to recover from the email interrupts amounts to another 50%.  For a 9 hour day, that leaves about 2.9 hours of non-email-related work time, not even considering other forms of unnecessary interrupts.  It is no wonder that people feel more stressed and pressured at work than they did a short time ago.  If this is an exponential trend and we project forward another 7 years, we find that email processing alone would take over the average work day, leaving negative time for real work.  So either, the person would have to put in the extra 3.7 hours to make up for the lost productivity (further deteriorating the work/life balance) or something else has to give.
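
For the curious, here is that extrapolation spelled out as a sketch, using the essay’s round numbers and assuming smooth exponential growth between the two cited Intel data points:

    # Intel data points: 2.5 hours/day (1999) -> 4.0 hours/day (2006)
    annual_growth = (4.0 / 2.5) ** (1 / 7)   # ~6.9% per year
    email_2013 = 4.0 * annual_growth ** 7    # 6.4 hours/day, 7 years later
    with_recovery = email_2013 * 1.5         # add 50% interrupt recovery: 9.6
    workday = 9.0
    print(with_recovery - workday)           # email alone overruns the day
    print(with_recovery + 2.9 - workday)     # extra hours needed to preserve
                                             # today's 2.9 hours of real work

With these round figures the overtime comes out near 3.5 hours, in the neighborhood of the 3.7 quoted above; the exact value depends on rounding.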

The answer, of course, is that we can’t sustain this exponential pace.  Even if software somehow comes to the rescue with productivity-improving techniques, we have to reverse the trend just to stay productive.  This is symbolic of the overall acceleration effect of modern life.  I believe that it will reach a plateau out of necessity, thereby staving off the Singularity. 

This is perhaps by design.  If our minds and emotional psyches were designed to be able to support an exponentially growing pace of life, our realities would change drastically.  In the 1960s, people thought that we were living in a world growing exponentially more dangerous due to the specter of nuclear war.  In fact, the Doomsday Clock was at one point set as close as two minutes before midnight.  However, somehow that seemingly inevitable trend was reversed.  Then in the 1970s, the danger was the population explosion.  It seemed that we were on a path of exponentially growing population that threatened to overrun the planet and bring human life to extinction.  Paul Ehrlich’s best-seller “The Population Bomb,” for example, predicted an inevitable mass famine of hundreds of millions of people in the 1970s and 1980s.  Without minimizing the current impact of our growing population, this never occurred, and the exponential trend has certainly flattened somewhat in comparison to the projections of the time.

It seems that our world, our events, our technology, our trends, have a sort of thermostatic effect.  When they begin to get out of control, something pulls them back to normalcy.  I submit that this is the work of the Programmed Reality, which, being significantly advanced from our best concepts of control systems, has a flattening effect built into all trends.  For this reason, it will have the same effect on the Singularity.  It will simply not occur.  And the reason may just be as simple as the fact that we, the players on the stage of the Reality, are not well suited to maintain that pace of growth.
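
To illustrate the thermostat analogy, here is a purely hypothetical toy model, not a claim about any actual control law: a trend with built-in negative feedback flattens into a plateau instead of diverging.

    def thermostatic_step(x, rate=0.5, ceiling=1.0):
        """Logistic-style update: growth is damped as x nears the ceiling."""
        return x + rate * x * (1 - x / ceiling)

    x = 0.01
    for _ in range(30):
        x = thermostatic_step(x)
    print(round(x, 4))  # ~1.0: the curve levels off rather than running away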


APPENDIX

Figure 1 is a recreated chart from the data presented by Ray Kurzweil in his book The Singularity is Near.[5] It demonstrates the exponentially accelerating pace of change in human evolution. By plotting on log-log paper, exponential trends appear as a straight line. The trend shown in this particular chart is the time between successive significant evolutionary events, both biological and technological. Ostensibly, each successive event is equivalently significant, in terms of evolution or technical advancement, compared to the previous one. The chart therefore shows how evolutionary events and technology are accelerating and will reach a point of singularity. In an article in Washington Monthly published the same year as Kurzweil’s book, Steve Benen notes that if you extrapolate the graph a few more orders of magnitude, it “indicates that major, paradigm-busting inventions should be spaced about a week apart these days.”[6]

Figure 1: Singularity chart

I’ve shown that segment of the graph in grey. Kurzweil has correctly pointed out that you can’t project a log-log graph into the future, but nevertheless, this graph seems to imply that we are on a trend toward a Singularity in the current year. If that were not the case, the trend would be diverging to the right.

Another way of looking at it is to redraw the graph with the x-axis being “Time before 2045” instead of “Time Before Present.” Figure 2 shows such a graph using the same events. If the Singularity were really to happen in 2045, and the events are indeed chosen correctly, they should fall on the straight line. However, they do not.
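
Here is a sketch of how such a replot can be constructed; on these axes, a genuine 2045 singularity would put the points on a straight line. The event dates below are illustrative placeholders, not Kurzweil’s actual data set:

    import matplotlib.pyplot as plt

    events = {"Computer": 1943, "PC": 1981, "WWW": 1994}  # assumed dates
    years = sorted(events.values())
    x = [2045 - y for y in years[1:]]                  # time before 2045
    gaps = [b - a for a, b in zip(years, years[1:])]   # time since prior event
    plt.loglog(x, gaps, "o")
    plt.xlabel("Time before 2045 (years)")
    plt.ylabel("Time since previous paradigm shift (years)")
    plt.show()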

Figure 2: Singularity chart, replotted against time before 2045

As can be seen, the paradigm-shifting events are diverging toward the current day. What’s going on? One possibility is that the events chosen are wrong. Perhaps the technological advance from the computer to the PC, for example, is not as significant as the evolution from Homo erectus to Homo sapiens. Certainly, the choice of significant events is somewhat arbitrary. Unfortunately, we are looking at the problem through the lens of present-day biases.

Even so, it does seem that the PC and the World Wide Web have been the two most significant technological paradigm shifts in recent years. The time between the Computer and the PC was 38 years. The time between the PC and the WWW was 13 years. If progress were truly exponential, the next major invention should have occurred around 2001. What was it? It seems to me that we might be looking at a few possible significant events in our near future:

  1. A computer passes the Turing Test (true AI)
  2. Artificial Life is created
  3. Brain-Computer Interfaces become practical

In the recent Loebner Prize competition at the University of Reading, one computer system came within 5% of fooling enough judges into thinking it was human during a natural language conversation.[7] Given that, one might suspect that we will get AI within a few years.

A year ago, Craig Venter announced that he was on the brink of creating artificial life.[8]

And although Brain-Computer Interfacing is a technology in its infancy, it has certainly begun, with 60-pixel bionic eyes a reality and recent successful experiments in determining sensory stimuli merely by analyzing brain waves.

So, it seems that the next paradigm-shifting event may occur about 17 years after the last one (the WWW), give or take. But the most recent gap, between the PC and the WWW, was only about 13 years. So, by that rationale, the pace of exponential technological evolution is slowing down.
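
The gap arithmetic can be checked directly. This is a cruder extrapolation than Kurzweil’s fitted trend line, using only the two gaps quoted above:

    def next_gap_if_exponential(gaps):
        """On a straight log-log trend, successive gaps shrink by a roughly
        constant ratio; project the next gap from the last two."""
        ratio = gaps[-1] / gaps[-2]
        return gaps[-1] * ratio

    gaps = [38, 13]  # Computer -> PC, then PC -> WWW (years)
    print(next_gap_if_exponential(gaps))  # ~4.4 years predicted
    # The observed wait after the WWW is roughly 17 years and counting,
    # far longer than an accelerating trend would allow.

Whichever way the extrapolation is done, the observed gap is growing rather than shrinking, which is the point of this appendix.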


References:

  1. Lanier, Jaron, “One-Half of a Manifesto,” Wired, December 2000. http://www.wired.com/wired/archive/8.12/lanier.html?pg=1
  2. PC World boot-up time comparison. http://pcworld.about.com/magazine/1908p133id52503.htm
  3. “Benchmarking Microsoft Word 95-2007,” oooninja.com, July 2008. http://www.oooninja.com/2008/07/benchmarking-microsoft-word-95-2007.html
  4. Zeldes, Nathan, David Sward, and Sigal Louchheim, “Infomania: Why we can’t afford to ignore it any longer,” First Monday. http://www.firstmonday.org/issues/issue12_8/zeldes/
  5. Kurzweil, Ray, The Singularity is Near, Viking Penguin, 2005.
  6. Benen, Steve, “The Singularity,” Washington Monthly, 21 September 2005. http://www.washingtonmonthly.com/archives/individual/2005_09/007172.php
  7. Williams, Ian, “Artificial intelligence gets a step closer,” vnunet.com, 13 October 2008. http://www.vnunet.com/vnunet/news/2228123/ai-gets-step-closer
  8. Pilkington, Ed, “I am creating artificial life, declares US gene pioneer,” The Guardian, 6 October 2007. http://www.guardian.co.uk/science/2007/oct/06/genetics.climatechange
