RIP Kardashev Civilization Scale

In 1964, Soviet astronomer Nikolai Kardashev proposed a model for categorizing technological civilizations.  He identified three levels or “Types” (the “Type 0” below is a later extension of his scheme), simplified as follows:

Type 0 – Civilization that has not yet learned to utilize the full set of resources available to them on their home planet (e.g. oceans, tidal forces, geothermal forces, solar energy impinging upon the planet, etc.)

Type 1 – Civilization that fully harnesses, controls, and utilizes the resources of their planet.

Type 2 – Civilization that fully harnesses, controls, and utilizes the resources of their star system.

Type 3 – Civilization that fully harnesses, controls, and utilizes the resources of their galaxy.


As with philosophical thought, literature, art, music, and other concepts and artifacts generated by humanity, technological and scientific pursuits reflect the culture of the time.  In 1964, we were on the brink of nuclear war.  The space race was in full swing and the TV show “Star Trek” was triggering the imagination of laymen and scientists alike.  We thought in terms of conquering people and ideas, and in terms of controlling resources.  What countries are in the Soviet bloc?  What countries are under US influence?  Who has access to most of the oil?  Who has the most gold, the most uranium?

The idea of dominating the world was evident in our news and our entertainment.  Games like Risk and Monopoly were unapologetically imperialistic.  Every Bond plot was about world domination.

Today, many of us find these ideas offensive.  To start with, imperialism is an outdated concept founded on the assumption of superiority of some cultures over others.  The idea of harnessing all planetary resources is an extension of imperialistic mentality, one that adds all other life forms to the entities that we need to dominate.  Controlling planetary resources for the sake of humanity is tantamount to stealing those same resources from other species that may need them.  Further, our attempt to control resources and technology can lead to some catastrophic outcomes.  Nuclear Armageddon, grey goo, overpopulation, global warming, planetary pollution, and (human-caused) mass extinctions are all examples of potentially disastrous consequences of attempts to dominate nature or technology without fully understanding what we are doing.

I argue in “Alien Hunters Still Thinking Inside The Box (or Dyson Sphere)” that attempting to fully harness all of the energy from the sun is increasingly unnecessary and irrelevant to our evolution as a species.  Necessary energy consumption per capita is flattening for developing cultures and declining for mature ones.  Technological advances allow us to get much more useful output from our devices as time goes forward.  And humanity is beginning to de-emphasize raw size and power as desirable attributes (for example, see right-sizing economic initiatives) and instead focus on the value of consciousness.

Certainly, the hallmarks of advanced civilizations are not going to be anachronistic metrics of how much energy they can harness.  So what metrics might be useful?

How about:  Have they gotten off their planet?  Have they gotten out of their solar system?  Have they gotten out of their galaxy?

Somehow, I feel that even this is misleading.  Entanglement shows that everything is interconnected.  The observer effect demonstrates that consciousness transcends matter.  So perhaps the truly advanced civilizations have learned that they do not need to physically travel, but rather mentally travel.

How about: How little of an impact footprint do they leave on their planet?

The assumption here is that advanced civilizations follow a curve like the one below, whereby early in their journey they have a tendency to want to consume resources, but eventually evolve to have less and less of a need to consume or use energy.

[Figure: resource consumption over the course of a civilization’s evolution]

How about: What percentage of their effort is expended upon advancing the individual versus the society, the planetary system, or the galactic system?

or…

How about: Who cares?  Why do we need to assign a level to a civilization anyway?  Is there some value to having a master list of the evolutionary stages of advanced life forms?  So that we know who to keep an eye on?  That sounds very imperialistic to me.

Of course, I am as guilty of musing about the idea of measuring the level of evolution of a species through a 2013 cultural lens as Kardashev was of doing so through a 1964 cultural lens.  But still, it is 50 years hence, and time to either revise or retire an old idea.

Signs of Real Humanity Evolution

We humans spend a lot of time talking about and worrying about and doing stuff that doesn’t really matter in the grand scheme of things.  And I’m not just talking about the obvious things, like Real Housewives of New Jersey.  I mean stuff that the media and schools tell us is important.  Including stuff that I used to think was important – like politics, GMOs, how “big chemical” is poisoning us, the revolving door between government, finance and business, widespread corruption, and how the financial elite continues to fleece us peons.  I still think these things are important, because they impact those whom we care about.

But, from a broader, larger, and historical perspective, which party wins the election, what the stock market does, which countries rise or fall, or who wins the war, are, to quote Dr. Evil, “quite inconsequential.”  It’s like rearranging deck chairs on the Titanic.  Everyone dies and new humans take their place.  Every civilization declines and new ones rise.

Where is the evolution of humanity?

Evolution Devolution

It is a mistake to think it is found in science or technology.  As much as I love gadgets and the bleeding edge of high tech, products do not equate to the evolution of a species.  Cloning, nuclear power, nanotech, and 3D printing are not signs of human evolution.  They are merely examples of our ability to control matter.

Nor are medical advances that extend our life expectancy.  Does it really make sense that living longer evolves our species in some way?  On the contrary, it only causes more problems.  Humans now have to compete with an ever-increasing number of humans for limited resources.  Instead of dying quickly of natural causes, we live past our natural life expectancy and instead die slowly and miserably, often without dignity.  And the increasing human population rapidly destroys the habitats of countless species of other conscious life forms, as well as using them for cruel experimental medical research, which continues the cycle and only serves to make big pharmaceutical companies even bigger.  It is all based on the mistaken assumption that we live in a cold, materialistic, objective reality.  Hardly evolved thinking.

All is not lost, however.  Despite the war profiteers and religious nuts who contribute to the devolution of humanity, we slowly progress in the right direction.  From the Magna Carta in 1215, to the United States Bill of Rights in 1789, to broader recognition of gender and racial equality in the 20th century, to the fact that homicide rates in Europe have dropped by a factor of 30 over the past 500 years, the trend represents positive evolution.

Another good sign is the general trend away from religious dogma and toward spiritual growth.  The percentage of Americans who don’t identify with a particular religious preference has grown steadily from 2% in 1950 to 16% in 2010.  At the same time, the desire for “spiritual growth” has increased from 58% in 1994 to 82% five years later, according to a USA Today/Gallup poll.  Why is this a sign of evolution?  Because religious dogma teaches you that you are right and they are wrong, while unaffiliated spiritual discovery almost always results in the recognition that love is what really matters.

And finally, my favorite sign of human evolution is exemplified by what India did a few months ago in declaring dolphins to be “non-human persons” with similar rights including the right not to be held captive.  Three other countries have similar laws and more are sure to follow.  I believe that this, along with a significant trend away from cruel animal practices (think free range chickens, more vegetarians, and the growth of no-kill shelters) is a sure sign that more and more humans are recognizing that they aren’t the only ones with rights on this planet.  Truly evolved thinking.

So maybe there’s hope for us yet.

Alien Hunters Still Thinking Inside The Box (or Dyson Sphere)

As those who are familiar with my writing already know, I have long thought that the SETI program was highly illogical, for a number of reasons, some of which are outlined here and here.

To summarize, it is the height of anthropomorphic and unimaginative thinking to assume that ET will evolve just like we did and develop radio technology at all.  Even if they did, and followed a technology evolution similar to our own, the era of high-powered radio broadcasts should be insignificant in relation to the duration of their evolutionary history.  Even in our own case, that era is almost over, as we are moving to highly networked and low-powered data communication (e.g. Wi-Fi), which is barely detectable a few blocks away, let alone light years.  And even if we happened to overlap a 100-year radio broadcast era of a civilization in our galactic neighborhood, they would still never hear us, and vice versa, because the signal level required to reliably communicate around the world becomes lost in the noise of the cosmic microwave background radiation before it even leaves the solar system.

So, no, SETI is not the way to uncover extraterrestrial intelligences.

Dyson Sphere

Some astronomers are getting a bit more creative and are beginning to explore some different ways of detecting ET.  One such technique hinges on the concept of a Dyson Sphere.  Physicist Freeman Dyson postulated the idea in 1960, theorizing that advanced civilizations will continuously increase their demand for energy, to the point where they need to capture all of the energy of the star that they orbit.  A possible mechanism for doing so could be a swarm of satellites surrounding the star and collecting all of its energy.  Theoretically, the signature of a distant Dyson Sphere would be a region of space emitting no visible light but generating high levels of infrared radiation as waste heat.  Some astronomers have mapped the sky over the years, searching for such signatures, but to no avail.

Today, a team at Penn State is resuming the search via data from the infrared observatories WISE and Spitzer.  Another group, from Princeton, has also joined the search, but is using a different technique, looking for dimming patterns in the data.

I applaud these scientists who are expanding the experimental boundaries a bit.  But I doubt that Dyson Spheres are the answer.  There are at least two flaws with this idea.

First, the assumption that we will continuously need more energy is false.  Part of the reason for this is the fact that once a nation has achieved a particular level of industrialization and technology, there is little to drive further demand.  The figure below, taken from The Atlantic article “A Short History of 200 Years of Global Energy Use” demonstrates this clearly.

[Figure: per-capita energy consumption over 200 years]

In addition, technological advances make it cheaper to obtain the same general benefit over time.  For example, in terms of computing, capacity per watt has increased by a factor of over one trillion in the past 50 years.  Dyson was unaware of this trend because Moore’s Law wasn’t postulated until 1965.  Even in the highly corrupt oil industry, with its collusion, lobbying, and artificial scarcity, performance per gallon of gas has steadily increased over the years.
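As a rough check on that trillion-fold figure, a few lines of Python show what it implies (the factor and time span are this article’s round numbers, not precise measurements):

```python
import math

# Back-of-envelope: a trillion-fold gain in computing capacity per watt
# over 50 years implies a doubling roughly every 15 months -- close to
# the Moore's-law cadence.  Inputs are the article's round numbers.
factor = 1e12
years = 50

doublings = math.log2(factor)                  # ~39.9 doublings
months_per_doubling = years * 12 / doublings   # ~15 months per doubling
```

In other words, the trend Dyson could not have anticipated compounds at roughly the pace Moore later described.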

The second flaw with the Dyson Sphere argument is the more interesting one – the assumptions around how humans will evolve.  I am sure that in the booming 1960s, it seemed logical that we would be driven by the need to consume more and more, controlling more and more powerful tools as time went on.  But, all evidence actually points to the contrary.

We are in the beginning stages of a new facet of evolution as a species.  Not a physical one, but a consciousness-oriented one.  Quantum Mechanics has shown us that objective reality doesn’t exist.  Scientists are so frightened by the implications of this that they are for the most part in complete denial.  But the construct of reality is looking more and more like it is simply data.  And the evidence is overwhelming that consciousness is controlling the body and not emerging from it.  As individuals are beginning to understand this, they are beginning to recognize that they are not trapped by their bodies, nor this apparent physical reality.

Think about this from the perspective of the evolution of humanity.  If this trend continues, why will we even need the body?

Robert Monroe experienced a potential future (1000 years hence), which may be very much in line with the mega-trends that I have been discussing on theuniversesolved.com: “No sound, it was NVC [non-vocal communication]! We made it! Humans did it! We made the quantum jump from monkey chatter and all it implied.” (“Far Journeys”)

We may continue to use the (virtual) physical reality as a “learning lab”, but since we won’t really need it, neither will we need the full energy of the virtual star.  And we can let virtual earth get back to the beautiful virtual place it once was.

THIS is why astronomers are not finding any sign of intelligent life in outer space, no matter what tools they use.  A sufficiently advanced civilization does not communicate using monkey chatter, nor any technological carrier like radio waves.

They use consciousness.

So will we, some day.

You Are Not Your Body

The debate rages on, but those of us who have done the research know which side is true.

We are NOT our bodies.

I am posting this as a reference to all of the excellent scientific research that has been done around this topic so that I can easily refer to it during future blog posts.  For example…

– Gary Schwartz, Harvard-educated professor of psychology, medicine, neurology, psychiatry, and surgery at the University of Arizona, has published extensive research in peer-reviewed journals and several books, such as “The Afterlife Experiments: Breakthrough Scientific Evidence of Life After Death”, where he states: “consciousness exists independently of the brain. It does not depend upon the brain for its survival. Mind is first, the brain is second. The brain is not the creator of mind; it is a powerful tool of the mind. The brain is an antenna/receiver for the mind, like a sophisticated television or cell phone.”

– Here are 290 scientific papers on NDEs, such as: K. Ring and M. Lawrence, Further evidence for veridical perception during near-death experiences, Journal of Near-Death Studies, 11 (1993), pp. 223-229, which provide evidence and support for the theory that consciousness is separate from the brain.

– Research by the University of Virginia School of Medicine Division of Perceptual Studies includes a compilation of 12 books on reincarnation, 39 articles and research papers, 3 books on NDEs, and 71 articles and research papers, all supporting the evidence that we are not our bodies.

– “Irreducible Mind: Toward a Psychology for the 21st Century” was written by six interdisciplinary scientists who present years of evidence and thought leading to the conclusion that the mind is “an entity independent of the brain or body.”

– Cardiologist Pim van Lommel presents 20 years of research and supporting scientific data on Near Death Experiences in his book “Consciousness Beyond Life: The Science of the Near-Death Experience”: “Ultimately, we cannot avoid the conclusion that endless consciousness has always been and always will be, independent of the body.”

– Harvard-educated neurosurgeon Eben Alexander explains in this article about his new book that “the brain itself doesn’t produce consciousness”; it is, instead, “a kind of reducing valve or filter, shifting the larger, nonphysical consciousness that we possess in the nonphysical worlds down into a more limited capacity for the duration of our mortal lives.”

– Kenneth Ring is a Professor Emeritus of Psychology at the University of Connecticut.  In his new book, “Mindsight: Near-Death and Out-of-Body Experiences in the Blind”, he documents 31 cases of blind people whose OBEs and NDEs not only yielded “knowledge of facts they could only have learned through a faculty like vision”, but were also corroborated by relevant eyewitnesses.

There is much more – this barely scratches the surface.  Don’t take my word for it; do your own research.  If you maintain an open mind, you will find that there is a boatload of supporting evidence that consciousness is separate from the brain.  And pretty much no evidence to the contrary.

And yet, the idea is heretical in scientific circles.  Because it is not understood, it is scary to the closed-minded.

The Power of Intuition in the Age of Uncertainty

Have you ever considered why it is that you decide some of the things that you do?

Like how to divide your time across the multiple projects that you have at work, when to discipline your kids, what to do on vacation, who to marry, what college to attend, which car to buy?

The ridiculously slow way to figure these things out is to do an exhaustive analysis on all of the options, potential outcomes and probabilities.  This can be extremely difficult when the parameters of the analysis are constantly changing, as is often the case.  Such analysis is making use of your conscious mind.

The other option is to use your subconscious mind and make a quick intuitive decision.

We who have been educated in the West, and especially those of us who received our training in engineering or the sciences, are conditioned to believe that “analysis” represents rigorous logical scientific thinking and “intuition” represents new age claptrap or occasional maternal wisdom.  Analysis good, intuition silly.

This view is quite inaccurate.

According to Gary Klein, ex-Marine, psychologist, and author of the book “The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work,” 90% of the critical decisions that we make are made by intuition in any case.  Intuition can actually be a far more accurate and certainly faster way to make an important decision.  Here’s why…

Consider the mind to be composed of two parts – conscious and subconscious.  Admittedly, this division may be somewhat arbitrary, but it is also realistic.

The conscious mind is that part of the mind that deals with your current awareness (sensations, perceptions, memories, feelings, fantasies, etc.)  Research shows that the information processing rate of the conscious mind is actually very low.  Tor Nørretranders, author of “The User Illusion”, estimates the rate at only 16 bits per second.  Dr. Timothy Wilson from the University of Virginia estimates the conscious mind’s processing capacity to be a little higher, at 40 bits per second.  In terms of the number of items that can be retained at one time by the conscious mind, estimates vary from 4 to 7, with the lower number being reported in a 2008 study by the National Academy of Sciences.

Contrast that with the subconscious mind, which is responsible for all sorts of things: autonomic functions, subliminal perceptions (all of that data streaming in to your five sensory interfaces that you barely notice), implicit thought, implicit learning, automatic skills, association, implicit memory, and automatic processing.  Much of this can be combined into what we consider “intuition.”  Estimates for the information processing capacity and storage capacity of the subconscious mind vary widely, but they are all orders of magnitude larger than their conscious counterparts.  Dr. Bruce Lipton, in “The Biology of Belief,” notes that the processing rate is at least 20 Mbits/sec and may be as high as 400 Gbits/sec.  Estimates of storage capacity run as high as 2.5 petabytes (2,500,000,000,000,000 bytes).
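The gap between those two sets of figures can be made concrete with a little arithmetic (all inputs are the sources’ rough estimates quoted above, which vary widely):

```python
# Ratio of the cited subconscious to conscious processing rates.
# These figures are rough estimates from the sources above, not
# measurements.
conscious_bps = 40              # Wilson's estimate, bits per second
subconscious_low_bps = 20e6     # Lipton's lower bound, bits per second
subconscious_high_bps = 400e9   # Lipton's upper bound, bits per second

ratio_low = subconscious_low_bps / conscious_bps    # 500,000x
ratio_high = subconscious_high_bps / conscious_bps  # 10,000,000,000x
```

Even at the conservative end, the subconscious channel is half a million times wider.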

Isn’t it interesting that the rigorous analysis that we are so proud of is effectively done on a processing system that is excruciatingly slow and has little memory capacity?

Whereas intuition is effectively done on a processing system that is blazingly fast and contains an unimaginable amount of data.  (As an aside, there is actually significant evidence that the subconscious mind connects with powerful data and processing elements outside of the brain, which only serves to underscore the message of this post.)

Kind of gives you a little more respect for intuition, doesn’t it?

In fact, that’s what intuition is – the same analysis that you might consider doing consciously, but doing it instead with access to far more data, such as your entire wealth of experience, and the entire set of knowledge to which you have ever been exposed.

Sounds great, right?  It might be a skill that could be very useful to hone, if possible.

But the importance of intuition only grows exponentially as time goes on.  Here’s why…

Eddie Obeng is a professor at the School of Entrepreneurship and Innovation at Henley Business School in the UK.  He gave a TED talk which nicely captured the essence of our times in terms of information overload.  The following chart from that talk demonstrates what we all know and feel is happening to us:

[Chart: incoming information rate vs. human capacity to absorb it, over time]

The horizontal axis is time, with “now” being all the way to the right.  The vertical axis depicts information rate.

The green curve represents the rate at which we humans can absorb information, aka “learn.”  It doesn’t change much over time, because our biology stays pretty much the same.

The red curve represents the rate at which information is coming at us.

Clearly, there was a time in the past when we had the luxury of being able to take the necessary time to absorb all of the information needed to understand the task or project at hand.  If you are over 40, you probably remember working in such an environment.  At some point, however, the incoming data rate exceeded our capacity to absorb it.  TV news with two or three rolling tickers, tabloids, zillions of web sites to scan, Facebook posts, tweets, texts, blogs, social networks, information repositories, big data, etc.  For some of us, it happened a while ago; for others, more recently.  I’m sure there are still some folks living simpler lives on farms in rural areas who haven’t passed the threshold yet.  But they aren’t reading this blog.  As for the rest of us…

It is easy to see that as time goes on, the ratio of unprocessed incoming information to human learning capacity grows exponentially.  What this means is that there is increasingly more uncertainty in our world, because we just don’t have the ability to absorb the information needed to be “certain”, like we used to.  Some call it “The Age of Uncertainty.”  Some refer to the need to be “comfortable with ambiguity.”

This is a true paradigm shift.  A “megatrend.”   It demands entirely new ways of doing business, of structuring companies, of planning, of living.  In my “day job”, I help companies come to terms with these changes by implementing agile and lean processes, structures, and frameworks in order for them to be more adaptable to the constantly changing environment.  But this affects all of us, not just companies.  How do we cope?

One part of the answer is to embrace intuition.  We don’t have time to use the limited conscious mind apparatus to do rigorous analysis to solve our problems anymore.  As time goes on, that method becomes less and less effective.  But perhaps we can make better use of that powerful subconscious mind apparatus by paying more attention to our intuition.  It seems to be what some of our most successful scientists, entrepreneurs, and financial wizards are doing:

George Soros said: “My [trading] decisions are really made using a combination of theory and instinct. If you like, you may call it intuition.”

Albert Einstein said: “The intellect has little to do on the road to discovery. There comes a leap in consciousness, call it intuition or what you will, and the solution comes to you, and you don’t know how or why.”  He also said: “The only real valuable thing is intuition.”

Steve Jobs said: “Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition.”

So how do the rest of us start paying more attention to our intuition?  Here are some ideas:

  • Have positive intent and an open mind
  • Go with the first thing that comes to mind
  • Notice impressions, connections, coincidences (a journal or buddy may help)
  • Put yourself in situations where you gain more experience about the desired subject(s)
  • 2-column exercises
  • Meditate / develop point-focus
  • Visualize success
  • Follow your path

I am doing much of this and finding it very valuable.

Complexity from Simplicity – More Support for a Digital Reality

Simple rules can generate complex patterns or behavior.

For example, consider the following simple rules that, when programmed into a computer, can result in beautiful complex patterns akin to a flock of birds:

1. Steer to avoid crowding local flockmates (separation)
2. Steer towards the average heading of local flockmates (alignment)
3. Steer to move toward the average position (center of mass) of local flockmates (cohesion)
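A minimal Python sketch of those three rules follows; the weights and neighbor radius here are illustrative choices, not Reynolds’ tuned parameters:

```python
import math

def step(boids, sep_dist=2.0, w_sep=0.05, w_ali=0.05, w_coh=0.01):
    """Advance a flock one tick.  Each boid is [x, y, vx, vy]."""
    n = len(boids)
    new = []
    for i, (x, y, vx, vy) in enumerate(boids):
        sx = sy = ax = ay = cx = cy = 0.0
        for j, (ox, oy, ovx, ovy) in enumerate(boids):
            if i == j:
                continue
            # Rule 1: separation -- steer away from crowding neighbors
            if math.hypot(ox - x, oy - y) < sep_dist:
                sx += x - ox
                sy += y - oy
            # Rule 2: alignment -- accumulate neighbors' headings
            ax += ovx
            ay += ovy
            # Rule 3: cohesion -- accumulate neighbors' positions
            cx += ox
            cy += oy
        m = n - 1  # number of flockmates
        vx += w_sep * sx + w_ali * (ax / m - vx) + w_coh * (cx / m - x)
        vy += w_sep * sy + w_ali * (ay / m - vy) + w_coh * (cy / m - y)
        new.append([x + vx, y + vy, vx, vy])
    return new
```

Iterating `step` over a few hundred randomly placed boids is enough to produce the flocking behavior shown in the video.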

The pseudocode here demonstrates the simplicity of the algorithm.  The following YouTube video is a demonstration of “Boids”, a flocking behavior simulator developed by Craig Reynolds:

Or consider fractals.  The popular Mandelbrot set can be generated with some simple rules, as demonstrated here in 13 lines of pseudocode, resulting in beautiful pictures like this:

https://i1.wp.com/upload.wikimedia.org/wikipedia/commons/thumb/a/a4/Mandel_zoom_11_satellite_double_spiral.jpg/800px-Mandel_zoom_11_satellite_double_spiral.jpg
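The escape-time iteration behind images like this really is only a few lines.  A Python version (the iteration cap of 100 is an arbitrary choice):

```python
# Escape-time test for Mandelbrot membership: iterate z -> z^2 + c and
# count how long |z| stays bounded.  The count drives the coloring.
def mandelbrot(c, max_iter=100):
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2.0:   # |z| > 2 guarantees divergence
            return n       # escaped: color this pixel by iteration count
        z = z * z + c
    return max_iter        # never escaped: assumed inside the set

# Mapping each pixel of an image to a complex number c and calling
# mandelbrot(c) yields pictures like the one above.
```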

Fractals can be used to generate artificial terrain for video games and computer art, such as this 3D mountain terrain generated by the software Terragen:

Terragen-generated mountain terrain

Conway’s Game of Life uses the idea of cellular automata to generate little 2D pixelated creatures that move, spawn, die, and generally exhibit crude lifelike behavior with two simple rules:

1. A live cell with fewer than 2 or more than 3 neighbors dies.
2. A dead cell with exactly 3 neighbors comes alive.

Depending on the starting conditions, there may be any number of recognizable resulting simulated organisms; some simple, such as gliders, pulsars, blinkers, glider guns, and wickstretchers, and some complex, such as puffer trains, rakes, spaceship guns, Corderships, and even objects that appear to travel faster than the maximum propagation speed of the game should allow:
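The whole update rule fits in a few lines of Python if live cells are kept as a set of coordinates; the blinker, one of the simplest oscillators, makes a handy demonstration:

```python
from collections import Counter

def life_step(live):
    """One Game of Life tick: a cell survives with 2-3 live neighbors,
    and a dead cell with exactly 3 live neighbors comes alive."""
    # Count, for every cell adjacent to a live cell, how many live
    # neighbors it has.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" (three cells in a row) oscillates with period 2.
blinker = {(0, 0), (1, 0), (2, 0)}
```

Two applications of `life_step` return the blinker to its original horizontal orientation.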

Cellular automata can be extended to 3D space.  The following video demonstrates a 3D “Amoeba” that looks eerily like a real blob of living protoplasm:

What is the point of all this?

Just that you can apply some of these ideas to the question of whether reality is continuous or digital (and thus based on bits and rules).  And end up with an interesting result.

Consider a hierarchy of complexity levels…

Imagine that each layer is 10 times “zoomed out” from the layer below.  If the root simplicity is at the bottom layer, one might ask how many layers up you have to go before the patterns appear to be natural, as opposed to artificial? [Note: As an aside, we are confusing ideas like natural and artificial.  Is there really a difference?]

The following image is an artificial computer-generated fractal image created by Softology’s “Visions of Chaos” software from a base set of simple rules, yet zoomed out from its base level by, perhaps, six orders of magnitude:

[Image: hybrid Mandelbulb fractal generated by Visions of Chaos]

In contrast, the following image is an electron-microscope image of a real HPV virus:

[Image: electron micrograph of a virus]

So, clearly, at six orders of magnitude out from a fundamental rule set, we start to lose the ability to discern “natural” from “artificial.”  Eight orders of magnitude should be sufficient to make natural indistinguishable from artificial.

And yet, our everyday sensory experience is about 36 orders of magnitude above the quantum level.

The deepest level that our instruments can currently image is about 7 levels (10,000,000x magnification) below reality.  This means that if our reality is based on bits and simple rules like those described above, those rules may be operating 15 or more levels below everyday reality.  Given that the quantum level is 36 levels down, we have at least 21 orders of magnitude to play with.  In fact, it may very well be possible that the true granularity of reality is below the quantum level.
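The level-counting above, spelled out in a few lines (every input is this essay’s assumption rather than a measurement):

```python
# Orders of magnitude between everyday experience and a hidden rule set.
quantum_depth = 36   # everyday scale sits ~36 orders above the quantum level
imaged_depth = 7     # deepest current imaging (~10,000,000x magnification)
blur_depth = 8       # depth at which artificial becomes indistinguishable

rule_depth = imaged_depth + blur_depth   # rules could hide 15+ levels down
headroom = quantum_depth - rule_depth    # levels left above the quantum floor
```

With 21 levels of headroom, a fundamental rule set could sit anywhere between our instruments’ reach and the quantum level, or even below it.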

In any case, it should be clear that we are not even close to being equipped to visually discern the difference between living in a continuous world or a digital one consisting of bits and rules.

My Body, the Avatar

Have you ever wondered how much information the human brain can store?  A little analysis reveals some interesting data points…

The human brain contains an estimated 100 trillion synapses.  There doesn’t appear to be a finer level of structure to the neural cells, so this represents the maximum number of memory elements that a brain can hold.  Assume for a moment that each synapse can hold a single bit; then the brain’s capacity would be 100 trillion bits, or about 12.5 terabytes. There may be some argument that there is actually a distribution of brain function, or redundancy of data storage, which would reduce the memory capacity of the brain.  On the other hand, one might argue that synapses may not be binary and hence could hold somewhat more information.  So it seems that 12.5 TB is a fairly good and conservative estimate.

It has also been estimated (see “On the Information Processing Capabilities of the Brain: Shifting the Paradigm” by Simon Berkovich) that, in a human lifetime, the brain processes 3 million times that much data.  This all makes sense if we assume that most (99.99997%) of our memory data is discarded over time, due to lack of need.
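The arithmetic behind those figures is simple enough to lay out explicitly (the inputs are the cited estimates, including the conservative one-bit-per-synapse assumption):

```python
# Brain-capacity arithmetic using the estimates cited in the text.
synapses = 100e12        # ~100 trillion synapses
bits_per_synapse = 1     # conservative assumption
capacity_bits = synapses * bits_per_synapse
capacity_tb = capacity_bits / 8 / 1e12      # -> 12.5 terabytes

lifetime_bits = 3e6 * capacity_bits         # Berkovich's lifetime estimate
discarded_pct = (1 - capacity_bits / lifetime_bits) * 100  # ~99.99997%
```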

But then, how would we explain the exceptional capabilities of autistic savants, or people with hyperthymesia or eidetic memory (total recall)?  It must be that the memories these individuals retrieve cannot all be stored in the brain at the same time.  In other words, memories, or the record of our experiences, are not solely stored in the brain.  Some may be, such as those most recently used, or frequently needed.

Those who are trained in Computer Science will recognize the similarities between these characteristics and the idea of a cache memory, a high speed storage device that stores the most recently used, or frequently needed, data for quick access.

As cardiologist and science researcher Pim van Lommel said, “the computer does not produce the Internet any more than the brain produces consciousness.”

Why is this so hard to believe?

After all, there is no real proof that all memories are stored in the brain.  There is only research showing that some memories are stored in the brain and can be triggered by electrically stimulating certain portions of the cerebral cortex.  By the argument above, I would say that experimental evidence and logic are on the side of non-local memory storage.

In a similar manner, while there is zero evidence that consciousness is an artifact of brain function, Dr. van Lommel has shown that there is extremely strong evidence that consciousness is not a result of brain activity.  It is enabled by the brain, but not seated there.

These two arguments – the non-local seat of consciousness and the non-local seat of memories are congruent and therefore all the more compelling for the case that our bodies are simply avatars.