Flexi Matter

Earlier this year, a team of scientists at the Max Planck Institute of Quantum Optics, led by Randolf Pohl, made a highly accurate measurement of the radius of the proton and, at 0.841 fm, it turned out to be about 4% smaller than previously determined (0.877 fm).  Trouble is, the previous measurements were also highly accurate.  The significant difference between the two types of measurement was the choice of interaction particle: in the traditional case, electrons, and in Pohl’s case, muons.

Figures have been checked and rechecked and both types of measurements are solid.  All sorts of crazy explanations have been offered up for the discrepancy, but one thing seems certain: we don’t really understand matter.

Ancient Greeks thought that atoms were indivisible (hence, the name), at least until Rutherford showed otherwise in the early 1900s.  Ancient 20th-century scientists thought that protons were indivisible, at least until Gell-Mann showed otherwise in the 1960s.

So why would it be such a surprise that the measured size of a proton varies with the type of lepton cloud that surrounds and passes through it?  Maybe the proton is flexible, like a sponge, and a muon, at roughly 200 times the mass of an electron, exerts a much higher contractive force on it – gravity, strong nuclear, Jedi, or what have you.  Just make the measurements and modify your theory, guys.  You’ll be .000001% closer to the truth, enough to warrant an even bigger publicly funded particle accelerator.

If particle sizes and masses aren’t invariant, who is to say that they don’t change over time?  Cosmologist Christof Wetterich of the University of Heidelberg thinks this might be possible.  In fact, says Wetterich, if particles are slowly increasing in mass, the universe may not be expanding after all.  His recent paper suggests that spectral red shift, Hubble’s famous discovery at Mount Wilson that led to the most widely accepted theory of the universe (the big bang), may actually be due to changing particle masses over time.  So far, no one has been able to shoot a hole in his theory.

Oops.  “Remember what we said about the big bang being a FACT?  Never mind.”

Flexi-particles.  Now there is evidence, and there are major philosophical repercussions.

And still, The Universe – Solved! predicts there is no stuff.

The ultimate in flexibility is pure data.


The Ultimate Destiny of the Nature of Matter is Something Very Familiar

Extrapolation is a technique for projecting a trend into the future.  It has been used liberally by economists, futurists, and other assorted big thinkers for many years, to project population growth, food supply, market trends, singularities, technology directions, skirt lengths, and other important trends.  It goes something like this:

If a city’s population has been growing by 10% per year for many years, one can safely predict that it will be around 10% higher next year, 21% higher in two years, and so on.  Or, if chip density has been increasing by a factor of 2 every two years (as it has for the past 40), one can predict that it will be 8 times greater than today in six years (Moore’s Law).  Ray Kurzweil and other Singularity fans extrapolate technology trends to conclude that our world as we know it will come to an end in 2045 in the form of a technological singularity.  Of course there are always unknown and unexpected events that can cause these predictions to be too low or too high, but given the information that is known today, it is still a useful technique.
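
Here is that arithmetic as a minimal Python sketch (the growth rates and time spans are just the illustrative figures from the paragraph above, not data):

# Compound growth: 10% per year means multiplying by 1.1 each year
population_growth = 1.10
print((population_growth ** 2 - 1) * 100)   # ~21% higher after two years

# Moore's Law: chip density doubles every two years
years = 6
print(2 ** (years / 2))                     # 8 times today's density after six years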

To my knowledge, extrapolation has not really been applied to the problem that I am about to present, but I see no reason why it couldn’t give an interesting projection…

…for the nature of matter.

In ancient Greece, Democritus put forth the idea that solid objects were composed of atoms of that element or material, either jammed tightly together, as in the case of a solid object, or separated by a void (space).  These atoms were thought to be little indivisible billiard-ball-like objects made of some sort of “stuff.”  Thinking this through a bit, it was apparent that if atoms were spherical and crammed together in an optimal fashion, then matter was essentially 74% of the space that it takes up, the rest being void, or empty space.  So, for example, a solid bar of gold was really only 74% gold “stuff,” at most.
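
That 74% figure is just the densest possible packing of identical spheres; a one-line check in Python:

import math
# Densest packing of equal spheres (face-centered cubic): pi / (3 * sqrt(2))
print(math.pi / (3 * math.sqrt(2)))   # ~0.7405, i.e. about 74% "stuff", 26% void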

That view of matter was resurrected by John Dalton in the early 1800s and revised once J. J. Thomson discovered electrons.  At that point, atoms were thought to look like plum pudding, with electrons embedded in a pudding of positive charge.  Still, the density of “stuff” didn’t change, at least until the early 1900s when Ernest Rutherford determined that atoms were actually composed of a tiny dense nucleus and a shell of electrons.  Further measurements revealed that these subatomic particles (protons, electrons, and later, neutrons) were actually very tiny compared to the overall atom and, in fact, most of the atom was empty space.  That model, coupled with a realization that atoms in a solid actually had to have some distance between them, completely changed our view on how dense matter was.  It turned out that in our gold bar only 1 part in 10^15 was “stuff.”
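
A very rough, round-number sketch of where a figure like 1 part in 10^15 comes from (the radii below are order-of-magnitude assumptions, not measured values for gold):

r_nucleon = 1e-15   # meters, rough order of magnitude for a proton or neutron
r_atom    = 1e-10   # meters, rough order of magnitude for an atom
print((r_nucleon / r_atom) ** 3)   # ~1e-15: volume scales as the cube of the radius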

That was, until the mid-1960s, when quark theory was proposed, which said that protons and neutrons were actually composed of three quarks each.  As the theory (aka QCD) is now fairly accepted and some measurement estimates have been made of quark sizes, one can calculate that since quarks are between a thousand and a million times smaller than the subatomic particles that they make up, matter is now 10^9 to 10^18 times more tenuous than previously thought.  Hence our gold bar is now only about 1 part in 10^30 (give or take a few orders of magnitude) “stuff,” and the rest is empty space.  By way of comparison, about 1.3×10^32 grains of sand would fit inside the earth.  So matter is roughly as dense with “stuff” as one grain of sand is to our entire planet.
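
The sand-grain comparison is easy to sanity-check with round numbers (the grain size below is an assumption; coarser sand shifts the answer by an order of magnitude or two):

import math
r_earth = 6.37e6          # meters
r_grain = 0.000125        # meters, i.e. a fine ~0.25 mm grain of sand
v_earth = 4 / 3 * math.pi * r_earth ** 3
v_grain = 4 / 3 * math.pi * r_grain ** 3
print(v_earth / v_grain)  # ~1.3e32 grains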

So now we have three data points to start our extrapolation.  Since the fraction of matter that is “stuff” is shrinking exponentially over time, we can’t plot our trend on linear scales; we need logarithmic scales.
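
For the curious, here is a minimal sketch of that plot in Python, using only the rough dates and fractions mentioned in this post:

import matplotlib.pyplot as plt

years     = [-400, 1911, 1964]      # Democritus, Rutherford, quark theory (approximate)
fractions = [0.74, 1e-15, 1e-30]    # rough fraction of our gold bar that is "stuff"

plt.plot(years, fractions, "o-")
plt.yscale("log")                   # the trend is invisible on a linear scale
plt.xlabel("year of the prevailing model")
plt.ylabel('fraction of matter that is "stuff"')
plt.show()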

And now, of course, we have string theory, which says that all subatomic particles are really just bits of string vibrating at specific frequencies, each string possibly having a width of the Planck length.  If so, all but about 1 part in 10^38 of a subatomic particle would be empty space, leaving our gold bar with just 1 part in 10^52 of “stuff.”

Gets kind of ridiculous, doesn’t it?  Doesn’t anyone see where this is headed?

In fact, if particles are composed of strings, why do we even need the idea of “stuff”?  Isn’t it enough to define the different types of matter by a single number – the frequency at which the string vibrates?

What is matter anyway?  It is a number assigned to a type of object that has to do with how that object behaves in a gravitational field.  In other words, it is just a rule.

We don’t really experience matter.  What we experience is electromagnetic radiation influenced by some object that we call matter (visual), and the effect of the electromagnetic force rule due to the repulsion of charges between the electron shells of the atoms in our fingers and the electron shells of the atoms in the object (tactile).

In other words, rules.

In any case, if you extrapolate our scientific progress, it is easy to see that the ratio of “stuff” to “space” is trending toward zero.  Which means what?

That matter is most likely just data.  And the forces that cause us to experience matter the way we do are just rules about how data interacts with itself.

Data and Rules – that’s all there is.

Oh yeah, and Consciousness.


Things We Can’t Feel – The Mystery Deepens

In my last blog “Things We Can’t See”, we explored the many different ways that our eyes, brains, and/or technology can fool us into seeing something that isn’t there or not seeing something that is.

So apparently, our sense of sight is not necessarily the most reliable sense in terms of identifying what is and isn’t in our objective reality.  We would probably suspect that our sense of touch is fairly foolproof; that is, if an object is “there”, we can “feel” it, right?

Not so fast.

First of all, we have a lot of the same problems with the brain as we did with the sense of sight.  The brain processes all of that sensory data from our nerve endings.  How do we know what the brain really does with that information?  Research shows that sometimes your brain can think that you are touching something that you aren’t, or vice versa.  People who have lost limbs still have sensations in their missing extremities.  Hypnosis has been shown to have a significant effect on pain control, which seems to indicate the mind’s capacity to override one’s tactile senses.  And virtual reality experiments have demonstrated that the mind can be fooled into feeling something that isn’t there.

In addition, technology can be made to wreak havoc with our sense of touch, although the most dramatic of such effects are probably decades away.  Let me explain…

Computer scientist J. Storrs Hall developed the concept of a “Utility Fog.”  Imagine a “nanoscopic” object called a Foglet: an intelligent nanobot, capable of communicating with its peers and equipped with arms that can hook together to form larger structures.  Trillions of these Foglets could conceivably fill a room and not be at all noticeable as long as they were in “invisible mode.”  In fact, not only might they be programmed to appear transparent to the sight, but they might also be imperceptible to the touch.  This is not hard to imagine, if you allow that they could have sensors that detect your presence.  For example, if you punch your fist into a swarm of nanobots programmed to be imperceptible, they would sense your motion and move aside as you swung your fist through the air.  But at any point, they could conspire to form a structure – an impenetrable wall, for example.  And then your fist would be well aware of their existence.  In this way, technology may be able to have a dramatic effect on our ability to determine what is really “there.”


But even now, long before nanobot swarms are possible, the mystery really begins, as we have to dive deeply into what is meant by “feeling” something.

Feeling is the result of a part of our body coming in contact with another object.  That contact is “felt” by the interaction between the molecules of the body and the molecules of the object.

Even solid objects are mostly empty space.  If subatomic particles, such as neutrons, are made of solid mass, like little billiard balls, then 99.999999999999% of normal matter would still be empty space.  That is, of course, unless those particles themselves are not really solid matter, in which case even more of space is truly empty – more on that in a bit.

So why don’t solid objects like your fist slide right through other solid objects like bricks?  Because of the repulsive effect of the electromagnetic force between the electrons in the fist and the electrons in the brick.

But what about that neutron?  What is it made of?  Is it solid?  Is it made of the same stuff as all other subatomic particles?

The leading theories of matter do not favor the idea that subatomic particles are like little billiard balls of differing masses.  For example, string theorists speculate that all particles are made of the same stuff; namely, vibrating bits of string.  Except that they each vibrate at different frequencies.  Problem is, string theory is purely theoretical and really falls more in the mathematical domain than the scientific domain, inasmuch as there is no supporting evidence for the theory.  If it does turn out to be true, even the neutron is mostly empty space because the string is supposedly one-dimensional, with a theoretical cross section of a Planck length.

Here’s where it gets really interesting…

Neutrinos are extremely common yet extremely elusive particles of matter.  About 100 trillion neutrinos generated in the sun pass through our bodies every second.  Yet they barely interact at all with ordinary matter.  Neutrino capture experiments consist of configurations such as a huge underground tank containing 100,000 gallons of tetrachloroethylene buried nearly a mile below the surface of the earth.  100 billion neutrinos strike every square centimeter of the tank per second.  Yet any particular molecule of tetrachloroethylene is likely to interact with a neutrino only once every 10^36 seconds (which is billions of billions of times the age of the universe).
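
That parenthetical is simple arithmetic (assuming an age of the universe of about 13.8 billion years):

seconds_per_year = 3.156e7
age_of_universe  = 13.8e9 * seconds_per_year   # ~4.4e17 seconds
interaction_time = 1e36                        # seconds between interactions, per the figure above
print(interaction_time / age_of_universe)      # ~2e18, i.e. billions of billions of times the age of the universe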

The argument usually given for neutrinos’ elusiveness is that they are nearly massless (and therefore not easily captured by a nucleus) and charge-less (and therefore not subject to the electromagnetic force).  Then again, photons are massless and charge-less and are easily captured, to which anyone who has spent too much time in the sun can attest.  So there has to be some other reason that we can’t detect neutrinos.  Unfortunately, given the current understanding of particle physics, no good answer is forthcoming.

And then there is dark matter.  This concept is the current favorite explanation for some anomalies in the orbital speeds of stars in galaxies.  The gravity of visible matter can’t explain the anomalies, so dark matter is inferred.  If it really exists, it represents about 83% of the mass in the universe, but it doesn’t interact with any of the known forces except gravity.  This means that dark matter is all around us; we just can’t see it or feel it.

So it seems that modern physics allows for all sorts of types of matter that we can’t see or feel.  When you get down to it, the reason for this is that we don’t understand what matter is at all.  According to the standard model of physics, particles should have no mass, unless there is a special quantum field that pervades the universe and gives rise to mass upon interacting with those particles.  Unfortunately, for that to have any credibility, the signature particle, the Higgs boson, would have to exist.  Thus far, it seems to be eluding even the most powerful of particle colliders.  One alternative theory of matter has it being an emergent property of particle fluctuations in the quantum vacuum.

For a variety of reasons, some of which are outlined in “The Universe – Solved!” and many others which have come to light since I wrote that book, I suspect that ultimately matter is simply a property of an entity that is described purely by data and a set of rules, driven by a complex computational mechanism.  Our attempt to discover the nature of matter is synonymous with our attempt to discover those rules and associated fundamental constants (data).

In terms of other things that we can’t perceive, new age enthusiasts might call out ghosts, spirits, auras, and all sorts of other mysterious invisible and tenuous entities.


Given that we know that things exist that we can’t perceive, one has to wonder if it might be possible for macroscopic objects, or even macroscopic entities that are driven by similar energies as humans, to be made from stuff that we can only tenuously detect, not unlike neutrinos or dark matter.  Scientists speculate about multiple dimensions and parallel universes via Hilbert Space and other such constructs.  If such things exist (and wouldn’t it be hypocritical of anyone to speculate or work out the math for such things if it weren’t possible for them to exist?), the rules that govern our interaction with them, across the dimensions, are clearly not at all understood.  That doesn’t mean that they aren’t possible.

In fact, the scientific world is filled with trends leading toward the implication of an information-based reality.

In which almost anything is possible.