Flexi Matter

Earlier this year, a team of scientists at the Max Planck Institute of Quantum Optics, led by Randolf Pohl, made a highly accurate measurement of the radius of the proton and, at 0.841 fm, it turned out to be about 4% smaller than previously determined (0.877 fm).  Trouble is, the previous measurements were also highly accurate.  The significant difference between the two types of measurement was the choice of interaction particle: in the traditional case, electrons, and in Pohl’s case, muons.

Figures have been checked and rechecked and both types of measurements are solid.  All sorts of crazy explanations have been offered up for the discrepancy, but one thing seems certain: we don’t really understand matter.

Ancient Greeks thought that atoms were indivisible (hence, the name), at least until Rutherford showed otherwise in the early 1900s.  Ancient 20th-century scientists thought that protons were indivisible, at least until Gell-Mann showed otherwise in the 1960s.

So why would it be such a surprise that the radius of a proton varies with the type of lepton cloud that surrounds and passes through it?  Maybe the proton is flexible, like a sponge, and a muon, at about 200 times the mass of an electron, exerts a much higher contractive force on it – gravity, strong nuclear, Jedi, or what have you.  Just make the measurements and modify your theory, guys.  You’ll be .000001% closer to the truth, enough to warrant an even bigger publicly funded particle accelerator.

If particle sizes and masses aren’t invariant, who is to say that they don’t change over time?  Cosmologist Christof Wetterich of the University of Heidelberg thinks this might be possible.  In fact, says Wetterich, if particle masses are slowly increasing, the universe may not be expanding after all.  His recent paper suggests that spectral redshift – Hubble’s famous discovery at Mount Wilson that led to the most widely accepted theory of the universe, the big bang – may actually be due to particle masses changing over time.  So far, no one has been able to shoot a hole in his theory.

Oops.  “Remember what we said about the big bang being a FACT?  Never mind.”

Flexi-particles.  Now we have both evidence and major philosophical repercussions.

And still, The Universe – Solved! predicts there is no stuff.

The ultimate in flexibility is pure data.


Grand Unified Humanity Theory

OK, maybe this post is going to be a little silly – apologies in advance.  I’m in that kind of mood.

Physicists recently created a fascinating concoction – a Bose-Einstein condensate (BEC) that was stable at a temperature 50% higher than critical.  Check out this phys.org article with the deets.  In this bizarre state of matter, all particles act in unison, entangled, as if they were collectively a single particle.  Back in Einstein’s day, BECs were envisioned to be composed of bosons.  Later, theory predicted and experiments demonstrated fermions, and ultimately, atoms.

A comparison is made to an analogous process of getting highly purified water to exist at temperatures above its boiling point.  It seems that phase transitions of various types can be pushed beyond their normal critical point if the underlying material is “special” in some way – pure, balanced, coherent.

Superfluids.  Laser light.

It reminds me of the continuous advances in achieving superlative or “perfect” conditions, like superconductivity (zero resistance) at temperatures closer and closer to room temperature.  I then think of a characteristic that new agers ascribe to physical matter – “vibrational levels.”

Always connecting dots, sometimes finding connections that shouldn’t exist.

Given the trend of achieving purity, alignment, and coherence at conditions ever closer to “normal” temperatures and scales, might we someday see entangled complex molecules, like proteins?  BECs of DNA strands?

Why stop there?  Could I eventually be my own BEC?  A completely coherent vibrationally-aligned entity?  Cool.  I’ll bet I would be transparent and could walk through doors.

And what if science could figure out how to create a BEC out of all living things?  Nirvana.  Reconnecting with the cosmic consciousness.

Grand Unified Humanity Theory.

Einstein Would Have Loved Programmed Reality

Aren’t we all Albert Einstein fans, in one way or another?  If it isn’t because of his 20th Century revolution in physics (relativity), or his Nobel Prize that led to that other 20th Century revolution (quantum mechanics), or his endless Twainsian witticisms, it’s his underachiever-turned-genius story, or maybe even that crazy head of hair.  For me, it’s his regular-guy sense of humor:

“The hardest thing in the world to understand is the income tax.”

and…

“Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour, and it seems like a minute. THAT’S relativity.”

Albert Einstein on a bicycle in Niels Bohr's garden

But, the more I read about Albert and learn about his views on the nature of reality, the more affinity I have with his way of thinking.  He died in 1955, hardly deep enough into the digital age to have had a chance to consider the implications of computing, AI, consciousness, and virtual reality.  Were he alive today, I suspect that he would be a fan of digital physics, digital philosophy, simulism, programmed reality – whatever you want to call it.  Consider these quotes and see if you agree:

“Reality is merely an illusion, albeit a very persistent one.”

“I wished to show that space-time isn’t necessarily something to which one can ascribe a separate existence, independently of the actual objects of physical reality. Physical objects are not in space, but these objects are spatially extended. In this way the concept of ’empty space’ loses its meaning.”

“As far as the laws of mathematics refer to reality, they are uncertain; and as far as they are certain, they do not refer to reality.”

“A human being is part of a whole, called by us the ‘Universe’ —a part limited in time and space. He experiences himself, his thoughts, and feelings, as something separated from the rest—a kind of optical delusion of his consciousness. This delusion is a kind of prison for us, restricting us to our personal desires and to affection for a few persons nearest us. Our task must be to free ourselves from this prison by widening our circles of compassion to embrace all living creatures and the whole of nature in its beauty.”

“Space does not have an independent existence.”

“Hence it is clear that the space of physics is not, in the last analysis, anything given in nature or independent of human thought.  It is a function of our conceptual scheme [mind].”

“Every one who is seriously involved in the pursuit of science becomes convinced that a spirit is manifest in the laws of the Universe – a spirit vastly superior to that of man, and one in the face of which we with our modest powers must feel humble.”

I can only imagine the insights that Albert would have had into the mysteries of the universe, had he lived well into the computer age.  It would have given him an entirely different perspective on that conundrum that puzzled him throughout his later life – the relationship of consciousness to reality.  And he might have even tossed out the Unified Field Theory that he was forever chasing and settled in on something that looked a little more digital.

 

Bizarro Physics

All sorts of oddities emerge from equations that we have developed to describe reality.  What is surprising is that rather than being simply mathematical artifacts, they actually show up in our physical world.

Perhaps the first such bizarro (see DC Comics) entity was antimatter: matter with opposite charge and other reversed quantum numbers.  A mathematical solution to Paul Dirac’s relativistic version of Schrödinger’s equation (it makes my head hurt just looking at it), antimatter was discovered 4 years after Dirac predicted it.

One of last year’s surprises was the negative frequencies that are solutions to Maxwell’s equations and have been shown to reveal themselves in components of light.

And, earlier this month, German physicists announced the ability to create a temperature below absolute zero.

So when we were told in physics class to throw out those “negative” solutions to equations because they were in the imaginary domain, and therefore had no basis in reality…uh, not so fast.

What I find interesting about these discoveries is the implications for the bigger picture.  If our reality were what most of us think it is – 3 dimensions of space, with matter and energy following the rules set forth by the “real” solutions to the equations of physics – one might say that reality trumps the math; that solutions to equations only make sense in the context of describing reality.

However, it appears to be the other way around – math trumps reality.  Solutions to equations previously thought to be in the “imaginary domain” are now being shown to manifest in our reality.

This is one more category of evidence that underlying our apparent reality are data and rules.  The data and rules don’t manifest from the reality; they create the reality.


The Digital Reality Bandwagon

I tend to think that reality is just data.  That the fundamental building blocks of matter and space will ultimately be shown to be bits, nothing more.  Those who have read my book, or who follow this blog or my Twitter feed, realize that this has been a cornerstone of my writing since 2006.

Not that I was the first to think of any of this.  Near as I can tell, Philip K. Dick may deserve that credit, having said “We are living in a computer programmed reality” in 1977, although I am sure that someone can find some Shakespearean reference to digital physics (“O proud software, that simulates in wanton swirl”).

Still, a mere six years ago, it was a lonely space to be in.  The few digital reality luminaries at that time included:

But since then…

– MIT Engineering Professor Seth Lloyd published “Programming the Universe” in 2006, asserting that the universe is a massive quantum computer running a cosmic program.

– Nuclear physicist Thomas Campbell published his excellent unifying theory “My Big TOE” in 2007.

– Brian Whitworth, PhD. authored a paper containing evidence that our reality is programmed: “The emergence of the physical world from information processing”, Quantum Biosystems 2010, 2 (1) 221-249  http://arxiv.org/abs/0801.0337

– University of Maryland physicist, Jim Gates, discovered error-correction codes in the laws of physics. See “Symbols of Power”, Physics World, Vol. 23, No 6, June 2010.

– Fermilab astrophysicist, Craig Hogan, speculated that space is quantized.  This was based on results from GEO600 measurements in 2010.  See: http://www.wired.com/wiredscience/2010/10/holometer-universe-resolution/.  A holometer experiment is being constructed to test the idea: http://holometer.fnal.gov/

– Rich Terrile, director of the Center for Evolutionary Computation and Automated Design at NASA’s Jet Propulsion Laboratory, hypothesized that we are living in a simulated reality. http://www.vice.com/read/whoa-dude-are-we-inside-a-computer-right-now-0000329-v19n9

– Physicists Leonard Susskind and Gerard ’t Hooft developed the holographic principle (the idea that our universe is digitally encoded on a two-dimensional surface, like the horizon of a black hole).

Even mainstream media outlets are dipping a toe into the water to see what kinds of reactions they get, such as this recent article in New Scientist Magazine: http://www.newscientist.com/article/mg21528840.800-reality-is-everything-made-of-numbers.html

So, today, I feel like I am in really great company and it is fun to watch all of the futurists, philosophers, and scientists jump on the new digital reality bandwagon.  The plus side will include the infusion of new ideas and the resulting synthesis of theory, as well as pushing the boundaries of experimental validation.  The down side will be all of the so-called experts jockeying for position.  In any case, it promises to be a wild ride, one that should last the twenty or so years it will take to create the first full-immersion reality simulation.  Can’t wait.

The Ultimate Destiny of the Nature of Matter is Something Very Familiar

Extrapolation is a technique for projecting a trend into the future.  It has been used liberally by economists, futurists, and other assorted big thinkers for many years, to project population growth, food supply, market trends, singularities, technology directions, skirt lengths, and other important trends.  It goes something like this:

If a city’s population has been growing by 10% per year for many years, one can safely predict that it will be around 10% higher next year, 21% higher in two years, and so on.  Or, if chip density has been increasing by a factor of 2 every two years (as it has for the past 40 years), one can predict that it will be 8 times greater than today in six years (Moore’s Law).  Ray Kurzweil and other Singularity fans extrapolate technology trends to conclude that our world as we know it will come to an end in 2045 in the form of a technological singularity.  Of course there are always unknown and unexpected events that can cause these predictions to be too low or too high, but given the information that is known today, it is still a useful technique.
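To make the arithmetic concrete, here is a minimal Python sketch of compound-growth extrapolation.  The helper name and the inputs are just my own illustrations of the examples above, not real forecasts:

```python
def extrapolate(current_value, growth_factor, periods):
    """Project a quantity forward assuming a constant growth factor per period."""
    return current_value * growth_factor ** periods

# A city growing 10% per year: ~10% higher next year, ~21% higher in two years.
print(extrapolate(100_000, 1.10, 1))   # ~110,000
print(extrapolate(100_000, 1.10, 2))   # ~121,000

# Moore's Law: density doubling every two years -> 8x after six years (3 doublings).
print(extrapolate(1.0, 2.0, 6 / 2))    # 8.0
```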

To my knowledge, extrapolation has not really been applied to the problem that I am about to present, but I see no reason why it couldn’t give an interesting projection…

…for the nature of matter.

In ancient Greece, Democritus put forth the idea that solid objects were composed of atoms of that element or material, either jammed tightly together, as in the case of a solid object, or separated by a void (space).  These atoms were thought to be little indivisible billiard-ball-like objects made of some sort of “stuff.”  Thinking this through a bit, it was apparent that if atoms were spherical and crammed together in an optimal fashion, then matter was essentially 74% of the space that it takes up (the densest possible packing of equal spheres fills about 74% of the volume), the rest being empty space.  So, for example, a solid bar of gold was really only 74% gold “stuff,” at most.

That view of matter was resurrected by John Dalton in the early 1800s and revised once J. J. Thomson discovered electrons.  At that point, atoms were thought to look like plum pudding, with electrons embedded in the positively charged pudding.  Still, the density of “stuff” didn’t change, at least until the early 1900s when Ernest Rutherford determined that atoms were actually composed of a tiny dense nucleus and a shell of electrons.  Further measurements revealed that these subatomic particles (protons, electrons, and later, neutrons) were actually very tiny compared to the overall atom and, in fact, most of the atom was empty space.  That model, coupled with a realization that atoms in a solid actually had to have some distance between them, completely changed our view on how dense matter was.  Since volume scales as the cube of linear size, it turned out that only about 1 part in 10^15 of our gold bar was “stuff.”

That was, until the mid-1960s, when quark theory was proposed, which said that protons and neutrons were actually composed of three quarks each.  As the theory (aka QCD) is now fairly accepted and some measurement estimates have been made of quark sizes, one can calculate that since quarks are between a thousand and a million times smaller than the subatomic particles that they make up, matter is now (again cubing the size ratio) 10^9 to 10^18 times more tenuous than previously thought.  Hence our gold bar is now only about 1 part in 10^30 (give or take a few orders of magnitude) “stuff,” the rest being empty space.  By way of comparison, about 1.3×10^32 grains of sand would fit inside the earth.  So matter is roughly as dense with “stuff” as one grain of sand is to our entire planet.

So now we have three data points to start our extrapolation.  Since the percentage of “stuff” that matter is made of is shrinking exponentially over time, we can’t plot our trend in normal scales, but need to use log-log scales.

And now, of course, we have string theory, which says that all subatomic particles are really just bits of string vibrating at specific frequencies, each string possibly having a width of the Planck length.  If so, that would make subatomic particles themselves all but empty space – only about 1 part in 10^38 “stuff” – leaving our gold bar with just 1 part in 10^52 of “stuff.”
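To put those data points on a single (logarithmic) scale, here is a rough Python sketch that prints the approximate “stuff” fraction of a gold bar under each historical model.  The exponents are just the order-of-magnitude figures quoted above, not precise values:

```python
import math

# Approximate fraction of a gold bar's volume that is "stuff" under each model.
models = {
    "Democritus/Dalton (close-packed solid atoms)": 0.74,
    "Rutherford (tiny nucleus, mostly empty atom)": 1e-15,
    "Quark model (quarks inside nucleons)": 1e-30,
    "String theory (Planck-width strings)": 1e-52,
}

for name, fraction in models.items():
    # log10(fraction) is what you would plot on the vertical axis of a log plot
    print(f"{name}: ~{fraction:.0e}  (log10 ~ {math.log10(fraction):.1f})")

# The log-fraction falls from roughly 0 toward -52; extrapolated, the "stuff"
# fraction trends toward zero.
```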

Gets kind of ridiculous, doesn’t it?  Doesn’t anyone see where this is headed?

In fact, if particles are comprised of strings, why do we even need the idea of “stuff?”  Isn’t it enough to define the different types of matter by the single number – the frequency at which the string vibrates?

What is matter anyway?  It is a number assigned to a type of object that has to do with how that object behaves in a gravitational field.  In other words, it is just a rule.

We don’t really experience matter.  What we experience is electromagnetic radiation influenced by some object that we call matter (visual).  And the effect of the electromagnetic force rule due to the repulsion of charges between the electron shells of the atoms in our fingers and the electron shells of the atoms in the object (tactile).

In other words, rules.

In any case, if you extrapolate our scientific progress, it is easy to see that the ratio of “stuff” to “space” is trending toward zero.  Which means what?

That matter is most likely just data.  And the forces that cause us to experience matter the way we do are just rules about how data interacts with itself.

Data and Rules – that’s all there is.

Oh yeah, and Consciousness.


FTL Neutrinos are not Dead Yet!

So, today superluminal neutrinos are out.  Another experiment called ICARUS, from the same laboratory whence the OPERA results came, recently announced its finding that neutrinos do not travel faster than light.

It is a little surprising how eager scientists were to get that experimental anomaly behind them.  Almost as if the whole idea so threatened the foundation of their world that they couldn’t wait to jump on the anti-FTL-neutrino bandwagon.  For a complete non sequitur, I am reminded of the haste with which Oswald was fingered as JFK’s assassin.  No trial needed.  Let’s just get this behind us.

A blog on the Discover Magazine site referred to this CERN announcement as “the nail in the coffin” of superluminal neutrinos.  Nature magazine reported that Adam Falkowski, a physicist from the University of Paris-South, said, “The OPERA case is now conclusively closed.”

Really?

Since when are two conflicting results an indication that one of them is conclusive?  It seems to me that until the reason for OPERA’s superluminal results is determined, the case is still open.

In software engineering, there is such a thing as a non-reproducible defect.  A record of the defect is opened and, if the defect is not reproducible, it just sits there.  Over time, if the defect is no longer observed, it becomes less and less relevant and its priority decreases.  Eventually, one assumes that it was due to “user error” or something, and it loses status as a bona fide defect.

The same should hold for anomalous FTL events.  If they are reproducible, we have new physics.  If not, it is still an anomaly to be investigated and root-cause analyzed.

In fact, interestingly enough, the arXiv article shows that the average neutrino arrival time in the NEW experiment is still 0.3 ns earlier than light speed would predict, and more neutrinos were reported as arriving faster than light than slower.  Admittedly, this is well within the experimental error bar, but it does seem to indicate that neutrinos travel at c, the speed of light, which means that they should not have any mass.  Yet other experiments indicate that they do indeed have mass.

And then there was the result of the MINOS experiment in 2007, which also indicated faster-than-light neutrinos, although not at as statistically significant a level as OPERA’s.

So, we are still left with many neutrino anomalies:

– Two experiments that indicate faster than light speeds.
– Conflicting experiments regarding the possibility of neutrino mass.
– Mysterious transformations of one type of neutrino to another mid-flight.
– And the very nature of their tenuous interaction with “normal” matter, not unlike dark matter.

Theories abound regarding the possibilities of neutrinos or dark matter existing in, or traveling through, higher dimensions.

How can anyone be so confident that there is a nail in the coffin of any scientific anomaly?


Pathological Skepticism

“All great truths began as blasphemies” – George Bernard Shaw

  • In the 1800’s, the scientific community viewed reports of rocks falling from the sky as “pseudoscience” and those who reported them as “crackpots,” only because it didn’t fit in with the prevailing view of the universe. Today, of course, we recognize that these rocks could be meteorites and such reports are now properly investigated.
  • In 1827, Georg Ohm’s initial publication of what became “Ohm’s Law” met with ridicule, dismissal, and was called “a web of naked fantasies.” The German Minister of Education proclaimed that “a professor who preached such heresies was unworthy to teach science.” Twenty years passed before scientists began to recognize its importance.
  • Louis Pasteur’s theory of germs was called “ridiculous fiction” by Pierre Pachet, Professor of Physiology at Toulouse, in 1872.
  • Spanish researcher Marcelino de Sautuola discovered cave art in Altamira cave (northern Spain), which he recognized as Stone Age, and published a paper about it in 1880.  His integrity was violently attacked by the archaeological community, and he died disillusioned and broken.  Yet he was vindicated 10 years after his death.
  • Lord Haldane, the Minister of War in Britain, said that “the aeroplane will never fly” in 1907.  Ironically, this was four years after the Wright Brothers made their first successful flight at Kitty Hawk, North Carolina.  After Kitty Hawk, the Wrights flew in open fields next to a busy rail line in Dayton, OH, for almost an entire year.  US authorities refused to come to the demos, while Scientific American published stories about “The Lying Brothers.”
  • In 1964, physicist George Zweig proposed the existence of quarks.  As a result of this theory, he was rejected for a position at a major university and considered a “charlatan.”  Today, of course, quarks are an accepted part of the Standard Model of particle physics.

Note that these aren’t just passive disagreements.  The skeptics use active and angry language, with words like “charlatan,” “ridiculous,” “lying,” “crackpot,” and “pseudoscience.”

This is partly due to a natural psychological effect, known as “fear of the unknown” or “fear of change.”  Psychologists who have studied human behavior have more academic sounding names for it, such as the “Mere Exposure Effect”, “Familiarity Principle”, or Neophobia (something that might have served Agent Smith well).  Ultimately, this may be an artifact of evolution.  Hunter-gatherers did not pass on their genes if they had a habit of eating weird berries, venturing too close to the saber-toothed cats, or other unconventional activities.  But we are no longer hunter-gatherers.  For the most part, we shouldn’t fear the unknown.  We should feel empowered to challenge assumptions.  The scientific method can weed out any undesirable ideas naturally.

But, have you also noticed how the agitation ratchets up the more you enter the realm of the “expert?”

“The expert knows more and more about less and less until he knows everything about nothing.” – Mahatma Gandhi

This is because the expert may have a lot to lose if they stray too far from the status quo.  Their research funding, tenure, jobs, and reputations are all at stake.  This is unfortunate, because it feeds this unhealthy behavior.

So I thought I would do my part to remind experts and non-experts alike that breakthroughs only occur when we challenge conventional thinking, and we shouldn’t be afraid of them.

The world is full of scared “experts,” but nobody will ever hear of them.  They will hear about the brave ones, who weren’t afraid to challenge the status quo – people like Copernicus, Einstein, Georg Ohm, Steve Jobs, and Elon Musk.

And it isn’t like we are so enlightened today that such pathological skepticism no longer occurs.

Remember Stanley Pons and Martin Fleischmann?  Respected electrochemists, ridiculed out of their jobs and their country by skeptics.  Even “experts” violently contradicted each other:

  • “It’s pathological science,” said physicist Douglas Morrison, formerly of CERN. “The results are impossible.”
  • “There’s very strong evidence that low-energy nuclear reactions do occur,” said George Miley (who received the Edward Teller Medal for research in hot fusion).  “Numerous experiments have shown definitive results – as do my own.”

Some long-held assumptions are being overturned as we speak.  Like LENR (Low Energy Nuclear Reactions), the new, less provocative name for cold fusion.

And maybe the speed of light as an ultimate speed limit.

These are exciting times for science and technology.  Let’s stay open minded enough to keep them moving.

Yesterday’s Sci-Fi is Tomorrow’s Technology

It is the end of 2011 and it has been an exciting year for science and technology.  Announcements about artificial life, earthlike worlds, faster-than-light particles, clones, teleportation, memory implants, and tractor beams have captured our imagination.  Most of these things would have been unthinkable just 30 years ago.

So, what better way to close out the year than to take stock of yesterday’s science fiction in light of today’s reality and tomorrow’s technology.  Here is my take:


Time to Revise Relativity?: Part 2

In “Time to Revise Relativity: Part 1”, I explored the idea that Faster than Light Travel (FTL) might be permitted by Special Relativity without necessitating the violation of causality, a concept not held by most mainstream physicists.

The reason this idea is not well supported has to do with the fact that Einstein’s postulate that light travels at the same speed in all reference frames gave rise to all sorts of conclusions about reality, such as the idea that it is all described by a space-time that has fundamental limits to its structure.  The Lorentz factor is a consequence of this view of reality, and so its use is limited to subluminal effects; it is undefined for calculating relativistic distortions past c.

Lorentz factor: γ = 1 / √(1 − v²/c²)
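For concreteness, here is a minimal Python sketch of that factor (the function name is mine).  Note how it diverges as v approaches c and has no real value beyond it, which is exactly the “undefined past c” point made above:

```python
import cmath

C = 299_792_458.0  # speed of light, m/s

def lorentz_factor(v):
    """gamma = 1 / sqrt(1 - v^2/c^2), returned as a complex number so v > c doesn't raise."""
    return 1 / cmath.sqrt(1 - (v / C) ** 2)

print(lorentz_factor(0.5 * C))   # ~(1.155+0j)
print(lorentz_factor(0.99 * C))  # ~(7.089+0j)
print(lorentz_factor(2.0 * C))   # purely imaginary (~-0.577j): no real value past c
```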

So then, what exactly is the roadblock to exceeding the speed of light?

Yes, there may be a natural speed limit to the transmission of known forces in a vacuum, such as the electromagnetic force.  And there may certainly be a natural limit to the speed of an object at which we can make observations utilizing known forces.  But, could there be unknown forces that are not governed by the laws of Relativity?

The current model of physics, called the Standard Model, incorporates the idea that all known forces are carried by corresponding particles, which travel at the speed of light if massless (like photons and gluons) or less than the speed of light if they have mass (like the W and Z bosons), all consistent with, or derived from, the assumptions of relativity.  Problem is, there is all sorts of “unfinished business,” along with inconsistencies, in the Standard Model.  Gravitons have yet to be discovered, Higgs bosons don’t seem to exist, gravity and quantum mechanics are incompatible, and many things just don’t have a place in the Standard Model, such as neutrino oscillations, dark energy, and dark matter.  Some scientists even speculate that dark matter is due to a flaw in the theory of gravity.  So, given the incompleteness of that model, how can anyone say for certain that all forces have been discovered and that Einstein’s postulates are sacrosanct?

Given that barely 100 years ago we didn’t know any of this stuff, imagine what changes to our understanding of reality might happen in the next 100 years.  Such as these Wikipedia entries from the year 2200…

– The ultimate constituent of matter is nothing more than data.

– A subset of particles and corresponding forces that are limited in speed to c represent what used to be considered the core of the so-called Standard Model and are consistent with Einstein’s view of space-time, the motion of which is well described by the Special Theory of Relativity.

– Since then, we have realized that Einsteinian space-time is an approximation to the truer reality that encompasses FTL particles and forces, including neutrinos and the force of entanglement.  The beginning of this shift in thinking occurred due to the first superluminal neutrinos found at CERN in 2011.

So, with that in mind, let’s really explore a little about the possibilities of actually cracking that apparent speed limit…

For purposes of our thought experiments, let’s define S as the “stationary” reference frame in which we are making measurements and R as the reference frame of the object undergoing relativistic motion with respect to S.  If a mass m is traveling at c with respect to S, then measuring that mass in S (via whatever methods could be employed to measure it; energy, momentum, etc.) will give an infinite result.  However, in R, the mass doesn’t change.

What if m went faster than c, such as might be possible with a sci-fi concept like a “tachyonic afterburner”?  What would an observer at S see?

Going by our relativistic equations, m now becomes imaginary when measured from S because the argument in the square root of the mass correction factor is now negative.  But what if this asymptotic property really represents more of an event horizon than an impenetrable barrier?  A commonly used model for the event horizon is the point on a black hole at which gravity prevents light from escaping.  Anything falling past that point can no longer be observed from the outside.  Instead it would look as if that object froze on the horizon, because time stands still there.  Or so some cosmologists say.  This is an interesting model to apply to the idea of superluminality as mass m continues to accelerate past c.

From the standpoint of S, the apparent mass is now infinite, but that is ultimately based on the fact that we can’t perceive speeds past c.  Once something goes past c, one of two things might happen.  The object might disappear from view due to the fact that the light that it generated that would allow us to observe it can’t keep up with its speed.  Alternatively, invoking the postulate that light speed is the same in all reference frames, the object might behave like it does on the event horizon of the black hole – forever frozen, from the standpoint of S, with the properties that it had when it hit light speed.  From R, everything could be hunky dory.  Just cruising along at warp speed.  No need to say that it is impossible because mass can’t exceed infinity, because from S, the object froze at the event horizon.  Relativity made all of the correct predictions of properties, behavior, energy, and mass prior to light speed.  Yet, with this model, it doesn’t preclude superluminality.  It only precludes the ability to make measurements beyond the speed of light.

That is, of course, unless we can figure out how to make measurements utilizing a force or energy that travels at speeds greater than c.  If we could, those measurements would yield results with correction factors only at speeds relatively near THAT speed limit.

Let’s imagine an instantaneous communication method.  Could there be such a thing?

One possibility might be quantum entanglement.  The Delayed Choice Quantum Eraser experiment (an extension of John Wheeler’s delayed-choice thought experiment) seems to imply non-causality and the ability to erase the past.  Integral to this experiment is the concept of entanglement.  So perhaps it is not a stretch to imagine that entanglement might embody a communication method that creates some strange effects when integrated with observational effects based on traditional light and sight methods.

What would the existence of that method do to relativity?   Nothing, according to the thought experiments above.

There are, however, some relativistic effects that seem to stick, even after everything has returned to the original reference frame.  This would seem to violate the idea that the existence of an instantaneous communication method invalidates the need for relativistic correction factors applied to anything that doesn’t involve light and sight.

For example, there is the very real effect that clocks once moving at high speeds (reference frame R) exhibit a loss of time once they return to the reference frame S, fully explained by time dilation effects.  It would seem that, using this effect as a basis for a thought experiment like the twin paradox, there might be a problem with the event horizon idea.  For example, let us imagine Alice and Bob, both aged 20.  After Alice travels at speed c to a star 10 light years away and returns, her age should still be 20, while Bob is now 40.  If we were to allow superluminal travel, it would appear that Alice would have to get younger, or something.  But, recalling the twin paradox, it is all about the relative observations that were made by Bob in reference frame S, and Alice, in reference frame R, of each other.  Again, at superluminal speeds, Alice may appear to hit an event horizon according to Bob.  So, she will never reduce her original age.
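As a worked example of that asymmetry, here is a minimal Python sketch of the round-trip ages (the helper name and speeds are my own illustrations; the equations give no real answer at exactly c, so the speeds only approach it):

```python
import math

def round_trip_ages(distance_ly, beta):
    """Elapsed years for Bob (stationary) and Alice (traveling at beta = v/c), round trip."""
    bob_years = 2 * distance_ly / beta                   # coordinate time in frame S
    alice_years = bob_years * math.sqrt(1 - beta ** 2)   # Alice's proper time (time dilation)
    return bob_years, alice_years

print(round_trip_ages(10, 0.99))       # Bob ages ~20.2 years, Alice only ~2.9
print(round_trip_ages(10, 0.999999))   # Bob ages ~20.0 years, Alice only ~0.03
# As beta -> 1, Alice's elapsed time goes to zero: she comes home still (essentially) age 20.
```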

But what about her?  From her perspective, her trip is instantaneous due to an infinite Lorentz contraction factor; hence she doesn’t age.  If she travels at 2c, her view of the universe might hit another event horizon, one that prevents her from experiencing any Lorentz contraction beyond c; hence, her trip will still appear instantaneous, no aging, no age reduction.

So why would an actual relativistic effect like reduced aging, occur in a universe where an infinite communication speed might be possible?  In other words, what would tie time to the speed of light instead of some other speed limit?

It may be simply because that’s the way it is.  It appears that relativistic equations may not necessarily impose a barrier to superluminal speeds, superluminal information transfer, nor even acceleration past the speed of light.  In fact, if we accept that relativity says nothing about what happens past the speed of light, we are free to suggest that the observable effects freeze at c. Perhaps traveling past c does nothing more than create unusual effects like disappearing objects or things freezing at event horizons until they slow back down to an “observable” speed.  We certainly don’t have enough evidence to investigate further.

But perhaps CERN has provided us with our first data point.

Time Warp