Thursday, January 27, 2005

The Harvard Tower Experiment Was a Fraud

The Harvard Tower Experiment, carried out in the early 1960s on the Harvard University campus, was hailed as one of the most precise experiments confirming Einstein's General Relativity.

But I want to point out that this is yet another typical example of scientists, for all kinds of stated or unstated reasons, manipulating their data to get the result they want. The progress of science relies on unbiased and objective observation of nature. When the credibility of a specific experiment is seriously questionable, the scientific community must set the record straight and disclose any potential fraud.

Before I continue, I must emphasize that this has nothing to do with the correctness of GR; I personally believe GR is a correct theory. However, it is wrong to manipulate data to yield the desired result when the precision of the experiment itself is questionable.

In the Harvard Tower experiment, the energy of a gamma photon of around 14.4 keV shifts by a very small amount, 3.5x10^-11 eV, when the photon drops a height of 22.6 meters in the earth's gravity, according to the GR calculation. The Harvard group claimed to have measured that 3.5x10^-11 eV energy displacement and matched it to within 1% of the predicted value. That would require measuring the 3.5x10^-11 eV displacement to a precision better than 1.4x10^-12 eV.
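The GR prediction quoted above can be checked with a few lines of arithmetic. This is a sketch using the numbers given in this post (14.4 keV, 22.6 meters), not figures taken from the original paper:

```python
# Gravitational red-shift estimate: dE/E = g*h/c^2 for a photon
# falling h = 22.6 m, as described in the text above.
g = 9.81        # m/s^2, surface gravity
c = 2.998e8     # m/s, speed of light
h = 22.6        # m, tower height quoted in the text
E = 14.4e3      # eV, Fe-57 gamma energy quoted in the text

frac_shift = g * h / c**2   # fractional energy shift predicted by GR
dE = frac_shift * E         # absolute shift in eV

print(f"fractional shift = {frac_shift:.3e}")   # ~2.5e-15
print(f"energy shift     = {dE:.2e} eV")        # ~3.5e-11 eV
```

Running this reproduces the 3.5x10^-11 eV figure used throughout the rest of the post.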

The question to ask is whether an experimental precision of 1.4x10^-12 eV, or even 3.5x10^-11 eV, out of a photon energy of 14400 eV, is possible at all. My answer is that it is impossible, based on quantum mechanics and specifically on the uncertainty principle.

First, the source of the gamma photons, Iron-57, has a natural line width of about 10^-8 eV, due to the short lifetime of the decaying state and the uncertainty principle.

That means you cannot detect a photon energy change much smaller than the natural line width of 10^-8 eV. The claimed 1% precision (it was actually 4%, based on the 5.1 versus 4.9 figure) would require a measurement precision of 1.4x10^-12 eV, which is four orders of magnitude narrower than the natural line width. Impossible to measure.
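The natural line width follows from the uncertainty principle, Gamma = hbar/tau. A rough sketch, assuming the commonly quoted ~98 ns half-life for the 14.4 keV level of Fe-57 (that half-life is my assumption here, not a figure from this post or the original paper):

```python
import math

# Natural line width of the Fe-57 14.4 keV line via Gamma = hbar / tau.
# The 98 ns half-life is an assumed, commonly quoted value.
hbar = 6.582e-16                  # eV*s
half_life = 98e-9                 # s (assumption)
tau = half_life / math.log(2)     # mean lifetime, ~1.4e-7 s
gamma = hbar / tau                # natural line width in eV

print(f"natural line width ~ {gamma:.1e} eV")
```

This lands in the few-times-10^-9 eV range, the same order of magnitude as the "about 10^-8 eV" figure cited above.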

Second, the gamma photon takes only a very short time, 7.5x10^-8 seconds, to travel the 22.6 meter distance. The photon exists for just that brief lifetime, from the moment it is emitted to the moment it is absorbed. Based on the uncertainty principle, this short lifetime brings an uncertainty in the energy level of

0.5*hbar/t = 0.5 * 6.582x10^-16 eV·sec / (7.5x10^-8 sec)
           = 4.388x10^-9 eV

So the measurement precision of the energy level could never be better than 4.388x10^-9 eV. The GR effect is only 3.5x10^-11 eV, a quantity more than 100 times too small to be measured!
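The time-of-flight estimate above can be sketched the same way, using only the numbers already given in this post:

```python
# Time-of-flight energy uncertainty: dE >= hbar / (2*t), where t is the
# travel time over the 22.6 m tower height, as argued in the text.
hbar = 6.582e-16   # eV*s
c = 2.998e8        # m/s
d = 22.6           # m, tower height
t = d / c          # travel time, ~7.5e-8 s

dE_min = 0.5 * hbar / t     # minimum energy uncertainty, eV
gr_shift = 3.5e-11          # eV, the predicted GR shift quoted above

print(f"t      = {t:.2e} s")
print(f"dE_min = {dE_min:.3e} eV")
print(f"ratio  = dE_min is {dE_min / gr_shift:.0f}x the GR shift")
```

The ratio comes out above 100, matching the "more than 100 times too small" claim.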

Third, even if such a minuscule amount could be measured, any Doppler shift due to relative movement of the source and detector would call the data into question. To shift the energy level by 3.5x10^-11 eV, out of a total of 14400 eV, via the Doppler effect, all it takes is a relative speed of

V = C * 3.5x10^-11 eV / 14400 eV = 7.3x10^-7 meters/second = 0.73 micrometers/second

The Mossbauer effect is only sensitive enough to detect Doppler shifts from speeds down to millimeters per second. A speed of 0.73 micrometers per second, which is roughly 2 to 3 millimeters per hour, is far below that measurable sensitivity.
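The Doppler-equivalent speed in the third point is a one-line calculation from the figures already quoted in this post:

```python
# Equivalent first-order Doppler speed for the predicted shift: v = c * dE / E,
# using the 3.5e-11 eV shift and 14.4 keV photon energy from the text.
c = 2.998e8     # m/s, speed of light
dE = 3.5e-11    # eV, predicted GR shift
E = 14.4e3      # eV, photon energy

v = c * dE / E   # m/s

print(f"v = {v:.2e} m/s")
print(f"  = {v * 1e6:.2f} micrometers/second")
print(f"  = {v * 3600 * 1000:.1f} millimeters/hour")
```

This reproduces both the 0.73 micrometers/second and the roughly 2.6 millimeters/hour figures used in the third and fourth points.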

Fourth, the equivalent Doppler-shift speed, 0.73 micrometers per second, or 2.6 millimeters per hour, is far smaller than the thermal expansion rate of the building itself, at a height of 20+ meters, over the day/night heating and cooling cycle.

This is solid and undeniable evidence that there is simply no way the researchers could have obtained the claimed result at the claimed precision. Any reasonable person would have to conclude that the data was probably doctored.