Showing posts with label scientific method.

Saturday, 25 February 2012

The constant culpability of the crackpot

The tenet of media operations - in most places - is 'innocent until proven guilty'. The adversarial stance often assumed by television channels and newspapers reflects a tendency to view the news from a neutral perspective, to exonerate the common man as much as possible, and to hold ministers and other decision-makers suspect only up to a particular threshold. Unfortunately for science, it's exactly the other way around: guilty until proven innocent.

[caption id="attachment_21676" align="aligncenter" width="220" caption="Roger Bacon ('Doctor Mirabilis') was a Franciscan friar (1214-1294) whose emphasis on the scientific method popularized it and made it the experimental method of choice for hundreds of generations of scientists, and continues to be to this day."][/caption]

This is, for many reasons, important. When the media get excited, they raise the expectations surrounding a particular news story: they ensure that the story is exciting, that it is sufficiently unprecedented to challenge existing notions and incite interest. And here's where 'guilty until proven innocent' gets to work its magic. Science stories capable of becoming exciting have, almost by definition, a low chance of being valid. If there were a high chance of the story being valid, it would border on the expected, the predictable; covering such an event would be more a chronicling than a witnessing.

By holding a phenomenon guilty until proven otherwise, however, nobody gets sucked into what could turn out to be a wild goose chase.

There are a bunch of "discoveries" that, because of their unprecedentedness, purchased interest from the media and allowed the men and women behind the pseudo-discoveries to cash in. One of the greatest examples is the Fleischmann-Pons experiment in cold fusion. When Martin Fleischmann and Stanley Pons of the University of Utah announced that they may have discovered cold fusion on March 23, 1989, there was a flurry of excitement that rippled through military quarters, media offices and the scientific community. According to their paper, there were indications of micro-scale fusion reactions at temperatures of 50 degrees Celsius (as opposed to the millions of degrees they were expected to be at) when deuterium-water was electrolysed in the presence of palladium metal.

[caption id="attachment_21672" align="alignleft" width="199" caption="Martin Fleischmann"][/caption]

The timing couldn't have been better. The oil crises of the 1970s were still fresh in people's minds and, seemingly in response, cold fusion presented an alternative source of energy that was accessible and more efficient by orders of magnitude. Moreover, just three years earlier, in 1986, high-temperature superconductivity had been discovered. Where superconductivity had earlier been thought possible only at temperatures as low as 4-10 kelvin, a new breed of materials was shown to conduct with zero resistance at temperatures as high as 90 and 120 kelvin. Its impact on the reception of cold fusion was the exculpation of the crackpot: it had told the world that having 'crackpot' written all over an experiment didn't mean it couldn't be true.

Fleischmann and Pons were superstars. Soon, the University of Utah was asking Washington for a $25-million grant. However, disturbing developments were being reported from western Europe, other parts of the USA and Japan. The paper Fleischmann and Pons had published in the Journal of Electroanalytical Chemistry did not include the full experimental protocol, but that didn't stop scientists worldwide from trying to replicate the research. All those attempts were in vain: none of them detected fusion of any kind at any temperature. Nathan Lewis, a professor of chemistry at Caltech, even set about corroborating cold fusion systematically, trying out all kinds of variations on the process. None of them succeeded.

While the Utah chemists basked in the glory of their finding and the (carefully selected) findings of other American groups that also claimed to have observed cold fusion, the death knell was sounded from as far afield as CERN: Douglas R. O. Morrison, a physicist there, announced that all attempts in western Europe at replicating cold fusion had failed. In order to quell these doubts, Fleischmann and Pons published a "note" in the Journal of Electroanalytical Chemistry in April 1989 that showed a gamma-ray energy peak from their experiment.

Now, it was known at the time that a gamma ray of a given energy can transfer only so much energy to the detector in a single Compton scatter, which shows up as a sharp cutoff in the spectrum beyond which the counts plummet. The cutoff is called a Compton edge, and it was missing from the chart in the Journal. When alerted to this, the chemists refused to accept any blame, asserting throughout that their experiment was error-proof.
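
To see why the missing edge mattered, here is a back-of-the-envelope sketch of my own (not part of the original critique) of where the Compton edge should have sat, assuming the roughly 2.2 MeV gamma ray that neutron capture on hydrogen - the by-product genuine fusion in the cell should produce - would give:

```python
# Sketch (my own illustration): the Compton edge is the largest energy a gamma
# ray can hand to an electron in a single back-scatter, so a real gamma line
# should show a sharp drop-off in the detector spectrum at exactly that energy.
M_E_C2 = 0.511  # electron rest energy, MeV

def compton_edge(e_gamma_mev: float) -> float:
    """Maximum energy (MeV) transferred to an electron by a gamma ray of the given energy."""
    return 2 * e_gamma_mev ** 2 / (M_E_C2 + 2 * e_gamma_mev)

e = 2.22  # MeV: gamma from neutron capture on hydrogen
print(f"Compton edge expected for a {e} MeV gamma: {compton_edge(e):.2f} MeV")
# -> about 1.99 MeV; no such edge appeared in the published chart.
```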

After this incident, interest in the matter turned sharply negative. The New York Times published a scathing article on April 30, 1989, titled The Utah Fusion Circus. Its last lines effectively drew the curtain on cold fusion and set Fleischmann, Pons and the University of Utah back decades in terms of credibility and accountability:
"For Mr. Pons and Mr. Fleischmann, the best bet is to disappear into their laboratory and devise a clearly defined, well-understood experiment that others can reproduce. Until they have that, they have nothing. As for the University of Utah, it may now claim credit for the artificial-heart horror show and the cold-fusion circus, two milestones at least in the history of entertainment, if not of science."

What ultimately made the difference was replicability: when scientists found that the experiment couldn't be replicated, the game was lost. Fleischmann and Pons were held guilty to the end. Their exculpation hinged throughout on their ability to show that the experiment would work anywhere, irrespective of geographic location, so long as the setup was the same. Because of the nature of their claim, there was a sizeable measure of urgency in establishing priority: where anything immensely profitable is concerned, primacy makes all the difference. Woe betide them should anyone else claim it first - all would be lost! And in such moments, facts are not checked as rigorously as they should be.

More recently, the same went for the misbehaving neutrinos. When other experiments around the world tried to replicate the phenomenon, nothing turned up. In fact, ICARUS, a sister experiment of OPERA (which first announced the anomaly) at the same laboratory, drew the curtains more than halfway down within weeks of the exciting announcement.

[caption id="attachment_21674" align="aligncenter" width="500" caption="The ICARUS experiment at Gran Sasso National Laboratory, Italy"][/caption]

Replication is many things. At the least, it is consensus. At the most, it is a form of characterisation that lifts a phenomenon out of its native environment and blesses it with universality, making it explicable within a logical framework - and therefore scientific. If the Higgs boson were to be discovered at the Large Hadron Collider in 2012, that would only be the first step in a series of experiments involving hundreds of scientists and thousands of engineers over tens of months. They will check, and then they will check again. If someone somewhere finds that such a boson can't be spotted, they won't be snubbed but encouraged to speak up: the closer to the truth you get, the more careful you need to be.

Unfortunately for quacks everywhere, everything about science is the truth.

--

In a curious turn of events, the National Institute of Advanced Studies, India, recommended in 2008 that the Indian government resuscitate research in cold fusion. Projects were commenced at IIT Madras, the Indira Gandhi Centre for Atomic Research, and BARC. However, because of persistent scepticism among physicists and chemists, research had been halted as of 2011. Find the ToI article here.

Monday, 26 September 2011

My kingdom for a neutrino

In 1994, when construction of CERN’s Large Hadron Collider (LHC) was given the go-ahead, physics entered a very exciting period. The machine promised physicists answers to their biggest questions and, in the event that that didn’t happen, ample evidence with which to come to conclusions of their own.

Nearly two decades later, with the device in full operation, results are emerging, some more improbable than the rest. The project was put in place to recreate conditions close to those of the Big Bang so that physicists could detect the Higgs boson. However, nobody anticipated anything like evidence of superluminal travel by neutrinos.

The existence of the neutrino was first proposed by the Austrian physicist Wolfgang Pauli in 1930, to account for the energy that seemed to go missing in beta decay - the process by which a neutron disintegrates into a proton and an electron. The first direct observation came only in 1956, more than a quarter of a century later. The neutrino is one of the Universe’s indivisible particles; it carries no electric charge and very little mass. In fact, amongst all the particles known to have mass, the neutrino is the lightest. According to Albert Einstein’s theory of relativity, this means its relativistic mass blows up only when it travels terribly close to the speed of light. And by terribly close, I’m talking 99.9999% close.
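
A quick numerical aside of my own, to make the "99.9999% close" claim concrete - the Lorentz factor, which multiplies a particle's rest mass, only diverges as the speed actually reaches that of light:

```python
import math

# Sketch (my own): the Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2).
# A particle's relativistic mass is gamma times its rest mass, so even at
# 99.9999% of c the factor is large but still finite.
def lorentz_factor(beta: float) -> float:
    """gamma for a speed given as the fraction beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

for beta in (0.9, 0.99, 0.999999):
    print(f"v = {beta} c  ->  gamma = {lorentz_factor(beta):,.1f}")
# 0.9 c -> ~2.3, 0.99 c -> ~7.1, 0.999999 c -> ~707
```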

Such neutrinos were generated at CERN - by its Super Proton Synchrotron rather than the LHC itself - and beamed to the Gran Sasso National Laboratory in Italy for study. Located about 730 km from CERN and almost a kilometre under Mt. Gran Sasso, the laboratory hosts detectors that register the arriving particles.

When an antineutrino comes in contact with a proton in a water molecule, the two react to form a neutron and a positron. The positron collides with an electron, its antiparticle, and the pair annihilate, releasing two gamma rays. The neutron is then captured by another nucleus, releasing a third gamma ray. The signature of such a capture, therefore, is the release of three gamma rays - the scheme by which the particle was first detected in 1956.
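
For the record, here is a rough sketch of my own of the energies behind that three-gamma signature, assuming the neutron is eventually captured on hydrogen in plain water:

```python
# Sketch (my own, assuming capture on hydrogen): the energies behind the
# three-gamma signature described above.
ELECTRON_REST_MEV = 0.511   # each annihilation photon carries one electron rest energy
H_CAPTURE_GAMMA_MEV = 2.22  # gamma released when the neutron is captured by a hydrogen nucleus

signature_mev = {
    "annihilation gamma 1": ELECTRON_REST_MEV,
    "annihilation gamma 2": ELECTRON_REST_MEV,
    "neutron-capture gamma": H_CAPTURE_GAMMA_MEV,
}
for name, energy in signature_mev.items():
    print(f"{name}: {energy} MeV")
print(f"total: {sum(signature_mev.values()):.3f} MeV")
```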

[caption id="attachment_20371" align="aligncenter" width="439" caption="The Super-KamiokaNDE experiment in Japan contains a tank of 50,000 litres of water, fit with an array of tens of thousands of photo-multiplier tubes (as above) to detect the release of energy in case of a neutrino capture. The cylindrical container has other systems in place to detect the position of a capture, too."][/caption]

At Gran Sasso, as scientists timed the arrival of the neutrinos, they were hardly prepared to find them arriving roughly 60 nanoseconds before they were due. While such a small difference might seem trivial, the implication is that the neutrinos covered the distance faster than light itself could have. Since the speed of light in vacuum is, according to Einstein, the fastest speed attainable in this Universe, the speeding neutrinos have possibly defied the greatest physicist of the last century.
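
To put those 60 nanoseconds in perspective, here is my own quick arithmetic over the roughly 730 km CERN-to-Gran Sasso baseline:

```python
# Sketch (my own arithmetic): what a 60 ns early arrival over ~730 km implies.
C = 299_792_458.0       # speed of light in vacuum, m/s
BASELINE_M = 730_000.0  # approximate CERN -> Gran Sasso distance, m
EARLY_S = 60e-9         # reported early arrival, s

light_time = BASELINE_M / C                 # time light needs to cover the baseline (~2.4 ms)
neutrino_time = light_time - EARLY_S        # time the neutrinos apparently took
excess = light_time / neutrino_time - 1.0   # fractional speed excess, (v - c) / c

print(f"light-travel time over the baseline: {light_time * 1e3:.4f} ms")
print(f"implied fractional excess (v - c)/c: {excess:.2e}")  # about 2.5e-5
```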

Where does that leave the world of physics?

Centuries of hypothesizing and experimenting have ingrained the importance of reasoned scepticism in scientists’ minds. While the Gran Sasso team has reported some 16,000 such recorded and documented events, it has not ruled out systematic errors. For now, physicists the world over await similar conclusions - and thus confirmation - from the other facilities capable of replicating such conditions.

One of them is J-PARC in Japan, whose beam feeds the T2K neutrino experiment. Located at Tokai on Japan’s east coast, the facility was damaged by the March 2011 Tohoku earthquake and was expected to be out of operation for at least the next 12 months. The other is Fermilab in the USA, whose MINOS experiment receives a neutrino beam over a comparable baseline; Fermilab itself is in straitened circumstances, with its flagship collider, the Tevatron - an atom-smasher of considerable reputation - scheduled to shut down permanently on September 30, 2011, after close to three decades of operation.

Collaboration rather than competition seems to be the emerging mantra. The shadow of CERN looms ever larger over most particle-physics labs, which are finding it difficult to compete with CERN and its flagship project. Further, in these days of ballooning fiscal deficits and bond-rating downgrades in the US, funding is hard to come by for such big-ticket projects.

Unsurprisingly, physicists are prepared to wait. Why wouldn’t they, when the speed of light - light being the archetypal electromagnetic radiation - has been the definitive cornerstone of some of the most important foundations of our understanding of this Universe? By defying that limit, the neutrinos have brought upon themselves the scrutiny of the entire scientific community.

For one, by being faster than light, neutrinos speeding toward Earth from distant stars would get here before the stars’ light does, making it possible for astrophysicists to peek farther back into history. Second, the photon, the particle that carries electromagnetic energy, can travel at the speed of light only because it is massless; the neutrino, however, has mass, and a massive particle outpacing light means Einstein’s framework would have to be reopened and re-examined. The larger consequence is that almost all high-energy installations on this planet, from nuclear power plants (NPPs) that power cities to radio telescopes searching for extraterrestrial life, could be open to changes even at the design level.


(In an NPP, the single-phase coolant in a pressurized water reactor is assumed to flow no faster than the speed of light. If the fluid dynamics of single-phase coolants has already been modelled within that luminal limit, how could there be any changes at the design level?

If the speed of the coolant could be pushed even higher, then the critical discharge, i.e. the maximum permissible flow rate, would also rise. That translates into enhanced cooling, which in turn means the fuel rods could be made thicker and more power could be generated.*)


Similarly, the way the world works is not going to change either: the neutrino has been behaving the same way for billions of years, irrespective of how we thought it worked. What will change is the way we understand electromagnetic concepts. The Standard Model of particle physics, the Big Daddy of all the theories of physics, won’t have to be tweaked so much as twisted around to accommodate this as-yet-unsubstantiated phenomenon.

Physicists will wait, and while they wait, they’ll debate. They’ll conduct more experiments, record more data, hypothesize more, refute more, argue more, all the while hoping that the Tevatron will be fired up for one last battle, that J-PARC will be resuscitated for a one-of-a-kind challenge. In my opinion, no sweeping modifications will have to be made to the pool of knowledge we possess regarding physics. Even if the neutrinos did travel at superluminal speeds - which I highly doubt, and attribute instead to minute measurement errors adding up to cause the spike - it won’t be long before the phenomenon is subsumed by the search for other, even greater truths. Yes, our perspectives are going to change but, more than anything, the discovery’s role as a tool with which to learn more about nature will not.

Let's not get carried away, though. After all, disproving the greatest minds of physics in history requires a future of its own.

*On the other hand, the speed at which the gravitational force acts on a body, for example, is limited to the speed of electromagnetic radiation, but that doesn’t mean the discovery of a higher speed in this Universe would change anything there. At most it would be a role reversal (although the massiveness of the neutrino would make a difference), because the practically achievable velocity would remain the same.

Sunday, 22 May 2011

Philosophy and the scientific method

Philosophy has widely been called the classification of thoughts; although that seems like a simple definition, the requirements in place to ensure that it is also accessible to the principles of scientific enquiry make it a daunting and esoteric subject to pursue. While intriguing challenges central to human nature present themselves in no small number along the way, that latter quality has prevented philosophy from being embraced by the masses as another context within which to investigate the Universe.

At the core of the philosophical argument lies its ability to define principles - any principles - which are, in turn, what enable the classification of thoughts. Had the Universe presented all of its information as a garble of colours and noise, the process of learning would have been entirely supplanted by the process of discovery. The case, however, is demonstrably the opposite: there are patterns everywhere, patterns that show a remarkable similarity to each other in that they are all principled and in that they all endure, never granting any episode of epistemological construction the misfortune of stagnating as a structure of the past. This is one of the foremost reasons that inquiry into these matters has proved crucial for social, economic and political progress irrespective of "germane concerns" such as ethnicity, culture or racial history.

In order to both discover and establish (or recognise and understand) such a replicable ontology, an underlying experimental process is necessary - one that abides by the principles of scientific investigation and, essentially, empiricism, so as to keep arguments from collapsing into reductio ad absurdum and to allow the credibility of any hypothesis to be verified without interfering with its functions.

Tuesday, 18 January 2011

Why A Language Resembles Physics So Much

Here's why a language is like physics. It's common knowledge that both help us understand the world: physics is a study of the physical world, a methodical inspection of every phenomenon we encounter and every experience we are affected by; language is a tool of conveyance, and the passenger it bears is meaning, so that by speaking a language we are only trapping the meaning of our thoughts in words and setting them afloat on a sea of communication. However, the world of physics is split into two distinct factions: the wave theorists and the particle theorists. How does this bode for language?


[caption id="" align="alignright" width="300" caption="Particle detection in a cloud chamber"]CMS detector[/caption]


The particle theorists see the world as composed of discrete packets of energy, all of them bound inescapably by the law of conservation of energy and the laws of thermodynamics. The wave theorists see the world as seated on manifestations of energy propagated as continuous waves; in other words, the discreteness of particles is set aside, even though the law of conservation of energy and the laws of thermodynamics still hold.

To the particle theorists, all the energy present in this universe is constituted of a very large number of packets, each of which contains a definite amount of energy that is unchanging over time. If a space contains twice as much energy as another, it does not mean its packets are twice as voluminous; it only means there are twice as many of them. When we look out into this universe, all that we understand, or all that there is to understand at all, can be grasped only if there is an "amount" of meaning attached to it. Our interaction with that meaning is possible only through a tool that lets us exchange meaning in the process - a tool like a word. A word can be said to contain a discrete amount of meaning. Even though different people may read it as different amounts, the innate value of that meaning does not change over time. The adjective "beautiful" ascribes different amounts of beauty according to different people, yet to each person the amount of beauty the word describes stays the same. "Beautiful", however, never comes to ascribe ugliness to an object - apart from signifying its absence.
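
If the physics half of that analogy feels abstract, here is a tiny illustration of my own of the "twice the energy means twice as many packets" picture, using the textbook relation that radiation of a single frequency comes in quanta of energy h times f:

```python
# Sketch (my own illustration): for radiation of one frequency, total energy is
# an integer count of identical quanta, so doubling the energy means doubling
# the count, never fattening the individual packets.
PLANCK_H = 6.626e-34  # Planck constant, J*s

def total_energy(n_quanta: int, frequency_hz: float) -> float:
    """Total energy (J) of n identical quanta at the given frequency."""
    return n_quanta * PLANCK_H * frequency_hz

f = 5e14  # a visible-light frequency, Hz
print(total_energy(1_000, f))  # some amount of energy ...
print(total_energy(2_000, f))  # ... exactly doubled by doubling the number of packets
```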

Words are discrete, like particles of meaning being strung together to create a large volume of meaning called a sentence. Sentences are then strung together to create a larger concatenation of meaning: it could be multi-dimensional, too, because a paragraph might discuss the properties of different objects through different adjectives and, in the process, create an array of meaning, so to speak. Now, if we were to zoom out to view the bigger picture, what we see is a language: there are grammatical rules that are the thermodynamic tyrants of communication, and then there are various other principles and theories that lay down how meaning is generated as well as understood - the "laws of conservation of meaning".

Even though we have described words as discrete capsules that contain a set amount of meaning, it doesn't mean that the language as such prevents us from ascribing some amount of meaning to the gaps between these particles. Between one word and another, there is a boundary that prevents the spillage of any semantic entity, a boundary that holds it within a space in the confines of which it exists and can be understood. At the same time, by combining words, we create a "wave" of meaning that is present everywhere - even where "beautiful" and "pulchritudinous" can't reach, "of a winsome exquisiteness surpassing the glow of a young star" does.