
Friday, 24 February 2012

You shall not pass!

With neutrinos back where they belong - behind the massless photons in vacuum - a lot of the excitement generated by the strange incidents reported last year has died down. After the OPERA experiment's announcement on September 23 last year, many observatories, labs and accelerators set about trying to recreate the experiment, as well as assisting the Italian Gran Sasso National Laboratory, the home of OPERA, in checking the detection system for errors. Following at their heels were theoretical physicists and mathematicians, each with a different hypothesis aimed at refuting the results.

On October 18, 2011, another experiment at Gran Sasso, the ominously named ICARUS, published a preprint completely contradicting the OPERA results. The ICARUS physicists' conclusion was underpinned by a simple concept: whenever a particle moves, it loses some energy. The rate of energy loss depends in a fixed way on the particle's speed, and when the particle moves faster than light does in vacuum, its energy loss must be a specific fraction of its overall energy.

CERN produced the neutrinos at 28.2 GeV, and by the time they reached OPERA and ICARUS, they should have had an energy of 12.1 GeV. Unlike ICARUS, OPERA had used a clocking mechanism to determine that the neutrinos were moving faster than light. The ICARUS result, however, showed that there were no neutrinos with 12.1 GeV of energy, nor any with energies in the neighbourhood of that value. Instead, the plot it obtained - of neutrino energy versus number of events - conformed perfectly to the hypothesis that the neutrinos were travelling at the speed of light, no more.

[caption id="attachment_21656" align="aligncenter" width="640" caption="The ICARUS energy-event plot"][/caption]

An important theoretical result inspired the experimental ICARUS result: a paper by Andrew Cohen and Sheldon Glashow contended that superluminal neutrinos would decay to subluminal speeds by shedding energy in the form of fermions, usually electron-positron pairs. The rate of pair emission, they calculated, was proportional to the fifth power of the neutrinos' energy, and the rate of energy loss to the sixth power. With this, they arrived at a value of around 12.5 GeV as the terminal energy of the neutrinos. OPERA, however, had measured something much higher, which would mean the energy decay was slow - and therefore that the neutrinos couldn't have been travelling as fast as had been claimed.
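
To get a feel for why a terminal energy emerges, here is a toy numerical sketch - not the Cohen-Glashow calculation itself. It evolves an energy-loss law of the form dE/dx = -kE^6 over the 730-km CERN-Gran Sasso baseline; the constant k below is made up, chosen only so the terminal energy lands near the paper's 12.5 GeV. The point it illustrates is real, though: for such a steep power law, the final energy is almost insensitive to the initial energy.

```python
# Toy sketch of a steep energy-loss law, dE/dx = -k * E**6.
# k is a made-up illustrative constant, NOT the Cohen-Glashow coefficient.
# The closed-form solution of the ODE is E(L) = (E0**-5 + 5*k*L)**(-1/5).

L = 730.0    # CERN-Gran Sasso baseline, km
k = 9.0e-10  # assumed loss constant, GeV^-5 per km, tuned to land near 12.5 GeV

def terminal_energy(E0, L, k):
    """Energy (GeV) left after travelling L km under dE/dx = -k * E^6."""
    return (E0**-5 + 5 * k * L) ** (-1 / 5)

print(terminal_energy(28.2, L, k))   # ~12.5 GeV
print(terminal_energy(100.0, L, k))  # also ~12.5 GeV: the starting energy barely matters
```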

At the same time, it wasn't as if the announcement was without supporters: a host of papers were published whose authors seemed determined to validate superluminal travel. Some interesting ones among them are available here, here and here.

Such experiments and solutions - those that sought to prove as well as those that sought to refute the OPERA announcement - are indicative of the spirit of science: even when an anomalous or conclusively contradictory finding is made, the scientific community uses the impetus of the discovery to learn more, to create more knowledge. Even in the case of the multi-billion dollar hunt for the Higgs boson, all set to be taken to another level in 2012, evidence of the particle's nonexistence will matter as much as evidence of its existence. And just so it was when news emerged that neutrinos were well-behaved after all.

Tuesday, 7 February 2012

The Middle Earth of physics: Where Tolkien and the physicists meet to have tea

Studying physics is like reading The Lord of the Rings trilogy. At first, there is a general excitement about things to come, how the small events at the beginning are going to avalanche into something portentous. Then, there comes the middle section where things get slow and a tad boring, but it's still a section that you have to understand before you can get on to bigger things. And then, there's the finish: spectacular and very memorable.





Things are the same with physics. First, there are the atoms, the hobbits of the physical realm. With them come the molecules, the bonds and a wide variety of interactions between different particles. There is enough about their behaviour to stoke one's curiosity, to explore how they interact under different circumstances toward different results. Then, once the basic structure of all materials is understood, we move on to the universe in which they exist and how they shaped it. That's where things get tricky and, quickly, mindboggling.


At one point, however, all the concepts one is trying to understand suddenly coalesce into one big, beautiful picture of the universe. There are stars, novae, nebulae and black holes, and diamonds, rubies and emeralds, and light and its millions of colours. This is where the beauty of physics becomes really evident, summoning appreciation and awe at its poignancy. This is also where the audience's focus rests while, all the time, the physicist labours in the middle section to understand more, to explore more.





- The Pillars of Creation (one of the most beautiful images from outer space)


A lot of what goes on in the beginning is taught at schools. The foundation is laid such that wheresoever the student's interest lies, he finds himself equipped enough to move ahead confidently in that direction. All of what happens in the middle is locked up in universities, research labs and journals. That is where the core of the scientific community resides, constantly hypothesizing, experimenting, reviewing and publishing. The contents of the spectacular finish are what circulate in the media - news reports, TV shows, etc. - the stuff that we see even if we don't care to look in the right places.


A book as comprehensive as The Lord of the Rings in its delineation of fantastic plots and sub-plots, of valorous and scheming characters, and of strange places and their stranger legends is bound to become both heavily inspirational and literarily restricting. Since 1955, when the trilogy was first published, there have been hundreds of books that show some sign or other of their authors having borrowed from Tolkien's brainchild. At the same time, many of them enjoyed only fleeting success, simply because they were measured against the scope of the big daddy.


Physics isn't different. Every time there is a revolution - which has been happening less frequently of late because (we think) we're in the vicinity of a Solution to Everything - there is reluctance, then reaffirmation, and then reorientation within the scientific community, in that order. More recent discoveries add more meaning not only to the present but also to the past, and newer knowledge carries even more weight because the old has aged. The closer we zero in on something, the more difficult it becomes to think radically, to think way out of the box, because such suggestions seem abnormal next to the groundbreaking work that came before.





- Thomas Kuhn is known for his controversial 1962 book, The Structure of Scientific Revolutions, in which he characterized the now-staple concept of a paradigm as the framework that must undergo rigorous testing before the scientific community can induct a once-anomalous fact.


This phenomenon is something that ought not to be eradicated: it is necessary to weed out the unscalable and the superficial. It is persistence in such an environment that reaps the greatest rewards, even though the idea may sound oddly masochistic.


For example, in the case of Dan Shechtman, whose story was popularized after he won the Nobel Prize for chemistry in 2011: even though the abrasive interference of Linus Pauling was unfortunate, the atmosphere of doubt was heavy, and the conviction of Shechtman's peers got in the way of his immediate success. At all other times, though, that same conviction is necessary to sustain research.




- Dan Shechtman


This doesn't mean all knowledge in physics follows from what came before it. After all, the only things fixed in nature are the laws of physics, and it is by closely observing them that we begin our first lessons in the subject.


For the next few decades after the 1950s, the spell of The Lord of the Rings over fantasy fiction couldn't easily be broken (see the postscript), so pervasive was its influence. Only gradually did writers realize that fantasy fiction is simply what the world is not, and that thought resurrected a treasure-chest of ideas, giving us the pleasurable writing of Steven Erikson, Ursula Le Guin, Terry Goodkind, Stephen Donaldson, Robert Jordan and others.


Analogously, after Albert Einstein formulated his general theory of relativity (GR) and quantum mechanics (QM) was developed by Schrodinger, Planck, Pauli, Heisenberg and others, there was a falling-out amongst physicists. The two monumental theories couldn't be reconciled, resulting in academic chaos. It was in such an atmosphere that two factions of radical thought emerged: loop quantum gravity and M-theory (a.k.a. string theory), and neither attempted to work off what was set down in GR or QM. (In fact, through an attempt at reconciliation, these two theories have evolved to explain some of the most fundamental secrets of the universe.)




- "A Calabi-Yau manifold is a special type of [smooth surface] that shows up in certain branches of mathematics such as algebraic geometry, as well as in theoretical physics. Particularly in superstring theory, the extra dimensions of spacetime are sometimes conjectured to take the form of a 6-dimensional Calabi-Yau manifold." - Wikipedia


Ultimately, the lessons with which we journey into the future of science (or is it already here?) are all encapsulated in the spirit of The Lord of the Rings, at least in my opinion. Both the magnum opus and physics have been, and remain, seminal in various ways. Even though the trials and tribulations of Middle Earth may not have been the cause of great relief and healing the way physics has, the journey into their causes was a teaching experience nonetheless.


The similarities that I have made a note of are simply empirical and born out of my fondness for both entities, but they are also equally undeniable. For instance, while the division of physics into three "realms" may seem perfunctory to some, it is a good place to begin to understand why what Elsevier Publications is up to is horrible.


*


PS:



  1. "Do you remember [...] The Lord of the Rings? [...] Well, Io is Mordor [...] There's a passage about "rivers of molten rock that wound their way ... until they cooled and lay like dragon-shapes vomited from the tortured earth." That's a perfect description: how did Tolkien know, a quarter of a century before anyone saw a picture of Io? Talk about Nature imitating Art.", Arthur C. Clarke, 2010: Odyssey Two, Chapter 16 'Private Line'

  2. http://www.moongadget.com/origins/lotr.html

  3. Gary Gygax, creator of Dungeons and Dragons: "How did it influence the D&D game? Whoa, plenty, of course. Just about all the players were huge JRRT fans, and so they insisted that I put as much Tolkien-influence material into the game as possible. Anyone reading this that recalls the original D&D game will know that there were Balrogs, Ents, and Hobbits in it. Later those were removed, and new, non-JRRT things substituted–Balor demons, Treants, and Halflings. Indeed, who can doubt the excellence of Tolkien’s writing? So of course it had a strong impact on A/D&D games." Link: http://www.theonering.net/features/interviews/gary_gygax.html

Friday, 3 February 2012

Leonard Hofstadter vs. Leslie Winkle: What's at stake

As far as TV shows are concerned, I'm a big fan of The Big Bang Theory. I've always thought nerds were cool, but the show popularized that opinion - at least in some circles that wouldn't otherwise have accepted, to use lingo from the show, the paradigm. My only problem with the show is that the character I really like, Dr. Sheldon Cooper, is a string theorist, while a character I find annoying, Leslie Winkle, is an adherent of loop quantum gravity (LQG).

[youtube http://www.youtube.com/watch?v=d9YBgqxpaLk]

I haven't received any formal training in particle physics. I believe, though, that the natural philosophy of the universe can be understood quite well through a lot of reasoned thought, a compatible dose of theory and a few formulae.

On that note: I'm an advocate of LQG, and I'm going to spend the rest of this post on detailing why that is and how it works. Or, if you like, I'm going to try and explain what the squiggles on Sheldon's and Leonard's boards mean and what the squiggles on Leslie's board could mean.

The story must begin the way all histories of physical theories do: with Isaac Newton. In describing the fundamental laws of classical mechanics, Newton needed a frame of reference, a fixed entity with respect to which objects could accelerate. Without a frame of reference, we'd never be able to understand how an apple falls from a tree, for example; we know it fell because we see it move against an unmoving background. Thinking as he was in much more stripped-down terms, Newton conceived of a background space against which all things could be measured.

- Sir Isaac Newton

In the 19th century, there was a breakthrough by Michael Faraday and James Clerk Maxwell. Working with electricity, Faraday imagined that two charges interacted via a mediating electric field. He established that the field lines started and ended on charges, and that in a space where there were no charges, the field lines bent around and formed loops. These field lines were called Faraday lines. Maxwell then transcribed Faraday's hypothesis onto paper and birthed Maxwell's equations, which describe the behaviour of electromagnetic fields in much the same way.

- The image shows a point source generating a field. Notice how the field lines are all parts of closed loops.
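
In modern notation, the picture Faraday drew and Maxwell formalized is captured by Maxwell's equations (written here for vacuum, in SI units). The first line says electric field lines start and end on charges; the second says magnetic field lines have no endpoints and so always close into loops:

$$\begin{aligned}
\nabla \cdot \mathbf{E} &= \rho / \varepsilon_0 \\
\nabla \cdot \mathbf{B} &= 0 \\
\nabla \times \mathbf{E} &= -\,\partial \mathbf{B} / \partial t \\
\nabla \times \mathbf{B} &= \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \, \partial \mathbf{E} / \partial t
\end{aligned}$$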

And then, there was Albert Einstein, whose principal contribution was the idea that gravity is not just a force but an inherent attribute of space-time itself. However, when attempting to work this into his general theory of relativity, he realized that the gravitational force also had to be mediated by a field, just as electricity and magnetism were. This meant that a gravitational field had to exist, with loops, lines and everything else.

Einstein also stumbled across one other astounding realization. All those years ago, when Newton had conceptualized a background space to serve as a frame of reference, he'd described acceleration in terms of a gravitational force - the discovery he's best known for. Einstein found, however, that this Newtonian background space was nothing other than the gravitational field itself. Because the field permeated all of space-time and every massive object interacted with it, it could serve as a frame of reference.

These "stumblings" gave rise to two important conclusions.

  1. Loops didn't exist in a background space (as was previously thought) but on another layer of loops. And those loops? On yet another layer of loops. And those loops? You get the idea.

  2. Because there was a gravitational field that mediated gravity, quantized excitations of the field had to manifest as some particle. Just as the quantized excitations of the electromagnetic field are photons, the excitations of a gravitational field are called gravitons. (Note: these have nothing to do with the Higgs boson.)


This lack of a background space was disturbing to many physicists. At this point, some decided to ignore this outcome of general relativity and proceeded as if there were a background space - a region that didn't have any underlying loops. In fact, these theorists went on to postulate that the gravitational field was the resultant of a background space and a quantum field acting together. These are today known as the string theorists.

- xkcd comic #171

The other faction decided to take this lack of background space into account and set about formulating a background-independent theory of the nature of the universe. These are the LQG-advocates. According to them, the universe offers no background space naturally; space is created when the Faraday lines of the gravitational field are quantum-excited.

Let me explain.

The gravitational field exists on another layer of loops (we don't know what they are). Because the point sources that created the gravitational field - Higgs bosons - quickly decayed long ago, the lines are large closed loops. The gravitational field permeates throughout the space-time continuum, and therefore there are infinitely many such loops.

- One of the decay signatures of a Higgs boson is a tau lepton-antilepton pair, shown above. Tau leptons are extremely hard to detect because they decay very quickly. Other signatures include quark-antiquark pairs and electron/muon lepton-antilepton pairs, which are relatively longer-lived.

When one of these loops is excited to some energy, physical space is created. It is important to understand that the loops are not in that space but that they are the space. Earlier, there was a conundrum: if two loops are separated by a really small distance, then each loop should represent one degree of freedom for the universe, i.e. one avenue of change.

Since there are infinitely many such loops, the universe would have to have infinite degrees of freedom. But that is not the case. The conundrum dissolved after physicists realised that all the loops are part of the same field, and that no space could have separated the loops, because the loops are space.

- In LQG, the point where two loops intersect is called a node. The region that corresponds to a node is called a cell, the section of a loop joining two nodes is called a link, and the surface of a cell that a link passes through is called a... well, a surface.

So, that is the story of the universe - according to some. The string theorists believe that long, one-dimensional strings exist in a background space, and their vibrations manifest as particles that make up the universe.

I chose to be on the LQG side of things because I see no reason to disregard Einstein's conclusion that there is no background space. Also, string theory has turned up no testable hypotheses, because it claims the strings exist at the Planck scale (one-hundred-billion-trillion-trillionth of a metre), whereas the mathematics of LQG has been able to explain the formation of black holes (I'll save that for another post, another day).

- While string theory claims that the universe exists as strings at the Planck scale, LQG claims that at such scales, the granular cells that the space-time continuum is made of should show up. But such claims are yet untested.

In conclusion: when Leonard says he prefers his universe stringy, not loopy, all this is what he means. However, I can sympathise with Leslie when she leaves in a huff after Leonard wants to "let the kids decide" which hypothesis to choose.

Wednesday, 19 October 2011

Star of the Orient

India’s first particle physics observatory is to be constructed in the district of Theni in Tamil Nadu at an expense of Rs. 1,200 crore (USD 250 million). Called the India-based Neutrino Observatory (INO), the entire experiment will be situated 1.3 km under a hill to keep other radiation and cosmic rays from interfering with the study. This is because the neutrinos the detector will study interact with matter extremely rarely - at the rate of three or four interactions per nearly 85 trillion trillion trillion neutrinos passing through. The gouging of a tunnel, 7 m wide and 1.9 km long, to access the cavern that will house the systems commenced on Friday, October 14, and is expected to take a year.

Twenty-seven days ago, a startling discovery set off tremors across the scientific community when the Gran Sasso National Laboratory in Italy reported that certain fundamental particles called neutrinos had been observed moving faster than light. The reason this observation caused such dissonance and a flurry of excitement is that, according to the physics megagiant Albert Einstein, the Universe would allow nothing to travel faster than light.

Then again, conclusive proof was not presented by the physicists at the lab—at least, not anything that met the infamous six-sigma tolerance limit (about 99.9999998 per cent). It was little surprise, then, that within a week of the report, engineers were working in full swing at Japan's Kamioka reactor, at the USA's dreaded Fermilab, and at the Sudbury Neutrino Observatory in Canada to recreate the conditions at Gran Sasso. Far away, in India, a country that had until then been principally a centre for processing second-hand information, a 22-year-old plan was finally being mobilized.
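
For reference, here is a quick standard-library sketch for converting a sigma level into a Gaussian confidence percentage; six sigma corresponds to about 99.9999998 per cent (two-sided):

```python
import math

def confidence(n_sigma):
    """Two-sided probability contained within n_sigma standard deviations
    of a Gaussian distribution."""
    return math.erf(n_sigma / math.sqrt(2))

for n in (3, 5, 6):
    print(f"{n} sigma -> {confidence(n) * 100:.7f}%")
# 3 sigma -> 99.7300204%
# 5 sigma -> 99.9999427%  (the usual particle-physics discovery threshold)
# 6 sigma -> 99.9999998%
```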

With just a 29-year-old history, the energy frontier of physics research was supposed to last at least until 2018—the year of the Super Large Hadron Collider. With such unprecedented discoveries, however, a shift away from high-energy research and toward ultra-rare processes has become conspicuous. For the INO, the timing couldn’t have been better.

The decision to locate the observatory at Theni was finalized after evaluating the local topography, seismic stability, environmental disturbance, rock quality, availability of electricity and water, and rain patterns. In order to further minimize the impact of the project’s logistical and infrastructural operations, an extant but little-used road is being re-laid for the trucks and earthmovers to use, instead of having them move through five villages.

Funded by the government of India and the Tata Institute of Fundamental Research (TIFR), and coordinated by the Institute of Mathematical Sciences (IMS), the INO will host a supersensitive static detector called the Iron Calorimeter (ICAL), incorporating a magnet exactly four times as large as the one in use at the Large Hadron Collider. Such an effort will involve the INO-industry interface in a big way, drawing heavily on available industrial infrastructure in matters of mechanical structure, electronics and detector technology.

The detector will consist of a stack of alternating plates of iron and borosilicate glass, numbering 30,000 in all and measuring 12 m to a side. The glass plates, in turn, will consist of glass sheets with a noble gas sandwiched in between—an arrangement referred to collectively as a resistive plate chamber (RPC). When a neutrino interacts with the iron, it will knock an electron out of its orbit around an atom and send it into the RPC. Once there, the electron will be picked up by positively charged electrodes sewn into the glass, translated into a signal, and sent to the data processors.

The source of the neutrinos will be the sun, supernovae, cosmic rays and other intergalactic phenomena, and the output will correspond to the particle’s mass, position of interaction, velocity, type, degree of oscillation and charge.

There are two reasons the INO stands out from its peers: the first is that the ICAL is going to be devoted to studying neutrinos and neutrinos only, and the second is that the ICAL will study them continuously (except during scheduled maintenance). Because of such principled and technical dedication, physicists expect the detector to shed light on some of the more elusive characteristics of neutrinos, such as flavour oscillations and neutrino-neutrino interactions.

These are boom times for Indian science. The national spending on science and technology has gone up in the last five years and is inching towards two per cent of India's GDP. Hordes of new institutes are coming up in every nook and corner of the country—30 new central universities, 5 new Indian Institutes of Science Education and Research, 8 new Indian Institutes of Technology and 20 new Indian Institutes of Information Technology are in various stages of conception and completion.

However, simply increasing the number of institutes will not lead to better scientific prowess. The education system needs a complete rethink in order to attract more students to science and produce world-class scientists (the last home-grown scientist to win a Nobel Prize was Sir C. V. Raman, in 1930). In this direction, the INO is a giant leap forward: for its capacity to sustain research at the energy and cosmic frontiers, for the special and exotic experimentation environments it will support, and for the invaluable access it will give the Indian scientific community to cutting-edge information.


Monday, 26 September 2011

My kingdom for a neutrino

In 1994, when the construction of CERN’s Large Hadron Collider (LHC) was given the go-ahead, physics entered a very exciting period. The project promised physicists the answers to their biggest questions and, in the event that that didn’t happen, ample evidence with which to come to conclusions of their own.

A decade and a half later, with the device in full operation, results are emerging, some more improbable than the rest. The project was put in place to recreate the conditions of the Big Bang so physicists could detect the Higgs boson. Nobody, however, anticipated such a thing as evidence of superluminal travel by neutrinos.

The existence of neutrinos was first proposed by the Austrian physicist Wolfgang Pauli in 1930, to account for the energy and momentum that seemed to go missing when a neutron disintegrated into a proton and an electron. The first direct observation was made only in 1956, more than a quarter of a century later. The neutrino is one of the many indivisible particles of this Universe; it carries no charge and has very little mass. In fact, amongst all the particles that have any mass, the neutrino is the lightest. This means that, according to Albert Einstein’s theory of relativity, the particle’s relativistic mass blows up toward infinity only when it travels terribly close to the speed of light. And by terribly close, I’m talking 99.9999% close.
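
In symbols, this is just the standard special-relativistic statement that the Lorentz factor diverges as the speed v approaches c:

$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad E = \gamma m c^2 \longrightarrow \infty \quad \text{as } v \to c \ (m \neq 0)$$

The lighter the particle, the larger the γ - and so the closer to c - it reaches for a given energy, which is why accelerator-made neutrinos travel at speeds all but indistinguishable from light's.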

Such neutrinos were generated at CERN over the course of some of its experiments and sent to the Gran Sasso National Laboratory in Italy for study. Located 730 km to the south and almost a kilometre under Mt. Gran Sasso, the laboratory receives the particles in a massive tank of ultra-pure water.

Once a neutrino (strictly, an antineutrino) comes in contact with a proton in a water molecule, the two react to form a neutron and a positron. The positron collides with an electron, its antiparticle, and the two annihilate each other, releasing two gamma rays. The neutron is then captured by another nucleus, releasing a third gamma ray. The signature of a neutrino capture is therefore the release of three gamma rays.

[caption id="attachment_20371" align="aligncenter" width="439" caption="The Super-KamiokaNDE experiment in Japan contains a tank of 50,000 litres of water, fit with an array of tens of thousands of photo-multiplier tubes (as above) to detect the release of energy in case of a neutrino capture. The cylindrical container has other systems in place to detect the position of a capture, too."][/caption]

At the laboratory, as scientists waited for the neutrinos to arrive and set off the reactions, they were hardly prepared for what they found: the gamma rays were detected about 60 nanoseconds before they were due. While such a small difference might seem trivial, the implication is that the neutrinos arrived sooner than light itself would have. Since electromagnetic radiation travels at the fastest speed attainable in this Universe according to Einstein, the speeding neutrinos have possibly defied the greatest physicist of the last century.
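
Here is a quick back-of-the-envelope check of how big a deal 60 nanoseconds is over the CERN-Gran Sasso baseline, using only the two numbers quoted above:

```python
# How much faster than light would neutrinos have to be for a 60 ns
# early arrival over the ~730 km CERN-Gran Sasso baseline?

c = 299_792_458.0   # speed of light, m/s
baseline = 730e3    # CERN to Gran Sasso, m (approximate)
early = 60e-9       # reported early arrival, s

t_light = baseline / c          # light's travel time, ~2.44 ms
t_neutrino = t_light - early    # the neutrinos' claimed travel time
excess = early / t_neutrino     # fractional speed excess, (v - c)/c

print(f"light travel time: {t_light * 1e3:.3f} ms")
print(f"(v - c)/c        : {excess:.2e}")  # ~2.5e-5, i.e. ~0.0025% faster than light
```

That tiny fraction, about 25 parts per million, is exactly what made the measurement so sensitive to nanosecond-level timing errors.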

Where does that leave the world of physics?

Centuries of hypothesizing and experimenting have ingrained the importance of reasoned scepticism in scientists' minds. While the Gran Sasso National Laboratory has claimed that 16,000 such instances have been recorded and documented, it hasn't ruled out errors either. For now, physicists the world over await similar conclusions, and thus confirmations, from the two other facilities capable of replicating such conditions.

One of them is the J-PARC in Japan. Located in Tokai on the country's east coast, the facility was damaged by the March 2011 earthquake off the Tohoku coast and is expected to be out of operation for at least 12 months. The other is the Fermilab Tevatron, an atom-smasher of considerable reputation, in the USA. After close to three decades of operation, that facility is scheduled to shut down permanently on September 30, 2011.

Collaboration rather than competition seems to be the emerging mantra. The shadow of CERN is beginning to loom large over most particle physics labs, which are finding it difficult to compete with CERN and its flagship project. Further, in these days of ballooning fiscal deficits and bond-rating downgrades in the US, funding is hard to come by for such "creamy" projects.

Unsurprisingly, physicists are prepared to wait. Why won’t they, when the speed of light – of all electromagnetic radiation – has been the definitive cornerstone of some of the most important foundations of our understanding of this Universe? By defying that limit, neutrinos have brought upon themselves the scrutiny of the entire scientific community.

For one, by being faster than light, neutrinos speeding toward Earth from distant stars would get here before the stars’ images do, making it possible for astrophysicists to peek farther back into history. Second, the photon, the particle that carries electromagnetic energy, was thought to be massless so that it wouldn’t violate the theory of relativity; the neutrino, however, has mass, and that means all of Einstein’s works will have to be reopened and re-examined. The larger consequence is that almost all high-energy installations on this planet, ranging from the nuclear power plants (NPPs) that power cities to radio-telescopes searching for extra-terrestrial life, become candidates for changes, even at the design level.


(In an NPP, the single-phase coolants in pressurized water reactors are assumed to flow no faster than the speed of light. If the fluid dynamics of single-phase coolants has already been modelled with light-speed as the upper limit, how could there be any changes at the design level?

If the speed of the coolants can be pushed even higher, then the critical discharge, i.e. the maximum permissible flow rate, also goes higher. This translates into enhanced cooling, which means the fuel rods can be made even thicker and more power can be generated.*)


Similarly, the way the world works is not going to change: the neutrino has been behaving the same way for billions of years, irrespective of how we thought it worked. What is going to change is the way we understand electromagnetic concepts. The Standard Model of particle physics, the Big Daddy of all the theories of physics, won’t have to be tweaked so much as twisted around to accommodate this as-yet-unsubstantiated phenomenon.

Physicists will wait, and while they wait, they’ll debate. They’ll conduct more experiments, record more data, hypothesize more, refute more, argue more, all the while hoping that the Tevatron will be fired up for one last battle, that the J-PARC will be resuscitated for a one-of-a-kind challenge. In my opinion, no overbearing modifications will have to be made to the pool of knowledge we possess regarding physics as such. Even if the neutrinos did travel at superluminal speeds - which I highly doubt, and attribute to minute measurement errors adding up to cause the spike - it won’t be long before the phenomenon is subsumed by the search for other, even greater truths. Yes, our perspectives are going to change, but the discovery's role as a tool with which to learn more about nature will not.

Let's not get carried away, though. After all, disproving the greatest minds of physics in history requires a future of its own.

*On the other hand, for example, the speed at which the gravitational force acts on any body is limited to the speed of electromagnetic radiation, but that doesn't mean the discovery of a higher speed in this Universe is going to change anything. It's a role reversal at most (although the massiveness of the neutrino is going to make a difference), because the practically achievable velocity is going to remain the same.


Sunday, 28 August 2011

Black holes and information theory

When you see a star in the night sky and think to yourself of its beauty, you're doing the following things.

  1. Deciding to look up at the night sky

  2. Firing a sequence of signals through the motor neurons in the central nervous system

  3. Powering up muscles and lifting bones

  4. Positioning your eye to receive optimal amounts of light

  5. Employing a photo-chemical reaction at the back of the retina

  6. Sending electric signals to the brain

  7. Evaluating the beauty by accessing your memory

  8. Understanding the beauty and feeling it by letting it manifest as a suitable configuration of muscle positions


One way or another, we're using up energy to receive information, process it and convert it into another form which we can use. In fact, even the information we receive is a form of energy. When you log in to access your Facebook account, the letters and numbers on the screen are digitized data, and that means they're a series of data modification pulses shipped in through hundreds of optical cables, electronic circuitry and wireless data transmission systems to appear on your screen. Every physical manifestation of intention and will, in this world, is the conversion of energy from one form into another.

Now, the law of conservation of energy states that the total amount of energy in this Universe is fixed and cannot ever be changed. In that case, shouldn't the amount of information we receive and generate be fixed? At any point of time, would it be possible to generate more information than this Universe can contain? And since this Universe seems equipped only to contain so much information, is it fair to consider either that humankind's capabilities are limited or that humankind will never have to worry about that limit because they won't get there?

Second: because of the nature of the origin of this Universe, it is assumed to be a constantly expanding volume of space, and so, the amount of information that the Universe can contain also increases with it. What is the rate of increase, then? A simple answer to this question can be arrived at by considering two concepts: the Kepler problem in general relativity and black holes.

A black hole is a singularity. A singularity is a point in space where the quantities used to measure space-time curvature blow up in a way that is independent of the coordinate system. Imagine having just neatly pressed your new sheets, and then finding somewhere a small crease that you aren't able to iron out - as if it wasn't there at all when you last looked, but now won't go away however hard you try. That's your black hole.

[caption id="attachment_20246" align="aligncenter" width="545" caption="Depiction of a black hole"][/caption]

A black hole is a point in space; the large black spheres we imagine are really just the region over which the black hole exerts its influence. This “sphere of influence” doesn't have a definite boundary. For the sake of convenience, which physicists also see the need for from time to time, there's the event horizon: a hypothetical sphere whose surface marks the point of no return.

Because of their massive densities, black holes exert a gravitational force that is just as strong as they are dense. Now, the Standard Model of particle physics dictates that photons, the packets of energy that carry electromagnetic radiation like light, have no mass. The most important consequence of masslessness ought to be non-conformance to the force of gravity, which means light should be able to pass by black holes with no reflection, refraction or absorption. However, that's not the case; in fact, black holes swallow light completely and burp out a small amount of heat. This happens because, instead of bending rays of light toward themselves, black holes distort the shape of the space-time continuum itself.

Now, imagine a long stretch of flat land parallel to the surface of which a missile is fired. The missile is such that it guides itself to stay 1m above the ground at all points of time. If, suddenly, a gorge opens beneath the missile, it dips down and continues to follow the surface. Light behaves like the missile when the ground is the space-time continuum, and the only known phenomenon capable of distorting the continuum like that is a black hole – the only difference is that a black hole wraps the continuum around itself.

Now, this distortion lends itself to a very useful technique with which to measure the rate of expansion of the Universe, a technique called gravitational lensing.



When beams of light coming from a galaxy in the background are bent around a black hole in the foreground, two consequences arise:

  1. Increase in the illumination and the size of the image

  2. Apparent change in the position of the source of the image (irrelevant for this discussion)




During lensing, the distance traveled by a beam increases while its net displacement doesn't change, thereby keeping the colours of the image intact but changing its brightness and dimensions. The black hole therefore behaves like a convex mirror or, more commonly, a fish-eye lens. Now, the Kepler problem in general relativity gives rise to the formula:

θ = 4GM / (rc²)

Here, θ is the angle through which the light beam is deviated by the bending object (as depicted above), G is the universal gravitational constant, M is the mass of the bending object, r is the distance between the beam and the bender (between "missile" and "ground"), and c is the speed of light (note how the force of gravity is not instantaneous but also travels at the speed of electromagnetic radiation).
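
As a sanity check of the formula, here is a small sketch that plugs in the Sun's mass and radius; the famous result, confirmed during the 1919 solar eclipse, is a deflection of about 1.75 arcseconds for starlight grazing the solar limb:

```python
import math

# Deflection of light grazing a massive body: theta = 4 * G * M / (r * c^2)

G = 6.674e-11      # universal gravitational constant, m^3 kg^-1 s^-2
c = 299_792_458.0  # speed of light, m/s
M_sun = 1.989e30   # mass of the Sun, kg
R_sun = 6.963e8    # radius of the Sun, m (beam grazing the surface)

theta = 4 * G * M_sun / (R_sun * c**2)           # deflection in radians
arcsec = math.degrees(theta) * 3600              # radians -> arcseconds

print(f"{theta:.3e} rad = {arcsec:.2f} arcsec")  # ~1.75 arcsec
```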

Think of the space-time continuum as an elastic fabric and the various phenomena and objects as special designs on its surface. When, at a point, the weave is wrapped around a spherical object, the surface area at that point goes from being flat to being rounded, giving rise to a bulge that enlarges the image. In physical terms, the angular size of the image is said to have been increased.

By calculating the value of θ, the distance between the galaxy and the black hole can be established. Over the course of, say, one year, the initial and final values of θ can be computed to give the distance by which the two objects have moved apart in that year. The perfect way to understand this is the "raisin bread" model of the Universe: consider a loaf of bread embedded with raisins. As the loaf expands, the number and shape of the raisins do not change, but the distances between them grow proportionately.
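
A toy sketch of the procedure just described, with made-up numbers (the lens mass and both angles below are illustrative, not measurements): invert θ = 4GM / (rc²) to get the separation r at two epochs, then difference them.

```python
# Toy sketch: infer a beam-lens separation from a deflection angle via
# r = 4 * G * M / (theta * c^2), then compare two epochs.
# All input values are made up for illustration.

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 299_792_458.0  # m/s
M = 1e39           # assumed lens mass, kg (roughly 5e8 solar masses)

def separation(theta):
    """Beam-lens distance (m) implied by a deflection angle theta (radians)."""
    return 4 * G * M / (theta * c**2)

theta_then = 2.00e-6  # deflection at the first epoch, rad
theta_now = 1.98e-6   # slightly smaller a year later: the beam now passes farther out

drift = separation(theta_now) - separation(theta_then)
print(f"apparent drift: {drift:.2e} m in one year")
```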

With that, we are in a position to understand by how much the information-carrying capacity of the Universe changes as it itself becomes more voluminous. That leaves a third and last question to be answered, the question of information transfer.

Information transfer



In the process flowchart shown above, an idea is encapsulated by certain words that are then transmitted by an "information generation" controller (which is duty-bound to filter out noise and other such chaotic elements from the message). Next, the message is conveyed through a medium to an "information reception" controller (which is also duty-bound to filter out noise), and then the message is understood as an idea.

Now, while the message is being conveyed, certain errors and losses could creep into the transmission process. For instance, because of the incapacity of the system itself, data loss and/or data corruption could occur. Further, if the right medium is not chosen for the most efficient conveyance of the message, it might be understood differently, which is also a kind of data corruption. In order to avoid such possibilities, the message is sometimes amplified.

During amplification, two processes can be said to occur:

  1. The information being carried is modified: it is made stronger in order to survive the journey

  2. An accompanying signal is sent to the receiver to inform it about the changes that have been made


As such, the amplification and abridgment processes have to accompany the conveyance-medium-conveyance subsystem because they compensate for the shortcomings of the conveyances (just as, if A lends money to B and B lends that amount to C, it is B's responsibility to retrieve it from C and return it to A). That being said, let's move on to the idea of interstellar magnifiers.

Interstellar magnifiers

If human civilization were to spread out from Earth and distribute itself across planets in other galaxies, communication between the various systems is bound to become a pain. In that case, suitably modified electromagnetic signals (light, RF waves, etc.) can be pulsed into the sky, aimed at objects with strong gravitational fields that would amplify, or boost, them on their way. With the assistance of suitably positioned satellites, the broadcast information can then be "narrowcast" down to receiving stations on some planet.

[caption id="attachment_20245" align="aligncenter" width="350" caption="Cassini's gravity-assistance"][/caption]

A significant hindrance posed to this method of communication is a phenomenon called galactic extinction: when intercepted by a cloud of galactic dust, the waves are absorbed and scattered into space, the information becoming lost in the process. In order to minimize the amount scattered, polarized radiation may be considered.

The no-hair theorem

What happens when light, instead of bending around a black hole, falls into it? The answer to this question is a puzzler because of something called the no-hair theorem, which states that every black hole is characterized only by its mass, charge and angular momentum. That means that if my laptop flies into a black hole, no information about the laptop can be retrieved beyond whatever changes it makes to the black hole's mass, charge and angular momentum! If you cannot open an invisible door floating around my room, and if I step inside the door someday, how will you find me?

If any mass enters a black hole and is swallowed, there should be a corresponding increase in the mass of the black hole. Theoretical work showed, however, that a black hole cannot only consume: just as it swallows some energy, it must also radiate some energy away in order to maintain its overall state.

This radiation is called Hawking radiation (in honour of its discoverer), and later observations found that it lay in the thermal section of the electromagnetic spectrum, i.e. a black hole radiated heat more than anything else. And since the black hole radiated heat, it must slowly be losing energy and, at one point, must also completely evaporate without a trace left behind. Using equations available in the theory of relativity, it was found that smaller black holes evaporated faster than larger ones. In fact, a black hole with the mass of a car would evaporate in 10-88 seconds.

[caption id="attachment_20247" align="aligncenter" width="545" caption="Hawking radiation mechanism"][/caption]

After complete evaporation... what about my laptop? Ultimately, we have a paradox: if my laptop went into the black hole, which then burped out some heat in the form of Hawking radiation, then is the no-hair theorem violable? Because my laptop's mass has caused a change in the interior energy of the black hole, which shouldn't happen according to the theorem.

Is the information then lost forever? Not possible; if it is, then the law of conservation of energy stands violated. Does the information also evaporate during the evaporation of the black hole? Not possible; if it did, then Hawking radiation would become inexplicable. Does the information jump out during the last moments of evaporation? Not possible; if it did, then a smaller black hole must have held on to that information that didn't participate in the evaporation. Does the information slip into a baby universe that we can't interact with? If your imagination can understand the creation of such a universe, sure. Does the information get lost in the time dimension? Nah.

Where is the information, then?

Black holes and information theory

When you see a star in the night sky and think to yourself of its beauty, you're doing the following things.

  1. Deciding to look up at the night sky

  2. Firing a sequence of signals through the motor neurons in the central nervous system

  3. Contracting muscles and moving bones

  4. Positioning your eye to receive optimal amounts of light

  5. Employing a photo-chemical reaction at the back of the retina

  6. Sending electric signals to the brain

  7. Evaluating the beauty by accessing your memory

  8. Understanding the beauty and feeling it by letting it manifest as a suitable configuration of muscle positions


One way or another, we're using up energy to receive information, process it and convert it into a form we can use. In fact, even the information we receive is a form of energy. When you log in to your Facebook account, the letters and numbers on the screen are digitized data: a stream of electrical and optical pulses shipped through hundreds of optical cables, electronic circuits and wireless transmission systems before appearing on your screen. Every physical manifestation of intention and will in this world is the conversion of energy from one form into another.

Now, the law of conservation of energy states that the total amount of energy in this Universe is fixed and cannot ever be changed. In that case, shouldn't the amount of information we can receive and generate also be fixed? Could we, at any point in time, generate more information than the Universe can contain? And if the Universe is equipped to contain only so much information, is it fairer to conclude that humankind's capabilities are limited, or that humankind will never have to worry about the limit because it will never get there?

Second: because of the nature of its origin, the Universe is assumed to be a constantly expanding volume of space, and so the amount of information it can contain increases with it. What, then, is the rate of increase? A simple answer can be arrived at by considering two concepts: the Kepler problem in general relativity and black holes.

A black hole is a singularity: a point in space where the quantities used to measure space-time blow up in a way that is independent of the coordinate system. Imagine having just neatly pressed your new sheets, and then finding a small crease that you cannot iron out, as if it wasn't there at all when you last looked but now won't go away however hard you try. That's your black hole.

[caption id="attachment_20246" align="aligncenter" width="545" caption="Depiction of a black hole"][/caption]

A black hole proper is a point in space; the large black sphere we imagine is really the region over which the black hole exerts its influence, and this "sphere of influence" doesn't have a definite boundary. For the sake of convenience, which physicists also see the need for from time to time, there's the event horizon: a hypothetical sphere whose surface marks the point of no return.

Because of their enormous densities, black holes exert gravitational forces that are just as enormous. Now, the Standard Model of particle physics dictates that photons, the packets of energy that carry electromagnetic radiation like light, have no mass. An important consequence of masslessness should be indifference to the force of gravity, which means light ought to pass by a black hole with no reflection, refraction or absorption. However, that's not the case; in fact, black holes swallow light completely and burp a small amount of heat out. This happens because, instead of pulling rays of light into themselves, black holes distort the shape of the space-time continuum itself.

Now, imagine a long stretch of flat land over which a missile is fired, a missile that guides itself to stay 1 m above the ground at all times. If a gorge suddenly opens beneath it, the missile dips down and continues to follow the surface. Light behaves like the missile, with the space-time continuum for ground, and nothing known distorts the continuum more severely than a black hole, which wraps the continuum around itself.

Now, this distortion lends itself to a very useful technique with which to measure the rate of expansion of the Universe, a technique called gravitational lensing. Consider the animated image below.



Beams of light coming from the galaxy in the background are bent around the black hole. As a result of this bending, two consequences arise:

  1. Increase in the illumination and the size of the image

  2. Apparent change in the position of the source of the image (irrelevant for this discussion)




During lensing, the distance travelled by a beam increases while its net displacement doesn't change, keeping the colours of the image intact but changing its brightness and dimensions. The black hole therefore behaves like a convex lens, magnifying whatever lies behind it. Now, the Kepler problem in general relativity gives rise to the formula:

θ = 4GM / (rc²)

Here, θ is the angle through which the light beam is deviated by the bending object (as depicted in the image above), G is the universal gravitational constant, M is the mass of the bending object, r is the closest distance between the beam and the bender (between "missile" and "ground"), and c is the speed of light (note that gravity is not instantaneous: changes in a gravitational field also propagate at the speed of electromagnetic radiation).
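To get a feel for the numbers, here is a quick back-of-the-envelope sketch in Python (the constants are rounded and the function name is mine): it plugs the Sun's mass and radius into the formula above and recovers the famous 1.75-arcsecond deflection measured during the 1919 eclipse.

```python
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2 (rounded)
c = 2.998e8     # speed of light in vacuum, m/s (rounded)

def deflection_angle(M, r):
    """Deflection angle theta = 4GM / (r c^2), in radians.

    M: mass of the bending object (kg)
    r: closest approach of the beam to the object (m)
    """
    return 4 * G * M / (r * c**2)

# A beam grazing the Sun's surface: the classic 1919 eclipse test.
M_sun = 1.989e30  # kg
R_sun = 6.957e8   # m
theta = deflection_angle(M_sun, R_sun)
print(theta * 206265)  # radians -> arcseconds; prints roughly 1.75
```

Swap in the mass of a black hole and a much smaller r, and θ grows accordingly, which is exactly why lensing by compact objects is so dramatic.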

Think of the space-time continuum as an elastic fabric and the various phenomena and objects as special designs on its surface. When, at a point, the weave is wrapped around a spherical object, the surface there goes from being flat to being rounded, giving rise to a bulge that enlarges the image. In physical terms, the angular size of the image is said to have increased.

By calculating the value of θ, the distance between the galaxy and the black hole can be established. Over the course of, say, one year, the initial and final values of θ between two objects can be computed to give the distance by which they have moved apart in that year. The perfect way to understand this is the "raisin bread" model of the Universe: consider a loaf of bread embedded with raisins. As the loaf expands, the number and shape of the raisins do not change, but the distances between them grow proportionately.
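A minimal sketch of that picture, with made-up positions in arbitrary units: uniform expansion multiplies every raisin's coordinate by the same scale factor, so the farther apart two raisins start, the faster they recede from each other.

```python
# Toy "raisin bread" universe: expansion rescales every raisin's position
# by the same factor, so any two raisins separate in proportion to how far
# apart they already are.
raisins = [0.0, 1.0, 2.5, 4.0]  # positions along one axis, arbitrary units

def expand(positions, scale):
    # Uniform expansion: multiply every coordinate by the scale factor.
    return [scale * x for x in positions]

later = expand(raisins, 1.1)  # the loaf grows by 10%
for before, after in zip(raisins, later):
    print(f"{before} -> {after}: moved {after - before:.2f}")
# The raisin at 1.0 moves 0.1; the raisin at 4.0 moves 0.4. Recession
# grows with separation, which is Hubble's law in miniature.
```

That separation-proportional recession is, in miniature, what the changing values of θ would reveal on the sky.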

With that, we are in a position to understand by how much the information-carrying capacity of the Universe changes as it itself becomes more voluminous. That leaves a third and last question to be answered, the question of information transfer.

Information transfer



In the process flowchart shown above, an idea is encapsulated by certain words that are then transmitted by an "information generation" controller (which is duty-bound to filter out noise and other such chaotic elements from the message). Next, the message is conveyed through a medium to an "information reception" controller (which is also duty-bound to filter out noise), and then the message is understood as an idea.

Now, while the message is being conveyed, certain errors and losses can creep into the transmission process. For instance, because of the limitations of the system itself, data loss and/or data corruption could occur. Further, if the right medium is not chosen for the most efficient conveyance of the message, it might be understood differently, which is also a kind of corruption. To guard against such possibilities, the message is sometimes amplified.

During amplification, two processes can be said to occur:

  1. The information being carried is modified: it is made stronger in order to survive the journey

  2. An accompanying signal is sent to the receiver to inform it about the changes that have been made


As such, the amplification and abridgment processes have to accompany the conveyance-medium-conveyance subsystem, because they compensate for the shortcomings of the conveyance (just as, if A lends money to B and B lends that amount to C, it is B's responsibility to recover it from C and return it to A).
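To make the two processes above concrete, here is a toy sketch in Python (the function names and the repetition code are my own stand-ins, not anything from the post): "amplification" repeats each bit so the message can survive a noisy medium, the repetition factor plays the role of the accompanying signal, and "abridgment" at the receiver undoes the change by majority vote.

```python
import random
from collections import Counter

def amplify(bits, factor=3):
    # Process 1: strengthen the message by repeating each bit.
    # Process 2: the factor itself is the "accompanying signal" the
    # receiver needs in order to undo the change.
    return [b for b in bits for _ in range(factor)], factor

def noisy_medium(bits, flip_prob=0.05):
    # The conveyance corrupts each bit independently with some probability.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def abridge(bits, factor):
    # Receiver-side abridgment: majority vote over each repeated group.
    groups = (bits[i:i + factor] for i in range(0, len(bits), factor))
    return [Counter(g).most_common(1)[0][0] for g in groups]

message = [1, 0, 1, 1, 0, 0, 1]
sent, factor = amplify(message)
received = abridge(noisy_medium(sent), factor)
print(message == received)  # usually True: the redundancy survived the noise
```

That being said, let's move on to the idea of interstellar magnifiers.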

Interstellar magnifiers

If human civilization were to spread out from Earth and distribute itself across planets in other galaxies, communication between the various systems is bound to become a pain. In that case, suitably modified electromagnetic signals (light, RF waves, etc.) could be pulsed into the sky, aimed at objects whose strong gravitational fields would focus, and effectively boost, them on their way. With the assistance of suitably positioned satellites, the broadcast information could then be "narrowcast" down to receiving stations on some planet.

[caption id="attachment_20245" align="aligncenter" width="350" caption="Cassini's gravity-assistance"][/caption]

A significant hindrance to this method of communication is a phenomenon called galactic extinction: when the waves are intercepted by a cloud of galactic dust, they are absorbed and scattered into space, and the information is lost in the process. To minimize the amount scattered, polarized radiation may be considered.
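How much of a signal survives a dust cloud is commonly described by the Beer-Lambert law, I/I₀ = e^(−τ), where τ is the cloud's optical depth; here is a small illustrative calculation (the function name is mine):

```python
import math

def transmitted_fraction(optical_depth):
    # Beer-Lambert attenuation through a dust cloud: I/I0 = exp(-tau).
    return math.exp(-optical_depth)

for tau in (0.1, 1.0, 3.0):
    surviving = transmitted_fraction(tau)
    # 1.086 * tau is the same extinction expressed in astronomical magnitudes.
    print(f"tau={tau}: {surviving:.3f} of the signal survives "
          f"(~{1.086 * tau:.2f} mag of extinction)")
```

Even a modest optical depth of 3 destroys about 95% of the signal, which is why routing around, rather than through, dusty regions matters.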

The no-hair theorem

What happens when light, instead of bending around a black hole, falls into it? The answer is a puzzler because of something called the no-hair theorem, which states that every black hole is characterized only by its mass, charge and angular momentum. That means that if my laptop flies into a black hole, no information about the laptop can be retrieved beyond whatever changes it made to the black hole's mass, charge and angular momentum! If you cannot open an invisible door floating around my room, and I step inside the door someday, how will you find me?

If any mass enters a black hole and is swallowed, there should be a corresponding increase in the mass of the black hole. Theorists found, however, that a black hole cannot only swallow: just as it consumes energy, it must also radiate some energy away.

This radiation is called Hawking radiation, after Stephen Hawking, who predicted it, and calculations showed that it lies in the thermal section of the electromagnetic spectrum, i.e. a black hole radiates heat more than anything else. And since the black hole radiates heat, it must slowly be losing energy and, at some point, must evaporate completely without a trace left behind. Calculations that combine general relativity with quantum theory show that smaller black holes evaporate faster than larger ones (the lifetime grows as the cube of the mass), so a black hole with the mass of a car would evaporate in a minuscule fraction of a second.
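The standard semiclassical lifetime formula is t = 5120πG²M³/(ħc⁴); here is a quick sketch that evaluates it (constants rounded, function name mine) to show just how steep the M³ scaling is:

```python
import math

G    = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2 (rounded)
c    = 2.998e8    # speed of light, m/s (rounded)
hbar = 1.055e-34  # reduced Planck constant, J s (rounded)

def evaporation_time(M):
    """Semiclassical black-hole lifetime in seconds for a mass M in kg.

    t = 5120 * pi * G^2 * M^3 / (hbar * c^4): the M^3 dependence is why
    small black holes evaporate so much faster than large ones.
    """
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

print(evaporation_time(1.5e3))    # car-mass hole: ~3e-7 s
print(evaporation_time(1.989e30)) # solar-mass hole: ~7e74 s, ~2e67 years
```

A car-mass hole is gone in well under a microsecond, while a solar-mass hole outlives the current age of the Universe by some 57 orders of magnitude.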

[caption id="attachment_20247" align="aligncenter" width="545" caption="Hawking radiation mechanism"][/caption]

After complete evaporation... what about my laptop? Ultimately, we have a paradox: my laptop went into the black hole, and the black hole burped its mass-energy back out as featureless thermal Hawking radiation. Everything that distinguished the laptop, beyond its contribution to the hole's mass, charge and angular momentum, seems to have vanished, and the no-hair theorem insists the hole never kept a record of it in the first place.

Is the information then lost forever? Not possible; if it were, quantum mechanics' requirement that information be conserved would stand violated. Does the information ride out with the Hawking radiation itself? Not obviously; the radiation is thermal and carries no imprint of what fell in. Does the information jump out during the last moments of evaporation? Not possible; a vanishingly small black hole would then have to hold on to an arbitrarily large amount of information until the very end. Does the information slip into a baby universe that we can't interact with? If your imagination can accommodate the creation of such a universe, sure. Does the information get lost in the time dimension? Nah.

Where is the information, then?

Tuesday, 31 May 2011

Philosophiae homis


The 'Book Summary' makes me wonder... why are the likes of Carl Sagan, Stephen Hawking and Richard Feynman so few and far between? The popularization of science may not seem necessary for its acceptance in mainstream media, but over the years it has become increasingly necessary to clarify science's role in the eyes of the common man, even roles as concrete as maintaining the Large Hadron Collider or the International Space Station. That Einstein's papers on special and general relativity need a "redesigning" invites a moment's reflection: on how much our day-to-day activities depend on scientific research and development, on whether the growing investment in experimental apparatus is justified the way military spending is, and on whether anyone ascribes the need for that investment to anything apart from science's utilitarian value.
