
Wednesday, 2 July 2014

googleVis - lfprMotion
















[Interactive motion chart: data from lfpr.csv (chart ID MotionChartID1dbc4ce94734), generated with googleVis 0.5.3 under R 3.1.0 (2014-04-10).]


Monday, 23 December 2013

Census: Presentation

"Story of the States: Convergence or Divergence?"

Census Data Visualisation
by
Rukmini S and Vasudevan Mukunth (The Hindu)
December 23, 2013



Sunday, 10 February 2013

EUCLID/ESA: A cosmic vision looking into the darkness


I spoke to Dr. Giuseppe Racca and Dr. Rene Laureijs, both of ESA, about the EUCLID mission, the world’s first space telescope dedicated to studying dark energy and dark matter. For ESA, EUCLID will be the centrepiece of its Cosmic Vision programme (2015-2025). Dr. Racca is the mission’s project manager while Dr. Laureijs is a project scientist.
Could you explain, in simple terms, what the Lagrange point is, and how being able to study the universe from that vantage point could help the study? 
GR: Sun-Earth Lagrangian point 2 (SEL2) is a point in space about 1.5 million km from Earth in the direction opposite to the Sun, co-rotating with the Earth around the Sun. It is a nice and calm point from which to make observations. It is not disturbed by heat fluxes from the Earth, but at the same time it is not so far away that the large amount of observation data cannot be sent back to Earth. The orbit around SEL2 that Euclid will employ is rather large; it is easy to reach (in terms of launcher capability) and not expensive to control (in terms of fuel required for orbit corrections and maintenance manoeuvres).
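The quoted 1.5 million km figure can be recovered from the standard Hill-sphere approximation for L2, r ≈ R(m/3M)^(1/3). The sketch below is a reader's aide, not part of the interview; the constants are standard textbook values.

```python
# Rough location of the Sun-Earth L2 point via the Hill-sphere
# approximation: r ~ R * (m_earth / (3 * m_sun))**(1/3).
R_SUN_EARTH_KM = 1.496e8       # mean Sun-Earth distance (1 AU), km
M_EARTH_OVER_M_SUN = 3.003e-6  # Earth/Sun mass ratio

r_l2_km = R_SUN_EARTH_KM * (M_EARTH_OVER_M_SUN / 3) ** (1 / 3)
print(f"SEL2 lies roughly {r_l2_km / 1e6:.2f} million km beyond Earth")
# → roughly 1.50 million km, matching the figure quoted above
```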
Does Euclid in any way play into a broader program by ESA to delve into the Cosmic Frontier? Are there future upgrades/extensions planned? 
RL: Euclid is the second approved medium-class mission of ESA’s Cosmic Vision programme. The first is Solar Orbiter, which will study the Sun from a short distance. The Cosmic Vision programme sets out a plan for Large, Medium and Small missions in the decade 2015-2025. ESA’s missions Planck, presently in operation at L2, and Euclid will study the beginning, the evolution, and the predicted end of our Universe.
GR: A theme of this programme is: “How did the Universe originate and what is it made of?” Euclid is the first mission of this part of Cosmic Vision 2015-2025. There will be other missions, which have not been selected yet.
What’s NASA’s role in all of this? What are the different ways in which they will be participating in the Euclid mission? Is this a mission-specific commitment or, again, is it encompassed by a broader participation agreement?
GR: The NASA participation in the Euclid mission is very important but rather limited in extent. NASA will provide the near-infrared detectors for one of the two Euclid instruments. In addition, it will contribute to the scientific investigation with a team of about 40 US scientists. Financially speaking, NASA’s contribution is limited to some 3-4% of the total Euclid mission cost.
RL: The Euclid Memorandum of Understanding between ESA and NASA is mission specific and does not involve a broader participation agreement. First of all, NASA will provide the detectors for the infrared instrument. Secondly, NASA will support 40 US scientists to participate in the scientific exploitation of the data. These US scientists will be part of the larger Euclid Consortium, which contains nearly 1000 mostly European scientists.
Do you have any goals in mind? Anything specific or exciting that you expect to find? Who gets the data?
GR: The goals of the Euclid mission are extremely exciting: in a few words, we want to investigate the nature and origin of the unseen Universe: dark matter, five times more abundant than the ordinary matter made of atoms, and dark energy, which causes the accelerating expansion of the Universe. The “dark Universe” is reckoned today to amount to 95% of the total matter-energy density. Euclid will survey about 40% of the sky, looking back in cosmic time up to 10 billion years. A smaller part (1% of the sky) will look back to when the Universe was only a few million years old. This three-dimensional survey will allow us to map the extent and history of dark matter and dark energy. The results of the mission will help us understand the nature of dark matter and its place in an extension of the current standard model. Concerning dark energy, we will be able to distinguish between so-called “quintessence” and a modification to current theories of gravity, including General Relativity.
RL: Euclid’s goals are to measure the accelerated expansion of the Universe, which tells us about dark energy; to determine the properties of gravity on cosmic scales; to learn about the properties of dark matter; and to refine the initial conditions that led to the Universe we see now. These goals have been chosen carefully, and Euclid’s instrumentation is optimised to reach them as well as possible. The Euclid data will open the discovery space for many other areas of astronomy: Euclid will literally measure billions of stars and galaxies at visible and infrared wavelengths, with a very high image quality comparable to that of the Hubble Space Telescope. The most exciting prospect is the availability of these sharp images, which will certainly reveal new classes of objects and new science. The nominal mission will last for six years, but the first year of data will become public as early as 26 months after the start of the survey.
When will the EUCLID data be released?
GR: The Euclid data will be released to the public one year after their collection and will be made available to all researchers in the world.

Wednesday, 23 January 2013

A different kind of experiment at CERN

This article, as written by me, appeared in The Hindu on January 24, 2013.

--

At the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland, experiments are conducted by many scientists who don’t quite know what they will see, but know how to conduct the experiments that will yield answers to their questions. They accelerate beams of particles called protons to smash into each other, and study the fallout.

There are some other scientists at CERN who know approximately what they will see in experiments, but don’t know how to do the experiment itself. These scientists work with beams of antiparticles. According to the Standard Model, the dominant theoretical framework in particle physics, every particle has a corresponding particle with the same mass and opposite charge, called an anti-particle.

In fact, at the little-known AEgIS experiment, physicists will attempt to produce an entire beam composed not just of anti-particles but of anti-atoms by mid-2014.

AEgIS is one of six antimatter experiments at CERN that create antiparticles and anti-atoms in the lab and then study their properties using special techniques. The hope, as Dr. Jeffrey Hangst, the spokesperson for the ALPHA experiment, stated in an email, is “to find out the truth: Do matter and antimatter obey the same laws of physics?”

Spectroscopic and gravitational techniques will be used to make these measurements. They will improve upon “precision measurements of antiprotons and anti-electrons” that “have been carried out in the past without seeing any difference between the particles and their antiparticles at very high sensitivity,” as Dr. Michael Doser, AEgIS spokesperson, told this Correspondent via email.

The ALPHA and ATRAP experiments will achieve this by trapping anti-atoms and studying them, while ASACUSA and AEgIS will form an atomic beam of anti-atoms. All of them will continue testing and upgrading through 2013.

Working principle

Specifically, AEgIS will attempt to measure the interaction between gravity and antimatter by shooting an anti-hydrogen beam horizontally through a vacuum tube and measuring how much it sags due to the gravitational pull of the Earth, to a precision of 1 per cent.
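For a sense of scale, the sag of a horizontal beam follows from simple projectile kinematics, sag = ½g(L/v)². The sketch below is illustrative only: the flight length and beam speed are assumptions for the example, not AEgIS specifications.

```python
# Vertical drop of a horizontal anti-hydrogen beam over a flight
# length L at speed v, assuming antimatter falls at g = 9.81 m/s^2
# (the very assumption the experiment is designed to test).
g = 9.81   # m/s^2
L = 1.0    # m, illustrative flight-path length
v = 500.0  # m/s, illustrative beam speed

t = L / v               # time of flight
sag = 0.5 * g * t ** 2  # kinematic free-fall drop
print(f"sag ≈ {sag * 1e6:.0f} micrometres over {L} m")
# → sag ≈ 20 micrometres, which is why the measurement demands
#   micrometre-scale position resolution
```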

The experiment is not so simple because preparing anti-hydrogen atoms is difficult. As Dr. Doser explained, “The experiments concentrate on anti-hydrogen because that should be the most sensitive system, as it is not much affected by magnetic or electric fields, contrary to charged anti-particles.”

First, antiprotons are derived from the Antiproton Decelerator (AD), a particle storage ring which “manufactures” the antiparticles at a low energy. At another location, a nanoporous plate is bombarded with anti-electrons; some of them bind with electrons in the plate to form a highly unstable electron-anti-electron system called positronium (Ps).

The Ps is then excited to a specific energy state by exposure to a 205-nanometre laser, and then to an even higher energy state, called a Rydberg level, using a 1,670-nanometre laser. Last, the excited Ps traverses a special chamber called a recombination trap, where it mixes with antiprotons that are controlled by precisely tuned magnetic fields. With some probability, an antiproton will “capture” an anti-electron to form an anti-hydrogen atom.
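The two laser wavelengths correspond to definite photon energies via E = hc/λ. The quick conversion below, using the standard value hc ≈ 1239.84 eV·nm, is a reader's aide and not part of the reported setup.

```python
# Photon energies of the two excitation lasers, E = hc / lambda.
HC_EV_NM = 1239.84  # Planck constant times c, in eV*nm

for wavelength_nm in (205, 1670):
    energy_ev = HC_EV_NM / wavelength_nm
    print(f"{wavelength_nm} nm -> {energy_ev:.2f} eV per photon")
# The ~6 eV photon drives the first Ps excitation; the ~0.74 eV
# photon then lifts it to the weakly bound Rydberg level.
```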

Applications

Before a beam of such anti-hydrogen atoms can be generated, however, there are problems to be solved. They involve large electric and magnetic fields to control the speed of the beams and to collimate them, respectively, as well as powerful cryogenic systems and ultra-cold vacuums. Thus, Dr. Doser and his colleagues will spend many months making careful changes to the apparatus to ensure these systems work in tandem by 2014.

While anti-particles were first discovered in 1932, “until recently, it was impossible to measure anything about anti-hydrogen,” Dr. Hangst wrote. Thus, the ALPHA and AEgIS experiments at CERN provide a seminal setting for exploring the world of antimatter.

Anti-particles have been used effectively in many diagnostic devices such as PET scanners. Consequently, improvements in our understanding of them feed immediately into medicine. To name an application: Antiprotons hold out the potential of treating tumors more effectively.

In fact, the feasibility of this application is being investigated by the ACE experiment at CERN.

In the words of Dr. Doser: “Without the motivation of attempting this experiment, the experts in the corresponding fields would most likely never have collaborated and might well never have been pushed to solve the related interdisciplinary problems.”

Wednesday, 16 January 2013

Aaron Swartz is dead.

This article, as written by me and a friend, appeared in The Hindu on January 16, 2013.

--

In July 2011, Aaron Swartz was indicted in the District of Massachusetts for allegedly stealing more than 4.8 million articles from the online academic literature repository JSTOR via the computer network at the Massachusetts Institute of Technology. He was charged with, among other counts, wire fraud, computer fraud, obtaining information from a protected computer, and criminal forfeiture.

After posting a $100,000 bond for release, he was expected to stand trial in early 2013 to face the charges and, if found guilty, up to a 35-year prison sentence and $1 million in fines. More than the likelihood of the sentence, however, what rankled him most was that he was labelled a “felon” by his government.

On January 11, Friday, Swartz’s fight, against information localisation as well as the label given to him, ended when he hanged himself in his New York apartment. He was only 26. At the time of his death, JSTOR did not intend to press charges and had decided to release 4.5 million of its articles into the public domain. It seems as though this crime had no victims.

But, he was so much more than an alleged thief of intellectual property. His life was a perfect snapshot of the American Dream. But the nature of his demise shows that dreams are not always what they seem.

At the age of 14, Swartz became a co-author of the RSS (RDF Site Summary) 1.0 specification, now a widely used method for subscribing to web content. He went on to attend Stanford University, dropped out, founded a popular social news website and then sold it — leaving him a near millionaire a few days short of his 20th birthday.

A recurring theme in his life and work, however, was internet freedom and public access to information, which led him to political activism. An activist organisation he founded campaigned heavily against the Stop Online Piracy Act (SOPA) bill and eventually helped kill it. If passed, SOPA would have affected much of the world’s browsing.

At a time that is rife with talk of American decline, Swartz’s life reminds us that for now, the United States still remains the most innovative society on Earth, while his death tells us that it is also a place where envelope pushers discover, sometimes too late, that the line between what is acceptable and what is not is very thin.

The charges that he faced, in the last two years before his death, highlight the misunderstood nature of digital activism — an issue that has lessons for India. For instance, with Section 66A of the Indian IT Act in place, there is little chance of organising an online protest and blackout on par with the one that took place over the SOPA bill.

While civil disobedience and street protests usually carry light penalties, why should Swartz have faced long-term incarceration just because he used a computer instead? In an age of Twitter protests and online blackouts, his death sheds light on the disparities that digital activism is subjected to.

His act of trying to liberate millions of scholarly articles was undoubtedly political activism. But had he undertaken such an act in the physical world, he would have faced only light penalties for trespassing as part of a political protest. One could even argue that MIT encouraged such free exchange of information — it is no secret that its campus network has long been extraordinarily open with minimal security.

What then was the point of the public prosecutors highlighting his intent to profit from stolen property worth “millions of dollars” when Swartz’s only aim was to make them public as a statement on the problems facing the academic publishing industry? After all, any academic would tell you that there is no way to profit off a hoard of scientific literature unless you dammed the flow and then released it per payment.

In fact, JSTOR’s decision to not press charges against him came only after they had reclaimed their “stolen” articles — even though Laura Brown, the managing director of JSTOR, had announced in September 2011, that journal content from 500,000 articles would be released for free public viewing and download. In the meantime, Swartz was made to face 13 charges anyway.

Assuming the charges are reasonable at all, his demise will then mean that the gap between those who hold onto information and those who would use it is spanned only by what the government thinks is criminal. That the hammer fell so heavily on someone who tried to bridge this gap is tragic. Worse, long-drawn, expensive court cases are becoming roadblocks on the path towards change, especially when they involve prosecutors incapable of judging the difference between innovation and damage on the digital frontier. It doesn’t help that such prosecution also neatly avoids the aura of illegitimacy that imprisoning peaceful activists would carry for any government.

Today, Aaron Swartz is dead. All that it took to push a brilliant mind over the edge was a case threatening to wipe out his fortunes and ruin the rest of his life. In the words of Lawrence Lessig, American academic activist, and his former mentor at the Harvard University Edmond J. Safra Centre for Ethics: “Somehow, we need to get beyond the ‘I’m right so I’m right to nuke you’ ethics of our time. That begins with one word: Shame.”