Let's break down the Large Hadron Collider (LHC) to see what it's made of. The 27-km-long tunnel is made of concrete and is 3.8 metres wide. Two beam pipes guide the particles in opposite directions around the ring at 0.999999991 times the speed of light (that's 299,792,455 m/s, just 3 m/s shy of light itself), and once they've been accelerated to full energy, the particles can meet at one of four intersections between the pipes. To keep the beams circulating and increase the chances of a head-on collision, 1,232 dipole magnets bend the particles around the ring while 392 quadrupole magnets focus the beams. The magnets weigh a total of 12,500 tonnes, surpassed in size only by the static ICAL detector's 50,000-tonne magnet at India's INO.
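As a quick sanity check of those numbers (a sketch using textbook constants, not anything official), that speed corresponds to a Lorentz factor of about 7,450 - which, for a proton, works out to roughly 7 TeV per beam particle, the LHC's design energy:

```python
# Back-of-the-envelope check of the quoted beam speed (0.999999991 c).
# Constants are standard textbook values.
import math

c = 299_792_458.0          # speed of light, m/s
beta = 0.999999991         # proton speed as a fraction of c
m_p_GeV = 0.938272         # proton rest energy, GeV

gamma = 1.0 / math.sqrt(1.0 - beta**2)   # Lorentz factor
energy_TeV = gamma * m_p_GeV / 1000.0    # total energy per proton

print(f"speed  : {beta * c:,.0f} m/s")   # ~299,792,455 m/s
print(f"gamma  : {gamma:,.0f}")          # ~7,450
print(f"energy : {energy_TeV:.1f} TeV")  # ~7 TeV per proton
```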
[caption id="attachment_21265" align="aligncenter" width="448" caption="The Compact Muon Solenoid (CMS) detector in the LHC"]
These 1,624 magnets are electromagnets, and as the current through them is switched a little more than 22,000 times per second, they heat up. They can't be allowed to do that, however, and to cool them down, 96 tonnes of liquid helium is used to keep their niobium-titanium coils at 1.9 K. In all, the magnets store 10 gigajoules of energy while generating a magnetic field of up to 8.3 tesla (about 332,000 times stronger than the Earth's) as 323 trillion protons gear up at near-light speed for what can only be called the bloodiest civil war.
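Two of those figures can be checked on the back of an envelope (a sketch, taking Earth's surface field as roughly 25 microtesla - it actually varies from about 25 to 65): the 332,000x comparison, and the fact that an 8.3 T field is just what it takes to bend a 7 TeV proton around arcs of the LHC's size:

```python
# Rough checks with assumed values, not official LHC specifications.
e = 1.602176634e-19    # elementary charge, C
c = 299_792_458.0      # speed of light, m/s

B_lhc, B_earth = 8.3, 25e-6            # tesla
print(f"field ratio   : {B_lhc / B_earth:,.0f}")   # ~332,000

# Bending radius r = p / (qB), with p ~ E/c for an ultrarelativistic proton.
E = 7e12 * e                           # 7 TeV in joules
r = (E / c) / (e * B_lhc)              # metres
print(f"bending radius: {r:,.0f} m")   # ~2,800 m, matching the ring's arcs
```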
This cooling business makes the LHC the largest cryogenic facility of all time, and a working example at scale as we move into a future in which the storage and transportation of hydrogen for fuel cells is an increasingly difficult problem - and not just because we haven't given it enough thought yet. You see, hydrogen is extremely explosive in the presence of oxygen, reacting readily to form steam and a large quantity of heat. It therefore has to be stored in leak-proof containers capable of withstanding big shocks. It also has to be stored in its liquid state, because gaseous hydrogen has such a low density that even a very small mass occupies a large volume.
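To see just how unwieldy the gas is, here's a quick ideal-gas estimate (a sketch; real storage schemes use high pressure or liquefaction) of the volume 1 kg of hydrogen occupies at atmospheric pressure:

```python
# Volume of 1 kg of hydrogen gas at 0 deg C and 1 atm, via PV = nRT.
R = 8.314          # gas constant, J/(mol K)
T = 273.15         # temperature, K
P = 101_325.0      # pressure, Pa
M = 2.016e-3       # molar mass of H2, kg/mol

n = 1.0 / M                 # moles in 1 kg of H2
V = n * R * T / P           # ideal gas law
print(f"{V:.1f} m^3")       # ~11 m^3 per kg; as a liquid it's ~0.014 m^3
```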
[caption id="attachment_21266" align="aligncenter" width="400" caption="The cross-section of a cryopump"]
At the LHC, the liquid helium is stored in copper-jacketed cryostats that maintain it at 1.9 K, and it is circulated by means of special pumps called cryopumps. These pumps are kept at the very low temperatures they need to remain functional by the compressed helium itself. In the case of hydrogen, however, a smaller cryocooler can be attached to the cryopump (alongside a sorption pump - but that's not important now) so it stays cold, and the hydrogen can be made readily available for use through a regenerative evaporation process.
Also, if you didn't know, the working principle of the LHC is shared by the cyclotrons that work alongside Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET) and Computerized Tomography (CT) scanners in medical diagnostics. Basically, these scanners detect the decay of radioactive isotopes with very short half-lives (usually from 20 minutes to 110 minutes) within the human body through a series of beta decays and electron-positron annihilation events. A cyclotron in close proximity generates the isotopes to be traced, and a cyclotron works on the same principle as the LHC: take a charged particle, set it on a curved path by exposing it to a magnetic field, and kick it with an electric field that switches direction across two D-shaped electrodes every half-turn, until the particle has sped up enough to be released.
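The resonance at the heart of this trick is easy to compute: in a uniform field, the orbit frequency f = qB/2πm doesn't depend on the particle's speed (until relativity kicks in), so a fixed-frequency voltage keeps kicking the particle at just the right moments. A sketch with an assumed 1.5 T field, not the parameters of any particular medical cyclotron:

```python
# Cyclotron resonance frequency f = qB / (2 pi m) for a proton.
import math

q = 1.602176634e-19    # proton charge, C
m = 1.67262192e-27     # proton mass, kg
B = 1.5                # magnetic field, tesla (assumed)

f = q * B / (2 * math.pi * m)
print(f"orbit frequency: {f / 1e6:.1f} MHz")  # ~22.9 MHz for these values
```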
[caption id="attachment_21267" align="aligncenter" width="532" caption="A classical Lawrence cyclotron"]
Because the LHC's detectors have to characterize high-energy collisions precisely, i.e. determine the particles' charge, mass and other quantum properties to within 99.99%, the machine has to deliver a high luminosity. Further, each detector is not just a place that receives signals and immediately interprets them. Instead, a detector comprises everything from the detection mechanism itself to the millions of read-out channels that transmit the data to supercomputing grids. This built-in capacity to work at various energies and with unpredictable scenarios gives the diagnostics and instrumentation industries a lot to work with when it comes to the fabrication of scanners.
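To get a feel for what those read-out channels do, here's a toy version of the filtering idea (everything in it - the thresholds, the event shapes - is invented for illustration, not CERN's actual trigger logic): far more events arrive than can ever be stored, so only the interesting ones survive a cut:

```python
# Toy "trigger": simulate events and keep only the rare interesting ones.
import random

random.seed(42)

# Each event is (total energy in GeV, number of hit channels) - made up.
events = [(random.expovariate(1 / 50), random.randint(1, 200))
          for _ in range(100_000)]

# Keep only high-energy, high-multiplicity events.
kept = [(e, hits) for e, hits in events if e > 250 and hits > 100]

print(f"kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.2f}%)")
```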
As of now, image reconstruction is the most difficult task in the entire process. However, the technology has improved to the point that the different tracer isotopes and their biochemical reactions with tissues can be visualized even in previously inaccessible sections of the body. In neuropsychiatry, for instance, a substance called a radioligand is used to label the dopamine, serotonin and opioid receptors in the brain. Then, using a suitable scanner like a PET, the levels and neurological pathways of these receptors are monitored over a period of time. Because these receptors are significantly involved in disorders like schizophrenia, Alzheimer's, substance abuse and mood disorders, advances in selecting the right radioligands, mapping where each reaction occurs, and identifying the various kinds of reactions and reconstructing them in 3D can go a long way towards finding suitable cures.
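Here's a minimal sketch of what reconstruction involves, using unfiltered back-projection on a toy 2-D phantom (real scanners use filtered or iterative methods; the sizes and shapes below are arbitrary):

```python
# Project a toy "activity map" at several angles (as a PET ring effectively
# does via its lines of response), then recover a blurry image by smearing
# each projection back across the image plane.
import numpy as np
from scipy.ndimage import rotate

# Toy phantom: a bright square "lesion" on an empty background.
phantom = np.zeros((64, 64))
phantom[24:40, 28:44] = 1.0

angles = np.arange(0, 180, 10)  # projection angles in degrees

# Forward projection: rotate, then sum columns (a crude Radon transform).
sinogram = [rotate(phantom, a, reshape=False, order=1).sum(axis=0)
            for a in angles]

# Unfiltered back-projection: smear each projection back at its angle.
recon = np.zeros_like(phantom)
for a, proj in zip(angles, sinogram):
    smear = np.tile(proj, (64, 1))                  # spread along rows
    recon += rotate(smear, -a, reshape=False, order=1)

print("peak at:", np.unravel_index(recon.argmax(), recon.shape))
```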
[caption id="attachment_21268" align="aligncenter" width="448" caption="In a PET scanner, an electron and an injected positron annihilate each other within the human body to produce two gamma rays. When these rays reach the scanner, they are recorded as a burst of light. The image of the brain from a PET scan, shown above, is reconstructed by orienting the scanner in various directions."]
The supercomputing grid mentioned earlier is an array of monstrously powerful computers assembled at CERN, capable of processing tens of gigabytes of data per second, churning out the results, ordering them, and then looking for patterns. At the same time, there is a Europe-wide project, LHC@home, that lets volunteers log in to the CERN servers and donate their computers' idle time for added computing power. Given that the CERN supercomputers are ahead by orders of magnitude, I don't know whether the computing power the initiative contributes makes much of a difference. But that such a concept exists at all is what matters - especially considering we're in the era of cloud computing.
Another field of physics that requires such massive computing power is meteorology. The second-by-second reconstruction of weather patterns, cyclones, hurricanes, cloud movements, winds, rains and ocean waves across a swath of wildly varying topographies, each highly unpredictable, makes weather forecasting a superbitch. As of now, it is a task taken on only by national governments and prolific academic institutions. In this context, what if a computing grid existed that drew on volunteers' idle PC time to assist with the calculation and simulation of climatic patterns? It would reduce the load on existing computing infrastructure and release computation time for other calculations, making forecasting quicker. Or, interpreted another way, more timely.
Such tightly coupled distributed systems, called clusters, depend on advances in running synchronized algorithms on essentially asynchronous machines, and in determining event sequences and logical cause-effect ("what happened when/what's next"). Clusters also have to address a host of other issues, and understanding how CERN deals with them every day will provide invaluable insight into setting up such systems in other industries, too. These issues include achieving overall system reliability in the presence of faulty processes, keeping the signal-to-noise ratio (SNR) high at all points through self-stabilization to reduce the amount of error, and the Two Generals' Problem.
[caption id="attachment_21269" align="aligncenter" width="438" caption="In the Two Generals' Problem, two generals await with their armies on either sides of a valley. The people they wait to conquer are in the valley, and the invasion will be successful only if both generals attack at the same time. Since sending a messenger through the valley could result in his being "flipped" or lost, a potentially infinite number of messengers will be required to confirm a time of action."]
These are only some of the innovations that CERN has pioneered. In the 1980s, nobody could have foreseen such a spurt of improvements because, back then, the LHC was just a proposed particle accelerator. As the years went by and the demands of the physics community grew, countries came together to play to their strengths, each contributing an idea, a component or some other service it was at the forefront of. Moreover, the LHC also pushed regional and associated industries to toughen up, encourage research in niche areas, and think of better ways to turn ideas into reality quickly. And when the internet arrived and opened the world up more than globalization had, one country's contributions proved to be another country's solutions, and the communications gap that had existed between them for all those years was bridged by the experiment.
I'm sure there's a lot more to the collider, but the purpose of this endeavour was to illustrate that the LHC is not just a physics thing. Its contribution to mankind long ago surpassed the hunt for the Higgs boson: today, collider technology is everywhere, from baby diapers and cereal boxes to X-ray spectroscopy, Bose-Einstein statistics and superconductors.