Friday, 21 December 2012
Is there only one road to revolution?
Some of this connects, some of it doesn't. Most of all, I have discovered a fear in me that keeps me from disagreeing with people like Meena Kandasamy - great orators, no doubt, but what are they really capable of?
The piece speaks of revolution as being the sole goal of an Indian youth's life, that we must spend our lives stirring the muddied water, exposing the mud to light, and separating grime from guts and guts from glory. This is where I disagree. Revolution is not my cause. I don't want to stir the muddied water. I concede that I am afraid that I will fail.
And at this point, Meena Kandasamy would have me believe, I should either crawl back into my liberty-encrusted shell or lay down my life. Why should I when I know I will succeed in keeping aspirations alive? Why should I when, given the freedom to aspire, I can teach others how to go about believing the same? Why should I when I can just pour in more and more clean water and render the mud a minority?
Why is this never an option? Have we reached a head, that it's either a corruption-free world or a bloodied one? India desperately needs a revolution, yes, but not one that welcomes a man liberated after pained struggles to a joyless world.
Monday, 14 May 2012
Anumana, Drstam and Samkhya

The first response to such a dilemma would be to segregate the incoming information based on some logic. For example, the logic could be medium: audio and visual objects of evidence could be saved in a set distinct from another that contains no information that can be heard or seen. However, there are too many such "logics" up for grabs. Even if information is separated on the basis of how it is perceived, our faculties of perception come under the scanner. If a video has both great audio and stunning imagery, do I save the file in both categories or in just the one category where I think it belongs more? If I save it in both, it's going to be redundant half the time. If I save it in just the one, I'm effectively ruling that however I classify it now is how it will always stay classified. Once a thief, always a thief.
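The either/or dilemma above - duplicate the file across categories, or lock it into one forever - is really a limitation of single-parent hierarchies. A tag-based index dissolves it: an item is stored once and labelled many times. The sketch below is purely illustrative; the class and file names are mine, not the post's.

```python
# A minimal sketch of tag-based classification: one item, many labels,
# no duplication. All names here are hypothetical.
from collections import defaultdict

class TagIndex:
    def __init__(self):
        self._by_tag = defaultdict(set)  # tag -> set of items

    def add(self, item, *tags):
        # The item is recorded once per tag, but there is only
        # ever one item, not one copy per category.
        for tag in tags:
            self._by_tag[tag].add(item)

    def find(self, tag):
        return self._by_tag[tag]

index = TagIndex()
index.add("lecture.mp4", "audio", "visual")  # stored once, found under both labels
index.add("podcast.mp3", "audio")

assert "lecture.mp4" in index.find("audio")
assert "lecture.mp4" in index.find("visual")
```

Nothing is redundant and nothing is frozen: adding a new tag to an old item later is a one-line operation, not a re-filing.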
However, a classification based on the pathways of perception (Drstam) is indispensable, perhaps more so than a classification based on the pathways of cognition (Anumana) is. The difference between the two is that the former expresses the sentiment "how we see it" and the latter, the sentiment "what we see it for". As is obvious, the two are closely related. In fact, they are coincident until some intrinsic aspect about them has been projected strongly.
For instance, if I interpret inexplicable noises in the dark to be evidence for the presence of ghosts, then what I am seeing - though improbable - coincides perfectly with what I'm seeing it for. Until I find the source of the noise, I will continue to believe in ghosts in that scenario. When I find the source, however, I will empirically conclude that what I saw didn't coincide with what I saw it for. The link between the two will stand broken once the truth of the observation is discovered, making the observation a projection of fact deemed strong because of its implications for my internal logic.
Therefore, there are two reasons to introduce a symmetry-breaker parameter:
- To differentiate between Anumana and Drstam when they seem to coincide
- To eliminate the "what we see it for" because determinism can only be incommensurate with an indeterminable scenario within which we are trying to maximise knowledge-gain
This symmetry-breaker is Samkhya. In Hindu philosophy, from which I derive (a part of) these logics, Samkhya declares God to be unprovable. In the information-segregation scenario, consequently, the absence of a controlling external entity enforces the presence of an internal logical consistency (brought about by the strong projection of some fact). Otherwise, the external observer won't be able to make any sense of the information. Because we have assumed that the influx of information is already disordered, the only logical conclusion is that Anumana - the "what we see it for" - and the capacity of logical inference it betokens are necessary.
This hints at an uncertainty principle much like the one in particle physics:
- I can know what I want at all points of time but not have access to all there is to be known, or
- I can access a pool of infinite knowledge but not be certain about what I want to know.

Tuesday, 17 April 2012
The architecture of access

The underpinning motivation of Open Access (OA) is to make knowledge freely available to everyone, especially to those who work with limited access to libraries and other resources. Specifically, OA is said to have been implemented when peer-reviewed journal articles are available for free to anyone who wishes to use their intellectual property, as long as credit is given where it is due. The protocol enhances research by bringing reports and articles to those who might not have been able to access them without OA's mandates in place. It promotes citations and encourages new work to be built upon the old, speeding up innovation and development.
Evidently, those wishing to implement OA must be research-minded more than commercially inclined. There are, of course, merits and demerits that come with its implementation. The most attractive merit is that OA journals reduce publication delays by focusing on making submissions more accessible. The most formidable deterrents are the monetary concerns: OA journals shift the burden of payment from readers to authors.
Instead of being paid by readers for access to articles, OA journals earn by charging authors, their employers and/or their research funders for getting papers peer-reviewed. This is a charge that the research community must bear in order to assure users that what they will be reading is good-quality information: the message is "if you want your work to be developed quickly, pay up."
However, such charges are easily outweighed by the increased probability of a work being cited once it is brought under the OA umbrella—an observation supported by various studies. The latest, conducted in 2010, further noted that articles liberated by OA benefited not from authors self-selecting which articles to make freely accessible but from users selecting which articles to cite. This reflects the fact that the more accessible an article is, the more it will be cited. (Interestingly, the study also found that citations followed the Pareto principle: the top 20 per cent of all articles, graded by quality, received 80 per cent of all the citations.)

If publishing in an OA journal is the "gold" road to open access, the other way is to take the "green" road and self-archive: authors themselves choose an appropriate archiving resource, such as CiteSeer (hosted at Pennsylvania State University) for computer and information scientists and arXiv (by Cornell University) for physicists, and upload their articles for viewing. Since these authors also publish in journals to accrue credibility, it is advisable for them to look up the journal's self-archiving policies before going down the "green" road.
Once the knowledge lying locked within journals has been unlocked by OA, readers can avail themselves of a wealth of concepts and numbers at their fingertips. Since journals play host to all kinds of studies—not just scientific—OA's potential is best understood in terms of the macroeconomic development of the community it targets. It is not just about a bunch of scientists sitting around a laptop and siphoning off citations for their next big paper. Instead, it is about nations identifying strengths and opportunities for further development and recognizing potential partners worldwide who will shape funding, advocacy and support for regional initiatives. It is about promoting art and cultural programmes and encouraging structural changes in socio-economic policy to introduce greater freedom—of expression as well as of thought.

In India, however, the absence of a national mandate on OA publishing has made Indian works less accessible. Consequently, because of reduced paper visibility and a poor subscriber base, Indian scientists prefer to publish their works in foreign journals in order to increase academic impact. In fact, there are only 822 OA-archived papers in and from India. Needless to say, this is a vicious cycle in the academic community that must be broken, and the only way to do so would be to compel publishers to make OA versions of papers available, even if after a short delay.
The implementation of OA does have its fair share of critics, however, whose remarks range from questioning the need for a free-access mandate to arguing against paying taxes to get papers published instead of to buy access to them. The latter could be a valid argument in countries that publicly fund research—implying that federal grants are rendered pointless by researchers having to shell out money anyway—if one were to exclude the issue of access: currently, publicly funded research is indeed accessible, but only in national libraries and (offline) institutional repositories. The principle of OA is to make this access more widespread, but the argument does highlight the need for cheaper processing fees so that marginalized research groups are not deterred from entering mainstream discussion fora.
Saturday, 7 April 2012
Degeneracy of the fantastic
Is the future here, then? The future is never here by definition, so that's ruled out. What, then, is the future? In the 1960s-1980s, a critical mass of fiction writers sat down in the Golden Age of science fiction and thought up a world in which the problems of their time didn't exist, a world in which the capacity for human goodness stood amplified by the development of technology immensely advanced but not yet radical. Where are such writers now? Going around the web looking for answers to these questions, I find that futurism has evolved into a less materialistic and more scientific school of thought.

Today, even as the price of being unique has gone up, the ease with which uniqueness can be found has gone down - original thinking has become genuinely hard with the growth of influences (which has always been happening) and their pervasiveness (which is a product of the revolution). There has been more and more to know about, to learn, and that learning has eroded the base of what was left to speculate about. This doesn't mean the solution is to limit learning: learning takes priority over writers inventing the future any day. It only means something unfortunate has happened that has quelled a once widely popular interest.

Tuesday, 27 March 2012
The cause-effect paradigm
From time to time, students and teachers alike need to be reminded that each topic in a subject is weak by itself, and only with the assistance of other topics is anything achieved. Instead of going from specifics to the larger picture, why not come from the larger picture to the specifics? After all, and this is just a (convenient) example, mathematics is a powerful but singular set of tools used to solve problems in the real world: every problem is application-driven, including in string theory and loop quantum gravity, where, without the verification of their hypotheses by experiments, each remains just a strongly defended opinion.
- The tools of multilateral thinking can be used within classrooms as well to improve efficiency and productivity.
I must concede that some problems are better solved using some tools than others, but it is important to keep in mind why the problem is being solved that way. Even if calculus provides a circuitous route to a solution, what's wrong with the calculus-lovers adopting it to get there? When they get there, the relationship between the problem and the solution becomes clearer: a better cause-effect relationship is established than when a student struggles through vectors and is exhausted by the end, reluctant to take the subject up again.
As far as laying the groundwork is concerned, teaching students everything is the way to go: at some point later, then, they will be better equipped to make a choice - between what they think they ought to stick with and what they think they can afford to avoid. However, in this order of things, the problems solved using tool-set A and tool-set B, even if in different terms, could be the same, or related in some way so that even what seems difficult could be better understood in terms of what seems easy.
These are only musings concerned with the different ways through which students can convert information into knowledge. The point is: as long as we're here to solve problems, let's have fun doing it.
Saturday, 18 February 2012
The learner as far-seer
The reason I remember the experience is that, more often than not, one doesn't know when the learning phase of life ends - many, like me, don't even know what comes after it, if it ends. However, earlier today, when I was reading a journal article on laser-induced plasma and its application in particle accelerators, I surprised myself by understanding the entire thing without stopping even once; I could get what the authors were saying even when they were speaking only via formulae.
It was strangely dejecting because one of the most likeable things about particle physics, in my opinion, is its tendency to throw up previously unknown information just when we least expect it. In fact, even the one thing we thought we knew about this universe - the highest speed possible - was seemingly defied last year by some 15,000 neutrinos. And in such a scenario, when the picture suddenly becomes clear, when I can see the jigsaw puzzle board and the different empty shapes here and there waiting to be filled, it's as if I'm ready to start answering the bigger questions and leave the smaller ones behind.
- Clinton Davisson (left) and Lester Germer conducted an experiment since named after them - the Davisson-Germer experiment - in 1927. Six years earlier, Einstein had won the Nobel Prize in physics for his discovery that light comes in discrete encapsulations of energy called quanta. In 1924, French physicist Louis de Broglie presented his thesis that all particles have a wave-like characteristic. In the Davisson-Germer experiment, the two Americans stumbled across an electron diffraction pattern where they were expecting a diffuse-reflection pattern while studying the surface of nickel. This proved de Broglie's informed conjecture true.
I feel like the depressed man whose psychiatrist suggests he witness a performance by a clown in town. The depressed man then admits he is that clown.
At this point, I see two outcomes. The first one hints that I only want to keep learning and I'm not as interested in "deploying" that knowledge usefully. That is only partly true because, hey, I don't have a particle collider in my backyard that introduces new particles into my life so I can piece the universal puzzle together better. The second outcome suggests that my knowing a lot of things - science-wise and not - is, for the most part, a product of this fear of what-will-or-won't-come-next.
The first outcome doesn't bother me much because I've discovered I like teaching. Even though I may not be using my knowledge of IC engines to fix vehicles on desolate highways, I try and ensure as many people as possible understand how such engines work and do what they want with that knowledge. The same applies for particle physics. However, in this case, the dimension of teaching acquires more weight because its capacity to be misunderstood is great: it's a developing field whose foundations are currently under fire, whose experiments are so complex that multiple governments are helping fund it, whose conclusions are so counter-intuitive that the layman and the physicist are today many perspectives apart.
The second outcome is something I learnt while writing this post. The impetus that sustains my two-decade-long learning spree is nothing but fear, a fear of the unknown. To me, not learning something on a given day seems like forgoing a chance to imagine something we might not physically live. It's like the Copenhagen interpretation of quantum mechanics - a.k.a. Schrodinger's cat: for as long as I don't open the box, the cat is both dead and alive, the experience both there and not there.
Saying "I learn not because I want to" is too bland: I learn because I want to look into the darkest corners of the universe and not see something that I can't understand or gauge in some way.
The popularly perceived notion of beauty comes with inexplicability: the capacity of an entity to defy definition and/or predictability, to defy structure and exhibit a will of its own in form and function. However, the silent reminder we are given every day - that, no matter how far out into the universe we venture or how deep we probe into the atom, the laws of physics are the same - is the soul of beauty. And the inexplicability I seek to defy by learning is simply understanding how the same thing that gave us the dung beetle also gave us the Carina Nebula, that the same thing that gave us the Monarch butterfly also gave us black holes.
- This image of the Carina Nebula is composed of multiple shots taken from the Atacama Desert in South America.
I think that's a fear I've enjoyed and enjoy having.
Tuesday, 7 February 2012
The Middle Earth of physics: Where Tolkien and the physicists meet to have tea
Studying physics is like reading The Lord of the Rings trilogy. At first, there is a general excitement about things to come, how the small events at the beginning are going to avalanche into something portentous. Then, there comes the middle section where things get slow and a tad boring, but it's still a section that you have to understand before you can get on to bigger things. And then, there's the finish: spectacular and very memorable.
Things are the same with physics. First, there are the atoms, the hobbits of the physical realm. With them, the molecules, bonds and a wide variety of interactions between different particles. There is enough about their behaviour to stoke one's curiosity, to explore how they interact under different circumstances toward different results. Then, as the basic structure of all materials has been understood, we move on to the universe in which they exist and how they shaped it. That's where things get tricky and, quickly, mindboggling.
At one point, however, all the concepts that are trying to be understood suddenly coalesce into one big, beautiful picture of the universe. There are stars, novae, nebulae and black holes, and diamonds, rubies and emeralds, and light and its millions of colours. This is where the beauty of physics becomes really evident, summoning appreciation and awe at its poignancy. This is also where the audience's focus is while, all the time, the physicist labours in the middle section to understand more, to explore more.
- The Pillars of Creation (one of the most beautiful images from outer space)
A lot of what goes on in the beginning is taught at schools. The foundation is laid such that wheresoever the student's interest lies, he finds himself equipped enough to move ahead confidently in that direction. All of what happens in the middle is locked up in universities, research labs and journals. That is where the core of the scientific community resides, constantly hypothesizing, experimenting, reviewing and publishing. The contents of the spectacular finish are what is circulated in the media: news reports, TV shows, etc., the stuff we see even if we don't care to look in the right places.
A book as comprehensive as The Lord of the Rings in its delineation of fantastic plots and sub-plots, of valorous and scheming characters, and of strange places and their stranger legends is bound to become both heavily inspirational and literarily restricting. Since 1954-55, when the trilogy was first published, there have been hundreds of books that show some sign or the other of their authors having borrowed from Tolkien's brainchild. At the same time, many of them experienced only fleeting success simply because they were measured against the scope of the big daddy.
Physics isn't different. Every time there is a revolution - which has been happening less frequently of late because (we think) we're in the vicinity of a Solution to Everything - there is reluctance, reaffirmation and then reorientation, in that order, within the scientific community. More recent discoveries add more meaning not only to the present but also to the past. Similarly, more recent knowledge carries even more significance because the past has aged. As we gradually zero in on something, it becomes more difficult to think radically, to think way out of the box, because such suggestions are considered abnormal in comparison to something groundbreaking that came before.
- Thomas Kuhn is known for his controversial 1962 book, The Structure of Scientific Revolutions, in which he characterized the now-staple concept of a paradigm as the entity that undergoes rigorous testing before the scientific community can induct a once-anomalous fact.
This phenomenon is something that ought not to be eradicated: it is necessary to weed out the unscalable and the superficial. It is persistence in such an environment that reaps the greatest rewards, even though the idea may sound oddly masochistic.
For example, in the case of Dan Shechtman, whose story was popularized after he won the Nobel Prize for chemistry in 2011: even though the abrasive interference of Linus Pauling was unfortunate, the atmosphere of doubt was heavy because the conviction of Shechtman's peers got in the way of his immediate success. However, at all other points of time, that conviction is necessary to sustain research.
- Dan Shechtman
This doesn't mean all knowledge in physics follows from what came before it. After all, the only things fixed in nature are the laws of physics, and it is by closely observing them that we begin our first lessons in the subject.
For the next few decades after the 1950s, the spell of The Lord of the Rings over fantasy fiction couldn't easily be broken (check postscript), so pervasive was its influence. Only gradually did writers realize that fantasy fiction is simply what the world is not, and that thought resurrected a treasure-chest of ideas, giving us the pleasurable writing of Steven Erikson, Ursula Le Guin, Terry Goodkind, Stephen Donaldson, Robert Jordan and others.
Analogously, after Albert Einstein formulated his general theory of relativity (GR) and quantum mechanics (QM) was built up by Schrodinger, Planck, Pauli, Heisenberg and others, there was a falling-out amongst physicists. The two monumental theories couldn't be reconciled, resulting in academic chaos. It was in such an atmosphere that two factions of radical thought emerged: loop quantum gravity and M-theory (a.k.a. string theory), each attempting to go beyond what was set down in GR or QM. (In fact, through an attempt at reconciliation, these two theories have evolved to explain some of the most fundamental secrets of the universe.)
- "A Calabi-Yau manifold is a special type of [smooth surface] that shows up in certain branches of mathematics such as algebraic geometry, as well as in theoretical physics. Particularly in superstring theory, the extra dimensions of spacetime are sometimes conjectured to take the form of a 6-dimensional Calabi-Yau manifold." - Wikipedia
Ultimately, the lessons with which we journey into the future of science (or is it already here?) are all encapsulated in the spirit of The Lord of the Rings, at least in my opinion. Both the magnum opus and physics have been and are seminal in various ways. Even though the trials and tribulations of Middle Earth may not have been the cause of great relief and healing like physics has, the journey into their causes was a teaching experience nonetheless.
The similarities that I have made a note of are simply empirical and born out of my fondness for both entities, but they are also equally undeniable. For instance, while the division of physics into three "realms" may seem perfunctory to some, it is a good place to begin to understand why what Elsevier Publications is up to is horrible.
*
PS:
- "Do you remember [...] The Lord of the Rings? [...] Well, Io is Mordor [...] There's a passage about "rivers of molten rock that wound their way ... until they cooled and lay like dragon-shapes vomited from the tortured earth." That's a perfect description: how did Tolkien know, a quarter of a century before anyone saw a picture of Io? Talk about Nature imitating Art.", Arthur C. Clarke, 2010: Odyssey Two, Chapter 16 'Private Line'
- http://www.moongadget.com/origins/lotr.html
- Gary Gygax, creator of Dungeons and Dragons: "How did it influence the D&D game? Whoa, plenty, of course. Just about all the players were huge JRRT fans, and so they insisted that I put as much Tolkien-influence material into the game as possible. Anyone reading this that recalls the original D&D game will know that there were Balrogs, Ents, and Hobbits in it. Later those were removed, and new, non-JRRT things substituted–Balor demons, Treants, and Halflings. Indeed, who can doubt the excellence of Tolkien’s writing? So of course it had a strong impact on A/D&D games." Link: http://www.theonering.net/features/interviews/gary_gygax.html
Tuesday, 31 May 2011
Philosophiae homis
The 'Book Summary' makes me wonder... why are the likes of Carl Sagan and Stephen Hawking and Richard Feynman so few and far between? The popularization of science may not seem like a necessary fixture to its acceptance in mainstream media, but over the years it has become increasingly necessary to clarify its role in the eyes of the common man—even roles such as those played in the maintenance of the Large Hadron Collider or of the International Space Station. That Einstein's papers on special and general relativity need a "redesigning" betokens a moment's reflection on the dependence of our day-to-day activities on scientific research and development, whether the increasing investment in experimental apparatuses sees justification just like military spending does, and if anyone ascribes the need for that investment to anything apart from science's utilitarian value.
Thursday, 19 May 2011
Language, truth and knowledge
It would prove futile to address every incidence of curiosity by seeking out the requisite "knowledge" that constitutes the "knowable" volume of the subject through an isolationist perspective; it is also obviously futile to address the content in its entirety lest the curiosity—essentially the context within which any epistemological exegesis becomes meaningful—stands overwhelmed. If I were to associate any semantic weight with the idea of justice, I would ask: where does the knowledge, "the truth", of law arise from, what is the need that, in the eyes of those who partake of its provisions, it assesses, and what is the modality within which it finds realization? Could there exist an epistemological variable the evaluation of which represents a (quantitative or qualitative) difference between the cognitive value of a statement of truth and that of a statement of law, thereby, say, establishing the origin of the truth of law as being independent of the same social urges that are the domain (of applicability) of the sanctions it backs?
Tuesday, 10 May 2011
Das Opfer des schwarzen Blutes
Denken Sie nicht über die Wahrheit, mein Kind,
Die Dunkelheit ist notwendig für unsere Augen!
Wo sind wir zu gehen, wenn wir nicht wissen, Krieg?
Wo können wir gehen, wenn es keine Straßen?
Lass dich nicht von den Geräuschen, Kind, Angst
Halten Sie alle Ihre Ängste auslaufen
In den Frauen, die Träume haben für Kinder,
In die Herzen der Unwissenden und Ohren.
Hurt sich selbst und lassen den Blutfluss, Kind,
Weil sie werden für immer die dummen Affen!
Die Wissenschaften sind Sie wissen Geheimnisse
Welche von ihren kleinen Herzen verschwinden!
Leg deinen Kopf auf meinen Schoß und weinen, Kind,
Damit die Menschen können niemals lernen,
Von Ihrem Opfer, das notwendig ist für die Zukunft,
Die Lehren, die Sie für heute abend sterben müssen!
*
Translation
The sacrifice of black blood
Do not think about the truth, my child,
The darkness is necessary for our eyes!
Where do we go when we know not war?
Where can we go if there are no roads?
Do not fear the noise, child,
Leave behind all your fears
For women who have dreams for children,
To spill into the hearts of the ignorant and ears.
Hurt yourself and let the blood flow, child,
Because they will forever be the silly monkeys!
The sciences you now know are the secrets
Which must disappear from their little hearts!
Lay your head on my lap and cry, child,
So that people may never learn
From your sacrifice, that is necessary for the future,
The lessons that you have to die for tonight!
Das Opfer des schwarzen Blutes
Denken Sie nicht über die Wahrheit, mein Kind,
Die Dunkelheit ist notwendig für unsere Augen!
Wo sind wir zu gehen, wenn wir nicht wissen, Krieg?
Wo können wir gehen, wenn es keine Straßen?
Lass dich nicht von den Geräuschen, Kind, Angst
Halten Sie alle Ihre Ängste auslaufen
In den Frauen, die Träume haben für Kinder,
In die Herzen der Unwissenden und Ohren.
Hurt sich selbst und lassen den Blutfluss, Kind,
Weil sie werden für immer die dummen Affen!
Die Wissenschaften sind Sie wissen Geheimnisse
Welche von ihren kleinen Herzen verschwinden!
Leg deinen Kopf auf meinen Schoß und weinen, Kind,
Damit die Menschen können niemals lernen,
Von Ihrem Opfer, das notwendig ist für die Zukunft,
Die Lehren, die Sie für heute abend sterben müssen!
*
Translation
The sacrifice of black blood
Do not think about the truth, my child,
The darkness is necessary for our eyes!
Where do we go when we know not war?
Where can we go if there are no roads?
Do not fear the noise, child,
Leave behind all your fears
For women who have dreams for children,
To spill into the hearts and ears of the ignorant.
Hurt yourself and let the blood flow, child,
Because they will forever be the silly monkeys!
The sciences you now know are the secrets
Which must disappear from their little hearts!
Lay your head on my lap and cry, child,
So that people may never learn
From your sacrifice, which is necessary for the future,
The lessons that you have to die for tonight!
Tuesday, 3 May 2011
Orison of the knowing
The world is the land beneath the road that takes us
On the journey we so often seek to undertake
Unto cities of gold where lie the fortunes we must make!
The world is the name of the cross that aches us;
The world is the home of the foe that stakes us;
Here we with memories bury the hatchet of hate
When our golden dreams fall to the ground insatiate!
The world is the hue of the blindness that breaks us;
The world is the shade of turmoil that shakes us;
Only here rests the unseen good of an unseen god
But punishment–oh yes–must be with a firing squad!
Such is the reward of this world that baits us
Guarding our treasure with trying fate it awaits us!
It is no longer the world that once remained our own;
'Tis the sole truth to the ignorant we must make known!
Tuesday, 19 April 2011
On Ambition
Ambition reserves unto itself a certain qualifying dignity; had it not, then there would be no ambitious fools.
Saturday, 26 March 2011
A Shade Of Solecism
I learnt of the world outside my window by writing. When I write things and hit "Save", an exuberance sweeps over me: something has been said and set in stone, something that cannot be changed and that, with every moment that follows, is embedded deeper and deeper in the murk of history. For that reason, I can't let anything be wrong. I want my footprints on history's pages to be picture-perfect. It's not something I'm pretending to be - it's something I know I can be and am trying my best to become. In order to make correctness a habit, I read, I discover, I interpret. Reading and discovering can happen over and over again, but without interpretation they remain useless as time passes. Our mark does not lie in understanding that darkness is darkness and light is light; it lies in being able to light a candle without regard to whatever winds may be blowing then.
However, as the writer writes more and more, there is more and more about the world that is new, that is there in the now but wasn't in the then. If this moment has been prepared for, disillusionment can be spared in favour of understanding; as has been noted, attempting to learn is futile if understanding is absent. The prevalence of lost context forces a distinction in the matter of "understanding": to say that one understands is, too often, not to have integrated the ability to recognize, disintegrate and recreate, but only to have remembered the meaning encapsulated therein.
The more contributions are expedited, the more the world is changed, and the world of the minute before accepts its retirement just so. I, who have learnt much in this process of writing and self-discovery, am now a different man than I was the minute before, and have cast over my understanding of the world as it then was a shade of solecism. The greatest lesson, therefore, concerns not the contents of our learning but the methodology itself: not what we learn, but how we learn. By integrating the idea that the spinning top spins so because tops spin so, we do not graduate from being fools; we must learn why it spins so. A top spun the moment past will grind to a clumsy halt, but in setting another in motion is our learning vindicated.
Tuesday, 1 February 2011
Sometimes, To Be Ethical Is To Be A Fool.
A very good example to illustrate this dilemma would be the products of Google, more specifically the services offered under their Labs feature. The first step in increasing worldwide public access to information was the Google Books project. The unevenness of opportunities presented by lack of space or time was nullified by access to a range of books written in many languages and of many genres (especially of the classics corpora). On that primary level, any competition in which information was a principal player saw the latter's transformation into a tradable commodity. It began to be subjected to the same abuse that money faced: hoarding, thriftiness and deficiency. Therefore, those who hoarded, were thrifty or caused deficiency were inculpated for unethical practices.
Now, with too much information swimming around the cybersphere, data visualization has been resurrected with greater responsibility and, axiomatically, greater power. In between the two eras, that of data acquisition and that of perception, there was a period dominated quietly by a backstage hero called data mining: with more information on more things coming out every second, the proverbial gap between the winner and the loser began to narrow, because the two factions were separated only by the knowledge of which information was worthy and which was not. However, when we bit off more than we could chew, it was soon not a matter of what but of how. When we began to find out more than we ought to have known about the past, the future became less of a certainty and more of a possibility.
In line with that thought, Google brought in its Ngram Viewer (NV). A simple extension of the Google Books venture, NV brought together simple data mining, graphical data visualization and hundreds of thousands of books written in the last 200 years in seven languages to leave the user with a new kind of data, ripe for interpretation. Visit the viewer and see for yourself how the usage of the words "gay" and "homosexual" has varied in frequency over the years, and how it can be understood to show our perception of the words themselves: the more often they were used, the more they featured in discussion, the more they impacted us.
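The statistic NV plots can be sketched in miniature. The snippet below is a toy reconstruction, not Google's code or API: a hypothetical `ngram_frequencies` helper, run over an invented two-entry corpus, counts how often target words occur in each year's text and normalizes by that year's token count, which is essentially the relative-frequency curve the viewer draws.

```python
from collections import Counter

def ngram_frequencies(corpus, words):
    """For each year in the corpus, return the relative frequency
    (occurrences / total tokens) of each target word."""
    freqs = {}
    for year, text in corpus.items():
        tokens = text.lower().split()
        counts = Counter(tokens)
        total = len(tokens)
        freqs[year] = {w: counts[w] / total for w in words}
    return freqs

# A toy, invented corpus keyed by year; the real viewer aggregates
# millions of scanned books per year instead.
corpus = {
    1900: "the merry and gay company danced all night",
    2000: "the gay and homosexual communities marched together",
}

print(ngram_frequencies(corpus, ["gay", "homosexual"]))
```

The real viewer adds smoothing over neighbouring years and case-sensitive variants, but the core is this same normalized count, computed once per year and plotted as a line.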
In this secondary level of information distribution, with the world as such tending to greater access limited by vaguer boundaries, could there be such a thing as information hoarding? Definitely. Compare the scenario you're in to a ladder: you're on the bottom rung; raw data is on the top-most rung. Before the raw data can reach you, it passes through an ever-increasing number of filters on the way. Even though the greater challenge has been to engender new perspectives, there is also the challenge of leaving some information open to interpretation. On the primary level, access to the information is increased. On the secondary level, it is classified more logically. On the third level, when it reaches you, you retain a responsibility still to decide:
- How you use it
- Why you use it, and
- Whom you use it with
Therefore, the ethics of this day and age have not been blurred by the repeated refinements but have only been rendered into a finer and finer line, bent this way and that by corporate greed, capitalist agendas and an overriding anarchism performed as an act of rebellion in most cases. The withholding of information does not spell misdemeanour but, more often than not, caution. This is the very nature of capitalism: to address greed by fostering the need to compete in its players. To be completely ethical in such a day and age is to be a fool.