I have never been able to fathom poetry. Not because it's unensnarable, which it annoyingly is, but because it never seems to touch upon that all-encompassing nerve of human endeavour supposedly running through our blood, transcending cultures and time and space. Is there a common trouble that we all share? Is there a common tragedy, other than death, that we all quietly await, the one so many claim poetry describes?
I, for one, think that that thread of shared memory is lost, forever leaving the feeble grasp of our comprehension. In fact, I believe there is more to be shared, more to be found that will speak to the mind's innermost voices in a lonely moment of self-doubt. Away from a larger freedom, a "shared freedom", we now reside in a larger prison, an invisible cell that assumes various shapes and sizes.
Sometimes, it’s in your throat, blocking your words from surfacing. Sometimes, it has your skull in a death-grip, suffocating all thoughts. Sometimes, it holds your feet to the ground and keeps you from flying, or sticks your fingers in your ears and never lets you hear what you might want to hear. Sometimes, it’s a cock in a cunt, a blade against your nerves, a catch on your side, a tapeworm in your intestines, or that cold sensation that kills wet dreams.
Today, now, this moment, the smallest of freedoms, the freedoms that belong to us alone, are what everyone shares, what everyone experiences. Each is simply an individuation of an idea, or rather a belief, and the truth of that admission, peppered as it is with much doubt, makes us hold on more tightly to it. And as much as we partake of that individuation, like little gluons that emit gluons, we inspire more to pop into existence.
Within the confines of each small freedom, we live in worlds of our own fashioning. Poetry is, to me, the voice of those worlds. It is the resultant voice, counter-resolved into one expression of will and intention and sensation, that cannot, in turn, be broken down into one man or one woman, but only into whole histories that have bred them. Poetry is, to me, no longer a contiguous spectrum of pandered hormones or a conflict-indulged struggle, but an admission of self-doubt.
Friday, 27 July 2012
A clock without a craftsman
Curiosity can be devastating on the pocket. Curiosity without complete awareness can even turn fatal.
At first, for example, there was nothing. Then, there was a book called The Feynman Lectures on Physics (Vol. 3) (Rs. 214) in class XII. Then came a great interest centred on the man named Richard Feynman, and so another book followed: Surely You're Joking, Mr. Feynman! (Rs. 346). By the time I'd finished reading it, I had been introduced to that argumentative British coot named David Hume, whose Selected Essays (Rs. 425) sparked my initial wonderment at logical positivism as well as at torpor-inducing verbosity (in the latter respect, his only peer is Thomas Pynchon (Against the Day, Rs. 800), and I often wonder why many call for his nomination for a Nobel Prize in literature. The Prize is awarded to good writers, right? Sure, he writes grandiose stuff and explores sensations and times abstract to everyone else with heart-warming clarity, but by god do you have to have a big attention span to digest it! In contrast: Vargas Llosa!).
I realized that if I had to follow what Hume had to say, and then Rawls, and then Sen (The Idea of Justice, Rs. 374) and Kuhn (The Structure of Scientific Revolutions, Rs. 169 - the subject of my PG diploma's thesis) and Kant, and then Schopenhauer, Berkeley and Wittgenstein, I'd either have to study philosophy after school and spend the rest of my days in penurious thought, or become rich and spend the rest of my days buying books while not focusing on work.

An optimum course of action presented itself. I had to specialize.
But how does one choose the school of thought one finds agreeable without perusing the doctrines of all the schools on offer? I was back to square one. Then, someone suggested reading The Story of Philosophy (Rs. 230) by Will Durant. When I picked up a copy at a roadside bookstore, I suspected its innards had been pirated, too: the book would have been better suited to the hands of someone in need of a quick-reference tool; the book didn't think; the book wasn't the interlocutor I was hoping it would be.
I wanted dialogue; I wanted dialectic in the sense of Heinrich Moritz Chalybäus' thesis (Systems of Speculative Ethics, as translated by Alfred Edersheim, 1854 - corresponding to the System of Speculative Philosophy of G.W.F. Hegel). I wanted the evolution of Plato (The Republic, Rs. 200), Aristotle (Poetics, Rs. 200), Marcus Aurelius (Meditations, Rs. 200). That was when I chanced upon George Berkeley's Principles of Human Knowledge (Rs. 225) and Three Dialogues Between Hylas and Philonous (Rs. 709). Epistemology then began to take shape; until that moment, it had been difficult to understand the inherently understood element as anything but active thought. Its ontology started to become clear - and not like it did in the context of The Architecture of Language by A. Noam Chomsky (Rs. 175), which, to me, was still the crowning glory of naturalist thought.

Where does the knowledge, "the truth", of law arise from? What is the modality within which it finds realization? Could there exist an epistemological variable (empirically speaking) whose evaluation represents the difference between the cognitive value of a statement of truth and that of a statement of law? Are truths simply objective reasons whose truth-value may or may not be verifiable?
With each book consumed, a pattern became evident: all philosophers, and their every hypothesis, converged on some closely interrelated quantum mechanical concepts.

Are the mind and body one? Does there exist an absolute frame of reference? Is there a unified theory at all?
Around the same time, I came to the conclusion that advanced physics held the answers to most ontological questions - as I have come to understand it must. Somewhere-somewhen in the continuum, the observable and the unobservable have to converge, coalescing into a single proto-form, their constituents fusing in the environment afforded them to yield their proto-reactants. Otherwise, the first law of thermodynamics would stand violated!
However, keeping up with quantum mechanics would be difficult for one very obvious reason: I was a rookie, and it was a contemporary area of intense research. To work around this, I started with the subject's most pragmatic parts: Introduction to Quantum Mechanics by Powell & Crasemann (Rs. 220), Solid State Physics by Ashcroft & Mermin (Rs. 420), Quantum Electrodynamics by Richard Feynman (Rs. 266), and Electromagnetic Systems and Radiating Waves by Jordan & Balmain (Rs. 207) were handy viaducts. Not that there weren't terrors in between, such as Lecture Notes on Elementary Topology and Geometry by Singer & Thorpe.
At the same time, exotic discoveries were being made: at particle colliders, at optical research facilities, in deep space by ground-based observatories and interstellar probes, and within the minds of souls more curious than mine. Luckily for me, the literature corresponding to all these discoveries was to be found in one place: the arXiv pre-print servers (access to which costs all of nothing). These discoveries included quantum teleportation, progress toward room-temperature superconductivity, supercomputers, metamaterials, and advancements in ferromagnetic storage systems.
(I was also responsible for discovering some phenomena exotic purely to me in this period: cellular automata and computation theory, which I experimented with using Golly and Mirek's Cellebration, and fuzzy logic systems and their application in robotics, which I explored with the Microsoft Robotics Developer Studio.)
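For the curious: a totalistic cellular automaton, the kind of system Golly and Mirek's Cellebration simulate, fits in a few lines of Python. What follows is only a sketch of the idea, not anything from those programs; the particular rule and grid width are arbitrary choices of mine.

```python
# A one-dimensional totalistic cellular automaton: each cell's next state
# depends only on the SUM of the states in its neighbourhood, not on their
# arrangement. (Illustrative sketch; the rule below is an arbitrary choice.)

def step(cells, rule_table):
    """Advance one generation on a ring of binary cells."""
    n = len(cells)
    return [
        rule_table[cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n]]
        for i in range(n)
    ]

# Rule: a cell is on in the next generation iff exactly one cell in its
# three-cell neighbourhood is on. Neighbourhood sums range over 0..3.
rule = {0: 0, 1: 1, 2: 0, 3: 0}

row = [0] * 31
row[15] = 1  # a single seed in the middle

for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = step(row, rule)
# The output traces a Sierpinski-triangle-like pattern: order out of a rule
# simple enough to state in one sentence.
```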
What did these discoveries have to do with Hume's positivism? That I could stuff a gigabyte's worth of data into an inch-long row of particles championed empiricism, I suppose, but beyond that, the concepts' marriage seemed to demand the inception of a whole swath of interdisciplinary thought. I could not go back, however, so I ploughed on.
[caption id="attachment_23735" align="aligncenter" width="600"]
I was trapped in the spaces between books, between different moments in history, in time, a totalistic cellular automaton whose different avatars were simply different degrees of doubt.[/caption]
[caption id="attachment_23736" align="aligncenter" width="600"]
Because of reality's denial of accommodation to manifestations of tautologies and contradictions, so was I trapped within the shortcomings of all men and women.[/caption]
A Brief History of Time (Rs. 245) did not help - Hawking succeeded splendidly in leaving me with more questions than answers (Gravitation and Cosmology: Principles and Applications of the General Theory of Relativity by Steven Weinberg (Rs. 525) answered some of them). The Language Instinct by Harvard-boy Steven Pinker (Rs. 450) charted the better courses of rationality into sociology and anthropology, whereas my intuition that Arundhati Roy would reward governance with a similar fashion of rational unknotting was proved, expensively, very right: The Algebra of Infinite Justice, at Rs. 302, lays bare all the paradoxes that make India India.
For literature, of course, there were Orhan Pamuk and Umberto Eco, Lord Tennyson and Sylvia Plath, de Beauvoir, le Guin and Abbott to fall in love with. My Name is Red (Rs. ... whatever, it doesn't matter!), The Name of the Rose, and The Mysterious Flame of Queen Loana are to be cherished, the last especially for its non-linear narration and the strange parallels waiting to be drawn with hermeneutics, such as the one delineated by E.H. Carr in his What Is History?. Plath's works, of course, were an excursion into the unexplored... in a manner of speaking, just as le Guin's imagination and Abbott's commentary are labours unto the familiar.

Learn to like ebooks. Or turn poor.
Ultimately, that was all that I learnt. Quite romantic though being an autodidact may sound, the assumption of its mantle involves the Herculean task of braiding all that one learns into a single spine of knowledge. The more you learn, the farther you are from where you started; the more you have learnt, the more ambitious you get... I cannot foresee an end.
Currently, I am reading One Day in the Life of Ivan Denisovich by the Soviet-era exile Alexander Solzhenitsyn (war-time fiction became a favourite along the way, after a history of firearms in Russia, a history of science and technology in Islam, How Things Work, gifted to me by my father when I was 11, and Science and Civilisation in China by Needham & Gwei-Djen (Rs. 6,374 - OK, now it matters)) and Current Trends in Science: Platinum Jubilee Edition (Indian Academy of Sciences), lent to me by Dr. G. Baskaran. At each stage, a lesson about the universe is learnt: a minuscule piece, told in the guise of one author's experiences and deductions, that has to fit into a supermassive framework of information to be used by another's intelligence. A daunting task.
No wonder it doesn't come cheap.
Or does it?
Friday, 18 May 2012
The spoken word
What is the purpose of political correctness? Is it to hone language down to its permissible essentials and weed out potentially harmful phrases? Or is it to go beyond that and reinforce its "goodness" and necessity?
When being consciously politically correct, I'm saying one thing but meaning quite something else. The point is: if I've already thought something, why is not saying it supposed to make a difference? Even if the concern is that a statement, once made, is set in stone and becomes a part of history, wanting to erase it and provide instead an alternative that sheds all insinuations and prejudices is all the more despicable.
The politically correct statement or phrase seems to have been constructed only to avoid uncomfortable silences in the present. In the long run, however, it masks very real tendencies in favour of a reality that perpetuates the existence of political correctness. It encourages self-censorship and restraint in expressing thoughts deemed unpalatable by others, even if only to the extent that those who speak their minds are driven underground and forced to engage in samizdat rather than in open discussion. Ultimately, if I'm politically correct today, I'll have to continue to be politically correct in the future.
Some would argue that the way to look at the need for political correctness is to look at the way the world would be without it. I see a world of difference. For instance, I could say something and mean it instead of having to disguise it in terms of politically correct phrases and imply that that's what I meant. I could call a black man "black" simply because I find it easy to identify him that way, not because I think calling him "black" is any sort of testament to his heritage.
By extension, it's necessary that we decouple politics and correctness not only in language but also in such things as colours, flora, and fauna. Using the syntax but dissociating it from its semantics for the sake of avoiding political discomfort is nothing but the prostitution of language. It's the act of borrowing words to use them for purposes they were never intended to serve.
Being politically correct all the time is causing a harmful disconnect between the way we're thinking and what we're saying. To think a) "This is what I mean" and then b) "This is what must be said to mean what I think" kills language's capacity to be native and organic. Under the influence of mechanical considerations and case-by-case replacements, the way we use language is transformed into something machine-like.
Are uncomfortable silences in the present so detestable as to risk the mechanization of expression?
Friday, 30 December 2011
Prospect theory, locating the mind and biggest betrayals: Setting the mood for 2012
Three books are lined up for reading this January, books by Kahneman, Eagleman and Beevor.

Absolute v. relative happiness
'Thinking, Fast and Slow' by the Nobel Economics laureate Daniel Kahneman I chose purely because of his work in prospect theory - rather, his work in laying the basis of prospect theory. For too long, mathematical models have been constructed to determine the set of optimal choices a human will make given a set of fixed assets and personal preferences. The choices made in real life, on the other hand, deviate considerably from theory.
According to expected utility theory, one such 'ideal' model, choices are broken down as betting preferences made in the face of uncertain outcomes, or gambles. Each choice is then depicted as a function of the payout, the probabilities of occurrence of each event, how much each outcome matters to the chooser (its "utility"), and the behavioural pattern of the chooser in the face of uncertainty (risk aversion).
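To make that concrete, here is a toy calculation of the sort the model implies - a minimal sketch with invented numbers, using a square-root utility curve as a stand-in for risk aversion (the gamble and the curve are mine, not Kahneman's):

```python
# Expected utility of a simple gamble, with a concave utility function
# standing in for risk aversion. (Numbers and the sqrt curve are invented
# for illustration.)

# A gamble: 50% chance of winning Rs. 1,000, 50% chance of nothing.
outcomes = [(1000, 0.5), (0, 0.5)]

def utility(x):
    # Concave: each extra rupee adds less utility than the one before.
    return x ** 0.5

expected_utility = sum(p * utility(x) for x, p in outcomes)
certainty_equivalent = expected_utility ** 2  # invert u(x) = sqrt(x)

print(round(expected_utility, 2))   # 15.81
print(round(certainty_equivalent))  # 250
# A chooser with this utility curve would accept Rs. 250 for certain over
# the 50:50 shot at Rs. 1,000 - that gap is risk aversion, quantified.
```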
According to Kahneman's theory, the process of prospecting is broken down into two stages. In the first editing stage, the outcomes of each experiment are ordered according to a set of guidelines set out speculatively. As part of the same process, the preference of each outcome is gauged according to a reference point and ordered accordingly. To this point, the theory is spot-on, and it is in the following evaluation phase that computation comes in and makes me uneasy.
[caption id="attachment_21151" align="aligncenter" width="259" caption="Daniel Bernoulli, one of the uber-smart Bernoullis, conceived the expected utility theory in 1738."]
[/caption]
I can recall no computations that I've made, consciously or not, while deciding which laptop to buy or what move to make in a game of chess, apart from asking myself "what will hurt more?". Fortunately, prospect theory carefully asserts that losses hurt more than gains feel good, whereas in expected utility theory the reference point is scalar and provides no parameter for judging how much losses hurt.
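For comparison, the value function Kahneman and Tversky fitted looks like this. A hedged sketch: the parameters alpha = beta = 0.88 and lambda = 2.25 are their published 1992 median estimates, not anything claimed in this post.

```python
# The Tversky-Kahneman (1992) value function: gains are discounted by a
# concave curve, losses by a convex one scaled up by the loss-aversion
# coefficient lambda. Parameter values are their median estimates.

ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def value(x, reference=0.0):
    """Subjective value of an outcome x, judged against a reference point."""
    d = x - reference
    if d >= 0:
        return d ** ALPHA            # gains: concave, diminishing returns
    return -LAMBDA * ((-d) ** BETA)  # losses: steeper by a factor of lambda

print(round(value(100), 1))   # 57.5   -> what gaining Rs. 100 feels like
print(round(value(-100), 1))  # -129.5 -> losing Rs. 100 hurts ~2.25x as much
```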
Kahneman's book, I feel, is a good place to begin when thinking about how a behavioural economist's conception of a problem affects the characterization of the problem and the subsequent solutions that present themselves. Another reward is the understanding of utility as a reference-based entity: think Sheldon Cooper's example in The Big Bang Theory.
Is free will real?
Next in line is David Eagleman's 'Incognito: The Secret Lives of the Brain'. Eagleman is a neuroscientist at the Baylor College of Medicine, Texas, reputed for his work in synesthesia, time perception and the emerging field of neurolaw, which draws on cognitive psychology, criminology and philosophy to assess the interaction between jurisprudence and neuroscience.
[caption id="attachment_21152" align="aligncenter" width="529" caption="Synesthesia is the phenomenon whereby the stimulation of one cognitive pathway in the brain leads to the automatic stimulation of another pathway. Depicted above is an example of colour-graphemic synesthesia, where the visual input of letters and their shapes provokes the perception of an inherent colour as well."]
[/caption]
However, I picked his book for just three reasons: he's young, Ed Yong recommended it, and I liked his experimental methodology when attempting to understand time perception in high-adrenaline situations. These translate, respectively, to progressive-mindedness and a readily established non-adherence to old-school techniques and concepts, a trustworthy endorsement, and a practical approach to satiating his curiosity.
In keeping with what I was looking for in Kahneman's book, I'm going to read this one keeping in mind my questions about free will: whether we really have it, or whether it's just an illusion of choices created by a brain that already has some idea about the outcome (just as it knew what the choices were going to be, because it put them together based on years of experiences).
[caption id="attachment_21153" align="aligncenter" width="520" caption="The HAL 9000 computer, from '2001: A Space Odyssey', holds a special place in the minds of geeks because it was and is one of the most prominent AIs that, seeking to secure a greater interest, subverts mankind. Will a cylon seek to do the same?"]
[/caption]
The good thing about getting closer to the future and, better yet, the technological singularity is that we're more likely to work in the direction of man-machine unification - or at least I'd like to think so. To clarify, I'm a skeptic of such an idea, but I am hopeful; if anything, we'll always be 10 years away from realizing such a tech-intense marriage.
At the same time, such ambitions keep us working on understanding how we piece together different bits of information to generate an image of reality that we can meaningfully interact with. In fact, in the words of Eagleman himself, "[my long-range goals are] to understand how neural signals processed by different brain regions come together for a temporally unified picture of the world."
The battle that was the beginning of the end
The last book is the one I'm most eager to read: 'Stalingrad: The Fateful Siege 1942-1943' by Antony Beevor is a humanist take on the siege of Stalingrad during World War II that resulted in the loss of 1.9 million lives, making it one of the bloodiest conflicts in human history.
[caption id="attachment_21154" align="aligncenter" width="360" caption="A building in Stalingrad in ruins in the aftermath of the battle that liberated the city"]
[/caption]
As a longstanding World War II enthusiast (and I hope that comes across as an academic pursuit more than anything else), I find Beevor's work a gem because it brings together the strategies, regions, power shifts, resources, rewards and losses, and the people of Germany, eastern Europe, Finland, Russia, northern China and Japan into one vast but unrelenting study of conflict and fate.
I came across the book when I was sifting through Amazon's recommendations after I'd taken a look at Kennan's biography (written by the National Humanities Medal recipient John Lewis Gaddis). There's one thing I always look for in a war chronicle: I want the historian to present every fact in the capacity of cause, effect or parallel, and then to present his conclusions about their various relationships. I'd like to think the reader of a chronicle is a historian, too, and should be given the opportunity to interpret as he or she chooses.
And Beevor has done just that - if the first chapter is anything to go by. The siege of Stalingrad is a decisive turning-point in World War II, and a good understanding of the battles and smaller conflicts it encompassed presents a nigh miniaturized version of the entire War itself. The siege and comeback marked the rise of Soviet nationalism, Stalin's push into Europe, and Hitler's downfall as a result of his stupidest mistake: the transgression of the Molotov-Ribbentrop Pact and the subsequent birth of a 'judeo-bolshevik conspiracy'.
[caption id="attachment_21155" align="aligncenter" width="370" caption="Molotov signs the Pact on August 23, 1939, in the presence of Joachim von Ribbentrop and Josef Stalin (standing behind him)."]
[/caption]
But all that's history, and it makes for good reading. For now, I'm going to let my enthusiasm for the subject and Beevor's industry guide me into the book so that I can gauge the other, in this case less-prioritized, aspects of book-reading on the way. Starting with a 4 on 5, let's see how much Beevor improves or reduces the impact of raw facts.
Thursday, 22 December 2011
The secularism of genocide
France has enacted a Bill that makes any denial of the Armenian genocide punishable by law, with deniers earning up to a year in prison and a EUR 45,000 ($58,600) fine. While most countries recognize the killing of 1.5 million Armenians in 1915 in Turkey to be a genocide, Turkey contests the numbers and, more importantly, holds that the characterization of the event (as a genocide or not) should be left to historians, not legislators. Here, I agree with Turkey.
Let's get the political angle out of the way: the conservative government of President Nicolas Sarkozy aims to garner support from the sizable local Armenian population in its run-up to the elections next year. Beyond that, sealing the event by law to be of a particular nature heavily influences debate on the subject. I don't deny that the event happened, and going by the scale of things, I can't deny that it was genocidal, but both these facts have been established by reason. Reason is secular. Facts become facts only when established by way of logical reasoning and scientific evidence. When jurisprudential factitude is attached to a fact, its secular character becomes sidelined. Now, argumentatively, I can't even debate the genocidal aspect of the Armenian genocide.
That well-informed French politicians voted to pass the Bill unanimously means nothing: when political rewards or sanctions become involved, politicians will be guided only by them. On the other hand, that the Bill was drafted in the first place is a dubious act of gaining political leverage, because it comes at the cost of the rejection of historical evidence. From this point on, the Armenian genocide in French debate will be a law-point, not a factual point.
What stops France from labeling future events as genocides even though they may not have been so? It will be making that decision based only on what it chooses to know, not what it should know.
Thursday, 8 December 2011
The non-option of going back in time
The recent, and ongoing, debates on the merits of nuclear power, and whether we need it at all or whether it's just motivated laziness in looking for alternative sources of energy (ASEs), have prompted me to evaluate the necessity of technology in life as we know it. In fact, to take the discussion further, I want to know if life as we know it is just an epochal assessment, or if it has become a template for future generations to base their functions on.
[caption id="" align="aligncenter" width="300" caption="The match-cut from '2001: A Space Odyssey' spanning four million years: we're somewhere in between and wondering."]
[caption id="" align="aligncenter" width="300" caption="The match-cut from '2001: A Space Odyssey' spanning four million years: we're somewhere in between and wondering."]
The penetration of technology is indubitable and has, in many ways, become irreversible. Many suggest that, owing to the extent of environmental degradation, the time has come for us to slow down, reevaluate our needs and, if possible, turn back to a time when life seemed more sustainable. I believe that in the 20th century we hit a point of no return. There's no turning back from here. That means we no longer do technology any justice by referring to it as something that has penetrated our lives; we do it no justice by referring to it as a tool. We ARE what technology is. Even if it hadn't been nuclear power that merited this reflection, it would've been something else. Life as we know it, more than anything else, is not sustainable. It never was; it probably never will be.
Probably.
If we rewound to a time before the internet, before the computer and the transistor, before engines and hydraulics and electromagnetism, before astronomy, geology, meteorology, exploration and trade, before literature, production and communication, we would arrive at a point where there was nothing behind us, a point of "zero history". It is when we move back to this point that we truly see where some of us aspire to return. There was nothing before us to aid us in our future quests except the human body, the then-indecipherable forces of nature, and the human mind: a vast reserve of questions, extremely limited resources, and a helluva lot of time.
When we started asking those questions, when we started to explore farther into the fog of war, we hit the future. Since then, we haven't turned back because there was nothing to go back to. Everything was an improvement, everything that contributed something to the human condition and alleviated the pains of not-knowing. Do you think it would be better for us if we moved toward a tribal way of life? No way. Sure, we could make love to the environment and not be afraid of nature turning against us in unimaginable ways, but at some point, our basic instincts will take over. They always have, and they always will, too.
The tribes that we observe living in forests and valleys today seem to present a solution because we've hit a wall with our energy resources and don't know where to go from here. But without us, without the restrictions and hindrances we pose to the sustenance of their livelihood, tribes would've become quite something else by now. They, too, would've evolved technologically and invented their own methods of acquiring more knowledge and using it to their benefit. They live in a controlled environment, fighting to remain what they've been for the past six millennia. If we all retrogressed to that stage, we'd have gone back a few thousand years in time, but we'd begin again.
The reason we look to such "reduced" ways of life - reduced by the various techniques at our disposal today - is that we are panicking. We are finally realizing that we, as humans, are unsustainable, and we're ready to take desperate measures to assuage that thought. We are looking at what is not us and joining the dots in the hope of absolute freedom and sustainability. In effect, however, we are bound to give ourselves only the curse of changelessness. All that we have done as humans, all that we have explored and reared and produced, will then lie as waste. Let me tell you, it is easy to join the dots, but that doesn't mean the image will then come to life. There is a lot we're missing, perhaps because we're taking it for granted.
Yes, our energy needs are growing. For six thousand years, we've been asking questions and answering them, and this is the point we've come to. I'm not advocating that in that pursuit, we lay to waste all that crosses our paths; no. I'm only saying the answer to our energy needs isn't regression, isn't the reevaluation of everything that came before us. If anything, the notion of future has made us understand that anything is possible, and if something doesn't seem to work out, then we haven't looked hard enough. Simply because our options are significantly unviable ASEs, nuclear energy, thermal power plants and regression doesn't mean we pick regression: we pick what will sustain us in the short-run so that it can power the ideas that will be necessary for the long-run. If we're working to prolong something that wasn't born with the universe, we will take a hit. Let's last it out, not back down. There's a difference.
Monday, 5 September 2011
One ear less
How long would humankind last if each human had only one ear, and if it did last, how different would the world be?
- Lack of physical balance?
- Absence of symmetric infrastructure and technology?
- "Lopsided" conversations?
- Dissociation of beauty from symmetry?
- A much-less messy self-portrait by van Gogh?
The relevance of racial realism
There are three people in a room: A, B and C.
They each possess the following skills in varying levels of excellence:
- Zeroth Skill (ZS)
- First Skill (FS)
- Second Skill (SS)
ZS: A > B = C
FS: A > B > C
SS: A < B = C
If A, B and C belong to three different races, and if the observations made above apply to all individuals of the same race as A, B and C, respectively, then isn't it fair to identify the strength of each trait in a race with the race itself? Deliberately obfuscating such a persistent pattern in light of the social and cultural damage that racism has wreaked may be right when moderated by humanitarian ethics. Then, however, the scientifically provable biological realism of the association (between skill and race) will be lost (I haven't mentioned any real traits; the racially associated skills in the comparison above are only hypothetical).

"If anything, I’m a race realist, and that simply means to recognize and understand the fact that there are racial differences and that these differences have an impact on society, education, crime, and many other aspects of life. It is not being a racist. It’s simply being a realist and a person who’s in search of solutions, rather than simply allowing these problems to continue to escalate."

Carol M. Swain
A century of heady and steady technological progress has, more than anything else, taught us that statistical determinism still largely remains a son of chance. That means we must be as economical as we can with the data we have in order to utilize our human resources best. The real problem of racism lies in the threat posed by its misunderstanding and misuse. As a consequence of racially driven wars and sociopolitical movements, even recognizing a race has come to seem deplorable, condemning racial realism along with it. I'm not a racial realist. I only postulate that if the biological realism backing race (as in the factitude of race, not discrimination along the lines of race) stands proven, then we mustn't back away from that conclusion just because the history of one scientific fact was blighted by the idiocy of humankind.
Moreover, continuing from the example, discrimination-by-race can occur only when a community forms that encourages, say, an eminence of the First Skill, and so lets a community form solely on the basis of an inequality: A > B or B > C or A > C. In that case, the skill's association with the race is not to blame so much as letting that association lend itself to the creation of a community is condemnable. In other words, the claim that "A can do this better than B can" can reek of racial realism as much as it wants to, but it cannot by itself acquire a socially judgmental connotation if not for the society that tags it so.
And now, I'll start reading The Bell Curve (1994).
They each possess the following skills in varying levels of excellence:
- Zeroth Skill (ZS)
- First Skill (FS)
- Second Skill (SS)
ZS: A > B = C
FS: A > B > C
SS: A < B = C
If A, B and C belong to three different races, and if the observations made above applied to all individuals of the same race as A, B and C, respectively, then isn't it fair to identify the strength of each trait in a race with the race itself? Deliberately obfuscating such a persistent pattern in light of the social and cultural damage that racism has wreaked may be right when moderated by humanitarian ethics. Then, however, the scientifically provable biological realism of the association (between skill and race) will be lost out (I haven't mentioned any such traits, and the notions of the racially associated skills in the comparison above is only hypothetical).
"If anything, I’m a race realist, and that simply means to recognize and understand the fact that there are racial differences and that these differences have an impact on society, education, crime, and many other aspects of life. It is not being a racist. It’s simply being a realist and a person who’s in search of solutions, rather than simply allowing these problems to continue to escalate."
Carol M. Swain
A century of heady and steady technological progress has, more than anything else, taught us that statistical determinism still largely remains a son of chance. That means we must be as economical as we can with the data we have in order to utilize our human resources best. The real problem of racism lies in the threat posed by its misunderstanding and misuse. As a consequence of racially driven wars and sociopolitical movements, so much as recognizing a race has proved deplorable, condemning racial realism along with it. I'm not a racial realist. I only postulate that if the biological realism backing racism (as in the factitude of race and not the discrimination along the lines of race) stands proven, then we mustn't back away from that conclusion because the history of one scientific fact was blighted by the idiocy of humankind.
Moreover, continuing from the example, discrimination-by-race can occur only when a community forms that encourages, say, an eminence of the First Skill and so lets a community form solely on the basis of an inequality: A > B or B > C or A > C. In that case, a skill's association with the race is not to be blamed just as much as letting that association lend itself to the creation of a community is condemnable. In other words, the claim that "A can do this better than B can" can reek of racial realism as much as it wants to, but it can't acquire by itself a socially judgmental connotation if not for the society that tags it so.
And now, I'll start reading The Bell Curve (1994).
The relevance of racial realism
There are three people in a room: A, B and C.
They each possess the following skills in varying levels of excellence:

ZS: A > B = C
FS: A > B > C
SS: A < B = C
If A, B and C belong to three different races, and if the observations made above applied to all individuals of the same race as A, B and C, respectively, then isn't it fair to identify the strength of each trait in a race with the race itself? Deliberately obfuscating such a persistent pattern in light of the social and cultural damage that racism has wreaked may be right when moderated by humanitarian ethics. Then, however, the scientifically provable biological realism of the association (between skill and race) will be lost out (I haven't mentioned any such traits, and the notions of the racially associated skills in the comparison above is only hypothetical).
A century of heady and steady technological progress has, more than anything else, taught us that statistical determinism still largely remains a son of chance. That means we must be as economical as we can with the data we have in order to utilize our human resources best. The real problem of racism lies in the threat posed by its misunderstanding and misuse. As a consequence of racially driven wars and sociopolitical movements, so much as recognizing a race has proved deplorable, condemning racial realism along with it. I'm not a racial realist. I only postulate that if the biological realism backing racism (as in the factitude of race and not the discrimination along the lines of race) stands proven, then we mustn't back away from that conclusion because the history of one scientific fact was blighted by the idiocy of humankind.
Moreover, continuing from the example, discrimination-by-race can occur only when a community forms that encourages, say, an eminence of the First Skill and so lets a community form solely on the basis of an inequality: A > B or B > C or A > C. In that case, a skill's association with the race is not to be blamed just as much as letting that association lend itself to the creation of a community is condemnable. In other words, the claim that "A can do this better than B can" can reek of racial realism as much as it wants to, but it can't acquire by itself a socially judgmental connotation if not for the society that tags it so.
And now, I'll start reading The Bell Curve (1994).
They each possess the following skills in varying levels of excellence:
- Zeroth Skill (ZS)
- First Skill (FS)
- Second Skill (SS)
ZS: A > B = C
FS: A > B > C
SS: A < B = C
If A, B and C belong to three different races, and if the observations made above applied to all individuals of the same race as A, B and C, respectively, then isn't it fair to identify the strength of each trait in a race with the race itself? Deliberately obfuscating such a persistent pattern in light of the social and cultural damage that racism has wreaked may be right when moderated by humanitarian ethics. Then, however, the scientifically provable biological realism of the association (between skill and race) will be lost out (I haven't mentioned any such traits, and the notions of the racially associated skills in the comparison above is only hypothetical).
"If anything, I’m a race realist, and that simply means to recognize and understand the fact that there are racial differences and that these differences have an impact on society, education, crime, and many other aspects of life. It is not being a racist. It’s simply being a realist and a person who’s in search of solutions, rather than simply allowing these problems to continue to escalate."
Carol M. Swain
A century of heady and steady technological progress has, more than anything else, taught us that statistical determinism still largely remains a son of chance. That means we must be as economical as we can with the data we have in order to utilize our human resources best. The real problem of racism lies in the threat posed by its misunderstanding and misuse. As a consequence of racially driven wars and sociopolitical movements, so much as recognizing a race has come to seem deplorable, and racial realism has been condemned along with it. I'm not a racial realist. I only postulate that if the biological realism backing racism (that is, the facthood of race, not discrimination along racial lines) stands proven, then we mustn't back away from that conclusion simply because the history of one scientific fact was blighted by the idiocy of humankind.
Moreover, continuing from the example, discrimination-by-race can occur only when a community forms around, say, an eminence of the First Skill, organizing itself solely on the basis of an inequality: A > B, B > C or A > C. In that case, the skill's association with the race is not to be blamed; letting that association lend itself to the creation of a community is what is condemnable. In other words, the claim that "A can do this better than B can" can reek of racial realism as much as it wants to, but it can't acquire a socially judgmental connotation by itself, only from the society that tags it so.
And now, I'll start reading The Bell Curve (1994).
Tuesday, 2 August 2011
The Spielberg-Kafka Impasse
The following are afterthoughts - as seems to have become the norm - concerning a good lecture by Prof. R. Radhakrishnan at the Asian College of Journalism on 1 August 2011.
--
Professionalism
I profess skill. Therefore, I join a profession. Do I therefore incur the responsibilities dictated by professionalism? Before we discuss the source of values, before we seek to include the mechanism of ethics in our discussion, it's important to address the basic conflict in the form of professionalism on the one hand and simply fulfilling responsibilities on the other.
Disregard the context for a minute: where do values come from? They are always self-imposed because they are the consequences of subjective evaluations of our reality by ourselves (they do not arise out of the context itself and, thus, the loss of context does not matter in understanding the nature of our values). When different people espouse different values, the institution no longer remains in a position to enjoin what those values are but still is able to hire or fire those it deems compatible with its goals.
Are values a priori? No. Are they necessary? They seem to be. Why? I've addressed this question earlier: the system of values that we deem necessary is a matter of personal choice; however, it is neither mandated nor forbidden. Are they the principal definitions of a general ethical code of conduct?
Possibly: the "goodness" quotient of the outcome of my actions is evaluated against the requirements of my profession together with certain humanistic unavoidables. In that light, my system of values - if any - is going to be influenced by the safeguarding of my interests and perhaps those of the organization, too. Values, I believe, are strictly a posteriori.
Freedom
Say what you will, freedom is a conversational piece. A flosculation. Perhaps its most palpable forms as such have all been macropolitical. In the micropolitical sense, however, it's a modality that gets diffused in various field logics, perhaps as a result of attempts by the freedom-seeker to contextualize it.
Reality itself has been undeniably victimized by such things as inflation and globalization: the "bigger picture" as I choose to see it does not step beyond the confines of my laptop. Consequently, my freedom is limited to the choices I will have a right to access and/or make, and so my freedom is to customize my Facebook profile, my freedom is my right to privacy on the web, and so forth.
There comes a difference when the macropolitical and the micropolitical engage, whereby a mitigating, mediating force becomes apparent. When Gandhi asked those seeking to "do good" to consider what good they would do for the common man, did philanthropists and samaritans scurry to seek out the necessities of the common person? Or did they surmise the nature of the common man's micropolitical environment and scale down the relevance of their ambitions?
In the name of what?
What am I speaking for? (Too many people at ACJ go on about how they've asked themselves this very question so many times - so what? I've asked myself the question many times, too, and I don't get the implied significance - are things all that ambivalent?)
Whether or not a collective is involved is irrelevant to me: as long as I am being representational, I will represent only that face of the collective that embodies all that is necessary for the representation to be accurate, i.e., like an individual who is the summa of all that the collective wishes communicated.
A minor reference to historicity becomes necessary (or, as Prof. Radhakrishnan chose to call it, temporality): to do something "in the name of an event that has become a part of history and acquired a political, social, cultural or economic flavour because of its eventual outcome."
(Say a man approaches a crossroads at which his friend awaits. The man says to his friend, "My cause is X." The friend replies, "I endorse your cause. Now, go forth." Presented with three options, the man picks the path straight ahead. He walks it, and at its end he finds he has emerged a supporter of cause Y. Now, can the man's friend be said to endorse cause Y?)
What's your dharma?
Does idealism have its price in a world that constantly debates its pertinence? Is it fair to consistently toe the line as a matter of principle? Am I going to talk about just what shouldn't be talked about? It's the whole professionalism versus fundamentalism argument once more (I mean "fundamentalist" in its original sense).
Dharma is a perception of the self when between objective reality and subjective reality, and as such the former's existence is a matter of debate. However, irrespective of the conflict between a way of thinking and a way of practising, my dharma is a mechanism constituted by my experiences to model them (i.e., the ways).
However, there is some abrasion in the form of my individual autonomy. When extant in some reality, is it possible for me to not precipitate the antecedence of reality to my intervention? In other words, can I act without being acted upon, perhaps without reality having been presumptuous of my actions?
It wouldn't be right, I conclude, to say that the truth, per se, exists independent of my existence and so constitutes an independent reality that I can employ to reflect myself. Reality will always be antecedent to my intervention because I am involved in the constitution of that reality, and when I act, I can only do so in spaces that have room for the outcome/effect.
The truth is a negotiated simplification because I exist relative to a totality. (This reminds me of a post I wrote quite some time ago on the Sapir-Whorf hypothesis in linguistic theory.)
The simulacrum
When moving from being real to being intelligible, we move away from the objective existence of reality and toward the subjective counterpart (as if they're distinct!), and in the process attempt to include our understanding of reality. This "understanding" is encapsulated by the production of intelligibility (tied in with, but different from, the production of meaning).
So, what does it mean to have a point of view?
Just as in the previous statements, intelligibility also suffers from the marriage of existence and subjectivity: the question of a universally extant intelligibility is mired in the likelihood that new frames of knowledge must be created in order to create such understanding. Just as the notion of freedom is extra-political, the moment we put something into words in order to understand it, we suffuse it with the persisting symbolism in language: a mediator rises like a snake on the bosom.
Ultimately, all of this condenses into the nature of the posthuman subject: just as Abhinavagupta's Shaivite position held that the individual consciousness is an individuation of the universal consciousness that is God, the posthuman is an individuation of the unified human entity. Being in possession of an emergent ontology, only the posthuman subject is capable of self-reflexivity, i.e., of exercising the option to defy norms simply by availing itself of the tools with which to study its own reflection.
If you've read Edwin Abbott's Flatland (1884), the nature of self-reflexivity (as in social theories) can be explained by the inability of the two-dimensional objects to understand the real nature of the three-dimensional sphere. Going another way, it can also be analogized to the sphere's ability to view Flatland in its entirety while the lines and shapes can't.
And that brings us to...
The Spielberg-Kafka Impasse
Steven Spielberg must never adapt Franz Kafka's Metamorphosis for the silver screen. Kafka's insectoid captured perhaps the uncapturable aspects of change and of displacement, and its now-Kafkaesque surrealism is fitting because it leaves ample space for interpretation.
If Spielberg made a movie out of it, the imagery would become set in stone, its changeable nature lost to the mass of readers who find solace in Kafka's consideration of such emotions. The posthuman would settle back down into the human entity, no longer capable of assuming different identities at will; the mediating ghosts would turn into phantoms, leaving in their wake a world incapable of change.
Friday, 20 May 2011
The metaphysics of cricket
Watching cricket is such joy. It's a strange sort of team-play that the game necessitates, first in pairs by batsmen who score the runs and then as a unit of 11 men who attempt to defend their score by reinforcing the assaults of a series of bowlers as a fielding unit. Unlike a game of football—whose example I invoke simply because it is the world's most watched sport—a game of cricket presents a theoretically infinite number of opportunities for an underdog to turn a losing game into a thumping victory. The gamut of procedures and regulations that sustains parity between the contending teams is necessitated by this abundance of chances, which is in turn a product of how multi-faceted the game is across every format.
Many accusations have been directed at the governing council (the ICC) for letting what was once called a "gentleman's game" evolve to treat sledging and 'not walking' as mere issues of ambiguous morality, and for not enforcing sterner measures against them. However, I believe that any civility the game was envisioned to hold was intended only to address the social conventions of the people who played it centuries ago, and that the inclusion of any moral dimension into a system whose purposes are physical development and entertainment is, on the face of it, meaningless.
The world's most-watched sport lasts roughly 90 minutes each time it is played, fields 11 players per team, and is very simple to understand, being nowhere close to as macroscopically multidimensional as cricket, although an emphasis on individual skill and talent has often made it an entertaining experience. The constant engagement of one whole team with the other in its entirety, together with the simple framework within which the game in its modern form functions, presents fewer opportunities for the cost of a mistake to be redeemed quickly or, for that matter, frequently. An important corollary of this argument is that, during a game of cricket, victory or defeat can be pinned on one man or one particular phase of the game, whereas in football the same is not true: by providing for constant (or at least almost constant) engagement, the actions of each player depend on the actions of a few other players at all points of time (except, of course, during a penalty shoot-out).
What do cricket and heavy metal music have in common? Each is a modality of group activity, one physical and one aesthetic, whose quality of performance has improved greatly since industrialization and remains dependent on industrial standards and on how frequently they are not met. While the same can be said of football, it must be noted that, in the case of cricket, the improvement has been drastic and has also allowed cricketers to focus on the game instead of concerning themselves with issues of safety—concerns that have since been addressed, under threat of sanctions, by said standards.
Did the Englishman really think he had infused civility into a sport simply by reducing physical contact between players, requiring full-sleeved clothing, and having stationary umpires arbitrate disputes? If so, he will surely regret that he provided no other occupation for the mouth.
Tuesday, 3 May 2011
Orison of the knowing
Welcome home to this world, this world that makes us;
The world is the land beneath the road that takes us
On the journey we so often seek to undertake
Unto cities of gold where lie the fortunes we must make!
The world is the name of the cross that aches us;
The world is the home of the foe that stakes us;
Here we with memories bury the hatchet of hate
When our golden dreams fall to the ground insatiate!
The world is the hue of the blindness that breaks us;
The world is the shade of turmoil that shakes us;
Only here rests the unseen good of an unseen god
But punishment–oh yes–must be with a firing squad!
Such is the reward of this world that baits us
Guarding our treasure with trying fate it awaits us!
It is no longer the world that once remained our own;
'Tis the sole truth to the ignorant we must make known!
Thursday, 24 March 2011
Blackbird's Egg
Ephemeral and lasting these sons of constant attention remain, swimming seas of white and seeking like brave fools the short-lived happiness that words bring. A bloodied chest of rubies with a curse screaming above their head, and I am pushed away, slowly, steadily, and I deliberately forget to fight as noiseless wonders fracture to an unforgiving life. My hollowness has been stolen and in its place is a black bird.
[Image: "Broken sky, wholesome rain"]
A dreaded wall climbs high and lifts magnanimously on its bank a small green frog. The calendar is moving away, tearing slowly across the lines, the numbers are released up and down both at once. Ripples settle down in silence and the moon comes to watch a storm gently falling asleep in the morning. Jan-jan-jan, one by one, push the sun out. Was-now flaps its wings in a blur but white lingers, a black sun rises in the north, and the morning blooms now-was.
Dissension and debate rage on the outside while a sharp illness pricks within. Give me your promise, broken at birth, and exploit my choices as a preference. Blood on the world's hands and scratches on the queen's back, the marauder runs into eternity behind the pillars of creation. Reason gives fast pursuit but the catch is never done. Why must it be when the end is the end is the end? Raindrops slither down the damp wood and our fires won't burn for any bribe. The crime is only slavery... not you, my darling.
I'm a radioactive toy filled with evaporating purposes. Keep my right to freedom and keep my right to the skies. Give me the freedom to give up when I no longer can, give me the freedom to throw my arms up, give me the freedom to shed a tear. To cry shamelessly. Dark patches of dried blood flake away into the wind while the sun sets slowly beyond the mountain, and sunflowers meet the Earth whence they came. The leaf is airborne, skyward, as a souvenir of the true day.
Labels: Abstract art, Astronomy, creative, darkness, Dissension, dream, Earth, feelings, freedom, history, hope, inspiration, literature, loss, random, thoughts, Wallace Stevens, Writing
Wednesday, 16 March 2011
Water, Sacrosanct
Deep down in the understanding
of the instance of resistance
there is a sleeping fire not waiting
to be awakened but eager to consume
in the process marking a fine line
between the wise and the knowing
Cautious would be those waiting
to throw a stick into it
to empty an ampoule of ghee into it
for its tongues of heat are infinite and eternal
never having once known the fatigue of toil
or distance, and in that truth, it became a power
Of the labouring masses because of its strangeness
Between each of the self-indulgent embers
and the next is an acute space of demand
and vice that act together like willing prostitutes
but never compliant to achieve a common goal
individually, and through pores that open and close here
is an osmotic pump that mobilizes the arrogance
Of those doused in blood into a different hell
that is only silenced by humiliation
Their every breath rises and falls with some terrible purpose
that they blanket themselves with in order
to seek comfort because freedom is a strange thing to them
In fact, it is the eyelessness of their masters
It is the very thing they have chosen to destroy
For the sake of their children not because
it causes physical harm – even though it does
for in knowing that blood is thicker than water
they know what causes pride and what kills it
dissolves it into an ocean of wisdom that is never
never permitted to come together in a war for food
If time healed all, then revolutions would become moot
and the Fire could be ignored till the day it went out
with an ostentatious “pop” only to remind its wardens of
the opalescence clouding their judgment, only to remind
its keepers that the time has also come for the shells to crumble to dust
money cannot ever buy happiness nor can it be traded
For another life, but in the absence of marked and ratified paper
What buys bread and what buries the dead
what is the memory of effort and what was left unsaid
It's important to feel the pain brought on
by one’s wounds not because it's a mistake to learn from
but because it's a reminder of the lessons still remaining
to be taught only because there are mouths still waiting to be fed
Desires must be procured, wants must be attained
but the needs must always be earned, and that's where
we all begin before an inner corruption seeps through
the oil that feeds the Fire only to leave us lashing out
against the Universe of humanity that's agreed to be our refuge
History's taught us less than what it could've by not teaching us anything at all