Category Archives: Topical

Includes posts on physics, philosophy, the sciences, quantitative finance, economics, the environment, etc.

Why not Discard Special Relativity?

Nothing would satisfy my anarchical mind more than to see the Special Theory of Relativity (SR) come tumbling down. In fact, I believe that there are compelling reasons to consider SR inaccurate, if not actually wrong, although the physics community would have none of that. I will list my misgivings vis-a-vis SR and present my case against it as the last post in this series, but in this one, I would like to explore why it is so difficult to toss SR out the window.

The special theory of relativity is an extremely well-tested theory. Despite my personal reservations about it, the body of evidence for the validity of SR is enormous, and the theory has stood the test of time — at least so far. But it is the integration of SR into the rest of modern physics that makes it all but impossible to write it off as a failed theory. In experimental high energy physics, for instance, we compute the rest mass of a particle as its identifying statistical signature. The way it works is this: in order to discover a heavy particle, you first detect its daughter particles (decay products, that is), measure their energies and momenta, add them up (as “4-vectors”), and compute the invariant mass of the system as the modulus of the aggregate energy-momentum vector. In accordance with SR, this invariant mass is the rest mass of the parent particle. You do this many thousands of times, make a distribution (a “histogram”), and look for a statistically significant excess at some mass. Such an excess is the signature of the parent particle at that mass.
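To make the bookkeeping concrete, here is a minimal Python sketch of the invariant-mass computation described above. It is not code from any actual experiment: the function name and the toy kinematics are made up, and natural units (c = 1) are assumed.

```python
import numpy as np

def invariant_mass(four_vectors):
    """Invariant mass of a system of daughter particles, in natural units (c = 1).

    Each row of `four_vectors` is (E, px, py, pz) for one daughter.
    """
    E, px, py, pz = np.sum(four_vectors, axis=0)        # add the 4-vectors component-wise
    return np.sqrt(E**2 - (px**2 + py**2 + pz**2))      # modulus of the total energy-momentum vector

# Repeat over many decay candidates and histogram the result; a genuine parent
# particle shows up as a statistically significant excess (a peak) at its rest mass.
rng = np.random.default_rng(0)
masses = []
for _ in range(10_000):
    p = rng.uniform(-5.0, 5.0, size=(2, 3))                            # placeholder momenta, not real data
    E = np.sqrt(np.sum(p**2, axis=1)) + rng.uniform(0.1, 1.0, size=2)  # keeps E^2 >= |p|^2
    masses.append(invariant_mass(np.column_stack([E, p])))

hist, edges = np.histogram(masses, bins=100)
```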

Almost every one of the particles in the particle data book that we know and love is detected using some variant of this method. So the whole Standard Model of particle physics is built on SR. In fact, almost all of modern physics (the physics of the 20th century) is built on it. On the theory side, around 1930, Dirac derived an equation to describe electrons, combining SR and quantum mechanics in an elegant framework and predicting the existence of positrons, a prediction that was later borne out. Although considered incomplete because of its lack of a sound physical backdrop, this “second quantization” and its subsequent experimental verification can rightly be seen as evidence for the rightness of SR.

Feynman took it further and completed quantum electrodynamics (QED), which has been the most rigorously tested theory ever. To digress a bit, Feynman was once being shown around at CERN, and the guide (probably a prominent physicist himself) was explaining the experiments, their objectives etc. Then the guide suddenly remembered who he was talking to; after all, most of the CERN experiments were based on Feynman’s QED. Embarrassed, he said, “Of course, Dr. Feynman, you know all this. These are all to verify your predictions.” Feynman quipped, “Why, you don’t trust me?!” To get back to my point and reiterate it, the whole edifice of the Standard Model of particle physics is built on top of SR. Its success alone is enough to make it impossible for modern physics to discard SR.

So, if you take away SR, you don’t have the Standard Model and QED, and you don’t know how accelerator experiments and nuclear bombs work. The fact that they do is proof enough for the validity of SR, because the alternative (that we managed to build all these things without really knowing how they work) is just too weird. It’s not just the exotic (nuclear weaponry and CERN experiments), but the mundane that should convince us. Fluorescent lighting, laser pointers, LEDs, computers, mobile phones, GPS navigators, iPads — in short, all of modern technology is, in some way, a confirmation of SR.

So the OPERA result on observed superluminality has to be wrong. But I would like it to be right, and in my next post I will explain why — why everything we accept as a verification of SR could be a case of mass delusion, almost literally. Stay tuned!

Faster than Light

CERN has published news about some subatomic particles exceeding the speed of light, according to the BBC and other sources. If confirmed, this will remove the linchpin of modern physics — it is hard to overstate how revolutionary this discovery would be to our collective understanding of the world we live in, from the finest structure of matter to the time evolution of the cosmos. My own anarchical mind revels at the thought of all of modern physics getting rewritten, but I also have a much more personal stake in this story. I will get to it later in this series of posts.

First, I want to describe the backdrop of thought that led to the notion that the speed of light could not be breached. The soundness of that scientific backdrop (if not the actual conclusion about the inviolability of light-speed) makes it very difficult to forgo the intellectual achievements of the past one hundred years in physics, which is what we will be doing once we confirm this result. In my second post, I will list what these intellectual achievements are, and how drastically their form will have to change. The scientists who discovered the speed violation, of course, understand this only too well, which is why they are practically begging the rest of the physics community to find a mistake in this discovery of theirs. As often happens in physics, if you look for something hard enough, you are sure to find it — this is the experimental bias that all experimental physicists worth their salt are aware of and battle against. I hope a false negation doesn’t happen, for, as I will describe in my third post in this series, if confirmed, this speed violation is of tremendous personal importance to me.

The constancy (and the resultant inviolability) of the speed of light, of course, comes from Einstein’s Special Theory of Relativity, or SR. This theory is an extension of a simple idea. In fact, Einstein’s genius is in his ability to carry a simple idea to its logically inevitable, albeit counter-intuitive (to the point of being illogical!) conclusion. In the case of SR, he picks an idea so obvious — that the laws of physics should be independent of the state of motion. If you are in a train going at a constant speed, for instance, you can’t tell whether you are moving or not (if you close the windows, that is). The statement “You can’t tell” can be recast in physics as, “There is no experiment you can devise to detect your state of motion.” This should be obvious, right? After all, if the laws kept changing every time you moved about, it would be as good as having no laws at all.

Then came Maxwell. He wrote down the equations of electricity and magnetism, thereby elegantly unifying them. The equations state, using fancy vector notations, that a changing magnetic field will create an electric field, and a changing electric field will create a magnetic field, which is roughly how a car alternator and an electric motor work. These elegant equations have a wave solution.
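For reference, here is the relevant pair of Maxwell's equations in vacuum, in their standard textbook form (the post itself does not spell them out):

```latex
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t},
\qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \, \frac{\partial \mathbf{E}}{\partial t}
```

This is the precise statement that a changing magnetic field creates an electric field, and a changing electric field creates a magnetic field.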

The existence of a wave solution is no surprise, since a changing electric field generates a magnetic field, which in turn generates an electric field, which generates a magnetic field, and so on ad infinitum. What is surprising is the fact that the speed of propagation of this wave predicted by Maxwell’s equations is c, the speed of light. So it was natural to suppose that light was a form of electromagnetic radiation, which means that if you take a magnet and jiggle it fast enough, you will get light moving away from you at c, if we accept that light is indeed an EM wave.
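Combining the two curl equations gives a wave equation for the fields, with the propagation speed fixed entirely by the constants of electricity and magnetism; this is the standard derivation, reproduced here only for concreteness:

```latex
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \, \frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^8 \ \mathrm{m/s}
```

An identical equation holds for the magnetic field.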

What is infinitely more fundamental is the question of whether Maxwell’s equations are actually laws of physics. It is hard to argue that they aren’t. Then the follow-up question is whether these equations should obey the axiom that all laws of physics are supposed to obey — namely, that they should be independent of the state of motion. Again, it is hard to see why not. Then how do we formulate Maxwell’s equations so that they are independent of the state of motion? This is the project Einstein took on under the fancy name of a “covariant formulation of Maxwell’s equations,” and he published the most famous physics article ever with an even fancier title, “On the Electrodynamics of Moving Bodies.” We now call it the Special Theory of Relativity, or SR.

To get a bit technical, Maxwell’s equations relate the space derivatives of the electric and magnetic fields to their time derivatives and to the charges and currents present. In other words, space and time are related through the equations. And the wave solution to these equations, with its propagation speed of c, becomes a constraint on the properties of space and time. This is a simple philosophical look at SR, more than a physics analysis.

Einstein’s approach was to employ a series of thought experiments to establish that you needed a light signal to sync clocks and hypothesize that the speed of light had to be constant in all moving frames of reference. In other words, the speed of light is independent of the state of motion, as it has to be if Maxwell’s equations are to be laws of physics.

This aspect of the theory is supremely counter-intuitive, which is physics lingo for saying something is hard to believe. In the case of the speed of light, if you take a ray of light, run along with it at high speed, and measure its speed, you still get c. Run against it and measure it — still c. To achieve this constancy, Einstein rewrote the equations of velocity addition and subtraction. One consequence of these rewritten equations is that nothing can go faster than light.
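For the record, the rewritten rule for adding two collinear velocities u and v is the standard relativistic one below. Setting either velocity to c gives back exactly c, which is the algebraic root of the light-speed limit:

```latex
w \;=\; \frac{u + v}{1 + \dfrac{u\,v}{c^{2}}}
```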

This is my long-winded description of the context in which the speed violation measured at OPERA has to be seen. If the violation is confirmed, we have a few unpleasant choices to pick from:

  1. Electrodynamics (Maxwell’s equations) is not invariant under motion.
  2. Light is not really electromagnetic in nature.
  3. SR is not the right covariant formulation of electrodynamics.

The first choice is patently unacceptable because it is tantamount to stating that electrodynamics is not physics. A moving motor (e.g., if you take your electric razor on a flight) would behave differently from a static one (you may not be able to shave). The second choice is also quite absurd. In addition to the numeric equality between the speed of the waves from Maxwell’s equations and the measured value of c, we have other compelling reasons to believe that light is an EM wave. Radio waves induce electric signals in an antenna, light knocks electrons off metals (the photoelectric effect), microwaves excite water molecules and cook food, and so on.

The only real choice we are left with is the last one — which is to say SR is wrong. Why not discard SR? There are more reasons than a blog post can hold, but I’ll try to summarize them anyway in my next post.

How to Avoid Duplicate Imports in iPhoto

For the budding photographer in you, iPhoto is a godsend. It is the iLife photo organization program that comes pre-installed on your swanky new iMac or MacBook Air. In fact, I would go as far as to say that iPhoto is one of the main reasons to switch to a Mac. I know, there are alternatives, but for seamless integration and smooth-as-silk workflow, iPhoto reigns supreme.

But (ah, there is always a “but”), the workflow in iPhoto can create a problem for some. It expects you to shoot pictures, connect your camera to your Mac, move the photos from the camera to the Mac, enhance/edit and share (Facebook, flickr) or print or make photo books. This flow (with some face recognition, red-eye removal, event/album creation etc.) works like a charm — if you are just starting out with your new digital camera. What if you already have 20,000 old photos and scans on your old computer (in “My Pictures”)?

This is the problem I was faced with when I started playing with iPhoto. I pride myself on anticipating such problems. So, I decided to import my old library very carefully. While importing “My Pictures” (which was fairly organized to begin with), I went through it folder by folder, dragging and dropping them onto iPhoto and, at the same time, labeling them (and the photos therein) with what I thought were appropriate colors. (I used the “Get Info” function in Finder for color labels.) I thought I was being clever, but I ended up with a fine (but colorful) mess, with my folders and photos sporting random colors. It looked impossible to compare and figure out where my 20,000 photos had gone in iPhoto; so I decided to write my very first Mac app — iPhotoTagger. It took me about a week to write it, but it sorted out my photo worries. Now I want to sell it and make some money.

Here is what it does. It first goes through your iPhoto library and catalogs what you have there. It then scans the folder you specify and compares the photos in there with those in your library. If a photo is found exactly once in your iPhoto library, it will get a Green label, so that it stands out when you browse to it in your Finder (which is Mac-talk for Windows Explorer). Similarly, if the photo appears more than once in your iPhoto library, it will be tagged in Yellow. And, going the extra mile, iPhotoTagger will color your folder Green if all the photos within have been imported into your iPhoto library. Those folders that have been partially imported will be tagged Yellow.

The photo comparison is done using Exif data, and is fairly accurate. Note that iPhotoTagger doesn’t modify anything within your iPhoto library. Doing so would be unwise. It merely reads the library to gather information.
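To give a flavor of this kind of matching (and only a flavor: the actual app is not written in Python, and the paths, Exif tags and helper names below are my own hypothetical choices), here is a rough sketch of an Exif-based comparison:

```python
import os
from collections import Counter
from PIL import Image  # Pillow

def exif_signature(path):
    """A crude identity key for a photo: camera model, date/time and pixel size from Exif."""
    with Image.open(path) as img:
        exif = img.getexif()
        return (exif.get(0x0110), exif.get(0x0132), img.size)  # 0x0110 = Model, 0x0132 = DateTime

def catalog(folder):
    """Count how many times each Exif signature occurs under `folder`."""
    counts = Counter()
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if name.lower().endswith((".jpg", ".jpeg", ".png", ".tif", ".tiff")):
                try:
                    counts[exif_signature(os.path.join(root, name))] += 1
                except OSError:
                    pass  # unreadable file; skip it
    return counts

# Hypothetical location; the real app reads the iPhoto library itself.
library_counts = catalog(os.path.expanduser("~/Pictures/iPhoto Library"))

def classify(path):
    """Green if the photo appears exactly once in the library, Yellow if more than once."""
    n = library_counts.get(exif_signature(path), 0)
    return "Green" if n == 1 else "Yellow" if n > 1 else "not imported"
```

A folder would then be colored Green if every photo in it is found in the library, and Yellow if only some are.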

This first version (V1.0) is released to test the waters, as it were, and is priced at $1.99. If there is enough interest, I will work on V2.0 with improved performance (using Perl and SQLite, if you must know). I will price it at $2.99. And, if the interest doesn’t wane, a V3.0 (for $3.99) will appear with a proper help file, a performance pane, options to choose your own color scheme, Spotlight comments (and, if you must know, probably rewritten in Objective-C). Before you rush to send me money, please know that iPhotoTagger requires Snow Leopard or Lion (OS X 10.6 or 10.7). If in doubt, you can download the lite version and play with it. It is fully functional, and will create lists of photos/folders to be tagged in Green and Yellow, but won’t actually tag them.

Risk – Wiley FinCAD Webinar

This post is an edited version of my responses in a webinar panel discussion organized by Wiley Finance and FinCAD. The freely available webcast is linked in the post, and contains responses from the other participants — Paul Wilmott and Espen Haug. An expanded version of this post may later appear as an article in the Wilmott Magazine.

What is Risk?

When we use the word Risk in normal conversation, it has a negative connotation — risk of getting hit by a car, for instance; but not the risk of winning a lottery. In finance, risk is both positive and negative. At times, you want the exposure to a certain kind of risk to counterbalance some other exposure; at times, you are looking for the returns associated with a certain risk. Risk, in this context, is almost identical to the mathematical concept of probability.

But even in finance, you have one kind of risk that is always negative — it is Operational Risk. My professional interest right now is in minimizing the operational risk associated with trading and computational platforms.

How do you measure Risk?

Measuring risk ultimately boils down to estimating the probability of a loss as a function of something — typically the intensity of the loss and time. So it’s like asking — What’s the probability of losing a million dollars or two million dollars tomorrow or the day after?

The question of whether we can measure risk is another way of asking whether we can figure out this probability function. In certain cases, we believe we can — in Market Risk, for instance, we have very good models for this function. Credit Risk is a different story — although we thought we could measure it, we learned the hard way that we probably could not.

The question of how effective the measure is, in my view, is like asking ourselves, “What do we do with a probability number?” If I do a fancy calculation and tell you that you have a 27.3% probability of losing one million tomorrow, what do you do with that piece of information? Probability has a reasonable meaning only in a statistical sense, in high-frequency events or large ensembles. Risk events, almost by definition, are low-frequency events, and a probability number may have only limited practical use. But as a pricing tool, an accurate probability is great, especially when you price instruments with deep market liquidity.
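To make that concrete, here is a toy Python illustration of turning a history of daily P/L into a loss probability. Every number below is fabricated, and the whole exercise leans on exactly the statistical assumptions the paragraph above warns about:

```python
import numpy as np

# A toy illustration of "what is the probability of losing a million dollars tomorrow?"
# It assumes tomorrow will look like the recent past -- a strong assumption, and exactly
# the kind that breaks down for rare, low-frequency risk events.
rng = np.random.default_rng(42)
daily_pnl = rng.normal(loc=50_000, scale=600_000, size=1_000)  # fake P/L history, for illustration only

def prob_of_losing_more_than(threshold, pnl_history):
    """Empirical probability that tomorrow's loss exceeds `threshold` dollars."""
    losses = -pnl_history
    return np.mean(losses > threshold)

print(prob_of_losing_more_than(1_000_000, daily_pnl))  # roughly 0.04 with the fake numbers above
```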

Innovation in Risk Management.

Innovation in Risk comes in two flavors — one is on the risk-taking side, which is in pricing, warehousing risk and so on. On this front, we do it well, or at least we think we are doing it well, and innovation in pricing and modeling is active. The flip side of it is, of course, risk management. Here, I think innovation actually lags behind catastrophic events. Once we have a financial crisis, for instance, we do a post-mortem, figure out what went wrong and try to implement safeguards. But the next failure, of course, is going to come from some other, totally unexpected angle.

What is the role of Risk Management in a bank?

Risk taking and risk management are two aspects of a bank’s day-to-day business. These two aspects seem in conflict with each other, but the conflict is no accident. It is through fine-tuning this conflict that a bank implements its risk appetite. It is like a dynamic equilibrium that can be tweaked as desired.

What is the role of vendors?

In my experience, vendors seem to influence the processes rather than the methodologies of risk management, and indeed of modeling. A vended system, however customizable it may be, comes with its own assumptions about the workflow, lifecycle management etc. The processes built around the system will have to adapt to these assumptions. This is not a bad thing. At the very least, popular vended systems serve to standardize risk management practices.

What is Unreal Blog?

Tell us a little about why you started your blog, and what keeps you motivated about it.

As my writings started appearing in different magazines and newspapers as regular columns, I wanted to collect them in one place — as an anthology of the internet kind, as it were. That’s how my blog was born. The motivation to continue blogging comes from the memory of how my first book, The Unreal Universe, took shape out of the random notes I started writing on scrapbooks. I believe the ideas that cross anybody’s mind often get forgotten and lost unless they are written down. A blog is a convenient platform to put them down. And, since the blog is rather public, you take some care and effort to express yourself well.

Do you have any plans for the blog in the future?

I will keep blogging, roughly at the rate of one post a week or so. I don’t have any big plans for the blog per se, but I do have some other Internet ideas that may spring from my blog.

Philosophy is usually seen as a very high concept, intellectual subject. Do you think that it can have a greater impact in the world at large?

This is a question that troubled me for a while, and I wrote a post on it, which may answer it to the best of my ability. To repeat myself a bit, philosophy is merely a description of whatever intellectual pursuits we indulge in. It is just that we don’t often see it that way. For instance, if you are doing physics, you think that you are quite far removed from philosophy. The philosophical spin that you put on a theory in physics is mostly an afterthought, or so it is believed. But there are instances where you can actually apply philosophy to solve problems in physics, and come up with new theories. This indeed is the theme of my book, The Unreal Universe. It asks the question, if some object flew by faster than the speed of light, what would it look like? With the recent claim that matter may travel faster than light, I feel vindicated and look forward to further developments in physics.

Do you think many college students are attracted to philosophy? What would make them choose to major in it?

In today’s world, I am afraid philosophy is supremely irrelevant. So it may be difficult to get our youngsters interested in philosophy. I feel that one can hope to improve its relevance by pointing out the interconnections between whatever it is that we do and the intellectual aspects behind it. Would that make them choose to major in it? In a world driven by excesses, it may not be enough. Then again, it is a world where articulation is often mistaken for accomplishment. Perhaps philosophy can help you articulate better, sound really cool and impress that girl you have been after — to put it crudely.

More seriously, though, what I said about the irrelevance of philosophy can be said about, say, physics as well, despite the fact that it gives you computers and iPads. For instance, when Copernicus came up with the notion that the earth is revolving around the sun rather than the other way round, profound though this revelation was, in what way did it change our daily life? Do you really have to know this piece of information to live your life? This irrelevance of such profound facts and theories bothered scientists like Richard Feynman.

What kind of advice or recommendations would you give to someone who is interested in philosophy, and who would like to start learning more about it?

I started my path toward philosophy via physics. I think philosophy by itself is too detached from everything else for you to really start with it. You have to find your way toward it from whatever your work entails, and then expand from there. At least, that’s how I did it, and that way made it very real. When you ask yourself a question like what is space (so that you can understand what it means to say that space contracts, for instance), the answers you get are very relevant. They are not some philosophical gibberish. I think similar paths to relevance exist in all fields. See, for example, how Pirsig brought out the notion of quality in his work, not as an abstract definition, but as an all-consuming (and eventually dangerous) obsession.

In my view, philosophy is a wrapper around multiple silos of human endeavor. It helps you see the links among seemingly unrelated fields, such as cognitive neuroscience and special relativity. Of what practical use is this knowledge, I cannot tell you. Then again, of what practical use is life itself?

Luddite Thoughts

For all its pretentiousness, French cuisine is pretty amazing. Sure, I’m no degustation connoisseur, but the French really know how to eat well. It is little wonder that the finest restaurants in the world are mostly French. The pivotal aspect of a French dish is usually its delicate sauce, along with choice cuts, and, of course, inspired presentation (AKA huge plates and minuscule servings). The chefs, those artists in their tall white hats, show off their talent primarily in the subtleties of the sauce, for which knowledgeable patrons happily hand over large sums of money in those establishments, half of which are called “Cafe de Paris” or have the word “petit” in their names.

Seriously, sauce is king (to use Bollywood lingo) in French cuisine, so I found it shocking when I saw on the BBC that more and more French chefs were resorting to factory-manufactured sauces. Even the slices of boiled eggs garnishing their overpriced salads come in a cylindrical form wrapped in plastic. How could this be? How could they use mass-produced garbage and pretend to be serving up the finest gastronomical experiences?

Sure, we can see corporate and personal greed driving the policies to cut corners and use the cheapest of ingredients. But there is a small technology success story here. A few years ago, I read in the newspaper that they found fake chicken eggs in some Chinese supermarkets. They were “fresh” eggs, with shells, yolks, whites and everything. You could even make omelets with them. Imagine that — a real chicken egg probably costs only a few cents to produce. But someone could set up a manufacturing process that could churn out fake eggs cheaper than that. You have to admire the ingenuity involved — unless, of course, you have to eat those eggs.

The trouble with our times is that this unpalatable ingenuity is all-pervasive. It is the norm, not the exception. We see it in tainted paint on toys, harmful garbage processed into fast food (or even fine dining, apparently), poison in baby food, imaginative fine print on financial papers and “EULAs”, substandard components and shoddy workmanship in critical machinery — in every facet of our modern life. Given such a backdrop, how do we know that the “organic” produce, though we pay four times as much for it, is any different from the normal produce? To put it all down to faceless corporate greed, as most of us tend to do, is a bit simplistic. Going one step further to see our own collective greed in the corporate behavior (as I proudly did a couple of times) is also perhaps trivial. What are corporates these days, if not collections of people like you and me?

There is something deeper and more troubling in all this. I have some disjointed thoughts, and will try to write them up in an ongoing series. I suspect these thoughts of mine are going to sound similar to the luddite ones un-popularized by the infamous Unabomber. His idea was that our normal animalistic instincts of the hunter-gatherer kind are being stifled by the modern societies we have developed into. And, in his view, this unwelcome transformation and the consequent tension and stress can be countered only by an anarchical destruction of the propagators of our so-called development — namely, universities and other technology generators. Hence the bombing of innocent professors and such.

Clearly, I don’t agree with this luddite ideology, for if I did, I would have to first bomb myself! I’m nursing a far less destructive line of thought. Our technological advances and their unintended backlashes, with ever-increasing frequency and amplitude, remind me of something that fascinated my geeky mind — the phase transition between structured (laminar) and chaotic (turbulent) states in physical systems (when flow rates cross a certain threshold, for instance). Are we approaching such a threshold of phase transition in our social systems and societal structures? In my moody luddite moments, I feel certain that we are.

Risk: Interpretation, Innovation and Implementation


A Wiley Global Finance roundtable with Paul Wilmott

Featuring Paul Wilmott, Espen Haug and Manoj Thulasidas

PLEASE JOIN US FOR THIS FREE WEBINAR PRESENTED BY FINCAD AND WILEY GLOBAL FINANCE

How do you identify, measure and model risk, and more importantly, what changes need to be implemented to improve the long-term profitability and sustainability of our financial institutions? Take a unique opportunity to join globally recognised and respected experts in the field, Paul Wilmott, Espen Haug and Manoj Thulasidas in a free, one hour online roundtable discussion to debate the key issues and to find answers to questions to improve financial risk modelling.

Join our experts as they address these fundamental financial risk questions:

  • What is risk?
  • How do we measure and quantify risk in quantitative finance? Is this effective?
  • Is it possible to model risk?
  • Define innovation in risk management. Where does it take place? Where should it take place?
  • How do new ideas see the light of day? How are they applied to the industry, and how should they be applied?
  • How is risk management implemented in modern investment banking? Is there a better way?

Our panel of internationally respected experts includes Dr Paul Wilmott, founder of the prestigious Certificate in Quantitative Finance (CQF) and Wilmott.com, Editor-in-Chief of Wilmott Magazine, and author of highly acclaimed books including the best-selling Paul Wilmott On Quantitative Finance; Dr Espen Gaarder Haug, who has more than 20 years of experience in derivatives research and trading and is the author of The Complete Guide to Option Pricing Formulas and Derivatives: Models on Models; and Dr Manoj Thulasidas, a physicist-turned-quant who works as a senior quantitative professional at Standard Chartered Bank in Singapore and is the author of Principles of Quantitative Development.

This debate will be critical for all chief risk officers, credit and market risk managers, asset liability managers, financial engineers, front office traders, risk analysts, quants and academics.

A Parker Pen from Singapore

During the early part of the last century, there was significant migration of Chinese and Indians to Singapore. Most of the migrants of Indian origin were ethnic Tamils, which is why Tamil is an official language here. But some came from my Malayalam-speaking native land of Kerala. Among them was Natarajan who, fifty years later, would share with me his impressions of Netaji Subhash Chandra Bose and the Indian National Army of the forties. Natarajan would, by then, be called the Singapore Grandpa (Singapore Appuppa), and teach me yoga, explaining the mystical aspects of it a bit, saying things like, “A practitioner of yoga, even when he is in a crowd, is not quite a part of it.” I remembered this statement when a friend of mine at work commented that I walked untouched (kind of like Tim Robbins in The Shawshank Redemption) by the corporate hustle and bustle, which, of course, may have been a polite way of calling me lazy.

Anyway, the Singapore Grandpa (a cousin to my paternal grandfather) was quite fond of my father, who was among the first University graduates from that part of Kerala. He got him a Parker pen from Singapore as a graduation gift. Some fifteen years later, this pen would teach me a lesson that is still not fully learned four decades on.

My father was very proud of his pen, its quality and sturdiness, and was bragging to his friends once. “I wouldn’t be able to break it, even if I wanted to!” he said, without noticing his son (yours faithfully), all of four years old then, with only a limited understanding of hypothetical conditionals of this kind. The next evening, when he came back from work, I was waiting for him at the door, beaming with pride, holding his precious pen thoroughly crushed. “Dad, dad, I did it! I managed to break your pen for you!”

Heart-broken as my father must have been, he didn’t even raise his voice. He asked, “What did you do that for, son?” using the overly affectionate Malayalam word for “son”. I was only too eager to explain. “You said yesterday that you had been trying to break it, but couldn’t. I did it for you!” Rather short on language skills, I was already a bit too long on physics. I had placed the pen near the hinges of a door and used the lever action by closing it to accomplish my mission of crushing it. In fact, I remembered this incident when I was trying to explain to my wife (short on physics) why the door stopper placed close to the hinges was breaking the floor tiles rather than stopping the door.

My father tried to fix his Parker pen with scotch tape (which was called cellophane tape at that time) and rubber bands. Later, he managed to replace the body of the pen although he could never quite fix the leaking ink. I still have the pen, and this enduring lesson in infinite patience.

Two and a half years ago, my father passed away. During the ensuing soul-searching, this close friend of mine asked me, “Well, now that you know what it takes, how well do you think you are doing?” I don’t think I am doing that well, for some lessons, even when fully learned, are just too hard to put into practice.

Photo by dailylifeofmojo cc

Dualism

After being called one of the top 50 philosophy bloggers, I feel almost obliged to write another post on philosophy. This might vex Jat who, while appreciating the post on my first car, was somewhat less than enthusiastic about my deeper thoughts. Also looking askance at my philosophical endeavors would be a badminton buddy of mine who complained that my posts on death scared the bejesus out of him. But, what can I say, I have been listening to a lot of philosophy. I listened to the lectures by Shelly Kagan on just that dreaded topic of death, and by John Searle (again) on the philosophy of mind.

Listening to these lectures filled me with another kind of dread. I realized once again how ignorant I am, and how much there is to know, think and figure out, and how little time is left to do all that. Perhaps this recognition of my ignorance is a sign of growing wisdom, if we can believe Socrates. At least I hope it is.

One thing I had some misconceptions about (or an incomplete understanding of) was this concept of dualism. Growing up in India, I heard a lot about our monistic philosophy called Advaita. The word means not-two, and I understood it as the rejection of the Brahman and Maya distinction. To illustrate it with an example, say you sense something — like you see these words in front of you on your computer screen. Are these words and the computer screen really out there? If I were to somehow generate the neuronal firing patterns that create this sensation in you, you would see these words even if they were not there. This is easy to understand; after all, this is the main thesis of the movie The Matrix. So what you see is merely a construct in your brain; it is Maya or part of the Matrix. What is causing the sensory inputs is presumably Brahman. So, to me, Advaita meant trusting only the realness of Brahman while rejecting Maya. Now, after reading a bit more, I’m not sure that was an accurate description at all. Perhaps that is why Ranga criticized me a long time ago.

In Western philosophy, there is a different and more obvious kind of dualism. It is the age-old mind-matter distinction. What is mind made of? Most of us think of mind (those who think of it, that is) as a computer program running on our brain. In other words, mind is software, brain is hardware. They are two different kinds of things. After all, we pay separately for hardware (Dell) and software (Microsoft). Since we think of them as two, ours is an inherently dualistic view. Before the time of computers, Descartes thought of this problem and said there was a mental substance and a physical substance. So this view is called Cartesian Dualism. (By the way, Cartesian coordinates in analytic geometry came from Descartes as well — a fact that might enhance our respect for him.) It is a view that has vast ramifications in all branches of philosophy, from metaphysics to theology. It leads to the concepts of spirit and souls, God, afterlife, reincarnation etc., with their inescapable implications on morality.

There are philosophers who reject this notion of Cartesian dualism. John Searle is one of them. They embrace the view that mind is an emergent property of the brain. An emergent property (more fancily called an epiphenomenon) is something that happens incidentally along with the main phenomenon, but is neither the cause nor the effect of it. An emergent property in physics that we are familiar with is temperature, which is a measure of the average kinetic energy of a bunch of molecules. You cannot define temperature unless you have a statistically significant collection of molecules. Searle uses the wetness of water as his example to illustrate the emergence of properties. You cannot have a wet water molecule or a dry one, but when you put a lot of water molecules together you get wetness. Similarly, mind emerges from the physical substance of the brain through physical processes. So all the properties that we ascribe to mind are to be explained away as physical interactions. There is only one kind of substance, which is physical. So this monistic philosophy is called physicalism. Physicalism is part of materialism (not to be confused with its current meaning — what we mean by a material girl, for instance).

You know, the trouble with philosophy is that there are so many isms that you lose track of what is going on in this wild jungle of jargonism. If I coined the word unrealism to go with my blog and promoted it as a branch of philosophy, or better yet, a Singaporean school of thought, I’m sure I can make it stick. Or perhaps it is already an accepted domain?

All kidding aside, the view that everything on the mental side of life, such as consciousness, thoughts, ideals etc., is a manifestation of physical interactions (I’m restating the definition of physicalism here, as you can see) enjoys a certain currency among contemporary philosophers. Both Kagan and Searle readily accept this view, for example. But this view is in conflict with what the ancient Greek philosophers like Socrates, Plato and Aristotle thought. They all believed in some form of continued existence of a mental substance, be it the soul, spirit or whatever. All major religions have some variant of this dualism embedded in their beliefs. (I think Plato’s dualism is of a different kind — a real, imperfect world where we live on the one hand, and an ideal perfect world of forms on the other where the souls and Gods live. More on that later.) After all, God has to be made up of a spiritual “substance” other than a pure physical substance. Or how could he not be subject to the physical laws that we, mere mortals, can comprehend?

Nothing in philosophy is totally disconnected from anything else. A fundamental stance such as dualism or monism that you take in dealing with the questions of consciousness, cognition and mind has ramifications in what kind of life you lead (Ethics), how you define reality (Metaphysics), and how you know these things (Epistemology). Through its influence on religions, it may even impact the political power struggles of our troubled times. If you think about it long enough, you can connect the dualist/monist distinction even to aesthetics. After all, Robert Pirsig did just that in his Zen and the Art of Motorcycle Maintenance.

As they say, if the only tool you have is a hammer, all problems begin to look like nails. My tool right now is philosophy, so I see little philosophical nails everywhere.

Physics vs. Finance

Despite the richness that mathematics imparts to life, it remains a hated and difficult subject for many. I feel that the difficulty stems from the early and often permanent disconnect between math and reality. It is hard to memorize that the reciprocals of bigger numbers are smaller, while it is fun to figure out that if you have more people sharing a pizza, each gets a smaller slice. Figuring out is fun, memorizing — not so much. Mathematics, being a formal representation of the patterns in reality, doesn’t put too much emphasis on the figuring out part, and it is plain lost on many. To repeat that statement with mathematical precision — math is syntactically rich and rigorous, but semantically weak. Syntax can build on itself, and often shake off its semantic riders like an unruly horse. Worse, it can metamorphose into different semantic forms that look vastly different from one another. It takes a student a few years to notice that complex numbers, vector algebra, coordinate geometry, linear algebra and trigonometry are all essentially different syntactical descriptions of Euclidean geometry. Those who excel in mathematics are, I presume, the ones who have developed their own semantic perspectives to rein in the seemingly wild syntactical beast.

Physics also can provide beautiful semantic contexts to the empty formalisms of advanced mathematics. Look at Minkowski space and Riemannian geometry, for instance, and how Einstein turned them into descriptions of our perceived reality. In addition to providing semantics to mathematical formalism, science also promotes a worldview based on critical thinking and a ferociously scrupulous scientific integrity. It is an attitude of examining one’s conclusions, assumptions and hypotheses mercilessly to convince oneself that nothing has been overlooked. Nowhere is this nitpicking obsession more evident than in experimental physics. Physicists report their measurements with two sets of errors — a statistical error representing the fact that they have made only a finite number of observations, and a systematic error that is supposed to account for the inaccuracies in methodology, assumptions etc.

We may find it interesting to look at the counterpart of this scientific integrity in our neck of the woods — quantitative finance, which decorates the syntactical edifice of stochastic calculus with dollars-and-cents semantics, of a kind that ends up in annual reports and generates performance bonuses. One might even say that it has a profound impact on the global economy as a whole. Given this impact, how do we assign errors and confidence levels to our results? To illustrate it with an example, when a trading system reports the P/L of a trade as, say, seven million, is it $7,000,000 +/- $5,000,000 or is it $7,000,000 +/- $5,000? The latter, clearly, holds more value for the financial institution and should be rewarded more than the former. We are aware of it. We estimate the errors in terms of the volatility and sensitivities of the returns and apply P/L reserves. But how do we handle other systematic errors? How do we measure the impact of our assumptions on market liquidity, information symmetry etc., and assign dollar values to the resulting errors? If we had been scrupulous about error propagation of this kind, perhaps the financial crisis of 2008 would not have come about.
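As a back-of-the-envelope illustration (not how any trading system actually reports P/L), here is what propagating input uncertainties into an error bar on a P/L figure might look like, in the spirit of the physicist's systematic error. Every figure below is made up:

```python
import numpy as np

# Propagate the uncertainty of the market inputs into the reported P/L number.
# The trade, the sensitivities and the input volatilities are all invented for illustration.
pnl_estimate = 7_000_000                              # reported P/L in dollars
sensitivities = np.array([2.0e6, -0.5e6, 1.2e6])      # dP/dx_i for three market inputs (dollars per unit)
input_vols = np.array([0.8, 1.5, 0.6])                # one-day uncertainty of each input (same units as x_i)

# First-order error propagation, assuming independent inputs:
pnl_error = np.sqrt(np.sum((sensitivities * input_vols) ** 2))

print(f"P/L = {pnl_estimate:,.0f} +/- {pnl_error:,.0f}")
# The same seven million means very different things when the error bar is 5,000,000 versus 5,000.
```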

Although mathematicians are, in general, freer of such critical self-doubt than physicists — precisely because of a total disconnect between their syntactical wizardry and its semantic contexts, in my opinion — there are some who take the validity of their assumptions almost too seriously. I remember this professor of mine who taught us mathematical induction. After proving some minor theorem using it on the blackboard (yes, it was before the era of whiteboards), he asked us whether he had proved it. We said, sure, he had done it right in front of us. He then said, “Ah, but you should ask yourselves if mathematical induction is right.” If I think of him as a great mathematician, it is perhaps only because of the common romantic fancy of ours that glorifies our past teachers. But I am fairly certain that the recognition of the possible fallacy in my glorification is a direct result of the seeds he planted with his statement.

My professor may have taken this self-doubt business too far; it is perhaps not healthy or practical to question the very backdrop of our rationality and logic. What is more important is to ensure the sanity of the results we arrive at, employing the formidable syntactical machinery at our disposal. The only way to maintain an attitude of healthy self-doubt and the consequent sanity checks is to jealously guard the connection between the patterns of reality and the formalisms in mathematics. And that, in my opinion, would be the right way to develop a love for math as well.