Tuesday, September 24, 2013

The precursor effect

My brother-in-law is very passionate about history. He sent me a text showing that Nicole Oresme discovered the law of uniformly varied motion two to three hundred years before Galileo.

Nicole Oresme (image from Wikipedia)

In fact, he discovered several other things before others did. This should not be big news to historians of science, but it was to me. I checked his Wikipedia page and found it written that "Oresme manages to anticipate Galileo's discovery". So I replied to my brother-in-law, who also seemed a bit disappointed that Oresme is presented only as a "precursor":
it seems that the words "precursor" and "anticipate" have different meanings than we thought. The Romanian prime minister is accused of plagiarism because he copied almost his entire PhD thesis from another guy's book. Does this make the other guy a precursor of the prime minister?
In science, it often happens that some effect or discovery is attributed to a more famous guy, although it is known that it was discovered by a less famous guy, sometimes a long time before. This even has a name, the Matthew effect, coined by Robert Merton and inspired by the Bible verse Matthew 25:29 (King James Version):
For unto every one that hath shall be given, and he shall have abundance: but from him that hath not shall be taken even that which he hath.
Even if this law was apparently given by God, it seems unfair to me. But why do scientists abide by such an unfair law?

My only explanation is that it is more practical. When speaking with someone about an effect, we call it by the name used by the majority. We do the same when writing papers, so that interested people can find it in databases under the most common name. So, the reasons seem to be practical. But even in this case, you can use the names of both persons, and this is a common practice too. I, for example, make an effort to write all the names in "Friedmann-Lemaître-Robertson-Walker singularity", and to do it in chronological order, because it is fairer than just calling it "Robertson-Walker".

I think it is very important, especially in a published work, to make sure that every time you mention the well-known scientists and inventors, you also acknowledge their "poorer relatives", the "precursors" who merely "anticipated". Otherwise, after a while, they will be completely erased from history. When you mention them, everybody will say "I haven't heard of him or her, and this name is not mentioned in any textbook or paper I've read". Check, for example, the history and talk pages of the Wikipedia article about the Bohr magneton. The value of this physical constant was first found by Ştefan Procopiu; this is a historical fact.
Ştefan Procopiu (image from Wikipedia)

Three years ago there was a "war", because somebody decided to "get rid of Procopiu" (his own words) from the Wikipedia page of the magneton. I will not reproduce the exchanged words, but I think the main reason he made the removal was that he had never encountered Procopiu's name in relation to the magneton (in particular in Pais's biography of Bohr, which obviously was not a biography of Procopiu). There was a reference to Procopiu's paper, but he removed it too. I posted on the talk page, in addition to citations of two papers by Procopiu, a list of textbooks by experts, and I explained that I have nothing against keeping the reference to Bohr, but why should we remove someone who really was the first to find it, and who published it in two papers? Eventually, Ştefan Procopiu was accepted back into history, as a humble precursor who just happened to find the magneton first.

The precursor effect. This takes place when, in order to avoid acknowledging that a person is the real author of a discovery or invention, one calls that person a "precursor".
coined by me, or by some precursor of mine

Friday, September 13, 2013

Buckminster Fuller's romantic lie

Sean Carroll blogged recently, in Is Work Necessary?, about a quote attributed to Buckminster Fuller, which seems to be trendy (or, as it is trendy to say, "it became a meme"). I reproduce the picture from Sean's lucid blog.

I very much agree with the part of the quote saying that technological progress should allow us to work less. Indeed, since we could make a living before the invention of machines, and especially computers, it seems logical that now we could make a living by working, say, one day a month or so. It is indubitable that technological progress has multiplied our productivity dozens of times over. And at this pace, who knows, maybe in twenty years or so there will be robots doing 99.99% of our jobs.


So, why can't we be unemployed in this society? If you don't pay, you can't get even a glass of water, or a place to sit. Not to mention the luxury of medical care. So, with all this progress, why do we still need jobs? Some of us need them just to live. Others, to live and, in addition, to be able to buy the latest stuff: the newest iPhone, a new car, TV, computer, a game console you will not get enough time to use, etc. Add to this that having a job is fun sometimes, even if only during the lunch breaks. At a job you make friends, some for a lifetime, others just for the duration of the job. But the sad truth is that many of us can't even conceive of being unemployed, simply because it would be boring. It takes imagination to work on your own dream, instead of building your employer's dream.

Obviously, if we decide to consume less, we can work less and still make a living. We can choose downshifting (this is something I did). But, paradoxically, whenever you try to work fewer hours, the employer tends to consider you lazy (even if you are more productive than some full-time colleagues). Your salary remains small forever, because you lack full-time experience. And you can't find another employer to hire you part-time with a reasonable salary, because you raise suspicion by wanting more free time. In the meantime, the expenses keep increasing, so eventually you have to give up and become another brick in the wall. Of course, some of us can build successful businesses, which allow them to do nothing for the rest of their lives. But how many can do this? How many small businesses have failed, bankrupting entire families, for a single one to be successful?

So I think that the main idea from the quote, that most of us can do what we like instead of working, is a romantic lie.

But one could ask, "did you test Buckminster Fuller's advice before criticizing it?" Well, this is precisely what I did for several years.

Here is his concluding remark:

The true business of people should be to go back to school and think about whatever it was they were thinking about before somebody came along and told them they had to earn a living.
After living a Bohemian life as a student and as a high school math teacher, I had to find a better-paid job. So I built a successful career as a computer programmer, specializing in something that guarantees high salaries even in Romania (geometric algorithms, especially for CAD/CAM).

After several years, I decided to go back to school and do my Master's and PhD in something I like: geometry and mathematical physics. Soon I will defend my PhD (the thesis has been done for almost a year). I like physics, I love to think about unsolved problems in foundational physics and to try to solve them. I do this for fun, without being paid (not that I don't want to be paid).

For my thesis, I researched the problem of singularities in General Relativity, but in the meantime I was also active in the foundations of Quantum Mechanics. Going against the standard approaches to the problem, I wrote and published several papers in well-rated peer-reviewed physics journals (here is a continuously updated list of my papers). In the meantime, I have to earn a living for myself and my four-member family, pay the mortgage and bills, and sometimes attend conferences. So I have to work, as part-time as I can, as a computer programmer. There is the alternative that, after I finish my PhD, I join a team as a postdoc and get paid to do what I love. I like this idea, but will I find a position that guarantees me the freedom to research what I want? Or is the only way to help senior researchers make their dreams come true?

So, Mr. Fuller, thirty years after your death, your beautiful idea is still a romantic lie. And if, in another thirty years, robots are able to do 99.99% of our work, chances are that society will still find a way to keep us busy.

Wednesday, September 11, 2013

Global and local aspects of causality in quantum mechanics

This paper contains my talk at the conference "The Time Machine Factory, [speakable, unspeakable] on Time Travel in Turin" (Turin, Italy, October 14-19, 2012). The conference was very well organized, and the list of participants was really impressive. The proceedings were recently published online at EPJ Web of Conferences. Here is the link to my paper, and to the arXiv version. Here is a link to the slides.

Abstract
Quantum mechanics forces us to reconsider certain aspects of classical causality. The 'central mystery' of quantum mechanics manifests in different ways, depending on the interpretation. This mystery can be formulated as the possibility of selecting part of the initial conditions of the Universe 'retroactively'. This talk aims to show that there is a global, timeless, 'bird's view' of the spacetime, which makes this mystery more reasonable. We will review some well-known quantum effects from the perspective of global consistency.

This picture (which I made for the slides) represents the directions used in the proof of the Kochen–Specker theorem, as simplified by A. Peres, and arranged by R. Penrose in a pattern inspired by M. C. Escher's Waterfall.

This paper develops some of the ideas I presented in my essay "The Tao of It and Bit", which qualified for the finals of the FQXi essay contest "It from Bit or Bit from It?" (2013).

Thursday, September 5, 2013

On the Weyl Curvature Hypothesis

Here are the five-minute slides made for my paper On the Weyl Curvature Hypothesis (Annals of Physics, Volume 338, November 2013, Pages 186–194, arXiv:1203.3382).


Abstract
The Weyl curvature hypothesis of Penrose attempts to explain the high homogeneity and isotropy, and the very low entropy of the early universe, by conjecturing the vanishing of the Weyl tensor at the Big-Bang singularity.

In previous papers, an equivalent form of Einstein's equation has been proposed, which extends it and remains valid at an important class of singularities (including, in particular, the Schwarzschild, FLRW, and isotropic singularities). Here it is shown that if the Big-Bang singularity belongs to this class, it also satisfies the Weyl curvature hypothesis.

As an application, we study a very general example of a cosmological model, which generalizes the FLRW model by dropping the isotropy and homogeneity constraints. This model also generalizes isotropic singularities, as well as a class of singularities occurring in Bianchi cosmologies. We show that the Big-Bang singularity of this model is of the type under consideration, and therefore satisfies the Weyl curvature hypothesis.
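For readers who want a reminder of what the conjecture says quantitatively (this is standard textbook material, not something taken from the paper itself): in four dimensions the Weyl tensor $C_{abcd}$ is the completely trace-free part of the Riemann tensor,

\[
R_{abcd} = C_{abcd} + g_{a[c}R_{d]b} - g_{b[c}R_{d]a} - \tfrac{1}{3}\,R\,g_{a[c}g_{d]b},
\]

and the Weyl curvature hypothesis asserts that $C_{abcd} \to 0$ as one approaches the Big-Bang singularity, while the Ricci part of the curvature, which is tied to the matter content through Einstein's equation, is allowed to diverge there.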

Picasso's revenge

In a previous post, Picasso is so overrated!, I criticized Picasso's painting Family of Saltimbanques for containing several childish mistakes. Or at least I consider them mistakes; others may think they were made on purpose, to send a message which only they can see.

The above-mentioned painting was not the only one with mistakes. For instance, below is an annotated image of Boy with a Dog, painted in the same year. Again, we see a disregard for proportions: this boy's arms are disproportionately long, his hands able to hang below his knees! The dog is fine.


Were these so-called mistakes really mistakes, or did they serve a higher purpose, sending a message which could not be sent by conforming to the arid laws of proportion, perspective, and anatomy? Last time I argued that they are mistakes, because they were made before Picasso's cubist period. If somebody says that Picasso deliberately broke the rules, being one of the founders of cubism, well, he was not always a cubist. As a cubist, Picasso deliberately violated the rules, but before that, why would he try to be so conformist in all his paintings, only to break the rules occasionally? A possible explanation is that he did not master the techniques so well, and did not have such a good intuition of how objects sit in space. Of course, to respect anatomy, he could have used models or wooden mannequins, and he could have made some sketches first. Maybe he was lazy, or thought it was beneath him to do this, or that it would affect his inspiration. Perhaps he observed, or was told, that something was wrong with the positions and proportions, that they did not fit well, but was too lazy to redo the entire painting, or thought that it represented what he meant so well that he wouldn't change anything.

Anyway, if he was making such childish mistakes, then he may have found his salvation in cubism. He found in cubism freedom of expression, not because the classical means were too limited, but rather because he could not master them. So it is not excluded that he thought he had something to say, but couldn't say it because he was "illiterate" in painting. Like an aspiring poet who doesn't know grammar and spelling, and decides to invent his own. He couldn't play the game, so he changed the rules and invented his own game. It seems that, by doing this, he was able to find many others willing to play by his rules, and even to spend real fortunes on his works. If there is a public, then this is, after all, art.

I will close with a quote from Roger Waters (Curtis, James M. (1987). Rock Eras: Interpretations of Music and Society, 1954-1984. Popular Press. p. 283. ISBN 0879723696.)
Audiences at those vast concerts are there for an excitement which, I think, has to do with the love of success. When a band or a person becomes an idol, it can have to do with the success that that person manifests, not the quality of work he produces. You don't become a fanatic because somebody's work is good, you become a fanatic to be touched vicariously by their glamour and fame. Stars—film stars, rock 'n' roll stars—represent, in myth anyway, the life as we'd all like to live it. They seem at the very centre of life. And that's why audiences still spend large sums of money at concerts where they are a long, long way from the stage, where they are often very uncomfortable, and where the sound is often very bad.