Showing posts with label science.

Monday, September 30, 2013

Amazing science

There's a new atheist kid on the YouTube block: Jaclyn Glenn.


Here's her profile:
Jaclyn Glenn was born March 25, 1988, and lives in Florida, US. She is currently going to medical school and uploads regularly. It is believed that she was married in 2010, but her current relationship status is unknown. Her success on YouTube is with the channel "JaclynGlenn", where she discusses topics such as religion, atheism, animal rights, politics, masturbation, and many other issues in a serious yet comical fashion. She has recently admitted to being an atheist and skeptic, but does not have an abrasive personality like many other atheist vloggers on the site.
In that final sentence, the term "vloggers" designates video bloggers: that's to say, individuals who submit regular blog posts in video form. Jaclyn Glenn's video creations can be found here. Countless Americans will be shocked by her moving version of a sacred anthem, below:


Needless to say, Richard Dawkins was an instant fan of Jaclyn.

Tuesday, July 23, 2013

Awesome news from the Mother Country


When I first noticed a tweet suggesting that our dear former prime minister Julia Gillard was knitting a stuffed kangaroo for the future monarch, I said to myself: "What a charming little hoax story... at about the level of a high school magazine." I would never have imagined that it was the bloody fair dinkum truth!

— photography Grant Matthews, styling by Judith Cook © The Australian Women's Weekly

Shit, we're a dumb country, with super-dumb leaders. Is this dreary situation likely to improve in the foreseeable future?

This morning, I was amused to realize, all of a sudden, that my attitudes towards royalty are quite similar to my attitudes towards religion. That's to say, they're excellent subjects for historical research and reflection (my genealogical writings, for example, constantly evoke both royalty and religion), but these two themes are as out-of-place in the modern world as the old diseases that, until so recently, used to decimate babies and young kids, royals and plebeians alike.



Thanks to science, the world has evolved remarkably since the time when I was a child, witnessing other kids condemned by such terrible afflictions. Today, it's high time that we were all immunized... against archaic religion and royalty.

POST SCRIPTUM: In looking back upon what I've written this morning, I'm amused to realize that I'm far less interested in the Windsor baby than in the three little marcassins that emerged from beneath my mailbox. A wise observer might conclude (whether or not it's meaningful, let alone true): An old man is awed by the babies that he deserves.

Thursday, September 22, 2011

Ask for evidence

The British charitable trust Sense about Science has just launched a promotional campaign on the theme of the constant necessity to ask for evidence concerning all claims.

Normally, among rational human beings, this necessity should itself be evident. Sadly, though, we know it isn't. Mindless believers, Taliban-like fundamentalists, quacks and snake-oil salesmen abound. Their common characteristic, which they often share unwittingly, is a total disrespect for evidence.

Last night, in the USA, the state of Georgia executed Troy Davis.

There was no evidence proving that this man had ever committed the murder for which he was condemned.

Wednesday, September 14, 2011

Myths versus truth

In my recent blog post entitled Children's books [display], I indicated that Richard Dawkins has a book for children coming out soon, on the theme of evolution. Last night, he was interviewed on this subject by the BBC.

Tuesday, August 23, 2011

Wonders of the world

Humans have always liked to draw up lists of the most marvelous things in the world. In Hellenistic times, a famous list of this kind contained seven items, including hanging gardens (Babylon), a lighthouse (Alexandria) and a tomb (Halicarnassus).

From this assortment of world wonders, the only one that still exists is the Great Pyramid of Giza.

Meanwhile, even this marvel has recently been depleted of much of its mystery thanks to the ingenious findings of an amateur archaeologist, the French architect Jean-Pierre Houdin, presented in my article entitled How did they do it? [display].

Periodically, people get excited about drawing up new lists of the world's wonders, often using Internet polls, but the outcome is generally of little interest, if not biased. For example, one such list, proposed by a Californian fellow named Matt Rosenberg, includes space exploration (which is a rather fuzzy wonder), the engineering phenomenon of telecom and the Internet (even fuzzier still), the tunnel under the sea between France and the UK, and the modern state of Israel (whose creation is said to be "nothing short of a miracle"). And I wouldn't be surprised if my Australian compatriots, asked to draw up a revised list of world wonders, were to include instinctively the Sydney Harbour Bridge and their opera house.

At the start of his 1997 masterpiece entitled The Fabric of Reality, which I presented briefly in a blog post of 2007 [display], the Oxfordian physicist David Deutsch included a dedication:
Dedicated to the memory of Karl Popper, Hugh Everett and Alan Turing, and to Richard Dawkins. This book takes their ideas seriously.
Richard Dawkins, alive and well, needs no introduction, at least not in my Antipodes blog, which celebrates constantly the insights and writings of this great Oxfordian intellectual. Concerning the other three, most people know that Karl Popper [1902-1994] was a Vienna-born philosopher, mentioned in my recent article entitled Voices from Vienna [display]. And Alan Turing [1912-1954] was, of course, the English mathematician who worked in the bunkers of Bletchley Park (Buckinghamshire) during the war years, designing primitive "computers" to crack Nazi codes.

But who's the third guy, Hugh Everett? Well, he died in 1982, at the age of 51. Today, his 48-year-old son Mark—singer, writer and performer with the band Eels—is no doubt better known than his father.


If Deutsch mentioned Everett Senior in his dedication, it's because this man introduced into science one of the weirdest ideas that a human brain has ever imagined, if not the weirdest idea: the existence of a multiverse. That's to say, the everyday universe to which we've grown accustomed could well be just one of very many coexisting universes.

Getting back to wonders of the world, I agree totally with David Deutsch that they number four, and that they can be represented respectively by the four individuals mentioned in his terse dedication. We are speaking here neither of natural marvels (such as the Great Barrier Reef) nor of spectacular worldly constructions (such as the Taj Mahal). Deutsch has indicated four stupendous intellectual creations, built by identifiable humans, which surpass infinitely the splendors of pyramids, palaces, temples, tombs, skyscrapers, etc. They are wonders of the world in the sense that (a) we might well wonder how humble human beings have acquired the wisdom to create such knowledge structures, and (b) the nature and consequences of these wonders leave us spellbound, as if we were gazing in awe upon the divine faces of angels. Except for Philistine observers who don't give a screw about anything, these four intellectual wonders of the world designated by Deutsch demand respect and admiration. To put it bluntly, we would seem (for the moment) to have no more profound sources of wonderment in the Cosmos.

And what in fact are they, these four Deutschian wonders of the world? Well, reduced to simple words, they don't necessarily sound all that marvelous and mind-boggling alongside the gardens of Babylon, or even the dull foyer of the Sydney Opera House (which shocked me by its charmless mediocrity, light years away from the splendor of the illustrious opera houses of Paris, Vienna and Venice, when my friend Ron Willard kindly invited me there in 2006).

• Let's start with the intellectual theme represented by Karl Popper. In a nutshell, this is the extraordinary observation that humble human scientists on our tiny planet Earth can in fact find explanations concerning the Cosmos. Before Popper, science was conceived as an affair of diligent workers in dull laboratories, analyzing the data revealed by Nature. Today, thanks to Popper, we realize that the great scientists have been starry-eyed creators, artists, poets, visionaries, quantum monks and madmen, who have nothing in common with docile laboratory employees. God was not a chartered accountant, a bank manager…

• Deutsch's second wonder of the world is the multiverse thing, which I've mentioned. Here, of course, when it comes to understanding, it's every man for himself. Personally, I had the good fortune of growing up in contact with 20th-century physics, so I've always had a vague idea of what was happening in domains labeled relativity, quantum mechanics, cosmology, etc. Frankly, I don't know to what extent the multiverse discourse might be even fuzzily comprehensible to a total novice in physics. In talking like that, I realize that I might be accused of intellectual elitism, but I can see no way of "sweetening" the hard facts of scientific knowledge.

• The third wonder of the world is closer to home: computing, symbolized by Alan Turing, the pioneer thinker on artificial intelligence. Now, if you happen to think that computing is basically a matter of shit stuff such as Microsoft Windows and Facebook, then you're unlikely to understand immediately why the concept of digital computing (defined precisely through the metaphorical Turing Machine, described in my Machina Sapiens) might be imagined as a wonder of the world. Today, computer programming is synonymous with DNA coding. We now know that everything, including what we once thought of as our cherished "minds", is digital.

• Finally, the fourth and final wonder of the world is life, animal evolution, represented by dear old Charles Darwin and his living guardian angel Richard Dawkins. In a way, life is perhaps the easiest marvel to access, in the sense that few mysteries remain, apart from (a) how exactly it started, and (b) how it produced the strange epiphenomenon of consciousness… without which I wouldn't be here today, writing this blog post.

Things were hugely simpler back in the days when humanity could marvel at lighthouses, gardens, tombs, temples…
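A footnote on Turing's wonder: the Turing Machine is not merely metaphorical. It can be captured in a few lines of code, which is perhaps the best way of sensing why it deserves to be called a wonder. Here's a toy sketch in Python (the rule format and function names are my own invention, not anybody's standard): a tiny machine whose entire "program" is a handful of state-transition rules, and which adds 1 to a binary number written on its tape.

```python
# Toy Turing machine simulator (illustrative only; names are my own).
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """rules: {(state, symbol): (new_state, new_symbol, move)} with move in {-1, 0, +1}."""
    tape = defaultdict(lambda: blank, enumerate(tape))  # infinite tape, blank by default
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        state, tape[head], move = rules[(state, tape[head])]
        head += move
    cells = [tape[i] for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# A machine that increments a binary number:
# walk right to the end of the digits, then carry leftwards.
rules = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),
    ("carry", "1"): ("carry", "0", -1),
    ("carry", "0"): ("halt", "1", 0),
    ("carry", "_"): ("halt", "1", 0),
}

print(run_turing_machine(rules, "1011"))  # 1011 + 1 = 1100
```

Everything that a modern computer does is, in principle, reducible to rule tables of exactly this humble kind, which is precisely Deutsch's point.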

Sunday, January 30, 2011

Bird magic

The gist of an article in the latest issue of Wired Science is so utterly amazing that it's hard to imagine that the phenomenon in question could take place as described… in the everyday world of birds.

The starting point is a familiar question: How does a tiny bird such as the European robin find its way down to Africa?

The basic answer—as we've known for several decades—is that the bird is capable of detecting the direction of the Earth's magnetic field lines. More precisely, a bird's eye contains optical cells that react to the local magnetic field in such a way as to provide the tiny creature with a kind of black-and-white picture of the field lines, which it uses as a map or, rather, as a compass. OK, fair enough. In "explaining" things in that fashion, we're merely using ordinary words to describe our observations in a common-sense style. To put it even more succinctly, the robin can apparently "see" the imaginary lines that represent the geomagnetic field. But the big question that remains unanswered is: How can a bird's eye actually "see" a magnetic field line? Well, we humans can see the direction of light rays entering a room, say, through a partly-opened window. So, maybe birds detect the direction of geomagnetic "rays" in much the same way that we react to the presence of light. OK, but how do they actually do this?

It's a recently-proposed answer to that question that takes us into a magical domain of physics that Albert Einstein famously derided as "spooky action at a distance". Today, this mysterious phenomenon, which defies common sense, is known as quantum entanglement. It's such a weird affair that it can't really be apprehended directly… unless you happen to be a European robin bound for Africa. For humans, the only way of coming to grips with this concept is through advanced mathematics. But let me nevertheless propose a kind of fuzzy analogy of the situation. This analogy is in no way rigorous, nor even correct (in fact, it's totally wrong and absurd), but it has the merit of highlighting the weirdness of entanglement.

Suppose you own a pair of twin cats, which are hungry, as indicated by their constant meowing. So, they're waiting for you to feed them.

But they happen to be shut up in adjacent rooms of your house. Now, you're in one room, with one of the cats, but you know that the second cat is located in the adjacent room, because you can hear the meows of both animals. You give the first animal a bowl of cat food, which it gulps down immediately. Now, I should have pointed out that the two cats in my example are not only twins; they're also quantum entangled… whatever that might mean. So, when you open the door in order to step into the adjacent room in order to feed the second cat, you discover with amazement (unless, of course, you've become blasé about quantum phenomena) that the second cat is no longer meowing. What's more, you find that this second cat has apparently had its hunger satisfied by the food you just gave to the first cat! I warned you: quantum entanglement is crazy stuff… so there's no point in trying to "grasp" what might be happening.

Let's get back to the robin's eye, which contains a protein named cryptochrome. In a typical molecule of cryptochrome, pairs of electrons exist in a state of quantum entanglement. When a photon of light hits a pair of entangled electrons in a cryptochrome molecule, the photon's energy affects both particles simultaneously, but one of the entangled electrons gets knocked a tiny distance away from its initial position. In this new position of the second electron, the geomagnetic field line is oriented in a slightly different way to what it is in the case of the first electron. And the bird's eye uses this infinitesimal difference—along with data of the same kind from countless neighboring pairs of entangled electrons being hit similarly by photons—to build up its map of the Earth's magnetic field. Straightforward, no?
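To give a feeling for how such an angular "compass picture" might be built up, here's a deliberately cartoonish sketch in Python. Like my cat analogy, it's in no way rigorous: I simply invent a toy reaction yield that varies with the angle between a molecule's axis and the field line (the general flavor of the radical-pair idea), and assume molecules oriented in every direction across the retina. All the functions and numbers are my own fabrication.

```python
import math

def singlet_yield(theta):
    """Toy anisotropy: the reaction yield depends on the angle theta between
    the geomagnetic field and a cryptochrome molecule's axis.
    (Real radical-pair physics is far more involved; this is a cartoon.)"""
    return 0.5 + 0.25 * math.cos(theta) ** 2

def estimate_field_direction(field_angle, n_molecules=360):
    """Molecules oriented at every degree across the retina: the 'brightest'
    spot in the yield pattern lies along the field line (mod 180 degrees)."""
    best = max(range(n_molecules),
               key=lambda d: singlet_yield(math.radians(d - field_angle)))
    return best % 180

print(estimate_field_direction(57))  # recovers 57: the bird "sees" the field line
```

The bird, in other words, wouldn't need to "understand" anything: a pattern of lighter and darker patches, superimposed on its ordinary vision, would do the job.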

Now, if there's anything that's not quite clear in my explanations, please let me know, and I'll do my best to enlighten you. But try to make your questions as precise as possible. Use mathematics, if you like...

Hey. Whatever happened to that lovely little European robin that alighted here just a moment ago? Jeez, I fear it has got eaten (simultaneously) by my pair of entangled cats!

_________________

ADDENDUM
[of an intentionally lighthearted nature, unlike most of what I've just been saying]

QUESTION (to make sure you've been following me): What's the difference between a European robin?

You ask me: Between a European robin... and what? I'm sorry, there's no "and what" at the end of my question, which I'll repeat once again: What's the difference between a European robin?

ANSWER: There is, in fact, no difference whatsoever between a European robin. It has two legs, which are of exactly the same length. Particularly the left leg.

Thursday, January 20, 2011

Tools for better thinking

There's a fabulous website (for readers of my kind… whatever that fuzzy expression might mean) known as Edge, which was created by the celebrated literary agent John Brockman. It's truly a place where all the big minds hang out. This year's fundamental question for Edge participants (suggested apparently by Steven Pinker… which doesn't surprise me) is:

What scientific concept would improve
everybody's cognitive toolkit?

In other words, in the case of thinkers who don't seem to hit the nail exactly on the head: What are they missing in the way of paradigms that might enable them to "think different", or at least better?

I remember saying to myself, after my first reading of The Blind Watchmaker by Richard Dawkins: That fellow would write and explain things even more brilliantly if only he knew a bit about object-oriented computer programming! (I still have this impression.)

Today, I was amused and impressed by the answer of this same Dawkins to the 2011 Edge question. The professor suggests that people should master, as a prime necessity, the principles of the double-blind control experiment, as used by countless researchers in the domain of biology and, more particularly, pharmacology. Why not? Testing potential remedies in an objective scientific style prevents us (as Dawkins states) from being "seduced by homeopaths and other quacks and charlatans, who would consequently be put out of business". As I've always said, Dawkins is at his best when he's talking about down-to-earth scientific knowledge. He's the mythical science master whom all of us should have encountered when we were at school.
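For readers who like such things spelled out concretely, here's a little Python sketch of the logic underlying Dawkins's suggestion (all numbers invented): we simulate a randomized trial of a "remedy" with zero true effect, then use a permutation test to ask how often chance alone would produce a gap as big as the one observed between the two groups.

```python
import random
random.seed(42)

def trial(n=200, true_effect=0.0):
    """Simulate a randomized trial: noisy 'recovery scores', half placebo, half remedy."""
    placebo = [random.gauss(0, 1) for _ in range(n // 2)]
    remedy = [random.gauss(true_effect, 1) for _ in range(n // 2)]
    return placebo, remedy

def p_value(placebo, remedy, n_shuffles=5000):
    """Permutation test: how often does mere relabeling match the observed gap?"""
    observed = abs(sum(remedy) / len(remedy) - sum(placebo) / len(placebo))
    pooled = placebo + remedy
    hits = 0
    for _ in range(n_shuffles):
        random.shuffle(pooled)
        a, b = pooled[:len(placebo)], pooled[len(placebo):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / n_shuffles

placebo, remedy = trial(true_effect=0.0)  # a remedy that does nothing
print(p_value(placebo, remedy))           # typically well above 0.05
```

When the remedy genuinely does nothing, the test almost always says so, which is exactly the checking that homeopaths are counting on us never to do.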

Another brilliant answer to the 2011 Edge question was supplied by Michael Shermer. He suggested that people should learn to think in a bottom-up rather than a top-down fashion. Now, that kind of advice pleases me immensely, because it uses the everyday talk of computer programmers from back in the last quarter of the 20th century. The only difference is that most of us were emerging, at that time, from an epoch of being fanatically top-down rather than bottom-up. We had been inculcated into thinking that the only way of solving problems is to start at the top and work your way down. In fact, as Michael Shermer points out, Nature (like everything in the Cosmos, so it would seem, ever since the Big Bang) has always started at the bottom and worked its way up…

Monday, November 15, 2010

Mind boggles

Curiously, it was only relatively late in life that I discovered the verb "boggle", generally applied to minds, and I imagined immediately that it was some kind of American neologism. Today, I learn that it is an old term, probably derived from "bogle" (specter or goblin). So, the mind boggles when we encounter something that awes us greatly, to the point of causing us to lose our everyday grip on reality.

As a child, I knew the word "woggle" (ring used to fasten the scarf of a Wolf Cub). Later, in computing, I discovered the word "toggle" (two-state switch that goes from zero to one, or back, every time it is hit). But only later did I learn how to say that I was so overwhelmed that I didn't know what to say. Meanwhile, my mind had been constantly boggling, every now and again, for ages…

One should not, however, exaggerate. Simply being impressed is not sufficient for boggling to occur. When you come upon a trivial political statement, say, with which you disagree, you're not going to claim seriously that the opinion of that individual makes your mind boggle! It's like the adjective "awesome". When a lady blogger describes the insipid website of such-and-such a friendly hockey mom as "awesome", this is simply a nonsensical abuse of language. This happens all the time, of course. In French, a few decades ago, the powerful adjective "formidable" came to be used as a synonym for weak words such as "good", "pleasant", "nice" and "attractive". If we don't force ourselves systematically to tone down our use of superlative language, and refrain from using excessive terms to designate mediocre situations and happenings, then we're in danger of running out of appropriate words when they're really needed.

When I was a youth, the things that made my mind boggle most were the concepts of eternity and infinity. Trying to contemplate these disturbing notions made me nauseous, and I quickly had to force myself to think of something else, otherwise I felt that I might be physically sick, or go crazy. Curiously, I had two sure-fire techniques for getting back to normal, even when my mind started to boggle in the darkness and solitude of the night. More precisely, I had a pair of marvelous images stored away in a readily-accessible corner of my mind, and I only had to fetch one or the other of these two images, or both (in the same way that a computer programmer might link to an error-handling subroutine), in order to halt the boggling, and cause the nausea to disappear. You might say that those two images were unexpected. Even today, I don't know where they came from. I still don't understand why these images used to "work" (I gave up using them when I became a science student) in the sense of attenuating my anguish. In any case, the first image was that of a campfire with children.

Was this some kind of romantic allusion to the Wolf Cub paradigm, which had stirred my imagination for as long as I could remember?

The second image, totally unrelated to the first, was that of a giant ocean liner about to set sail for the other side of the globe. Incidentally, my choice of the image on the left is anachronistic, since the original Queen Mary (of which I had seen images when I was a child) was a far more modest vessel than her recently-built young sister (shown here). But the curves of a massive dark hull were part of my childhood vision. As things turned out, this image of a giant vessel presaged my real future. On the final day of the year in which I turned 21, I stepped aboard a Greek ocean liner with a French name, the Bretagne (built in 1951 at St-Nazaire with a sister ship, the Provence, for a Marseille-based shipping company), which took me to the Old World of my dreams.

A few months later, in May 1962, the Chandris shipping company decided to change the name of their liner to Brittany. Greek observers might claim retrospectively that this was an omen of bad luck. Be that as it may, the vessel was destroyed by fire in Piraeus less than a year later, in April 1963. As for me, after an exceptionally harsh winter holed up in London (and working for IBM, Wigmore Street), I arrived at the East London docks on 28 August 1963 to meet up with a Greek cargo vessel, the Persian Cyrus, on which I was to be employed as a deck boy. I've still got the UK immigration officer's document that authorized me to board the old tramp steamer.

For the first week or so, I greased steel cables, painted anything and everything that could be painted, and helped the cook to prepare and serve meals. After leaving Marseille, the first officer (learning that I had studied mathematics) invited me to take the helm. This was an utterly fabulous activity. Acting upon navigational orders expressed in Modern Greek, I edged the vessel manually through rough seas between Corsica and Sardinia, then around the west coast of Sicily and into the eastern waters of the Mediterranean. A day or so later, commanded by an Egyptian pilot, I steered the ship cautiously through the Suez Canal. Then we entered the Red Sea, with its hordes of dolphins and flying fish. By that time, the giant steel carcass had become my toy. Learning how to turn the wheel in order to change course by a precise number of degrees was quite an art. If you simply tried to aim the vessel in the desired direction, its huge momentum, combined with the effects of the swells and the wind, would cause you to overshoot the mark. Fortunately, I soon developed tricks that enabled me to perform this task optimally. Basically, the general idea was to aim the ship in such a way that it would rapidly overshoot the mark by a few degrees, while attempting to dip the bow into a big swell that would twist the ship abruptly back into the right direction. It's easier said than done… but it was an immense physical pleasure to master this technique. Finally, I left the Persian Cyrus in Kuwait, because it was bound for India, whereas I was keen to get back to France. Here's my pay slip (which you can click to enlarge a little):

I don't know much about the state of the Kuwaiti economy today, but you couldn't get far on ten quid back in those days. Fortunately, I was able to camp in the port zone of Mina-al-Ahmadi for three or four days before getting hired on a British Petroleum tanker, the British Glory, that enabled me to reach Rotterdam three weeks later.

By that time, I no longer needed to calm down my metaphysical anguishes by imagining stirring images of campfires and big ships, because I had discovered, in the interim, that scientific awareness was a far more efficient solution for boggled minds. The campfire is probably still burning, but I no longer need to sit down there. The big ships are still sailing, but I'm no longer obsessed with the idea of boarding them.

For years now, I've found myself face-to-face with visions of amazing entities such as quantum theory, modern cosmology and genetics. Certainly, there are many awesome phenomena that we cannot really comprehend in the same way that I mastered the task of pointing the big ship in the right direction. The truth of the matter is that our human brains, senses and muscles are fairly good for challenges such as getting a machine—such as a vessel or an automobile or a bicycle—to move from one place to another. But we're unfortunately not very good at all, in fact utterly lousy, at trying to get a gut feeling for stuff such as quantum events, the space/time scales of cosmology and the tricks played by DNA in the course of a few billion years. But we succeed in giving ourselves the impression that we understand such things by making an effort to assimilate their scientific explanations.

Little or nothing anguishes me any more, and yet everything amazes me, and commands my respect. I'm enraptured with Nature, which I imagine as an eternally youthful nymph, who seduces me constantly and endlessly. Now that my mind has ceased boggling, I need a new word to designate the rapture in my regard when I look upon the existence of life in the Cosmos.

Sunday, May 9, 2010

Morals

I know this is going to sound silly, but I'll say it all the same. Ever since my youth, I've been intrigued by the philosophy of morals. A child's first introduction to the notions of right and wrong is based largely upon punishment. It's wrong to poke your tongue out at an old man, even though he looks like a scarecrow. So, if the child does so, it's normal that he's likely to be spanked by his mother or father. It's also wrong to play with safety matches, but the punishment is of a different kind. Instead of a spanking, your fingers get burned. Although both actions—making fun of old folk, and playing with dangerous devices—are things that a child "should not do", the child soon starts to feel that there's a difference between these two categories of bad deeds. In the first case, the wrongness consists, as it were, of doing unto another something that you maybe wouldn't wish to be done unto yourself. In the second case, it's simply a matter of not accepting sound advice from experienced oldies who've already made those same mistakes and paid the price in pain.

Within the territory of right and wrong, good and bad, there are a striking number of loopholes, or rather patches of no-man's-land, particularly when other partners step into the picture: social customs, the influence of peer groups, the law of the land and, above all, religions. The territory is transformed into a vast muddy field, where youthful adventurers soon get bogged down… particularly when sex raises its naughty head. For example, young people generally feel that fornicating is good stuff, even when they haven't yet reached the so-called "legal age" for acts that were referred to, in Australian law, by a delightfully exotic and erotic expression capable of giving a young man an erection: carnal knowledge. Screwing was not explicitly forbidden in the Ten Commandments (except in the form of adultery). Admittedly, if the partners in such a timid crime happened to forget about contraception (often because they didn't know what it was all about), then it could resemble the case of innocent children playing with safety matches.

For all these reasons (and many more), I signed up for a course in moral philosophy at the University of Sydney, in the context of my science studies. There, the naive 16-year-old country boy from Grafton was confronted immediately and inspired immensely by a wise old man from the past named Socrates.

After asserting that "the unexamined life is not worth living", he was put to death for allegedly corrupting the youth of Athens. Clearly, there were diabolical dimensions in the quest for the truth about morals… if such a truth existed. This became more and more obvious to me when I finally had a chance of looking at what had happened in Auschwitz… which had never been a noteworthy event, curiously, back in my hometown circles.

The classes of an obscure professor of moral philosophy named Alan Ker Stout [1900-1983] were an intellectual catastrophe, because he didn't have much to say, and his way of saying it was sadly comical. Funnily, though, I've retained, not only most of the little he told us, but also three of the books upon which his teachings were based.

They still carry my antiseptic ex libris, which looks as if it were written by a lad fresh out of Sunday school… which was in fact the case.

The philosophy of so-called utilitarianism is even dumber than the term used to designate it. Apparently, we should strive to produce the greatest good for the greatest number of people. (I'm simplifying.) What does that have to do with utility? Today, only somebody with a mind like that of George W Bush, say, would find this idea "philosophical". To be truthful, I don't know whether or not Bush ever studied John Stuart Mill [1806-1873].

G E Moore [1873-1958] was a brighter analyst… whom I respect for his associations with my two greatest philosophical heroes: Bertrand Russell [1872-1970] and Ludwig Wittgenstein [1889-1951]. But he, too, seems to end up with nothing more profound than common sense to tell us about right and wrong, good and evil, and that stuff.

The well-written little book by Patrick Nowell-Smith [1914-2006] has been a primer for countless readers (Penguin sold more than 100,000 copies) who were intrigued by the idea of a logic-based approach to the philosophy of morals. (It would be more correct to speak of logical positivism rather than logic in a broad traditional sense.) But the interest of this book, today, is mainly historical.

So, what are we left with? Well, unfortunately, we're left with a widespread opinion that, somehow, you need to believe in religion before you even have the right to talk about morals. The antiquated enemies of secular thinking attempt to spread the notion that society would disintegrate into a vast anarchic cesspool of savage depravity if ever the little gods of Judaism, Christianity and Islam were to be removed from the current scene. It goes without saying that these rumormongers are stupid liars, who seek to bully innocent folk into accepting religion in order to save society from barbarian turmoil. But it's the evil liars who are the New Barbarians.

Basically, thinkers such as Richard Dawkins remind us constantly that human nature is what it is, for better and for worse, and that the alleged existence of a deity is a totally irrelevant speculation. It's not because the god Jupiter went out of fashion that assassination attempts upon mothers-in-law, say, suddenly spiked. On the contrary, people are starting to believe, these days, that if the gods were finally stacked away in wardrobes with all the other skeletons of human history, there would surely be a drop in crime statistics ranging from raped schoolkids up to kamikaze operations.

In this general context, the brilliant US atheist Sam Harris has succeeded in surprising many of his friends by suggesting that there might indeed be objective links between science and morals. In February 2010, he spoke on this subject at the prestigious conference known as TED [Technology, Entertainment, Design].



In the face of many reactions, Sam Harris has just clarified his thinking in an article entitled Toward a Science of Morality [display]. I like to think that Harris might be onto something: the idea that, somewhere deep down inside our inherited structure of thought, there are inbuilt neuronal circuits (or something like that) that work nonstop at promoting the principle that Auschwitz and countless other barbarian acts were wrong, and that helping little old ladies to cross a busy street in bad weather is a morally good act, for which you deserve to win brownie points. [Remind me to tell you the joke about a pub artist who plays a miniature piano.] For the moment, I would conclude that Harris is not necessarily wrong, but that he nevertheless doesn't need to be right in order for societies to evolve morally in a "well-behaved" fashion.

Sunday, February 14, 2010

Miracles exist! God too... and He's Irish!

In 1775, the celebrated French mathematician Laplace persuaded the Academy of Sciences in Paris to waste no more time and effort examining proposals for perpetual motion machines.

For the last six years, an astute Irishman, Sean McCarthy, has nevertheless succeeded in persuading financial investors that such a project is feasible. It would appear that his research company is about to save humanity from the old-fashioned threat of an energy crisis.



What does that prove? Well, we've always known that the Irish are the world's best talkers...

Sunday, October 4, 2009

Collection of papers on Ardi

Science is a journal published by the AAAS [American Association for the Advancement of Science], founded in 1848, which describes itself as "an international non-profit organization dedicated to advancing science around the world by serving as an educator, leader, spokesperson and professional association". The stated mission of the AAAS is to "advance science, engineering and innovation throughout the world for the benefit of all people".

This journal has assembled a splendid collection of papers concerning the creature Ardipithecus ramidus, and the collection can be downloaded in PDF form. All you need to do is register (free) at their website, which can be accessed by clicking the banners above. I've downloaded all 15 files, and I find them relatively easy to read and totally fascinating.

Saturday, December 6, 2008

Correlation between balls and brains

When I was a teenager in Australia, a good way of insulting a fellow was to call him a dickhead. I must admit, though, that I never really knew whether this was intended to mean that his head had the same shape as a penis, or an equivalent degree of intelligence, or a similar vocation in life, or some other more subtle resemblance.

Today, scientific research carried out in the UK has revealed that men of higher intelligence appear to have sperm of better quality. Results indicated that men who obtained higher scores in IQ tests tended to produce a greater quantity of sperm, with greater motility.

Now, if you're anything like me, I'll bet you were surprised to learn—in that last sentence—that motility is an important factor in the clinical evaluation of sperm. We don't generally tend to imagine that these little critters need to travel to and from work every day, or that they like to go out driving in the countryside of a weekend. Well, the truth of the matter is that a lazy sperm who is not constantly up and about, in the style of an early bird catching worms, serves no useful purpose. The sole raison d'être of a self-respecting sperm is to track down an egg, crack it open and devour it in a single gulp, sunny side up. There's lots of tough competition from other sperms, who are totally lacking in brotherly love. In their search for an egg, they jostle and trample one another violently, like US shoppers stampeding into a Wal-Mart on sales day. Suffice it to say: May the best sperm win! We're talking of the most mobile young chap, in top physical form, with first-class sporting footwear, at the wheel of the procreative equivalent of a red Ferrari. The brutal battle between competing sperms is a terribly vicious affair... like the Democratic primaries in the USA or the installation of a governing committee in the French Socialist party. Weak-hearted sperms, those that have let their regular gym work slip, those that drink, or those that have wasted their physical resources hanging around in bars with loose women, don't stand a chance. The quest for the egg, like that for the Grail, is even more terrifyingly Herculean than the Triwizard Tournament in Harry Potter and the Goblet of Fire.

The "dickhead" epithet might therefore be a disguised compliment, designating a superior male with balls in his brain (or maybe rather brains in his balls), whose gushing intellect and spurts of wisdom have the same volume and mobility as his sperm. In any case, this correlation between superior intelligence and award-winning sperm has an interesting corollary. Normally, according to Darwinian evolution, top-quality sperm should have a greater survival value, and it should be giving rise to more and more offspring with superior intelligence. In other words, our planet should be subjected to a relentless phenomenon of ever-increasing intelligence. Spiraling brilliance, wisdom, creativity... you name it. Frankly, I don't know. From my personal viewpoint, I'm convinced that, in our marvelous modern world, there are indeed more and more... dickheads.

Friday, October 31, 2008

Favorite magazine stoops to intelligent design

In my mailbox this morning, I received a nasty trick, stuck away in the cobwebs of the back pages of the latest issue of Scientific American.

I'm referring to an unexpected article about a Dominican priest who happens to deplore the conflict between Darwinism and Christian faith. It's not a habit of mine to behave like an offended reader and send letters to the press... except, maybe, in the case of a pretentious Fascist female journalist who works for The Australian, who regularly drives me up the wall. [The Aussie newspaper usually succeeds in "mislaying" my emails from France, so they don't get published.] But I was so shocked by the presence of religious rubbish in my favorite US science magazine that I immediately sent off a letter to the editors:

There is no place in your excellent time-honored magazine for an article such as "The Christian Man's Evolution" by Sally Lehrman. I would imagine that readers come to your magazine today for a broad and in-depth perspective of scientific achievements and goals, not for journalistic stuff about a fine fellow such as Francisco Ayala, whose religious beliefs cannot possibly concern us. I am afraid that the presence of this article is a promise of worse to come, next year, when the scientific world will be celebrating the 150th anniversary of the publication of Darwin's masterpiece. If Scientific American intends to give fair, if not equal, coverage to Darwinian evolution, creationism and so-called intelligent design, then I am dismayed to realize that I have subscribed to the wrong reading. Between science and all the rest, there is no such thing as fair coverage.

I see this article (which even evokes the beliefs of Sarah Palin) as a breach in the great traditions of Scientific American, and I shall no doubt refrain from renewing my subscription to the magazine.

Incidentally, I thought it appropriate to add a final seven-word explanation to my simplistic blog profile, which now reads as follows:

After working in various computing jobs, I retired to an old farm property on the edge of the French Alps, where I spend my time writing, playing with the Internet and wandering around on the mountain slopes with my dog Sophia, admiring the beauties of Creation... in the scientific sense of this concept.

It scares the shit out of me, in a Halloween spirit, to imagine that any of my friends, acquaintances or anonymous readers of Antipodes might suppose for an instant that my vision of Creation (with a capital C) could be anything other than Darwinian, Dawkinsian, poetic and artistically fuzzy, but purely scientific...

Sunday, September 14, 2008

Vain actors: politicians, priests and parents

Politicians are surely the most arrogant actors of all, because they see themselves endowed with a mandate, and they take themselves very seriously. Many politicians think they have a vision of an ideal future society, and their attempts to realize that vision are imagined as a mission. They seem to forget, or deliberately ignore, that society has evolved through the efforts, not of politicians, but of engineers, scientists, industrialists, businessmen, economists, researchers, teachers, farmers, laborers, etc. Politicians are often powerful in the sense that they can do a lot of harm, such as telling lies that start a war. They can act as dull chiefs, like George W Bush or Dilbert's pointy-haired boss. But they rarely have the necessary competence and resources to actually create anything worthwhile... apart from public service jobs. To take an obvious example, modern society is totally dominated by the ubiquitous role of computing and the Internet. Were these phenomena the achievement of politicians? Of course not. It appears that the Republican candidate John McCain is still incapable of sending an email!

As far as priests are concerned, their role in the modern world has dwindled to almost zero. In the cathedral of Notre Dame in Paris, Benedict XVI urged French Catholics, pathetically, to stand up for their faith: "Don't be afraid!" But he's unlikely to bring about a surge in the recruitment of candidates for the priesthood. Meanwhile, the French president Nicolas Sarkozy has annoyed many citizens by promoting a fuzzy kind of religious observance that he likes to call "positive laicism", which would seem to consist of encouraging the development of religion in French society without ever admitting explicitly that you're doing so... like a mercantile pimp who tells the police that he's merely helping his girlfriends to make friends in the lonely city. Fortunately, since 1905, laicism has become such a profoundly ingrained concept in French attitudes that little Sarko is unlikely to make much headway with his archaic scheming.

Whereas I've realized for ages that politicians and priests do not play significant roles in the modern world, it was only recently that I learned that parents, too, have little or no influence on the lives that their children are likely to lead. In other words, the noble concepts of motherhood and fatherhood are probably myths. Children are socialized and educated, not by their parents, nor by their teachers, but by their peer groups.

Funnily enough, there is nevertheless one domain in which children can be totally and permanently brainwashed by their parents. It's not politics, or anything of a practical nature... but rather religion. A child survives in this treacherous world by learning rapidly, often through trial and error, that Mum and Dad are not talking bullshit when they warn that fire burns, that it's a good idea to look both ways before crossing a street, that eating green fruit can give you a belly ache, etc. In the same way, countless kids seem to decide that the only sure way of surviving in a "higher realm" is to accept the religious advice of Mum and Dad. That's why many baby Christians become adolescent Christians, Jewish babies evolve into adult Jews, and Muslim kids adopt Islam for the rest of their lives.

It's not intuitively obvious that the pursuits of politicians, priests and parents are vain. Personally, it took me over sixty years to reach this state of enlightenment. Up until then, I had always tended to give these worldly authorities the benefit of the doubt, by supposing that they were performing worthwhile deeds. Today, I realize retrospectively that their actions are essentially pointless, indeed empty. The forces that determine our destinies do not emanate from human actors such as politicians, priests or parents. These forces come directly from the Cosmos, and can only be perceived, if at all, through Science.

Wednesday, August 13, 2008

Exotic pilgrimage

In a book I've been reading over the last few days, I was delighted to come upon an outline of an activity that used to interest me greatly (and still does, as an aficionado): Macintosh programming.

The Mac has a toolbox of routines stored in ROM (Read Only Memory) or in System files permanently loaded at start-up time. There are thousands of these toolbox routines, each one doing a particular operation, which is likely to be needed, over and over again, in slightly different ways, in different programs. [...] If you look at the text of a Mac program, whoever wrote it, in whatever programming language and for whatever purpose, the main thing you'll notice is that it consists largely of invocations of familiar, built-in toolbox routines. The same repertoire of routines is available to all programmers. Different programs string calls of these routines together in different combinations and sequences.

Jargon like that in the last sentence suggests that the writer is more than a mere user of computer products. Clearly, this didactic author is not in the same basic ballpark as the countless millions of lucky folk who perform their daily work with the help of a Macintosh. The writer would appear to have gone a big step further, and actually gotten his hands dirty writing Mac software. In an earlier paragraph, he had explained in modest terms his relationship with this machine:

The computer I happen to be familiar with is the Macintosh, and it is some years since I did any programming so I am certainly out of date with the details.

Who is this former adept of Macintosh programming? And why is he writing about his technical experience in this domain?

In The Ancestor's Tale, published in 2004, Richard Dawkins calls upon the paradigm of Mac software to illustrate the functioning of a genome. More precisely, he's trying to explain why we should not be alarmed to learn that the human genome is no bigger than that of a mouse: some 30,000 genes. If you were to compare the architectural blueprints of an Olympic edifice in Beijing with a rough drawing I once made of the future shed at Gamone for my donkey Moshé, you would see immediately which of the two construction processes was designed by a nation capable of staging spectacular fireworks displays, and which one was sketched by an Aussie hillbilly. In the same spirit, why shouldn't a human genome and a mouse genome, placed side by side, be vastly unalike?

The answer is simple. Genomes aren't blueprints; they're computer-like programs. Over the last day or so, front-page news stories have described Apple's ire at discovering that a proposed iPhone program is pure bullshit. Expensive to acquire, this iPhone application does nothing more than display ostentatiously the fact that the purchaser is apparently wealthy. [This kind of second-degree gag amuses me immensely.] Well, if you were to take out some kind of magic magnifying glass and examine this bullshit program, you would probably find that it "looks" more or less the same, in terms of digital volume, as any of the more brilliant iPhone applications. The difference is not in the vulgar quantity of bits, but in the way they are organized to form a complex computational entity capable of performing big things.
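The bits-versus-organization point can be made concrete with a toy sketch of my own (the three-instruction machine and its opcode names are pure invention for illustration; real genomes and real iPhone apps are incomparably more complex). Two programs of identical digital volume, drawn from the identical repertoire of routines, can differ utterly in what they accomplish:

```python
def run(program):
    """Interpret a program over a fixed, shared repertoire of three routines:
    'P' pushes the number 1, '+' replaces the top two numbers by their sum,
    and 'N' does nothing at all."""
    stack = []
    for op in program:
        if op == "P":
            stack.append(1)
        elif op == "+":
            if len(stack) >= 2:
                stack.append(stack.pop() + stack.pop())
        elif op == "N":
            pass  # a perfectly idle routine
    return stack

# Two programs of identical length, built from the identical repertoire.
# One computes something; the other is pure padding.
clever = "PPP++"   # computes 1 + 1 + 1
vacuous = "NNNNN"  # same digital volume, achieves nothing
print(run(clever))   # [3]
print(run(vacuous))  # []
```

Counting symbols tells you nothing about the difference between the two programs; it lies entirely in how the calls upon the shared repertoire are organized.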

I could ramble on for ages about this brilliant book by Dawkins, but the best thing, dear reader, is that you should buy it and absorb it slowly and languidly, as if you were seated at a table of rare venison and unworldly wines, served by medieval Botticelli maidens against a sonorous background of Monteverdi... or something like that. The brilliant idea of Dawkins consists of leading us on an exotic backwards pilgrimage towards the dawn of creation, in which we meet up with all our genealogical cousins: chimpanzees, gorillas, etc... right back to the origins of life on the planet Earth. This magnum opus by Dawkins is yet another specimen of beautiful writing, fabulous literature and magnificent science. His literary style was inspired, of course, by Chaucer's Canterbury Tales.

Replacing vast phylogenetic trees of Earth's animals by my own humble genealogy, I think of my father. He went through life burdened by a pair of ridiculous Christian names: King Mepham. I explained the first element in last year's article entitled November 11 [display]. As for the second name, it all gets back to Kentish ancestors at a village named Meopham [website], associated with an ancestral Simon Mepham who was an early archbishop of Canterbury [1328-33].

In the cosmic Dawkins saga, the intrinsic "value" of a Mepham forefather on the ancient road back through Canterbury might be likened to that of our concestor [a Dawkinsian neologism for "common ancestor"] who witnessed the disappearance of the dinosaurs. None of these creatures [including probably the archbishop] was the kind of clear-cut individual you might have invited back home to meet up with Mother, let alone Father. They were tiny, inconsequential but lovable nobodies, like all of us. We can't even imagine what they might have looked like. But we know they existed. Meanwhile, I've spent hours trying to determine what an ancestor of me and my dear cousin Sophia—a descendant of wolves—might have looked like. I have my ideas...

Saturday, June 21, 2008

Summer solstice in the northern hemisphere


To see a world in a grain of sand
And a heaven in a wild flower
Hold infinity in the palm of your hand
And eternity in an hour.


William Blake, Auguries of Innocence

Thursday, January 17, 2008

Pope unwelcome in academia

Today, Benedict XVI was due to visit the Sapienza University of Rome as a guest of the rector. But he decided, two days ago, to stay at home, since a group of 67 teachers and researchers of the physics department had made it known that the pope was persona non grata in their ivory tower of science. Why didn't the academics wish to welcome the head of the Catholic church? A spokesman explained: "Ever since the condemnation of Galileo by the Inquisition in 1633, physicists have been particularly touchy about the Catholic church meddling in scientific matters." For staff members of La Sapienza, Galileo's trial is looked upon as a relatively recent happening, since their prestigious university was founded (by a pope) in 1303.

Now, scientists have had ample opportunities to express their opposition to religion in general, and Christianity in particular. So, why today's sudden surge of aggressiveness, in the Italian capital, concerning the latest pope? It gets back to Galileo. Delving into the declarations of the theologian Joseph Ratzinger long before he became pope, the physicists of La Sapienza have unearthed an oration in which he attempts to justify the trial of Galileo by a fuzzy reference to some kind of "greater rationality" than that of science.

Let us hope that the decision of Benedict XVI to refrain from visiting La Sapienza will set a precedent. Popes, cardinals and tutti quanti would do well, from now on, to remain on their time-honored terrain: that of the Church, with all its wishy-washy thinking and bloody history. Today, in the citadels of science, there is no longer any room for those who persist in believing in antiquated falsehoods and childish magic.