Whilst many people will tonight attempt to find their universe at the bottom of a glass, it appears that condensed matter scientists will be attempting to find it at the bottom of a test-tube.
Making a late entry into the 'most ludicrous claim made by a scientist during the year' competition is Richard Haley, who claims that, because there is an analogy between the theoretical representation of superfluid helium-3 and a certain way of theoretically representing space-time, obtaining a superfluid state of helium-3 in a test-tube means that, "in effect, we have made a universe in a test tube."
Helium-3 in a superfluid state is a Bose-Einstein condensate. The significance of this is that the helium-3 nuclei are fermions, whereas Bose-Einstein condensates can only be formed by collections of bosons. To form a superfluid state, it is necessary, at very low temperatures, for pairs of helium-3 nuclei to become correlated, in a manner analogous to Cooper pairs of electrons in superconductivity. Each such pair of helium-3 nuclei form a boson, thereby enabling the formation of a Bose-Einstein condensate.
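The spin arithmetic behind this pairing can be sketched in a few lines (a toy illustration of angular-momentum addition, not a model of the condensate): two spin-1/2 fermions combine to give integer total spin, which is what makes the pair bosonic.

```python
def composite_spins(s1, s2):
    # Standard angular-momentum addition rule: the total spin of a
    # composite runs from |s1 - s2| up to s1 + s2 in integer steps.
    spins, s = [], abs(s1 - s2)
    while s <= s1 + s2:
        spins.append(s)
        s += 1
    return spins

# Two spin-1/2 fermions (e.g. a pair of helium-3 nuclei):
print(composite_spins(0.5, 0.5))  # → [0.0, 1.0]
# Both possible totals are integers, so the pair obeys Bose statistics.
```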
Now, the metric tensor used to represent space-time can, reputedly, be built from two copies of a fermionic coframe field. Hence, there is an analogy between this and a bosonic condensate built from Cooper pairs of helium-3 atoms.
"Consequently," claims Haley, "the superfluid can be used to simulate particle and cosmic phenomena; black holes, cosmic strings and the Big Bang for instance."
I don't know enough about condensed matter physics to assess this claim. However, I do know that aerodynamicists working with scale-models in wind-tunnels often find it extremely difficult to reliably infer conclusions that apply on full length scales. And in that case, one has more than merely a formal analogy: one is working with exactly the same medium (namely, airflow over a 3-dimensional object) about which one is attempting to infer conclusions.
Monday, December 31, 2007
Saturday, December 29, 2007
Opinions on Lewis Hamilton
Max Mosley: "There is always somebody new. If it wasn't him, it would be either [Nico] Rosberg or [Robert] Kubica or one of the other new stars, a [Sebastian] Vettel, would suddenly be the big one. So I think there is a tendency to exaggerate the importance of Lewis Hamilton."
Bernie Ecclestone: "He has been a real breath of fresh air and has resurrected F1. I have been in motor racing longer than I care to remember, but I have never seen anyone like him. He has been nothing short of a miracle worker. We lost a big hero in Michael Schumacher, but in Lewis we have another. But for him I'm not sure where the sport would be heading."
Stirling Moss: "He is a very impressive young man, the most impressive young driver I've seen in a long while...I have been connected with motor racing for 60 years now and he is certainly the best breath of fresh air we've had."
Mario Andretti: "He's a rare, rare talent. A rookie like this comes around once in a generation. You just try to step back and appreciate it."
Emerson Fittipaldi: "He's in a zone of calm and comfort like a veteran of ten years; his personality, the way he approaches the sport. I'm very impressed."
Frank Williams: "I thought after we got rid of Michael [Schumacher], 'Now we've got a chance again.' But then another superhuman turns up. Michael was many things, but he was also a very, very simple human. Hamilton is a different character I think, but purely in terms of calibre and quality of skill, what I'm seeing so early in this man's career is remarkable...I cry he's not in a Williams, but I rejoice for F1. I really do."
Jackie Stewart: "I think he is the brightest star to have entered F1 - ever!"
Battle at Kruger
This, apparently, is the most popular Youtube video clip of the year, viewed on 21 million occasions. Which means, I guess, that it's rather redundant posting it here.
In psychological terms, I suspect that the popularity of the clip resides in the fact that we are vicariously enacting human scenarios through such confrontations in the wilderness. As you watch, it is difficult not to become emotionally involved. The course of events follows a type of Disney-esque scenario: little fella falls into peril, then falls into apparently deeper peril as another predator arrives on the scene; meanwhile, unknown to him, his parent/guardian is mounting a heroic rescue bid...
Friday, December 28, 2007
Gervais meets the Archbishop
Here's an interesting, but rather poorly-lit encounter between Ricky Gervais and the Archbishop of Canterbury, on Simon Mayo's Radio 5 programme. Mayo makes the perceptive observation that whilst Gervais detects an atheist undercurrent in The Simpsons, the Archbishop detects a spiritual undercurrent. The thing about The Simpsons is that it's such a large body of work, and one in which the pros and cons of so many different perspectives are presented, that people do tend to find what they're looking for within it. As a consequence, people with diametrically opposite beliefs are capable of finding verification for their own approach to life within The Simpsons.
The term 'spiritual' is very popular in religious circles, precisely because of its ambiguity. Many people, like Gervais, take 'spiritual issues' to mean moral or emotional issues, but religious people are fond of taking it to mean a diluted version of religious issues. Whilst many people in the secular West may be repelled by religion, they may be more susceptible to 'spiritual issues', which is presumably why so many religious programmes on television now purport to address spiritual issues, rather than religion itself. Hook them with the spiritual issues, and then suck them into religion, seems to be the strategy.
Note also that when the Archbishop tries to suggest that forgiveness makes you a Christian, Gervais is quick to correct him, pointing out that you need forgiveness to be a Christian, but forgiveness doesn't make you a Christian. It's the difference between a sufficient and a necessary condition, a distinction with which a philosophy graduate such as Gervais will be more than familiar.
Ricky Gervais Radio 5 Archbishop of Canterbury Christian
Thursday, December 27, 2007
Sunday, December 23, 2007
Dawkins to seek martyrdom?
Richard Dawkins, it seems, has developed something of a death-wish. The great Dawk is currently constructing The Mayflower II, and, when sea-worthy next year, will embark on the perilous journey down the Thames to Reading, before taking a train to Heathrow and crossing the Atlantic on a 747. Once there, he will begin a lecture tour of the American Bible Belt and Midwest.
Despite Dawkins's forthright brand of atheism, American religious leaders have already welcomed Dawkins's plans. The Reverend David Cox, of the First Southern Methodist Church, Charleston, South Carolina said: "[Dawkins] is a tool of Satan, of the AntiChrist, it sounds to me. All God-fearing people will be opposed to an atheist touring."
Seriously, Richard: don't go, you'll be assassinated! Religious zealots in the American Bible Belt are as ignorant and intolerant as their counterparts in the Middle East. Unless, that is, you have some clever Obi-Wan Kenobi plan to become more powerful in death than you could possibly be in life...
E. O. Wilson on religion
The persistently interesting Bryan Appleyard writes a nice article on biologist E. O. Wilson in today's Sunday Times. The article ostensibly concerns Wilson's intriguing group-selection theories. However, at the end of the article Wilson attempts to defend religion on the following basis:
"Humans have an innate tendency to form religious belief. It has a lot of beneficial influences. It helps people adjust to their mortality and it binds communities tightly together."
The first claim, that religion helps people to deal with mortality, requires considerable evidence to substantiate it. Many religious people seem, on the contrary, to spend their lives in a state of anxiety about their mortality, precisely because they are religious, and precisely because they fear that God will pass judgement on their lives, and potentially dispatch them to Hell, or abandon them in some sort of limbo. The religious concept of sin condemns countless millions to guilt-ridden lives, which hardly seems like a good way of enabling people to deal with their mortality.
Wilson's second claim, that religion binds communities together, is certainly correct, and well-substantiated. Unfortunately, communities tightly bound together also tend to regard outsiders and other communities as enemies, hence religion contributes to the amount of suffering in the world by exacerbating the violent and war-like capabilities of humanity.
Saturday, December 22, 2007
Is time slowing down?
Jose Senovilla suggests that the reason why the expansion of the universe appears to be accelerating is that time is slowing down, prior to a geometrical 'signature-change' in which the existing time dimension becomes another spatial dimension.
Senovilla's idea is expressed in terms of braneworld cosmology, but the basic idea can be explained more simply. Suppose that the geometry of a 4-dimensional universe is specified in terms of the following metric tensor field:
f(t)dt² + g
where g is the Riemannian (spatial) geometry on a 3-dimensional manifold. Suppose that the coordinate t ranges from −∞ to +∞.
If f(t) is negative everywhere, say f(t) = -1, then t is a timelike coordinate everywhere, and the universe has one temporal dimension everywhere. However, suppose that f(t) is only negative for t < 0, and suppose that it approaches 0 at t = 0, and becomes positive for t > 0. In this case, t is a timelike coordinate for t < 0, but a spacelike coordinate for t > 0. In that region of the universe in which t < 0, there are 3 spatial dimensions and 1 temporal dimension. In that region of the universe in which t > 0, there are 4 spatial dimensions and no temporal dimension. The signature-change hypersurface which divides the two regions is the set of points for which t = 0.
Now, an observer is represented in general relativity by a timelike curve γ, and the 'proper time' which lapses for an observer is represented by the integral along the timelike curve, ∫ √(|⟨γ′,γ′⟩|) dt. One can detect the approach of a signature-change hypersurface because f(t) becomes smaller and smaller, with the consequence that the proper time which lapses along timelike curves becomes smaller and smaller. This is most clearly seen in the case of those timelike curves in which the spatial coordinates are fixed, and the lapse of proper time is therefore ∫ √(|⟨∂t,∂t⟩|) dt = ∫ √(|f(t)|) dt.
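The shrinking proper-time lapse is easy to illustrate numerically. A minimal sketch, assuming the purely illustrative choice f(t) = t (not Senovilla's actual model): two coordinate intervals of equal length contribute very different amounts of proper time as the signature-change hypersurface t = 0 is approached.

```python
import numpy as np

def proper_time(f, t_start, t_end, n=100000):
    # Midpoint-rule approximation of tau = ∫ sqrt(|f(t)|) dt along a
    # curve whose spatial coordinates are held fixed.
    edges = np.linspace(t_start, t_end, n + 1)
    mid = 0.5 * (edges[:-1] + edges[1:])
    dt = (t_end - t_start) / n
    return float(np.sum(np.sqrt(np.abs(f(mid)))) * dt)

# Illustrative metric coefficient: negative (timelike) for t < 0,
# vanishing at the signature-change hypersurface t = 0.
f = lambda t: t

# Equal coordinate intervals; the one nearer t = 0 accumulates
# markedly less proper time.
print(proper_time(f, -1.0, -0.5))  # ≈ 0.431
print(proper_time(f, -0.5, 0.0))   # ≈ 0.236
```

The exact values are (2/3)(1 − 0.5^(3/2)) and (2/3)·0.5^(3/2) respectively, so an observer's clocks run ever more slowly, in coordinate terms, as the hypersurface approaches.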
Friday, December 21, 2007
Who should direct The Hobbit?
Christmas without a Peter Jackson film has become something of a hollow experience. There was, therefore, some good news this week when it was announced that Jackson's company, Wingnut Films, will be producing a film version of The Hobbit, to be released in two parts in 2010 and 2011. However, somewhat inexplicably, it seems that the directorial/screenplay partnership of Jackson, his partner Fran Walsh, and Philippa Boyens, so successful with The Lord of the Rings films, will not be re-united for The Hobbit. Jackson claims that their current schedule would not enable them to meet a 2010/2011 release date, and he suggests that fans would not be prepared to wait any longer. The timing, however, perhaps has more to do with cashflow exigencies at MGM and New Line Cinema than a concern for the preferences of fans.
The current names being touted for the directorial role are Sam Raimi and Guillermo del Toro. Of these two, del Toro would be the better choice. Raimi is best-known for the Spider-Man films, which are derivative, predictable, and lacking in technical ingenuity. Del Toro, however, is an imaginative director, capable of serious fantasy such as Pan's Labyrinth, or flamboyantly popular material such as Blade II and Hellboy. Raimi would direct something resembling the films of the Harry Potter/Philip Pullman novels; del Toro would give The Hobbit the type of 'edge' which Jackson imparted to The Lord of the Rings.
Wednesday, December 19, 2007
Toss
McCabism is proud to unveil Toss, the new fragrance for men and women.
You're the star of your own film, the centre of your own universe, the protagonist in your own novel. See yourself as you imagine yourself.
When passion turns to loathing, and relationship turns to property, there's only one perfume this Christmas, and that's Toss. Available at Harrods, Lidl, Aldi, and Londis.
Tuesday, December 18, 2007
The Star of Bethlehem
Frank Tipler argues that the Star of Bethlehem was, most probably, a supernova in the Andromeda galaxy.
Gazing skywards on a clear night, the stars appear to be speckled across the inner surface of an inverted bowl. This is one hemisphere of what astronomers call the celestial sphere. To understand some of the terminology in Tipler's article, note that declination and right ascension are the names for the equatorial system of coordinates upon the celestial sphere. In this system, the intersection of the plane of the Earth's equator with the celestial sphere determines a great circle on the celestial sphere called the celestial equator. Right ascension (R.A.) provides a coordinate upon the celestial equator, starting at the vernal equinox (see below) and running Eastward. Declination specifies the angular distance North or South of the celestial equator.
To define the vernal equinox, one first needs to introduce the ecliptic. The ecliptic is the great circle which the Sun traces upon the celestial sphere due to the Earth’s annual orbit around the Sun. It can also be thought of as the intersection of the Earth’s orbital plane with the celestial sphere. Because the Earth’s axis, and therefore its equator, are inclined at approximately 23 deg to the orbital plane, the celestial equator is inclined at the same angle to the ecliptic. Now, the ecliptic intersects the celestial equator at two points: the vernal equinox and the autumnal equinox. The vernal equinox is the point of intersection of the ecliptic and the celestial equator at which the Sun moves from the Southern celestial hemisphere into the Northern celestial hemisphere.
Tipler asserts that "The Star of Bethlehem is a star. It is not a planet, or a comet, or a conjunction between two or more planets, or an occultation of Jupiter by the Moon. I shall assume that the Star of Bethlehem was an actual point of light fixed on the celestial sphere. Second, I am going to assume that the Matthean expression 'stood over' means exactly that. The star went through the zenith at Bethlehem...the Star was there, in the sky, directly above the Magi, at the time of their visit to the baby Jesus...Since the latitude of Bethlehem is 31 deg 43' north, the declination of the Star in the first decade B.C. (the range of estimates of Jesus' birth year) must have been 31 deg 43' N.
"Setting Babylon as the zero of longitude and identifying it with the zero of R.A. would give the R.A. of the Star of Bethlehem as 23h 23m in 5 B.C...This position in the first decade B.C. is far away from the galactic plane (the likely location of a galactic nova/supernova), but it is very close to the Andromeda Galaxy, whose center in 5 B.C. was 30 deg 13' [declination], 23h 1m [right ascension]. The galactic halo of the Andromeda Galaxy would have definitely included the declination of the zenith of Bethlehem. The R.A. of the Andromeda Galaxy would correspond to a position in the Mediterranean Sea, but the nearest large city with the indicated declination/latitude is Jerusalem, the city to which the Magi first traveled. The nearest small city is Jaffa, the main port of Palestine, and in Greek mythology, the home city of Andromeda, princess of Jaffa. Any astronomer of the first decade B.C. would immediately associate an event in the constellation Andromeda with Palestine. Our system of constellations is essentially that of Ptolemy, which can be traced back at least to Eudoxus of Cnidus (c. 350 B.C.) (through the poet Aratus), before the Seleucid period of Greek rule over Babylon. Astronomical techniques at the time were sufficiently accurate to allow observers to determine that a star’s declination was at the zenith of a given location to within a minute of arc, or within a nautical mile, using a dioptra and plumb bob. A supernova in M31 could indeed have 'stood over' Bethlehem."
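Tipler's positional claim can be checked with elementary spherical trigonometry. Using the coordinates quoted above (the Star at declination 31° 43′, R.A. 23h 23m; the centre of M31 in 5 B.C. at declination 30° 13′, R.A. 23h 1m), the great-circle separation comes out at roughly five degrees; whether that places the Star within M31's halo is Tipler's claim, not something the arithmetic alone establishes. A quick sketch:

```python
import math

def hms_to_deg(hours, minutes):
    # Right ascension: 1 hour of R.A. = 15 degrees.
    return (hours + minutes / 60.0) * 15.0

def dm_to_deg(degrees, minutes):
    return degrees + minutes / 60.0

def angular_separation(ra1, dec1, ra2, dec2):
    # Great-circle separation on the celestial sphere, in degrees,
    # via the spherical law of cosines.
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(cos_sep))

# Star of Bethlehem, per Tipler: dec 31 deg 43', R.A. 23h 23m.
star_ra, star_dec = hms_to_deg(23, 23), dm_to_deg(31, 43)
# Centre of M31 in 5 B.C., per Tipler: dec 30 deg 13', R.A. 23h 1m.
m31_ra, m31_dec = hms_to_deg(23, 1), dm_to_deg(30, 13)

print(angular_separation(star_ra, star_dec, m31_ra, m31_dec))  # ≈ 5.0 degrees
```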
Star of Bethlehem Frank Tipler
Monday, December 17, 2007
Christmas again
Each year, around this time, I find myself urging forth the passage of time, so that the working year may end, and the Christmas festivities may ensue. Once the festivities have begun, I then find myself, each year, urging forth the passage of time, so that the festivities may end, and the normal routine of life resume.
It's all a far cry from the days of Space Lego on Christmas Day. Those were proper Christmases. The excitement of opening, building, and playing with Space Lego was simply sans pareil. Just magic. Then came ZX Spectrum computer games on Christmas Day. 'R: Tape Loading Error' is indelibly etched upon my brain-stem for perpetuity. (Whoever deemed that audio cassette players had the fidelity to act as data-loading devices is, most probably, currently residing in a country with no UK extradition treaty). Eventually, however, when the games could be gently encouraged to load, monumental achievements such as Mike Singleton's Lords of Midnight defined the very texture of Christmas, and denied the use of the television to other stakeholders. And then came the book years: glossy Autocourse annuals, burnished with breathtaking photography, and burning with lambent text.
Do those Christmases past still exist? Is every detail of every event still sitting there, fixed in the crystalline lattice of the past, merely inaccessible to those who inhabit the present? There are basically three positions in the philosophy of time: (i) the whole 4-dimensional space-time of the universe exists, the passage of time is merely a subjective experience, and the future exists just as surely as the past and present; (ii) all of the past and the present exist, but the future is only potential, and the passage of time selectively transforms future potentialities into actualities, hence the totality of existence is growing, like a bath filling with tapwater; (iii) only the present exists, and the past is merely a fading ember of our memories.
The transformation of our mundane urban surroundings into a sparkling, tinsellated, multi-coloured spree of illumination, still excites me, but the poignancy and regret grows also with each passing year.
Saturday, December 15, 2007
Nigel Roebuck
Nigel Roebuck, one of the greatest Grand Prix journalists of all time, retires from Autosport this week, after more than 30 years writing for the magazine. In recent years, Autosport started an internet feature called 'Ask Nigel', where Roebuck would answer readers' questions. Anyway, in late 2003, Juan Pablo Montoya's Williams was in serious danger of overhauling Michael Schumacher's Ferrari for the World Championship. After the Hungarian Grand Prix, Ferrari protested the Michelin tyres which Williams, amongst other teams, were running, and the FIA upheld their protest. This severely disrupted Williams and Michelin's preparation for the coming races, unsettled Montoya, and the championship slipped beyond their grasp. This prompted my question to Nigel Roebuck, and his sublime response.
Friday, December 14, 2007
Peace by Christmas?
It seems that a grovelling apology from McLaren, and a commitment not to develop certain systems, have finally ended Formula One's espionage saga. The McLaren statement reads:
"To avoid even the possibility of Ferrari information influencing our performance during 2008, McLaren has offered a set of detailed undertakings to the FIA which will impose a moratorium on development in relation to three separate systems.
"McLaren wish to make a public apology to the FIA, Ferrari, the Formula One community and to Formula One fans throughout the world and offer their assurance that changes are now being made which will ensure that nothing comparable to what has taken place will ever happen again. McLaren have also agreed to pay the costs incurred by the FIA for their investigation."
One trusts, then, that in the coming days, Renault will issue a similar apology to McLaren, the Formula One community, and Formula One fans throughout the world, and will undertake not to develop their shock absorbers, fuel system, mass damper and seamless shift transmission, given that such development work could clearly have been influenced by the confidential McLaren information which they were in unauthorised possession of.
The FIA have published a redacted version of their recent report on the development of McLaren's 2008 car. The report demonstrates, as McLaren concede, that the information supplied by Ferrari employee, Nigel Stepney, propagated more widely through McLaren's engineering staff than they previously acknowledged. However, the report fails to establish whether the developments which McLaren planned to introduce arose from public domain information, or from information which they would not have had without the assistance of Mr Stepney. McLaren have, for example, undertaken not to use CO2 gas in their tyres, despite the fact that every other team will, presumably, be availing itself of CO2 next season!
The FIA suggest, with cynical sanctimony, that McLaren engineers in possession of information which, let us remember, they did not seek, but were given, should have refrained from using that information, and should instead have informed Ferrari and the FIA of the presence of the 'mole' within the Ferrari organisation. The last time I checked, Formula One was a competitive, engineering-based sport, in which teams identify how other teams have obtained a performance advantage, and then copy them.
Anyway, let us hope that this proves to be the end of the matter. Let us also hope that the sport can be administered, at some time in the near future, in a consistent and impartial manner.
Saturday, December 08, 2007
John Gray and Straw Dogs
John Gray's Straw Dogs is an attack upon the purported faith which supports modern secular liberal humanism. Gray's book is important, for it is the source for many of the anti-humanistic mantras uttered by contemporary religious apologists. Gray's primary target is the belief in the possibility of human progress, hence the attraction of his ideas to those who believe that human suffering is a penance we must serve to atone for 'original sin'.
Gray's definition of humanism can be found on page 4 of Straw Dogs:
"Humanism can mean many things, but for us it means belief in progress. To believe in progress is to believe that, by using the new powers given us by growing scientific knowledge, humans can free themselves from the limits that frame the lives of other animals."
Unfortunately, however, this is not the definition of humanism, but the definition of transhumanism. Humanism is the belief in the possibility of human progress, whilst transhumanism is the belief in the possibility that humans can transcend their animal nature. Gray's entire book, then, is founded upon a misunderstanding of what humanism is.
Moreover, humanism should not be conflated with utopianism. The unattainability of a human social and political utopia does not entail the impossibility of human progress. If progress is defined to be the reduction of human suffering, then progress is undeniably possible. Consider as a simple example the invention of anaesthetic. If one accepts that it is better for medical surgery to be performed with, rather than without anaesthetic, then one must accept that a society makes progress when it first conducts medical surgery under anaesthetic.
Gray's opinions on progress, however, are confused and contradictory. On p4 he states: "in the world shown to us by Darwin, there is nothing that can be called progress," whilst on p155 he states: "anaesthetic dentistry is an unmixed blessing. So are clean water and flush toilets. Progress is a fact. Even so, faith in progress is a superstition...Improvements in government and society are...real, but they are temporary. Not only can they be lost, they are sure to be."
Humanism, however, does not assert that progress is certain or irreversible. Humanism merely holds that: (i) human progress is possible; and (ii) human progress should be pursued. Most humanists are all too aware of the possibility of regress, and the difficulty of achieving progress. And, whilst there is no guarantee of progress, contra Gray there is also no guarantee of eventual regression.
The arguments expounded by Gray in Straw Dogs are washed, tumble-dried, and hung on the washing line again in Black Mass, which received the following withering review from the philosopher AC Grayling:
In order to establish that secular Whiggish Enlightenment-derived aspirations are the child of Christianity, Gray begins by calling any view or outlook a “religion”. Everything is a religion: Torquemada’s Catholicism, the pluralism and empiricism of 18th-century philosophers, liberalism, Stalinism. He speaks of “secular religion” and “political religion”. This empties the word “religion” of any meaning, making it a neutral portmanteau expression like “view” or “outlook”. He can therefore premise a gigantic fallacy of equivocation, and assimilate secular Enlightenment values to the Christian “narrative” of reformation aimed at bringing about a golden age.
For starters this misreads Christianity, for which truths are eternal and the narrative is a very short story indeed (obey, get to heaven; disobey, do not get to heaven); but more to the point, it utterly misreads the secular view. The secular view is a true narrative of incremental improvement in the human condition through education and political action. Gray thinks that such a view must of necessity be utopian, as if everyone simplistically thought that making things better (in dentistry, in the rule of law, in child health, in international mechanisms for reducing conflict, and so forth for many things) absolutely had to be aimed at realising an ideal golden age to have any meaning. But it does not: trying to make things better is not the same as believing that they can be made perfect. That is a point Gray completely fails to grasp, and it vitiates his case. Since that is so, the point bears repeating: meliorism is not perfectibilism.
But in making a nonsense of the word “religion” Gray blurs and blends just where important distinctions are required. A religion is a view which essentially premises commitment to belief in the existence of supernatural agencies in the universe, almost always conceived as having intentions and expectations regarding human beings. Such is the myth derived from humankind’s infancy, a myth that survives for both institutional and psychological reasons, largely to the detriment of human affairs. Most religions, especially if given the chance, share the totalitarian impulses of Stalinism and Nazism (think Torquemada and the Taliban) for a simple reason: all such are monolithic ideologies demanding subservience to a supposed ideal, on pain of punishment for non-conformity.
Now let us ask whether secular Enlightenment values of pluralism, democracy, the rule of independently and impartially administered law, freedom of thought, enquiry and expression, and liberty of the individual conform to the model of a monolithic ideology such as Catholicism, Islam or Stalinism. Let us further ask how Gray imagines that these values are direct inheritances from Christianity – the Christianity of the Inquisition, which burned to death any who sought to assert just such values. Indeed, the history of the modern European and Europe-derived world is precisely the history of liberation from the hegemony of Christianity. I shall be so bold as to refer the reader to the case for this claim in my forthcoming (Autumn 2007) full-length discussion of it, Towards the Light.
As to the weary old canard about the 20th-century totalitarianisms: it astonishes me how those who should know better can fail to see them as quintessentially counter-Enlightenment projects, and ones which the rest of the Enlightenment-derived world would not put up with and therefore defeated: Nazism in 17 years and Soviet communism in 70. They were counter-Enlightenment projects because they rejected the idea of pluralism and its concomitant liberties of thought and the person, and in the time-honoured unEnlightened way forcibly demanded submission to a monolithic ideal. They even used the forms and techniques of religion, from the notion of thought-crime to the embalming of saints in mausoleums (Lenin and Mao, like any number of saints and their relics, invite pilgrimage to their glass cases). Totalitarianism is not about progress but stasis; it is not about realising a golden age but coercively sustaining the myth of one. This indeed is the lineament of religion: it is the opposite of secular progressivism.
Most of what was achieved in the history of the West from the 16th century onwards – most notably science and the realisation of the values listed above – was wrested from the bitter reactionary grip of religion inch by painful and frequently bloody inch. How can Gray so far ignore this bald fact of history as to make the modern secular West the inheritor of the ideals and aspirations of what it fought so hard to free itself from (and is still bedevilled by)? His accordingly is a bizarre fantasy-version of history. In the face of the central heating that warms him, the modern dentistry that allows him to chew his peanuts, the computer he writes his strange books on and the aeroplanes he travels in, he asserts that “progress is a myth”. But perhaps he does not mean to call material progress a myth, but rather alleged progress in the political condition of a large portion of mankind. Does he thus mean that the movement from feudal baronies to universal suffrage and independent judiciaries is not progress? If it is not, what is it? Regress?
Gordon Murray and hot baths
The coolest man in the human noosphere is surely Gordon Murray. Murray designed a series of beautiful and innovative Formula 1 cars in the 1970s and 1980s, and then proceeded to design both the McLaren F1 road car, and the Mercedes SLR. Interviewed in this month's Motorsport magazine, Murray reveals that "I'm unusual for an engineer in that I went to art school when I was 13. I still do a bit of drawing and painting, and I love styling. I couldn't bring myself to make an ugly car."
Murray (seen here on the right of Niki Lauda) is the most relaxed of individuals, and during his years in Formula 1 could typically be seen sporting a T-shirt, jeans and rock-star sunglasses. He also put together a rock band in the 1980s: "I play guitar and drums, both very badly...we'd have a jam session that would last all weekend. Leo Sayer sang with us for a bit, and George Harrison played with us one night. We had a lot of fun."
Murray introduced the idea of strategic pit-stops into modern Formula 1, and the genesis of this idea confirms my own long-held bathing beliefs:
"It started as a hot bath idea. I used to have a lot of good ideas after a hot bath. Apparently there's a physical reason for this, there's a channel in your spine that opens up in the heat and increases the blood supply to the brain. I knew how much the tyres used to go off. And I'd learned from running the cars light in qualifying that the weight of one litre of fuel cost around one hundredth of a second in lap times. So I lay in the bath doing the maths.
"The clever thing wasn't having the idea, it was developing all the stuff that went into it. That's the bit I love. Throw me a series of connected problems and I've got to find a way to make everything work together. In this case it was, how do you change the tyres quickly, how do you put the fuel in quickly, and how do you avoid losing pace going back out on cold tyres? We videoed the mechanics changing tyres, analysed it frame by frame, and I redesigned the hubs, bearing carriers, threads, nuts and wheel guns, with a device to retain the nuts. And I put titanium on-board air jacks on the car...Tyre warmers didn't exist then, so I made an oven, a big thing like the Tardis, with temperature probes and hot air circulating through four tyres. Then we did the fuel kit. Nowadays it all has to be done at atmospheric pressure, so it's pretty slow. But there were no rules about it then, so I designed a twin-barrel fuel system running at 4 bar. The damper barrel fed the fuel barrel and the fuel barrel fed the car, and we could push in 35 gallons in 3.5 seconds. Which is like an explosion, believe me.
"With the refuelling we had one guy on one side of the car opening the breather, one on the other putting in the fuel. Big heavy hoses over their shoulders. If the breather's not on when the fuel guy opens the pressure, the car disintegrates. Four bar inside a carbon and aluminium monocoque, you wouldn't find the pieces. So I designed all sorts of mechanical interlocks inside the tank and it all got really complicated, castellations and cams and Geneva mechanisms. And I looked at it and thought, on a racing car it's going to vibrate; one day it's going to fail to work and the car's going to explode. No more Brabham [the team for which Murray worked], no more pitlane, maybe no more F1. I studied the videos again, and I realised that during a stop the breather guy and the fuel guy would be facing each other each side of the roll hoop with their noses about four inches apart. So I scrapped the interlocks and I got the two of them together and I said to the breather guy, 'When you approach the car with your hose, you're looking down to where you've got to lock the hose on, so don't look up until it's on.' And I said to the fuel guy, 'Don't turn on the fuel until he looks up at you and you see the whites of his eyes four inches away.' So that's what they did, and it worked."
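Murray's bath-time arithmetic can be sketched as a toy calculation. The only figure below taken from the interview is the roughly 0.01 s per lap cost of carrying one litre of fuel; the race length, fuel consumption and pit-stop time loss are illustrative assumptions of mine, and the lap-time gain from fresh tyres (the other half of Murray's case) is ignored entirely:

```python
# A hedged sketch of the fuel-weight side of Murray's pit-stop sums.
# Only LAP_TIME_COST_PER_LITRE comes from the interview; the rest are
# assumed, illustrative numbers.

LAP_TIME_COST_PER_LITRE = 0.01  # seconds per lap, per litre carried (from the quote)

def fuel_weight_penalty(start_fuel_litres, consumption_per_lap, laps):
    """Total lap time (s) lost over a stint to carrying fuel, assuming
    the load falls by a fixed amount each lap as fuel burns off."""
    total = 0.0
    fuel = start_fuel_litres
    for _ in range(laps):
        total += fuel * LAP_TIME_COST_PER_LITRE
        fuel -= consumption_per_lap
    return total

RACE_LAPS = 60      # assumed race distance
CONSUMPTION = 2.7   # litres per lap (assumed)
PIT_STOP_LOSS = 20  # seconds lost to a stop, in and out (assumed)

# Non-stop race: carry fuel for the whole distance from the start.
no_stop = fuel_weight_penalty(RACE_LAPS * CONSUMPTION, CONSUMPTION, RACE_LAPS)

# One stop at half distance: two light stints, plus the stop itself.
half = RACE_LAPS // 2
one_stop = (2 * fuel_weight_penalty(half * CONSUMPTION, CONSUMPTION, half)
            + PIT_STOP_LOSS)

print(f"no-stop fuel-weight penalty : {no_stop:6.2f} s")
print(f"one-stop total cost         : {one_stop:6.2f} s")
print(f"one stop saves              : {no_stop - one_stop:6.2f} s")
```

With these numbers a single half-distance stop saves a few seconds on fuel weight alone, before the fresh-tyre advantage is counted; with a slower stop the sums go the other way, which is presumably why the maths had to be done in the bath first.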
Thursday, December 06, 2007
Lawrence Krauss and the Casimir effect II
A couple of days ago, I wrote about Lawrence Krauss's article in Physics World. At that time, the article referred to
...the Casimir force that pushes apart metal plates brought very close together.
I pointed out in my post that the effects typically referred to as the Casimir effect are examples of an attractive force between a pair of parallel plates. I now note with amusement that the text of the on-line version has been changed to read:
..the Casimir force that draws together metal plates brought very close to one another.
I'm impressed! However, the original version is preserved for posterity in the printed copy of Physics World.
Wednesday, December 05, 2007
Alternative biochemical life
This month's Scientific American contains an article by Paul Davies on the possibility that life may have arisen on multiple independent occasions upon the surface of the Earth. A nice summary is provided of the biochemical differences which might exist between life forms of independent origin:
- "Large biological molecules possess a definite handedness: although the atoms in a molecule can be configured into two mirror-image orientations—left-handed or right-handed—molecules must possess compatible chirality to assemble into more complex structures. In known life-forms, the amino acids—the building blocks of proteins—are left-handed, whereas the sugars are right-handed and DNA is a right-handed double helix. The laws of chemistry, however, are blind to left and right, so if life started again from scratch, there would be a 50–50 chance that its building blocks would be molecules of the opposite handedness. Shadow life could in principle be biochemically almost identical to known life but made of mirror-image molecules. Such mirror life would not compete directly with known life, nor could the two forms swap genes, because the relevant molecules would not be interchangeable."
- "Another possibility is that shadow life might share the same general biochemistry with familiar life but employ a different suite of amino acids or nucleotides (the building blocks of DNA)...chemists can synthesize many other amino acids that are not present in known organisms...Some of these unfamiliar amino acids might make suitable building blocks for alternative forms of life."
- "Another popular conjecture concerns the basic chemical elements that make up the vital parts of known organisms: carbon, hydrogen, oxygen, nitrogen and phosphorus. Would life be possible if a different element were substituted for one of these five? Phosphorus is problematic for life in some ways. It is relatively rare and would not have existed in abundance in readily accessible, soluble form under the conditions that prevailed during the early history of Earth. Felisa Wolfe-Simon, formerly at Arizona State University and now at Harvard University, has hypothesized that arsenic can successfully fill the role of phosphorus for living organisms and would have offered distinct chemical advantages in ancient environments."
- "Some astrobiologists have speculated about the possibility of life arising from silicon compounds instead of carbon compounds." [Carbon, like silicon, possesses four valence electrons, enabling it to form rings and chains which form the backbone of biological molecules.]
Tuesday, December 04, 2007
Lawrence Krauss and the Casimir effect
The December issue of Physics World contains a number of features on dark energy, including this ludicrous piece by Lawrence Krauss. Consider the following statements made by Krauss:
Quantum mechanics, combined with relativity, implies that empty space is full of a wild brew of virtual particles that pop in and out of existence so quickly that we cannot directly detect them. Nevertheless, these particles leave a measurable imprint on everything from the spacing between atomic energy levels to the Casimir force that pushes apart metal plates brought very close together.
One might expect these virtual particles to contribute an energy to empty space, which would result in an identical term to Einstein's original cosmological constant that would lead to universal repulsion and hence an accelerating universe. This form of 'vacuum energy' is gravitationally repulsive because it possesses a negative pressure that is equal and opposite in magnitude to its energy density. In other words, the ratio of the pressure to the energy density — called the 'equation of state' parameter, w — has a value of –1.
This is factually incorrect in a very straightforward fashion: whilst there is a repulsive Casimir force between a concentric pair of spheres, the effects typically cited as Casimir effects are examples of an attractive force between a pair of parallel metal plates! The editor of Physics World might have wished to check previous articles on the Casimir effect within his own journal to verify this, but I guess he must have been otherwise engaged on this occasion.
Theory predicts that the vacuum expectation value for the energy density of the electromagnetic field will not only be non-zero between the plates, but negative. Thus, it is believed by some that the Casimir effect demonstrates the physical existence of negative energy, at least on small scales. This is exactly why the Casimir effect was invoked as a mechanism for holding open wormholes in space, given that the latter require negative energy densities. The vacuum fluctuations between a pair of metal plates do indeed suggest an equation of state with a parameter of w = −1, but in the case of plates which are attracted together, this implies a positive pressure and a negative energy. This is the exact opposite of the conclusion which Krauss attempts to establish: that vacuum energy repels things due to the presence of negative pressure.
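For concreteness, the textbook results for ideal, perfectly conducting parallel plates can be evaluated numerically. The sketch below uses the standard ideal-conductor expressions; the 1 μm separation is purely an illustrative choice, not a figure from the article. Note that the computed energy density between the plates is indeed negative.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_force_per_area(a):
    """Casimir force per unit area between ideal parallel plates
    separated by a metres: F/A = -pi^2 * hbar * c / (240 * a^4).
    Negative sign indicates attraction."""
    return -math.pi**2 * HBAR * C / (240.0 * a**4)

def casimir_energy_density(a):
    """Vacuum energy density between the plates:
    u = -pi^2 * hbar * c / (720 * a^4) -- negative, as discussed above."""
    return -math.pi**2 * HBAR * C / (720.0 * a**4)

a = 1e-6  # 1 micron separation (illustrative)
print(casimir_force_per_area(a))   # about -1.3e-3 Pa: a tiny attractive force
print(casimir_energy_density(a))   # negative energy density between the plates
```

At micron separations the force per unit area is of order a millipascal, which is why the effect was only measured precisely decades after Casimir's 1948 prediction.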
Krauss then proceeds to express some doubt that we can ever establish that dark energy is a cosmological constant. His argument is, once again, quite bizarre:
The only way we can determine from observations that dark energy is not a cosmological constant is to somehow measure its equation of state parameter, w, and find that it is not, or was not, equal to –1. If the measured value is indistinguishable from –1 within experimental uncertainties, then we have not learned anything at all because dark energy could either be a cosmological constant or something else less (or more) exotic that behaved very much like it.
If one applied this criterion generally in science, then one would have to conclude that the predicted value of a quantity could never be verified because, well, there are always some error bars associated with the measurement. On the contrary, if a theoretical hypothesis predicts a certain value for some quantity, and the value of that quantity is subsequently measured to agree with the predicted value to within, say, two multiples of the standard deviation due to measurement uncertainties, then one can say that: (i) the prediction has been verified, and (ii) the probability that the hypothesis is true has been considerably increased, in accordance with Bayes' theorem. Of course, if there is an alternative theory, which predicts a similar value for the quantity, then the measured value doesn't enable one to discriminate between the two theories. But, in the case of dark energy, there is, as yet, no theory of any 'exotic' something that behaves like the cosmological constant.
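The Bayesian point can be made concrete with a toy calculation. All the numbers below are invented for illustration: one hypothesis predicts w = −1 (a cosmological constant), a rival predicts w = −0.8, the priors are equal, and a hypothetical measurement yields w = −0.98 with a Gaussian error of 0.05. A measurement within two standard deviations of −1 then sharply raises the probability of the first hypothesis.

```python
import math

def gaussian_likelihood(x, mu, sigma):
    """Likelihood of measuring x when the hypothesis predicts mu,
    assuming Gaussian measurement error of width sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical set-up: H1 predicts w = -1, H2 predicts w = -0.8, equal priors.
prior_h1, prior_h2 = 0.5, 0.5
measured_w, sigma = -0.98, 0.05  # within 2 sigma of -1

l1 = gaussian_likelihood(measured_w, -1.0, sigma)
l2 = gaussian_likelihood(measured_w, -0.8, sigma)

# Bayes' theorem: posterior proportional to likelihood times prior.
posterior_h1 = l1 * prior_h1 / (l1 * prior_h1 + l2 * prior_h2)
print(posterior_h1)  # well above the 0.5 prior: the measurement favours H1
```

Of course, if the rival hypothesis also predicted a value near −1, the two likelihoods would be comparable and the measurement would not discriminate between them, which is precisely the caveat noted above.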
It was Krauss, of course, who suggested a couple of weeks ago that mankind may have reduced the life-expectancy of the universe...
Monday, December 03, 2007
Mis-engineering
Unbeknown to most, there is a secret cadre of benighted engineers who have, over the past decades, invented a new engineering discipline: Mis-engineering. The purpose of this highly distributed covert conspiracy is the deliberate under-fulfilment of engineering potential, in order to satisfy moral, financial, or political objectives. The manifestations of this perverted underground include the following:
- The Amazon Search engine. The purpose of this is not to enable you to find the product you seek, but to generate a list of products which you must browse through before finding your intended quarry. En route, a significant proportion of customers will have their interest piqued by other items, which they will then choose to buy. It is the internet equivalent of forcing supermarket customers to walk past all the other food in a shop before they reach the bread and milk.
- Predictive text's Christian ignorance of slang and taboo words.
- Impaired road-junction visibility. People are not to be trusted to make judgements on the road, so hedges and barriers are deliberately engineered to prevent road-users approaching a junction from spotting vehicles coming from the left or right. Vehicles must come to a stop, and the road-user must then, and only then, observe the traffic coming from either side.
- Famously, the interminable walk from an aircraft arrival gate to the luggage conveyor belt. The speed of the delivery process is not to be improved; rather, new methods for distracting people from the waiting are to be devised.
- Traffic lights which change according to fixed timing, and not the presence of cars. Traffic must be brought to a stop at traffic lights; a flowing stream of cars must be broken into bunches.
Thursday, November 29, 2007
Hugh Everett III and J.P.Morgan
The December issue of Scientific American contains a feature on the life of Hugh Everett III, the inventor of the many-worlds interpretation of quantum theory. Everett left academia in a state of some bitterness at the initial lack of enthusiasm for his proposal. After a period of time working for the Pentagon on the mathematics of nuclear warfare, he established Lambda, a private defence research company. Under contract to the Pentagon, Lambda used Bayesian methods to develop a system for tracking ballistic missiles. Most intriguingly, though, the article suggests that Everett may have subsequently hoodwinked American bank J.P.Morgan:
John Y. Barry, a former colleague of Everett's...questioned his ethics. In the mid-1970s Barry convinced his employers at J.P. Morgan to hire Everett to develop a Bayesian method of predicting movement in the stock market. By several accounts, Everett succeeded - and then refused to turn the product over to J.P.Morgan. "He used us," Barry recalls. "[He was] a brilliant, innovative, slippery, untrustworthy, probably alcoholic individual."
This information appears to have been taken from an on-line biography of Everett, which quotes Barry as follows:
"In the middle 1970s I was in the basic research group of J. P. Morgan and hired Lambda Corporation to develop...the Bayesian stock market timer. He refused to give us the computer code and insisted that Lambda be paid for market forecasts. Morgan could have sued Lambda for the code under the legal precedent of 'work for hire'. Rather than do so, we decided to have nothing more to do with Lambda because they, Hugh, were so unethical. We found that he later used the work developed with Morgan money as a basis for systems sold to the Federal Government. He used us...In brief a brilliant, innovative, slippery, untrustworthy, probably alcoholic, individual."
What I don't understand here is how this recollection is supposed to substantiate the claim that Everett had dubious ethics. Within an economy, there are some companies which provide the fundamental level of production: companies which invent things, design things, develop things, discover things, and make things; and then there are parasitic companies, such as banks and firms of lawyers, which produce nothing, and merely drain wealth from the activities of others. Taking money from a large American bank to subsidise a research project, and then preventing that bank from harvesting the fruits of that project, is the very paragon of ethical behaviour.
Wednesday, November 28, 2007
Bear Grylls and Coleridge
Last night on the Discovery Channel, there was a gripping two-part 'Born Survivor' special, as Bear Grylls demonstrated how to survive in the Saharan desert. At one stage, Grylls was presented with a dead camel from a local tribe. To provide a blanket against the cold of night, he stripped the skin from the animal, then he cut into its side, and scooped out some water-like liquid to re-hydrate himself. He then cut into what I think was the camel's stomach, the inside of which contained a mass of yellowish manure. Grylls scooped some out, held it above his head, and squeezed the yellow liquid from the manure into his mouth! "It's better than nothing," he claimed.
This was all very impressive, but I couldn't help remembering the recent revelations that Andy Serkis actually performs all the survival stunts on the show, and that Grylls's face is simply CGI-ed on in post-production. And at one stage, when Grylls, shot from an aerial perspective, stood atop a high escarpment and wondered aloud "How do I get down from this?", I did blurt out: "Use the helicopter!"
Seriously, though, I noticed that the programme was now making the presence of the camera crew explicit, Grylls talking to them at times, and taking a hand-held camera from them on one occasion. Along with the rider displayed at the beginning of the programme, which emphasises that some situations are set up in advance for Grylls to demonstrate survival techniques, this seems to be a reaction to the accusations of fakery which were levelled at this programme, amongst several others. Similarly, I noticed on a recent 'Top Gear' that, when the lads were driving across Botswana, the presence of the film crew and supporting mechanics was made quite overt.
There was a time when TV documentaries, like films, would seek to transport the mind of the viewer to another place, and, to sustain this illusion, no reference would be made to the presence of the camera, or to the production process as a whole. There was no attempt in this to deceive the viewer, rather the viewer and programme-maker were conspiring together in the willing suspension of disbelief (© Sam Coleridge), with the ultimate purpose of enhancing the viewing experience.
The modern trend to make explicit the presence of the cameraman and sound recordist, and to refer within a programme to the programme-making process, is seen as lending a type of authenticity to a programme. The upshot, however, is to dissolve the possible suspension of disbelief in the viewer, and ultimately, therefore, to reduce the potential pleasure which a viewer can gain from a TV programme. This is not a positive trend.
Monday, November 26, 2007
Is Benitez going to be sacked?
Astonishingly, it seems that Liverpool manager Rafael Benitez could be on the verge of leaving the club. Benitez reacted petulantly last week when the club's American co-owners, George Gillett and Tom Hicks, postponed any decisions about expenditure in the January transfer window. It appears that Gillett and Hicks's tolerance threshold for insubordination is rather low, for they are now seeking to rid themselves of Benitez by the end of the season. Either that, or they are seeking to control Benitez with dismissal threats, which doesn't sound like a rosy way to proceed either.
Now, I've never been an unconditional fan of Benitez. I think he's an excellent manager for the European game, but he's never really sussed out how to win the Premiership. Even in his first season with Liverpool, Benitez made a succession of slightly odd team selections and tactical decisions, which restricted the team to only 5th place at season's end.
Benitez's greatest triumph, of course, was taking Liverpool to victory in the Champions' League that same year. But even there, in the final, he made an astonishing mis-judgement in selecting Harry Kewell to start the game, and by leaving Dietmar Hamann out of the team until the second half, AC Milan rampaged to a three-goal lead. Liverpool's come-back was remarkable, but hugely fortunate, and was driven more by Gerrard, Carragher and Hamann, than by Benitez.
Benitez's erratic decision-making has continued into a fourth season with Liverpool, and a stunningly inexplicable 'rotation policy' has again restricted the team to its current 5th-place in the Premiership table.
But should Benitez go? On balance, I don't think so. Whilst he's an odd fellow, he's also clearly got something about him as a manager, and I don't know who could replace him at the moment and be more effective. Nevertheless, I've felt for some time that Liverpool won't win the Premiership under Benitez, and it seems that I will be proven correct even earlier than I imagined...
Saturday, November 24, 2007
Has mankind reduced the life-expectancy of the universe?
Hot on the heels of my post concerning the destruction of the universe, cosmologist Lawrence Krauss has suggested that mankind's detection of dark energy in 1998 may reduce the lifetime of our universe!
The idea, once again, is that the false vacuum energy of the scalar field responsible for inflation (the hypothetical exponential expansion of the very early universe) may not have decayed to zero, and, since the time of inflation, may have been residing in another false vacuum state of much lower, but non-zero, energy. It is suggested by Krauss and James Dent that the dark energy detected by cosmologists in the past decade may simply be this residual false vacuum energy.
A false vacuum state is 'metastable' in the sense that it is a state of temporary stability, but one which is prone to decay, much like the nucleus of a radioactive atom. Krauss and Dent point out that in quantum mechanics, the survival probability of a metastable state will decrease exponentially until a critical cut-off time, after which the survival probability will only decrease according to a power law. Given that a false vacuum expands exponentially, there will be a net increase in the volume of space in a false vacuum state after this cut-off time. Krauss argues that whilst the metastable residual false vacuum of our universe may have reached this critical cut-off, the act of observing the dark energy may have reset the decay, according to the 'quantum Zeno effect'. The consequence is that mankind's actions may have significantly increased the probability that the residual false vacuum will decay to the true vacuum, destroying all the structure in our universe.
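The exponential-then-power-law behaviour of the survival probability can be sketched with a toy model. The decay rate, crossover time and power-law index below are arbitrary illustrative parameters, not values derived from Krauss and Dent's analysis; the only feature the sketch captures is the change of decay law at the cut-off.

```python
import math

def survival_probability(t, gamma, t_c, power=2.0):
    """Toy survival probability for a metastable state: exponential
    decay up to a crossover time t_c, power-law decay afterwards.
    The two branches are matched at t_c so the probability is continuous."""
    if t <= t_c:
        return math.exp(-gamma * t)
    return math.exp(-gamma * t_c) * (t_c / t) ** power

gamma, t_c = 1.0, 10.0  # arbitrary units
print(survival_probability(5.0, gamma, t_c))   # exponential regime
print(survival_probability(20.0, gamma, t_c))  # slower, power-law regime
```

The point of the crossover is visible in the numbers: at twice the cut-off time, the power-law branch gives a survival probability orders of magnitude larger than a continued exponential decay would. An observation that 'resets' the decay, in the Zeno-effect sense, would push the state back into the faster exponential regime.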
With an irony that may have been lost on some readers, The Daily Telegraph's Roger Highfield refers to these as "damaging allegations."
There are also at least two reasons why the claims shouldn't be taken seriously. Firstly, as Max Tegmark points out in New Scientist, the quantum Zeno effect does not require humans to make observations: "Galaxies have 'observed' the dark energy long before we evolved. When we humans in turn observe the light from these galaxies, it changes nothing except our own knowledge." Secondly, Krauss and Dent's assumption that dark energy can be equated with the energy density of a scalar field is inconsistent with the current observational evidence, which suggests that the dark energy is a cosmological constant. The energy density due to a cosmological constant is a property of space, not a property of any matter field in space. The energy density due to a cosmological constant is literally constant in space and time, unlike that attributable to the scalar fields postulated to explain dark energy.
The economics of quality
It is a platitude of economics that competition drives the price of a product down towards the cost of producing it. However, it is also a truism that the cost of producing something is variable. Hence, a lower price or greater profit margin can be generated if production costs are reduced.
There are at least two ways in which costs can be reduced: (i) the efficiency of the production process can be increased; or (ii) the quality of the product can be reduced.
In television, it seems that competition has resulted in a reduction in the quality of the product. I would also argue that in the world of academic publishing, the physical quality of the books published in the past decade or so does not match that of the books published in the 1970s and 1980s. I am not thinking here of the quality of the intellectual contents, but of the quality of the book itself, as an extended physical object.
Back in the 1970s and 1980s, Academic Press published a series of monographs and textbooks entitled Pure and Applied Mathematics, edited by Samuel Eilenberg and Hyman Bass. These were beautiful books. The paper was of the highest quality; the choice of fonts and typeface was perfect, the text bejewelled with calligraphically sculpted fraktur and script characters; and the books were covered in a type of green leather hide, bearing their titles in gold leaf lettering. These books even smelt good when you opened them. There is nothing comparable in academic publishing today.
Saturday, November 17, 2007
Analytic Metaphysics
On Thursday this week I made the trip to Oxford to see James Ladyman deliver his talk, The Bankruptcy of Analytic Metaphysics. And most entertaining it was too.
The type of methodology which I take James to be attacking is nicely defined in A Companion to Metaphysics (Jaegwon Kim and Ernest Sosa (eds.), Blackwell, 1995). In Felicia Ackerman's entry on 'analysis', we are asked to "consider the following proposition.
(1) To be an instance of knowledge is to be an instance of justified true belief not essentially grounded in any falsehood.
(1) exemplifies a central sort of philosophical analysis. Analyses of this sort can be characterized as follows:
(a) The analysans and analysandum are necessarily coextensive, i.e. every instance of one is an instance of the other.
(b) The analysans and analysandum are knowable a priori to be coextensive.
(c) The analysandum is simpler than the analysans...
(d) The analysans does not have the analysandum as a constituent.
(e) A proposition that gives a correct analysis can be justified by the philosophical example-and-counter-example method, i.e. by generalizing from intuitions about the correct answers to questions about a varied and wide-ranging series of simple described hypothetical test cases, such as 'If such-and-such were the case, would you call this a case of knowledge?' Thus, such an analysis is a philosophical discovery, rather than something that must be obvious to ordinary users of the terms in question."
But what, exactly, is the criterion to be applied in these test cases? These are not empirical tests, where we can compare the predictions of theory with the results of measurement and observation. Neither are these tests comparable to the tests devised by mathematicians, to support or reject a mathematical hypothesis. In analytic metaphysics, an attempt is being made to define the meaning of one of the terms of discourse (in the example given here, the term is 'knowledge'); in the mathematical case, all the terms of discourse have been stipulatively defined at the outset.
As Ladyman has emphasized, the problem with this methodology is that it ultimately appeals to intuition, which is not only culturally dependent, but varies from one individual to another within a culture.
Ackerman acknowledges that "It can...be objected that it is virtually impossible to produce an example of an analysis that is both philosophically interesting and generally accepted as true. But virtually all propositions philosophers put forth suffer from this problem...The hypothetical example-and-counterexample method the sort of analysis (1) exemplifies is fundamental in philosophical enquiry, even if philosophers cannot reach agreement on analyses."
It seems to be acknowledged, then, that the results of relying upon intuition are inconsistent. If the results of a methodology are inconsistent, then, in most disciplines, that entails that the methodology is unreliable, which, in most cases, is a sufficient condition for the methodology to be rejected as a deficient methodology. Apparently, however, "all propositions philosophers put forth suffer from this problem," so the methodology continues to be employed in metaphysics, and, for that matter, in epistemology too. Remarkable.
Wednesday, November 14, 2007
An exceptionally simple theory of everything
Surfer dude Garrett Lisi has produced a fabulous theory of everything, which, at the classical level at least, unifies the structure of the standard model of particle physics with the structure of general relativity.
The basic idea is that the gauge group of the entire universe is E8, a group which is classified as an exceptional simple Lie group. The gauge field of the entire universe would be represented, at a classical level, by a superconnection upon the total space of an E8-principal fibre bundle over a 4-dimensional space-time. This gauge field subsumes not only the gravitational field, the electroweak field, and the strong field, but all the matter fields as well, including the quark and lepton fields, and the Higgs field.
The diagram here represents the roots of the Lie algebra of E8, each of which purportedly defines a possible type of elementary particle. Every Lie algebra has a maximal commuting subalgebra, called the Cartan subalgebra. In each representation of a Lie algebra, the simultaneous eigenvectors of the elements from the Cartan subalgebra are called the weight vectors of the representation, and their simultaneous eigenvalues are called the weights of the representation. In the special case of the adjoint representation, (a representation of a Lie algebra upon itself), the weight vectors are called the root vectors, and the weights are called the roots.
In the case of E8 the Cartan subalgebra is 8-dimensional; the remaining 240 dimensions of the 248-dimensional Lie algebra correspond to the 240 roots, and each of these roots is defined by 8 numbers, the eigenvalues with respect to the 8 linearly-independent vectors which are chosen as a basis for the Cartan subalgebra. These 8 numbers are the 'quantum numbers' which define each type of elementary particle.
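The machinery of Cartan subalgebras and roots can be seen in miniature in the simplest non-trivial case, sl(2), rather than E8. The following Python sketch (purely illustrative) recovers the roots of sl(2) as the nonzero eigenvalues of the adjoint action of a Cartan element:

```python
import numpy as np

# Roots of the simplest non-trivial Lie algebra, sl(2, C), computed from the
# adjoint representation. H spans the (1-dimensional) Cartan subalgebra.
H = np.array([[1, 0], [0, -1]], dtype=complex)
E = np.array([[0, 1], [0, 0]], dtype=complex)
F = np.array([[0, 0], [1, 0]], dtype=complex)
basis = [H, E, F]

def bracket(A, B):
    """The Lie bracket [A, B] = AB - BA."""
    return A @ B - B @ A

def coords(X):
    """Coordinates of X = aH + bE + cF in the basis (H, E, F)."""
    return np.array([X[0, 0], X[0, 1], X[1, 0]]).real

# Matrix of ad(H): the adjoint action of H on the algebra itself.
ad_H = np.column_stack([coords(bracket(H, X)) for X in basis])

# The nonzero eigenvalues of ad(H) are the roots of sl(2): +2 and -2.
roots = np.sort(np.linalg.eigvals(ad_H).real)
print(roots)
```

For E8 one would instead diagonalise the adjoint action of an 8-dimensional Cartan subalgebra on the 248-dimensional algebra, obtaining the 240 roots as 8-component vectors of eigenvalues; the principle is the same.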
It's a remarkable paper, which I shall retire to consider at some length.
Philosophy 2.0
Philosophy 2.0 is, I propose, the new version of philosophy which will return the subject to its former prestigious role as the fundamental, over-arching, unifying, synthesising discipline, fully integrated with science.
In this vein, Dr James Ladyman, of the University of Bristol, will deliver a talk entitled The Bankruptcy of Analytic Metaphysics, in the Philosophy department at Oxford tomorrow (Thursday 15th November, 4.30, Lecture Room, 10 Merton St.) I reproduce the abstract here:
Analytic metaphysics is becoming increasingly dominant in contemporary philosophy but its status and influence is undeserved and pernicious. The methodology of analytic metaphysics with its reliance on intuition and explanation by posit has no epistemological justification and its results have little or no epistemic value. Unless checked it threatens to discredit philosophy among non-philosophers and waste the talents of a host of graduate students as well as exerting a pernicious influence on other areas of philosophy.
I will argue the case for the above claims with reference to recent debates about composition, gunk versus atoms, mental causation and Humean supervenience. I will argue for a naturalized metaphysics that engages with science.
James has also just published a survey of structural realism, which can be found here.
Tuesday, November 13, 2007
All things fair and fowl
It seems that many of the birds in Norfolk and Suffolk have been forced to stay indoors for the next day or so. I imagine, therefore, that they will currently be sitting at home, flicking absent-mindedly through the channels on Freeview, or randomly surfing the net in the hope of finding something that plucks their interest.
In an attempt to satisfy my avian visitors, may I point them in the direction of this interesting research, which attempts to explain, by means of computer simulation, why flocks of birds fly in V-formations, or even W-formations. It seems that these formations offer the optimum combination of collective aerodynamic efficiency and visibility.
Our fine-feathered friends are, of course, notoriously fond of the odd worm-snack, and will therefore be most interested in the latest proposal for wormholes in physics. This particular proposal appears to be an extension of the idea that invisibility cloaks can be designed using materials with a non-uniform refractive index. The tubular materials proposed here are, it seems, deemed wormholes on the basis that "light entering the tube at one end would emerge at the other with no visible tunnel in-between." I shall resist the judgement that such research is progressing on a wing and a prayer.
Monday, November 12, 2007
How to destroy the universe
Schemes for the possible creation of a universe in a laboratory have received a decent amount of publicity in recent years. In comparison, laboratory-based schemes for the destruction of our universe have been sadly neglected. In an effort, then, to redress this inequality, let me explain how our universe may be destroyed.
The idea depends upon the concept of a false vacuum, introduced by inflationary cosmology. Inflation suggests that there is a scalar field, the 'inflaton', whose 'equation of state' is such that a positive energy density corresponds to a negative pressure. In general relativity, a matter field with negative pressure generates a repulsive gravitational effect. Inflationary cosmology suggests that at some time in the early universe, the energy density of the universe came to be dominated by the non-zero energy density of the inflaton field. A region of the universe in this so-called false vacuum state would undergo exponential expansion until the inflaton field dropped into a lower energy state. This lower energy state is conventionally considered to be the 'true vacuum' state, the lowest energy state of the inflaton field. However, inflation works just as effectively if the transition is from one positive energy state to another, lower, positive energy state. And it is this possibility which opens up the doomsday scenario.
It was originally proposed that the false vacuum state was a local minimum of the potential energy function for the inflaton field, and that inflation ended when the state of the inflaton quantum-mechanically tunnelled through the potential barrier from the initial minimum to another, lower, minimum of the potential energy function, possibly the global minimum (true vacuum) state (see diagram). It was suggested that inflation was ended locally by this quantum tunnelling, and a bubble of the lower-energy vacuum formed, surrounded by a region of the higher-energy vacuum. The walls of the bubble then expanded outwards at the speed of light, destroying all in their path. It was subsequently thought that such 'bubble nucleation' could not explain the observed homogeneity of our own universe, and the original inflationary proposal was duly superseded by the 'new' inflationary proposal, and the 'chaotic' inflationary proposal, which both suggested that inflation could occur without the need for the false vacuum to be a local minimum of the potential, and inflation could therefore end without quantum tunnelling and bubble nucleation.
This, however, does not mean that the bubble nucleation of a lower-energy vacuum is physically impossible; it merely entails that such a process was not involved in inflation. If the current state of the inflaton field is still not the lowest energy state of that field, and if the current state is a local minimum, it may be that the current state will be ended by quantum tunnelling and bubble nucleation. Moreover, particle accelerators or laser-fusion devices of the future may generate sufficient energy density to perturb the current state of the inflaton field out of its local minimum, and over the potential barrier into a lower-energy state. A bubble of lower-energy vacuum could thereby form in the laboratory, and propagate outwards at the speed of light, destroying all in its path.
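The distinction between the two vacua can be sketched numerically with a toy tilted double-well potential (the potential and its parameter values are purely illustrative, not drawn from any real inflaton model):

```python
import numpy as np

# A toy tilted double-well potential for a scalar field phi:
#   V(phi) = lam * (phi**2 - a**2)**2 + eps * phi
# (lam, a and eps are illustrative parameters, invented for this sketch).
lam, a, eps = 1.0, 1.0, 0.3

def V(phi):
    return lam * (phi**2 - a**2)**2 + eps * phi

phi = np.linspace(-2.0, 2.0, 100001)
v = V(phi)

# Interior local minima: grid points lower than both neighbours.
idx = np.where((v[1:-1] < v[:-2]) & (v[1:-1] < v[2:]))[0] + 1
minima = phi[idx]

false_vac = max(minima, key=V)  # higher-energy local minimum: a 'false vacuum'
true_vac = min(minima, key=V)   # global minimum: the 'true vacuum'

# The false vacuum is metastable: tunnelling through the barrier releases the
# energy density difference V(false_vac) - V(true_vac).
print(false_vac, true_vac, V(false_vac) - V(true_vac))
```

The tilt term eps * phi is what separates the two minima in energy; with eps = 0 the wells would be degenerate and there would be no false vacuum to decay.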
Sunday, November 11, 2007
The philosophy of music
Modern scientists tend to possess an almost complete ignorance of philosophy, which renders many of their more general claims naive and parochial. Conversely, however, much of 20th-century philosophy was driven by a careerist desire to build an academic discipline which could stand independently of scientific discovery and understanding. Many 20th-century philosophers received an arts-based education just as narrow as that of modern scientists, and, lacking an understanding of science, strove to build an 'analytic philosophy' centred around the analysis of natural language. These philosophers sought to believe that their discipline was logically independent of science because their careers were dependent upon such a claim.
This is not what philosophy should be. Philosophy should be the great over-arching, unifying, synthesising discipline. In an age of deepening academic specialisation, philosophy should be reacting against this trend. Philosophers should be competent in the arts and the sciences. In particular, a lack of mathematical and scientific competency in a philosopher should be considered a crippling disability, comparable to a lack of logical competency.
A perfect example of the lack of ambition and imagination in modern philosophy is provided by Andrew Kania's survey of the philosophy of music. The key phrase in this account can be found in the preamble, where Kania states that the work considered is "in an analytic vein." This, amongst other things, is philosophic code for "there will be no discussion of scientific research here." The ensuing discussion therefore makes no reference either to neuroscience or biological evolution. Vital issues such as the purported universality of musical appreciation, or the emotions evoked by some music, can only be fully understood by integrating conceptual discussion with evidence from neuroscience and cognitive evolution theory.
Which parts of the brain are activated during the production and appreciation of music? What types of interaction occur between the cerebrum (the part of the brain responsible for conscious thought), the cerebellum (the part of the brain responsible for unconscious, 'second-nature' behaviour), and the amygdala (the 'levers' of the emotions)? When music is emotionally ambiguous or neutral, but nevertheless evokes an aesthetic appreciation in the listener, which parts of the brain are then activated? How do such patterns of brain activity help us to understand music, if at all? All the scientific theory and evidence here needs to be incorporated into the philosophical discussion if the philosophy is to be of any real interest or relevance.
What role, if any, does music play from the perspective of biological evolution? What light, if any, can evolution throw upon the ontology of music, and the emotions sometimes expressed by music? Consider the following argument by John Barrow: "neither musical appreciation, nor any dexterous facility for musical performance, is shared by people so widely, or at a high level of competence, in the way that linguistic abilities are. In such circumstance it is hard to believe that musical abilities are genetically programmed into the brain in the way that linguistic abilities appear to be. The variations in our ability to produce and respond to music are far too great for musical ability to be an essential evolutionary adaptation. Such diversity is more likely to arise if musical appreciation is a by-product of mental abilities that were adaptively evolved primarily for other purposes. Unlike language, music is something that our ancestors could live without," (The Artful Universe, p196). Is this true? Kania's survey does not enable one to access such discussions, because such discussions are not within its remit.
There is a need for broadly-educated philosophers, with sufficient will and courage to look beyond their careerist aspirations, and to write genuinely unifying, all-embracing philosophy; work which cannot be published because it doesn't fall into the narrow domain of any particular journal; work which doesn't, therefore, enable those philosophers to gain promotion by increasing their citations ranking.
Am I asking too much?
Thursday, November 08, 2007
Can lobsters feel pain?
Robert Elwood, of Queen's University, Belfast, has announced research which, he argues, demonstrates that prawns, and other crustaceans such as lobsters, can feel pain. Surprisingly, this research, to be published in Animal Behaviour, didn't involve a detailed analysis of the neurology of crustaceans, but, rather, involved daubing an irritant, acetic acid, onto one of the two antennae of each of 144 prawns. Immediately, the creatures began grooming and rubbing the affected antenna for up to 5 minutes. Elwood argues that "the prolonged, specifically directed rubbing and grooming is consistent with an interpretation of pain experience."
Elwood, however, is conflating a controlled response to a potentially damaging stimulus, with the conscious experience or feeling of pain.
Elwood's use of the phrase "consistent with an interpretation of pain experience" is crucial here. This is a much weaker assertion than the claim that something has been observed which provides evidence in favour of a pain experience. A controlled response to a potentially damaging stimulus does not entail that pain is experienced. Even a single cell can respond to a potentially damaging stimulus, hence the observations made by Elwood and his colleagues are also consistent with the absence of experienced pain. If the observed behaviour is consistent with both the presence and absence of experienced pain, then it cannot constitute evidence to support the hypothesis that crustaceans are capable of experiencing pain.
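The logical point can be made precise in Bayesian terms: an observation supports a hypothesis only insofar as it is more probable under that hypothesis than under its negation. A toy calculation (the numbers are invented purely for illustration):

```python
# Toy Bayesian update. An observation shifts belief in a hypothesis only
# insofar as its likelihood under the hypothesis differs from its likelihood
# under the negation of the hypothesis.
def posterior(prior, likelihood_h, likelihood_not_h):
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

prior = 0.5  # initial credence that crustaceans feel pain

# If the rubbing behaviour is equally probable whether or not pain is felt,
# observing it leaves the credence exactly where it started.
print(posterior(prior, 0.9, 0.9))  # -> 0.5

# Only an observation more probable under one hypothesis would shift belief.
print(posterior(prior, 0.9, 0.3))  # -> 0.75
```

Behaviour consistent with both hypotheses corresponds to a likelihood ratio of 1, and a likelihood ratio of 1 moves the posterior not at all.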
Tuesday, November 06, 2007
What is a theory in physics?
In terms of mathematical logic, a theory is a set of sentences, in some language, which is 'closed' under logical entailment. In other words, any sentence which is entailed by a subset of sentences from the theory, is itself already an element of the theory. Now, physicists mean something slightly different from this when they refer to something as a theory; a theory in physics is more akin to a class of 'models'.
In this context, a model for a set of sentences is an 'interpretation' of the language in which those sentences are expressed, which renders each sentence as true. An interpretation of a language identifies the domain over which the variables in the language range; it identifies the elements in the domain which correspond to the constants in the language; it identifies which elements in the domain possess the predicates in the language, which n-tuples of elements are related by the n-ary relations in the language, and which elements in the domain result from performing n-ary operations upon n-tuples in the domain.
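The notion of an interpretation can be made concrete with a deliberately tiny example (an illustrative Python sketch, far short of the full machinery of first-order semantics; the domain, relation and sentences are invented for the purpose):

```python
# A toy interpretation of a tiny language with one binary relation symbol R.
# The interpretation supplies a domain, and says which pairs stand in R.
domain = {0, 1, 2}
R = {(0, 1), (1, 0), (2, 2)}

def is_symmetric(domain, R):
    # 'forall x, y: R(x, y) implies R(y, x)'
    return all((y, x) in R for (x, y) in R)

def is_reflexive(domain, R):
    # 'forall x: R(x, x)'
    return all((x, x) in R for x in domain)

# This interpretation is a model of the first sentence but not the second.
print(is_symmetric(domain, R), is_reflexive(domain, R))  # -> True False
```

The set of all sentences this interpretation renders true, closed under entailment, would be a theory in the logician's sense; the physicist, by contrast, starts from the laws and asks which models satisfy them.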
Each theory in mathematical physics has a class of models associated with it. As the philosopher of physics John Earman puts it, "a practitioner of mathematical physics is concerned with a certain mathematical structure and an associated set M of models with this structure. The...laws L of physics pick out a distinguished sub-class of models,...the models satisfying the laws L (or in more colorful, if misleading, language, the models that 'obey' the laws L)."
The laws which define a class of mathematical models, therefore define a theory as far as physicists are concerned. If one retains the same general class of mathematical structure, but one changes the laws imposed upon it, then one obtains a different theory. Thus, for example, whilst general relativity represents space-time as a 4-dimensional Lorentzian manifold, if one changes the laws imposed by general relativity upon a Lorentzian manifold, (the Einstein field equations), then one obtains a different theory.
Physicists find that, at a classical level, the equations of a theory can be economically specified by something called a Lagrangian, hence physicists tend to identify a theory with its Lagrangian. In superstring theory, there are five candidate theories precisely because there are five candidate Lagrangians. This point is particularly crucial because it also explains why physicists associate different theories with different 'vacua'.
The Lagrangians of particle physics typically contain scalar fields, such as the Higgs field postulated to exist by the unified electroweak theory. These scalar fields appear in certain terms of the Lagrangian. The scalar fields have certain values which constitute minima of their respective potential energy functions, and such minima are called vacuum states (or ground states). If one assumes that in the current universe such scalar fields reside in a vacuum state (as the consequence of a process called symmetry breaking), then the form of the Lagrangian changes to specify this special case. After symmetry breaking, the Lagrangian is not the Lagrangian of the fundamental theory, but an 'effective' Lagrangian. Hence, the selection of a vacuum state changes the form of the Lagrangian, and because a Lagrangian defines a theory, the selection of a vacuum state for a scalar field is seen to define the selection of a theory. Physicists therefore tend to talk, interchangeably, about the number of possible vacua, and the number of possible theories in string theory.
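The simplest worked example of this (a standard textbook illustration, not taken from the post, and with arbitrary parameter values) is a single real scalar field with the 'Mexican hat' potential V(φ) = −μ²φ² + λφ⁴. Setting dV/dφ = 0 gives the unstable symmetric point φ = 0 and the degenerate minima φ = ±√(μ²/2λ); the minima are the vacuum states, and choosing one of them breaks the φ → −φ symmetry.

```python
# The vacua of V(phi) = -mu^2 phi^2 + lambda phi^4.
# dV/dphi = -2 mu^2 phi + 4 lambda phi^3 = 0  gives
# phi = 0 (unstable) or phi = ±sqrt(mu^2 / (2 lambda)) (the vacua).

import math

def potential(phi, mu2=1.0, lam=0.25):
    """V(phi) = -mu^2 phi^2 + lambda phi^4 (illustrative parameter values)."""
    return -mu2 * phi**2 + lam * phi**4

def vacuum(mu2=1.0, lam=0.25):
    """The positive vacuum expectation value, sqrt(mu^2 / (2 lambda))."""
    return math.sqrt(mu2 / (2 * lam))

v = vacuum()  # with mu^2 = 1, lambda = 0.25 this is sqrt(2)

# The vacua lie below the symmetric point phi = 0...
assert potential(v) < potential(0.0)

# ...and are genuine local minima: nearby values of phi cost energy.
eps = 1e-4
assert potential(v + eps) >= potential(v)
assert potential(v - eps) >= potential(v)
```

Expanding the field about the chosen minimum, φ = v + h, and rewriting the Lagrangian in terms of the fluctuation h is precisely what produces the 'effective' Lagrangian referred to above.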
Monday, November 05, 2007
How to solve global warming
Anthropogenic global warming is caused by the emission of greenhouse gases such as carbon dioxide and methane. The Earth re-radiates energy from the Sun as infrared radiation, and greenhouse gases such as carbon dioxide and methane absorb infrared radiation, hence the temperature of the atmosphere will increase if the atmospheric concentration of carbon dioxide and methane increases.
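The mechanism can be made quantitative with the standard one-layer greenhouse model (a textbook sketch of my own, not part of the original post): the atmosphere absorbs a fraction ε of the surface's outgoing infrared and re-radiates half of it back down, so the surface temperature rises as ε increases.

```python
# One-layer greenhouse model: the surface warms as the atmosphere
# absorbs a larger fraction (epsilon) of outgoing infrared radiation.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3       # fraction of incoming sunlight reflected back to space

def effective_temperature():
    """Equilibrium temperature with no infrared-absorbing atmosphere:
    sigma T_e^4 = S0 (1 - albedo) / 4, giving roughly 255 K."""
    return (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

def surface_temperature(epsilon):
    """Surface temperature when the atmosphere absorbs a fraction epsilon
    of the surface's infrared emission and re-radiates half of it back
    down: T_s^4 = T_e^4 / (1 - epsilon / 2)."""
    return effective_temperature() / (1 - epsilon / 2) ** 0.25

# More infrared absorption (more greenhouse gas) -> warmer surface:
assert surface_temperature(0.8) > surface_temperature(0.7)

# With no infrared absorption the surface sits at the bare
# effective temperature (~255 K):
assert abs(surface_temperature(0.0) - effective_temperature()) < 1e-9
```

Raising the surface albedo lowers the incoming side of this balance, which is why the 'paint it white' proposal discussed below works at all.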
Most proposed solutions to global warming suggest either a reduction in the anthropogenic emission of greenhouse gases, or various technological schemes for the removal of greenhouse gases from the atmosphere.
This, however, is not the correct way to approach the problem. A clue to the correct approach can be found by looking at another solution, which proposes increasing the reflectivity ('albedo') of the Earth's surface by making as much of it white as possible. This proposal works because incoming radiation at visible wavelengths is reflected back into space at the same visible wavelengths, thereby avoiding absorption by greenhouse gases.
I propose, then, that rather than looking at greenhouse gases such as carbon dioxide as the problem, it is the production of infrared radiation by the Earth which is the problem to be solved. If one could release a compound en masse, either into the atmosphere, or deposited upon the surface of the Earth, which absorbs infrared radiation and re-emits it at visible wavelengths, then the radiation emitted by the Earth will pass unhindered through the greenhouse gases into space.
This requires a so-called 'Anti-Stokes' material: "When a phosphor or other luminescent material emits light, in general, it emits light according to Stokes' Law, which provides that the wavelength of the fluorescent or emitted light is always greater than the wavelength of the exciting radiation...Anti-Stokes materials typically absorb infrared radiation in the range of about 700 to about 1300 nm, and emit in the visible spectrum." A variety of Anti-Stokes phosphors, based on yttrium, exist for the conversion of infrared radiation into visible radiation.
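One immediate consequence of anti-Stokes emission is worth making explicit (a back-of-the-envelope check of my own, with illustrative wavelengths): a visible photon carries more energy than an infrared photon, so a phosphor converting infrared to visible light cannot do so photon-for-photon; real upconversion phosphors absorb two or more infrared photons per emitted visible photon.

```python
# Photon energies via E = h c / lambda, comparing a typical infrared
# pump wavelength (980 nm) with green emission (550 nm).

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m s^-1

def photon_energy(wavelength_m):
    """Energy of a single photon: E = h c / lambda."""
    return H * C / wavelength_m

ir = photon_energy(980e-9)       # infrared pump photon
visible = photon_energy(550e-9)  # green emitted photon

# One visible photon is more energetic than one infrared photon...
assert visible > ir

# ...but less energetic than two, so absorbing two infrared photons
# per visible photon balances the energy books:
assert visible < 2 * ir
```

This is the sense in which anti-Stokes materials evade Stokes' Law: the emitted wavelength is shorter than the exciting wavelength, at the price of multi-photon absorption.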
Intriguingly, lanthanum hexaboride is already being used on a trial basis in office windows to absorb all but 5% of the incident infrared radiation...