Despite the proposal of several variable speed of light cosmologies in recent years, the very notion that the speed of light (in vacuum) is logically capable of variation is highly controversial within the physics community. In fact, the question can be generalised to include the purported variation of the three fundamental dimensional 'constants' of physics: the speed of light c, Planck's constant ℏ, and Newton's gravitational constant G.
c, ℏ, and G are dimensional constants in the sense that they possess physical dimensions, and their values must be expressed relative to a choice of physical units. Note in this context that there are three fundamental physical dimensions: length [L], time [T], and mass [M]. Each physical quantity has dimensions given by some combination of powers of these fundamental dimensions, and each value of a physical quantity is expressed as a multiple of some chosen unit of those dimensions. The speed of light has dimensions [L][T]⁻¹, and in CGS (Centimetre-Gramme-Second) units has the value c ≈ 3 × 10¹⁰ cm/s; Planck's constant has the value ℏ ≈ 10⁻²⁷ g cm² s⁻¹ in CGS units; and Newton's gravitational constant has the value G ≈ 6.67 × 10⁻⁸ cm³ g⁻¹ s⁻² in CGS units.
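To make the bookkeeping concrete, here is a minimal Python sketch (the rounded values and the names are mine, not any standard library) which records each constant together with its dimension exponents (n, m, p) for [L], [T], [M], and rescales its value from CGS to SI units:

# Each constant: (CGS value, (n, m, p)) for dimensions L^n T^m M^p
CONSTANTS = {
    "c":    (3.0e10,  (1, -1, 0)),    # cm s^-1
    "hbar": (1.0e-27, (2, -1, 1)),    # g cm^2 s^-1
    "G":    (6.67e-8, (3, -2, -1)),   # cm^3 g^-1 s^-2
}

def cgs_to_si(value, dims):
    n, m, p = dims
    # cm -> m is a factor 0.01, s -> s is 1, g -> kg is 0.001
    return value * 0.01**n * 1.0**m * 0.001**p

for name, (value, dims) in CONSTANTS.items():
    print(name, cgs_to_si(value, dims))
# c ~ 3.0e8 m/s, hbar ~ 1.0e-34 J s, G ~ 6.67e-11 m^3 kg^-1 s^-2

The same three constants, expressed in different units, take different numerical values; nothing physical has changed.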
The laws of physics define the necessary relationships between dimensional quantities. The values of these quantities are variable even within a fixed system of units, hence the lawlike equations can be said to define the relationships between dimensional variables. Nevertheless, the laws of physics also contain dimensional constants. In particular, the fundamental equations of relativity and quantum theory, such as the Einstein field equations, the Maxwell equations, the Schrödinger equation, and the Dirac equation, contain the fundamental dimensional constants c, ℏ, and G.
Ultimately, dimensional constants are necessary in equations which express the possible relationships between physical variables, because the dimensional constants convert the units on one side of the equation into the units on the other side. As an example, consider the most famous equation in physics, E = mc². This equation can be seen as expressing a necessary relationship between the energy-values and mass-values of a system. In CGS units the energy is in ergs, where an erg is defined to equal one g cm²/s², and the mass is in grammes. To convert the units of the quantity on the right-hand side of the equation into the same units as the quantity on the left-hand side, the mass is multiplied by the square of the speed of light in vacuum, which has units of cm²/s². One might argue that the reason why the (square root of the) conversion factor should be ≈ 3 × 10¹⁰ in CGS units, rather than any other number, follows from the definition of the cm and the s. Like all dimensional quantities, the values of fundamental constants such as c change under a change of physical units.
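As a one-line numerical illustration of the conversion role played by c² (a sketch using the rounded CGS value above): the rest-mass energy of one gramme.

m = 1.0          # mass in grammes
c = 3.0e10       # speed of light in cm/s
E = m * c**2     # g cm^2/s^2, i.e. ergs
print(E)         # 9e20 ergs: c**2 supplies the cm^2/s^2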
Intriguingly, the fundamental dimensional constants can also, heuristically at least, be used to express the limiting relationships between fundamental theories. Thus, classical physics is often said to be the limit of quantum physics in which Planck's constant ℏ → 0, and non-relativistic physics is often said to be the limit of relativistic physics in which the speed of light in vacuum c → ∞. The flip side of this coin is that ℏ sets the scale at which quantum effects become relevant, in the sense that a system with action A is a quantum system if the dimensionless ratio A/ℏ is small, and a classical system if the ratio is large. As ℏ → 0, A/ℏ becomes large even for very small systems, hence classical physics is said to be the limit of quantum physics in which ℏ → 0. Similarly, c sets the speeds at which relativistic effects become relevant, in the sense that a system with speed v is relativistic if the dimensionless ratio v/c is close to 1, and non-relativistic if the ratio is a small fraction. As c → ∞, v/c becomes a small fraction even for very fast systems, hence non-relativistic physics is said to be the limit of relativistic physics in which c → ∞. In a similar manner, G sets the scale of gravitational forces, and determines whether a system is gravitational or not.
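The criterion is easy to state numerically. A sketch with purely illustrative figures (the example systems and their magnitudes are my own rough choices):

hbar = 1.0e-27     # erg s (CGS)
c = 3.0e10         # cm/s

# A swinging pendulum has an action of very roughly 1e5 erg s
print(1e5 / hbar)          # ~1e32: A/hbar is enormous, so classical
# An electron orbiting in an atom has an action of order hbar
print(1e-27 / hbar)        # ~1: quantum

v_car = 3.0e3              # a road car at 30 m/s, in cm/s
print(v_car / c)           # ~1e-7: non-relativistic
v_particle = 0.999 * c     # an accelerator-grade speed
print(v_particle / c)      # ~1: relativistic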
Duff (2002), however, argues that no objective meaning can be attached to variation in the values of the dimensional constants. According to Duff, "the number and values of dimensional constants, such as ℏ, c, G, e, k etc, are quite arbitrary human conventions. Their job is merely to convert from one system of units to another...the statement that c = 3 × 10⁸ m/s, has no more content than saying how we convert from one human construct (the meter) to another (the second)."
To understand Duff's point, consider 'geometrized' units, in which the speed of light is used to convert units of time into units of length. Thus, for example, c · s is a unit of length defined to equal the distance light travels in a second. If time is measured in units of length, then all velocities are converted from quantities with the dimensions [L][T]⁻¹ to dimensionless quantities, and in particular, the speed of light acquires the dimensionless value c = 1. In geometrized units, anything which has a speed v less than the speed of light has a speed in the range 0 ≤ v_geo < 1:
v_geo = v_cgs / c_cgs.
Thus, the speed of light can be used to convert velocities between geometrized and CGS units as follows:
v_cgs = v_geo · c_cgs.
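In code form (a trivial sketch, using the rounded CGS value of c):

c_cgs = 3.0e10                       # cm/s

def to_geometrized(v_cgs):
    return v_cgs / c_cgs             # dimensionless; 0 <= v_geo < 1 for v < c

def to_cgs(v_geo):
    return v_geo * c_cgs             # back to cm/s

print(to_geometrized(3.0e9))         # 0.1
print(to_cgs(0.1))                   # 3.0e9 cm/s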
Similarly, in geometrized units, the gravitational constant G converts units of mass to units of length. In fact, in geometrized units, all quantities have some power of length as their dimensions. In general, a quantity with dimensions LⁿTᵐMᵖ in conventional units acquires dimensions Lⁿ⁺ᵐ⁺ᵖ in geometrized units, after conversion via the factor cᵐ(G/c²)ᵖ (Wald, General Relativity, 1984, p470).
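Wald's conversion rule is a one-liner. In the sketch below (the function name is mine), applying it to the mass of the Sun, roughly 2 × 10³³ g, returns the familiar geometrized solar mass of about 1.5 × 10⁵ cm, i.e. about 1.5 km:

c = 3.0e10      # cm/s
G = 6.67e-8     # cm^3 g^-1 s^-2

def geometrize(value, n, m, p):
    # L^n T^m M^p  ->  L^(n+m+p), via the factor c^m (G/c^2)^p
    return value * c**m * (G / c**2)**p

# A mass has (n, m, p) = (0, 0, 1); a velocity has (1, -1, 0)
print(geometrize(2.0e33, 0, 0, 1))   # ~1.5e5 cm
print(geometrize(3.0e9, 1, -1, 0))   # 0.1, recovering the velocity rule above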
Whilst in geometrized units c = G = 1, if one changes to so-called 'natural units' (such as Planck units), then ℏ = c = G = 1, and these constants disappear from the fundamental equations. Theories expressed in these natural units provide a non-dimensional formulation of the theory, and the dimensional variables become dimensionless variables in this formulation. Duff, for example, points out that "any theory may be cast into a form in which no dimensional quantities ever appear either in the equations themselves or in their solutions," (2002, p5). Whilst in geometrized units all quantities have dimensions of some power of length [L]ⁿ, in Planck units all quantities are dimensionless, as a result of division by lₚⁿ, the n-th power of the Planck length lₚ = √(Gℏ/c³) ≈ 1.616 × 10⁻³³ cm. In particular, in natural units all lengths are dimensionless multiples of the Planck length.
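The Planck scales follow directly from the three constants; a quick numerical check with rounded CGS values:

from math import sqrt

c, hbar, G = 3.0e10, 1.05e-27, 6.67e-8   # CGS

l_P = sqrt(G * hbar / c**3)    # Planck length
t_P = l_P / c                  # Planck time
m_P = sqrt(hbar * c / G)       # Planck mass

print(l_P)    # ~1.6e-33 cm
print(t_P)    # ~5.4e-44 s
print(m_P)    # ~2.2e-5 g

# Any length is then quoted as the dimensionless ratio length / l_P
print(1.0 / l_P)               # 1 cm is ~6e32 Planck lengths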
The existence of theoretical formulations in which the dimensional constants disappear is held to be one of the reasons why a postulated variation in the values of the dimensional constants cannot be well-defined. Different choices of units certainly result in different formulations of a theory, and the dimensional constants can indeed be eliminated by a judicious choice of units. Nevertheless, it should be noted that the most general formulation of a theory and its equations is the one which contains the symbols denoting the dimensional constants as well as the symbols denoting the dimensional variables.
Whilst the arguments recounted above are to the effect that variations in the fundamental dimensional constants cannot be well-defined, these arguments are also often conflated or conjoined with arguments that such changes are not operationally meaningful. Here, it is argued that a change in a dimensional constant cannot be measured because there is no way of discriminating it from a change in the units of which that constant is a multiple. For example, if the length of a physical bar, stored at a metrological standards institute, is used to define the unit of length, one might try to measure a change in the speed of light from a change in the time taken for light to travel such a length. In such a scenario, it could be argued that it is the length of the bar which has changed, not the speed of light.
Whilst it is indeed true that a change in the value of a dimensional variable could be explained by a change in one's standard units, this is a truth which applies to the measurement of dimensional variables, just as much as it applies to the measurement of dimensional constants. The logical conclusion of this line of argument is that only dimensionless ratios of dimensional quantities can be determined by measurements; individual lengths, times and masses cannot be determined, only ratios of lengths, ratios of times, and ratios of masses. Duff duly follows this line of reasoning to its logical conclusion, asserting that "experiments measure only dimensionless quantities," (2002, p5).
However, unless the dimensional quantity being measured is itself used to define the units in which the quantity is expressed, the question of whether one can discriminate a change in a dimensional quantity from a change in the units of which that quantity is a multiple is an empirical-epistemological question rather than an ontological question. Whilst the value of a dimensional constant does indeed change under a change of units, so does the value of a dimensional variable, and there is no reason to infer from this that a dimensional variable is merely a human construct. For example, the rest-mass energy of a system changes under a change from MeV to keV, but this is no reason to conclude that rest-mass energy is a human construct. Hence, the question of operational meaning may be something of a red herring.
It is, however, certainly true that the units of time and length can themselves be defined as functions of the fundamental dimensional constants. Thus, the standard unit of time is defined in terms of the frequency ν of the hyperfine transition between ground-state energy levels of caesium-133 atoms, which scales with the fundamental constants (up to dimensionless factors) as:
ν = m_e² e⁸ / (m_N c² ℏ⁵) ≡ T⁻¹,
where e is the charge of the electron, mN is the mass of the neutron, and me is the mass of the electron. The period of any cyclic phenomenon is the reciprocal of the frequency, 1/ν, and in 1967 the second was defined in the International System (SI) of units to consist of 9,192,631,770 such periods.
From 1960 until 1983, the SI metre was defined to be 1,650,763.73 wavelengths (in vacuum) of the orange-red emission line of krypton-86. Such a wavelength standard is determined by the Rydberg constant R∞:
4π R∞ = m_e e⁴ / (c ℏ³) ≡ L⁻¹.
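As a sanity check on these two combinations, one can evaluate them with CGS values (a sketch; the hyperfine formula omits dimensionless factors such as the nuclear g-factor, so only its order of magnitude is expected to match):

m_e  = 9.11e-28    # g, electron mass
m_N  = 1.67e-24    # g, neutron mass
e    = 4.80e-10    # esu, electron charge
hbar = 1.05e-27    # erg s
c    = 3.0e10      # cm/s

nu = m_e**2 * e**8 / (m_N * c**2 * hbar**5)
print(nu)          # ~1.2e9 Hz, the same order as the 9.19e9 Hz caesium line

four_pi_R = m_e * e**4 / (c * hbar**3)
print(four_pi_R)   # ~1.4e6 cm^-1, i.e. 4 pi times R = 109,737 cm^-1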
As Barrow and Tipler comment, "if we adopt L and T as our standards of length and time then they are defined as constant. We could not measure any change in fundamental constants which are functions of L and T," (The Anthropic Cosmological Principle, 1986, p242). Since 1983 the metre has been defined in terms of the unit of time, the second, so that a metre is defined to be the distance travelled by light, in a vacuum, during 1/299 792 458 of a second. Such considerations led cosmologist George Ellis to claim in 2003 that "it is...not possible for the speed of light to vary, because it is the very basis of measuring distance."
Magueijo and Moffat (2007) acknowledge that if the unit of length is defined in such a manner, then the constancy of the speed of light is indeed a tautology. However, they then provide the following riposte: "An historical analogy may be of use here. Consider the acceleration of gravity, little g. This was thought to be a constant in Galileo’s time. One can almost hear the Ellis of the day stating that g cannot vary, because 'it has units and can always be defined to be constant'. The analogy to the present day relativity postulate that c is an absolute constant is applicable, for the most common method for measuring time in use in those days did place the constancy of g on the same footing as c nowadays. If one insists on defining the unit of time from the tick of a given pendulum clock, then the acceleration of gravity is indeed a constant by definition. Just like the modern speed of light c. And yet the Newtonian picture is that the acceleration of gravity varies."
Whilst there is considerable disagreement over whether the values of fundamental dimensional constants have any theoretical significance, there is a consensus that each different value of a fundamental dimensionless constant, such as the fine structure constant α = e²/ℏc, defines a different theory. The values of the dimensionless constants are, by definition, invariant under any change of units; they remain obstinately in the dimensionless formulation of a theory, and their values have to be set by observation and measurement. Dimensionless constants, however, are themselves merely functions of the dimensional constants, in which the dimensions of the units cancel. If the variation of dimensionless constants is meaningful, and if dimensionless constants are functions of the dimensional constants, then one might ask how variation in the former can be achieved without variation in the latter. As Magueijo (2003) comments: "If α is seen to vary one cannot say that all the dimensional parameters that make it up are constant. Something - e, ℏ, c, or a combination thereof - has to be varying. The choice amounts to fixing a system of units, but that choice has to be made...In the context of varying dimensionless constants, that choice translates into a statement on which dimensional constants are varying."
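The unit-invariance of α is easy to exhibit numerically. A sketch computing it first from Gaussian/CGS values, where α = e²/ℏc, and then from SI values, where the same constant reads e²/4πε₀ℏc:

from math import pi

# Gaussian/CGS units
e, hbar, c = 4.803e-10, 1.055e-27, 2.998e10
print(e**2 / (hbar * c))                    # ~7.3e-3, i.e. ~1/137

# SI units
e, hbar, c = 1.602e-19, 1.055e-34, 2.998e8
eps0 = 8.854e-12
print(e**2 / (4 * pi * eps0 * hbar * c))    # the same dimensionless number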
Sunday, August 16, 2009
Frentzen, Schumacher, Villeneuve and probabilities
At the Formula 1 World Championship finale in 1997, the two drivers fighting for the championship, Jacques Villeneuve and Michael Schumacher, both set exactly the same qualifying time, 1m 21.072. Not only that, but Villeneuve's team-mate, Heinz-Harald Frentzen, also set a time of exactly 1m 21.072. How improbable was that?
Well, not as improbable as one might think. Ian Stewart, writing with Jack Cohen and Terry Pratchett in The Science of Discworld (1999), points out that these three drivers could have been expected to set a time within a tenth of a second of each other. The timing resolution was a thousandth of a second, hence there were effectively a hundred different times available to each of these drivers. Thus, the probability of an individual driver setting a specific time, such as A = 1m 21.072, was:
P(A) = 1/100
The probability of a conjunction of independent events is obtained by multiplying the probabilities of the individual events, so, as a first stab, one might guess that the probability of all three drivers setting the same time is as follows:
P(A&B&C) = 1/100 * 1/100 * 1/100 = 1/1,000,000 (i.e., one in a million)
This, however, is the probability of all three drivers setting a specific time such as 1m 21.072. It is not the specific time which is relevant here, but the probability of all three drivers setting the same time. Once the first driver of the triumvirate had set a time of 1m 21.072, we want to know the conditional probability of the second and third drivers also setting that time. To obtain the conditional probability of A&B&C, given the occurrence of A, one simply divides the probability of A&B&C by the probability of A:
P(A&B&C)/P(A) = 1/10,000
So the probability of Frentzen, Schumacher and Villeneuve setting the same time to a thousandth of a second was one in ten thousand. Still a long shot, but hardly astronomical odds. There have been on the order of a thousand World Championship Grands Prix since the inauguration of the championship in 1950, and although timing to thousandths of a second is a more recent innovation, we can say that the probability of three drivers setting exactly the same time at some point across a thousand Grands Prix is roughly one in ten.
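Both figures are easy to confirm by simulation; a sketch in which each of three drivers draws one of 100 equally likely times:

import random

trials = 1_000_000
same = sum(
    1 for _ in range(trials)
    if len({random.randrange(100) for _ in range(3)}) == 1
)
print(same / trials)           # ~1e-4: one in ten thousand

# Chance of at least one such triple across ~1000 Grands Prix
print(1 - (1 - 1e-4)**1000)    # ~0.095: roughly one in ten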
In an ensemble consisting of ten separate planets within a galaxy, each hosting a thousand Grands Prix, with timing to thousandths of a second, one would expect there to be one planet on which three drivers set exactly the same time in at least one Grand Prix.
To expect that it happens in qualifying for the World Championship shoot-out? Well, once the existence of Bernie Ecclestone is factored into the equation, the improbability becomes much smaller again...
Saturday, August 15, 2009
Thought for the day
Hello. Many of us this weekend will be celebrating the return of another season of Premiership football. The colourfully-dressed faithful will gather in their thousands to cheer the glorious bravery and skill of our finest sporting heroes, dancing and twirling with dexterous abandon on hallowed rectangles of greensward. Many, of course, will begin the season with hopes of winning silverware or gaining promotion, only to find those hopes cruelly dashed by injuries, poor defending, and a total lack of creativity in midfield.
Whilst the Chelseas and Man. Uniteds of this world can buy their way to success, it is the managers and players of the less successful clubs to whom our compassion and attention should truly be directed. Liverpool, for one, begin the season without the talented Xabi Alonso in midfield, and their spiritual figurehead, Steven Gerrard, may find himself provoked and vilified by opposing fans, despite being cleared of affray during the summer months. And isn't this rather as Jesus must have felt, nailed to the Cross on Golgotha for committing no crime, as the crowds jeered him?
One of Gerrard's disciples, John Doran, certainly couldn't walk by on the other side of the road, as Manchester United fan Marcus McGee refused to change the music in that Southport nightclub last December. It is, of course, debatable whether hitting a Manchester United fan is a crime under English civil law, and as we enter this hopeful period, perhaps we should all try to show the same courage which Mr Doran demonstrated in that Lancashire nightclub, when he "could not resist following through with an elbow into [Mr McGee's] face". Admitting our errors, both to ourselves and others, is one of the most crucial challenges that we face in life, and if we could all aspire to Mr Gerrard's contrition and humility, then perhaps we can begin to find a little personal salvation as Gabby Logan and Richard Keys introduce those tough away fixtures and mouth-watering local derbies.
Tuesday, August 11, 2009
Not Michael Schumacher
This is not Michael Schumacher, this is Keke Rosberg at Spa in 1983.
Keke was a proper racing driver. If Keke was faced with the unappealing prospect of having to race a car without any prior testing (with possibly damaging consequences to his reputation), he wouldn't use a neck injury he'd suffered six months previously as an excuse to withdraw.
But then Schumacher is known to be a particularly slow healer. After Michael suffered a leg fracture in July of 1999, Ferrari harboured hopes that his then-teammate, Eddie Irvine, could still win the World Championship for the Scuderia. Sadly, Michael's tardy recuperative powers meant that he couldn't return to support Irvine's championship bid until October of that year, when Luca di Montezemolo had to effectively order him back into the car.
Sunday, August 09, 2009
Golf bags and nuclear reactors
It is a fact rarely recognised, but indubitably true, that golf bags share some remarkable design reciprocities with nuclear reactors.
Modern golf clubs tend to be carbon- or boron-shafted, and these materials have exactly the neutron-absorbing capabilities which oblige the designers of nuclear reactors to choose them as the material for the reactor control rods. Just as a golfer raises or lowers a club into or out of the golf bag, so the control rods are raised or lowered to control the temperature of a nuclear reactor.
One suspects that nuclear physicists could learn a thing or two from the Callaway engineers...
Robert Wright and the Evolution of God
Bryan Appleyard writes an appreciative review of Robert Wright's new book, The Evolution of God, in The Sunday Times, claiming that it constitutes "a scientifically based corrective to the absurd rhetoric of militant atheism."
However, the logic of Appleyard's review does little to substantiate this claim. Bryan firstly argues that "Even after 9/11, [the atheists] can’t prove [that religion is a bad thing] because, especially in the 20th century, non-religious nastiness was infinitely worse than religious." So, the argument here is as follows: non-religious nastiness in the 20th century was worse than religious nastiness in preceding centuries, hence religion isn't bad. That seems a rather perverse form of logic. One might conclude, instead, that religiously-inspired nastiness is a bad thing, and nastiness inspired in the 20th century by Marxism and fascism was also a bad thing.
Appleyard claims that "the persistence of religion in all human societies strongly suggests that, even in the most basic Darwinian terms, it has been good for us as a species." This is another logical error: the persistence of a behavioural characteristic in a reproducing species does not entail that it is beneficial to survival in itself, for it may simply be a by-product of other traits which do, in combination, make a net contribution to survival.
When Appleyard then turns to the heart of Wright's argument, we find what appears to be an attempt to equate the concept of God with some form of cosmic evolution:
What is clear, for Wright, is that there is an organising principle in the world and that this principle may well be materialistically explicable but it is, nonetheless, moral and progressive...Dawkins said Darwin [showed] how design arose through purely material means — evolution through natural selection is the 'blind watchmaker'. Wright says this misses the point. The point is not how the watch was designed but the fact that it is designed. Some process has led to its existence and it is that process that matters because the mechanism and purpose of the watch clearly make it different in kind from, say, rocks. Equally, humans also require a different type of explanation from rocks. It may be natural selection or it may be some innate force in the universe. Either way, it is reasonable to associate this force with morality and God.
On the basis of this, Wright's argument is simply the latest in a long line of attempts to define a pantheistic concept of God. In this case, God is equated with the physical process of cosmic evolution. Such a pantheistic concept of God is straightforwardly inconsistent with the notion of a transcendent, supernatural and personal God held by theistic religions such as Islam, Christianity and Judaism. Moreover, one also wonders how Wright manages to derive morality from the existence of evolution without committing the naturalistic fallacy, thereby deriving an 'ought' from an 'is'.
Even leaving aside the inconsistency with theistic religion, there are serious problems with any pantheistic proposal to equate God with cosmic evolution. The primary problem is that evolution by natural selection cannot meet the demand for irrevocable progress which such a variety of pantheism places upon it. In particular, the notion that evolution necessarily leads to ever-greater complexity is a myth. As Michael Le Page points out:
"Evolution often takes away rather than adding. For instance, cave fish lose their eyes, while parasites like tapeworms lose their guts. Such simplification might be much more widespread than realised. Some apparently primitive creatures are turning out to be the descendants of more complex creatures rather than their ancestors. For instance, it appears the ancestor of brainless starfish and sea urchins had a brain."
But most seriously for Wright's argument, in cosmic terms the growth of entropy will dominate, and as the universe tends towards thermodynamic equilibrium, the energy flows which presently permit the evolution of complexity and life, will subside and ultimately cease. The evolution of complexity and life is therefore, cosmically speaking, something of a transient phenomenon, and if atheists such as Dawkins have forced religious apologists to the point where God has to be equated with an ephemeral physical process, then it seems that the atheists really have won the argument convincingly.
Saturday, August 08, 2009
Robert P Crease and Grids of Disputation
The debate over the relationship between science and religion appears to generate an abundance of 2-by-2 matrices.
Robert P Crease provides an interesting example in Physics World, where he classifies the different approaches to the debate according to whether science and religion are treated as methodologies ('processes') or fixed sets of beliefs.
Crease himself subscribes to the bottom-right quadrant, arguing that "Humans...inherit imperfect patterns of behaviour, and a religious life is the response to the feeling that we can 'live better' than we do." This betrays the familiar religious ploy of attempting to conflate religious beliefs with moral and ethical beliefs. If the sole basis for Crease's adherence to religion is a feeling that we can live better than we do, then what he actually subscribes to is not science and religion, but science and ethics.
The conflict between science and religion is primarily a conflict between science and theistic religion, where "theism is the belief that there is an all-powerful, all-knowing perfectly good immaterial person who has created the world, has created human beings 'in his own image,' and to whom we owe worship, obedience and allegiance." There is no sense in which theistic religions can be treated as just a way of living.
Sean Carroll, meanwhile, defines a 'grid of disputation', and argues that atheistic scientific advocates should engage only with worthy opponents, rather than the crackpots (such as the creationist lobby in the US).
Carroll begs to differ, however, with Richard Dawkins, on the question of the best way to change people's minds. Dawkins argues: "I think we should probably abandon the irremediably religious precisely because that is what they are – irremediable. I am more interested in the fence-sitters who haven’t really considered the question very long or very carefully. And I think that they are likely to be swayed by a display of naked contempt."
As Carroll astutely points out, whether or not people choose to use ridicule or reasoned argument seems to depend, not upon any empirical evidence for which is the most effective technique, but upon "the mode to which they are personally temperamentally suited."
Friday, August 07, 2009
Muse - Uprising
OMG! It's Uprising, the first track to be released from the new MUSE album, The Resistance.
Muse's main muse, Matt Bellamy, describes it as "like a heavy-rock take on Goldfrapp"; in fact, it sounds not unadjacent to indie-funky-prog-rock-electronica to me.
The album also purportedly features something called slap bass, which is certainly something to look forward to, whatever it is.
BMW's Formula 1 failure
"The chassis on a BMW was a thing of beauty compared with other makes of the day...So we had the strategy of always emphasizing handling and driving pleasure. But it took us a few weeks to come up with a line, a slogan, to go with the idea."...The moment he wrote the famous line of ad copy doesn't stay with Puris. He just recalls working out a slew of fragmented words and phrases at his desk until one jumped off the page at him: 'The Ultimate Driving Machine'. (Martin Puris, speaking to David Kiley in Driven: Inside BMW, the Most Admired Car Company in the World).
More than just a slogan,...'The Ultimate Driving Machine' became a mission statement that informed everything the company did...Crucially,...it spoke directly to everything men love about driving: control, power, and the surging thrill they feel in the pit of their stomachs as they apply pressure to the accelerator. (Mark Tungate, Branded Male).
"Premium will increasingly be defined in terms of sustainability and environmental compatibility...In line with our Strategy Number ONE, we are continually reviewing all projects and initiatives to check them for future viability and sustainability. Our Formula 1 campaign is thus less a key promoter for us." (Dr. Norbert Reithofer, chairman of the BMW board, announcing BMW's withdrawal from Formula 1 after failing to win the World Championship in ten years of effort.)
Monday, August 03, 2009
Poisson processes and motorsport accidents
Mark Hughes argues in Autosport this week that there was no link between the F2 accident in which Henry Surtees was killed by a flying wheel, and the accident 6 days later in which F1 driver Felipe Massa suffered a fractured skull. By way of comparison, Hughes points out that there was barely any link between the five serious accidents which occurred over the Imola 1994 weekend, leaving two drivers, including Ayrton Senna, dead. "A full appreciation of randomness," writes Hughes, "means it would be bizarre if such clumps did not occasionally occur."
Mark is quite right about the Surtees and Massa accidents. In fact, this type of randomness can be mathematically characterised as a Poisson process. If events of a certain class occur randomly and independently of each other, at an average rate which is constant over time, then the number of such events which occur within a given length of time will have a Poisson distribution, and the time intervals which elapse between successive events will have an exponential distribution.
In a Poisson process, the inter-event times have a tendency to cluster because short inter-event times have a relatively high probability under the exponential distribution. The clustering of accidents in motorsport is just one particular case of this statistical phenomenon.
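A sketch of the clustering effect (the seed and the one-accident-per-season rate are purely illustrative): draw a large sample of exponential inter-event times and count how many gaps are shorter than half the mean. For an exponential distribution the expected fraction is 1 − e^(−1/2) ≈ 0.39, so short gaps ('clumps') are common even though the process is entirely memoryless:

import random
from math import exp

random.seed(1)
mean_gap = 1.0                  # say, one accident per season on average
gaps = [random.expovariate(1.0 / mean_gap) for _ in range(100_000)]

short = sum(1 for g in gaps if g < 0.5 * mean_gap)
print(short / len(gaps))        # ~0.39 of all gaps are under half the mean
print(1 - exp(-0.5))            # theoretical value: ~0.3935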
There is, however, an extra ingredient at play in the case of motorsport: anxiety. The presence of anxiety creates a degree of clustering which exceeds that which would be explicable by Poisson statistics alone. For example, the series of fatal or near-fatal F1 accidents which occurred in early 1994, after an eight-year hiatus, is a peak too great to be attributable to randomness. Recall that, not only were there five serious accidents over the Imola weekend itself, but two weeks later Karl Wendlinger had a near-fatal accident at Monaco, and shortly thereafter, Pedro Lamy was fortunate to escape with broken legs after cartwheeling into the spectator enclosure during private testing at Silverstone.
It is well-known that certain types of anxiety inhibit effective decision-making, hence it can be hypothesized that after the first fatalities of 1994, some of the drivers, and even their teams, entered a state of anxiety, which in itself provoked further errors and serious accidents. The anxiety hypothesized here is not of the overt variety, but rather a suppressed and hidden type, which only manifests itself in split-second judgements. The presence of accident clusters in motorsport is therefore the consequence of both randomness and human psychology.