Why is there something rather than nothing? What was the origin of the universe? What is the future of the universe? Is the universe finite or infinite? Is our universe the only logically possible universe? Are there other universes? What is the relationship between life and the universe?
These fundamental metaphysical questions have always exerted a fascination for mankind, but over the past hundred years physical cosmology has begun to provide scientific answers to these questions, answers formulated using well-defined mathematical concepts, and verified by astronomical observation.
In more recent decades, however, a sub-industry has developed which produces cosmological theories and models lacking any empirical substantiation. The audience for such highly speculative cosmology consists not only of physicists and cosmologists, but the public-at-large. There is a shop-front for this brand of cosmology, composed of popular science programmes on TV, books such as Stephen Hawking's Brief History of Time, and articles in magazines such as Scientific American and New Scientist.
There is a problem with the manner in which this type of cosmology is presented to the public, and it conforms with Ben Goldacre's characterisation of the way in which all science is treated by the media:
On this template, science is portrayed as groundless, incomprehensible, didactic truth statements from scientists, who themselves are socially powerful, arbitrary, unelected authority figures. They are detached from reality; they do work that is either wacky or dangerous, but either way, everything in science is tenuous, contradictory, probably going to change soon and, most ridiculously, 'hard to understand'. (Bad Science, p225).
Cosmology is presented in the media as being beyond the understanding of the public, and beyond their ability to engage in critical appraisal. The concepts and reasoning are rarely properly explained, and almost never subjected to critical analysis. In the case of speculative cosmology, the ideas are also presented in a credulous and philosophically naive manner.
As an antidote to this comes Ultimate Explanations of the Universe, the latest book by Michael Heller, Polish physicist, philosopher, priest, and winner of the million-pound 2008 Templeton Prize. The text offers a critical guide to all the big philosophical issues in cosmology: eternal universes, cyclic universes, the heat death of the universe, quantum creation out of nothing, inflation, the anthropic principle, cosmological natural selection, theories of everything, and multiverses. The prose is quite dense, but the book remains accessible to anyone with an interest in such ultimate questions, and will assist non-specialists seeking to assess the metaphysical credentials of modern cosmology.
Thursday, December 31, 2009
Thursday, December 24, 2009
Christmas with Thomas Covenant
Thomas Covenant the Unbeliever could feel the wild magic pulsing in his veins. He was the wielder of the white chocolate ring, and its argent fire had the power to destroy or redeem Christmas. This was the conundrum, the dilemma, the paradox.
The vitality of the pagan winter solstice, the seminal life-force and spirit of Christmas, had been corrupted, first by the bane of religious revelation, scripture and liturgy, then by the malefic puissance of commercialisation, and finally by the insidious constriction and life-draining encroachment of Health and Safety bureaucracy. The wild magic had the capability to destroy this suppurating putrefaction, but also to destroy the Arch of Time, releasing Lord Mandelson the Despiser from his chancrous demesne.
Covenant knew he had to wield the wild magic, but also knew that he couldn't control it. As rapacious commercial Ravers stalked the land, disseminating their incentives to gluttony and avarice, now came the final desecration: the conical, synthetic Poole-centre Christmas tree.
For a brief moment, Covenant thought he descried Lord Mandelson's carious yellow eyes from inside the tree. Instantly, his skin was enveloped in formication, and an eldritch nimbus formed around his body. As coruscations began to lance outwards, Covenant could feel the argent fire gathering itself, ready to blaze across the land in a conflagration of destruction. The Yuletide Log of Law had been broken, the Earthpower had been sundered from the people, and unless Thomas Covenant could find the point of balance, the Arch of Time itself was in peril.
Tuesday, December 22, 2009
The return of Dick Dastardly
So Michael Schumacher is coming out of retirement to race for Mercedes in 2010. After all the cheating and politics that have marred Formula One since Michael left the stage at the end of 2006, the return of such a pure and noble spirit will surely inject a breath of fresh air into the sport. And it's nice to see that Michael's neck has mended properly now that he'll be able to test the car he's going to race...
Much emphasis has been placed upon the fact that Michael will be re-united at Mercedes with Ross Brawn, his trusted friend and erstwhile Ferrari technical director. From a politically strategic perspective, however, Michael will also not be disadvantaged by the presence of Jean Todt as the newly-installed President of the FIA. Todt and Schumacher share a great deal of mutual respect, and given Todt's apparent grudge against Ferrari President Luca di Montezemolo, it doesn't take an excess of imagination to conceive that Schumacher and Mercedes may be given, let's say, the benefit of the doubt in any marginal technical or sporting decisions next year.
Still, with the prospect of Hamilton, Schumacher and Alonso going wheel-to-wheel in 2010, March 14th simply can't come around quickly enough.
Snow and Bayesian probability
I was due to deliver a presentation today on the application of Bayesian probability to measurement science. Sadly, that won't now be possible.
Still, I'm sure Blogger won't mind me using their resources instead. The basic idea is that there's a distinction between true values x and measured values y.
You start off with a prior probability distribution over the true values. You then have a likelihood function, which gives you the probability P(y|x) of measuring any value y given a hypothetical true value x.
When you perform an actual measurement, and obtain a particular measured value y, Bayes's theorem specifies a posterior distribution over the true values. This new distribution can then be set as the prior distribution for the next cycle, and so on. The Bayesian technique is simply a way of representing how knowledge changes and improves in the light of new evidence.
In the example represented graphically below, the prior distribution is the steep downward curve, a so-called inverse prior. This shows that prior to performing a measurement, the lowest values have the highest probability. (Technically, you need to truncate this distribution to ensure the total probability is equal to 1).
The graph represents the case where you perform an actual measurement, and find that the value is equal to 4 in whatever units are being used. The likelihood function P(y|x) in this example provides a normal distribution over the true values x for a fixed measured value y=4. Bayes's theorem then yields the posterior distribution, which in this case is also basically a normal distribution, but shifted slightly away from 4 towards 0, by the influence of the prior distribution.
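For anyone who would like to reproduce a figure along these lines, here is a minimal numerical sketch of the update described above. The specific numbers are illustrative assumptions rather than the values used in the original presentation: a 1/x prior truncated to the interval [0.1, 10], a Gaussian likelihood with standard deviation 1, and a measured value y = 4.

```python
import numpy as np

# Grid of hypothetical true values x, restricted to [0.1, 10] (assumed range).
x = np.linspace(0.1, 10.0, 1000)
dx = x[1] - x[0]

# Truncated "inverse" prior: p(x) proportional to 1/x, normalised so the
# total probability over the grid equals 1.
prior = 1.0 / x
prior /= prior.sum() * dx

# Gaussian likelihood P(y|x) for a measured value y = 4, with an assumed
# measurement standard deviation of 1.
y_measured = 4.0
sigma = 1.0
likelihood = np.exp(-0.5 * ((y_measured - x) / sigma) ** 2)

# Bayes's theorem: posterior proportional to likelihood times prior.
posterior = likelihood * prior
posterior /= posterior.sum() * dx

# The posterior peaks slightly below 4, pulled towards 0 by the prior.
print("posterior mode:", x[np.argmax(posterior)])
```

Setting the resulting posterior as the prior for the next measurement cycle is then a one-line change.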
Monday, December 21, 2009
Snow in North Hampshire and the Thames Valley
Moving a hundred yards in ninety minutes of gridlock, I abandon my car, and with a surge of adrenalin make for home on foot. A thick confetti of snow fills the air, and a fey light dwindles into a leaden darkness. Depleted of colour and reverberation, an eerie quiet smothers suburbia.
The chromatic uniformity is punctuated only by the twinkling hues of residential Christmas illuminations. Striding past up-market properties where town blends into countryside, I pass a couple of small replica reindeer in a front garden, their shapes traced by amber lights, exquisitely beautiful now in the deepening snowfield.
And then into the countryside, with no pavement, no light, and only the occasional car, rolling gingerly past in second gear, to disturb the sonic claustrophobia. I am totally at the mercy of these drivers. Hemmed in between the snow-laden hedgerows, there is no escape route; if one of these cars loses control, I will be smacked off my feet, my legs crushed, crumpled and shattered.
Finally, striding through drifts deeper than my shoes, I reach the farmhouse, and the light, warmth, sound and colour of home. All I need to do now, is do it all again tomorrow to retrieve my car...
Saturday, December 19, 2009
Snow and entropy
Snowfall not only evokes the predictable platitudinous admonitions to "only travel if it's absolutely necessary", but from a more recondite perspective, also partially preserves the entropy of light.
Snow, of course, is white in colour, so it reflects all the colours of the spectrum. In contrast, the normal landscape, consisting of greens and browns and greys, absorbs all the other colours of the spectrum, and in each case reflects just one visible component of the sunlight impinging upon it (the absorbed energy is re-radiated in the infrared). Snow therefore preserves the spectral information in sunlight which is otherwise destroyed by the natural landscape concealed beneath.
Moreover, snow which has frozen overnight will sparkle in sunlight the next day, consisting as it does of numerous ice crystals. In effect, the surface of the snow consists of millions of tiny mirrors, each oriented at a different angle. Each tiny mirror reflects sunlight in a different direction, from a unique position, so the snow sparkles as the observer moves about and the line of sight intersects the reflected sunlight from differently positioned and oriented crystal facets.
A mirrored surface is distinguished from a white, non-mirrored surface simply by virtue of the smoothness of the former. A smooth surface permits the specular reflection of light, in which the directional information in the light is preserved, whilst an irregular surface results in diffuse reflection, which destroys the directional information. A mirrored surface therefore prevents the entropy of light from increasing, by virtue of preserving both the spectral information and the directional information in the light. The blanket of sparkling white snow which covers a landscape therefore operates as a mechanism which slows the increase in the entropy of light.
But do wrap up warm, because the wind will make it feel even colder.
Thursday, December 17, 2009
Dark matter detected?
The Cryogenic Dark Matter Search (CDMS) project is due to announce, on Friday 18th December, the possible detection of two dark matter particles. However, the chance of a false positive here is estimated to be 23%, which is rather high compared to the threshold of 5% typically employed in measurement science, and extremely high compared to the threshold of 0.1% stipulated by the CDMS team.
The CDMS detector resides in an abandoned iron mine in Minnesota, (to minimise the background neutron radiation from cosmic rays), and consists of germanium discs cryogenically cooled to an extremely low temperature. The discs are coated with phonon sensors, designed to detect the tiny vibrations in such a crystalline solid. When the crystals are cooled to a very low temperature, the vibrational background of the solid is virtually eliminated, and it is possible to detect the phonons created by incoming particles colliding with the atoms in the solid.
These phonons, however, can be created by background gamma radiation colliding with the electrons in the solid, or by hypothetical dark matter particles colliding with the nuclei of the atoms. The two different processes create different patterns of ionisation, hence the detection of dark matter particles requires the simultaneous detection of the phonons and the ionisation in the crystals. (See Dark Side of the Universe, Iain Nicolson, p84-85).
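As an aside on the 23% figure quoted above: a false-positive probability of this kind is essentially the chance that the expected background alone fluctuates up to the observed number of events. Here is a minimal Poisson sketch of that calculation; the expected background of 0.9 events is a purely illustrative assumption, and the real CDMS background analysis is of course far more involved.

```python
import math

def false_positive_probability(expected_background, observed_count):
    """Poisson probability of seeing at least `observed_count` events from
    background alone, i.e. 1 minus the probability of seeing fewer."""
    p_fewer = sum(
        math.exp(-expected_background) * expected_background**k / math.factorial(k)
        for k in range(observed_count)
    )
    return 1.0 - p_fewer

# Hypothetical numbers, for illustration only:
print(false_positive_probability(expected_background=0.9, observed_count=2))
# ~0.23, i.e. roughly a one-in-four chance that background alone
# produces two or more events.
```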
Lewis Hamilton and high-speed corners
McLaren managing director Jonathan Neale recently made the following comment about the prospect of having Lewis Hamilton and Jenson ("I do the triathlon, you know") Button together in the same team:
"In some ways, it multiplies our opportunities in a grand prix too: there could well be tracks where Jenson's skill-set is better-suited to the challenge, and equally, tracks where Lewis could excel. In the past, Jenson has demonstrated considerable talent at high-speed circuits, and we're looking forward to building that into our arsenal."
This isn't quite equivalent to suggesting that Lewis Hamilton has a relative weakness on high-speed circuits, but it isn't far away from it. But is it a notion which holds up to analysis?
Firstly, by high-speed circuits, let us restrict attention to circuits with a decent proportion of high-speed corners, rather than merely circuits, such as Monza, with a high average speed. As Autosport's Mark Hughes has explained at length over some years, Hamilton's natural driving technique utilises oversteer to achieve a rapid change of direction. This is perfect for slow to medium speed corners, but less appropriate for high-speed corners, so the question is whether Lewis has a relative weakness in high-speed corners.
But is there any evidence to support this supposition? Well, let's have a look at Lewis's performance at Silverstone and Spa, the two circuits with the highest proportion of high-speed corners, which have been on the calendar in each of Lewis's three seasons in Formula One. In 2007, Hamilton was off the pace of the other McLaren and Ferrari drivers at Silverstone. "In particular he was struggling through Copse corner, the near-flat-in-seventh first turn taken at over 185mph. Some drivers were claiming to be able to take it without a lift of the throttle. But Lewis never did." (Lewis Hamilton - The Full Story, p199). And although Hamilton was closer to McLaren team-mate Alonso at Spa, it was a similar story. In the case of Spa, however, there are mitigating circumstances, because Hamilton failed to drive in the pre-race test session, thereby losing valuable set-up time. The 2007 McLaren also had a natural tendency to over-heat its tyres in fast corners, and probably had lower peak downforce than the 2007 Ferrari on this type of track.
At Silverstone in 2008, Hamilton qualified eight-tenths off team-mate and pole-winner Kovalainen. Moreover, in his attempts to redress the difference, he over-drove and went bouncing off the circuit at Priory. The race itself, however, was Lewis's greatest day, his rendition of Senna at Estoril in 1985, operating on a different plane from the other drivers in torrentially wet conditions. Unfortunately, those same conditions make the race invalid as an analysis of Lewis's high-speed cornering ability.
At Spa 2008, however, Lewis qualified on pole, and would have won comfortably but for a silly spin in the early damp conditions. He then spent the remainder of the race remorselessly hunting down Raikkonen, eroding a tenth here and a tenth there, so that he was perfectly placed to nail the Ferrari when drizzle began falling again in the late stages of the race. It was Lewis's rendition of Hakkinen, hunting down Schumacher at Spa in 2000, after a silly early spin in damp conditions.
It's difficult to derive anything from Silverstone and Spa 2009, for the McLaren was lacking consistent downforce in these conditions, and wholly uncompetitive. Lewis, however, drove a very untidy race at Silverstone, allowing Alonso to drive him onto the grass at one stage, and spinning under braking at Vale on another occasion. At Spa he was shunted out on the first lap.
There's also mixed evidence from other tracks on the calendar. Lewis's driving style over-heated the outside front tyre in the fast, multiple-apex Turn 8 in Turkey in 2007 and 2008, but he looked very quick overall on both occasions, and, crucially, was faster than Alonso at Istanbul in 2007. He also looked fabulous at Suzuka in 2009, in a car which was still lacking peak downforce relative to its rivals.
In conclusion, there is some evidence that Lewis has a relative weakness on tracks with high-speed corners, but it seems likely that this is only relative to his extraordinary ability in slow and medium speed corners. One also assumes that Hamilton is the type of driver capable of eradicating any weakness, relative or not.
"In some ways, it multiplies our opportunities in a grand prix too: there could well be tracks where Jenson's skill-set is better-suited to the challenge, and equally, tracks where Lewis could excel. In the past, Jenson has demonstrated considerable talent at high-speed circuits, and we're looking forward to building that into our arsenal."
This isn't quite equivalent to suggesting that Lewis Hamilton has a relative weakness on high-speed circuits, but it isn't far away from it. But is it a notion which holds up to analysis?
Firstly, by high-speed circuits, let us restrict attention to circuits with a decent proportion of high-speed corners, rather than merely circuits, such as Monza, with a high average speed. As Autosport's Mark Hughes has explained at length over some years, Hamilton's natural driving technique utilises oversteer to achieve a rapid change of direction. This is perfect for slow to medium speed corners, but less appropriate for high-speed corners, so the question is whether Lewis has a relative weakness in high-speed corners.
But is there any evidence to support this supposition? Well, let's have a look at Lewis's performance at Silverstone and Spa, the two circuits with the highest proportion of high-speed corners, which have been on the calendar in each of Lewis's three seasons in Formula One. In 2007, Hamilton was off the pace of the other McLaren and Ferrari drivers at Silverstone. "In particular he was struggling through Copse corner, the near-flat-in-seventh first turn taken at over 185mph. Some drivers were claiming to be able to take it without a lift of the throttle. But Lewis never did." (Lewis Hamilton - The Full Story, p199). And although Hamilton was closer to McLaren team-mate Alonso at Spa, it was a similar story. In the case of Spa, however, there are mitigating circumstances, because Hamilton failed to drive in the pre-race test session, thereby losing valuable set-up time. The 2007 McLaren also had a natural tendency to over-heat its tyres in fast corners, and probably had lower peak downforce than the 2007 Ferrari on this type of track.
At Silverstone in 2008, Hamilton qualified eight-tenths off team-mate and pole-winner Kovalainen. Moreover, in his attempts to redress the difference, he over-drove and went bouncing off the circuit at Priory. The race itself, however, was Lewis's greatest day, his rendition of Senna at Estoril in 1985, operating on a different plane from the other drivers in torrentially wet conditions. Unfortunately, those same conditions make the race invalid as an analysis of Lewis's high-speed cornering ability.
At Spa 2008, however, Lewis qualified on pole, and would have won comfortably but for a silly spin in the early damp conditions. He then spent the remainder of the race remorselessly hunting down Raikkonen, eroding a tenth here and a tenth there, so that he was perfectly placed to nail the Ferrari when drizzle began falling again in the late stages of the race. It was Lewis's rendition of Hakkinen, hunting down Schumacher at Spa in 2000, after a silly early spin in damp conditions.
It's difficult to derive anything from Silverstone and Spa 2009, for the McLaren was lacking consistent downforce in these conditions, and wholly uncompetitive. Lewis, however, drove a very untidy race at Silverstone, allowing Alonso to drive him onto the grass at one stage, and spinning under braking at Vale on another occasion. At Spa he was shunted out on the first lap.
There's also mixed evidence from other tracks on the calendar. Lewis's driving style over-heated the outside front tyre in the fast, multiple-apex Turn 8 at Turkey in 2007 and 2008, but he looked very quick overall on both occasions, and, crucially, was faster than Alonso at Instanbul in 2007. He also looked fabulous at Suzuka in 2009, in a car which was still lacking peak downforce relative to its rivals.
In conclusion, there is some evidence that Lewis has a relative weakness on tracks with high-speed corners, but it seems likely that this is only relative to his extraordinary ability in slow and medium speed corners. One also assumes that Hamilton is the type of driver capable of eradicating any weakness, relative or not.
Saturday, December 12, 2009
Stars like grains of sand
If you live in suburbia, then within a few hundred yards of your own home, there will most probably be a towering display of mankind's ability to harness natural energy. On a dark and damp winter's day, the power lines suspended between the beneficent arms of electricity pylons, fizz and crackle in the cold rain. It is a sound derived from a cascade of energy flows which starts with nuclear fusion in the Sun, proceeds via photosynthesis to the creation of metabolic fuels in plant-life, is geologically transformed into hydrocarbon deposits, which are combusted to produce steam, thus impelling turbines to rotate, creating the electric power flows, of which a small fraction is dissipated as sound energy close to your home.
The photograph above is an image of the centre of our galaxy, obtained by the UK's new infrared telescope, Vista. The centre of the galaxy is obscured at visible wavelengths by interstellar dust, Iain Nicolson reporting that "along the line of sight to the galactic centre, of each 10 billion photons that should be arriving from the galactic nucleus, only about one actually does so," (Unfolding our Universe, p198). An infrared telescope, however, is able to peer through this dust, and it's well worth clicking to expand the image, for there are a million stars in this field of view.
"Whereas a sphere 5 light-years in diameter centred on the Sun would contain only four stars (the three components of the Alpha Centauri system and the Sun itself), a similar sized sphere close to the centre of the nuclear bulge [at the centre of our galaxy] would contain about ten million stars...To an observer on a planet located deep within the nuclear bulge, the sky would be aglow with millions of stars...several hundred of which would be brighter than the full Moon," (ibid., p203).
All the stars in Vista's image represent the starting points for cascading energy flows, some of which may be powering the evolution of alien biospheres and civilisations, and ultimately enabling alien observers to hear the fizz and crackle from alien power lines held aloft by beneficent alien pylons.
Friday, December 11, 2009
Formula 1 and Nash equilibrium
A selection of strategies by a group of agents is said to be in a Nash equilibrium if each agent's strategy is a best-response to the strategies chosen by the other players. By best-response, we mean that no individual can improve her payoff by switching strategies unless at least one other individual switches strategies as well. This need not mean that the payoffs to each individual are optimal in a Nash equilibrium: indeed, one of the disturbing facts of the prisoner's dilemma is that the only Nash equilibrium of the game, when both agents defect, is suboptimal. (Evolutionary Game Theory, Stanford Encyclopedia of Philosophy).
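Before applying the definition to Formula 1, it may help to see it in the concrete. The following sketch checks every strategy profile of a standard prisoner's dilemma for Nash equilibrium; the payoff numbers are the usual textbook choice rather than anything drawn from the quotation above.

```python
from itertools import product

# Standard prisoner's dilemma payoffs (row player, column player):
# C = cooperate, D = defect.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}
STRATEGIES = ("C", "D")

def is_nash_equilibrium(profile):
    """A profile is a Nash equilibrium if neither player can raise their
    own payoff by unilaterally switching strategy."""
    s1, s2 = profile
    p1, p2 = PAYOFFS[profile]
    best_for_1 = all(PAYOFFS[(alt, s2)][0] <= p1 for alt in STRATEGIES)
    best_for_2 = all(PAYOFFS[(s1, alt)][1] <= p2 for alt in STRATEGIES)
    return best_for_1 and best_for_2

for profile in product(STRATEGIES, repeat=2):
    print(profile, "Nash equilibrium" if is_nash_equilibrium(profile) else "-")
# Only ("D", "D") passes, even though ("C", "C") pays both players more:
# the unique Nash equilibrium is suboptimal, as the quotation notes.
```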
The technological arms race in Formula 1 now appears to be dead. The technical regulations have been converging to a control formula for some years, with little opportunity for variation in engine or chassis design, and in the light of the limited financial resources now available to the surviving F1 teams, their optimal strategies include voluntary mutual agreements to limit expenditure on testing, research and development. The teams have even agreed to a common car launch in Valencia next year to save costs.
In the lexicon of game theory, the constraints upon the game players have changed, with the consequence that one state of Nash equilibrium (the arms race) has been supplanted by a new state of Nash equilibrium.
This new state of Nash equilibrium is highly preferable for those who own the commercial rights to Formula 1, (currently the private equity company CVC). Formula 1 has now essentially become a package of revenue streams: television broadcasting rights, circuit hosting rights, circuit advertising, and paddock hospitality. The arms race was bad for the commercial owners of F1 because it made for larger performance gaps between the cars, and therefore a less attractive product to sell to the television companies. It also meant that the manufacturers funding that arms race developed an interest in acquiring a larger stake of the commercial revenues generated by the sport, and a greater influence over its governance. With the shrivelling influence and spending capacity of the remaining manufacturer teams, that threat to the business plan of Formula 1 has been eliminated.
Thursday, December 10, 2009
Tiger Woods and Lewis Hamilton
"My overriding feeling about Lewis is one of sadness for a young man who has, one way or another, let himself down very badly and done something which will never be erased from the record...The Tiger Woods parallel has always been a frame of reference with Lewis but for different reasons now. You see him everywhere in those big posters at the airport (with dotted white lines and percentage ratings of this or that) or on television. He was so close to imposing his will on all those boring golfers at the top of the leaderboard at the Masters this weekend. I can imagine Lewis watching him and feeling the same sadness that I do. Here is the guy Lewis wanted to be. A worldwide icon and superstar, leading by example and unsullied by ill-deeds, a sponsor's dream and someone who can make an impact - the right impact - almost anywhere he goes and whatever he does." (Edward Gorman, April 13th 2009).
Tuesday, December 08, 2009
PISN into the stellar wind
Whilst the lighter chemical elements found on the Earth, such as carbon and oxygen, are produced by nuclear fusion inside stars, the heavier elements from the far end of the periodic table are produced in supernovae, the explosive events by which high-mass stars end their lives.
The type of supernova described in most textbooks is a core-collapse supernova, in which the core of the star is transformed into iron by nuclear fusion, and fusion then ceases. The weight of the core is supported by a pressure gradient ultimately created by the energy released from fusion, hence when fusion ceases, the core collapses, triggering a supernova.
However, this type of supernova only occurs in stars in the range of roughly 10-140 solar masses. Stars more massive than this will actually end their lives with a Pair Instability Supernova (PISN). Such an event occurs before the stage at which the fusion of oxygen would otherwise begin in the core. The temperatures in the core of such massive stars are so high at this stage that the photons produced are effectively high-energy gamma rays. By virtue of their high energy, these photons will tend to interact with the nuclei in the core by means of a mechanism called electron-positron pair production. The consequence of this is that the radiation pressure in the core plummets, leading to the collapse of the core, and a PISN.
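As a rough indication of just how hot such a core must be, the following back-of-the-envelope sketch compares a typical thermal photon energy, of order kT, with the pair-production threshold of twice the electron rest energy, roughly 1.022 MeV. The instability actually sets in well below the temperature this yields, because it is the high-energy tail of the photon distribution which does the damage, but the order of magnitude is the point.

```python
# Order-of-magnitude estimate: the temperature at which a typical thermal
# photon energy (~kT) reaches the electron-positron pair-production
# threshold of 2 * m_e * c^2.

BOLTZMANN_EV_PER_K = 8.617e-5      # Boltzmann constant, eV per kelvin
ELECTRON_REST_ENERGY_EV = 0.511e6  # m_e c^2 in eV

threshold_energy_ev = 2 * ELECTRON_REST_ENERGY_EV   # ~1.022 MeV
threshold_temperature = threshold_energy_ev / BOLTZMANN_EV_PER_K

print(f"~{threshold_temperature:.1e} K")  # of order 10^10 K
```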
Nature for 3rd December reports that a supernova first detected in 2007 has now been verified as the first observationally confirmed PISN. Such supernovae are difficult to observe because stars lose mass during their lifetime due to radiation-driven stellar winds, and the threshold mass for a PISN increases for stars with a higher proportion of heavy elements. Each generation of stars forms from interstellar matter enriched by the supernovae of preceding generations, hence each successive generation of stars in a galaxy possesses an ever higher proportion of heavy elements. PISNe were therefore most frequent in the early universe, when stars had lower fractions of heavy elements, and when stellar winds were weaker. In fact, it is claimed that a single PISN can release more heavy elements than an entire generation of core-collapse supernovae. It is calculated, for example, that the 2007 supernova released 22 solar masses of silicon alone!
Hence, the heavy elements utilised by much of our technology on Earth, may owe their existence to the production of electron-positron pairs from high-energy gamma rays.
How to survive Christmas
That grim spectacle of familial strife and forced collegiate jollity, Christmas is barely more than a couple of weeks away. So how to endure it for another year?
One possible recourse may be Michael Heller's new book, Ultimate Explanations of the Universe. A Polish physicist, philosopher and priest, Heller offers a critical guide to all the big philosophical issues in cosmology: eternal universes, cyclic universes, the heat death of the universe, quantum creation out of nothing, inflation, the anthropic principle, cosmological natural selection, theories of everything, and multiverses. Whether Springer Verlag will get around to publishing this book before Christmas, however, is another matter.
Speaking of slipping publication dates naturally brings us to Autocourse 2009-2010, the glossy annual review of the Formula One year. This publication was at its peak in the 1970s and 1980s, when it combined lengthy and informative Grand Prix reports with superb photography. Sadly, in the years since, it degenerated into a rather insipid publication, low on information, and padded with the most average of photography. Last year's annual, however, was a significant improvement, with excellent and detailed team-by-team assessments provided by Mark Hughes, and Simon Arron taking over responsibility for the race reports. If the word-counts haven't been hacked again, this should be well worth reading over the fetid period.
Monday, December 07, 2009
The Large Hadron Collider Pop-up book
Those worried about falling education standards, diminishing attention spans, and the debauched culture of modern liberal capitalist democracies, can rest easy in their beds tonight. Two physics books published this year eschew the modern habit of teaching science with the aid of visual gimmicks, and revert to old school methods, gripping the attention of the reader with the clarity of the prose and the innate fascination of the subject matter.
First up is Voyage to the Heart of Matter, which sounds like some dreadful confessional book by Joan Bakewell, but is otherwise known as the Large Hadron Collider Pop-up book.
Actually, this does look like a clever and original piece of work from 'paper engineer' Anton Radevsky. However, whether it is "Perfect for: teenagers who have studied the LHC at school or been lucky enough to visit CERN," is debatable.
Consider the following question: Why does the Large Hadron Collider accelerate two counter-rotating beams around a circle before colliding them into each other? Why not simply accelerate one beam, and smash it into a target which is at rest in the laboratory frame of reference? If the single beam is accelerated to twice the energy of each beam in the double-collider, then surely the energy of the collision will be the same?
If students of physics cannot answer this question, then they haven't grasped the most fundamental and important concept about particle accelerators. The answer to the question above depends upon the fact that the energy for creating new particles comes from the energy of the colliding particles in the centre of mass reference frame.
If one particle is at rest, and another particle is accelerated towards it, then the centre of mass of the joint system moves in the laboratory reference frame. (If the two particles have the same mass, the centre of mass will be the point equidistant between the moving particle and the target particle). Some of the energy of the accelerated particle goes into moving the centre of mass of the joint system, and it is only the remaining energy in the centre of mass reference frame which is available for creating reaction products. In the ultra-relativistic regime relevant to modern accelerators, one needs to quadruple the energy of the accelerated particle merely to double the energy in the centre of mass reference frame.
However, in the case of two counter-rotating particles, the centre of mass remains fixed in the laboratory reference frame, so the laboratory reference frame coincides with the centre of mass reference frame, and the energy in the centre of mass reference frame is simply twice the energy of each particle.
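A quick numerical sketch makes the contrast stark. For a collider, the centre of mass energy is simply the sum of the two beam energies, whereas for an ultra-relativistic beam striking a stationary proton it is approximately the square root of twice the proton rest energy multiplied by the beam energy. The 7 TeV beam energy below is the LHC design figure; the comparison itself is just an illustration.

```python
import math

PROTON_REST_ENERGY_TEV = 0.000938  # m_p c^2, about 0.938 GeV, expressed in TeV

def cm_energy_collider(beam_energy_tev):
    """Two equal counter-rotating beams: sqrt(s) = 2 * E_beam."""
    return 2 * beam_energy_tev

def cm_energy_fixed_target(beam_energy_tev):
    """Ultra-relativistic beam on a stationary proton:
    sqrt(s) ~ sqrt(2 * m_p c^2 * E_beam)."""
    return math.sqrt(2 * PROTON_REST_ENERGY_TEV * beam_energy_tev)

print(cm_energy_collider(7.0))        # 14 TeV available in the collision
print(cm_energy_fixed_target(14.0))   # ~0.16 TeV, despite double the beam energy
```

Even with the single beam accelerated to twice the energy, the fixed-target collision yields only a small fraction of the centre of mass energy available to the collider.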
A pop-up book clearly isn't going to communicate this type of crucial understanding, and that's obviously not the intention. It would be nice, however, if this type of book could ultimately inspire some students to develop a more sophisticated understanding in later years.
Next up is The Manga Guide to Physics, in which doe-eyed female teenagers learn about physics via the medium of tennis:
Megumi is an all-star athlete, but she's a failure when it comes to physics class. And she can't concentrate on her tennis matches when she's worried about the questions she missed on the big test! Luckily for her, she befriends Ryota, a patient physics geek who uses real-world examples to help her understand classical mechanics-and improve her tennis game in the process!
Perfect, then, for uncles as well as teenagers.
Saturday, December 05, 2009
The Systems Engineering delusion
"Division of labour or economic specialisation is the specialisation of cooperative labour in specific, circumscribed tasks and roles, intended to increase the productivity of labour. Historically the growth of a more and more complex division of labour is closely associated with the growth of total output and trade, the rise of capitalism, and of the complexity of industrialisation processes."(Wikipedia, Division of labour).
Systems Engineering is an attempt to uplift engineering into the information economy. It is predicated on the notion that because complex engineering projects are really 'systems of systems', it is necessary to create a separate discipline, with its own group of practitioners, which specialises in the abstract, top-down analysis of such systems of systems.
As John Morton, Chief Executive of the Engineering and Technology Board, asserted in 2007, "projects are increasingly complex and some climate-change solutions will cross the traditional branches of engineering. If you wanted to extract carbon dioxide from the atmosphere you’d need aerospace, IT, electronical, mechanical and civil engineers." The solution to this complexity, however, does not reside in the creation of "systems engineers who can coordinate big, complex projects like the building of Heathrow Terminal 5," as the subsequent debacle of Terminal 5 demonstrated.
The alternative ethos, the default attitude before the rise of Systems Engineering, and the philosophy which naturally follows from the principle of economic specialisation, is that the productivity and efficiency of an engineering project or company is maximised by engineers specialising in concrete disciplines, and by those specialist engineers cooperating on complex projects. Such cooperation requires a combination of bottom-up and top-down methods, depending upon the circumstances. There is no need for Systems Engineers, just specialist mechanical engineers, aerodynamic engineers, electronic engineers and so on, organised in hierarchies with heads of department, and coordinated by project managers and technical directors, who cooperate with business managers, accountants and the like.
The need for complex interaction between different specialists is not a new economic phenomenon, and in engineering it does indeed require project managers and technical directors who can take a multi-disciplinary overview. This, however, is quite distinct from the Systems Engineering ethos, which attempts to legitimise the creation of a separate discipline which specialises in the top-down analysis of engineering projects in terms of requirements, capabilities and stakeholders, abstracted from concrete engineering issues.