Thursday, December 31, 2009

Ultimate Explanations of the Universe

Why is there something rather than nothing? What was the origin of the universe? What is the future of the universe? Is the universe finite or infinite? Is our universe the only logically possible universe? Are there other universes? What is the relationship between life and the universe?

These fundamental metaphysical questions have always exerted a fascination for mankind, but over the past hundred years physical cosmology has begun to provide scientific answers to these questions, answers formulated using well-defined mathematical concepts, and verified by astronomical observation.

In more recent decades, however, a sub-industry has developed which produces cosmological theories and models lacking any empirical substantiation. The audience for such highly speculative cosmology consists not only of physicists and cosmologists, but the public-at-large. There is a shop-front for this brand of cosmology, composed of popular science programmes on TV, books such as Stephen Hawking's A Brief History of Time, and articles in magazines such as Scientific American and New Scientist.

There is a problem with the manner in which this type of cosmology is presented to the public, and it conforms with Ben Goldacre's characterisation of the way in which all science is treated by the media:

On this template, science is portrayed as groundless, incomprehensible, didactic truth statements from scientists, who themselves are socially powerful, arbitrary, unelected authority figures. They are detached from reality; they do work that is either wacky or dangerous, but either way, everything in science is tenuous, contradictory, probably going to change soon and, most ridiculously, 'hard to understand'. (Bad Science, p225).

Cosmology is presented in the media as being beyond the understanding of the public, and beyond their ability to engage in critical appraisal. The concepts and reasoning are rarely properly explained, and almost never subjected to critical analysis. In the case of speculative cosmology, the ideas are also presented in a credulous and philosophically naive manner.

As an antidote to this comes Ultimate Explanations of the Universe, the latest book by Michael Heller, Polish physicist, philosopher, priest, and winner of the million-pound 2008 Templeton Prize. The text offers a critical guide to all the big philosophical issues in cosmology: eternal universes, cyclic universes, the heat death of the universe, quantum creation out of nothing, inflation, the anthropic principle, cosmological natural selection, theories of everything, and multiverses. The text is quite dense, but the book is still accessible to anyone with an interest in such ultimate questions, and will assist non-specialists seeking to assess the metaphysical credentials of modern cosmology.

Thursday, December 24, 2009

Christmas with Thomas Covenant

Thomas Covenant the Unbeliever could feel the wild magic pulsing in his veins. He was the wielder of the white chocolate ring, and its argent fire had the power to destroy or redeem Christmas. This was the conundrum, the dilemma, the paradox.

The vitality of the pagan winter solstice, the seminal life-force and spirit of Christmas, had been corrupted, first by the bane of religious revelation, scripture and liturgy, then by the malefic puissance of commercialisation, and finally by the insidious constriction and life-draining encroachment of Health and Safety bureaucracy. The wild magic had the capability to destroy this suppurating putrefaction, but also to destroy the Arch of Time, releasing Lord Mandelson the Despiser from his chancrous demesne.

Covenant knew he had to wield the wild magic, but also knew that he couldn't control it. As rapacious commercial Ravers stalked the land, disseminating their incentives to gluttony and avarice, now came the final desecration: the conical, synthetic Poole-centre Christmas tree.

For a brief moment, Covenant thought he descried Lord Mandelson's carious yellow eyes from inside the tree. Instantly, his skin was enveloped in formication, and an eldritch nimbus formed around his body. As coruscations began to lance outwards, Covenant could feel the argent fire gathering itself, ready to blaze across the land in a conflagration of destruction. The Yuletide Log of Law had been broken, the Earthpower had been sundered from the people, and unless Thomas Covenant could find the point of balance, the Arch of Time itself was in peril.

Tuesday, December 22, 2009

The return of Dick Dastardly

So Michael Schumacher is coming out of retirement to race for Mercedes in 2010. After all the cheating and politics that's marred Formula One since Michael left the stage at the end of 2006, the return of such a pure and noble spirit will surely inject a breath of fresh air into the sport. And it's nice to see that Michael's neck has mended properly now that he'll be able to test the car he's going to race...

Much emphasis has been placed upon the fact that Michael will be re-united at Mercedes with Ross Brawn, his trusted friend and erstwhile Ferrari technical director. From a politically strategic perspective, however, Michael will also not be disadvantaged by the presence of Jean Todt as newly-installed President of the FIA. Todt and Schumacher share a great deal of mutual respect, and given Todt's apparent grudge against Ferrari President Luca di Montezemolo, it doesn't take an excess of imagination to conceive that Schumacher and Mercedes may be given, let's say, the benefit of the doubt in any marginal technical or sporting decisions next year.

Still, with the prospect of Hamilton, Schumacher and Alonso going wheel-to-wheel in 2010, March 14th simply can't come around quickly enough.

Snow and Bayesian probability

I was due to deliver a presentation today on the application of Bayesian probability to measurement science. Sadly, that won't now be possible.

Still, I'm sure Blogger won't mind me using their resources instead. The basic idea is that there's a distinction between true values x and measured values y.

You start off with a prior probability distribution over the true values. You then have a likelihood function, which gives you the probability P(y|x) of measuring any value y given a hypothetical true value x.

When you perform an actual measurement, and obtain a particular measured value y, Bayes's theorem specifies a posterior distribution over the true values. This new distribution can then be set as the prior distribution for the next cycle, and so on. The Bayesian technique is simply a way of representing how knowledge changes and improves in the light of new evidence.

In the example represented graphically below, the prior distribution is the steep downward curve, a so-called inverse prior. This shows that prior to performing a measurement, the lowest values have the highest probability. (Technically, you need to truncate this distribution to ensure the total probability is equal to 1).

The graph represents the case where you perform an actual measurement, and find that the value is equal to 4 in whatever units are being used. The likelihood function P(y|x) in this example provides a normal distribution over the true values x for a fixed measured value y=4. Bayes's theorem then yields the posterior distribution, which in this case is also basically a normal distribution, but shifted slightly away from 4 towards 0, by the influence of the prior distribution.
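
The worked example above can be sketched numerically. All the specifics here are illustrative choices of mine, not taken from the graph: the grid is truncated at 0.1 so the inverse prior can be normalised, the likelihood is a unit-width Gaussian, and the measured value is y = 4.

```python
import math

# Hypothetical true values, truncated away from zero so that the
# inverse prior 1/x can be normalised over the grid.
xs = [0.1 + 0.001 * i for i in range(9901)]        # 0.1 .. 10.0

y, sigma = 4.0, 1.0                                # measured value, assumed spread

def prior(x):
    # Inverse prior: the lowest true values are most probable a priori.
    return 1.0 / x

def likelihood(x):
    # Gaussian likelihood P(y|x), evaluated at the fixed measurement y.
    return math.exp(-0.5 * ((y - x) / sigma) ** 2)

# Bayes's theorem: the posterior is proportional to prior times likelihood.
post = [prior(x) * likelihood(x) for x in xs]
total = sum(post)
post = [p / total for p in post]                   # normalise to total probability 1

# Locate the posterior mode.
x_map = xs[max(range(len(xs)), key=lambda i: post[i])]
print(x_map)
```

The posterior mode lands just below 4 (analytically at 2 + √3 ≈ 3.73), quantifying the pull of the inverse prior away from the measured value towards zero.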

Monday, December 21, 2009

Snow in North Hampshire and the Thames Valley

Moving a hundred yards in ninety minutes of gridlock, I abandon my car, and with a surge of adrenalin make for home on foot. A thick confetti of snow fills the air, and a fey light dwindles into a leaden darkness. Depleted of colour and reverberation, an eerie quiet smothers suburbia.

The chromatic uniformity is punctuated only by the twinkling hues of residential christmas illuminations. Striding past up-market properties where town blends into countryside, I pass a couple of small replica reindeer standing in a front garden, their shapes traced by amber lights, exquisitely beautiful now in the deepening snowfield.

And then into the countryside, with no pavement, no light, and only the occasional car, rolling gingerly past in 2nd gear, to disturb the sonic claustrophobia. I am totally at the mercy of these drivers. Hemmed in between the snow-laden hedgerows, there is no escape-route; if one of these cars loses control, I will be smacked off my feet, my legs crushed, crumpled and shattered.

Finally, striding through drifts deeper than my shoes, I reach the farmhouse, and the light, warmth, sound and colour of home. All I need to do now is do it all again tomorrow to retrieve my car...

Saturday, December 19, 2009

Snow and entropy

Snowfall not only evokes the predictable platitudinous admonitions to "only travel if it's absolutely necessary", but from a more recondite perspective, also partially preserves the entropy of light.

Snow, of course, is white in colour, so it reflects all the colours of the spectrum. In contrast, the normal landscape, consisting of greens and browns and greys, absorbs all the other colours of the spectrum, and in each case reflects just one visible component of the sunlight impinging upon it (the absorbed energy is re-radiated in the infrared). Snow therefore preserves the spectral information in sunlight which is otherwise destroyed by the natural landscape concealed beneath.

Moreover, snow which has frozen overnight will sparkle in sunlight the next day, consisting as it does of numerous ice crystals. In effect, the surface of the snow consists of millions of tiny mirrors, each oriented at a different angle. Each tiny mirror reflects sunlight in a different direction, from a unique position, so the snow sparkles as the observer moves about and the line of sight intersects the reflected sunlight from differently positioned and oriented crystal facets.

A mirrored surface is distinguished from a white, non-mirrored surface simply by virtue of the smoothness of the former. A smooth surface permits the specular reflection of light, in which the directional information in the light is preserved, whilst an irregular surface results in diffuse reflection, which destroys the directional information. A mirrored surface therefore prevents the entropy of light from increasing, by virtue of preserving both the spectral information and the directional information in the light. The blanket of sparkling white snow which covers a landscape thus operates to slow the increase in the entropy of light.

But do wrap up warm, because the wind will make it feel even colder.

Thursday, December 17, 2009

Dark matter detected?

The Cryogenic Dark Matter Search (CDMS) project is announcing, on Friday 18th December, the possible detection of two dark matter particles. However, the probability of a false positive here is estimated to be 23%, which is rather high compared to the 5% threshold typically employed in measurement science, and extremely high compared to the 0.1% threshold stipulated by the CDMS team.
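
A figure like that 23% is the sort of number a simple Poisson calculation produces. As a sketch, suppose the expected number of background events is about 0.9; that value is chosen here purely because it reproduces the quoted probability, and is not a figure taken from the CDMS announcement. The chance of the background alone yielding two or more events is then:

```python
import math

def poisson_p_at_least(k, lam):
    """Probability of observing k or more events when lam are expected."""
    return 1.0 - sum(math.exp(-lam) * lam ** n / math.factorial(n)
                     for n in range(k))

expected_background = 0.9   # assumed value, chosen to reproduce the quoted 23%
observed = 2

p = poisson_p_at_least(observed, expected_background)
print(round(p, 2))          # → 0.23
```

In other words, even with no dark matter at all, a background of this size would produce two or more events nearly a quarter of the time, which is why nobody is yet calling this a discovery.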

The CDMS detector resides in an abandoned iron mine in Minnesota, (to minimise the background neutron radiation from cosmic rays), and consists of germanium discs cryogenically cooled to an extremely low temperature. The discs are coated with phonon sensors, designed to detect the tiny vibrations in such a crystalline solid. When the crystals are cooled to a very low temperature, the vibrational background of the solid is virtually eliminated, and it is possible to detect the phonons created by incoming particles colliding with the atoms in the solid.

These phonons, however, can be created by background gamma radiation colliding with the electrons in the solid, or by hypothetical dark matter particles colliding with the nuclei of the atoms. The two different processes create different patterns of ionisation, hence the detection of dark matter particles requires the simultaneous detection of the phonons and the ionisation in the crystals. (See Dark Side of the Universe, Iain Nicolson, p84-85).

Lewis Hamilton and high-speed corners

McLaren managing director Jonathan Neale recently made the following comment about the prospect of having Lewis Hamilton and Jenson ("I do the triathlon, you know") Button together in the same team:

"In some ways, it multiplies our opportunities in a grand prix too: there could well be tracks where Jenson's skill-set is better-suited to the challenge, and equally, tracks where Lewis could excel. In the past, Jenson has demonstrated considerable talent at high-speed circuits, and we're looking forward to building that into our arsenal."

This isn't quite equivalent to suggesting that Lewis Hamilton has a relative weakness on high-speed circuits, but it isn't far away from it. But is it a notion which holds up to analysis?

Firstly, by high-speed circuits, let us restrict attention to circuits with a decent proportion of high-speed corners, rather than merely circuits, such as Monza, with a high average speed. As Autosport's Mark Hughes has explained at length over some years, Hamilton's natural driving technique utilises oversteer to achieve a rapid change of direction. This is perfect for slow to medium speed corners, but less appropriate for high-speed corners.

But is there any evidence to support this supposition? Well, let's have a look at Lewis's performance at Silverstone and Spa, the two circuits with the highest proportion of high-speed corners, which have been on the calendar in each of Lewis's three seasons in Formula One. In 2007, Hamilton was off the pace of the other McLaren and Ferrari drivers at Silverstone. "In particular he was struggling through Copse corner, the near-flat-in-seventh first turn taken at over 185mph. Some drivers were claiming to be able to take it without a lift of the throttle. But Lewis never did." (Lewis Hamilton - The Full Story, p199). And although Hamilton was closer to McLaren team-mate Alonso at Spa, it was a similar story. In the case of Spa, however, there are mitigating circumstances, because Hamilton failed to drive in the pre-race test session, thereby losing valuable set-up time. The 2007 McLaren also had a natural tendency to over-heat its tyres in fast corners, and probably had lower peak downforce than the 2007 Ferrari on this type of track.

At Silverstone in 2008, Hamilton qualified eight-tenths off team-mate and pole-winner Kovalainen. Moreover, in his attempts to redress the difference, he over-drove and went bouncing off the circuit at Priory. The race itself, however, was Lewis's greatest day, his rendition of Senna at Estoril in 1985, operating on a different plane from the other drivers in torrentially wet conditions. Unfortunately, those same conditions make the race invalid as an analysis of Lewis's high-speed cornering ability.

At Spa 2008, however, Lewis qualified on pole, and would have won comfortably but for a silly spin in the early damp conditions. He then spent the remainder of the race remorselessly hunting down Raikkonen, eroding a tenth here and a tenth there, so that he was perfectly placed to nail the Ferrari when drizzle began falling again in the late stages of the race. It was Lewis's rendition of Hakkinen, hunting down Schumacher at Spa in 2000, after a silly early spin in damp conditions.

It's difficult to derive anything from Silverstone and Spa 2009, for the McLaren was lacking consistent downforce at these circuits, and wholly uncompetitive. Lewis, however, drove a very untidy race at Silverstone, allowing Alonso to drive him onto the grass at one stage, and spinning under braking at Vale on another occasion. At Spa he was shunted out on the first lap.

There's also mixed evidence from other tracks on the calendar. Lewis's driving style over-heated the outside front tyre in the fast, multiple-apex Turn 8 at Turkey in 2007 and 2008, but he looked very quick overall on both occasions, and, crucially, was faster than Alonso at Istanbul in 2007. He also looked fabulous at Suzuka in 2009, in a car which was still lacking peak downforce relative to its rivals.

In conclusion, there is some evidence that Lewis has a relative weakness on tracks with high-speed corners, but it seems likely that this is only relative to his extraordinary ability in slow and medium speed corners. One also assumes that Hamilton is the type of driver capable of eradicating any weakness, relative or not.

Saturday, December 12, 2009

Stars like grains of sand

If you live in suburbia, then within a few hundred yards of your own home, there will most probably be a towering display of mankind's ability to harness natural energy. On a dark and damp winter's day, the power lines suspended between the beneficent arms of electricity pylons, fizz and crackle in the cold rain. It is a sound derived from a cascade of energy flows which starts with nuclear fusion in the Sun, proceeds via photosynthesis to the creation of metabolic fuels in plant-life, is geologically transformed into hydrocarbon deposits, which are combusted to produce steam, thus impelling turbines to rotate, creating the electric power flows, of which a small fraction is dissipated as sound energy close to your home.

The photograph above is an image of the centre of our galaxy, obtained by the UK's new infrared telescope, Vista. The centre of the galaxy is obscured at visible wavelengths by interstellar dust, Iain Nicolson reporting that "along the line of sight to the galactic centre, of each 10 billion photons that should be arriving from the galactic nucleus, only about one actually does so," (Unfolding our Universe, p198). An infrared telescope, however, is able to peer through this dust, and it's well worth clicking to expand the image, for there are a million stars in this field of view.

"Whereas a sphere 5 light-years in diameter centred on the Sun would contain only four stars (the three components of the Alpha Centauri system and the Sun itself), a similar sized sphere close to the centre of the nuclear bulge [at the centre of our galaxy] would contain about ten million stars...To an observer on a planet located deep within the nuclear bulge, the sky would be aglow with millions of stars...several hundred of which would be brighter than the full Moon," (ibid., p203).

All the stars in Vista's image represent the starting points for cascading energy flows, some of which may be powering the evolution of alien biospheres and civilisations, and ultimately enabling alien observers to hear the fizz and crackle from alien power lines held aloft by beneficent alien pylons.

Friday, December 11, 2009

Formula 1 and Nash equilibrium

A selection of strategies by a group of agents is said to be in a Nash equilibrium if each agent's strategy is a best-response to the strategies chosen by the other players. By best-response, we mean that no individual can improve her payoff by switching strategies unless at least one other individual switches strategies as well. This need not mean that the payoffs to each individual are optimal in a Nash equilibrium: indeed, one of the disturbing facts of the prisoner's dilemma is that the only Nash equilibrium of the game--when both agents defect--is suboptimal. (Evolutionary Game Theory, Stanford Encyclopedia of Philosophy).
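
The prisoner's dilemma claim in that quotation can be checked directly. The sketch below uses conventional illustrative payoffs (not any canonical figures) and simply searches the four strategy profiles for Nash equilibria:

```python
# Prisoner's dilemma payoffs: payoff[(row, col)] = (row player, column player).
# Strategies: 0 = cooperate, 1 = defect. Values are illustrative.
payoff = {
    (0, 0): (3, 3),   # mutual cooperation
    (0, 1): (0, 5),   # row cooperates, column defects
    (1, 0): (5, 0),   # row defects, column cooperates
    (1, 1): (1, 1),   # mutual defection
}

def is_nash(a, b):
    """(a, b) is a Nash equilibrium if neither player gains by a unilateral switch."""
    row_ok = all(payoff[(a, b)][0] >= payoff[(alt, b)][0] for alt in (0, 1))
    col_ok = all(payoff[(a, b)][1] >= payoff[(a, alt)][1] for alt in (0, 1))
    return row_ok and col_ok

equilibria = [s for s in payoff if is_nash(*s)]
print(equilibria)                            # → [(1, 1)]: only mutual defection
print(payoff[(1, 1)], "<", payoff[(0, 0)])   # → (1, 1) < (3, 3): yet it's suboptimal
```

Mutual defection is the unique equilibrium, even though both players would do better under mutual cooperation, which is precisely the "disturbing fact" the Stanford Encyclopedia alludes to.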

The technological arms race in Formula 1 now appears to be dead. The technical regulations have been converging to a control formula for some years, with little opportunity for variation in engine or chassis design, and in the light of the limited financial resources now available to the surviving F1 teams, their optimal strategies include voluntary mutual agreements to limit expenditure on testing, research and development. The teams have even agreed to a common car launch in Valencia next year to save costs.

In the lexicon of game theory, the constraints upon the game players have changed, with the consequence that one state of Nash equilibrium (the arms race) has been supplanted by a new state of Nash equilibrium.

This new state of Nash equilibrium is highly preferable for those who own the commercial rights to Formula 1, (currently the private equity company CVC). Formula 1 has now essentially become a package of revenue streams: television broadcasting rights, circuit hosting rights, circuit advertising, and paddock hospitality. The arms race was bad for the commercial owners of F1 because it made for larger performance gaps between the cars, and therefore a less attractive product to sell to the television companies. It also meant that the manufacturers funding that arms race developed an interest in acquiring a larger stake of the commercial revenues generated by the sport, and a greater influence over its governance. With the shrivelling influence and spending capacity of the remaining manufacturer teams, that threat to the business plan of Formula 1 has been eliminated.

Thursday, December 10, 2009

Tiger Woods and Lewis Hamilton

"My overriding feeling about Lewis is one of sadness for a young man who has, one way or another, let himself down very badly and done something which will never be erased from the record...The Tiger Woods parallel has always been a frame of reference with Lewis but for different reasons now. You see him everywhere in those big posters at the airport (with dotted white lines and percentage ratings of this or that) or on television. He was so close to imposing his will on all those boring golfers at the top of the leaderboard at the Masters this weekend. I can imagine Lewis watching him and feeling the same sadness that I do. Here is the guy Lewis wanted to be. A worldwide icon and superstar, leading by example and unsullied by ill-deeds, a sponsor's dream and someone who can make an impact - the right impact - almost anywhere he goes and whatever he does." (Edward Gorman, April 13th 2009).

Tuesday, December 08, 2009

PISN into the stellar wind

Whilst the lighter chemical elements found on the Earth, such as carbon and oxygen, are produced by nuclear fusion inside stars, the heavier elements from the far end of the periodic table are produced in supernovae, the explosive events by which high-mass stars end their lives.

The type of supernova described in most textbooks is a core-collapse supernova, in which the core of the star is transformed into iron by nuclear fusion, and fusion then ceases. The weight of the core is supported by a pressure gradient ultimately created by the energy released from fusion, hence when fusion ceases, the core collapses, triggering a supernova.

However, this type of supernova only occurs in stars in the range 10-140 solar masses. For stars more massive than this, their lives will instead end with a Pair Instability Supernova (PISN). Such an event occurs before the stage at which the fusion of oxygen would otherwise begin in the core. The temperatures in the core of such massive stars are so high at this stage that the photons produced are effectively high-energy gamma rays. By virtue of their high energy, these photons will tend to interact with the nuclei in the core by means of a mechanism called electron-positron pair production. The consequence of this is that the radiation pressure in the core plummets, leading to the collapse of the core, and a PISN.

Nature for 3rd December reports that a supernova first detected in 2007 has now been verified as the first ever observed PISN. Such supernovae are difficult to observe because stars lose mass during their lifetimes due to radiation-driven stellar winds, and because the threshold mass for a PISN increases for stars with a higher proportion of heavy elements. Each generation of stars forms from interstellar matter enriched by the supernovae of preceding generations, hence each successive generation of stars in a galaxy possesses an ever higher proportion of heavy elements. PISNe were therefore most frequent in the early universe, when stars had lower fractions of heavy elements, and when stellar winds were weaker. In fact, it is claimed that a single PISN can release more heavy elements than an entire generation of core-collapse supernovae. It is calculated, for example, that the 2007 supernova released 22 solar masses of silicon alone!

Hence, the heavy elements utilised by much of our technology on Earth, may owe their existence to the production of electron-positron pairs from high-energy gamma rays.

How to survive Christmas

That grim spectacle of familial strife and forced collegiate jollity, Christmas is barely more than a couple of weeks away. So how to endure it for another year?

One possible recourse may be Michael Heller's new book, Ultimate Explanations of the Universe. Polish physicist, philosopher and priest, Heller offers a critical guide to all the big philosophical issues in cosmology: eternal universes, cyclic universes, the heat death of the universe, quantum creation out of nothing, inflation, the anthropic principle, cosmological natural selection, theories of everything, and multiverses. Whether Springer Verlag will get around to publishing this book before Christmas, however, is another matter.

Speaking of slipping publication dates naturally brings us to Autocourse 2009-2010, the glossy annual review of the Formula One year. This publication was at its peak in the 1970s and 1980s, when it combined lengthy and informative Grand Prix reports with superb photography. Sadly, in the years since, it degenerated into a rather insipid publication, low on information, and padded with the most average of photography. Last year's annual, however, was a significant improvement, with excellent and detailed team-by-team assessments provided by Mark Hughes, and Simon Arron taking over responsibility for the race reports. If the word-counts haven't been hacked again, this should be well worth reading over the fetid period.

Monday, December 07, 2009

The Large Hadron Collider Pop-up book

Those worried about falling education standards, diminishing attention spans, and the debauched culture of modern liberal capitalist democracies, can rest easy in their beds tonight. Two physics books published this year eschew the modern habit of teaching science with the aid of visual gimmicks, and revert to old school methods, gripping the attention of the reader with the clarity of the prose and the innate fascination of the subject matter.

First up is Voyage to the Heart of Matter, which sounds like some dreadful confessional book by Joan Bakewell, but is otherwise known as the Large Hadron Collider Pop-up book.

Actually, this does look like a clever and original piece of work from 'paper engineer' Anton Radevsky. However, whether it is "Perfect for: teenagers who have studied the LHC at school or been lucky enough to visit CERN," is debatable.

Consider the following question: Why does the Large Hadron Collider accelerate two counter-rotating beams around a circle before colliding them into each other? Why not simply accelerate one beam, and smash it into a target which is at rest in the laboratory frame of reference? If the single beam is accelerated to twice the energy of each beam in the double-collider, then surely the energy of the collision will be the same?

If students of physics cannot answer this question, then they haven't grasped the most fundamental and important concept about particle accelerators. The answer to the question above depends upon the fact that the energy for creating new particles comes from the energy of the colliding particles in the centre of mass reference frame.

If one particle is at rest, and another particle is accelerated towards it, then the centre of mass of the joint system moves in the laboratory reference frame. (If the two particles have the same mass, the centre of mass will be the point equidistant between the moving particle and the target particle). Some of the energy of the accelerated particle goes into moving the centre of mass of the joint system, and it is only the remaining energy in the centre of mass reference frame which is available for creating reaction products. In the highly relativistic case, the energy in the centre of mass reference frame grows only as the square root of the beam energy, so one needs to quadruple the energy of the accelerated particle merely to double the energy available in the centre of mass reference frame.

However, in the case of two counter-rotating particles, the centre of mass remains fixed in the laboratory reference frame, so the laboratory reference frame coincides with the centre of mass reference frame, and the energy in the centre of mass reference frame is simply twice the energy of each particle.
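
To put illustrative numbers on this, one can use the standard relativistic invariant-mass formula for equal-mass particles, with the proton rest energy and the LHC's design beam energy of 7 TeV. The formula is textbook kinematics, but the script itself is just a sketch:

```python
import math

m_p = 0.938   # proton rest energy in GeV (units with c = 1)

def sqrt_s_fixed_target(e_lab, m=m_p):
    """Centre-of-mass energy when a beam of total energy e_lab hits a proton at rest."""
    return math.sqrt(2 * m * (e_lab + m))

def sqrt_s_collider(e_beam):
    """Centre-of-mass energy of two equal, counter-rotating beams."""
    return 2 * e_beam

# Two 7 TeV (7000 GeV) beams give 14 TeV in the centre of mass:
print(sqrt_s_collider(7000.0))              # → 14000.0 (GeV)

# A fixed-target machine reaching the same 14 TeV would need:
e_needed = (14000.0 ** 2) / (2 * m_p) - m_p
print(f"{e_needed:.3g} GeV")                # → 1.04e+08 GeV
```

A fixed-target machine would thus need a beam of roughly 100 million GeV, some fifteen thousand times the LHC's beam energy, to match the 14 TeV available in the centre of mass; and quadrupling the fixed-target beam energy indeed only doubles the centre-of-mass energy.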

A pop-up book clearly isn't going to communicate this type of crucial understanding, and that's obviously not the intention. It would be nice, however, if this type of book could ultimately inspire some students to develop a more sophisticated understanding in later years.

Next up is The Manga Guide to Physics, in which doe-eyed female teenagers learn about physics via the medium of tennis:

Megumi is an all-star athlete, but she's a failure when it comes to physics class. And she can't concentrate on her tennis matches when she's worried about the questions she missed on the big test! Luckily for her, she befriends Ryota, a patient physics geek who uses real-world examples to help her understand classical mechanics, and improve her tennis game in the process!

Perfect, then, for uncles as well as teenagers.

Saturday, December 05, 2009

The Systems Engineering delusion

"Division of labour or economic specialisation is the specialisation of cooperative labour in specific, circumscribed tasks and roles, intended to increase the productivity of labour. Historically the growth of a more and more complex division of labour is closely associated with the growth of total output and trade, the rise of capitalism, and of the complexity of industrialisation processes." (Wikipedia, Division of labour).

Systems Engineering is an attempt to uplift engineering into the information economy. It is predicated on the notion that because complex engineering projects are really 'systems of systems', it is necessary to create a separate discipline, with its own group of practitioners, which specialises in the abstract, top-down analysis of such systems of systems.

As John Morton, Chief Executive of the Engineering and Technology Board, asserted in 2007, "projects are increasingly complex and some climate-change solutions will cross the traditional branches of engineering. If you wanted to extract carbon dioxide from the atmosphere you’d need aerospace, IT, electronical, mechanical and civil engineers." The solution to this complexity, however, does not reside in the creation of "systems engineers who can coordinate big, complex projects like the building of Heathrow Terminal 5," as the subsequent debacle of Terminal 5 demonstrated.

The alternative ethos, the default attitude before the rise of Systems Engineering, and the philosophy which naturally follows from the principle of economic specialisation, is that the productivity and efficiency of an engineering project or company, is maximised by engineers specialising in concrete disciplines, and by those specialist engineers cooperating on complex projects. Such cooperation requires a combination of bottom-up and top-down methods, depending upon the circumstances. There is no need for Systems Engineers, just specialist mechanical engineers, aerodynamic engineers, electronic engineers etc, organised in hierarchies with heads of department, and coordinated by project managers and technical directors, who cooperate with business managers, accountants etc.

The need for complex interaction between different specialists is not a new economic phenomenon, and in engineering it does indeed require project managers and technical directors who can take a multi-disciplinary overview. This, however, is quite distinct from the Systems Engineering ethos, which attempts to legitimise the creation of a separate discipline which specialises in the top-down analysis of engineering projects in terms of requirements, capabilities and stakeholders, abstracted from concrete engineering issues.

Saturday, November 28, 2009

The future of the universe

All of the complexity and structure we see in the world around us will ultimately be degraded and eradicated.

Consider first the fate of gravitationally bound systems, such as galaxies and galaxy clusters. Let us first recall some concepts from Newtonian gravity. Because gravitation is an attractive force, the potential energy U of a system of gravitating objects is negative; the kinetic energy K of the system, the energy of motion of the objects, is positive; and the total energy is E = U + K. A gravitating system is said to be bound if the total energy E is negative. In other words, if the magnitude of the potential energy exceeds the magnitude of the kinetic energy, the system is bound. The more negative the total energy, the more tightly bound the system is.

Each gravitating system has associated with it an escape velocity, which is the speed a constituent object must attain if the distance between it and the other objects in the system is to become unbounded. A bound system is such that the average velocity of the objects in the system is less than the escape velocity. However, after the objects in the system have interacted for a period of time, there will be a distribution of velocities, and some will exceed the escape velocity. These objects will thence depart the system, never to return. This evaporation of objects from the system will remove positive kinetic energy from the system, hence the total energy of the system will become more negative, making the system more tightly bound. Nevertheless, the system will continue to evaporate.
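To make the bound/unbound criterion concrete, here is a small illustrative calculation (a Python sketch using standard values for the Earth's mass and radius; the simple two-body scenario is chosen purely for illustration, not as a model of galactic dynamics):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2

def total_energy(M, m, r, v):
    """Total energy E = U + K of a body of mass m moving at speed v
    at distance r from a central mass M (Newtonian two-body picture)."""
    U = -G * M * m / r          # potential energy (negative: attractive force)
    K = 0.5 * m * v ** 2        # kinetic energy (positive)
    return U + K

def escape_velocity(M, r):
    """The speed at which E = 0, namely sqrt(2GM/r)."""
    return math.sqrt(2 * G * M / r)

# Illustrative values: the Earth's mass and radius.
M_earth, R_earth = 5.972e24, 6.371e6
v_esc = escape_velocity(M_earth, R_earth)   # roughly 11.2 km/s

# A body slower than v_esc has E < 0 (bound); faster, E > 0 (unbound).
bound   = total_energy(M_earth, 1.0, R_earth, 0.5 * v_esc)
unbound = total_energy(M_earth, 1.0, R_earth, 2.0 * v_esc)
```

The sign of the total energy is the whole story: evaporation proceeds because random encounters keep pushing individual members above the E = 0 threshold.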

Contemporary spiral galaxies are bright and vibrant star cities, evolving through multiple generations of star formation, and possessing an ecological structure in which material is cycled between the population of stars, and the gas and dust of the interstellar medium. Eventually, however, star formation will cease, all the stars in a galaxy will expend their nuclear fuel, and galaxies will be populated by black holes and the dead cinders of stars. These cold, dark galaxies will then evaporate, as Iain Nicolson explains:

"Although close encounters between stars are extremely rare, given sufficient time, many encounters between dead stars will take place. In each encounter, one star will gain energy and the other will lose energy. Even without any encounters of this kind, an orbiting star will gradually lose energy by radiating gravitational waves and so, very slowly, will migrate closer to the centre of its galaxy. Close encounters will accelerate this process. Over extremely long periods of time, most dead stars will evaporate from their host galaxies and the remainder will coalesce into gigantic 'galactic' black holes at their centres. A similar process is likely to happen to clusters and superclusters of galaxies, with dead galaxies merging at their centres to form 'supergalactic' black holes, and others being ejected into intercluster space." (The End of the Universe, 1998 Yearbook of Astronomy, pp220-232).

Smaller bound systems, such as molecules and atoms will also evaporate, but the reason for this is quite subtle. As John C. Baez explains, any system in thermal equilibrium will minimise its so-called free energy, the amount of energy which is available to perform work. The free energy can be defined to be E - TS, where E is the total (internal) energy, T is the temperature of the system, and S is the entropy of the system. The restriction to internal energy here simply means that one ignores the potential energy a system might possess in an external field, and one ignores any bulk energy of motion; internal energy includes the internal potential energy of the system, and its internal kinetic energy. The entropy S of a system can be seen in this context as the amount of unusable energy in the system, per unit of system temperature; hence, multiply the entropy by the temperature, and one obtains the total amount of unusable energy in the system. Subtract the amount of unusable energy from the total energy, and one obtains the free energy.

The free energy of a system E - TS can clearly be reduced either by reducing E, or by increasing TS. As Baez points out, an ionised gas (a so-called plasma) has more energy than a gas made from atoms or molecules of the same substance. When those atoms or molecules form, electromagnetic radiation is released, decreasing the total energy of the matter in the system. However, the atoms or molecules have less entropy than the ionised system. At high temperatures, the free energy is minimised by the high entropy plasma state. However, at lower temperatures, the free energy of a matter system can be minimised by reducing the total internal energy of the system. (Note, however, that although the atomic or molecular state is a lower entropy state for the matter, the total entropy still increases because of the entropy of the electromagnetic radiation released when the atomic or molecular state is formed).
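Baez's point can be illustrated with a toy calculation (the numbers below are invented purely for illustration, in arbitrary units; they merely encode the fact that the plasma state has both higher internal energy and higher entropy than the atomic state):

```python
def free_energy(E, T, S):
    """Helmholtz free energy F = E - T*S."""
    return E - T * S

# Two toy states of the same matter (illustrative numbers, arbitrary units):
# the ionised plasma has higher energy AND higher entropy than the
# recombined atomic gas.
plasma = {"E": 10.0, "S": 5.0}
atomic = {"E": 2.0,  "S": 1.0}

def preferred_state(T):
    """The equilibrium state at temperature T is the one minimising F."""
    F_plasma = free_energy(plasma["E"], T, plasma["S"])
    F_atomic = free_energy(atomic["E"], T, atomic["S"])
    return "plasma" if F_plasma < F_atomic else "atomic"

# At high T the TS term dominates, favouring the high-entropy plasma;
# at low T the E term dominates, favouring the low-energy atomic state.
```

With these numbers the crossover sits at T = 2: above it the plasma minimises F, below it the atomic gas does, which is precisely the high-temperature/low-temperature dichotomy described above.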

This assumes, however, that the system occupies a fixed volume. If the volume available to the constituents of the system is constantly increasing, as is the case in an expanding universe, then the maximum available entropy S of the system will be constantly increasing, and eventually, even at very low temperatures, the free energy of a gas will be minimised in the ionised, plasma state, the state which maximizes the entropy of the system.

Black holes, of course, are also capable of evaporating into radiation, but only do so if their temperature is higher than that of their surroundings. Crucially, some theorists currently argue that the presence of dark energy, responsible for the accelerating expansion of the universe, equips the universe with a minimum temperature. The temperature of a black hole is inversely proportional to its size, hence if sufficiently large black holes form from the merger of smaller black holes (and they would have to be as large as the currently observable universe), then such black holes would be colder than this minimum temperature, and would never evaporate.
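The inverse relationship between black-hole temperature and size is quantified by Hawking's formula, T = ħc³/(8πGMk_B). As a rough illustration (standard physical constants; the solar mass merely serves as a convenient example):

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J s
c    = 2.998e8      # speed of light, m/s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
k_B  = 1.381e-23    # Boltzmann constant, J/K

def hawking_temperature(M):
    """Hawking temperature of a black hole of mass M (kg):
    T = hbar c^3 / (8 pi G M k_B), inversely proportional to M."""
    return hbar * c ** 3 / (8 * math.pi * G * M * k_B)

M_sun = 1.989e30
T_solar = hawking_temperature(M_sun)   # of order 1e-7 K

# Today's cosmic microwave background is at about 2.7 K, so a solar-mass
# hole is far colder than its surroundings and currently absorbs more
# radiation than it emits: no net evaporation yet.
T_cmb = 2.7
```

Doubling the mass halves the temperature, which is why the merger of black holes drives them ever further below any ambient minimum temperature.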

Thus, (neglecting some questions over the fate of protons) the future of the universe is a future in which all galaxies, stars, planets, complex molecules and atoms eventually evaporate, and all that remains will be gravitational radiation, electromagnetic radiation, black holes, and isolated elementary particles.

Wednesday, November 25, 2009

Lewis Hamilton and instability

The Eurofighter Typhoon jet aircraft is so unstable that it cannot be controlled by a pilot alone, and requires the intervention of electronic control systems to prevent it from stalling in flight. Hold that thought in your head as you read the description of Lewis Hamilton's raw driving ability, which McLaren director of engineering, Paddy Lowe, gave to Autosport's Mark Hughes in 2008:

"He's tremendously good at controlling a car in oversteer. We saw that from the first moment he got in our car. We saw the data, and on every entry we could see there was a massive correction on the steering, and our normal drivers would have been bitching like hell that the car was undriveable, yet he didn't even pass comment. So with a driver like that, you're better equipped to push the boundaries to new levels. Speaking generically of that characteristic, a lot of the performance limit of a car is set by stability; if you can't hang on to it, you will have to introduce understeer in that zone. But if you have a driver better able to deal with oversteer in those zones that induce it, then you will have a less-understeery car elsewhere and therefore more total grip over the lap. The great drivers over the years - Senna, Schumacher, Mansell - have all had that ability. Like for like compared to other drivers, they want more front end."

There are two particular concepts in Lowe's analysis which need to be distinguished: corner entry oversteer, and entry instability. To understand Hamilton's unique capabilities, we therefore need to briefly introduce some definitions from stability theory.

If a car (or aircraft) is initially in an equilibrium state, and there is a transitory control input (or external disturbance), a stable vehicle will return towards its initial equilibrium state of its own accord, whilst an unstable vehicle, in the absence of any further control inputs, will diverge even further from the initial state. To be precise, the former condition is sometimes called static stability, and the latter condition is called static instability. Whether a vehicle is stable or not can be speed-dependent. For example, a bicycle is stable at higher speeds, but is unstable at low speed, requiring continuous corrective inputs from the rider to remain vertical.

In the case of an F1 car, an initial steering input induces an initial slip-angle in the front tyres, which induces an initial direction change (a rotation about the vertical axis, called a yaw motion). If an F1 car is statically stable, the car will then return towards a state of zero yaw. If an F1 car is statically unstable, an initial steering input would not just induce an initial slip-angle and change of direction, but an ever greater change of direction (in the absence of corrective action from the driver), giving the vehicle a tendency to spin on entry to every corner. In particular, if an initial steering input provokes the car into oversteer, then that oversteer will increase the initial direction-change. Hence, the driver must supply opposite-lock steering corrections to reduce the direction-change. Oversteer and instability are therefore related. To be precise, turn-in oversteer is a statically unstable handling characteristic, albeit one which Lewis Hamilton is clearly capable of dealing with.

There is a further nuance here, however, because even statically stable vehicles can be either dynamically stable or dynamically unstable. After an initial input, the attitude of a dynamically stable vehicle will oscillate with damped harmonic motion of decreasing amplitude about the initial attitude. In contrast, in the case of a dynamically unstable vehicle, whilst its attitude will at first return towards the initial state, it will then oscillate with increasing amplitude about that initial attitude, leading to a loss of control (in the absence of corrective inputs). These two behavioural characteristics are also sometimes dubbed positive stability and relaxed stability, respectively. The Eurofighter Typhoon possesses dynamic instability (relaxed stability).

If an F1 car were statically stable and dynamically stable on turn-in, an initial steering input would create an initial direction-change, and the direction-change would then oscillate with decreasing amplitude. If, however, an F1 car were dynamically unstable on entry to a corner, then the direction-change would oscillate with increasing amplitude (in the absence of corrective action), giving the vehicle a tendency to spin.
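The difference between the two cases can be illustrated with a toy simulation. This is merely a damped harmonic oscillator integrated in Python, not a model of any actual car: the sign of the damping coefficient decides whether the 'yaw' oscillation decays (dynamic stability) or grows (dynamic instability):

```python
def yaw_response(damping, steps=2000, dt=0.01):
    """Integrate x'' + 2*damping*x' + x = 0 from an initial disturbance,
    returning the peak amplitude over the final quarter of the run.
    damping > 0: dynamically stable (oscillation decays);
    damping < 0: dynamically unstable (oscillation grows)."""
    x, v = 1.0, 0.0                     # initial attitude disturbance
    history = []
    for _ in range(steps):
        a = -2.0 * damping * v - x      # acceleration from the ODE
        v += a * dt                     # semi-implicit Euler step
        x += v * dt
        history.append(abs(x))
    return max(history[-steps // 4:])

stable_amp   = yaw_response(+0.1)   # decaying oscillation
unstable_amp = yaw_response(-0.1)   # growing oscillation
```

In both cases the vehicle initially swings back towards its original attitude; only the long-run envelope of the oscillation distinguishes a car that settles from one that spins without corrective inputs.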

Rear-end instability on corner entry is reportedly the handling characteristic which Jenson Button struggles most to deal with, but as his Brawn team-mate, Rubens Barrichello, demonstrated this year, it is a characteristic which different drivers can cope with to different degrees. Perhaps, then, the type of instability exhibited at times by the Brawn in the second half of the 2009 season, was merely the dynamic instability of a statically stable car.

Judging from Paddy Lowe's remarks, one can speculate that not only is Lewis Hamilton able to cope with such dynamic instability on corner entry, but to a degree unique amongst his peers, he is able to supply the corrective inputs necessary to prevent a statically unstable car from spinning on corner entry.

Monday, November 23, 2009

Why did Jenson Button leave Brawn?

On Friday, Eddie Jordan announced to the world that Michael Schumacher will be making a comeback with the Mercedes Formula 1 team. Generally speaking, Eddie Jordan is a reliable source of motorsport information in the same sense that Gillian McKeith is a qualified authority on diet and nutrition. In this case, however, Jordan's prediction makes a lot of sense, for Michael is clearly directionless without the opiate of Formula 1, and Mercedes have conspired to lose their World Champion driver, Jenson Button, to erstwhile partners McLaren.

And here's an interesting thing: Jordan claims that Mercedes's attempts to woo Michael "started with a meeting between Michael, Ross Brawn and Daimler chief executive officer Dieter Zetsche at the Abu Dhabi Grand Prix." This claim was corroborated on Sunday by Willi Weber, latterly Schumacher's manager, who said he was "sure that Schumacher had had talks with Dr Dieter Zetsche, head of Mercedes-Benz and Norbert Haug, who runs the company’s motor-sport division, at the Abu Dhabi Grand Prix."

At Abu Dhabi? Reports that Button could be lured to McLaren had surfaced in the week after he secured the World Championship in Brazil, but still it seemed that an agreement between Button and Brawn/Mercedes was a mere formality. As Ross Brawn commented at the time, "We are working with Jenson to find a balance between what we can afford and what he feels is fair for his status and what he can contribute in the future...You are never 100% but I would say 99% [certain it will happen]."

The possibility of Button switching to McLaren was interpreted as a mutually convenient negotiating ploy: it let Brawn/Mercedes know that Button had another option, and it let McLaren candidate Kimi Raikkonen know that McLaren too had other options. Mercedes motorsport boss Norbert Haug, for one, was rather dismissive of the possibility that Jenson Button and Lewis Hamilton could end up in the same team: "I do understand that people in England are dreaming of an English team with two world champions in the cockpits... However, dreams don't always come true."

By Abu Dhabi, however, Haug and Mercedes were apparently considering the loss of Button as a serious prospect, and began exploring the Schumacher option before Button put pen to paper with McLaren.

Both Mercedes and McLaren now claim that money was not an issue, that Mercedes offered Button the £8 million a year salary he was seeking, and that McLaren ultimately granted Jenson a deal worth less than Mercedes were offering. This, however, is not the point. Whilst Mercedes's final offer matched Button's salary requirements, the initial deal which Brawn/Mercedes offered to Jenson was only £4 million. This constituted little advance on the reduced salary which Jenson had voluntarily accepted to keep the team afloat when Honda pulled out at the end of 2008, and Jenson probably perceived this as something of a slight. By the time that Jenson was escorted on a tour of the Sir Norman Foster-designed McLaren Technology Centre (MTC) at Woking, the damage may already have been done.

To understand the effect this may have had on Jenson, it's worth recalling the words of Ron Dennis, speaking to Nigel Roebuck in late 2001 (Autosport, December 20-27, p23) before the opening of the MTC:

We were looking for perfection, so we didn't want [MTC] to look out over any buildings. When people are working there, all they'll see out of the windows is fields and trees...I believe that good technical resources attract the best people like a magnet...OK, they want money - and money becomes like a rate card...You measure yourself in financial terms - yes, it affects your lifestyle, but primarily it's a reflection of how good or bad you are. The best people get the most money - that should just be common sense. And once you've satisfied that desire, you've got to give them the best facilities.

So perhaps, then, the picture is as follows: Brawn/Mercedes made an offer which undervalued Button's services, at which point Jenson's management team made contact with McLaren to develop some negotiating leverage; in response, Brawn/Mercedes tried to cover the possibility of losing their new World Champion by developing an interest in Michael Schumacher; already offended by the comparatively low nature of the salary on offer, Button possibly became aware of Mercedes's apparent attempt to seduce Schumacher out of retirement, and decided to take the McLaren offer seriously; Jenson's eyes were then opened by the yin and the yang of the McLaren Technology Centre, and he reciprocated the interest of his new suitor, irrespective of salary.

Wednesday, November 18, 2009

Can Jenson beat Lewis?

It seems to be a week for unusual and inexplicable combinations. Erstwhile Toyota F1 driver Jarno Trulli revealed at the weekend that he was seriously considering an eventual move to NASCAR, the American stock-car racing series. On the face of it, this would be as appropriate as Brian Sewell playing centre-forward for Caledonian Thistle.

Jarno, of course, was so indignant at being hung out to dry by Adrian Sutil in the Brazilian Grand Prix, that he turned up at the next race in Abu Dhabi with a portfolio of photographs and video evidence to prosecute his case. Sadly, however, the driving tactics in NASCAR hardly constitute the Queensberry Rules either...

This is completely overshadowed, however, by Jenson Button's apparently counter-intuitive decision to join Lewis Hamilton at McLaren next year. Many pundits have advised Jenson against this because McLaren appears to be very much Lewis's team, and many have predicted that Lewis would blow Jenson away if the two were partnered in the same car.

There is, however, at least one factor in Jenson's favour. Next year's cars will have narrower front tyres, and the presence of larger fuel tanks will shift the centre of mass, and therefore the centre of aerodynamic pressure, further towards the rear of the car. This is potentially very much to Jenson's advantage, and to Lewis's detriment. A centre of pressure further towards the rear potentially reduces rear instability under braking, a handling trait which Jenson struggles to deal with. Furthermore, Lewis notoriously favours a car with a strong front-end, and the narrowing of the front tyres and the shift in the centre of aerodynamic pressure will both contribute towards making next year's cars more liable to understeer. Perhaps Jenson, then, fancies his chances against Lewis...

The primary solecism of quantum theory

Last night, the BBC's popular science series, Horizon, once again featured the comedian Alan Davies, in a programme entitled How long is a piece of string? Like virtually every other popular science book or programme, it too reiterated the following interpretation of quantum theory:

A particle can be in two different places, A & B, at the same time.

This is a claim repeated not just by many science journalists and popularisers, but also by many working physicists, yet it is completely wrong. Instead, the weirdness of quantum theory arises from the fact that either-or statements of the following form,

Either particle x is at position A or particle x is at position B,

can be true, even though quantum theory doesn't represent either of the constituent disjuncts to be true:

Particle x is at position A.

Particle x is at position B.

Thus, a disjunction can be true in quantum theory without either of the disjuncts being true. In technical terms, the logic of quantum theory is said to be non-distributive. This is the problem which any successful interpretation of quantum theory must deal with, and, in particular, this is why one of the possible approaches is the so-called hidden variables interpretation, which claims that quantum theory provides an incomplete specification of the actual state of a physical system.
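The failure of distributivity can be made concrete in the simplest possible setting: the lattice of subspaces of a two-dimensional state space, where 'and' is subspace intersection and 'or' is subspace span. The following sketch (plain Python, with real 2-d vectors standing in for spin-half states, chosen purely for illustration) exhibits the failure:

```python
def parallel(u, v):
    """True if 2-d vectors u and v span the same line (zero cross product)."""
    return u[0] * v[1] - u[1] * v[0] == 0

def meet(a, b):
    """Lattice meet: intersection of subspaces ('zero', ('line', v), 'all')."""
    if a == "zero" or b == "zero":
        return "zero"
    if a == "all":
        return b
    if b == "all":
        return a
    return a if parallel(a[1], b[1]) else "zero"

def join(a, b):
    """Lattice join: the span of the union of two subspaces."""
    if a == "all" or b == "all":
        return "all"
    if a == "zero":
        return b
    if b == "zero":
        return a
    return a if parallel(a[1], b[1]) else "all"

# Propositions as 1-d subspaces: spin-up along z, spin-up/down along x.
z_up = ("line", (1, 0))
x_up = ("line", (1, 1))
x_dn = ("line", (1, -1))

# z_up AND (x_up OR x_dn): the join of x_up and x_dn is the whole space,
# so the left-hand side is z_up itself...
lhs = meet(z_up, join(x_up, x_dn))
# ...but (z_up AND x_up) OR (z_up AND x_dn) is the join of two zero
# subspaces, i.e. zero. Distributivity fails.
rhs = join(meet(z_up, x_up), meet(z_up, x_dn))
```

The disjunction "x is spin-up along x or x is spin-down along x" is true in the spin-up-along-z state, yet neither disjunct is true, exactly as the quantum logic above requires.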

For a summary of the issues involved, one could do worse than this excellent review from Oxford philosopher of physics David Wallace: The quantum measurement problem: state of play.

Paradoxes, however, are good press, and physicists are trying to sell their products to an increasingly ill-educated public, so don't expect the solecisms to abate.

Monday, November 16, 2009

Shell V-power and journalism

Being not disinterested in the world of motoring journalism, I was intrigued the other day to find what, at first sight, appeared to be a competition for aspiring journalists, jointly organised by Shell and Auto Express magazine.*

Sounds good, I thought.

But how, exactly, would the applicants demonstrate their journalistic flair and investigative powers to the competition judges? Well, here's how:

Please now share your thoughts in a written statement of 75 words or fewer in response to the question: "What do you think about Shell V-Power and the difference it makes to you and your car?"

What I like about this is the brazen assumption that there really is no difference at all between a journalist and a PR or press officer. Personally speaking, I was hoping to sell my soul after 5 years of fruitless struggle in journalism, not cave-in at the very outset.

Still, have to be in it to win it, I suppose, so here goes:

As the latest premium brand of fuel from the type of company which Robert Mugabe refuses to deal with on ethical grounds, Shell V-power is designed to fool affluent and poorly-informed Western motorists into believing that a higher octane 'quality' will improve power, irrespective of an engine's compression ratio. Manufactured from the pituitary glands of Ugandan infants, and blended with the tears of Filipino child prostitutes, Shell V-power is now able to scrub CO2 from the atmosphere, and is increasingly used as a cure for syphilis and pancreatic cancer.

I look forward to my trip to Shell Global Solutions in Thornton, Cheshire.

*Many thanks to Patrick's Motorsports Ramblings for this link.

Saturday, November 14, 2009

Supersymmetry and the Large Hadron Collider

Like a parent cautiously returning to a firework which failed to launch at the first attempt, scientists at CERN are about to switch the Large Hadron Collider (LHC) on again.

Anil Ananthaswamy duly provides a decent summary in New Scientist of the prospects for the LHC finding evidence of supersymmetric particles as well as the Higgs boson.

The basic idea of supersymmetry is that the two types of elementary particles with which we are familiar, bosons and fermions, are actually just different states of single particle types. In this respect, it is postulated that each type of boson has a fermionic partner, and each type of fermion has a bosonic partner. Supersymmetry therefore predicts the existence of numerous particles which have not hitherto been detected. For example, the photon, (a boson) has a hypothetical supersymmetric partner called the photino (a fermion).

The particle ontology of supersymmetry is then twinned with a cosmological explanation of why bosons and fermions are observed as distinct particles in terrestrial laboratories. It is proposed that the symmetry between bosons and fermions was respected at the higher energy levels found in the early universe, but as the universe expanded and energy levels dropped, supersymmetric symmetry breaking took place, with the consequence that the bosons and fermions with which we are familiar interact very rarely with their supersymmetric partners. These weakly-interacting supersymmetric particles then provide a nice candidate to explain the existence of dark matter in astronomy and cosmology.

Mathematically, supersymmetry also entails an interesting modification to the definition of what an elementary particle is. The latter is intimately related to the local space-time symmetry group, the group of symmetries possessed by every small patch of space-time, irrespective of how those patches are sewn into a global space-time. Without supersymmetry, the local space-time symmetry group of our universe appears to be a subgroup of the Poincare group. Wigner established that each type of elementary particle corresponds to an irreducible unitary Hilbert space representation of this subgroup of the Poincare group (with a few technical qualifications concerning so-called covering groups).

However, if it transpires that our universe is a supersymmetric universe, then the definition of an elementary particle has a straightforward generalisation. The local space-time symmetry group becomes (a subgroup of) the super-Poincare group, and the set of possible supersymmetric elementary particles is then defined by the irreducible unitary Hilbert space representations of (a subgroup of) the super-Poincare group. Each such representation decomposes into a direct sum of unitary irreducible representations of the Poincare group, and the members of such a supersymmetric 'multiplet' are said to be super-partners of each other.
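Schematically, the decomposition can be stated for the simplest (N = 1) massless case, where each supermultiplet pairs Poincare irreps of adjacent helicities (this is the standard textbook classification, quoted here without derivation):

```latex
% Massless N=1 supermultiplet (schematic): an irreducible unitary
% representation of the super-Poincare group restricts to a direct sum of
% two Poincare irreps of adjacent helicity (plus their CPT conjugates):
\mathcal{H}_{\mathrm{super}} \;\cong\; \mathcal{H}_{\lambda} \,\oplus\, \mathcal{H}_{\lambda + 1/2}
% e.g. for \lambda = 1/2 this pairs the photino (helicity 1/2) with the
% photon (helicity 1), making them super-partners within one multiplet.
```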

Mathematicians interested in symmetry, and philosophers interested in the mereological concept of elementarity, should therefore share a common interest in the data physicists are set to harvest from the LHC's detectors.

Monday, November 09, 2009

The latitude and longitude of F1

It has been observed on more than one occasion over the years that Bernie Ecclestone could save everyone in Formula One a lot of time and money by the simple expedient of holding each Grand Prix in the same location, and merely changing the scenery every couple of weeks.

There has, of course, also been a trend in recent years for the centre of gravity of the championship calendar to become increasingly Oriental. Thus, if one were to hold each Grand Prix in the same location, where would be the most appropriate place in which to hold it? Probably not somewhere in Europe. To be scientific about this issue, then, let us propose instead that we take the average latitude and longitude of all the race tracks on the 2010 Formula One calendar.

Assuming that the British Grand Prix will be held at Silverstone, the 2010 calendar consists of the following geographical locations:

Race        Latitude     Longitude
Bahrain     26.0325      50.510556
Australia   -37.849722   144.968333
Malaysia    2.760556     101.738333
China       31.338889    121.219722
Spain       41.57        2.261111
Monaco      43.734722    7.420556
Turkey      40.951667    29.405
Canada      45.505833    -73.526667
Europe      39.458889    -0.331667
Britain     52.071       -1.016
Germany     49.327778    8.565833
Hungary     47.578889    19.248611
Belgium     50.437222    5.971389
Italy       45.620556    9.281111
Singapore   1.291389     103.863611
Japan       34.843056    136.540556
S. Korea    34.733333    126.416667
Abu Dhabi   24.467222    54.603056
Brazil      -23.703611   -46.699722

The average latitude and longitude of the 2010 Formula 1 calendar is therefore 28.95632537, 42.12886447. These coordinates turn out to lie in a desert region in Saudi Arabia called Al Haiyaniya (pictured in the satellite image above).

The perfect location, then, for the entire championship.
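For the sceptical reader, the arithmetic can be checked in a few lines of Python. The circuit coordinates below are standard published values in decimal degrees; slightly different choices of reference point will shift the answer marginally:

```python
# Arithmetic-mean latitude/longitude of the 2010 F1 calendar.
circuits = {
    "Bahrain":   (26.0325,    50.510556),
    "Australia": (-37.849722, 144.968333),
    "Malaysia":  (2.760556,   101.738333),
    "China":     (31.338889,  121.219722),
    "Spain":     (41.57,      2.261111),
    "Monaco":    (43.734722,  7.420556),
    "Turkey":    (40.951667,  29.405),
    "Canada":    (45.505833,  -73.526667),
    "Europe":    (39.458889,  -0.331667),
    "Britain":   (52.071,     -1.016),
    "Germany":   (49.327778,  8.565833),
    "Hungary":   (47.578889,  19.248611),
    "Belgium":   (50.437222,  5.971389),
    "Italy":     (45.620556,  9.281111),
    "Singapore": (1.291389,   103.863611),
    "Japan":     (34.843056,  136.540556),
    "S. Korea":  (34.733333,  126.416667),
    "Abu Dhabi": (24.467222,  54.603056),
    "Brazil":    (-23.703611, -46.699722),
}

mean_lat = sum(lat for lat, lon in circuits.values()) / len(circuits)
mean_lon = sum(lon for lat, lon in circuits.values()) / len(circuits)
# mean_lat, mean_lon come out at roughly (28.956, 42.128)
```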

Saturday, November 07, 2009

The brain of a racing driver

Is there a basic neuro-physiological difference between the brains of racing drivers and the brain of a normal person? Is a racing driver's preference for understeer or oversteer determined by neuro-physiological differences?

The techniques for answering both of these questions are already available, and have been applied by University College London (UCL) to study the structure and activity of the brains of taxi drivers.

An initial UCL study, published in 2000, used magnetic resonance imaging to discern that the posterior hippocampus was enlarged in a sample of 16 London taxi drivers, compared to a control sample of 50 people. It also found a positive correlation between the length of time a taxi driver had spent in the job, and the volume of the right hippocampus. The hippocampus is associated with navigational abilities (as well as the establishment of long-term memories), hence it was hypothesized that the posterior hippocampus actually grows in response to the navigational demands placed upon it.

A second study, publicised in 2008, examined the activity of a taxi driver's brain in real-time, using functional magnetic resonance imaging (fMRI), as the drivers navigated their way through the streets of a computer simulation:

The hippocampus was only active when the taxi drivers initially planned their route, or if they had to completely change their destination during the course of the journey.

The scientists saw activity in a different brain region when the drivers came across an unexpected situation - for example, a blocked-off junction.

Another part of the brain helped taxi drivers to track how close they were to the endpoint of their journey; like a metal detector, its activity increased when they were closer to their goal.

There is no reason why a similar pair of studies could not be conducted on racing drivers. Very sophisticated driving simulators now exist, including ones which can simulate the kinaesthetic sensations of driving a racing car. The latter point is particularly important, because a racing driver can sense the limit of a car's adhesion, not merely by hand-eye coordination, but also by the sensations associated with changes of linear momentum and angular momentum. The first mechanism is a sensory feedback loop involving the visual cortex and motor cortex, whilst the latter is likely to be a feedback loop operating between the somatosensory cortex and the motor cortex.

Is any part of the brain in a racing driver enlarged compared to the background population? Do drivers who prefer oversteer have a larger somatosensory cortex than drivers who prefer understeer? Or are the differences purely psychological, rather than neuro-physiological?

I propose that we introduce UCL's fMRI to McLaren's driving simulator, and ask Mr Hamilton and Mr Raikkonen if they'd like to take part in a small medical study...

Tuesday, November 03, 2009


Driving a Jaguar is the closest one can get to driving a neoclassical temple on the open road.

It is rarely emphasised, however, that the iconic design of the Jaguar owes an awful lot to the styling of the bonnet. To be precise, it is the presence of the fluting in the bonnet, and the blending of that fluting into the curves of the protruding headlights, which give the Jaguar its Palladian aesthetic.
Such fluting is rarely seen on modern roadcars, and racing car designers are no longer noted for their stylistic flourishes, but, pleasingly, there is extensive use of it on the engine shroud of Adrian Newey's first Formula 1 car, the March 881.

Saturday, October 31, 2009

Yas and the plastic population

By day, the translucent raiment of Yas Hotel resembles gossamer film, draped over hedgerows on a dewy morning; by night, it mimics the violet bioluminescence of deep-sea cnidarians.

Snetterton, it definitely isn't. In fact, in terms of cultural distance, it might be impossible to devise a facility further from Snetterton than Abu Dhabi's Yas Marina facility.

The drivers, however, seem to like the circuit. Lewis Hamilton, in fact, gave it the thumbs-up on the basis that "it's really smooth, the kerbs are nice and in the right places." What more could one want of a racing circuit?

Meanwhile, Ron Dennis has made his first appearance at a Grand Prix since his enforced exile from Formula 1 in the wake of the lie-gate controversy. Coincidentally, this is also the first Grand Prix since the end of Max Mosley's tenure as FIA President. Coincidentally, McLaren Automotive, the road-car division of the McLaren Group, which Ron Dennis has been heading-up this year, have just launched the McLaren MP4-12C. Their new vehicle will be powered, not by a Mercedes engine, but by a McLaren engine. Coincidentally, it is believed that Mercedes will be leaving McLaren at the end of 2011, and buying a 75% shareholding in Team Brawn. Coincidentally, the initial investment in Team Brawn will be made by Aabar, an Abu Dhabi company affiliated to Daimler-Benz, thereby avoiding the exclusive ownership terms in the current Mercedes contract with McLaren.

Thus, whilst there is much talk of whether Jenson Button is really considering the possibility of leaving Team Brawn to join McLaren next year, Anthony Hamilton will surely have noted the direction in which the engine-supply wind is now blowing, and will perhaps be thinking of a move in the opposite direction, at the appropriate time. In this context, it is also interesting to note that Rubens Barrichello recently confirmed on Brazilian TV that his initial 2009 contract with Team Brawn only covered the first four races. One might recall that during the fallout from lie-gate, there was a suggestion that McLaren were in breach of contract to Lewis Hamilton. Barrichello's admission seems to substantiate the view that Anthony Hamilton really could have transferred Lewis into Team Brawn at the time, and perhaps helps to further explain why Ron Dennis was compelled to fall on his sword.

The McLaren MP4-12C, incidentally, has an interesting one-piece carbon 'monocell' construction (pictured on the left here). It's certainly not a monocoque, for there is essentially just half a shell here, so perhaps it would be most appropriate to refer to it as a plastron chassis, lacking as it does a carapace (the dorsal half of the shell in turtles).

The other great political cause célèbre entertaining minds at the Yas Marina circuit is the perpetual state of limbo into which the British Grand Prix seems to have fallen. Intriguingly, the Business Secretary, and Prince of Darkness himself, Lord Mandelson, phoned Bernie Ecclestone this week to stress the importance of the British Grand Prix to the UK.

Lord Mandelson doesn't strike one as the sort of person to contact Bernie Ecclestone lightly. In strategic terms, the Business Secretary would either want to have no part whatsoever in the negotiations over the British Grand Prix, or if he did decide to get involved, he would surely do so only in the belief that Ecclestone could be encouraged to come to a settlement with Silverstone. If Lord Mandelson got involved without having any form of traction, he would leave himself vulnerable to appearing impotent.

Whilst Mandy encouraged Bernie to retain the British Grand Prix, Bernie reassured Mandy that he was doing everything possible. Only a shot across the bows at this stage, then, but a full confrontation between Mandelson and Ecclestone would truly be the political equivalent of Alien vs Predator...

Nigel Mansell, love, and pronouns

[After Williams signed Alain Prost for 1993, and declined to meet Nigel Mansell's contract demands, Mansell] raved to the tabloids about how unfair it all was, but received rather less sympathy from the English specialist press, whom he had thoroughly alienated in an interview with L'Equipe, in which he described us as 'corrupt' for our unwillingness to rank him with Senna or Prost. Given that Mansell's relationship with the subtleties of the English language was never an easy one, it may be that this was not the adjective he had intended to use; whatever, the damage was done. (Nigel Roebuck, Chasing the Title, p247).

The story of Nigel Mansell is essentially half of a love story. Fail to understand this, and it is impossible to understand Mansell and his Formula 1 career.

Nigel met Rosanne when he was 17 years old, stopping his Mini at the roadside to offer her a lift to Solihull Technical College. They married in 1974, and have been together ever since. Rosanne sold her own road-car to help Nigel buy his first Formula Ford racing car, and, as Simon Taylor recalls in the December issue of Motorsport Magazine, the transition to Formula 3 not only saw Nigel working night-shifts as an office cleaner, and Rosanne working overtime as a British Gas demonstrator, but ultimately required the young couple to sell their own home, raising a paltry £6,000 in the process.

Despite the lack of finance, and a couple of potentially serious accidents in the lower formulae, Mansell's self-belief and courage caught the eye of Lotus boss Colin Chapman, and he gained a precarious foothold in Formula 1. The significance of Rosanne's contribution to his career was aptly characterised by Paul Kimmage in early 2006: "She was there in 1980 when he made his F1 debut for Lotus. She was there in 1985 when he won his first grand prix for Williams at Brands Hatch. She was there in 1986 when he lost the world championship when a rear tyre exploded in Australia. She was there in 1989 when he won on his debut for Ferrari. She was there in 1992 when he was finally crowned champion for Williams. She was there in 1993 when he was IndyCar champion. She was there in 1995 when he drove his last race. For 35 years, she was there."

The story of Nigel and Rosanne Mansell is ultimately a story of sacrifice and devotion, and Mansell's career in Formula 1 is a gripping tale of battles fought against adversity, spectacular overtaking manoeuvres, and ultimately the wresting of the World Championship. Yet despite all this, Mansell's relationship with the specialist press had degenerated into mutual vitriol by the early 1990s.

Nigel's self-belief was matched only by his persecution complex, a strong cultural meme in the Birmingham and Black Country area. Mansell bristled with aggression and paranoia at any criticism he received in print, and his attitude towards the specialist press often led them to deliberately understate his achievements; something of a vicious circle.

To understand Mansell's behaviour, however, one needs not only to grasp his egotism and persecution complex, but also to recognise that his antagonism towards the press was a conjugal reflex response to the presence of an external threat. Nigel and Rosanne, almost as a holistic entity, had made enormous sacrifices to realise their dreams, and any criticism in the press was perceived by Mansell as an attempt to undermine this achievement.

Simon Taylor comments that "When referring to himself, [Mansell] mixes his pronouns, as he always did...: 'When I look back I say to myself, with what we were up against, we were so lucky and fortunate that we accomplished what we did'."

This, however, is to underestimate Mansell. There is no confusion of pronouns in Mansell's words. There were occasions during his racing years when Mansell used 'we' to refer to himself and his team, but more fundamentally it referred to Nigel and Rosanne. As a case in point, consider how Mansell recalls the announcement of his first retirement, at the British Grand Prix in 1990: "It was a genuine decision. Rosanne and I had talked it through before Silverstone, and we'd decided we were being manipulated."

We were being manipulated.

Nigel supported Rosanne as she fought cancer in 2004, and Paul Kimmage's interview concludes with the following Mills and Boonesque lines:

He glances across at Rosanne for confirmation. She shrugs and smiles.

"You know that you’re my hero, don’t you, for all that you’ve been through," he whispers tenderly.

Monday, October 26, 2009

Analysing Formula 1

It's well-known that the greatest number of Grand Prix winners in a single season is eleven, a number attained during the epic and tragic 1982 season. Did you know, however, that the second greatest number of winners in a single season is nine, from the 1975 season?

In fact, in the history of Formula 1, only six seasons have featured eight or more winners, and all but one of those occurred between 1975 and 1985 (the subsequent exception being 2003). On the basis of this fact alone, one might argue that the years between 1975 and 1985 define the most competitive era in the sport's history. It also raises all sorts of questions about the conditions which led to such a competitive environment, and why they have so rarely pertained since.

These facts are gleaned from Roger Smith's colourful 2008 tome on the statistics of Formula 1, which is now available in paperback. It's well worth a purchase, for this was clearly a labour of love for Smith. Of particular interest is the book's concluding chapter, where Smith expounds the results of a rating system which enables all the champion drivers to be ranked, irrespective of the eras in which they raced.

The basic performance indicator chosen by Smith is a driver's strike rate, the number of Grand Prix victories as a fraction of races contested. After ranking the drivers by strike rate, Smith then attempts to adjust the ranking to compensate for the superiority of the equipment at a driver's disposal, and the strength of the driving competition he faced. Sadly, Smith doesn't 'show his working' here, but he does explain that the superiority of a driver's equipment in any particular year can be estimated by factors such as: the absolute share of wins achieved by the driver's team; the number of 1-2s; the number of victories by the driver's team-mate; and the share of wins relative to the second most successful team that year. How Smith disentangles the strength of the driving competition from the strength of the equipment available to the competition is unclear, but the upshot is a rating system which places Fangio first, Clark second, and Schumacher third.
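The raw metric Smith starts from, before his undisclosed adjustments, can be sketched in a few lines. The win and start totals below are approximate career figures quoted purely for illustration, not numbers taken from Smith's book:

```python
# Sketch of a raw strike-rate ranking: victories as a fraction of
# races contested. Figures are approximate career totals, for
# illustration only.
drivers = {
    "Fangio": (24, 51),
    "Clark": (25, 72),
    "Schumacher": (91, 249),
}

def strike_rate(wins, starts):
    """Grand Prix victories as a fraction of races contested."""
    return wins / starts

ranking = sorted(drivers.items(),
                 key=lambda kv: strike_rate(*kv[1]),
                 reverse=True)

for name, (wins, starts) in ranking:
    print(f"{name}: {strike_rate(wins, starts):.3f}")
```

On raw strike rate alone, Fangio comes out comfortably ahead; it is only after the equipment and competition adjustments that Smith's final ordering emerges.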

A first objection is that it's difficult to argue with Smith's reasoning without being able to see his detailed calculations. There is, in addition, a serious omission underlying Smith's rating system, an error committed by every published attempt to rank the all-time greats: the failure to adjust for the size of the competitive pool, which has been steadily growing since the inception of the Formula 1 World Championship.

In the 1950s and 1960s, only a relatively small number of people were competing in single-seater motorsport, and there were only a small number of formulae. The number competing at all levels of the sport has increased massively from the 1970s and 1980s onwards. Developing hand-in-hand with this has been the proliferation of the different junior formulae, all arranged in a pyramidal structure, filtering out the best drivers at each stage (in theory!), and feeding them towards the world of Formula 1, located at the tip of the pyramid. At the base of the pyramid is the immensely competitive world of kart racing, into which thousands of children across the world are now inducted at an early age every year, to begin learning the craft of the racing driver.

As a general principle of performance statistics, all other things being equal, the best person from a large competitive pool is likely to be better than the best person from a small competitive pool. Aphoristically, it's easy to be a big fish in a small pond, and the best drivers of the 1950s and 1960s were essentially just that. The Fangios and the Clarks were the tips of very small pyramids, whilst the Sennas, Schumachers and Hamiltons are the tips of very large pyramids.
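The big-fish-in-a-small-pond principle can be illustrated with a toy simulation: draw a 'talent' value for each driver in a pool, crown the best one as champion, and compare champions drawn from small and large pools. The normal distribution and the pool sizes here are arbitrary assumptions for the sketch, not claims about the sport's actual talent distribution:

```python
import random

random.seed(0)

def best_of(pool_size):
    # One 'talent' draw per driver; the era's champion is the maximum.
    return max(random.gauss(0, 1) for _ in range(pool_size))

# Average over many simulated 'eras': champions drawn from bigger
# pools are systematically stronger, purely as a sampling effect.
eras = 200
small = sum(best_of(100) for _ in range(eras)) / eras
large = sum(best_of(10_000) for _ in range(eras)) / eras

print(f"average champion talent, pool of 100:    {small:.2f}")
print(f"average champion talent, pool of 10,000: {large:.2f}")
```

Nothing about the individual drivers changes between the two cases; the larger pool simply samples deeper into the tail of the distribution.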

As a comparison, consider American single-seater racing. This is a much smaller competitive pool than the hierarchy of single-seater formulae in the rest of the world, which feeds into Grand Prix racing. Thus, it is easy for drivers such as Al Unser Jnr or Michael Andretti to look devastating in Indycar racing, but to fail badly when they attempt the transition to Grand Prix racing. The best Formula 1 drivers of the 1950s and 1960s are comparable to the best drivers in American single-seater racing. It's quite possible that the best driver ever could have raced in American single-seater racing, and it's still quite possible that Fangio or Clark was actually the best driver the world has ever seen, but on the basis of statistics alone, adjustment for the different sizes of the competitive pools militates against this conclusion.

The best drivers in the world effectively lie in the tail-end of the distribution of driver talent, and as a general statistical rule, unless you take very large sample sizes, you're unlikely to be sampling from the tail-end of a distribution. For example, if the frequency of great drivers (those in the tail-end of the talent distribution) is 1-in-100,000, then at a time when there are only, say, 100 drivers in the world, the chance that one of them will be a great driver will only be 1-in-1,000. Hence, it's highly unlikely (but not impossible), that the best driver the world has ever seen was part of the small sample of Grand Prix drivers found in the 1950s and 1960s.
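The arithmetic above is just the complement rule: the probability that a pool of n drivers contains at least one 'great' is 1 − (1 − p)ⁿ, which for p = 1-in-100,000 and n = 100 comes out at roughly 1-in-1,000, as claimed. A quick check (the function name is mine):

```python
# Probability that a pool of n drivers contains at least one 'great'
# driver, when greats occur with frequency p in the talent distribution.
def chance_of_a_great(p, n):
    return 1 - (1 - p) ** n

small_pool = chance_of_a_great(1 / 100_000, 100)      # ~1-in-1,000
large_pool = chance_of_a_great(1 / 100_000, 100_000)  # ~63%
print(f"pool of 100: {small_pool:.5f}, pool of 100,000: {large_pool:.3f}")
```

With a worldwide pool of 100,000 drivers, by contrast, the chance of at least one great rises to about 63%, which is the whole of the sampling argument in one line.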

All of which is maybe a way of saying that statistics alone cannot be used to support or refute the subjective appreciation of drivers, made by observers in the same era to which the drivers belong.