South African Jody Scheckter is perhaps best-known as the 1979 Formula 1 World Champion, and the winner of World Superstars in 1981. He is currently, however, the owner of Laverstoke Park in Hampshire, a flagship organic farm. "Modern farming is chasing profit, and the animals are bred to grow bigger faster," says Scheckter in this month's Motorsport magazine. "We're going in the opposite direction. We go for smaller and slower, because it tastes better. There are only 45 pure uncrossed Angus cattle left in the world, and we've got 13 of them here. There are probably 500 pure-bred Herefords left in the world, and we've got 80. We have 1000 head of buffalo, which is most of the UK population. Our sheep, our chickens, all are feeding totally naturally. I employ a full-time Doctor of Microbiology working in our own laboratory here...so we can get the soils and the grasses, herbs and clovers back to how they used to be. We hired the world's top animal psychologist to help us design our abattoir, so there is no stress on the animal when we kill it. We purify our own water - no chlorine, no fluoride - and within two years we want to be running all the farm machinery on non-fossil fuels, like rape seed oil. We want to be totally self-sustaining."
Imagine, the top animal psychologist in the world! Must be the only branch of psychology where 'talking about your problems' isn't considered to be a therapeutic solution. However, if this man can stop animals from fretting about the existence of the afterlife as they are led to the guillotine, then I too would like to hire him, mainly to help my colleagues get to work on a Monday without the normal levels of rancour and stress.
Anyway, I was most impressed with Mr Scheckter's organic farm, and when I read that Waitrose are now stocking his buffalo burgers, I decided, on this balmy Spring day, to take a trip to my local store.
On arriving at the store's entrance canal, I paused momentarily to pick up my cellophane-cradled copy of The Sunday Times, before forging inward to something called the Fresh Meat section. Here, I marvelled at the cuts of naked meat, stripped of their normal breadcrumb coating. Sadly, I could find no buffalo burgers. Instead, however, my eyes alighted upon a pack of 4 quarter-pounder Aberdeen Angus (99%) burgers. These I cooked with the grill in my oven on returning home. I placed the burgers on a type of grid in a pan, which was inserted into a slot under the grill. At this stage, it all began to get quite complex. There was a half-grill option, a full-grill option, and a 'thermal grill' option where the grill alternates with the fan. There were different recommended temperatures and cooking times for each option, so I just had to make my best guesstimate, and see what happened. After about 10 minutes, a type of greasy smoke began to issue forth from slots in the hob of the cooker, which I instinctively took to be a sign that I should turn the temperature down, and open the window. Having done so, the burgers sizzled happily to a cooked state. I was even able to clean the grid and pan afterwards without using any Acetonitrile.
And the burgers? Well, they tasted rather nice actually. But I do want those buffalo burgers...
Sunday, March 30, 2008
Friday, March 28, 2008
Terminal 5 and Systems Engineering
Those seeking to understand the ultimate root cause of the debacle at Heathrow's Terminal 5 this week might perhaps wish to consult British Airways' 'Baggage Handling Project Management Manual'. The key sentence can be found on p5:
The [project] process has a similar structure to the systems engineering principles used throughout industry.
Systems engineering is a top-down approach to large, complex engineering projects, which expends great time and effort in codifying and formalising the blindingly obvious, usually in the form of UML (Unified Modelling Language) diagrams.
The fundamental fallacy of Systems Engineering is the tacit belief that complex engineering systems are best managed by defining, often in very abstract terms, top-level requirements, capabilities and stakeholders, and by then breaking those top-level entities down into sub-requirements, sub-capabilities and sub-stakeholders, all considered in abstraction from specific technologies. Systems Engineering often requires systems houses to expend large amounts of time and money before crucial procurement decisions are made, and therefore constitutes a form of institutionalised procrastination. And, crucially, the top-down dogma has a tendency to produce systems lacking in practical operational effectiveness. True, practical, rapid and effective engineering uses both top-down and bottom-up approaches, in combination, and without any of the wasteful and vacuous formalising used by systems engineers.
Intellectually, Systems Engineering has never progressed beyond the facile observation that complex engineering systems are really 'systems of systems'. On Friday's edition of Newsnight, Allen Fairbairn, Systems Engineering Manager for the Channel Tunnel Project, offered the useful, and entirely non-specific diagnosis that large engineering systems often fail at the point where the various subsystems interact. This is the self-sustaining aspect to Systems Engineering: when it is responsible for the failure of a project, it provides generalised diagnoses of why the project failed, and prescribes more Systems Engineering as the solution.
Wednesday, March 26, 2008
Digital radio
I'm fundamentally opposed to digital radio. Let me explain:
The idea of analogue radio is that the frequency and amplitude of sound waves at the source can be transmitted over great distances by mapping them to the frequency and amplitude of electromagnetic waves, and thence re-created as sound waves again at their destination. As such, analogue radio exploits isomorphisms between sound waves and radio waves. Digital radio, in contrast, turns continuous sound waves into binary digits, encodes those binary digits into electromagnetic waves, and then decodes and re-creates the sound waves at their destination. This improves the quality of the sound but destroys the feel of radio.
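The digitisation step described above — turning a continuous wave into binary digits and back — amounts to sampling and quantisation. Purely by way of illustration (this is a toy sketch of the general principle, not of any actual broadcast standard, and the function names are my own):

```python
import math

def sample(f, fs, n):
    """Sample a unit-amplitude sine of frequency f at sampling rate fs, n samples."""
    return [math.sin(2 * math.pi * f * k / fs) for k in range(n)]

def quantize(xs, bits=8):
    """Map each sample in [-1, 1] to a signed integer code of the given bit depth."""
    levels = 2 ** (bits - 1) - 1
    return [round(x * levels) for x in xs]

def reconstruct(codes, bits=8):
    """Map integer codes back to approximate sample values in [-1, 1]."""
    levels = 2 ** (bits - 1) - 1
    return [c / levels for c in codes]

# A 1 kHz tone sampled at 8 kHz, squeezed into 8-bit codes and recovered:
wave = sample(f=1000, fs=8000, n=16)
codes = quantize(wave)
approx = reconstruct(codes)
error = max(abs(a - b) for a, b in zip(wave, approx))
```

The recovered wave differs from the original only by a small, bounded quantisation error — which is exactly the point: the hiss is rounded away, for better or worse.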
The essence of radio lies in the hiss. I want to listen to the radio late at night, with the lights off, my head on the pillow, and feel that there's something magic in the way my little radio catches the hidden, delicate, fleeting signal from the omnipresent background of electromagnetic noise. I want to feel the distance over which the signal has travelled, the variations in the volume and quality of the signal as the meteorological conditions fluctuate. I want to manually tune my radio, not have the radio tune itself. Digital radio, like radio over the internet, just feels like a voice in my own head. I want to sense how many different voices and channels there are out there, competing for space in the electromagnetic spectrum. And I want to hear the background hiss, the hiss that was present before humanity evolved, and the hiss that will prevail after humanity has passed away. I want to hear the voices climbing out of the stochastic abyss.
Tuesday, March 25, 2008
The laws of physics and the art of motorcycle maintenance
I recently came across the following piece of dialogue from Robert Pirsig's classic, 'Zen and the art of motorcycle maintenance':
After a while [John] says, "Do you believe in ghosts?"
"No," I say.
"Why not?"
"Because they are un-sci-en-ti-fic."
The way I say this makes John smile. "They contain no matter," I continue, "and have no energy and therefore, according to the laws of science, do not exist except in people's minds."
The whiskey, the fatigue and the wind in the trees start mixing in my mind. "Of course," I add, "the laws of science contain no matter and have no energy either and therefore do not exist except in people's minds. It's best to be completely scientific about the whole thing and refuse to believe in either ghosts or the laws of science. That way you're safe. That doesn't leave you very much to believe in, but that's scientific too.
"...Modern man has his ghosts and spirits too, you know."
"What?"
"Oh, the laws of physics and of logic -- the number system -- the principle of algebraic substitution. These are ghosts. We just believe in them so thoroughly they seem real.
"They seem real to me," John says.
"I don't get it," says Chris.
So I go on. "For example, it seems completely natural to presume that gravitation and the law of gravitation existed before Isaac Newton. It would sound nutty to think that until the seventeenth century there was no gravity."
"Of course."
"So when did this law start? Has it always existed?"
John is frowning, wondering what I am getting at.
"What I'm driving at," I say, "is the notion that before the beginning of the earth, before the sun and the stars were formed, before the primal generation of anything, the law of gravity existed."
"Sure."
"Sitting there, having no mass of its own, no energy of its own, not in anyone's mind because there wasn't anyone, not in space because there was no space either, not anywhere...this law of gravity still existed?"
Now John seems not so sure.
"If that law of gravity existed," I say, "I honestly don't know what a thing has to do to be nonexistent. It seems to me that law of gravity has passed every test of nonexistence there is. You cannot think of a single attribute of nonexistence that that law of gravity didn't have. Or a single scientific attribute of existence it did have. And yet it is still 'common sense' to believe that it existed."
John says, "I guess I'd have to think about it."
"Well, I predict that if you think about it long enough you will find yourself going round and round and round and round until you finally reach only one possible, rational, intelligent conclusion. The law of gravity and gravity itself did not exist before Isaac Newton. No other conclusion makes sense.
"And what that means," I say before he can interrupt, "and what that means is that that law of gravity exists nowhere except in people's heads! It's a ghost!"
The confusion that Pirsig trades upon here is that which exists between concrete objects and abstract objects. Ghosts are purported concrete objects, which fail to exist, whilst the laws of physics are abstract objects, generalisations from empirical data, which do exist. The laws of physics have no matter or energy themselves, and they do not occupy regions of space, but they exist by virtue of the behaviour exhibited by objects which do possess matter and energy, and which do occupy regions of space. Ghosts, in contrast, are not generalisations from empirical data, but purported individuals.
Whether the laws of physics exist independently of the physical world is another question altogether. There are at least three possible philosophical positions on the laws of nature. The laws can be treated as either:
1) Contingent relations between particulars.
2) Contingent relations between universals.
3) Necessary relations between universals.
Consider as an example the law F = ma. There are three properties here, the force experienced by an object F, the mass of an object m, and the acceleration of the object a.
Under the first view, called the 'regularity theory of laws', laws are just contingent relationships between individual things possessing properties.
Under the second view, laws are still just contingent relationships, but relationships between properties, treated as universals. (When philosophers refer to properties as universals, they mean that a property is a type of abstract object, which can be possessed by various objects at different times and places, but which cannot be identified with those instances.) Under this second view, the laws could still have been otherwise than they are. This is called the 'nomic necessity' theory of laws, the point being that nomic necessity is distinguished from metaphysical or conceptual necessity.
Under the third viewpoint, the laws of nature are necessary relations between properties, treated as universals. Those who support this view often argue that the laws cannot be otherwise than they are because they spring from the essential nature of the properties they relate. In other words, the essential nature of those properties includes their relationships with other properties, and these are the relationships encapsulated in the laws of physics. Under this viewpoint, the laws of nature are metaphysically necessary.
All three views accept that the laws of physics exist, but only under the third viewpoint is it possible to argue that the laws of physics could exist independently of the physical world. Even then, the contention that the laws of physics are necessary relations between properties does not entail that those relations exist independently of the physical world unless one postulates that those properties exist independently of the physical world.
Friday, March 21, 2008
The Centre of Narrative Gravity
This morning, a hypnopompic dream represented my youth to me as the receding tide on Ainsdale Beach: a flat expanse of puddle-strewn sand stretched all the way to the horizon, where a narrow lip of water threatened to disappear over the horizon and out of sight altogether.
I have, therefore, a number of complaints I'd like to lodge. Firstly, I slightly resent being taunted by my own subconscious; life is difficult enough without your own brain-stem being a source of misery. Secondly, the image of the receding tide and horizon is rather cliched, so I think my own subconscious can be accused of being rather trite here. Thirdly, the tide is a rather poor metaphor anyway, given that tides come in as well as out, whereas my own youth will never return.
It's a good job that my sense of a wholly unified, conscious, executive self is largely illusory, and, according to Daniel Dennett, merely corresponds to the centre of narrative gravity. A consoling thought, I think you'll agree.
Thursday, March 20, 2008
A few observations
I'd like to introduce a couple of new quantities into the science of the workplace. Firstly, I'd like to define a quantity I term the wanker density. This is the number of wankers per capita of the workforce, and should be measured in parts-per-hundred (pph). For example, one might say that a place-of-work with a wanker density of 50pph has quite a high wanker density. In this context, I define a wanker-at-work to be an uncooperative, obnoxious or devious individual, prone to serving his/her own self-interest at the expense of others. Different workplaces, of course, will have different wanker densities, and the level of frustration encountered in working for a particular company is a function of the wanker density.
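Since the definition above is just a per-capita ratio scaled to parts-per-hundred, it can be computed in one line. A throwaway sketch (function name mine, obviously):

```python
def wanker_density(wankers, workforce):
    """Number of wankers per capita of the workforce, in parts-per-hundred (pph)."""
    return 100 * wankers / workforce

# The workplace from the example above: 50pph is indeed quite a high wanker density.
density = wanker_density(6, 12)
```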
The second quantity I'd like to define is what I dub the committed effective resentment equivalent. This is defined analogously to the committed effective dose equivalent, which measures the cumulative dose of radiation received over a lifetime if a radionuclide is taken into the body. The committed effective resentment measures the cumulative resentment acquired by employees who stay at the same company for much of their working lifetime, and who, over those decades, deposit geological layers of resentment towards their employers and work colleagues. I propose that the SI unit of committed effective resentment be termed the twat. Thus, work colleagues who are only slightly unpleasant, but who, nevertheless, spend much of their time bitching about the way they've been treated, could be said to have, say, 20 twats of committed effective resentment. Other colleagues, who might be described as rogue employees, being obnoxious, devious, disruptive, provocative, and even prone to acts of downright sabotage, might be said to have about 500 twats of committed effective resentment.
Wednesday, March 19, 2008
Group B
Easter in the 1980s meant the Safari Rally and the Circuit of Ireland, epic 5-day adventures contested by voracious 4-wheel drive turbocharged monsters called Group B rallycars. Rallying has since been sanitised, the vehicles domesticated, and the routes emasculated, but in the 1980s rallying posed a remarkable test of endurance and speed. The Safari and Circuit of Ireland, in particular, provided twisting narratives of waxing and waning fortunes, spectacular driving, mechanical disaster, and fearsome accidents.
I'm currently reading John Davenport's retrospective account of this era, Group B: The Rise and Fall of Rallying's Wildest Cars, and it's a fine piece of nostalgia. Audi Sport Quattros, Lancia 037s, Peugeot 205 T16s, Ford RS200s, Metro 6R4s, Lancia Delta S4s, Opel Manta 400s, Ari Vatanen, Henri Toivonen, Markku Alen, Hannu Mikkola, Walter Rohrl, Stig Blomqvist, Tony Pond, Timo Salonen, Michele Mouton, Bjorn Waldegaard, and Juha Kankkunen. Brings a tingle to the spine, even at a distance of 20 years.
An invite
I received a letter today from the United States. Opening it, I was surprised and delighted to find an invite from the Trustees of the John Templeton Foundation to a Pall Mall reception in honour of Michael Heller, winner of the 2008 Templeton Prize!
Should be an interesting evening...
Thursday, March 13, 2008
Michael Heller
Polish physicist, philosopher and Roman Catholic priest Michael Heller has just been awarded the £820,000 Templeton Prize. Michael invited me to Krakow in 2006 and 2007 to deliver a couple of papers, and I honestly can't think of a more deserving recipient.
Michael will be investing all his Templeton prize-winning money in the Copernicus Centre in Krakow, an institute whose raison d'etre will be 'philosophy in science', rather than 'philosophy of science'. I have remarked before that most physicists seem to be happy to write about philosophical issues, but do so without having first familiarised themselves with the relevant philosophical literature. As a consequence, they commit the same egregious philosophical errors over and over again. It would be nice, then, to think that something can be done to ameliorate this phenomenon.
Congratulations Michael!
Monday, March 10, 2008
The Wrong Trousers
Me trousers split at work today.
One minute everything was fine, and I was chatting to a female colleague, and the next I looked down to see that a gaping hole had opened up from below the zip, all the way round to the saddle point of the trouser-space. Now, whilst this colleague's attractions are undeniable and multifarious, on this occasion the explanation didn't lie with an unanticipated tumescence event. Rather, the effort I had expended moments earlier in shifting a number of filing cabinets had obviously torn the stitching asunder.
One's first thoughts in such a situation are obviously: 'Why can't they teach these third-world kids to use a sewing machine properly?' Subsequently, however, I realised that I was close to a total trouser-failure scenario, and remedial action was necessary. Sellotape placed across the inside of the rent fabric offered a temporary solution, but a safety pin supplied by another colleague (who himself had suffered a total trouser-failure event some months previously, and been forced home by the trauma) was less effective, and ultimately discarded.
I duly reduced the length of my gait, and struggled to the end of the day as best I could.
Thursday, March 06, 2008
Evilution
Those wishing to gain an insight into the minds of the religious right wing in North America may be, in equal measure, enlightened, amused, and disturbed by this diagram, produced by the Pittsburgh Creation Society.
It seems that the theory of evolution by natural selection, sustained by nutritious sin, is not only responsible for all the ills of society, but is also responsible for inflation, moral education, women and children's liberation, 'dirty books', and even 'hard rock'.
Which reminds me of the time that Ned Flanders went to what he claimed was a Christian Rock concert. On returning from the 'Chris Rock' gig, Ned confessed to having enjoyed the event, but also expressed some surprise at the frequent use of the f-word.
Tuesday, March 04, 2008
The Physics of NASCAR
With the exception of the characters in Frasier, America doesn't do sophistication. Nowhere is this more apparent than in the nature of its premier motorsport category, NASCAR, a form of stock-car racing. Simply put, the most popular form of motorsport in the USA is the type of thing which, in the 1970s, World of Sport and Dickie Davies would have spurned in favour of a demolition derby from Wimbledon.
Whilst Formula 1 cars are manufactured from carbon-fibre, NASCAR vehicles are made from sheet metal welded to steel tubing; the engine block has to be made of cast-iron; the valves in the engines must be operated by pushrods rather than overhead camshafts; and the air and fuel must be mixed by carburettors rather than fuel injection!
However, partly because of the ancient and alien nature of the technology employed in NASCAR, and partly because the author, Diandra Leslie-Pelecky, is a brilliant scientific expositor, The Physics of NASCAR is a fantastic book, and one which, I predict, will win awards within the 'popular science' genre.
Diandra covers unfashionable topics such as the physical chemistry of steel and paint and fuel with élan and panache, and I found myself learning new things, like this, on virtually every page:
Water-based paint, like latex house paint, has a polymer binder and pigment suspended in water. You brush or roll the paint on the wall, and when the water evaporates, the polymers hold the pigment to the surface like bungee cords hold down a load in a pickup truck...Cars...use acrylic urethane paints...Unlike latex paint, acrylic urethane paint comes in two parts. The first part is an acrylic-based resin that contains the pigment and a polymer, and the second part is a hardener or catalyst. Neither part is paint by itself-they don't become paint until they are mixed and a chemical reaction occurs...Acrylic-urethane paint dries in two stages. First the [volatile organic compounds] in which the pigment and binder were dispersed start to evaporate. The second step is crosslinking, where the acrylic-urethane polymers form chemical bonds with each other. The resulting polymer network has a hard, glossy finish.
Sunday, March 02, 2008
CP-violation
The redoubtable Bryan Appleyard writes a nice piece for the Sunday Times on supercomputer simulations of CP-violation in quantum chromodynamics.
In particle physics, a CP transformation is the combined operation which swaps positive charges and negative charges (Charge conjugation), and swaps right-handed particle states and left-handed particle states (Parity reversal). Given a left-handed, negatively charged particle, the corresponding antiparticle is a right-handed, positively charged particle. CP-violation occurs when particle processes are not invariant under a CP transformation. CP-violation is necessary to explain the asymmetry present in the early universe between the amount of matter and the amount of antimatter. However, whilst the standard model of particle physics predicts CP-violation in electroweak processes, and this CP-violation has been experimentally observed, the magnitude of the violation is too small to account for the matter-antimatter asymmetry. The standard model is also capable of representing strong force CP-violation, but this has not been experimentally observed, and, once again, it transpires that this CP-violation, even if it does exist, cannot account for the exact magnitude of the matter-antimatter asymmetry in the early universe.
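Schematically (my notation, not the article's), one can label a one-particle state by its charge q and its handedness h; C flips the charge, P flips the handedness, and the combined CP operation flips both:

```latex
\begin{aligned}
C  &: \; |q,\, h\rangle \;\mapsto\; |-q,\, h\rangle \\
P  &: \; |q,\, h\rangle \;\mapsto\; |q,\, -h\rangle \\
CP &: \; |q,\, h\rangle \;\mapsto\; |-q,\, -h\rangle
\end{aligned}
```

CP-violation then means that some process and its CP-mirrored counterpart proceed at different rates, i.e. $\Gamma(i \to f) \neq \Gamma(\bar{i} \to \bar{f})$ for at least one pair of initial and final states.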
It seems then, that an explanation for the matter-antimatter asymmetry will reside in a Grand Unified Theory (GUT). Such a theory goes beyond the standard model by unifying the strong force with the electroweak force. (Note that a GUT does not incorporate gravity, and should not be confused with a Theory of Everything, which does just that.) CP-violation in a GUT will, it appears, be necessary to explain the matter-antimatter asymmetry. When energy densities in the early universe were sufficiently high, it is thought that the strong and electroweak forces were unified, and that particles mediating the unified force passed between quarks and leptons, permitting transmutations between these particles. If these reactions possessed CP-violation, then it could well explain the presence of a matter-antimatter asymmetry at the end of the GUT era.
Note, incidentally, that Appleyard's article equates the question 'Why was there more matter than antimatter in the early universe?' with the question 'Why is there something rather than nothing?' This is rather misleading. If there had been an exactly equal amount of matter and antimatter in the early universe, all the mass in the early universe would have been converted into a sea of photons (i.e., radiative energy). One would therefore have a space-time replete with energy, and although there would be no matter as such, this is hardly the same thing as nothing at all.
Saturday, March 01, 2008
The Joint European Torus
Whilst the prospect of controlled fusion power remains out of reach, the technology employed to pursue this goal is quite fascinating.
Consider, for example, the Joint European Torus (JET), at UKAEA's Culham laboratory in Oxfordshire. This doughnut-shaped vessel, called a tokamak, is designed to confine a plasma of deuterium and tritium with the use of magnetic fields. The deuterium and tritium nuclei will fuse at the high temperatures created in a plasma, and release energy in two forms: neutrons and helium nuclei. The ultimate intention is that the energy carried by the neutrons can be used to heat the water in a jacket around the tokamak, and the steam energy from this can then be used to, say, power turbines that generate electricity. In principle, the energy carried by the helium nuclei is then sufficient to sustain the temperature of the plasma, and thereby make the fusion reaction self-sustaining once started.
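The split of the released energy between the neutron and the helium nucleus follows from momentum conservation alone, and can be sketched in a few lines of Python. (The 17.6 MeV total yield and the roughly 4:1 mass ratio are standard textbook figures for the D-T reaction, not taken from JET documentation.)

```python
# D + T -> He-4 + n releases about 17.6 MeV in total (textbook value).
# The two fragments fly apart with equal and opposite momenta, so the
# lighter neutron carries the larger share of the kinetic energy:
# from m_n * v_n = m_he * v_he it follows that E_n / E_he = m_he / m_n.

E_total_MeV = 17.6      # total energy released per D-T fusion event
m_n, m_he = 1.0, 4.0    # neutron and helium-4 masses, in atomic mass units (approx.)

E_n = E_total_MeV * m_he / (m_n + m_he)    # energy carried by the neutron
E_he = E_total_MeV * m_n / (m_n + m_he)    # energy carried by the helium nucleus

print(f"neutron: {E_n:.1f} MeV, helium nucleus: {E_he:.1f} MeV")
```

This reproduces the familiar figures of roughly 14.1 MeV for the neutron (available for heating the water jacket) and 3.5 MeV for the helium nucleus (available for sustaining the plasma temperature).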
What is particularly fascinating about JET is the way in which the magnetic fields are generated. To confine the plasma, a helical magnetic field is generated inside the tokamak. As a consequence, the charged deuterium and tritium nuclei in the plasma follow spiral trajectories inside the vessel. To generate this helical magnetic field, two distinct components are generated by distinct mechanisms:
Firstly, a 'toroidal' magnetic field is generated by a ring of large field coils wrapped around the tokamak. This magnetic field is essentially generated in accordance with Ampère's law, in the same manner that a magnetic field is generated in the interior of a solenoid.
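The solenoid-style estimate is a one-line formula: for an ideal torus of N total turns carrying current I, the field at major radius R is B = μ₀NI/(2πR). A minimal sketch, with coil count and current that are purely illustrative rather than JET's actual specification:

```python
import math

MU_0 = 4 * math.pi * 1e-7   # vacuum permeability, in T*m/A

def toroidal_field(n_turns, current_amps, major_radius_m):
    """Ampere's law for an ideal torus: B = mu0 * N * I / (2 * pi * R)."""
    return MU_0 * n_turns * current_amps / (2 * math.pi * major_radius_m)

# Illustrative numbers only: 32 coils of 24 turns each, 60 kA per turn,
# evaluated at a major radius of 3 m.
B = toroidal_field(n_turns=32 * 24, current_amps=60_000, major_radius_m=3.0)
print(f"toroidal field at R = 3 m: about {B:.1f} T")
```

With these made-up but plausible inputs the formula gives a field of a few tesla, which is the right order of magnitude for a large tokamak.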
The toroidal field provides the longitudinal component to the resultant magnetic field. The total field is the sum of the toroidal field and a 'poloidal' field, which provides a latitudinal component to the resultant field. The poloidal field is generated by a changing current in a coil within the hole of the doughnut. To understand this requires some further background in electromagnetism:
An electric current creates a magnetic field, and in particular, creates something called magnetic flux, which is the integral of the magnetic field over a surface area. If a changing electric current is generated in a coil, then this generates a changing magnetic flux, and by magnetic induction, this will generate a current in another circuit. (This is the same principle by which a transformer operates.)
In the case of a tokamak, the electrons in the plasma constitute the secondary circuit, and the changing current in the inner coil generates a toroidal electric field, which drives a longitudinal electron current in the plasma. Just like the current through a copper wire, this current generates a circular magnetic field in the perpendicular plane. This is the poloidal magnetic field, which generates the latitudinal component to the magnetic field controlling the trajectories of the deuterium and tritium nuclei.
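Both steps above reduce to one-line formulas: Faraday's law gives the EMF driving the plasma 'secondary', and Ampère's law for a straight wire gives the circular field around the resulting current. A sketch with purely illustrative numbers (not JET's operating figures):

```python
import math

MU_0 = 4 * math.pi * 1e-7   # vacuum permeability, in T*m/A

def induced_emf(n_turns, d_flux_webers, dt_seconds):
    """Faraday's law of induction: EMF = -N * dPhi/dt."""
    return -n_turns * d_flux_webers / dt_seconds

def wire_field(current_amps, distance_m):
    """Ampere's law for a long straight current: B = mu0 * I / (2 * pi * r)."""
    return MU_0 * current_amps / (2 * math.pi * distance_m)

# Illustrative: flux through the one-turn plasma loop ramped by 0.5 Wb in 0.1 s...
emf = induced_emf(n_turns=1, d_flux_webers=0.5, dt_seconds=0.1)
# ...and the poloidal field 1 m from the axis of a 3 MA plasma current.
B_pol = wire_field(current_amps=3_000_000, distance_m=1.0)

print(f"driving EMF: {emf:.1f} V, poloidal field: {B_pol:.2f} T")
```

Note that even a multi-mega-amp plasma current yields a poloidal field of well under a tesla, considerably weaker than the toroidal component; the sum of the two is the gently twisted helical field that confines the plasma.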