Sunday, February 29, 2004

Julius, meet Gregory:

Here it is, that extra day we get every four years. The one where you wonder what it would be like to be born on February 29th ("Do you celebrate on the 28th or on March 1st during regular years?" ... "What happens when you have to fill out a form?" etc. etc. etc.).

Well, not quite every four years.

The rule is a little more complicated than that. Since 1582 we have been using the following:

According to the Gregorian calendar, which is the civil calendar in use today, years evenly divisible by 4 are leap years, with the exception of centurial years that are not evenly divisible by 400. Therefore, the years 1700, 1800, 1900 and 2100 are not leap years, but 1600, 2000, and 2400 are leap years.
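The rule translates directly into a couple of divisibility tests; a minimal sketch in Python:

```python
def is_leap(year):
    """Gregorian leap-year rule: every year divisible by 4 is a leap
    year, except centurial years, which must be divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The examples from the rule above:
print([y for y in (1600, 1700, 1800, 1900, 2000, 2100, 2400) if is_leap(y)])
# → [1600, 2000, 2400]
```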

Why not simply every four years?

Think of it this way: we can measure a year as 365.2422 days long (the number of days it takes Earth to complete one orbit around the Sun, which is what controls the seasons).

If we had no leap years, the calendar would slip behind the seasons by 0.2422 days each year. In an average lifetime, this error accumulates to about two weeks. It would take just under 750 years for the calendar to be out by six months from the seasons.

If we have a leap year every four years, we have the following:

(3*365+1*366) or 1,461 days/4 years = 365.25 days/year

Now we are closer to the correct value, but a little high. This calendar would gain on the seasons by 0.0078 days each year, accumulating to about half a day in a lifetime.

Since we are too high, we need to have fewer leap years. Since the above error is 0.0078 days/year, every hundred years this will be almost one day, 0.78 days/100 years. Let's try having one less leap year every hundred:

(76*365+24*366) or 36,524 days/100 years = 365.24 days/year

Now we are even closer to the correct value, but still a little low. This calendar would lose on the seasons by 0.0022 days each year, accumulating to about three hours in a lifetime.

It looks like we need to have a couple more leap years than this to get to 365.2422. Since the above error is 0.0022 days/year, every four hundred years this will be almost one day, 0.88 days/400 years. Let's try having one more leap year every four hundred years:

(303*365+97*366) or 146,097 days/400 years = 365.2425 days/year

Now we are at the level of accuracy used today. Note that this is still a tiny bit high. The calendar we use today, the Gregorian calendar, gains on the seasons by 0.0003 days each year, accumulating to about one-half hour in a lifetime. At this rate, it will take 600,000 years to be out a half-year. This level of accuracy is good to slightly better than one part in one million.

Note that this approximation could go on - if we took it to the next level, we might add the following:

We need a tad fewer leap years to get even closer. Since the error is now 0.0003 days/year, every three thousand two hundred years this will be close to one day, 0.96 days/3200 years. Let's try having one less leap year every three thousand two hundred years:

(2425*365+775*366) or 1,168,775 days/3200 years = 365.2421875 days/year

This is an extremely accurate value. This calendar would lose on the seasons by 0.0000125 days each year, or about one second. This error would accumulate to about two minutes in a lifetime.
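The whole chain of approximations above is easy to check numerically; a quick sketch, using the 365.2422-day tropical year quoted earlier:

```python
TROPICAL_YEAR = 365.2422  # days per seasonal year, as used above

# (rule, total days in one cycle, cycle length in years):
rules = [
    ("no leap years",                    365,                   1),
    ("every 4 years (Julian)",           3 * 365 + 1 * 366,     4),
    ("minus 1 per 100 years",            76 * 365 + 24 * 366,   100),
    ("plus 1 back per 400 (Gregorian)",  303 * 365 + 97 * 366,  400),
    ("minus 1 per 3200 years",           2425 * 365 + 775 * 366, 3200),
]

for name, days, years in rules:
    avg = days / years
    drift = avg - TROPICAL_YEAR  # days gained (+) or lost (-) per year
    print(f"{name:34s} {avg:.7f} days/year, drift {drift:+.7f}")
```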

So why didn't Pope Gregory XIII recommend further corrections in 1582, when the current leap year system was adopted? It was clearly within his commission's mathematical abilities (although their knowledge of the length of the year to this precision might not have been). After I had worked out the above rationale, I also asked: why not simply have one less leap year every 128 years, instead of every hundred, getting rid of as much of the accumulated error at each step as possible?

The answer was probably convenience. The above rule is easier to use; once you start worrying about years divisible by 3200 (or 128), things get complicated. The rule as adopted is easy to remember - especially in the several hundred years following 1582, when there were no logarithms or slide rules, much less calculators or computers.
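Incidentally, the 128-year variant lands on exactly the same average year length as the 3200-year rule, since 1/4 - 1/128 = 31/128 = 0.2421875. A quick check:

```python
TROPICAL_YEAR = 365.2422

# A leap year every 4 years, skipping one every 128 years:
avg_128 = 365 + 1/4 - 1/128
print(avg_128)  # 365.2421875, identical to the 3200-year rule's value

# Drift against the seasons, in days per year:
print(avg_128 - TROPICAL_YEAR)  # about -0.0000125, i.e. roughly one second
```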

And, no, of course that's not the end of the story - there are levels below this as well. I have written before about the infamous leap second, but it's also known that the Earth's day is slowly getting longer as the lunar and solar tides slow our rotation.

The rate is pretty small - about a second every 36 million years. That is far below needing a correction to the above rule, but enough for today's timekeeping to worry about.

Friday, February 27, 2004

Whoa Nelly!

Now some (like my wife) might find the humour a little bit off, but since my registration was about to expire, I tried the other day to get the following vanity licence plate:

I did hesitate, thinking that this kind of provocation would not be wise, and might lead to a strange episode of road rage directed against me by a vegetarian.

Trouble was, somebody out there has already claimed it. I can only hope they didn't use Virginia's Horse Lover plate:

...unless they were French, Italian, Swiss, Japanese or Belgian, of course.

The recent mad cow scare led to a small increase in the consumption of exotic meats in Europe and North America (alligator, hippo, horse, kangaroo, ostrich, etc.), but in fact most of these meats have their associated risks as well. A large risk is systemic, in that the inspection procedures for these meats are much less well-defined than those for the standard meats: beef, pork, mutton, fish, and fowl.

Given the extreme rarity of vCJD (119 known cases worldwide as of January 2002) and the laxer inspections of these other meats (except horse and kangaroo, see below), it is very likely that consumers who chose exotics over beef due to the vCJD cases in fact increased their risk of contracting other food-borne diseases at the cost of an infinitesimal decrease in vCJD risk.

For alligator meat, the main risk besides improper handling is due to the mercury burden, especially in the Florida Everglades (ref.). However there is also some parasitic risk from trichinella, trematodes (flukes), ascarids (roundworms) and pentastomes (ref. [PDF]).

In the case of kangaroo, although many of the above parasites can be present in wild animals, a well-established inspection system has been in place since 1993 in Australia for meat raised or culled for use in human consumption. The only parasite of note is a nematode Pelicitus roemeri with a 1988 detected appearance rate of about 1.4% (ref.).

Since horsemeat is often consumed raw or extremely rare, there can be a (small) trichinosis risk. There was an outbreak in France in 1985, and more recently in 1998 from horses brought in from Eastern Europe (ref.).

I noticed that the U.S. had decided to stop all importation of French meat products last Tuesday, so I decided to look at the issue of French horsemeat consumption (although the flow of horsemeat is decidedly from the US to France, rather than the other way around). The data below are all from the most official source I could find, the Office National Interprofessionnel des Viandes, de l'Elevage et de l'Aviculture, OFIVAL, the French National Office of Meats, Breeding, and Poultry Farming (the Ministry of Agriculture didn't seem to have much). I have to say that the OFIVAL synthesis notes on horsemeat are already five years old, and getting a little ripe. There is also a monthly bulletin from which I pulled some 2002 data.

The most obvious fact is that the consumption of horsemeat declined in France during the 1988-1998 period, and according to later notes, continues to do so (imports dropped 10% from 2001 to 2002). Most meat is supplied from imports - in 2002, 30% came from other EC countries, 23% from Argentina, 21% from Canada, and 11% from the USA.

The French market is highly polarized. While 24% of the population consume horsemeat more than once a year, 43% of French refuse to consume it, and the remaining 33% either consume it once a year or less, have stopped because of unavailability, or have never tried it at all and are neutral. The industry itself is extremely sensitive to the public opinion, and has decided not to try and promote consumption using the media given the outcry that this might provoke.

This is a table from OFIVAL that compares the changes in consumption for various meats in France over the 1988-1998 period:

Consumption of meats in France 1988-1998

| Meat | percent change 88-98 | 98 consumption per capita | Average price (1998) |
| Large Bovines |  |  | 60.0 F/kg |
|  |  |  | 72.4 F/kg |
|  |  |  | 56.9 F/kg |
|  |  |  | 33.9 F/kg |
|  |  |  | 73.1 F/kg |
|  |  |  | 51.9 F/kg |
|  |  |  | 33.9 F/kg |
| TOTAL (incl. Fowl) |  |  | 47.4 F/kg (est.) |

(only the average-price column and two row labels of this table survive)

Numbers are in the exquisite units: "equivalent carcass tonnes," and I have added the last row with its calculations.

What I thought was interesting here was the over 43% decrease in horsemeat consumption in the decade in question. Most of the difference has been made up by increases in poultry and pork. From the average price column, an obvious conclusion might be that this is price driven, since poultry and pork are the cheapest meats, and the two largest decreases, in horse and veal, are the most expensive meats.

Probably most fundamental is the actual level of consumption: only about 0.66% of all meat consumed in France is horsemeat.

I would love to find a similar table for US consumption to be able to look at the proportions of alligator, armadillo, elk, rabbit, moose, snake, and venison.

...and no, I have not tried horsemeat. I would not elect to, but I think I could. I have had plenty of venison, buffalo, rabbit, and wildfowl, and even had several alligator steaks. My lightning visit to Cairns never afforded me the chance at kangaroo. Oh yes - I have had ants.

And the licence plate? I ended up getting COI-O3 on a National Air and Space Museum plate. A hug and a kiss to whoever figures it out.

Wednesday, February 18, 2004

Athens, Rome, Los Angeles:

No, not the Olympics, but the seat of cultures that have given us the mythology and deities we choose to immortalize in the skies. Athens gave us the Greeks, Rome gave us the Romans, and Los Angeles gave us the Tongva.

"Los Angeles? Tongva?"

Yes, it is so. Chad Trujillo and Mike Brown, of Caltech's Division of Geological and Planetary Sciences (GPS), have named their discovery, the largest "minor planet" beyond Pluto, Quaoar, after a god of creation in the legends of the Tongva, Native Americans indigenous to the Los Angeles Basin.

You might comment that this choice is somewhat parochial. After all, the Greeks and Romans had a very large impact on subsequent civilizations, many of them far removed in time and in geography. You might say it would be telling if we asked 100 randomly selected citizens (even from the L.A. basin) if they had heard of a) the Greeks, b) the Romans, and c) the Tongva.

Well, there is a website about c), of course. It turns out that the Tongva are an as-yet Federally unrecognized tribe of about 300 people.

I would suggest a few minutes' perusal of the full list of Minor Planet Names. A few guffaws are guaranteed ("Arthurdent," "Tweedledee/Tweedledum," "Zappafrank"). "What about the deities of the Chibcha?" I cry out... "I want a planet called Tequendama!"

However, all this is old news, including the debate about the name. After all, the discovery dates from June of 2002, and the name 'Quaoar' has since been accepted by the International Astronomical Union. You can see information on the IAU naming process and guidelines here and here (N.B.: despite many hucksters' ongoing attempts to convince you and your wallet otherwise, naming a star for your sweetheart ain't official).

The name Quaoar does meet the IAU criteria (although I'm not sure about 'easily pronounceable'), and I bet having the largest body discovered since Pluto went a long way towards meeting the political goal of having the Tongva tribe Federally recognized. I can imagine that the Tongva elders' consideration of this idea was an extremely interesting discussion.

What got my attention was an article in the GPS Alumni Newsletter by Trujillo that talked about the ongoing search for other bodies like Quaoar.

By using the IRAM telescope in Spain and the Hubble, Trujillo and Brown have determined that Quaoar is 1,250 km in diameter, which makes it about as large as all the known asteroids put together. They are in the middle of a robotic search of the entire sky using the Oschin telescope on Palomar, which by the end of 2004 will indicate if there are any additional bodies like Quaoar.

Trujillo's surprising comment was the following:

"Since beginning this project, we have only had time to examine about 7 percent of the sky for the presence of very large bodies like Quaoar, so we think that there should be about ten more of similar size that are still undiscovered, a few of which may be even larger than Pluto." (Pluto is about 2,300 km in diameter)

(NASA and A. Feild/STSci)

If there are bodies discovered larger than Pluto, this will certainly add fuel to an ongoing debate about how many planets there are in the Solar System. Surprisingly, the debate is not whether Quaoar should be an additional planet, but about whether Pluto should be demoted, or thrown out. There has been sufficient debate and confusion to have the IAU reconfirm that Pluto is indeed a "planet."

It's not as simple as one would think. Simply saying that planets are round and orbit the Sun would give us many more: Ceres, about 914 km in diameter; Pallas (522 km); and Vesta (~500 km). In the outer solar system, among the Kuiper Belt, consider this crop:

(Gerhard Hahn/DLR, Astrovirtel, ESO, ESA, Institute for Astronomy)

When we add 2001 KX76 (also referred to as "Ixion"), Varuna, and all these other 1,000 km class objects to the list of 'planets,' and remember that they are being added to on a yearly basis, we can see that the list of planets will get out of hand quite quickly.

We've been there before, when Ceres' discovery in 1801 was quickly followed by the avalanche of Pallas, Juno, and Vesta, and the post-1845 deluge of Astraea, Hebe, Iris, Flora, Metis, and Hygiea.

(There are a lot of bodies out there, most of them much smaller, but in impressive numbers, as in this 100-year animation of the outer solar system made by the folks at Harvard's Minor Planet Center)

The historical list of eight plus Pluto was soon re-established, and it will likely live on for historical reasons rather than a truly consistent ontological naming system.

Despite Quaoar, Ixion, Varuna and their ilk.

Wednesday, February 11, 2004

Polyblog II:

Hmm. I recently got a hit from Mexico, looking for "pleibol," that led me to rediscover the Google translator, so I decided to see what my page looked like in languages Google handles that I can actually understand.

Here's my ranking of the machine translated versions, from the best translation to the worst:

1. French;
2. Spanish;
3. Portuguese;
4. Italian; and
5. German.

This ranking has several problems. First, it's done by me, and therefore also reflects my decreasing understanding of these languages (except French and Spanish - I'm quite confident that Google is having more trouble translating my page into Spanish than French). Second, it's a translation of pieces written by me. They tend to be difficult: odd sentence structure, odd diction, etc. etc. I'm not an easy read, so I'm sure I'm an even harder translation.

Google is getting ready to provide translators into more languages - I tried changing the hl flag to sv for Swedish and ru for Russian, and although the page is not translated, there is an upper frame from the Google return that is in Swedish or Russian.

I also tried translating my page from these languages into English by simply switching the hl and sl flags. The results are odd, to say the least. I'm asking the machine to translate English into English, but to listen with a French, Spanish, Portuguese, Italian or German ear.
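Mechanically, the experiment just amounts to swapping two query-string parameters. A sketch with Python's standard library - the endpoint and parameter names here simply follow the description above and are assumptions; Google's actual translator URLs have changed over the years:

```python
from urllib.parse import urlencode

# Hypothetical translator endpoint; hl = target language, sl = source.
BASE = "http://translate.google.com/translate"

def translate_url(page, hl, sl):
    """Build a translator URL for the given page and language flags."""
    return f"{BASE}?{urlencode({'u': page, 'hl': hl, 'sl': sl})}"

# Normal direction: an English page, rendered in French...
print(translate_url("http://example.com/blog", hl="fr", sl="en"))
# ...and the switcheroo: English "heard" through a French ear.
print(translate_url("http://example.com/blog", hl="en", sl="fr"))
```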

This has actually happened to me once - I heard English as non-speakers must hear it. For several minutes, as an in-flight announcement was being made on an Avianca flight into Bogotá, I could not tell what language was being spoken. For some reason, I could not parse the sounds and place the spaces between words correctly, and I heard a stream of gibberish. Right at the end, something clicked, and suddenly I could understand that it was English. Try as I might, I could not hear the gibberish again.

It's much like when one sees an interesting pattern, only to realize after a while that it is actually a highly stylized font. Once you can read the words, it is extremely difficult to recapture the pure pattern - the 'wordness' interferes too much.

Tuesday, February 10, 2004


Something fun I found by backtracking visitors to my site...

My blog, in other words... (you might need a Japanese character set installed to load this on a Windows system... don't bother clicking on the link if you don't think you have one -- trust me, it's not worth the hassle).

Since I don't know any Japanese, I can't tell whether this even comes close to an accurate translation. Plus, my accent is probably atrocious anyway.

Friday, February 06, 2004

Brunhes, Matuyama:

The Earth's magnetic field is dying. As it has before. And no, it is not a human-caused disaster. But it should have some interesting effects on humans.

The Earth's magnetic field goes through periods of instability every so often. Sometimes the field comes back reversed (so that the magnetic North pole reappears where the magnetic South pole used to be, and vice versa), but more often than not, the field goes through a weak period and then strengthens back to the way it was before. We do not completely understand the process, but we are close. It's one of those non-linear magnetohydrodynamic problems even your best professors had nightmares about. Even using our most powerful supercomputers, we have problems modeling this process accurately.

We are pretty sure this occurs on Earth, because when lava cools, the magnetic minerals preserve the direction of the magnetic field at the time. We can see that the direction preserved in progressively older rocks changes in the way described above: sometimes the field reverses completely, and sometimes the signal diminishes only to reappear again in the same direction.

This "magnetic pole reversal" pattern can be mapped out by looking at successively older and older rocks from different locations to get an idea of how often this change occurs. Here is what it looks like for the last 5 million years, with black being 'normal,' and white 'reversed':

(the above is from Lisa Tauxe's notes for her courses at UCSD)

There is actually more detail here than shown: very short-lasting reversals are omitted, so what we are looking at above is the 'dominant' polarity over 10,000- to 20,000-year periods. These dominant polarity periods, then, last 250,000 years on average, with a fair amount of variation. But what actually happens during a magnetic field reversal?

It takes about one thousand years or so for the field to reverse. At first, the field weakens, as is happening presently. The next part is interesting: the field can pass through a multi-pole phase. In other words, there is a crazy period where there are many, many North magnetic poles, and many, many South magnetic poles. Out of this chaos, the field slowly organizes again, and can emerge either 'normal' or 'reversed.'

What I have wondered about is the effects that these collapses/multi-pole phases might have had in the past, and what they will have in the future. Consider the timescales involved: one thousand years is quite fast compared to evolutionary (speciation) time scales, and fast even compared to things like migration or species expansion into biomes. For species that rely on the magnetic field for navigation (bees, tuna, some turtles, some birds, perhaps some whales) this kind of thing could be catastrophic if the field is their only cue. Are magnetic reversals associated with any particular die-offs? I have never seen any attempts to answer this question, but given the timescale precision required to date die-offs against magnetic reversals, as well as proving that the species in question had a magnetoreceptor, I am not surprised.

The other end of the question is that while the field is collapsed, much more of the Earth's surface is exposed to hard radiation. Think aurora borealis/australis scattered all over the place. Would this affect the speciation rate through radiation-induced mutations? Again, a question that would be exceedingly difficult to approach, given the quality of the geologic record.

What about the present? What would happen to us today if we lost the magnetic field? There would be a higher radiation risk generally - especially so for airplane flight and for manned space flight. We could expect many satellite outages, and the accompanying data/communications problems. On the ground, electric grid disturbances would be stronger, and might cause more failures like those we saw last year. There might be ozone holes all over the place, since the high energy rays that break apart ozone can only get in around the magnetic poles. We would probably all be able to see the aurora - even in equatorial areas. We might even see an increase in public interest in science.

Learn more.

Monday, February 02, 2004

#35 & #43:

Besides dooming the U.S. hatmaking industry with his bareheaded 'viggah,' John F. Kennedy set NASA on its course to the Moon with his Message to Congress in May of 1961 and his more famous 1962 Moon speech at Rice University.

At NASA Headquarters, on January 14, President Bush made a speech about new directions for the U.S. space program, including a return to the Moon with the ultimate goal of putting a man on Mars. Perhaps significantly, this initiative was missing from his January 20th State of the Union Address.

First, it is interesting to compare these speeches. Now, I don't want to get into a Texas vs. Massachusetts debate, Yale vs. Harvard, or even 'jocks and geeks,' but my conclusion was that, unfortunately, Dubya's speechwriters have not been putting out their best.

In the days following the Moon/Mars announcement, we of course saw the inevitable sniping about how the nation could afford this given the present budget climate, etc. etc. I have had to endure some pretty fierce ribbing from European counterparts at recent meetings about this issue.

My questions have been: What did the NASA budget look like in 1962? What has the NASA budget done since? What about science in general? What about other major pieces of the budget pie?

So, given my penchant for posting horribly long tables, here is the Federal budget history, from 1962 onwards.

Federal Finances, 1962-present

| Year | NASA Budget | Total Gov't Outlays | NASA % | National Defense % | General science % |

Some points to note about the table:
1. TQ stands for the 'transition quarter' of July through September of 1976, which bridged the change from July-June Federal fiscal years to October-September.
2. Figures are not in constant dollars (a MAJOR fault with the above table).
3. All data are from the historical tables of the US Federal Budget.

Some additional data that I stripped from the table to avoid clutter:
- the interest on the debt in 1962 took up 8.54% of total outlays. By 2003, interest took up 16.59% of the budget, down from a high of 22.22% in 1997 during the Clinton administration. Basically, we devote twice the proportion of the budget to debt service. Note also that Federal spending on science as a whole has declined in proportional terms.

Here is the diagram of the projected budget for the new NASA objectives presented by the President during his January 14 NASA HQ speech (click on it for a larger, readable version).

Of note is the definite horizon/end of the Shuttle and ISS programs. This also means the end of non-"human spaceflight" experiments aboard the ISS. For years NASA has trumpeted space as a place for new manufacturing technology and as a source of new materials, and justifications for the Shuttle and ISS were written in those terms. Unfortunately, even routine scientific research is extremely difficult to do in a test vehicle, and that is what these platforms are - the risk of catastrophic failure on launch or re-entry was always very real.

The budget presented shows that there really is little new money to be devoted to this re-ignited mission of exploration. Of special note is that after 2009, there is no projected increase in the budget - it simply keeps pace with inflation. What is clear is that this mission will eat away at most every other aspect of NASA science - aeronautics, remote sensing, astronomy, etc. - very much as the ISS and Shuttle budgets did.

One remaining point. When the U.S. went from Mercury to Gemini to Apollo to Shuttle, we built each successive generation of booster/capsule systems from scratch. Based on previous experience, of course, but any engineer knows that systems as complex as these need extensive testing before using them for human flight. Why does the U.S. do this? Because we put the prime contracts out to bid. Sometimes Boeing would win, sometimes Lockheed, sometimes others. Each of these had submitted a proposal that had to be different enough to catch the selector's eye. The Russians, in contrast, have stuck with the basic R-7 Soyuz rocket configuration for over 40 years. They have of course modified the boosters and capsules over the years, but the stability in the core program gave them one tremendous advantage: low cost. It is very probable that Russia could launch a manned mission to Mars for about one quarter of what it will cost the U.S. - however, they currently lack the political will and financial power to do it. And that is, in the final analysis, what counts.

Having people in space is always much more expensive than simply launching metal. It's certainly much more exciting too, but we always have to be ready to face a catastrophe. And there will almost certainly be another disaster somewhere in this Moon/Mars series. I am not sure the U.S. has the political courage to face another space catastrophe so soon.
