(09-10-2010, 12:54 PM)Taelas Wrote: Oh, and while you're at it, switch to metric and centigrade. Even more rational, logical ideas!
Hi,

I often hear complaints or jabs about the USA not being on the metric system. Actually, we've been on the metric system since the late nineteenth century (1878). And we were the second country, after France, to use it officially, long before we formally adopted it. We've been just a little slow to convert to it -- but all of our units are based on it.
Of course, the metric system itself sucks. It has one thing going for it: a unified set of prefixes for multiples of 1000 (with a few for 100 and 10). Everything else about it (the size of the basic units, the fact that ten is divisible only by 2 and 5, the ridiculous size of the derived units, etc.) is wrong.
The metric system was conceived by a bunch of ivory tower intellectuals with no practical experience. They worshiped simplicity and regularity, and they sacrificed utility for their dreams. Only a revolutionary French academician could think that the unit used to measure the length of a nail should be (effectively) the same unit used to measure the circumference of the Earth.
Indeed, metric units are arbitrary. Unlike the common units they replaced, they are not based on either accepted usage or convenience for the task at hand. By 1800, it took an amazing amount of geocentrism to base a measuring system on the dimensions of the Earth. But that's just what they did. Starting with the meter.
One meter was supposed to be one ten-millionth of the distance from the pole to the equator. Now this causes a problem. Traditionally, the earth is divided into degrees, minutes, and seconds. And from Babylonian times, there are 360 degrees in a circle, 60 minutes in a degree, and 60 seconds in a minute. By 1800, there was a large amount of material, from maps to star charts to sailing instructions, that was based on the Babylonian system. In addition, there were large numbers of instruments using that system, from sextants to indicators on telescope mounts and more.
The nautical mile is the distance subtended on a great circle of the earth by a central angle of one minute. This makes navigation much easier, since the separation of two points in nautical miles is just their angular difference along a great circle in minutes. To make the meter (actually the kilometer) equally useful, the academicians proposed a new measure for angles, the grad. A complete circle is 400 grads. How that makes sense is not clear, since the circle is the natural basis for angular measurement and 100 grads to the circle would have been more in the spirit of metrication. However, they wanted a circular measure that agreed with their linear measure (which raises the question: shouldn't the unit of length have been chosen so that the equatorial circumference came out to a power of ten?). Regardless, the problem is practical. Since the grad is not a widely accepted unit, finding maps with longitude and latitude in grads is a tad difficult. Also, if available at all, I suspect a sextant in grads will be hard to find and thus pretty expensive. And I doubt that sight reduction tables in grads are any too common, either. If the grad is forced on the world, then yet one more useful unit would fall to the mindless metricians.
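For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch (in Python, and assuming the original "ten million meters per quadrant" definition of the meter):

    # Rough arithmetic behind the minute-of-arc vs. grad comparison,
    # assuming the original "10,000 km per quadrant" definition of the meter.
    meridian_circumference_km = 4 * 10_000                     # 40,000 km by construction

    minutes_per_circle = 360 * 60                              # Babylonian division: 21,600 minutes
    print(meridian_circumference_km / minutes_per_circle)      # ~1.852 km, i.e. one nautical mile

    centigrads_per_circle = 400 * 100                          # 40,000 centigrads
    print(meridian_circumference_km / centigrads_per_circle)   # exactly 1 km, the grad-based analogue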
So, we have the meter. An arbitrary unit of no particular value. Had the academicians had the brains of a peacock instead of its pride, they would have used the nautical mile and developed the length unit around it.
Now, from this arbitrary unit of length, we get our unit of mass. The gram was originally defined as the mass of one cubic centimeter of water. Why? Why a cubic centimeter? That results in a pretty small unit of mass. Why water? It’s easier to purify mercury, and it would have given a slightly more useful mass to work with. Combined with the length of the meter, the smallness of the gram leads to an unworkable system. More on that later.
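A rough sketch of the numbers (the mercury figure is an approximate room-temperature density):

    # Why the gram came out so small: one cubic centimeter just isn't much material.
    water_density = 1.0      # g/cm^3, by the original definition of the gram
    mercury_density = 13.5   # g/cm^3, approximate at room temperature

    print(1.0 * water_density)    # 1 g -- the gram
    print(1.0 * mercury_density)  # ~13.5 g -- a mercury-based unit would be an order of magnitude heftier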
Then there was the whole thing with time. Neither the decimal calendar nor decimal time made the grade. Even the French academicians were smart enough to let them die. Had they combined their intelligence with common sense and recognized some of the usefulness of traditional practice, they might have actually come up with a better calendar, perhaps one of thirteen four-week months with a non-weekday, non-day-of-the-month new year's day and (as needed) leap year's day.
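The arithmetic of that sort of calendar is trivial; sketched out:

    # Thirteen four-week months, plus one or two days outside the week/month cycle.
    months, weeks_per_month, days_per_week = 13, 4, 7
    ordinary_days = months * weeks_per_month * days_per_week   # 364
    print(ordinary_days + 1)   # 365 -- common year, with one stand-alone new year's day
    print(ordinary_days + 2)   # 366 -- leap year, with a stand-alone leap day as well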
Temperature measurement was only partially their fault. Around 1800, temperature was not well understood and the concept of absolute zero was still some years in the future. However, by the conference of 1875, things had changed quite a bit. The adoption of the centigrade scale at that time was a mistake that has had repercussions to this day. The modern scale is based on the degree centigrade but with its zero at absolute zero. Since two points are needed to establish a temperature scale, the second point was chosen as the triple point of water. This gives us the ugly additive factor of 273.15 to convert from Celsius to Kelvin. What happened to "one unit fits all" and everything by powers of ten? Indeed, why use a unit based on the freezing point and boiling point of water at all? The Fahrenheit values of zero and one hundred are more useful on the human scale. They are (approximately) the temperatures below and above which, respectively, humans start to lose efficiency. The triple point of water, at least, makes sense in that it is a precisely defined condition.
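For reference, here are the conversions in question, sketched out:

    # The standard conversions the paragraph is grumbling about.
    def celsius_to_kelvin(c):
        return c + 273.15              # an additive offset, not a power of ten

    def celsius_to_fahrenheit(c):
        return c * 9 / 5 + 32

    print(celsius_to_kelvin(0.01))         # 273.16 K -- the triple point of water
    print(celsius_to_fahrenheit(-17.8))    # ~0 F
    print(celsius_to_fahrenheit(37.8))     # ~100 F -- roughly the "human range" endpoints above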
Finally, there is the ampere. While not bad in itself, it leads to absurd derived units for inductance and capacitance. The unit of capacitance? The farad. But what you actually see are microfarads, picofarads, even micro-microfarads in the older literature. Inductance and the henry are not as bad, but they're by no means good.
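A couple of garden-variety component values (illustrative, nothing exotic) show how far below the base unit everyday capacitors sit:

    # Typical capacitor values, illustrative only.
    pico, micro = 1e-12, 1e-6
    small_ceramic_F = 100 * pico        # 1e-10 F
    big_electrolytic_F = 470 * micro    # 4.7e-4 F
    print(small_ceramic_F, big_electrolytic_F)   # both many orders of magnitude short of one farad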
The remaining two 'fundamental' units, the mole and the candela, are anything but. The mole is basically just a conversion factor from atomic mass units to grams. The candela is just the power per unit solid angle of a specified monochromatic light source, rescaled by a fixed factor.
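The "conversion factor" remark is easy to check with the standard constants:

    # A mole is the number that turns atomic mass units into grams.
    avogadro = 6.02214076e23            # exact, by the 2019 SI redefinition
    amu_in_grams = 1.66053907e-24       # one atomic mass unit, in grams (approximate)
    print(avogadro * amu_in_grams)      # ~1.0 -- a mole of 1-amu particles masses about one gram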
How well were those base units chosen? Well, consider that there has never been a system of units based on them directly. The cgs systems (there were at least two) were based on the second and the gram, but used the centimeter for length because the results of using the meter were ludicrous. Still, it led to a unit of energy, the erg, that is so small that "One hundred million ergs!" (a line from a really bad sci-fi movie) is about ten joules -- barely a crumb's worth of a PBJ sandwich.
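The arithmetic, for the skeptical:

    # Just how small the erg is (1 erg = 1e-7 J by definition).
    ergs = 1e8
    joules = ergs * 1e-7              # 10 J
    food_calories = joules / 4184     # one food Calorie (kcal) is 4184 J
    print(joules, food_calories)      # 10 J, or about 0.0024 Calories -- crumbs, not quarter sandwiches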
The MKS system, the basis for the SI units, is hardly better. The unit of force, the newton, is too small for most practical applications. Ask a metric user what he weighs, and he'll give you his mass in kilograms instead. Metric torque wrenches are not labeled in N•m or N•cm (which are units of torque), probably because there isn't enough room on the barrel of the wrench to write all the digits. Instead, they're labeled in kg•m, kg•cm, or g•cm. I'm not too sure what a mass times a distance is, but I'm sure it's not torque by any definition I know.
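In fairness, those scales are conventionally read as kilogram-force meters; the conversion to honest torque units is just a multiplication by standard gravity:

    # A "kg*m" torque scale is conventionally kilogram-force meters;
    # multiply by standard gravity to get newton-meters.
    g0 = 9.80665                       # m/s^2, standard gravity

    def kgf_m_to_newton_meters(kgf_m):
        return kgf_m * g0

    print(kgf_m_to_newton_meters(5))   # a 5 kgf*m setting is about 49 N*m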
As to the vaunted power-of-ten multiples, so what? First, if appropriately sized base units are used to measure things, then not many multiples are ever needed. Second, the unfortunate happenstance of our four fingers and a thumb gave us one of the worst numbering systems possible. Given our bilateral symmetry, it was pretty much inevitable that we would use an even base. Had we had four or six digits on each hand, we'd be a lot better off. Eight would be beautiful in that it would provide fast conversions between bases that are a power of 2. Twelve, though not as useful, at least has the advantage of having 2, 3, 4, and 6 as its factors.
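A two-line check of the divisor argument:

    # Divisors (between 1 and the base itself) for a few candidate bases.
    def proper_divisors(n):
        return [d for d in range(2, n) if n % d == 0]

    for base in (8, 10, 12, 16):
        print(base, proper_divisors(base))
    # 10 -> [2, 5]; 12 -> [2, 3, 4, 6]; 8 and 16 only split cleanly by powers of two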
How important is this? The next time you bake a pizza (or other pie) at home, try to cut it into ten pieces. Humans are very good at dividing things into halves, and halves of halves, and so on. They're almost as good at dividing things into thirds. But it's the rare untrained person that can get a division of an object by fifths right.
And, yeah, really nice system of prefixes. There are prefixes that are dozens of orders of magnitude apart, but their abbreviations differ only in being capital or lower case. Nice source of potential mistakes. Like peta and pico, zetta and zepto, yotta and yocto, and (before they changed it) deca and deci. Of course, they fixed that last pair by breaking their own rule about one-letter abbreviations. And then there's the whole mega and milli and micro mess. Do you know, right off the top of your head, where µ is on your keyboard?
Even trained people screw up all the time. If I were really getting the 0.5 MG daily dose of tacrolimus that my prescription bottle claims, I'd be writing this on a Ouija board instead of a keyboard.
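Spelled out, the case trap looks like this (the prefix values are the standard SI definitions; the megagram reading is, of course, the absurd one):

    # Case is all that separates some prefixes.
    prefix = {"P": 1e15, "p": 1e-12,    # peta vs. pico
              "Z": 1e21, "z": 1e-21,    # zetta vs. zepto
              "Y": 1e24, "y": 1e-24,    # yotta vs. yocto
              "M": 1e6,  "m": 1e-3}     # mega vs. milli

    intended_dose_g = 0.5 * prefix["m"]   # 0.0005 g of tacrolimus, as prescribed
    label_dose_g = 0.5 * prefix["M"]      # 500,000 g -- half a metric ton, as (mis)printed
    print(intended_dose_g, label_dose_g)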
Ah, but conversion is so much easier, say the metrication supporters. That's their big bad bugaboo. Sure, converting the width of the Atlantic from miles to nails is a lot harder than from kilometers to centimeters. But why would anyone want to? Or, except as a stupid exercise in primary school, how often does anyone actually do it? How often does someone need to convert barrels (any of them) to ounces (any of them)?
Consider the hand. You're out in the paddock, looking over some horses. One catches your eye. You look in its eyes, you check its ears, you examine its teeth for rings and wear, and then you want an idea of how tall it is. Realizing you've left your meter stick at home, you jump into your car, go get the stick, and drive back to the paddock. By then, the horse is no longer qualified to run as a two-year-old. Me? I've got my measuring sticks with me at all times. I simply walk them up, hoof to withers, counting as I go and estimating the last to the nearest 1/2 hand. If I get 16 or so hands, I know I've got a reasonable-sized horse. No Clydesdale, but no pony, either. I compare hands to hands. I don't convert them to inches or fathoms or millimeters.
If a 16 hand horse runs a 7 furlong race, does anyone in their right mind convert that to 64 inches and 55,440 inches to find that the horse has run 866 times its height?
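For the record, here's the pointless conversion, spelled out anyway:

    # The conversion nobody bothers with.
    inches_per_hand = 4
    inches_per_furlong = 220 * 36        # a furlong is 220 yards
    height_in = 16 * inches_per_hand     # 64 inches
    race_in = 7 * inches_per_furlong     # 55,440 inches
    print(race_in / height_in)           # 866.25 times the horse's height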
When you're driving, do you convert your speedometer reading from km/hr to cm/s? If you're running the 100 meter sprint, do you convert that to 0.1 km or 10^5 mm? The reason the metric system needs easy conversions is the one-size-fits-all mentality. If one is going to measure the width of a nucleus with the same ruler used for the diameter of the universe, he's gonna need a lot of powers of ten. But in real applications, the metric system buys nothing. It is neither easier nor harder to calculate gas usage by dividing some number of miles by some number of gallons than by dividing some number of kilometers by some number of liters.
And yes, one unit per dimension is easy. On paper. But when you are talking about the mass of stars, is it reasonable to use the same unit that you use to measure the mass of the hydrogen atom? (Hint: the first is usually given in solar masses, the second in atomic mass units. And neither has an exact conversion factor to SI units.)
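Rough numbers for that span (both values are measured, not exact):

    # The dynamic range a single mass unit would have to cover (approximate values).
    solar_mass_kg = 1.989e30            # one solar mass
    atomic_mass_unit_kg = 1.6605e-27    # one atomic mass unit (dalton)
    print(solar_mass_kg / atomic_mass_unit_kg)   # ~1.2e57 -- about 57 orders of magnitude apart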
There are a lot of conversion factors in the traditional units. My point is that all of those conversions don't matter. First, most people have no need of most of those units; they're specific to surveying, or navigation, or jewelry. Second, even when they are used, conversion is seldom if ever necessary. There are few, if any, recipes that call for 16 ounces of flour and 2 ounces of gold, making it necessary to convert from troy to avoirdupois.
So, yes, the metric system has a certain simplicity in definition. But one pays for that simplicity in many ways, including unnecessary conversions and the introduction of many zeros either before or after the decimal point. I'm not saying the metric system should not be used. I'm not even saying that it shouldn't be the basis for all measuring systems (as it is in the USA). I'm saying that demanding that everything be measured in SI units just for the sake of a trivial consistency is wrong. Discarding utility for elegance is a stupid, elitist attitude. Appropriate for academicians during the French Revolution, possibly, but not for a modern society that has expanded its range of lengths and masses by orders of magnitude in both directions.
The imperial units are not arbitrary. Quite the opposite, they were developed as they were needed and they were convenient for the quantities they measured. So, terrestrial distances can be measured in miles in five digits or less. An arrow shot can be stepped off in yards, the width of a wall in feet. Originally, each unit stood alone, and no one would have thought of measuring a bolt of cloth with a surveyor's chain.
Consider how things are measured. For instance, atmospheric pressure. You take a glass tube with one end sealed, fill it with mercury, and stand it up in a bowl of the same, open end down. Then you measure the height of the mercury in the tube relative to that in the bowl. That gives you a direct measurement of the pressure in some unit of length (usually mm of Hg or inches of Hg). Now, atmospheric pressure is usually just compared to atmospheric pressure, not to the pressure it takes to form diamonds nor to that of intergalactic space. Just what is gained from converting from mm-Hg to Pa? It makes the useful comparisons no easier, and simply adds one unnecessary and potentially error-causing step to the process.
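If you really do want the conversion, it's nothing more than this (standard factors):

    # Converting a barometer reading to pascals.
    pa_per_mmHg = 133.322       # pascals per millimeter of mercury
    pa_per_inHg = 3386.39       # pascals per inch of mercury
    print(760 * pa_per_mmHg)    # ~101,325 Pa, one standard atmosphere
    print(29.92 * pa_per_inHg)  # ~101,300 Pa, the same reading in inches of mercury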
Useful units arise as needed, and go when no longer needed. They don't get handed down from ivory towers, nor are they mandated by politicians. Consider atomic cross sections in barns. Or look at the league. Its origin was how far an average person could walk in an hour. Now that people seldom walk farther than from the couch to the refrigerator, it is of little use. But when shank's mare was most people's form of transportation, it was a very useful unit. If someone told you it was 5 leagues to town, you knew right away how long it would take to get there. If they told you it was 24 km, you'd have to go through "Hm, I can walk about 5 km in an hour, so it'll take me about 5 hours." The league came about because it was useful. If it was ever formally defined in terms of other lengths, then that is a relatively modern (and useless) innovation. And it didn't need to be legislated away. Along with the knowledge of how to saddle a horse, it simply slipped into obscurity when it was no longer needed.
So, no. I'm not sold on the metric system. It is a poor system that, like the jack of all trades, does many jobs, but none of them well. The one real advantage it has is that it is the closest thing the world has to a unified measuring system. But don't take that too far. When I was still working on cars, I discovered that German, French, and Italian metric nuts and bolts were not interchangeable. Oh, yeah, they all had the same nominal-size heads and shafts, but they used different pitches and different thread shapes. My tools have been gathering dust since about '85, so that may have changed with the EU and all. But remember, standards are important. That's why every company, every country, every union has its own.
Now, if you really want a great system of units, consider a set of natural units.
--Pete
How big was the aquarium in Noah's ark?