(Last Mod: 20 August 2013 14:22:04)
Tracking and checking units is perhaps the single most valuable error-detection tool available to an engineer. Unfortunately, many (if not most) engineers do not avail themselves of this tool because it was never expected of them as part of their education. Sadly, this leads to a sloppy mindset in which wrong answers go undetected. While the consequences for students who are sloppy are missed points and lower grades, the consequences in the real world are, well, real. People can and have died, and major projects have failed, as a result of units mistakes that should never have gone uncaught.
Why is this allowed to continue? I believe there are two chief culprits.
First, engineers function in two thoroughly intertwined worlds -- their social world and their professional world. In their social world, they interact with other humans in the very informal manner to which humans have adapted. In most conversations, what a person means and the words they use to convey that meaning are two very different things -- the words convey the gist of the meaning, and the receiver uses the context of the conversation to fill in the gaps. The ability to extract meaning from what are, more often than not, very ambiguous statements is a skill that human beings are innately very good at, and one that serves us well day in and day out -- so much so that we seldom consider all of the cues we use to establish the context of our words, be it a hand gesture, a phrase from a classic movie, a cliche, or the sentences that come before and after. If you have ever interacted extensively with someone from a different culture, particularly someone with a different native language, you have glimpsed this process: you struggle to convey a "simple" idea and spend considerable time getting the point across because you lack a reliable set of common cues to leverage and must be much more complete, exact, and precise with the words you choose. In our professional worlds, by contrast, we have to convey complex ideas with a high degree of specificity, and that is a learned trait. Even here, though, we frequently communicate with other members of our field using shorthand jargon and catch phrases drawn from our field's common context.
The result is that we don't have clear-cut boundaries between when we can use inference and context and when we must be explicit and precise. Because of our innate ability to draw the right conclusions from context in most of our social and professional interactions, we have a marked tendency to rely on inference unless we force ourselves to do otherwise.
Second, a large fraction of textbooks do not place an emphasis on the proper tracking of units, and when they present worked examples the units are, at best, tacked onto the answer as an afterthought. It is hard to convince a student that the practice they see in most of their textbooks is not an acceptable one for them to follow. But why do so many textbooks do this? The most likely explanation, albeit one that is almost certainly oversimplified, is that most textbook authors have three things working against them: (1) they themselves were never taught the proper use and tracking of units; (2) they are using previous textbooks as the basic template for their own; and (3) they come predominantly from the world of academia, where the penalties for wrong answers are lower grades, not crashed airplanes or failed bridges.
The description of a physical quantity consists of the product of two components: the numerical value and the units. It is not correct to convey just the numerical value, or to work with the numerical value and ignore the units. For instance, the table I am sitting at does not have a width of 6; it has a width of 6 ft. My height is not 182; it is 182 cm. The same applies to quantities used in areas such as circuit analysis. A battery has a voltage of 12 V, not just 12. The current flowing in a resistor is 3.21 A, not 3.21. The units are an intrinsic part of the quantity and can't be cast aside as too inconvenient to write down and then magically tacked back onto the final answer. They must be tracked throughout the work.
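The idea that a quantity is a value paired with its units can be made concrete in a few lines of code. The following is a minimal sketch (the `quantity` and `multiply` helpers are hypothetical illustrations, not a real library): units are stored as a dictionary of base-unit exponents, and multiplying quantities multiplies the values while adding the exponents, so units cancel exactly as they do on paper.

```python
def quantity(value, units):
    """A physical quantity: a numerical value paired with its units."""
    return (value, dict(units))

def multiply(a, b):
    """Multiply two quantities: values multiply, unit exponents add."""
    value = a[0] * b[0]
    units = dict(a[1])
    for u, exp in b[1].items():
        units[u] = units.get(u, 0) + exp
        if units[u] == 0:
            del units[u]  # fully cancelled units drop out
    return (value, units)

# Ohm's law with tracked units: a current of 3.21 A through a
# 10-ohm resistor, where an ohm is a volt per ampere (V/A).
current = quantity(3.21, {"A": 1})
resistance = quantity(10.0, {"V": 1, "A": -1})
voltage = multiply(current, resistance)
print(voltage)  # the ampere exponents cancel, leaving volts
```

The point of the sketch is not the code itself but the bookkeeping it mirrors: the units ride along with every intermediate result instead of being reattached at the end.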
The reasons for this might seem arbitrary or overly nitpicky, but in fact they are immensely practical. As human beings we will always make mistakes -- it's one of the few things in life we can truly count on. We will make mistakes setting a problem up, we will make algebra mistakes manipulating it, and we will drop a quantity from one step to the next through pure oversight. But by tracking our units rigorously, in a way that becomes second nature, we gain the ability to do a truly remarkable and powerful thing -- catch the overwhelming majority of these mistakes almost immediately after they are made, and quickly identify and correct them. The reason is simple: most of the mistakes you will make -- not all, but most -- will mess up the units. If you are tracking the units through your work, at some point (usually pretty quickly) the units will no longer work out. At any point that the units don't work out as they must, there is absolutely no point in going on, because you KNOW the answer you will get will be wrong. Stop, find, and fix the error before proceeding; and the fact that you tracked your units will make this almost trivial in most instances, because all you have to do is walk through the work, forward or backward, and find where the units went from correct to wrong -- that is almost always where your error lies.
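The "stop as soon as the units break" workflow can be sketched as well. Assuming a hypothetical `check_add` helper (an illustration, not a real library) that represents a quantity as a value paired with a dictionary of base-unit exponents, addition is only defined when the units match exactly -- so a mistake in setting up a formula fails loudly at the very step that introduced it, rather than surfacing pages later as a wrong number.

```python
def check_add(a, b):
    """Add two (value, units) pairs; refuse mismatched units."""
    value_a, units_a = a
    value_b, units_b = b
    if units_a != units_b:
        raise ValueError(f"unit mismatch: {units_a} vs {units_b}")
    return (value_a + value_b, units_a)

# An energy in kg*m^2/s^2 accidentally added to a force in kg*m/s^2 --
# a typical slip where a term was squared in one place but not another.
energy = (50.0, {"kg": 1, "m": 2, "s": -2})
force = (9.8, {"kg": 1, "m": 1, "s": -2})
try:
    total = check_add(energy, force)
except ValueError as err:
    print("caught immediately:", err)
```

Without the units check, 50.0 + 9.8 would happily evaluate to 59.8 and the error would travel silently through the rest of the work.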
The alternatives really aren't that attractive. You can blindly chug through literally pages of work, spending hours to get an answer, and, if you are lucky, realize at that point that it must be wrong -- and then quickly realize that you need to start from scratch, because it is virtually impossible to find the error in the mess of numbers in front of you. If you are unlucky, you don't catch the mistake and get little or no credit in exchange for your hard labor -- your time would have been better spent taking in a movie or getting some sleep. Even worse, in the real world you cost your employer or your customer time and money and they fire you; or, worst of all, your easily caught mistake goes uncaught, someone is seriously injured or killed as a result, and you find yourself being prosecuted for criminal negligence or sued for wrongful injury or death.
This last point really can't be overstressed. You are being trained and educated to take your place in society as an engineer, and along with that goes a personal and professional responsibility far beyond what a doctor is saddled with, if for no other reason than that doctors are almost always limited to killing people one at a time while engineers do it in job lots. Part of that moral, and at times legal, responsibility is to adopt reasonable practices aimed at minimizing the number of errors that go uncaught. Given that the proper use of units is a completely free and highly effective practice capable of catching most errors, can it really be argued that an engineer does NOT have a moral and ethical obligation to use them properly at all times?