The popular meaning of decimate is "kill, destroy, or remove a large percentage or part of." The proper meaning of decimate is "to kill, destroy, or remove one tenth of."
We all cope with the fact that most people use "literally" to mean "figuratively", and if you're like me, when you hear someone say "literally", your bogometer kicks in and tells you which they actually mean; sometimes you have to ask, when the literal reading is remote but genuinely possible.
My bogometer fails me when I hear the word "decimate", however. My first thought is that it's a 90% reduction, rather than a 10% reduction, and there's rarely a clue as to which is intended. The speaker is devastated at the size of the loss, but if we were to lose 10% of our blood or 10% of our fingers, we'd be sorely aggrieved too.
I think the problem is that our language hasn't yet embraced decimal notation.
We have double, triple and quadruple to indicate multiplying by two, three or four, and it is SO easy to multiply by one order of magnitude, yet we don't seem to have a word that means "to shift the decimal point one place". We have decuple to indicate the result of such an operation, but only clumsy phrases to verb that action.
We also lack a word that means "a decimal digit" as bit means "binary digit", if you're up to word coinage this morning. The word bit proves quite useful in such terms as "bit-shifting", "bit-flipping" and other "bit-fiddling". For instance, it would be useful to have a term for decimal digit in explaining casting out nines, or in explaining subtraction by adding the 9s-complement of the subtrahend. Should a language that has the term "subtrahend" have a term that means "decimal digit"?
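For what it's worth, both of those tricks are easy to demonstrate. Here's a quick sketch in Python (function names are my own, purely for illustration): casting out nines reduces a number to a one-digit check value by repeatedly summing its decimal digits, and 9s-complement subtraction replaces a - b with a plus the digit-wise complement of b, followed by an end-around carry.

```python
def digit_sum(n):
    # Sum the decimal digits of n (the heart of casting out nines).
    return sum(int(d) for d in str(n))

def cast_out_nines(n):
    # Repeatedly sum digits until one digit remains. This equals n mod 9,
    # except that positive multiples of 9 reduce to 9 rather than 0.
    while n > 9:
        n = digit_sum(n)
    return n

def nines_complement(n, width):
    # Replace each decimal digit d with 9 - d, padding to `width` digits.
    return int("".join(str(9 - int(d)) for d in str(n).zfill(width)))

def subtract_via_complement(a, b, width):
    # a - b, assuming 0 <= b <= a < 10**width: add the 9s-complement
    # of the subtrahend, then bring the carry-out around as +1.
    total = a + nines_complement(b, width)
    if total >= 10 ** width:
        total = total - 10 ** width + 1  # end-around carry
    return total

# Casting out nines checks 478 + 396 = 874: the addends reduce to 1 and 9,
# 1 + 9 = 10 reduces to 1, and so does 874 -- the check digits agree.
print(cast_out_nines(478), cast_out_nines(396), cast_out_nines(874))  # 1 9 1
print(subtract_via_complement(725, 342, 3))  # 725 - 342 = 383
```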
And if we had a word that meant "to increase 10x in magnitude", people might start to understand the difference between "on the order of" and "on the close order of".
deaconB said: We also lack a word that means “a decimal digit” as bit means “binary digit” ...
I believe the word is simply "digit" whatever base you're counting in. At least that's the way I've always used the word. The contraction of "binary digit" into "bit" was just that ... a logical contraction of something that was being spoken and written a lot after the invention of computing. Kinda like the more recent contraction of "quantum bit" into "qubit." But for "decimal digit" that hasn't happened (for whatever reason), so your only choice is simply the generic "digit". And that also works for hexadecimal numbers, even for the non-numeric parts. We could properly say that the fifth "digit" in #6617b5 (a hex color code) is "b".
So unless you want to coin a new term, I think we're stuck with the generic "digit" or the prefixed "decimal digit" if you want to be specific.
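And "digit" really is base-agnostic in practice. A tiny Python sketch (the function name is my own, just for illustration) that pulls the digits out of a number in any base, hex included:

```python
def digits(n, base=10):
    # Digits of a non-negative integer in the given base,
    # most significant first; works for any base >= 2.
    out = []
    while n:
        out.append(n % base)
        n //= base
    return out[::-1] or [0]

# The hex color #6617b5 does indeed have 11 (i.e. "b") as its fifth digit.
print(digits(0x6617b5, 16))  # [6, 6, 1, 7, 11, 5]
```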
I am not sure that bit really has any strong association with digit, if this latter word is mainly associated with numbers. A digital photo is a collection of bits that you decode with the appropriate photo reader. But 'digital photo' refers to modern technique, not the concept of numbers. Once I used bit strings to represent 3-d solids for quick clearance detection. Rumors have it that the quantum bit is not even binary.
Here is what the OxED says about bit (in its fourth entry):
Etymology: Abbrev. of binary digit.
A unit of information derived from a choice between two equally probable alternatives or ‘events’; such a unit stored electronically in a computer.
1948 C. E. Shannon in Bell Syst. Techn. Jrnl. July 380 The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey.
Google's Ngrams finds the above and several others, but the dates shown are often the beginning date of the journal, not the date of the quote.
BTW, I think bit (although made up) works because it corresponds to an earlier definition (listed in the OxED's second entry):
4.a. A small portion or quantity, a little (of anything material or immaterial). Also applied to complete objects, viewed as portions or samples of a substance. to give any one a bit of one's mind: (colloq.) to express one's candid (and uncomplimentary) opinion of his conduct, etc.
I don't think any other abbreviation of "X digit", where X could be decimal, hexadecimal, octal, etc., can make a similar correspondence. That may be the reason why none of those abbreviations have been accepted into the lexicon.
RobertB said: Rumors have it that the quantum bit is not even binary.
Yeah, that's what the whole "qubit" thing is about. When you read the information, it's gonna be either a 0 or 1. Just like a standard "bit." So it's still really binary. But it has this spooky property of being able to exist in a superposition of 0 and 1 states, depending on quantum correlations. If they ever build a full quantum computer, and I've no doubt that will happen, probably before the end of the century, it would have the ability to solve currently intractable problems ... cryptography, exact facial recognition, climate modeling, protein folding, SETI signal filtering ... the list goes on.
I don't pretend to understand how the technology functions, but I get the implications if it does.