Recently we discussed some very old marksmanship in connection with Civil War sharpshooters (see this post and this one, and there’s more to come, thanks to expert Fred Ray). A couple of commenters asked about the “string” measurement of marksmanship precision and accuracy that was used at the time, and well into the 20th Century, before the modern measures of precision (group size) and accuracy (distance between the point of aim and the actual point of impact) were developed.
Fortunately, the late Steve Ricciardelli of Steve’s Pages incorporated an explanation in a list of measures of group size and central tendency:
This is an old method still used to determine a shooter’s skill at hitting a target. It assumes the point of aim is always the desired point of impact, and it is simply the sum of the distances from the point of aim to each bullet hole. Originally a string was used to gather the distances, hence the name. It is still a valid measure of total error relative to the aim point. The string measurement, however, cannot be used to analyze sight settings, because it measures only the magnitude of error, not its direction. It is also not a useful measure of group size, because a tight group located away from the bullseye will still produce a large string measurement.
The string measurement is old, but it remains surprisingly useful in the real world for getting a broad idea of the practical accuracy of a specific shooter-and-firearm combination, or for putting shooters in a rough rank order (say, when grouping soldiers for marksmanship training). A marksman seeking to maximize performance (a benchrest shooter, say) would not want to use it, because he or she needs to separate the possible causes of misses: you do something different if your windage is off than if your group is too large.
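To make the difference concrete, here is a minimal sketch in Python that computes both measures for a handful of made-up bullet-hole coordinates (the numbers are purely illustrative, not from any real target). The point of aim is taken as the origin, per the definition above; “group size” is computed as extreme spread, the largest center-to-center distance between any two holes.

```python
import math

# Hypothetical bullet-hole coordinates in inches, measured from the
# point of aim at (0, 0). Illustrative numbers only: a tight cluster
# sitting low and right of the aim point.
holes = [(1.2, -0.8), (1.5, -1.1), (0.9, -0.6), (1.4, -0.9), (1.1, -1.0)]

def string_measurement(holes):
    """Sum of distances from the point of aim (origin) to each hole --
    what you'd get by running a string from the bullseye to every hole."""
    return sum(math.hypot(x, y) for x, y in holes)

def extreme_spread(holes):
    """Modern 'group size': the largest distance between any two holes,
    which ignores where the group sits relative to the aim point."""
    return max(math.hypot(x1 - x2, y1 - y2)
               for i, (x1, y1) in enumerate(holes)
               for x2, y2 in holes[i + 1:])

print(f"string measurement:  {string_measurement(holes):.2f} in")
print(f"group (extreme spread): {extreme_spread(holes):.2f} in")
```

For this invented target the group is under an inch, yet the string runs to several inches, because every shot landed away from the bullseye: exactly the “tight group located away from the bullseye” case the quoted passage warns about, and why a benchrest shooter wants the two numbers kept separate.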
Turns out a string is useful for something, even though ATF doesn’t say it’s a machine gun any more. (Yes, they once did. But that is another story!)