I began my career over thirty years ago as a COBOL programmer (these days I work mostly in C++), so perhaps I should answer the question.
The number of characters you can represent doubles each time you use an additional bit of storage.
1 bit ==> 2 different characters
2 bits ==> 4 different characters
3 bits ==> 8 different characters
4 bits ==> 16 different characters
5 bits ==> 32 different characters
6 bits ==> 64 different characters
7 bits ==> 128 different characters
8 bits ==> 256 different characters
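Since I work in C++ these days, here is a quick sketch that reproduces that table, just to make the arithmetic concrete; each extra bit shifts the count left by one place, i.e. doubles it:

    #include <iostream>

    int main() {
        // n bits can represent 2^n distinct values; each additional
        // bit doubles the count, which is all the table above says.
        for (int bits = 1; bits <= 8; ++bits) {
            std::cout << bits << " bits ==> " << (1 << bits)
                      << " different characters\n";
        }
    }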
The character set we typically use has something like 90 characters, so at least 7 bits are needed to represent it. Since computers are organized by bytes (8 bits), normally we use a whole byte to represent each character. Thus, if you are going to allow up to 6 characters per number, the character representation would require 6 bytes of storage.
On the other hand, there are only 10 digits, so you can represent a digit using just 4 bits; that is, you can store 2 digits in each byte. So the digit representation (what is sometimes called "packed" or "BCD") of a 6-digit number would require just 3 bytes of storage.
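To make the two layouts concrete, here is a small C++ sketch (my own illustration, not the COBOL of the day) that packs a 6-digit number two decimal digits per byte, the way packed/BCD storage does, and unpacks it again:

    #include <array>
    #include <cstdint>
    #include <cstdio>

    // Pack a number of up to 6 digits into 3 bytes, one decimal digit
    // per 4-bit nibble (the idea behind "packed" / BCD storage).
    std::array<std::uint8_t, 3> pack_bcd(unsigned value) {
        std::array<std::uint8_t, 3> out{};
        for (int i = 2; i >= 0; --i) {
            unsigned low = value % 10;   // low-order digit -> low nibble
            value /= 10;
            unsigned high = value % 10;  // next digit -> high nibble
            value /= 10;
            out[i] = static_cast<std::uint8_t>((high << 4) | low);
        }
        return out;
    }

    unsigned unpack_bcd(const std::array<std::uint8_t, 3>& bytes) {
        unsigned value = 0;
        for (std::uint8_t b : bytes) {
            value = value * 100 + (b >> 4) * 10 + (b & 0x0F);
        }
        return value;
    }

    int main() {
        auto packed = pack_bcd(123456);
        std::printf("packed: %02X %02X %02X\n",
                    static_cast<unsigned>(packed[0]),
                    static_cast<unsigned>(packed[1]),
                    static_cast<unsigned>(packed[2]));      // 12 34 56
        std::printf("unpacked: %u\n", unpack_bcd(packed));  // 123456
    }

The character representation of 123456 would occupy 6 bytes, one byte per digit character, while the packed bytes come out as 12 34 56 in hex: 3 bytes, half the space.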
Saving 3 bytes doesn't seem like much now, but 40 years ago storage was far more precious than it is today, and we were trained (we're talking about the era when I was a college student) to save every last byte.
My wife finished her thesis (in Chemistry) in 1980, ten years after the time we are talking about, but even then the program she used was too big for the main academic computer at Indiana University (she was storing something like 23000 4-byte "double precision" numbers). So her advisor got permission to run her program overnight on the main administrative computer; it didn't have quite as much real memory, but it was equipped with "virtual memory", that is, the ability to automatically shift contents between real memory and the disk so that the user appeared to have much more real memory available. And running her program did take that computer almost all night to finish.
At the time I commented, "Someday computing will reach the point where you could run that program on a computer in our bedroom." She tossed all her stuff as soon as she was done, so we never tested that idea, but a typical home computer of 2000 could have breezed through her program.
BTW, these storage-saving habits are also what led to the Y2K problem, since we routinely saved a byte per date by recording just the last 2 digits of the year.
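A tiny illustration (in modern C++, not how we actually laid out the records back then) of why dropping the century digits bites once dates cross the year 2000:

    #include <iostream>

    int main() {
        // Storing only the last 2 digits of the year saved a byte per
        // date, but 1899, 1999, and 2099 all become "99", and any
        // arithmetic across the century boundary goes wrong:
        int born = 1956 % 100;  // stored as 56
        int now  = 2000 % 100;  // stored as 00
        std::cout << "apparent age: " << (now - born) << "\n";  // -56, not 44
    }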