Back in the days when software was distributed on floppy disks (remember floppy disks?), the rule of thumb for Windows was that one byte cost a dollar.
In other words, once you considered the cost of materials, the additional manufacturing time, the contribution to product weight, the cost of replacing media that became defective after they left the factory (e.g., during shipping), the effect of data compression, and so on, the incremental cost of adding another megabyte to the Windows product was around one million dollars, or about a dollar per byte.
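Spelled out (and using the rough decimal approximation that a megabyte is a million bytes), the arithmetic is simply

$$
\frac{\$1{,}000{,}000 \text{ per megabyte}}{\sim 10^{6} \text{ bytes per megabyte}} \approx \$1 \text{ per byte}.
$$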
This was a cute rule of thumb to have, because it let you put an (admittedly somewhat artificial) monetary value on code bloat. Was your feature even worth the disk space?
Of course, the advent of the CD as the primary distribution medium changed the mathematics, but there was still great concern over the size of the operating system. It is my understanding that the Windows Server 2003 CD was basically “full”. It may not look full to you, but remember that your CD is probably the 32-bit English version. Additional space needs to be reserved for translations into other languages, and don’t forget that the 64-bit edition of Windows is roughly twice as big as the 32-bit version, since it needs to contain two operating systems, the native 64-bit one and the emulated 32-bit one. (It’s not quite that bad, because some files can be shared, and many 32-bit components can be jettisoned.)
And then distribution media switched to DVDs, and now distribution is entirely online. I wonder what the cost per byte is nowadays. The cost is now in bandwidth, but bandwidth still costs money.