Okay, I went a bit manic today over one question, asking Jon "Why, Jon, why?!" so many times that I think I must have annoyed him.

We were in Maths class today when, near the end, Mr Masukor talked about how numbers are represented in calculators and computers, and how dumb these machines really are, because they cannot multiply or divide, only add and subtract. He told us how a calculator slides the digits along when it performs multiplication by 2, much in the same way we slide digits when we multiply by 10 in our decimal system (there's a little sketch of this just after the question below). Then it came to me, without me being able to stop it, for whatever it's worth, this one question that got me to go bug Jon and Ju Anne and Riza and even Mr Saimun for an answer:

Why is our computer system built on binary instead of decimal?
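Before I go on, here's roughly what Mr Masukor's digit-sliding trick looks like if you scribble it out (I'm using Python purely as illustration here; none of this was in the class):

```python
# Multiplying by 2 in binary just slides every digit one place to
# the left, the same way multiplying by 10 works in decimal.
n = 13                   # 1101 in binary
print(bin(n))            # 0b1101
print(bin(n << 1))       # 0b11010 -> 26: same digits, slid left
print(n << 1 == n * 2)   # True
```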

We know that down at the most basic level of any computer system, from something as simple as your watch or calculator to the most advanced computer, all processing of information is conducted in a representation of the digits 0 and 1. We know this is because the most basic component of the computer system is the transistor, which can be turned on and off, so 1 represents its on state and 0 represents its off state, the ultimate language of the computer. We know that whatever processing we want it to perform, the instructions must be converted into this language of 0 and 1, so a simple task like viewing this page could mean millions of individual yes-no instructions that need to be processed, all represented by strings of 0 and 1.
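To see what "strings of 0 and 1" actually means, here's a tiny sketch (Python again, just for illustration): even two letters of text boil down to a row of on-off states.

```python
# Every character on this page is ultimately a string of 0s and 1s.
# Here "Hi" becomes its 8-bit ASCII representation.
text = "Hi"
bits = " ".join(format(ord(ch), "08b") for ch in text)
print(bits)  # 01001000 01101001
```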

So the reason behind the use of the binary system is that the transistor can only be turned on and off. But then I thought: why can't each transistor have the states on and off plus 8 in-between states? Why can't they be half-on or quarter-on or 1/10-on, for example? These in-between states could be made by varying the amount of current passing through the transistor, and by using resistors each individual state could be detected. If we have 8 of these in-between states plus fully on and fully off, then we can have a decimal system instead of a binary system. Instead of having to code everything into 0 and 1 for processing, information could be represented by 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9. This would certainly decrease the number of digits required to represent all the different bits of information. Instead of having to process millions of instructions in binary to view this page, the number of instructions might well be reduced to only hundreds of thousands. And when fewer instructions are required for one task, the processor can perform many more tasks in a given time. A 3.0 GHz processor today might as well be equivalent to a 15.0 GHz one if it operated in a decimal system instead of binary.
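Out of curiosity, I scribbled a quick back-of-envelope comparison (Python once more, just as scratch paper). One decimal digit can hold about log2(10) ≈ 3.32 bits' worth of information, so the same number needs roughly 3.3 times fewer digits in decimal than in binary; take my bigger guesses above with that grain of salt.

```python
import math

# Back-of-envelope: how many digits does the same number need in
# base 2 versus base 10? One decimal digit holds log2(10) bits.
n = 10 ** 30
print(n.bit_length())    # 100 binary digits
print(len(str(n)))       # 31 decimal digits
print(math.log2(10))     # ~3.32, roughly the ratio between the two
```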

I know I made everything sound so simple. Of course, I'm guessing it won't be so easy to create the in-between states just by varying the amount of current, and using resistors might also pose a problem. But hey, the possibility is there. They didn't create the microprocessor straight away, did they? It all started with huge mechanical machines performing simple calculations, with data stored as punched holes in cards in those early days. There must be a way to produce those in-between states, one way or another. So why didn't we build our computer system on the decimal system instead of binary?

[Image: Hard Disk]

Of all those I have asked this question, no one has given me a satisfactory answer. The only explanation I can come up with right now is that when we first discovered the binary system and the wonders it could do, we got so excited that we began building on the technology straight away without ever seriously considering a decimal alternative. And now that we have built practically everything on the binary system, it just isn't viable to go all the way back and start from scratch with decimal. All the codes would have to be redefined, the processor architectures remapped, the graphics systems revised; the list goes on. Doing it now would be like going back to the days when a computer was as big as a large room (we called them mainframes back then). Of course not many people, if any at all, would ever be willing to undertake that rollback, but if we did, it would be a revolution. Just think of how much faster our computers would be and how much more data could be stored, for example.

I know that my answer is barely acceptable, but that's all my obviously limited knowledge can churn out. If anyone has a better answer, post it here. It'll be interesting to know 🙂