Binary is not very user-friendly, as humans use a base-10, or decimal, system for numbers. This is because they are analog devices with ten fingers and ten toes. That should prevent them from counting higher than twenty, but somehow they have managed to overcome that obstacle. Maybe by gathering together in groups?
Anyhow, if a bank of relays were wired so that it could add two numbers and produce the answer, a human would have to convert the numbers from base 10 to base 2 and wire the inputs to the relay banks accordingly. When the power was turned on there would be a lot of noise, and the state of the outputs would be the binary equivalent of the answer.
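For the humans who would rather not wire relays, here is a minimal sketch of the same idea in C (illustrative only): each "relay" is just a one-bit value, and the adder is built from the same AND, OR, and XOR decisions the relay contacts would make in hardware.

    #include <stdio.h>

    /* One relay-style full adder: three 1-bit inputs, a sum bit and a carry bit out. */
    static void full_adder(int a, int b, int carry_in, int *sum, int *carry_out)
    {
        *sum       = a ^ b ^ carry_in;               /* XOR of the three inputs      */
        *carry_out = (a & b) | (carry_in & (a ^ b)); /* carry when two or more are 1 */
    }

    int main(void)
    {
        /* Add 5 (binary 101) and 3 (binary 011) one bit at a time, like a relay bank. */
        int a = 5, b = 3, result = 0, carry = 0;

        for (int bit = 0; bit < 4; bit++) {
            int sum;
            full_adder((a >> bit) & 1, (b >> bit) & 1, carry, &sum, &carry);
            result |= sum << bit;
        }
        printf("%d + %d = %d\n", a, b, result);      /* prints: 5 + 3 = 8 */
        return 0;
    }

Less noise than the relays, but the logic is the same.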
As a note of interest, this is how the expression "bug" is said to have gotten started: a moth got trapped between two relay contacts, preventing them from closing, and the results of the computation were incorrect.
The digital computer began to grow and evolve: from relays to vacuum tubes to transistors to integrated circuits, then to the large-scale integrated circuits that we use today. It has become much smaller, faster, and much, much more powerful, but it still performs the same basic functions of moving, comparing, and manipulating binary data.
The "binary" information has evolved into three separate types. Originally it was "data". You put data in and you got data out. Then terminals were added so the humans would have an easier way of interfacing with the computer. This required both computer data and data that humans could understand. For example, the binary value 1234 is not the same as the digits 1234 that the human sees on the terminal. Also, the text that you see here means absolutely nothing to the computer.
Many forms of binary codes were devised to provide information that the humans could read, and eventually one standard survived: ASCII. The IBM character set on your PC/AT is an extended ASCII set with additional graphics codes.
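To see the difference for yourself, here is a small C sketch (assuming a typical compiler where an int is 32 bits): the value 1234 is stored as one binary number, 0x04D2, while the "1234" a human reads on the screen is four separate ASCII character codes.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        int  number = 1234;     /* what the computer works with: 0x04D2  */
        char text[] = "1234";   /* what the human sees: four ASCII codes */

        printf("binary value : 0x%04X (one %zu-byte integer)\n",
               (unsigned)number, sizeof number);

        printf("ASCII text   : ");
        for (size_t i = 0; i < strlen(text); i++)
            printf("0x%02X ", (unsigned char)text[i]);  /* 0x31 0x32 0x33 0x34 */
        printf("(%zu separate character codes)\n", strlen(text));
        return 0;
    }

Same digits to the human, two completely different patterns of bits to the machine.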
As the computer "processor" evolved, a third type of data was needed: instructions for the processor itself. These codes tell the processor what step to perform next. Unfortunately, there is no standard here, as each processor has different capabilities, and what makes things worse is that a common instruction set would run afoul of the processor manufacturers' intellectual property.
So, what have the humans done to keep up with this awesome revolution? Well, they first tried to describe binary information as ones and zeros and the decimal equivalent. Then they tried octal and hex, but despite everything this still meant nothing to anybody but the little guy with the thick glasses, long hair, and a beard.
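For the record, those notations are all just different ways of writing down the same bits; the value in memory never changes, only the way the humans choose to write it. A short C sketch makes the point:

    #include <stdio.h>

    int main(void)
    {
        unsigned value = 202;                        /* any value will do */

        printf("decimal : %u\n", value);             /* 202      */
        printf("octal   : %o\n", value);             /* 312      */
        printf("hex     : %X\n", value);             /* CA       */

        printf("binary  : ");
        for (int bit = 7; bit >= 0; bit--)           /* older C has no %b, so    */
            putchar((value >> bit) & 1 ? '1' : '0'); /* print the bits by hand   */
        putchar('\n');                               /* 11001010 */
        return 0;
    }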
Step 1, Assembly Language: a method of using words and notations to represent what the processor should do. When assembled, the information is converted to the actual binary instructions understood by the processor. This method is still used today, as it provides the fastest and most direct way of controlling a computer.
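As a rough illustration of what an assembler does (the mnemonics and opcode values below are invented for a fictitious processor, not taken from any real instruction set), here is a toy C sketch that turns human-readable words into the raw bytes a processor would execute:

    #include <stdio.h>
    #include <string.h>

    /* A toy "assembler": look up each mnemonic and emit its opcode byte. */
    struct opcode { const char *mnemonic; unsigned char code; };

    static const struct opcode table[] = {
        { "LOAD",  0x01 },   /* hypothetical: load a value into a register */
        { "ADD",   0x02 },   /* hypothetical: add to the register          */
        { "STORE", 0x03 },   /* hypothetical: write the register to memory */
        { "HALT",  0xFF },   /* hypothetical: stop the processor           */
    };

    int main(void)
    {
        const char *program[] = { "LOAD", "ADD", "STORE", "HALT" };

        printf("assembled bytes:");
        for (size_t i = 0; i < sizeof program / sizeof program[0]; i++)
            for (size_t j = 0; j < sizeof table / sizeof table[0]; j++)
                if (strcmp(program[i], table[j].mnemonic) == 0)
                    printf(" %02X", table[j].code);
        printf("\n");   /* prints: assembled bytes: 01 02 03 FF */
        return 0;
    }

Words in, binary out. That is the whole trick, and a real assembler just does it with a much bigger table.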
Step 2, Fortran, or Formula Translation, evolved from assembly. It contains powerful mathematical routines and the ability to translate mathematical equations written as text into the operations needed to evaluate them.
Step 3, There were many offshoots, but the most notable was Basic, which greatly simplified program development for the humans with short hair. Early versions were limited in ability and very slow, as they interpreted the human-readable text during execution. Over the years Basic has evolved into a very powerful language, and programs may now be compiled to generate a fast-running executable file.
Step 4, A few years back, Bell Labs decided to combine the ease of a higher-level language with the power of Assembly and make it portable to different processors. They called it "C". Officially it is named after its predecessor, a language called B, but I have been told that versions A and B simply did not work out.
Well, it lost the user-friendly aspect for anyone who was not a C programmer. It provided a lot of the power of Assembly but lacked its speed and compactness. It never was truly portable, even between compilers, without a lot of work. But it did become very popular with the new generation of programmers who were not interested in learning more than one language.
Let's get back to this binary thing. All the information on your computer is stored and executed in binary. You may ask, "If it's all binary, then why are there thousands of programs and languages?" Very simple: all those thousands of programs and languages are attempts by the humans to develop tools and interfaces that make it easier for them to generate the binary information the computer needs to perform the task.
The humans have evolved their computer interfaces to the point where they now have programs to write programs. What does this mean to you? Well, you can now run a user interface that requires 30 MB of disk space and 16 MB of RAM, then run a development package that requires another 60 MB of disk space, and develop an application that needs 600 MB of RAM to execute, only to get a "general protection fault" caused by a bug in another program.
A few years ago you could have done the same thing in about 10 KB of disk and RAM, but now it's "user friendly" for the "computer challenged" who have no brain, one finger, and a mouse.
The bright side is that now more people who know nothing about computers can generate applications that don't work, and the unsuspecting neophytes will buy more hardware and think that the applications actually do work.
If you have endured our history lesson, then you may still be interested to know that we can develop applications in many languages and on many platforms, including stand-alone microcontrollers.
We understand the need for the cutesy interface, but we also believe that speed, size, versatility, and functionality are still very important.