Binary

Might be worth pointing out that, while binary is not a programming language, it's not unheard of to program in binary.

It's unlikely that you ever will, though.

Generally, if you look at programming, it'll be from a point of view like this:
(when I say "next step down" I'm talking about getting closer to the silicon.)

Using a drag-and-drop type program (e.g. Media Builder), everything is basically done for you. It's like using ActionScript or JavaScript: you put an element down and say "on click, play this sound".
You don't need to worry about how files are actually loaded, or how networking works, or anything too in-depth.

Down from this you get languages like Visual Basic. Again, very drag-and-drop: you don't need to learn how to draw a window on the screen, but you get a bit more power over what things can and can't do. You're also a bit more involved with creating things, so rather than having a file-selection box as a pre-built thing ready to use, you put in the code to make that.

Down from that you start getting to languages like Java. You need to tell the program where everything goes, and there aren't many graphical (drag-and-drop type) interfaces; you're writing in code. But even though this sounds like really hard work, the computer and compiler are still doing a ton of work for you: you don't need to worry about much memory management or anything like that, and there are still plenty of "pre-built" things. The trouble is (and this might just be personal experience) that most things written in Java seem to be resource-hungry beasts. I guess it's cheaper to buy more memory than to figure out why your notepad application consumes 50MB of RAM. C++ gives you (the programmer) a bit more to worry about than Java, but by and large, lots of things are done for you.

Then you get things like C, where the onus is very much on you to allocate and de-allocate memory. You have to write much more code, but (generally speaking, in my experience) the results are slicker, i.e. they use fewer resources. For all intents and purposes, unless you're working on OS kernels or embedded hardware, you may as well consider C a dead language; there aren't that many programs written in it any more.

All these languages require compilers to move from a human-readable form to machine code.

After this there is another type of programming language.

Often when people talk about assembly languages they write them using mnemonics:

mov A,B
(move contents of register A to register B)
(there is a post on this here: http://www.computerforums.org/forums/programming/computer-architecture-help-226255.html)

When written like this there is still the need for a compiler of sorts (really an assembler).

In this case, though, the translation is literal:
MOV = 0000 1010 (instruction 10)
A = 0000 0001 (register 1)
B = 0000 0010 (register 2)

so MOV A,B literally gets compiled to 0x0A, 0x01, 0x02

But (and here is the clever bit) when you read the programming manual for a chip, those instructions (MOV) and register locations are given in hex.

so instead of writing, MOV A,B

you can refer to the manual and just directly write,
0x0A
0x01
0x02

or you could write that out in binary as

00001010
00000001
00000010

So the long and the short of it is:
Yes, you can program in binary. Not only is that possible, it used to be the ONLY way to program, where instructions were literally entered with a bank of 8 switches and a GO button. You'd set the switches according to line 1 of the program, then press GO; then set the switches to line 2 and press GO; then set the switches to line 3 and press GO.

This is still possible with some very small chips (though most chips now load their program via a serial bus, so parallel programming like this is all but dead!).


Not quite binary, but programming a Z80 in machine code (hand-writing bytes like 0x0A, 0x01, 0x02) is, or at least was, on the A-level (16-18 year old) electronics syllabus in the UK. I was given a question paper and a programming manual; you had to hand-write programs in hex under exam conditions.

Writing a signal generator program in hex was part of the first year of my electronics degree programme...


Even though it's highly unlikely that you'll ever need to program directly in hex or binary (though people still do for small embedded controllers), that doesn't mean it's not worth learning. It's quite useful to be able to count in binary (and hex!), if only so you understand where you need to use an int, a small int, etc. Many programs (especially encryption-based ones) use binary operations to change data: stuff like bit shifting and performing XOR operations.


so, to answer your question:
Is binary a programming language? No.
Do programmers use it? Yes.

Weirdly, even though binary is not a programming language, you can program in binary (where "program" means to physically sit at a box of switches entering code into a device!). It's not the same as programming in Java or C, etc.



My advice: if you are hazy on the concepts of numbering in computer systems, then try to get as much committed to memory as you can. That way you know the grass-roots stuff and can devote more time to understanding the more advanced stuff when required.
 
Yes Root, I did some programming of an HP 3000 minicomputer in a college class back in 1976 or so using only the front panel switches. However, the front panel switches represented octal, not binary, so we had to convert the instructions into octal in order to set the switches. Once the program was completely entered, you hit the "Run" button. The front panel lights would flicker for a few seconds and then stop; if the lights then showed a result of 0 (a return code of zero), the program ran successfully, otherwise a non-zero return code meant your program had failed.

After that class, I couldn't understand why people were so awestruck by computers since I thought it was way too much work to set the switches for each instruction and then run the program only to attempt to get a return code of zero. It was another 8 years before I used a PC with a keyboard for input and a monitor for output where I could finally see the value of using a computer. In between, I took a programming class where we used punched cards for input, another method that I felt was way too complicated to be worthwhile.
 
That's not really too different from using hex as an intermediate step.

In hex, the digits 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, a, b, c, d, e, f represent the values 0-15, i.e. a four-bit binary number.
In octal, the digits 0, 1, 2, 3, 4, 5, 6, 7 represent the values 0-7, i.e. three binary bits.

It's still sort of programming in binary, in so far as you can "see" electrical connections being made. And yes, a hell of a journey to see a light blink, or to add two small numbers together that you could do in your head!

I guess that the answer is still the same, (you can do something, but ordinary people would choose not to!)...

HP3000 is a little before my time!!
 
One thing I haven't seen mentioned (which is actually where I've had to drop down to raw 1s and 0s most) is reverse engineering of comms protocols (particularly serial protocols).

I've spent the best part of the last week in my day job reverse engineering an IR protocol used on laser tag guns... turned out to be a form of RS232 over IR, but with the start and stop bits backwards (don't get me started on the stupidity of that last bit!) That required an oscilloscope, logic analyser and many hours of staring at highs and lows on the scope (and translating them into 1's and 0's accordingly) before we figured out what was going on.

Now that's figured out, we're down to analysing the protocol at a higher, packet-based level, and the same applies: you still realistically have to work with the data at either a binary or a hex level to work out which bits change based on different parameters.

Before that I was doing a similar task on a circuit board designed to drive ultrasonic rangefinders - same story.

Before that I was doing the same thing on an atomic clock receiver with a UART (drivers were windows 3.1 only, not very useful today but the receiver itself works great!) Again, same story with dropping back to raw binary to work out the protocol.
 