They only understand machine language, which most people associate with binary code. But it is more than a stream of binary digits: each particular sequence of bits encodes a specific instruction for the CPU to execute. Assembly language makes these encodings visible by giving each instruction a human-readable name.
Computers use machine language, which consists of coded instructions in binary.
Most assemblers support binary, decimal, hexadecimal and octal notations.
56 in binary is 111000. Unlike the decimal number system, where we use the ten digits 0 through 9, binary uses only the digits 0 and 1.
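To make this concrete, here is a minimal Python sketch (Python is used here purely for illustration; actual assemblers mark the bases with their own prefixes or suffixes, which vary by assembler) showing the value 56 written in all four notations that most assemblers support:

```python
# The same value, 56, in the four notations most assemblers support.
# Python's 0b/0o/0x literal prefixes stand in for assembler-specific syntax.
binary      = 0b111000   # binary
decimal     = 56         # decimal
hexadecimal = 0x38       # hexadecimal
octal       = 0o70       # octal

# All four literals denote exactly the same number.
assert binary == decimal == hexadecimal == octal

# Converting decimal 56 to the other notations:
print(bin(56))   # 0b111000
print(hex(56))   # 0x38
print(oct(56))   # 0o70
```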
Computers do not understand decimal notation. All information (both instructions and data) must be converted to a binary representation before the machine can understand it. We use the symbols 0 and 1 (binary notation) but the machine has a variety of physical representations it can use to encode binary data, including transistors, flux transitions, on/off switches and so on.
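As a small illustration of data being reduced to binary, here is a Python sketch showing the bit pattern a machine stores for the character 'A' (its ASCII/Unicode code point; how those bits are physically encoded, whether as transistor charge or flux transitions, is up to the hardware):

```python
# The character 'A' has code point 65; in memory it exists only as bits.
codepoint = ord("A")              # 65
bits = format(codepoint, "08b")   # zero-padded 8-bit pattern
print(codepoint, bits)            # 65 01000001
```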
A: Humans use the decimal (base-10) number system, presumably because we have ten fingers, and we do our mathematics using those ten digits. Computers use the binary system, with its two digits "0" and "1", which the machine interprets as false and true. That is machine language: all computation is ultimately carried out on these two binary digits.
The differences between high-level languages and machine language are as follows: 1) Machine language uses binary codes, while high-level languages (HLLs) use keywords similar to English and are easier to write. 2) Machine language is a low-level language and is machine-dependent, while HLLs are not.
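To see that gap directly, here is a minimal Python sketch using the standard library's dis module. Python compiles to bytecode rather than true machine code, so this is an analogy rather than the real thing, but it shows the same kind of translation: one English-like line becomes a sequence of low-level numeric instructions:

```python
import dis

def add(a, b):
    return a + b   # one English-like, high-level line

# Disassemble to see the low-level instructions it becomes.
dis.dis(add)
# Typical CPython output (exact opcodes vary by version):
#   LOAD_FAST    a
#   LOAD_FAST    b
#   BINARY_OP    + (BINARY_ADD on older versions)
#   RETURN_VALUE
```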
You simply use more binary digits.
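If that sounds too terse: every extra bit doubles the range of values you can represent. A quick Python check of how many bits various numbers need:

```python
# Each additional binary digit doubles the representable range.
for n in (1, 56, 255, 256, 1_000_000):
    print(n, bin(n), n.bit_length(), "bits")
# 1        0b1                      1 bits
# 56       0b111000                 6 bits
# 255      0b11111111               8 bits
# 256      0b100000000              9 bits
# 1000000  0b11110100001001000000  20 bits
```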
Computers are not smart. They only know 0 and 1: binary states, or true and false. This language is known as machine language.
binary language
Two: 0 and 1.
It doesn't. The only language the computer understands is its own native machine code: binary language. We use that binary language to program the computer so that it can translate the high-level languages that we understand into the low-level language that it understands, and vice versa.