I was looking through the following code from Tutorials Point:
section .text
    global _start            ;must be declared for using GCC
_start:                      ;tell linker entry point
    sub ah, ah
    mov al, '9'
    sub al, '3'
    aas
    or  al, 30h
    mov [res], ax

    mov edx, len             ;message length
    mov ecx, msg             ;message to write
    mov ebx, 1               ;file descriptor (stdout)
    mov eax, 4               ;system call number (sys_write)
    int 0x80                 ;call kernel

    mov edx, 1               ;message length
    mov ecx, res             ;message to write
    mov ebx, 1               ;file descriptor (stdout)
    mov eax, 4               ;system call number (sys_write)
    int 0x80                 ;call kernel

    mov eax, 1               ;system call number (sys_exit)
    int 0x80                 ;call kernel

section .data
msg db 'The Result is:', 0xa
len equ $ - msg

section .bss
res resb 1
I feel like I understand this code except for the line:
or al, 30h
I understand that or is a bitwise OR and that 30h is 0011 0000 in binary, but I don't understand why this is needed for the code to work. Can someone explain this to me?
This isn't a great example.
After the subtraction, al contains 6. The aas instruction changes nothing here: its adjustment only fires when the low nibble of al is greater than 9 or AF is set, and neither is true for 6. The high-nibble masking aas also performs is a no-op, since that nibble is already zero.
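For contrast, here is a small sketch of mine (not from the tutorial) where aas does have work to do, because the subtraction borrows:

    sub ah, ah      ; ah = 0
    mov al, '3'     ; al = 33h
    sub al, '9'     ; al = FAh (3 - 9 = -6), AF set by the borrow in the low nibble
    aas             ; AF set, so al = (al - 6) AND 0Fh = 4, ah = ah - 1, CF = 1
    or  al, 30h     ; al = '4': borrow 1, digit 4 (i.e. -6 = -10 + 4)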
The ASCII code for '0' is 30h. Or-ing this with the 6 produces 36h, which is the ASCII code for '6'. In general, this converts the binary value of a decimal digit to its ASCII code; because a digit value 0-9 occupies only the low nibble, or-ing with 30h gives the same result as adding 30h.
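In other words (a small sketch of mine, in the same NASM syntax as the question):

    mov al, 6       ; the binary value 0000 0110
    or  al, 30h     ; 0011 0000 OR 0000 0110 = 0011 0110 = 36h = '6'
                    ; add al, 30h would produce the same byte here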
Sounds like you'll benefit from studying the difference between binary values and their ASCII representations. See for example an ASCII table.
But also note, this code is (AFAICS) erroneous: mov [res], ax stores a 16-bit word into res, which reserves only a single byte, so the store writes one byte past the buffer.
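A minimal fix (a sketch, not taken from the tutorial) is to store only the low byte:

    mov [res], al   ; al holds the ASCII digit and fits the single byte reserved by res

Alternatively, reserve a word with res resw 1 and keep the 16-bit store.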