What is the definition of a macro?, Basic Computer Science

Definition of a macro
A macro is a group of repetitive instructions in a program that is written only once and can then be used as many times as required. One major difference between a macro and a procedure is that a macro can take parameters while a procedure cannot; this restriction applies to TASM, as some other programming languages do allow procedures to take parameters. When the macro is expanded, each parameter is replaced by the name or value supplied at the time of the call.
We can say, then, that a procedure is an extension of a particular program, while a macro is a module with specific functions that can be used by different programs. Another difference between a macro and a procedure is the way each is called: calling a procedure requires a directive, whereas a macro is invoked as if it were an assembler instruction.
Posted Date: 5/4/2012 8:13:21 AM | Location : United States






