- How much memory must be set aside for a variable.
- How the data, represented as a string of 1s and 0s, is to be interpreted.
int, char, float and double are four data types defined by the C language. The size of memory set aside by the compiler for these data types differs from platform to platform. The data type qualifiers short and long override the memory-related specification made by the data type on a variable. For example, the qualifier short preceding int asks the compiler to set aside two bytes of memory on the x86/Linux platform rather than four. The signed and unsigned qualifiers override the interpretation dictated by the data type.
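Since the exact sizes vary by platform, the quickest way to check them is with sizeof. A minimal sketch; the sizes in the comments are typical x86/Linux values, not guarantees:

```c
#include <stdio.h>

int main(void)
{
    /* Sizes are platform-dependent; the comments show typical x86/Linux values. */
    printf("int       : %zu bytes\n", sizeof(int));        /* 4 */
    printf("short int : %zu bytes\n", sizeof(short int));  /* 2 */
    printf("long int  : %zu bytes\n", sizeof(long int));   /* 4 or 8 */
    printf("char      : %zu bytes\n", sizeof(char));       /* 1, by definition */
    printf("float     : %zu bytes\n", sizeof(float));      /* 4 */
    printf("double    : %zu bytes\n", sizeof(double));     /* 8 */

    /* signed/unsigned change the interpretation of the bits, not the size. */
    unsigned char uc = 255;
    signed char   sc = (signed char)uc;   /* same bit pattern, different meaning */
    printf("uc = %d, sc = %d\n", uc, sc); /* typically: uc = 255, sc = -1 */
    return 0;
}
```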
Another data type in C is enum. An enum allows a programmer to define a variable that takes a restricted range of values. Enum constants are automatically assigned integer values, starting with 0, with each successive constant incremented by 1. However, the values can be overridden to suit the programmer's requirements. Enum variables are otherwise treated like any other integer variable.
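A minimal sketch of the default numbering, an override, and integer-style use; the enum names color and status here are just illustrative:

```c
#include <stdio.h>

/* Constants are numbered automatically from 0 ... */
enum color { RED, GREEN, BLUE };          /* RED = 0, GREEN = 1, BLUE = 2 */

/* ... unless overridden; numbering then continues from the override. */
enum status { OK = 1, WARN = 10, FAIL };  /* FAIL = 11 */

int main(void)
{
    enum color c = GREEN;
    c = c + 1;                   /* treated like any other integer variable */
    printf("c = %d\n", c);       /* prints 2, i.e. BLUE */
    printf("FAIL = %d\n", FAIL); /* prints 11 */
    return 0;
}
```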
Is enum really required? Can't I do the same thing with #define, the macro? Yes, agreed! But here are a few differences.
- Enums add readability and maintainability to a program.
- If a variable takes its value from a finite and pragmatically manageable range, errors (assignment of an inappropriate value) can be more easily avoided.
- Enumeration constants can be added or deleted without bothering about their numerical values, in contrast to macros, which must be renumbered by hand.
- Enum names are resolved by the compiler at compile time, whereas #defines are substituted by the preprocessor before compilation even begins.
- An enum can be given block-local scope as programmed, but a #define has file-wide scope from the point of definition to the end of the file, as the sketch after this list shows.
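A short sketch contrasting the two scoping behaviors; the names dir, MODE_A and MODE_B are made up for illustration:

```c
#include <stdio.h>

#define MODE_A 0               /* visible from here to the end of the file */
#define MODE_B 1

int main(void)
{
    {
        /* Block-scoped enum: the constants exist only inside this block. */
        enum dir { NORTH, EAST, SOUTH, WEST };
        enum dir d = SOUTH;
        printf("d = %d\n", d); /* prints 2 */
    }
    /* NORTH, EAST, etc. are not visible here; MODE_A and MODE_B still are.
       Inserting a new constant between NORTH and EAST would renumber SOUTH
       automatically, while the macros would need manual renumbering. */
    printf("MODE_B = %d\n", MODE_B);
    return 0;
}
```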