(Last Mod: 27 November 2010 21:38:40 )




The 45 operators in Standard C can be grouped a number of different ways, depending on the purpose behind creating the grouping. Here we choose to group them according to the function that is performed by the operation. In doing so, we discover that most of the operators fall into one of five well-defined groups: the Arithmetic, Relational, Logical, Bitwise, and Indirection operators.

The remaining six operators will be collectively called simply the Miscellaneous Operators.

Aside from knowing what each operator does, it is also important to know the order in which the operations are performed. Just like the everyday arithmetic rules we are already accustomed to, every operator in a programming language has a well-defined precedence and associativity.

Operators by Functional Group

Function       Operation       Symbol(s)
-------------  --------------  ----------
Arithmetic     Unary Plus      +
               Unary Minus     -
               Multiplication  *   *=
               Division        /   /=
               Modulus         %   %=
               Addition        +   +=  ++
               Subtraction     -   -=  --
Relational     LT              <
               GE              >=
               GT              >
               LE              <=
               EQ              ==
               NE              !=
Logical        NOT             !
               AND             &&
               OR              ||
Bitwise        NOT             ~
               AND             &   &=
               OR              |   |=
               XOR             ^   ^=
               L. Shift        <<  <<=
               R. Shift        >>  >>=
Indirection    Address         &
               Dereference     *
               Array Index     [ ]
               Structure       .   ->
Miscellaneous  Grouping        ( )
               Assignment      =
               Type Cast       ( type )
               Size in Bytes   sizeof
               Conditional     ?  :
               Comma           ,

Order of Operations

Order of Operations refers to the order in which operators are to be executed within an expression. From a computer programming standpoint, it is critically important because when we write an expression in a computer program, the compiler is going to evaluate that expression according to its rules for the Order of Operations. We need to understand those rules because we obviously must write our expressions such that, when evaluated according to those rules, they produce the result we intended.

People tend to equate the phrase "Order of Operations" with the phrase "Precedence of Operators". As we shall see shortly, these are different things. An operator's precedence is only one factor in determining the order in which the operations in an expression are carried out. The other factor is the "Direction of Associativity". Most people are not comfortable with this last concept, even though they have been using it since grade school. So it is worth taking a moment to consider what each of these concepts means.

Precedence of Operators

If we were asked to recite the rules we were taught in grade school for performing arithmetic involving addition, subtraction, multiplication, and division, we would probably regurgitate something like, "Multiplication and division come before addition and subtraction."

This works okay if we are given an expression like:

y = m * x + b

We would perform these operations as follows:

y = (m * x) + b

Similarly, if we had:

z = g / y - k * r

We would perform these as follows:

z = (g / y) - (k * r)

Before proceeding, let's make a note of something that seems to cause confusion later on - we can't always say that operator A has precedence over operator B. It is entirely possible that both operators have the same precedence. This is nothing new to us - multiplication and division have the same precedence as do addition and subtraction. In the Standard C Language, there are 45 operators but only twelve precedence levels. Seven of those levels only contain a single operator while one level has eleven operators and another has ten.

Direction of Associativity

In the above examples (because they were carefully chosen) knowing just the Precedence of Operators was sufficient to determine the complete Order of Operations for those examples. But what if that second example had been changed just slightly to:

z = g - y / k * r

Without violating the Precedence of Operators we could group this as either:

z = g - ( (y / k) * r )

or

z = g - ( y / ( k * r ) )

Clearly, we could obtain significantly different results. The problem is that the Precedence of Operators only tells us part of the story. If two operators have different levels of precedence, then we know which to perform before the other. But if two operators have the same level of precedence, we need further guidance on how to proceed. Notice that the fact that different operators can have the same precedence is not the cause of this problem - we would have the same issue in the above example if the multiplication operator were replaced by a second division operator.

If pushed for a more complete statement of the rules that we learned, we would probably say something like, "Multiplication and division operations are performed left to right and then all addition and subtraction operations are performed, also left to right." The more precise way of stating this is that both levels of precedence being discussed are "left associative". What does this phrase mean? Where did it come from?

Recall from grade school that some operators are commutative and some are associative.

For instance, addition is commutative meaning that:

x + y = y + x

Addition is also associative, meaning that:

(x + y) + z = x + (y + z)

Commutativity is not an issue here since it involves rearranging the order of the terms for a given operator and not the order in which multiple operators are executed. Although it is good for us to understand which operations are commutative and which are not from the standpoint of writing good, efficient code, it is not an issue in making sure that our expressions are evaluated properly by the compiler.

But associativity is an issue since it deals directly with the order in which operators are executed. The compiler has no choice but to choose an order, so we have to know what that order is because, unlike addition and multiplication, not all operators are associative. The left side of the above equation shows an evaluation being performed according to "left associativity" while the right side is evaluated according to "right associativity". Left associativity merely means that the operators are executed left to right while right associativity means they are executed right to left. It really is that straightforward.

Also notice that saying that an operator is left-associative or right-associative is completely separate from saying whether or not the operator itself is associative. Left and right associativity are rules telling us what order to perform the operations in. Whether or not an operator is associative tells us whether or not the order matters since saying that an operator is associative is really saying that it obeys the "associative property" meaning that you will get the same result regardless of whether you use left-associativity or right-associativity when choosing the order in which to evaluate it.

What may not be obvious at this point is that the direction of associativity is really not a property of a given operator but, instead, of the precedence level that the operator belongs to. It would become hopelessly confusing if multiplication were left associative while division was right associative - all operators at the same level of precedence must have the same direction of associativity. In C++ speak, we would say that an operator "inherits" the direction of associativity of the precedence level it is a member of.

Abbreviated Assignment Operators

It turns out that in computer programming it is extremely common to read the present value stored in a memory location, modify that value by combining it with another value via some operator, and then write the new value back into the same memory location that the original value was read from. This is so common that this type of procedure is widely referred to as a "Read-Modify-Write" operation. If implemented cleanly and efficiently, these operations can significantly improve the performance of a piece of software.

Many of the operators in C allow us to use a shorthand form for read-modify-write operations which is very useful. These shorthand forms are known as abbreviated assignment operators and, in the early days of compilers, their use could lead to significantly faster code. The reason is that, whenever it came across an abbreviated operator, the compiler put in the effort to take advantage of the processor's built-in support for these operations. As compilers have evolved - based in no small part on the modern machines they run on being fast enough to permit spending a lot of compute cycles to compile a program - these performance differences have largely disappeared. While they still cannot compete at the top end with a highly skilled programmer working in Assembly Language - and probably never will - modern optimizing compilers have closed the gap considerably and frequently produce faster code than assembly level programmers of modest abilities generally achieve.

Still, the use of abbreviated operators is generally useful because they highlight the read-modify-write nature of the operation for our own benefit. Even if we don't think of them this way, we still tend to recognize the qualitative difference between updating the value of a variable versus just assigning it a brand new value. The use of abbreviated operators therefore provides us with visual, subconscious cues that, given a little bit of exposure and experience with them, allow us to comprehend both our own code and code written by others more quickly and deeply.

For those operators that have an abbreviated version, whenever we can write an expression in the form of:

(object) = (object) (operator) (expression);

We can instead write it in the form:

(object) (operator)= (expression);

These are equivalent forms.

Some examples to illustrate this are:

x += 3;      /* The same as x = x + 3;       */

y /= y + 3;  /* The same as y = y / (y + 3); */

Increment/Decrement Operators

There is a special case of read-modify-write operations, namely when we want to change the value stored in a memory location by a value of exactly one - either up or down - that is so common that many processors have special instructions to perform them extremely quickly.

If we add one we are said to be incrementing it while if we subtract one we are decrementing it. Like the abbreviated assignment operators, C provides a shorthand notation for incrementing and decrementing. Also like the abbreviated assignment operators, the performance benefits obtained with modern optimizing compilers are minimal to non-existent but, like the other operators, they still give the humans reading the code significant cues to grasp the logic behind the code more quickly.

There are four such operators:

object++

object--

++object

--object

The first two are the post-increment and post-decrement operators while the last two are the pre-increment and pre-decrement operators.

All four operators change the value stored in object by one (up or down as appropriate). But this is the side effect and not the value that the expression yields. The pre- operators yield the new value that will be stored in object while the post- operators yield the original value that was stored in object. When they are used in an isolated statement, which they frequently are, there is absolutely no difference between the pre- and post- versions because they have the same side effect - which is all we are interested in - and we discard the expression results.

An interesting side note is that, commensurate with the low sense of humor common to so many programmers, when a new language based heavily on the C language was developed it was quipped that this new language would take C programming to the next level. It was therefore almost inevitable that the new language would be called C++.