Compiler Behavior with new and a Negative Array Size

#include <iostream>

using namespace std;

int main(void){
    int size = -2;
    int* p = new int[size];
    cout<<p<<endl;
    return 0;
}


The above code compiles without any problem on Visual Studio 2010.

But

#include <iostream>

using namespace std;

int main(void){
    const int size = -2;
    int* p = new int[size];
    cout<<p<<endl;
    return 0;
}


This code (with the const keyword added) gives a compile error: array size cannot be negative.

Why do the two give different results?

+3


4 answers


By making the size a constant expression, you have given the compiler the ability to diagnose the problem at compile time. Since you declared it const, the compiler can easily figure out that the value is bound to be negative by the time you pass it to new.

With a non-constant variable you have the same problem, but it takes more knowledge on the part of the compiler to detect it. In particular, the compiler has to detect that the value does not change between the point where it is initialized and the point where it is passed to new. With optimization enabled, chances are that at least some compilers can still detect the problem, because for optimization purposes they do data-flow analysis, so they can "understand" that in this case the value stays constant even though it is not declared const.
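
To make the run-time side of this concrete, here is a minimal sketch (my own addition, not part of the answer above) that guards a non-constant size before it ever reaches new[], since the compiler cannot be relied on to diagnose it:

#include <iostream>

using namespace std;

int main() {
    int size = -2;   // not const: the compiler cannot prove at compile time that this is negative

    // Runtime guard: since no compile-time diagnostic is possible here,
    // check the value ourselves before it ever reaches new[].
    if (size < 0) {
        cerr << "refusing to allocate an array of negative size" << endl;
        return 1;
    }

    int* p = new int[size];
    cout << p << endl;
    delete[] p;
    return 0;
}

With the guard in place, the question of what new int[-2] would do never arises.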

+4




Because const int size = -2; can be substituted at compile time, whereas a non-const value cannot, the compiler can tell that size is negative and refuses the allocation.

There is no way for the compiler to decide whether int* p = new int[size]; is legal or not when size is not const: size could be modified elsewhere in the program, or by another thread. A const cannot.

Either way, you run into undefined behavior when you execute the first sample; it simply makes no sense.
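
To illustrate why the compiler cannot simply trust the initial value of a non-const size, here is a hypothetical sketch (the adjust function is my own invention, not from the answer) in which the value changes between initialization and the allocation:

#include <iostream>

using namespace std;

// Hypothetical helper: once code like this can modify the variable,
// the compiler cannot assume 'size' still holds its initializer at the allocation.
void adjust(int& n) {
    n = 8;   // turns the invalid size into a valid one at run time
}

int main() {
    int size = -2;           // starts out negative...
    adjust(size);            // ...but is changed before it reaches new[]

    int* p = new int[size];  // fine at run time: size is 8 here
    cout << p << endl;
    delete[] p;
    return 0;
}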

+2




In the first case, size is a variable whose value happens to be -2, but the compiler does not track it as such (at least not for diagnostic purposes; I am sure the optimization phases can track it). Execution will be problematic (I do not know whether an exception is guaranteed or whether it is just undefined behavior).

In the second, size is a constant, so its value is known and checked at compile time.
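
For what it is worth, on a C++11 or later compiler the run-time case is specified: a negative size makes the new-expression throw std::bad_array_new_length (under C++03, which is what Visual Studio 2010 largely implements, it is undefined behavior). A minimal sketch, assuming a C++11-conforming compiler:

#include <iostream>
#include <new>      // std::bad_array_new_length (C++11)

using namespace std;

int main() {
    int size = -2;
    try {
        int* p = new int[size];   // C++11: throws std::bad_array_new_length for a negative size
        cout << p << endl;
        delete[] p;
    } catch (const bad_array_new_length& e) {
        cerr << "bad array length: " << e.what() << endl;
    }
    return 0;
}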

+1




The const version gives a compile-time error because the compiler can detect the invalid array size at compile time. The non-const version gives undefined behavior [1] because the compiler cannot detect the invalid size at compile time.

Rationale:
Once you declare the variable const, the compiler knows that the value of size must not change at any point during program execution. It can therefore apply its optimizations and simply replace every use of the integral const size with its constant value -2 at compile time. In doing so it notices that the requirement in [1] is violated and reports an error.

In the non-const case the compiler does not apply the optimization mentioned above, because it cannot be sure that size never changes during program execution. So it cannot detect the problem at compile time at all; instead you end up with undefined behavior at run time.

[1] Reference:
C++ standard, 5.3.4 new [expr.new]:

noptr-new-declarator:
        [ expression ] attribute-specifier-seq(opt)
        noptr-new-declarator [ constant-expression ] attribute-specifier-seq(opt)


The constant-expression requirement is explained further in 5.3.4/6 and 5.3.4/7:

Every constant-expression in a noptr-new-declarator shall be an integral constant expression (5.19) and evaluate to a strictly positive value. The expression in a direct-new-declarator shall have integral or enumeration type (3.9.1) with a non-negative value. [Example: if n is an int, then new float[n][5] is well-formed (because n is the expression of a direct-new-declarator), but new float[5][n] is ill-formed (because n is not a constant expression). If n is negative, the effect of new float[n][5] is undefined. ]
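
A short sketch of the distinction the quoted paragraph draws (my own illustration, not part of the quote): the run-time value may appear only as the outermost dimension, while every inner dimension must be a constant expression.

#include <iostream>

using namespace std;

int main() {
    int n = 4;                          // a run-time value

    float (*a)[5] = new float[n][5];    // well-formed: n is the outermost, non-constant dimension

    // The following would be ill-formed, because an inner dimension must be a constant expression:
    // new float[5][n];

    cout << a << endl;
    delete[] a;
    return 0;
}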

+1

