c++ - On what architectures is calculating invalid pointers unsafe?


int* a = new int[5] - 1;

This line invokes undefined behavior according to the C++ standard, because it computes an invalid pointer rather than a one-past-the-end pointer. At the same time it is a zero-overhead way of making a 1-based array (the first element is a[1]), which I need for a project of mine.
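For concreteness, here is a sketch of the intended usage; this is, of course, exactly the construct whose validity is being questioned:

    int* a = new int[5] - 1;   // undefined behavior: 'a' points before the allocation

    a[1] = 10;                 // intended to refer to the first allocated element
    a[5] = 50;                 // intended to refer to the last allocated element

    delete[] (a + 1);          // the original pointer must be restored before delete[]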

I'm wondering whether I need to avoid it, or whether the C++ standard is just being conservative in order to support bizarre architectures my code is never going to run on anyway. So the question is: on what architectures is this a problem, and are any of them widespread?

Edit: To see that the line above does indeed invoke undefined behavior, take a look at this question.

Edit: Dennis Zickefoose points out that compilers are allowed to do anything when undefined behavior is invoked, so both the compiler and the CPU have to offer guarantees beyond the C++ standard for this code to work. I'm expanding the question to whether modern C++ compilers have an issue with it.

The hardware for doing such checks is present in x86 processors, but it is not being used at the moment by popular operating systems.

If you use a segmented memory architecture, as 16-bit systems did, an allocation is not unlikely to return an address of the form segment:0. In that case you simply cannot subtract anything from the address!

Here is a starting point for reading about segmented memory and why loading an invalid segment is not possible:

http://en.wikipedia.org/wiki/segment_descriptor

You have to decide whether this is unlikely to happen for your code, or whether you can instead define an overloaded operator[] that handles the offset for you.
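For example, a minimal sketch of such a wrapper (the class name OneBasedArray and its members are purely illustrative, not from any existing library):

    #include <cstddef>
    #include <vector>

    // 1-based array: operator[] shifts the index down by one,
    // so no invalid pointer is ever formed.
    template <typename T>
    class OneBasedArray {
    public:
        explicit OneBasedArray(std::size_t n) : data_(n) {}

        T&       operator[](std::size_t i)       { return data_[i - 1]; }
        const T& operator[](std::size_t i) const { return data_[i - 1]; }

        std::size_t size() const { return data_.size(); }

    private:
        std::vector<T> data_;
    };

    int main() {
        OneBasedArray<int> a(5);
        for (std::size_t i = 1; i <= a.size(); ++i)
            a[i] = static_cast<int>(i);   // valid indices are 1 .. 5
        return 0;
    }

The extra subtraction typically compiles down to a constant offset in the address calculation, so the overhead should be negligible compared with the pointer trick.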

