György Kőszeg

Reputation: 18013

Should this unsafe code also work in .NET Core 3?

I'm refactoring my libraries to use Span<T> to avoid heap allocations where possible, but since I also target older frameworks, I'm implementing some general fallback solutions as well. Now I've found a weird issue, and I'm not quite sure whether I've hit a bug in .NET Core 3 or whether I'm doing something illegal.

The issue:

// This returns 1 as expected but cannot be used in older frameworks:
private static uint ReinterpretNew()
{
    Span<byte> bytes = stackalloc byte[4];
    bytes[0] = 1; // FillBytes(bytes);

    // returning bytes as uint:
    return Unsafe.As<byte, uint>(ref bytes.GetPinnableReference());
}

// This returns garbage in .NET Core 3.0 with release build:
private static unsafe uint ReinterpretOld()
{
    byte* bytes = stackalloc byte[4];
    bytes[0] = 1; // FillBytes(bytes);

    // returning bytes as uint:
    return *(uint*)bytes;
}

Interestingly enough, ReinterpretOld works well in .NET Framework and in .NET Core 2.0 (so I could be happy with it after all), but it still bothers me a bit.

By the way, ReinterpretOld can also be fixed in .NET Core 3.0 with a small modification:

//return *(uint*)bytes;
uint* asUint = (uint*)bytes;
return *asUint;
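
As a side note: the Span-based version can also be written without Unsafe, via MemoryMarshal from System.Runtime.InteropServices (available through the System.Memory package on older frameworks). A minimal sketch; the method name is just illustrative:

private static uint ReinterpretViaMarshal()
{
    Span<byte> bytes = stackalloc byte[4];
    bytes[0] = 1; // FillBytes(bytes);

    // Reinterprets the first 4 bytes of the span as a uint
    // (throws ArgumentOutOfRangeException if the span is too short):
    return MemoryMarshal.Read<uint>(bytes);
}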

My Question:

Is this a bug, or does ReinterpretOld work in older frameworks only by accident, and should I apply the fix for them as well?

Upvotes: 43

Views: 2413

Answers (1)

Marc Gravell

Reputation: 1062955

Ooh, this is a fun find; what is happening here is that your local is getting optimized away - there are no locals remaining, which means that there is no .locals init, which means that stackalloc behaves differently and does not wipe the space:

private static unsafe uint Reinterpret1()
{
    byte* bytes = stackalloc byte[4];
    bytes[0] = 1;

    return *(uint*)bytes;
}

private static unsafe uint Reinterpret2()
{
    byte* bytes = stackalloc byte[4];
    bytes[0] = 1;

    uint* asUint = (uint*)bytes;
    return *asUint;
}

becomes:

.method private hidebysig static uint32 Reinterpret1() cil managed
{
    .maxstack 8
    L_0000: ldc.i4.4 
    L_0001: conv.u 
    L_0002: localloc 
    L_0004: dup 
    L_0005: ldc.i4.1 
    L_0006: stind.i1 
    L_0007: ldind.u4 
    L_0008: ret 
}

.method private hidebysig static uint32 Reinterpret2() cil managed
{
    .maxstack 3
    .locals init (
        [0] uint32* numPtr)
    L_0000: ldc.i4.4 
    L_0001: conv.u 
    L_0002: localloc 
    L_0004: dup 
    L_0005: ldc.i4.1 
    L_0006: stind.i1 
    L_0007: stloc.0 
    L_0008: ldloc.0 
    L_0009: ldind.u4 
    L_000a: ret 
}

I think I'd be happy to say that this is a compiler bug, or at least an undesirable side-effect, given that previous decisions were put in place to say "emit the .locals init" specifically to try to keep stackalloc sane - but whether the compiler folks agree is up to them.

The workaround is: treat the stackalloc space as undefined (which, to be fair, is what you're meant to do); if you expect it to be zeros, zero it manually.
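
For example, a minimal sketch of that workaround (the method name is just illustrative):

private static unsafe uint ReinterpretZeroed()
{
    byte* bytes = stackalloc byte[4];

    // The stackalloc'd memory must be treated as undefined,
    // so wipe it explicitly before partially filling it:
    new Span<byte>(bytes, 4).Clear(); // or: Unsafe.InitBlock(bytes, 0, 4);

    bytes[0] = 1; // FillBytes(bytes);
    return *(uint*)bytes;
}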

Upvotes: 36
