Davio

Reputation: 4737

Verify Currently Running Executable

I'm looking for the right approach to verify a currently running executable from within that executable. I've already found a way to compute a (SHA256) hash for the file that is currently running.

The problem is: Where do I safely store this hash? If I store it in a config file, a malicious user can just calculate his own hash and replace it. If I store it in the executable itself, it can be overridden with a hex editor probably.

A suggestion I read was to do an asymmetrical en- (or was it de-) cryption, but how would I go about this?
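As I understand that suggestion, it would amount to something like the sketch below: sign the SHA256 hash of the executable with a private key at build time, ship only the public key and the signature with the application, and verify at runtime. The file names and key handling here are placeholders, not working code:

using System.IO;
using System.Security.Cryptography;

public static class SignatureCheck
{
    // Recomputes the SHA256 hash of the executable and checks it against a signature
    // that was produced at build time with the matching private key.
    public static bool VerifyExecutable(string exePath, string signaturePath, string publicKeyXml)
    {
        byte[] hash;
        using (var sha256 = SHA256.Create())
        using (var stream = File.OpenRead(exePath))
        {
            hash = sha256.ComputeHash(stream);
        }

        byte[] signature = File.ReadAllBytes(signaturePath);

        using (var rsa = new RSACryptoServiceProvider())
        {
            // Only the public key ships with the application; without the private key
            // an attacker cannot produce a valid signature for a modified file.
            rsa.FromXmlString(publicKeyXml);
            return rsa.VerifyHash(hash, CryptoConfig.MapNameToOID("SHA256"), signature);
        }
    }
}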

A requirement is that the executable code hashes and en/decrypts exactly the same on different computers, otherwise I can't verify correctly. The computers will all be running the same OS which is Windows XP (Embedded).

I'm already signing all of my assemblies, but I need some added security to successfully pass our Security Target.

For those who know, it concerns FPT_TST.1.3: The TSF shall provide authorised users with the capability to verify the integrity of stored TSF executable code.

Upvotes: 4

Views: 2849

Answers (2)

CodeTherapist

Reputation: 2806

Since .NET 3.5 SP1, the runtime no longer checks strong name signatures when loading full-trust assemblies. Since your assemblies are strong-named, I suggest verifying the signature yourself in code, using the native mscoree.dll via P/Invoke.

private static class NativeMethods
{
      // fForceVerification = 1 forces the check even if verification skipping is
      // registered for the assembly; pfWasVerified reports whether it was really checked.
      // The API returns a 1-byte BOOLEAN, hence the explicit U1 marshaling.
      [DllImport("mscoree.dll")]
      [return: MarshalAs(UnmanagedType.U1)]
      public static extern bool StrongNameSignatureVerificationEx([MarshalAs(UnmanagedType.LPWStr)] string wszFilePath, byte fForceVerification, ref byte pfWasVerified);
}

Then you can use the assembly load event and check every assembly that is loaded into your (current) app domain:

AppDomain.CurrentDomain.AssemblyLoad += CurrentDomain_AssemblyLoad;

private static void CurrentDomain_AssemblyLoad(object sender, AssemblyLoadEventArgs args)
{
    Assembly loadedAssembly = args.LoadedAssembly;

    if (!VerifyStrongNameSignature(loadedAssembly))
    {
        // Do whatever you want when the signature is broken,
        // e.g. throw a SecurityException.
    }
}

private static bool VerifyStrongNameSignature(Assembly assembly)
{
     byte wasVerified = 0;

     // Pass 1 to force verification even if skipping is registered for this assembly.
     return NativeMethods.StrongNameSignatureVerificationEx(assembly.Location, 1, ref wasVerified);
}
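
One caveat: the AssemblyLoad event only fires for assemblies loaded after the handler is attached, so a sweep of what is already loaded at startup may also be needed. A minimal sketch, reusing the VerifyStrongNameSignature helper above:

// Check everything that was already loaded before the handler was attached.
foreach (Assembly assembly in AppDomain.CurrentDomain.GetAssemblies())
{
    // Dynamic and byte-array-loaded assemblies have no file on disk to verify; skip them.
    if (assembly is System.Reflection.Emit.AssemblyBuilder || string.IsNullOrEmpty(assembly.Location))
        continue;

    if (!VerifyStrongNameSignature(assembly))
    {
        // React the same way as in the AssemblyLoad handler.
    }
}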

Of course, someone with enough experience can patch the "check code" out of your assembly, or simply strip the strong name from your assembly altogether.

Upvotes: 2

Andras Zoltan

Reputation: 42363

All the comments, especially the one from Marc, are valid.

I think your best bet is to look at Authenticode signatures - that's pretty much what they're meant for. The point is that the exe or dll is signed with a certificate (stamping your organisation's information into it, much like an SSL certificate), and a modified version cannot (in theory, with all the normal security caveats) be re-signed with the same certificate by anyone who doesn't hold the private key.

Depending upon the requirement (I say this because the 'security target' is a bit woolly - the capability to verify the integrity of the code could just as easily be met by a walkthrough on how to check a file in Windows Explorer), this is either enough in itself (Windows has built-in capability to display the publisher information from the certificate) or you can write a routine to verify the Authenticode certificate programmatically.

See this SO question, Verify whether an executable is signed or not (signtool used to sign that exe); the top answer links to an (admittedly old) article about how to programmatically check the Authenticode certificate.
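
For reference, a programmatic check usually goes through the WinVerifyTrust API in wintrust.dll. The sketch below is one way to wire that up for a single file (constants and struct layout taken from the Win32 headers, trimmed to the file-verification case; it is not the code from the linked article):

using System;
using System.Runtime.InteropServices;

static class AuthenticodeCheck
{
    // WINTRUST_ACTION_GENERIC_VERIFY_V2: the standard Authenticode verification policy.
    private static readonly Guid WintrustActionGenericVerifyV2 =
        new Guid("00AAC56B-CD44-11d0-8CC2-00C04FC295EE");

    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
    private struct WINTRUST_FILE_INFO
    {
        public uint cbStruct;
        [MarshalAs(UnmanagedType.LPWStr)] public string pcwszFilePath;
        public IntPtr hFile;
        public IntPtr pgKnownSubject;
    }

    [StructLayout(LayoutKind.Sequential)]
    private struct WINTRUST_DATA
    {
        public uint cbStruct;
        public IntPtr pPolicyCallbackData;
        public IntPtr pSIPClientData;
        public uint dwUIChoice;           // 2 = WTD_UI_NONE
        public uint fdwRevocationChecks;  // 0 = WTD_REVOKE_NONE
        public uint dwUnionChoice;        // 1 = WTD_CHOICE_FILE
        public IntPtr pFile;              // points to a WINTRUST_FILE_INFO
        public uint dwStateAction;
        public IntPtr hWVTStateData;
        public IntPtr pwszURLReference;
        public uint dwProvFlags;
        public uint dwUIContext;
    }

    [DllImport("wintrust.dll", CharSet = CharSet.Unicode)]
    private static extern int WinVerifyTrust(IntPtr hwnd, ref Guid pgActionID, ref WINTRUST_DATA pWVTData);

    // Returns true only if the file carries a valid Authenticode signature
    // chaining to a trusted root; WinVerifyTrust returns 0 (ERROR_SUCCESS) in that case.
    public static bool HasValidAuthenticodeSignature(string filePath)
    {
        var fileInfo = new WINTRUST_FILE_INFO
        {
            cbStruct = (uint)Marshal.SizeOf(typeof(WINTRUST_FILE_INFO)),
            pcwszFilePath = filePath
        };

        IntPtr pFileInfo = Marshal.AllocHGlobal(Marshal.SizeOf(typeof(WINTRUST_FILE_INFO)));
        try
        {
            Marshal.StructureToPtr(fileInfo, pFileInfo, false);

            var trustData = new WINTRUST_DATA
            {
                cbStruct = (uint)Marshal.SizeOf(typeof(WINTRUST_DATA)),
                dwUIChoice = 2,          // never show UI
                fdwRevocationChecks = 0, // skip revocation checking
                dwUnionChoice = 1,       // verify a file on disk
                pFile = pFileInfo
            };

            Guid action = WintrustActionGenericVerifyV2;
            return WinVerifyTrust(IntPtr.Zero, ref action, ref trustData) == 0;
        }
        finally
        {
            Marshal.FreeHGlobal(pFileInfo);
        }
    }
}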

Update

To follow on from what Marc suggested - even this won't be enough if a self-programmatic check is required. The executable can be modified, removing the check and then deployed without the certificate. Thus killing it.

To be honest - the host application/environment really should have its own checks in place (for example, requiring a valid Authenticode certificate) - if it's so important that the code isn't modified, then the host should have its own steps for verifying that. I think you might actually be on a wild goose chase.

Just put in whatever check will take the least amount of effort on your behalf, without worrying too much about the actual security it apparently provides - because I think you're starting from an impossible point. If there is actually any genuine reason why someone would want to hack the code you've written, then it won't just be a schoolboy who tries to hack it. Therefore any solution available to you (those mentioned in the comments etc.) will be subverted easily.

Rent-a-quote final sentence explaining my 'wild goose chase' comment

Following the weakest link principle - the integrity of an executable file is only as valid as the security requirements of the host that runs that executable.

Thus, on a modern Windows machine that has UAC switched on and all security features enabled, it's quite difficult to install or run code that isn't signed, for example - the user must really want to run it. If you turn all that stuff down to zero, then it's relatively simple. On a rooted Android phone it's easy to run stuff that can kill your phone. There are many other examples of this.

So if the XP Embedded environment your code will be deployed into has no runtime security checks on what it actually runs in the first place (e.g. a policy requiring Authenticode certs for all applications), then you're starting from a point where you've inherited a lower level of security than you're actually supposed to be providing. No amount of security primitives and routines can restore that.

Upvotes: 4
