Timwi

Reputation: 66614

How to convert a virtual-key code to a character according to the current keyboard layout?

I have browsed several earlier questions about this, and the best answer I found so far is something like this:

(char) WinAPI.MapVirtualKey((uint) Keys.A, 2)

However, this doesn't work in two ways: it always returns the unshifted base character for the key, and it provides no way to take the state of modifier keys such as Shift or AltGr into account.

What is the correct solution for this?

Upvotes: 24

Views: 25297

Answers (4)

Muhirwa Thierry

Reputation: 21

I ran into this problem in WinUI 3 as well. I tried the approach from Timwi's answer, but because I wanted to use it with a control that accepts keyboard input, I needed the actual keyboard state rather than a synthesized one. To achieve this, I added another P/Invoke declaration that reads the keyboard state directly.

[DllImport("user32.dll", SetLastError = true)]
public static extern int ToUnicode(uint virtualKeyCode, uint scanCode,
    byte[] keyboardState,
    [Out, MarshalAs(UnmanagedType.LPWStr, SizeConst = 64)] StringBuilder receivingBuffer,
    int bufferSize, uint flags);

[DllImport("user32.dll", SetLastError = true)]
public static extern bool GetKeyboardState([Out] byte[] receivingBuffer);

// Translates a virtual key to the character(s) it produces under the
// *current* keyboard state. The two declarations above live in a
// KeyboardHelpers class; requires System.Text and
// System.Runtime.InteropServices.
public static string GetCharsFromKeys(VirtualKey key)
{
    StringBuilder buffer = new(256);
    byte[] keyboardState = new byte[256];   // one state byte per virtual key
    KeyboardHelpers.GetKeyboardState(keyboardState);

    _ = KeyboardHelpers.ToUnicode((uint)key, 0, keyboardState, buffer, 256, 0);
    return buffer.ToString();
}

If you need more information about the GetKeyboardState function, see the documentation: https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-getkeyboardstate

Upvotes: 1

Alireza Ghahremanian

Reputation: 37

    [DllImport("user32.dll")]
    static extern int MapVirtualKey(int uCode, uint uMapType);

    private void textBox1_OnKeyDown(object sender, KeyEventArgs e)
    {
        // Use e.KeyCode, not e.KeyData: KeyData has modifier flags ORed in,
        // which would corrupt the virtual-key code passed to MapVirtualKey.
        char c = (char)MapVirtualKey((int)e.KeyCode, 2);  // 2 = MAPVK_VK_TO_CHAR
        if (char.IsNumber(c)) DoSomething();
        else DoNothing();
    }

Upvotes: 0

Timwi

Reputation: 66614

The correct solution is the ToUnicode WinAPI function:

[DllImport("user32.dll")]
public static extern int ToUnicode(uint virtualKeyCode, uint scanCode,
    byte[] keyboardState,
    [Out, MarshalAs(UnmanagedType.LPWStr, SizeConst = 64)]
    StringBuilder receivingBuffer,
    int bufferSize, uint flags);

One way to wrap this into a sensible, convenient method would be:

static string GetCharsFromKeys(Keys keys, bool shift, bool altGr)
{
    var buf = new StringBuilder(256);
    var keyboardState = new byte[256];
    if (shift)
        keyboardState[(int) Keys.ShiftKey] = 0xff;
    if (altGr)
    {
        keyboardState[(int) Keys.ControlKey] = 0xff;
        keyboardState[(int) Keys.Menu] = 0xff;
    }
    WinAPI.ToUnicode((uint) keys, 0, keyboardState, buf, 256, 0);
    return buf.ToString();
}

Now we can retrieve characters and actually get the expected results:

Console.WriteLine(GetCharsFromKeys(Keys.E, false, false));    // prints e
Console.WriteLine(GetCharsFromKeys(Keys.E, true, false));     // prints E

// Assuming British keyboard layout:
Console.WriteLine(GetCharsFromKeys(Keys.E, false, true));     // prints é
Console.WriteLine(GetCharsFromKeys(Keys.E, true, true));      // prints É

It is also possible to use ToUnicodeEx to retrieve the characters for a keyboard layout that is not the currently active one. The signature is the same except for one extra parameter, the input locale ID, which can be retrieved using the LoadKeyboardLayout function.
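A minimal sketch of that variant might look like the following. The ToUnicodeEx and LoadKeyboardLayout signatures mirror the declarations above with the extra HKL parameter; the KLID string "00000809" (UK English) and the wrapper name GetCharsFromKeysEx are illustrative choices, not part of the original answer:

```csharp
[DllImport("user32.dll")]
public static extern int ToUnicodeEx(uint virtualKeyCode, uint scanCode,
    byte[] keyboardState,
    [Out, MarshalAs(UnmanagedType.LPWStr, SizeConst = 64)]
    StringBuilder receivingBuffer,
    int bufferSize, uint flags, IntPtr keyboardLayout);

[DllImport("user32.dll")]
public static extern IntPtr LoadKeyboardLayout(string pwszKLID, uint flags);

static string GetCharsFromKeysEx(Keys keys, bool shift, bool altGr, IntPtr layout)
{
    var buf = new StringBuilder(256);
    var keyboardState = new byte[256];
    if (shift)
        keyboardState[(int) Keys.ShiftKey] = 0xff;
    if (altGr)
    {
        // AltGr is reported as Ctrl+Alt
        keyboardState[(int) Keys.ControlKey] = 0xff;
        keyboardState[(int) Keys.Menu] = 0xff;
    }
    ToUnicodeEx((uint) keys, 0, keyboardState, buf, 256, 0, layout);
    return buf.ToString();
}

// Query the UK English layout regardless of the active one:
// var hkl = LoadKeyboardLayout("00000809", 0);
// Console.WriteLine(GetCharsFromKeysEx(Keys.E, false, true, hkl));
```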

Upvotes: 34

Binus

Reputation: 1075

I think this can be achieved using this method:

[DllImportAttribute("User32.dll")]
public static extern int ToAscii(int uVirtKey, int uScanCode, byte[] lpbKeyState,
        byte[] lpChar, int uFlags);

The example usage can be found here: http://www.pcreview.co.uk/forums/toascii-function-t1706394.html
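In case that link is unavailable, a hedged sketch of how the call might be wrapped follows. It assumes a GetKeyboardState declaration like the one in the other answers; the helper name CharFromKey is illustrative:

```csharp
static char? CharFromKey(int virtualKey)
{
    var state = new byte[256];
    GetKeyboardState(state);        // user32 P/Invoke, as declared above

    var output = new byte[2];       // ToAscii writes up to two ANSI bytes
    int count = ToAscii(virtualKey, 0, state, output, 0);

    // 1 = one character produced; 0 = none; 2 = dead key combination
    return count == 1 ? (char)output[0] : (char?)null;
}
```

Note that ToAscii produces ANSI output; ToUnicode (see the accepted answer) is generally preferred on modern systems.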

Upvotes: 1
