Reputation: 3513
There seems to be a bug in the GetCharABCWidths() and GetGlyphOutline() functions: when the font height is changed from (-16 * i) / 72 to (-n * i) / 72 for any n below 16, the character 'a' is still displayed correctly, but the functions return FALSE and GDI_ERROR. Why?
After GetCharABCWidths() fails, GetLastError() returns 0 (ERROR_SUCCESS); after GetGlyphOutline() fails, it returns 1003 (ERROR_CAN_NOT_COMPLETE, "Cannot complete this function").
MAT2 gmat = { {0, 1}, {0, 0}, {0, 0}, {0, 1} };  // identity transform (FIXED 1.0 on the diagonal)

case WM_PAINT:
{
    PAINTSTRUCT ps;
    BeginPaint(hwnd, &ps);

    // Derive the font height from the vertical device resolution.
    int i = GetDeviceCaps(ps.hdc, LOGPIXELSY);

    LOGFONT lf;
    memset(&lf, 0, sizeof(LOGFONT));
    lf.lfHeight = (-14 * i) / 72;
    lf.lfWeight = FW_NORMAL;
    lf.lfItalic = 0;
    lf.lfCharSet = SYMBOL_CHARSET;
    wcscpy(lf.lfFaceName, L"Symbol");

    HFONT hFont = CreateFontIndirect(&lf);
    HFONT hOldFont = (HFONT)SelectObject(ps.hdc, hFont);

    ABC abc;
    TCHAR tx = 'a';
    DWORD dwx;
    if (!GetCharABCWidths(ps.hdc, tx, tx, &abc))
        dwx = GetLastError();  // returns 0 (ERROR_SUCCESS)

    GLYPHMETRICS gm;
    if (GetGlyphOutline(ps.hdc, tx, GGO_METRICS, &gm, 0, NULL, &gmat) == GDI_ERROR)
        dwx = GetLastError();  // returns 1003 (ERROR_CAN_NOT_COMPLETE)

    TextOut(ps.hdc, 10, 20, &tx, 1);

    // Restore the previous font and delete the one we created.
    SelectObject(ps.hdc, hOldFont);
    DeleteObject(hFont);

    EndPaint(hwnd, &ps);
}
Upvotes: 1
Views: 2442
Reputation: 55402
On my system, Symbol exists both as a raster font (symbole.fon) and an OpenType font (symbol.ttf), so when GDI decides to use the raster version of the font you won't be able to obtain any TrueType metrics. To fix this, set the lfOutPrecision member of your LOGFONT (the equivalent of the fdwOutputPrecision parameter of CreateFont) to something suitable, such as OUT_TT_PRECIS.
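A minimal sketch of the suggested change, reusing the font setup from the question (Windows-only fragment; assumes i already holds GetDeviceCaps(hdc, LOGPIXELSY)):

```cpp
// Ask GDI to prefer the TrueType/OpenType symbol.ttf over the raster
// symbole.fon, so that GetCharABCWidths/GetGlyphOutline can succeed.
LOGFONT lf;
memset(&lf, 0, sizeof(LOGFONT));
lf.lfHeight = (-14 * i) / 72;
lf.lfWeight = FW_NORMAL;
lf.lfCharSet = SYMBOL_CHARSET;
lf.lfOutPrecision = OUT_TT_PRECIS;  // the one-line fix
wcscpy(lf.lfFaceName, L"Symbol");
HFONT hFont = CreateFontIndirect(&lf);
```

With OUT_TT_PRECIS the font mapper favors a TrueType font when the face exists in both raster and TrueType form, which is exactly the ambiguity causing the failures here.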
Upvotes: 3