Reputation: 271
I am having problems measuring the height of a font which I have included via CSS, using this code:
    measureFontHeight3: function(font)
    {
        var left = 0;
        var top = 0;
        var height = 50;
        var width = 50;

        // Draw the text in the specified area
        var canvas = ig.$new('canvas');
        canvas.width = width;
        canvas.height = height;
        var ctx = canvas.getContext('2d');
        ctx.font = font;
        ctx.textBaseline = 'top';
        ctx.fillText('gM', 0, 0);

        // Get the pixel data from the canvas
        var data = ctx.getImageData(left, top, width, height).data,
            first = false,
            last = false,
            r = height,
            c = 0;

        // Find the last row with a non-transparent pixel
        while(!last && r)
        {
            r--;
            for(c = 0; c < width; c++)
            {
                if(data[r * width * 4 + c * 4 + 3])
                {
                    last = r;
                    break;
                }
            }
        }

        // Find the first row with a non-transparent pixel
        while(r)
        {
            r--;
            for(c = 0; c < width; c++)
            {
                if(data[r * width * 4 + c * 4 + 3])
                {
                    first = r;
                    break;
                }
            }

            // If we've got it then return the height
            if(first != r)
            {
                var result = last - first;
                console.log("3: " + result);
                return result;
            }
        }

        // We screwed something up... What do you expect from free code?
        return 0;
    },
When I measure a font which the system already has installed, the function is quite accurate, but when I try to measure a font which I have included via a CSS file, the measurement does not work, i.e. it gives the wrong result.
Is it because the new canvas is not able to "see" the new font, or is something else wrong?
Upvotes: 1
Views: 64
Reputation: 1261
Could it be because you want to measure the font before it has been fully loaded?
In my example it seems to be working fine: Font example
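One way to rule out the timing issue is to wait for the web font before measuring. A minimal sketch, assuming the browser supports the CSS Font Loading API, that the @font-face family in your CSS is called 'MyWebFont' (substitute your own family name), and that your measuring function is reachable in this scope as measureFontHeight3:

    // document.fonts.load() returns a Promise that resolves once the
    // requested font has been fetched and parsed, so the canvas can
    // actually render it before we scan the pixels.
    document.fonts.load("50px MyWebFont").then(function() {
        // The font is now available to the canvas context
        var h = measureFontHeight3("50px MyWebFont");
        console.log("measured height: " + h);
    }).catch(function(err) {
        console.log("font could not be loaded: " + err);
    });

If document.fonts is not available in your target browsers, the same idea applies with any other font-loading callback: only call the measuring function after the font file has finished downloading.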
Upvotes: 1