OnTheFly

Reputation: 2101

Why might SetString intrinsic cause an "Incompatible types" error on PChar argument?

Please excuse the silly question, but I'm confused. Consider the following method (sorry for the noisy comments, this is real code under development):

function HLanguages.GetISO639LangName(Index: Integer): string;
const
  MaxIso639LangName = 9;  { see msdn.microsoft.com/en-us/library/windows/desktop/dd373848 }
var
  LCData: array[0..MaxIso639LangName-1] of Char;
  Length: Integer;
begin
  { TODO : GetLocaleStr sucks, write proper implementation }
  //Result := GetLocaleStr(LocaleID[Index], LOCALE_SISO639LANGNAME, '??');
  Length := GetLocaleInfo(LocaleID[Index], LOCALE_SISO639LANGNAME, @LCData, System.Length(LCData));
  Win32Check(Length <> 0);
  SetString(Result, @LCData, Length); // "E2008 Incompatible types" here, but why?
end;

If I remove the reference operator, the implicit cast from $X+ comes to the rescue and the method compiles. Why the compiler refuses this code with the reference operator is beyond my understanding.

This is Delphi XE2 and this behaviour might be specific to it.


And if I add a test-case dummy with a prototype equivalent to the intrinsic one within the scope of HLanguages.GetISO639LangName, the error magically goes away:

procedure SetString(var s: string; buffer: PChar; len: Integer);
begin
  { test case dummy }
end;

Upvotes: 4

Views: 1471

Answers (4)

Arioch 'The

Reputation: 16045

Because @LCData is a pointer to the array, not to a Char. Sure, it sometimes happens that an array, a record, or a class starts with a char-typed member, but such coincidences are not what a statically-typed compiler should rely upon.

You have to take a pointer to a character in that array, not to the array itself.

SetString(Result, @LCData[Low(LCData)], Length); 

Upvotes: 0

David Dubois

Reputation: 3932

I know this doesn't answer the specific question regarding SetString, but I'd like to point out that you can do the same thing by simply writing

Result := LCData;

When assigning to a string, Delphi treats a static array of Char with a ZERO starting index as a null-terminated string with a maximum length. Consider the following:

var
  IndexOneArray  : array [ 1 .. 9 ] of char;
  IndexZeroArray : array [ 0 .. 8 ] of char;
  S : string;
  T : string;
begin
  IndexOneArray  := 'ABCD'#0'EFGH';
  IndexZeroArray := 'ABCD'#0'EFGH';
  S := IndexOneArray;
  T := IndexZeroArray;

  ShowMessage (      'S has ' + inttostr(length(S)) + ' chars. '
                + #13'T has ' + inttostr(length(T)) + ' chars. ' );
end;

This displays a message that S has 9 chars, while T has 4. It will also work when the zero-index array has 9 non-null characters. The result will be 9 characters regardless of what's in the following memory locations.
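Applied to the original function, the assignment approach looks like this (a sketch based on the question's own code; it relies on GetLocaleInfo null-terminating the buffer on success):

function HLanguages.GetISO639LangName(Index: Integer): string;
var
  LCData: array[0..MaxIso639LangName-1] of Char;
begin
  Win32Check(GetLocaleInfo(LocaleID[Index], LOCALE_SISO639LANGNAME,
    @LCData, Length(LCData)) <> 0);
  Result := LCData; { zero-based Char array assigns as a null-terminated string }
end;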

Upvotes: 3

Arnaud Bouchez

Reputation: 43023

You have to explicitly convert it to PChar:

SetString(result,PChar(@LCData),Length); 

As you stated, SetString() is very demanding about the type of its 2nd parameter. It must be a PChar, a PWideChar, or a PAnsiChar, depending on the string type itself.

I suspect this is due to the fact that SetString() is defined as overloaded, with either a string, a WideString, or an AnsiString as the 1st parameter. So in order to select the right signature, it needs an exact match of all parameter types:

SetString(var s: string; buf: PChar; len: integer); overload;
SetString(var s: AnsiString; buf: PAnsiChar; len: integer); overload;
SetString(var s: WideString; buf: PWideChar; len: integer); overload;

Of course, all of those are "intrinsics", so you won't find such definitions in System.pas; the compiler maps them directly to helper procedures like _LStrFromPCharLen(), _UStrFromPCharLen(), or _WStrFromPWCharLen().

This behavior is the same since early versions of Delphi, and is not a regression in XE2.

Upvotes: 5

David Heffernan

Reputation: 612794

I think there's a compiler bug in there, because the behaviour with SetString differs from the behaviour with overloaded functions that you provide yourself. What's more, there's an interaction with the Typed @ operator compiler option. I don't know how you have that set; I always enable it, but I suspect I'm in the minority there.

So I cannot explain the odd behaviour and answer the precise question you ask. I suspect the only way to answer it is to look at the internals of the compiler, and very few of us can do that.

Anyway, in case it helps, I think the cleanest way to pass the parameter is like so:

SetString(Result, LCData, Length); 

This compiles no matter what the Typed @ operator option is set to.
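Pulling the suggestions from the answers together, the call-site variants compare like this (a sketch; Len stands for the character count returned by GetLocaleInfo, and the E2008 line is as reported in the question, not verified on every compiler version):

SetString(Result, LCData, Len);          { cleanest: compiles under both $T+ and $T- }
SetString(Result, @LCData[0], Len);      { pointer to the first Char, not to the array }
SetString(Result, PChar(@LCData), Len);  { explicit cast to the expected pointer type }
//SetString(Result, @LCData, Len);       { E2008: @LCData is typed as a pointer to the array }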

Upvotes: 4
