hockeyman

Reputation: 1183

Dynamically detect required smart card protocol (T0 or T1) and change to it in C#

Situation:

I have a C# app which acts as a communicator with a smart card for authentication purposes.

Basically, the smart card authentication happens on the server, and the server simply sends APDU packets to the application running on the client computer. This application, in turn, transmits those APDU packets to the card reader through the Winscard API (SCardTransmit) and returns the response to the server.

The server sends the app a raw byte array (which is actually a full APDU command under the hood), and the app simply passes that byte array to the SCardTransmit function.
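
For context, under the hood this forwarding step is a single Winscard call. A minimal sketch of it using raw winscard.dll P/Invoke (GemCard wraps the same call; the struct layout follows winscard.h, and the class/method names here are illustrative):

    using System;
    using System.Runtime.InteropServices;

    static class ApduForwarder
    {
        // Matches the native SCARD_IO_REQUEST header from winscard.h.
        [StructLayout(LayoutKind.Sequential)]
        struct SCARD_IO_REQUEST
        {
            public uint dwProtocol;   // SCARD_PROTOCOL_T0 = 1, SCARD_PROTOCOL_T1 = 2
            public uint cbPciLength;  // size of this structure
        }

        [DllImport("winscard.dll")]
        static extern int SCardTransmit(
            IntPtr hCard,
            ref SCARD_IO_REQUEST pioSendPci,
            byte[] pbSendBuffer, int cbSendLength,
            IntPtr pioRecvPci,
            byte[] pbRecvBuffer, ref int pcbRecvLength);

        // Sends the server-supplied APDU bytes as-is and returns the card's response.
        public static byte[] Forward(IntPtr hCard, uint activeProtocol, byte[] apdu)
        {
            var sendPci = new SCARD_IO_REQUEST
            {
                dwProtocol = activeProtocol,
                cbPciLength = (uint)Marshal.SizeOf(typeof(SCARD_IO_REQUEST))
            };
            var recv = new byte[258];  // max short response: 256 data bytes + SW1SW2
            int recvLen = recv.Length;
            int rc = SCardTransmit(hCard, ref sendPci, apdu, apdu.Length,
                                   IntPtr.Zero, recv, ref recvLen);
            if (rc != 0)
                throw new InvalidOperationException($"SCardTransmit failed: 0x{rc:X8}");
            Array.Resize(ref recv, recvLen);
            return recv;
        }
    }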

Everything works great when both the card and the server are using the same protocol. However, if the connected card is on the T1 protocol while the particular server uses T0, authentication eventually breaks. It works, though, if I manually change the protocol to the one the server is using.

Now the issue here is that multiple servers use this application, and each server can decide to use a different protocol (T0 or T1).

Is there a way to dynamically detect the required protocol purely from the contents of the APDU packets? If so, is there also a way, using Windows APIs, to change the protocol used by the card on the fly?

P.S. To make development easier, as a wrapper around Winscard API, I use GemCard (https://github.com/orouit/SmartcardFramework/tree/master/Smartcard_API/GemCard).
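
As far as the Windows API goes, the protocol is negotiated when the card is reset, so the usual way to change it on an already-connected handle is SCardReconnect with an unpower/reset so the stack renegotiates. A minimal sketch under that assumption (constants from winscard.h; the helper name is illustrative):

    using System;
    using System.Runtime.InteropServices;

    static class ProtocolSwitcher
    {
        const uint SCARD_SHARE_SHARED = 2;
        const uint SCARD_PROTOCOL_T0  = 1;
        const uint SCARD_PROTOCOL_T1  = 2;
        const uint SCARD_UNPOWER_CARD = 2;  // cold reset, forces renegotiation

        [DllImport("winscard.dll")]
        static extern int SCardReconnect(
            IntPtr hCard,
            uint dwShareMode,
            uint dwPreferredProtocols,
            uint dwInitialization,
            out uint pdwActiveProtocol);

        // Reconnects the existing card handle, requesting exactly one protocol,
        // and returns the protocol actually negotiated after the reset.
        public static uint SwitchTo(IntPtr hCard, uint desiredProtocol)
        {
            int rc = SCardReconnect(hCard, SCARD_SHARE_SHARED, desiredProtocol,
                                    SCARD_UNPOWER_CARD, out uint active);
            if (rc != 0)
                throw new InvalidOperationException($"SCardReconnect failed: 0x{rc:X8}");
            return active;  // equals desiredProtocol only if the card supports it
        }
    }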

What has been tried:

The authentication process actually breaks in this way: partway through the exchange, the card starts returning status word 6700 (wrong length).

After extensive googling I found out that some developers, when 6700 is received, extend the APDU buffer by one (empty) byte. I tried that, and the outcome then became 6982 (security status not satisfied).
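
For reference, the conventional T=0 host-side handling of these status words looks roughly like the sketch below (ISO 7816-3/-4 conventions; transmit is a hypothetical callback that sends one APDU and returns data plus SW1SW2, e.g. a wrapper around SCardTransmit). Blindly appending a byte only covers the case where an Le byte was missing; 61xy and 6Cxy need their own treatment:

    using System;

    static class T0Handling
    {
        public static byte[] Exchange(Func<byte[], byte[]> transmit, byte[] apdu)
        {
            var resp = transmit(apdu);
            byte sw1 = resp[resp.Length - 2], sw2 = resp[resp.Length - 1];

            if (sw1 == 0x6C)  // wrong Le: the card reports the correct one, resend with it
            {
                var retry = (byte[])apdu.Clone();
                retry[retry.Length - 1] = sw2;  // assumes the APDU ends with an Le byte
                resp = transmit(retry);
                sw1 = resp[resp.Length - 2];
                sw2 = resp[resp.Length - 1];
            }

            if (sw1 == 0x61)  // response pending: fetch it with GET RESPONSE (00 C0 00 00 xy)
            {
                resp = transmit(new byte[] { 0x00, 0xC0, 0x00, 0x00, sw2 });
            }
            return resp;
        }
    }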

Upvotes: 0

Views: 511

Answers (1)

guidot

Reputation: 5333

Something appears to be upside-down in the setup you describe.

Short re-statement of facts:

  • A card may support T=0, T=1, or both. Which protocols are supported is indicated by the ATR, which has to be evaluated by the host side to handle the card correctly (a connection sketch follows this list).
  • If both protocols are supported, T=0 is the default; switching to T=1 (as well as to higher baud rates) requires a successful PPS (Protocol and Parameter Selection).
  • T=0 (character-based) and T=1 (block-based) are no easy replacements for each other (read: they are not designed with clean abstraction in mind), since any command that provides a response simply returns it in T=1, while in T=0 the card returns 61xy as the status word and requires the host to send a GET RESPONSE with Le set to that same xy. Some software instance has to distinguish and resolve this, and it is not the responsibility of PCSC.
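
Under PC/SC, much of that evaluation can be delegated to the resource manager: if the host connects with both protocols allowed, it performs the negotiation and reports which protocol ended up active. A minimal sketch of that probe using raw winscard.dll P/Invoke (constants from winscard.h; names are illustrative):

    using System;
    using System.Runtime.InteropServices;

    static class ProtocolProbe
    {
        const uint SCARD_SCOPE_USER   = 0;
        const uint SCARD_SHARE_SHARED = 2;
        const uint SCARD_PROTOCOL_T0  = 1;
        const uint SCARD_PROTOCOL_T1  = 2;

        [DllImport("winscard.dll")]
        static extern int SCardEstablishContext(
            uint dwScope, IntPtr pvReserved1, IntPtr pvReserved2, out IntPtr phContext);

        [DllImport("winscard.dll", CharSet = CharSet.Unicode)]
        static extern int SCardConnect(
            IntPtr hContext, string szReader, uint dwShareMode,
            uint dwPreferredProtocols, out IntPtr phCard, out uint pdwActiveProtocol);

        // Connects allowing both protocols; PC/SC negotiates and reports the result.
        public static uint Probe(string readerName, out IntPtr hCard)
        {
            SCardEstablishContext(SCARD_SCOPE_USER, IntPtr.Zero, IntPtr.Zero, out IntPtr ctx);
            int rc = SCardConnect(ctx, readerName, SCARD_SHARE_SHARED,
                                  SCARD_PROTOCOL_T0 | SCARD_PROTOCOL_T1,
                                  out hCard, out uint active);
            if (rc != 0)
                throw new InvalidOperationException($"SCardConnect failed: 0x{rc:X8}");
            return active;  // 1 = T=0 active, 2 = T=1 active
        }
    }

The returned value can then be compared against what a given server expects before the APDU exchange starts.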

Upvotes: 0
