WCF Client in .NET 4.5.1: How to enable TLS 1.2 when WebRequest is used?

Our .NET WCF client (a WebRequest call), compiled to a Windows EXE and running on Windows Server 2012 R2, refuses to connect to a web server that offers ONLY TLS 1.2.

We know that Windows Server 2012 and .NET 4.5.x support TLS 1.2.

We have no problems when the server offers TLS 1.0 and up. The problem appears only when the server we connect to has DISABLED TLS 1.0, TLS 1.1, SSL 2.0, and SSL 3.0, so that it offers ONLY TLS 1.2. Chrome and Firefox (on Windows 7 and higher) connect to the server fine, with no warnings or SSL issues of any kind.

The server certificate is 100% OK.

The problem is that WebRequest fails to connect in this situation.

What do we need to set in code so that our use of WebRequest will connect to servers that may run TLS 1.2, 1.1, 1.0, and/or SSL 3.0?

Upvotes: 40

Views: 75997

Answers (3)

J.Wincewicz

Reputation: 962

Importantly, you should start with .NET Framework 4.5 at least; older versions do not support TLS 1.2. Then, while authenticating to the server, explicitly request this protocol:

    sslStream.AuthenticateAsClient(this._configuration.Host, null, SslProtocols.Tls12, true);
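For context, here is a minimal self-contained sketch of that call, assuming a hypothetical host name and the standard HTTPS port (the fields `sslStream` and `this._configuration.Host` in the snippet above come from the answerer's own code):

```csharp
using System;
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Authentication;

class SslStreamSketch
{
    static void Main()
    {
        // Hypothetical endpoint; replace with your TLS-1.2-only server.
        string host = "example.com";

        using (var client = new TcpClient(host, 443))
        using (var sslStream = new SslStream(client.GetStream()))
        {
            // Force TLS 1.2 for the handshake; the final argument
            // requests a certificate-revocation check.
            sslStream.AuthenticateAsClient(host, null, SslProtocols.Tls12, true);
            Console.WriteLine(sslStream.SslProtocol);
        }
    }
}
```

Note that this only helps if you manage the `SslStream` yourself; a plain `WebRequest` does not expose this hook, which is what the other answers address.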

Upvotes: 0

Yujie

Reputation: 452

You should work with .NET 4.5 or a later version and add this line to your code:

System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
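Put together, a minimal sketch looks like this. The setting is process-wide and must run before the first request is created; the URL is a placeholder for your TLS-1.2-only server:

```csharp
using System;
using System.Net;

class Tls12Client
{
    static void Main()
    {
        // Process-wide: all subsequent WebRequest calls will negotiate TLS 1.2.
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        // Hypothetical endpoint; replace with your own server's URL.
        var request = (HttpWebRequest)WebRequest.Create("https://example.com/");
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusCode);
        }
    }
}
```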

Upvotes: 42

While not easy to figure out, the needed property is:

System.Net.ServicePointManager.SecurityProtocol

This can be used to disable and enable TLS levels in the WCF environment.

Further, you can see what WCF is currently set to using:

Console.WriteLine(System.Net.ServicePointManager.SecurityProtocol.ToString());

With thanks to: How do I disable SSL fallback and use only TLS for outbound connections in .NET? (Poodle mitigation)
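Since `SecurityProtocol` is a flags enum, you can also combine values to accept several TLS levels at once, which matches the question's "TLS 1.2, 1.1, 1.0" case. A sketch (deliberately omitting SSL 3.0, per the POODLE mitigation linked above):

```csharp
using System;
using System.Net;

class ProtocolFlags
{
    static void Main()
    {
        // Allow TLS 1.0, 1.1, and 1.2; the client will negotiate the
        // highest version the server supports.
        ServicePointManager.SecurityProtocol =
            SecurityProtocolType.Tls |
            SecurityProtocolType.Tls11 |
            SecurityProtocolType.Tls12;

        // Inspect the current setting.
        Console.WriteLine(ServicePointManager.SecurityProtocol);
    }
}
```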

Upvotes: 38
