Reputation: 1161
We use Swagger to test our REST APIs. I have a JSON object that comes back with the following value:
...
"MyValue" : 243400.000000
}
However, when it is displayed through Swagger UI it shows as this:
...
"MyValue" : 243400
}
In my controller I put a breakpoint on the return statement, and I can verify in dresult that "MyValue" is 243400.000000, but the Swagger display does not reflect this. The controller code is below:
...
// JSON is presumably an alias for Newtonsoft.Json (using JSON = Newtonsoft.Json;)
var dresult = JSON.JsonConvert.DeserializeObject(result, new JSON.JsonSerializerSettings
{
    // Parse floating-point literals into System.Decimal, which preserves
    // the scale (trailing zeros) of the original text
    FloatParseHandling = JSON.FloatParseHandling.Decimal
});
return Request.CreateResponse(HttpStatusCode.OK, dresult, JsonMediaTypeFormatter.DefaultMediaType);
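For what it's worth, one hypothetical way to double-check the breakpoint observation (this code is not in my controller, just a debugging sketch) is to re-serialize dresult on the server. FloatParseHandling.Decimal parses the number into a System.Decimal, which preserves its scale, so the zeros should still be there at that point:
// Hypothetical debug check using the dresult variable above
var json = JSON.JsonConvert.SerializeObject(dresult);
System.Diagnostics.Debug.WriteLine(json); // should contain "MyValue": 243400.000000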
Could this be a Swagger configuration issue? I have not found anything yet that points to it. Any help would be appreciated.
Original Swagger config:
using System.Web.Http;
using WebActivatorEx;
using MyService.WebApi;
using Swashbuckle.Application;
using Swashbuckle.Swagger;
[assembly: PreApplicationStartMethod(typeof(SwaggerConfig), "Register")]
namespace MyService.WebApi
{
    public class SwaggerConfig
    {
        public static void Register()
        {
            var thisAssembly = typeof(SwaggerConfig).Assembly;

            GlobalConfiguration.Configuration
                .EnableSwagger(c =>
                {
                    c.SingleApiVersion("v1", "MyService.WebApi");
                    c.IncludeXmlComments(string.Format(@"{0}\bin\MyService.WebApi.XML", System.AppDomain.CurrentDomain.BaseDirectory));
                    c.UseFullTypeNameInSchemaIds();
                    c.DescribeAllEnumsAsStrings();
                })
                .EnableSwaggerUi(c =>
                {
                    c.DisableValidator();
                });
        }
    }
}
Upvotes: 0
Views: 10463
Reputation: 3892
I think you may be thinking about this the wrong way!
I can see it would be frustrating, if you care about the exact decimal representation, not to be able to see it in the tool you're using to test your API.
But...
How are you eventually going to use the JSON that is sent? Presumably you're consuming it in a JavaScript application, where you'll deserialize the API response into JavaScript objects. But in that case, your value of "MyValue": 243400.000000
is going to deserialize to a JavaScript Number. That is the root of the problem: swagger-ui, which is what Swashbuckle uses for its UI, only knows that it is a Number at this point. It is displaying what your API's JSON deserialized to, and 243400.000000 and 243400 deserialize to the same value.
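You can reproduce the same collapse server-side with a binary floating-point type. Here's a minimal sketch of my own (assuming Newtonsoft.Json; this is not code from the question) showing that once the number text is parsed, the trailing zeros are gone for good:
using System;
using Newtonsoft.Json;

class Demo
{
    static void Main()
    {
        // Once parsed into a double (or a JavaScript Number), the original
        // text "243400.000000" no longer exists; only the numeric value does.
        double d = JsonConvert.DeserializeObject<double>("243400.000000");
        Console.WriteLine(JsonConvert.SerializeObject(d)); // prints 243400.0
    }
}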
These are all symptoms of what I believe is a more fundamental problem: a JavaScript Number is not the right place to hold a value if you care about exactly preserving a particular decimal representation (as mentioned in this answer to a related SO question). If representing decimals exactly matters to you, you should probably look into a strict-decimal library for JavaScript and send these values from your API as strings, which you then convert client-side into the decimal objects provided by whatever library you choose. (EDIT: Or perhaps send the values as-is, but implement a custom JSON deserializer that converts them to true decimal objects?)
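If you do go the strings route, the API side could look something like this. It's only a sketch of the idea (the converter name is mine, not from any library); it writes every decimal out as a JSON string so the exact representation reaches the client:
using System;
using System.Globalization;
using Newtonsoft.Json;

// Hypothetical converter: serializes decimals as JSON strings so their
// exact textual representation survives the trip through JavaScript.
public class DecimalAsStringConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(decimal);
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // decimal preserves its scale, so 243400.000000m becomes "243400.000000"
        writer.WriteValue(((decimal)value).ToString(CultureInfo.InvariantCulture));
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        // Assumes the value arrives as a JSON string
        return decimal.Parse((string)reader.Value, CultureInfo.InvariantCulture);
    }
}
Register it via JsonSerializerSettings.Converters and "MyValue" would be sent as "243400.000000", ready to hand to a client-side decimal library.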
Here's a link to someone else with the same problem.
Upvotes: 2