Reputation: 4508
I have a data-driven unit test that uses a CSV file for its data source. One of the columns in my file is meant to be treated as a string. It works fine until I add a line where the value for that column can be interpreted as a date. When I do this, the tests for the earlier lines start to fail. It appears that having a "date" in the column makes it treat all values in the column as dates. Values that can't be parsed as a date are then given a DBNull value. Is there any way to prevent this? Maybe by specifying what type each column in my data source should be treated as?
Upvotes: 2
Views: 1230
Reputation: 538
Based on the information you gave, I would suggest trying double quotes (") around the values. Secondly, I always treat all fields as strings and call an appropriate parse method in my code.
I do the following in the CSV file:
input,expected
"1600,1","1600,1"
"1600","1600"
A simple test method reads both values: the input is parsed as a Double and the expected value is read as a String.
[DeploymentItem("UnitTest\\TestData.csv"),
 DeploymentItem("TestData.csv"),
 TestMethod,
 DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
            "|DataDirectory|\\TestData.csv",
            "TestData#csv",   // table name: the '.' in the file name becomes '#'
            DataAccessMethod.Sequential)]
public void Test()
{
    // Double.Parse uses the current culture; values like "1600,1"
    // only parse when the decimal separator is a comma (e.g. nl-NL).
    double input = double.Parse(TestContext.DataRow["input"].ToString());
    string expected = TestContext.DataRow["expected"].ToString();

    // Compare like with like: AreEqual(double, string) would always fail.
    Assert.AreEqual(double.Parse(expected), input);
}
I know this is a very basic example and maybe I'm doing things that are not advised. My experience with Unit Testing in VS2010 is limited, so feel free to suggest improvements to this answer.
This answer is based on a problem I had reading some decimal values while testing a formatter I'm currently implementing.
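As for the question's idea of specifying a type per column: as far as I know, the CSV data source goes through the Jet/ACE text driver, which reads an optional Schema.ini file placed in the same folder as the CSV. A sketch, assuming the file is named TestData.csv with the two columns above (I haven't verified this against the VS2010 test runner myself, so treat it as a pointer rather than a tested fix):

```ini
[TestData.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=input Text
Col2=expected Text
```

Declaring each column as Text should stop the driver from guessing dates and returning DBNull for values it can't convert.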
Upvotes: 3