Reputation: 3424
I have an integration testing solution in which my tests are described in XML files. To take advantage of the Visual Studio 2010 testing infrastructure, I have a C# class in which every XML test file has an associated method that loads the XML file and executes its content. It looks like this:
[TestClass]
public class SampleTests
{
    [TestMethod]
    public void Test1()
    {
        XamlTestManager.ConductTest();
    }

    [TestMethod]
    public void Test2()
    {
        XamlTestManager.ConductTest();
    }

    ...

    [TestMethod]
    public void TestN()
    {
        XamlTestManager.ConductTest();
    }
}
Each method name corresponds to an XML file name, so I have to keep the matching files (Test1.xml, Test2.xml, ..., TestN.xml) in my test directory. XamlTestManager.ConductTest() uses the StackTrace class to get the name of the calling method, and that is how it finds the correct XML test file to load.
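Roughly, the lookup works like this (the directory path here is only for illustration, not the real layout):

public static class XamlTestManager
{
    public static void ConductTest()
    {
        // Frame 0 is ConductTest itself, frame 1 is the calling [TestMethod].
        var caller = new System.Diagnostics.StackTrace().GetFrame(1).GetMethod();

        // Hypothetical convention: the XML test case is named after the calling test method.
        string testFile = System.IO.Path.Combine(@"C:\TestFiles", caller.Name + ".xml");

        // ... load the test case from testFile and execute it ...
    }
}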
I would like to get rid of the extra administration of adding/removing/renaming test methods whenever I add, remove or rename an XML test file. How can I automagically generate this class or its methods during the compilation process, based on the actual XML files in my test directory?
Option 1:
I have considered PostSharp, but it does not seem to allow me to look up the XML files and generate methods on the fly (or did I not dig deep enough?).
Option 2:
The other idea was to build a Visual Studio custom tool that generates the code whenever it runs. The downside here is deployment: the custom tool needs to be registered with Visual Studio. I want a solution that can be committed to a repository, checked out on another computer and used right away.
(I believe in simplicity. "Check out and run" simplifies the life of new developers so much if they do not need to go through a list of things to install before they can compile and run the application.)
Do you have any recommendation on how to get rid of this unnecessary maintenance burden?
EDIT:
At Justin's request, I am adding more details. We use BizUnit (fantastic!!!) as the basis of our framework, with a truckload of custom-made high-level test steps. From these steps we can build our tests declaratively, like from Lego blocks. Our steps include things like FileDrop, web service invocation or even polling, firing up a full-blown web server to simulate a partner web application, random data generators, data-comparison steps, etc. Here is an example test XML (in fact, XAML):
<TestCase BizUnitVersion="4.0.154.0" Name="StackOverflowSample" xmlns="clr-namespace:BizUnit.Xaml;assembly=BizUnit" xmlns:nib="clr-namespace:MyCompany.IntegrationTest;assembly=BizUnit.MyCustomSteps" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
    <TestCase.SetupSteps>
        <nib:ClearStep FailOnError="True" RunConcurrently="False" />
        <nib:LaunchSimulatedApp AppKernelCacheKey="provider" FailOnError="True" FireWakeUpCall="False" PortNumber="4000" RepresentedSystem="MyProviderService" RunConcurrently="False" />
        <nib:HttpGetStep FailOnError="True" RunConcurrently="False" Url="http://localhost:10000/Home/StartSvgPolling">
            <nib:HttpGetStep.Parameters>
                <x:String x:Key="PolledAddress">http://localhost:4000/SvgOutputPort.asmx</x:String>
                <x:String x:Key="PollingInterval">10</x:String>
                <x:String x:Key="FilterFile"></x:String>
            </nib:HttpGetStep.Parameters>
        </nib:HttpGetStep>
    </TestCase.SetupSteps>
    <TestCase.ExecutionSteps>
        <nib:DocumentMergeStep FailOnError="True" OutputCacheKey="inputDocument" RunConcurrently="False">
            <nib:DocumentMergeStep.InputDocuments>
                <nib:RandomLoader BoundingBox="Europe" LinkbackUrlPattern="http://MyProviderService/id={0}" MaxAmount="10" MaxID="100" MinAmount="10" MinID="0" NamePattern="EuropeanObject_{0}" NativeFormat="Svg" RepeatableRandomness="False" UriPrefix="European" />
                <nib:RandomLoader BoundingBox="PacificIslands" LinkbackUrlPattern="http://MyProviderService/id={0}" MaxAmount="10" MaxID="100" MinAmount="10" MinID="0" NamePattern="PacificObject_{0}" NativeFormat="Svg" RepeatableRandomness="False" UriPrefix="Pacific" />
            </nib:DocumentMergeStep.InputDocuments>
        </nib:DocumentMergeStep>
        <nib:PushToSimulatedApp AppKernelCacheKey="provider" ContentFormat="Svg" FailOnError="True" RunConcurrently="False">
            <nib:PushToSimulatedApp.InputDocument>
                <nib:CacheLoader SourceCacheKey="inputDocument" />
            </nib:PushToSimulatedApp.InputDocument>
        </nib:PushToSimulatedApp>
        <nib:GeoFilterStep FailOnError="True" OutputCacheKey="filteredDocument" RunConcurrently="False" SelectionBox="Europe">
            <nib:GeoFilterStep.InputDocument>
                <nib:CacheLoader SourceCacheKey="inputDocument" />
            </nib:GeoFilterStep.InputDocument>
        </nib:GeoFilterStep>
        <nib:DeepCompareStep DepthOfComparision="ID, Geo_2MeterAccuracy, PropertyBag, LinkbackUrl" FailOnError="True" RunConcurrently="False" Timeout="30000" TolerateAdditionalItems="False">
            <nib:DeepCompareStep.ReferenceSource>
                <nib:CacheLoader SourceCacheKey="filteredDocument" />
            </nib:DeepCompareStep.ReferenceSource>
            <nib:DeepCompareStep.InvestigatedSource>
                <nib:SvgWebServiceLoader GeoFilter="Europe" NvgServiceUrl="http://localhost:10000/SvgOutputPort.asmx" />
            </nib:DeepCompareStep.InvestigatedSource>
        </nib:DeepCompareStep>
    </TestCase.ExecutionSteps>
    <TestCase.CleanupSteps>
        <nib:HttpGetStep FailOnError="True" RunConcurrently="False" Url="http://localhost:10000/Home/StopSvgPolling">
            <nib:HttpGetStep.Parameters>
                <x:String x:Key="PolledAddress">http://localhost:4000/SvgOutputPort.asmx</x:String>
            </nib:HttpGetStep.Parameters>
        </nib:HttpGetStep>
        <nib:KillSimulatedApp AppKernelCacheKey="provider" FailOnError="True" PortNumber="4000" RunConcurrently="False" />
    </TestCase.CleanupSteps>
</TestCase>
This is what it does: the setup steps clear any previous state, launch a simulated provider application on port 4000 and tell the web application on port 10000 to start polling it; the execution steps generate two sets of random data, merge them into one input document, push it to the simulated provider, geo-filter the merged document to the Europe bounding box and deep-compare that filtered reference against what the polled SvgOutputPort.asmx service returns; the cleanup steps stop the polling and kill the simulated provider.
The power of BizUnit is that it merges the ease of creating tests in C# with IntelliSense and the ease of maintaining/duplicating them in XAML files. For a quick, easy read on how it works: http://kevinsmi.wordpress.com/2011/03/22/bizunit-4-0-overview/
Upvotes: 3
Views: 1215
Reputation: 528
I just answered a similar "code generation from XML with T4" question:
https://stackoverflow.com/a/8554949/753110
Your requirement matches exactly what we did initially (and what led to the discovery of the ADM described in that answer).
We are currently working on test-case based generation, where the test cases are actually built by the testing staff, yet the complete integration tests in code are generated to support them.
I added a custom XML-based generation demo to that other example, if you want to take a look:
https://github.com/abstractiondev/DemoSOCase8552428ABS
Upvotes: 0
Reputation: 19426
As @GeorgeDuckett said, T4 templates are probably the way to go. In the application I am working on, we use them for a lot of things, including generating Repositories, Services, ViewModels, Enums and, recently, unit tests.
They are basically code-generating scripts written in either VB or C#; looking at a directory for XML files would be no problem for these kinds of templates.
If you do choose to go the T4 route, the Tangible T4 Editor is definitely a must-have; it is a free download.
Here is a quick example of a T4 script which should do what you want, or be pretty close:
<#@ template language="C#" debug="true" hostspecific="true" #>
<#@ import namespace="System.IO" #>
<#@ output extension="g.cs" #>
[TestClass]
public class SampleTests
{
<#
    // Generate one [TestMethod] per XML file found in the test directory.
    string[] files = Directory.GetFiles(@"C:\TestFiles", "*.xml");
    foreach (string filePath in files)
    {
        string fileName = Path.GetFileNameWithoutExtension(filePath);
#>
    [TestMethod]
    public void <#= fileName #>()
    {
        XamlTestManager.ConductTest();
    }

<#
    }
#>
}
Make sure this is placed in a file with the .tt extension, then in the Properties window for that file ensure the Build Action is None and the Custom Tool is TextTemplatingFileGenerator.
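Since the template is host-specific, you can also avoid hard-coding the absolute directory and resolve it relative to the .tt file instead; a small sketch (the "TestFiles" folder name is an assumption):

<#
    // ResolvePath is available because hostspecific="true"; it resolves paths
    // relative to the directory containing this .tt file.
    string testDir = this.Host.ResolvePath("TestFiles");
    string[] files = Directory.GetFiles(testDir, "*.xml");
#>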
Edit: Accessing output directory from T4 template
Add the following two lines to the top of your T4 template, under the <#@ template ... #> line:
<#@ assembly name="EnvDTE" #>
<#@ import namespace="EnvDTE" #>
Then inside your template you can access and use the Visual Studio API like so (this works because hostspecific="true" is set in the template directive):
// The template host exposes the Visual Studio automation (DTE) object model.
IServiceProvider serviceProvider = this.Host as IServiceProvider;
DTE dte = serviceProvider.GetService(typeof(DTE)) as DTE;

object[] activeSolutionProjects = dte.ActiveSolutionProjects as object[];
if (activeSolutionProjects != null)
{
    Project project = activeSolutionProjects[0] as Project;
    if (project != null)
    {
        Properties projectProperties = project.Properties;
        Properties configurationProperties = project.ConfigurationManager.ActiveConfiguration.Properties;

        string projectDirectory = Path.GetDirectoryName(project.FullName);
        string outputPath = configurationProperties.Item("OutputPath").Value.ToString();
        string outputFile = projectProperties.Item("OutputFileName").Value.ToString();

        string outDir = Path.Combine(projectDirectory, outputPath);
        string targetPath = Path.Combine(outDir, outputFile);
    }
}
outDir and targetPath now contain the output directory and the full path to the output file.
Upvotes: 1
Reputation: 13665
I actually don't think this is a job for build-time code generation; I think you should use data attributes to drive the tests in this case.
If you used xUnit you could do that like this:
public class SampleTests
{
    [Theory]
    [InlineData(1)]
    [InlineData(2)]
    [InlineData(...)]
    [InlineData(N)]
    public void Test(int x)
    {
        XamlTestManager.ConductTest(x);
    }
}
It will run the test once per InlineData attribute. I also believe there is another attribute to which you can pass a path to a file, and it will populate your parameters with values from that file...
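For example, a rough sketch of how the XML files themselves could drive a single theory, assuming xUnit 2.x's [MemberData] (older versions have [PropertyData] in xunit.extensions) and an overload of ConductTest that takes the test name:

using System.Collections.Generic;
using System.IO;
using System.Linq;
using Xunit;

public class SampleTests
{
    // Hypothetical: one row of theory data per XML file in the test directory.
    public static IEnumerable<object[]> TestFiles =>
        Directory.GetFiles(@"C:\TestFiles", "*.xml")
                 .Select(f => new object[] { Path.GetFileNameWithoutExtension(f) });

    [Theory]
    [MemberData(nameof(TestFiles))]
    public void Test(string testName)
    {
        XamlTestManager.ConductTest(testName); // assumes a name-taking overload
    }
}

Each XML file then shows up as a separate test case in the runner, with no per-file method to maintain.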
I think NUnit has a similar feature, but xUnit is much better; I would recommend using xUnit instead.
Upvotes: 0
Reputation: 106836
Instead of creating a separate test for each set of test data you can create a single test that is repeatedly run for each set of test data:
[TestClass]
public class SampleTests
{
    [TestMethod]
    public void Test()
    {
        for (var i = 0; i < 10; ++i)
            XamlTestManager.ConductTest(i);
    }
}
You can also perform data-driven tests by using the DataSource attribute. This will run your test once for each row in your data source.
[TestClass]
public class SampleTests
{
    public TestContext Context { get; set; }

    [TestMethod]
    [DataSource(...)]
    public void Test()
    {
        var someData = Context.DataRow["SomeColumnName"].ToString();
        ...
    }
}
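A sketch of what that could look like with an XML data file that lists the test cases (the file name, its row layout and the name-taking ConductTest overload are assumptions):

[TestClass]
public class SampleTests
{
    public TestContext TestContext { get; set; }

    // Hypothetical TestCases.xml deployed next to the test assembly, containing
    // one <TestCase><Name>...</Name></TestCase> row per XAML test file.
    [TestMethod]
    [DeploymentItem("TestCases.xml")]
    [DataSource("Microsoft.VisualStudio.TestTools.DataSource.XML",
                "|DataDirectory|\\TestCases.xml", "TestCase",
                DataAccessMethod.Sequential)]
    public void Test()
    {
        var testName = TestContext.DataRow["Name"].ToString();
        XamlTestManager.ConductTest(testName); // assumes a name-taking overload
    }
}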
Upvotes: 1