Reputation:
Currently we have set up our UI test automation in one environment (Dev). We are using C#, .NET, Visual Studio, SpecFlow and MSTest.
CONFIG
We read an app.config file for environment-specific variables. We use Azure Pipelines for CI and run these tests on a nightly build.
<configuration>
  <appSettings>
    <add key="DevApp" value="https://dev.websitename.com" />
  </appSettings>
  <connectionStrings>
    <add name="DevDatabase" connectionString="http://dev.url/" />
  </connectionStrings>
</configuration>
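For reference, a rough sketch of how we read these values in the test code (the helper and property names are simplified for illustration, not our exact implementation):

using System.Configuration;

public static class ConfigurationHelper
{
    // Base URL of the application under test, e.g. https://dev.websitename.com
    public static string AppUrl =>
        ConfigurationManager.AppSettings["DevApp"];

    // Connection string for the environment's database
    public static string ConnectionString =>
        ConfigurationManager.ConnectionStrings["DevDatabase"].ConnectionString;

    // Windows auth domain used for impersonation (see below)
    public static string Domain =>
        ConfigurationManager.AppSettings["Domain"];
}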
Now we want to run these tests on our UAT environment as well. The setup will be the same, and we want to run the same tests with the same data. The only difference is that for UAT we will point to different URLs and a different database.
For example:

Dev env = https://dev.websitename.com
UAT env = https://uat.websitename.com

<add name="DevDatabase" connectionString="http://dev.url/" />
<add name="UATDatabase" connectionString="http://uat.url/" />
PASSWORDS
In terms of passwords, our application is an internal application and we use Windows auth. In our Dev and UAT environments we have the same password set up for all users: in Dev = devpassword, in UAT = uatpassword.
For both Dev and UAT we use the same users, with the password being the only difference. When testing, we launch the browser under impersonation ('run as' that user):
var service = ChromeDriverService.CreateDefaultService(driverpath);

if (user != null)
{
    // Build the password from the value stored in our config helper
    var pwd = new SecureString();
    foreach (var c in ConfigurationHelper.Password) pwd.AppendChar(c);

    // Start the driver service under the impersonated user's credentials
    service.StartDomain = ConfigurationHelper.Domain;
    service.StartupUserName = username;
    service.StartupPassword = pwd;
    service.StartupLoadUserProfile = true;
}
We store the domain, password and other environment variables in a separate config file as constants.
**Main issue:** This won't work now, so I think it could be best to store the passwords as secrets in Azure Pipeline variables? If so, how would I change this code?
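For example, a rough sketch of what I have in mind, assuming the pipeline secret is mapped to an environment variable for the test task (the variable name here is just illustrative):

using System;
using System.Security;

// Azure Pipelines secret variables are not exposed to tasks automatically;
// the pipeline would map the secret to an environment variable for the test task.
var rawPassword = Environment.GetEnvironmentVariable("TEST_USER_PASSWORD");

var pwd = new SecureString();
foreach (var c in rawPassword)
    pwd.AppendChar(c);
pwd.MakeReadOnly();

service.StartupPassword = pwd;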
The server team, DB team and DevOps team have taken care of the servers, DB setup, URLs etc. So for me it's just configuring the test automation repo with my configuration changes. What could be an elegant approach to do this?
AZURE PIPELINE
How could we run tests for both these environments in parallel? By parallel I mean having both run on the nightly run. Our Azure pipeline has 2 separate clients, UAT and DEV, pointing to the same artifact. The tasks and variables are the same for both environments, but with different values obviously.
Currently they would both run in isolation.
Upvotes: 1
Views: 1979
Reputation: 21709
Solutions to this problem come down to how the context (in your case the environment and all its associated connection strings and URLs) gets to the tests where it will be consumed. In your question, you stated several orthogonal concerns: environment configuration (URLs and connection strings), secrets/passwords, and running the tests for both environments in parallel in Azure Pipelines.
Not mentioned is another concern: test data.
I'll explain one solution (a strategy really) that addresses these concerns, and why it appears to be a maintainable and extensible solution.
Test data can be very simple or very complex. The simple solution is to create a database of canonical and representative test data, and then swap that database into your environment. That can be done via your database's backup/restore or by creating the data programmatically. You would need a mechanism to restore or wipe the data whether the test(s) succeed or fail.
Very rarely will using the environment's database "as is" lead to reliable tests; tests often modify state, and a database is the ultimate form of state; mutations of state will affect future tests.
Because of that, a full swap that occurs before/after each run is probably a) faster (occurring at a bulk/macro level with a quicker swap function), b) more maintainable (data can be reviewed/created ahead of time) and c) less brittle.
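Here is a minimal sketch of the swap-by-restore approach, assuming a pre-built backup of canonical test data (the database name, backup path and connection string property are illustrative):

using System.Data.SqlClient;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class TestDataSetup
{
    // Runs once before any test in the assembly: restore the canonical test data.
    // Note: connect to master, not to the database being restored.
    [AssemblyInitialize]
    public static void RestoreTestData(TestContext context)
    {
        const string sql = @"
            ALTER DATABASE AppDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
            RESTORE DATABASE AppDb FROM DISK = 'C:\Backups\AppDb_canonical.bak' WITH REPLACE;
            ALTER DATABASE AppDb SET MULTI_USER;";

        using (var connection = new SqlConnection(ConfigurationHelper.MasterConnectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

The same restore could run per test via [TestInitialize] if it is fast enough, but once per run is usually the better trade-off.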
Configuration, which is the heart of your question, comes down to whether to use multiple files or a single file. Using multiple files means you can take advantage of some of the built-in .NET configuration mechanisms that allow you to specify the environment; it also means duplicating the files and changing the values to reflect each environment.
The other way, as you mentioned, is storing all of this information in a single configuration file. If you do it this way, you need some sort of discriminator to disambiguate the entries, and your test needs to be able to pass the environment name to some API to pull the value. I prefer this mechanism personally, because when you add a new feature/test you can add all its configuration in one place.
So the two mechanisms are roughly the same in terms of work, except the latter leads to a more compact editing session when adding new config, and from a readability/maintainability standpoint you have fewer places to look.
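A sketch of such an API, assuming the environment name is handed to the test run (here via an environment variable, though a .runsettings parameter works just as well) and that every key is prefixed with its environment name:

using System;
using System.Configuration;

public static class TestConfiguration
{
    // The environment under test, e.g. "DEV" or "UAT"; supplied by the pipeline.
    private static readonly string Env =
        Environment.GetEnvironmentVariable("TEST_ENVIRONMENT") ?? "DEV";

    // Look up a value by its environment-discriminated key, e.g. "DEV.Some.Key".
    public static string Get(string key) =>
        ConfigurationManager.AppSettings[$"{Env}.{key}"]
            ?? throw new InvalidOperationException($"Missing config key '{Env}.{key}'");
}

A test then calls something like TestConfiguration.Get("App.BaseUrl") and never cares which environment it is running against.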
If you follow the single-source-of-configuration approach, you simply extend that to your secrets, but select the appropriate secret store (e.g. a secrets file or Azure Key Vault, or some such... again with an environment-based discriminator). Here's an example:
{
    "DEV.Some.Key": "http://devhost/some/path",
    "UAT.Some.Key": "https://uathost/some/other/path",
    ...
}
Using a discriminator also means far fewer changes to your DevOps pipeline, which, from a developer/editing experience, is most likely slower and more cumbersome to change than a file or key vault.
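As a sketch, the same discriminator applied to a secret store, here Azure Key Vault via the Azure SDK (the vault URI and secret naming convention are illustrative; Key Vault secret names allow dashes but not dots):

using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

public static class SecretStore
{
    private static readonly SecretClient Client = new SecretClient(
        new Uri("https://my-test-vault.vault.azure.net/"),
        new DefaultAzureCredential());

    // Fetch a secret by its environment-discriminated name, e.g. "DEV-TestUser-Password".
    public static string Get(string environment, string name) =>
        Client.GetSecret($"{environment}-{name}").Value.Value;
}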
While you could rotate the context and design your solution to run in parallel using MSTest mechanisms, it would be more elegant to allocate this to the pipeline itself, and have enough resources (build agents and so on) to run these pipelines in parallel.
It comes down to which parts of the solution should be addressed by which resources. The solution above allocates environmental selection and test execution into the pipeline itself. Granular values such as connection strings and secrets are allocated to a single source to reduce the friction that occurs when having to edit these values.
Following this strategy might better leverage your team's skills as well. Those with a DevOps mindset can most likely spin up new environments and parallelize more readily than those with a developer mindset, who would be more aware of what data needs to be set up and how to craft the tests.
Upvotes: 1