Ray

Reputation: 46585

What changes require a dependent assembly to be redeployed?

At my workplace we deploy internal applications by replacing only the assemblies that have changed (not my idea).

We can tell which assemblies we need to deploy by checking whether the source files that are compiled into each assembly have changed. Most of the time we don't need to redeploy assemblies that depend on assemblies that have changed. However, we have found some cases where, even though no source files in an assembly have changed, we still need to redeploy it.

So far we know that any of these changes in an assembly will require all dependent assemblies to be recompiled and redeployed:

Are there any other cases that we're missing? I'm also open to arguments why this entire approach is flawed (although it's been used for years).

Edit: To be clear, we're always recompiling everything, but only deploying assemblies whose source files have changed.

So anything that breaks compilation (method name changes, etc.) will be picked up, since such changes force changes in the calling code.

Upvotes: 7

Views: 337

Answers (2)

ChrisLively

Reputation: 88074

First off, we have sometimes deployed only a few assemblies of an application instead of the complete app. However, this is by no means the norm, and it has ONLY been done in our test environments, when the developer had very recently (as in within the last few minutes) published the whole site and was just making a minor tweak. Once the dev is satisfied, they go ahead and do a full recompile and republish.

The final push to testing is always based on a full recompile/deploy. The pushes to staging and ultimately production are based on that full copy.

Besides repeatability, one reason is that you really can't be 100% positive that a human didn't miss something in the comparisons. Next, the extra time to deploy 100 assemblies instead of 5 is trivial, and not worth the amount of human time it takes to figure out what really changed.

Quite frankly, the list you have in combination with Oded's answer ought to be enough to convince others of the potential for failure. However, the very fact that you have already run into failures due to this lackadaisical approach should be enough of a warning flag to stop it from continuing.

At the end of the day, it really boils down to a question of professionalism. Standardization and repeatability of the process of moving code out of development, through the various hoops, and ultimately into production are extremely important in creating robust mission-critical applications. If your deployment process is fraught with the potential for failure due to these types of risk-inducing shortcuts, it raises questions about the quality of the code being produced.

Upvotes: 2

Oded

Reputation: 499062

Here is another one:

Changes to optional parameter default values.

The default values get compiled directly into the assembly using them (if not specified at the call site). For example, given:

 public void MyOptMethod(int optInt = 5) {}

Any calling code such as this:

 theClass.MyOptMethod();

Will end up compiled to:

 theClass.MyOptMethod(5);

If you change the method to:

 public void MyOptMethod(int optInt = 10) {}

You will need to recompile all dependent assemblies if you want the new default to apply.
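
To see the failure end to end, here is a minimal sketch. The assembly names Lib and App, the class name TheClass, and the csc commands (classic .NET Framework compiler) are assumptions for illustration:

 // Lib.cs -- compile with: csc /target:library Lib.cs
 public class TheClass
 {
     public void MyOptMethod(int optInt = 5)
     {
         System.Console.WriteLine("optInt = " + optInt);
     }
 }

 // App.cs -- compile once against Lib.dll v1 with: csc /reference:Lib.dll App.cs
 public static class Program
 {
     public static void Main()
     {
         var theClass = new TheClass();
         // The C# compiler bakes the default into the call site, so this
         // line is emitted as theClass.MyOptMethod(5).
         theClass.MyOptMethod();
     }
 }

Change the default to 10, rebuild only Lib.dll, and drop it next to the old App.exe: the program still prints optInt = 5. The new default only takes effect once App.exe itself is recompiled.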


Additional changes that will require recompilation (thanks Polynomial):

  • Changes to generic type parameter constraints
  • Changes to method names (especially problematic when using reflection, as private methods may also be inspected; see the sketch after this list)
  • Changes to exception handling (different exception type being thrown)
  • Changes to thread handling
  • Etc... etc... etc...
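
As a concrete illustration of the reflection point above, here is a minimal sketch (the Worker class and DoWork method are hypothetical). Renaming a private method compiles cleanly everywhere, yet a dependent assembly that looks the method up by name only fails at runtime:

 using System;
 using System.Reflection;

 // Lives in the changed assembly. Renaming DoWork to, say, DoWorkInternal
 // breaks no build, because no other assembly references a private member
 // directly.
 public class Worker
 {
     private void DoWork() { Console.WriteLine("working"); }
 }

 // Lives in a dependent assembly that was not redeployed:
 public static class Inspector
 {
     public static void Run()
     {
         MethodInfo m = typeof(Worker).GetMethod(
             "DoWork", BindingFlags.NonPublic | BindingFlags.Instance);

         // After the rename, GetMethod returns null here, so the breakage
         // only surfaces at runtime in the stale assembly.
         if (m == null)
             throw new MissingMethodException("Worker", "DoWork");

         m.Invoke(new Worker(), null);
     }
 }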

So - always recompile everything.

Upvotes: 6
