Reputation: 326
I'm using PowerShell dot-sourcing to set variables in the current scope and have encountered an interesting feature.
It seems that when a function is dot-sourced, its parameters also overwrite any local variable of the same name in the calling scope.
Is this expected?
Should I just use $global:MyVar instead to set variables in the local scope from other scripts?
# Given
function TestX([string]$X)
{
    Write-Host "`$X = $X"
}
# And variable $X
# Note that the variable name is the same as the parameter name in 'TestX'
$X = "MyValue"
PS> TestX $X
$X = MyValue
PS> $X; TestX "123456"
MyValue
$X = 123456
PS> $X; . TestX "123456"
MyValue
$X = 123456
PS> $X; . TestX "123456"
123456
$X = 123456
EDIT: To expand on what I'm trying to accomplish...
I have a set of scripts used for a build process. These scripts target multiple environments, and there are different configurations for each environment (DEV, TEST, QA, PROD) that apply different rules/settings/etc. These configurations are stored in directories. Among these settings are some PowerShell files used to set script-wide settings for that particular environment, for example the target server URL, target server UNC, etc.
Among the build process scripts there is a function Confirm-TargetEnvironmentVariables. As the name implies, it checks whether the environment variables have been loaded and, if not, loads them. This function is sprinkled throughout the various script files/functions to ensure that whenever a function uses one of these script-wide variables, it has been set.
It was this function that I used to call with dot-sourcing.
function Confirm-TargetEnvironmentVariables([string]$TargetEnvironment)
{
    ...
}
# Called like this:
. Confirm-TargetEnvironmentVariables "PROD"
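To illustrate, here is a rough sketch of what this function might contain; the folder layout, file name, and the $TargetServerUrl check are simplified placeholders, not the real build scripts:
function Confirm-TargetEnvironmentVariables([string]$TargetEnvironment)
{
    # If the environment settings haven't been loaded yet, dot-source the
    # per-environment settings file. Its variables land in whatever scope this
    # body runs in, which is the caller's scope when the function itself is
    # dot-sourced.
    if (-not $TargetServerUrl)
    {
        . ".\Environments\$TargetEnvironment\Settings.ps1"
    }
}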
This all worked just fine, until I needed to switch between loading variables from more than one environment (to refresh TEST from PROD, for example, I need variable info from both). In fact this still works, except that the script calling Confirm-TargetEnvironmentVariables already had a variable called $TargetEnvironment. So I was trying to do this:
$SourceEnvironment = "PROD"
$TargetEnvironment = "TEST"
. Confirm-TargetEnvironmentVariables $SourceEnvironment
# Do stuff with loaded "PROD" variables...
. Confirm-TargetEnvironmentVariables $TargetEnvironment
# Do stuff with loaded "TEST" variables...
But what was happening was this:
$SourceEnvironment = "PROD"
$TargetEnvironment = "TEST"
. Confirm-TargetEnvironmentVariables $SourceEnvironment
# Do stuff with loaded "PROD" variables...
# The value of $TargetEnvironment has been set to "PROD" by dot-sourcing!!
. Confirm-TargetEnvironmentVariables $TargetEnvironment
# Do stuff with loaded... "PROD" variables!!!
So this should hopefully provide more context. But ultimately it still raises the question of why dot-sourcing includes parameter variables when bringing variables into the local scope. Is this by design? I can't think of a scenario where this would be the desired behavior.
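My best guess at the mechanism (the demo below is a simplified illustration, not part of the build scripts) is that dot-sourcing runs the entire function body, parameter binding included, in the current scope, so every variable the function creates is left behind:
function Set-Demo([string]$Param)
{
    # Only an internal assignment; nothing is deliberately written to other scopes
    $Internal = "set inside the function"
}

PS> . Set-Demo "bound value"
PS> $Param
bound value
PS> $Internal
set inside the function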
Upvotes: 0
Views: 561
Reputation: 174690
Should I just use $global:MyVar instead to set variables in the local scope from other scripts?
I'd recommend that you avoid writing functions that either require dot-sourcing to work correctly or write to global variables.
Instead, use Set-Variable's -Scope parameter to write to a variable in the calling scope:
function Test-SetVariable
{
    param([string]$Name, $Value)

    # '1' means "one level up", so it updates the variable in the caller's scope
    Set-Variable -Name $Name -Value $Value -Scope 1
}
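For example (a usage sketch with made-up values, just to show the pattern):
PS> $TargetEnvironment = "TEST"
PS> Test-SetVariable -Name 'TargetEnvironment' -Value 'PROD'
PS> $TargetEnvironment
PROD
Confirm-TargetEnvironmentVariables could use the same pattern internally, calling Set-Variable -Scope 1 for each setting it loads, so it no longer needs to be dot-sourced and can't accidentally overwrite the caller's $TargetEnvironment.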
Upvotes: 1