KdgDev

Reputation: 14549

Load all functions into PowerShell from a certain directory

Suppose you're a system administrator who uses PowerShell to manage many things on your systems.

You've probably written a lot of functions for things you regularly need to check. However, if you move around a lot and use different machines, you'd have to re-enter all your functions again and again before you could use them. I even have to do it every time I exit and restart PowerShell, as it won't remember the functions...

I've written a function that does this for me. I'm posting it here because I want to be certain it's foolproof. The function itself is stored in allFunctions.ps1, which is why I have it excluded in the code.

The basic idea is that you have one folder in which you store all your ps1 files which each include a function. In PowerShell, you go to that directory and then you enter:

. .\allFunctions.ps1

The contents of that script is this:

[string]$items = Get-ChildItem -Path . -Exclude allFunctions.ps1
$itemlist = $items.split(" ")
foreach($item in $itemlist)
{
    . $item
}

This script will first collect every file in your directory, meaning all non-ps1 files you might have in there too. allFunctions.ps1 will be excluded.

Then I split the long string on the space character, which is the separator here. Then I run through the result with a foreach loop, each time dot-sourcing the file to load its function into PowerShell.

Suppose you have over 100 functions and you never know which ones you'll need and which you won't. Why not load them all instead of nitpicking?

So I'm wondering, what can go wrong here? I want this to be really safe, since I'm probably going to be using it a lot.

Upvotes: 18

Views: 38152

Answers (6)

Dirk

Reputation: 1324

I use PowerShell 4 and the following two lines in my Microsoft.PowerShell_profile.ps1:

$modules = $env:USERPROFILE + "\..\*.psm1"
Import-Module $modules

In addition: A tutorial to create advanced modules.

Upvotes: 2

EricG

Reputation: 1

Another solution is to combine your functions into a module (a psm1 file) and import it with the Import-Module command. You can add the Import-Module command to your profile as described above, but the syntax is much simpler.

A simple way to start is to create a folder called Modules in your WindowsPowerShell directory. In that folder create another folder with the name of your module. Your psm1 file is saved here. Then add Import-Module ModuleName.psm1 to your profile that is in the WindowsPowerShell directory. If you want to use a different profile or save your module in another location, you will need to manipulate the path accordingly.

Another tip is to use Export-ModuleMember FunctionName for each public function in your module file. This hides the supporting functions from the end users you may distribute the module to.
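As a minimal sketch of what such a module file might look like (the function names here are made up for illustration):

```powershell
# MyTools.psm1 -- a minimal module sketch; function names are hypothetical

function Get-FreeSpaceReport {
    # Public function: uses the private helper below
    Format-GB (Get-PSDrive C).Free
}

function Format-GB($bytes) {
    # Supporting function: not exported, so hidden from module consumers
    "{0:N1} GB" -f ($bytes / 1GB)
}

# Only Get-FreeSpaceReport is visible after Import-Module
Export-ModuleMember -Function Get-FreeSpaceReport
```

After Import-Module MyTools.psm1, Get-FreeSpaceReport is available in the session, but Format-GB is not.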

Upvotes: 0

user2016665

Reputation: 1

Simple solution:

$Path - file location
$Name - excluded file

Get-ChildItem -Path $Path |
    Where-Object { $_.PSIsContainer -eq $false } |
    Where-Object { $_.Name -like "*.ps1" } |
    Where-Object { $_.Name -ne $Name } |
    ForEach-Object -Process { . $_.FullName } |
    Out-Null

read all objects | files only | PowerShell files only | exclude file $Name | dot-source each | suppress output

Upvotes: 0

JasonMArcher

Reputation: 15011

You can do this in a simpler way. I have this in my profile:

##-------------------------------------------
## Load Script Libraries
##-------------------------------------------
Get-ChildItem ($lib_home + "*.ps1") | ForEach-Object {& (Join-Path $lib_home $_.Name)} | Out-Null

Where $lib_home is a folder that stores the scripts I want to auto-include. In this case it executes them, so I have the scripts define global functions. You could also dot-source them (replace "&" with ".").
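The choice between "&" and "." matters for scope. Assuming a hypothetical demo.ps1 that defines an ordinary (non-global) function, the difference looks like this:

```powershell
# demo.ps1 contains:  function Say-Hi { "hi" }

& .\demo.ps1    # call operator: runs in a child scope, so Say-Hi is
                # gone afterwards (unless declared as global:Say-Hi)
. .\demo.ps1    # dot-source: runs in the current scope, so Say-Hi
                # remains defined for the rest of the session
```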

Upvotes: 12

aphoria

Reputation: 20199

Include them in your PowerShell profile so they will load automatically every time you start PS.

Look at Windows PowerShell Profiles for more info about where to find your profile script.

PowerShell defaults your profile location to a WindowsPowerShell folder under "My Documents". Mine is on a network drive, so anywhere I log in, PowerShell points to the same profile folder.
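To see where your profile script lives (and create it if it doesn't exist yet), you can use the built-in $PROFILE variable:

```powershell
$PROFILE    # prints the path of the current-user, current-host profile

if (-not (Test-Path $PROFILE)) {
    # Create the profile file, along with any missing parent folders
    New-Item -ItemType File -Path $PROFILE -Force
}
```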

Upvotes: 17

JaredPar

Reputation: 755081

Functionality-wise, I think there are a couple of ways you could improve your script.

The first is that your script is dependent upon the name of the script not changing. While I don't think it's likely you'll change the name of this script, you never know what mood you'll be in a few years from now. Instead, why not just calculate the name of the script dynamically:

$scriptName = split-path -leaf $MyInvocation.MyCommand.Definition

The next problem is that I believe your split will fail if you ever place the directory in a path which contains a space. It will cause a path like "c:\foo bar\baz.ps1" to appear as "c:\foo" and "bar\baz.ps1". Much better to remove the split and just use the enumeration from the Get-ChildItem command.
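To illustrate the failure (the paths here are made up), splitting a joined string of file paths on spaces fragments any path that itself contains one:

```powershell
[string]$items = "C:\foo bar\a.ps1 C:\foo bar\b.ps1"
$items.Split(" ")
# produces four fragments instead of two paths:
#   C:\foo
#   bar\a.ps1
#   C:\foo
#   bar\b.ps1
```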

Also, you are taking a dependency on the current path being the path containing the scripts. You should either make that an explicit parameter or use the path containing the allFunctions.ps1 file (I prefer the latter).

Here is the updated version I would use.

$scriptName = split-path -leaf $MyInvocation.MyCommand.Definition
$rootPath = split-path -parent $MyInvocation.MyCommand.Definition
$scripts = Get-ChildItem -Recurse $rootPath -Include *.ps1 | Where-Object { $_.Name -ne $scriptName }
foreach ( $item in $scripts ) {
  . $item.FullName
}

From a security standpoint, you have to consider the possibility that a malicious user adds a bad script to the target directory. If they did, it would be executed along with your allFunctions.ps1 file and could do damage to the computer. But at the point the malicious user has access to your file system, it's likely they could do the damage without the help of your script, so it's probably a minor concern.

Upvotes: 6
