Reputation: 3
Here is a script I use to check whether a URL is up or not. I need to change it so that it reads all the URLs from a text file on a local drive and displays the status of each one.
[string]$url = 'http://mywebsite.net'

function CheckForStatus($url) {
    try {
        # Send a lightweight HEAD request instead of downloading the whole page
        [net.HttpWebRequest]$req = [net.WebRequest]::Create($url)
        $req.Method = "HEAD"
        [net.HttpWebResponse]$res = $req.GetResponse()
        if ($res.StatusCode -eq 200) {
            Write-Host "`nSite $url is up (Return code: $($res.StatusCode) - $([int]$res.StatusCode))`n" -ForegroundColor Green
        }
        else {
            Write-Host "`nSite $url is not available (Return code: $($res.StatusCode) - $([int]$res.StatusCode))`n" -ForegroundColor Red
        }
        $res.Close()
    }
    catch {
        # Any network failure (DNS lookup, timeout, connection refused) lands here
        Write-Host "`nSite $url could not be reached: $($_.Exception.Message)`n" -ForegroundColor Red
    }
}

CheckForStatus $url
Upvotes: 0
Views: 89
Reputation: 34592
Use the Get-Content cmdlet to read the file and pipe it to ForeEach-Object (alias `%`) so your function runs once per line:

Get-Content "some_file" | % { CheckForStatus $_ }
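
Putting the two pieces together, a minimal sketch might look like the following. The path `C:\urls.txt` is an assumption (use wherever your file actually lives, one URL per line), and `CheckForStatus` is the function from the question:

```powershell
# Hypothetical path to the URL list; one URL per line
$urlFile = 'C:\urls.txt'

Get-Content $urlFile |
    Where-Object { $_.Trim() -ne '' } |            # skip blank lines
    ForEach-Object { CheckForStatus $_.Trim() }    # check each URL in turn
```

Trimming each line guards against stray whitespace in the file, which would otherwise make `[net.WebRequest]::Create` throw on an otherwise valid URL.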
Upvotes: 3