get script to work on remote servers - powershell

I want the code below to work on multiple computers - any idea how to do this? I have it reading the server list, but I think it fails because I never actually target the servers in question.
Thanks,
CODE:
Write-Host "Script to check Storage Write, Read and Delete Times"
Write-Host "`n"
$computer = Get-Content -path d:\temp\servers.txt
$path = "f:\temp\test.txt"
Foreach ($storage in $computer)
{
$date = Get-Date
Write-Host "Script being run on $date"
$write = Measure-Command { new-item -Path $path -ItemType File -Force } | select TotalMilliseconds
Write-Host "Writing file on $storage took $write"
$read = Measure-Command { Get-Content -Path $path } | select TotalMilliseconds
Write-Host "Reading file on $storage took $read"
$delete = Measure-Command {Remove-Item -Path $path -Force } | select TotalMilliseconds
Write-Host "Deleting file on $storage took $delete"
Write-Host "`n"
}

You need to step back for a second and rethink the approach. You are issuing the filesystem commands every time against f:\temp, which is on your local system.
There are two ways to make remote computers perform filesystem tasks. The easiest way is to use UNC paths. That is, \\server\share format. Assuming you have local admin access:
Foreach ($storage in $computer) {
    $uncpath = $("\\{0}\f`$\temp\test.txt" -f $storage)
    $write = Measure-Command { New-Item -Path $uncpath -ItemType File -Force } | select TotalMilliseconds
    # rest of code uses $uncpath for access
}
Mind you, using UNC paths puts some stress on the LAN, so this type of testing might or might not be accurate enough.
The second way is to use PowerShell remoting to connect to the remote systems and issue the commands there. Take a look at the Invoke-Command, New-PSSession, Enter-PSSession and Exit-PSSession cmdlets.
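For instance, here is a minimal sketch of the remoting approach, assuming remoting (WinRM) is enabled on the targets and that f:\temp exists on each of them. Because Invoke-Command runs the measurement on the remote machine itself, the LAN is kept out of the timing:

$computers = Get-Content -Path d:\temp\servers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {
    $path = 'f:\temp\test.txt'
    # .TotalMilliseconds gives a bare number instead of @{TotalMilliseconds=...}
    $write  = (Measure-Command { New-Item -Path $path -ItemType File -Force }).TotalMilliseconds
    $read   = (Measure-Command { Get-Content -Path $path }).TotalMilliseconds
    $delete = (Measure-Command { Remove-Item -Path $path -Force }).TotalMilliseconds
    "Write: $write ms, Read: $read ms, Delete: $delete ms on $env:COMPUTERNAME"
}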


Get location of specific SCCM device collection in PowerShell

I am writing a script to export the names of all computers in a device collection to a txt file. My script works as expected, but I would like to preserve the folder structure in the exported file structure. For this I need to get the location of the Device Collection.
My Question:
Is there a way to get the location of a SCCM Device Collection in PowerShell?
I've stumbled across a few posts like this and this that use WMI and WQL for this, but I wasn't able to get those working in my script and I would like to do everything in PowerShell whenever possible.
$collections = (Get-CMDeviceCollection | Select -ExpandProperty "Name")
$totalCollections = $collections.length
"Number of Collections: $totalCollections"
$i = 0
foreach ($name in $collections) {
    $i++
    "Writing File $i of $totalCollections"
    $SanitizedName = $name -replace '/','(slash)' -replace '\\','(backslash)' -replace ':','(colon)' -replace '\*','(asterisk)' -replace '\?','(questionmark)' -replace '"','(quote)' -replace '<','(less)' -replace '>','(more)' -replace '\|','(pipe)'
    $file = New-Item -Path "C:\Temp\exporte\$SanitizedName.txt"
    Add-Content -Path $file.FullName -Value (Get-CMCollectionMember -CollectionName $name | Select -ExpandProperty "Name")
}
I would like to expand this code so that the txt files are placed in the corresponding subfolder, analogous to the SCCM folder structure, e.g. rootFolder/rooms/.
I was using this module until now but wasn't able to find anything that gives me back the specific location of a collection.
Thanks in advance
I wasn't able to find a way to do this in plain PowerShell with the SCCM module. In the end I did it like @FoxDeploy suggested: I run a SQL SELECT against our SCCM database for each collection (performance isn't an issue in my case) to get its folder path, and then use that to place the export file in the appropriate location.
This is my working example with some confidential lines removed
## Parameter ##
$exportLocation = [removed]
$sqlServer = [removed]
$db = [removed]
$query = "SELECT [ObjectPath] FROM [removed].[v_Collections] WHERE CollectionName ="
$SiteCode = [removed] # Site code
$ProviderMachineName = [removed] # SMS Provider machine name
# Customizations
$initParams = @{}
# Import the ConfigurationManager.psd1 module
if ((Get-Module ConfigurationManager) -eq $null) {
    Import-Module "[removed]\..\ConfigurationManager.psd1" @initParams
}
# Connect to the site's drive if it is not already present
if ((Get-PSDrive -Name $SiteCode -PSProvider CMSite -ErrorAction SilentlyContinue) -eq $null) {
    New-PSDrive -Name $SiteCode -PSProvider CMSite -Root $ProviderMachineName @initParams
}
Set-Location "$($SiteCode):\" @initParams
# get all collections and save them to an array
$collections = (Get-CMDeviceCollection | Select -ExpandProperty "Name")
# total number of collections
$totalCollections = $collections.length
# output to console
"Number of Collections: $totalCollections"
# empty output directory
Set-Location [removed]
Remove-Item $exportLocation\* -Recurse -Force
Set-Location [removed]
# loop through all collections
$i = 0
foreach ($name in $collections) {
    # print progress
    $i++
    "Writing File $i of $totalCollections"
    # remove all characters that aren't compatible with the Windows file naming scheme (/\:*?"<>|)
    $SanitizedName = $name -replace '/','(slash)' -replace '\\','(backslash)' -replace ':','(colon)' -replace '\*','(asterisk)' -replace '\?','(questionmark)' -replace '"','(quote)' -replace '<','(less)' -replace '>','(more)' -replace '\|','(pipe)'
    # get members of collection
    $collectionMembers = (Get-CMCollectionMember -CollectionName $name | Select -ExpandProperty "Name")
    # write to file
    Set-Location [removed]
    $path = (Invoke-Sqlcmd -ServerInstance $sqlServer -Database $db -Query "$query '$name'").Item("ObjectPath")
    New-Item -ItemType Directory -Force -Path "$exportLocation$path"
    $file = New-Item -Path "$exportLocation$path\$SanitizedName.txt"
    Add-Content -Path $file.FullName -Value $collectionMembers
    Set-Location [removed]
}
Hope this helps someone. Thanks @FoxDeploy.

PowerShell find files by extension on multiple servers and export

I wrote a script that finds a particular file on Windows servers, searching the disks listed for each server, and exports the results to a .txt file per server and drive.
I want to run the script against multiple servers and export all the details into one CSV or Excel file, with the server name and the file location path as output.
#Clear the console
Clear-Host
#Get the list of servers
$Machines = Get-Content "C:\Scripts\Servers.txt"
#Get the list of disks to search
$Disks = Get-Content "C:\Scripts\Disks.txt"
#Loop through the specified servers
foreach ($Machine in $Machines)
{
    #Loop through each disk
    foreach ($Disk in $Disks)
    {
        if (Test-Path \\$Machine\$Disk$)
        {
            Write-Host Checking $Machine Disk $Disk -BackgroundColor DarkRed
            Get-ChildItem -Path \\$Machine\$Disk$\ -Filter log4j.jar -Recurse -Name -Force | Out-File "C:\Scripts\Output\$Machine $Disk.txt"
        }
    }
}
Thanks in Advance
Don't send output from the script until you have all the information you need.
Using a custom object lets you store the values you want:
#Clear Host
Clear-Host
#Get the list of servers
$Machines = Get-Content "C:\Scripts\Servers.txt"
#List of disks to search
$Disks = @('D','G')
#Loop through the specified servers
foreach ($Machine in $Machines) {
    #Loop through each disk
    $FileResults = $Disks | ForEach-Object {
        If (Test-Path \\$Machine\$_$) {
            Write-Host Checking $Machine Disk $_ -BackgroundColor DarkRed
            $DiskResults = Get-ChildItem -Path \\$Machine\$_$\ -Filter log4j.jar -Name -Recurse -Force
            #
            # Send the results of each disk to a custom object, to be stored in $FileResults
            #
            [pscustomobject]@{Disk=$_;Files=$DiskResults}
        }
    }
    If (-not $FileResults) {
        $FileResults = 'None found'
    }
    #
    # Send the results of each completed server
    #
    [pscustomobject]@{ComputerName=$Machine;Found=$FileResults}
}
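To get everything into the single CSV file the question asks for, one option is to emit one flat object per file found and pipe the collected results to Export-Csv. This is a sketch; the output file name is an assumption:

$Machines = Get-Content "C:\Scripts\Servers.txt"
$Disks = @('D','G')
$results = foreach ($Machine in $Machines) {
    foreach ($Disk in $Disks) {
        if (Test-Path "\\$Machine\$Disk`$") {
            Get-ChildItem -Path "\\$Machine\$Disk`$\" -Filter log4j.jar -Recurse -Force -ErrorAction SilentlyContinue |
                ForEach-Object {
                    # one flat row per file keeps the CSV simple
                    [pscustomobject]@{ComputerName = $Machine; Disk = $Disk; Path = $_.FullName}
                }
        }
    }
}
$results | Export-Csv -Path "C:\Scripts\Output\log4j-results.csv" -NoTypeInformation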

PowerShell Script to Move Files to new server by last access date

I'm new to PowerShell. I have 80 servers that I need to connect to and run a PowerShell script on remotely, to find files recursively in one share by last access date and move them to another \\server\share for archiving purposes. I also need the file creation, last accessed, etc. timestamps to be preserved.
I would welcome any help please
thank you
You need to test this thoroughly before actually using it on all 80 servers!
If you want to use PowerShell for this, you could run Invoke-Command against the servers with admin credentials, so the script can access both the files to move and the destination archive folder.
I would suggest using ROBOCOPY to do the heavy lifting:
$servers = 'Server1', 'Server2', 'Server3' # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"
$scriptBlock = {
    $SourcePath = 'D:\StuffToArchive'        # this is the LOCAL path on the server
    $TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
    $LogFile = 'D:\ArchivedFiles.txt'        # write a text file listing the full names of all archived files
    $DaysAgo = 130
    # from a cmd box, type 'robocopy /?' to see all possible switches you might want to use
    # /MINAGE:days specifies the LastWriteTime
    # /MINLAD:days specifies the LastAccessDate
    robocopy $SourcePath $TargetPath /MOVE /MINLAD:$DaysAgo /COPYALL /E /FP /NP /XJ /XA:H /R:5 /W:5 /LOG+:$LogFile
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
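Since thorough testing is essential here, note that robocopy's /L switch gives you a dry run: it only lists what would be moved, without copying, moving, or deleting anything. For example, with the same paths as above:

# dry run: /L lists the files that WOULD be moved, but changes nothing
$SourcePath = 'D:\StuffToArchive'
$TargetPath = '\\NewServer\ArchiveShare'
robocopy $SourcePath $TargetPath /MOVE /MINLAD:130 /COPYALL /E /FP /NP /XJ /XA:H /R:5 /W:5 /L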
If you want to do all using just PowerShell, try something like this:
$servers = 'Server1', 'Server2', 'Server3' # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"
$scriptBlock = {
    $SourcePath = 'D:\StuffToArchive'        # this is the LOCAL path on the server
    $TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
    $LogFile = 'D:\ArchivedFiles.txt'        # write a text file listing the full names of all archived files
    $refDate = (Get-Date).AddDays(-130).Date # the reference date set to midnight
    # set the ErrorActionPreference to Stop, so exceptions are caught in the catch block
    $OldErrorAction = $ErrorActionPreference
    $ErrorActionPreference = 'Stop'
    # loop through the server's LOCAL path to find old files and move them to the remote archive
    Get-ChildItem -Path $SourcePath -File -Recurse |
        Where-Object { $_.LastAccessTime -le $refDate } |
        ForEach-Object {
            try {
                $target = Join-Path -Path $TargetPath -ChildPath $_.DirectoryName.Substring($SourcePath.Length)
                # create the folder in the archive if it does not already exist
                $null = New-Item -Path $target -ItemType Directory -Force
                $_ | Move-Item -Destination $target -Force
                Add-Content -Path $LogFile -Value "File '$($_.FullName)' moved to '$target'"
            }
            catch {
                Add-Content -Path $LogFile -Value $_.Exception.Message
            }
        }
    $ErrorActionPreference = $OldErrorAction
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
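The pure PowerShell variant can be dry-run as well: both destructive cmdlets inside the loop support -WhatIf, so while testing you can swap these in for the corresponding lines above to see what would happen without touching any files:

# test run: -WhatIf reports each action without creating or moving anything
$null = New-Item -Path $target -ItemType Directory -Force -WhatIf
$_ | Move-Item -Destination $target -Force -WhatIf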

PowerShell - Array Loop for service start and copy files

I need help with a specific issue in PowerShell.
I am trying to start multiple services one by one, and after each successful start I need to copy some files from one location to another. These files are created only after the service/app is up, which I need to verify from a string in a text file (like "Service successfully started").
I tried to write a foreach loop, but because the copy and text-check locations are different, I couldn't manage it. Honestly, I don't know much about nested loops. Maybe you can give me some ideas to make this work.
For example, for one service:
Source file locations:
C:\sourcepath\location1\folder\abc.dat
C:\sourcepath\location1\folder\cde.dat
The txt file that needs to be checked for the line "Service successfully started" (to confirm the service/app started successfully):
C:\sourcepath\folder1\logs\logfile.txt
Destination folder location:
D:\destinationpath\location1\ (the abc.dat and cde.dat files should end up in the same folder)
--- The flow should be like this:
Start a service
Make sure it's up by checking the string in the txt file
After those checks, copy the specified files from the source folder to the destination (creating the destination folder based on the source folder)
Stop the service
After confirming its status is stopped, start the next service and repeat the same process through the last service, but with different locations
For example, location1 becomes location2 and then location3, while the file names stay the same. The destination folder should also be created according to the source folder.
Even any directions will be helpful.
Edit1:
This is the code I have written so far.
[array]$serviceNames = "lfsvc", "iphlpsvc"
[array]$app = "app1", "app2"
$sourceStart = "C:\Source\"
$destinationStart = "C:\Target\"
$logs = "\logs"
$sourceFull = $sourceStart+$app.Get(0)+"\data"
$destinationFull = $destinationStart+$app.Get(0)
ForEach ($serviceNames in $serviceNames)
{
    Start-Service $serviceNames -ErrorAction SilentlyContinue;
    $text = Select-String -Path $sourceStart+$app.Get(0)+$logs\log.txt -Pattern "Service successfully started"
    if ($text -ne $null)
    {
        md $destination;
        Copy-Item -Path $sourceFull\123.txt -Destination $destinationFull\123.txt
        Copy-Item -Path $sourceFull\456.txt -Destination $destinationFull\456.txt
    }
}
I need to step through the other $app values in turn, together with the corresponding $serviceNames values.
I also need the if check to wait until the log shows the "Service successfully started" line.
Thanks
Edit2:
Written out the long way, it would be something like the following. (Of course, if I could check the string from a specified text file, that would be great.)
I need to shorten this code:
[array]$serviceNames = "aService", "bService"
Start-Service $serviceNames[0] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 75;
md "C:\Dest\aService\fld";
Copy-Item -Path "C:\Source\aService\fld\123.txt" -Destination "C:\Dest\aService\fld\123.txt";
Copy-Item -Path "C:\Source\aService\fld\456.txt" -Destination "C:\Dest\aService\fld\456.txt";
Copy-Item -Path "C:\Source\aService\fld\789.txt" -Destination "C:\Dest\aService\fld\789.txt";
Stop-Service $serviceNames[0] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 15;
Start-Service $serviceNames[1] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 75;
md "C:\Dest\bService\fld";
Copy-Item -Path "C:\Source\bService\fld\123.txt" -Destination "C:\Dest\bService\fld\123.txt";
Copy-Item -Path "C:\Source\bService\fld\456.txt" -Destination "C:\Dest\bService\fld\456.txt";
Copy-Item -Path "C:\Source\bService\fld\789.txt" -Destination "C:\Dest\bService\fld\789.txt";
Stop-Service $serviceNames[1] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 15;
I think I have an idea of what you want.
[array]$serviceNames = "lfsvc", "iphlpsvc"
[array]$apps = "app1", "app2"
$sourceStart = "C:\Source\"
$destinationStart = "C:\Target\"
$logs = "\logs"
# main loop, which loops over the apps
foreach ($app in $apps)
{
    $sourceFull = $sourceStart + $app + "\data"
    $destinationFull = $destinationStart + $app
    # each app will iterate over all of the services
    ForEach ($name in $serviceNames)
    {
        # uses -PassThru to get the service object, and pulls its status from that. -ErrorAction Stop makes any error terminate the script
        $status = (Start-Service $name -ErrorAction Stop -PassThru).Status
        # this while loop pauses until the service is in the "Running" state, re-reading the status each pass
        while ($status -ne "Running") { Start-Sleep -Seconds 5; $status = (Get-Service $name).Status }
        $text = Select-String -Path $($sourceStart + $app + $logs + "\log.txt") -Pattern "Service successfully started"
        # check whether $text is null or empty; if not, do the thing
        if (![string]::IsNullOrEmpty($text))
        {
            if (!(Test-Path -Path $destinationFull)) { New-Item -ItemType Directory -Path $destinationFull }
            Get-Content -Path "$sourceFull\123.txt" | Add-Content -Path "$destinationFull\123.txt"
            Get-Content -Path "$sourceFull\456.txt" | Add-Content -Path "$destinationFull\456.txt"
        }
        # stops the service
        $status = (Stop-Service $name -PassThru).Status
        # pauses until the service is stopped
        while ($status -ne "Stopped") { Start-Sleep -Seconds 5; $status = (Get-Service $name).Status }
    }
}
Something like this?
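One refinement, since the goal is to wait until the log actually shows the line rather than check it once: replace the single Select-String call above with a small polling loop. This is a sketch; the 120-second timeout and 5-second interval are arbitrary assumptions:

# poll the log until the success line appears or the deadline passes
$logPath = $sourceStart + $app + $logs + "\log.txt"
$deadline = (Get-Date).AddSeconds(120)
do {
    Start-Sleep -Seconds 5
    $text = Select-String -Path $logPath -Pattern "Service successfully started" -ErrorAction SilentlyContinue
} until ($text -or (Get-Date) -gt $deadline)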

PowerShell SQL Job Step Move-Item not working on 1 server

This identical code has been used on 3 servers, and on only one of them does it silently fail to move the items (it still REMOVES them, but they never appear in the share).
Azure-MapShare.ps1
param (
    [string]$DriveLetter,
    [string]$StorageLocation,
    [string]$StorageKey,
    [string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    cmd.exe /c "net use ${DriveLetter}: ${StorageLocation} /u:${StorageUser} ""${StorageKey}"""
}
Get-Exclusion-Days.ps1
param (
    [datetime]$startDate,
    [int]$daysBack
)
$date = $startDate
$endDate = (Get-Date).AddDays(-$daysBack)
$allDays =
do {
    "*" + $date.ToString("yyyyMMdd") + "*"
    $date = $date.AddDays(-1)
} until ($date -lt $endDate)
return $allDays
Migrate-Files.ps1
param(
    [string]$Source,
    [string]$Filter,
    [string]$Destination,
    [switch]$Remove = $False
)
#Test if source path exists
if ((Test-Path -Path $Source.trim()) -ne $True) {
    throw 'Source did not exist'
}
#Test if destination path exists
if ((Test-Path -Path $Destination.trim()) -ne $True) {
    throw 'Destination did not exist'
}
#Test if there are no files in source
if ((Get-ChildItem -Path $Source).Length -eq 0) {
    throw 'No files at source'
}
if ($Remove)
{
    #Move-Item removes the source files
    Move-Item -Path $Source -Filter $Filter -Destination $Destination -Force
} else {
    #Copy-Item keeps a local copy
    Copy-Item -Path $Source -Filter $Filter -Destination $Destination -Force
}
return $True
The job step is type "PowerShell" on all 3 servers and contains this identical code:
#Create mapping if missing
D:\Scripts\Azure-MapShare.ps1 -DriveLetter 'M' -StorageKey "[AzureStorageKey]" -StorageLocation "[AzureStorageAccountLocation]\backup" -StorageUser "[AzureStorageUser]"
#Copy files to Archive
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "D:\Databases\BackupArchive"
#Get date range to exclude
$exclusion = D:\Scripts\Get-Exclusion-Days.ps1 -startDate (Get-Date) -DaysBack 7
#Remove items that are not included in exclusion range
Remove-Item -Path "D:\Databases\BackupArchive\*.bak" -exclude $exclusion
#Move files to storage account. They will be destroyed
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "M:\" -Remove
#Remove remote backups that are not from today's backup
Remove-Item -Path "M:\*.bak" -exclude $exclusion
If I run the job step using SQL then the files get removed but do not appear in the storage account. If I run this code block manually, they get moved.
When I start up PowerShell on the server, I get an error message: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed." However, this does not really impact the rest of the operations (copying the backup files to BackupArchive folder, for instance).
I should mention that Copy-Item also fails to copy across to the share, but succeeds in copying to the BackupArchive folder.
Not sure if this will help you, but you could try the New-PSDrive cmdlet instead of net use to map your shares:
param (
    [string]$DriveLetter,
    [string]$StorageLocation,
    [string]$StorageKey,
    [string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    $securedKey = $StorageKey | ConvertTo-SecureString -AsPlainText -Force
    $credentials = New-Object System.Management.Automation.PSCredential ($StorageUser, $securedKey)
    New-PSDrive -Name $DriveLetter -PSProvider FileSystem -Root $StorageLocation -Credential $credentials -Persist
}
Apparently I tricked myself on this one. During testing I must have run the net use command in an elevated command prompt. This hid the mapped drive from non-elevated OS features such as Windows Explorer, and from attempts to view it in non-elevated command prompt sessions. I suppose it was also automatically reconnecting during reboots, because rebooting did not fix it.
The solution was as easy as running the net use m: /delete command from an elevated command prompt.