Confusion on Start-Job for File Share - powershell

So, this is absolutely whipping me. I have created a script that moves data from one file share to another based on a user's responses to a number of questions. What I would like to do is have a background job running that produces a report of all the files to be moved before the move takes place. To that end, I added this little bit of code, which doesn't gather info from the source file share at all. It simply reports data from my own machine. What am I doing wrong?
While ($sourcepath -eq $null) {
    $sourcepath = Read-Host "Enter source file path"
}
Set-Location $sourcepath
Start-Job -ScriptBlock {
    Get-ChildItem -Recurse |
        Out-File c:\users\john.smith\desktop\shareonfile.txt
}

Jobs run in a different process, with their own scope. The working directory won't be inherited. To demonstrate this:
Set-Location $sourcepath
Start-Job -ScriptBlock {
    Get-Location
} | Wait-Job | Receive-Job
Get-Job | Remove-Job
You should avoid setting the location anyway, and just pass the path to Get-ChildItem. To do that in a job, define a parameter and pass its value like so:
Start-Job -ScriptBlock {
    param($thePath)
    Get-ChildItem -Path $thePath -Recurse |
        Out-File c:\users\john.smith\desktop\shareonfile.txt
} -ArgumentList $sourcepath
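As an aside: on PowerShell 7+, Start-Job also has a -WorkingDirectory parameter, so if you do want the job to inherit a location, you can pass it explicitly. A minimal sketch, assuming PowerShell 7 or later (the output path is the one from the question):
# PowerShell 7+ only: set the job's working directory instead of relying on inheritance
Start-Job -WorkingDirectory $sourcepath -ScriptBlock {
    Get-ChildItem -Recurse | Out-File c:\users\john.smith\desktop\shareonfile.txt
}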

Related

Remove alternate data stream using PowerShell

I'm trying to remove a bunch of OSX alternate data streams on an NTFS volume, but no matter what I try I cannot get PowerShell to do it. Yes, I admit that my PowerShell is not great. Is anyone able to help?
Objective: Remove the ADS "AFP_AfpInfo" from any directory in the volume.
Current Code:
Get-ChildItem E:\ -Directory -Recurse | ForEach-Object {
    $streams = Get-Content -Path $_ -Stream AFP_AfpInfo -ErrorAction SilentlyContinue
    if ($streams) {
        $streams | ForEach-Object {
            try {
                Remove-Item -Path "$($_.PSPath)" -Stream AFP_AfpInfo -Recurse -Force -ErrorAction SilentlyContinue
            }
            catch {
                Write-Host "An error occurred: $($_.Exception.Message)"
            }
        }
    }
}
Current error:
An error occurred: A parameter cannot be found that matches parameter name 'Stream'.
Note: Running PowerShell 7.3
-Recurse and -Stream don't seem to work together, even though the documentation shows them in the same parameter sets. In this case -Recurse should be removed. GitHub Issue #9822 was submitted to add clarification to the Remove-Item docs.
Also, you're after one exact stream, AFP_AfpInfo, so I don't see a need to enumerate $streams. Lastly, checking whether a file or folder has an alternate data stream should be done with Get-Item instead of Get-Content, for efficiency.
As a final aside, the code must use the .Remove method from EngineIntrinsics to work: Remove-Item -Confirm:$false -Force will always ask for confirmation on folders, which is arguably a bug. Remove-Item should skip the confirmation check when -Stream is in use together with -Confirm:$false -Force. GitHub issue #19154 was submitted to follow up on this.
$removeFunc = $ExecutionContext.InvokeProvider.Item.Remove
$targetStream = 'AFP_AfpInfo'

Get-ChildItem E:\ -Recurse -Directory | ForEach-Object {
    if ($stream = $_ | Get-Item -Stream $targetStream -ErrorAction SilentlyContinue) {
        try {
            # Arguments are: path, recurse, force, literalPath
            $removeFunc.Invoke($stream.PSPath, $false, $true, $true)
        }
        catch {
            Write-Host "An error occurred: $($_.Exception.Message)"
        }
    }
}
Why are you not just using the Unblock-File cmdlet to remove ADS?
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/unblock-file?view=powershell-7.3
Description:
This cmdlet only works on the Windows and macOS platforms.
The Unblock-File cmdlet lets you open files that were downloaded from
the internet. It unblocks PowerShell script files that were downloaded
from the internet so you can run them, even when the PowerShell
execution policy is RemoteSigned. By default, these files are blocked
to protect the computer from untrusted files.
Before using the Unblock-File cmdlet, review the file and its source
and verify that it is safe to open.
Internally, the Unblock-File cmdlet removes the Zone.Identifier
alternate data stream, which has a value of 3 to indicate that it was
downloaded from the internet.
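Given that, for the Zone.Identifier stream specifically, removing it by hand should be roughly equivalent to Unblock-File. A minimal sketch, assuming a Windows file system and a made-up file path:
# Remove the Zone.Identifier stream directly instead of calling Unblock-File
Remove-Item -Path C:\Downloads\PowerShellTips.chm -Stream Zone.Identifier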
Get-Help -Name Unblock-File -Examples
NAME
Unblock-File
SYNOPSIS
Unblocks files that were downloaded from the internet.
------------------ Example 1: Unblock a file ------------------
PS C:\> Unblock-File -Path C:\Users\User01\Documents\Downloads\PowerShellTips.chm
-------------- Example 2: Unblock multiple files --------------
PS C:\> dir C:\Downloads\*PowerShell* | Unblock-File
------------- Example 3: Find and unblock scripts -------------
PS C:\> Get-Item * -Stream "Zone.Identifier" -ErrorAction SilentlyContinue
FileName: C:\ps-test\Start-ActivityTracker.ps1
See also this use case for the Get-Item, Clear-Content, and Remove-Item cmdlets:
Friday Fun with PowerShell and Alternate Data Streams
https://jdhitsolutions.com/blog/scripting/8888/friday-fun-with-powershell-and-alternate-data-streams
You could also just use the Microsoft Sysinternals streams tool to remove the ADS from within your PS code.
https://learn.microsoft.com/en-us/sysinternals/downloads/streams
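If you go that route, the call from PowerShell would look something like this. A sketch, assuming streams64.exe has been unpacked somewhere on PATH; -s recurses subdirectories and -d deletes the streams, per the Sysinternals docs:
# Delete all named streams under E:\, recursing into subdirectories
& streams64.exe -accepteula -s -d E:\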

Powershell script searching files on domain

Very new to PowerShell and AD, so apologies if this post has an obvious answer. I have done some research and I am still not finding the answers I am looking for. My script is below for reference.
I have created a simple PowerShell script that will run on an admin VM I have set up on my domain. I have a separate SQL VM running a backup script that consumes a lot of storage over time. I am trying to run this very simple script against it. My question is: do I need to modify this script in order to store it on my admin VM but have it run on my SQL VM? Or can I leave the path as is and just set it up in Task Scheduler? I have tried targeting the FQDN and the IP, but it doesn't seem to be working either way.
$backups_file = 'E:\blahBlahBla\SQL\Backups'
# or
$backups_file = '<IP_ADDRESS>\E:\blahBlahBla\SQL\Backups'
# or
$backups_file = '<FQDN>E:\blahBlahBla\SQL\Backups'
$backup_file_exist = (Test-Path -Path $backups_file)
if ($backup_file_exist) {
    # Verifies the folder exists.
    Write-Output -InputObject "This folder exists"
    # Returns all the files in the folder.
    Get-ChildItem -Path $backups_file
    # Deletes all files in the folder that are older than 7 days.
    Get-ChildItem -Path $backups_file -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
        Remove-Item
}
else {
    Write-Output -InputObject "Unable to access this directory."
}
Thanks.
Well, all of your $backups_file attempts look wrong to me.
If you want to access a directory on a remote system, it has to be at least a file share, or an administrative share like \\computer\e$\folder\folder\.
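For example, assuming the default administrative share E$ is still enabled on the SQL VM (keeping the <FQDN> placeholder from the question):
$backups_file = '\\<FQDN>\E$\blahBlahBla\SQL\Backups'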
But why use file shares or anything like that at all, when you can simply connect to a PowerShell session on the remote host? Here is an example:
$mySQLServer = "Server1.domain.name", "server2.domain.name"
$backupFolder = "E:\blahBlahBla\SQL\Backups"
foreach ($server in $mySQLServer)
{
    $session = New-PSSession -ComputerName $server # maybe -Credential if needed
    Invoke-Command -Session $session -ArgumentList $backupFolder -ScriptBlock {
        param(
            $directory
        )
        if (Test-Path -Path $directory)
        {
            # Verifies the folder exists.
            Write-Output -InputObject "This folder exists"
            # Returns all the files in the folder.
            Get-ChildItem -Path $directory
            # Deletes all files in the folder that are older than 7 days.
            Get-ChildItem -Path $directory -Recurse |
                Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
                Remove-Item
        }
    }
    Remove-PSSession -Session $session
}
Good Luck!

Run all .exe in a folder

I'm playing with malware in a VM, and every script I try gets stuck. Basically I need to run every .exe in a folder. I've tried batch files using start, PowerShell, etc. The issue happens when the AV moves some file to quarantine, or some process keeps running, and then the script doesn't jump to the next one.
CMD start works, but it shows popups when it doesn't find some file, and you have to keep clicking to move on to the next file.
These work but get stuck after a while:
Get-ChildItem 'C:\Users\LAB\Desktop\test' | ForEach-Object {
    & $_.FullName
}
Same here:
for %%v in ("C:\Users\LAB\Desktop\test\*.exe") do start "" "%%~v"
and here:
for %%i in (C:\Users\LAB\Desktop\test\*.exe) do %%i
You need to provide some form of code to allow us to help you troubleshoot it; this is not a request-a-script page.
Anyway, you would be looking at something like this:
# Assuming the .exe files are located in the C:\ root.
Get-ChildItem -Path C:\ | Where-Object { $_.Extension -like ".exe" } | ForEach-Object { Start-Process $_.FullName }

# In PS, we like to filter as far left as possible for faster results.
Get-ChildItem -Path C:\ -File -Filter "*.exe" | ForEach-Object { Start-Process $_.FullName }

# Run the commands as a job so the console doesn't wait on any process to finish before starting the next.
Start-Job { Get-ChildItem -Path C:\ -File -Filter "*.exe" | ForEach-Object { Start-Process $_.FullName } }
Start-Sleep 2
Get-Job | Remove-Job
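One more thought on the getting-stuck part: since the AV may quarantine a file between listing it and launching it, Start-Process can throw on a missing file. Wrapping each launch in try/catch keeps the loop moving; a sketch, using the test folder path from the question:
Get-ChildItem -Path 'C:\Users\LAB\Desktop\test' -File -Filter '*.exe' | ForEach-Object {
    $exe = $_.FullName
    try {
        # Launch without waiting; -ErrorAction Stop makes a vanished file catchable
        Start-Process -FilePath $exe -ErrorAction Stop
    }
    catch {
        Write-Warning "Skipping $exe : $($_.Exception.Message)"
    }
}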
Please refer to the following link: How to ask a question

PowerShell: using Compress-Archive with Start-Job won't work

I'm trying to use PowerShell to compress a bunch of video files on my H:\ drive. However, running this serially would take a long time as the drive is quite large. Here is a short snippet of the code that I'm using. Some parts have been withheld.
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Start-Job -ArgumentList $show -ScriptBlock {
        param($show)
        $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
        Compress-Archive -Path $show.FullName -DestinationPath $destPath
    }
}
When I run Get-Job, the job shows up as completed, with no reason in its JobStateInfo, but no .zip was created. I've run some tests, replacing the Compress-Archive command with an Out-File of the $destPath variable, again using Start-Job.
Start-Job -ArgumentList $shows[0] -ScriptBlock {
    param($show)
    $show = [System.IO.FileInfo]$show
    $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
    $destPath | Out-File "$($show.DirectoryName)\test.txt"
}
A text file IS created and it shows the correct destination path. I've run PowerShell as an Administrator and tried again but that doesn't appear to work either. Not sure if it matters, but I'm running on Windows 10 (latest).
Any help would be appreciated. Thanks!
For some reason, inside the job, the serialized FileInfo object has no BaseName (it's a script property, and script properties don't survive serialization across the process boundary). With thread jobs, it works:
dir | Start-Job { $input } | Receive-Job -Wait -AutoRemoveJob |
    Select-Object Name, BaseName, FullName

Name      BaseName FullName
----      -------- --------
file1.txt          C:\Users\js\foo\file1.txt
file2.txt          C:\Users\js\foo\file2.txt
file3.txt          C:\Users\js\foo\file3.txt
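Applied to your loop, a thread-job version would look something like this. A sketch, assuming Start-ThreadJob is available (it ships with PowerShell 7; on Windows PowerShell 5.1 it comes from the ThreadJob module). Because thread jobs run in-process, the FileInfo object is passed live rather than serialized, so BaseName survives:
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Start-ThreadJob -ArgumentList $show -ScriptBlock {
        param($show)
        # BaseName works here because the object was never serialized
        Compress-Archive -Path $show.FullName -DestinationPath "$($show.DirectoryName)\$($show.BaseName).zip"
    }
}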
I'm not sure if that's what you want, but I think you first need to create an archive and then update it with the shows, so I created a zip called archive and looped through, adding the files:
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Compress-Archive -Path $show.FullName -Update -DestinationPath "C:\archive"
}

PowerShell Invoke-Command severe performance issues

I'm having a heck of a time running a script on a remote system efficiently. When run locally the command takes 20 seconds. When run using Invoke-Command the command takes 10 or 15 minutes - even when the "remote" computer is my local machine.
Can someone explain to me the difference between these two commands and why Invoke-Command takes SO much longer?
Run locally on MACHINE:
get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue
Run remotely against \\MACHINE (it behaves the same whether MACHINE is my local machine or a remote one):
invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue}
Note: The command returns 5 file objects
EDIT: I think part of the problem may be reparse points. When run locally, get-childitem (and DIR /a-l) do not follow junction points. When run remotely they do, even if I use the -Attributes !ReparsePoint switch.
EDIT 2: Yet, if I run the command invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Attributes !ReparsePoint -Force -ErrorAction SilentlyContinue} I don't see the junction points (i.e. Documents and Settings). So it is clear that neither DIR /a-l nor get-childitem -Attributes !ReparsePoint prevents recursing into a reparse point; they appear only to filter the entry itself out of the listing.
Thanks a bunch!
It appears the issue is the Reparse Points. For some reason, access is denied to reparse points (like Documents and Settings) when the command is run locally. Once the command is run remotely, both DIR and Get-ChildItem will recurse into reparse points.
Using the -Attributes !ReparsePoint switch for get-childitem and the /a-l switch for DIR does not prevent this. Instead, those switches appear only to keep the reparse point out of the listing output; they do not prevent the command from recursing into those folders.
Instead I had to write a recursive script and do the directory recursion myself. It's a little bit slower on my machine. Instead of around 20 seconds locally, it took about 1 minute. Remotely it took closer to 2 minutes.
Here is the code I used:
EDIT: With all the problems with PowerShell 2.0, PowerShell remoting, and memory usage of the original code, I had to update my code as shown below.
function RecurseFolder($path) {
    $files = @()
    $directory = @(Get-ChildItem $path -Force -ErrorAction SilentlyContinue |
        Select-Object FullName, Attributes |
        Where-Object { $_.Attributes -like "*directory*" -and $_.Attributes -notlike "*reparsepoint*" })
    foreach ($folder in $directory) { $files += @(RecurseFolder($folder.FullName)) }
    $files += @(Get-ChildItem $path -Filter "*.pst" -Force -ErrorAction SilentlyContinue |
        Where-Object { $_.Attributes -notlike "*directory*" -and $_.Attributes -notlike "*reparsepoint*" })
    $files
}
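Usage is then simply the following (for the remote case, the function definition has to be included inside the Invoke-Command script block so it exists in the remote session):
RecurseFolder 'C:\' | Select-Object -ExpandProperty FullName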
If it's a large directory structure and you just need the full path names of the files, you should be able to speed that up considerably by using the legacy dir command instead of Get-ChildItem:
invoke-command -ComputerName MACHINE -ScriptBlock {cmd /c dir c:\*.pst /s /b /a-d /a-l}
Try using a remote session:
$yourLoginName = 'loginname'
$server = 'targetserver'
$t = New-PSSession $server -Authentication CredSSP -Credential (Get-Credential $yourLoginName)
cls
"$(get-date) - Start"
$r = Invoke-Command -Session $t -ScriptBlock{[System.IO.Directory]::EnumerateFiles('c:\','*.pst','AllDirectories')}
"$(get-date) - Finish"
I faced the same problem.
It seems clear that PowerShell has trouble transferring large arrays through PSRemoting.
This little experiment demonstrates it (UPDATED):
$session = New-PSSession localhost
$arrayLen = 1024000
Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Preparing test array {0} elements length..." -f $using:arrayLen)
        $global:result = [Byte[]]::new($using:arrayLen)
        [System.Random]::new().NextBytes($result)
    }
} | ForEach-Object { Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds) }

Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Transfer array ({0})" -f $using:arrayLen)
        return $result
    } | Out-Null
} | ForEach-Object { Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds) }

Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Transfer same array nested in a single object")
        return @{ array = $result }
    }
} | ForEach-Object { Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds) }
And my output (times in seconds):
Preparing test array 1024000 elements length...
Completed in 0.0211385 sec
Transfer array (1024000)
Completed in 48.0192142 sec
Transfer same array nested in a single object
Completed in 0.0990711 sec
As you can see, the bare array takes almost a minute to transfer, while the same array nested in a single object transfers in a fraction of a second, despite its size.
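So when a remote call has to return a large array, the workaround this experiment suggests is to wrap it in a single object on the remote side and unwrap it locally. A sketch, reusing $session from above and the *.pst search from this thread:
# Wrap the big array in a hashtable so it crosses the wire as one object
$data = Invoke-Command -Session $session -ScriptBlock {
    @{ files = [System.IO.Directory]::GetFiles('C:\', '*.pst', 'AllDirectories') }
}
$files = $data.files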