Script to copy files never ends - PowerShell

I have a script that copies files from a set of folders matching a specific string to a location. The script runs fine for the first server, but on the second it just never ends. I have a Write-Host step, so I know when the first server is done. There are files written from the second server, but it never reaches the second Write-Host command. Any thoughts? [For context: the folders I want to search match "cust"+$AppType+"*" and the files are $LogDate_$LogHour.log - these steps work as expected, and logs are found and copied for the CUST001 server.]
Script is below:
$AppType = Read-Host "What App Server do you need calls from? (MCP,RM,SIP,URS,ORS,WFM,WEB,GIR) [can also use * for all servers]"
$LogDate = Read-Host "What date was the call? (yyyymmdd)"
$LogHour = Read-Host "What hour was the call? (hh) [can also use * for entire day]"
$UserName = Read-Host "What is your UserID? (files will be placed on Host SAMPLE2\c:\users\*USERID*\Desktop)"
$SrchString = Read-Host "What string are you searching for? (can use connid or ANI for example)"
$LogName = "*"+$LogDate+"_"+$LogHour+"*.log"
$Folder = "*cust"+$AppType+"*"
Get-ChildItem \\customer001\e$\Logs\$Folder -Filter $LogName -Recurse | Select-String $SrchString | Copy-Item -Destination \\SAMPLE2\c$\users\$UserName\Desktop -Force
Write-Host "Finished copying 001 files...."
Get-ChildItem \\customer002\e$\Logs\$Folder -Filter $LogName -Recurse | Select-String $SrchString | Copy-Item -Destination \\SAMPLE2\c$\users\$UserName\Desktop -Force
Write-Host "Finished copying 002 files...."
Exit

Related

Powershell Get-ChildItem returns nothing for folder

A friend of mine asked me to write a short script for him. The script should check a specific folder, find all files and subfolders older than X days, and remove them. Simple so far; I wrote the script, successfully tested it on my own system, and sent it to him. Here's the thing - it doesn't work on his system. To be more specific, the Get-ChildItem cmdlet does not return anything for the provided path, but it gets even weirder; more on that later. I'm using the following code to first find the files and folders (and log them before deleting them later on):
$Folder = "D:\Data\Drive_B$\General\ExchangeFolder"
$CurrentDate = Get-Date
$TimeSpan = "-1"
$DatetoDelete = $CurrentDate.AddDays($TimeSpan)
$FilesInFolder = (Get-ChildItem -Path $Folder -Recurse -ErrorAction SilentlyContinue | Where-Object {$_.LastWriteTime -lt $DatetoDelete})
All variables are filled, and we both know that the folder is filled to the brim with files and subfolders older than one day, which is the timespan we chose for the test. Now, the interesting part is that not only does Get-ChildItem return nothing - going to the folder itself and typing "dir" does not return anything either. I've never seen behaviour like this. I've checked everything I could think of: DFS, typos, folder permissions, share permissions, hidden files, ExecutionPolicy. Everything is as it should be for this script to work properly, as it did on my own system when I initially tested it. The script does not return any errors whatsoever.
So for some reason, the content of the folder cannot be found by powershell. Does anyone know of a reason why this could be happening? I'm at a loss here :-/
Thanks for your time & help,
Fred
.AddDays() takes a double, so I would pass it a number rather than a string.
Filter first, then act.
This code should work for you:
$folder = Read-Host -Prompt 'File path'
$datetodel = (Get-Date).AddDays(-1)
$results = try {
    Get-ChildItem -Path $folder -Recurse |
        Select-Object FullName, LastWriteTime |
        Where-Object { $_.LastWriteTime -lt $datetodel }
} catch { $_ }
$info = "{0} files older than: {1} deleting ...." -f $results.Count, $datetodel
Add-Type -AssemblyName System.Windows.Forms
if ($results | Out-GridView -PassThru) {
    [System.Windows.Forms.MessageBox]::Show($info)
    # Put your code here for the removal of the files
    # $results | ForEach-Object { Remove-Item $_.FullName -Force }
} else {
    [System.Windows.Forms.MessageBox]::Show("Exiting")
}

Attempting to Filter out All .p12 AND .pfx files from a given Directory

My organization requires the filtering, and removal of all .PFX and .P12 files from our computers and servers. The script we are currently running every week does not go deep enough or far enough per higher guidance. What I'm trying to do is take my current working script, and filter for both file extensions. The person who wrote the script is all but gone from this world so I'm working on a script I didn't write but I'm trying to become familiar with.
I've already tried changing some of the variables inside the Get-ChildItem cmdlet to apply the filtering there instead of in a variable. This includes attempts like:
$Files = Get-ChildItem -Path \\$client\c$\Users -Filter -filter {(Description -eq "school") -or (Description -eq "college")} -Recurse -ErrorAction SilentlyContinue
Here is a portion of the Code, not the entire thing. There is logging and notes and other administrative tasks that are done other than this, I've only included the portion of the code that is creating errors.
$computers = Get-ADComputer -Filter * -SearchBase "AD OU PATH OMITTED"
$destination = "****\Software\aPatching_Tools\Log Files\Soft Cert\Workstations\AUG 19\WEEK 2"
$ext = "*.pfx"
foreach ($computer in $computers)
{
$client = $computer.name
if (Test-Connection -ComputerName $client -Count 1 -ErrorAction SilentlyContinue)
{
$outputdir = "$destination\$client"
$filerepo = "$outputdir\Files"
$files = Get-ChildItem -Path \\$client\c$\Users -Filter $ext -Recurse -ErrorAction SilentlyContinue
if (!$files)
{
Write-Host -ForegroundColor Green "There are no .pfx files on $client."
}
else
{
Write-Host -ForegroundColor Cyan "PFX files found on $client"
The expected and normal operation of the script is that it goes through each machine, tests it, moves on if it's offline, or if it's online, there is a 4-5 minute pause while it searches and moves on.
The error I get when I make a change such as $ext = "*.p12", ".pfx" is that -Filter does not support this operation. And if I try the above-mentioned change to the filtering, the script takes 1-2 seconds per machine; with, at times, 15-20 users in the C:\Users folder, it's nearly impossible to have searched that fast over the network.
Instead of passing your extensions as the -filter, pass them using -include - that is, $files = Get-ChildItem -Path \\$client\c$\Users\* -include $ext -Recurse -ErrorAction SilentlyContinue. You can then define $ext as an array of strings, e.g., $ext = "*.pfx","*.p12", and Get-ChildItem will return only those files with the indicated extensions.
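A minimal sketch of that suggestion, assuming `$client` already holds a reachable computer name:

```powershell
# -Include accepts an array of wildcard patterns; -Filter does not.
# Note the trailing \* on -Path, which gives -Include items to match against.
$ext   = "*.pfx", "*.p12"
$files = Get-ChildItem -Path \\$client\c$\Users\* -Include $ext -Recurse -ErrorAction SilentlyContinue
```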

Capture File Paths with Out-File in Powershell

I've created a script that I'm using to clean up some drives at my work. I've been asked to create a log and leave it in the source folder after I move files.
Currently, the process is slow because my script creates a text file of files that meet the parameters I input. Once I have verified it, or edited the text file, I allow the script to read the file and keep doing what it needs to do. I'm creating this original text file with Out-File. Obviously, the paths of the files change, because I'm moving them from one drive to another. I'd like to log their new paths but can't seem to figure out how to do this.
The file the script creates and reads from looks like the following:
C:\This\Is\The\Source\Something.rpt
C:\This\Is\The\Source\Somethingelse.bak
C:\This\Is\The\Source\AnotherFile.jpg
I'm looking to create something that will reflect the new path once the files are moved. In the different ways I've tried, I either end up with nothing or with just the last file copied, which tells me Out-File is not appending but overwriting each time it gets a new file path.
And the list will just go on. The following is the bit of my script I'm having an issue with:
$path = Read-Host "What path should I look at?"
$SourceFolder = $path
$files = Get-ChildItem $path -Recurse |
    Where-Object {$_.LastWriteTime.Date -eq $targetdate} |
    Where-Object {$_.PSIsContainer -eq $false} |
    ForEach-Object {$_.FullName} |
    Out-File $OutFileCopy
$Target = Read-Host "What is the destination?"
Write-Host "Please view the text file created." -foregroundcolor "yellow" -backgroundcolor "red"
Invoke-Item $OutFileCopy
$CopyContinueAnswer = Read-Host "Would you like to continue? Y or N?"
If ($CopyContinueAnswer -eq 'Y') {
$Readfile = Get-Content $outfilecopy
$ReadFile | Where-Object {$_.PSIsContainer -eq $false}
foreach ($file in $ReadFile) {
$logfile = "$Sourcefolder\log.txt"
out-file $logfile
Write-Host "The old path is $File"
$TargetPath = $File.Replace($SourceFolder, $Target)
if (!(Test-Path $TargetPath)){
Write-Host "This is the new path $TargetPath" -foregroundcolor "Magenta"
Write-Host
Copy-Item -path $file -Destination $TargetPath
Write-output $TargetPath | out-file $logfile
}
Out-File by default will overwrite an existing file. If you do not want this to happen, use Out-File -Append. I recommend looking at the Microsoft documentation for Out-File; you can find it by typing Get-Help Out-File at any PowerShell prompt.
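A tiny illustration of the difference (the log path here is hypothetical):

```powershell
$logfile = "C:\Temp\log.txt"
"first path"  | Out-File $logfile          # creates or overwrites the file
"second path" | Out-File $logfile -Append  # adds a line, keeping "first path"
Get-Content $logfile                       # now shows both lines
```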
The file in which I want to record the new file paths was being created inside the foreach loop. Every time it stepped to the next file, it would recreate the log.txt file and erase what was there previously. Now that I've taken it out of the loop, I no longer have the issue of nothing being recorded, or of only the last file that went through the loop being recorded.
I'll add a portion to the script that checks whether log.txt already exists before it tries to create one.

Powershell WinSCP won't move it

Here is my script:
$Path = "G:\FTP\APX\APXDropoff\Pay"
$Archive = "G:\FTP\APX\APXDropoff\Pay\Archive"
#$BankOfTulsa = "H:\Documentation\Projects\PJ\BankOfTulsa"
#$compareDate = (Get-Date).AddDays(-1)
$LastFile = Get-ChildItem $Path -Recurse | Where{$_.Name -Match "^CPdb(\d{6})(\d{8}).txt"}; $LastFile
CP $LastFile $Archive
#Call WinSCP Navigate to Incoming\Temp folder for test.
# & 'C:\Program Files (x86)\WinSCP\WinSCP.com' /command "option batch abort" "option confirm off" "open sftp:BankOfTulsa/" "put $LastFile /incoming/arp/"
So here's my issue: I am using a regex to find the file. CP moves it just fine, but when I go to upload it with WinSCP, it says the file doesn't exist.
And it calls it by name, so the variable is there...
Authenticating with pre-entered password.
Authenticated.
Starting the session...
Reading remote directory...
Session started.
Active session: [1] BankOfAmerica
File or folder 'CPdb08131408252014.TXT' does not exist.
System Error. Code: 2.
The system cannot find the file specified
(A)bort, (R)etry, (S)kip, Ski(p) all: Abort
Please help!!
I would think your issue lies in the fact that $LastFile does not contain the full path of the file you are trying to upload. I would suggest you use the .FullName property of $LastFile, since you have it from the Get-ChildItem cmdlet:
"put $($LastFile.FullName) /incoming/arp/"
Also, please refrain from using aliases where you can, as some people might not know that CP is an alias for Copy-Item.
Afterthought
$LastFile has the potential to match more than one file. If that is the case, it could make a mess of the rest of the script.
From your comment, you can do the following:
Get-ChildItem $Path -Recurse | Where{$_.Name -Match "^CPdb(\d{6})(\d{8}).txt"} |
Sort-Object LastWriteTime -Descending | Select-Object -First 1
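Putting both suggestions together, a sketch of the full fetch-and-upload step (the WinSCP options are the ones already commented out in the question; I've also anchored the regex and escaped the dot, which the original pattern did not do):

```powershell
# Take only the newest matching file, so $LastFile is a single item.
$LastFile = Get-ChildItem $Path -Recurse |
    Where-Object { $_.Name -match "^CPdb(\d{6})(\d{8})\.txt$" } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
Copy-Item $LastFile.FullName $Archive
# Pass the full path, not just the name, to WinSCP.
& 'C:\Program Files (x86)\WinSCP\WinSCP.com' /command "option batch abort" "option confirm off" `
    "open sftp:BankOfTulsa/" "put $($LastFile.FullName) /incoming/arp/"
```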

Copying all .dll files from one virtual server to another with Powershell

I'm trying to copy all the .dll files from C:\windows in a virtual server to a new virtual server. I've managed to get all the .dll files, however I can't find a way to copy them to the new virtual server and was wondering if anyone might know how to do this with Powershell.
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')
$server = [Microsoft.VisualBasic.Interaction]::InputBox("Enter the server name with files you want to copy", "Server")
$server2 = [Microsoft.VisualBasic.Interaction]::InputBox("Enter the server name you want files copied to", "Server")
$destinationName = ("\\" + $server2 + '\C$\windows')
$Files = Get-ChildItem -Path ("\\" + $server + '\C$\windows') -recurse | Where {$_.extension -eq ".dll"}
What would I have to do with my $Files variable to copy it to a new VM? I know of the copy-item cmdlet, but unaware of how to use it to move all of this to a new virtual server.
EDIT:
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')
$server = [Microsoft.VisualBasic.Interaction]::InputBox("Enter the server name with files you want to copy", "Server")
$server2 = [Microsoft.VisualBasic.Interaction]::InputBox("Enter the server name you want files copied to", "Server")
$destinationName = ("\\" + $server2 + '\C$\windows')
$Files = Get-ChildItem -Path ("\\" + $server + '\C$\windows') -recurse | Where {$_.extension -eq ".dll"}
foreach($dll in $Files){
$destinationName +=
cp $dll.fullname $destinationName}
I want the path string to be "\\$server2\C$\windows\ ..\ .. " for each specific file.
At the moment, if the code runs, it makes every file/directory appear as "\\$server2\C$\windows" and does not keep the full path.
You're really nearly there, actually.
$Files = Get-ChildItem... makes $Files an array of items, and because PowerShell was designed to work with objects, you would just use $Files as an argument to Copy-Item. The caveat is that, for whatever reason, Copy-Item doesn't use the full path property of the objects obtained with Get-ChildItem (it just gets the filenames, so you'd have to be in that directory for it to work), so the easiest way would be this:
foreach($dll in $Files){cp $dll.fullname $destinationName}
To copy while preserving directory structure, you want to take the starting full path and just modify it to reflect the new root directory/server. This can be done in one line similar to the above, but for clarity and readability, I'm expanding it into the following multi-line setup:
foreach($dll in $Files){
$target = $dll.fullname -replace "\\\\$server","\\$server2"
$targetDir = $($dll.directory.fullname -replace "\\\\$server","\\$server2")
if(!(Test-Path $targetDir)){
mkdir $targetDir
}
cp $dll.fullname $target
}
To explain, the $target... line takes the full path of the current $dll, say \\SourceServer\C$\windows\some\rather\deep\file.dll, and regex-replaces the \\SourceServer part of the path with \\TargetServer, leaving the rest of the path intact. That is, it will now be \\TargetServer\C$\windows\some\rather\deep\file.dll. This method eliminates the need for your $destinationName variable.
The Test-Path bit makes sure that the parent folder of the file exists remotely before copying; otherwise the copy will fail.
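To see the replacement in isolation (server names are hypothetical):

```powershell
$server  = "SourceServer"
$server2 = "TargetServer"
$dllPath = "\\SourceServer\C$\windows\some\rather\deep\file.dll"
# Regex replace: the doubled backslashes in the pattern escape the literal \\ prefix.
$dllPath -replace "\\\\$server", "\\$server2"
# -> \\TargetServer\C$\windows\some\rather\deep\file.dll
```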