PowerShell WinSCP won't move it

here is my script:
$Path = "G:\FTP\APX\APXDropoff\Pay"
$Archive = "G:\FTP\APX\APXDropoff\Pay\Archive"
#$BankOfTulsa = "H:\Documentation\Projects\PJ\BankOfTulsa"
#$compareDate = (Get-Date).AddDays(-1)
$LastFile = Get-ChildItem $Path -Recurse | Where{$_.Name -Match "^CPdb(\d{6})(\d{8}).txt"}; $LastFile
CP $LastFile $Archive
#Call WinSCP Navigate to Incoming\Temp folder for test.
# & 'C:\Program Files (x86)\WinSCP\WinSCP.com' /command "option batch abort" "option confirm off" "open sftp:BankOfTulsa/" "put $LastFile /incoming/arp/"
So here's my issue: I am using a regex to find the file, and the CP (Copy-Item) moves it just fine, but when I go to upload with WinSCP it says the file doesn't exist.
And it calls it by name, so the variable is there...
Authenticating with pre-entered password.
Authenticated.
Starting the session...
Reading remote directory...
Session started.
Active session: [1] BankOfAmerica
File or folder 'CPdb08131408252014.TXT' does not exist.
System Error. Code: 2.
The system cannot find the file specified
(A)bort, (R)etry, (S)kip, Ski(p) all: Abort
Please help!!

I would think that your issue lies in the fact that $LastFile does not contain the full path of the file you are trying to upload. I would suggest you use the .FullName property of $LastFile, since you already have it from the Get-ChildItem cmdlet.
"put $($LastFile.FullName) /incoming/arp/"
Also, please refrain from using aliases where you can, as some people might not know that CP is an alias for Copy-Item.
Afterthought
$LastFile has the potential to match more than one file. If that happens, it could make a mess of the rest of the script.
From your comment you can do the following:
Get-ChildItem $Path -Recurse | Where{$_.Name -Match "^CPdb(\d{6})(\d{8}).txt"} |
Sort-Object LastWriteTime -Descending | Select-Object -First 1
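Putting the pieces together, a minimal sketch of the whole upload script might look like this; the sftp:BankOfTulsa/ session string and the /incoming/arp/ target come straight from the commented-out line in the question, and the trailing "exit" command is my addition so WinSCP.com closes once the transfer finishes:
$Path = "G:\FTP\APX\APXDropoff\Pay"
$Archive = "G:\FTP\APX\APXDropoff\Pay\Archive"
# Pick only the newest file that matches the naming pattern
$LastFile = Get-ChildItem $Path -Recurse |
    Where-Object { $_.Name -match "^CPdb(\d{6})(\d{8})\.txt" } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
# Archive a copy, then hand WinSCP the full path so it can find the file on disk
Copy-Item $LastFile.FullName $Archive
& 'C:\Program Files (x86)\WinSCP\WinSCP.com' /command "option batch abort" "option confirm off" "open sftp:BankOfTulsa/" "put $($LastFile.FullName) /incoming/arp/" "exit"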


"PowerShell on Target Machines" task fails with an error in TFS 2017\Azure Dev Ops

I am trying to run a PowerShell script present on one of the Azure servers using the "PowerShell on Target Machines" task in my TFS build definition, but the task fails with the below error.
System.Management.Automation.RuntimeException: The running command
stopped because the preference variable "ErrorActionPreference" or
common parameter is set to Stop: The specified path, file name, or
both are too long. The fully qualified file name must be less than 260
characters, and the directory name must be less than 248 characters.
--->
I have copied the script to the F: drive, but it still gives the path-too-long error, and I am not able to find any solution for it.
Does anyone know what the reason could be?
Added the script code as well for reference:
GetLatestDebugOutput.ps1
$DebugBuildOutput = "F:\Drops\econNextGen\SecurityScan\19.0"
$Dest = "F:\Drops\econNextGen\SecurityScan\Debug Build Output"
Remove-Item "$Dest\*" -Recurse -Force
#Code to Copy Common-App Debug Build
$Dir= $DebugBuildOutput + "\econNextGen-Common-App-Debug\"
$Latest = Get-ChildItem -Path $Dir | Sort-Object LastAccessTime -Descending | Select-Object -First 1
$FolderPath= $Dir +$Latest.Name
Copy-Item -Path $FolderPath -Destination $Dest -Recurse -Force
#Code to Copy Main-App Debug Build
$Dir= $DebugBuildOutput + "\econNextGen-MAIN-APP-Debug\"
$Latest = Get-ChildItem -Path $Dir | Sort-Object LastAccessTime -Descending | Select-Object -First 1
$FolderPath= $Dir +$Latest.Name
Copy-Item -Path $FolderPath -Destination $Dest -Recurse -Force
First, I suggest you RDP directly to the remote target machine and check whether you can run the same script there. This will narrow down whether the issue is related to your TFS build definition and environment.
For the environment, make sure you have met all prerequisites of the PowerShell on Target Machines task and that a supported PowerShell version is installed.
Actually, the error message is pretty straightforward, and so is the key point you should pay attention to: make sure you are not using paths that are too long or otherwise invalid. This applies to all folders and files.
Besides, try starting the build with diagnostics/debug enabled (system.debug=true) and see if you get any meaningful output for further troubleshooting.
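If you want to check the drop folders for offending paths up front, a small sketch like the following can list anything approaching the legacy limits quoted in the error (260 characters for a full file name, 248 for a directory); the root path is only an example taken from the script above:
$root = "F:\Drops\econNextGen\SecurityScan"   # example root, adjust as needed
Get-ChildItem -Path $root -Recurse -ErrorAction SilentlyContinue |
    ForEach-Object {
        # Directories must stay under 248 characters, files under 260
        $limit = if ($_.PSIsContainer) { 248 } else { 260 }
        if ($_.FullName.Length -ge $limit) {
            [pscustomobject]@{ Length = $_.FullName.Length; Path = $_.FullName }
        }
    } |
    Sort-Object Length -Descending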

How to prevent PowerShell -Recurse from renaming first file twice?

When using PowerShell to rename files with their directory name and file name, my code works, except that the first file in each directory gets two copies of the directory name. So the file book1.xlsx in folder folder1 should become folder1_book1.xlsx, but it becomes folder1_folder1_book1.xlsx. The remaining files in folder1 are correctly named folder1_book2.xlsx, folder1_book3.xlsx, etc.
I have a directory, with many sub-directories. In each sub-dir are files that need their sub-dir name added in.
I've been following this code. For me it looks like:
dir -Filter *.xlsx -Recurse | Rename-Item -NewName {$_.Directory.Name + "_" + $_.Name}
I've also tried
--setting the Recurse -Depth 1 so that it doesn't keep looking for folders in the sub-folders.
--using ForEach-Object {$_ | ... after the pipe, similar to this.
--running it in Visual Studio Code rather than directly in PowerShell, which turns it into:
Get-ChildItem "C:\my\dir\here" -Filter *.xls -Recurse | Rename-Item -NewName {$_.DirectoryName + '_' + $_.Name}
--putting an empty folder inside the sub-directory, setting -Depth 2 to see if that will "catch" the recurse loop
I would expect the files to be named folder1_book1.xlsx, folder1_book2.xlsx, folder1_book3.xlsx.
But all of the attempted changes above give the same result. The first file is named folder1_folder1_book1.xlsx [INCORRECT], folder1_book2.xlsx[CORRECT], folder1_book3.xlsx[CORRECT].
A workaround might be writing an if statement for "not files that contain the sub-directory name" as suggested here. But the link searches for a text string, not an object (probably not the correct term) like $_.Directory.Name. This post shows how to concatenate objects but not something like $_.Directory.Name. Having to put in an if statement seems like an unnecessary step if -Recurse worked the way it should, so I'm not sure this workaround gets at the heart of the issue.
I'm running windows 10 with bootcamp on a 2018 iMac (I'm in Windows a lot because I use ArcMap). Powershell 5.1.17134.858. Visual Studio Code 1.38.0. This is a task I would like to learn how to use more in the future, so explanations will help. I'm new to using PowerShell. Thanks in advance!
This was a script I created for one of my customers that may help
<##################################################################################################################################
This script can be used to search through folders to rename files from their
original name to "filename_foldername.extension". To use this script
please configure the items listed below.
Items to Configure
-$Original
-$Source
-$Destination
-$Files
Also please change the Out-File date on line 29 to today's date ****Example: 2019-10-02****
We've also added a change log file that is named "FileChange.txt" and can be found in the location identified on line 30
#>
$Original="C:\temp\test" #Location of ".cab" files copied
$Source="C:\temp\Test" #Location were ".cab" files are stored
$Destination="C:\temp\Test\2019-10-02" #Location were you want to copy ".cab" files after the file name change. Be sure to change the date to the date you run this script. The script creates a folder with todays date
$Files=#("*.cab") #Choose the file type you want to search for
$ErrorActionPreference = "SilentlyContinue" #Suppress Errors
Get-ChildItem $Original -Include "*.cab" -File -Recurse | Rename-Item -NewName {$_.BaseName+"_"+$_.Directory.Name +'.cab'}
New-Item -ItemType Directory -Path ".\$((Get-Date).ToString('yyyy-MM-dd'))"; Get-ChildItem -recurse ($Source) -include ($Files) | Copy-Item -Destination ($Destination) -EA SilentlyContinue
Get-ChildItem $Original | Where {$_.LastWriteTime -ge [datetime]::Now.AddMinutes(-10)} | Out-File C:\temp\test\2019-10-02\FileChange.txt
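As a side note on the double rename in the original question: a common explanation is that Rename-Item changes files while Get-ChildItem -Recurse is still enumerating, so the first renamed file can be picked up a second time. A minimal sketch of a fix, assuming the folder layout from the question, is to finish the enumeration before renaming (the parentheses do that) and optionally guard against already-prefixed names:
# Collect all matches first so renamed files are not re-enumerated mid-pipeline
(Get-ChildItem "C:\my\dir\here" -Filter *.xlsx -Recurse -File) |
    Where-Object { $_.Name -notlike "$($_.Directory.Name)_*" } |
    Rename-Item -NewName { $_.Directory.Name + "_" + $_.Name }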

Create PS script to find files

I want to start by saying that coding is a bit outside of my skill set, but because a certain problem keeps appearing at work, I'm trying to automate a solution.
I use the script below to read an input file for a list of names, search C:\ for those files, then write the path to an output file if any are found.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        gci -Path "C:\" -Recurse -Filter $line -ErrorAction SilentlyContinue |
            Out-File -Append c:\temp\ResultsFindFile.txt
    }
}
I would like to make two modifications to this. First, search all drives connected to the computer, not just C:\. Second, be able to delete any found files. I'm using the Remove-Item -Confirm command, but so far I can't make it delete the file it just found.
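A minimal sketch of one way to do both (my own illustration, reusing the input and output files from the question): enumerate every drive exposed by the FileSystem provider instead of hard-coding C:\, log each hit, and then remove it. The -WhatIf switch is there deliberately so nothing is deleted until you swap it for -Confirm or drop it.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    Get-PSDrive -PSProvider FileSystem | ForEach-Object {
        # Search every local drive, not just C:\
        Get-ChildItem -Path $_.Root -Recurse -Filter $line -File -ErrorAction SilentlyContinue
    } | ForEach-Object {
        $_.FullName | Out-File -Append C:\temp\ResultsFindFile.txt   # record where it was found
        Remove-Item -LiteralPath $_.FullName -WhatIf                 # replace -WhatIf with -Confirm to actually delete
    }
}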

The specified network name is no longer available in powershell

I have a list of share paths in a text file. I am trying to read the files and folders in each path and export them to a CSV file using a PowerShell script, but some of the CSV files come out as 0 KB.
So I tried to test the existence of those network paths using the Test-Path command. A few paths show as existing, but when I try to list the directories of an existing path with Dir against the share path, I get the error "The specified network name is no longer available". Why?
Sharing the code below:
foreach ($dir in (Get-Content $infile)) {
    $outfilecsv = 'jerin-Download' + '.csv'
    Get-ChildItem -Path $dir -Filter *.* -Recurse |
        Select-Object Name,
            @{Name="Owner";Expression={(Get-Acl $_.FullName).Owner}},
            CreationTime,
            @{Name="FileModifiedDate";Expression={$_.LastWriteTime}},
            @{Name="FileAccessedDate";Expression={$_.LastAccessTime}},
            @{Name="Attributes";Expression={$_.Attributes}},
            @{l='ParentPath';e={Split-Path $_.FullName}},
            @{Name="DormantFor(days)";Expression={[int]((Get-Date)-$_.LastWriteTime).TotalDays}},
            @{N="FileCategory";E={Get-FileSizeCategory($_)}},
            @{Name="Size";Expression={if($_.PSIsContainer -eq $True){(New-Object -com Scripting.FileSystemObject).GetFolder($_.FullName).Size} else {$_.Length}}} |
        Export-Csv -Path $outfilecsv -Encoding ascii -NoTypeInformation
}
Can anyone suggest what might be going wrong?
Thanks,
Jerin
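A defensive sketch of my own (not from the original post, and assuming the same $infile variable): re-check each path with Test-Path just before enumerating it and wrap the enumeration in try/catch, so a share that drops mid-run is logged instead of silently producing an empty CSV.
foreach ($dir in (Get-Content $infile)) {
    if (-not (Test-Path -LiteralPath $dir)) {
        Write-Warning "Skipping unreachable path: $dir"
        continue
    }
    try {
        # -ErrorAction Stop turns network drop-outs into catchable exceptions
        Get-ChildItem -Path $dir -Recurse -ErrorAction Stop |
            Select-Object Name, FullName, LastWriteTime |
            Export-Csv -Path 'jerin-Download.csv' -Append -Encoding ascii -NoTypeInformation
    }
    catch {
        Write-Warning "Failed while reading '$dir': $($_.Exception.Message)"
    }
}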

Powershell script to locate specific file/file name?

I wanted to write a small script that searched for an exact file name, not a string within a file name.
For instance if I search for 'hosts' using Explorer, I get multiple results by default. With the script I want ONLY the name I specify. I'm assuming that it's possible?
I had only really started the script and it's only for my personal use so it's not important, it's just bugging me. I have several drives so I started with 2 inputs, one to query drive letter and another to specify file name. I can search by extension, file size etc but can't seem to pin the search down to an exact name.
Any help would be appreciated!
EDIT: Thanks for all the responses. Just to update: I added one of the answers to my tiny script and it works well. All three responses worked, but I could only use one ultimately, no offence to the other two. Cheers. Just to clarify, 'npp' is an alias for Notepad++ to open the file once found.
$drv = Read-Host "Choose drive"
$fname = Read-Host "File name"
$req = dir -Path $drv -Recurse | Where-Object { !$_.PSIsContainer -and [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -eq $fname }
Set-Location $req.Directory
npp $req
From a powershell prompt, use the gci cmdlet (alias for Get-ChildItem) and -filter option:
gci -recurse -filter "hosts"
This will return an exact match to filename "hosts".
SteveMustafa points out that with current versions of PowerShell you can use the -File switch to give the following, which recursively searches for only files named "hosts" (and not directories or other miscellaneous file-system entities):
gci -recurse -filter "hosts" -File
The commands may print many red error messages like "Access to the path 'C:\Windows\Prefetch' is denied.".
If you want to avoid the error messages then set the -ErrorAction to be silent.
gci -recurse -filter "hosts" -File -ErrorAction SilentlyContinue
An additional helper is that you can set the root to search from using -Path.
The resulting command to search explicitly from, for example, the root of the C drive would be
gci -Recurse -Filter "hosts" -File -ErrorAction SilentlyContinue -Path "C:\"
Assuming you have a Z: drive mapped:
Get-ChildItem -Path "Z:" -Recurse | Where-Object { !$PsIsContainer -and [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -eq "hosts" }
I use this form for just this sort of thing:
gci . hosts -r | ? {!$_.PSIsContainer}
. maps to positional parameter Path and "hosts" maps to positional parameter Filter. I highly recommend using Filter over Include if the provider supports filtering (and the filesystem provider does). It is a good bit faster than Include.
I'm using this function based on @Murph's answer.
It searches inside the current directory and lists the full path:
function findit
{
$filename = $args[0];
gci -recurse -filter "*${filename}*" -file -ErrorAction SilentlyContinue | foreach-object {
$place_path = $_.directory
echo "${place_path}\${_}"
}
}
Example usage: findit myfile
To search the whole computer:
gdr -PSProvider 'FileSystem' | %{ ls -r $_.root} 2>$null | where { $_.name -eq "httpd.exe" }
In findFileByFilename.ps1 I have:
# https://stackoverflow.com/questions/3428044/powershell-script-to-locate-specific-file-file-name
$filename = Read-Host 'What is the filename to find?'
gci . -recurse -filter $filename -file -ErrorAction SilentlyContinue
# tested works from pwd recursively.
This works great for me. I understand it.
I put it in a folder on my PATH.
I invoke it with:
> findFileByFilename.ps1
To search the whole computer:
gdr -PSProvider 'FileSystem' | %{ ls -r $_.root} 2>$null | where { $_.name -eq "httpd.exe" }
I am pretty sure this is a much less efficient command, for many reasons, but the simplest is that you're piping everything to your Where-Object command, when you could still use -Filter "httpd.exe" and save a ton of cycles.
Also, on a lot of computers Get-PSDrive is going to grab shared drives, and I am pretty sure you wanted that to get a complete search. Most shares can be immense with regard to the sheer number of files and folders, so at the very least I would sort my drives by size, and add a check after each search to exit the loop if we locate the file. That is if you are looking for a single instance; if not, the only way to save yourself the immense time sink of searching a 10 TB share or two is to comment the command and strongly suggest that any user who needs it limit their search as much as they can. For instance, our user-profile share is 10 TB, at least the one I am on is, and I can limit my search to the directory $sharedrive\users\myname and search my 116 GB directory rather than the 10 TB one. There are too many unknowns with shares for this type of script, which is already super inefficient with regard to resources and speed.
If I was seriously considering using something like this, I would add a call to a 3rd-party package and leverage a DB.
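A minimal sketch of the approach described above (my own illustration: filesystem drives only, -Filter instead of a Where-Object over everything, and the loop exits as soon as a match turns up):
$target = "httpd.exe"
$found  = $null
# Local filesystem drives, smallest used space first, so small drives are searched before huge ones
$drives = Get-PSDrive -PSProvider FileSystem | Sort-Object Used
foreach ($drive in $drives) {
    $found = Get-ChildItem -Path $drive.Root -Filter $target -File -Recurse -ErrorAction SilentlyContinue |
        Select-Object -First 1
    if ($found) { break }   # stop after the first hit
}
$found.FullName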