With my script, I am attempting to scan a directory for a subdirectory that is automatically created each day and contains the date in its name. Once it finds yesterday's date (since I need to upload the previous day's files), it looks for another subdirectory, and then for any files that contain "JONES". Once it finds those files, it runs a foreach loop to upload them using winscp.com.
My issue is that I'm trying to use the .xml log created by WinSCP to send to a user to confirm the uploads. The problem is that the .xml file contains only the last file uploaded.
Here's my script:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
# Find Directory w/ Yesterday's Date in name
$YesterdayFolder = Get-ChildItem -Path "\\Path\to\server" | Where-Object {$_.FullName.contains($YDate)}
If ($YesterdayFolder) {
#we specify the directory where all files that we want to upload are contained
$Dir= $YesterdayFolder
#list every file whose name starts with JONES
$FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
foreach($item in ($FilesToUpload))
{
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USERNAME:PASSWORD#ftps.hostname.com:21/dropoff/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
}
} Else {
#Something Else will go here
}
I suspect the problem is that the $PutCommand line is contained within the foreach loop, so the XML log gets overwritten each time WinSCP connects and exits, but I haven't had any luck restructuring the script.
You are running WinSCP again and again, once for each file. Each run overwrites the log of the previous run.
Call WinSCP only once instead. That's also better because you avoid re-connecting for each file.
$FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") |
Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USERNAME:PASSWORD#ftps.hostname.com:21/dropoff/ -explicitssl" '
foreach($item in ($FilesToUpload))
{
$PutCommand += '"put """"' + $Item.FullName + '""""" '
}
$PutCommand += '"exit"'
Invoke-Expression $PutCommand
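If you still want the XML log to send to the user as upload confirmation, note that with a single WinSCP run the log now covers every uploaded file. A sketch of how to request it (the log path is just an example; the /xmllog switch goes before /command):
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /xmllog="C:\Logs\upload.xml" /command "open ftp://USERNAME:PASSWORD@ftps.hostname.com:21/dropoff/ -explicitssl" '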
Though all you really need to do is check the WinSCP exit code. If it is 0, everything went fine; you don't need the XML log as proof.
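For example, something along these lines right after the Invoke-Expression call (a sketch; winscp.com reports 0 on success):
# winscp.com's exit code ends up in $LASTEXITCODE
if ($LASTEXITCODE -eq 0) {
    Write-Host "All files uploaded successfully"
} else {
    Write-Warning "WinSCP exited with code $LASTEXITCODE; at least one upload failed"
}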
And even better, use the WinSCP .NET assembly from the PowerShell script, instead of driving WinSCP from the command line. It does all the error checking for you (you get an exception if anything goes wrong), and you avoid all the nasty command-line details (like escaping special symbols in credentials and file names).
try
{
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
FtpSecure = [WinSCP.FtpSecure]::Explicit
TlsHostCertificateFingerprint = "xx:xx:xx:xx:xx:xx..."
HostName = "ftps.hostname.com"
UserName = "username"
Password = "password"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files
foreach ($item in ($FilesToUpload))
{
$session.PutFiles($Item.FullName, "/dropoff/").Check()
Write-Host "Upload of $($Item.FullName) succeeded"
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host "Error: $($_.Exception.Message)"
exit 1
}
Related
I have a script file (a batch file) which generates three files in a specific folder. Then I have a .ps1 file which copies/moves the generated files to another server/folder. Separately, everything works properly.
I'd like to know if it's possible to merge these and add a wait step between the two scripts: in other words, launch the copy/move .ps1 part only once the three files have been correctly generated.
The following assumes:
that the files are created and written in full in a single operation.
that it is the appearance of a *.zip file that signals that all files of interest have been created (though they may still be in the process of being written), as you've indicated in a later comment.
$inFolder = '.' # set to the folder of interest.
$outFolder = './out' # ditto
Write-Verbose -vb 'Waiting for a *.zip file to appear...'
while (-not (Test-Path "$inFolder/*.zip")) { Start-Sleep 1 }
# Get a list of all files.
$files = Get-ChildItem -File $inFolder
Write-Verbose -vb 'Waiting for all files to be written completely...'
$files | ForEach-Object {
do {
# Infer from the ability to obtain an exclusive lock that the file
# has been written in its entirety.
try { [IO.File]::Open($_.FullName, 'Open', 'Read', 'None').Dispose(); return }
catch { Start-Sleep 1 }
} while ($true)
}
# Move the files elsewhere
Write-Verbose -vb 'Moving...'
$files | Move-Item -Destination $outFolder -WhatIf
Note: The -WhatIf common parameter in the last command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
param(
[String]$sourceDirectory = "c:\tmp\001",
[String]$destDirectory = "c:\tmp\001"
)
Get-ChildItem $sourceDirectory | ? {
#this step waits until the file is no longer locked
$flag = $false
while (!$flag) {
try {
$FileStream = [System.IO.File]::Open($_.FullName,'Open','Write')
$FileStream.Close()
$FileStream.Dispose()
$flag = $true
}
catch{
Start-Sleep -Seconds 1
}
}
$true
} | Copy-Item -Destination $destDirectory
SCRIPT PURPOSE
The idea behind the script is to recursively extract the text from a large number of documents and update a field in an Azure SQL database with the extracted text. Basically, we are moving away from Windows Search of document contents to SQL full-text search to improve speed.
ISSUE
When the script encounters an issue opening a file, such as it being password-protected, it fails for every single document that follows. Here is the section of the script that processes the files:
foreach ($list in (Get-ChildItem ( join-path $PSScriptRoot "\FileLists\*" ) -include *.txt )) {
## Word object
$word = New-Object -ComObject word.application
$word.Visible = $false
$saveFormat = [Enum]::Parse([Microsoft.Office.Interop.Word.WdSaveFormat], "wdFormatText")
$word.DisplayAlerts = 0
Write-Output ""
Write-Output "################# Parsing $list"
Write-Output ""
$query = "INSERT INTO tmp_CachedText (tCachedText, tOID)
VALUES "
foreach ($file in (Get-Content $list)) {
if ($file -like "*-*" -and $file -notlike "*~*") {
Write-Output "Processing: $($file)"
Try {
$doc = $word.Documents.OpenNoRepairDialog($file, $false, $false, $false, "ttt")
if ($doc) {
$fileName = [io.path]::GetFileNameWithoutExtension($file)
$fileName = $filename + ".txt"
$doc.SaveAs("$env:TEMP\$fileName", [ref]$saveFormat)
$doc.Close()
$4ID = $fileName.split('-')[-1].replace(' ', '').replace(".txt", "")
$text = Get-Content -raw "$env:TEMP\$fileName"
$text = $text.replace("'", "''")
$query += "
('$text', $4ID),"
Remove-Item -Force "$env:TEMP\$fileName"
<# Upload to azure #>
$query = $query.Substring(0,$query.Length-1)
$query += ";"
Invoke-Sqlcmd @params -Query $Query -ErrorAction "SilentlyContinue"
$query = "INSERT INTO tmp_CachedText (tCachedText, tOID)
VALUES "
}
}
Catch {
Write-Host "$($file) failed to process" -ForegroundColor RED;
continue
}
}
}
Remove-Item -Force $list.FullName
Write-Output ""
Write-Output "Uploading to azure"
Write-Output ""
<# Upload to azure #>
Invoke-Sqlcmd @params -Query $setQuery -ErrorAction "SilentlyContinue"
$word.Quit()
TASKKILL /F /IM WINWORD.EXE
}
Basically, it parses through a folder of .txt files that each contain a set number of document paths, builds a T-SQL INSERT statement, and runs it against an Azure SQL database after each file is fully parsed. The files are generated with the following:
if (!($continue)) {
if ($pdf){
$files = (Get-ChildItem -force -recurse $documentFolder -include *.pdf).fullname
}
else {
$files = (Get-ChildItem -force -recurse $documentFolder -include *.doc, *.docx).fullname
}
$files | Out-File (Join-Path $PSScriptRoot "\documents.txt")
$i=0; Get-Content $documentFile -ReadCount $interval | %{$i++; $_ | Out-File (Join-Path $PSScriptRoot "\FileLists\documents_$i.txt")}
}
The $interval variable defines how many files are extracted for each given upload to Azure. Initially I had the Word object created outside the loop and not closed until the very end. Unfortunately this doesn't work: every time the script hits a file it cannot open, every file that follows also fails, until it reaches the end of the inner foreach loop foreach ($file in (Get-Content $list)).
This means that to get the expected outcome I have to run this with an interval of 1, which takes far too long.
This is a shot in the dark, but to me it sounds like it is failing because the Word COM object is prompting you for some action since it cannot open the file, so all following items in the loop also fail. This might explain why it works when you set $interval to 1: in that case the COM object is closed and reopened every time, which takes forever (I have done this with Excel).
What you can do is, in your catch statement, close the Word COM object and open a new one, which should let you continue with the loop (though it will be a bit slower if it needs to recreate the COM object a lot).
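For example, the catch block in the question's script could be extended roughly like this (a sketch; the exact cleanup calls are my assumption and untested against your documents):
Catch {
    Write-Host "$($file) failed to process" -ForegroundColor RED
    # Assumption: discard the possibly hung Word instance and start a fresh one
    try { $word.Quit() } catch { }
    $word = New-Object -ComObject word.application
    $word.Visible = $false
    $word.DisplayAlerts = 0
    continue
}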
If you want to debug the problem even more, set the Com object to be visible, and slowly loop through your program without interacting with Word. This will show you what is happening with Word and if there are any prompts that are causing the application to hang.
Of course, if you want to run it at full speed, you will need to detect beforehand which documents you can't open, or you could multithread it by opening several Word COM objects, which would allow you to load several documents at a time.
As for...
ISSUE
When the script encounters an issue opening a file, such as it being password-protected, it fails for every single document that follows.
... then test for this as noted here...
How to check if a word file has a password?
$filename = "C:\path\to\your.doc"
$wd = New-Object -COM "Word.Application"
try {
$doc = $wd.Documents.Open($filename, $null, $null, $null, "")
} catch {
Write-Host "$filename is password-protected!"
}
... and skip the file to avoid the failure of the remaining files.
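A sketch of how that could be folded into the question's loop (Test-WordProtected is a hypothetical helper name; the deliberately wrong password trick comes from the linked answer):
function Test-WordProtected {
    param([string]$Path, $WordApp)
    try {
        # Opening with a bogus password only throws for password-protected files
        $doc = $WordApp.Documents.Open($Path, $null, $true, $null, "deliberately-wrong")
        $doc.Close($false)
        return $false
    }
    catch {
        return $true
    }
}
# Inside the inner foreach, before OpenNoRepairDialog:
# if (Test-WordProtected $file $word) { Write-Host "Skipping protected file: $file"; continue }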
I am receiving the following error
Get-PnPFile : The WriteObject and WriteError methods cannot be called from outside the overrides of the BeginProcessing, ProcessRecord, and EndProcessing methods, and they can only be called from within the same thread. Validate that the cmdlet makes these calls correctly, or contact Microsoft Customer Support Services.
when running this PowerShell script:
$cred = Get-Credential;
$webUrl = "https://...sharepoint.com";
$listUrl = "..";
$destination = "C:\\Folder1"
Connect-PnPOnline -Url $webUrl -Credentials $cred
$web = Get-PnPWeb
$list = Get-PNPList -Identity $listUrl
function ProcessFolder($folderUrl, $destinationFolder) {
$folder = Get-PnPFolder -RelativeUrl $folderUrl
$tempfiles = Get-PnPProperty -ClientObject $folder -Property Files
if (!(Test-Path -Path $destinationfolder)) {
$dest = New-Item $destinationfolder -Type Directory
}
$total = $folder.Files.Count
for ($i = 0; $i -lt $total; $i++) {
$file = $folder.Files[$i]
Get-PnPFile -ServerRelativeUrl $file.ServerRelativeUrl -Path $destinationfolder -FileName $file.Name -AsFile
}
}
function ProcessSubFolders($folders, $currentPath) {
foreach ($folder in $folders) {
$tempurls = Get-PnPProperty -ClientObject $folder -Property ServerRelativeUrl
# Avoid Forms folders
if ($folder.Name -ne "Forms") {
$targetFolder = $currentPath +"\"+ $folder.Name;
ProcessFolder $folder.ServerRelativeUrl.Substring($web.ServerRelativeUrl.Length) $targetFolder
$tempfolders = Get-PnPProperty -ClientObject $folder -Property Folders
ProcessSubFolders $tempfolders $targetFolder
}
}
}
# Download root files
ProcessFolder $listUrl $destination + "\"
# Download files in folders
$tempfolders = Get-PnPProperty -ClientObject $list.RootFolder -Property Folders
ProcessSubFolders $tempfolders $destination + "\"
This script works as expected on a Win10 PC but not on a Win Server. Can anybody tell me what the reason could be please?
After trying this again this morning, the above script works on both a Windows 10 PC and Windows Server.
Alexandra
This may help people looking for that exact exception, in some cases. The exception does not seem to explain what was actually happening, so here's some background for people to see if the situation is similar to mine:
It was generated when I was using PowerShell and Get-PnPFile to get files in bulk from a SharePoint site. After some head scratching, it seemed to be related to the file already being present on the local system from a previous download. Since the script was being used to keep a local copy of certain folders on the SharePoint site, I was trying to overwrite the files locally if the server files were newer (a bit like a crude rsync).
It appears Get-PnPFile does not clobber (silently overwrite) the file if it already exists, but generates this exception. A synopsis of the code that ran without error follows:
Try {
$local_path = ($local_dir + $local_file)
#test for the local file first and rename it
if( Test-Path ( $local_path ) -PathType Leaf ){
Rename-Item $local_path ($local_path + '.old' )
}
#get the file
Get-PnpFile -ServerRelativeUrl $file.ServerRelativeUrl -Path $local_dir -FileName $local_file -AsFile
#remove the old file
if( Test-Path ( $local_path ) -PathType Leaf ){
Remove-Item ($local_path + '.old' )
}
}
Catch {
#handle error and continue
}
Essentially: test for the local file first and rename it, download the file, then delete the renamed local file. I wrapped it in a try/catch block so that if the download failed for a file I could continue with the loop, and later I could search for any file names with a .old suffix, recover the most recent local copies for those failures, and make a note of them.
When the script was re-run, it would no longer see the remote file locally (since the names wouldn't match), so it would fetch a fresh copy, and it would clear away the .old file (from a previous run) if it succeeded in getting the file.
I didn't see the exception again.
Hope that helps somebody.
P.S. If any of your files already end in .old, strange/unknown things WILL happen with the simple method above... you have been warned.
Update
I'll update here whenever I see a new cause of this Exception
Update
It appears "[" or "]" square brackets in the file name may also cause this. Perhaps in the URL needs urlencoding or escaping....
I'm using PowerShell for the first time to download the previous day's files from a web page for a client. The web page is from a data logger that is on a vendor skid. The data logger always saves the files in the format yyMMdd##.CSV, where ## is the sequential file number for that given day (starting at 00). When viewing the web page I have only ever seen a maximum of one CSV file for a given day (so 8/31/17's file would be 17083100.CSV). I have the PowerShell code written to give me yesterday's file, assuming that 00 is the only file for that day, but I was hoping I could either use a wildcard or a for loop to download any additional files that may exist for the previous day. See the code below for what I currently have:
$a = "http://10.109.120.101/logs/Log1/"
$b = (get-date).AddDays(-1).ToString("yyMMdd") + "00.CSV"
$c = "C:\"
$url = "$a$b"
$WebClient = New-Object net.webclient
$path = "$c$b"
$WebClient.DownloadFile($url, $path)
Try something like this:
$Date=(get-date).AddDays(-1).ToString("yyMMdd")
$URLFormat ='http://10.109.120.101/logs/Log1/{0}{1:D2}.CSV'
$WebClient = New-Object net.webclient
#build destination path
$PathDest="C:\Temp\$Date"
New-Item -Path $PathDest -ItemType Directory -ErrorAction SilentlyContinue
0..99 | %{
$Path="$PathDest\{0:D2}.CSV" -f $_
$URL=$URLFormat -f $Date, $_
try
{
Write-Host ("Try to download '{0}' file to '{1}'" -f $URL, $Path)
$WebClient.DownloadFile($URL, $Path)
}
catch
{
}
}
$WebClient.Dispose()
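Since the question says the file numbers are sequential starting at 00, you could optionally stop probing at the first miss instead of trying every number. A sketch (it assumes there are never gaps in the numbering):
foreach ($n in 0..99) {
    $URL  = $URLFormat -f $Date, $n
    $Path = "$PathDest\{0:D2}.CSV" -f $n
    try   { $WebClient.DownloadFile($URL, $Path) }
    catch { break }   # first missing number: assume no more files exist for that day
}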
I am still very new to this. I have, for example, one script to back up some folders by zipping them and copying them to a newly created folder.
Now I want to know whether the zip-and-copy process was successful. By successful I mean that my computer zipped and copied the folders; I don't want to check the content, so I assume the script picked the right folders and zipped them.
Here is my script :
$backupversion = "1.65"
# declare variables for zip
$folder = "C:\com\services" , "C:\com\www"
$destPath = "C:\com\backup\$backupversion\"
# Create Folder for the zipped services
New-Item -ItemType directory -Path "$destPath"
#Define zip function
function create-7zip{
param([String] $folder,
[String] $destinationFilePath)
write-host $folder $destinationFilePath
[string]$pathToZipExe = "C:\Program Files (x86)\7-Zip\7zG.exe";
[Array]$arguments = "a", "-tzip", "$destinationFilePath", "$folder";
& $pathToZipExe $arguments;
}
Get-ChildItem $folder | ? { $_.PSIsContainer} | % {
write-host $_.BaseName $_.Name;
$dest= [System.String]::Concat($destPath,$_.Name,".zip");
(create-7zip $_.FullName $dest)
}
Now I can either check whether a newly created folder exists in the parent folder (by timestamp), or I can check whether there are zip files in the subfolders I created.
Which way would you suggest? I probably only know these ways, but there are a million ways to do this.
What's your idea? The only rule is that PowerShell should be used.
Thanks in advance.
You could try using the Try and Catch method by wrapping the (create-7zip $_.FullName $dest) with a try and then catch any errors:
Try{ (create-7zip $_.FullName $dest) }
Catch{ Write-Host $error[0] }
This will try the create-7zip function and write any errors that may occur to the shell.
One thing that can be tried is checking the $? variable for the status of the command.
$? stores the status of the last command run.
So for
create-7zip $_.FullName $dest
If you then echo out $? you will see either true or false.
Another option is the $error variable
You can also combine these in all sorts of ways (or with exception handling).
For example, run your command
foreach-object {
create-7zip $_.FullName $dest
if (!$?) { "$($_.FullName) $($error[0])" | Out-File Errors.txt -Append }
}
That script is more pseudocode for ideas than working code, but it should at least get you close to using it!
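For instance, a rough sketch that combines these ideas (it assumes 7-Zip's documented convention that exit code 0 means success; ZipReport.txt is just an example file name):
Get-ChildItem $folder | ? { $_.PSIsContainer } | % {
    $dest = [System.String]::Concat($destPath, $_.Name, ".zip")
    create-7zip $_.FullName $dest
    # $LASTEXITCODE holds the exit code of the 7-Zip executable run inside create-7zip
    if ($LASTEXITCODE -eq 0 -and (Test-Path $dest)) {
        "$($_.FullName) zipped OK -> $dest" | Out-File ZipReport.txt -Append
    }
    else {
        "$($_.FullName) FAILED (7-Zip exit code $LASTEXITCODE)" | Out-File ZipReport.txt -Append
    }
}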