Referencing this posting, what might be the difference between running this on Windows 7 Professional and Server 2008 R2?
The Server 2008 R2 machine is running PowerShell version 2.0, whereas my Windows 7 machine is running 1.0.
Would that be enough to cause the following issue? The script works correctly until the final pass through the directory, at which point it throws the error shown below, over and over again:
Here is the PS script from that posting:
$dir_files = "C:\path\to\my\files"
$new = "newFile.xml"
$chkFile = "C:\path\to\my\files\newFile.xml"
$fileExists = Test-Path $chkFile
while ($true) {
    $xmlFiles = gci $dir_files -include pre_name_*.xml -recurse
    if ($xmlFiles.Length -eq 0) {
        Write-Host "No files in "$dir_Files". Shutting down."
        break
    }
    foreach ($file in $xmlFiles) {
        [string]$strfile = $file
        if (Test-Path $chkFile) {
            Write-Host $chkFile + " file exists! Sleeping 20 seconds ..."
            Start-Sleep -s 20
        } else {
            if (Test-Path $strfile) {
                Write-Host $chkFile " doesn't exist, renaming next file in array..."
                rename-item -path $file -newname ("$new")
                Start-Sleep -s 10
            } else {
                Write-Host $file " does not exist. Last file processed. Exiting..."
                break
            }
        }
    }
}
The gist of this is to run through a list of files in a directory and, once a specific type is found, rename it. That renamed file gets processed and deleted, and then the next one needs to be renamed, until none are left. Then it will quit/break.
The script works fine on the Windows 7 machine, but not so much on the Server 2008 R2 machine. Both are 64-bit.
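For context, one behavior that commonly bites scripts like this on PowerShell 2.0 and earlier: when gci matches exactly one file, it returns a single FileInfo object rather than an array, so .Length is the file's size in bytes instead of a count. A minimal illustration (not from the posting):
# With exactly one match, $xmlFiles is a FileInfo, not an array:
$xmlFiles = gci $dir_files -include pre_name_*.xml -recurse
$xmlFiles.Length   # the single file's size in bytes, not 1
@($xmlFiles).Count # wraps in an array, so this is always the match count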
Per request, fleshing this out more fully, taken from the original posting. Sorry, I tried to keep the original request short but that caused more confusion:
Task: A directory will have various .csv files dumped into it. These could be, for example, file1.csv, file2.csv, etc.
Problem: Iterate through the files and rename one to a standardized 'newfilename.csv'. This file will then be processed by an external program, renamed, and moved out of the directory.
This does not run 24/7. It will run as a scheduled task until there are no more files in the directory. No subdirectories.
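Since it will run as a scheduled task, registration might look something like the following (a hedged sketch; the task name, interval, and script path are placeholders, not from the original posting):
schtasks /Create /TN "ProcessCsvFiles" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\scripts\process-files.ps1" /SC HOURLY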
UPDATE/FINAL CODE:
$dir_files = "C:\path\to\my\files"
$new = "newFile.xml"
$chkFile = "C:\path\to\my\files\newFile.xml"
#https://stackoverflow.com/questions/20922997/powershell-in-windows-7-versus-server-2008-differences
#changed Include to Filter to improve performance
$xmlFiles = gci $dir_files -Filter pre_name*.xml -recurse
if (@($xmlFiles).Count -eq 0) {
    Write-Host "No files in '$dir_files'. Shutting down."
    break
}
foreach ($file in $xmlFiles) {
    #wait until last file is removed (processed by external app)
    while (Test-Path $chkFile) {
        Write-Host "$chkFile file exists! Sleeping 20 seconds ..."
        Start-Sleep -s 20
    }
    #continue with the next file
    [string]$strfile = $file
    if (($strFile) -and (Test-Path $strFile)) {
        Write-Host "$chkFile doesn't exist, renaming next file in array, sleeping 10 seconds ..."
        Rename-Item -Path $file -NewName $new
        Start-Sleep -s 10
    }
}
Is this what you were after? I've cleaned up/rewritten the code (untested), but as I don't completely understand what you're trying to do, it may be wrong.
$dir_files = "C:\path\to\my\files"
$new = "newFile.xml"
$chkFile = "C:\path\to\my\files\newFile.xml"
$fileExists = Test-Path $chkFile
#Changed Include to Filter to improve performance.
$xmlFiles = Get-ChildItem -Path $dir_files -Filter pre_name_*.xml
if (@($xmlFiles).Count -eq 0) {
    Write-Host "No files in '$dir_files'. Shutting down."
    break
}
foreach ($file in $xmlFiles) {
    #Wait until last file is removed (processed by external app)
    while (Test-Path $chkFile) {
        Write-Host "$chkFile file exists! Sleeping 20 seconds ..."
        Start-Sleep -s 20
    }
    #Continue with next file
    Rename-Item -Path $file -NewName $new
    Start-Sleep -s 10
}
To find out what caused the error in your original script, run it in PowerShell ISE and set a breakpoint at if (Test-Path $strfile){ so you can inspect the value of $strfile on each pass and see what happens to it, because, as the error says, $strfile suddenly becomes blank (the Path property is empty).
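If you prefer to do this from the console rather than the ISE UI, Set-PSBreakpoint can set the same breakpoint (a sketch; the script path and line number are placeholders):
# Break when the Test-Path line is about to run, or whenever $strfile is written:
Set-PSBreakpoint -Script C:\scripts\rename.ps1 -Line 17
Set-PSBreakpoint -Script C:\scripts\rename.ps1 -Variable strfile -Mode Write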
Related
We need to move .csv files from the folder where they are stored down to an external server using PowerShell.
This is what I've tried so far, but for some reason I only get the "Not copying" message and the names of the files:
$DestinationFolder = "C:\d1\"
$SourceFolder = "C:\s1\"
If (-not (Test-Path $DestinationFolder)) {
    New-Item -ItemType Directory -Force -Path $DestinationFolder
}
$EarliestModifiedTime = (Get-Date).AddMinutes(200).Date # gets current time + adds 30 min
$LatestModifiedTime = (Get-Date).Date
echo($DestinationFolder); # will check time
Get-ChildItem "C:\s1\*.*" |
ForEach-Object {
    if (($_.CreationTime -ge $EarliestModifiedTime) -and ($_.CreationTime -lt $LatestModifiedTime)) { # i.e., all day yesterday
        Copy-Item $_ -Destination $DestinationFolder -Force
        Write-Host "Copied $_"
    }
    else {
        Write-Host "Not copying $_"
    }
}
Does it work if you simplify it and just try a test (e.g. pick one file rather than trying to run for a delta of every 30 mins / last day)? Just thinking that first you need to see whether the problem is with accessing (or the formatting of) your source/destination directories, or with the delta logic itself. Perhaps some more error conditions would help; see the sketch below.
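For what it's worth, (Get-Date).AddMinutes(200).Date truncates to midnight, so $EarliestModifiedTime and $LatestModifiedTime usually end up equal and the copy condition can never be true. If the intent of the inline comment is "all day yesterday", the window might look like this (a sketch, not a confirmed fix):
# Midnight today and midnight yesterday bound "all day yesterday":
$LatestModifiedTime = (Get-Date).Date
$EarliestModifiedTime = $LatestModifiedTime.AddDays(-1)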
I have a script file (batch file) which generates three files in a specific folder. Then I have a .ps1 file which copies/moves the generated files to another server/folder. Separately, everything works properly.
I'd like to merge these, if possible, with a wait between the two scripts: the copy/move .ps1 should only launch once the three files have been correctly generated.
The following assumes:
that the files are created and written in full in a single operation.
that it is the appearance of a *.zip file that signals that all files of interest have been created (though they may still be in the process of being written to), as you've indicated in a later comment.
$inFolder = '.'      # set to the folder of interest.
$outFolder = './out' # ditto
Write-Verbose -vb 'Waiting for a *.zip file to appear...'
while (-not (Test-Path "$inFolder/*.zip")) { Start-Sleep 1 }
# Get a list of all files.
$files = Get-ChildItem -File $inFolder
Write-Verbose -vb 'Waiting for all files to be written completely...'
$files | ForEach-Object {
    do {
        # Infer from the ability to obtain an exclusive lock that the file
        # has been written in its entirety.
        try { [IO.File]::Open($_.FullName, 'Open', 'Read', 'None').Dispose(); return }
        catch { Start-Sleep 1 }
    } while ($true)
}
# Move the files elsewhere
Write-Verbose -vb 'Moving...'
$files | Move-Item -Destination $outFolder -WhatIf
Note: The -WhatIf common parameter in the last command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
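The exclusive-lock probe can also be factored into a small helper if you reuse it elsewhere (a sketch along the same lines; the function name is mine, not a built-in):
function Test-FileComplete {
    param([string]$Path)
    # True if the file can be opened with no sharing, i.e. no other writer holds it.
    try {
        [IO.File]::Open($Path, 'Open', 'Read', 'None').Dispose()
        $true
    }
    catch {
        $false
    }
}
# e.g.: $files | Where-Object { Test-FileComplete $_.FullName }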
param(
    [String]$sourceDirectory = "c:\tmp\001",
    [String]$destDirectory = "c:\tmp\001"
)
Get-ChildItem $sourceDirectory | ? {
    # This step waits until the file's lock is freed.
    [bool]$flag = $false
    while (!$flag) {
        try {
            $FileStream = [System.IO.File]::Open($_.FullName, 'Open', 'Write')
            $FileStream.Close()
            $FileStream.Dispose()
            $flag = $true
        }
        catch {
            Start-Sleep -s 1
        }
    }
    $true
} | Copy-Item -Destination $destDirectory
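Example invocation, assuming the above is saved as a script (the script name and the distinct destination path are mine, not from the answer):
.\Copy-WhenUnlocked.ps1 -sourceDirectory 'c:\tmp\001' -destDirectory 'c:\tmp\002'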
I need to write a script to automate moving all of the IIS log files from a system that has been running for months. This system consists of many web servers.
I have been successful at getting the log directory moved to the new location, and my script then moves the other log files from the old directory to the new location, with the exception of the current file. It cannot be moved because there is a file in the destination with the same name (the current log file). I want to rename it, but do not want to rename all files in the directory. I know I could use a wildcard, but would prefer to rename only that file.
What command can I use to find the name of the file, not the directory or path? I have pieced this together from other, smaller requests I found on here and on the web.
Import-Module WebAdministration
$LogPath = "e:\Logs\IISlogs"
foreach ($WebSite in $(Get-Website))
{
    $logFile = "$($Website.logFile.directory)\w3svc$($website.id)".replace("%SystemDrive%", $env:SystemDrive)
}
New-Item $LogPath -ItemType Directory
Set-ItemProperty "IIS:\Sites\Default Web Site" -name logFile.directory -value $LogPath
$path = $logpath + "\W3SVC1"
$Timeout = 60
$timer = [Diagnostics.Stopwatch]::StartNew()
while (($timer.Elapsed.TotalSeconds -lt $Timeout) -and (-not (Test-Path -Path $path)))
{
    Start-Sleep -Seconds 1
    $tot = $timer.Elapsed.Seconds
    Write-Output ("Still waiting for action to complete after " + $tot + " seconds up to " + $Timeout)
}
$timer.Stop()
Move "$($logfile)\*.*" $path
All files have a time and date stamp. Filter on the file name, sort by the time stamp descending, and select the first one.
For example:
$dir = "e:\Logs\IISlogs"
($latest = Get-ChildItem -Path $dir |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 1)
"Current log file details: $($latest.name)"
Get-Content $latest
If you are saying the logs can be of a different name, then you need to specify the wildcard match for that in the path specification.
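For example, if the logs follow the default IIS naming scheme, the wildcard might look like this (a sketch; adjust the filter to your actual log names):
# IIS names its log files u_ex<date>.log by default:
$latest = Get-ChildItem -Path $dir -Filter 'u_ex*.log' |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 1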
I have a PowerShell script that moves files from a source directory over to a target directory every 15 minutes. Files of around 1 MB are placed into the source directory by an SFTP server, so the files can be written at any time by the SFTP clients.
The Move-Item command is moving files, however it seems that it's moving them without making sure the file isn't still being written (in use?).
I need some help coming up with a way to move the files from the source to the target while making sure the entire file gets there. Has anyone run across this issue before with PowerShell?
I searched and was able to find a few functions that claimed to solve the problem, but when I tried them out I wasn't seeing the same results.
Existing PowerShell script is below:
Move-Item "E:\SFTP_Server\UserFolder\*.*" "H:\TargetFolder\" -Verbose -Force *>&1 | Out-File -FilePath E:\Powershell_Scripts\LOGS\MoveFilesToTarget-$(get-date -f yyyy-MM-dd-HH-mm-ss).txt
I ended up cobbling together a few things and got this working as I wanted. Basically I'm looping through the files and checking the length of each file once... then waiting a second and checking the length again to see if it's changed. This seems to be working well. Here's a copy of the script in case it helps anyone in the future!
$logfile = "H:\WriteTest\LogFile_$(get-date -format `"yyyyMMdd_hhmmsstt`").txt"
function log($string, $color)
{
    if ($color -eq $null) { $color = "white" }
    write-host $string -foregroundcolor $color
    $string | out-file -Filepath $logfile -append
}
$SourcePath = "E:\SFTP_Server\UserFolder\"
$TargetPath = "H:\TargetFolder\"
$Stuff = Get-ChildItem "$SourcePath\*.*" | select name, fullname
ForEach ($I in $Stuff) {
    log "Starting to process $($I.name)" green
    $newfile = $TargetPath + $I.name
    $LastLength = 1
    $NewLength = (Get-Item $I.fullname).length
    while ($NewLength -ne $LastLength) {
        $LastLength = $NewLength
        Start-Sleep -Seconds 1
        log "Waiting 1 Second" green
        $NewLength = (Get-Item $I.fullname).length
        log "Current File Length = $NewLength" green
    }
    log "File Not In Use - Ready To Move!" green
    Move-Item $I.fullname $TargetPath
}
I am still very new to this, and I have, for example, one script to back up some folders by zipping and copying them to a newly created folder.
Now I want to know whether the zip and copy process was successful. By successful I mean that my computer zipped and copied it; I don't want to check the content, so I assume the script took the right folders and zipped them.
Here is my script:
$backupversion = "1.65"
# declare variables for zip
$folder = "C:\com\services", "C:\com\www"
$destPath = "C:\com\backup\$backupversion\"
# Create folder for the zipped services
New-Item -ItemType directory -Path "$destPath"
# Define zip function
function create-7zip {
    param([String] $folder,
          [String] $destinationFilePath)
    write-host $folder $destinationFilePath
    [string]$pathToZipExe = "C:\Program Files (x86)\7-Zip\7zG.exe";
    [Array]$arguments = "a", "-tzip", "$destinationFilePath", "$folder";
    & $pathToZipExe $arguments;
}
Get-ChildItem $folder | ? { $_.PSIsContainer } | % {
    write-host $_.BaseName $_.Name;
    $dest = [System.String]::Concat($destPath, $_.Name, ".zip");
    (create-7zip $_.FullName $dest)
}
Now I can either check whether the parent folder contains a newly created folder, by time, or check whether the zip files I created are in my subfolders.
Which way would you suggest? Those are probably just the ways I know; there are a million ways to do this. What's your idea? The only rule is that PowerShell should be used.
Thanks in advance.
You could try using Try and Catch by wrapping (create-7zip $_.FullName $dest) with a try, then catching any errors:
Try { (create-7zip $_.FullName $dest) }
Catch { Write-Host $error[0] }
This will try the function create-7zip and write any errors that may occur to the shell.
One thing that can be tried is checking the $? variable for the status of the command.
$? stores the success status of the last command run. So for:
create-7zip $_.FullName $dest
if you then echo out $?, you will see either True or False.
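One caveat worth noting: create-7zip invokes a native executable, and for native commands the exit code also lands in the $LASTEXITCODE automatic variable, which can be more informative than $? alone. A small sketch:
create-7zip $_.FullName $dest
# 7-Zip returns 0 on success; any non-zero exit code indicates a problem.
if ($LASTEXITCODE -ne 0) {
    Write-Host "7-Zip failed with exit code $LASTEXITCODE"
}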
Another option is the $error variable.
You can also combine these in all sorts of ways (or with the exception handling).
For example, run your command
foreach-object {
    create-7zip $_.FullName $dest
    if (!$?) { "$($_.FullName): $($error[0])" | out-file Errors.txt -Append }
}
That script is more a sketch of ideas than tested code, but it should at least get you close!