I'm trying to output to a file that is created on the fly, but I can't seem to get either approach to work. Here's that portion of my code:
New-Item -Path $LogPath -Name $InfoLog -Type File -Force
New-Item -Path $LogPath -Name $ErrorLog -Type File -Force
"Script started at: $DateStamp_${TimeStamp}" | $InfoLog
I've also tried just ">>" instead of the pipe. The script runs fine; it just doesn't write the output into the file. Instead it writes it out to a file called "0" in the directory the script ran from.
Three things.
First, New-Item outputs the item it creates, so unless you do something with that output, the new file objects end up in the pipeline. I think you want:
New-Item -Path $LogPath -Name $InfoLog -Type File -Force | Out-Null
Second, since you're not specifying the path to the file you want to write to, PowerShell will assume the current location.
Finally, if you want to write output to a log file, you probably want to use Out-File. Perhaps something like this:
$infoLogPath = Join-Path $LogPath $InfoLog
"Script started at: ${DateStamp}_${TimeStamp}" | Out-File $infoLogPath
Join-Path combines the directory and filename into a fully-qualified path name. Note the braces in ${DateStamp}_${TimeStamp}: without them, PowerShell parses $DateStamp_ as a single variable name, because the underscore is a valid character in variable names.
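Worth noting: Out-File overwrites the target by default, so every log line after the first needs -Append. A minimal sketch of the whole pattern, using a temp folder in place of the question's $LogPath and $InfoLog values:

```powershell
# Hypothetical values standing in for the variables from the question
$LogPath = Join-Path ([System.IO.Path]::GetTempPath()) 'demo-logs'
$InfoLog = 'info.log'

New-Item -Path $LogPath -ItemType Directory -Force | Out-Null
New-Item -Path $LogPath -Name $InfoLog -ItemType File -Force | Out-Null

# Join-Path builds the fully-qualified file name
$infoLogPath = Join-Path $LogPath $InfoLog

"Script started" | Out-File $infoLogPath          # first write (overwrites)
"Still running"  | Out-File $infoLogPath -Append  # later writes must append
```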
I encountered something weird that I do not understand. My scenario:
I have multiple .ps1 files in C:\Functions. I would like to copy the content of those files into one file (AllFunctions.ps1). The file CopyFunctions2AllFunctions.ps1 is the file that executes my commands.
$path="C:\Functions\*.ps1"
$destination="C:\Functions\AllFunctions.ps1"
Clear-Content -Path C:\Functions\AllFunctions.ps1
Get-Content -Path $path -Exclude "C:\Functions\CopyFunctions2AllFunctions.ps1" | Add-Content -Path $destination
The error message is in German; however, it says AllFunctions.ps1 cannot be accessed because it is being used by another process.
The code works if I replace
$path="C:\Functions\*.ps1"
with a specific file name like
$path="C:\Functions\Read-Date.ps1"
-Force didn't help.
Also, the code worked up until Add-Content -Path $destination. When I executed just the Get-Content... part, the terminal showed me not only what was inside the .ps1 files, but also the content of the terminal, with all the errors I encountered while trying...
Does someone have an idea?
There are two things to fix in this code. First, the new code:
$path="C:\Functions"
$destination="C:\Functions\AllFunctions.ps1"
Clear-Content -Path C:\Functions\AllFunctions.ps1
$functions=Get-ChildItem -Path $path -Exclude CopyFunctions2AllFunctions.ps1 | Get-Content
Add-Content -Path $destination -Value $functions
Issue #1
$path="C:\Functions\*.ps1" doesn't work because the wildcard also matches AllFunctions.ps1 itself - the -Exclude pattern is compared against file names, not full paths, so nothing was actually excluded. That is why the destination's own content (including everything appended during earlier attempts, errors and all) got read back in. Therefore, we don't put the wildcard in $path.
Because of that, we use Get-ChildItem to enumerate the files, as follows:
$functions=Get-ChildItem -Path $path -Exclude CopyFunctions2AllFunctions.ps1 | Get-Content
Issue #2
Working with pipes, PowerShell streams the items one at a time, so Add-Content starts appending to AllFunctions.ps1 while Get-Content still holds the matched source files - the destination itself among them - open for reading. That is what triggers the "being used by another process" error.
Therefore, we save the output of Get-Content in a variable ($functions) first, so all reading has finished before Add-Content opens the destination.
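The same buffering can also be written inline: wrapping the reading side of the pipeline in parentheses forces Get-Content to finish before Add-Content opens the destination. A minimal self-contained sketch, using a temp folder instead of C:\Functions:

```powershell
# Demo in a temp folder rather than C:\Functions
$dir = Join-Path ([System.IO.Path]::GetTempPath()) 'func-demo'
New-Item -Path $dir -ItemType Directory -Force | Out-Null
Set-Content (Join-Path $dir 'a.ps1') 'function A {}'
Set-Content (Join-Path $dir 'b.ps1') 'function B {}'
$destination = Join-Path $dir 'AllFunctions.ps1'
Remove-Item $destination -ErrorAction SilentlyContinue

# The parentheses make PowerShell evaluate the whole read first,
# so no source file is still open when Add-Content starts writing.
(Get-ChildItem -Path $dir -Exclude 'AllFunctions.ps1' | Get-Content) |
    Add-Content -Path $destination
```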
I have a simple task: get a file from one path and copy it to another using a PowerShell script. Now, I understand that with Get-ChildItem, if -Path is not specified, it uses the current directory. I'm using this code, setting the directory the file is in, but it is still using the current directory:
$file = Get-ChildItem -Path "\\w102xnk172\c$\inetpub\wwwroot\PC_REPORTS\exemplo\DCT\Files\STC" | Sort-Object LastWriteTime | Select-Object -Last 1
Copy-Item $file -Destination "\\Brhnx3kfs01.vcp.amer.dell.com\brh_shipping$\DEMAND_MONITOR\"
cmd /c pause | out-null
It returns this error when it pauses:
This script follows the rules in the documentation of both Get-ChildItem and Copy-Item. Am I missing something? Why is it still using the current directory? I tried multiple syntax variations: with and without quotes around the paths, omitting -Path and -Destination, putting the file directly in the Copy-Item call without using the $file variable...
Copy-Item expects a [string] as the first positional argument, so it attempts to convert $file to a string - which yields just the name of the file, and a bare file name is then resolved against the current directory.
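You can see the conversion for yourself. Note that what a FileInfo stringifies to differs between Windows PowerShell (typically just the name) and PowerShell 7+ (the full path), which is exactly why $file.FullName is the safe choice; report.txt below is just a throwaway demo file:

```powershell
# Demo of the conversion: create a file object, then look at its string form.
$tmp  = [System.IO.Path]::GetTempPath()
$file = New-Item -Path $tmp -Name 'report.txt' -ItemType File -Force

"$file"        # may be just 'report.txt' (Windows PowerShell) or the full path (PowerShell 7+)
$file.FullName # always the full, unambiguous path - safest for Copy-Item
```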
Either reference the FullName property value (the full path) of the file:
Copy-Item $file.FullName -Destination "\\Brhnx3kfs01.vcp.amer.dell.com\brh_shipping$\DEMAND_MONITOR\"
Or pipe the $file object to Copy-Item and let pipeline binding do its magic for you:
$file |Copy-Item -Destination "\\Brhnx3kfs01.vcp.amer.dell.com\brh_shipping$\DEMAND_MONITOR\"
If you want to see for yourself how this is processed internally by PowerShell, use Trace-Command:
Trace-Command -Name ParameterBinding -Expression {Copy-Item $file -Destination "\\Brhnx3kfs01.vcp.amer.dell.com\brh_shipping$\DEMAND_MONITOR\"} -PSHost
I would like to zip a path (with a Windows service running inside).
When the service is stopped it works perfectly; when the service is running, I get this exception:
The process cannot access the file because it is being used by another
process.
However, when I zip with 7-zip, I don't have any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath "[DEST_PATH]" -Force
Do you have any idea to perform the task without this exception?
Copy-Item allows you to access files that are being used by another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included; there is no need to use -Recurse in the first line to achieve this.
A good method to access files being used by another process is to create a snapshot using the Volume Shadow Copy Service (VSS).
To do so, you can simply use PowerShell's WMI cmdlets:
$Path = "C:\my\used\folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
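One caveat: snapshots created this way persist until they are deleted, so it is worth cleaning up once the archive has been written. A sketch, assuming $shadowCopy still holds the Win32_ShadowCopy object from the code above (this requires the same admin rights as creating the snapshot):

```powershell
# Clean up: delete the snapshot once we are done with it.
# Remove-WmiObject deletes the underlying Win32_ShadowCopy instance.
$shadowCopy | Remove-WmiObject
```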
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed up files, or to compress them without those Access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
I had a similar requirement where only a few file extensions needed to be added to the zip.
With this approach, we copy all the files - including locked ones - to a temp location, zip them, and then delete the copied logs.
This is a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles'+ $filedate +'.zip'
New-Item -Path "c:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force
I want to move the file "file_to_move.txt" in each folder to its respective "done" folder.
so the file_to_move.txt in C:\Temp\test\folder1 is moved to C:\Temp\test\folder1\done
and file_to_move.txt in C:\Temp\test\folder2 is moved to C:\Temp\test\folder2\done
...and so on, preferably with a %date%_%time% added to the file name.
If a folder (like folder4 in the example below) does not have a file_to_move.txt, the script should just ignore it and move on.
folder structure example:
C:\Temp\test\DONE
C:\Temp\test\folder1
C:\Temp\test\folder1\done
C:\Temp\test\folder1\some_other_folder
C:\Temp\test\folder1\some_other_file.txt
C:\Temp\test\folder1\file_to_move.txt
C:\Temp\test\folder2
C:\Temp\test\folder2\done
C:\Temp\test\folder2\some_other_folder
C:\Temp\test\folder2\some_other_file.txt
C:\Temp\test\folder2\file_to_move.txt
C:\Temp\test\folder3
C:\Temp\test\folder3\done
C:\Temp\test\folder3\some_other_folder
C:\Temp\test\folder3\some_other_file.txt
C:\Temp\test\folder3\file_to_move.txt
C:\Temp\test\folder4
C:\Temp\test\folder4\done
C:\Temp\test\folder4\some_other_folder
C:\Temp\test\folder4\some_other_file.txt
I have experimented with a PowerShell script even though I'm not very good at it, and I don't know if it can be done in a standard batch script.
I have tried this so far:
In a batch-script:
SET ThisScriptsDirectory=%~dp0
SET PowerShellScriptPath=%ThisScriptsDirectory%bin\movescript.ps1
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& '%PowerShellScriptPath%'"
in the movescript.ps1:
Move-Item C:\Temp\test\*\file_to_move.txt C:\Temp\test\*\done\file_to_move_$(get-date -f yyyyMMdd_HHmmss).txt
But this is not working.
I guess it's not precise enough to work.
As a bonus, can the whole thing be done within the batch script itself, or must we use the external .ps1 file?
You can use the Get-ChildItem cmdlet with a filter to retrieve all file_to_move.txt files recursively from a path, use ForEach-Object (alias foreach) to iterate over them, and combine the new path using the Join-Path cmdlet. To copy each item, you can use the Copy-Item cmdlet:
$itemsToCopy = Get-ChildItem -Path c:\Temp\Test -Filter file_to_move.txt -Recurse
$itemsToCopy | foreach {
    $newPath = Join-Path $_.DirectoryName 'done'
    New-Item -Path $newPath -ItemType Directory -Force | Out-Null
    $_ | Copy-Item -Destination $newPath
}
If you want to add a Timestamp, you could use the Get-Date cmdlet and invoke the ToString method with your desired format on it, example:
(Get-Date).ToString("yyyy-MM-dd_HH-mm-ss")
Output:
2016-05-04_15-06-02
You can now concatenate the new file names using a format string and the $_.BaseName and $_.Extension properties within your foreach loop. I will leave this as an exercise for you.
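As a sketch of that exercise: the -f format operator stitches base name, timestamp, and extension together. The temp file below is just a stand-in for the $_ item inside the loop:

```powershell
# Self-contained demo: create a sample file, then build its timestamped name.
$sample  = New-Item -Path ([System.IO.Path]::GetTempPath()) -Name 'file_to_move.txt' -ItemType File -Force
$stamp   = (Get-Date).ToString('yyyy-MM-dd_HH-mm-ss')
$newName = '{0}_{1}{2}' -f $sample.BaseName, $stamp, $sample.Extension
$newName # e.g. file_to_move_2016-05-04_15-06-02.txt
```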
The code below does not work, but doing the two steps separately on two lines works.
Move-Item file.txt \same-directory | Set-Content -Path \same-directoy -Value "New content"
The Move-Item cmdlet without the -PassThru switch returns nothing, so nothing gets sent through the pipe to the Set-Content cmdlet.
If you really want to do this as a one-liner, use:
Move-Item -Path 'D:\file.txt' -Destination 'D:\some-directory' -PassThru | Set-Content -Value "New content"
Since we're now actually piping the moved object, the -Path parameter for Set-Content needs to be omitted - pipeline binding supplies it.
Of course, the destination folder D:\some-directory needs to exist.
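If that isn't guaranteed, the folder can be created up front; New-Item -Force is harmless when the directory already exists. A self-contained sketch, using temp paths in place of D:\file.txt and D:\some-directory:

```powershell
# Temp paths stand in for D:\file.txt and D:\some-directory from the answer.
$root = [System.IO.Path]::GetTempPath()
$dest = Join-Path $root 'some-directory'

# -Force makes this a no-op if the folder is already there
New-Item -ItemType Directory -Path $dest -Force | Out-Null

$src = New-Item -Path $root -Name 'file.txt' -ItemType File -Force
# Move, then rewrite the moved file's content in one pipeline
$src | Move-Item -Destination $dest -Force -PassThru | Set-Content -Value 'New content'
```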