I have a script that I run at work that uses Get-ChildItem to get all the files of a certain type on a storage drive, then sorts them and moves them to an archive drive. I'd like to automate this process to run once every day, but I realized I would have a problem in doing so.
Occasionally, when this script is run, a file or two will still be in the process of transferring over to our storage drive. If I let the script move a file while it is still being transferred from our customer, it gets corrupted and won't open later.
I know how to filter based on file type, date, and other basic parameters, but I'm not entirely sure how to tell the script to exclude files that are currently growing in size.
Below is what I'm currently using to filter what I want to move:
$TargetType = "*.SomeFileType"
$TargetDrive = "\\Some\UNC\Path"
Get-ChildItem "$TargetDrive\$TargetType" | ForEach-Object { $_.FullName } | Sort-Object | Out-File $outStorageMove
Also, at the moment I'm putting everything Get-ChildItem finds into a text file, which gets invoked later so that I can manually edit what I want it to move. I'd like to get rid of this step if at all possible.
A move is essentially a copy followed by a delete.
So, as gvee stated, Copy-Item is the better option: to get past your stated concern, monitor for the copy to complete. My addition would be to delete the original only once the copy is done and you have verified it.
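Putting that together, a rough sketch (untested; the archive path, the hash verification, and the in-use check are my own additions, not part of your script):

$ArchiveDrive = "\\Some\Archive\Path"   # assumed destination

Get-ChildItem "$TargetDrive\$TargetType" | ForEach-Object {
    $file = $_

    # Heuristic: if the file can't be opened exclusively, the customer's
    # transfer probably still has it open -- skip it on this run.
    try {
        $fs = [System.IO.File]::Open($file.FullName, 'Open', 'Read', 'None')
        $fs.Close()
    } catch {
        Write-Warning "Skipping $($file.Name): still in use."
        return   # inside ForEach-Object, return moves on to the next file
    }

    $dest = Join-Path $ArchiveDrive $file.Name
    Copy-Item $file.FullName -Destination $dest

    # Delete the original only after verifying the copy.
    if ((Get-FileHash $file.FullName).Hash -eq (Get-FileHash $dest).Hash) {
        Remove-Item $file.FullName
    }
}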
Or use BITS as a job to do this.
Using Windows PowerShell to Create BITS Transfer Jobs
https://msdn.microsoft.com/en-us/library/windows/desktop/ee663885(v=vs.85).aspx
You can use PowerShell cmdlets to create synchronous and asynchronous Background Intelligent Transfer Service (BITS) transfer jobs.
All of the examples in this topic use the Start-BitsTransfer cmdlet. To use the cmdlet, be sure to import the module first. To load the module, run the following command: Import-Module BitsTransfer. For more information, type Get-Help Start-BitsTransfer at the PowerShell prompt.
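For example, a minimal sketch of the synchronous usage (the paths are placeholders of mine, not from the question):

Import-Module BitsTransfer
# Synchronous by default: Start-BitsTransfer does not return until the
# transfer completes or fails, which avoids moving a file mid-transfer.
Start-BitsTransfer -Source "\\Some\UNC\Path\file.SomeFileType" -Destination "D:\Archive"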
I am trying to execute an .exe file (let's say it is called myfile.exe) with an argument file (argument.fst). Both files have the same names for each execution, but are located in different subfolders of the same parent directory.
My objective is to create a for loop in which I pinpoint the paths to both files (14 groups in total, so 14 loops) and have Windows PowerShell execute them. My goal is to automate my simulations, which are run by the .exe files plus their arguments, thus saving time.
Is this possible to implement in Windows PowerShell?
Thank you very much,
Ioannis Voultsos.
If you want to automate the process, you may store your command,args in csv file (i.e. commands.csv):
command;arguments
myapp.exe;c:/
myapp.exe;h:/
then load it and execute each entry using the & call operator:
$csv = Import-Csv commands.csv -Delimiter ';'
$csv | ForEach-Object { & $_.command $_.arguments }
Beware of executing commands from strings, coming from untrusted sources though.
Try out this sample code on the parent folder
Get-ChildItem | Where-Object { $_.PSIsContainer } | ForEach-Object { cd $_.FullName; & ".\SampleApp.exe" args0 args1; cd .. }
it will go into each directory and execute .exe in each folder with arguments.
Note: I'm using the built-in PowerShell ISE as my environment
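Applied to the layout described in the question, a hedged sketch (the parent path is an assumption; myfile.exe and argument.fst come from the question):

$parent = "C:\Simulations"   # assumed parent directory holding the 14 subfolders

Get-ChildItem $parent | Where-Object { $_.PSIsContainer } | ForEach-Object {
    Push-Location $_.FullName          # run from inside each subfolder so any
    & ".\myfile.exe" "argument.fst"    # relative paths the tool uses still resolve
    Pop-Location
}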
I've got a funny issue with dot-slash (.\) in PowerShell. All of my scripts run from a certain folder, and there are subfolders that contain data needed for them to run.
For example, my scripts are saved at c:\users\chris\posh
Most of the time, I will call input and send output to subfolders like this...
c:\users\chris\posh\inputs
c:\users\chris\posh\output
Therefore I'll have script examples that look like this for inputs and outputs:
$hbslist = Get-Content .\inputs\HBS-IP.txt
Write-Output "$($lat),$($long)" | Out-File ".\Outputs\LatLong.csv" -Append
Lately, when I run the scripts, they cannot locate my files or the EXEs I call on. That's because PowerShell is looking at P:\ instead of C:\users\chris\posh when resolving .\
PowerShell also starts in P:\ (a mapped share drive) for some reason, and I cannot figure out why my PC is behaving this way.
It might be a policy on your machine which changes your home directory. You can check the home directory with:
echo $HOME; echo "$env:HOMEDRIVE$env:HOMEPATH"
This happens often on corporate machines. If you want to set it back for your powershell environment, you can set it in your profile.ps1.
This is typically stored at:
C:\Users\<Name>\Documents\WindowsPowerShell\profile.ps1
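For example, a one-line profile.ps1 that pins every new session to the scripts folder (path taken from the question; treat this as a sketch):

# profile.ps1 -- start each session in the usual scripts folder
Set-Location C:\Users\chris\posh

A more robust fix is to anchor paths inside the scripts themselves, e.g. with $PSScriptRoot (available in PowerShell 3.0+), so they no longer depend on the session's current directory:

$hbslist = Get-Content (Join-Path $PSScriptRoot "inputs\HBS-IP.txt")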
I created a Group Policy in the Group Policy Management Editor: in the navigation pane, expand User Configuration, expand Policies, expand Windows Settings, and then click Scripts (Logon/Logoff). I made a logon script as a .ps1 file:
Copy-Item "\server1\Pictures\background.jpg" -Destination "C:\screensaver\" -Recurse
I added that ps1 file in the powershell scripts part of the group policy and set it to run powershell scripts first.
I didn't use any parameters; could that be causing the issue?
I need each computer to have that c:\screensaver\background.jpg image when they login.
It's the only group policy applied to that OU, all the PCs are Windows 10, and the domain controllers are Windows 2012 r2.
In my opinion, creating a (PowerShell) logon script just to copy a file is not a great solution and is out of date nowadays.
Make your life easier and use Group Policy Preferences for this task. You don't have to create scripts for that.
Open the Group Policy Management Console, select your policy, open the "Preferences" node and select "Files". Create a new element and select the source and the target.
After that reboot the client and the file should get copied without coding.
Sounds like there are two parts to implementing your request. Before doing any of the following, make sure that you can log in as one of the users and manually perform the steps you want the script to complete (to make sure any user restrictions aren't holding you up). So, make sure you can navigate to the remote image location \\server1\Pictures\background.jpg and copy it to the local folder C:\screensaver.
Copying the file to the target machine. You provided the contents of your PS1 file as copy-item "\server1\Pictures\background.jpg" -Destination "C:\screensaver\" -Recurse. I believe you'll want two slashes "\\" at the beginning of your \server1 string, resulting in "\\server1\Pictures\background.jpg" (the two slashes make this a valid UNC path). Additionally, you included the -Recurse parameter. I don't understand the need for this parameter, based on the documentation, accessible via the command Get-Help Copy-Item -Full.
-Recurse [<SwitchParameter>]
Indicates that this cmdlet performs a recursive copy.
I would suggest that you include the -Force parameter. If you ever update that image, the next time a user logs on, they'll receive the updated image; without -Force, the command might not overwrite the existing image on disk. Also, you shouldn't need the trailing slash on the -Destination parameter. I would suggest the command resemble:
Copy-Item "\\server1\Pictures\background.jpg" -Destination "C:\screensaver" -Force
Configuring the wallpaper via Group Policy. The first link I found via a Google search for "set windows 10 wallpaper via group policy" with anything that looked like useful steps was some grouppolicy.biz website. I haven't been able to test those steps, but the main point is that you'll need to make sure that you're actually telling the computer to use the wallpaper you've copied into place.
If you make sure that you've addressed the above items, then it should work for you. There may be some considerations for the first time a user logs in, if the image isn't copied over, then the wallpaper may not display until the second time they log in.
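As a quick sanity check on a client, you could verify both halves from PowerShell (the registry key and value names assume the standard "Desktop Wallpaper" administrative template from desktop.admx; treat them as an assumption):

# Did the logon script copy the image?
Test-Path "C:\screensaver\background.jpg"

# Did the wallpaper policy land for this user?
Get-ItemProperty "HKCU:\Software\Microsoft\Windows\CurrentVersion\Policies\System" |
    Select-Object Wallpaper, WallpaperStyle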
I am not a scripter at all. Someone else had created this script for me and it has previously worked. The only thing that has changed is the drive letter (which I did change in the script - it is currently drive E). But it is not working now. All it is supposed to do is pull back a list of files in a specified folder and save it as a text file in that directory; in this case, it's my karaoke song collection.
When I run the script now, I get:
Get-Process : A positional parameter cannot be found that accepts argument 'Get-ChildItem'.
Here is the original script:
PS C:\Users\Tina> Get-ChildItem "F:\My Music\Karaoke\*.*" | Set-Content "F:\My Music\Karaoke\test.txt"
I'd like to make it so that it just pulls back all .mp3s, if that's possible, too. Thanks in advance for your help!
Since you appear to be copying and pasting this to the command line, I will assume a paste accident caused this issue. After a couple of quick tests I couldn't replicate it exactly, but one likely culprit is the prompt itself: PS is the alias for Get-Process, so pasting the leading PS C:\Users\Tina> along with the command would produce a Get-Process error like the one you're seeing. Not being a scripter might make this harder, but I recommend saving this code to a .ps1 file so that you can run it without retyping it (note that double-clicking a .ps1 opens it in an editor by default; use right-click > Run with PowerShell).
Get-ChildItem "F:\My Music\Karaoke\*.mp3" | Set-Content "F:\My Music\Karaoke\test.txt"
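If you'd rather the text file contain just the file names, one per line, here is a small variant (an untested sketch):

Get-ChildItem "F:\My Music\Karaoke\*.mp3" |
    Select-Object -ExpandProperty Name |
    Set-Content "F:\My Music\Karaoke\test.txt"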
Warning
In order for this file to work, you have to allow PowerShell to execute it. If you run the shell as administrator once and run this code:
Set-ExecutionPolicy RemoteSigned
it will allow your script to run. Keep in mind this is a site for scripters to get help; you should expect answers like this.
I am copying files from one Windows machine to another using Copy-Item in a PowerShell script.
But I want to wait till the copy completes. PowerShell's Copy-Item is a non-blocking call, meaning it just triggers the copy and returns to the script; however, I want to wait till the copy completes.
Is there any way to do it?
Maybe "Copy-Item [...] | Out-Null" will help to wait for completion.
"Out-null actually deletes output instead of routing it to the PowerShell console, so the script will sit and wait for the content to be deleted before moving on."
The explanation comes from ITProToday.com:
https://www.itprotoday.com/powershell/forcing-powershell-wait-process-complete
Copy-Item does block. I just copied a 4.2GB file from a share on our gigabit network to my local machine. My PowerShell prompt hung and didn't return for several minutes.
It seems to be non-blocking here for very small files. I'm using:
Start-Sleep -Seconds 3
to wait for the files to be copied. Not an ideal solution, but it's what I've got so far.
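If the fixed delay feels fragile, a hedged alternative is to poll the destination until its size stops changing between checks (the path and the interval are placeholder assumptions):

$dest = "D:\target\bigfile.bin"   # assumed destination file
do {
    $before = (Get-Item $dest).Length
    Start-Sleep -Seconds 2
    $after = (Get-Item $dest).Length
} while ($before -ne $after)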