For some reason, when I run this command in PowerShell, it succeeds only up to a certain point:
Copy-Item "S:\schools\$question" "C:\Users\$env:username\Desktop\schools\$question" -Recurse -Verbose
Integrated with my current script, this line takes the files from S:\schools\$question and copies them to C:\Users\$env:username\Desktop\schools\$question. This works fine on some folders, then just stops running on others. On the ones where it stops, PowerShell seems to be using 50% of my CPU resources, which should never happen when I'm just copying files.
Again, as always, any assistance is greatly appreciated.
After searching for this question, I didn't see anything beyond running multiple scripts from one file.
I'm trying to figure out how to run a script where the script lives in one directory and the .exe file that the script is running lives in another one.
If you're curious, the ask comes from a person who stores hundreds of scripts in a location separate from any executable files. I was asked specifically to help with the script itself on this particular project. I'm not well versed in script writing, so I thought I would reach out here and see.
Below is the current code in the script that runs perfectly. The other parts of the script are specific to the program being installed and aren't relevant to actually starting the install.
I'm using PowerShell to run this.
Start-Process -FilePath "$currentpath\ClientSetup_111.exe" -ArgumentList $InstCmdLine -Wait -PassThru
Note: I'm using the built-in PowerShell ISE as my environment
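A minimal sketch of one way to decouple the two locations; $exeDir is a hypothetical placeholder for wherever the executables are stored, and $InstCmdLine comes from the existing script:
# Sketch, assuming the executables live in a folder separate from the scripts.
# $exeDir is a placeholder; point it at the real executable location.
$exeDir  = "D:\Installers"
$exePath = Join-Path $exeDir "ClientSetup_111.exe"
Start-Process -FilePath $exePath -ArgumentList $InstCmdLine -Wait -PassThru
# If the script needs files relative to its own folder, $PSScriptRoot
# (PowerShell 3.0+) holds the directory the .ps1 was saved in.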
I've got a funny issue with dot-slash (.\) in PowerShell. All of my scripts run from a certain folder, and there are subfolders containing the data they need to run.
For example, my scripts are saved at c:\users\chris\posh
Most of the time, I will call input and send output to subfolders like this...
c:\users\chris\posh\inputs
c:\users\chris\posh\output
Therefore I'll have script examples that look like this for inputs and outputs:
$hbslist = Get-Content .\inputs\HBS-IP.txt
Write-Output "$lat,$long" | Out-File .\Outputs\LatLong.csv -Append
Lately, when I run the scripts, they cannot locate the files or exes that I call, because .\ is resolving to P:\ instead of c:\users\chris\posh.
PowerShell also starts in my P:\ (a mapped share drive) for some reason, and I cannot figure out why my PC is behaving this way.
It might be a policy on your machine which changes your home directory. You can check the home directory with:
echo $HOME    # PowerShell derives this from $env:HOMEDRIVE and $env:HOMEPATH
This happens often on corporate machines. If you want to set it back for your PowerShell environment, you can set it in your profile.ps1.
This is typically stored at:
C:\Users\<Name>\Documents\WindowsPowerShell\profile.ps1
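For example, a minimal profile.ps1 entry (a sketch, using the scripts folder from above) that forces every new session to start there instead of the mapped drive:
# profile.ps1 sketch: start each session in the scripts folder so that
# .\inputs and .\Outputs resolve as expected.
Set-Location c:\users\chris\posh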
I'm currently experiencing some problems with importing/editing existing registry keys into a printer driver location.
After some searching I found that Set-ItemProperty is exactly what I need to rewrite subkey areas within the registry; however, I've had a lot of trouble running it.
When the code is put directly into the PowerShell console, it works perfectly and updates the value in just the way I want.
When the code is opened and run in PowerShell ISE, it has to be run several times consecutively (spammed around 3 times) before it works.
When the code is run from its PS1 file from its File Explorer location (C:\user\RW Sandbox\Documents\Printerscripts), it doesn't work at all, regardless of whether I've spammed the run invocation.
This is the code I was using (although mine includes a lot more hex than this example):
Set-ItemProperty -Path "HKLM:\SOFTWARE\Xerox\PrinterDriver\V5.0\Xerox Global Print Driver PCL6\DeviceSettings" -Name "CachedXrxDeviceSettings" -Type "binary" -Value ([byte[]](0x10,0x00,0x00,0x00,0x01,0x00,0x07,0x20,0x2b,0x16,0x58,0x02,0x00,0x00,0x00,0x00,0x4d,0x53,0x43,0x46,0x00,0x00,0x00,0x00,0x70,0x02,0x00,0x00,0x00,0x00,0x00,0x00,0x2c,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x03))
Within PowerShell, the execution policy is currently Unrestricted, as I initially believed this might be a permission issue.
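One quick check, whichever host the script runs from, is to read the value straight back after writing it (same key path as above; just a sketch):
# Read the value back to verify the Set-ItemProperty call took effect.
$key = "HKLM:\SOFTWARE\Xerox\PrinterDriver\V5.0\Xerox Global Print Driver PCL6\DeviceSettings"
(Get-ItemProperty -Path $key -Name "CachedXrxDeviceSettings").CachedXrxDeviceSettings
Note that writing under HKLM: normally requires an elevated session, which alone can make the same code succeed in one host and fail in another.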
I have a script that I run at work that uses Get-ChildItem to get all the files of a certain type on a storage drive, then sorts and moves them to an archive drive. I'd like to automate this process to run once every day, but I realized I would have a problem in doing so.
Occasionally, when this script runs, a file or two will still be in the process of transferring over to our storage drive. If I let the script move a file while it is still being transferred from our customer, it gets corrupted and won't open later.
I know how to filter based on file type, date, and other basic parameters, but I'm not entirely sure how to tell this script to exclude files that are currently growing in size (one approach is sketched after this question).
Below is what I'm currently using to filter what I want to move:
$TargetType = "*.SomeFileType"
$TargetDrive = "\\Some\UNC\Path"
Get-ChildItem "$TargetDrive\$TargetType" | ForEach-Object { $_.FullName } | Sort-Object | Out-File $outStorageMove
Also, at the moment I'm putting everything Get-ChildItem finds into a text file that gets read back later, so that I can manually edit what I want it to move. I'd like to get rid of this step if at all possible.
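One approach, sketched below, is to attempt an exclusive open on each file: a file another process is still writing typically cannot be opened with sharing disabled, so it gets skipped this pass and picked up on the next run. This reuses $TargetDrive, $TargetType, and $outStorageMove from the snippet above:
# Keep only files that can be opened exclusively; a file mid-transfer throws.
$readyFiles = Get-ChildItem "$TargetDrive\$TargetType" | Where-Object {
    try {
        $fs = [System.IO.File]::Open($_.FullName, 'Open', 'Read', 'None')
        $fs.Close()
        $true      # no other writer holds the file
    } catch {
        $false     # open failed; assume the transfer is still in progress
    }
}
$readyFiles | ForEach-Object { $_.FullName } | Sort-Object | Out-File $outStorageMove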
So, a move is essentially a copy plus a delete.
As gvee stated, Copy-Item is the better option; to get past your stated concern, monitor for the copy to complete. My addition would be to delete the source once the copy is done and you have verified it.
Or use BITS as a job to do this.
Using Windows PowerShell to Create BITS Transfer Jobs
https://msdn.microsoft.com/en-us/library/windows/desktop/ee663885(v=vs.85).aspx
You can use PowerShell cmdlets to create synchronous and asynchronous Background Intelligent Transfer Service (BITS) transfer jobs.
All of the examples in this topic use the Start-BitsTransfer cmdlet. To use the cmdlet, be sure to import the module first by running Import-Module BitsTransfer. For more information, type Get-Help Start-BitsTransfer at the PowerShell prompt.
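A minimal synchronous example (the paths are placeholders); Start-BitsTransfer does not return until the transfer finishes, which is the blocking behaviour in question:
# Synchronous BITS copy: the cmdlet blocks until the file has transferred.
Import-Module BitsTransfer
Start-BitsTransfer -Source "\\server\share\bigfile.dat" -Destination "C:\archive\bigfile.dat"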
I am copying files from one Windows machine to another using Copy-Item in a PowerShell script.
But I want to wait until the copy completes. Copy-Item seems to be a non-blocking call: it just triggers the copy and returns to the script, whereas I want to wait until the copy is done.
Is there any way to do it?
Maybe "copy-item [...] | out-null" will help to wait for completion.
"Out-null actually deletes output instead of routing it to the PowerShell console, so the script will sit and wait for the content to be deleted before moving on."
The explanation comes from ITProToday.com:
https://www.itprotoday.com/powershell/forcing-powershell-wait-process-complete
Copy-Item does block. I just copied a 4.2GB file from a share on our gigabit network to my local machine. My PowerShell prompt hung and didn't return for several minutes.
It seems like it's non-blocking here for very small files. I'm using:
Start-Sleep -s 3
to wait for the files to be copied. It's not an ideal solution, but it's what I've got so far.
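If the fixed sleep proves fragile, one alternative sketch (placeholder paths) is to poll until the destination file reaches the source's size before moving on:
# Sketch: copy, then wait until the destination's length matches the source's.
$src = "C:\source\data.bin"
$dst = "D:\dest\data.bin"
Copy-Item $src $dst
while (-not (Test-Path $dst) -or (Get-Item $dst).Length -ne (Get-Item $src).Length) {
    Start-Sleep -Milliseconds 200
}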