I get the following error when trying to rename a file in PowerShell on Windows Server 2022:
The action can't be completed because the folder or a file in it is open in another program
Is there any way to force the file to be renamed?
I have a scheduled script that outputs a bunch of HTML files with static names to a remote location. I noticed that if I have one of those files selected in Windows Explorer so that its contents are shown in the Preview Pane, then PowerShell cannot overwrite that file and skips updating it.
This only happens if the output files are in a remote location; it works just fine if the files are local.
How do I force PowerShell to overwrite remote files in this situation? Lots of users work with those reports, and if one of them leaves a Windows Explorer window with one of those files highlighted overnight when the script runs, the file is not updated.
Move the HTML files to a web server; this will solve your problem entirely. IIS setup on Windows Server is just Next, Next, Next. You can leave a link to the new file location (https://....) in the old place so users can easily navigate there. Possibly this link can be automated (not sure, because of modern security standards).
Try [System.IO.File]::Delete($path) just before writing the file. This removes the file's entry from the filesystem but leaves the file open for anyone who currently has it open, so your script writes to a new file with the same name. The old (deleted) file exists without a name and stays open until everyone closes it. Check that it was actually deleted with a refresh!
Try [System.IO.File]::Move($path, $someTrashFullName) just before writing the file; $someTrashFullName probably must be on the same drive. This is the same idea as Delete, but it renames the file instead; some self-updating software uses this strategy. The file is renamed but still kept open under the new name.
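A minimal sketch of both approaches; the share path and the $html content variable are hypothetical:

# Unlink (or rename away) the old file just before writing; anyone who has it
# open in the Preview Pane keeps their handle on the now-nameless old file.
$path = '\\server\share\report.html'
[System.IO.File]::Delete($path)
# ...or, to keep the old copy around instead (rename on the same drive):
# [System.IO.File]::Move($path, '\\server\share\trash\report.html.old')
$html | Set-Content -Path $path   # write the fresh content under the original name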
Try replacing the file with a shortcut to some file. You can generate files with different names and change the shortcut programmatically.
Or HTML files that change location using JS? They read a nearby JSON file (generated by the export script) and look up the new filename there. So the user opens a static, unchanged A.html; the JS inside looks up the new name in A.json and redirects the user to A-2020-08-11.html. I'm not sure browsers allow JS to read JSON files when the page was opened from a network drive.
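The export side of that idea could look like this; $reportHtml, the share path, and the file names are all hypothetical:

# Write the dated report, then a sidecar JSON file that the static A.html's
# JavaScript can read to discover the latest filename.
$dated = "A-$(Get-Date -Format 'yyyy-MM-dd').html"
$reportHtml | Set-Content -Path "\\server\reports\$dated"
@{ latest = $dated } | ConvertTo-Json | Set-Content -Path '\\server\reports\A.json'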
The only way left is to stop the network share and/or close the open files server-side.
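If you have admin rights on the file server, the SmbShare cmdlets can do the second part; a minimal sketch, run on the server itself (the folder name is hypothetical):

# Find and force-close SMB handles that clients hold on the report files.
Get-SmbOpenFile |
    Where-Object { $_.Path -like '*\reports\*' } |
    Close-SmbOpenFile -Force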
Maybe have some fun with disabling the Preview Pane for this folder, or disabling it completely?
Try with -Force. But to me, it seems more like a permission issue.
Remove-Item -Path '\\server\share\file' -Force
I use SSIS to run a PowerShell script to download a file that used to be a CSV but recently became large enough to be zipped. I updated the PowerShell script to look for a zip file and added a task to the package to unzip the file so it can be loaded into a SQL database. Well, then it came through as a CSV again. I need a solution that handles either the zip file or the CSV file. I'm not sure whether this should be a task in SSIS or handled in the updated PowerShell.
I would go with a PS task to download the files (either zip or csv), then an SSIS Foreach container to iterate over the files you just downloaded, assigning each individual file to a user variable. Inside your container, if the file is a zip (use a variable set via an expression to determine whether it is), run a task that runs PS to unzip it, then an expression task to update the variable holding the file path to the newly unzipped csv path. Then run your data flow task to import the csv.
If the file is a csv to begin with, then just run the DFT.
Either way, the data flow task is the same: take the csv and load it. I have found I like to keep the PS in my SSIS packages very purpose-driven. I have a tendency to build my logic in PS because it is easier, but then my package becomes harder to debug, because an issue in my PS script will fail the SSIS package and SSIS tells me nothing useful about what in the script failed (unless you are redirecting stdout and stderr from your PS, or doing some other logging).
Best to keep the PowerShell as simple as needed for each task you need to do.
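A minimal sketch of the unzip-if-needed step, assuming PowerShell 5+ for Expand-Archive; the parameter names and staging folder are hypothetical:

param(
    [string]$DownloadedFile,            # passed in from an SSIS variable
    [string]$ExtractDir = 'C:\etl\in'   # hypothetical staging folder
)

if ([IO.Path]::GetExtension($DownloadedFile) -eq '.zip') {
    Expand-Archive -Path $DownloadedFile -DestinationPath $ExtractDir -Force
    $csvPath = (Get-ChildItem -Path $ExtractDir -Filter '*.csv' | Select-Object -First 1).FullName
} else {
    $csvPath = $DownloadedFile
}

# Emit the csv path so the package can capture it and point the data flow task at it.
Write-Output $csvPath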
I have a script that I've created to prep our customers' servers for a software install. Part of this requires the script to be run as administrator, so just instructing people to click "Run with PowerShell" doesn't get the job done. The script is in a folder with a number of .ini files that the script needs to copy to different server locations. If I just right-click the PowerShell script and select "Run with PowerShell," it is able to find the files and copy them without issue. Unfortunately, if I open the script in ISE, it opens with a default directory of C:\users\user, and I can't copy those .ini files without first running a change-directory command to get to the folder that the script and the .ini files are in.

I'd like our installation techs to be able to run this without worrying about the exact location they initially drop these folders, and without having to change the directory manually in PowerShell. Some of our customers have multiple drives, and it might make sense to put this stuff on something other than the C: drive, so it's hard to tell where this folder might end up. But I'm not sure of a command that will get me to the directory of the *.ps1 file without knowing where that file is beforehand... Anyone have a suggestion?
You can use $PSScriptRoot, which contains the path of the directory where the script is located.
This is referenced in the following post:
How can I get the file system location of a PowerShell script?
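A minimal sketch; the .ini name and destination are hypothetical:

# $PSScriptRoot is the folder containing the running .ps1 (PowerShell 3.0+),
# no matter what the console's current directory happens to be.
Set-Location -Path $PSScriptRoot
Copy-Item -Path (Join-Path $PSScriptRoot 'install.ini') -Destination 'D:\App\Config\'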
I have an issue using the Copy Files task from https://www.visualstudio.com/docs/build/steps/utility/copy-files
My task failed with the error cp: copyFileSync: could not write to dest file (code=EBUSY): ...
Looking it up, I found that somehow the file couldn't be overwritten. When I deleted that file and queued the build again, it succeeded.
Is there a permanent solution for this issue? I don't want to use the "Clean target folder" option in the copy task, because that folder will contain additional files that are not copied by the build task.
According to the error message, the file is in use while the Copy Files task is running. You need to check which application is using it and make sure that application is closed when the task runs. If the file is locked by any application, you may get an "rm: could not remove the file (code EBUSY): ..." error message even if you use the "Clean target folder" option.
As a workaround, if your application is running on a web server, you can run a web server reset command so the open files are released.
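For example, a pre-copy build step along these lines, assuming IIS on the target machine and an agent with rights to restart it:

# Restart IIS so it releases its handles on the files about to be overwritten.
iisreset /restart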
I am having a lot of issues trying to automate downloading from an FTP site. I know the folder the file will be in, and I know that it will be a .zip file. However, I do not know what the files will be named.
So I have code that works if I know the file name...for example:
$sourceuri = "ftp://myFtpSite/test/myZipFile.zip"
I would like to be able to use wildcards in this string so it will recognize any zip file. So I could write something like
$sourceuri = "ftp://myFtpSite/test/_.zip"
and it would download any zip file in that folder.
I know this question is ancient, but have you considered just using the console app ftp.exe? You can build a text file of commands (such as "mget *.zip" to retrieve all .zip files) and automate the process:
ftp -s:filename
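A minimal sketch of that approach; the host, credentials, and folder are hypothetical:

# Write an ftp.exe command script, then run it non-interactively with -s.
# ftp.exe reads the lines after "open" as the username and password, and
# "prompt" turns off per-file confirmation so mget fetches everything.
@"
open myFtpSite
myUser
myPassword
binary
cd test
prompt
mget *.zip
bye
"@ | Set-Content -Path ftpcmds.txt -Encoding ASCII

ftp -s:ftpcmds.txt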