File selected in Windows Explorer with Preview Pane locks the file so PowerShell cannot output to that file - powershell

I have a scheduled script that outputs a bunch of HTML files with static names to a remote location. I noticed that if I have one of those files selected in Windows Explorer so that its contents are shown in the Preview Pane, then PowerShell cannot overwrite that file and skips updating it.
This only happens if the output files are in a remote location. It works just fine if the files are local.
How do I force PowerShell to overwrite remote files in this situation? Lots of users work with those reports, and if one of them leaves a Windows Explorer window with one of those files highlighted overnight when the script runs, the file will not be updated.

Move the HTML files to a web server; that will solve your problem entirely. IIS setup on Windows Server is just Next, Next, Next. You can leave a link to the new file location (https://....) in the old place so users can easily navigate there. Possibly this link can be automated (not sure, because of modern security standards).
Try [System.IO.File]::Delete($path) just before writing the file. This removes the file's entry from the filesystem but leaves the file open for anyone who currently has it open. Your script then writes to a new file with the same name. The old file still exists without a name (deleted) but stays open until everyone closes it. Check that it was actually deleted with a refresh!
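A minimal sketch of that delete-then-rewrite pattern, assuming hypothetical variables $path (the report path) and $html (the report markup):
[System.IO.File]::Delete($path)                    # unlinks the name; open handles keep the old data alive
$html | Out-File -FilePath $path -Encoding UTF8    # writes a brand-new file under the same name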
Try [System.IO.File]::Move($path, $someTrashFullName) just before writing the file. $someTrashFullName probably has to be on the same drive. This is the same idea as Delete, but it renames the file instead. Some self-updating software uses this strategy: the file is renamed, but it is still kept open under the new name.
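A rough sketch of the rename-then-rewrite variant, again with hypothetical $path and $html, and a timestamped "trash" name kept on the same share:
$trashName = "$path.old-$(Get-Date -Format yyyyMMddHHmmss)"   # stays on the same drive/share
[System.IO.File]::Move($path, $trashName)          # renames the file even while someone has it open
$html | Out-File -FilePath $path -Encoding UTF8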
Try replacing the file with a shortcut to another file. You can generate files with different names and update the shortcut programmatically.
HTML files that change location using JS? They read a nearby JSON file (generated by the export script) and look up the new filename there. So the user opens a static, unchanging A.html; the JS inside looks up the new name in A.json and redirects the user to A-2020-08-11.html. I'm not sure browsers allow JS to read JSON files when the page is opened from a network drive.
The only way left is to stop the network share and/or close the open files server-side.
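If the share lives on a Windows file server, something along these lines (run server-side with admin rights; requires the SmbShare module, and $reportFolder is a placeholder for the report directory) could close stale handles just before the export runs:
Get-SmbOpenFile | Where-Object { $_.Path -like "$reportFolder\*" } | Close-SmbOpenFile -Force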
Or maybe have some fun with disabling the Preview Pane for this folder, or disabling it completely?

Try with -Force. But to me, it seems more like a permission issue.
Remove-Item -Path '\\server\share\file' -Force

Related

ftp-kr - Cannot edit in read-only editor

I cannot figure out how to use the ftp-kr extension for VS Code properly. I have read and re-read the GitHub Wiki and the README documentation and cannot find any other help for my issue. I posted this question to the project Issues on GitHub but have yet to receive a response.
ftp-kr is just a simple extension for editing files that are located on a remote server via FTP. Many other users seem to be using the extension with no issue.
I installed the extension, and then I successfully edited the connection settings (in ftp-kr.json) and successfully made a connection to the remote server I am trying to edit files on.
I opened the FTP-KR: EXPLORER pane to look at all of the files on the server, but if I double-click any of the files to open them and then try to edit them, I am unable to type and just get a warning in VS Code that says "Cannot edit in read-only editor".
I have tried right-clicking on files and clicking the "Download This" button, thinking that maybe I need to download a local copy of the files to edit before uploading the changes. However, whenever I click the "Download This" option on any file in the ftp-kr Explorer, it just gives me an error message that says "[file_name] is not in remotePath".
I tried running the >ftp-kr: Download All command, but it just spits out a notice that says "Nothing to DO".
How do I edit files located on the remote server and save those changes to the server?
P.S. I have tried the solutions found in this question but unfortunately none of them seem to work. Particularly, "code-runner: Run in terminal" is not in my settings and "Edit in Local" is not a context menu item that appears in my editor.
After a number of months, I finally have a solution to this thanks to the developer eventually responding on GitHub. There are a few things going on here.
Any file that a user wishes to modify must first be downloaded as a local copy on the user's machine, and then that copy can be uploaded to the web server via FTP.
ftp-kr cannot auto-download individual files. (Either when they are double-clicked on or through any other method.) It can only download entire directories, and those directories can be changed by using the localPath and ignore options in the configuration file, then stopping and restarting the ftp connection.
The "Download This" context menu option that appears is a piece of non-implemented code. It will not do anything.
>ftp-kr: Download All is the preferred way (by the developer's intent) to download the remote files onto the user's computer. The fact that it was returning an error before was a bug which has now been fixed.
Confusingly, the user can view the filenames and contents of every file on the remote file system using a convenient tree view; you just cannot download any of those files individually.
All in all, this plugin does not provide the functionality I hoped it would (namely, being able to easily download, modify, and upload individual files). So even though I now know how to use it properly, I will just be switching to a different plugin for my purposes.

Change Directory to Folder Containing PowerShell Script - Regardless of Where That Folder Is Located

I have a script that I've created to prep our customers' servers for a software install. Part of this requires the script to be run as administrator, so just instructing people to click "Run with PowerShell" doesn't get the job done. The script is in a folder with a number of .ini files that the script needs to copy to different server locations. If I just right-click the PowerShell script and select "Run with PowerShell," it is able to find the files and copy them without issue.
Unfortunately, if I open the script in ISE, it opens with a default directory of C:\users\user, and I can't seem to copy those .ini files without first running a change-directory command to get to the folder that the script and the .ini files are in. But I'd like our installation techs to be able to run this without worrying about the exact location where they initially drop these folders. I'd also like them to not have to worry about changing the directory manually in PowerShell. Some of our customers have multiple drives, and it might make sense to put this stuff on something other than the C drive, so it's hard to tell where this folder might end up. But I'm not sure of a command that will get me to the directory of the *.ps1 file without knowing where that file is beforehand... Anyone have a suggestion?
You can use $PSScriptRoot, which holds the path of the directory the script is located in.
This is referenced in the following post:
How can I get the file system location of a PowerShell script?
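For example, a minimal sketch (the destination below is just a placeholder) that copies the .ini files sitting next to the script, no matter where the folder was dropped:
Set-Location -Path $PSScriptRoot                   # jump to the script's own folder
Copy-Item -Path (Join-Path $PSScriptRoot '*.ini') -Destination 'D:\Install\Config'   # placeholder destination
Note that $PSScriptRoot is only populated while the saved .ps1 file is actually running, not in an interactive console.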

Group policy to make Chrome open files automatically after download

I am in the process of updating our currently existing Chrome distribution so that when a user downloads a .pdf file, the file is opened automatically.
As a first step, I was able to prevent Chrome from opening the PDF in its internal PDF viewer by adding a registry entry for AlwaysOpenPdfExternally. This way the file was downloaded immediately and not opened in the Chrome viewer.
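A minimal sketch of setting that policy from PowerShell, assuming the standard machine-wide Chrome policy key:
$chromePolicyKey = 'HKLM:\SOFTWARE\Policies\Google\Chrome'
if (-not (Test-Path $chromePolicyKey)) { New-Item -Path $chromePolicyKey -Force | Out-Null }
Set-ItemProperty -Path $chromePolicyKey -Name 'AlwaysOpenPdfExternally' -Value 1 -Type DWord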
I'd like to use Adobe to view the file automatically after download.
Unfortunately, I was not able to find a registry entry for this, nor a GPO setting to change for Chrome.
I had found the following PowerShell script, to be started upon user logon. However, testing it did not give me the result I was looking for.
The given entry extensions_to_open did not exist even after running the script.
The current version of Chrome used is 66.255.
I would be happy for any help.
Edit: I am able to get the file to open automatically by inserting the line given here:
Is there any way to programatically force "Always open files of this type" for a specific file type in Chrome?
However, I am unsure how to distribute that line of code to every PC on my network.
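A hedged sketch of one way that line could be pushed out, e.g. via a per-user logon script; the download.extensions_to_open key comes from the linked question, the profile path is the default one, and Chrome must be closed while the Preferences file is edited:
$prefPath = "$env:LOCALAPPDATA\Google\Chrome\User Data\Default\Preferences"
$prefs = Get-Content -Path $prefPath -Raw | ConvertFrom-Json
if (-not $prefs.PSObject.Properties['download']) {
    $prefs | Add-Member -NotePropertyName download -NotePropertyValue ([pscustomobject]@{})
}
$prefs.download | Add-Member -NotePropertyName extensions_to_open -NotePropertyValue 'pdf' -Force
[System.IO.File]::WriteAllText($prefPath, ($prefs | ConvertTo-Json -Depth 100))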
Edit: Chrome offers Group Policy templates which should be imported into the Active Directory of your Windows Server. Once imported into your GPO, Chrome offers the setting "Always open PDF files externally". Once you enable this setting, files should no longer open automatically in the internal PDF viewer.
See the second edit: import the latest Group Policy templates into the GPO (they will be stored in Active Directory). After that, go into the GPO policies for "Google Chrome" and enable the setting "Always open PDF files externally".

Files on my WebDAV-mapped drive show rendered output in IDEs instead of the actual content

On my mac I mounted a shared drive using WebDAV by going to "Finder > Go > Connect to server".
Now, when I try to view the files using TextWrangler or TextEdit I can see the PHP code that I want to edit.
However, if I try to use an IDE like NetBeans/Eclipse/TextMate and create a new project with my shared drive as the "Existing sources" folder I cannot see the PHP code.
Instead I see the HTML output of the files as if I were viewing them through a web browser. Also, if I try to view a file that isn't normally accessible (a command-line script), I see the output as if it had been called from the command line.
But the weird thing is, if I use TextMate to edit a single file from the shared drive, I can see the PHP code I am trying to edit. It just doesn't work as a project.
Any suggestions or solutions on how I can use an IDE to edit files over WebDAV? And why do my IDEs display the rendered content instead of the actual file on the file system?
I'm not a specialist at all, but I seem to remember that WebDAV clients do send GET requests.
If I'm correct, your server may not be able to discriminate between an HTTP GET and a WebDAV GET, and thus renders your .php files. Why it would behave one way when working with a project and another way when working with individual files is not clear, though.
Do you get rendered files when you add files to your project manually as well?

Why does CGI.pm upload an old revision of a file on a successful new file upload?

I am using CGI.pm version 3.10 for file upload in Perl. I have a Perl script which uploads the file, and one of my applications keeps track of different revisions of the uploaded document with a check-in/check-out facility.
Steps to reproduce:
I did a check-out (downloaded a file) using my application (which is web-based and uses Apache).
Logged out of the current user session.
Logged in again with the same credentials and then checked in (uploaded) a new file.
Output:
Upload successful
Perl upload script shows the correct uploaded data
New revision of the file created
The output is correct and expected except for one case, which is the issue:
Issue:
The content of the newly uploaded file is the same as the content of the last uploaded revision in the DB.
I am using a temp folder for copying the new content, and if I print the new content in the upload script it comes out correct. I have no limit on the CGI upload size. It seems it fails somewhere in the CGI environment; it might be the version I am using. I am not using taint mode.
Can anybody help me understand what the possible reason might be?
Sounds like you're getting the old file name stuck in the file upload field. I'm not sure that can happen for filefield, but this is a feature for other field types.
Try adding the -nosticky pragma, e.g., use CGI qw(-nosticky :all);. Another pragma to try is -private_tempfiles, which should prevent the user from "eavesdropping" even on their own uploads.
Of course, it could be that you need to localize (my) some variable or add -force to the filefield.
I found the issue. The reason was that the destination path of the copied file was not correct. This was because one of my application's events maps the path of the copied file to a different directory, and that path is stored in the user session. This only happens when I run that event just before starting the upload script, which is why it was hard to catch. Since the upload script is designed to pick up the newly copied file from the same path, it always ended up uploading the same file to the DB as another revision, while the newly copied file was lying at the new path.
Solved by mapping the correct path before the upload.
Thanks