If files exist in a directory, move them with PowerShell - powershell

I am a PowerShell beginner and I have a question.
Is there a PowerShell script that can continuously watch a directory in the background and, if files exist in the directory, move them to another location?
I tried running a .ps1 script (using the Move-Item cmdlet) from Task Scheduler, but it didn't work.
Can PowerShell do that?

Google for FileSystemWatcher and PowerShell. FileSystemWatcher lets you specify a directory to watch for changes and then act on those changes - for example, when a new file is detected, you can move it.
There are several examples of scripts doing this out there.
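A minimal sketch of that approach, assuming C:\Watched as the watched folder and D:\Archive as the destination (both are placeholder paths), not production-ready:
# Watch for newly created files and move them as they appear.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Watched'
$watcher.Filter = '*.*'
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # Short delay so the file is fully written before it is moved.
    Start-Sleep -Seconds 1
    Move-Item -LiteralPath $Event.SourceEventArgs.FullPath -Destination 'D:\Archive' -Force
}
# Keep the session alive so the event subscription stays active.
while ($true) { Start-Sleep -Seconds 5 }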

Related

File selected in Windows Explorer with Preview Pane locks the file so PowerShell cannot output to that file

I have a scheduled script that outputs a bunch of HTML files with static names to a remote location. I noticed that if I have one of those files selected in Windows Explorer so that its contents are shown in the Preview Pane, then PowerShell cannot overwrite that file and skips updating it.
This only happens if the output files are in the remote location; it works just fine if the files are local.
How do I force PowerShell to overwrite the remote files in this situation? Lots of users work with those reports, and if one of them leaves a Windows Explorer window with one of those files highlighted overnight when the script runs, the file is not going to be updated.
Move the HTML files to a web server; that will solve your problem entirely. IIS setup on Windows Server is just Next, Next, Next. You can leave a link to the new file location (https://....) in the old place so users can easily navigate to the new location. Possibly this link can be automated (not sure, because of modern security standards).
Try [System.IO.File]::Delete($path) just before writing the file (a minimal sketch of this approach follows this list). This removes the file entry from the filesystem but leaves the file open for anyone who currently has it open. Your script then writes to a new file with the same name; the old file exists without a name (deleted) but stays open until everyone closes it. Check that it was actually deleted with a refresh!
Try [System.IO.File]::Move($path, $someTrashFullName) just before writing the file. $someTrashFullName probably has to be on the same drive. Same as Delete, but it renames the file; some self-updating software uses this strategy. The file is renamed but is still kept open under the new name.
Try replacing the file with a shortcut to another file. You can generate files with different names and update the shortcut programmatically.
HTML files that change location using JS? They read a nearby JSON file (generated by the export script) and look up the new filename there. So the user opens a static, unchanged A.html, the JS inside looks up the new name in A.json and redirects the user to A-2020-08-11.html. I'm not sure browsers allow JS to read JSON files when the page is opened from a network drive.
The only way left is to stop the network share and/or close the open files server-side.
Maybe have some fun with disabling the Preview Pane for this folder, or disabling it completely?
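A minimal sketch of the Delete-before-write approach from the list above; $reportPath and $html are placeholder names:
# Delete the remote file's directory entry first so an open Preview Pane handle
# does not block the rewrite; anyone who has the old file open keeps their
# (now nameless) copy, and the new content is written under the same name.
$reportPath = '\\server\share\report.html'            # placeholder path
$html       = '<html><body>example</body></html>'     # placeholder content
[System.IO.File]::Delete($reportPath)
$html | Out-File -FilePath $reportPath -Encoding utf8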
Try it with -Force. But to me, it seems more like a permission issue.
Remove-Item -Path '\\server\share\file' -Force

Remove active folder with powershell script

So I have to create an installation script for the company I work for. To bypass some of its policies, I first need to run a script that copies all the required files locally. But those files need to be deleted again afterwards.
Remove-Item does not work because the script that is running is also in this folder. Placing the script in a different location is not an option because of the policies.
I have searched two days for a solution and have today and tomorrow left to finish this project.
So please, this is the only thing I still have to get done.
Edit:
I think I found another way around the policies. Now I only need to delete the PowerShell script itself; is this possible or not?
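One hedged sketch of the self-delete idea: PowerShell parses the whole .ps1 into memory before running it, so deleting the script file as the last statement generally works. All paths below are placeholders.
# Remove the locally copied support files, keeping this script for last.
Get-ChildItem -Path $PSScriptRoot -File |
    Where-Object { $_.FullName -ne $PSCommandPath } |
    Remove-Item -Force
# Finally remove the script file itself ($PSCommandPath, PowerShell 3.0+).
Remove-Item -LiteralPath $PSCommandPath -Force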

Change Directory to Folder Containing PowerShell Script - Regardless of Where That Folder Is Located

I have a script that I've created to prep our customer's servers for a software install. Part of this requires the script to be run as administrator, so just instructing people to click "Run With Powershell" doesn't get the job done. The script is in a folder with a number of .ini files that the script needs to copy to different server locations. If I just right-click the Powershell script and select "Run With Powershell," it is able to find the files and copy them without issue. Unfortunately, if I open the script in ISE, it opens with a default directory of C:\users\user, and I can't seem to copy those .ini files without first running a change directory command to get us to the folder that the script and the .ini files are in. But I'd like our installation techs to be able to run this without worrying about the exact location they initially drop these folders. I'd also like them to not have to worry about changing the directory manually in PowerShell. Some of our customers have multiple drives, and it might make sense to put this stuff on something other than the C drive, so it's hard to tell where this folder might end up. But I'm not sure of a command that will get me to the directory of the *.ps1 file, without knowing where that file is beforehand... Anyone have a suggestion?
You can use $PSScriptRoot, which contains the path of the directory the script is located in.
This is referenced in the following post:
How can I get the file system location of a PowerShell script?
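A minimal sketch, assuming the .ini files sit next to the .ps1; 'config.ini' and the destination path are placeholders:
# $PSScriptRoot resolves to the folder containing this script regardless of
# the current working directory (works in saved scripts, PowerShell 3.0+).
Set-Location -Path $PSScriptRoot
# Copy an .ini file that lives next to the script to the target location.
Copy-Item -Path (Join-Path $PSScriptRoot 'config.ini') -Destination 'D:\SomeServer\Install' -Force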

Powershell dot sourcing opens up file in notepad

Every time I dot source a file in PowerShell, it opens a copy of the file in Notepad.
Ex:
.\MyScript.ps1
The script runs fine - it's just really annoying having these pop up all the time. Is there a way to suppress this?
I'm on Windows 7 x64 and using the latest version of PowerShell.
Ex 2: This is still launching Notepad.
cls
Set-Location "\\PSCWEBP00129\uploadedFiles\psDashboard\"
. .\assets\DCMPull\Powershell\SqlServerTransfer.psm1
. .\assets\DCMPull\Powershell\RunLogging.psm1
You cannot dot source PowerShell files with the .psm1 file extension. One option is to rename them to .ps1.
Alternatively (and, in my opinion, the better approach), you can load the PowerShell modules using Import-Module <module.psm1>. Just note that the behavior of Import-Module is different from dot sourcing. Dot sourcing runs the script in the current scope and persists all variables, functions, etc. in the current scope; Import-Module does not do that.
Although not very common, you can also export variables from modules with Export-ModuleMember.
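For example, a minimal sketch using the module paths from the question:
# Load the modules into module scope instead of dot sourcing them.
Import-Module '\\PSCWEBP00129\uploadedFiles\psDashboard\assets\DCMPull\Powershell\SqlServerTransfer.psm1'
Import-Module '\\PSCWEBP00129\uploadedFiles\psDashboard\assets\DCMPull\Powershell\RunLogging.psm1'
# Exported functions are now available; module variables are only visible
# if the module exports them with Export-ModuleMember.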
Adding to Raziel's answer, there's a lot of thought behind why only files with the .ps1 extension can be dot sourced, and why otherwise PowerShell tries to run the file as a system executable or open it as a document. Here's a snippet from PeterWhittaker on GitHub:
. ./afile would only execute something if there's either an
extension-less but executable aFile in the current dir, or a
(not-required-to-be-executable) afile.ps1 file, with the former taking
precedence if both are present; if the file exists, but is neither
executable nor has extension .ps1, it is opened as if it were a
document.
. <filename> with <filename> being a mere name (no path component) by
(security-minded) design only ever looks for a file of that name in
the directories listed in $env:PATH (see below), not in the current
directory.
I encountered exactly the same situation: if you dot source a .psm1 file, the file is opened directly instead of the code in the file being imported.
Dot source importing is only valid for files with the .ps1 suffix. If the suffix does not meet that requirement, the path is not treated as a script to run but as an item to invoke, so it is like invoking the file directly, and the effect is naturally to open it in its associated application.
So this behaviour is not specific to .psm1: if you change the extension to .txt, you get the same effect, and the same goes for any file whose suffix is not .ps1.
You can bypass this problem by creating symbolic links or hard links!
In PowerShell 7, it's easy to create links using New-Item.
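For example, a sketch using the module path from the question (creating symbolic links may require an elevated session or Developer Mode on Windows):
# Create a .ps1 symbolic link pointing at the .psm1, then dot source the link.
New-Item -ItemType SymbolicLink -Path .\assets\DCMPull\Powershell\SqlServerTransfer.ps1 -Target .\assets\DCMPull\Powershell\SqlServerTransfer.psm1
. .\assets\DCMPull\Powershell\SqlServerTransfer.ps1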

Is it possible to run a batch file from package manager console?

I'm using Code First migrations with my context class in a class library (i.e. not the startup project), and I want to make batch files for the common operations to save having to pass in the parameters each time I run add-migration and update-database. I ran the "dir" command in the console and it appears to be in the solution root folder, so I have tried creating a .bat, .cmd or .ps1 file in the Solution Items folder, but the Package Manager Console's PowerShell doesn't seem to be able to find it.
At this very moment I happen to be reading this in Bruce Payette's "PowerShell in Action" (wonderful book), so I'll share it with you, lucky guy:
"In this example (poster: an example in the book), even though hello.ps1 is in the current directory, you had to put ./ in front of it to run it. This is because PowerShell doesn't execute commands out of the current directory by default. This prevents accidental execution of the wrong command."
Looks like I just needed to put ".\" at the beginning of the batch file name - not sure if PowerShell requires this to execute?
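A minimal sketch from the Package Manager Console (which starts in the solution root); 'migrations.bat' and the folder name are placeholders:
# Run a batch file sitting in the solution root; the .\ prefix is required.
.\migrations.bat
# If the file sits in a folder with spaces in its name, use the call operator.
& '.\Solution Items\migrations.bat'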