I am trying to create a new folder on Google Drive with the date and hour appended to its name, to make a rolling one-hour backup with old versions still accessible. I don't know enough PowerShell to automatically delete folders over 7 days old, so I do that manually. The problem seems to be that once PowerShell creates the new folders and copies the files, I no longer have access to delete them. I have checked permissions and added Everyone, Administrators, and the current user (the only user, who is an admin), and it still will not let me delete the folders that PowerShell creates. I can delete files I put into Google Drive manually. Here is my code:
$yest = (Get-Date).AddDays(-1).ToString('hhMMddyy')
mkdir "C:\Users\admin\Google Drive\SpaceEngineersDedicated.$yest" -Force
Copy-Item C:\Users\admin\AppData\Roaming\SpaceEngineersDedicated -Destination "C:\Users\admin\Google Drive\SpaceEngineersDedicated.$yest" -Recurse -Force
If someone could help me with code that will just delete folders over 7 days old, I would appreciate it, but worst case I'm fine with occasionally cleaning them up manually.
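A minimal sketch of the cleanup the question asks for, assuming the backup folders follow the SpaceEngineersDedicated.<timestamp> naming pattern from the code above and using each folder's creation time as its age:

```powershell
# Delete backup folders older than 7 days (preview first with -WhatIf).
$backupRoot = 'C:\Users\admin\Google Drive'
$cutoff     = (Get-Date).AddDays(-7)

Get-ChildItem $backupRoot -Directory -Filter 'SpaceEngineersDedicated.*' |
    Where-Object { $_.CreationTime -lt $cutoff } |
    Remove-Item -Recurse -Force -WhatIf   # remove -WhatIf once the preview looks right
```

Run it once with -WhatIf to confirm only the intended folders are listed before deleting for real.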
Related
I'm writing a PowerShell script that will, if possible, delete an occasional extraneous directory if a subtask (over which I have no control, apart from invoking it) fails to properly delete it.
The catch is that the directory (which has a lot of files and recursive hierarchy) may or may not be in use by another (large, 3rd party) program. Normally, I'm doing this in Windows Explorer: I hit the "delete" key, I get a "in use" dialog, shrug and move on, no harm done. (It's a big process and it usually cleans up after itself.)
What I don't want to do is call Remove-Item -Recurse and have it purge half a directory before discovering that some random file is in use. Nor am I even sure it's a file-in-use issue; maybe the directory (or a subdirectory) is some process's current directory, or for some reason the directory (or a subdirectory) itself is held open (I'm just guessing at this point), or there's some entirely different and mysterious cause.
I'm hoping to duplicate the current manual process.
Can anyone point me in the right direction? Basically, I'm looking for Remove-Directory-If-You-Can-But-Don't-Do-Anything-At-All-If-You-Can't.
(Edit)
Note that I don't want to remove anything in the directory if the directory is still "in use", in some sense. That will corrupt the large process using the directory. If the large process has finished and moved on, I can delete the directory.
(Edit)
This? https://github.com/pldmgg/misc-powershell/blob/master/MyFunctions/PowerShellCore_Compatible/Get-FileLockProcess.ps1
Called recursively? (Not sure if it'll work for subdirs, but I can experiment.)
Could try moving the items away somewhere safe first, and restore items that were successfully moved if anything goes wrong.
If you move things to a folder in the path for temp files, you don't have to delete them - the system will delete them sometime after your process releases the handle on the new folder.
Simple case:
$tempRoot = Split-Path (New-TemporaryFile).FullName
$tempDir = New-Item "$tempRoot\$(New-Guid)" -Type Directory
Move-Item "$HOME\Downloads\SomeFolder\" $tempDir
If anything is holding a file handle in the folder being moved, it will fail and the move won't occur.
Deleting several files and folders:
$tempRoot = Split-Path (New-TemporaryFile).FullName
$tempDir = New-Item "$tempRoot\$(New-Guid)" -Type Directory
try {
    Move-Item "$HOME\Downloads\*" $tempDir.FullName -ErrorAction Stop
    Remove-Variable tempDir
} catch {
    Move-Item "$($tempDir.FullName)\*" "$HOME\Downloads"
}
Move-Item will move whole folders as given above, so no need for recursion.
The move operation should be quick, assuming the files/folders you're moving are on the same drive as the temp folder. The files themselves don't have to be physically moved on disk. The OS just modifies an entry in a table. Moving an entire folder only requires one update (not one for each file within it).
If the above doesn't satisfy your needs, maybe check out this article: How to Manage Open File Handles with PowerShell.
I have business laptop which I can use off hours for entertainment.
Unfortunately, most of my personal folders are located on a server, and that causes some issues.
My 'Documents' folder is located on:
\\<server name>\RedirectedFolders$\<username>\Documents
This causes problems with games that want to use the 'My Games' folder located inside the 'Documents' folder.
I tried moving 'My Games' to another folder located locally on drive C:, then creating a symbolic link with PowerShell using this command:
New-Item -ItemType SymbolicLink -Path "\\<servername>\RedirectedFolders$\<username>\Documents\My Games" -Target "C:\var\games\My Games"
Unfortunately it gives error:
New-Item: This operation is supported only when you are connected to the server.
Does anyone know how to solve this?
EDIT. Forgot to mention, I'm connected to my company over VPN, so when I open the online link to my 'Documents' folder in Explorer, it opens normally.
You reversed the arguments. You want the -Target to be the real directory path and the -Path to be the local directory link you access the target at. You can't create a symbolic link on a remote share to a local target.
Both methods work for me:
New-Item -ItemType SymbolicLink -Target \\server.domain.tld\share\SomeFolder -Path .\LocalSomeFolder
cmd /c mklink /D .\SomeLocalFolder \\server.domain.tld\share\SomeFolder
That said, this won't behave like mapped drives with offline files: if you aren't online, you can't access any of those folders' files. You can work around this by mapping the share to a network drive with offline file access enabled for the mapping, then symlinking to a path within the drive letter rather than to the full network path.
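A sketch of that mapped-drive variant, using the placeholder server, share, and user names from the question (creating a symbolic link with New-Item requires an elevated session or Windows Developer Mode):

```powershell
# Map the share to a persistent drive letter, then symlink to a path on that letter
# so Offline Files can serve it when disconnected.
New-PSDrive -Name Z -PSProvider FileSystem -Root '\\<servername>\RedirectedFolders$\<username>' -Persist
New-Item -ItemType SymbolicLink -Path 'C:\var\games\My Games' -Target 'Z:\Documents\My Games'
```

Enable "Always available offline" on the mapped folder in Explorer so the target stays reachable while disconnected.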
This does bring its own share of issues, however, as "Offline Files" for network drives can be finicky.
Workaround
You can work around this gracefully by changing the location of your Documents library to %USERPROFILE%\Documents. Note that this won't migrate your existing files, so it's up to you to copy them over before or after re-mapping the Documents library. Keep in mind that some organizations block changing some or all library locations, so this might not be possible on a company-managed workstation.
Also note that this will no longer keep a remote copy of your local Documents library, which might interfere with any backup processes your organization has in place for your workstation. It becomes up to you to keep your Documents backed up in this case.
I have set up a Microsoft Teams GPO deployment using a .msi.
Unfortunately some users have already installed teams using the .exe installer.
This then creates issues with the .msi deployment: when the GPO sees the existing .exe-installed files, it does not apply.
Does anyone have a script that will delete all references to teams.exe within a users roaming profile?
Many thanks
This command lists all files (with full paths) under each user's AppData\Roaming\Microsoft\Teams folder, recursively, and deletes every file whose path ends in \teams.exe:
Get-ChildItem "C:\Users\*\AppData\Roaming\Microsoft\Teams\*" -Recurse -File | Where-Object { $_.FullName -match '\\teams\.exe$' } | Remove-Item -Force -WhatIf
Note: I added -WhatIf at the end of the command to avoid mistakes. Check the preview output first, then remove -WhatIf to run the command for real.
I am running the following command to zip all .txt files:
Compress-Archive -Path "$testfolder\*.txt" -CompressionLevel Optimal -DestinationPath "$testfolder\TESTZIP"
I created a scheduled task that will run every 5 minutes for a period of 1 hour. Since this is a test, files get created every 5 minutes as well. But my zip folder does not get updated.
How could I update my zip folder based on my command on top?
After 1 hour, email alerts get sent out. I have the email settings set up.
When in doubt, read the documentation.
-Update
Updates the specified archive by replacing older versions of files in the archive with newer versions of files that have the same names. You can also add this parameter to add files to an existing archive.
Add the -Update parameter to your command line.
To overwrite an existing zip file use the -Force argument.
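For example, reusing the $testfolder variable from the question:

```powershell
# Add new .txt files and replace changed ones in the existing archive:
Compress-Archive -Path "$testfolder\*.txt" -DestinationPath "$testfolder\TESTZIP.zip" -Update

# Or rebuild the archive from scratch on each run:
Compress-Archive -Path "$testfolder\*.txt" -DestinationPath "$testfolder\TESTZIP.zip" -Force
```

With -Update, each 5-minute run of the scheduled task folds the newly created files into the same zip.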
I am using the following expression to delete a folder from PowerShell. I need to delete a complete folder (including all the files and sub folders).
Remove-Item -Recurse -Force $DesFolder
But this gives me the exception "The directory is not empty"
I am not getting this exception every time when I run the program; it happens randomly. What would be the reason for this and how do I fix this? Because I was failing to reproduce this.
We cannot delete a non-empty directory using commands like rmdir or Remove-Item alone; this guards against accidental deletion of important files.
Therefore before trying to delete the directory, empty it. Clear the contents and then delete it. :)
Remove-Item -Recurse always deletes the directory and all of its contents recursively, but it may still fail if the directory is modified (i.e. new files are created) by some third-party activity in the middle of the remove process.
Also, if some of the files cannot be deleted (e.g. due to permission restrictions) Remove-Item will also fail.
So, I'd recommend checking what exactly is left inside the directory after the exception.
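One way to do that check is to catch the failure and list whatever survived, so you can see whether it's a permissions problem or files being recreated mid-delete (a sketch, assuming $DesFolder as in the question):

```powershell
try {
    Remove-Item $DesFolder -Recurse -Force -ErrorAction Stop
} catch {
    Write-Warning "Remove-Item failed: $_"
    # Show what's left behind, including hidden and system files:
    Get-ChildItem $DesFolder -Recurse -Force |
        Select-Object FullName, Attributes
}
```

The surviving entries' Attributes column (e.g. ReadOnly, System) often points straight at the cause.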