Trying to move the old log files from one folder to another - powershell

I'm trying to move the older log files (older than 7 days) from the source to the target folder using the below script, but the code doesn't move the files. Whereas when I use similar code to delete the older files (older than 7 days), it works!
I tried saving it as a .ps1 file and as a .bat file, but no luck moving anything. Can someone please help me figure out the issue?

This should help. Note that to match files older than 7 days, the comparison must use AddDays(-7); AddDays(7) would match every existing file:
Get-ChildItem "E:\SC\A" | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } | Move-Item -Destination "E:\SC\B"

SharePoint does not work the same way. You need to create a PSDrive.
# map the SharePoint location as a drive so the copy runs with the supplied credentials
New-PSDrive -Name atemp -PSProvider FileSystem -Credential $cred -Root "$spdir"
Copy-Item -Path "$logfilename" -Destination "$spdir"
Remove-PSDrive atemp
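If copying straight to the UNC root still fails, a variation worth trying is to target the mapped drive itself. A sketch, assuming $cred, $spdir and $logfilename are defined earlier in the script:
New-PSDrive -Name atemp -PSProvider FileSystem -Credential $cred -Root $spdir | Out-Null
Copy-Item -Path $logfilename -Destination "atemp:\"  # copy through the authenticated drive
Remove-PSDrive atemp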

When I tried mapping with the IP address it worked! And I used Get-ChildItem for the move.
Note: the IP address posted is just a random number I came up with. Also, no -Credential was needed in my case.
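For reference, a minimal sketch of the IP-based mapping (the address and share name here are made up):
New-PSDrive -Name L -PSProvider FileSystem -Root "\\192.0.2.10\logs" | Out-Null
Get-ChildItem L:\ -File | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
    Move-Item -Destination "E:\SC\B"
Remove-PSDrive L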

Related

When I run a PowerShell script it always changes back to the user directory

Strange issue I've not seen before. I started working somewhere new, so I'm not sure if it's a quirk, but each time I run a PS script the first thing it seems to do is change back to my user dir (C:\Users\UserName).
It seems to do this at the start of the script, because if I output any files that's where they end up (obviously unless I use an explicit path).
Bit annoying. I'm going to try to work out why it's doing it, but I thought I'd ask a bunch of smarter people.
Cheers.
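A generic workaround (not from this thread, just a common pattern) is to pin the working directory to the script's own folder at the top of the script:
# $PSScriptRoot resolves to the folder containing the running script (PowerShell 3.0+)
Push-Location -Path $PSScriptRoot
# ... script body: relative output paths now land next to the script ...
Pop-Location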
Thanks for the quick reply, guys. I didn't notice that connecting to Exchange Online with MFA changes the location to the user profile folder.
$CreateEXOPSSession = (Get-ChildItem -Path $env:userprofile -Filter CreateExoPSSession.ps1 -Recurse -ErrorAction SilentlyContinue -Force | Select -Last 1).DirectoryName
. "$CreateEXOPSSession\CreateExoPSSession.ps1"
Connect-EXOPSSession -UserPrincipalName myUPN
It's early morning here, I need a coffee!! Thanks again.

Robocopy commands to copy a file to over 50 remote machines

I started looking at Robocopy yesterday to try to copy and overwrite a file from one source to many remote computers. I've tried Robocopy to copy files to a remote machine, but it doesn't work; I get the same error as the person in the link. Does anybody have any suggestions, or can you point me in the right direction? Thank you so much!
You could just use PowerShell for this. It has an inefficiency issue in that it copies to one machine at a time, but that shouldn't be a problem for 50-ish machines. This could help if you made a PowerShell script:
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    Copy-Item -Path $fileToCopy -Destination "\\$computer\C`$\Temp"
}
This would copy the file $fileToCopy to each server listed in C:\filewithcomputers.txt, assuming that file contains a list of computers with each one on its own line. The file is copied to the Temp folder on each machine; update the paths as required for your scenario. I only suggest this since you tagged powershell-remoting. If you are not adept with PowerShell, maybe someone else can give you an answer closer to what you are looking for. Using Robocopy for one file seemed tedious.
If you wanted to check whether a folder exists and is accessible, you could do something like this:
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    $destinationx86 = "\\$computer\C`$\Program Files (x86)"
    $destination = "\\$computer\C`$\Program Files"
    If (Test-Path $destinationx86) {
        # Copy this to Program Files (x86)
        Copy-Item -Path $fileToCopy -Destination $destinationx86
    } Else {
        # Copy this to Program Files
        Copy-Item -Path $fileToCopy -Destination $destination
    }
}
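If some of the machines might be offline, a quick ping check avoids long SMB timeouts. A sketch along the same lines (not part of the original answer):
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    # skip machines that don't answer a single ping
    If (Test-Connection -ComputerName $computer -Count 1 -Quiet) {
        Copy-Item -Path $fileToCopy -Destination "\\$computer\C`$\Temp"
    } Else {
        Write-Warning "$computer is unreachable; skipped."
    }
}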
If you need to connect with different credentials, you can use
$credential = Get-Credential
New-PSDrive -Name "Computer01" -PSProvider FileSystem -Root "\\Computer01\Share" -Credential $credential -Scope global
Now you can copy to e.g. Computer01:\Folder01\
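For example (a sketch; the Folder01 subfolder is assumed to exist on the share):
Copy-Item -Path "C:\filetocopy.txt" -Destination "Computer01:\Folder01\"
Remove-PSDrive -Name "Computer01"  # remove the mapping when done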
If you have set your environment up to support PSRemoting and have placed the file on a file share, you can use PowerShell Remoting to instruct many computers to retrieve the file themselves nearly simultaneously with Invoke-Command. You can limit the number of simultaneous actions using -ThrottleLimit, depending on the size of the source file and how robust the network/server are:
$computers = Get-Content "C:\filewithcomputers.txt"
$originalsource = "\\fileserver\shared\payload.exe"
$originaldestination = "c:\"
$scriptblockcontent = {
    param($source, $destination)
    Copy-Item -Path $source -Destination $destination
}
Invoke-Command -ComputerName $computers -ScriptBlock $scriptblockcontent `
    -ThrottleLimit 50 -ArgumentList $originalsource, $originaldestination

Count files in a network folder using PowerShell

I've searched numerous MSDN/Technet and StackOverflow articles regarding this but I can't find a solution to my problem.
SO references below.
I am trying to run a script on my server that simply counts the files in a folder on a network location.
I can get it working if it's a local folder, and I can get it working when I map the network drive. However I can't use a network drive because I'll be running this script from a web interface that doesn't have a user account (local drives work fine).
My script is:
$Files = Get-ChildItem \\storage\folder -File
$Files.count
I get the error:
Get-ChildItem : Cannot find path '\\storage\folder' because it does not exist.
[0] open folder from Network with Powershell
[1] File counting with Powershell commands
[2] Count items in a folder with PowerShell
[3] Powershell - remote folder availability while counting files
Two things that I can think of.
One would be to add -Path to your Get-ChildItem call. I tested this in my PowerShell and it works fine:
$files = get-childitem -path C:\temp
$files.count
This returns the number of files in that path.
However, I am testing this on a local folder. If you are sure it is the remote access part giving you trouble, I would suggest trying to set credentials. Besides the Get-Credential option, you could also set them yourself:
# note: the PSCredential constructor requires a SecureString, not a plain string
$Credentials = New-Object System.Management.Automation.PSCredential("Username", (ConvertTo-SecureString "password" -AsPlainText -Force))
Then perhaps you can set the drive and still be able to access your files. Hope that helps.
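A minimal sketch of that idea, reusing $Credentials and the share path from the question (the drive name S is arbitrary):
New-PSDrive -Name S -PSProvider FileSystem -Root "\\storage\folder" -Credential $Credentials | Out-Null
$Files = Get-ChildItem S:\ -File
$Files.Count
Remove-PSDrive S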
Try this:
Set-Location \\storage\folder
dir -Recurse | Where-Object { $_.PSIsContainer } | ForEach-Object { Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count }
This will count the number of files in each sub-folder (recurse) and display the full path and count in the output.

Delete directory regardless of 260 char limit

I'm writing a simple script to delete USMT migration folders after a certain amount of days:
## Server List ##
$servers = "Delorean","Adelaide","Brisbane","Melbourne","Newcastle","Perth"
## Number of days (-3 is over three days ago) ##
$days = -3
$timelimit = (Get-Date).AddDays($days)
foreach ($server in $servers)
{
    $deletedusers = @()
    $folders = Get-ChildItem \\$server\USMT$ | where { $_.PSIsContainer }
    Write-Host "Checking server : " $server
    foreach ($folder in $folders)
    {
        If ($folder.LastWriteTime -lt $timelimit -And $folder -ne $null)
        {
            $deletedusers += $folder
            Remove-Item -Recurse -Force $folder.FullName
        }
    }
    Write-Host "Users deleted : " $deletedusers
    Write-Host
}
However I keep hitting the dreaded error: "Remove-Item : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters."
I've been looking at workarounds and alternatives but they all revolve around me caring what is in the folder.
I was hoping for a more simple solution as I don't really care about the folder contents if it is marked for deletion.
Is there any native Powershell cmdlet other than Remove-Item -recurse that can accomplish what I'm after?
I often have this issue with node projects. They nest their dependencies and once git cloned, it's difficult to delete them. A nice node utility I came across is rimraf.
npm install rimraf -g
rimraf <dir>
Just as CADII said in another answer: Robocopy is able to create paths longer than the 260-character limit, and it is also able to delete such paths. You can simply mirror an empty folder over the path containing the too-long names when you want to delete it.
For example:
robocopy C:\temp\some_empty_dir E:\temp\dir_containing_very_deep_structures /MIR
Here's the Robocopy reference for the parameters and various options.
I've created a PowerShell function that is able to delete a long path (>260) using the mentioned robocopy technique:
function Remove-PathToLongDirectory
{
    Param(
        [string]$directory
    )

    # create a temporary (empty) directory
    $parent = [System.IO.Path]::GetTempPath()
    [string] $name = [System.Guid]::NewGuid()
    $tempDirectory = New-Item -ItemType Directory -Path (Join-Path $parent $name)

    # mirror the empty directory over the target to empty it, then remove both
    robocopy /MIR $tempDirectory.FullName $directory | Out-Null
    Remove-Item $directory -Force | Out-Null
    Remove-Item $tempDirectory -Force | Out-Null
}
Usage example:
Remove-PathToLongDirectory c:\yourlongPath
This answer on SuperUser solved it for me: https://superuser.com/a/274224/85532
Cmd /C "rmdir /S /Q $myDir"
I learnt a trick a while ago that often works around long file path issues. Apparently some Windows API functions flow through legacy code that can't handle long file names, but if you format your paths in a particular way, the legacy code is avoided. The trick is to reference paths using the "\\?\" prefix. It should be noted that not all APIs support this, but in this particular case it worked for me; see my example below.
The following example fails:
PS D:\> get-childitem -path "D:\System Volume Information\dfsr" -hidden
Directory: D:\System Volume Information\dfsr
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a-hs 10/09/2014 11:10 PM 834424 FileIDTable_2
-a-hs 10/09/2014 8:43 PM 3211264 SimilarityTable_2
PS D:\> Remove-Item -Path "D:\System Volume Information\dfsr" -recurse -force
Remove-Item : The specified path, file name, or both are too long. The fully qualified file name must be less than 260
characters, and the directory name must be less than 248 characters.
At line:1 char:1
+ Remove-Item -Path "D:\System Volume Information\dfsr" -recurse -force
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : WriteError: (D:\System Volume Information\dfsr:String) [Remove-Item], PathTooLongException
+ FullyQualifiedErrorId : RemoveItemIOError,Microsoft.PowerShell.Commands.RemoveItemCommand
PS D:\>
However, prefixing the path with "\\?\" makes the command work successfully:
PS D:\> Remove-Item -Path "\\?\D:\System Volume Information\dfsr" -recurse -force
PS D:\> get-childitem -path "D:\System Volume Information\dfsr" -hidden
PS D:\>
If you have ruby installed, you can use Fileman:
gem install fileman
Once installed, you can simply run the following in your command prompt:
fm rm your_folder_path
This problem is a real pain in the neck when you're developing in Node.js on Windows, so fileman becomes really handy for deleting all the garbage once in a while.
This is a known limitation of PowerShell. The workaround is to use cmd's dir (sorry, but this is true).
http://asysadmin.tumblr.com/post/17654309496/powershell-path-length-limitation
Or, as mentioned in AaronH's answer, use the \\?\ syntax, as in this example to delete build folders:
dir -Include build -Depth 1 | ForEach-Object { Remove-Item -Recurse -Path "\\?\$($_.FullName)" }
If all you're doing is deleting the files, I use a function to shorten the names before I delete:
function ConvertTo-ShortNames{
    param ([string]$folder)
    $name = 1
    $items = Get-ChildItem -Path $folder
    foreach ($item in $items){
        # rename each item to a short sequential number
        Rename-Item -Path $item.FullName -NewName "$name"
        if ($item.PSIsContainer){
            # rebuild the path of the renamed folder and recurse into it
            $parts = $item.FullName.Split("\")
            $folderPath = $parts[0]
            for ($i = 1; $i -lt $parts.Count - 1; $i++){
                $folderPath = $folderPath + "\" + $parts[$i]
            }
            $folderPath = $folderPath + "\$name"
            ConvertTo-ShortNames $folderPath
        }
        $name++
    }
}
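Usage might look like this (the path is hypothetical):
ConvertTo-ShortNames -folder "D:\projects\app\node_modules"
Remove-Item "D:\projects\app\node_modules" -Recurse -Force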
I know this is an old question, but I thought I would put this here in case somebody needed it.
There is one workaround that uses Experimental.IO from the Base Class Libraries project. You can find it on PoshCode, or download it from the author's blog. The 260-character limitation comes from .NET, so it's either this or using tools that don't depend on .NET (like cmd /c dir, as @Bill suggested).
A combination of tools can work best: try doing a dir /x to get the 8.3 short file name instead. You could then parse that output to a text file and build a PowerShell script to delete the paths that you out-filed; that would take you all of a minute. Alternatively, you could rename the 8.3 file name to something shorter and then delete it, as sketched below.
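A rough sketch of the dir /x idea (all paths here are hypothetical; the short name has to be read from the listing):
# list 8.3 short names alongside the long names and capture the output
cmd /c "dir /x /s C:\some\deep\folder" | Out-File C:\temp\shortnames.txt
# then delete via the short path found in that listing
cmd /c "rmdir /s /q C:\SOMEDE~1"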
For me, Robocopy worked, as in steps 1, 2 and 3 below:
First create an empty directory, let's say c:\emptydir
ROBOCOPY c:\emptydir c:\directorytodelete /purge
rmdir c:\directorytodelete
This is getting old, but I recently had to work around it again. I ended up using 'subst' as it didn't require any other modules or functions to be available on the PC this was running from; a little more portable.
Basically, find a spare drive letter, 'subst' the long path to that letter, then use that as the base for Get-ChildItem.
The only limitation is that $_.FullName and other properties will report the drive letter as the root path.
Seems to work ok:
$location = "\\path\to\long\"
$driveLetter = ls function:[d-z]: -n | ?{ !(test-path $_) } | random
subst $driveLetter $location
sleep 1
Push-Location $driveLetter -ErrorAction SilentlyContinue
Get-ChildItem -Recurse
Pop-Location
subst $driveLetter /D
That snippet only lists the files rather than deleting them, but the Get-ChildItem line can be substituted.
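For example, swapping the listing for a delete (same caveats as above):
Get-ChildItem -Recurse | Remove-Item -Recurse -Force  # delete instead of just listing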
PowerShell can easily be used with AlphaFS.dll to do actual file I/O stuff without the PATH TOO LONG hassle.
For example:
Import-Module <path-to-AlphaFS.dll>
[Alphaleonis.Win32.Filesystem.Directory]::Delete($path, $True)
Please see this .NET project on CodePlex: https://alphafs.codeplex.com/
I had the same issue while trying to delete folders on a remote machine.
Nothing helped, but I found one trick:
# 1: create an empty folder
md ".\Empty" -ErrorAction SilentlyContinue
# 2: MIR the empty folder onto the folder to delete: this empties it completely
robocopy ".\Empty" $foldertodelete /MIR /LOG+:$logname
# 3: delete the now-empty target folder
Remove-Item $foldertodelete -Force
# 4: finally, delete the helper folder
Remove-Item ".\Empty" -Force
Works like a charm on local or remote folders (using a UNC path).
Adding to Daniel Lee's solution: when $myDir has spaces in the middle, cmd treats the path as a set of arguments split on the spaces and reports FILE NOT FOUND errors. To overcome this, wrap the variable in quotation marks escaped with PowerShell's backtick character:
PS> cmd.exe /C "rmdir /s /q `"$myDir`""
Just for completeness, I have come across this a few more times and have used a combination of both 'subst' and 'New-PSDrive' to work around it in various situations.
It's not exactly a solution, but if anyone is looking for alternatives this might help.
Subst seems very sensitive to which type of program you are using to access the files: sometimes it works and sometimes it doesn't. The same seems to be true of New-PSDrive.
Anything developed using .NET out of the box will fail with paths that are too long. You will have to fall back to 8.3 names, P/Invoke (Win32) calls, or Robocopy.

PowerShell script required

This is the scenario:
p1
|_f1
|_f2
p2
|_f1
|_f2
Can anyone please help me with a PowerShell script that copies the files shown above from TFS to a temporary folder, where f1, f2, and so on are the subfolders?
I have no experience with either, but in the interest of at least pointing you in the right direction, check out this site: http://coolthingoftheday.blogspot.com/2009/03/pstfs-powershell-and-tfs-better-than.html
There are a couple of commands there that will give you at least part of what you want. You will still need to do some digging to figure out the timestamp stuff.
You may want to check the answer to my question on a very similar scenario. You will find answers to
give me all files in this folder (or subfolder)
as well as
that were modified after x/y/zzzz
but I'm still not sure about
dump those files to a folder other than the one they would normally go to
Update
Incorporating your approach:
Get-TfsItemProperty $/MyFirstTFSProj -r -server xyzc011b |
    Where {$_.CheckinDate -gt (Get-Date).AddDays(-30)} |
    Copy-Item -Destination C:\SomeDir -WhatIf
You can normally omit the Copy-Item -Path parameter because it is supplied by the pipeline.
I don't have a TFS server at hand to test Get-TfsItemProperty, but you could try
Get-TfsItemProperty $/MyFirstTFSProj -r -server xyzc011b |
    Where {$_.CheckinDate -gt (Get-Date).AddDays(-30)} |
    Get-Member
to find out where this $null value is coming from.
I assume you already saw this post. To maintain the folder structure on the destination you need to include the -Force switch on Copy-Item so it creates any missing target folders:
Get-TfsItemProperty $/MyFirstTFSProj -r -server xyzc011b |
    Where {$_.CheckinDate -gt (Get-Date).AddDays(-30)} |
    Copy-Item -Destination C:\SomeDir -Force -WhatIf
I'm still not sure whether you need to retrieve/export the files before copying them; you should check the second answer from Richard Berg in the post mentioned above.