260 Character Limit, Get-ChildItem - powershell

I understand there is a 260-character path limit in Win32, but I am curious as to why my code is half working. See below.
$Path = '\\share\d$\share'
$Age_of_Files = -30
$Current_Date = Get-Date
$Del_date = $Current_Date.AddDays($Age_of_Files)
$post = "<BR><i>Report generated on $((Get-Date).ToString())</i>"
Get-ChildItem $Path -Recurse |
    Where-Object { $_.LastWriteTime -lt $Del_date } |
    Select Name, FullName, LastWriteTime
$Data | ConvertTo-HTML -PreContent $pre -PostContent $post | Out-File $Report
Invoke-Item $Report
Read-Host 'Have you checked the Output File...Ok to Continue with Delete?' | Out-Null
This checks my network share with no problem and gives me no errors, even though there are many directory paths longer than 260 characters. But I also want to export the result as an HTML file, so if I change this line of code to:
$Data = Get-ChildItem $Path -Recurse |
    Where-Object { $_.LastWriteTime -lt $Del_date } |
    Select Name, FullName, LastWriteTime
it then no longer recurses through those directories and gives me the path-length error.
Is there a way around this? Apart from the HTML export and adding the actual delete command, I think I am nearly there.

You can turn on long paths in Windows 10; there's a Group Policy setting for it.
https://www.howtogeek.com/266621/how-to-make-windows-10-accept-file-paths-over-260-characters/
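If you don't want to go through Group Policy, the same switch can be flipped in the registry; a minimal sketch (this is the value that GPO controls, it needs an elevated prompt, and individual applications still have to be long-path aware):
# Enable Win32 long path support (Windows 10 1607+); run as administrator
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' `
    -Name 'LongPathsEnabled' -Value 1 -Type DWord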

This is a limitation of the Win32 API. There's a PowerShell module that supposedly works around the issue (I haven't used it myself, though).
A common workaround is to shorten the path by mapping the longest accessible parent folder to a drive letter with subst:
& subst X: C:\longest\parent\folder
then work on drive X: and delete the temporary drive afterwards:
& subst X: /d
For network paths use net use to the same end:
& net use X: \\server\share\longest\parent\folder
...
& net use X: /d
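Applied to the script in the question, the idea would be to run the query through the mapped drive instead of the UNC path; a rough sketch reusing the variables from above (note this only gains you the characters the \\share\d$\share prefix used to take up):
& net use X: \\share\d$\share
$Data = Get-ChildItem X:\ -Recurse |
    Where-Object { $_.LastWriteTime -lt $Del_date } |
    Select-Object Name, FullName, LastWriteTime
& net use X: /d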

Related

Is there a way to display the latest file of multiple paths with information in a table format?

I check every day whether a CSV file has been exported to a specific folder (path). At the moment there are 14 different paths with 14 different files to check. The files are stored in the folder and are not deleted, so I have to tell a lot of files apart by their LastWriteTime. I would like the code to display the results in table format. I would be happy with something like this:
Name        LastWriteTime     Length
ExportCSV1  21.09.2022 00:50  185
ExportCSV2  21.09.2022 00:51  155
My code looks like this:
$Paths = @('Path1', 'Path2', 'Path3', 'Path4', 'Path5', 'Path6', 'Path7', 'Path8', 'Path9', 'Path10', 'Path11', 'Path12', 'Path13', 'Path14')
foreach ($Path in $Paths) {
    Get-ChildItem $Path | Where-Object { $_.LastWriteTime } |
        Select-Object -Last 1
    Write-Host $Path
}
pause
This way I want to make sure that the files are being sent each day.
I get the results that I want, but they are not easy to look at individually.
I am new to PowerShell and would very much appreciate your help. Thank you in advance.
Continuing from my comments, here is how you could do this:
$Paths = @('Path1', 'Path2', 'Path3', 'Path4', 'Path5', 'Path6', 'Path7', 'Path8', 'Path9', 'Path10', 'Path11', 'Path12', 'Path13', 'Path14')
$Paths | ForEach-Object {
    Get-ChildItem $_ | Where-Object { $_.LastWriteTime } | Select-Object -Last 1
} | Format-Table -Property Name, LastWriteTime, Length
If you want to keep using foreach() instead, you have to wrap it in a scriptblock {…} to be able to pipe everything to Format-Table:
. {
    foreach ($Path in $Paths) {
        Get-ChildItem $Path | Where-Object { $_.LastWriteTime } | Select-Object -Last 1
    }
} | Format-Table -Property Name, LastWriteTime, Length
Here the . operator is used to run the scriptblock immediately, without creating a new scope. If you want to create a new scope (e.g. to define temporary variables that exist only within the scriptblock), you could use the call operator & instead.
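A quick illustration of that scope difference (just a sketch, not part of the original code):
& { $tmp = 'set inside the scriptblock' }
$tmp    # empty: & ran the block in a child scope, so $tmp never reached the caller
. { $tmp = 'set inside the scriptblock' }
$tmp    # 'set inside the scriptblock': . ran the block in the current scope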

Copy from Specific Folder with Multiple Folders

I am creating a backup and restore tool with a PowerShell script, and I am trying to make it so that when restoring, the script picks the last folder created and restores from that directory structure. Basically I am having the script start by making a backup directory with a date/time stamp, like so:
$CurrentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$CurrentDomainName = $CurrentUser.split("\")[0]
$CurrentUserName = $CurrentUser.split("\")[1]
$folderdate = Get-Date -f MMddyyyy_Hm
$homedir = Get-Aduser $CurrentUserName -prop HomeDirectory | select -ExpandProperty HomeDirectory
New-Item $homedir -Name "TXBackup\$folderdate" -ItemType Directory
$cbookmarks = "$env:userprofile\Appdata\Local\Google\Chrome\User Data\Default\Bookmarks"
md $homedir\TXBackup\$folderdate\Chrome
Copy-Item $cbookmarks "$homedir\TXBackup\$folderdate\Chrome" -Recurse
Backup Folder Structure
Basically, every time someone runs the backup tool it creates a subfolder under the backup directory with a date/time name to track the latest one. The problem comes when I want to restore from the last one created: I can no longer use the $folderdate variable, since it will contain whatever the time is while the tool is being run. Here is the code without taking into account which folder is the latest. I tried using sort, but that doesn't appear to give me a clear path to select the last one created, or I am just such a noob I didn't use it right :(
##Restoring Files from Backup
$CurrentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$CurrentDomainName = $CurrentUser.split("\")[0]
$CurrentUserName = $CurrentUser.split("\")[1]
$homedir = get-aduser $CurrentUserName -prop HomeDirectory | select -ExpandProperty HomeDirectory
##Restoring Chrome Bookmarks
Sort-Object -Property LastWriteTime |
Select-Object -Last 1
$rbookmarks = "$homedir\TXBackup\$folderdate\Chrome\Bookmarks"
Copy-Item $rbookmarks "C:\Test\"
I know I didn't use that correctly but any direction would be awesome for this newbie :)
You can use Sort-Object with a script block and use [DateTime] methods to parse the date from the folder name, using the same format string you used to create them.
# Sort directory names descending
Get-ChildItem -Directory | Sort-Object -Desc {
    # Try to parse the long format first
    $dt = [DateTime]::new( 0 )
    if( [DateTime]::TryParseExact( $_.Name, 'MMddyyyy_HHmm', [CultureInfo]::InvariantCulture, [Globalization.DateTimeStyles]::None, [ref] $dt ) ) {
        return $dt
    }
    # Fall back to the short format
    [DateTime]::ParseExact( $_.Name, 'MMddyyyy', [CultureInfo]::InvariantCulture )
} | Select-Object -First 1 | ForEach-Object Name
Note: I've changed the time format from Hm to HHmm, because Hm would cause a parsing ambiguity, e.g. 01:46 would be formatted as 146, but parsed as 14:06.
Also, I would move the year to the beginning, e.g. 20220821_1406, so you could simply sort by name without having to use a script block. But that is not a problem, just an (in)convenience, and you might have a reason to put the year after the day.
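For example, if you switched the backup tool to a year-first stamp, the newest folder would fall out of a plain name sort; a sketch assuming you change the format in both the backup and restore scripts:
# When creating the backup folder:
$folderdate = Get-Date -Format 'yyyyMMdd_HHmm'
# When restoring, the newest backup is simply the highest name:
$latest = Get-ChildItem -Directory | Sort-Object Name -Descending | Select-Object -First 1
With your current day-first names, though, the script-block sort shown above is the way to go.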
Given these folders:
08212022
08222022
08222022_1406
08222022_1322
08222022_1324
08222022_1325
08222022_1343
The code above produces the following output:
08222022_1406
To confirm the ordering is correct, I've removed the Select-Object call:
08222022_1406
08222022_1343
08222022_1325
08222022_1324
08222022_1322
08222022
08212022
Note that the ordering is descending (-Desc), so Select-Object -First 1 can be used to select the latest folder.
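To tie this back to your restore snippet, you could feed the selected folder straight into Copy-Item; a sketch based on the paths from your question, assuming all backup folders use the long MMddyyyy_HHmm format (otherwise keep the two-format parsing from above):
$backupRoot = "$homedir\TXBackup"
$latest = Get-ChildItem $backupRoot -Directory |
    Sort-Object -Descending { [DateTime]::ParseExact( $_.Name, 'MMddyyyy_HHmm', [CultureInfo]::InvariantCulture ) } |
    Select-Object -First 1
# Restore Chrome bookmarks from the most recent backup
$rbookmarks = Join-Path $latest.FullName 'Chrome\Bookmarks'
Copy-Item $rbookmarks 'C:\Test\'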

create file index manually using powershell, tab delimited

Sorry in advance for the probably trivial question; I'm a PowerShell noob, so please bear with me and give me advice on how to get better.
I want to achieve a file index index.txt that contains the list of all files in current dir and subdirs in this format:
./dir1/file1.txt 07.05.2020 16:16 1959281
where
dirs listed are relative (i.e. this will be run remotely and to save space, the relative path is good enough)
the delimiter is a tab \t
the date format is day.month.fullyear hours:minutes:seconds of the last write time (this is what my system shows, but I'm guessing it depends on the system's locale settings and should be enforced explicitly)
(the last number is the size in bytes)
I almost get there using this command in powershell (maybe that's useful to someone else as well):
get-childitem . -recurse | select fullname,LastWriteTime,Length | Out-File index.txt
with this result
FullName LastWriteTime Length
-------- ------------- ------
C:\Users\user1\Downloads\test\asdf.txt 07.05.2020 16:19:29 1490
C:\Users\user1\Downloads\test\dirtree.txt 07.05.2020 16:08:44 0
C:\Users\user1\Downloads\test\index.txt 07.05.2020 16:29:01 0
C:\Users\user1\Downloads\test\test.txt 07.05.2020 16:01:23 814
C:\Users\user1\Downloads\test\text2.txt 07.05.2020 15:55:45 1346
So the questions that remain are: How to...
get rid of the headers?
enforce this date format?
tab delimit everything?
get control of what newline character is used (\n or \r or both)?
Another approach could be this:
$StartDirectory = Get-Location
Get-ChildItem -Path $StartDirectory -Recurse |
    Select-Object -Property @{Name='RelPath';Expression={$_.FullName.ToString() -replace [REGEX]::Escape($StartDirectory.ToString()),'.'}},
        @{Name='LastWriteTime';Expression={$_.LastWriteTime.ToString('dd.MM.yyyy HH:mm:ss')}},
        Length |
    Export-Csv -Path Result.csv -NoTypeInformation -Delimiter "`t"
I recommend using a proper CSV file when you have structured data like this. The resulting CSV file will be saved in the current working directory.
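A side benefit of the CSV route (just a usage note for the Result.csv produced above): the file round-trips back into objects, which a plain text listing would not.
# Read the tab-delimited file back in and work with it as objects
$index = Import-Csv -Path Result.csv -Delimiter "`t"
$index | Where-Object { [int64]$_.Length -gt 1MB } | Select-Object RelPath, Length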
If the path you are running this from is NOT the current script path, do:
$path = 'D:\Downloads' # 'X:\SomeFolder\SomeWhere'
Set-Location $path
first.
Next, this ought to do it:
Get-ChildItem . -Recurse -File | ForEach-Object {
"{0}`t{1:dd.MM.yyyy HH:mm}`t{2}" -f ($_ | Resolve-Path -Relative), $_.LastWriteTime, $_.Length
} | Out-File 'index.txt'
On Windows the newline will be \r\n (CRLF)
If you want control over that, this should do:
$newline = "`n" # for example
# capture the lines as string array in variable $lines
$lines = Get-ChildItem . -Recurse -File | ForEach-Object {
"{0}`t{1:dd.MM.yyyy HH:mm}`t{2}" -f ($_ | Resolve-Path -Relative), $_.LastWriteTime, $_.Length
}
# join the array with the chosen newline and save to file
$lines -join $newline | Out-File 'index.txt' -NoNewline
Because your requirement is to NOT have column headers in the output file, I'm using Out-File here instead of Export-Csv.
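If you ever do want CSV-style quoting but still no header row, one possible variation (a sketch, not something the stated requirements need) is to build the CSV in memory and drop its first line:
# ConvertTo-Csv emits the header as its first line; -Skip 1 removes it
Get-ChildItem . -Recurse -File |
    Select-Object @{ N='RelPath'; E={ $_ | Resolve-Path -Relative } },
        @{ N='LastWriteTime'; E={ $_.LastWriteTime.ToString('dd.MM.yyyy HH:mm') } },
        Length |
    ConvertTo-Csv -NoTypeInformation -Delimiter "`t" |
    Select-Object -Skip 1 |
    Out-File 'index.txt'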

Powershell - How to create array of filenames based on filename?

I'm looking to create an array of files (PDFs specifically) based on their filenames in PowerShell. All files are in the same directory. I've spent a couple of days looking and can't find anything that has examples of this, or something close that could be adapted. Here is an example of my file names:
AR - HELLO.pdf
AF - HELLO.pdf
RT - HELLO.pdf
MH - HELLO.pdf
AR - WORLD.pdf
AF - WORLD.pdf
RT - WORLD.pdf
HT - WORLD.pdf
....
I would like to combine all files ending in 'HELLO' into an array and 'WORLD' into another array and so on.
I'm stuck pretty early on in the process as I'm brand new to creating scripts, but here is my sad start:
Get-ChildItem *.pdf |
    Where-Object BaseName -match '(.*) - (\w+)'
Updated Info...
I do not know the name after the " - " so using regex is working.
My ultimate goal is to combine PDFs based on the matching text after the " - " in the filename, and the most basic code for this is:
$file1 = "1 - HELLO.pdf"
$file2 = "2 - HELLO.PDF"
$mergedfile = "HELLO.PDF"
Merge-PDF -InputFile $file1, $file2 -OutputFile $mergedfile
I have also gotten the Merge-PDF to work using this code which merges all PDF's in the directory:
$Files = Get-ChildItem *.pdf
$mergedfiles = "merged.pdf"
Merge-PDF -InputFile $Files -OutputFile $mergedfiles
Using this code from @Mathias, the $suffix portion of the -OutputFile works, but the -InputFile portion returns the error "Exception calling "Close" with "0" argument(s)":
$groups = Get-ChildItem *.pdf | Group-Object { $_.BaseName -replace '^.*\b(\w+)$', '$1' } -AsHashTable
foreach ($suffix in $groups.Keys) {
    Merge-PDF -InputFile $(@($groups[$suffix])) -OutputFile "$suffix.pdf"
}
For the -InputFile I've tried a lot of different varieties and I keep getting the "0" arguments error. The values in the Hashtable seem to be correct so I'm not sure why this isn't working.
Thanks
This should do the trick:
$HELLO = Get-ChildItem *HELLO.pdf |Select -Expand Name
$WORLD = Get-ChildItem *WORLD.pdf |Select -Expand Name
If you want to group file names by the last word in the base name and you don't know them up front, regex is indeed an option:
$groups = Get-ChildItem *.pdf |Group-Object {$_.BaseName -replace '^.*\b(\w+)$','$1'} -AsHashTable
And then you can do:
$groups['HELLO'].Name
for all the file names ending with the word HELLO, or, to iterate over all of them:
foreach ($suffixGroup in $groups.GetEnumerator()) {
    Write-Host "There are $($suffixGroup.Value.Count) files ending in $($suffixGroup.Key)"
}
Another option is to get all items with Get-ChildItem and use Where-Object to filter.
$fileNames = Get-ChildItem | Select-Object -ExpandProperty FullName
# then filter (string comparison is case-sensitive, so match the real extension casing)
$fileNames | Where-Object { $_.EndsWith("HELLO.pdf") }
# or use the aliases if you want to do less typing:
$fileNames = gci | select -exp FullName
$fileNames | ? { $_.EndsWith("HELLO.pdf") }
Just wanted to show more options, especially the Where-Object cmdlet, which comes in handy when you're calling cmdlets that don't have filtering parameters of their own.
Side note:
You may be asking what -ExpandProperty does.
If you just call gci | select FullName (i.e. without -ExpandProperty), you get back an array of PSCustomObjects, each with a single property called FullName; -ExpandProperty unwraps that property so you get the plain string values instead.
This can be confusing for people who don't immediately see how the objects are typed, as it is not visible just by looking at the PowerShell script.
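A quick way to see the difference in a console (illustrative only):
(gci | select FullName)[0].GetType().Name        # -> PSCustomObject, wrapping a FullName property
(gci | select -exp FullName)[0].GetType().Name   # -> String, the path itself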

powershell filter to remove .pdf extension in the name of a file

I am trying to use PowerShell to get all child elements in a folder. The code I am using is:
Get-ChildItem -Recurse -Path C:\clntfiles
This code gives output like:
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 4/29/2015 9:11 AM 6919044 HD 100616 Dec2014.pdf
-a--- 5/1/2015 11:42 AM 7091019 HD 101642 Jan2015.pdf
I don't want Mode, LastWriteTime or Length, and I want the name of each file without the .pdf extension.
the output should be like
Dec2014
Jan2015
I am not sure how to filter that. Please advise.
I'll start by posting something similar to Leptonator's answer, but simplified by using the Select-Object command (alias Select used in code because it's habit, and I'm lazy).
$files = Get-ChildItem -Recurse -path C:\clntfiles | Select -ExpandProperty BaseName
Now that gets you the file names without extension. But, you actually asked for only part of the file names, as the first file name is "HD 100616 Dec2014.pdf" and you specified that you actually only want "Dec2014" to be returned. We can do that a couple different ways, but my favorite of them would be a RegEx match (because RegEx is awesome, and I think the LastIndexOf/SubString combo is overly complicated imho).
So, a RegEx match of "\w+$" will get what you want. That is broken down like this:
\w means any letter or number
+ means 1 or more of them
$ means the end of the string/line
So that's 1 or more alpha-numeric characters at the end of the string. We pipe our array of file names into a ForEach-Object loop (alias ForEach used out of habit), and then we have:
$Files | ForEach{ [RegEx]::Matches($_,"\w+$")}
Now, this outputs a [System.Text.RegularExpressions.Match] object, which is more than you want, but it does have a property Value which is exactly what you asked for! So we use Select -Expand again for that property and the output is precisely what you asked for:
$files = Get-ChildItem -Recurse -path C:\clntfiles | Select -ExpandProperty BaseName
$files | ForEach{[regex]::Matches($_,"\w+$")} | Select -Expand Value
RegEx matches are really handy, and if you learn about them you can simplify that quite a bit more like this:
gci C:\clntfiles -Rec | ?{$_.BaseName -match "(\w+)$"} | %{$Matches[1]}
That one line, as well as the two line code above it both should output:
Dec2014
Jan2015
Something like this should do it for you..
$files = Get-ChildItem -Recurse -Path C:\clntfiles
if ($files -ne $null)
{
    foreach ($file in $files)
    {
        $file.BaseName
    }
}
In my folder, it shows:
2014-03-28_exeresult_file
2014-03-30_exeresult_file
2014-03-31_exeresult_file
2014-04-02_exeresult_file
2014-04-03_exeresult_file
2014-04-04_exeresult_file
2014-04-06_exeresult_file
2014-04-08_exeresult_file
and are indeed .txt files
Hope this helps!
Use the following: Get-ChildItem -Recurse -Name -Path C:\clntfiles. This will get you only the file names.
Working solution:
$names = Get-ChildItem -name
foreach($n in $names) {$n.Substring(0,$n.IndexOf("."))}
You can also use LastIndexOf in case part of the file name itself contains a dot.
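For example, a sketch of the LastIndexOf variant (it assumes every name really contains a dot; IndexOf/LastIndexOf return -1 otherwise and Substring would then throw):
$names = Get-ChildItem -Name
foreach ($n in $names) {
    # cut at the LAST dot, so 'HD 100616 Dec2014.v2.pdf' becomes 'HD 100616 Dec2014.v2'
    $n.Substring(0, $n.LastIndexOf("."))
}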