I am working on a script to list all the files with a specific extension (.dll in this case). My script is working fine, except I want to filter out all of the files that have Microsoft's copyright. What approach should be taken?
$Dir = Get-ChildItem C:\Windows\Microsoft.NET\Framework -include *.dll -recurse | sort-object name | format-table name, directory -auto
$Dir
Filter using $_.VersionInfo.LegalCopyright inside a Where-Object statement. Example:
$Dir = Get-ChildItem C:\Windows\Microsoft.NET\Framework -include *.dll -recurse |
Where-Object { $_.VersionInfo.LegalCopyright -notmatch 'Microsoft' }
$Dir | sort-object name | format-table name, directory -auto
Never store the output of Format-Table in a variable. It throws away the objects and returns unusable format objects. Only use it when outputting to the console, or together with e.g. | Out-String | Out-File ... when saving to a file.
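For example, keeping the real objects in the variable and formatting only at display time (a sketch combining the snippets above; the file name dlls.txt is just an example):
# Store the real FileInfo objects; format only when displaying.
$Dir = Get-ChildItem C:\Windows\Microsoft.NET\Framework -Include *.dll -Recurse |
    Where-Object { $_.VersionInfo.LegalCopyright -notmatch 'Microsoft' } |
    Sort-Object Name

# Format only when writing to the console...
$Dir | Format-Table Name, Directory -AutoSize

# ...or when rendering the table into a text file.
$Dir | Format-Table Name, Directory -AutoSize | Out-String | Out-File 'dlls.txt'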
The purpose of this script is to find the name, directory, and last write time of files and output them to a CSV file.
get-childitem D:\testing -recurse -filter *.txt | select-object Name,DirectoryName,LastWriteTime, @{Name="New_column";Expression={"copy-item `"DirectoryName`" To_Compile_directory"}} | where { $_.DirectoryName -ne $NULL } | Export-CSV D:\testing\rdf.csv
My problem is that there is one cell I want to fill with another script that takes values from the generated CSV file. Is there a way to pull the value of each DirectoryName and paste it into the Expression of the same row? I only get an error that says DirectoryName is an invalid key.
When I try to pull it using $_.DirectoryName, the script only reads the $_, and the value it has is the Name.
Thanks for helping.
Did you mean to collect the data from the files like this:
Get-ChildItem -Path 'D:\testing' -Filter '*.txt' -File -Recurse |
Select-Object Name,DirectoryName,LastWriteTime | Export-Csv -Path 'D:\testing\rdf.csv' -NoTypeInformation
and then have your other script read the DirectoryName values from it like this?
$directories = (Import-Csv -Path 'D:\testing\rdf.csv').DirectoryName | Select-Object -Unique
# maybe do something with these directories here?
foreach ($folderPath in $directories) {
# copy the directories including the files to an existing root destination folder
Copy-Item -Path $folderPath -Destination 'D:\SomeExistingDestinationPath' -Recurse -Force
}
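That said, if you do want the literal copy-item command text in a CSV column, the calculated property must use @{ } and reference the current row via $_ inside the Expression. A sketch, reusing your paths and column name:
# Build the copy-item command text per row with a calculated property;
# $($_.DirectoryName) expands the current row's DirectoryName inside the string.
Get-ChildItem D:\testing -Recurse -Filter *.txt |
    Where-Object { $_.DirectoryName -ne $null } |
    Select-Object Name, DirectoryName, LastWriteTime,
        @{Name='New_column'; Expression={ "copy-item `"$($_.DirectoryName)`" To_Compile_directory" }} |
    Export-Csv D:\testing\rdf.csv -NoTypeInformation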
I have a bunch of lists of documents generated in PowerShell using this command:
Get-ChildItem -Recurse |
Select-String -Pattern "acrn164524" |
group Path |
select Name > test.txt
In this example it generates a list of files containing the string acrn164524. The output looks like this:
Name
----
C:\data\logo.eps
C:\data\invoice.docx
C:\data\special.docx
InputStream
C:\datanew\special.docx
I have been using
Get-Content "test.txt" | ForEach-Object {
Copy-Item -Path $_ -Destination "c:\destination\" -Recurse -Container -Force
}
However, this is an issue if two or more files have the same name, and it also throws a bunch of errors for any lines in the file that are not a path.
Sorry if I was not clear enough: I would like to keep files with the same name by appending something to the end of the file name.
You seem to want the files, not the output of Select-String. So let's keep the files.
Get-ChildItem -Recurse -File | Where-Object {
$_ | Select-String acrn164524 -Quiet
} | Select-Object -ExpandProperty FullName | Out-File test.txt
Here:
-File will make Get-ChildItem only return actual files. Think about using a filter like *.txt to reduce the workload further.
-Quiet will make Select-String return $true or $false, which is perfect for Where-Object.
Instead of Select-Object -ExpandProperty X to retrieve an array of raw property values (as opposed to an array of PSObjects, which is what Select-Object would normally return), it's simpler to use ForEach-Object X:
Get-ChildItem -Recurse -File | Where-Object {
$_ | Select-String acrn164524 -Quiet
} | ForEach-Object FullName | Out-File test.txt
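To then copy the matched files and keep same-named files apart, here is a minimal sketch that appends a counter to the file name on collision (the destination c:\destination is taken from your earlier snippet):
Get-ChildItem -Recurse -File | Where-Object {
    $_ | Select-String acrn164524 -Quiet
} | ForEach-Object {
    # Try the plain name first; on collision append _1, _2, ... until the name is free.
    $target = Join-Path 'c:\destination' $_.Name
    $i = 1
    while (Test-Path $target) {
        $target = Join-Path 'c:\destination' ('{0}_{1}{2}' -f $_.BaseName, $i, $_.Extension)
        $i++
    }
    Copy-Item -Path $_.FullName -Destination $target
}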
After beginning this task at the command line I realised I need to get down and dirty with PowerShell. I have about 100 folders, and each folder has a few thousand CSV files that I would like to merge together inside each folder. Ideally, the merged CSV file in each folder would use the parent folder's name. For example, here is a top-level folder containing the 100 folders:
E:\CSVFolders
The subfolders are named in a semi-random fashion like this:
E:\CSVFolders\Folder1
E:\CSVFolders\Folder18
So far I am at this point:
# Merge csv files and use the parent folder name
Import-Csv (Get-ChildItem File*.csv) |
Export-Csv $folderName.csv -NoTypeInformation -Encoding UTF8
I am struggling to make the script enumerate the subfolders and then use their name as the basis for the merged CSV file so if anyone is able to shed light on this I would appreciate it!
Use two loops:
Get-ChildItem 'E:\CSVFolders' | Where-Object {
$_.PSIsContainer
} | ForEach-Object {
$csv = Join-Path $_.FullName ($_.Name + '.csv')
Get-ChildItem $_.FullName -Filter File*.csv | ForEach-Object {
Import-Csv $_.FullName
} | Export-Csv $csv -NoType -Encoding UTF8
}
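On PowerShell 3.0 and later you can replace the PSIsContainer test with the -Directory switch; the same merge, slightly tighter:
# -Directory makes Get-ChildItem return only folders.
Get-ChildItem 'E:\CSVFolders' -Directory | ForEach-Object {
    $csv = Join-Path $_.FullName ($_.Name + '.csv')
    Get-ChildItem $_.FullName -Filter File*.csv |
        ForEach-Object { Import-Csv $_.FullName } |
        Export-Csv $csv -NoTypeInformation -Encoding UTF8
}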
You can group by directory like this:
Get-ChildItem "c:\temp" -file -Filter "*.csv" -Recurse |
group DirectoryName |
%{$dir=$_.Name; $_.Group.FullName | %{import-csv -path $_} | export-csv "$dir\global.csv" -NoTypeInformation}
Short version (not for purists):
gci "c:\temp" -file -Filter "*.csv" -Rec |
group DirectoryName |
%{$dir=$_.Name; $_.Group.FullName | %{ipcsv -path $_} | epcsv "$dir\global.csv" -NoType}
For a specific folder, I need to list all files with extension .js even if nested in subfolders at any level.
The result for the output console should be a list of file names with no extension line by line to be easily copy and pasted in another application.
At the moment I am trying this, but in the output console I get a lot of meta information and not a simple list.
Get-ChildItem -Path C:\xx\x -Recurse -File | sort length -Descending
Could you please provide me some hints?
If sorting by Length is not a necessity, you can use the -Name parameter to have Get-ChildItem return just the name, then use [System.IO.Path]::GetFileNameWithoutExtension() to remove the path and extension:
Get-ChildItem -Path .\ -Filter *.js -Recurse -File -Name | ForEach-Object {
[System.IO.Path]::GetFileNameWithoutExtension($_)
}
If sorting by length is desired, drop the -Name parameter and output the BaseName property of each FileInfo object. You can pipe the output (in both examples) to clip, to copy it into the clipboard:
Get-ChildItem -Path .\ -Filter *.js -Recurse -File | Sort-Object Length -Descending | ForEach-Object {
$_.BaseName
} | clip
If you want the full path, but without the extension, substitute $_.BaseName with:
$_.FullName.Remove($_.FullName.Length - $_.Extension.Length)
The simple option is to use the .Name property of the FileInfo item in the pipeline and then remove the extension:
Get-ChildItem -Path "C:\code\" -Filter *.js -r | % { $_.Name.Replace( ".js","") }
There are two methods for filtering files: globbing with a wildcard, or using a regular expression (regex).
Warning: the globbing method has the drawback that it can also match files which should not be matched, like *.jsx.
# globbing with Wildcard filter
# the error action prevents the output of errors
# (e.g. a directory requires admin rights and is inaccessible)
Get-ChildItem -Recurse -Filter '*.js' -ErrorAction 'SilentlyContinue'
# filter by Regex
Where-Object { $_.Name -Match '.*\.js$' }
You can then sort by name or file size as needed:
# sort the output
Sort-Object -Property 'Length'
Format it as a simple list of path and file name:
# format output
Format-List -Property FullName, Name
To remove the file extension, you can use ForEach-Object to map the result:
ForEach-Object { $_.Name.Replace(".js", "") }
Putting it all together, there is also a very short version, which you should not use in scripts because it's hardly readable:
ls -r | ? { $_.Name -match '\.js$' } | sort Length | % { $_.Name.Replace(".js", "") } | fl
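A more readable equivalent, assembled from the pieces above:
# list .js files recursively, sort by file size, strip the extension
Get-ChildItem -Recurse -ErrorAction 'SilentlyContinue' |
    Where-Object { $_.Name -match '\.js$' } |
    Sort-Object -Property 'Length' |
    ForEach-Object { $_.Name.Replace('.js', '') } |
    Format-List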
If you like brevity, you can remove the ForEach-Object and the quotes. -Path defaults to the current directory, so you can omit it:
(Get-ChildItem -Filter *.js -Recurse).BaseName | Sort Length -Descending
Note that .BaseName expands to plain strings, so Length here sorts by the length of the name rather than by file size.
The above answers work fine. However, Windows PowerShell has an alias called ls, the same as on Linux, so another, shorter command that works too would be ls -Filter *.js
Use BaseName for the file name without the file extension.
Get-ChildItem -Path ".\*.js" | Sort-Object Length -Descending | ForEach-Object {
$_.BaseName
}
I always used Cygwin for this in the past. My last employer locked down our environments and it wasn't available. I like to review the latest files I've modified often. I created the following environment variable named LatestCode to store the script. I then execute it with: iex $env:LatestCode.
Here is the script: get-childitem "." -recurse -include *.ts, *.html, *.sass, *.java, *.js, *.css | where-object {$_.mode -notmatch "d"} | sort lastwritetime -descending | Select-Object -First 25 | format-table lastwritetime, fullname -autosize
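For reference, the setup might look like this (a sketch; it stores the one-liner as a plain string in a per-user environment variable, which new sessions then pick up):
# One-time setup: save the command text into a user environment variable.
$script = 'get-childitem "." -recurse -include *.ts, *.html, *.sass, *.java, *.js, *.css | where-object {$_.mode -notmatch "d"} | sort lastwritetime -descending | Select-Object -First 25 | format-table lastwritetime, fullname -autosize'
[Environment]::SetEnvironmentVariable('LatestCode', $script, 'User')

# In any new session:
iex $env:LatestCode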
When I filter some folders and output the result to an HTML file, the path in the result is always empty.
I can't figure out why it only works on files and not on folders.
Get-ChildItem -Recurse $source -Filter *PML_*_ECR* | where { $_.psiscontainer } | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-6)} | sort LastWriteTime -descending | select name,LastWriteTime,Directory | convertto-html -head $a -body "<H2>Folder LIST FOR PAST 7 DAYS </H2>" | out-file $output\results.htm
Folders are represented as DirectoryInfo objects, which don't have a Directory property. The full path of the folder object itself is provided via the FullName property:
... | select Name, LastWriteTime, FullName | ...
The path of the parent folder can be obtained via the Parent property:
... | select Name, LastWriteTime, @{n='Directory';e={$_.Parent.FullName}} | ...
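Applied to the original pipeline (a sketch; your $source, $a and $output variables are kept as-is):
Get-ChildItem -Recurse $source -Filter *PML_*_ECR* |
    Where-Object { $_.PSIsContainer -and $_.LastWriteTime -gt (Get-Date).AddDays(-6) } |
    Sort-Object LastWriteTime -Descending |
    Select-Object Name, LastWriteTime, FullName |
    ConvertTo-Html -Head $a -Body "<H2>Folder LIST FOR PAST 7 DAYS </H2>" |
    Out-File $output\results.htm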
Because Directory is not a property of that object. Try doing:
Get-ChildItem -Recurse $source -Filter *PML_*_ECR* | where { $_.psiscontainer } | gm
Then look at the available properties. I think FullName may better suit your needs.