The two commands below yield exactly the same file list in bash under Cygwin:
find ../../../../.. -name "*.o" -and -path "*/common/*"
find ../../../../.. -name "*.o" -and -path "*/common/*" -prune
This list includes files such as:
../../../../../platform/abc/common/ppng.o
../../../../../platform/abc/common/variant/pxx.o
The list does not include any files without "common" in their pathnames.
What I'm trying to do is find (and ultimately eliminate) object files in all directories except those that have a "common" directory component. I've tried about 25 other combinations without luck.
Any pointers?
AFAIK, -path doesn't take regular expressions. I think what you want to do is find all your object files (.*\.o) and exclude all the common directories (.*/common/.*):
find . -regex '.*\.o' -and -not -regex '.*/common/.*'
You can make it all case insensitive with -iregex if needed.
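For completeness, -prune can be made to work too, but it has to act on the directories, not on the .o files themselves: applied to a plain file, -prune does nothing, which is why the two commands in the question produce identical lists. A sketch of the prune form (prune every common directory, then print the remaining object files):
find ../../../../.. -type d -name common -prune -o -name "*.o" -print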
I am trying to create a script to delete all files and directories within a folder, except for specified folders and their content.
Example tree:
test/
- images/
  - folder_to_keep/
- misc/
- blah/
- some_other_folder/
  - another_to_keep/
  - snafu/
I've searched for this specific question with little luck. I've tried:
Remove-Item .\test\* -Exclude (".\test\images\folder_to_keep", ".\test\some_other_folder\another_to_keep") -Recurse
but that still deletes everything.
FYI, I am trying to run this script in a build job on an Atlassian Bamboo server if that helps anyone. See: https://confluence.atlassian.com/bamboo0515/script-894237366.html. If there is a better way using their options (Shell, Windows PowerShell, /bin/sh or cmd.exe) that would be great too.
Edit: robocopy might be an option also. The basic problem is that I need to mirror the source and the destination paths, but there are folders in the destination that must remain unchanged (they get filled from another process).
-Exclude applies only to leaf elements, not the full path. For example, a filter -Exclude 'foo' would remove a folder named "foo" from the result list, but not its files or child folders.
Combine the parent paths you want excluded in a regular expression and use a regexp (non-)match.
$excludes = "$($pwd.Path)\test\images\folder_to_keep",
            "$($pwd.Path)\test\some_other_folder\another_to_keep"
# Escape each full path and join them into one alternation.
$re = ($excludes | ForEach-Object { [regex]::Escape($_) }) -join '|'
# Group the alternation so the ^ anchor applies to every excluded path.
Get-ChildItem -Path .\test -Recurse -Force | Where-Object {
    $_.FullName -notmatch "^($re)"
} | Remove-Item -Force
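Regarding the robocopy idea from the edit: a mirror that leaves specific destination folders untouched could look like the sketch below (source and destination paths here are illustrative). /XD excludes the listed directories from both the copy and the /MIR deletion pass, so they survive in the destination:
robocopy C:\build\test C:\deploy\test /MIR /XD C:\deploy\test\images\folder_to_keep C:\deploy\test\some_other_folder\another_to_keep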
I am trying to configure my .NET Core project (on Windows) as "case sensitive", so it behaves as it does on my production server (Linux).
I have found this way of doing it:
fsutil.exe file setCaseSensitiveInfo "C:\my folder" enable
The problem is that this function is not recursive:
The case sensitivity flag only affects the specific folder to which you apply it. It isn’t automatically inherited by that folder’s subfolders.
So I am trying to build a PowerShell script that applies this to all folders and subfolders, recursively.
I have tried googling something similar and just modifying the command line, but I don't seem to find the correct keywords. This is the closest that I've gotten to this sort of example.
Correct code:
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object {fsutil.exe file setCaseSensitiveInfo $_ enable}
Explanation:
NOTE: The code in the answer assumes you're in the root of the directory tree and you want to run fsutil.exe against all the folders inside, as was pointed out in the comments (thanks @Abhishek Anand!)
Get-ChildItem -Recurse -Directory will give you a list of all folders (recursively).
As you want to pass their full paths, you can access them using .FullName[1] (or the more self-explanatory | Select-Object -ExpandProperty FullName).
Then you use ForEach-Object to run fsutil.exe once per folder. The current item's FullName can be accessed using $_ (which represents the current object in ForEach-Object)[2].
Hint:
If you want more visibility into what's currently being processed, you can append ; Write-Host $_ to write the path of the folder currently being processed to the console (the semicolon ; separates it from the fsutil invocation), as was pointed out in the comments (thanks Fund Monica's Lawsuit!). The resulting one-liner is shown after the footnotes below.
[1] .FullName notation works for PowerShell 3.0 and greater, Select-Object -ExpandProperty FullName is preferred if there's a chance that lower version will be used.
[2] $_ is an alias for $PSItem
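Putting the hint to use, the one-liner with progress output would look like this (identical to the answer's command, just echoing each path as it is processed):
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object { fsutil.exe file setCaseSensitiveInfo $_ enable; Write-Host $_ }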
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object {if (-Not ($_ -like '*node_modules*')) { fsutil.exe file setCaseSensitiveInfo $_ enable } }
I modified @robdy's code to allow excluding node_modules. You can replace the "node_modules" bit in the above with anything else to exclude filepaths containing it.
If you're working with npm, you probably want to exclude node_modules. @robdy's answer is great, but it was taking minutes at a time, iterating over every single node package folder even when I didn't have the package installed. Given that this is something one might want to run fairly often (since directories might be added all the time), and since you probably aren't modifying anything in node_modules, excluding it seems reasonable.
With Cygwin and bash shell, you can do this:
$ find $THEDIR -type d -exec fsutil file setCaseSensitiveInfo "{}" enable \;
It appears that Windows handles the '/' characters output by the find command just fine.
In my case I had to first enable the Linux subsystem before using the fsutil tool. So my steps were:
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
then restart, and then run @robdy's solution:
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object {fsutil.exe file setCaseSensitiveInfo $_ enable}
On Windows 11, the other answers are not correct as written, because fsutil requires the directory to be empty. To overcome this, I created a NEW empty directory, used fsutil file setCaseSensitiveInfo to set the case-sensitive flag on the new directory, then MOVED the files from the other directory inside the new one. This works because the directories are re-created when moved, and new directories inherit the case-sensitive flag.
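A minimal PowerShell sketch of that workaround, assuming the folder to convert is .\src (the src_cs name is mine):
New-Item -ItemType Directory -Path .\src_cs   # new directory is empty, so the flag can be set
fsutil.exe file setCaseSensitiveInfo .\src_cs enable
Move-Item -Path .\src\* -Destination .\src_cs   # per the answer, moved subfolders are re-created and inherit the flag
Remove-Item .\src
Rename-Item -Path .\src_cs -NewName src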
I have done some Windows batch scripting, but I have never worked with PowerShell, so I am looking for some startup help.
I would like to
parse a directory tree
find all the directories two levels down that do not contain folder.jpg
write the list of these directories to a text file
So far I have found this link which addresses part of the question. I have also found that Get-ChildItem *\*\* should get me to the directories two levels down.
I would appreciate it if someone could help me put this together.
Thanks a lot
You need a combination of Get-ChildItem, ForEach-Object, Test-Path, Join-Path and Write-Output:
Get-ChildItem *\*\* -Directory | ForEach-Object { if (!(Test-Path (Join-Path -Path $_.FullName -ChildPath "folder.jpg"))) { Write-Output $_ } }
This will write all the DirectoryInfo objects where the file doesn't exist to the pipeline. You can then opt to write them to a file.
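For the final step (writing the list to a text file), a sketch that expands the full paths and saves them (the output filename is mine):
Get-ChildItem *\*\* -Directory | Where-Object { !(Test-Path (Join-Path $_.FullName "folder.jpg")) } | Select-Object -ExpandProperty FullName | Set-Content missing-folder-jpg.txt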
I wanted to write a small script that searched for an exact file name, not a string within a file name.
For instance if I search for 'hosts' using Explorer, I get multiple results by default. With the script I want ONLY the name I specify. I'm assuming that it's possible?
I had only really started the script and it's only for my personal use so it's not important, it's just bugging me. I have several drives so I started with 2 inputs, one to query drive letter and another to specify file name. I can search by extension, file size etc but can't seem to pin the search down to an exact name.
Any help would be appreciated!
EDIT : Thanks to all responses. Just to update. I added one of the answers to my tiny script and it works well. All three responses worked but I could only use one ultimately, no offence to the other two. Cheers. Just to clarify, 'npp' is an alias for Notepad++ to open the file once found.
$drv = read-host "Choose drive"
$fname = read-host "File name"
$req = dir -Path $drv -r | Where-Object { !$_.PSIsContainer -and [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -eq $fname }
set-location $req.directory
npp $req
From a powershell prompt, use the gci cmdlet (alias for Get-ChildItem) and -filter option:
gci -recurse -filter "hosts"
This will return an exact match to filename "hosts".
SteveMustafa points out that with current versions of PowerShell you can use the -File switch, giving the following to recursively search for only files named "hosts" (and not directories or other miscellaneous file-system entities):
gci -recurse -filter "hosts" -File
The commands may print many red error messages like "Access to the path 'C:\Windows\Prefetch' is denied.".
If you want to avoid the error messages then set the -ErrorAction to be silent.
gci -recurse -filter "hosts" -File -ErrorAction SilentlyContinue
An additional helper is that you can set the root to search from using -Path.
The resulting command to search explicitly from, for example, the root of the C drive would be
gci -Recurse -Filter "hosts" -File -ErrorAction SilentlyContinue -Path "C:\"
Assuming you have a Z: drive mapped:
Get-ChildItem -Path "Z:\" -Recurse | Where-Object { !$_.PSIsContainer -and [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -eq "hosts" }
I use this form for just this sort of thing:
gci . hosts -r | ? {!$_.PSIsContainer}
. maps to positional parameter Path and "hosts" maps to positional parameter Filter. I highly recommend using Filter over Include if the provider supports filtering (and the filesystem provider does). It is a good bit faster than Include.
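If you want to verify the Filter-vs-Include difference on your own machine, Measure-Command gives a rough comparison (timings vary; C:\Windows is just a convenient large tree):
Measure-Command { gci C:\Windows -Recurse -Filter hosts -ErrorAction SilentlyContinue }
Measure-Command { gci C:\Windows -Recurse -Include hosts -ErrorAction SilentlyContinue }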
I'm using this function, based on @Murph's answer.
It searches inside the current directory and lists the full path:
function findit
{
$filename = $args[0];
gci -recurse -filter "*${filename}*" -file -ErrorAction SilentlyContinue | foreach-object {
$place_path = $_.directory
echo "${place_path}\${_}"
}
}
Example usage: findit myfile
To search the whole computer:
gdr -PSProvider 'FileSystem' | %{ ls -r $_.root} 2>$null | where { $_.name -eq "httpd.exe" }
In findFileByFilename.ps1 I have:
# https://stackoverflow.com/questions/3428044/powershell-script-to-locate-specific-file-file-name
$filename = Read-Host 'What is the filename to find?'
gci . -recurse -filter $filename -file -ErrorAction SilentlyContinue
# tested works from pwd recursively.
This works great for me. I understand it.
I put it in a folder on my PATH.
I invoke it with:
> findFileByFilename.ps1
To search the whole computer:
gdr -PSProvider 'FileSystem' | %{ ls -r $_.root} 2>$null | where { $_.name -eq "httpd.exe" }
I am pretty sure this is a much less efficient command, for MANY reasons, but the simplest is that you're piping everything to your Where-Object command, when you could still use -filter "httpd.exe" and save a ton of cycles.
Also, on a lot of computers Get-PSDrive is going to grab shared drives, and I am pretty sure you want that for a complete search. Most shares can be IMMENSE in the sheer number of files and folders, so at the very least I would sort my drives by size and add a check after each search to exit the loop if we locate the file. That is, if you are looking for a single instance; if not, the only way to save yourself the immense time sink of searching a 10TB share or two is to comment the command and strongly suggest that any user who needs it limit the search as much as they can. For instance, our user profile share is 10TB (at least the one I am on is), and I can limit my search to $sharedrive\users\myname and search my 116GB directory rather than the 10TB one. There are too many unknowns with shares for this type of script, which is already super inefficient with regard to resources and speed.
If I was seriously considering using something like this, I would add a call to a 3rd party package and leverage a DB.
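For what it's worth, the early-exit idea could be sketched like this: search the file-system drives one at a time and stop at the first hit (variable names are mine):
foreach ($drive in Get-PSDrive -PSProvider FileSystem) {
    $hit = gci $drive.Root -Recurse -Filter "httpd.exe" -File -ErrorAction SilentlyContinue | Select-Object -First 1
    if ($hit) { $hit.FullName; break }   # stop as soon as one instance is found
}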
How can I get a nice list of files in a directory tree that contains multiple OLD folders?
I'd like to see only files from directories named OLD that have a certain age.
As I understand the question, Raoul Supercopter's solution doesn't quite answer it. Instead of finding all files from directories that are named "OLD", that solution finds all files that contain "OLD" anywhere in their full path.
Instead, I think you're asking for something that finds all files older than a certain date that are in directories named OLD.
So, to find the directories, we need something like the following:
dir -r | ? {$_.name -match "\bold\b" -and $_.PSIsContainer}
But you then need something that can recursively go through each directory and find the files (and, potentially, any directories named "OLD" that are contained in other directories named "OLD").
The easiest way to do this is to write a function and then call it recursively, but here's a way to do it on one logical line that takes a different tactic (note the backtick line-continuation character, used so it fits here):
dir -r | ? {!$_.PSIsContainer -and $_.LastWriteTime -lt (Get-Date 5/1/2006)} `
       | ? {(Split-Path $_.DirectoryName -Leaf) -eq "OLD"}
So, what's happening here?
The first section is just a basic recursive directory listing. The next section checks to be sure that you're looking at a file (!$_.PSIsContainer) and that it meets your age requirements. The parentheses around the Get-Date section let you get the results of running the command. Then we get the directory name of each file and use the split-path cmdlet to get just the name of the closest directory. If that is "OLD" then we have a file that matches your requirements.
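For comparison, the recursive-function approach mentioned above might look like this sketch (the function name and the cutoff date are mine, matching the example):
function Get-FilesInOldDirs($path) {
    foreach ($dir in (dir $path | ? { $_.PSIsContainer })) {
        if ($dir.Name -eq "OLD") {
            # everything under an OLD directory counts, including nested OLD folders
            dir $dir.FullName -r | ? { !$_.PSIsContainer -and $_.LastWriteTime -lt (Get-Date 5/1/2006) }
        } else {
            Get-FilesInOldDirs $dir.FullName
        }
    }
}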
Well, the first part is to list all the files, the second part filters them, and finally you format your output. You can use Format-List or Format-Table (but I don't have a PowerShell installation nearby to test it :])
$yourDate = Get-Date 5/1/2006
ls -recurse . | ? { $_.fullname -match "OLD" -and $_.lastwritetime -lt $yourDate } | % { $_.fullname }
Get-Date creates a DateTime object when you give it a specific date as a parameter. Just use it and filter.