Create PS script to find files - powershell

I want to start by saying coding is a bit outside of my skill set, but because a certain problem keeps appearing at work, I'm trying to automate a solution.
I use the script below to read an input file for a list of names, search C:\ for those files, and write the path to an output file if any are found.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    # $regex is defined earlier in the full script
    if ($line -match $regex) {
        gci -Path "C:\" -Recurse -Filter $line -ErrorAction SilentlyContinue |
            Out-File -Append c:\temp\ResultsFindFile.txt
    }
}
I would like to make two modifications to this. First, search all drives connected to the computer, not just C:\. Second, be able to delete any found files. I'm using the Remove-Item -Confirm command, but so far I can't make it delete the file it just found.
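A minimal sketch of both modifications, assuming one file name per line in the input file and the same paths as above; Get-PSDrive enumerates every filesystem drive, and -Confirm prompts before each deletion:
# Search every filesystem drive, not just C:\
$drives = Get-PSDrive -PSProvider FileSystem
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    foreach ($drive in $drives) {
        Get-ChildItem -Path $drive.Root -Recurse -Filter $line -File -ErrorAction SilentlyContinue |
            ForEach-Object {
                $_.FullName | Out-File -Append C:\temp\ResultsFindFile.txt   # log the path
                Remove-Item -LiteralPath $_.FullName -Confirm                # prompt, then delete
            }
    }
}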

Export-CSV is not populating separate CSV files based on source

Good morning,
Hopefully this will be a quick and easy one to answer.
I am trying to run a PS script and have it export to CSV based on a list of IP addresses from a text file. At the moment, it runs but only produces one CSV.
Code Revision 1
$computers = get-content "pathway.txt"
$source = "\\$computer\c$"
foreach ($computer in $computers) {
    Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-CSV -Path "C:\path\$computer.csv" -NoTypeInformation
}
Edit
The script is now creating the individual server files as needed, and I did change the source .txt file to list the servers by hostname rather than IP. The issue now is that no data is populating in the .csv files: it creates them, but nothing populates. I have tried different source file paths to see if maybe it's due to folder permissions or the folders just being empty, but nothing populates in the files.
The $computer file lists a number of server IP addresses, so the script should run against each IP and then write the results out to a separate CSV file, naming each CSV file after the individual IP address.
Does anyone see any errors in the script I provided that would prevent it from writing out to a separate CSV on each run? I feel like it has something to do with the foreach loop, but I cannot seem to isolate where I am going wrong.
Also, I cannot use any third-party software, as this is a closed network with very strict firewall rules, so I am left with PowerShell (which is okay). And yes, this will be a very long run for each of the servers, but I am okay with that.
Edit
I forgot to mention that when I run the script, I get an error indicating that the Export-CSV path is too long, which doesn't make sense unless it is trying to write all of the IP addresses into a single name.
"Export-CSV : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:1
TIA
Running the script against the C: drive of each computer is strongly inadvisable, especially with the -Recurse option. But for your understanding, this is how you should pass the values to the variables. I haven't tested this code.
$computer = get-content "pathway.txt"
foreach ($Source in $computer) {
    Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "C:\Path\$source.csv" -NoTypeInformation
}
$computer will hold the whole content, foreach will loop over it, and $Source will get one IP at a time. I also suggest using hostnames instead of IPs, so that you get a servername.csv output file for each server.
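On the "path too long" error: that is exactly what happens if the whole array, rather than the loop variable, ends up inside the path string (for example $computers instead of $computer), because PowerShell joins every element into a single name when it interpolates an array. A small illustration with made-up addresses:
$computers = '10.0.0.1','10.0.0.2','10.0.0.3'
# An interpolated array is joined with spaces ($OFS), yielding one huge file name:
"C:\path\$computers.csv"   # -> C:\path\10.0.0.1 10.0.0.2 10.0.0.3.csv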
In hopes that this helps someone else: I have finally gotten the script to run and create the individual .csv files for each server hostname.
$servers = Get-Content "path"
foreach ($server in $servers) {
    Get-ChildItem -Path "\\$server\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-CSV -Path "path\$server.csv" -NoTypeInformation
}

Get-Content piped with Add-Content

I encountered something weird I do not understand. My scenario:
I have multiple .ps1 files in C:\Functions. I would like to copy the content of the files into one file (AllFunctions.ps1). The file CopyFunctions2AllFunctions.ps1 is the file that executes my commands.
$path="C:\Functions\*.ps1"
$destination="C:\Functions\AllFunctions.ps1"
Clear-Content -Path C:\Functions\AllFunctions.ps1
Get-Content -Path $path -Exclude "C:\Functions\CopyFunctions2AllFunctions.ps1" | Add-Content -Path $destination
The error message is in German; however, it says AllFunctions.ps1 cannot be accessed because it is being used by another process.
The code works if I replace
$path="C:\Functions\*.ps1"
with a specific file name like
$path="C:\Functions\Read-Date.ps1"
-Force didn't help.
Also, the code worked up until Add-Content -Path $destination. When I executed Get-Content..., the terminal didn't show me just what was inside the .ps1 files, but also the content of the terminal, with all the errors I encountered while trying...
Does someone have an idea?
There are two things to be fixed in this code. First, the new code:
$path="C:\Functions"
$destination="C:\Functions\AllFunctions.ps1"
Clear-Content -Path C:\Functions\AllFunctions.ps1
$functions=Get-ChildItem -Path $path -Exclude CopyFunctions2Profile.ps1 | Get-Content
Add-Content -Path $destination -Value $functions
Issue #1
$path="C:\Functions\*.ps1" doesnt work, PS is also copying the content of the terminal, I dont know why...Therefore, we dont use wildcards in our $path.
Because of that we need to use Get-Childitem in the code as the following:
$functions=Get-ChildItem -Path $path -Exclude CopyFunctions2Profile.ps1 | Get-Content
Issue #2
Working with pipes, PS processes one item and sends it through the pipeline, then the second, and so on. Because of that, when item 2 is sent to Add-Content, AllFunctions.ps1 is still in use for item 1.
Therefore, we need to save the Get-Content output in a variable ($functions) and then use it in Add-Content.
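For what it's worth, the same fix can be written inline: a parenthesized pipeline is fully evaluated before its output moves on, so all the reads complete (and the file handles close) before Add-Content opens the destination. A sketch using the same paths as above:
# Parentheses force the whole read to finish before the destination is opened for writing
(Get-ChildItem -Path "C:\Functions" -Exclude CopyFunctions2Profile.ps1 | Get-Content) |
    Add-Content -Path "C:\Functions\AllFunctions.ps1"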

Copying files to specific folder declared in a CSV file using Powershell Script

I am quite new to PowerShell, and I am trying to make a script that copies files to certain folders declared in a CSV file. But so far I am getting errors from everywhere and can't find anything to resolve this issue.
I have these folders and .txt files created in the same folder as the script.
So far I could only do this:
$files = Import-Csv .\files.csv
$files
foreach ($file in $files) {
    $name = $file.name
    $final = $file.destination
    Copy-Item $name -Destination $final
}
This is my CSV
name;destination
file1.txt;folderX
file2.txt;folderY
file3.txt;folderZ
As the comments indicate, if you are not using default system delimiters, you should make sure to specify them.
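For the CSV shown above, that means passing the semicolon explicitly:
$files = Import-Csv .\files.csv -Delimiter ';'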
I also generally recommend using quotes in your CSV, to ensure there are no problems with an entry that accidentally includes the delimiter in its name.
#"
"taco1.txt";"C:\temp\taco2;.txt"
"# | ConvertFrom-CSV -Delimiter ';' -Header #('file','destination')
will output
file destination
---- -----------
taco1.txt C:\temp\taco2;.txt
The quotes make sure the values are correctly interpreted. And yes... you can name a file foobar;test..txt. Never underestimate what users might do. 😁
If you take the command Get-ChildItem | Select-Object BaseName,Directory | ConvertTo-CSV -NoTypeInformation and review the output, you should see it quoted like this.
Sourcing Your File List
One last tip. Most of the time I've come across a CSV used as a file input list, a CSV hasn't actually been needed. Consider grabbing the files you need in your script itself.
For example, if you have a folder and need to filter the list down, you can do this on the fly very easily in PowerShell by using Get-ChildItem.
For example:
$Directory = 'C:\temp'
$Destination = $ENV:TEMP
Get-ChildItem -Path $Directory -Filter *.txt -Recurse | Copy-Item -Destination $Destination
If you need to have more granular matching control, consider using the Where-Object cmdlet and doing something like this:
Get-ChildItem -Path $Directory -Filter *.txt -Recurse | Where-Object Name -match '(taco)|(burrito)' | Copy-Item -Destination $Destination
Often you'll find that you can easily use this type of filtering to keep CSV and input files out of the solution.
Example
Using techniques like this, you might be able to get files from 2 directories, filter the match, and copy all in a short statement like this:
Get-ChildItem -Path 'C:\temp' -Filter '*.xlsx' -Recurse | Where-Object Name -match 'taco' | Copy-Item -Destination $ENV:TEMP -Verbose
Hope that gives you some other ideas! Welcome to Stack Overflow. 👋

PowerShell to delete Desktop Items from a remote PC

I have 200 PCs that need to have some specific icons removed.
I created a CSV file with the computer names (one name per row).
I have another file with the file names of the icons that need to be removed from the desktops (Shortcut1.lnk, etc.). This other file is also a CSV (one file name per row).
How can I run a PowerShell script to remove those icons? (Please note that not all computers in my CSV file may be turned on; some may be off or have network issues.)
$SOURCE = "C:\powershell\shortcuts"
$DESTINATION = "c$\Documents and Settings\All Users\Desktop"
$LOG = "C:\powershell\logs\logsremote_copy.log"
$REMOVE = Get-Content C:\powershell\shortcuts-removal.csv
Remove-Item $LOG -ErrorAction SilentlyContinue
$computerlist = Get-Content C:\powershell\computer-list.csv
foreach ($computer in $computerlist) {
foreach ($file in $REMOVE) {
Remove-Item "\\$computer\$DESTINATION\$file" -Recurse
}
}
This is my code so far, but it doesn't appear to delete the files from
\\computername\c$\Documents and Settings\All Users\Desktop
I am getting errors and warnings, and the log file doesn't seem to be created either.
Is there any way to get a report of what was deleted and what was not?
Change this: you already account for a slash with your $DESTINATION variable, so you are doubling it up (# \\c$).
Remove-Item "\\$computer$DESTINATION\$file" -Recurse
Otherwise, you are trying to delete this path and failing:
\\computername\\c$\Documents and Settings\All Users\Desktop\$file
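On the report question: a hedged sketch building on the same variables, where Test-Connection skips machines that are off or unreachable and a try/catch records each outcome to $LOG (the log format here is made up):
foreach ($computer in $computerlist) {
    # Skip machines that are off or unreachable
    if (-not (Test-Connection -ComputerName $computer -Count 1 -Quiet)) {
        Add-Content $LOG "$computer : unreachable"
        continue
    }
    foreach ($file in $REMOVE) {
        $path = "\\$computer\$DESTINATION\$file"   # assumes $DESTINATION has no leading slash
        try {
            Remove-Item $path -Recurse -ErrorAction Stop
            Add-Content $LOG "$computer : deleted $file"
        }
        catch {
            Add-Content $LOG "$computer : FAILED $file - $($_.Exception.Message)"
        }
    }
}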

Powershell script to locate specific file/file name?

I wanted to write a small script that searched for an exact file name, not a string within a file name.
For instance if I search for 'hosts' using Explorer, I get multiple results by default. With the script I want ONLY the name I specify. I'm assuming that it's possible?
I had only really started the script, and it's only for my personal use, so it's not important; it's just bugging me. I have several drives, so I started with two inputs: one to query the drive letter and another to specify the file name. I can search by extension, file size, etc., but can't seem to pin the search down to an exact name.
Any help would be appreciated!
EDIT: Thanks to all who responded. Just to update: I added one of the answers to my tiny script and it works well. All three responses worked, but I could ultimately only use one; no offence to the other two. Cheers. Just to clarify, 'npp' is an alias for Notepad++ to open the file once found.
$drv = Read-Host "Choose drive"
$fname = Read-Host "File name"
$req = dir -Path $drv -Recurse | Where-Object { !$_.PSIsContainer -and [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -eq $fname }
Set-Location $req.Directory
npp $req
From a PowerShell prompt, use the gci cmdlet (alias for Get-ChildItem) and the -Filter option:
gci -recurse -filter "hosts"
This will return an exact match to filename "hosts".
SteveMustafa points out that with current versions of PowerShell you can use the -File switch, giving the following to recursively search for only files named "hosts" (and not directories or other miscellaneous file-system entities):
gci -recurse -filter "hosts" -File
The commands may print many red error messages like "Access to the path 'C:\Windows\Prefetch' is denied.".
If you want to avoid the error messages, set -ErrorAction to SilentlyContinue:
gci -recurse -filter "hosts" -File -ErrorAction SilentlyContinue
An additional helper is that you can set the root to search from using -Path.
The resulting command to search explicitly from, for example, the root of the C: drive would be
gci -Recurse -Filter "hosts" -File -ErrorAction SilentlyContinue -Path "C:\"
Assuming you have a Z: drive mapped:
Get-ChildItem -Path "Z:" -Recurse | Where-Object { !$PsIsContainer -and [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -eq "hosts" }
I use this form for just this sort of thing:
gci . hosts -r | ? {!$_.PSIsContainer}
. maps to the positional parameter Path, and "hosts" maps to the positional parameter Filter. I highly recommend using Filter over Include if the provider supports filtering (and the filesystem provider does); it is a good bit faster than Include.
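If you want to see the difference on your own machine, here is a quick, rough comparison, assuming a reasonably large tree under the current directory:
# -Filter is applied by the provider while enumerating;
# -Include filters after the fact, so it touches more objects.
Measure-Command { Get-ChildItem -Path . -Filter hosts -Recurse -ErrorAction SilentlyContinue }
Measure-Command { Get-ChildItem -Path . -Include hosts -Recurse -ErrorAction SilentlyContinue }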
I'm using this function, based on @Murph's answer.
It searches inside the current directory and lists the full path:
function findit {
    $filename = $args[0]
    gci -Recurse -Filter "*${filename}*" -File -ErrorAction SilentlyContinue | ForEach-Object {
        $place_path = $_.Directory
        echo "${place_path}\${_}"
    }
}
Example usage: findit myfile
To search the whole computer:
gdr -PSProvider 'FileSystem' | %{ ls -r $_.root} 2>$null | where { $_.name -eq "httpd.exe" }
In findFileByFilename.ps1 I have:
# https://stackoverflow.com/questions/3428044/powershell-script-to-locate-specific-file-file-name
$filename = Read-Host 'What is the filename to find?'
gci . -recurse -filter $filename -file -ErrorAction SilentlyContinue
# tested works from pwd recursively.
This works great for me. I understand it.
I put it in a folder on my PATH.
I invoke it with:
> findFileByFilename.ps1
To search the whole computer:
gdr -PSProvider 'FileSystem' | %{ ls -r $_.root} 2>$null | where { $_.name -eq "httpd.exe" }
I am pretty sure this is a much less efficient command, for MANY reasons, but the simplest is that you're piping everything to your Where-Object command, when you could still use -Filter "httpd.exe" and save a ton of cycles.
Also, on a lot of computers Get-PSDrive is going to grab shared drives, and I am pretty sure you wanted that to get a complete search. Most shares can be IMMENSE with regard to the sheer number of files and folders, so at the very least I would sort my drives by size and add a check after each search to exit the loop if we locate the file. That is, if you are looking for a single instance; if not, the only way to save yourself the IMMENSE time sink of searching a 10TB share or two is to comment the command and strongly suggest that any user who needs it limit the search as much as possible.
For instance, our user-profile share is 10TB, at least the one I am on is, and I can limit my search to the directory $sharedrive\users\myname and search my 116GB directory rather than the 10TB one. There are too many unknowns with shares for this type of script, which is already super inefficient with regard to resources and speed.
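For what it's worth, a rough sketch of that suggestion: local drives only, smallest first, stopping at the first hit (the DisplayRoot test for filtering out mapped shares is an assumption):
# Local filesystem drives only; mapped network drives carry a DisplayRoot like \\server\share
$drives = Get-PSDrive -PSProvider FileSystem |
    Where-Object { -not $_.DisplayRoot } |
    Sort-Object Used
foreach ($drive in $drives) {
    $hit = Get-ChildItem -Path $drive.Root -Filter 'httpd.exe' -File -Recurse -ErrorAction SilentlyContinue |
        Select-Object -First 1
    if ($hit) {
        $hit.FullName
        break   # stop once a single instance is found
    }
}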
If I were seriously considering using something like this, I would add a call to a third-party package and leverage a DB.