PowerShell File Sorting

So I feel like PowerShell would be the best solution for this project, but I cannot for the life of me figure out where to get started. Here's the file layout:
I've got one folder filled with folders generated by our automated system; they are labeled foobarXXXXXXXXXXXXXXX.
The last 15 characters of each folder name are what I need to grab; I then need to search another folder for any files that contain those 15 characters and move any files found into their respective folders.
I can give more details if this wasn't sufficient. I just need a point to get started.
I'm running Windows 7, in case the version of PowerShell is a concern.

Ideally you want PowerShell 3, but you can accomplish this task in PowerShell 2 as well.
I would first look into the Select-String cmdlet (it is also documented on TechNet).
It is also perfectly legal to use the .NET Substring method for string manipulation:
$filePattern = $string.Substring($string.Length - 15)   # grab the last 15 characters
To get collections of your files, you should use Get-ChildItem. Using the "@" in "@(Get-ChildItem)" produces an explicit array.
$files = @(Get-ChildItem -Path $path -Recurse)
And since there is no specific detail in your question, there are no specific answers.
Also, I run Windows 7 with PowerShell 2 and 3 side by side. PowerShell 3 is kinda awesome.
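Putting those pieces together, here is a rough sketch of the whole task. The two root paths are placeholders for your actual folders, and it assumes every generated folder name is at least 15 characters long:
# Rough sketch (PowerShell 2 compatible); $sourceRoot holds the generated
# foobarXXXXXXXXXXXXXXX folders, $searchRoot holds the loose files to sort.
$sourceRoot = 'C:\AutomatedFolders'   # placeholder path
$searchRoot = 'C:\IncomingFiles'      # placeholder path
foreach ($folder in @(Get-ChildItem -Path $sourceRoot | Where-Object { $_.PSIsContainer })) {
    # last 15 characters of the folder name
    $pattern = $folder.Name.Substring($folder.Name.Length - 15)
    # move any file whose name contains those 15 characters into its folder
    Get-ChildItem -Path $searchRoot -Recurse |
        Where-Object { -not $_.PSIsContainer -and $_.Name -like "*$pattern*" } |
        Move-Item -Destination $folder.FullName
}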

Related

Compare 2 large folders

I know there are a lot of questions asked and answered related to this, but my question has a twist.
I'm comparing two folders that hold a huge amount of data (over 20 GB, and it can go up to 40 GB), one of them being on OneDrive.
I'm trying to find the missing files along with which ones are newer. I can accomplish either one, but whichever I try, because the folders are huge it takes a long time and sometimes even crashes. On top of that, when you run the script, it tries to download the files on OneDrive (even though they show as present when you do Test-Path).
I found a post that does both (link below), but I'm wondering if there is an easier way to accomplish this without downloading the files or putting everything in a variable?
Thank you everyone in advance!
https://serverfault.com/questions/532065/how-do-i-diff-two-folders-in-windows-powershell/637776?newreg=b08ad3ef3c8e45d48ac0d17676a28df4
You can try Compare-Object, but you have to get all the child items first, like this:
$gci1 = Get-ChildItem -Recurse "Path to Folder"
$gci2 = Get-ChildItem -Recurse "Path to Folder"
Compare-Object $gci1 $gci2
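Another angle, if names and timestamps are enough: compare metadata instead of content, so the file data never has to be read. This is only a sketch with made-up paths, and I have not verified how OneDrive's on-demand files behave with it:
# Sketch with placeholder paths: compare the two trees by relative path and
# timestamp only, so file contents never need to be read or downloaded.
$left  = 'C:\Data\Local'
$right = 'C:\Users\Me\OneDrive\Data'
$leftItems  = Get-ChildItem -Path $left -Recurse |
    Where-Object { -not $_.PSIsContainer } |
    Select-Object @{n='RelPath';e={$_.FullName.Substring($left.Length)}}, LastWriteTime
$rightItems = Get-ChildItem -Path $right -Recurse |
    Where-Object { -not $_.PSIsContainer } |
    Select-Object @{n='RelPath';e={$_.FullName.Substring($right.Length)}}, LastWriteTime
# '<=' in SideIndicator means the entry exists (or differs) only on the left side
Compare-Object $leftItems $rightItems -Property RelPath, LastWriteTime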

FileSystem Provider Error in the Get-ChildItem Implementation of the Filter Parameter? [duplicate]

This question already has answers here:
Powershell, File system provider, Get-ChildItem filtering... where are the official docs?
Recently I had to create a report via PowerShell about the *.inf files located in the C:\Windows\winsxs folder using the Filter parameter of Get-ChildItem (see the official documentation about this cmdlet and its parameters), like:
$infFiles = Get-ChildItem -Path C:\Windows\winsxs -Filter *.inf -Recurse
I've noticed, however, that files with other extensions, like .inf_loc or .inf_dbf42768, are included in the result as well.
On the other hand, the Include parameter works as expected, returning only the .inf files:
$infFiles = Get-ChildItem -Path C:\Windows\winsxs -Include *.inf -Recurse
I've tested the phenomenon using PS versions 3 and 4, and found that if the extension you are looking for has exactly 3 characters, then any file whose extension begins with those same 3 characters is returned, even if it has a longer extension. If the extension you are looking for is shorter or longer than 3 characters, there is no such issue, at least in my experience.
Although performance is not crucial in my case, as far as I understand, using the Filter parameter would be more efficient than the Include parameter, as it performs the filtering at the provider level, as discussed in the cmdlet description:
Filters are more efficient than other parameters, because the provider
applies them when retrieving the objects, rather than having Windows
PowerShell filter the objects after they are retrieved.
and here:
...the –Filter parameter generates early, early filtering,
whereas -Include is later early filtering! The performance difference
between the two approaches turns out to be significant!
Is this documented behaviour or a bug? I've found something similar in the post mentioned above as well, but not in the official documentation.
Per the linked question, this seems to be a limitation of the -Filter parameter. Another option would be to keep -Filter for its performance benefit and then weed out the extra results with Where-Object:
Get-ChildItem -Path C:\Windows\winsxs -Filter *.inf -Recurse | Where-Object { $_.Name -match '\.inf$' }
This uses the regex $ anchor to return only files whose names end in .inf.
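If you would rather avoid the regex, a post-filter on the Extension property (same path as in the question) should work equally well:
# Filter on the Extension property instead of a regex
Get-ChildItem -Path C:\Windows\winsxs -Filter *.inf -Recurse |
    Where-Object { $_.Extension -eq '.inf' }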

Get all references to a given PowerShell module

Is there a way to find a list of script files that reference a given module (.psm1)? In other words, get all files that, in the script code, use at least 1 of the cmdlets defined in the module.
Obviously because of PowerShell 3.0 and above, most of my script files don't have an explicit Import-Module MODULE_NAME in the code somewhere, so I can't use that text to search on.
I know I can use Get-ChildItem -Path '...' -Recurse | Select-String 'TextToSearchFor' to search for a particular string inside of files, but that's not the same as searching for any reference to any cmdlet of a module. I could do a search for every single cmdlet in my module, but I was wondering if there is a better way.
Clarification: I'm only looking inside of a controlled environment where I have all the scripts in one file location.
Depending on the scenario, the call stack could be interesting to play around with. In that case you need to modify the functions you want to find out about so that they gather call stack information at runtime and log it somewhere. Over time you might collect enough logs to make some good assumptions.
function yourfunction {
    $stack = Get-PSCallStack
    if ($stack.Count -gt 1) {
        $stack[1] # the immediate caller; log this to a file or whatever you need
    }
}
This might not work at all in your scenario, but I thought I throw it in there as an option.
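If a one-off static scan of the script folder is acceptable instead, here is a rough sketch; the module name and script path are placeholders, and it will happily report false positives for commands with common names:
# Rough static scan: list every command the module exports, then search the
# script folder for any of them. 'MyModule' and 'C:\Scripts' are placeholders.
$commands = (Get-Command -Module MyModule).Name
$pattern  = ($commands | ForEach-Object { [regex]::Escape($_) }) -join '|'
Get-ChildItem -Path 'C:\Scripts' -Filter *.ps1 -Recurse |
    Select-String -Pattern $pattern -List |
    Select-Object -ExpandProperty Path -Unique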

Microsoft's Consistency in PowerShell CmdLet Parameter Naming

Let's say I wrote a PowerShell script that includes this command:
Get-ChildItem -Recurse
But instead I wrote:
Get-ChildItem -Re
To save time. Suppose that, after some time passed and I upgraded my PowerShell version, Microsoft decided to add a parameter to Get-ChildItem called "-Return" that, for example, returns True or False depending on whether any items are found.
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected? I understand Microsoft's attempt to save me typing time, but this is my concern, and therefore I will probably always write out the complete parameter name.
Unless of course you know something I don't. Thank you for your insight!
This sounds more like a rant than a question, but to answer:
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected?
Yes!
You should always use the full parameter names in scripts (or any other snippet of reusable code).
Automatic resolution of partial parameter names, aliases and other shortcuts are great for convenience when using PowerShell interactively. It lets us fire up powershell.exe and do:
ls -re *.ps1|% FullName
when we want to find the path to all scripts in the profile. Great for exploration!
But if I were to incorporate that functionality into a script I would do:
Get-ChildItem -Path $Home -Filter *.ps1 -Recurse |Select-Object -ExpandProperty FullName
not just for the reasons you mentioned, but also for consistency and readability - if a colleague of mine comes along and maybe isn't familiar with the shortcuts I'm using, he'll still be able to discern the meaning and expected output from the pipeline.
Note: There are currently three open issues on GitHub to add warning rules for this in PSScriptAnalyzer - I'm sure the project maintainers would love a hand with this :-)

preplog.exe ran in foreach log file

I have a folder with x amount of web log files, and I need to prep them for bulk import to SQL.
For that I have to run preplog.exe on each one of them.
I want to create a PowerShell script to do this for me. The problem I'm having is that preplog.exe has to be run in CMD, and I need to enter the input path and the output path.
For Example:
D:>preplog c:\blah.log > out.log
I've been playing with ForEach but I haven't had any luck.
Any pointers will be much appreciated.
I would guess...
Get-ChildItem "C:\Folder\MyLogFiles" | Foreach-Object { preplog $_.FullName | Out-File "preplog.log" -Append }
FYI, it is good practice on this site to post your non-working code so that we at least have some context. Here I assume you're logging everything to one file in the current directory.
Additionally you've said you need to run in CMD but you've tagged PowerShell - it pays to be specific. I've assumed PowerShell because it's a LOT easier to script.
I've also had to assume that the folder contains ONLY your log files, otherwise you will need to include a Where statement to filter the items.
In short I've made a lot of assumptions that means this may not be an accurate answer, so keep all this in mind for your next question =)
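If you do want one prepped output file per input log instead, here is a variant; the preplog.exe location, the folders, and the ASCII encoding (so the output stays plain text for the bulk import) are all assumptions:
# One prepped output file per input log; adjust the placeholder paths.
$prepLog = 'C:\Tools\preplog.exe'     # assumed location of preplog.exe
$outDir  = 'C:\Folder\PreppedLogs'    # assumed output folder
if (-not (Test-Path $outDir)) { New-Item -ItemType Directory -Path $outDir | Out-Null }
Get-ChildItem -Path 'C:\Folder\MyLogFiles' -Filter *.log |
    ForEach-Object {
        $outFile = Join-Path $outDir $_.Name
        & $prepLog $_.FullName | Out-File -FilePath $outFile -Encoding ascii
    }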