Run preplog.exe on each log file - PowerShell

I have a folder with some number of web log files that I need to prep for bulk import into SQL.
To do that, I have to run preplog.exe on each one of them.
I want to write a PowerShell script to do this for me. The problem I'm having is that preplog.exe has to be run from CMD, and I need to supply the input path and the output path.
For Example:
D:\> preplog c:\blah.log > out.log
I've been playing with foreach but I haven't had any luck.
Any pointers will be much appreciated.

I would guess...
Get-ChildItem "C:\Folder\MyLogFiles" | Foreach-Object { preplog $_.FullName | Out-File "preplog.log" -Append }
FYI, it is good practice on this site to post your non-working code so that we at least have some context. Here I assume you're logging into a single file in the current directory.
Additionally, you've said you need to run it in CMD, but you've tagged PowerShell - it pays to be specific. I've assumed PowerShell because it's a LOT easier to script.
I've also had to assume that the folder contains ONLY your log files; otherwise you will need to include a Where-Object statement to filter the items.
In short, I've made a lot of assumptions, which means this may not be an accurate answer, so keep all this in mind for your next question =)
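If instead each log should produce its own prepped output file (matching the question's preplog c:\blah.log > out.log example), here is a minimal sketch, assuming preplog.exe is on the PATH and filtering to *.log as a precaution:

Get-ChildItem "C:\Folder\MyLogFiles" -Filter *.log | ForEach-Object {
    # preplog writes to stdout, so redirect it into a per-file output
    & preplog $_.FullName | Out-File "$($_.FullName).out.log"
}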

Related

Get all references to a given PowerShell module

Is there a way to find a list of script files that reference a given module (.psm1)? In other words, get all files that, in the script code, use at least 1 of the cmdlets defined in the module.
Obviously, because PowerShell 3.0 and above auto-load modules, most of my script files don't have an explicit Import-Module MODULE_NAME anywhere in the code, so I can't use that text to search on.
I know I can use Get-ChildItem -Path '...' -Recurse | Select-String 'TextToSearchFor' to search for a particular string inside of files, but that's not the same as searching for any reference to any cmdlet of a module. I could do a search for every single cmdlet in my module, but I was wondering if there is a better way.
Clarification: I'm only looking inside of a controlled environment where I have all the scripts in one file location.
Depending on the scenario, the call stack could be interesting to play around with. In that case you need to modify the functions you want to find out about, so that they gather information about the call stack at runtime and log it somewhere. Over time you might have enough logs to make some good assumptions.
function yourfunction {
    $stack = Get-PSCallStack
    if ($stack.Count -gt 1) {
        $stack[1] # log this to a file or whatever you need
    }
}
This might not work at all in your scenario, but I thought I'd throw it in there as an option.
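For completeness, the brute-force search mentioned in the question can at least be automated: list every command the module exports, build one regex, and scan the scripts folder for any hit. A rough sketch, where MyModule and C:\Scripts are hypothetical placeholders and the module must be discoverable by Get-Command:

# Build an alternation pattern from every command the module exports
$names   = (Get-Command -Module MyModule).Name
$pattern = ($names | ForEach-Object { [regex]::Escape($_) }) -join '|'

# Report each script that references at least one of those commands
Get-ChildItem -Path 'C:\Scripts' -Recurse -Include *.ps1, *.psm1 |
    Select-String -Pattern $pattern -List |
    Select-Object -ExpandProperty Path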

Microsoft's Consistency in PowerShell CmdLet Parameter Naming

Let's say I wrote a PowerShell script that includes this command:
Get-ChildItem -Recurse
But instead I wrote:
Get-ChildItem -Re
To save time. Now suppose that after some time passed and I upgraded my PowerShell version, Microsoft decided to add a parameter to Get-ChildItem called "-Return", which, for example, returns True or False depending on whether any items are found.
In that virtual scenario, do I have to edit all my former scripts to ensure that they will function as expected? I understand Microsoft's attempt to save me typing time, but this is my concern, and therefore I will probably always write out the complete parameter name.
Unless of course you know something I don't. Thank you for your insight!
This sounds more like a rant than a question, but to answer:
In that virtual scenario, do I have to edit all my former scripts to ensure that they will function as expected?
Yes!
You should always use the full parameter names in scripts (or any other snippet of reusable code).
Automatic resolution of partial parameter names, aliases, and other shortcuts is great for convenience when using PowerShell interactively. It lets us fire up powershell.exe and do:
ls -re *.ps1|% FullName
when we want to find the path to all scripts in the profile. Great for exploration!
But if I were to incorporate that functionality into a script I would do:
Get-ChildItem -Path $Home -Filter *.ps1 -Recurse |Select-Object -ExpandProperty FullName
not just for the reasons you mentioned, but also for consistency and readability - if a colleague of mine comes along and maybe isn't familiar with the shortcuts I'm using, he'll still be able to discern the meaning and expected output from the pipeline.
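As an aside, you can quickly check how crowded a given prefix already is. A small sketch (the exact output depends on your PowerShell version and which providers are loaded); if more than one name comes back, the shorthand is, or can become, ambiguous:

# List every Get-ChildItem parameter the 'Re' prefix could match
(Get-Command Get-ChildItem).Parameters.Keys | Where-Object { $_ -like 'Re*' }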
Note: There are currently three open issues on GitHub to add warning rules for this in PSScriptAnalyzer - I'm sure the project maintainers would love a hand with this :-)

PowerShell: Quickly count containers

I think we all know the PSIsContainer property used to check whether the current item is a folder. But in my project I need a way to quickly count the folders inside a folder. All I need is to get their number quickly. I want to write lines to a .txt that look like C:\folder;12, meaning that the folder, with the -Recurse argument, contains 12 folders.
To explain why: I need to save the progress of my work when I shut down the program that is used to analyse some folders. When a folder is analysed, the result is written to a second .txt. For example, if a folder is called C:\folder\folder1, folder will be analysed and then folder1 will be too, which makes folder appear twice in the file because the full name is always written. What I want to do is count the number of lines where C:\folder is written. If it equals the number next to its path in the first .txt, it means the folder has already been analysed and the function doesn't need to do it again.
Does someone have a solution? Or maybe another idea for saving the progress? Because I really have the feeling this is taking too long.
Thank you for your help.
Another approach, which I find much faster, is using cmd's built-in dir command.
Of course, this is for the case where you don't need the subfolders (if you do, you can run the function in a foreach loop, or change the function accordingly):
Function Get-FolderCount($path)
{
    # dir /a:d lists directories only; the summary is on the last line,
    # e.g. "              12 Dir(s)  123,456,789 bytes free"
    $Dir = cmd /c dir $path /a:d
    # Split the last line on 'Dir', strip all whitespace, keep the count
    Return ($Dir[-1] -csplit 'Dir' -replace '\s')[0]
}
I use this as well for measuring folder size with the /s switch, taking the total size; it is much faster than PowerShell, and also much faster than running it in the interactive shell...
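For comparison, here is a pure-PowerShell sketch that writes the C:\folder;12 format the asker describes. It is simpler but typically slower; the output file path is a placeholder, and -Directory requires PowerShell 3+ (on 2.0, filter with Where-Object { $_.PSIsContainer } instead):

# Count all folders under $path recursively and append "path;count"
$path  = 'C:\folder'
$count = @(Get-ChildItem -Path $path -Recurse -Directory).Count
"$path;$count" | Add-Content -Path 'C:\progress.txt'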

Is it possible to make a cmdlet work with all items being piped into it at once?

Instead of counting sheep this evening, I created a cmdlet that lists all duplicate files in a directory. It's dirt stupid simple and it can only work with all files in a directory, and I'm not keen on reinventing the wheel to add filtering, so here's what I want to do with it instead:
dir | duplicates | del
The only catch is that, normally, any given command in the pipe only works with one object at a time, which will do no good whatsoever for detecting duplicates. (Of course there are no duplicates in a set of one, right?)
Is there a trick I can use to have the second command in the chain collect all the output from the first before doing its job and passing things on to the third command?
You can work with a single file at a time; you just have to store each file you receive in the Process block and then process all the files in an End block. This is how commands like Group-Object and Sort-Object work. They can't group or sort until they have all the input. Once they have all the input, they perform their operation and then begin streaming the results down the pipeline again in grouped/sorted order.
So I actually came up with the answer while I was in the shower and came back to find Keith had already provided it. Here's an example anyway.
begin
{
    # Runs once, before any pipeline input: load the helper type
    Add-Type -Path ($env:USERPROFILE + '\bin\CollectionHelper.cs')
    [string[]] $files = @()
}
process
{
    # Runs once per piped item; $_ is the current object
    $files += $_.FullName
}
end
{
    # All input has arrived, so find the duplicates in one pass
    [JMA.CollectionHelper]::Duplicates($files)
}
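For reference, here is a self-contained sketch of the same Begin/Process/End pattern that finds duplicates by file hash, so it needs no external C# helper. Get-FileHash requires PowerShell 4+, and the function name is made up:

function Find-Duplicate {
    begin   { $all = @() }    # runs once, before any input
    process { $all += $_ }    # runs once per piped file
    end {
        # Only now is the full set available: group by content hash
        # and emit every file whose hash occurs more than once
        $all | Group-Object { (Get-FileHash $_.FullName).Hash } |
            Where-Object { $_.Count -gt 1 } |
            ForEach-Object { $_.Group }
    }
}

Used exactly as the question wants: dir | Find-Duplicate | del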

PowerShell File Sorting

So I feel like PowerShell would be the best solution for this project, but I cannot for the life of me figure out where to get started with it. Here's the file layout:
I've got one folder, filled with folders generated by our automated system, they are labeled: foobarXXXXXXXXXXXXXXX
The last 15 characters of the folder name are what I need to grab; I then need to search through another folder for any files that contain those 15 characters, and move any files found into their respective folders.
I can give more details if this wasn't sufficient. Just need a point to get started.
I'm running Windows 7, in case the version of PowerShell is a concern.
Ideally you want PowerShell 3, but you can accomplish this task in PowerShell 2 as well.
I would first look into the Select-String cmdlet; it is also documented on TechNet.
It is also perfectly legal to use the .NET Substring method for string manipulation. To grab the last 15 characters of the folder name:
$filePattern = $string.Substring($string.Length - 15)
To get collections of your files, you should use Get-ChildItem. Using the "@" in "@(Get-ChildItem)" produces an explicit array.
$files = @(Get-ChildItem -Path $path -Recurse)
And since there is no specific detail in your question, there are no specific answers.
Also, I run Windows 7 with PowerShell 2 and 3 side by side. PowerShell 3 is kinda awesome.
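Putting those pieces together, here is a rough sketch of the whole task. Both folder paths are hypothetical placeholders, and it sticks to PowerShell 2 compatible syntax (PSIsContainer instead of -Directory):

$destRoot  = 'C:\AutomatedFolders'   # the foobarXXXXXXXXXXXXXXX folders
$sourceDir = 'C:\Incoming'           # loose files waiting to be sorted

foreach ($folder in Get-ChildItem -Path $destRoot | Where-Object { $_.PSIsContainer }) {
    $key = $folder.Name.Substring($folder.Name.Length - 15)   # trailing 15 chars
    Get-ChildItem -Path $sourceDir |
        Where-Object { -not $_.PSIsContainer -and $_.Name -like "*$key*" } |
        Move-Item -Destination $folder.FullName
}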