Handling logging across scripts importing multiple modules - powershell

I have a module that overrides the built-in Write-Verbose/Write-Error/Write-Debug cmdlets. The overrides run some custom code and also write each message to a text file named after the calling script (utilising $MyInvocation).
This module is imported into all my scripts. Those scripts also import other modules that themselves use Write-Verbose etc.
Now, I want all the output to end up in a single text file named after the original invoking script. This works for any Write-Verbose calls made directly in that script, but as soon as it calls functions from other modules, the module becomes the "calling script", so those Write-Verbose calls end up in a different log file named after the module file.
Is there a way around this?
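For reference, a minimal sketch of the kind of wrapper described above, with all names and details assumed:

# Logging.psm1 -- hypothetical sketch of the override described above.
function Write-Verbose {
    [CmdletBinding()]
    param([Parameter(Mandatory = $true, Position = 0)][string]$Message)

    # $MyInvocation.PSCommandPath is the file the call came from, so calls
    # made inside other modules name the log after the module file instead
    # of the original script -- which is exactly the problem described.
    $caller = $MyInvocation.PSCommandPath
    if ($caller) {
        $logFile = [System.IO.Path]::ChangeExtension($caller, '.log')
        Add-Content -Path $logFile -Value "VERBOSE: $Message"
    }

    # Forward to the real cmdlet via its module-qualified name.
    Microsoft.PowerShell.Utility\Write-Verbose -Message $Message
}
Export-ModuleMember -Function Write-Verbose

One possible direction: inside the wrapper, Get-PSCallStack can walk up past the immediate caller to the outermost script frame, rather than relying on $MyInvocation alone.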

Related

How to pass value to an exe and then the next button should be executed by powershell?

I have the code below, which launches an exe. I want the path assigned to a variable to be copied into the exe's text box, and once it has been copied, the exe's next button should be clicked/executed automatically by PowerShell.
& "D:\SOFTWARE\notepad ++.exe"
$Path="C:\Program Files"
I will actually be using this code for some other exe file, but the process would be the same. So is there any way I can do this using PowerShell?
Consider the snapshot of the installed application's UI: I want to pass the product key, which I have declared in a variable inside PowerShell, to the application, and then have its Get Product Details button clicked/run automatically.
What you actually want is to pass the variables into the MSI by running it quietly, without the UI. Try the following first (assuming your exe is actually just an MSI wrapper):
msiexec.exe /i $PathToExe /q /l*vx log.txt
This should quietly launch the MSI and produce a log.txt file. Wait a few moments for it to complete, since it installs in the background, then check that it installed correctly. Next, inspect the log file to see all the properties that were set at the end. Judging by their names or values, you can then pass them in on the next run simply by appending propertyname=propertyvalue to the command line. You might need to experiment and learn a bit about MSI and msiexec as part of this.
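Once the log reveals the relevant property names, a follow-up call might look like this (PIDKEY is only a guess at a common product-key property; substitute whatever your log actually shows):

$PathToExe = 'C:\Installers\MyApp.msi'   # assumed path to the MSI
msiexec.exe /i $PathToExe /q /l*vx log.txt PIDKEY=XXXXX-XXXXX-XXXXX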
If we are talking about invoking UI actions in general, then you would need to look at the AutomationElement class, but it can get very complex from here on.
You should look at using SendKeys.
https://blogs.technet.microsoft.com/heyscriptingguy/2011/01/10/provide-input-to-applications-with-powershell/
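Below is a minimal sketch of the SendKeys approach, assuming the launch delay and the tab order of the target application (both of which you would need to verify yourself):

Add-Type -AssemblyName System.Windows.Forms
$Path = "C:\Program Files"

# Launch the application and give its UI a moment to appear (crude but simple).
$proc = Start-Process "D:\SOFTWARE\notepad ++.exe" -PassThru
Start-Sleep -Seconds 2

# Bring the window to the foreground, type the value, then Tab to the next
# button and press it. SendKeys is fragile: it types into whichever window
# has focus, so the wait and the tab order here are assumptions.
$wshell = New-Object -ComObject WScript.Shell
$wshell.AppActivate($proc.Id) | Out-Null
[System.Windows.Forms.SendKeys]::SendWait($Path)
[System.Windows.Forms.SendKeys]::SendWait("{TAB}{ENTER}")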

Can I make a module from a bunch of single-function scripts?

We've accumulated a bunch of scripts, each of which looks and feels like a cmdlet: it has a set of declared params and then immediately calls a Main function which does the work, calling private sub-functions within.
An example is Remove-ContentLine.ps1, which just spits out the contents of a file (or piped input) except for lines matching some pattern.
So they're like little "function-scripts".
Is there any way I can aggregate these scripts into a module while also keeping them exactly as they are in files?
Edit
If your hunch is that it's easier to just copy-paste and refactor them into a .psm1, then just say so ;)
You ask:
Is there any way I can aggregate these scripts into a module while also keeping them exactly as they are in files?
But I am certain that is not what you really want. If so, then all of your code will immediately execute when you load the module! Rather, I think what you want is that each of your scripts should be contained within a function; that group of functions is then loaded when you import the module; and you can then execute any of your functions on demand.
The process is very straightforward, and I have written an extensive article on just how to do that (Further Down the Rabbit Hole: PowerShell Modules and Encapsulation) but I will summarize here:
(1) Edit each file to wrap the entire contents into a function and conclude with exporting the function. I would suggest naming the function after the file. Thus, Remove-ContentLine.ps1 should now look like this:
function Remove-ContentLine()
{
# original content of Remove-ContentLine.ps1 here
}
Export-ModuleMember Remove-ContentLine
(2) Decide on a name for your module and create a directory of that name. Let's call it MyModule. Within the MyModule directory, create a subdirectory to place all your .ps1 files; let's call that ScriptCmdlets.
(3) Create a module file MyModule.psm1 within MyModule whose contents will be exactly this:
Resolve-Path $PSScriptRoot\ScriptCmdlets\*.ps1 |
    ? { -not ($_.ProviderPath.Contains(".Tests.")) } |
    % { . $_.ProviderPath }
Yes, every module (.psm1) file I write contains that identical code!
(4) Create a module manifest MyModule.psd1 within MyModule using the New-ModuleManifest cmdlet.
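A minimal invocation might look like this (version and settings assumed; New-ModuleManifest fills in sensible defaults for everything you omit):

New-ModuleManifest -Path .\MyModule\MyModule.psd1 `
    -RootModule 'MyModule.psm1' `
    -ModuleVersion '1.0.0'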
Then to use your module, just use Import-Module. But I urge you to review my article for more details to gain a better understanding of the process.
I doubt you can if the scripts already execute something (a "main"). If they just expose a function, like Remove-ContentLine for Remove-ContentLine.ps1, you could dot-source all the scripts in a single script to aggregate them (see the sketch below), or use the ScriptsToProcess = @() entry when working with a module manifest.
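If the scripts only define functions, the aggregator could be as simple as this sketch (folder name assumed):

# Aggregate.ps1 -- dot-source every function-script in one place.
Get-ChildItem -Path "$PSScriptRoot\Scripts" -Filter *.ps1 |
    ForEach-Object { . $_.FullName }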
I think it would be best to refactor the functions from within each .ps1 into a proper module. It should be essentially just copy/pasting the scripts into a single .psm1 file and creating a .psd1 for it. Be sure to check for and properly handle anything that is set in the script or global scope, and make sure there are no naming conflicts between functions.
If you have Sapien PowerShell Studio, there is a 'New Module from Functions' option in the File menu which would help automate the bulk of this for you.

Need PowerShell help! Very strange things are happening

So I'm using PowerShell to manipulate a SharePoint 2010 library. I am uploading, downloading, and deleting files in a script using a custom module I made. My errors are so odd I can't understand them. I am using PowerGUI, the Windows PowerShell ISE, and the SharePoint Management Shell, all in admin mode.
PowerGUI:
I sometimes can't get an SPWeb object, and sometimes I can. The URL string is pulled from a CSV file, so it never changes, and neither does the code before I call Get-SPWeb -Identity $correctURL.
Sometimes when I read a list's RootFolder, its Exists property returns $false; using the Management Shell I can get past this. Otherwise I can "touch" it by calling $ListName.RootFolder.Files, and it will magically return $true for Exists in future executions of my script.
Then, when I read an XML file full of file properties (for uploaded files), $fileFieldsXML.row.Attributes | foreach {$_} returns the property names, and $fileFieldsXML.row.Attributes | foreach {$_.ToString()} returns the values. That is, unless I assign them to variables: when two distinct variables are set from these two different calls, both end up holding the array of property names! Why?
Windows PowerShell ISE and PowerShell Management Shell
I think these are just outdated somehow. I can call Get-SPWeb in the Management Shell, but I can't in the ISE, I guess due to outdated versions. Lately the Management Shell will act as if I haven't been doing anything to the files unless I close it and reopen it. Does the Management Shell just hold a copy of all files when it starts, or something? Can I make it pick up the updated files?
Can anyone suggest a better way to debug? Also, why does using a module seem to severely increase runtime? When everything was in the same script it was quick, but my long functions now take several times longer to execute.
I have also only been using PowerShell and SharePoint for almost two months now, as a beginner and an intern. Perhaps that is really the cause of my problems :)

Powershell: Include another script that has includes?

I want to make my life easier when writing scripts. I'm starting a little framework that will have a hierarchy of include files. The problem is that dot-sourcing a .ps1 script that itself dot-sources other files breaks when run from the original calling script: the nested relative paths no longer resolve.
It looks like this:
\config\loadvariables.ps1
$var = "shpc0001"
\config\config.ps1
. '.\loadvariables.ps1'
\test.ps1
. '.\config\config.ps1'
echo $var
The problem is that test.ps1 tries to load loadvariables.ps1 from its own directory, because the relative path in config.ps1 is resolved against the current working directory rather than against config.ps1's location.
How can I solve this?
The easiest way to manage a collection of scripts with inter-dependencies is to convert them to modules. This feature requires PowerShell 2.0 or later, but it allows you to separate a group of scripts into independent components with declared dependencies.
Here is a link to a tutorial on getting modules up and running
https://learn.microsoft.com/en-us/powershell/scripting/developer/module/how-to-write-a-powershell-script-module
As Jared said, modules are the way to go. But since you may even dot-source inside your modules, it is best to use full paths (which can still be calculated at run time) like so.
## Inside modules, you can refer to the module's location like so
. "$PSScriptRoot\loadvariables.ps1"
## Outside a module, you can do this
$ScriptRoot = Split-Path $MyInvocation.MyCommand.Path
. "$ScriptRoot\loadvariables.ps1"

Run a PowerShell script from another one

What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1, which does a different task.
Let me know your suggestions. Is dot sourcing the best option here?
Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If this is what you want, then dot-source.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function can). In many ways a script is a PowerShell function, with the file name taking the place of the function name.
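A quick sketch of both styles (the -Name parameter is hypothetical):

# a.ps1

# Call b.ps1 as a child script: it runs in its own scope, and you can
# pass parameters to it just as you would to a function.
& "$PSScriptRoot\b.ps1" -Name 'value'

# Dot-source b.ps1: it runs in the caller's scope, so its functions
# and variables remain available here afterwards.
. "$PSScriptRoot\b.ps1"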
Dot sourcing makes it easier to convert your script(s) into a module at a later stage: you won't have to change the script(s) into functions.
Another advantage of dot sourcing is that you can add the functions to your shell by adding the file that holds them to Microsoft.PowerShell_profile.ps1, meaning you have them available at all times (eliminating the need to worry about paths, etc.).
I have a short Write-Host at the top of each dot-sourced file with the name of the function and its common parameters, and I dot-source the functions in my profile. Each time I open PowerShell, the list of functions in my profile scrolls by. (If, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions starts to pile up.)
Old but still relevant.
I work with modules via Import-Module, which imports the module into the current PowerShell session.
To avoid stale cached copies, and to always have the latest changes from the module, I put a Get-Module | Remove-Module before it, which unloads all modules loaded in the current session.
Get-Module | Remove-Module            # unload every module in this session
Import-Module '.\IIS\Functions.psm1'  # re-import to pick up the latest changes