How to edit/update an NAnt script - service

I need to update an NAnt script automatically by fetching some data from a database. The solution I can think of is a service which fetches the data from the DB and updates the NAnt script.
Can this be done? If yes, how?

In theory, if you need to change how the script works, you could create a program to generate the NAnt build file, run that program with the <exec> task, include the generated file, and then call a target in it.
That seems a bit over-complicated though. I suppose it depends on how much the script will change based on the data.
If the data is simply configuration, then you can use it to set properties in your build script (either by the same mechanism above, or by creating a custom task that sets a property value based on the result of a SQL statement). Then use those properties to control flow in the build script with standard constructs such as if statements and foreach loops.
I don't think that there's anything built-in that will do this for you, but custom tasks are very easy to create if you can program.

Editing or updating a NAnt script does not change the execution that is already in progress. Instead you can generate .build files and execute them via the <nant> task, for example from a <foreach> loop or a <style> XSL transformation. An alternative is to write a small <script> task, in particular if you are comfortable programming it in C#. If you want more specific answers, more information would be helpful (which database is used, and what tools you can use to extract the data).
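For illustration, here is a minimal PowerShell sketch of the "generate a build file, then run it" approach both answers describe. The connection string and the BuildTargets(Name, Command) table are assumptions made up for the example:
# Query the database (hypothetical BuildTargets table) and emit one NAnt target per row.
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=.;Database=BuildConfig;Integrated Security=True'
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT Name, Command FROM BuildTargets'
$reader = $cmd.ExecuteReader()
$names = @(); $targets = @()
while ($reader.Read()) {
    $names   += $reader['Name']
    $targets += "  <target name=""$($reader['Name'])""><exec program=""$($reader['Command'])"" /></target>"
}
$reader.Close(); $conn.Close()
# Write a small build file that the main script can then run via the <nant> or <exec> task.
@"
<?xml version="1.0"?>
<project name="generated" default="all">
$($targets -join "`n")
  <target name="all" depends="$($names -join ',')" />
</project>
"@ | Set-Content generated.build
The main build script would then invoke generated.build with <nant buildfile="generated.build" /> (or regenerate it first with <exec>) and call whichever targets it needs.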

Can file time stamps be used to define dependencies in Psake PowerShell makefiles?

From what I have seen, Psake domain-specific PowerShell scripts do not evaluate whether dependent objects really need to be built; instead, the dependencies are always evaluated in order.
Is there a way to implement dependencies so that the script to build a make target, such as a file, is only executed if any of the dependent files are newer than the target file?
I experimented with precondition and postcondition with limited success, but this seems like a standard requirement and is in every UNIX-style "make" I've used in the past. It feels like I am missing something obvious. Help!
As far as I know, Psake does not have such tools. The similar PowerShell build tool Invoke-Build does. You may try it if "incremental" tasks are important for your build scripts. See its wiki pages:
Incremental Tasks
Partial Incremental Tasks
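To make that concrete, here is a minimal sketch of an Invoke-Build incremental task, saved in a *.build.ps1 script; the data folder and file names are assumptions. The task body runs only when merged.csv is missing or older than one of the inputs:
# Incremental task: Invoke-Build compares input and output timestamps before running the body.
task Merge -Inputs { Get-ChildItem data -Filter *.csv } -Outputs 'merged.csv' {
    # $Inputs and $Outputs are provided automatically inside an incremental task.
    Get-Content $Inputs | Set-Content $Outputs
}
Run it with Invoke-Build Merge; on a second run with unchanged inputs the task is skipped.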

How to give a PowerShell workflow access to previously imported modules

I'm trying to introduce PowerShell workflow into some existing scripts to take advantage of the parallel running capability.
Currently in the workflow I'm having to use:
InlineScript
{
    Import-Module My.Modules
    Execute-MyModulesCustomFunctionFromImportedModules -SomeVariable $Using:SomeVariableValue
}
Otherwise I get an error stating it can't find the custom function. Is there a better way to do this?
The article at http://www.powershellmagazine.com/2012/11/14/powershell-workflows/ confirms that having to import modules and then use them is just how it works - MS gets around this by creating WF activities for all its common PowerShell commands:
General workflow design strategy
It’s important to understand that the entire contents of the workflow
get translated into WF’s own language, which only understands
activities. With the exception of a few commands, Microsoft has
provided WF activities that correspond to most of the core PowerShell
cmdlets. That means most of PowerShell’s built-in commands—the ones
available before any modules have been imported—work fine.
That isn’t the case with add-in modules, though. Further, because each
workflow activity executes in a self-contained space, you can’t even
use Import-Module by itself in a workflow. You’d basically import a
module, but it would then go away by the time you tried to run any of
the module’s commands.
The solution is to think of a workflow as a high-level task
coordination mechanism. You’re likely to have a number of
InlineScript{} blocks within a workflow because the contents of those
blocks execute as a single unit, in a single PowerShell session.
Within an InlineScript{}, you can import a module and then run its
commands. Each InlineScript{} block that you include runs
independently, so think of each one as a standalone script file of
sorts: Each should perform whatever setup tasks are necessary for it
to run successfully.
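Putting that advice together, a minimal sketch of such a workflow looks like the following (the module and function names come from the question above, the rest is assumed); the Import-Module call is repeated inside the InlineScript because each block is its own session:
workflow Invoke-MyParallelWork {
    param([string[]]$Items)
    foreach -parallel ($Item in $Items) {
        InlineScript {
            # Each InlineScript runs in a separate PowerShell session, so import here.
            Import-Module My.Modules
            Execute-MyModulesCustomFunctionFromImportedModules -SomeVariable $Using:Item
        }
    }
}
Calling Invoke-MyParallelWork -Items 'a','b','c' then runs the InlineScript blocks in parallel, each importing the module for itself.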

How to protect a PowerShell file, and call a single function

I've had this problem for a while now and Google has its limits.
I'm writing a PowerShell file that contains several generic functions.
I use the functions in various scripts, and now I want to let other personnel at my work use them as well.
The problem is that, due to sensitive operations, I want to lock and protect the script (compile it to a DLL, EXE, etc.).
How do I create a PowerShell library like a C# DLL?
One option I tried, but could not get further with, is compiling the script to an executable file (.exe) using PowerGUI, but then I cannot access the functions in it, let alone pass parameters to them.
Hope you understood me :)
Thank you.
You don't. Rather than trying to obscure the sensitive information (if you compile the scripts, they can be decompiled and your "protected" resources will no longer be protected), remove those values entirely and make them parameters of your functions. This both protects your "sensitive" data and makes the code much more reusable.
You can then package your functions into a module.
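As a minimal sketch (the file and function names are made up for the example), such a module could be saved as My.Tools.psm1 in a My.Tools folder under one of the $env:PSModulePath locations:
# My.Tools.psm1 - sensitive values are parameters, not constants baked into the script.
function Invoke-SensitiveOperation {
    param(
        [Parameter(Mandatory)][string]$Server,
        [Parameter(Mandatory)][pscredential]$Credential
    )
    Write-Verbose "Connecting to $Server as $($Credential.UserName)"
    # ... perform the operation with the caller-supplied server and credential ...
}
Export-ModuleMember -Function Invoke-SensitiveOperation
Colleagues can then run Import-Module My.Tools and call the exported function, supplying their own values instead of ones stored in the script.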

Is it possible to run NUnit against a specific (long) list of tests

I have a list of several thousand NUnit tests that I want to run (generated automatically by another tool). (This is a subset of all of the tests, and changes frequently)
I'd like to be able to run these via NUnit-Console.exe. Unfortunately the /run option only takes a direct list of test names, which in my case would not fit on a single command line. I'd like it to pick the list up from a file.
I appreciate that I could use categories, but the list I want to run changes frequently and so I'd prefer not to have to start changing source code.
Does anyone know if there is a clean way to get NUnit to run my specified tests?
(I could break it down into a series of smaller calls to NUnit-console with a full command line, but that's not very elegant)
(If it's not possible, maybe I should add it as an NUnit feature request.)
Had a reply from Charlie Poole (from the NUnit development team) that this is not currently possible, but it has been added as a feature request for NUnit 2.6.
I see what you're saying, but like you say you can run a single fixture from the command line.
nunit-console /fixture:namespace.fixture tests.dll
How about generating all the tests in the same fixture? Or placing them all in the same assembly?
nunit-console tests.dll
As mentioned in the NUnit link, we need to specify the scenario/test case name. It's simple, but there is a bit of a trick to it. Directly specifying the test case name will not work; you will end up with 0 test cases executed. We need to write the exact path to it.
I don't know how it works for other languages, but using C# I have found a solution. Whenever we create a feature file, a corresponding feature.cs file gets created in Visual Studio. Open featureFileName.feature.cs, look for the namespace, and keep it aside (part 1):
namespace MMBank.Test.Features
Scroll down a bit and you will find the class name. Note that as well and keep it aside (part 2):
public partial class HistoricalTransactionFeature
Keep scrolling down and you will see the attributes that NUnit uses for execution:
[NUnit.Framework.TestAttribute()]
[NUnit.Framework.DescriptionAttribute("TC_1_A B C D")]
[NUnit.Framework.CategoryAttribute("MM_Bank")]
Below those attributes you can see the function/method name, which will most likely be TC_1_ABCD(certain parameters):
public virtual void TC_1_ABCD(string username, string password, string visit)
You will have multiple such methods depending on the number of scenarios in your feature file. Note the method (test case) which you want to execute and keep it aside (part 3).
Now join all the parts with dots. You will end up with something like this:
MMBank.Test.Features.HistoricalTransactionFeature.TC_1_ABCD
That's it. Similarly, you can create the test case names from multiple feature files and stack them up in a text file, one test case name per line. For the command syntax, you can browse the NUnit link above for execution from the command prompt.
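As a rough sketch of the batching fallback mentioned in the question (paths, batch size, and the exact /run syntax are assumptions that may need adjusting for your NUnit version), a small PowerShell wrapper can read that text file of fully-qualified test names and call nunit-console in chunks that fit on a command line:
# Read one fully-qualified test name per line and run them in batches via /run.
$tests = @(Get-Content .\testlist.txt | Where-Object { $_ })
$batchSize = 200
for ($i = 0; $i -lt $tests.Count; $i += $batchSize) {
    $last  = [Math]::Min($i + $batchSize, $tests.Count) - 1
    $batch = ($tests[$i..$last]) -join ','
    & nunit-console.exe "/run:$batch" tests.dll
}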

PowerShell in SQLCLR?

In the past I've been able to embed a scripting language (like JScript) inside SQLCLR, so scripts can be passed as parameters of functions to perform certain calculations. Here is a simplistic example (the function ssScriptExecute returns a concatenation of all the prints in the script):
select dbo.ssScriptExecute( 'print("Calculation: "+(1+2/3) );' )
-- Calculation: 1.6666666666666665
I'd love to be able to embed a PowerShell runtime in the same way. But I've had all sorts of problems because the runtime tries to find assemblies by path, and there are no paths inside SQLCLR. I'm happy to provide more information on the errors I get, but I was wondering if anybody has tried this!
Thanks!
I used IL code injection to modify System.Management.Automation,
making the version variable in GetPSVersionTable() return "2.0";
then I can run PowerShell code in SQL Server.
Be sure to reference this modified DLL in your Visual Studio project.
http://www.box.net/shared/57122v6erv9ss3aopq7p
By the way, to automate registering all the DLLs needed for running PowerShell in SQL Server,
you can use this .ps1 code:
http://www.box.net/shared/tdlpu1875clsu8azxq4b
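The linked script is not reproduced here, but as a hedged sketch of what such automated registration might look like (server, database, and DLL names are assumptions; Invoke-Sqlcmd comes from the SQLPS/SqlServer module), it would loop over the required DLLs and issue CREATE ASSEMBLY for each:
# Register each dependency in SQL Server so the CLR code can load it without path lookups.
$dlls = 'System.Management.Automation.dll', 'MyPowerShellClr.dll'   # hypothetical list
foreach ($dll in $dlls) {
    $name = [IO.Path]::GetFileNameWithoutExtension($dll)
    $path = (Resolve-Path $dll).Path
    $sql  = "CREATE ASSEMBLY [$name] FROM '$path' WITH PERMISSION_SET = UNSAFE"
    Invoke-Sqlcmd -ServerInstance '.' -Database 'MyDb' -Query $sql
}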
I think the only way to do this is to create a WCF service hosting PowerShell, and let SQLCLR send the request dbo.ssScriptExecute(...) to that service for execution.
Besides that, I've also successfully embedded paxScript.net in SQLCLR (an interpreter that does not have the memory leak problems of the DLR languages).
I thought SQLCLR was restricted to just a certain set of approved assemblies, and System.Management.Automation is not one of them.