I am trying to build a UI using Avalonia. I have been plowing through it, but I hit a snag. This is the example code they provide:
using ReactiveUI;

public class MyViewModel : ReactiveObject
{
    private string caption;

    public string Caption
    {
        get => caption;
        set => this.RaiseAndSetIfChanged(ref caption, value);
    }
}
I need properties that call RaiseAndSetIfChanged, which requires defining the get/set accessors for the backing variable within the class. I'm not sure how to do this in PowerShell.
I have tried things like this:
Powershell class implement get set property
To no avail, but maybe I wasn't doing it right.
Alternatively, is there a different way to wire up RaiseAndSetIfChanged after [MyViewModel]::new() has been called?
Link to Example:
https://avaloniaui.net/docs/binding/change-notifications
I figured out how to use ReactiveUI in a different way; I give examples in the readme of this fork I have created of the Avalonia module for PowerShell.
https://github.com/MaynardMiner/psavalonia
It has a method called RaisePropertyChanged, which I used as a workaround to update bindings.
After searching, I found no way to natively create a C#-style getter/setter in a PowerShell class.
Are there any tools out there to picture/diagram reports and subreports in a hierarchy? We currently have around 50+ reports that are structured like this and I would like to build a visual picture.
At present I am looking through all of the XML in the files to find the subreport references (with the help of Agent Ransack, of course), which is taking a while.
I am debating whether to write a quick-and-dirty tool in C# WinForms that perhaps automates Visio 2010, or to take advantage of the Telerik components that I purchased. I did come across the Visio Automation project on CodePlex. Any thoughts on these, or on any other diagramming tools that might help in my quest?
Update
I've started creating a .NET WinForms application using NShape, which looks like a reasonable diagramming object model.
I did consider doing something similar myself, but only spent an hour or so on it before other work commitments got in the way. However, I did prototype a simple bit of code in LINQPad that takes the SSRS reports path and the RDL of the main report and returns all the subreports.
It's not much, but it may help you with your project in terms of getting the subreports efficiently.
You could turn this into a simple app, I guess, but LINQPad makes life so much easier; get a free copy if you don't have it already and try this code.
void Main()
{
    string filePath = @"D:\Dev\SSRS Projects\MyReportsAreHerePath\";
    var x = XDocument.Load(filePath + "MyMasterReportNameHere.rdl");
    var ns = x.Root.GetDefaultNamespace();

    Func<string, report> getSubReports = null;
    getSubReports =
        fileName => new report
        {
            name = fileName,
            subReports =
                XDocument
                    .Load(filePath + fileName + ".rdl")
                    .Descendants(ns + "Subreport")
                    .Select(sr => getSubReports(sr.Element(ns + "ReportName").Value))
        };

    getSubReports("MyMasterReportNameHere").Dump();
}

public class report
{
    public string name { get; set; }
    public float top { get; set; }
    public IEnumerable<report> subReports { get; set; }
}
Regards,
Al
Being one who likes to document thoroughly, I was glad to discover the SupportsWildcards attribute, among others, added in PowerShell V3. I have decorated parameters in my library with that attribute as appropriate. In the long run there is no issue, but in the short term there are still plenty of folks using V2 for various reasons (including me in one environment).
It seems silly that just because of one attribute some of my functions can no longer run in PowerShell V2. So I am looking for a way to mock the attribute in V2 to essentially turn it into a "no-op".
The solution, as I see it, needs two parts:
create an essentially empty custom attribute.
make this take effect in V2 but be ignored in V3 (and hence allow the true V3 attribute to work properly).
I am looking for guidance on both parts, having not played with custom attributes before.
Perhaps you can try this.
Add-Type #"
public class CustomAttribute : System.Attribute
{
public bool SupportSomething { get; set; }
}
"#
function Do-Something {
param(
[CustomAttribute(SupportSomething=$true)]
$Command
)
}
$parameters = Get-Command -Name Do-Something | Select-Object -ExpandProperty Parameters
$parameters["Command"].Attributes
Then the output:
SupportSomething : True
TypeId : CustomAttribute
We first define the attribute in C# (you could also do this in PowerShell). Then we add the attribute to the parameter and get the list of attributes. See here for more attribute examples.
Hi all, I have a question regarding NUnit extensions (2.5.10).
What I am trying to do is write some additional test info to the database. For that I have created an NUnit extension using event listeners.
The problem I am experiencing is that the public void TestFinished(TestResult result) method is being called twice at runtime. My code that writes to the database is in this method, and that leaves me with duplicate entries in the database. The question is: is that the expected behaviour? Can I do something about it?
The extension code is below. Thanks.
using System;
using NUnit.Core;
using NUnit.Core.Extensibility;

namespace NuinitExtension
{
    [NUnitAddinAttribute(Type = ExtensionType.Core,
        Name = "Database Addin",
        Description = "Writes test results to the database.")]
    public class MyNunitExtension : IAddin, EventListener
    {
        public bool Install(IExtensionHost host)
        {
            IExtensionPoint listeners = host.GetExtensionPoint("EventListeners");
            if (listeners == null)
                return false;

            listeners.Install(this);
            return true;
        }

        public void RunStarted(string name, int testCount) { }
        public void RunFinished(TestResult result) { }
        public void RunFinished(Exception exception) { }
        public void TestStarted(TestName testName) { }

        public void TestFinished(TestResult result)
        {
            // this is just sample data
            SqlHelper.SqlConnectAndWRiteToDatabase("test", "test", 2.0, DateTime.Now);
        }

        public void SuiteStarted(TestName testName) { }
        public void SuiteFinished(TestResult result) { }
        public void UnhandledException(Exception exception) { }
        public void TestOutput(TestOutput testOutput) { }
    }
}
I have managed to fix the issue by simply removing my extension assembly from the NUnit 2.5.10\bin\net-2.0\addins folder. At the moment everything works as expected, but I am not sure why. I thought you had to have the extension/addin assembly inside the addins folder.
I am running tests by opening a solution via NUnit.exe. My extension project is part of the solution I am testing. I have also raised this issue with the NUnit guys and got the following explanation:
Most likely, your addin was being loaded twice. In order to make it easier to test addins, NUnit searches each test assembly for addins to be loaded, in addition to searching the addins directory. Normally, when you are confident that your addin works, you should remove it from the test assembly and install it in the addins folder. This makes it available to all tests that are run using NUnit. OTOH, if you really only want the addin to apply for a certain project, then you can leave it in the test assembly and not install it as a permanent addin.
http://groups.google.com/group/nunit-discuss/browse_thread/thread/c9329129fd803cb2/47672f15e7cc05d1#47672f15e7cc05d1
Not sure this answer is strictly relevant, but it might be useful.
I was having a play around with the NUnit library recently to read NUnit tests in so they could easily be transferred over to our own in-house acceptance testing framework.
It turns out we probably won't stick with this, but I thought it might be useful to share my experience of figuring out how to use the NUnit code.
It is different in that it doesn't get run by the NUnit console or GUI runner, but just by our own console app.
public class NUnitTestReader
{
    private TestHarness _testHarness;

    public void AddTestsTo(TestHarness testHarness)
    {
        _testHarness = testHarness;

        var package = new TestPackage(Assembly.GetExecutingAssembly().Location) { AutoBinPath = true };
        CoreExtensions.Host.InitializeService();

        var testSuiteBuilder = new TestSuiteBuilder();
        var suite = testSuiteBuilder.Build(package);

        AddTestsFrom(suite);
    }

    private void AddTestsFrom(Test node)
    {
        if (!node.IsSuite)
            AddTest(node);
        else
        {
            foreach (Test test in node.Tests)
                AddTestsFrom(test);
        }
    }

    private void AddTest(Test node)
    {
        _testHarness.AddTest(new WrappedNUnitTest(node, TestFilter.Empty));
    }
}
The above reads NUnit tests in from the current assembly, wraps them up, and then adds them to our in-house test harness. I haven't included these classes, but they're not really important to understanding how the NUnit code works.
The really useful bit of information here is the static call to InitializeService; this took quite a bit of figuring out, but it is necessary to get the basic set of test readers loaded in NUnit. You also need to be a bit careful when looking at the tests in NUnit itself, as they include failing tests (which I assume don't work because of the number of statics involved), so what looks like useful documentation is actually misleading.
Aside from that, you can then run the tests by implementing EventListener. I was interested in getting a one-to-one mapping between our tests and NUnit tests, so each test is run on its own. To achieve this you just need to implement TestStarted and TestFinished to do logging:
public void TestStarted(TestName testName)
{
}

public void TestFinished(TestResult result)
{
    string text;
    if (result.IsFailure)
        text = "Failure";
    else if (result.IsError)
        text = "Error";
    else
        return;

    using (var block = CreateLogBlock(text))
    {
        LogFailureTo(block);
        block.LogString(result.Message);
    }
}
There are a couple of problems with this approach: inherited test base classes from other assemblies with SetUp methods that delegate to ones in the current assembly don't get called. It also has problems with TestFixtureSetUp methods, which NUnit only calls when test suites are run (as opposed to running test methods on their own).
These both seem to be problems with NUnit, although if you don't want to construct wrapped tests individually, I think you could just put in a call to suite.Run with the appropriate parameters, and that will fix the latter problem.
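For what it's worth, here is a rough, untested sketch of what that suite.Run call might look like. It assumes the NUnit 2.5 Core Test.Run(EventListener, ITestFilter) overload and the TestResult.FullName property, and reuses the package/suite setup from the reader above; the SuiteRunner class name is just for illustration.

using System;
using System.Reflection;
using NUnit.Core;

public class SuiteRunner : EventListener
{
    public TestResult RunAll()
    {
        var package = new TestPackage(Assembly.GetExecutingAssembly().Location) { AutoBinPath = true };
        CoreExtensions.Host.InitializeService();

        var suite = new TestSuiteBuilder().Build(package);

        // Run the whole suite with this class acting as the listener, so
        // TestFixtureSetUp/TestFixtureTearDown fire the way NUnit expects.
        return suite.Run(this, TestFilter.Empty);
    }

    // Only log failures and errors; everything else is a no-op.
    public void TestFinished(TestResult result)
    {
        if (result.IsFailure || result.IsError)
            Console.WriteLine("{0}: {1}", result.FullName, result.Message);
    }

    public void RunStarted(string name, int testCount) { }
    public void RunFinished(TestResult result) { }
    public void RunFinished(Exception exception) { }
    public void TestStarted(TestName testName) { }
    public void SuiteStarted(TestName testName) { }
    public void SuiteFinished(TestResult result) { }
    public void UnhandledException(Exception exception) { }
    public void TestOutput(TestOutput testOutput) { }
}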
I am redesigning a command line application and am looking for a way to make its use more intuitive. Are there any conventions for the format of parameters passed into a command line application? Or any other method that people have found useful?
I see a lot of Windows command line specifics, but if your program is intended for Linux, I find the GNU command line standard to be the most intuitive. Basically, it uses double hyphens for the long form of a command (e.g., --help) and a single hyphen for the short version (e.g., -h). You can also "stack" the short versions together (e.g., tar -zxvf filename) and mix 'n match long and short to your heart's content.
The GNU site also lists standard option names.
The getopt library greatly simplifies parsing these commands. If C's not your bag, Python has a similar library, as does Perl.
If you are using C#, try Mono.GetOptions; it's a very powerful and simple-to-use command-line argument parser. It works in Mono environments and with the Microsoft .NET Framework.
EDIT: Here are a few features
Each param has 2 CLI representations (1 character and string, e.g. -a or --add)
Default values
Strongly typed
Automagically produces a help screen with instructions
Automagically produces a version and copyright screen
One thing I like about certain CLIs is the use of shortcuts.
That is, all of the following lines do the same thing:
myCli.exe describe someThing
myCli.exe descr someThing
myCli.exe desc someThing
That way, the user doesn't have to type the whole command every time.
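One simple way to support such shortcuts (a sketch only; the CommandAliases class and the command names are made up for illustration) is an explicit alias table that maps every accepted spelling to the canonical command:

using System.Collections.Generic;

static class CommandAliases
{
    // Every accepted spelling maps to the canonical command name.
    static readonly Dictionary<string, string> Aliases =
        new Dictionary<string, string>
        {
            { "describe", "describe" },
            { "descr",    "describe" },
            { "desc",     "describe" },
        };

    public static string Canonical(string input)
    {
        string command;
        return Aliases.TryGetValue(input.ToLowerInvariant(), out command) ? command : null;
    }
}

Canonical("desc"), Canonical("descr") and Canonical("describe") all return "describe"; anything unrecognised returns null.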
A good and helpful reference:
https://commandline.codeplex.com/
Library available via NuGet:
Latest stable: Install-Package CommandLineParser.
Latest release: Install-Package CommandLineParser -pre.
One line parsing using default singleton: CommandLine.Parser.Default.ParseArguments(...).
One line help screen generator: HelpText.AutoBuild(...).
Map command line arguments to IList<string>, arrays, enum or standard scalar types.
Plug-In friendly architecture as explained here.
Define verb commands such as git commit -a.
Create parser instance using lambda expressions.
QuickStart: https://commandline.codeplex.com/wikipage?title=Quickstart&referringTitle=Documentation
// Requires the CommandLine and CommandLine.Text namespaces (Install-Package CommandLineParser)

// Define a class to receive parsed values
class Options {
    [Option('r', "read", Required = true,
        HelpText = "Input file to be processed.")]
    public string InputFile { get; set; }

    [Option('v', "verbose", DefaultValue = true,
        HelpText = "Prints all messages to standard output.")]
    public bool Verbose { get; set; }

    [ParserState]
    public IParserState LastParserState { get; set; }

    [HelpOption]
    public string GetUsage() {
        return HelpText.AutoBuild(this,
            (HelpText current) => HelpText.DefaultParsingErrorsHandler(this, current));
    }
}

// Consume them
static void Main(string[] args) {
    var options = new Options();
    if (CommandLine.Parser.Default.ParseArguments(args, options)) {
        // Values are available here
        if (options.Verbose) Console.WriteLine("Filename: {0}", options.InputFile);
    }
}
The best thing to do is to not assume anything if you can avoid it. When the operator types in your application name and does not supply any parameters, either hit them with a USAGE block or, alternatively, open a Windows Form and allow them to enter everything you need.
c:\>FOO
FOO
USAGE FOO -{Option}{Value}
-A Do A stuff
-B Do B stuff
c:\>
Parameter delimiting I place under the heading of a religious topic: hyphens(dashes), double hyphens, slashes, nothing, positional, etc.
You didn't indicate your platform, but for the next comment I will assume Windows and .NET.
You can create a console-based application in .NET and allow it to interact with the desktop using Forms, just by choosing the console-based project type and then adding references to System.Windows.Forms, System.Drawing, etc. A sketch of both ideas follows.
We do this all the time. This assures that no one takes a turn down a dark alley.
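As a rough sketch of that rule (the usage text mirrors the FOO example above; OptionsForm is a hypothetical form you would add yourself after referencing System.Windows.Forms and System.Drawing):

using System;

static class Program
{
    [STAThread]   // needed once you show WinForms UI from the console app
    static void Main(string[] args)
    {
        if (args.Length == 0)
        {
            // No parameters: don't guess. Show usage...
            Console.WriteLine("USAGE FOO -{Option}{Value}");
            Console.WriteLine("  -A   Do A stuff");
            Console.WriteLine("  -B   Do B stuff");

            // ...or collect the values interactively instead:
            // System.Windows.Forms.Application.EnableVisualStyles();
            // System.Windows.Forms.Application.Run(new OptionsForm());
            return;
        }

        // Parameters were supplied; handle -A / -B here.
    }
}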
Command line conventions vary from OS to OS, but the convention that has probably gotten both the most use and the most public scrutiny is the one supported by the GNU getopt package. See http://www.gnu.org/software/libc/manual/html_node/Using-Getopt.html for more info.
It allows you to mix single letter commands, such as -nr, with longer, self-documenting options, such as --numeric --reverse. Be nice, and implement a --help (-?) option and then your users will be able to figure out all they need to know.
Here's a CodeProject article that might help you out...
C#/.NET Command Line Arguments Parser
If VB is your flavor, here's a separate article (with a bit more guidance-related content) to check out...
Parse and Validate Command Line Parameters with VB.NET
Complementing @vonc's answer, don't accept ambiguous abbreviations. E.g.:
myCli.exe describe someThing
myCli.exe destroy someThing
myCli.exe des someThing ???
In fact, in that case, I probably wouldn't accept an abbreviation for "destroy"...
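A small sketch of that check (the command list and the AbbreviationCheck class are made up for illustration): collect every command the abbreviation could match and refuse to pick one if there is more than a single hit.

using System;
using System.Linq;

static class AbbreviationCheck
{
    static readonly string[] Commands = { "describe", "destroy" };

    public static string Resolve(string abbreviation)
    {
        var matches = Commands
            .Where(c => c.StartsWith(abbreviation, StringComparison.OrdinalIgnoreCase))
            .ToArray();

        if (matches.Length == 1)
            return matches[0];                       // "desc" -> "describe"

        if (matches.Length > 1)                      // "des"  -> ambiguous
            Console.Error.WriteLine("'{0}' is ambiguous: {1}",
                abbreviation, string.Join(", ", matches));
        else
            Console.Error.WriteLine("Unknown command '{0}'", abbreviation);

        return null;
    }
}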
I always add a /? parameter to get help, and I always try to have a default (i.e., most common scenario) implementation.
Otherwise I tend to use "/x" for switches and "/x:value" for switches that require values to be passed. That makes it pretty easy to parse the parameters using regular expressions.
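For instance, a rough sketch of that regex approach (the SwitchParser class and switch names are placeholders, not a library API):

using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

static class SwitchParser
{
    // Matches "/x" or "/x:value"; the value part is optional.
    static readonly Regex Switch = new Regex(@"^/(?<name>\w+)(?::(?<value>.*))?$");

    public static Dictionary<string, string> Parse(string[] args)
    {
        var switches = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        foreach (var arg in args)
        {
            var match = Switch.Match(arg);
            if (!match.Success)
                continue;   // not a /switch; handle positional arguments elsewhere

            switches[match.Groups["name"].Value] =
                match.Groups["value"].Success ? match.Groups["value"].Value : null;
        }
        return switches;
    }
}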
I developed this framework; maybe it helps:
SysCommand is a powerful cross-platform framework for developing console applications in .NET. It is simple, type-safe, and strongly influenced by the MVC pattern.
https://github.com/juniorgasparotto/SysCommand
namespace Example.Initialization.Simple
{
    using SysCommand.ConsoleApp;

    public class Program
    {
        public static int Main(string[] args)
        {
            return App.RunApplication();
        }
    }

    // Classes inheriting from `Command` will be automatically found by the system
    // and their public properties and methods will be available for use.
    public class MyCommand : Command
    {
        public void Main(string arg1, int? arg2 = null)
        {
            if (arg1 != null)
                this.App.Console.Write(string.Format("Main arg1='{0}'", arg1));
            if (arg2 != null)
                this.App.Console.Write(string.Format("Main arg2='{0}'", arg2));
        }

        public void MyAction(bool a)
        {
            this.App.Console.Write(string.Format("MyAction a='{0}'", a));
        }
    }
}
Tests:
// auto-generate help
$ my-app.exe help
// method "Main" typed
$ my-app.exe --arg1 value --arg2 1000
// or without "--arg2"
$ my-app.exe --arg1 value
// actions support
$ my-app.exe my-action -a
-operation [parameters] -command [your command] -anotherthings [otherparams]....
For example,
YourApp.exe -file %YourProject.prj% -Secure true
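If you go with that shape, parsing is little more than walking the arguments in pairs. A throwaway sketch (the PairParser name is made up for illustration):

using System;
using System.Collections.Generic;

static class PairParser
{
    // Turns { "-file", "YourProject.prj", "-Secure", "true" } into a name/value dictionary.
    public static Dictionary<string, string> Parse(string[] args)
    {
        var options = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        for (int i = 0; i + 1 < args.Length; i += 2)
        {
            if (!args[i].StartsWith("-"))
                throw new ArgumentException("Expected an option name, got: " + args[i]);

            options[args[i].TrimStart('-')] = args[i + 1];
        }
        return options;
    }
}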
If you use one of the standard tools for generating command line interfaces, like getopts, then you'll conform automatically.
The conventions that you use for your application would depend on
1) What type of application it is.
2) What operating system you are using.
This is definitely true. I'm not certain about DOS prompt conventions, but on Unix-like systems the general conventions are roughly:
1) Formatting is
appName parameters
2) Single character parameters (such as 'x') are passed as -x
3) Multi character parameters (such as 'add-keys') are passed as --add-keys
The conventions that you use for your application would depend on
1) What type of application it is.
2) What operating system you are using. Linux? Windows? They both have different conventions.
What I would suggest is to look at the command line interfaces of other commands on your system, paying special attention to the parameters they take. Incorrect parameters should give the user a solution-directed error message, and an easy-to-find help screen can aid usability as well.
Without knowing exactly what your application will do, it's hard to give specific examples.
If you're using Perl, my CLI::Application framework might be just what you need. It lets you easily build applications with an SVN/CVS/Git-like user interface ("your-command -o --long-opt some-action-to-execute some parameters").
I've created a .Net C# library that includes a command-line parser. You just need to create a class that inherits from the CmdLineObject class, call Initialize, and it will automatically populate the properties. It can handle conversions to different types (uses an advanced conversion library also included in the project), arrays, command-line aliases, click-once arguments, etc. It even automatically creates command-line help (/?).
If you are interested, the URL to the project is http://bizark.codeplex.com. It is currently only available as source code.
I've just released an even better command line parser.
https://github.com/gene-l-thomas/coptions
It's on NuGet: Install-Package coptions
using System;
using System.Collections.Generic;
using coptions;

[ApplicationInfo(Help = "This program does something useful.")]
public class Options
{
    [Flag('s', "silent", Help = "Produce no output.")]
    public bool Silent;

    [Option('n', "name", "NAME", Help = "Name of user.")]
    public string Name
    {
        get { return _name; }
        set
        {
            if (String.IsNullOrWhiteSpace(value))
                throw new InvalidOptionValueException("Name must not be blank");
            _name = value;
        }
    }
    private string _name;

    [Option("size", Help = "Size to output.")]
    public int Size = 3;

    [Option('i', "ignore", "FILENAME", Help = "Files to ignore.")]
    public List<string> Ignore;

    [Flag('v', "verbose", Help = "Increase the amount of output.")]
    public int Verbose = 1;

    [Value("OUT", Help = "Output file.")]
    public string OutputFile;

    [Value("INPUT", Help = "Input files.")]
    public List<string> InputFiles;
}
namespace coptions.ReadmeExample
{
    class Program
    {
        static int Main(string[] args)
        {
            try
            {
                Options opt = CliParser.Parse<Options>(args);
                Console.WriteLine(opt.Silent);
                Console.WriteLine(opt.OutputFile);
                return 0;
            }
            catch (CliParserExit)
            {
                // --help
                return 0;
            }
            catch (Exception e)
            {
                // unknown options etc...
                Console.Error.WriteLine("Fatal Error: " + e.Message);
                return 1;
            }
        }
    }
}
Supports automatic --help generation, verbs, e.g. command.exe
Enjoy.