How to pass a parameter from C# to a PowerShell script file? - powershell

From the command line I can do:
.\test.ps1 1
How do I pass the parameter when doing this from C#?
I've tried
.AddArgument(1)
.AddParameter("p", 1)
I have also tried passing the values as IEnumerable&lt;object&gt; to .Invoke(), but $p never receives the value.
namespace ConsoleApplication1
{
using System;
using System.Linq;
using System.Management.Automation;
class Program
{
static void Main()
{
// Contents of ps1 file
// param($p)
// "Hello World ${p}"
var script = @".\test.ps1";
PowerShell
.Create()
.AddScript(script)
.Invoke().ToList()
.ForEach(Console.WriteLine);
}
}
}

How's this?
static void Main()
{
string script = @"C:\test.ps1 -arg 'hello world!'";
StringBuilder sb = new StringBuilder();
PowerShell psExec = PowerShell.Create();
psExec.AddScript(script);
psExec.AddCommand("out-string");
Collection<PSObject> results;
Collection<ErrorRecord> errors;
results = psExec.Invoke();
errors = psExec.Streams.Error.ReadAll();
if (errors.Count > 0)
{
foreach (ErrorRecord error in errors)
{
sb.AppendLine(error.ToString());
}
}
else
{
foreach (PSObject result in results)
{
sb.AppendLine(result.ToString());
}
}
Console.WriteLine(sb.ToString());
}
Here's a similar version that passes a DateTime instance:
static void Main()
{
StringBuilder sb = new StringBuilder();
PowerShell psExec = PowerShell.Create();
psExec.AddCommand(@"C:\Users\d92495j\Desktop\test.ps1");
psExec.AddArgument(DateTime.Now);
Collection<PSObject> results;
Collection<ErrorRecord> errors;
results = psExec.Invoke();
errors = psExec.Streams.Error.ReadAll();
if (errors.Count > 0)
{
foreach (ErrorRecord error in errors)
{
sb.AppendLine(error.ToString());
}
}
else
{
foreach (PSObject result in results)
{
sb.AppendLine(result.ToString());
}
}
Console.WriteLine(sb.ToString());
}

So, to make a long answer short: use AddCommand instead of AddScript.
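For completeness, a minimal sketch of the parameterized call, assuming the same test.ps1 with a param($p) block as in the question:

```csharp
using System;
using System.Management.Automation;

class Program
{
    static void Main()
    {
        // AddCommand treats the path as a command, so AddParameter/AddArgument
        // bind to the script's param() block.
        using (PowerShell ps = PowerShell.Create())
        {
            ps.AddCommand(@".\test.ps1").AddParameter("p", 1);
            foreach (PSObject result in ps.Invoke())
            {
                Console.WriteLine(result); // "Hello World 1"
            }
        }
    }
}
```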

Another way is to fill the runspace with variables:
public static string RunPs1File(string filePath, Dictionary<string, object> arguments)
{
var result = new StringBuilder();
using (Runspace space = RunspaceFactory.CreateRunspace())
{
space.Open();
foreach (KeyValuePair<string, object> variable in arguments)
{
var key = new string(variable.Key.Where(char.IsLetterOrDigit).ToArray());
space.SessionStateProxy.SetVariable(key, variable.Value);
}
string script = System.IO.File.ReadAllText(filePath);
using (PowerShell ps1 = PowerShell.Create())
{
ps1.Runspace = space;
ps1.AddScript(script);
var psOutput = ps1.Invoke();
var errors = ps1.Streams.Error;
if (errors.Count > 0)
{
var e = errors[0].Exception;
ps1.Streams.ClearStreams();
throw e;
}
foreach (var line in psOutput)
{
if (line != null)
{
result.AppendLine(line.ToString());
}
}
}
}
return result.ToString();
}

Currently, a simpler solution is to read the script file into a string first:
PowerShell
.Create()
.AddScript(File.ReadAllText(script)).AddParameter("p", 1)
.Invoke().ToList()
.ForEach(Console.WriteLine);

Related

GameObject.Find not working with string var

I cannot get GameObject.Find to work by passing a string variable as an argument.
Here is the code:
public IEnumerator GetControls(string uri)
{
using (UnityWebRequest webRequest = UnityWebRequest.Get(uri))
{
yield return webRequest.SendWebRequest();
controls = new JSONObject(webRequest.downloadHandler.text);
GameObject controlGO;
string keyName;
foreach (JSONObject control in controls)
{
keyName = control["keyName"].ToString();
controlGO = GameObject.Find(keyName); //WONT FIND IT
controlGO = GameObject.Find("growLight3"); //WILL FIND IT
}
}
}

Method listDuplicates() error

Can somebody help me with this problem? I'm a newbie. I'm trying to find duplicate files (files with the same content) in a directory and write a text file listing the duplicates, but now it says "Input string was not in a correct format".
public static List<FileInfo> files = new List<FileInfo>();
public static void ListDrive(string drive)
{
try
{
DirectoryInfo di = new DirectoryInfo(drive);
foreach (FileInfo fi in di.GetFiles())
{
files.Add(fi);
}
}
catch (UnauthorizedAccessException)
{ }
}
//Find duplicates
public static void ListDuplicates()
{
var duplicatedFiles = files.GroupBy(x => new { x.Length }).Where(t => t.Count() > 1).ToList();
Console.WriteLine("Total items: {0}", files.Count);
Console.WriteLine("Probably duplicates {0} ", duplicatedFiles.Count());
StreamWriter duplicatesFoundLog = new StreamWriter("log.txt");
foreach (var filter in duplicatedFiles)
{
duplicatesFoundLog.WriteLine("Probably duplicated item: Name: { 0}, Length: { 1}",
filter.Key.Length);
var items = files.Where(x => x.Length == filter.Key.Length).ToList();
int c = 1;
foreach (var suspected in items)
{
duplicatesFoundLog.WriteLine("{3},{ 0}- { 1}, Creation date { 2}",
suspected.Name, suspected.FullName, suspected.CreationTime, c);
c++;
}
duplicatesFoundLog.WriteLine();
}
duplicatesFoundLog.Flush();
duplicatesFoundLog.Close();
}
Here is my client method that invokes the two methods
try
{
Console.WriteLine("Enter the path");
string path = Console.ReadLine();
ListDrive(path);
ListDuplicates();
Console.ReadLine();
}
catch (Exception e)
{
Console.WriteLine(e.Message);
Console.ReadLine();
}
Your help will be highly appreciated...
Please eliminate the space, for example change { 0} to {0}. If you do this for all locations the error should go away.
If you want spaces, make the code "{3}, {0}- {1}, Creation date {2}", i.e. add the space before the opening brace, not after it. A format item must not contain a space after the opening brace, or it will not be parsed as a format item.
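For illustration, the first WriteLine could read as follows. Note that the original also supplies a single argument for two format items, which would cause a further FormatException; the sketch below logs just the shared length, since the group key only contains Length:

```csharp
// No space after '{', and exactly one argument per format item.
duplicatesFoundLog.WriteLine("Probably duplicated items: Length: {0}",
    filter.Key.Length);
```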

SharePoint Backup Tool for Custom Lists

I have a SharePoint 2013 document library with three custom lists.
Once a day I would like to backup the custom lists as excel documents.
Is there an inbuilt functionality in SharePoint 2013 which can be configured as a recurring task?
Or should one use PowerShell or CSOM to write script or an application which is then run by a Windows Task?
There are no out-of-the-box features to do this. I had the same requirement and wrote a utility; the code below gives you the output as an .xls file:
class Program
{
private static DataTable dataTable;
private static SPList list;
static void Main(string[] args)
{
try
{
Console.WriteLine("Site Url: ");
string _siteUrl = Console.ReadLine();
if (!string.IsNullOrEmpty(_siteUrl))
{
SPSecurity.RunWithElevatedPrivileges(delegate()
{
using (SPSite site = new SPSite(_siteUrl))
{
if (site != null)
{
SPWeb web = site.RootWeb;
if (web != null)
{
// Export List code segment
Console.WriteLine("List Name:");
string _listName = Console.ReadLine();
if (!string.IsNullOrEmpty(_listName))
{
list = web.Lists[_listName];
if (list != null)
{
dataTable = new DataTable();
//Adds Columns to SpreadSheet
InitializeExcel(list, dataTable);
string _schemaXML = list.DefaultView.ViewFields.SchemaXml;
if (list.Items != null && list.ItemCount > 0)
{
foreach (SPListItem _item in list.Items)
{
DataRow dr = dataTable.NewRow();
foreach (DataColumn _column in dataTable.Columns)
{
if (dataTable.Columns[_column.ColumnName] != null && _item[_column.ColumnName] != null)
{
dr[_column.ColumnName] = _item[_column.ColumnName].ToString();
}
}
dataTable.Rows.Add(dr);
}
}
}
}
System.Web.UI.WebControls.DataGrid grid = new System.Web.UI.WebControls.DataGrid();
grid.HeaderStyle.Font.Bold = true;
grid.DataSource = dataTable;
grid.DataBind();
using (StreamWriter streamWriter = new StreamWriter("C:\\" + list.Title + ".xls", false, Encoding.UTF8))
{
using (HtmlTextWriter htmlTextWriter = new HtmlTextWriter(streamWriter))
{
grid.RenderControl(htmlTextWriter);
}
}
Console.WriteLine("File Created");
}
}
}
});
}
}
catch (Exception ex)
{
Console.WriteLine("Error: " + ex.Message);
}
Console.ReadLine();
}
// Create excel function
public static void InitializeExcel(SPList list, DataTable _datatable)
{
if (list != null)
{
string _schemaXML = list.DefaultView.ViewFields.SchemaXml;
if (list.Items != null && list.ItemCount > 0)
{
foreach (SPListItem _item in list.Items)
{
foreach (SPField _itemField in _item.Fields)
{
if (_schemaXML.Contains(_itemField.InternalName))
{
if (_item[_itemField.InternalName] != null)
{
if (!_datatable.Columns.Contains(_itemField.InternalName))
{
// Use InternalName so the column name matches the Contains check above
// and the _item[_column.ColumnName] lookups when filling rows.
_datatable.Columns.Add(new DataColumn(_itemField.InternalName, Type.GetType("System.String")));
}
}
}
}
}
}
}
}
}

Efficiently counting lines in a file

I'm trying to count the lines in a not so small text file (multiple MBs). The answers I found here suggest this:
(Get-Content foo.txt | Measure-Object -Line).Lines
This works, but the performance is poor. I guess the whole file is loaded into memory instead of streaming it line by line.
I created a test program in Java to compare the performance:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Scanner;
import java.util.concurrent.TimeUnit;
import java.util.function.ToLongFunction;
import java.util.stream.Stream;
public class LineCounterPerformanceTest {
public static void main(final String... args) {
if (args.length > 0) {
final String path = args[0];
measure(LineCounterPerformanceTest::java, path);
measure(LineCounterPerformanceTest::powershell, path);
} else {
System.err.println("Missing path.");
System.exit(-1);
}
}
private static long java(final String path) throws IOException {
System.out.println("Java");
try (final Stream<String> lines = Files.lines(Paths.get(path))) {
return lines.count();
}
}
private static long powershell(final String path) throws IOException, InterruptedException {
System.out.println("Powershell");
final Process ps = new ProcessBuilder("powershell", String.format("(Get-Content '%s' | Measure-Object -Line).Lines", path)).start();
if (ps.waitFor(1, TimeUnit.MINUTES) && ps.exitValue() == 0) {
try (final Scanner scanner = new Scanner(ps.getInputStream())) {
return scanner.nextLong();
}
}
throw new IOException("Timeout or error.");
}
private static <T, U extends T> void measure(final ExceptionalToLongFunction<T> function, final U value) {
final long start = System.nanoTime();
final long result = function.unchecked().applyAsLong(value);
final long end = System.nanoTime();
System.out.printf("Result: %d%n", result);
System.out.printf("Elapsed time (ms): %,.6f%n%n", (end - start) / 1_000_000.);
}
@FunctionalInterface
private static interface ExceptionalToLongFunction<T> {
long applyAsLong(T value) throws Exception;
default ToLongFunction<T> unchecked() {
return (value) -> {
try {
return applyAsLong(value);
} catch (final Exception ex) {
throw new RuntimeException(ex);
}
};
}
}
}
The plain Java solution is ~80 times faster.
Is there a built-in way to do this task with comparable performance? I'm on PowerShell 4.0, if that matters.
See if this isn't faster than your current method:
$count = 0
Get-Content foo.txt -ReadCount 2000 |
foreach { $Count += $_.count }
$count
You can use a StreamReader for this type of thing. Not sure how its speed compares to your Java code but my understanding is that only a single line at a time is being loaded by the ReadLine method.
$StreamReader = New-Object System.IO.StreamReader($File)
$LineCount = 0
while ($StreamReader.ReadLine() -ne $null)
{
$LineCount++
}
$StreamReader.Close()
switch -File was faster for my GB+ file with lines of 900+ characters:
$count = 0; switch -File $filepath {default { ++$count }}

build index using lucene.net 2.9.2.2

I have to use lucene.net 2.9.2.2 with NHibernate 3.0. I have started to edit this old code:
public void BuildSearchIndex()
{
FSDirectory entityDirectory = null;
IndexWriter writer = null;
var entityType = typeof(MappedSequence);
var indexDirectory = new DirectoryInfo(GetIndexDirectory());
if (indexDirectory.Exists)
{
indexDirectory.Delete(true);
}
try
{
entityDirectory = FSDirectory.GetDirectory(Path.Combine(indexDirectory.FullName, entityType.Name), true);
writer = new IndexWriter(entityDirectory, new StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_29), true, IndexWriter.MaxFieldLength.UNLIMITED);
}
finally
{
if (entityDirectory != null)
{
entityDirectory.Close();
}
if (writer != null)
{
writer.Close();
}
}
IFullTextSession fullTextSession = Search.CreateFullTextSession(this.Session);
// Iterate through Suppliers and add them to Lucene's index
foreach (MappedSequence instance in Session.CreateCriteria(typeof(MappedSequence)).List<MappedSequence>())
{
fullTextSession.Index(instance);
}
}
private string GetIndexDirectory()
{
INHSConfigCollection nhsConfigCollection = CfgHelper.LoadConfiguration();
string property = nhsConfigCollection.DefaultConfiguration.Properties["hibernate.search.default.indexBase"];
var fi = new FileInfo(property);
return Path.Combine(AppDomain.CurrentDomain.BaseDirectory, fi.Name);
}
to build the index. The line:
FSDirectory.GetDirectory(Path.Combine(indexDirectory.FullName, entityType.Name), true);
still uses obsolete code. Could anyone be so kind as to point out the necessary change? Thanks.
Christian
Try using FSDirectory.Open(path) instead.
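An untested sketch of the replacement, assuming Lucene.Net 2.9's FSDirectory.Open(DirectoryInfo) overload:

```csharp
// FSDirectory.Open replaces the obsolete FSDirectory.GetDirectory.
var indexPath = new System.IO.DirectoryInfo(
    System.IO.Path.Combine(indexDirectory.FullName, entityType.Name));
entityDirectory = FSDirectory.Open(indexPath);
```

Note that Open has no equivalent of GetDirectory's second (create) argument; the true already passed to the IndexWriter constructor takes care of creating/recreating the index.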