I am trying to write an automated PowerShell script to convert LAS files with PotreeConverter.
Here is my script:
$ToBeConvertedFolders = 'D:\potree_converter\ToBeConvertedFolders\TestExport1.0'
$ApacheHtdocsFolder = 'C:\xampp\htdocs\projets\'
$InputFormat = 'las'
$PotreeConverter = 'D:\potree_converter\potreeconverter.exe'
Get-ChildItem "$ToBeConvertedFolders\LAS\*.$InputFormat" |
ForEach-Object {
$Filename = $_.BaseName
$InputPath = "`"$ToBeConvertedFolders\$Filename.$InputFormat`""
$OutputPath = "`"$ApacheHtdocsFolder\$Filename`""
Start-Process $PotreeConverter -ArgumentList "$InputPath -o $OutputPath --page-template c:\potree_converter\resources\page_template\ --generate-page $Filename --title TITLE HERE" -Wait
}
The script runs fine and starts to process the LAS file, and I get the HTML output file, but the converter quits without completing the conversion.
Could you please suggest anything to solve this problem?
Thank you.
I am using the following script to call an API to create a bunch of log files. It loops through the first and second items on each line of a text file (the same way tokens are defined in CMD). This works fine:
$lines = @(Get-Content -Path "$path") -Replace " "
ForEach($line in $lines) {
$s = $line -split ","
$var1s = $s[0]
$var2s = $s[1]
Start-Process -FilePath $APIpath -ArgumentList "$argument1", "$argument2", "$argument3", "$path\folder\$var2s.log"
}
However, this does not wait until the files are created before executing the next statement, which causes my script to fail. Is there a way to wait until each $var2s log file is created?
Start-Process is asynchronous, in the sense that it instructs the operating system to create and start a new process, and then immediately returns - a classic "fire-and-forget" mechanism.
For this reason, you will likely see multiple processes running simultaneously, which can cause issues with access to shared file resources.
To prevent such a race condition, use the -Wait switch parameter with Start-Process to force it to wait until the resulting process has exited before returning:
$lines = @(Get-Content -Path "$path") -Replace " "
ForEach($line in $lines) {
$s = $line -split ","
$var1s = $s[0]
$var2s = $s[1]
Start-Process -FilePath $APIpath -ArgumentList "$argument1", "$argument2", "$argument3", "$path\folder\$var2s.log" -Wait
}
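If the external program returns before its log file is fully written, -Wait alone may not be enough. Here is a minimal sketch of an alternative, reusing the variable names from the question; -PassThru plus Wait-Process is equivalent to -Wait, and the 30-second timeout is an arbitrary assumption:
# Start the process and keep a handle to it, then wait for it to exit.
$proc = Start-Process -FilePath $APIpath -ArgumentList "$argument1", "$argument2", "$argument3", "$path\folder\$var2s.log" -PassThru
$proc | Wait-Process
# Then poll until the expected log file exists or the (assumed) timeout elapses.
$logFile = "$path\folder\$var2s.log"
$deadline = (Get-Date).AddSeconds(30)
while (-not (Test-Path $logFile) -and (Get-Date) -lt $deadline) {
    Start-Sleep -Milliseconds 200
}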
Some useful info beforehand: what I'm attempting to do is read in output from an external command, specifically steamcmd, using PowerShell's Start-Process and System.Diagnostics.ProcessStartInfo.
What I'm running into is the RedirectStandardOutput buffer limit of 4096 bytes. The output I'm getting from steamcmd is larger than that buffer, so I'm only getting a portion of what I need. I have no other method for getting this data other than calling steamcmd.
You can see the output as well if you have steamcmd (it's free) by running this:
steamcmd +login anonymous +app_info_update 1 +app_info_print 443030 +quit
This will download all the manifest info about that appid.
I've tried redirecting to a file and also to a variable; both work as expected, it's just that the output is cut short by the buffer. There also doesn't appear to be a PowerShell method in System.Diagnostics.Process to wait for the OutputDataReceived event.
Code used (stolen from another Stack Overflow question):
$psi = New-object System.Diagnostics.ProcessStartInfo
$psi.CreateNoWindow = $true
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$psi.FileName = "C:\Path\to\SteamCMD.exe"
$psi.Arguments = "+login anonymous +app_info_update 1 +app_info_print 443030 +quit"
$process = New-Object System.Diagnostics.Process
$process.StartInfo = $psi
[void]$process.Start()
$output = $process.StandardOutput.ReadToEnd()
$process.WaitForExit()
$output
I think the actual issue is that steamcmd just outputs in one big write instead of line by line. I guess a better question would be: how can I increase the standard output buffer size of Start-Process or System.Diagnostics.Process?
Note: running steamcmd > somefile.txt results in the same buffer limit.
steamcmd.exe appears to work properly only if run from an empty subdirectory, at least for this command line.
I can't explain it, but I was able to repro your issue when I ran the command twice.
Here is one way to work around the issue: run steamcmd.exe from an empty directory. Depending on your needs, you could use a static temp dir and clean it before each run, or generate a temp dir, use it, and decide how to clean it up later.
CODE
$steamcmd = "L:\test\steam\steamcmd.exe"
# steam works best if run in an empty directory
# why, i have no clue...
$tempName = [System.IO.Path]::GetRandomFileName()
$parentDir = Split-Path $steamcmd -Parent
$tempPath = Join-Path $parentDir $tempName
$null = New-Item $tempPath -ItemType Directory
$null = Copy-Item $steamcmd $tempPath
Write-Output "temp directory is '$tempPath'"
$steamcmdTemp = Join-Path $tempPath 'steamcmd.exe'
Write-Output "Running '$steamcmdTemp'..."
$output = & $steamcmdTemp +login anonymous +app_info_update 1 +app_info_print 443030 +quit
$now = [DateTime]::Now.ToString("yyyyMMdd HHmmss")
$outFile = "output {0}.txt" -f $now
$outputFile = Join-Path $parentDir $outFile
Write-Output "Saving output to '$outputFile'"
$output | Out-File $outputFile -Force
# todo: deal with removing the temp dir...
# or keep cleaning and reusing a static name...
Write-Output "Remember to cleanup '$tempPath'"
I have a PowerShell script, shown below, that is meant to convert a marker string into a newline.
$path = Join-Path $args[0] $args[1]
$word = "#####"
$replacement = "`r`n"
$text = get-content $path
$newText = $text -replace $word,$replacement
$newText > $path
$c=get-content $path
Set-Content -Encoding ASCII $c -Path $path
and calling it from a .bat file
This works fine when called manually but goes into an infinite loop when called from an Informatica pre-session command, with the error message:
waiting n seconds for child process of the shell command to exit.
What could possibly have gone wrong with the code?
I am running the following code, attempting to execute 7z.exe to unzip files.
$dir contains the user-supplied path to the zip file, which can of course contain spaces! And $dir\temp2 below is a directory that I previously created.
Get-ChildItem -path $dir -Filter *.zip |
ForEach-Object {
$zip_path = """" + $dir + "\" + $_.name + """"
$output = " -o""$dir\temp2"""
&7z e $zip_path $output
}
When I execute it I get the following from 7z.exe:
7-Zip [64] 9.20 Copyright (c) 1999-2010 Igor Pavlov 2010-11-18
Processing archive: C:\test dir\test.zip
No files to process
Files: 0
Size: 0
Compressed: 50219965
If I then copy the values from $zip_path and $output to form my own command line, it works!
For example:
7z e "c:\test dir\test.zip" -o"c:\test output"
Now, I can reproduce the same "No files to process" message I get within PowerShell by using the following command on the command line:
7z e "c:\test dir\test.zip" o"c:\test output"
So it seems that PowerShell is removing the dash character from my -o option. And yes, it needs to be -o"c:\test output" and not -o "c:\test output"; with 7z.exe there is no space between the -o parameter and its value.
I am stumped. Am I doing something wrong or should I be doing this a different way?
I can never get Invoke-Expression or the call operator (&) to work right either, so I learned how to use a process object:
$7ZExe = (Get-Command -CommandType Application -Name 7z )
# Built inside the ForEach-Object loop, so $_ is the current zip file.
$7ZArgs = @(
    'e',
    ('"{0}\{1}"' -f $dir, $_.Name),
    ('-o"{0}\{1}"' -f $dir, 'temp2')
)
[Diagnostics.ProcessStartInfo]$7Zpsi = New-Object -TypeName:System.Diagnostics.ProcessStartInfo -Property:@{
CreateNoWindow = $false;
UseShellExecute = $false;
Filename = $7ZExe.Path;
Arguments = $7ZArgs;
WindowStyle = 'Hidden';
RedirectStandardOutput = $true
RedirectStandardError = $true
WorkingDirectory = $(Get-Location).Path
}
$proc = [System.Diagnostics.Process]::Start($7zpsi)
$7ZOut = $proc.StandardOutput
$7ZErr = $proc.StandardError
$proc.WaitForExit()
I was able to duplicate the exact issue and tried numerous combinations of escaping the -o switch, escaping quotes ("), and whatnot.
As one answer mentioned Sysinternals, I used Process Monitor to find out the format PowerShell was passing to 7z.exe. Things that work on a plain command line don't work inside PowerShell the same way.
For example, if I tried to construct parameters inside PowerShell just as on the command line, it would fail; i.e., -o"C:\scripts\so\new folder" doesn't work. But if you include the -o switch inside the quotes, then PowerShell passes the string "-oC:\scripts\so\new folder", which 7z.exe is happy to accept. So I learned that 7z.exe accepts both formats, such as
"C:\Program Files\7-zip\7z.exe" e "C:\scripts\so\new folder.zip" -o"C:\scripts\so\new folder"
and
"C:\Program Files\7-zip\7z.exe" e "C:\scripts\so\new folder.zip" "-oC:\scripts\so\new folder"
And both examples contain spaces in them.
[string]$pathtoexe = "C:\Program Files\7-Zip\7z.exe"
$dir = "C:\scripts\so"
$output = "$dir\new folder"
Get-ChildItem -path $dir -Filter *.zip | % {
[array]$marguments = "e",$_.FullName,"-o$output";
& $pathtoexe $marguments
}
Another approach in PowerShell V3 is to bypass PowerShell's own parsing. You can use the --% stop-parsing symbol to tell PowerShell to stop parsing the rest of the line, like this:
$zipfile = "C:\scripts\so\newfolder.zip"
$destinationfolder = "C:\scripts\so\New Folder"
[string]$pathtoexe = "C:\Program Files\7-Zip\7z.exe"
& $pathtoexe --% e "C:\scripts\so\newfolder.zip" -o"C:\scripts\so\new folder"
Using the --% syntax, you type commands just like you would type them on the command line. I tested this logic, and it extracts files to the destination folder.
To learn more about --%, check PS> help about_parsing.
The issue with this approach is that after --% it is not possible to include a variable. The solution is to put --% into a string variable of its own and pass it along, like this; the constructed arguments then look like the command-line form that wasn't working originally:
[string]$pathtoexe = "C:\Program Files\7-Zip\7z.exe"
$dir = "C:\scripts\so"
$output = "$dir\new folder"
Get-ChildItem -path $dir -Filter *.zip | % {
$zipfile = $_.FullName;
[string]$formatted = [System.String]::Concat("e ", """$zipfile"""," -o""$output""");
[string]$stopparser = '--%';
& $pathtoexe $stopparser $formatted;
}
Using the excellent Process Explorer from the Windows Sysinternals suite I was able to observe some very interesting behavior. I simplified your command line a little as seen below:
dir -Path $dir -Filter *.zip |
select FullName |
% { & 7za.exe e $_ "-o$dir\tmp" }
This was actually invoking the following command line according to Process Explorer:
C:\temp\7za.exe @{FullName="C:\temp\test.zip"} -oC:\temp\test
Telling PowerShell to expand the FullName property forces it out of that wrapper object and passes it as a regular string, which 7-Zip can deal with:
dir -Path $dir -Filter *.zip |
select -ExpandProperty FullName |
% { & 7za.exe e $_ "-o$dir\tmp" }
There may still be other issues like dealing with spaces in file names that I really didn't consider or account for, but I thought it was worth adding a note that PowerShell (v2 in this case) wasn't quite passing the parameters as you might expect.
I’m new to PowerShell and am trying to convert a batch file that downloads multiple files, based on name and extension, from a directory on an FTP site. While I’ve found several examples that download a single file, I’m struggling to find one that shows how to download multiple files. In a batch file I can quite simply use ftp.exe and the mget command with wildcards.
Can someone please point me in the right direction.
Thanks in advance.
John
There are multiple ways to achieve this. One is to use the System.Net.FtpWebRequest as shown in this example:
http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexID/81125/Default.aspx
Or there are /n Software NetCmdlets you can use:
http://www.nsoftware.com/powershell/tutorials/FTP.aspx
In a batch file I can quite simply use ftp.exe and the mget command with wildcards.
You can do the same in PowerShell if you want to.
For a more PowerShell-native way, you can use FtpWebRequest. See here: http://msdn.microsoft.com/en-us/library/ms229711.aspx. You can build on that example to download multiple files in a loop.
But the bottom line is, you do not have to convert something you have in batch to PowerShell. You can if you want, but what you have in batch, especially when calling external programs, should work just as well.
Another resource you might want to check: PowerShell FTP Client Module
http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb
Oddly enough there are no built-in cmdlets to deal with FTP. I'm not sure why the PowerShell team made that decision, but it means you'll have to rely on .NET code, a third-party script/module/snap-in, or a Win32 program such as ftp.exe, as others have already answered.
Here is an example of downloading multiple files (binary and text) using .NET code:
$files = "Firefox Setup 9.0.exe", "Firefox Setup 9.0.exe.asc"
$ftpFolder = 'ftp://ftp.mozilla.org/pub/firefox/releases/9.0/win32/en-US'
$outputFolder = (Resolve-Path "~\My Documents").Path
foreach ($file in $files) {
try {
$uri = $ftpFolder + '/' + $file
$request = [Net.WebRequest]::Create($uri)
$request.Method = [Net.WebRequestMethods+Ftp]::DownloadFile
$responseStream = $request.GetResponse().GetResponseStream()
$outFile = Join-Path $outputFolder -ChildPath $file
$fs = New-Object System.IO.FileStream $outFile, "Create"
[byte[]] $buffer = New-Object byte[] 4096
do {
$count = $responseStream.Read($buffer, 0, $buffer.Length)
$fs.Write($buffer, 0, $count)
} while ($count -gt 0)
} catch {
throw "Failed to download file '{0}/{1}'. The error was {2}." -f $ftpFolder, $file, $_
} finally {
if ($fs) { $fs.Flush(); $fs.Close() }
if ($responseStream) { $responseStream.Close() }
}
}
@Jacob: You need the ::ListDirectory method to make a list. Then you output it to a text file with the Out-File command, and after that you import the list with the Get-Content command. With that text file you can loop over the entries with a foreach loop (don't forget to skip the trailing empty line with the '-cne' condition).
Inside this loop you call your FTP download function with the loop variable as its parameter.
Understood? I'm not sure my explanation is clear.
So here's an example from one of my scripts:
Get-FtpList $ftpSource $ftpDirectory $ftpLogin $ftpPassword | Out-File -Encoding UTF8 -FilePath list.txt
$list = Get-Content -Encoding UTF8 -Path list.txt
foreach ($entry in $list -cne "")
{
Get-FtpFile $ftpSource $ftpDirectory $entry $target $ftpLogin $ftpPassword
Start-Sleep -Milliseconds 10
}
Hope it works now for you.
PS:Get-FtpList and Get-FtpFile are custom functions.
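Get-FtpList and Get-FtpFile aren't shown above, but here is a rough sketch of what a listing helper built on FtpWebRequest's ListDirectory method could look like; the function body is illustrative, not the author's actual code:
function Get-FtpList($ftpSource, $ftpDirectory, $ftpLogin, $ftpPassword)
{
    # Illustrative only: list the entry names in one remote directory.
    $request = [System.Net.FtpWebRequest]::Create("ftp://$ftpSource/$ftpDirectory/")
    $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    $request.Credentials = New-Object System.Net.NetworkCredential($ftpLogin, $ftpPassword)
    $reader = New-Object System.IO.StreamReader ($request.GetResponse().GetResponseStream())
    try {
        while (-not $reader.EndOfStream) { $reader.ReadLine() }   # one name per line
    }
    finally {
        $reader.Close()
    }
}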
This is what I did. As I needed to download files based on a pattern, I dynamically created a command file and then let ftp do the rest.
I used basic PowerShell commands; I did not need to download any additional components.
I first check whether the requisite number of files exists. If they do, I invoke FTP a second time with an mget.
I run this from a Windows 2008 server connecting to a Windows XP remote server.
function make_ftp_command_file($p_file_pattern,$mget_flag)
{
# This function dynamically prepares the FTP file
# The file needs to be prepared daily because the pattern changes daily
# Powershell default encoding is Unicode
# Unicode command files are not compatible with FTP so we need to make sure we create an ASCII File
write-output "USER" | out-file -filepath C:\fc.txt -encoding ASCII
write-output "ftpusername" | out-file -filepath C:\fc.txt -encoding ASCII -Append
write-output "password" | out-file -filepath C:\fc.txt -encoding ASCII -Append
write-output "ASCII" | out-file -filepath C:\fc.txt -encoding ASCII -Append
If($mget_flag -eq "Y")
{
write-output "prompt" | out-file -filepath C:\fc.txt -encoding ASCII -Append
write-output "mget $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
else
{
write-output "ls $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
write-output quit | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
########################### Init Section ###############################
$yesterday = (get-date).AddDays(-1)
$yesterday_fmt = date $yesterday -format "yyyyMMdd"
$file_pattern = "BRAE_GE_*" + $yesterday_fmt + "*.csv"
$file_log = $yesterday_fmt + ".log"
echo $file_pattern
echo $file_log
############################## Main Section ############################
# Change location to folder where the files need to be downloaded
cd c:\remotefiles
# Dynamically create the FTP Command to get a list of files from the Remote Servers
echo "Call function that creates a FTP Command "
make_ftp_command_file $file_pattern N
#echo "Connect to remote site via FTP"
# Connect to Remote Server and get file listing
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
$matches=select-string -pattern "BRAE_GE_[A-Z][A-Z]*" C:\logs\$file_log
# Check if the required number of Files available for download
if ($matches.count -eq 36)
{
# Create the ftp command file
# this time the command file has an mget rather than an ls
make_ftp_command_file $file_pattern Y
# Change directory if not done so
cd c:\remotefiles
# Invoke Ftp with newly created command file
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
}
else
{
echo "Full set of Files not available"
}
It's not PowerShell specific, but I've tried many other solutions, and so far the http://ncftp.com/ client works the best. It comes with ncftpls.exe for listing remote files and ncftpget.exe for getting files. Use them with Start-Process -Wait.
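As a rough sketch of what that could look like (the install path, host, credentials, and even the -u/-p switches are placeholders or assumptions; verify against the ncftp documentation for your version):
# Sketch: fetch matching remote files with ncftpget.exe and wait for it to finish.
$ncftpget  = 'C:\tools\ncftp\ncftpget.exe'   # assumed install path
$ncftpArgs = @(
    '-u'
    'myuser'                 # FTP user name (placeholder)
    '-p'
    'mypassword'             # FTP password (placeholder)
    'ftp.example.com'        # remote host (placeholder)
    'C:\downloads'           # local destination directory
    '/outgoing/*.csv'        # remote files; ncftpget expands wildcards itself
)
Start-Process -FilePath $ncftpget -ArgumentList $ncftpArgs -NoNewWindow -Wait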
A file list can be constructed in a variable, and used with a regular FTP command....
$FileList="file1_$cycledate.csv
file2_$cycledate.csv
file3_$cycledate.csv
file4_$cycledate.csv"
"open $FTPServer
user $FTPUser $FTPPassword
ascii
cd report
" +
(($FileList -split "`r?`n" | ForEach-Object { "mget $_" }) -join "`n") | ftp -i -n