Converting a batch script that runs an exe with parameters to PowerShell - powershell

I have a batch script that automates some operations.
It takes a .osis.xml file and transforms it into an OSIS format ready to be read in some Bible programs.
SET work=D:\Documents\Downloads\Emule-Incoming\osis\
osis2mod %work% - < cei1974.osis.xml
Now I'm trying to transform my batch script into a PS script. Here is what I have so far:
# $work variable contains the path of the folder with the original files.
$work = "D:\Documents\Downloads\Emule-Incoming\osis"
# $bpc variable contains the path of destination of CONF files for BPBiblePortable
$bpc = "D:\Documents\Downloads\Emule-Incoming\BPBiblePortable\App\BPBible\resources\mods.d\"
# $xic contains the path of destination of CONF files for xiphos
$xic = "C:\Users\Emanuele\AppData\Roaming\Sword\mods.d\"
# $bpo contains the path of destination of OSIS files for BPBiblePortable
$bpo = "D:\Documents\Downloads\Emule-Incoming\BPBiblePortable\App\BPBible\resources\modules\texts\rawtext\"
# $xio variable contains the path of destination of OSIS files for xiphos.
$xio = "C:\Users\Emanuele\AppData\Roaming\Sword\modules\texts\rawtext\"
# $Confile Array contains the names of .conf files.
$Confile = @('cei1971.conf', 'cei1974.conf', 'cei2008.conf', 'tilc.conf', 'novav.conf')
foreach ($element in $Confile)
{
# Copying the files works fine
Copy-Item -Path $work\$element -Destination $bpc
$wshell = New-Object -ComObject Wscript.Shell
$count = 1
$result = 0
While ($result -eq 0)
{
$result = $wshell.Popup("Copiato in $bpc",1,"$element",0)
$count += 1
Write-Host $count
if($count -eq 10)
{
Exit
}
}
Copy-Item -Path $work\$element -Destination $xic
$wshell = New-Object -ComObject Wscript.Shell
$count = 1
$result = 0
While ($result -eq 0)
{
$result = $wshell.Popup("Copiato in $xic",1,"$element",0)
$count += 1
Write-Host $count
if($count -eq 10)
{
Exit
}
}
}
# The name of the .exe.
$eseguibile = "osis2mod.exe"
#
# THIS IS THE COMMAND I'M NOT ABLE TO "translate" INTO PS
# The problem is that PS does not recognize the "- <" input redirection
#
& $PSScriptRoot\$eseguibile --% $work\ - < $work\cei1974.osis.xml
Has anyone run into something like this?
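PowerShell itself has no "<" operator for stdin redirection, so there are two common workarounds: pipe the file's contents into the native exe, or hand the whole command line back to cmd.exe. A minimal sketch under those assumptions (paths without spaces; the trailing backslash from the batch file dropped in Option 1 to avoid PowerShell's quoting quirk with trailing backslashes):
# Option 1: pipe the file's text to the exe's stdin. PowerShell re-encodes
# the piped text line by line, which may or may not matter to osis2mod.
Get-Content "$work\cei1974.osis.xml" | & "$PSScriptRoot\$eseguibile" $work -
# Option 2: let cmd.exe perform the "<" redirection exactly as the batch file did.
cmd /c "$PSScriptRoot\$eseguibile $work\ - < $work\cei1974.osis.xml"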

Related

Powershell Word SaveAs command errors when run using a service account

This has been driving me nuts for days.... I have a powershell script that converts all .doc files in a target directory to PDFs using Word SaveAs interop.
The script works fine when run within context of the logged in user, but errors with "You cannot call a method on a null-valued expression." when I try to execute the script using a service account (via task scheduler, run as another user)... service account has local admin rights.
The exception occurs at this line: $Doc.SaveAs([ref]$Name.value,[ref]17)
My code is as follows. I'm not the best coder in the world, so any advice would be gratefully received.
thanks.
try
{
$FileSource = 'D:\PROCESSOR\NewArrivals\*.doc'
$SuccessPath = 'D:\PROCESSOR\Success\'
$docextn='.doc'
$Files=Get-ChildItem -path $FileSource
$counter = 0
$filesProcessed = 0
$Word = New-Object -ComObject Word.Application
#check files exist to be processed.
$WordFileCount = Get-ChildItem $FileSource -Filter *$docextn -File| Measure-Object | %{$_.Count} -ErrorAction Stop
If ($WordFileCount -gt 0) {
Foreach ($File in $Files) {
$Name="$(($File.FullName).substring(0, $File.FullName.lastIndexOf("."))).pdf"
$Doc = $Word.Documents.Open($File.FullName)
$Doc.SaveAs([ref]$Name.value,[ref]17)
$Doc.Close()
if ($counter -gt 100) {
$counter = 0
$Word.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Word)
$Word = New-Object -ComObject Word.Application
}
$counter = $counter + 1
$filesProcessed = $filesProcessed + 1
}
}
$Word.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Word)
}
catch
{
}
finally
{
}
If you are certain the service account has access to Word, then I think the exception you encounter is in the [ref] while doing the SaveAs().
AFAIK only Office versions below 2010 need [ref]; versions above do not.
Next, I think your code can be tidied up somewhat, for instance by releasing the COM objects ($Doc and $Word) inside the finally block, as that is always executed.
Also, there is no need to perform a Get-ChildItem twice.
Something like this:
$SuccessPath = 'D:\PROCESSOR\Success'
$FileSource = 'D:\PROCESSOR\NewArrivals'
$filesProcessed = 0
try {
$Word = New-Object -ComObject Word.Application
$Word.Visible = $false
# get a list of FileInfo objects that have the .doc extension and loop through
Get-ChildItem -Path $FileSource -Filter '*.doc' -File | ForEach-Object {
# change the extension to pdf for the output file
$pdf = [System.IO.Path]::ChangeExtension($_.FullName, '.pdf')
$Doc = $Word.Documents.Open($_.FullName)
# Check Version of Office Installed. Pre 2010 versions need the [ref]
if ($word.Version -gt '14.0') {
$Doc.SaveAs($pdf, 17)
}
else {
$Doc.SaveAs([ref]$pdf,[ref]17)
}
$Doc.Close($false)
$filesProcessed++
}
}
finally {
# cleanup code
if ($Word) {
$Word.Quit()
$null = [System.Runtime.InteropServices.Marshal]::ReleaseComObject($Doc)
$null = [System.Runtime.InteropServices.Marshal]::ReleaseComObject($Word)
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
$Word = $null
}
}
Then, there is the question of $SuccessPath. You never use it. Is it your intention to save the PDF files in that path? If so, change the line
$pdf = [System.IO.Path]::ChangeExtension($_.FullName, '.pdf')
into
$pdf = Join-Path -Path $SuccessPath -ChildPath ([System.IO.Path]::ChangeExtension($_.Name, '.pdf'))
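Finally, since the script works interactively but not under the service account: Word automation from a scheduled task or service is known to fail with null-reference errors until the systemprofile Desktop folders exist. A hedged suggestion, not verified against your setup:
# Create the Desktop folders Office interop expects when running non-interactively
New-Item -ItemType Directory -Force -Path "$env:SystemRoot\System32\config\systemprofile\Desktop"
New-Item -ItemType Directory -Force -Path "$env:SystemRoot\SysWOW64\config\systemprofile\Desktop"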
Hope that helps

Dot-sourced script not referencing current directory on 2nd run

I have a script, below, which I keep in a Scripts folder. It references a set of command prompt applications ($cmd1.exe, etc.). Using either Powershell or the Integrated Powershell terminal in VS Code, I follow these steps to use it:
Dot source the script from the Scripts directory > .\new-customconfig.ps1,
Change to the working directory > cd [New Directory],
Use the function in the script and pass a filename from the working directory > New-CustomConfig .\FileName.
It passes the file from the working directory the first time I run the script, but from any subsequent runs, it looks for the file in the Scripts directory.
Is there something I am doing wrong?
Function New-CustomConfig {
[CmdletBinding()]
Param(
[Parameter(Mandatory=$true)]
[string]
$FileName
)
Begin {
#Set tools directory
$ToolD = '[Directory]\Tools'
#Clean up filename
$FileName = $FileName -replace '^[\W]*', ''
#Setup Executables
$cmdtools = "\cmd1.exe", "\cmd2.exe", "\cmd3.exe"
#Setup Arguments
$cmdargs = "cmd1args", "cmd2args", "cmd3args"
#Setup Write-Host Comments
$cmdecho = "echo1", "echo2", "echo3"
#Setup command object info
$cmdinfo = New-Object System.Diagnostics.ProcessStartInfo
$cmdinfo.RedirectStandardError = $true
$cmdinfo.RedirectStandardOutput = $true
$cmdinfo.UseShellExecute = $false
#Create command object
$cmd = New-Object System.Diagnostics.Process
"Generating Config for $FileName"
}
Process {
for ($i = 0; $i -le $cmdtools.Count; $i++) {
$cmdinfo.FileName = $ToolD + $cmdtools[$i]
$cmdinfo.Arguments = '' + $cmdargs[$i] + ''
Write-Host $i
Write-Host $cmdinfo.FileName
Write-Host $cmdinfo.Arguments
Write-Host $(Get-Location)
$cmdecho[$i]
$cmd.StartInfo = $cmdinfo
$cmd.Start() | Out-Null
$cmd.WaitForExit()
$stdout = $cmd.StandardOutput.ReadToEnd()
$stderr = $cmd.StandardError.ReadToEnd()
Write-Host "stdout: $stdout"
Write-Host "stderr: $stderr"
Write-Host "exit code: " + $p.ExitCode
}
}
End {
"Press any key to continue..."
$null = $host.UI.RawUI.ReadKey('NoEcho,IncludeKeyDown')
}
}
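A likely culprit: Set-Location changes PowerShell's location but not the process-wide [Environment]::CurrentDirectory, and System.Diagnostics.Process resolves relative paths against the process working directory, so where .\FileName resolves can differ between runs. A minimal defensive sketch for the Begin block:
# Pin the file to PowerShell's current location before any child process runs,
# since the spawned tools will not inherit PowerShell's location.
$FileName = (Resolve-Path -LiteralPath $FileName).ProviderPath
# And/or run the tools from PowerShell's current location explicitly:
$cmdinfo.WorkingDirectory = (Get-Location).ProviderPath
Two unrelated bugs while you are in there: the for loop's $i -le $cmdtools.Count runs one index past the end of the arrays and should be -lt, and the final Write-Host references $p.ExitCode where $cmd.ExitCode is meant.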

SharePoint 2013 Powershell - Copy File From One Site Collection To Another

Can someone please assist with the subject above?
I would like to copy one file from a specific folder in a sharepoint site collection to another library (of the same name) in a different sharepoint site collection (but still within the same Web Application).
I have very little Powershell experience and have tried a number of Google searches but cannot seem to find anything that works.
Below is an example of what I have tried to do (with lots of Write-Host calls to try and figure out what is going on), with the error message at the bottom.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$DestFolder = $dList.Files
$RootFolder = $sList.RootFolder
Write-Host " line 25 -- " $RootFolder
$collfiles1 = $RootFolder.Files
Write-Host " line 27 -- "$collfiles1
Write-Host " line 28 -- "$DestFolder
Write-Host " line 30 -- "$str = $DestinationWebURL"/"$DestinationLibraryTitle"/"$FileName
Write-Host " line 31 -- "$collfiles1.Count
for($i = 0 ; $i -lt $collfiles1.Count ; $i++)
{
Write-Host " line 34 -- "$collfiles1[$i].Name
##Write-Host $FileName
if($collfiles1[$i].Name -eq $FileName)
{
## $str = $DestinationWebURL.Url + $DestinationLibraryTitle + "/" + $FileName
$str = $DestinationWebURL+"/" +$DestinationLibraryTitle+"/"
Write-Host " line 40 -- "$str
Write-Host " line 41 -- "$collfiles1[$i]
$FiletoCopy = $collfiles1[$i].Name
Write-Host " line 43 -- " $FiletoCopy
$FiletoCopy.CopyTo($str,$true)
}
}
Write-Host "Script Completed"
The below example gives the error
Cannot find an overload for "CopyTo" and the argument count: "2".
At line:44 char:3
+ $FiletoCopy.CopyTo($str,$true)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
If someone could point me in the right direction that would be very helpful.
Thanks in advance,
Ian.
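First, the direct cause of the error message: $FiletoCopy is assigned $collfiles1[$i].Name, which is a plain string, and System.String has no two-argument CopyTo overload; the SPFile object itself does. A minimal fix inside your if block (note that CopyTo also needs the full destination URL including the file name):
# Keep the SPFile object, not its Name, and pass the complete destination URL
$FiletoCopy = $collfiles1[$i]
$FiletoCopy.CopyTo($str + $FileName, $true)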
The following PowerShell is for your reference; it copies a file from a library in one site collection to a library in another site collection, together with its field values.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
#$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
#$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$SourceFile=$sWeb.GetFile($SourceWebURL+"/"+$SourceLibraryTitle+"/"+$FileName)
$TargetFolder = $dWeb.GetFolder($DestinationLibraryTitle)
#Copy File from the Source
$NewFile = $TargetFolder.Files.Add($SourceFile.Name, $SourceFile.OpenBinary(),$True)
#Copy Meta-Data from Source
Foreach($Field in $SourceFile.Item.Fields)
{
If(!$Field.ReadOnlyField)
{
if($NewFile.Item.Fields.ContainsField($Field.InternalName))
{
$NewFile.Item[$Field.InternalName] = $SourceFile.Item[$Field.InternalName]
}
}
}
#Update
$NewFile.Item.UpdateOverwriteVersion()
Write-host "Copied File:"$SourceFile.Name
Reference: Copy Files Between Document Libraries in SharePoint using PowerShell
For large files, where the file size is greater than 50MB, the script mentioned by @LZ_MSFT may never be able to copy the file. In that case you need to chunk the file into small pieces. Here is the PS to copy from source to destination with chunking if the file size is greater than 50MB. A plus point for this script is that it uses the client-side object model, so it can be used with SP Online and on-prem.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Function UploadFileInSlice ($DestinationCtx, $SourceCtx, $SourceFileUrl, $DestinationFolderUrl, $fileName, $fileChunkSizeInMB) {
# Each sliced upload requires a unique ID.
$UploadId = [GUID]::NewGuid()
# Get File by Server Relative URL
$File = $SourceCtx.Web.GetFileByServerRelativeUrl($SourceFileUrl)
$SourceCtx.Load($File)
# Get file stream with OpenBinaryStream
$StreamToUpload = $File.OpenBinaryStream()
$SourceCtx.ExecuteQuery()
# File size in bytes
$FileSize = ($File).length
# Get Destination Folder by Server Relative URL
$DestinationFolder = $DestinationCtx.Web.GetFolderByServerRelativeUrl($DestinationFolderUrl)
$DestinationCtx.Load($DestinationFolder)
$DestinationCtx.ExecuteQuery()
# Set Complete Destination URL with Destination Folder + FileName
$destUrl = $DestinationFolderUrl + "/" + $fileName
# File object.
[Microsoft.SharePoint.Client.File] $upload
# Calculate block size in bytes.
$BlockSize = $fileChunkSizeInMB * 1000 * 1000
Write-Host "File Size is: $FileSize bytes and Chunking Size is:$BlockSize bytes"
if ($FileSize -le $BlockSize)
{
# Use regular approach if file size less than BlockSize
Write-Host "File uploading with out chunking"
$upload =[Microsoft.SharePoint.Client.File]::SaveBinaryDirect($DestinationCtx, $destUrl, $StreamToUpload.Value, $true)
return $upload
}
else
{
# Use large file upload approach.
$BytesUploaded = $null
$Fs = $null
Try {
$br = New-Object System.IO.BinaryReader($StreamToUpload.Value)
#$br = New-Object System.IO.BinaryReader($Fs)
$buffer = New-Object System.Byte[]($BlockSize)
$lastBuffer = $null
$fileoffset = 0
$totalBytesRead = 0
$bytesRead
$first = $true
$last = $false
# Read data from file system in blocks.
while(($bytesRead = $br.Read($buffer, 0, $buffer.Length)) -gt 0) {
$totalBytesRead = $totalBytesRead + $bytesRead
# You've reached the end of the file.
if($totalBytesRead -eq $FileSize) {
$last = $true
# Copy to a new buffer that has the correct size.
$lastBuffer = New-Object System.Byte[]($bytesRead)
[array]::Copy($buffer, 0, $lastBuffer, 0, $bytesRead)
}
If($first)
{
$ContentStream = New-Object System.IO.MemoryStream
# Add an empty file.
$fileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$fileCreationInfo.ContentStream = $ContentStream
$fileCreationInfo.Url = $fileName
$fileCreationInfo.Overwrite = $true
#Add file to Destination Folder with file creation info
$Upload = $DestinationFolder.Files.Add($fileCreationInfo)
$DestinationCtx.Load($Upload)
# Start upload by uploading the first slice.
$s = New-Object System.IO.MemoryStream(,$Buffer)
Write-Host "Uploading id is:"+$UploadId
# Call the start upload method on the first slice.
$BytesUploaded = $Upload.StartUpload($UploadId, $s)
$DestinationCtx.ExecuteQuery()
# fileoffset is the pointer where the next slice will be added.
$fileoffset = $BytesUploaded.Value
Write-Host "First patch of file with bytes"+ $fileoffset
# You can only start the upload once.
$first = $false
}
Else
{
# Get a reference to your file.
$Upload = $DestinationCtx.Web.GetFileByServerRelativeUrl($destUrl);
If($last) {
# Is this the last slice of data?
$s = New-Object System.IO.MemoryStream(,$lastBuffer)
# End sliced upload by calling FinishUpload.
$Upload = $Upload.FinishUpload($UploadId, $fileoffset, $s)
$DestinationCtx.ExecuteQuery()
Write-Host "File Upload Completed Successfully!"
# Return the file object for the uploaded file.
return $Upload
}
else {
$s = New-Object System.IO.MemoryStream(,$buffer)
# Continue sliced upload.
$BytesUploaded = $Upload.ContinueUpload($UploadId, $fileoffset, $s)
$DestinationCtx.ExecuteQuery()
# Update fileoffset for the next slice.
$fileoffset = $BytesUploaded.Value
Write-Host "File uploading is in progress with bytes: "+ $fileoffset
}
}
} #// while ((bytesRead = br.Read(buffer, 0, buffer.Length)) > 0)
}
Catch {
Write-Host $_.Exception.Message -ForegroundColor Red
}
Finally {
if ($Fs -ne $null)
{
$Fs.Dispose()
}
}
}
return $null
}
#URL to Configure, in this case Destination is SP Online site URL
#Adding credentials hard-coded; you can use the Get-Credential PS command too
$DestnationSiteUrl = "https://your-domain.sharepoint.com/sites/xyz"
$DestinationRelativeURL = "/sites/xyz/TestLibrary" #server relative URL here with library Name and Folder name
$DestinationUserName = "xyz#your-domain.com"
$DestinationPassword = Read-Host "Enter Password for Destination User: $DestinationUserName" -AsSecureString
#URL to Configure, in this case Source is On-Prem site URL
#Adding credentials hard-coded; you can use the Get-Credential PS command too
$SourceSiteUrl = "http://intranet/sites/xyz"
$SourceRelativeURL = "/sites/xyz/TestLibrary/myfile.pptx" #server relative URL here with library Name and file name with extension
$SourceUsername = "domain\xyz"
$SourcePassword = Read-Host "Enter Password for Source User: $SourceUsername" -AsSecureString
#Set a file name with extension
$FileNameWithExt = "myfile.pptx"
#Get Source Client Context with credentials
$SourceContext = New-Object Microsoft.SharePoint.Client.ClientContext($SourceSiteUrl)
#Using NetworkCredentials in case of On-Prem
$SourceContext.Credentials = New-Object System.Net.NetworkCredential($SourceUsername, $SourcePassword)
$SourceContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$SourceContext.ExecuteQuery();
#Get Destination Client Context with credentials
$DestinationContext = New-Object Microsoft.SharePoint.Client.ClientContext($DestnationSiteUrl)
#Using SharePointOnlineCredentials in case of SP-Online
$DestinationContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($DestinationUserName, $DestinationPassword)
$DestinationContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$DestinationContext.ExecuteQuery();
#All Set up, now just call the UploadFileInSlice with parameters
$UpFile = UploadFileInSlice -DestinationCtx $DestinationContext -SourceCtx $SourceContext -DestinationFolderUrl $DestinationRelativeURL -SourceFileUrl $SourceRelativeURL -fileName $FileNameWithExt -fileChunkSizeInMB 10

Getting error output from a powershell 2.0 script running as a task

TL:DR actual question is at the bottom
I'm trying to troubleshoot a Powershell v1.0 script issue. The script basically downloads a file from an FTP site and puts it on a remote server via UNC and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems so the server it runs off of does not contain a profile for it.
If I do a runas for that user and execute the script via the command line, it works flawlessly. However, if I try to run it as a task, it runs then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on but it never writes even when I write them as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script and/or write the variable information to a text file.
Is there any way to do this? BTW I doing via calling powershell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time-of-day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalization function that handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment it for the next DAT if the DAT's age is older than today.
# Otherwise, exit the program if the DAT's age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).day -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains the expected
# latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1#domain.com"
$msg.ReplyTo = "email2#domain.com"
$msg.To.Add("email2#domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error starting Powershell, either because the execution policy is different on the Powershell version you start or on the account, or because there is an access error on the scheduled task. To gather the actual error, you can launch a task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2&>1
This way, if there is an error starting Powershell, it will be logged in the c:\windows\temp\test.log file. If the issue is the execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running this task under the account you plan to use for your main task will first record the policies in effect (so that if setting the machine-level policy doesn't help, you'll know which scope to alter) and then set the machine-level policy to RemoteSigned, the least restrictive level short of allowing every script (which is highly discouraged; there are malicious encoder scripts written in Powershell that can ruin your data).
Hope this helps.
UPDATE: If it's not the policy, there might be an error in how the parameters for the task are written. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to run cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.
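If the task still dies silently, you can also make the script itself record what happened. A minimal sketch of a logging wrapper for myscript.ps1 (paths are placeholders; Start-Transcript needs the console host on Powershell 2.0):
# Wrap the existing script body so terminating errors land in a file
Start-Transcript -Path 'd:\datscript\transcript.log' -Append
try {
    # ... existing script body ...
}
catch {
    $_ | Out-File 'd:\datscript\error.txt' -Append
    throw
}
finally {
    Stop-Transcript
}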

Powershell to get metadata of files

I am looking to get metadata of a specified file (or directory of files). I am specifically looking for "Program Description" on .WTV files.
I found code, but it doesn't list that attribute. Some of that code looks like this:
foreach($sFolder in $folder)
{
$a = 0
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.namespace($sFolder)
foreach ($strFileName in $objFolder.items())
{ FunLine( "$($strFileName.name)")
for ($a ; $a -le 266; $a++)
{
if($objFolder.getDetailsOf($strFileName, $a))
{
$hash += @{ `
$($objFolder.getDetailsOf($objFolder.items, $a)) =`
$($objFolder.getDetailsOf($strFileName, $a))
} #end hash
$hash
$hash.clear()
I can see that attribute in File explorer.
The code you had, @user1921849, has almost got it, but to address the original question more clearly you should use the Windows Shell Property System named properties, listed in the Windows developer documentation for WTV files, and used in line 4 below.
$shell = new-object -com shell.application
$folder = $shell.namespace("\\MEDIA\Recorded Tv\")
$item = $folder.Items().Item('Person of Interest_WBBMDT_2013_11_26_21_01_00.wtv')
write-output $item.ExtendedProperty('System.RecordedTV.ProgramDescription')
Updated URL to docs
General properties list https://learn.microsoft.com/en-us/windows/desktop/properties/props
WTV properties list https://learn.microsoft.com/en-us/windows/desktop/properties/recordedtv-bumper
Grab TagLib# from Nuget or various other places. Then check out this blog post showing how to use TagLib# to edit MP3 tags. Hopefully, it can retrieve the WTV tag you're looking for.
$shell = new-object -comobject shell.application
$ShFolder=$shell.namespace("\\MEDIA\Recorded Tv\")
$ShFile =$ShFolder.parsename("Person of Interest_WBBMDT_2013_11_26_21_01_00.wtv")
$count = 0
while ($count -le 294)
{
$ShRating = $ShFolder.getdetailsof($ShFile,$count)
$count
$ShRating
$count = $count+1
}
Program Description is item 272.
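Once you know the index, a direct lookup works too (a sketch; the index can shift between Windows versions, so verify it with the loop above first):
$ShFolder.getdetailsof($ShFile, 272)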
I have written sample code that checks every file in a folder and exports a CSV file with all the metadata details. Please find the PowerShell script below.
Step 1. Create a file Fileproperty.ps1
Import-Module ".\Module\AddModule.psm1" -Force
$commands = {
$source = read-host "Enter folder path "
if ([string]::IsNullOrWhitespace($source)){
Write-host "Invalid file path, re-enter."
$source = $null
&$commands
}else{
$output = read-host "Enter output folder path "
if ([string]::IsNullOrWhitespace($output)){
Write-host "Invalid output path, re-enter."
$output = $null
&$commands
}else{
$outputfilename = read-host "Enter output file name "
if ([string]::IsNullOrWhitespace($outputfilename)){
Write-host "Invalid file name, re-enter."
$outputfilename = $null
&$commands
}else{
Get-FileMetaData -folder $source | Export-Csv -Path $output\$outputfilename.csv -Encoding ascii -NoTypeInformation
Write-host "Process has been done..."
}
}
}
}
&$commands
Step 2. Create a folder Module
Step 3. create another file Module/AddModule.psm1
$FunctionFiles = @(Get-ChildItem -Path $PSScriptRoot\*.ps1 -ErrorAction SilentlyContinue)
Foreach($fileItem in @($FunctionFiles))
{
Try
{
. $fileItem.fullname
}
Catch
{
Write-Error -Message "Vsts module -> Unable to import a function in file $($fileItem.fullname): $_"
}
}
Export-ModuleMember -Function $FunctionFiles.Basename
Step 4. create another file Module/Get-FileMetaData.ps1
Function Get-FileMetaData
{
Param([string[]]$folder)
$OutputList = New-Object 'System.Collections.generic.List[psobject]'
foreach($sFolder in $folder) {
$a = 0
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.namespace($sFolder)
foreach ($File in $objFolder.items())
{
$FileMetaData = New-Object PSOBJECT
for ($a ; $a -le 266; $a++)
{
if($objFolder.getDetailsOf($File, $a))
{
$hash += @{$($objFolder.getDetailsOf($objFolder.items, $a)) =
$($objFolder.getDetailsOf($File, $a)) }
$FileMetaData | Add-Member $hash
$hash.clear()
} #end if
} #end for
$a=0
$OutputList.Add($FileMetaData)
} #end foreach $file
} #end foreach $sfolder
return $OutputList
} #end Get-FileMetaData
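For reference, once the three files are in place, a non-interactive run might look like this (folder paths are examples):
# Load the module and export metadata for one folder directly,
# bypassing the prompts in Fileproperty.ps1
Import-Module ".\Module\AddModule.psm1" -Force
Get-FileMetaData -folder 'D:\Recorded TV' | Export-Csv -Path '.\metadata.csv' -Encoding ascii -NoTypeInformation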
Hope this will work for you.