Compile Oracle forms using a PowerShell script

I have many Oracle forms in one folder and I want to compile them with the frmcmp command in a PowerShell script.
I have written the following PowerShell script:
$module="module="
get-childitem "C:\forms\fortest" -recurse |
where { $_.extension -eq ".fmb" } |
foreach {
C:\Oracle\Middleware\Oracle_FRHome1\BIN\frmcmp $module $_.FullName userid=xyz/xyz@xyz Output_File=C:\forms\11\common\fmx\$_.BaseName+'.fmx'
}
but this one is not working. I am new to PowerShell.
When I compile a single form from the command prompt, it works, like the following:
frmcmp module=C:\forms\src\xyz.fmb userid=xyz/xyz@xyz Output_File=C:\forms\11\common\fmx\xyz.fmx

When you want to use variables inside a string in PowerShell you have different options. To start with, you will always need to wrap the string in double quotes (") as opposed to single quotes (') if you want variables to be expanded in your string.
$myVariable = "MyPropertyValue"
Write-Host "The variable has the value $MyVariable"
The above code would yield the output:
The variable has the value MyPropertyValue
If you want to use a property of a variable (or any expression) and insert it into the string, you need to wrap it in the string with $(expression goes here), e.g.
$MyVariable = New-Object PSObject -Property @{ MyPropertyName = 'MyPropertyValue' }
# The following will fail getting the property since it will only consider
# the variable name as code, not the dot or the property name. It will
# therefore ToString the object and append the literal string .MyPropertyName
Write-Host "Failed property value retrieval: $MyVariable.MyPropertyName"
# This will succeed, since it's wrapped as code.
Write-Host "Successful property value retrieval: $($MyVariable.MyPropertyName)"
# You can have any code in those wrappers, for example math.
Write-Host "Maths calculating: 3 * 27 = $( 3 * 27 )"
The above code would yield the following output:
Failed property value retrieval: @{MyPropertyName=MyPropertyValue}.MyPropertyName
Successful property value retrieval: MyPropertyValue
Maths calculating: 3 * 27 = 81
I generally try to use the Start-Process cmdlet when I start processes in PowerShell, since it gives me additional control over the started process. This means that you could use something similar to the following.
Get-ChildItem "C:\forms\fortest" -Filter "*.fmb" -recurse | Foreach {
$FormPath = $_.FullName
$ResultingFileName = $_.BaseName
Start-Process -FilePath "C:\Oracle\Middleware\Oracle_FRHome1\BIN\frmcmp.exe" -ArgumentList "module=$FormPath", "userid=xyz/xyz#xyz", "Output_File=C:\forms\11\common\fmx\$ResultingFileName.fmx"
}
You could also add the -Wait parameter to the Start-Process call if you want each compilation to finish before the next item is compiled, as sketched below.
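For example, a sketch of the same loop with -Wait added (paths and credentials taken from the question):
Get-ChildItem "C:\forms\fortest" -Filter "*.fmb" -Recurse | ForEach-Object {
    # -Wait makes each frmcmp run finish before the next form is compiled
    Start-Process -FilePath "C:\Oracle\Middleware\Oracle_FRHome1\BIN\frmcmp.exe" `
        -ArgumentList "module=$($_.FullName)", "userid=xyz/xyz@xyz", "Output_File=C:\forms\11\common\fmx\$($_.BaseName).fmx" `
        -Wait
}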

Related

Why does pipeline parameter cause error when combined with PSDefaultParameterValues?

My powershell function should accept a list of valid paths of mixed files and/or directories either as a named parameter or via pipeline, filter for files that match a pattern, and return the list of files.
$Paths = 'C:\MyFolder\','C:\MyFile'
This works: Get-Files -Paths $Paths
This doesn't: $Paths | Get-Files
$PSDefaultParameterValues = @{
"Get-Files:Paths" = ( Get-Location ).Path
}
[regex]$DateRegex = '(20\d{2})([0-1]\d)([0-3]\d)'
[regex]$FileNameRegex = '^term2-3\.7_' + $DateRegex + '\.csv$'
Function Get-Files {
[CmdletBinding()]
[OutputType([System.IO.FileInfo[]])]
[OutputType([System.IO.FileInfo])]
param (
[Parameter(
Mandatory = $false, # Should default to $cwd provided by $PSDefaultParameterValues
ValueFromPipeline,
HelpMessage = "Enter filesystem paths that point either to files directly or to directories holding them."
)]
[String[]]$Paths
)
begin {
[System.IO.FileInfo[]]$FileInfos = @()
[System.IO.FileInfo[]]$SelectedFileInfos = @()
}
process { foreach ($Path in $Paths) {
Switch ($Path) {
{ Test-Path -Path $Path -PathType leaf } {
$FileInfos += (Get-Item $Path)
}
{ Test-Path -Path $Path -PathType container } {
foreach ($Child in (Get-ChildItem $Path -File)) {
$FileInfos += $Child
}
}
Default {
Write-Warning -Message "Path not found: $Path"
continue
}
}
$SelectedFileInfos += $FileInfos | Where-Object { $_.Name -match $FileNameRegex }
$FileInfos.Clear()
} }
end {
Return $SelectedFileInfos | Get-Unique
}
}
I found that both versions work if I remove the default parameter value. Why?
Why does passing a parameter via the pipeline cause an error when that parameter has a default defined in PSDefaultParameterValues, and is there a way to work around this?
Mathias R. Jessen provided the crucial pointer in a comment:
A parameter that is bound via an entry in the dictionary stored in the $PSDefaultParameterValues preference variable is bound before it could potentially be bound via the pipeline, just as a parameter value passed explicitly, as an argument, would be.
Once a given parameter is bound that way, it cannot be bound again via the pipeline, causing an error:
The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
As you can see, the specific problem at hand - a parameter already being bound - is unfortunately not covered by this message. The unspoken part is that once a given parameter has been bound by argument (possibly via $PSDefaultParameterValues), it is removed from the set of candidate pipeline-binding parameters the input could bind to, and if there are no candidates remaining, the error occurs.
The only way to override a $PSDefaultParameterValues preset value is to use an (explicit) argument.
This comment on a related GitHub issue provides details on the order of parameter binding.
A simplified way to reproduce the problem:
& {
# Preset the -Path parameter for Get-Item
# In any later Get-Item calls that do not use -Path explicitly, this
# is the same as calling Get-Item -Path /
$PSDefaultParameterValues = @{ 'Get-Item:Path' = '/' }
# Trying to bind the -Path parameter *via the pipeline* now fails,
# because it has already been bound via $PSDefaultParameterValues.
# Even without the $PSDefaultParameterValues entry in the picture,
# you'd get the same error with: '.' | Get-Item -Path /
'.' | Get-Item
# By contrast, using an *argument* allows you to override the preset.
Get-Item -Path .
}
What's happening here?!
This is a timing issue.
PowerShell attempts to bind and process parameter arguments in roughly* the following order:
1. Explicitly named parameter arguments are bound (e.g. -Param $value)
2. Positional arguments are bound (abc in Write-Host abc)
3. Default parameter values are applied for any parameter that wasn't processed during the previous two steps - note that applicable $PSDefaultParameterValues entries always take precedence over defaults defined in the parameter block
4. The parameter set is resolved and all mandatory parameters are validated to have values (this only fails if there are no upstream commands in the pipeline)
5. The begin {} blocks of all commands in the pipeline are invoked
6. For any commands downstream in a pipeline: wait for input, bind it to the most appropriate parameter that hasn't been handled in the previous steps, and invoke the process {} blocks of all commands in the pipeline
As you can see, the value you assign to $PSDefaultParameterValues takes effect in step 3 - long before PowerShell even has a chance to start binding the piped string values to -Paths, in step 6.
*) this is a gross over-simplification, but the point remains: default parameter values must have been handled before pipeline binding starts.
How to work around it?
Given the procedure described above, we should be able to work around this behavior by explicitly naming the parameter we want to bind the pipeline input to.
But how do you combine -Paths with pipeline input?
By supplying a delay-bind script block (or a "pipeline-bound parameter expression" as they're sometimes called):
$Paths | Get-Files -Paths { $_ }
This will cause PowerShell to recognize -Paths during step 1 above - at which point the default value assignment is skipped.
Once the command starts receiving input, it transforms the input value and binds the resulting value to -Paths, by executing the { $_ } block - since we just output the item as-is, the effect is the exact same as when the pipeline input is bound implicitly.
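As an aside, the delay-bind script block can also transform the piped value before it is bound; for example, a hypothetical tweak that trims trailing backslashes from each path:
$Paths | Get-Files -Paths { $_.TrimEnd('\') }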
Digging deeper
If you want to learn more about what happens "behind the curtain", the best tool available is Trace-Command:
$PSDefaultParameterValues['Get-Files:Paths'] = $PWD
Trace-Command -Expression { $Paths | Get-Files } -Name ParameterBinding -PSHost
I should mention that the ParameterBinding tracer is very verbose - which is great for surmising what's going on - but the output can be a bit overwhelming, in which case you might want to replace the -PSHost parameter with -FilePath .\path\to\output.txt to write the trace output to a file.
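For example (the output path below is just an illustration):
Trace-Command -Expression { $Paths | Get-Files } -Name ParameterBinding -FilePath .\ParameterBinding.trace.txt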

Incrementing variables in powershell

I'm new to PowerShell and am trying to create a script that goes through a csv file (simple name,value csv) and loads each new line in it as a variable and then runs a function against that set of variables.
I've had success at getting it to work for 1 variable by using the following code snippet:
Import-Csv -Path C:\something\mylist.csv | ForEach-Object {
New-Variable -Name $_.Name -Value $_.Value -Force
}
My csv looks like this:
name,value
RegKey1,"Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\LanmanWorkstation"
Basically it's a list of registry keys each named as RegKey# and then the path of that reg key is the intended value of the variable.
I'm currently playing around with the "Test-Path" cmdlet that just prints out true/false if the passed reg-key exists and then just prints out some text based on if it found the reg key or not.
That snippet looks like so:
Test-Path $RegKey1
IF ($LASTEXITCODE=0) {
Write-Output "It worked"
}
else {
Write-Output "It didn't work"
}
This works fine however what I'm trying to achieve is for powershell to run this cmdlet against each of the lines in the csv file - basically checking each reg key in it and then doing whatever specified to it.
What I'm trying to avoid is declaring hundreds of variables for every regkey I plan on using but instead have this one function that just runs through the csv and every time it runs, it increments the number next to the variable's name - RegKey1,RegKey2,RegKey3 etc.
Let me know if there's a way to do this in powershell or a better way of approaching this altogether. I also apologize in advance if I've not provided enough info, please do let me know.
You need to place your if statement inside the ForEach-Object loop. This will also only work if your variables all get the same name, such as $RegKey. To increment, you may use a for loop.
Import-Csv -Path C:\something\mylist.csv | ForEach-Object {
New-Variable -Name $_.Name -Value $_.Value -Force
IF (Test-Path $RegKey1) {
Write-Output "It worked"
}
else {
Write-Output "It didn't work"
}
}
Test-Path returns a boolean value of $true or $false, so there's no need to use $LastExitCode; you can place the Test-Path call directly in the if condition.
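If you want each iteration to test the variable it just created rather than always $RegKey1, one possible tweak (a sketch, not part of the original answer, using Get-Variable to look the variable up by the name from the current CSV row) is:
Import-Csv -Path C:\something\mylist.csv | ForEach-Object {
    New-Variable -Name $_.Name -Value $_.Value -Force
    # Retrieve the variable we just created by name and test its value
    if (Test-Path -Path (Get-Variable -Name $_.Name -ValueOnly)) {
        Write-Output "It worked"
    } else {
        Write-Output "It didn't work"
    }
}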
Alternatively, you can use the Foreach loop to accomplish the same thing here:
$CSV = Import-Csv -Path C:\something\mylist.csv
Foreach($Key in $CSV.Value){
$PathTest = Test-Path -Path $Key
if($PathTest) {
Write-Output "It worked"
} else {
Write-Output "It didn't work"
}
}
By iterating through the CSV (reading the list one entry at a time) and selecting only the value (the registry path), we can test each value by assigning it to the $PathTest variable, which is then evaluated in the if statement just like above. There is also no need for the intermediate variable; you can use Test-Path directly in the if condition, as above, for the same result.
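For instance, a minimal variant of the loop above that drops the intermediate variable:
Foreach ($Key in $CSV.Value) {
    if (Test-Path -Path $Key) {
        Write-Output "It worked"
    } else {
        Write-Output "It didn't work"
    }
}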

Tab-complete a parameter value based on another parameter's already specified value

This question addresses the following scenario:
Can custom tab-completion for a given command dynamically determine completions based on the value previously passed to another parameter on the same command line, using either a parameter-level [ArgumentCompleter()] attribute or the Register-ArgumentCompleter cmdlet?
If so, what are the limitations of this approach?
Example scenario:
A hypothetical Get-Property command has an -Object parameter that accepts an object of any type, and a -Property parameter that accepts the name of a property whose value to extract from the object.
Now, in the course of typing a Get-Property call, if a value is already specified for -Object, tab-completing -Property should cycle through the names of the specified object's (public) properties.
$obj = [pscustomobject] @{ foo = 1; bar = 2; baz = 3 }
Get-Property -Object $obj -Property # <- pressing <tab> here should cycle
# through 'foo', 'bar', 'baz'
@mklement0, regarding the first limitation stated in your answer:
The custom-completion script block ({ ... }) invoked by PowerShell fundamentally only sees values specified via parameters, not via the pipeline.
I struggled with this, and after some stubbornness I got a working solution.
At least good enough for my tooling, and I hope it can make life easier for many others out there.
This solution has been verified to work with PowerShell versions 5.1 and 7.1.2.
Here I made use of $cmdAst (called $commandAst in the docs), which contains information about the pipeline. With this we can identify the previous pipeline element and even tell whether it contains just a variable or a command. Yes, a command: with the help of Get-Command and the command's OutputType member, we can get (suggested) property names for that case as well!
Example usage
PS> $obj = [pscustomobject] @{ foo = 1; bar = 2; baz = 3 }
PS> $obj | Get-Property -Property # <tab>: bar, baz, foo
PS> "la", "na", "le" | Select-String "a" | Get-Property -Property # <tab>: Chars, Context, Filename, ...
PS> 2,5,2,2,6,3 | group | Get-Property -Property # <tab>: Count, Values, Group, ...
Function code
Note that, apart from now using $cmdAst, I also added [Parameter(ValueFromPipeline=$true)] so we actually pick up the piped object, and PROCESS {$Object.$Property} so that you can test the code and see it working.
param(
[Parameter(ValueFromPipeline=$true)]
[object] $Object,
[ArgumentCompleter({
param($cmdName, $paramName, $wordToComplete, $cmdAst, $preBoundParameters)
# Find out if we have pipeline input.
$pipelineElements = $cmdAst.Parent.PipelineElements
$thisPipelineElementAsString = $cmdAst.Extent.Text
$thisPipelinePosition = [array]::IndexOf($pipelineElements.Extent.Text, $thisPipelineElementAsString)
$hasPipelineInput = $thisPipelinePosition -ne 0
$possibleArguments = @()
if ($hasPipelineInput) {
# If we are in a pipeline, find out if the previous pipeline element is a variable or a command.
$previousPipelineElement = $pipelineElements[$thisPipelinePosition - 1]
$pipelineInputVariable = $previousPipelineElement.Expression.VariablePath.UserPath
if (-not [string]::IsNullOrEmpty($pipelineInputVariable)) {
# If previous pipeline element is a variable, get the object.
# Note that it can be a non-existent variable. In such case we simply get nothing.
$detectedInputObject = Get-Variable |
Where-Object {$_.Name -eq $pipelineInputVariable} |
ForEach-Object Value
} else {
$pipelineInputCommand = $previousPipelineElement.CommandElements[0].Value
if (-not [string]::IsNullOrEmpty($pipelineInputCommand)) {
# If previous pipeline element is a command, check if it exists as a command.
$possibleArguments += Get-Command -CommandType All |
Where-Object Name -Match "^$pipelineInputCommand$" |
# Collect properties for each documented output type.
ForEach-Object {$_.OutputType.Type} | ForEach-Object GetProperties |
# Group properties by Name to get unique ones, and sort them by
# the most frequent Name first. The sorting is a perk.
# A command can have multiple output types. If so, we might now
# have multiple properties with identical Name.
Group-Object Name -NoElement | Sort-Object Count -Descending |
ForEach-Object Name
}
}
} elseif ($preBoundParameters.ContainsKey("Object")) {
# If not in pipeline, but object has been given, get the object.
$detectedInputObject = $preBoundParameters["Object"]
}
if ($null -ne $detectedInputObject) {
# The input object might be an array of objects, if so, select the first one.
# We (at least I) are not interested in array properties, but the object element's properties.
if ($detectedInputObject -is [array]) {
$sampleInputObject = $detectedInputObject[0]
} else {
$sampleInputObject = $detectedInputObject
}
# Collect property names.
$possibleArguments += $sampleInputObject | Get-Member -MemberType Properties | ForEach-Object Name
}
# Referring to the about_Functions_Argument_Completion documentation:
# The ArgumentCompleter script block must unroll the values using the pipeline,
# such as ForEach-Object, Where-Object, or another suitable method.
# Returning an array of values causes PowerShell to treat the entire array as one tab completion value.
$possibleArguments | Where-Object {$_ -like "$wordToComplete*"}
})]
[string] $Property
)
PROCESS {$Object.$Property}
Update: See betoz's helpful answer for a more complete solution that also supports pipeline input.
The part of the answer below that clarifies the limitations of pre-execution detection of the input objects' data type still applies.
The following solution uses a parameter-specific [ArgumentCompleter()] attribute as part of the definition of the Get-Property function itself, but the solution applies analogously to defining the custom-completion logic separately via the Register-ArgumentCompleter cmdlet (a rough sketch of that variant follows the code below).
Limitations:
[See betoz's answer for how to overcome this limitation] The custom-completion script block ({ ... }) invoked by PowerShell fundamentally only sees values specified via parameters, not via the pipeline.
That is, if you type Get-Property -Object $obj -Property <tab>, the script block can determine that the value of $obj is to be bound to the -Object parameter, but that wouldn't work with
$obj | Get-Property -Property <tab> (even if -Object is declared as pipeline-binding).
Fundamentally, only values that can be evaluated without side effects are actually accessible in the script block; in concrete terms, this means:
Literal values (e.g., -Object ([pscustomobject] @{ foo = 1; bar = 2; baz = 3 }))
Simple variable references (e.g., -Object $obj) or property-access or index-access expressions (e.g., -Object $obj.Foo or -Object $obj[0])
Notably, the following values are not accessible:
Method-call results (e.g., -Object $object.Foo())
Command output (via (...), $(...), or @(...), e.g.
-Object (Invoke-RestMethod http://example.org))
The reason for this limitation is that evaluating such values before actually submitting the command could have undesirable side effects and / or could take a long time to complete.
function Get-Property {
param(
[object] $Object,
[ArgumentCompleter({
# A fixed list of parameters is passed to an argument-completer script block.
# Here, only two are of interest:
# * $wordToComplete:
# The part of the value that the user has typed so far, if any.
# * $preBoundParameters (called $fakeBoundParameters
# in the docs):
# A hashtable of those (future) parameter values specified so
# far that are side effect-free (see above).
param($cmdName, $paramName, $wordToComplete, $cmdAst, $preBoundParameters)
# Was a side effect-free value specified for -Object?
if ($obj = $preBoundParameters['Object']) {
# Get all property names of the objects and filter them
# by the partial value already typed, if any,
# interpreted as a name prefix.
@($obj.psobject.Properties.Name) -like "$wordToComplete*"
}
})]
[string] $Property
)
# ...
}
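For completeness, here is a rough sketch (not part of the original answers) of attaching the same completion logic externally with Register-ArgumentCompleter, assuming a Get-Property command is already defined:
Register-ArgumentCompleter -CommandName Get-Property -ParameterName Property -ScriptBlock {
    param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameters)
    # Same idea as the [ArgumentCompleter()] block above: only side effect-free
    # values show up in $fakeBoundParameters.
    if ($obj = $fakeBoundParameters['Object']) {
        @($obj.psobject.Properties.Name) -like "$wordToComplete*"
    }
}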

Is there a way to show all functions in a PowerShell script?

Is there any command to list all functions I've created in a script?
Like I created function doXY and function getABC or something like this.
Then I type in the command and it shows:
Function doXY
Function getABC
Would be a cool feature^^
Thanks for all your help.
You can have PowerShell parse your script, and then locate the function definitions in the resulting Abstract Syntax Tree (AST).
Get-Command is probably the easiest way to access the AST:
# Use Get-Command to parse the script
$myScript = Get-Command .\path\to\script.ps1
$scriptAST = $myScript.ScriptBlock.AST
# Search the AST for function definitions
$functionDefinitions = $scriptAST.FindAll({
$args[0] -is [Management.Automation.Language.FunctionDefinitionAst]
}, $false)
# Report function name and line number in the script
$functionDefinitions |ForEach-Object {
Write-Host "Function '$($_.Name)' found on line $($_.StartLineNumber)!"
}
You can also use this to analyze the functions' contents and parameters if necessary.
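For example, a small sketch building on the $functionDefinitions collection above that lists each function's parameter names (parameters may live in the function header or in a param() block):
$functionDefinitions | ForEach-Object {
    # Gather parameter ASTs from both possible locations, skipping nulls
    $paramAsts = @($_.Parameters) + @($_.Body.ParamBlock.Parameters) | Where-Object { $_ }
    $paramNames = $paramAsts | ForEach-Object { $_.Name.VariablePath.UserPath }
    Write-Host "Function '$($_.Name)' parameters: $($paramNames -join ', ')"
}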
Where your script is named things.ps1, something like...
cat ./things.ps1 | grep function
For MacOS/Linux or...
cat ./things.ps1 | select-string function
For Windows.
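If you want to reduce false positives (the word function can also appear in comments or strings), you could anchor the pattern, for example:
Get-Content ./things.ps1 | Select-String -Pattern '^\s*function\s+\S+'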
This is a built-in feature as shown in the PowerShell help files.
About_Providers
Similar questions have been asked before. So, this is a potential duplicate of:
How to get a list of custom Powershell functions?
Answers... Using the PSDrive feature
# To get a list of available functions
Get-ChildItem function:\
# To remove a powershell function
# removes `someFunction`
Remove-Item function:\someFunction
Or
Function Get-MyCommands {
Get-Content -Path $profile | Select-String -Pattern "^function.+" | ForEach-Object {
[Regex]::Matches($_, "^function ([a-z.-]+)","IgnoreCase").Groups[1].Value
} | Where-Object { $_ -ine "prompt" } | Sort-Object
}
Or this one
Get List Of Functions From Script
$currentFunctions = Get-ChildItem function:
# dot source your script to load it to the current runspace
. "C:\someScript.ps1"
$scriptFunctions = Get-ChildItem function: | Where-Object { $currentFunctions -notcontains $_ }
$scriptFunctions | ForEach-Object {
& $_.ScriptBlock
}
As for this...
Thanks, this is kind of what I want, but it also shows functions like
A:, B:, Get-Verb, Clear-Host, ...
That is by design. If you want it another way, then you have to code that.
To get the names of functions in any script, the script has to be loaded into memory first; then you can dot-source the definition and get at the internals. If you just want the function names, you can use regex to get them.
Or as simple as this...
Function Show-ScriptFunctions
{
[cmdletbinding()]
[Alias('ssf')]
Param
(
[string]$FullPathToScriptFile
)
(Get-Content -Path $FullPathToScriptFile) |
Select-String -Pattern 'function'
}
ssf -FullPathToScriptFile 'D:\Scripts\Format-NumericRange.ps1'
# Results
<#
function Format-NumericRange
function Flush-NumberBuffer
#>
This function will parse all the functions included in a .ps1 file and then return an object for each function found.
The output can be piped directly into Invoke-Expression to load the returned functions into the current scope.
You can also provide an array of desired names, or a regular expression, to constrain the results.
My use case was that I needed a way to load individual functions from larger scripts that I don't own, so I could do Pester testing.
Note: only tested in PowerShell 7, but I suspect it will work in older versions too.
function Get-Function {
<#
.SYNOPSIS
Returns a named function from a .ps1 file without executing the file
.DESCRIPTION
This is useful where you have a blended file containing functions and executed instructions.
If neither -Names nor -Regex are provided then all functions in the file are returned.
Returned objects can be piped directly into Invoke-Expression which will place them into the current scope.
Returns an array of objects with the following
- .ToString()
- .Name
- .Parameters
- .Body
- .Extent
- .IsFilter
- .IsWorkFlow
- .Parent
.PARAMETER -Names
Array of Strings; Optional
If provided then function objects of these names will be returned
The name must exactly match the provided value
Case Insensitive.
.PARAMETER -Regex
Regular Expression; Optional
If provided then function objects with names that match will be returned
Case Insensitive
.EXAMPLE
Get all the functions names included in the file
Get-Function -name TestA | select name
.EXAMPLE
Import a function into the current scope
Get-Function -name TestA | Invoke-Expression
#>
param (
$File = "c:\fullpath\SomePowerShellScriptFile.ps1"
,
[alias("Name", "FunctionNames", "Functions")]
$Names
,
[alias("NameRegex")]
$Regex
) # end param block
# get the script file named by -File and parse it
$Script = Get-Command $File
$AllFunctions = $Script.ScriptBlock.AST.FindAll({$args[0] -is [Management.Automation.Language.FunctionDefinitionAst]}, $false)
# return all requested functions
$AllFunctions | Where-Object { `
( $Names -and $Names -icontains $_.Name ) `
-or ( $Regex -and $_.Name -imatch $Regex ) `
-or (-not $Names -and -not $Regex) `
} # end where-object
} # end function Get-Function

What is conceptually wrong with get-date|Write-Host($_)

I'm trying to understand PowerShell, but find some things not so intuitive. What I understand is that objects are passed through the pipeline, instead of text as in traditional shells. And $_ refers to the current object in the pipeline. Then why is the following not working:
get-date|Write-Host "$_"
The errormessage is:
Write-Host : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
At line:1 char:10
+ get-date|Write-Host $_
+ ~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (10-9-2014 15:17:00:PSObject) [Write-Host], ParameterBindingException
+ FullyQualifiedErrorId : InputObjectNotBound,Microsoft.PowerShell.Commands.WriteHostCommand
$_ is the current single item in the pipeline. To write each item in the pipeline you would write
get-date | foreach { Write-Host $_ }
Or in the short form
get-date | % { Write-Host $_ }
Conceptually, ForEach-Object is a cmdlet that receives a script block as a parameter, takes pipeline input, and applies the script block to each item in the pipeline. You can't just write code with $_ anywhere - it has to appear inside something that explicitly accepts pipeline input, such as the ForEach-Object script block.
And $_ refers to the current object in the pipeline
Indeed, the automatic $_ variable refers to the current pipeline object, but only in script blocks { ... }, notably those passed to the ForEach-Object and Where-Object cmdlets.
Outside of script blocks it has no meaningful value.
Therefore, the immediate fix to your command is the following:
Get-Date | ForEach-Object { Write-Host $_ }
However, note that:
Write-Host is typically the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file.
To output a value, use it by itself; e.g., $value instead of Write-Host $value (or use Write-Output $value); see this answer. To explicitly print only to the display but with rich formatting, use Out-Host.
Therefore, if merely outputting each pipeline input object is the goal, Get-Date | ForEach-Object { $_ } would do, where the ForEach-Object call is redundant if each input object is to simply be passed through (without transformation); that is, in the latter case just Get-Date would do.
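A minimal illustration of the difference (variable names are just for demonstration): output written to the success stream can be captured, while Write-Host output cannot:
$fromOutput = Get-Date | ForEach-Object { $_ }             # $fromOutput now holds the [datetime] object
$fromHost   = Get-Date | ForEach-Object { Write-Host $_ }  # prints to the display only; $fromHost stays empty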
As for what you tried:
get-date|Write-Host "$_"
As noted, the use of $_ in this context is pointless, but the reason for the error message you saw is unrelated to that problem:
Instead, the reason for the error is that you're mistakenly trying to provide input to Write-Host both via the pipeline (Get-Date | Write-Host ...) and by way of an argument (... | Write-Host "$_").
Given that the argument ("$_") (positionally) binds to the -Object parameter, the pipeline input then has no parameter left to bind to, which causes the error at hand.