I created a log file using PowerShell, saved as .txt or .config. It looks like this:
# Hello
## This is readme.txt file
## This is commented line
# This is powershell command output
#
# File generated to read, re-load and edit the values
## -- many more comments are there
# Users can change values..
## There is no relation between $RegPath and $RegValue; these are only variables.
# these are registry paths,
$RegPath = (
"\\hklm\software\microsoft\123",
"\\hklm\software\Adobe\123",
"\\hklm\software\Fax\123",
"\\hklm\software\IE\123");
# these are registry values.
$RegValue = (
"0",
"123",
"abc",
"456asdccxv",
"update",
"serv");
# these are some services with 0/1
# Win 7 OS exists
$IsWin7OS = 1
# Service pack installed
$IsSPInstalled = 0
# Check office
$MSOffice = 1
# This setting name is
$SettingName = "ReadMe.txt"
This is a sample ReadMe.txt. I want to read this file in PowerShell and get the values of $RegPath, $RegValue, $IsWin7OS, $IsSPInstalled, $MSOffice and $SettingName. Then I will update these values and save them back to the same file.
DISCLAIMER: This is a security WORST practice. It really is quite dangerous. That said, an easy way to read in this data is:
PS> Invoke-Expression (Get-Content C:\readfile.txt -Raw)
PS> $RegPath[0]
\\hklm\software\microsoft\123
The reason it is bad is that if someone adds Remove-Item C:\ -Recurse -Force to the file, the command above will execute it, and it will be a bad day for you. What I recommend is that you put the data in a .psd1 file if you can, in the form of a hashtable; PowerShell will not execute arbitrary code in that case. Or you could store the data as CLIXML or JSON and then read it back in with Import-Clixml or ConvertFrom-Json.
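As a minimal sketch of the JSON round trip (the C:\settings.json path is hypothetical; the keys mirror the ReadMe.txt above):
# Write the settings out once as JSON; reading JSON back never executes code
$settings = @{
    RegPath       = "\\hklm\software\microsoft\123", "\\hklm\software\Adobe\123"
    IsWin7OS      = 1
    IsSPInstalled = 0
    MSOffice      = 1
    SettingName   = "ReadMe.txt"
}
$settings | ConvertTo-Json | Set-Content C:\settings.json
# Read it back, update a value, and save it to the same file again
$settings = Get-Content C:\settings.json -Raw | ConvertFrom-Json
$settings.IsSPInstalled = 1
$settings | ConvertTo-Json | Set-Content C:\settings.json
On PowerShell 5 and later, Import-PowerShellDataFile gives you the same no-code-execution guarantee when reading a .psd1 hashtable.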
I am working on a PowerShell script which will pull files from an FTP site. The files are uploaded to the FTP site every hour, so I need to download the most recent one. The code I currently have downloads all of today's files instead of just one file. How do I make it download only the most recent file?
Here is the code that I am currently using:
$ftpPath = 'ftp://***.***.*.*'
$ftpUser = '******'
$ftpPass = '******'
$localPath = 'C:\Temp'
$Date = get-date -Format "ddMMyyyy"
$Files = 'File1', 'File2'
function Get-FtpDir ($url, $credentials)
{
    # List the file names in the remote directory (FTP NLST)
    $request = [Net.FtpWebRequest]::Create($url)
    if ($credentials) { $request.Credentials = $credentials }
    $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    (New-Object IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd() -split "`r`n"
}
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($ftpUser,$ftpPass)
$webclient.BaseAddress = $ftpPath
Foreach ( $item in $Files )
{
    Get-FtpDir $ftpPath $webclient.Credentials |
    ? { $_ -Like $item+$Date+'*' } |
    % {
        $webclient.DownloadFile($_, (Join-Path $localPath $_))
    }
}
It's not easy with the FtpWebRequest. For your task, you need to know file timestamps.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by FtpWebRequest/.NET Framework/PowerShell, as they do not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format. The command and the format are standardized by RFC 3659.
Alternatives you can use that are supported by the .NET Framework:
The ListDirectoryDetails method (an FTP LIST command) retrieves details of all files in a directory, and you then parse the FTP-server-specific format of the details. A *nix format similar to the *nix ls command is the most common; a drawback is that the format may change over time (for newer files the "May 8 17:48" format is used, for older files the "Oct 18 2009" format).
The GetDateTimestamp method (an FTP MDTM command) retrieves the timestamp of each file individually. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient; see the sketch just below.
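As a rough sketch of the MDTM approach (untested; it reuses the Get-FtpDir function and the $ftpPath, $ftpUser and $ftpPass variables from the question, and assumes the server supports MDTM):
# Ask the server for each file's timestamp (MDTM) and keep the newest
$credentials = New-Object System.Net.NetworkCredential($ftpUser, $ftpPass)
$latest = $null
foreach ($name in (Get-FtpDir $ftpPath $credentials | Where-Object { $_ }))
{
    $request = [System.Net.FtpWebRequest]::Create("$ftpPath/$name")
    $request.Credentials = $credentials
    $request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
    $response = $request.GetResponse()
    # FtpWebResponse.LastModified holds the parsed MDTM reply
    if (-not $latest -or $response.LastModified -gt $latest.Time)
    {
        $latest = @{ Name = $name; Time = $response.LastModified }
    }
    $response.Close()
}
$latest.Name  # name of the most recent file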
Some references:
C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
Parsing FtpWebRequest ListDirectoryDetails line
Retrieving creation date of file (FTP)
Alternatively, use a 3rd party FTP library that supports the MLSD command, and/or supports parsing of the proprietary listing format.
For example, the WinSCP .NET assembly supports both.
Example code:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "example.com"
    UserName = "user"
    Password = "mypassword"
}
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -Not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Any file at all?
if ($latest -eq $Null)
{
    Write-Host "No file found"
    exit 1
}
# Download the selected file
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($remotePath + $latest.Name)
$session.GetFiles($sourcePath, $localPath).Check()
For the full code, see Downloading the most recent file (PowerShell).
(I'm the author of WinSCP)
I tried this, but I get an error:
Error: Exception calling "ListDirectory" with "1" argument(s): "Error listing directory '/path/'.
Could not retrieve directory listing
Can't open data connection for transfer of "/path/"
I read a lot about this problem on the internet, but could not find a solution which seemed fairly simple, and I am not a network setup wizard. So I chose a different approach. In our case, the filename of the file whose download I want to automate has the date in it: backup_2018_08_03_020003_1048387.bak
So we can get the file by using mget *2018_08_03* in a command-line ftp session.
Our backup procedure runs every morning at 01:00 AM, so we have a backup each day that we can fetch.
Of course it would have been prettier to have a script that fetched the latest backup file based on the backup file timestamps, just in case something went wrong with the latest backup or the naming format changes. The script only fetches the backup for internal development purposes, so it's not a big deal if it breaks. I will look into this later and check whether I can make a cleaner solution.
I made a batch script which simply asks for today's backup file with ordinary ftp command-prompt scripting.
It is important to get the formatting of today's date right: it must match the formatting of the date in the filename exactly.
If you want to use the script you should replace the variables with your own information. You should also have write access to the directory where you run it from.
This is the script that I made:
@Echo Off
Set _FTPServerName=xxx.xxx.xx.xxx
Set _UserName=Username
Set _Password=Password
Set _LocalFolder=C:\Temp
Set _RemoteFolder="/path/"
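:: Build a *YYYY_MM_DD* wildcard from %date% (the substring offsets depend on the locale's date format)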
Set _Filename=*%date:~-4,4%_%date:~-7,2%_%date:~-10,2%*
Set _ScriptFile=ftptempscript
:: Create script
>"%_ScriptFile%" Echo open %_FTPServerName%
>>"%_ScriptFile%" Echo %_UserName%
>>"%_ScriptFile%" Echo %_Password%
>>"%_ScriptFile%" Echo lcd %_LocalFolder%
>>"%_ScriptFile%" Echo cd %_RemoteFolder%
>>"%_ScriptFile%" Echo binary
>>"%_ScriptFile%" Echo mget -i %_Filename%
>>"%_ScriptFile%" Echo quit
:: Run script
ftp -s:"%_ScriptFile%"
del "%_ScriptFile%"
I have a command which runs a program in silent mode; it uses an XML file as the data repository and a Word template to create multiple Word documents, based on a filter XML file.
The command I use is:
"P:\ath to\executable" -Username:Admin -Password:Pa55w0rd -Datadefinition:"C:\Data.xml" -Datafilter:"C:\Filter.xml" -wordtemplate:"C:\Batch\Paul1.dotx" -Targetdocument:="C:\Batch\Paul1.pdf" -filetype:PDF -Log:"C:\Logs\error.log" -Usage:DOCGENSILENT
I need to run this as a PowerShell script, which I have mostly managed:
set-executionpolicy unrestricted
$datadefinition = Get-Content "C:\Data file.xml"
$datafilter = Get-Content "C:\Filter for data file.xml"
$wordTemplate = Get-Content "C:\"C:\Template\Paul1.dotx"
$targetFolder = Get-Content "C:\"C:\Paul\Paul.pdf"
Stop-Job = "Executable path" -Username:Admin -Password:Pa55w0rd -Datadefinition:%dataDefinition% -Datafilter:%dataFilter% -wordtemplate:%wordTemplate% -Targetdocument:%targetFolder% -filetype:docx -Log:%logPath% -Usage:DOCGENSILENT
Stop-Job 1
set-executionpolicy restricted
Write-Host -NoNewLine "Press any key to continue..."
$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
My issue is that the script starts the executable but then doesn't pass the variables. Can anyone guide me in the right direction to fix this?
Getting this working depends on the behavior of your executable. Some things I noticed:
Shouldn't this:
$wordTemplate = Get-Content "C:\"C:\Template\Paul1.dotx"
be this:
$wordTemplate = "C:\Template\Paul1.dotx"
Are you sure you need Get-Content? (Aside from that, the path and quoting in your sample are not correct.)
Shouldn't this:
$targetFolder = Get-Content "C:\"C:\Paul\Paul.pdf"
be this:
$targetDocument = "C:\Paul\Paul.pdf"
I doubt Get-Content is correct here, since presumably your output file doesn't exist yet? I also renamed the variable so it makes more sense in your command.
In fact, are you sure you need Get-Content for any of those? Aren't you specifying filenames, not the content of the files?
In PowerShell, variables are prefixed with $ rather than being surrounded by %.
Using Set-ExecutionPolicy within a script to enable scripts to run is pointless, because the script is already running. (That is, if execution policy prevented script execution, PowerShell wouldn't let you run the script in the first place.)
If my guesses regarding your variables are correct, I think your script should look something like this (note also that I specified a $logFile variable, which I didn't see in your script):
$datadefinition = "C:\Users\Administrator\data\Sample Model_146_object type(s).xml"
$datafilter = "C:\Users\Administrator\data\Sample Model_146_object type(s).xml"
$wordtemplate = "C:\Users\Administrator\Templates\Base object.docx"
$targetdocument = "C:\Users\Administrator\Result\sample test15"
$logfile = "C:\Users\Administrator\Logs\C4W Error.log"
& "C:\Program Files (x86)\Communicator4Word.exe" -Username:Admin -Password: -Datadefinition:$datadefinition -Datafilter:$datafilter -wordtemplate:$wordtemplate -Targetdocument:$targetdocument -filetype:docx -Log:$logfile -Usage:DOCGENSILENT
I don't know the behavior of Communicator4Word.exe when you use -Password: with no password after it. (Is that a syntax error, or should you just omit -Password: altogether?)
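If the arguments still don't arrive at the executable, a variant worth trying (a sketch only, untested against Communicator4Word.exe) is to collect them in an array, which PowerShell passes to a native program as separate arguments:
$argList = @(
    "-Username:Admin",
    "-Datadefinition:$datadefinition",
    "-Datafilter:$datafilter",
    "-wordtemplate:$wordtemplate",
    "-Targetdocument:$targetdocument",
    "-filetype:docx",
    "-Log:$logfile",
    "-Usage:DOCGENSILENT"
)
# Each array element becomes its own argument; PowerShell quotes values containing spaces
& "C:\Program Files (x86)\Communicator4Word.exe" $argList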
I've got a PowerShell script to create a VM from an image in Azure, and in this script I reference a .json file (parameters for the VM, etc.). But if I want to create more than one VM, the names of the VM, VNet, etc. cannot be the same for every execution (they have to be in the same resource group).
So my question: how can I insert variables into the .json file to change the name of the VM, etc. on every execution? Or perhaps I have to rethink my approach?
A very basic approach could be something like this:
# Grab the file contents
$contents = Get-Content -Path $templateFile
# Update some tokens in the file contents
$contents = $contents.replace("original value", "new value")
# Push the updated contents to a new file
Set-Content -Path $updatedFile -Value $contents
If you have a value that changes with every deployment, you could also consider using the -TemplateParameterObject parameter with the New-AzureRmResourceGroupDeployment cmdlet. That way, you can generate the values in your PowerShell script without having to output them to a JSON file first.
For more details, have a look at the cmdlet documentation.
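As a small sketch (the vmName/vnetName parameter names are made up and must match the parameters your template actually declares):
# Generate per-deployment values and pass them as a hashtable
$suffix = Get-Random -Maximum 9999
$parameters = @{
    vmName   = "myvm-$suffix"
    vnetName = "myvnet-$suffix"
}
New-AzureRmResourceGroupDeployment -ResourceGroupName "MyResourceGroup" `
    -TemplateFile $templateFile -TemplateParameterObject $parameters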
Is there a way to provide PowerShell parameters with a file?
At the moment I have a script which is called My_Script.ps1. To start this script I have to provide the right parameters in the command:
.\My_Script.ps1 -param1="x" -param2="x" -param3="x" -param4="x" -param5="x" -param6="x" ...
This works, but it isn't a very easy way to start the script. Is it possible in PowerShell to use a file in which you store your parameters, and to use that file when you start the script?
Example
In My_Script.ps1 I add something like:
Param(
[string]$File="Path/to/file"
)
In my file I have something like
param1="x"
param2="x"
param3="x"
param4="x"
...
To execute the script, you then just edit the file and start the script with .\My_Script.ps1
Another option:
Just use a .ps1 file as the config file and define your variables as you would in your main script:
$Param1 = "Value"
$Param2 = 42
Then you can use dot-sourcing or Import-Module to get the data from the config file:
. .\configfile.ps1
or
Import-Module .\Configfile.ps1
Afterwards you can just use the variables.
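For example, combined with the My_Script.ps1 from the question:
# Dot-source the config, then pass its variables on as parameters
. .\configfile.ps1
.\My_Script.ps1 -param1 $Param1 -param2 $Param2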
In addition to splatting, you can create variables from a file of name=value pairs like this:
param1=foo
param2=bar
param3=herp
param4=derp
Don't quote the values. The parameter names should be valid variable names (no spaces, etc.).
PowerShell 3 and newer:
(Get-Content c:\params.ini -raw | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
PowerShell 2:
([IO.File]::ReadAllText('c:\params.ini') | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
The code creates the variables in the current scope. It's possible to create them in the global, script, or parent scope instead.
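For example, to create the variables in the caller's (parent) scope instead, add a -Scope argument:
(Get-Content c:\params.ini -raw | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value -Scope 1 }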
You can use this blog posting for a start and declare your parameters in an INI-like format.
You could also use a CSV-like format and work with the Import-Csv cmdlet.
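A small sketch of that variant (the params.csv file and its name,value header row are assumptions):
# params.csv contains:
#   name,value
#   param1,foo
#   param2,bar
Import-Csv c:\params.csv | ForEach-Object { Set-Variable $_.name $_.value }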