Could you please help me with this script? I am writing a PowerShell script to monitor data from a .csv file in PRTG: basically, the script makes an SFTP connection to a remote server and brings the file back to the PRTG server.
Then I just look up the data. The issue I have is selecting a specific file: a new file with a specific date is generated on the remote server every hour, and part of the name has no predictable pattern.
I have already tested the script and it works, except for looking up the specific file; if I put in a static file name, it works.
These are examples of the .csv files that I need.
The structure is name + date (Y-m-d-H) + 8 more characters + .csv:
RG_400_Total_by_Hour-20210606021446-174.csv
RG_400_Total_by_Hour-20210606031630-456.csv
RG_400_Total_by_Hour-20210606041445-288.csv
RG_400_Total_by_Hour-20210606051513-761.csv
RG_400_Total_by_Hour-20210606061446-877.csv
RG_400_Total_by_Hour-20210606071446-454.csv
RG_400_Total_by_Hour-20210606081446-838.csv
RG_400_Total_by_Hour-20210606091505-171.csv
RG_400_Total_by_Hour-20210606101447-220.csv
This is the part where I am completely stuck.
How can I use a wildcard in this variable that I'm trying to build?
$file=$remotepath+"/RG_400_Total_by_Hour-"+$date+""+$WILDCARD???+""
Here is part of the script:
$date= get-date -UFormat "%Y%m%d%H"
$t= Look for wildcard to use
$folder=$date
$hora= [int] (get-date (get-date).addhours(-2) -UFormat "%H")
$ayer = (get-date (get-date).addDays(-1) -UFormat "%Y%m%d")
if ($hora -eq 22) {$folder=$ayer}
if ($hora -eq 23) {$folder=$ayer}
$remotedir="/usr/local/sandvine/var/nds/cache/csv"
$remotepath= $remotedir
$file=$remotepath+"/RG_400_Total_by_Hour-"+$b+""+$t+""
$file2=$localpath+"/RG_400_Total_by_Hour-"+$b+""+$t+""
Get-SFTPFile -SFTPsession $global:Session -remotefile $file -localpath $localpath
Remove-SFTPSession -sftpsession $global:Session | Out-Null
I really appreciate any comment, thanks guys.
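One possible approach, rather than building the exact name up front, is to list the remote directory and match the fixed prefix plus the date with -like. This is only a sketch: it assumes the Posh-SSH module (which already provides the Get-SFTPFile used above) also exposes Get-SFTPChildItem in the installed version, and that the returned items have Name and FullName properties. $date, $remotedir, $localpath and $global:Session are the variables from the script above.
# Pattern: fixed prefix + current date, anything after that
$pattern = "RG_400_Total_by_Hour-$date*.csv"
# List the remote folder and keep the last (newest) file whose name matches the pattern
$newest = Get-SFTPChildItem -SFTPSession $global:Session -Path $remotedir |
    Where-Object { $_.Name -like $pattern } |
    Sort-Object Name |
    Select-Object -Last 1
if ($newest) {
    # The exact name (including the random suffix) is now known
    Get-SFTPFile -SFTPSession $global:Session -RemoteFile $newest.FullName -LocalPath $localpath
}
else {
    Write-Host "No file matching $pattern was found in $remotedir"
}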
Related
I recently wrote a script and want to share it with my colleagues. It's a simple copy-and-paste program that creates log files after each run. The problem is that I used this: Start-Transcript -Path C:\Users…
The program works fine, but if anyone else runs the script it won't be able to create log files, since the path is hard-coded to my own directory.
Now to my question: is there any way the program can find out the directory where each user saved the script, so it can create a sub-folder in that directory and then dump the logs in there?
Thank you in advance
The path to the folder containing the currently executing script can be obtained through the $PSScriptRoot automatic variable:
Start-Transcript -OutputDirectory $PSScriptRoot
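Since the question also asks for a sub-folder next to the script, a small extension of the same idea; the "Logs" folder name below is just an example, not something from the question:
# Create a "Logs" sub-folder next to the script if it does not exist yet
$logDir = Join-Path $PSScriptRoot 'Logs'
if (-not (Test-Path $logDir)) {
    New-Item -Path $logDir -ItemType Directory | Out-Null
}
# Write the transcript into that sub-folder
Start-Transcript -OutputDirectory $logDir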
Here's how I record PowerShell sessions using the Start-Transcript cmdlet. It creates a log file in the directory the script is run from.
#Log File Paths
$Path = (Get-Location).Path
$Date = Get-Date -Format "yyyy-MM-dd-hh-mm-ss"
$PostLog = Join-Path $Path "$Date-test.log"
#Start Transcript
Start-Transcript -Path $PostLog -NoClobber -IncludeInvocationHeader
#Your Script
Get-Date
#End Transcript
Stop-Transcript -ErrorAction Ignore
I have an Excel file that is used to perform certain operations on the database, and that sheet needs to be copied right when the update is made. I have the script that does the update on the database, but before that I'm using the Copy-Item command to save a copy of the source Excel file under a new name. However, I'm getting a generic error like:
"Error getting/reading excel file \\shared\documents\myData.xlsx"
Is that a permission issue, or is my syntax incorrect? Or should I use the robocopy command in PowerShell? Here's the chunk of PowerShell code that I have written so far:
#source file location
$fileLocation = "\\shared\documents\myData.xlsx"
#getting current date
$currentDate = Get-Date -Format "yyyy-MM-dd HH:mm"
#destination location
$backUpLocation = "\\shared\backup\myData_" + $currentDate + ".xlsx"
#copying file
Copy-Item $fileLocation $backUpLocation
Any better solution is appreciated. Thanks in Advance.
Shoot! There was a silly mistake: a file name cannot contain ':', and the $currentDate I was fetching had a colon in it, which was breaking the copy. Copy-Item works best, as it also allows renaming the file in the same command.
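For anyone hitting the same error, the fix is simply a date format with no ':' (or any other character that is invalid in Windows file names); the exact format below is just one choice:
#source file location
$fileLocation = "\\shared\documents\myData.xlsx"
#date format with no ':' so it is valid in a file name
$currentDate = Get-Date -Format "yyyy-MM-dd_HH-mm"
#destination location - copy and rename in one step
$backUpLocation = "\\shared\backup\myData_" + $currentDate + ".xlsx"
Copy-Item $fileLocation $backUpLocation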
I am trying to create a separate script that will edit the file and add the year to its contents. How would I be able to do that?
$content = "List of running Services"
$content | Out-File C:\Windows\Temp\test
$textfile = Get-Content C:\Windows\Temp\test
Write-Host $textfile
Continuing from my comments.
What you need is covered in the PowerShell help files.
Use the Add-Content cmdlet; the examples in the PowerShell help files show how:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/add-content?view=powershell-7.1
# Example 1: Add a string to all text files with an exception
Add-Content -Path .\*.txt -Exclude help* -Value 'End of file'
# Example 2: Add a date to the end of the specified files
This example appends the date to files in the current directory and displays the date in the PowerShell console.
Add-Content -Path .\DateTimeFile1.log, .\DateTimeFile2.log -Value (Get-Date) -PassThru
The above examples show how to add strings to a file and how to use the date. You can combine them: use Example 2 to get your date info and Example 1 to add that date to the file contents.
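Applied to the file from the question, appending just the year might look like this (a sketch; adjust the path or value format as needed):
# Append the current year to the file created by the original script
Add-Content -Path C:\Windows\Temp\test -Value (Get-Date).Year
# Show the result
Get-Content C:\Windows\Temp\test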
Complete PowerShell newbie so please forgive me as I might mess up what I am asking.
I have a file called debug.log, I need to rename it and move it.
I know I can do that with Move-Item, but I would like to append the Hostname to the front of the file name and the date stamp to the end.
For example:
In: C:\Temp\debug.log
Out: \\sharename\MyHostdebug20201006.log
The files will go through a log parser after they are copied to the share, and the hostname and date stamp are important for tracking which logs get processed. I have to do this for 8 separate servers on a monthly basis. I plan on setting this up as a Windows Scheduled Task when it is completed.
Using Get-Date and, as mentioned by @Scepticalist above, $env:ComputerName, you can create a string variable which you can pass into your Move-Item as the destination:
$timestamp = get-date -Format yyyyMMdd
$newPath = "\\sharename\" + $env:computername + "debug" + $timestamp + ".log"
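The move itself is then a single call, using the $newPath built above and the source path from the question:
# Move and rename in one step
Move-Item -Path C:\Temp\debug.log -Destination $newPath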
I have a remote server where one file will be uploaded per day. I don't know when the file will be uploaded. I need to COPY this file to another server for processing, and I need to do this just once per file (once a day). When the file is uploaded to the remote server, I need to copy it within an hour, so I have to run this script at least once per hour. I'm using this script:
# Get yesterday date
$date = (Get-Date).Adddays(-1) | Get-Date -Format yyyyMMdd
$check = ""
$check = Get-Content c:\checkiftransfered.txt
# Test if file checkiftransfered.txt contains True or False. If it contains True, the file for this day was already copied
if ($check -ne "True") {
# Test if the file exists - it has a specific name and yesterday's date
if(Test-Path \\remoteserver\folder\abc_$date.xls) {
Copy-Item \\remoteserver\folder\abc_$date.xls \\remoteserver2\folder\abc_$date.xls
# Write down information that the file was already copied
"True" | Out-File c:\checkiftransfered.txt
} else { Write-Host "File has not been uploaded."}
} else { Write-Host "File has been copyied."}
# + I will need another script that will delete the checkiftransfered.txt at 0:00
It will work fine, I think, but I'm looking for a more elegant solution - the best way to solve it. Thank you.
In PowerShell V3, Test-Path has handy -NewerThan and -OlderThan parameters, so you could simplify to this:
$yesterday = (Get-Date).AddDays(-1)
$date = $yesterday | Get-Date -Format yyyyMMdd
$path = "\\remoteserver\folder\abc_$date.xls"
if (Test-Path $path -NewerThan $yesterday)
{
Copy-Item $path \\remoteserver2\folder\abc_$date.xls -Verbose
(Get-Item $path).LastWriteTime = $yesterday
}
This eliminates the need to track copy status in a separate file by using the LastWriteTime. One note about using -NewerThan and -OlderThan: don't use them together, as it doesn't work as expected.
And lest we forget about some great native tools, here's a solution using robocopy:
robocopy $srcdir $destdir /maxage:1 /mot:60
The /mot:n option will cause robocopy to continuously monitor the source dir - every 60 minutes as specified above.
There is a much, much easier and more reliable way. You can use the FileSystemWatcher class.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Uploads'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$created = Register-ObjectEvent $watcher "Created" -Action {
    Start-Sleep -Seconds (30*60)
    Copy-Item $($eventArgs.FullPath) '\\remoteserver2\folder\'
}
So let's take a look at what we are doing here: we create a new watcher and tell it to watch C:\Uploads. When a new file is uploaded there, the file system sends a notification through the framework to our program, which in turn fires the Created event. When that happens, we tell our program to sleep for 30 minutes to allow the upload to finish (that may be too long, depending on the size of the upload), then we call Copy-Item on the event arguments, which contain the full path to our new file.
By the way, you would need to paste this into a PowerShell window and leave it open on the server; alternatively, you could use the ISE and leave that open. Either way, it is far more reliable than what you currently have.