I'm new to PowerShell and scripting overall, but I've tried to solve my problem for a few days now and, even after researching Stack Overflow and the web, I can't find a solution.
I'm trying to write a script to download a fixed set of files (.jdb, .exe) from a website.
One part of the filename is always the same, e.g. -061-IPS_IU_SEP_14RU1.jdb, but the full filename is 20201120-061-IPS_IU_SEP_14RU1.jdb.
The first part is the date the files were created.
So far I was able to download all the files using the following code:
$filename = @(
"-061-IPS_IU_SEP_14_0.exe",
"-061-IPS_IU_SEP_14_0.jdb",
"-061-IPS_IU_SEP_14_0_MP2.exe",
"-061-IPS_IU_SEP_14_0_MP2.jdb",
"-061-IPS_IU_SEP_14RU1.exe",
"-061-IPS_IU_SEP_14RU1.jdb",
"-061-IPS_IU_SEP_14.2_RU1.exe",
"-061-IPS_IU_SEP_14.2_RU1.jdb",
"-061-IPS_IU_SEP_14.2_RU2.exe",
"-061-IPS_IU_SEP_14.2_RU2.jdb"
)
# Target directory
$output = "C:\IPS14\"
$url = "http://definitions.symantec.com/defs/ips/"
$Date = Get-Date -format yyyyMMdd ((Get-Date).AddDays(0))
$fullurl = ("$url" + $Date + $filename[0])
for ($i = 0; $i -lt $filename.Length; $i++) {
    $fullurl = ($url + ($Date - 1) + $filename[$i])
    Try {
        Start-BitsTransfer -Source $fullurl -Destination $output
        Write-Host ($fullurl + " Downloading")
    }
    Catch {}
}
The current problem is that some files are not updated daily: some are from 1 day ago, others are 3 or more days old.
I only need the latest updated files.
Since downloading the files wasn't my problem, I tried something like
$IPSindex = 'https://definitions.symantec.com/defs/download/symantec_enterprise/ips/index.html'
(Invoke-WebRequest -Uri $IPSindex).Links | Sort-Object href -Unique | Format-List innerText, href
to list all files on the page. But now I need to filter the latest href using the $filename array.
That's where I'm stuck. Hope you can help me.
Greetings
I have modified your snippet to extract the date part of the filename, convert it into a datetime array, and sort that array to find the most recent update.
$IPSindex = 'https://definitions.symantec.com/defs/download/symantec_enterprise/ips/index.html'
$links = ((Invoke-WebRequest -Uri $IPSindex).Links).href
$dates = @()
foreach ($link in $links){
if($link -like "*IPS*"){
$dates += $link.split("/")[-1].split("-")[0]
}
}
$dates = $dates | Select-Object -Unique | ForEach-Object { [datetime]::ParseExact($_, "yyyyMMdd", $null) } | Sort-Object -Descending
Write-Output "The latest available definition date: $($dates[0])"
I am pulling data on adoption of Office 365 products on a daily basis. I don't know how to convert my current logic, which writes to a new file based on file size, to one based on the report date.
My original thought process was to use an if statement to split the data out by month and have 12 files ready to append to (depending on the month of the data), but this seems inefficient.
$name = "O365SPSiteActivity.csv"
$auth=Get-AuthCode
$accesstoken=$auth[1]
### data pulling process has been omitted ###
if ($report -ne $null)
{
###New section for making the new files
#Get current file
$source = "D:\O365Data\"+ $name
$File = Get-Item $source
If (((Get-Item $file).Length/1MB) -ge 700)
{
$date = (get-date -Format dd-MM-yyyy)
$RenamedFileName = "O365SPSiteActivity-$date.csv"
Rename-Item $file.FullName -NewName $RenamedFileName
$FileName = "D:\temp\" + $name
Send-MailMessage -From svc_sps10@kbslp.com -To shelby.cundiff@kbslp.com -Subject "New File Has been Created" -Body "New File Name: $RenamedFileName" -SmtpServer kbslp-com.mail.protection.outlook.com -Port 25
}
Else
{
$FileName = "D:\temp\" + $name
Copy $File $FileName
}
#########################################################################
$Data = @()
$c=1
foreach ($row in $report)
{ Write-Progress -Activity $row.'User Principal Name' -PercentComplete (($c/$report.count)*100) -ID 4
$string = "" | Select "???Report Refresh Date","User Principal Name","Is Deleted","Deleted Date","Last Activity Date","Viewed or Edited File Count",
"Synced File Count","Shared Internally File Count","Shared Externally File Count","Visited Page Count","Report Period"
$string.'???Report Refresh Date' = Get-Date($row.'Report Refresh Date') -Format "yyyy-MM-dd"
$string.'User Principal Name' = $row.'User Principal Name'
$string.'Is Deleted' = $row.'Is Deleted'
$string.'Deleted Date' = $row.'Deleted Date'
$string.'Last Activity Date' = $row.'Last Activity Date'
$string.'Viewed or Edited File Count' = $row.'Viewed or Edited File Count'
$string.'Synced File Count' = $row.'Synced File Count'
$string.'Shared Internally File Count' = $row.'Shared Internally File Count'
$string.'Shared Externally File Count' = $row.'Shared Externally File Count'
$string.'Visited Page Count' = $row.'Visited Page Count'
$string.'Report Period' = $row.'Report Period'
$Data += $string
$c++
}
$Data | Export-Csv -Append -Path $FileName -NoTypeInformation -Force
#$FolderUrl = $teamSitePath + "/" + $ListName
#$UploadFileInfo = New-Object System.IO.FileInfo($FileName)
#Upload-SPOFile -WebUrl $teamSiteUrl -spCredentials $SPOCreds -FolderUrl $FolderUrl -FileInfo $UploadFileInfo
$newFile = Get-Item $FileName
Copy $newFile $File.FullName
}
$report = $null
$Data = $null
Ideally, I'd like to change this script to write to a file like
O365SPSiteActivity-2019-Oct.csv during October, then O365SPSiteActivity-2019-Nov.csv during November, etc., depending on when the data is from.
Why would you write to a temporary file first and copy it over a (possibly existing) file when done?
If I understand the question, you would like to create a new report csv each month.
Then, why not simply do something like this:
# create a filename for this month
$currentReport = 'O365SPSiteActivity-{0:yyyy-MMM}.csv' -f (Get-Date)
and Export-Csv -Append your data into that? If the file does not already exist, it will be created; otherwise the new data will be appended to it.
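A minimal sketch of that, assuming $Data is built exactly as in the question's loop and D:\O365Data is just a placeholder folder:
# Sketch: one CSV per month; Export-Csv -Append creates the file on first use.
$currentReport = Join-Path 'D:\O365Data' ('O365SPSiteActivity-{0:yyyy-MMM}.csv' -f (Get-Date))
$Data | Export-Csv -Path $currentReport -NoTypeInformation -Append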
Are you looking for the current date, or the date from the spreadsheet?
If it's the current date, then
$date = Get-Date -Format yyyy-MMM
will give you O365SPSiteActivity-2019-Oct.csv if you run it today (the dd-MM-yyyy format in your script would give a full day-month-year string instead).
If you're looking to chunk up the CSV based on dates within the data, I might move the Export-Csv inside the foreach loop and have it use a name built as above, instead of appending each row to the $Data array.
$Data = "2019-May-31"
Get-Date -Format yyyy-MMM -Date ([datetime]::parseexact($Data, 'yyyy-MMM-dd', $null))
$FileName = "O365SPSiteActivity-$date.csv"
$Data | Export-Csv -Append -Path $FileName -NoTypeInformation -Force
That'll write each line to the appropriate CSV file; you'll end up with a different CSV file for every month's worth of data.
Note that the 'yyyy-MMM-dd' in ParseExact will need to match the format of the date you're feeding it. For example, if you're splitting on $row.'Report Period' and it has the date as 12/31/19, it would be 'MM/dd/yy' instead (notice the separator changes from - to /). The .NET custom date and time format strings documentation lists what each letter means.
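As a hedged sketch of that per-row idea (it assumes the source rows in $report have a 'Report Refresh Date' that Get-Date can parse, and D:\O365Data is a placeholder folder), you can also group the rows once and write one file per month instead of exporting on every iteration:
# Sketch: split the rows into one CSV per month of 'Report Refresh Date'.
$report |
    Group-Object { Get-Date $_.'Report Refresh Date' -Format 'yyyy-MMM' } |
    ForEach-Object {
        $monthFile = "D:\O365Data\O365SPSiteActivity-$($_.Name).csv"   # placeholder path
        $_.Group | Export-Csv -Path $monthFile -NoTypeInformation -Append
    }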
I am writing some PS scripts to log times into a text file, login.txt, using the following code:
$logdir = "C:\FOLDER"
$logfile = "$logdir\LastLogin.txt"
$user = $env:USERNAME
$date = Get-Date -Format "dd-MM-yyyy"
if (!(Test-Path $logdir)){New-Item -ItemType Directory $logdir}else{}
if (!(Test-Path $logfile)){New-Item $logfile}else{}
if (Get-Content $logfile | Select-String $user -Quiet){write-host "exists"}else{"$user - $date" | Add-Content -path $logfile}
(Get-Content $logfile) | Foreach-Object {$_ -replace "$user.+$", "$user - $date"; } | Set-Content $logfile
This creates an entry in the text file like:
UserName - 01-01-1999
Using Powershell, I want to read the text file, compare the date, 01-01-1999, in the text file to the current date and if more than 30 days difference, extract the UserName to a variable to be used later in the script.
I would really appreciate any hints as to how I could do the following:
Compare the date in the text file to the current date.
If difference is more than 30 days, pick up UserName as a variable.
I would really appreciate any advice.
Checking all dates in the file with the help of a RegEx with named capture groups.
$logdir = "C:\FOLDER"
$logfile = Join-Path $logdir "LastLogin.txt"
$Days = -30
$Expires = (Get-Date).AddDays($Days)
Get-Content $logfile | ForEach-Object {
if ($_ -match "(?<User>[^ ]+) - (?<LastLogin>[0-9\-]+)") {
$LastLogin = [datetime]::ParseExact($Matches.LastLogin,"dd-MM-yyyy",$Null)
if ( $Expires -gt $LastLogin ) {
"{0} last login {1} is {2:0} days ago" -F $Matches.User, $Matches.LastLogin,
(New-TimeSpan -Start $LastLogin -End (Get-Date) ).TotalDays
}
}
}
Sample output
username last login 31-12-1999 is 6690 days ago
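Since the question also wants the user names in a variable for later use, a small variation of the same loop (same log file and date format assumptions) collects them instead of printing:
# Sketch: gather the user names whose last login is more than 30 days old.
$ExpiredUsers = Get-Content $logfile | ForEach-Object {
    if ($_ -match "(?<User>[^ ]+) - (?<LastLogin>[0-9\-]+)") {
        $LastLogin = [datetime]::ParseExact($Matches.LastLogin, "dd-MM-yyyy", $Null)
        if ($Expires -gt $LastLogin) { $Matches.User }   # emitted names are collected into $ExpiredUsers
    }
}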
There is a way of doing that using regex (regular expressions). I will assume that the usernames in your text file are dot-separated, for example john.doe or jason.smith, so the entries look like john.doe - 01-01-1999 or jason.smith - 02-02-1999. Keeping these things in mind, our approach would be:
Using a regex we would get the username and date entry into a single variable.
Next up, we will split the pattern we have got in step 1 into two parts i.e. the username part and the date part.
Next we take the date part and if the difference is more than 30 days, we would take the other part (username) and store it in a variable.
So the code would look something like this -
$arr = @() # array to store the matched "username - date" entries
$pattern = "[a-z]*[.][a-z]*\s[-]\s[\d]{2}[-][\d]{2}[-][\d]{4}" # regex pattern to match entries like "john.doe - 01-01-1999"
Get-Content $logfile | Foreach {if ([Regex]::IsMatch($_, $pattern)) {
$arr += [Regex]::Match($_, $pattern)
}
}
$arr | Foreach {$_.Value} # display the matched patterns
$UserNamewithDate = $arr.Value -split ('\s[-]\s') # step 2 - splitting the username and date into alternating elements
$array = @() # array that will hold the final usernames based on the time difference
for($i = 1; $i -lt $UserNamewithDate.Length;)
{
$datepart = [Datetime]$UserNamewithDate[$i] #Casting the date part to [datetime] format
$CurrentDate = Get-Date
$diff = $CurrentDate - $datepart
if ($diff.Days -gt 30)
{
$array += $UserNamewithDate[$i -1] #If the difference between current date and the date received from the log is greater than 30 days, then store the corresponding username in $array
}
$i = $i + 2
}
Now you can access the usernames like $array[0], $array[1] and so on. Hope that helps!
NOTE - The regex pattern will change as per the format your usernames are defined. Here is a regex library which might turn out to be helpful.
I have a list of URLs for xml files saved in list.txt. I want to use list.txt to download the URLs and save them with incremental filenames: download1.xml, download2.xml, etc. How do I achieve this with Powershell?
I have the following code snippet as a starting point - this achieves the download from list, but not the incremental naming. Any help much appreciated.
$object = New-Object System.Net.WebClient
$lists = get-content C:\list.txt
$downdir = "C:\Download\"
foreach($list in $lists)
{
$filename = $list.split('/');
$object.DownloadFile($list, $downdir+$filename[$filename.count-1])
}
Also is there a way to send all the download requests at 5 second intervals rather than waiting for each single download to complete before sending the next request? My knowledge is limited, so specifics would be a big help. Many thanks.
$object = New-Object System.Net.WebClient
$lists = get-content C:\list.txt
$downdir = "C:\Download\"
$i = 1
foreach($list in $lists)
{
$filename = $list.Split('/') | Select-Object -Last 1
$newfile = $downdir + $filename.Split('.')[0] + $i + ".xml"
$object.DownloadFile($list, $newfile)
$i +=1
}
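For the second part of the question (starting the requests every 5 seconds instead of waiting for each download to finish), one option is the asynchronous download method of the same WebClient class. A rough, self-contained sketch using the same list file and folder as above; it creates one client per file because a single WebClient cannot run concurrent transfers, and a real script would still need to wait for the transfers before exiting:
# Sketch: start each download asynchronously, pausing 5 seconds between starts.
$lists = Get-Content C:\list.txt
$downdir = "C:\Download\"
$i = 1
foreach ($list in $lists)
{
    $client = New-Object System.Net.WebClient
    $target = Join-Path $downdir ("download$i.xml")
    $client.DownloadFileAsync([Uri]$list, $target)   # returns immediately
    Start-Sleep -Seconds 5
    $i += 1
}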
I've created the below script to check files remaining at a particular path on a server.
I have the below questions, please help me out:
How to change file.CreationTime to a 12-hour format.
How to export the entire contents to a file or email.
Kindly help me fine-tune the below script.
$fullPath = "\\server\D$\fn_1"
$numdays = 0
$numhours = 0
$nummins = 1
function ShowOldFiles($path, $days, $hours, $mins)
{
$files = @(Get-ChildItem $path -Include *.* -Recurse | Where {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.PsIsContainer -eq $false)})
if ($files -ne $NULL)
{
for ($idx = 0; $idx -lt $files.Length; $idx++)
{
$file = $files[$idx]
write-host ("File Name: " + $file.Name, ", Pending Since : " + $file.CreationTime) -Fore Red
}
}
}
ShowOldFiles $fullPath $numdays $numhours $nummins
To dump to a file, replace your whole for loop with this:
$files | select-object Name,#{name="Pending Since";Expression={$_.CreationTime}}|export-csv -notypeinfo -path c:\output.csv;
This will produce a CSV file with all of your files listed, along with their creation time. Save the formatting for your final delivery/presentation to the user (in this case, you could format the columns in Excel & then save as XLSX).
To send via email, you'll probably want to convert it to HTML.
$filesForEmail = $files | Select-Object Name,@{name="Pending Since";Expression={$_.CreationTime}} | ConvertTo-Html | Out-String;
Send-MailMessage -To RECIPIENT -From FROM -Subject "Your file listing" -Body $filesForEmail -BodyAsHtml -SmtpServer smtp.yourcompany.com
You can format the CreationTime value by running it through Get-Date with the -Format parameter and the standard DateTime formatting options (http://msdn.microsoft.com/en-us/library/system.globalization.datetimeformatinfo(VS.85).aspx). Try this line:
write-host ("File Name: " + $file.Name, ", Pending Since : " + $(get-date $file.CreationTime -format "dddd, MMMM d, yyyy h:mm tt")) -Fore Red
The main bit here is replacing $file.CreationTime with $(get-date $file.CreationTime -format "dddd, MMMM d, yyyy h:mm tt"). That is a fairly standard date/time format with 12-hour formatting. You could get more detailed if you wanted, define the format earlier, and only put out relevant info (for example, if days = 0, exclude that from the date format).
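Putting the two pieces together, a short sketch (same $files variable, made-up output path) that writes the 12-hour formatted creation time straight into the CSV export:
# Sketch: export the pending files with a 12-hour formatted creation time.
$files | Select-Object Name, @{Name="Pending Since"; Expression={ Get-Date $_.CreationTime -Format "dddd, MMMM d, yyyy h:mm tt" }} |
    Export-Csv -NoTypeInformation -Path C:\output.csv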
I am very new to PowerShell, and I was hoping I could get some help creating a script that tells me the modified date of a file.
I wish I knew more about PowerShell, as I feel like I am asking a lot (all my free-time this week will be dedicated to learning PowerShell better). Pointing me in the direction of where to learn how to do this would be very helpful as well.
Here is the complete rundown. I need to run a report daily that checks a list of computers at 90 different stores to make sure a certain backup was performed. The modified date should tell whether the backup ran; it should be set to the previous day's date.
If the modified date is yesterday, then there does not need to be an output. If it is not yesterday, I would like to have the output in the PowerShell window, or to a text file, whichever would be easier.
I also have to check that a folder is no older than seven days for each of the 90 stores, with the same criteria for the output. The idea that I have would be like this for each store
For Store 1:
Check file date for \\server\store\computer\c:\folder\"newest modified date in folder"
if date equals yesterday
then do nothing
if date does not equal yesterday
then output "Test did not backup"
check folder modified date for \\server\sample\store\backupfolder
if date is less than 7 days old
then do nothing
if date is more than 7 days old
then output "test did not backup"
Sorry for not providing my research effort; I am very new to PowerShell and I was on a deadline to get this done. I have, since yesterday, learned how to do everything that I needed to do with this script. Thanks to @Keith for setting me on the correct path. I ended up using the following code to accomplish my goal of only outputting the location where the result was false.
$a = Get-ChildItem \\server\XXX\Received_Orders\*.* | Where {$_.LastWriteTime -ge (Get-Date).AddDays(-7)}
if (-not $a)
{
    'STORE XXX HAS NOT RECEIVED ANY ORDERS IN THE PAST 7 DAYS'
}
$b = Get-ChildItem '\\COMP NAME\Folder\*.*' | Where {$_.LastWriteTime -ge (Get-Date).AddDays(-1)}
if (-not $b)
{
    'STORE XXX DID NOT RUN ITS BACKUP LAST NIGHT'
}
If you run the Get-Item or Get-ChildItem commands these will output System.IO.FileInfo and System.IO.DirectoryInfo objects that contain this information e.g.:
Get-Item c:\folder | Format-List
Or you can access the property directly like so:
Get-Item c:\folder | Foreach {$_.LastWriteTime}
To start to filter folders & files based on last write time you can do this:
Get-ChildItem c:\folder | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-7)}
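Applied to the pseudocode in the question, a minimal sketch (the UNC paths are placeholders for the real store paths) that only produces output when a store looks out of date:
# Sketch: newest backup file older than a day, or backup folder untouched for 7 days.
$newestFile = Get-ChildItem '\\server\store\computer\folder' -File |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1
if (-not $newestFile -or $newestFile.LastWriteTime -lt (Get-Date).AddDays(-1)) {
    "Store 1 did not back up last night"
}
$backupFolder = Get-Item '\\server\sample\store\backupfolder'
if ($backupFolder.LastWriteTime -lt (Get-Date).AddDays(-7)) {
    "Store 1 backup folder has not changed in 7 days"
}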
To get the modified date on a single file try:
$lastModifiedDate = (Get-Item "C:\foo.tmp").LastWriteTime
To compare with another:
$dateA= $lastModifiedDate
$dateB= (Get-Item "C:\other.tmp").LastWriteTime
if ($dateA -ge $dateB) {
Write-Host("C:\foo.tmp was modified at the same time or after C:\other.tmp")
} else {
Write-Host("C:\foo.tmp was modified before C:\other.tmp")
}
Here's what worked for me:
$a = Get-ChildItem \\server\XXX\Received_Orders\*.* | Where{$_.LastWriteTime -ge (Get-Date).AddDays(-7)}
if ($a = (Get-ChildItem \\server\XXX\Received_Orders\*.* | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-7)}))
# I'm using the -gt switch instead of -ge
{}
Else
{
'STORE XXX HAS NOT RECEIVED ANY ORDERS IN THE PAST 7 DAYS'
}
$b = Get-ChildItem '\\COMP NAME\Folder\*.*' | Where{$_.LastWriteTime -ge (Get-Date).AddDays(-1)}
if ($b = (Get-ChildItem '\\COMP NAME\TFolder\*.*' | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-1)}))
{}
Else
{
'STORE XXX DID NOT RUN ITS BACKUP LAST NIGHT'
}
You can try dirTimesJS.bat and fileTimesJS.bat
example:
C:\>dirTimesJS.bat %windir%
directory timestamps for C:\Windows :
Modified : 2020-11-22 22:12:55
Modified - milliseconds passed : 1604607175000
Modified day of the week : 4
Created : 2019-12-11 11:03:44
Created - milliseconds passed : 1575709424000
Created day of the week : 6
Accessed : 2020-11-16 16:39:22
Accessed - milliseconds passed : 1605019162000
Accessed day of the week : 2
C:\>fileTimesJS.bat %windir%\notepad.exe
file timestamps for C:\Windows\notepad.exe :
Modified : 2020-09-08 08:33:31
Modified - milliseconds passed : 1599629611000
Modified day of the week : 3
Created : 2020-09-08 08:33:31
Created - milliseconds passed : 1599629611000
Created day of the week : 3
Accessed : 2020-11-23 23:59:22
Accessed - milliseconds passed : 1604613562000
Accessed day of the week : 4
PowerShell code to find all document library files modified in the last 2 days.
$web = Get-SPWeb -Identity http://siteName:9090/
$list = $web.GetList("http://siteName:9090/Style Library/")
$folderquery = New-Object Microsoft.SharePoint.SPQuery
$foldercamlQuery =
'<Where> <Eq>
<FieldRef Name="ContentType" /> <Value Type="text">Folder</Value>
</Eq> </Where>'
$folderquery.Query = $foldercamlQuery
$folders = $list.GetItems($folderquery)
foreach($folderItem in $folders)
{
$folder = $folderItem.Folder
if($folder.ItemCount -gt 0){
Write-Host " find Item count " $folder.ItemCount
$oldest = $null
$files = $folder.Files
$date = (Get-Date).AddDays(-2).ToString("MM/dd/yyyy")
foreach ($file in $files){
if($file.Item["Modified"] -ge $date)
{
Write-Host "Last 2 days modified folder name:" $folder " File Name: " $file.Item["Name"] " Date modified: " $file.Item["Modified"]
}
}
}
else
{
Write-Warning "$($folder.Name) is empty"
}
}