Checking to see if this logic makes any sense or if I'm missing something.
Each IIS site has its own site ID, and each site has its own IIS log folder named after that ID. So if your site ID is 2385, your log folder is W3SVC2385.
You can get the site ID for each site from the command line by running '%windir%\system32\inetsrv\appcmd list site', which returns lines of the form SITE "Site Name" (id:####,bindings:...,state:...).
By pulling that data into a file, one should be able to isolate both the site name and the ID, then build a variable pointing at each IIS log folder. Using that, you can pull in file information from the IIS log folders. Shouldn't the created/modified date of the newest log file in each folder tell us what day that site was last accessed? Or am I missing something?
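As a rough sketch of that idea (assuming the default IIS log root, C:\inetpub\logs\LogFiles; substitute your own log path if it differs), the newest log file's LastWriteTime per site folder can be pulled like this:
# Rough sketch: report the newest log file's LastWriteTime for each site.
# Assumes the default IIS log root; adjust $logRoot to your environment.
Import-Module WebAdministration
$logRoot = 'C:\inetpub\logs\LogFiles'
foreach ($site in Get-Website) {
    $folder = Join-Path $logRoot ("W3SVC" + $site.id)
    $newest = Get-ChildItem $folder -Filter *.log -ErrorAction SilentlyContinue |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1
    $last = if ($newest) { $newest.LastWriteTime } else { 'no logs found' }
    [pscustomobject]@{ Site = $site.name; LastActivity = $last }
}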
So here is the code I came up with. This only works if your site name is also the URL for the site (which is the case for us here); it inserts the website name into the first part of the URL stem.
Because this is for a SharePoint farm, I also filtered for default.aspx and home.aspx in the LogParser command.
Using Get-Website, I get both the site name and ID. I add the name to the URL stem, use the ID to get into the appropriate log folder, and then create a CSV file named after the site.
Hopefully this can be helpful for others!
$Sites = Get-Website
foreach ($site in $Sites)
{
    $id = $site.id
    # Strip spaces from the site name so it is safe to use in a file name
    $name = $site.name -replace '\s', ''
    $filename = $name + '.csv'
    $logfolder = 'D:\LOGS\IISLOGS\W3SVC' + $id
    $logpath = $logfolder + '\*.*'
    # Ask LogParser for the most recent hit on each default/home page
    & "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe" -i:iisw3c "SELECT DISTINCT TO_LOWERCASE(cs-uri-stem), MAX(date) INTO $filename FROM $logpath WHERE cs-uri-stem LIKE '%Default.asp%' OR cs-uri-stem LIKE '%home.asp%' GROUP BY cs-uri-stem" -o:CSV
    # Prefix each row with the site name so the URL stem becomes a full URL
    $Content = Get-Content $filename
    $NewContent = @()
    foreach ($line in $Content)
    {
        $NewContent += $name + $line
    }
    # Replace the header row LogParser wrote with our own
    $NewContent[0] = "Site URL,DateLastAccessed"
    $NewContent | Set-Content $filename
}
I notice the SharePoint tag is on this question. Since this is in the context of SharePoint, you should keep in mind that SharePoint Analytics:
https://technet.microsoft.com/en-us/library/jj219554.aspx
is tracking the use of each of your SharePoint web applications. With that in mind, you can access Analytics programmatically:
https://radutut.wordpress.com/2013/01/27/how-to-get-search-analytics-reports-programmatically-in-sharepoint-2013/
to get recent usage data.
I can't seem to find any documentation or information on how to transform a key that exists in multiple files. The File Transform task seems to only support transforming unique keys. The Windows web app I have set up is an OrchardCore CMS with 3 tenants; each tenant has its own appSettings.json file, and each of those files contains a ConnectionString.
I initially thought there would be some way to connect a File Transform task to a specific variable, in which case this would be easy, but it doesn't look like this is possible. In addition, due to certain project requirements, we can't use any Marketplace extensions like MagicChunks.
Any help would be immensely appreciated; this has been driving me nuts.
You could try installing the free third-party extension XDT Transform, which adds an XDT transform task you can use in your pipeline.
OK, I've found a temporary workaround for this, but it's dirty; it needs to be modified so that it updates JSON properties instead of replacing string values (a sketch of that follows the script). I also don't like that this approach modifies the artifact directly. The below is a PowerShell task with an inline script, and it uses pipeline variables. Hope this is helpful to someone.
# cd to the agent artifacts directory (where the zip file exists)
cd $env:Agent_ReleaseDirectory
[Reflection.Assembly]::LoadWithPartialName("System.IO.Compression.FileSystem");
# Open the zip and find the particular file (assumes there is only one zip)
$zipfileName = Get-ChildItem "$(System.DefaultWorkingDirectory)" -Depth 4 -Filter '*.zip'
$zip = [System.IO.Compression.ZipFile]::Open($zipfileName.FullName,"Update")
$defaultAppSettings = $zip.Entries | Where-Object { $_.FullName -eq "App_Data/Sites/Default/appsettings.json" }
$secondaryAppSettings = $zip.Entries | Where-Object { $_.FullName -eq "App_Data/Sites/Secondary/appsettings.json" }
Write-Host "Update Default App Settings"
# Update Default Settings
$defaultAppSettingsFile = [System.IO.StreamReader]($defaultAppSettings).Open()
$defaultAppSettingsText = $defaultAppSettingsFile.ReadToEnd()
$defaultAppSettingsFile.Close()
$defaultAppSettingsFile.Dispose()
$defaultAppSettingsText = $defaultAppSettingsText -replace "Server=###.###.###.###;Initial Catalog=############;MultipleActiveResultSets=true;User ID=######;Password=#######;ConnectRetryCount=0","$(Default.ConnectionString)"
$defaultAppSettingsText = $defaultAppSettingsText -replace "#########","$(Default.AppSettings.ApiSetting.ApiKey)"
$defaultAppSettingsText = $defaultAppSettingsText -replace "#########","$(Default.AppSettings.ApiSetting.ApiBaseUrl)"
#update file with new content
$defaultAppSettingsFile = [System.IO.StreamWriter]($defaultAppSettings).Open()
$defaultAppSettingsFile.BaseStream.SetLength(0)
# Insert the $text to the file and close
$defaultAppSettingsFile.Write($defaultAppSettingsText)
$defaultAppSettingsFile.Flush()
$defaultAppSettingsFile.Close()
Write-Host "Default App Settings Updated"
Write-Host "Update Secondary App Settings"
# Update Secondary Settings
$secondaryAppSettingsFile = [System.IO.StreamReader]($secondaryAppSettings).Open()
$secondaryAppSettingsText = $secondaryAppSettingsFile.ReadToEnd()
$secondaryAppSettingsFile.Close()
$secondaryAppSettingsFile.Dispose()
$secondaryAppSettingsText = $secondaryAppSettingsText -replace "Server=###.###.###.###;Initial Catalog=############;MultipleActiveResultSets=true;User ID=######;Password=#######;ConnectRetryCount=0","$(Secondary.ConnectionString)"
$secondaryAppSettingsText = $secondaryAppSettingsText -replace "#########","$(Secondary.AppSettings.ApiSetting.ApiKey)"
$secondaryAppSettingsText = $secondaryAppSettingsText -replace "#########","$(Secondary.AppSettings.ApiSetting.ApiBaseUrl)"
#update file with new content
$secondaryAppSettingsFile = [System.IO.StreamWriter]($secondaryAppSettings).Open()
$secondaryAppSettingsFile.BaseStream.SetLength(0)
# Insert the $text to the file and close
$secondaryAppSettingsFile.Write($secondaryAppSettingsText)
$secondaryAppSettingsFile.Flush()
$secondaryAppSettingsFile.Close()
Write-Host "Secondary App Settings Updated"
# Write the changes and close the zip file
$zip.Dispose()
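As a cleaner alternative to the string replaces above, here is a minimal sketch of updating the JSON properties themselves. The property names below (ConnectionString, AppSettings.ApiSetting.*) are assumed from the question; adjust them to the actual shape of your appsettings.json:
# Hypothetical sketch: parse the JSON and set properties directly instead of
# pattern-matching on the redacted string values above.
$settings = $defaultAppSettingsText | ConvertFrom-Json
$settings.ConnectionString = "$(Default.ConnectionString)"   # assumed property name
$settings.AppSettings.ApiSetting.ApiKey = "$(Default.AppSettings.ApiSetting.ApiKey)"
# -Depth guards against ConvertTo-Json truncating nested objects
$defaultAppSettingsText = $settings | ConvertTo-Json -Depth 10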
I've built a tiny script to create folders from information that is in a .csv, but the folder naming convention could be different every time.
So I was wondering if there is a way I could have the variables pre-saved and just call them when I come to create the folders, i.e. I could have 732881-English or English732881 depending on what I write at the Read-Host prompt. The only problem is that it won't print in the folder name; I've tried using ${variable) and ${variable} and nothing appears to work.
I've attached code similar to what I am using:
$students = Import-Csv "\\Desktop\Powershell Testing\staffdata.csv"
foreach ($object in $students) {
    $id      = $object.'Roll Number'
    $subject = $object.'Subject'
    $foldername = Read-Host "Enter Folder Name"
    New-Item -Path "\\Desktop\Powershell Testing\ExamHomes\$foldername" -ItemType Directory
}
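One way to get there, sketched under the assumption that the Read-Host answer should control how the pre-saved values are combined: ask for a naming pattern once, with {0} standing for the roll number and {1} for the subject, and feed it to the -f format operator inside the loop:
# Hypothetical sketch: "{0}-{1}" yields 732881-English, "{1}{0}" yields English732881
$students = Import-Csv "\\Desktop\Powershell Testing\staffdata.csv"
$pattern  = Read-Host "Enter folder name pattern ({0} = roll number, {1} = subject)"
foreach ($object in $students) {
    $foldername = $pattern -f $object.'Roll Number', $object.'Subject'
    New-Item -Path "\\Desktop\Powershell Testing\ExamHomes\$foldername" -ItemType Directory
}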
I am attempting to access a DLNA server that is set up, using the directory structure created by clicking through it from My Computer, as in the path below.
Now I want to read the file names in the folder and write them out. This is the syntax I have tried:
Get-ChildItem "This PC\Serviio (AMDDesktop)\Video\Folders\TV Shows\Battlebots\Season 01"
Foreach-Object {
$content = Get-Content $_.FullName
Write-Host $content
}
But it produces an error that says the path does not exist.
What would be the proper way to iterate these files? Or better yet, maybe the proper way to word the question is: how do I get the address of these files so that I can iterate over them?
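One possible direction, sketched here rather than verified against Serviio specifically: "This PC\..." is a Windows Shell namespace path, not a filesystem path, which is why Get-ChildItem reports that it does not exist. The Shell.Application COM object can walk that namespace instead (the folder names below are taken from the question and assumed to match what Explorer shows):
# Hypothetical sketch: walk the shell namespace from "This PC" down to the folder.
$shell  = New-Object -ComObject Shell.Application
$folder = $shell.NameSpace(17)   # 17 = ssfDRIVES, i.e. "This PC"
foreach ($name in 'Serviio (AMDDesktop)','Video','Folders','TV Shows','Battlebots','Season 01') {
    $item = $folder.Items() | Where-Object { $_.Name -eq $name }
    if (-not $item) { throw "Could not find '$name' in the shell namespace" }
    $folder = $item.GetFolder
}
# List the item names in the final folder
$folder.Items() | ForEach-Object { Write-Host $_.Name }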
I am attempting to create a script for use when we perform manual data transfers at work; these can be tedious to perform when users have a ton of random data in random locations. I want to move those items from the old location on the old drive to our network location and then pull them back down. What I have below is a beta version of what I am looking to do. My issue is that I am unable to figure out how to find the currently logged-in user and exclude certain accounts.
$DOCDIR = [Environment]::GetFolderPath("MyDocuments")
$TARGETDIR = 'C:\TextFiles'
if(!(Test-Path -Path $TARGETDIR )){
New-Item -ItemType directory -Path $TARGETDIR
}
$Include = @("*.*")
$Path = @("C:\Users\%USERNAME%\Documents","C:\Users\%USERNAME%\Pictures")
Get-ChildItem -Path $Path -Include $Include -Recurse | Move-Item -Destination C:\TextFiles
Mind you, more will be added to this, but I am unsure how to get the current user and have the script exclude our administrator account on the units.
Thank you for any help.
You can use the environment variables named USERDOMAIN and USERNAME to determine the currently logged-on user.
if ($env:UserName -eq 'Trevor.Sullivan') {
# Do something
}
To take it one step further, you could build an array of the user accounts that you want to exclude, and then check to see if the currently logged on user account is contained in that array. Here is an example:
# Build the list of excluded users
$ExcludedUserList = @(
    'User1',
    'User2',
    'User3',
    'User4'
);
# Check if user is contained in exclusion list
if ('User5' -notin $ExcludedUserList) {
# Do something here
}
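Putting the two pieces together for the scenario in the question, here is a minimal sketch. The excluded account names are placeholders, and $env:USERNAME replaces the %USERNAME% placeholder from the question, which PowerShell does not expand:
# Hypothetical sketch: move the current user's Documents and Pictures,
# unless that user is one of the excluded (e.g. administrator) accounts.
$ExcludedUserList = @('Administrator', 'LocalAdmin')   # assumed account names
$TargetDir = 'C:\TextFiles'
if ($env:USERNAME -notin $ExcludedUserList) {
    $Path = @(
        "C:\Users\$env:USERNAME\Documents",
        "C:\Users\$env:USERNAME\Pictures"
    )
    Get-ChildItem -Path $Path -Recurse | Move-Item -Destination $TargetDir
}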
I have the following script (adapted from here) for uploading files via FTP for a website.
$files = @(dir -Path $path)
foreach ($file in $files) {
    if ($file.GetType().FullName -eq 'System.IO.FileInfo') {
        "uploading $file"
        $uri = New-Object System.Uri($ftp + $file.Name)
        $webclient.UploadFile($uri, $file.FullName)
    } elseif ($file.GetType().FullName -eq 'System.IO.DirectoryInfo') {
        Recurse $file.FullName
    }
}
This works fine if all files go to the root of the directory. The problem I am having is that there are subdirectories for the site under the root; this places (as expected) all files at the root regardless of where they exist in the actual directory structure.
Is there a simple way to transfer all of the files while maintaining the directory structure of the source? I'm sure I could put something together using Split-Path, but I just wanted to make sure that I wasn't overlooking something before I went any further.
Thanks.
Per request converted from the comments:
geekswithblogs.net has a solution for recursive FTP copy.
Perhaps the Microsoft documentation can help here:
The URI may be relative or absolute. If the URI is of the form "ftp://contoso.com/%2fpath" (%2f is an escaped '/'), then the URI is absolute, and the current directory is /path. If, however, the URI is of the form "ftp://contoso.com/path", first the .NET Framework logs into the FTP server (using the user name and password set by the Credentials property), then the current directory is set to /path.
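Building on that, here is a minimal sketch of the Split-Path idea from the question: walk the source tree, mirror each subdirectory on the server with an FtpWebRequest MakeDirectory call, then upload each file under its relative path. $localRoot, $ftp, and the credentials are placeholders for your own values:
# Hypothetical sketch: recursive upload that preserves the directory structure.
$localRoot = 'C:\site'                      # assumed local source root
$ftp       = 'ftp://contoso.com/'           # assumed server root (note the %2f semantics above)
$cred      = New-Object System.Net.NetworkCredential('user', 'password')
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = $cred
# Recreate each subdirectory on the server first
Get-ChildItem -Path $localRoot -Recurse -Directory | ForEach-Object {
    $relative = $_.FullName.Substring($localRoot.Length).TrimStart('\') -replace '\\', '/'
    $request = [System.Net.WebRequest]::Create($ftp + $relative)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
    $request.Credentials = $cred
    try { $request.GetResponse().Close() } catch { }   # ignore "directory already exists"
}
# Then upload every file to its relative location
Get-ChildItem -Path $localRoot -Recurse -File | ForEach-Object {
    $relative = $_.FullName.Substring($localRoot.Length).TrimStart('\') -replace '\\', '/'
    $webclient.UploadFile($ftp + $relative, $_.FullName)
}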