I have a script that downloads files from a report server and puts them on a local network share. The script does what it needs to, but the download folder ends up looking like this: hitsqlp -> Extracts -> output -> web16p... That is the path where the folder needs to live, but the script is replicating that path as nested subfolders, so now I have to click through every subfolder to get to the files.
I want the folder 'SSRSFolder' to be a subfolder of \epicsqlt\Extracts\Output\HIT\web16p
Code below, I'm not sure where I went wrong:
set-location -path \\epicsqlt\Extracts\Output\HIT\web16p
$downloadFolder = "\\epicsqlt\Extracts\Output\HIT\web16p"
$ssrsServer = "blahblahblah"
$secpasswd = ConvertTo-SecureString "password" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("username", $secpasswd)
$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)" -Credential $mycreds
$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)" -UseDefaultCredential
$ssrsItems = $ssrsProxy.ListChildren("/", $true) | Where-Object {$_.TypeName -eq "DataSource" -or $_.TypeName -eq "Report"}
Foreach($ssrsItem in $ssrsItems)
{
    # Determine extension for Reports and DataSources
    if ($ssrsItem.TypeName -eq "Report")
    {
        $extension = ".rdl"
    }
    else
    {
        $extension = ".rds"
    }
    Write-Host "Downloading $($ssrsItem.Path)$($extension)";
    $downloadFolderSub = $downloadFolder.Trim('\') + $ssrsItem.Path.Replace($ssrsItem.Name,"").Replace("/","\").Trim()
    New-Item -ItemType Directory -Path $downloadFolderSub -Force > $null
    $ssrsFile = New-Object System.Xml.XmlDocument
    [byte[]] $ssrsDefinition = $null
    $ssrsDefinition = $ssrsProxy.GetItemDefinition($ssrsItem.Path)
    [System.IO.MemoryStream] $memoryStream = New-Object System.IO.MemoryStream(@(,$ssrsDefinition))
    $ssrsFile.Load($memoryStream)
    $fullDataSourceFileName = $downloadFolderSub + "\" + $ssrsItem.Name + $extension;
    $ssrsFile.Save($fullDataSourceFileName);
}
If I'm reading this right, you are starting the script with
Set-Location -Path \\epicsqlt\Extracts\Output\HIT\web16p
then you are setting the $downloadFolder variable to that same path and including $downloadFolder in your $downloadFolderSub creation. Because $downloadFolder.Trim('\') strips the leading backslashes from the UNC path, the result is the relative path
epicsqlt\Extracts\Output\HIT\web16p\somepath\somefolder\
and you then run New-Item with that whole path while you are already working in the \web16p\ folder, so the share's path gets replicated as subfolders beneath it.
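One way to avoid that is to build the target folder with Join-Path instead of trimming backslashes off the UNC path. A minimal sketch, assuming the same $downloadFolder and $ssrsItem variables as in the question and the single 'SSRSFolder' root you want:

# Build the item's folder under one 'SSRSFolder' root instead of
# trimming '\' off the UNC path (which is what made it relative)
$relative = ('SSRSFolder' + $ssrsItem.Path.Replace($ssrsItem.Name, '')).TrimEnd('/').Replace('/', '\')
$downloadFolderSub = Join-Path -Path $downloadFolder -ChildPath $relative
New-Item -ItemType Directory -Path $downloadFolderSub -Force > $null

This keeps the report server's folder structure, but rooted exactly once at \\epicsqlt\Extracts\Output\HIT\web16p\SSRSFolder.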
Related
I am running a script in the ISE that essentially downloads a file from a public site:
#This PowerShell code scrapes the site and downloads the latest published file.
Param(
    $Url = 'https://randomwebsite.com',
    $DownloadPath = "C:\Downloads",
    $LocalPath = 'C:\Temp',
    $RootSite = 'https://publicsite.com',
    $FileExtension = '.gz'
)
#Define the session cookie used by the site and automate acceptance.
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$cookie = New-Object System.Net.Cookie
$cookie.Name = "name"
$cookie.Value = "True"
$cookie.Domain = "www.public.com"
$session.Cookies.Add($cookie);
$FileNameDate = Get-Date -Format yyyyMMdd
$DownloadFileName = $DownloadPath + $FileNameDate + $FileExtension
$DownloadFileName
TRY{
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    $WebSite = Invoke-WebRequest $Url -WebSession $session -UseBasicParsing #this gets the links we need from the main site.
    $Table = $WebSite.Links | Where-Object {$_.href -like "*FetchDocument*"} | fl href #filter the results that we need.
    #Write-Output $Table
    $FilterTable=($Table | Select-Object -Unique | sort href -Descending) | Out-String
    $TrimString = $FilterTable.Trim()
    $FinalString = $RootSite + $TrimString.Trim("href :")
    #Write-Verbose $FinalString | Out-String
    #Start-Process powershell.exe -verb RunAs -ArgumentList "-File C:\some\path\base_server_settings.ps1" -Wait
    Invoke-WebRequest $FinalString -OutFile $DownloadFileName -TimeoutSec 600
    $ExpectedFileName = Get-ChildItem | Sort-Object LastAccessTime -Descending | Select-Object -First 1 $DownloadPath.Name | SELECT Name
    $ExpectedFileName
    Write-Host 'The latest DLA file has been downloaded and saved here:' $DownloadFileName -ForegroundColor Green
}
CATCH{
    [System.Net.WebException],[System.IO.IOException]
    Write "An error occured while downloading the latest file."
    Write $_.Exception.Message
}
The expectation is that it downloads a file into the Downloads folder, and it does in fact download the file when run in the ISE.
When I try to run it as a command, however (PowerShell.exe -File "/path/script.ps1"), I get an error stating:
An error occurred while downloading the latest file. Operation is not valid due to the current state of the object.
out-lineoutput : The object of type "Microsoft.PowerShell.Commands.Internal.Format.GroupEndData" is not valid or not in the correct sequence. This is likely caused by a user-specified "format-*" command which is conflicting with the default formatting.
At \path\to\file\AutomatedFileDownload.ps1:29 char:9
+ $FilterTable=($Table | Select-Object -Unique | sort href -Des ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [out-lineoutput], InvalidOperationException
+ FullyQualifiedErrorId : ConsoleLineOutputOutOfSequencePacket,Microsoft.PowerShell.Commands.OutLineOutputCommand
I found several articles describing the MTA and STA switches, and I have tried adding -MTA or -STA to the command, but it still gives me the same error.
As commented, you are trying to get one link from the website, but you pipe your commands to things like Format-List and Out-String, rendering the result as either nothing at all or a single multiline string. In both cases, this won't get you what you are after, and as the error message says, a user-specified Format-* command is exactly what conflicts with the default formatting when the script runs non-interactively.
Not knowing the actual values of the links, of course, I suggest you try this:
Param(
    $Url = 'https://randomwebsite.com',
    $DownloadPath = "C:\Downloads",
    $LocalPath = 'C:\Temp',
    $RootSite = 'https://publicsite.com',
    $FileExtension = '.gz'
)
# test if the download path exists and if not, create it
if (!(Test-Path -Path $DownloadPath -PathType Container)) {
    $null = New-Item -Path $DownloadPath -ItemType Directory
}
#Define the session cookie used by the site and automate acceptance.
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$cookie = New-Object System.Net.Cookie
$cookie.Name = "name"
$cookie.Value = "True"
$cookie.Domain = "www.public.com"
$session.Cookies.Add($cookie);
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
try {
    $WebSite = Invoke-WebRequest -Uri $Url -WebSession $session -UseBasicParsing -ErrorAction Stop # this gets the links we need from the main site
    # get the file link
    $lastLink = ($WebSite.Links | Where-Object {$_.href -like "*FetchDocument*"} | Sort-Object href -Descending | Select-Object -First 1).href
    # create the file URL
    $fileUrl = "$RootSite/$lastLink"
    # create the full path and filename for the downloaded file
    $DownloadFileName = Join-Path -Path $DownloadPath -ChildPath ('{0:yyyyMMdd}{1}' -f (Get-Date), $FileExtension)
    Write-Verbose "Downloading $fileUrl as '$DownloadFileName'"
    Invoke-WebRequest -Uri $fileUrl -OutFile $DownloadFileName -TimeoutSec 600 -ErrorAction Stop
    # test if the file is downloaded
    if (Test-Path -Path $DownloadFileName -PathType Leaf) {
        Write-Host "The latest DLA file has been downloaded and saved here: $DownloadFileName" -ForegroundColor Green
    }
    else {
        Write-Warning "File '$DownloadFileName' has NOT been downloaded"
    }
}
catch [System.Net.WebException],[System.IO.IOException] {
    Write-Host "An error occurred while downloading the latest file.`r`n$($_.Exception.Message)" -ForegroundColor Red
}
catch {
    Write-Host "An unknown error occurred while downloading the latest file.`r`n$($_.Exception.Message)" -ForegroundColor Red
}
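With the Format-List and Out-String calls gone, nothing in the pipeline depends on console formatting anymore, so the same script should also run cleanly via PowerShell.exe -File "\path\to\file\AutomatedFileDownload.ps1".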
I am using the following code to get the files, and it works for the top-level directory, but I would like to traverse through subdirectories too; when I put -recurse on it, it stops working.
Import-Module -Name "C:\Users\Administrator\Documents\WindowsPowerShell\Modules\Posh-SSH" -Verbose
$passwordTest = "Password"
$securePasswordTest = ConvertTo-SecureString $passwordTest -AsPlainText -Force
$credentialsTest = New-Object System.Management.Automation.PSCredential ("USername", $securePasswordTest)
$sessionTest = New-SFTPSession -ComputerName SFTP -Credential $credentialsTest -AcceptKey
$sourceTest = "/u01/G"
$destinationTest= "F:\SourceOLTP\"
Get-SFTPChildItem $sessionTest -Path $sourceTest | ForEach-Object {
    if ($_.FullName -like '*.csv')
    {
        Get-SFTPFile $sessionTest -RemoteFile $_.FullName -LocalPath $destinationTest -Overwrite
    }
    Write-Output $_.FullName
}
Remove-SFTPSession $sessionTest -Verbose
The recurse switch on Get-SFTPChildItem appears to be -Recursive.
Source: https://github.com/darkoperator/Posh-SSH/blob/master/docs/Get-SFTPChildItem.md
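For example, the loop from the question with that switch applied (a sketch, using the same session and path variables):

# -Recursive makes Get-SFTPChildItem walk all subdirectories as well
Get-SFTPChildItem $sessionTest -Path $sourceTest -Recursive | ForEach-Object {
    if ($_.FullName -like '*.csv')
    {
        Get-SFTPFile $sessionTest -RemoteFile $_.FullName -LocalPath $destinationTest -Overwrite
    }
    Write-Output $_.FullName
}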
I am following this MSDN guide to publish/upload ASP.NET Web Application files to an Azure Web App (Resource Manager), but I get an UploadFile error whenever a subfolder starts. The root folder goes fine.
Uploading to ftp://XXXXXX.ftp.azurewebsites.windows.net/site/wwwroot/bin/Antlr3.Runtime.dll
From C:\Users\SampleWebApp\bin\Antlr3.Runtime.dll
Exception calling "UploadFile" with "2" argument(s):
The remote server returned an error: (550) File unavailable (e.g., file not found, no access)
Param(
    [string] [Parameter(Mandatory=$true)] $AppDirectory,
    [string] [Parameter(Mandatory=$true)] $WebAppName,
    [string] [Parameter(Mandatory=$true)] $ResourceGroupName
)
$xml = [Xml](Get-AzureRmWebAppPublishingProfile -Name $webappname `
    -ResourceGroupName $ResourceGroupName `
    -OutputFile null)
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse | Where-Object{!($_.PSIsContainer)}
foreach ($file in $files)
{
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    "Uploading to " + $uri.AbsoluteUri
    "From " + $file.FullName
    $webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
As the issue starts only with the first occurrence of a subdirectory (bin) file, it could be that some other process is using the Antlr DLL. Can you close all active debug sessions and run this script again? Also make sure you don't have any whitespace after forming the relative URI path.
[UPDATE]
It was failing to create the sub-directory, hence the "file not found" error while uploading a file from a subdirectory.
I made a few changes in the for-loop to create the sub-directory on FTP before uploading files from it, and it is working fine.
$appdirectory="<Replace with your app directory>"
$webappname="mywebapp$(Get-Random)"
$location="West Europe"
# Create a resource group.
New-AzureRmResourceGroup -Name myResourceGroup -Location $location
# Create an App Service plan in `Free` tier.
New-AzureRmAppServicePlan -Name $webappname -Location $location `
-ResourceGroupName myResourceGroup -Tier Free
# Create a web app.
New-AzureRmWebApp -Name $webappname -Location $location -AppServicePlan $webappname `
-ResourceGroupName myResourceGroup
# Get publishing profile for the web app
$xml = (Get-AzureRmWebAppPublishingProfile -Name $webappname `
-ResourceGroupName myResourceGroup `
-OutputFile null)
# Not in Original Script
$xml = [xml]$xml
# Extract connection information from publishing profile
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
# Upload files recursively
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse #Removed IsContainer condition
foreach ($file in $files)
{
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    if ($file.PSIsContainer)
    {
        $uri.AbsolutePath + " is a directory"
        $ftprequest = [System.Net.FtpWebRequest]::Create($uri)
        $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
        $ftprequest.UseBinary = $true
        $ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
        $response = $ftprequest.GetResponse()
        $response.StatusDescription
        continue
    }
    "Uploading to " + $uri.AbsoluteUri + " from " + $file.FullName
    $webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
I also blogged in detail about how I troubleshot this issue to get to the fix here.
$adminUPN="xxxxx@Home500.onmicrosoft.com"
$orgName="xxxxxx"
$userCredential = Get-Credential -UserName $adminUPN -Message "Type the password."
Connect-SPOService -Url https://$orgName-admin.sharepoint.com -Credential $userCredential
# Begin the process
$loadInfo1 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
$loadInfo2 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$loadInfo3 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.UserProfiles")
#Add SharePoint PowerShell SnapIn if not already added
$snapin = Get-PSSnapin | Where-Object {$_.Name -eq 'Microsoft.SharePoint.Powershell'}
if ($snapin -eq $null)
{
    Write-Host "Loading SharePoint Powershell Snapin"
    Add-PSSnapin "Microsoft.SharePoint.Powershell" -EA SilentlyContinue
}
CLS
$StartTime = $(get-date -f F)
$timeStamp = Get-Date -format "MM_dd_yy_hh_mm"
#Get Current folder file path
$invocation = (Get-Variable MyInvocation).Value
$currentPath = Split-Path $invocation.MyCommand.Path
$currentPath = $currentPath + "\"
#Config File Path
#$configPath = $currentPath + "Config.xml"
$configPath = "C:\Users\EMXBG\Downloads\Script_AddSiteContent\Script_AddSiteContent\Config.xml"
#fetching details from config.xml
[xml]$configXML = Get-Content $configPath
$inputFileName = [string]$configXML.Config.Constants.InputFileName
$errorFileName = [string]$configXML.Config.Constants.ErrorFileName
$outFilePath = [string]$configXML.Config.Constants.OutputFileName
#Source File path containing list of WebApplications in a farm.
$webApplFilePath = $currentPath + $inputFileName
#Output File path of the exported AD Security Groups with Site collection and Group Name details.
$sitesFilePath = $currentPath + $outFilePath
#File path of the file which will capture all the errors while running the script.
$errorPath = $currentPath + $errorFileName + $timeStamp + ".csv"
# Creating object to write logging into the error and output file
$sitesFile = New-Object System.IO.StreamWriter $sitesFilePath
$errorfile = New-Object System.IO.StreamWriter $errorPath
# Fetching SharePoint WebApplications list from a CSV file
$CSVData = Import-CSV -path $webApplFilePath
$sitesFile.WriteLine("SiteCollectionName"+","+"SiteURL")
$errorfile.WriteLine("SiteURL"+"`t"+"ExceptionLevel"+"`t"+"ExceptionMsg");
addSiteContentLink $CSVData
$sitesFile.Close()
$errorfile.Close()
# Function to add the Site Contents link to sites where it does not exist
function addSiteContentLink($CSVData)
{
    try
    {
        $compareText = "Site contents"
        foreach ($row in $CSVData)
        {
            $webUrl = $row.webUrl
            #$username = $row.username
            #$password = $row.password
            #Get Web Application and credentials
            #$securePass = ConvertTo-SecureString $password -AsPlainText -Force
            #$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
            #$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePass)
            # Get the collection of navigation nodes from the quick launch bar
            #$web = $ctx.Web
            $quickLaunch = $webUrl.Navigation.QuickLaunch
            try
            {
                #Iterate through each item in the Quick Launch menu
                foreach ($quickLaunch in $web)
                {
                    if ($quickLaunch -contains $compareText)
                    {
                        Write-Host "Site Content link Exists!"
                    }
                    else
                    {
                        # Add a new navigation node
                        $navNode = New-Object Microsoft.SharePoint.Client.NavigationNodeCreationInformation
                        $navNode.AsLastNode = $true
                        $navNode.Title = "Site Contents"
                        $navNode.Url = $web.Url + "_layouts/15/viewlsts.aspx"
                        $navNode.IsExternal = $false
                        $ctx.Load($quickLaunchColl.Add($navNode))
                        $ctx.ExecuteQuery()
                    }
                }
            }
            catch
            {
                Write-Host ("Exception at Site Collection Url: " + $currentSite.Url)
                $errorfile.WriteLine($currentSite.Url + "`t" + "`t" + $_.Exception.Message)
            }
        }
        #Export Data to CSV
        $sitesCollection | export-csv $sitesFile -notypeinformation
        $site.Dispose()
    }
    catch
    {
        Write-Host ("Exception at Site Collection Url: " + $currentSite.Url)
        $errorfile.WriteLine($currentSite.Url + "`t" + "SiteCollection" + "`t" + $_.Exception.Message)
    }
}
Below is the error I am getting:
Export-Csv : Cannot bind argument to parameter 'InputObject' because it is null.
At C:\Users\EMXBG\Downloads\Script_AddSiteContent\Script_AddSiteContent\ScriptForSiteContentLinkQuickLaunch - Copy.ps1:126 char:29
+ $sitesCollection | export-csv $sitesFile -notypeinformation
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Export-Csv], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.ExportCsvCommand
This error is probably because $sitesCollection is empty/null. I can't see anything in your code that assigns it a value.
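If the intent is to export one row per processed site, you would need to build that collection inside the loop before piping it to Export-Csv. A minimal sketch, assuming the values should come from the CSV rows (the property names here are placeholders, since the original script never defines $sitesCollection):

# Collect one output object per processed row, then export once at the end
$sitesCollection = foreach ($row in $CSVData) {
    [PSCustomObject]@{
        SiteCollectionName = $row.webUrl  # placeholder; use the real name property if your CSV has one
        SiteURL            = $row.webUrl
    }
}
$sitesCollection | Export-Csv -Path $sitesFilePath -NoTypeInformation

Also note that Export-Csv expects a file path, but the script hands it $sitesFile, which is a StreamWriter object, so even a populated collection would not land in the intended file. Use the $sitesFilePath string instead, and drop the manual WriteLine header, since Export-Csv writes its own.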
It seems like I can upload a file to an SPO site collection URL but not to a subfolder. For example, I can upload to "https://xyz.sharepoint.com/sites/Reporting", but not to "https://xyz.sharepoint.com/sites/Reporting/Sales". Here's the relevant bit of the working code:
$Username = "me@domain.com"
$Password = "Password"
$SiteCollectionUrl = "https://xyz.sharepoint.com/sites/Reporting"
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
$DocLibName = "Sales"
$Folder="C:\MySalesFolder"
Function Get-SPOCredentials([string]$UserName,[string]$Password)
{
    $SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
    return New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
}
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteCollectionUrl)
$Context.Credentials = Get-SPOCredentials -UserName $UserName -Password $Password
#Retrieve list
$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Context.Load($List)
$Context.ExecuteQuery()
#Upload file
Foreach ($File in (dir $Folder -File))
{
    $FileStream = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.ContentStream = $FileStream
    $FileCreationInfo.URL = $File
    $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
    $Context.Load($Upload)
    $Context.ExecuteQuery()
}
I tried to hardcode the "/Sales" subfolder, but no luck. Can anyone point me in the right direction?
According to Trying to upload files to subfolder in Sharepoint Online via Powershell, you will need to amend the FileCreationInfo URL:
$FileCreationInfo.URL = $List.RootFolder.ServerRelativeUrl + "/" + $FolderName + "/" + $File.Name
I believe you just need to prepend $File.Name with the Folder path to your root directory.
Web.GetFolderByServerRelativeUrl Method:
As an alternative to uploading via List.RootFolder, you can use the GetFolderByServerRelativeUrl method to get your target folder:
$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Context.Load($List.RootFolder)
$Context.ExecuteQuery()
$FolderName = "Sales"
$TargetFolder = $Context.Web.GetFolderByServerRelativeUrl($List.RootFolder.ServerRelativeUrl + "/" + $FolderName);
Then for upload you would use
$FileCreationInfo.URL = $File.Name
$UploadFile = $TargetFolder.Files.Add($FileCreationInfo)
$Context.Load($UploadFile)
$Context.ExecuteQuery()
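Putting the two pieces together with the loop from the question, a minimal sketch (assuming the same $Context, $DocLibName, and $Folder values as above):

$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Context.Load($List.RootFolder)
$Context.ExecuteQuery()

# Resolve the target subfolder by its server-relative URL
$FolderName = "Sales"
$TargetFolder = $Context.Web.GetFolderByServerRelativeUrl($List.RootFolder.ServerRelativeUrl + "/" + $FolderName)

Foreach ($File in (dir $Folder -File))
{
    $FileStream = New-Object IO.FileStream($File.FullName, [System.IO.FileMode]::Open)
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.ContentStream = $FileStream
    $FileCreationInfo.URL = $File.Name
    $UploadFile = $TargetFolder.Files.Add($FileCreationInfo)
    $Context.Load($UploadFile)
    $Context.ExecuteQuery()
    $FileStream.Dispose()  # release the local file handle after each upload
}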