Copying many files from one SharePoint to another using PowerShell

I'm trying to copy documents from a list in one SharePoint site to another.
This is my code:
*************************************************
$source= "\\...\s1"
$destination = "\\..\s2"
foreach ($result in $result )
{ copy-item -path $source -dest $destination}
*************************************************
- $result is the list of all documents that I got using web services; its type is System.Array
- $source and $destination are UNC paths that refer to the URLs of the two SharePoint sites
The error is:
"cannot find the path \...\s1System.Xml.XmlElement"
PS: I'm not using a server machine, it's just a client.
Here is my full code:
*****************************************************
{
    param (
        [String]$Value,
        [String]$Field,
        [String]$RowLimit = "0",
        [String]$Operator = "Contains",
        [String]$WebURL = "https://.................../wer",
        [String]$ListName = "Main documents",
        [String]$ViewName,
        [Switch]$Recurse
    )
    $ScriptDirectory = Split-Path $MyInvocation.MyCommand.Definition
    $dllPath = "P:\SamlCookieAuth.dll" -f $ScriptDirectory
    [void][System.Reflection.Assembly]::LoadFrom($dllPath)
    $queryOptionsValue = ''
    if ($Recurse)
    {
        $queryOptionsValue = '<ViewAttributes Scope="RecursiveAll"/>'
    }
    $WSUri = $WebURL + "/_vti_bin/Lists.asmx?wsdl"
    $listsWebServiceReference = New-WebServiceProxy -Uri $WSUri -UseDefaultCredential
    $listsWebServiceReference.Url = $WebURL + "/_vti_bin/lists.asmx"
    [System.Uri]$CookieUri = $WebURL
    $listsWebServiceReference.CookieContainer = [ST.SamlCookieAuth.SamlCookieManager]::GetAuthenticatedCookiesContainer($CookieUri.AbsoluteUri, 0, 0)
    [System.Xml.XmlDocument]$xmlDoc = New-Object -TypeName System.Xml.XmlDocument
    [System.Xml.XmlElement]$queryOptions = $xmlDoc.CreateElement("QueryOptions")
    $queryOptions.InnerXml = $queryOptionsValue
    if ($PSBoundParameters.Keys.Contains("Value"))
    {
        [System.Xml.XmlElement]$query = $xmlDoc.CreateElement("Query")
        $queryValue = "<Where><$Operator><FieldRef Name='$Field'/><Value Type='Text'>$Value</Value></$Operator></Where>"
        $query.InnerXml = $queryValue
        $result = $listsWebServiceReference.GetListItems($ListName, $ViewName, $query, $null, $RowLimit, $queryOptions, $null).data.row
    }
    else
    {
        $result = $listsWebServiceReference.GetListItems($ListName, $ViewName, $null, $null, $RowLimit, $queryOptions, $null).data.row
    }
    $destDirectory = "\\.............\TER\Main Documents"
    foreach ($resul in $result)
    {
        Copy-Item -Path $resul -Destination $destDirectory
    }
}

Perhaps the problem you are having is the result of a conversion issue. You are trying to write content from one SharePoint to another as a document; however, you cannot do so with an XmlElement.
I suggest checking out this post:
Converting system.xml.xmlelement to system.xml.xmldocument with PowerShell
It would be helpful to see the actual content of $result (the list of all documents that you got using web services).
Also, for readability, I would suggest differentiating between the collection and the loop variable, as follows:
foreach($document in $documentList){}
-or- (at minimum)
foreach($result in $results){}
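In this case each row returned by GetListItems is an XmlElement, so Copy-Item has nothing it can resolve as a path. A minimal sketch of one way around that, assuming the rows carry the usual ows_LinkFilename / ows_FileRef attributes that Lists.asmx returns and that your UNC share points at the library root (not tested against your environment):
foreach ($row in $documentList)
{
    # ows_FileRef is usually "<id>;#<server-relative-url>"; ows_LinkFilename is just the file name
    $fileName   = $row.ows_LinkFilename
    $sourceFile = Join-Path -Path $source -ChildPath $fileName
    Copy-Item -Path $sourceFile -Destination $destination
}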

Related

How to export XML of webpart on the page

I want to export a web part's XML, based on its GUID, to a local folder. I am trying to export it from SharePoint 2013 using CSOM with PowerShell. Can anyone suggest how to do this?
function EnsureDirectory($exportFolderPath)
{
    if (-not (Test-Path $exportFolderPath)) { New-Item $exportFolderPath -Type Directory | Out-Null }
}
function ExportAllWebParts()
{
    $WebURL = ""
    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($WebURL)
    $Page = $ctx.Web.GetFileByServerRelativeUrl("")
    $ctx.Load($Page)
    $ctx.ExecuteQuery()
    $wpManager = $Page.GetLimitedWebPartManager([Microsoft.SharePoint.Client.WebParts.PersonalizationScope]::Shared)
    $webparts = $wpManager.WebParts
    $ctx.Load($webparts)
    $ctx.ExecuteQuery()
    if ($webparts.Count -gt 0)
    {
        Write-Host "Looping through all webparts"
        foreach ($webpart in $webparts)
        {
            $exportPath = "" + "\" + $webpart.Title + ".xml"
            $xwTmp = New-Object System.Xml.XmlTextWriter($exportPath, $null)
            $xwTmp.Formatting = 1 # Indent
            $wpManager.ExportWebPart($webpart, $xwTmp)
            $xwTmp.Flush()
            $xwTmp.Close()
            #$webpartXmlWriter = New-Object System.Xml.XmlTextWriter($fileName,$null)
            #$webpartXmlWriter.Formatting = [System.Xml.Formatting]::Indented
            #$wpManager.ExportWebPart($webpart,$webpartXmlWriter)
        }
    }
}
ExportAllWebParts

Split Powershell script into two separate parts

I've got this script that connects to SharePoint Online, indexes all the files and folders, downloads them all in a systematic fashion, and churns out a .csv with the names of files and folders, size, path, etc.
For various reasons I've ended up in a situation where I've got all the data, but the metadata (the aforementioned .csv file) is corrupted.
Unfortunately, re-running the whole script just for that isn't really an option, as that would take around 90 hours.
I've been trying to break the code down in order to remove the "download files" functions and just keep the part that generates the .csv, but so far without luck.
I've found the function that seems to be in charge of it (WriteLog), but I'm struggling to separate it from the rest.
P.S. The code is not mine; I inherited it from a developer I no longer have access to (unfortunately).
Please find the code below:
param(
[Parameter(Mandatory = $true)]
[string]$srcUrl,
[Parameter(Mandatory = $true)]
[string]$username,
[Parameter(Mandatory = $false,HelpMessage = "From Date: (dd/mm/yyyy)")]
[string]$fromDate,
[Parameter(Mandatory = $false,HelpMessage = "To Date: (dd/mm/yyyy)")]
[string]$toDate,
[Parameter(Mandatory = $true)]
[string]$folderPath,
[Parameter(Mandatory = $true)]
[string]$csvPath
) #end param
cls
#Load SharePoint CSOM Assemblies
Add-Type -Path "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.Runtime.dll"
$global:OutFilePath = -join ($csvPath,"\Documents.csv")
$global:OutFilePathError = -join ($csvPath,"\ErrorLog_GetDocuments.csv")
$header = "Title,Type,Parent,Name,Path,FileSize(bytes),Created,Created by,Modified,Modified by,Matterspace title,Matterspace url"
$srcLibrary = "Documents"
$securePassword = Read-Host -Prompt "Enter your password: " -AsSecureString
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials ($username,$securePassword)
$sUrl = [System.Uri]$srcUrl
$domainUrl = -join ("https://",$sUrl.Host)
function WriteLog
{
param(
[Parameter(Mandatory = $true)] $title,$type,$folderName,$name,$path,$fileSize,$created,$createdby,$modifed,$modifiedby,$matterspacetitle,$materspaceUrl
)
$nowTime = Get-Date -Format "dd-MMM-yy,HH:mm:ss"
$folderName = $folderName.Replace(",","|") ### sometime folder / file name has comma so replace it with something
$name = $name.Replace(",","|")
#$path = $path.Replace(",","|")
$title=[System.String]::Concat("""""""$title""""""")
$type=[System.String]::Concat("""""""$type""""""")
$folderName=[System.String]::Concat("""""""$folderName""""""")
$name=[System.String]::Concat("""""""$name""""""")
$path=[System.String]::Concat("""""""$path""""""")
$fileSize=[System.String]::Concat("""""""$fileSize""""""")
$created=[System.String]::Concat("""""""$created""""""")
$createdby=[System.String]::Concat("""""""$createdby""""""")
$modified=[System.String]::Concat("""""""$modified""""""")
$modifiedby=[System.String]::Concat("""""""$modifiedby""""""")
$matterspacetitle=[System.String]::Concat("""""""$matterspacetitle""""""")
$materspaceUrl=[System.String]::Concat("""""""$materspaceUrl""""""")
$lineContent = "$("$title"),$($type),$($folderName),$($name),$($path),$($fileSize),$($created),$($createdby),$($modified),$($modifiedby),$($matterspacetitle),$($materspaceUrl)"
Add-Content -Path $global:OutFilePath -Value "$lineContent"
}
#Function to get all files of a folder
Function Get-FilesFromFolder([Microsoft.SharePoint.Client.Folder]$Folder,$SubWeb,$MTitle)
{
Write-host -f Yellow "Processing Folder:"$Folder.ServerRelativeUrl
$folderItem = $Folder.ListItemAllFields
#$srcContext.Load($f)
$Ctx.Load($folderItem)
$Ctx.ExecuteQuery()
#Get All Files of the Folder
$Ctx.load($Folder.files)
$Ctx.ExecuteQuery()
$authorEmail = $folderItem["Author"].Title
$editorEmail = $folderItem["Editor"].Title
$filepath = $folderItem["FileDirRef"]
if([string]::IsNullOrEmpty($filepath))
{
$filepath=$Folder.ServerRelativeUrl
}
$created = $folderItem["Created"]
$modified = $folderItem["Modified"]
$title = $folderItem["Title"]
if ([string]::IsNullOrEmpty($title))
{
$title = "Not Specified"
}
#$fileSize = $fItem["File_x0020_Size"]
$fileName = $Folder.Name
#list all files in Folder
write-host $Folder.Name
$splitString=$Folder.ServerRelativeUrl -split('/')
$dirUrl="";
write-host $splitString.Length
$parentUrl=""
For($i=3; $i -le $splitString.Length;$i++)
{
if($splitString[$i] -notcontains('.'))
{
Write-Host $i
Write-Host $splitString[$i]
$dirUrl=-join($dirUrl,"\",$splitString[$i])
$parentUrl=-join($parentUrl,"\",$splitString[$i+1])
}
}
$dirPath = -join ($folderPath,$dirUrl)
WriteLog $title "Folder" $parentUrl.TrimEnd('\') $fileName $filepath 0 $created $authorEmail $modified $editorEmail $MTitle $SubWeb
write-host $dirPath
if (-not (Test-Path -Path $dirPath))
{
New-Item -ItemType directory -Path $dirPath
}
ForEach ($File in $Folder.files)
{
try{
$remarkDetail = ""
$replacedUser = ""
$fItem = $File.ListItemAllFields
#$srcContext.Load($f)
$Ctx.Load($fItem)
$Ctx.ExecuteQuery()
$authorEmail = $fItem["Author"].Email
$editorEmail = $fItem["Editor"].Email
$filepath = $fItem["FileDirRef"]
$fileSizeBytes = $fItem["File_x0020_Size"];
$fileSize = ($fileSizeBytes) / 1MB
$fileName = $fItem["FileLeafRef"]
$title = $fItem["Title"]
$filecreated = $fitem["Created"]
$fileModified = $fitem["Modified"]
$FileUrl = $fItem["FileRef"]
$Fname=$File.Name
if ([string]::IsNullOrEmpty($title))
{
$title = "Not Specified"
}
#$title,$type, $folderName,$name,$path,$fileSize,$created,$createdby,$modifed,$modifiedby,$matterspacetitle,$materspaceUrl
$dateToCompare = Get-Date (Get-Date -Date $fileModified -Format 'dd/MM/yyyy')
#Get the File Name or do something
if (($dateToCompare -ge $startDate -and $dateToCompare -le $endDate) -or ($startDate -eq $null -and $endDate -eq $null))
{
$downloadUrl = -join ($dirPath,$File.Name)
$fromfile = -join ($domainUrl,$FileUrl)
Write-Host "Downloading the file from " $fromfile -ForegroundColor Cyan
try{
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials ($username,$securePassword)
$webclient.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED","f")
$webclient.DownloadFile($fromfile,$downloadUrl)
$webclient.Dispose()
}
catch{
$ErrorMessage=$_.Exception.Message
$ErrorMessage = $ErrorMessage -replace "`t|`n|`r",""
$ErrorMessage = $ErrorMessage -replace " ;|; ",";"
$lineContent = "$($Fname),$($fromfile ),$($ErrorMessage)"
Add-Content -Path $global:OutFilePathError -Value "$lineContent"
Write-Host "Skipping the file and recalling the function" -ForegroundColor Blue
}
WriteLog $title "File" $Folder.Name $fileName $FileUrl $fileSize $created $authorEmail $modified $editorEmail $MTitle $SubWeb
Write-host -f Magenta $File.Name
}
else
{
Write-Host "Skipping the matterspace :" $title " as the matterspace was not in the date range" -ForegroundColor Blue
}
}
catch{
$ErrorMessage=$_.Exception.Message
$ErrorMessage = $ErrorMessage -replace "`t|`n|`r",""
$ErrorMessage = $ErrorMessage -replace " ;|; ",";"
$lineContent = "$($Fname),$($fromfile ),$($ErrorMessage)"
Add-Content -Path $global:OutFilePathError -Value "$lineContent"
}
}
#Recursively Call the function to get files of all folders
$Ctx.load($Folder.Folders)
$Ctx.ExecuteQuery()
#Exclude "Forms" system folder and iterate through each folder
ForEach($SubFolder in $Folder.Folders | Where {$_.Name -ne "Forms"})
{
Get-FilesFromFolder -Folder $SubFolder -SubWeb $SubWeb -Mtitle $MTitle
}
}
Function Get-SPODocLibraryFiles()
{
param
(
[Parameter(Mandatory=$true)] [string] $SiteURL,
[Parameter(Mandatory=$true)] [string] $LibraryName
)
#Setup the context
$Ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$Ctx.Credentials = $credentials
$srcWeb = $Ctx.Web
$childWebs = $srcWeb.Webs
$Ctx.Load($childWebs)
$Ctx.ExecuteQuery()
foreach ($childweb in $childWebs)
{
try
{
#Get the Library and Its Root Folder
$Library=$childweb.Lists.GetByTitle($LibraryName)
$Ctx.Load($Library)
$Ctx.Load($Library.RootFolder)
$Ctx.ExecuteQuery()
#Call the function to get Files of the Root Folder
if($childweb.Url.ToLower() -notlike "*ehcontactus*" -and $childweb.Url.ToLower() -notlike "*ehfaqapp*" -and $childweb.Url.ToLower() -notlike "*ehquicksearch*" -and $childweb.Url.ToLower() -notlike "*ehsiteapps*" -and $childweb.Url.ToLower() -notlike "*ehsitelist*" -and $childweb.Url.ToLower() -notlike "*ehwelcomeapp*" -and $childweb.Url.ToLower() -notlike "*ehimageviewer*")
{
Get-FilesFromFolder -Folder $Library.RootFolder -SubWeb $childweb.Url -MTitle $childweb.Title
}
}
catch{
write-host "Skipping the matterpsace as the library does not exists" -ForegroundColor Blue
}
}
}
#Config Parameters
#$SiteURL= "https://impigerspuat.sharepoint.com/sites/ELeave/Eleave1/adminuat#impigerspuat.onmicrosoft.com"
$LibraryName="Documents"
#$securePassword = Read-Host -Prompt "Enter your password: " -AsSecureString
#Call the function to Get All Files from a document library
if (-not ([string]::IsNullOrEmpty($fromDate)))
{
$startDate = Get-Date (Get-Date -Date $fromDate -Format 'dd/MM/yyyy')
}
else
{
$startDate = $null;
}
if (-not ([string]::IsNullOrEmpty($toDate)))
{
$endDate = Get-Date (Get-Date -Date $toDate -Format 'dd/MM/yyyy')
}
else
{
$endDate = $null
}
Get-SPODocLibraryFiles -SiteURL $srcUrl -LibraryName $LibraryName
Have you tried running just that function and giving it the parameters it's requesting in the function?
Copy the code into a WriteLog.ps1 file and then call the script file with the parameters.
i.e.
.\WriteLog.ps1 $srcUrl $username $fromDate $toDate $folderPath $csvPath
Obviously, inputting data in place of the variables.
FWIW, pulling relevant pieces of code out of someone else's scripts is a great skill to practice. Everything you want to do has been done before, but you might have to break down someone else's work before it fits your exact environment.
Unfortunately, it looks like you have to do this the old-fashioned way. The problem is that the author writes to the log (.csv) as the files are being downloaded, as opposed to downloading to a staging area first...
I suggest setting an early breakpoint in the code and then stepping through to see exactly how it flows. That should give you a general idea, and enough info to start writing refactored code.
Reverse engineering is always tough; be prepared, it will be a methodical exercise to say the least.
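For example, from a plain console session (the script name and line number below are just placeholders; the VS Code debugger gives you the same stepping experience):
Set-PSBreakpoint -Script .\Get-SPODocuments.ps1 -Line 20
.\Get-SPODocuments.ps1 -srcUrl $srcUrl -username $username -folderPath $folderPath -csvPath $csvPath
# at the [DBG] prompt: 's' steps into, 'v' steps over, 'c' continues, 'q' quits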
Bad news: this will be an iterative process, not a single 'solve'. Nothing "wrong" with that code, but there are a few design choices that make this a challenge. It's not indented consistently and it weaves through all the variable assignments in slightly different ways. Looks better than most of my code, I'm just telling you what makes it a challenge.
Good news: At least that WriteLog function is separate. And it's really just adding content to the .csv file defined in this variable assigned here:
$global:OutFilePath = -join ($csvPath,"\Documents.csv")
(line 20 in my copy)
RECOMMENDATION: (this is an approach, just a guide to your final solution)
Take that existing code and drop it in an IDE to help you visually. The Windows Powershell ISE is adequate, but I would highly recommend VSCode.
Comment out that last line:
Get-SPODocLibraryFiles -SiteURL $srcUrl -LibraryName $LibraryName
So you can retain any of the other context from the script you actually want to keep.
Create a separate function named something like:
function Get-FilesFromLocalFolder ($localdir, $SubWeb, $MTitle)
to use instead of the existing function Get-FilesFromFolder. That way you can iterate through whatever directories you need, get the files, and assign variables to pass as parameters. Then when you call WriteLog, it will look very similar. Those last two parameters ($SubWeb, $MTitle) are passed just because WriteLog needs them. You could make them your own labels, or you could remove them and make them optional in WriteLog.
You could start by hard-coding values for each of the required parameters, and then run it to see if the output is working; something like the sketch below.
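A rough sketch of what that replacement function might look like, assuming you only want to re-index the already-downloaded local tree and reuse WriteLog as-is (the created-by / modified-by values become placeholders, since the SharePoint author metadata isn't available locally):
function Get-FilesFromLocalFolder ($localdir, $SubWeb, $MTitle)
{
    foreach ($item in Get-ChildItem -Path $localdir -Recurse)
    {
        $type   = if ($item.PSIsContainer) { "Folder" } else { "File" }
        $size   = if ($item.PSIsContainer) { 0 } else { $item.Length }
        $parent = Split-Path -Path $item.FullName -Parent | Split-Path -Leaf
        # same positional arguments the original script passes to WriteLog
        WriteLog $item.BaseName $type $parent $item.Name $item.FullName $size `
                 $item.CreationTime $env:USERNAME $item.LastWriteTime $env:USERNAME $MTitle $SubWeb
    }
}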
It will take you some iterations (agree with @Steven) and it is definitely a valuable exercise (agree with @TheIdesOfMark). :)

Windows PowerShell rename files

I am sort of new to scripting, and here's my task:
A folder with X files: Word documents, Excel sheets, etc. In these files there is a client name, and I need to assign an ID number to it.
This change will affect all the files in this folder that contain this client's name.
How can I do this using Windows PowerShell?
$configFiles = Get-ChildItem . *.config -rec
foreach ($file in $configFiles)
{
(Get-Content $file.PSPath) |
Foreach-Object { $_ -replace " JOHN ", "123" } |
Set-Content $file.PSPath
}
Is this the right approach?
As @lee_Daily pointed out, you would need different code to perform a find and replace in each file type. Here is an example of how you could go about doing that:
$objWord = New-Object -ComObject Word.Application
$objWord.Visible = $false
foreach ($file in (Get-ChildItem . -Recurse))
{
    switch ($file.Extension)
    {
        ".config" {
            (Get-Content $file.FullName) |
                ForEach-Object { $_ -replace " JOHN ", "123" } |
                Set-Content $file.FullName
        }
        { $_ -in ".doc", ".docx" } {
            ### Replace in Word document using $file.FullName as the target
        }
        ".xlsx" {
            ### Replace in spreadsheet using $file.FullName as the target
        }
    }
}
For the actual code to perform the find and replace, I would suggest COM objects for both.
Example of a Word find and replace: https://codereview.stackexchange.com/questions/174455/powershell-script-to-find-and-replace-in-word-document-including-header-footer
Example of an Excel find and replace: Search & Replace in Excel without looping?
I would also suggest learning the ImportExcel module; it is a great tool which I use a lot.
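For instance, a rough sketch of the Excel side with the same COM approach (the workbook path is just an example); Range.Replace does the whole-sheet find and replace without looping over cells:
$objExcel = New-Object -ComObject Excel.Application
$objExcel.Visible = $false
$objExcel.DisplayAlerts = $false
$workbook = $objExcel.Workbooks.Open("C:\Temp\Clients.xlsx")
foreach ($sheet in $workbook.Worksheets)
{
    [void]$sheet.Cells.Replace(" JOHN ", "123")
}
$workbook.Save()
$workbook.Close()
$objExcel.Quit()
# release the COM object so EXCEL.EXE does not linger in the background
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($objExcel) | Out-Null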
For Word documents, this is what I'm using. I just can't figure out how this script could also change the header and footer in a Word document:
$objWord = New-Object -comobject Word.Application
$objWord.Visible = $false
$list = Get-ChildItem "C:\Users\*.*" -Include *.doc*
foreach($item in $list){
$objDoc = $objWord.Documents.Open($item.FullName,$true)
$objSelection = $objWord.Selection
$wdFindContinue = 1
$FindText = " BLAH "
$MatchCase = $False
$MatchWholeWord = $true
$MatchWildcards = $False
$MatchSoundsLike = $False
$MatchAllWordForms = $False
$Forward = $True
$Wrap = $wdFindContinue
$Format = $False
$wdReplaceNone = 0
$ReplaceWith = "help "
$wdFindContinue = 1
$ReplaceAll = 2
$a = $objSelection.Find.Execute($FindText,$MatchCase,$MatchWholeWord, `
$MatchWildcards,$MatchSoundsLike,$MatchAllWordForms,$Forward,`
$Wrap,$Format,$ReplaceWith,$ReplaceAll)
$objDoc.Save()
$objDoc.Close()
}
$objWord.Quit()
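One way to also cover headers and footers (a sketch, untested): in the Word object model they live in separate story ranges, so run the same Find.Execute over every story range of the document instead of only the selection. Inside the foreach loop, after the Find variables are set, something like:
foreach ($story in $objDoc.StoryRanges)
{
    $range = $story
    while ($null -ne $range)
    {
        # same arguments as above, executed against each story range
        $range.Find.Execute($FindText,$MatchCase,$MatchWholeWord, `
            $MatchWildcards,$MatchSoundsLike,$MatchAllWordForms,$Forward, `
            $Wrap,$Format,$ReplaceWith,$ReplaceAll) | Out-Null
        $range = $range.NextStoryRange   # linked headers/footers chain through NextStoryRange
    }
}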
What if I try to run this in C#? Is anything else missing?
string rootfolder = @"C:\Temp";
string[] files = Directory.GetFiles(rootfolder, "*.*", SearchOption.AllDirectories);
foreach (string file in files)
{
    try
    {
        string contents = File.ReadAllText(file);
        contents = contents.Replace(@"Text to find", @"Replacement text");
        // Make files writable
        File.SetAttributes(file, FileAttributes.Normal);
        File.WriteAllText(file, contents);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}

Dynamic Parameters - with Dynamic ValidateSet

I have a script that I've been working on to provide parsing of SCCM log files. It takes a computer name and a location on disk, builds a dynamic parameter list, and then presents it to the user to choose the log file they want to parse. The trouble is I cannot seem to get the ValidateSet portion of the dynamic parameter to provide values to the user. In addition, the function won't display the -Log dynamic parameter when attempting to call it.
When you run it for the first time you are not presented with the dynamic parameter Log, as I mentioned above. If you then use -Log and hit tab, you get the command completer for the files in the directory you are in. Not what you'd expect; you'd expect it to present the log file names that were gathered during the dynamic parameter execution.
PSVersion 5.1.14409.1012
So the question is: how do I get PowerShell to present the proper ValidateSet items to the user?
If you supply one of the items shown in the error, you get the proper behavior.
Here are the two functions that I use to make this possible:
function Get-CCMLog
{
[CmdletBinding()]
param([Parameter(Mandatory=$true,Position=0)]$ComputerName = '$env:computername', [Parameter(Mandatory=$true,Position=1)]$path = 'c:\windows\ccm\logs')
DynamicParam
{
$ParameterName = 'Log'
if($path.ToCharArray() -contains ':')
{
$FilePath = "\\$ComputerName\$($path -replace ':','$')"
if(test-path $FilePath)
{
$logs = gci "$FilePath\*.log"
$LogNames = $logs.basename
$logAttribute = New-Object System.Management.Automation.ParameterAttribute
$logAttribute.Position = 2
$logAttribute.Mandatory = $true
$logAttribute.HelpMessage = 'Pick A log to parse'
$logCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
$logCollection.add($logAttribute)
$logValidateSet = New-Object System.Management.Automation.ValidateSetAttribute($LogNames)
$logCollection.add($logValidateSet)
$logParam = New-Object System.Management.Automation.RuntimeDefinedParameter($ParameterName,[string],$logCollection)
$logDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
$logDictionary.Add($ParameterName,$logParam)
return $logDictionary
}
}
}
begin {
# Bind the parameter to a friendly variable
$Log = $PsBoundParameters[$ParameterName]
}
process {
# Your code goes here
#dir -Path $Path
$sb2 = "$((Get-ChildItem function:get-cmlog).scriptblock)`r`n"
$sb1 = [scriptblock]::Create($sb2)
$results = Invoke-Command -ComputerName $ComputerName -ScriptBlock $sb1 -ArgumentList "$path\$log.log"
[PSCustomObject]@{"$($log)Log" = $results}
}
}
function Get-CMLog
{
param(
[Parameter(Mandatory=$true,
Position=0,
ValueFromPipelineByPropertyName=$true)]
[Alias("FullName")]
$Path,
$tail =10
)
PROCESS
{
if(($Path -isnot [array]) -and (test-path $Path -PathType Container) )
{
$Path = Get-ChildItem "$path\*.log"
}
foreach ($File in $Path)
{
if(!( test-path $file))
{
$Path +=(Get-ChildItem "$file*.log").fullname
}
$FileName = Split-Path -Path $File -Leaf
if($tail)
{
$lines = Get-Content -Path $File -tail $tail
}
else {
$lines = Get-Content -Path $file
}
ForEach($l in $lines ){
$l -match '\<\!\[LOG\[(?<Message>.*)?\]LOG\]\!\>\<time=\"(?<Time>.+)(?<TZAdjust>[+|-])(?<TZOffset>\d{2,3})\"\s+date=\"(?<Date>.+)?\"\s+component=\"(?<Component>.+)?\"\s+context="(?<Context>.*)?\"\s+type=\"(?<Type>\d)?\"\s+thread=\"(?<TID>\d+)?\"\s+file=\"(?<Reference>.+)?\"\>' | Out-Null
if($matches)
{
$UTCTime = [datetime]::ParseExact($("$($matches.date) $($matches.time)$($matches.TZAdjust)$($matches.TZOffset/60)"),"MM-dd-yyyy HH:mm:ss.fffz", $null, "AdjustToUniversal")
$LocalTime = [datetime]::ParseExact($("$($matches.date) $($matches.time)"),"MM-dd-yyyy HH:mm:ss.fff", $null)
}
[pscustomobject]@{
UTCTime = $UTCTime
LocalTime = $LocalTime
FileName = $FileName
Component = $matches.component
Context = $matches.context
Type = $matches.type
TID = $matches.TID
Reference = $matches.reference
Message = $matches.message
}
}
}
}
}
The problem is that you have all the dynamic logic inside the script block of the if statement, and it handles the parameter addition only if the path provided contains a colon (':').
You could change it to something like:
if($path.ToCharArray() -contains ':') {
$FilePath = "\\$ComputerName\$($path -replace ':','$')"
} else {
$FilePath = $path
}
and continue your code from there
PS 6 can do a dynamic [ValidateSet] with a class:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_functions_advanced_parameters?view=powershell-6#dynamic-validateset-values
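A minimal sketch of that class-based approach (PowerShell 6+; the log path is just an example, adapt it to your environment):
using namespace System.Management.Automation
class CcmLogName : IValidateSetValuesGenerator
{
    [string[]] GetValidValues()
    {
        # evaluated each time tab completion / validation runs
        return (Get-ChildItem -Path 'C:\Windows\CCM\Logs\*.log').BaseName
    }
}
function Get-CCMLogName
{
    param(
        [ValidateSet([CcmLogName])]
        [string]$Log
    )
    $Log
}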

How to create directories from powershell on FTP server?

I want to run a PS script when I want to publish to an FTP server. I took this script as my structure: structure script.
I have a very simple folder structure:
C:\Uploadftp\Files\doc.txt
C:\Uploadftp\Files\Files2
C:\Uploadftp\Files\Files2\doc2.txt
Nothing fancy there.
Here is my script:
cd C:\Uploadftp
$location = Get-Location
"We are here: $location"
$user = "test" # Change
$pass = "test" # Change
## Get files
$files = Get-ChildItem -recurse
## Get ftp object
$ftp_client = New-Object System.Net.WebClient
$ftp_client.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$ftp_address = "ftp://test/TestFolder"
## Make uploads
foreach($file in $files)
{
$directory = "";
$source = $($file.DirectoryName + "/" + $file);
if ($file.DirectoryName.Length -gt 0)
{
$directory = $file.DirectoryName.Replace($location,"")
}
$directory = $directory.Replace("\","/")
$source = $source.Replace("\","/")
$directory += "/";
$ftp_command = $($ftp_address + $directory + $file)
# Write-Host $source
$uri = New-Object System.Uri($ftp_command)
"Command is " + $uri + " file is $source"
$ftp_client.UploadFile($uri, $source)
}
I keep getting this error:
Exception calling "UploadFile" with "2" argument(s): "An exception occurred during a WebClient request."
If I hardcode a specific folder for $uri and point the source at a specific folder on my computer, this script doesn't create a directory, it creates a file. What am I doing wrong?
P.S. Don't hit me too hard, it's my first time ever doing something in PowerShell.
Try the "Create-FtpDirectory" function from https://github.com/stej/PoshSupport/blob/master/Ftp.psm1
function Create-FtpDirectory {
param(
[Parameter(Mandatory=$true)]
[string]
$sourceuri,
[Parameter(Mandatory=$true)]
[string]
$username,
[Parameter(Mandatory=$true)]
[string]
$password
)
if ($sourceUri -match '\\$|\\\w+$') { throw 'sourceuri should end with a file name' }
$ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
Write-Host Upload File Complete, status $response.StatusDescription
$response.Close();
}
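For example, you could call it from your upload script for every local directory before uploading the files into it (a rough sketch reusing the variables already defined above; an already-existing remote directory makes the FTP request throw, hence the try/catch):
# create the remote folder tree first, then upload the files into it
foreach ($dir in Get-ChildItem -Recurse -Directory)
{
    $relative = $dir.FullName.Replace($location, "").Replace("\", "/")
    try   { Create-FtpDirectory -sourceuri ($ftp_address + $relative) -username $user -password $pass }
    catch { Write-Host "Could not create (or already exists): $relative" }
}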