I'm hoping somebody can shed light on this, because it has been driving me to distraction.
I have a script which saves the reports it creates to a SharePoint document library via a UNC path, if that path exists; otherwise it saves to the UNC path of a network drive location as a fallback.
I've noticed that checking with Test-Path, saving (through an Excel COM object), or trying to open the folder in Windows Explorer using Invoke-Item only works if I have already accessed the SharePoint site (via web browser or Windows Explorer) since the PC last logged on (I'm running Windows 7 Enterprise Service Pack 1, 64-bit edition).
If I haven't yet been onto SharePoint manually since the last logon, Test-Path returns false, and the other methods throw ItemNotFoundException, e.g.
ii : Cannot find path '\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Documents\Reports' because it does not exist.
At line:1 char:1
+ ii '\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Document ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (\\uk.sharepoint...\Reports:String) [Invoke-Item], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.InvokeItemCommand
Example areas of code:
$LANPath = "\\myserver\myshare\teamdirs\scriptdir"
$SharepointPath = "\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Documents\Reports"
$ScriptPath = $LANPath + "\bin"
If (Test-Path $SharepointPath) {
    $BasePath = $SharepointPath
    Write-Host "Using sharepoint to save reports"
} else {
    $BasePath = "$LANPath\Reports"
    Write-Host "Using LAN to save reports - sharepoint not accessible"
}
and
$_|select -expandproperty HTMLBody | Out-File $($BasePath + "\Eml_body.html")
Write-Host "Reformating HTML"
$html = New-Object -ComObject "HTMLFile";
$source = Get-Content -Path ($BasePath + "\Eml_body.html") -Raw;
and when saving the Excel spreadsheet from within my COM object:
$workbook._SaveAs($fileout,[Microsoft.Office.Interop.Excel.XlFileFormat]::xlOpenXMLWorkbook,$Missing,$Missing,$false,$false,[Microsoft.Office.Interop.Excel.XlSaveAsAccessMode]::xlNoChange,[Microsoft.Office.Interop.Excel.XlSaveConflictResolution]::xlLocalSessionChanges,$true,$Missing,$Missing)
You should be able to use a System.Net.WebClient object to access SharePoint file locations.
$client = New-Object System.Net.WebClient
The documentation for the WebClient.Credentials property suggests that the default credentials in this case may be for the ASP.NET server-side process rather than the current user's credentials:
If the WebClient class is being used in a middle tier application, such as an ASP.NET application, the DefaultCredentials belong to the account running the ASP page (the server-side credentials). Typically, you would set this property to the credentials of the client on whose behalf the request is made.
You therefore may want to set the credentials manually. You can plug them in as plain text...
$client.Credentials = New-Object System.Net.NetworkCredential("username","pswd","domain")
...or you could prompt the current user for their credentials.
$client.Credentials = Get-Credential
Here's an example that grabs a file and writes its content to the screen:
$client = New-Object System.Net.WebClient
$client.Credentials = Get-Credential
$data = $client.OpenRead("http://yoursharepointurl.com/library/document.txt")
$reader = New-Object System.IO.StreamReader($data)
$results = $reader.ReadToEnd()
Write-Host $results
$data.Close()
$reader.Close()
I know this is an old thread but for those searching, check out this link: https://www.myotherpcisacloud.com/post/Sometimes-I-Can-Access-the-WebDAV-Share-Sometimes-I-Cant!
Because SharePoint exposes its shares over WebDAV, you need to ensure the WebClient service is running on the machine from which you are accessing the path. Browsing the path in Windows Explorer automatically fires up the service, while command-line methods do not.
If you change the startup type of WebClient to Automatic, it should resolve the issue.
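For example, before the Test-Path check you could make sure the service is up. A minimal sketch (the cmdlets are the standard ones; note that Set-Service needs an elevated session):
# Start the WebDAV redirector (the WebClient service) if it isn't already running
$svc = Get-Service -Name WebClient
if ($svc.Status -ne 'Running') {
    Set-Service -Name WebClient -StartupType Automatic
    Start-Service -Name WebClient
}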
I have to implement a solution where I have to deploy an SSIS project (xy.ispac) from one machine to another. So far I've managed to copy-cut-paste the following stuff from all around the internet:
# Variables
$ServerName = "target"
$SSISCatalog = "SSISDB" # sort of constant
$CatalogPwd = "catalog_password"
$ProjectFilePath = "D:\Projects_to_deploy\Project_1.ispac"
$ProjectName = "Project_name"
$FolderName = "Data_collector"
# Load the IntegrationServices Assembly
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
# Store the IntegrationServices Assembly namespace to avoid typing it every time
$ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Write-Host "Connecting to server ..."
# Create a connection to the server
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection
$catalog = $integrationServices.Catalogs[$SSISCatalog]
# Create the SSIS catalog if it does not exist
if (!$catalog) {
# Provision a new SSIS Catalog
Write-Host "Creating SSIS Catalog ..."
$catalog = New-Object "$ISNamespace.Catalog" ($integrationServices, $SSISCatalog, $CatalogPwd)
$catalog.Create()
}
$folder = $catalog.Folders[$FolderName]
if (!$folder)
{
#Create a folder in SSISDB
Write-Host "Creating Folder ..."
$folder = New-Object "$ISNamespace.CatalogFolder" ($catalog, $FolderName, $FolderName)
$folder.Create()
}
# Read the project file, and deploy it to the folder
Write-Host "Deploying Project ..."
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
$folder.DeployProject($ProjectName, $projectFile)
This seemed to be working surprisingly well on the development machine / test server pair. However, the live environment will be a bit different: the machine doing the deployment job (the deployment server, or DS from now on) and the SQL Server the project is to be deployed to (DB for short) are in different domains, and since SSIS requires Windows authentication, I'm going to need to run the above code locally on DS but using the credentials of a user on DB.
And that's the point where I fail. The only thing that worked was to start the PowerShell command-line interface using runas /netonly /user:thatdomain\anuserthere powershell, enter the password, and paste the script unaltered into it. Alas, this is not an option, since there's no way to pass the password to runas (save entering it once with /savecred), and user interactivity is not possible anyway (the whole thing has to be automated).
I've tried the following:
Simply running the script on DS: the line $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString would use the credentials from DS, which are not recognized by DB, and New-Object does not have a -Credential argument I could pass them to.
Putting everything into an Invoke-Command with -Credential requires using -ComputerName as well. I guess it would be possible to use the local machine as 'remote' (using . as ComputerName), but it still complains about access being denied. I'm scanning through about_Remote_Troubleshooting, so far without any success.
Any hints on how to overcome this issue?
A solution might be to use a SQL user (with the right access rights) instead of an AD user. Something like this should work (check also the answer to correct the connection string).
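A minimal sketch of that approach, assuming a hypothetical SQL login ssis_deploy with rights on SSISDB (only the connection string changes relative to the script above):
# Hypothetical SQL login; replace with a real one that has SSISDB permissions
$SqlUser = "ssis_deploy"
$SqlPassword = "sql_password"
# SQL authentication instead of Integrated Security=SSPI
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=SSISDB;User ID=$SqlUser;Password=$SqlPassword;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection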
I need to create a folder in SharePoint if it does not already exist. My PowerShell script is not running on the SharePoint server, so I think I have to use the SharePoint web services? I am currently uploading files to SharePoint with PowerShell using WebClient, as below, but I need to create the folder for the file first, if it does not already exist:
# Upload the file
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = $credentials
$webclient.UploadFile($destination + "/" + $File.Name, "PUT", $File.FullName)
Is this possible to do with WebClient? If not, how can this be done using the SharePoint web services?
Since you mentioned SharePoint Web Services, the Lists.UpdateListItems method could be utilized for that purpose, for example:
Function Create-Folder([string]$WebUrl,[string]$ListUrl,[string]$ListName,[string]$FolderName)
{
$url = $WebUrl + "/_vti_bin/lists.asmx?WSDL"
$listsProxy = New-WebServiceProxy -Uri $url -UseDefaultCredential
$batch = [xml]"<Batch OnError='Continue' RootFolder='$WebUrl/$ListUrl'><Method ID='1' Cmd='New'><Field Name='ID'>New</Field><Field Name='FSObjType'>1</Field><Field Name='BaseName'>$FolderName</Field></Method></Batch>"
$result = $listsProxy.UpdateListItems($ListName, $batch)
}
Usage
Create Orders folder under Documents library:
Create-Folder -WebUrl "http://contoso.intranet.com" -ListUrl "Documents" -ListName "Documents" -FolderName "Orders"
Create 2014 folder in Requests list:
Create-Folder -WebUrl "http://contoso.intranet.com" -ListUrl "Lists/Requests" -ListName "Requests" -FolderName "2014"
Update
If the folder already exists, the SOAP service will return the error:
The operation failed because an unexpected error occurred. (Result
Code: 0x8107090d)
but since the OnError attribute is set to Continue on the Batch element, PowerShell will continue execution.
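If you need to detect that case yourself, you could inspect the ErrorCode in the XML returned by UpdateListItems. A sketch (treat the exact property path as an assumption, since it depends on the proxy-generated result):
$result = $listsProxy.UpdateListItems($ListName, $batch)
# Assumed result shape: <Results><Result ID="1,New"><ErrorCode>0x...</ErrorCode></Result></Results>
if ($result.Result.ErrorCode -ne "0x00000000") {
    Write-Host "Folder '$FolderName' was not created (it may already exist): $($result.Result.ErrorCode)"
}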
I am new to both PowerShell and SharePoint, and I need to make a script to automate the removal of attachments from Outlook and their upload to SharePoint. I have easily completed the first part of extracting the attachment; however, uploading to SharePoint has become difficult due to my company's rules. As I understand it, to use SharePoint cmdlets you need to add the SharePoint snap-in, but I am unable to do so because I don't have access to the SharePoint server. Is there any way to add the snap-in without being on the server, and if not, can I upload another way?
You can't add the SP snap-in unless the server is a SharePoint server. Instead, use a web service/WebClient approach to upload the file. Something like this should work, depending on your SP version:
http://blog.sharepoint-voodoo.net/?p=205
Accepted answer link is broken.
This script uses PowerShell to upload a file to a document library in SharePoint using purely web service calls, so it can be done remotely; that also means it should work with O365, though I have not tried.
These variables are used throughout the script for the source file, destination file, and authentication. If your workstation is on the same domain as SharePoint and your logged-on user has permissions to the SharePoint site, you can omit $username, $password, and $domain:
$LocalPath = "C:\filename.docx"
$spDocLibPath = "http://site.contoso.com/sites/spteam/Shared Documents/"
$username = "someone"
$password = "somepassword"
$domain = "contoso"
$UploadFullPath = $spDocLibPath + $(split-path -leaf $LocalPath)
$WebClient = new-object System.Net.WebClient
if($username -eq "" -or $password -eq "" -or $domain -eq "")
{
# Use Local Logged on User Credentials
$WebClient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
}
else
{
# Alternate Login for specifying credentials
$WebClient.Credentials = new-object System.Net.NetworkCredential($username, $password, $domain)
}
$WebClient.UploadFile($UploadFullPath, "PUT", $LocalPath)
https://web.archive.org/web/20160404174527/http://blog.sharepoint-voodoo.net/?p=205
I have a powershell script that we use during a Microsoft SCCM PXE task sequence for naming a PC. It worked flawlessly until a recent upgrade to SCCM 2012 R2 by the primary server admin.
Now, when the code runs a search to check whether a user is in a specified AD group needed to complete the PXE build, it gives this COM error:
Exception calling "FindAll" with "0" argument(s): "Unknown error (0x80005000)"
At X:\Windows\System32\OSD\x86_PXE.ps1:202 char:1
+ $colResults = $objSearcher.FindAll() # Finds all items that match search and put ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : COMException
I have searched far and wide to try and solve this. It seems like a .Net error but I have been unsuccessful in resolving it.
Below is the relevant code. Note that this is being run in the Windows PE included with SCCM 2012 R2, as well as the current Windows ADK. It will most likely work just fine on a normal PC, as it does on mine.
Things to note: you will need to change the following to match your environment:
$Domain
$strFilter - specifically "Memberof=cn="
$objOU - server path
function get-humadcreds {
$global:creds = get-credential -message "Please authenticate to Domain"
$global:UserName = $creds.username
$global:encPassword = $creds.password
$password = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($encpassword)) # Converts secure string to plain text
$Domain = #Domain
Add-Type -AssemblyName System.DirectoryServices.AccountManagement
$ct = [System.DirectoryServices.AccountManagement.ContextType]::Domain
$pc = New-Object System.DirectoryServices.AccountManagement.PrincipalContext $ct,$Domain
$authed = $pc.ValidateCredentials($UserName,$Password)
# Recursively requests credentials if authorization fails
if ($authed -eq $false) {
[System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
[System.Windows.Forms.MessageBox]::Show("Authentication failed - please retry!")
get-humadcreds
}
}
get-humadcreds # Gets AD credentials from user
###Provisioning Authentication
$strFilter = "(&(objectCategory=user)(SAMACCOUNTNAME=$global:UserName)(|(Memberof=cn=,OU=Delegation,OU=,dc=,dc=,dc=)))" # Filter for searching
$decodedpassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($encpassword)) # Decoded password from AD Auth
$objOU = New-Object System.DirectoryServices.DirectoryEntry("LDAP://server/OU=,dc=,dc=,dc=",$global:username,$decodedpassword) # Authentication must specify domain controller
$objDomain = New-Object System.DirectoryServices.DirectoryEntry
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objOU # Starts search in this OU
$objSearcher.PageSize = 1000
$objSearcher.Filter = $strFilter # Applies filter to search
$objSearcher.SearchScope = "Subtree"
$colProplist = "name"
$isInProvGroup = $False # Default value is false.
echo $objSearcher >> X:\Windows\System32\OSD\results.txt
$colResults = $objSearcher.FindAll() # Finds all items that match search and puts them in array $colResults
echo $colResults
foreach ($objResult in $colResults){
$isInProvGroup=$True #If user is in a group to add PCs (if $colResults is not empty), result will be true
}
echo $isInProvGroup
PE OS Version 6.3.9600.16384
Welp.. found my answer, fixed it Aug 11th. Reddit thread.
Previously, in SCCM 2012 prior to R2, the boot image was a Windows 8 PE4 image into which we had to integrate ADSI using a version of it written by Johan Arwidmark. This can be found here for reference.
This time around, after the R2 update and the consequent forced upgrade of the boot images to 8.1 PE5 (since no prior boot images would boot from PXE), we had to add ADSI back in again, this time from here. As before, it was done through Configuration Manager under drivers: it's added as a driver with its required files and included as a driver component in the boot.wim. In reality, though, after digging for quite some time I found that it wasn't actually adding the needed DLL files into the image, even though the operation returned successful.
What I ended up doing was manually mounting the WIM file on my PC with DISM, adding the driver from a folder (allowing unsigned drivers to be installed), and then manually verifying that the DLLs were put into place in the mounted System32 folder. After that I was able to unmount the WIM committing the changes, replace the boot WIM used by the server, distribute content, and test it, which was successful.
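For reference, the manual steps looked roughly like this (the paths are placeholders for wherever your copies of boot.wim and the ADSI files live; run from an elevated prompt):
# Mount the boot image (index 1 assumed for a boot.wim)
dism /Mount-Wim /WimFile:C:\Temp\boot.wim /Index:1 /MountDir:C:\Temp\Mount
# Add the ADSI files as a driver, allowing unsigned drivers to be installed
dism /Image:C:\Temp\Mount /Add-Driver /Driver:C:\Temp\ADSI /Recurse /ForceUnsigned
# Verify the DLLs landed in C:\Temp\Mount\Windows\System32, then commit and unmount
dism /Unmount-Wim /MountDir:C:\Temp\Mount /Commit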
Just as a reference, the required files are listed below and are also in the readmes. In my case they had to come from a Windows 8.1 32-bit install. If going for 64-bit, they have to come from a computer or image with Windows 8.1 64-bit:
adsldp.dll
adsmsext.dll
adsnt.dll
mscoree.dll
mscorier.dll
mscories.dll
I am writing a PowerShell script that I want to run from Server A.
I want to connect to Server B and copy a file to Server A as a backup.
If that can't be done then I would like to connect to Server B from Server A and copy a file to another directory in Server B.
I see the Copy-Item command, but I don't see how to give it a computer name.
I would have thought I could do something like
Copy-Item -ComputerName ServerB -Path C:\Programs\temp\test.txt -Destination (not sure how it would know to use ServerB or ServerA)
How can I do this?
From PowerShell version 5 onwards (included in Windows Server 2016, downloadable as part of WMF 5 for earlier versions), this is possible with remoting. The benefit of this is that it works even if, for whatever reason, you can't access shares.
For this to work, the local session where copying is initiated must have PowerShell 5 or higher installed. The remote session does not need to have PowerShell 5 installed -- it works with PowerShell versions as low as 2, and Windows Server versions as low as 2008 R2.[1]
From server A, create a session to server B:
$b = New-PSSession B
And then, still from A:
Copy-Item -FromSession $b C:\Programs\temp\test.txt -Destination C:\Programs\temp\test.txt
Copying items to B is done with -ToSession. Note that local paths are used in both cases; you have to keep track of what server you're on.
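For example, the reverse copy (same hypothetical paths) would be:
Copy-Item C:\Programs\temp\test.txt -Destination C:\Programs\temp\test.txt -ToSession $b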
[1]: when copying from or to a remote server that only has PowerShell 2, beware of this bug in PowerShell 5.1, which at the time of writing means recursive file copying doesn't work with -ToSession, and apparently copying doesn't work at all with -FromSession.
Simply use the administrative shares to copy files between systems.
It's much easier this way.
Copy-Item -Path \\serverb\c$\programs\temp\test.txt -Destination \\servera\c$\programs\temp\test.txt;
By using UNC paths instead of local filesystem paths, you help to ensure that your script is executable from any client system with access to those UNC paths. If you use local filesystem paths, then you are cornering yourself into running the script on a specific computer.
Use net use or New-PSDrive to create a new drive:
New-PSDrive: creates a new PSDrive, visible only in the PowerShell environment:
New-PSDrive -Name Y -PSProvider filesystem -Root \\ServerName\Share
Copy-Item BigFile Y:\BigFileCopy
Net use: creates a new drive, visible in all parts of the OS.
Net use y: \\ServerName\Share
Copy-Item BigFile Y:\BigFileCopy
Just in case the remote file needs your credentials to be accessed, you can create a System.Net.WebClient object using the New-Object cmdlet to copy the file remotely, like so:
$Source = "\\192.168.x.x\somefile.txt"
$Dest = "C:\Users\user\somefile.txt"
$Username = "username"
$Password = "password"
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
$WebClient.DownloadFile($Source, $Dest)
Or if you need to upload a file, you can use UploadFile:
$Dest = "\\192.168.x.x\somefile.txt"
$Source = "C:\Users\user\somefile.txt"
$WebClient.UploadFile($Dest, $Source)
None of the above answers worked for me. I kept getting this error:
Copy-Item : Access is denied
+ CategoryInfo : PermissionDenied: (\\192.168.1.100\Shared\test.txt:String) [Copy-Item], UnauthorizedAccessException
+ FullyQualifiedErrorId : ItemExistsUnauthorizedAccessError,Microsoft.PowerShell.Commands.CopyItemCommand
So this did it for me:
netsh advfirewall firewall set rule group="File and Printer Sharing" new enable=yes
Then from my host machine, in the Run box, I just did this:
\\{IP address of nanoserver}\C$
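Incidentally, on systems that have the NetSecurity module (Windows 8 / Server 2012 and later), the PowerShell equivalent of the netsh rule above is:
Enable-NetFirewallRule -DisplayGroup "File and Printer Sharing"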