I would like to be able to right-click one or more files and "send to" a local MSSQL database. Specifically, I would like to store the file contents in a "contents" column and the file name in a "filename" column ... how novel :)
In most cases the file contents are HTML.
It seems like this should be possible through the Windows shell, using a shortcut to a command placed in the "shell:sendto" folder.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$Server1 = New-Object ("Microsoft.SqlServer.Management.Smo.Server") 'SQLSERVER'
$Server1.Databases["DB"].Tables["Table"].RowCount
$RowCount = $Server1.Databases["DB"].Tables["Table"].RowCount.ToString()
$TotalRecords = [int]$RowCount
$wc = New-Object System.Net.WebClient
$url = ""
$files = @(Get-ChildItem c:\test\*.*)
"Number of files $($files.Length)"
# Errors out when no files are found
if ($files.Length -lt 1) { return }
foreach ($file1 in $files) {
    # $txt = Get-Content($file1)
    # $txt = $txt.Replace("'", "''")
    # Write-Host $file1.Name + " - - " + $txt
    $url1 = $url + $file1
    Write-Host ("URL is " + $url1)
    $webpage = $wc.DownloadData($url1)
    $string = [System.Text.Encoding]::ASCII.GetString($webpage)
    # Double up single quotes so the literal survives the T-SQL string
    $string = $string.Replace("'", "''")
    Invoke-SqlCmd -ServerInstance SERVER -Query "INSERT INTO DATABASE.dbo.Table(text, filename) VALUES ('$string', '$file1')"
}
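For the "send to" part, here is a sketch of one way to wire it up (the table and column names are assumptions): save a script like the one below, then create a shortcut in the shell:sendto folder whose target is powershell.exe -NoProfile -File "C:\Scripts\SendToSql.ps1". Explorer appends the selected file paths as arguments, so they arrive in $args. Using SqlParameter also avoids the quote-doubling dance above:
# SendToSql.ps1 -- a sketch; assumes a table dbo.Files(filename, contents)
# and Windows authentication against a local default instance.
$connectionString = 'Server=localhost;Database=DB;Integrated Security=True'
$conn = New-Object System.Data.SqlClient.SqlConnection $connectionString
$conn.Open()
try {
    foreach ($path in $args) {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = 'INSERT INTO dbo.Files (filename, contents) VALUES (@name, @contents)'
        [void]$cmd.Parameters.AddWithValue('@name', [IO.Path]::GetFileName($path))
        # -Raw keeps the whole file as a single string (PowerShell 3.0+)
        [void]$cmd.Parameters.AddWithValue('@contents', (Get-Content -Path $path -Raw))
        [void]$cmd.ExecuteNonQuery()
    }
}
finally {
    $conn.Close()
}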
I am trying to edit the Mirth properties file with PowerShell to add the Microsoft SQL Server driver URL and class name. I am able to edit and save, but when I start the Mirth service after editing, the properties file is corrupted. I compared the original and the edited content: everything is the same except the added changes, yet the file size grew from 5 KB to 10 KB. Can anyone help me out? Here is the PowerShell script.
$server="Localhost"
$PropertyfilePath = "C:\Program Files\Mirth Connect\conf\mirth.properties"
$connectionString = "jdbc:sqlserver://$($server):1433;instanceName=$($server);databaseName=Mydb;integratedSecurity=true"
$driverName = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
$data= #()
Copy-Item $PropertyfilePath "D:\Mirthbackup" -force
$newstreamreader = New-Object System.IO.StreamReader($PropertyfilePath)
[int]$eachlinenumber = 1
while (($readeachline = $newstreamreader.ReadLine()) -ne $null)
{
if($readeachline.Contains("= derby")){
$readeachline.Remove(0)
$update= "database = sqlserver"
$data +=$update
$eachlinenumber++
}
elseif($readeachline.Contains("database.url"))
{
$update=$readeachline.Substring(0,12)+" = "+$connectionString
$data +=$update
$eachlinenumber++
}
elseif($readeachline.Contains(".driver"))
{
$readeachline.Remove(0)
$update ="database.driver = "+$driverName
$data +=$update
$eachlinenumber++
}
else{
$data +=$readeachline
$eachlinenumber++
}
}
$newstreamreader.Dispose()
$data | Out-File -FilePath $PropertyfilePath
Your properties file should be ISO-8859-1 (latin1) encoded. Out-File writes UTF-16 by default in Windows PowerShell, which is why the edited file roughly doubled in size and Mirth can no longer read it. Try changing your code to:
$server="Localhost"
$PropertyfilePath = "C:\Program Files\Mirth Connect\conf\mirth.properties"
$connectionString = "jdbc:sqlserver://$($server):1433;instanceName=$($server);databaseName=Mydb;integratedSecurity=true"
$driverName = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
$data= #()
Copy-Item $PropertyfilePath "D:\Mirthbackup" -force
$encoding = [System.Text.Encoding]::GetEncoding('iso-8859-1'))
$newstreamreader = New-Object System.IO.StreamReader($PropertyfilePath, $encoding)
[int]$eachlinenumber = 1
while (($readeachline = $newstreamreader.ReadLine()) -ne $null)
{
# if statements left out
}
$newstreamreader.Dispose()
[System.IO.File]::WriteAllLines($PropertyfilePath,$data, $encoding)
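If you prefer to skip the StreamReader plumbing, the same edit can be done with the .NET file helpers, which take the encoding at both ends. A sketch, reusing the variables and the same matching logic as above:
$encoding = [System.Text.Encoding]::GetEncoding('iso-8859-1')
$data = @()
foreach ($line in [System.IO.File]::ReadAllLines($PropertyfilePath, $encoding)) {
    if     ($line.Contains("= derby"))      { $data += "database = sqlserver" }
    elseif ($line.Contains("database.url")) { $data += "database.url = " + $connectionString }
    elseif ($line.Contains(".driver"))      { $data += "database.driver = " + $driverName }
    else                                    { $data += $line }
}
[System.IO.File]::WriteAllLines($PropertyfilePath, $data, $encoding)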
I'm trying to achieve the following via PowerShell:
I have a table (TBL_DDL) with 5 columns (CATALOG, SCHEMA, OBJECT_TYPE, OBJECT_NAME, DDL).
I extract data from this table, then create a folder structure on the C: drive by concatenating the first 4 columns (CATALOG, SCHEMA, OBJECT_TYPE, OBJECT_NAME), and finally export the data in the DDL column to a txt file.
For example: C:\"CATALOG"\"SCHEMA"\"OBJECT_TYPE"\"OBJECT_NAME"\DDL.txt
Can anyone help me please?
$SqlCmd = 'snowsql -c example -d tu_test -s public -q "select catalog,schema,OBJECT_TYPE,OBJECT_NAME,DDL from SF_TBL_DDL limit 2"'
$MultiArray = @(Invoke-Expression $SqlCmd)
$dt = New-Object System.Data.DataTable
[void]$dt.Columns.Add("CATALOG")
[void]$dt.Columns.Add("SCHEMA")
$Output = foreach ($Object in $MultiArray)
{
    foreach ($SCHEMA in $Object.SCHEMA)
    {
        $someother = New-Object -TypeName psobject -Property @{CATALOG = $Object.CATALOG; SCHEMA = $SCHEMA}
        $nRow = $dt.NewRow()
        $nRow.CATALOG = $someother.CATALOG
        $nRow.SCHEMA = $someother.SCHEMA
        $dt.Rows.Add($nRow)
    }
}
$dt.Rows.Count
At the moment, I'm getting 0 rows in $dt.
Cheers
You can use a System.Data.DataTable object to pull your result set and then loop through it to perform the required operations. (In your attempt, Invoke-Expression on snowsql returns plain text lines, which have no .CATALOG or .SCHEMA properties, so the inner foreach never adds a row.)
Here the GetTableValues function retrieves the table values; the following cmdlets then create each directory and file:
New-Item -ItemType "directory" -Path $dirPath
New-Item -ItemType "file" -Path $filePath
The complete code looks like this:
function GetTableValues() {
    $DBConnectionString = "<Your DB connection string>";
    $sqlConn = New-Object System.Data.SqlClient.SqlConnection $DBConnectionString;
    $sqlConn.Open();
    $sqlCommand = $sqlConn.CreateCommand();
    $sqlCommand.CommandText = "select catalog,[schema],OBJECT_TYPE,OBJECT_NAME,DDL from TBL_DDL"; ## Put your correct query here
    $result = $sqlCommand.ExecuteReader();
    $table = New-Object System.Data.DataTable;
    $table.Load($result);
    $sqlConn.Close();
    return $table;
}
$tableValue = GetTableValues;
foreach ($Row in $tableValue)
{
    $dirPath = "C:\" + $Row.catalog.TrimEnd() + "\" + $Row.schema.TrimEnd() + "\" + $Row.OBJECT_TYPE.TrimEnd() + "\" + $Row.OBJECT_NAME.TrimEnd()
    $filePath = $dirPath + "\" + $Row.DDL.TrimEnd() + ".txt"
    New-Item -ItemType "directory" -Path $dirPath   ## Creates the directory
    New-Item -ItemType "file" -Path $filePath       ## Creates the (empty) file in $dirPath
}
This works perfectly fine for me.
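If you want the layout exactly as described in the question (a fixed DDL.txt per object, with the DDL text inside the file rather than in the file name), a small variation of the loop would do it. A sketch, reusing $tableValue from above:
foreach ($Row in $tableValue)
{
    $dirPath = "C:\" + $Row.catalog.TrimEnd() + "\" + $Row.schema.TrimEnd() + "\" +
               $Row.OBJECT_TYPE.TrimEnd() + "\" + $Row.OBJECT_NAME.TrimEnd()
    # -Force creates intermediate folders and does not fail when the path exists
    New-Item -ItemType "directory" -Path $dirPath -Force | Out-Null
    # Write the DDL text into DDL.txt inside the object's folder
    Set-Content -Path ($dirPath + "\DDL.txt") -Value $Row.DDL
}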
I have a large number of PDF files in a folder with several subfolders. In this pile of files I need to find the ones containing a specific string and move them to a new destination.
I already have a fine piece of code for the search process that gives me the files needed (thx to the creator); now I need help combining it with a move operation. All the files found by the following code should be moved to a new destination.
$searchString = "text i need to find"
$searchPath = "C:\test"
$sql = "SELECT System.ItemPathDisplay, System.DateModified, "
+ "System.Size, System.FileExtension FROM SYSTEMINDEX "
+ "WHERE SCOPE = '$searchPath' AND FREETEXT('$searchstring')"
$provider = "provider=search.collatordso;extended properties=’application=windows’;"
$connector = new-object system.data.oledb.oledbdataadapter -argument $sql, $provider
$dataset = new-object system.data.dataset
if ($connector.fill($dataset)) { $dataset.tables[0] }
The output is like:
SYSTEM.ITEMPATHDISPLAY SYSTEM.DATEMODIFIED SYSTEM.SIZE SYSTEM.FILEEXTENSION
---------------------- ------------------- ----------- --------------------
C:\test\file.pdf 27.08.2019 19:14:57 17119 .pdf
Thank you for your help!
I found a solution myself. For anyone interested:
Note: $searchPath must be a local drive on the machine you are running the script on, because the PDF files need to be indexed by Windows Search. For that you probably have to install an iFilter: https://superuser.com/questions/402673/how-to-search-inside-pdfs-with-windows-search
$searchString = "Merkblatt für nüchtern eintretende Patienten"
$searchPath = "Y:\"
$targetPath = "\\Server\Path\folder"
$sql = "SELECT System.ItemPathDisplay, System.DateModified, " +
"System.Size, System.FileExtension FROM SYSTEMINDEX " +
"WHERE SCOPE = '$searchPath' AND FREETEXT('$searchstring')"
$provider = "provider=search.collatordso;extended properties=’application=windows’;"
$connector = new-object system.data.oledb.oledbdataadapter -argument $sql, $provider
$dataset = new-object system.data.dataset
if ($connector.fill($dataset)) {
#$dataset.tables[0]
foreach ( $Row in $dataset.tables[0].Rows) {
$targetFile = $Row[0] -replace "^Y:", $targetPath
$targetSubfolder = Split-Path -Path $targetFile
#write-host "Targetfile : $targetFile"
#write-host "Moving: $($Row[0])"
Move-Item -Path $($Row[0]) -Destination $targetPath -Force
}
}
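Note that the loop computes $targetFile and $targetSubfolder but then moves everything into the flat $targetPath. If you would rather mirror the source folder structure at the destination, a variation of the loop body (a sketch built on the same variables) could be:
foreach ($Row in $dataset.tables[0].Rows) {
    $targetFile = $Row[0] -replace "^Y:", $targetPath
    $targetSubfolder = Split-Path -Path $targetFile
    # Create the mirrored subfolder on first use, then move the file into it
    if (-not (Test-Path $targetSubfolder)) {
        New-Item -ItemType Directory -Path $targetSubfolder -Force | Out-Null
    }
    Move-Item -Path $($Row[0]) -Destination $targetFile -Force
}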
Can someone please assist with the above subject?
I would like to copy one file from a specific folder in a SharePoint site collection to a library of the same name in a different SharePoint site collection (but still within the same web application).
I have very little PowerShell experience and have tried a number of Google searches, but cannot seem to find anything that works.
Below is an example of what I have tried to do (with lots of Write-Host calls to figure out what is going on), with the error message at the bottom.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$DestFolder = $dList.Files
$RootFolder = $sList.RootFolder
Write-Host " line 25 -- " $RootFolder
$collfiles1 = $RootFolder.Files
Write-Host " line 27 -- "$collfiles1
Write-Host " line 28 -- "$DestFolder
Write-Host " line 30 -- "$str = $DestinationWebURL"/"$DestinationLibraryTitle"/"$FileName
Write-Host " line 31 -- "$collfiles1.Count
for ($i = 0; $i -lt $collfiles1.Count; $i++)
{
    Write-Host " line 34 -- "$collfiles1[$i].Name
    ##Write-Host $FileName
    if ($collfiles1[$i].Name -eq $FileName)
    {
        ## $str = $DestinationWebURL.Url + $DestinationLibraryTitle + "/" + $FileName
        $str = $DestinationWebURL + "/" + $DestinationLibraryTitle + "/"
        Write-Host " line 40 -- "$str
        Write-Host " line 41 -- "$collfiles1[$i]
        $FiletoCopy = $collfiles1[$i].Name
        Write-Host " line 43 -- " $FiletoCopy
        $FiletoCopy.CopyTo($str, $true)
    }
}
Write-Host "Script Completed"
Running the above gives the error:
Cannot find an overload for "CopyTo" and the argument count: "2".
At line:44 char:3
+ $FiletoCopy.CopyTo($str,$true)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
If someone could point me in the right direction that would be very helpful.
Thanks in advance,
Ian.
The following PowerShell, for your reference, copies a file from a library in one site collection to a library in another site collection, together with its field values. (The error in your script occurs because $FiletoCopy holds the file name string, not the file object, and System.String has no CopyTo(string, bool) overload.)
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
#$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
#$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$SourceFile=$sWeb.GetFile($SourceWebURL+"/"+$SourceLibraryTitle+"/"+$FileName)
$TargetFolder = $dWeb.GetFolder($DestinationLibraryTitle)
#Copy File from the Source
$NewFile = $TargetFolder.Files.Add($SourceFile.Name, $SourceFile.OpenBinary(),$True)
#Copy Meta-Data from Source
foreach ($Field in $SourceFile.Item.Fields)
{
    if (!$Field.ReadOnlyField)
    {
        if ($NewFile.Item.Fields.ContainsField($Field.InternalName))
        {
            $NewFile.Item[$Field.InternalName] = $SourceFile.Item[$Field.InternalName]
        }
    }
}
#Update
$NewFile.Item.UpdateOverwriteVersion()
Write-host "Copied File:"$SourceFile.Name
Reference: Copy Files Between Document Libraries in SharePoint using PowerShell
For large files (over 50 MB), the script posted by @LZ_MSFT may not be able to copy the file at all. In that case you need to upload the file in small chunks. Here is a PowerShell script that copies from source to destination, chunking whenever the file is larger than 50 MB. A plus point of this script: it uses the client-side object model (CSOM), so it can be used with both SharePoint Online and on-premises.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Function UploadFileInSlice ($DestinationCtx, $SourceCtx, $SourceFileUrl, $DestinationFolderUrl, $fileName, $fileChunkSizeInMB) {
    # Each sliced upload requires a unique ID.
    $UploadId = [GUID]::NewGuid()
    # Get the file by server relative URL
    $File = $SourceCtx.Web.GetFileByServerRelativeUrl($SourceFileUrl)
    $SourceCtx.Load($File)
    # Get the file stream with OpenBinaryStream
    $StreamToUpload = $File.OpenBinaryStream()
    $SourceCtx.ExecuteQuery()
    # File size in bytes
    $FileSize = ($File).length
    # Get the destination folder by server relative URL
    $DestinationFolder = $DestinationCtx.Web.GetFolderByServerRelativeUrl($DestinationFolderUrl)
    $DestinationCtx.Load($DestinationFolder)
    $DestinationCtx.ExecuteQuery()
    # Set the complete destination URL: destination folder + file name
    $destUrl = $DestinationFolderUrl + "/" + $fileName
    # File object.
    [Microsoft.SharePoint.Client.File] $upload
    # Calculate block size in bytes.
    $BlockSize = $fileChunkSizeInMB * 1000 * 1000
    Write-Host "File size is: $FileSize bytes and chunking size is: $BlockSize bytes"
    if ($FileSize -le $BlockSize)
    {
        # Use the regular approach if the file is smaller than BlockSize
        Write-Host "File uploading without chunking"
        $upload = [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($DestinationCtx, $destUrl, $StreamToUpload.Value, $true)
        return $upload
    }
    else
    {
        # Use the large-file upload approach.
        $BytesUploaded = $null
        $Fs = $null
        Try {
            $br = New-Object System.IO.BinaryReader($StreamToUpload.Value)
            #$br = New-Object System.IO.BinaryReader($Fs)
            $buffer = New-Object System.Byte[]($BlockSize)
            $lastBuffer = $null
            $fileoffset = 0
            $totalBytesRead = 0
            $bytesRead
            $first = $true
            $last = $false
            # Read data from the source stream in blocks.
            while (($bytesRead = $br.Read($buffer, 0, $buffer.Length)) -gt 0) {
                $totalBytesRead = $totalBytesRead + $bytesRead
                # You've reached the end of the file.
                if ($totalBytesRead -eq $FileSize) {
                    $last = $true
                    # Copy to a new buffer that has the correct size.
                    $lastBuffer = New-Object System.Byte[]($bytesRead)
                    [array]::Copy($buffer, 0, $lastBuffer, 0, $bytesRead)
                }
                If ($first)
                {
                    $ContentStream = New-Object System.IO.MemoryStream
                    # Add an empty file.
                    $fileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
                    $fileCreationInfo.ContentStream = $ContentStream
                    $fileCreationInfo.Url = $fileName
                    $fileCreationInfo.Overwrite = $true
                    # Add the file to the destination folder with the file creation info
                    $Upload = $DestinationFolder.Files.Add($fileCreationInfo)
                    $DestinationCtx.Load($Upload)
                    # Start the upload by uploading the first slice.
                    $s = New-Object System.IO.MemoryStream(,$Buffer)
                    Write-Host "Uploading id is: $UploadId"
                    # Call the start upload method on the first slice.
                    $BytesUploaded = $Upload.StartUpload($UploadId, $s)
                    $DestinationCtx.ExecuteQuery()
                    # fileoffset is the pointer where the next slice will be added.
                    $fileoffset = $BytesUploaded.Value
                    Write-Host "First batch of file uploaded, bytes: $fileoffset"
                    # You can only start the upload once.
                    $first = $false
                }
                Else
                {
                    # Get a reference to your file.
                    $Upload = $DestinationCtx.Web.GetFileByServerRelativeUrl($destUrl);
                    If ($last) {
                        # This is the last slice of data.
                        $s = New-Object System.IO.MemoryStream(,$lastBuffer)
                        # End the sliced upload by calling FinishUpload.
                        $Upload = $Upload.FinishUpload($UploadId, $fileoffset, $s)
                        $DestinationCtx.ExecuteQuery()
                        Write-Host "File Upload Completed Successfully!"
                        # Return the file object for the uploaded file.
                        return $Upload
                    }
                    else {
                        $s = New-Object System.IO.MemoryStream(,$buffer)
                        # Continue the sliced upload.
                        $BytesUploaded = $Upload.ContinueUpload($UploadId, $fileoffset, $s)
                        $DestinationCtx.ExecuteQuery()
                        # Update fileoffset for the next slice.
                        $fileoffset = $BytesUploaded.Value
                        Write-Host "File uploading is in progress, bytes so far: $fileoffset"
                    }
                }
            } #// while (($bytesRead = $br.Read($buffer, 0, $buffer.Length)) -gt 0)
        }
        Catch {
            Write-Host $_.Exception.Message -ForegroundColor Red
        }
        Finally {
            if ($Fs -ne $null)
            {
                $Fs.Dispose()
            }
        }
    }
    return $null
}
#URLs to configure; in this case the destination is an SP Online site URL
#Credentials are prompted here; you could use the Get-Credential cmdlet too
$DestnationSiteUrl = "https://your-domain.sharepoint.com/sites/xyz"
$DestinationRelativeURL = "/sites/xyz/TestLibrary" #server relative URL here with library name and folder name
$DestinationUserName = "xyz@your-domain.com"
$DestinationPassword = Read-Host "Enter Password for Destination User: $DestinationUserName" -AsSecureString
#In this case the source is an on-prem site URL
$SourceSiteUrl = "http://intranet/sites/xyz"
$SourceRelativeURL = "/sites/xyz/TestLibrary/myfile.pptx" #server relative URL here with library name and file name with extension
$SourceUsername = "domain\xyz"
$SourcePassword = Read-Host "Enter Password for Source User: $SourceUsername" -AsSecureString
#Set a file name with extension
$FileNameWithExt = "myfile.pptx"
#Get the source client context with credentials
$SourceContext = New-Object Microsoft.SharePoint.Client.ClientContext($SourceSiteUrl)
#Using NetworkCredential in case of on-prem
$SourceContext.Credentials = New-Object System.Net.NetworkCredential($SourceUsername, $SourcePassword)
$SourceContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$SourceContext.ExecuteQuery();
#Get the destination client context with credentials
$DestinationContext = New-Object Microsoft.SharePoint.Client.ClientContext($DestnationSiteUrl)
#Using SharePointOnlineCredentials in case of SP Online
$DestinationContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($DestinationUserName, $DestinationPassword)
$DestinationContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$DestinationContext.ExecuteQuery();
#All set up; now just call UploadFileInSlice with parameters
$UpFile = UploadFileInSlice -DestinationCtx $DestinationContext -SourceCtx $SourceContext -DestinationFolderUrl $DestinationRelativeURL -SourceFileUrl $SourceRelativeURL -fileName $FileNameWithExt -fileChunkSizeInMB 10
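As a quick sanity check (my addition, not part of the original answer), you can load the uploaded file back from the destination and compare its size with the source:
if ($UpFile -ne $null) {
    # Assumption: the file landed at $DestinationRelativeURL/$FileNameWithExt
    $check = $DestinationContext.Web.GetFileByServerRelativeUrl($DestinationRelativeURL + "/" + $FileNameWithExt)
    $DestinationContext.Load($check)
    $DestinationContext.ExecuteQuery()
    Write-Host "Destination file size (bytes):" $check.Length
}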
I get the following error:
Cannot index into a null array.
At C:\tmp\Folder\excel\output\net45\test.ps1:14 char:1
+ $Data = $Reader.AsDataSet().Tables[0].Rows
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : NullArray
# Zero based index. The second row has index 1.
$StartRow = 2
# Input File
$InputFileName = "C:\tmp\Folder\excel\output\net20\test.xlsx"
# Output File
$OutputFileName = "C:\tmp\Folder\excel\output\net20\SomeFile.csv"
# Path to Excel.dll is saved (downloaded from http://exceldatareader.codeplex.com/)
$DllPath = "C:\tmp\Folder\excel\output\net45\Excel.4.5.dll"
[void]([Reflection.Assembly]::LoadFrom($DllPath))
$Stream = New-Object IO.FileStream($InputFileName, "Open", "Read")
$Reader = [Excel.ExcelReaderFactory]::CreateBinaryReader($Stream)
$Data = $Reader.AsDataSet().Tables[0].Rows
# Read the column names. Order should be preserved
$Columns = $Data[$StartRow].ItemArray
# Sort the remaining data into an object using the specified columns
$Data[$($StartRow + 1)..$($Data.Count - 1)] | % {
# Create an object
$Output = New-Object Object
# Read each column
for ($i = 0; $i -lt $Columns.Count; $i++) {
$Output | Add-Member NoteProperty $Columns[$i] $_.ItemArray[$i]
}
# Leave it in the output pipeline
$Output
} | Export-CSV $OutputFileName -NoType
You're calling the binary method (.xls) and using an Open XML format file (.xlsx). Try using [Excel.ExcelReaderFactory]::CreateOpenXmlReader($Stream) instead.
This works for me:
$DllPath = 'C:\Excel.DataReader.45\Excel.4.5.dll';
$FilePath = 'C:\Students.xlsx';
$FileMode = [System.IO.FileMode]::Open;
$FileAccess = [System.IO.FileAccess]::Read;
Add-Type -Path $DllPath;
$FileStream = New-Object -TypeName System.IO.FileStream $FilePath, $FileMode, $FileAccess;
$ExcelDataReader = [Excel.ExcelReaderFactory]::CreateOpenXmlReader($FileStream);
$ExcelDataReader.IsFirstRowAsColumnNames = $true;
$ExcelDataSet = $ExcelDataReader.AsDataSet();
$ExcelDataReader.Dispose();
$FileStream.Close();
$FileStream.Dispose();
$ExcelDataSet.Tables | Format-Table -AutoSize
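And to get from the loaded DataSet back to the CSV the question was after, the first table can be exported directly. A sketch; the output path is just an example:
# Column names come from the first worksheet row because
# IsFirstRowAsColumnNames was set above.
$table = $ExcelDataSet.Tables[0]
$table | Select-Object -Property $table.Columns.ColumnName |
    Export-Csv -Path 'C:\Students.csv' -NoTypeInformation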
If you're still having trouble, you might consider using the Microsoft.ACE.OLEDB.12.0 provider, which you install separately from Office. There's some doc here.
I've read this "Convert XLS to CSV on command line" and this "convert-xlsx-file-to-csv-using-batch" before in a similar doubt I have. Try too see if it helps.