Converting .accdb to CSV with PowerShell

I am trying to convert some Excel (.xlsx) and Access (.accdb) files to CSV.
I quickly found a way to do this for Excel, but now I cannot find any helpful documentation on converting .accdb files.
So far I have:
$adOpenStatic = 3
$adLockOptimistic = 3
$objConnection = New-Object -com "ADODB.Connection"
$objRecordSet = New-Object -com "ADODB.Recordset"
$objConnection.Open("Provider = Microsoft.ACE.OLEDB.12.0; Data Source = " + $Filepath)
$objRecordset.Open("Select * From TableName",$objConnection,$adOpenStatic, $adLockOptimistic)
#Here I need some way to either saveas .csv or loop through
#each row and pass to csv.
$objRecordSet.Close()
$objConnection.Close()
Any ideas?
I would be willing to do this in another language (VB, Java, PHP) if anyone knows a way.

If you use .NET rather than COM it's a lot easier. Here's some code to handle Excel .xlsx files:
#Even with Excel 2010 installed, I needed to install the ACE provider:
#http://www.microsoft.com/downloads/en/details.aspx?FamilyID=c06b8369-60dd-4b64-a44b-84b371ede16d&displaylang=en
#Be careful to run this in the "right" version of PowerShell: x86 vs. x64
#Change these settings as needed
$filepath = 'C:\Users\u00\Documents\backupset.xlsx'
#Comment/Uncomment connection string based on version
#Connection String for Excel 2007:
$connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=`"$filepath`";Extended Properties=`"Excel 12.0 Xml;HDR=YES`";"
#Connection String for Excel 2003:
#$connString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=`"$filepath`";Extended Properties=`"Excel 8.0;HDR=Yes;IMEX=1`";"
$qry = 'select * from [backupset$]'
$conn = new-object System.Data.OleDb.OleDbConnection($connString)
$conn.open()
$cmd = new-object System.Data.OleDb.OleDbCommand($qry,$conn)
$da = new-object System.Data.OleDb.OleDbDataAdapter($cmd)
$dt = new-object System.Data.dataTable
[void]$da.fill($dt)
$conn.close()
$dt | export-csv ./test.csv -NoTypeInformation
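The same System.Data.OleDb approach carries over to the Access .accdb files from the question; only the connection string and query change. A minimal sketch (the file path and table name MyTable are placeholders for your own):
```
# the ACE provider opens .accdb directly; no Extended Properties needed for Access
$filepath = 'C:\Users\u00\Documents\database.accdb'   # hypothetical path
$connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=`"$filepath`";"
$qry = 'SELECT * FROM [MyTable]'                      # replace with your table name

$conn = New-Object System.Data.OleDb.OleDbConnection($connString)
$conn.Open()
$cmd = New-Object System.Data.OleDb.OleDbCommand($qry, $conn)
$da = New-Object System.Data.OleDb.OleDbDataAdapter($cmd)
$dt = New-Object System.Data.DataTable
[void]$da.Fill($dt)
$conn.Close()
$dt | Export-Csv ./MyTable.csv -NoTypeInformation
```
The same x86/x64 caveat applies: the ACE provider must be installed for the bitness of the PowerShell process running the script.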

If you want to stick with the ADODB COM object:
# loop through all records - do work on each record to convert it to CSV
$objRecordset.Open("SELECT * FROM TableName", $objConnection, $adOpenStatic, $adLockOptimistic)
# the recordset is already positioned on the first record; a while loop also
# handles an empty recordset (MoveFirst would throw if there were no records)
while (-not $objRecordset.EOF) {
    # do your work to get each field and convert this item to CSV
    # fields are available via: $objRecordset.Fields['fieldname'].Value
    $objRecordset.MoveNext()
}
$objRecordset.Close()
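To make that loop concrete, one way (a sketch; the table name and output path are placeholders) is to build a PSObject per record and let Export-Csv handle the quoting:
```
$rows = @()
$objRecordset.Open("SELECT * FROM TableName", $objConnection, $adOpenStatic, $adLockOptimistic)
while (-not $objRecordset.EOF) {
    $props = @{}
    foreach ($field in $objRecordset.Fields) {
        $props[$field.Name] = $field.Value    # one property per column
    }
    $rows += New-Object PSObject -Property $props
    $objRecordset.MoveNext()
}
$objRecordset.Close()
$rows | Export-Csv C:\temp\TableName.csv -NoTypeInformation
```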

Related

Creating a PowerShell array

I am a newbie to PowerShell and am trying to automate a complex process. I need to copy and paste 3 columns of data from one spreadsheet to a second one, which will then be exported to an SAP transaction line by line. In my search I found the link below, which discusses using $arr1 = @(0) * 20 to create the array. I have two questions.
The first question is: what is the (0) referencing?
Also, using this formula, how do I reference the workbook, sheet, and column that I need to use to create the array? Any help would be greatly appreciated.
PowerShell array initialization
@() is an array literal (typically used to define an empty or single-element array).
You can also use a list object (System.Collections.ArrayList), which might be more convenient since it has a dynamic size.
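To answer the first question directly: in $arr1 = @(0) * 20, the (0) is a one-element array literal containing the number 0, and * 20 replicates it, giving an array of twenty zeros:
```
$arr1 = @(0) * 20    # @(0) is a one-element array; * 20 repeats it 20 times
$arr1.Count          # 20
$arr1[5] = 42        # elements can then be assigned by index
```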
Below is a quick and dirty example with COM object (reads data from range into array list)
$excel = New-Object -com excel.application
$excel.Visible = $false
$wb = $excel.Workbooks.Open("$home\Documents\book1.xlsx")
$ws = $wb.Sheets.Item("Sheet1")
$data = New-Object System.Collections.ArrayList
foreach ($i in 1..20){
    $data.Add( (New-Object PSObject -Property @{A=$ws.Range("A$i").Value2; B=$ws.Range("B$i").Value2}) ) | Out-Null
}
Write-Output $data
$wb.Close()
$excel.Quit()   # quit Excel so the COM process does not linger

I have some code to download all the tables in my database to CSV. Is there a way to specify row separators?

Currently the code uses a comma for the column separator and a new line for the row separator.
This is an issue because some of the data in the tables consists of paragraphs, which already include commas and new lines.
I want to use a delimiter with multiple characters, but that returns an error:
Cannot bind parameter 'Delimiter'. Cannot convert value '/~/' to type 'System.Char'
$server = "(server)\instance"
$database = "DBName"
$tablequery = "SELECT name from sys.tables"
#Declare Connection Variables
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $tablequery
$command.Connection = $connection
#Load up the Tables in a dataset
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
# Loop through all tables and export a CSV of the Table Data
foreach ($Row in $DataSet.Tables[0].Rows)
{
$queryData = "SELECT * FROM [$($Row[0])]"
#Specify the output location of your dump file
$extractFile = "C:\temp\backups\$($Row[0]).csv"
$command.CommandText = $queryData
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv $extractFile -Delimiter '/~/'
}
Export-Csv and ConvertTo-Csv correctly handle newline and comma characters
It is not a problem that the data could potentially contain commas and/or new lines.
If you look at the CSV "specification" (in quotation marks here because many usages of the term CSV do not refer to anything following this specification), you'll see that a field in a CSV file can be enclosed in quote characters. If the data of that field contains a quote character, the delimiter character or a newline character, the field must be enclosed in quote characters, and any quote character in the data should be doubled.
This is all handled correctly by the ConvertTo-Csv and Export-Csv cmdlets.
$obj = New-Object PSObject -Property ([ordered]@{
    FirstColumn = "First Value";
    SecondColumn = "Second value, including a comma";
    ThirdColumn = "Third Value including two`nnewline`ncharacters";
    FourthColumn = 'Fourth value including a " character'}
)
$obj | ConvertTo-Csv -NoTypeInformation
This will give us the following output:
"FirstColumn","SecondColumn","ThirdColumn","FourthColumn"
"First Value","Second value, including a comma","Third Value including two
newline
characters","Fourth value including a "" character"
Which is correctly handled according to the CSV specification.
So you do not need to worry about the data containing comma characters or newline characters, since it is handled by the CSV format.
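You can verify the round trip yourself: a value with an embedded comma and newline survives Export-Csv followed by Import-Csv unchanged (the path is illustrative):
```
$obj = New-Object PSObject -Property @{Notes = "A paragraph, with a comma`nand a newline"}
$obj | Export-Csv C:\temp\roundtrip.csv -NoTypeInformation
$back = Import-Csv C:\temp\roundtrip.csv
$back.Notes -eq $obj.Notes    # True: Import-Csv reassembles the quoted multi-line field
```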
Open a CSV file in Excel
I don't know what your current problem with the data is, but I'm guessing you're trying to open the resulting file in Excel and seeing incorrect data. This is because Excel unfortunately doesn't open .csv files as... well... CSV files.
One way (there might be more ways) to open it in Excel is to go on the Data tab and in the "Get & Transform Data" section press the "From Text/CSV" button. This way, Excel should open the file correctly according to the CSV standard.

Upload file to SharePoint Online with metadata using PowerShell

I'm trying to upload a batch of files to SharePoint Online using PowerShell and include metadata too (column fields). I know how to upload files; this works fine:
$fs = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)
$fci= New-Object Microsoft.SharePoint.Client.FileCreationInformation
$fci.Overwrite = $true
$fci.ContentStream = $fs
$fci.URL = $file
$upload = $list.RootFolder.Files.Add($fci)
$ctx.Load($upload)
$ctx.ExecuteQuery()
and I know how to edit fields/columns, this works:
...
$item["project"] = "Test Project"
$item.Update()
...
$list.Update()
$ctx.ExecuteQuery()
but I don’t know how to tie the two together. I need to get an item reference to the file I’ve uploaded so that I can then update the item/file's metadata. As you can guess, PowerShell and SharePoint are all new to me!
The following example demonstrates how to upload a file and set file metadata using SharePoint CSOM API in PowerShell:
$filePath = "C:\Users\jdoe\Documents\SharePoint User Guide.docx" #source file path
$listTitle = "Documents"
$targetList = $Context.Web.Lists.GetByTitle($listTitle) #target list
#1.Upload a file
$fci= New-Object Microsoft.SharePoint.Client.FileCreationInformation
$fci.Overwrite = $true
$fci.Content = [System.IO.File]::ReadAllBytes($filePath)
$fci.URL = [System.IO.Path]::GetFileName($filePath)
$uploadFile = $targetList.RootFolder.Files.Add($fci)
#2.Set metadata properties
$listItem = $uploadFile.ListItemAllFields
$listItem["LastReviewed"] = [System.DateTime]::Now
$listItem.Update()
$Context.Load($uploadFile)
$Context.ExecuteQuery()
@vadimGremyachev --- thanks! Question... I have a CSV of files that I'm uploading. One of the CSV columns is a metadata tag, which I'm using a hash table to convert to its GUID from the term store. Of the 500 files, I have ~5 files that refuse to set the value in SPO. It does NOT throw an error. I know the GUID is correct in my hash table because other documents are tagged with the shared value from the hash.
#2.Set metadata properties
$listItem = $upload.ListItemAllFields
$listItem["LegacySharePointFolder"] = $row.LegacySharePointFolder
$listItem.Update()
$listItem["Customer"] = $TermStoreCustomerHash[$row.CUSTOMER]
$listItem.Update()
Thanks!
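One thing worth checking in the follow-up snippet: CSOM batches changes client-side, and nothing is sent to the server until ExecuteQuery() runs, which is not shown after the Update() calls. Both fields can be set in a single round trip (a sketch reusing the variable names above):
```
$listItem = $upload.ListItemAllFields
$listItem["LegacySharePointFolder"] = $row.LegacySharePointFolder
$listItem["Customer"] = $TermStoreCustomerHash[$row.CUSTOMER]
$listItem.Update()          # queues both field changes
$ctx.Load($listItem)
$ctx.ExecuteQuery()         # only now are the changes committed to SharePoint
```
Also, if Customer is a managed-metadata (taxonomy) column, assigning a bare GUID string can be silently ignored; single-value taxonomy fields typically expect the text form "-1;#Label|<term GUID>" or a TaxonomyFieldValue set via SetFieldValueByValue.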

How to insert strings into a table with their UTF-8 encodings?

I am trying to upload some string values into an Oracle table by means of PowerShell. However, when I upload strings directly, some characters show up as ? in the table.
Actually, I first parse some text and retrieve results with a regex, as below:
if($wiki_link -match "http:\/\/en\.wikipedia\.org\/wiki\/(.*)") {$city = $matches[1]}
Then I wanna upload this $city variable into a table as below:
[System.Reflection.Assembly]::LoadWithPartialName("System.Data.OracleClient")
$connectionString = "Data Source=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(Host=xxxxxxxxx)(Port=1521)))(CONNECT_DATA=(SERVER = DEDICATED) (SERVICE_NAME =xxxxx)));user id=xxxxxx;password=xxxxx"
$connection = New-Object System.Data.OracleClient.OracleConnection($connectionString)
$connection.Open()
$cmd2=$connection.CreateCommand()
$cmd2.CommandText="insert into mehmet.goo_region (city) values ('$city')"
$rdr2=$cmd2.ExecuteNonQuery()
When I apply this method, the city named Elâzığ appears as Elaz?? in the table cell.
I guess I have to convert string into UTF-8 but I could not find a solution through web.
Thanks in advance...
Try this, it should work:
$u = New-Object System.Text.UTF8Encoding
$s = $u.GetBytes("YourStringGoesHere")
$u.GetString($s) ## this is your UTF-8 string
So your code becomes
$u = New-Object System.Text.UTF8Encoding
$s = $u.GetBytes($city)
$utf8city = $u.GetString($s)
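Note that encoding to UTF-8 bytes and decoding them again yields the same .NET string, so if characters are still lost it is worth trying a bound parameter instead of splicing $city into the SQL text; that lets the driver send the value as Unicode and also avoids SQL injection. A sketch (it assumes the city column is NVARCHAR2):
```
# hypothetical parameterized version of the insert above
$cmd2 = $connection.CreateCommand()
$cmd2.CommandText = "insert into mehmet.goo_region (city) values (:city)"
$param = $cmd2.Parameters.Add("city", [System.Data.OracleClient.OracleType]::NVarChar)
$param.Value = $city
$rdr2 = $cmd2.ExecuteNonQuery()
```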

File IO: is this a bug in PowerShell?

I have the following code in Powershell
$filePath = "C:\my\programming\Powershell\output.test.txt"
try
{
$wStream = new-object IO.FileStream $filePath, [System.IO.FileMode]::Append, [IO.FileAccess]::Write, [IO.FileShare]::Read
$sWriter = New-Object System.IO.StreamWriter $wStream
$sWriter.writeLine("test")
}
I keep getting error:
Cannot convert argument "1", with value: "[IO.FileMode]::Append", for
"FileStream" to type "System.IO.FileMode": "Cannot convert value
"[IO.FileMode]::Append" to type "System.IO.FileMode" due to invalid
enumeration values. Specify one of the following enumeration values
and try again. The possible enumeration values are "CreateNew, Create,
Open, OpenOrCreate, Truncate, Append"."
I tried the equivalent in C#,
FileStream fStream = null;
StreamWriter stWriter = null;
try
{
fStream = new FileStream(@"C:\my\programming\Powershell\output.txt", FileMode.Append, FileAccess.Write, FileShare.Read);
stWriter = new StreamWriter(fStream);
stWriter.WriteLine("hahha");
}
it works fine!
What's wrong with my PowerShell script? BTW, I am running on PowerShell version:
Major Minor Build Revision
----- ----- ----- --------
3 2 0 2237
Another way would be to use just the names of the enumeration values and let PowerShell cast them to the target type:
New-Object IO.FileStream $filePath ,'Append','Write','Read'
When using the New-Object cmdlet and the target type's constructor takes parameters, you should either use the -ArgumentList parameter (of New-Object) or wrap the parameters in parentheses - I prefer to wrap my constructor arguments in parens:
# setup some convenience variables to keep each line shorter
$path = [System.IO.Path]::Combine($Env:TEMP,"Temp.txt")
$mode = [System.IO.FileMode]::Append
$access = [System.IO.FileAccess]::Write
$sharing = [IO.FileShare]::Read
# create the FileStream and StreamWriter objects
$fs = New-Object IO.FileStream($path, $mode, $access, $sharing)
$sw = New-Object System.IO.StreamWriter($fs)
# write something and remember to call Dispose to clean up the resources
$sw.WriteLine("Hello, PowerShell!")
$sw.Dispose()
$fs.Dispose()
New-Object cmdlet online help: http://go.microsoft.com/fwlink/?LinkID=113355
Yet another way is to enclose the enum values in parentheses:
$wStream = new-object IO.FileStream $filePath, ([System.IO.FileMode]::Append), `
([IO.FileAccess]::Write), ([IO.FileShare]::Read)
If your goal is to write to a log file or text file, you could also use the cmdlets PowerShell provides for this:
Get-Help Out-File -Detailed
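For example, appending a line to a text file with Out-File needs no stream plumbing at all (the path is taken from the question):
```
# -Append corresponds to FileMode::Append; Out-File creates the file if needed
"test" | Out-File -FilePath C:\my\programming\Powershell\output.test.txt -Append
```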