How can I copy a column value from excel to txt file - powershell

I'm new to PowerShell. I have a number of Excel files (500+) with a column Animal Count that I would like to save to a new '.txt' file. Can anyone give me tips on how to achieve this?

Looking at the image you provided, the count value is not in a column called 'Animal count', but in the cell next to a label with that text.
As for the type of output, I would recommend not using a .txt file, but outputting the found info as a CSV file so you can keep the file names and the animal count values in a structured way.
Try:
$Source = 'D:\Test' # the path to where the Excel files are
# create an Excel COM object
$excel = New-Object -ComObject Excel.Application
# find Excel files in the Source path and loop through.
# you may want to add the -Recurse switch here if the code should also look inside subfolders
$result = Get-ChildItem -Path $Source -Filter '*.xlsx' -File | ForEach-Object {
    $workBook  = $excel.Workbooks.Open($_.FullName)
    $workSheet = $workBook.Sheets.Item(1)
    $count = 0
    $label = $workSheet.Cells.Find('*Animal count*')
    if ($label) {
        # get the numeric value for the cell next to the label
        # empty cells will translate to 0
        $count = [int]$workSheet.Cells.Item($label.Row, $label.Column + 1).Value()
    }
    # output a PSObject with the full filename and the animal count value
    [PsCustomObject]@{
        'File'        = $_.FullName
        'AnimalCount' = $count
    }
    $workBook.Close()
}
# quit Excel and clean up the used COM objects
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workSheet) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workBook) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
# output on screen
$result | Format-Table -AutoSize
#output to CSV file
$result | Export-Csv -Path 'D:\Test\AnimalCount.csv' -UseCulture -NoTypeInformation
The result on screen will look something like this:
File               AnimalCount
----               -----------
D:\Test\File1.xlsx         165
D:\Test\File2.xlsx           0
D:\Test\File3.xlsx       87596
Edit
Since you've commented that the labels are in merged cells, you need to use this to find the value for Animal count:
$label = $workSheet.Range('$A:$B').Find('*Animal count*')
if ($label) {
    # get the numeric value for the cell next to the label
    # empty cells will translate to 0
    $count = [int]$workSheet.Cells.Item($label.Row, $label.Column + 2).Value()
}
That is assuming there are two cells merged into one.
P.S. If the animal count value can ever exceed 2147483647, cast to [int64] instead of [int]

You could use Import-Csv to turn the Excel file into a PS object, and the columns would be the new object's properties.
$excel = Import-Csv $excelPath
$excel.Animals | out-file $txtPath
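Note that Import-Csv reads plain-text CSV data, so this assumes the workbook has already been saved as a .csv and has a column named Animals. If the file is still an .xlsx, a minimal sketch of converting it first (the paths here are just examples) could look like this:
$xlsxPath = 'D:\Test\animals.xlsx'
$csvPath  = [System.IO.Path]::ChangeExtension($xlsxPath, '.csv')

$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $false
$workbook = $excel.Workbooks.Open($xlsxPath)
$workbook.SaveAs($csvPath, 6)   # 6 = xlCSV; saves the active sheet as CSV
$workbook.Close()
$excel.Quit()

# now the CSV can be read as objects and the column written to a txt file
(Import-Csv -Path $csvPath).Animals | Out-File -FilePath 'D:\Test\animals.txt'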

Try this. It is for one file, saved to a .txt file with the same name.
The same thing can be done for more files with a foreach loop (see the sketch after the code).
$FileName = "C:\temp\test.xlsx"
$Excel = New-Object -ComObject Excel.Application
$Excel.visible = $false
$Excel.DisplayAlerts = $false
$WorkBook = $Excel.Workbooks.Open($FileName)
$NewFilePath = [System.IO.Path]::ChangeExtension($FileName,".txt")
$Workbook.SaveAs($NewFilepath, 42) # xlUnicodeText
# cleanup
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($WorkBook) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
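Here is a minimal sketch of the same idea wrapped in a foreach loop over every .xlsx file in a folder (the folder path 'C:\temp' is just an assumption):
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $false
$Excel.DisplayAlerts = $false

foreach ($file in Get-ChildItem -Path 'C:\temp' -Filter '*.xlsx' -File) {
    $WorkBook = $Excel.Workbooks.Open($file.FullName)
    # save each workbook next to the original, with a .txt extension
    $NewFilePath = [System.IO.Path]::ChangeExtension($file.FullName, ".txt")
    $WorkBook.SaveAs($NewFilePath, 42) # xlUnicodeText
    $WorkBook.Close()
}

$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel) | Out-Null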

Related

Filling Color to the first row in excell sheet using PS script

I wrote a script that exports all the SSL certificate details to an Excel sheet, but I would like help filling the first row of the sheet with a color.
Please help me in writing the script.
My Script
Clear-Host
$threshold = 300 # Number of days to look for expiring certificates
$deadline = (Get-Date).AddDays($threshold) # Set deadline date
Invoke-Command -ComputerName 'AAA', 'BBB' {
    Get-ChildItem -Path 'Cert:\LocalMachine\My' -Recurse |
        Select-Object -Property Issuer, Subject, NotAfter,
            @{Label = 'ServerName'; Expression = {$env:COMPUTERNAME}},
            @{Label = 'Expires In (Days)'; Expression = {(New-TimeSpan -Start (Get-Date) -End $PSItem.NotAfter).Days}}
} | Export-Csv -Path C:\users\$env:username\documents\WorkingScript.csv -NoTypeInformation -Force
Thanks in Advance.
You are creating a CSV file, which does not hold formatting. Instead, you could interact with Excel directly and create the formatting that way.
As Olaf mentioned, you could import a module to do this, or use the Excel COM object.
Example using the COM object below:
# Creating COM object to interact with excel
$excel = New-Object -ComObject Excel.Application
# set this to $false to hide the application
$excel.Visible = $true
# Add a workbook to the application
$workbook = $excel.Workbooks.Add()
# Adding a workbook automatically adds a sheet.
# We select it and then name it
$worksheetOne = $workbook.Worksheets.Item(1)
$worksheetOne.Name = 'Data'
# Setting the text in two different cells
$worksheetOne.Cells.Item(1, 1) = 'Column One Text'
$worksheetOne.Cells.Item(1, 2) = 'Column Two Text'
# Selecting the EntireRow of the cell "1,1" and setting it to a color
$worksheetOne.Cells.Item(1, 1).EntireRow.Interior.ColorIndex = 4
# Setting the same row to bold
$worksheetOne.Cells.Item(1, 1).EntireRow.Font.Bold = $true
# Option autofit all columns
$worksheetOne.UsedRange.EntireColumn.AutoFit() | Out-Null
# Save the file
$excel.ActiveWorkbook.SaveAs('C:\Users\Username\example.xlsx')
You can see some of the colors below.
https://learn.microsoft.com/en-us/office/vba/api/excel.colorindex
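Note the example above leaves Excel running; when you're done you would typically quit it and release the COM objects, as in the other answers on this page. A minimal sketch:
# quit Excel and clean up the COM objects used above
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($worksheetOne) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workbook) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()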

Powershell Mass Rename files with a excel reference list

I need help with PowerShell.
I will have to start renaming files on a weekly basis; I will be renaming more than 100 files a week, each with a dynamic name.
The files I want to rename are in a folder named Scans located at "C:\Documents\Scans", and they are in order, say by time scanned.
I have an Excel file located at "C:\Documents\Mapping\New File Name.xlsx".
The workbook has only one sheet, and the new names are in column A with x rows. As mentioned above, each cell will have a different name.
Please add comments to your suggestions so that I can understand what is going on, since I'm new to coding.
Thank you all for your time and help.
Although I agree with Ad Kasenally that it would be easier to use CSV files, here's something that may work for you.
$excelFile = 'C:\Documents\Mapping\New File Name.xlsx'
$scansFolder = 'C:\Documents\Scans'
########################################################
# step 1: get the new filenames from the first column in
# the Excel spreadsheet into an array '$newNames'
########################################################
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook = $excel.Workbooks.Open($excelFile)
$worksheet = $workbook.Worksheets.Item(1)
$newNames = @()
$i = 1
while ($worksheet.Cells.Item($i, 1).Value() -ne $null) {
    $newNames += $worksheet.Cells.Item($i, 1).Value()
    $i++
}
$excel.Quit()
# IMPORTANT: clean-up used Com objects
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($worksheet) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workbook) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
########################################################
# step 2: rename the 'scan' files
########################################################
$maxItems = $newNames.Count
if ($maxItems) {
    $i = 0
    Get-ChildItem -Path $scansFolder -File -Filter 'scan*' |        # get a list of FileInfo objects in the folder
        Sort-Object { [int]($_.BaseName -replace '\D+', '') } |     # sort by the numeric part of the filename
        Select-Object -First $maxItems |                            # select no more than there are items in the $newNames array
        ForEach-Object {
            try {
                Rename-Item -Path $_.FullName -NewName $newNames[$i] -ErrorAction Stop
                Write-Host "File '$($_.Name)' renamed to '$($newNames[$i])'"
                $i++
            }
            catch {
                throw
            }
        }
}
else {
    Write-Warning "Could not get any new filenames from the $excelFile file.."
}
You may want to have 2 columns in the excel file:
original file name
target file name
From there you can save the file as a CSV.
Use Import-Csv to pull the data into PowerShell and a ForEach loop to cycle through each row with a command like move $item.original $item.target.
There are abundant threads describing using Import-Csv with ForEach.
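For example, a minimal sketch of that approach (the CSV path and the column names original/target are assumptions based on the two columns above):
# read the rename mapping and move each file to its new name
$map = Import-Csv -Path 'C:\Documents\Mapping\renames.csv'
foreach ($item in $map) {
    Move-Item -Path (Join-Path 'C:\Documents\Scans' $item.original) `
              -Destination (Join-Path 'C:\Documents\Scans' $item.target)
}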
Good luck.

Powershell skip first 2 lines of txt file when importing it

I have a powershell script designed to read a txt file on a remote server and import it into SQL.
I want to be able to skip the first 2 lines of the txt file. I am currently using the code below to import the file. The txt file is delimited.
$datatable = New-Object System.Data.DataTable
$reader = New-Object System.IO.StreamReader($empFile)
$columns = (Get-Content $empFile -First 1).Split($empFileDelimiter)
if ($FirstRowColumnNames -eq $true)
{
    $null = $reader.ReadLine()
}
foreach ($column in $columns)
{
    $null = $datatable.Columns.Add()
}
# Read in the data, line by line, not column by column
while (($line = $reader.ReadLine()) -ne $null)
{
    $null = $datatable.Rows.Add($line.Split($empFileDelimiter))
}
The column parameter takes the first line of the txt file and creates the columns for the PS datatable.
The problem I have is the first two lines of the txt file are not needed and I need to skip them and use the third line of the txt file for the columns. I have the following line of code which will do this but I am uncertain how to integrate it into my code.
get-content $empFile | select-object -skip 2
Create an array from $empFile without the first two lines, then use the first item of the array for the columns, like this:
$Content = Get-Content $empFile | Select-Object -Skip 2
$columns = $Content[0].Split($empFileDelimiter)
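If it helps to see it integrated, here is a minimal sketch (assuming the same $empFile and $empFileDelimiter variables as in the question) that builds the DataTable from the skipped content:
$Content = Get-Content $empFile | Select-Object -Skip 2
$columns = $Content[0].Split($empFileDelimiter)

$datatable = New-Object System.Data.DataTable
foreach ($column in $columns) {
    $null = $datatable.Columns.Add($column)
}

# skip the header line itself, then add the remaining lines as rows
foreach ($line in ($Content | Select-Object -Skip 1)) {
    $null = $datatable.Rows.Add($line.Split($empFileDelimiter))
}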
Just a quick one-liner:
(Get-Content $empFile| Select-Object -Skip 2) | Set-Content $empFile
Put in two unused calls to ReadLine(). Something like this:
$datatable = new-object System.Data.DataTable
$reader = New-Object System.IO.StreamReader($empFile)
$reader.ReadLine()
$reader.ReadLine()
$columns = ($reader.ReadLine()).Split($empFileDelimiter)
...

Import Excel data into PowerShell variables

I have an Excel File which has an unknown number of records in it, and these 3 columns:
Variable Name, Store Number, Email Address
I use this in QlikView to import data for certain stores and then create a separate report for each store in the list. I then need to email each report to each individual store (store number will be in the report file name).
So in PowerShell I would like to read the Excel File and set variables for each store:
$Store1 = The Store Number in Row 2 of the Excel File
$Store1Email = The Store Email in Row 2 of the Excel File
$Store2 = The Store Number in Row 3 of the Excel File
$Store2Email = The Store Email in Row 3 of the Excel File
etc. for each Store in the file (can be any number of stores).
Please note the "Variable Name" in the excel file must be ignored (that is for QLikView) and the PowerShell variables must be named as per my above examples, each time incrementing the number.
Check out my PowerShell Excel Module on Github. You can also grab it from the PowerShell Gallery.
$stores = Import-Excel C:\Temp\store.xlsx
$stores[2].Name
$stores[2].StoreNumber
$stores[2].EmailAddress
''
'All stores'
'----------'
$stores
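If you really want the incrementing $Store1 / $Store1Email variables from the question, a minimal sketch on top of Import-Excel might look like this (assuming the columns import as StoreNumber and EmailAddress, as in the example above):
$stores = Import-Excel C:\Temp\store.xlsx

$i = 1
foreach ($store in $stores) {
    # create Store1, Store1Email, Store2, Store2Email, ...
    New-Variable -Name "Store$i"        -Value $store.StoreNumber
    New-Variable -Name "Store${i}Email" -Value $store.EmailAddress
    $i++
}

$Store1        # first store number
$Store1Email   # first store email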
Ok, first off, if you are going to be working with actual .XLS, .XLSX, or .XLSM files, I would highly suggest using the Import-XLS function from the TechNet gallery (found here).
After that, just reference the object it imports to send the emails instead of making objects for each store. Such as:
$StoreList = Import-XLS <path to Excel file>
GC <report folder> | %{
    $Current = $_
    $Store = $StoreList | ?{$_.StoreNumber -match $Current.BaseName} | Select -ExpandProperty StoreNumber
    $Email = $StoreList | ?{$_.StoreNumber -match $Current.BaseName} | Select -ExpandProperty StoreEmail
    <code to send $Current to $Email>
}
My preference is to Save-As the Excel file to a '.csv' type. The comma separated value can easily be imported into PowerShell.
$csvFile = Import-Csv -Path c:\scripts\temp\excelFile.csv
#now the entire Excel '.csv' file is saved into csvFile variable
$csvFile |Get-Member
#look at the properties
Remember to study the greats so your PowerShell script looks great: Jeffrey Snover, Jason Hicks, Don Jones, Ashley McGlone, and anyone on their friends list, ha ha.
The above answers usually work, but I just had a project with excel datasheets that caused some problems.
edit: Here's a much more advanced version that will pull it into an object, can handle blank and duplicate column names, and can skip human information at the beginning of the worksheet by looking for something in the header row. I've also included some example usages
Your example:
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.from_excel("c:\folder\file.xls")
$Store1 = $file.data[0]."Store Number" #first row, column named "Store Number"
$Store1Email = $file.data[0]."Store Email" #first row, column named "Store Email"
foreach ($row in $file.data)
{
write-host "Store: $($row."Store Number")"
write-host "Store Email: $($row."Store Email")"
}
Example 1:
# Simplest example
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.from_excel("c:\folder\file.xls")
$file.data[0]
Example 2:
#advanced usage
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.header_contains="First Name" # if included it will drop everything before the first line that contains this, useful if there are instructions for humans in the worksheet
$file.indexer_column = 5 # Default: 1 (first column); This column's contents will set the minimum number of rows, use if there are blank rows in your file but more data after them
$file.worksheet_index = "January" # Default: 1; can be a sheet index or sheet name
$file.filename = "c:\folder\file.xls" #can set this independently, useful for validation and troubleshooting
$file.from_excel() #This is where we actually pull from excel
$collected = $file.data|ogv -PassThru #this is a neat way to select some rows you want
$file.headers.count # It stores an array of the headers here, useful for troubleshooting and advanced logic
Excel Reader pseudoclass
$file_template = {
# -- universal --
$filename = ""
$delimiter = ","
$headers = @()
$data = @()
# -- used by some functions --
# we put these here to allow assigning them before calling functions, which improves readability and auditability
$header_contains=""
$indexer_column=1
$worksheet_index=1
function from_excel(
$filename=$this.filename,
$worksheet_index=$this.worksheet_index
)`
{
$this.filename = $filename
$this.worksheet_index = $worksheet_index
$data_by_row = $this.from_excel_as_csv() # $data_by_row = $file.from_excel_as_csv($test_file)
$data_by_row = $data_by_row -split"`n"
#if ($this.headers.count -lt 1) {$this.headers = $data_by_row[0] -split $this.delimiter} #this would let us set headers elsewhere which is more flexible but less adaptive, Because columns change unpredicably we need something more adaptive
$temp_headers = $data_by_row[0] -split $this.delimiter
$temp_headers = $this.fix_blank_headers($temp_headers)
$this.headers = $this.dedupe_headers($temp_headers)
$this.data = $data_by_row|select -Skip 1|ConvertFrom-Csv -Header $this.headers -Delimiter $this.delimiter
}
function from_csv($filename=$this.filename)`
{
$this.filename = $filename
$this.headers = (Get-Content $this.filename -ReadCount 1|select -first 1) -split $this.delimiter
$this.data = Get-Content $this.filename|ConvertFrom-Csv -Delimiter $this.delimiter
}
function from_excel_as_csv(
$filename=$this.filename,
$worksheet_index=$this.worksheet_index
)`
{
$this.filename = $filename
$this.worksheet_index = $worksheet_index
#set up excel
Write-Host "Importing from excel, this may take a little while..."
$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $false
$excel.Visible = $false
$workbook = $excel.workbooks.open($this.filename)
$worksheet = $workbook.Worksheets.Item($this.worksheet_index)
#import from excel
try{
$data_by_row = ""
$indexed_column = $worksheet.columns.item($this.indexer_column).value2 #we use this to work around some files having headers with blank space
$minimum_rows = (($indexed_column -join "◘").TrimEnd("◘") -split "◘").count # This Strips the million or so extra blank rows excel appends to get a realistic column length.
[bool]$header_found = 0
$i=1
do `
{
$row = $worksheet.rows.item($i).value2
$row_as_text = $row -join "◘" # ◘ (alt+8) is just a placeholder that's unlikely to show up in the text
$row_as_text = $row_as_text -replace $this.delimiter,"."
$row_as_text = $row_as_text.TrimEnd("◘")
$row_as_text = $row_as_text -replace "◘",$this.delimiter
if ($row_as_text -like "*$($this.header_contains)*"){[bool]$header_found=1}
if ($header_found) {$data_by_row+="$row_as_text`n"}
$i++
}
while ( ($row_as_text.Length -gt 1) -or ($i -lt $minimum_rows) )
}
catch {Write-Warning "ERROR Importing from excel"}
#close excel
$workbook.Close()
$excel.Quit()
write-host "Done importing from excel"
return $data_by_row
}
function dedupe_headers($headers){
$dupes = ($headers|group)|?{$_.count -gt 1}
if ($dupes.count -ge 1)
{
foreach ($dupe in $dupes)
{ #$dupe = $dupes[0]
$i=1
$new_headers = @()
foreach ($header in $headers)
{ #$header = $headers[0]
if ($header -eq $dupe.name)
{
$header = "$($header)_$($i)" # "header_#"
$i++
}
$new_headers += $header
}
}
}
else {$new_headers = $headers} # no duplicates found
return $new_headers
}
function fix_blank_headers($headers)
{
$replace_blanks_with = "_"
$new_headers = @()
foreach ($header in $headers)
{
if ($header -eq "") {$new_headers += $replace_blanks_with}
else {$new_headers += $header}
}
if ($new_headers.count -ne $headers.count)
{
$error_json = @($headers),@($new_headers)|ConvertTo-Json -Compress
Write-Error "Error when fixing blank headers, original and new counts are different $($error_json)"
}
return $new_headers
}
<# function some_function($some_parameter){return $some_parameter} #>
Export-ModuleMember -Function * -Variable *
}
Forgive the ugliness here. I am not a programmer, so there are undoubtedly more optimized ways to do this, as well as better formatting. It will work, however, if I understand your requirements correctly.
$excelfile = Import-Csv "c:\myfile.csv"
$i = 1
$excelfile | ForEach-Object {
    New-Variable "Store$i" $_."Store Number"
    $iemail = $i.ToString() + "Email"
    New-Variable "Store$iemail" $_."Email Address"
    $i++
}
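As a quick usage check (assuming the loop above has run against at least one row), you could list the variables it created:
# list the dynamically created variables and their values
Get-Variable -Name 'Store*' | Sort-Object Name | Format-Table Name, Value

# or read them back directly
$Store1
$Store1Email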
Edit: as per the reply to your original post, this works with a CSV file. Just save it to CSV first if necessary.
$excelfile = Import-Csv "C:\Temp\store.csv"
$i = 1
$excelfile | ForEach-Object {
    $NA = $_."Name"
    $SN = $_."StoreNumber"
    Write-Output "row $i"
    $NA
    $SN
    $i++
}

Help inserting db query results to a CSV file

I have a table that contains information pointing to files stored on a file server. I'm trying to write a PowerShell script to verify that each record in the table has a physical file associated with it on the server, and if it does not, write the record to a CSV file. I'm not having much luck with the CSV portion and was hoping you could help.
Here is what I have so far (please let me know if there are better ways to do this; I'm brand new to PowerShell). I only want to add records to the CSV if Test-Path fails to find the file in question.
[Reflection.Assembly]::LoadFile("C:\ora10g\odp.net\bin\2.x\Oracle.DataAccess.dll")
$connectionString = "Data Source=XXXXXX;User Id=XXXX;Password=XXXXXXXX;"
$connection = New-Object Oracle.DataAccess.Client.OracleConnection($connectionString)
$connection.Open()
$queryString = "SELECT Key, Sequence, Type, File FROM FileTable WHERE Type IN (2, 5)"
$command = new-Object Oracle.DataAccess.Client.OracleCommand($queryString, $connection)
$mediaObjRS = $command.ExecuteReader()
# Loop through recordset
while ($mediaObjRS.Read()) {
    # Assign variables from recordset.
    $objectKey      = $mediaObjRS.GetString(0)
    $objectSeq      = $mediaObjRS.GetDecimal(1)
    $objectType     = $mediaObjRS.GetDecimal(2)
    $objectFileName = $mediaObjRS.GetString(3)
    # Check if file exists based on filetype.
    if ($objectType -eq 2) {
        # Type 2 = OLE
        $fileLocation = "\\fileserver\D$\files\" + $objectFileName
    } elseif ($objectType -eq 5) {
        # Type 5 = FILE
        $fileLocation = $objectFileName
    }
    $fileExists = Test-Path $fileLocation
    if ($fileExists -eq $False) {
        # Write to output file
        $objectKey | Export-Csv missingfiles.csv
        $objectSeq | Export-Csv missingfiles.csv
        $objectType | Export-Csv missingfiles.csv
        $objectFileName | Export-Csv missingfiles.csv
    }
}
$connection.Close()
Each time you write to missingfiles.csv you clobber the previous one. Plus, this cmdlet is oriented towards saving objects, where each property represents one of the comma-separated values.
The simplest way to do this is to manually write (append) to the csv file:
if (!$fileExists) {
    # Write to output file
    "`"$objectKey`",`"$objectSeq`",`"$objectType`",`"$objectFileName`"" >> foo.csv
}
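On PowerShell 3.0 and later you could also emit a proper object per missing file and use Export-Csv -Append, which avoids both the clobbering and the manual quoting; a minimal sketch using the variables from your loop:
if (!$fileExists) {
    # append one row per missing file; the header is written only once
    [pscustomobject]@{
        Key      = $objectKey
        Seq      = $objectSeq
        Type     = $objectType
        FileName = $objectFileName
    } | Export-Csv -Path missingfiles.csv -NoTypeInformation -Append
}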
The "slick" PowerShell way to do it would be to create an object for each file and then put those in an array and when you're done, export that array of object to a CSV file e.g.:
$files = @() # Initialize array outside loop
...
if (!$fileExists) {
    $obj = New-Object PSObject -Property @{
        Key      = $objectKey
        Seq      = $objectSeq
        Type     = $objectType
        FileName = $objectFileName
    }
    $files += $obj
}
...
# outside of loop, export array of objects to csv
$files | Export-Csv missingfiles.csv