Extract specific values from cells from a CSV - PowerShell

I have to combine a lot of files, mostly CSV. I already have code to combine them, but first I need to trim the desired CSV files so I can get the data that I want. Each CSV starts with 8 rows of 2 columns which contain data that I want, and just below those there is a row that begins a section of 8 columns. I am only having an issue grabbing data from those first 8 rows of 2 columns.
Example of the first 3 rows of the CSV:
Target Name: MIAW
Target OS: Windows
Last Updated On: June 27 2019 07:35:11
This is the data that I want; the first 3 rows look like this, with 2 columns. My idea is to store the 3 values of the 2nd column, each in its own variable, and then use them with the rest of my code.
My only problem is extracting the data: since the CSVs are formatted with no header at all, it is hard to come up with an easy way to read the 2nd-column data. Below is an example. Of course this will eventually be used to process several files, so it will be a foreach, but I first want to get the simple code working for one file so I can adapt it to a foreach myself.
$a = Import-Csv 'MIAW-Results-20190627T203644Z.csv'
Write-Host $a[1].col2
This would work if and only if I had a header called col2. I could name it with the first value in the 2nd column, but the issue is that that value changes per CSV file. So the code I tried would not work, for example, if I were to import several files using:
$InFiles = Get-ChildItem -Path $PSScriptRoot\*.csv -File |
    Where-Object Name -like '*results*'
Each CSV will have a different value for the first value on the 2nd column.
Is there an easier way to just grab the 3 rows of the second column that I need? I need to grab each one and store each in a different variable.
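Because the files have no header, one approach is to skip header-based parsing entirely and treat the first rows as plain delimited lines, indexing the second column by position. Here is that idea sketched in Python (the comma delimiter and the inline sample are assumptions based on the rows shown above):

```python
import csv
import io

# Inline stand-in for the top of one results file (assumed comma-delimited)
sample = """Target Name:,MIAW
Target OS:,Windows
Last Updated On:,June 27 2019 07:35:11
"""

# Read only the first three rows and keep each second-column value
rows = list(csv.reader(io.StringIO(sample)))
target_name, target_os, last_updated = (row[1] for row in rows[:3])

print(target_name)    # MIAW
print(target_os)      # Windows
print(last_updated)   # June 27 2019 07:35:11
```

In PowerShell the equivalent positional trick is `Import-Csv` with an explicit `-Header` (e.g. `-Header col1,col2`), so the second column is always addressable as `.col2` regardless of what the file's first row contains.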

Related

PowerShell script to amend the row in existing CSV

I'm trying to insert retrieved values into an existing CSV by using -Append -Force. However, it does not place them under the rows that I wanted. Any way to solve this? Currently, the values are in rows 12 to 17, but I want them in rows 2 to 11. Columns A to F are in an existing CSV, while column G is where I want to input the values.
Here is my current CSV output
Expected Output that I'm looking for:
And here is my current PowerShell code:

How to automatically transfer data

I have thousands of CSV files and they basically come in 2 formats. In the first format, the files have 100 rows and 2 columns. In the second format, the files have 50 columns and 5 rows. The numbers are given just to provide an example.
What I want to do is write Matlab code that extracts the complete second row of each CSV file in the first format and makes it the first row of the corresponding CSV file in the second format. The number of CSV files in the first and second format is equal.
Any help is appreciated.
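No answer is recorded here, but the core move (take the second row of a type-1 file and prepend it to a type-2 file) is simple enough to sketch. The sample contents below are made up, and Python stands in for Matlab just to show the logic on in-memory strings:

```python
import csv
import io

# In-memory stand-ins for one file of each format (contents are made up)
type1 = "h1,h2\n10,20\n30,40\n"   # first format: many rows x 2 columns
type2 = "a,b,c\nd,e,f\n"          # second format: many columns x few rows

# Complete second row of the first-format file
second_row = list(csv.reader(io.StringIO(type1)))[1]

# Prepend it as the new first row of the second-format file
out = io.StringIO()
writer = csv.writer(out, lineterminator="\n")
writer.writerow(second_row)
out.write(type2)

print(out.getvalue())
```

For real files, the same two steps would be wrapped in a loop that pairs each type-1 file with its type-2 counterpart.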

Transposing a CSV file using PowerShell

I have a CSV file that has a static number of columns (with headers) and a dynamic number of rows and I need convert it to another CSV which has a static number of rows and a dynamic number of columns (with or without headers).
Here is an example of what the before and after would look like in a spreadsheet. Please keep in mind that the top section would represent my input data and the bottom section is the desired state.
Of course this is a small subset, as I could have a user with 50 groups and a user with only 1 group associated with it. That means I don't know the number of columns in advance, so it has to be dynamic.
I have tried using what I found on this post, but I am running into an issue trying to modify the code to fit my needs. The closest I have been able to get is to create a row for each unique user ID, but instead of having the corresponding group names in columns next to each user, they are set as the headers, with no values for the users.
If you don't require headers or defined columns, you could simply collect the values from the second column in a hashtable of arrays, using the values from the first column as keys:
$data = @{}
Import-Csv 'C:\path\to\input.csv' | ForEach-Object {
    $data[$_.UserID] += @($_.GroupName)
}
Then you can export the data to a text file like this:
$data.Keys | ForEach-Object {
    $_ + ',' + ($data[$_] -join ',')
} | Set-Content 'C:\path\to\output.txt'

Extract columns from a csv file with fields containing delimited values

I am trying to extract certain fields from a CSV file with comma-separated values.
The issue is that some of the fields also contain commas, and the fields are not enclosed in quotes. Given that scenario, how can I extract the fields?
Also, only one of the fields contains commas within its values, and I don't need that one. E.g.: I want to extract the first 2 columns and the last 5 columns from a data set of 8 columns, where the third column contains values with commas.
PS: Instead of downvoting, I would suggest coming forward and posting your
brilliant ideas if you have any.
Solution:
$path = "C:\IE3BW0047A_08112017133859.csv"
Get-Content $path | ForEach-Object {
    $_.Split(',')[0, 1, -8, -7, -6, -5, -4, -3, -2, -1] -join '|'
} | Set-Content C:\IE3BW0047A_08112017133859_filter.csv
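The trick generalizes beyond PowerShell: because only one middle field can contain stray commas, taking fixed positions from the front and the back of each split line safely skips the messy field. A Python rendering of the same split-and-rejoin, on a made-up 8-column line whose third column holds embedded commas:

```python
# Eight logical columns, but the third ("some,messy,value") contains
# unquoted commas, so a plain split yields ten fields. Keeping two
# from the front and five from the back skips the messy middle.
line = "id1,nameA,some,messy,value,c4,c5,c6,c7,c8"

fields = line.split(",")
kept = fields[:2] + fields[-5:]
print("|".join(kept))   # id1|nameA|c4|c5|c6|c7|c8
```

Note this only works when exactly one field can contain the delimiter; with two such fields the front/back indexing becomes ambiguous.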

Trailing rows in datastore with multiple csv files

Matlab 2015b
I have several large (100-300MB) csv files, I want to merge to one and filter out some of the columns. They are shaped like this:
timestamp         | variable1 | ... | variable200
01.01.16 00:00:00 | 1.59      | ... | 0.5
01.01.16 00:00:01 | ...
...
For this task I am using a datastore class including all the csv files:
ds = datastore('file*.csv');
When I read all of the entries and try to write them back to a csv file using writetable, I get an error that the input has to be a cell array.
When looking at the cell array read from the datastore in debug mode, I noticed that there are several rows containing only a timestamp, which are not in the original files. These rows appear between the last row of one file and the first rows of the following one. The timestamps of these rows are the logical continuation of the last timestamp (as you would get them using Excel).
Is this a bug or intended behaviour?
Can I avoid reading these rows in the first place, or do I have to filter them out afterwards?
Thanks in advance.
As it seems nobody else had this problem, I will share how I dealt with it in the end:
toDelete = strcmp(data.(2), '');
data(toDelete, :) = [];
I took the second column of the table and checked for an empty string, then removed all faulty rows via logical indexing by assigning them an empty array (as shown in the Matlab documentation).
Sadly I found no method to prevent loading the faulty data, but in the end the amount of data was not too big to do this processing step in memory.
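The same post-hoc filter (drop any row whose second column is empty) can be sketched in Python on a made-up table, mirroring the `strcmp`-plus-logical-indexing step above:

```python
# Rows as read back from the merged data; the middle one is a
# "timestamp-only" artifact like those the datastore inserted
# (every column after the timestamp is empty). Values are made up.
rows = [
    ["01.01.16 00:00:00", "1.59", "0.5"],
    ["01.01.16 00:00:01", "", ""],
    ["01.01.16 00:00:02", "1.61", "0.6"],
]

# Keep only rows whose second column is non-empty
clean = [r for r in rows if r[1] != ""]
print(len(clean))   # 2
```

Checking a data column rather than the timestamp is the key choice: the faulty rows have valid-looking timestamps, so only the empty data fields distinguish them.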