Magento import: Can not find required columns: sku

I am going nuts here with Magento's import function. I have created one template product within my store and then exported it, so I can see what the attributes look like. Next I used Pentaho Data Integration to transform our supplier's product list into that format.
The header row contains, like the export, the service columns (starting with an underscore). Here is one record of what my generated data looks like:
sku,_store,_attribute_set,_type,_category,_root_category,_product_websites,color,cost,country_of_manufacture,created_at,custom_design,custom_design_from,custom_design_to,custom_layout_update,description,gallery,gift_message_available,has_options,image,image_label,manufacturer,media_gallery,meta_description,meta_keyword,meta_title,minimal_price,msrp,msrp_display_actual_price_type,msrp_enabled,name,news_from_date,news_to_date,options_container,page_layout,price,required_options,short_description,small_image,small_image_label,special_from_date,special_price,special_to_date,status,tax_class_id,thumbnail,thumbnail_label,updated_at,url_key,url_path,visibility,weight,qty,min_qty,use_config_min_qty,is_qty_decimal,backorders,use_config_backorders,min_sale_qty,use_config_min_sale_qty,max_sale_qty,use_config_max_sale_qty,is_in_stock,notify_stock_qty,use_config_notify_stock_qty,manage_stock,use_config_manage_stock,stock_status_changed_auto,use_config_qty_increments,qty_increments,use_config_enable_qty_inc,enable_qty_increments,is_decimal_divided,_links_related_sku,_links_related_position,_links_crosssell_sku,_links_crosssell_position,_links_upsell_sku,_links_upsell_position,_associated_sku,_associated_default_qty,_associated_position,_tier_price_website,_tier_price_customer_group,_tier_price_qty,_tier_price_price,_group_price_website,_group_price_customer_group,_group_price_price,_media_attribute_id,_media_image,_media_lable,_media_position,_media_is_disabled
4053258104446,,Default,simple,"Schmuck/Halsschmuck",Default Category,base,,,,25.07.2015 20:06,,,,,"Collier, PVC, braun, 42 cm, Karabinerverschluss 925/- S, Durchmesser ca. 2 mm",,,0,"35416.jpg",,"JOBO",,,,,,,"Konfiguration verwenden","Konfiguration verwenden","Collier PVC braun, Verschluss aus 925 Silber 42 cm Karabiner ",,,"Artikelinformationsspalte",,5,0,"Collier PVC braun, Verschluss aus 925 Silber 42 cm Karabiner","no_selection",,,,,1,2,"no_selection",,2015/07/25 20:06:32.291,,,4,,,,1,0,0,1,1,1,0,1,1,,1,0,1,0,1,0,1,0,0,,,,,,,,,,,,,,,,,,,,,
Magento complains with:
Can not find required columns: sku
I just don't see what might be wrong with my data. Obviously the sku is there, and my DB is empty! Things I have checked:
File-Encoding is UTF-8
Tried with LF and with CR/LF line endings
Strings are surrounded by "
Which fields are mandatory for an import? I just couldn't find anything in the documentation.
I have spent countless hours on this. Any help is greatly appreciated!

Check whether your column names are exactly the same as the corresponding names in the database.
Moreover, the encoding should be UTF-8 without a BOM.
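To check both points quickly, here is a minimal Python sketch (data.csv is a placeholder for your generated file) that looks for a UTF-8 BOM and prints the first header column:

import csv

path = "data.csv"  # placeholder for your generated import file

# A UTF-8 BOM is the byte sequence EF BB BF at the very start of the file.
with open(path, "rb") as f:
    has_bom = f.read(3) == b"\xef\xbb\xbf"
print("UTF-8 BOM present:", has_bom)

# Read the header with the utf-8-sig codec, which strips a BOM if present.
with open(path, "r", encoding="utf-8-sig", newline="") as f:
    header = next(csv.reader(f))
print("First column:", repr(header[0]))  # should be exactly 'sku'

A BOM in front of the header makes the first column read as something other than sku, which is a frequent cause of this exact error.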

Related

Is it possible to merge raster bands from several folders using GDAL?

I have two folders, each containing about 15 000 .tif files. Each file in the first folder is a raster with 5 bands, named AA_"number", so the files look like
AA_1.tif,
AA_2.tif,
...,
AA_15000.tif.
Each file in the second folder is a raster with 2 bands named BB_"number" and looks like
BB_1.tif,
BB_2.tif,
...,
BB_15000.tif.
My goal is to combine bands 1-3 from the first file in folder AA with band 1 from the first file in folder BB to create a 4-band raster, and to do this for every pair, producing 15 000 4-band rasters. After doing some research and testing things out in QGIS, I believe the Merge tool from GDAL could solve this task, but I have not been able to make it find the right files in the different folders. And as I have 2 x 15 000 files, it is not possible to do this selection manually. Does anyone know a smart solution to this, preferably using GDAL or QGIS?
There are many ways to do this, and it really depends on what the exact use case is, like the type of analysis/visualization that needs to be done on the result.
With this many files, it could for example be nice to merge them using a VRT. That avoids creating redundant data, but whether it is actually the best solution depends on the use case. Just stacking them in a new TIFF file would of course also work.
Unfortunately, creating such a VRT directly with gdalbuildvrt / gdal.BuildVRT is not possible with multi-band inputs.
If your inputs are homogeneous in terms of properties, it should be fairly simple to set up a template where you fill in the file locations and write the VRT to disk. For inputs with heterogeneous properties it might still be possible, but you'll have to be careful to take it all into account.
Conceptually such a VRT would look something like:
<VRTDataset rasterXSize="..." rasterYSize="...">
  <SRS>...</SRS>
  <GeoTransform>...</GeoTransform>
  <VRTRasterBand dataType="..." band="1">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/aa_folder/aa_file1.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
  <VRTRasterBand dataType="..." band="2">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/aa_folder/aa_file1.tif</SourceFilename>
      <SourceBand>2</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
  <VRTRasterBand dataType="..." band="3">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/aa_folder/aa_file1.tif</SourceFilename>
      <SourceBand>3</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
  <VRTRasterBand dataType="..." band="4">
    <ComplexSource>
      <SourceFilename relativeToVRT="0">//some_drive/bb_folder/bb_file1.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      ...
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>
You can first run gdalbuildvrt on a few of your files to find all the properties that need to be filled in, like the projection, pixel dimensions, etc. That will work, but gdalbuildvrt will only take the first band from each input. If all bands have homogeneous properties (like the nodata value), that should be fine as a reference.
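If you would rather not fill in that XML by hand, one way to automate it per pair (a sketch using the GDAL Python bindings; the folder paths, file pattern and output location are assumptions) is to first pull each required band into its own single-band VRT with gdal.Translate and then stack those with gdal.BuildVRT(separate=True):

import os
from osgeo import gdal

aa_folder = "//some_drive/aa_folder"    # assumed locations: adjust to your setup
bb_folder = "//some_drive/bb_folder"
out_folder = "//some_drive/stacked"
os.makedirs(out_folder, exist_ok=True)

for i in range(1, 15001):
    aa = os.path.join(aa_folder, f"AA_{i}.tif")
    bb = os.path.join(bb_folder, f"BB_{i}.tif")

    # Pull each required band into its own temporary single-band VRT.
    band_vrts = []
    for j, (src, band) in enumerate([(aa, 1), (aa, 2), (aa, 3), (bb, 1)]):
        vrt_path = os.path.join(out_folder, f"tmp_{i}_band{j + 1}.vrt")
        ds = gdal.Translate(vrt_path, src, format="VRT", bandList=[band])
        ds = None  # close the dataset so the .vrt file is written to disk
        band_vrts.append(vrt_path)

    # separate=True maps each single-band input onto its own band of the output.
    out = gdal.BuildVRT(os.path.join(out_folder, f"stacked_{i}.vrt"), band_vrts, separate=True)
    out = None

Each stacked_<n>.vrt can then be opened like any other raster, or materialised as a GeoTIFF with gdal.Translate if a real copy of the data is preferred.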

Import CSV file to MariaDB

Lately, I have been facing problems importing from CSV files. I am using
MariaDB : 10.3.32-MariaDB-0ubuntu0.20.04.1
On Ubuntu : Ubuntu 20.04.3 LTS
I am using this command
LOAD DATA LOCAL INFILE '/path_to_file/data.csv' INTO TABLE tab
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
After searching and trying, I found that I can only load a file from the /tmp folder, i.e.
SELECT load_file('/tmp/data.csv');
But it didn't work on other paths.
And secondly, I found that even if the CSV file is present in the /tmp folder, MariaDB will still fail to load it if it contains a lot of fields. The main problem is that the LOAD DATA command does not give any kind of error or warning, except when the file does not exist. Other than that, nothing is shown and nothing is imported.
I have only succeeded in importing very simple CSV files from the /tmp folder.
What I suspect is that:
MariaDB has been updated, and in this new version there are some flags or configuration options that prevent MariaDB from importing CSV files from anywhere other than the /tmp folder, and
MariaDB fails to load the CSV because of some unknown problem, maybe some special character (though I made sure there is none in there).
There must be some option that makes MariaDB produce a verbose error and warning log, which I don't know about, apart from the /var/log/mysql/error.log file, which does not contain any information about a failed CSV load.
Any help would be appreciated.
Below is the first record of the CSV. The actual CSV contains 49 fields and 1862 records (the sample below contains only one record).
"S.No","Training Code","Intervention Type (NRM/Emp. Skill
Training)","Training Title/Activity","Start Month","Ending
Month","No. of Days Trainings","Start Date Training","End Date
Training","Name of Person","Father
Name","CNIC","Gender","Age","Education","Skill Level","CO Ref
#","COName","Village Name","Tehsil Name","District","Type of Farm
production","Total Land (if applicable)","Total Trees (if
applicable)","Sheeps/goats","Buffalo/Cows","Profession","Person's
Income Emp. Skill (Pre-Intervention)","Income from NRM (Pre-
Intervention)","HH Other Sources of Income","Total HH Income","Type
of Support provided","Tool Kit/Inputs Received or Not","Date of
Tool Kit receiving","Other intervention , like exposure market
trial, followup support, Advance Training etc","Production (Pre-
Intervention)","Production (Post-Intervention)","Change in
Production","Unit (kg, Maund,Liter, etc)","Income gain from
production (Post-Intervention)","Change in Income (NRM)","Income
gain by Employment -Emp.Skill (Post Intervention)","Change in
Income (Emp. Skill)","Outcome Trend","Employment/Self-
Employment/Other","Outcome Result","Remarks","Beneficiaries Contact
No.","Activity Location"
1,"AUP-0001","NRM","Dates Processing &
Packaging","Sep/2018","Sep/2018",2,"25/Sep/2018","26/Sep/2018",
"Some name","Barkat Gul",1234567891234,"Male",34,"Primary","Semi-
Skilled","AUP-NWD-073","MCO Haider Khel Welfare Committee","Haider
Khel","Mir Ali","North
Waziristan","Dates",,20,,,"Farming",,5000,"Farming",5000,"Training,
Packaging Boxes","Yes","10/10/2018",,180,320,140,"Kg",8000,3000,,,
"Positive","Self Employed","Value addition to the end product
(Packaging increase the Price per KG to 25%)",,,"Field NW"
BTW, I am non-technical :-O
While using MariaDB version 10.5.13-3.12.1 I am able to import CSV files into the tables I have set up.
Except with dates:
https://dba.stackexchange.com/questions/283966/tradedate-import-tinytext-how-to-show-date-format-yyyymmdd-of-20210111-or-2021?noredirect=1#comment555600_283966
There I am still struggling to import text-format dates AND to convert the text dates into the (YYYY-MM-DD) date format.
end.
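A general workaround, not taken from the thread above, is to bypass LOAD DATA entirely and push the CSV through Python: pandas copes with quoted fields that contain commas or embedded line breaks and fails loudly on parse errors. This is only a sketch; the connection string and the pymysql driver are assumptions, and tab is the table from the LOAD DATA statement above.

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string: adjust user, password, host and database name.
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

# pandas handles quoted fields with embedded commas/newlines and reports parse errors.
df = pd.read_csv("/path_to_file/data.csv")
print(df.shape)  # expect (1862, 49) for the file described above

# Append into the existing table; the 49 column names must match the table definition.
df.to_sql("tab", engine, if_exists="append", index=False)

If you still prefer LOAD DATA LOCAL INFILE, make sure local_infile is enabled on both the client and the server, and run SHOW WARNINGS immediately after the statement to surface any per-row problems.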

What attributes are missing as defined in the HDF5 specification and the metadata in group h5md?

I have an HDF5-format file (Data File) containing molecular dynamics simulation data. For quick inspection, the h5ls tool is handy. For example:
h5ls -d xaa.h5/particles/lipids/positions/time | less
Now my question is based on a comment I received about the data format: which attributes are missing according to the HDF5 specification and the metadata in the group?
Are you trying to get the value of the Time attribute from a dataset? If so, you need to use h5dump, not h5ls. Also, the attributes are attached to each dataset, so you have to include the dataset name in the path. Finally, attribute names are case sensitive; Time != time. Here is the required command for dataset_0000 (repeat for 0001 through 0074):
h5dump -a /particles/lipids/positions/dataset_0000/Time xaa.h5
You can also get attributes with Python code. Simple example below:
import h5py

with h5py.File('xaa.h5', 'r') as h5f:
    for ds, h5obj in h5f['/particles/lipids/positions'].items():
        print(f'For dataset={ds}; Time={h5obj.attrs["Time"]}')
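If you are unsure which attribute names are actually present on each dataset (or how they are capitalized), a small variation of the same loop lists them all; this is just a sketch against the same file and group path:

import h5py

with h5py.File('xaa.h5', 'r') as h5f:
    for name, h5obj in h5f['/particles/lipids/positions'].items():
        # .attrs behaves like a dict of attribute-name -> value pairs
        for attr_name, value in h5obj.attrs.items():
            print(f'{name}: {attr_name} = {value}')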

Issues with "QUERY(IMPORTRANGE)"

Here's my first question on this forum, though I've read through a lot of good answers here.
Can anyone tell me what I'm doing wrong with my attempt to do a query import from one sheet to a column in another?
Here's the formula I've tried, but all my adjustments still get me a parsing error.
=QUERY(IMPORTRANGE("https://docs.google.com/spreadsheets/d/1yGPdI0eBRNltMQ3Wr8E2cw-wNlysZd-XY3mtAnEyLLY/edit#gid=163356401","Master Treatment Log (Responses)!V2:V")"WHERE Col8="'&B2&'")")
Note that importrange is only needed for imports between spreadsheets. If you only import from one sheet into another within the same spreadsheet I would suggest using filter() or query().
Assuming the value in B2 is actually a string (and not a number), you can try
=QUERY(IMPORTRANGE("https://docs.google.com/spreadsheets/d/1yGPdI0eBRNltMQ3Wr8E2cw-wNlysZd-XY3mtAnEyLLY/edit#gid=163356401","Master Treatment Log (Responses)!V2:V"), "WHERE Col8="'&B2&'", 0)
Note the added comma before "WHERE". If you want to import a header row, change 0 to 1.
See if that helps? If not, please share a copy of your spreadsheet (sensitive data erased).

Google Sheets - Retrieve "A:File1" to "A:File2" where "Sheetname:File1" = "B:File2" if "C:File2" is between "E" and "F" in "File1"

Sorry for the somewhat long title, but I was told to be as specific as possible. :D
My problem will require some explanation.
So, I have 2 spreadsheets files ("Konverteringstabeller" and "Tee Posen").
In "Tee Posen" I have a sheet named "Scores MIK" (golf scorecard and my name).
In "Konverteringstabeller" I have sheets with conversion tables for multiple golf courses, but if one works, all should.
What I need is to find out what course handicap I would get if my golf handicap is "HCP 26,0" (as shown in File 2 Picture), and in this case that result should be 29 (not visible), but you should get the point.
(example: golf hcp 10 would result in course hcp 11, because 10 is between 9,9-10,7)
While I have been able to find the right result, it has only been in the "Konverteringstabeller" spreadsheet file and that is not the place I need it.
I want to have it written in E6 in the "Scores MIK" sheet in File 2.
I should mention that in "Scores MIK : File 2", cell C2 (Ikast Golf Klub) has data validation so I can easily change between the different courses in the "Konverteringstabeller" file once I add more.
What I have been messing with is something with vlookup and importrange with concatenate in it, but I can't figure out how to do it, so I ask for your help.
And I am by no means skilled in the art of Spreadsheets, so I would very much appreciate a detailed explanation.
Picture - Scores MIK (File 2)
Picture - Ikast Golf Klub (File 1)
Thanks in advance!
// Mikkel Christensen
OK, so a couple of notes. One is that to join in a static cell that holds the sheet name while still allowing it to change, you should wrap it in single quotes (') inside the formula; also, if the range B8:E70 will always be in the same position on the various sheets, you should add $ around those references as well.
Here is an example of the whole formula:
=IFERROR(ARRAYFORMULA(VLOOKUP(E5:E25;IMPORTRANGE("spreadsheet key";"'"&C2&"'!$B$8:$E$70");4;TRUE)))
And lastly, using the "&" operator to concatenate is better, at least in my opinion, because CONCATENATE sometimes does not work as well with ARRAYFORMULA; plus I personally find it quicker and easier than having to wrap yet another function around my stuff.