How to load CSV data into a SnappyData table in rowstore mode using a JDBC connection - Scala

Hi, I am starting to learn the SnappyData rowstore (link). I tried all the examples and they work, but I need to store CSV and JSON data in a SnappyData table. In the examples they connect manually through snappy-shell, create the tables, and insert the records. Another option is the JDBC client (link); I tried this way but I don't know how to load a CSV and store it in a Snappy table. I then tried another method, direct query-based access to the SnappyData store (link). If anyone knows how to store CSV data in a SnappyData table using JDBC, please share. Thank you.

You should go through the SnappyData docs: http://snappydatainc.github.io/snappydata/. The "How-to" section has examples of loading from CSV: http://snappydatainc.github.io/snappydata/howto/#how-to-load-data-in-snappydata-tables
// Using an existing SnappySession to load the CSV and insert into the table:
val someCSV_DF = snSession.read.csv("Myfile.csv")
someCSV_DF.write.insertInto("MyTable")
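Since the question asks specifically about JDBC, here is a minimal sketch of reading a CSV line by line and batch-inserting it over a plain JDBC connection instead of the Snappy API. The driver URL format (`jdbc:snappydata://host:1527/`) follows the SnappyData docs; the table name, file name, and two-column schema are hypothetical, and the CSV split is naive (no quoted fields).

```scala
import java.sql.DriverManager
import scala.io.Source

// Naive CSV split; does not handle quoted fields containing commas.
def parseCsvLine(line: String): Array[String] =
  line.split(",", -1).map(_.trim)

// Read a CSV file and batch-insert its rows into an existing two-column
// table over JDBC. The URL, table, and schema here are assumptions.
def loadCsvViaJdbc(csvPath: String, jdbcUrl: String, table: String): Unit = {
  val conn = DriverManager.getConnection(jdbcUrl)
  try {
    val ps = conn.prepareStatement(s"INSERT INTO $table VALUES (?, ?)")
    for (line <- Source.fromFile(csvPath).getLines()) {
      val cols = parseCsvLine(line)
      ps.setInt(1, cols(0).toInt)    // assumed INT first column
      ps.setString(2, cols(1))       // assumed VARCHAR second column
      ps.addBatch()
    }
    ps.executeBatch()                // send all rows in one round trip
  } finally conn.close()
}

// Example invocation (requires a running SnappyData locator/server):
// loadCsvViaJdbc("Myfile.csv", "jdbc:snappydata://localhost:1527/", "MyTable")
```

Batching the inserts avoids one network round trip per row, which matters for files of any real size.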

Related

Azure data factory: Implementing the SCD2 on txt files

I have flat files in an ADLS source.
For the full load we add two columns, Insert and a datetime stamp.
For the change load we need to look up against the full data: rows that exist in the full data should be treated as updates, and rows that don't as inserts, and then copied.
Below is the approach I tried to work out, but I'm unable to get it working.
Can anyone help me with this? Thank you.
Currently, updating an existing flat file using the Azure Data Factory sink is not supported; you have to create a new flat file.
You can also use a Data Flow activity to read the full and incremental data and load it into a new file in the sink transformation.
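ADF itself is configured in its UI, but the classification step described above (compare incremental rows against the full set; existing keys become updates, new keys become inserts) can be sketched outside ADF. A minimal Scala sketch, assuming a hypothetical row type keyed by a single `id` column:

```scala
// Hypothetical row shape; a real dataset would carry more columns.
case class Row(id: Int, value: String)

// Classify incremental rows against the full load: rows whose key already
// exists in the full data are updates, the rest are inserts.
def classify(full: Seq[Row], incremental: Seq[Row]): (Seq[Row], Seq[Row]) = {
  val existingIds = full.map(_.id).toSet
  incremental.partition(r => existingIds.contains(r.id)) // (updates, inserts)
}
```

The same key-lookup idea is what the ADF Data Flow's lookup/exists transformations express declaratively.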

Can SnappyData load from s3 and save into s3?

I recently found the SnappyData website and I'm interested in its SparkSQL query performance. Has anybody tried loading from S3 and saving to S3 with SnappyData? I can't find any documentation on it.
I want to use PySpark and specify the 'com.databricks.spark.csv' format with various options.
Yes, you can. Here is an example.
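The example itself did not survive in this excerpt, so here is a hedged sketch of the shape such code typically takes with the Spark API available in a SnappyData job. The bucket and key names are hypothetical, and `s3a://` access assumes the hadoop-aws jar and AWS credentials are configured on the cluster.

```scala
import org.apache.spark.sql.SparkSession

// Round-trip a CSV through S3: read from one bucket, write to another.
// Bucket/key names are made up for illustration.
def copyCsvThroughS3(spark: SparkSession): Unit = {
  val df = spark.read
    .format("com.databricks.spark.csv") // or simply .csv(...) on Spark 2.x+
    .option("header", "true")
    .load("s3a://my-input-bucket/data.csv")
  df.write
    .option("header", "true")
    .csv("s3a://my-output-bucket/out/")
}
```

The same calls work from PySpark with identical format and option names, since they go through the same DataFrameReader/DataFrameWriter API.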

Is it possible to load Phoenix tables from HDFS?

I am new to Phoenix. Is there any way to load tables into Phoenix from the Hadoop filesystem?
Yes. Phoenix is a wrapper over HBase, so you can create a Phoenix table pointing to the HDFS data and use it.
Please let me know if you are facing any specific issue with that.
You can't do it directly; you need to import the data into HBase first. There are pre-built importers for the CSV format:
https://phoenix.apache.org/bulk_dataload.html

How to extract data from a Hive table to CSV using Talend

I want to transfer data from one Hadoop server to another Hadoop server with the help of Talend.
Through my research I've learned that we can transfer data through flat files. Can anyone suggest how to transfer data from Hive to a flat file? If there is an alternative way to transfer the data using Talend, please suggest it.
You can use the tHiveInput component to read the data from the Hive table, and connect it with a row link to a tFileOutputDelimited component.
If you want to transfer to another Hadoop system, you can use a tHiveOutput or a tHDFSOutput component instead of tFileOutputDelimited.

Loading a CSV into Core Data managed sqlite db

I have a CSV file containing data.
I want to load it into a Core Data managed sqlite db.
I just ran one of the sample Core Data Xcode apps and noticed it created the db file.
I noticed the table names all started with Z and the primary keys were stored in a separate table, so am I right in presuming that importing the CSV data directly into the db using the sqlite3 command line might mess up the primary keys?
Do I need to write a program that reads the CSV line by line, creates an object for each row, and persists them to the db?
Has anyone got any code for this?
Also, can I write a desktop client to do this using Core Data? If so, will the resulting db be fine to use in an iPhone Core Data app?
Can I then just include the prefilled db in my project so it is deployed with the app correctly, or is there something else I should do?
Use NSScanner to read your CSV file into NSManagedObject instances in your Core Data store.
I have some categories on NSString for reading and writing CSV files from/to NSArrays. I'll post them online and edit my answer with a link.
edit
They're online here: http://github.com/davedelong/CHCSVParser