Is there any import option available for Accumulo? (MongoDB migration)

I am new to Accumulo. We are migrating from MongoDB to Accumulo, and we have a file with all the table information exported from MongoDB. Is there an option in Accumulo to import this file and create the tables on its own? From the API documentation I learned that tables can be created through the shell and also programmatically. Can anyone tell me whether Accumulo has an import option that will read the file and create the tables?

There is no native way to do that. MongoDB deals with JSON documents, which have an entirely different layout/schema from the way Accumulo stores things. You could try using something like Apache Gora (http://gora.apache.org/index.html), but that requires you to change the format of your MongoDB data. If you can't do that, then you will more than likely have to do this programmatically yourself.
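For the programmatic route, a minimal sketch in Python using the pyaccumulo client is shown below. It assumes the Accumulo Thrift proxy is running, that the MongoDB export is a file of JSON documents one per line (mongoexport style), and that pyaccumulo is installed; the file name, table name, and key layout are illustrative assumptions, not part of the original question.

```python
# Sketch: create an Accumulo table and load rows from a MongoDB JSON export.
# Assumes the Accumulo Thrift proxy is running and pyaccumulo is installed;
# the file name, table name, and key layout below are illustrative only.
import json
from pyaccumulo import Accumulo, Mutation

conn = Accumulo(host="localhost", port=42424, user="root", password="secret")

table = "users"                       # hypothetical table name
if not conn.table_exists(table):
    conn.create_table(table)

writer = conn.create_batch_writer(table)
with open("users.json") as f:         # one JSON document per line
    for line in f:
        doc = json.loads(line)
        row_id = str(doc.get("_id", ""))
        m = Mutation(row_id)
        for field, value in doc.items():
            # Flatten each top-level field into a column; nested documents
            # would need a real schema-mapping decision here.
            m.put(cf="doc", cq=field, val=json.dumps(value))
        writer.add_mutation(m)
writer.close()
```

How you map documents to row IDs and column families is the real design decision; the flat mapping above is only one possibility.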

Related

import data from Postgres to Cassandra

I need to import data from Postgres to Cassandra using open source technologies only.
Can anyone please outline the steps I need to take?
As per instructions, I have to refrain from using DataStax software, since it comes with a license.
Steps I have already tried:
Exported one table from Postgres in CSV format and imported it into HDFS (using Sqoop). (If I take this approach, do I need to use MapReduce afterwards?)
Tried to import the CSV file into Cassandra using CQL, but got this error:
Cassandra: Unable to import null value from csv
I am trying several methods, but am unable to find a solid plan of attack.
Can any of you please outline the steps required for the whole process? I believe many people have already done this.
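One open-source shape this could take is a small script that streams rows straight from Postgres into Cassandra, which avoids the intermediate CSV (and its null-handling problem) entirely. Below is a minimal sketch using psycopg2 and the Apache-licensed cassandra-driver package; the keyspace, table, and column names are assumptions for illustration, and the target Cassandra table is assumed to already exist.

```python
# Sketch: copy one table from Postgres to Cassandra row by row.
# Keyspace, table, and column names are illustrative only.
import psycopg2
from cassandra.cluster import Cluster

pg = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")
cluster = Cluster(["127.0.0.1"])
session = cluster.connect("my_keyspace")       # keyspace must already exist

insert = session.prepare(
    "INSERT INTO users (id, name, email) VALUES (?, ?, ?)"
)

cur = pg.cursor(name="copy_cursor")            # server-side cursor for large tables
cur.execute("SELECT id, name, email FROM users")
for row_id, name, email in cur:
    # Passing None writes a null for that column (the driver accepts it),
    # which sidesteps the CSV "null value" parsing error; for very sparse
    # data you might skip null columns instead to avoid tombstones.
    session.execute(insert, (row_id, name, email))

cur.close()
pg.close()
cluster.shutdown()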

Importing AccessDB and Oracle directly into MongoDB

I am receiving .dmp and .mdb files from a customer & need to get that data into MongoDB.
Is there any way to straight import these file types into Mongo?
The goal is to programmatically ingest these into Mongo in any way I can. The only rule is that the customer will not change their method of data delivery, meaning I'm stuck with the .dmp and .mdb files as a source.
Any assistance would be greatly appreciated.
Here are a few options/ideas:
Convert mdb to csv, then use mongoimport --type csv to import into MongoDB.
Use an ETL tool, e.g. Pentaho, Informatica, etc. This will give you much more flexibility for doing any necessary transformation/conversion of data.
Write a custom ETL tool, using libraries that know how to read mdb and dmp files.
You don't mention how you plan to use this data, how many tables are in the database, and how normalized the tables are. Depending on the specifics of your use case, it's very possible that loading the data from Access "as is" will not be a good choice since normalized schemas are not a good fit for MongoDB and MongoDB does not natively support joins. This is where an ETL tool can help, by extracting the source data and transforming it into an appropriate JSON structure.
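As a rough illustration of the custom ETL option, the sketch below reads an Access table through ODBC (pyodbc, which needs the Microsoft Access ODBC driver installed, typically on Windows) and inserts each row as a document with pymongo. The connection string, table name, and database/collection names are assumptions; it also does no reshaping of the relational data, which, per the point above, you would normally want to add.

```python
# Sketch: read rows from an .mdb file via ODBC and insert them into MongoDB.
# Requires the Microsoft Access ODBC driver plus pyodbc and pymongo.
# Connection string, table name, and collection name are illustrative only.
import pyodbc
from pymongo import MongoClient

conn_str = (
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\customer.mdb;"
)
access = pyodbc.connect(conn_str)
cur = access.cursor()

mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["customer_db"]["orders"]

cur.execute("SELECT * FROM Orders")             # hypothetical table name
columns = [col[0] for col in cur.description]   # column names from the result set

batch = []
for row in cur:
    doc = dict(zip(columns, row))               # one flat document per row
    batch.append(doc)
    if len(batch) >= 1000:                      # insert in chunks
        collection.insert_many(batch)
        batch = []
if batch:
    collection.insert_many(batch)

access.close()
mongo.close()
```

Reading Oracle .dmp files is a separate problem; those are usually restored into an Oracle instance (imp/impdp) first and then extracted, rather than parsed directly.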
MongoDB has released ODBC drivers; they let you connect MS Access directly to MongoDB through ODBC. Voila!

Connect Neo4J on an existing Postgresql database

I'm a new Neo4j user and I have played around with the Neo4j webadmin interface to create small databases and simple Cypher queries. Now I want to use Neo4j to build a graph from my existing database. It's a PostgreSQL database with millions of entries that all share the same structure (Neo4j is well suited to representing this data). My question is: how do I import this data, and what is the easiest way to do it? I have already seen that Cypher can read CSV files, but do I have to create a CSV file from my data, or is there another way to import it? Thank you for your help. Sam
One option is to export your Postgres data to CSV and use LOAD CSV to import it into the graph.
Another way is to write a script in a language of your choice (I'd vote for Groovy here) that connects to Postgres using JDBC, connects to Neo4j, and applies the business logic to transform between the two.
A third option is to use an ETL tool like Talend. It basically does the same thing as a custom script but provides a point & click interface to define the transformation; see http://neo4j.com/blog/fun-with-music-neo4j-and-talend/ for more details.
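A rough Python equivalent of the scripted approach (the second option), using psycopg2 and the official neo4j driver instead of Groovy/JDBC, might look like the sketch below. The table, node label, and property names are assumptions for illustration.

```python
# Sketch: stream rows from Postgres and create nodes in Neo4j.
# Table name, node label, and property names are illustrative only.
import psycopg2
from neo4j import GraphDatabase

pg = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

cur = pg.cursor(name="export_cursor")          # server-side cursor for millions of rows
cur.execute("SELECT id, name FROM people")

with driver.session() as session:
    for person_id, name in cur:
        # MERGE keeps the load idempotent if the script is re-run.
        session.run(
            "MERGE (p:Person {id: $id}) SET p.name = $name",
            id=person_id, name=name,
        )

cur.close()
pg.close()
driver.close()
```

For millions of rows you would normally batch the writes (for example, sending lists of rows to an UNWIND query) rather than issuing one statement per row, but the overall flow is the same.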

Download HTTP data with Postgres StoredProcedure

I am wondering if there is a way to import data from an HTTP source from within a PL/pgSQL function.
I am porting an old system that harvests data from a website. Rather than maintaining a separate set of files to manage the downloading of the data, I was hoping to put the import routines directly into stored procedures.
I do know how to import data with COPY, but that requires the data to already be available locally. Is there a way to download the data with PL/pgSQL? Am I out to lunch?
Related: How to import CSV file data into a PostgreSQL table?
Depending on what you're after, the Postgres extension www_fdw might work for you: http://pgxn.org/dist/www_fdw/
If you want to download custom data over HTTP, then PostgreSQL's extensive support for different procedural languages might be handy. Here is an example of connecting to the Google Translate service from a Postgres function written in Python:
https://wiki.postgresql.org/wiki/Google_Translate
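In the same spirit as that wiki example, a minimal sketch of a PL/Python function that fetches a URL is shown below. It assumes the plpython3u extension is installed (PL/Python is an untrusted language, so creating the function requires superuser rights); the function name is made up for illustration.

```sql
-- Sketch: fetch the body of an HTTP resource from inside Postgres.
-- Assumes CREATE EXTENSION plpython3u; has been run (superuser required).
CREATE OR REPLACE FUNCTION fetch_url(url text) RETURNS text AS $$
    # The function body is ordinary Python 3 running inside the server process.
    from urllib.request import urlopen
    with urlopen(url) as resp:
        return resp.read().decode('utf-8')
$$ LANGUAGE plpython3u;

-- Hypothetical usage: pull a remote resource, then parse/insert it in SQL.
-- SELECT fetch_url('https://example.com/data.csv');
```

Once the text is inside the database you can split and insert it with SQL, so the whole harvesting routine can live in stored procedures as you hoped.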

How do I import certain columns from a csv file into mongodb?

I'm not sure if I should ask this on ServerFault or here, but I'm trying to write a PHP script that loops through a folder and adds CSV files to a MongoDB database.
I'd only like to import certain fields/columns. Is that possible, or do I need to import the whole table/collection, then drop fields? Google doesn't seem to be helping...
You might find this question/answer helpful. The OP there was attempting to do the same thing.
The mongoimport command doesn't give you the option of skipping fields during the import process, so that would require a full import followed by $unset operations on the fields you intended to omit. Ultimately, that would leave your collection with fragmentation.
You would be better served using fgetcsv or str_getcsv in PHP to parse the file. This would also give you the chance to validate/sanitize the input as necessary. Finally, MongoCollection::batchInsert() would efficiently insert multiple documents into MongoDB at once.
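For comparison, here is the same parse/filter/batch-insert flow sketched in Python with pymongo (the PHP version would use fgetcsv and MongoCollection::batchInsert as described above). The folder path, database/collection names, and column names are assumptions for illustration.

```python
# Sketch: import only selected columns from CSV files in a folder into MongoDB.
# Folder path, database/collection names, and column names are illustrative only.
import csv
import glob
from pymongo import MongoClient

WANTED = {"name", "email", "signup_date"}      # columns to keep

client = MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["users"]

for path in glob.glob("/data/csv/*.csv"):
    with open(path, newline="") as f:
        reader = csv.DictReader(f)             # uses the header row for field names
        batch = []
        for row in reader:
            doc = {k: v for k, v in row.items() if k in WANTED}
            # Validation/sanitisation of values would go here.
            batch.append(doc)
            if len(batch) >= 1000:             # insert in chunks
                collection.insert_many(batch)
                batch = []
        if batch:
            collection.insert_many(batch)
```

Filtering the columns before insertion avoids the import-then-$unset cycle and the fragmentation it causes.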