I have installed Pentaho Data Integration (version ce-5.0.1.A-stable) on my machine and I am trying to retrieve information from MongoDB using PDI. I have created a transformation with a MongoDB Input step. However, when I try to configure my MongoDB connection details, I cannot find any explicit connection type for MongoDB. Could someone please advise on how to configure a MongoDB data source in Pentaho?
I have referred to most of the Pentaho-MongoDB docs, but none of the solutions works.
I have also tried performing the steps below, as described on the official Pentaho site, but I still cannot find any connection type for MongoDB:
1- Move the following folder out of the data-integration folder structure:
data-integration/plugins/pentaho-big-data-plugin
2- Move the following files out of the data-integration folder structure if they exist:
data-integration/libext/JDBC/pentaho-hadoop-hive-jdbc-shim-1.3.0.jar
data-integration/libext/JDBC/pentaho-hadoop-hive-jdbc-shim-1.3.1.jar
data-integration/libext/JDBC/pentaho-hadoop-hive-jdbc-shim-1.3.2.jar
3- Unzip the file pentaho-big-data-plugin-shimtastic-1.3.3.1.zip from the data-integration/plugins folder.
4- Optionally, remove irrelevant folders under data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations.
5- Copy the file pentaho-hadoop-hive-jdbc-shim-1.3.3.jar into the folder
data-integration/libext/JDBC
6- Unzip the file pentaho-instaview-templates-shimtastic-1.3.3.zip to the following directory:
data-integration/plugins/spoon/agile-bi/platform/pentaho-solutions/system/instaview/templates/Big Data
Any help is really appreciated!
Pentaho does not have a specific database connection type for MongoDB, so you will not find it in the Database Connection viewer. The way to connect to MongoDB is to use the MongoDB Input step in PDI, which has a connection details section where you configure your credentials. You can then connect a JSON Input step to parse the results returned from MongoDB. Check the screenshot below:
You can also read about it on the Pentaho Wiki here. Although the documentation is slightly old, it describes the exact process.
On a side note, you don't need the Big Data shims to connect to MongoDB. It seems you have configured the Hadoop Hive shims; they are not required here.
Hope it helps :)
I have an existing database for a Discord bot in MongoDB Compass v1.28.1. I want to transfer all the data in the database to MongoDB Atlas because of its more extensive functionality, and so I don't have to wait for Compass to take ages to load each time I open it. However, when I follow the connection steps provided in Atlas, the pop-up that is supposed to appear when I copy a path to the clipboard never shows up, and nothing happens. I also tried to connect through my app in VS Code, the same way I did for Compass, using Mongoose, but still no collections are loading and no data is being stored. I have made my schemas etc., which work perfectly fine in Compass...
Migration to Atlas is documented at https://docs.atlas.mongodb.com/import/
To save you some reading, you have two options: export/import and mongodump/mongorestore.
I would recommend trying export/import first. It is built into Compass (https://docs.mongodb.com/compass/current/import-export/) and should be simpler to use given limited experience with Mongo. It is UI oriented, so just follow the click-through guide in the documentation.
Unfortunately it has some limitations related to data type conversion from BSON to JSON, and it may be a bit tedious if you have a large number of collections.
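If you would rather script that JSON-based route than click through Compass, mongoexport/mongoimport are its CLI equivalents; a per-collection sketch, assuming a collection called guilds (a placeholder name):
mongoexport --uri="mongodb://username:password@localhost:27017/discordbot" --collection=guilds --out=guilds.json
mongoimport --uri="mongodb+srv://username:password@cluster.tenant.mongodb.net/database" --collection=guilds --file=guilds.json
Like the Compass exporter, this goes through JSON, so the same data type caveats apply.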
If you hit those limitations, you will need to follow the CLI mongodump/mongorestore approach @barrypicker suggested in the comments. Both commands are available in cmd and PowerShell consoles.
First you backup your local database https://docs.mongodb.com/v4.2/reference/program/mongodump/:
mongodump --uri="mongodb://username:password@localhost:27017/discordbot"
The username and password are the ones you use in Compass to connect to the source database.
The command will create a dump directory containing all the collections you have.
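For a database with a single collection named guilds (again a placeholder), the resulting layout would look roughly like this:
dump/
  discordbot/
    guilds.bson
    guilds.metadata.json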
Then you have to upload the backup to Atlas:
mongorestore --uri="mongodb+srv://username:password@cluster.tenant.mongodb.net/database" dump/
The username and password are the ones you use to connect to the Atlas cluster, listed in the "Security/Database Access" section.
You can get the exact subdomains for the --uri part from Atlas. In the dashboard, click the "Connect" button for the cluster you want to connect to, then choose "Shell" as the connection method in the connection pop-up:
I am using a Windows machine to connect to a remote DB2 instance and ran into this issue:
SQL1531N The connection failed because the name specified with the DSN connection string keyword could not be found in either the db2dsdriver.cfg configuration file or the db2.cli.ini configuration file. Data source name specified in the connection string: <DSN>
I have configured an ODBC data source using the ODBC Data Source Administrator and it connected successfully.
Upon further investigation, I am unable to locate db2dsdriver.cfg in the IBM DATA SERVER DRIVER folder. I can find db2dsdriver.lvl and db2dsdriver.xsd, just not the .cfg file. I am also unsure where HammerDB looks for the config file.
I have looked at the DB2 configuration page on the HammerDB website, but I am unable to get any useful information from it: https://www.hammerdb.com/docs/ch04s02.html
For the tiny-footprint ODBC and CLI driver from IBM (known as clidriver), you are responsible for creating and editing the db2dsdriver.cfg configuration file. It is a small XML file documented here and in related linked pages. The HammerDB documentation also gives a minimal example, and you linked to that page in your question.
You can create and edit this file either with command lines passed to the db2cli tool, or by editing it directly with a text editor (or XML editor). It may be easier to use an editor than to learn the command lines, although the command lines have the advantage that they lend themselves to scripting this activity for larger installations.
On Microsoft Windows you can also use Notepad to create and edit the file db2dsdriver.cfg.
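For reference, a minimal db2dsdriver.cfg might look like the following; the DSN alias MYDSN, database name SAMPLEDB, host and port are placeholder values you must replace with your own, and the IBM documentation lists the full set of options:
<configuration>
   <dsncollection>
      <dsn alias="MYDSN" name="SAMPLEDB" host="db2server.example.com" port="50000"/>
   </dsncollection>
   <databases>
      <database name="SAMPLEDB" host="db2server.example.com" port="50000"/>
   </databases>
</configuration>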
An important step: after editing the file, you must validate its contents before trying any database connections. Validation checks that the syntax of the XML in the file is correct. To validate, use the db2cli validate command described here; it must show a successful result before you try to connect to any database. Once validation completes without errors, you can also use db2cli validate -connect -dsn XXX -user YYY -passwd ZZZ to test the connection independently of your application (in this case HammerDB). Once db2cli validate -connect -dsn ... succeeds, your application (HammerDB) will connect correctly.
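With the placeholder values from the sketch above, the two checks would look something like this (db2inst1 and passw0rd are hypothetical credentials):
db2cli validate -dsn MYDSN
db2cli validate -connect -dsn MYDSN -user db2inst1 -passwd passw0rd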
There are many examples of db2dsdriver.cfg contents online, but your first source should be the Db2 Knowledge Centre, which details the command line options to the db2cli command and gives examples of db2dsdriver.cfg.
If you already have a working Db2 configuration with local and remote databases (but no db2dsdriver.cfg file), you can also use the db2dsdcfgfill tool to populate db2dsdriver.cfg from your existing Db2 configuration. See the docs here.
I'm trying to connect a MongoDB database to BIRT and create a data set. The connection succeeds (the ping works), but when I try to set up the data set by specifying a collection name, the following error comes up:
Unable to find available fields. Invalid collection name clinic.
It seems to be an issue with the mongodb-java driver.
Navigate to the Eclipse IDE's plugins directory, delete the org.eclipse.orbit.mongodb_2.10.1.v20130422-1135.jar file, and add mongo-java-driver-2.14.3.jar there.
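Roughly, the file swap looks like this from a shell (the Eclipse install path and the location of the downloaded driver jar are placeholders):
cd /path/to/eclipse/plugins
rm org.eclipse.orbit.mongodb_2.10.1.v20130422-1135.jar
cp /path/to/downloads/mongo-java-driver-2.14.3.jar .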
Currently I am trying to clone a Cosmos DB collection from one database to another database within the same Cosmos DB account. The API of the Cosmos DB account is set to the Mongo API.
I already tried to use Azure Data Factory, but it looks like there is no support for the Mongo API so far.
Does anyone have an idea how to do this with respect to efficiency, automation and performance?
Any ideas are appreciated.
You can use the Data Migration Tool suggested by Microsoft to do this.
There is no built-in way to take a backup and import it into Cosmos DB.
EDIT:
With the new Cosmic Clone tool, you can take a clone/backup including data, stored procedures, triggers, UDFs, etc. Read my blog post on this.
"I already tried to use Azure Data factory, but it looks like that there is no support for the Mongo API so far."
Actually, the Cosmos DB Mongo API and the SQL API both belong to the Azure Cosmos DB service, so you can still create a Cosmos DB linked service and dataset in Azure Data Factory for your database.
Then you can create a copy activity to copy data from one collection to another collection.
If you want to make it an automated task, I suggest one of the following two ways to run the copy activity:
1. An Azure Function with a time trigger.
2. A WebJob running in the background of an Azure Web App.
Hope it helps. If you have any concerns, please feel free to let me know.
I used mongodump and mongorestore to copy my database (with MongoDB version 4.0.9 installed). From the Windows command line I ran the following commands from my MongoDB bin directory (c:\Program Files\MongoDB\Server\4.0\bin in my case).
This will copy all the collections in the DB, including index definitions, to the specified /out directory as .bson files with accompanying .metadata.json files.
mongodump.exe /uri:URI /out:A_DIRECTORY_TO_DUMP_TO
I then ran the following command to take everything in the /out directory and write it to the target DB:
mongorestore.exe /uri:URI /dir:DIRECTORY_TO_RESTORE_FROM
NOTE: Before importing I also had to increase the throughput for the collection, otherwise I ran into rate-limiting errors. If you've set throughput at the database level, that may need to be changed instead.
I have been trying to migrate my Parse data to a local MongoDB instance, but to no avail. There are a total of 12 steps, as described at https://parse.com/migration#database.
I am currently still at step 1 and have encountered some difficulties. I managed to set up MongoDB on my computer (localhost). Then I went to my "app settings" in Parse to start the data migration. Parse wanted me to paste the MongoDB connection URL, which I entered as "mongodb://localhost/". However, I got the error "no reachable servers". On my localhost, I am running MongoDB from my terminal.
Any kind advice on this? This is my first time doing a data migration and trying out MongoDB. Any help will be greatly appreciated!
Cheers
In your Parse dashboard, go to App Settings -> General. On this page you will find the "Export app data" button. Click it and Parse will send you an email with the CSV database data; use it to import your data into your local database (with RockMongo, for example, or mongoimport).
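If you go the mongoimport route, a sketch for one exported class might look like this (the database name parseapp, collection name MyClass and file name MyClass.csv are placeholders, and --headerline assumes the CSV's first row holds the field names):
mongoimport --db parseapp --collection MyClass --type csv --headerline --file MyClass.csv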