In Pentaho Data Integration, what is the Update Partition Plugin?
I tried the PDI tool (version pdi-ce-9.3.0.0-428) and I'm getting an error about the Update Partition Plugin.
You seem to be using PDI with files created in the past or by another person. That file was created using a step that is not available in the stock software, but was added by installing a plugin into the PDI installation.
I've googled for a plugin going by that name, but a quick search didn't turn it up. The people who provided the file (transformation or job) may be able to help you install that plugin.
I am trying to follow https://github.com/qubole/s3-sqs-connector and load the connector, but it seems the connector is not available on Maven, and when generating the build manually the SQS classes are not loaded.
Can someone guide me on this?
Thanks,
Dipesh
Update:
S3-SQS connector is now released on maven central.
https://mvnrepository.com/artifact/com.qubole/spark-sql-streaming-sqs
You should be able to use it now. Get back to us if you face any problems.
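For anyone wiring it up, a spark-shell invocation along these lines should pull the package from Maven Central. The Scala suffix and version number below are assumptions on my part; check the Maven page linked above for the current coordinates.

```
# Version and Scala suffix are illustrative; verify them on Maven Central.
spark-shell --packages com.qubole:spark-sql-streaming-sqs_2.11:0.5.1
```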
Unfortunately, registration of the s3-sqs-connector in spark-packages is still pending. We'll try to release the package in maven central directly as soon as possible but it should take a few days at the very least.
In the meantime, you can clone the repository and build the jar yourself if required.
When updating an assembly in the Plugin Registration tool, in step 2 (select the plugin and workflow activities to register), if not all plugins are selected, the unselected ones are deleted along with their steps and images from the plugin registration. Is there a way to recover a plugin that was deleted? Is there an XML or other file that helps recover the steps and images?
If you have an earlier solution backup, import it. Otherwise, take the latest solution from another environment, include the SDK Message Processing Steps in it, and import it to recover the lost plugin steps/images registration data.
Also, as an ops guide for troubleshooting and a human-readable version tracker in TFS source control, I follow this for each plugin. It has helped me a lot: even if a plugin is not deployed correctly in other environments, it helps identify the gap.
It is also helpful in some situations (for the future) if there are no environments other than Dev yet.
VS2015 Community is not showing SQLite in the list of available data sources in one place, but is showing it in another.
If I click New Connection button in Server Explorer and click Change, I get the following list of Data Sources:
If I add a new item to my project > choose Entity Model > from existing database > New Connection, I get the following list of Data Sources:
How can I get SQLite in the New Connection data sources list?
Background
The problem started when my existing EDMX failed to load with the infamous error message
The Operation could not be completed: Invalid Pointer
This error can be fixed by deleting ComponentModelCache folder as described in this post. This method has worked for me in the past, but not this time. I finally decided to recreate the EDMX from scratch. Since then I'm facing this issue.
A few things that might give some hint:
I have recently installed VS2017 Community side-by-side with VS2015. VS2017 can open the existing EDMX just fine, but cannot do Update from Database, so I came back to VS2015.
I uninstalled and reinstalled System.Data.SQLite provider several times, thinking that this might be a registration issue. Didn't do any good.
Note that VS2017 support is not there yet on System.Data.SQLite's download page. I'm using the last available version that supports VS2015 (version number 1.0.104.0).
Good news is that the issue is fixed finally; at least for VS2015. Bad news is that I don't know what exactly did the trick. So I'll list down everything that I tried and maybe this could help someone in the future. These steps are not in any particular order.
Uninstall all SQLite packages from NuGet.
Uninstall Entity Framework package too.
Reinstall all these packages.
Remove and reinstall the latest version of SQLite provider (1.0.104.0 as of this writing).
Use VS2015 only. VS2017 is currently not supported by SQLite provider.
Clear ComponentModelCache folder and restart Visual Studio.
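The NuGet-related steps above can be sketched in the Package Manager Console. The package names below are the standard ones for SQLite and Entity Framework, but verify them against your project's packages.config before running anything:

```
PM> Uninstall-Package System.Data.SQLite
PM> Uninstall-Package EntityFramework
PM> Install-Package EntityFramework
PM> Install-Package System.Data.SQLite
```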
I found an easy solution: just install the extension below from the marketplace and SQLite will be available in the data sources list.
SQLite/SQL Server Compact Toolbox
I'm using MongoDB to store data.
But for search I'd prefer to use Elasticsearch or something similar. I haven't found a solution, though, because I've read about some problems and issues with the river plugin.
What are your experience and recommendations?
I'm using Elasticsearch with MongoDB. I tried Solr, but I couldn't get the integration working. Both tools use Lucene, so they have "approximately" the same query syntax.
There are some tutorials, but they didn't work for me. I believe the reason is that GitHub no longer allows uploading and downloading binary files, so we cannot use the ./plugin command. To overcome this, you have to git clone the repositories and build the .jar files yourself. To do that, use Apache Maven and run mvn package to create the packages.
Add both the river and the Mapper Attachments plugin to Elasticsearch, and make sure you follow the compatible versions according to the river's version table.
After that, everything should work fine.
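As a sketch of the clone-and-build process described above (the repository URL and file names are assumptions based on the commonly used MongoDB river; adjust them for the river you actually use):

```
# Clone and build the river plugin yourself, since the binaries are no longer hosted
git clone https://github.com/richardwilly98/elasticsearch-river-mongodb.git
cd elasticsearch-river-mongodb
mvn package
# Install the resulting package into Elasticsearch (path and version are placeholders)
bin/plugin --url file:///path/to/target/elasticsearch-river-mongodb-x.y.z.zip --install river-mongodb
```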
I have installed Magento on Ubuntu using magento-1.7.0.2.zip. I have also downloaded the Magento sample data, magento-sample-data-1.6.1.0.tar.gz, but I can't integrate this sample data with my Magento installation.
Can somebody please help me?
Thanks
Here is a guide on how to install it
The main thing to know is that you need to extract the Magento sample data (a .sql file and a media folder), put the media folder into your Magento installation, and import the .sql file into your empty database before running the Magento installation wizard; otherwise you won't be able to install the sample data successfully.
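A minimal command-line sketch of those steps (the paths, archive layout, .sql file name, and database name are assumptions for illustration):

```
# 1. Extract the sample data archive
tar -xzf magento-sample-data-1.6.1.0.tar.gz
# 2. Copy the media folder into the Magento installation
cp -R magento-sample-data-1.6.1.0/media/* /path/to/magento/media/
# 3. Import the .sql file into the (still empty) Magento database
mysql -u root -p magento_db < magento-sample-data-1.6.1.0/magento_sample_data_for_1.6.1.0.sql
# 4. Now run the Magento installation wizard against magento_db
```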
This is the simplest way; all of the answers above are complicated:
1. Import the sample database (MySQL) file from the sample data.
2. Copy the catalog folder from media in the sample data and paste it into your Magento project.
3. On Windows there is no folder-permission problem; on other systems, you have to check permissions in the platform-specific way.