Trying to execute the s3-sqs Qubole connector for Spark Structured Streaming

I am trying to follow https://github.com/qubole/s3-sqs-connector and load the connector, but it seems the connector is not available on Maven, and when building it manually the SQS classes are not loaded.
Can someone guide me on this?
Thanks,
Dipesh

Update:
The S3-SQS connector is now released on Maven Central.
https://mvnrepository.com/artifact/com.qubole/spark-sql-streaming-sqs
You should be able to use it now. Get back to us if you face any problems.
Unfortunately, registration of the s3-sqs-connector in spark-packages is still pending. We'll try to release the package directly on Maven Central as soon as possible, but it should take a few days at the very least.
In the meantime, you can clone the repository and build the jar yourself if required.
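Once the jar is on the classpath, a minimal PySpark sketch of reading from the source might look like the following. The option names are taken from the connector's README and may differ between releases, so treat them as assumptions and verify against the version you build:

```python
def build_sqs_stream(spark, schema, queue_url):
    """Sketch: create a streaming DataFrame from the Qubole s3-sqs source.

    `spark` is an existing SparkSession with spark-sql-streaming-sqs on the
    classpath. Option names follow the connector's README and should be
    checked against the version you actually build.
    """
    return (spark.readStream
            .format("s3-sqs")                       # source registered by the connector jar
            .schema(schema)                         # the source requires an explicit schema
            .option("sqsUrl", queue_url)            # SQS queue receiving S3 event notifications
            .option("fileFormat", "json")           # format of the files the events point at
            .option("sqsFetchIntervalSeconds", "2") # how often to poll the queue
            .load())
```

From there the returned streaming DataFrame is used like any other Structured Streaming source (writeStream, checkpointing, etc.).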

Related

Pentaho Data Integration what is Update Partition Plugin

In Pentaho Data Integration, what is the Update Partition plugin?
I tried PDI version pdi-ce-9.3.0.0-428 and I'm getting an error about the Update Partition plugin.
You seem to be using PDI with files created in the past or by another person. That file was created using a step that is not available in the stock software but was added to the PDI installation by a plugin.
I've googled for a plugin going by that name, but I haven't found it with a quick search. Maybe the people who provided the file (transformation or job) can help you install that plugin.

Kafka Connect Hbase sink

I am trying to deploy the HBase sink connector for Kafka (https://github.com/mravi/kafka-connect-hbase). I downloaded and configured HBase and the Confluent Platform as per steps 1 and 2.
Then it says,
Copy hbase-sink.jar and hbase-sink.properties from the project build location to $CONFLUENT_HOME/share/java/kafka-connect-hbase
But I don't see hbase-sink.jar and hbase-sink.properties anywhere in the HBase and Confluent directory locations. Any idea where I can get them?
But I don't see hbase-sink.jar and hbase-sink.properties
Sounds like you haven't cloned that repo, run mvn clean package, and then looked in the target directory.
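For anyone following along, the build-and-deploy steps that answer describes look roughly like this. The exact jar name under target/ depends on the project's pom.xml, so treat it as illustrative:

```shell
# Clone the connector source and build it; the jar is produced under target/
git clone https://github.com/mravi/kafka-connect-hbase.git
cd kafka-connect-hbase
mvn clean package

# Copy the built jar into the Connect plugin directory
# (jar name is an assumption -- check target/ for the actual artifact)
cp target/hbase-sink*.jar "$CONFLUENT_HOME/share/java/kafka-connect-hbase/"
```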
As the other answer says, that project seems abandoned.
Try looking at this one too https://docs.lenses.io/connectors/sink/hbase.html
Have you seen the HBase connector from Confluent, kafka-connect-hbase? The one you are using seems to be abandoned (no commits in the last 4 years).
kafka-connect-hbase documentation

How to recover my plugins data from plugin-registration in crm?

When updating an assembly in Plugin Registration, step 2 asks you to select the plugins and workflow activities to register. If not all plugins are selected, the unselected ones are deleted along with their steps and images. Is there a way to recover a plugin that was deleted? Is there an XML or other file that helps recover the steps and images?
If you have an earlier solution backup, restore it; otherwise, take the latest solution (including the SDK Message Processing Steps) from another environment and import it to get back the lost plugin step/image registration data.
Also, as an ops guide for troubleshooting and a human-readable version tracker in TFS source control, I follow this for each plugin. It has helped me a lot: even if a plugin is not deployed correctly in other environments, it helps identify the gap.
It is also helpful for the future in situations where there is no environment other than Dev yet.

How to update running kafka connector

I have Kafka Connect running in a Marathon container. If I want to update a connector plugin (jar), I have to upload the new one and then restart the Connect task.
Is it possible to do that without restarting/downtime?
The updated jar for the connector plugin needs to be added to the classpath, and then the classloader for the worker needs to pick it up. The best way to do this currently is to take an outage, as described here.
Depending on your connector, you might be able to do rolling upgrades, but the generic answer is that if you need to upgrade the connector plugin, you currently have to take an outage.
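If you do attempt a rolling upgrade, the standard Connect REST API can bounce a connector's tasks once a worker has been restarted with the new jar. A small sketch (the worker URL and connector name are placeholders; note that the restart endpoint restarts tasks but does not reload jars, so picking up a new plugin jar still requires restarting the worker JVM):

```python
from urllib import request

def restart_endpoint(worker_url, connector):
    """Build the Connect REST API URL that restarts a connector's tasks.

    Note: POST /connectors/{name}/restart restarts the connector and its
    tasks, but does NOT reload plugin jars -- that needs a worker restart.
    """
    return f"{worker_url}/connectors/{connector}/restart"

def restart_connector(worker_url, connector):
    # POST with an empty body; worker URL and connector name are placeholders.
    req = request.Request(restart_endpoint(worker_url, connector),
                          data=b"", method="POST")
    with request.urlopen(req) as resp:
        return resp.status
```

Bouncing workers one at a time this way keeps the rest of the cluster serving while each worker picks up the new classpath.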

Building an Issue Tracker Plugin for TortoiseSVN

I've read a lot about the IBugTraqProvider interface and implementing an issue tracker in the commit dialog of TortoiseSVN.
The IBugTraqProvider interface is documented here.
Is there a simpler way to do it, i.e. building the plug-in and installing it in TortoiseSVN? The documentation is not clear enough for a developer to create their own plugin.
I'm working with Salesforce as the issue tracker, and I retrieved the WSDL file to integrate with the work items. Now I need to know how to connect it to TortoiseSVN.
Any suggestions?
Take a look at issue-tracker-plugins.txt in the contrib directory in the TSVN source code. There's a fairly decent example in C# that should get you heading in the right direction.
When I built a plugin, I also built a test harness that passed arbitrary information through the IBugTraqProvider interface, so that I could debug the plugin while building it without having to reinstall it into TSVN each time.