I have a requirement to integrate data from Bitbucket through a Mule job. I'm not looking for CI/CD integration; I mean actually reading the content of a repository file through Mule. Also, I am a newbie to Mule.
I am able to read the contents of a repo branch through Mule using the Atlassian Stash connector.
Q1: Does this flow actually retrieve the files, or just their metadata? I am able to print the names of the files.
Q2: Assuming the files are indeed retrieved, how do I read their contents? I tried using Mule Requester to read the output payload of the Stash connector, but the payload is null when I print it. No errors are thrown, just a blank payload. I'd appreciate your help!
My Mule flow: HTTP -> Stash Connector (read files) -> For Each -> Mule Requester: Retrieve File -> Log payload.
The Stash connector operation I am using: http://hotovo.github.io/mule-stash-connector/mule/stash-config.html#commit-files-get
The address syntax I am trying with the Mule Requester instance: file://#[payload.value]
Output:
INFO 2019-08-09 14:37:07,691 [[test-bitbucket-connect].HTTP_Listener_Configuration.worker.01] org.mule.api.processor.LoggerMessageProcessor: FOR payload ---- Key: repository/.java Value: repository/.java
INFO 2019-08-09 14:37:07,692 [[test-bitbucket-connect].HTTP_Listener_Configuration.worker.01] org.mule.lifecycle.AbstractLifecycleManager: Initialising: 'file-connector-config.requester.1217682634'. Object is: FileMessageRequester
INFO 2019-08-09 14:37:07,692 [[test-bitbucket-connect].HTTP_Listener_Configuration.worker.01] org.mule.lifecycle.AbstractLifecycleManager: Starting: 'file-connector-config.requester.1217682634'. Object is: FileMessageRequester
INFO 2019-08-09 14:37:07,693 [[test-bitbucket-connect].HTTP_Listener_Configuration.worker.01] org.mule.api.processor.LoggerMessageProcessor: Post retrieve File: null
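A note on what the log suggests: the Key and Value printed for each entry are both just the file path, which indicates commit-files-get returns file paths/metadata rather than contents, and a Mule Requester endpoint of file://#[payload.value] reads from the local filesystem, where no such file exists, hence the null payload. One way to fetch the actual content is to call Bitbucket Server's raw-content REST endpoint (for example, from an HTTP Request component in the flow). A minimal standalone sketch of the equivalent call, assuming a hypothetical host, project key, repo slug, branch, and file path:

import java.net.{HttpURLConnection, URL}
import java.util.Base64
import scala.io.Source

object FetchRawFile extends App {
  // Hypothetical coordinates: replace host, project key, repo slug, branch, and path.
  val url = new URL(
    "https://stash.example.com/rest/api/1.0/projects/PROJ/repos/my-repo" +
      "/raw/path/to/File.java?at=refs/heads/master")

  val conn = url.openConnection().asInstanceOf[HttpURLConnection]
  // Assumes basic auth is enabled on the server; a token would also work.
  val credentials = Base64.getEncoder.encodeToString("user:password".getBytes("UTF-8"))
  conn.setRequestProperty("Authorization", s"Basic $credentials")

  // The response body is the raw file content, i.e. what the Logger should print.
  val content = Source.fromInputStream(conn.getInputStream, "UTF-8").mkString
  println(content)
}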
I asked the same question on the Microsoft Q&A site too.
In ADF, I tried to call Get Blob (https://learn.microsoft.com/en-us/rest/api/storageservices/get-blob) and got this error message: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature."
I'd like to read an image or other unstructured file and insert it into a varchar(max) column in SQL Server (source: Binary, sink: Binary in SQL Server).
My pipeline is configured as below.
linked service:
  base url: https://{account name}.blob.core.windows.net/
  authentication type: anonymous
  server certificate validation: disabled
  type: REST
dataset:
  type: REST
  relative url: {container name}/xyz.jpeg
copy data activity:
  request method: GET
  x-ms-date: #concat(formatDateTime(utcNow(), 'yyyy-MM-ddTHH:mm:ss'), 'Z')
  x-ms-version: 2018-11-09
  x-ms-blob-type: BlockBlob
  Authorization: SharedKey {storage name}:CBntp....{SAS key}....LsIHw%3D
  (I took the key from a SAS connection string: ...https&sig=CBntp{SAS key}LsIHw%3D)
Is it possible to call the Azure Blob rest API in ADF pipelines?
Unfortunately, this is not possible, because when using a Binary dataset in a Copy activity, you can only copy from a Binary dataset to a Binary dataset.
See the source and sink dataset property tables for the Binary format in the documentation:
Reference - https://learn.microsoft.com/en-us/azure/data-factory/format-binary#copy-activity-properties
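As an aside on the authentication error itself: a SharedKey Authorization header must carry an HMAC-SHA256 signature computed over a canonicalized request, not a fragment copied out of a SAS connection string. A SAS token is normally appended to the blob URL as a query string, with no Authorization header at all. A sketch of the SAS-style call outside ADF, assuming a hypothetical account, container, blob name, and token:

import java.net.{HttpURLConnection, URL}
import java.nio.file.{Files, Paths}

object FetchBlobWithSas extends App {
  // Hypothetical values: replace the account, container, blob name, and SAS token.
  val sasToken = "sv=2018-11-09&sr=b&sp=r&sig=..." // the full token, not just the sig part
  val url = new URL(s"https://mystorageaccount.blob.core.windows.net/mycontainer/xyz.jpeg?$sasToken")

  // With SAS, the signature travels in the query string, so no Authorization
  // header (and no x-ms-date) is required on the request itself.
  val conn = url.openConnection().asInstanceOf[HttpURLConnection]
  val bytes = conn.getInputStream.readAllBytes() // Java 9+
  Files.write(Paths.get("xyz.jpeg"), bytes)
}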
I'm currently trying to conduct a load test with Gatling.
I created a Gatling simulation script via the Gatling Recorder, and that went well.
However, when I executed it, I got the "not found" error below, even though the file exists (in this case, /step1/0006_request.json does exist):
request_6: Failed to build request: Resource /step1/0006_request.json not found
There are many JSON files, and the error occurs only for some specific POST requests. Every failing request has the following setup:
it uses the request header map 'headers_6' below
it uses the RawFileBody method (which is expected, since the Content-Type is 'application/json')
I have already spent over 12 hours on this, and I have to finish my task in a timely manner.
I'm sorry that I can't share the application that is the target of this issue.
If anyone has any ideas, or would kindly like more detail, please ask me.
val headers_6 = Map(
  "Content-Type" -> "application/json",
  "Origin" -> "applicationServerURL")

.exec(http("request_6")
  .post("requestUrl")
  .headers(headers_6)
  .body(RawFileBody("/step1/0006_request.json"))
  .resources(http("request_7")
Additionally, I checked the JSON files created by the Gatling Recorder in the 'resources' directory, and they contain only the single character 'X'.
Does this mean the Gatling Recorder doesn't capture the request JSON body?
Environment:
Gatling version: 3.5.1
Browser: Firefox
I just solved this issue.
It was caused by something very simple.
First, I encountered the error message below:
request_6: Failed to build request: Resource /step1/0006_request.json not found
so I had to check the file paths, even though the script was auto-generated by the Gatling Recorder (do not blindly trust the tool).
The auto-generated file path was written like an absolute path, starting with '/', and the script started to work once I removed the '/'.
I had referred to another post on Stack Overflow which said that the file path must be written as an absolute path if you are using Gatling version 3 or later, but removing the leading '/' is what fixed it here.
I hope my reply is informative to someone having the same issue as me.
Use a relative path:
// As-is
.exec(http("request_6")
  .post("requestUrl")
  .headers(headers_6)
  .body(RawFileBody("/step1/0006_request.json"))
  .resources(http("request_7")

// To-be
.exec(http("request_6")
  .post("requestUrl")
  .headers(headers_6)
  .body(RawFileBody("step1/0006_request.json"))
  .resources(http("request_7")
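For context on why removing the leading '/' helps: as far as I understand, Gatling 3 resolves relative RawFileBody paths against the classpath (e.g., src/test/resources in an sbt or Maven layout). A quick sanity check you can run with the same path string:

object ResourceCheck extends App {
  // Prints where the body file resolves to on the classpath, or a warning if
  // it is not visible. Note: getResource paths must not start with '/' here.
  val resource = getClass.getClassLoader.getResource("step1/0006_request.json")
  println(if (resource != null) s"resolves to: $resource" else "not found on the classpath")
}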
I am trying out Scala code that uses the KCL library to read a Kinesis stream. I keep getting the CloudWatchException below and would like to know why:
16:16:06.629 [aws-akka-http-akka.actor.default-dispatcher-20] DEBUG software.amazon.awssdk.request - Received error response: 400
16:16:06.638 [cw-metrics-publisher] WARN software.amazon.kinesis.metrics.CloudWatchMetricsPublisher - Could not publish 16 datums to CloudWatch
software.amazon.awssdk.services.cloudwatch.model.CloudWatchException: When Content-Type:application/x-www-form-urlencoded, URL cannot include query-string parameters (after '?'): '/?Action=PutMetricData&Version=2010-08-01&Namespace=......
Any idea what's causing this? Or is it perhaps, as I suspect, a bug in the Kinesis library?
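This particular 400 ('URL cannot include query-string parameters' together with Content-Type application/x-www-form-urlencoded) is often reported when mismatched AWS SDK v2 artifact versions end up on the classpath, so aligning the SDK and KCL versions (for example, via the SDK BOM) is worth trying first. As a workaround while debugging, you can stop the KCL from calling PutMetricData at all by swapping in a no-op metrics factory. A sketch, assuming the standard KCL 2.x scheduler setup and a hypothetical stream name, application name, and processor factory:

import java.util.UUID
import software.amazon.awssdk.regions.Region
import software.amazon.awssdk.services.cloudwatch.CloudWatchAsyncClient
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient
import software.amazon.awssdk.services.kinesis.KinesisAsyncClient
import software.amazon.kinesis.common.ConfigsBuilder
import software.amazon.kinesis.coordinator.Scheduler
import software.amazon.kinesis.metrics.NullMetricsFactory
import software.amazon.kinesis.processor.ShardRecordProcessorFactory

object DisableKclMetrics extends App {
  val region = Region.US_EAST_1
  val kinesisClient = KinesisAsyncClient.builder().region(region).build()
  val dynamoClient = DynamoDbAsyncClient.builder().region(region).build()
  val cloudWatchClient = CloudWatchAsyncClient.builder().region(region).build()
  val processorFactory: ShardRecordProcessorFactory = ??? // plug in your existing factory

  val configsBuilder = new ConfigsBuilder(
    "my-stream", "my-app", // hypothetical stream and application names
    kinesisClient, dynamoClient, cloudWatchClient,
    UUID.randomUUID().toString, processorFactory)

  val scheduler = new Scheduler(
    configsBuilder.checkpointConfig(),
    configsBuilder.coordinatorConfig(),
    configsBuilder.leaseManagementConfig(),
    configsBuilder.lifecycleConfig(),
    // No-op metrics factory: the KCL stops publishing to CloudWatch entirely,
    // which silences the failing PutMetricData calls while you investigate.
    configsBuilder.metricsConfig().metricsFactory(new NullMetricsFactory()),
    configsBuilder.processorConfig(),
    configsBuilder.retrievalConfig())

  new Thread(() => scheduler.run()).start()
}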
We are trying to run a Spark program using NiFi. This is the basic sample we tried to follow.
We have configured an Apache Livy server at 127.0.0.1:8998.
The ExecuteSparkInteractive processor is used to run the sample Spark code.
val gdpDF = spark.read.json("gdp.json")
val gdpRDD = gdpDF.rdd
gdpRDD.count()
The Livy session controller service is configured for 127.0.0.1, port 8998, and Session Type: spark.
When we run the processor, we get the following error:
Spark Session returned an error, sending the output JSON object as the flow file content to failure (after penalizing)
We just want to output the line count of the JSON file. How do we redirect it to the flow file? (See the sketch after the logs below.)
NiFi user log:
2020-04-13 21:50:49,955 INFO [NiFi Web Server-85] org.apache.nifi.web.filter.RequestLogger Attempting request for (anonymous) GET http://localhost:9090/nifi-api/flow/controller/bulletins (source ip: 127.0.0.1)
NiFi app.log:
ERROR [Timer-Driven Process Thread-3] o.a.n.p.livy.ExecuteSparkInteractive ExecuteSparkInteractive[id=9a338053-0173-1000-fbe9-e613558ad33b] Spark Session returned an error, sending the output JSON object as the flow file content to failure (after penalizing)
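On the "redirect the count to the flow file" part: Livy returns the output of the snippet's last statement, and ExecuteSparkInteractive writes that result into the flow file content. So one approach (a sketch, reusing gdp.json from the post and the spark session that Livy predefines) is to print the count as a small JSON document:

// Same read as in the post; the printed JSON becomes the statement output
// that Livy hands back to NiFi as the flow file content.
val gdpDF = spark.read.json("gdp.json")
val count = gdpDF.count()
println(s"""{"count": $count}""")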
I have seen several people struggling with this example. I recommend following this example from the Cloudera Community (especially note part 2):
https://community.cloudera.com/t5/Community-Articles/HDF-3-1-Executing-Apache-Spark-via-ExecuteSparkInteractive/ta-p/247772
The key points I would be concerned with:
Does your Spark work in general?
Does your Livy work in general? (See the sketch after this list.)
Is the Spark sample code good?
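For the "does your Livy work in general" check, a minimal smoke test (a sketch, using the 127.0.0.1:8998 address from the post) is to create an interactive session over Livy's REST API and inspect the response:

import java.net.{HttpURLConnection, URL}
import scala.io.Source

object LivySmokeTest extends App {
  val conn = new URL("http://127.0.0.1:8998/sessions")
    .openConnection().asInstanceOf[HttpURLConnection]
  conn.setRequestMethod("POST")
  conn.setDoOutput(true)
  conn.setRequestProperty("Content-Type", "application/json")
  conn.getOutputStream.write("""{"kind": "spark"}""".getBytes("UTF-8"))

  // A healthy Livy answers 201 Created with a JSON session description.
  println(s"HTTP ${conn.getResponseCode}")
  println(Source.fromInputStream(conn.getInputStream, "UTF-8").mkString)
}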
Is it possible to have the server send messages to all connected clients without waiting for any action from them? Let me explain :-) I've been reading the docs/examples and I have found nothing that satisfies my needs: the flow is always the same; a client connects (e.g., a GET call to a REST API), the connection is suspended, and until a new API call is received (e.g., a POST call) the server simply waits (or at least this is what I have understood). My use case is pretty different: I want the server to send some "notifications" once new data becomes available. This would be my use case (pretty simplified):
1. Client A connects to the server
2. The connection is suspended, since no new data is available at the moment
3. The server is notified that new data is available from an external source, and broadcasts it to client A
4. Go to step 2
What I have achieved so far is getting the connection successfully established. The next step is to solve this server-side issue. I must say this technology is completely new to me, so it is possible I have misunderstood how something works. If that's the case, let me know!
This is my stack:
Spring 3.2.0 RELEASE
Jersey 1.8
Atmosphere Jersey 1.0.13
Tomcat 7.0.40
Thank you all in advance!
UPDATE: After following this I get this warning, which I have no idea how to get rid of:
2013-06-04 09:40:36,284 WARN [org.atmosphere.cpr.AtmosphereFramework] - Failed using comet support: org.atmosphere.container.Tomcat7AsyncSupportWithWebSocket, error: Tomcat failed to detect this is a Comet application because context.xml is missing or the Http11NioProtocol Connector is not enabled. If that's not the case, you can also remove META-INF/context.xml and WEB-INF/lib/atmosphere-compat-tomcat.jar Is the Nio or Apr Connector enabled?
2013-06-04 09:40:36,285 WARN [org.atmosphere.cpr.AtmosphereFramework] - Using org.atmosphere.container.Tomcat7BIOSupportWithWebSocket
I followed the app structure described here, so this should not be a problem. I have noticed that changing the transport to "websocket" instead of "long-polling" shows no errors, and the server finally sends data, though :)
I followed your link and modified the code a little.
When you are at step 3, "The server is notified that new data is available from an external source, and broadcasts it to client A", you have to write a line like this:
BroadcasterFactory.getDefault().lookup("/*").broadcast(response);
At first I used the TextMessage received from my ActiveMQ queue, but I got the error below, so I used a Jackson class as the response object and everything worked fine.
jun 03, 2014 11:32:21 AM com.sun.jersey.spi.container.ContainerResponse write
SEVERE: A message body writer for Java class org.apache.activemq.command.ActiveMQTextMessage, and Java type class org.apache.activemq.command.ActiveMQTextMessage, and MIME media type application/json was not found
SEVERE: The registered message body writers compatible with the MIME media type are:
application/json (JSONJAXBElementProvider, JSONArrayProvider, JSONObjectProvider, JSONRootElementProvider, JSONListElementProvider, ...)
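To make the "Jackson class as the response object" part concrete: the writer error above happens because Jersey has no message body writer for ActiveMQTextMessage, so the fix is to unwrap the JMS message and broadcast a JSON-serializable type instead. A sketch of a JMS listener performing step 3, with a hypothetical Notification wrapper class:

import javax.jms.{Message, MessageListener, TextMessage}
import javax.xml.bind.annotation.XmlRootElement
import org.atmosphere.cpr.BroadcasterFactory
import scala.beans.BeanProperty

// Hypothetical JSON-friendly wrapper; the JAXB-based providers listed in the
// log need a root element annotation and a no-arg constructor.
@XmlRootElement
class Notification(@BeanProperty var text: String) {
  def this() = this("")
}

class QueueListener extends MessageListener {
  override def onMessage(message: Message): Unit = message match {
    case tm: TextMessage =>
      // Step 3: new data arrived from the external source (the queue);
      // broadcast a serializable object instead of the raw JMS message.
      BroadcasterFactory.getDefault().lookup("/*").broadcast(new Notification(tm.getText))
    case _ => () // ignore non-text messages
  }
}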