Cannot delete a queue via the artemis web console - activemq-artemis

I'm experimenting with Artemis 2.13.0 on Docker.
I can create a queue programmatically but I cannot delete it via the web console.
Sometimes destroyQueue appears as an option in the operations for the queue; other times it does not.
Recently I have not been able to destroy the queue from the console at all.
I get the following error:
java.lang.IllegalArgumentException : No operation destroyQueue found on MBean org.apache.activemq.artemis:address="example",broker="77643207e938",component=addresses,queue="example",routing-type="anycast",subcomponent=queues
Any ideas why this might be happening?
Thanks

destroyQueue is an operation that is only accessible through ActiveMQServerControl, i.e. to destroy the queue named TEST you would use the MBean org.apache.activemq.artemis:broker:
curl -H "Origin:http://localhost" -u admin:admin http://localhost:8161/console/jolokia/exec/org.apache.activemq.artemis:broker=%220.0.0.0%22/destroyQueue%28java.lang.String%29/TEST
You are getting this error because the MBean used in your request is org.apache.activemq.artemis:address, while it should be org.apache.activemq.artemis:broker as in the previous example.
This can happen if the wrong node is selected in the left panel.
To solve this issue, select the broker node in the left panel before executing the destroyQueue operation.
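If you need to do the same programmatically, a minimal JMX sketch along the following lines should also work; it assumes remote JMX access to the broker is enabled, and the service URL and broker name ("0.0.0.0" as in the example above) are placeholders to adjust for your environment:
import javax.management.JMX;
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;
import org.apache.activemq.artemis.api.core.management.ActiveMQServerControl;

public class DestroyQueueExample {
    public static void main(String[] args) throws Exception {
        // Assumed JMX service URL; adjust host/port to your broker's JMX settings.
        JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection connection = connector.getMBeanServerConnection();
            // destroyQueue lives on the broker MBean, not on the address/queue MBean.
            ObjectName broker = new ObjectName("org.apache.activemq.artemis:broker=\"0.0.0.0\"");
            ActiveMQServerControl control = JMX.newMBeanProxy(connection, broker, ActiveMQServerControl.class);
            control.destroyQueue("TEST");
        }
    }
}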

Related

Keycloak Cached clientScope not found

I am repeatedly getting this error in the Keycloak logs.
Attached below are the logs for reference.
The client scope in question is also not found if I search for it in the Keycloak admin console.
2022-09-22 04:04:12,718 ERROR [org.key.ser.err.KeycloakErrorHandler] (executor-thread-610) Uncaught server error: java.lang.IllegalStateException: Cached clientScope not found: 1e84ef04-9ef9-44fe-b1bd-f45e6d4
    at org.keycloak.models.cache.infinispan.RealmAdapter.lambda$getClientScopesStream$3(RealmAdapter.java:1495)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
    at java.base/java.util.ArrayList$ArrayListSpliterator.tryAdvance(ArrayList.java:1632)
    at
Steps to recreate:
Create a client scope and attach it to a client
Remove the client scope attached to the client
Delete the client scope
When this issue occurs, I see the well-known and many other endpoints start returning 500, as they seem to be fetching the cached scope and performing certain validations.
Is there any configuration I am missing that needs to be considered?

How do I resolve RESTEASY002186 so my Wildfly 26 web application can use SSE over https?

I have a web application running on Wildfly 26 that uses SSE broadcasting and works correctly with http. However, when I switch to using an https endpoint, I get Wildfly log entries of:
WARN [org.jboss.resteasy.resteasy_jaxrs.i18n] (default task-1)
RESTEASY002186: Failed to set servlet request into asynchronous mode,
server sent events may not work
This happens with each registration attempt of the https endpoint but I never see this when registering with the http endpoint.
Testing with curl against the http endpoint results in curl waiting for events to show up (and keeps printing them out as it receives them) until I quit. Using curl to test the https endpoint, I will see the same headers I got from the http endpoint, namely:
HTTP/1.1 200 OK
Connection: keep-alive
Transfer-Encoding: chunked
Content-Type: text/event-stream
But after printing out my registration successful event, curl seems to believe the stream is closed and exits -- giving me my command prompt back.
My @GET MediaType.SERVER_SENT_EVENTS registration endpoint will create an OutboundSseEvent and send it to the SseEventSink to acknowledge successful registration to my SseBroadcaster instance (this is the event curl sees and prints before exiting). I then log a registration successful message before exiting the method. All of this appears to work correctly for both http and https but the stream doesn't stay open once the request endpoint completes because of the failure to run asynchronously as outlined above.
I have not found information on the causes and/or workaround solutions for my RESTEASY002186 problem. I posted a question on this issue last week using the Wildfly Google Group (https://groups.google.com/g/wildfly/c/SO2eHdvMEko) but thought I would try a wider audience since this doesn't seem to be a commonly experienced condition. I don't see any indications during initialization that WildFly will be unable to use asynchronous mode, it just complains when it tries and fails... Any help would be greatly appreciated!
Edit 6/6/2022
The code is running on an isolated network so I can't just cut/paste the code here, but I gutted the resources file to a bare minimum -- just leaving enough for the client to be able to register. The problem remains unchanged. The code is now essentially:
#Path("sse")
public class SseResources {
#GET
#Produces(MediaType.SERVER_SENT_EVENTS)
public void listen(#Context Sse sse, #Context SseEventSink sseEventSink) {
SseRegComplete regComplete = new SseRegComplete("sse-server");
OutboundSseEvent event = sse.newEventBuilder().
name(regComplete.getType().toString()).
id(regComplete.getEventId()).
mediaType(MediaType.APPLICATION_JSON_TYPE).
data(SseRegComplete.class, regComplete).
comment("Event Stream Registration Completed Successfully").
build();
sseEventSink.send(event);
}
}
Before the above simplified code, I had declared the resource as @ApplicationScoped, had Sse injected into it, and kept a reference to the SseBroadcaster so I could use it whenever an event would come in. I was catching the events to broadcast by using an @Observes method (which I also got rid of). I was calling register(sseEventSink) on the SseBroadcaster in the listen method so I could later call broadcast(outboundEvent) whenever I had updates to publish. I got rid of all that just to see if I could get the stream to stay open but to no avail. I still get the RESTEASY002186 message and curl still exits after printing out the regComplete event sent to it in the code above.
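For reference, a minimal sketch of what that broadcaster setup looked like is below; the event names and the QueueUpdate observer type are placeholders based on the description above, not the exact original code:
@ApplicationScoped
@Path("sse")
public class SseResources {

    @Context
    private Sse sse;

    private SseBroadcaster broadcaster;

    @GET
    @Produces(MediaType.SERVER_SENT_EVENTS)
    public void listen(@Context SseEventSink sseEventSink) {
        if (broadcaster == null) {
            broadcaster = sse.newBroadcaster();
        }
        broadcaster.register(sseEventSink);
        // acknowledge registration to this particular sink
        sseEventSink.send(sse.newEventBuilder()
                .name("reg-complete")
                .data("registration successful")
                .build());
    }

    // fired elsewhere in the application via Event<QueueUpdate>.fire(...)
    public void onUpdate(@Observes QueueUpdate update) {
        if (broadcaster != null) {
            broadcaster.broadcast(sse.newEventBuilder()
                    .name("update")
                    .mediaType(MediaType.APPLICATION_JSON_TYPE)
                    .data(QueueUpdate.class, update)
                    .build());
        }
    }
}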
Edit 6/7/2022
Yesterday I was able to get my code working in a new vanilla Wildfly 26 install using an https endpoint URL by following these configuration instructions. Something I hadn't mentioned in the original post is that I am trying to add SSE functionality to an already existing app. It is several years old and we actually moved to Wildfly 26 about 6 months ago because of the log4j vulnerability in the earlier version of Wildfly we were using. I suspect that the problem is related to either our Wildfly configuration (perhaps because old settings were brought over that shouldn't have been) or some 3rd party dependency that is preventing Wildfly from using asynchronous mode.
We are using Shiro for authentication and authorization against an LDAP server -- perhaps Shiro has some hooks into the Wildfly runtime that are causing issues? After initial login, we use a session cookie in all subsequent calls. That is a difference from my test server, but I don't think it is relevant because the call definitely passed authentication before executing the registration code. The only other thing that comes to mind right now is that our web app ships with Logback and tells Wildfly not to use the default logging framework.
I plan to start today by comparing the two standalone.xml files to see if anything jumps out at me as being fundamentally different. Is there anything else I should be checking for differences (I think there is a domain.xml file somewhere...)?
Edit 6/14/2022
This definitely has something to do with Shiro being in the loop. When I edit the web.xml file so that Shiro's filter-mapping url-pattern does not include the SSE endpoint, everything works as expected.
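Roughly speaking, the web.xml change amounts to something like the snippet below (the filter name and paths are placeholders; since the servlet url-pattern syntax has no exclusion rule, the mapping has to be narrowed so it simply does not cover the SSE path):
<filter-mapping>
    <filter-name>ShiroFilter</filter-name>
    <!-- was /* ; narrowed so requests to the /sse endpoint bypass Shiro -->
    <url-pattern>/app/*</url-pattern>
</filter-mapping>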

How to connect KSQL with IBM Cloud Event Streams?

We created a project with IBM Functions and Event Streams in IBM Cloud.
Now I am trying to connect KSQL with IBM Cloud Event Streams, and I am following along with the documentation to get a basic idea of the integration.
Following the instructions, I created a file called ksql-server.properties and modified bootstrap.servers, username, and password according to my credentials. Then I ran ksql http://localhost:8088 --config-file ksql-server.properties with the KSQL local CLI. I assume everything has run correctly so far, since the ksql> prompt shows at the front of every new line...
Then I decided to check whether KSQL had connected to my IBM Cloud instance by running SHOW TOPICS;
It returned some error lines:
Error issuing POST to KSQL server. path:ksql
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Failed to set 'ssl.protocol' to 'TLSv1.2' (through reference chain: io.confluent.ksql.rest.entity.KsqlRequest["streamsProperties"])
Caused by: Failed to set 'ssl.protocol' to 'TLSv1.2' (through reference chain: io.confluent.ksql.rest.entity.KsqlRequest["streamsProperties"])
Caused by: Failed to set 'ssl.protocol' to 'TLSv1.2'
Caused by: Cannot override property 'ssl.protocol'
Also, I am quite lost at step 4, where it tells me to:
Then start DataGen twice as follows:
i. With bootstrap-server=HOSTNAME:PORTNUMBER quickstart=users format=json topic=users maxInterval=10000 to start creating users events.
ii. With bootstrap-server=HOSTNAME:PORTNUMBER quickstart=pageviews format=delimited topic=pageviews maxInterval=10000 to start creating pageviews events.
Has anyone done this before, or would anyone be willing to help me out? Thank you very much!
The IBM document is very out of date. KSQL runs as a client/server. The server needs to be run with the details of the broker, and then you can connect to it with a client, including the CLI, REST API, or web interface provided by Confluent Control Center.
So you need to run the KSQL server using your properties file:
./bin/ksql-server-start ksql-server.properties
and then connect to it with the CLI (for example):
./bin/ksql http://localhost:8088
See https://docs.confluent.io/current/ksql/docs/installation/installing.html for more information.
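For reference, a sketch of the broker-related settings that typically go into ksql-server.properties for IBM Event Streams is shown below; the values are placeholders taken from your service credentials, and exact property names can vary with the KSQL version:
listeners=http://localhost:8088
bootstrap.servers=BROKER1:9093,BROKER2:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="USER" password="PASSWORD";
ssl.protocol=TLSv1.2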

UI console to browse topics on Message Hub

I have a Message Hub instance on Bluemix, and am able to produce / consume messages off it. I was looking for a quick, reasonable way to browse topics / messages to see what's going on. Something along the lines of kafka-topics-ui.
I installed kafka-topics-ui locally, but could not get it to connect to Message Hub. I used the kafka-rest-url value from the MessageHub credentials in the kafka-topics-ui configuration file (env.js), but could not figure out where to provide the API key.
Alternatively, in the Bluemix UI, under Kibana, I can see log entries for creating the topic. Unfortunately, I could not see log entries for messages in the topic (perhaps I'm looking in the wrong place or have the wrong filters?).
My guess is I'm missing something basic. Is there a way to either:
configure a tool such as kafka-topics-ui to connect to MessageHub,
or,
browse topic messages easily?
Cheers.
According to Using the Kafka REST API on Bluemix you need an additional header in all API requests:
-H "X-Auth-Token: APIKEY"
A quick solution is to edit the kafka-topics-ui code and include your token in every request. Another solution would be to use a Chrome plugin that can inject the above header. For a more formal solution, I have opened a ticket on GitHub.
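For example, a small Java sketch of such a request (the /topics path is the standard Kafka REST API topic-listing endpoint; the REST URL and API key are the kafka_rest_url and api_key values from your Message Hub credentials, passed in as arguments here):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ListTopics {
    public static void main(String[] args) throws Exception {
        String kafkaRestUrl = args[0]; // the kafka_rest_url value from the credentials
        String apiKey = args[1];       // the api_key value from the credentials
        HttpURLConnection conn = (HttpURLConnection) new URL(kafkaRestUrl + "/topics").openConnection();
        // Message Hub requires the API key in the X-Auth-Token header on every request.
        conn.setRequestProperty("X-Auth-Token", apiKey);
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}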

MQJCA4004: Message delivery to an MDB 'null' failed with exception: 'deactivate of endpoint is in progress.'

I have an EAR deployed on a WAS 7.0.0.3 server, and the web service is also deployed.
The MQ listener is up and running on my WAS server, and the corresponding MQ host, channel name, and MQ queue name are configured correctly. It is the response queue.
Whenever I receive data from MQ, I get the below error in SystemOut.log:
"MQJCA4004: Message delivery to an MDB 'null' failed with exception:
'deactivate of endpoint is in progress.' "
Please help me on this.
I've managed to further analyze a similar MQJCA4004 error using this somewhat hackish technique.
You need a Java decompiler that preserves line numbers.
The JadClipse plugin for Eclipse can do that. Just make sure you have enabled:
Debug Settings
[x] Align code for debugging
in Window > Preferences > Java > Decompiler.
The standard IntelliJ IDEA Community decompiler already has that feature enabled.
The IBM class responsible for producing the MQJCA4004 error is
com.ibm.mq.connector.inbound.AbstractWorkImpl
When you open it (with decompilation enabled), you'll be able to find the string MQJCA4004. Place a breakpoint there and start remote debugging your application server. When you hit that breakpoint, you'll gain access to all the context information, including the exception (with call stack) that's causing this error.
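As a side note, remote debugging is usually enabled on WAS by turning on Debug Mode under the server's JVM settings (Java and Process Management > Process definition > Java Virtual Machine), which adds standard JDWP arguments along the lines of
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=7777
(the exact form and port depend on the WAS version; check the Debug arguments field). You can then attach Eclipse or IntelliJ IDEA as a remote debugger on that port.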