I am using the Kafka REST Proxy, but not the whole Confluent Platform, just Kafka REST with my Kafka brokers. But I am not able to create topics from the command line with the following command:
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test
Is there any other way to do this?
The command you have tried is not meant to interact with the Kafka REST Proxy server; it talks to the Kafka cluster directly.
According to the Confluent REST Proxy API Reference, creating a topic is only possible with the REST Proxy API v3, which is currently available as a preview feature.
"The API v3 can be used for evaluation and non-production testing purposes or to provide feedback to Confluent."
An example of a topic creation request is presented below and documented here:
POST /v3/clusters/cluster-1/topics HTTP/1.1
Host: kafkaproxy.example.com
Content-Type: application/vnd.api+json
Accept: application/vnd.api+json

{
  "data": {
    "attributes": {
      "topic_name": "topic-1",
      "partitions_count": 2,
      "replication_factor": 3,
      "configs": [
        {
          "name": "cleanup.policy",
          "value": "compact"
        }
      ]
    }
  }
}
Using curl:
curl -X POST -H "Content-Type: application/vnd.api+json" -H "Accept: application/vnd.api+json" \
--data '{"data":{"attributes": {"topic_name": "topic-1", "partitions_count": 2, "replication_factor": 1, "configs": [{"name": "cleanup.policy","value": "compact"}]}}}' \
"http://localhost:8082/v3/clusters/<cluster-id>/topics"
where the cluster-id can be identified using
curl -X GET -H "Accept: application/vnd.api+json" localhost:8082/v3/clusters
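If jq is available, the id can be pulled straight out of that response; a sketch, assuming the response shape where clusters sit in a top-level data array (older previews may nest the field under attributes instead):
curl -s -H "Accept: application/vnd.api+json" localhost:8082/v3/clusters | jq -r '.data[0].cluster_id'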
I'm following the Confluent tutorial to produce messages (Produce and Consume Avro Messages), but when I POST messages with the schema defined inline, it gives the following error and I don't know how to continue. I've looked in several places and none of them mention this error.
{
  "error_code": 40801,
  "message": "Error when registering schema. format = AVRO, subject = teste-value, schema = {\"type\":\"record\",\"name\":\"teste\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}"
}
curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
-H "Accept: application/vnd.kafka.v2+json" \
--data '{"value_schema": "{\"type\": \"record\", \"name\": \"teste\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "teste"}}]}' \
"http://localhost:38082/topics/teste"
Can you format your request like the following and give it a try? This registers the schema directly with the Schema Registry:
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"Payment\",\"namespace\":\"io.confluent.examples.clients.basicavro\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"amount\",\"type\":\"double\"}]}"}' http://localhost:8081/subjects/test-value/versions
In order to transmit HL7v2 messages over TCP/IP connections using the Minimal Lower Layer Protocol (MLLP), I'm following this guide. When I get to the part where I create an HL7v2 store configured with a Pub/Sub topic (here), I get an error.
This is what I typed in my terminal:
curl -X POST \
--data "{
'notificationConfigs': [
{
'pubsubTopic': 'projects/PROJECT_ID/topics/PUBSUB_TOPIC',
'filter': ''
}
]
}" \
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/hl7V2Stores?hl7V2StoreId=HL7V2_STORE_ID"
This is the error I get:
{
  "error": {
    "code": 403,
    "message": "Permission healthcare.hl7V2Stores.create denied on resource projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID (or it may be malformed or not exist)",
    "status": "PERMISSION_DENIED"
  }
}
The dataset projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID exists; I double-checked it.
So, is this somehow related to my permissions (IAM policy)? I don't understand, because I have the Administer HL7v2 Stores role.
How can I create my HL7v2 store without getting this error?
I found out that the command gcloud auth application-default print-access-token was not returning the correct token, but gcloud auth print-access-token is.
So, with the right auth token, the command works and I get the correct response:
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/hl7V2Stores/HL7V2_STORE_ID",
  "notificationConfigs": [
    {
      "pubsubTopic": "projects/PROJECT_ID/topics/PUBSUB_TOPIC"
    }
  ]
}
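For reference, the working request differs from the original only in the token helper (the body is also rewritten as strict JSON; placeholders as in the question):
curl -X POST \
  --data '{"notificationConfigs": [{"pubsubTopic": "projects/PROJECT_ID/topics/PUBSUB_TOPIC", "filter": ""}]}' \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/hl7V2Stores?hl7V2StoreId=HL7V2_STORE_ID"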
I am having trouble producing to a Kafka topic through the REST API proxy.
I have a running Confluent Kafka cluster in which I'd like to create a topic and produce a message to it using the REST API.
For that purpose I have followed the documentation and created an API key and secret.
I managed to create the topic:
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
-d "{\"topic_name\":\"test1\",\"partitions_count\":6,\"configs\":[]}" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics" | jq
-------------------
returns:
{
  "kind": "KafkaTopic",
  "metadata": {
    "self": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1",
    "resource_name": "crn:///kafka=lkc-1zoqz/topic=test1"
  },
  "cluster_id": "lkc-1zoqz",
  "topic_name": "test1",
  "is_internal": false,
  "replication_factor": 0,
  "partitions_count": 0,
  "partitions": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/partitions"
  },
  "configs": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/configs"
  },
  "partition_reassignments": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/partitions/-/reassignment"
  },
  "authorized_operations": []
}
MY PROBLEM:
I can't produce to that topic (in fact, I can't produce to ANY topic through the Kafka REST API):
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
--data "{"records":[{"value":"S2Fma2E="}]}" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics/test1"
-----------------
returns:
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
ALSO TRIED LIKE THIS:
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
-d "{ \"key\": {\"value\": \"S2Fma2E=\"} }" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics/test1/records"
----------------
Returns the same:
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
I don't know if this has something to do with ACL management? I'm looking into that right now...
Are you using Confluent Platform (self-hosted) or the Confluent Cloud API?
If the latter, this API is not supported; here's the documentation on what is supported by the Cloud API: https://docs.confluent.io/cloud/current/api.html
We have an SSL-enabled Kafka broker, and Schema Registry access is through Keycloak. From an external machine, I am able to send data using kafka-console-producer; below are my configs.
ssl.properties:
security.protocol=SASL_SSL
ssl.truststore.location=truststore.jks
ssl.truststore.password=password
sasl.mechanism=PLAIN
jaas.conf:
KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="<user-name>"
  password="<password>";
};
export KAFKA_OPTS="-Djavax.net.ssl.trustStore=truststore.jks -Djavax.net.ssl.trustStorePassword=password -Djava.security.auth.login.config=jaas.conf"
./kafka-console-producer --bootstrap-server broker-url:<external_port> --topic sample.data --producer.config ssl.properties
>Hi sample data sent
I am able to see the messages using the console consumer.
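For completeness, the consumer side can be checked with the same client configs (a sketch mirroring the producer flags above):
./kafka-console-consumer --bootstrap-server broker-url:<external_port> \
  --topic sample.data --from-beginning \
  --consumer.config ssl.properties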
Now, for the Schema Registry I need to get a token, as shown below:
curl -k --data 'grant_type=password&client_id=schema-registry-client&username=username&password=password' https://<keycloakurl>/auth/realms/<namespace>/protocol/openid-connect/token
output:
{"access_token":"<access_token>","expires_in":600,"refresh_expires_in":1800,"refresh_token":"<refresh_token>","token_type":"bearer","not-before-policy":0,"session_state":"4117e69c-afe9-43ae-9756-90b151f0b536","scope":"profile email"}
curl -k -H "Authorization: Bearer <access_token>" "https://<sc_url>/schemaregistry/subjects"
output:
["test.data-value"]
The question is: how can I use the access_token in avro-console-producer? I don't see a way.
Based on the source code for the Schema Registry REST client, something like these properties should be what you want (untested; note that values in a properties file should not be quoted):
bearer.auth.credentials.source=STATIC_TOKEN
bearer.auth.token=<your token>
While you can create a topic via Java or Java-based languages (see here), there does not seem to be a clean way to do this without using Java. As a result, pure-language client APIs (like kafka-node, a pure JavaScript client) can't directly create topics. Instead, we have two options:
1) Use a hack like sending a metadata request to a topic -- if auto.create.topics.enable is set to true, then you can create a topic -- but only with the default configuration, no control over partitions, etc.
2) Write a wrapper around a Java-based client just for topic creation. The easiest way to do this is to exec the script bin/kafka-topics.sh with command line arguments, which is ugly, to say the least.
Is there a better way to do this, though? There's a pure-JavaScript client for Zookeeper, node-zookeeper-client, what happens if I manipulate broker / partition info directly in Zookeeper?
Any other thoughts?
You can now use the REST Proxy API v3 to create Kafka topics with HTTP requests from non-Java languages.
According to the Confluent REST Proxy API Reference, creating a topic is possible with the REST Proxy API v3, which is currently available as a preview feature.
"The API v3 can be used for evaluation and non-production testing purposes or to provide feedback to Confluent."
An example of a topic creation request is presented below and documented here:
POST /v3/clusters/cluster-1/topics HTTP/1.1
Host: kafkaproxy.example.com
Content-Type: application/vnd.api+json
Accept: application/vnd.api+json

{
  "data": {
    "attributes": {
      "topic_name": "topic-1",
      "partitions_count": 2,
      "replication_factor": 3,
      "configs": [
        {
          "name": "cleanup.policy",
          "value": "compact"
        }
      ]
    }
  }
}
Using curl:
curl -X POST -H "Content-Type: application/vnd.api+json" -H "Accept: application/vnd.api+json" \
--data '{"data":{"attributes": {"topic_name": "topic-1", "partitions_count": 2, "replication_factor": 1, "configs": [{"name": "cleanup.policy","value": "compact"}]}}}' \
"http://localhost:8082/v3/clusters/<cluster-id>/topics"
where the cluster-id can be identified using
curl -X GET -H "Accept: application/vnd.api+json" localhost:8082/v3/clusters
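Once created, the topic can be verified from any language with a plain GET against the same proxy (a sketch; substitute the cluster id found above):
curl -H "Accept: application/vnd.api+json" \
  "http://localhost:8082/v3/clusters/<cluster-id>/topics/topic-1"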