Can't produce to Confluent Kafka topic through Kafka REST API - apache-kafka

I'm having trouble producing to a Kafka topic through the REST API proxy.
I have a running Confluent Kafka cluster in which I'd like to create a topic and produce a message to it using the REST API.
For that purpose I followed the documentation and created an API key and secret.
I managed to create the topic:
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
-d "{\"topic_name\":\"test1\",\"partitions_count\":6,\"configs\":[]}" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics" | jq
-------------------
returns:
{
  "kind": "KafkaTopic",
  "metadata": {
    "self": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1",
    "resource_name": "crn:///kafka=lkc-1zoqz/topic=test1"
  },
  "cluster_id": "lkc-1zoqz",
  "topic_name": "test1",
  "is_internal": false,
  "replication_factor": 0,
  "partitions_count": 0,
  "partitions": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/partitions"
  },
  "configs": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/configs"
  },
  "partition_reassignments": {
    "related": "https://pkc-xmzwx.europe-central2.gcp.confluent.cloud/kafka/v3/clusters/lkc-1zoqz/topics/test1/partitions/-/reassignment"
  },
  "authorized_operations": []
}
MY PROBLEM:
I can't produce to that topic (in fact, I can't produce to ANY topic through the Kafka REST API):
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
--data "{"records":[{"value":"S2Fma2E="}]}" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics/test1"
-----------------
returns:
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
ALSO TRIED LIKE THIS:
curl -X POST -H "Authorization: Basic <BASE64_ENCODED_AUTH_KEY_SECRET>" \
-H "Content-Type: application/json" \
-d "{ \"key\": {\"value\": \"S2Fma2E=\"} }" \
"https://pkc-xmzwx.europe-central2.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-1zoqz/topics/test1/records"
----------------
Returns the same:
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
Could this have something to do with ACL management? I'm looking into that right now...

Are you using Confluent Platform (hosted) or the Confluent Cloud API?
If the latter, this API is not supported; here's the documentation on what the Cloud API supports: https://docs.confluent.io/cloud/current/api.html
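For reference, on a self-managed Confluent REST Proxy that ships the v3 produce endpoint, a produce request would look roughly like the sketch below. The proxy address and cluster id are placeholders and the body shape follows the v3 records API, so treat it as a sketch rather than something that will work against Confluent Cloud:
curl -X POST -H "Content-Type: application/json" \
-d '{"value": {"type": "BINARY", "data": "S2Fma2E="}}' \
"http://<rest-proxy-host>:8082/v3/clusters/<cluster-id>/topics/test1/records"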

Related

Error when registering schema. Problem to produce AVRO messages via Kafka REST. Error_code: 40801

I'm following the Confluent tutorial to produce messages (Produce and Consume Avro Messages), but when I post the messages, defining the schema for the Schema Registry, it gives the following error and I don't know how to continue. I've looked in several places and none of them mention this error:
{
  "error_code": 40801,
  "message": "Error when registering schema. format = AVRO, subject = teste-value, schema = {\"type\":\"record\",\"name\":\"teste\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}"
}
curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
-H "Accept: application/vnd.kafka.v2+json" \
--data '{"value_schema": "{\"type\": \"record\", \"name\": \"teste\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "teste"}}]}' \
"http://localhost:38082/topics/teste"
Can you format your request like the following and give it a try? This registers the schema directly with the Schema Registry:
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"Payment\",\"namespace\":\"io.confluent.examples.clients.basicavro\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"amount\",\"type\":\"double\"}]}"}' http://localhost:8081/subjects/test-value/versions

Create source representation instance

I followed the documentation to create an external replica of Cloud SQL here. I have a MySQL instance with version 8.0 on Google Cloud. I successfully created the external read replica. Now I want to demote this replica to master, and I need to create a source representation instance for that. I called the following API as mentioned in the documentation:
gcloud auth login
ACCESS_TOKEN="$(gcloud auth print-access-token)"
curl --header "Authorization: Bearer ${ACCESS_TOKEN}" \
--header 'Content-Type: application/json' \
--data '{
  "name": "[SOURCE_REPRESENTATION_NAME]",
  "region": "[REGION]",
  "databaseVersion": "[EXTERNAL_SERVER_MYSQL_VERSION]",
  "onPremisesConfiguration": {
    "hostPort": "[EXTERNAL_SERVER_IP]:[EXTERNAL_SERVER_PORT]"
  }
}' \
-X POST \
https://www.googleapis.com/sql/v1beta4/projects/[PROJECT-ID]/instances
The API works when I set databaseVersion to 5, but it fails when I try to set it to version 8.
{
  "error": {
    "code": 400,
    "message": "Missing parameter: DatabaseVersion.",
    "errors": [
      {
        "message": "Missing parameter: DatabaseVersion.",
        "domain": "global",
        "reason": "required"
      }
    ]
  }
}
Is MySQL version 8 not supported for the source representation instance?
From the documentation:
databaseVersion should be the MySQL version running on your source database server. The choices are MYSQL_5_5, MYSQL_5_6, MYSQL_5_7 or MYSQL_8_0.
So the value has to be the enum string (for example, MYSQL_8_0), not a bare number like 8.
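For illustration, a corrected request would pass the enum value; everything except databaseVersion is copied from the question, and the placeholders still need to be filled in:
curl --header "Authorization: Bearer ${ACCESS_TOKEN}" \
--header 'Content-Type: application/json' \
--data '{
  "name": "[SOURCE_REPRESENTATION_NAME]",
  "region": "[REGION]",
  "databaseVersion": "MYSQL_8_0",
  "onPremisesConfiguration": {
    "hostPort": "[EXTERNAL_SERVER_IP]:[EXTERNAL_SERVER_PORT]"
  }
}' \
-X POST \
https://www.googleapis.com/sql/v1beta4/projects/[PROJECT-ID]/instances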

How to create topics from command line in Kafka REST proxy

I am using the Kafka REST Proxy, but not the whole Confluent Platform, just Kafka REST with my Kafka brokers. However, I am not able to create topics from the command line with the following command:
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test
I want to know whether there is any other way to do this.
The command you have tried out is not meant to interact with the REST Proxy server of Kafka, but rather to interact with the Kafka cluster directly.
According to the Confluent REST Proxy API Reference the creation of a topic is only possible with the REST Proxy API v3 that is currently available as a preview feature.
"The API v3 can be used for evaluation and non-production testing purposes or to provide feedback to Confluent."
An example of a topic creation request is presented below and documented here:
POST /v3/clusters/cluster-1/topics HTTP/1.1
Host: kafkaproxy.example.com
Content-Type: application/vnd.api+json
Accept: application/vnd.api+json
{
  "data": {
    "attributes": {
      "topic_name": "topic-1",
      "partitions_count": 2,
      "replication_factor": 3,
      "configs": [
        {
          "name": "cleanup.policy",
          "value": "compact"
        }
      ]
    }
  }
}
Using curl:
curl -X POST -H "Content-Type: application/vnd.api+json" -H "Accept: application/vnd.api+json" \
--data '{"data":{"attributes": {"topic_name": "topic-1", "partitions_count": 2, "replication_factor": 1, "configs": [{"name": "cleanup.policy","value": "compact"}]}}}' \
"http://localhost:8082/v3/clusters/<cluster-id>/topics"
where the cluster-id can be identified using
curl -X GET -H "Accept: application/vnd.api+json" localhost:8082/v3/clusters

How to list all assets in a catalog in Watson Data Science Experience?

I have created a Watson Data Science Experience (DSX) account, created a catalog in it and added data assets to it.
I am trying to use the REST APIs as documented at: https://developer.ibm.com/api/view/id-1084:title-Watson_Data_Platform_Core_Services#id36962
... to retrieve the assets using curl.
curl -H "Authorization: Bearer <---stripped the auth token --->" -X GET 'https://api.dataplatform.ibm.com/v2/assets?catalog_id=bd2b56c3-091f-4ff5-beab-b3a1da85488d'
I get the following response:
{
  "errors": [
    {
      "code": "invalid_parameter",
      "message": "COMSV3006E: Missing or Invalid 'asset' id",
      "target": {
        "name": "asset",
        "type": "parameter"
      }
    }
  ],
  "trace": "e7b07khusvkj7s0ymgrggm6si"
}
How do I specify the asset id to retrieve the assets?
Also, I am looking to upload assets and assign metadata/tags to existing assets using the REST APIs. Is there any documentation or tutorial available that explains this?
One option is the search API, although it is listed as deprecated:
curl -X POST -d '{"query":"asset.asset_state:available"}' -H "Content-Type: application/json" https://api.dataplatform.ibm.com/v2/catalogs/<catalog_guid>/types/<type>/search -H "Authorization: Bearer ...."
https://developer.ibm.com/api/view/id-1084:title-Watson_Data_Platform_Core_Services#id37001
For <type>, you probably want data_asset, but you can also look up all existing types:
curl -X GET https://api.dataplatform.ibm.com/v2/catalogs/<catalog_guid>/types -H "Authorization: Bearer ...."
https://developer.ibm.com/api/view/id-1084:title-Watson_Data_Platform_Core_Services#id36916
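For example, with data_asset substituted for <type>, the search call would look like this (the catalog GUID and bearer token still need to be filled in):
curl -X POST -d '{"query":"asset.asset_state:available"}' -H "Content-Type: application/json" -H "Authorization: Bearer ...." https://api.dataplatform.ibm.com/v2/catalogs/<catalog_guid>/types/data_asset/search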

Couchbase create document fails through Sync Gateway public REST API

As per the Couchbase Sync Gateway REST API documentation, the cURL command below should create a document in the specified database.
Below is the cURL generated from Postman.
curl -X PUT -H "Cache-Control: no-cache" -H "Postman-Token: 498d0fb6-77ac-9335-2379-14258c6731c7" -d '' "http://192.168.244.174:4984/db/"
I also tried adding JSON to the body of the request.
But when I send the PUT request through Postman, instead of creating a new document, it tries to create a new database, and the JSON response is:
{
  "error": "Precondition Failed",
  "reason": "Database already exists"
}
Am I missing something, or is it a bug? Is there any other way to create a document through Sync Gateway?
There is a mistake in the documentation.
As per the documentation:
You can either specify the document ID by including the _id object in the request message body, or let the software generate an ID.
But the Couchbase REST API does not seem to work like that (maybe they are not updating their documentation regularly). You need to provide the id in the URL, like /{db}/{id}.
The below cURL worked for me.
curl -X PUT -H "Content-Type: application/json" -H "Cache-Control: no-cache" -H "Postman-Token: 75ab844e-5130-708e-69e9-e87f878108b4" -d '{"name": "xxx",
"full_name": "xxx yyy"}' "http://192.168.244.174:4984/db/123"
JSON response is
{
  "id": "123",
  "ok": true,
  "rev": "1-9324dabc947fc963a754b113d1215ac3"
}
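If you would rather let Sync Gateway generate the document ID, a POST to the database endpoint (instead of a PUT to /{db}/{id}) should also work; this is a sketch based on the Sync Gateway public REST API, reusing the same host and database as above:
curl -X POST -H "Content-Type: application/json" \
-d '{"name": "xxx", "full_name": "xxx yyy"}' "http://192.168.244.174:4984/db/"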