What's the proper way to update the "published" field on a Photo using the Facebook Graph API?

When I try to update the published field on an existing Photo object I get: OAuthException code 1: "An unknown error has occurred." Here's what I'm trying:
Upload an image:
$ curl -X POST -F published=false -F "access_token=$TOKEN" -F file=@soccer.jpg https://graph.facebook.com/$ALBUM_ID/photos
Grab the link and verify that the photo shows up for me, but not for my friends:
$ curl -X GET "https://graph.facebook.com/v2.6/$PHOTO_ID?fields=link&access_token=$TOKEN"
Set the published bit:
$ curl -X POST -F published=true -F "access_token=$TOKEN" https://graph.facebook.com/$PHOTO_ID
{"error":{"message":"An unknown error has occurred.","type":"OAuthException","code":1,"fbtrace_id":"D+Z1Gs9zZat"}}
According to the docs that field is available for updating. So is this a documentation bug or an API bug?
I'm trying to upload a bunch of pictures with published=false and then publish them all at the same time later by just updating the published field.
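In other words, the overall flow looks roughly like this (just a sketch; $PHOTO_IDS would be collected from the upload responses, and it assumes the publish update actually works):
# Stage 1: upload everything unpublished
for f in *.jpg; do
  curl -X POST -F published=false -F "access_token=$TOKEN" \
       -F "file=@$f" "https://graph.facebook.com/$ALBUM_ID/photos"
done
# Stage 2, later: flip each photo to published (this is the call that fails)
for id in "${PHOTO_IDS[@]}"; do
  curl -X POST -F published=true -F "access_token=$TOKEN" \
       "https://graph.facebook.com/$id"
done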

Looks like there's currently a bug in the platform. Here's the bug report.

Related

Can't upload feature type to Geoserver REST API using Curl

Geoserver version 2.20.1
I am attempting to register a PostGIS table as a layer in Geoserver.
Here is my curl command in bash:
curl -v -u $GEOSERVER_ADMIN_USER:$GEOSERVER_ADMIN_PASSWORD \
  -XPOST -H "Content-type: text/xml" \
  -d "\
<featureType>\
  <name>$dataset</name>\
  <title>$dataset</title>\
  <nativeCRS class='projected'>EPSG:4326</nativeCRS>\
  <srs>EPSG:4326</srs>\
  <nativeBoundingBox>\
    <minx>-94.0301461140306003</minx>\
    <maxx>-91.0935619356926054</maxx>\
    <miny>46.5128696410899991</miny>\
    <maxy>47.7878144308049002</maxy>\
    <crs class='projected'>EPSG:4326</crs>\
  </nativeBoundingBox>\
</featureType>" \
  http://geoserver:8080/geoserver/rest/workspaces/foropt/datastores/postgis/featuretypes
where $dataset is the name of the table.
Here is the error I am getting:
The request has not been applied because it lacks valid
authentication credentials for the target resource.
I have never seen this error before.
And I can't see how it's an issue with my credentials, since I am successfully performing other tasks (such as importing GeoTIFFs) within the same bash script using the same credentials. What is going on here?
In this situation, Geoserver is set up alongside PostGIS in a docker-compose arrangement.
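As a sanity check, other authenticated calls with the same credentials succeed, e.g. a plain GET like this (/rest/about/version is just a convenient authenticated endpoint to test against):
# Plain authenticated GET with the same credentials; returns version info
curl -u $GEOSERVER_ADMIN_USER:$GEOSERVER_ADMIN_PASSWORD \
  http://geoserver:8080/geoserver/rest/about/version.xml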
Interestingly enough, when I first posted, I was using Postgres version 14, PostGIS version 3.1.
When I revert to Postgres version 13, the error goes away (well, a new one crops up, but that seems to be a separate issue; you know how it goes). ¯\_(ツ)_/¯
I'm not familiar enough with Postgres versions to say what difference reverting to version 13 made (maybe there are security changes in version 14?), but it worked for me.
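For anyone in a similar docker-compose setup, "reverting" just meant pinning the database image back to a 13-based tag, something along these lines (the exact image name and tag are assumptions; use whatever matches your stack):
# Hypothetical tag: Postgres 13 + PostGIS 3.1 instead of Postgres 14
docker run -d --name postgis \
  -e POSTGRES_PASSWORD=secret \
  postgis/postgis:13-3.1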

Can't add a new schema in Kafka after deleting the old one

I'm trying to use a proper schema for keys too. By default, it is created as:
{"subject":"AVROTEST-key","version":1,"id":60,"schema":"\"string\""}
But I want it like:
{"subject":"AVROTEST-key","version":1,"id":60,"schema": "{\"type\":\"record\",\"name\":\"AVROTEST\",\"fields\":[{\"name\":\"key\",\"type\":\"long\"}]}"}
Because of the compatibility issues, I tried to delete it completely and add a new one. I've deleted it using
curl -X DELETE http://XXXXXX.XXXXXX:1234/subjects/AVROTEST-key/versions/1
There are no other versions, and I get a 404 error when I try to GET it after deleting, which means it was deleted successfully. But when I try to register a new schema, I get this error:
{"error_code":409,"message":"Schema being registered is incompatible with an earlier schema"}
How can it be incompatible with an earlier schema, while there is no schema? What am I missing?
This is how I register a new schema:
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"AVROTEST\",\"fields\":[{\"name\":\"key\",\"type\":\"long\"}]}"}' http://XXXXXX.XXXXXX:1234/subjects/AVROTEST-key/versions
Looks like there was another schema version which I didn't know about. A funny mistake, actually: I should have deleted all the versions instead of just version 1. So entering the command
curl -X DELETE http://XXXXXX.XXXXXX:1234/subjects/AVROTEST-key/versions/
solved my problem. All the previous schemas are deleted. But note that the new schema will not be registered as version 1; the ID keeps increasing from the latest schema ID.
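To avoid guessing next time, you can list every version still registered under the subject before deleting anything (the standard list-versions call, host copied from above):
# Shows all version numbers registered for the subject
curl -X GET http://XXXXXX.XXXXXX:1234/subjects/AVROTEST-key/versions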

Register schema operation failed while writing to the Kafka store [50001] (with documentation example)

I'm getting the above error while trying to register the schema in the confluent documentation (via the REST endpoint):
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"Payment\",\"namespace\":\"io.confluent.examples.clients.basicavro\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"amount\",\"type\":\"double\"}]}"}' http://localhost:8081/subjects/test-value/versions
Example can be found here:
https://docs.confluent.io/5.4.0/schema-registry/schema_registry_tutorial.html
Under the section Auto Schema Registration
The Schema Registry was installed via the Helm charts and is version 5.4.0.
Any help greatly appreciated
LOGS
Caused by: io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryStoreException: Error while registering the schema due to generating an ID that is already in use

Add New Schema/Subject to Schema Registry in Kafka Avro file format

I am trying to register a schema with the Schema Registry via an explicit curl command.
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema" : {"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' http://localhost:8081/subjects/avro-test/versions/
I am getting the below error
{"error_code":500,"message":"Internal Server Error"}
Note: I am able to access the data from the existing subject, but I get the same error while pushing to it.
The schema needs to be string-escaped.
For example, starting out:
POST -d '{"schema": "{\"type\":\"record\"
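Putting it together, the command from the question becomes (same schema, just quoted and escaped as one JSON string):
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"myrecord\",\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}"}' \
  http://localhost:8081/subjects/avro-test/versions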
If you can, install the jq tool and create an AVSC file instead; that would help - see my comment here.
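As a sketch of that jq route (the file name is hypothetical): jq -Rs reads the whole file as one raw JSON string, so the escaping is handled for you.
# myrecord.avsc holds the plain, unescaped Avro schema
jq -Rs '{schema: .}' myrecord.avsc \
  | curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
         --data @- http://localhost:8081/subjects/avro-test/versions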

Does Coverity have Rest API

I want to store results from Coverity® to InfluxDB and I was wondering does Coverity have REST API?
If you're only trying to dump data into InfluxDB, you can curl data from the REST API and insert the resulting JSON into the database. I do something similar, but in CSV format.
Create a view in Coverity 'Issues: By Snapshot' that contains all your defects.
Curl data from the Coverity view:
json format
curl --user <userid>:<password> \
  "http://<coverity_url>/api/viewContents/issues/v1/<View Name>?projectId=<project ID>&rowCount=-1"
csv format
curl --header "Accept: text/csv" --user <userid>:<password> \
  "http://<coverity_url>/api/viewContents/issues/v1/<View Name>?projectId=<project ID>&rowCount=-1"
Example:
If you created a view 'My Defects' in project 'My Project' the command would be
curl --user <userid>:<password> "http://<coverity_url>/api/viewContents/issues/v1/My%20Defects?projectId=My%20Project&rowCount=-1"
In the above URL:
%20 -- URL-encoded space
rowCount=-1 -- download all rows in the view; you can set it to a desired limit instead
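If you want to push straight into InfluxDB from there, here is a rough sketch (not what I run; the database name, measurement name, and the totalRows field in the response are all assumptions, so adapt it to the JSON you actually get back):
# Pull the view as JSON, count rows with jq, write one point using the
# InfluxDB 1.x write endpoint and line protocol
count=$(curl -s --user <userid>:<password> \
  "http://<coverity_url>/api/viewContents/issues/v1/My%20Defects?projectId=My%20Project&rowCount=-1" \
  | jq '.viewContentsV1.totalRows')
curl -XPOST "http://localhost:8086/write?db=coverity" \
  --data-binary "defects,project=MyProject count=$count"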
Not really, no.
There is a very limited REST API, but it only covers a few very specific things. I'd recommend using cov-manage-im where you can, and only using the SOAP API if you need something more.
cov-manage-im can help; it can be used to retrieve defects for specific projects and streams. cov-manage-im --help can give you more info:
cov-manage-im --host youcovhostname --user yourusername --password yourpassword --mode defects --show --project yourprojectname