Flink error: Could not find a suitable table factory for 'org.apache.flink.table.factories.BatchTableSourceFactory' in the classpath - scala

I'm new to Apache Flink and I'm trying to read an Avro file as follows:
val schema = new Schema()
  .field("tconst", "string")
  .field("titleType", "string")
  .field("primaryTitle", "string")
  .field("originalTitle", "string")
  .field("isAdult", "int")
  .field("startYear", "string")
  .field("endYear", "string")
  .field("runtimeMinutes", "int")
  .field("genres", "string")

val avroFormat: Avro = new Avro()
  .avroSchema(
    "{" +
    "  \"type\": \"record\"," +
    "  \"name\": \"test\"," +
    "  \"fields\" : [" +
    "    {\"name\": \"tconst\", \"type\": \"string\"}," +
    "    {\"name\": \"titleType\", \"type\": \"string\"}," +
    "    {\"name\": \"primaryTitle\", \"type\": \"string\"}," +
    "    {\"name\": \"originalTitle\", \"type\": \"string\"}," +
    "    {\"name\": \"isAdult\", \"type\": \"int\"}," +
    "    {\"name\": \"startYear\", \"type\": \"string\"}," +
    "    {\"name\": \"endYear\", \"type\": \"string\"}," +
    "    {\"name\": \"runtimeMinutes\", \"type\": \"int\"}," +
    "    {\"name\": \"genres\", \"type\": \"string\"}" +
    "  ]" +
    "}"
  )

tableEnv.connect(new FileSystem().path("/Users/x/Documents/test_1.avro"))
  .withSchema(schema)
  .withFormat(avroFormat)
  .registerTableSource("sink")
But when I run this, I get the following error:
Exception in thread "main" org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.BatchTableSourceFactory' in the classpath.
Reason: No context matches.
The following properties are requested:
connector.path=/Users/x/Documents/test_1.avro
connector.property-version=1
connector.type=filesystem
format.avro-schema=.... // above schema
format.property-version=1
format.type=avro
schema.0.name=tconst
schema.0.type=string
schema.1.name=titleType
schema.1.type=string
schema.2.name=primaryTitle
schema.2.type=string
schema.3.name=originalTitle
schema.3.type=string
schema.4.name=isAdult
schema.4.type=int
schema.5.name=startYear
schema.5.type=string
schema.6.name=endYear
schema.6.type=string
schema.7.name=runtimeMinutes
schema.7.type=int
schema.8.name=genres
schema.8.type=string
The following factories have been considered:
org.apache.flink.api.java.io.jdbc.JDBCTableSourceSinkFactory
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.table.sinks.CsvBatchTableSinkFactory
org.apache.flink.table.sinks.CsvAppendTableSinkFactory
org.apache.flink.formats.avro.AvroRowFormatFactory
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
This Avro file was written from a Flink DataSet using AvroOutputFormat:
val avroOutputFormat = new AvroOutputFormat[Row](classOf[Row])
flinkDataset.write(avroOutputFormat, "/Users/x/Documents/test_1.avro").setParallelism(1)
I'm wondering whether a wrong data type could cause the mentioned error. Is there a way to identify the exact cause of the problem?

Sorry for misguiding you. As of now, the filesystem connector unfortunately does not support Avro, so there is no option but to use the DataSet API. I recommend using avrohugger to generate an appropriate Scala class for your Avro schema.
// convert the Table to a DataSet of your generated Scala class
val dsTuple: DataSet[User] = tableEnv.toDataSet[User](table)

// write it out with AvroOutputFormat
val avroOutputFormat = new AvroOutputFormat[User](classOf[User])
avroOutputFormat.setCodec(AvroOutputFormat.Codec.SNAPPY)
avroOutputFormat.setSchema(User.SCHEMA$)
dsTuple.write(avroOutputFormat, outputPath)
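For reading the Avro file back with the DataSet API, a minimal sketch (assuming an avrohugger-generated User class and flink-avro on the classpath) could look like this:
import org.apache.flink.api.scala._
import org.apache.flink.core.fs.Path
import org.apache.flink.formats.avro.AvroInputFormat

val env = ExecutionEnvironment.getExecutionEnvironment

// AvroInputFormat deserializes the records directly into the generated class
val inputFormat = new AvroInputFormat[User](
  new Path("/Users/x/Documents/test_1.avro"), classOf[User])
val users: DataSet[User] = env.createInput(inputFormat)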

Related

Rest Assured cannot be resolved to a variable

I have created a Java project and am getting this error in my console:
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
RestAssured cannot be resolved to a variable
I added the jar rest-assured-4.3.3-dist.zip (all extracted) from the official website: https://github.com/rest-assured/rest-assured/wiki/Downloads
Here is my code:
//java class basics
import io.restassured.RestAssured;
import static io.restassured.RestAssured.*;

public class Basics {
    public static void main(String[] args) {
        //adding given, when, then conditions
        RestAssured.baseURI = "https://rahulshettyacademy.com"; //added the base URI here
        //adding given condition here with log report
        given().log().all().queryParam("key", "qaclick123").header("Content-Type", "application/json")
            .body("{\r\n" +
                "  \"location\": {\r\n" +
                "    \"lat\": -38.383494,\r\n" +
                "    \"lng\": 33.427362\r\n" +
                "  },\r\n" +
                "  \"accuracy\": 50,\r\n" +
                "  \"name\": \" Muzammil house\",\r\n" +
                "  \"phone_number\": \"(+91) 983 893 3937\",\r\n" +
                "  \"address\": \"29, side layout, cohen 09\",\r\n" +
                "  \"types\": [\r\n" +
                "    \"shoe park\",\r\n" +
                "    \"shop\"\r\n" +
                "  ],\r\n" +
                "  \"website\": \"http://google.com\",\r\n" +
                "  \"language\": \"French-IN\"\r\n" +
                "}") // end of body
            .when().post("maps/api/place/add/json") // added the resource here
            .then().log().all().assertThat().statusCode(200); // validating response here
    }
}
How do I resolve this?
I assume that you are using Maven. If that is the case, you need to remove the
<scope>test</scope>
node from your rest-assured dependency in the pom.xml file. If you are not using Maven, then check your build path and make sure that you added all the required .jar files to the project.
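For example, the dependency entry would then look roughly like this (assuming the rest-assured 4.3.3 artifact from Maven Central):
<dependency>
    <groupId>io.rest-assured</groupId>
    <artifactId>rest-assured</artifactId>
    <version>4.3.3</version>
    <!-- no <scope>test</scope>, so RestAssured is visible from src/main/java -->
</dependency>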

Failed to connect to Confluent Platform Schema Registry - Apache Flink SQL Confluent Avro Format

I am using a Confluent managed Kafka cluster and Schema Registry service, and I'm trying to process Debezium messages in a Flink job. The job is configured to use the Table & SQL connectors and the Confluent Avro format.
However, the job is not able to connect to the Schema Registry and raises a 401 error.
Table Connector configurations
tEnv.executeSql("CREATE TABLE flink_test_1 (\n" +
" ORDER_ID STRING,\n" +
" ORDER_TYPE STRING,\n" +
" USER_ID STRING,\n" +
" ORDER_SUM BIGINT\n" +
") WITH (\n" +
" 'connector' = 'kafka',\n" +
" 'topic' = 'flink_test_1',\n" +
" 'scan.startup.mode' = 'earliest-offset',\n" +
" 'format' = 'avro-confluent',\n" +
" 'avro-confluent.schema-registry.url' = 'https://<SR_ENDPOINT>',\n" +
" 'avro-confluent.schema-registry.subject' = 'flink_test_1-value',\n" +
" 'properties.basic.auth.credentials.source' = 'USER_INFO',\n" +
" 'properties.basic.auth.user.info' = '<SR_API_KEY>:<SR_API_SECRET>',\n" +
" 'properties.bootstrap.servers' = '<CLOUD_BOOTSTRAP_SERVER_ENDPOINT>:9092',\n" +
" 'properties.security.protocol' = 'SASL_SSL',\n" +
" 'properties.ssl.endpoint.identification.algorithm' = 'https',\n" +
" 'properties.sasl.mechanism' = 'PLAIN',\n" +
" 'properties.sasl.jaas.config' = 'org.apache.kafka.common.security.plain.PlainLoginModule required username=\"<CLUSTER_API_KEY>\" password=\"<CLUSTER_API_SECRET>\";'\n" +
")");
Error Message
Caused by: java.io.IOException: Failed to deserialize Avro record.
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:101)
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:44)
at org.apache.flink.api.common.serialization.DeserializationSchema.deserialize(DeserializationSchema.java:82)
at org.apache.flink.streaming.connectors.kafka.table.DynamicKafkaDeserializationSchema.deserialize(DynamicKafkaDeserializationSchema.java:113)
at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:179)
at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.runFetchLoop(KafkaFetcher.java:142)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:826)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:241)
Caused by: java.io.IOException: Could not find schema with id 100256 in registry
at org.apache.flink.formats.avro.registry.confluent.ConfluentSchemaRegistryCoder.readSchema(ConfluentSchemaRegistryCoder.java:77)
at org.apache.flink.formats.avro.RegistryAvroDeserializationSchema.deserialize(RegistryAvroDeserializationSchema.java:70)
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:98)
... 9 more
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:292)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:352)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:660)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:642)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:217)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaBySubjectAndId(CachedSchemaRegistryClient.java:291)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaById(CachedSchemaRegistryClient.java:276)
at io.confluent.kafka.schemaregistry.client.SchemaRegistryClient.getById(SchemaRegistryClient.java:64)
at org.apache.flink.formats.avro.registry.confluent.ConfluentSchemaRegistryCoder.readSchema(ConfluentSchemaRegistryCoder.java:74)
... 11 more
I successfully tested the connection to Schema Registry by:
curl -u <SR_API_KEY>:<SR_API_SECRET> https://<SR_ENDPOINT>
It seems like the error message "io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401" clearly says that <SR_API_KEY>:<SR_API_SECRET> were not passed to the Confluent Schema Registry.
I checked the documentation here: https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/formats/avro-confluent.html, where only three format options are described ("format", "avro-confluent.schema-registry.url", "avro-confluent.schema-registry.subject") and there are no options for specifying SR_API_KEY and SR_API_SECRET.
I can't figure out how to successfully connect to the secure schema registry from the Flink program.
Is this connection type supported by Flink?
Does anyone know what the correct connection configuration should look like?
Thanks.
I got the same issue.
After some investigation, I found a Jira ticket about it.
If you can't upgrade your Flink version, you can first use the DataStream API
to consume the data and then convert the stream to a Table.
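A minimal sketch of that workaround, assuming a Flink version where ConfluentRegistryAvroDeserializationSchema.forGeneric accepts a map of schema-registry client configs (the basic.auth.* keys are standard Confluent client properties; avroSchemaString stands in for your reader schema):
import java.util.Properties
import scala.collection.JavaConverters._
import org.apache.avro.Schema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

val env = StreamExecutionEnvironment.getExecutionEnvironment

// pass the basic-auth credentials directly to the schema registry client
val registryConfigs = Map(
  "basic.auth.credentials.source" -> "USER_INFO",
  "basic.auth.user.info" -> "<SR_API_KEY>:<SR_API_SECRET>"
).asJava

val readerSchema = new Schema.Parser().parse(avroSchemaString)
val deserializer = ConfluentRegistryAvroDeserializationSchema.forGeneric(
  readerSchema, "https://<SR_ENDPOINT>", registryConfigs)

val kafkaProps = new Properties()
kafkaProps.setProperty("bootstrap.servers", "<CLOUD_BOOTSTRAP_SERVER_ENDPOINT>:9092")
kafkaProps.setProperty("security.protocol", "SASL_SSL")
kafkaProps.setProperty("sasl.mechanism", "PLAIN")
// plus the same sasl.jaas.config as in the table definition above

// consume GenericRecords, then map them to a typed stream and register
// it with the table environment (e.g. via tableEnv.fromDataStream(...))
val records = env.addSource(
  new FlinkKafkaConsumer("flink_test_1", deserializer, kafkaProps))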

Getting Object tag value in AWS S3

I am using Scala to get information about the objects in my S3 bucket. I am interested in getting each object's tag value. So far I have this code that gets information about my objects, but I did not succeed in getting their tag values. I used this code in Scala:
def retrieveObjectTags(keyName: String): Unit = {
  try {
    println("Listing objects")
    val req: ListObjectsV2Request =
      new ListObjectsV2Request().withBucketName(bucketName).withMaxKeys(2)
    var result: ListObjectsV2Result = null
    do {
      result = client.listObjectsV2(req)
      for (objectSummary <- result.getObjectSummaries) {
        println(" - " + objectSummary.getKey + " (size = " + objectSummary.getSize + ")")
        println(objectSummary.getETag)
      }
      println("Next Continuation Token : " + result.getNextContinuationToken)
      req.setContinuationToken(result.getNextContinuationToken)
    } while (result.isTruncated)
  } catch {
    case ase: AmazonServiceException =>
      println(
        "Caught an AmazonServiceException, which means your request made it " +
          "to Amazon S3, but was rejected with an error response for some reason.")
      println("Error Message: " + ase.getMessage)
      println("HTTP Status Code: " + ase.getStatusCode)
      println("AWS Error Code: " + ase.getErrorCode)
      println("Error Type: " + ase.getErrorType)
      println("Request ID: " + ase.getRequestId)
    case ace: AmazonClientException =>
      println(
        "Caught an AmazonClientException, which means the client encountered " +
          "an internal error while trying to communicate with S3, " +
          "such as not being able to access the network.")
      println("Error Message: " + ace.getMessage)
  }
  // val getTaggingRequest = new GetObjectTaggingRequest(bucketName, keyName)
  // var getTagResult = client.getObjectTagging(getTaggingRequest)
  // println(getTaggingRequest)
  var tag: Tag = new Tag()
  println("tag name:" + tag.getValue)
}
As for the commented-out lines, I have encountered a problem with them. What other way can I use to solve this problem?
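For reference, a minimal sketch of fetching each object's tags with the same AWS SDK for Java v1 client used above (GetObjectTaggingResult exposes the tag set as a list of Tag objects):
import scala.collection.JavaConverters._
import com.amazonaws.services.s3.model.GetObjectTaggingRequest

// fetch the tag set for one object and print its key/value pairs
val taggingResult = client.getObjectTagging(new GetObjectTaggingRequest(bucketName, keyName))
for (tag <- taggingResult.getTagSet.asScala) {
  println("tag name: " + tag.getKey + ", tag value: " + tag.getValue)
}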

Yet another "Failed to validate oauth signature and token"

I have this problem that many others have been through. I'm doing everything correctly, but I still get this annoying "Failed to validate oauth signature and token" error :)
Well, something has got to be wrong, I guess.
I'm trying to obtain a request_token by making a POST to "https://api.twitter.com/oauth/request_token" with these headers:
Authorization:
OAuth oauth_consumer_key="MYVq....................ywj2g",
oauth_nonce="m8NG0s4oc87AOIpuILafAeI1YoMv5Mu9",
oauth_signature="Bxb%252FFIfOG9KLVj%252FUNdV%252FycVlGPs%253D",
oauth_signature_method="HMAC-SHA1",
oauth_timestamp="1378976842",
oauth_version="1.0"
But it complains about signature and token.
Is my signature invalid somehow?
And for this request I don't need a token, right?
I can't figure out what's wrong.
Here's some of my getRequestToken code:
val oauth_consumer_key: String = CONSUMER_KEY
val oauth_nonce: String = generateNonce()
val oauth_timestamp: String = (System.currentTimeMillis / 1000).toString
var oauth_signature: String = ""
val oauth_signature_method: String = "HMAC-SHA1"
val oauth_version: String = "1.0"
val PARAMETER_STRING: String =
"oauth_consumer_key=" + oauth_consumer_key + "&" +
"oauth_nonce=" + oauth_nonce + "&" +
"oauth_signature_method=" + oauth_signature_method + "&" +
"oauth_timestamp=" + oauth_timestamp + "&" +
"oauth_version=" + oauth_version
val BASE_STRING: String =
"POST&" + URLEncoder.encode("https://api.twitter.com/oauth/request_token", "UTF-8") + "&" + URLEncoder.encode(PARAMETER_STRING, "UTF-8")
oauth_signature = getSignature(CONSUMER_SECRET, BASE_STRING, "HmacSHA1")
val AUTHORIZATION = "OAuth " +
"oauth_consumer_key=\"" + URLEncoder.encode(oauth_consumer_key, "UTF-8") +
"\", oauth_nonce=\"" + URLEncoder.encode(oauth_nonce, "UTF-8") +
"\", oauth_signature=\"" + URLEncoder.encode(oauth_signature, "UTF-8") +
"\", oauth_signature_method=\"" + URLEncoder.encode(oauth_signature_method, "UTF-8") +
"\", oauth_timestamp=\"" + URLEncoder.encode(oauth_timestamp, "UTF-8") +
"\", oauth_version=\"" + URLEncoder.encode(oauth_version, "UTF-8") + "\""
WS.url("https://api.twitter.com/oauth/request_token")
  .withHeaders("Authorization" -> AUTHORIZATION)
  .post(Results.EmptyContent())
  .map(response => {
    if (response.status != 200) Logger.error(response.body) // THIS IS WHERE I GET THE ERROR
    else {
      if ((response.json \ "oauth_callback_confirmed").as[String] == "true") {
        REQUEST_TOKEN = (response.json \ "oauth_token").as[String]
        REQUEST_SECRET = (response.json \ "oauth_token_secret").as[String]
        requestDone.success(true)
      }
    }
  })
OK, so I've got everything to work (without the oauth_callback parameter; if I add it, I get the error again).
I get the request token, which is valid: when I manually paste the authenticate URL into the browser together with the generated request token, I get redirected to the Twitter authenticate page, then a correct callback is made and the result is correct as well (token, token_secret, user_id and screen_name).
But my code seems to ignore my redirect to this authorize page.
requestToken_future.map { result =>
  Redirect("https://api.twitter.com/oauth/authenticate?oauth_token=" + REQUEST_TOKEN)
}
If I put a Logger inside the braces, it shows the log in my terminal window. But the Redirect seems to just be ignored; it never goes off.
You haven't included the oauth_callback parameter, which is required. See the documentation here.
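For illustration, a sketch of how the code above might include it (the callback URL is a hypothetical placeholder; oauth_callback must be percent-encoded and included both in the sorted parameter string that gets signed and in the Authorization header):
// hypothetical callback URL registered with your Twitter app
val oauth_callback: String = "https://myapp.example.com/twitter/callback"

// oauth_callback sorts first alphabetically, so it leads the parameter string;
// its value must be percent-encoded before signing
val PARAMETER_STRING: String =
  "oauth_callback=" + URLEncoder.encode(oauth_callback, "UTF-8") + "&" +
  "oauth_consumer_key=" + oauth_consumer_key + "&" +
  "oauth_nonce=" + oauth_nonce + "&" +
  "oauth_signature_method=" + oauth_signature_method + "&" +
  "oauth_timestamp=" + oauth_timestamp + "&" +
  "oauth_version=" + oauth_version

// and the same parameter goes into the Authorization header as well
val AUTHORIZATION = "OAuth " +
  "oauth_callback=\"" + URLEncoder.encode(oauth_callback, "UTF-8") +
  "\", oauth_consumer_key=\"" + URLEncoder.encode(oauth_consumer_key, "UTF-8") +
  "\", oauth_nonce=\"" + URLEncoder.encode(oauth_nonce, "UTF-8") +
  "\", oauth_signature=\"" + URLEncoder.encode(oauth_signature, "UTF-8") +
  "\", oauth_signature_method=\"" + URLEncoder.encode(oauth_signature_method, "UTF-8") +
  "\", oauth_timestamp=\"" + URLEncoder.encode(oauth_timestamp, "UTF-8") +
  "\", oauth_version=\"" + URLEncoder.encode(oauth_version, "UTF-8") + "\""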

Error C#.NET 3.0: The best overloaded method match for 'CreatePageSection(ref string, ref string, ref object)' has some invalid arguments

I have the following problem. I encountered an error regarding default parameters, so I added a simple overload. Now the new code (line 3: CreatePageSection(sItemID, "", null);) gives the error mentioned in the title.
I looked for the answer in other topics, but I can't find the problem. Can someone help me?
The code is as follows:
public void CreatePageSection(ref string sItemID)
{
    CreatePageSection(sItemID, "", null);
}

public void CreatePageSection(ref string sItemID, ref string sFrameUrl, ref object vOnReadyState)
{
    if (Strings.InStr(msPresentPageSections, "|" + sItemID + "|", 0) > 0) {
        return;
    }
    msPresentPageSections = msPresentPageSections + sItemID + "|";
    string writeHtml = "<div class=" + MConstants.QUOTE + "PageSection" + MConstants.QUOTE + " id=" + MConstants.QUOTE + "Section" + sItemID + "Div" + MConstants.QUOTE + " style=" + MConstants.QUOTE + "display: none;" + MConstants.QUOTE + ">";
    this.WriteLine_Renamed(ref writeHtml);
    //UPGRADE_WARNING: Couldn't resolve default property of object vOnReadyState. Click for more: 'ms-help://MS.VSCC.v90/dv_commoner/local/redirect.htm?keyword="6A50421D-15FE-4896-8A1B-2EC21E9037B2"'
    //UPGRADE_NOTE: IsMissing() was changed to IsNothing_Renamed(). Click for more: 'ms-help://MS.VSCC.v90/dv_commoner/local/redirect.htm?keyword="8AE1CB93-37AB-439A-A4FF-BE3B6760BB23"'
    writeHtml = " <iframe id=" + MConstants.QUOTE + sItemID + "Frame" + MConstants.QUOTE + " name=" + MConstants.QUOTE + sItemID + "Frame" + MConstants.QUOTE + " frameborder=" + MConstants.QUOTE + "0" + MConstants.QUOTE + (!string.IsNullOrEmpty(sFrameUrl) ? " src=" + MConstants.QUOTE + sFrameUrl + MConstants.QUOTE : "") + ((vOnReadyState == null) ? "" : " onreadystatechange=" + MConstants.QUOTE + Convert.ToString(vOnReadyState) + MConstants.QUOTE) + ">";
    this.WriteLine_Renamed(ref writeHtml);
    writeHtml = " </iframe>";
    this.WriteLine_Renamed(ref writeHtml);
    writeHtml = "</div>";
    this.WriteLine_Renamed(ref writeHtml);
}
You must pass the parameters by reference:
public void CreatePageSection(ref string sItemID)
{
    var missingString = String.Empty;
    object missingObject = null;
    CreatePageSection(ref sItemID, ref missingString, ref missingObject);
}
Since you are not manipulating sFrameUrl and vOnReadyState, you could instead remove the ref keyword from those parameters.
See: http://msdn.microsoft.com/en-us/library/14akc2c7(v=vs.71).aspx
HTH,
Mario