How to use ConfluentRegistryAvroDeserializationSchema.forSpecific with a POJO generated from complex Avro schemas? - apache-kafka

I am using Avro schemas downloaded from Confluent Cloud. My data is fed to Confluent Cloud by Debezium connectors from a data source. My schema looks like this:
{"connect.name" : "sample.employee_preferences",
"fields": [
{
"default" : null,
"name": "_changed_at",
"type": [
"null",
{
"connect.name": "org.apache.kafka.connect.data.Timestamp",
"connect.version": 1,
"logicalType": "timestamp-millis",
"type": "long"
},
{
"default" : null,
"name": "_changed_at",
"type": [
"null",
{
"connect.name": "org.apache.kafka.connect.data.Timestamp",
"connect.version": 1,
"logicalType": "timestamp-millis",
"type": "long"
},
{
"default" : null,
"name": "_changed_at",
"type": [
"null",
{
"connect.name": "org.apache.kafka.connect.data.Timestamp",
"connect.version": 1,
"logicalType": "timestamp-millis",
"type": "long"
}
],
"name": "EmployeePreferences",
"namespace": "sample",
"type": "record"
}
I am converting this to a POJO using the avro-maven-plugin. Using Flink's KafkaSource I am reading the records and trying to process them with ConfluentRegistryAvroDeserializationSchema.forSpecific(). After some experimentation, I realized that the issue comes from the field with the logicalType property. There is also an open Flink bug showing that this method has issues with logical types.
I get the following error using the POJO generated from my schema:
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Expecting type to be a PojoTypeInfo
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:98)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:846)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:240)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1090)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1168)
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1168)
Caused by: java.lang.IllegalStateException: Expecting type to be a PojoTypeInfo
at org.apache.flink.formats.avro.typeutils.AvroTypeInfo.generateFieldsFromAvroSchema(AvroTypeInfo.java:72)
at org.apache.flink.formats.avro.typeutils.AvroTypeInfo.<init>(AvroTypeInfo.java:55)
at org.apache.flink.formats.avro.AvroDeserializationSchema.getProducedType(AvroDeserializationSchema.java:177)
at org.apache.flink.connector.kafka.source.reader.deserializer.KafkaValueOnlyDeserializationSchemaWrapper.getProducedType(KafkaValueOnlyDeserializationSchemaWrapper.java:56)
at org.apache.flink.connector.kafka.source.KafkaSource.getProducedType(KafkaSource.java:216)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getTypeInfo(StreamExecutionEnvironment.java:2634)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.fromSource(StreamExecutionEnvironment.java:2006)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.fromSource(StreamExecutionEnvironment.java:1977)
at org.flink.test.DataStreamJob.main(DataStreamJob.java:74)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
... 8 more
This is the code that produces this error:
public class DataStreamJob {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // read configs and set properties to connect to Confluent
        ......

        KafkaSource<EmployeePreferences> employee_preferences_source = KafkaSource.<EmployeePreferences>builder()
                .setBootstrapServers(BOOTSTRAP_SERVERS)
                .setTopics("<employee-preferences-topic>")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setGroupId("<employee-preferences-group>")
                .setProperty("security.protocol", "SASL_SSL")
                .setProperty("sasl.mechanism", "PLAIN")
                .setProperty("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username='" + CONFLUENT_API_KEY + "' password='" + CONFLUENT_SECRET + "';")
                .setProperty("client.dns.lookup", "use_all_dns_ips")
                .setValueOnlyDeserializer(ConfluentRegistryAvroDeserializationSchema.forSpecific(EmployeePreferences.class, SCHEMA_REGISTRY_URL, SCHEMA_REGISTRY_CONFIG))
                .build();

        DataStream<EmployeePreferences> employeePreferencesStream = env
                .fromSource(employee_preferences_source, WatermarkStrategy.noWatermarks(), "Kafka Source");
        employeePreferencesStream.print();

        KafkaSink<EmployeePreferences> sink = KafkaSink.<EmployeePreferences>builder()
                .setBootstrapServers(BOOTSTRAP_SERVERS)
                .setProperty("security.protocol", "SASL_SSL")
                .setProperty("sasl.mechanism", "PLAIN")
                .setProperty("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username='" + CONFLUENT_API_KEY + "' password='" + CONFLUENT_SECRET + "';")
                .setProperty("client.dns.lookup", "use_all_dns_ips")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("flink-poc-test")
                        .setValueSerializationSchema(ConfluentRegistryAvroSerializationSchema.forSpecific(EmployeePreferences.class, SUBJECT, SCHEMA_REGISTRY_URL, SCHEMA_REGISTRY_CONFIG))
                        .build())
                .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
                .build();

        employeePreferencesStream.sinkTo(sink);
        env.execute("Preferences Source and Sink");
    }
}
Things I have tried from previous stack overflow questions and answers:
Added "createSetters"="true" in pom.xml (It's true by default)
Have my EmployeePreferences POJO extend SpecificRecordBase, it already extends it
Tried to extend AvroDeserializationSchema with the line that addresses logical types in Kafka as suggested in the bug on JIRA
Here is the code I tried to implement for the extension of AvroDeserializationSchema.
public class CustomConfluentDeserializationSchema<T> extends AvroDeserializationSchema<T> {

    private static final long serialVersionUID = -6766681879020862312L;

    public CustomConfluentDeserializationSchema() {
    }

    /** Class to deserialize to. */
    private final Class<T> recordClazz;

    /** Schema in case of GenericRecord for serialization purpose. */
    private final String schemaString;

    /** Reader that deserializes byte array into a record. */
    private transient GenericDatumReader<T> datumReader;

    /** Input stream to read message from. */
    private transient MutableByteArrayInputStream inputStream;

    /** Avro decoder that decodes binary data. */
    private transient Decoder decoder;

    /** Avro schema for the reader. */
    private transient Schema reader;

    void checkAvroInitialized() {
        if (datumReader != null) {
            return;
        }
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        if (SpecificRecord.class.isAssignableFrom(recordClazz)) {
            @SuppressWarnings("unchecked")
            SpecificData specificData =
                    AvroFactory.getSpecificDataForClass(
                            (Class<? extends SpecificData>) recordClazz, cl);
            // this is the only line I added:
            specificData.addLogicalTypeConversion(new TimeConversions.TimestampConversion());
            this.datumReader = new SpecificDatumReader<>(specificData);
            this.reader = AvroFactory.extractAvroSpecificSchema(recordClazz, specificData);
        } else {
            this.reader = new Schema.Parser().parse(schemaString);
            GenericData genericData = new GenericData(cl);
            this.datumReader = new GenericDatumReader<>(null, this.reader, genericData);
        }
        this.inputStream = new MutableByteArrayInputStream();
        this.decoder = DecoderFactory.get().binaryDecoder(inputStream, null);
    }
}
I only added the line marked with the comment; the rest is copied directly from the parent class. The problem is that the parent class does not expose this method for overriding.
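For reference, a generic-record fallback that sidesteps the specific-record type extraction entirely would look roughly like this (a sketch: EMPLOYEE_PREFERENCES_SCHEMA_JSON is an assumed constant holding the schema string shown above, and the registry settings are the same ones used in the job):

// Sketch: read GenericRecord instead of the generated POJO, so Flink never
// analyzes the generated class as a POJO.
Schema schema = new Schema.Parser().parse(EMPLOYEE_PREFERENCES_SCHEMA_JSON);
KafkaSource<GenericRecord> source = KafkaSource.<GenericRecord>builder()
        .setBootstrapServers(BOOTSTRAP_SERVERS)
        .setTopics("<employee-preferences-topic>")
        .setStartingOffsets(OffsetsInitializer.earliest())
        .setValueOnlyDeserializer(
                ConfluentRegistryAvroDeserializationSchema.forGeneric(schema, SCHEMA_REGISTRY_URL, SCHEMA_REGISTRY_CONFIG))
        .build();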

Related

Symfony 5 Rest, Base64 encoded file to DTO with validation as File object

I have a PostController looking like this:
#[Route(name: 'add', methods: ['POST'])]
public function addPost(Request $request): JsonResponse
{
    /** @var PostRequestDto $postRequest */
    $postRequest = $this->serializer->deserialize(
        $request->getContent(),
        PostRequestDto::class,
        'json'
    );

    return $this->postService->addPost($postRequest);
}
PostService:
public function addPost(PostRequestDto $request): JsonResponse
{
    $errors = $this->validator->validate($request);
    if (count($errors) > 0) {
        throw new ValidationHttpException($errors);
    }
    // ...
    return new JsonResponse(null, Response::HTTP_CREATED);
}
And PostRequestDto:
class PostRequestDto
{
    #[Assert\NotBlank]
    #[Assert\Length(max: 250)]
    private ?string $title;

    #[Assert\NotBlank]
    #[Assert\Length(max: 1000)]
    private ?string $article;

    #[Assert\NotBlank]
    #[Assert\Image]
    private ?File $photo;

    #[Assert\NotBlank]
    #[Assert\GreaterThanOrEqual('now', message: 'post_request_dto.publish_date.greater_than_or_equal')]
    private ?DateTimeInterface $publishDate;
}
My Postman request looks like this:
{
    "title": "Test",
    "article": "lorem ipsum....",
    "photo": "base64...",
    "publishDate": "2021-10-15 08:00:00"
}
As you can see from the Postman request, I'm sending a base64-encoded file.
Now, in the controller I want to deserialize it into PostRequestDto so I can validate it as a File in the PostService. How can I achieve this?
I don't know exactly how your serializer ($this->serializer) is configured, but I think you have to adjust/add your normalizers with Symfony\Component\Serializer\Normalizer\DataUriNormalizer:
// somewhere in your controller/service where the serializer is configured/set
$normalizers = [
    // ... your other normalizers, if any
    new DataUriNormalizer(), // this one
];
$encoders = [new JsonEncoder()];
$this->serializer = new Serializer($normalizers, $encoders);
If you look inside DataUriNormalizer, you'll see it works with File, which is exactly what you have in your PostRequestDto.
The only thing to be aware of is the format of the base64 string.
If you follow the link to the denormalize() method, you will see it expects data:image/png;base64,...
So it has to start with data:..., and you probably have to change your Postman JSON payload to:
{
    "title": "Test",
    "article": "lorem ipsum....",
    "photo": "data:<your_base64_string>",
    "publishDate": "2021-10-15 08:00:00"
}
Since you work with images, I would also send the MIME type, like:
"photo": "data:image/png;base64,<your_base64_string>",

java.io.BufferedReader().map Cannot infer type argument(s) for <T> fromStream(Stream<? extends T>)

Scenario: a Spring WebFlux application triggering CommandLineRunner.run in order to load data into MongoDB for testing purposes.
Goal: when starting the microservice locally, it should read a JSON file and load the documents into MongoDB.
My understanding so far: bufferedReader.lines().filter(l -> !l.trim().isEmpty()) reads each JSON node and returns it as a stream, which I can then map to l and access the get methods. I guess I don't have to create a list and then stream it, since I have already loaded it as a stream via new InputStreamReader(getClass().getClassLoader().getResourceAsStream()), and I assume I can use lines() since each node will result in a string line. Am I on the right track, or am I mixing up some ideas?
This is a JSON sample file:
{
  "Extrato": {
    "description": "credit",
    "value": "R$1.000,00",
    "status": 11
  },
  "Extrato": {
    "description": "debit",
    "value": "R$2.000,00",
    "status": 99
  }
}
model
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class Extrato {

    @Id
    private String id;
    private String description;
    private String value;
    private Integer status;

    public Extrato(String id, String description, String value, Integer status) {
        super();
        this.id = id;
        this.description = description;
        this.value = value;
        this.status = status;
    }

    // ... getters and setters accordingly
Repository
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import com.noblockingcase.demo.model.Extrato;
import reactor.core.publisher.Flux;
import org.springframework.data.domain.Pageable;
public interface ExtratoRepository extends ReactiveCrudRepository<Extrato, String> {

    @Query("{ id: { $exists: true }}")
    Flux<Extrato> retrieveAllExtratosPaged(final Pageable page);
}
Command for loading from the above JSON file:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.function.LongSupplier;

import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import com.noblockingcase.demo.model.Extrato;
import com.noblockingcase.demo.repository.ExtratoRepository;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import reactor.core.publisher.Flux;

@Component
public class TestDataLoader implements CommandLineRunner {

    private static final Logger log = LoggerFactory.getLogger(TestDataLoader.class);

    private ExtratoRepository extratoRepository;

    TestDataLoader(final ExtratoRepository extratoRepository) {
        this.extratoRepository = extratoRepository;
    }

    @Override
    public void run(final String... args) throws Exception {
        if (extratoRepository.count().block() == 0L) {
            final LongSupplier longSupplier = new LongSupplier() {
                Long l = 0L;

                @Override
                public long getAsLong() {
                    return l++;
                }
            };
            BufferedReader bufferedReader = new BufferedReader(
                    new InputStreamReader(getClass().getClassLoader().getResourceAsStream("carga-teste.txt")));
            // *** THE ISSUE IS THE NEXT LINE
            Flux.fromStream(bufferedReader.lines().filter(l -> !l.trim().isEmpty())
                    .map(l -> extratoRepository.save(new Extrato(String.valueOf(longSupplier.getAsLong()),
                            l.getDescription(), l.getValue(), l.getStatus()))))
                    .subscribe(m -> log.info("Carga Teste: {}", m.block()));
        }
    }
}
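To make the compiler error concrete: lines() yields a Stream<String>, so the l inside map is a String, l.getDescription() does not resolve, and type inference for Flux.fromStream fails as a consequence. Parsing each line into an Extrato first would type-check; here is a sketch using Jackson, assuming one JSON document per line (which the sample file above is not):

ObjectMapper mapper = new ObjectMapper();
Flux.fromStream(bufferedReader.lines()
        .filter(l -> !l.trim().isEmpty())
        .map(l -> {
            try {
                // parse the raw line into a POJO before touching its getters;
                // Extrato would also need a no-args constructor (or Jackson
                // annotations) for this binding to work
                return mapper.readValue(l, Extrato.class);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }))
        .flatMap(extratoRepository::save)
        .subscribe(saved -> log.info("Carga Teste: {}", saved));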
Here is the MongoDB config, although I don't think it is relevant:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.mongodb.MongoClientOptions;
@Configuration
public class MongoDbSettings {

    @Bean
    public MongoClientOptions mongoOptions() {
        return MongoClientOptions.builder().socketTimeout(2000).build();
    }
}
If I take my original code and adjust it to read a text file, I can successfully read the text file instead of JSON. Obviously this doesn't fit my demand, since I want to read a JSON file, but it may clarify a bit more where I am blocked.
carga-teste.txt (available at https://github.com/jimisdrpc/webflux-worth-scenarious/blob/master/demo/src/main/resources/carga-teste.txt):
crédito de R$1.000,00
débito de R$100,00
Snippet working with the simple text file:
BufferedReader bufferedReader = new BufferedReader(
        new InputStreamReader(getClass().getClassLoader().getResourceAsStream("carga-teste.txt")));
Flux.fromStream(bufferedReader.lines().filter(l -> !l.trim().isEmpty())
        .map(l -> extratoRepository
                .save(new Extrato(String.valueOf(longSupplier.getAsLong()), "Qualquer descrição", l))))
        .subscribe(m -> log.info("Carga Teste: {}", m.block()));
Whole project working successfully reading from the text file: https://github.com/jimisdrpc/webflux-worth-scenarious/tree/master/demo
Docker Compose for booting MongoDB: https://github.com/jimisdrpc/webflux-worth-scenarious/blob/master/docker-compose.yml
To summarize, my issue is: I haven't figured out how to read a JSON file and insert the data into MongoDB during CommandLineRunner.run().
I found an example with Flux::using and Flux::fromStream to be helpful for this purpose. This will read your file into a Flux, which you can then subscribe to and process with .flatMap or similar. From the Javadoc:
using(Callable<? extends D> resourceSupplier, Function<? super D, ? extends Publisher<? extends T>> sourceSupplier, Consumer<? super D> resourceCleanup)
Uses a resource, generated by a supplier for each individual Subscriber, while streaming the values from a Publisher derived from the same resource and makes sure the resource is released if the sequence terminates or the Subscriber cancels.
and the code that I put together:
private static Flux<Account> fluxAccounts() {
    return Flux.using(() ->
            new BufferedReader(new InputStreamReader(new ClassPathResource("data/ExportCSV.csv").getInputStream()))
                    .lines()
                    .map(s -> {
                        String[] sa = s.split(" ");
                        return Account.builder()
                                .firstname(sa[0])
                                .lastname(sa[1])
                                .build();
                    }),
            Flux::fromStream,
            BaseStream::close
    );
}
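A sketch of how this could then be consumed, staying non-blocking (accountRepository is a hypothetical reactive repository for Account, mirroring the ExtratoRepository above):

// Hypothetical wiring: save each parsed Account reactively and log it,
// using flatMap instead of blocking inside subscribe.
fluxAccounts()
        .flatMap(account -> accountRepository.save(account))
        .subscribe(saved -> log.info("Loaded: {}", saved));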
Please note your JSON is invalid. Text data is not the same as JSON; JSON needs special handling, so it is always better to use a library.
carga-teste.json
[
{"description": "credit", "value": "R$1.000,00", "status": 11},
{"description": "debit","value": "R$2.000,00", "status": 99}
]
Credit goes to the article here: https://www.nurkiewicz.com/2017/09/streaming-large-json-file-with-jackson.html.
I've adapted it to use Flux.
@Override
public void run(final String... args) throws Exception {
    BufferedReader bufferedReader = new BufferedReader(
            new InputStreamReader(getClass().getClassLoader().getResourceAsStream("carga-teste.json")));
    ObjectMapper mapper = new ObjectMapper();
    Flux<Extrato> flux = Flux.generate(
            () -> parser(bufferedReader, mapper),
            this::pullOrComplete,
            jsonParser -> {
                try {
                    jsonParser.close();
                } catch (IOException e) {}
            });
    flux.map(l -> extratoRepository.save(l)).subscribe(m -> log.info("Carga Teste: {}", m.block()));
}

private JsonParser parser(Reader reader, ObjectMapper mapper) {
    JsonParser parser = null;
    try {
        parser = mapper.getFactory().createParser(reader);
        parser.nextToken();
    } catch (IOException e) {}
    return parser;
}

private JsonParser pullOrComplete(JsonParser parser, SynchronousSink<Extrato> emitter) {
    try {
        if (parser.nextToken() != JsonToken.END_ARRAY) {
            Extrato extrato = parser.readValueAs(Extrato.class);
            emitter.next(extrato);
        } else {
            emitter.complete();
        }
    } catch (IOException e) {
        emitter.error(e);
    }
    return parser;
}

Unable to get an Observable List using Gluon Connect - Multiple Exceptions

I'm trying to populate multiple tables with JSON streaming data: news, Twitter, and forex data.
I am trying to populate a list or table with REST data using Gluon Connect.
I've followed the documentation to the letter but I am not getting anywhere. From what I can see, the REST client connects successfully but is unable to return an ObservableList.
I've followed the sample code from the documentation; I've included a piece of it below.
// create a RestClient to the specific URL
RestClient restClient = RestClient.create()
        .method("GET")
        .host("https://api.stackexchange.com")
        .path("/2.2/errors");

// create a custom converter
InputStreamIterableInputConverter<Error> converter = new ItemsIterableInputConverter<>(Error.class);

// retrieve a list from the DataProvider using the custom converter
GluonObservableList<Error> errors = DataProvider.retrieveList(restClient.createListDataReader(converter));
ItemsIterableInputConverter is not a valid class; I've tried IterableInputConverter and JsonInputIterableConverter, none of which worked.
The JSON output looks like this:
{
  "status": "ok",
  "totalResults": 70,
  "articles": [
    {
      "source": {
        "id": null,
        "name": "Marketwatch.com"
      },
      "author": "Greg Robb",
      "title": "Fed officials shied away ...",
      "description": "Federal Reserve officials ...",
      "url": "https://www.marketwatch.com/",
      "urlToImage": "http://s.20190821101018.jpg",
      "publishedAt": "2019-08-21T18:24:00Z",
      "content": "President Trumps criticism of ... [+2840 chars]"
    },
My POJO:
Article.java
package app;

import lombok.Getter;
import lombok.Setter;
import org.apache.commons.lang3.builder.ToStringBuilder;

@Getter
@Setter
public class Article {

    private Source source;
    private String author;
    private String title;
    private String description;
    private String url;
    private String urlToImage;
    private String publishedAt;
    private String content;

    /**
     * No args constructor for use in serialization
     */
    public Article() {
    }

    public Article(Source source, String author, String title, String description, String url, String urlToImage, String publishedAt, String content) {
        super();
        this.source = source;
        this.author = author;
        this.title = title;
        this.description = description;
        this.url = url;
        this.urlToImage = urlToImage;
        this.publishedAt = publishedAt;
        this.content = content;
    }
JavaFX - NewsController
@FXML
private ListView<Article> newsList;

// create a RestClient to the specific URL
RestClient restClient = RestClient.create()
        .method("GET")
        .host("https://newsapi.org")
        .path("//v2/everything?q=GBP_USD&from=2019-08-21&to=2019-08-21&sortBy=popularity&apiKey=API_KEY");

InputStreamIterableInputConverter<Article> converter = new JsonIterableInputConverter<>(Article.class);
GluonObservableList<Article> articles = DataProvider.retrieveList(restClient.createListDataReader(converter));

articles.initializedProperty().addListener((obv, ov, nv) -> {
    newsList.setItems(articles);
});

// Here I tried to use a TableView instead:
// newsTbl.setItems(articles);
//
// newsDateCol.setCellValueFactory(new PropertyValueFactory<>("newsDateCol"));
// newsPublisherCol.setCellValueFactory(new PropertyValueFactory<>("newsPublisherCol"));
// newsHeadlineCol.setCellValueFactory(new PropertyValueFactory<>("newsHeadlineCol"));
// newsLinkCol.setCellValueFactory(new PropertyValueFactory<>("newsLinkCol"));
This is my output
Task :App.main()
Connection Successful
Exception in thread "DataProviderThread-0" java.lang.ExceptionInInitializerError
at com.gluonhq.connect.converter.JsonIterableInputConverter.iterator(JsonIterableInputConverter.java:136)
at com.gluonhq.connect.provider.RestListDataReader.iterator(RestListDataReader.java:80)
at com.gluonhq.connect.provider.DataProvider.lambda$retrieveList$23(DataProvider.java:206)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: javax.json.JsonException: Provider org.glassfish.json.JsonProviderImpl not found
at javax.json.spi.JsonProvider.provider(JsonProvider.java:75)
at javax.json.Json.createReaderFactory(Json.java:215)
at com.gluonhq.impl.connect.converter.JsonUtil.<clinit>(JsonUtil.java:48)
... 6 more
Caused by: java.lang.ClassNotFoundException: org.glassfish.json.JsonProviderImpl
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:332)
at javax.json.spi.JsonProvider.provider(JsonProvider.java:72)
... 8 more
Task :App.main() FAILED
Update: my build.gradle file.
plugins {
    id 'java'
    id 'application'
    id 'org.openjfx.javafxplugin' version '0.0.8'
}

version '1.0-SNAPSHOT'
sourceCompatibility = 12

repositories {
    mavenCentral()
}

javafx {
    version = "12.0.2"
    modules = ["javafx.controls", "javafx.fxml"]
}

dependencies {
    testCompile group: 'junit', name: 'junit', version: '4.12'
    compile group: 'com.jfoenix', name: 'jfoenix', version: '9.0.2'
    compile group: 'mysql', name: 'mysql-connector-java', version: '8.0.17'
    compile group: 'org.twitter4j', name: 'twitter4j-core', version: '4.0.7'
    compileOnly 'org.projectlombok:lombok:1.18.8'
    annotationProcessor 'org.projectlombok:lombok:1.18.8'
    compile group: 'com.gluonhq', name: 'connect', version: '2.0.1'
    compile group: 'org.apache.commons', name: 'commons-lang3', version: '3.0'
    compile 'org.apache.httpcomponents:httpclient:4.5.9'
    compile group: 'com.fasterxml', name: 'jackson-module-json-org', version: '0.9.1'
}

task Customrun(type: JavaExec, dependsOn: classes) {
    mainClassName = 'app.App'
    classpath = sourceSets.main.runtimeClasspath
}
I've taken Jose's advice and imported 'org.glassfish', name: 'javax.json', version: '1.0.4', which cleared up the error.
The table is still not populating, so I did a System.out.println on the ObservableList articles and only received empty brackets []. The default "no content in table" placeholder has disappeared, so something did work; I just receive an empty list, presumably because JsonIterableInputConverter expects the response body itself to be a JSON array, while this API wraps the array in an object under "articles".
Update: I created a custom converter class, ArticlesIterableInputConverter, and modified the iterator() method:
@Override
public Iterator<E> iterator() {
    index = 0;
    try (JsonReader reader = Json.createReader(getInputStream())) {
        JsonObject jsonObject = reader.readObject();
        jsonArray = jsonObject.getJsonArray("articles");
    }
    return this;
}
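For anyone copying this approach: the rest of the copied class keeps Gluon's Iterator plumbing. Roughly, and this is a sketch based on what JsonIterableInputConverter does internally (targetClass is the Class<E> passed to the constructor, and JsonConverter is Gluon's JSON-to-object converter; field names are assumptions):

// Sketch of the Iterator methods the copied converter class keeps
// (field names assumed from JsonIterableInputConverter):
@Override
public boolean hasNext() {
    return jsonArray != null && index < jsonArray.size();
}

@Override
public E next() {
    JsonObject jsonObject = jsonArray.getJsonObject(index++);
    if (converter == null) {
        converter = new JsonConverter<>(targetClass);
    }
    return converter.readFromJson(jsonObject);
}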
REST client call and GluonObservableList creation:
// create a RestClient to the specific URL
RestClient restClient = RestClient.create()
        .method("GET")
        .host("https://newsapi.org")
        .path("//v2/everything?q=GBP_USD&from=2019-08-21&to=2019-08-21&sortBy=popularity&apiKey=9a7a9daab76f440fb796350c83db0694");

InputStreamIterableInputConverter<Article> converter = new ArticlesIterableInputConverter<>(Article.class);
GluonObservableList<Article> articles = DataProvider.retrieveList(restClient.createListDataReader(converter));

articles.initializedProperty().addListener((obv, ov, nv) -> {
    newsList.setItems(articles);
});
(Screenshot of the list output omitted.) I still need to figure out how to format the output into the fields I actually need.
Added a toString method to the Article class:
@Override
public String toString() {
    return new ToStringBuilder(this).append("publishedAt", publishedAt).append("name", source.name).append("title", title).append("url", url).toString();
}
New output: (screenshot of the output after adding the toString method omitted).

Can I use repository populator bean with fongo?

I'm using Fongo not only for unit tests but also for integration tests, so I would like to initialize Fongo with some collections. Is that possible?
This is my Java config (based on Oliver G.'s answer):
@EnableAutoConfiguration(exclude = {
        EmbeddedMongoAutoConfiguration.class,
        MongoAutoConfiguration.class,
        MongoDataAutoConfiguration.class
})
@Configuration
@ComponentScan(basePackages = { "com.foo" },
        excludeFilters = { @ComponentScan.Filter(classes = { SpringBootApplication.class }) })
public class ConfigServerWithFongoConfiguration extends AbstractFongoBaseConfiguration {

    private static final Logger log = LoggerFactory.getLogger(ConfigServerWithFongoConfiguration.class);

    @Autowired
    ResourcePatternResolver resourceResolver;

    @Bean
    public Jackson2RepositoryPopulatorFactoryBean repositoryPopulator() {
        Jackson2RepositoryPopulatorFactoryBean factory = new Jackson2RepositoryPopulatorFactoryBean();
        try {
            factory.setResources(resourceResolver.getResources("classpath:static/collections/*.json"));
        } catch (IOException e) {
            log.error("Could not load data", e);
        }
        return factory;
    }
}
When I run my IT tests, the log shows Reading resource: file *.json, but the tests fail because they retrieve nothing (null) from the Fongo database.
Tests are annotated with:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = {ConfigServerWithFongoConfiguration.class})
@AutoConfigureMockMvc
@TestPropertySource(properties = {"spring.data.mongodb.database=fake"})
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_CLASS)
Lol, I feel so stupid right now. It was a format issue. JSON collections must be formatted like this:
[
{/*doc1*/},
{/*doc2*/},
{/*doc3*/}
]
I was missing the [] and the commas separating the documents.

no suitable HttpMessageConverter found for response type [com.enimbus.book.Post] and content type [text/html;charset=UTF-8]

I am developing an Android app using a RESTful service. I call a GET request on a URL and it returns content type application/json;charset=UTF-8. I want to show the returned JSON data in my Android view. To do that I use the code below in my Android MainActivity:
private class HttpRequestTask extends AsyncTask<Void, Void, Post> {

    @Override
    protected Post doInBackground(Void... params) {
        try {
            final String url = "http://192.168.0.100:8080/rposts/view/46";
            RestTemplate restTemplate = new RestTemplate();
            restTemplate.getMessageConverters().add(new MappingJackson2HttpMessageConverter());
            Post post = restTemplate.getForObject(url, Post.class);
            return post;
        } catch (Exception e) {
            Log.e("MainActivity", e.getMessage(), e);
        }
        return null;
    }

    @Override
    protected void onPostExecute(Post post) {
        TextView PostIdText = (TextView) findViewById(R.id.post_title);
        TextView PostContentText = (TextView) findViewById(R.id.post_body);
        PostIdText.setText(post.getTitle());
        PostContentText.setText(post.getBody());
    }
}
When I run my app it gives an error:
Could not extract response: no suitable HttpMessageConverter found for response type [com.enimbus.book.Post] and content type [text/html;charset=UTF-8]
org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [com.enimbus.book.Post] and content type [text/html;charset=UTF-8]
at org.springframework.web.client.HttpMessageConverterExtractor.extractData(HttpMessageConverterExtractor.java:79)
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:484)
at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:439)
at org.springframework.web.client.RestTemplate.getForObject(RestTemplate.java:237)
at com.enimbus.book.MainActivity$HttpRequestTask.doInBackground(MainActivity.java:123)
at com.enimbus.book.MainActivity$HttpRequestTask.doInBackground(MainActivity.java:116)
at android.os.AsyncTask$2.call(AsyncTask.java:288)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:841)
11-02 14:34:39.194 27146-27146/com.enimbus.book D/AndroidRuntime: Shutting down VM
11-02 14:34:39.194 27146-27146/com.enimbus.book W/dalvikvm: threadid=1: thread exiting with uncaught exception (group=0x41e90da0)
11-02 14:34:39.204 27146-27146/com.enimbus.book E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.enimbus.book, PID: 27146
java.lang.NullPointerException
at com.enimbus.book.MainActivity$HttpRequestTask.onPostExecute(MainActivity.java:137)
at com.enimbus.book.MainActivity$HttpRequestTask.onPostExecute(MainActivity.java:116)
at android.os.AsyncTask.finish(AsyncTask.java:632)
at android.os.AsyncTask.access$600(AsyncTask.java:177)
at android.os.AsyncTask$InternalHandler.handleMessage(AsyncTask.java:645)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:146)
at android.app.ActivityThread.main(ActivityThread.java:5653)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1291)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1107)
at dalvik.system.NativeStart.main(Native Method)
I am using the dependencies below in my app's Gradle file:
dependencies {
    compile fileTree(include: ['*.jar'], dir: 'libs')
    testCompile 'junit:junit:4.12'
    compile 'com.android.support:appcompat-v7:23.4.0'
    compile 'com.android.support:design:23.4.0'
    compile 'org.springframework.android:spring-android-rest-template:1.0.1.RELEASE'
    compile 'com.fasterxml.jackson.core:jackson-databind:2.3.2'
}
Returned JSON data when I test with Postman:
{
  "id": 46,
  "title": "hellov",
  "slug": "tharu",
  "postedOn": "08/12/2016 3:04:58 PM",
  "keywords": [
    "i"
  ],
  "tags": [
    "love"
  ],
  "active": true,
  "author": {
    "id": 20,
    "firstName": "Tharindu",
    "lastName": "Gihan",
    "email": "gihan@gmail.com"
  },
  "teaser": "<p>to</p>",
  "body": "<p>you</p>"
}
Server-side Spring Boot REST controller:
package com.gihangreen.controller.rest;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import com.gihangreen.domain.Post;
import com.gihangreen.service.CommentService;
import com.gihangreen.service.PostService;

@RestController
@RequestMapping("/rposts")
public class PostRestController {

    private PostService postService;
    private CommentService commentService;

    @Autowired
    public PostRestController(PostService postService, CommentService commentService) {
        super();
        this.postService = postService;
        this.commentService = commentService;
    }

    // get all posts
    @RequestMapping(value = "/list", method = RequestMethod.GET)
    public Iterable<Post> list() {
        return postService.list();
    }

    // get post content by id
    @RequestMapping(value = "/view/{id}", method = RequestMethod.GET)
    public Post read(@PathVariable(value = "id") long id) {
        return postService.get(id);
    }

    // get posts by author id
    @RequestMapping(value = "/byAuthor/{id}", method = RequestMethod.GET)
    public Iterable<Post> byAuthor(@PathVariable(value = "id") long id) {
        return postService.listByAuthor(id);
    }

    // search posts by string
    @RequestMapping(value = "/search", method = RequestMethod.GET)
    public Iterable<Post> search(@RequestParam("search") String search) {
        return postService.searching(search);
    }
}
How can I fix this issue?
It might be better to first see what data is actually being sent back:
curl http://192.168.0.100:8080/rposts/view/46 -v
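If the body really is JSON but the server labels it text/html, one workaround is to let the Jackson converter accept text/html as well (a sketch; the cleaner fix is to find out why the server responds with text/html, e.g. an error page being returned instead of the REST endpoint's payload):

// Sketch: allow MappingJackson2HttpMessageConverter to parse responses
// whose Content-Type is text/html in addition to application/json.
RestTemplate restTemplate = new RestTemplate();
MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
converter.setSupportedMediaTypes(Arrays.asList(MediaType.APPLICATION_JSON, MediaType.TEXT_HTML));
restTemplate.getMessageConverters().add(converter);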