Spring Cloud Contract for AMQP - autowiring issues with RabbitTemplate - spring-cloud

I am trying to run a contract test for Spring AMQP using Spring Cloud Contract. However, I am running into an issue with autowiring of the RabbitTemplate. In my base test class below, the autowired RabbitTemplate expects a ConnectionFactory with valid connection details (host and port of the RabbitMQ brokers). Since contract tests are not expected to actually connect to the message broker, no host and port are supplied to the connection factory in the test environment.
I get the error:
org.springframework.amqp.AmqpIOException: java.net.UnknownHostException: ${queue.hosts}: nodename nor servname provided, or not known
at org.springframework.amqp.rabbit.support.RabbitExceptionTranslator.convertRabbitAccessException(RabbitExceptionTranslator.java:71)
at org.springframework.amqp.rabbit.connection.AbstractConnectionFactory.createBareConnection(AbstractConnectionFactory.java:476)
at org.springframework.amqp.rabbit.connection.CachingConnectionFactory.createConnection(CachingConnectionFactory.java:614)
at org.springframework.amqp.rabbit.connection.ConnectionFactoryUtils.createConnection(ConnectionFactoryUtils.java:240)
at org.springframework.amqp.rabbit.core.RabbitTemplate.doExecute(RabbitTemplate.java:1810)
at org.springframework.amqp.rabbit.core.RabbitTemplate.execute(RabbitTemplate.java:1784)
at org.springframework.amqp.rabbit.core.RabbitTemplate.send(RabbitTemplate.java:864)
Is the RabbitTemplate supposed to be mocked? I tried that, but even that isn't working.
I also tried passing a mock ConnectionFactory to the actual RabbitTemplate, but it still tries to get a real connection from the mock factory.
How do you get around the problem that it tries to make an actual connection?
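For context, a bare mock-ConnectionFactory setup of the kind described above might look like the following. This is a hypothetical sketch using Mockito; the class and bean names are mine, not from the original project.
@Configuration
public class StubQueueConfiguration {

    @Bean
    public RabbitTemplate rabbitTemplate() {
        // Mockito mock of Spring AMQP's ConnectionFactory, so no broker host/port is configured;
        // the returned bean is a real RabbitTemplate wired to the mocked factory.
        org.springframework.amqp.rabbit.connection.ConnectionFactory connectionFactory =
                org.mockito.Mockito.mock(org.springframework.amqp.rabbit.connection.ConnectionFactory.class);
        return new RabbitTemplate(connectionFactory);
    }
}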
Base Test Class
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = QueueConfiguration.class)
@AutoConfigureMessageVerifier
public class CreateNotificationBase {

    private ChangeNotificationClient client;
    private String message;

    @Autowired
    private RabbitTemplate rabbitTemplate;

    @Before
    public void setUp() {
        client = new ChangeNotificationClient();
        message = client.buildChangeNotificationMessage();
    }

    protected void onUserCreation() {
        rabbitTemplate.send("change_notification_exchange",
                RoutingKey.CHANGE_NOTIFICATION_KEY.getName(),
                org.springframework.amqp.core.MessageBuilder.withBody(message.getBytes()).build());
    }
}
Contract Definition
label: user_create
input:
    triggeredBy: onUserCreation()
outputMessage:
    sentTo: change_notification_exchange
    body: ''' {"data":["....."jsonapi":{"version":"1.0"}} '''
Auto-generated Test
public class CreateTest extends CreateNotificationBase {

    @Inject ContractVerifierMessaging contractVerifierMessaging;
    @Inject ContractVerifierObjectMapper contractVerifierObjectMapper;

    @Test
    public void validate_create() throws Exception {
        // when:
        onUserCreation();

        // then:
        ContractVerifierMessage response = contractVerifierMessaging.receive("change_notification_exchange");
        assertThat(response).isNotNull();

        // and:
        Object responseBody = (contractVerifierObjectMapper.writeValueAsString(response.getPayload()));
        // assertions
    }
}

Related

JpaRepository making the connection to the local machine database instead of the docker container database

I'm trying to use a docker container to run the tests of a Spring Boot microservices application. The problem is that the application makes the connection to my machine instead of the docker container that holds the database.
This is my first time doing this, so I'm guessing it is a configuration problem, but every time I look for instructions, it seems that the annotations @Testcontainers and @Container are enough to make sure Spring Boot uses the container.
Here's my code:
@SpringBootTest
@Testcontainers
@AutoConfigureMockMvc
class ProductServiceApplicationTests {

    @Autowired
    private MockMvc mockMvc;
    @Autowired
    private ObjectMapper objectMapper;
    @Autowired
    private ProductRepository productRepository;

    @Container
    static PostgreSQLContainer<?> postgreSQLContainer = new PostgreSQLContainer<>("postgres:12.13")
            .withDatabaseName("postgreSQLContainer")
            .withUsername("test")
            .withPassword("test");

    static void setProperties(DynamicPropertyRegistry dynamicPropertyRegistry) {
        dynamicPropertyRegistry.add("spring.datasource.url", postgreSQLContainer::getJdbcUrl);
        dynamicPropertyRegistry.add("spring.datasource.username", postgreSQLContainer::getUsername);
        dynamicPropertyRegistry.add("spring.datasource.password", postgreSQLContainer::getPassword);
    }

    @Test
    void shouldCreateProduct() throws Exception {
        ProductRequest productRequest = getProductRequest();
        String productRequestString = objectMapper.writeValueAsString(productRequest);
        mockMvc.perform(MockMvcRequestBuilders.post("/api/product")
                .contentType(MediaType.APPLICATION_JSON)
                .content(productRequestString)
        ).andExpect(status().isCreated()); //THIS IS OK
        Assertions.assertTrue(productRepository.findAll().size() == 1); //THIS FAILS
    }

    private ProductRequest getProductRequest() {
        return ProductRequest.builder()
                .name("some-product")
                .description("some-description")
                .price(BigDecimal.valueOf(10))
                .build();
    }
}
Debugging this issue, I saw that the repository was hitting the PostgreSQL database that I had locally.
And if I shut down my local PostgreSQL service, this is what happens:
org.postgresql.util.PSQLException: Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
You are missing the @DynamicPropertySource annotation on your setProperties method. Without this annotation, Spring doesn't know that you are providing properties.
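For example, the method from the question with the annotation added (only the annotation is new; the three registered properties are unchanged):
@DynamicPropertySource
static void setProperties(DynamicPropertyRegistry dynamicPropertyRegistry) {
    // With @DynamicPropertySource, Spring picks up the datasource properties
    // from the running Testcontainers instance before the context starts.
    dynamicPropertyRegistry.add("spring.datasource.url", postgreSQLContainer::getJdbcUrl);
    dynamicPropertyRegistry.add("spring.datasource.username", postgreSQLContainer::getUsername);
    dynamicPropertyRegistry.add("spring.datasource.password", postgreSQLContainer::getPassword);
}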

Feign client manually. Load balancer does not have available server for client

I have two services registered with Eureka. Service C calls service A; service C is a Feign client. I want to implement the Feign client manually, but I catch an exception:
com.netflix.client.ClientException: Load balancer does not have
available server for client: service-test-a
Application class:
@EnableEurekaClient
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
Feign interface:
@Component
public interface FeignService {
    @RequestLine("GET /")
    public String getServiceA();
}
Feign config:
@Configuration
@Import(FeignClientsConfiguration.class)
public class MyConfig {
}
Controller:
@RestController
public class Controller {

    private FeignService feignService;

    @Autowired
    public void Controller() {
        feignService = Feign.builder()
                .client(RibbonClient.create())
                .target(FeignService.class, "http://service-test-a");
    }

    @RequestMapping(value = "/build", method = RequestMethod.GET)
    public String getServiceC() {
        return feignService.getServiceA();
    }
}
What am I doing wrong?
AFAIK, there is no easy way of using plain OpenFeign with Eureka; there is no guide or example for that, and I guess it would require some additional implementation and configuration.
Instead, please try to use Spring Cloud Feign. It provides full integration with Eureka and Ribbon without any additional implementation, and you can switch to it with just a few changes to the code above.
Please refer to the Spring Cloud Feign documentation.
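A minimal sketch of what the Spring Cloud Feign version might look like (the service name, path and method names come from the question; the annotations are the standard Spring Cloud Feign ones):
@EnableEurekaClient
@EnableFeignClients
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

// Declarative client; Spring Cloud builds it and resolves "service-test-a" via Eureka/Ribbon.
@FeignClient("service-test-a")
public interface FeignService {
    @RequestMapping(method = RequestMethod.GET, value = "/")
    String getServiceA();
}

@RestController
public class Controller {

    @Autowired
    private FeignService feignService;

    @RequestMapping(value = "/build", method = RequestMethod.GET)
    public String getServiceC() {
        return feignService.getServiceA();
    }
}
No manual Feign.builder() or RibbonClient wiring is needed in this variant.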

How to properly create Spring Cloud Task with custom parameters?

According to the samples here (specifically, the timestamp task), I have implemented a small task class:
@SpringBootApplication
@EnableTask
@EnableConfigurationProperties({ RestProcessorTaskProperties.class })
public class RestProcessorTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(RestProcessorTaskApplication.class, args);
    }

    @Autowired
    private RestProcessorTaskProperties config;

    // some fields and beans

    @Bean
    public CommandLineRunner run(RestTemplate restTemplate) {
        return args -> {
            // doing some stuff
        };
    }
}
and then I've created a Properties class (in the same package):
@ConfigurationProperties("RestProcessor")
public class RestProcessorTaskProperties {

    private String host = "http://myhost:port";

    public String getHost() {
        return host;
    }

    public void setHost(String host) {
        this.host = host;
    }
}
But after I've registered the task on my local Spring Cloud Data Flow server, I see numerous parameters that, I suppose, were added automatically. By those I mean parameters like:
abandon-when-percentage-full java.lang.Integer
abandoned-usage-tracking java.lang.Boolean
acceptors java.lang.Integer
access-to-underlying-connection-allowed java.lang.Boolean
and others...
Is it possible somehow to hide (or remove) them, so that when launching the task I could configure only those parameters that were added by me (the single host property in my example above)?
By default, Spring Cloud Data Flow will show you all the available properties for a Boot application. However, you can create a whitelist of properties that you wish to show.
Here is a link to the Spring Cloud Data Flow reference doc that discusses how to do this: http://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#spring-cloud-dataflow-stream-app-whitelisting.
And here is a link to the timestamp starter app that has an example of this: https://github.com/spring-cloud/spring-cloud-task-app-starters/tree/master/spring-cloud-starter-task-timestamp
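For illustration, the whitelisting described in those links comes down to a small properties file on the task app's classpath, roughly like the sketch below. The file name and key are as I recall them from the linked docs and may differ between versions, and the package of the properties class is assumed, since the question doesn't show it:
# src/main/resources/META-INF/spring-configuration-metadata-whitelist.properties
# Only properties declared by the listed @ConfigurationProperties classes are
# shown by Spring Cloud Data Flow for this task app.
configuration-properties.classes=com.example.RestProcessorTaskProperties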

Inject EntityManager in SwitchYard Junit implementation

I am trying to implement a JUnit test in a SwitchYard application.
I am using JPA without Camel. I have a persistence.xml with the following details, and I am using the resource producer pattern to expose the EntityManager.
But when I test a service, the EntityManager is null in the DAO layer.
Is there any way I can mock or inject the EntityManager in a SwitchYard JUnit test?
@RunWith(SwitchYardRunner.class)
@SwitchYardTestCaseConfig(config = SwitchYardTestCaseConfig.SWITCHYARD_XML, mixins = {
        CDIMixIn.class, HTTPMixIn.class, NamingMixIn.class })
public class SalesModuleServiceTest {

    private SwitchYardTestKit testKit;
    private CDIMixIn cdiMixIn;
    private HTTPMixIn httpMixIn;
    private static NamingMixIn namingMixIn;
    private TransformerRegistry transformerRegistry;

    @ServiceOperation("SalesModuleService")
    private Invoker service;

    //------ JUnit test with REST binding fails if no resteasy properties defined ------
    @BeforeDeploy
    public void setProperties() {
        System.setProperty("org.switchyard.component.resteasy.standalone.port", "8081");
        System.setProperty("org.switchyard.component.resteasy.standalone.path", "");
    }

    @Test
    public void testUpdateCustomerStatus() throws Exception {
        SalesDetailsRequest message = null;
        BudgetResponse<?> result = service.operation("updateCustomerStatus")
                .sendInOut(message).getContent(SalesResponse.class);
        // validate the results
        Assert.assertTrue("Implement me", false);
    }
}

Morphia, Embed Mongo and Spring. Address already in use

I am trying to use MongoDB, Morphia and Spring and to test it, so I started using Embedded Mongo.
When I had only one DAO to persist, I did not have any problem with my tests; however, in some cases I need to use more than one DAO, and in those cases my injected Datastore gives me a problem: addr already in use.
My Spring Test Database Configuration is this:
@Configuration
public class DatabaseMockConfig {

    private static final int PORT = 12345;

    private MongodConfigBuilder configBuilder;
    private MongodExecutable mongodExecutable;
    private MongodProcess mongodProcess;

    @Bean
    @Scope("prototype")
    public MongodExecutable getMongodExecutable() {
        return this.mongodExecutable;
    }

    @Bean
    @Scope("prototype")
    public MongodProcess mongodProcess() {
        return this.mongodProcess;
    }

    @Bean
    public IMongodConfig getMongodConfig() throws UnknownHostException, IOException {
        if (this.configBuilder == null) {
            configBuilder = new MongodConfigBuilder().version(Version.Main.PRODUCTION).net(new Net(PORT, Network.localhostIsIPv6()));
        }
        return this.configBuilder.build();
    }

    @Autowired
    @Bean
    @Scope("prototype")
    public Datastore datastore(IMongodConfig mongodConfig) throws IOException {
        MongodStarter starter = MongodStarter.getDefaultInstance();
        this.mongodExecutable = starter.prepare(mongodConfig);
        this.mongodProcess = mongodExecutable.start();
        MongoClient mongoClient = new MongoClient("localhost", PORT);
        return new Morphia().createDatastore(mongoClient, "morphia");
    }

    @Autowired
    @Bean
    @Scope("prototype")
    public EventDAO eventDAO(final Datastore datastore) {
        return new EventDAO(datastore);
    }

    @Autowired
    @Bean
    @Scope("prototype")
    public EditionDAO editionDAO(final Datastore datastore) {
        return new EditionDAO(datastore);
    }
}
And my DAO classes are similar to this:
@Repository
public class EventDAO {

    private final BasicDAO<Event, ObjectId> basicDAO;

    @Autowired
    public EventDAO(final Datastore datastore) {
        this.basicDAO = new BasicDAO<>(Event.class, datastore);
    }
    ...
}
My test class is similar to this:
@ContextConfiguration(classes = AppMockConfig.class)
@RunWith(SpringJUnit4ClassRunner.class)
public class EventDAOTest {

    @Autowired
    private EventDAO eventDAO;
    @Autowired
    private MongodExecutable mongodExecutable;
    @Autowired
    private MongodProcess mongodProcess;

    @Rule
    public ExpectedException expectedEx = ExpectedException.none();

    @After
    public void tearDown() {
        this.mongodProcess.stop();
        this.mongodExecutable.stop();
    }
    ...
}
I use prototype scope to get around the singleton problem and to make sure that my mock database is clean when I start my test; afterwards I stop the mongod process and the mongod executable.
However, since I need to use more than one DAO, I receive this error:
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'editionDAO' defined in class br.com.mymusicapp.spring.DatabaseMockConfig: Unsatisfied dependency expressed through constructor argument with index 0 of type [org.mongodb.morphia.Datastore]: :
Error creating bean with name 'datastore' defined in class br.com.mymusicapp.spring.DatabaseMockConfig: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.mongodb.morphia.Datastore]:
Factory method 'datastore' threw exception; nested exception is java.io.IOException: Could not start process: ERROR: listen(): bind() failed errno:98 Address already in use for socket: 0.0.0.0:12345
2015-01-04T01:05:04.128-0200 [initandlisten] ERROR: addr already in use
I know what the error means; I just do not know how I can design my configuration to solve it. As a last option I am considering installing a localhost MongoDB just for tests, however I think there could be a better solution.
That is based on the embedded mongod by flapdoodle, right?
If you want to run multiple tests in parallel (could be changed via JUnit annotations, but it's probably faster in parallel), you cannot use a single, hardcoded port. Instead, let the embedded process select an available port automatically.
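A minimal sketch of what that could look like in the configuration above, assuming flapdoodle's Network helper (everything else is taken from the question's getMongodConfig method):
@Bean
public IMongodConfig getMongodConfig() throws IOException {
    // Ask the OS for a free port instead of hardcoding 12345.
    int freePort = Network.getFreeServerPort();
    return new MongodConfigBuilder()
            .version(Version.Main.PRODUCTION)
            .net(new Net(freePort, Network.localhostIsIPv6()))
            .build();
}
The MongoClient created in the datastore bean would then need to read the chosen port back from the injected IMongodConfig (for example via its net() accessor) rather than reusing the hardcoded PORT constant.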