@Tailable (spring-data-reactive-mongodb) equivalent in spring-data-r2dbc - PostgreSQL

I am trying my hand at spring-data-r2dbc, on PostgreSQL. I have tried spring-data-mongodb-reactive before, and I couldn't help but compare the two.
I see that query derivation is not yet supported, but I was wondering if there is an equivalent of @Tailable, so that I would be notified of database changes in real time. Can anyone share any code samples for this?
I understand that the underlying database has to support this. I believe PostgreSQL supports this kind of thing using logical decoding (correct me if I am wrong here).
Is there a @Tailable equivalent in spring-data-r2dbc?
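For reference, this is roughly how I use @Tailable on the MongoDB side today (a minimal sketch; the Event entity and its type field are just examples, and the collection has to be capped):
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;
import org.springframework.data.mongodb.repository.Tailable;
import reactor.core.publisher.Flux;

// Reactive repository backed by a capped collection; the tailable cursor stays open
// and keeps emitting newly inserted documents.
public interface EventRepository extends ReactiveMongoRepository<Event, String> {

    @Tailable
    Flux<Event> findByType(String type);
}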

I was stuck on the same issue. Not sure if you found a solution or not, but I was able to accomplish something similar by doing the following. First, I added a trigger to my table:
CREATE TRIGGER trigger_name
AFTER INSERT OR DELETE OR UPDATE
ON table_name
FOR EACH ROW
EXECUTE PROCEDURE trigger_function_name;
This sets a trigger on the table that fires whenever a row is inserted, updated, or deleted. The trigger then calls the trigger function I set up, which looked something like this:
CREATE FUNCTION trigger_function_name()
RETURNS trigger
LANGUAGE 'plpgsql'
COST 100
VOLATILE NOT LEAKPROOF
AS
$BODY$
DECLARE
    payload JSON;
BEGIN
    payload = row_to_json(NEW);
    PERFORM pg_notify('notification_name', payload::text);
    RETURN NULL;
END;
$BODY$;
This allows me to 'listen' for any of these changes from my Spring Boot project, with the entire row sent as the payload.
Next, in my Spring Boot project I configured a connection to my database:
@Configuration
@EnableR2dbcRepositories("com.(point to wherever repository is)")
public class R2DBCConfig extends AbstractR2dbcConfiguration {

    @Override
    @Bean
    public ConnectionFactory connectionFactory() {
        return new PostgresqlConnectionFactory(PostgresqlConnectionConfiguration.builder()
                .host("host")
                .database("db")
                .port(port)
                .username("username")
                .password("password")
                .schema("schema")
                .connectTimeout(Duration.ofMinutes(2))
                .build());
    }
}
With that in place, I autowire (dependency-inject) the ConnectionFactory into the constructor of my service class and cast it to the r2dbc PostgresqlConnection class, like so:
this.postgresqlConnection = Mono.from(connectionFactory.create()).cast(PostgresqlConnection.class).block();
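In context, the service class ends up looking roughly like the sketch below (class, field, and repository names are illustrative; the PostgresqlConnection import path may differ slightly between driver versions):
import com.fasterxml.jackson.databind.ObjectMapper;
import io.r2dbc.postgresql.api.PostgresqlConnection;
import io.r2dbc.spi.ConnectionFactory;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Mono;

@Service
public class DocumentService {

    private final PostgresqlConnection postgresqlConnection;
    private final DocumentRepository documentRepository; // R2DBC repository used further below
    private final ObjectMapper objectMapper = new ObjectMapper(); // used to deserialize notification payloads

    public DocumentService(ConnectionFactory connectionFactory, DocumentRepository documentRepository) {
        this.postgresqlConnection = Mono.from(connectionFactory.create())
                .cast(PostgresqlConnection.class)
                .block();
        this.documentRepository = documentRepository;
    }

    // The @PostConstruct, @PreDestroy, getUpdatedRows() and getDocuments() methods below live in this class.
}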
Now we want to 'listen' to our table and get notified whenever it changes. To do that, we set up an initialization method that runs after dependency injection, using the @PostConstruct annotation:
@PostConstruct
private void postConstruct() {
    postgresqlConnection.createStatement("LISTEN notification_name").execute()
            .flatMap(PostgresqlResult::getRowsUpdated).subscribe();
}
Notice that we LISTEN on whatever channel name we passed to pg_notify. We also want a method that closes the connection when the bean is about to be destroyed, like so:
@PreDestroy
private void preDestroy() {
    postgresqlConnection.close().subscribe();
}
Now I simply create a method that returns a Flux of whatever is currently in my table and merge it with my notifications. As I said before, the notifications come in as JSON, so I have to deserialize them; I decided to use ObjectMapper. It looks something like this:
private Flux<YourClass> getUpdatedRows() {
    return postgresqlConnection.getNotifications().map(notification -> {
        try {
            // deserialize the JSON payload sent by pg_notify
            return objectMapper.readValue(notification.getParameter(), YourClass.class);
        } catch (IOException e) {
            // handle the exception; rethrowing here terminates the Flux with an error
            throw new UncheckedIOException(e);
        }
    });
}

public Flux<YourClass> getDocuments() {
    return documentRepository.findAll().share().concatWith(getUpdatedRows());
}
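If you want to push these changes out to clients, a minimal sketch of exposing the merged Flux as a server-sent events endpoint could look like this (controller name and mapping are just illustrative):
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class DocumentController {

    private final DocumentService documentService;

    public DocumentController(DocumentService documentService) {
        this.documentService = documentService;
    }

    // Streams the current rows followed by live pg_notify updates as server-sent events.
    @GetMapping(value = "/documents", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<YourClass> streamDocuments() {
        return documentService.getDocuments();
    }
}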
Hope this helps.
Cheers!

Related

Calling a simple stored procedure in EF core 3.1

I have a very simple stored procedure:
CREATE PROCEDURE [dbo].[ClearIterations]
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON
delete from iterations
END
GO
When I call it from EF, the procedure is not executed, and I get no errors:
public void ClearIterations()
{
    this.Iterations.FromSqlRaw("ClearIterations").IgnoreQueryFilters();
}
Any pointers? I found the sample above in another thread here, where the code above was the answer. It seems kind of strange that I have to go through this.Iterations to call a stored procedure.
EF Core 3.x+ provides two sets of raw SQL methods - FromSql and ExecuteSql - both with Raw / Interpolated and Async variants.
The former are used for querying. They return IQueryable<T>, allow query composition, and, like any LINQ query, are not executed until the result is enumerated.
The latter are used to immediately execute arbitrary SQL (DDL, DML, batches, etc.). They are the EF Core equivalent of ADO.NET ExecuteNonQuery and return the number of records affected. Output (or input/output) primitive-value parameters can be used to obtain results.
In short, the ExecuteSql methods are what you are looking for. With your example, ExecuteSqlRaw, e.g. (assuming this is a method in your DbContext-derived class):
public void ClearIterations()
{
    this.Database.ExecuteSqlRaw("ClearIterations");
}
// Interface
public interface IRepository
{
    Task AddQualification(int userId, string qualification);
}
Repository class implementing the interface (remember to register it in your startup class's ConfigureServices: services.AddScoped<IRepository, Repository>()):
public class Repository : IRepository
{
    // Assumes a DbContext injected via the constructor; the concrete type name is illustrative
    private readonly AppDbContext appDbContext;
    public Repository(AppDbContext appDbContext) => this.appDbContext = appDbContext;

    public async Task AddQualification(int userId, string qualification)
    {
        await appDbContext.Database.ExecuteSqlRawAsync("InsertQualification {0}, {1}", userId, qualification);
    }
}
Inject the interface via the constructor, then call the method:
public async Task OnPost()
{
    await _repository.AddQualification(1, "English A-Level");
}
SQL Stored Procedure
CREATE PROCEDURE InsertQualification
    -- Add the parameters for the stored procedure here
    @UserID int,
    @Qualification varchar(255)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.UserQualifications (UserID, Qualification)
    VALUES (@UserID, @Qualification)
END
GO

cassandra trigger create vs update

I need to implement a Cassandra trigger in Java that takes a different action for INSERT vs. UPDATE operations.
I've seen how it is possible to identify a DELETE operation in the question 'Cassandra sample trigger to get the deleted row and column values', but I can't see any methods on the ColumnFamily object that would allow the code to differentiate between INSERT and UPDATE. Is there a way to achieve this?
Conceptually there isn't any difference between INSERT and UPDATE; both are just mutations. Cassandra gives them different names so that people coming from relational databases have familiar concepts.
As @doanduyhai mentioned, and according to the Cassandra sources, only an INSERT operation updates the row timestamp (LivenessInfo).
For other operations this timestamp is not updated and equals Long.MIN_VALUE.
Here is a custom trigger that checks for an INSERT operation:
public class CustomTrigger implements ITrigger {

    @Override
    public Collection<Mutation> augment(Partition partition) {
        checkQueryType(partition);
        return Collections.emptyList();
    }

    private void checkQueryType(Partition partition) {
        UnfilteredRowIterator it = partition.unfilteredIterator();
        while (it.hasNext()) {
            Unfiltered unfiltered = it.next();
            Row row = (Row) unfiltered;
            if (isInsert(row)) {
                // Implement insert-related logic
            }
        }
    }

    private boolean isInsert(Row row) {
        // Only INSERT sets the primary key liveness timestamp; other mutations leave it at Long.MIN_VALUE
        return row.primaryKeyLivenessInfo().timestamp() != Long.MIN_VALUE;
    }
}

Generic way to initialize a JPA 2 lazy association

So, the question at hand is about initializing the lazy collections of an "unknown" entity, as long as these are known at least by name. This is part of a broader effort of mine to build a generic DataTable -> RecordDetails mini-framework in JSF + PrimeFaces.
The associations are usually lazy, and the only moment I need them loaded is when someone accesses one of the many records in the datatable in order to view/edit it. The issue here is that the controllers are generic, and I also use just one service class backing the whole lazy loading for the datatable as well as loading/saving the record from the details section.
What I have come up with so far is the following piece of code:
public <T> T loadWithDetails(T record, String... associationsToInitialize) {
    final PersistenceUnitUtil pu = em.getEntityManagerFactory().getPersistenceUnitUtil();
    record = (T) em.find(record.getClass(), pu.getIdentifier(record));
    for (String association : associationsToInitialize) {
        try {
            if (!pu.isLoaded(record, association)) {
                loadAssociation(record, association);
            }
        } catch (..... non significant) {
            e.printStackTrace(); // Nothing else to do
        }
    }
    return record;
}

private <T> void loadAssociation(T record, String associationName) throws IntrospectionException, InvocationTargetException, IllegalAccessException, NoSuchFieldException {
    BeanInfo info = Introspector.getBeanInfo(record.getClass(), Object.class);
    PropertyDescriptor[] props = info.getPropertyDescriptors();
    for (PropertyDescriptor pd : props) {
        if (pd.getName().equals(associationName)) {
            Method getter = pd.getReadMethod();
            ((Collection) getter.invoke(record)).size(); // touch the collection to force initialization
            return; // association found and initialized
        }
    }
    throw new NoSuchFieldException(associationName);
}
And the question is, did anyone start any similar endeavor, or does anyone know of a more pleasant way to initialize collections in a JPA way (not Hibernate / Eclipselink specific) without involving reflection?
Another alternative I could think of is forcing all entities to implement some interface with
Object getId();
void loadAssociations();
but I don't like the idea of forcing my pojos to implement some interface just for this.
With the reflection solution you would suffer the N+1 effect detailed here: Solve Hibernate Lazy-Init issue with hibernate.enable_lazy_load_no_trans
You could use the OpenSessionInView pattern instead; you will still be affected by the N+1 problem, but you will not need reflection. If you use this pattern, the persistence session remains open until the end of the request, and all the LAZY relationships are loaded without a problem.
For this pattern you will need a WebFilter that opens and closes the session.
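A rough sketch of what such a filter could look like with plain JPA and the servlet API (the persistence-unit name and the ThreadLocal holder are illustrative, not something prescribed by the pattern):
import java.io.IOException;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.annotation.WebFilter;

// Keeps an EntityManager open for the whole request so that lazy associations
// touched while rendering the view can still be initialized.
@WebFilter("/*")
public class OpenEntityManagerInViewFilter implements Filter {

    private static final ThreadLocal<EntityManager> CURRENT = new ThreadLocal<>();
    private EntityManagerFactory emf;

    @Override
    public void init(FilterConfig filterConfig) {
        emf = Persistence.createEntityManagerFactory("my-persistence-unit"); // illustrative unit name
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        EntityManager em = emf.createEntityManager();
        CURRENT.set(em);
        try {
            chain.doFilter(request, response); // view rendering happens while the EntityManager is open
        } finally {
            CURRENT.remove();
            em.close();
        }
    }

    // Services/DAOs would look up the request-scoped EntityManager through this accessor.
    public static EntityManager currentEntityManager() {
        return CURRENT.get();
    }

    @Override
    public void destroy() {
        emf.close();
    }
}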

Create database with data

I'm trying to create my database (code first), and I want to add some data to it when it's created.
public class InitializerWithData : CreateDatabaseIfNotExists<DatabaseContext>
{
    protected override void Seed(DatabaseContext ctx)
    {
        GroupType gt = new GroupType() { Name = "RNC" };
        //save
        ctx.GroupType.Add(gt);
        ctx.SaveChanges();
    }
}

public DatabaseContext()
{
    Database.SetInitializer<DatabaseContext>(new InitializerWithData());
    Database.CreateIfNotExists();
}
As you can see, I wrote my custom initializer, but the code inside it is never fired, even though the database does get created.
So how do I solve this?
When you call Database.CreateIfNotExists(), it doesn't trigger the initializer's InitializeDatabase; it basically has a separate implementation from the initializer.
If you want the Seed method to be fired, you need to execute code that causes EF to send a query to the database.
First, remove this line:
Database.CreateIfNotExists();
Then just execute a query; the simplest you could do is something like:
using(var db = new DatabaseContext())
{
    db.Set<GroupType>().Any();
}
This code will create the database if it doesn't exist and execute the Seed method.

Why do I need to add Insert and Update methods to my DomainService?

If I don't add Insert and Update methods to my domain service, I get exceptions when I try to add entities to the associated EntityCollection of my entity. Now that I've added them (completely blank), I can add entities and modify them on the client, but they never show up in my database. What am I missing? Do I need to write my own Insert and Update methods in my domain service, and if so, what on earth would I put in them?
Edit:
This is what I have in my DomainContext. This seems a bit superfluous; I would think Entity Framework would already do this.
[Update]
public void UpdateProject(Project a_project)
{
    ObjectContext.AcceptAllChanges();
}

[Update]
public void UpdateProjectItem(ProjectItem a_projectItem)
{
    ObjectContext.AcceptAllChanges();
}

[Insert]
public void InsertProjectItem(ProjectItem a_projectItem)
{
    ObjectContext.ProjectItems.AddObject(a_projectItem);
    ObjectContext.AcceptAllChanges();
}
And this is how I'm using it on the client:
ProjectItem projectItem = new ProjectItem();
_reservedProject.Status = Project.ProjectStatusSubmitted;
_reservedProject.ProjectItems.Add(projectItem);
projectItem.LibraryItem = a_item;
_projectItems.Add(projectItem);
_domainContext.SubmitChanges();
UpdateProjectItem is never called.