Room database handling inheritance in POJO and multiple tables - android-sqlite

I have an issue while migrating from SQLite to Room. I have one parent class, one child class, and two corresponding tables.
I have inheritance as follows:
public class Sms {
int _id;
String sender;
String body;
Date date;
}
public class Event extends Sms {
String eventName;
long eventDueDate;
}
And I have tables as
SmsTable >>
_id INTEGER NOT NULL primary key autoincrement,
sender TEXT not null,
body TEXT not null,
date INTEGER not null
EventTable >>
_id INTEGER NOT NULL primary key autoincrement,
sms_id INTEGER,
eventName TEXT not null,
eventDueDate INTEGER
Now when I define Event as @Entity(tableName = "EventTable"), it gives me an error that the migration didn't properly handle Event: the expected TableInfo and the found TableInfo don't match.
The expected TableInfo has columns for sender, body, and date, while my EventTable doesn't have them.
How do I migrate my Event class, which inherits from Sms, when the tables are not flattened?
P.S. I cannot flatten the EventTable, as SmsTable exists even without Event, and I need to convert Sms into an Entity as well.

ignoredColumns: in cases where an entity inherits fields from a parent entity, it's usually easier to use the ignoredColumns property of the @Entity annotation.
foreignKeys: even though you cannot use direct relationships, Room still allows you to define foreign key constraints between entities.
#Entity(tableName = "EventTable", ignoredColumns = "sender","body","date",
foreignKeys = #ForeignKey(entity = Sms.class,
parentColumns = "id" , childColumns = "sms_id"))
public class Event extends Sms {
#PrimaryKey(autoGenerate = true)
    #ColumnInfo(name = "_id")
public int id;
#ColumnInfo(name = "sms_id")
public int smsId;
public String evwntName;
public long eventDuedate;
}
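Since Sms.class is referenced as the foreign-key entity, Sms itself also has to be an @Entity. A minimal sketch of what that could look like against the existing SmsTable (the DateConverter class name is hypothetical; Room needs some TypeConverter for java.util.Date, and the @NonNull annotations mirror the NOT NULL columns):
@Entity(tableName = "SmsTable")
@TypeConverters(DateConverter.class) // hypothetical converter class, sketched below
public class Sms {
    @PrimaryKey(autoGenerate = true)
    @ColumnInfo(name = "_id")
    public int _id;
    @NonNull public String sender;
    @NonNull public String body;
    @NonNull public Date date; // stored as INTEGER through the converter
}

// a common Date <-> Long converter so the date column stays INTEGER
public class DateConverter {
    @TypeConverter
    public static Date fromTimestamp(Long value) { return value == null ? null : new Date(value); }
    @TypeConverter
    public static Long toTimestamp(Date date) { return date == null ? null : date.getTime(); }
}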

Related

Spring R2DBC query by example with projected/custom entity

The title may not be worded well, but here is, more or less, what I want to achieve.
I would like to be able to write dynamic queries, using Query by Example, that join multiple tables and create a (projection?) DTO for me.
This DTO would have fields mapped to columns in the joined tables. Consider the following:
Tables:
CREATE TABLE address
(
id SERIAL,
address_code VARCHAR(255) NOT NULL,
street_name VARCHAR(255),
building_number VARCHAR(255)
);
CREATE TABLE account
(
id SERIAL,
account_number BIGINT UNIQUE
);
CREATE TABLE customer
(
id SERIAL,
name VARCHAR(255)
)
I would like to be able to create a query whose result would be:
address.address_code, account.account_number, customer.name
so basically the result would be a custom DTO. I also mentioned that I would like to back this with Query by Example because I want to dynamically append WHERE clauses, so I thought that if I created a DTO like:
public record CustomQueryResultDTO(String addressCode, BigInteger accountNumber, String name) {}
I could simply query just as shown in the Spring R2DBC documentation.
The problem here is that I am not sure what a viable solution looks like, because on the one hand I would like to reuse ReactiveQueryByExampleExecutor, but that would mean creating something like:
@Repository
public interface CustomQueryResultRepository extends ReactiveCrudRepository<CustomQueryResultDTO, Integer>, ReactiveQueryByExampleExecutor<CustomQueryResultDTO> {
}
This doesn't quite seem like the way to go, since I do not have a corresponding table for CustomQueryResultDTO and therefore there is really no mapping for this repository interface - or am I overthinking this and it actually is the way to go?
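For reference, the Query by Example usage I have in mind (following the Spring Data documentation; the probe value here is just an illustration) would be roughly:
// probe carrying only the fields to filter on; null fields are ignored
CustomQueryResultDTO probe = new CustomQueryResultDTO("CODE-1", null, null);
ExampleMatcher matcher = ExampleMatcher.matching().withIgnoreNullValues();
Flux<CustomQueryResultDTO> results =
        customQueryResultRepository.findAll(Example.of(probe, matcher));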
I think you are potentially overthinking it.
You can do it in a number of ways (note Java 17 text blocks):
Via an R2DBC JPA-like @Query
Create a normal ReactiveCrudRepository but collect the result into a projection (DTOP)
// Repository
@Repository
public interface UserRefreshTokenRepository extends ReactiveCrudRepository<UserRefreshToken, Integer> {
@Query(
"""
select *
from user.user_refresh_tokens t
join user.user_infos c on c.user_id = t.user_id
where c.username = :username
"""
)
Flux<UserRefreshTokenDtop> findAllByUsername(String username);
}
// Entity
@Data
@Builder
@AllArgsConstructor
@NoArgsConstructor
@ToString(exclude = {"refreshToken"})
@Table(schema = "user", name = "user_refresh_tokens")
public class UserRefreshToken {
@Id private Integer id;
private String userId;
private String username; // will be joined
private String ipAddr;
private OffsetDateTime createdAt;
private String refreshToken;
private OffsetDateTime refreshTokenIat;
private OffsetDateTime refreshTokenExp;
}
// DTO projection
public interface UserRefreshTokenDtop {
Integer getId();
String getUserId();
String getUsername(); // will be joined
String getIpAddr();
OffsetDateTime getRefreshTokenIat();
OffsetDateTime getRefreshTokenExp();
}
Via DatabaseClient
This one also uses TransactionalOperator to ensure query atomicity
private final DatabaseClient client;
private final TransactionalOperator operator;
@Override
public void deleteAllUsedExpiredAttempts(Duration resetInterval) {
// language=PostgreSQL
String allUsedExpiredAttempts = """
select t.id failed_id, c.id disable_id, t.username
from user.failed_sign_attempts t
join user.disable_sign_attempts c on c.username = t.username
where c.is_used = true
and :now >= c.expires_at + interval '%d seconds'
""";
// POTENTIAL SQL injection - half-arsed but %d ensures that only Number is allowed
client
.sql(String.format(allUsedExpiredAttempts, resetInterval.getSeconds()))
.bind("now", Instant.now())
.fetch()
.all()
.flatMap(this::deleteFailed)
.flatMap(this::deleteDisabled)
.as(operator::transactional)
.subscribe(v1 -> log.debug("Successfully reset {} user(s)", v1));
}
Via R2dbcEntityTemplate
I don't have a working example, but it is a pain to join via the .join() operator.
If you are interested, check the docs for R2dbcEntityTemplate:
13.4.3. Fluent API > Methods for the Criteria Class
https://docs.spring.io/spring-data/r2dbc/docs/current/reference/html
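For what it's worth, a minimal single-table sketch of that fluent API (assuming a Customer entity mapped to the customer table from the question and an injected R2dbcEntityTemplate named template; the criteria only filter one table, so the join itself still ends up in @Query or DatabaseClient as shown above):
// Query and Criteria come from org.springframework.data.relational.core.query
Flux<Customer> customers = template
        .select(Customer.class)
        .matching(Query.query(Criteria.where("name").is("some name")))
        .all();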

JPA - perform an insert on a Postgres table whose primary key is generated from a database trigger

I am writing an API where I am inserting a record into a table (Postgres). I was hoping to use JPA for the work. Here is the potential challenge: the primary key for the insert is generated by a database trigger, rather than from a sequence or similar. In fact, the trigger creates the primary key using the values of other fields being passed in as part of the insert. So for example,
if I have an entity class like the following:
@Entity
@Validated
@Table(name = "my_table", schema="common")
public class MyModel {
@Id
@Column(name = "col_id")
private String id;
@Column(name = "second_col")
private String secCol;
@Column(name = "third_col")
private String thirdCol;
public MyModel() {
}
public MyModel(String id, String secCol, String thirdCol) {
this.id = id;
this.secCol = secCol;
this.thirdCol = thirdCol;
}
}
I would need the col_id field to somehow honor that the key is generated from the trigger, and the trigger would need to be able to read the values for second_col and third_col in order to generate the primary key. Finally, I would need the call to return the value of the primary key.
Can this be done with JPA and a repository interface such as:
public interface MyRepo extends JpaRepository <MyModel, String> {
}
and then use either a default save method such as myRepo.saveAndFlush(myModel) or custom save methods? I can't find anything on using JPA with DB triggers that generate keys. If it cannot be done with JPA, I would be grateful for any alternative ideas. Thanks.
OK, I was able to get this to work. It required writing a custom query that ignores the primary key field:
public interface MyRepo extends JpaRepository<MyModel, String> {
@Transactional
@Modifying
@Query(value = "INSERT INTO my_table(second_col, third_col) VALUES (:second_col, :third_col)", nativeQuery = true)
int insertMyTable(@Param("second_col") String second_col, @Param("third_col") String third_col);
}
The model class is unchanged from above. Because the insert is executed as a native query, it allows Postgres to do its thing uninterrupted.
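Note that a @Modifying query only returns the number of affected rows, not the trigger-generated key. One way to get the key back, assuming second_col and third_col identify the newly inserted row, is to read it back with a derived query on the same repository (a sketch, not part of the original answer):
public interface MyRepo extends JpaRepository<MyModel, String> {
    // native insert from above ...

    // read the row back to obtain the trigger-generated col_id
    Optional<MyModel> findFirstBySecColAndThirdCol(String secCol, String thirdCol);
}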

Entity Framework 6.2 Create index on Foreign Key

How do you set up Entity Framework clustered index and foreign keys?
public class WorkDay {
public int Id { get;set;}
public DateTime Date { get;set;}
public Keyword Kw {get;set;}
}
public class Keyword {
public int Id { get;set;}
public string Name {get;set;}
}
I want to add an index on Date and Kw for the WorkDay entity, but cannot figure out how.
builder.Entity<WorkDay>().HasIndex(item => new { item.Date, item.Keyword });
This gives me an error, because the mapping is only done for simple types.
builder.Entity<WorkDay>().HasIndex(item => new { item.Date, item.Keyword.Id });
gives me the error:
The properties expression 'item => new <>f__AnonymousType21`2(Date = item.Date, Id = item.Keyword.Id)' is not valid. The expression should represent a property
What is the correct way?
Create a property for the foreign key (e.g. WorkDay.KeywordId) and reference that when defining the index.
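A sketch of that suggestion (the KeywordId name relies on EF's foreign-key naming convention for the Kw navigation; if the convention doesn't pick it up, map it explicitly, for example with HasForeignKey):
public class WorkDay {
    public int Id { get; set; }
    public DateTime Date { get; set; }
    public int KeywordId { get; set; } // scalar foreign key property for Kw
    public Keyword Kw { get; set; }
}

// composite index over the date and the foreign key column
builder.Entity<WorkDay>().HasIndex(item => new { item.Date, item.KeywordId });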

Spring Data JPA : How save data into database using save() of jpaRepository

I created a Spring application. I am trying to save data into the database using the save method of JpaRepository, but I am getting the error: null value in column "id" violates not-null constraint.
HomeController
@RestController
public class HomeController
{
@Autowired
public userRepository repository;
@RequestMapping(value = "/save2", method = RequestMethod.POST)
public String save1(@ModelAttribute user us)
{
repository.save(us);
return "successful";
}
}
user
@Entity
@Table(name="user", schema="new")
public class user implements Serializable
{
private static final long serialVersionUID = -2956665320311624925L;
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
public Integer id;
@Column(name="uname")
public String uname;
@Column(name="pass")
public String pass;
}
Table Script
Through Postman, I am trying to insert the following data, and I am getting this error:
org.postgresql.util.PSQLException: ERROR: null value in column "id" violates not-null constraint
Can anyone tell me what I am doing wrong in the above code?
I see a couple of issues here.
First, replace your @ModelAttribute with @RequestBody; since you're sending a JSON request, it is wise to use the latter. (Read up here and here.) In your case, the values from the request, including the id value, are not passed to the repository's save method. That's the reason you're getting the not-null constraint error.
Second, since you're using the GenerationType.IDENTITY strategy, you should use the serial or bigserial type to let Postgres generate your primary key.
Read up on the nicely written answers on the IDENTITY strategy here.
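A minimal sketch of both fixes, reusing the controller and entity from the question (the DDL is only an illustration; adapt the column types to your actual table in the new schema):
// Controller: bind the JSON request body instead of a model attribute
@RequestMapping(value = "/save2", method = RequestMethod.POST)
public String save1(@RequestBody user us)
{
repository.save(us);
return "successful";
}

// Table: a serial column so the database generates the id for GenerationType.IDENTITY, e.g.
// CREATE TABLE "new"."user" (
//     id    SERIAL PRIMARY KEY,
//     uname VARCHAR(255),
//     pass  VARCHAR(255)
// );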
You defined id as an Integer field in your model class. Try to pass the value in the JSON as an Integer, not as a String.
{
"id": 1,
"uname": "abc",
"upass": "abc"
}

(JDBI/Dropwizard) PSQLException when retrieving auto-incremented id from PostgreSQL

I'm trying to set up a Dropwizard project but I'm stuck. When I try to get the auto-generated id field with @GetGeneratedKeys, I get the following exception:
org.postgresql.util.PSQLException: Bad value for type long : foo.
The request is a simple JSON request:
{"name":"foo"}
The INSERT into the database is successful, but it seems that the statement returns the value of name instead of the generated id. How can I solve this?
I use PostgreSQL, and the table project contains a primary key field "id" with nextval('project_id_seq'::regclass). Here are the POJO, DAO and resource classes I use:
public class Project {
private long id;
private String name;
public Project() {
// Jackson deserialization
}
public Project(long id, String name) {
this.id = id;
this.name = name;
}
...
}
@RegisterMapper(ProjectMapper.class)
public interface ProjectDAO {
@SqlUpdate("insert into project (name) values (:name)")
@GetGeneratedKeys
public long insert(@Bind("name") String name);
}
@Path("/project")
@Consumes({MediaType.APPLICATION_JSON})
@Produces({MediaType.APPLICATION_JSON})
public class ProjectResource {
ProjectDAO projectDAO;
public ProjectResource(ProjectDAO personDAO) {
this.projectDAO = personDAO;
}
@POST
@Timed
public Response add(@Valid Project project) {
long newId = projectDAO.insert(project.getName());
project.setId(newId);
return Response.status(Response.Status.CREATED)
.entity(project).build();
}
}
===============
UPDATE
I just figured out that this relates to the fact that my id column isn't the first column in my table - the name column is. The problem occurs because @GetGeneratedKeys uses org.skife.jdbi.v2.sqlobject.FigureItOutResultSetMapper, which uses org.skife.jdbi.v2.PrimitivesMapperFactory, which returns org.skife.jdbi.v2.util.LongMapper.FIRST. This mapper calls
java.sql.ResultSet.getLong(1) through the method extractByIndex(...) to retrieve the generated id, which isn't the id in my case...
I'll fix the issue by reorganizing the columns in the database, but I'd like to have a robust implementation if possible: is there a way to specify the column name of the id column when using the @GetGeneratedKeys annotation? (The org.skife.jdbi.v2.util.LongMapper class also contains a method called extractByName(...).)
This is an issue in the JDBI implementation and is fixed in a newer version, as described in https://github.com/jdbi/jdbi/issues/114.
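For reference, newer JDBI versions let you name the generated-key column explicitly. A sketch against the JDBI 3 SQL Object API (annotation and package names as in JDBI 3; double-check against whichever version you upgrade to):
import org.jdbi.v3.sqlobject.customizer.Bind;
import org.jdbi.v3.sqlobject.statement.GetGeneratedKeys;
import org.jdbi.v3.sqlobject.statement.SqlUpdate;

public interface ProjectDAO {
    @SqlUpdate("insert into project (name) values (:name)")
    @GetGeneratedKeys("id") // return the "id" column rather than whatever column 1 happens to be
    long insert(@Bind("name") String name);
}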