Could not create a trigger for a table on Cassandra - triggers

I tried Cassandra version 2.2.6 (a docker image) and 3.7 (the latest version, not with docker). Both of them report the same issue, an exception, when I create a trigger for a table. The trigger class:
```
package com.ttData.triggers;

import ...

public class DataTrigger implements ITrigger {

    private Properties properties = loadProperties();

    @Autowired
    private KafkaTemplate<Integer, String> kafkaTemplate;

    private static AtomicInteger index = new AtomicInteger(1);

    @Override
    public Collection<Mutation> augment(Partition update) {
        ...
        return Collections.singletonList(audit.build());
    }

    private static Properties loadProperties() {
        ...
        return properties;
    }
}
```

You should use single quotes instead of double quotes for the class name:
```
cqlsh:test> CREATE TRIGGER myTrigger on mytable using "className";
SyntaxException: <ErrorMessage code=2000 [Syntax error in CQL query] message="line 1:42 mismatched input 'className' expecting STRING_LITERAL (...TRIGGER myTrigger on mytable using ["classNam]e";)">
cqlsh:test>
cqlsh:test> CREATE TRIGGER myTrigger on mytable using 'className';
ConfigurationException: <ErrorMessage code=2300 [Query invalid because of configuration issue] message="Trigger class 'className' doesn't exist">
```

After debugging the Cassandra source code, I think this is a bug.
Even though the trigger directory and the trigger classes are both correct, it can still report that the trigger class doesn't exist.
The reason is that the worker thread that creates the trigger is not a "secure" thread: it is not managed by Cassandra's SecurityManager and does not belong to a SecurityThreadGroup, so an exception is thrown when the security validation fails.


Why is my data not persisted/accessible in a Spring-Boot integration test with HttpGraphQlTester and TestEntityManager

I have a bare-bones Spring-Boot app with some GraphQL endpoints and a Postgres database, and I want to run an integration test against an endpoint. It should find an entity by its ID, and it does so without a problem when I send a request manually via Postman. However, when I write an integration test for the controller, it doesn't. The data seems to be saved after using
TestEntityManager (or the JpaRepository directly), and I get the entity back with its ID. I then stick that ID into a query with HttpGraphQlTester, which fails with an empty result/null. I traced it with the debugger and discovered that when the endpoint calls the repository to retrieve the entity with the given ID, it gets null; when I look at all the repo contents, it's just an empty list. So my data seems to be accessible in my test but not in my repo/service. Any pointers would be very much appreciated.
Test
```
@SpringBootTest
@AutoConfigureHttpGraphQlTester
@AutoConfigureTestEntityManager
@Transactional
public class BackboneTreeControllerTest {

    @Autowired
    HttpGraphQlTester tester;

    @Autowired
    private TestEntityManager testEntityManager;

    @Test
    void findTaxon() {
        Taxon taxon = Taxon.builder()
                .path(Arrays.asList("path", "to", "taxon"))
                .nameCanonical("Cocos nucifera")
                .authorship("Me")
                .extinct(false)
                .numDescendants(1L)
                .numOccurrences(1L)
                .build();
        Taxon savedTaxon = testEntityManager.persistFlushFind(taxon); // (1)
        this.tester.documentName("queries")
                .operationName("FindTaxon")
                .variable("taxonId", savedTaxon.getId())
                .execute()
                .path("findTaxon.authorship")
                .entity(String.class)
                .isEqualTo("Me");
    }
}
```
The testEntityManager call (1) returns successfully with an ID.
Query
```
query FindTaxon($taxonId: ID!) {
    findTaxon(id: $taxonId) {
        authorship
    }
}
```
Controller
```
@Controller
@AllArgsConstructor
public class BackboneTreeController {

    private final TaxonService taxonService;

    @QueryMapping
    public Taxon findTaxon(@Argument Integer id) {
        Optional<Taxon> taxon = taxonService.findTaxon(id);
        return taxon.orElse(null);
    }
}
```
Service
```
@Service
@AllArgsConstructor
public class TaxonService {

    private final TaxonRepository taxonRepository;

    public Optional<Taxon> findTaxon(Integer id) {
        return taxonRepository.findById(id); // (2)
    }
}
```
This is where I would expect the repo to return the entity but it does not. Also using .findAll here returns an empty list.
Repository
```
@Repository
public interface TaxonRepository extends JpaRepository<Taxon, Integer> {
}
```
Note that everything works fine when I just run the app and send the exact same query manually!
I don't know HttpGraphQlTester, but I'd assume that it generates requests which then get processed in a separate thread.
That thread won't see the changes made in the test because they aren't committed yet.
If this is the reason, resolve it by putting the setup in its own transaction, for example by using TransactionTemplate.
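For illustration, a minimal sketch of what that could look like in the test above; the TransactionTemplate wiring and the REQUIRES_NEW propagation are assumptions to adapt to your setup:
```
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.support.TransactionTemplate;

@Autowired
private TransactionTemplate transactionTemplate;

@Test
void findTaxon() {
    Taxon taxon = ...; // built exactly as in the original test

    // Commit the fixture in its own transaction so the thread serving the
    // GraphQL request can see it. REQUIRES_NEW keeps it separate from the
    // test's own @Transactional transaction.
    transactionTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
    Taxon savedTaxon = transactionTemplate.execute(status ->
            testEntityManager.persistFlushFind(taxon));

    // ... run the HttpGraphQlTester assertions as before. Committed data is
    // not rolled back automatically, so clean up afterwards if needed.
}
```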

Spring Boot ShedLock "relation "shedlock" does not exist"

I added ShedLock to my project to prevent a scheduled job from running more than once. I configured it as below, but I'm getting an
"org.postgresql.util.PSQLException: ERROR: relation "shedlock" does not exist" error.
This is the LockProvider bean:
```
@Bean
public LockProvider lockProvider(DataSource dataSource) {
    return new JdbcTemplateLockProvider(
            JdbcTemplateLockProvider.Configuration.builder()
                    .withJdbcTemplate(new JdbcTemplate(dataSource))
                    .usingDbTime()
                    .build()
    );
}
```
This is the scheduled job:
```
@Scheduled(cron = "${cronProperty:0 00 23 * * *}")
@SchedulerLock(name = "schedulerLockName")
public void scheduledJob() {
    ..............
}
```
I added these annotations to the class which contains the scheduledJob method:
```
@EnableScheduling
@Component
@Configuration
@EnableSchedulerLock(defaultLockAtMostFor = "2m")
```
I'm using Spring Data to do database operations and using these properties:
```
spring.datasource.url = jdbc:postgresql://ip:port/databaseName?currentSchema=schemeName
spring.datasource.driver-class-name = org.postgresql.Driver
spring.jpa.database = postgresql
spring.datasource.platform = postgresql
spring.datasource.hikari.maximum-pool-size=5
spring.jpa.database-platform=org.hibernate.dialect.PostgreSQLDialect
spring.datasource.username = username
spring.datasource.password = password
```
You have to create the table as described in the documentation. Maybe this is what you are missing:
If you need to specify a schema, you can set it in the table name
using the usual dot notation: new JdbcTemplateLockProvider(datasource,
"my_schema.shedlock")
I faced this problem too, even though the shedlock table had been created. Workarounds for this are:
Setting the Postgres user's default schema using ALTER ROLE YourPgUser SET search_path TO ..., or
Specifying the shedlock schema on the LockProvider bean:
```
@Bean
public LockProvider getLockProvider(@Autowired JdbcTemplate jdbcTemplate) {
    jdbcTemplate.execute("SET search_path TO domaindbschema");
    return new JdbcTemplateLockProvider(jdbcTemplate);
}
```
or in another style:
```
@Bean
public LockProvider getLockProvider(@Autowired JdbcTemplate jdbcTemplate) {
    return new JdbcTemplateLockProvider(jdbcTemplate, "domaindbschema.shedlock");
}
```

Spring JPA native query to '#IdClass' annotated table and getting "No Dialect mapping for JDBC type: 1111" [duplicate]

I'm working on a Spring JPA application, using MySQL as the database. I ensured that all spring-jpa libraries, hibernate and mysql-connector-java are loaded.
I'm running a MySQL 5 instance. Here is an excerpt of my application.properties file:
```
spring.jpa.show-sql=false
spring.jpa.hibernate.ddl-auto=create-drop
spring.jpa.database-platform=org.hibernate.dialect.MySQL5Dialect
spring.datasource.url=jdbc:mysql://localhost/mydatabase
spring.datasource.username=myuser
spring.datasource.password=SUPERSECRET
spring.datasource.driverClassName=com.mysql.jdbc.Driver
```
When executing an integration test, Spring starts up properly but fails on creating the Hibernate SessionFactory with the exception:
```
org.hibernate.MappingException: No Dialect mapping for JDBC type: 1111
```
I think my dialect should be MySQL5Dialect; I also tried the one explicitly stating InnoDB, and the two dialect options which don't indicate version 5. But I always end up with the same 'No Dialect mapping for JDBC type: 1111' message.
My application.properties file resides in the test/resources source folder. It is recognized by the JUnit test runner (I previously got an exception because of a typo in it).
Are the properties I'm setting wrong? I couldn't find official documentation on these property names, but found a hint in this stackoverflow answer: https://stackoverflow.com/a/25941616/1735497
Looking forward to your answers, thanks!
BTW, the application is already using Spring Boot.
I got the same error because my query returned a UUID column. To fix that I returned the UUID column as a varchar type through the query, like "cast(columnName as varchar)"; then it worked.
Example:
```
public interface StudRepository extends JpaRepository<Mark, UUID> {

    @Query(value = "SELECT CAST(stuid AS varchar) id, SUM(marks) AS marks FROM studs GROUP BY stuid", nativeQuery = true)
    List<Student> findMarkGroupByStuid();

    interface Student {
        String getId();
        String getMarks();
    }
}
```
Here is the answer, based on the comment from SubOptimal:
The error message actually says that one column type cannot be mapped to a database type by Hibernate.
In my case it was the java.util.UUID type I use as primary key in some of my entities. Just apply the annotation @Type(type="uuid-char") (for Postgres: @Type(type="pg-uuid")).
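For orientation, a minimal sketch of where that annotation goes on an entity (Hibernate 5.x; the entity and field names are illustrative):
```
import java.util.UUID;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Type;

@Entity
public class Stud {

    @Id
    @Type(type = "uuid-char") // on Postgres use @Type(type = "pg-uuid")
    private UUID id;

    // other fields, getters and setters omitted
}
```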
There is also another common use-case throwing this exception: calling a function which returns void. For more info and a solution, go here.
I got the same error; the problem here is that the UUID stored in the DB is not converted to an object.
I tried applying the annotations @Type(type="uuid-char") (for Postgres @Type(type="pg-uuid")), but they didn't work for me.
This worked for me: suppose you want id and name from a table with a native query in JPA. Create one entity class like 'User' with fields id and name, and then convert the Object[] results, the list of object arrays we get from the query, to the entity we want.
```
@Query(value = "SELECT CAST(id as varchar) id, name from users", nativeQuery = true)
public List<Object[]> search();
```
Suppose this is the entity we have:
```
public class User {
    private UUID id;
    private String name;
}
```
```
List<User> userList = new ArrayList<>();
// matchedData is the List<Object[]> returned by search()
for (Object[] data : matchedData) {
    userList.add(new User(UUID.fromString(String.valueOf(data[0])),
            String.valueOf(data[1])));
}
```
Please check whether some column returned by the query has an unknown type. For example, '1' as column_name has an unknown type, while 1 as column_name is correctly an Integer. This worked for me.
Finding the column that triggered the issue
First, you didn't provide the entity mapping so that we could tell what column generated this problem. For instance, it could be a UUID or a JSON column.
Now, you are using a very old Hibernate Dialect. The MySQL5Dialect is meant for MySQL 5. Most likely you are using a newer MySQL version.
So, try to use the MySQL8Dialect instead:
```
spring.jpa.database-platform=org.hibernate.dialect.MySQL8Dialect
```
Adding non-standard types
In case you got the issue because you are using a JSON column type, try to provide a custom Hibernate Dialect that supports the non-standard Type:
```
public class MySQL8JsonDialect extends MySQL8Dialect {

    public MySQL8JsonDialect() {
        super();
        this.registerHibernateType(
                Types.OTHER, JsonStringType.class.getName()
        );
    }
}
```
And use the custom Hibernate Dialect:
```
<property
    name="hibernate.dialect"
    value="com.vladmihalcea.book.hpjp.hibernate.type.json.MySQL8JsonDialect"
/>
```
If you get this exception when executing SQL native queries, then you need to pass the type via addScalar:
```
JsonNode properties = (JsonNode) entityManager
        .createNativeQuery(
                "SELECT properties " +
                "FROM book " +
                "WHERE isbn = :isbn")
        .setParameter("isbn", "978-9730228236")
        .unwrap(org.hibernate.query.NativeQuery.class)
        .addScalar("properties", JsonStringType.INSTANCE)
        .getSingleResult();

assertEquals(
        "High-Performance Java Persistence",
        properties.get("title").asText()
);
```
Sometimes when you call an SQL procedure/function, it might be required to return something. You can try returning void (RETURN;) or a string (RETURN 'OK'; this one worked for me).
If you have a native SQL query, fix it by adding a cast to the query. In my case this worked. Example:
```
CAST('yourString' AS varchar(50)) as anyColumnName
```
In my case, the issue was Hibernate not knowing how to deal with a UUID column. If you are using Postgres, try adding this to your resources/application.properties:
```
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQL9Dialect
```
Another simple explanation might be that you're fetching a complex type (entity/POJO) but do not specify the entity to map to:
```
String sql = "select yourentity.* from {h-schema}Yourentity yourentity";
return entityManager.createNativeQuery(sql).getResultList();
```
Simply add the class to map to in the createNativeQuery method:
```
return entityManager.createNativeQuery(sql, Yourentity.class).getResultList();
```
In my case the problem was that I forgot to add the resultClasses attribute when I set up my stored procedure in my User class:
```
@NamedStoredProcedureQuery(name = "find_email",
        procedureName = "find_email", resultClasses = User.class, // <-- I forgot that.
        parameters = {
                @StoredProcedureParameter(mode = ParameterMode.IN, name = "param_email", type = String.class)
        }),
```
This also happens when you are using Hibernate and returning a void function, at least with Postgres: it doesn't know how to handle the void. I ended up having to change my void to return an int.
If you are using Postgres, check that you don't have a column of type Abstime. Abstime is an internal Postgres datatype not recognized by JPA. In this case, converting to Text using TO_CHAR could help if permitted by your business requirements.
If using Postgres:
```
public class CustomPostgreSqlDialect extends PostgreSQL94Dialect {

    @Override
    public SqlTypeDescriptor remapSqlTypeDescriptor(SqlTypeDescriptor sqlTypeDescriptor) {
        switch (sqlTypeDescriptor.getSqlType()) {
            case Types.CLOB:
                return VarcharTypeDescriptor.INSTANCE;
            case Types.BLOB:
                return VarcharTypeDescriptor.INSTANCE;
            case 1111: // 1111 (Types.OTHER) covers Postgres json
                return VarcharTypeDescriptor.INSTANCE;
        }
        return super.remapSqlTypeDescriptor(sqlTypeDescriptor);
    }

    public CustomPostgreSqlDialect() {
        super();
        registerHibernateType(1111, "string");
    }
}
```
and use:
```
<prop key="hibernate.dialect">com.abc.CustomPostgreSqlDialect</prop>
```
For anybody getting this error with an old Hibernate (3.x) version: do not write the return type in capital letters. Hibernate's type implementation mapping uses lowercase return types and does not convert them:
```
CREATE OR REPLACE FUNCTION do_something(param varchar)
RETURNS integer AS
$BODY$
...
```
This is for Hibernate (5.x), calling a database function which returns a JSON string/object. Use the unwrap(org.hibernate.query.NativeQuery.class).addScalar() methods for this.
Example as below (Spring & Hibernate):
```
@PersistenceContext
EntityManager em;

@Override
public String getJson(String strLayerName) {
    String nativeQuery = "select fn_layer_attributes(:layername)";
    return em.createNativeQuery(nativeQuery)
            .setParameter("layername", strLayerName)
            .unwrap(org.hibernate.query.NativeQuery.class)
            .addScalar("fn_layer_attributes", new JsonNodeBinaryType())
            .getSingleResult()
            .toString();
}
```
A function or procedure returning void can cause issues with JPA/Hibernate, so changing it to return integer and calling RETURN 1 at the end of the procedure may solve the problem.
SQL type 1111 is java.sql.Types.OTHER, a type Hibernate cannot map on its own.
If you are calling EntityManager.createNativeQuery(), be sure to include the resulting Java class in the second parameter:
```
return em.createNativeQuery(sql, MyRecord.class).getResultList();
```
After trying many proposed solutions, including:
https://stackoverflow.com/a/59754570/349169 (which is one of the solutions proposed here)
https://vladmihalcea.com/hibernate-no-dialect-mapping-for-jdbc-type/
it was finally this one that fixed everything with the least amount of changes:
https://gist.github.com/agrawald/adad25d28bf6c56a7e4618fe95ee5a39
The trick is to not have @TypeDef on your class, but instead have 2 different @TypeDefs in 2 different package-info.java files: one inside your production code package for your production DB, and one inside your test package for your test H2 DB.
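A minimal sketch of what such a package-info.java could look like, assuming the JsonStringType from the hibernate-types library as in the gist; the package name and registered types are illustrative, and the test variant would register an H2-compatible type under the same name instead:
```
// src/main/java/com/example/domain/package-info.java (names are illustrative)
@TypeDef(name = "json", typeClass = JsonStringType.class, defaultForType = JsonNode.class)
package com.example.domain;

import com.fasterxml.jackson.databind.JsonNode;
import com.vladmihalcea.hibernate.type.json.JsonStringType;
import org.hibernate.annotations.TypeDef;
```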

@Tailable (spring-data-reactive-mongodb) equivalent in spring-data-r2dbc

I am trying my hands on spring-data-r2dbc, on PostgreSQL. I have tried spring-data-mongodb-reactive before, and I couldn't help but compare the two.
I see that query derivation is not yet supported. But I was wondering if there is an equivalent for @Tailable, so that I would be notified of database changes in real time. Can anyone share any code samples with respect to this?
I understand that the underlying database should support this. I believe PostgreSQL does support this kind of thing using Logical Decoding (correct me if I am wrong here).
Is there a @Tailable equivalent in spring-data-r2dbc?
I was on the same issue. Not sure if you found a solution or not, but I was able to accomplish something similar by doing the following. First, I added a trigger to my table:
```
CREATE TRIGGER trigger_name
AFTER INSERT OR DELETE OR UPDATE
ON table_name
FOR EACH ROW
EXECUTE PROCEDURE trigger_function_name;
```
This will set a trigger on the table whenever a row is inserted, updated, or deleted. Then it will call the trigger function I have set up, which looked something like this:
```
CREATE FUNCTION trigger_function_name()
    RETURNS trigger
    LANGUAGE 'plpgsql'
    COST 100
    VOLATILE NOT LEAKPROOF
AS $BODY$
DECLARE
    payload JSON;
BEGIN
    payload = row_to_json(NEW);
    PERFORM pg_notify('notification_name', payload::text);
    RETURN NULL;
END;
$BODY$;
```
This allows me to 'listen' to any of these updates from my Spring Boot project, and the entire row is sent as the payload.
Next, in my Spring Boot project I configured a connection to my DB:
```
@Configuration
@EnableR2dbcRepositories("com.(point to wherever repository is)")
public class R2DBCConfig extends AbstractR2dbcConfiguration {

    @Override
    @Bean
    public ConnectionFactory connectionFactory() {
        return new PostgresqlConnectionFactory(PostgresqlConnectionConfiguration.builder()
                .host("host")
                .database("db")
                .port(port)
                .username("username")
                .password("password")
                .schema("schema")
                .connectTimeout(Duration.ofMinutes(2))
                .build());
    }
}
```
With that, I autowire (dependency injection) it into the constructor of my service class and cast it to the r2dbc PostgresqlConnection class, like so:
```
this.postgresqlConnection = Mono.from(connectionFactory.create()).cast(PostgresqlConnection.class).block();
```
Now we want to 'listen' to our table and get notified when someone performs an update to it. To do that, we set up an initialization method that runs after dependency injection, using the @PostConstruct annotation:
```
@PostConstruct
private void postConstruct() {
    postgresqlConnection.createStatement("LISTEN notification_name").execute()
            .flatMap(PostgresqlResult::getRowsUpdated).subscribe();
}
```
Notice that we listen to whatever name we put inside the pg_notify method. We also want to set up a method that closes the connection when the bean is about to be tossed away, like so:
```
@PreDestroy
private void preDestroy() {
    postgresqlConnection.close().subscribe();
}
```
Now I simply create a method that returns a Flux of whatever is currently in my table and merge it with my notifications. As I said before, the notifications come in as JSON, so I had to deserialize them; I decided to use ObjectMapper. It looks something like this:
```
private Flux<YourClass> getUpdatedRows() {
    return postgresqlConnection.getNotifications().map(notification -> {
        try {
            // deserialize the JSON payload
            return objectMapper.readValue(notification.getParameter(), YourClass.class);
        } catch (IOException e) {
            // handle the exception; the lambda must return or throw, e.g.:
            throw new UncheckedIOException(e);
        }
    });
}

public Flux<YourClass> getDocuments() {
    return documentRepository.findAll().share().concatWith(getUpdatedRows());
}
```
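To round this off, a hedged sketch of exposing that Flux over server-sent events so a client actually receives the live updates; the controller, route, and DocumentService wrapper are illustrative names, not part of the original setup:
```
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class DocumentStreamController {

    // Hypothetical service wrapping the getDocuments() method shown above.
    private final DocumentService documentService;

    public DocumentStreamController(DocumentService documentService) {
        this.documentService = documentService;
    }

    // Server-sent events keep the HTTP response open, so each row emitted via
    // pg_notify reaches the client as it arrives.
    @GetMapping(value = "/documents", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<YourClass> streamDocuments() {
        return documentService.getDocuments();
    }
}
```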
Hope this helps.
Cheers!

Spring batch JdbcBatchItemWriter insert is very slow with MYSQL

I'm using a chunk step with a reader and a writer. I am reading data from Hive with a 50000 chunk size and inserting into MySQL with the same 50000 commit interval.
```
@Bean
public JdbcBatchItemWriter<Identity> writer(DataSource mysqlDataSource) {
    return new JdbcBatchItemWriterBuilder<Identity>()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql(insertSql)
            .dataSource(mysqlDataSource)
            .build();
}
```
When I start the data load and insert into MySQL, it commits very slowly: 100,000 records take more than an hour to load, while the same loader with Gemfire loads 5 million records in 30 minutes.
It seems like it inserts one by one instead of in batches, loading 1500, then 4000, and so on. Has anyone faced the same issue?
Since you are using BeanPropertyItemSqlParameterSourceProvider, a lot of reflection is involved in setting the variables in the prepared statement, which increases the time.
If speed is your high priority, try implementing your own ItemWriter as given below and use a prepared-statement batch to execute the update.
```
@Component
public class CustomWriter implements ItemWriter<Identity> {

    // your SQL statement here
    private static final String SQL = "INSERT INTO table_name (column1, column2, column3, ...) VALUES (?,?,?,?);";

    @Autowired
    private DataSource dataSource;

    @Override
    public void write(List<? extends Identity> list) throws Exception {
        try (Connection connection = dataSource.getConnection();
             PreparedStatement preparedStatement = connection.prepareStatement(SQL)) {
            for (Identity identity : list) {
                // Set the variables
                preparedStatement.setInt(1, identity.getMxx());
                preparedStatement.setString(2, identity.getMyx());
                preparedStatement.setString(3, identity.getMxt());
                preparedStatement.setInt(4, identity.getMxt());
                // Add it to the batch
                preparedStatement.addBatch();
            }
            int[] count = preparedStatement.executeBatch();
        }
    }
}
```
Note: This is rough code, so exception handling is not done fully; you can work on the same. I think this will improve your writing speed very much.
Try adding ";useBulkCopyForBatchInsert=true" to your connection URL:
```
Connection con = DriverManager.getConnection(connectionUrl + ";useBulkCopyForBatchInsert=true");
```
Source: https://learn.microsoft.com/en-us/sql/connect/jdbc/use-bulk-copy-api-batch-insert-operation?view=sql-server-ver15
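Note that useBulkCopyForBatchInsert is an option of the Microsoft SQL Server JDBC driver described in the link above; for MySQL Connector/J, the analogous batching flag is rewriteBatchedStatements=true. A minimal sketch, with a placeholder URL and credentials:
```
import java.sql.Connection;
import java.sql.DriverManager;

// rewriteBatchedStatements=true makes Connector/J rewrite batched INSERTs into
// multi-row statements, which is usually the big win for JdbcBatchItemWriter.
String url = "jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true";
Connection con = DriverManager.getConnection(url, "user", "password");
```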