Spring Boot. Running liquibase changelog after JPA auto-ddl table generation on HSQLDB - jpa

The case is like this:
I have a Liquibase changelog containing only inserts.
I am trying to force Spring Boot to initialize the database (HSQLDB) schema using JPA, based on the @Entity classes, and execute the Liquibase changelog afterwards. Unfortunately, Spring Boot does it in the opposite order.
I checked LiquibaseAutoConfiguration and it has:
@AutoConfigureAfter({ DataSourceAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class })
so it is configured after HibernateJpaAutoConfiguration; however, Spring Boot still does not do it the way I wish ;).
Spring Boot version: 1.3.0.RELEASE
Liquibase-core version: 3.5.1
Thank you in advance for any answer.

A possible solution is to disable Spring Boot's automatic Liquibase run via application.properties:
spring.jpa.hibernate.ddl-auto=create
liquibase.enabled=false
and then manually configure a SpringLiquibase bean that depends on entityManagerFactory:
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.boot.autoconfigure.liquibase.LiquibaseProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.DependsOn;

import liquibase.integration.spring.SpringLiquibase;

@SpringBootApplication
public class DemoApplication {

    @Autowired
    private DataSource dataSource;

    @Bean
    public LiquibaseProperties liquibaseProperties() {
        return new LiquibaseProperties();
    }

    @Bean
    @DependsOn(value = "entityManagerFactory")
    public SpringLiquibase liquibase() {
        LiquibaseProperties liquibaseProperties = liquibaseProperties();
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setChangeLog(liquibaseProperties.getChangeLog());
        liquibase.setContexts(liquibaseProperties.getContexts());
        liquibase.setDataSource(getDataSource(liquibaseProperties));
        liquibase.setDefaultSchema(liquibaseProperties.getDefaultSchema());
        liquibase.setDropFirst(liquibaseProperties.isDropFirst());
        liquibase.setShouldRun(true);
        liquibase.setLabels(liquibaseProperties.getLabels());
        liquibase.setChangeLogParameters(liquibaseProperties.getParameters());
        return liquibase;
    }

    private DataSource getDataSource(LiquibaseProperties liquibaseProperties) {
        if (liquibaseProperties.getUrl() == null) {
            return this.dataSource;
        }
        return DataSourceBuilder.create().url(liquibaseProperties.getUrl())
                .username(liquibaseProperties.getUser())
                .password(liquibaseProperties.getPassword()).build();
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
However, I'd strongly encourage using Liquibase to build the schema as well. I believe it was designed (see org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration.LiquibaseJpaDependencyConfiguration) to run before Hibernate's ddl-auto, so that you can set ddl-auto=validate and have the Liquibase-built schema validated by Hibernate.
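For illustration, here is what a minimal changelog could look like when Liquibase owns both the schema and the data (a sketch: the customer table, column names, and file layout are hypothetical, and the XSD version should match your liquibase-core):

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">

    <!-- DDL: create the table instead of relying on Hibernate ddl-auto -->
    <changeSet id="1" author="demo">
        <createTable tableName="customer">
            <column name="id" type="bigint" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="name" type="varchar(255)"/>
        </createTable>
    </changeSet>

    <!-- DML: the inserts the question started with -->
    <changeSet id="2" author="demo">
        <insert tableName="customer">
            <column name="name" value="Initial customer"/>
        </insert>
    </changeSet>
</databaseChangeLog>

With a changelog like this you can set spring.jpa.hibernate.ddl-auto=validate and let Hibernate verify the Liquibase-built schema against your @Entity mappings instead of creating tables itself.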

The solution provided by Radek Postołowicz served me for quite some time, but it no longer worked after updating to Spring Boot 2.5.0. I think it can be fully replaced by adding the following property to application.properties (or .yml):
spring.jpa.defer-datasource-initialization=true
This is also mentioned in the release notes.

I just updated Spring Boot to 2.5.3 and had the same problem.
I solved the issue with a CustomSpringLiquibase class (Kotlin version):
import liquibase.exception.LiquibaseException
import liquibase.integration.spring.SpringLiquibase
import org.slf4j.LoggerFactory
import org.springframework.beans.factory.BeanNameAware
import org.springframework.beans.factory.InitializingBean
import org.springframework.context.ResourceLoaderAware
import org.springframework.core.io.ResourceLoader

class CustomSpringLiquibase(
    private var springLiquibase: SpringLiquibase
) : InitializingBean, BeanNameAware, ResourceLoaderAware {

    companion object {
        private val LOGGER = LoggerFactory.getLogger(CustomSpringLiquibase::class.java)
    }

    @Throws(LiquibaseException::class)
    override fun afterPropertiesSet() {
        LOGGER.info("Init Liquibase")
        springLiquibase.afterPropertiesSet()
    }

    override fun setBeanName(name: String) {
        springLiquibase.beanName = name
    }

    override fun setResourceLoader(resourceLoader: ResourceLoader) {
        springLiquibase.resourceLoader = resourceLoader
    }
}
And in my SpringBootApplication class I added the following (Java version):

@Bean
@DependsOn(value = "entityManagerFactory")
public CustomSpringLiquibase liquibase() {
    LiquibaseProperties liquibaseProperties = liquibaseProperties();
    SpringLiquibase liquibase = new SpringLiquibase();
    ....
    return new CustomSpringLiquibase(liquibase);
}

You should use Liquibase for DDL statements too. It doesn't make sense to use it solely for DML statements and another solution for DDL; Liquibase is equally well suited for both. It is especially well suited for the case where you develop on one type of database and deploy to another; in fact, Liquibase is database-engine agnostic.
If you need to execute some SQL before Liquibase fires (like creating the schema where Liquibase itself lives), you can use Pre-Liquibase, but that should only be for whatever absolutely cannot live in Liquibase changelogs.
All in all, I would advise against using any of the following:
JPA DDL (meaning the spring.jpa.generate-ddl setting)
Hibernate DDL (meaning the spring.jpa.hibernate.ddl-auto setting)
DataSource initialization (meaning the spring.sql.init.mode setting)
when using Spring Boot and Liquibase. These methods are not guaranteed to fire before Liquibase, and when you have Liquibase you don't need any of them. A properties sketch follows below.
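For example, with Spring Boot 2.5+ property names (a sketch; the changelog path is an assumption, adjust it to your project):

# let Liquibase own both the schema and the data
spring.liquibase.enabled=true
spring.liquibase.change-log=classpath:db/changelog/db.changelog-master.xml
# Hibernate only validates, never creates
spring.jpa.hibernate.ddl-auto=validate
spring.jpa.generate-ddl=false
# no script-based DataSource initialization
spring.sql.init.mode=never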

Related

Is there any way to force spring not to use/create '_class' field in the mapping?

The thing is, on production servers we have a mapping for Elasticsearch with dynamic set to strict. Currently we use a plain REST client to communicate with Elasticsearch; however, we would like to migrate to spring-data-elasticsearch.
Unfortunately, it seems Spring Data forces the use of either _class or @TypeAlias, which also interferes with the mapping itself. Is there any way to use spring-data without _class or @TypeAlias?
OK, I have found a workaround for it.
Be careful using it when your Elasticsearch model uses inheritance.
To solve the problem, create a class like this:
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.elasticsearch.core.convert.MappingElasticsearchConverter;
import org.springframework.data.elasticsearch.core.document.Document;
import org.springframework.data.elasticsearch.core.mapping.ElasticsearchPersistentEntity;
import org.springframework.data.elasticsearch.core.mapping.ElasticsearchPersistentProperty;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.lang.Nullable;

public class CustomMappingEsConverter extends MappingElasticsearchConverter {

    public CustomMappingEsConverter(MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext, GenericConversionService conversionService) {
        super(mappingContext, conversionService);
    }

    @Override
    public Document mapObject(@Nullable Object source) {
        Document target = Document.create();
        if (source != null) {
            this.write(source, target);
        }
        target.remove("_class"); // << workaround to remove the _class field in Elasticsearch
        return target;
    }
}
And register the bean:
@Configuration
public class MappingEsConfiguration {

    @Bean
    @Primary
    public CustomMappingEsConverter customMappingElasticsearchConverter(MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext,
            GenericConversionService genericConversionService) {
        return new CustomMappingEsConverter(mappingContext, genericConversionService);
    }
}
After these changes I was able to use Spring Data without the additional _class field.

Currently this is not possible. There is an open issue for that.
Edit 25.04.2021: this feature will be available from the next version (4.3) on.
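For what it's worth, once 4.3 is out the per-entity switch should look roughly like this (a sketch based on the issue discussion; verify the attribute name against the released API, and the index name and entity are made up):

import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.WriteTypeHint;

// suppresses the _class type hint for this entity only
@Document(indexName = "my-index", writeTypeHint = WriteTypeHint.FALSE)
public class MyEntity {
    private String name;
    // getters/setters omitted
}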

Upgrading from Spring Boot 1.5 to 2.0 - cannot execute UPDATE in a read-only transaction

Updating Spring Boot 1.5 to 2.1.5.
When trying to perform the operation repository.save(entity), it gives the following error:
Caused by: com.impossibl.postgres.jdbc.PGSQLSimpleException: cannot execute UPDATE in a read-only transaction
We use the org.springframework.data.repository.CrudRepository interface to perform the operations.
1) @Transactional(readOnly = false): as I understood, setting read-only mode to false only works as a hint to the sub-layers. How can I check and change the other layers?
@Service
public class ServiceImpl {

    private final Repository repository;

    @Autowired
    public ServiceImpl(Repository repository) {
        this.repository = repository;
    }

    @Transactional(readOnly = false)
    public void operation(Entity entity) {
        repository.save(entity);
    }
}
And the Repository is:

public interface Repository extends CrudRepository<Entity, UUID> {

    @Query("select e from Entity e where lower(e.name) = lower(?1)")
    Entity findByName(String name);
}
build.gradle

buildscript {
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:2.1.5.RELEASE")
    }
}

dependencies {
    runtime("org.springframework.boot:spring-boot-properties-migrator")
    compile("org.springframework.boot:spring-boot-starter-security")
    compile("org.springframework.boot:spring-boot-starter-jersey")
    compile("org.springframework.boot:spring-boot-starter-web")
    compile("org.springframework.boot:spring-boot-starter-thymeleaf")
    compile("org.springframework.boot:spring-boot-starter-data-jpa")
    compile("org.springframework.boot:spring-boot-starter-jetty")
    compile("org.springframework.boot:spring-boot-starter-mail")
    compile("org.springframework.boot:spring-boot-starter-actuator")
    compile("org.quartz-scheduler:quartz:2.3.1")
    compile("com.fasterxml.jackson.dataformat:jackson-dataformat-xml")
    compile("com.fasterxml.jackson.datatype:jackson-datatype-jsr310")
    compile("com.fasterxml.woodstox:woodstox-core:5.2.1")
    compile("org.glassfish.jersey.media:jersey-media-multipart:2.28")
    compile("net.java.dev.msv:msv-core:2013.6.1")
    compile("com.impossibl.pgjdbc-ng:pgjdbc-ng:0.8.2")
    compile('org.apache.commons:commons-lang3:3.9')
    compile('commons-io:commons-io:2.6')
    compile('org.apache.commons:commons-compress:1.18')
    compile('org.apache.poi:poi-ooxml:4.1.0')
    compile('org.apache.xmlbeans:xmlbeans:3.1.0')
    compile('org.mitre.dsmiley.httpproxy:smiley-http-proxy-servlet:1.10')
    compile('com.monitorjbl:xlsx-streamer:2.1.0')
    compile('com.zaxxer:HikariCP:3.3.1')
}
application.properties
spring.datasource.driverClassName=com.impossibl.postgres.jdbc.PGDriver
spring.datasource.url=
spring.datasource.username=
spring.datasource.password=
spring.datasource.type=com.zaxxer.hikari.HikariDataSource
spring.datasource.hikari.idle-timeout=10000
# Set auto-commit = false, otherwise - Caused by: java.sql.SQLException:
# Clobs require connection to be in manual-commit mode...
spring.datasource.hikari.auto-commit=false
logging.level.ROOT=INFO
logging.level.org.springframework.orm.jpa=DEBUG
logging.level.org.springframework.transaction=DEBUG
One important thing is that I set auto-commit to false in Hikari, otherwise it would fail with the exception shown in the comment above.
Note: in some threads it was suggested to check the Postgres connection:
show default_transaction_read_only;
default_transaction_read_only
-------------------------------
off
SELECT pg_is_in_recovery();
pg_is_in_recovery
-------------------
f
Thanks in advance.
The property readOnly is false by default, so you should never use @Transactional(readOnly = false); use @Transactional instead.
When you mark a method or class with @Transactional, Spring creates a proxy of that class to inject the Transaction Manager logic. It uses a bean that implements the interface org.springframework.transaction.PlatformTransactionManager.
In your specific case a bean of org.springframework.orm.jpa.JpaTransactionManager will be created.
Spring Boot uses Hibernate as the default JPA provider, so eventually all transaction logic affects Hibernate. E.g. readOnly = true is used to disable the "dirty checking" mechanism that performs the UPDATE operations in Hibernate.
By default, the Spring Transaction Manager creates a new Hibernate Session (a new transaction) when it calls a method marked with @Transactional and no Session is attached to the current thread. All the following calls in the current thread will then use the same Session (and the same transaction), unless you change the propagation property.
This all means that the configuration for the transaction is set when Spring calls the first @Transactional method, and that configuration is used for all method calls in the same thread. See the code example:
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionSynchronizationManager;

@Service
public class ServiceA {

    @Transactional(readOnly = true)
    public void a() {
        boolean isReadOnly = TransactionSynchronizationManager.isCurrentTransactionReadOnly();
        System.out.println(isReadOnly);
    }
}

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ServiceB {

    private final ServiceA serviceA;

    public ServiceB(ServiceA serviceA) {
        this.serviceA = serviceA;
    }

    @Transactional
    public void b() {
        serviceA.a();
    }
}
serviceA.a() will print true
serviceB.b() will print false
See the Spring Boot 2.0 Migration Guide about Gradle and add the dependency management plugin:
Spring Boot’s Gradle plugin no longer automatically applies the dependency management plugin. Instead, Spring Boot’s plugin now reacts to the dependency management plugin being applied by importing the correct version of the spring-boot-dependencies BOM. This gives you more control over how and when dependency management is configured.
For most applications applying the dependency management plugin will be sufficient:
apply plugin: 'org.springframework.boot'
apply plugin: 'io.spring.dependency-management' // <-- add this to your build.gradle
You can also remove spring.datasource.type:
If you used spring.datasource.type to force the use of Hikari in a Tomcat-based application, you can now remove that override.
Notice also that the minimum Hibernate version is now 5.2.
I also see you added spring-boot-properties-migrator; note that it should be removed once you finish the migration tweaks:
Once you’re done with the migration, please make sure to remove this module from your project’s dependencies.

Problems while connecting to two MongoDBs via Spring

I'm trying to connect to two different MongoDBs with Spring (1.5.2; we included Spring in an internal framework, therefore it is not on the latest version yet), and this already works partially but not fully. More precisely, I found a strange behavior which I will describe below, after showing my setup.
So this is what I have done so far:
Project structure
backend
config
domain
customer
internal
repository
customer
internal
service
In config I have my Mongo configurations.
I created one base class which extends AbstractMongoConfiguration. This class holds fields for database, host, etc., which are filled with the properties from an application.yml. It also holds a couple of methods for creating a MongoClient and a SimpleMongoDbFactory; a hypothetical sketch of it follows below.
Furthermore, there are two custom configuration classes, one for each MongoDB. Both extend the base class.
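The base class itself is not shown in the question; here is a minimal sketch of what it could look like, with field names and helper signatures assumed from the description above (Spring Data MongoDB 1.x / mongo-java-driver 3.x era):

import java.util.Collections;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public abstract class BaseMongoConfig extends AbstractMongoConfiguration {

    // bound from application.yml via @ConfigurationProperties on the subclasses
    private String host;
    private int port;
    private String database;
    private String username;
    private String password;

    protected ServerAddress getAddress() {
        return new ServerAddress(host, port);
    }

    protected MongoCredential getCredentials() {
        return MongoCredential.createCredential(username, database, password.toCharArray());
    }

    protected MongoClient getMongoClient(ServerAddress address, MongoCredential credential) {
        return new MongoClient(address, Collections.singletonList(credential));
    }

    protected SimpleMongoDbFactory getSimpleMongoDbFactory(MongoClient client, String databaseName) {
        return new SimpleMongoDbFactory(client, databaseName);
    }

    // required by AbstractMongoConfiguration in this Spring Data generation
    @Override
    public MongoClient mongo() {
        return getMongoClient(getAddress(), getCredentials());
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }

    // getters and setters for the bound fields omitted
}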
Here is how they are coded:
Primary Connection
@Primary
@EntityScan(basePackages = "backend.domain.customer")
@Configuration
@EnableMongoRepositories(
        basePackages = {"backend.repository.customer"},
        mongoTemplateRef = "customerDataMongoTemplate")
@ConfigurationProperties(prefix = "customer.mongodb")
public class CustomerDataMongoConnection extends BaseMongoConfig {

    public static final String TEMPLATE_NAME = "customerDataMongoTemplate";

    @Override
    @Bean(name = CustomerDataMongoConnection.TEMPLATE_NAME)
    public MongoTemplate mongoTemplate() {
        MongoClient client = getMongoClient(getAddress(), getCredentials());
        SimpleMongoDbFactory factory = getSimpleMongoDbFactory(client, getDatabaseName());
        return new MongoTemplate(factory);
    }
}
The second configuration class looks pretty similar. Here it is:
@EntityScan(basePackages = "backend.domain.internal")
@Configuration
@EnableMongoRepositories(
        basePackages = {"backend.repository.internal"},
        mongoTemplateRef = InternalDataMongoConnection.TEMPLATE_NAME)
@ConfigurationProperties(prefix = "internal.mongodb")
public class InternalDataMongoConnection extends BaseMongoConfig {

    public static final String TEMPLATE_NAME = "internalDataMongoTemplate";

    @Override
    @Bean(name = InternalDataMongoConnection.TEMPLATE_NAME)
    public MongoTemplate mongoTemplate() {
        MongoClient client = getMongoClient(getAddress(), getCredentials());
        SimpleMongoDbFactory factory = getSimpleMongoDbFactory(client, getDatabaseName());
        return new MongoTemplate(factory);
    }
}
As you can see, I use @EnableMongoRepositories to define which repository should use which connection.
My repositories are defined just as described in the Spring documentation.
However, here is one example, located in the package backend.repository.customer:
public interface ContactHistoryRepository extends MongoRepository<ContactHistoryEntity, String> {

    ContactHistoryEntity findById(String id);
}
The problem is that my backend always uses only the primary connection with this setup. Interestingly, when I remove the bean name for the MongoTemplate (just @Bean), the backend then uses the secondary connection (InternalDataMongoConnection). This is true for all defined repositories.
My question is: how can I make my backend really take care of both connections? Have I missed another parameter/configuration?
Since this is a pretty extensive post, I apologise if I forgot to mention something. Please ask for missing information in the comments.
I found the answer.
In my package structure there was an empty configuration class (from a colleague) annotated with @Configuration and @EnableMongoRepositories. This triggered the automatic wiring process of Spring Data and therefore led to the problems I reported above.
I simply deleted the class and now it works as it should!

Flyway Spring Boot Autowired Beans with JPA Dependency

I am using Flyway 5.0.5 and I am unable to create a Java migration (SpringJdbcMigration) with autowired properties... they end up null.
The closest thing I can find is this question: Spring beans are not injected in flyway java based migration.
The answer mentions it being fixed in Flyway 5, but the links are dead.
What am I missing?
I struggled with this for a long time due to my JPA dependency, so I am going to edit the title of my question slightly to reflect this...
@Autowired beans are instantiated from the ApplicationContext. We can create a different bean that is ApplicationContextAware and use it to "manually wire" our beans for use in migrations.
A quite clean approach can be found here. Unfortunately, it throws an uncaught exception (specifically, ApplicationContext is null) when using JPA. Luckily, we can solve this by using the @DependsOn annotation to force Flyway to run after the ApplicationContext has been set.
First we'll need the SpringUtility from avehlies/spring-beans-flyway2 above.
package com.mypackage;

import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

@Component
public class SpringUtility implements ApplicationContextAware {

    // set by Spring via ApplicationContextAware (@Autowired does not work on static fields)
    private static ApplicationContext applicationContext;

    @Override
    public void setApplicationContext(final ApplicationContext applicationContext) {
        SpringUtility.applicationContext = applicationContext;
    }

    /*
     * Get a bean from the application context
     */
    public static <T> T getBean(final Class<T> clazz) {
        return applicationContext.getBean(clazz);
    }

    /*
     * Return the application context if necessary for anything else
     */
    public static ApplicationContext getContext() {
        return applicationContext;
    }
}
Then, configure a flywayInitializer with @DependsOn("springUtility"). I extended FlywayAutoConfiguration here, hoping to keep the auto-configuration functionality. This mostly seems to have worked for me, except that turning off Flyway in my build.gradle file no longer works, so I had to add @Profile("!integration") to prevent it from running during my tests. Other than that, the auto-configuration seems to work for me, but admittedly I've only run one migration. Hopefully someone will correct me if I am wrong.
package com.mypackage;

import org.flywaydb.core.Flyway;
import org.springframework.boot.autoconfigure.flyway.FlywayMigrationInitializer;
import org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration.FlywayConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.DependsOn;

@Configuration
@Profile("!integration")
class MyFlywayConfiguration extends FlywayConfiguration {

    @Primary
    @Bean(name = "flywayInitializer")
    @DependsOn("springUtility")
    public FlywayMigrationInitializer flywayInitializer(Flyway flyway) {
        return super.flywayInitializer(flyway);
        //return new FlywayMigrationInitializer(flyway, null);
    }
}
And just to complete the example, here is a migration:
package db.migration;

import org.flywaydb.core.api.migration.spring.BaseSpringJdbcMigration;
import org.springframework.jdbc.core.JdbcTemplate;
import com.mypackage.repository.AccountRepository;
import com.mypackage.domain.Account;
import com.mypackage.SpringUtility;
import java.util.List;

public class V2__account_name_ucase_firstname extends BaseSpringJdbcMigration {

    private AccountRepository accountRepository = SpringUtility.getBean(AccountRepository.class);

    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        List<Account> accounts = accountRepository.findAll();
        for (Account account : accounts) {
            String firstName = account.getFirstName();
            account.setFirstName(firstName.substring(0, 1).toUpperCase() + firstName.substring(1));
            account = accountRepository.save(account);
        }
    }
}
Thanks to avehlies on GitHub, Andy Wilkinson on Stack Overflow, and OldIMP on GitHub for helping me along the way.
In case you are using a more recent version of Flyway, extend BaseJavaMigration instead of BaseSpringJdbcMigration, as the latter is deprecated. Also, take a look at the two comments below by the user Wim Deblauwe. A sketch of such a migration follows.
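A minimal sketch of the newer-API equivalent of the migration above (my own rewrite, not tested against the question's project; the table and column names are made up):

package db.migration;

import org.flywaydb.core.api.migration.BaseJavaMigration;
import org.flywaydb.core.api.migration.Context;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.SingleConnectionDataSource;

public class V3__account_name_ucase_firstname extends BaseJavaMigration {

    @Override
    public void migrate(Context context) throws Exception {
        // Flyway hands the migration its own JDBC connection; wrap it for convenience
        JdbcTemplate jdbcTemplate = new JdbcTemplate(
                new SingleConnectionDataSource(context.getConnection(), true));
        jdbcTemplate.execute(
                "UPDATE account SET first_name = UPPER(SUBSTRING(first_name, 1, 1)) || SUBSTRING(first_name, 2)");
    }
}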
The functionality hasn't made it into Flyway yet. It's being tracked by this issue. At the time of writing that issue is open and assigned to the 5.1.0 milestone.
It seems the updated answer provided by @mararn1618 is under-documented in the official documentation, so I will provide a working setup here. Thanks to @mararn1618 for pointing in that direction.
Disclaimer: it's written in Kotlin :)
First you need a configuration that loads the migration classes. In Spring Boot (and perhaps plain Spring) you need either an implementation of FlywayConfigurationCustomizer or a setup of FlywayAutoConfiguration.FlywayConfiguration. Only the first is tested, but both should work.
Configuration A (tested):
import org.flywaydb.core.api.configuration.FluentConfiguration
import org.flywaydb.core.api.migration.JavaMigration
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.autoconfigure.flyway.FlywayConfigurationCustomizer
import org.springframework.context.ApplicationContext
import org.springframework.stereotype.Component

@Component
class MyFlywayConfiguration @Autowired constructor(
    val applicationContext: ApplicationContext
) : FlywayConfigurationCustomizer {

    override fun customize(configuration: FluentConfiguration?) {
        val migrationBeans = applicationContext.getBeansOfType(JavaMigration::class.java)
        val migrationBeansAsArray = migrationBeans.values.toTypedArray()
        configuration?.javaMigrations(*migrationBeansAsArray)
    }
}
Configuration B (untested, but it should also work):
import org.flywaydb.core.api.migration.JavaMigration
import org.springframework.boot.autoconfigure.flyway.FlywayAutoConfiguration
import org.springframework.boot.autoconfigure.flyway.FlywayConfigurationCustomizer
import org.springframework.context.ApplicationContext
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration

@Configuration
class MyFlywayConfiguration : FlywayAutoConfiguration.FlywayConfiguration() {

    @Bean
    fun flywayConfigurationCustomizer(applicationContext: ApplicationContext): FlywayConfigurationCustomizer {
        return FlywayConfigurationCustomizer { flyway ->
            val p = applicationContext.getBeansOfType(JavaMigration::class.java)
            val v = p.values.toTypedArray()
            flyway.javaMigrations(*v)
        }
    }
}
And with that you can just write your migrations as almost any other Spring bean:
import org.flywaydb.core.api.migration.BaseJavaMigration
import org.flywaydb.core.api.migration.Context
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.stereotype.Component

@Component
class V7_1__MyMigration @Autowired constructor(
) : BaseJavaMigration() {

    override fun migrate(context: Context?) {
        TODO("go crazy, mate, now you can import beans, but be aware of circular dependencies")
    }
}
Side notes:
Be careful with circular dependencies: your migrations most likely cannot depend on repositories (which also makes sense; you are preparing them, after all).
Make sure your migrations are located where Spring scans for classes. So if you want to place them in the namespace db/migrations, you need to ensure that Spring scans that location.
I haven't tested it, but one should likely be cautious about mixing the path of these migrations with the locations where Flyway scans for migrations.
Flyway 6.5.5 is the current release, and I believe support for Spring beans has been available since 6.0.0.
You can directly autowire Spring beans into your Java-based migrations (using @Autowired), but the catch is that your migration class also has to be managed by Spring for the dependencies to be resolved.
There is a cool and simple way to do it, by overriding the default behavior of Flyway; check out https://reflectoring.io/database-migration-spring-boot-flyway/
The article clearly answers your question with code snippets.
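I haven't re-checked exactly which hook the article uses, but for reference, Spring Boot exposes a FlywayMigrationStrategy bean that lets you wrap or replace the default migration run (a sketch of the hook itself, not taken from the article):

import org.springframework.boot.autoconfigure.flyway.FlywayMigrationStrategy;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywayStrategyConfig {

    @Bean
    public FlywayMigrationStrategy flywayMigrationStrategy() {
        return flyway -> {
            // custom logic can go before or after the standard run
            flyway.migrate();
        };
    }
}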
If you are using deltaspike you can use BeanProvider to get a reference to your DAO.
Change your DAO code:
public static UserDao getInstance() {
    return BeanProvider.getContextualReference(UserDao.class, false, new DaoLiteral());
}
Then in your migration method:
UserDao userdao = UserDao.getInstance();
And there you've got your reference.
(referenced from: Flyway Migration with java)

What steps are required to add support for Phoenix to ActiveJDBC?

I am trying to add some support for Apache Phoenix to ActiveJDBC. I am using the ActiveJDBC simple-example project as a test, and making changes to a clone of ActiveJDBC 2.0-SNAPSHOT (latest from GitHub).
So far in ActiveJDBC 2.0-SNAPSHOT I have:
created a PhoenixDialect class in org.javalite.activejdbc.dialects to override the insert method (Phoenix uses UPSERT)
added an if stanza to the getDialect(String dbType) method in Configuration
In the simple-example project I have:
added the phoenix-client as a dependency (we are using Phoenix as part of HortonWorks HDP 2.5.3.0 on HBase 1.1.2.2.5)
set database.properties with Phoenix values
created the relevant tables in Phoenix manually (db-migrate does not work, for obvious reasons)
However, the database dialect is not being recognized and is, I believe, defaulting to the DefaultDialect, as I get a Phoenix error on the use of "INSERT", which is not part of the Phoenix grammar.
Are there additional steps I am missing when adding support for an additional dialect?
I also suspect the Phoenix JDBC driver may not support a getDbName()-type method; the Phoenix driver, when asked for getPropertyInfo(), returns EMPTY_INFO (see PhoenixEmbeddedDriver).
If the driver does not return the DbName, is there a workaround?
It might be worth mentioning that we are successfully interacting with Phoenix using standard Java JDBC classes (PreparedStatement and all that good stuff), but ActiveJDBC is much more elegant and we would like to use it.
Pieces of what we have so far:
PhoenixDialect
import java.util.Iterator;
import java.util.Map;
import org.javalite.activejdbc.MetaModel;
import static org.javalite.common.Util.join;

public class PhoenixDialect extends DefaultDialect {

    @Override
    public String insert(MetaModel metaModel, Map<String, Object> attributes) {
        StringBuilder query = new StringBuilder().append("UPSERT INTO ").append(metaModel.getTableName()).append(' ');
        if (attributes.isEmpty()) {
            appendEmptyRow(metaModel, query);
        } else {
            boolean addIdGeneratorCode = (metaModel.getIdGeneratorCode() != null
                    && attributes.get(metaModel.getIdName()) == null); // do not use containsKey
            query.append('(');
            if (addIdGeneratorCode) {
                query.append(metaModel.getIdName()).append(", ");
            }
            join(query, attributes.keySet(), ", ");
            query.append(") VALUES (");
            if (addIdGeneratorCode) {
                query.append(metaModel.getIdGeneratorCode()).append(", ");
            }
            Iterator<Object> it = attributes.values().iterator();
            appendValue(query, it.next());
            while (it.hasNext()) {
                query.append(", ");
                appendValue(query, it.next());
            }
            query.append(')');
        }
        return query.toString();
    }
}
Configuration
public Dialect getDialect(String dbType) {
    Dialect dialect = dialects.get(dbType);
    if (dialect == null) {
        if (dbType.equalsIgnoreCase("Oracle")) {
            dialect = new OracleDialect();
        }
        else if (dbType.equalsIgnoreCase("Phoenix")) {
            dialect = new PhoenixDialect();
        }
        else if (dbType.equalsIgnoreCase("MySQL")) {
            dialect = new MySQLDialect();
        }
        // ... (rest of the method unchanged)
database.properties
development.driver=org.apache.phoenix.jdbc.PhoenixDriver
development.username=anything
development.password=anything
development.url=jdbc:phoenix:hdp-c21:2181:/hbase-unsecure
Here is a branch that was used to integrate SQL Server with a new dialect, a test suite and other related stuff:
https://github.com/javalite/activejdbc/tree/sql_server_integration
Here is a branch for H2:
https://github.com/javalite/activejdbc/commits/h2integration
Things may have changed since then, but these branches will give you good guidance. It is best if you fork the project and, when done, submit your work as a pull request.