What steps are required to add support for Phoenix to ActiveJDBC? - apache-phoenix

I am trying to add some support for Apache Phoenix to ActiveJDBC. I am using the ActiveJDBC simple-example project as a test, and making changes to a clone of ActiveJDBC 2.0-SNAPSHOT (latest from GitHub).
So far in ActiveJDBC 2.0-SNAPSHOT I have:
- created a PhoenixDialect class in org.javalite.activejdbc.dialects to override the insert method (Phoenix uses UPSERT)
- added an if stanza to the getDialect(String dbType) method in Configuration
In the simple-example project I have:
- added the phoenix-client as a dependency (we are using Phoenix as part of HortonWorks HDP 2.5.3.0 on HBase 1.1.2.2.5)
- set database.properties with Phoenix values
- created the relevant tables in Phoenix manually (db-migrate does not work for obvious reasons)
However, the database dialect is not being recognized and, I believe, is defaulting to the DefaultDialect: I get a Phoenix error on the use of "INSERT", which does not exist in the Phoenix grammar.
Are there additional steps I am missing when adding support for an additional dialect?
I also suspect the Phoenix JDBC driver may not support a getDbName()-type method; when asked for getPropertyInfo(), the Phoenix driver returns EMPTY_INFO (see PhoenixEmbeddedDriver).
If the driver does not return the DbName, is there a workaround?
It might be worth mentioning we are successfully interacting with Phoenix using standard Java jdbc classes (PreparedStatement and all that good stuff), but ActiveJDBC is much more elegant and we would like to use it.
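A quick way to see what the Phoenix driver actually reports about itself (which is presumably what any dialect auto-detection has to key off) is plain JDBC; this is just a throwaway diagnostic class, with the URL taken from our database.properties below:
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class PhoenixMetadataCheck {
    public static void main(String[] args) throws Exception {
        // Register the Phoenix driver explicitly; older drivers are not always auto-discovered
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:hdp-c21:2181:/hbase-unsecure")) {
            DatabaseMetaData md = conn.getMetaData();
            // Whatever prints here is the name a framework would have to match on
            System.out.println("Product name: " + md.getDatabaseProductName());
            System.out.println("Driver name:  " + md.getDriverName());
        }
    }
}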
Pieces of what we have so far:
PhoenixDialect
package org.javalite.activejdbc.dialects;

import java.util.Iterator;
import java.util.Map;

import org.javalite.activejdbc.MetaModel;

import static org.javalite.common.Util.join;

public class PhoenixDialect extends DefaultDialect {

    @Override
    public String insert(MetaModel metaModel, Map<String, Object> attributes) {
        StringBuilder query = new StringBuilder().append("UPSERT INTO ").append(metaModel.getTableName()).append(' ');
        if (attributes.isEmpty()) {
            appendEmptyRow(metaModel, query);
        } else {
            boolean addIdGeneratorCode = (metaModel.getIdGeneratorCode() != null
                    && attributes.get(metaModel.getIdName()) == null); // do not use containsKey
            query.append('(');
            if (addIdGeneratorCode) {
                query.append(metaModel.getIdName()).append(", ");
            }
            join(query, attributes.keySet(), ", ");
            query.append(") VALUES (");
            if (addIdGeneratorCode) {
                query.append(metaModel.getIdGeneratorCode()).append(", ");
            }
            Iterator<Object> it = attributes.values().iterator();
            appendValue(query, it.next());
            while (it.hasNext()) {
                query.append(", ");
                appendValue(query, it.next());
            }
            query.append(')');
        }
        return query.toString();
    }
}
Configuration
public Dialect getDialect(String dbType) {
    Dialect dialect = dialects.get(dbType);
    if (dialect == null) {
        if (dbType.equalsIgnoreCase("Oracle")) {
            dialect = new OracleDialect();
        }
        else if (dbType.equalsIgnoreCase("Phoenix")) {
            dialect = new PhoenixDialect();
        }
        else if (dbType.equalsIgnoreCase("MySQL")) {
            dialect = new MySQLDialect();
        }
        // ... the remaining dialects and the rest of the method are unchanged
database.properties
development.driver=org.apache.phoenix.jdbc.PhoenixDriver
development.username=anything
development.password=anything
development.url=jdbc:phoenix:hdp-c21:2181:/hbase-unsecure
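For completeness, the smoke test we intend to run once the dialect resolves is just the ordinary ActiveJDBC flow. This is only a sketch: Person stands in for any instrumented Model subclass mapped to a table we created manually in Phoenix, and the connection values mirror database.properties above.
import org.javalite.activejdbc.Base;

public class PhoenixSmokeTest {
    public static void main(String[] args) {
        // Connection values mirror database.properties above
        Base.open("org.apache.phoenix.jdbc.PhoenixDriver",
                "jdbc:phoenix:hdp-c21:2181:/hbase-unsecure", "anything", "anything");
        try {
            // Person stands in for an instrumented Model subclass from the project
            Person p = new Person();
            p.set("first_name", "John").set("last_name", "Doe");
            p.saveIt(); // with PhoenixDialect in effect this should render as UPSERT INTO ...
        } finally {
            Base.close();
        }
    }
}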

Here is a branch that was used to integrate SQL Server with a new Dialect, a test suite and other related stuff:
https://github.com/javalite/activejdbc/tree/sql_server_integration
Here is a branch for h2:
https://github.com/javalite/activejdbc/commits/h2integration
Things may have changed since then, but these branches will give you good guidance. It is best to fork the project and, when done, submit your work as a pull request.

Related

Is there any way to force Spring not to use/create the '_class' field in the mapping?

The thing is, on production servers we have an Elasticsearch mapping with dynamic set to strict. Currently we use a REST-level client to communicate with Elasticsearch; however, we would like to migrate to spring-data-elasticsearch.
Unfortunately, it seems Spring Data forces the use of either _class or @TypeAlias, which also interferes with the mapping itself. Is there any way to use spring-data without _class or @TypeAlias?
OK, I have found a workaround for it.
Be careful using it when your Elasticsearch model uses inheritance.
To solve this problem, create a class like this:
public class CustomMappingEsConverter extends MappingElasticsearchConverter {

    public CustomMappingEsConverter(MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext, GenericConversionService conversionService) {
        super(mappingContext, conversionService);
    }

    @Override
    public Document mapObject(@Nullable Object source) {
        Document target = Document.create();
        if (source != null) {
            this.write(source, target);
        }
        target.remove("_class"); // << workaround to remove the _class field in elasticsearch
        return target;
    }
}
And register the bean:
@Configuration
public class MappingEsConfiguration {

    @Bean
    @Primary
    public CustomMappingEsConverter CustomMappingElasticsearchConverter(MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext,
            GenericConversionService genericConversionService) {
        return new CustomMappingEsConverter(mappingContext, genericConversionService);
    }
}
After these changes I was able to use Spring Data without the additional _class field.
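To illustrate, with the converter above registered, an ordinary entity can be defined as usual; the entity and index name here are purely illustrative:
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;

@Document(indexName = "articles") // illustrative index name
public class Article {
    @Id
    private String id;
    private String title;
    // getters and setters omitted
}
Saving such an entity through ElasticsearchOperations.save(...) should then produce an indexed source without the _class field.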
Currently this is not possible. There is an open issue for that.
Edit 25.04.2021:
this feature will be available from the next version (4.3) on.

Spring Boot: running Liquibase changelog after JPA auto-ddl table generation on HSQLDB

The case is like this.
I have a Liquibase changelog containing only inserts.
I am trying to force Spring Boot to initialize the database (HSQLDB) schema using JPA based on @Entities and execute the Liquibase changelog afterwards. Unfortunately, Spring Boot is doing it in the opposite order.
I checked LiquibaseAutoConfiguration and it has:
@AutoConfigureAfter({ DataSourceAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class })
so it is executed after HibernateJpaAutoConfiguration; however, Spring Boot still does not do it the way I wish ;).
Spring Boot version: 1.3.0.RELEASE
Liquibase-core version: 3.5.1
Thank you in advance for any answer.
A possible solution is to disable the automatic Liquibase run on boot via application.properties:
spring.jpa.hibernate.ddl-auto=create
liquibase.enabled=false
and then manually configure a SpringLiquibase bean that depends on entityManagerFactory:
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.boot.autoconfigure.liquibase.LiquibaseProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.DependsOn;

import liquibase.integration.spring.SpringLiquibase;

@SpringBootApplication
public class DemoApplication {

    @Autowired
    private DataSource dataSource;

    @Bean
    public LiquibaseProperties liquibaseProperties() {
        return new LiquibaseProperties();
    }

    @Bean
    @DependsOn(value = "entityManagerFactory")
    public SpringLiquibase liquibase() {
        LiquibaseProperties liquibaseProperties = liquibaseProperties();
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setChangeLog(liquibaseProperties.getChangeLog());
        liquibase.setContexts(liquibaseProperties.getContexts());
        liquibase.setDataSource(getDataSource(liquibaseProperties));
        liquibase.setDefaultSchema(liquibaseProperties.getDefaultSchema());
        liquibase.setDropFirst(liquibaseProperties.isDropFirst());
        liquibase.setShouldRun(true);
        liquibase.setLabels(liquibaseProperties.getLabels());
        liquibase.setChangeLogParameters(liquibaseProperties.getParameters());
        return liquibase;
    }

    private DataSource getDataSource(LiquibaseProperties liquibaseProperties) {
        if (liquibaseProperties.getUrl() == null) {
            return this.dataSource;
        }
        return DataSourceBuilder.create().url(liquibaseProperties.getUrl())
                .username(liquibaseProperties.getUser())
                .password(liquibaseProperties.getPassword()).build();
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
However, I'd strongly encourage using Liquibase to build the schema as well. I believe it was designed (see org.springframework.boot.autoconfigure.liquibase.LiquibaseAutoConfiguration.LiquibaseJpaDependencyConfiguration) to run before Hibernate's ddl-auto, so that you can set ddl-auto=validate and have the Liquibase-built schema validated by Hibernate.
The solution provided by Radek Postołowicz served me for quite some time, but didn't work anymore after updating to Spring Boot 2.5.0. I think it can be fully replaced by adding the following property to application.properties (or .yml):
spring.jpa.defer-datasource-initialization=true
This is also mentioned in the release notes.
I just updated Spring Boot to 2.5.3 and have the same problem.
I solved the issue by using a CustomSpringLiquibase class (Kotlin version):
class CustomSpringLiquibase(
    private var springLiquibase: SpringLiquibase
) : InitializingBean, BeanNameAware, ResourceLoaderAware {

    companion object {
        private val LOGGER = LoggerFactory.getLogger(CustomSpringLiquibase::class.java)
    }

    @Throws(LiquibaseException::class)
    override fun afterPropertiesSet() {
        LOGGER.info("Init Liquibase")
        springLiquibase.afterPropertiesSet()
    }

    override fun setBeanName(name: String) {
        springLiquibase.beanName = name
    }

    override fun setResourceLoader(resourceLoader: ResourceLoader) {
        springLiquibase.resourceLoader = resourceLoader
    }
}
And in my SpringBootApplication class I added the following (Java Version):
@Bean
@DependsOn(value = "entityManagerFactory")
public CustomSpringLiquibase liquibase() {
    LiquibaseProperties liquibaseProperties = liquibaseProperties();
    SpringLiquibase liquibase = new SpringLiquibase();
    // .... (same SpringLiquibase setup as in the earlier answer)
    return new CustomSpringLiquibase(liquibase);
}
You should use Liquibase for DDL statements too. It doesn't make sense to use it solely for DML statements and then use another solution for DDL. Liquibase is equally well suited for either. And Liquibase is especially well-suited for the case where you develop on one type of database and deploy to another. In fact Liquibase is database engine agnostic.
If you need to execute some SQL before Liquibase fires (like creating the schema where Liquibase itself lives) then you can use Pre-Liquibase, but that should only be for whatever it is that absolutely cannot be in Liquibase changelogs.
All in all, I would advise against using any of the following:
- JPA DDL (meaning the spring.jpa.generate-ddl setting)
- Hibernate DDL (meaning the spring.jpa.hibernate.ddl-auto setting)
- DataSource initialization (meaning the spring.sql.init.mode setting)
when using Spring Boot and Liquibase. The above methods are not guaranteed to fire before Liquibase. When you have Liquibase you don't need any of the above methods.
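For example, a single formatted-SQL changelog can carry both the DDL and the seed inserts, so the ordering problem disappears entirely; the table and column names below are purely illustrative:
--liquibase formatted sql

--changeset demo:1
CREATE TABLE customer (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    name VARCHAR(255) NOT NULL
);

--changeset demo:2
INSERT INTO customer (name) VALUES ('Initial customer');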

UnitTest FluentNhibernate using PostgreSQLConfiguration

When setting up our new architecture I followed a guide which used NHibernate with MsSql2008 configuration.
We are not using MsSql2008; we are using PostgreSQL instead. The configuration for this all works great and it saves to the database etc.
I am trying to write a unit test to test the UoW but I can't get the InMemory configuration to work.
The guide that I followed used this following Provider:
public class InMemoryNHibernateConfigurationProvider : NHibernateConfigurationProvider
{
    public override Configuration GetDatabaseConfiguration()
    {
        var databaseDriver = SQLiteConfiguration.Standard.InMemory().ShowSql();
        return CreateCoreDatabaseConfiguration(databaseDriver);
    }

    public static void InitialiseDatabase(Configuration configuration, ISession session)
    {
        new SchemaExport(configuration).Execute(true, true, false, session.Connection, Console.Out);
    }
}
My standard (Non UnitTest) configuration looks like this:
public abstract class NHibernateConfigurationProvider : INHibernateConfigurationProvider
{
    public abstract Configuration GetDatabaseConfiguration();

    public Configuration CreateCoreDatabaseConfiguration(
        IPersistenceConfigurer databaseDriver,
        Action<Configuration> databaseBuilder = null)
    {
        var fluentConfiguration =
            Fluently.Configure()
                .Database(databaseDriver)
                .Mappings(m => m.AutoMappings.Add(AutoMap.AssemblyOf<Organisation>(new DefaultMappingConfiguration())
                    //.Conventions.AddFromAssemblyOf<IdGenerationConvention>()
                    .UseOverridesFromAssemblyOf<OrganisationMappingOverride>()));

        if (databaseBuilder != null)
        {
            fluentConfiguration.ExposeConfiguration(databaseBuilder);
        }

        return fluentConfiguration.BuildConfiguration();
    }
}
public class PostgreSQLServerNHibernateConfigurationProvider : NHibernateConfigurationProvider
{
    private static readonly string NpgsqlConnectionString = ConfigurationManager.ConnectionStrings["ProdDBConnection"].ConnectionString;

    public override Configuration GetDatabaseConfiguration()
    {
        return CreateCoreDatabaseConfiguration(
            PostgreSQLConfiguration.Standard.ConnectionString(NpgsqlConnectionString).
                Dialect("NHibernate.Dialect.PostgreSQL82Dialect").ShowSql(),
            BuildDatabase);
    }

    ....... // Other Methods etc
}
How do I write an InMemoryConfigurationProvider that tests using PostgreSQLConfiguration instead of SQLiteConfiguration? PostgreSQLConfiguration does not have an InMemory option.
Do I implement a configuration that creates another database and just drops it on teardown? Or is there perhaps another way of doing it?
Using SQLite works really well, and although it does have some differences from SQL Server (which we use), they are so minor it doesn't matter for testing purposes.
With that said, this is how we set up the tests:
All test cases where we want to write/read from the db extend the SqLiteTestBaseclass. That way they all get access to a session created by the base setup method, and can set up the DAOs / repositories as needed.
Using this approach we also always get a fresh new db for each test case.
Update:
After trying this out a bit more I actually found that you have to modify it a bit to use InMemory (we had previously used sqlite backed by a file on disk instead). So the updated (complete) setup looks like this:
private Configuration _savedConfig;

[SetUp]
public void BaseSetup()
{
    FluentConfiguration configuration =
        Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory)
            .ExposeConfiguration(
                x => x.SetInterceptor(new MultiTenancyInterceptor(ff)))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<IRepository>())
            .Mappings(m => m.FluentMappings.ExportTo("c:\\temp\\mapping"))
            .ExposeConfiguration(x => _savedConfig = x) // save the nhibernate configuration for use when creating the schema, in order to be able to use the same connection
            .ExposeConfiguration(x => ConfigureEnvers(x))
            .ExposeConfiguration(x => ConfigureListeners(x));

    ISessionFactory sessionFactory;
    try
    {
        sessionFactory = configuration.BuildSessionFactory();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.StackTrace);
        throw;
    }

    _session = sessionFactory.OpenSession();
    BuildSchema(_savedConfig, _session);
}

private void BuildSchema(Configuration config, ISession session)
{
    new SchemaExport(config)
        .Execute(false, true, false, session.Connection, null);
}
The reason why you have to jump through all these hoops in order to use the in-memory version of SQLite is that the db is tied to the connection. You have to use the same connection that creates the db to populate the schema, thus we have to save the Configuration object so that we can export the schema later, once we've created the connection.
See this blog post for some more details: http://www.tigraine.at/2009/05/29/fluent-nhibernate-gotchas-when-testing-with-an-in-memory-database/
N.B.: This only shows the setup of the db. We have some code which also populates the db with standard values (users, customers, masterdata etc.) but I've omitted that for brevity.

Play framework 2 + JPA with multiple persistenceUnit

I'm struggling with Play and JPA, trying to use two different javax.persistence.Entity models associated with two different persistence units (needed to be able to connect to different DBs, for example an Oracle and a MySQL db).
The problem comes from the transaction, which is always bound to the default JPA persistenceUnit (see the jpa.default option).
Here are two controller actions which show the solution I found to manually define the persistence unit:
package controllers;

import models.Company;
import models.User;
import play.db.jpa.JPA;
import play.db.jpa.Transactional;
import play.mvc.Controller;
import play.mvc.Result;

public class Application extends Controller {

    // This action runs with the otherPersistenceUnit
    @Transactional(value = "other")
    public static Result test1() {
        JPA.em().persist(new Company("MyCompany"));

        // This transaction is run with the "defaultPersistenceUnit"
        JPA.withTransaction(new play.libs.F.Callback0() {
            @Override
            public void invoke() throws Throwable {
                JPA.em().persist(new User("Bobby"));
            }
        });
        return ok();
    }

    // This action runs with the default persistence unit
    @Transactional
    public static Result test2() {
        JPA.em().persist(new User("Ryan"));
        try {
            // This transaction is run with the otherPersistenceUnit
            JPA.withTransaction("other", false, new play.libs.F.Function0<Void>() {
                public Void apply() throws Throwable {
                    JPA.em().persist(new Company("YourCompany"));
                    return null;
                }
            });
        } catch (Throwable throwable) {
            throw new RuntimeException(throwable);
        }
        return ok();
    }
}
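For reference, the configuration behind this looks roughly like the sketch below; the datasource names, drivers and URLs are illustrative, and the complete working setup is in the repo linked further down.
conf/application.conf:
db.default.driver=com.mysql.jdbc.Driver
db.default.url="jdbc:mysql://localhost/maindb"
db.default.jndiName=DefaultDS
db.other.driver=oracle.jdbc.OracleDriver
db.other.url="jdbc:oracle:thin:@//localhost:1521/XE"
db.other.jndiName=OtherDS
jpa.default=defaultPersistenceUnit
conf/META-INF/persistence.xml (one unit per datasource, each referencing its JNDI name):
<persistence-unit name="defaultPersistenceUnit" transaction-type="RESOURCE_LOCAL">
    <provider>org.hibernate.ejb.HibernatePersistence</provider>
    <non-jta-data-source>DefaultDS</non-jta-data-source>
</persistence-unit>
<persistence-unit name="otherPersistenceUnit" transaction-type="RESOURCE_LOCAL">
    <provider>org.hibernate.ejb.HibernatePersistence</provider>
    <non-jta-data-source>OtherDS</non-jta-data-source>
</persistence-unit>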
This solution doesn't seem to be really "clean". I'd like to know if you know a better way to avoid the need to manually modify the transaction used.
For this purpose, I created a repo on git with a working sample application which shows how I configured the project.
https://github.com/cm0s/play2-jpa-multiple-persistenceunit
Thank you for your help
I met the same problem too. Much of the advice out there is about the PersistenceUnit annotation or getJPAConfig, but neither seems to work in the Play Framework.
I found a method which works well in my projects; maybe you can try it:
playframework2 how to open multi-datasource configuration with jpa
Good luck!

Entity Framework MigrateDatabaseToLatestVersion giving error

I am attempting to use Entity Framework code-based migrations with my web site. I currently have a solution with multiple projects in it. There is a Web API project, which I want to initialize the database, and another project called the DataLayer project. I have enabled migrations in the DataLayer project and created an initial migration that I am hoping will be used to create the database if it does not exist.
Here is the configuration I got when I enabled migrations
public sealed class Configuration : DbMigrationsConfiguration<Harris.ResidentPortal.DataLayer.ResidentPortalContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
    }

    protected override void Seed(Harris.ResidentPortal.DataLayer.ResidentPortalContext context)
    {
        //  This method will be called after migrating to the latest version.
        //  You can use the DbSet<T>.AddOrUpdate() helper extension method
        //  to avoid creating duplicate seed data. E.g.
        //
        //    context.People.AddOrUpdate(
        //      p => p.FullName,
        //      new Person { FullName = "Andrew Peters" },
        //      new Person { FullName = "Brice Lambson" },
        //      new Person { FullName = "Rowan Miller" }
        //    );
        //
    }
}
The only change I made to this after it was created was to change it from internal to public so the Web API could see it and use it in its database initializer. Below is the code in Application_Start that I am using to try to initialize the database:
Database.SetInitializer(new MigrateDatabaseToLatestVersion<ResidentPortalContext, Configuration>());
new ResidentPortalUnitOfWork().Context.Users.ToList();
If I run this, whether or not a database exists, I get the following error:
Directory lookup for the file "C:\Users\Dave\Documents\Visual Studio 2012\Projects\ResidentPortal\Harris.ResidentPortal.WebApi\App_Data\Harris.ResidentPortal.DataLayer.ResidentPortalContext.mdf" failed with the operating system error 2(The system cannot find the file specified.).
CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
It seems like it is looking in the totally wrong place for the database. It seems to have something to do with this particular way I am initializing the database, because if I change the code to the following:
Database.SetInitializer(new DropCreateDatabaseAlways<ResidentPortalContext>());
new ResidentPortalUnitOfWork().Context.Users.ToList();
The database will get correctly created where it needs to go.
I am at a loss for what is causing it. Could it be that I need to add something else to the configuration class or does it have to do with the fact that all my migration information is in the DataLayer project but I am calling this from the WebAPI project?
I have figured out how to create a dynamic connection string for this process. You first need to add this line to the entityFramework entry in your Web.config or App.config instead of the line that gets put there by default:
<defaultConnectionFactory type="<Namespace>.<ConnectionStringFactory>, <Assembly>"/>
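In context, that entry sits inside the entityFramework section of the config file; the namespace and assembly names below are placeholders for your own:
<entityFramework>
  <!-- placeholder namespace and assembly; the class name matches the factory shown further down -->
  <defaultConnectionFactory type="MyApp.Data.ResidentPortalConnectionStringFactory, MyApp.Data" />
  <!-- providers etc. unchanged -->
</entityFramework>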
This tells the program you have your own factory that will return a DbConnection. Below is the code I used to make my own factory. Part of this is a hack to get around the fact that a bunch of programmers work on the same set of code, but some of us use SQL Express while others use full-blown SQL Server. Still, this will give you an example to go by for what you need.
public sealed class ResidentPortalConnectionStringFactory : IDbConnectionFactory
{
    public DbConnection CreateConnection(string nameOrConnectionString)
    {
        SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder(ConfigurationManager.ConnectionStrings["PortalDatabase"].ConnectionString);

        //save off the original catalog
        string originalCatalog = builder.InitialCatalog;

        //we're going to connect to the master db in case the database doesn't exist yet
        builder.InitialCatalog = "master";
        string masterConnectionString = builder.ToString();

        //attempt to connect to the master db on the source specified in the config file
        using (SqlConnection conn = new SqlConnection(masterConnectionString))
        {
            try
            {
                conn.Open();
            }
            catch
            {
                //if we can't connect, then append on \SQLEXPRESS to the data source
                builder.DataSource = builder.DataSource + "\\SQLEXPRESS";
            }
            finally
            {
                conn.Close();
            }
        }

        //set the connection string back to the original database instead of the master db
        builder.InitialCatalog = originalCatalog;

        DbConnection temp = SqlClientFactory.Instance.CreateConnection();
        temp.ConnectionString = builder.ToString();
        return temp;
    }
}
Once I did that I could run this code in my Global.asax with no issues:
Database.SetInitializer(new MigrateDatabaseToLatestVersion<ResidentPortalContext, Configuration>());
using (ResidentPortalUnitOfWork temp = new ResidentPortalUnitOfWork())
{
    temp.Context.Database.Initialize(true);
}