Error implementing hmsonline cassandra-triggers

I was trying to implement triggers in Cassandra, using the available cassandra-triggers project: https://github.com/hmsonline/cassandra-triggers
I ported it to the latest version, 1.2.3, and followed the GettingStarted instructions.
When I set the value for the trigger, inserted data to fire the log, and checked the logs, the following is the error I received:
java.lang.AssertionError
at org.apache.cassandra.thrift.ThriftSessionManager.currentSession(ThriftSessionManager.java:51)
at org.apache.cassandra.thrift.CassandraServer.state(CassandraServer.java:88)
at org.apache.cassandra.thrift.CassandraServer.validateLogin(CassandraServer.java:881)
at org.apache.cassandra.thrift.CassandraServer.set_keyspace(CassandraServer.java:1492)
at com.hmsonline.cassandra.triggers.dao.CassandraStore.getConnection(CassandraStore.java:42)
at com.hmsonline.cassandra.triggers.dao.ConfigurationStore.getConfiguration(ConfigurationStore.java:76)
at com.hmsonline.cassandra.triggers.dao.ConfigurationStore.isCommitLogEnabled(ConfigurationStore.java:44)
at com.hmsonline.cassandra.triggers.TriggerTask.run(TriggerTask.java:47)
at java.lang.Thread.run(Thread.java:636)
We tested it on an older version (e.g. 1.1.2) and it works fine there.
So is this a configuration problem, or has the Thrift API implementation changed?
Thanks.

Related

Vertx Form Login Handler with Postgresql Failure

I am trying to authenticate a user using FormLoginHandler and a PostgreSQL database with SqlAuthentication.
But I get the following error:
Jun 15, 2022 1:14:34 PM io.vertx.ext.web.RoutingContext
SEVERE: Unhandled exception in router
io.vertx.ext.web.handler.HttpException: Unauthorized
Caused by: io.vertx.core.impl.NoStackTraceThrowable: Invalid username/password
I am providing the right credentials.
The code snippet is:
SqlAuthenticationOptions sauthopts = new SqlAuthenticationOptions();
sauthopts.setAuthenticationQuery(AUTHENTICATE_QUERY);
SqlAuthentication authenticationProvider = SqlAuthentication.create(sqlClient, sauthopts);
router.route("/secure/*").handler(RedirectAuthHandler.create(authenticationProvider, "/login.html"));
FormLoginHandler formLoginHandler = FormLoginHandler.create(authenticationProvider);
router.route("/loginhandler").handler(formLoginHandler);
Please let me know if I am missing something here, or point me to a sample.
Thanks in Advance.
Your setup doesn't show anything abnormal at first sight. For security reasons, we cannot "just" log the authentication data, as it would be a critical OWASP bug and security vulnerability.
My best guess is that something is probably not quite right with the query, which means you now have two options:
debug the application and see the query that is being sent plus the arguments (for example by calling the provider directly, as sketched below)
prepare a small, complete example that shows the bug and open an issue in vert.x so we can debug it further.
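Here is a minimal sketch for the first option, assuming Vert.x 4.x with vertx-auth-sql-client and a hypothetical test user: calling the provider directly with known credentials isolates the authentication query from the form and session handling.
import io.vertx.ext.auth.authentication.UsernamePasswordCredentials;

// Hypothetical test credentials; use a row you know exists in your users table.
UsernamePasswordCredentials creds = new UsernamePasswordCredentials("myuser", "mypassword");

// authenticationProvider is the SqlAuthentication instance from your snippet.
authenticationProvider.authenticate(creds)
    .onSuccess(user -> System.out.println("query and hash verification OK: " + user.principal()))
    .onFailure(err -> System.out.println("authentication failed: " + err.getMessage()));
If this direct call also fails, the problem is in the query or the stored hash rather than in the handler wiring.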
If you're upgrading from an older version, be aware that in vert.x 4.2.0 some changes were made to the base64 encoding to keep it consistent across modules. This could be a reason why authentication fails, as the encoded hashes may differ slightly. If you started directly on 4.3.0, this would not be a problem.

Starting KsqlRestApplication from Scala and getting NoSuchMethodError org.apache.kafka.streams.StreamsConfig.getConsumerConfigs

I am trying to write a program that enables me to run predefined KSQL operations on Kafka topics in Scala, but I don't want to open the KSQL CLI every time. Therefore I want to start the KSQL "server" from within my Scala program. If I understand the KSQL source code right, I have to build and start a KsqlRestApplication:
def restServer = KsqlRestApplication.buildApplication(
  new KsqlRestConfig(defaultServerProperties),
  true,
  new VersionCheckerAgent {
    override def start(ksqlModuleType: KsqlModuleType, properties: Properties): Unit = ???
  }
)
But when I try doing that, I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.kafka.streams.StreamsConfig.getConsumerConfigs(Ljava/lang/String;Ljava/lang/String;)Ljava/util/Map;
at io.confluent.ksql.rest.server.BrokerCompatibilityCheck.create(BrokerCompatibilityCheck.java:62)
at io.confluent.ksql.rest.server.KsqlRestApplication.buildApplication(KsqlRestApplication.java:241)
I looked into BrokerCompatibilityCheck: its create function calls StreamsConfig.getConsumerConfigs() with two Strings as parameters, instead of the parameters defined in
https://kafka.apache.org/0102/javadoc/org/apache/kafka/streams/StreamsConfig.html#getConsumerConfigs(StreamThread,%20java.lang.String,%20java.lang.String).
Are my KSQL and Kafka version simply not compatible or am I doing something wrong?
I am using KSQL version 4.1.0-SNAPSHOT and Kafka version 1.0.0.
Yes, NoSuchMethodError typically indicates a version incompatibility between libraries.
The link you posted is to javadoc for kafka 0.10.2. The method hasn't changed in 1.0 but indeed in the upcoming 1.1 it only takes 2 Strings:
https://kafka.apache.org/11/javadoc/org/apache/kafka/streams/StreamsConfig.html#getConsumerConfigs(java.lang.String,%20java.lang.String)
That suggests the version of KSQL you're using (4.1.0-SNAPSHOT) depends on version 1.1 of Kafka Streams, which is currently in the release-candidate phase and which I believe should be out soon:
https://lists.apache.org/thread.html/780c4458b16590e99261b69d7b41b9ec374a3226d72c8d38885a008a#%3Cusers.kafka.apache.org%3E
As per that email you can find the latest (1.1.0-rc2) artifacts in the apache staging repo:
https://repository.apache.org/content/groups/staging/
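If it helps while sorting out the versions, here is a small diagnostic sketch (plain JDK reflection only, nothing KSQL-specific) that prints which kafka-streams jar is actually on the classpath and which getConsumerConfigs overloads it exposes; run it with the same classpath as your Scala program.
import java.lang.reflect.Method;
import org.apache.kafka.streams.StreamsConfig;

public class StreamsConfigCheck {
    public static void main(String[] args) {
        // Location of the jar that StreamsConfig was actually loaded from.
        System.out.println(StreamsConfig.class.getProtectionDomain().getCodeSource().getLocation());
        // Every getConsumerConfigs signature that jar provides.
        for (Method m : StreamsConfig.class.getMethods()) {
            if (m.getName().equals("getConsumerConfigs")) {
                System.out.println(m);
            }
        }
    }
}
If the only printed signature takes two Strings, the jar matches what KSQL expects; otherwise the kafka-streams version on the classpath is older than KSQL assumes.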

getSchema in PostgreSQL JDBC driver throws java.lang.AbstractMethodError or java.sql.SQLFeatureNotSupportedException

I'm using PostgreSQL 8.4 and my application is trying to connect to the database.
I've registered the driver:
DriverManager.registerDriver(new org.postgresql.Driver());
and then trying the connection:
db = DriverManager.getConnection(database_url);
(btw, my jdbc string is something like: jdbc:postgresql://localhost:5432/myschema?user=myuser&password=mypassword)
I've tried various versions of the JDBC driver and I'm getting two types of errors:
with jdbc3:
Exception in thread "main" java.lang.AbstractMethodError: org.postgresql.jdbc3.Jdbc3Connection.getSchema()Ljava/lang/String;
with jdbc4:
java.sql.SQLFeatureNotSupportedException: Il metodo «org.postgresql.jdbc4.Jdbc4Connection.getSchema()» non è stato ancora implementato.
which means: the method org.postgresql.jdbc4.Jdbc4Connection.getSchema() has not been implemented yet.
I'm missing something, but I don't know what.
------ SOLVED ---------
The problem was not in the connection string or the driver version; the problem was in the line right after the getConnection() call:
db = DriverManager.getConnection(database_url);
LOGGER.info("Connected to : " + db.getCatalog() + " - " + db.getSchema());
It seems the PostgreSQL driver doesn't implement the getSchema() method, as the Java console had been trying to tell me.
Connection.getSchema() was added in Java 7 / JDBC 4.1. This means that it is not necessarily available in a JDBC 3 or JDBC 4 driver (although if an implementation exists, it will get called).
If you use a JDBC 3 (Java 4/5) driver or a JDBC 4 (Java 6) driver in Java 7 or higher it is entirely possible that you receive a java.lang.AbstractMethodError when calling getSchema if it does not exist in the implementation. Java provides a form of forward compatibility for classes implementing an interface.
If new methods are added to an interface, classes that do not have these methods and were - for example - compiled against an older version of the interface, can still be loaded and used provided the new methods are not called. Missing methods will be stubbed by code that simply throws an AbstractMethodError. On the other hand: if a method getSchema had been implemented and the signature was compatible that method would now be accessible through the interface, even though the method did not exist in the interface at compile time.
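As a contrived illustration of that mechanism (the Api and OldImpl types below are hypothetical, not driver code): compile the two types together and the program runs fine; if the interface later gains a method and only the interface is recompiled, the old implementation class still loads, but calling the new method on it throws AbstractMethodError, exactly like Connection.getSchema() on an old JDBC driver.
public class ForwardCompatDemo {
    interface Api {
        String name();
        // Imagine a later release adds:  String schema();
        // OldImpl below, compiled before that addition, would still load;
        // calling schema() on it would then throw java.lang.AbstractMethodError.
    }
    static class OldImpl implements Api {
        public String name() { return "old driver"; }
    }
    public static void main(String[] args) {
        Api api = new OldImpl();
        System.out.println(api.name()); // existing methods keep working
    }
}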
In March 2011, the driver was updated so it could be compiled on Java 7 (JDBC 4.1), this happened by stubbing the new JDBC 4.1 methods with an implementation that throws a java.sql.SQLFeatureNotSupportedException, including the implementation of Connection.getSchema. This code is still in the current PostgreSQL JDBC driver version 9.3-1102. Technically a JDBC-compliant driver is not allowed to throw SQLFeatureNotSupportedException unless the API documentation or JDBC specification explicitly allows it (which it doesn't for getSchema).
However, the current code on GitHub does provide an implementation (since April this year). You might want to consider compiling your own version, or asking on the pgsql-jdbc mailing list whether recent snapshots are available (the snapshots link on http://jdbc.postgresql.org/ shows rather old versions).
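In the meantime, a sketch of a defensive version of the logging line from the question (reusing the db and LOGGER variables from that snippet, Java 7 multi-catch) would be to guard the call so a driver without getSchema() doesn't break connection setup:
String schema;
try {
    schema = db.getSchema(); // JDBC 4.1 (Java 7) method
} catch (AbstractMethodError | java.sql.SQLFeatureNotSupportedException e) {
    // Older PostgreSQL JDBC drivers: either no stub at all, or a "not implemented" stub.
    schema = "<unavailable>";
}
LOGGER.info("Connected to : " + db.getCatalog() + " - " + schema);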

Spring Data Cassandra - Environment must not be null Error

I am following the basic tutorial in the Spring Data Cassandra reference documentation http://docs.spring.io/spring-data/cassandra/docs/1.1.0.RC1/reference/html/ and I am running into the following exception:
java.lang.IllegalArgumentException: Environment must not be null!
at org.springframework.util.Assert.notNull(Assert.java:112)
at org.springframework.data.repository.config.RepositoryConfigurationSourceSupport.<init>(RepositoryConfigurationSourceSupport.java:50)
at org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource.<init>(AnnotationRepositoryConfigurationSource.java:74)
at org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport.registerBeanDefinitions(RepositoryBeanDefinitionRegistrarSupport.java:74)
at org.springframework.context.annotation.ConfigurationClassParser.processImport(ConfigurationClassParser.java:394)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:204)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:163)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:138)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:284)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:225)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:630)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:461)
at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
at com.strides.platform.domain.UserRepositoryDaoTest.<init>(UserRepositoryDaoTest.java:28)
I have completed the steps mentioned in the document:
1) Use Cassandra properties
2) Create the Java configuration
3) Create domain and repository classes
I have autowired the Environment variable in the test classes. I checked a couple of sample projects and am not sure what more needs to be done.
I've encountered this error message and found the problem only occurring when I used Spring Framework version 3.2.8.RELEASE.
My solution was to upgrade to version 3.2.9.RELEASE.
See also java.lang.IllegalArgumentException: Environment must not be null
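For reference, a minimal sketch of the Java configuration the tutorial expects, assuming the spring-data-cassandra 1.x package layout and a hypothetical keyspace name (adjust both to your project); the point is that a plain @Configuration class extending AbstractCassandraConfiguration, picked up via AnnotationConfigApplicationContext as in your test, works once Spring Framework is at 3.2.9.RELEASE or later.
import org.springframework.context.annotation.Configuration;
import org.springframework.data.cassandra.config.java.AbstractCassandraConfiguration;
import org.springframework.data.cassandra.repository.config.EnableCassandraRepositories;

@Configuration
@EnableCassandraRepositories(basePackages = "com.strides.platform.domain")
public class TestCassandraConfig extends AbstractCassandraConfiguration {
    @Override
    protected String getKeyspaceName() {
        return "mykeyspace"; // hypothetical keyspace name
    }
}

// In the test: new AnnotationConfigApplicationContext(TestCassandraConfig.class)
// and fetch your repository bean from the resulting context.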

Seam 2.2GA + JBoss AS 5.1GA + Postgres 8.4

Sorry for the big wall of text, but it's mostly logs. I've been trying to get help on the Seam forums, but in vain.
I'm trying the setup mentioned in the title, but without success. I have everything installed correctly, and the problems start with seam-gen.
This is my build.properties
#Generated by seam setup
#Sat Aug 29 19:12:18 BRT 2009
hibernate.connection.password=abc123
workspace.home=/home/rgoytacaz/workspace
hibernate.connection.dataSource_class=org.postgresql.ds.PGConnectionPoolDataSource
model.package=com.atom.Commerce.model
hibernate.default_catalog=PostgreSQL
driver.jar=/home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
action.package=com.atom.Commerce.action
test.package=com.atom.Commerce.test
database.type=postgres
richfaces.skin=glassX
glassfish.domain=domain1
hibernate.default_schema=Core
database.drop=n
project.name=Commerce
hibernate.connection.username=postgres
glassfish.home=C\:/Program Files/glassfish-v2.1
hibernate.connection.driver_class=org.postgresql.Driver
hibernate.cache.provider_class=org.hibernate.cache.HashtableCacheProvider
jboss.domain=default
project.type=ear
icefaces.home=
database.exists=y
jboss.home=/srv/jboss-5.1.0.GA
driver.license.jar=
hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
hibernate.connection.url=jdbc\:postgresql\:Atom
icefaces=n
./seam create-project works okay, but when I try generate-entities, I get the following...
generate-model:
[echo] Reverse engineering database using JDBC driver /home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
[echo] project=/home/rgoytacaz/workspace/Commerce
[echo] model=com.atom.Commerce.model
[hibernate] Executing Hibernate Tool with a JDBC Configuration (for reverse engineering)
[hibernate] 1. task: hbm2java (Generates a set of .java files)
[hibernate] log4j:WARN No appenders could be found for logger (org.hibernate.cfg.Environment).
[hibernate] log4j:WARN Please initialize the log4j system properly.
[javaformatter] Java formatting of 4 files completed. Skipped 0 file(s).
This is problem no. 1. What is this, and how do I fix it? (I had to do this in Eclipse; it worked there.)
Then I imported the seam-gen-created project into Eclipse and deployed it to JBoss 5.1. While the server starts, I noticed the following:
03:18:56,405 ERROR [SchemaUpdate] Unsuccessful: alter table PostgreSQL.atom.productsculturedetail add constraint FKBD5D849BC0A26E19 foreign key (culture_Id) references PostgreSQL.atom.cultures
03:18:56,406 ERROR [SchemaUpdate] ERROR: cross-database references are not implemented: "postgresql.atom.productsculturedetail"
03:18:56,407 ERROR [SchemaUpdate] Unsuccessful: alter table PostgreSQL.atom.productsculturedetail add constraint FKBD5D849BFFFC9417 foreign key (product_Id) references PostgreSQL.atom.products
03:18:56,408 ERROR [SchemaUpdate] ERROR: cross-database references are not implemented: "postgresql.atom.productsculturedetail"
03:18:56,408 INFO [SchemaUpdate] schema update complete
Problem no. 2: what are these cross-database references?
And what about this:
03:18:55,089 INFO [SettingsFactory] JDBC driver: PostgreSQL Native Driver, version: PostgreSQL 8.4 JDBC3 (build 701)
Problem no. 3: I told build.properties to use the JDBC4 driver; I don't know why Seam insists on using a JDBC3 driver. Where do I change this?
When I go to http://localhost:5443/Commerce and try to browse the auto-generated CRUD UI, I get this error: Error reading 'resultList' on type com.atom.Commerce.action.ProductsList_$$_javassist_seam_2
And this is what shows up in my server logs:
03:34:00,828 INFO [STDOUT] Hibernate:
select
products0_.product_Id as product1_0_,
products0_.active as active0_
from
PostgreSQL.atom.products products0_ limit ?
03:34:00,848 WARN [JDBCExceptionReporter] SQL Error: 0, SQLState: 0A000
03:34:00,849 ERROR [JDBCExceptionReporter] ERROR: cross-database references are not implemented: "postgresql.atom.products"
Position: 81
03:34:00,871 SEVERE [viewhandler] Error Rendering View[/ProductsList.xhtml]
javax.el.ELException: /ProductsList.xhtml: Error reading 'resultList' on type com.atom.Commerce.action.ProductsList_$$_javassist_seam_2
Caused by: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: could not execute query
Problem no. 4: what is going on here? Cross-database references again?
Thanks for any help with any of my problems.
You did receive a few answers on the Seam forums (here and here), but you didn't follow up. Anyway, all these are actually caused by one problem:
As Stuart Douglas told you, you shouldn't use a catalog when connecting to PostgreSQL. To fix this, replace the property "hibernate.default_catalog=PostgreSQL" in your properties file with the property "hibernate.default_catalog.null=", so that your file looks like this:
...
model.package=com.atom.Commerce.model
hibernate.default_catalog.null= # <-- This is the replaced property
driver.jar=/home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
...
You should be able to use seam generate-entities fine afterwards (assuming the rest of your configuration is correct). I'd recommend doing the generation into a clean folder.
Cross-database references are when a query tries to access two or more different databases. PostgreSQL does not support this, and thus complains when there is more than one period in the table name; so in PostgreSQL.atom.productsculturedetail, the PostgreSQL. prefix should be removed. Hibernate adds this prefix when you tell it to use a default catalog, which we already fixed in step 1 above (by telling it not to use a catalog), so this problem should be fixed after you regenerate your entities.
(Note that this is effectively the same as what Stuart Douglas told you, that you should remove the catalog="PostgreSQL" attribute in the annotations on your entity classes.)
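In annotation form, that change would look roughly like this on each generated entity (the table and schema names here are just the ones from your logs; the remaining fields are omitted):
// Before, with hibernate.default_catalog=PostgreSQL, seam-gen produced:
//   @Table(name = "products", catalog = "PostgreSQL", schema = "atom")
// After: drop the catalog attribute so Hibernate stops prefixing "PostgreSQL.":
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "products", schema = "atom")
public class Products {
    @Id
    private int productId; // remaining fields and mappings omitted
}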
When you specified the postgresql-8.4-701.jdbc4.jar file in the properties file, this didn't mean that the driver supports JDBC4. Although the name of the file would suggest so, the driver's website clearly states that "The driver provides a reasonably complete implementation of the JDBC 3 specification". This shouldn't be a problem for you, as you're not using the driver directly (or at least you're not supposed to). The driver is sufficient for Hibernate to fulfill its requirements and provide the required functionality.
This issue is caused by the same problem above. Hibernate is unable to read data from the database because of the incorrect query. Fixing the catalog problem should fix this issue.