Is there a way to let Liquibase show the statements it is running (for example when using update) or generating (for example when using updateSql) on the command-line console, while actually running them against the database / generating them to a SQL file?
I am not sure if this would work or not, but you could try setting up Liquibase to use the SLF4J logging framework, then configure Logback (or anything else that SLF4J supports) so that debug-level logging is directed to the console. If I remember correctly, Liquibase logs the statements being executed at debug level.
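If you go the Logback route, a minimal logback.xml along these lines should send that output to the console (a sketch only; the assumption that the SQL shows up at DEBUG under the "liquibase" logger is from memory, so adjust the logger name or level if needed):

<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <!-- timestamped console output -->
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- assumption: Liquibase logs the generated/executed SQL at DEBUG under "liquibase" -->
  <logger name="liquibase" level="DEBUG"/>

  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>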
Currently, I am using HSQLDB version 2.6.0. I can connect and execute queries using the HSQLDB GUI (runManagerSwing).
I want to know how to connect and execute queries via the CMD prompt (like we do for SQL Server, MySQL, etc.). I have tried the steps given in the link below; it doesn't work.
FYR - https://confluence.atlassian.com/bamkb/how-to-access-embedded-hsql-database-via-command-line-847749291.html
It would be great if anyone could give the step-by-step process. Thanks!
The steps are still OK. Use SqlTool.jar version 2.6.0 and replace the URL with the JDBC URL for your database. If you have a problem, post YOUR config together with the steps you are taking.
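As an illustration (a sketch only; the database alias, URL and user below are assumptions, not taken from your setup), connecting with SqlTool usually comes down to an entry in a sqltool.rc file plus one java command:

# sqltool.rc (by default looked up in your home directory, or pass --rcFile=path)
urlid mydb
url jdbc:hsqldb:hsql://localhost/mydb
username SA
password

# connect and get an interactive SQL prompt
java -jar sqltool.jar mydb

From that prompt you can run queries interactively against the running server.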
I use Karate as a standalone JAR, and for writing scenarios I've installed Visual Studio Code with the "karate-runner" plugin as IDE support.
I use an external jar for encryption treatments.
The trouble is that when I execute a Karate scenario from Visual Studio Code (for debugging purposes), my external jar is not taken into account, and during execution I get the message "java.lang.ClassNotFoundException: GenerateSign" in the console.
I have no problem when I launch the scenario directly from the command line like:
Karate.bat mytest.feature
where the content of karate.bat is:
java -cp karate.jar;Sign.jar;. com.intuit.karate.Main %*
So, how do I configure the tools so that my Karate scenarios executed from VS Code also take my external jar into account?
Thanks a lot.
I suspect the problem is that you haven't updated the "karateCli" property in your launch.json debug configuration. Can you try updating it to include your additional jar file and try again?
EDIT
Based on the command line that works in your batch file, you should update your "Karate Runner" extension settings as shown below in the images.
For running tests from Codelens with "Run Karate Test(s)"
For running tests with VSCode debugger
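As a rough sketch of what the debug configuration could look like (every value here is an assumption to adapt: the classpath simply mirrors your working karate.bat with Sign.jar added, and -d starts the standalone debug server the extension attaches to):

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "karate",
      "name": "Karate (debug): Standalone",
      "request": "launch",
      // same classpath as karate.bat, plus Sign.jar so GenerateSign resolves
      "karateCli": "java -cp karate.jar;Sign.jar;. com.intuit.karate.Main -d"
    }
  ]
}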
Maybe you are simply on the wrong version. Did you try 0.9.5?
Here are the instructions: https://marketplace.visualstudio.com/items?itemName=kirkslota.karate-runner
For those coming across this in the future, you can use this as an additional reference: https://github.com/intuit/karate/wiki/Karate-Robot-Windows-Install-Guide
I've been using Liquibase for several years and it's extremely helpful for me as an application developer to keep source code and database in sync, so thank you to all contributors for this tool.
During my daily work, I usually start Liquibase from the command line in order to test the changesets and database operations. If everything is wired right, I start my application (Spring Boot) and the Liquibase setup within the application performs all those sync steps. This setup works perfectly unless my changelog file contains changesets with loadData in order to populate data from CSV files into the database. Every application start then fails with liquibase.exception.ValidationFailedException: Validation Failed:
change sets check sum
The reason seems to be the different file locations for the CSV files referenced in loadData, which are part of the checksum computation. If started from the application, the changeset looks like this:
classpath:liquibase/changelog.xml: classpath:liquibase/changelog.xml::loadDefaultRolePermissions::dominik
But if started from the command line, there is no way to use classpath resources, so the changeset info looks like this:
liquibase: src/main/resources/liquibase/changelog.xml: src/main/resources/liquibase/changelog.xml::loadDefaultRolePermissions::dominik
Both values differ and lead to different checksums.
If you look into liquibase.integration.commandline.Main.java, there is no classpath resource accessor used:
FileSystemResourceAccessor fsOpener = new FileSystemResourceAccessor();
CommandLineResourceAccessor clOpener = new CommandLineResourceAccessor(classLoader);
CompositeResourceAccessor fileOpener = new CompositeResourceAccessor(fsOpener, clOpener);
from liquibase.integration.commandline.Main.java
Is there any way to make Liquibase interoperable between a command-line run AND an application startup run?
Thanks in advance
Dominik
RESOLVED by updating from liquibase 3.3.1 to 3.5.3
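For future readers on versions where updating is not immediately possible, a related mitigation (my own suggestion, not from the original exchange) is the logicalFilePath attribute on the changelog root, which pins the path recorded for each changeset regardless of whether the changelog is loaded from the classpath or the file system. I am not certain it also covers the loadData CSV path portion of the checksum on 3.3.x, so treat this as a sketch:

<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd"
    logicalFilePath="liquibase/changelog.xml">
    <!-- changesets stay unchanged; only the recorded changelog path is pinned -->
</databaseChangeLog>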
I have a WebSphere Application Server v8.0, and my job requires me to change the location of my JDBC data source to different values to test in different environments. I traditionally do this via the admin console, changing the settings under Resources > JDBC > Data sources, but I'd like to write a script to change these settings. When I use the admin console, where do the settings get stored? I can run the console via the Servers tab in Eclipse (Rational Application Developer) or by navigating to localhost:9044, but I don't know where the settings are stored, which I'd need to know to write said script.
Can anybody help me out?
From what I remember of WebSphere Application Server, the settings are ultimately persisted to the file system. However, you shouldn't be changing them this way: application server config is a messy and complicated business, and by directly changing settings you risk destroying your app server.
I'd recommend checking out this redbook, particularly Chapter 8 which describes how you can configure your app server with scripts. Also I seem to recall plans to display the equivalent scripting commands in the admin console.
If it helps, I had a quick look locally and found a reference to my JDBC data source in resources.xml, located within the WebSphere directory at...
<server profile root>\config\cells\<aNodeCell>\nodes\<aNode>\servers\<aServer>
In the past I've used the XML config to read values for convenience, but not often to update them. Instead I have made use of some of the Jython scripting options available, and I can echo Jim's response to check out the options there in case there is something that would be a viable alternative.
Edit:
There is another link that may be of interest Configuring data access with wsadmin scripting. I've not used this particular feature of wsadmin myself but it does appear to show promise at first glance.
If you want to write a script, then rather than looking at the file system, write a proper Jython script, which will make your modifications in much the same way as you would via the console (a small sketch follows after the list below).
To make writing the script easier you can use:
Command assistance in the console - the Help portlet on the right shows the last invoked command in Jython
The script library, which already provides some scripts - Automating data access resource configuration using wsadmin scripting
And basic scripting commands - Configuring data access with wsadmin scripting
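Here is the sketch mentioned above, using the wsadmin (Jython) objects; the data source name, the attribute chosen and the new value are purely hypothetical, so adapt them to your resource:

# locate the data source by name (the name is an assumption)
ds = AdminConfig.getid('/DataSource:MyDataSource/')
print 'Before: ' + AdminConfig.showAttribute(ds, 'jndiName')

# change an attribute - here the JNDI name, as a simple illustration
AdminConfig.modify(ds, [['jndiName', 'jdbc/MyDataSourceTest']])

# persist the change to the configuration repository
AdminConfig.save()

Run it with wsadmin -lang jython -f modifyDataSource.py (the file name is just an example).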
I am using slf4j-simple in my project. I would like to change the logging level for Slick to INFO. After reading Logging options for Slick and the SimpleLogger class docs, I have tried adding the following options to my VM arguments:
-Dorg.slf4j.simpleLogger.defaultLogLevel=INFO
-Dlogger.scala.slick=INFO
-Dlogger.scala.slick.jdbc.JdbcBackend.statement=INFO
-Dorg.slf4j.simpleLogger.log.scala.slick=INFO
I see a few INFO-level logs coming from Jetty, so basic logging seems to be working. I am also able to change the level of the logs shown by using -Dorg.slf4j.simpleLogger.defaultLogLevel=TRACE, but even that only shows more Jetty logs; no Slick logs are shown to me.
How can I configure slf4j-simple to show Slick logs to me?
According to http://www.slf4j.org/api/org/slf4j/impl/SimpleLogger.html, the correct system property should be
-Dorg.slf4j.simpleLogger.log.scala.slick=debug
instead of
-Dlogger.scala.slick=INFO
For reference, the list of all loggers used by Slick can be found in logback.xml. We only use the debug level with all of them.
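Equivalently, instead of -D flags you can drop a simplelogger.properties file on the classpath (a sketch; the scala.slick logger names assume the Slick 2.x packages from your question, while newer Slick versions log under the slick.* package instead):

# simplelogger.properties - equivalent to the -D options above
# all Slick debug output
org.slf4j.simpleLogger.log.scala.slick=debug
# or only the executed statements
org.slf4j.simpleLogger.log.scala.slick.jdbc.JdbcBackend.statement=debug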