There is an option in the manifest file to call a method during module installation: we can specify pre_init and post_init.
I would like to call a method while upgrading the module, similar to pre_init, because once a module is installed pre_init will not be called again.
Any suggestions for this?
Why I need this:
I have a PostgreSQL stored procedure that generates report data quickly. Whenever there is a slight change in the procedure, I would like to push that change through the module upgrade process.
There should be an option to call a method during a module upgrade, just like pre_init and post_init at install time.
I tried the following to do this.
# Added the following code in an XML file
<function model="sale.order" name="action_custom_method"/>

@api.model
def action_custom_method(self):
    # stored procedure code
    return True
But this is not working for me. I am using Odoo 14.
In Odoo 14 these became pre_init_hook & post_init_hook, which are set in __manifest__.py as the (string) names of the hook functions:
{
    'pre_init_hook': 'pre_init_hook_method',
    'post_init_hook': 'post_init_hook_method',
}
You also add the function definitions in the module's __init__.py:
from odoo import api, SUPERUSER_ID

def pre_init_hook_method(cr):
    env = api.Environment(cr, SUPERUSER_ID, {})  # to get env

def post_init_hook_method(cr, registry):
    env = api.Environment(cr, SUPERUSER_ID, {})  # to get env
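For the stored-procedure use case, the post-init hook's body could look like the minimal sketch below; the function name and the SQL are placeholders, not your actual report logic:

def post_init_hook_method(cr, registry):
    # Placeholder sketch: (re)create a PostgreSQL function right after installation.
    cr.execute("""
        CREATE OR REPLACE FUNCTION update_product_sales_history()
        RETURNS void AS $$
        BEGIN
            -- report-generation logic goes here
            RETURN;
        END;
        $$ LANGUAGE plpgsql;
    """)

Keep in mind that pre_init_hook and post_init_hook run only at installation time, not on upgrade, so for changing the procedure later the migrations mechanism described below is a better fit.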
You may also use the migrations folder and module versioning, as described in the Data Migration documentation.
So in your module folder you would have a migrations folder with the following structure:
<moduledir>
`-- migrations
|-- 1.0
| |-- pre-update_table_x.py
| |-- pre-update_table_y.py
| |-- post-create_plop_records.py
| |-- end-cleanup.py
| `-- README.txt # not processed
|-- 12.0.1.1 # processed only on a 12.0 server
| |-- pre-delete_table_z.py
| `-- post-clean-data.py
|-- 0.0.0
| `-- end-invariants.py # processed on all version update
So if your module was previously defined with a version like 12.0.1.1, you could bump it to 12.0.1.2 in __manifest__.py. Then, in the migrations folder, you add a new folder 12.0.1.2 where you place pre- or post- Python files containing your stored procedure definition, as sketched below. Please also note that the 0.0.0 folder will always run. If that didn't work, something else is probably wrong.
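A minimal sketch of such a migration script, assuming a hypothetical file migrations/12.0.1.2/post-update_stored_procedure.py (the SQL body is a placeholder):

def migrate(cr, version):
    # "post-" scripts run after the module's data files are loaded during the
    # upgrade to 12.0.1.2; recreate the procedure here.
    cr.execute("""
        CREATE OR REPLACE FUNCTION update_product_sales_history()
        RETURNS void AS $$
        BEGIN
            -- updated report logic goes here
            RETURN;
        END;
        $$ LANGUAGE plpgsql;
    """)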
don't hesitate to let us know if you face any problems.
Finally, I got the solution.
We created a .sql file and added all our stored procedures and other database objects to it.
We added the SQL files to the 'data' list in __manifest__.py:
'data': [
    'db_function/get_product_sales_history_data.sql',
    'db_function/update_product_sales_history.sql',
],
Odoo executes these SQL files at module installation time as well as at module upgrade time.
Related
I have a PyQt5 (5.15.6) application running in Python 3 and want to reference my qss file like this:
qss_file = QtCore.QFile("my_app_qss.qss")
However, I have multiple apps that use the same qss file, so depending on where I run an app from, I need an absolute import rather than a relative import. I would also like to compile any of those apps with PyInstaller and deploy them to another machine. How can I reference this qss file?
example folder structure
main
|-- resources/my_app_qss.qss
|-- apps/
|   |-- project1/app1.py
|   `-- project2/
|       `-- subfolder/app2.py
The issue is that I did not understand that
qss_file = QtCore.QFile("my_app_qss.qss")
is not a path to a file. It is referencing a file that gets built by pyrcc5 from the .qrc source.
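As a hedged sketch of that resource-based approach, assuming a resources.qrc that lists my_app_qss.qss and is compiled with pyrcc5 into a hypothetical resources_rc.py module:

from PyQt5 import QtCore

import resources_rc  # noqa: F401 - importing this registers the compiled Qt resources

# The ":/" prefix tells Qt to read from the compiled resources rather than the
# filesystem, so the same reference works from any app folder and inside a
# PyInstaller bundle.
qss_file = QtCore.QFile(":/my_app_qss.qss")
if qss_file.open(QtCore.QIODevice.ReadOnly | QtCore.QIODevice.Text):
    stylesheet = bytes(qss_file.readAll()).decode("utf-8")
    qss_file.close()

The resulting stylesheet string can then be applied with app.setStyleSheet(stylesheet).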
I have a problem with Liquibase database versioning: I want to include all changelog files in a folder, without specifying their names, and run an update using them. Is that possible? My structure:
db
|-- release-v1
|   |-- firstchangelog.sql
|   `-- secondchangelog.sql
|-- release-v2
|   `-- newchangelog.sql
...
And with every release I want to use all *.sql files from its folder. I don't want to hardcode the names of the SQL files. I use only the Liquibase command line for CI/CD in Azure.
I found something like this (liquibase.org/get-started/best-practices), but it only works with XML files and I use the SQL extension.
I have one idea, but it seems bad to me and I'll use it only as a last resort: just making a loop over every file in the folder, in cmd...
Does someone know if there is a simpler and better way?
You can do it like this:
Have a parent changelog file:
databaseChangeLog = {
    include file: "releases/ref1.groovy", relativeToChangelogFile: true
    include file: "releases/release1.groovy", relativeToChangelogFile: true
    include file: "releases/release2.groovy", relativeToChangelogFile: true
    include file: "releases/release3.groovy", relativeToChangelogFile: true
}
In the releases folder, the ref1 file can have:
databaseChangeLog = {
    include file: "../tables/changelog.groovy", relativeToChangelogFile: true
}
So at the same level as the releases folder you can have a tables folder, which in turn has create, delete, and update folders. The changelog file in tables then includes the changelogs from those folders, each of which lists the necessary files added across different releases:
databaseChangeLog = {
    include file: "create/changelog.groovy", relativeToChangelogFile: true
    include file: "update/changelog.groovy", relativeToChangelogFile: true
    include file: "data/changelog.groovy", relativeToChangelogFile: true
}
Finally, in the main releases folder you can add a file like release1.groovy:
databaseChangeLog = {
    include file: "../tables/create/your_structured_sql_file.groovy",
            relativeToChangelogFile: true
}
These are in Groovy, but the structuring is the same.
I think you should just use the "includeAll" tag. Then Liquibase will execute all .sql files inside the folder. But remember that since you are not specifying every single file, Liquibase will execute them in alphabetical order.
Something like:
<includeAll path="com/example/changelogs/"/>
Read here -> https://docs.liquibase.com/concepts/changelogs/attributes/includeall.html
I am working on a project which has multiple modules, each with its own doxyfile. My idea is to have a single master doxyfile which includes the other private doxyfiles to create one big documentation set for the project. The directory structure looks like the following:
MyProject+
|-Private_Prj1+
| |-Doc+
| |-doxyfile_privateprj1
|-Private_Prj2+
| |-Doc+
| |-doxyfile_privateprj2
|-Doc+
|-doxyfile_myproject (AKA Master Doxyfile)
How can I configure the doxyfile_myproject to include doxyfile_privateprj1 and doxyfile_privateprj2 in such a way that when I run Doxygen on the doxyfile_myproject, it then sequentially runs other doxyfiles?
I have been given a skeleton SBT project to work on. The directory structure is as follows:
|-- build.sbt
|-- project
| |-- build.properties
| |-- plugins.sbt
| |-- project
| `-- target
|-- README.md
`-- src
|-- main
| `-- scala
| `-- com
| `-- app-name
| |-- domain
| |-- exception
| |-- repository
| `-- util
`-- test
`-- scala
`-- Vagrantfile
The instructions are to create an app entry point which should take a single command line argument and run some logic.
I have managed to get a simple "hello world" sbt project working but I'm new to scala/sbt. Where would I place this entry point and how can I accept a command line argument?
The root folder for source files would be src/main/scala.
Parameters are referenced using the args array within your entry point object.
The entry point is any object under that source tree which extends App. Since this is a hello world example and you're just getting started, I'd drop it right into the root of the sources (src/main/scala/MyApp.scala).
Something like this:
object MyApp extends App {
  println(args.length match {
    case 0 => "You passed in no arguments!"
    case 1 => s"You passed in 1 argument, which was ${args(0)}"
    case x => s"You passed in $x arguments! They are: ${args.mkString(",")}"
  })
}
To run your app, issue the sbt run command in the project root. To run with parameters, use sbt "run arg1".
I'm building a website with Symfony2 and PostgreSQL (for the first time). I recently discovered the database layer called Pomm and decided to use it instead of Doctrine2.
However, I get a FatalErrorException when I try to display some data. The problem might come from a wrong path to the generated Pomm map file. Unfortunately, I haven't found anything in the manual or the tutorials I read that helps me fix my mistake.
Here is what I did:
1- PommBundle Installation into Symfony2.3.1 with Composer = ok
2- Setup (PommBundle registration in the application kernel + database settings) = ok
3- Map file generation for the db table 'product' (as follows) = ok
app/console pomm:mapfile:create product
Pomm generated the folder 'Database' and now the website structure is:
-- Source Files
|-- Database
|   `-- PublicSchema
|       |-- Base
|       |   `-- ProductMap.php
|       |-- Product.php
|       `-- ProductMap.php
|-- app
|-- bin
|-- src
|-- vendor
|-- web
4- app/autoload.php
The PommBundle Documentation about autoload.php is a bit confusing (for a non-native English speaker). Indeed, here's what is written:
If you are using Symfony 2.0.x, you may still be using sf2 autoloader.
Update your app/autoload.php file.
However, I'm using Symfony 2.3.1, which is why I thought I didn't have to update the app/autoload.php file.
Moreover, it's not very clear what you have to add to the file:
# app/autoload.php (original file)
use Doctrine\Common\Annotations\AnnotationRegistry;
use Composer\Autoload\ClassLoader;
$loader = require __DIR__.'/../vendor/autoload.php';
AnnotationRegistry::registerLoader(array($loader, 'loadClass'));
return $loader;
The PommBundle documentation says:
Update your app/autoload.php file [by adding the following code]:
$loader->registerNamespaces(array(
    'Symfony' => array(__DIR__.'/../vendor/symfony/src', __DIR__.'/../vendor/bundles'),
    ...
    'Pomm' => __DIR__.'/../vendor/pomm/pomm',
    'Pomm\\PommBundle' => __DIR__.'/../vendor/pomm/pomm-bundle',
));
I didn't understand how I could add this code to my file (shown above). So I guessed this was only for Symfony 2.0.*.
5- Problem in the Controller
In the controller I typed the path to the Pomm map file as follows:
namespace Admin\ProductBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;

class DefaultController extends Controller
{
    public function indexAction()
    {
        $myproducts = $this->get('pomm')
            ->getDatabase('database')
            ->createConnection()
            ->getMapFor('Database\PublicSchema\Product')
            ->findAll();

        return $this->render('AdminProductBundle:Default:index.html.twig',
            array("myproducts" => $myproducts));
    }
}
I've certainly done something wrong because I get this error:
FatalErrorException: Error: Class 'Database\PublicSchema\ProductMap' not found in
/var/www/mywebsite/vendor/pomm/pomm/Pomm/Connection/Connection.php line 153
I'd be very grateful for any help. Thanks.
The problem is in namespacing.
Symfony is trying to call the class MyDatabase\PublicSchema\ProductMap, which should be located in the file MyDatabase/PublicSchema/ProductMap.php, whereas your file is located in Database/PublicSchema/ProductMap.php.
So you should either rename the Database folder to MyDatabase, or change the database name to Database.