I am working on a project which uses PostgreSQL on the backend to handle the database. Since PostgreSQL supports procedural languages such as PL/pgSQL, the existing code contains various user-defined functions.
Now I want to migrate to SQLite, which (as far as I know) has no support for procedural languages. How can I handle this situation in SQLite and reuse the same code fragments (i.e. functions) multiple times by just calling them?
pgmodeler is described as a PostgreSQL Database Modeler.
As far as I know it is for relational database design, and relational database design isn't RDBMS specific.
So is pgmodeler only used for PostgreSQL? Can it be used with other RDBMSs, such as MySQL, SQL Server, or Oracle Database?
What part of pgmodeler is postgresql specific, and what part of it is not?
Thanks.
It is specific to PostgreSQL in the sense that it supports everything that PostgreSQL does: SQL extensions ("create table like..."), its procedural language PL/pgSQL, foreign data wrappers, table partitioning, among many others; and these are usually totally incompatible with other RDBMSs.
But the main developer is studying closely an integration with pgloader, which could make such compatibility possible in the near future.
If you stick to the pure design features of pgmodeler, "keep it classic" and never go for an implementation, then pgmodeler is more or less universal.
Edit: To answer more precisely, the model is "universal"; the code produced when you export the model to a database is specific to PostgreSQL (SQL data types, extensions...).
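To make that concrete, here is a purely illustrative fragment (not actual pgmodeler output) of the kind of PostgreSQL-specific DDL an export can contain:

-- Illustrative only: these constructs are PostgreSQL-specific
CREATE TABLE orders (
    id      bigserial PRIMARY KEY,   -- serial pseudo-types
    payload jsonb NOT NULL,          -- jsonb data type
    tags    text[]                   -- array types
);

CREATE TABLE orders_archive (LIKE orders INCLUDING ALL);  -- "create table like ..."

Other databases would need different types and syntax for most of this, which is why the exported code is not portable.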
I am trying to import into pgAdmin a big table with more than 100 columns. Is there any way to import the table without manually creating those 100 columns in a table within pgAdmin? That would be a considerably time-consuming task.
You are not importing data into pgAdmin, you are importing it into Postgres, and using pgAdmin to help you in that task. Graphical tools like pgAdmin are, at heart, just convenience wrappers around the actual functionality of the database, and everything they do can be done in other ways.
In the case of a simple task like creating a table, the relevant SQL syntax is well worth learning. It will work in any database tool, even (with some minor changes) on other SQL databases (e.g. MySQL), can be saved in version control, and manipulated with an editor of your choice.
You could even go so far as to write a script in the language of your choice that generates the SQL for you based on some other data (e.g. the headings of the CSV file) - although make sure you don't run that with third-party data without checking the result or taking extreme care with code injection and other security concerns!
The Postgres manual has an introduction to tables and creating them which would be a good place to start.
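For example, a minimal sketch (the table and column names here are made up): write the CREATE TABLE once, then bulk-load the CSV with COPY, or with pgAdmin's import dialog, which issues much the same thing:

-- Hypothetical table matching the CSV layout; list every column once
CREATE TABLE big_import (
    id         integer,
    col_a      text,
    col_b      numeric,
    -- ... remaining columns ...
    created_at timestamp
);

-- Server-side load; the file must be readable by the PostgreSQL server.
-- From psql you can use \copy instead to read a file on the client machine.
COPY big_import FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);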
I have two databases, cvtl and cvtl_db, and I need to write a single query to retrieve data from table A in cvtl and table B in cvtl_db.
Postgres is throwing the error: cross-database references are not implemented
Basically you have two ways:
Older tools.
If you need to support older versions of PostgreSQL, use dblink or DBI-link. These two provide robust support for cross-db queries across a number of PostgreSQL versions. pl/proxy is another possibility.
Newer tools.
The newer approach is to use foreign data wrappers. This has more functionality (such as better transaction handling) and probably has more eyes in terms of support than dblink etc. do today.
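A hedged sketch of the foreign data wrapper route (assumes PostgreSQL 9.3 or later with the postgres_fdw extension; the user, password, and column names below are invented for illustration):

-- Run these inside cvtl; they expose table B of cvtl_db as a local foreign table
CREATE EXTENSION postgres_fdw;

CREATE SERVER cvtl_db_srv
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'localhost', dbname 'cvtl_db');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER cvtl_db_srv
    OPTIONS (user 'app_user', password 'secret');   -- adjust credentials

-- Column list is hypothetical; make it match the real definition of B
CREATE FOREIGN TABLE b_remote (id integer, name text)
    SERVER cvtl_db_srv
    OPTIONS (schema_name 'public', table_name 'b');

-- A single query can now combine local A with remote B
SELECT a.id, a.value, b_remote.name
FROM a
JOIN b_remote ON b_remote.id = a.id;

With dblink the idea is similar, except each remote query is passed as a string to the dblink() function.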
I have a PostgreSQL 9.1 server, and I want to write some functions in Python.
There are 2 ways: plpy or psycopg2. For me, writing functions in plpy is like a nightmare, with a lot of "prepare" and "execute" methods... it is more comfortable to use psycopg2, but I care about efficiency.
Is it correct to use psycopg2 on the server?
They are 2 different things. plpy (PL/Python) is a stored procedure language like PL/pgSQL, and psycopg2 is a client library which allows Python code to access a PostgreSQL database. With PL/Python, the function lives inside the database: you make a connection to the database and then call the function from there. With psycopg2, you write Python code on the client that calls the database. Unless you have a specific requirement to write a Python function that runs from within PostgreSQL itself, use psycopg2.
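For illustration, a minimal PL/Python sketch (assuming the plpython3u language is installed on the server); the psycopg2 alternative would instead open a connection from your application and issue queries from there:

CREATE EXTENSION plpython3u;   -- once per database, needs superuser

-- Runs inside the server, within the calling transaction; no client connection involved
CREATE FUNCTION get_greeting(name text) RETURNS text AS $$
  # plpy is available implicitly inside PL/Python functions
  rv = plpy.execute("SELECT current_database() AS db")
  return "Hello %s from %s" % (name, rv[0]["db"])
$$ LANGUAGE plpython3u;

SELECT get_greeting('world');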
For me, writing functions in plpy is like a nightmare
You've already made your mind up. Even if you use pl/python, any bugs in your code will be attributed to the language or its environment. After a while you will be able to abandon your pl/python code and re-implement it the way you always wanted to*.
The beauty of it is that, as Jim says above, the two interfaces do completely different things - one running inside the database within a single transaction, the other outside as normal client code. There's almost no overlap between the use cases for the two languages. Somehow, though, you've missed that even while making up your mind about which is "good" and which is "bad".
*We all have our programming prejudices, but I'm not sure there's much point in opening a question with a bold statement of them.
I want to know what PostgreSQL functions are.
When do I have to write them?
How can I write them?
And how can I call them?
Definition, from wikipedia:
A stored procedure is a subroutine available to applications that access a relational database system.
Advantages of stored procedures in general, from wikipedia:
Overhead: Because stored procedure statements are stored directly in the database, they may remove all or part of the compilation overhead that is typically required in situations where software applications send inline (dynamic) SQL queries to a database. (...)
Avoidance of network traffic: A major advantage with stored procedures is that they can run directly within the database engine. In a production system, this typically means that the procedures run entirely on a specialized database server, which has direct access to the data being accessed. The benefit here is that network communication costs can be avoided completely. This becomes particularly important for complex series of SQL statements.
Encapsulation of business logic: Stored procedures allow programmers to embed business logic as an API in the database, which can simplify data management and reduce the need to encode the logic elsewhere in client programs. (...)
Delegation of access-rights: In many systems, stored procedures can be granted access rights to the database that users who execute those procedures do not directly have.
Some protection from SQL injection attacks: Stored procedures can be used to protect against injection attacks. Stored procedure parameters will be treated as data even if an attacker inserts SQL commands. (...)
In PostgreSQL, stored procedures are called user-defined functions.
Definition example:
CREATE FUNCTION somefunc(quantity integer) RETURNS integer AS $$
DECLARE
    myvariable integer := 2;
BEGIN
    RETURN quantity * myvariable;
END;
$$ LANGUAGE plpgsql;
(You can use other languages to define stored functions in PostgreSQL)
Calling example:
SELECT somefunc(100);
More info: http://www.postgresql.org/docs/9.1/static/server-programming.html
PostgreSQL runs stored procedures in more than a dozen programming languages, including Java, Perl, Python, Ruby, Tcl, C/C++, and its own PL/pgSQL, which is similar to Oracle's PL/SQL.
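As a small illustrative sketch, the same kind of function can also be written in the plain SQL language instead of PL/pgSQL (the parameter is referenced as $1 here so it also works on older releases such as 9.1):

-- Equivalent of the PL/pgSQL example above, written as an SQL-language function
CREATE FUNCTION somefunc_sql(quantity integer) RETURNS integer AS $$
    SELECT $1 * 2;
$$ LANGUAGE sql;

SELECT somefunc_sql(100);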
Whether to use stored procedures depends on your needs and on the logic of your program; in my opinion stored procedures are useful only in some cases, not always.
I've used stored procedures in an application that targets multiple database servers. In that case they can be extremely useful: for example, when a query needs to be modified to run on another type of database server, you can write a stored procedure in each database server and call it from your program, being sure that it runs and returns the wanted result set without any changes in the client code (see the sketch below).
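A hedged sketch of that idea (table and function names are invented): the PostgreSQL side defines the routine, another RDBMS would define an equivalent routine with the same name and parameters in its own dialect, and the client call stays identical.

-- PostgreSQL implementation; another RDBMS would implement the same
-- name/signature in its own procedural language
CREATE FUNCTION active_customers(since date)
RETURNS TABLE (id integer, name text) AS $$
    SELECT c.id, c.name
    FROM customers c           -- hypothetical table
    WHERE c.last_order >= $1;
$$ LANGUAGE sql;

-- Client code issues the same call no matter which backend it talks to
SELECT * FROM active_customers(DATE '2024-01-01');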
To learn how to create stored procedures in PostgreSQL, refer to this page of the documentation.
The main advantage is reducing network traffic overhead. A stored procedure is almost the same (not exactly) as the business logic or logic tier. Its main advantage is enabling dynamic enterprise applications. You can find hundreds of good products that failed just because they lacked a dynamic database structure. Stored procedures, functions, triggers, sequences, indexes and the relational nature of the database are the real keys to creating great applications. My company always tries to reduce client-side logic layers with the help of stored procedures. Most of the critical logic is kept in stored procedures, which makes programmers and testers happy and helps them meet their timelines.