One query to run on two different tables with the same fields so that data comes in one go rather than running two different queries - dbeaver

What will be the one query to run on two different tables with the same fields, so that the data comes in one go rather than running two different queries and then joining the two data sets?
Example of tables: Postpaid and Prepaid, holding a database of customer requests and complaints.
I tried writing one query but it is not running.
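The standard SQL answer here is UNION ALL, with a literal column to record which table each row came from. A minimal runnable sketch using Python's sqlite3 with an in-memory database (the table names Postpaid/Prepaid come from the question; the columns customer_id and complaint are assumptions for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical tables with the same fields (column names assumed).
cur.execute("CREATE TABLE Postpaid (customer_id INTEGER, complaint TEXT)")
cur.execute("CREATE TABLE Prepaid  (customer_id INTEGER, complaint TEXT)")
cur.execute("INSERT INTO Postpaid VALUES (1, 'billing issue')")
cur.execute("INSERT INTO Prepaid  VALUES (2, 'recharge failed')")

# UNION ALL stacks the rows of both tables into one result set;
# the literal 'source' column records which table each row came from.
rows = cur.execute("""
    SELECT 'Postpaid' AS source, customer_id, complaint FROM Postpaid
    UNION ALL
    SELECT 'Prepaid'  AS source, customer_id, complaint FROM Prepaid
""").fetchall()

print(rows)
```

The same SELECT ... UNION ALL ... statement can be pasted into DBeaver's SQL editor against the real tables; use UNION instead of UNION ALL only if you also want duplicate rows removed.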

Related

Tableau Virtual Connections: Multiple tables in a single VC vs multiple VCs with one table in each?

Hello Tableau Community,
Our team is considering switching to virtual connections to better maintain row-level security.
However, we are not sure what the best choice is when it comes to grouping multiple tables in a single connection versus separating them into more VCs.
We have all our tables in a single database and plan on using "Extracts Only" VCs.
Here are some of our concerns (which might be rooted in misconceptions):
Concerns about having multiple tables in one VC:
Will extracts become big and slow? What happens if one table fails - will the entire extract fail?
Concerns about having multiple VCs with a single table (or a few tables) in each:
How can we relate the tables in workbooks? The relationship model does not seem to work across different published data sources.
It also seems cumbersome to use multiple connections in one workbook.
Any recommendations or insights?

PostgreSQL data warehouse: create separate databases or different tables within the same database?

We seek to run cross-table queries and perform different types of merges. For cross-database queries we would need to establish a connection every time.
So to answer your question: you should create multiple (different) tables within the same database, because cross-database operations are not supported (for example, you cannot join two tables that live in different databases).
If you want to segregate your data within the same database, you can create different schemas/layers and create your tables under those.
For example:
1st load: landingLayer.tablename
2nd transformation: goodDataLayer.tablename
3rd transformation: widgetLayer.tablename
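The layered-schema idea above can be sketched as runnable code. This uses sqlite3's ATTACH to mimic schema-qualified table names; in PostgreSQL you would instead run CREATE SCHEMA landingLayer; and so on. The table orders and its columns are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Mimic PostgreSQL schemas with attached in-memory databases
# (in PostgreSQL: CREATE SCHEMA landingLayer; CREATE SCHEMA goodDataLayer;).
cur.execute("ATTACH ':memory:' AS landingLayer")
cur.execute("ATTACH ':memory:' AS goodDataLayer")

# 1st load: raw data lands as-is in the landing layer.
cur.execute("CREATE TABLE landingLayer.orders (id INTEGER, amount TEXT)")
cur.execute("INSERT INTO landingLayer.orders VALUES (1, ' 10.5 ')")

# 2nd transformation: cleaned, typed data moves to the good-data layer.
cur.execute("""
    CREATE TABLE goodDataLayer.orders AS
    SELECT id, CAST(TRIM(amount) AS REAL) AS amount
    FROM landingLayer.orders
""")

cleaned = cur.execute("SELECT id, amount FROM goodDataLayer.orders").fetchall()
print(cleaned)
```

Because both layers live in the same database, the transformation step is a plain single-connection query across schema-qualified names, which is exactly what the cross-database setup would not allow.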

Does EntityFramework allow you to map a single model to multiple tables?

My project will allow users to create and work on multiple projects, so most of my tables will accordingly have a ProjectID column.
However, one of those tables, let's call it ProjectData, will become very large very quickly, as each project can have hundreds of thousands of rows. To me it makes more sense to have a table for each project, thus 001Data, 002Data, etc.
Does EntityFramework allow for such single model -> multiple table mappings? Perhaps it would be better to have separate databases for each project, but that too provides its own set of challenges.

(JasperReports) Combine data from different datasources as columns of the same report row

I am evaluating JasperReports (CE) as a reporting solution for one of my clients.
So far I like it very much and it looks like a pretty solid platform. One thing I cannot find info about is the possibility of combining the results of sub-queries against different datasources in one report (not as drill-down sub-reports, but as different columns of the same row).
For example: there is some product info in one database (Firebird), but the sales info, actual stock, and purchase prices are stored in a different system, which uses a different database (Microsoft SQL Server). In both databases, products are identified by the same unique product code. So I need to query the first database to obtain the "master recordset" that fills some report columns, and then query the second database for each product's additional info, combining the data from both datasources into the same report row as different columns.
Is it possible with JasperReports? If not, I'd appreciate your suggestions on other reporting solutions being able to fulfill my request.
Since your row data comes from different DBs, you need to query the required tables in both DBs, build a bean datasource from the result sets, and pass it on to JasperReports.
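The stitching step behind that bean-datasource approach can be sketched in application code: query both databases, index the second result set by the shared product code, and merge each master row with its match. The row values below are made up for illustration; in a real setup the two lists would come from the Firebird and SQL Server queries:

```python
# Hypothetical result sets from the two databases, sharing a product code.
firebird_rows = [
    {"code": "A1", "name": "Widget"},
    {"code": "B2", "name": "Gadget"},
]
sqlserver_rows = [
    {"code": "A1", "stock": 5, "price": 9.99},
    {"code": "B2", "stock": 0, "price": 4.50},
]

# Index the second result set by product code for O(1) lookup, then
# stitch both sources into one combined row per product. This combined
# list is what you would wrap in a collection/bean datasource.
by_code = {row["code"]: row for row in sqlserver_rows}
combined = [
    {**master, **by_code.get(master["code"], {})}
    for master in firebird_rows
]

print(combined)
```

In Java the equivalent merged list of beans would be wrapped in a JRBeanCollectionDataSource and passed to the report, so every report row carries columns from both systems.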

Tableau - How to query large data sources efficiently?

I am new to Tableau, am having performance issues, and need some help. I have a query that joins several large tables, and I am using a live data connection to a MySQL database.
The issue I am having is that Tableau is not applying the filter criteria before asking MySQL for the data. It essentially does a SELECT * of my query without pushing the filter criteria into the WHERE clause: it pulls all the data from the MySQL database back to Tableau, then throws away the unneeded rows based on my filter criteria. My two main filter criteria are account_id and a date range.
I can cleanly get the list of accounts by doing a SELECT from my account table to populate the filter list; I then need to know how to apply that selection when Tableau pulls the data for the main query from MySQL.
To apply a filter at the data source first, try using context filters.
Performance can also be improved by using extracts.
I would personally use an extract: go into your MySQL back-end and run the query inside a CREATE TABLE extract1 AS statement (or whatever you want to call your data table).
When you import this table into Tableau, a SELECT * will already return your aggregated data in the workbook, and your query efficiency will increase considerably.
Unfortunately, the total time is still Tableau processing time plus the MySQL back-end query time, so it will take a while to process your data.
Try the extracts...
I've been struggling with the very same thing. I have found that Tableau extracts aren't any faster than pulling directly from a SQL table. What I have done is create tables within SQL that already contain the filtered data, so the SELECT * returns only the needed rows. The downside is that this takes up more space on the server, but that isn't a problem in my case.
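The pre-filtered-table approach from the answers above can be sketched with sqlite3 (the orders table, its columns, and the filter values are assumptions standing in for the real MySQL schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source table (names and values made up for illustration).
cur.execute("CREATE TABLE orders (account_id INTEGER, order_date TEXT, total REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        (1, "2024-01-05", 10.0),
        (2, "2024-01-06", 20.0),
        (1, "2023-12-01", 5.0),
    ],
)

# Materialize a pre-filtered table on the database side so that
# Tableau's SELECT * only ever pulls the rows you actually need
# (here: one account and one date range).
cur.execute("""
    CREATE TABLE extract1 AS
    SELECT * FROM orders
    WHERE account_id = 1
      AND order_date BETWEEN '2024-01-01' AND '2024-12-31'
""")

filtered = cur.execute(
    "SELECT account_id, order_date, total FROM extract1"
).fetchall()
print(filtered)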
For large data sets, Tableau recommends using an extract.
An extract creates a snapshot of the data you are connected to, and processing against this snapshot is faster than a live connection.
All the charts and visualizations will load faster, saving you time each time you open the dashboard.
The filters you use on the data set will also work faster over an extract connection. But to get the latest data you have to refresh the extract, or schedule a refresh on the server (if you are uploading the report to the server).
There are multiple types of filters available in Tableau; which to use depends on your application. Context filters and global filters can be used to filter the whole data set.