Oracle SQL multiple row filtering - oracle-sqldeveloper

I have 2 tables and I want to filter one table based on the other.
One table gives me file numbers and the other gives multiple statuses per file.
For example:
File No   Status
-------   ------
12345     U-APP
12345     U-FCL
12345     ACT
123456    ACT
123456    CLSOE
My goal is that files with certain statuses will disappear entirely.
I tried the following:
Select File_no, Status from mytable
Where Status not in ('U-APP', 'U-FCL')
I get the following:
File No   Status
-------   ------
12345     ACT
123456    ACT
123456    CLSOE
My goal is that file 12345 will disappear entirely after the filtering...
Thanks

Try this:
Select File_no, Status
From mytable
Where File_no not in (Select File_no
                      From mytable
                      Where Status in ('U-APP', 'U-FCL'))
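A NOT EXISTS form is an equivalent alternative that also sidesteps the surprises NOT IN has when the subquery returns NULLs; a minimal sketch, assuming the table is mytable with columns File_no and Status:
Select t.File_no, t.Status
From mytable t
Where not exists (Select 1
                  From mytable x
                  Where x.File_no = t.File_no
                    and x.Status in ('U-APP', 'U-FCL'))
Either way, every row for a file is dropped as soon as any of its rows carries one of the excluded statuses.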

Related

PostgreSQL how do I COUNT with a condition?

Can someone please assist with a query I am working on for school, using a sample database from the PostgreSQL tutorial? Here is my query that gets me the raw data, which I currently export to Excel and put in a pivot table to get the needed counts. The goal is a query that does the counting itself, so I don't have to do the manual extraction to Excel and the subsequent pivot table:
SELECT
i.film_id,
r.rental_id
FROM
rental as r
INNER JOIN inventory as i ON i.inventory_id = r.inventory_id
ORDER BY film_id, rental_id
;
This gives me a list of films (by film_id) showing each time the film was rented (by rental_id). That query works fine if I am just exporting to Excel. Since I don't want that manual process, I need to add to my query a count of how many times a given film (by film_id) was rented. The results should look something like this (just showing the first five here; the query need not limit itself to five):
film_id | COUNT of rental_id
1 | 23
2 | 7
3 | 12
4 | 23
5 | 12
Database setup instructions can be found here: LINK
I have tried using COUNTIF and CASE (following other posts here) and I can't get either to work. Please help.
Did you try this?
SELECT
i.film_id,
COUNT(1)
FROM
rental as r
INNER JOIN inventory as i ON i.inventory_id = r.inventory_id
GROUP BY i.film_id
ORDER BY film_id;
If the same rental_id can appear more than once in your data, you may want to use COUNT(DISTINCT r.rental_id).
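Since the title asks about counting with a condition: PostgreSQL (9.4+) also supports a FILTER clause on aggregates, which lets you compute several conditional counts in one pass. A minimal sketch, treating the return_date column on rental as an assumption about your schema:
SELECT
    i.film_id,
    COUNT(*) AS rentals,
    COUNT(*) FILTER (WHERE r.return_date IS NULL) AS not_yet_returned
FROM
    rental AS r
    INNER JOIN inventory AS i ON i.inventory_id = r.inventory_id
GROUP BY i.film_id
ORDER BY i.film_id;
The CASE equivalent of the filtered count is COUNT(CASE WHEN r.return_date IS NULL THEN 1 END).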

How do I show distinct values from one column along with one of its corresponding rows?

How can I show distinct values from one column, and show one of the values from another column on the same row? This is probably easiest to explain with example outputs, shown below.
The query should only return the same number of rows as there are distinct values in the permission column, in this case, two.
Where the permission column has select, that row will either show Alice or Bob in the username column. Where the permission column has insert, that row will either show Carol, Alice, Bob, or David in the username column.
It's okay if different usernames are returned each time. Performance is not a critical factor - there will be fewer than a few dozen rows per table.
users_and_permissions table
username | permission
---------+-----------
Alice    | select
Bob      | select
Carol    | insert
Alice    | insert
Bob      | insert
David    | insert
Output #1
username | permission
---------+-----------
Alice    | select
Carol    | insert
Output #2
username | permission
---------+-----------
Bob      | select
Bob      | insert
Output #3
username | permission
---------+-----------
Bob      | select
David    | insert
Output #4
username | permission
---------+-----------
Alice    | select
David    | insert
Edit: I deleted my SQL from the original post because it was using two tables and a join with a DISTINCT ON clause.
Check out the DISTINCT ON clause at https://www.postgresql.org/docs/current/sql-select.html
SELECT
DISTINCT ON (permission)
username,
permission
FROM
users_and_permissions
ORDER BY
permission;
If you want control over which username is displayed, add a second sort key after permission to the ORDER BY at the end of the statement; DISTINCT ON keeps the first row of each group in that sort order.
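For instance, a minimal sketch that always keeps the alphabetically first username for each permission:
SELECT
    DISTINCT ON (permission)
    username,
    permission
FROM
    users_and_permissions
ORDER BY
    permission, username;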

How to generate multiple formatted files in Talend Open Studio based on MySQL tables?

I need to pull data from a MySQL DB and create formatted files depending on the data. How can I do that with Talend Open Studio?
The MySQL DB has one table (user_id, order_id, purchase_date) and I need to generate a CSV file for each user containing his orders. File names should include the user_id (output files could be like user_id.csv).
Thanks
You can try the flow below:
tMysqlInput ---> tFlowToIterate --- (iterate) ---> tMysqlInput ---> tFileOutputDelimited
More details given below:
tMysqlInput (select user_id from table group by user_id) --- row Main ---> tFlowToIterate (uncheck the "use the default key" option, create a new key called user_id and set its value to user_id in the dropdown) --- Iterate ---> tMysqlInput (query the orders filtered by the user_id held in (String)globalMap.get("user_id"); see the expressions written out below) --- row Main ---> tFileOutputDelimited (set the file name to (String)globalMap.get("user_id") + ".csv").
To summarize: you first get the list of all distinct user_id values, then you iterate through each of them, fetch the orders for that user_id by applying the filter, and use the user_id value from the global variable in the file name.
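Written out as Talend expressions (these fields are Java string expressions; the table name orders and the exact quoting are assumptions to adapt to your schema), the second tMysqlInput query and the tFileOutputDelimited file name might look like:
Query of the second tMysqlInput:
"select user_id, order_id, purchase_date from orders where user_id = '" + (String)globalMap.get("user_id") + "'"
File Name of tFileOutputDelimited:
"/output/" + (String)globalMap.get("user_id") + ".csv"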

Fetching a table with another table's conditions

I have two tables like this:
Table Name: users
emx | userid
---------------
1 | 1
2 | 2
and another table called bodies
id | emx | text
--------------------------
1 | 1 | Hello
2 | 2 | How are you?
As you can see, the bodies table has emx, which holds id numbers from the users table. Now, when I want to fetch the messages that contain Hello, I first search bodies and get the emx numbers, and after that I fetch the users table with those emx numbers. So I am doing 2 SQL queries to find them.
All I want to do is make this happen in 1 SQL query.
I tried some queries which were not correct, and I also tried a JOIN. No luck yet. I just want to fetch rows from the users table where the message in the bodies table contains 'Hello'.
Note: I am using PostgreSQL 9.1.3.
Any idea / help is appreciated.
Read the docs on how to join tables.
Try this:
SELECT u.emx, u.userid, b.id, b.text
FROM bodies b
JOIN users u USING (emx)
WHERE b.text ~ 'Hello';
This is how I'd do the join. I've left out the exact containment test.
SELECT users.userid
FROM users JOIN bodies ON (users.emx = bodies.emx)
WHERE ⌜true if bodies.text contains ?⌟
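For completeness, the containment test could be a case-insensitive substring match instead of the regex match used above; a minimal sketch, assuming you want any message containing 'hello' regardless of case:
SELECT u.userid
FROM users u
JOIN bodies b ON (b.emx = u.emx)
WHERE b.text ILIKE '%hello%';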

Export data from db2 with column names

I want to export data from DB2 tables to CSV format. I also need the first row to contain the column names.
I have had limited success using the following command:
EXPORT TO "TEST.csv"
OF DEL
MODIFIED BY NOCHARDEL coldel: ,
SELECT col1,'COL1',x'0A',col2,'COL2',x'0A'
FROM TEST_TABLE;
But with this i get data like
Row1 Value:COL1:
Row1 Value:COL2:
Row2 Value:COL1:
Row2 Value:COL2:
etc.
I also tried the following query
EXPORT TO "TEST.csv"
OF DEL
MODIFIED BY NOCHARDEL
SELECT 'COL1',col1,'COL2',col2
FROM ADMIN_EXPORT;
But this lists the column names alongside each row of data when opened in Excel.
Is there a way I can get data in the format below
COL1 COL2
value value
value value
when opened in Excel.
Thanks
After days of searching I solved this problem this way:
EXPORT TO ...
SELECT 1 as id, 'COL1', 'COL2', 'COL3' FROM sysibm.sysdummy1
UNION ALL
(SELECT 2 as id, COL1, COL2, COL3 FROM myTable)
ORDER BY id
You can't select a constant string in DB2 from nothing, so you have to select from sysibm.sysdummy1.
To get the manually added column names into the first row, you have to add a pseudo-id and sort the UNION result by that id. Otherwise the header row can end up at the bottom of the resulting file.
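One wrinkle: the branches of a UNION must have compatible types, so numeric columns usually need to be cast to character in the data branch, and wrapping the UNION in a nested table expression keeps the pseudo-id out of the exported file. A minimal sketch, assuming a table myTable where COL1 is numeric and COL2/COL3 are character (adjust names and delimiter options to your schema):
EXPORT TO "TEST.csv" OF DEL MODIFIED BY NOCHARDEL
SELECT c1, c2, c3
FROM (
    SELECT 1 AS id, 'COL1' AS c1, 'COL2' AS c2, 'COL3' AS c3
    FROM sysibm.sysdummy1
    UNION ALL
    SELECT 2 AS id, VARCHAR(COL1) AS c1, COL2 AS c2, COL3 AS c3
    FROM myTable
) AS t
ORDER BY id;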
Quite an old question, but I recently encountered a similar one and realized this can be achieved much more easily in the 11.5 release with the EXTERNAL TABLE feature; see the answer here:
https://stackoverflow.com/a/57584730/11946299
Example:
$ db2 "create external table '/home/db2v115/staff.csv'
using (delimiter ',' includeheader on) as select * from staff"
DB20000I The SQL command completed successfully.
$ head /home/db2v115/staff.csv | column -t -s ','
ID  NAME      DEPT  JOB    YEARS  SALARY    COMM
10  Sanders   20    Mgr    7      98357.50
20  Pernal    20    Sales  8      78171.25  612.45
30  Marenghi  38    Mgr    5      77506.75
40  O'Brien   38    Sales  6      78006.00  846.55
50  Hanes     15    Mgr    10     80659.80
60  Quigley   38    Sales         66808.30  650.25
70  Rothman   15    Sales  7      76502.83  1152.00
80  James     20    Clerk         43504.60  128.20
90  Koonitz   42    Sales  6      38001.75  1386.70
Insert the column names as the first row in your table.
Use ORDER BY to make sure that the row with the column names comes out first.