For example, my table contains a column 'skills'.
I want to find the names that have skills in both java and oracle.
I know it's very simple, but I am very new to Oracle 10g; please help me.
I tried:
select name from table_name where skills='java' or skills='oracle';
The name 'c' has skills in java and html; how can I select it?
select name from table_name where skills='java,html';
I don't know which approach I should use; please help me.
This will return any name that has java in its skills list.
select name from table_name where ','||skills||',' like '%,java,%'
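If you want to require both skills with the same delimiter trick, a sketch (using the same hypothetical table and column names) would be:
select name
from table_name
where ','||skills||',' like '%,java,%'
and ','||skills||',' like '%,oracle,%';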
select name from table_name where skills like '%java%' or skills like '%oracle%';
That will give you the names that have skills in java or oracle. If you want names that have both at the same time, change the "or" to "and".
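For example, the "both at the same time" version would be something like this (same hypothetical names; note that '%java%' would also match values such as 'javascript'):
select name from table_name where skills like '%java%' and skills like '%oracle%';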
As shown in your table, if you want to select names whose skills include either java or oracle, you can do:
select name from table_name where skills like '%java%' or skills like '%oracle%'
This will select all the names whose skills contain java or oracle.
I have a database table named 'students' with a column 'name'.
I want to retrieve the names that start with 'miss' but not those that begin with the title 'Miss ' (with a trailing space).
I did:
select name from students where lower(name) like 'miss%' and name not like 'miss %'
The above query returns names with the title as well.
Any help will be highly appreciated.
Do you maybe just need lower() around "name" the second time? For me, with Postgres 9.5, your query works if you say "and lower(name) not like 'miss %'".
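In other words, the corrected query would be something like this (a sketch against the same students table):
select name
from students
where lower(name) like 'miss%'
and lower(name) not like 'miss %';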
The use case is, say you have a table with a lot of columns (100+) and you want to see if a certain column name exists in the table. Another use case is, say there is a naming scheme for the columns in the table that lets me search for a term and see all fields with that name, e.g. all fields related to a payment card are prefixed with "card_".
In MySQL I could handle both cases above by doing a show fields in <table_name> like '%<search_term>%'. I've googled for a solution but have only found results related to filtering actual table names and showing table schemas (e.g. \d+), which is not what I am looking for. I've also tried variations of the MySQL command in the psql shell, but no luck.
I'm looking for a way to do this with SQL or with some other Postgres built-in way. Right now I'm resorting to copying the table schema to a text file and searching through it that way.
You can query information_schema.columns using table_name and column_name. For example:
select table_name, column_name
from information_schema.columns
where table_name = 'users'
and column_name like '%password%';
table_name | column_name
------------+------------------------
users | encrypted_password
users | reset_password_token
users | reset_password_sent_at
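For the prefix use case from the question, a similar sketch could be used (the names here are only illustrative); the backslash escapes the underscore so it is not treated as a single-character wildcard:
select table_name, column_name
from information_schema.columns
where column_name like 'card\_%';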
Scenario: Table with over 100 fields (not my doing... I inherited this)
Only 50 of these fields are required to be displayed on a web site.
They want to maintain the other 50 fields for historical purposes.
There is a possibility that some of the not required fields may become required sometime in the future.
Problem: I'm looking for a way to easily identify the 50 required fields such that I could pull the field names with a query.
Pseudo-query: Select FieldNames from TableName where Required = Yes
Is there a setting I could change?
What about using Extended Properties?
Thanks in advance for any direction you can provide.
Unless I'm missing a nuance to your question, use the INFORMATION_SCHEMA.COLUMNS view. This query identifies all the columns in table dbo.dummy that are required:
SELECT
IC.COLUMN_NAME
FROM
INFORMATION_SCHEMA.COLUMNS IC
WHERE
IC.TABLE_SCHEMA = 'dbo'
AND IC.TABLE_NAME = 'dummy'
AND IC.IS_NULLABLE = 'NO'
After doing more thinking, perhaps you wanted a generic query that would grab all the required columns and then build out the select statement. This query covers that possible request:
DECLARE
@hax varchar(max)
, @schemaName sysname
, @tableName sysname
SELECT
@schemaName = 'dbo'
, @tableName = 'dummy'
; WITH A AS
(
-- this query identifies all the columns that are not nullable
SELECT
IC.TABLE_SCHEMA + '.' + IC.TABLE_NAME AS tname
, IC.COLUMN_NAME
FROM
INFORMATION_SCHEMA.COLUMNS IC
WHERE
IC.TABLE_SCHEMA = @schemaName
AND IC.TABLE_NAME = @tableName
AND IC.IS_NULLABLE = 'NO'
)
, COLUMN_SELECT (column_list) AS
(
-- this query concatenates all the column names
-- returned by the above
SELECT STUFF((SELECT '], [' + A.Column_Name
FROM A
FOR XML PATH('')),1, 2, '')
)
-- Use the above to build a query string
SELECT DISTINCT
@hax = 'SELECT ' + CS.column_list + '] FROM ' + A.tname
FROM
A
CROSS APPLY
COLUMN_SELECT CS
-- invoke the query
EXECUTE (@hax)
How about creating a view that only has the required fields?
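A minimal sketch of that idea, reusing the dbo.dummy table from above with placeholder column names:
create view dbo.RequiredFields
as
select col1, col2, col3 -- only the required columns (placeholder names)
from dbo.dummy;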
I am not sure if I understand the question correctly. Is this what you are looking for? The code is in MS SQL.
select t.name as TABLE_NAME, c.name as COLUMN_NAME, c.is_nullable
from sys.tables t
inner join sys.columns c on c.object_id = t.object_id
WHERE t.name = '<TableName>'
and c.is_nullable = 0
There's no flag you can put on a field to determine whether it's relevant or not -- that's what the SELECT list is for. A couple of ideas...
1) Split the historical data out into a separate table, with a one-to-one relationship to the source table (see the sketch after this list).
2) Rename the historical fields in your table as "OBSOLETE_" + fieldname. This will at least give you a quick visual reference for when you're writing your SQL.
3) Create a view. Big drawback to this one would be that you can take some big performance hits as soon as you try to use the view as a table in other queries. But if you're just pulling off it directly without joining it, you should be fine.
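For option 1, a rough sketch of the split, assuming the source table dbo.dummy has an ID primary key (all other names here are placeholders):
create table dbo.dummy_history
(
ID int not null primary key references dbo.dummy (ID), -- one-to-one with the source table
old_col1 varchar(50) null, -- placeholder historical columns
old_col2 varchar(50) null
);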
We use separate metatables describing all tables and columns in the database. We store information like friendly names (for example, the 'username' column should be displayed to the user as 'User name'), formatting, etc. You could use this approach to store information about required columns.
We have tried object extended properties (sp_addextendedproperty etc.), but the metatable solution worked out better for us.
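A rough sketch of such a metatable, and how the pseudo-query from the question could map onto it (the layout and names are just one possibility):
create table dbo.ColumnMeta
(
table_name sysname not null,
column_name sysname not null,
friendly_name nvarchar(100) null, -- e.g. 'User name' for the username column
is_required bit not null default 0,
primary key (table_name, column_name)
);
select column_name
from dbo.ColumnMeta
where table_name = 'dummy'
and is_required = 1;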
Within T-SQL this is not easy, as you cannot dynamically build the columns in the select list or the alias names for those columns. The parser and query optimizer need some things to be static. Is it an ASP.NET web site? In your development environment (e.g. C#) you could dynamically build the query.
How do I make this work in MySQL?
select ID,COMPANY_NAME,contact1, SUBURB, CATEGORY, PHONE from Victoria where (city in ( select suburb from allsuburbs)) and CATEGORY='Banks'
The statement below works:
select ID,COMPANY_NAME,contact1, SUBURB, CATEGORY, PHONE from Victoria where city in ( select suburb from allsuburbs)
If I add the "and", it gives me an empty result set.
Thanks
Learn how joins work.
select
v.ID,v.COMPANY_NAME,v.contact1,v.SUBURB,v.CATEGORY,v.PHONE
from
Victoria v
inner join allsuburbs s on s.suburb = v.city
where
v.CATEGORY='Banks'
Apart from that, your query does not make a whole lot of sense.
Your table is named Victoria, but it contains a field named city?! Do your other cities have their own tables too?
You have a table named allsuburbs, but your criterion is that Victoria.city equals allsuburbs.suburb, even though a field named Victoria.suburb exists?! What's Victoria.suburb for, then?
Your table is named allsuburbs. Do you have another table that contains suburbs or is this your only one? If it is your only one, the name is redundant.
You have a field contact1. Do you have contact2...contact10 as well? Bad database design.
Why are half of your field names in caps, and not all of them (or none of them)?
Oh, and the usual format for SQL is: SQL keywords in caps, the field names etc. in mixed case/lower case. Much easier to read.
I think you might have misplaced a parenthesis?
.. PHONE from Victoria where
(city in ( select suburb from allsuburbs)) and CATEGORY='Banks'
I'm guessing should be:
.. PHONE from Victoria where
city in ( select suburb from allsuburbs) and CATEGORY='Banks'
Not sure if that makes more sense, but I believe the first case is not a valid SQL statement.
I have a table that has about 6 fields. I want to write a SQL statement that will return all records that do not have "England" in the country field, "English" in the language field, and "english" in the comments field.
What would the SQL query look like?
Well, your question depends a lot on what DBMS you're using and what your table set up looks like. This would be one way to do it in MySQL or TSQL:
SELECT *
FROM tbl
WHERE country NOT LIKE '%England%' AND language NOT LIKE '%english%'
AND comments NOT LIKE '%english%';
The way you word your question makes it sound like all these fields could contain a lot of text, in which case the above query would be the way to go. However, more likely than not you'd be looking for exact matches in a real database:
SELECT *
FROM tbl
WHERE country!='England' AND language!='english'
AND comments NOT LIKE '%english%';
Start with this and modify as necessary:
SELECT *
FROM SixFieldTable
WHERE Country <> 'England'
AND language <> 'english'
AND comments NOT LIKE '%english%'
Hope this helps.
Are you wanting something like
select * from myTableOfMadness
where country <> 'England'
and language <> 'English'
and comments not like '%english%'
Not sure if you want 'and's or 'or's, or all 'not' comparisons. Your sentence structure is somewhat misleading.
The above solutions do not appear to account for possible nulls in the columns. The likes of
Where country <> 'England'
will erroneously exclude entries where Country is null, under default SQL Server connection settings.
Instead, you could try using
IsNull(Country, '') <> 'England'
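Applied to the earlier query, that would look something like this (COALESCE would also work and is more portable):
SELECT *
FROM SixFieldTable
WHERE ISNULL(Country, '') <> 'England'
AND ISNULL(language, '') <> 'english'
AND ISNULL(comments, '') NOT LIKE '%english%'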
To ignore case:
SELECT *
FROM SixFieldTable
WHERE LOWER(Country) <> 'england' AND
LOWER(language) <> 'english' AND
LOWER(comments) NOT LIKE '%english%'
Try this:
Select * From table
Where Country Not Like '%England%'
And Language Not Like '%English%'
And comments Not Like '%English%'