I created a Talend MS SQL job using tMSQLInput_1 and inserted my own query. Below is an example of my query.
if object_id('tempdb..#lang_guid') is not null
    drop table #lang_guid;

create table #lang_guid (
    patient_guid varchar(255)
    ,accountid varchar(255)
);

insert into #lang_guid
select c.customerid
    ,'0000001'
from customer c with(nolock);

select patient_guid
    ,accountid
from #lang_guid;
The issue I'm having is that the query pulls the patient_guid from the table but not the accountid, which I'm creating on the fly in the temp table. When I run the job in Talend it returns the patient_guid, but I don't get any data back for the accountid. Has anyone seen this issue with Talend before, and if so, how do I fix it?
In tMssqlInput, if you click on Guess schema you can see what columns are detected from your query.
Did you define the corresponding schema?
In Talend, when using an input component (a database input or any other input), you have to define a schema by clicking on ... in the component view of tMSSQLInput.
You should have two defined columns in your case.
Make sure that the right Talend schema is defined for every input.
Check that the schema on the tMssqlInput component (defined against the fetch query) has both columns.
Click on Guess schema, which is above the Query box, or click on Edit schema and define the schema yourself.
Alternatively, have a look at the Talend documentation linked below:
https://www.talendforge.org/tutorials/tutorial.php?idTuto=11&nbrFields=10&validate=true
Here are the steps you need to follow:
You need to write the exact query that fetches the fields you seek;
in your case that is the final select from your temp table: select patient_guid, accountid from #lang_guid
Then you have to click Guess schema to populate the metadata for the component.
You can check the fields by clicking the Edit schema button in the component view.
Then connect it to a tLogRow to see whether your intended data is coming through or not.
Otherwise, simply write the query select * from customer c with(nolock), click Guess schema, connect the component to a tLogRow to see which field your data is coming from, then add a tFilterColumns to pick your choice of fields and process the data further.
I hope this was helpful; please provide your feedback.
I'm a real beginner when it comes to SQL and I'm currently trying to build a database using postgres. I have a lot of data I want to put into my database in JSON files, but I have trouble converting it into tables. The JSON is nested and contains many variables, but the behavior of jsonb_populate_record allows me to ignore the structure I don't want to deal with right now. So far I have:
CREATE TABLE raw (records JSONB);
COPY raw from 'home/myuser/mydocuments/mydata/data.txt';
create type jsonb_type as (time text, id numeric);
create table test as (
    select jsonb_populate_record(null::jsonb_type, raw.records) from raw
);
When running the select statement only (without the create table) the data looks great in the GUI I use (DBeaver). However it does not seem to be an actual table as I cannot run select statements like
select time from test;
or similar. The column in my table 'test' is also called 'jsonb_populate_record(jsonb_type)' in the GUI, so something seems to be going wrong there. I do not know how to fix it; I've read about people using lateral joins with json_populate_record, but due to my limited SQL knowledge I can't understand or replicate what they are doing.
jsonb_populate_record() returns a single column (which is a record).
If you want to get multiple columns, you need to expand the record:
create table test
as
select (jsonb_populate_record(null::jsonb_type, raw.records)).*
from raw;
A "record" is a a data type (that's why you need create type to create one) but one that can contain multiple fields. So if you have a column in a table (or a result) that column in turn contains the fields of that record type. The * then expands the fields in that record.
Following the blog of Rob Conery, I have a set of unique IDs across the tables of my Postgres DB.
Now, using these unique IDs, is there a way to query a row in the DB without knowing which table it is in? Or can those tables be indexed such that if the row is not available in the current table, I just increase the index and query the next table?
In short: if you did not prepare for that, then no. You can prepare for it by generating your own UUIDs. For instance, PostgreSQL has UUID generators that embed ordering information, and UUID v5 supports something like namespaces, so you can build a hierarchy. However, that is done by hashing the namespace, and I don't know of a tool to reverse it inside PG.
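For example, assuming the uuid-ossp extension is available (an assumption, not something from the question), you could generate such UUIDs like this:
-- a rough sketch of the generators mentioned above
create extension if not exists "uuid-ossp";
select uuid_generate_v1();                           -- v1: embeds the generation timestamp
select uuid_generate_v5(uuid_ns_url(), 'comments');  -- v5: hashes a namespace plus a name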
If you know all possible tables in advance, you could prepare a query that simply UNIONs a search with a tagged type over all tables. In the case of two tables named comments and news, you could do something like:
PREPARE type_of_id(uuid) AS
SELECT id, 'comments' AS type
FROM comments
WHERE id = $1
UNION
SELECT id, 'news' AS type
FROM news
WHERE id = $1;
EXECUTE type_of_id('8ecf6bb1-02d1-4c04-8875-f1da62b7f720');
Automatically generating this could probably be done by querying pg_catalog.pg_tables and generating the relevant query on the fly.
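A rough sketch of that idea, assuming every table in the public schema has a uuid column named id (adjust to your own naming):
-- builds the text of the tagged UNION query from the catalog
SELECT string_agg(
         format('SELECT id, %L AS type FROM %I.%I WHERE id = $1',
                tablename, schemaname, tablename),
         ' UNION ALL ')
FROM pg_catalog.pg_tables
WHERE schemaname = 'public';
The resulting text can then be fed into PREPARE as above.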
This problem scenario may sound strange, but I am trying to write a trigger that logs the query type into another table, and so far I haven't been able to find anything on Google.
The database I am using is Postgres.
For example:
If I have two tables, table1 and querylog (which has a string field called querytype),
and a select query is executed on table1, I want to insert a row into the querylog table with the querytype field populated with "select".
Does anyone have any idea how to reference the query type in a function that will be called by a trigger?
Triggers do not get called for SELECT queries, so that won't work.
If you want to audit queries, you can use the PostgreSQL log file or tools like pgaudit that hook into PostgreSQL to retrieve and log the information.
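As a hedged sketch of that route (these are standard PostgreSQL and pgaudit settings, but verify them against the versions you run):
ALTER SYSTEM SET log_statement = 'all';   -- write every statement, SELECTs included, to the server log
SELECT pg_reload_conf();                  -- apply the change without a restart

-- with the pgaudit extension loaded (shared_preload_libraries = 'pgaudit'):
-- ALTER SYSTEM SET pgaudit.log = 'read'; -- audit SELECT and COPY ... TO statements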
I'm working on an application that imports data from Access to SQL Server 2008. Currently, I'm using a stored procedure to import the data individually by record. I can't go with a bulk insert or anything like that because the data is inserted into two related tables...I have a bunch of fields that go into the Account table (first name, last name, etc.) and three fields that will each have a record in an Insurance table, linked back to the Account table by the auto-incrementing AccountID that's selected with SCOPE_IDENTITY in the stored procedure.
Performance isn't very good due to the number of round trips to the database from the application. For this and some other reasons, I'm planning to use a staging table instead and import the data from there. Reading up on my options for approaching this, it seems a cursor that executes the same insert stored procedure on the data in the staging table would make sense. However, it appears that cursors are evil incarnate and should be avoided.
Is there any way to insert data into one table, retrieve the auto-generated IDs, then insert data for the same records into another table using the corresponding ID, in a set-based operation? Or is a cursor my only option here?
Look at the OUTPUT clause. You should be able to add it to your INSERT statement to do what you want.
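For example, a minimal sketch (the table and column names here are assumptions, not taken from your schema):
-- capture the identity values generated by a set-based INSERT
DECLARE @NewAccounts TABLE (AccountID int, FirstName varchar(50), LastName varchar(50));

INSERT INTO Account (FirstName, LastName)
OUTPUT inserted.AccountID, inserted.FirstName, inserted.LastName
INTO @NewAccounts (AccountID, FirstName, LastName)
SELECT FirstName, LastName
FROM StagingAccounts;
The catch is that OUTPUT on an INSERT can only reference the inserted pseudo-table, so you need a natural key to join the captured IDs back to your staging rows.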
BTW, if you need to output columns into the second table that weren't inserted into the first one, then use MERGE instead of INSERT (as suggested in the comment to the original question) as its OUTPUT clause supports referencing other columns from the source table(s). Otherwise, keeping it with an INSERT is more straightforward, and it does give you access to the inserted identity column.
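A hedged sketch of the MERGE variant; StagingAccounts, StagingID and Insurance1 are placeholder names, and the ON 1 = 0 condition never matches, so every staging row takes the WHEN NOT MATCHED branch:
DECLARE @IdMap TABLE (StagingID int, AccountID int);

MERGE INTO Account AS tgt
USING StagingAccounts AS src
    ON 1 = 0
WHEN NOT MATCHED THEN
    INSERT (FirstName, LastName)
    VALUES (src.FirstName, src.LastName)
OUTPUT src.StagingID, inserted.AccountID   -- MERGE's OUTPUT may reference source columns
INTO @IdMap (StagingID, AccountID);

-- the second insert is now set-based, joined on the captured identity values
INSERT INTO Insurance (AccountID, InsuranceName)
SELECT m.AccountID, s.Insurance1
FROM StagingAccounts AS s
JOIN @IdMap AS m ON m.StagingID = s.StagingID;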
I experimented with inserting multiple records into related tables using data binding, so try this!
Hopefully it is helpful. Follow the link "How to insert record into related tables" for more information.
I am using Sybase Advantage and I have 2 tables:
The first table has the data records
The second table stores a history of the first
The first table has triggers to populate records in the second table depending on which fields get changed.
I would like to store the connection name (PC which made the request), the name that is displayed in the active queries page (Server Info dialog) and not the username. Does anyone know if this is possible?
Thanks
The following SQL statement can be used to retrieve the computer name instead of the user name.
SELECT *
FROM (EXECUTE PROCEDURE sp_mgGetConnectedUsers()) ConnUsers
WHERE ConnUsers.DictionaryUser = USER();
The stored procedure sp_mgGetConnectedUsers is documented here.