Basically I want to retrieve rows of data that meet my clause conditions using Power Query.
I have 400 rows of lookup values in my spreadsheet.
Each row represents one lookup code, for example AAA1, AAB2 and so on.
So let's say I have a select statement and I want to construct the where clause using the above codes, so my final SQL statement will look like
select * from MyTable where Conditions in ('AA1', 'AAB2')
So far I have this:
let
    Source = Excel.CurrentWorkbook(){[Name="Table5"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Form ID", Int64.Type}}),
    test = Sql.Database("myserver", "myDB", [Query="SELECT * FROM myTable where" & #"Changed Type" & "])"
in
test
Obviously that didn't work, but that's my pseudo scenario anyway.
Please could you advise what to do?
Thank you
Peddie
I would create a "lookup" Power Query based on the Excel table. I would set the "Load To" properties to "Only Create Connection".
Then I would start the main Query by connecting to the SQL server using the Navigator to select "MyTable". Then I would add a Merge step to the main Query, to join to the "lookup" Query, matching the "Conditions" column to the "lookup" code. I would set the Join Type to "Inner". The Merge properties window will show you visually if the 2 columns you select actually contain matching data.
This approach does not require any coding, and is easier to build, extend and maintain.
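For reference, the merge that this approach produces looks roughly like this in M (a sketch only; the schema, table and column names below are assumptions based on the question):
let
    LookupCodes = Excel.CurrentWorkbook(){[Name="Table5"]}[Content],
    Source = Sql.Database("myserver", "myDB"),
    MyTable = Source{[Schema="dbo", Item="myTable"]}[Data],
    // Inner join keeps only the rows of myTable whose Conditions value appears in the lookup table
    Merged = Table.NestedJoin(MyTable, {"Conditions"}, LookupCodes, {"Form ID"}, "Lookup", JoinKind.Inner)
in
    Merged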
Mike Honey's join is best for your problem, but here's a more general solution if you find yourself needing other logic in your where clause.
Normally Power Query only generates row filters for equality expressions, but you can put any code you want in a Table.SelectRows filter, like each List.Contains({"AA1", "AAB2"}, [Conditions])
So for your table, your query would look something like:
let
    Source = Excel.CurrentWorkbook(){[Name="Table5"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Form ID", Int64.Type}}),
    test = Sql.Database("myserver", "myDB"),
    yourTable = test{[Name="myTable"]}[Data],
    filtered = Table.SelectRows(yourTable, each List.Contains(#"Changed Type"[Form ID], [Conditions]))
in
filtered
The main downside to using the library functions is that Table.SelectRows only knows how to generate SQL where clauses for specific expression patterns, so the row filter probably runs on your machine after downloading the whole table, instead of having SQL Server run the filter.
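If you do need the filter to run on the server, another option is to build the IN list as text and pass it to Sql.Database's Query option, roughly what the question attempted. A sketch only, assuming the codes live in a text column called "Form ID"; note this is plain string concatenation rather than a parameterized query:
let
    Codes = Excel.CurrentWorkbook(){[Name="Table5"]}[Content][Form ID],
    // Turn the 400 codes into 'AAA1', 'AAB2', ... for the IN clause
    InList = Text.Combine(List.Transform(Codes, each "'" & Text.From(_) & "'"), ", "),
    Result = Sql.Database("myserver", "myDB",
        [Query = "SELECT * FROM myTable WHERE Conditions IN (" & InList & ")"])
in
    Result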
The program just selects everything if the CARRID is OK, even if it does not match lt_spfli. And if there aren't any entries with that CARRID, it gets a runtime error. If I try it with FOR ALL ENTRIES, it just selects absolutely the entire SFLIGHT table.
PARAMETERS: pa_airp TYPE S_FROMAIRP,
pa_carid TYPE S_CARR_ID.
DATA: lt_spfli TYPE RANGE OF SPFLI,
lt_sflight TYPE TABLE OF SFLIGHT.
SELECT CONNID FROM SPFLI
INTO TABLE lt_spfli
WHERE AIRPFROM = pa_airp.
SELECT * FROM SFLIGHT
INTO TABLE lt_sflight
WHERE CARRID = pa_carid AND CONNID in lt_spfli.
I just suppose that you want every flight connection from a given airport...
Notice that a RANGE structure has two more fields in front of the actual "compare value", so selecting directly into it will result in a very gibberish table.
Possible Solutions:
Selecting with RANGE
If you really want to use this temporary table, you can have a look at my answer here where I describe the way to fill RANGEs without any overhead. After this step, your current snippet will work the way you wanted it to. Just make sure that it really has been filled, or everything will be selected.
Selecting with FOR ALL ENTRIES
Before you use this variant you should make absolutely sure that your specified data object is filled. Otherwise it will result in the same mess as the first solution. To do that, you could write:
* select connid
IF lt_spfli[] IS NOT INITIAL.
* select on SFLIGHT
ELSE.
* no result
ENDIF.
Selecting with JOIN
The "correct" approach in this case would be a JOIN like:
SELECT t~*
  FROM spfli AS i
  JOIN sflight AS t
    ON t~carrid = @pa_carid
    AND t~connid = i~connid
  INTO TABLE @DATA(li_conns)
  WHERE i~airpfrom = @pa_airp.
Use FOR ALL ENTRIES instead of CONNID IN lt_spfli.
Like so:
SELECT *
  FROM sflight
  INTO TABLE lt_sflight
  FOR ALL ENTRIES IN lt_spfli
  WHERE carrid = pa_carid
    AND connid = lt_spfli-connid.
You are misunderstanding what a "Ranges Table" is. You fill it incorrectly.
This part of your code demonstrates the misunderstanding (with a little debug, you would see the erroneous contents immediately):
DATA: lt_spfli TYPE RANGE OF SPFLI.
SELECT CONNID FROM SPFLI INTO TABLE lt_spfli ...
A "Ranges Table" is an internal table with 4 components (SIGN, OPTION, LOW, HIGH), used in Open SQL to do complex selections on one database column (NB: it can also be used in several ABAP statements to test the value of an ABAP variable).
So, with your SQL statement, you only initialize the first component of the Ranges table, while you should transfer CONNID into the third component.
In "modern" Open SQL, you'd better do:
SELECT 'I' as SIGN, 'EQ' as OPTION, CONNID as LOW FROM SPFLI INTO TABLE @lt_spfli ...
For more information about Ranges Tables, you may refer to the answer here: What actually high and low means in a ranges table
I have a MongoDB database and I'm trying to write Eloquent code to change some fields before using them in WHERE or ORDER BY clauses, something like this SQL query:
Select ag.*, ht.*
from agency as ag inner join hotel as ht on ag.hotel_id = ht.id
Where ht.title = 'OrangeHotel'
-- or --
Select ag.*, ht.*
from agency as ag inner join hotel as ht on ag.hotel_id = ht.id
Order by ht.title
Sometimes there is no other table and I just need to use a calculated field in the WHERE or ORDER BY clause:
Select *
from agency
Where func(agency_admin) = 'testAdmin'
Select *
from agency
Order by func(agency_admin)
where func() is my custom function.
Any suggestions?
I have read Laravel 4/5, order by a foreign column, which covers half of my problem, but I don't know how I can use it.
For the first query: MongoDB only supports "joins" partially, via the aggregation pipeline, which limits your aggregation to one collection. For "joins" between different collections/tables, just select from the collections one by one: first the one containing the "where" field, then the one that should "join" with the former, and so on.
The second question puzzled me for a few minutes until I saw this question and realized it's the same as your first question: sort the collection containing your sort field and retrieve some data, then go to the other.
For the 3rd question, this question should serve you well.
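As a minimal Eloquent sketch of that "query one collection, then the other" approach (the Hotel and Agency model names are assumptions; this is the application-side equivalent of your first SQL join):
// First query the collection that holds the WHERE field...
$hotelIds = Hotel::where('title', 'OrangeHotel')->pluck('id');
// ...then "join" manually by filtering the second collection on those ids.
$agencies = Agency::whereIn('hotel_id', $hotelIds)->get();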
I have a query like this, which we use to generate data for our custom dashboard (a Rails app):
SELECT AVG(wait_time) FROM (
SELECT TIMESTAMPDIFF(MINUTE,a.finished_time,b.start_time) wait_time
FROM (
SELECT max(start_time + INTERVAL avg_time_spent SECOND) finished_time, branch
FROM mytable
WHERE name IN ('test_name')
AND status = 'SUCCESS'
GROUP by branch) a
INNER JOIN
(
SELECT MIN(start_time) start_time, branch
FROM mytable
WHERE name IN ('test_name_specific')
GROUP by branch) b
ON a.branch = b.branch
HAVING avg_time_spent between 0 and 1000)t
GROUP BY week
Now I am trying to port this to Tableau, and I am not able to find a way to represent this data in Tableau. I am stuck at how to represent the inner GROUP BY in a calculated field. I can also try to just use a custom SQL data source, but I am already using another data source.
Columns in mytable:
start_time
avg_time_spent
name
branch
status
I think this could be achieved with the new Level of Detail formulas, but unfortunately I am stuck on version 8.3.
Save custom SQL for rare cases. This doesn't look like a rare case. Let Tableau generate the SQL for you.
If you simply connect to your table, then you can usually write calculated fields to get the information you want. I'm not exactly sure why you have test_name in one part of your query but test_name_specific in another, so ignoring that, here is a simplified example of a similar query.
If you define a calculated field called worst_case_test_time as
datediff('minute', min(start_time), max(dateadd('second', avg_time_spent, start_time)))
it seems close to what your original query computes.
It would help if you explained what exactly you are trying to compute. It appears to be some sort of worst case bound for avg test time. There may be an even simpler formula, but it's hard to know without a little context.
You could filter on status = "Success" and avg_time_spent < 1000, and place branch and WEEK(start_time) on say the row and column shelves.
P.S. Your query seems a little off. Don't you need an aggregation function like MAX or AVG after the HAVING keyword?
I have a dataset that is a query with a where clause like 'where field1 like @parameter1'. parameter1 is a string defined as a parameter in Dataset1. I have various text boxes that call the dataset with expressions like =First(Fields!field_xx.Value, "Dataset1"). For each textbox I would like to specify a different value for @parameter1 when it calls Dataset1. How can I modify the expression in each textbox so as to call Dataset1 from each of them with a hardcoded value for @parameter1?
the query:
SELECT TOP (1) job.job_id, job.originating_server, job.name, job.enabled, job.description, job.start_step_id, job.category_id, job.owner_sid, job.notify_level_eventlog,
job.notify_level_email, job.notify_level_netsend, job.notify_level_page, job.notify_email_operator_id, job.notify_netsend_operator_id, job.notify_page_operator_id,
job.delete_level, job.date_created, job.date_modified, job.version_number, job.originating_server_id, job.master_server, activity.session_id, activity.job_id AS Expr1,
activity.run_requested_date, activity.run_requested_source, activity.queued_date, activity.start_execution_date, activity.last_executed_step_id,
activity.last_executed_step_date, activity.stop_execution_date, activity.job_history_id, activity.next_scheduled_run_date, steps.step_name
FROM sysjobs_view AS job INNER JOIN
sysjobactivity AS activity ON job.job_id = activity.job_id INNER JOIN
sysjobsteps AS steps ON activity.last_executed_step_id = steps.step_id AND activity.job_id = steps.job_id
WHERE (job.name LIKE 'Actual Job Name')
ORDER BY activity.start_execution_date DESC
It is not possible to call a dataset with different parameters in the same report execution. Every execution and rendering of the report fetches each dataset only once.
This means that you have to construct your dataset in a way so that it returns all the data you need, to populate each of your textboxes.
Depending on your data model, you may want to add more columns to your dataset, or return the data in multiple rows. If you have multiple rows, then you can use the Lookup function in an expression to pick out the right row in each individual textbox.
Perhaps if you elaborated a little more on what your report should look like, and what the structure of the data you are fetching is, it would be possible to give a better answer to how to solve your problem with a single dataset.
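For example, if Dataset1 returned one row per filter value, with columns filter_value and field_xx (the column names here are just illustrative), each textbox expression could look something like:
=Lookup("ValueForThisTextbox", Fields!filter_value.Value, Fields!field_xx.Value, "Dataset1")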
I have a database that has two tables that need to be linked, but in one table the data is padded with zeros. For example, the tables may look like this:
CUSTOMER.CUSTNUM = 00000000123456
CUSTOMERPHONE.CUSTNUM = 123456
I can't figure out how to get these tables to properly join.
What I'm trying to do now is trick Crystal Reports into specifying the join clause by adding the following to the Select Expert:
Right ({CUSTOMER.CUSTNUM}) = {CUSTOMERPHONE.CUSTNUM}
That's not working though, and I get no records at all in my report.
Any ideas?
Crystal doesn't like heterogeneous joins.
Options:
use a command object, which will give you more control over the linkage
create a SQL Expression that performs the desired concatenation; link fields in the record-selection formula
use a subreport for the linked table
alter the table to make the data types compatible
create a SQL view that performs the joins
First thing, why does CUSTOMER.CUSTNUM have leading zeros in the first place? It seems to me that it should be a NUMERIC data type instead of a VARCHAR. CUSTNUM should be consistent in all of the tables. Just a thought.
Anyway, to answer your question, you could try creating a SQL Command in Crystal to join the two tables. In the join, just use your database's function for converting from a varchar to a number. For example, in Access you could do:
SELECT *
FROM `Customer`
LEFT OUTER JOIN `Orders` ON `Orders`.`Numeric Customer ID` = CLng(`Customer`.`Varchar Customer ID`)
If fastest performance isn't an issue, you can accomplish this using Select Expert. I think the problem is your formula.
Try changing your formula from this:
{CUSTOMERPHONE.CUSTNUM} = Right({CUSTOMER.CUSTNUM})
to this:
{CUSTOMERPHONE.CUSTNUM} = Right({CUSTOMER.CUSTNUM}, Length({CUSTOMERPHONE.CUSTNUM}))