For several user-defined fields, I would like to be able to find out which recurring transaction template (if any) triggered an invoice.
Unfortunately, I haven't been able to find a foolproof link between the two of them.
This is using SAP Business One PL 9 and MS SQL Server 2016.
So far, I am getting good results most of the time using the code below.
SELECT *
FROM ORCP T0
INNER JOIN ODRF T1 ON T0.DraftEntry = T1.DocEntry
WHERE T0.DraftEntry IN
(
    SELECT DocEntry
    FROM ODRF
    WHERE CardCode = '12345'
      AND DocTotal = 1000   -- DocTotal is numeric; avoid comparing to a string
      AND ObjType = '13'    -- 13 = A/R Invoice
      AND CANCELED = 'N'
)
Unfortunately, this is very dependent on well-crafted draft documents.
Pseudocode:
Get all projects
Use a table function to get all related parts, which uses project id as input and returns 0..* part ids
Copy a value from project to all found part ids
Datamodel:
Table projects consists of fields pj_id and pj_desc
Table parts consists of fields pj_desc_copy and prt_id
There's a function LookupRelationShips(string) that outputs multiple columns (rel_type and rel_id, where if rel_type = 2, rel_id is a prt_id).
My best attempt is this, but it won't let me use the output of the subselect:
UPDATE parts
SET pj_desc_copy = rel.pj_desc
FROM parts prt
INNER JOIN
    (SELECT (SELECT rel_type, rel_id, pj.pj_desc
             FROM LookupRelationShips(pj.pj_id)
             WHERE rel_type = 2)
     FROM projects pj) AS rel
    ON rel.rel_id = prt.prt_id
Use case/restrictions:
This is a one-time statement to update all current parts. From this point onwards project CRUD will result in syncing parts, but using the application to bulk update previous projects is less than ideal (built-in timeouts, lots of overhead, large dataset).
I think your query should be as follows; you can use CROSS APPLY on the table-valued function:
UPDATE prt
SET prt.pj_desc_copy = pj.pj_desc
FROM projects pj
CROSS APPLY LookupRelationShips(pj.pj_id) rel
INNER JOIN parts prt ON rel.rel_id = prt.prt_id
WHERE rel.rel_type = 2
UPDATE prt
SET prt.pj_desc_copy = pj.pj_desc
FROM projects pj
CROSS APPLY LookupRelationShips(pj.pj_id) rel
RIGHT JOIN parts prt ON rel.rel_id = prt.prt_id
WHERE rel.rel_type = 2 -- this filter on rel makes the RIGHT JOIN behave like an INNER JOIN
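Conceptually, CROSS APPLY invokes the table-valued function once per project row, and the join plus WHERE keep only the rel_type = 2 rows that match an existing part. A rough Python sketch of that behavior (the project, part, and relationship data below are invented purely for illustration):

```python
# Python sketch of what the CROSS APPLY update does, with made-up data.
projects = {1: "Bridge", 2: "Tunnel"}        # pj_id -> pj_desc
parts = {10: None, 11: None, 20: None}       # prt_id -> pj_desc_copy

def lookup_relationships(pj_id):
    """Stand-in for LookupRelationShips: returns (rel_type, rel_id) rows."""
    rels = {1: [(2, 10), (2, 11)], 2: [(2, 20), (1, 99)]}
    return rels.get(pj_id, [])

# CROSS APPLY: call the function once per project row; the join and the
# WHERE clause keep only rel_type = 2 rows that match an existing part.
for pj_id, desc in projects.items():
    for rel_type, rel_id in lookup_relationships(pj_id):
        if rel_type == 2 and rel_id in parts:
            parts[rel_id] = desc
```

After the loop, every part reachable from a project via a rel_type = 2 relationship carries that project's description, which is exactly what the one-time UPDATE is meant to achieve.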
Is there a way to list all objects on a server (across all databases) and their activity?
What I mean by activity:
If an object is a table/view, I'd like to know the last time it was updated or accessed.
If an object is a function, I'd like to know the last time it was used.
If an object is a stored procedure, I'd like to know the last time it was executed.
The goal is to eliminate some of the unused objects, or at least identify them so we can analyze them further. If there is a better way to do this, please let me know.
Without a specific audit or explicit logging instructions in your code, what you are asking might be difficult to achieve.
Here are some hints that, in my opinion, can help you retrieve the information you need:
Tables/Views You can rely on the dynamic management view that records index usage, sys.dm_db_index_usage_stats (more info here); note that its contents are cleared when the SQL Server instance restarts
SELECT last_user_update, *
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('YourDBName')
AND OBJECT_ID = OBJECT_ID('[YourDBName].[dbo].[YourTableName]')
Stored Procedures If the procedure's plan is still cached, you can query sys.dm_exec_procedure_stats (more info here)
SELECT last_execution_time, *
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID('YourDBName')
AND OBJECT_ID = OBJECT_ID('[YourDBName].[dbo].[YourSpName]')
Functions If the function's plan is still cached, you can query sys.dm_exec_query_stats (from this great answer; more info here)
SELECT qs.last_execution_time
FROM sys.dm_exec_query_stats qs
CROSS APPLY (SELECT 1 AS X
FROM sys.dm_exec_plan_attributes(qs.plan_handle)
WHERE ( attribute = 'objectid'
AND value = OBJECT_ID('[YourDBName].[dbo].[YourFunctionName]') )
OR ( attribute = 'dbid'
AND value = DB_ID('YourDBName') )
HAVING COUNT(*) = 2) CA
The ghtorrent-bq data is great for having a snapshot of GitHub; however, it is not clear when it is updated or how I could get more up-to-date data.
Theoretically, it is updated every time a new GHTorrent MySQL dump has been released. Practically, there are still manual adjustments that need to be done to the generated CSVs as there is lots of weird text in fields such as user locations that CSV parsers fail to handle.
http://ghtorrent.org/gcloud.html
(related to https://stackoverflow.com/a/42930963/132438)
GHTorrent only provides a periodic snapshot of their data on BigQuery, while GitHub Archive updates daily (or even hourly - let me check that).
It would be great to have a more frequent snapshot of GHTorrent (maybe https://twitter.com/gousiosg can help), but in the meantime you can merge both datasets (look for the GHTorrent snapshot data, and then add the latest stars from GitHub Archive):
#standardSQL
SELECT COUNT(DISTINCT login) c
FROM (
SELECT login
FROM (
SELECT login
FROM `ghtorrent-bq.ght_2017_01_19.watchers` a
JOIN `ghtorrent-bq.ght_2017_01_19.projects` b
ON a.repo_id=b.id
JOIN `ghtorrent-bq.ght_2017_01_19.users` c
ON a.user_id=c.id
WHERE url = 'https://api.github.com/repos/angular/angular'
)
UNION ALL (
SELECT actor.login
FROM `githubarchive.month.2017*`
WHERE repo.name='angular/angular'
AND type = "WatchEvent"
)
)
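In the query above, the inner UNION ALL concatenates snapshot logins with fresh WatchEvent logins, and the outer COUNT(DISTINCT login) removes users present in both sources. In procedural terms this is just a set union; a tiny Python illustration (the login lists are made up):

```python
# Toy illustration of the union-then-count-distinct pattern.
snapshot_watchers = ["alice", "bob", "carol"]   # e.g. from the GHTorrent snapshot
recent_watchers = ["carol", "dave"]             # e.g. from GitHub Archive WatchEvents

# UNION ALL followed by COUNT(DISTINCT ...) is equivalent to a set union:
distinct_watchers = set(snapshot_watchers) | set(recent_watchers)
print(len(distinct_watchers))  # 4: "carol" is only counted once
```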
I am currently trying to use the Kentico API to gather the data from all forms within Kentico.
So far, I've found two places to view form data, at these endpoints:
/rest/cms.forms <---- Returns all form definitions (excluding field data types)
/rest/bizformitem.bizform.FORM_NAME/ <---- Returns all form data (inserted by end users)
What I am trying to do is keep a record of all of the form data on a daily basis. Is there a better way to do this using the API, rather than making 'x' number of calls (one per form)?
EDIT: Out of 100+ forms, I only need to pull 15-20 of them on a daily basis.
You can get it all in SQL, and it depends on how many forms you have. Each form is a separate SQL table that has a record in the CMS_Class table:
-- this will give the list of all tables that you need query
select ClassTableName from CMS_Class where ClassIsForm = 1
Then you can find the ones that were updated, say, within the last 24 hours:
SELECT
[db_name] = d.name
, [table_name] = SCHEMA_NAME(o.[schema_id]) + '.' + o.name
, s.last_user_update
FROM sys.dm_db_index_usage_stats s
JOIN sys.databases d ON s.database_id = d.database_id
JOIN sys.objects o ON s.[object_id] = o.[object_id]
WHERE o.[type] = 'U'
AND s.last_user_update IS NOT NULL
AND s.last_user_update BETWEEN DATEADD(day, -1, GETDATE()) AND GETDATE()
and s.[object_id] in (select OBJECT_ID(ClassTableName)
from CMS_Class where ClassIsForm =1 )
You might have a few hundred forms, and going out to query a few hundred tables can be unproductive; I usually only need 18-20 out of the 100+ we have.
There is also the Kentico API (not the REST API), which lets you get all the data you need in code-behind. You can find examples here.
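If you stay with REST, a short script that loops over just the forms you need keeps it to the 15-20 calls per day mentioned in the question. A minimal sketch, assuming the endpoint paths shown above; the base URL and form code names are hypothetical placeholders, and authentication is omitted:

```python
import urllib.request

# Hypothetical base URL and form code names; replace with your own.
BASE_URL = "https://example-kentico-site.com"
FORMS = ["ContactUs", "Feedback"]  # the 15-20 forms you actually need

def form_data_url(form_name: str) -> str:
    """Build the bizform items endpoint for one form (path from the question)."""
    return f"{BASE_URL}/rest/bizformitem.bizform.{form_name}/"

def pull_all() -> dict:
    """Fetch each form's data; returns a dict of form name -> raw response body."""
    results = {}
    for name in FORMS:
        with urllib.request.urlopen(form_data_url(name)) as resp:
            results[name] = resp.read()
    return results
```

Run pull_all() from a daily scheduled task; since only the needed forms are listed, the call count stays bounded regardless of how many forms exist in total.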
I have a report with a query that used to contain a client-id for each item. The client (the boss) wanted multiple clients for each record. That was easy: replace the client-id field with a new table ItemClients(item-id, client-id) and join on item-id.
The problem now is the reports: of course each item is duplicated once per client, but I want only one copy of each item in the report, irrespective of how many clients an item has. Also, there is the ability to filter the report by client, which I can do fine in MySQL and which worked fine in the previous single-client version.
My solution is to add a GROUP BY item-id clause to the query (MySQL allows this). The problem is how to add the GROUP BY clause to the report: any attempt to group gets interpreted as a report grouping, which I don't want. I tried making the whole query a command, which worked for a while, but now the whole report blows up, crashing my web server.
Any insights would be helpful. Thanks.
(edit)
Here's the (hybrid) code in the command; the GROUP BY clause was added by me, and the rest was scraped from Crystal's Show SQL command and modified. (BTW, this is a sub-report.)
SELECT `ITEM`.`Date`, `ITEM`.`Started`, `ITEM`.`Stopped`, `TERM`.`Terminal`, `ITEM`.`TonnesLoaded`, `ITEM`.`LoadID`,
`ITEM`.`TotalLoaded`, `ITEM`.`ToGo`, `ITEM`.`OrderId`,
`COM`.`Commodity`,
`berths1`.`Berth`,
`CQ`.`LotNo`, `CQ`.`CategoryID`,
`IC`.`ClientId`
FROM `berths` `berths1`
INNER JOIN `Items` `ITEM` ON `berths1`.`BerthID`=`ITEM`.`BerthID`
INNER JOIN `terminals` `TERM` ON `berths1`.`TerminalID`=`TERM`.`TerminalID`
INNER JOIN `Quantities` `CQ` ON `ITEM`.`ItemId`=`CQ`.`ItemId`
AND `ITEM`.`OrderId`=`CQ`.`OrderId`
INNER JOIN `commodities` `COM` ON `CQ`.`CommodityID`=`COM`.`CommodityID`
LEFT OUTER JOIN `ItemClients` `IC` ON `CQ`.`OrderId`=`IC`.`OrderId`
AND `CQ`.`ItemId`=`IC`.`ItemId`
WHERE `ITEM`.`OrderId`={?Pm-details.OrderId}
AND ( -- parentheses added so the OrderId filter applies to every branch
    ( {?Pm-?Category}=0 AND {?Pm-?ClientId}=0 )
    OR (
        `CQ`.`CategoryID`>0
        AND `CQ`.`CategoryID` = {?Pm-?Category}
        AND ( {?Pm-?ClientId}=0 OR `IC`.`ClientId` = {?Pm-?ClientId} )
    )
    OR ( {?Pm-?ClientId}>0 AND `IC`.`ClientId` = {?Pm-?ClientId} AND {?Pm-?Category}=0 )
)
GROUP BY `ITEM`.`ItemId`
ORDER BY `ITEM`.`Date`, `ITEM`.`Started`, `ITEM`.`LoadID`