The following stored procedure retrieves the nearest 500 addresses for a given latitude and longitude. Many applications use it, and it is one of our most useful queries.
Is it possible to rewrite it with Entity-to-SQL? If so, could you please point me in the right direction (I am not new to Entity-to-SQL)? Thanks in advance.
DECLARE @CntXAxis FLOAT
DECLARE @CntYAxis FLOAT
DECLARE @CntZAxis FLOAT
SET @CntXAxis = COS(RADIANS(-118.4104684)) * COS(RADIANS(34.1030032))
SET @CntYAxis = COS(RADIANS(-118.4104684)) * SIN(RADIANS(34.1030032))
SET @CntZAxis = SIN(RADIANS(-118.4104684))
SELECT TOP 500
*,
ProxDistance = 3961 * ACOS( dbo.XAxis(LAT, LONG)*@CntXAxis + dbo.YAxis(LAT, LONG)*@CntYAxis + dbo.ZAxis(LAT)*@CntZAxis)
FROM
tbl_ProviderLocation
WHERE
(3961 * ACOS( dbo.XAxis(LAT, LONG)*@CntXAxis + dbo.YAxis(LAT, LONG)*@CntYAxis + dbo.ZAxis(LAT)*@CntZAxis) <= 10)
ORDER BY
ProxDistance ASC
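For context: the helper functions dbo.XAxis, dbo.YAxis and dbo.ZAxis are not shown in the question. Presumably they are the standard latitude/longitude-to-unit-sphere conversion, something like the sketch below (a guess from their call sites; note the SET statements above appear to pass longitude where latitude would be expected, so the exact argument convention may differ):
-- Hypothetical definitions, inferred from the call sites above
CREATE FUNCTION dbo.XAxis (@Lat FLOAT, @Long FLOAT) RETURNS FLOAT
AS BEGIN
    RETURN COS(RADIANS(@Lat)) * COS(RADIANS(@Long))
END
GO
CREATE FUNCTION dbo.YAxis (@Lat FLOAT, @Long FLOAT) RETURNS FLOAT
AS BEGIN
    RETURN COS(RADIANS(@Lat)) * SIN(RADIANS(@Long))
END
GO
CREATE FUNCTION dbo.ZAxis (@Lat FLOAT) RETURNS FLOAT
AS BEGIN
    RETURN SIN(RADIANS(@Lat))
END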
If you are using MS SQL Server, you can use the SqlClient functions with Entity SQL:
http://msdn.microsoft.com/en-us/library/bb399586.aspx
According to that page, those functions are available in LINQ queries as well. I couldn't find an example, but it seems straightforward:
var qry = from r in mytable
          select new { Acos = SqlFunctions.Acos(r.mycolumn) };
I am trying to get the index position of a POINT in a MULTILINESTRING.
Here is the whole query I'm stuck with:
SELECT req.id, (dp).geom, netgeo_point_tech.id, ST_Line_Locate_Point(st_lineMerge(geom_cable), (dp).geom)
FROM (SELECT id, ST_DumpPoints(geom) as dp, geom as geom_cable FROM netgeo_cable_test ) as req
JOIN netgeo_point_tech ON ST_dwithin(netgeo_point_tech.geom, (dp).geom, 1)
ORDER BY req.id, (dp).path[1] ASC
The error I get is: line_locate_point: 1st arg isn't a line.
The error is due to the return type of the st_lineMerge() function, which returns LINESTRINGs but also MULTILINESTRINGs.
I don't get this; ST_LineMerge() is supposed to return only LINESTRING.
When I just try a simple query like this:
select st_astext(st_linemerge(geom)) from netgeo_cable_test
The output is a MULTILINESTRING (the geometry text is too long to reproduce here).
I want to learn from this, so, if possible, explain to me what I'm doing wrong here, or if my approach is lacking insight.
Thanks to @JGH for the suggestion to use ST_Dump, I came up with this function:
create or replace function MultiLineLocatePoint(line geometry, point geometry) returns numeric as $$
select (base + extra) / ST_Length(line)
from (
select
sum(ST_Length(l.geom)) over (order by l.path) - ST_Length(l.geom) base,
ST_LineLocatePoint(l.geom, point) * ST_Length(l.geom) extra,
ST_Distance(l.geom, point) dist
from ST_Dump(line) l
) points
order by dist
limit 1;
$$ language SQL;
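A hypothetical usage example, reusing the tables from the original query (the ST_DWithin threshold of 1 mirrors the question; the aliases are mine):
-- For each cable/point pair, the point's position along the whole cable,
-- as a fraction of its total length:
select c.id as cable_id, p.id as point_id,
       MultiLineLocatePoint(c.geom, p.geom) as fraction
from netgeo_cable_test c
join netgeo_point_tech p on ST_DWithin(c.geom, p.geom, 1);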
I've got some columns stored as INT values on which I am doing some addition and division, and I would like to display the results limited to 2 digits after the decimal point. I've tried different combinations of DECIMAL / NUMERIC / ROUND but I can't get the desired result.
Could anyone offer any advice on how to get the desired solution?
Query:
SELECT cc.code AS [CountyID], RTRIM(LTRIM(cc.[description])) AS [CountyName],
SUM(ISNULL(ps.push_count,0)) AS [CountyPushCounts],
SUM(ISNULL(ps.push_unique_count,0)) AS [UniquePushCount],
SUM(ISNULL(ps.error_count,0)) AS [PushErrorCount],
SUM(ISNULL(ps.warning_count,0)) AS [PushWarningCount],
(CAST(SUM(ISNULL(ps.push_unique_count,0)) AS DECIMAL(15,2)) / CAST(SUM(ISNULL(ps.push_count,0)) AS DECIMAL(15,2)) * 100.0) AS [Unique Push Per Day] ,
((CAST(SUM(ISNULL(ps.error_count,0)) AS DECIMAL(15,2)) + CAST(SUM(ISNULL(ps.warning_count,0)) AS DECIMAL(15,2))) / CAST(SUM(ISNULL(ps.push_count,0)) AS DECIMAL(15,2)) * 100.0) AS [Data Error Rate]
FROM dbo.push_stats AS [ps]
INNER JOIN CCIS.dbo.county_codes AS [cc] ON ps.county_code = cc.code
WHERE DATEPART(YEAR,ps.ldstat_date) = 2017
AND DATEPART(MONTH,ps.ldstat_date) = 3
GROUP BY cc.code, cc.[description]
ORDER BY cc.[description];
And my data set looks as follows:
CountyID CountyName PushCounts UniquePushCount PushErrorCount PushWarningCount [Unique Push Per Day] [Data Error Rate]
1 ALACHUA 210422 77046 0 39 36.61499273 0.018534184
2 BAKER 8099 5306 0 71 65.51426102 0.876651438
3 BAY 3178214 434893 117 2793 13.68356568 0.091560857
4 BRADFORD 17654 12119 0 131 68.64733205 0.742041464
I think this will do it:
SELECT cc.code AS [CountyID], RTRIM(LTRIM(cc.[description])) AS [CountyName],
SUM(ISNULL(ps.push_count,0)) AS [CountyPushCounts],
SUM(ISNULL(ps.push_unique_count,0)) AS [UniquePushCount],
SUM(ISNULL(ps.error_count,0)) AS [PushErrorCount],
SUM(ISNULL(ps.warning_count,0)) AS [PushWarningCount],
CAST((SUM(ISNULL(ps.push_unique_count,0)) * 100.00) / SUM(ISNULL(ps.push_count,0)) AS DECIMAL(15,2)) AS [Unique Push Per Day] ,
CAST(((SUM(ISNULL(ps.error_count,0)) + SUM(ISNULL(ps.warning_count,0))) * 100.00) / SUM(ISNULL(ps.push_count,0)) AS DECIMAL(15,2)) AS [Data Error Rate]
FROM dbo.push_stats AS [ps]
INNER JOIN CCIS.dbo.county_codes AS [cc] ON ps.county_code = cc.code
WHERE DATEPART(YEAR,ps.ldstat_date) = 2017
AND DATEPART(MONTH,ps.ldstat_date) = 3
GROUP BY cc.code, cc.[description]
ORDER BY cc.[description];
There are two main points here:
With the division operation, get at least one term to a non-integer numeric type (DECIMAL or FLOAT) before the division happens, so that it doesn't do integer division and truncate the decimal portion of the result. Either term being non-integer is enough, and you can accomplish this simply by moving the * 100.00 earlier.
You want to allow much more than 2 decimal places for the internal calculations, and only set your output format at the very end. Rounding or casting to limited types too soon in an expression can change intermediate values in small ways that are exaggerated in the final results. This means you only want one big CAST() operation around the whole expression.
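A quick sketch of the integer-division point (the values in the comments are what SQL Server returns):
SELECT 7 / 2;    -- 3: both terms are INT, so the fraction is truncated
SELECT 7 / 2.0;  -- 3.500000: one DECIMAL term is enough to avoid truncation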
You have to cast the final result, not just the individual numerator and denominator:
DECLARE @Numerator INTEGER
DECLARE @Denominator INTEGER
SET @Numerator = 10
SET @Denominator = 3;
-- This will produce 3.33333333
SELECT CAST(@Numerator AS DECIMAL(5,2))/CAST(@Denominator AS DECIMAL(5,2))
-- This will give you 3.33
SELECT CAST(
CAST(@Numerator AS DECIMAL(5,2))
/CAST(@Denominator AS DECIMAL(5,2))
AS DECIMAL(5,2))
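Incidentally, this is also why ROUND alone tends not to give the desired display format: ROUND keeps the scale of its input and pads with zeros, whereas CAST changes the declared scale. A small sketch of the difference:
SELECT ROUND(10 / 3.0, 2);              -- 3.330000 (input scale kept)
SELECT CAST(10 / 3.0 AS DECIMAL(5,2));  -- 3.33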
I have a problem making a "simple" query with the Doctrine QueryBuilder.
I am trying to get some "persons" who are at most 10 km away.
My query:
$QB = $this->createQueryBuilder('p');
$QB->add('select', 'p')
->add('from', 'MyProject\Bundle\FrontBundle\Entity\Pro p')
->where('p.job = :job')
->andWhere('(3956 * 2 * ASIN(SQRT( POWER(SIN((:latitude - abs(pro.latitude)) * pi()/180 / 2),2) + COS(:latitude * pi()/180 ) * COS(abs(pro.latitude) * pi()/180) * POWER(SIN((:longitude - pro.longitude) * pi()/180 / 2), 2) ))) <= 10')
->addOrderBy('p.dateCreation', 'DESC')
->addOrderBy('p.id', 'DESC')
->setParameter('latitude', $latitude)
->setParameter('longitude', $longitude)
->setParameter('job', $jobId);
The problem is in the second where statement: Doctrine fails on "ASIN" because of the parenthesis that follows; it tries to execute the function...
Is there a way to escape it? Or another way to construct this condition?
Thanks, Doctrine professionals ;)
If I understand correctly, you can either write a raw SQL query for that, or implement the ASIN function (or any other) in Doctrine.
Here is a bundle that might help:
https://github.com/wiredmedia/doctrine-extensions
You can use its implementation of the ASIN function:
https://github.com/wiredmedia/doctrine-extensions/blob/master/lib/DoctrineExtensions/Query/Mysql/Asin.php
And an article about custom DQL functions:
http://symfony.com/doc/current/cookbook/doctrine/custom_dql_functions.html
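For reference, the raw SQL route would look roughly like this (a sketch only: the pro table name and the job_id/date_creation columns are assumptions, and the distance formula is copied from the question):
-- Hypothetical raw SQL equivalent (MySQL syntax), bypassing DQL entirely
SELECT p.*
FROM pro p
WHERE p.job_id = :job
  AND (3956 * 2 * ASIN(SQRT(
        POWER(SIN((:latitude - ABS(p.latitude)) * PI()/180 / 2), 2)
        + COS(:latitude * PI()/180)
        * COS(ABS(p.latitude) * PI()/180)
        * POWER(SIN((:longitude - p.longitude) * PI()/180 / 2), 2)))) <= 10
ORDER BY p.date_creation DESC, p.id DESC;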
Here's a simple query we do for ad hoc requests from our Marketing department on the leads we received in the last 90 days.
SELECT ID
,FIRST_NAME
,LAST_NAME
,ADDRESS_1
,ADDRESS_2
,CITY
,STATE
,ZIP
,HOME_PHONE
,MOBILE_PHONE
,EMAIL_ADDRESS
,ROW_ADDED_DTM
FROM WEB_LEADS
WHERE ROW_ADDED_DTM BETWEEN @START AND @END
They are asking for more derived columns that show the number of previous occurrences of ADDRESS_1 where the EMAIL_ADDRESS matches, but for several different date ranges.
So the derived columns would look like this:
,COUNT_ADDRESS_1_LAST_1_DAYS
,COUNT_ADDRESS_1_LAST_7_DAYS
,COUNT_ADDRESS_1_LAST_14_DAYS
etc.
I've manually filled these derived columns using update statements when there were just a few. The above query is really just a sample of a much larger query with many more columns; the actual request has blossomed into 6 date ranges for 13 columns. I'm asking if there's a better way than using 78 additional update statements.
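For illustration, this is roughly what a single one of those 78 metrics looks like when computed inline rather than via UPDATE (a sketch; the prev alias and the use of SYSDATETIME() as the anchor date are mine):
-- One of the 78 metrics, computed inline: previous rows sharing
-- this row's EMAIL_ADDRESS and ADDRESS_1 within the last 7 days
SELECT wl.ID, wl.EMAIL_ADDRESS, wl.ADDRESS_1,
       (SELECT COUNT(*)
        FROM WEB_LEADS AS prev
        WHERE prev.EMAIL_ADDRESS = wl.EMAIL_ADDRESS
          AND prev.ADDRESS_1 = wl.ADDRESS_1
          AND prev.ROW_ADDED_DTM >= DATEADD(DAY, -7, SYSDATETIME())
       ) AS COUNT_ADDRESS_1_LAST_7_DAYS
FROM WEB_LEADS AS wl;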
I think you will have a hard time writing a query that includes all of these 78 metrics per e-mail address without actually creating a query that hard-codes the different choices. However you can generate such a pivot query with dynamic SQL, which will save you some keystrokes and will adjust dynamically as you add more columns to the table.
The result you want to end up with will look something like this (but of course you won't want to type it):
;WITH y AS
(
SELECT
EMAIL_ADDRESS,
/* aggregation portion */
[ADDRESS_1] = COUNT(DISTINCT [ADDRESS_1]),
[ADDRESS_2] = COUNT(DISTINCT [ADDRESS_2]),
... other columns
/* end agg portion */
FROM dbo.WEB_LEADS AS wl
WHERE ROW_ADDED_DTM >= /* one of 6 past dates */
GROUP BY wl.EMAIL_ADDRESS
)
SELECT EMAIL_ADDRESS,
/* pivot portion */
COUNT_ADDRESS_1_LAST_1_DAYS = *count address 1 from 1 day ago*,
COUNT_ADDRESS_1_LAST_7_DAYS = *count address 1 from 7 days ago*,
... other date ranges ...
COUNT_ADDRESS_2_LAST_1_DAYS = *count address 2 from 1 day ago*,
COUNT_ADDRESS_2_LAST_7_DAYS = *count address 2 from 7 days ago*,
... other date ranges ...
... repeat for 11 more columns ...
/* end pivot portion */
FROM y
GROUP BY EMAIL_ADDRESS
ORDER BY EMAIL_ADDRESS;
This is a little involved, and it should all be run as one script, but I'm going to break it up into chunks to intersperse comments on how the above portions are populated without typing them. (And before long @Bluefeet will probably come along with a much better PIVOT alternative.) I'll enclose my interspersed comments in /* */ so that you can still copy the bulk of this answer into Management Studio and run it with the comments intact.
Code/comments to copy follows:
/*
First, let's build a table of dates that can be used both to derive labels for pivoting and to assist with aggregation. I've added the three ranges you've mentioned and guessed at a fourth, but hopefully it is clear how to add more:
*/
DECLARE @d DATE = SYSDATETIME();
CREATE TABLE #L(label NVARCHAR(15), d DATE);
INSERT #L(label, d) VALUES
(N'LAST_1_DAYS', DATEADD(DAY, -1, @d)),
(N'LAST_7_DAYS', DATEADD(DAY, -8, @d)),
(N'LAST_14_DAYS', DATEADD(DAY, -15, @d)),
(N'LAST_MONTH', DATEADD(MONTH, -1, @d));
/*
Next, let's build the portions of the query that are repeated per column name. First, the aggregation portion is just in the format col = COUNT(DISTINCT col). We're going to go to the catalog views to dynamically derive the list of column names (except ID, EMAIL_ADDRESS and ROW_ADDED_DTM) and stuff them into a #temp table for re-use.
*/
SELECT name INTO #N FROM sys.columns
WHERE [object_id] = OBJECT_ID(N'dbo.WEB_LEADS')
AND name NOT IN (N'ID', N'EMAIL_ADDRESS', N'ROW_ADDED_DTM');
DECLARE @agg NVARCHAR(MAX) = N'', @piv NVARCHAR(MAX) = N'';
SELECT @agg += ',
' + QUOTENAME(name) + ' = COUNT(DISTINCT '
+ QUOTENAME(name) + ')' FROM #N;
PRINT @agg;
/*
Next we'll build the "pivot" portion (even though I am angling for the poor man's pivot - a bunch of CASE expressions). For each column name we need a conditional against each range, so we can accomplish this by cross joining the list of column names against our labels table. (And we'll use this exact technique again in the query later to make the /* one of 6 past dates */ portion work.)
*/
SELECT @piv += ',
COUNT_' + n.name + '_' + l.label
+ ' = MAX(CASE WHEN label = N''' + l.label
+ ''' THEN ' + QUOTENAME(n.name) + ' END)'
FROM #N AS n CROSS JOIN #L AS l;
PRINT @piv;
/*
Now, with those two portions populated as we'd like them, we can build a dynamic SQL statement that fills out the rest:
*/
DECLARE @sql NVARCHAR(MAX) = N';WITH y AS
(
SELECT
EMAIL_ADDRESS, l.label' + @agg + '
FROM dbo.WEB_LEADS AS wl
CROSS JOIN #L AS l
WHERE wl.ROW_ADDED_DTM >= l.d
GROUP BY wl.EMAIL_ADDRESS, l.label
)
SELECT EMAIL_ADDRESS' + @piv + '
FROM y
GROUP BY EMAIL_ADDRESS
ORDER BY EMAIL_ADDRESS;';
PRINT @sql;
EXEC sp_executesql @sql;
GO
DROP TABLE #N, #L;
/*
Now again, this is a pretty complex piece of code, and perhaps it can be made easier with PIVOT. But I think even @Bluefeet will write a version of PIVOT that uses dynamic SQL, because there is just way too much to hard-code here IMHO.
*/
I have a SQL Server 2008 R2 database in which some tables have a full-text index defined. I'd like to know how to determine the size of the full-text index of a specific table, in order to control and predict its growth.
Is there a way of doing this?
The catalog view sys.fulltext_index_fragments keeps track of the size of each fragment, regardless of catalog, so you can take the SUM this way. This assumes the limitation of one full-text index per table is going to remain the case. The following query will get you the size of each full-text index in the database, again regardless of catalog, but you could use the WHERE clause if you only care about a specific table.
SELECT
[table] = OBJECT_SCHEMA_NAME(table_id) + '.' + OBJECT_NAME(table_id),
size_in_KB = CONVERT(DECIMAL(12,2), SUM(data_size/1024.0))
FROM sys.fulltext_index_fragments
-- WHERE table_id = OBJECT_ID('dbo.specific_table_name')
GROUP BY table_id;
Also note that if the count of fragments is high you might consider a reorganize.
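For example, a reorganize merges those fragments (a sketch; MyCatalog is a placeholder for your catalog name):
-- Merge the fragments of all full-text indexes in the catalog
ALTER FULLTEXT CATALOG MyCatalog REORGANIZE;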
If you are after a specific catalog, use SSMS:
- Click on [Database] and expand the objects
- Click on [Storage]
- Right-click on the specific catalog
- Choose Properties and click
In the General tab you will find the catalog size = 'nn'.
I use something similar to this (which will also calculate the size of XML indexes and so on, if present):
SELECT S.name,
SO.name,
SIT.internal_type_desc,
rows = CASE WHEN GROUPING(SIT.internal_type_desc) = 0 THEN SUM(SP.rows)
END,
TotalSpaceGB = SUM(SAU.total_pages) * 8 / 1048576.0,
UsedSpaceGB = SUM(SAU.used_pages) * 8 / 1048576.0,
UnusedSpaceGB = SUM(SAU.total_pages - SAU.used_pages) * 8 / 1048576.0,
TotalSpaceKB = SUM(SAU.total_pages) * 8,
UsedSpaceKB = SUM(SAU.used_pages) * 8,
UnusedSpaceKB = SUM(SAU.total_pages - SAU.used_pages) * 8
FROM sys.objects SO
INNER JOIN sys.schemas S ON S.schema_id = SO.schema_id
INNER JOIN sys.internal_tables SIT ON SIT.parent_object_id = SO.object_id
INNER JOIN sys.partitions SP ON SP.object_id = SIT.object_id
INNER JOIN sys.allocation_units SAU ON (SAU.type IN (1, 3)
AND SAU.container_id = SP.hobt_id)
OR (SAU.type = 2
AND SAU.container_id = SP.partition_id)
WHERE S.name = 'schema'
--AND SO.name IN ('TableName')
GROUP BY GROUPING SETS(
(S.name,
SO.name,
SIT.internal_type_desc),
(S.name, SO.name), (S.name), ())
ORDER BY S.name,
SO.name,
SIT.internal_type_desc;
This will generally give numbers higher than sys.fulltext_index_fragments, but when combined with the sys.partitions of the table, it will add up to the numbers returned by EXEC sys.sp_spaceused @objname = N'schema.TableName';.
Tested with SQL Server 2016, but documentation says it should be present since 2008.