With LINQ to Entities, I am trying to query a Log table to find rows near a matching row. I am having trouble adding a time offset to a date inside the query. This is what I have so far.
from l in objectSet.Logs
let match = objectSet.Logs.Where(whatever).FirstOrDefault()
where l.Timestamp > (match.Timestamp - twoHours)
   && l.Timestamp < (match.Timestamp + twoHours)
select l
Leaving aside the "whatever" condition that finds the row I'm interested in, "twoHours" has variously been a TimeSpan, an .AddHours() call, and so forth. I haven't found a form that EF can translate into SQL that adds a constant offset to the value of a field (match.Timestamp).
The obvious solution is to run the "match" query first and then use the literal value in a second query, but I have simplified the example here down to the main problem (adding dates inside the query); my actual query is more complex, so that approach would not be ideal.
Cheers
You can generate an AddHours using the EntityFunctions class.
from l in objectSet.Logs
let match = objectSet.Logs.Where(whatever).FirstOrDefault()
where (l.Timestamp > EntityFunctions.AddHours(match.Timestamp, -1 * twoHours))
   && // ...
select l
However, don't expect this WHERE to be optimized with an index unless you have an expression index on the column.
EntityFunctions is deprecated in favor of DbFunctions
public int GetNumUsersByDay(DateTime Date)
{
    using (var context = db)
    {
        var DateDay = new DateTime(Date.Year, Date.Month, Date.Day);
        var DateDayTomorrow = DateDay.AddDays(1);
        return context.Users.Where(m => DbFunctions.AddHours(m.DateCreated, -5) >= DateDay && m.DateCreated < DateDayTomorrow).Count();
    }
}
As described in this article - http://www.devart.com/blogs/dotconnect/?p=2982#first - use parameters (declare a variable) instead of performing DateTime arithmetic inside your queries.
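To illustrate that approach against the original query (a sketch only; objectSet.Logs comes from the question, targetId stands in for the "whatever" condition, and the two-hour window is hard-coded for brevity):

// Sketch only: fetch the "match" row first, compute the window on the client,
// and pass the bounds as plain DateTime parameters that EF can translate.
var match = objectSet.Logs.FirstOrDefault(l => l.Id == targetId); // targetId is hypothetical
if (match != null)
{
    DateTime lower = match.Timestamp.AddHours(-2); // two hours below the match
    DateTime upper = match.Timestamp.AddHours(2);  // two hours above the match

    var nearby = from l in objectSet.Logs
                 where l.Timestamp > lower && l.Timestamp < upper
                 select l;
}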
Related
I implemented a LINQ query in F# that uses this solution to group by multiple columns. It compiles and works half of the time, but the other half of the time the program throws a runtime type-mismatch error. Sometimes the AnonymousObject seems to get an int instead of a Nullable<int>, which then causes an error.
let q = query {
    for wh in d.Table1 do
    where (wh.Date >= vDate)
    where (wh.Date <= bDate)
    join tae in d.Table2 on (wh.Table2Key = tae.key)
    let key = AnonymousObject<int, int, Nullable<int>>(wh.Table3key, wh.ProjectTableKey, tae.ProjectPhaseKey)
    where tae.ProjectPhaseKey.HasValue
    groupValBy wh key into g
    select { pkey = g.Key.Item2; lphasekey = g.Key.Item3; orgk = g.Key.Item1; time = g.Sum (fun x -> x.data) }
}
How can the types change at runtime? Can anybody give me a hint, or an idea of how to work around it?
I am trying to filter by date. The answer by boindiil on this post worked for me:
AngularJS ngTable filtering by Date
The only problem I am having is that the filter for the date is case-sensitive. How do I make it case-insensitive? When you run his code and type a lowercase j or f for the date, no results are shown; you have to type exactly what it is.
The Name filter is not case-sensitive; you can type lowercase or uppercase and it works.
You just need to add a check in your filter for lowercase as well.
Just replace
if($filter('date')(value.Date).indexOf(dateString) >= 0) {
filtered.push(value);
}
with
var lower = $filter('date')(value.Date).toLowerCase().indexOf(dateString.toLowerCase());
var normal = $filter('date')(value.Date).indexOf(dateString);
if (normal >= 0 || lower >= 0) {
    filtered.push(value);
}
I have a field "code" that contains a value in hexadecimal.
How can I get the highest value in the collection in one mongoose query?
If I have several collections with the same field "code", is there a way to get the highest value across all the collections in one request?
Generally, to find the maximum value of a field you need to either:
Have an index on that field - then the retrieval is relatively quick - you just take the first item in inverse sort. The problem here is that if you index hexadecimal strings, the order will be lexicographical, so the maximum will be the lexicographical maximum of the string set.
That's why adding an integer field would be the best choice if this operation is going to be repeated many times.
Besides, it's probably better from the pure data-modelling point of view. The field is actually an integer, and the hex string is just its representation, so maybe it should be converted to hex only when presented to the end user?
Iterate all the elements of the collection while maintaining and updating max value. This can be easily done using a simple .forEach on the mongo cursor:
var max = some_small_value;
cur.forEach(function (doc) {
    // parse the hex string as an integer before comparing
    var current = parseInt(doc.field, 16);
    if (max < current) {
        max = current;
    }
});
Or in mongoose, using query streams:
var stream = Model.find().stream();
var max = some_small_value;
stream.on('data', function (doc) {
    var current = parseInt(doc.field, 16);
    if (max < current) {
        max = current;
    }
});
stream.on('close', function () {
    // do something with max
});
I'm having issues implementing the TOP or SKIP functionality when building a new object query.
I can't use eSQL because I need an "IN" clause - which could get quite complex if I looped over the values and added them all as "OR" parameters.
Code is below :
Using dbcontext As New DB
    Dim r As New ObjectQuery(Of recipient)("recipients", dbcontext)
    r.Include("jobs")
    r.Include("applications")
    r = r.Where(Function(w) searchAppIds.Contains(w.job.application_id))
    If Not statuses.Count = 0 Then
        r = r.Where(Function(w) statuses.Contains(w.status))
    End If
    If Not dtFrom.DbSelectedDate Is Nothing Then
        r = r.Where(Function(w) w.job.create_time >= dtDocFrom.DbSelectedDate)
    End If
    If Not dtTo.DbSelectedDate Is Nothing Then
        r = r.Where(Function(w) w.job.create_time <= dtDocTo.DbSelectedDate)
    End If
    'a lot more IF conditions to add in additional predicates
    grdResults.DataSource = r
    grdResults.DataBind()
If I use any form of .Top or .Skip it throws an error : Query builder methods are not supported for LINQ to Entities queries
Is there any way to specify TOP or Limit using this method? I'd like to avoid a query returning 1000's of records if possible. (it's for a user search screen)
Rather than
r = new ObjectQuery<recipient>("recipients", dbContext)
try
r = dbContext.recipients.
.Skip() and .Take() are standard LINQ extension methods on IQueryable(Of T), not ObjectQuery builder methods, so apply them after your .Where calls. Note that LINQ to Entities also requires an .OrderBy before .Skip.
Also change grdResults.DataSource = r to grdResults.DataSource = r.ToList() to execute the query now. That'll also allow you to temporarily wrap this line in try/catch, which may expose a better message about why it's erroring.
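For reference, a minimal sketch of the combined approach in C# (hypothetical names; it assumes the same recipients set, searchAppIds list, and grid from the question):

// Sketch only: compose the filters on the queryable set, then order and page last.
// LINQ to Entities requires an OrderBy before Skip.
var query = dbContext.recipients.AsQueryable();

query = query.Where(w => searchAppIds.Contains(w.job.application_id));
// ...any further conditional Where calls go here...

var page = query.OrderBy(w => w.job.create_time)
                .Skip(0)       // first page
                .Take(100)     // cap the rows returned for the search screen
                .ToList();     // execute the query now

grdResults.DataSource = page;
grdResults.DataBind();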
Mark this one down to confusion. I should have been using .Take instead of .Top or .Limit or anything else.
My final line is below, and it works:
grdResults.DataSource = r.Take(100)
This sort of thing:
Dim MatchingValues() As Integer = {5, 6, 7}
Return From e in context.entity
Where MatchingValues.Contains(e.Id)
...works great. However, in my case, the values in MatchingValues are provided by the user. If none are provided, all rows ought to be returned. It would be wonderful if I could do this:
Return From e in context.entity
Where (MatchingValues.Length = 0) OrElse (MatchingValues.Contains(e.Id))
Alas, the array length test cannot be converted to SQL. I could, of course, code this:
If MatchingValues.Length = 0 Then
Return From e in context.entity
Else
Return From e in context.entity
Where MatchingValues.Contains(e.Id)
End If
This solution doesn't scale well. My application needs to work with 5 such lists, which means I'd need to code 32 queries, one for every situation.
I could also fill MatchingValues with every existing value when the user doesn't want to use the filter. However, there could be thousands of values in each of the five lists. Again, that's not optimal.
There must be a better way. Ideas?
Give this a try: (Sorry for the C# code, but you get the idea)
IQueryable<T> query = context.Entity;
if (matchingValues.Length > 0) {
    query = query.Where(e => matchingValues.Contains(e.Id));
}
You could do this with the other lists as well; a sketch follows below.
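For instance, a minimal sketch of the same pattern extended to several user-supplied lists (the extra list and property names here are hypothetical):

// Sketch only: each Where is appended only when its list has values, so one
// composed query covers every combination instead of 32 hand-written variants.
IQueryable<Entity> query = context.entity;

if (matchingIds.Length > 0)
    query = query.Where(e => matchingIds.Contains(e.Id));

if (matchingStatuses.Length > 0)          // hypothetical second list
    query = query.Where(e => matchingStatuses.Contains(e.Status));

if (matchingCategories.Length > 0)        // hypothetical third list
    query = query.Where(e => matchingCategories.Contains(e.CategoryId));

return query; // EF composes all appended Where calls into a single SQL statement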