Sorting a Meteor Cursor - mongodb

I am using Ionic 2 with Meteor/Mongo.
I am trying to sort a Cursor, but it just keeps the original order in which the items were inserted.
model
interface Chat {
  _id?: string;
  memberIds?: string[];
  title?: string;
  subTitle?: string;
  picture?: string;
  lastMessage?: Message;
  lastMessageCreatedAt?: Date;
  receiverComp?: Tracker.Computation;
  lastMessageComp?: Tracker.Computation;
}
ts
private sortLocalChats(): void {
  this.localChatCursor = this.localChatCollection.find({}, { sort: { lastMessageCreatedAt: -1 } });
  this.localChatCursor.forEach((chat: Chat) => {
    console.log(chat.title + ' ' + chat.lastMessageCreatedAt);
  });
  console.log('==> loaded sorted local chats: ' + this.localChatCollection.find().count());
}
output
Ashton Marais Thu Oct 06 2016 16:50:36 GMT+0800 (CST)
Ashton Marais Wed Oct 12 2016 21:20:18 GMT+0800 (CST)
ghjghj ghjghg Wed Oct 05 2016 23:37:49 GMT+0800 (CST)
Brett Simpson Thu Oct 06 2016 23:52:05 GMT+0800 (CST)
==> loaded sorted local chats: 4
I would have expected this to be sorted by lastMessageCreatedAt.
Any help appreciated.

To get sorted results you probably need to call fetch() first. So in the above example this works:
this.localChatCursor.fetch().forEach(...);
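For reference, the descending order that { sort: { lastMessageCreatedAt: -1 } } requests can be sketched in plain TypeScript, independent of Meteor (ChatLike and sortChats are illustrative names, not Meteor APIs):

```typescript
// Minimal sketch of the sort Minimongo applies for { lastMessageCreatedAt: -1 }:
// newest message first. No Meteor dependency.
interface ChatLike {
  title: string;
  lastMessageCreatedAt: Date;
}

function sortChats(chats: ChatLike[]): ChatLike[] {
  // Copy before sorting so the original array is left untouched.
  return [...chats].sort(
    (a, b) => b.lastMessageCreatedAt.getTime() - a.lastMessageCreatedAt.getTime()
  );
}

const chats: ChatLike[] = [
  { title: "ghjghj ghjghg", lastMessageCreatedAt: new Date("2016-10-05T23:37:49+08:00") },
  { title: "Ashton Marais", lastMessageCreatedAt: new Date("2016-10-12T21:20:18+08:00") },
  { title: "Brett Simpson", lastMessageCreatedAt: new Date("2016-10-06T23:52:05+08:00") },
];

const sorted = sortChats(chats);
// sorted[0] now holds the chat with the most recent lastMessageCreatedAt.
```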

Related

Alter client quotas using kafka-python

I'm using the kafka-python library to manage my Kafka cluster. I want to add the quota features (alter/describe); these are not yet implemented in kafka-python, but they are documented in the Kafka protocol specification.
AlterClientQuotas API (Key: 49):
Requests:
AlterClientQuotas Request (Version: 0) => [entries] validate_only
  entries => [entity] [ops]
    entity => entity_type entity_name
      entity_type => STRING
      entity_name => NULLABLE_STRING
    ops => key value remove
      key => STRING
      value => FLOAT64
      remove => BOOLEAN
  validate_only => BOOLEAN
Responses:
AlterClientQuotas Response (Version: 0) => throttle_time_ms [entries]
  throttle_time_ms => INT32
  entries => error_code error_message [entity]
    error_code => INT16
    error_message => NULLABLE_STRING
    entity => entity_type entity_name
      entity_type => STRING
      entity_name => NULLABLE_STRING
To implement these features I'm adding the following code:
kafka/protocol/admin.py
from kafka.protocol.api import Request, Response
from kafka.protocol.types import (
    Array, Boolean, CompactString, Float64, Int16, Int32, Schema, String, TaggedFields
)


class AlterClientQuotasResponse_v0(Response):
    API_KEY = 49
    API_VERSION = 0
    SCHEMA = Schema(
        ("throttle_time_ms", Int32),
        (
            "entries",
            Array(
                ("error_code", Int16),
                ("error_message", String("utf-8")),
                ("match", CompactString("utf-8")),
                (
                    "entity",
                    Array(
                        ("entity_type", String("utf-8")),
                        ("entity_name", String("utf-8")),
                    ),
                ),
            ),
        ),
    )


class AlterClientQuotasResponse_v1(Response):
    API_KEY = 49
    API_VERSION = 1
    SCHEMA = Schema(
        ("throttle_time_ms", Int32),
        (
            "entries",
            Array(
                ("error_code", Int16),
                ("error_message", String("utf-8")),
                ("match", CompactString("utf-8")),
                (
                    "entity",
                    Array(
                        ("entity_type", String("utf-8")),
                        ("entity_name", String("utf-8")),
                    ),
                ),
                ("tags", TaggedFields),
            ),
        ),
    )


class AlterClientQuotasRequest_v0(Request):
    API_KEY = 49
    API_VERSION = 0
    RESPONSE_TYPE = AlterClientQuotasResponse_v0
    SCHEMA = Schema(
        (
            "entries",
            Array(
                ("entity_type", String("utf-8")),
                ("entity_name", String("utf-8")),
            ),
        ),
        (
            "ops",
            Array(
                ("key", String("utf-8")),
                ("value", Float64),
                ("remove", Boolean),
            ),
        ),
        ("validate_only", Boolean),
    )


class AlterClientQuotasRequest_v1(Request):
    API_KEY = 49
    API_VERSION = 1
    RESPONSE_TYPE = AlterClientQuotasResponse_v1
    SCHEMA = Schema(
        (
            "entries",
            Array(
                ("entity_type", String("utf-8")),
                ("entity_name", String("utf-8")),
            ),
        ),
        (
            "ops",
            Array(
                ("key", String("utf-8")),
                ("value", Float64),
                ("remove", Boolean),
            ),
        ),
        ("validate_only", Boolean),
        ("tags", TaggedFields),
    )


AlterClientQuotasResponse = [
    AlterClientQuotasResponse_v0,
    AlterClientQuotasResponse_v1,
]

AlterClientQuotasRequest = [
    AlterClientQuotasRequest_v0,
    AlterClientQuotasRequest_v1,
]
kafka/admin/client.py
request = AlterClientQuotasRequest[0](
    entries=[("user", "<...>.producer")],
    ops=[("producer_byte_rate", 1024, False)],
    validate_only=True,
    # tags={}
)
print("--> Quota Request :", request)
future = self._send_request_to_node(self._client.least_loaded_node(), request)
self._wait_for_futures([future])
print("-->>> FUTURE Value", future.value)
Running this code gets the following errors:
client output
File /<...>/py310/lib/python3.10/site-packages/kafka/admin/client.py:1342, in KafkaAdminClient._wait_for_futures(self, futures)
1339 self._client.poll(future=future)
1341 if future.failed():
-> 1342 raise future.exception
KafkaConnectionError: KafkaConnectionError: socket disconnected
server output
Feb 03 11:24:25 broker-05: org.apache.kafka.common.errors.InvalidRequestException: Error getting request for apiKey: ALTER_CLIENT_QUOTAS, apiVersion: 0, connectionId: , listenerName: ListenerName(SECURED), principal: User:
Feb 03 11:24:25 broker-05 Caused by: java.lang.RuntimeException: Tried to allocate a collection of size 292211, but there are only 71 bytes remaining.
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.message.AlterClientQuotasRequestData$EntryData.read(AlterClientQuotasRequestData.java:347)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.message.AlterClientQuotasRequestData$EntryData.&lt;init&gt;(AlterClientQuotasRequestData.java:300)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.message.AlterClientQuotasRequestData.read(AlterClientQuotasRequestData.java:125)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.message.AlterClientQuotasRequestData.&lt;init&gt;(AlterClientQuotasRequestData.java:73)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.requests.AlterClientQuotasRequest.parse(AlterClientQuotasRequest.java:142)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.requests.AbstractRequest.doParseRequest(AbstractRequest.java:269)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.requests.AbstractRequest.parseRequest(AbstractRequest.java:165)
Feb 03 11:24:25 broker-05 at org.apache.kafka.common.requests.RequestContext.parseRequest(RequestContext.java:95)
Feb 03 11:24:25 broker-05 at kafka.network.RequestChannel$Request.&lt;init&gt;(RequestChannel.scala:101)
Feb 03 11:24:25 broker-05 at kafka.network.Processor.$anonfun$processCompletedReceives$1(SocketServer.scala:1096)
Feb 03 11:24:25 broker-05 at java.base/java.util.LinkedHashMap$LinkedValues.forEach(LinkedHashMap.java:608)
Feb 03 11:24:25 broker-05 at kafka.network.Processor.processCompletedReceives(SocketServer.scala:1074)
Feb 03 11:24:25 broker-05 at kafka.network.Processor.run(SocketServer.scala:960)
Feb 03 11:24:25 broker-05 at java.base/java.lang.Thread.run(Thread.java:829)
Has anyone been able to implement this? Any help appreciated. Thank you!
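For what it's worth, in the spec quoted above each element of [entries] nests both its [entity] array and its [ops] array, while the request schema in the question declares entries and ops as two separate top-level arrays. A rough stdlib-only sketch of the v0 body layout (illustrative values, not a kafka-python API) shows the nesting; a broker reading a flat layout where it expects this nested one would plausibly fail with exactly the "Tried to allocate a collection of size ..." parse error shown:

```python
import struct

def kafka_string(s: str) -> bytes:
    """Kafka STRING: big-endian INT16 length followed by UTF-8 bytes."""
    b = s.encode("utf-8")
    return struct.pack(">h", len(b)) + b

def encode_v0_body(entries, validate_only: bool) -> bytes:
    """AlterClientQuotas v0 body: [entries] validate_only, where each entry is
    ([(entity_type, entity_name), ...], [(key, value, remove), ...])."""
    out = struct.pack(">i", len(entries))              # [entries] array length
    for entity, ops in entries:
        out += struct.pack(">i", len(entity))          # nested [entity] array
        for entity_type, entity_name in entity:
            out += kafka_string(entity_type) + kafka_string(entity_name)
        out += struct.pack(">i", len(ops))             # nested [ops] array
        for key, value, remove in ops:
            out += kafka_string(key)
            out += struct.pack(">d", value)            # value => FLOAT64
            out += struct.pack(">?", remove)           # remove => BOOLEAN
    out += struct.pack(">?", validate_only)            # validate_only => BOOLEAN
    return out

# Illustrative entry: one (user, name) entity with one producer_byte_rate op.
body = encode_v0_body(
    [([("user", "alice")], [("producer_byte_rate", 1024.0, False)])],
    validate_only=True,
)
```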

Drools: Sliding window:time is selecting even elapsed data

I want to check for anomalous events in a window of, say, 3 days. I have the following Drools code, which outputs all events instead of restricting them to just 3 days. The same problem happens even if I use 1 day instead of 3. I call fireAllRules after each insert.
If I use window:length(3) it works perfectly, but not by time. Is there anything wrong in what I am doing?
Also, another question about the @expires annotation: does it work on my custom timestamp, or from the time an event/fact is inserted into working memory?
I have put a simplified unit test at https://github.com/mtekp/droolsExperiments/tree/main/testdrools
I am using Drools 7.43.1.Final.
My kmodule.xml
<?xml version="1.0" encoding="UTF-8"?>
<kmodule xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xmlns="http://www.drools.org/xsd/kmodule">
  <kbase name="TestEventsC" packages="test.events.continuous" eventProcessingMode="stream">
    <ksession name="TestDroolEventsContinous" clockType="realtime"/>
  </kbase>
</kmodule>
The Drools rules
package test.events.continuous

declare AnomalyCount
    @role(event)
    @timestamp(anomalyDate)
    @expires(3d)
end

rule "insert anomaly count"
    dialect "java"
    when
        $aei: AnomalyEventInsert()
        not AnomalyCount(anomalyDate == $aei.ts)
    then
        insert(new AnomalyCount($aei.getTs(), 1));
        delete($aei);
end

rule "increment count"
    dialect "java"
    when
        $aei: AnomalyEventInsert()
        $ac: AnomalyCount(anomalyDate == $aei.ts)
    then
        modify($ac) { setAnomalyCount($ac.getAnomalyCount() + 1) };
        delete($aei);
end

rule "Check continuous 3 days"
    dialect "java"
    when
        $anomalies : List() from accumulate(AnomalyCount($dt : anomalyDate) over window:time(3d); collectList($dt))
    then
        System.out.println("anomalies: " + $anomalies);
end
AnomalyCount.java
import org.kie.api.definition.type.Role;
import java.util.Date;

@org.kie.api.definition.type.Role(Role.Type.EVENT)
@org.kie.api.definition.type.Timestamp("anomalyDate")
//@org.kie.api.definition.type.Expires("1d")
public class AnomalyCount {

    private Date anomalyDate;
    private int anomalyCount = 0;

    public AnomalyCount(Date anomalyDate, int anomalyCount) {
        this.anomalyDate = anomalyDate;
        this.anomalyCount = anomalyCount;
    }

    public Date getAnomalyDate() {
        return anomalyDate;
    }

    public void setAnomalyDate(Date anomalyDate) {
        this.anomalyDate = anomalyDate;
    }

    public int getAnomalyCount() {
        return anomalyCount;
    }

    public void setAnomalyCount(int anomalyCount) {
        this.anomalyCount = anomalyCount;
    }

    @Override
    public String toString() {
        return "AnomalyCount{" +
                "anomalyDate=" + anomalyDate +
                ", anomalyCount=" + anomalyCount +
                '}';
    }
}
AnomalyEventInsert.java
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Date;

public class AnomalyEventInsert extends AnomalyEvent {

    public AnomalyEventInsert(LocalDate ts, long sequenceCount, long value) {
        this.ts = Date.from(ts.atStartOfDay().toInstant(ZoneOffset.UTC));
        this.value = value;
        this.sequenceCount = sequenceCount;
    }

    public AnomalyEvent toAnomalyEvent() {
        return new AnomalyEvent(ts, sequenceCount, value);
    }

    @Override
    public String toString() {
        return "AnomalyEventInsert{" +
                "ts=" + ts +
                ", sequenceCount=" + sequenceCount +
                ", value=" + value +
                '}';
    }
}
When I insert data for the 6th, 7th, 8th and 9th, I get all four dates instead of the last 3 after the window moves.
Actual output:
anomalies: [Tue Oct 06 05:30:00 IST 2020, Wed Oct 07 05:30:00 IST 2020, Thu Oct 08 05:30:00 IST 2020, Fri Oct 09 05:30:00 IST 2020]
instead of the expected:
anomalies: [Wed Oct 07 05:30:00 IST 2020, Thu Oct 08 05:30:00 IST 2020, Fri Oct 09 05:30:00 IST 2020]
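Drools specifics aside, the sliding-window behaviour the question expects can be sketched in plain Python (a hypothetical helper, not Drools semantics: it keeps only events within 3 days of the newest timestamp):

```python
from datetime import datetime, timedelta

def window_time(events, width=timedelta(days=3)):
    """Keep only timestamps that fall within `width` of the newest event,
    mimicking what window:time(3d) is expected to retain."""
    newest = max(events)
    return [ts for ts in events if ts > newest - width]

# Events for Oct 6-9, each at 05:30, as in the question's output.
days = [datetime(2020, 10, d, 5, 30) for d in (6, 7, 8, 9)]
kept = window_time(days)
# Oct 6 05:30 is exactly 3 days before Oct 9 05:30, so it has elapsed;
# the 7th, 8th and 9th remain.
```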

Failed to send event to MongoDb

I am using Logstash to collect data from an Android device, with http_poller as the input plugin. I want to store the collected data in a NoSQL store (MongoDB) using the mongodb output plugin. I have successfully set up the configuration file, but when I run my set-up I get the following error:
Failed to send event to MongoDB {:event=>#&lt;LogStash::Event @cancelled=false, @data={"error"=>501, "@version"=>"1", "@timestamp"=>"2016-10-03T18:02:11.337Z", "http_poller_metadata"=>{"name"=>"some_other_service", "host"=>"s18660276.domainepardefaut.fr", "request"=>{"method"=>"post", "url"=>"http://127.0.0.1/web/app.php/api/addsmartlog"}, "runtime_seconds"=>0.526, "code"=>200, "response_headers"=>{"date"=>"Mon, 03 Oct 2016 18:02:11 GMT", "server"=>"Apache/2.2.22 (Debian)", "x-powered-by"=>"PHP/5.5.30-1~dotdeb+7.1", "cache-control"=>"no-cache", "content-length"=>"13", "keep-alive"=>"timeout=5, max=100", "connection"=>"Keep-Alive", "content-type"=>"application/json"}, "response_message"=>"OK", "times_retried"=>0}}, @metadata={}, ...>,
:exception=>#&lt;... "admin"}, @server_selection_timeout=30>>,
:backtrace=>[
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/server_selector/selectable.rb:99:in `select_server'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/cluster.rb:122:in `next_primary'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/collection.rb:190:in `insert_many'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/collection.rb:175:in `insert_one'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-mongodb-2.0.5/lib/logstash/outputs/mongodb.rb:56:in `receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:109:in `multi_receive'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:114:in `multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:232:in `worker_loop'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:201:in `start_workers'"
], :level=>:warn}
^CSIGINT received. Shutting down the agent. {:level=>:warn}
Can anyone help out?

Retrieve AEM Page Properties via Search/QueryBuilder API

Is there a way to retrieve data stored as page metadata (Page Properties) stored in a separate JCR node via the QueryBuilder API?
Example:
The search results should include the data under ~/article-1/jcr:content/thumbnail. However, the only results I am getting are the data under ~/article-1/jcr:content/content (a parsys included on the template).
An example query:
http://localhost:4502/bin/querybuilder.json?p.hits=full&path=/content/path/to/articles
Results in (snippet):
{
"jcr:path":"/content/path/to/articles/article-1",
"jcr:createdBy":"admin",
"jcr:created":"Tue Dec 03 2013 16:26:51 GMT-0500",
"jcr:primaryType":"cq:Page"
},
{
"jcr:path":"/content/path/to/articles/article-1/jcr:content",
"sling:resourceType":"myapp/components/global/page/productdetail",
"jcr:lockIsDeep":true,
"jcr:uuid":"4ddebe08-82e1-44e9-9197-4241dca65bdf",
"jcr:title":"Article 1",
"jcr:mixinTypes":[
"mix:lockable",
"mix:versionable"
],
"jcr:created":"Tue Dec 03 2013 16:26:51 GMT-0500",
"jcr:baseVersion":"24cabbda-1e56-4d37-bfba-d0d52aba1c00",
"cq:lastReplicationAction":"Activate",
"jcr:isCheckedOut":true,
"cq:template":"/apps/myapp/templates/global/productdetail",
"cq:lastModifiedBy":"admin",
"jcr:primaryType":"cq:PageContent",
"jcr:predecessors":[
"24cabbda-1e56-4d37-bfba-d0d52aba1c00"
],
"cq:tags":[
"mysite:mytag"
],
"jcr:createdBy":"admin",
"jcr:versionHistory":"9dcd41d4-2e10-4d52-b0c0-1ea20e102e68",
"cq:lastReplicatedBy":"admin",
"cq:lastModified":"Mon Dec 09 2013 17:57:59 GMT-0500",
"cq:lastReplicated":"Mon Dec 16 2013 11:42:54 GMT-0500",
"jcr:lockOwner":"admin"
}
Search configuration is the out-of-the-box default.
EDIT: The data is returned in the JSON; however, it is not accessible in the API:
Result:
{
"success":true,
"results":2,
"total":2,
"offset":0,
"hits":[
{
"jcr:path":"/content/path/to/articles/article-a",
"jcr:createdBy":"admin",
"jcr:created":"Tue Dec 03 2013 16:27:01 GMT-0500",
"jcr:primaryType":"cq:Page",
"jcr:content":{
"sling:resourceType":"path/to/components/global/page/productdetail",
"_comment":"// ***SNIP*** //",
"thumbnail":{
"jcr:lastModifiedBy":"admin",
"imageRotate":"0",
"jcr:lastModified":"Wed Dec 04 2013 12:10:47 GMT-0500",
"jcr:primaryType":"nt:unstructured"
}
}
},
{
"jcr:path":"/content/path/to/articles/article-1",
"jcr:createdBy":"admin",
"jcr:created":"Tue Dec 03 2013 16:26:51 GMT-0500",
"jcr:primaryType":"cq:Page",
"jcr:content":{
"sling:resourceType":"path/to/components/global/page/productdetail",
"_comment":"// ***SNIP*** //",
"thumbnail":{
"jcr:lastModifiedBy":"admin",
"imageRotate":"0",
"fileReference":"/content/dam/path/to/IBMDemo/apparel/women/wsh005_shoes/WSH005_0533_is_main.jpg",
"jcr:lastModified":"Mon Dec 09 2013 17:57:58 GMT-0500",
"jcr:primaryType":"nt:unstructured"
}
}
}
]
}
Implementation code:
searchCriteria.put("path", path);
searchCriteria.put("type", "cq:Page");
searchCriteria.put("p.offset", offset.toString());
searchCriteria.put("p.limit", limit.toString());
searchCriteria.put("p.hits", "full");
searchCriteria.put("p.properties", "thumbnail");
searchCriteria.put("p.nodedepth", "2");
PredicateGroup predicateGroup = PredicateGroup.create(searchCriteria);
Query query = queryBuilder.createQuery(predicateGroup, session);
SearchResult result = query.getResult();
for (Hit hit : result.getHits()) {
    try {
        ValueMap properties = hit.getProperties();
        VFSearchResult res = new VFSearchResult();
        res.setUrl(hit.getPath());
        res.setImageUrl((String) properties.get("thumbnail"));
        res.setTags((String[]) properties.get("cq:tags"));
        res.setTeaserText((String) properties.get("teaserText"));
        res.setTitle((String) properties.get("jcr:title"));
        searchResults.add(res);
    } catch (RepositoryException rex) {
        logger.debug(String.format("could not retrieve node properties: %1s", rex));
    }
}
After setting the path in the query, set one or more property filters, such as in this example:
type=cq:Page
path=/content/path/to/articles
property=jcr:content/thumbnail
property.operation=exists
p.hits=selective
p.properties=jcr:content/thumbnail someSpecificThumbnailPropertyToRetrieve
p.limit=100000
You can set those on /libs/cq/search/content/querydebug.html and then also use that to get the JSON URL for the same query.
Check this out for some other examples: http://dev.day.com/docs/en/cq/5-5/dam/customizing_and_extendingcq5dam/query_builder.html
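As a side note, a predicate map like the one above can be turned into the querybuilder.json URL programmatically. Here is a small stdlib-only Python sketch (host and predicate values taken from the question, purely illustrative):

```python
from urllib.parse import urlencode

# Predicates as you would enter them in querydebug.html.
predicates = {
    "type": "cq:Page",
    "path": "/content/path/to/articles",
    "property": "jcr:content/thumbnail",
    "property.operation": "exists",
    "p.hits": "selective",
    "p.properties": "jcr:content/thumbnail",
    "p.limit": "100000",
}

# urlencode percent-escapes the ':' and '/' characters in the values.
url = "http://localhost:4502/bin/querybuilder.json?" + urlencode(predicates)
print(url)
```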
You could use curl to retrieve node properties in HTML/JSON/XML format. All you have to do is download and install curl and run your curl commands from a terminal (on Windows, from the same directory as curl's .exe file).
HTML example:
C:\Users\****\Desktop>curl -u username:password http://localhost:4502/content/geometrixx-media/en/gadgets/leaps-in-ai.html
JSON example:
C:\Users\****\Desktop>curl -u username:password http://localhost:4502/content/geometrixx-media/en/gadgets/leaps-in-ai.html.tidy.infinity.json
Note: infinity in the above query ensures you get every property of every node under your specified path, recursively.

Rails I18n locale set date format in Rails

My environment: rails => 3.0.6, ruby => 1.9.2
I set my locale to Italian. In fact, inside the console:
I18n.locale # => :it
My locale file works just fine, but I can't make my dates display right. For example, in my console:
Date.current => Sun, 05 Jun 2011
instead of
05 Giugno 2011
But if I try other methods, they return the right translated output:
helper.number_to_currency(30) # => "30.00 €"
The locale issue occurs only with dates. Why?
Date.current # => Sun, 05 Jun 2011
won't run its output through the localizer; you should use one of:
I18n.localize(Date.current)
I18n.l(Date.current)
There are also the helper methods in Rails, which respect the locale but are (typically) only available in the view; documentation for these lives here: http://api.rubyonrails.org/classes/ActionView/Helpers/DateHelper.html
Here's a short excerpt from an IRB session in a Rails 3.0.7 application (I don't have the other locales available)
ruby-1.9.2-p180 :001 > Date.current
=> Sun, 05 Jun 2011
ruby-1.9.2-p180 :002 > I18n.locale
=> :en
ruby-1.9.2-p180 :003 > I18n.l(Date.current)
=> "2011-06-05"
ruby-1.9.2-p180 :004 > I18n.locale = :ru
=> :ru
ruby-1.9.2-p180 :005 > I18n.l(Date.current)
=> I18n::MissingTranslationData: translation missing: ru.date.formats.default
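The MissingTranslationData error above is the crux: I18n.l only localizes once the locale file defines the date formats. As a rough illustration in plain Ruby (no i18n gem; the month names and the "%d %B %Y" pattern stand in for what an it.yml would define under it.date):

```ruby
require "date"

# Hypothetical Italian month names, as they would appear under
# it.date.month_names in an it.yml locale file (index 0 unused, as in Rails).
MONTH_NAMES_IT = [nil, "Gennaio", "Febbraio", "Marzo", "Aprile", "Maggio",
                  "Giugno", "Luglio", "Agosto", "Settembre", "Ottobre",
                  "Novembre", "Dicembre"]

# Mimic what I18n.l(date, format: :long) would do for a "%d %B %Y" pattern.
def localize_it(date)
  format("%02d %s %d", date.day, MONTH_NAMES_IT[date.month], date.year)
end

puts localize_it(Date.new(2011, 6, 5))   # prints "05 Giugno 2011"
```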
Try
I18n.localize(Date.today)
or in the view just
l(Date.today)
Source:
http://guides.rubyonrails.org/i18n.html#adding-date-time-formats