Illegal character in query at index 40: facebook

Sorry, I am reformulating the question; I was so frustrated by this error that I posted the question in a snap.
I am trying to use the camel-facebook component with a very simple route that appears as follows in the blueprint.xml file:
<from uri="facebook://me?oAuthAppId={{oAuthAppId}}&oAuthAppSecret={{oAuthAppSecret}}&oAuthAccessToken={{oAuthAccessToken}}&consumer.delay=86400000"/>
I am using:
Red Hat JBoss Developer Studio
Version: 10.1.0.GA
I can actually see that the bundle has started:
[ 348] [Active ] [Created ] [ ] [ 80] MyApp [fbdemo] (1.0.0.SNAPSHOT)
Also:
[ 333] [Active ] [ ] [ ] [ 50] camel-facebook (2.17.0.redhat-630187)
Nevertheless, I get the error mentioned above; I have masked the oAuth* values with XXXXXXXXX.
2017-02-14 16:02:16,128 | ERROR | 68)-192.168.56.1 | BlueprintCamelContext | 234 - org.apache.camel.camel-blueprint - 2.17.0.redhat-630187 | Error occurred during starting Camel: CamelContext(blueprintContext) due Failed to create route fbRoute: Route(fbRoute)[[From[facebook://me?oAuthAppId={{oAuthAppId}}... because of Failed to resolve endpoint: facebook://me?oAuthAppId=XXXXXXXXXXXXXX&oAuthAppSecret=XXXXXXXXXXXXXXXXXXXXXX&oAuthAccessToken=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&consumer.delay=86400000 due to: Illegal character in query at index 40: facebook://me?oAuthAppId=XXXXXXXXXXXXXXXXXXXXXXX&oAuthAppSecret=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&oAuthAccessToken=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&consumer.delay=86400000
org.apache.camel.FailedToCreateRouteException: Failed to create route fbRoute: Route(fbRoute)[[From[facebook://me?oAuthAppId={{oAuthAppId}}... because of Failed to resolve endpoint: facebook://me?oAuthAppId=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&oAuthAppSecret=XXXXXXXXXXXXXXXXXXXXXXXXXXXX&oAuthAccessToken=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&consumer.delay=86400000 due to: Illegal character in query at index 40: facebook://me?oAuthAppId=XXXXXXXXXXXXXXXX
Should the oAuthAccessToken be the application access token or the user access token that I get from the Facebook Graph API Explorer? Note that I don't have any special characters in the app secret; there is only a | (pipe) in the case where the access token is the application access token rather than the user access token. How do I figure out what is at index 40?
Thank you very much

If you are using XML then you need to check that all "&" symbols are encoded as "&amp;", so it will look like:
<from uri="facebook://me?oAuthAppId=XXXXXXXXXXXXX&amp;oAuthAppSecret=XXXXXXXXXXXXXXXXXXXXXXXXX&amp;oAuthAccessToken=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&amp;consumer.delay=86400000" />
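Note also that an application access token has the form appId|appSecret, and a raw | (pipe) is not a legal character in a URI query; the "index 40" in the error is simply the position of the offending character within the query string. If you do use the application access token rather than a user token, percent-encode the pipe, along these lines (the value shown is hypothetical):
oAuthAccessToken=123456789%7Cabcdef0123456789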


Flow Enums correctly parsed but not transformed

In my React app, I'm trying to migrate from my "old school" JS enums to Flow Enums:
https://flow.org/en/docs/enums/
(I think) I've done everything listed here:
https://flow.org/en/docs/enums/enabling-enums/
eslint and flow check are both happy (zero errors), and the enums work as expected when I type code.
But when I start my app, they are not transformed and I get this:
ERROR in ./src/types.js
Module build failed (from ../../node_modules/babel-loader/lib/index.js):
SyntaxError: C:\foo\src\types.js: Unexpected token, expected "{" (16:7)
14 | |};
15 |
> 16 | export enum FooEnum {
| ^
17 | On,
18 | Off,
19 | Default
at instantiate (C:\foo\node_modules\@babel\parser\lib\index.js:72:32)
at constructor (C:\foo\node_modules\@babel\parser\lib\index.js:366:12)
at FlowParserMixin.raise (C:\foo\node_modules\@babel\parser\lib\index.js:3453:19)
at FlowParserMixin.unexpected (C:\foo\node_modules\@babel\parser\lib\index.js:3491:16)
at FlowParserMixin.parseExport (C:\foo\node_modules\@babel\parser\lib\index.js:16044:16)
at FlowParserMixin.parseExport (C:\foo\node_modules\@babel\parser\lib\index.js:6170:24)
at FlowParserMixin.parseStatementContent (C:\foo\node_modules\@babel\parser\lib\index.js:14893:27)
at FlowParserMixin.parseStatement (C:\foo\node_modules\@babel\parser\lib\index.js:14777:17)
at FlowParserMixin.parseStatement (C:\foo\node_modules\@babel\parser\lib\index.js:5951:24)
at FlowParserMixin.parseBlockOrModuleBlockBody (C:\foo\node_modules\@babel\parser\lib\index.js:15420:25)
Package-wise, everything is at its latest version, and I've installed:
babel-plugin-transform-flow-enums
eslint-plugin-ft-flow
flow-enums-runtime
My Babel config is:
"babel": {
"plugins": [
"#babel/plugin-proposal-class-properties",
[
"#babel/plugin-syntax-flow",
{
"enums": true
}
],
"babel-plugin-transform-flow-enums"
],
"presets": [
"#babel/preset-env",
"#babel/preset-flow",
"#babel/preset-react"
]
},
Also, calling Babel from the command line correctly transforms the enum. I'm using this command:
npx babel src/types.js
What could I have missed?
So, after struggling for hours, I eventually found out that
react-app-rewired was messing with my Babel plugins.
I ended up installing customize-cra, which allowed me to explicitly use my Babel config:
const {useBabelRc, override} = require('customize-cra');

module.exports = override(
  useBabelRc()
);
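For reference, useBabelRc() makes the create-react-app build pick up a .babelrc file from the project root; a minimal sketch, assuming the Babel config from package.json above is moved there:
{
  "plugins": [
    "@babel/plugin-proposal-class-properties",
    ["@babel/plugin-syntax-flow", {"enums": true}],
    "babel-plugin-transform-flow-enums"
  ],
  "presets": [
    "@babel/preset-env",
    "@babel/preset-flow",
    "@babel/preset-react"
  ]
}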

Insert a document with a string containing line breaks into Mongo using the mongo shell

I am trying to insert the following document into a mongo collection:
[
  {
    "text": "tryng to insert a string
with some line breaks",
  }
]
By running db.myCollection.insertMany(documentArray) where documentArray is just me copy-pasting this array.
But I am getting this error:
> db.myCollection.insertMany([
... {
... "text": "tryng to insert a string
uncaught exception: SyntaxError: "" literal not terminated before end of script :
@(shell):3:37
> with some line breaks",
uncaught exception: SyntaxError: missing ( before with-statement object :
@(shell):1:5
> }
uncaught exception: SyntaxError: expected expression, got '}' :
@(shell):1:0
> ]
This obviously happens because the shell treats the newline character as the end of the command, so it tries to run a command that is not yet complete.
Is there any way of saving \r\n and \n characters in MongoDB? Should I use another method rather than the shell directly?
Both Mongo and the shell are version 4.4.15
Try doing this instead. The key is to embed the newline character in the string.
[
  {
    "text": "trying to insert a string\n" +
            "with some line breaks",
  }
]
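The complete shell call then becomes (using the same collection name as in the question):
db.myCollection.insertMany([
  {
    "text": "trying to insert a string\n" +
            "with some line breaks"
  }
])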

Logstash won't start when adding a match statement in a grok block

I'm having difficulty with starting Logstash.
My logstash.conf looks like this:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{WORD:event_type}\t%{NUMBER:server_time}\t%{NUMBER:market_time}\t%{WORD:instrument}\t%{C_NUMBER:last_price}\t%{C_NUMBER:trade_quantity}\t%{C_NUMBER:bid_price}\t%{C_NUMBER:bid_quantity}\t%{C_NUMBER:ask_price}\t%{C_NUMBER:ask_quantity}\t%{GREEDYDATA:flags}\t%{GREEDYDATA:additional_infos}" }
  }
  # ... and other stuff here...
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "%{[@metadata][beat]}"
  }
}
Logstash works fine if I comment out the match => line. But with it, it does not start: nothing shows up when I run netstat -na | grep 5044 in the container. It is simply not listening on 5044.
And when I try to run Logstash manually by /opt/logstash/bin/logstash --path.data /tmp/logstash/data -f /etc/logstash/conf.d/filebeat-config.conf, I get the following:
Sending Logstash's logs to /opt/logstash/logs which is now configured via log4j2.properties
[2018-08-27T09:35:25,883][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/tmp/logstash/data/queue"}
[2018-08-27T09:35:25,887][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/tmp/logstash/data/dead_letter_queue"}
[2018-08-27T09:35:26,177][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-08-27T09:35:26,213][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"5abcdba2-475f-46a9-b192-a343ca15ce89", :path=>"/tmp/logstash/data/uuid"}
[2018-08-27T09:35:26,727][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2018-08-27T09:35:29,016][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-08-27T09:35:29,316][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-08-27T09:35:29,325][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-08-27T09:35:29,467][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-08-27T09:35:29,510][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-08-27T09:35:29,513][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-08-27T09:35:29,533][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-08-27T09:35:29,549][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-08-27T09:35:29,565][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-08-27T09:35:29,689][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x68bd7527 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id=\"e473071da674c7efab2a8ee71c9e682afff58b8a4725d076964bc668f3b2c724\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x5867faed @metric=#<LogStash::Instrument::Metric:0x61ef1454 @collector=#<LogStash::Instrument::Collector:0x51306706 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x5227344a @store=#<Concurrent::Map:0x00000000000fb4 entries=2 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x7efeb9ea>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=75 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :e473071da674c7efab2a8ee71c9e682afff58b8a4725d076964bc668f3b2c724, :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>[\"./patterns\"], match=>{\"message\"=>\"%{WORD:event_type}\\\\t%{NUMBER:server_time}\\\\t%{NUMBER:market_time}\\\\t%{WORD:instrument}\\\\t%{C_NUMBER:last_price}\\\\t%{C_NUMBER:trade_quantity}\\\\t%{C_NUMBER:bid_price}\\\\t%{C_NUMBER:bid_quantity}\\\\t%{C_NUMBER:ask_price}\\\\t%{C_NUMBER:ask_quantity}\\\\t%{GREEDYDATA:flags}\\\\t%{GREEDYDATA:additional_infos}\"}, id=>\"e473071da674c7efab2a8ee71c9e682afff58b8a4725d076964bc668f3b2c724\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{C_NUMBER:last_price} not defined", :thread=>"#<Thread:0x20b6525c run>"}
[2018-08-27T09:35:29,699][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{C_NUMBER:last_price} not defined>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:340:in `register_plugin'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:351:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:351:in `register_plugins'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:729:in `maybe_setup_out_plugins'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:361:in `start_workers'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `run'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"], :thread=>"#<Thread:0x20b6525c run>"}
[2018-08-27T09:35:29,724][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
Also, next to my logstash.conf, I have a patterns directory containing a file with the following:
USERNAME [a-zA-Z0-9._-]+
USER %{USERNAME}
INT (?:[+-]?(?:[0-9]+))
BASE10NUM (?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\.[0-9]+)?)|(?:\.[0-9]+)))
NUMBER (?:%{BASE10NUM})
C_NUMBER (?:[+-]?(?:[(0-9)|(*,#,.)]+))
C_NUMBER2 (?:[+-]?(?:[(0-9)|(*,#,.)|null]+))
BASE16NUM (?<![0-9A-Fa-f])(?:[+-]?(?:0x)?(?:[0-9A-Fa-f]+))
BASE16FLOAT \b(?<![0-9A-Fa-f.])(?:[+-]?(?:0x)?(?:(?:[0-9A-Fa-f]+(?:\.[0-9A-Fa-f]*)?)|(?:\.[0-9A-Fa-f]+)))\b
POSINT \b(?:[1-9][0-9]*)\b
NONNEGINT \b(?:[0-9]+)\b
WORD \b\w+\b
NOTSPACE \S+
SPACE \s*
DATA .*?
GREEDYDATA .*
QUOTEDSTRING (?>(?<!\\)(?>"(?>\\.|[^\\"]+)+"|""|(?>'(?>\\.|[^\\']+)+')|''|(?>(?>\\.|[^\\]+)+`)|``))
UUID [A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}
MAC (?:%{CISCOMAC}|%{WINDOWSMAC}|%{COMMONMAC})
CISCOMAC (?:(?:[A-Fa-f0-9]{4}\.){2}[A-Fa-f0-9]{4})
WINDOWSMAC (?:(?:[A-Fa-f0-9]{2}-){5}[A-Fa-f0-9]{2})
COMMONMAC (?:(?:[A-Fa-f0-9]{2}:){5}[A-Fa-f0-9]{2})
MONTH \b(?:Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|Nov(?:ember)?|Dec(?:ember)?)\b
MONTHNUM (?:0?[1-9]|1[0-2])
MONTHDAY (?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
DAY (?:Mon(?:day)?|Tue(?:sday)?|Wed(?:nesday)?|Thu(?:rsday)?|Fri(?:day)?|Sat(?:urday)?|Sun(?:day)?)
YEAR (?>\d\d){1,2}
HOUR (?:2[0123]|[01]?[0-9])
MINUTE (?:[0-5][0-9])
SECOND (?:(?:[0-5][0-9]|60)(?:[:.,][0-9]+)?)
TIME (?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9])
DATE_US %{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}
DATE_EU %{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}
ISO8601_TIMEZONE (?:Z|[+-]%{HOUR}(?::?%{MINUTE}))
ISO8601_SECOND (?:%{SECOND}|60)
TIMESTAMP_ISO8601 %{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?
TIMESTAMP_CUSTOM %{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND}.?%{NUMBER})?%{ISO8601_TIMEZONE}?
DATE %{DATE_US}|%{DATE_EU}
DATESTAMP %{DATE}[- ]%{TIME}
TZ (?:[PMCE][SD]T|UTC)
DATESTAMP_RFC822 %{DAY} %{MONTH} %{MONTHDAY} %{YEAR} %{TIME} %{TZ}
DATESTAMP_OTHER %{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TZ} %{YEAR}
What is wrong with the match => line?
I highly appreciate your help.
You're attempting to use a grok pattern, %{C_NUMBER}, that Logstash doesn't know about; it isn't a standard pattern bundled with Logstash. Put %{NUMBER} in its place and restart Logstash.
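A sketch of the match line with the bundled pattern substituted (note that %{NUMBER} may not match everything %{C_NUMBER} was designed to capture, e.g. values containing * or #):
match => { "message" => "%{WORD:event_type}\t%{NUMBER:server_time}\t%{NUMBER:market_time}\t%{WORD:instrument}\t%{NUMBER:last_price}\t%{NUMBER:trade_quantity}\t%{NUMBER:bid_price}\t%{NUMBER:bid_quantity}\t%{NUMBER:ask_price}\t%{NUMBER:ask_quantity}\t%{GREEDYDATA:flags}\t%{GREEDYDATA:additional_infos}" }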
I was able to resolve the issue by changing patterns_dir => ["./patterns"] to patterns_dir => ["/etc/logstash/conf.d/patterns"].
The match line references a grok pattern that Logstash couldn't find because of the relative path to the patterns directory. See the sketch below.
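The resulting grok block, assuming the patterns file does live under /etc/logstash/conf.d/patterns (the custom C_NUMBER pattern is kept, since it now resolves):
filter {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => { "message" => "%{WORD:event_type}\t%{NUMBER:server_time}\t%{NUMBER:market_time}\t%{WORD:instrument}\t%{C_NUMBER:last_price}\t%{C_NUMBER:trade_quantity}\t%{C_NUMBER:bid_price}\t%{C_NUMBER:bid_quantity}\t%{C_NUMBER:ask_price}\t%{C_NUMBER:ask_quantity}\t%{GREEDYDATA:flags}\t%{GREEDYDATA:additional_infos}" }
  }
}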

CodeMirror - color a line based on a tag

We have a log file in log4j style:
[ main] [DEBUG] (14:25:46.832 CET) server started
[ main] [ INFO] (14:25:46.832 CET) I'm a nice info line
[ main] [ERROR] (14:25:46.832 CET) wrong user password for user '..'
Each line, depending on whether it's DEBUG, INFO, or ERROR, should be colored differently (the whole line).
How can this be done with CodeMirror?
Using defineSimpleMode, this is straightforward with one regular expression per level:
CodeMirror.defineSimpleMode("log4j", {
  start: [
    {regex: /.*\[FATAL\].*/, token: "log4j-fatal"},
    {regex: /.*\[ERROR\].*/, token: "log4j-error"},
    {regex: /.*\[ WARN\].*/, token: "log4j-warn"},
    {regex: /.*\[ INFO\].*/, token: "log4j-info"},
    {regex: /.*\[DEBUG\].*/, token: "log4j-debug"},
    {regex: /.*\[TRACE\].*/, token: "log4j-trace"},
  ],
});
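CodeMirror emits each token name as a CSS class prefixed with cm-, so the colors themselves go in a stylesheet along these lines (the colors are only an example):
.cm-log4j-fatal, .cm-log4j-error { color: #cc0000; }
.cm-log4j-warn { color: #cc8800; }
.cm-log4j-info { color: #008800; }
.cm-log4j-debug, .cm-log4j-trace { color: #888888; }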

GCS Transfer via Source: URL list "errorCode": "UNKNOWN"

I'm trying to transfer 7,860,379 files using the transfer system via URL list, but I always encounter the same error:
{ //...
  "errorBreakdowns": [
    {
      "errorCode": "UNKNOWN",
      "errorCount": "1",
      "errorLogEntries": [
        {
          "url": " or ",
          "errorDetails": [
            ""
          ]
        }
      ]
    }
  ]
  // ...
}
All my URLs are valid, and the file format is as documented:
TsvHttpData-1.0
^([^ ]+)\t([0-9]+)\t([a-f0-9]{32})$
The error the API returns is very generic; has anyone run into the same problem?
Thanks in advance.
Based on your regex, I suspect you are not providing a base64-encoded MD5, as it often contains '=' characters. To get this, you need to compute the binary (raw) MD5 digest and then convert it to base64.
Example: Hk2gdsIpWTDz3kQssoTqKg==
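A minimal sketch of producing that value with Node's built-in crypto module (the file name is hypothetical):
const crypto = require('crypto');
const fs = require('fs');

// Hash the file's raw bytes with MD5 and print the digest base64-encoded,
// which is the form described above rather than the 32-char hex form.
const bytes = fs.readFileSync('myfile.bin');
console.log(crypto.createHash('md5').update(bytes).digest('base64'));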