Fiddler AutoResponder - Regular Expression to ignore timestamp in URL

I just started using Fiddler in my project for debugging, but I haven't figured out how to handle the following case with the AutoResponder.
I need to replace the timestamp in the request URL and point it to my own path, e.g.
from http://www.test.com/static/20140828/js/test.js
to http://www.test.com/static/mycode/js/test.js.
The timestamp "20140828" changes frequently, so I'd like a rule that matches and performs this replacement automatically, without my having to update the timestamp every time.
I tried a regex but couldn't find one that works for this case. Any suggestions would be appreciated. Thanks.

You're right that you need to use a regular expression, though you haven't said which expression you tried.
You'll probably want to use something like:
MatchText REGEX:^.*test\.com/static/\d+/(.*)$
ActionText http://www.test.com/static/mycode/$1
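If you want to sanity-check the pattern before wiring it into Fiddler, here is a minimal Python sketch of the same match-and-rewrite (the sample URL is taken from the question; everything else mirrors the rule above):

import re

# \d+ swallows the changing timestamp segment; group 1 captures
# everything after it, which the action re-attaches to the new path.
pattern = re.compile(r"^.*test\.com/static/\d+/(.*)$")

url = "http://www.test.com/static/20140828/js/test.js"
match = pattern.match(url)
if match:
    redirected = "http://www.test.com/static/mycode/" + match.group(1)
    print(redirected)  # http://www.test.com/static/mycode/js/test.js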

Related

How do I reduce the number of 301 redirect entries using wildcards and variables in Squarespace?

I recently renamed all of the URLs that make up my blog... and have written redirects for almost every page... using wildcards where I can... keeping in mind... all that I know is the * wildcard at this time...
Here is an example of what I have...
/season-1/2017/1/1/snl-s01e01-host-george-carlin -> /season-1/snl-s01e01-george-carlin 301
I want to write a catch-all that will redirect all 38 seasons of reviews with one redirect entry... but I can't figure out how to get rid of just the word "host" between s01e01- and -george-carlin... and was thinking it would work something like this...
/season-*/*/*/*/snl-s*e*-host-*-* -> /season-*/snl-s*e*[code to remove the word "host"]-*-* 301
Is that even close to being correct? Do I need that many *s?
Thanks in advance for any help...
Unfortunately, you won't be able to reduce the number of individual redirect entries using the redirect features that Squarespace has to offer, namely the wildcard (*) and a single variable ([name]). Multiple variables would be needed, but only [name] is supported.
The closest you can get is:
/season-1/*/*/*/snl-s01e01-host-[name] -> /season-1/snl-s01e01-[name] 301
But, if I'm understanding things, while the above redirect appears more general, it would still need to be copy/pasted for each post individually. So although it demonstrates the best that could be achieved, it is not a technical improvement.
Therefore, there are only two alternatives:
Create a Google Sheet (or other spreadsheet) where the old URLs are pasted into column one, a formula using ARRAYFORMULA and regular expressions parses the old URL and generates the new URL in column two, and a formula in column three joins the two cells with -> and 301. With that done, you could select all cells in column three, copy them, and paste them into the "URL Shortcuts" text area in Squarespace. It can be quite time-consuming to figure out, write and test the correct formula, but it does avoid having to manually type out every redirect. Whether it is less time/effort in total depends on the number of redirects and one's proficiency with writing spreadsheet formulas. Using the redirect code above might also simplify the formula that'd need to be written in the spreadsheet, which may save some time.
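If you'd rather script the transformation than wrestle with spreadsheet formulas, the same parsing can be done offline. A minimal Python sketch, assuming the URL shape from the example above (the second sample URL is invented for illustration):

import re

old_urls = [
    "/season-1/2017/1/1/snl-s01e01-host-george-carlin",
    "/season-1/2017/1/8/snl-s01e02-host-hypothetical-guest",  # invented example
]

# Capture the season prefix, the sNNeNN episode code, and the guest slug,
# dropping the date segments and the literal "host-" in between.
pattern = re.compile(r"^(/season-\d+)/\d+/\d+/\d+/(snl-s\d+e\d+)-host-(.*)$")

for old in old_urls:
    new = pattern.sub(r"\1/\2-\3", old)
    print("{} -> {} 301".format(old, new))
# /season-1/2017/1/1/snl-s01e01-host-george-carlin -> /season-1/snl-s01e01-george-carlin 301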
Another alternative would be to remove your redirects and instead handle the redirect via JavaScript added to the 404/Page-not-found page. Because it sounds like you already have all of the redirects in place but are simply trying to reduce the overall number, I wouldn't recommend changing to a JavaScript-based approach. There are other drawbacks to using JavaScript, in any case.

Grok filter for a time counter HH:MM

I'm quite new to ELK and Grok-filtering, and I'm struggling with parsing this particular pattern in my grok filter.
I've used the grok debugger to try and solve this, but although I like the tool, I just get confused by the custom patterns.
Eventually, I hope to parse lots of log files sent by filebeat to logstash, then send the parsed logs to elasticsearch and display with kibana or some similar visualization tool.
The lines that I need to parse follow the following pattern:
1310 2017-01-01 16:48:54 [325:51] [326:49] [359:57] Some log info text
The first four digits are a log-type identifier and will be used for grouping. I've called the field "LogLineID".
The date is formatted YYYY-MM-DD HH:MM:SS, and is parsed ok. I called the field "LogDate".
But now the problem begins. Within the square brackets, I have counters, formatted as MM:SS if you like. I cannot for the life of me find a way to sort these out, but I need to compare these times, hence I want to store them as minutes and seconds, not just numbers.
The first is a counter "TimeSpent",
the second is a counter "TimeStarted" and
the third is a counter "TimeSinceDown".
Then, last, comes the info text, which I've managed to grok with simply applying %{GREEDYDATA:LogInfo}.
I notice that the number of minutes can be far higher than the standard 60 minutes within an hour, so I may be barking up the wrong tree trying to parse it with date patterns such as TIMESTAMP_ISO8601, but then, I don't really know how else to do this.
So, I came this far:
%{NUMBER:LogLineID} %{TIMESTAMP_ISO8601:LogDate}
and, as mentioned, was able (by cutting away the square-bracket parts) to parse the log info text with
%{GREEDYDATA:LogInfo}
to create a field LogInfo.
But that's where I'm stuck. Could someone please help me figure out the rest?
Massive thanks in advance.
PS! I also found %{NUMBER:duration}, but as far as I could tell it only parses timestamps with a dot, not a colon.
A grok regex expression can help you solve this.
But first, to make sure I understand: do you mean that [325:51] [326:49] [359:57] are the three components you want to fetch, so that the result looks like:
TimeSpent: 325:51
TimeStarted: 326:49
TimeSinceDown: 357:57
If I've got that right, you can use one of the following suggestions (both sketched below):
define your own custom pattern file and add the pattern to it, or
use the expression directly in the filter part of the logstash conf file.
Hope this helps.
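A hedged sketch of both suggestions (the pattern name MMSS, the patterns directory and the whole-counter field names are my own placeholders, not something from the question):

In a custom pattern file, e.g. ./patterns/extra:

MMSS %{NUMBER}:%{NUMBER}

Then reference it from the grok filter in the logstash conf file:

filter {
  grok {
    # Load the custom pattern file defined above.
    patterns_dir => ["./patterns"]
    match => { "message" => "%{NUMBER:LogLineID} %{TIMESTAMP_ISO8601:LogDate} \[%{MMSS:TimeSpent}\] \[%{MMSS:TimeStarted}\] \[%{MMSS:TimeSinceDown}\] %{GREEDYDATA:LogInfo}" }
  }
}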
Ah, there was a space... Actually, I was misleading myself and everybody in my question, as it was not actually that log line that was causing problems. I just took the first one, not realizing where the problem really was. The one causing problems had a space within the brackets, as such: [ 42:31]. There are also some parts where there are two spaces, so the way I managed to solve this was to include a %{SPACE} between the \[ and the %{NUMBER}:
%{NUMBER:LogLineID} %{TIMESTAMP_ISO8601:LogDate} \[%{SPACE}%{NUMBER:TimeSpentMinutes}\:%{NUMBER:TimeSpentSeconds}\] \[%{SPACE}%{NUMBER:TimeStartedMinutes}\:%{NUMBER:TimeStartedSeconds}\] \[%{SPACE}%{NUMBER:TimeSinceDownMinutes}\:%{NUMBER:TimeSinceDownSeconds}\] %{GREEDYDATA:LogText}
I still haven't solved the merging of the minutes and seconds, but I can handle that at a later stage.
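One possible approach to that merging (an untested sketch, assuming the Logstash 5+ ruby filter event API and the field names from the pattern above) is to fold each pair into a single total-seconds field, which is then trivially comparable:

filter {
  ruby {
    # Combine the captured minutes and seconds into one comparable number.
    code => "event.set('TimeSpentTotalSeconds', event.get('TimeSpentMinutes').to_i * 60 + event.get('TimeSpentSeconds').to_i)"
  }
}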
Thanks to Lin Don for showing an interest in my problem, and sorry for not replying sooner.
Hope the solution will help others (or even myself) if they're stuck on the same kind of problem.
Note to myself: Read the logs more carefully before grok'ing.. :)

Apply Command to String-type custom fields with YouTrack Rest API

Hello, and thanks for looking!
I have an instance of YouTrack with several custom fields, some of which are String-type. I'm implementing a module to create a new issue via the YouTrack REST API's PUT request, and then update its fields with user-submitted values by applying commands. This works great, most of the time.
I know that I can apply multiple commands to an issue at the same time by concatenating them into the query string, like so:
Type Bug Priority Critical add Fix versions 5.1 tag regression
will result in
Type: Bug
Priority: Critical
Fix versions: 5.1
in their respective fields (as well as adding the regression tag). But, if I try to do the same thing with multiple String-type custom fields, then:
Foo something Example Something else Bar P0001
results in
Foo: something Example Something else Bar P0001
Example:
Bar:
The command only applies to the first field, and the rest of the query string is treated like its String value. I can apply the command individually for each field, but is there an easier way to combine these requests?
Thanks again!
This is an expected result: everything after Foo is considered the value of that field, because spaces are also valid characters in string custom fields.
If you try to apply this command via the command window in the UI, you will actually see the same result.
Such a good question.
I encountered the same issue and have spent an unhealthy amount of time in frustration.
Using the command window from the YouTrack UI, I noticed it leaves trailing quotations, and I was unable to find anything in the documentation that discussed finalizing or identifying the end of a string value. I was also unable to find any mention of setting string field values in the command reference, grammar documentation or examples.
For my solution I am using Python with the requests and urllib modules, though I expect you could port the solution to any language.
The REST API will accept explicitly quoted strings in the POST:
import requests
import urllib
from collections import OrderedDict

URL = 'http://youtrack.your.address:8000/rest/issue/{issue}/execute?'.format(issue='TEST-1234')

# Build the parameters from a list of tuples: constructing an OrderedDict
# from a plain dict literal would not preserve insertion order on Python 2.
params = OrderedDict([
    ('State', 'New'),
    ('Priority', 'Critical'),
    # String-type fields need explicit quoting so the command parser
    # knows where each value ends.
    ('String Field', '"Message to submit"'),
    ('Other Details', '"Fold the toilet paper to a point when you are finished."'),
])

# Join each field name and value into one space-separated command string.
str_cmd = ' '.join(' '.join([k, v]) for k, v in params.items())

# URL-encode the command (Python 2's urllib.urlencode; on Python 3 it is
# urllib.parse.urlencode) and POST it to the issue's execute endpoint.
command_url = URL + urllib.urlencode({'command': str_cmd})
result = requests.post(command_url)

# The resulting request URL:
# http://youtrack.your.address:8000/rest/issue/TEST-1234/execute?command=State+New+Priority+Critical+String+Field+%22Message+to+submit%22+Other+Details+%22Fold+the+toilet+paper+to+a+point+when+you+are+finished.%22
I'm sad to see this one go unanswered for so long. - Hope this helps!
edit:
After continuing my work, I have concluded that sending all the field
updates as a single POST is marginally better for the YouTrack
server, but requires more effort than it's worth to:
1) know which fields in the issue are String-type values
2) pre-process all the string values into string literals
3) accept that if just one field in the combined request is missing, fails to set, or has an unexpected value, the entire request fails and you potentially lose all the other information (sending one command per field, as sketched below, avoids this all-or-nothing failure).
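A minimal sketch of that per-field alternative, continuing from the snippet above (same assumed endpoint, params and imports):

# Apply each command in its own POST so one bad field cannot
# invalidate the rest of the updates.
for field, value in params.items():
    cmd_url = URL + urllib.urlencode({'command': ' '.join([field, value])})
    response = requests.post(cmd_url)
    if not response.ok:
        print('Failed to set {}: {}'.format(field, response.status_code))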
I wish the YouTrack documentation had some mention or discussion of
these considerations.

Gatling: Check if an HTML result contains some string

While programming a Gatling performance test, I need to check whether the HTML returned from the server contains a predefined string. If it does, the test should break with an error.
I haven't found out how to do it. It must be something like this:
val scn = scenario("CheckAccess")
.exec(http("request_0")
.get("/")
.headers(headers_0)
.check(css("h1").contains("Access denied")).breakOnFailure()
)
I've called the desired features "contains" and "breakOnFailure". Does Gatling offer something similar?
Better solutions:
with one single CSS selector:
.check(css("h1:contains('Access denied')").notExists)
with substring:
.check(substring("Access denied").notExists)
Note: if what you're looking for only occurs in one place in your response payload, substring is certainly more efficient, as it doesn't have to parse the payload into a DOM.
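Wired into the scenario from the question (headers_0 is carried over from there), with the abort behavior the question asked about; exitHereIfFailed stops the virtual user when a previous check has failed:

val scn = scenario("CheckAccess")
  .exec(http("request_0")
    .get("/")
    .headers(headers_0)
    // Fail the request if the marker text appears anywhere in the body.
    .check(substring("Access denied").notExists))
  // Stop this virtual user here if the check above failed.
  .exitHereIfFailed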
Here is the solution:
.check(css("h1").transform((s: String) => s.indexOf("Access denied"))
.greaterThan(-1)).exitHereIfFailed
You can write it very simply, like:
.check(css("h1", "Access denied").notExists)
If you are not sure about the h1, you can use:
.check(substring("Access denied").notExists)
IMO the server should respond with a proper status, thus:
.check(status.not(403))
Enjoy and see http://gatling.io/docs/2.1.7/http/http_check.html for details
EDIT:
My usage of the CSS selector is wrong; see Stephane Landelle's solution with CSS.
I'm using substring way most of the time :)

Fiddler Autoresponder: Regex replacement not working

I have a regex rule and an action that returns a file from a local cache. The rule captures what I want it to, but the problem is $2 in the action is not handled, so Fiddler tries to return D:\path\$2 (and fails). What could be wrong?
Rule:
regex:(?insx).*(host1.com|host2.com)/folder1/folder2/(.*)\?rev=.*
Action:
D:\path\$2
Any help would be appreciated.
P.S. I'm using Fiddler v2.4.8.0
After losing a considerable amount of hair over this, I achieved it by 'naming' the replacement group, like this:
Rule:
regex:(?insx).*(host1.com|host2.com)/folder1/folder2/(?'mygroup'.*)\?rev=.*
Action:
D:\path\${mygroup}
When you're using group replacements like this, it's important to put ^ at the front of the Rule expression and $ at the end.
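For completeness, that would make the rule from the answer above (same named group, hosts unchanged):

Rule:
regex:^(?insx).*(host1.com|host2.com)/folder1/folder2/(?'mygroup'.*)\?rev=.*$
Action:
D:\path\${mygroup}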