What does ${plugin::command} mean in NSIS?

I'm trying to figure out how to modify an XML file with NSIS. So I'm trying to learn how to use the XML plugin. The examples on the forum page often use the format ${plugin::command} like:
${xml::LoadFile}
The documentation gives no indication that you need the dollar sign and curly braces. As I understand it, just plugin::command will do. So I've been trying to figure out what that syntax means.
The documentation says a $ is for variables and the {} are for code blocks, but I can't find anything about what it means when they're used together. My Internet searches have revealed that it's used for something called template literals in JavaScript. But what does it mean in NSIS?
EDIT: I should mention that the NSIS documentation does show examples of this syntax, especially in the Predefines section, but it still doesn't explain what the syntax means in general.
EDIT: Okay, now I see that the syntax is for the compiler to replace things using !define and !macro. But... what about this specific case? Is it valid to use colons in such a symbol? Why are some people writing ${xml::LoadFile} and some people just writing xml::LoadFile?

It's a !define. There is a header file for this plugin that defines it. The plugin probably needs to do more than one thing, so they wrapped a few lines together with a define that inserts a macro. Either that or it has some default parameters for the plugin call. Either way, it's trying to save you some typing with this syntax.
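For illustration, here is roughly what such a header can look like. This is only a sketch, not the real XML.nsh: the macro name, the Pop convention and the paths are made up to show the pattern.

!macro _xml_LoadFile _PATH
    xml::LoadFile "${_PATH}"   ; the bare plugin call
    Pop $0                     ; hypothetical: fetch the plugin's result code
!macroend
!define xml::LoadFile `!insertmacro _xml_LoadFile`

Section
    ${xml::LoadFile} "$INSTDIR\settings.xml"   ; expands at compile time
SectionEnd

With a define like that in place, ${xml::LoadFile} "..." expands to the whole macro body at compile time, while a bare xml::LoadFile is just the raw plugin call with nothing wrapped around it, which is why you see both spellings. Colons are valid in !define symbol names, which is why the header can name the define after the plugin command.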

Related

Shortcut for clike languages comments not working/implemented?

I'm using the Brackets code editor to code in C++ and I'm having a hard time getting the shortcuts for lineComment and blockComment to work...
The shortcuts are [Ctrl+/] and [Ctrl+Shift+/]; they work perfectly for CSS, JS, etc., but not with C++ files.
I looked into the clike.js file in the CodeMirror folder of Brackets, and blockCommentStart, blockCommentEnd and lineComment are correctly defined.
Is it a known issue? Has anyone found a workaround?
Before that, I was coding with Notepad++ and this feature was the one I used the most. It's really hard not to have it anymore.
You said you saw that blockCommentStart, blockCommentEnd and lineComment are correctly defined in clike.js. From the CodeMirror documentation:
This file defines, in the simplest case, a lexer (tokenizer) for your
language: a function that takes a character stream as input, advances
it past a token, and returns a style for that token. More advanced
modes can also handle indentation for the language.
This file is used to highlight the C++ code, but it can also drive the comment shortcut. However, that part is probably not implemented for C++. For this feature the comment addon from CodeMirror might be used (http://codemirror.net/addon/comment/comment.js), since the addon also defines a toggleComment command, which will try to uncomment the current selection and, if that fails, line-comments it.
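Outside of Brackets, the CodeMirror side of this is small. A minimal sketch, assuming CodeMirror 5 with addon/comment/comment.js loaded and a mode (such as clike) that declares lineComment/blockCommentStart/blockCommentEnd:

// Bind Ctrl-/ to the comment addon's toggleComment command.
// "text/x-c++src" is the clike MIME type for C++.
var editor = CodeMirror.fromTextArea(document.getElementById("src"), {
  mode: "text/x-c++src",
  extraKeys: {
    "Ctrl-/": function (cm) { cm.toggleComment(); } // uses the mode's comment strings
  }
});

(Brackets itself drives this through its language metadata rather than asking users to set extraKeys by hand.)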
This was a Brackets bug, but it was fixed in the Sprint 39 release.
(Fwiw though, language metadata in Brackets is defined in a file called languages.json - although Brackets extensions can add to / modify this metadata as well).

How can I use a simpler link syntax in org-mode?

I'd like to have links with the syntax [[foo bar]] go to files with the name foo bar.org. This would make using org-mode much more like using a personal local wiki.
Is this possible without breaking existing link functionality? I'd also ideally still be able to export to html, etc. with standard org-mode tools.
The best I've been able to do is something like: (setq org-link-abbrev-alist '(("o" . "file:%s.org")))
This lets me use the syntax [[o:foo bar]], but that is more verbose, and looks distractingly ugly inline. For example: The quick brown o:fox jumps over the o:lazy_dog. And [[o:foo bar][foo bar]] is even more verbose to type and edit (though it reads fine in org mode).
I don't have a ready-made solution and am not a programmer, but this part is self-documenting in org.el: you can write a dedicated link search function. I quote:
"List of functions to execute a file search triggered by a link.
Functions added to this hook must accept a single argument, the search
string that was part of the file link, the part after the double
colon. The function must first check if it would like to handle this
search, for example by checking the `major-mode' or the file
extension. If it decides not to handle this search, it should just
return nil to give other functions a chance. If it does handle the
search, it must return a non-nil value to keep other functions from
trying.
Each function can access the current prefix argument through the
variable `current-prefix-arg'. Note that a single prefix is used to
force opening a link in Emacs, so it may be good to only use a numeric
or double prefix to guide the search function.
In case this is needed, a function in this hook can also restore the
window configuration before `org-open-at-point' was called using:
(set-window-configuration org-window-config-before-follow-link)")
See also Hyperlinks :: Custom Searches at gnu.org.
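For what it's worth, a handler matching that contract could look roughly like the sketch below. The hook name is taken to be org-execute-file-search-functions (where this docstring appears to live in org.el), and the function name is made up, so treat it as a starting point rather than a tested solution.

;; If SEARCH names an existing .org file, visit it and report the
;; search as handled by returning non-nil.
(defun my/org-search-wiki-page (search)
  (let ((file (concat search ".org")))
    (when (file-exists-p file)
      (find-file file)
      t)))   ; non-nil keeps other search functions from running

(add-hook 'org-execute-file-search-functions #'my/org-search-wiki-page)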

How to find literals in source code of Smartforms and in SAPScripts (or reports, if the others can't be done)

I'd like to check hardcoded values in (a lot of) Smartforms and SAPScript forms.
I have found a way to read the source code of both of these, but it seems that I will have to go through a lot of parsing before I get anything reliable.
I've come across function module GET_LITERAL, but that doesn't seem to help me much since I have to specify the offset of the value, if I understood correctly what the function does in the first place.
I also found RS_LITERAL_LIST, but that also doesn't do what I expect.
I also tried searching for reports and methods, but haven't found anything that seemed to help.
A backup plan would be to get some good parsing tool, so do you know of anything like that?
Anyway, any hints would be helpful and appreciated.
[EDIT]
Forgot to mention, the version of my system is 4.6C
If you have a fairly recent version of ABAP, you can use a regex.
Follow the pattern of this example, but use your source as the text and create your own regex. Have it look for words wrapped in single quotes and separated by spaces, or any integers with spaces on either side. That's just a start; you might need to work on a better pattern.
String functions count, find, and match
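As a rough illustration of that idea (the table names are made up, and FIND ... REGEX needs a release with regex support, so it will not work on 4.6C):

* Sketch only: scan a source table for quoted '...' literals.
* lt_source / lt_matches are made-up names; extend the regex for
* numeric literals as suggested above.
DATA: lt_source  TYPE TABLE OF string,
      lt_matches TYPE match_result_tab.

FIND ALL OCCURRENCES OF REGEX '''[^'']*'''
     IN TABLE lt_source
     RESULTS lt_matches.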

Comments for Function in Emacs

I'm looking for a way to generate and insert header comment blocks above my functions in Emacs (in any mode), with the default contents of the comment automatically based on the function's signature (i.e. the correct number of @param place-holders).
Doxymacs is a nice candidate, but I would prefer another approach that works without the extra libraries. Can anyone recommend some other ways to add smart comments for functions in Emacs? Thanks.
Edit:
Now I found this: http://nschum.de/src/emacs/doc-mode/, but it seems that it does not work well after I require it in my .emacs and add a hook for js-mode. Doesn't it support js functions?
I don't know of any general-purpose approach.
Csharp-mode has a defun bound to /, which tries to generate comments appropriate for C#. The way it works: every time you type a slash, it looks to see if it is the third slash in a row. (In C#, three slashes are used to denote comments that produce documentation.) If it is the third slash, it looks at the surrounding text and inserts a comment skeleton or fragment that is appropriate.
It is not generalized in any way to support javascript or other language syntaxes. But you might be able to build what you want, if you start with that.
Here's the excerpt:
http://pastebin.com/ATCustgi
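If you want to experiment with that approach for JavaScript, a very rough starting point could be something like the sketch below. The function name, regexp and comment style are all assumptions for illustration, not something csharp-mode or doc-mode provides.

;; Insert a JSDoc-style skeleton above the next "function name(args)",
;; with one @param line per argument.
(defun my/insert-js-doc-comment ()
  (interactive)
  (save-excursion
    (when (re-search-forward
           "function\\s-+\\([A-Za-z0-9_$]+\\)\\s-*(\\([^)]*\\))" nil t)
      (let ((params (split-string (match-string 2) "," t "[ \t]+")))
        (goto-char (match-beginning 0))
        (beginning-of-line)
        (insert "/**\n * \n")
        (dolist (p params)
          (insert (format " * @param %s \n" p)))
        (insert " * @return \n */\n")))))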
I've used doxymacs in the past and I've found it useful
http://doxymacs.sourceforge.net/

Why are Perl source filters bad and when is it OK to use them?

It is "common knowledge" that source filters are bad and should not be used in production code.
When answering a similar, but more specific, question I couldn't find any good references that explain clearly why filters are bad and when they can be safely used. I think now is the time to create one.
Why are source filters bad?
When is it OK to use a source filter?
Why source filters are bad:
Nothing but perl can parse Perl. (Source filters are fragile.)
When a source filter breaks pretty much anything can happen. (They can introduce subtle and very hard to find bugs.)
Source filters can break tools that work with source code. (PPI, refactoring, static analysis, etc.)
Source filters are mutually exclusive. (You can't use more than one at a time -- unless you're psychotic).
When they're okay:
You're experimenting.
You're writing throw-away code.
Your name is Damian and you must be allowed to program in Latin.
You're programming in Perl 6.
Only perl can parse Perl (see this example):
@result = (dothis $foo, $bar);
# Which of the following is it equivalent to?
@result = (dothis($foo), $bar);
@result = dothis($foo, $bar);
This kind of ambiguity makes it very hard to write source filters that always succeed and do the right thing. When things go wrong, debugging is awkward.
After crashing and burning a few times, I have developed the superstitious approach of never trying to write another source filter.
I do occasionally use Smart::Comments for debugging, though. When I do, I load the module on the command line:
$ perl -MSmart::Comments test.pl
so as to avoid any chance that it might remain enabled in production code.
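For context, here is a minimal sketch of the kind of comments Smart::Comments picks up. Run normally, these lines are plain comments; run with perl -MSmart::Comments, they turn into diagnostics. The script itself is made up.

use strict;
use warnings;

my @queue = (1 .. 3);

# With -MSmart::Comments, the next line dumps @queue at run time:
### @queue

# ...and this loop gets a progress message:
for my $item (@queue) {    ### Working...   done
    print $item * 2, "\n";
}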
See also: Perl Cannot Be Parsed: A Formal Proof
I don't like source filters because you can't tell what code is going to do just by reading it. Additionally, things that look like they aren't executable, such as comments, might magically be executable with the filter. You (or more likely your coworkers) could delete what you think isn't important and break things.
Having said that, if you are implementing your own little language that you want to turn into Perl, source filters might be the right tool. However, just don't call it Perl. :)
It's worth mentioning that Devel::Declare keywords (and starting with Perl 5.11.2, pluggable keywords) aren't source filters, and don't run afoul of the "only perl can parse Perl" problem. This is because they're run by the perl parser itself, they take what they need from the input, and then they return control to the very same parser.
For example, when you declare a method in MooseX::Declare like this:
method frob ($bubble, $bobble does coerce) {
    ... # complicated code
}
The word "method" invokes the method keyword parser, which uses its own grammar to get the method name and parse the method signature (which isn't Perl, but it doesn't need to be -- it just needs to be well-defined). Then it leaves perl to parse the method body as the body of a sub. Anything anywhere in your code that isn't between the word "method" and the end of a method signature doesn't get seen by the method parser at all, so it can't break your code, no matter how tricky you get.
The problem I see is the same problem you encounter with any C/C++ macro more complex than defining a constant: It degrades your ability to understand what the code is doing by looking at it, because you're not looking at the code that actually executes.
In theory, a source filter is no more dangerous than any other module, since you could easily write a module that redefines builtins or other constructs in "unexpected" ways. In practice, however, it is quite hard to write a source filter in a way where you can prove that it's not going to make a mistake. I tried my hand at writing a source filter that implements the perl6 feed operators in perl5 (Perl6::Feeds on cpan). You can take a look at the regular expressions to see the acrobatics required simply to figure out the boundaries of expression scope. While the filter works and provides a test bed to experiment with feeds, I wouldn't consider using it in a production environment without many, many more hours of testing.
Filter::Simple certainly comes in handy by dealing with 'the gory details of parsing quoted constructs', so I would be wary of any source filter that doesn't start there.
In all, it really depends on the filter you are using and how broad a scope it tries to match against. If it is something simple like a C macro, then it's "probably" ok, but if it's something complicated then it's a judgement call. I personally can't wait to play around with perl6's macro system. Finally lisp won't have anything on perl :-)
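To make that concrete, the smallest useful Filter::Simple filter looks something like the sketch below. The swap(...) keyword is invented for this example; FILTER_ONLY 'code' asks Filter::Simple to apply the substitution outside quoted constructs, which is exactly the gory detail it handles for you.

package MyApp::SwapFilter;
use strict;
use warnings;
use Filter::Simple;

# Rewrite swap($x, $y) into a plain list assignment before perl compiles it.
FILTER_ONLY
    code => sub {
        s/\bswap\s*\(\s*(\$\w+)\s*,\s*(\$\w+)\s*\)/($1, $2) = ($2, $1)/g;
    };

1;

A script would then say use MyApp::SwapFilter; and write swap($x, $y);, and the filter rewrites that line before perl ever sees it.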
There is a nice example here that shows what trouble you can get into with source filters.
http://shadow.cat/blog/matt-s-trout/show-us-the-whole-code/
They used a module called Switch, which is based on source filters. And because of that, they were unable to find the source of an error message for days.