Perl Term::Screen: enable insert character / delete character

The Perl module Term::Screen has methods for inserting a character and deleting a character. Both have an accompanying method to check whether the terminal supports the action [ic_exists() and dc_exists()]. I'm running this script through an ssh session, and ic_exists() and dc_exists() are returning 0, i.e. not available. What do I need to do to enable insert character and delete character for this module?

Make sure that the TERM in your ssh environment is set to something that supports those functions (and is compatible with your terminal).
Try adding
print "$ENV{TERM}\n";
to your script to see what terminal it thinks it's talking to.
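For example, a small check script along these lines (just a sketch, using the ic_exists()/dc_exists() methods mentioned in the question) will report both the terminal type the session advertises and whether Term::Screen finds the insert/delete-character capabilities for it:
#!/usr/bin/perl
use strict;
use warnings;
use Term::Screen;

# Report the advertised terminal type and the capabilities Term::Screen finds for it.
print "TERM is: ", ($ENV{TERM} || '(unset)'), "\n";

my $scr = Term::Screen->new() or die "Could not initialise Term::Screen\n";
print "insert character available: ", ($scr->ic_exists ? "yes" : "no"), "\n";
print "delete character available: ", ($scr->dc_exists ? "yes" : "no"), "\n";
If TERM turns out to be something minimal such as vt100 or dumb, try exporting a more capable value (one that actually matches your terminal emulator) on the remote side before running the script.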

Related

Is there a good place in .vim folder to store text filters?

I would like to create a filter folder, ideally inside .vim, and be able to run a text filter with just the file name: :! filter.pl
I wrote a Perl text filter to escape all the special characters in a LaTeX math formula, and it is running fine so far. The only problem is that it runs on the whole line rather than just the selected formula, but I can live with that ...
#!/usr/bin/perl -np
use strict;
use warnings;
# this filter escapes all special characters in a math formula for LaTeX
s/\\/\\backslash /g;
s/([\$\#&%_{}])/\\$1/g;
But calling this filter is cumbersome:
: '<,'>!"/Users/username/Library/Mobile Documents/com~apple~CloudDocs/my_vim_cheat_sheet/perl_filter.pl"
Apple put a white space in the iCloud path, so I have to wrap it in quotes! Where should I put a collection of text filters?
Thank you for your answers
marek
You can safely create a subfolder with any name different from the ones Vim uses itself (see :h 'rtp'). So this is OK:
:*!$HOME/.vim/filters/perl_filter.pl
Also, Vim has a predefined interface for a general-purpose filter called 'equalprg'. To make use of it, simply set the global-local option equalprg (i.e. both :set and :setlocal are meaningful) to the fully qualified name of your script. Then hit = in visual mode to apply the filter (or ={motion} in normal mode). (Read :h 'equalprg' and :h =.)
If you need several filters at once, and switching equalprg is not convenient, you can still try different options to reduce typing.
For example, mappings such as
vnoremap <Leader>f :!/path/to/my/filter<CR>
Then hitting \f (or whatever your "leader" key is set to) in visual mode will execute :'<,'>!/path/to/my/filter (note that the visual selection is applied automatically).
Another option is to set a dedicated environment variable (which will be inherited by all child processes, including shells). For example,
:let $filters = '~/.vim/filters'
:*!$filters/myfilter.pl
Of course, you can put those set equalprg=..., vnoremap ..., let $filters=..., etc. commands in your vimrc.
I would like to create a filter folder, ideally inside .vim, and be able to run a text filter with just the file name: :! filter.pl
Simply add the script to somewhere within your $PATH. Or, if you really only intend to use that from within Vim, then add that directory to your $PATH in your .vimrc, so you have it available there.
For example, if you'd like to use ~/.vim/scripts for your external Perl or shell scripts, you can use this in your ~/.vimrc:
call setenv('PATH', expand('~/.vim/scripts').':'.$PATH)
After that, you can simply use :'<,'> !filter.pl to run it. And Tab completion will work with the name of the script: type :!fil<Tab> and Vim will complete it to filter.pl, assuming it's a unique prefix.
The snippet above for your .vimrc has one minor issue: if you :source your .vimrc during a Vim session, it will keep prepending the same entry to $PATH. That doesn't typically break anything; the variable just keeps getting longer, though eventually you might run into length limits.
You can fix it by checking whether that's present in path or not before updating it, perhaps with something like:
let scripts_dir = expand('~/.vim/scripts')
if index(split($PATH, ':'), scripts_dir) < 0
  call setenv('PATH', scripts_dir.':'.$PATH)
endif
But also, about this:
I put up a Perl text filter to change all special Characters in a LaTeX Math Formula
s/\\/\\backslash /g;
s/([\$\#&%_{}])/\\$1/g;
Consider writing that in Vim instead.
In fact, almost the same syntax will work as a Vim function:
function! EscapeLatexMathFormula()
  s/\\/\\backslash /eg
  s/\([$#&%_{}]\)/\\\1/eg
endfunction
You can call it on a range, with:
:'<,'>call EscapeLatexMathFormula()
Calling it without a range will affect the current line only.
You can also make it into a command, with:
command! -range EscapeLatexMathFormula <line1>,<line2> call EscapeLatexMathFormula()
In which case you can simply use:
:'<,'>EscapeLatexMathFormula
You can use tab-completion for the function and command names (though, of course, you can pick shorter names if you'd like, as well.)
Note that user-defined command names need to start with an uppercase letter. Function names can start with an uppercase letter too (there are more options for function names, but making this global with an uppercase is probably the easiest here.)

Translate PL/SQL define to TSQL

I am given a task to transition PL/SQL code to T-SQL.
Can anybody explain what SET DEFINE ON does in SQL*Plus and, most importantly, how to translate it to T-SQL (I suppose using sqlcmd as a launcher)?
SET DEFINE is an SQL*Plus command that sets the use and prefix of substitution variables. & is the default prefix, and SET DEFINE ON resets it to this default and turns on the use of substitution variables.
So this is not a PL/SQL thing but an SQL*Plus thing.
As far as I know there's no such thing as substitution variables for sqlcmd, i.e. there's no equivalent for sqlcmd let alone T-SQL. But I might be wrong there.
'&' appears to be a token of the metalanguage used by SQL*Plus (the client, that is) to activate its "variable substitution" feature, so string literals containing '&' ("Marks & Spencer") may not behave entirely as expected. SET DEFINE apparently serves to control that activation.

avoiding exploit in perl variable extrapolation from file

I am optimizing a very time- and memory-consuming program by running it over a dataset under multiple sets of parameters. For each "run" I have a CSV file, "setup.csv", set up with "runNumber","Command" for each run. I then read this in a Perl script, look up the command for the run number I want, interpolate the variables, and execute it via the system command. Should I be worried about the potential for this to be exploited (I am worried right now)? If so, what can I do to protect our server? My plan for now is to change the file permissions of "setup.csv" to read-only with ownership by root, and then go in as root whenever I need to append another run to the list.
Thank you very much for your time.
Run your code in taint mode with -T. That will force you to carefully launder your data. Only pass through strings that you are expecting. Do not launder with .*; instead, check against a list of good strings.
Ideally, there is a list of known acceptable values, and you validate against that.
Either way, you want to avoid the shell by using the multi-argument form of system or by using IPC::System::Simple's systemx.
If you can't avoid the shell, you must properly convert the text being passed to the command into shell literals.
Even then, you have to be careful of values that start with -. Many tools accept -- to denote the end of options, allowing the remaining values to be passed safely.
Finally, you might want to make sure the args don't contain the NUL character (\0).
systemx('tool', '--', @args);
Note: Passing arbitrary strings is not possible in Windows. Extra validation is required.
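Putting the pieces together, here is a minimal sketch of that approach. The whitelist, the command name and the arguments below are hypothetical placeholders, not taken from the question's setup.csv:
#!/usr/bin/perl -T
use strict;
use warnings;

# Taint mode refuses to run external commands with a tainted PATH,
# so set a known-good one and drop other dangerous variables.
$ENV{PATH} = '/usr/bin:/bin';
delete @ENV{qw(IFS CDPATH ENV BASH_ENV)};

# Hypothetical whitelist of commands allowed to appear in setup.csv.
my %allowed = map { $_ => 1 } qw(run_model summarise);

# Pretend these were read from the CSV for the requested run number.
my ($cmd, @args) = ('run_model', '--input', 'data01.csv');

# Launder by checking against the whitelist, not with a catch-all regex.
die "refusing to run unexpected command '$cmd'\n" unless $allowed{$cmd};
die "NUL byte in argument\n" if grep { /\0/ } @args;

# The multi-argument form of system() bypasses the shell, and '--' stops
# the tool from treating argument values that start with '-' as options.
system($cmd, '--', @args) == 0
    or die "command failed: $?\n";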

Hitting ORA-01461 when inserting multibyte characters from perl into oracle

I have a Perl script that inserts records from a text file into our database. Whenever a record has a multibyte character, like the "Í" in "RODRÍGUEZ", I receive the error ORA-01461, even though I'm nowhere near the 4000 characters that would require switching from VARCHAR2 to LONG.
setting:
$ENV{NLS_CHARACTERSET} = 'AL32UTF8';
before connecting doesn't seem to help.
Using a Java client (SQuirreL SQL) and manually writing the INSERT INTO statement inserts the record just fine, so I'm sure it's not how the database is configured.
Any thoughts?
You probably want to set the NLS_LANG environment variable. For Unix-ish systems, there is a script supplied in $ORACLE_HOME/server/bin called nls_lang.sh that outputs a reasonable value for your system, based on the LANG environment variable.
e.g. for my system (LANG=en_GB.UTF-8) the equivalent Oracle setting is
NLS_LANG=ENGLISH_UNITED KINGDOM.AL32UTF8
More info: http://forums.oracle.com/forums/thread.jspa?threadID=381531
Sergiusz's post there says practically all you need to know: I'll just add that the Perl DBD::Oracle driver is OCI-based, and the pure-Java JDBC driver isn't, hence they work differently in the same environment.
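As a rough sketch (the connection details, table and column names below are placeholders, not from the question), the variable just needs to be in the environment before the connect call, since DBD::Oracle picks it up through OCI:
#!/usr/bin/perl
use strict;
use warnings;
use utf8;
use DBI;

# NLS_LANG must be set before DBD::Oracle (OCI) initialises, i.e. before connect.
$ENV{NLS_LANG} = 'ENGLISH_UNITED KINGDOM.AL32UTF8';

# Placeholder connection details and table/column names.
my $dbh = DBI->connect('dbi:Oracle:MYDB', 'username', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do('INSERT INTO people (surname) VALUES (?)', undef, 'RODRÍGUEZ');
$dbh->disconnect;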

How to discover command line options (if any) for an undocumented executable of unknown origin?

Take an undocumented executable of unknown origin. Trying /?, -h, --help from the command line yields nothing. Is it possible to discover if the executable supports any command line options by looking inside the executable? Possibly reverse engineering? What would be the best way of doing this?
I'm talking about a Windows executable, but would be interested to hear what different approaches would be needed with another OS.
On Linux, step one would be to run strings your_file, which dumps all the strings of printable characters in the file. Any constant strings will thus be shown, including any "usage" instructions.
The next step could be to run ltrace on the file. This shows all the library calls the program makes. If they include getopt (or similar), then it is a sure sign that the program is processing input parameters. In fact, you should be able to see exactly which options the program expects, since the option string is the third parameter to the getopt function.
For Windows, you can see this question about decompiling Windows executables. It should be relatively easy to at least discover the options (what they actually do is a different story).
If it's a .NET executable, try using Reflector. This will convert the MSIL code into the equivalent C# code, which may make it easier to understand. Unfortunately, private and local variable names will be lost, as these are not stored in the MSIL, but it should still be possible to follow what's going on.