Using distutils to compile plain C code with unmangled output name

I have a little benchmark suite, which compares different approaches to writing and executing some simple toy Python code, mainly for the purpose of illustrating Cython.
One of the approaches I used to have was to write the test function in pure C, compile it with a C compiler (getting distutils to find and invoke the compiler for me), then load the library into Python with ctypes and execute the function.
This worked swimmingly until distutils started embedding the python version and the platform in the name of the file it generates.
In the past I could persuade distutils to compile a C source file called C.c into a library called C.so in the working directory; recent versions of distutils insist on mangling the output name into something of the form build/temp.linux-x86_64-3.6 or (depending on exactly which distutils features I use) some other variation on that theme.
Is there any way to persuade current versions of distutils to simply call the output C.so in the current working directory? (Alternatively, is there some way to reliably and easily tell ctypes where the library produced by distutils resides?)
[Bonus: How can this idea be expressed portably across Linux, OS X and Windows?]
=================================================================
Edit: for completeness, here is how I used to do it successfully in the past:
pure_C_setup.py:
from distutils.core import setup, Extension
setup(name="ctypes-test",
ext_modules = [Extension("C", ["C.c"])])
With the above setup file, the command python pure_C_setup.py install --install-lib=. used to produce C.so in the working directory; today it produces build/lib.linux-x86_64-3.6/C.cpython-36m-x86_64-linux-gnu.so
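The loading side was plain ctypes, roughly like the sketch below (the square function and its signature are only stand-ins for illustration, not the real benchmark code):
import ctypes

lib = ctypes.CDLL("./C.so")            # the library distutils used to drop in the working directory
lib.square.argtypes = [ctypes.c_int]   # 'square' stands in for whatever function C.c actually exports
lib.square.restype = ctypes.c_int
print(lib.square(7))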

This is neither platform- nor compiler-independent, but you might be able to use Extension's extra_link_args to achieve what you want.
For Visual C++ 2015:
setup(name="ctypes-test",
ext_modules = [Extension("C", ["C.c"], extra_link_args="/OUT:C.pyd")])

Related

Missing Python file for models using pyCommunicator

My experience with Python is limited, but I just started looking at the new Python models included in AnyLogic's examples. I am looking at the first one, Passing Data Types. The model runs correctly, with the set, modify and get functions working as expected. My question is: is there a Python file somewhere that the communicator is working with? I only see the .alp file in the folder.
Thanks
I'm also beginning to use this, but as I understand it, the idea of the Python helper here (among other things) is that you can run Python commands from AnyLogic, so you actually don't need a Python file. Nevertheless, it uses the Python installed on your computer to run the scripts; if you don't have Python installed, your model won't work.

Looking for a lua obfuscator to protect code

I have written a plugin for vanilla Lua. I wish to protect this plugin, and I have heard of obfuscation. I tried XFuscator, but even after fixing line 5's logic, it doesn't work. Are there any newer, better ones floating out there?
Thanks!
If you are going to run your Lua script on the same machine where you build it (I mean, same Lua version, same machine architecture), you could just compile it to bytecode using luac like this:
luac -s -o example.out example.lua
Then distribute the .out file, which doesn't contain the Lua source code.
Note that Lua bytecode is platform specific (endianness, word size), and it could change in future Lua versions (in fact it already has in the past). For that reason, if you compile it, let's say, on an Intel x86-64 machine with Lua 5.3, you should run the generated .out only on that kind of machine or compatible ones.

directory structure of cpan module

I am working on a Perl module that I would like to submit to CPAN.
But I have a small query regarding the directory structure of the module.
As per the PerlMonks article, the module code directory structure should be as below:
Foo-Bar-0.01/Bar.pm
Foo-Bar-0.01/Makefile.PL
Foo-Bar-0.01/MANIFEST
Foo-Bar-0.01/Changes
Foo-Bar-0.01/test.pl
Foo-Bar-0.01/README
But when I use the command, the structure is generated as below:
h2xs -AX Foo::Bar
Writing Foo-Bar/lib/Foo/Bar.pm
Writing Foo-Bar/Makefile.PL
Writing Foo-Bar/README
Writing Foo-Bar/t/Foo-Bar.t
Writing Foo-Bar/Changes
Writing Foo-Bar/MANIFEST
The article in question is advocating a considerably older module structure. It certainly could be used, but it loses a lot of the advancements that have been put into place in terms of good testing, building, and distribution practices.
To break down the differences:
Modules have moved from the top level to the lib/ directory. This unifies the location where your module "lives" (i.e., the place where you work on the code and create the baseline modules to be tested and eventually distributed). It also makes it easier to set up any hierarchy that you need (e.g. subclasses, or helper modules); the newer setup will just pick these up. The older one may, but I'm not familiar enough with it to say yes or no.
Makefile.PL in the newer setup will, when "make" is run, create a directory called "blib", the build library; this is where the code is built for actual testing. It will pretty much be a copy of lib/ unless you have XS code, in which case this is where the compiled XS code ends up. This makes the process of building and testing the code simpler: if you update a file in lib/, the Makefile will rebuild the file into blib before trying to test it.
The t/ directory replaces test.pl; "make test" will execute all the *.t files in t/, as opposed to you having to put all your tests in test.pl. This makes it far easier to write tests, as you can be sure you have a consistent state at the beginning of each test.
MANIFEST and Changes are the same in both: MANIFEST (built by "make manifest") is used to determine which files in the build library should be redistributed when the module is packaged for upload, and used to verify that a package is complete when it's downloaded and unpacked for building. Changes is simply a changelog, which you edit by hand to record the changes made in each distributed version.
As recommended in the comments on your question, using Module::Starter or Dist::Zilla (be warned that Dist::Zilla is Moose-based and will install a lot of prereqs) is a better approach to building modules in a more modern way. Of the two layouts you showed, the h2xs version is closer to modern packaging standards, but you're really better off using one of the recommended package starter options (and probably Module::Build, which uses a Build Perl script instead of a Makefile to build the code).

How does PAR::Packer work?

I was using PAR::Packer and this question popped up in my mind. How does PAR::Packer work in Perl? Does it actually compile the Perl script to .exe like g++ compiles C++ sources to .exe, or does it work like py2exe in Python, which packs the interpreter and the script into an .exe?
To make this absolutely clear:
Tools like PAR::Packer do not “compile” your Perl script. They bundle the perl interpreter together with your source files and any required modules into a big fat executable file. When it is run, the original sources are extracted and fed to the enclosed perl.
This works reasonably well, but does not yield a speed improvement (on the contrary…). The only advantage is that you can distribute your programs as a single (albeit quite large) file, without dependencies.
There is a very experimental tool called perlcc that is able to translate some Perl programs to C or a Perl bytecode serialization. As the docs put it:
The code generated in this way is not guaranteed to work. The whole codegen suite (perlcc included) should be considered very experimental. Use for production purposes is strongly discouraged.
This is because the Perl language does not support static compilation: some of its dynamic features require executing code during parsing, in the same session where the main execution phase takes place.
There are other, commercial tools, that usually fall in the same category as PAR::Packer (creating fat executables).
Summary: If you want a single executable, use PAR::Packer. If you want speed, inline some C (or use XS). There is no tool that can compile all Perl scripts to machine code.
I was using PAR::Packer and this question popped up in my mind. How does PAR::Packer work in Perl?
Does it actually compile the Perl script to .exe like g++ compiles C++ sources to .exe?
No, pp and perl2exe don't (though pp is free), but it looks like perlcc does.
Or does it work like py2exe in Python that packs the interpreter and the script into an .exe?
pp and perl2exe: yes.
As an example, sendemail.exe looks like something done with PAR::Packer or Perl2Exe.
It packs the interpreter into the exe.
You can open sendemail.exe in 7-Zip! There are some folders there, but one can't really see its actual files.
I suppose it's some form of self-extracting executable, but one that executes code.
You can monitor it with Process Monitor, and you can see what it does.
Or with Process Explorer, and see if it uses any DLLs from the temp directory that it creates.
It creates a temp directory e.g.
C:\Users\user\AppData\Local\Temp\pdk-user <--- Win7
or
C:\Documents and Settings\user\Local Settings\Temp\pdk-user <-- WinXP
The temporary directory it creates contains a bunch of strangely named DLLs and a DLL called perl58.dll, which is no doubt the Perl interpreter.
I'm sure at one point I saw a DLL there with a normal name, SSLEAY32.DLL, and interestingly, when I ran the .pl file it used a DLL with a similar name from my c:\perl64 directory. So the EXE looks like a bit of a hack really; it's more reliable to run the .pl file directly.
DIR /s/b within that directory shows:
178c2b604baa8a7f1ebc80539f378bff.dll
1823e8f62785746fd29cf0b06c636600.dll
465d2954d90fe6225ea61b3907c91da8.dll
6145f78a34d5ced8200800f1455d578a <-- the directory with perl58.dll
9c50b5816b0e35f047e41f5899721d46.dll
f4e2e0db77ed1e6572c2f490479cd815.dll
f72f556d99dfb6b0c3bb37f123e2ee96.dll
6145f78a34d5ced8200800f1455d578a\perl58.dll
No normally named DLLs show up other than perl58.dll (though I have seen a normal one there in the past).
If you look in Process Explorer, you see it using perl58.dll and other normally named DLLs.
If you look at the PAR website, there is a page that describes Perl2Exe:
Perl2Exe is a commercial, command-line application that can build standalone executables from Perl sources. It works by creating an executable that contains:
A standalone Perl interpreter (that is capable of grokking Perl 5.8.x),
Your Perl script, and
All Perl modules that are referenced by your Perl script.
pp (PAR Packager) provides that same functionality, but is free.
And if you look here
http://www.perlmonks.org/?node_id=237943
...the perl2exe tool is not a way to hide your source code.
Now it's even come to the attention of the security community (reported in bugtraq, for example).
For details, see the report from net-security's page.
Please stop supporting perl2exe. Please use PAR for a complete installation package, or perlcc to simply compile the top-level program.
-- Randal L. Schwartz, Perl hacker
So, pp and Perl2Exe bundle the interpreter into the EXE, and you can get an idea that there's something atypical about the EXE when it opens in 7-Zip!
perlcc compiles properly.
It appears to have been unmaintained for a while (http://www.perlmonks.org/?node_id=654568); people said it was buggy, but some work was done on it as recently as June 2014 (http://www.yapcna.org/yn2014/talk/5603). It gets better performance than interpreted Perl (unlike the EXEs that pack an interpreter, which are slower than running a Perl script normally).

How can I import a .PYD module in IronPython?

I'm trying to use a python package from IronPython.
Everything works fine if I import regular python modules.
But when I try to do the following:
import win32ui
I get:
No module named win32ui
I've hunted through the code in IronPython.Runtime.Importer and there's no mention of .pyd
Anyone know a way around this?
You can check out IronClad, which is working to provide this support. It may or may not work with your PYD of choice.
A .pyd file is a DLL. So unless IronPython (which is written in .NET) can correctly load C DLLs written for CPython, you might be out of luck.
Update
In fact, according to the IronPython FAQ, you are unfortunately unable to import .pyd files:
Q: How do I build and call into PYD libraries?
A: IronPython does not support using PYDs built for CPython since they
leverage implementation details of CPython. You can get a similar
effect for new "PYD"s you would like to implement by writing them in C#
or VB and building a DLL for .NET.
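For what it's worth, consuming such a .NET DLL from IronPython is just the clr module; a minimal sketch using the stock System.Windows.Forms assembly (not a win32ui replacement, just the pattern the FAQ is pointing at):
import clr
clr.AddReference("System.Windows.Forms")     # load a .NET assembly instead of a CPython .pyd
from System.Windows.Forms import MessageBox

MessageBox.Show("Hello from IronPython")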