I am very new to porting.
I was trying to port Perl to a NetBSD system. Since it is a custom-made build, we won't be able to run Configure or make on the target NetBSD system. So we are trying to cross-compile it on a host PC and copy the binary over to the target machine. In order to do so, we have to write a makefile from scratch, since the makefile format in our build is different.
I have some basic questions regarding this:
Firstly, in order to create a Perl makefile for my custom build, what are the basic things it needs to contain, such as ccflags, library paths, etc.?
There are some files, like DynaLoader, uudmap.h, myConfig, and Config.pm, which get generated during "make". How can I generate them using a custom makefile?
How do I set the various library paths, and what are they?
@INC shows the Perl search paths; how can I create it?
Where exactly do Perl modules get installed, and when does that happen?
A perl build normally involves building a stripped down version of perl named miniperl, which is then used extensively in the remainder of the process of building perl and the bundled modules.
There are two basic approaches to cross-compiling: to build miniperl for the target machine and build the modules, etc., there, or to build miniperl for the host and use it to build perl and modules for the target.
The WinCE port uses the latter approach; the rudimentary (last I knew, anyway) support for a -Dusecrosscompile switch to Configure uses the former.
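For reference, a Configure invocation for the -Dusecrosscompile route might look roughly like the sketch below. -Dusecrosscompile is the switch mentioned above, but the other flags and the cross-compiler name are placeholders from memory; verify them against the INSTALL and Cross/README documents in the perl source tree.
sh ./Configure -des -Dusecrosscompile \
    -Dtargetarch=x86_64-netbsd \
    -Dcc=x86_64-netbsd-gcc \
    -Dtargethost=target.example.com
make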
I recommend you ask for advice and help on the perl5-porters mailing list: http://lists.perl.org/list/perl5-porters.html
And be prepared for hard work.
NetBSD's pkgsrc system has perl in it already and has the ability to generate binary packages that you can then install on a target machine.
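For example, building a binary perl package with pkgsrc and installing it on the target looks roughly like this (default pkgsrc paths assumed; adjust to your setup):
# on the build machine, inside a pkgsrc checkout
cd /usr/pkgsrc/lang/perl5
make package                  # builds perl and creates a binary package
# on the target machine
pkg_add /path/to/packages/All/perl-5.*.tgz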
I'm writing a CI script to build a Perl library located at GitHub.
Now I do it this way:
cpanp i Module::Install # and other configure dependencies
perl Makefile.PL
make dist
cpanp i Foo-Bar-6.66.tar.gz
I want to create PAR packages for all the dependencies I build. That's why I use cpanp.
Unlike cpanm ., cpanp i . is buggy and uses the name of the directory to name the distribution, which is incorrect. That's why I use this make dist step.
I can use something else as long as my PARs are built, of course. There is the https://metacpan.org/pod/PAR::Dist#blib_to_par function, which can in principle be called during any other build process.
The problem with my current approach is that:
the list of configure dependencies is maintained manually. I'd like to use some tool which does it automatically
on Windows make may be called gmake or dmake depending on perl version. I want to use an existing autodetection logic instead of rolling my own
Finally, I'd like autodetection of Makefile.PL vs Build.PL so I can copy-paste the code for different distributions
My question is: Is there any way to build local sources of a distribution either by using cpanp (this way would solve my problem) or in any other way which works for different versions of Windows Perl and builds PAR distributions of dependencies?
I managed to find a solution myself.
CPANPLUS comes with another tool besides cpanp named cpan2dist. cpan2dist --archive Foo-Bar-6.66.tar.gz works as intended - it builds distributions of dependencies and the current tarball, and autodetects the necessary build tool the way cpanp does.
To determine the real make you can use CPANPLUS::Config->new->get_prog('make'). It again uses the same autodetection logic as CPANPLUS so it's pretty robust.
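For example, a minimal sketch of wiring that into the CI script (the method call is the one given in this answer; double-check it against your installed CPANPLUS documentation):
use strict;
use warnings;
use CPANPLUS::Config;   # call taken from the answer above
# ask CPANPLUS which make it would use (gmake/dmake/nmake/make),
# then run the usual Makefile.PL and make dist steps with it
my $make = CPANPLUS::Config->new->get_prog('make');
system($^X, 'Makefile.PL') == 0 or die "Makefile.PL failed: $?";
system($make, 'dist')      == 0 or die "'$make dist' failed: $?";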
Because I have a long series of comments with @ikegami, I am cleaning up the question in the hope that it will be more understandable. Unfortunately, English isn't my first language. :(
Let's say we have an environment where:
no development tools are installed (no make, gcc, or the like)
Perl is installed with its core packages, nothing more
no outgoing network access is allowed, e.g. the user can't use curl or cpan to download/install Perl dependencies
the user doesn't even have admin (root) rights
but wants to install and evaluate some Perl-based web app; let's call it MyApp.
MyApp:
doesn't use any XS-based modules (at least, I hope; during development I'm using plenv and cpanm, so I never checked the installed dependencies in depth)
is a pure PSGI app; a simple plackup app.psgi works OK
uses some data files which should be included in the "deployment".
The main question is: how do I prepare MyApp, and all the CPAN modules it uses, so that they can be easily installed in such a restricted environment?
The goal is:
I don't need to save my own effort and time,
but I want to save the user's time and minimize the actions needed on his side, so the installation (deployment) should be as simple as possible.
E.g. how to get a running web app onto the user's machine with the minimum possible steps on his part.
- the simplest thing could be something like:
- copy one file (zip or tarball)
- unpack it
- from the terminal, execute some run.pl in the unpacked directory.
To get the above simple installation, my idea was the following:
1.) Create a tarball which, after unpacking, will contain 3 folders and 1 Perl script, say:
myapp_repo/
myapp_repo/distlib #will contain all of MyApp's perl modules and ALL used CPAN modules and their dependencies
myapp_repo/datafiles #will contain app-specific data files and such
myapp_repo/install.pl
myapp_repo/lib #will contain modules directly used by the `install.pl`
2.) I will develop an install.pl script, which will be used as the installer tool, like
perl install.pl new /path/to/app_root
and it will (should):
create all the needed directories under /path/to/app_root (especially lib, where it will install the Perl modules)
call a "local" cpanm internally (from myapp_repo/lib) to install the app's Perl modules and their CPAN dependencies, using only the distribution files from the distlib
generate and install the needed runtime script and the app.psgi into /path/to/app_root/bin
install the needed data files for the app.
3.) So, after this the user should be able to simply run:
/path/to/app_root/bin/plackup /path/to/app_root/bin/app.psgi
In short, the user should use:
the system-wide perl and the system-wide perl-core modules
and any other runtime Perl scripts (like plackup) and the required CPAN modules should be installed into a self-contained directory tree, using only local files (no network access).
E.g. install.pl should internally call cpanm to achieve the equivalent of the following cpanm command:
cpanm --mirror file://path/to/myapp_repo/distlib --mirror-only My::App
which should install My::App and all its dependencies without network access, using only the files from myapp_repo/distlib.
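A minimal sketch of what that part of install.pl might look like (the bundled cpanm location and all paths are assumptions for illustration only):
#!/usr/bin/env perl
# install.pl -- minimal sketch, not the finished installer
use strict;
use warnings;
use Cwd qw(abs_path);
use File::Path qw(make_path);

my ($action, $app_root) = @ARGV;          # e.g. "new /path/to/app_root"
die "usage: perl install.pl new /path/to/app_root\n"
    unless defined $action && $action eq 'new' && defined $app_root;

my $repo    = abs_path('.');              # the unpacked myapp_repo/
my $distlib = "$repo/distlib";            # bundled distribution tarballs
my $cpanm   = "$repo/lib/cpanm";          # bundled fat-packed cpanm (assumed location)

make_path("$app_root/bin");               # create the target directory tree

# install My::App plus dependencies from the local mirror only,
# into a self-contained tree under the app root
system($^X, $cpanm,
    '--mirror', "file://$distlib",
    '--mirror-only',
    '--local-lib', $app_root,
    'My::App',
) == 0 or die "cpanm failed: $?";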
Some questions:
Is it possible to use cpanm (called as a locally installed module) without make?
For creating the myapp_repo/distlib, I'm thinking about using Pinto. Is it the right tool to achieve the above?
Am I forgetting something? In other words:
Is the above a viable (read: working) approach?
Are there any other tools which I could/should use to simplify the creation of such a distribution tarball?
@ikegami suggested a method:
- "install everything" into one fresh directory on my machine
- transfer this self-contained directory to the target machine
It sounds very good, because this directory could contain all the needed app-specific data files too; unfortunately, I don't understand the details of how his solution should be done.
The FatPacker solution looks interesting too; I need to learn about it.
Don't write your own make or installer. Just copy make from a different machine (which is basically what apt/yum/etc. do anyway, and which you'd have to do even if you wrote your own). You'd be able to use cpan in 5 minutes!
Also, that should allow you to install gcc if you need it (e.g. to install an XS module), although it doesn't sound like you do. If you do install gcc, I'd install my own perl to avoid having to deal with PERL5LIB.
Tools such as minicpan will allow you to install any module from CPAN without internet access. Of course, you can keep using the command you are already using if your mirror contains the packages you need.
The above explains how to simply and quickly setup a machine so it can use cpan and thus install any module easily.
If you just want to install a specific module and its dependencies, you can completely avoid using cpan on the target machine. First, you need a fresh install of Perl (preferably of the same version as the one on the target system). Then, simply install the module to a fresh dir on your machine, and transfer that dir to the target machine. That's it; nothing else needs to be done. This even works for XS modules if the two machines are similar enough.
This is what ppm (ActiveState's Perl package manager) does.
Unfortunately, while this solution is almost as simple as the one above, it's not nearly as flexible, it doesn't run the test suite of the modules being installed, etc. It does have the advantage of not requiring the transfer of any binary (if you're not installing any XS modules).
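A rough sketch of the fresh-directory workflow described above, assuming cpanm is available on your machine and both machines run the same Perl version (paths and module names are only illustrative):
# on your machine: install into a fresh, self-contained directory
cpanm --local-lib /tmp/myapp-deps Template Plack
tar czf myapp-deps.tar.gz -C /tmp myapp-deps
# on the target machine: unpack and point perl at it
tar xzf myapp-deps.tar.gz -C $HOME
export PERL5LIB=$HOME/myapp-deps/lib/perl5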
Being very new to Perl but not to dynamic languages, I'm a bit surprised at how far from straightforward the management of modules is.
Sure, cpan X does theoretically work, but I'm working on the same project from three different machines and OSs (at work, at home, testing in an external environment).
At work (Windows 7) I have problems using cpan because our firewall makes FTP unusable
At home (Mac OS X) it does work
In the external environment (Linux CentOS) it worked only after hours of effort, because I don't have root access and I had to configure cpan to operate as a non-root user
I've tried on another server where I have access. Whereas the previous external environment is a VPS, so I have shell access, this other one is a cheap shared hosting where I have no way to install new modules other than the ones pre-installed
At the moment I still can't install Template under Windows. I've seen that as an alternative I could compile it, and I've also tried ActiveState's PPM, but the module doesn't exist there.
Now, my perplexity is about Perl being a dynamic language. I've had all these kinds of problems while working, for example, with C, where I had to compile all the libraries for each platform, but I thought that with Perl the approach would be very similar to Python's or PHP's, where in 90% of cases copying the module into a directory and importing it simply works.
So, my question: if Perl modules are written in Perl, why won't the copy/paste approach work? If some modules (or some parts of them) have to be compiled, how can I see on CPAN whether a module is Perl-only or relies on compiled libraries? Isn't there a way to download the module (tar, zip...) and use cpan to deploy it? This would solve my problem under Windows.
Now, Perl is a dynamic language, but that doesn't imply that everything that people write is portable across platforms. That's not the fault of the language. It's not even the fault of the programmer. Some things, like Win32::OLE shouldn't work on Unix. :)
Other dynamic languages will have some of the same problems. If you have to compile C code, you won't be able to merely copy files to another machine. Some distributions configure the code slightly differently depending on your operating system, etc.
Even if you could copy files, you have to ensure that you copy all of the files that you need. Do you know everything that you need for a particular module? Remember, many of them have dependencies.
Most of the problems you're having aren't anything to do with the language. You're having trouble with the tools. If you want a zero conf CPAN tool that makes all the decisions for you, try cpanminus. It's mostly the same thing that you'd get out of cpan (although different code), but it makes all of the decisions for you. It doesn't run any of the distribution tests, and it installs into your user directory. When you need something that gives you control, come back to cpan.
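For example, with cpanminus the whole install is a single command (Template is the module from the question; the second form shows an explicit per-user install directory):
cpanm Template
cpanm --local-lib ~/perl5 Template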
In the external environment (Linux CentOS) it worked only after hours of effort, because I don't have root access and I had to configure cpan to operate as a non-root user
This is one of those times when it helps to know The Trick. In this case local::lib, which lets you configure a non-root install area and all the ENV variables in about three minutes.
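A sketch of that non-root setup, roughly following local::lib's documented bootstrap procedure (check its POD for the current exact lines):
# inside the unpacked local-lib distribution, one-time bootstrap into ~/perl5
perl Makefile.PL --bootstrap
make test && make install
# make PERL5LIB, PATH, etc. point at ~/perl5 in every future shell
echo 'eval "$(perl -I$HOME/perl5/lib/perl5 -Mlocal::lib)"' >> ~/.bashrc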
if Perl modules are written in Perl, why won't the copy/paste approach work?
Some are written in Pure Perl, but many are written partially in C (using Perl's XS API) for efficiency.
Sometimes you end up with situations like JSON::XS, JSON::PP and JSON::Any to autoselect the best one that is installed.
Isn't there a way to download the module (tar, zip...) and use cpan to deploy it?
The cpan program is all about getting things from the Internet. You can download the package (there will be a link along the lines of "Download: CGI.pm-3.49.tar.gz" on the right hand side of the CPAN page), untar it, then
perl Makefile.PL
make
make install
You would probably be better off configuring your cpan installation to use only HTTP sources (in the urllist config option). Possibly going so far as to create a mini CPAN mirror inside your network.
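In the cpan shell, that change looks roughly like this (the URL is only an example; with a local mini mirror you would point urllist at a file:// URL instead, and the shell's o conf help lists push/pop variants for editing the list):
cpan> o conf urllist http://www.cpan.org/
cpan> o conf commit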
This is probably a multi-part question. Background: we have a native (c++) library that is part of our application and we have managed to use SWIG to generate a perl wrapper for this library. We'd now like to distribute this perl module as part of our application.
My first question - how should I distribute this module? Is there a standard way to package pre-built perl modules? I know there is ppm for the ActiveState distro, but I also need to distribute this for Linux systems. I'm not even sure which files are required to distribute, but I'm guessing it's the .pm and .so files, at a minimum.
My next question - it looks like I might need to build my module project for each version of perl that I want to support. How do I know which perl versions I should build for? Are there any standard guidelines... or better yet, a way to build a package that will work with multiple versions of perl?
Sorry if my questions make no sense - I'm fairly new to the compiled module aspects of perl.
CLARIFICATION: the underlying compiled source is proprietary (closed source), so I can't just ship source code and the appropriate make artifacts for the package. Wish I could, but it's not going to happen in this case. Thus, I need a sane scheme for packaging prebuilt binary files for my module.
I look after DBD::Informix, one of the Perl Database Driver modules that works with the DBI (Perl Database Interface). The underlying libraries used to connect to IBM Informix Dynamic Server (IDS) are proprietary, but the DBD::Informix code itself is not. I distribute that code on CPAN, just the same as any other Perl module. People can download that source, and (provided that they have the Informix ClientSDK installed on their machine - and Perl and DBI and so on), they can build DBD::Informix to work with their installed Perl.
I would strongly counsel that you arrange that your Perl interface code be made available in source form, even though the library that it interfaces to is proprietary. This allows people to install the code with any version of Perl they have - without requiring you to deal with inconsistencies.
If you still want to provide binary support, you are going to have to work out which platforms you want to support, and build the module with the standard version of Perl on each such platform. This gets messy. You need access to an instance of each machine. Granted, virtual machines make this easier, but it is still fiddly and the number of platforms and versions only grows. But you still need to support people who don't use the standard version of Perl on their machine - that's why the Perl wrapper interface needs to be provided in source form.
DISCLAIMER: I have next to no experience creating binary packages that can easily be installed. Therefore, I am making this post CW to make it easier for others to add their advice.
You should make the distribution available in source form so it can be compiled on each system tailored according to the specifics of that system. I really like Module::Build for that purpose.
For ActiveState users on Windows, you probably want to have four or six PPMs based on whether you want to support 5.6. Package both 32-bit and 64-bit versions for each of 5.6, 5.8 and 5.10. Use the version of mingw you can install using ppm to compile the modules to preserve binary compatibility.
Another option is to use PAR::Packer and distribute your application in a PAR archive. In that context, PAR::WebStart might be useful although I have not tried it. I have had success with PAR archives in the past, though.
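For reference, the basic pp invocations look roughly like this (file names are placeholders):
pp -o myapp.exe myapp.pl      # self-contained executable
pp -p -o myapp.par myapp.pl   # a PAR archive instead of an executable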
I have an app that I pack into "binary" form using PerlApp for distribution. Since my clients want a simple install for their Win32 systems, this works very nicely.
Now a client has decided that they need to run all unit tests, like in a standard install. However, they still will not install a normal Perl.
So, I find myself in need of a way to package my unit tests for operation on my client's systems.
My first thought was that I could pack up prove in one file and pack each of my tests separately. Then ship a zip file with the appropriate structure.
A bit of research showed that Test::Harness::Straps invokes perl from the command line.
Is there an existing tool that helps with this process?
Perhaps I could use PAR::Packer's parl tool to handle invocation of my test scripts.
I'm interested in thoughts on how to apply either PAR or PerlApp, as well as any thoughts on how to approach overriding Test::Harness and friends.
Thanks.
Update: I don't have my heart set on PAR or PerlApp. Those are just the tools I am familiar with. If you have an idea or a solution that requires a different packager (such as Cava Packager), I would love to hear about it.
Update 2: tsee pointed out a great new feature in PAR that gets me close. Are there any TAP experts out there that can supply some ideas or pointers on where to look in the new Test::Harness distribution?
I'm probably not breaking big news if I tell you that PAR (and probably also perlapp) aren't meant to package the whole test suite and plethora of CPAN-module build byproducts. They're intended to package stand-alone applications or binary JAR-like module libraries.
This being said, you can add arbitrary files to a PAR archive (both to .par libraries and stand-alone .exe's) using pp's -a switch. In case of the stand-alone executable, the contents will be extracted to $ENV{PAR_TEMP}."/inc" at run-time.
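For example, something along these lines (file names are placeholders):
pp -a "t/basic.t;t/basic.t" -a "t/data/fixture.txt;t/data/fixture.txt" -o myapp.exe myapp.pl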
That leaves you with the problem of reusing the PAR-packaged executable to run the test harness (and letting that run your executable as a "perl"). Now, I have no ready and done solution for that, but I've recently worked on making PAR-packaged executables re-useable as more-or-less general purpose perl interpreters. Two gotchas before I explain how you can use that:
Your application isn't going to magically be called "perl" and add itself to your $PATH.
The "reuse" of the application as a general purpose perl requires special options and does not currently support the normal perl options (those in perlrun). It can simply run an external perl script of your choice.
Unfortunately, the latter problem is what may kill this approach for you. Support for perl command line options is something I've been thinking about, but won't implement any time soon.
Here's the recipe for getting PAR with "reusable exe" support:
Install the newest version of PAR from CPAN.
Install the newest, developer version of PAR::Packer from CPAN (0.992_02 or 03).
Add the "--reusable" option to your pp command line.
Run your executable with the following options to run an external script "foo.pl":
./myapp --par-options --reuse foo.pl FOO-PL-OPTIONS-HERE
Unfortunately, how you're going to teach Test::Harness that "./myapp --par-options --reuse" is a perl interpreter is beyond me.
Cava Packager allows you to package test scripts with your packaged executables. This is primarily to allow you to run tests against the packaged code before distribution. However the option is there to also distribute the tests and test capability to your end users.
Note: As indicated by my name, I am affiliated with Cava Packager.