Why does wheel installation put shared objects in site-packages folder instead of package folder? - python-c-api

I have a Python binary distribution (wheel) created via
python setup.py bdist_wheel
The wheel contents look as follows:
unzip -l dist/<package-name>-1.0.0-cp36-cp36m-linux_x86_64.whl
Archive: dist/<package-name>-1.0.0-cp36-cp36m-linux_x86_64.whl
Length Date Time Name
--------- ---------- ----- ----
2996432 2021-01-07 21:47 lib<xyz>.so
7821608 2021-01-07 21:48 lib<abc>.so
4414000 2021-01-07 21:48 <module>.cpython-36m-x86_64-linux-gnu.so
581 2021-01-07 20:05 <package-name>/__init__.py
636 2021-01-07 20:05 <package-name>/version.py
Upon installing the wheel, why do the *.so files get installed in the site-packages folder?
/opt/conda/lib/python3.6/site-packages/
While the other files get installed inside
/opt/conda/lib/python3.6/site-packages/<package-name>

A wheel is essentially a zip archive of a package distribution, so it can be unzipped like any zip file. On installation, the entire directory structure inside the wheel is copied as-is into the site-packages folder. This is the reason why
the shared libraries, which sit at the root of the archive, end up directly in site-packages, while
the rest of the package files (e.g. __init__.py) end up inside the <package-name> subfolder of site-packages.
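You can see this mapping by treating the wheel as a plain zip with Python's zipfile module. A minimal sketch (the archive is built in memory and the entry names are placeholders, not the actual wheel contents):

```python
import io
import zipfile

# Build a tiny in-memory archive that mimics the wheel layout from the
# question: one entry at the archive root, one inside a package directory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as whl:
    whl.writestr("libxyz.so", b"")              # entry at the archive root
    whl.writestr("mypackage/__init__.py", b"")  # entry inside the package dir

# pip extracts each entry at its archive path, relative to site-packages,
# so a root-level .so lands directly in site-packages while package files
# land in the package subfolder.
with zipfile.ZipFile(buf) as whl:
    print(whl.namelist())
```

If you want the shared libraries installed inside the package folder, they need to be placed under the package directory inside the archive when the wheel is built.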

Related

How to install the PowerShell module MicrosoftPowerBIMgmt offline?

I want to install the PowerShell module MicrosoftPowerBIMgmt with the command line below:
Install-Module -Name MicrosoftPowerBIMgmt
The script runs, but, perhaps limited by my network, the command gets stuck downloading the MicrosoftPowerBIMgmt.Profile file and the progress stays at 0.
It seems I can't complete the download with this command, so I wonder if there is another way to install this module?
Any advice is greatly appreciated.
By the way, my system is Windows 10.
I decided to run the install command you gave, and checking the docs shows that one of the module's commands is Get-PowerBIWorkspace. With that in mind, I tried this trick:
(Get-Command Get-PowerBIWorkspace).dll
Which gave this result:
C:\Program Files\WindowsPowerShell\Modules\MicrosoftPowerBIMgmt.Workspaces\1.2.1093\lib\netstandard2.0\Microsoft.PowerBI.Commands.Workspaces.dll
This told me to look in C:\Program Files\WindowsPowerShell\Modules\ for the module. Checking that location, I found the following folders were added:
09/17/2022 08:52 PM <DIR> MicrosoftPowerBIMgmt.Profile
09/17/2022 08:52 PM <DIR> MicrosoftPowerBIMgmt.Admin
09/17/2022 08:53 PM <DIR> MicrosoftPowerBIMgmt.Capacities
09/17/2022 08:53 PM <DIR> MicrosoftPowerBIMgmt.Data
09/17/2022 08:53 PM <DIR> MicrosoftPowerBIMgmt.Reports
09/17/2022 08:53 PM <DIR> MicrosoftPowerBIMgmt.Workspaces
09/17/2022 08:53 PM <DIR> MicrosoftPowerBIMgmt
Checking the folders shows that the data is around 50 MB. Modules are usually entirely contained in the module folder, with no external changes made to the system (such as registry changes or other files placed elsewhere). Since this is Microsoft, that might not be the case.
If I were in your shoes, I would probably use xcopy, or a copy method you are familiar with, to copy these folders to the module folder on the system where you want the MicrosoftPowerBIMgmt module installed.
If these folders truly contain everything that makes the modules work, and you copy them to the module folder that is normally found in $Env:PSModulePath, then you should be able to successfully run your script.

python setup.py install won't keep data_files

I am currently quite confused about the installation of my own Python packages... Between setup.py, sdists and wheels, no matter what I do I can't seem to achieve what I want: a bunch of non-.py data files installed and kept in the virtual environment with the same structure they have in my project folder.
I've read all kinds of documentations, and created a setup.py file that has a data_files field that contains all the data files I need in my installation.
I have the following structure:
.
|__ requirements.txt
|__ setup.py
|__ hr_datapool
    |__ __init__.py
    |__ data_needed
    |   |__ needed_folder
    |       |__ needed_file.notpy
    |__ one_module
    |   |__ __init__.py
    |   |__ misc_tools.py
    |   |__ tests
    |       |__ test_tools.py
    |__ other_module
        ...
And data_needed contains non-.py data files that are needed for misc_tools.py (and thus test_tools.py) to run.
Because of this, I added a data_files entry to my setup.py that contains all the folders and files I need. I confirmed that everything that should be there is there.
And yet, if I do any variation of pip install ., python setup.py install or the like, the data_files are completely ignored and/or placed in the wrong directory, and they don't appear anywhere in the created build or dist folders. Because of this, all my tests fail, since they can't load files that are not there. Even when I do succeed in copying them, they are not stored in the installation folder in the venv, but in the root of the venv.
The funny thing is, the files are handled while installing, I keep getting console output when installing with python setup.py install like:
copying data_needed/needed_folder/needed_file.notpy -> /Users/.../venv/hr_datapool/data_needed/needed_folder/
but only if I use python setup.py install, (not when using pip install .).
According to the documentation:
The directory should be a relative path. It is interpreted relative to the installation prefix (Python's sys.prefix for system installations; site.USER_BASE for user installations). Distutils allows directory to be an absolute installation path, but this is discouraged since it is incompatible with the wheel packaging format. No directory information from files is used to determine the final location of the installed file; only the name of the file is used.
Notice the highlighted parts. In my case, however, the files are not installed relative to the directory containing the package; they are installed into their own folder in the root of the virtual environment, making them practically unreachable from within my code. I made sure I use relative paths in my setup.py, but this still happens.
How can I make sure the required data_files install within the target directory of the module, and not separately into the root of the virtual environment?
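One approach that keeps data files inside the installed package directory is package_data instead of data_files, since package_data paths are interpreted relative to the package and installed alongside it. A sketch, assuming the data_needed folder sits inside the hr_datapool package as in the tree above:

```python
# setup.py -- sketch only; package_data paths are relative to the named
# package, and the matched files are installed inside the package directory
# in site-packages, unlike data_files, which installs relative to sys.prefix.
from setuptools import setup, find_packages

setup(
    name="hr_datapool",
    version="0.1.0",
    packages=find_packages(),
    include_package_data=True,
    package_data={
        "hr_datapool": ["data_needed/needed_folder/*"],
    },
)
```

At runtime the files can then be located relative to the package itself, e.g. via importlib.resources or os.path.dirname(__file__), rather than relative to the virtual environment root.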

Can I run sbt new with a local template (not GitHub)?

I work in a secure environment where developers are not allowed to git-clone from GitHub, or any other external repos.
I was able to download a g8 template (play-scala-seed) from GitHub as a zip file and I've unzipped it to a local folder. Can I use that local directory instead of a git repo?
My first attempt at this failed:
> dir .\play-scala-seed
Volume in drive C is OSDisk
Volume Serial Number is A074-A016
Directory of C:\workspace\play-scala-seed
03/22/2018 11:03 AM <DIR> .
03/22/2018 11:03 AM <DIR> ..
03/22/2018 11:01 AM <DIR> project
03/22/2018 10:57 AM <DIR> src
03/22/2018 11:03 AM <DIR> target
03/22/2018 10:57 AM 70 .gitignore
03/22/2018 10:57 AM 509 .travis.yml
03/22/2018 10:57 AM 453 build.sbt
03/22/2018 10:57 AM 439 LICENSE
03/22/2018 10:57 AM 1,166 README.md
5 File(s) 2,637 bytes
5 Dir(s) 220,172,980,224 bytes free
Even though I'm sure the template exists and is in a directory called "play-scala-seed", it's not accepted by the sbt new command:
> sbt new .\play-scala-seed
Template not found for: .\play-scala-seed
So how can I make sbt new use a local directory for a g8 template?
I'm running Windows (if that matters!)
When you use giter8 directly you just need to refer to your local template with the file:// prefix:
g8 file://play-scala-seed
As mentioned by @volia17, you can find this in the giter8 documentation: Testing templates locally.
But when you use sbt new, your template name (folder) needs to end with .g8. sbt accepts different types of templates, and this suffix is how it knows it is a giter8 template. So rename your template folder to play-scala-seed.g8:
sbt new file://play-scala-seed.g8
P.S. Using giter8 directly is much faster, because sbt new takes time to start (it loads global sbt plugins every time).

Perl .bundle files

On my OSX machine, deep within a jungle of lib directories, I found this:
-r-xr-xr-x 1 user users 45700 Feb 01 1:47 LibXSLT.bundle*
1) What are these .bundle files?
2) Who creates them? CPAN modules?
3) If so, can we control their generation via some Makefile artifact?

Eclipse Plugin only contains manifest

I am attempting to develop an Eclipse plugin. The plugin runs from inside Eclipse (i.e. when I launch a test instance of Eclipse with my plugin from inside Eclipse, I can use the plugin in the test instance.)
However, when I attempt to generate a plugin that could be installed on other systems using File > Export > Deployable plug-ins and fragments, the created zip file contains a single jar file which itself contains only a manifest file:
$ jar tvf com.foo.bar_1.0.0.d.jar
0 Wed Feb 10 12:14:12 EST 2016 META-INF/
863 Wed Feb 10 12:14:10 EST 2016 META-INF/MANIFEST.MF
For example, it does not include my icons or my plugin.xml file.
I am not (yet) using Maven Tycho or any other means of building the plugin outside Eclipse.
Can anyone suggest what I may be doing wrong?
You must list everything you want in the plugin in the build.properties file, so check that file. When you run from within Eclipse, this file is not checked for accuracy, but it must be correct when you export.
For a simple plugin it might look something like:
output.. = bin/
bin.includes = META-INF/,\
.,\
plugin.xml,\
OSGI-INF/
source.. = src/
This includes the META-INF folder, the bin folder (where your class files are), the plugin.xml file and the OSGI-INF folder.
In the plugin.xml editor use the 'Build' tab to set the contents of this file.