Meteor-postgres PACKAGE_DIRS - postgresql

I'm about to use meteor-postgres (https://meteor-postgres.readthedocs.io). As far as I can see, it has its own implementations of "accounts-password", "accounts-base", "accounts-ui", and maybe other packages. The package is not installed via Atmosphere; instead it is cloned from git and made visible to Meteor via the PACKAGE_DIRS environment variable.
Do Meteor packages in PACKAGE_DIRS directories have higher priority than packages installed with "meteor add ..."? Should I uninstall the old packages with the same names (the ones meant to be used with Mongo) via "meteor remove ...", or not?
The official guide to meteor-postgres is very uninformative; if anyone has links describing the usage of this package, I would be very grateful!

Local packages take precedence, but if you want to be sure, you can change the name of the package (change the author prefix) so that your version is definitely the one used.
I create a packages directory inside my Meteor project for locally modified packages; I don't even need to set PACKAGE_DIRS, as Meteor finds them there anyway.
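As a hedged sketch (the clone path is hypothetical), the PACKAGE_DIRS setup looks like this:
# Make the cloned packages visible to Meteor. Local packages found via
# PACKAGE_DIRS shadow same-named packages from the package server, so the
# existing "meteor add" entries resolve to the local code.
export PACKAGE_DIRS="$HOME/code/meteor-postgres/packages"
meteor list   # locally built packages should show up with a "+" marker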

Related

How to add sample run mode for package installation

I want to install a package (mypackage-1.0-local.zip) only in the local environment. This package should not be installed in any other environment, similar to the OOTB 'samplecontent'/'nosamplecontent' run modes.
I don't know how to achieve this. If I start the AEM server with the 'local' run mode, how will the package manager service know whether or not to install this package based on the run mode?
If you are maintaining this in code, you can place the package under /apps/${site}/install.${runmode}; packages in that folder are installed only on instances matching the run mode.
For example, packages kept under /apps/${site}/install.author will be installed only on author instances.
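A hedged sketch of the resulting content structure (the project name is made up):
/apps/mysite/install.author/authoring-tools-1.0.zip    picked up on author instances only
/apps/mysite/install.local/mypackage-1.0-local.zip     picked up only when the "local" run mode is active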

Why expo-cli must be installed globally?

The official way of working with expo-cli is to install it globally, see https://docs.expo.io/get-started/installation/#installing-expo-cli:
Installing Expo CLI
# Install command line tools
npm install --global expo-cli
However, I never found any explanation about why it is supposed to be global (other than that this simplifies the initial expo init command). To my thinking, having a global package undermines the whole idea behind npm and local node_modules. Essentially, expo-cli is a direct dependency of the project. It's used for running the dev version with expo start and also for creating production builds.
Different versions of expo-cli will work differently; they may even expect different values in app.config.ts. That means it's not safe to upgrade expo-cli globally for one project and then return to working on an older project that was created and maintained with an older version of the (global) expo-cli.
None of this would have happened if expo-cli were a normal local project dependency like expo (the SDK package).
So, my question is: is there a real reason for keeping expo-cli global? What do I lose if I move it to the local project dependencies? How come the Expo documentation never even mentions such an option?
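For reference, the local-dependency setup the question contemplates would look something like this (a sketch, not the documented workflow):
# Install expo-cli per project instead of globally, then invoke it via npx,
# which prefers the project-local binary over a global one:
npm install --save-dev expo-cli
npx expo start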

How to install multiple versions of a compatible package in CentOS with YUM

Is there a way to install multiple versions of the same package in CentOS/RHEL (7/8) if the package installs separate files in each version?
We have an application we've recently converted to using RPM instead of a home-built package manager based on tar. In order to make atomic-like switches between versions, each version installs into its own directory with the version number in the name, and a symlink with the unversioned name points to the current (or previous) version at any given moment. The application, of course, uses the unversioned name to find its init script, configuration files, interpreter version, and code. I'm thinking that the alternatives package would be the basis for this, although we wouldn't use the alternatives command to manage the symlinks (though there's no technical reason not to).
Not exactly as you describe.
Some packages allow this (kernel and kernel-devel being two of them), but I believe these are exceptions special-cased within the package manager.
Certain applications like PHP and Python, for which it is perfectly acceptable to have multiple versions installed (Python 2.x and 3.x), do this by changing the base name of the application/RPM.
Take a look at: https://rpm.org/user_doc/multiple_versions.html
It gives good insight into how to achieve what you want.
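A hedged illustration of both mechanisms (exact package names vary by release; these are CentOS 8 style examples):
# Side-by-side versions via distinct base package names:
dnf install python2 python38
# The kernel is special-cased through the "installonlypkgs" option in
# /etc/yum.conf (yum) or /etc/dnf/dnf.conf (dnf), so several versions coexist:
rpm -q kernel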

Using Cake (C# Make) to always get latest NuGet package version

Is it possible to use Cake to always get the latest version of a specific NuGet package? I know NuGet itself only allows you to set that at the base Nuget.config level. There are some internal packages that we would like to always get the latest version of (some of our database entities), while other internal packages we don't want to force a latest (our extensions package, for example). Right now we have to go through and manually update projects that rely on those packages, and I would like to automate those "always get latest" at build.
I don't see anything using any of the NuGet add-ins, but I am new to Cake so I'm hoping I am just missing something.
Has anyone had any luck using Cake to always retrieve the latest version on the feed for specific named packages, and just use the current packages.config version for the rest?
The short answer is that you can do anything that you want.
Out of the box, Cake tries to follow established best practices for reproducible builds.
With the preprocessor directives, you can simply omit the version information and Cake/NuGet will fetch the latest version. However, once a package has been downloaded to the tools folder, Cake/NuGet will not fetch it again. What you can do is add a custom step to your bootstrapper that clears the tools folder before each build, so that the latest version is downloaded every time.
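A sketch of what that looks like in build.cake (the package names are made up):
// No version: NuGet resolves the newest version on the feed at restore time
#tool nuget:?package=MyCompany.DbEntities
// Pinned as usual
#tool nuget:?package=MyCompany.Extensions&version=2.1.0
In the bootstrapper, a step along the lines of deleting the cached package folders under tools/ (while keeping tools/packages.config if your bootstrapper relies on it) forces the unversioned packages to be re-resolved on every build.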
Note: This is NOT a recommended approach, but rather something custom for your setup.

Version controlled South migrations in virtualenv

I have a Django site in the folder site/, and it's under version control. I use South for schema and data migrations for my applications. Site-specific applications live under site/, so they are all version-controlled along with their migrations.
I manage a virtualenv to keep third-party components dry and safe, and I install packages via PyPI. The list of installed packages is frozen in requirements.txt so that they can easily be installed in another environment. The virtualenv is not under VCS; I think it's good if the virtualenv can easily be deleted and reconstructed at any time. If I need to test my site with another version of the Python interpreter, for instance, I simply activate another virtualenv.
I'd like to use South for third-party packages too, though, and here comes the problem. Their migration scripts are stored in each application's folder, so they are outside of my site's repository. But I want the migration scripts to be under version control so I can run them on the different stages as well.
I don't want to version-control the whole virtualenv, just the migration scripts for third-party applications. How can I resolve this conflict? Is there any misconception in my scenario?
The SOUTH_MIGRATION_MODULES setting allows you to put migration modules for specified apps wherever you want them (i.e. inside your project tree).
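A minimal sketch, assuming a third-party app with the label tagging and a version-controlled package named south_migrations inside your project tree:
# settings.py
SOUTH_MIGRATION_MODULES = {
    # app label -> importable module that will hold that app's migrations
    'tagging': 'south_migrations.tagging',
}
South will then read and write migrations for that app under south_migrations/tagging/ in your repository instead of inside the virtualenv.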
I think it depends a little bit on your version control system. I recommend using a sparse tree, one that manages only the migration folders of the various packages. Here I see two alternatives:
1. Make a truly sparse tree for all packages, one that you check out before creating the virtualenv. Then populate the virtualenv, putting stuff into the existing folders.
2. Collect all migrations into a separate repository, with a folder per project/external dependency. Check this out into the virtualenv, and create symlinks linking each project to its migration folder.
In either case, I believe you can arrange for the migrations to exist as a separate project, so you can install it with the same process as you install everything else (easy_install/pip/whatever).
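For the second alternative, a hedged sketch (the repository path, app name, and Python version are placeholders):
# Check the migrations repository out next to the virtualenv...
git clone /path/to/migrations-repo.git migrations
# ...then link each third-party app's migrations folder to the versioned copy:
ln -s "$PWD/migrations/tagging" \
      "$VIRTUAL_ENV/lib/python2.7/site-packages/tagging/migrations"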