Using the Parcel build tool, how can I control the locations of my output files?

I'm following the official guide for Parcel 2.
By default parcel takes your input files and dumps them all in the same output directory.
The vast majority of developers have specific directory layouts for their projects, so Parcel must offer a way for its users to put files and directories where they want them to go.
But I can't find documentation for configuring the output file layout.
How do I control where output files are created? And how do I rename them?
For example, if I'm working with projectroot/src/js/index.js, how can I output to projectroot/dist/js/bundle.js?
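Something like the following sketch is what I'm hoping exists (this assumes Parcel 2's --dist-dir CLI flag and the "targets"/"distDir" fields in package.json; the paths and the "default" target name are just placeholders, and it still wouldn't rename the bundle to bundle.js):
npx parcel build src/js/index.js --dist-dir dist/js
or, in package.json:
"targets": {
  "default": {
    "distDir": "./dist/js"
  }
}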

Related

As a dev, do you use .gitignore to ignore everything and purposefully include? Or do you just exclude with it?

We have an internal discussion going here and we are somewhat torn on the best practice for using .gitignore on projects that contain a lot of files (like a CMS).
Method 1
Method 1 would be to purposefully .gitignore all files that come standard with your build. That would generally start like this:
# ignore everything in the root except the "wp-content" directory.
/*
!wp-content/
# ignore everything in the "wp-content" directory, except:
# "mu-plugins", "plugins", "themes" directory
wp-content/*
!wp-content/mu-plugins/
!wp-content/plugins/
!wp-content/themes/
# ignore these plugins
wp-content/plugins/hello.php
# ignore specific themes
wp-content/themes/twenty*/
# ignore node dependency directories
node_modules/
# ignore log files and databases
*.log
*.sql
*.sqlite
Some staff members like this approach since if you create something outside of the standard files, for example like a /build folder, then it would automatically be detected for inclusion. However, writing custom theming and plugins require you to add a few layers to this file to "step in" to the folders you want to keep, and generally, the file is a bit messier to read.
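As a concrete illustration, the extra "step in" layers for a custom theme and plugin would look something like this (a sketch only; "mytheme" and "my-plugin" are placeholder names):
# step in to keep only our custom theme and plugin
wp-content/themes/*
!wp-content/themes/mytheme/
wp-content/plugins/*
!wp-content/plugins/my-plugin/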
Method 2
Method 2 ignores everything, and then you whitelist what you want in the repo. That would look like:
# Ignore everything, but allow descending into subdirectories
*
!*/
# root files
!/.gitignore
!/.htaccess.live
!/favicon.ico
!/robots.txt
# theme
!/wp-content/themes/mytheme/**
# ignore compiled CSS
/wp-content/themes/mytheme/style.css
# ignore compiled JS
/wp-content/themes/mytheme/js
# plugins
!/wp-content/plugins/my-plugin/**
# deployment resources
!/build/**
Some staff like this since it's cleaner: you have to purposefully add something (which makes accidental adds harder), and it also, in effect, documents the folder structure tracked in the repo.
What is the best practice? Which method do you prefer, and would you recommend one over the other?
The second method is the best practice when it comes to excluding some folder contents with .gitignore rules.
It better reflects the following rule:
It is not possible to re-include a file if a parent directory of that file is excluded.
To re-include files (i.e. exclude them from the ignore rules) in a subfolder of an ignored folder f, you would do:
f/**
!f/**/
!f/a/sub/folder/someFile.txt
Meaning: you need to whitelist the folders first, before being able to re-include files from those folders.
It is clearer and shorter (unless you have a large number of folders to whitelist).
What if it is a Joomla install with a large number of directories and files?
Or what if a core upgrade adds new files or folders?
Don't forget you can have multiple gitignore files, one per folder.
That means you can mix and match both approaches.
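For example, a quick sketch of mixing the two (folder names are placeholders): a blacklist-style .gitignore at the repo root, plus a whitelist-style one inside the theme folder. Patterns in a nested .gitignore are relative to that folder:
# /.gitignore (blacklist style)
node_modules/
*.log
# /wp-content/themes/mytheme/.gitignore (whitelist style)
*
!*/
!.gitignore
!src/**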
And you have:
http://gitignore.io/ (which uses a blacklist approach when it comes to a Joomla application)
github/gitignore (same approach for Joomla)
The ideal .gitignore file is the one that does not exist.
For some reason, you're deeply intermingling files you want to track via source control, with files you DON'T want to track.
This, I think, is the source of your sadness.
You are mixing git's intended purpose, which is versioning of programmer-edited files, with deployment, which is intended to get the files where they belong in the correct directories.
Your question is not clear as to whether you think the Wordpress core files should be versioned. I'm assuming not, since that's how you've set up your .gitignore.
Your question is also not clear as to whether you are deploying a web site or shipping plugins as a product. Those are different use cases, and they require different types of versioning. If this is a deployed web site, you SHOULD be versioning Wordpress along with everything else. If you are shipping a plugin or a theme, then you should have a test suite with plenty of different Wordpress versions to test against.
I think your source control system should be set up solely to track the plugins/* and/or themes/* files that go into your distribution. Zipping that folder should give you the plugin asset that your customers download.
To debug your plugins, there should be a deploy step in your IDE that copies each of those tracked files into a Wordpress install at a location you choose. This permits you to more easily test against different Wordpress versions.
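For instance, that deploy step can be as simple as the sketch below (the paths and the use of rsync are assumptions, not something taken from your setup):
#!/bin/sh
# copy the tracked plugin into whichever local Wordpress install you are testing against
rsync -av --delete ./plugins/my-plugin/ /var/www/wp-test/wp-content/plugins/my-plugin/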
You're reducing a workflow problem to the choice of a .gitignore. Fix the problem at the root by getting the workflow right.

Including multiple folders (with images, scripts, etc) within a Matlab standalone GUI.exe

I have a piece of software that has multiple GUIs. To organize things better (or at least that was my thought), I have created several folders within the root directory, as can be seen in this image.
Within the folders I have both files of various formats and some Matlab scripts.
When creating the Matlab executable using the Application Compiler, and after selecting the main file, Matlab did not directly detect that these folders are needed for the code to run. Therefore I decided to add the folders manually.
Once the setup was created and installed, I ran the application within the Matlab environment and was able to identify one possible reason why the software is not running.
As you can see in the first image, the "play.png" is within the Images folder.
My question is pretty straightforward: how do I force the Matlab Compiler to include all of these folders in the setup, and not only include them but also add them to the path?
Two things could be going on:
1. You are not including the files in the package.
Make sure that you include them using the -a option of mcc:
mcc -m hello.m -a ./testdir/*
You can also use the GUI, of course, see here.
2. You are looking for the included files in the wrong place. Use ctfroot as the root of all paths in your code:
img_file_name = fullfile(ctfroot,'Images','brain.jpg');
Check the unpacked CTF file (it is automatically unpacked when executed) to see the directory structure in it. ctfroot points to the root of the unpacked CTF file.
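Putting these together for the layout in your question, a sketch might look like this (main.m and the Scripts folder name are assumptions; Images and play.png come from your screenshot):
% package the app, bundling the resource folders (-a accepts directories too)
mcc -m main.m -a ./Images -a ./Scripts
% at run time, resolve resources relative to the unpacked CTF root
playIcon = imread(fullfile(ctfroot, 'Images', 'play.png'));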
PS: This blog post might give you some more pointers.

TDS File Replacement

I want to deploy some front-end assets to the local web root of a site using file replacement. I can't seem to get it to work with a relative path in the target location field, though. Is it possible to do this through TDS, or should I use a post-build event instead?
The reason these assets aren't included in a project is that they are part of a third party solution but we still want this tracked in source control to try to make the project setup easier.
Most developer machines will be set up the same way for this project with the same file structure but I think it's a little more flexible if I can make the target a relative path so I don't need to worry about differences like drive letters and such.
The folder structure is as follows:
repo
folderToCopy
sitecore
webroot
I have tried the following, using ..'s based on what TDS changed my source location to when using the "Make selected Source Location relative" option (it changed it from an absolute path to ..\folderToCopy\):
../../Sitecore/Website
/../../Sitecore/Website
..\..\Sitecore\Website
\..\..\Sitecore\Website
From my understanding, TDS does the file replacement based on the files published from the associated Website project.
You can then have relative replacements such as the following:
<Replacement Include=".\assets\folderToCopy\myFile.txt">
  <TargetPath>.\assets\targetFolder\myFile.txt</TargetPath>
  <IsFolder>False</IsFolder>
  <IsRelative>True</IsRelative>
</Replacement>
I have not been able to successfully get TDS to use the file replacement with files that are in source control but not in the project.
My suggestion would be to set up a build event that copies these files to the correct location, or to create a NuGet feed for them and pull them in as NuGet references.
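If you go the build-event route, a post-build event along these lines might do it (a sketch only; the Visual Studio macros are standard, but the path from the solution folder to the webroot is an assumption based on the folder structure in the question):
xcopy "$(SolutionDir)folderToCopy\*" "$(SolutionDir)sitecore\webroot\assets\" /E /Y /I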

Searching for multiple solutions with NDepend

Is there a way with NDepend to have it search for sln files to load? I need to look at metrics across a large codebase that has hundreds of sln files in it. I want to create some summary info, like total lines of code. In the interface I can browse to sln files but that will take me a long long time.
The perfect solution would be to just select a top directory and then have it recurse looking for the sln files automatically...
You can achieve this by writing a program based on NDepend.API. See the getting started with NDepend.API page.
Basically your program will recursively search for all *.sln files under the top directory.
For each solution file it'll call GetAssembliesFromVisualStudioSolutionOrProject().
Once you have gathered all the assembly file paths from all the .sln files, you aggregate them into a newly created NDepend project.
The getting-started page shows how to create such a project and eventually run a first analysis if you have a build-machine license (otherwise the analysis will be run from VisualNDepend.exe).
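A rough sketch of the gathering step, assuming a C# console app that references NDepend.API (the top-directory path is a placeholder, and the NDepend-specific calls are left as comments since their exact usage is covered by the getting-started page):
using System.IO;

class Program
{
    static void Main()
    {
        // recursively find every solution file under the top directory (placeholder path)
        var slnPaths = Directory.EnumerateFiles(@"C:\code\big-codebase", "*.sln", SearchOption.AllDirectories);

        foreach (var slnPath in slnPaths)
        {
            // call GetAssembliesFromVisualStudioSolutionOrProject() on each .sln (NDepend.API),
            // collect the returned assembly paths, then aggregate them into a newly created
            // NDepend project and run a first analysis, as shown on the getting-started page.
        }
    }
}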

Can a Google Packaged App use external directories during packing?

I am writing a number of Google Packaged Apps which run independently, but share lots of code. For example, they all use "library.js". I would like to have only one copy of library.js so any changes to it will be used by all newly packed apps.
To package my apps, it seems they all must have a copy of library.js in their own directory structure, whereas it would be nice to have a single master copy in some other directory that is accessible to all. I currently do a manual check to make sure all files are up-to-date before packing, and I am writing some code to do the check automatically, but it seems like a work-around.
Can a Google Packaged App use JS code in external library directories, or must all code be under the root directory of the app (i.e., requiring copying from external directory) when packing?
Have you tried providing a URL, i.e. hosting the JavaScript file at a location accessible to your apps and then referencing that .js file's URL in each app's code? The next time you want to change it, all you have to do is update that one .js file.