Aurelia exported bundle causes a 404 from SystemJS if a source module was in a subdirectory

I'm learning Aurelia via the TypeScript / ASP.NET Core skeleton navigation app. Everything runs fine in its default state. To test the exported production bundle, I run the Gulp export task, publish the app via Visual Studio project publish to a local folder, replace the published wwwroot folder with the wwwroot folder from within the "export" folder, and then use dotnet from the command line to run the app.
Things break if I have any source modules in a subdirectory. For example, I moved the welcome.ts/html component files into "/src/Pages" and adjusted the route's moduleId in app.ts accordingly to "./pages/welcome". The unbundled app still runs fine, but when I try the exported version, SystemJS makes a request to http://localhost:5000/dist/pages/welcome.js, which 404s (as you'd expect).
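For reference, this is roughly what the adjusted route looks like in my app.ts (other routes omitted):

    import { Router, RouterConfiguration } from 'aurelia-router';

    export class App {
      router: Router;

      configureRouter(config: RouterConfiguration, router: Router) {
        config.title = 'Aurelia';
        config.map([
          // the component files now live under src/Pages/
          { route: ['', 'welcome'], name: 'welcome', moduleId: './pages/welcome', nav: true, title: 'Welcome' }
        ]);
        this.router = router;
      }
    }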
I can see the contents of the welcome component in the app-build.js file, and the config.js file within the export folder contains the expected file paths, i.e. it has "Pages/welcome...".
I have read this seemingly similar issue:
https://github.com/aurelia/bundler/issues/131
But setting depCache to false made no difference in my case (see below for where I set it). Why is SystemJS trying to load this module separately from outside of the bundle?
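For reference, here is roughly where that flag lives in the skeleton's bundle config, build/bundles.js (bundle contents abbreviated):

    module.exports = {
      bundles: {
        'dist/app-build': {
          includes: [
            '[**/*.js]',
            '**/*.html!text',
            '**/*.css!text'
          ],
          options: {
            inject: true,
            minify: true,
            depCache: false  // made no difference for me
          }
        }
      }
    };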

I was able to reproduce this error locally.
Presuming that you have a Windows environment, it is a case-sensitivity issue.
After renaming the [P]ages folder to [p]ages, the bundled version works as expected.
On the filesystem there is a [P]ages/welcome.js viewmodel, but [p]ages/welcome is defined as the moduleId.
Unbundled mode: the Windows filesystem is case-insensitive, which can be misleading here, because a request for [p]ages/welcome.js still loads [P]ages/welcome.js successfully.
Bundled mode:
Based on the file path, the bundling process embeds [P]ages/welcome.js as the [P]ages/welcome module.
But, according to the route config, SystemJS will be looking for a [p]ages/welcome module within app-build.js; not finding a module with that exact name in the bundle, it falls back to requesting the file from the server, hence the 404.
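A sketch of the mismatch (module names abbreviated; you can reproduce the lookup in the browser console):

    // app-build.js registers the module under its path-derived name:
    //   define('Pages/welcome', ...);

    // The route's moduleId makes SystemJS ask for the lowercase name:
    System.import('pages/welcome')
      .catch(err => console.log(err));
    // 'pages/welcome' is not in the bundle's registry, so SystemJS
    // falls back to fetching dist/pages/welcome.js -> 404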
My recommendation would be to use lowercase folder/filenames whenever possible.

Related

VSCode: how to structure a simple python package with few modules and tests, debugging and linting?

I'm having more trouble than I'd like to admit structuring a simple project in Python to develop using Visual Studio Code.
How should I structure, on my file system, a project that is a simple Python package with a few modules? Just a bunch of *.py files together. My requirements are:
I must be able to step-debug it in vscode.
It has a bunch of unit tests using pytest.
I can select and debug a specific test from the vscode testing tab, and it must stop at breakpoints.
pylint must not show any false positives.
The test files must be in a different directory from the main module files.
I must be able to run all the tests from the console.
The module is executed inside a virtual environment using the Python standard library module venv.
The code will use type hints.
I may use another linter, or even another test framework.
Nothing fancy, but I'm really having trouble getting it right. I want to know:
How should I organize my directories: a folder with the main files and a sibling folder with the tests? Or a subfolder with the code and a subsubfolder with the tests?
Which dirs must have an __init__.py file?
How should the tests import the files from the module? Should I use relative imports?
Should I create a pytest.ini file?
Should I create a .env file?
What should the content of launch.json, the debugger config file in vscode, be?
Common dir structure:

    app
        __init__.py
        yourappcode.py
    tests  (pytest looks for this)
        __init__.py
        test_yourunittests.py
    server.py  (if you have one)
    .env
    .coveragerc
    README.md
    Pipfile
    .gitignore
    pyproject.toml  (if you want)
    .vscode  (helpful)
        launch.json
        settings.json
Or you could do one better: ignore my structure and look at the GitHub pages of some famous Python projects. FastAPI, Flask, asgi, and aiohttp are some that I can think of right now.
Also:
I think absolute imports are easier to work with than relative imports (I could be wrong, though); there's a small sketch at the end of this answer.
vscode is able to use pytest. Make sure you have testing support enabled; vscode has it built in, I'm pretty sure. You can configure it to use pytest and specify your test dir. You can also run your tests from the command line: if you're at the root, just running 'pytest' will recognise your tests dir if it's named that, by default. Also, your actual test files need to start with the test_ prefix.
The launch.json doesn't need to be anything special. When you click on the settings button next to the play button in the debug panel, vscode will ask what kind of app it is. E.g. if it's a Flask app, select Python, then select Flask, and it will auto-generate a config file which you can tweak however you want to get your app to run, e.g. if you want to expose a different port or the commands to run your app are different.
It sounds to me like you just need to spend a bit of time configuring vscode to your specific python needs. For example, you can use a virtualenv and linting in whichever way you want. You just need to have a settings.json file in the .vscode folder in your repo where you specify your settings. Configurations to specify python virtualenv and linting methods can be found online
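To make the absolute-import and test-naming points concrete, here is a minimal sketch of a test file against the structure above (some_function is a hypothetical placeholder for your own code):

    # tests/test_yourunittests.py
    # 'app.yourappcode' matches the directory structure above;
    # 'some_function' is a hypothetical placeholder.
    from app.yourappcode import some_function

    def test_some_function_returns_a_value():
        # pytest discovers this because both the file name and the
        # function name start with the test_ prefix
        assert some_function() is not None

Run it from the project root with 'pytest' (or 'python -m pytest', which puts the root on sys.path so the absolute import resolves).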

Can Google Package App use external directories during packing?

I am writing a number of Google Packaged Apps which run independently, but share lots of code. For example, they all use "library.js". I would like to have only one copy of library.js so any changes to it will be used by all newly packed apps.
To package my apps, it seems they all must have a copy of library.js in their own directory structure, whereas it would be nice to have a single master copy in some other directory that is accessible to all. I currently do a manual check to make sure all files are up-to-date before packing, and I am writing some code to do the check automatically, but it seems like a work-around.
Can a Google Packaged App use JS code in external library directories, or must all code be under the root directory of the app (i.e., requiring copying from external directory) when packing?
Have you tried providing a URL instead? I.e. host the JavaScript file at a location accessible to all your apps, and then reference that .js file's URL in each app's code. The next time you want to change it, all you have to do is update that one .js file.

Custom Language Resource file not getting deployed to server

We have an MVC 4 application that has 4 resource files: the default one (Resource.resx), one each for Chinese and German (Resource.zh-CN.resx and Resource.de-DE.resx), and then one for a custom language (Resource.en-PI.resx, English-Pirate) for testing purposes.
Every setting for all of the resource files is exactly the same:
Build Action: Embedded Resource
Copy to Output Directory: Copy Always
Custom Tool: PublicResXFileCodeGenerator
Custom Tool Namespace: Resources
Whenever the application is built locally, we get the folders de-DE, en-PI and zh-CN in the bin folder, each containing a dll named (ProjectName).resources.dll.
The problem is that whenever this application is deployed to our DEV server, the folder for en-PI is ignored and not copied, and we have to manually copy it over for the en-PI language to work.
We have a build definition that will build the necessary projects to a certain folder and do other things and then copy them over to our DEV server. Everything works perfectly fine for the other 2 languages (de-DE and zh-CN) but the en-PI folder doesn't show up in the bin folder. Is there some setting somewhere that is causing this resource file not to be deployed because it is a custom language?
The way all of these resource files were added followed this example http://odetocode.com/Blogs/scott/archive/2009/07/16/resource-files-and-asp-net-mvc-projects.aspx
Figured it out. Microsoft explains it here, under the section Resource Naming Conventions. Because the language is a custom language with a custom code, it doesn't have a name that the common language runtime expects, which is why it doesn't get deployed correctly.

deploying asp.net source code to webserver

I am just trying to understand the deployment/build model with ASP.NET.
I write code locally on my machine in Visual Studio, and when I hit F5 it starts up a local webserver for all my testing.
Then I FTP all of my source code to my webserver and hit the real URL.
My question is: when does this get compiled on the webserver? Is it using the bin/ directory that I just copied over from my local machine, or is it recompiling the solution and projects on the web server?
If you just drop source files (.aspx, .asmx, etc.) in the web site's directory and the site is set to allow dynamic updating, then each page will get compiled the first time it is accessed.
Every time you update a page, it gets recompiled. In your bin folder, all you have is the compiled classes with your code.

How to setup a DotNetNuke Development Environment with Source Control?

My team is developing a new DotNetNuke web application and would like to know what is recommended to setup a development environment with source control and automated builds? We would like to keep the DNN source code separate from our custom modules and extensions source code.
The DotNetNuke Compiled Module template for Visual Studio wants us to store the source code in the DesktopModules directory of the DNN source code and output to the DNN source code bin directory. Is this the recommended structure? I would rather keep the files in different locations, but then it becomes more difficult to run and debug locally as it would require an install of the module for each change. Also, how should an automated build deploy any changes?
How have others set this up? Is there a recommended best practice?
For my source control, I develop modules in their own project. This contains the module code, test code, data provider code (if applicable) and anything else. This is checked into source control like any other project. Note that the module project contains no links to a specific DNN website, and DNN references are made in the project to a common "bin" directory that references your target build. For example, in my projects folder I have \bin460, \bin480, \bin510, \bin520, etc. Each of these folders holds a set of binaries for a specific DNN version. That way you can build against a particular version but test against any version you like.
The problems with source-controlling a module in place in a DNN install are:
- sometimes not all of the module code is easily isolated under a single parent directory
- doesn't lend itself well to a PA (Private Assembly) module approach
- not easy to shift the project to a different DNN Version for development or testing
- easy to inadvertently source control parts of the DNN solution, particularly with integrated VS source control solutions.
This approach compiles quickly because you're not trying to compile the entire DNN solution. For test deployment, I have a build script that copies the various parts of the module into a target website. This can be hooked into the compile (link the build script in) or just run after you've had a successful compile in a cmd window. My build script has a 'target' environment switch, so that I can say 'dnn520' to deploy the build to my test dnn520 install. Note that you need to manually create the module configuration first before this will work, but this is a one-time effort, and you can use the export feature to create your .dnn module manifest.
To build your module package, invest the time in a comprehensive script which will take the various parts from your source directory, and zip them into an install package. Keep all of the parts in your source control folder, and copy them into a temp directory, then run a command-line zip utility (I use an ancient version of pkzip) to pack it into an installable file.
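As a sketch of what such a packaging script can look like (here using PowerShell's Compress-Archive in place of a pkzip step; the module name and file list are hypothetical placeholders):

    # package.ps1 - hypothetical packaging sketch
    $staging = Join-Path $env:TEMP 'MyModule_package'
    Remove-Item $staging -Recurse -ErrorAction SilentlyContinue
    New-Item $staging -ItemType Directory | Out-Null

    # copy the module parts from the source-control folder into staging
    Copy-Item '.\MyModule\*.ascx' $staging
    Copy-Item '.\MyModule\MyModule.dnn' $staging      # module manifest
    Copy-Item '.\MyModule\bin\MyModule.dll' $staging  # compiled assembly

    # pack the staged files into an installable zip
    Compress-Archive -Path "$staging\*" -DestinationPath '.\MyModule_01.00.00_Install.zip' -Force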
The benefits of this approach are:
- separation of module code from installed code
- simple way of keeping only the module code in source control (don't have to exclude all the website code)
- ability to quickly test out modules in different dnn versions
- packaging script allows you to quickly and easily build a new version of a module for install testing/deployment
The drawbacks are:
- can't use the magic green 'go' button in VS (have to manually attach debugger)
- more setup time than developing in-place
We typically stick to keeping the module code in a folder under DesktopModules and building to the website's bin directory.
In source control, we just map the individual modules, rather than the entire website. Depending on what we're working on, a module may be an entire project in source control, or we may have multiple related modules in the same project, living next to each other.
Automatically deploying changes is somewhat difficult in DNN. It's highly recommended to have a build script that packages your module into an installable form. You can then copy installable packages into the website's Install/Module folder and request the URL /Install/Install.aspx?mode=InstallResources, which will install any packages in that folder.
In response to bduke's answer: you shouldn't, and don't want to, build projects in the DesktopModules folder.
That's where all of the source code for the out-of-the-box site goes.
That's where your modules will be "installed", and thus if someone "updates" or re-installs one, it will be overwritten.
It can make upgrading your application far more difficult. Many developers don't understand the idea of not touching the original source code files to modify their behavior, because those changes will just be overwritten when you perform an upgrade.
If you want to build modules, create a solution folder called Modules and place your separate module projects there.
If you want to debug them, make the target debug output point to the web\bin folder.
If you want to install/deploy them, build in Release mode and install them through the Module/Extension installer.