Best way to make @types packages visible in an nx workspace - nrwl-nx

Background
I'm trying to remove resize-observer-polyfill from an nx workspace that I'm working on because it's natively supported in the browsers that we are targeting. Once I removed the polyfill, I needed to add @types/resize-observer-browser because the workspace currently uses typescript@4.0.5, and my understanding is that TypeScript did not ship a "native" type for ResizeObserver until v4.2, which I'd love to update to but can't at the moment.
Problem
In order to make TypeScript happy, it seems like I have to go in and manually add "resize-observer-browser" to individual tsconfig compilerOptions.types entries. This didn't seem that bad at first: I just updated the tsconfig.lib.json file of the libraries that happened to use ResizeObserver. However, I soon realized I also needed to add it to the tsconfig.spec.json of those libraries so that the unit tests could run, and then to the tsconfig.app.json of any applications that import those libraries.
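For reference, the manual change looks roughly like this in each affected tsconfig file (a minimal sketch; the surrounding options are illustrative and will vary between the generated tsconfig.lib.json, tsconfig.spec.json, and tsconfig.app.json files):
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "types": ["resize-observer-browser"]
  },
  "include": ["**/*.ts"]
}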
Question
Is there an easier way in an nx workspace to handle this sort of problem?
I think I could remove the default types overrides in each of the tsconfig files, since that would let TypeScript pick up everything under node_modules/@types when compiling. I didn't want to take that path, since I assume there is a good reason for the default nx library/app generators to add the types override (presumably to force you to be explicit and to keep you from accidentally importing test code from business logic).

The docs seem to recommend against this for @types packages, but /// <reference types="..." /> (e.g. /// <reference types="resize-observer-browser" />) can also be used to include types, and might be easier to manage if the type is only used in a few places.
Docs: https://www.typescriptlang.org/docs/handbook/triple-slash-directives.html#-reference-types-

Related

Absolute and relative path conflict in Modelica

I want to build up a tests library and keep it separate from the libraries under development. My first thought is to go for a structure like the following:
PensLib
--Variants
----BallPoint
----FountainPen
----Tests
------TB_BallPoint
HammocksLib
--Variants
----SingleHammock
----DoubleHammock
----Tests
------TB_DoubleHammock
--Systems
----IndoorWalls
----OutdoorWallAndTree
----CoconutPalms
----Tests
------TB_IndoorWalls
Tests
--PensLib
----Variants
------Test_BallPoint // extends PensLib.Variants.Tests.TB_BallPoint
--HammocksLib
----Variants
------Test_DoubleHammock // extends HammocksLib.Variants.Tests.TB_DoubleHammock
----Systems
------Test_IndoorWalls // extends HammocksLib.Systems.Tests.TB_IndoorWalls
For now let's assume that the way I structure my libraries makes sense (which it most likely doesn't). I will soon ask more questions on good practices for setting up the testing environment in Dymola and with the Testing Library.
My question is about the correct way to handle relative and absolute paths within models, if possible at all.
The model PensLib.Variants.Tests.TB_BallPoint is used for developing the variant BallPoint
The model Tests.PensLib.Variants.Test_BallPoint is used for automated testing
I want the model Test_BallPoint to extend the model TB_BallPoint, but I cannot link them. I guess the absolute path PensLib.Variants.Tests.TB_BallPoint is treated as a relative one, since PensLib is found "on the way out" of the Tests library, and from there the lookup goes searching for the rest of the path. Is there perhaps a way to control the path, something like ..\..\..\PensLib\Variants\Tests\TB_BallPoint?
As you already noted, such a setup causes trouble. There are ways around it, namely global name lookup and imports, which I explain briefly further below.
Both solutions are fine when you only need them in a few places. But if you have to use them all the time, your setup becomes unnecessarily complicated.
Hence, I suggest making your life easier and changing your package structure:
Either create a dedicated test library for every library, maybe PensLib_Tests and HammocksLib_Tests
Or rename the packages in the Tests library and don't use the exact library names
Global name lookup
You can use absolute class paths. They are denoted with a leading ., so this should work:
extends .PensLib.Variants.Tests.TB_BallPoint;
See Modelica Specification chapter 5: Scoping, Name Lookup, and Flattening for details, especially 5.3.3 Global Name Lookup
Importing
You can simply import the library. Lookup of imports is always performed globally.
import PensLib;
extends PensLib.Variants.Tests.TB_BallPoint;

Flutter imports: relative path or package?

In Flutter, for importing libraries within our own package's lib directory, should we use relative imports
import 'foo.dart'
or package import?
import 'package:my_app/lib/src/foo.dart'
Dart guidelines advocate using relative imports:
PREFER relative paths when importing libraries within your own package’s lib directory.
whereas the Provider package says to always use package imports:
Always use package imports. Ex: import 'package:my_app/my_code.dart';
Is there a difference other than conciseness? Why would package imports reduce errors over relative imports?
From the same Dart guidelines, further down, they give this reason for preferring relative imports:
There is no profound reason to prefer the former—it’s just shorter, and we want to be consistent.
Personally, I prefer the absolute method, despite it being more verbose, as it means when I'm importing from different dart files (in other folders), I don't have to work out where the file to be imported is, relative to the current file. Made-up example:
I have two dart files, at different folder levels, that need to import themes/style.dart:
One is widgets/animation/box_anim.dart, where the relative path import would be:
import '../../themes/style.dart';
The other is screens/home_screen.dart with the relative import:
import '../themes/style.dart';
This can get confusing, so I find it better to just use the absolute in both files, keeping it consistent:
import 'package:myapp/themes/style.dart';
And just stick to that rule throughout. So, basically, whatever method you use, consistency is key!
The Linter for Dart also has something to say about this, though it is more about the don'ts of mixing import styles for files in the lib/ folder:
DO avoid relative imports for files in lib/.
When mixing relative and absolute imports it's possible to create
confusion where the same member gets imported in two different ways.
An easy way to avoid that is to ensure you have no relative imports
that include lib/ in their paths.
TL;DR: Choose the one you prefer; note that prefer_relative_imports is recommended in the official Effective Dart guide.
First of all, as mentioned in this answer, Provider does not recommend package imports anymore.
The Dart linter provides a list of rules, including some predefined rulesets:
pedantic for rules enforced internally at Google
lints or even flutter_lints (previously effective_dart) for rules corresponding to the Effective Dart style guide
flutter for rules used in flutter analyze
Import rules
There are actually more than two opposing rules concerning imports:
avoid_relative_lib_imports, enabled in the pedantic and lints rulesets, basically recommends avoiding relative imports that have 'lib' in their paths.
The two following are the ones you mention:
prefer_relative_imports, enabled in no predefined ruleset, but recommended in the Effective Dart guide, as opposed to:
always_use_package_imports, enabled in no predefined ruleset. This means it is up to you and your preferences to enable one of them (be careful, the two are incompatible with each other).
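Enabling whichever rule you pick is then just a matter of listing it in your project's analysis_options.yaml, roughly like this (a minimal sketch; the included base ruleset is illustrative):
include: package:lints/recommended.yaml

linter:
  rules:
    - prefer_relative_imports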
Which one should I choose?
Choose the rule you want! It will not cause any performance issue, and neither rule reduces errors over the other. Just pick one and make your imports consistent across your whole project, with the help of the Dart linter.
I personally prefer prefer_relative_imports, as it is recommended by the Dart team, together with this VSCode extension, which automatically fixes and sorts my imports.
Provider does not need package imports anymore.
This was a workaround to an old Dart bug: Flutter: Retrieving top-level state from child returns null
TL;DR: by mixing relative and absolute imports, Dart sometimes created a duplicate of the class definition.
This led to absurd behavior like the following:
import 'package:myApp/test.dart' as absolute;
import './test.dart' as relative;
void main() {
  print(relative.Test().runtimeType == absolute.Test().runtimeType); // false
}
Since provider relies on runtimeType to resolve objects, this bug made provider unable to obtain an object in some situations.
My five cents on the topic: absolute (package:my_app/etc/etc2...) imports cause much less trouble than relative ones (../../etc/etc2...) when you decide to reorganize or clean up your project's structure, because whenever you move a file from one directory to another you change the "starting point" of every relative import that file uses, thus breaking all the relative imports inside the moved file.
I'd personally always prefer absolute to relative paths for this reason.
This question already has good answers, but I wanted to mention an insanely annoying and hard-to-find problem I experienced with unit testing that was caused by a relative import.
The failure output for an exception-catching expect block
expect(
  () => myFunction(),
  throwsA(isA<InvalidUserDataException>()),
);
was showing the actual result as exactly the same as the expected result, with zero indication of why it was failing.
After massive trial and error, the issue turned out to be that the expected InvalidUserDataException (a custom-made class) was being imported into the test file in RELATIVE format rather than PACKAGE format.
To find this, I had to compare side by side, line by line, call by call, this test file against another test file that uses the exact same exception expecters (luckily, we had one), and just by chance I happened to scroll to the top of this file's imports and see the blue underline saying "prefer relative imports to /lib directory".
No, they're not preferred; they're necessary, because the moment I changed that to a PACKAGE (absolute) import, everything suddenly started working.
What I learned from this is: use absolute imports for test files (files outside the lib directory).
e.g. inside of src/test/main_test.dart
DON'T: use import '../lib/main.dart'
DO: use import 'package:my_flutter_app/main.dart'
Maybe other people knew this already, but I didn't, and I couldn't find anything online with searches about this issue, so I thought I would share my experience that might help others who got stuck around this.
Does anyone know why this happens?
Edit: For context, this happened while using Flutter 2.1.4 (stable) with Sound Null Safety
Do you use Integration Tests?
If the answer is yes, then in most cases you need to use package imports. When you attempt to run your integration tests on a physical device, any relative imports will not be able to find what they're looking for.
Example: https://github.com/fluttercommunity/get_it/issues/76
You can enforce package imports in your project by using these two linting rules:
always_use_package_imports
avoid_relative_lib_imports
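A minimal analysis_options.yaml sketch enabling those two rules might look like this (assuming you don't also enable a conflicting relative-import rule):
linter:
  rules:
    - always_use_package_imports
    - avoid_relative_lib_imports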
I also prefer package imports because they stick even when rearranging your files and folders. Relative imports frequently break and it's a pain to have to remove them and reimport the offending dependency.
One very simple reason not to use package imports: you can rename your package without editing every Dart file.
Renaming can happen a few times while the product does not yet have a definitive name and you or your product owner decide to change it.
It is much more painful to rename with package imports, as your package name is in every import.
Of course you can change it with a find/replace query, but that's a pointless edit to every Dart file which you can avoid with relative imports.
Plus, VS Code can automatically update relative imports on file move/rename, and I have never had any issue with this feature.

Remove/Add References and Compile antique VB6 application using Powershell

I've been given the task of researching whether one can use PowerShell to automate the management of References in a VB6 application and then compile its projects afterwards.
There are 3 projects. The requirement is to remove a specific reference in each project, then compile the projects from the bottom up (server > client > interface), adding the references back in along the way (remove references, compile server.dll > add the client's reference to server.dll, compile client.dll > add the interface's reference to client.dll, compile interface.exe).
I'm thinking no, but I was still given the task of finding out for sure. Of course, where does one go to find this out? Why, here of course: Stack Overflow.
References are stored in the project .VBP files which are just text files. A given reference takes up exactly one line of the file.
For example, here is a reference to DAO database components:
Reference=*\G{00025E01-0000-0000-C000-000000000046}#5.0#0#C:\WINDOWS\SysWow64\dao360.dll#Microsoft DAO 3.6 Object Library
The most important info is everything to the left of the path, which contains the GUID (i.e., the unique identifier of the library, more or less). The filespec and description text are unimportant, as VB6 will update them to whatever it finds in the registry for the referenced DLL.
An alternate form of reference is for GUI controls, such as:
Object={BDC217C8-ED16-11CD-956C-0000C04E4C0A}#1.1#0; tabctl32.ocx
which for whatever reason never seems to have a path anyway. Most likely you will not need to modify this type of reference, because doing so would almost certainly break the forms in the project that rely on those controls.
So in your PowerShell script, the key task would be to add or remove the individual reference lines mentioned in the question. Unless you are using no form of binary compatibility, the GUID will remain stable, so you can essentially hardcode the strings you need to add/remove.
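As a rough illustration, removing and later restoring such a line might look something like this in PowerShell (a sketch only; the .vbp path and the reference string are placeholders for whatever your projects actually use):
$vbp = 'C:\projects\server\Server.vbp'   # placeholder path
$daoRef = 'Reference=*\G{00025E01-0000-0000-C000-000000000046}#5.0#0#C:\WINDOWS\SysWow64\dao360.dll#Microsoft DAO 3.6 Object Library'

# Drop the reference line, matching on the GUID portion, which stays stable
(Get-Content $vbp) |
    Where-Object { $_ -notmatch '00025E01-0000-0000-C000-000000000046' } |
    Set-Content $vbp

# ... compile the project here ...

# Put the reference line back
Add-Content -Path $vbp -Value $daoRef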
Aside from all that, it's worth thinking through why you need to take this approach at all. Normally, to build a VB6 solution it is totally unnecessary to add/remove references along the way. Also, depending on your choice of deployment techniques, you are probably using either project or binary compatibility, which tends to keep the references stable.
Lastly, I'll mention that there are existing tools, such as Kinook's Visual Build Pro, which already know how to build groups of VB6 projects; if using a third-party tool like that is an option, it could save you a lot of work.

Yocto: how to remove/blacklist some dependency from RDEPENDS of a package?

I have a custom machine layer based on https://github.com/jumpnow/meta-wandboard.
I've upgraded the kernel to 4.8.6 and want to add X11 to the image.
I'm modifying the image recipe (console-image.bb).
Since wandboard is based on i.MX6, I want to include the xf86-video-imxfb-vivante package from meta-fsl-arm.
However, it fails, complaining that it cannot build kernel-module-imx-gpu-viv. I believe that happens because xf86-video-imxfb-vivante DEPENDS on imx-gpu-viv, which in turn RDEPENDS on kernel-module-imx-gpu-viv.
I realize that those dependencies were created with the meta-fsl-arm BSP and the vanilla Poky distribution in mind. But those are way outdated for the wandboard, hence I am using the custom machine layer with a modern kernel.
The kernel is configured to include the Vivante DRM module and I really don't want the kernel-module-imx-gpu-viv package to be built.
Is there a way to exclude it from RDEPENDS? Can I somehow swear my health to the build system that I will take care of this specific run-time dependency myself?
I have tried blacklisting 'kernel-module-imx-gpu-viv' by setting PNBLACKLIST[kernel-module-imx-gpu-viv] in my local.conf, but that's only part of a solution. It helps avoid the build failures, but the packaging step still fails.
IIUC your problem comes from these lines in the imx-gpu-viv recipe:
FILES_libgal-mx6 = "${libdir}/libGAL${SOLIBS} ${libdir}/libGAL_egl${SOLIBS}"
FILES_libgal-mx6-dev = "${libdir}/libGAL${SOLIBSDEV} ${includedir}/HAL"
RDEPENDS_libgal-mx6 += "kernel-module-imx-gpu-viv"
INSANE_SKIP_libgal-mx6 += "build-deps"
I would actually qualify this RDEPENDS as a bug: usually kernel module dependencies are specified as RRECOMMENDS, because most modules can be compiled into the kernel itself, producing no separate package at all while still providing the functionality. But that's another issue.
There are several ways to fix this problem. The first general route is to tweak RDEPENDS for the package. It's just a BitBake variable, so you can either assign it some other value or remove a portion of it. In the first case it looks something like this:
RDEPENDS_libgal-mx6 = ""
In the second one:
RDEPENDS_libgal-mx6_remove = "kernel-module-imx-gpu-viv"
Obviously, these two options have different implications for your present and future work. In general I would opt for the softer one, the second, because it has less potential for breakage when you later update the meta-fsl-arm layer, which can change the imx-gpu-viv recipe in any way. But when you're overriding a more complex recipe with big lists in its variables and modifying it heavily (not just removing a thing or two), it might be easier to maintain with a full hard assignment of the variables.
Now there is also the question of where to do this variable mangling. The main option is a .bbappend in your layer; that's what appends are made for. But you can also do it from your distro configuration (if you're maintaining your own distro it might be easier to have all these tweaks in one place rather than sprayed across numerous appends) or from your local.conf (which is a nice place to quickly try it out, but probably not something to use in the longer term). I usually use a .bbappend.
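For example, the .bbappend route might look like this (a sketch; the layer path, recipe file name, and version wildcard are assumptions that depend on the layer you are appending to):
# meta-yourlayer/recipes-graphics/imx-gpu-viv/imx-gpu-viv_%.bbappend
RDEPENDS_libgal-mx6_remove = "kernel-module-imx-gpu-viv"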
But there is also a completely different approach to this problem: rather than fixing the package dependencies, you can fix what some other package provides. If, for example, your kernel is configured with the imx-gpu-viv module built right into the main zImage, you can do
RPROVIDES_kernel-image += "kernel-module-imx-gpu-viv"
(also in a .bbappend, the distro configuration, or local.conf) and that's it.
In any case, your approach to fixing this problem should reflect the difference between your setup and the recipe's assumptions. If you do have the module, just in a different package, then go for RPROVIDES; if you have some other module providing the same functionality to the libgal-mx6 package, then fix the libgal-mx6 dependencies (and it's better to really fix them, meaning not only drop something you don't need, but also add whatever is relevant for your setup).

Buildr: adding a path to the generated eclipse/idea files

I have a legacy java project that we have been moving to buildr/artifactory from ant/jars in svn.
The primary code is in the default (src/main/java) folder, but we have a few external source paths for various tests that we can't move into the default folder but still want access to.
Currently, when adding a new library or regenerating the IDE files, buildr does not pick up these source paths, and I can't find a succinct discussion in the buildr manual of how to actually add them, rather than re-adding everything manually in Eclipse (which just gets wiped out on the next regeneration).
Any idea how to have multiple source paths get picked up explicitly by buildr so that the idea/eclipse targets generate properly?
There are two ways that I know will work with IDEA. The second one might also work with Eclipse, while the first is specific to the idea task.
The IDEA-specific solution:
define 'proj' do
  # ...
  iml.main_source_directories << _('src/other')
end
iml also has test_source_directories and excluded_directories arrays you can append to.
The possibly eclipse-compatible solution, with more background than you probably want:
The iml object gets its default values for the main and test source directory arrays from project.compile.sources and project.test.compile.sources (a slight simplification; resources are considered as well). Buildr defines these .sources project attributes from the layout, so instead of explicitly appending to the iml attributes, you could use a custom layout for your project that includes your special source paths. That might work with the eclipse task, but I haven't tried it.
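A rough sketch of that custom-layout approach, following the Layout API from the buildr documentation (untested here; 'src/other' and the project name are placeholders):
my_layout = Layout.new
my_layout[:source, :main, :java] = 'src/main/java'
my_layout[:source, :test, :java] = 'src/other'   # the extra test sources

define 'proj', :layout => my_layout do
  # compile/test (and hence the idea and eclipse tasks) now take
  # their source directories from this layout
end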