Is it possible to overwrite and use the same value in multiple build.sbt files? - scala

I have the following dummy project structure:
|───employee-uService
| ├───backend
| | ├───employee-api
| | ├───project
| | ├───src
| | │ └───main
| | │ ├───protobuf
| | │ └───scala
| | ├───build.sbt
| ├───build.sbt
|───build.sbt (root project build)
The build.sbt in employee-api contains the project definition with the .settings(scalapbSettings(".")) call.
The scalapbSettings function sets up the proto source folder like this:
val protoSources = PB.protoSources in Compile := Seq(file(s"$projectFolder/src/main/protobuf"))
where projectFolder is a parameter of the function.
The build.sbt one level higher in the hierarchy (employee-uService) defines employee-api and the corresponding impl project and aggregates them, while the root build aggregates the ...-uService projects.
Depending on which project I'm compiling from, the string parameter passed to scalapbSettings has to change to represent the proper path (e.g. in the root it has to be employee-uService/backend/employee-api, while when running the api compile it's ..).
How could I pass a value to the function call that could be overwritten in the different build.sbt files?
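To make the problem concrete, here is a hedged sketch of how the two call sites differ today (the project val names are made up; only the paths come from the description above):
// employee-uService/backend/employee-api/build.sbt (assumed shape):
// compiling from the api folder, the protos are reachable relative to "."
lazy val employeeApi = (project in file("."))
  .settings(scalapbSettings("."))

// root build.sbt (assumed shape): starting sbt from the repository root,
// the same settings need the full relative path to reach the protos
lazy val employeeApiFromRoot = (project in file("employee-uService/backend/employee-api"))
  .settings(scalapbSettings("employee-uService/backend/employee-api"))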

Given the directory structure you described (protos are under src/main/protobuf in each project), you don't need to set PB.protoSources for each project since that is the default. However, if you wanted to specify it explicitly, and allow users to override, you could have in your scalapbSettings function the following line:
val protoSources = PB.protoSources in Compile := Seq(
  (sourceDirectory in Compile).value / projectFolder)
Then projectFolder should be relative to src/main (and can have a default value of "protobuf").
Tip: in an sbt shell you can type protocSources to see the value of this setting for each project.
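Putting that together, a minimal sketch of what scalapbSettings could then look like (the PB.protoSources line follows the answer above; the helper object name, the PB.targets line and the import paths are typical sbt-protoc/scalapb boilerplate added here as assumptions):
import sbt._
import sbt.Keys._
import sbtprotoc.ProtocPlugin.autoImport.PB

// Hypothetical helper, e.g. project/ScalapbSettings.scala.
object ScalapbSettings {
  // projectFolder is now relative to src/main and defaults to "protobuf",
  // so callers no longer pass a path that depends on where sbt was started.
  def scalapbSettings(projectFolder: String = "protobuf"): Seq[Setting[_]] = Seq(
    PB.protoSources in Compile := Seq((sourceDirectory in Compile).value / projectFolder),
    // Typical scalapb code-generation target; an assumption, not from the question.
    PB.targets in Compile := Seq(scalapb.gen() -> (sourceManaged in Compile).value)
  )
}
Each build.sbt can then call ScalapbSettings.scalapbSettings() with no argument, or override the folder only where it really differs.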

Related

Yocto Bitbake fails to find wrapper headers (include_next stdint.h no such file)

I'm trying to create a bitbake recipe to build some source code that resides in a local folder (as opposed to fetching from a remote repo).
After running bitbake, I expect to have an executable file and a shared library in the resulting image.
The source code includes three CMakeLists.
So far, I'm able to:
Run cmake by itself to build the source code on my host and on the target (i.e. not using bitbake)
Using my .bb file, correctly point to the source code by using the variable OECMAKE_SOURCEPATH.
Start running cmake using the default do_compile()
The build fails during do_compile() with the error:
In file included from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Inc/typedefs.h:29,
| from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/DOIP/DOIP.h:32,
| from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/ISOUDS/ISOUDS_MAIN/ISOUDS_Server_Cfg.h:30,
| from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/ISOUDS/ISOUDS_ClearDiagInfo/ISOUDS_ClearDiagInfo_Cfg.c:25:
| /home/myname/project/nxp_s32/build_s32g274asbc2/tmp/work/aarch64-ms-linux/embitel-uds/1.0.0-r0/recipe-sysroot-native/usr/lib/aarch64-ms-linux/gcc/aarch64-ms-linux/10.2.0/include/stdint.h:9:16: fatal error: stdint.h: No such file or directory
| 9 | # include_next <stdint.h>
| | ^~~~~~~~~~
| compilation terminated.
| make[2]: *** [CMakeFiles/uds-server.dir/build.make:264: CMakeFiles/uds-server.dir/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/ISOUDS/ISOUDS_ClearDiagInfo/ISOUDS_ClearDiagInfo_Cfg.c.o] Error 1
| In file included from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Inc/typedefs.h:29,
| from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/DOIP/DOIP.h:32,
| from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/ISOUDS/ISOUDS_MAIN/ISOUDS_Server_Cfg.h:30,
| from /home/myname/UDS_Server_Integration/src/3rdparty/udsdoip/UDSSrvonDOIP/DoIPSrvProcess/Src/ISOUDS/ISOUDS_CntrlDTCSetting/ISOUDS_CntrlDTCSetting.c:13:
| /home/myname/project/nxp_s32/build_s32g274asbc2/tmp/work/aarch64-ms-linux/embitel-uds/1.0.0-r0/recipe-sysroot-native/usr/lib/aarch64-ms-linux/gcc/aarch64-ms-linux/10.2.0/include/stdint.h:9:16: fatal error: stdint.h: No such file or directory
| 9 | # include_next <stdint.h>
| | ^~~~~~~~~~
However, stdint.h does exist. I looked up what "include_next" is: the stdint.h being included here is one of GCC's "wrapper headers". I think GCC uses these to adapt the headers to the target environment, i.e. this is a cross-compiler issue. I assume this indicates that cmake is not configured correctly for cross-compilation, or is not looking in the correct location for the wrapped headers.
I have never encountered this problem building other source code for the same target environment using the same cross-compiler. My .bb recipe is also written using the same variables as for other packages. I even compared the CMakeOutput.log and CMakeCache.txt for this failing recipe and other successful recipes and saw that most of the relevant variables were set with the same values.
This led me to believe this could be an issue with the CMakeLists.txt and not having configured cmake correctly for this particular source code.
I have tried adding -DCMAKE_NO_SYSTEM_FROM_IMPORTED=1 based on this thread.
I have also avoided directly setting the cross-compiler based on this.
However, I'm at a loss for what I could be missing.
Other issues I've referenced:
Referencing gcc with yocto recipe Makefile, unable to find stdint
Here are my CMakeLists for reference:
cmake_minimum_required(VERSION 3.13)
project("MyUDS"
VERSION "1.0.0"
LANGUAGES C)
include(GNUInstallDirs)
## --- C++ build flags ---
set(CMAKE_C_STANDARD_REQUIRED ON)
set(CMAKE_C_EXTENSIONS OFF)
set(CMAKE_EXPORT_COMPILE_COMMANDS ON)
set(CMAKE_C_FLAGS "-MMD -MP -O4 -fcommon")
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -std=gnu11")
# Set version
set(PROJECT_VERSION_MAJOR 0 CACHE STRING "")
set(PROJECT_VERSION_MINOR 0 CACHE STRING "")
set(PROJECT_VERSION_PATCH 0 CACHE STRING "")
set(PROJECT_VERSION_BUILD 0 CACHE STRING "")
# changes binary and library outputs to ./build/bin and ./build/lib
# set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/bin)
set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/lib)
set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/lib)
add_library(uds-server SHARED)
set_target_properties(uds-server PROPERTIES
VERSION ${PROJECT_VERSION}
SOVERSION ${PROJECT_VERSION_MAJOR})
add_subdirectory(src/3rdparty/udsdoip)
----- CMakeLists in src/3rdparty/udsdoip ------
file(GLOB_RECURSE sched_sources ${CMAKE_CURRENT_SOURCE_DIR}/UDSSrvonDOIP/DoIPSrvProcess/Sched/*.c)
add_executable(udsserver ${sched_sources})
target_link_libraries(udsserver uds-server pthread)
target_include_directories(
udsserver
PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/UDSSrvonDOIP/DoIPSrvProcess/Inc
${CMAKE_CURRENT_SOURCE_DIR}/UDSSrvonDOIP/DoIPSrvProcess/Sched/Inc
...
${CMAKE_CURRENT_SOURCE_DIR}/UDSSrvonDOIP/DoIPSrvProcess/Src/Inc
)
add_subdirectory(UDSSrvonDOIP/DoIPSrvProcess)
install(TARGETS udsserver DESTINATION bin)
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/service/uds_server.sh DESTINATION bin)
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/service/uds_server.service
DESTINATION /etc/systemd/system
)
----- CMakeLists in UDSSrvonDOIP/DoIPSrvProcess ------
project(lib-uds-server)
# Enable helper debugging messages
target_compile_definitions(
uds-server PUBLIC DEBUG_SOCKCOMM DOIP_SERVER_PRINT_TCP_RX_PACKET_DATA DOIP_SERVER_PRINT_FOUND_NET_DEVS
)
file(GLOB_RECURSE isouds_sources RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} *.c)
target_sources(uds-server PRIVATE ${isouds_sources})
target_include_directories(
uds-server
PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/Inc
${CMAKE_CURRENT_SOURCE_DIR}/Sched
${CMAKE_CURRENT_SOURCE_DIR}/Sched/Inc
...
${CMAKE_CURRENT_SOURCE_DIR}/Src/ISOUDS/ISOUDSSecurDtaTrans
)
install(TARGETS uds-server
ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR}
LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR}
RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
Found the issue, and it's a bit of a dumb mistake.
I was not inheriting the CMake C flags set by Yocto that are needed for the cross-compile environment. So no matter what flags I added in my .bb recipe file, they were being overridden in the source code CMakeLists.txt.
Here, where I was setting the C flags, I was not inheriting Yocto's flags.
set(CMAKE_C_FLAGS "-MMD -MP -O4 -fcommon")
I should have used the existing flags and appended the ones specific to my source code like this:
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -MMD -MP -O4 -fcommon")
This got my build working.

IntelliJ Scala: import works in test folder but not in main folder

I have an IntelliJ project in scala with the following directory structure (I've renamed files/directories for simplicity):
project
|
+--src
| |
| +--main
| | |
| | +--scala
| | |
| | +--'X'
| | |
| | +--'Y.scala'
| +--test
| |
| +--scala
| |
| +--'X'
| |
| +--'YSuite.scala'
|
+--build.sbt
The issue I'm having is that I'm able to import things in the YSuite.scala file that I'm not able to import in Y.scala - specifically, the scala.collection.parallel packages. I just have no idea how or why I can import in the test file, but not in the main application file. I need them in the main file for implementation. Can someone point me in the right direction?
Screenshots are of the Y.scala file, YSuite.scala file, as well as the build.sbt file, if they help at all.
As can be seen, the red text indicates that I wasn't able to import it in Y.scala - when I hover over it with my mouse, it simply says cannot resolve symbol parallel. However, I've run the test file with some implementation of the parallel package, which runs with no problems.
A solution that seems to have worked for me:
step 1: File -> Invalidate Caches / Restart
step 2: build again/spin up sbt

OpenDDS Perl Script Setup Throwing Error

Continuing from this SO question.
When following the OpenDDS install guide I attempt to run configure from within the command prompt, but I receive this error output:
C:\Users\Supervisor\Desktop\opendds>C:\Users\Supervisor\Desktop\opendds\configure.cmd
Options:
'compiler' => 'gcc'
'verbose' => 1
host system is: win32
compiler is: gcc
Using ace_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers
Using tao_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers/TAO
ACE_ROOT/ace/config.h exists, skipping configuration of ACE+TAO
ENV: saving current environment
ENV: Appending ;C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\bin;C:\Users\Supervisor\Desktop\opendds\bin;C:\Users\Supervisor\Desktop\opend
ds\ACE_wrappers\lib;C:\Users\Supervisor\Desktop\opendds\lib to PATH
ENV: Setting ACE_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers
ENV: Setting MPC_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC
ENV: Setting CIAO_ROOT to unused
ENV: Setting TAO_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\TAO
ENV: Setting DDS_ROOT to C:\Users\Supervisor\Desktop\opendds
ENV: Setting DANCE_ROOT to unused
Use of uninitialized value $mpctype in concatenation (.) or string at configure line 1028.
OpenDDS mwc command line: -type C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
Use of uninitialized value $mpctype in string eq at configure line 1031.
Running MPC to generate project files.
MPC_ROOT was set to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC.
Using .../opendds/ACE_wrappers/bin/MakeProjectCreator/config/MPC.cfg
ERROR: Invalid type: C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
mwc.pl v4.1.8
Usage: mwc.pl [-global <file>] [-include <directory>] [-recurse]
[-ti <dll | lib | dll_exe | lib_exe>:<file>] [-hierarchy]
[-template <file>] [-relative NAME=VAL] [-base <project>]
[-noreldefs] [-notoplevel] [-static] [-genins] [-use_env]
[-value_template <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-value_project <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-make_coexistence] [-feature_file <file name>] [-gendot]
[-expand_vars] [-features <feature definitions>]
[-exclude <directories>] [-name_modifier <pattern>]
[-apply_project] [-version] [-into <directory>]
[-gfeature_file <file name>] [-nocomments]
[-relative_file <file name>] [-for_eclipse]
[-workers <#>] [-workers_dir <dir> | -workers_port <#>]
[-language <cplusplus | csharp | java | vb>]
[-type <automake | bcb2007 | bcb2009 | bds4 | bmake | cc | cdt6 |
cdt7 | em3 | ghs | gnuace | gnuautobuild | html | make |
nmake | rpmspec | sle | vc6 | vc7 | vc8 | vc10 | vc11 |
vc12 | vc14 | vc71 | vc9 | vxtest | wb26 | wb30 | wix>]
[files]
-base Add <project> as a base project to each generated
project file. Do not provide a file extension, the
.mpb extension will be tried first; if that fails the
.mpc extension will be tried.
-exclude Use this option to exclude directories or files when
searching for input files.
-expand_vars Perform direct expansion, instead of performing relative
replacement with either -use_env or -relative options.
-feature_file Specifies the feature file to read before processing.
The default feature file is default.features under the
config directory.
-features Specifies the feature list to set before processing.
-for_eclipse Generate files for use with eclipse. This is only
useful for make based project types.
-gendot Generate .dot files for use with Graphviz.
-genins Generate .ins files for use with prj_install.pl.
-gfeature_file Specifies the global feature file. The
default value is global.features under the
config directory.
-global Specifies the global input file. Values stored
within this file are applied to all projects.
-hierarchy Generate a workspace in a hierarchical fashion.
-include Specifies a directory to search when looking for base
projects, template input files and templates. This
option can be used multiple times to add directories.
-into Place all output files in a mirrored directory
structure starting at <directory>. This should be a
full path. If any project within the workspace is
referenced via a full path, use of this option is
likely to cause problems.
-language Specify the language preference; possible values are
[cplusplus, csharp, java, vb]. The default is
cplusplus.
-make_coexistence If multiple 'make' based project types are
generated, they will be named such that they can coexist.
-name_modifier Modify output names. The pattern passed to this
parameter will have the '*' portion replaced with the
actual output name. Ex. *_Static
-apply_project When used in conjunction with -name_modifier, it applies
the name modifier to the project name also.
-nocomments Do not place comments in the generated files.
-noreldefs Do not try to generate default relative definitions.
-notoplevel Do not generate the top level target file. Files
are still processed, but no top level file is created.
-recurse Recurse from the current directory and generate from
all found input files.
-relative Any $() variable in an mpc file that is matched to NAME
is replaced by VAL only if VAL can be made into a
relative path based on the current working directory.
This option can be used multiple times to add multiple
variables.
-relative_file Specifies the relative file to read before processing.
The default relative file is default.rel under the
config directory.
-static Specifies that only static projects will be generated.
By default, only dynamic projects are generated.
-template Specifies the template name (with no extension).
-workers Specifies number of child processes to use to generate
projects.
-workers_dir The directory for storing temporary output files
from the child processes. The default is '/tmp/mpc'
If neither -workers_dir nor -workers_port is used,
-workers_dir is assumed.
-workers_port The port number for the parent listener. If neither
-workers_dir nor -workers_port is used, -workers_dir
is assumed.
-ti Specifies the template input file (with no extension)
for the specific type (ex. -ti dll_exe:vc8exe).
-type Specifies the type of project file to generate. This
option can be used multiple times to generate multiple
types. There is no longer a default.
-use_env Use environment variables for all uses of $() instead
of the relative replacement values.
-value_project This option allows modification of a project variable
assignment. Use += to add VAL to the NAME's value.
Use -= to subtract and = to override the value.
This can be used to introduce new name value pairs to
a project. However, it must be a valid project
assignment.
-value_template This option allows modification of a template input
name value pair. Use += to add VAL to the NAME's
value. Use -= to subtract and = to override the value.
-version Print the MPC version and exit.
Error from MPC, stopped at configure line 1035.
The cmd script being run is:
@echo off
:: Win32 configure script wrapper for OpenDDS
:: Distributed under the OpenDDS License.
:: See: http://www.opendds.org/license.html
for %%x in (perl.exe) do set PERLPATH=%%~dp$PATH:x
if x%PERLPATH%==x (
echo ERROR: perl.exe was not found. This script requires ActiveState Perl.
exit /b 1
)
set PERLPATH=
perl configure -verbose --compiler=gcc %*
if exist setenv.cmd call setenv.cmd
And the section of configure that generates the error is:
my $mwcargs = "-type $mpctype $buildEnv->{'DDS_ROOT'}$slash$ws $static";
$mwcargs .= ' ' . $opts{'mpcopts'} if defined $opts{'mpcopts'};
print "OpenDDS mwc command line: $mwcargs\n" if $opts{'verbose'};
print 'Running MPC to generate ', ($mpctype eq 'gnuace' ? 'makefiles' :
'project files'), ".\n";
if (!$opts{'dry-run'}) {
if (system("perl $ENV{'ACE_ROOT'}/bin/mwc.pl $mwcargs") != 0) {
die "Error from MPC, stopped";
}
}
Where the initially unset variable is set:
my $mpctype = ($slash eq '/') ? 'gnuace' : $opts{'compiler_version'};
I have both Perl and Visual Studio installed. Looking up MPC I can find a 'multi-precision' library. Could this be because I am using gcc? I have to use GCC in order to create a library to use with the JNI out of this code eventually...
You need to make sure that you are using ActiveState Perl on Windows; other Perl variants seem not to work 100%.

How do I create a tarball and a zip for a single module using distinct configurations?

I have a multi-project build with a particularly messy module which contains several mainClasses. I would like to create several distribution packages for this messy module, each distribution package employing distinct file sets and different formats. Ideas?
This is the answer from the sbt-native-packager issue tracker, where the same question was posted.
I'm adding this from the gitter chat as well:
I'm just arriving in this chat room and my knowledge of sbt-native-packager is virtually zero... but anyway... looks to me that JavaAppPackaging and other archetypes should actually be configurations extended from Universal. In this scenario, I would just create my own configuration extended from JavaAppPackaging and tweak the necessary bits according to my needs. And, finally, if the plugin just picks mappings in ThisScope... it would pick my own scope, and not JavaAppPackaging... and not Universal.
So, let's go through this one by one.
The sbt-native-packager plugin always picks mappings in Universal. This is not ideal. It should conceptually pick mappings in ThisScope.
SBT native packager provides two categories of AutoPlugins: FormatPlugins and ArchetypePlugins. FormatPlugins provide a new package format, e.g. UniversalPlugin (zip, tarball) or DebianPlugin (.deb). These plugins form a hierarchy as they are built on top of each other:
                SbtNativePackager
                        |
          +--------- Universal ---------+
          |             |               |
       Docker         Linux          Windows
                        |
                  +-----+-----+
                  |           |
               Debian        RPM
mappings, which define a file -> targetpath relation, are inherited with this pattern:
mappings in FormatPluginScope := (mappings in ParentFormatPluginScope).value
So for docker it looks like this
mappings in Docker := (mappings in Universal).value
The linux format plugins use specialized mappings to preserve file permissions, but are basically the same.
Since the sbt-native-packager plugin always picks mappings in Universal, I have to redefine mappings in Universal in each of my configurations
Yes. If you want to define your own scope, inherit the mappings and change them, you have to do this, just like all the other packaging plugins do. I recommend putting this code into custom AutoPlugins in your project folder.
For example (not tested; the native-packager imports below are a best guess):
import sbt._
import sbt.Keys._
// Assumed imports for the native-packager types used below (not verified here).
import com.typesafe.sbt.packager.archetypes.JavaAppPackaging
import com.typesafe.sbt.SbtNativePackager.Universal

object BuilderRSPlugin extends AutoPlugin {
  override def requires = JavaAppPackaging

  object autoImport {
    val BuilderRS = config("builderrs") extend Universal
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    mappings in BuilderRS := (mappings in Universal).value
  )
}
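A hedged usage sketch to go with the plugin above (the module name and the filtered path are made up; the pattern is the one this answer recommends): enable the plugin on the module and override the inherited mappings only in the custom scope, leaving mappings in Universal untouched.
lazy val messyModule = (project in file("MessyModule"))
  .enablePlugins(JavaAppPackaging, BuilderRSPlugin)
  .settings(
    // Filter the BuilderRS mappings without touching the shared Universal ones.
    mappings in BuilderRS := (mappings in BuilderRS).value.filterNot {
      case (_, name) => name.endsWith("bin/rv")
    }
  )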
looks to me that JavaAppPackaging and other archetypes should actually be configurations extended from Universal
JavaAppPackaging is an archetype, which means this plugin doesn't bring any new packaging formats, thus no new scopes. It configures all the packaging formats it can and enables them.
You package stuff by specifying the scope:
universal:packageBin
debian:packageBin
windows:packageBin
So if you need to customize your output format, you do this in the respective scope.
mappings in Docker := (mappings in Docker).value.filter( /* whatever you want to filter */ )
See: https://github.com/sbt/sbt-native-packager/issues/746
IMPORTANT: This is an "answer in progress". IT DOES NOT WORK YET!
This is an example of how one could achieve this.
The basic idea is that we add configurations for different packages to be generated. Each configuration tells which files will be present in the package. This does not work as expected. See my comments after the code.
lazy val BuilderRS = sbt.config("BuilderRS").extend(Compile,Universal)
lazy val BuilderRV = sbt.config("BuilderRV").extend(Compile,Universal)
addCommandAlias("buildRS", "MessyModule/BuilderRS:packageZipTarball")
addCommandAlias("buildRV", "MessyModule/BuilderRV:packageBin") // ideally should be named packageZip
lazy val Star5FunctionalTestSupport =
project
.in(file("MessyModule"))
.enablePlugins(JavaAppPackaging)
.settings((buildSettings): _*)
.configs(Universal,BuilderRS,BuilderRV)
.settings(inConfig(BuilderRS)(
Defaults.configSettings ++ JavaAppPackaging.projectSettings ++
Seq(
executableScriptName := "rs",
mappings in Universal :=
(mappings in Universal).value
.filter {
case (file, name) => ! file.getAbsolutePath.endsWith("/bin/rv")
},
topLevelDirectory in Universal :=
Some(
"ftreports-" +
new java.text.SimpleDateFormat("yyyyMMdd_HHmmss")
.format(new java.util.Date())),
mainClass in ThisScope := Option(mainClassRS))): _*)
//TODO: SEE COMMENTS BELOW ===============================================
// .settings(inConfig(BuilderRV)(
// Defaults.configSettings ++ JavaAppPackaging.projectSettings ++
// Seq(
// packageBin <<= packageBin in Universal,
// executableScriptName := "rv",
// mappings in ThisScope :=
// (mappings in Universal).value
// .filter {
// case (file, name) => ! file.getAbsolutePath.endsWith("/bin/rs")
// },
// topLevelDirectory in Universal :=
// Some(
// "ftviewer-" +
// new java.text.SimpleDateFormat("yyyyMMdd_HHmmss")
// .format(new java.util.Date())),
// mainClass in ThisScope := Option(mainClassRV))): _*)
Now observe configuration BuilderRV, which is commented out above.
It is basically the same thing as configuration BuilderRS, except that we are now deploying a different shell script in the bin folder. There are some other small differences, but they are not relevant to this discussion. There are two problems:
The sbt-native-packager plugin always picks mappings in Universal. This is not ideal. It should conceptually pick mappings in ThisScope.
Since the sbt-native-packager plugin always picks mappings in Universal, I have to redefine mappings in Universal in each of my configurations. And this is a problem because mappings in Universal is defined as a function of itself in all configurations: the result is that we end up chaining logic onto mappings in Universal each time we redefine it in each configuration. This causes trouble in this example in particular because configuration BuilderRV (the second one) will apply not only its own filter, but also the filter defined in BuilderRS (the first one), which is not what I want.
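For what it's worth, a hedged sketch of one way around that chaining (configuration names and paths are taken from the build above; the restructuring itself is only a suggestion): derive each configuration's mappings from the shared Universal mappings instead of redefining mappings in Universal inside every inConfig block, so the two filters no longer stack. Whether the packaging tasks then actually consume these per-configuration mappings is exactly the open issue described in this answer.
lazy val Star5FunctionalTestSupport = project
  .in(file("MessyModule"))
  .enablePlugins(JavaAppPackaging)
  .configs(BuilderRS, BuilderRV)
  .settings(
    // Each configuration filters the *shared* Universal mappings, so
    // BuilderRV's filter never sees the result of BuilderRS's filter.
    mappings in BuilderRS := (mappings in Universal).value.filterNot {
      case (file, _) => file.getAbsolutePath.endsWith("/bin/rv")
    },
    mappings in BuilderRV := (mappings in Universal).value.filterNot {
      case (file, _) => file.getAbsolutePath.endsWith("/bin/rs")
    }
  )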

Service Loader config file doesn't explode properly

So I am writing a webapp in Eclipse and I want to use the ServiceLoader in one of my classes. The question is where to put the META-INF/services stuff. From here (https://stackoverflow.com/a/3421191/2742995) I found:
But the ideal way is to have it in your plugin's jar file. E.g. if you have a plugin bundled as WEB-INF/lib/myplugin.jar, and your plugin class is com.example.plugin.MyPlugin, then the jar should have the structure:
myplugin.jar!/META-INF/services/com.example.plugin.MyPlugin
So, in the module containing the ServiceLoader stuff, I have the source folder src/main/java/ containing:
vcs.validation.* (containing the source code)
a folder: META-INF/services/vcs.validation.javatests.JavaTest containing:
Test1 (which reads vcs.validation.javatests.Test1) and
Test2 (which reads vcs.validation.javatests.Test2)
(The interface vcs.validation.javatests.JavaTest has two implementing classes Test1 and Test2)
However, when I package the whole webapp as a war and deploy it in Tomcat, the webapp/WEB-INF/classes/ folder does not contain any META-INF/services/ entries. What am I doing wrong here?
Structure should be:
Project
| Module
| | src
| | main
| | java
| | [ source code]
| | resources
| | META-INF
| | services
| | [service files]
instead of:
Project
| Module
| | src
| | main
| | java
| | [source code]
| | META-INF
| | services
| | [service files]
In this way the service files are no longer exploded to webapp/WEB-INF/classes/META-INF/services but just live in the jar in which they are packaged according to:
myplugin.jar!/META-INF/services/com.example.plugin.MyPlugin
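As a final illustration, a small hedged sketch (a Scala example for consistency with the rest of this page; Scala 2.13+ assumed for CollectionConverters; the demo object name is made up, the other names come from the question) of how those service files are consumed once they end up on the classpath:
import java.util.ServiceLoader
import scala.jdk.CollectionConverters._

import vcs.validation.javatests.JavaTest

object JavaTestLoaderDemo extends App {
  // Reads META-INF/services/vcs.validation.javatests.JavaTest from the classpath
  // and instantiates every implementation listed there (Test1 and Test2).
  val tests = ServiceLoader.load(classOf[JavaTest]).asScala.toList
  tests.foreach(t => println(t.getClass.getName))
}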