I have been given code that was created by a vendor, and it seems their engineer did a lot of hardcoding in the unit tests.
I have a unit test for a function which outputs, as a string, the full absolute path of the report generated by the code.
Currently, the unit test/assertion that fails looks like this:
val reportPath = obj.getReportPath()
assert(reportPath.equals("file:/Users/khalid.mahmood/ReportingModule/target/report.csv"))
where ReportingModule is the name of the project.
The code logic is fine; for me, the value of the reportPath variable comes out to be:
file:/Users/vikas.saxena/coding_dir/ReportingModule/target/report.csv
Since I have the project cloned in a subdirectory called coding_dir in my home directory, the logic looks fine to me.
I want to modify the assertion so that the code picks up the base directory of the project by itself. On googling, I found that sbt has base as the equivalent of project.baseDir (from Maven) from this link.
However, the following code change hasn't worked out for me:
assert(reportPath.equals(s"""$base""" + "/target/report.csv"))
Can I get some pointers on how to get this right?
If you're using ScalaTest, you can use the ConfigMap to do it.
First you need to tell the ScalaTest Runner to add the path to the ConfigMap. This can be done in your .sbt file like so:
Test / testOptions += Tests.Argument(
  TestFrameworks.ScalaTest, s"-DmyParameter=${baseDirectory.value}")
(Note that it doesn't have to be baseDirectory.value; many other sbt settings will work. I would suggest target.value for your specific use case.)
In the test itself, you then need to access the value from the ConfigMap. The easiest way to do this is to use a Fixture Suite (such as FixtureAnyFunSuite) and mix in the ConfigMapFixture trait:
import org.scalatest.funsuite.FixtureAnyFunSuite
import org.scalatest.fixture.ConfigMapFixture

class ExampleTest extends FixtureAnyFunSuite with ConfigMapFixture {
  test("example") { configMap =>
    val myParameter = configMap.getRequired[String]("myParameter")
    // actual test logic goes here
    succeed
  }
}
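With that wiring in place, the original assertion can be written against the injected value instead of a hardcoded user path. A minimal sketch, assuming target.value was passed as myParameter (as suggested above) and that obj is in scope inside a suite like the one above:

test("report path is derived from the build's target directory") { configMap =>
  // the value sbt injected via -DmyParameter=...
  val targetDir = configMap.getRequired[String]("myParameter")
  val reportPath = obj.getReportPath()
  assert(reportPath == s"file:$targetDir/report.csv")
}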
There are of course other ways to solve the problem. For instance, you can also simply get the current working directory (cwd) and work from there. However the downside to that is that in multi-module builds, the cwd will be different depending on whether the Test / fork setting in sbt is true or false. So to make your code robust against these sorts of eventualities, I recommend sticking with the ConfigMap way.
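For reference, the cwd-based variant is only a couple of lines, with the fork caveat above applying:

// fragile: the cwd differs depending on Test / fork in multi-module builds
val projectDir = new java.io.File(".").getCanonicalPath
val reportPath = obj.getReportPath()
assert(reportPath == s"file:$projectDir/target/report.csv")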
I'd like flutter_test to find test files without the "_test" suffix. Where can I configure this pattern?
For Python with pytest, for instance, you'd change this setting in pytest.ini:
testpaths =
tests
python_files = *.py
What's the Dart way of doing this?
I have gone through your question in depth and found these things.
I checked the test package and dug deep into the source code to see how they actually check for _test.dart files. I found that in pubspec.yaml they have one dependency called glob (link), which I think they use to filter the files. I went through their code and found the particular lines for it (link).
I tried to fork the repository and change the pattern there, but it was still showing the same test files as before. So I tried a different approach.
I tried to look into the VS Code plugin for test to see if I could change the pattern there, but I couldn't find the exact module in which they define the path. In VS Code, we have an option in settings.json to search for test files outside of the test folder with this line:
"dart.allowTestsOutsideTestFolder": true
But there weren't any concrete options to change the test file search pattern. So my conclusion is that changing the search pattern would require changes in so many places that it could break things. Therefore I would suggest sticking to the convention.
Within Flutter there is no convenient option: you can specify which test files to execute, but this will result in a very heavy test script.
Such as:
flutter test test/file1.dart test/file2.dart ...
EDIT:
Based on the answer of Cavin Macwan, you can create a file in the root named dart_test.yaml with the following content:
filename: "*.dart"
Note: This only works with dart test and not flutter test
SBT lets you define autoplugins specific to your project by putting them in ./project.
I'm trying to add resources to one such autoplugin - by which I mean something that it could access through a call to getClass.getResourceAsStream.
I have, however, not been able to work out how to do that, or even whether it is possible. There's no documentation that I could find on the subject, and the obvious approach (simply putting resources in ./project alongside the plugin) fails.
Is what I'm trying to achieve possible?
Yes, you need to place your resource in ./project/src/main/resources/
For a quick demonstration that this works, assume the file name is test.txt, put the following in your build.sbt:
lazy val hello = taskKey[Unit]("prints the content of test.txt")
hello := println(IO.readStream(getClass.getResourceAsStream("test.txt")))
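For completeness, here is a minimal sketch of an autoplugin in ./project reading that same resource. The plugin name is an illustrative assumption, and the leading slash assumes test.txt sits at the root of project/src/main/resources/:

// project/ResourcePlugin.scala (hypothetical name)
import sbt._

object ResourcePlugin extends AutoPlugin {
  override def trigger = allRequirements

  // reads test.txt from the metabuild classpath
  def resourceText: String =
    IO.readStream(getClass.getResourceAsStream("/test.txt"))
}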
I just started using Puppet and I don't know how to execute classes.
I have the files config.pp, init.pp, install.pp, and service.pp.
For example install.pp :
class sshd::install{ ... }
Next, I declare my class in init.pp with "include sshd::install".
I also tried to run the classes with:
class{'sshd::install':} -> class{'sshd::config':} ~> class{'sshd::service':}
After that, I launch "puppet apply init.pp", but nothing happens.
My scripts work individually, but I don't know how to execute all my classes together.
Thanks
I'm not sure how much research you've done into Puppet and how its code is structured, but these may help:
Module Fundamentals
Digital Ocean's guide.
It appears that you are starting out with a basic module structure (based on your use of init/install/service), which is good. However, your execution approach is that of a direct manifest (not the module itself), which won't work within the module you are testing, because autoloading requires your files to be inside a valid module path.
Basically: You want to put your class/module structured code within Puppet's module path (puppet config print modulepath) then you want to use another manifest file (.pp) to include your class.
An example file structure:
/etc/puppetlabs/code/modules/sshd/manifests/init.pp
/etc/puppetlabs/code/modules/sshd/manifests/install.pp
/etc/puppetlabs/code/modules/sshd/manifests/service.pp
/tmp/my_manifest.pp
Your class sshd() { ... } code goes in init.pp, and class sshd::install() { ... } goes in install.pp, etc.
Then the 'my_manifest.pp' would look something like this:
include ::sshd
And you would apply with: puppet apply /tmp/my_manifest.pp.
Once this works, you can learn about the various approaches to applying manifests to your nodes (direct, like this, using an ENC, using a site.pp, etc... Feel free to do further reading).
Alternatively, as long as the module is within your modulepath (as mentioned above) you could simply do puppet apply -e 'include ::sshd'
In order to get the code that you have to operate the way you are expecting it to, it would need to look like this:
# Note: This is BAD code, do not reproduce/use
class sshd() {
class{'sshd::install':} ->
class{'sshd::config':} ~>
class{'sshd::service':}
}
include sshd
or something similar, which entirely breaks how the module structure works. (In fact, that code will not work unless the module is in the correct path, and it will display some VERY odd behavior if executed directly. Do not write code like that.)
I have written some source code in my build definition project (in project/src/main/scala) in SBT. Now I want to use these classes also in the project I am building. Is there a best practice? Currently I have created a custom task that copies the .scala files over.
Those seem like unnecessarily indirect mechanisms.
unmanagedSourceDirectories in Compile += baseDirectory.value / "project/src/main"
Sharing sourceDirectories as in extempore's answer is the simplest way to go about it, but unfortunately it won't work well with IntelliJ because the project model doesn't allow sharing source roots between multiple modules.
Seth Tisue's approach will work, but requires rebuilding to update sources.
To actually share the sources and have IntelliJ pick up on it directly, you can define a module within the build.
The following approach seems to only work in sbt 1.0+
Create a file project/metabuild.sbt:
val buildShared = project
val buildRoot = (project in file("."))
  .dependsOn(buildShared)
and in your build.sbt:
val buildShared = ProjectRef(file("project"), "buildShared")
val root = (project in file("."))
  .dependsOn(buildShared)
Then put your shared code in project/buildShared/src/main/scala/ and refresh; IntelliJ will then pick up buildShared as a separate module of your project.
Full example project: https://github.com/jastice/shared-build-sources
Can you make the following work? The source code for the classes in question should be part of your project, not part of your build definition; the "task which serializes a graph of Scala objects using Kryo and writes them as files into the classpath of the project" part sounds like a perfect job for resourceGenerators (see http://www.scala-sbt.org/0.13.2/docs/Howto/generatefiles.html).
Then the only remaining problem is how to reference the compiled classes from your resource generator. I'm not familiar with Kryo; in order to use it, do you need to have the compiled classes on the classpath at the time your generator is compiled, or do they just need to be on the classpath at runtime?
If the latter is sufficient, that's easier: you can get a classloader from the testLoader in Test key, load the classes, instantiate some objects via reflection, and then call Kryo.
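For illustration, the resourceGenerators hook mentioned above would look roughly like this in build.sbt; writeKryoFiles is a hypothetical stand-in for your Kryo serialization logic, not a real sbt or Kryo API:

Compile / resourceGenerators += Def.task {
  val outDir = (Compile / resourceManaged).value / "kryo"
  // writeKryoFiles (hypothetical) serializes the object graph into outDir
  // and returns the Seq[File] it wrote
  writeKryoFiles(outDir)
}.taskValue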
If you really need the compiled classes to be on the classpath when your resource generator is compiled, then you have a chicken-and-egg problem: the build can't be compiled until the project has been compiled, but of course the project can't be compiled before the build definition has been compiled, either. In that case it seems to me you have no choice other than:
1) the workaround you're already doing ("best practice" in this case would consist of using sourceGenerators to copy the sources out of your build definition and into target/src_managed)
2) put the classes in question in a separate project and depend on it from both your build and your project. This is the cleanest solution overall, but you might consider it too heavyweight; see the sketch below.
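A rough sketch of option 2, assuming the shared classes are published locally as a hypothetical com.example %% shared-stuff artifact:

// in project/plugins.sbt (makes the classes available to the build definition)
libraryDependencies += "com.example" %% "shared-stuff" % "0.1.0"

// in build.sbt (makes the same classes available to the project itself)
libraryDependencies += "com.example" %% "shared-stuff" % "0.1.0"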
Hope this helps. Interested in seeing others' opinions on this, too.
Having trouble accessing the compiled assets location in production.
My strategy has been to serve my assets from "app/assets/ui" in development and from "public" in production. This is done as shown below in my conf/routes file:
#{if(play.Play.mode.isDev())}
GET /assets/*file controllers.common.Assets.at(path="/app/assets/ui", file)
#{/}
#{else}
GET /assets/*file controllers.common.Assets.at(path="/public", file)
#{/}
Since I have defined asset mappings outside "public", I have added the following line in my Build.scala:
playAssetsDirectories <+= baseDirectory / "app/assets/ui"
As an example, my scripts are loaded conditionally depending on the environment, as shown below:
#if(play.Play.isDev()) {<script src="#routes.Assets.at("/app/assets/ui", "javascripts/application.js")" type="text/javascript"></script>} else {<script src="#routes.Assets.at("/public", "javascripts/application.min.js")" type="text/javascript"></script>}
I'm using Grunt for my frontend workflow and when the application builds it copies the distribution files to the application's public folder.
I start the app in production using "sbt clean compile stage" and then run the packaged app.
My problem is that the routes still appear to refer to the "app/assets/ui" folder instead of the distribution "public" folder.
Any tips on how I can debug this? My working background is as a front-end developer, so I'm very new to Play! and Scala.
As mentioned by #estmatic, your conditional in routes won't be evaluated.
As it's generally extremely useful to consolidate the differences between application Modes into files, I'd suggest you extend GlobalSettings (if you aren't already) and override the onLoadConfig method:
class Settings extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: File, classloader: ClassLoader, mode: Mode.Mode): Configuration = {
    val specificConfig: Configuration = ??? // use the mode param to load the appropriate file
    super.onLoadConfig(specificConfig, path, classloader, mode)
  }
  ...
}
You could then have appropriately-named files (dev.conf and production.conf spring to mind) that contain suitable values, one of them being the base path for the Assets controller to use.
EDIT: it turns out doing it this way makes usage in routes awkward; here's another approach.
This approach does not use a configuration file per-environment, which means that if something changes in the frontend configuration (e.g. it's no longer served up from /public) you'll have to change this code and re-deploy it. However, it fits into Play 2.x pretty nicely:
package controllers

import play.api.{Mode, Play}
import play.api.mvc.{Action, AnyContent}

object EnvironmentSpecificAssets extends AssetsBuilder {

  val modeToAssetsPathMap = Map(
    Mode.Dev -> "/app/assets/ui",
    Mode.Prod -> "/public")

  lazy val modePath = modeToAssetsPathMap(Play.current.mode)

  /** New single-argument `at`; determines its path from the current app mode */
  def at(file: String): Action[AnyContent] = at(modePath, file)
}
The code is pretty self-explanatory; the only "trick" is probably the lazy val, which means we only evaluate the current operating mode and do the map lookup once.
Now your routes file just looks like this:
GET /assets/*file controllers.EnvironmentSpecificAssets.at(file)
Playframework 2.x doesn't support conditional statements in the routes file. The 1.x versions had this but it was removed.
What you have in your routes file is simply two routes with the same URI pattern, /assets/*file. The other lines are just being ignored as comments, since they begin with the pound character (#). I think that since the pattern is the same for both, the first route is catching everything and the second isn't doing anything.
It's not exactly what you're trying to do, but I think if you make the route patterns a bit different, it should work.
GET /assets/dev/*file controllers.common.Assets.at(path="/app/assets/ui", file)
GET /assets/*file controllers.common.Assets.at(path="/public", file)