I have a file src/main/scala/foo.scala which needs to be inside package bar. Ideally the file should be inside src/main/scala/bar/foo.scala.
// src/main/scala/foo.scala
package bar
// ...
How can I auto-fix this issue throughout my project such that the folder structure matches the package structure?
Is there any SBT plugin etc that can help me fix this issue?
As far as I am aware there are no such tools, though as far as I remember IntelliJ can warn about package-directory mismatches.
The best I can think of is a custom scalafix (https://scalacenter.github.io/scalafix/) rule - scalafix/scalameta would be used to check the file's actual package, translate it to an expected directory, and move the file if the two differ.
I suggest scalafix/scalameta because there are corner cases like:
you are allowed to write your packages like:
package a
package b
package c
and it is almost like package a.b.c, except that it automatically imports everything from a and b
you can have a package object in your file, and then if you have
package a.b
package object c
this file should be in the a/b/c directory
so I would prefer to use some existing tooling to check that a file doesn't fall under any of those cases.
If you are certain that you don't have such cases (I wouldn't be without checking) you could:
match the first line with a regexp (^package (.*))
translate a.b.c into a/b/c (matched.split('.').map(_.trim).mkString(File.separator))
compare the generated location to the actual location (I suggest resolving absolute file locations)
move the file if necessary
If there is a possibility of a more complex case than that, I could replace the first step by querying scalafix/scalameta utilities; a rough sketch of the simple variant follows below.
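A rough sketch of that simple variant, using only the JDK and scala.io (the helper name and paths are illustrative, and it deliberately ignores the corner cases listed above):
import java.io.File
import java.nio.file.{Files, StandardCopyOption}
import scala.io.Source

// Illustrative helper: moves one source file so that its directory matches
// the first (and assumed only) package clause in the file.
def moveToPackageDir(sourceBase: File, sourceFile: File): Unit = {
  val src = Source.fromFile(sourceFile)
  val packageLine =
    try src.getLines().find(_.trim.startsWith("package")) finally src.close()

  packageLine.foreach { line =>
    val packageName  = line.trim.stripPrefix("package").trim // e.g. "a.b.c"
    val expectedDir  = new File(sourceBase, packageName.split('.').map(_.trim).mkString(File.separator))
    val expectedFile = new File(expectedDir, sourceFile.getName)
    // compare absolute locations and move only if they differ
    if (sourceFile.getAbsoluteFile != expectedFile.getAbsoluteFile) {
      expectedDir.mkdirs()
      Files.move(sourceFile.toPath, expectedFile.toPath, StandardCopyOption.REPLACE_EXISTING)
    }
  }
}

// usage (paths assumed): moveToPackageDir(new File("src/main/scala"), new File("src/main/scala/foo.scala"))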
Here is an sbt plugin providing a packageStructureToDirectoryStructure task that reads package statements from source files, creates the corresponding directories, and then moves the files into them:
import sbt._
import sbt.Keys._
import better.files._

object PackagesToDirectories extends AutoPlugin {

  object autoImport {
    val packageStructureToDirectoryStructure =
      taskKey[Unit]("Make directory structure match package structure")
  }
  import autoImport._

  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    packageStructureToDirectoryStructure := {
      val log = streams.value.log
      log.info(s"Refactoring directory structure to match package structure...")

      val sourceFiles = (Compile / sources).value
      val sourceBase  = (Compile / scalaSource).value

      // Derive the expected directory path from the package declarations in a file,
      // handling package objects, nested package blocks, and plain package clauses.
      def packageStructure(lines: Traversable[String]): String = {
        val packageObjectRegex  = """package object\s(.+)\s\{""".r
        val packageNestingRegex = """package\s(.+)\s\{""".r
        val packageRegex        = """package\s(.+)""".r

        lines
          .collect {
            case packageObjectRegex(name)  => name
            case packageNestingRegex(name) => name
            case packageRegex(name)        => name
          }
          .flatMap(_.split('.'))
          .mkString("/")
      }

      sourceFiles.foreach { sourceFile =>
        val packagePath = packageStructure(sourceFile.toScala.lines)
        val destination = file"$sourceBase/$packagePath"
        destination.createDirectoryIfNotExists(createParents = true)
        val result = sourceFile.toScala.moveToDirectory(destination)
        log.info(s"$sourceFile moved to $result")
      }
    }
  )
}
WARNING: Make sure to backup the project before running it.
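If you want to try it, one possible setup (a sketch; the file location and better-files version are assumptions) is to drop the plugin into project/PackagesToDirectories.scala and make better-files available to the build definition:
// project/plugins.sbt - the better-files version is an assumption, adjust to your build
libraryDependencies += "com.github.pathikrit" %% "better-files" % "3.9.1"
After reloading, the packageStructureToDirectoryStructure task can be run from the sbt shell.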
Related
I have to list all files whose names contain a timestamp greater than that of a particular file, based on the timestamp in the naming pattern, in Scala. Below is an example.
Files available:
log_20200601T123421.log
log_20200601T153432.log
log_20200705T093425.log
log_20200803T049383.log
Condition file:
log_20200601T123421.log - I need to list all the file names whose timestamp is greater than or equal to 20200601T123421. The result would be:
Output list:
log_20200601T153432.log
log_20200705T093425.log
log_20200803T049383.log
How can I achieve this in Scala? I was trying with Apache Commons, but I couldn't find a greater-than-or-equal NameFileFilter for it.
Perhaps the following code snippet could be a starting point:
import java.io.File
def getListOfFiles(dir: File): List[File] =
  dir.listFiles.filter(x => x.getName > "log_20200601T123421.log").toList
val files = getListOfFiles(new File("/tmp"))
For the extended task to collect files from different sub-directories:
import java.io.File
def recursiveListFiles(f: File): Array[File] = {
  val these = f.listFiles
  these ++ these.filter(_.isDirectory).flatMap(recursiveListFiles)
}
val files = recursiveListFiles(new File("/tmp")).filter(x => x.getName > "log_20200601T123421.log")
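Since the question asks for "greater than or equal" and the timestamps are zero-padded and fixed-width, a plain string comparison on the timestamp part preserves chronological order. A sketch along those lines (the helper names are illustrative):
import java.io.File

// Keeps files whose embedded timestamp is >= the one in the condition file.
// Assumes the fixed naming pattern log_<yyyyMMdd'T'HHmmss>.log, so lexicographic
// comparison of the timestamp strings matches chronological order.
def filesAtOrAfter(dir: File, conditionFile: String): List[File] = {
  def timestamp(name: String): String = name.stripPrefix("log_").stripSuffix(".log")
  val threshold = timestamp(conditionFile)
  dir.listFiles.toList
    .filter(f => f.getName.startsWith("log_") && timestamp(f.getName) >= threshold)
}

val files = filesAtOrAfter(new File("/tmp"), "log_20200601T123421.log")
Use > instead of >= if the condition file itself should be excluded, as in the sample output.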
I have a localization resource file I need access from scala.js. It needs to be local to the script execution environment (i.e., not loaded asynchronously from a server, as recommended at How to read a resource file in Scala.js?).
Is there any mechanism for embedding the contents of a small resource file directly into the generated javascript compiled from a scala.js file? Basically, I want something like:
object MyResource {
  @EmbeddedResource(URL("/my/package/localized.txt"))
  val resourceString: String = ???
}
This would obviously bloat the generated .js file somewhat, but that is an acceptable tradeoff for my application. It seems like this wouldn't be an uncommon need and that this macro ought to already exist somewhere, but my initial searches haven't turned anything up.
If you are using sbt, you can use a source generator that reads your resource file and serializes it in a val inside an object:
sourceGenerators in Compile += Def.task {
  val baseDir = baseDirectory.value / "custom-resources" // or whatever
  val resourceFile = baseDir / "my/package/localized.txt"
  val sourceDir = (sourceManaged in Compile).value
  val sourceFile = sourceDir / "Localized.scala"
  if (!sourceFile.exists() ||
      sourceFile.lastModified() < resourceFile.lastModified()) {
    val content = IO.read(resourceFile).replaceAllLiterally("$", "$$")
    val scalaCode =
      s"""
      package my.package.localized
      object Localized {
        final val content = raw\"\"\"$content\"\"\"
      }
      """
    IO.write(sourceFile, scalaCode)
  }
  Seq(sourceFile)
}.taskValue
If you are using another build tool, I am sure there is a similar concept of source generators that you can use.
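On the Scala.js side the generated object is ordinary Scala code, so the string ends up inlined in the emitted .js. A hypothetical call site (the my.package name in the snippet above is only a placeholder, since package is a reserved word; adjust the import to whatever package the generator actually writes):
// hypothetical usage, assuming the generator emits the object into my.localized
import my.localized.Localized

val text: String = Localized.content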
I have a folder structure like below:
- main
-- java
-- resources
-- scalaresources
--- commandFiles
and in those folders I have the files that I have to read.
Here is the code:
def readData(runtype: String, snmphost: String, comstring: String, specificType: String): Unit = {
  // these files are under the commandFiles folder, which I have to read
  val realOrInvFile = "/commandFiles/snmpcmds." + runtype.trim
  try {
    if (specificType.equalsIgnoreCase("Cisco")) {
      val specificDeviceFile: String = "/commandFiles/snmpcmds." + runtype.trim + ".cisco"
      val realOrInvCmdsList = scala.io.Source.fromFile(realOrInvFile).getLines().toList
        .filterNot(line => line.startsWith("#")).map {
          //some code
        }
      val specificCmdsList = scala.io.Source.fromFile(specificDeviceFile).getLines().toList
        .filterNot(line => line.startsWith("#")).map {
          //some code
        }
    }
  } catch {
    case e: Exception => e.printStackTrace
  }
}
} // closes an enclosing class/object not shown in the snippet
Resources in Scala work exactly as they do in Java.
It is best to follow the Java best practices and put all resources in src/main/resources and src/test/resources.
Example folder structure:
testing_styles/
├── build.sbt
├── src
│ └── main
│ ├── resources
│ │ └── readme.txt
Scala 2.12.x && 2.13.x reading a resource
To read resources the object Source provides the method fromResource.
import scala.io.Source
val readmeText : Iterator[String] = Source.fromResource("readme.txt").getLines
reading resources prior to 2.12 (still my favourite due to jar compatibility)
To read resources you can use getClass.getResource and getClass.getResourceAsStream .
import java.io.InputStream

val stream: InputStream = getClass.getResourceAsStream("/readme.txt")
val lines: Iterator[String] = scala.io.Source.fromInputStream(stream).getLines
nicer error feedback (2.12.x && 2.13.x)
To avoid undebuggable Java NPEs, consider:
import scala.util.{Failure, Try}
import scala.io.Source
import java.io.FileNotFoundException

object Example {
  def readResourceWithNiceError(resourcePath: String): Try[Iterator[String]] =
    Try(Source.fromResource(resourcePath).getLines)
      .recoverWith { case _ => Failure(new FileNotFoundException(resourcePath)) }
}
good to know
Keep in mind that getResourceAsStream also works fine when the resources are part of a jar; getResource, which returns a URL that is often used to create a file, can lead to problems there.
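A small illustration of the difference (readme.txt as above):
import java.io.{File, InputStream}

// Works both from the plain filesystem and from inside a jar:
val stream: InputStream = getClass.getResourceAsStream("/readme.txt")

// Fragile: inside a jar the URL looks like "jar:file:/app.jar!/readme.txt",
// which is not a filesystem path, so the resulting File cannot be read.
val asFile: File = new File(getClass.getResource("/readme.txt").getPath)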
in Production
In production code I suggest making sure that the source is closed again.
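A minimal sketch of that, assuming Scala 2.13+ where scala.util.Using is available:
import scala.io.Source
import scala.util.{Try, Using}

// Reads the whole resource eagerly and closes the source afterwards,
// whether or not reading succeeded.
def readResourceLines(resourcePath: String): Try[List[String]] =
  Using(Source.fromResource(resourcePath))(_.getLines().toList)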
For Scala >= 2.12, use Source.fromResource:
scala.io.Source.fromResource("located_in_resources.any")
One-liner solution for Scala >= 2.12
val source_html = Source.fromResource("file.html").mkString
Important, taken from the comments (thanks to @anentropic): with Source.fromResource you do not put the initial forward slash.
import scala.io.Source

object Demo {
  def main(args: Array[String]): Unit = {
    val ipfileStream = getClass.getResourceAsStream("/folder/a-words.txt")
    val readlines = Source.fromInputStream(ipfileStream).getLines
    readlines.foreach(line => println(line))
  }
}
The required file can be accessed as below from the resources folder in Scala (note that this reads from the source path, so it works when running from the project directory but not from a packaged jar):
val file = scala.io.Source.fromFile("src/main/resources/app.config").getLines().mkString
For Scala 2.11, if getLines doesn't do exactly what you want, you can also copy a file out of the jar to the local file system.
Here's a snippet that reads a binary Google .p12 format API key from /resources, writes it to /tmp, and then uses the file path string as an input to a spark-google-spreadsheets write.
In the world of sbt-native-packager and sbt-assembly, copying to local is also useful with scalatest binary file tests. Just pop them out of resources to local, run the tests, and then delete.
import java.io.{File, FileOutputStream}
import java.nio.file.{Files, Paths}

def resourceToLocal(resourcePath: String) = {
  val outPath = "/tmp/" + resourcePath
  if (!Files.exists(Paths.get(outPath))) {
    val resourceFileStream = getClass.getResourceAsStream(s"/${resourcePath}")
    val fos = new FileOutputStream(outPath)
    fos.write(
      Stream.continually(resourceFileStream.read).takeWhile(-1 !=).map(_.toByte).toArray
    )
    fos.close()
  }
  outPath
}
val filePathFromResourcesDirectory = "google-docs-key.p12"
val serviceAccountId = "[something]@drive-integration-[something].iam.gserviceaccount.com"
val googleSheetId = "1nC8Y3a8cvtXhhrpZCNAsP4MBHRm5Uee4xX-rCW3CW_4"
val tabName = "Favorite Cities"
import spark.implicits._
val df = Seq(
  ("Brooklyn", "New York"),
  ("New York City", "New York"),
  ("San Francisco", "California")
).toDF("City", "State")

df.write.
  format("com.github.potix2.spark.google.spreadsheets").
  option("serviceAccountId", serviceAccountId).
  option("credentialPath", resourceToLocal(filePathFromResourcesDirectory)).
  save(s"${googleSheetId}/${tabName}")
The "resources" folder must be under the source root. if using intellj check for the blue folder in the project folders on the left side. eg AppName/src/main/scala or Project/scala/../main/ etc.
If using val stream: InputStream = getClass.getResourceAsStream("/readme.txt") don't forget the "/" (forward slash), given readme.txt is the file inside resources
I have recently been learning Scala, and packages in Scala confuse me.
I have a file named StockPriceFinder.scala:
// StockPriceFinder.scala
object StockPriceFinder {
  def getTickersAndUnits() = {
    val stocksAndUnitsXML = scala.xml.XML.load("stocks.xml")
    (Map[String, Int]() /: (stocksAndUnitsXML \ "symbol")) { (map, symbolNode) =>
      val ticker = (symbolNode \ "@ticker").toString
      val units = (symbolNode \ "units").text.toInt
      map ++ Map(ticker -> units)
    }
  }
}
then I want to use StockPriceFinder in test.scala which is in the same folder:
val symbolAndUnits = StockPriceFinder.getTickersAndUnits
but when I run it with scala test.scala, I get the error error: not found: value StockPriceFinder. In Java, if these two source files are in the same folder, I do not need to import anything and can use the class directly, so how can I import StockPriceFinder correctly in Scala?
I have tried to use import StockPriceFinder in test.scala, but it still does not work.
You don't need to import StockPriceFinder if the files are in the same package (not folder).
But you do need to compile StockPriceFinder.scala first and pass the correct classpath to scala.
scalac StockPriceFinder.scala
scala -cp . test.scala
should work (might be a bit off). However, you shouldn't do it manually, since it becomes unmanageable very quickly; use SBT or other build tools (Maven, Gradle).
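For example, a minimal sbt setup could look like this (a sketch; the Scala and scala-xml versions are assumptions, and scala-xml is listed because it is no longer bundled with the standard library in recent Scala versions):
// build.sbt - put StockPriceFinder.scala and the calling code under src/main/scala/
name := "stocks"

scalaVersion := "2.13.14"

// scala.xml.XML.load needs this module on recent Scala versions
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "2.2.0"
sbt compile (or sbt run) then takes care of compilation order and the classpath for you.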
How do I get SBT staging directory at build time?
I want to do a tricky clone of a remote repo, and the stagingDirectory of SBT seems to be a nice fit.
How do I get the directory inside "Build.scala" ?
SBT source code:
http://www.scala-sbt.org/0.13.1/sxr/sbt/BuildPaths.scala.html#sbt.BuildPaths.stagingDirectory
=======
Underlying problem NOT directly relevant to the question. I wanted to use a subdirectory of a git dependency in SBT. SBT doesn't provide this out of the box, so I wrote a simple wrapper:
object Git {
  def clone(cloneFrom: String, branch: String, subdirectory: String) = {
    val uniqueHash = Hash.halfHashString(cloneFrom + branch)
    val cloneTo = file(sys.props("user.home")) / ".sbt" / "staging" / uniqueHash
    val clonedDir = creates(cloneTo) {
      Resolvers.run("git", "clone", cloneFrom, cloneTo.absolutePath)
      Resolvers.run(Some(cloneTo), "git", "checkout", "-q", branch)
    }
    clonedDir / subdirectory
  }
}
usage:
lazy val myDependency = Git.clone(cloneFrom = "git://...someproject.git", branch = "v2.4", subdirectory = "someModule")
Looking at the API from your link, there are two methods you can use, getGlobalBase and getStagingDirectory; both take the state.
import sbt._
import Keys._
import sbt.BuildPaths._

object MyBuild extends Build {

  val outputStaging = taskKey[Unit]("Outputs staging")

  lazy val root = project.in(file(".")).settings(
    outputStaging := {
      val s = state.value
      println(getStagingDirectory(s, getGlobalBase(s)))
    }
  )
}
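Running outputStaging from the sbt shell should then print the resolved staging directory; with default global settings that is typically a path under ~/.sbt/<sbt-version>/staging.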
Edit
After your last comment, I think you're looking for a custom resolver. The custom resolver has access to a ResolveInfo object, which has a property called staging.
For example this is how you could achieve what you're looking for (actually without accessing staging directly):
object MyBuild extends Build {

  lazy val root = project.in(file(".")).dependsOn(
    RootProject(uri("dir+git://github.com/lpiepiora/test-repo.git#branch=master&dir=subdir"))
  )

  override def buildLoaders = BuildLoader.resolve(myCustomGitResolver) +: super.buildLoaders

  def myCustomGitResolver(info: BuildLoader.ResolveInfo): Option[() => File] =
    if (info.uri.getScheme != "dir+git") None
    else {
      import RichURI.fromURI
      val uri = info.uri
      val (branch, directory) = parseOutBranchNameAndDir(uri.getFragment)
      val gitResolveInfo = new BuildLoader.ResolveInfo(
        uri.copy(scheme = "git", fragment = branch), info.staging, info.config, info.state
      )
      println(uri.copy(scheme = "git", fragment = branch))
      Resolvers.git(gitResolveInfo).map(fn => () => fn() / directory)
    }

  // just an ugly way to get the branch and the folder
  // you may want something more sophisticated
  private def parseOutBranchNameAndDir(fragment: String): (String, String) = {
    val Array(branch, dir) = fragment.split('&')
    (branch.split('=')(1), dir.split('=')(1))
  }
}
The idea is that we delegate to the predefined git resolver and let it do its work; after it's done, we return a subdirectory: fn() / directory.
This is an example and of course you could stick to your logic of getting the repository. The staging directory will be available to you in the resolver method.