I have a localization resource file that I need to access from Scala.js. It needs to be local to the script execution environment (i.e., not loaded asynchronously from a server, as recommended in "How to read a resource file in Scala.js?").
Is there any mechanism for embedding the contents of a small resource file directly into the JavaScript generated by the Scala.js compiler? Basically, I want something like:
object MyResource {
  @EmbeddedResource(URL("/my/package/localized.txt"))
  val resourceString: String = ???
}
This would obviously bloat the generated .js file somewhat, but that is an acceptable tradeoff for my application. It seems like this wouldn't be an uncommon need and that this macro ought to already exist somewhere, but my initial searches haven't turned anything up.
If you are using sbt, you can use a source generator that reads your resource file and serializes it in a val inside an object:
sourceGenerators in Compile += Def.task {
  val baseDir = baseDirectory.value / "custom-resources" // or whatever
  val resourceFile = baseDir / "my/package/localized.txt"

  val sourceDir = (sourceManaged in Compile).value
  val sourceFile = sourceDir / "Localized.scala"

  if (!sourceFile.exists() ||
      sourceFile.lastModified() < resourceFile.lastModified()) {
    val content = IO.read(resourceFile).replaceAllLiterally("$", "$$")
    val scalaCode =
      s"""
      package my.package.localized
      object Localized {
        final val content = raw\"\"\"$content\"\"\"
      }
      """
    IO.write(sourceFile, scalaCode)
  }

  Seq(sourceFile)
}.taskValue
If you are using another build tool, I am sure there is a similar concept of source generators that you can use.
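Assuming the object emitted by the generator above (with the placeholder package replaced by a real one, say my.pkg.localized), the embedded text is then available to Scala.js code as an ordinary string constant; a minimal sketch:

import my.pkg.localized.Localized

object MyResource {
  // The file contents were baked into the generated source at build time,
  // so this is just a string literal in the emitted .js.
  val resourceString: String = Localized.content
}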
Related
I have a file src/main/scala/foo.scala which needs to be inside package bar. Ideally the file should be at src/main/scala/bar/foo.scala.
// src/main/scala/foo.scala
package bar
// ...
How can I auto-fix this issue throughout my project such that the folder structure matches the package structure?
Is there any SBT plugin etc that can help me fix this issue?
As far as I am aware there are no such tools, though AFAIR IntelliJ can warn about package-directory mismatches.
The best I can think of is a custom Scalafix (https://scalacenter.github.io/scalafix/) rule - scalafix/scalameta would be used to check a file's actual package, translate it to the expected directory and, if they differ, move the file.
I suggest scalafix/scalameta because there are corner cases like:
you are allowed to write your packages like:
package a
package b
package c
and it is almost like package a.b.c, except that it automatically imports everything from a and b
you can have a package object in your file, and then if you have
package a.b
package object c
this file should be in the a/b/c directory
so I would prefer to check whether a file falls under any of those cases using some existing tooling.
If you are certain that you don't have such cases (I wouldn't be without checking) you could (a sketch follows after these steps):
match the first line with a regexp (^package (.*))
translate a.b.c into a/b/c (matched.split('.').map(_.trim).mkString(File.separator))
compare the generated location to the actual location (I suggest resolving absolute file locations)
move the file if necessary
If there is a possibility of a more complex case than that, I would replace the first step by querying scalafix/scalameta utilities.
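A minimal sketch of those four steps, assuming every file starts with a single, simple package a.b.c header (paths and names below are illustrative, not from the original answer):

import java.io.File
import java.nio.file.{Files, StandardCopyOption}
import scala.io.Source

// Illustrative only: no nested package clauses, no package objects.
val sourceRoot = new File("src/main/scala")
val packageLine = """^package\s+(\S+)""".r

def scalaFiles(dir: File): Seq[File] =
  Option(dir.listFiles).toSeq.flatten.flatMap {
    case d if d.isDirectory => scalaFiles(d)
    case f if f.getName.endsWith(".scala") => Seq(f)
    case _ => Seq.empty[File]
  }

def fixLocation(file: File): Unit = {
  val source = Source.fromFile(file)
  val pkg = try source.getLines().collectFirst { case packageLine(p) => p } finally source.close()
  pkg.foreach { p =>
    val expectedDir = new File(sourceRoot, p.split('.').map(_.trim).mkString(File.separator))
    // only move the file when its directory does not match its package
    if (file.getParentFile.getCanonicalFile != expectedDir.getCanonicalFile) {
      expectedDir.mkdirs()
      Files.move(file.toPath, expectedDir.toPath.resolve(file.getName), StandardCopyOption.REPLACE_EXISTING)
    }
  }
}

scalaFiles(sourceRoot).foreach(fixLocation)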
Here is an sbt plugin providing a packageStructureToDirectoryStructure task that reads package statements from source files, creates corresponding directories, and then moves files to them.
import sbt._
import sbt.Keys._
import better.files._

object PackagesToDirectories extends AutoPlugin {
  object autoImport {
    val packageStructureToDirectoryStructure = taskKey[Unit]("Make directory structure match package structure")
  }
  import autoImport._

  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    packageStructureToDirectoryStructure := {
      val log = streams.value.log
      log.info(s"Refactoring directory structure to match package structure...")

      val sourceFiles = (Compile / sources).value
      val sourceBase = (Compile / scalaSource).value

      def packageStructure(lines: Traversable[String]): String = {
        val packageObjectRegex = """package object\s(.+)\s\{""".r
        val packageNestingRegex = """package\s(.+)\s\{""".r
        val packageRegex = """package\s(.+)""".r

        lines
          .collect {
            case packageObjectRegex(name) => name
            case packageNestingRegex(name) => name
            case packageRegex(name) => name
          }
          .flatMap(_.split('.'))
          .mkString("/")
      }

      sourceFiles.foreach { sourceFile =>
        val packagePath = packageStructure(sourceFile.toScala.lines)
        val destination = file"$sourceBase/$packagePath"
        destination.createDirectoryIfNotExists(createParents = true)
        val result = sourceFile.toScala.moveToDirectory(destination)
        log.info(s"$sourceFile moved to $result")
      }
    }
  )
}
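To use it, the plugin source above can live under the project/ directory; since it relies on better-files, the build definition itself needs that dependency as well. A sketch for project/plugins.sbt (the version number is only illustrative):

// project/plugins.sbt - make better-files available to the build definition
libraryDependencies += "com.github.pathikrit" %% "better-files" % "3.9.1"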
WARNING: Make sure to back up the project before running it.
I'm new to ScalaPB and protobuf.
I'm trying to create unit tests for my ScalaPB generators. I've generated the proto files and I'm trying to use them in tests.
I've got this proto file:
syntax = "proto3";

package hellogrpc.calc;

import "scalapb/scalapb.proto";
import "google/api/annotations.proto";

option (scalapb.options) = {
  flat_package: true
};

service CalcService {
  rpc CalcSum (SumRequest) returns (CalcResponse) {
    option (google.api.http) = {
      post: "/calcService/sum"
      body: "*"
    };
  }
}
There is a method CalcSum which is annotated with the google.api.http option.
And the corresponding generated Scala file:
// Generated by the Scala Plugin for the Protocol Buffer Compiler.
// Do not edit!
//
// Protofile syntax: PROTO3
package hellogrpc.calc
object CalcServiceProto extends _root_.com.trueaccord.scalapb.GeneratedFileObject {
  lazy val dependencies: Seq[_root_.com.trueaccord.scalapb.GeneratedFileObject] = Seq(
    com.trueaccord.scalapb.scalapb.ScalapbProto,
    com.google.api.annotations.AnnotationsProto
  )
  lazy val messagesCompanions: Seq[_root_.com.trueaccord.scalapb.GeneratedMessageCompanion[_]] = Seq(
    hellogrpc.calc.SumRequest,
    hellogrpc.calc.CalcResponse
  )
  private lazy val ProtoBytes: Array[Byte] =
    com.trueaccord.scalapb.Encoding.fromBase64(scala.collection.Seq(
      """ChtoZWxsb2dycGMvQ2FsY1NlcnZpY2UucHJvdG8SDmhlbGxvZ3JwYy5jYWxjGhVzY2FsYXBiL3NjYWxhcGIucHJvdG8aHGdvb
2dsZS9hcGkvYW5ub3RhdGlvbnMucHJvdG8iKAoKU3VtUmVxdWVzdBIMCgFhGAEgASgFUgFhEgwKAWIYAiABKAVSAWIiJgoMQ2FsY
1Jlc3BvbnNlEhYKBnJlc3VsdBgBIAEoBVIGcmVzdWx0Mm8KC0NhbGNTZXJ2aWNlEmAKB0NhbGNTdW0SGi5oZWxsb2dycGMuY2FsY
y5TdW1SZXF1ZXN0GhwuaGVsbG9ncnBjLmNhbGMuQ2FsY1Jlc3BvbnNlIhuC0+STAhUiEC9jYWxjU2VydmljZS9zdW06ASpCBeI/A
hABYgZwcm90bzM="""
    ).mkString)
  lazy val scalaDescriptor: _root_.scalapb.descriptors.FileDescriptor = {
    val scalaProto = com.google.protobuf.descriptor.FileDescriptorProto.parseFrom(ProtoBytes)
    _root_.scalapb.descriptors.FileDescriptor.buildFrom(scalaProto, dependencies.map(_.scalaDescriptor))
  }
  lazy val javaDescriptor: com.google.protobuf.Descriptors.FileDescriptor = {
    val javaProto = com.google.protobuf.DescriptorProtos.FileDescriptorProto.parseFrom(ProtoBytes)
    com.google.protobuf.Descriptors.FileDescriptor.buildFrom(javaProto, Array(
      com.trueaccord.scalapb.scalapb.ScalapbProto.javaDescriptor,
      com.google.api.annotations.AnnotationsProto.javaDescriptor
    ))
  }
  @deprecated("Use javaDescriptor instead. In a future version this will refer to scalaDescriptor.", "ScalaPB 0.5.47")
  def descriptor: com.google.protobuf.Descriptors.FileDescriptor = javaDescriptor
}
I inspected CalcServiceProto.javaDescriptor in IntelliJ IDEA. The method descriptor has this proto definition:
name: "CalcSum"
input_type: ".hellogrpc.calc.SumRequest"
output_type: ".hellogrpc.calc.CalcResponse"
options {
72295728: "\"\020/calcService/sum:\001*"
}
But the generator works just fine! I debugged the generator with a breakpoint set in it, and the method CalcSum has this proto definition:
name: "CalcSum"
input_type: ".hellogrpc.calc.SumRequest"
output_type: ".hellogrpc.calc.CalcResponse"
options {
[google.api.http] {
post: "/calcService/sum"
body: "*"
}
}
Maybe this works differently because I didn't register extensions like the generator does.
Anyway, I want this test to pass:
val s = CalcServiceProto.javaDescriptor.getServices.get(0)
val m = s.getMethods.get(0)
m.getOptions.getExtension(AnnotationsProto.http).getPost shouldBe "/calcService/sum"
If you need Java extensions to be available you need to generate your code with Java conversions enabled. This will make javaDescriptor rely on the official Java implementation and your test will pass.
When Java conversions are disabled, ScalaPB parses the descriptor, but it can't guarantee the Java extensions are even compiled, so it doesn't attempt to register them.
What I'd like to have is that the Scala descriptors would work in this case, however they are not supporting services and methods yet. I filed https://github.com/scalapb/ScalaPB/issues/382 to track progress on this.
In the meantime, like I wrote above, you can use Java conversions to force ScalaPB to provide you with the Java descriptor. In your build.sbt, have:
PB.targets in Compile := Seq(
  scalapb.gen(grpc = true, javaConversions = true) -> (sourceManaged in Compile).value,
  PB.gens.java -> (sourceManaged in Compile).value
)
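As an alternative sketch (not what the answer above recommends): if the generated Java classes for google/api/annotations.proto are on the classpath, you could also re-parse the method options with the extension registered explicitly. The names below assume the standard Java codegen for annotations.proto:

import com.google.api.AnnotationsProto
import com.google.protobuf.DescriptorProtos.MethodOptions
import com.google.protobuf.ExtensionRegistry

// Register the google.api.http extension, then re-parse the options bytes so the
// extension is resolved instead of staying in the unknown fields.
val registry = ExtensionRegistry.newInstance()
AnnotationsProto.registerAllExtensions(registry)

val method = CalcServiceProto.javaDescriptor.getServices.get(0).getMethods.get(0)
val options = MethodOptions.parseFrom(method.getOptions.toByteString, registry)
options.getExtension(AnnotationsProto.http).getPost // "/calcService/sum"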
I have a folder structure like below:
- main
-- java
-- resources
-- scalaresources
--- commandFiles
and in those folders I have the files that I have to read.
Here is the code:
def readData(runtype: String, snmphost: String, comstring: String, specificType: String): Unit = {
  val realOrInvFile = "/commandFiles/snmpcmds." + runtype.trim // these files are under the commandFiles folder, which I have to read
  try {
    if (specificType.equalsIgnoreCase("Cisco")) {
      val specificDeviceFile: String = "/commandFiles/snmpcmds." + runtype.trim + ".cisco"
      val realOrInvCmdsList = scala.io.Source.fromFile(realOrInvFile).getLines().toList.filterNot(line => line.startsWith("#")).map {
        // some code
      }
      val specificCmdsList = scala.io.Source.fromFile(specificDeviceFile).getLines().toList.filterNot(line => line.startsWith("#")).map {
        // some code
      }
    }
  } catch {
    case e: Exception => e.printStackTrace
  }
}
Resources in Scala work exactly as they do in Java.
It is best to follow the Java best practices and put all resources in src/main/resources and src/test/resources.
Example folder structure:
testing_styles/
├── build.sbt
├── src
│ └── main
│ ├── resources
│ │ └── readme.txt
Scala 2.12.x && 2.13.x reading a resource
To read resources the object Source provides the method fromResource.
import scala.io.Source
val readmeText : Iterator[String] = Source.fromResource("readme.txt").getLines
Reading resources prior to 2.12 (still my favourite due to jar compatibility)
To read resources you can use getClass.getResource and getClass.getResourceAsStream.
import java.io.InputStream
val stream: InputStream = getClass.getResourceAsStream("/readme.txt")
val lines: Iterator[String] = scala.io.Source.fromInputStream(stream).getLines
nicer error feedback (2.12.x && 2.13.x)
To avoid undebuggable Java NPEs, consider:
import scala.util.{Failure, Try}
import scala.io.Source
import java.io.FileNotFoundException

object Example {
  def readResourceWithNiceError(resourcePath: String): Try[Iterator[String]] =
    Try(Source.fromResource(resourcePath).getLines)
      .recoverWith { case _ => Failure(new FileNotFoundException(resourcePath)) }
}
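For example, a caller can then pattern match on the Try (the Success/Failure handling is shown only to illustrate the error reporting):

import scala.util.{Failure, Success}

Example.readResourceWithNiceError("readme.txt") match {
  case Success(lines) => lines.foreach(println)
  case Failure(e)     => println(s"could not read resource: ${e.getMessage}")
}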
good to know
Keep in mind that getResourceAsStream also works fine when the resources are part of a jar; getResource, which returns a URL that is often used to create a File, can lead to problems there.
in Production
In production code I suggest making sure that the source is closed again.
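On 2.13 a minimal sketch is scala.util.Using, which closes the Source whether reading succeeds or fails:

import scala.io.Source
import scala.util.{Try, Using}

// Using releases the Source after the function returns, even on failure.
val lines: Try[List[String]] =
  Using(Source.fromResource("readme.txt"))(_.getLines().toList)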
For Scala >= 2.12, use Source.fromResource:
scala.io.Source.fromResource("located_in_resources.any")
One-liner solution for Scala >= 2.12
val source_html = Source.fromResource("file.html").mkString
Important, taken from the comments (thanks to @anentropic): with Source.fromResource you do not put the initial forward slash.
import scala.io.Source

object Demo {
  def main(args: Array[String]): Unit = {
    val ipfileStream = getClass.getResourceAsStream("/folder/a-words.txt")
    val readlines = Source.fromInputStream(ipfileStream).getLines
    readlines.foreach(line => println(line))
  }
}
The required file can be accessed as below from the resources folder in Scala:
val file = scala.io.Source.fromFile("src/main/resources/app.config").getLines().mkString
For Scala 2.11, if getLines doesn't do exactly what you want, you can also copy a file out of the jar to the local file system.
Here's a snippet that reads a binary Google .p12-format API key from /resources, writes it to /tmp, and then uses the file path string as an input to a spark-google-spreadsheets write.
In the world of sbt-native-packager and sbt-assembly, copying to local is also useful for scalatest binary file tests. Just pop them out of resources to local, run the tests, and then delete.
import java.io.{File, FileOutputStream}
import java.nio.file.{Files, Paths}

def resourceToLocal(resourcePath: String) = {
  val outPath = "/tmp/" + resourcePath
  if (!Files.exists(Paths.get(outPath))) {
    val resourceFileStream = getClass.getResourceAsStream(s"/${resourcePath}")
    val fos = new FileOutputStream(outPath)
    fos.write(
      Stream.continually(resourceFileStream.read).takeWhile(-1 !=).map(_.toByte).toArray
    )
    fos.close()
  }
  outPath
}
val filePathFromResourcesDirectory = "google-docs-key.p12"

val serviceAccountId = "[something]@drive-integration-[something].iam.gserviceaccount.com"
val googleSheetId = "1nC8Y3a8cvtXhhrpZCNAsP4MBHRm5Uee4xX-rCW3CW_4"
val tabName = "Favorite Cities"

import spark.implicits._

val df = Seq(("Brooklyn", "New York"),
  ("New York City", "New York"),
  ("San Francisco", "California")).
  toDF("City", "State")

df.write.
  format("com.github.potix2.spark.google.spreadsheets").
  option("serviceAccountId", serviceAccountId).
  option("credentialPath", resourceToLocal(filePathFromResourcesDirectory)).
  save(s"${googleSheetId}/${tabName}")
The "resources" folder must be under the source root. If using IntelliJ, check for the blue folder in the project tree on the left side, e.g. AppName/src/main/scala or Project/scala/../main/ etc.
If using val stream: InputStream = getClass.getResourceAsStream("/readme.txt"), don't forget the "/" (forward slash), given that readme.txt is the file inside resources.
I have a Spark/Scala project named Omega.
I have a conf file at Omega/conf/omega.config.
I use APIs from Typesafe to load the config file from conf/omega.config.
It was working fine and I was able to read the respective value for each key.
Now today, for the first time, I added some more key-value pairs to my omega.config file and tried to retrieve them from my Scala code. It throws:
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'job_name'
This issue started happening after adding a new value for the key job_name in my omega.config file.
Also, I am not able to read the newly added key-values; I am still able to read all the old values using the config.getString method.
I am building my Spark/Scala application using Maven.
Omega.config
input_path="/user/cloudera/data
user_name="surender"
job_name="SAMPLE"
I am not able to access the recently added key "job_name" alone.
package com.pack1

import com.pack2.ApplicationUtil

object OmegaMain {
  val config_loc = "conf/omega.config"

  def main(args: Array[String]): Unit = {
    val config = ApplicationUtil.loadConfig(config_loc)
    val jobName = ApplicationUtil.getFromConfig(config, "job_name")
  }
}

package com.pack2

import java.io.File
import com.typesafe.config.{Config, ConfigFactory}

object ApplicationUtil {
  def loadConfig(filePath: String): Config = {
    val config = ConfigFactory.parseFile(new File(filePath))
    config
  }

  def getFromConfig(config: Config, jobName: String): String = {
    config.getString(jobName)
  }
}
Could someone help me understand what went wrong?
You can try something like:
import java.io.{File, FileOutputStream, InputStream}
import com.typesafe.config.{Config, ConfigFactory, ConfigParseOptions, ConfigSyntax}

def loadConfig(filename: String, syntax: ConfigSyntax): Config = {
  val in: InputStream = getClass.getResourceAsStream(filename)
  if (in == null) return null
  val file: File = File.createTempFile(String.valueOf(in.hashCode()), ".conf")
  file.deleteOnExit()
  val out: FileOutputStream = new FileOutputStream(file)
  val buffer: Array[Byte] = new Array(1024)
  var bytesRead: Int = in.read(buffer)
  while (bytesRead != -1) { out.write(buffer, 0, bytesRead); bytesRead = in.read(buffer) }
  out.close()
  val conf: Config = ConfigFactory.parseFile(
    file,
    ConfigParseOptions.defaults().setSyntax(syntax).setAllowMissing(false).setOriginDescription("Merged with " + filename)
  )
  conf
}
filename is some file path on the classpath. If you want to update this method to take an external file into account, replace the File.createTempFile line with val file: File = new File("absolute path of the file").
I am guessing the file isn't on the classpath after you build with Maven.
Since you are using Maven to build a jar, you need your omega.config to be in the classpath. This means that you either have to put it into src/main/resources by default or explicitly tell Maven to add conf to the default resources classpath.
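Once omega.config is on the classpath (for example copied into src/main/resources, or with conf declared as a resources directory), a minimal sketch of loading it as a classpath resource instead of a filesystem path (the resource name assumes it ends up at the classpath root):

import com.typesafe.config.{Config, ConfigFactory}

// Parse the config from the classpath so it still resolves after the jar is built.
val config: Config = ConfigFactory.parseResources("omega.config")
val jobName: String = config.getString("job_name")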
In Play Framework 2.2.x, after using play dist, I am having issues reading from and writing to the /public directory. Is that a known problem? Is the only solution to read/write to another directory with a global path?
This is my sample code:
val imageDirectory = "images/twitpics/"
val localPrefix = "/public/"
val publicPrefix = "/assets/"
val files = Play.getFile(localPrefix + imageDirectory)
.listFiles.filter(_.getName.takeRight(3) == "jpg")
val randomIndex = _rand.nextInt(files.length)
val imageFile = files(randomIndex)
Also
private val _jsonConfigFile = "/public/data/data.json"
def writeJsonToFile(content: String) = {
  import java.io._
  val pw = new PrintWriter(Play.getFile(_jsonConfigFile))
  pw.write(content)
  pw.close
}
After dist the public/ directory is packaged into the application jar, which is put on the classpath, so you cannot access it through the filesystem or write to it.
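For reading, a minimal sketch is to go through the classpath instead of the filesystem (the path assumes the file is still packaged under public/data/ inside the jar); writes have to go to some external, configurable directory:

import scala.io.Source

// Read the packaged asset from the classpath; writing back into the jar is not possible.
val stream = getClass.getResourceAsStream("/public/data/data.json")
val json: String = Source.fromInputStream(stream).mkString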