I want to use an immutable Queue in Scala, like this:
var a: Queue[Int] = Queue.empty[Int]
However, I get the following error:
error: not found: type Queue
I tried importing the package containing it, but it had no effect:
import scala.collection.immutable
Pretty sure you need to add ._ to your import, like so:
import scala.collection.immutable._
Or import Queue specifically as:
import scala.collection.immutable.Queue
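For reference, here is a minimal, self-contained sketch (object and value names are just illustrative) showing the immutable Queue compiling and being used once the specific import is in place:

import scala.collection.immutable.Queue

object QueueDemo extends App {
  var a: Queue[Int] = Queue.empty[Int]
  a = a.enqueue(1).enqueue(2)   // enqueue returns a new Queue; the original is unchanged
  val (head, rest) = a.dequeue  // dequeue returns the front element and the remaining queue
  println(head)                 // 1
  println(rest)                 // Queue(2)
}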
Related
Swift has an interesting attribute, which can be declared in ModuleA:
@_exported import Foundation
Then, when I import ModuleA in some other module (e.g. ModuleB):
import ModuleA
let currentDate = Date() // using a Foundation struct
... I'm also able to use the types from Foundation, even though I'm not importing it with an explicit import statement. It is imported automatically because it has been declared as @_exported in ModuleA.
Now I'd like to have a similar behavior in Objective-C.
Given:
3 targets: TargetA (Swift), ProtocolTarget (Swift with @objc-annotated types) and TargetB (ObjC)
I'd like to make ProtocolTarget implicitly imported, so that when TargetB imports TargetA, all of the methods from ProtocolTarget also become available inside TargetB.
How can I achieve this?
In Apple's GitHub repository for the Swift Package Manager they use:
import func POSIX.isatty
import func libc.strerror_r
import var libc.EINVAL
import var libc.ERANGE
import struct PackageModel.Manifest
(source)
There is also a file where the only code in it is an @_exported import (source):
@_exported import func libc.fileno
Is this a Swift 3 feature? I cannot find anything in the Swift documentation about importing a single type, and nothing on @_exported.
You can import just a specific part of a module instead of the whole module:
Providing more detail limits which symbols are imported—you can specify a specific submodule or a specific declaration within a module or submodule. When this detailed form is used, only the imported symbol (and not the module that declares it) is made available in the current scope.
From Import Declaration
For example, import func POSIX.isatty will import the function isatty from the POSIX module instead of importing the whole POSIX module (which is big).
The @_exported attribute starts with an underscore. That means it's a private Swift attribute: not a feature, but an implementation detail.
In short, this attribute lets you export a symbol from another module as if it were from your module.
I have the following package object with a val declared in it
package au.com.someproject.protocol
package object helpers {
val etcdRoot = "someproject.com.au"
}
This package object is declared as part of an API. When I import the API into another project and try to access the value, I get the following error:
[error] /home/user/git/company/project/project-agent/src/main/scala/au/com/someproject/project_agent/cluster/StatusMonitor.scala:52: not found: value etcdRoot
[error] etcdClient.setKey(s"$etcdRoot/kumo/peers/${systemCluster.selfAddress.host.get}", systemCluster.selfAddress.port.get.toString, new Some(40.seconds))
I'm importing the value like so, which is how I would expect it should be done:
import au.com.someproject.protocol.helpers._
Yet I still get the error. Is there something I am doing wrong with the declaration or the import?
There's usually a catch with package objects: the file that defines them has to be placed inside the package's folder. I would expect your directory structure to look like this before the import would work:
src/main/scala/au/com/someproject/protocol/helpers/helpers.scala
Inside helpers/, you define your package object. Sometimes you are tempted to lay it out like this instead:
src/main/scala/au/com/someproject/protocol/helpers.scala
The layout above doesn't actually define the package object on the helpers package; you need to follow the directory structure and declare the enclosing package before you define the package object inside it.
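To make that concrete, here is a sketch of what the file at the first path could contain, reusing the names from the question (the enclosing package is declared first, then the package object inside it):

// File: src/main/scala/au/com/someproject/protocol/helpers/helpers.scala
package au.com.someproject.protocol

package object helpers {
  val etcdRoot = "someproject.com.au"
}

With that in place, import au.com.someproject.protocol.helpers._ in the consuming project brings etcdRoot into scope.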
I'm new to Scala. I created a package object in my code:
package mypackage  // enclosing package: the package object below defines mypackage.spark

import scala.language.implicitConversions

import org.apache.spark.SparkContext
import mypackage.spark.SparkContextFunctions

package object spark {
  implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions =
    new SparkContextFunctions(sc)
}
I expect that when I use import mypackage.spark._, I will be able to use the methods from the SparkContextFunctions class. This approach works for me when I import only this one package object. But when I add an additional import to my code, for example:
import mypackage.spark._
import com.datastax.spark.connector._
com.datastax.spark.connector._ does the same kind of thing for the org.apache.spark.SparkContext class. My code stops compiling, and I get an error saying that the method I use is not a member of the SparkContext class. When I change the order of the imports, the compiler starts seeing the methods from mypackage.spark._ and stops seeing the methods from com.datastax.spark.connector._.
Maybe I missed something? Or does Scala not support this?
Thanks.
If you need to use two classes named SparkContext at the same time, you can alias them:
import my.pkg.name.{SparkContext => MySparkContext}
import some.other.pkg.name.{SparkContext => OtherSparkContext}
Classes from the same package can be aliased in the same import:
import my.pkg.name.{SparkContext => MySparkContext, SomethingElse => MySomethingElse}
You may want to choose better names than MyXXX and OtherXXX.
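As a self-contained sketch (with hypothetical stand-in packages one and two rather than the real ones), both renamed classes can then be used side by side in the same file:

package one { class SparkContext { def label = "one" } }
package two { class SparkContext { def label = "two" } }

object AliasDemo extends App {
  import one.{SparkContext => OneContext}
  import two.{SparkContext => TwoContext}

  println(new OneContext().label)  // prints "one"
  println(new TwoContext().label)  // prints "two"
}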
The imports may conflict in two ways: either both use the name toSparkContextFunctions for their implicit conversion, or both provide extension methods with the same name (even with different signatures).
If neither is the case, there should be no problem. If one is, change your method names, since you can't change the ones in com.datastax.spark.connector.
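Here is a minimal, self-contained illustration of that advice, using hypothetical names rather than the real connector API: two wildcard imports of implicit conversions coexist without trouble as long as the extension methods they add have different names.

import scala.language.implicitConversions

object MySparkSyntax {
  class EtcdOps(val key: String) {
    def etcdPath: String = s"etcd://$key"
  }
  implicit def toEtcdOps(key: String): EtcdOps = new EtcdOps(key)
}

object ConnectorLikeSyntax {
  class CassandraOps(val key: String) {
    def cassandraPath: String = s"cassandra://$key"
  }
  implicit def toCassandraOps(key: String): CassandraOps = new CassandraOps(key)
}

object ConflictDemo extends App {
  import MySparkSyntax._
  import ConnectorLikeSyntax._

  // Both extension methods resolve unambiguously because their names differ.
  println("peers".etcdPath)        // etcd://peers
  println("peers".cassandraPath)   // cassandra://peers
}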
Everything is in the title: how do I import a trait into a worksheet in order to test it? I have a function in that trait that I want to test.
Just import it like you always would:
import com.mytrait.MyTrait
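For example (the trait body and its method are hypothetical, just to show the shape), assuming the trait is part of the project's compiled sources:

// In the project's sources, e.g. MyTrait.scala:
package com.mytrait

trait MyTrait {
  def greet(name: String): String = s"Hello, $name"
}

// In a worksheet (.sc) inside the same project/module:
import com.mytrait.MyTrait

val tester = new MyTrait {}   // mix the trait into an anonymous instance
tester.greet("worksheet")     // the result appears in the worksheet output

If the worksheet lives in the same module, the import should resolve once the project compiles.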