How can Scala.js test whether a class variable was set to undefined by js.native processes? - scala.js

Please consider this Scala.js class:
@JSExport class Example(@JSExport var x: Uint8ClampedArray) { ... }
edit: [this used to read: case class Example(var x: Array[Int]) { ... }]
Now suppose that a JavaScript native process occasionally sets the value of the Example.x class variable to undefined. How, from a method inside of the Example class, would you test to see if the value of x is undefined or a legitimate array of Int values?
Would you just use: js.typeOf(x) == "undefined"? Would that work? Are there alternatives? If so, what advantages and disadvantages do they have?

Given that you have a Uint8ClampedArray, which is a JavaScript type, Scala.js does not give any guarantee that, at run-time, it indeed holds a value which is a Uint8ClampedArray. In that case, you can indeed either use
js.typeOf(x) == "undefined"
or even
(x: Any) == js.undefined
The "recommended" way is to use
js.isUndefined(x)
but the other ones are just as correct.
Btw: you say in your comment that the JavaScript code does e.x$1 = undefined. That's undefined behavior, and will break in fullOpt. You need to use e.x = undefined, which is available thanks to the @JSExport on var x.
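For reference, a minimal sketch of how such a check could look from inside the class (the imports are the standard Scala.js ones; the helper methods are only illustrative):
import scala.scalajs.js
import scala.scalajs.js.annotation.JSExport
import scala.scalajs.js.typedarray.Uint8ClampedArray

@JSExport
class Example(@JSExport var x: Uint8ClampedArray) {
  // True if JavaScript code has set `x` to undefined.
  def isCleared: Boolean = js.isUndefined(x)

  // Only touch `x` when it still holds an actual array.
  def firstByteOrZero: Int =
    if (js.isUndefined(x)) 0 else x(0)
}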

Related

Scala: getting rid of type argument in a function

Suppose I have a function defined in the following way:
def fun[T](somevar: SomeType[T]) = {
// some action
}
Is there a way to get rid of the type parameter [T] and just use it as
def fun(somevar: SomeType[T]) = {
// some action
}
I mean, the type information should already be available inside somevar, no? Why not use it somehow?
The purpose of your question is unclear.
The users of your function (the client code) don't need to worry about the type parameter. They can just call fun(myvar).
If the type T is immaterial to the body of fun() then it can be dropped: SomeType[_]
But if T is meaningful to the workings of the function then the compiler has to be told that it is a type parameter. Just seeing this, SomeType[T], the compiler can't infer that T is now a type parameter. It will, instead, look for a definition of T outside of fun() and, not finding it, will throw an error.
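For example, with no T defined anywhere in scope, the compiler rejects the definition outright (error message abbreviated):
trait SomeType[T]
def fun(somevar: SomeType[T]) = somevar
// error: not found: type T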
This would be problematic because what you're removing was never actually redundant. When the function is defined as def fun[T](...), T is defined as a new type variable. Otherwise, it refers to a type or type variable defined somewhere else. You seem to think that
def fun(somevar: SomeType[T]) { ... }
should mean that T is a new type variable. Is that always the case, though? Not in this code:
trait T { ... }
def fun(somevar: SomeType[T]) { ... }
So where do we go from there? Define new type variables whenever a type name is referred to that hasn't been defined before? That would make compile-time errors like the following truly weird. (Or worse, sometimes allow them to compile?)
def fun(somevar: SomeType[Integr]) { ... }
class C[Thing] {
def g(x: List[Think]) = ...
}
For the compile-time type-checking philosophy of Scala, that way lies madness.
Unless you want T to be a specific type, no, there is not. The method signature wouldn't be valid without it. What somevar knows is irrelevant.
You can't do this. The compiler needs to know what type T is. If you don't care about SomeType's type parameter, you can use this form:
def fun(somevar: SomeType[_]) = {
// some action
}
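For example (SomeType here is just a stand-in case class to make the snippet self-contained):
case class SomeType[T](value: T)

def fun(somevar: SomeType[_]) = {
  // inside the body the element is only known as Any
  println(somevar.value)
}

fun(SomeType(42))      // fine
fun(SomeType("hello")) // also fine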

Declare a null var in Scala

I have read that null should not be used in Scala.
How can I leave myVar uninitialized without the use of null?
class TestClass {
private var myVar: MyClass = null
}
I understand I can just make a dummy MyClass, that is never used in place of the null. But this can and does reduce the code's understandability.
As Rado has explained, I can replace null as shown below. I understand that I can then check at run-time whether the variable is set; however, if I don't program that check, then there is no benefit to using Option in this case.
Coming from Java, I feel there should be a way to simply leave the var uninitialized at compile-time and let it be set at run-time without using the Option class, because, as I mentioned above, if I don't code for the unset case then why use Option?
class TestClass {
  private var myVar: Option[MyClass] = None
  private def createVar() {
    myVar = Some(new MyClass)
    val x: MyClass = myVar.get
  }
}
I am thinking the only other way of doing what I am asking is:
class TestClass {
  // Using dummy MyClass that will never be used.
  private var myVar: MyClass = new MyClass
  private def process(myVar: MyClass) {
    this.myVar = myVar
    myVar.useVarMethod()
  }
}
The Scala way is to declare the variable as Option[MyClass]:
class TestClass {
  private var myVar: Option[MyClass] = None

  private def createVar() {
    myVar = Some(new MyClass)
  }

  // Usage example:
  def useMyVar(): Unit = {
    myVar match {
      case Some(myClass) => {
        // Use myClass here ...
        println(myClass.toString)
      }
      case None => // What to do if myVar is undefined?
    }
  }
}
That way you avoid NullPointerException. You make it explicit that the variable can be in an undefined state. Every time you use myVar you have to define what to do if it is undefined.
http://www.scala-lang.org/api/current/index.html#scala.Option
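If the full pattern match feels too heavy for simple cases, Option also offers more compact combinators that express the same "handle both cases" idea; here is a sketch of methods that could live inside the TestClass above:
def printMyVar(): Unit = {
  // getOrElse supplies a fallback for the None case
  println(myVar.map(_.toString).getOrElse("myVar is not set"))
  // foreach runs the side effect only when a value is present
  myVar.foreach(myClass => println(myClass.toString))
}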
I need myVar to be of type MyClass not Option[MyClass]. I see that I
could use Rado's updated answer and then use the get method, but is
there any other way?
When you use Option you are telling the compiler and everyone else who will read/use your code that it's okay for this value to be undefined, and that the code will handle that condition and not fail at runtime.
The other way of dealing with it is to do null checks every time before you access the variable, because it could be null and would otherwise throw an exception at runtime.
When you use Option, the compiler will tell you at compile time if you have not handled a condition where the value of a variable may be undefined.
If you think about it, that's really a big deal: you have converted a runtime exception (which may or may not show up when you run the code) into a deterministic compile-time error.
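A small sketch of that trade-off, reusing the class names from the question:
class TestClass {
  private var unsafe: MyClass = null
  private var safe: Option[MyClass] = None

  def bad(): String = unsafe.toString // compiles, but throws NullPointerException at runtime
  def good(): String = safe.map(_.toString).getOrElse("not set") // the unset case is handled explicitly
}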
If you want to extract the value out of something like an Option (which supports map and also flatMap), then you don't necessarily have to keep pattern matching on whether the Option contains a value (i.e. is a "Some") or not (i.e. is a "None").
Two methods are very useful here: if you just want to alter (or "map") the value within the Option, then you can use the map method, which takes a function with a general type of:
f: A => B
so in your case at compile time would end up being:
f: MyClass => B
When you map an option, if the option is a "Some" then the contained value is passed through to the mapping function, and the function is applied (to change the MyClass to a B if you like...) and the result is passed back wrapped in an Option. If your Option is a None, then you just get a None back.
Here's a simple example:
scala> case class MyClass(value : String)
defined class MyClass
scala> val emptyOption : Option[MyClass] = None
emptyOption: Option[MyClass] = None
scala> val nonEmptyOption: Option[MyClass] = Some(new MyClass("Some value"))
nonEmptyOption: Option[MyClass] = Some(MyClass(Some value))
Try and extract the value from both option instances using map:
scala> nonEmptyOption map { s => s.value + " mapped" }
res10: Option[String] = Some(Some value mapped)
scala> emptyOption map { s => s.value }
res11: Option[String] = None
So basically, if you map across an empty option, you always get a None. Map across a non-empty Option and you can extract the contained value and transform it and get the result wrapped back in an Option. Notice that you don't need to explicitly check any patterns...the map method takes care of it.
flatMap is a bit more challenging to explain, but it isn't massively different, except that it takes a function which has the type:
g: A => M[B]
which basically allows you to take the value out of the Option (the 'A' in the type signature above), do something to it (i.e. a computation) and then return the result ('B') - wrapped in another container such as another Option, List...
The same notion applies (for Option anyway): if the Option is a None, then nothing really happens. flatMap and map form the basis of Scala "for comprehensions", which you can read about (and which are done far more justice than I could!) in lots of other places.
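Continuing the REPL session above, a quick flatMap sketch (the lookup function is made up purely for illustration):
scala> def lookup(s: String): Option[Int] = if (s.nonEmpty) Some(s.length) else None
lookup: (s: String)Option[Int]

scala> nonEmptyOption flatMap { myClass => lookup(myClass.value) }
res12: Option[Int] = Some(10)

scala> emptyOption flatMap { myClass => lookup(myClass.value) }
res13: Option[Int] = None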

Check for acceptance of type, rather than value, with isDefinedAt

I have a case where I want use isDefinedAt to check if a partial function accepts a type, rather than a specific value.
val test: PartialFunction[Any, Unit] = {
  case y: Int => ???
  case ComplexThing(x, y, z) => ???
}
Here you could do something like test isDefinedAt 1 to check for acceptance of that value; however, what I really want to do is check for acceptance of all Ints. More specifically, in my case the type I want to check is awkward to initialize (it has a lot of dependencies), so I would really like to avoid creating an instance if possible; for the moment I'm just using nulls, which feels ugly. Unfortunately, there is no test.isDefinedAt[Int].
I'm not worried about it only accepting some instances of that type - I would just like to know whether it is completely impossible for that type to be accepted.
There is no way to make PartialFunction do this. In fact, because of type erasure, it can be difficult to operate on types at runtime. If you want to be able to verify types at compile-time you can use typeclasses instead:
class AllowType[-T] {
  def allowed = true
}

object AllowType {
  implicit object DontAllowAnyType extends AllowType[Any] {
    override def allowed = false
  }
}

implicit object AllowInt extends AllowType[Int]
implicit object AllowString extends AllowType[String]

def isTypeAllowed[T](implicit at: AllowType[T]) = at.allowed
isTypeAllowed[Int] // true
isTypeAllowed[Double] // false
The answer appears to be that this simply isn't possible. There are other ways to do this (as in wingedsubmariner's answer), but they require either duplicating the information (which renders it pointless, as the reason for doing this was to avoid that) or moving away from partial functions (which are dictated by an outside API).
The best solution is just to use nulls to fill in the dependencies and create instances to check with. It's ugly, and has its own issues, but it appears to be the best possible without substantial change.
test.isDefinedAt(ComplexThing(null, null, null))

Scala inheritance with an abstract class

I am new to Scala and just doing some practice.
I tried a very simple program, summarized as follows:
abstract class Device(val cds: Array[Char]) {
  var codes = Array[Char](cds: _*)
  def encrpt(code: Char): Char
  var nextDevice: Device
  def setNext(next: Device): Unit = {
    nextDevice = next
  }
}

// compiler error shows here
class Input(codes: Array[Char]) extends Device(codes) {
  override def encrpt(code: Char) = code
}
You can see there is a compiler error at the marked line; this is the message:
class Input needs to be abstract, since variable nextDevice in class Device of type com.me.acm.problem1009.Device is not
defined (Note that variables need to be initialized to be defined)
I am pretty confused by that error. My understanding is that I define a variable and a setter method in the parent class so that the child classes can use them without defining them again; that seems straightforward.
I guess I missed something. Could someone explain it to me and tell me what the correct way is? Thanks.
In Scala, variables do not have assumed default values as they do in Java (or many other languages). Thus, when you declare a variable, you must always specify its initial value.
In your code, you declare a variable nextDevice, but you do not give it a value. Since Scala always needs a value, it interprets what you've written as nextDevice being an abstract field, so the compiler is telling you that it must be overridden.
If you change that line to the following, for example, to specify an initial value, then the error will disappear:
var nextDevice: Device = new Input(Array())
As the error message is telling you, the variable nextDevice needs to be initialized in the constructor of Input.
class Input(codes: Array[Char]) extends Device(codes) {
  override def encrpt(code: Char) = code
  nextDevice = null
}
Note that using null is frowned upon in Scala. You should probably change the type of your variable to Option[Device]
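For example, here is a sketch of the same hierarchy using Option, so that neither class has to invent a dummy Device:
abstract class Device(val cds: Array[Char]) {
  var codes = Array[Char](cds: _*)
  def encrpt(code: Char): Char
  var nextDevice: Option[Device] = None
  def setNext(next: Device): Unit = {
    nextDevice = Some(next)
  }
}

class Input(codes: Array[Char]) extends Device(codes) {
  override def encrpt(code: Char) = code
}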

Scala: compare type of generic class

There have been many questions on that issue, but sadly none seems to solve my problem.
I've written a generic Scala class, let's call it
class MyClass[A]() { ... }
As well as the corresponding companion object:
object MyClass { ... }
Inside MyClass I want to define a function whose behaviour depends on the given type A. For instance, let's just assume I want to define a 'smaller' function of type (A, A) => Boolean that by default returns true no matter what the elements are, but is meant to return the correct result for certain types such as Int, Float, etc.
My idea was to define 'smaller' as a member of the class in the following way:
class MyClass[A]() {
  val someArray = new Array[A](1) // will be referred to later on
  var smaller: (A, A) => Boolean = MyClass.getSmallerFunction(this)
  // ...some stuff...
}

object MyClass {
  def getSmallerFunction[A](m: MyClass[A]): (A, A) => Boolean = {
    var func: (A, A) => Boolean = (a: A, b: A) => true
    // This doesn't compile, since the compiler doesn't know what 'A' is
    if (A == Int) func = ((a: Int, b: Int) => (a < b)).asInstanceOf[(A, A) => Boolean]
    // This compiles, but always returns true (due to type erasure I guess?)
    if (m.isInstanceOf[MyClass[Float]]) func = ((a: Float, b: Float) => (a < b)).asInstanceOf[(A, A) => Boolean]
    // This compiles but always returns true as well, due to the newly created array only containing null elements
    if (m.someArray(0).isInstanceOf[Long]) func = ((a: Long, b: Long) => (a < b)).asInstanceOf[(A, A) => Boolean]
    func
  }
  // ...some more stuff...
}
The getSmallerFunction method contains a few of the implementations I experimented with, but none of them works.
After a while of researching the topic it at first seemed as if manifests were the way to go, but unfortunately they don't seem to work here, due to the fact that object MyClass also contains some constructor calls of the class, which, no matter how I change the code, always results in the compiler complaining about the lack of information required to use manifests. Maybe there is a manifest-based solution, but I certainly haven't found it yet.
Note: The usage of a 'smaller' function is just an example, there are several functions of this kind I want to implement. I know that for this specific case I could simply allow only those types A that are Comparable, but that's really not what I'm trying to achieve.
Sorry for the wall of text - I hope it's possible to comprehend my problem.
Thanks in advance for your answers.
Edit:
Maybe I should go a bit more into detail: What I was trying to do was the implementation of a library for image programming (mostly for my personal use). 'MyClass' is actually a class 'Pixelmap' that contains an array of "pixels" of type A as well as certain methods for pixel manipulation. Those Pixelmaps can be of any type, although I mostly use Float and Color datatypes, and sometimes Boolean for masks.
One of the datatype dependent functions I need is 'blend' (although 'smaller' is used too), which interpolates between two values of type A and can for instance be used for smooth resizing of such a Pixelmap. By default, this blend function (which is of type (A,A,Float) => A) simply returns the first given value, but for Pixelmaps of type Float, Color etc. a proper interpolation is meant to be defined.
So every Pixelmap-instance should get one pointer to the appropriate 'blend' function right after its creation.
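To make that concrete, for a Float Pixelmap I would expect something along these lines (just a sketch of the intended signatures; the names are placeholders):
// intended shape of blend: (A, A, Float) => A
val blendFloat: (Float, Float, Float) => Float =
  (a, b, t) => a + (b - a) * t // plain linear interpolation

def blendDefault[A]: (A, A, Float) => A =
  (a, _, _) => a // fallback: just return the first value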
Edit 2:
It seems like I found a suitable way to solve the problem, at least for my specific case. It really is more of a workaround, though.
I simply added an implicit parameter of type A to MyClass:
class MyClass[A]()(implicit dummy:A) { ... }
When I want to find out whether the type A of an instance m:MyClass is "Float" for instance, I can just use "m.dummy.isInstanceOf[Float]".
To make this actually work I added a bunch of predefined implicit values for all datatypes I needed to the MyClass object:
object MyClass {
  implicit val floatDummy: Float = 0.0f
  implicit val intDummy: Int = 0
  ...
}
Although this really doesn't feel like a proper solution, it seems to get me around the problem pretty well.
I've omitted a whole bunch of stuff because, if I'm honest, I'm still not entirely sure what you're trying to do. But here is a solution that may help you.
trait MyClass[A] {
  def smaller: (A, A) => Boolean
}

object MyClass {
  implicit object intMyClass extends MyClass[Int] {
    def smaller = (a: Int, b: Int) => (a < b)
  }
  implicit object floatMyClass extends MyClass[Float] {
    def smaller = (a: Float, b: Float) => (a < b)
  }
  implicit object longMyClass extends MyClass[Long] {
    def smaller = (a: Long, b: Long) => (a < b)
  }
  def getSmallerFunction[T : MyClass](a: T, b: T) = implicitly[MyClass[T]].smaller(a, b)
}
The idea is that you define your smaller methods as implicit objects under your MyClass object, together with a getSmallerFunction method. This method is special in the sense that it looks for a type-class instance that satisfies its type bounds. We can then go:
println(MyClass.getSmallerFunction(1, 2))
And it automagically knows the correct method to use. You could extend this technique to handle your Array example. This is a great tutorial/presentation on what type-classes are.
Edit: I've just realised you want an actual function returned. In my case, like yours, the type parameter is lost. But if at the end of the day you just want to be able to selectively call methods depending on their type, the approach I've detailed should help you.
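If you do need the function value itself rather than just the result, you can still pull it out of the type-class instance; for example, a method like this could be added to the MyClass companion above (the name is mine):
def smallerFunction[T](implicit ev: MyClass[T]): (T, T) => Boolean = ev.smaller

// usage:
val intSmaller: (Int, Int) => Boolean = MyClass.smallerFunction[Int]
println(intSmaller(1, 2)) // true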