I was running a Scala code chunk and ran into the following error:
Formatting failed: :13: error: [dialect dialect] illegal start of simple expression
case (leftDF, segmentStruct) =>
Can you please help me correct this?
val finalDF = listOfSegments.tail.foldLeft(firstDF)(
  case (leftDF, segmentStruct) =>
    leftDF.join(
      getRelevantSegmentInfo(segmentStruct.tableName, segmentStruct.segmentName),
      Seq(accountKeys),
      "left_outer"
    )
)
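A pattern-matching anonymous function ({ case ... => ... }) has to be written inside braces rather than parentheses, which is most likely what this error is complaining about. A minimal sketch with just that change, keeping all names from the question:

val finalDF = listOfSegments.tail.foldLeft(firstDF) {
  // braces let the compiler treat the case clause as a (B, A) => B function
  case (leftDF, segmentStruct) =>
    leftDF.join(
      getRelevantSegmentInfo(segmentStruct.tableName, segmentStruct.segmentName),
      Seq(accountKeys),
      "left_outer"
    )
}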
I am trying to parse a simple syntax like so
mod A do
end
where the do ... end block may or may not contain content. For some odd reason I am not parsing this correctly with this combinator.
def module: Parser[Any] = "mod" ~> moduleIdent ~ "do" ~ opt(rep(_)) ~ "end"
where _ stands for a function definition parser, so opt(rep(_)) is an optional repetition of function definitions.
I am getting the following error.
[2.1] failure: 'do' expected but 'e' found
end
moduleIdent code:
def moduleIdent: Parser[String] = "[A-Z]+".r ~ opt(ident) ^^ {
  case first ~ optAll => optAll match {
    case Some(all) => first ++ all
    case None => first
  }
}
running the parsing code:
parseAll(module, Source.fromFile(source).mkString)
where source is the given code that is failing.
Sometimes I also get an "end of input" error. I am not sure what I am doing wrong. I am using the JavaTokenParsers class.
Thank you for your help!
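One likely culprit, though this is an assumption since the full grammar is not shown: ident from JavaTokenParsers happily matches the keyword do, so the opt(ident) inside moduleIdent consumes the do that module is waiting for, which would explain the "'do' expected but 'e' found" failure. A sketch of one way to guard against that, inside the same JavaTokenParsers subclass (the keyword set and the name nonKeywordIdent are illustrative, not from the question):

// Reject identifiers that collide with keywords so they stay available
// as literals for the surrounding parsers.
val keywords = Set("mod", "do", "end")

def nonKeywordIdent: Parser[String] = ident ^? { case id if !keywords(id) => id }

def moduleIdent: Parser[String] = "[A-Z]+".r ~ opt(nonKeywordIdent) ^^ {
  case first ~ Some(rest) => first ++ rest
  case first ~ None       => first
}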
I have already assigned the variable, but I still get the error. I don't think it is a typo.
val inputJPG = input.filter(context => context.contains("jpg")).collect
inputJPG.take(10)
------------------------------------------------------
scala> inputJPG.take(10)
<console>:20: error: not found: value inputJPG
inputJPG.take(10)
Just remove collect (unless you have a reason for it that is not clear from your question).
val input = Seq("none","image.jpg")
val inputJPG = input.filter(context => context.contains("jpg"))
inputJPG.take(10) // -> List(image.jpg)
collect is used when you want to filter and map in one step. See the API docs.
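For illustration, a small sketch of that filter-and-map-in-one-step use of collect, reusing the input Seq from above (the .jpg-stripping transformation is made up):

// collect keeps only the elements its partial function is defined for
// and transforms them in the same pass.
val jpgNames = input.collect {
  case name if name.contains("jpg") => name.stripSuffix(".jpg")
}
// jpgNames == Seq("image")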
I'm working on Advent of Code's coding challenges and I'm on day one. I've read from a file that contains nothing but ((()(())(( so I'm looking to turn each '(' to 1 and each ')' to a -1 so I can compute them. But I'm having issues when I map findFloor over source. I'm getting a type mismatch. Everything looks right to me and that's the weird part because it's not working.
import scala.io._

object Advent1 extends App {
  // Read from file
  val source = Source.fromFile("floor1-Input.txt").toList

  // Replace each '(' with 1 and each ')' with -1, return List[Int]
  def findFloor(input: List[Char]): Int = input match {
    case _ if input.contains('(') => 1
    case _ if input.contains(')') => -1
  }

  val floor = source.map(findFloor)
}
Error output
error: type mismatch;
found : List[Char] => Int
required: Char => ?
val floor = source.map(findFloor)
                       ^
one error found
What am I doing wrong here? What am I missing?
Scala's map works over individual elements rather than the whole collection. Try this:
val floor = source.map {
  case '(' => 1
  case ')' => -1
}.sum
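As a quick check with the sample string from the question (assuming the file contains exactly that string, with no trailing newline, which would otherwise hit a missing case):

val source = "((()(())((".toList
val floor = source.map {
  case '(' => 1
  case ')' => -1
}.sum
// floor == 4 (seven '(' minus three ')')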
If you want to compute them in a sequential way, you could even do the computation directly using foldLeft.
val computation = source.foldLeft(0)((a, b) => {
  b match {
    case '(' => a + 1
    case ')' => a - 1
  }
})
It simply adds up all the values and returns the accumulated result: for '(' it adds and for ')' it subtracts.
The first argument is the starting value, a is the value from the previous step, and b is the current element, i.e. the char.
Your error is probably caused by the pattern match not being defined for all chars. You are missing a match for all other characters, for example case _ => 0.
Another option would be to use collect, since it accepts a PartialFunction and ignores all non-matching elements.
I think the suggested fold solution is the better approach here, though.
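A sketch of that collect variant, reusing source from the question (the val name is made up); any character that is neither '(' nor ')' is simply dropped instead of blowing up the match:

val floorViaCollect = source.collect {
  case '(' => 1
  case ')' => -1
}.sum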
I have this code below:
// TABLE FROM Hive
val df = hiveContext.sql("select * from test_table where date ='20160721' LIMIT 300")
//ERROR ON THE LINE BELOW
val row = df.flatMap(row => ((row.get(0), row.get(1), row.get(2)), 1))
I get this error in the code above saying:
Type mismatch, expected: (Row) => Traversable[NotInferedU], actual : (Row) => ((Any, Any, Any), Int)
Can someone check what is wrong in my flatMap function? I am not able to understand what this error is stating.
You probably should use map instead; ((row.get(0), row.get(1), row.get(2)), 1) is not a Traversable, as the error message states.
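For illustration, a sketch of that map-based version, going through .rdd so it does not depend on the Spark version (df as in the question; the val name keyedRows is made up):

// map emits exactly one ((col0, col1, col2), 1) pair per Row,
// so no flattening is needed.
val keyedRows = df.rdd.map(row => ((row.get(0), row.get(1), row.get(2)), 1))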
I am trying to create multiple dataframes in a single foreach, using Spark, as below.
I get the values delivery and click out of row.getAs("type") when I try to print them.
val check = eachrec.foreach(recrd => recrd.map(row => {
  row.getAs("type") match {
    case "delivery" => val delivery_data = delivery(row.get(0).toString, row.get(1).toString)
    case "click" => val click_data = delivery(row.get(0).toString, row.get(1).toString)
    case _ => "not sure if this impacts"
  }
}))
but I am getting the error below:
Error:(41, 14) type mismatch; found : String("delivery") required: Nothing
case "delivery" => val delivery_data = delivery(row.get(0).toString,row.get(1).toString)
^
My plan is to create dataframes using toDF() once I create these individual delivery and click objects referenced by delivery_data and click_data, i.e. delivery_data.toDF() and click_data.toDF().
Please provide any clue regarding the error above (in the match case).
How can I create the two DFs using toDF() in val check?
The val declarations make the return type of your first two cases Unit, but in the third case you return a String.
For instance, here the type of z is inferred by the compiler as Unit:
def x = {
  val z: Unit = 3 match {
    case 2 => val a = 2
    case _ => val b = 3
  }
}
I think you need to convert the matched value to a String:
row.getAs("type").toString
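A sketch of how that plays out; without a type parameter the compiler infers Nothing for getAs here, which is why matching the String literal fails. The classify helper and its return values are purely illustrative:

import org.apache.spark.sql.Row

// Either call .toString as suggested above, or give getAs an explicit type,
// so the value being matched is a String rather than Nothing.
def classify(row: Row): String = row.getAs[String]("type") match {
  case "delivery" => "delivery row"
  case "click"    => "click row"
  case _          => "other"
}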