Programs for printing reverse triangle patterns with * in Scala

I am trying to explore Scala. I am new to Scala. This might be a simple question; I searched Google for the scenario below but couldn't get answers. Instead of Scala I only get Java-related things.
My requirement is to print a format like the one below.
* * * * *
 * * * *
  * * *
   * *
    *
Can someone suggest how to get this format?
Thanks in advance.
Kanti

Just for the sake of illustration, here are two possible solutions to the problem.
The first one is completely imperative, while the second one is more functional.
The idea is that this serves as an example to help you think how to solve problems in a programmatic way.
As many of us have already commented, if you do not understand the basic ideas behind the solution, then this code will be useless in the long term.
Here is the imperative solution. The idea is simple: we need to print n lines, where each line contains n - i stars (i being the number of the line, starting at 0). The stars are separated by a space.
Finally, before printing the stars, we need some padding; looking at the expected output, you can see that the padding starts at 0 and increases by 1 for each line.
def printReverseTriangle(n: Int): Unit = {
  var i = 0
  var padding = 0
  while (i < n) {
    // print the padding spaces for this line
    var j = padding
    while (j > 0) {
      print(" ")
      j -= 1
    }
    // print the stars, one fewer on each successive line
    var k = n - i
    while (k > 0) {
      print("* ")
      k -= 1
    }
    println()
    i += 1
    padding += 1
  }
}
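For instance, calling it with n = 3 prints (trailing spaces omitted here):

printReverseTriangle(3)
// * * *
//  * *
//   *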
And here is a more functional approach.
As you can see, in this case we do not need to mutate anything; the high-level operators take care of that for us, and we only need to focus on the description of the solution.
def printReverseTriangle(size: Int): Unit = {
  def makeReverseTriangle(size: Int): List[String] =
    List.tabulate(size) { i =>
      // row i has i + 1 stars; the padding shrinks as the row grows
      (" " * (size - 1 - i)) + ("* " * (i + 1))
    }.reverse

  println(makeReverseTriangle(size).mkString("\n"))
}
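For the same input, this prints exactly the same triangle as the imperative version above.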

To add an alternative to Luis's answer, here's a recursive solution:
import scala.annotation.tailrec
def printStars(i: Int): Unit = {
  @tailrec
  def loop(j: Int): Unit = {
    if (j > 0) {
      val stars = Range(0, j).map(_ => "*").mkString(" ") // make stars
      if (i == j) println(stars) // no need for spaces
      else println((" " * (i - j)) + stars) // spaces before the stars
      loop(j - 1)
    }
  }

  loop(i)
}
printStars(3)
// * * *
// * *
// *
This function takes a maximum triangle size (i) and, for each j from i down to 1, prints the correct number of stars (and spaces) for that row before decrementing j by 1.
Note: Range(0, j).map(_ => "*").mkString(" ") can be replaced with List.tabulate(j)(_ => "*").mkString(" ") per Luis's answer - I'm not sure which is faster (I've not tested it).
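For what it's worth, an equivalent spelling (my suggestion, not from either answer) is List.fill, which states the intent directly:

List.fill(j)("*").mkString(" ") // j copies of "*", joined by single spaces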

Related

How to generate arbitrary instances of a language given its concrete syntax in Rascal?

Given the concrete syntax of a language, I would like to define a function "instance" with signature str (type[&T]) that could be called with the reified type of the syntax and return a valid instance of the language.
For example, with this syntax:
lexical IntegerLiteral = [0-9]+;
start syntax Exp
  = IntegerLiteral
  | bracket "(" Exp ")"
  > left Exp "*" Exp
  > left Exp "+" Exp
  ;
A valid return of instance(#Exp) could be "1+(2*3)".
The reified type of a concrete syntax definition does contain information about the productions, but I am not sure whether this approach is better than a dedicated data structure. Any pointers on how I could implement it?
The most natural thing is to use the Tree data-type from the ParseTree module in the standard library. It is the format that the parser produces, but you can also use it yourself. To get a string from the tree, simply print it in a string like so:
str s = "<myTree>";
A relatively complete random tree generator can be found here: https://github.com/cwi-swat/drambiguity/blob/master/src/GenerateTrees.rsc
The core of the implementation is this:
Tree randomChar(range(int min, int max)) = char(arbInt(max + 1 - min) + min);

Tree randomTree(type[Tree] gr)
  = randomTree(gr.symbol, 0, toMap({ <s, p> | s <- gr.definitions, /Production p:prod(_,_,_) <- gr.definitions[s]}));

Tree randomTree(\char-class(list[CharRange] ranges), int rec, map[Symbol, set[Production]] _)
  = randomChar(ranges[arbInt(size(ranges))]);

default Tree randomTree(Symbol sort, int rec, map[Symbol, set[Production]] gr) {
  p = randomAlt(sort, gr[sort], rec);
  return appl(p, [randomTree(delabel(s), rec + 1, gr) | s <- p.symbols]);
}
default Production randomAlt(Symbol sort, set[Production] alts, int rec) {
  int w(Production p) = rec > 100 ? p.weight * p.weight : p.weight;
  int total(set[Production] ps) = (1 | it + w(p) | Production p <- ps);

  r = arbInt(total(alts));

  count = 0;
  for (Production p <- alts) {
    count += w(p);
    if (count >= r) {
      return p;
    }
  }

  throw "could not select a production for <sort> from <alts>";
}
It is a simple recursive function which randomly selects productions from a reified grammar.
The trick towards termination lies in the weight of each rule. This is computed a priori, such that every rule has its own weight in the random selection. We take care to give the set of rules that lead to termination at least a 50% chance of being selected, as opposed to the recursive rules (code here: https://github.com/cwi-swat/drambiguity/blob/master/src/Termination.rsc):
Grammar terminationWeights(Grammar g) {
  deps = dependencies(g.rules);
  weights = ();
  recProds = {p | /p:prod(s,[*_,t,*_],_) := g, <delabel(t), delabel(s)> in deps};

  for (nt <- g.rules) {
    prods = {p | /p:prod(_,_,_) := g.rules[nt]};
    count = size(prods);
    recCount = size(prods & recProds);
    notRecCount = size(prods - recProds);

    // at least 50% of the weight should go to non-recursive rules if they exist
    notRecWeight = notRecCount != 0 ? (count * 10) / (2 * notRecCount) : 0;
    recWeight = recCount != 0 ? (count * 10) / (2 * recCount) : 0;

    weights += (p : p in recProds ? recWeight : notRecWeight | p <- prods);
  }

  return visit (g) {
    case p:prod(_, _, _) => p[weight=weights[p]]
  }
}
@memo
rel[Symbol,Symbol] dependencies(map[Symbol, Production] gr)
  = {<delabel(from),delabel(to)> | /prod(Symbol from,[_*,Symbol to,_*],_) := gr}+;
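If it helps to see the selection idea outside Rascal, here is a minimal Scala sketch of the same weighted-choice technique (Prod and its weight field are hypothetical stand-ins, not the Rascal Production type):

import scala.util.Random

final case class Prod(name: String, weight: Int) // hypothetical stand-in for a production

// Pick an alternative with probability proportional to its weight,
// mirroring the counting loop in randomAlt above.
def weightedPick(alts: Seq[Prod], rng: Random = new Random): Prod = {
  val total = alts.map(_.weight).sum
  val r = rng.nextInt(total) + 1 // target point in 1..total
  val cumulative = alts.scanLeft(0)(_ + _.weight).tail // running sums of the weights
  alts.zip(cumulative).collectFirst { case (p, c) if c >= r => p }.get
}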
Note that this randomTree algorithm will not terminate on grammars that are not "productive" (i.e. grammars that have only rules like syntax E = E;).
Also, it can generate trees that are filtered by disambiguation rules, so you can check for this by running the parser on a generated string and checking for parse errors. It can also generate ambiguous strings.
By the way, this code was inspired by the PhD thesis of Naveneetha Vasudevan of King's College London.

How does a recursive function return the result in scala?

I am currently learning Scala and I am stuck in the following thing:
I have this algorithm which finds in a recursive way the factorial of a number:
def fact(n: Int): Int = {
  if (n == 1) 1
  else n * fact(n - 1)
}
println(fact(5))
My question is, what does this line: if(n == 1) 1 do exactly? Does it mean that the function should return 1, or that n should become 1? I don't understand how this function returns 120, which is the result. Could someone help me understand? I appreciate any help you can provide.
Uhm, this is a very broad question.
Since you are asking for a basic understanding of the operators of the language, I will try to explain it all to you, but I would recommend that you take a formal introduction to programming course.
In Scala everything is an expression. Thus, the function itself is an expression that evaluates to the assigned block.
In this case the block is just an if / else expression, which takes a predicate to decide which of the two branches to choose. Here, n == 1 checks if n is equal to 1; if that is true, it returns 1, and if not, it returns n * fact(n - 1).
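Since everything is an expression, an if / else can sit directly on the right-hand side of a definition. A tiny illustration (my example, not from the question):

val n = 5
val result = if (n == 1) 1 else n * 2 // result == 10; no return keyword needed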
Thus, if we execute the algorithm by ourselves using "equational reasoning", we can understand how it works.
fact(3) = if (3 == 1) 1 else 3 * fact(3 - 1) // replace n in the block.
fact(3) = 3 * fact(2) // reduce the if and the subtraction.
fact(3) = 3 * (if (2 == 1) 1 else 2 * fact(2 - 1)) // expand the fact definition.
fact(3) = 3 * (2 * fact(1)) // reduce the if and the subtraction.
fact(3) = 3 * (2 * (if (1 == 1) 1 else 1 * fact(1 - 1))) // expand the fact definition.
fact(3) = 3 * (2 * (1)) // reduce the if.
fact(3) = 6 // reduce the multiplications.
Let's make this method more C-oriented.
Maybe now it's clearer that there are two branches:
1. When n equals 1 - which stops the recursion.
2. Otherwise - multiply the current value of n by the result of calling the fact method with n - 1, which eventually becomes 1 and stops the recursion.
def fact(n: Int): Int = {
  if (n == 1) {
    (return) 1;
  }
  else {
    (return) n * fact(n - 1);
  }
}
The semicolons are redundant and the return keyword is not recommended/necessary.
You can read about it here
So you are left with:
def fact(n: Int): Int = {
  if (n == 1) {
    1
  }
  else {
    n * fact(n - 1)
  }
}
Which is basically the same as:
def fact(n: Int): Int = {
  if (n == 1) 1
  else n * fact(n - 1)
}
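As a side note, the same function can also be written with an accumulator so that the recursive call is in tail position; a minimal sketch (my addition, reusing @tailrec from the pattern-printing answer above):

import scala.annotation.tailrec

def fact(n: Int): Int = {
  @tailrec
  def loop(n: Int, acc: Int): Int =
    if (n == 1) acc           // base case: the accumulator already holds the product
    else loop(n - 1, n * acc) // multiply first, so the recursive call is the last step

  loop(n, 1)
}

println(fact(5)) // 120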

How to generate increasing sequence in ScalaCheck?

I'm trying to generate sequence of increasing numbers using ScalaCheck.
I would like to achieve something like this:
0 2 4 6
This was achieved by multiplying the range 0..3 by a step of 2:
0 * 2 = 0
1 * 2 = 2
2 * 2 = 4
3 * 2 = 6
Thanks for your help.
Sorry if this question has been asked before.
Well, it appears it is not so difficult to generate such a sequence. Sorry, I needed to be more specific about predictability.
import org.scalacheck.Gen

object GenerateSequence {
  def apply(maxSize: Int, maxStep: Int): Gen[Seq[Int]] = {
    for {
      size <- Gen.chooseNum(1, maxSize)
      step <- Gen.chooseNum(1, maxStep)
    } yield {
      (0 to size).map(_ * step)
    }
  }
}
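For a quick look at what it generates (the output will vary from run to run):

GenerateSequence(maxSize = 5, maxStep = 3).sample
// e.g. Some(Vector(0, 2, 4, 6, 8))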
It's not really using ScalaCheck, but you can use the stream of even numbers:
scala> val even = Stream.from(0, 2)
scala> even.take(4).force
// res2: Stream[Int] = Stream(0, 2, 4, 6)
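Note that on Scala 2.13 and later, Stream is deprecated in favour of LazyList, which works the same way here:

val even = LazyList.from(0, 2)
even.take(4).toList // List(0, 2, 4, 6)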

Functional version of a typical nested while loop

I hope this question may please functional programming lovers. Could I ask for a way to translate the following fragment of code into a pure functional implementation in Scala, with a good balance between readability and execution speed?
Description: for each element in a sequence, produce a sub-sequence that contains the elements coming after the current element (including itself) with a distance smaller than a given threshold. Once the threshold is crossed, it is not necessary to process the remaining elements.
def getGroupsOfElements(input: Seq[Element]): Seq[Seq[Element]] = {
  val maxDistance = 10 // put any number you may
  var outerSequence = Seq.empty[Seq[Element]]
  for (index <- 0 until input.length) {
    var anotherIndex = index + 1
    var distance = input(index) - input(anotherIndex) // let's assume a separate function for computing the distance
    var innerSequence = Seq(input(index))
    while (distance < maxDistance && anotherIndex < (input.length - 1)) {
      innerSequence = innerSequence ++ Seq(input(anotherIndex))
      anotherIndex = anotherIndex + 1
      distance = input(index) - input(anotherIndex)
    }
    outerSequence = outerSequence ++ Seq(innerSequence)
  }
  outerSequence
}
You know, this would be a ton easier if you added a description of what you're trying to accomplish along with the code.
Anyway, here's something that might get close to what you want.
def getGroupsOfElements(input: Seq[Element]): Seq[Seq[Element]] =
  input.tails.map(x => x.takeWhile(y => distance(x.head, y) < maxDistance)).toSeq
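A small caveat: tails also emits the final empty suffix, so this version yields one extra empty group compared to the original loop. Here is a self-contained sketch, with Element assumed to be Int and a hypothetical subtraction-based distance (both are assumptions, since the question leaves them abstract):

val maxDistance = 10
def distance(a: Int, b: Int): Int = b - a // hypothetical stand-in

def getGroupsOfElements(input: Seq[Int]): Seq[Seq[Int]] =
  input.tails
    .filter(_.nonEmpty) // drop the empty suffix that tails always yields
    .map(x => x.takeWhile(y => distance(x.head, y) < maxDistance))
    .toList

println(getGroupsOfElements(Seq(1, 5, 12, 30, 31)))
// List(List(1, 5), List(5, 12), List(12), List(30, 31), List(31))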

Why can't I divide integers correctly within reduce in Swift?

I'm trying to get the average of an array of Ints using the following code:
let numbers = [1,2,3,4,5]
let avg = numbers.reduce(0) { return $0 + $1 / numbers.count }
print(avg) // 1
Which is obviously incorrect. However, if I move the division outside the closure:
let numbers = [1,2,3,4,5]
let avg = numbers.reduce(0) { return $0 + $1 } / numbers.count
print(avg) // 3
Bingo! I think I remember reading somewhere (I can't recall if it was in relation to Swift, JavaScript, or programming math in general) that this has something to do with the fact that dividing the sum by the length yields a float / double, e.g. (1 + 2) / 5 = 0.6, which will be rounded down within the sum to 0. However, I would expect ((1 + 2) + 3) / 5 = 1.2 to return 1, yet it too seems to return 0.
With doubles, the calculation works as expected whichever way it's calculated, as long as I convert the count integer to a Double:
let numbers = [1.0,2.0,3.0,4.0,5.0]
let avg = numbers.reduce(0) { return $0 + $1 / Double(numbers.count) }
print(avg) // 3
I think I understand the why (maybe not?). But I can't come up with a solid example to prove it.
Any help and / or explanation is very much appreciated. Thanks.
The division does not yield a double; you're doing integer division.
You're not getting ((1 + 2) + 3 etc.) / 5.
In the first case, you're getting (((((0 + (1/5 = 0)) + (2/5 = 0)) + (3/5 = 0)) + (4/5 = 0)) + (5/5 = 1)) = 0 + 0 + 0 + 0 + 0 + 1 = 1.
In the second case, you're getting ((((((0 + 1) + 2) + 3) + 4) + 5) / 5) = 15 / 5 = 3.
In the third case, the floating-point rounding error is much smaller than the integer truncation, and you get something like (((((0 + (1/5.0 = 0.2)) + (2/5.0 = 0.4)) + (3/5.0 = 0.6)) + (4/5.0 = 0.8)) + (5/5.0 = 1.0)) ≈ 3.0.
The problem is that what you are attempting with the first piece of code does not make sense mathematically.
The average of a sequence is the sum of the entire sequence divided by the number of elements.
reduce calls the lambda function for every member of the collection it is being called on. Thus you are summing and dividing all the way through.
For people finding it hard to understand the original answer, consider:
let x = 4
let y = 3
let answer = x/y
You expect the answer to be a Double, but no, it is an Int. For you to get an answer which is not a rounded-down Int, you must explicitly declare the values to be Double. See below:
let doubleAnswer = Double(x)/Double(y)
Hope this helped.