Counting letter occurrences in a String in Swift

Found a cute way of counting the occurrences of each character in a String:
let inputString = "test this string"
var frequencies: [Character: Int] = [:]
let baseCounts = zip(inputString, repeatElement(1, count: Int.max))
frequencies = Dictionary(baseCounts, uniquingKeysWith: +)
with the result
["i": 2, "r": 1, "n": 1, "e": 1, "s": 3, " ": 2, "g": 1, "t": 4, "h": 1]
However, I tried to use a range for the elements, such that
let secondBaseCounts = zip(inputString, 0...)
frequencies = Dictionary(secondBaseCounts, uniquingKeysWith: +)
but got an incorrect result:
["i": 20, "r": 12, "n": 14, "e": 1, "s": 20, " ": 13, "g": 15, "t": 19, "h": 6]
Why?

Your second attempt doesn't implement what you meant to implement. zip(inputString, 0...) simply pairs each character with its Int offset in the string.
So the value of secondBaseCounts will be
["t", 0), ("e", 1), ("s", 2), ("t", 3), (" ", 4), ("t", 5), ("h", 6), ("i", 7), ("s", 8), (" ", 9), ("s", 10), ("t", 11), ("r", 12), ("i", 13), ("n", 14), ("g", 15)]
Calling Dictionary(secondBaseCounts, uniquingKeysWith: +) sums the values associated with each repeated key, meaning that the values in your final frequencies dictionary will be the sum of all offsets at which a character occurs in inputString, rather than the count of occurrences of that character. For example, "t" occurs at offsets 0, 3, 5, and 11, which sum to the 19 shown above.
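As an aside (an addition, not from the original answer): if all you need are the counts, a common alternative avoids zip entirely by using reduce(into:) with a defaulted subscript; a minimal sketch:
let counted = inputString.reduce(into: [Character: Int]()) { counts, character in
    // Each missing key defaults to 0, then gets bumped by 1.
    counts[character, default: 0] += 1
}
print(counted) // ["t": 4, "e": 1, "s": 3, " ": 2, "h": 1, "i": 2, "r": 1, "n": 1, "g": 1]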

I am answering this question without using any built-in methods, since it is most likely to be asked this way in interviews.
var inputString = "test this string"
extension String {
    func countCharacterOccurances() -> [String: Int] {
        var occuranceDict: [String: Int] = [:]
        for character in self {
            let key = String(character)
            if let existingCount = occuranceDict[key] {
                occuranceDict[key] = existingCount + 1
            } else {
                occuranceDict[key] = 1
            }
        }
        return occuranceDict
    }
}
var characterOccuranceDict = inputString.countCharacterOccurances()
Output : ["h": 1, "t": 4, "i": 2, "g": 1, "s": 3, "r": 1, " ": 2, "n": 1, "e": 1]
The following code is useful if you want the result sorted alphabetically by key; otherwise, you can simply print characterOccuranceDict. It also solves a related interview problem: "write a program to count consecutive characters in the string and combine them in a string in given order".
var finalStr = ""
let sortedDictionary = characterOccuranceDict.sorted { $0.key < $1.key }
sortedDictionary.forEach { item in
    finalStr += item.key + String(item.value)
}
print(finalStr)
inputString = "AAABBCCCCD"
Output : A3B2C4D1
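Note (an added caveat, not part of the original answer): counting total occurrences only matches the "consecutive characters" wording when each character appears in a single run, as in "AAABBCCCCD". If runs can repeat (e.g. "AABAA"), a true run-length encoding is needed; a minimal sketch:
func runLengthEncode(_ input: String) -> String {
    var result = ""
    var runCharacter: Character? = nil
    var runLength = 0
    for character in input {
        if character == runCharacter {
            runLength += 1
        } else {
            // Close out the previous run before starting a new one.
            if let runCharacter = runCharacter {
                result += String(runCharacter) + String(runLength)
            }
            runCharacter = character
            runLength = 1
        }
    }
    if let runCharacter = runCharacter {
        result += String(runCharacter) + String(runLength)
    }
    return result
}
print(runLengthEncode("AAABBCCCCD")) // A3B2C4D1
print(runLengthEncode("AABAA"))      // A2B1A2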

Related

Swift: Remap value keys to keys with dictionary values in dictionary of dictionaries

Given,
let input = [
    "A": ["X": 1, "Y": 2, "Z": 3],
    "B": ["X": 7, "Y": 8, "Z": 9],
]
how do you produce,
[
    "X": ["A": 1, "B": 7],
    "Y": ["A": 2, "B": 8],
    "Z": ["A": 3, "B": 9],
]
using functional programming?
You could do something like this:
let input = ["A": ["X": 1, "Y": 2, "Z": 3], "B": ["X": 7, "Y": 8, "Z": 9]]
let output = input.flatMap { okv in
okv.value.map { ikv in
(ikv.key, okv.key, ikv.value)
}
}
.reduce(into: [String:[(String,Int)]]()) { accum, value in
accum[value.0] = (accum[value.0] ?? []) + [(value.1,value.2)]
}
.reduce(into: [String:[String:Int]]()) { accum, value in
accum[value.key] = value.value.reduce(into: [String:Int]()) { iaccum, ivalue in
iaccum[ivalue.0] = ivalue.1
}
}
This works by first expanding the nested dictionaries into an array of tuples [("X", "A", 1), ("Y", "A", 2), ...] and then reducing that intermediate array back into a dictionary in two stages: first as a dictionary of arrays, then as a dictionary of dictionaries.
In fact, you can do a little bit better if you take advantage of default dictionary values:
let result = input.flatMap { okv in
    okv.value.map { ikv in
        (ikv.key, okv.key, ikv.value)
    }
}
.reduce(into: [String: [String: Int]]()) { accum, value in
    accum[value.0, default: [String: Int]()][value.1] = value.2
}
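For a quick sanity check (an added usage example; dictionary key order is unspecified):
print(result)
// ["X": ["A": 1, "B": 7], "Y": ["A": 2, "B": 8], "Z": ["A": 3, "B": 9]]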
I came up with something like the code below. I found this problem much easier to understand using an imperative approach rather than functional programming.
Note: I haven't covered edge cases.
class Test {
    var outputDict: [String: [String: Int]] = [:]

    func test() {
        let input = ["A": ["X": 1, "Y": 2, "Z": 3], "B": ["X": 7, "Y": 8, "Z": 9]]
        for (key, val) in input {
            var newDict: [String: Int] = [:]
            for (key1, val1) in val {
                generateInternalDict(&newDict, val1, key, key1)
            }
        }
        print(outputDict)
    }

    func generateInternalDict(_ newDict: inout [String: Int], _ value: Int, _ keyOuter: String, _ keyInner: String) {
        if outputDict[keyInner] == nil {
            newDict[keyOuter] = value
            outputDict[keyInner] = newDict
        } else {
            var getDictForKey = outputDict[keyInner]
            getDictForKey![keyOuter] = value
            outputDict[keyInner] = getDictForKey
        }
    }
}
Output:
["X": ["B": 7, "A": 1], "Z": ["B": 9, "A": 3], "Y": ["B": 8, "A": 2]]

multiply the number in the array if it matches the given one

I have an array like this, and I need to multiply a number by 5 if it is 3, but my reduce ends up eliminating all numbers equal to 3 and multiplying the rest instead. How do I fix this?
let arr = [2, 4, 3, 1, 4, 3, 1, 3, 10, 4, 2, 13]
let aaa = arr.reduce([]) { $1 == 3 ? $0 : $0 + [$1 * 5] }
//[10, 20, 5, 20, 5, 50, 20, 10, 65]
//[2,4,15,1,4,15,1,15,10,4,2,13] need this
You should use map instead of reduce:
let arr = [2, 4, 3, 1, 4, 3, 1, 3, 10, 4, 2, 13]
let result = arr.map { $0 == 3 ? 15 : $0 }
// [2, 4, 15, 1, 4, 15, 1, 15, 10, 4, 2, 13]
Or, if it should work for any multiple of 3:
let result = arr.map { $0.isMultiple(of: 3) ? $0 * 5 : $0 }
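If you did want to stay with reduce (an added sketch, not from the original answer), the fix is to append the untransformed element instead of dropping it, for example with reduce(into:):
let viaReduce = arr.reduce(into: [Int]()) { result, element in
    // Keep every element; only transform the 3s instead of filtering them out.
    result.append(element == 3 ? element * 5 : element)
}
// [2, 4, 15, 1, 4, 15, 1, 15, 10, 4, 2, 13]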

I'll be back to edit the question soon so I can ask questions again, as this is my only negative question and it says I need to improve.

You can use the Dictionary grouping initializer and map each group to its count:
let numbers = [
    "1, 2, 3, 4",
    "5, 6, 7, 8",
    "3, 4, 5, 6",
    "1, 2, 7, 8",
    "1, 2, 3, 4",
    "3, 4, 5, 6",
    "1, 2, 3, 4"]
let setFrequency = Dictionary(grouping: numbers) { $0 }
    .mapValues { $0.count }
Or, using reduce(into:):
let setFrequency = numbers.reduce(into: [:]) { $0[$1, default: 0] += 1 }
print(setFrequency) // ["3, 4, 5, 6": 2, "1, 2, 7, 8": 1, "5, 6, 7, 8": 1, "1, 2, 3, 4": 3]

In Swift how to write a func that turns a [String:[Int]] to [String:Int]

I was given a list of apps along with their ratings:
let appRatings = [
    "Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
    "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
    "Socialise": [2, 1, 2, 2, 1, 2, 4, 2]
]
I want to write a func that takes appRatings as input and returns each app's name with its average rating, like this:
["Calendar Pro": 3,
"The Messenger": 3,
"Socialise": 2]
Does anyone know how to implement such a method, so that it takes (name, [rating]) as input and outputs (name, avgRating), using a closure inside the func?
This is what I have so far.
func calculate(appName: String, ratings: [Int]) -> (String, Double) {
    let avg = ratings.reduce(0, +) / ratings.count
    return (appName, Double(avg))
}
Fundamentally, what you're trying to achieve is a mapping between one set of values into another. Dictionary has a function for this, Dictionary.mapValues(_:), specifically for mapping values only (keeping them under the same keys).
let appRatings = [
    "Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
    "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
    "Socialise": [2, 1, 2, 2, 1, 2, 4, 2]
]
let avgAppRatings = appRatings.mapValues { allRatings in
    return computeAverage(of: allRatings) // Dummy function we'll implement later
}
So now, it's a matter of figuring out how to average all the numbers in an Array. Luckily, this is very easy:
We need to sum all the ratings
We can easily achieve this with a reduce expression. We'll reduce all the numbers by simply adding them into the accumulator, which starts at 0:
allRatings.reduce(0, { accumulator, rating in accumulator + rating })
From here, we can notice that the closure, { accumulator, rating in accumulator + rating }, has type (Int, Int) -> Int and just adds the numbers together. Well hey, that's exactly what + does! We can just use it directly:
allRatings.reduce(0, +)
We need to divide the ratings by the number of ratings
There's a catch here. In order for the average to be of any use, it can't be truncated to a mere Int. So we need both the sum and the count to be converted to Double first.
You also need to guard against empty arrays, whose count will be 0; dividing a 0.0 sum by a 0.0 count yields Double.nan, not a useful average.
Putting it all together, we get:
let appRatings = [
    "Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
    "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
    "Socialise": [2, 1, 2, 2, 1, 2, 4, 2]
]
let avgAppRatings = appRatings.mapValues { allRatings -> Double? in
    if allRatings.isEmpty { return nil }
    return Double(allRatings.reduce(0, +)) / Double(allRatings.count)
}
Add in some nice printing logic:
extension Dictionary {
    var toDictionaryLiteralString: String {
        return """
        [
        \t\(self.map { k, v in "\(k): \(v)" }.joined(separator: "\n\t"))
        ]
        """
    }
}
... and boom:
print(avgAppRatings.toDictionaryLiteralString)
/* prints:
[
Socialise: 2.0
The Messenger: 3.0
Calendar Pro: 3.375
]
*/
Comments on your attempt
You had some questions as to why your attempt didn't work:
func calculate( appName: String, ratings : [Int]) -> (String: Int ) {
    var avg = ratings.reduce(0,$0+$1)/ratings.count
    return appName: sum/avg
}
$0+$1 isn't within a closure ({ }), as it needs to be.
appName: sum/avg isn't valid Swift.
The variable sum doesn't exist.
avg is a var variable, even though it's never mutated. It should be a let constant.
You're doing integer division, which doesn't support decimals. You'll need to convert your sum and count into a floating point type, like Double, first.
A fixed version might look like:
func calculateAverage(of numbers: [Int]) -> Double {
    let sum = Double(numbers.reduce(0, +))
    let count = Double(numbers.count)
    return sum / count
}
To make a function that processes your whole dictionary, incorporating my solution above, you might write a function like:
func calculateAveragesRatings(of appRatings: [String: [Int]]) -> [String: Double?] {
    return appRatings.mapValues { allRatings -> Double? in
        if allRatings.isEmpty { return nil }
        return Double(allRatings.reduce(0, +)) / Double(allRatings.count)
    }
}
This is a simple solution that takes into account that a rating is an integer:
let appRatings = [
    "Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
    "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
    "Socialise": [2, 1, 2, 2, 1, 2, 4, 2]
]
let appWithAverageRating: [String: Int] = appRatings.mapValues { $0.reduce(0, +) / $0.count }
print("appWithAverageRating =", appWithAverageRating)
prints appWithAverageRating = ["The Messenger": 3, "Calendar Pro": 3, "Socialise": 2]
If you'd like to check whether an app has enough ratings before returning an average rating, then the rating would be an optional Int:
let minimumNumberOfRatings = 0 // You can change this
var appWithAverageRating: [String: Int?] = appRatings.mapValues { ratingsArray -> Int? in
    guard ratingsArray.count > minimumNumberOfRatings else {
        return nil
    }
    return ratingsArray.reduce(0, +) / ratingsArray.count
}
If you'd like the ratings to go by half stars (0, 0.5, 1, ..., 4.5, 5) then we could use this extension:
extension Double {
    func roundToHalf() -> Double {
        let n = 1 / 0.5
        let numberToRound = self * n
        return numberToRound.rounded() / n
    }
}
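A quick check of the rounding behavior (an added usage example):
print(3.375.roundToHalf()) // 3.5
print(3.2.roundToHalf())   // 3.0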
Then the rating will be an optional Double. Let's add an AppWithoutRatings and test our code:
let appRatings = [
    "Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
    "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
    "Socialise": [2, 1, 2, 2, 1, 2, 4, 2],
    "AppWithoutRatings": []
]
let minimumNumberOfRatings = 0
var appWithAverageRating: [String: Double?] = appRatings.mapValues { ratingsArray -> Double? in
    guard ratingsArray.count > minimumNumberOfRatings else {
        return nil
    }
    // Convert before dividing; integer division would truncate 3.375 to 3 and defeat the half-star rounding.
    let rating = Double(ratingsArray.reduce(0, +)) / Double(ratingsArray.count)
    return rating.roundToHalf()
}
And this prints:
appWithAverageRating = ["Calendar Pro": Optional(3.5), "Socialise": Optional(2.0), "The Messenger": Optional(3.0), "AppWithoutRatings": nil]
I decided to make a Dictionary extension for this, so it is very easy to use in the future. Here is the code I created:
extension Dictionary where Key == String, Value == [Float] {
    func averageRatings() -> [String: Float] {
        // Calculate the average of a single ratings array
        func average(ratings: [Float]) -> Float {
            return ratings.reduce(0, +) / Float(ratings.count)
        }
        // Go through every item in the ratings dictionary
        return self.mapValues { $0.isEmpty ? 0 : average(ratings: $0) }
    }
}
let appRatings: [String: [Float]] = ["Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
                                     "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
                                     "Socialise": [2, 1, 2, 2, 1, 2, 4, 2]]
print(appRatings.averageRatings())
which will print the result of ["Calendar Pro": 3.375, "Socialise": 2.0, "The Messenger": 3.0].
Just to make the post complete, another approach using reduce(into:) that avoids a dictionary with an optional value type:
extension Dictionary where Key == String, Value: Collection, Value.Element: BinaryInteger {
    var averageRatings: [String: Value.Element] {
        return reduce(into: [:]) {
            if !$1.value.isEmpty {
                $0[$1.key] = $1.value.reduce(0, +) / Value.Element($1.value.count)
            }
        }
    }
}
let appRatings2 = ["Calendar Pro": [1, 5, 5, 4, 2, 1, 5, 4],
                   "The Messenger": [5, 4, 2, 5, 4, 1, 1, 2],
                   "Socialise": [2, 1, 2, 2, 1, 2, 4, 2]]
let keySorted = appRatings2.averageRatings.sorted(by: { $0.key < $1.key })
keySorted.forEach { print($0, $1) }
Calendar Pro 3
Socialise 2
The Messenger 3

how to print string in a dictionary on swift?

How can I print the type of the largest number in this dictionary?
let interestingNumbers = [
    "Prime": [2, 3, 5, 7, 11, 13],
    "Fibonacci": [1, 1, 2, 3, 5, 8],
    "Square": [1, 4, 9, 16, 25],
]
var largest = 0
var typeoflargest: String = " "
for (kind, numbers) in interestingNumbers {
    for type in kind.characters {
        for number in numbers {
            if number > largest {
                largest = number
                typeoflargest = String(type)
            }
        }
    }
}
print(largest)
print(typeoflargest)
output:
25
S
Why did I get only the first character "S" instead of "Square"?
There is no reason to be iterating over the characters of the kind string. Because of that inner loop, typeoflargest is assigned String(type), where type is a single Character: "S" happened to be the character in play when 25 was found, and no later number beat it. Just do the following:
let interestingNumbers = [
    "Prime": [2, 3, 5, 7, 11, 13],
    "Fibonacci": [1, 1, 2, 3, 5, 8],
    "Square": [1, 4, 9, 16, 25],
]
var largest = 0
var typeoflargest: String = ""
for (kind, numbers) in interestingNumbers {
    for number in numbers {
        if number > largest {
            largest = number
            typeoflargest = kind
        }
    }
}
print(largest)
print(typeoflargest)
Output:
25
Square
Alternative approach:
let interestingNumbers = [
    "Prime": [2, 3, 5, 7, 11, 13],
    "Fibonacci": [1, 1, 2, 3, 5, 8],
    "Square": [1, 4, 9, 16, 25],
]
let maximum = interestingNumbers
    .map { type, numbers in return (type: type, number: numbers.max()!) }
    .max(by: { $0.number < $1.number })!
print(maximum.type, maximum.number)
Explanation:
First, get the maximal element of each category. Do this by iterating the dictionary, mapping the values from arrays of numbers to maximum numbers (within their respective arrays), yielding:
[
    (type: "Square", number: 25), // 25 is the max of [1, 4, 9, 16, 25]
    (type: "Prime", number: 13), // 13 is the max of [2, 3, 5, 7, 11, 13]
    (type: "Fibonacci", number: 8) // 8 is the max of [1, 1, 2, 3, 5, 8]
]
Then, get the maximal type/number pair, by comparing their numbers, yielding the result:
(type: "Square", number: 25) // 25 is the max of 25, 13, 8