Swift (for in) immutable value

I am trying to create a function within the ViewController class. I want to loop through a string and count the number of times a specific character occurs. Code looks like this:
var dcnt: Int = 0

func decimalCount(inputvalue: String) -> Int {
    for chr in inputvalue.characters {
        if chr == “.” {
            ++dcnt
        }
    }
    return dcnt
}
The input string comes from a UILabel!
I get a warning: Immutable value 'chr' was never used.
How can I fix this problem?

The problem, as so often in Swift, lies elsewhere. It's the curly quotes. Put this:
if chr == "." {
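For reference, here is a corrected sketch of the whole function. Note that .characters and ++ are Swift 2 idioms; in Swift 3 and later you iterate the string directly and write += 1, and the counter is better kept local than stored as a property:

func decimalCount(inputvalue: String) -> Int {
    var dcnt = 0
    for chr in inputvalue {   // Swift 4+: a String is a Collection of Characters
        if chr == "." {       // straight quotes, not curly ones
            dcnt += 1         // ++ was removed in Swift 3
        }
    }
    return dcnt
}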


Declaring a Swift Character type that could hold only ASCII characters?

Swift's Character data type covers a very broad set of characters, and I have a use case where I need to declare a variable that can hold only ASCII characters. The type should not accept characters outside ASCII.
Do we have any built-in data types that are suitable for my use case, or do I need to write a custom solution?
If a custom solution is required, what are the best possible ways to achieve it?
As pointed out in the question comments, it seems this cannot be enforced at compile time.
A runtime solution could be:
You can check whether a Character is ASCII through its isASCII property, then create a custom type that only stores a value if the condition is satisfied.
struct CustomASCIIClass {
    private var storedValue: Character? = nil

    var value: Character? {
        get {
            return self.storedValue
        }
        set(newValue) {
            if newValue?.isASCII ?? false {
                self.storedValue = newValue
            } else {
                // Handle the invalid value here: throw, or leave the variable nil.
                print("invalid:", newValue as Any)
            }
        }
    }
}
Usage:
var a = CustomASCIIClass()
a.value = Character("A")
print(a.value) // Optional("A")
var b = CustomASCIIClass()
b.value = Character("😎")
print(b.value) // nil
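A variation on the same idea (my own sketch, not from the answer above) is to reject non-ASCII input at construction time with a failable initializer, so that an invalid instance can never exist. Like isASCII, this needs a recent Swift:

struct ASCIIChar {
    let value: Character

    // Failable initializer: returns nil for any non-ASCII character.
    init?(_ value: Character) {
        guard value.isASCII else { return nil }
        self.value = value
    }
}

print(ASCIIChar("A")?.value as Any)  // Optional("A")
print(ASCIIChar("😎")?.value as Any) // nil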

Argument labels do not match any available overloads

I'm trying to create an anagram tester, and I'm pretty sure the code I have should work, but I'm getting the error 'Argument labels '(_:)' do not match any available overloads'. I've looked at other posts regarding the same error, but I'm still not sure what it means or how to fix it.
var anagram1: String!
var anagram2: String!
var failure: Bool = false
var counter: Int = 0

print("Please enter first word: ")
anagram1 = readLine()
print("Please enter Second word: ")
anagram2 = readLine()

if anagram1.count == anagram2.count {
    for i in anagram1.characters {
        if (!failure) {
            failure = true
            for y in anagram2.characters {
                counter += 1
                if i == y {
                    failure = false
                    anagram2.remove(at: String.Index(counter)) // error here
                }
            }
        }
        else {
            print("these words are not anagrams")
            break;
        }
    }
    if (!failure) {
        print("these words ARE anagrams")
    }
}
else {
    print("these words aren't even the same length you fucking nonce")
}
To answer your first question: the error message Argument labels '(_:)' do not match any available overloads means that you've given a function parameter names or types that don't match anything Swift knows about.
The compiler is also trying to tell you which parameters to look at. '(_:)' says that you're calling a function with one unlabeled parameter (that is, a value passed without any parameter name). A common example of a function called this way is print("something"); in Swift documentation, it would be written as print(_:).
Finally, overloads are ways to call a function with different information. Again using the print function as an example, you can call it multiple ways. A couple of the most common overloads would be:
// print something, followed by a newline character
print("something")
// print something, but stay on the same line
// (end with an empty string instead of the default newline character)
print("something", terminator: "")
Documented, these might look like print(_:) and print(_:terminator:).
Note: these are broken down for explanation. The actual Swift documentation shows func print(_: Any..., separator: String, terminator: String) which covers a number of different overloads!
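To make the distinction concrete (my own illustration, not from the original answer), here is what two overloads of one function name look like when you define them yourself, with their documented forms in the comments:

// Two overloads of describe: one with an unlabeled parameter, one labeled.
func describe(_ value: Int) -> String {      // documented as describe(_:)
    return "Int: \(value)"
}

func describe(hex value: Int) -> String {    // documented as describe(hex:)
    return "0x" + String(value, radix: 16)
}

print(describe(255))      // Int: 255
print(describe(hex: 255)) // 0xff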
Looking at the line where the error occurs, you see a function call and an initializer (which is essentially a function). Documented, the way you've entered the parameters, the functions would look like: remove(at:) and String.Index(_:).
String.Index(_:) matches the parameters of the error message, so that's where your error is. There is no overload of the String.Index initializer that takes an unnamed parameter.
To fix this error, you need to find the correct way to create a String.Index parameter for the remove(at:) function. One way might be to try something like this:
for y in anagram2.characters.enumerated() {
    // `y` now represents a tuple: (offset: Int, element: Character),
    // so you don't need `counter` anymore; use `offset` instead.
    if i == y.element { // `i` is a Character, so it can be compared to `element`
        ...
        let yIndex: String.Index = anagram2.index(anagram2.startIndex, offsetBy: y.offset)
        anagram2.remove(at: yIndex)
        ...
    }
}
However, there are other issues with your code that will cause further errors.
For one, you're looping over a string (anagram2) and mutating it at the same time, which is not a good thing to do.
Good luck to you in solving the anagram problem!
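As an aside (my own sketch, not part of the original answer), a much shorter way to test for anagrams in Swift is to compare the sorted characters of both words, which also avoids mutating a string while iterating over it:

func isAnagram(_ a: String, _ b: String) -> Bool {
    // Two words are anagrams exactly when their sorted characters match.
    return a.lowercased().sorted() == b.lowercased().sorted()
}

print(isAnagram("listen", "silent")) // true
print(isAnagram("hello", "world"))   // false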
Thanks for the help, Leo, but I found a way of doing it :)
if anagram1.count == anagram2.count {
    for i in anagram1.characters {
        if (!failure) {
            counter = -1
            failure = true
            for y in anagram2.characters {
                counter += 1
                if i == y {
                    failure = false
                    if counter < anagram2.count {
                        anagram2.remove(at: anagram2.index(anagram2.startIndex, offsetBy: counter))
                        break;
                    }
                }
            }
        }
        else {
            print("these words are not anagrams")
            break;
        }
    }
    if (!failure) {
        print("these words ARE anagrams")
    }
}

Counting character frequencies in a Swift string

I'm trying to convert Java code to Swift and am facing this issue:
single-quoted string literal found, use '"'
charArray[(s[i].asciiValue)! - ('a'.asciiValue)!]++
                                ^~~
                                "a"
Java Code:
for (String s : str) {
    char[] arr = new char[26];
    for (int i = 0; i < s.length(); i++) {
        arr[s.charAt(i) - 'a']++;
    }
}
Swift Code:
extension String {
    var asciiArray: [UInt32] {
        return unicodeScalars.filter { $0.isASCII }.map { $0.value }
    }
}

extension Character {
    var asciiValue: UInt32? {
        return String(self).unicodeScalars.filter { $0.isASCII }.first?.value
    }
}

class GroupXXX {
    func groupXXX(strList: [String]) {
        for str in strList {
            var charArray = [Character?](repeating: nil, count: 26)
            var s = str.characters.map { $0 }
            for i in 0..<s.count {
                charArray[(s[i].asciiValue)! - ('a'.asciiValue)!]++
            }
        }
    }
}
There are several problems in your Swift code:
- There are no single-quoted character literals in Swift (as already explained by JeremyP).
- The ++ operator has been removed in Swift 3.
- s[i] does not compile because Swift strings are not indexed by integers.
- Defining the array as [Character?] makes no sense, and you cannot increment a Character?. The Swift equivalent of the Java char would be UInt16.
- You don't check whether the character is in the range "a"..."z".
Apparently you want to count the number of occurrences of each character "a" to "z" in a string.
This is how I would do it in Swift:
- Define the "frequency" array as an array of integers.
- Enumerate the unicodeScalars property of the string.
- Use a switch statement to check for the valid range of characters.
The custom extensions are then no longer needed, and the code becomes:
var frequencies = [Int](repeating: 0, count: 26)
for c in str.unicodeScalars {
    switch c {
    case "a"..."z":
        frequencies[Int(c.value - UnicodeScalar("a").value)] += 1
    default:
        break // ignore all other characters
    }
}
charArray[(s[i].asciiValue)! - ('a'.asciiValue)!]++
As the error says, use double quotes: Swift doesn't have a separate literal syntax for characters versus strings (a Character in Swift may itself be a sequence of bytes forming one grapheme cluster).
You may need to explicitly force the literal to be a Character if the compiler can't infer it:
charArray[(s[i].asciiValue)! - (Character("a").asciiValue)!]++
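As a side note (my own sketch, not from either answer), modern Swift can skip the fixed-size array and the ASCII arithmetic entirely by counting into a dictionary:

let str = "swift is swell"
var frequencies: [Character: Int] = [:]
for c in str where ("a"..."z").contains(c) {
    frequencies[c, default: 0] += 1 // Swift 4+ default-value subscript
}
print(frequencies["s"] ?? 0) // 3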

Swift: Separating every character of a word by space

I am trying to make an app that takes an input string, separates every character with a space, and finally shows the result. For example, when I input "pizza" it should output "p i z z a". Unfortunately, the following code, which I wrote, does not work:
@IBOutlet var input: UITextField!
@IBOutlet var output: UITextField!

@IBAction func split(sender: AnyObject) {
I think the problem lies in the following for-in:
    for character in input.text!.characters.indices {
        input.text = String(input.text![character], terminator: "")
    }
    output.text = input.text
}
I am new to programming and I was trying to find the solution on the web, but I did not manage to. Could you help me?
You can create an array of your string characters and use joinWithSeparator to join it with a space:
extension String {
    var spacedString: String {
        return characters.map { String($0) }.joinWithSeparator(" ")
    }
}
"pizza".spacedString
You made three mistakes here:
1) You are assigning a new value to input.text on each pass through the loop rather than appending to the end of a result string.
2) You loop through input.text and at the same time change the value inside input.text, which causes problems when evaluating input.text![character].
3) You don't need String(input.text![character], terminator: ""); "\(input.text![character]) " works just as well.
Leo Dabus's answer should work for you, but I also want to provide a lower-level version:
var result = String()
for character in input.text!.characters {
    result.appendContentsOf("\(character) ")
}
output.text = result.stringByTrimmingCharactersInSet(.whitespaceAndNewlineCharacterSet())

How to handle initial nil value for reduce functions

I would like to learn and use more functional programming in Swift, so I've been trying various things in a playground. I don't understand reduce, though. The basic textbook examples work, but I can't get my head around this problem.
I have an array of strings called "toDoItems", and I would like to get the longest string in it. What is the best practice for handling the initial nil value in such cases? I imagine this comes up often. I thought of writing a custom function and using it:
func optionalMax(maxSofar: Int?, newElement: Int) -> Int {
    if let definiteMaxSofar = maxSofar {
        return max(definiteMaxSofar, newElement)
    }
    return newElement
}

// Just testing - nums is an array of Ints. Works.
var maxValueOfInts = nums.reduce(0) { optionalMax($0, $1) }

// ERROR: cannot invoke 'reduce' with an argument list of type '(nil, (_,_)->_)'
var longestOfStrings = toDoItems.reduce(nil) { optionalMax(count($0), count($1)) }
It might just be that Swift does not automatically infer the type of your initial value. Try making it clear by explicitly declaring it:
var longestOfStrings = toDoItems.reduce(nil as Int?) { optionalMax($0, count($1)) }
By the way, notice that I do not call count on $0 (your accumulator), since it is not a String but an optional Int (Int?).
Generally, to avoid confusion when reading the code later, I explicitly label the accumulator as a and the element coming in from the sequence as x:
var longestOfStrings = toDoItems.reduce(nil as Int?) { a, x in optionalMax(a, count(x)) }
This reads more clearly than $0 and $1 when both the accumulator and the single element are used.
Hope this helps
Initialise it with an empty string "" rather than nil. Or you could even initialise it with the first element of the array, but an empty string seems better.
Second go at this after writing some wrong code: the following returns the longest string, if you are happy with an empty string being returned for an empty array:
toDoItems.reduce("") { count($0) > count($1) ? $0 : $1 }
Or if you want nil, use
toDoItems.reduce(nil as String?) { count($0 ?? "") > count($1) ? $0 : $1 }
The problem is that the compiler cannot infer the types of your seed and accumulator closure if you seed with a bare nil, and you also need to unwrap the optional accumulator safely (here with ?? "") before calling count on it.
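As a closing aside (my own addition, in modern Swift), the standard library's max(by:) sidesteps the nil-seed question entirely: it already returns an optional, and that optional is nil only for an empty array:

let toDoItems = ["buy milk", "call the plumber", "nap"]

// max(by:) returns nil for an empty sequence, so no seed value is needed.
let longest = toDoItems.max { $0.count < $1.count }
print(longest ?? "no items") // call the plumber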