'+' is deprecated: Mixed-type addition is deprecated in Swift 3.1 - swift

When I'm directly adding an integer value (i.e. 1, 2, 3, etc.) to another integer variable
let arr:Array = ["One","Two"]
var value:Int64 = 0
value = arr.count + 1 //in this line
I get the following warning:
'+' is deprecated: Mixed-type addition is deprecated. Please use explicit type conversion.
I fixed the warning with this:
value = Int64(arr.count + 1)
Though it is fixed, I want to know why it's called mixed-type addition, as I didn't use ++. Also, is there a better way to fix the warning in Swift 3.1?
Update:
The following image is proof of the warning. I'm using Xcode Version 8.3 (8E162).
allROR is an array here.

Edit: To reproduce the warning with your own code, it should be like this:
let value = 5
let result: Int64 = value + 1
Now you get the warning
'+' is deprecated: Mixed-type addition is deprecated. Please use explicit type conversion.
But it looks like the warning is misleading: both value and 1 are of type Int, so their sum is also an Int. You simply need to convert the result to Int64, which is what you are doing, and that is perfectly OK.
let result: Int64 = Int64(value + 1)

Just to answer this part: why it's called Mixed-type addition
With the simplified example by Nirav D:
let value = 5
let result: Int64 = value + 1
You can Command-click on + and see the generated interface of Collection:
(After Indexing has finished, of course.)
@available(swift, deprecated: 3.0, obsoleted: 4.0, message: "Mixed-type addition is deprecated. Please use explicit type conversion.")
public func +<T>(lhs: T.Stride, rhs: T) -> T where T : SignedInteger
So, in the code example above, the type of 1 is inferred as Int64, and as Int64.Stride == Int, the operation value + 1 matches the signature func +<T>(lhs: T.Stride, rhs: T) -> T where T : SignedInteger.
This deprecation is included in the revised version of SE-0104 Protocol-oriented integers, this part:
Standard Library no longer provides + and - operators for Strideable types. They were problematic, as one could have written mixed-type code like let x: Int64 = 42; x += (1 as Int), which would compile, but shouldn't. Besides, since the Stride of an unsigned type is signed, Standard Library had to implement a hack to make code like let x: UInt = 42; x += (1 as Int) ambiguous. These operators were only necessary because they made advancing collection indices convenient, which is no longer the case since the introduction of the new indexing model in Swift 3.
As you already have seen, you can avoid this warning in many ways.
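For example, here is a quick sketch (Swift 3.1 syntax, with declarations matching the question) of a few equivalent fixes; numericCast is the standard library's generic integer-conversion helper:
let arr = ["One", "Two"]
var value: Int64 = 0

value = Int64(arr.count + 1)       // convert the Int result
value = Int64(arr.count) + 1       // convert the Int operand; 1 is then inferred as Int64
value = numericCast(arr.count + 1) // let the compiler infer the target type from context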

The data types are different; that is why it shows the warning.
You need to make the variable and the constant the same data type, e.g.:
let result = Int64(value) + 1 //in this line

OK
var arr = [String]()
var i: Int64 = 0
if arr.count == 0 {
    i = 1
} else {
    i = arr.count + 1
}
gives a warning: '+' is deprecated: Mixed-type addition is deprecated. Please use explicit type conversion.
The reason is that arr.count and i have different types, and this warning is right. It has nothing to do with the integer literal 1.
this snippet gives you the warning too
var arr = [String]()
var i: Int64 = 0
if arr.count == 0 {
    i = 1
} else {
    i = 1
    i += arr.count // here is the warning now
}
this will not compile, even though it looks very similar
var arr = [String]()
var i: Int64 = 0
if arr.count == 0 {
    i = 1
} else {
    let tmp = arr.count + 1
    i = tmp
}
I hope we get an error message when we compile all of these snippets in a future release.

How to get character from Unicode in correct way when UnicodeScalars are not working?

I searched for the answer in the past days, and a lot of the results are very old (around Swift 2 and 1.2).
I wanted to get characters when the Unicode code is taken from a variable, because for some unknown reason this construction won't work in Swift:
print("\u{\(variable)}") // should be proposal for including this in Swift 6
People advise using UnicodeScalars. However, Apple must have introduced something new in Swift 5. I found a tutorial, but this code fragment
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)) // 大 error here
let string2 = String(UnicodeScalar(value2)) // 🍎 error here
is not working; with string1 and string2 I get the error "no exact matches in call to initializer". So I understand it must have worked in a previous version of Swift, but with the latest it does not. What has changed under the hood? The Strings section of Apple's handbook does not reveal anything.
I am trying to rewrite some TypeScript code in Swift, and in JS it is as simple as:
for (let i = str.length; i >= 1; i -= 2) {
r = String.fromCharCode(parseInt("0x" + str.substring(i - 2, i))) + r;
}
and I've been struggling with this for the past 2 days without success!
The UnicodeScalar initializers taking a UInt16 or UInt32 argument are failable initializers (and return nil if the passed argument is not a valid Unicode scalar value).
public init?(_ v: UInt16)
public init?(_ v: UInt32)
The optional must be unwrapped before passing it to the String initializer.
Only the initializer taking a UInt8 argument is non-failable:
public init(_ v: UInt8)
So this compiles and produces the expected result:
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)!) // 大
// ^--- unwrap optional
let string2 = String(UnicodeScalar(value2)!) // 🍎
// ^--- unwrap optional
Of course, in your real code, you would use optional binding and not forced unwrapping.
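For example, a minimal sketch of the optional-binding variant, using the same values as above:
if let scalar = UnicodeScalar(value1) {
    print(String(scalar)) // 大
} else {
    print("\(value1) is not a valid Unicode scalar value")
}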

Conforming String.CharacterView.Index to Strideable: fatal error when using stride(to:by:): "cannot increment endIndex "

Question:
When attempting to stride over String.CharacterView.Index indices by e.g. a stride of 2
extension String.CharacterView.Index : Strideable { }
let str = "01234"
for _ in str.startIndex.stride(to: str.endIndex, by: 2) { } // fatal error
I get the following runtime exception
fatal error: cannot increment endIndex
Just creating the StrideTo<String.CharacterView.Index> above, however (let foo = str.startIndex.stride(to: str.endIndex, by: 2)), does not yield an error; it occurs only when attempting to stride/iterate over it or operate on it (.next()?).
What is the reason for this runtime exception; is it expected (a misuse of conformance to Strideable)?
I'm using Swift 2.2 and Xcode 7.3. Details follow below.
Edit addition: error source located
Upon reading my question carefully, it would seem as if the error really does occur in the next() method of StrideToGenerator (see bottom of this post), specifically at the following marked line
let ret = current
current += stride // <-- here
return ret
Even if the last update of current will never be returned (in next call to next()), the final advance of current index to a value larger or equal to that of _end yields the specific runtime error above (for Index type String.CharacterView.Index).
(0..<4).startIndex.advancedBy(4) // OK, -> 4
"foo".startIndex.advancedBy(4) // fatal error: cannot increment endIndex
However, one question still remains:
Is this a bug in the next() method of StrideToGenerator, or just an error that pops up due to a misuse of String.CharacterView.Index's conformance to Strideable?
Related
The following Q&A is related to the subject of iterating over characters in steps other than +1, and is worth including in this question even if the two questions differ.
Using String.CharacterView.Index.successor() in for statements
Especially note @Sulthan's neat solution in the thread above.
Details
(Apologies for hefty details/investigations of my own, just skip these sections if you can answer my question without the details herein)
The String.CharacterView.Index type describes a character position, and:
conforms to Comparable (and in doing so, Equatable),
contains implementations of advancedBy(_:) and distanceTo(_:).
Hence, it can directly be made to conform to the protocol Strideable, making use of Strideable's default implementations of the methods stride(through:by:) and stride(to:by:). The examples below will focus on the latter (analogous problems exist with the former):
...
func stride(to end: Self, by stride: Self.Stride) -> StrideTo<Self>
Returns the sequence of values (self, self + stride, self + stride +
stride, ... last) where last is the last value in the progression
that is less than end.
Conforming to Strideable and striding by 1: all good
Extending String.CharacterView.Index to conform to Strideable and striding by 1 works fine:
extension String.CharacterView.Index : Strideable { }
var str = "0123"
// stride by 1: all good
str.startIndex.stride(to: str.endIndex, by: 1).forEach {
    print($0, str.characters[$0])
}
/* 0 0
   1 1
   2 2
   3 3 */
For an even number of indices in str above (indices 0..<4), this also works for a stride of 2:
// stride by 2: OK for even number of characters in str.
str.startIndex.stride(to: str.endIndex, by: 2).forEach {
    print($0, str.characters[$0])
}
/* 0 0
   2 2 */
However, for some cases of striding by >1: runtime exception
For an odd number of indices and a stride of 2, however, striding over the character view's indices yields a runtime error:
// stride by 2: fatal error for odd number of characters in str.
str = "01234"
str.startIndex.stride(to: str.endIndex, by: 2).forEach {
    print($0, str.characters[$0])
}
/* 0 0
   2 2
   fatal error: cannot increment endIndex */
Investigations of my own
My own investigations into this made me suspect the error comes from the next() method of the StrideToGenerator structure, possibly when this method calls += on the Strideable element
public func += <T : Strideable>(inout lhs: T, rhs: T.Stride) {
lhs = lhs.advancedBy(rhs)
}
(from a version of the Swift source for swift/stdlib/public/core/Stride.swift that somewhat corresponds to Swift 2.2). Given the following Q&As:
Trim end off of string in swift, getting error at runtime,
Swift distance() method throws fatal error: can not increment endIndex,
we could suspect that we would possibly need to use String.CharacterView.Index.advancedBy(_:limit:) rather than ...advancedBy(_:) above. However from what I can see, the next() method in StrideToGenerator guards against advancing the index past the limit.
Edit addition: the source of the error seems to indeed be located in the next() method in StrideToGenerator:
// ... in StrideToGenerator
public mutating func next() -> Element? {
    if stride > 0 ? current >= end : current <= end {
        return nil
    }
    let ret = current
    current += stride /* <-- will increase current to larger or equal to end
                            if stride is large enough (even if this last current
                            will never be returned in next call to next()) */
    return ret
}
Even if the last update of current will never be returned (in next call to next()), the final advance of current index to a value larger or equal to that of end yields the specific runtime error above, for Index type String.CharacterView.Index.
(0..<4).startIndex.advancedBy(4) // OK, -> 4
"foo".startIndex.advancedBy(4) // fatal error: cannot increment endIndex
Is this to be considered a bug, or is String.CharacterView.Index simply not intended to (directly) conform to Strideable?
Simply declaring the protocol conformance
extension String.CharacterView.Index : Strideable { }
compiles because String.CharacterView.Index conforms to
BidirectionalIndexType, and ForwardIndexType/BidirectionalIndexType have default method implementations for advancedBy() and distanceTo()
as required by Strideable.
Strideable has the default protocol method implementation
for stride():
extension Strideable {
// ...
public func stride(to end: Self, by stride: Self.Stride) -> StrideTo<Self>
}
So the only methods which are "directly" implemented for
String.CharacterView.Index are, as far as I can see, the successor() and predecessor() methods from BidirectionalIndexType.
As you already figured out, the default method implementation of
stride() does not work well with String.CharacterView.Index.
But it is always possible to define dedicated methods for a concrete type. For the problems of making String.CharacterView.Index conform to Strideable, see
Vatsal Manot's answer below and the discussion in the comments – it took me a while to get what he meant :)
Here is a possible implementation of a stride(to:by:) method for String.CharacterView.Index:
extension String.CharacterView.Index {
    typealias Index = String.CharacterView.Index

    func stride(to end: Index, by stride: Int) -> AnySequence<Index> {
        precondition(stride != 0, "stride size must not be zero")
        return AnySequence { () -> AnyGenerator<Index> in
            var current = self
            return AnyGenerator {
                if stride > 0 ? current >= end : current <= end {
                    return nil
                }
                defer {
                    current = current.advancedBy(stride, limit: end)
                }
                return current
            }
        }
    }
}
This seems to work as expected:
let str = "01234"
str.startIndex.stride(to: str.endIndex, by: 2).forEach {
print($0,str.characters[$0])
}
Output
0 0
2 2
4 4
To simply answer your ending question: this is not a bug. This is normal behavior.
String.CharacterView.Index can never exceed the endIndex of the parent construct (i.e. the character view), and thus triggers a runtime error when forced to (as correctly noted in the latter part of your answer). This is by design.
The only solution is to write your own alternative to the stride(to:by:), one that avoids equalling or exceeding the endIndex in any way.
As you know already, you can technically implement Strideable, but you cannot prevent that error. And since stride(to:by:) is not blueprinted within the protocol itself but introduced in an extension, there is no way you can use a "custom" stride(to:by:) in a generic scope (i.e. <T: Strideable> etc.). Which means you should probably not try and implement it unless you are absolutely sure that there is no way that error can occur; something which seems impossible.
Solution: There isn't one, currently. However, if you feel that this is an issue, I encourage you to start a thread in the swift-evolution mailing list, where this topic would be best received.
This isn't really an answer; it's just that your question got me playing around. Let's ignore Strideable and just try striding through a character view:
let str = "01234"
var i = str.startIndex
// i = i.advancedBy(1)
let inc = 2
while true {
    print(str.characters[i])
    if i.distanceTo(str.endIndex) > inc {
        i = i.advancedBy(inc)
    } else {
        break
    }
}
As you can see, it is crucial to test with distanceTo before we call advancedBy. Otherwise, we risk attempting to advance right through the end index and we'll get the "fatal error: can not increment endIndex" bomb.
So my thought is that something like this must be necessary in order to make the indices of a character view stridable.
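As a rough sketch (Swift 2.2 syntax) of that thought, the distanceTo check can be packaged into a small helper; forEachIndex is a hypothetical name, not a standard library method:
extension String.CharacterView {
    func forEachIndex(by stride: Int, body: (Index) -> Void) {
        precondition(stride > 0, "this sketch only handles positive strides")
        var i = startIndex
        while i != endIndex {
            body(i)
            // only advance if there is room, otherwise stop here
            if i.distanceTo(endIndex) > stride {
                i = i.advancedBy(stride)
            } else {
                break
            }
        }
    }
}
str.characters.forEachIndex(by: 2) { print(str.characters[$0]) } // 0, 2, 4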

Swift: what's the difference between Array<OtherModule.MyType>() and [OtherModule.MyType]()

I'm using a type from a different module, let's call it OtherModule.MyType,
This code:
var a = [OtherModule.MyType]()
will produce an error invalid use of '()' to call a value of non-function type '[MyType.Type]'
This code won't:
var ax = [OtherModule.MyType]
But I believe ax is not an array any more, since this code
ax.append(OtherModule.MyType())
will cause an error Cannot invoke 'append' with an argument list of '(MyType)'
So I wonder what ax really is?
Besides, this code works fine:
var ay = Array<OtherModule.MyType>()
ay.append(OtherModule.MyType())
UPDATE:
I'm using swift 1.2 with Xcode 6.3
For some reason best known to the Swift team (modules are very scantily documented), Module.Thing behaves differently to Thing.
While Int is just a type name:
let i: Int = 1 // fine
// not fine, "expected member name or constructor call after type name"
let j = Int
Swift.Int can be both:
// used as a type name
let k: Swift.Int = 1
let t = Swift.Int.self
// but also used as a value
let x = Swift.Int
// equivalent to this
let y = Int.self
toString(x) == toString(y) // true
Under some uses it only wants to be a value, not a type name though. Hence this works:
// a will be of type [Int.Type], initialized with an array
// literal of 1 element, the Int metatype
let a = [Swift.Int]
But trying to use it as a type name in this context fails: [Swift.Int]() is no more valid than writing [1]() or let strs = ["fred"]; strs().
This behaviour seems a little arbitrary, and may even be a bug/unintentional.
Since the only way in which Swift.Int can be used in this context:
Array<Swift.Int>()
is as a type not a value (since only types can go between the angle brackets), it kind of makes sense that this works while the more ambiguous array literal syntax behaves differently.
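If you want to keep the sugar syntax, one workaround sketch (assuming OtherModule is imported as in the question) is to introduce a local typealias, so the element type is no longer spelled with the module prefix:
typealias MyType = OtherModule.MyType

var a = [MyType]()   // compiles
a.append(MyType())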

How to create a type that holds either an `Array<Int>` or an `UnsafePointer<UInt8>`

I'm doing some performance testing of Swift vs Objective-C.
I created a Mac OS hybrid Swift/Objective-C project that creates large arrays of prime numbers using either Swift or Objective-C.
It's got a decent UI and shows the results in a clear display. You can check out the project on Github if you're interested. It's called SwiftPerformanceBenchmark.
The Objective-C code uses a malloc'ed C array of ints, and the Swift code uses an Array object.
The Objective-C code is therefore a lot faster.
I've read about creating an Array-like wrapper around a buffer of bytes using code like this:
let size = 10000
var ptr = UnsafePointer<Int>(malloc(UInt(size)))
var bytes = UnsafeBufferPointer<Int>(start: ptr, count: size)
I'd like to modify my sample program so I can switch between my Array<Int> storage and using an UnsafeBufferPointer<Int> at runtime with a checkbox in the UI.
Thus I need a base type for my primes array that will hold either an Array<Int> or an UnsafeBufferPointer<Int>. I'm still too weak on Swift syntax to figure out how to do this.
For my Array-based code, I'll have to use array.append(value), and for the UnsafeBufferPointer<Int>, which is pre-filled with data, I'll use array[index]. I guess if I have to, I could pre-populate my Array object with placeholder values so I could use array[index] syntax in both cases.
Can somebody give me a base type that can hold either an Array<Int> or an UnsafeBufferPointer<Int>, and the type-casts to allocate either type at runtime?
EDIT:
Say, for example, I have the following:
let count = 1000
var swiftArray:[Int]?
let useSwiftArrays = checkbox.isChecked
typealias someType = //A type that lets me use either unsafeArray or swiftArray
var primesArray: someType?
if useSwiftArrays
{
    //Create a swift array version
    swiftArray = [Int](count: count, repeatedValue: 0)
    primesArray = someType(swiftArray)
}
else
{
    var ptr = UnsafePointer<Int>(malloc(UInt(count * sizeof(Int))))
    var unsafeArray = UnsafeBufferPointer<Int>(start: ptr, count: count)
    primesArray = someType(unsafeArray)
}
if let requiredPrimes = primesArray
{
    requiredPrimes[0] = 2
}
@MartinR's suggestion should help you get code that can switch between the two. But there's a shortcut you can take to prove whether the performance difference is between Swift arrays and C arrays, and that's to switch the Swift compiler optimization to -Ounchecked. Doing this eliminates the bounds checks on array indices etc. that you would be doing manually by using unsafe pointers.
If I download your project from github and do that, I find that the Objective-C version is twice as fast as the Swift version. But... that’s because sizeof(int) is 4, but sizeof(Int) is 8. If you switch the C version to use 8-byte arithmetic as well...
p.s. it works the other way around as well, if I switch the Swift code to use UInt32, it runs at 2x the speed.
OK, it’s not pretty, but here is a generic function that will work on any kind of collection. That means you can pass in either an Array or an UnsafeMutableBufferPointer, so you can use it on a malloc’d memory range or via the array’s .withUnsafeMutableBufferPointer.
Unfortunately, some of the necessities of the generic version make it slightly less efficient than the non-generic version when used on an array. But it does show quite a nice performance boost over arrays in -O when used with a buffer:
func storePrimes<C: MutableCollectionType where C.Generator.Element: IntegerType>(inout store: C) {
    if isEmpty(store) { return }
    var candidate: C.Generator.Element = 3
    var primeCount = store.startIndex
    store[primeCount++] = 2
    var isPrime: Bool
    while primeCount != store.endIndex {
        isPrime = true
        var oldPrimeCount = store.startIndex
        for oldPrime in store {
            if oldPrimeCount++ == primeCount { break }
            if candidate % oldPrime == 0 { isPrime = false; break }
            if candidate < oldPrime &* oldPrime { isPrime = true; break }
        }
        if isPrime { store[primeCount++] = candidate }
        candidate = candidate.advancedBy(2)
    }
}
let totalCount = 2_000_000
var primes = Array<CInt>(count: totalCount, repeatedValue: 0)
let startTime = CFAbsoluteTimeGetCurrent()
storePrimes(&primes)
// or…
primes.withUnsafeMutableBufferPointer { (inout buffer: UnsafeMutableBufferPointer<CInt>) -> Void in
storePrimes(&buffer)
}
let now = CFAbsoluteTimeGetCurrent()
let totalTime = now - startTime
println("Total time: \(totalTime), per second: \(Double(totalCount)/totalTime)")
I am not 100% sure if I understand your problem correctly, but perhaps
this goes in the direction that you need.
Both Array and UnsafeMutableBufferPointer conform to MutableCollectionType (which requires a subscript getter and setter).
So this function would accept both types:
func foo<T : MutableCollectionType where T.Generator.Element == Int, T.Index == Int>(inout storage : T) {
storage[0] = 1
storage[1] = 2
}
Example with buffer pointer:
let size = 2
var ptr = UnsafeMutablePointer<Int>(malloc(UInt(size * sizeof(Int))))
var buffer = UnsafeMutableBufferPointer<Int>(start: ptr, count: size)
foo(&buffer)
for elem in buffer {
println(elem)
}
Example with array:
var array = [Int](count: 2, repeatedValue: 0)
foo(&array)
for elem in array {
println(elem)
}
For non-mutating functions you can use CollectionType
instead of MutableCollectionType.
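To illustrate, here is a minimal sketch of such a non-mutating function (reusing the buffer and array from the examples above); sumOfFirstTwo is just an arbitrary name:
func sumOfFirstTwo<T : CollectionType where T.Generator.Element == Int, T.Index == Int>(storage : T) -> Int {
    return storage[0] + storage[1]
}

sumOfFirstTwo(buffer) // 3
sumOfFirstTwo(array)  // 3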

Swift: How to use sizeof?

In order to integrate with C APIs while using Swift, I need to use the sizeof function. In C, this was easy. In Swift, I am in a labyrinth of type errors.
I have this code:
var anInt: Int = 5
var anIntSize: Int = sizeof(anInt)
The second line has the error "'NSNumber' is not a subtype of 'T.Type'". Why is this and how do I fix it?
Updated for Swift 3
Be careful that MemoryLayout<T>.size means something different than sizeof in C/Obj-C. You can read this old thread https://devforums.apple.com/message/1086617#1086617
Swift uses a generic type to make it explicit that the number is known at compile time.
To summarize, MemoryLayout<Type>.size is the space required for a single instance while MemoryLayout<Type>.stride is the distance between successive elements in a contiguous array. MemoryLayout<Type>.stride in Swift is the same as sizeof(type) in C/Obj-C.
To give a more concrete example:
struct Foo {
let x: Int
let y: Bool
}
MemoryLayout<Int>.size // returns 8 on 64-bit
MemoryLayout<Bool>.size // returns 1
MemoryLayout<Foo>.size // returns 9
MemoryLayout<Foo>.stride // returns 16 because of alignment requirements
MemoryLayout<Foo>.alignment // returns 8, addresses must be multiples of 8
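As a small illustration of where stride (rather than size) is the right quantity, here is a sketch (Swift 3 syntax) of manually allocating raw storage for several Foo values:
let count = 4
let byteCount = count * MemoryLayout<Foo>.stride   // 4 * 16 = 64, not 4 * 9
let raw = UnsafeMutableRawPointer.allocate(bytes: byteCount,
                                           alignedTo: MemoryLayout<Foo>.alignment)
// ... bind the memory to Foo and use it ...
raw.deallocate(bytes: byteCount, alignedTo: MemoryLayout<Foo>.alignment)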
Use sizeof as follows:
let size = sizeof(Int)
sizeof uses the type as the parameter.
If you want the size of the anInt variable you can pass the dynamicType field to sizeof.
Like so:
var anInt: Int = 5
var anIntSize: Int = sizeof(anInt.dynamicType)
Or more simply (pointed out by user102008):
var anInt: Int = 5
var anIntSize: Int = sizeofValue(anInt)
Swift 3 now has MemoryLayout.size(ofValue:) which can look up the size dynamically.
Using a generic function that in turn uses MemoryLayout<Type> will have unexpected results if you e.g. pass it a reference of protocol type. This is because — as far as I know — the compiler then has all the type information it needs to fill in the values at compile time, which is not apparent when looking at the function call. You would then get the size of the protocol, not the current value.
In Xcode 8 with Swift 3 beta 6 there is no sizeof() function. But if you want, you can define one for your needs. This new sizeof function works as expected with an array, which was not possible with the old builtin sizeof function.
let bb: UInt8 = 1
let dd: Double = 1.23456
func sizeof<T>(_: T.Type) -> Int {
    return MemoryLayout<T>.size
}

func sizeof<T>(_: T) -> Int {
    return MemoryLayout<T>.size
}

func sizeof<T>(_ value: [T]) -> Int {
    return MemoryLayout<T>.size * value.count
}
sizeof(UInt8.self) // 1
sizeof(Bool.self) // 1
sizeof(Double.self) // 8
sizeof(dd) // 8
sizeof(bb) // 1
var testArray: [Int32] = [1,2,3,4]
var arrayLength = sizeof(testArray) // 16
You need all versions of the sizeof function, to get the size of a variable and to get the correct size of a data-type and of an array.
If you only define the second function, then sizeof(UInt8.self) and sizeof(Bool.self) will result in "8". If you only define the first two functions, then sizeof(testArray) will result in "8".
Swift 4
From Xcode 9 onwards there is now a property called .bitWidth; this provides another way of writing sizeof: functions for instances and integer types:
func sizeof<T: FixedWidthInteger>(_ int: T) -> Int {
    return int.bitWidth / UInt8.bitWidth
}

func sizeof<T: FixedWidthInteger>(_ intType: T.Type) -> Int {
    return intType.bitWidth / UInt8.bitWidth
}
sizeof(UInt16.self) // 2
sizeof(20) // 8
But it would make more sense for consistency to replace sizeof: with .byteWidth:
extension FixedWidthInteger {
    var byteWidth: Int {
        return self.bitWidth / UInt8.bitWidth
    }
    static var byteWidth: Int {
        return Self.bitWidth / UInt8.bitWidth
    }
}
1.byteWidth // 8
UInt32.byteWidth // 4
It is easy to see why sizeof: was considered ambiguous, but I'm not sure that burying it in MemoryLayout was the right thing to do. See the reasoning behind the shifting of sizeof: to MemoryLayout here.