I have an instance of UnsafeMutableRawPointer and want to retrieve an integer representation of it. I'm trying to use the following code:
let pointer : UnsafeMutableRawPointer? = address(of: self)
let intRepresentation : UInt64 = UInt64(bitPattern: pointer)
But Swift compiler throws error: "Cannot convert value of type 'UnsafeMutableRawPointer?' to expected argument type 'Int64'"
The initializer is declared as public init(bitPattern pointer: UnsafeMutableRawPointer?) in Swift.Math.Integers.UInt.
It also has public init(bitPattern x: Int64) in the same file.
How can I make this code work, or convert UnsafeMutableRawPointer to an integer in any other (but not too ridiculous, like string parsing) way?
(U)Int has the size of a pointer on all platforms (32 bit or 64 bit), so you can always convert between Unsafe(Mutable)(Raw)Pointer and Int or UInt:
let pointer: UnsafeRawPointer? = ...
let intRepresentation = UInt(bitPattern: pointer)
let ptrRepresentation = UnsafeRawPointer(bitPattern: intRepresentation)
assert(ptrRepresentation == pointer)
Your code
let intRepresentation = UInt64(bitPattern: pointer)
does not compile because UInt64 does not have an initializer taking a pointer argument, and that is because pointers can be 32 bit or 64 bit. (And even on a 64-bit platform, UInt and UInt64 are distinct types.)
If you need a UInt64 then
let intRepresentation = UInt64(bitPattern: Int64(Int(bitPattern: pointer)))
does the trick.
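Putting it together, a minimal round-trip sketch (the 0x1234 address is a placeholder for illustration, not a dereferenceable pointer):
let pointer: UnsafeMutableRawPointer? = UnsafeMutableRawPointer(bitPattern: 0x1234)
// Widen the platform-sized bit pattern to 64 bits.
let intRepresentation = UInt64(bitPattern: Int64(Int(bitPattern: pointer)))
print(String(intRepresentation, radix: 16)) // 1234
// Convert back and verify we get the same pointer.
let backConverted = UnsafeMutableRawPointer(bitPattern: UInt(intRepresentation))
assert(backConverted == pointer)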
I have searched for an answer over the past few days, and a lot of the existing ones are very old (around Swift 2 and 1.2).
I want to get a character whose Unicode code is taken from a variable, because for some unknown reason this construction won't work in Swift:
print("\u{\(variable)}") // there should be a proposal for including this in Swift 6
People advise using UnicodeScalars. However, Apple must have introduced something new in Swift 5. I found a tutorial, but this code fragment
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)) // 大 error here
let string2 = String(UnicodeScalar(value2)) // 🍎 error here
is not working: with string1 and string2 I get the error "no exact matches in call to initializer". Since it presumably worked when the author posted it, I understand it must have worked in a previous version of Swift, but with the latest it does not. What changed under the hood? The Strings section of Apple's handbook does not reveal anything.
I am trying to rewrite some TypeScript code in Swift, and in JS it is as simple as:
for (let i = str.length; i >= 1; i -= 2) {
    r = String.fromCharCode(parseInt("0x" + str.substring(i - 2, i))) + r;
}
and I've been struggling with this for the past 2 days without success!
The UnicodeScalar initializers taking a UInt16 or UInt32 argument are failable initializers (and return nil if the passed argument is an invalid Unicode scalar value).
public init?(_ v: UInt16)
public init?(_ v: UInt32)
The optional must be unwrapped before passing it to the String initializer.
Only the initializer taking a UInt8 argument is non-failable:
public init(_ v: UInt8)
So this compiles and produces the expected result:
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)!) // 大
// ^--- unwrap optional
let string2 = String(UnicodeScalar(value2)!) // 🍎
// ^--- unwrap optional
Of course, in your real code, you would use optional binding and not forced unwrapping.
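For example, here is a rough Swift translation of the TypeScript loop from the question, using optional binding to handle both the failable hex parsing and the failable UnicodeScalar initializer (a sketch; str is a hypothetical hex-digit string):
let str = "61" // hypothetical input: hex-encoded character codes
var r = ""
var i = str.endIndex
while i > str.startIndex {
    // Take the next two hex digits from the end (clamped at the start).
    let lower = str.index(i, offsetBy: -2, limitedBy: str.startIndex) ?? str.startIndex
    // Skip invalid values instead of force-unwrapping.
    if let code = UInt32(str[lower..<i], radix: 16),
       let scalar = UnicodeScalar(code) {
        r = String(scalar) + r
    }
    i = lower
}
print(r) // a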
I need a Swift implementation of C#'s BitConverter.DoubleToInt64Bits(doubleValue).
The only C# implementation I can find is at
https://referencesource.microsoft.com/#mscorlib/system/bitconverter.cs
[SecuritySafeCritical]
public static unsafe long DoubleToInt64Bits(double value) {
/// some comments ....
Contract.Assert(IsLittleEndian, "This method is implemented assuming little endian with an ambiguous spec.");
return *((long *)&value);
}
In C# I have the method:
public long EncodeValue(double doubleValue)
{
return BitConverter.DoubleToInt64Bits(doubleValue);
}
but I need the same functionality in Swift for iOS.
Something like this:
func EncodeValue(doubleValue: Double) -> Int64
{
    return SwiftDoubleToInt64Bits(doubleValue)
}
The bitPattern property of Double returns an (unsigned) 64-bit integer with the same memory representation:
let doubleValue = 12.34
let encoded = doubleValue.bitPattern // UInt64
The reverse conversion is done with
let decoded = Double(bitPattern: encoded)
print(decoded) // 12.34
In the same way you can convert between Float and UInt32.
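For example (a quick sketch of the Float case):
let floatValue: Float = 12.34
let encoded32 = floatValue.bitPattern // UInt32
let decoded32 = Float(bitPattern: encoded32)
print(decoded32) // 12.34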
For a platform independent memory representation (e.g. “big endian”) use
let encodedBE = doubleValue.bitPattern.bigEndian
let decoded = Double(bitPattern: UInt64(bigEndian: encodedBE))
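And if you specifically need the signed Int64 that C#'s BitConverter.DoubleToInt64Bits returns, you can reinterpret the unsigned pattern (a short sketch continuing the example above):
let int64Bits = Int64(bitPattern: doubleValue.bitPattern) // same bits, signed type
let restored = Double(bitPattern: UInt64(bitPattern: int64Bits))
print(restored) // 12.34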
I am using Xcode 9.2 and I don't understand the reason behind the error
Type of expression is ambiguous without more context
I am getting some input and trying to create a wordArray as shown below. If I define it as a UInt8 array it works, but not as UInt16, where I get the error.
The original data, Characteristic.value, comes from a BLE characteristic.
let characteristicData = Characteristic.value
let byteArray = [UInt8](characteristicData!)
print("\(Characteristic.uuid) value as byte is->",byteArray)
let wordArray = [UInt16](characteristicData!)//Type of expression is ambiguous without more context
print("\(Characteristic.uuid) value as word is->",wordArray)
Why does this happen and how I can fix it?
characteristicData has the type Data, which conforms to the (RandomAccess)Collection protocol with UInt8 as its element type; that's why you can initialize a [UInt8] array from it:
let byteArray = [UInt8](characteristicData)
You could equivalently write
let byteArray = Array(characteristicData)
To interpret the data as an array of a different type, use the generic
func withUnsafeBytes<ResultType, ContentType>(_ body: (UnsafePointer<ContentType>) throws -> ResultType) rethrows -> ResultType
method:
let wordArray = characteristicData.withUnsafeBytes {
    [UInt16](UnsafeBufferPointer(start: $0, count: characteristicData.count/2))
}
Here the ContentType is inferred automatically as UInt16.
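Note that in Swift 5 and later this overload is deprecated; Data.withUnsafeBytes now passes an UnsafeRawBufferPointer to the closure instead. A sketch of the newer form (assuming, as above, that the byte count is even):
let wordArray = characteristicData.withUnsafeBytes { rawBuffer in
    // Rebind the raw bytes to UInt16 and copy them into an array.
    Array(rawBuffer.bindMemory(to: UInt16.self))
}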
I get a result of type UnsafeMutablePointer<UInt> from a Swift function.
Can I cast it to UInt?
Just use the pointee property (called memory before Swift 3) to access the underlying value.
let ptr: UnsafeMutablePointer<UInt> = funcReturningMutablePtr()
let theValue: UInt = ptr.pointee
The type annotations are for clarity, but are not necessary.
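A self-contained illustration (funcReturningMutablePtr is hypothetical, so the pointer is allocated manually here):
// Allocate space for one UInt and store a value in it.
let p = UnsafeMutablePointer<UInt>.allocate(capacity: 1)
p.initialize(to: 42)
// Read the value back through pointee.
let v: UInt = p.pointee // 42
// Clean up the manually managed memory.
p.deinitialize(count: 1)
p.deallocate()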
I'm really stuck! I'm not an expert at ObjC, and now I am trying to use Swift. I thought it would be much simpler, but it isn't. I remember Craig saying that they call Swift “Objective-C without C”, but there are too many C types in OS X's Foundation. The documentation says that many ObjC types will automatically convert, possibly bidirectionally, to Swift types. I'm curious: what about C types?
Here's where I'm stuck:
// array1: [String?], event.keyCode.value: Int16
let s = array1[event.keyCode.value] // error: 'Int16' is not convertible to 'Int'
I tried some C-style casts as in ObjC:
let index = (Int) event.keyCode.value // error
or
let index = (Int32) event.keyCode.value // error again; Swift doesn't seem to support this syntax
What is the proper way to convert Int16 to Int?
To convert a number from one type to another, you have to create a new instance, passing the source value as parameter - for example:
let int16: Int16 = 20
let int: Int = Int(int16)
let int32: Int32 = Int32(int16)
I used explicit types for variable declarations, to make the concept clear - but in all the above cases the type can be inferred:
let int16: Int16 = 20
let int = Int(int16)
let int32 = Int32(int16)
This is not how this type of casting works in Swift. Instead, use:
let a : Int16 = 1
let b : Int = Int(a)
So you basically instantiate one variable on the basis of the other.
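Applied to the code from the question (array1 and event as declared there):
let s = array1[Int(event.keyCode.value)]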
In my case (I was trying to convert a Core Data Int16 to Int) explicit conversion didn't work. I did it this way:
let quantityString = String(format: "%d", yourInt16value)
let quantity = Int(quantityString) ?? 0 // Int(_:) on a String is failable, so provide a default