Swift BitConverter.DoubleToInt64Bits Equivalent

I need a Swift implementation of C#'s BitConverter.DoubleToInt64Bits(doubleValue).
The only C# implementation I can find is at
https://referencesource.microsoft.com/#mscorlib/system/bitconverter.cs
[SecuritySafeCritical]
public static unsafe long DoubleToInt64Bits(double value) {
/// some comments ....
Contract.Assert(IsLittleEndian, "This method is implemented assuming little endian with an ambiguous spec.");
return *((long *)&value);
}
In C# I have this method:
public long EncodeValue(double doubleValue)
{
return BitConverter.DoubleToInt64Bits(doubleValue);
}
but I need the same functionality in Swift for iOS.
Something like this:
func EncodeValue(doubleValue: Double) -> Int64
{
return SwiftDoubleToInt64Bits(doubleValue)
}

The bitPattern property of Double returns an (unsigned) 64-bit integer with the same memory representation:
let doubleValue = 12.34
let encoded = doubleValue.bitPattern // UInt64
The reverse conversion is done with
let decoded = Double(bitPattern: encoded)
print(decoded) // 12.34
In the same way you can convert between Float and UInt32.
For a platform-independent memory representation (e.g. “big endian”) use
let encodedBE = doubleValue.bitPattern.bigEndian
let decoded = Double(bitPattern: UInt64(bigEndian: encodedBE))
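Putting this together, a minimal sketch of the EncodeValue function from the question, bridging the unsigned bit pattern to the signed Int64 that the C# method returns (DecodeValue is just an illustrative name for the reverse direction):
func EncodeValue(doubleValue: Double) -> Int64 {
// Reinterpret the Double's bits as a signed 64-bit integer
return Int64(bitPattern: doubleValue.bitPattern)
}
func DecodeValue(bits: Int64) -> Double {
// Reverse: reinterpret the signed bits as a Double
return Double(bitPattern: UInt64(bitPattern: bits))
}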

Related

Does UserDefaults.integer work on 32-bit platforms?

Will the following code work on 32-bit platforms?
class Account {
private static let UserIDKey = "AccountUserIDKey"
class var userID: Int64? {
get { Int64(UserDefaults.standard.integer(forKey: UserIDKey)) }
set { UserDefaults.standard.set(newValue, forKey: UserIDKey) }
}
}
I.e., will it work on 32-bit platforms if I store and retrieve an Int value that's greater than Int32.max into UserDefaults?
I'm wondering because I only see the UserDefaults instance method integer(forKey:). Why doesn't UserDefaults have an instance method like int64(forKey:)?
You can save a number of simple variable types in the user defaults:
Booleans with Bool
Integers with Int
Floats with Float
Doubles with Double
Strings with String
Binary data with Data
Dates with Date
URLs with the URL type
Collection types with Array and Dictionary
Internally the UserDefaults class can only store NSData, NSString, NSNumber, NSDate, NSArray and NSDictionary classes.
These are object types that can be saved in a property list.
An NSNumber is an NSObject that can contain the original C style numeric types, which even include the C/Objective-C style BOOL type (that stores YES or NO). Thankfully for us, these are also bridged to Swift, so for a Swift program, an NSNumber can automatically accept the following Swift types:
UInt
Int
Float
Double
Bool
Dmitry Popov's answer on Quora:
Remember how you were taught in school to add numbers like 38 and 54. You work with separate digits. You take the rightmost digits, 8 and 4, add them, and get the digit 2 in the answer with a 1 carried because of the overflow. You take the second digits, 3 and 5, add them to get 8, and add the carried 1 to get 9, so the whole answer becomes 92. A 32-bit processor does exactly the same, but instead of decimal digits it has 32-bit integers. The algorithm is the same: work on the two 32-bit halves of a 64-bit number separately, and deal with overflow by carrying from the lower word to the higher word. This takes several 32-bit instructions, and the compiler is responsible for encoding them properly.
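To make the carry idea concrete, here is an illustrative Swift sketch that performs a 64-bit addition using only 32-bit arithmetic (add64 is a hypothetical helper, not part of any API):
func add64(_ a: UInt64, _ b: UInt64) -> UInt64 {
// Split each operand into 32-bit low and high words.
let aLo = UInt32(truncatingIfNeeded: a)
let aHi = UInt32(truncatingIfNeeded: a >> 32)
let bLo = UInt32(truncatingIfNeeded: b)
let bHi = UInt32(truncatingIfNeeded: b >> 32)
// Add the low words; `carry` records the overflow into the high word.
let (lo, carry) = aLo.addingReportingOverflow(bLo)
// Add the high words plus the carry; &+ wraps just like the CPU would.
let hi = aHi &+ bHi &+ (carry ? 1 : 0)
// Reassemble the two 32-bit halves into the 64-bit result.
return UInt64(hi) << 32 | UInt64(lo)
}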
Save Int64 to UserDefaults
import Foundation
let defaults = UserDefaults.standard
// Define a 64-bit integer with `Int64`
let myInteger: Int64 = 1000000000000000
// Save the integer to `UserDefaults`
defaults.set(myInteger, forKey: "myIntegerKey")
// Read the saved value back from `UserDefaults`
if let savedInteger = defaults.object(forKey: "myIntegerKey") as? Int64 {
print("my 64-bit integer is:", savedInteger)
}
Save a 64-bit Int to UserDefaults by converting it to a String
let myInteger: Int64 = 1000000000000000
let myString = String(myInteger)
// Save `myString` to `UserDefaults`
UserDefaults.standard.set(myString, forKey: "myIntegerKey")
// Read the value back from `UserDefaults`
let savedString = UserDefaults.standard.string(forKey: "myIntegerKey")
let savedInteger = savedString.flatMap { Int64($0) }
Resources
How is a UInt64 represented on a 32-bit device
Martin R's Stack Overflow answer on storing a 64-bit Int in UserDefaults
learnappmaking.com
codingexplorer.com: NSUserDefaults, a Swift introduction

Convert UnsafeMutableRawPointer to UInt64

I have an instance of UnsafeMutableRawPointer and want to retrieve an integer representation of it. I'm trying to use the following code:
let pointer : UnsafeMutableRawPointer? = address(of: self)
let intRepresentation : UInt64 = UInt64(bitPattern: pointer)
But Swift compiler throws error: "Cannot convert value of type 'UnsafeMutableRawPointer?' to expected argument type 'Int64'"
The initializer declared as public init(bitPattern pointer: UnsafeMutableRawPointer?) is on UInt (in Swift.Math.Integers).
UInt64 also has public init(bitPattern x: Int64) in the same file.
How can I make this code work, or convert an UnsafeMutableRawPointer to an integer in any other (but not too ridiculous, like string parsing) way?
(U)Int has the size of a pointer on all platforms (32 bit or 64 bit), so you can always convert between Unsafe(Mutable)(Raw)Pointer and Int or UInt:
let pointer: UnsafeRawPointer? = ...
let intRepresentation = UInt(bitPattern: pointer)
let ptrRepresentation = UnsafeRawPointer(bitPattern: intRepresentation)
assert(ptrRepresentation == pointer)
Your code
let intRepresentation = UInt64(bitPattern: pointer)
does not compile because UInt64 does not have an initializer taking a pointer argument, and that is because pointers can be 32 bit or 64 bit. (And even on a 64-bit platform, UInt and UInt64 are distinct types.)
If you need an UInt64 then
let intRepresentation = UInt64(bitPattern: Int64(Int(bitPattern: pointer)))
does the trick.
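An alternative sketch, relying on the fact that a UInt always fits losslessly in a UInt64 on the 32-bit and 64-bit platforms Swift supports:
let intRepresentation64 = UInt64(UInt(bitPattern: pointer))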

Mocking a String-class method in Swift

I've got some Swift 2.0 code, for which I'm trying to achieve 100% code coverage. I'm doing some JSON handling, part of which looks like so:
guard let jsonData = jsonText.dataUsingEncoding(NSUTF8StringEncoding) else {
throw ErrorCode.JSONEncodingFailure
}
I don't think there's a real-world case in which any string can't be encoded as UTF-8, but I don't want to open myself up to a crashing error either, so that code must remain. How can I mock the jsonText object to return nil for dataUsingEncoding()?
The closest I've come is to subclass NSString like so:
public class BadString: NSString {
public override var length: Int {
get {
return 5
}
}
public override func characterAtIndex(index: Int) -> unichar {
return 0
}
public override func dataUsingEncoding(encoding: NSStringEncoding) -> NSData? {
return nil
}
}
Here's the problem, though. In order for my mock implementation to be used, I have to define jsonText (a function parameter) as an NSString, rather than String, which feels wrong for an all-Swift codebase. With it defined as a Swift String, I have to cast my BadString to that type, and it uses the String implementation instead of my own.
Is there another (clean) way to achieve this?
You will be hard-pressed to find a string that cannot be encoded using UTF-8! As long as you know that is the encoding you will be using, I would suggest that you not worry about testing the "encoding failure" case.
However, if you still desire to test it then I recommend making one of the following changes in order to allow you to do so:
(1) change the way you are thinking about the 'failure': if you know that the string you are encoding will always be non-empty, then broaden the guard to also require that the encoded data has length > 0, e.g. =>
guard let jsonData = jsonText.dataUsingEncoding(NSUTF8StringEncoding)
where jsonData.length > 0 else {
throw ErrorCode.JSONEncodingFailure
}
...using this idea, you can now use an empty string for jsonText and trigger this code path (assuming that an empty string would also satisfy your definition of 'failure' here)
(2) store your string encoding value in a variable (let's call it stringEncoding) that you can access during your test setup, and then test this using incompatible values for jsonText and stringEncoding, e.g. =>
var jsonText = "🙈"
let stringEncoding = NSASCIIStringEncoding
...I guarantee that jsonText.dataUsingEncoding(stringEncoding) will return nil in this case :)
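As a quick sanity check of that claim (Swift 2 era API, matching the code above):
let jsonText = "🙈"
let stringEncoding = NSASCIIStringEncoding
// ASCII cannot represent an emoji, so encoding returns nil
assert(jsonText.dataUsingEncoding(stringEncoding) == nil)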
Happy Testing! I hope this helps!

Swift convert string to UnsafeMutablePointer<Int8>

I have a C function mapped to Swift, declared as:
func swe_set_ephe_path(path: UnsafeMutablePointer<Int8>) -> Void
I am trying to pass a path to the function and have tried:
var path = [Int8](count: 1024, repeatedValue: 0);
for i in 0...NSBundle.mainBundle().bundlePath.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)-1
{
var range = i..<i+1
path[i] = String.toInt(NSBundle.mainBundle().bundlePath[range])
}
println("\(path)")
swe_set_ephe_path(&path)
but on the path[i] line I get the error:
'subscript' is unavailable: cannot subscript String with a range of Int
Neither
swe_set_ephe_path(NSBundle.mainBundle().bundlePath)
nor
swe_set_ephe_path(&NSBundle.mainBundle().bundlePath)
works either.
Besides not working, I feel there has got to be a better, less convoluted way of doing this. Previous answers on StackOverflow using CString don't seem to work anymore. Any suggestions?
Previous answers on StackOverflow using CString don't seem to work anymore
Nevertheless, UnsafePointer<Int8> is a C string. If your context absolutely requires an UnsafeMutablePointer, just coerce, like this:
let s = NSBundle.mainBundle().bundlePath
let cs = (s as NSString).UTF8String
var buffer = UnsafeMutablePointer<Int8>(cs)
swe_set_ephe_path(buffer)
Of course I don't have your swe_set_ephe_path, but it works fine in my testing when it is stubbed like this:
func swe_set_ephe_path(path: UnsafeMutablePointer<Int8>) {
println(String.fromCString(path))
}
In the current version of Swift you can do it like this (other answers are outdated):
let path = Bundle.main.bundlePath
let param = UnsafeMutablePointer<Int8>(mutating: (path as NSString).utf8String)
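Note that the pointer obtained this way is only valid while the bridged NSString is alive. A lifetime-safer sketch of the same idea, borrowing the withCString pattern from the next answer (and assuming the C function does not keep the pointer after returning):
path.withCString { cString in
swe_set_ephe_path(UnsafeMutablePointer(mutating: cString))
}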
It’s actually extremely irritating of the library you’re using that it requires (in the C declaration) a char *path rather than a const char *path. (This is assuming the function doesn’t mutate the input string; if it does, you’re in a whole different situation.)
If it didn’t, the function would come over to Swift as:
// note, UnsafePointer not UnsafeMutablePointer
func swe_set_ephe_path(path: UnsafePointer<Int8>) -> Void
and you could then rely on Swift’s implicit conversion:
let str = "blah"
swe_set_ephe_path(str) // Swift implicitly converts Strings
// to const C strings when calling C funcs
But you can do an unsafe conversion quite easily, in combination with the withCString function:
str.withCString { cstr in
swe_set_ephe_path(UnsafeMutablePointer(cstr))
}
I had a static library (someLibrary.a) written in C++ compiled for iOS.
The header file (someLibrary.h) had a function exposed like this:
extern long someFunction(char* aString);
The imported declaration in Swift looks like this:
func someFunction(aString: UnsafeMutablePointer<Int8>) -> Int
I made an extension to String:
extension String {
var UTF8CString: UnsafeMutablePointer<Int8> {
return UnsafeMutablePointer((self as NSString).UTF8String)
}
}
So then I can call the method like so:
someFunction(mySwiftString.UTF8CString)
Update: String extension (Swift 5.7)
extension String {
var UTF8CString: UnsafeMutablePointer<Int8> {
return UnsafeMutablePointer(mutating: (self as NSString).utf8String!)
}
}

Swift Simple XOR Encryption

I am trying to do a simple xor encryption routine in Swift. I know this is not a particularly secure or great method but I just need it to be simple. I know how the code is implemented in Javascript I'm just having trouble translating it to Swift.
JavaScript:
function xor_str()
{
var to_enc = "string to encrypt";
var xor_key=28
var the_res="";//the result will be here
for(i=0;i<to_enc.length;++i)
{
the_res+=String.fromCharCode(xor_key^to_enc.charCodeAt(i));
}
document.forms['the_form'].elements.res.value=the_res;
}
Any help you could provide would be great, thanks!
I would propose an extension to String like this:
extension String {
func encodeWithXorByte(key: UInt8) -> String {
return String(bytes: map(self.utf8){$0 ^ key}, encoding: NSUTF8StringEncoding)!
}
}
From the inside out:
The call to self.utf8 creates a byte sequence ([UInt8]) from the string
map() applies the closure to each byte, XOR'ing it with the key value
A new String object is created from the XOR'ed byte array
UPDATED: For Swift 2.0
extension String {
func encodeWithXorByte(key: UInt8) -> String {
return String(bytes: self.utf8.map{$0 ^ key}, encoding: NSUTF8StringEncoding) ?? ""
}
}
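Since XOR with the same key byte is its own inverse, applying the method twice round-trips (for ASCII input the XOR'ed bytes stay valid UTF-8, so no information is lost):
let original = "string to encrypt"
let encrypted = original.encodeWithXorByte(28)
let decrypted = encrypted.encodeWithXorByte(28)
print(decrypted == original) // true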
I can't answer with a comment yet, but Price Ringo, I noticed a couple of problems with your sandbox.
The last line has two errors: you should XOR with the original key byte, and you are not "decoding" the encrypted string.
You have:
println(str.encodeWithXorByte(0))
where you should have:
println(encrypted.encodeWithXorByte(28))