I am looking for help incrementing a hexadecimal string by 1. For example, "000001" or "AD02D3" should increment to "000002" or "AD02D4". I can't figure out how to do this in Swift. If I convert to an Int:
var num2 = Int(str, radix: 16)
I lose all the leading zeros and there are difficulties converting back.
Any suggestions?
1. Save the string length in a variable.
2. Convert the string to an integer with Int(str, radix: 16).
3. Add 1 to the integer.
4. Build a format string using the saved length and specifying leading "0"s.
5. Create the new string with the format function.
let str = "00000A"
let len = str.lengthOfBytesUsingEncoding(NSUTF8StringEncoding)
let num2 = Int(str, radix: 16)! + 1
let newStr = NSString(format: "%0\(len)X", num2) as String
print("newStr: \(newStr)") // newStr: 00000B
import Foundation
func inc(s: String) -> String? {
    if let i = Int(s, radix: 16)?.successor() {
        return String(format: "%0\(s.characters.count)X", arguments: [i])
    } else {
        return nil
    }
}
let arr0 = ["01","0000FE00","FFFFFF"]
let arr1 = arr0.flatMap{ inc($0) }
print(arr1) // ["02", "0000FE01", "1000000"]
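The successor() and characters APIs in the answer above (and lengthOfBytesUsingEncoding in the previous one) are Swift 2-era. A minimal sketch of the same idea in current Swift might look like this:

import Foundation

func inc(_ s: String) -> String? {
    guard let value = Int(s, radix: 16) else { return nil }
    // Pad back to the original string length so the leading zeros survive.
    return String(format: "%0\(s.count)X", value + 1)
}

let arr0 = ["01", "0000FE00", "FFFFFF"]
let arr1 = arr0.compactMap { inc($0) }
print(arr1) // ["02", "0000FE01", "1000000"]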
I'm trying to separate the decimal and integer parts of a double in Swift. I've tried a number of approaches but they all run into the same issue...
let x:Double = 1234.5678
let n1:Double = x % 1.0 // n1 = 0.567800000000034
let n2:Double = x - 1234.0 // same result
let n3:Double = modf(x, &integer) // same result
Is there a way to get 0.5678 instead of 0.567800000000034 without converting to the number to a string?
You can use truncatingRemainder(dividingBy:) with 1 as the divisor.
Returns the remainder of this value divided by the given value using truncating division.
Apple doc
Example:
let myDouble1: Double = 12.25
let myDouble2: Double = 12.5
let myDouble3: Double = 12.75
let remainder1 = myDouble1.truncatingRemainder(dividingBy: 1)
let remainder2 = myDouble2.truncatingRemainder(dividingBy: 1)
let remainder3 = myDouble3.truncatingRemainder(dividingBy: 1)
remainder1 -> 0.25
remainder2 -> 0.5
remainder3 -> 0.75
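Note that truncatingRemainder keeps the sign of the value it is called on, so negative inputs give a negative fractional part:

let negativeRemainder = (-12.25).truncatingRemainder(dividingBy: 1) // -0.25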
Same approach as Alessandro Ornano's, implemented as instance properties of the FloatingPoint protocol:
Xcode 11 • Swift 5.1
import Foundation
extension FloatingPoint {
    var whole: Self { modf(self).0 }
    var fraction: Self { modf(self).1 }
}
1.2.whole // 1
1.2.fraction // 0.2
If you need the fraction digits while preserving their precision, you need to use Swift's Decimal type and initialize it with a String:
extension Decimal {
    func rounded(_ roundingMode: NSDecimalNumber.RoundingMode = .plain) -> Decimal {
        var result = Decimal()
        var number = self
        NSDecimalRound(&result, &number, 0, roundingMode)
        return result
    }
    var whole: Decimal { rounded(sign == .minus ? .up : .down) }
    var fraction: Decimal { self - whole }
}
let decimal = Decimal(string: "1234.99999999")! // 1234.99999999
let fractional = decimal.fraction // 0.99999999
let whole = decimal.whole // 1234
let sum = whole + fractional // 1234.99999999
let negativeDecimal = Decimal(string: "-1234.99999999")! // -1234.99999999
let negativefractional = negativeDecimal.fraction // -0.99999999
let negativeWhole = negativeDecimal.whole // -1234
let negativeSum = negativeWhole + negativefractional // -1234.99999999
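As a side note on why the String initializer matters here: creating the Decimal from a Double literal instead would carry the Double's binary rounding error over into the Decimal, for example:

let fromDouble = Decimal(1234.99999999)            // not exactly 1234.99999999
let fromString = Decimal(string: "1234.99999999")! // exactly 1234.99999999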
Swift 2:
You can use:
modf(x).1
or
x % floor(abs(x))
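For example, using modf (which works the same way in later Swift versions via Foundation):

import Foundation

let x = 1234.5678
let wholePart = modf(x).0      // 1234.0
let fractionalPart = modf(x).1 // ≈ 0.5678, subject to binary floating-point rounding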
Without converting the number to a string, you can round to a given number of decimal places like this:
let x:Double = 1234.5678
let numberOfPlaces:Double = 4.0
let powerOfTen:Double = pow(10.0, numberOfPlaces)
let targetedDecimalPlaces:Double = round((x % 1.0) * powerOfTen) / powerOfTen
Your output would be
0.5678
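In current Swift the % operator is no longer defined for Double, so the same idea can be sketched with truncatingRemainder(dividingBy:):

import Foundation

let x: Double = 1234.5678
let numberOfPlaces: Double = 4.0
let powerOfTen: Double = pow(10.0, numberOfPlaces)
let fraction = x.truncatingRemainder(dividingBy: 1.0)
let targetedDecimalPlaces = (fraction * powerOfTen).rounded() / powerOfTen // 0.5678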
Swift 5.1
let x:Double = 1234.5678
let decimalPart:Double = x.truncatingRemainder(dividingBy: 1) //0.5678
let integerPart:Double = x.rounded(.towardZero) //1234
Both of these methods return a Double value.
If you want the integer part as an Int, you can just use
Int(x)
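For example:

let integerPart = Int(1234.5678) // 1234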
Use Float, since it has fewer precision digits than Double:
let x:Double = 1234.5678
let n1:Float = Float(x % 1) // n1 = 0.5678
There's a function in C's math library, and many programming languages, Swift included, give you access to it. It's called modf, and in Swift it works like this:
// modf returns a 2-element tuple,
// with the whole number part in the first element,
// and the fraction part in the second element
let splitPi = modf(3.141592)
splitPi.0 // 3.0
splitPi.1 // 0.141592
You can create an extension like the one below:
extension Double {
    func getWholeNumber() -> Double {
        return modf(self).0
    }
    func getFractionNumber() -> Double {
        return modf(self).1
    }
}
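Usage might look like this (the fractional result is still subject to binary floating-point rounding):

let value = 12.345
value.getWholeNumber()    // 12.0
value.getFractionNumber() // ≈ 0.345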
You can get the Integer part like this:
let d: Double = 1.23456e12
let intparttruncated = trunc(d)
let intpartroundlower = Int(d)
The trunc() function truncates the part after the decimal point, and the Int() initializer also rounds toward zero, so both give the same result for positive and negative numbers. If you subtract the truncated part from d, you will get the fractional part.
func frac (_ v: Double) -> Double
{
    return (v - trunc(v))
}
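For example:

frac(1.75)  // 0.75
frac(-1.75) // -0.75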
You can get Mantissa and Exponent of a Double value like this:
let d: Double = 1.23456e78
let exponent = trunc(log(d) / log(10.0))
let mantissa = d / pow(10, trunc(log(d) / log(10.0)))
Your result will be 78 for the exponent and 1.23456 for the Mantissa.
Hope this helps you.
It's impossible to create a solution that will work for all Doubles. And if the other answers ever worked, which I also doubt, they don't anymore.
let _5678 = 1234.5678.description.drop { $0 != "." } .description // ".5678"
Double(_5678) // 0.5678
let _567 = 1234.567.description.drop { $0 != "." } .description // ".567"
Double(_567) // 0.5669999999999999
extension Double {
    /// Gets the decimal value from a double.
    var decimal: Double {
        Double("0." + (string.split(separator: ".").last ?? "0")) ?? 0.0
    }

    var string: String {
        String(self)
    }
}
This appears to solve the Double precision issues.
Usage:
print(34.46979988898988.decimal) // outputs 0.46979988898988
print(34.46.decimal) // outputs 0.46
I have data like below:
let data = Data(bytes: [206, 66, 49, 62])
Then I used this extension (from How to convert Data to hex string in swift) to convert to a hex string:
extension Data {
    struct HexEncodingOptions: OptionSet {
        let rawValue: Int
        static let upperCase = HexEncodingOptions(rawValue: 1 << 0)
    }
    func hexEncodedString(options: HexEncodingOptions = []) -> String {
        let hexDigits = Array((options.contains(.upperCase) ? "0123456789ABCDEF" : "0123456789abcdef").utf16)
        var chars: [unichar] = []
        chars.reserveCapacity(2 * count)
        for byte in self {
            chars.append(hexDigits[Int(byte / 16)])
            chars.append(hexDigits[Int(byte % 16)])
        }
        return String(utf16CodeUnits: chars, count: chars.count)
    }
}
And then it gives "ce42313e" as the hex string. Now I am trying to convert this to a signed 32-bit integer (two's complement). I tried a couple of ways but couldn't find anything that works.
When I enter "ce42313e" in the link below under hexadecimal, the value is -834522818:
http://www.binaryconvert.com/convert_signed_int.html
Below is one of the things I tried to convert "ce42313e" to an Int, and it gives me 3460444478 instead of -834522818.
let str = value
let number = Int(str, radix: 16)
Please help out to get that value.
Int(str, radix: 16) interprets the string as the hexadecimal
representation of an unsigned number. You could convert it to
Int32 with
let data = Data(bytes: [206, 66, 49, 62])
let str = data.hexEncodedString()
print(str) // ce42313e
let number = Int32(truncatingBitPattern: Int(str, radix: 16)!)
print(number) // -834522818
But actually you don't need the hex representation for that purpose.
Your data is the big-endian representation of a signed 32-bit integer,
and this is how you can get the number from the data directly:
let data = Data(bytes: [206, 66, 49, 62])
let number = Int32(bigEndian: data.withUnsafeBytes { $0.pointee })
print(number) // -834522818
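Note that Int32(truncatingBitPattern:) and the pointer-based withUnsafeBytes closure shown above are pre-Swift 4/5 API. A sketch of the same two approaches in current Swift, reusing the hexEncodedString() extension from the question, might look like this:

import Foundation

let data = Data([206, 66, 49, 62])

// Via the hex string: parse as unsigned, then reinterpret the bit pattern as signed.
let hex = data.hexEncodedString()                        // "ce42313e"
let fromHex = Int32(truncatingIfNeeded: UInt32(hex, radix: 16)!)
print(fromHex)                                           // -834522818

// Directly from the bytes (big-endian layout); assumes the buffer is suitably aligned for Int32.
let fromBytes = Int32(bigEndian: data.withUnsafeBytes { $0.load(as: Int32.self) })
print(fromBytes)                                         // -834522818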
Note that the old method to convert a hex string to a binary string has been removed from Swift, i.e.: String(hex, radix: 2) -> binary string
What is an alternative in swift 4?
You first need to convert your hexa string to an array of bytes ([UInt8]). Then you can use String(_:radix:) to convert the bytes to binary. Note that if you would like to return a String instead of an array of strings ([String]), you need to add leading zeros to make your binary strings' length consistent (8 characters):
extension String {
    typealias Byte = UInt8
    var hexaToBytes: [Byte] {
        var start = startIndex
        return stride(from: 0, to: count, by: 2).compactMap { _ in // use flatMap for older Swift versions
            let end = index(after: start)
            defer { start = index(after: end) }
            return Byte(self[start...end], radix: 16)
        }
    }
    var hexaToBinary: String {
        return hexaToBytes.map {
            let binary = String($0, radix: 2)
            return repeatElement("0", count: 8-binary.count) + binary
        }.joined()
    }
}
let hexString = "00ff01fe"
hexString.hexaToBinary // "00000000111111110000000111111110"
I don't recall any function that would convert a hex string to another string of arbitrary radix. Perhaps you are thinking about the initializer functions that convert between strings and integer values (and vice versa) using an arbitrary radix:
let hex = "00ff01fe"
let value = UInt64(hex, radix: 16)!
let binary = String(value, radix: 2)
let paddedBinary = repeatElement("0", count: 64 - binary.count) + binary
That only works when the hex string represents a value that fits in 64 bits, but it illustrates the basic idea: convert to some integer type, and then convert back to binary, padding it out with zeros.
If you have a hex string that is longer than that, you cannot use the above. But you can map the individual characters of your hex string to numeric values, build the binary representation of each, zero-pad them, and use joined to concatenate them together:
let hex = "ffeeddccbbaa99887766554433221100"
let result = hex.compactMap { c -> String? in
    guard let value = Int(String(c), radix: 16) else { return nil }
    let string = String(value, radix: 2)
    return repeatElement("0", count: 4 - string.count) + string
}.joined()
I have a string of binary values, e.g. "010010000110010101111001". Is there a simple way to convert this string into its ASCII representation to get (in this case) "Hey"?
I've only found the other way around, or solutions for integers:
let binary = "11001"
if let number = Int(binary, radix: 2) {
    print(number) // Output: 25
}
Does someone know a good and efficient solution for this case?
A variant of OOPer's solution would be to use a conditionally binding while loop and index(_:offsetBy:limitedBy:) in order to iterate over the 8-character substrings, taking advantage of the fact that index(_:offsetBy:limitedBy:) returns nil when you try to advance past the limit.
let binaryBits = "010010000110010101111001"
var result = ""
var index = binaryBits.startIndex
while let next = binaryBits.index(index, offsetBy: 8, limitedBy: binaryBits.endIndex) {
    let asciiCode = UInt8(binaryBits[index..<next], radix: 2)!
    result.append(Character(UnicodeScalar(asciiCode)))
    index = next
}
print(result) // Hey
Note that we're going via Character rather than String in the intermediate step; this is simply to take advantage of the fact that Character is specially optimised for cases where the UTF-8 representation fits into 63 bits, which is the case here. This saves heap-allocating an intermediate buffer for each character.
Purely for the fun of it, another approach could be to use sequence(state:next:) in order to create a sequence of the start and end indices of each substring, and then reduce in order to concatenate the resultant characters together into a string:
let binaryBits = "010010000110010101111001"
// returns a lazily evaluated sequence of the start and end indices for each substring
// of 8 characters.
let indices = sequence(state: binaryBits.startIndex, next: {
    index -> (index: String.Index, nextIndex: String.Index)? in
    let previousIndex = index
    // Advance the current index – if it didn't go past the limit, then return the
    // current index along with the advanced index as a new element of the sequence.
    return binaryBits.characters.formIndex(&index, offsetBy: 8, limitedBy: binaryBits.endIndex) ? (previousIndex, index) : nil
})
// iterate over the indices, concatenating the resultant characters together.
let result = indices.reduce("") {
    $0 + String(UnicodeScalar(UInt8(binaryBits[$1.index..<$1.nextIndex], radix: 2)!))
}
print(result) // Hey
On the face of it, this appears to be much less efficient than the first solution (due to the fact that reduce has to copy the string at each iteration); however, it appears the compiler is able to perform some optimisations to make it not much slower than the first solution.
You may need to split the input binary digits into 8-bit chunks, and then convert each chunk to an ASCII character. I cannot think of a super simple way:
var binaryBits = "010010000110010101111001"
var index = binaryBits.startIndex
var result: String = ""
for _ in 0..<binaryBits.characters.count/8 {
    let nextIndex = binaryBits.index(index, offsetBy: 8)
    let charBits = binaryBits[index..<nextIndex]
    result += String(UnicodeScalar(UInt8(charBits, radix: 2)!))
    index = nextIndex
}
print(result) //->Hey
Does basically the same as OOPer's solution, but he/she was faster and has a shorter, more elegant approach :-)
func getASCIIString(from binaryString: String) -> String? {
    guard binaryString.characters.count % 8 == 0 else {
        return nil
    }
    var asciiCharacters = [String]()
    var asciiString = ""
    let startIndex = binaryString.startIndex
    var currentLowerIndex = startIndex
    while currentLowerIndex < binaryString.endIndex {
        let currentUpperIndex = binaryString.index(currentLowerIndex, offsetBy: 8)
        let character = binaryString.substring(with: Range(uncheckedBounds: (lower: currentLowerIndex, upper: currentUpperIndex)))
        asciiCharacters.append(character)
        currentLowerIndex = currentUpperIndex
    }
    for asciiChar in asciiCharacters {
        if let number = UInt8(asciiChar, radix: 2) {
            let character = String(describing: UnicodeScalar(number))
            asciiString.append(character)
        } else {
            return nil
        }
    }
    return asciiString
}
let binaryString = "010010000110010101111001"
if let asciiString = getASCIIString(from: binaryString) {
    print(asciiString) // Hey
}
A different approach
let bytes_string: String = "010010000110010101111001"
var range_count: Int = 0
let characters_array: [String] = Array(bytes_string.characters).map({ String($0)})
var conversion: String = ""
repeat
{
    let sub_range = characters_array[range_count ..< (range_count + 8)]
    let sub_string: String = sub_range.reduce("") { $0 + $1 }
    let character: String = String(UnicodeScalar(UInt8(sub_string, radix: 2)!))
    conversion += character
    range_count += 8
} while range_count < characters_array.count
print(conversion)
You can do this:
extension String {
    var binaryToAscii: String {
        stride(from: 0, through: count - 1, by: 8)
            .map { i in map { String($0) }[i..<(i + 8)].joined() }
            .map { String(UnicodeScalar(UInt8($0, radix: 2)!)) }
            .joined()
    }
}
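Usage:

"010010000110010101111001".binaryToAscii // "Hey"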
I can easily turn a decimal number into an octal but I'm trying to do the reverse and I've got stuck.
let decimal = 11_224_393
let octString = String(rawAddress, radix: 8, uppercase: false)
let octal = octString.toInt()
Question
I want a function that, given an Int of octal digits, will read it in as an octal and convert it to decimal,
such as:
// oct2dec(777) = 511
// oct2dec(10) = 8
func oct2dec(octal : Int) -> Int {
    // what goes here?
}
Just use Swift native octal literals and initializers (String from integer | Int from String).
let octalInt = 0o1234 // 668
let octalString = "1234" // "1234"
let decimalIntFromOctalString = Int(octalString, radix: 0o10) // 668
let octalStringFromInt = String(octalInt, radix: 0o10) // "1234"
For your specific use-case:
let decimal = 11_224_393
let octString = String(decimal, radix: 0o10, uppercase: false)
guard let octal = Int(octString, radix: 0o10) else {
    print("octString was not a valid octal string:", octString)
    fatalError()
}
Using string conversion functions is pretty horrible in my opinion. How about something like this instead:
func octalToDecimal(_ octal: Int) -> Int {
    var octal = octal
    var decimal = 0, placeValue = 1
    while octal != 0 {
        let remainder = octal % 10
        octal /= 10
        decimal += remainder * placeValue // placeValue is 8^i for the current digit
        placeValue *= 8
    }
    return decimal
}
var decimal = octalToDecimal(777) // decimal is 511