Making a table of hex representations in Swift with compact code - swift

In C#, one can write the following single line to make an array of all the hex strings representing the values from 0 to 255:
using System.Linq;
static string[] HexTbl = Enumerable.Range(0, 256).Select(v => v.ToString("X2")).ToArray();
Is there a similarly compact way to do this in Swift?

Yes there is. The approach is the same: map each number in the range
0 ... 255 to a string using a hex format:
let hexTable = (0 ..< 256).map { v in String(format: "%02X", v) }
or slightly shorter:
let hexTable = (0 ..< 256).map { String(format: "%02X", $0) }
Result:
["00", "01", "02", ..., "FD", "FE", "FF"]

Related

Swift: Byte array to decimal value

In my project I communicate with a Bluetooth device. The Bluetooth device sends me a timestamp in seconds, which I receive as bytes:
[2,6,239]
When I convert it to strings:
let payloadString = payload.map {
    String(format: "%02x", $0)
}
Output:
["02", "06","ef"]
Converting 0206ef with an online converter gives 132847 seconds.
How can I directly convert my array [2, 6, 239] to seconds (= 132847 seconds)?
And if that's complicated, how can I translate my array ["02", "06", "ef"] to seconds (= 132847 seconds)?
The payload contains the bytes of the binary representation of the value.
You convert it back to the value by shifting each byte into its corresponding position:
let payload: [UInt8] = [2, 6, 239]
let value = Int(payload[0]) << 16 + Int(payload[1]) << 8 + Int(payload[2])
print(value) // 132847
The important point is to convert the bytes to integers before shifting, otherwise an overflow error would occur. Alternatively,
with multiplication:
let value = (Int(payload[0]) * 256 + Int(payload[1])) * 256 + Int(payload[2])
or
let value = payload.reduce(0) { $0 * 256 + Int($1) }
The last approach works with an arbitrary number of bytes, as long as the result fits into an Int. For 4 to 8 bytes you should use UInt64 to avoid overflow errors:
let value = payload.reduce(0) { $0 * 256 + UInt64($1) }
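For example, a hypothetical 6-byte payload with the UInt64 variant (the seed is written as UInt64(0) so the accumulator type is unambiguous):
let bigPayload: [UInt8] = [0, 0, 1, 2, 6, 239]
let bigValue = bigPayload.reduce(UInt64(0)) { $0 * 256 + UInt64($1) }
print(bigValue) // 16910063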
The payloadString array can be joined into a single hex string, which can then be converted to a decimal value:
let payload = [2, 6, 239]
let payloadString = payload.map {
    String(format: "%02x", $0)
}
//let hexStr = payloadString.reduce("") { $0 + $1 }
let hexStr = payloadString.joined()
if let value = UInt64(hexStr, radix: 16) {
    print(value) // 132847
}

ByteCountFormatter iOS Swift

I convert byte values into GB/MB/KB using ByteCountFormatter. Example code below.
func converByteToGB(_ bytes: Int64) -> String {
    let formatter = ByteCountFormatter()
    formatter.countStyle = .binary
    return formatter.string(fromByteCount: bytes)
}
Now, my requirement is that it should show only one digit after the decimal point. For example, 1.24 GB should become 1.2 GB, not 1.24 GB. Can I force a single digit after the decimal point by applying a floor or ceil function?
ByteCountFormatter is not capable of showing exactly one digit after the decimal point. By default it shows 0 fraction digits for bytes and KB, 1 fraction digit for MB, and 2 for GB and above. If isAdaptive is set to false, it tries to show at least three significant digits, introducing fraction digits as necessary.
ByteCountFormatter also trims trailing zeros. To disable that, set zeroPadsFractionDigits to true.
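For illustration, here is a minimal sketch of those two properties (isAdaptive and zeroPadsFractionDigits are existing ByteCountFormatter properties; the exact output strings depend on locale and OS version):
import Foundation
let formatter = ByteCountFormatter()
formatter.countStyle = .binary
formatter.isAdaptive = false             // aim for at least three significant digits
formatter.zeroPadsFractionDigits = true  // keep trailing zeros instead of trimming them
print(formatter.string(fromByteCount: 1_330_000_000))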
I have adapted How to convert byte size into human readable format in java? to do what you want:
func humanReadableByteCount(bytes: Int) -> String {
    if bytes < 1000 { return "\(bytes) B" }
    let exp = Int(log2(Double(bytes)) / log2(1000.0))
    let unit = ["KB", "MB", "GB", "TB", "PB", "EB"][exp - 1]
    let number = Double(bytes) / pow(1000, Double(exp))
    return String(format: "%.1f %@", number, unit)
}
Notice this will format KB and MB differently than ByteCountFormatter. Here is a modification that removes a trailing zero and does not show fraction digits for KB or for numbers of 100 and above.
func humanReadableByteCount(bytes: Int) -> String {
    if bytes < 1000 { return "\(bytes) B" }
    let exp = Int(log2(Double(bytes)) / log2(1000.0))
    let unit = ["KB", "MB", "GB", "TB", "PB", "EB"][exp - 1]
    let number = Double(bytes) / pow(1000, Double(exp))
    if exp <= 1 || number >= 100 {
        return String(format: "%.0f %@", number, unit)
    } else {
        return String(format: "%.1f %@", number, unit)
            .replacingOccurrences(of: ".0", with: "")
    }
}
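A quick sanity check of the modified function (made-up inputs; the outputs follow from the format strings above):
print(humanReadableByteCount(bytes: 1_240_000)) // "1.2 MB"
print(humanReadableByteCount(bytes: 512))       // "512 B"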
Also be aware that this implementation does not take the Locale into account. For example, some locales use a comma (",") instead of a dot (".") as the decimal separator.

Swift 3 : Negative Int to hexadecimal

Hi everyone,
I need to transform an Int to its hexadecimal value.
Example: -40 => D8
I have a working method for positive (or unsigned) Int values, but it doesn't work as expected with negatives. Here's my code.
class func encodeHex(data: [Int]) -> String {
    let hexadecimal = data.reduce("") { (string, element) in
        var append = String(element, radix: 16, uppercase: false)
        if append.characters.count == 1 {
            append = "0" + append
        }
        return string + append
    }
    return hexadecimal
}
If I pass -40 I get -28.
Can anyone help? Thanks :)
I assume from your existing code that all integers are in the range
-128 ... 127. Then this would work:
func encodeHex(data: [Int]) -> String {
    return data.map { String(format: "%02hhX", $0) }.joined()
}
The "%02hhX" format prints the least significant byte of the
given integer in base 16 with 2 digits.
Example:
print(encodeHex(data: [40, -40, 127, -128]))
// 28D87F80
D8 is the last byte of the binary representation of -40. The remaining bytes are all FF.
If you are looking for a string that represents only the last byte, you can obtain it by first converting your number to an unsigned 8-bit integer and then converting that to hex, like this:
let x = UInt8(bitPattern: Int8(data))
let res = String(format: "%02X", x)
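For example, assuming a single value rather than the array from the question:
let value = -40
let x = UInt8(bitPattern: Int8(value))  // 0xD8
let res = String(format: "%02X", x)
print(res) // "D8"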

Why can't I get a negative number using bit manipulation in Swift?

This is a LeetCode question. I wrote four solutions for different versions of that question, but when I tried to use bit manipulation, I ran into the problem below. Since no one on LeetCode could answer my question, and I can't find any Swift documentation about this, I thought I would try asking here.
The question is to find the majority element (appearing more than n/2 times) in a given array. The following code works in other languages like Java, so I think this might be a general question about Swift.
func majorityElement(nums: [Int]) -> Int {
    var bit = Array(count: 32, repeatedValue: 0)
    for num in nums {
        for i in 0..<32 {
            if (num >> (31 - i) & 1) == 1 {
                bit[i] += 1
            }
        }
    }
    var ret = 0
    for i in 0..<32 {
        bit[i] = bit[i] > nums.count/2 ? 1 : 0
        ret += bit[i] * (1 << (31 - i))
    }
    return ret
}
When the input is [-2147483648], the output is 2147483648. But in Java, it can successfully output the right negative number.
The Swift documentation says:
Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
Well, that upper bound is 2,147,483,647, and 2147483648 is 1 larger than that number. When I ran pow(2.0, 31.0) in a playground, it showed 2147483648. I got confused. What's wrong with my code, or what did I miss about Swift's Int?
A Java int is a 32-bit integer. The Swift Int is 32-bit or 64-bit
depending on the platform. In particular, it is 64-bit on all OS X
platforms where Swift is available.
Your code handles only the lower 32 bits of the given integers, so that
-2147483648 = 0xffffffff80000000
becomes
2147483648 = 0x0000000080000000
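For illustration (a sketch in current Swift, assuming a 64-bit Int), masking the lower 32 bits reproduces exactly that value:
let n = -2147483648               // 0xffffffff80000000 as a 64-bit Int
let lower32 = n & 0xFFFFFFFF      // keep only the lower 32 bits
print(String(lower32, radix: 16)) // "80000000"
print(lower32)                    // 2147483648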
To solve the problem, you can either change the function to take 32-bit integers as arguments:
func majorityElement(nums: [Int32]) -> Int32 { ... }
or make it work with integers of arbitrary size by computing the actual bit count and using that instead of the constant 32:
func majorityElement(nums: [Int]) -> Int {
    let numBits = sizeof(Int) * 8
    var bit = Array(count: numBits, repeatedValue: 0)
    for num in nums {
        for i in 0..<numBits {
            if (num >> (numBits - 1 - i) & 1) == 1 {
                bit[i] += 1
            }
        }
    }
    var ret = 0
    for i in 0..<numBits {
        bit[i] = bit[i] > nums.count/2 ? 1 : 0
        ret += bit[i] * (1 << (numBits - 1 - i))
    }
    return ret
}
A more Swifty way would be to use map() and reduce():
func majorityElement(nums: [Int]) -> Int {
    let numBits = sizeof(Int) * 8
    let bitCounts = (0 ..< numBits).map { i in
        nums.reduce(0) { $0 + ($1 >> i) & 1 }
    }
    let major = (0 ..< numBits).reduce(0) {
        $0 | (bitCounts[$1] > nums.count/2 ? 1 << $1 : 0)
    }
    return major
}
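For reference, sizeof(Int) and Array(count:repeatedValue:) come from older Swift. A rough equivalent of the same idea in current Swift might look like this (a sketch, not part of the original answer):
func majorityElement(nums: [Int]) -> Int {
    let numBits = Int.bitWidth
    let bitCounts = (0 ..< numBits).map { i in
        nums.reduce(0) { $0 + ($1 >> i) & 1 }
    }
    return (0 ..< numBits).reduce(0) {
        $0 | (bitCounts[$1] > nums.count / 2 ? 1 << $1 : 0)
    }
}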

Split a double by dot to two numbers

So I am trying to split a number in Swift. I have searched for this on the internet but have had no success. I will start with a number like:
var number = 34.55
And from this number, I want to create two separate numbers by splitting at the dot. So the output should be something like:
var firstHalf = 34
var secondHalf = 55
Or the output can also be an array, if that's easier. How can I achieve this?
The easiest way would be to first cast the double to a string:
var d = 34.55
var b = "\(d)" // or String(d)
then split the string with the global split function:
var c = split(b) { $0 == "." } // ["34", "55"]
You can also bake this functionality into an extension on Double:
extension Double {
    func splitAtDecimal() -> [Int] {
        return (split("\(self)") { $0 == "." }).map {
            String($0).toInt()!
        }
    }
}
This would allow you to do the following:
var number = 34.55
print(number.splitAtDecimal()) // [34, 55]
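For reference, the global split function and toInt() are from Swift 1; in current Swift the same idea could be written roughly like this (a sketch, not part of the original answer):
let d = 34.55
let parts = "\(d)".split(separator: ".").compactMap { Int($0) }
print(parts) // [34, 55]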
Well, what you have there is a float, not a string. You can't really "split" it, and remember that a float is not strictly limited to 2 digits after the separator.
One solution is:
var firstHalf = Int(number)
var secondHalf = Int((number - Double(firstHalf)) * 100)
It's nasty but it'll do the right thing for your example (it will, however, fail when dealing with numbers that have more than two decimals of precision)
Alternatively, you could convert it into a string and then split that.
var stringified = NSString(format: "%.2f", number)
var parts = stringified.componentsSeparatedByString(".")
Note that I'm explicitly calling the formatter here, to avoid unwanted behavior of standard float to string conversions.
Add the following extension:
extension Double {
    func splitIntoParts(decimalPlaces: Int, round: Bool) -> (leftPart: Int, rightPart: Int) {
        var number = self
        if round {
            // round to the specified number of decimal places:
            let divisor = pow(10.0, Double(decimalPlaces))
            number = Darwin.round(self * divisor) / divisor
        }
        // convert to string and split on the decimal point:
        let parts = String(number).components(separatedBy: ".")
        // extract left and right parts:
        let leftPart = Int(parts[0]) ?? 0
        let rightPart = Int(parts[1]) ?? 0
        return (leftPart, rightPart)
    }
}
Usage - Unrounded:
let number:Double = 95.99999999
let parts = number.splitIntoParts(decimalPlaces: 3, round: false)
print("LeftPart: \(parts.leftPart) RightPart: \(parts.rightPart)")
Outputs:
LeftPart: 95 RightPart: 99999999
Usage - Rounded:
let number:Double = 95.199999999
let parts = number.splitIntoParts(decimalPlaces: 1, round: true)
Outputs:
LeftPart: 95 RightPart: 2
Actually, it's not possible to split a double at the dot into two INTEGER numbers. All of the earlier solutions will produce a bug in nearly 10% of cases.
Here's why:
The breaking case will be numbers with decimal parts starting with one or more zeroes, for example: 1.05, 11.00698, etc.
Any decimal part that starts with one or more zeroes will have those zeroes discarded when converted to integers. The result of those conversions:
1.05 will become (1, 5)
11.00698 will become (11, 698)
An ugly and hard to find bug is guaranteed...
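A short demonstration of the problem, using the string-splitting idea from the earlier answers (written here in current Swift):
let brokenParts = "\(1.05)".split(separator: ".").compactMap { Int($0) }
print(brokenParts) // [1, 5], the leading zero of ".05" is silently lost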
The only way to meaningfully split a decimal number is to convert it to (Int, Double). Below is a simple extension to Double that does that:
extension Double {
    func splitAtDecimal() -> (Int, Double) {
        let whole = Int(self)
        let decimal = self - Darwin.floor(self)
        return (whole, decimal)
    }
}
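A quick usage example (the fractional part is subject to the usual floating-point rounding):
let (whole, decimal) = 34.55.splitAtDecimal()
print(whole)   // 34
print(decimal) // ≈ 0.55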