SBData is wrong when SBValue comes from a Swift Dictionary

I'm trying to write a Python function to format a Foundation.Decimal, for use as a type summarizer. I posted it in this answer. I'll also include it at the bottom of this question, with extra debug prints.
I've now discovered a bug, but I don't know if the bug is in my function, or in lldb, or possibly in the Swift compiler.
Here's a transcript that demonstrates the bug. I load my type summarizer in ~/.lldbinit, so the Swift REPL uses it.
:; xcrun swift
registering Decimal type summaries
Welcome to Apple Swift version 4.2 (swiftlang-1000.11.37.1 clang-1000.11.45.1). Type :help for assistance.
1> import Foundation
2> let dec: Decimal = 7
dec: Decimal = 7
Above, the 7 in the debugger output is from my type summarizer and is correct.
3> var dict = [String: Decimal]()
dict: [String : Decimal] = 0 key/value pairs
4> dict["x"] = dec
5> dict["x"]
$R0: Decimal? = 7
Above, the 7 is again from my type summarizer, and is correct.
6> dict
$R1: [String : Decimal] = 1 key/value pair {
  [0] = {
    key = "x"
    value = 0
  }
}
Above, the 0 (in value = 0) is from my type summarizer, and is incorrect. It should be 7.
So why is it zero? My Python function is given an SBValue. It calls GetData() on the SBValue to get an SBData. I added debug prints to the function to print the bytes in the SBData, and also to print the result of sbValue.GetLoadAddress(). Here's the transcript with these debug prints:
:; xcrun swift
registering Decimal type summaries
Welcome to Apple Swift version 4.2 (swiftlang-1000.11.37.1 clang-1000.11.45.1). Type :help for assistance.
1> import Foundation
2> let dec: Decimal = 7
dec: Decimal = loadAddress: ffffffffffffffff
data: 00 21 00 00 07 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
7
Above, we can see that the load address is bogus, but the bytes of the SBData are correct (byte 1, 0x21, contains the length and flags; byte 4, 0x07, is the first byte of the significand).
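To make that layout concrete, here is the same decode done by hand in plain Python, outside lldb, using the layout my formatter below assumes (byte 0 = signed exponent, low nibble of byte 1 = number of 16-bit significand words, bit 0x10 = sign, significand words starting at byte 4):

raw = bytes.fromhex('00 21 00 00 07 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00'.replace(' ', ''))
exponent = raw[0]                 # 0 (really a signed byte; 0 either way here)
length = raw[1] & 0x0f            # 1 significand word
isNegative = bool(raw[1] & 0x10)  # False
significand = int.from_bytes(raw[4:4 + 2 * length], 'little')  # 7
print(significand * 10 ** exponent)  # prints 7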
3> var dict = [String: Decimal]()
dict: [String : Decimal] = 0 key/value pairs
4> dict["x"] = dec
5> dict
$R0: [String : Decimal] = 1 key/value pair {
  [0] = {
    key = "x"
    value = loadAddress: ffffffffffffffff
    data: 00 00 00 00 00 21 00 00 07 00 00 00 00 00 00 00 00 00 00 00
    0
  }
}
Above, we can see that the load address is still bogus, and now the bytes of the SBData are incorrect. The SBData still contains 20 bytes (the correct number for a Foundation.Decimal, aka NSDecimal), but now four 00 bytes have been inserted at the front and the last four bytes have been dropped.
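For what it's worth, if I repeat the hand decode from above on these bytes but start at offset 4 (skipping the four spurious leading zeros), I get the expected 7 back, so the payload itself looks intact, just shifted:

shifted = bytes.fromhex('00 00 00 00 00 21 00 00 07 00 00 00 00 00 00 00 00 00 00 00'.replace(' ', ''))
raw = shifted[4:]                 # the 16 surviving bytes; the dropped tail is all zeros for a value this small
length = raw[1] & 0x0f            # 1
significand = int.from_bytes(raw[4:4 + 2 * length], 'little')  # 7
print(significand * 10 ** raw[0])  # prints 7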
So here are my specific questions:
Am I using the lldb API incorrectly, and thus getting wrong answers? If so, what am I doing wrong and how should I correct it?
If I'm using the lldb API correctly, then is this a bug in lldb, or is the Swift compiler emitting incorrect metadata? How can I figure out which tool has the bug? (Because if it's a bug in one of the tools, I'd like to file a bug report.)
If it's a bug in lldb or Swift, how can I work around the problem so I can format a Decimal correctly when it's part of a Dictionary?
Here is my type formatter, with debug prints:
# Decimal / NSDecimal support for lldb
#
# Put this file somewhere, e.g. ~/.../lldb/Decimal.py
# Then add this line to ~/.lldbinit:
# command script import ~/.../lldb/Decimal.py
import lldb
def stringForDecimal(sbValue, internal_dict):
    from decimal import Decimal, getcontext

    print(' loadAddress: %x' % sbValue.GetLoadAddress())

    sbData = sbValue.GetData()
    if not sbData.IsValid():
        raise Exception('unable to get data from SBValue')
    if sbData.GetByteSize() != 20:
        raise Exception('expected data to be 20 bytes but found ' + repr(sbData.GetByteSize()))

    sbError = lldb.SBError()
    exponent = sbData.GetSignedInt8(sbError, 0)
    if sbError.Fail():
        raise Exception('unable to read exponent byte: ' + sbError.GetCString())

    flags = sbData.GetUnsignedInt8(sbError, 1)
    if sbError.Fail():
        raise Exception('unable to read flags byte: ' + sbError.GetCString())
    length = flags & 0xf
    isNegative = (flags & 0x10) != 0

    debugString = ''
    for i in range(20):
        debugString += ' %02x' % sbData.GetUnsignedInt8(sbError, i)
    print(' data:' + debugString)

    if length == 0 and isNegative:
        return 'NaN'
    if length == 0:
        return '0'

    getcontext().prec = 200
    value = Decimal(0)
    scale = Decimal(1)
    for i in range(length):
        digit = sbData.GetUnsignedInt16(sbError, 4 + 2 * i)
        if sbError.Fail():
            raise Exception('unable to read memory: ' + sbError.GetCString())
        value += scale * Decimal(digit)
        scale *= 65536
    value = value.scaleb(exponent)
    if isNegative:
        value = -value
    return str(value)

def __lldb_init_module(debugger, internal_dict):
    print('registering Decimal type summaries')
    debugger.HandleCommand('type summary add Foundation.Decimal -F "' + __name__ + '.stringForDecimal"')
    debugger.HandleCommand('type summary add NSDecimal -F "' + __name__ + '.stringForDecimal"')

This looks like an lldb bug. Please file a bug about this against lldb with http://bugs.swift.org.
For background: there is some magic going on behind your back in the Dictionary case. I can't show this in the REPL, but if you have a [String : Decimal] dictionary as a local variable in some real code and do:
(lldb) frame variable --raw dec_array
(Swift.Dictionary<Swift.String, Foundation.Decimal>) dec_array = {
  _variantBuffer = native {
    native = {
      _storage = 0x0000000100d05780 {
        Swift._SwiftNativeNSDictionary = {}
        bucketCount = {
          _value = 2
        }
        count = {
          _value = 1
        }
        initializedEntries = {
          values = {
            _rawValue = 0x0000000100d057d0
          }
          bitCount = {
            _value = 2
          }
        }
        keys = {
          _rawValue = 0x0000000100d057d8
        }
        values = {
          _rawValue = 0x0000000100d057f8
        }
        seed = {
          0 = {
            _value = -5794706384231184310
          }
          1 = {
            _value = 8361200869849021207
          }
        }
      }
    }
    cocoa = {
      cocoaDictionary = 0x00000001000021b0
    }
  }
}
A Swift Dictionary doesn't actually contain the dictionary elements anywhere obvious, and certainly not as ivars. So lldb has a "synthetic child provider" for Swift Dictionaries that makes up SBValues for the keys and values of the Dictionary, and it is one of those synthetic children that your formatter is being handed.
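If the term is unfamiliar: a synthetic child provider is just a Python class that lldb consults when it needs a type's children. The real Swift Dictionary provider is built into lldb and is far more involved, but a toy provider looks roughly like the sketch below (the type name, class name, and single "payload" child are all made up for illustration):

import lldb

class ToyProvider:
    """Illustrative synthetic child provider; not the real Dictionary one."""

    def __init__(self, valobj, internal_dict):
        self.valobj = valobj

    def update(self):
        return False          # let lldb recompute children when the value changes

    def num_children(self):
        return 1

    def get_child_index(self, name):
        return 0 if name == 'payload' else -1

    def get_child_at_index(self, index):
        if index != 0:
            return None
        # Children built from an SBData like this are backed by memory that
        # lldb itself manages, which is why GetLoadAddress() on them is -1.
        data = self.valobj.GetData()
        return self.valobj.CreateValueFromData('payload', data, self.valobj.GetType())

# registered with something like:
#   (lldb) type synthetic add SomeType --python-class toy_module.ToyProvider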
That's also why the load address is -1. That really means "this is a synthetic thing whose data lldb is directly managing, not a thing at an address somewhere in your program." The same is true of REPL results; they are more a fiction that lldb maintains. But if you looked at a local variable of type Decimal, you would see a valid load address, because it is a thing that lives somewhere in memory.
Anyway, apparently the synthetic child Decimal objects we make up to represent the values of the dictionary don't set the start of the data correctly. Interestingly enough, if you make a [Decimal : String] dictionary, the key field's SBData is correct, and your formatter works. It's just the values that aren't right.
I tried the same thing with Dictionaries that have Strings as values, and the SBData looks correct there. So there's something funny about Decimal. Anyway, thanks for pursuing this, and please do file a bug.

Related

Integer encoding format

I've run across some PIN encoding which I'm trying to figure out so I can improve upon a web application used at my place of work.
When I reset users' PINs (in this case, just my own for testing purposes), I'm seeing the following:
PIN VALUE
000000 = 7F55858585858585
111111 = 7F55868686868686
222222 = 7F55878787878787
999999 = 7F558E8E8E8E8E8E
000001 = 7F01313131313132
000011 = 7F55858585858686
000111 = 7F01313131323232
001111 = 7F55858586868686
011111 = 7F01313232323232
000002 = 7F02323232323234
100000 = 7F01323131313131
111112 = 7F03343434343435
123456 = 7F0738393A3B3C3D
654321 = 7F073D3C3B3A3938
1357924680 = 7F01323436383A3335373931
1111111111 = 7F5586868686868686868686
1234567890 = 7F0132333435363738393A31
It's clearly just hex, and always starts with 7F (1111111 or 127), but I'm not seeing a pattern for how the next two characters are chosen. Those two characters seem to be the determining value for converting the PIN.
For example:
000000 = 7F 55 858585858585
7F (hex) = 127 (dec) or 1111111 (bin) ## appears to not be used in the calculation?
55 (hex) = 85 (dec) or 1010101 (bin)
0 (PIN) + 85 = 85
000000 = 858585858585
111111 = 7F 55 868686868686
7F (hex) = 127 (dec) or 1111111 (bin) ## appears to not be used in the calculation?
55 (hex) = 85 (dec)
1 (PIN) + 85 = 86
111111 = 868686868686
But then also:
1357924680 = 7F 01 323436383A3335373931
01 (hex) = 31 (dec) ?
1 (PIN) + 31 = 32
1357924680 = 323436383A3335373931
Any help pointing me in the right direction would be greatly appreciated.
I don't see enough data in your minimal reproducible example to work out how the pinshift value (supplied to the pin_to_hex function) should be determined. A random value is used in the following solution:
import random

def hex_to_pin(pinhex: str) -> list:
    '''
    decode a PIN from a particular hexadecimal-formatted string
        hex_to_pin('7F0738393A3B3C3D')
    inverse of the "pin_to_hex" function (any of the following):
        hex_to_pin(pin_to_hex('123456', 7))
        pin_to_hex(*hex_to_pin('7F0738393A3B3C3D'))
    '''
    xxaux = bytes.fromhex(pinhex)
    return [bytes([x - xxaux[1] for x in xxaux[2:]]).decode(),
            xxaux[1]]

def pin_to_hex(pindec: str, pinshift: int, upper=False) -> str:
    '''
    encode a PIN to a particular hexadecimal-formatted string
        pin_to_hex('123456', 7)
    inverse of the "hex_to_pin" function (any of the following):
        pin_to_hex(*hex_to_pin('7F0738393A3B3C3D'), True)
        hex_to_pin(pin_to_hex('123456', 7))
    '''
    shift_ = max(1, pinshift % 199)     ## 134 for alpha-numeric PIN code
    retaux = [b'\x7F', shift_.to_bytes(1, byteorder='big')]
    for digit_ in pindec.encode():
        retaux.append((digit_ + shift_).to_bytes(1, byteorder='big'))
    if upper:
        return (b''.join(retaux)).hex().upper()
    else:
        return (b''.join(retaux)).hex()

def get_pin_shift(pindec: str) -> int:
    '''
    determine "pinshift" parameter for the "pin_to_hex" function
    currently returns a random number
    '''
    return random.randint(1, 198)       ## (1, 133) for alpha-numeric PIN code

hexes = [
    '7F01323436383A3335373931',
    '7F0738393A3B3C3D',
    '7F558E8E8E8E8E8E'
]

print("hex_to_pin:")
maxlen = len(max(hexes, key=len))
deces = []
for xshex in hexes:
    xsdec = hex_to_pin(xshex)
    print(f"{xshex:<{maxlen}} ({xsdec[1]:>3}) {xsdec[0]}")
    deces.append(xsdec[0])

print("pin_to_hex:")
for xsdec in deces:
    xsshift = get_pin_shift(xsdec)
    xshex = pin_to_hex(xsdec, xsshift)
    print(f"{xshex:<{maxlen}} ({xsshift:>3}) {xsdec}")
Output of SO\71875753.py:
hex_to_pin:
7F01323436383A3335373931 ( 1) 1357924680
7F0738393A3B3C3D ( 7) 123456
7F558E8E8E8E8E8E ( 85) 999999
pin_to_hex:
7f1041434547494244464840 ( 16) 1357924680
7f4e7f8081828384 ( 78) 123456
7f013a3a3a3a3a3a ( 1) 999999
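To spell out the pattern the code above relies on: after the 7F header byte, every byte is simply the ASCII code of the corresponding PIN character plus the shift byte (the blob's second byte). A quick check against one of the sample values:

sample = bytes.fromhex('7F0738393A3B3C3D')
shift = sample[1]                               # 0x07
pin = ''.join(chr(b - shift) for b in sample[2:])
print(shift, pin)                               # 7 123456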

How to get desired values from BLE manufacturer data flutter

I am new to Flutter and I am working on an app that reads data from a BLE beacon. I have scanned the device and got the manufacturer data as {256:[0,0,0,16,1,57,33,18,0,0,154,10,0,0,94,0]}.
The device manufacturer told me to decode the data like this:
KCBAdvDataManufacturerData = <.. .. .. .. .. .. .. .. be 21 01 00 50 08 00 00 5e ..>
The UUID - kCBAdvDataManufacturerData packet contains the sensor data as shown below:
Byte index 8 – 11 = Pressure 32bit value
Byte index 12 – 15 = Temperature 32bit value
Byte index 16 = Battery level in percentage
I have no idea how, in Dart, to get from
{256:[0,0,0,16,1,57,33,18,0,0,154,10,0,0,94,0]}
to
Byte index 8 – 11 = Pressure 32bit value
Byte index 12 – 15 = Temperature 32bit value
Byte index 16 = Battery level in percentage
and then into a human-understandable form, where temperature is in °C, pressure is in PSI, and battery is in %.
The manufacturer data is a list of bytes, so you can get sublists from it with List.sublist: https://api.dart.dev/stable/2.9.1/dart-core/List/sublist.html
That sublist of 4 bytes can then be converted into a signed integer for the pressure and temperature values:
Convert 4 byte into a signed integer
I am not sure the index values you have been given are quite right. I am assuming the data is in little-endian format, so my reading of the data is:
Pressure = [33,18,0,0] = 4641 (Are you expecting a value of about 46.41psi?)
Temperature = [154,10,0,0] = 2714 (Are you expecting a value of about 27.14c?)
Battery = [94] = 94 (Are you expecting a value of 94%?)
This might be done like the following:
import 'dart:typed_data';
var manufacturerData = Uint8List.fromList([0,0,0,16,1,57,33,18,0,0,154,10,0,0,94,0]);
var pressure = ByteData.sublistView(manufacturerData, 6, 10);
var temperature = ByteData.sublistView(manufacturerData, 10, 14);
var battery = ByteData.sublistView(manufacturerData, 14, 15);
main() {
  print(pressure.getUint32(0, Endian.little));
  print(temperature.getUint32(0, Endian.little));
  print(battery.getUint8(0));
}
Gives me the output:
4641
2714
94
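If you want to sanity-check the byte layout outside Dart, the same decode takes a few lines of Python. The division by 100 is only a guess based on the "46.41 psi / 27.14 °C" questions above, not something the manufacturer has confirmed:

import struct

# same bytes as the manufacturer data above
data = bytes([0, 0, 0, 16, 1, 57, 33, 18, 0, 0, 154, 10, 0, 0, 94, 0])

# two little-endian unsigned 32-bit values starting at offset 6, battery at offset 14
pressure_raw, temperature_raw = struct.unpack_from('<II', data, 6)
battery = data[14]

print(pressure_raw / 100, temperature_raw / 100, battery)   # 46.41 27.14 94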

Confused about some hexadecimals, longs, and int

I am following along my Scala textbook and I see this:
scala> val hex = 0x5
hex: Int = 5
scala> val hex2 = 0x00ff
hex2: Int = 255
scala> val hex3 = 0xff
hex3: Int = 255
scala> var hex4 = 0xbe
hex4: Int = 190
scala> var hex5 = 0xFF
hex5: Int = 255
scala> val magic = 0xcafebabe
magic: Int = -889275714
scala> var prog = 0xCAFEBABEL
prog: Long = 3405691582
scala> val tower = 35l
tower: Long = 35
My questions:
why do you need the extra 00 after the x in 0x00FF?
I get why FF = 255... hexadecimal is base16 starting at 00 = 0 and 0F = 15. But why does 0xcafebabe = -889275714?
What is going on with the Longs? I don't understand what is happening there.
You don't; it's just to show that leading 0s are ignored, as far as I can tell.
Int is a 32-bit signed integer: if the value reaches 2^31 or more, the highest bit gets set but is interpreted as a minus sign. In short, you have an overflow.
If you add "L" (or "l"), the literal becomes a Long, which uses 64 bits, so the overflow doesn't happen.
00FF needs the 2 zeros to make sure that this is a SIGNED number, proving that it is positive by using the two zeros.
The cafebabe doesn't have that since it is a negative number. We found that out because of the lack of zeros at the end.
Finally, the point of the long (though I'm not sure of that one) is to set the idea that there are unseen zeros stretching backwards, thus giving us a positive number.

Swift 3 : Negative Int to hexadecimal

Hi everyone,
I need to transform an Int to its hexadecimal value.
Example: -40 => D8
I have a working method for positive (or unsigned) Int but it doesn't work as expected with negatives. Here's my code.
class func encodeHex(data: [Int]) -> String {
    let hexadecimal = data.reduce("") { (string, element) in
        var append = String(element, radix: 16, uppercase: false)
        if append.characters.count == 1 {
            append = "0" + append
        }
        return string + append
    }
    return hexadecimal
}
If I pass -40 I get -28.
Can anyone help? Thanks :)
I assume from your existing code that all integers are in the range
-128 ... 127. Then this would work:
func encodeHex(data: [Int]) -> String {
    return data.map { String(format: "%02hhX", $0) }.joined()
}
The "%02hhX" format prints the least significant byte of the
given integer in base 16 with 2 digits.
Example:
print(encodeHex(data: [40, -40, 127, -128]))
// 28D87F80
D8 is the last byte of the binary representation of -40. The remaining, higher-order bytes are all FFs.
If you are looking for a string that represents only the last byte, you can obtain it by first converting your number to an unsigned 8-bit integer and then converting that to hex, like this:
let x = UInt8(bitPattern:Int8(data))
let res = String(format:"%02X", x)

How do I convert a string to hex in Rust?

I want to convert a string of characters (a SHA256 hash) to hex in Rust:
extern crate crypto;
extern crate rustc_serialize;
use rustc_serialize::hex::ToHex;
use crypto::digest::Digest;
use crypto::sha2::Sha256;
fn gen_sha256(hashme: &str) -> String {
    let mut sh = Sha256::new();
    sh.input_str(hashme);
    sh.result_str()
}

fn main() {
    let hash = gen_sha256("example");
    hash.to_hex()
}
The compiler says:
error[E0599]: no method named `to_hex` found for type `std::string::String` in the current scope
  --> src/main.rs:18:10
   |
18 |     hash.to_hex()
   |          ^^^^^^
I can see this is true; it looks like it's only implemented for [u8].
What am I to do? Is there no method implemented to convert from a string to hex in Rust?
My Cargo.toml dependencies:
[dependencies]
rust-crypto = "0.2.36"
rustc-serialize = "0.3.24"
Edit: I just realized the string is already in hex format from the rust-crypto library. D'oh.
I will go out on a limb here, and suggest that the solution is for hash to be of type Vec<u8>.
The issue is that while you can indeed convert a String to a &[u8] using as_bytes and then use to_hex, you first need to have a valid String object to start with.
While any String object can be converted to a &[u8], the reverse is not true. A String object is solely meant to hold a valid UTF-8 encoded Unicode string: not all byte patterns qualify.
Therefore, it is incorrect for gen_sha256 to produce a String. A more correct type would be Vec<u8>, which can indeed accept any byte pattern. And from then on, invoking to_hex is easy enough:
hash.as_slice().to_hex()
It appears the source for ToHex has the solution I'm looking for. It contains a test:
#[test]
pub fn test_to_hex() {
    assert_eq!("foobar".as_bytes().to_hex(), "666f6f626172");
}
My revised code is:
let hash = gen_sha256("example");
hash.as_bytes().to_hex()
This appears to work. I will take some time before I accept this solution if anyone has an alternative answer.
A hexadecimal representation can be generated with a function like this:
pub fn hex_push(buf: &mut String, blob: &[u8]) {
    for ch in blob {
        fn hex_from_digit(num: u8) -> char {
            if num < 10 {
                (b'0' + num) as char
            } else {
                (b'A' + num - 10) as char
            }
        }
        buf.push(hex_from_digit(ch / 16));
        buf.push(hex_from_digit(ch % 16));
    }
}
This is a tad more efficient than the generic radix formatting implemented currently in the language.
Here's a benchmark:
test bench_specialized_hex_push ... bench: 12 ns/iter (+/- 0) = 250 MB/s
test bench_specialized_fomat ... bench: 42 ns/iter (+/- 12) = 71 MB/s
test bench_specialized_format ... bench: 47 ns/iter (+/- 2) = 63 MB/s
test bench_specialized_hex_string ... bench: 76 ns/iter (+/- 9) = 39 MB/s
test bench_to_hex ... bench: 82 ns/iter (+/- 12) = 36 MB/s
test bench_format ... bench: 97 ns/iter (+/- 8) = 30 MB/s
Thanks to the user jey in the ##rust IRC channel on freenode. You can just use the hex representation that fmt provides:
>> let mut s = String::new();
>> use std::fmt::Write as FmtWrite; // renaming import to avoid collision
>> for b in "hello world".as_bytes() { write!(s, "{:02x}", b); }
()
>> s
"68656c6c6f20776f726c64"
>>
or a bit silly one,
>> "hello world".as_bytes().iter().map(|x| format!("{:02x}", x)).collect::<String>()
"68656c6c6f20776f726c64"
Using the hex crate, it is very easy:
use hex;

println!("{}", hex::encode("some str"));