How to decode hex into separate bits in Perl? - perl

A byte stored as two hex numbers contains a set of flags. I need to extract these flags, as 0's and 1's. Iterating over entries in a file with:
foreach (<>)
{
    @line = split(/ /, $_);
    $torR = !!((hex $line[4]) & 0x3); # bit 0 or 1
    $torY = !!((hex $line[4]) & 0x4); # bit 2
    $torG = !!((hex $line[4]) & 0x8); # bit 3
    print "$torR,$torY,$torG\n";
}
run on data file:
796.129 [1f/01] len:7< 02 01 D5 01 8B 0A 8E
796.224 [1f/01] len:7< 02 01 D4 03 09 A9 B8
796.320 [1f/01] len:7< 00 01 D4 03 07 49 5A
796.415 [1f/01] len:7< 00 01 D4 00 11 A0 EE
796.515 [1f/01] len:7< 00 01 D4 00 00 31 4C
796.627 [1f/01] len:7< 02 01 D4 01 89 C1 FD
796.724 [1f/01] len:7< 02 01 D3 03 06 39 FD
796.820 [1f/01] len:7< 08 01 D4 03 08 40 6F
796.915 [1f/01] len:7< 08 01 D5 00 13 3D A4
797.015 [1f/01] len:7< 08 01 D4 00 00 34 04
Actual Result -
1,,
1,,
,,
,,
,,
1,,
1,,
,,1
,,1
,,1
Desired result:
1,0,0
1,0,0
0,0,0
0,0,0
0,0,0
1,0,0
1,0,0
0,0,1
0,0,1
0,0,1
Seems like 'false' gets stored as empty string instead of '0'.
Is there a neat trick to get this right at once, or do I need to convert empty strings to zeros "manually"?

If you want the true/false values to be numeric, you need to coerce them to be numeric:
$torR = 0 + !!((hex $line[4]) & 0x3); # bit 0 or 1
$torY = 0 + !!((hex $line[4]) & 0x4); # bit 2
$torG = 0 + !!((hex $line[4]) & 0x8); # bit 3
Keep in mind that the empty string '' is also a false value.
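For illustration, a minimal example (using one of the hex bytes from your data) showing the difference between the bare boolean and the coerced value:
my $flag = !!((hex '02') & 0x4);   # false: Perl's special dual ''/0 value
print "[$flag]\n";                 # prints "[]" -- it stringifies to the empty string
print 0 + $flag, "\n";             # prints "0"  -- numeric coercion gives 0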
On the other hand, I might be inclined to write that as:
my (@ryg) = map 0 + !!((hex $line[4]) & $_), 0x3, 0x4, 0x8;
print join(', ', @ryg), "\n";
In addition, you would probably benefit from not using plain numbers in your program. Consider, for example, having a %FLAG structure that gives names to these constants, and a %COL structure that gives names to the columns you are interested in. Using the data you posted:
use Const::Fast;
const my %FLAG => (
    torR => 0x3,
    torY => 0x4,
    torG => 0x8,
);
const my %COL => (
    # ...
    tor => 4,
);
while (my $line = <DATA>) {
    my @line = split ' ', $line;
    my %set_flags = map +($_ => 0 + !!((hex $line[$COL{tor}]) & $FLAG{$_})), qw(torR torY torG);
    print join(', ', @set_flags{qw(torR torY torG)}), "\n";
}
__DATA__
796.129 [1f/01] len:7< 02 01 D5 01 8B 0A 8E
796.224 [1f/01] len:7< 02 01 D4 03 09 A9 B8
796.320 [1f/01] len:7< 00 01 D4 03 07 49 5A
796.415 [1f/01] len:7< 00 01 D4 00 11 A0 EE
796.515 [1f/01] len:7< 00 01 D4 00 00 31 4C
796.627 [1f/01] len:7< 02 01 D4 01 89 C1 FD
796.724 [1f/01] len:7< 02 01 D3 03 06 39 FD
796.820 [1f/01] len:7< 08 01 D4 03 08 40 6F
796.915 [1f/01] len:7< 08 01 D5 00 13 3D A4
797.015 [1f/01] len:7< 08 01 D4 00 00 34 04

I think I would use split and unpack to turn each value into an array of zeroes and ones, and then examine them individually.
use strict;
use warnings 'all';
for my $val ( qw/ 02 02 00 00 00 02 02 08 08 08 / ) {
    my @bits = split //, unpack 'b8', chr hex $val;
    my $torR = $bits[0] || $bits[1] ? 1 : 0;
    my $torY = $bits[2] ? 1 : 0;
    my $torG = $bits[3] ? 1 : 0;
    print "$torR,$torY,$torG\n";
}
output
1,0,0
1,0,0
0,0,0
0,0,0
0,0,0
1,0,0
1,0,0
0,0,1
0,0,1
0,0,1
Or here's a way using the Bit::Vector module, which produces the same result:
use strict;
use warnings 'all';
use Bit::Vector;
for my $val ( qw/ 02 02 00 00 00 02 02 08 08 08 / ) {
    my $vec = Bit::Vector->new_Hex(8, $val);
    my $torR = $vec->Chunk_Read(2, 0) ? 1 : 0;
    my $torY = $vec->Chunk_Read(1, 2) ? 1 : 0;
    my $torG = $vec->Chunk_Read(1, 3) ? 1 : 0;
    print "$torR,$torY,$torG\n";
}
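If you would rather stay with core Perl, the built-in vec function reads individual bits in the same low-to-high order; the following is only a sketch along the same lines as the examples above:
use strict;
use warnings 'all';
for my $val ( qw/ 02 02 00 00 00 02 02 08 08 08 / ) {
    my $byte = chr hex $val;
    # vec($byte, $n, 1) reads bit $n, counting from the least significant bit
    my $torR = vec($byte, 0, 1) || vec($byte, 1, 1) ? 1 : 0;
    my $torY = vec($byte, 2, 1) ? 1 : 0;
    my $torG = vec($byte, 3, 1) ? 1 : 0;
    print "$torR,$torY,$torG\n";
}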

Related

Address of segment descriptor

All values are in the hexadecimal number system. On a Pentium in protected mode, the registers have the following values: LDTR = 06000000, GDTR = 08000000, CR3 = 10000000, DS = 14, CS = 0034, CR0 = 00000001.
If an instruction (e.g. MOV AL, [2A66]) accesses the logical address 2A66, what physical address does it access? At what address is the segment descriptor located? The current memory contents, looking at absolute addresses, are:
........
06000000 CD 20 FF 9F 00 9A EE FE 1D F0 4F 03 22 05 8A 03
06000010 22 05 17 03 22 93 0D 04 01 01 01 00 02 FF FF FF
.........
08000000 CA 20 FF 9F 00 9A E3 FE 1D F2 4F 08 23 05 8A 07
08000010 26 05 19 03 22 05 0D 04 01 02 01 00 02 FF FA FF
.........
10000020 3A 56 21 40 2A 38 42 18 2A 56 42 40 8E 48 42 18
10000030 2A 36 42 40 9A 48 42 18 7A 56 42 20 8E 48 42 18
10000040 23 60 42 40 4E A8 42 18 5A 56 42 40 8E 48 42 18
.........
40426860 C6 06 23 99 00 80 3E 1D 96 00 74 03 E9 99 00 E8
40426870 A6 01 E8 FF 03 75 19 80 3E C4 98 00 34 00 AD 0A
40426880 13 96 00 BA E9 89 75 03 E9 17 01 C6 06 1F 99 01
40426890 B8 00 6C BE 08 98 BB 21
.........
C6011D70 C6 06 23 99 00 80 3E 1D 96 00 74 03 E9 99 00 E8
C6011D80 A6 01 E8 FF 03 75 19 80 3E C4 98 00 34 00 AD 0A
C6011D90 13 96 00 BA E9 89 75 03 E9 17 01 C6 06 1F 99 01
Could you give me some guidelines on what the problem here is and what I need to know to solve it? Operating systems and registers are new to me, so I don't know what I'm supposed to do here. I don't even know where I should start.

mitmproxy: HTTP request with nonexistent leading 0 in data

I am trying to use mitmproxy to look at the traffic from my Win32 Schannel TLS client. But when I use mitmproxy, the following messages throw a "Bad HTTP request line" error, with a leading 0 in the binary dump that does not exist in the data that my client sends (I have checked with a little Python server).
"CONNECT www.example.com:443 HTTP/1.0\r\n\r\n"
"HTTP/1.0 200 Connection established\r\n\r\n"
Send TLS Client Hello:
16 03 03 00 AC 01 00 00 A8 03 03 5F 80 1A 2D F6 2A 59 DE 18
69 F0 BB 3C 2D 2B 11 90 F8 8C A7 F9 D7 96 CD DC 32 88 02 22
11 90 6A 00 00 2A C0 2C C0 2B C0 30 C0 2F 00 9F 00 9E C0 24
C0 23 C0 28 C0 27 C0 0A C0 09 C0 14 C0 13 00 9D 00 9C 00 3D
00 3C 00 35 00 2F 00 0A 01 00 00 55 00 00 00 14 00 12 00 00
0F 77 77 77 2E 65 78 61 6D 70 6C 65 2E 63 6F 6D 00 0A 00 08
00 06 00 1D 00 17 00 18 00 0B 00 02 01 00 00 0D 00 1A 00 18
08 04 08 05 08 06 04 01 05 01 02 01 04 03 05 03 02 03 02 02
06 01 06 03 00 23 00 00 00 17 00 00 FF 01 00 01 00
Bad HTTP request line: b"\x00\x16\x03\x03\x00\xac\x01\x00\x00\xa8\x03\x03_\x80\x17\xbd\x1f\xf3\x8fO\xddy\xfb\xaaR\x1c\xeb\xe0sdD\xb7}|\xeb\xbes\xdf$3\xb6\xd9\ry\x00\x00*\xc0,\xc0+\xc00\xc0/\x00\x9f\x00\x9e\xc0$\xc0#\xc0(\xc0'\xc0"
Now my question: is this just a lack of understanding on my part of how proxies and TLS work, or an error in mitmproxy?

Parse the SCCP layer from a pcap file that contains the "sendAuthenticationInfo" packet

I have a pcap file that contains the sendAuthenticationInfo message.
I am trying to parse the SCCP layer from this packet using tshark.
I tried the following:
tshark.exe -r filter.pcap -T fields -e sccp > parse.bin
I know what the result should be from parsing manually in Wireshark; the result I get is much shorter and different from what I expected.
Original packet:
d4 c3 b2 a1 02 00 04 00 00 00 00 00 00 00 00 00
00 00 00 01 01 00 00 00 f0 52 7a 57 9a d7 00 00
ba 00 00 00 ba 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 08 00 45 00 00 ac 00 00 00 00 10 84
ab cc 7f 00 00 01 7f 00 00 01 0b 58 0b 59 00 00
48 23 01 f3 c7 60 00 03 00 8c 00 00 1f 7a 00 01
00 00 00 00 00 03 01 00 01 01 00 00 00 7c 02 10
00 72 00 00 04 3a 00 00 02 4e 03 02 00 04 09 80
03 0e 19 0b 12 06 00 11 04 79 52 14 02 10 07 0b
12 07 00 12 04 44 87 92 97 01 08 44 62 42 48 04
00 00 00 01 6b 1e 28 1c 06 07 00 11 86 05 01 01
01 a0 11 60 0f 80 02 07 80 a1 09 06 07 04 00 00
01 00 0e 03 6c 1a a1 18 02 01 01 02 01 38 30 10
80 08 24 05 01 77 03 84 35 f8 02 01 01 83 01 00
00 00
expected result:
09 80 03 0e 19 0b 12 06 00 11 04 79 52 14 02 10
07 0b 12 07 00 12 04 44 87 92 97 01 08 44 62 42
48 04 00 00 00 01 6b 1e 28 1c 06 07 00 11 86 05
01 01 01 a0 11 60 0f 80 02 07 80 a1 09 06 07 04
00 00 01 00 0e 03 6c 1a a1 18 02 01 01 02 01 38
30 10 80 08 24 05 01 77 03 84 35 f8 02 01 01 83
01 00
result I got:
73 63 63 70 0d 0a

determining hash function used in digital signature

I have a digital signature (RSA - PKCS#1). After decrypting it with the RSA public key, I get the following 128 bytes:
00 01 ff ff ff .. ff 00 30 31 30 0d 06 09 60 86 48 01 65 03 04 02 01 05 00 04 20 77 51 1b f4 d7 17 d7 ad 8c 2d e5 89 2a ca e0 6d a3 c0 7d 13 4d d7 b8 01 14 87 03 00 69 e4 9b b3
PKCS#1 padding removed, 51 bytes left:
30 31 30 0d 06 09 60 86 48 01 65 03 04 02 01 05 00 04 20 77 51 1b f4 d7 17 d7 ad 8c 2d e5 89 2a ca e0 6d a3 c0 7d 13 4d d7 b8 01 14 87 03 00 69 e4 9b b3
I would like to know two things about this:
Is it possible to determine the hash function used? An encoded algorithm ID should be prepended to the actual body of the digest; is it possible to tell what algorithm it is from the raw bytes?
Where does the actual digest start (i.e. how long are the header and the digest)?
This appears to be EMSA-PKCS1-v1_5 as described in RFC 3447, which means that after removing the header and padding, you have a DER encoding of an AlgorithmIdentifier followed by the hash value itself.
From the RFC:
For the six hash functions mentioned in Appendix B.1, the DER
encoding T of the DigestInfo value is equal to the following:
[...]
SHA-256: (0x)30 31 30 0d 06 09 60 86 48 01 65 03 04 02 01 05 00 04 20 || H.
So in your example, the hash value is the SHA-256 hash starting with 77511bf4d7....
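If you want to check this programmatically, one option (a sketch in Perl, with the DigestInfo prefixes copied from the RFC and the 51 bytes from the question hard-coded as a hex string) is to match the decoded block against the known prefixes:
use strict;
use warnings;
# the 51 bytes left after removing the PKCS#1 padding, as one hex string
my $decoded = '3031300d060960864801650304020105000420'
            . '77511bf4d717d7ad8c2de5892acae06da3c07d134dd7b8011487030069e49bb3';
# DER-encoded DigestInfo prefixes from RFC 3447 (the same list as quoted above)
my %prefix = (
    'SHA-1'   => '3021300906052b0e03021a05000414',
    'SHA-256' => '3031300d060960864801650304020105000420',
);
for my $alg (sort keys %prefix) {
    my $p = $prefix{$alg};
    if (lc($decoded) =~ /^\Q$p\E(.+)$/) {
        print "hash algorithm: $alg\n";
        print "digest: $1\n";   # here: the 32-byte SHA-256 value, 77511bf4...
    }
}
This prints only the SHA-256 entry, with the digest part starting at 77511bf4: the header is the 19-byte DER prefix and the remaining 32 bytes are the hash value.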

How do I find what is the value of offset of this byte?

So somehow from the following hex data (03 00 21 04 80 04) the values below were obtained.
Can anybody tell me how I can do this and how it was achieved?
Band = 3 (40,6)
Duplex_Mode = 0 (46,1)
Result = 0 (47,1)
Reserved_1 = 0 (48,8)
Min_Search_Half_Frames = 1 (56,5)
Min_Search_Half_Frames_Early_Abort = 1 (61,5)
Max_Search_Half_Frames = 1 (66,5)
Max_PBCH_Frames = 0 (71,5)
Number_of_Blocked_Cells = 0 (76,3)
Number_PBCH_Decode_Attemp_Cells = 1 (79,3)
Number_of_Search_Results = 1 (82,4)
Reserved_2 = 0 (86,2)
The parameters in parentheses are the Offset/Length, I am told. I don't understand how, based on that information, I should be able to unpack this payload.
So I have written
my $data = pack ('C*', map hex, split /\s+/, "03 00 21 04 80 04");
my ($tmp1, $Reserved_1, $tmp2) = unpack("C C V", $data);
And now I need help: how do I unpack the table values above from $tmp1 and $tmp2?
EDIT: Hex Data = "00 00 00 7F 08 03 00 21 04 80 04 FF D7 FB 0C EC 01 44 00 61 1D 00 00 10 3B 00 00 FF D7 FB 0C 00 00 8C 64 00 00 EC 45"
Thanks!
You might want to define a set of bitmasks, and use bitwise AND operations to unpack your data.
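For fields that cross byte boundaries, it can be simpler to expand the whole payload into a bit string and slice it than to build the masks by hand. Below is a rough sketch, assuming the (offset, length) pairs are bit offsets into the full payload from the EDIT, counted from the least significant bit of each byte (that assumption is consistent with the sample values, e.g. Band = 3 at offset 40, length 6):
use strict;
use warnings;
my $hex  = "00 00 00 7F 08 03 00 21 04 80 04 FF D7 FB 0C EC 01 44 00 61 1D 00 00 10 3B 00 00 FF D7 FB 0C 00 00 8C 64 00 00 EC 45";
my $data = pack 'C*', map hex, split ' ', $hex;
my $bits = unpack 'b*', $data;   # one '0'/'1' per bit, least significant bit of each byte first
sub field {
    my ($offset, $length) = @_;
    # take the requested bit slice and read it as a little-endian binary number
    return oct('0b' . reverse substr($bits, $offset, $length));
}
printf "Band = %d\n",                     field(40, 6);
printf "Duplex_Mode = %d\n",              field(46, 1);
printf "Min_Search_Half_Frames = %d\n",   field(56, 5);
printf "Number_of_Search_Results = %d\n", field(82, 4);
Under that assumption this prints Band = 3, Duplex_Mode = 0, Min_Search_Half_Frames = 1 and Number_of_Search_Results = 1, which matches the table above.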