This is a great tool! I would like the option to specify whether a decimal value should be interpreted and displayed as a signed or an unsigned value. Currently it appears that a 16-bit decimal field is always interpreted as unsigned, whereas a 32-bit field is interpreted as signed. For example, this data:
258d2d2200000001000000000100000000ffffffffffff000000000000000000000000000000a7f38
hhhhhhhh hhhhhhhh hhhhhhhhhh hhhhhhhh 16d 32d
displays:
258d2d22 00000001 0000000001 00000000 65535 00000000-1 00000000 00000000 0000000000 0000a7f3 08
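To illustrate what a signed/unsigned toggle would mean: the same bytes decode differently depending only on the format code used. A minimal Python sketch of the two interpretations (big-endian fields assumed, matching the example data; not the tool's actual code):

```python
import struct

# The 16-bit field from the example: ffff
u16, = struct.unpack(">H", bytes.fromhex("ffff"))  # unsigned: 65535 (current behavior)
s16, = struct.unpack(">h", bytes.fromhex("ffff"))  # signed:   -1

# The 32-bit field from the example: ffffffff
u32, = struct.unpack(">I", bytes.fromhex("ffffffff"))  # unsigned: 4294967295
s32, = struct.unpack(">i", bytes.fromhex("ffffffff"))  # signed:   -1 (current behavior)

print(u16, s16, u32, s32)  # 65535 -1 4294967295 -1
```

So the request is simply for a per-field choice between the `H`/`I`-style and `h`/`i`-style interpretations, rather than unsigned being forced for 16-bit fields and signed for 32-bit ones.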