What is the correct way to construct a BigInteger from an implied unsigned hex string in C#?

I have an implied unsigned hexadecimal number, provided as a string from user input, that needs to be converted to a BigInteger.

Due to the signed nature of BigInteger, any input whose highest-order bit is set (a first hex digit of 0x8 / 1000b or above) parses as a negative number. This cannot be solved by simply checking the sign bit and multiplying by -1, or by taking the absolute value, because the two's-complement interpretation does not respect the intended unsigned notation: for example, every all-0xF value is treated as -1.

Below is an example of input/output:

var style = NumberStyles.HexNumber; // HexNumber already includes AllowHexSpecifier


BigInteger.Parse("6", style) == 6   // 0110 bin
BigInteger.Parse("8", style) == -8  // 1000 bin
BigInteger.Parse("9", style) == -7  // 1001 bin
BigInteger.Parse("A", style) == -6  // 1010 bin
...
BigInteger.Parse("F", style) == -1  // 1111 bin
...
BigInteger.Parse("FA", style) == -6 // 1111 1010 bin
BigInteger.Parse("FF", style) == -1 // 1111 1111 bin
...
BigInteger.Parse("FFFF", style) == -1 // 1111 1111 1111 1111 bin
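This ambiguity can be confirmed at run time: since "F" and "FF" both parse to -1, no after-the-fact sign fix can recover the intended magnitudes (a minimal sketch, not from the original post):

```csharp
using System;
using System.Globalization;
using System.Numerics;

class SignFixFails
{
    static void Main()
    {
        var style = NumberStyles.HexNumber;
        // "F" (15) and "FF" (255) both parse to -1 under the
        // two's-complement interpretation...
        Console.WriteLine(BigInteger.Parse("F", style));   // -1
        Console.WriteLine(BigInteger.Parse("FF", style));  // -1
        // ...so negating or taking the absolute value yields 1
        // in both cases; the original magnitude is already lost.
        Console.WriteLine(BigInteger.Abs(BigInteger.Parse("FF", style))); // 1
    }
}
```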


What's the correct way to construct a BigInteger from an implied unsigned hex string?


1 answer


Prefixing your hex string with "0" should do it:

BigInteger.TryParse("0" + "FFFF", style, CultureInfo.InvariantCulture, out var result);

The resulting BigInteger is 65535 in the example above.
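The fix can be wrapped in a small helper; the class and method names here are my own, not part of the answer:

```csharp
using System.Globalization;
using System.Numerics;

static class UnsignedHex
{
    // Prepending "0" guarantees the first hex digit has a clear
    // high bit, so the two's-complement reading is non-negative.
    public static BigInteger Parse(string hex) =>
        BigInteger.Parse("0" + hex, NumberStyles.HexNumber,
                         CultureInfo.InvariantCulture);
}
```

With this, `UnsignedHex.Parse("FFFF")` yields 65535, and `UnsignedHex.Parse("8")` yields 8 rather than -8.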



Edit

An excerpt from the BigInteger documentation:

When parsing a hexadecimal string, the BigInteger.Parse(String, NumberStyles) and BigInteger.Parse(String, NumberStyles, IFormatProvider) methods assume that if the most significant bit of the first byte in the string is set, or if the first hexadecimal digit of the string represents the lower four bits of a byte value, the value is represented by using two's complement representation. For example, both "FF01" and "F01" represent the decimal value -255. To differentiate positive from negative values, positive values should include a leading zero. The relevant overloads of the ToString method, when they are passed the "X" format string, add a leading zero to the returned hexadecimal string for positive values.
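The last sentence of the excerpt can be verified with a short round trip; `ToString("X")` adds the leading zero that makes re-parsing sign-safe (a sketch of the documented behavior):

```csharp
using System;
using System.Globalization;
using System.Numerics;

class RoundTrip
{
    static void Main()
    {
        BigInteger positive = 65535;
        // ToString("X") prepends "0" because the top hex digit (F)
        // would otherwise flip the sign on re-parse.
        string hex = positive.ToString("X");                          // "0FFFF"
        BigInteger back = BigInteger.Parse(hex, NumberStyles.HexNumber);
        Console.WriteLine(hex + " -> " + back);                       // 0FFFF -> 65535
    }
}
```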
