Decrypt C#-encrypted data in C++ using the Windows AES crypto provider

I need to encrypt data in C# using RijndaelManaged and decrypt it in C++ code.

C # encryption code:

static string Encrypt(string plainText)
{
    byte[] plainTextBytes = Encoding.UTF8.GetBytes(plainText);

    var keyBytes = new byte[] { /* ... 32 bytes of a key */};
    byte[] iv = new byte[] { /* ... 16 bytes of IV */ };


    var symmetricKey = new RijndaelManaged() 
    { 
        Mode = CipherMode.CBC, 
        Padding = PaddingMode.Zeros, 
        BlockSize = 128, // Must be 128 to be compatible with AES
        KeySize = 256 
    };

    var encryptor = symmetricKey.CreateEncryptor(keyBytes, iv);

    byte[] cipherTextBytes;
    using(var memoryStream = new MemoryStream())
    {
        using(var cryptoStream = new CryptoStream(memoryStream, encryptor, CryptoStreamMode.Write))
        {
            cryptoStream.Write(plainTextBytes, 0, plainTextBytes.Length);
            cryptoStream.FlushFinalBlock();
            cipherTextBytes = memoryStream.ToArray();
        }
    }
    return Convert.ToBase64String(cipherTextBytes);
}

But when decrypting with this C++ code, I always get NTE_BAD_DATA from CryptDecrypt. Here is the C++ code (all error checks removed for clarity):

__declspec(dllexport) DWORD Decrypt(char* stringBuffer)
{
    string encryptedString(stringBuffer);

    // Decode base64 string to byte array. Works ok, the binary array is the same as the one in the C# code.
    vector<BYTE> encryptionBuffer = Base64::decode(encryptedString);
    DWORD bufferSize = encryptionBuffer.size();

    struct CryptoBlob {
        BLOBHEADER header;
        DWORD cbKeySize;
        BYTE rgbKeyData[32];
    } keyBlob;

    keyBlob.header.bType = PLAINTEXTKEYBLOB;
    keyBlob.header.bVersion = CUR_BLOB_VERSION;
    keyBlob.header.reserved = 0;
    keyBlob.header.aiKeyAlg = CALG_AES_256;
    keyBlob.cbKeySize = 32;

    BYTE keyData[32] = { /* 32 bytes of a key, the same as in the C# code */ };
    BYTE ivData[16] = { /* 16 bytes of IV, the same as in the C# code */ };

    memcpy(keyBlob.rgbKeyData, keyData, 32);

    HCRYPTKEY hKey;
    HCRYPTPROV hProv;

    CryptAcquireContext(
        &hProv,
        NULL,
        NULL,
        PROV_RSA_AES,
        CRYPT_VERIFYCONTEXT);

    CryptImportKey(hProv, (const BYTE*)&keyBlob, sizeof(keyBlob), 0, 0, &hKey);
    CryptSetKeyParam(hKey, KP_IV, ivData, 0);

    // Here the call fails; GetLastError() returns 0x80090005 (NTE_BAD_DATA)
    BOOL ok = CryptDecrypt(hKey, 0, TRUE, 0, encryptionBuffer.data(), &bufferSize);

    // overwrite the input buffer with the decrypted data
    memset(stringBuffer, 0, encryptedString.length());
    memcpy(stringBuffer, encryptionBuffer.data(), bufferSize);

    return 0;
}


Any idea what could be wrong? Thanks!

2 answers


When you pass TRUE as the third (Final) parameter to CryptDecrypt, it tries to remove PKCS#7 padding. When it cannot remove that padding, it fails with NTE_BAD_DATA.

Since you changed the padding mode for encryption to something other than Pkcs7, you need to pass FALSE and do the depadding manually.

Since PaddingMode.Zeros is not removable (the padding cannot be distinguished from zero bytes that belong to the plaintext), there is no depadding to perform.
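A Zeros-padded plaintext therefore keeps its trailing zero bytes after CryptDecrypt(..., FALSE, ...). If the caller knows the plaintext is text and cannot itself end in 0x00, those bytes can be trimmed. A minimal, portable sketch of that step (the StripZeroPadding name is hypothetical; the Windows-specific CryptDecrypt call is omitted):

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Trim the trailing 0x00 bytes that PaddingMode.Zeros appended during
// encryption. Only safe when the plaintext itself cannot end in 0x00.
std::string StripZeroPadding(const std::vector<unsigned char>& buf)
{
    std::size_t len = buf.size();
    while (len > 0 && buf[len - 1] == 0)
        --len;
    return std::string(buf.begin(), buf.begin() + len);
}
```

Zero padding is ambiguous for binary data; if the plaintext may legitimately end in 0x00, transmit the original length alongside the ciphertext instead (or use Pkcs7 on the C# side and keep Final = TRUE).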



If that's not the answer, I would recommend dumping the key and IV in both the C++ and the C# code and making sure the byte arrays are exactly the same. An extra character at the end can cause problems.

If they do not match, keep in mind that there may be differences in data types between the languages and between implementations (e.g. signed/unsigned, char vs. byte arrays), which can also cause problems.
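One way to make that comparison is to dump both sides as lowercase hex and diff the output. A small sketch (the ToHex helper name is hypothetical); on the C# side, BitConverter.ToString(keyBytes).Replace("-", "").ToLower() produces the same form:

```cpp
#include <string>
#include <vector>

// Render a byte buffer as lowercase hex so the C++ key/IV bytes can be
// compared character-by-character against the C# side's dump.
std::string ToHex(const std::vector<unsigned char>& data)
{
    static const char digits[] = "0123456789abcdef";
    std::string out;
    out.reserve(data.size() * 2);
    for (unsigned char b : data) {
        out += digits[b >> 4];   // high nibble
        out += digits[b & 0x0F]; // low nibble
    }
    return out;
}
```

Comparing hex strings rather than raw buffers sidesteps signed/unsigned and char/byte representation differences between the two languages.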







