I've got the following Java code that I'd like to port to Node.js:
// Java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

byte[] rawKey = "deadbeefdeadbeef".getBytes("US-ASCII"); // 16 ASCII bytes -> 128-bit AES key
SecretKeySpec skeySpec = new SecretKeySpec(rawKey, "AES");
Cipher cip = Cipher.getInstance("AES/ECB/NoPadding");
cip.init(Cipher.DECRYPT_MODE, skeySpec);
byte[] plaintext = cip.doFinal(ciphertext, 0, ciphertext.length);
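Note that the raw key here is the 16 ASCII bytes of the string itself (it is not hex-decoded), i.e. a 128-bit key, which is why the Node attempts below use aes-128-ecb. A quick sanity check, as a sketch:

// JS: the key is 16 bytes -> AES-128
console.log(new Buffer('deadbeefdeadbeef', 'ascii').length); // 16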
Here's my attempt with Node.js, using streams:
// JS, using the crypto streams API
var crypto = require('crypto');
var concat = require('concat-stream'); // `concat` is assumed to come from the concat-stream module

var decipher = crypto.createDecipher('aes-128-ecb', 'deadbeefdeadbeef');
decipher.setAutoPadding(false);
decipher.pipe(concat(function (plaintext) { console.log(plaintext); }));
decipher.end(ciphertext);
And here's another Node.js attempt, using the older `.update()` and `.final()` API:
// JS, using the `.update()` and `.final()` API
var crypto = require('crypto');

var decipher = crypto.createDecipher('aes-128-ecb', 'deadbeefdeadbeef');
decipher.setAutoPadding(false);
// with no output encoding given, update() and final() return Buffers directly
var pieces = [];
pieces.push(decipher.update(ciphertext));
pieces.push(decipher.final());
var plaintext = Buffer.concat(pieces);
Both of these versions produce output of the correct length (the same length as the input), but that output is not the plaintext produced by the Java version operating on the same input buffer. How can I set up a Node.js decipher to match the Java decipher configured above?
Thank you.
`crypto.createDecipheriv('aes-128-ecb', 'deadbeefdeadbeef', '')` did the trick. `createDecipher()` treats its second argument as a password and derives the actual key from it, whereas `createDecipheriv()` uses the key bytes verbatim, just like Java's `SecretKeySpec`; ECB mode takes no IV, hence the empty string.

(At one point I was using `aes-256-ecb` with a 128-bit key, and it wasn't throwing or indicating an error otherwise, just producing undesired decrypt output.)

What is the `concat` function in `decipher.pipe(concat(...))`? (Presumably the concat-stream npm module, which buffers a stream and hands the concatenated result to a callback.)
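For completeness, here's a minimal sketch of the working decipher, assuming `ciphertext` is a Buffer as in the Java version:

// JS, matching the Java decipher above
var crypto = require('crypto');

// createDecipheriv() takes the 16 raw ASCII bytes as the key directly;
// ECB uses no IV, so the IV argument is empty
var decipher = crypto.createDecipheriv('aes-128-ecb', 'deadbeefdeadbeef', '');
decipher.setAutoPadding(false);
var plaintext = Buffer.concat([decipher.update(ciphertext), decipher.final()]);

The original attempts decrypted to the wrong bytes because `createDecipher()` derives its key from the password with OpenSSL's EVP_BytesToKey (MD5, no salt); for a 128-bit key that reduces to the MD5 of the password, not the raw ASCII bytes:

// the 128-bit key createDecipher() was actually using
var derivedKey = crypto.createHash('md5').update('deadbeefdeadbeef').digest();
console.log(derivedKey.toString('hex')); // not 6465616462656566... (the raw ASCII bytes)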