#if BYTE_ORDER == BIG_ENDIAN
/* Copy a vector of big-endian uint32_t into a vector of bytes */
#define be32enc_vect(dst, src, len) \
    memcpy((void *)dst, (const void *)src, (size_t)len)

/* Copy a vector of bytes into a vector of big-endian uint32_t */
#define be32dec_vect(dst, src, len) \
    memcpy((void *)dst, (const void *)src, (size_t)len)
#else /* BYTE_ORDER != BIG_ENDIAN */
/*
 * Encode a length len/4 vector of (uint32_t) into a length len vector of
 * (unsigned char) in big-endian form. Assumes len is a multiple of 4.
 */
static void be32enc_vect(unsigned char *dst, const uint32_t *src, size_t len)
{
    size_t i;

    for (i = 0; i < len / 4; i++)
        be32enc(dst + i * 4, src[i]);
}
/*
 * Decode a big-endian length len vector of (unsigned char) into a length
 * len/4 vector of (uint32_t). Assumes len is a multiple of 4.
 */
static void be32dec_vect(uint32_t *dst, const unsigned char *src, size_t len)
{
    size_t i;

    for (i = 0; i < len / 4; i++)
        dst[i] = be32dec(src + i * 4);
}
#endif /* BYTE_ORDER != BIG_ENDIAN */
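One thing that seems worth double-checking: in #if expressions the preprocessor evaluates undefined identifiers as 0, so if a toolchain's headers define neither BYTE_ORDER nor BIG_ENDIAN, the comparison above becomes 0 == 0 and the big-endian memcpy path gets compiled even on a little-endian machine. A compile-time guard to rule that out; just a sketch:

/*
 * Sketch: fail the build if the endianness macros are missing. An
 * undefined identifier evaluates to 0 in #if, so 0 == 0 would silently
 * select the big-endian path regardless of the actual byte order.
 */
#if !defined(BYTE_ORDER) || !defined(BIG_ENDIAN)
#error "BYTE_ORDER/BIG_ENDIAN not defined; the #if comparison is meaningless"
#endif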
Under Ubuntu the code produces a consistent hash and seems to work well. Under Raspbian it also produces a consistent hash, and (as expected) it is the same hash as under Ubuntu. Under Windows (building the same code) the hash is consistent from run to run, but different from the Linux one, which makes the pfiles non-portable. I've inspected the input char array in the debugger on both Linux and Windows, and the input (e.g. the password) looks identical character for character. Could this somehow be an encoding issue between platforms?
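To rule out an encoding difference without trusting the debugger's string rendering, a byte-level dump of the exact buffer passed to the hash would allow a direct diff between the two platforms. A minimal sketch, where the buffer and length names are placeholders:

#include <stdio.h>

/*
 * Dump the exact bytes fed to the hash so the two platforms can be
 * compared byte-for-byte; this catches stray '\r', BOMs, or encoding
 * differences that a debugger's string view can render identically.
 */
static void dump_bytes(const unsigned char *buf, size_t len)
{
    size_t i;

    for (i = 0; i < len; i++)
        printf("%02x ", buf[i]);
    printf("\n");
}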
Does anybody have a thought, off the top of their head, about what I might be missing?
As a side note, I assumed this returned a SHA-256 hash, but it doesn't match the output of "echo -n thepassword | sha256sum" in either environment. The code is on GitHub; I can share links to the relevant sections if needed.
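One way to pin down whether the routine computes plain SHA-256 at all is to run it against the published FIPS 180-2 test vector for "abc". A sketch, where sha256() is a hypothetical wrapper around whatever entry point the code actually exposes:

#include <stdio.h>
#include <string.h>

/*
 * FIPS 180-2 test vector: SHA-256("abc"). If the code doesn't reproduce
 * this digest, it isn't computing plain SHA-256 (salting, iteration, or
 * a porting bug would then explain the sha256sum mismatch).
 */
static const char *abc_digest =
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad";

int main(void)
{
    unsigned char out[32];
    char hex[65];
    int i;

    sha256((const unsigned char *)"abc", 3, out); /* hypothetical wrapper */
    for (i = 0; i < 32; i++)
        sprintf(hex + 2 * i, "%02x", out[i]);
    printf("%s\n", strcmp(hex, abc_digest) == 0
        ? "matches SHA-256(\"abc\")" : "does NOT match SHA-256(\"abc\")");
    return 0;
}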