Python beginner here. I have a large set of data that started off as a string of 16-bit ints, "1,2,3,4,5", and that I eventually need to turn into a byte-aligned binary file.
Currently I have it working with the following:
#helper function
def unintlist2hex(list_input):
    for current in range(len(list_input)):
        list_input[current] = "%04X" % (int(list_input[current]))
    return list_input
#where helper gets called in main code
import binascii

for rows in dataset:
    row_list = rows.text.split(",")
    f_out.write(binascii.unhexlify("".join(unintlist2hex(row_list))))
but this runs quite slowly even for my limited test data size (about 300,000 ints). How could I go about speeding it up? I profiled the code, and most of the cycles are spent in unintlist2hex().
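One direction I have been looking at (untested against my full dataset, and assuming every value fits in an unsigned 16 bits) is skipping the hex round-trip entirely and packing the ints straight to bytes with struct, which should do the same big-endian conversion in one call:

```python
import struct

def ints_to_bytes(values):
    # Pack all values at once as big-endian unsigned 16-bit ints
    # ("H" = uint16, ">" = big-endian), matching the "%04X" + unhexlify output.
    return struct.pack(">%dH" % len(values), *values)

row = "1,2,3,4,5"
data = ints_to_bytes([int(x) for x in row.split(",")])
# data == b'\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05'
```

ints_to_bytes is just a name I made up for the sketch; the idea would be to call it once per row in place of the unhexlify/join pair.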
Note that I struggled to use hex() and bin() because they have a tendency to drop leading zeros.
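For what it's worth, this is the leading-zero behavior I ran into, and format() with a width seems to avoid it (a small check, not specific to my dataset):

```python
n = 5
# hex() gives a minimal representation with a 0x prefix, no padding
print(hex(n))            # 0x5
# format() with "04X" pads to a fixed 4-digit width, like "%04X"
print(format(n, "04X"))  # 0005
```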