
I'm trying to read a file in smallish chunks with Python to work around having less than 1 GB of memory to play with. I'm able to write the file to disk and read it back in chunks, but no matter what I try I always end up getting a MemoryError. I originally didn't have the del/gc calls, but added them after reading a bit online.

Can anyone point me in the right direction toward reading this file in chunks (256 MB-512 MB) and dropping each chunk from memory as soon as it's processed, before loading the next one?

import os
import gc

byte_count = 256 * 1024 * 1024  # chunk size; the question mentions 256 MB-512 MB

with open(path) as in_file:
    current = 0
    total = os.stat(path).st_size  # total file size in bytes
    while current < total:
        in_file.seek(current, 0)  # jump to the start of the next chunk
        bytes_read = in_file.read(byte_count)
        # do other things with the bytes here
        in_file.close()
        del in_file
        gc.collect()
        current += byte_count
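
For reference, a minimal sketch of the pattern being asked about: keep the file open for the whole loop and let each chunk be released when the name is rebound on the next read, so no manual del or gc.collect() is needed. The path value, the 256 MiB chunk size, and the process_chunk helper are placeholders assumed for illustration, not names from the original code.

CHUNK_SIZE = 256 * 1024 * 1024  # assumed: 256 MiB, the low end of the range in the question
path = 'large_file.bin'  # hypothetical path to the file being processed

def process_chunk(chunk):
    # hypothetical stand-in for "do other things with the bytes here"
    pass

with open(path, 'rb') as in_file:  # binary mode so read() counts bytes, not characters
    while True:
        chunk = in_file.read(CHUNK_SIZE)  # read() advances the file offset; no seek() needed
        if not chunk:  # empty result means end of file
            break
        process_chunk(chunk)
        # rebinding `chunk` on the next iteration drops the previous buffer,
        # so the old chunk becomes collectible without explicit del/gc calls

The with block also guarantees the file is closed exactly once, after the loop finishes, which avoids the problems caused by calling in_file.close() and del in_file inside the loop and then trying to use in_file again on the next iteration.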