I am using the following code to read a file into a character array. For a small file (say 2 MB) it executes properly, but for a large file (240 MB) it gives a segmentation fault on my 18 GB Ubuntu server. Can anybody help me solve this? I think 18 GB is enough to hold a 240 MB file in memory. I am using 64-bit Ubuntu and compiling with g++.
ifstream is;
char chararray[fileSize];   // fileSize bytes allocated on the stack: this is what overflows
is.read(chararray, fileSize);
fileSize is something like a size_t with value 240*1024*1024. Allocate the buffer on the heap instead: char* chararray = new char[fileSize];, and release it with delete[] instead of delete. Vectors are better still, and as of C++11 you can even return big ones from functions without paying the cost of copying, thanks to the copy elision that happens during return value optimization.
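A minimal sketch of both fixes, assuming the C++11 behavior mentioned above; readFile and bigfile.bin are illustrative names, not from the original post:

#include <fstream>
#include <string>
#include <vector>

// Heap fix: replaces the stack array that overflows.
//   char* chararray = new char[fileSize];
//   is.read(chararray, fileSize);
//   delete[] chararray;   // delete[], not delete
//
// Vector fix: the buffer below also lives on the heap, but the
// memory is released automatically when the vector goes away.
std::vector<char> readFile(const std::string& path)
{
    std::ifstream is(path, std::ios::binary);
    std::vector<char> buffer;
    if (!is)
        return buffer;                        // empty on open failure

    is.seekg(0, std::ios::end);               // measure the file
    std::streamsize fileSize = is.tellg();
    is.seekg(0, std::ios::beg);

    buffer.resize(static_cast<std::size_t>(fileSize));
    is.read(buffer.data(), fileSize);         // read straight into the vector's heap storage
    return buffer;                            // no deep copy: elided or moved since C++11
}

Calling it is just std::vector<char> contents = readFile("bigfile.bin");. Because the return value is elided or moved, the 240 MB buffer is never copied on the way out of the function.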