
I am writing JSON data to a file with Laravel. I have 300,000 records and I want to write them in blocks of 5,000 records.

    public function fileput($start, $end, $n)
    {
        $final = [];

        // Fetch one block of company rows by id range.
        $res = DB::table('company')
            ->where('company_id', '>', $start)
            ->where('company_id', '<', $end)
            ->get();

        foreach ($res as $k => $v) {
            $id = $v->company_id;
            $index = 10000000 + $id;
            unset($res[$k]->company_id);
            unset($res[$k]->company_data);

            // One action line and one document line per record.
            $arr = ['index' => ['_id' => $index]];
            $ind = json_encode($arr);
            $data = json_encode($v);
            array_push($final, $ind, $data);
        }

        $written = File::put('file.txt', $final);
    }

The problem is that after writing the first 5,000 records, when I move on to the next block (5,000 to 10,000), the first 5,000 records get erased.

How can I write all the records into one file?

1 Answer

Try using File::append() instead of File::put(). File::put() overwrites the whole file on every call, so each new block replaces the previous one, while File::append() adds the new content to the end of the existing file.
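As a minimal sketch of that change, keeping the rest of your method the same (joining the entries with newlines is my assumption about the layout you want, not something from your original code):

        // Append this block to the end of file.txt instead of overwriting it.
        // Joining with PHP_EOL puts each JSON string on its own line.
        File::append('file.txt', implode(PHP_EOL, $final) . PHP_EOL);

If you call the method repeatedly for each 5,000-record block, the file then grows with every block instead of being replaced.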
