
I'm building an Ionic + Capacitor app that uses the @capacitor-community/sqlite plugin for local database storage. I'm trying to export the entire SQLite database as a JSON backup file and save it using Capacitor’s Filesystem API.

import { Filesystem, Directory, Encoding } from '@capacitor/filesystem';
import { CapacitorSQLite, SQLiteConnection, capSQLiteJson } from '@capacitor-community/sqlite';

// Wrap the plugin in SQLiteConnection; the raw plugin object takes option
// objects, while the wrapper exposes the positional-argument API used below.
const sqlite = new SQLiteConnection(CapacitorSQLite);

async function backupDatabase() {
  try {
    const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1, false);
    await db.open();

    const jsonObj: capSQLiteJson = await db.exportToJson('full');

    await Filesystem.writeFile({
      path: 'backup.json',
      data: JSON.stringify(jsonObj.export),
      directory: Directory.Documents,
      encoding: Encoding.UTF8
    });

    console.log('Backup completed successfully');
  } catch (err) {
    console.error('Backup failed:', err);
  }
}

It works fine for smaller databases, but the app crashes when the JSON becomes large (around 50MB or more). No specific error is thrown — it just freezes or closes unexpectedly during the writeFile() operation.


1 Answer


Option 1: Let the SQLite plugin save directly to a file (best for large backups)

Depending on your plugin version, the database can be saved natively as a file, so you don't have to serialize a huge JSON string across the JS bridge at all.

Instead of combining exportToJson() with Filesystem.writeFile(), look for a native save method such as backupToStore(). Check your installed version's typings first, since the plugin's API has changed between releases and not every method exists on every platform. (Also note that exportToJson() takes a string mode such as 'full', not an options object.)

For example:

const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1, false);
await db.open();

// Hypothetical: verify that backupToStore() exists in your installed
// version's typings before relying on it.
await db.backupToStore('backup_file'); // would create 'backup_file.db' in plugin storage

console.log('Database backup completed successfully');

Then, if you need to export that file for sharing or uploading, use the plugin's getUrl() to find the database file's native location, or copy the file out of the plugin's database folder with native filesystem APIs rather than pushing it through the JS bridge; see the sketch below.

✅ This avoids JSON serialization and the huge string transfer problem.
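
A minimal sketch of that lookup, assuming your plugin version exposes getUrl() on the connection object (availability varies by version and platform, so verify against your typings):

// Assumption: getUrl() is available on the connection in this plugin version.
const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1, false);
await db.open();

// Returns the native location of the database file (a capSQLiteUrl object).
const result = await db.getUrl();
console.log('Database file lives at:', result.url);

await sqlite.closeConnection('my_db', false);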


Option 2: Stream or chunk the JSON write

If you must use JSON (for example, for cloud sync), don’t write it in one massive call.

Instead, write it in chunks:

import { Filesystem, Directory, Encoding } from '@capacitor/filesystem';
import { CapacitorSQLite, SQLiteConnection } from '@capacitor-community/sqlite';

const sqlite = new SQLiteConnection(CapacitorSQLite);

const CHUNK_SIZE = 5000000; // ~5MB per bridge call

async function backupLargeJSON() {
  const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1, false);
  await db.open();

  const jsonObj = await db.exportToJson('full');
  const jsonStr = JSON.stringify(jsonObj.export);

  // Write the first chunk with writeFile so a stale backup.json from a
  // previous run is overwritten, then append the remaining chunks.
  for (let i = 0; i < jsonStr.length; i += CHUNK_SIZE) {
    const chunk = jsonStr.slice(i, i + CHUNK_SIZE);
    const options = {
      path: 'backup.json',
      data: chunk,
      directory: Directory.Documents,
      encoding: Encoding.UTF8,
    };
    if (i === 0) {
      await Filesystem.writeFile(options);
    } else {
      await Filesystem.appendFile(options);
    }
  }

  await sqlite.closeConnection('my_db', false);
  console.log('Backup completed successfully');
}

⚠️ This is slower, but each bridge call stays small. Keep in mind that exportToJson() and JSON.stringify() still build the entire payload in JS memory, so chunking only relieves the bridge transfer, not peak memory use.
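
For completeness, here is a minimal sketch of the restore half of the JSON route, using the plugin's importFromJson(): read the file back and hand the stringified export to the plugin.

async function restoreFromJSON() {
  const file = await Filesystem.readFile({
    path: 'backup.json',
    directory: Directory.Documents,
    encoding: Encoding.UTF8,
  });

  // importFromJson() expects the stringified export object produced
  // by exportToJson('full').
  const result = await sqlite.importFromJson(file.data as string);
  console.log('Changes applied during import:', result.changes?.changes);
}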


Option 3: Skip JSON entirely — just copy the raw SQLite database file

If you don’t need JSON for migration between platforms, just back up the .db file itself:

// Hypothetical: getDatabasePath() is not guaranteed by the official typings;
// check your installed plugin version before using it.
const dbPath = await sqlite.getDatabasePath('my_db');

// Copy the raw file with the Filesystem plugin. Assumption: with no
// 'directory' supplied, 'from' is treated as a full native path.
await Filesystem.copy({
  from: dbPath,
  to: 'backup_my_db.sqlite',
  toDirectory: Directory.Documents,
});

This is far more efficient: no JSON serialization, no large JS objects, and the copy itself is fast. Restoring is the same copy in reverse, sketched below.
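
A minimal sketch of the restore direction, under the same assumptions as above (the path lookup is hypothetical, and it assumes 'to' may be a full native path when toDirectory is omitted; verify the resolved paths on your Capacitor version before trusting this with real data):

// Nothing may hold the database file open while it is replaced.
async function restoreDatabaseFile(dbPath: string) {
  await sqlite.closeConnection('my_db', false);

  await Filesystem.copy({
    from: 'backup_my_db.sqlite',
    directory: Directory.Documents, // source: the backup saved above
    to: dbPath,                     // destination: the live database file
  });
}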


1 Comment

Thanks! But the methods backupToStore and getDatabasePath don’t exist in the official @capacitor-community/sqlite plugin or typings. I also tried chunking, but the app still crashes once the JSON size grows — around 400K records or 80MB+. I only need offline backup/restore on Android (no cloud sync). Looking for a working native approach that handles large DBs reliably.
