Option 1: Let the SQLite plugin save directly to a file (best for large backups)
The @capacitor-community/sqlite plugin supports exporting the database as a file — so you don’t need to serialize large JSON across the bridge.
Instead of using exportToJson() plus Filesystem.writeFile(), use the plugin's backupToStore() or exportToJson('full') combined with native save methods.
For example:
import { CapacitorSQLite, SQLiteConnection } from '@capacitor-community/sqlite';

const sqlite = new SQLiteConnection(CapacitorSQLite);
const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1);
await db.open();

// Create a backup file managed by the plugin
await db.backupToStore('backup_file'); // creates 'backup_file.db' in plugin storage
console.log('Database backup completed successfully');
Then, if you need to export that file for sharing or uploading, use the connection's getUrl() method, or copy the file out of the plugin's database folder with native filesystem APIs, rather than pushing it through the JS bridge.
✅ This avoids JSON serialization and the huge string transfer problem.
Option 2: Stream or chunk the JSON write
If you must use JSON (for example, for cloud sync), don’t write it in one massive call.
Instead, write it in chunks:
import { Filesystem, Directory, Encoding } from '@capacitor/filesystem';
import { CapacitorSQLite, SQLiteConnection } from '@capacitor-community/sqlite';

const CHUNK_SIZE = 5000000; // ~5 MB per write

async function backupLargeJSON() {
  const sqlite = new SQLiteConnection(CapacitorSQLite);
  const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1);
  await db.open();

  const jsonObj = await db.exportToJson('full');
  const jsonStr = JSON.stringify(jsonObj.export);

  // Write in smaller parts; overwrite on the first chunk so a stale
  // backup.json from a previous run is not appended to
  for (let i = 0; i < jsonStr.length; i += CHUNK_SIZE) {
    const chunk = jsonStr.slice(i, i + CHUNK_SIZE);
    const options = {
      path: 'backup.json',
      data: chunk,
      directory: Directory.Documents,
      encoding: Encoding.UTF8,
    };
    if (i === 0) {
      await Filesystem.writeFile(options);
    } else {
      await Filesystem.appendFile(options);
    }
  }

  console.log('Backup completed successfully');
}
⚠️ This is slower but prevents the JS bridge from choking on a single large payload.
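The chunking step itself is plain string slicing and does not depend on any plugin API. A minimal standalone helper (the name chunkString is hypothetical, not part of any Capacitor plugin) looks like this:

```typescript
// Split a string into fixed-size chunks; the last chunk may be shorter.
function chunkString(input: string, size: number): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < input.length; i += size) {
    chunks.push(input.slice(i, i + size));
  }
  return chunks;
}

// Joining the chunks back together restores the original payload,
// which is why appending them sequentially to one file is safe.
const payload = 'a'.repeat(12);
const parts = chunkString(payload, 5);
console.log(parts.map(p => p.length)); // [5, 5, 2]
console.log(parts.join('') === payload); // true
```

Because concatenating the chunks reproduces the original string exactly, the file written by repeated appendFile() calls is byte-for-byte identical to one written in a single call.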
Option 3: Skip JSON entirely — just copy the raw SQLite database file
If you don’t need JSON for migration between platforms, just back up the .db file itself:
import { Filesystem, Directory } from '@capacitor/filesystem';
import { CapacitorSQLite, SQLiteConnection } from '@capacitor-community/sqlite';

const sqlite = new SQLiteConnection(CapacitorSQLite);
const db = await sqlite.createConnection('my_db', false, 'no-encryption', 1);
await db.open();

// getUrl() returns the full path of the open database file
const { url: dbPath } = await db.getUrl();

// Now copy this file using the Filesystem plugin, or share it
await Filesystem.copy({
  from: dbPath,
  to: 'backup_my_db.sqlite',
  toDirectory: Directory.Documents,
});
This is far more efficient — no JSON serialization, no large JS objects, and very fast.