
Feature: Export/Import of raw IndexedDB data #124

Open
2 tasks done
nickchomey opened this issue Feb 6, 2025 · 1 comment
nickchomey commented Feb 6, 2025

Is your feature request related to a problem?

SurrealDB has its own export/import mechanism, though I'm not sure it is even available to the JS SDK. (Evidently it is possible, since Surrealist can do it, but it doesn't seem to work for the indxdb backend.)

Anyway, surql files are essentially a mysqldump-style list of commands to recreate the DB. If the DB is large, and especially if it has indexes (particularly FTS or Vector indexes), it takes a while to recreate. And even more so if you are using a low-powered mobile device.

I propose that a mechanism be introduced to fully export and import the IndexedDB database underlying surrealdb.wasm, including indexes, so that we can rapidly recreate the DB. This might even afford us the ability to pre-generate databases on our high-powered servers and send them to browser clients, where they can be rapidly imported, which is especially valuable on low-powered mobile devices.

Describe the solution

There are various implementations out there for export/import of IndexedDB databases, but it appears that the simpler ones do not support ArrayBuffers, which is what surrealdb.wasm uses.

However, dexie-export-import does support this and it seems to work for SurrealDB!

It exports an opaque JSON file full of serialized array buffers, and then re-imports it significantly faster than it took to create the data in the first place. (In my quick/crude test, import was 10x faster for a single table with 5,000 records and an index, compared to a bulk insert of the same data.)

The main problem is that it requires a 400kb (70kb gzipped) bundle of Dexie + dexie-export-import. I have to imagine the main logic could be extracted and built into surrealdb.js and/or surrealdb.wasm.
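For reference, a minimal sketch of what this looks like with dexie-export-import today. The database name `"surrealdb"` below is a placeholder (check `indexedDB.databases()` in dev tools for the name surrealdb.wasm actually uses); the rest uses Dexie's documented dynamic-open mode and the `exportDB`/`importDB` functions from dexie-export-import:

```javascript
import Dexie from "dexie";
import { exportDB, importDB } from "dexie-export-import";

// Export: open the existing IndexedDB database that surrealdb.wasm created.
// Opening a Dexie instance without declaring a schema ("dynamic mode")
// reads the schema from the existing database.
// NOTE: "surrealdb" is an assumed placeholder name, not confirmed.
async function exportSurrealIndexedDB(dbName = "surrealdb") {
  const db = await new Dexie(dbName).open();
  const blob = await exportDB(db); // JSON blob of all tables, incl. ArrayBuffers
  db.close();
  return blob;
}

// Import: recreate the database from a previously exported blob,
// e.g. one pre-generated on a server and fetched by the client.
async function importSurrealIndexedDB(blob) {
  const db = await importDB(blob); // creates the DB and tables from the blob
  db.close();
}
```

Extracting just this export/import logic (rather than bundling all of Dexie) is presumably what a built-in surrealdb.js/surrealdb.wasm mechanism would amount to.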

Alternative methods

n/a

SurrealDB version

n/a

Contact Details

No response

Is there an existing issue for this?

  • I have searched the existing issues

Code of Conduct

  • I agree to follow this project's Code of Conduct
nickchomey (Author) commented:

Here's a screen recording of exporting 15,000 records with an index in a negligible amount of time, and then re-importing them in 4.5 seconds (and that is WITH dev tools open, which is considerably slower than when it is closed, as noted in #125).

chrome_omyn3w4Jzi.mp4
