Better-SQLite-Pool #234
I personally wouldn't use it:

> Using multiple connections could enable asynchronous transactions, but that may result in serious lockups.
So there is a reason why I'm asking. At the moment I have a file called db_Conn.js that looks like this:
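(The original snippet didn't survive; a minimal sketch of what such a file might contain, assuming the better-sqlite-pool API, with an illustrative database path:)

```js
// db_Conn.js — plausible reconstruction, not the original file
const Pool = require('better-sqlite-pool');

// One shared pool of better-sqlite3 connections for the whole app
const pool = new Pool('./mydata.db');

module.exports = pool;
```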
Then I have actions.js:
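(Again a reconstruction; a sketch of how actions.js might consume that pool, with hypothetical table and column names:)

```js
// actions.js — plausible reconstruction, not the original file
const pool = require('./db_Conn');

// Hypothetical lookup used during login
async function findUser(email) {
  const db = await pool.acquire();
  try {
    return db.prepare('SELECT * FROM users WHERE email = ?').get(email);
  } finally {
    db.release(); // always return the connection to the pool
  }
}

module.exports = { findUser };
```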
Now, if I'm inserting into and querying the DB at the same time, I'm worried I'll face a lockup. Let's say I have a transaction with 20 million records to insert, and a user is logging in, which has to search through 1 million users... will I face an issue like this?
Node.js is single-threaded, so even with multiple connections, every database operation (on any connection) blocks all database operations on other connections within the same process. The best strategy is to make your operations fast. Read operations can be made fast by creating well-written indexes. Write operations can be made fast by using WAL mode. See here and here to understand how things work under the hood when using Node.js and SQLite3.

If your operations are inevitably slow and you can't make them fast, you probably don't want to use SQLite3 at all. However, you might be able to do multiple slow concurrent operations by creating multiple database connections and doing your operations incrementally. For example, you could run a slow SELECT statement asynchronously like this:

```js
const Database = require('better-sqlite3');

async function select(sql) {
  const iterator = Database('mydata.db').prepare(sql).iterate();
  const rows = [];
  while (true) {
    const entry = iterator.next();
    if (entry.done) break;
    rows.push(entry.value);
    await new Promise(setImmediate); // yield to the event loop after each row
  }
  return rows;
}
```

And you could run a huge number of INSERT statements asynchronously like this:

```js
async function insert(sqlArray) {
  const db = Database('mydata.db');
  db.prepare('BEGIN').run();
  try {
    for (const sql of sqlArray) {
      db.prepare(sql).run();
      await new Promise(setImmediate); // yield to the event loop after each statement
    }
    db.prepare('COMMIT').run();
  } catch (err) {
    if (db.inTransaction) db.prepare('ROLLBACK').run();
    throw err;
  }
}
```

It's important for the above functions to perform their operations incrementally, by using `await new Promise(setImmediate)` between steps, so that other pending events in the event loop get a chance to run.
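For reference, the "indexes" and "WAL mode" advice above might look like this in better-sqlite3 (the table and index names are illustrative):

```js
const Database = require('better-sqlite3');
const db = new Database('mydata.db');

// WAL mode lets readers proceed while a writer is active
// and makes commits considerably cheaper than the default journal mode.
db.pragma('journal_mode = WAL');

// An index on the column you filter by turns a full table scan
// into a B-tree lookup.
db.exec('CREATE INDEX IF NOT EXISTS idx_users_email ON users (email)');
```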
Are you suggesting that, instead, blocking the event loop while reading/writing to the database is the better practice?
I'm suggesting that opening a separate connection for every query will have a huge overhead. If your queries are unavoidably slow, then that is probably the only solution. But in most cases, the better solution is to simply make your queries fast by proper use of indexes. When you get your queries fast enough (microsecond range), the overhead of threading outweighs the tiny amount of time you're blocking the thread. In databases like MySQL and PostgreSQL this is not possible because of network latency, but in SQLite3 it is very possible.
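A minimal sketch of that recommendation: one connection opened at startup and a prepared statement reused for every query (the `users` table is hypothetical):

```js
const Database = require('better-sqlite3');

// One connection for the whole process, opened once at startup
const db = new Database('mydata.db');

// Prepared once, reused for every call; with an index on "email"
// this runs in microseconds, so blocking the event loop is negligible.
const getUser = db.prepare('SELECT * FROM users WHERE email = ?');

function findUser(email) {
  return getUser.get(email); // synchronous, but fast
}
```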
Closing due to inactivity.
This is more of a question for you, Josh. Why would you use a pool over a regular transaction insert?