
I'm having a problem executing queries in MySQL. The setup: a Node.js script using the socket.io and mysql2 libraries.

When many events arrive at once (for example, 100-500), a sequence of database queries has to run for each socket.io event. Code example:

const Mysql = require('mysql2');
// Sio is the socket.io server instance created elsewhere

const mPool = Mysql.createPool({
   host: "",
   user: "",
   password: "",
   database: "",
   port: "",
   queueLimit: 0,
   connectionLimit: 150,
   waitForConnections: true
});
let dbP = mPool.promise();
Sio.sockets.on('connection', function (socket)
{
 socket.on('myEvent', async(data)=> {
    let MyData= JSON.parse(data);
    let [result1]= await dbP.query('SELECT * FROM `table` WHERE `id`= ?',
    [MyData.id]).catch((error)=>{console.log(`query SELECT table error: ${error}`)});
    if(!result1.length)
    {
        console.log(`Not found id: ${MyData.id}`);
    }
    else
    {
        let [result2]= await dbP.query('SELECT * FROM `table2` WHERE `id2`= ?', [MyData.id2]).catch((error)=>{console.log(`query SELECT table2 error: ${error}`)});
        if(!result2.length)
        {
            /* A small piece of code (a loop) is executed here */
            await dbP.query('INSERT INTO `table2` (`id2`, `param1`, `param2`, `param3`) VALUES (?, ?, ?, 0)', [MyData.id2, param1, param2])
            .then(() => {
                Sio.to(socket.id).emit("MyResultEmmit", MyData);
            }).catch((error) => {console.log(`query INSERT table2 error: ${error}`)}); // Here I often get the error Duplicate entry '' for key 'PRIMARY': another event managed to pass the SELECT above before this INSERT finished
        }
        else
        {
            /* Here again, a small piece of code (a loop) is executed */
            await dbP.query('UPDATE `table2` SET `param1`= ?, `param2`= ? WHERE `id2`= ?', [param1, param2, MyData.id2])
            .then(() => {
                Sio.to(socket.id).emit("MyResultEmmit", MyData);
                QueueConnect = false;
            }).catch((error) => {console.log(`query UPDATE table2 error: ${error}`)});
        }
    }
 });
});

How can I fix this so that the database has time to process all of the requests?
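One way to avoid the "Duplicate entry '' for key 'PRIMARY'" race noted in the code comment is to let MySQL perform the existence check and the write atomically with INSERT ... ON DUPLICATE KEY UPDATE, instead of doing a SELECT first and an INSERT afterwards. A minimal sketch, assuming `id2` is the primary key of `table2` and that `param1`/`param2` are produced by the loop mentioned in the comments:

// Minimal sketch (assumption: `id2` is the PRIMARY KEY of `table2`).
// The duplicate check and the write happen in one atomic statement, so two
// concurrent events with the same id2 can no longer race each other between
// the SELECT and the INSERT.
await dbP.query(
    'INSERT INTO `table2` (`id2`, `param1`, `param2`, `param3`) VALUES (?, ?, ?, 0) ' +
    'ON DUPLICATE KEY UPDATE `param1` = VALUES(`param1`), `param2` = VALUES(`param2`)',
    [MyData.id2, param1, param2]
).then(() => {
    Sio.to(socket.id).emit("MyResultEmmit", MyData);
}).catch((error) => { console.log(`query UPSERT table2 error: ${error}`); });

With such a statement the separate INSERT and UPDATE branches of the handler collapse into one query, which would also remove the need for the second SELECT on `table2`.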


Two SELECTs + 2 INSERTs, and the server does not have time to process them? What do you have there, a Pentium 100? And what exactly does "does not have time to process" mean?

Akina 2022-01-31 07:39:13

It's definitely not a Pentium, it's a normal 12-thread Xeon. As I said, if a single request is sent, everything works fine, but if, for example, 500-1000 requests arrive at once, only 3-5 of them get processed in MySQL. I raised the limits on the MySQL server and increased the connection limit from 151 to 1000, and it did not help.

隼SAPSAN 2022-01-31 07:39:13

@隼SAPSAN What happens to the others? Do they reach the database? Do they fail to get processed and get dropped? Or what?

Ceri 2022-01-31 07:39:13

@Ceri, they all seem to be trying to run at the same time.

隼SAPSAN 2022-01-31 07:39:13
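If the intent from the question is that the queries for each event really should run one after another, the events of a connection can be chained onto a per-socket promise so they never execute concurrently. A rough sketch, where `handleMyEvent` is a hypothetical function wrapping the body of the existing 'myEvent' handler:

// Rough sketch: each new event waits for the previous one of the same socket
// to finish, so the SELECT/INSERT/UPDATE sequence of event N completes before
// event N+1 starts. `handleMyEvent(socket, myData)` is a hypothetical wrapper
// around the body of the current handler.
Sio.sockets.on('connection', function (socket) {
    let chain = Promise.resolve();   // tail of this socket's processing chain
    socket.on('myEvent', (data) => {
        chain = chain
            .then(() => handleMyEvent(socket, JSON.parse(data)))
            .catch((error) => console.log(`myEvent handler error: ${error}`));
    });
});

Events from different sockets would still run in parallel; serializing across all clients would need a single shared chain or a job queue instead.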

Each of the 4 queries works with a single record of one table. I see no reason at all for the slowdown, even if the tables are fat and there are no indexes. What are MAX_CONNECTIONS_PER_HOUR and MAX_USER_CONNECTIONS for the account being used? "Set connections from 151 to 1000"... hm, why?

Akina 2022-01-31 07:39:13
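The server-wide limits Akina asks about can be read through the same mysql2 pool; a small sketch (MAX_CONNECTIONS_PER_HOUR is a per-account resource limit rather than a system variable, so only the global values are shown here):

// Small sketch: print the server connection limits and the number of threads
// currently connected, using the existing promise pool (dbP).
async function showConnectionLimits() {
    const [limits] = await dbP.query(
        'SELECT @@max_connections AS max_connections, @@max_user_connections AS max_user_connections'
    );
    const [status] = await dbP.query("SHOW STATUS LIKE 'Threads_connected'");
    console.log('limits:', limits[0]);
    console.log('Threads_connected:', status[0] && status[0].Value);
}

showConnectionLimits().catch((error) => console.log(`limit check error: ${error}`));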