How to safeguard a database containing crucial data against slow queries?

Started by PaulKegg, Apr 27, 2023, 12:17 AM


PaulKegg (topic starter)

The setup involves a production (live) server, a webserver, and MySQL. Apache executes PHP scripts on the server, and these, together with the terminal and with remote users connecting over TCP, all work with the same database.

The performance of the "local Apache plus MySQL" path is critical, while that of "remote users plus MySQL" is not. When a remote user launches a poorly written query, all other requests to the database are slowed down for 3-5 minutes and the web component struggles to respond in time. How can the server be configured so that bad queries from remote users do not harm the database for local connections?

The "locale" and "remoters" connect to the database under different users, but dividing the database is not an option. The server has enough capacity with 8 cores and 24 gigs of memory.


cassie_camay

The problem statement does not say whether remote clients will modify the database, but it is possible to configure replication.
Let the master serve the local connections while remote clients and backup archives are handled by the slave. With large tables (over a million rows), it is recommended to take backups from the slave anyway.
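
As a rough sketch of what that could look like (standard MySQL master-slave replication; the database name gamedb, the host name, and the repl account are placeholders):

# master (serves the local Apache traffic) -- /etc/mysql/my.cnf
[mysqld]
server-id    = 1
log_bin      = mysql-bin
binlog_do_db = gamedb            # hypothetical database name

# slave (serves remote clients and backups) -- /etc/mysql/my.cnf
[mysqld]
server-id = 2
relay_log = mysql-relay-bin
read_only = ON                   # accounts without SUPER cannot write here

-- on the slave, point it at the master and start replicating:
CHANGE MASTER TO
  MASTER_HOST='master.local', MASTER_USER='repl', MASTER_PASSWORD='...',
  MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=4;
START SLAVE;

The repl account on the master needs the REPLICATION SLAVE privilege, and read_only on the slave keeps remote users from making changes that replication would silently overwrite.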

dragoxna

Splitting the database in two may not look like an option for you, but consider that it can improve both speed and reliability, even if you do not plan to add a separate slave or build a full master-slave system.

Solving the synchronization problems takes some design work, but the outcome can exceed your initial expectations. It is also worth keeping in mind that splitting a database pays off for scaling and for reducing downtime during maintenance.

recje

To me, it seems risky to give users who are not developers or support engineers, and who have limited SQL and database knowledge, direct SQL-level access to a large, loaded database containing critical data. Granting TCP access raises the question of whether users can run queries by hand, and hand-written queries can easily harm the database.

Instead, wouldn't it be better to give users a set of pre-approved queries, or to route their requests through a competent person for review and execution? How is accidental data deletion prevented, and are users restricted to read-only permissions on all DB objects?

If granting such access is a must, there are ways to mitigate the risk: for example, a slave server with replication, or per-user resource quotas inside the MySQL server that cap how much a single account can consume. I am not deeply familiar with MySQL, so I am not certain how far its quota support actually goes.
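
For what it's worth, MySQL does ship per-account limits, though on query and connection counts rather than CPU or IO. A sketch, assuming the remote account is named remote_user (a placeholder):

-- cap the remote account's query rate and concurrency (MySQL 5.7+ syntax)
ALTER USER 'remote_user'@'%' WITH
  MAX_QUERIES_PER_HOUR 2000
  MAX_UPDATES_PER_HOUR 200
  MAX_USER_CONNECTIONS 10;

-- abort any SELECT running longer than 5 seconds (value in milliseconds);
-- init_connect is applied to every connecting account without SUPER,
-- so the local web user should hold SUPER (or set this per remote session)
SET GLOBAL init_connect = 'SET SESSION max_execution_time = 5000';

Note that max_execution_time only applies to SELECT statements, so it caps runaway reads but not slow writes.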

cookaltony

To ensure that bad queries from remote users do not impact the functionality of the database for local connections, you can implement several strategies:

1. Query Throttling: Implement query throttling to limit the number of queries a remote user can execute within a certain timeframe. This can prevent a single user from overwhelming the database with excessive or poorly optimized queries.

2. Resource Limiting: Set resource limits on the MySQL server for remote accounts, such as caps on query rate, concurrent connections, and statement execution time (MySQL has no direct per-user CPU or memory quota). Enforcing these limits keeps badly written queries from consuming resources that local connections need; see the watchdog sketch after this list.

3. Prioritization: Give local connections higher priority than remote ones. MySQL has no per-connection priority setting as such, but MySQL 8.0 resource groups can run remote users' threads on fewer CPUs at lower priority, and routing remote traffic to a replica achieves the same separation.

4. Connection Pooling: Implement connection pooling to manage the number of simultaneous connections from remote users. This controls the number of active connections and prevents resource contention when multiple remote users are connected simultaneously.

5. Database Optimization: Regularly optimize your database by identifying and fixing query bottlenecks, creating appropriate indexes, and optimizing table designs. This can help improve overall query performance and reduce the impact of poorly made queries.

6. Query Monitoring and Logging: Monitor incoming queries and log their execution times, for instance with the slow query log (a configuration sketch follows this list). This lets you identify and track the performance impact of poorly made queries and then optimize or restrict the worst offenders.

7. Database Caching: Utilize a caching mechanism, such as Redis or Memcached, to store frequently accessed data and reduce the need for repetitive queries. This can significantly improve response times for both local and remote connections.

8. Connection Timeouts: Configure shorter connection timeouts for remote users than for local connections, so idle remote sessions are closed quickly and their resources freed for local use (see the watchdog sketch after this list).

9. Query Optimization: Work on optimizing the poorly made queries from remote users by analyzing their execution plans and rewriting them for better performance. Providing guidance or assisting remote users in optimizing their queries can ultimately benefit the overall system performance.

10. Load Balancing: Implement a load balancer to distribute incoming requests across multiple instances of the webserver. This helps evenly distribute the workload and prevents a single server from being overwhelmed by bad queries.

11. Monitoring and Alerting: Set up monitoring tools to track the performance metrics of your server and database. This allows you to proactively identify issues and take corrective actions before they affect the system's functionality. Configure alerts to notify you when specific thresholds are exceeded or when certain types of queries are detected.

12. Regular Maintenance: Schedule regular maintenance tasks, such as database backups, index optimizations, and query analysis. This helps keep the system running smoothly and minimizes the impact of poorly made queries over time.
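
For point 6, MySQL's built-in slow query log is usually enough to find the offenders; a minimal my.cnf fragment (the file path and the one-second threshold are just example values):

[mysqld]
slow_query_log                = ON
slow_query_log_file           = /var/log/mysql/slow.log
long_query_time               = 1     # log statements slower than 1 second
log_queries_not_using_indexes = ON

The log can then be summarized with mysqldumpslow to see which remote queries cost the most.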
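
For points 2 and 8, long-running statements from the remote account can be spotted and killed via the processlist; a sketch, again assuming the account name remote_user:

-- generate KILL statements for remote queries running longer than 60 seconds
SELECT CONCAT('KILL QUERY ', id, ';') AS kill_cmd
FROM information_schema.processlist
WHERE user = 'remote_user'
  AND command = 'Query'
  AND time > 60;

-- close idle remote sessions sooner than the 8-hour default
SET SESSION wait_timeout = 60;

KILL QUERY terminates only the statement and leaves the connection alive; running the generated commands from a cron script gives you a simple watchdog.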

By combining these strategies, you can make the server configuration resilient enough that bad queries from remote users no longer hurt local connections.

