Webserver protection from bots

Started by fitriulina, Dec 28, 2022, 03:57 AM


fitriulina (Topic starter)

Do you think it is reasonable to add multiple lists of bots to the .htaccess file in order to prevent unnecessary server load?
Also, is there a particular list that you currently use?

GavinOwlsen

Such a measure can be reasonable, but only if you have accurate log data showing abusive behavior; in practice, every case has its own circumstances. For instance, on certain websites the logs revealed excessive activity from American educational institutions, and those domains were subsequently blocked.

It is not advisable to simply paste lists found online into the .htaccess file. Instead, evaluate the traffic and include only what is actually necessary, avoiding overloading the file with instructions. Other optimization methods are also worth exploring.
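To illustrate the targeted approach: a minimal sketch, assuming mod_rewrite is enabled. The user-agent patterns here are placeholders, not a recommended list; substitute whatever your own logs actually flag.

    <IfModule mod_rewrite.c>
      RewriteEngine On
      # Placeholder patterns - replace with bots your logs show misbehaving
      RewriteCond %{HTTP_USER_AGENT} (BadBotOne|BadBotTwo) [NC]
      RewriteRule .* - [F,L]
    </IfModule>

The [F] flag simply returns a 403 Forbidden to matching clients, and a short, evidence-based list like this keeps the file from bloating.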

JPinto

It is important to recognize that the majority of DDoS attacks are run by organized groups. Even so, cloud-based smart filters can remove up to 90% of malicious traffic and noticeably decrease server load.

This filtering relies on routers and intermediary machines that intercept traffic, distribute it evenly, and scrub it before passing it on to the web server. End users may notice a slight delay in page loading, but the site remains accessible.
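That kind of scrubbing happens at the provider's edge, but the same rate-limiting principle can be sketched on a single Apache server with mod_evasive, which blocks clients that exceed a request-rate threshold. The values below are illustrative, not tuned recommendations:

    <IfModule mod_evasive24.c>
      DOSHashTableSize 3097
      # Max requests for the same URI per client within DOSPageInterval seconds
      DOSPageCount 10
      DOSPageInterval 1
      # Max total requests per client within DOSSiteInterval seconds
      DOSSiteCount 100
      DOSSiteInterval 1
      # How long (in seconds) an offending client stays blocked
      DOSBlockingPeriod 60
    </IfModule>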

While basic hosting plans may include protection against weak attacks of up to 10 Gbit/s, more serious attacks require third-party resources. A Web Application Firewall (WAF) can protect against various attack types, such as DDoS, SQL/SSI injection, brute force, cross-site scripting (XSS), buffer overflow, and directory indexing. Investing in effective DDoS protection is essential for businesses, as the damage an attack causes can be far greater than the cost of the protection.
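As one concrete WAF example: the open-source ModSecurity module with the OWASP Core Rule Set covers several of the attack classes listed above. A minimal sketch for Apache, assuming the module and rule set are already installed; the file paths vary by distribution and are only examples:

    <IfModule security2_module>
      SecRuleEngine On
      # OWASP Core Rule Set: rules covering SQLi, XSS, and related attacks
      # (paths below are distribution-dependent examples)
      IncludeOptional /etc/modsecurity/crs-setup.conf
      IncludeOptional /etc/modsecurity/rules/*.conf
    </IfModule>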

carldweb

Adding extensive bot lists to the .htaccess file increases complexity and maintenance overhead. Each list may have its own formatting and requirements, which makes it hard to manage and update, and because .htaccess is re-read on every request, a bloated file also costs server performance.
Instead of relying solely on .htaccess for managing bot traffic, I recommend a web application firewall (WAF) or a dedicated bot management tool. These provide more robust and flexible mechanisms for identifying and mitigating bot traffic, without cluttering the .htaccess file with long lists; if you do stay with Apache-level blocking, the sketch below shows a tidier alternative.
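A sketch of that alternative: keep the list in a single server-level include that your feed regenerates, instead of per-directory .htaccess files. The path and patterns here are hypothetical:

    # Hypothetical /etc/apache2/conf-available/bad-bots.conf,
    # regenerated from whatever maintained feed you subscribe to
    BrowserMatchNoCase "BadBotOne" bad_bot
    BrowserMatchNoCase "BadBotTwo" bad_bot
    <Directory "/var/www/html">
      <RequireAll>
        Require all granted
        # Deny any client whose User-Agent matched a pattern above
        Require not env bad_bot
      </RequireAll>
    </Directory>

Because this is read once at server startup rather than on every request, it also avoids the per-request parsing cost of a large .htaccess file.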

In terms of preferred bot lists, as a webmaster I prioritize reputable, regularly updated bot databases from established security companies. These are typically backed by detection algorithms and crowd-sourced intelligence that identify and categorize bot traffic accurately, so they protect a site from malicious bots while keeping false positives and negatives to a minimum.

While adding multiple bot lists to the .htaccess file might seem like a straightforward way to cut unnecessary server load, the long-term maintainability and performance costs matter. A more holistic strategy, combining a WAF with reputable bot databases, is the more effective and sustainable route for webmasters who want to optimize both security and performance.

