Task for PHP interpreter

Started by Bronson, Oct 02, 2022, 01:37 AM


Bronson (Topic starter)

Greetings everyone!

I am wondering whether the PHP interpreter would struggle with searching for specific words across 220 files on a moderately sized virtual server: a VPS with 4 cores at 2200 MHz and 8 GB of RAM.

The files total about 8 MB, so while they are numerous, each one is quite small. Would searching through them with regular expressions in PHP be a heavy operation, or is it not much of an issue?
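Roughly, the kind of search I have in mind looks like this (the directory path and word list are just placeholders):

```php
<?php
// Sketch of the task: search a directory of small files for a set of
// words with one combined regex. Path and word list are placeholders.
function search_words(string $dir, array $words): array
{
    // Build one alternation pattern: /\b(?:word1|word2|...)\b/i
    $quoted  = array_map(fn($w) => preg_quote($w, '/'), $words);
    $pattern = '/\b(?:' . implode('|', $quoted) . ')\b/i';

    $hits = [];
    foreach (glob($dir . '/*') ?: [] as $file) {
        if (preg_match_all($pattern, (string)file_get_contents($file), $m)) {
            $hits[$file] = $m[0];   // every matched occurrence in this file
        }
    }
    return $hits;
}

print_r(search_words('./data', ['alpha', 'beta', 'gamma']));
```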

Thanks in advance for any responses!
  •  

Ali_Pro

Overall, I don't foresee any issues with your server configuration; even more modest hardware would cope.

And as long as you aren't running this particular task at a very high frequency, it shouldn't pose any problems.

I'm not sure what was meant by "nonsense," but it's always worth considering potential performance optimizations regardless of the scope of a particular task. This could involve minimizing database queries or leveraging caching technologies.
Ali.
  •  

_AnnA_

class Timer
{
    private static $start = .0;

    public static function start(): void
    {
        self::$start = microtime(true);
    }

    public static function finish(): float
    {
        return microtime(true) - self::$start;
    }
}

Timer::start();
// ... your search code goes here ...
echo Timer::finish();

Run the script. How long does it take to search?
  •  

Newport

If you want an objective test of a server with the given configuration under typical search queries, you should use a database (with full-text search).
  •  

Ali_Pro

Quote from: _AnnA_ on Oct 02, 2022, 07:37 AM: Run the script. How long does it take to search?
Hmmm, what's the use of a single test?
It may take 0.01 sec, but with real load and 50 users running it at the same time it may take seconds, or even tens of seconds.
Which, you must agree, is unacceptable.
Ali.
  •  

-DM-

Quote from: Newport on Oct 02, 2022, 09:42 AM: check the server with the given configuration
As I understand the topic starter's wording, it's a one-time task.
In that case, I don't see any problem; the interpreter can handle it without strain.
  •  

Guess jr.

The lack of clarity in the original question can lead to irrelevant answers.

While PHP is generally quite efficient, it's worth noting that disk read operations can be more costly - so the efficiency of the read process will greatly impact overall performance.

In cases like this, one potential solution is a multiprocessing approach, which is common in languages such as Python; PHP supports this too, for example via the pcntl extension (CLI) or the parallel extension.

Overall, given the specifications outlined in the initial question and with an effective implementation strategy, this task shouldn't create any significant server load.

It's important to properly frame and communicate technical questions to ensure that you receive helpful and accurate responses. Additionally, using best practices such as optimizing code performance and leveraging caching or parallel processing techniques can greatly improve the efficiency of your applications.
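A sketch of what such a multiprocessing approach could look like in PHP, assuming the pcntl extension is available (CLI only); the directory, worker count, and search word are illustrative:

```php
<?php
// Sketch: split the file list among child processes with pcntl_fork.
// CLI only; requires the pcntl extension. Paths and word are illustrative.
$files   = glob('./data/*') ?: [];
$workers = 4;
$chunks  = array_chunk($files, max(1, (int)ceil(count($files) / $workers)));

foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        fwrite(STDERR, "fork failed\n");
        exit(1);
    }
    if ($pid === 0) {                        // child: search its share
        foreach ($chunk as $file) {
            if (preg_match('/\bneedle\b/i', (string)file_get_contents($file))) {
                echo $file, "\n";
            }
        }
        exit(0);
    }
}
// Parent: wait for all children to finish.
while (pcntl_waitpid(0, $status) > 0);
```

Whether this pays off depends on whether the task is I/O-bound; for 8 MB of data a single process is usually plenty.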
  •  

Bronson (Topic starter)

Quote from: _AnnA_ on Oct 02, 2022, 07:37 AM: Run the script. How long does it take to search?
On average 0.260 sec, i.e. about 260 milliseconds, on PHP 7.4, using the usual string pattern-search function preg_match_all; roughly speaking, it looks for 16 different words.
I understand that's not much and no problem; it's a one-time task: press the button, look at the result, and forget about it.
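For what it's worth, combining the words into a single alternation pattern instead of running separate preg_match_all passes per word can shave the time further; a small self-contained comparison (the words and sample text are stand-ins):

```php
<?php
// Compare N separate preg_match_all passes against one combined
// alternation pattern. Words and sample text are placeholder stand-ins.
$words = ['alpha', 'beta', 'gamma', 'delta'];        // stand-ins for the 16
$text  = str_repeat('alpha lorem beta ipsum gamma dolor delta sit ', 1000);

$t = microtime(true);
$separate = 0;
foreach ($words as $w) {
    $separate += preg_match_all('/\b' . preg_quote($w, '/') . '\b/i', $text);
}
$tSeparate = microtime(true) - $t;

$t = microtime(true);
$quoted   = array_map(fn($w) => preg_quote($w, '/'), $words);
$combined = preg_match_all('/\b(?:' . implode('|', $quoted) . ')\b/i', $text);
$tCombined = microtime(true) - $t;

printf("separate: %d matches in %.4fs\ncombined: %d matches in %.4fs\n",
       $separate, $tSeparate, $combined, $tCombined);
```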

Quote from: Guess jr. on Oct 02, 2022, 12:11 PM: with an effective implementation strategy
Let's practice the right approach!
Thank you all for the tips!
  •  

Newport

A one-off parsing task on a normal server should be no issue at all. In the past, I myself have parsed an XML structure using PHP - and if I recall correctly, the file size was around 400+ MB.

Despite having server parameters that were only half as powerful as yours, I didn't encounter any significant issues during the process - though I should note that properly organizing the parsing implementation was critical.

With this in mind, even a relatively straightforward operation such as running a preg_match on a small amount of data should be a routine task that does not incur additional costs or excessive load on the server.

It's worth keeping in mind that your specific server configuration and the nature of your implementation can greatly impact the efficiency of these operations, so it may be wise to carefully evaluate performance metrics and adjust your approach as necessary.
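For reference, streaming the file rather than loading it whole is one way to "properly organize" such parsing so memory use stays flat regardless of file size; a sketch with XMLReader (the file name and element name are assumptions):

```php
<?php
// Sketch: stream a large XML file with XMLReader so memory stays flat.
// The file name ('big.xml') and element name ('item') are assumptions.
function count_items(string $file, string $element): int
{
    $reader = new XMLReader();
    if (!@$reader->open($file)) {
        return -1;                         // could not open the file
    }
    $count = 0;
    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT
                && $reader->name === $element) {
            $count++;                      // process one element at a time
        }
    }
    $reader->close();
    return $count;
}

echo count_items('big.xml', 'item'), "\n";
```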
  •  

justinthomsony

As a newcomer to PHP, I've been using two different bundles - XAMPP (which uses apache+mod_php) and WT-NMP (nginx+php_fastcgi). A question that has always intrigued me is: does parsing of each PHP file occur with EVERY request to the web server (assuming the scripts are not being dynamically generated or updated)?

If so, my main concerns are:

1. What is the approximate overhead in this scenario, and how can it be measured?
2. Which extensions should be utilized to avoid unnecessary waste of processor time?
3. If an opcode caching extension is installed, how can PHP scripts be properly updated?
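On question 3 specifically: with OPcache, whether edited scripts are picked up automatically is governed by two ini settings, and the cache can also be flushed by hand with opcache_reset() or by restarting PHP-FPM/Apache. A sketch of the relevant php.ini lines (the values shown are illustrative, not recommendations):

```ini
; php.ini (illustrative values)
opcache.enable=1
opcache.validate_timestamps=1  ; re-check file mtimes for changes...
opcache.revalidate_freq=2      ; ...at most once every 2 seconds
; with validate_timestamps=0, a deploy must call opcache_reset()
; or restart the PHP process for changed scripts to take effect
```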

The efficiency of PHP scripts on a server can be affected by a variety of factors, including proper configuration and the use of caching. Techniques such as code profiling and benchmarking can also provide valuable insight into application performance.
  •