Task for the PHP interpreter.

Started by Bronson, Oct 02, 2022, 01:37 AM


Bronson (Topic starter)

Hi all!

Do you think it's hard work for the PHP interpreter to process 220 files, searching for certain words in them, on a medium-sized virtual server: a VPS with 4 cores at 2200 MHz and 8 GB of RAM?

There are 220 files totalling about 8 MB, and most of them are small. In general, is searching text with regular expressions in PHP a costly operation, or is it nothing to worry about?

Thank you all in advance for the answers!

Ali_Pro

For your server configuration I don't see any problem at all, and even for more modest specs it would be fine.

Well, as long as you don't run this task every second.
It's nothing.
Ali.

_AnnA_

// A simple stopwatch built on microtime().
class Timer
{
    private static $start = 0.0;

    public static function start(): void
    {
        self::$start = microtime(true);
    }

    // Returns elapsed seconds since start() as a float.
    public static function finish(): float
    {
        return microtime(true) - self::$start;
    }
}

Timer::start();
// ... your search code goes here ...
echo Timer::finish();

Run the script. How long does it take to search?

Newport

If you want to check the server with the given configuration against typical search queries, you should use a database (with full-text search) to be objective; a sketch follows below.
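
A minimal sketch of that idea, assuming the texts have been loaded into a hypothetical MySQL table `docs` with a FULLTEXT index on its `body` column (connection details and names are placeholders, not anything from the thread):

<?php
// Hypothetical connection and schema: table `docs` (id, path, body)
// with a FULLTEXT index on `body`.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');

// Full-text search in the database instead of regex over raw files.
$stmt = $pdo->prepare(
    'SELECT id, path FROM docs
     WHERE MATCH(body) AGAINST (:q IN NATURAL LANGUAGE MODE)'
);
$stmt->execute([':q' => 'word1 word2']);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo $row['path'], PHP_EOL;
}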

Ali_Pro

Quote from: _AnnA_ on Oct 02, 2022, 07:37 AM
Run the script. How long does it take to search?
Hmmm, what's the use of a single test?
It may take 0.01 sec, but with real load and 50 users running it at the same time it may take seconds, or even tens of seconds.
Which, you must agree, is unacceptable.
Ali.

-DM-

Quote from: Newport on Oct 02, 2022, 09:42 AM
check the server with the given configuration
As I understand from the topic starter's wording, it's a one-time task.
In that case I don't see any problem; the interpreter can handle it without stress.

Guess jr.

The vagueness of the question is what generates irrelevant answers.
PHP itself is relatively fast, but disk reads are more expensive, so a lot depends on how the reading is organised.
In Python I would use multiprocessing; it seems recent versions of PHP can do that too.
In general, with these parameters it's no load at all with the right approach (one way to parallelise it is sketched below).
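
A minimal sketch of that approach in PHP, assuming a CLI run on a POSIX system with the pcntl extension enabled; the directory and regex are placeholders, not the topic starter's actual ones:

<?php
// Requires the pcntl extension (CLI / POSIX only). Directory and
// pattern are illustrative placeholders.
$files   = glob('/path/to/texts/*');
$workers = 4; // one worker per core on a 4-core VPS
$chunks  = array_chunk($files, max(1, (int)ceil(count($files) / $workers)));

foreach ($chunks as $chunk) {
    if (pcntl_fork() === 0) {          // child: scan its share of the files
        foreach ($chunk as $file) {
            if (preg_match('/\b(word1|word2)\b/i', file_get_contents($file))) {
                echo $file, PHP_EOL;
            }
        }
        exit(0);
    }
}

while (pcntl_waitpid(-1, $status) > 0); // parent: wait for all children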

Bronson (Topic starter)

Quote from: _AnnA_ on Oct 02, 2022, 07:37 AM
Run the script. How long does it take to search?
On average 0.260 sec, i.e. about 260 milliseconds, on PHP 7.4. The usual string pattern-search function preg_match_all is used; roughly speaking, it looks for 16 different words.
I understand that's not much and no problem; it's a one-time task: press the button, see the result, and forget about it.
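
For the record, a minimal sketch of that kind of scan, with a placeholder directory and word list (Bronson's real ones aren't shown in the thread), timed with _AnnA_'s Timer class from above:

<?php
// Placeholder word list and directory; Timer is the class from above.
$words   = ['alpha', 'beta', 'gamma']; // ...16 words in the real task
$pattern = '/\b(' . implode('|', array_map(
    fn($w) => preg_quote($w, '/'), $words
)) . ')\b/iu';

Timer::start();
$hits = [];
foreach (glob('/path/to/texts/*') as $file) {
    if (preg_match_all($pattern, file_get_contents($file), $m)) {
        $hits[$file] = count($m[0]); // matches per file
    }
}
echo Timer::finish(), " sec\n";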

Quote from: Guess jr. on Oct 02, 2022, 12:11 PM
all with the right approach.
Let's practice the right approach!
Thank you all for the tips!

Newport

For a one-off task, no problem at all. I once parsed an XML structure in PHP; the file was, if I'm not mistaken, 400+ MB.
That was on a server with parameters two times lower than yours.
I can't say which processor exactly, I don't remember. I had no problems with it, apart from the fact that the parsing had to be organised correctly (streamed rather than loaded whole; see the sketch below).
So even a simple preg_match over such a small amount of data, on a normal server, is a completely routine task without any special cost.
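
A minimal sketch of the kind of organisation that matters for a file that size: streaming it with XMLReader instead of loading it whole (the path and the element name `item` are assumptions for illustration):

<?php
// Stream a huge XML file node by node; memory use stays flat
// regardless of file size. Path and element name are placeholders.
$reader = new XMLReader();
$reader->open('/path/to/huge.xml');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        // Materialise just this one element and process it.
        $item = simplexml_load_string($reader->readOuterXml());
        // ... handle $item ...
    }
}

$reader->close();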

justinthomsony

I'm new to PHP; I use the following bundles:
XAMPP (apache+mod_php) and WT-NMP (nginx+php_fastcgi)
I've always been interested in this question: is each PHP file parsed on EVERY web-server request (provided the scripts are not updated and are not generated dynamically)?

1. If so, what is the overhead (approximately), and how can it be measured?
2. Which extension should be enabled to avoid such an "unreasonable waste of processor time"?
3. If an opcode-caching extension is installed, how do I get an updated PHP script picked up?
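
On points 2 and 3, the usual answer is OPcache, the opcode cache bundled with PHP since 5.5; a sketch of the relevant settings and calls, with illustrative values and a placeholder path:

<?php
// Relevant php.ini directives (illustrative values):
//   opcache.enable=1
//   opcache.validate_timestamps=1  ; recheck file mtimes...
//   opcache.revalidate_freq=2      ; ...at most every 2 seconds
//
// With opcache.validate_timestamps=0 (common in production), a changed
// script has to be evicted from the cache explicitly:
opcache_invalidate('/var/www/app/script.php', true);
// ...or the whole cache flushed:
opcache_reset();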