Virus on the site?

Started by Ravina123, Aug 20, 2022, 12:15 AM


Ravina123 (Topic starter)

Google added my website to its database of unsafe sites, although there was no apparent reason: the site opened as usual and the antivirus did not complain.
Since this happens infrequently, the infected .htaccess was not found right away, but that is not the point.
Please tell me how to find this kind of rubbish and where I can scan for it, so that I don't have to sit and wait to be dropped from the search results.

admissioninfo123

Quite often spam scripts get added. If you have SSH access, you can log in as root and start the search from the account's root directory:
grep -rl 'v3c6e0b8a' ./*
grep -rl 'FilesMan' ./*
grep -rl 'eval(base64_decode' ./*

These commands help detect spam scripts. It is also common for attackers to add .html pages containing links to "fun" websites. Those can only be found by hand, but usually nobody bothers to hide them, so they are dropped straight into the site root.
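As an illustration (not from the original post), a quick way to list .html files recently dropped into the site root is a find one-liner; the path and the 30-day window here are assumptions:
# Illustrative: list .html files in the web root changed in the last 30 days (path is an assumption)
find /home/user/data/www/site.com -maxdepth 1 -type f -name '*.html' -mtime -30 -ls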
As general advice: keep your CMS on the latest version and the chance of being hacked tends towards zero.

RZA2008

#2
Preventive measures:
To protect yourself from hacking as much as possible, it is advisable to follow these rules:
1. Update the software on the server in a timely manner;
2. Update the CMS in a timely manner;
3. Make backup copies regularly;
4. Use strong passwords. Ideally a password should contain at least 8 characters, including numbers and special characters ($%#/*);
5. If possible, do not store passwords in clear text;
6. Use the minimum possible access rights to folders and files (see the example after this list).
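As a sketch for point 6 (not part of the original post), typical minimal permissions for a PHP site are 755 for directories and 644 for files; the path is an assumption borrowed from the examples later in the thread:
# Illustrative only: set directories to 755 and files to 644 (adjust to your hosting setup)
find /home/user/data/www/site.com -type d -exec chmod 755 {} \;
find /home/user/data/www/site.com -type f -exec chmod 644 {} \;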

Virus Scan

So, you suspect there is a virus on your website. First, it is worth finding out whether that is actually the case. It is useful to examine the site with, for example, Firebug. If, while loading the site or a page, you see suspicious activity on the Network or Console tabs (a redirect, an iframe being loaded, extraneous pages or files being fetched), there is reason for concern.

Next, try "feeding" your site's URL to online services such as antivirus-alarm or vms.drweb.com/online. They will help determine whether your site contains malicious code. Once the services have finished, you only need to look at the files they report. Most often, malicious code is added at the very beginning or the very end of a file.

Whether the online service found something or not, if you still suspect that the scripts are infected, it is time to start a manual inspection.

Surface inspection.
find /home/user/data/www/site.com/ -type f -mtime -20
This command will find files of the site.com website that were modified less than 20 days ago. If you know the approximate date of infection, adjusting the -mtime parameter lets you quickly narrow down the files you need.
Examining the FTP log (/var/log/xferlog, for example) will also help in the search for viruses if the infection came in via FTP.
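The log entry the author refers to is not reproduced in the post; a hypothetical xferlog line matching the description below (the timestamp, IP address, and path are illustrative assumptions) could look roughly like this:
# hypothetical xferlog entry; timestamp, IP, and path are made up for illustration
Sun Jun 19 03:12:45 2022 1 203.0.113.5 16384 /home/user/data/www/site.com/include/virus.php b _ i r user ftp 0 * c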

Such a record tells us that on June 19 a 16 KB virus.php file was uploaded via FTP to the include folder by user user from IP address <IP>. Pay attention to the i flag in the entry, which indicates that the file was uploaded (incoming) to the server.
It is also useful to look at directories that are open to the public, such as uploads, image, etc., i.e. the directories that site users can write to.
file /home/user/data/www/site.com/uploads/* | grep -i php
will show PHP files in the uploads folder regardless of their extension. It is unlikely that you allow visitors to upload PHP files to the server, so a PHP file pretending to be a picture is a red flag. For example:
file in.jpg
in.jpg: PHP script text
Detailed inspection.
Let's say the surface inspection turned up nothing. Let's move on to a detailed examination.


Post Merge: Aug 20, 2022, 01:16 AM


htaccess
It happens that redirects are written in htaccess files.

find /home/user/data/www/site.com/ -type f -iname '*htaccess'
This command will find all .htaccess files of the site.com website. Examine them carefully for extraneous redirects.
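For illustration (not from the original post), an injected redirect in .htaccess often looks something like the following; the conditions and the domain are hypothetical:
# Hypothetical example of an injected redirect: visitors coming from search engines
# are sent to an attacker-controlled domain
RewriteEngine On
RewriteCond %{HTTP_REFERER} (google|yandex|bing) [NC]
RewriteRule ^.*$ http://malicious-example.com/ [R=301,L]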
php/js code
Malicious code can also be searched for manually, by patterns. Most often it is either base64-encoded or obfuscated according to some algorithm. It makes sense to look for the following common patterns:
FilesMan, try{document.body, String["fromCharCode"], auth_pass, fromCharCode, shell_exec, passthru, system, base64_decode, chmod, passwd, mkdir, eval(str_replace, eval(gzinflate, ="";function, "ev"+"al", md5=, ss+st.fromCharCode, e2aa4e
You can search for malicious code using the grep command, for example
grep -ril base64_decode /home/user/site.com
will show all files in the site.com folder that contain base64_decode. Generally speaking, some of these functions and variables can be used in code for perfectly legitimate reasons, so before deleting or cleaning a file, make sure it really is malicious code. Also, make a backup copy of the site before deleting anything.
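To check several of the patterns listed above in a single pass, a combined grep along these lines may help (the path follows the earlier example; treat every hit as a candidate, not proof):
# Illustrative combined search for several of the markers listed above
grep -rilE 'FilesMan|eval\(base64_decode|eval\(gzinflate|shell_exec|passthru' /home/user/site.com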

If you find malicious code on the website, do not rush to edit the file. Look at it first with the stat command
stat infected.js
In the output you will see the dates of the last access, modification, and attribute change of the file. This can help establish the time and date of the breach. Using these dates, you can search for files and events in the logs, as shown at the beginning of this post.
In order to remove malicious code from a file, the following construction can be used
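The exact command is missing from the post; one commonly used construction (a sketch, an assumption on my part) combines grep and sed to strip an injected eval() call from every affected file. Verify the pattern on a copy first, since sed will remove every match it finds:
# Sketch only: strip injected eval(base64_decode(...)); calls from all files that contain them.
# The regular expression is an assumption; adapt it to the exact code found on your site.
grep -rlZ 'eval(base64_decode' /home/user/data/www/site.com | xargs -0 sed -i 's|eval(base64_decode([^)]*));||g'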

Remember that the actions listed above do not guarantee a complete cleanup of your scripts! If doubts remain, it is better to restore the most recent clean backup.
Actions after cleaning

After cleaning up your scripts, it is advisable to do the following:
- change the passwords for access to your server;
- update the CMS you use and all related software (plugins, modules);
- make a clean backup of the website (a sketch follows below).
These actions will reduce the likelihood of a subsequent hack and protect against data loss.
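As a sketch of the last step (the paths, database name, and user are assumptions, not from the original post), a dated archive of the files plus a database dump is usually enough:
# Illustrative backup: archive the site files and dump the database (names and paths are assumptions)
tar -czf /home/user/backup/site.com-clean-$(date +%F).tar.gz /home/user/data/www/site.com
mysqldump -u dbuser -p sitedb > /home/user/backup/site.com-db-$(date +%F).sql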