Today, I would like to talk about important services and ports and how you should treat them. I have seen many servers and VPSes left with their default settings. I strongly urge everyone to start restricting and protecting them; it is necessary.
SSH service port 22 – This is remote shell access for a Linux server. You can use TCP wrappers or iptables to restrict access to certain IP addresses. Also, change port 22 to something else that you can remember. Do not keep the default settings.
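As a sketch of the kind of restriction described above (the port number, user name and IP address below are placeholder examples, not recommendations):

```shell
# /etc/ssh/sshd_config -- move SSH off port 22 and limit who can log in
Port 2222
PermitRootLogin no
AllowUsers adminuser           # placeholder user name

# iptables -- allow the new SSH port only from a trusted address,
# drop everyone else (203.0.113.10 is a documentation-range example)
iptables -A INPUT -p tcp --dport 2222 -s 203.0.113.10 -j ACCEPT
iptables -A INPUT -p tcp --dport 2222 -j DROP
```

Restart the SSH daemon after editing sshd_config, and keep an existing session open while testing so you do not lock yourself out.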
RDP – port 3389. Change the administrator username and restrict IP addresses using the advanced firewall in Windows Server.
MySQL port 3306 – Limit it to local connections.
MSSQL port 1433 – Limit it to local connections.
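For MySQL, limiting the service to local connections is a one-line change in the server configuration (the file path varies by distribution; /etc/mysql/my.cnf is shown as an example):

```shell
# /etc/mysql/my.cnf -- listen on the loopback interface only,
# so port 3306 is never exposed to the network
[mysqld]
bind-address = 127.0.0.1
```

After restarting MySQL, `ss -tln | grep 3306` should show the server listening on 127.0.0.1 only.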
Plesk administration ports 8880 and 8443 – Restrict access in the Plesk administration panel, or enable 2FA under Tools & Settings.
cPanel / WHM ports 2082, 2083, 2086, 2087, and even 2092 and 2093 – You can restrict access by IP address in WHM. Use 2FA for WHM and cPanel access.
FTP port 21 (active mode) – If possible, restrict access by IP address, or at least use a strong password.
There is no difference in encryption if you are using a free SSL certificate; free and paid certificates are the same in terms of functionality. However, some would still prefer a paid SSL certificate.
There are a few reasons why you would go for a paid SSL certificate:
Re-issue – mostly unlimited from most CAs. However, with a free SSL there is a limit or delay in getting a certificate issued, as the process is automated for end users. Waiting for the next cycle is not acceptable for someone running an online business.
The validity of a free SSL certificate is shorter; you definitely cannot get past a 12-month duration.
Free certificates are limited to DV (domain validation); if you are looking for OV or EV, it is only available with a paid SSL certificate.
There is no insurance value on a free SSL certificate.
Support provided by the CA – you might still get support for a free certificate, but it will definitely take longer.
Reputation – if I’m running an online business, I want my SSL certificate signed by a reputable CA and recognized by my visitors. That is not likely with a free SSL certificate, especially compared with an EV certificate.
Today we aren’t talking about web hosting, but this is SEO-related: how to improve your website’s ranking in search engine listings. I’m no expert, and it isn’t how I make my living, but I have immersed myself in this topic on and off for the past few years, so I’m taking the opportunity to share.
Write Relevant Content
Quality content is the number one driver of your search engine rankings, and there is simply no substitute for wonderful articles. Quality content created specifically for your intended customer raises website traffic, which boosts the site’s authority and relevance. Fine-tune your web writing capabilities.
Determine and focus on a keyword phrase for each web page. Think about the ways your reader might search for that particular page (with terms like “web hosting in Singapore,” “best shared web hosting,” or “fast web hosting”). Then repeat this phrase several times throughout the page: once or twice in the opening and closing sentences, and three to four more times throughout the remaining content.
Don’t forget to use bold, italics, heading tags such as H1, and other emphasis tags to highlight these keyword phrases, but do not overdo it. You still need your language and writing style to read naturally. Never sacrifice good writing for SEO; the best pages are written for the user, not for the search engine.
Update Your Articles Regularly
You have probably noticed that we feel quite strongly about content. Search engines do, too. Regularly updated content is viewed as one of the best indicators of a site’s relevancy.
Title metadata is responsible for the page title shown at the top of a browser window, and as the headline within search engine results. It is the most important metadata on your page. If you have a CMS site, it may generate the meta title for each page automatically.
Description metadata is the textual description that a browser may use in your page’s search result listing. Think of it as your site’s window display: an accurate and appealing description of what is inside, with the goal of encouraging people to enter. A good meta description will typically consist of two full sentences. Search engines may not always use your meta description, but it is important to give them the option.
Keyword metadata is rarely, if ever, used to tabulate search engine rankings. However, you should already know your keyword phrases, so it does not hurt to add them to your keyword metadata. Include a variety of phrases. As a general rule, try to keep it to approximately 6-8 phrases, with each phrase consisting of 1-4 words. A good example would be “cheap web hosting in Singapore.”
Have a link-worthy site
Focus on building meaningful links within the text. Instead of “click here” links, try writing out the name of the destination. “Click here” has no search engine value beyond the attached URL, whereas “cheap web hosting” is rich with keywords and will improve your search engine rankings as well as the ranking of the page you are linking to. Always use descriptive links by linking keywords: this not only improves search engine optimization but also adds value for your visitors, including those with disabilities or those using screen readers.
Use alt tags
Always describe your images and video media using alt tags, or alternative text descriptions. They allow search engines to locate your page, which is crucial, especially for those who use text-only browsers or screen readers.
In the last article we discussed the major challenge for today’s businesses: online marketing. The popularity of going online has made exposure more effective and easier; in fact, all you need is a computer connected to the internet.
In fact, online marketing is not new but it is the key element of business success today. The idea is to generate human traffic to your website. We call it a money site. Whether you are selling a product or providing a service, it is the same principle.
How do you make people come to your website? One of the most effective and traditional methods is advertising. You tell people about the products you are selling and the services you are providing. You can advertise on social media, search engines and business-related portals.
These channels offer a mechanism to track how people click your advertisements online. The more your advertisements show up, the higher the chance they are clicked, and the more you will pay. Marketing through advertising can be expensive; however, the return is faster.
The second type can be costly or affordable, depending on how you do it: SEO. The goal is to rank as high as possible for the products and services you deal in. The result is not instant and is very subjective; influencing factors include your web content, the search engines’ ranking algorithms, your competitors and so on. For this, I suggest business owners sacrifice some time and do it themselves: publish a unique article often, say twice a week, or run a blog related to your business. As I have said, it can take months to see results, and you must do plenty of reading on SEO.
The third is the cheapest and most effective: word of mouth. Recommendations can come from friends, colleagues, customers and relatives. If you have a huge social circle, you can definitely make use of it. However, the result can also work against you: if someone starts making unfavourable remarks, they spread fast and can be harmful to your business.
In my opinion, online business marketing must include these three methods. The second method, SEO, pays off in the long run if you do it correctly; the return is very consistent. You won’t get rich overnight, but it generates income steadily, and you can gauge your income more accurately against your expectations.
In the last article, we installed maldet. Today we learn how to configure it. Again, maldet is free and only for Linux servers. Let us begin the configuration, assuming you have installed maldet successfully.
To set up maldet, you have to modify the configuration file at /usr/local/maldetect/conf.maldet.
The following are some of the general options that you may want to set.
If you want to be notified of the presence of malware by email, set the following options.
email_alert : If you want to get email alerts whenever a suspect file is detected, then it should be set to 1.
email_addr : The email address to which notifications should be directed. This is used in combination with the email_alert option.
email_ignore_clean : Skip sending email notifications when malware hits have been automatically cleaned (see the next two options). This is disabled by default. Set it to 1 to enable it if you have decided to set up an automated daily scan that detects and cleans hits, and you do not want to be notified of these by email.
What action should be taken on infected files? The following options can be set to quarantine the files (move the affected files to a secure, protected area where they cannot cause any damage).
quarantine_hits : The default value is 0. Set this to 1 so that infected files are moved to quarantine.
quarantine_clean : The default value is 0. This is used once quarantine_hits is set to 1. Do you want the program to further clean the files? Set this to 1 if you want maldet to try to clean the malware injections. Keep it at 0 if you want to inspect files before cleaning.
In a multi-user environment, the following options may be useful.
quarantine_suspend_user : Disabled and set to 0 by default. If you set this to 1, the accounts of users who have hits will be suspended. For this to work, quarantine_hits should be 1.
quarantine_suspend_user_minuid : The lowest user id which can be suspended. This is set to 500 by default.
inotify_minuid : The lowest user id above which users need to be watched. The default value is 500.
inotify_docroot : The web directory relative to the home directory of users. By default, it is set to public_html. If this is set, only this web directory will be checked.
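Putting the options above together, a conf.maldet set up for quarantine with manual cleaning might look like this (the email address is a placeholder, and these values are one possible choice, not defaults you must use):

```shell
# /usr/local/maldetect/conf.maldet -- example settings
email_alert="1"
email_addr="admin@example.com"        # placeholder address
email_ignore_clean="0"
quarantine_hits="1"                   # move hits to quarantine
quarantine_clean="0"                  # inspect before cleaning
quarantine_suspend_user="0"
quarantine_suspend_user_minuid="500"
inotify_minuid="500"
inotify_docroot="public_html"
```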
Save and close the configuration file.
A simple scan
For a simple scan, run maldet with the --scan-all option and a path as an argument. It first builds a list of files from all the directories and sub-directories in that path, then reads through the files and reports the number of hits. It also generates a report which you can view to examine the suspicious files. Make sure that you provide the full path, not a relative path.
sudo maldet --scan-all /home/username/public_html/
A word of warning, though. The setting scan_ignore_root in the configuration file is set to 1 by default. This causes files owned by root to be ignored in the file list that maldet builds. The default is more efficient, but it assumes that your root password has not been compromised and that malware has not been injected into root-owned files. Change this setting to 0 if you want root-owned files to be scanned as well. This might slow down the scan, so use it judiciously.
You can view the affected files by opening the report file mentioned in the scan output.
Quarantine affected files
When quarantine_hits is set to 1, maldet not only scans for malware, but also moves the hits to quarantine so that your users do not have access to these files, and the scan results will reflect this. In this example, quarantine_clean is set to 0.
If you view the report, you can see the affected files and their quarantine location. You can inspect the files and then decide on whether you want to clean them.
If you scanned with quarantine_hits set to 0, you need not set it to 1 and redo the scan. Instead, you can quarantine all malware results from the previous scan with
sudo maldet --quarantine SCANID
Quarantine and clean affected files
When quarantine_clean is set to 1, maldet moves the affected files to quarantine and tries to clean them.
If you did a scan with quarantine_hits or quarantine_clean set to 0, you can clean afterwards with the following option.
sudo maldet --clean SCANID
Restore a file
If you want to restore a file that was quarantined as a false positive, or if you have cleaned a file and want it back in its proper location, run
sudo maldet --restore FILENAME
Alternatively, give the complete path of the quarantined file.
You can also make use of wildcards in your scan path. ? is the wildcard character.
sudo maldet --scan-all /home/?/public_html/
This will check all directories inside /home, and if any of them has a public_html sub-directory, that directory will be scanned completely.
If you want to scan the same path as a previous scan, but only the files created or modified in the recent past, run maldet with the --scan-recent option and the number of days n.
You can do a weekly incremental check by running such a recent scan with n set to 7.
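For example, assuming the same wildcard path as the earlier scan, a weekly incremental check could be run as:

```shell
# Scan only the files created or modified in the last 7 days
sudo maldet --scan-recent /home/?/public_html 7
```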
Automate periodic scan
You can automate daily scans using the cronjob feature. During installation, LMD installs a cronjob at /etc/cron.daily/maldet.
This cronjob updates signatures, adds new malware threats to its registry, and performs a daily scan of all home directories and recent changes on the server. Whenever it detects malware, it notifies the email address specified in the configuration.
The inotify monitor can be used to watch users in real time for file creation, modification or movement. Monitoring can be done with one or more of the three options available.
The users option takes the home directories of all users on the system with a UID greater than inotify_minuid and monitors them. If inotify_docroot is set, only the users’ web directory, if it exists, will be monitored.
sudo maldet --monitor users
Alternatively, you can monitor paths. Give a comma-separated list of paths with the --monitor option.
sudo maldet --monitor PATH1,PATH2,…
For example:
sudo maldet --monitor /tmp,/home,/var
If you have concerns about specific files, you can monitor them by giving a comma-separated list of files.
sudo maldet --monitor FILE1,FILE2,..
Exclude files or paths
Certain paths or files can be excluded from the scan by using the ignore files.
Add files or paths to be excluded from daily scan in /usr/local/maldetect/ignore_paths
Add signatures to be excluded from daily scan in /usr/local/maldetect/ignore_sigs
Add files or paths to be excluded from inotify monitoring in /usr/local/maldetect/ignore_inotify.
Add the extensions of file types that you want to exclude from daily scans (one per line) in /usr/local/maldetect/ignore_file_ext.
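As an illustration, entries in ignore_file_ext are plain extensions, one per line; the choices below are hypothetical examples:

```
.zip
.tar.gz
.jpg
.png
```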
Check out more options like running maldet in the background and other finer settings by using the help option.
sudo maldet --help
If you run a self-hosted website, at some point or other it is possible for malicious hackers to inject malware into your system. Before that happens, secure your system and install maldet to stay ahead of such attacks.
It is frustrating to receive a lot of spam email. Even with the so-called best anti-spam on your email service, you can still receive spam, perhaps less of it, and at the same time some legitimate emails might be treated as spam, known as false positives.
As far as I’m concerned, no anti-spam is 100% effective. With anti-spam you definitely receive less, but your definition of spam is never the same as the server’s. Some anti-spam systems require you to set rules or train them in order to be effective.
Thus, you cannot eliminate all the emails you consider spam. However, you can still minimise the spam you receive without spending a lot on a good anti-spam product. Here are a few tips to help you:
a. Never use your ‘work’ email to register online for personal use. Your email address can be sold to someone for bulk sending. Always think twice about whether registration is necessary.
b. Avoid common account names like help, sales, enquiry or similar. If your name is John, avoid using john@; add your last name.
c. Do not advertise your email address. Spammers use harvesting techniques to collect email addresses from common areas like auction portals, buy-and-sell portals, etc.
d. Make sure you use SPF in your domain’s DNS; it allows only permitted email servers to send email on your behalf. I recommend ‘-all’ at the end of the SPF record if you want receivers to reject emails impersonating your organization.
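For illustration, such an SPF record in a DNS zone can look like this (the domain and IP address are placeholders; list your real sending servers instead):

```shell
# TXT record for example.com: only the listed IP, the domain's A and MX
# hosts may send mail; "-all" tells receivers to reject everything else
example.com.  IN  TXT  "v=spf1 ip4:203.0.113.25 a mx -all"
```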
e. Use effective RBLs on your email server. Reputable RBLs filter emails sent from known bad IP addresses.
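On a Postfix server, for instance, an RBL check is a one-line addition to the restrictions list (Spamhaus ZEN is shown as a well-known example; check the list's usage terms before relying on it):

```shell
# /etc/postfix/main.cf -- reject connections from IPs listed on the RBL
smtpd_recipient_restrictions =
    permit_mynetworks,
    reject_unauth_destination,
    reject_rbl_client zen.spamhaus.org
```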
WordPress version 5.1 is here; you can find the details at https://wordpress.org/support/wordpress-version/version-5-1/. If you are using an older version of WordPress, I suggest you upgrade as soon as possible.
Before you upgrade, always verify that the plugins you have installed are compatible; outdated plugins may require an upgrade too. Make a backup of your WordPress website before the upgrade.
A common mistake is that users back up the WordPress files only. A WordPress website is a database-driven CMS: content updates are changes to database tables, so making a database dump / backup is important.
Without the latest database backup, you have a high chance of getting a broken website during restoration.
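A minimal backup sketch, assuming a typical setup (the database name, user and paths are placeholders; take the real values from your wp-config.php):

```shell
# Dump the database -- the content lives here, not in the files
mysqldump -u wpuser -p wordpress_db > wordpress_db-$(date +%F).sql

# Archive the WordPress files (themes, plugins, uploads, wp-config.php)
tar -czf wordpress-files-$(date +%F).tar.gz /var/www/html/wordpress
```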
There are many types of cache: some are served by the web server and some are integrated with the website application, like WordPress. Whichever type of cache you use, the objective is to speed up the website.
Yes! I just said ‘speed up the website’. Wait a minute, speed up my website? I have a broken website and my updates never appear!
Certainly, your website and her website are not the same. Apart from the website itself, she is using an IIS web server while you are using an outdated Apache 2.2 web server. For your information, Apache 2.4 is newer and better.
Caches are intended to speed up your website, but the question is ‘by how much’? And does your website load properly? The idea of a cache is to store frequently accessed or preloaded data so that pages display faster.
Caching uses techniques such as compressing, preloading, minifying, combining and expiring to speed up the loading process. However, these caches introduce two main issues:
A broken website
Updates are not reflected.
These are known issues, and they can be annoying. If you choose to use a cache, some of them are unavoidable, so learn the processes. Some caches are more intelligent and automatically flush each time you make an update, but I suggest you flush manually for a better result.
If the cache has a commercial license, ask for a trial. You need a few days of intensive testing to make sure all pages of your website load properly.
Is Sucuri Firewall Pro better? Yes, in a way. It depends on the user, and on how he or she manages the website. Personally, I feel Sucuri is better, and it can be better still.
Sucuri is not the only one in the market selling website protection; there are big names like Cloudflare, StackPath and others. But my discussion today is on Sucuri, specifically the Pro plan. They don’t have a free plan like Cloudflare; you can find their plans here.
I have set up and used most of them, and they all do the job. However, I like Sucuri. The setup feels more secure for those who choose to use their own DNS: the website’s webroot points to the Sucuri proxy, not to your source. This way, it is difficult to find your source IP and attack it.
Even if your source IP is exposed, you can protect your web server by allowing only Sucuri proxies to access it; it is strongly recommended that you do so. When you are using a firewall proxy, your logs will show the proxy IP instead of the visitor’s. Sucuri has a tutorial on how to recover the real IP from the X-Forwarded-For header for most web servers.
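On Apache 2.4, for example, restoring the real visitor IP from X-Forwarded-For is done with mod_remoteip (the trusted-proxy range below is a placeholder; use the proxy ranges Sucuri publishes):

```shell
# Apache 2.4 -- trust X-Forwarded-For only when it comes from the proxy
LoadModule remoteip_module modules/mod_remoteip.so
RemoteIPHeader X-Forwarded-For
RemoteIPTrustedProxy 203.0.113.0/24   # placeholder; use Sucuri's ranges
```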
The Sucuri package from us comes with monitoring too. You can check whether your website is infected by malware at intervals as short as 6 hours.
Most website proxies include a CDN, a feature that speeds up your website. A bigger brand has more POPs than Sucuri; however, never assume that means the site responds faster. For example, my website vastspace.net scores 86 in the Pingdom speed test with Sucuri and only 72 with the other firewall, with the same test location for both setups. To confirm, I used GTmetrix: the YSlow score was 81 with the other firewall and 89 with Sucuri.
I also feel that the website loads faster, and the load times at GTmetrix back this up. I’m not sure whether you would have to pay more elsewhere to improve loading speed (image loading speed, for example); if that is the case, Sucuri is cheaper.
Sucuri is easy to understand and straightforward compared to many web firewalls; I found what I needed. I have tried other web protection GUIs, and I am either overwhelmed by the clickable icons or they have limited features. The worst feeling is having to pay extra for a particular feature; in my opinion, do not put those in the interface, but sell them as add-ons.
As I have mentioned, this is my opinion: Sucuri is value for money. It costs less than most, and you get both website protection and speed. It is worth considering.