Categories: WordPress

WordPress Optimization Secrets: Block External Requests for Lightning-Fast Performance.

WordPress, known for its versatility and ease of use, sometimes requires a bit of tweaking to ensure optimal performance and security. Today, we’ll explore a powerful technique that can significantly enhance both speed and security: blocking external HTTP requests.

How Do External Requests Impact Performance and Security?

  • Performance: When your WordPress site makes external requests, it waits for responses from those external servers. This can add delays, slowing down page loading times.
  • Security: External requests can introduce vulnerabilities if the external servers are compromised or malicious.

The Code: Explained

define( 'WP_HTTP_BLOCK_EXTERNAL', true );
define( 'WP_ACCESSIBLE_HOSTS', '…' ); // List of whitelisted hosts
define( 'AUTOMATIC_UPDATER_DISABLED', true );
define( 'WP_AUTO_UPDATE_CORE', false );
// … additional functions for handling plugin and theme updates

Understanding the Code

This PHP code is designed to:

  1. Block external HTTP requests.
  2. Disable automatic updates for plugins, themes, and core WordPress updates.

Key Components of the Code:

  1. Blocking External HTTP Requests:
    • define( 'WP_HTTP_BLOCK_EXTERNAL', true );
    • This line blocks all external HTTP requests from your WordPress site, which can reduce load times and improve security by preventing unwanted data exchanges with external servers.
  2. Specifying Accessible Hosts:
    • define( 'WP_ACCESSIBLE_HOSTS', '...' );
    • Despite blocking external requests, you might still need to allow specific domains (like API servers or service providers). This line lists the allowed hosts, ensuring that your site can still communicate with essential external services.
  3. Disabling Automatic Updates:
    • define( 'AUTOMATIC_UPDATER_DISABLED', true );
    • define( 'WP_AUTO_UPDATE_CORE', false );
    • These settings turn off the automatic updater for WordPress, which can be crucial for sites where updates need to be controlled and tested in a staging environment before applying them to the live site.
  4. Custom Filters to Deny Plugin and Theme Updates:
    • add_filter( 'http_request_args', 'bt_deny_plugin_updates', 5, 2 );
    • add_filter( 'http_request_args', 'bt_deny_theme_updates', 5, 2 );
    • These filters intercept WordPress’s update checks for plugins and themes, ensuring that updates are not automatically applied. This is especially useful for customized themes or plugins where updates might overwrite custom code.

Complete Code

define( 'WP_HTTP_BLOCK_EXTERNAL', true );
define( 'WP_ACCESSIBLE_HOSTS', 'maps.googleapis.com,api.sendgrid.com,sendgrid.com,wp-rocket.me,api.cloudflare.com,search.google.com' );
define( 'AUTOMATIC_UPDATER_DISABLED', true );
define( 'WP_AUTO_UPDATE_CORE', false );
add_filter( 'http_request_args', 'bt_deny_plugin_updates', 5, 2 );
function bt_deny_plugin_updates( $r, $url )
{
    // Only touch requests to the plugin update-check endpoint (http or https).
    if ( false === strpos( $url, '://api.wordpress.org/plugins/update-check' ) ) {
        return $r;
    }

    $plugins = unserialize( $r['body']['plugins'] );
    if ( ! empty( $plugins->active ) ) {
        // Strip this file's plugin from the data sent to WordPress.org,
        // so no update is ever offered for it.
        unset(
            $plugins->plugins[ plugin_basename( __FILE__ ) ],
            $plugins->active[ array_search( plugin_basename( __FILE__ ), $plugins->active ) ]
        );
        $r['body']['plugins'] = serialize( $plugins );
    }
    return $r;
}
add_filter( 'http_request_args', 'bt_deny_theme_updates', 5, 2 );
function bt_deny_theme_updates( $r, $url )
{
    // Only touch requests to the theme update-check endpoint (http or https).
    if ( false === strpos( $url, '://api.wordpress.org/themes/update-check' ) ) {
        return $r;
    }

    // Strip the active parent and child theme from the update check.
    $themes = unserialize( $r['body']['themes'] );
    unset(
        $themes[ get_option( 'template' ) ],
        $themes[ get_option( 'stylesheet' ) ]
    );
    $r['body']['themes'] = serialize( $themes );

    return $r;
}

How to Implement

To implement this code:

  1. Add the four define() lines to your site’s wp-config.php file, above the “That’s all, stop editing!” comment, so they run before WordPress core loads.
  2. Add the add_filter() calls and their callback functions to your theme’s functions.php file or to a site-specific plugin.
  3. Modify the WP_ACCESSIBLE_HOSTS line to include or exclude domains based on your requirements.
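For clarity, the configuration half of the snippet belongs in wp-config.php, since constants must be defined before WordPress core loads; only the filter code belongs in functions.php or a plugin. A minimal wp-config.php fragment, with the host list shortened for illustration, might look like this:

```php
// wp-config.php — place above the "That's all, stop editing!" comment.

// Block all outgoing HTTP requests except to whitelisted hosts.
define( 'WP_HTTP_BLOCK_EXTERNAL', true );
// Comma-separated whitelist; trim this to the services your site really uses.
define( 'WP_ACCESSIBLE_HOSTS', 'maps.googleapis.com,api.sendgrid.com' );

// Turn off the automatic updater entirely.
define( 'AUTOMATIC_UPDATER_DISABLED', true );
define( 'WP_AUTO_UPDATE_CORE', false );
```

The add_filter() calls and their callbacks from the complete code above then go into functions.php or a site-specific plugin.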

Decreasing Page Load Times

  1. Blocking Unnecessary External HTTP Requests:
    • By using define( 'WP_HTTP_BLOCK_EXTERNAL', true );, the code blocks all external HTTP requests by default. External requests can significantly slow down your website because each request requires additional processing time and waiting for responses from external servers. By limiting these requests, your website can load faster, as it only processes essential internal requests.
  2. Controlled External Communication:
    • The WP_ACCESSIBLE_HOSTS line allows you to specify which external hosts your site can communicate with. This targeted approach means your site only connects to external services that are necessary for its functionality, like Google Maps or payment gateways. This selective connectivity reduces the overhead of handling multiple external requests, leading to faster page load times.
  3. Reducing Update Checks:
    • The custom filters for denying plugin and theme updates (bt_deny_plugin_updates and bt_deny_theme_updates) can indirectly impact page load times. By controlling update checks, you reduce the frequency of calls to WordPress.org servers, which, although not a major factor, can slightly improve backend performance.

Increasing Security

  1. Mitigating External Threats:
    • By blocking external HTTP requests, the code significantly reduces the surface area for attacks. Many security vulnerabilities arise from external interactions, such as API calls to compromised or malicious servers. By limiting these interactions, you reduce the risk of security breaches.
  2. Preventing Unauthorized Data Transfers:
    • Limiting external requests also means that your website is less likely to unknowingly participate in data transfers that could be harmful or unauthorized. This is particularly important in the context of data privacy and preventing data leaks.
  3. Control Over Updates:
    • Disabling automatic updates (plugins, themes, core updates) doesn’t inherently increase security but provides control over the update process. It allows you to vet and test updates in a staging environment first, ensuring they don’t introduce vulnerabilities or conflicts into your live site. However, it’s crucial to manually update regularly to avoid missing security patches.

Conclusion

Implementing this PHP code can significantly improve your WordPress site’s performance and security. By reducing external HTTP requests, you enhance page load times and minimize exposure to potential external threats. Additionally, controlling the update process ensures stability and prevents unintended conflicts or vulnerabilities. Remember, while these changes offer benefits, they require careful management and regular manual updates to maintain a secure and efficient WordPress environment.

Additional Tips:

  • Consider using a caching plugin to further boost speed.
  • Regularly review your plugins and themes to ensure they’re from trusted sources and up-to-date.
  • Implement strong security measures, such as using a firewall and keeping your WordPress installation updated.
Categories: WooCommerce, WordPress

WooCommerce: Automatically change order status based on the chosen shipping method.

If you run an e-commerce website on WooCommerce and use multiple shipping methods to ship your orders, you have probably felt the need for an easy way to filter orders by the shipping method the customer chose during checkout.

Out of the box, there is no way to filter orders by shipping method, but WooCommerce does provide the ability to filter by order status. We can leverage this feature using a small trick and a few lines of custom PHP code.

So here is the deal: we can define a few custom order statuses in WooCommerce (using code, or any plugin), then, with a few lines of PHP, programmatically move an order to a particular custom status based on the chosen shipping method. Sounds like a deal? Let’s do it.

add_action( 'woocommerce_thankyou', 'bt_update_status_woocommerce_thankyou', 10, 1 );
add_action( 'woocommerce_thankyou_cod', 'bt_update_status_woocommerce_thankyou', 10, 1 );
function bt_update_status_woocommerce_thankyou( $order_id ) {
    $order          = wc_get_order( $order_id );
    $order_statuses = wc_get_order_statuses();
    // Map each shipping method title to the target order status slug.
    $shipping_method_to_order_status_map = array(
        "shiprocket" => "wc-transporting_sr",
        "nimbuspost" => "wc-transporting_nb",
    );
    foreach ( $shipping_method_to_order_status_map as $shipping_method => $order_status ) {
        if ( strcasecmp( $order->get_shipping_method(), $shipping_method ) == 0 && isset( $order_statuses[ $order_status ] ) ) {
            // Only touch fresh orders; the thank-you page can be reloaded.
            if ( $order->has_status( 'processing' ) ) {
                $order->update_status( $order_status );
            }
        }
    }
}

Just change the $shipping_method_to_order_status_map array as per your requirements. It holds the mapping of shipping method to order status.

Shipping method names can be found by going to WooCommerce Settings -> Shipping -> Zone Name. Refer to this screenshot:

Just copy-paste the “Title” as the key of the $shipping_method_to_order_status_map array.

For the target order status, if you are using a custom order status, you can get the status names from the plugin that you might be using for creating order statuses. Refer to this screenshot:

Remember to prefix “wc-” to the slug of the custom status. For the one in the above screenshot, the status would be wc-transporting_sr. So the array will look like this:

$shipping_method_to_order_status_map = array(
		"shiprocket"=>"wc-transporting_sr"
	);
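If you would rather create the custom status in code instead of a plugin, a sketch along these lines should work (the slug transporting_sr and its label are just the example from the screenshot above; adjust them to your setup):

```php
// Register the custom order status with WordPress.
add_action( 'init', function () {
    register_post_status( 'wc-transporting_sr', array(
        'label'                     => 'Transporting (Shiprocket)',
        'public'                    => true,
        'show_in_admin_all_list'    => true,
        'show_in_admin_status_list' => true,
        // %s is replaced with the order count in the admin order list.
        'label_count'               => _n_noop(
            'Transporting (Shiprocket) <span class="count">(%s)</span>',
            'Transporting (Shiprocket) <span class="count">(%s)</span>'
        ),
    ) );
} );

// Make WooCommerce show the status in its order status dropdowns.
add_filter( 'wc_order_statuses', function ( $statuses ) {
    $statuses['wc-transporting_sr'] = 'Transporting (Shiprocket)';
    return $statuses;
} );
```

Status plugins do essentially the same thing behind the scenes, so either route works with the mapping code above.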

The code can be installed inside the functions.php file of your theme or by using a plugin like “Code Snippets”, as shown in this screenshot:

Now, for every new order, the order status will be automatically updated based on what the customer chose as the shipping method during checkout. Here are a few examples:

As a bonus, you can now easily filter orders by shipping method, right from the WooCommerce dashboard! See the tiny links below the “Add Order” button in the above screenshot! 🙂

We tested this code snippet on our development server and it appears to work fine on the current versions of WordPress and WooCommerce.

Has it worked for you? Has it not? Let me know in the comments!

Categories: Database, MariaDB, MySQL

MariaDB ERROR 1075 “Incorrect table definition; there can be only one auto column and it must be defined as a key”

Like many developers, I have also been struggling with the MySQL vs. MariaDB dilemma lately, and decided to stick with MariaDB wherever possible while keeping MySQL for legacy projects.

Recently my company (Bitss Techniques) partnered with Jelastic to provide a “Multi-Cloud Platform-as-a-Service”, Bitss Cloud, enabling developers and startups to implement scalable hosting with just a few clicks, and a lot cheaper than Amazon or other cloud providers. So we decided to move all VPS-hosted projects to Bitss Cloud so we can scale hardware resources, both horizontally and vertically, as and when required.

Almost all of our legacy projects’ databases were on MySQL, so we had a tough choice to make: keep using MySQL, or move them to MariaDB. One of the databases in particular had more than 15,000 tables, was about 4 GB in size, and growing! Anyway, we decided to move everything to MariaDB, and in no time we realized the magnitude of the mess we had gotten into.

First we tried to export the MySQL database using the mysqldump command like this:

mysqldump -u dbuser -p dbname > dbfile.sql

The above command usually works flawlessly, but this time mysqldump was incredibly slow. The CPU kept racing to 100% usage, yet only around 100 MB of data got exported into the SQL file! Even after spending hours researching this on Google and Stack Overflow, practically nothing helped speed up the export process. My intuition kept hinting that this had something to do with mysqldump being single-threaded, so I googled for a multithreaded way to dump a database, and that is when I came to know about ‘mysqlpump’. The next thing we tried was this:

mysqlpump -u dbuser -pdbpass dbname > dbfile.sql

mysqlpump unbelievably dumped 2.5 GB of data in merely 60 seconds! Thanks, Oracle, for the great utility!

After we got our MySQL dump, we zipped it up, copied it to the newly launched MariaDB instance (over NFS), extracted it, and then ran this command to import:

mysql -u dbuser -p dbname < dbfile.sql

Well, it looks like the cosmos was not in the mood to favor us. The import failed with this error:

Unknown collation: 'utf8mb4_0900_ai_ci'

Apparently, MariaDB does not support all MySQL collations, and manually changing them in a 2.5 GB SQL file was an impossible task. Linux has the excellent ‘sed’ utility to the rescue in situations like this. We ran this sed command:

sed -i 's/utf8mb4_0900_ai_ci/utf8mb4_general_ci/g' dbfile.sql

That command replaced all occurrences of ‘utf8mb4_0900_ai_ci’ with ‘utf8mb4_general_ci’. Nice and easy.

Well, not so easy. The import failed again with this error:

ERROR 1075 (42000) at line 863: Incorrect table definition; there can be only one auto column and it must be defined as a key

We had no idea what went wrong, so we decided to investigate further by looking at line 863 of the SQL file. But opening a file that large is impossible for most text editors; Linux’s ‘sed’ command came to the rescue again. This command prints a range of lines from a large file:

sed -n '863,900p' dbfile.sql
CREATE TABLE `db`.`wK4df1Ga1o_101_nf3_action_meta` (
`id` int NOT NULL AUTO_INCREMENT,
`parent_id` int NOT NULL,
`key` longtext NOT NULL,
`value` longtext,
`meta_key` longtext,
`meta_value` longtext
) ENGINE=InnoDB AUTO_INCREMENT=119 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_general_ci
;
-- insert into (.....)
ALTER TABLE `db`.`wK4df1Ga1o_101_nf3_action_meta` ADD UNIQUE KEY `id` (`id`);

The CREATE TABLE statement generated by mysqlpump was causing the problem. MariaDB requires an AUTO_INCREMENT column to be part of a key (unique or primary) at table-creation time. mysqlpump does set the auto-increment id column as a unique key later, using an ALTER TABLE statement, but the error is raised before that ALTER statement is ever executed.
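In other words, MariaDB would have been happy if the dump had declared the key inline; the table above would then read:

```sql
CREATE TABLE `db`.`wK4df1Ga1o_101_nf3_action_meta` (
`id` int NOT NULL AUTO_INCREMENT unique key,
`parent_id` int NOT NULL,
`key` longtext NOT NULL,
`value` longtext,
`meta_key` longtext,
`meta_value` longtext
) ENGINE=InnoDB AUTO_INCREMENT=119 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_general_ci;
```

Declaring the UNIQUE KEY directly on the column is equivalent to the separate ALTER TABLE, but it satisfies the check at creation time.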

We ran these two commands to edit the SQL so that the unique key is created inside the CREATE TABLE statement itself:

sed -i 's/`id` int NOT NULL AUTO_INCREMENT,/`id` int NOT NULL AUTO_INCREMENT unique key,/g' dbfile.sql
sed -i 's/` ADD UNIQUE KEY `id` (`id`);/`;/g' dbfile.sql

We got the same error a couple more times for columns of different data types; only a small edit to the above commands was needed, like this:

sed -i 's/`session_id` bigint unsigned NOT NULL AUTO_INCREMENT,/`session_id` bigint unsigned NOT NULL AUTO_INCREMENT unique key,/g' dbfile.sql
sed -i 's/` ADD UNIQUE KEY `session_id` (`session_id`);/`;/g' dbfile.sql
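Before running this kind of surgery on a real multi-gigabyte dump, the whole sequence can be rehearsed on a tiny stand-in file; here mini.sql plays the role of dbfile.sql:

```shell
# Build a miniature dump exhibiting both problems.
cat > mini.sql <<'EOF'
CREATE TABLE `t` (
`id` int NOT NULL AUTO_INCREMENT,
`v` longtext
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
ALTER TABLE `t` ADD UNIQUE KEY `id` (`id`);
EOF

# Swap the MySQL-8-only collation for one MariaDB knows.
sed -i 's/utf8mb4_0900_ai_ci/utf8mb4_general_ci/g' mini.sql

# Declare the unique key inline, then strip the now-redundant ALTER action.
sed -i 's/`id` int NOT NULL AUTO_INCREMENT,/`id` int NOT NULL AUTO_INCREMENT unique key,/g' mini.sql
sed -i 's/` ADD UNIQUE KEY `id` (`id`);/`;/g' mini.sql

cat mini.sql
```

Once the toy file comes out the way you expect, run the same sed commands against the real dump.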

Then we tried to import the database again with our fingers crossed. And… the import was successful! Hooray!

That’s how we imported MySQL database to MariaDB.

To summarize, we got two main errors:

  1. Unknown collation: 'utf8mb4_0900_ai_ci'.
  2. Incorrect table definition; there can be only one auto column and it must be defined as a key.

Somehow, mysqldump was unable to export the database, so we used mysqlpump, but its SQL queries were not welcomed by MariaDB. Editing the SQL file with the sed command helped resolve the issues.

I hope our experience helps someone fix their import issues.

Are you getting some other error while importing from MySQL to MariaDB? Do share in the comments and I’ll try to help.

Categories: Microsoft Dot Net On Linux

MS SQL Server on Linux <=> Management Studio (SSMS) on Windows (via ssh)

If you use MS SQL Server for your database needs, you probably also use SQL Server Management Studio (SSMS) for your day-to-day database management. SSMS is the common tool for managing MS SQL Server, whether you need to do DDL, DML, or DCL related tasks. While all these things can be done using plain T-SQL and the SQLCMD utility, SSMS does things beautifully, without us worrying about SQL syntax or falling into the complexities of queries.

MS SQL Server’s support for Linux was much awaited among its users. Most .Net Framework developers generally use MS SQL Server for the database, but the Windows-only support not only limited its usage but also increased hosting costs compared to Linux-based database servers.

Continue reading
Categories: Microsoft Dot Net On Linux

.Net Framework ❤ Linux – a beautiful love affair

Since the beginning of my days as a computer enthusiast, Linux has always fascinated me and motivated me to start my journey in computer technologies. One of the best things about Linux is that it has no limits. Almost all of the breakthrough technologies that we see today are directly or indirectly related to Linux! Whether it is Mac or Android, drones or IoT, Linux is everywhere.

Back in 2005, when computer systems and information technology were still evolving, ease of use and a user-friendly experience made Windows a very popular operating system. So by the time I got my B.Tech degree, I had developed a few experimental apps for myself and for college projects using the Microsoft .Net Framework, which was picking up a lot of buzz at that time.

It’s really amazing to be able to develop both standalone and web applications using the same set of concepts and tools, and the Microsoft .Net Framework does exactly that. So in no time, it became the source of my bread and butter.

In spite of being awesome, the .Net Framework was hiding a little shame within itself – ‘Linux’ 🙁

Over the past few years, several technologies like PHP and Python have evolved manifold. One of the primary reasons for their success is their support for Linux and an ‘open’ culture, at which the .Net Framework was not very good.

Continue reading
Categories: Computer Security

Ransomware attack? Here’s what can be done about it.

Ok, so we are here. Admit it or not, the scariest thing for a computer user is when you start your PC and suddenly all your files have been renamed to cryptic text, and there is a “ReadMe” file in every folder stating that your files have been encrypted and you need to deposit some bitcoins to get them back. Well, if you have encountered this, try remembering the last time you went wild on your PC (if you know what I mean ;-)) and regret it. That’s about all you can do if you’re hit by a ransomware attack. If there is no backup and the data meant a lot, you’re screwed. Literally.

Continue reading