Encryption on the web has a history of being notoriously expensive and difficult to implement. Not long ago, you could expect to pay upwards of $1,000 per year to buy your website's SSL certificate directly from a root certificate authority like Verisign. More recently, resellers such as GoDaddy came into play. As of this writing, one year of GoDaddy's SSL service costs $69.99 for a single website.
While I would have loved to have a "valid" certificate, for some of my purposes a snake oil certificate (a self-signed certificate) was secure enough. Since self-signed certificates aren't validated by a Certificate Authority (CA), modern browsers will try to prevent users from navigating to websites that use them (or any other non-validated certificate).
Now, since I created the certificates on my own server (I hope I can trust myself), I could safely ignore those warnings. This was useful for securing dev websites and code repository connections, but for anything public facing it wouldn't really be acceptable.
I tried an application called NetDrive with some success back in 2010, then a few years later I came across OwnCloud. This package was a decent competitor to Dropbox itself, so I decided to give it a shot. With anything involving logging in and transferring files, it's a good idea to use HTTPS, so I initially started with my usual self-signed certificate.
This gave me a few choices:
- Take a chance and go without encryption
- Use SFTP to handle the file transfers
- Search out cheap certificate authorities
This search eventually led me to StartCom's StartSSL, which offered free SSL certificates. It seemed almost too good to be true! Sure enough, I signed up, validated the domains I wanted to set up HTTPS for, and I was on my way. There weren't many extra steps to this process versus using a self-signed certificate; it's mostly just the fact that a third party validates your certificate. So after getting everything into the correct directories and updating my Apache config files, I was online.
Fast forward a couple of years, and there is a new player in town: Let's Encrypt. These guys caught my attention a few months ago on the tech news article circuit, and in a CommitStrip comic. You download a script onto your web server, run it, tell it which domains you want to set up HTTPS for… and you're done. I didn't believe it at all until I tried it for myself. One odd catch: the certificates it generates are good for 3 months instead of 12. It comes with the option to run nightly and renew them, so it's mostly a moot point. I haven't gotten to the point where any have expired yet, so I may have to add a follow-up.
Encryption on the web has come a long way, and it's almost at the point where it's just as easy to serve HTTPS as it is standard HTTP.
I’ve written some very basic URL rewriting scripts in the past. With only a few URL parameters this is pretty simple to accomplish with a few lines of code.
For example, I had written a basic CMS where URLs followed a specific pattern, using .htaccess scripting. The rewrite engine would convert the friendly URL into something Apache can read (a rough sketch follows the example):
- The URL bar would read something like: example.com/boats/listing/42
- The web server would read this as: example.com/index.php?page=listing&id=42
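Under the hood, that only takes one rewrite rule and a few lines of PHP. Here's a hypothetical reconstruction, since the original pattern and parameter names are long gone:

```php
<?php
// Hypothetical reconstruction; the real pattern and parameter names
// are long gone. The .htaccess rule tells Apache's rewrite engine to
// translate the friendly path into a query string before PHP runs:
//
//   RewriteEngine On
//   RewriteRule ^boats/([a-z]+)/([0-9]+)$ index.php?page=$1&id=$2 [L]
//
// index.php then reads the parameters exactly as if the "ugly" URL
// had been requested directly.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$id   = isset($_GET['id'])   ? (int) $_GET['id'] : 0;

echo "Rendering page '" . htmlspecialchars($page) . "' (item {$id})";
```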
This is fine enough for basic URL patterns, but what happens when the URL tree is a little more complicated? Starting in Joomla! version 1.5, they included a built-in router that can be created on a component level.
Used Boats Ahoy! was created before this was available, so I continued to use the JoomSEF module from part one of this article. Fortunately, the scripting required for JoomSEF to work properly with a component follows a number of the same rules the built-in 1.5 router uses. Unfortunately, JoomSEF only required instructions to generate the URLs, not to return a URL to a form the web server understands, which is the much more difficult half. Also, moving away from the original method meant a number of the old URLs would no longer be valid, which requires an extra layer of error handling.
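For comparison, a Joomla! 1.5 component router boils down to a pair of functions in the component's router.php. This is only a simplified sketch (com_boats, its view, and its parameters are made up), but it shows why the two directions aren't equally hard:

```php
<?php
// Simplified sketch of a Joomla! 1.5 component router (router.php).
// "com_boats", its view names, and parameters are hypothetical.

// The easy half: turn a query array into friendly URL segments.
function BoatsBuildRoute(&$query)
{
    $segments = array();
    if (isset($query['view'])) {
        $segments[] = $query['view'];     // e.g. "listing"
        unset($query['view']);
    }
    if (isset($query['id'])) {
        $segments[] = $query['id'];       // e.g. "42"
        unset($query['id']);
    }
    return $segments;                     // -> /boats/listing/42
}

// The hard half: turn segments back into the variables the component
// expects, for every URL shape the site has ever produced.
function BoatsParseRoute($segments)
{
    $vars = array();
    if (count($segments)) {
        $vars['view'] = array_shift($segments);
    }
    if (count($segments)) {
        $vars['id'] = (int) array_shift($segments);
    }
    return $vars;                         // -> view=listing&id=42
}
```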
URL rewriting is a process I've been using on a number of websites for a long time. Modern CMS solutions have various built-in functions and plugins to make this simple for anything you might download and try out of the box. What's under the hood is much less straightforward.
In older versions of Joomla!, my go-to was a plugin called JoomSEF. If all of the links are formatted correctly, this plugin converts standard PHP links into more user-friendly ones. What is happening under the hood? When the page loads, all of the links formatted in a way Joomla! understands are read, then matched against the database of user-friendly URLs that have already been generated. If a match is found, the user-friendly URL is displayed on the page. If not, the plugin generates one and inserts the record into the database.
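A simplified sketch of that lookup-or-generate cycle; the table, columns, and slug logic here are hypothetical stand-ins for JoomSEF's real schema:

```php
<?php
// Hypothetical stand-in for JoomSEF's lookup-or-generate cycle; the
// real plugin's schema and slug rules are far more involved.
function sefUrl(PDO $db, $rawUrl)
{
    // 1. Match the raw Joomla! link against the stored friendly URLs.
    $stmt = $db->prepare('SELECT sef_url FROM sef_urls WHERE raw_url = ? LIMIT 1');
    $stmt->execute(array($rawUrl));
    $sef = $stmt->fetchColumn();
    if ($sef !== false) {
        return $sef;   // match found: display the stored friendly URL
    }

    // 2. No match: generate a friendly URL from the query string
    //    and record it for next time.
    $queryString = parse_url($rawUrl, PHP_URL_QUERY);
    parse_str($queryString ? $queryString : '', $params);
    $sef = implode('/', array_map('urlencode', array_values($params)));

    $db->prepare('INSERT INTO sef_urls (raw_url, sef_url) VALUES (?, ?)')
       ->execute(array($rawUrl, $sef));
    return $sef;
}
```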
While this process is fine for websites with a small to medium number of links, any kind of text matching in the database is a very resource-intensive process. One of my websites eventually had to be moved to a VPS because of this. By the time it was deactivated, this website alone had 1,872,945 records to search through, in a table reaching upwards of a gigabyte. I needed to actively prune unused records every so often by checking which URLs hadn't been used.
In a previous post (Overdue Updates) I mentioned working through some issues to help increase speed. Beyond all the minor tweaks, I found the root causes of the speed issues, which were two-fold:
- Web Service Calls
- Generating SEO Friendly URLs
Thankfully, there was a way to help alleviate both problems with one solution: caching. I had built caching functions into Used Boats Ahoy!, but due to feed changes, they were not working exactly as intended.
- Original Process (No Cache)
- Page Request►Web Service Call►Process Feed►Display
- Original Process (With Cache)
- Page Request►Read Cache►Web Service Call*►Write Cache*►Process Data►Display Data
- New Process
- Page Request►Read Cache►Web Service Call*►Process Data*►Write Cache*►Display Data
* If needed
With this new process, the server only has to call the web service and process the majority of the data if the cache is out of date or doesn't exist yet. Many pages that previously took 2 seconds to load now take 0.3 seconds when served from the cache.
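A minimal sketch of the new flow, assuming a file-based cache with a one-hour lifetime (the real site's cache storage, feed endpoint, and processing code all differ). The key change from the old flow is that the processed data gets cached, not the raw feed:

```php
<?php
// Minimal sketch of the new flow. The file-based cache, one-hour
// lifetime, feed endpoint, and processFeed() are all assumptions.
define('CACHE_TTL', 3600);

function processFeed($rawFeed)
{
    // Stand-in for the expensive parsing/formatting step.
    return array('listings' => $rawFeed);
}

function getListings($cacheFile)
{
    // Read Cache: a fresh cache entry skips everything below.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < CACHE_TTL) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Web Service Call* and Process Data*: only when the cache is
    // stale or missing. The old flow reprocessed the feed on every
    // request, even on a cache hit.
    $rawFeed = file_get_contents('http://example.com/boat-feed');
    $data    = processFeed($rawFeed);

    // Write Cache*, then Display Data.
    file_put_contents($cacheFile, serialize($data));
    return $data;
}
```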
This has happened to me a couple of times; still more than I would like to admit. First off, I used to have my Apache logs named the same as the website URL ("bodhidevelopment.com"). This is a pretty minor problem and can go unnoticed for years… until your server runs out of storage space. Apache2 log rotation (handled by logrotate) is a nice little feature, but the default configuration will only look for files with the ".log" extension. I used to regularly have log files over a gigabyte, and I can only imagine what kind of performance issues that might cause.
I never really paid attention to PHP warnings/errors unless they were causing serious problems, but cleaning them up felt like the next logical step after getting the logs in order. A large portion of the warnings were just undefined variables being used, which is extremely easy to fix, and they can add up fast if there are a large number of variables/loops on a page. This gave me the opportunity to look through code that hadn't been touched for years.
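The typical fix looked something like this (the variable names are invented, but the pattern was everywhere):

```php
<?php
// Invented variable names, but this was the pattern all over the
// older code.
$boats = array(array('price' => 100), array('price' => 250), array());

// Before: $total is never initialized, so PHP logs
// "Notice: Undefined variable: total" and the notices pile up in
// the error log on every request:
//
//   foreach ($boats as $boat) {
//       $total += $boat['price'];
//   }

// After: initialize before use and guard optional array keys.
$total = 0;
foreach ($boats as $boat) {
    $total += isset($boat['price']) ? $boat['price'] : 0;
}
echo $total; // 350
```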
Over the last few weeks I have been performing some much needed maintenance on the Used Boats Ahoy! website. I originally had much more ambitious plans for this website, but I decided to start with an interim solution.
Used Boats Ahoy! is built on top of Joomla 1.5, which is now several versions obsolete. I began researching how to bring it up to the current version, but unfortunately there are too many ties to deprecated APIs to upgrade. This led me to start a fresh branch which will eventually replace the current project. For the time being, I decided updating the current project made the most sense.
- SEO updates to URL rewriting
- Update link exchange pages
- PageSpeed improvements
- Google Adsense script loading asynchronously
- Facebook API loading asynchronously
- Boat Search “Modify Search” populates search criteria
- Formatting fixes
This was one of the languages I was never very keen on working with, but I will not pass up a good opportunity. I'm in an interesting position where I am working on a standalone project which is loosely coupled with other related projects. This means I'll be using shared libraries and standards, but I have to read through the other projects to learn how to use them.
In a previous post, I’ve been using Google’s PageSpeed Insights for a while to help optimize the page loads of my websites for a while now, but I’ve never gone so far as installing the Apache mod_pagespeed plugin. But after trying a few different Joomla extensions, I figured this one might be more efficient by optimizing directly through the Apache layer.
mod_pagespeed is an open-source Apache module that automatically optimizes web pages and resources on them. Optimization is done by rewriting the resources using filters that implement web performance best practices. Webmasters and web developers can use mod_pagespeed to improve the performance of their web pages when serving content with the Apache HTTP Server.
After having good luck with this plugin on Used Boats Ahoy!, I enabled it on my other websites. Below are the changes in PageSpeed score.
- Bodhi Development: 70 to 92
- Stat Addict: 74 to 90
- Washington State Used Boats: 85 to 92
- Quality Used Boats: 88 to 93
- Bodhi Sanctum: 72 to 76
- Used Boats Ahoy: 88 to 94
I’ve been hosting a VPS through Linode for the last couple of years, and I’ve have a great experience so far. A VPS provides the full root server experience without having to worry about the hardware. I was going through my server configurations, and realized I was running a deprecated Linux Kernal 2.6. A host like Linode makes it extremely easy to change kernels on a VPS with a simple web interface with a dropdown menu to select from a prepolulated list of kernels. From an issue in May of last year, I was recommended to use a more stable version (188.8.131.52) which was still selected. I’m hoping any issues I experienced have been resolved since then.
While there were no significant changes beyond plenty of random fixes and driver updates, bumping the revision to 3.0 was a huge milestone for Linus and Linux. As of today, Linux 3.4 has been released, and I hope to look into taking that additional leap as well.