I recently had an issue with frequent login attempts against one of my services. These were almost all from countries that should not be accessing my service. To resolve the issue, I implemented geo blocking with TCP Wrappers. This is how I went about geo blocking connections. Continue reading
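A minimal sketch of the technique (the allowlist, script name, and use of geoiplookup are illustrative assumptions, not my exact setup): TCP Wrappers hands the client address to a script via the `aclexec` option in `/etc/hosts.allow`, and the script admits only allowlisted country codes.

```shell
# /etc/hosts.allow would contain a line like (illustrative):
#   sshd : ALL : aclexec /usr/local/bin/geo-filter.sh %a
# The script exits 0 to allow the connection, non-zero to deny it.

ALLOW_COUNTRIES="US CA"          # assumed allowlist of country codes

allow_country() {                # return 0 if $1 is an allowed country code
  for c in $ALLOW_COUNTRIES; do
    [ "$c" = "$1" ] && return 0
  done
  return 1
}

# In the real script the code would come from a GeoIP lookup, e.g.:
#   CC=$(geoiplookup "$1" | sed -n 's/.*: \([A-Z][A-Z]\),.*/\1/p')
CC=US                            # stand-in for a lookup result
allow_country "$CC" && echo allowed || echo denied
```

The same `aclexec` line works for any wrapped service; denied connections are simply dropped before the daemon sees them.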
I’ve done a little tuning to my WordPress setup. In order to keep up to date, I’ve switched from the Ubuntu installation to a downloaded installation under /opt/wordpress. This is owned by my user and served by apache running as www-data. Updates are done using the sftp add-on.
I added myself to the www-data group. This allows apache to read any files with group access, but prevents writing if the web server is compromised.
I set the setgid (group sticky) bit on all the directories. If required, setting it on the wp-content/upgrade directory alone should be sufficient.
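The permission scheme can be sketched as follows; a temporary directory stands in for /opt/wordpress here so the commands are safe to run, and the exact modes are assumptions about a reasonable layout rather than my precise values.

```shell
# Demo tree standing in for /opt/wordpress (run the real commands on your install)
WP=$(mktemp -d)
mkdir -p "$WP/wp-content/upgrade"
touch "$WP/wp-config.php"

# chown -R youruser:www-data "$WP"         # owner = your user, group = apache's
find "$WP" -type d -exec chmod 2750 {} +   # setgid: new files inherit the group
find "$WP" -type f -exec chmod 640 {} +    # group may read, never write
chmod 2770 "$WP/wp-content/upgrade"        # the one directory updates must write to

stat -c '%a %n' "$WP/wp-content/upgrade"
```

With the setgid bit in place, anything the sftp add-on uploads keeps the www-data group, so apache can read it without ever owning it.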
I generated my key outside the home directory for www-data, which is /var/www. The directory I chose is not one I would expose through the web server. ssh requires a .ssh/known_hosts file in its home directory. This was created and the appropriate security added. The key is password protected.
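Creating the known_hosts file with tight permissions can be sketched like this; WWW_HOME stands in for /var/www (www-data’s home), and the ssh-keyscan call is left as a comment since it needs a reachable host.

```shell
WWW_HOME=$(mktemp -d)                  # stands in for /var/www, www-data's home
mkdir -m 700 "$WWW_HOME/.ssh"          # only the owner may enter the directory
: > "$WWW_HOME/.ssh/known_hosts"       # real setup: ssh-keyscan -H myhost >> this file
chmod 600 "$WWW_HOME/.ssh/known_hosts" # only the owner may read or write it
ls -la "$WWW_HOME/.ssh"
```

In the real setup the directory and file would also be chowned to www-data, since it is the web server process that makes the ssh connection.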
There are some outstanding issues. I’ll look into these as time permits.
The WordPress ssh2 module does not work on my server. I’ve found a couple of issues.
- Passwords on the key don’t work. This is a known issue with a work-around. The initial connection appears to fail, but a second call should resolve the issue.
- The is_dir function does not work; returning true for paths that end in a slash (/) is a workaround. This got me as far as trying to install. This may be a result of how the path is constructed, and there is a published workaround.
- The is_file function appears to fail, as WordPress reports the download contains no files. This is likely the same issue as for the is_dir function.
My modifications to the theme are getting a little old. The theme works reasonably well on mobile devices, but I would like to update to a more streamlined theme. The site statistics I have indicate a surprisingly high percentage of viewers use a mobile device.
I use eximstats to report my daily email traffic. I have a fairly high rate of rejections, and wanted hostnames listed in the rejection reports. To resolve this I developed a patch to capture the hostname related to the IP address, and add this data to the rejection reports.
The enhanced list saves me the effort of looking up IP addresses that appear repeatedly. Occasionally, these are from legitimate servers that have been misconfigured; DNS problems are often the cause. Continue reading
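The lookup the patch performs amounts to a reverse DNS query per rejected address. A rough shell equivalent (the address below is from the documentation range, so it resolves to nothing here):

```shell
ip=192.0.2.10                                   # illustrative rejected address
host=$(getent hosts "$ip" | awk '{print $2}')   # reverse lookup; empty if no PTR record
echo "${host:-unresolved} ($ip)"                # hostname (or a marker) plus the address
```

The patch does the equivalent inside eximstats, so the rejection report shows the hostname next to each IP instead of leaving the lookup to the reader.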
One nagging issue I had with IPv6 was how to distribute DNS server addresses and search lists to my clients. It took a little research to find the solution. On IPv4 I had been using DHCP to do this, but DHCP didn’t seem to be the right approach for IPv6.
radvd can be used to distribute both types of data. The following article covers setup on Ubuntu and OpenWRT. The Ubuntu (Debian) examples below should work with any distribution using /etc/radvd.conf to configure radvd. Continue reading
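A minimal /etc/radvd.conf carrying both a DNS server address (RDNSS) and a search list (DNSSL) might look like this; the interface name, prefix, and addresses are placeholders from the documentation ranges, not my real values.

```
interface eth0
{
    AdvSendAdvert on;
    prefix 2001:db8:1::/64
    {
        AdvOnLink on;
        AdvAutonomous on;
    };
    RDNSS 2001:db8:1::53       # recursive DNS server to advertise
    {
        AdvRDNSSLifetime 300;
    };
    DNSSL example.net          # DNS search list to advertise
    {
        AdvDNSSLLifetime 300;
    };
};
```

Clients need RDNSS/DNSSL support in their resolver stack to pick these up; older systems may still need a stateless DHCPv6 server alongside.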
My original intent in setting up BackupPC was to be able to back up my laptops. They mainly run Windows, and have a lot of shared files. Therefore I wanted a backup solution which handled de-duplication. BackupPC was just what I needed. I have already posted an article about Setting Up BackupPC on Ubuntu that includes setting up a server.
This article covers setting up BackupPC on Windows using rsyncd as the protocol. (I tried using Samba, but didn’t like the results with Windows Home editions.) This is done with an extremely minimal cygwin install available from the BackupPC site on SourceForge. The backups described here are not designed for bare metal recovery. They should include all the user’s files, and some of the configuration data for installed applications. Continue reading
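The cygwin side comes down to a small rsyncd.conf; the module name, paths, and user below are illustrative assumptions, not the exact files shipped in the BackupPC cygwin bundle.

```
# rsyncd.conf on the Windows client (paths and module name illustrative)
use chroot = false
strict modes = false

[cDrive]
    path = /cygdrive/c
    comment = C drive
    auth users = backuppc
    secrets file = c:/rsyncd/rsyncd.secrets
    read only = true
```

The secrets file holds a `backuppc:password` pair that the server’s per-host config must match; `read only = true` keeps a compromised server from writing to the client.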
I recently started to do regular backups of all my systems using BackupPC. It uses the rsync protocol to limit the amount of data transferred during backups. Once the initial backup is done, future backups only need to copy incremental changes. This requires far fewer resources than other software I have used.
This article covers setting up the server on Ubuntu and configuring backups for Ubuntu and OpenWRT. A future article will cover backing up Windows systems using an rsyncd daemon process. Continue reading
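On the server side, the transfer method is set per host in BackupPC’s config; the excerpt below is a sketch with assumed share names and the stock command template, not my exact file.

```
# /etc/backuppc/config.pl excerpt -- rsync over ssh for the Linux clients
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/etc', '/home'];
$Conf{RsyncClientCmd} = '$sshPath -q -x -l root $host $rsyncPath $argList+';

# The Windows clients in the follow-up article use the daemon instead:
# $Conf{XferMethod} = 'rsyncd';
```

Per-host overrides go in the host’s own .pl file, so one server can mix ssh-based rsync for Linux machines with rsyncd for Windows ones.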
While I was cleaning up my Ubuntu Email server configuration, I consolidated my login security. My SMTP server is Exim and my IMAP server is Dovecot. Mail User Agents (MUAs) use authentication over TLS encrypted connections to access IMAP and SMTP. Both programs had their own password configuration.
Exim includes Dovecot in its supported authentication mechanisms. This enables one authentication mechanism to be used for both SMTP and IMAP (or POP3). This post also includes configuration details for forced authentication over the Submission port. Continue reading
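The Exim side of the shared mechanism is an authenticator using the dovecot driver, pointed at Dovecot’s auth socket; the socket path and listener name vary by distribution and are assumptions here.

```
# Exim authenticator delegating password checks to Dovecot
dovecot_plain:
  driver = dovecot
  public_name = PLAIN
  server_socket = /var/run/dovecot/auth-client
  server_set_id = $auth1

# Dovecot (conf.d/10-master.conf): expose the matching socket
# service auth {
#   unix_listener auth-client {
#     mode = 0660
#     user = Debian-exim
#   }
# }
```

With this in place, passwords live only in Dovecot’s user database, and Exim never needs its own copy.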
We are quickly running out of IPv4 addresses. Are you ready for World IPv6 Day on June 8th, 2011? I have prepared my configuration on OpenWRT and Ubuntu. This includes configuring DNS using bind, email using Exim, and a Squid web proxy.
Having verified that I could establish IPv6 connectivity, I chose to improve my connectivity. This started with getting a tunnel from Hurricane Electric and updating my configuration. I then updated my bind server and Exim mail server to support IPv6 addresses. This posting updates and continues from my post on Implementing IPv6 6to4 on OpenWRT. Review it for information on creating a tunnel and running radvd on OpenWRT. Continue reading
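For bind, the update is a short change in the options block (shown here in isolation; the rest of named.conf is unchanged):

```
// named.conf options excerpt: accept queries over IPv6 as well as IPv4
options {
    listen-on-v6 { any; };
};
```

Exim needs a similar nudge, extending its listening interfaces (on Debian/Ubuntu, via `local_interfaces` or the dc_local_interfaces setting) to include `::0` alongside `0.0.0.0`.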
Recent reports indicate that spam is increasing again. I have been using Exim to filter spam for several years. Some recent tuning I have done has decreased the percentage of spam which reaches my spam filters. This article discusses the techniques used and provides implementation examples. Spambots tend to be simple programs which don’t handle slow servers very well. Using a greylist is an effective method of blocking them, as they usually don’t retry. My latest changes use delays to cause many spambots to abandon their attempt. Greylisting is used only for poorly configured servers that make it to the Recipient (RCPT) command.
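The delay technique is just a modifier in Exim’s ACLs. A sketch (the duration and the choice of the connect ACL are illustrative, not my production values):

```
# main configuration section
acl_smtp_connect = acl_check_connect

begin acl

acl_check_connect:
  # Pause before the SMTP greeting; impatient spambots disconnect
  # rather than wait, while real MTAs take the delay in stride.
  accept delay = 20s
```

Longer delays can be layered at later stages (HELO, MAIL) for hosts that look suspicious, reserving greylisting for the ones that survive all the way to RCPT.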
Over the holidays, a user experienced an attempted browser hijacking. It appeared to have bypassed my Squid proxy. My updated configuration now sends all web access via Squid: the old firewall rules that allowed direct access to the Internet have been replaced with a transparent proxy, running on my existing Squid server on another port. Continue reading
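The redirection can be sketched as one nat-table rule plus an interception port in squid.conf; the interface name and port numbers are assumptions, and older Squid versions spell the port option `transparent` rather than `intercept`.

```
# Firewall: push LAN port-80 traffic to the proxy's interception port
iptables -t nat -A PREROUTING -i br-lan -p tcp --dport 80 \
         -j REDIRECT --to-port 3129

# squid.conf: a second port in interception mode alongside the normal one
http_port 3128
http_port 3129 intercept
```

Explicitly configured browsers keep using 3128, while anything that tries to go direct is silently redirected through 3129, so no client-side change can bypass the proxy.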