Category Archives: Blog

Pressing Pause on Work

The French legislation signed off in May 2016, and in effect as of 1 January 2017, will be studied closely by most other countries in the next few years. Part of the package of law changes (which also included changes to let employers more easily dismiss staff) requires companies to define a time when their staff can effectively disconnect from work email.

Almost all companies have been rushing to adopt a “mobile first” approach to their business, mostly to catch up with their customers, who now use mobiles more than any other device. The flow-on effect has been to try the same with their own workforce, and for good reason: give your staff the right information at the right time so they can better serve your customers and improve their experience.

But email, the bane of many people’s lives, was always the first and simplest product to get people to use. Away from your desk, in a meeting, on the train, and of course at home long after work hours finished. At many companies there has been a growing expectation that emails are almost like text messages: something that needs a prompt, if not immediate, response. But email just isn’t that medium, and that expectation is misguided at any company that respects and cares about its staff. Some of this is definitely a cultural shift, perhaps with younger employees moving away from email and not having that old mental connection of email to “snail mail” – something that takes time.

Research on the subject of stress levels vs email (a topic I’m sure you’re familiar with) has found that the more you check your email, the higher your stress levels become. If you can’t disconnect and separate your work time from your home/play time, your mental health will likely suffer, to the detriment of one or both.

I work a lot with mobile technology, trying to ensure people have the right tools for what they need to do, but I definitely see the advantage of changing expectations around working after hours. I hope the French law changes provide a measurable improvement in the health of those they affect, and that more companies choose to do the same and combine them with similar work environment updates for the “modern age” (whatever that means these days). Work from home if you can, interact face-to-face with the groups you need to, but when you’re done for the day, press pause on the work side of your life.

WordPress Permalink 404 with HTTPS

The time had come to switch this blog to HTTPS given the ease and cost ($0) of deploying certificates from Let’s Encrypt. That was easily done under Apache: create a new conf file for the SSL site in /etc/apache2/sites-available, update the old conf for the non-SSL site to redirect, then request a new cert using certbot-auto -d mike.mcmurray.co.nz --apache. WP handled that just fine, but only the admin pages and the main home page displayed as expected; other pages were just a 404.

So I made the .htaccess file writable by WP and re-saved the permalink rules from the WP admin console so the file would be updated. Nope, still the same.
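For reference, the permalink rules WordPress writes to .htaccess are the standard mod_rewrite block, so if the file stays empty after saving permalinks you can add them yourself:

```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```

These rules send any request that isn’t a real file or directory to index.php, which is how WP resolves pretty permalinks.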

The rewrite rules themselves weren’t the issue; they just weren’t being allowed to work. The new conf file for the SSL site needs to let .htaccess override the more secure defaults, via AllowOverride All. So this needs to be in the SSL configuration file – note this is a sub-section, not the whole thing.

<VirtualHost _default_:443>
     ServerAdmin admin@yoursite.com
     ServerName blog.yoursite.com
     ServerAlias blog.yoursite.com
     DocumentRoot /var/www/html/blog

     ErrorLog ${APACHE_LOG_DIR}/error.log
     CustomLog ${APACHE_LOG_DIR}/access.log combined

     <Directory /var/www/html/blog/>
         Options FollowSymLinks
         AllowOverride All
         Order allow,deny
         Allow from all
     </Directory>

     # SSL Engine Switch:
     # Enable/Disable SSL for this virtual host.
     SSLEngine on
     ...

</VirtualHost>
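For completeness, the old non-SSL conf can be cut down to a bare redirect – a sketch, with the hostname to be replaced with your own:

```apache
<VirtualHost *:80>
    ServerName blog.yoursite.com
    # Send everything to the HTTPS site
    Redirect permanent / https://blog.yoursite.com/
</VirtualHost>
```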

 

Allowing SSH Key Based Logins from Another System

I have a Digital Ocean server that I SSH into from my laptop, mainly for development purposes. But I also want to do scheduled downloads of the server backups from a server at home, so I need to SSH from a new machine to my server with no user prompt. Easy, except it always prompts me for a pass phrase, and I have multiple keys in use on my home server.

While you could just copy your private keys from Client1 to Client2 in order to do this, it’s not a great thing to be doing security-wise. So let’s just not do that.

What you need to do is create a new key pair on Client2 (actually my home server) with,

ssh-keygen

When prompted, make sure you tell it to use a new key file if you have existing keys; if you don’t, it’ll overwrite your old ones and you’ll be testing your recovery process. When prompted for a pass phrase, just leave it blank and hit Enter. While a pass phrase would be more secure, I want this SSH connection to run automatically as part of a crontab job, so no one would be there to enter a pass phrase anyway.
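If you’d rather skip the prompts entirely, ssh-keygen can take the file name and the (empty) pass phrase as options – a sketch, using the same key file name as below:

```shell
# Generate a 4096-bit RSA key pair in a separate file (-f) with an
# empty pass phrase (-N ""), quietly and with no prompts (-q).
ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa_2 -N "" -q
```

This creates ~/.ssh/id_rsa_2 and ~/.ssh/id_rsa_2.pub without touching any existing key pair.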

So now we have a fresh keypair on Client2, say in a file called id_rsa_2. We need to get the public key id_rsa_2.pub to our remote server so it’ll trust us when we connect. We do that with a simple copy and append command,

cat ~/.ssh/id_rsa_2.pub | ssh <your-user>@<your-server> "cat >> ~/.ssh/authorized_keys"

When you run that command you’ll be prompted for your password as normal, since we’re still in the process of loading up the keys.

Now we have a new key pair and have copied the public key to the remote server so it trusts us when we connect. But if Client2 has multiple key pairs in use (i.e. we had to use id_rsa_2 as otherwise we would have overwritten existing keys), how does SSH on Client2 know which keys to use? By default it’ll always use the first key pair and not our new one.

The simple solution is to create a config file in Client2 called ~/.ssh/config and define a Host and which keys to use.

Host <your-server>
IdentityFile ~/.ssh/id_rsa_2

Now you should be able to SSH from your second machine to your remote server and, because it’s using the new keys, not have to enter a password.
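With that in place the scheduled download becomes a one-line crontab entry. A sketch – the user, host and paths here are placeholders for your own:

```shell
# Pull the latest server backup every night at 2am; the pass
# phrase-less key means no one needs to be at the keyboard.
0 2 * * * scp <your-user>@<your-server>:/var/backups/latest.tar.gz /home/<your-user>/backups/
```

Because ~/.ssh/config names the host, scp picks up id_rsa_2 automatically; without the config entry you’d add -i ~/.ssh/id_rsa_2 to the command.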

Geo-blocked Content and Business Models

The internet has changed the world we live in dramatically in the last 10 years. That’s a fact no one would dispute. But many businesses continue to ignore some of the associated changes that this global connectivity has brought. The borders of countries no longer matter to data; those of us with connectivity can share anything we like.

A business that started on the internet should know what this new world looks like, and so should the older content businesses – they’ve had their chance to evolve. Newspapers are very different in many countries now; no longer are they part of the morning ritual, and no longer do advertisers queue at their door ready to put up with what was typically a poor experience (ever tried to place a classified ad?).

TV broadcasters are now where newspapers were five or more years ago, and most are acting to embrace, rather than fight, the new technology. “On demand” web sites from broadcasters in NZ now often show new content before it is delivered over the air to TV sets. They realise that people can and will get the same content from other sources if they don’t, and that people want to watch on their own schedules.

The power has shifted away from the broadcasters to the content owners. If people are happy to stream content when they want it, they often care little for who is providing it. Why are we tied to a broadcaster who simply takes the video, inserts their own ads and then pushes play? As they face this issue they stick to their business model and protect it by forcing their consumers to jump through ever smaller and more restrictive hoops. Want to view this video or listen to this song? Sorry, not in your country.

Because of the internet, the technology to work around these restrictions is fairly easy for many people to employ. VPNs and DNS configurations provide ways to subvert geo-blocking, and are being “consumerised” as apps that Mum and Dad can download and use. Technical changes and smart people will work around what the other tech and smart people create, until we get where we are now: legal threats.

Digital property needs to be recognised as being different from physical property. Theft does not harm the owner in the same way that stealing money or your car does. Yes, consumers should recognise someone’s work and effort and reward them, but consumers also shouldn’t be punished with huge fines due to the loss of a $5 movie rental.

We can’t undo the internet, it’s here to stay and we have to work out a way for quality products to fit into this new world. The new broadcasters (Netflix, Neo, Lightbox, etc) should accept they will all have very similar content and they need to provide the service on top of that to keep customers – not threaten them and split them up by location.

If we can’t work it out, then we might look back on this period and think the internet put a severe dent in human culture because everyone was chasing the money.

Doc5 Wiki Available for Download

Slightly behind with this post but I finally have a new release of Doc5 available for download.

New features include,

  • Full WYSIWYG editing and no more trying to get used to the markup. (Not that it was difficult, but people are used to rich editors these days)
  • Complete redesign of the UI.
    Bootstrap makes for an easy-to-use, clean interface and I really like the design anyway.
  • Easier-to-use, more finely grained permissions.
    Per user permissions for categories and pages and inheritance for pages.
  • Much better file management and easier to link files into pages.
  • Bug fixes and support for different databases with faster access.
  • HTML email templates.
    This will make it easier to extend and handle language translations in the future.

Change of Host = Change of Performance

As per the previous posts I’ve moved hosts over the last week and now I think everything is across. While having a quick look at the Google Web Developer Tools to check for errors I also see the following little chart indicating the time it takes Google to fetch a page from this sub-domain and WordPress site.

[Chart: time for Google to fetch a page from this site, showing the drop after the move]

As you can see, the last week shows a very clear decrease in the time to download a page. As far as I can tell the only thing to change has been the provider, and as part of that the underlying web server. The previous host uses Nginx, and I now run what is probably a fairly default LAMP stack. I’m going to assume they can tweak the hell out of that Nginx config for all their customers, but it just shows that cheap do-it-yourself servers on SSD (perhaps the key here) can definitely perform.

So after this post I’ll wait for the load to increase and eat my words later. 🙂

Permissions Problems with git pull

I’ve started working on Doc5 from a laptop in the last few months and have begun the pull/push process to get my Bitbucket repo and desktop machine all in sync. But when trying to get these sorted I found permissions problems on one of the local repos. When I tried to do a pull I had about eight files that either couldn’t be unlinked or couldn’t be created.

If I looked at the permissions on the files, I was the owner, www-data (Apache in Ubuntu) was the group, and the permissions were 644 on the files and 755 on the directories in my project folder. So that all seemed fine.

But what you need to watch for is the extra permission a process needs in order to unlink: write access on the containing directory, not just on the file. What git is doing is taking these files away and then replacing them in the folder, i.e. it’s not just a modification through a write action to the file.
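In practice that means checking the directory bits, not the file bits. A sketch, with ~/projects/doc5 standing in for the real repo path:

```shell
# Unlinking a file needs write + execute permission on the directory
# that contains it; the file's own mode is irrelevant to the unlink.
ls -ld ~/projects/doc5
# Make sure the owning user can write to every directory in the tree:
find ~/projects/doc5 -type d -exec chmod u+w {} +
```

After that, git pull can remove and recreate the files it needs to.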

Doc5 Beta Now in Testing

After many years I’ve got a version of Doc5 up and available to use. It’s a vastly different wiki app from the previous version, and most of the changes have been made in the last 9 months. The last version released for download was under a different name and appeared before my son was born. He starts school in two weeks.

My 9-5 job takes up enough time that for a couple of years I left this project alone and considered dropping any thoughts of pushing it out. But writing web apps is my hobby, so it’s been good to dig through all the old code and clean it up.

So a list of the major changes:

  • WYSIWYG editing has arrived and the previous wiki engine is gone
  • Permissions have been simplified but also extended to categories
  • Full UI makeover, although I have gone with a pretty basic Bootstrap view of things
  • Better file uploads and management
  • Templates for email notifications

I think a full release should be available for download in the next two months. Testing on the web will help tune spam catching and there’s some bug fixing to roll out as well as plenty of test cases to run.

Running Your Own Web Server

After nudging the storage allowance on our web host a few times in the last month, I’ve started setting up the same sites on Digital Ocean. Partly to get the 20GB of SSD storage that resolves the current constraint but also to have a play with managing it all myself and working through the automated provisioning experience.

After a day of configuration and following the odd tutorial (admittedly Digital Ocean has some very good content in its help system on how to set up a myriad of different apps and OSes), I can say it really is much easier just going for the basic “pay for simple access to a shared web server” option if you can. Even WordPress is a bit of a pain in the arse to tweak for the proper security permissions while still allowing uploads and automatic plugin installs. In my case I need the LAMP stack, or at least the file system and web server, accessible, but if you don’t, take a minute to think about why you can’t just use wordpress.com.
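For what it’s worth, the compromise I’d sketch (assuming Apache runs as www-data and WordPress lives in /var/www/html/blog – both assumptions, and <your-user> is a placeholder) is to own the files yourself and give the web server group write access only under wp-content:

```shell
# You own everything; the web server group can read it all...
sudo chown -R <your-user>:www-data /var/www/html/blog
sudo find /var/www/html/blog -type d -exec chmod 755 {} +
sudo find /var/www/html/blog -type f -exec chmod 644 {} +
# ...but can only write where uploads and plugins land.
sudo chmod -R g+w /var/www/html/blog/wp-content
```

That keeps the core files read-only to Apache while media uploads and plugin installs still work.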

Our old host also had email set up as part of the service, with unlimited mail accounts, etc. Again, with the new option and managing it all myself, I’m not overly looking forward to the fiddling to get Postfix and friends working properly. It’d be great if Google Apps had a much lower price for their per-user mailbox hosting option.

Anyway, so far, so good as this blog is the first of the old content to move across to Digital Ocean.

CrashPlan Fails to Start on Linux Mint

I’ve just reinstalled Linux again after another Windows 8 attempt (this time 8.1) and I’m trying Linux Mint 17. One of the key apps I install is CrashPlan, and on the 64-bit version of this OS the desktop app component fails to start.

The answer is to append an option to one of the start script lines as below. It took me a few attempts to access the support page for this, so I’m posting it here too.

Source: http://support.code42.com/CrashPlan/Latest/Troubleshooting/CrashPlan_Client_Closes_In_Some_Linux_Installations

  1. Edit the run.conf file in your CrashPlan app installation directory
    Default location: /usr/local/crashplan/bin/
  2. Navigate to the end of the GUI_JAVA_OPTS line
  3. Add this option, inside the quotes:
    -Dorg.eclipse.swt.browser.DefaultType=mozilla
    Example:
SRV_JAVA_OPTS="-Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=CrashPlan -Xms20m -Xmx512m -Djava.net.preferIPv4Stack=true -Dsun.net.inetaddr.ttl=300 -Dnetworkaddress.cache.ttl=300 -Dsun.net.inetaddr.negative.ttl=0 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false"
GUI_JAVA_OPTS="-Dfile.encoding=UTF-8 -Dapp=CrashPlanDesktop -DappBaseName=CrashPlan -Xms20m -Xmx512m -Djava.net.preferIPv4Stack=true -Dsun.net.inetaddr.ttl=300 -Dnetworkaddress.cache.ttl=300 -Dsun.net.inetaddr.negative.ttl=0 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false -Dorg.eclipse.swt.browser.DefaultType=mozilla"
  4. Save the file
    The CrashPlan app should now launch properly
Important Note: If you uninstall and reinstall the CrashPlan app, or the app automatically upgrades to a new version, the run.conf file is overwritten and this issue will reoccur. Update the run.conf file as described above to correct it.