Allowing SSH Key Based Logins from Another System

I have a Digital Ocean server that I SSH into from my laptop, mainly for development. But I also want to run scheduled downloads of the server backups from a server at home, which means SSHing from a new machine to my server with no user prompt. Easy, except it always prompts me for a pass phrase, and I have multiple keys in use on my home server.

While you could just copy your private keys from Client1 to Client2 in order to do this, it’s not a great thing to be doing security-wise. So let’s just not do that.

What you need to do is create a new key pair on Client2 (actually my home server) with,

ssh-keygen
When prompted, make sure you tell it to use a new key file if you have existing keys. If you don’t do that it’ll overwrite your old ones and you’ll be testing your recovery process. When prompted for a pass phrase, just leave it blank and hit Enter. While a pass phrase would be more secure, I want to use this SSH connection to automatically connect as part of a crontab job. So no one will be able to enter a pass phrase anyway.

So now we have a fresh keypair on Client2, say in a file called id_rsa_2. We need to get the public key to our remote server so it’ll trust us when we connect. We do that with a simple copy and append command,

cat ~/.ssh/id_rsa_2.pub | ssh <your-user>@<your-server> "cat >> ~/.ssh/authorized_keys"

When you run that command you'll be prompted for your password as normal, since we're still in the middle of loading up the keys.
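As an aside, if your system has ssh-copy-id available it does the same job in one step (using the same key file name as above):

ssh-copy-id -i ~/.ssh/id_rsa_2.pub <your-user>@<your-server>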

Now we have a new key pair and have copied the public key to the remote server so it trusts us when we connect. But if Client2 has multiple key pairs in use (i.e. we had to use id_rsa_2 as otherwise we would have overwritten existing keys), how does SSH on Client2 know which keys to use? By default it’ll always use the first key pair and not our new one.

The simple solution is to create a config file on Client2 called ~/.ssh/config and define a Host entry saying which key to use.

Host <your-server>
IdentityFile ~/.ssh/id_rsa_2

Now you should be able to SSH from your second machine to your remote server using the new keys, and not have to enter a password.
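To close the loop on the original goal, the scheduled backup download then becomes a simple crontab entry on Client2. A minimal sketch, where the backup path and the 2:30am schedule are just placeholders for your own values:

30 2 * * * scp <your-user>@<your-server>:/var/backups/backup.tar.gz /home/<your-user>/backups/

Because the key has no pass phrase and the config file selects it for this host, scp runs with no prompts at all.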

Geo-blocked Content and Business Models

The internet has changed the world we live in dramatically in the last 10 years, a fact that no one would dispute. But many businesses are continuing to ignore some of the changes that this global connectivity has brought. The borders of countries no longer matter to data: those of us with connectivity can share anything we like.

A business that started on the internet should know what this new world looks like, and so should the older content businesses – they've had their chance to evolve. Newspapers are very different in many countries now: no longer are they part of the morning ritual, and no longer do advertisers queue at their door, ready to put up with what was typically a poor experience (ever tried to place a classified ad?).

TV broadcasters are now where newspapers were five or more years ago, and most are acting to embrace, rather than fight, the new technology. "On demand" web sites from broadcasters in NZ now often show new content before it is delivered over the air to TV sets. They realise that people can and will get the same content from other sources if they don't, and that people want to watch on their own schedules.

The power has shifted away from the broadcasters to the content owners. If people are happy to stream content when they want, they often care little about who is providing it. Why are we tied to a broadcaster who simply takes the video, inserts their own ads and then pushes play? As they face this issue they stick to their business model and protect it by forcing their consumers to jump through ever smaller and more restrictive hoops. Want to view this video or listen to this song? Sorry, not in your country.

Because of the internet, the technology to work around these restrictions is fairly easy for many people to employ. VPNs and DNS configurations provide ways to subvert the geo-blocking restrictions, and are being "consumerised" as apps that Mum and Dad can download and use. Technical changes and smart people will work around what the other tech and smart people create, until we get to where we are now: legal threats.

Digital property needs to be recognised as being different from physical property. Theft does not harm the owner in the same way that stealing money or your car does. Yes, consumers should recognise someone’s work and effort and reward them, but consumers also shouldn’t be punished with huge fines due to the loss of a $5 movie rental.

We can’t undo the internet, it’s here to stay and we have to work out a way for quality products to fit into this new world. The new broadcasters (Netflix, Neo, Lightbox, etc) should accept they will all have very similar content and they need to provide the service on top of that to keep customers – not threaten them and split them up by location.

If we can’t work it out, then we might look back on this period and think the internet put a severe dent in human culture because everyone was chasing the money.

Doc5 Wiki Available for Download

Slightly behind with this post but I finally have a new release of Doc5 available for download.

New features include,

  • Full WYSIWYG editing and no more trying to get used to the markup. (Not that it was difficult, but people are used to rich editors these days.)
  • Complete redesign of the UI.
    Bootstrap makes for an easy to use, clean interface and I really like the design anyway.
  • Easier to use, more finely grained permissions.
    Per-user permissions for categories and pages, and inheritance for pages.
  • Much better file management and easier to link files into pages.
  • Bug fixes and support for different databases with faster access.
  • HTML email templates.
    This will make it easier to extend and handle language translations in the future.

Change of Host = Change of Performance

As per the previous posts, I've moved hosts over the last week and I think everything is now across. While having a quick look at Google Webmaster Tools to check for errors, I also noticed the following little chart showing the time it takes Google to fetch a page from this sub-domain and WordPress site.


As you can see, the last week shows a very clear decrease in the time to download a page. As far as I can tell the only thing to change has been the provider, and as part of that the underlying web server. The previous host uses Nginx, and now I run what is probably a fairly default LAMP stack. I'm going to assume they can tweak the hell out of the Nginx config they share across all their customers, but it just shows that cheap, do-it-yourself servers on SSDs (perhaps the key here) can definitely perform.

So after this post I’ll wait for the load to increase and eat my words later. :-)

Permissions Problems with git pull

I’ve started working on Doc5 from a laptop in the last few months and have begun the pull/push process to get my Bitbucket repo and desktop machine all in sync. But when trying to get these sorted I found permissions problems on one of the local repos. When I tried to do a pull I had about eight files that either couldn’t be unlinked or couldn’t be created.

Looking at the permissions, I was the owner of the files, www-data (Apache on Ubuntu) was the group, and the permissions were 644 on the files and 755 on the directories in my project folder. So that all seemed fine.

But what you need to watch for is the extra permission a process needs in order to unlink. What git is doing is taking these files away and then replacing them in the folder, i.e. it's not just a modification through a write action to the file.
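The practical upshot is that unlinking needs write (and execute) permission on the containing directory, not on the file itself. A rough sketch of one way to grant that, with /var/www/project as a placeholder path, and assuming you want the www-data group to share access:

# unlink/create happens in the directory, so the directories need group write
sudo chown -R <your-user>:www-data /var/www/project
sudo find /var/www/project -type d -exec chmod 775 {} \;

With group write on the directories, git is able to unlink and recreate files during a pull.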

Doc5 Beta Now in Testing

After many years I have a version of Doc5 up and available to use. It's a vastly different wiki app from the previous version, and most of the changes have been made in the last 9 months. The last version released for download went by a different name and appeared before my son was born. He starts school in two weeks.

My 9-5 job takes up enough time that for a couple of years I left this project alone and considered dropping any thoughts of pushing it out. But writing web apps is my hobby, so it’s been good to dig through all the old code and clean it up.

So a list of the major changes:

  • WYSIWYG editing has arrived and the previous wiki engine is gone.
  • Permissions have been simplified but also extended to categories.
  • Full UI makeover, although I have gone with a pretty basic Bootstrap view of things.
  • Better file uploads and management.
  • Templates for email notifications.

I think a full release should be available for download in the next two months. Testing on the web will help tune spam catching and there’s some bug fixing to roll out as well as plenty of test cases to run.

Running Your Own Web Server

After nudging the storage allowance on our web host a few times in the last month, I’ve started setting up the same sites on Digital Ocean. Partly to get the 20GB of SSD storage that is the current constraint but also to have a play with managing it all myself and working through the automated provisioning experience.

After a day of configuration and following the odd tutorial (admittedly Digital Ocean have some very good content in their help system on how to set up a myriad of different apps and OSes), I can say it really is much easier just going for the SaaS option if you can. Even just WordPress is a bit of a pain in the arse to tweak for the proper security permissions while still allowing uploads and automatic plugin installs. In my case I need the LAMP stack, or at least the file system and web server, to be accessible, but if you don't, take a minute to think about why you can't just use a hosted service.
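To give an idea of the fiddling involved, the usual WordPress permission recipe looks something like the sketch below, assuming Ubuntu's www-data web server user and a document root of /var/www/html (both of which may differ on your setup):

sudo chown -R <your-user>:www-data /var/www/html
sudo find /var/www/html -type d -exec chmod 775 {} \;
sudo find /var/www/html -type f -exec chmod 664 {} \;

That keeps your own user as the owner while giving the web server group enough access for uploads and plugin installs.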

Our old host also had email set-up as part of the service with unlimited mail accounts, etc. Again with the new option and managing it all myself, I’m not overly looking forward to the fiddling to get Postfix, etc working properly. It’d be great if Google Apps had a much lower price for their per user mailbox hosting option.

Anyway, so far, so good as this blog is the first of the old content to move across to Digital Ocean.

CrashPlan Fails to Start on Linux Mint

I've just reinstalled Linux again after another Windows 8 attempt (this time 8.1), and I'm trying Linux Mint 17. One of the key apps I install is CrashPlan, and on the 64-bit version of this OS the desktop app component fails to start.

The answer is to append an option to one of the start script lines as below. It took me a few attempts to access the support page for this, so I’m posting it here too.


  1. Edit the run.conf file in your CrashPlan app installation directory
    Default location: /usr/local/crashplan/bin/
  2. Navigate to the end of the GUI_JAVA_OPTS line
  3. Add -Dorg.eclipse.swt.browser.DefaultType=mozilla inside the quotes, so the two lines look like this:
SRV_JAVA_OPTS="-Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=CrashPlan -Xms20m -Xmx512m -Dnetworkaddress.cache.ttl=300 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false"
GUI_JAVA_OPTS="-Dfile.encoding=UTF-8 -Dapp=CrashPlanDesktop -DappBaseName=CrashPlan -Xms20m -Xmx512m -Dnetworkaddress.cache.ttl=300 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false -Dorg.eclipse.swt.browser.DefaultType=mozilla"
  4. Save the file
    The CrashPlan app should launch properly
Important Note: If you uninstall and reinstall the CrashPlan app, or the app automatically upgrades to a new version, the run.conf file is overwritten, causing this issue to reoccur. Update the run.conf file as described above to correct the issue.
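Since the file is overwritten on upgrades, it's worth keeping a one-liner around to re-apply the option. A sketch, assuming the default install path and that the option isn't already present in the line:

sudo sed -i '/^GUI_JAVA_OPTS=/ s/"$/ -Dorg.eclipse.swt.browser.DefaultType=mozilla"/' /usr/local/crashplan/bin/run.conf

Run it after any CrashPlan upgrade and the desktop app should start again.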

Personal Data at Work

There's plenty of information and talk around about the issues of allowing company data onto a personal device, but what about personal data on a work device? More and more of our personal data is stored in "cloud" services like Gmail and Evernote that we access from work computers and company-controlled accounts.

In the time of buzzwords like "BYOD", business is rightly concerned about its data being on that tablet of yours that you managed to connect to the company wireless. They want to make sure that the correct level of security protects their data – especially things like email, which almost everyone accesses from a smart phone, tablet or even a web kiosk. Personal devices have proliferated in the workplace since they became cheaper, smarter and cooler to have than the company-provided devices. Right now business is just starting to catch on and recognise that things are changing and that not everything can be restricted or dictated like it used to be.

But for much longer than the iPad has been around, we've all been accessing web sites and apps at work with personal login information. Some of the time we also click the "Remember Me" option when logging in, without a second thought. All this information about our own email, blogs, password managers, Amazon accounts and other browsing habits is sitting on that company device, protected by the one password for your company login.

So think about all the people in your company who have the ability to reset your work account's password. In an enterprise environment that might be fifty or more people. In a poorly managed environment it might be hundreds. All any of them has to do is reset your password, log in to your machine and start up a browser. Whatever sites you jump onto on a Monday morning without logging in are theirs for the viewing. After a long holiday you may not even notice – "Whoops, I must have forgotten my password". None of your personal sites are being "hacked" or even having their passwords changed; you're already logged into them on your work PC.

What can you do about this? Don’t save your passwords at work and don’t stay logged in to any service you value. If you’re thinking that you don’t care about access to your email, just think what information is in there – personally identifiable information, and it probably receives password resets for most other sites you’ve signed up to. What about bank account info, insurance updates?

To take this one step further, combine the person with access to reset your password at work with the person who manages your work cell phones, and the fact your bank uses SMS as a two-factor authentication option. They're a password reset and a SIM transfer away from your bank account.

Microsoft Azure 90 Day Trial

I've just started a 90 day trial of the Microsoft Azure cloud service, as I've got a day session next week with MS on the topic. For those of you thinking about giving it a go, I suggest you jump in now. It's very easy to sign up (no cost, but it does ask for your credit card details) and the management portal is very easy to use. Within five minutes I had a web site running, a server being provisioned and a domain namespace configured.

There’s also the suggestion that the websites you add remain free after the trial period, but I’m cynical and thinking that you still need to pay for data transfers and storage at least.


You can provision a whole raft of different infrastructure from within Azure, some of which is shown on the left. There are plenty of Linux images to kick-start your server provisioning, and the websites come with templates for common web apps – blogs, CMSes, etc. While some apps in the latter category use non-MS technologies like MySQL, it seems you can't provision a standalone database other than a SQL instance. Perhaps that's to be expected.

Once your new virtual machines are up and running you can download an .rdp file to get access to the server and do your normal tasks. But an RDP session from NZ to the Southeast Asia data centre is a bit slow, so I'm thinking my home connection is either a little busy or connectivity really is that bad out of NZ.

DNS and other management roles such as AD and the associated namespaces are easy enough to add too. The configuration for the namespace includes all the identity provider set up that will also allow your apps and services to plug into your own source of user info.

All things considered, after an hour or two of playing, the Azure 90 day trial looks to be very worthwhile, even just for a play. If you're a business based around some of these core Microsoft technologies, there's a good chance this may be your "gateway drug" to actually doing this Cloud stuff for real.