As per my previous posts, I’ve moved hosts over the last week and I think everything is now across. While having a quick look at Google Webmaster Tools to check for errors, I also noticed the following little chart showing the time it takes Google to fetch a page from this sub-domain and WordPress site.
As you can see, the last week shows a very clear decrease in the time to download a page. As far as I can tell, the only thing that has changed is the provider and, with it, the underlying web server. The previous host uses Nginx, while I now run what is probably a fairly default LAMP stack. I’ll assume they can tweak the hell out of the Nginx config they share across all their customers, but it just shows that cheap do-it-yourself servers on SSDs (perhaps the key here) can definitely perform.
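If you want a rough measure of page download time yourself (approximately what that chart reports), curl can print its timing breakdown. The URL below is a placeholder; swap in your own site:

```shell
# Fetch a page silently and print the total transfer time.
# %{time_total} is curl's end-to-end time for the request, in seconds.
curl -so /dev/null -w 'total: %{time_total}s\n' https://example.com/
```

Run it a few times to smooth out network noise before drawing any conclusions.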
So after this post I’ll wait for the load to increase and eat my words later.
I’ve started working on Doc5 from a laptop over the last few months and have begun pulling and pushing to keep my Bitbucket repo and desktop machine in sync. But while getting these sorted I found permissions problems on one of the local repos: when I tried to do a pull, about eight files either couldn’t be unlinked or couldn’t be created.
Looking at the permissions on the files, I was the owner, www-data (Apache on Ubuntu) was the group, and the permissions were 644 on the files and 755 on the directories in my project folder. So that all seemed fine.
But what you need to watch for is the extra permission a process needs in order to unlink. What git is doing is removing these files and then replacing them in the folder; it’s not just a modification through a write action to the file. Continue reading Permissions Problems with git pull
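A quick way to see this for yourself: deleting (unlinking) a file depends on write permission on the containing directory, not on the file itself. This sketch uses a throwaway directory (run as a normal user, since root bypasses these checks):

```shell
# Show that unlink needs write permission on the directory, not the file.
DIR=$(mktemp -d)
touch "$DIR/tracked.txt"
chmod 644 "$DIR/tracked.txt"   # the file's own permissions look fine
chmod 555 "$DIR"               # but the directory is not writable

# This fails for a non-root user, even though the file is "ours":
rm -f "$DIR/tracked.txt" 2>/dev/null && echo "deleted" || echo "unlink failed"

chmod 755 "$DIR"               # restore write on the directory
rm -f "$DIR/tracked.txt" && echo "deleted"
rmdir "$DIR"
```

So the fix is to make sure whichever user runs git has write access on the project directories, not just on the files inside them.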
I’ve just reinstalled Linux after another Windows 8 attempt (8.1 this time) and I’m trying Linux Mint 17. One of the key apps I install is CrashPlan, and on the 64-bit version of this OS the desktop app component fails to start.
The answer is to append an option to one of the start script lines, as below. It took me a few attempts to find the support page for this, so I’m posting it here too.
- Edit the run.conf file in your CrashPlan app installation directory
- Navigate to the GUI_JAVA_OPTS line
- Add the option -Dorg.eclipse.swt.browser.DefaultType=mozilla inside its closing quote, so the two lines end up like this:
SRV_JAVA_OPTS="-Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=CrashPlan -Xms20m -Xmx512m -Djava.net.preferIPv4Stack=true -Dsun.net.inetaddr.ttl=300 -Dnetworkaddress.cache.ttl=300 -Dsun.net.inetaddr.negative.ttl=0 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false"
GUI_JAVA_OPTS="-Dfile.encoding=UTF-8 -Dapp=CrashPlanDesktop -DappBaseName=CrashPlan -Xms20m -Xmx512m -Djava.net.preferIPv4Stack=true -Dsun.net.inetaddr.ttl=300 -Dnetworkaddress.cache.ttl=300 -Dsun.net.inetaddr.negative.ttl=0 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false -Dorg.eclipse.swt.browser.DefaultType=mozilla"
- Save the file
The CrashPlan app should now launch properly.
Important Note: If you uninstall and reinstall the CrashPlan app, or the app automatically upgrades to a new version, the run.conf file is overwritten, causing this issue to reoccur. Update the run.conf file as described above to correct the issue.
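Since upgrades keep overwriting run.conf, a small script to re-apply the option can save repeating the manual edit. This is a sketch: the demo below works on a temporary copy, and the real file lives in your CrashPlan install directory (commonly /usr/local/crashplan/bin/run.conf on Linux, but check yours):

```shell
# Demo on a temporary copy of run.conf; point CONF at the real file
# (e.g. /usr/local/crashplan/bin/run.conf) to apply the fix for real.
CONF=$(mktemp)
printf 'GUI_JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx512m"\n' > "$CONF"

# Append the SWT browser option inside the closing quote, but only
# if it isn't already there, so the script is safe to run repeatedly.
if ! grep -q 'org.eclipse.swt.browser.DefaultType' "$CONF"; then
  sed -i 's/^\(GUI_JAVA_OPTS=".*\)"$/\1 -Dorg.eclipse.swt.browser.DefaultType=mozilla"/' "$CONF"
fi

cat "$CONF"
rm -f "$CONF"
```

Running it after each CrashPlan upgrade puts the option back without you having to remember the exact flag.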
There’s plenty of information and talk around about the issues of allowing company data onto a personal device, but what about personal data on a work device? More and more of our personal data is stored in “cloud” services like Gmail and Evernote that we access from work computers and company controlled accounts.
In the time of buzzwords like “BYOD”, business is rightly concerned about its data being on that tablet of yours that you managed to connect to the company wireless. They want to make sure that the correct level of security protects their data, especially things like email that almost everyone accesses from their smart phone, tablet or even a web kiosk. Personal devices have proliferated in the workplace as they’ve become cheaper, smarter and cooler to have than the company-provided devices. Right now business is just starting to catch on and recognise that things are changing, and that not everything can be restricted or dictated the way it used to be.
But for much longer than the iPad has been around, we’ve all been accessing web sites and apps at work with personal login information. Some of the time we also click the “Remember Me” option when logging in, without a second thought. All this information about our own email, blogs, password managers, Amazon accounts and other browsing habits is sitting on that company device, protected by the one password for your company login.
So think about all the people in your company who have the ability to reset your work account’s password. In an enterprise environment that might be fifty or more people; in an environment that’s been poorly managed, it might be in the hundreds. All any of them has to do is reset your password, log in to your machine and start up a browser. Whatever sites you jump onto on a Monday morning without logging in are theirs for the viewing. After a long holiday you may not even notice: “Whoops, I must have forgotten my password.” None of your personal sites are being “hacked” or even having their passwords changed; you’re already logged into them on your work PC.
What can you do about this? Don’t save your passwords at work, and don’t stay logged in to any service you value. If you’re thinking you don’t care about access to your email, consider what’s in there: personally identifiable information, and it probably receives the password resets for most other sites you’ve signed up to. What about bank account info and insurance updates?
To take this one step further, combine the person with access to reset your password at work with the person who manages your work cell phones, and the fact that your bank uses SMS as a two-factor authentication option. They’re a password reset and a SIM transfer away from your bank account.
I’ve just started a 90-day trial of the Microsoft Azure cloud service, as I’ve got a day session with MS on the topic next week. For those of you thinking about giving it a go, I suggest you jump in now. It’s very easy to sign up (no cost, but it does ask for your credit card details) and the management portal is very easy to use. Within five minutes I had a web site running, a server being provisioned and a domain namespace configured.
There’s also the suggestion that the websites you add remain free after the trial period, but I’m cynical and suspect you’ll still need to pay for data transfers and storage at least.
You can provision a whole raft of different infrastructure from within Azure, some of which is shown on the left. There are plenty of Linux images to kick-start your server provisioning, and the websites come with templates for common web apps: blogs, CMSs and the like. While some apps in the latter category use non-MS technologies like MySQL, it seems you can’t provision a standalone database other than a SQL Server instance. Perhaps that’s to be expected.
Once your new virtual machines are up and running you can download an .rdp file to get access to the server and do your normal tasks. But an RDP session from NZ to the Southeast Asia data centre is a bit slow, so I’m thinking my home connection is either a little busy or connectivity out of NZ really is that bad.
DNS and other management roles such as AD and the associated namespaces are easy enough to add too. The configuration for the namespace includes all the identity-provider setup that will let your apps and services plug into your own source of user info.
All things considered, after an hour or two of playing the Azure 90-day trial looks to be very worthwhile, even just for a play. If your business is based around some of these core Microsoft technologies, there’s a good chance this may be your “gateway drug” to actually doing this Cloud stuff for real.
For some reason the 64-bit download of Windows 8 Enterprise from TechNet does not prompt for a license key, but does try to activate and then fails every time. When it does enter the normal activation process, you’re likely to see a DNS error or something along the lines of being unable to connect to the remote activation service.
So credit to the forum at Techplex for the simple solution, although it’s unclear why it’s needed in the first place.
Solution: Right-click in the lower-left corner of the screen and open a command prompt as admin. Type in the following and hit Enter.
The Windows Activation process will start; you can enter your TechNet license key, the app will connect to the remote system, and you should then have a properly activated Windows 8 device.
I’ve been designing a new secure Windows domain whose users need access to an IIS website in another domain. The obvious question is, “How can we transparently auth users to that site from both domains?”, which IIS looks to make pretty easy, as long as there is a domain trust in place.
Looking around for more info I found an excellent article on WindowsITPro that explains all the various IIS authentication types. So I needed to share its goodness.
If you’re an Ubuntu user who finds themselves with an ugly message like this one day when running an apt-get update,
No apport report written because the error message indicates a disk full error
you may have thought you’d run out of disk space and run the command,
but then found you had plenty of space free. Well, maybe you do have plenty of bytes free, but what about inodes? They effectively limit the number of files you can have in a filesystem. Continue reading Apport Disk Full Error Using apt-get
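To check for this, compare byte usage against inode usage; a full inode column alongside free space is exactly the symptom above. The paths here are just examples:

```shell
# df -h reports bytes, df -i reports inodes. An IUse% of 100% while
# df -h still shows free space means you've hit the inode limit.
df -h /
df -i /

# Find which top-level directories hold the most files
# (du --inodes needs GNU coreutils 8.22 or later).
du --inodes -x -d1 / 2>/dev/null | sort -n | tail -n 5
```

Once you spot the directory hoarding files, cleaning it out (or removing the packages that created it) frees the inodes.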
I’ve been looking about for some free Git hosting and found that most options, like GitHub, require you to make the code available to everyone. So I was pleasantly surprised to find Bitbucket from those ingenious Aussies at Atlassian.
There’s some excellent documentation, the system is easy to use, and your first push from your local repository can be done within a few minutes of signing up.
An issue tracker is available for each of your projects and can be made public while your code stays private. So for a small team or individual it’s a valuable tool, even just as a backup for your local code repos and a way of keeping track of the odd bug.
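For anyone new to the workflow, that first push is only a handful of commands. The remote URL below is a placeholder for whatever Bitbucket shows you when you create the repo:

```shell
# Inside your existing project folder:
git init
git add .
git commit -m "Initial commit"

# The URL comes from your new Bitbucket repo's overview page.
git remote add origin https://bitbucket.org/youruser/yourrepo.git
git push -u origin master
```

The -u flag sets the upstream, so later syncs are just plain git pull and git push.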
I’ve recently been involved in a deeper look at the world of virtual desktops and what options suit different user groups. There are a few different ways to look at desktop virtualisation, and most of them depend on what software your business uses and how your users access it.
With the different types of virtualisation around these days, it can be a little confusing working out what fits where and how they all interact. I’ll run through a quick overview of them here, along with the products I’ve been testing. (Be warned, this is a bit of a ramble.) Continue reading Published Desktop vs Virtual Desktop