Sunday 12 December 2010

File Sharing Options In Ubuntu

Since October was my last post on this blog, I thought it was about time I posted something at least somewhat useful. As someone who owns several PCs, something that interests me is networking those PCs together so that they can share files and common resources. This is something many people struggle with all the time, and those who can't solve the problem often resort to cumbersome methods like shuttling USB sticks around. So what are the file sharing options available to Ubuntu users?

Well as I've just mentioned, you can use USB sticks. It's a reliable method guaranteed to work. It is not however terribly efficient. The next option involves networking, and there are 3 main ways to share files over a network in Ubuntu. We have Samba, which is an implementation of Microsoft's networking protocols for Windows. Next we have NFS, which is commonly used in most POSIX-compliant OSes. The third and final option is the "cloud".

Samba
Samba is great. When it works. Not only does it allow Linux machines to share files and other resources like printers with each other, it also allows Linux and Windows machines to share the same resources on the same network. Which is great. When it works.

And that is the biggest con for me when it comes to Samba. Not only did the Samba developers succeed in developing software that can network Linux and Windows PCs together using Microsoft's SMB protocols, they also succeeded in reproducing the reliability of Microsoft's networking protocols. It could just be my incompetence, but frankly I've never been able to get a home network to work reliably with Windows, and Samba has never worked consistently and reliably for me in Ubuntu either.

I will however concede that there are many people who do appear to have Samba working just fine. Another advantage is that shares are dynamic and browse-able. By that I mean new shares appear on the network as they are made available.

So Samba is flexible but not reliable.

NFS
NFS, or Network File System, looks a bit scary on the surface. Setting it up requires using something called the "terminal" and editing "configuration files" with "root privileges"! For the purposes of a home network based on Ubuntu machines it's actually very simple. Install 3 packages and reboot the PC. Then you can start sharing stuff. Which is where you need to start editing files. Well actually you don't.
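
For the record, the three packages I'm talking about on an Ubuntu 10.04 or 10.10 machine that will be serving files are, as far as I know, the ones below. A machine that only needs to access shares can get away with just nfs-common and portmap;
    sudo apt-get install nfs-kernel-server nfs-common portmap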

Ubuntu has a GUI that lets you select shares for either NFS or Samba. It can be found in System > Administration > Shared Folders. In the case of NFS this will create all the entries in the /etc/exports file for you. All you need do is supply the basic information: the folder to be shared and the clients (other PCs) that have access to it. Editing the /etc/exports file directly does however give more flexibility and certainly isn't something any home user should shy away from. Simply make a copy of the original file first. That way, if you screw things up, it's easy to fix.
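
Just to give a flavour of what a hand-written entry looks like, the line below shares a made-up folder read-write with every PC on a typical 192.168.1.x home subnet, so adjust the path and addresses for your own network. After saving /etc/exports, running sudo exportfs -ra tells the server to pick up the changes;
    /home/your-user-name/shared 192.168.1.0/24(rw,sync,no_subtree_check)
    sudo exportfs -ra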

One of the biggest drawbacks for me with NFS is that it's static, or not browse-able. As yet I haven't found a way to browse through a PC's NFS shares. Each and every share on the network needs to be "mounted" manually or via the fstab file for automatic mounting at boot time. Which can be inconvenient if you can't remember the exact name of the share. I also haven't yet figured out how to share a printer with NFS. If it's even possible.
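
To illustrate what I mean by mounting, here's roughly what it looks like. The server name, shared folder and mount point below are all made up, so substitute your own. The first two commands mount a share by hand; the last line is the equivalent /etc/fstab entry for automatic mounting at boot time;
    sudo mkdir -p /mnt/shared
    sudo mount server:/home/your-user-name/shared /mnt/shared
    server:/home/your-user-name/shared /mnt/shared nfs defaults 0 0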

NFS server and client software is available for Windows. However I've never used it and therefore can't vouch for it.

So NFS is reliable but inflexible.

The Cloud Services
There are at least two "cloud" based file sharing services that spring to mind when it comes to Ubuntu. The first is Canonical's own service, Ubuntu One. This is installed by default in Ubuntu 10.10 as part of Canonical's "social from the start" initiative. You do however still need to create a user account.

The other, and probably more popular, service is Dropbox. Dropbox is the more mature and better established of the two, which means more of the bugs have been ironed out. I've used both services and Dropbox is definitely my choice for the moment.

The thing that interests me most about these services is the ability to share files securely wherever there is an Internet connection. Not only do they share files but they synchronise them as well. Which means the latest version of a document should always be available. There is a significant limitation however. Cost!

Unlike using Samba or NFS to share resources on a home network, using a cloud service limits the amount of data you can share at any given time. That's because the cloud services operate by copying the files to an on-line storage space and then to each client. This on-line storage needs to be paid for by someone.

Users do however get some free storage space. With Dropbox it's 2GB and with Ubuntu One it was 1GB when I tried it. Which in today's world of massive MP3 libraries, digital photos and perfectly legally ripped DVDs isn't much. Once you've reached the limits of your on-line storage capacity you must move files out of the shared space or pay up actual real money for more space. On a monthly or annual basis. Which means you're actually paying good money to access your own files.

On the plus side however these services do tend to "just work". Which is why they are popular and why people are actually prepared to shell out money to get that extra storage capacity. Cloud services are also said to have another important function: off-site backup. In reality this only works if you go to the trouble of setting it up that way. Remember these services automatically synchronise their shares. So if you delete a file in the shared space on your laptop, it will be deleted in the shared space on your desktop as well. However as a cheap off-site backup option it's not a bad idea.

So cloud based file sharing services are reliable, flexible, browse-able but potentially expensive.

The Conclusion
Each file sharing option here has its pros and cons. The important thing for people to realise is that there are options out there. You don't have to stick with a one-size-fits-all solution that doesn't actually work properly for anybody. My solution is to use a combination of services. I use Dropbox for files I need access to when I'm not at home and can't connect via my own home network. And I use NFS for my home network.

I have chosen these options for their reliability and simplicity. They both "just work" and very rarely throw up any problems. Not having figured out how to share a printer with NFS (if it's even possible) isn't a huge issue at the moment. In the modern age why are we still printing stuff anyway? There's no real "need".

Tuesday 5 October 2010

ConDem-ed

This is just a quick observation. The other night I was watching the news. There was a report on the cuts the ConDem-ed government intends to make along with the welfare reforms. I couldn't help but notice that everything major mentioned so far will not be implemented in this parliament, but in the next.

I'm by no stretch of the imagination a political heavyweight. But that seems like an ideologically driven approach rather than austerity measures. The cuts the ConDem-ed government are planning are intended to last well beyond this and the next government. And there is one more odd thing about this ConDem-ed government.

Details. The devil is always in the detail. But the ConDem-ed government are very light on details. Before the election I was criticised by some for calling out the Conservative manifesto as being utter nonsense. All headlines and no substance. Those who would play devil's advocate defended that manifesto, claiming an election manifesto should only set out the "vision statement". Well frankly that's just not good enough for me.

Before the election the Conservatives were pushing out some radical Thatcher-ish sounding policies with no actual meat in them. Today in government they are doing much the same thing. How can a government function for this long after an election and not have actually done anything? Why aren't the opposition parties holding them to account?

Something has gone very, very wrong in British politics.

Wednesday 11 August 2010

deviantArt Munro

There are people in the world who doubt the usefulness of the Internet and the World Wide Web. These are the same people who are currently talking down the transition from the desktop computing model to the "cloud" model. They are also the people who stand to lose the most from this transition.

I've read articles in the past that talk about the security concerns related to "cloud computing" or "fog computing". There's a lot of scaremongering going on in the industry and in popular media. And disasters like Microsoft's Sidekick disaster are routinely trotted out to add weight to the claims being made. All of those articles however ignore one simple fact. We already live in the "cloud". As I've said in various forums in the past, cloud computing is simply a return to the client server model where applications are hosted by a server and interacted with via a client. The server does the heavy lifting. The client provides the interface. This is how the web works. It's how it's always worked.

So why all the fuss? Well traditionally web applications have always been limited to the capabilities of the web browser. Primarily Microsoft's Internet Explorer, which has been a ball and chain shackled to the ankles of the Internet since Microsoft forced it upon its customers in Windows 95. But that has changed. Microsoft no longer dominates the web in a meaningful way. Its market share is falling continuously. New versions of IE cannibalise market share from older versions of IE while the overall market share for IE falls. Microsoft's competitors are stealing a march on the web. Apple, Google, Opera and Mozilla have all released HTML5-aware web browsers. And it's HTML5 that will set the web free.

HTML5 is what powers Munro from deviantArt. Munro itself is nothing special. It's a painting application that allows you to publish directly to your deviantArt account. What is important is how Munro is delivered. It's free to use and is delivered to the user via the deviantArt web site. There are no plug-ins required. No installation required. No configuration required. No lengthy serial number to input for activation. It is essentially the ultimate plug 'n' play application. The only thing the user is required to do is figure out how to use it. Which isn't hard.

Munro isn't the first HTML5 application to pop up. There are quite a few out there now. Discreetly integrated into web sites. Users simply take them for granted without even noticing how painless the whole experience was. Because it was painless. This is how the web was meant to be. Free and easy to use. Simple and transparent. Accessible to all.

Monday 26 July 2010

Why does Wall Street hate Microsoft?

"Why does Wall Street hate Microsoft?" is the question posed on the Microsoft Blog. The answer the author arrives at, extremely quickly, is that Windows is just not exciting enough. Hmm okay then. I'm no expert on the stock market or on how investors behave. But I think it's a safe bet they're more interested in the profit margins. Not just from one or two products. But from the whole product line. And therein lies the problem. With the exception of Microsoft Office and Microsoft Windows, very few Microsoft product lines ever seem to make any profit at all.

Take Bing for example. It was supposed to be the best thing since sliced bread. It wasn't. Hardly anybody uses it and Microsoft's on-line efforts are haemorrhaging something on the order of $2,000,000,000 per year. Vista was a disaster, and despite the supposedly good figures for Windows 7, the business world is sticking with Windows XP. Which is why Microsoft has extended downgrade rights to 2020. And Windows Surface was also essentially stillborn.

There was Zune. Which only really sold in the USA and even then wasn't so popular. There's the Xbox. Hugely popular, yes. Shame it's a technical disaster, with at least half of all units sold at one point being returned as duds. Which one might think was enough disappointment for one product. But no. Microsoft had plenty more disappointment for Xbox fans. First they decided to lock out third party peripheral devices. Then they decided to cut off early adopters by ending support for their version of the console. There was the Sidekick debacle. Then there was the stillborn Microsoft "Kin Phone". Which nobody wanted. And then of course there is a list longer than your arm of product lines Microsoft has recently discontinued.

All of which is very telling about Microsoft's understanding of today's market. They just don't get it. For some unfathomable reason Microsoft thought teenagers would buy a phone like Kin when they could have an iPhone, the HTC Legend, the Google Nexus One or the Motorola Droid. Even Microsoft's own Windows Phone 7 looks like a better proposition. Then there is the way Microsoft treats its paying customers. People are tired of the 20-character activation codes. It makes people who have bought and paid for their products feel like criminals. So it's understandable then that they might consider switching to Apple or Linux.

And speaking of the competition. When Microsoft's competition does something different, exciting or innovative, Microsoft's responses are almost non-existent. Microsoft's answer to the buzz surrounding Compiz on the Linux platform was a new taskbar and windows that minimise or maximise when you shake them, or some such. Compiz in comparison has a multitude of features. Some actually useful in boosting productivity, like Negative, Opacify, Enhanced Desktop Zoom or Add Helper. Basically Microsoft is a company that doesn't have anything new to offer and always seems to be late to the party. And not even fashionably late at that.

It's not that Wall Street hates Microsoft. Wall Street just doesn't see any potential. Investors' minds have been made up that Microsoft has run its course. It's time for a change. For something new. Apple is currently storming ahead because it has proven itself to be a company that can diversify to survive and fight its way clear of bad times. Microsoft innovates by buying up an existing company or product and slapping its own brand on it. Apple innovates by putting a new spin on existing technologies and products.

Then there is the honesty factor. Investors don't like it when people lie to them. Windows 7 is turning bumper profits. Except virtually nobody in the business sector is interested and the whole of Asia is apparently pirating Windows. So who exactly is buying Windows 7? Are these bumper profits coming from consumers alone? Maybe there's some "Hollywood" accounting going on over at Redmond.

Wall Street doesn't hate Microsoft. It just doesn't care any more.

Saturday 24 July 2010

Ubuntu Tip: How To Synchronise Gnote Between PCs

In the great Tomboy vs Gnote debate one of the trump cards Tomboy has is its ability to synchronise its database of notes with other PCs. Not that it was ever a feature that worked brilliantly. Nonetheless it was a feature I used and the only reason I continued using Tomboy over Gnote.

How stupid I was.

Ubuntu, as many Ubuntu users will know, comes with a "cloud" service called Ubuntu One. Which, while it isn't that great, provides the basic inspiration for what we're about to do. You see, Canonical in their great wisdom decided it would be a great idea to integrate Tomboy with Ubuntu One. Fine. Excellent. If it works. It did for a time for me. Then it sort of went a bit crappy. However there are more mature "cloud" services available for synchronising files between PCs. And that, incidentally, is all Tomboy's synchronisation feature does so far as I can see. It simply copies files that have been created or changed recently to the PC that doesn't have the new version.

You see, all of the notes you create in Tomboy or Gnote are stored in individual XML-formatted files. And the Linux file system has a crafty little feature called symbolic links. Combine this with a service like Ubuntu One or Dropbox and all our notes are synchronised automagically, so long as we have an active web connection.

Prerequisites:
  1. Tomboy or Gnote. I recommend Gnote. It's lighter and faster than Tomboy.
  2. Ubuntu One, Dropbox or similar service. I would recommend Dropbox.
Setup:
  1. Create a new folder for your notes in your Ubuntu One or Dropbox folder.
  2. Copy your existing notes to the folder you just created in your Ubuntu One or Dropbox folder. Tomboy notes can be found in /home/your-user-name/.local/share/tomboy. Gnote notes can be found in /home/your-user-name/.gnote.
  3. Replace the Tomboy or Gnote folder with a symbolic link. Open a terminal window and enter the following commands, adjusting the paths for your own PC. The first command moves the old folder out of the way (ln won't replace an existing directory on its own), the second creates the link;
    mv /home/your-user-name/.gnote /home/your-user-name/.gnote.old
    ln -vsf Dropbox/gnote /home/your-user-name/.gnote
  4. Repeat steps 1 and 3 on the second, third, fourth, etc. PC (step 2 isn't needed, as the cloud service will already have synchronised the notes themselves).
NOTES:
  1. It's best to have Tomboy or Gnote already setup and working before you attempt this.
  2. It's also better to avoid using hidden directories with Dropbox. They don't work very well in my experience.
  3. If you're using Tomboy but would like to switch to Gnote, that's not a problem. Gnote is a native Linux implementation of Tomboy that is free of all Mono dependencies. Both applications use exactly the same data file formats and both offer almost exactly the same feature set. So all you need to do is copy the Tomboy files to your Gnote folder, as shown below.
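     Assuming the default locations mentioned above (adjust for your own user name), something along these lines should do it;
    cp /home/your-user-name/.local/share/tomboy/*.note /home/your-user-name/.gnote/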

Digg Is Dead

For quite some time now I've had a Digg button embedded in my posts. News aggregators like Digg are great for driving web users to your blog. However for some strange reason the Digg button isn't working any longer. At least not on my blog. All the posts show the same number of Diggs. Clicking the button does nothing.

So what is it about the new Blogspot templates that is causing this conflict with Digg? Clearly having a button on your blog or website that does nothing looks bad. So for the time being I've removed the Digg button.

Ubuntu Tip: Split Screen In Nautilus

Way back in the days when I got my first IBM-compatible PC running MS-DOS and Microsoft Windows 3.1, there was a feature in the Windows File Manager I have really missed over the years since 3.1 became obsolete. It's the split screen mode.

Well I shall miss it not a minute more. For Nautilus has just such a feature. Simply open Nautilus and press F3. Each pane can point to its own directory. Which is handy considering I connect to FTP servers using Nautilus.
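
As an aside, you don't even have to click your way to an FTP server. Nautilus will happily take a location on the command line, so something like the following (the server name here is obviously just an example) opens the remote site directly;
    nautilus ftp://your-user-name@ftp.example.com/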

Wednesday 21 July 2010

Password Protecting A Website

I'm writing this more as a reminder to myself than for anybody else's benefit. Recently I had a request to add a password protected area to a web site I administer, www.aikido-gb.com. For some reason I had always thought this was quite difficult and needed experience in building on-line databases. It doesn't.

Adding password protection to an area of a website is actually pretty easy. Basically all you need to do is create a subdirectory and add two files to it, .htaccess and .htpasswd.

Setting Up .htaccess
The .htaccess file defines access privileges and the location of the password file, .htpasswd. It should look something like the following example;
AuthType Basic
AuthName "restricted area"
AuthUserFile /the-full-path-name-of-your-webspace/membersonly/.htpasswd
AuthGroupFile /dev/null
require valid-user
order allow,deny
allow from all
An important point to note: the AuthUserFile directive must contain the full path name of your web space. This will likely not be your domain name, so you may need to contact your ISP to get that information. Alternatively, search the ISP's FAQ.

Setting Up .htpasswd
The .htpasswd file contains user names and passwords. Nothing else. The passwords must be encrypted. Fortunately there is an abundance of free tools around the web for this task. Each entry takes the following form;
userName:passWord
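
If you'd rather not trust a web-based tool, you can generate an entry yourself with the htpasswd utility, which on Ubuntu comes in the apache2-utils package if memory serves. The -c flag creates the file, so only use it the first time, and userName is just a placeholder;
htpasswd -c .htpasswd userName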

Tuesday 20 July 2010

Ditching Ubuntu 64-bit

Since upgrading to Ubuntu 10.04 64-bit my PC became unstable. Most of the time it was just inconvenient minor things. The clock on the top panel wouldn't display properly. However occasionally the whole PC would freeze. Especially when under a heavy workload. Couple this with the fact that Adobe Flash and Adobe Air have never worked to a satisfactory degree, and I decided it was time to go back to a 32-bit world.

What I had to consider was what I was losing and what I was gaining. My PC has 8GB of RAM. Normally on a 32-bit system 3.75GB is as much as you get. So I'd potentially be losing half of my RAM. But that was it. None of the software I use is exclusively 64-bit. Some of it is however exclusively 32-bit and there are ways around the 4GB memory cap in 32-bit operating systems.

What I stood to gain was a simpler installation procedure for my 32-bit only software and a more stable operating system. I use the 32-bit version of Ubuntu 10.04 on my laptop with no problems. So I already knew it was more stable.

To get past the 4GB memory cap, I intended to install the server kernel, which has PAE (Physical Address Extension) enabled by default. However I found this was totally unnecessary. The desktop kernel in Ubuntu 10.04 comes with PAE support.
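
If you want to check for yourself, these two commands should work on any stock Ubuntu 10.04 install as far as I know. The first shows whether your CPU supports PAE at all (look for pae in the flags line), the second shows how much memory the running kernel can actually see;
    grep pae /proc/cpuinfo
    free -m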

In the end I now have a more stable PC. A simpler installation procedure for my 32-bit software. I still get to use 7.9GB of the total 8GB of RAM in my PC. Which is actually an improvement over the 64-bit version of Ubuntu. The PC does however feel a little sluggish at times. Particularly when launching applications. But that's a small price to pay for simplicity and stability.

Monday 7 June 2010

Ubuntu Tip: Gnome Schedule

Ask someone in the Linux world how to schedule tasks to run automatically and they'll likely start blathering about cron jobs. Now cron is a great tool if you do a lot of task scheduling. But for those of us who want something simpler there is Gnome Schedule. In Ubuntu it's bizarrely not installed by default. Why not? It's an insanely useful little tool that's easy to get to grips with. Luckily installation is simple.
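
For comparison, this is roughly what the cron route involves. You run crontab -e and add a line like the one below, which would run a completely hypothetical backup script at 9am every day. Gnome Schedule builds exactly this kind of entry for you through a friendly GUI.
    0 9 * * * /home/your-user-name/backup.sh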

  1. Open a terminal window.
  2. Type sudo apt-get -y install gnome-schedule and press enter.
  3. Start gnome-schedule from the Applications > System Tools menu.
That's it. All ready to go!

Monday 15 February 2010

Why Firefox Is Not Doomed

Okay, I was going to comment on the article itself. However InfoWorld appear to be afraid of feedback and require some sort of lengthy vetting process after registering. Like seriously!?! What is that all about!?!

So Firefox is doomed. I would disagree. Google might cut funding to better support its own browser. But then again the whole point of Google funding Firefox is to get the default search position in the search bar. So far as I know Google don't intend to exit the search business just yet. So while Chrome is something of a threat, or competition at least, to Firefox, it's not the end game until Chrome gains significantly more market share.

IE is an immovable object? Clearly Randell is deliberately ignoring the fact that IE is well into decline. Its only actual stronghold now, where Microsoft can still feel safe, is the corporate desktop. Everywhere else is up for grabs. Which is why every other major web browser on the market, namely Firefox, Safari, Opera and Chrome, is gaining market share while IE is losing it. What's more, these browsers are gaining as much market share between them as IE is losing. Which means they're taking it from IE.

If that wasn't enough, at least 3 or 4 countries have advised their citizens to ditch IE and use something else instead. Then of course there are countries like Brazil where significant numbers of people don't even use Windows, let alone IE. They're all on Linux. So is IE an immovable object? I hardly think so.

As for the critique of Mozilla's intention to opt for smaller updates? This is what the Ubuntu developers do. They focus on one major thing they want to make better and by and large deliver on their goals. That doesn't mean other things don't get fixed or don't change. Clearly they do. But the developers do well out of that intense focus. Which is perhaps something Firefox actually needs.

I also don't think it's fair to compare Mozilla to Microsoft or to compare incremental updates to hot-fixes. First of all Microsoft and Mozilla are two entirely different companies with different motivations and goals. They have totally different business models that dictate how much funding they can dedicate to a single project and how the end product is marketed.

Hot-fixes are also intended to be temporary fixes to tide Windows over until the service pack arrives. Again that's a different coding style from an incremental update. A hot-fix is a quick and dirty patch to fix a problem in the short term. An incremental update is something that is integrated properly into the existing code base. That is what Mozilla does at the moment. The fact that it might want to do less but do it better doesn't mean the product will suffer.

Sunday 7 February 2010

Ubuntu Targeted By Malware!

Apparently some malware has been found in both a .deb file claiming to install a screen saver and a theme pack over on Gnome-Look. Which is a shame since I use that site regularly to find new wallpapers and themes.

So Ubuntu is now the target of criminals? The tech press places all of desktop Linux at 1% of the market and claims in the same breath that the figure is a generous estimate. Why then are criminals targeting something that has only 1% of the desktop PC market? It makes no sense unless Linux is much bigger than 1%. It would be interesting to find out what has changed the criminals' minds. Are Linux users simply too naive? Too cocky? Too arrogant? Too stupid? Clearly criminals now see Linux desktops as vulnerable targets ripe for the picking.

This is both an interesting and frightening development. However we have at least exposed a vector of infection for Linux systems. Anybody could build a .deb or .rpm package file or even a normal tarball. Clearly, without the protection of the community maintained repositories, these methods of installation are just as vulnerable to misuse as Windows .exe files. Which is something many people in the Linux community, including myself, hadn't considered. Normally we tell people an installation script requires permission to run. So we're protected. The trouble is I don't think many people give entering their password a second thought when installing from a .deb package. Most of us simply trust them.

So this is a wake-up call to Linux users and in particular to Gnome and Ubuntu users. Be vigilant. Be careful about your chosen software sources. If you're installing something from a web site be sure to scan it first! Linux has a very good anti-virus application available in the repositories of most decent distributions. It's called ClamAV. While this is a command-line application there are several front ends available. The most popular at the moment is ClamTK. Use the following commands to install them on your system. Remember you'll need to enter your password when installing software.

  1. Open a terminal window.
  2. Type sudo apt-get -y install clamav clamtk then press enter.
  3. Remember to scan your system regularly.
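
To give a rough idea of what regular scanning looks like from the command line, the first command below updates the virus definitions and the second recursively scans your home directory, listing only infected files. The path is just an example, so point it wherever you like;
    sudo freshclam
    clamscan -r --infected /home/your-user-name
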
There are other anti-virus scanners for Linux. They're not hard to install. Most come in .rpm or .deb files. The ones that come as tarballs generally tend to be binaries. So there's no need to compile anything. Check out this article for more info.