07 September, 2009

The modern browser, and other worlds.

Goin’ on a Safari diet

Earlier this year, I posted about Google Chrome and how good a score it got on my Windows XP Home system. Since then, I’ve found the Safari browser from Apple, which is based on the same WebKit code. I also got a copy of Windows XP Professional, then patched it to the max. Safari got a better score than Google Chrome too, as it didn’t fail the link test that seems to plague Google Chrome every time I try the ACID3 test. Literally the only thing Safari failed on was speed, and that’s totally unsurprising, given that I’ve got a Duron 1GHz with a whole 1,256MB of memory. Still, the machine does me okay for what I’ve been doing up until now.

Safari hasn’t exactly supplanted Google Chrome for sheer skinniness of browser interface yet, nor has it impressed me as much as Opera did way back when I first spotted it. I’ve literally only found two things missing from Chrome: the ability to turn off Flash (a la Flashblock for Firefox) and the ability to add arbitrary other plugins. Still, it’s pretty okay. I like things in Safari, but I don’t know if I’ll use it as my main browser; given that it comes from Apple, and they don’t exactly give out the source code to all their crown jewels, I may step back to Firefox.

New Jools for Mozilla Corporation

Incidentally, I understand there’s been a new release of Firefox (3.5) which supports parts of the draft HTML 5 specification, which seems to include lots of goodies that don’t need browser plug-ins such as Flash, Shockwave, or Silverlight. While that’s nice, what does the HTML 5 standard mandate that earlier revisions didn’t? I haven’t much idea at the moment, which is why I’ll head off to the W3C.org site for the preliminary HTML 5 standard to see what it says. Dry reading, but it’s the last word on the standard. I may have to go somewhere else to actually find out what it all means, mind you, but hey, that’s life.
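
One example of those goodies that I’ve seen bandied about (just a sketch from my reading so far, with a made-up file name, so don’t quote me): the draft adds a video element, so a browser like Firefox 3.5 can play a clip natively instead of handing it off to a Flash player.

    <!-- the HTML 5 draft's video element: no Flash plug-in required -->
    <video src="holiday-clip.ogv" controls>
      Text shown by browsers that don't know the video element yet.
    </video>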

Same old same old for Microsoft?

In comparison to IE8, I think I still prefer Firefox/Chrome. I don’t know why. It could be the fact that I can’t block adverts in quite the same way with IE. It could be the bar, and the VERY loud noise, that pops up every time a page has a Flash applet that IE could run but needs to check with me about first. I can’t reduce the volume of the noise, and it’s one of the things that really puts me off. It could also be the fact that IE8 still only gets 23/100 on my machine when I do the ACID3 test. Hey, that’s better than the rank score of 3/100 that I got with IE7, or the 12/100 that IE8 gets in compatibility mode! It could even be the fallibility of Microsoft’s programmers, who made previous versions of the browser so exploitable, and so much a core part of the operating system. Hopefully IE8 isn’t exploitable in quite the same ways, nor to the same degree. Frankly, I don’t know.

Personally, for blocking/choosing Flash, I prefer to use Firefox with the Flashblock plugin; add Adblock Plus and NoScript, and nothing gets through those three without your say-so. In my eyes, that’s a better way of doing the job. NoScript has the advantage of treating each site separately: no matter how many sites happen to supply content to the same page, each website gets its own blockable entry. It’s the same for Adblock Plus.

So now what?

What’s next in the browser wars? I honestly have no idea. Not as a desktop user, anyhow. I could say I had a wish-list, but I’d be incorrect in saying so. And as I’m no programmer (really) I have no idea of the scale of the job that modern browser programmers have… do they make it lean-n-mean (a la Chrome) and risk leaving out features that users want, or do they make it sing and dance (a la Firefox 3.5) and gobble up plenty of the user’s memory on load-up? That’s a hard balance to strike, because while some people just want to get the job done (display me the page, please, and don’t put any stupid dressing on it), others want to be immersed in a multimedia environment hosted (funnily enough) by a browser-like interface.

So, what? Host the whole OS in the browser?

That’s a possibility being mooted by some: create the browser as a thin shim around an internet-based operating system, with most of the applications hosted on remote servers, along with most of the user’s data files. Great for redundancy; it almost approaches what thin clients are built for. A bit useless, though, for those of us with slow old modems and gobs of hard disk space just crying out for tunes to be stored locally. And in New Zealand at least, shuffling all that data over the slow links we have here isn’t all that practical unless you’re only editing a few documents a month and doing a little bit of surfing, a little bit of email, and some (small amount of) music listening. Otherwise you end up paying gobs of money because you’ve gone over your data cap for the month.

Local, or Server?

Personally, I prefer local. Nobody else has to deal with it then. As long as I’ve got a copy on my hard drive, nobody can snoop my network traffic to see my artwork, music, or letters (or blog posts, for that matter). I’m not entirely in favour of server storage, except for one thing: access for other people. As soon as you throw multiple access into the mix, server storage makes more sense. With local storage, you have to shove a document up to some other place (via ftp/http/torrent), or let someone have a recce at your personal machine to see the document, or use some sort of peer-to-peer software like Skype/Messenger. Hey, people are still emailing stuff, but that assumes the other person has an email account they can get at. And who remembers that grand old collection of software, UUCP? Granted, all these solutions work (except for UUCP now, of course), but they always feel like a bolt-on to me. Even server storage feels like a bolt-on, but it’s how shared environments work. “Team” members put documents into a store, and everyone has access to them, to work on as they need, sometimes in pairs, in which case the document will probably be locked so that only certain things can be done by other people.
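
As an aside, for the http option the quickest bodge I know of is Python’s built-in web server, which shares out whatever directory you start it in. This is only a sketch: it assumes you have Python installed, and that your firewall and the other party’s connection will let port 8000 through.

    # run this in the directory holding the document you want to share
    python -m SimpleHTTPServer 8000
    # the other person then points a browser at http://your-address:8000/

It’s read-only and completely unencrypted, mind, so it’s strictly a “here, grab this file” arrangement, not a shared workspace.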

These sorts of team environments are only really used by businesses at the moment, but there’s no real reason to restrict them to businesses. I can imagine that Granny might just want her son’s help in writing up a letter, but the son’s in another city. In which case, a shared environment may just do the job. I’m not talking about the sort of thing whereby Granny offers her son a Remote Desktop invitation, as that’s not really what that’s for, and it’s rather limited to whatever software happens to be on Granny’s machine at the time, the speed of the network (sometimes abysmal), and the speed of understanding between the two of them. VNC offers a similar experience to Remote Desktop, but can operate across differing operating systems. An ad-hoc arrangement whereby the son pastes paragraphs into Messenger and Granny pulls them out of Messenger into her editor would of course also work, but there’s nothing like working from the same page, so to speak.

Linux (and indeed Unix of old, the BSDs, etc.) had this concept of kibitz: a program that provided a shared shell where two (or more) people could interact, though for xkibitz at least, it’s text only. I can do the same thing with an OpenSSH session, the screen program, and an editor, but this requires either that I give the other party/parties access to my account, or that I provide a common login account for everyone to play in that’s a bit more locked down security-wise, so if anyone goes NATO on you, you’ve at least got some protection. Again, a screen session is text-only. Examples of systems that are fully graphical and fully interactive for all parties aren’t too prevalent, and VNC (in its various forms) is the only 2D example outside of Microsoft’s Remote Desktop that I know of currently.
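
For anyone wanting to try the screen trick, this is roughly the recipe (a sketch with made-up user names; the screen binary usually has to be setuid root before the cross-account attach will work):

    # alice starts a named session, then opens it up from inside screen:
    screen -S shared
    #   Ctrl-a :multiuser on
    #   Ctrl-a :acladd bob
    # bob, logged into the same machine, attaches to alice's session:
    screen -x alice/shared

Both of them then see, and can type into, the same shell and whatever editor is running in it.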

Why not a 3D environment?

One concept has a fully 3D model environment that everyone logs into, where everyone can create objects and interact with each other’s objects, albeit with some restrictions. Currently, I know of two models in this space: Second Life is the well-known example of one; OpenCobalt and Edusim are examples of the other, not so well known partly because people are still working on the underlying software. Frankly I’m still getting my head around things in Second Life, and I’ve volunteered to be a tester for the OpenCobalt project to bring them up to speed. There are some limitations to all of these environments though, in that you have to have a recent machine and a reasonably fast Internet connection (DSL at 2Mbit/s should do). A good quality graphics card is a must for these applications, as I’ve found out to my detriment. Neither of my machines has a really recent video subsystem, and as a result, their performance suffers under the requirements of a 3D multi-user environment.

Why are there two models? What’s the difference?

The main difference between the two models is this: Second Life uses a server-driven hosting model, where everyone who downloads and runs an official Second Life client automatically gets logged (with a suitable username/password) into the Second Life server cluster, and can only interact with the activities hosted there. If you wish to have a slice of land of your own, that’s paid for on a subscription basis, because that’s how they make the income to afford the running of the servers. I believe they’ve accepted that most people won’t actually buy their own land, as long as they can get those people interacting with the people who will buy land, build projects, and make items to interact with.

The other is the “everyone gets their own island” model, where each person starts off with an environment (called an island, in Cobalt parlance) that they can tool up as they need. The trick to getting people into each other’s environments is a tool called the teleport. Much like Second Life’s, it allows an avatar to transport somewhere else; unlike Second Life’s, the teleports are not mandated by a server farm or by a company. And currently at least, spaces are only visible to other people on a voluntary basis. If I start up a Cobalt island, it’s completely autonomous. If I wish others to come to that island, I can hand out a “postcard” telling somebody “this is where I am on the network, so come and join me”. I can take the island down whenever I like, without having to report to anybody other than those I’ve invited to my island. People that receive a postcard can then use the teleport mechanism to come to my space.

So far, in my testing, I haven’t nailed down exactly how many people we can have on an island before performance starts suffering. I have heard of twelve people (wow!) being on one island without performance being impacted, but they were all on a local area network. I’ve noticed that things happen at a slower frame rate for me, but that’s because I’m on an older machine. I’ve no idea how performance degrades with hundreds or even thousands of people on an island, because we’ve never tried that scale yet.

OpenCobalt is built on the Squeak platform, a popular form of Smalltalk that’s freely downloadable and usable by anybody, so it was a good fit for a project of this type. There have been extensions to the Squeak platform to let it do the 3D-ish thing: earlier work went into a client called Croquet, and further work on Croquet produced what has become OpenCobalt. It’s got a wee way to go, but I think with the right work we can get the performance of Second Life without having to depend on a centralised server farm to run it.

Anyhow, that’s my little wander through the subjects rattling around in my mind.
