25 posts • joined 28 Nov 2007
Maybe not quite as pointless as it seems...
I already listen to digital radio, via the BBC iPlayer app, the TuneIn Radio app and any number of IP streams - DAB is dead as far as I'm concerned.
That having been said, I like the abstraction idea. If this could be put into an OS, it would be pretty cool. Having a "Radio" app that aggregates all possible sources would be nice.
It looks like it would be fairly cheap for Google/Apple to do as well, and it allows the vendors/networks to make money by pre-packaging stations - so should be popular there. And customers find it easier to listen to radio. Everyone wins, it seems.
(Well, except for Apple and their customers maybe, as Apple are a bit touchy about such customisations.)
On that basis, I wish 'em luck.
So are they planning on making some films that would be worth pirating then?
Something that's not a sequel or a prequel or a remake? Or a comic book adaptation or a young adult book series adaptation?
Something people might actually want to pay them money for, rather than just pirate so that they can at least be disappointed for free?
I think they have bigger problems than Google Glass. And the problem is on the supply side, not the customer side.
BASIC is under-appreciated
There's the famous quote about how BASIC ruins programmers.
Personally, I think it probably brought more people to IT than any other programming language ever has.
Almost everyone I know in IT "of a certain age" used BASIC as a kid. Whether they're now a developer, a systems administrator, an architect, a hardware engineer or a support engineer. (Or whatever other IT roles we can think of.)
And through things like VBA, BASIC continues to be very much alive despite a plethora of newer options.
Re: Very few decent games for linux on Steam
Define your criteria for "decent games"...
If you mean that there are very few big ticket, big budget games that get TV advertising slots and ads on bus shelters - then you're right. "Call of Battlefield Volume 17638 - Easter Edition" isn't available.
Although I'm not sure that's a bad thing, as many of those are bad ports from consoles anyway. The Steam Box has a controller which maps the keyboard to a console controller. So you'd actually be playing a game that was badly ported for controls on a PC, but on a platform which then attempts to port those controls back to a controller... *shudders*
There are some surprising titles. Serious Sam 3 BFE, for example. Niche, perhaps - but if you like that classic Doom style first person "possibly more enemies than you have bullets" kind of rush then you're sorted there.
And there are loads of excellent indie games. I've been steadily losing time to FTL: Faster Than Light. And Solar 2. And Gratuitous Space Battles.
But let's be honest, what will really convince you about gaming on Linux is playing Valve's own Hat Fortress 2 - the world's premier hat wearing simulator. I hear the next update adds the ability to shoot stuff too... ;-)
Re: but then Sony launched the Z1
Not quite just you. After the music CD rootkit debacle, I also swore off Sony.
However, I can't swear off a company forever if they seem to have improved. And Sony have somewhat improved. They're still not perfect - their motion picture side is as nutty as the rest of the industry - but they seem to have given their consumer electronics side a *little* more autonomy, and are less and less in the "everything in a Sony stack" mode. (Memory Stick? No thanks!)
I did some basic research on the Z1, checking for issues with DRM etc - nothing came up.
I think that's one area that Android has had some success in which people overlook - the Android platform is so much richer than a desktop platform in the APIs it provides, and pretty much stops some forms of lock in.
So for example you could go off and write your own music library code, but there's not much of a point - other apps want access to the music library as well. If you go your own way, then you're just going to have people complain that their exercise app or game can't see playlists.
So I do feel a little protected against Classic Sony Dickery by Android...
And Sony seem to be leading the pack in build quality and specs at the moment, which is why they're getting my money. That having been said, I'm sure I saw something about a new HTC One model with an SD card slot. That could complicate matters! ;-)
As a Samsung Galaxy S3 owner...
I'm indifferent about the S4.
I was going to upgrade to it, but then Sony launched the Z1.
That doesn't mean that the S4 is bad. Only it and the Z1 are even in the running, as I want a microSD card slot on my phone. Waterproofing just nudged the Z1 into the lead.
But most of all, I just fancied a change. My last phone was HTC, my current one is Samsung, my next looks like it'll be a Sony. After that, who knows? If LG or Motorola launches a phone that meets my requirements, I'd happily look at those too.
Why would I have brand loyalty to manufacturers when I'm on Android? Whoever makes the best phone for my needs will get my money. I happen to need different things to many other people, so I have a more limited choice.
I suspect Samsung are confused right now. In brand terms, they want to be more like Apple. In market terms, they're not and never will be.
Re: And now the world waits...
I agree with you - it'll never happen.
Or at least, if it does, it'll be a huge change in Microsoft.
Every dumb decision, every stupid delay, every failure to create a new market that Microsoft has had for the past decade - they're all about extending and milking their existing Windows/Office monopolies.
So when I see small business owners I know wonder whether they should make do with a Chromebook or splash out on a Macbook Air - because in their own words "I use Google Apps and it's fine" - I can't help but see the future.
Enterprises, conservative as they are, will continue to insist on 100% Microsoft, because Microsoft is the new IBM. Nobody ever got fired for buying...
Meanwhile, in the rest of the world, their mindshare and their usage stats decline by the day.
iWork being free is a significant milestone in that. I'm not completely sure that in the future people will look back on this and say "that's when Office really became irrelevant"... But I'm not sure I'd like the odds on a bet against it.
To bastardise Pink Floyd, "All in all, it's just another chip in the wall..."
Meanwhile, Microsoft will continue to fluff their own decisions and cripple their own products to preserve a doomed pair of monopolies.
Much like IBM before them.
Close, but no banana.
You're so tantalisingly close to the truth, yet haven't quite got it.
Your argument, after all, can also apply to future versions of Windows/Mac OS X. Should their creators ever be so stupid as to drastically change the interface, that is.
(Silence at the back! Wait 'till I've finished, as that's not my point!...)
The truth is that familiar interfaces are nice, but as you indicate with your tablet example, people are always willing to change if the benefits are clear.
What people *really* want is functionality.
After all, how many Mac users have a copy of Office? By many people's logic, they should. Office is irreplaceable, and LibreOffice/OpenOffice just aren't good enough!
Yet I see more Macs with iWork than Office on them...
Most PCs come with a trial version of Office these days. And most people don't want to pay 100 bucks for it at the end of the trial. Show them LibreOffice/OpenOffice with 100 of their bucks in one hand, and a copy of Office with a receipt for 100 of their bucks in the other, and they soon decide how valuable Office really is to them.
And yes, I have seen people pick Office instead of the 100 bucks. Usually for business reasons. But then, I also see small businesses just using Google Apps instead.
And Google Apps shows the real direction for functionality these days. Web browsers just won't be reasonable old chaps and stop their pace of development. They insist on rudely and uncouthly becoming more and more capable. It's just not cricket!
Remember Microsoft Works? Not the standalone abomination, but the Word/Works/Autoroute/Other bundle from the early 2000s? I knew people who loved that. Mostly for Word and Autoroute - that made their computer useful to them. These days a browser with bookmarks to Google Docs/Office 365 and Google Maps/AA's website/RAC's website could do the same. For free.
What software do people really want from their computer so that they can do what they want to? I'd venture the list looks a little like this:
1. A web browser.
1a. Email. (Maybe via browser, maybe via IMAP/POP3. See 1.)
2. Basic Word processing.
3. Plays music. (Maybe via a web browser? See 1.)
4. Plays video. (Maybe via a web browser? See 1.)
5. Allows basic photo management/editing/sharing.
6. Using a spreadsheet as an overgrown table creator, and, in a very small minority of cases, using some basic formulas - but I wouldn't hold your breath waiting for that last bit to happen.
7. Abusing DTP to create godawful invitations to equally godawful events, and brain-melting newsletters about said godawful events - newsletters that are so bad your hamster will bite you for forcing it to **** on their shredded remains.
8. Porn (see 4).
I hesitated before putting 6 and 7 on. Many people will never actually use a spreadsheet, despite being aware it's there. Most home DTP functions are basically templated, and could probably be done in most word processor packages anyway.
The basics of everything on that list can be done in a browser if necessary. And other things I've probably overlooked.
So Microsoft's biggest fear is that customers - or on their behalf, OEMs - begin to ask why Office costs them 100 bucks when they use it so little. Or why Windows adds 20 to 30 bucks to the cost of a device, when all they're going to do is use a web browser that's free anyway.
At that point, Linux looks much more attractive to OEMs, and I'm sure many of them have been hinting at that since the netbook era.
None of this affects Apple as much. They're a luxury brand. It actually helps Apple, in a way - knowing that you don't need Office but just need to write simple documents occasionally makes a purchase of iWork easier.
The tablet/phone markets are a little different, partly due to a richer platform experience due to well thought out APIs and partly due to a much lower cost per app, which distorts things a little.
But sticking to the fundamental "how do I do this?" rather than "how do I run XYZ?" shows much the same results - lots of movement towards web-based services. (A shocking number of apps seem to be little more than wrappers for mobile web sites!)
When you wrote "Linux is not an OS for most users, and will never be", you could have put Windows 8 in there. Or even Mac OS X. And you'd still be correct.
Because whilst a Linux distribution comes with plenty of software - free, curated, easily installable - all it needs to satisfy 90% of the needs of 90% of consumers is a web browser.
Most users don't need an OS, except as a bootstrap to a web browser. To them, Windows is what DOS was for the Windows 95 developers - a handy way to start the ball rolling after the power button is pressed. A minor step on the way to the final destination.
Sure, some older folks are stuck in their ways. (And many more aren't.) But by the time today's kids can afford to buy a copy of Windows or Office with their own wage packet, they'll already have gone through multiple versions or alternatives, and learned that they don't need it if they can run a web browser for free.
Welcome to the future. Feel free to bookmark it in case you need to visit again...
(Who am I kidding? Just type "the future" in your address bar, and it'll pop up! Remember using bookmarks instead of Google and local browser history? How quaint!)
Re: A question from a young'un of 31...
The immediate reason at the time was simple... Money.
Windows 95, shipping in August 1995, had a minimum requirement of 4MB of RAM, and a recommendation of 8MB.
Which everybody regarded as a joke. Sure, Windows 95 ran in 4MB. And you could even run Notepad or Clock, too! But if you ran both of them together, your hard disk began to glow red hot as the swapping kicked in.
Realistically, everyone recommended a minimum of 8MB, and you really wanted 12MB or 16MB.
For Windows NT? Version 3.51 was released in May 1995, and the minimum requirement was 12MB, with 16MB recommended. And again, everyone laughed at that - you wanted 16MB, preferably 24MB.
Why is this important? Well, back then in 1995, RAM was on sale at about 33 US dollars per MB. (I couldn't find a reliable GBP price, so we'll have to use USD - sorry!)
That's before any taxes, too.
Let that sink in. If you'd bought a 486SX machine in late 1994, and it had 4MB of RAM - fine for Windows 3.11! - then you were probably going to have to spend another 132 bucks just on RAM before you could spend your 90 bucks on the Windows 95 upgrade... Is your motherboard already full with four 1MB sticks? Not unusual. You now get to throw away the old 4MB, and spend ~270 bucks on four 2MB sticks.
Eager to upgrade to Windows NT 3.51? Well, just double the prices... And start weeping, presumably.
Over the following few years, demand for RAM drove prices down fairly quickly. But still, the requirements of Windows NT were always a little higher than Windows 95's, and it never quite lost that reputation until Windows XP came out. (And perhaps not even then!)
There are other hardware issues, too. Windows 95 brought us Plug & Play, but that didn't come to Windows NT until Windows 2000 shipped in late 1999. And Windows NT also had a different driver model. Most manufacturers targeted Windows 9x for driver development as it had a wider installed base, so you were both more limited in the hardware Windows NT could run on AND you sometimes had to fiddle with IRQs/memory addresses manually to get things working.
(Although in its favour, high end hardware like SCSI cards usually had much better Windows NT support and took a lot of the hassle out of hardware configuration anyway...)
Now on top of these costs and hardware issues, add on all the software compatibility issues that others have raised.
Windows NT was superb. Brilliant. Faced with a choice between Windows 98 and Windows NT 4.0, I jumped at Windows NT 4.0. But I was savvy enough to know how to select/handle my hardware, and how to tweak software to run under it.
It rewarded me with greater stability and reliability. But I wouldn't have recommended it to the average user on the street until Windows 2000 SP6.
On its release, I didn't like Windows XP's crayola-inspired interface tweaks, or paying for what was effectively Windows 2000 Service Pack 7. But with hindsight, Windows XP was the version of Windows we'd finally wanted - a great blend of the plug & play and software compatibility of Windows 9x, and all the stability of Windows NT/2000.
And by then, RAM was so cheap (by comparison) that it really wasn't an issue. :-)
Re: NewsBlur - a Better FOSS RSS Reader.
Less fussed about it being open source, but it has been a very pleasant experience.
The training was a really nice trick. I didn't bother at first, but once I did I found the Focus Mode great for grabbing the important things whilst on the train to work, when time can be limited.
Good points, but I would like to add more detail.
You're quite right that Apple have rarely truly innovated. Most of their innovations have been done by someone else before.
Apple's true advantage lay in two areas. The first was in attention to detail and a strong direction. Steve Jobs, once involved in a project, dominated it. Don't want Steve's idea of an email client? Tough. That's all Apple will give you. But it was, admittedly, above average. And that was enough.
Their second advantage lies in two distinct but related areas - marketing and journalists. Apple have great marketing, and that helps. But many journalists aren't actually technical - they're writers who are merely ten percent more technical than the average person. So they use Macs, both because Macs are prevalent in their industry (thanks to an early GUI and the availability of DTP software) and because they're personal Mac users themselves. Apple benefits from a very friendly Press.
So you're right that Apple's innovation was, in every case, not actually so. From MP3 players to capacitive screens to multi-touch, Apple has been as innovative as a block of granite. But they know how to do a demo, and their three-ring circus with Jobs as ringmaster was certainly good copy.
Apple are not dumb. They knew the iPod would be cannibalised by phones - remember the iTunes phone they did with Motorola before the iPhone? They have a good grasp of the market, and have proven themselves to be adept at pre-empting their competitors by entering the market early - witness phones and tablets.
Their weakness is in their uncritical belief in their own bullshit. There is no reason to have a tablet smaller than 10" - until eBook readers primed the market, and Google/Asus stormed it with the 7" Nexus. And Apple had to ship a smaller iPad.
There is no reason to have a larger phone, as your thumb can't reach that far. Until the Samsung Galaxy S3 and Note 2 sold like hot cakes, and Apple had to put a larger screen on their next iPhone.
Journalists are very happy today to forget that we had Apps on a Palm/Handspring Treo or Windows Mobile devices almost a decade ago, or that the original iPhone was web apps only despite not being 3G... But Apple's packaging and marketing make up for both history and shortcomings.
No one thing Apple has done has been truly innovative or ground-breaking. But they can more than make up for that with positive press coverage from poor "technology" journalists.
Their big problem now is that they charge more as a luxury brand, yet deliver less due to their obsessive control. Their competitors aren't necessarily better, but are at least both cheaper and more flexible.
Except even the most technologically illiterate journalists can't help but notice Apple losing their lead now.
Apple are in a corner - give up control and risk loss of profits and bad publicity, or carry on and hope to work on high margins as a luxury brand. I suspect they will continue as a luxury brand for as long as they can, before slowly sliding back into obscurity. They certainly show few signs of wanting to give up control, being fanatical about both what their devices can do and what they will allow themselves to take 30% of in their stores...
Re: Remote Control
Yes, PowerShell can connect to a remote machine and yes you can administer it.
But have you tried it for anything but Windows stuff? For example, Exchange Server? It's not very smooth. If I SSH to a Linux/BSD based MTA, I'm fine - every tool that's installed on that box is now available to me. If I run them, because I'm actually running on the remote machine, there are no issues. It all just happens over port 22, which is basically an encrypted text pipe. (I simplify, but not by much.)
If I connect to an Exchange Server via PowerShell, I need a port 80 connection with which to fetch the special PowerShell tools that Exchange needs. I then have to create a new session on that machine, and import that session into my current session.
(See http://technet.microsoft.com/en-us/library/dd297932.aspx for details.)
It's the same for SQL Server and other Microsoft server software.
You could view this as a security boon, albeit security by interface obfuscation. Personally, I view it as a bloody stupid way to work. The *NIX method is both more seamless and intuitive. PowerShell still has some way to really rival it.
I fall more towards the CLI camp
GUIs are nice, but they change. Because CLIs are often scripted, they become an API, and therefore don't change much over the years.
A CLI is also, perversely, easier to document - documenting a process for a GUI often results in 20 pages of screenshots, with no real detail about what you're actually DOING. Whereas a CLI document is often much shorter, so people feel they should pad out the document with a little explanation...
Sometimes, a GUI is superior - as others have pointed out, some selections can be hard to do with CLIs. Although if we had tools for the CLI like 4DOS/4NT/Take Command's SELECT, we'd be laughing there too.
What I really wish is that each played to their strengths more. I've seen too many GUIs that should have had some decent reporting or status monitoring, but were instead just bunches of buttons and checkboxes.
Conversely, I've seen CLI administered programs that were pretty poor, with minimal scripting (required input during the program) and more of an "edit the config file" attitude than actually providing a tool to make the change.
There is no panacea. Which is lucky, as it keeps us all employed...
Novell's allegations are well documented - basically, MS had a bunch of APIs for Chicago (what shipped as Windows 95) and was encouraging their use by third party companies. They then pulled them in a late beta.
So effectively Microsoft suckered Novell into developing for something that they never intended to ship.
See http://www.groklaw.net/article.php?story=20111121214458515 for all the gory details. But beware - it'll take a long while to read!
Fax isn't a good indicator
Disclaimer: I work as a messaging systems engineer - email, fax, IM.
Fax isn't as great an indicator of "backwards" as you'd like to think.
The plain fact is that a fax is a legally understood medium in every jurisdiction that matters. Email isn't.
So if you send an offer, a contract, a complaint or other documents by fax internationally then it's usually as legally binding as the real thing via the post. Those court battles were fought years ago, and won in favour of the fax.
The same cannot be said of email.
So when doing business, especially internationally, fax is often preferred. (Just yesterday, I spoke to someone whose small business only has a fax machine for dealing with suppliers in Europe.)
In some cases, they may have to send documents by fax because it's required by law, or because lawyers (on either side) have advised it be used as a "belt and braces" measure.
This doesn't mean that Intel's paper isn't valid - just that it may have picked the wrong metrics.
Interesting to see that people don't know what the cloud is even if they're using it, though - the buzzword is overused, so I shouldn't be as surprised as I was.
It's not an IT issue.
This is a cultural issue, not a technical one.
Basically, if you're doing things properly, you shouldn't need the emails.
Most of the documents people are looking for should be stored somewhere else other than email. A document management system, a network share - whatever works for that team/group/organisation.
But email is attached to a person. Just because Bob closed the Acme sale, should you have to keep Bob's email forever? Even after he's left? No, the documentation for that sale - the terms, the contract, etc. - should be somewhere that ISN'T BOB'S MAILBOX.
However, people are lazy feckless ****holes who just don't get this, so we end up having to rummage through their crazy personal filing system to find a document that should have been stored somewhere properly.
The best, easiest way to reduce your email storage costs - for both compliance purposes and otherwise - is to follow three simple steps:
1. Make it easy to get stuff out of email and to somewhere secure, shared and useful.
2. Have low quotas.
3. Single-instance on commit to the compliance archive, to reduce storage costs.
If, for legal purposes, you have to keep every message for n years, your storage costs are always going to be high - because you'll be grabbing everything at the router, rather than relying upon backups from the user mailbox.
From-the-router style compliance archiving, even with single instancing, is ruinously expensive - and you should probably look at systems like Centera for the belt and braces you'll need.
For anything less, the solution is simple - get the stuff out of email. Make it gross misconduct (a sackable offence) for employees not to be doing that.
Your problem will then be document management / information management, but that's a nicer problem to have than "there's an email we need, might be one of these eight people, we think it was sent in 2008, why would you need an exact date?".
Make the cultural change, it'll save you loads of money.
Exchange did work that way (single-instanced storage), with a couple of caveats that I won't bother going into here.
Microsoft moved away from it, mostly due to performance issues. Single instancing means you have to "garbage collect" your mail stores, to remove the mails that have now been deleted by all owners. It also means creating new copies transparently (from the user's POV) when a user edits the email somehow, which they often manage to do accidentally.
Basically, it costs in performance terms and makes the DB messy, and fast large disks are now cheap. So Microsoft decided to ditch shared storage and go all out for speed. I think that was with Exchange 2007, so any recent Exchange implementation/upgrade will lack SIS.
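The trade-off is easy to see in a toy model. This is a sketch of the general single-instancing technique only, not of Exchange's actual store format - the class and method names are invented for illustration. One body is stored per unique message, reference-counted across mailboxes; deletes just drop a reference, and a separate garbage-collection sweep is needed to reclaim space, which is exactly the extra housekeeping described above.

```python
import hashlib


class SingleInstanceStore:
    """Toy single-instance message store: one copy of each unique body,
    reference-counted by the mailbox copies that point at it."""

    def __init__(self):
        self.bodies = {}  # digest -> message body (stored once)
        self.refs = {}    # digest -> number of mailbox copies

    def deliver(self, body: bytes) -> str:
        """Deliver a message to one recipient; dedupe by content hash."""
        digest = hashlib.sha256(body).hexdigest()
        if digest not in self.bodies:
            self.bodies[digest] = body  # stored once, however many recipients
        self.refs[digest] = self.refs.get(digest, 0) + 1
        return digest

    def delete(self, digest: str) -> None:
        """A mailbox deletes its copy; the shared body may linger until GC."""
        self.refs[digest] -= 1

    def garbage_collect(self) -> int:
        """Sweep bodies no mailbox references any more; return count freed."""
        dead = [d for d, n in self.refs.items() if n == 0]
        for d in dead:
            del self.bodies[d]
            del self.refs[d]
        return len(dead)
```

Deliver the same mail to two recipients and only one body is stored; once both delete it, the space is only reclaimed when the GC sweep runs - hence the performance cost that pushed Microsoft towards plain per-mailbox storage on cheap disks.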
What they need to do is one of the tracks from the Total Annihilation soundtrack. That was a superb bit of work, which stands out to this day.
Consider yourself corrected, or at least challenged to prove your point!
I've seen this argument time and time again, and I always ask for the same thing: Proof.
There are specialised h.264 decoding parts. They're usually in TVs and the like, because there you don't want to have to put too complex a software system in them.
But when people say "hardware acceleration", they usually think something along the lines of "the processor coordinates data transfer via DMA or some other bus to a special chip which decodes the video and puts it directly onto the screen".
Yep, those special chips in dumb devices like a TV do that, and do it at very low power and heat output.
In a phone or on a laptop? There is no block of hardware dedicated to h.264 in that manner. That would be nuts, because it restricts you.
Instead, there are blocks of specialised computation that aren't much different to MMX, SSE, and so forth. That's what people are talking about when they talk hardware acceleration on a more complex device.
Think about it - otherwise, the iPhone/Android "h.264 chip" would need to be connected directly to the orientation sensor, and would be doing the animation AND resizing when you turn the device from one orientation to another. That's one heck of a complex bit of hardware when compared to the original vision of "chip which does video".
Basically, if the h.264 decoder uses them, then so can WebM. It's just a matter of doing so. Which has already been done for the most part - some of the first patches I heard of to the decoder were ARM assembler versions to improve speed, for example.
Hardware acceleration isn't an issue unless you have a device you can't get a software decoder update for. And the device manufacturers & developers have pretty much sorted that. (Although I wouldn't hold my breath waiting for Apple to join in!)
There are still dedicated "dumb" hardware decoders out there, in camcorders and TVs. But for the use cases you mentioned (desktops, laptops, phones) WebM can be accelerated, and without much hassle.
It's down to the willingness of the vendor, and most seem willing. Check the WebM wikipedia article for a nice list...
Of course, I could be wrong here. If you know otherwise, then I want proof. I want a spec sheet(s) that show that a common GPU has a dedicated in-silicon decoder of a dumb nature, which could not be reprogrammed to do a new size/orientation/output destination or be partially used for WebM decoding.
Without wishing to sound snotty, that places the ball in your court. I've put forth my understanding, and you now have to prove me wrong. Which I would welcome, by the way.
I've been looking for that magic spec sheet since WebM was first announced, and haven't found it yet. Nobody has presented it to me, despite numerous challenges to do so. I'm not yet tempted to call the hardware acceleration argument total balls, but I'm pretty close to it!
Is the voicemail stored on a computer system?
If the voicemails are stored on a computer system, then access by anyone without authorisation is illegal.
The Computer Misuse Act 1990 clearly states that unauthorised access is itself a crime, even if you modify no data.
At no point does the law state how the computer system is to be accessed. Keyboard and mouse, Kinect, phone and number pad, phone and voice recognition, direct neural command - it's irrelevant.
If you're not the owner of the computer or don't have permission from the owner to access it, then accessing it is breaking the law.
But the obvious question is...
It's all very well, but does it have the articles?
They're the only thing I ever read it for, honest!
Backup? Um... No.
I'm not sure it's in any way accurate to call Ubuntu One usable for backup.
Ubuntu One synchronises multiple PCs, and synchronises your content into the cloud for access anywhere. But if you delete or overwrite something accidentally on your machine whilst it's connected to the internet, then a minute or so later it's toast on Ubuntu One as well. That's hardly a good backup solution...
It seems to me that you've fallen into what I like to call the "RAID trap" - people buying RAID arrays often mistake RAID 1 or RAID 5 for backup, when in fact it's merely providing redundancy against hardware failure.
The bottom line is that if your laptop died, Ubuntu One would keep your data for you. But it can't protect you against your own stupidity...
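The distinction is easy to demonstrate in miniature. The classes below are invented purely for illustration - this is not how Ubuntu One or any real service is implemented - but they show the "RAID trap" in two dozen lines: a mirror-style sync faithfully replicates your mistake, while a versioned backup lets you step back to before you made it.

```python
import copy


class MirrorSync:
    """Toy sync service: the remote copy always mirrors the local state."""

    def __init__(self):
        self.remote = {}

    def sync(self, local: dict) -> None:
        self.remote = copy.deepcopy(local)  # deletions propagate too


class VersionedBackup:
    """Toy backup service: every run appends a snapshot you can restore."""

    def __init__(self):
        self.snapshots = []

    def sync(self, local: dict) -> None:
        self.snapshots.append(copy.deepcopy(local))

    def restore(self, generations_back: int = 0) -> dict:
        """Fetch an earlier snapshot; 0 = latest, 1 = one run before, etc."""
        return self.snapshots[-1 - generations_back]
```

Delete a file locally and run both again: the mirror has lost it forever, while the backup can hand you back the previous generation.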
(And a clarification - I'm not railing against Ubuntu One, and use it myself for syncing machines - but I wouldn't use it for backup! But it's a good sync service and it's looking promising in terms of projects using it - for bookmark sync, data sync, configuration sync, etc., which is why I'm a paying customer of Ubuntu One.)
Dropbox is slightly better for backup - it holds previous versions for about 7 or 14 days (I don't recall which, exactly) and they offer an option for subscribers called "Packrat" which keeps previous versions of files indefinitely.
In the end, I picked SpiderOak for backup. They seem to be the most secure, and give me 100GB for 100 dollars a year, which is a pretty good price. It has multiple versions, and is cross-platform with Win, Mac and Linux clients. I was surprised not to see it mentioned here.
For local or local-network backups, check out the rdiff-backup package. Rsync and diff, combining to give space-efficient backups - even to SSH hosts across the network. I've had no problems with it.
I've had a completely different experience.
Personally, I prefer the UI. I generally find it cleaner and less cluttered than the Windows UI, and often a lot more logical.
Of course, I'm really talking about Gnome rather than Ubuntu. Even Kubuntu can't make me like KDE, which seems to be aping Windows a little too much, and clutters itself up accordingly.
As for installing non-repo software... No idea what you're on about. I've never had trouble with non-repo software. Wanted Opera. Downloaded it and double-clicked it. Up pops a nice box saying what the software is. Click on the Install button, type in my password, and a few seconds later I'm done.
Same story with Bibble 4 Pro and the upgrade to Bibble 5.
Extensive repositories mean I've not needed to install much software from outside them, but if there's a .deb file for it then it's incredibly easy.
The only times it's not easy are when, like VMware Professional, the software comes as a bundle file. Then I have to drop to the CLI, as you describe. This is not a failing of Ubuntu. Two much smaller companies got it right; VMware haven't yet.
As for not running existing software - I've found there's very little I need to run in Windows. Linux in general has usable equivalents for all of my needs. Not all of them are free - some I've bought. But they're out there, and seem to be increasing in number.
These days on Windows it's games and that's it. And if Steam comes to Linux and brings Team Fortress 2 with it, then that'll remove 75% of my Windows needs...
Christo - stand up, your voice was muffled by your chair, dear lad.
I've never seen an environment where administrator privileges are required to run Notes. It'll run quite happily with user privileges, unless you've mucked about with the Windows security.
That's based on 11 years' experience with the product, running on every version of Windows except ME (did anyone use that?) and Vista, by the way.
Now, with Vista, you may have a point. The large number of security changes and the fact that Vista shipped late into Notes 6.5's lifecycle mean that it's not going to be supported by IBM. So that's hardly a fair point, but a valid one I suppose.
Vista is supported by Notes 7.0.2 and higher, including the newly released 8.0.
What I suspect you've done is installed Notes in single-user mode, where it will require administrative access to the data directory you selected when you installed it. (NOT to the whole machine.)
Users tend not to have those rights, and therefore you get errors.
That's very easy to fix, and frankly not exactly hard for a networking and security wizard like yourself to have determined with some simple testing. Or a search of the IBM knowledgebase. Both of which I'm sure you were too busy for.
Personally, I'd instead look at installing Notes in multi-user mode. Then it'll put a subset of the data directory into their Windows profile, bypassing the problem entirely.
Either way, good luck fixing it.