Re: Obvious really...
Of course, the graphic novel, not the film!
I know that this is off topic, but the AC@13:13 made me think.
What we need now IMHO is a lightweight fast browser, without all of the historical cruft.
... wait a minute...
Wasn't that the primary reason Firefox was introduced back then, as a lightweight response to the cruft of Netscape Communicator?
If they had said it is water resistant when it wasn't, and had done nothing, they would have been subject to a Trading Standards enquiry about false claims in the UK. That would have been damaging to the brand.
I started buying second-hand Thinkpads about 14 years ago, starting with a 365XD, which had a 100MHz Pentium and a 1.2GB disk. Since then, I've bought 360, T20, T23, A30 and T30 systems (and T42/43s for friends and family), and have used a T60 and a T400 as work laptops (the T400 is my current work machine).
I recently dumped two 365s and a 360D, as I could not think of any good uses for them. Apart from one broken screen (someone jumped on it - the 365s had plastic covers), I think that they all still worked. All of the T series machines still do, and the A30 is running the firewall for the house. All of the T20 and later systems still work (although there have been some remedial fixes for T30s, it's true).
I think that the T20 to T23 systems were the nicest to use (although they are far too slow nowadays for anything other than basic Web use, and even then you have to block Flash). They were robust and compact, had optical drives (the X series Thinkpads generally don't), had a keyboard that was not recessed like the newer ones, and just felt good to use. My T30, which is still my personal (rather than work) primary laptop, was a real downward step: larger, heavier and, it turns out, afflicted by a design flaw in the memory sockets. The T40 and later series use low-profile optical drives that can be difficult to source, and recessed keyboards that don't feel quite right. And the very latest T530s have 'island' keyboards in a new layout which is just wrong. And of course, all have my pet hate: a 16:9 or 16:10 'wide' aspect ratio screen.
I have to do something about my T30 before it completely dies, but I have now discovered that the Pentium M processors used in the T4x series do not advertise PAE support, and current Ubuntu releases won't run on them without some messing around. It looks like I will have to go for a T60, and live with the fact that I won't be able to swap my IDE drive into the new machine.
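For anyone checking a prospective machine, the quickest test I know is to look for the flag in /proc/cpuinfo (a one-liner of my own, not anything official; some Pentium Ms can apparently do PAE without advertising it, which is where the messing around comes in):

    # Does the CPU advertise PAE? -w avoids matching longer flag names.
    grep -qw pae /proc/cpuinfo && echo "PAE advertised" || echo "no PAE flag"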
Well, what can I say? If you didn't care, you wouldn't have commented. If you don't want to hear other people's views, don't read the comments.
I know that my past usage of UIs is of no real relevance, but I was using it to counter the "helpdesk monkey" label from the original AC. I know I play the "I've been around a long time" card too much in these forums - it's a personality flaw of mine - but I find it difficult to counter an accusation of no breadth of experience without providing some background, and boy, do I have a lot.
I know, and I have said several times, that Unity will suit some (maybe a lot of) people, but it does not suit me. I know that it's a personal choice, but I get just as annoyed by people who say "Unity is THE way forward for everyone" when it's clearly not. I try never to push my views on other people; I just comment on the way I feel. And I feel that Canonical are doing exactly what I try not to do with their push to Unity. If they had still installed Gnome by default, and given you the option of switching (something that you can do yourself, I know), I would be far less critical. After all, it's not as if modern systems are short of disk space, and much of Gnome is already installed anyway. But they didn't.
Having switched distros several times since I first used Linux (damn, there I go again), I know that I can do so again, so I have no problems there.
But Ubuntu was, and still is, a distro with a large and well maintained repository. This is one of the reasons I liked it, and I have suggested in the past to anybody and everybody that they try it: Ubuntu could have been the dominant distro, something that is needed to achieve the critical mass required to get Linux accepted. But I will probably never again suggest that someone looking for a different experience install a current Ubuntu, because I will either not be using it myself, or will be using a different GUI. Maybe that is a loss to the Ubuntu community, maybe not. I have no pretensions about being influential in my professional life, let alone my personal one (I still work with real UNIX most of the time - Linux is just what I use at home).
I think it is you that needs to take a chill pill. 1995 was a long time ago, when Windows 95 was new or still in beta. UNIX-like OSes were using CDE or Motif, neither of which resembles Gnome 2, or simple window managers like FVWM.
But you appear to be advocating change for change's sake, merely to keep a product moving. Yes, there are people who will like Unity. And there will be people (like myself) who would like to keep a patched, working OS going with as little change as possible, because change costs time and money, and I have too little of either free to adjust the way I work. Is it too much to ask for a system that will still be patched (even LTS releases stop being patched eventually) without having to alter one's habits?
I have tried every Ubuntu since Unity was released, and I still find that it does not do what I need it to do without serious effort. I generally run with the desktop completely covered by many windows, so anything put on it is no good to me at all. I can use the Dash from the side panel, but it seems to take longer to find things than static menus do. I cannot customise the side panel as easily as I could Gnome's (maybe a learning-curve issue, but I think not), and applications opening full screen is unnecessary. Mixing mouse and keyboard actions in a single operation is inefficient, and relying on keyboard shortcuts (even for an emacs user) leads to complexity that takes time to learn.
I only have so much learning bandwidth. Much of it is taken up by my quite challenging work, changing phones, adopting a tablet, and learning whatever new consumer device has ended up with a complex UI. There will be the inevitable application changes with 12.04 (an LTS release - see above), and the last thing I need is to learn a new desktop UI as well. This will be true even if I switch to Gnome 3 after installing Precise Pangolin.
Oh, and by the way, I am a valued and well paid (comments about cost notwithstanding) IT professional supporting complex HPC systems, not, as you put it, a 'helpdesk monkey'. I have been a strong Ubuntu advocate since Dapper Drake, but have nearly reached the end of what I can put up with. I cannot tell who you are because of your use of AC, but I strongly suspect that I was working with UNIX systems and GUIs (such as SunView, the AT&T Blit, Layers), early Windows, GEM, Arthur, and even the Lisa a little, while you were still in nappies!
I'm not arguing there.
If you look at your Linux history, you will find that Red Hat 9 was made available before either CentOS or Scientific Linux existed. I have been using Ubuntu since Dapper (from about when Red Hat stopped patching RH9), and stick to the LTS releases, as the normal Ubuntu releases and Fedora move too fast for my liking. My day-to-day systems are there to be used, and the less time I need to spend maintaining them, the better.
I have been using Linux since RH 4.1, and UNIX a lot longer than that, so I do know my way around.
And if you read what I said, I got something out of it other than the knowledge that I was helping Red Hat: I did not have to download the ISO images over a modem...
I actually have little sympathy for the "must be completely free of cost" Linux brigade. If someone wants to create software with real value and wishes to get some remuneration for the work, let them. Provided that they abide by the GPL and/or LGPL, Linux should be a platform that they can use.
I am getting a little jaded with the Open Source model. We are now getting to the point where very useful parts of the Open Source landscape are becoming abandonware, with the original maintainer moving on to other things and no-one else picking up the baton. Other projects go the other way and end up managed by large groups, who then argue about direction, causing fragmentation. Gnome and KDE, and even the XFree86/Xorg groups, have been guilty of this in the past.
What it means is that there is no continuity, with new releases of distributions changing the default tools. For example, for ripping CDs I've been through so many different tools, all of which work in different ways, store the tracks in different directories, tag the files differently, use different profiles for the back-end encoders, and generally leave their detritus all over the dot-files in my home directory (I keep my home directory when I upgrade distributions). I would have been absolutely happy sticking with Grip, which was small, efficient, and suited me perfectly. I could have asked to become the maintainer, but I have too little time to do what I need to already, let alone take on a software project.
Unity is a personal thing. I've noticed that younger people are often happy running one application at a time, full screen. These people can cope with Unity quite well.
For everybody else, especially those used to *lots* of windows, Unity is a distraction: not only is the change unwelcome, but the way of working is foreign.
I know that you can set up multiple windows, and 'pane' them, but the way of controlling the positioning is not as easy as in Gnome.
On my Eee PC 701, I found Unity unusable, mainly because it did not like the small screen, but also because the lack of graphics power led to severe lag in updating the Dash.
Yes, but only if they also had an option to not install Unity in the first place as well!
I actually have no problem in paying a reasonable amount for useful software, but the definition of reasonable is flexible. I would not mind paying, say, a tenner per download of Ubuntu, especially if it allowed Canonical to include licenses for patent-encumbered components (think H.264, which is free for personal use, but needs a volume license for Canonical to install as part of the OS - it's a volume thing).
In the past, I did pay for an official Red Hat 9 (no, not Fedora or RHEL) boxed set, because I wanted to support Red Hat in their continuing efforts (and downloading six CDs over a V90 modem was going to take a while, not to mention finding the disk space to store the images and the effort of burning the disks).
What they are attempting to say in a clumsy way is that some ad-funded websites may disappear if they lose the higher payment for targeted click through advert referrals that such tracking may enable. This may reduce choice, or may cause some sites to become subscriber only.
Even without those controls, for most purposes my 8-year-old Thinkpad running Linux on a 2GHz Mobile Pentium 4 is fine, until I hit a site that has such adverts.
Unfortunately, there was nobody policing the names, so anybody, whether they were a company or not, was allowed to register a .co.uk address. I had a long email exchange with Nominet about what I regarded as a cyber-squatter, who had registered a name that matched a company I owned (beating me to it by a matter of hours, which was suspicious, as I had used a 'free' service to check it was available before I tried to register it myself), and who didn't use it, or even have a real name-server serving it, for several years.
Even though mine was a limited company, which was set up specifically to be clever and have synergy with a domain name, and the person who had registered the name I wanted did *not* represent a company and was not using it, Nominet would not allow me to start an appeal.
It is partly my fault for being slow in registering the name myself, but it was amazing that as soon as I made an attempt to check it was available (and it was), it suddenly became unavailable. Oh well. All ancient history now, as is my company (I got fed up with the bureaucracy of running a company in the UK).
LG is talking about patents.
Samsung is talking about trade secrets.
I wonder what happens if you obtain a trade secret from somebody and then file a patent on it. Because the original was a trade secret, the originator could not claim prior art, as there is a real possibility that someone else could have developed the technology in isolation. They would have to prove some form of industrial espionage and get the patent re-assigned, and I'm not even sure whether industrial espionage is a chargeable offence.
It is unlikely that a phone will implement full X11; more likely it will use Qt to provide abstraction from the graphics hardware.
If it does use Qt, there will definitely be some software available immediately, but not as much as is available across Linux as a whole.
It looks to me like they are probably using the Android system specifications as the target for this. If that is the case, there is a very good chance that it would work on many different handsets and possibly even tablets with fairly minimal changes. I like my Android tablet, but putting a real Linux on it would be much better.
...then you should be able to read the project write-up in the university's research library. Many will also sell copies if you ask them.
I would suggest that if it is a trade secret, there must be more in the product than was in the final year project, or else anybody would be able to see it. A trade secret that becomes known via a legitimate means then has no further protection in law.
From your post, I don't know which side of this argument you are actually on, but the things you have ignored in the proposed Luddite way of the world are that you would have to have a much smaller population, that people would have to leave the cities and return to the land (including working it), and that in many environments there is insufficient energy even to boil water for ale or safe drinking, and obtaining it would generate pollution. Open fires are significantly more polluting per joule of usable energy than anything we do for power at the moment.
On food, preserving with salt assumes you have a local source of salt, something that was not the case in most of Europe before modern transport. Jerking and smoking assume that you have heat sources (I know that you can use waste heat from your inefficient open fires) or good sunlight. And people knew the effects of food poisoning from first-hand experience in those days. Good thing or not, the bulk of the population was poorly nourished much of the time, and life expectancy was worse with no good medical services or medicines.
I'm sure it would be possible to design a way of working that was sustainable and local, but I would expect it to degenerate to the way people actually lived in the Middle Ages without some form of non-local engineering and manufacturing capability.
I think you will find (pun intended) that it's over 40 years!
It appears in the UNIX Edition 1 man pages, a scan of which is still available on Dennis' home page at Alcatel Lucent: http://cm.bell-labs.com/cm/cs/who/dmr/1stEdman.html. These man pages are dated 11/3/71 (probably in the American format), so November 1971. I first used it on UNIX Version/Edition 6 in 1978.
It's so old that, like dd, it does not completely adhere to the UNIX command arguments convention of having flags before arguments.
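To illustrate what I mean (examples of my own, not from the man pages):

    # The usual convention: flags first, then file arguments.
    ls -l /tmp
    # find takes its path operands first and its 'flags' afterwards...
    find /tmp -name '*.log' -print
    # ...and dd dispenses with dashes altogether, using key=value operands.
    dd if=/dev/zero of=/tmp/blank.img bs=1k count=1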
I do hope that Alcatel Lucent decide to keep Dennis' home page up as a homage to one of the Great People of IT.
Unless you are tied in some way to CDE by the environment you work in, I can see no reason to put it on to a Linux machine.
I never liked it, although it has some interesting capabilities for cross-system RPCs built into the window manager itself. Unfortunately, it felt like a bloated version of Motif, designed by committee, and it foundered because it was licensed software rather than freely available. I notice that CDE is now published under the LGPL, but apparently it still requires Motif or a workalike in order to be used.
To tell you the truth, I must look into downloading vtwm, the virtual-desktop version of TWM, which was about as lightweight as you could get! That was my preferred window manager on UNIX for many years.
When I first used Linux, I started out with FVWM, but it was not the same. I notice that there appears to be a project to keep it alive, so that's my project for tonight! If I can get it working, that could well be what I use to make Ubuntu 12.04 usable.
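For anyone wanting to try the same thing, this is the rough recipe I'll be following (untested as yet; the package name is assumed from the stock archive, and the old ~/.xsession route may vary with newer display managers):

    # FVWM is still packaged in the Ubuntu archive.
    sudo apt-get install fvwm
    # Point a custom session at it, then pick 'Xsession' at the login screen.
    echo "exec fvwm" > ~/.xsession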
You don't think that the fact that they were older technology, having been around for quite some time before the original iPhone or most touch-screen feature phones, might have something to do with why they were 'clunky'? I would have loved to see a Palm TX with a phone grafted on. That would have been a device that could have stood up to the original iPhone. The Treos were great, but they wasted too much space on the keyboard.
I've got some which do work to an extent. I'm using one now on an Android tablet. Let me enter something with Graffiti without correcting it.
Ths quick beown fx jumpes over the lazy dog.,
'he quick brown .iox imizs over the lazy dog.
Tite quick .browU fox juaes ovclr the laz= dog.
Now with a finger.
The quick brown fox jumpes over the lazy dog.
Hmmm. Capacitive stylus not so good. And that is one of the better ones I have.
You have your opinion, but I have to disagree. Styli were there because, at the time, not only were people used to using pens and pencils, but the necessary technology for capacitive screens could not be overlaid on a display at a price point and energy budget that made it suitable for hand-held devices. There was also the problem that, of technological and cost necessity, the screens were much smaller (my Treo had a 360x360 screen about 2" in diagonal), so on-screen buttons were small and you could not have used a finger for anything other than the broadest of selections.
I don't actually believe that there is a significant difference in UI design. I could scroll my Treo with a swipe of a suitable input device, but often it was just more convenient to use a scroll bar (it was the lack of processing and graphics power that made single large scrolls better than many small incremental ones, rather than the input method). And what is it that makes a finger a better pointing device than a stylus, apart from convenience? It's certainly not more accurate! And I find carrying around a polishing cloth all the time because of the grease marks a pain (I don't have oleophobic coatings on my devices).
I have not actually used a Galaxy Note 10.1, but I have used Wacom graphics tablets (apparently the same technology). There is no comparison between using a finger (which a Note can also accept) and using a device that lets you rest your hand on the screen, gives you total confidence that what the stylus is pointing to is accurate, and offers multiple levels of pressure to emphasize what you want to do. I suspect that you've never come across a situation where a pressure-sensitive input device is a real benefit.
On a Note, I would not use the stylus to play Angry Birds or select the next music track I wanted to listen to. But I may use it when browsing the Internet (too often do I select the wrong link with my index finger on my current tablet), and definitely would wherever I wanted to make notes, free-hand drawings or anything else that requires a high degree of accuracy.
WTF does using a finger or stylus have to do with a smartphone being a smartphone?
I assure you that in their time, Palm, Handspring, iPaq and many others were smart, and they used styli. They were smart because of what they could do, not how they did it. They still worked using rectangular icons arranged in a grid, with multiple pages, and apps that used gestures rather than key presses. They could install software. They could do media, games and productivity applications, and they could interact with the outside world.
My Treo functioned perfectly well with a stylus (conveniently tucked into a slot, always available), a pen top, or even a fingernail (I play classical guitar, so I have an advantage here), and if the on-screen buttons were designed properly, making them large enough, it would also work with a blunt finger. I used Graffiti all the time in place of an on-screen keyboard or even the keyboard buttons, so I never had to peck at a keyboard with a stylus or fingers.
I am currently trying to find a decent stylus for my capacitive-screened phone and tablet, because it is just so much more natural for someone who still writes with implements to use a stylus. I'm sure that there will come a time in the not too distant future when we will find people who have grown up without having to learn to write with a pen or pencil, who find that using a finger is more natural, but to date, everyone who has been through school will have learned to write in the traditional manner.
I look at the Galaxy Note 10.1 (not a phone, I know) with envious eyes, merely because of its stylus. It looks like the best of all worlds, but is too expensive for me.
Not always true. I have what looks like an Alibaba special, branded by a small European start-up, that is surprisingly good, and it's available for the same ball-park cost as the Nexus 7.
It's got a 1.2GHz Cortex-A8 processor, a 9.7" 4:3 IPS screen and an 8,000mAh battery that gives ~8 hours of continuous use. It runs ICS 4.0.4 and can use Google Play. Before the Nexus 7 was available, I would have said it was highly recommended. Even now, with Google selling the Nexus 7 at little or no profit, I would say that it, or one of its follow-up tablets, is worth a punt if you want something with a larger 4:3 aspect ratio screen.
And I can almost guarantee you that you have never heard of the company's name.
Since when is a UNIX-derived system (OSX has UNIX branding by virtue of passing the test suite) a "Linux clone"? Surely it's the other way round.
I do deliberately turn off the muting. As I listen to non-music stations while driving (mainly Radio 4 and Radio 4 Extra), I prefer to have some chance of hearing what is being said through the burbles, rather than having it chop the audio right at the punchline of a joke or a critical response to a well-asked question.
I agree with the OP when he says that FM degrades more gracefully. In a poor reception area, I can make more sense of a poor FM signal than I can of a DAB one.
I would prefer to hear it all, of course, and I find that DAB reception here in the heart of the West Country is diabolically bad, even close to the largest towns and cities in the region. Within 5 miles of Exeter, I can find completely dead spots where you cannot get DAB reception at all. That's not in the sticks; that should be like any suburban location.
On the subject of living out in the sticks, you can take a running jump. You are just jealous of the fresh air we breathe, the green spaces we have available on our doorsteps, and the spectacular sights that you have forgotten. Unlike the Internet where distance is a problem, radio is a medium that could and should be country wide.
I'd spotted that, but I put it down to a bit of spin. It may be a completely true statement, but it is completely meaningless. It's probably designed to make it look to shallow thinkers as if PWC know what is going on, when in fact what it shows to anybody with half a brain is that they clearly don't.
My Rover 400 (bubble) did not rust significantly in its 15 years of use (I pranged it on ice), and my dad's 75 still looked pristine after 8 years when he sold it. It's the BL years that were the worst, and I am quite surprised to see anything on the roads from that era nowadays.
What are you comparing to what?
PowerShell might have some advantages, but I can't imagine that you really mean that COMMAND.COM or CMD is better than bash or ksh (or even csh, if you are really talking about batch files).
For goodness sake, even the OS/2 shell was better than the standard Windows command processors.
Microsoft bought (or at least licensed) Insignia Solutions' SoftPC to allow non-native code execution. The original plan was for NT on platforms other than 386/486 systems to use this technology to run x86 binaries. The facility was called Windows-on-Windows.
This capability disappeared without a trace when MS pulled support from these other platforms.
You have a female carpenter? And one who works in a maid costume?
The true travesty is the Bacon McMuffin.
After a heavy team night out (they used to happen about twice a month), a colleague of mine used to bring in a big bag of them to work next morning and hand them out. I'll swear that most of the people must still have been drunk in order to eat them! Made worse by the Berocca that they also thought made them feel better.
Only if the idiot is authorized to do this in the sudo config. Unfortunately, many Linux distros automatically put the first user set up during installation into whatever group the sudo configuration allows.
It doesn't have to be this way!
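On a Debian-flavoured system you can see the whole arrangement in a couple of commands (paths may differ on other distros, and 'alice' is a hypothetical user):

    # Which group does the stock sudoers file empower?
    sudo grep '^%sudo' /etc/sudoers    # typically: %sudo ALL=(ALL:ALL) ALL
    # Who did the installer put in that group?
    getent group sudo
    # And taking a user back out again:
    sudo gpasswd -d alice sudo         # 'alice' is a hypothetical user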
I was going to say it must have something to do with diethylene glycol from the wine industry until I remembered that the Rhine does not flow through Austria........
Palm Treo. Able to buy and install applications from the internet before there was an Apple app store. Not every application could be purchased this way, but I bought a spreadsheet that I downloaded direct to the phone over the 2G data link.
Firstly, BeOS is *NOT* a UNIX variant. It was written from the ground up as a modular OS which was mostly POSIX compliant. True, it has Bash but....
Secondly, I think you must have meant that OSX is based on Mach (inherited from NeXT), which is a partly micro-kernel operating system with a BSD command set over the top. Again, it is not a UNIX variant, but it looks to all intents and purposes like UNIX because of the BSD command set (like Linux).
I'm really not sure what you mean by "their security models have been pretty flaky at times", because the model has not changed. The standard UNIX security model has not changed significantly in 40 years. It is amazing that it stands up at all against what is available in modern operating systems, let alone that it is regarded as more secure in some instances.
Whilst it is far from perfect, it is simple enough to be well understood by most people working with it, something that I don't believe is really true of some other operating systems. This means that it was, and is, used correctly. Also, remember that UNIX was not just multi-tasking from the word go, but also multi-user. It was a mature (although still developing) operating system when the IBM PC was launched.
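You can show someone the entire model in one directory listing (output typed from memory, so treat it as illustrative):

    # Three permission triplets - owner, group, other - plus the owner
    # and group names themselves.
    ls -l /etc/shadow
    # -rw-r----- 1 root shadow 1184 Sep  1 12:00 /etc/shadow
    # i.e. root may read and write, members of group 'shadow' may read,
    # and everyone else gets nothing.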
At the time of the original IBM PC, UNIX could and did run on 16-bit machines. You must remember that Xenix (which ran on IBM PC/XTs) was based on UNIX Version 7, and UNIX Version 7 ran on PDP-11s in as little as 128KB of memory. In fact, the architecture of non-I&D PDP-11s required the kernel to fit in less than 56KB of memory.
The biggest problem is that UNIX has always worked best on systems with hard disks. The basic tool set of UNIX (effectively the / and /usr filesystems) was around 2.5MB on a PDP-11 IIRC, so squeezing that onto 128KB floppies was an impossible task. That's not to say people didn't try: I saw several floppy-based implementations of UNIX around at the time, but they were generally slow and barely usable. Also, pipes working through floppies (early UNIXes stored pipe data in an unlinked file once it grew beyond one block) were incredibly slow.
There were small UNIX systems available at the time. AT&T had their 3B1, and people like Onyx, Tadpole and Torch (and many others) had mainly 68000-based UNIX systems available, albeit more expensive than a PC. And the interesting thing is that these contemporaries of the IBM PC were already 32-bit systems, not 16-bit like the Intel 8086.
Life really would have been better based around UNIX on PCs!
Sorry, my "where's the joke icon" was actually directed at b166er at 15:12, not at the reply about Douglas Adams.
I have always had great respect for Douglas, ever since I heard the radio series of The Hitchhiker's Guide to the Galaxy when it was first aired on Radio 4 in 1978. At the time, I was impressed that he was the first radio show writer to make good use of stereo to benefit the comedy content, but as time progressed, his detailed use of English ("...almost, but not quite, entirely unlike tea..." etc.) to make rational arguments out of clearly absurd situations was genius.
I was very upset when his refreshing view of the world was taken from us all. I can only hope that he is really sitting in a bar in the Domain of the King, enriching that world, wherever it is.
I'm sorry that my reply was misconstrued.
This was a joke, wasn't it?
I don't think RSTS had PIP, but I could be wrong. Certainly RSX-11M before version 4.0 only used MCR, so PIP was essential. DCL was added to RSX-11M as a port from IAS, which derived from RSX-11D.
Interestingly, it appears that RSX-11M was one of the first projects that Dave Cutler led.
It was only later versions of RT-11 that included DCL. Around the time that CP/M was written, it was all PIP etc.
who has also not had a drastic reduction in energy requirements per person on their agenda.
Now, I'm not saying that we should go down the rampant energy use path, or that efficient use of energy should not be promoted, but the people shouting here are almost certainly the hair-shirt brigade who want to suggest that we should reduce our energy footprint to that of a Kalahari nomadic tribesman.
So any project that suggests that we can keep our current energy use will be attacked from every possible angle.
.....it is the fact that if some were put up, Wikipedia could be dragged through the UK courts for something that their editorial model cannot control. It is that potential that prevents Wikipedia from operating in the UK.
Of course, I'm not saying that all the articles are squeaky-clean....
You're lucky they actually listened. When I tried to fix a problem with RBS's online banking a few years ago (it just kept refusing to allow me to log on even when using the correct credentials with someone watching over my shoulder to check I was doing it right), they just claimed that they did not support any OS other than Windows and OSX, and suggested I get another PC.
It turned out to be a bug in their code that put buttons off the screen and also mis-handled the return key as a form-completion action.
Eventually I did get put through to someone who knew a little about Linux (after having my access blocked and re-enabled at least three times), who was able to confirm that their login process was not working with Firefox on Linux. They even fixed it!
OK, not an Ultrabook, but my early 4GB Eee PC 701 is still going strong after 6 or 7 years of light use of ~5 hours a week (it used to be a lot more before I got my Android tablet). No cracked hinges or anything else. The only problem is that the battery life is pretty dire now, at less than an hour, but I'm sure I could fix that with a new battery.
The problem I have is that current Ubuntu releases are now too big and unwieldy (and graphics-heavy!) to squeeze onto its rather small internal flash. Must try Mint.
You've got to be careful nowadays about what counts as a connected call, and who generates the engaged tone.
It is now quite easy for a large company telephone exchange to take the call, and then forward it to an engaged phone within their system. Thus what you might hear as an engaged line may count as a connected call as far as your telco is concerned.
In addition, if you have an inclusive calls deal on your landline, what you will probably find is that the time you are on the 'phone talking to the other party is not charged minute-by-minute (as long as you keep it to less than 60 minutes), but there is a per-call connection charge, often around 12p. So don't believe the telco when they talk about free calls. I've actually noticed that some now don't call them 'free calls' but 'free minutes'.
The combination of forwarded engaged tone and per-call charge may end up costing you quite a bit of money.
I actually got a bit huffy with BT once when they started giving away free virtual answering machines which took messages when the line was engaged. I got a bill where one of my kids had repeatedly called a friend whose phone was engaged (probably because they were using a dial-up modem for internet access - it was some time ago). They got through to the answer service (a connected call) and hung up immediately. They then tried again two minutes later, and again two minutes after that. At the time, BT was charging a minimum 5p call charge, and on the bill there was a few weeks' worth of this, which clocked up about £20 of costs once VAT had been added. I realised that BT had produced a way of generating revenue from engaged phone lines!
Although this would not change my charges, I immediately asked to have call-minder turned off on my phone, so others would not suffer the same problem. I still do not use the virtual answer phone to this day.
Virgin are one of the companies that cache iPlayer and their own download content at various locations around the country. There are, AFAIK, some relatively clever transparent proxies in their network to deliver the data. This means that for certain streaming sites, the cost of sending the data is much lower (traffic is kept to the local infrastructure, not loading their backbone), and they can afford not to count that traffic towards bandwidth caps.
It's only some streaming sites. The rest of the traffic is counted normally, as I know to my cost as an ex Virgin ADSL customer, now happy with another ISP.
(....and on reflection, I use too many parentheses.)
Other trading funds - DVLA and Met Office, and many more.
A trading fund does not actually have to make a profit (a major difference between a trading fund and a shareholder-owned company); it is just distanced from ministers and civil servants to allow the organisation to operate more like a company (including borrowing money on the open market) and be less influenced by current political thinking.
And I think that any difference between their costs and their income (profit, or perhaps better described as a trading surplus) does not have to be returned to the Treasury, but can be invested internally. They have customers (which can be other government institutions - for example, the DVLA has contracts with the Police and VOSA), and all of the relationships with customers and suppliers are governed by commercial law, except in the few areas where the organisation is governed directly by legislation.
There are no shareholders, and little in the way of bonus culture, so it may quack a little, but there is also a woof and a miaow in there somewhere.
It depends on your perspective.
If you are the person planning the change in a hurry, or the management committing the resource to do the change, then an over-rigorous, time consuming process is the last thing you want to add to your work or costs, so you do your best to short-circuit the process to make the change happen faster and cost less.
If you are the risk manager, who is on the line if changes cause problems, then you want as much process around you as possible to protect your butt (and, to a lesser extent, the organisation you are working for), and then a bit more. You feel most secure if there is no change at all (though that's counter-productive).
If you are a diligent IT professional, then you want the *right* amount of change management to make sure that the change has been carefully considered, and has a good backout plan, but not so much as to make planning the change more difficult than it has to be.
It is this balance that is missing. You see it swinging from cost reduction to risk management according to the current trends in risk and management style and the most recent disaster. And nowadays it is always the people who understand it least who are dictating the processes.
If you are in a large organisation involved in change management, take a change and estimate how much the change costs in people and financial terms. Look at the time necessary to cross the 't's and dot the 'i's. Count the number of people involved. Look at the number of people who have to read and understand the change. Add up the people-hours spent sitting in the change board meetings.
You can often find in places like a bank that a change to switch servers from one DNS or time server to another (simple, but with potentially high risk and impact if it goes wrong), which may take only minutes to do, ends up costing dozens of man-hours (or even man-days), involving people on quite high salaries, and taking days or weeks to drive through the process. All of these things cost money, one way or another.
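As a back-of-the-envelope sketch (every number here is invented purely for illustration):

    # 8 people in a one-hour change board, plus 10 hours of write-up,
    # review and approvals, at a blended rate of £60/hour:
    echo $(( (8 * 1 + 10) * 60 ))    # => 1080 pounds of process around a five-minute change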
Apart from those rare systems that really do run Java in a sand-box, user files on *ANY* platform will be vulnerable to this type of attack. The OS, however, shouldn't be.
What is worrying in this article is the issue of it installing a rootkit on MacOS. I'm not sure whether I am talking about the same thing, but I define a rootkit as something that gains privileged access and then alters the OS start-up process so that it always has privileged components running that monitor whether the rootkit has been removed from the system disk, at which point they re-infect it.
The operative word here is "privileged". It implies that there is something that will cross the privilege barrier, which requires an OS security weakness or vulnerability. Of course, I could have the MacOS security model all wrong, but I thought MacOS was relatively robust. If it is a user-mode rootkit (is there such a thing? - a process kicked off in user-land during the user's start-up, but not running as a privileged user), then I might be able to understand it.
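For what it's worth, MacOS does have exactly that sort of unprivileged start-up hook (paths from memory, so verify before relying on them):

    # Per-user start-up items need no privilege at all:
    ls ~/Library/LaunchAgents     # launched as the user, at login
    # System-wide ones are the privileged kind:
    ls /Library/LaunchDaemons     # launched as root, at boot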