You don't think that the fact that they were older technology, having been around for quite some time before the original iPhone or most touch screen feature phones might have something to do with why they were 'clunky'? I would have loved to have seen a Palm TX with a phone grafted on. That would have been a device that could have stood up to the original iPhone. The Treos were great, but they wasted too much space on the keyboard.
Re: First sentence wound me up immediately
I've got some which do work to an extent. I'm using one now on an android tablet. Let me enter something with Graffiti without correcting it.
Ths quick beown fx jumpes over the lazy dog.,
'he quick brown .iox imizs over the lazy dog.
Tite quick .browU fox juaes ovclr the laz= dog.
Now with a finger.
The quick brown fox jumpes over the lazy dog.
Hmmm. Capacitive stylus not so good. And that is one of the better ones I have.
Re: First sentence wound me up immediately
You have your opinion, but I have to disagree. Styli were there because at the time, not only were people used to using pens and pencils, but also the necessary technology for capacitive screens could not be overlaid on a screen at a price point and energy budget that made it suitable for hand-held devices. There was also the problem that, of technological and cost necessity, the screens were much smaller (my Treo had a 360x360 screen that was about 2" in diagonal), so on-screen buttons were small and you would not have been able to use a finger for anything other than the broadest of selections.
I don't actually believe that there is a significant difference in UI design. I could scroll my Treo with a swipe of a suitable input device, but often it was just more convenient to use a scroll bar (it was the lack of processing and graphics power that made single large scrolls better than many small incremental scrolls, rather than the input method). And what is it that makes a finger a better pointing device than a stylus, apart from convenience? It's certainly not more accurate! And I find carrying around a polishing cloth all the time because of the grease marks a pain (I don't have oleophobic coatings on my devices).
I have not actually used a Galaxy Note 10.1, but I have used Wacom graphics tablets (apparently the same technology). There is no comparison between using a finger (which a Note can also do), and using a device that allows you to rest your hand on the screen, have total confidence that what the stylus is pointing to is accurate, and allows multiple levels of pressure to emphasize what you want to do. I suspect that you've never come across a situation where a pressure sensitive input device is a real benefit.
On a Note, I would not use the stylus to play Angry Birds or select the next music track I wanted to listen to. But I may use it when browsing the Internet (too often do I select the wrong link with my index finger on my current tablet), and definitely would wherever I wanted to make notes, free-hand drawings or anything else that requires a high degree of accuracy.
First sentence wound me up immediately
WTF does using a finger or stylus have to do with a smartphone being a smartphone?
I assure you that in their time, Palm, Handspring, iPaq and many others were smart, and used styli. They were smart because of what they could do, not how they did it. They still worked using rectangular icons arranged in a grid, with multiple pages and apps that used gestures rather than key presses. They could install software. They could do media, games, productivity applications, and they could interact with the outside world.
My Treo functioned perfectly well with a stylus (conveniently tucked into a slot, always available), a pen top, or even a finger nail (I play classical guitar so I have an advantage here), but if the on-screen buttons were designed properly to make them large enough, it would also work with a blunt finger. I used Graffiti all the time in place of an on-screen keyboard or even the keyboard buttons, so I never had to peck at a keyboard with a stylus or fingers.
I am currently trying to find a decent stylus for my capacitive-screened phone and tablet, because it is just so much more natural for someone who still writes with implements to use a stylus. I'm sure that there will come a time in the not too distant future when we will find people who have grown up without having to learn to write with a pen or pencil, who find that using a finger is more natural, but to date, everyone who has been through school will have learned to write in the traditional manner.
I look at the Galaxy Note 10.1 (not a phone, I know) with envious eyes, merely because of its stylus. Looks like the best of all worlds, but is too expensive for me.
Re: You get what you pay for
Not always true. I have what looks like an Alibaba special branded by a small European start-up that is surprisingly good, and it's available for the same ball-park cost as the Nexus 7.
It's got a 1.2 GHz Cortex-A8 processor, a 9.7" 4:3 IPS screen and an 8,000 mAh battery that gives ~8 hours of continuous use. It runs ICS 4.0.4, and can use Google Play. Before the Nexus 7 was available, I would have said that it came highly recommended. Even now, with Google selling the Nexus 7 at little or no profit, I would say that it or one of its follow-up tablets is worth a punt if you want something with a larger 4:3 aspect ratio screen.
And I can almost guarantee you that you have never heard of the company's name.
Re: PMSL :)
Since when is a UNIX derived system (OSX has UNIX branding by virtue of passing the test suite) a "Linux Clone"! Surely it's the other way round.
I do deliberately turn off the muting. As I listen to non-music stations while driving (mainly Radio 4 and Radio 4 Extra) I prefer to have some chance of hearing what is being said through the burbles rather than having it chop the audio right at the point of the punchline of a joke or a critical response to a well asked question.
I agree with the OP when he says that FM degrades more gracefully. In a poor reception area, I can make more sense of a poor FM signal than I can of a DAB one.
I would prefer to hear it all of course, and I find that DAB reception here in the heart of the Westcountry is diabolically bad, even close to the largest towns and cities in the region. Within 5 miles of Exeter, I can find completely dead spots where you cannot get DAB reception at all. That's not in the sticks, that should be like any suburban location.
On the subject of living out in the sticks, you can take a running jump. You are just jealous of the fresh air we breathe, the green spaces we have available on our doorsteps, and the spectacular sights that you have forgotten. Unlike the Internet where distance is a problem, radio is a medium that could and should be country wide.
Re: "PWC has said that as far as it is aware nothing is missing "
I'd spotted that, but I put it down to a bit of spin. It may be a completely true statement, but it is completely meaningless. It's probably designed to make it look to shallow thinkers as if PWC know what is going on, when in fact what it shows to anybody with half a brain is that they clearly don't.
Get over it
My Rover 400 (bubble) did not rust significantly in its 15 years of use (I pranged it on ice), and my Dad's 75 still looked pristine after 8 years when he sold it. It's the BL years that were the worst, and I am quite surprised to see anything on the roads from that era nowadays.
Re: Net Applications? @Dave 15
What are you comparing to what?
PowerShell might have some advantages, but I can't imagine that you really mean that Command or CMD is better than bash or ksh (or even csh, if you are really talking about batch files).
For goodness sake, even the OS/2 shell was better than the standard Windows command processors.
Re: Mac OS9 --> OSX @MacroRodent
Microsoft bought (or at least licensed) Insignia Solutions' SoftPC to allow non-native code execution. The original plan was for NT on platforms other than 386/486 systems to use this technology to run binaries on other platforms. The facility was called Windows-on-Windows.
This capability disappeared without a trace when MS pulled support from these other platforms.
Re: you bastards
The true travesty is the Bacon McMuffin.
After a heavy team night out (they used to happen about twice a month), a colleague of mine used to bring in a big bag of them to work next morning and hand them out. I'll swear that most of the people must still have been drunk in order to eat them! Made worse by the Berocca that they also thought made them feel better.
Re: Linux trojan is not news
Only if the idiot is authorized to do this in the sudo config. Unfortunately, many Linux distros automatically put the first user set up during the installation into whatever group the sudo config allows.
It doesn't have to be this way!
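To illustrate the point: on most distros, that risk boils down to membership of one group. A minimal sketch of how you might check (the group names are assumptions that vary by distribution: "sudo" on Debian/Ubuntu, "wheel" on the RHEL family):

```python
# Sketch: is this user in a group that a typical default sudoers file empowers?
# The group names below are assumptions and vary by distro; this only checks
# group membership, not the sudoers file itself.
import grp

SUDO_GROUPS = {"sudo", "wheel", "admin"}

def can_probably_sudo(username: str) -> bool:
    """True if the user appears in the member list of a sudo-capable group."""
    return any(
        g.gr_name in SUDO_GROUPS and username in g.gr_mem
        for g in grp.getgrall()
    )
```

Removing the first-created user from that group (e.g. `gpasswd -d user sudo`) is all it takes to move them back into the unprivileged category.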
Re: too easily shocked
Palm Treo. Able to buy and install applications from the internet before there was an Apple app store. Not every application could be purchased this way, but I bought a spreadsheet that I downloaded direct to the phone over the 2G data link.
Re: Really? BeOS?
Firstly, BeOS is *NOT* a UNIX variant. It was written from the ground up as a modular OS which was mostly POSIX compliant. True, it has Bash but....
Secondly, I think that you must have meant that OSX was based on Mach (inherited from NeXT), which is a partly micro-kernel operating system with a BSD command set over the top. Again it is not a UNIX variant, but looks to all intents and purposes like UNIX because of the BSD command set (like Linux).
Re: Where would we be now?
I'm really not sure what you mean by "their security models have been pretty flaky at times", because it has not changed. The standard UNIX security model has not changed significantly in 40 years. It is amazing that it stands up at all against what is available in modern operating systems, let alone that it is still regarded as more secure in some instances.
Whilst it is far from perfect, it is simple enough to be well understood by most people working with it, something that I don't believe is really true about some other operating systems. This means that it was and is used correctly. Also, remember that UNIX was not just multi-tasking from the word go, but also multi-user. It was a mature although developing operating system when the IBM PC was launched.
At the time of the original IBM PC, UNIX could and did run on 16 bit machines. You must remember that Xenix (which ran on IBM PC/XTs), was based on UNIX Version 7, and UNIX Version 7 ran on PDP11s in as little as 128KB of memory. In fact the architecture of non-I&D PDP11s required the kernel to fit in less than 56KB of memory.
The biggest problem is that UNIX has always worked best on systems with hard disks. The basic tool set of UNIX (effectively the / and /usr filesystems) was around 2.5MB on a PDP11 IIRC, so squeezing that down to 128KB disks was an impossible task. That's not to say people didn't try. I saw several floppy-based implementations of UNIX around at the time, but they were generally slow and barely usable. Also, pipes working through floppies (early UNIX versions used an unlinked file to store the pipe data over one block) were incredibly slow.
There were small UNIX systems available at the time. AT&T had their 3B1, and other people like Onyx, Tadpole and Torch (and many others) had mainly 68000 based UNIX systems available, albeit more expensive than a PC. And the interesting thing is that these contemporary systems to the IBM PC were already 32 bit systems, not 16 bits like the Intel 8086.
Life really would have been better based around UNIX on PCs!
I've yet to meet a green protester
who has also not had a drastic reduction in energy requirements per person on their agenda.
Now, I'm not saying that we should go down the rampant energy use path, and that efficient use of energy should not be promoted, but the people shouting here are almost certainly the hair-shirt brigade that want to suggest that we should reduce our energy footprint to be the same as a Kalahari nomadic tribesman.
So any project that suggests that we can keep our current energy use will be attacked from every possible angle.
Re: Where is the joke icon?
Sorry, my "where's the joke icon" was actually directed at b166er at 15:12, not at the reply about Douglas Adams.
I always had great respect for Douglas ever since I heard the radio series of the Hitchhiker's Guide to the Galaxy when it was first aired on Radio 4 in 1978. At the time, I was impressed that he was the first radio show writer to make good use of stereo to benefit the comedy content, but as time progressed, his detailed use of English ("...almost, but not quite, exactly unlike Tea..." etc.) to make rational arguments of clearly absurd situations was genius.
I was very upset when his refreshing view of the world was taken from us all. I can only hope that he is really sitting in a bar in the Domain of the King, enriching that world, wherever it is.
I'm sorry that my reply was misconstrued.
It's not that there is libelous material on Wikipedia....
.....it is the fact that if some were put up, Wikipedia could be dragged through the UK courts for something that their editorial model cannot control. It is that potential that prevents Wikipedia from operating in the UK.
Of course, I'm not saying that all the articles are squeaky-clean....
Where is the joke icon?
This was a joke, wasn't it?
Re: Memories of the once cutting edge.
I don't think RSTS had PIP, but I could be wrong. Certainly RSX-11M before version 4.0 only used MCR, so PIP was essential. DCL was added to RSX-11M as a port from IAS, which derived from RSX-11D.
Interestingly, it appears that RSX-11M was one of the first projects that Dave Cutler led.
Re: Memories of the once cutting edge.
It was only later versions of RT-11 that included DCL. Around the time that CP/M was written, it was all PIP etc.
You're lucky they actually listened. When I tried to fix a problem with RBS's online banking a few years ago (it just kept refusing to allow me to log on even when using the correct credentials with someone watching over my shoulder to check I was doing it right), they just claimed that they did not support any OS other than Windows and OSX, and suggested I get another PC.
Turned out to be a bug in their code causing buttons to be off the screen, and also mis-handling the return key as a form completion action.
Eventually I did get put through to someone who knew a little about Linux (after having the access blocked and enabled at least three times), who was able to confirm that their login process was not working with Firefox on Linux. They did eventually fix it!
OK, not an Ultrabook but my early 4GB EeePC 701 is still going strong after 6 or 7 years of light use of ~5 hours a week (it used to be a lot more before I got my Android Tablet). No cracked hinges or anything else. Only problem is the battery life is pretty dire now at less than an hour, but I'm sure I could fix that with a new battery.
The problem I have is that current Ubuntu releases are now too big and unwieldy (and graphics heavy!) to squeeze onto its rather small internal flash. Must try Mint.
You've got to be careful nowadays about what counts as a connected call, and who generates the engaged tone.
It is now quite easy for a large company telephone exchange to take the call, and then forward it to an engaged phone within their system. Thus what you might hear as an engaged line may count as a connected call as far as your telco is concerned.
In addition, if you have an inclusive calls deal on your landline, what you will probably find is that the time you are on the 'phone talking to the other party is not charged minute-by-minute (as long as you keep it to less than 60 minutes), but there is a per-call connection charge, often around 12p. So don't believe the telco when they talk about free calls. I've actually noticed that some now don't call them 'free calls' but 'free minutes'.
The combination of forwarded engaged tone and per-call charge may end up costing you quite a bit of money.
I actually got a bit huffy with BT once when they started giving free virtual answering machines which took messages when the line was engaged. I got a bill where one of my kids had repeatedly called a friend whose phone was engaged (probably because they were using a dial-up modem for internet access - it was some time ago). They got through to the answer service (a connected call) and hung up immediately. They then tried again two minutes later, and again two minutes after that. At the time BT was charging a minimum 5p call charge, and on the bill there were a few weeks' worth of this, which actually clocked up about £20 of costs once VAT had been added. I realised that BT had produced a way of generating revenue from engaged phone lines!
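To put rough numbers on how quickly that adds up (all figures here are illustrative assumptions, not taken from the actual bill):

```python
# Hypothetical figures: a 5p minimum per-call charge and 17.5% VAT
# (the rate for much of that period). The call count is an assumption
# chosen to show how quickly "engaged" redials accumulate.
MIN_CALL_CHARGE_PENCE = 5
VAT_RATE = 0.175

def redial_cost_pounds(calls: int) -> float:
    """Cost in pounds of redials that each hit the answering service
    and therefore count as connected calls."""
    return calls * MIN_CALL_CHARGE_PENCE * (1 + VAT_RATE) / 100

# Around 340 instantly-abandoned redials over a few weeks is enough
# to reach roughly 20 pounds.
```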
Although this would not change my charges, I immediately asked to have call-minder turned off on my phone, so others would not suffer the same problem. I still do not use the virtual answer phone to this day.
Re: So Virgin closely monitor your data?
Virgin are one of the companies that cache iPlayer and their own download content at various locations around the country. There are, AFAIK, some relatively clever transparent proxies in their network to deliver the data. This means that for certain streaming sites, the cost of sending the data is much less (traffic is kept to the local infrastructure, not loading their backbone), and they can afford to not count that traffic towards bandwidth caps.
It's only some streaming sites. The rest of the traffic is counted normally, as I know to my cost as an ex Virgin ADSL customer, now happy with another ISP.
Re: Mea culpa
(....and on reflection, I use too many parentheses.)
Re: Mea culpa
Other trading funds - DVLA and Met Office, and many more.
A trading fund does not actually have to make a profit (a major difference between a trading fund and a share-holder owned company), it is just distanced from ministers and civil servants to allow the organisation to operate more like a company (including borrowing money on the open market) and be less influenced by current political thinking.
And I think that any difference between their costs and their income (a profit, or perhaps better described as a trading surplus) does not have to be returned to the Treasury, but can be invested internally. They have customers (which can be other government institutions - for example, the DVLA has contracts with the Police and VOSA), and all of the relationships with customers and suppliers are governed by commercial law, except in the few areas where the organisation is covered by direct legislation.
There are no shareholders, and little in the way of bonus culture, so it may quack a little, but there is also a woof and a miaow in there somewhere.
Re: I'd like to hear more about
It depends on your perspective.
If you are the person planning the change in a hurry, or the management committing the resource to do the change, then an over-rigorous, time consuming process is the last thing you want to add to your work or costs, so you do your best to short-circuit the process to make the change happen faster and cost less.
If you are the risk manager, who is on the line if changes cause problems, then you want as much process around you as possible to protect your butt (and to a lesser extent, the organisation you are working for), and then a bit more. You would feel most secure if there were no change at all (but that's counter-productive).
If you are a diligent IT professional, then you want the *right* amount of change management to make sure that the change has been carefully considered, and has a good backout plan, but not so much as to make planning the change more difficult than it has to be.
It is this balance that is missing. You see it swinging from cost-reduction to risk management according to the current trends in risk and management style and the most recent disaster. And always nowadays, the people who understand it least are the ones dictating the processes.
If you are in a large organisation involved in change management, take a change and estimate how much the change costs in people and financial terms. Look at the time necessary to cross the 't's and dot the 'i's. Count the number of people involved. Look at the number of people who have to read and understand the change. Add up the people-hours spent sitting in the change board meetings.
You can often find in places like a bank that a change to switch servers from one DNS or time server to another (often simple but with a potential high risk and impact if it goes wrong), which may actually only take minutes to do, ends up costing you dozens of man-hours (or even man-days), involving people on quite high salaries, and many days or weeks to drive the process. All of these things cost money, one way or another.
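A back-of-envelope sketch of that arithmetic (every figure below is an assumption for illustration, not from any real change board):

```python
# Illustrative costing of a "ten-minute" change once the process is added.
ATTENDEES = 8              # people sitting in the change board meeting
MEETING_HOURS = 1.0        # length of the board meeting
PREP_REVIEW_HOURS = 12.0   # writing, reviewing and approving the record
HOURLY_RATE = 60.0         # assumed loaded cost per person-hour, in pounds
EXECUTION_MINUTES = 10     # the actual DNS/time-server edit itself

process_hours = ATTENDEES * MEETING_HOURS + PREP_REVIEW_HOURS
total_cost = (process_hours + EXECUTION_MINUTES / 60) * HOURLY_RATE
# With these assumptions, the ten-minute edit carries about 20 person-hours
# of process and costs on the order of 1,200 pounds.
```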
Re: You've never needed a password to install malware on a Mac
Apart from those rare systems that really do run Java in a sand-box, user files on *ANY* platform will be vulnerable to this type of attack. The OS, however, shouldn't.
What is worrying in this article is the issue of it installing a rootkit on MacOS. I'm not sure whether I am talking about the same thing, but I define a rootkit as something that gains privileged access, and then alters the OS start-up process so that it will have running privileged components that will monitor whether the rootkit is removed from the system disk, at which point it will re-infect it.
The operative word here is "privileged". It implies that there is something that will cross the privilege barrier, which requires an OS security weakness or vulnerability. Of course, I could have the MacOS security model all wrong, but I thought MacOS was relatively robust. If it is a user-mode rootkit (is there such a thing - a process kicked off in user-land during the user's start-up, but not running as a privileged user) then I might be able to understand it.
The very nature of the x86 architecture, with its requirement to remain backward compatible with its 16-bit forebears, and a "Complex Instruction Set" (CISC), are the significant problems Intel have, and up until about 7 years ago, computing power was more important to them than consumed power.
The ARM was originally designed to be a very simple 32 bit processor from the outset, with a low transistor budget. Even though modern ARM processors are much more complex, the design ethos prevails. Low power consumption was actually a useful side effect.
Intel would probably very much like to discard the legacy components of the x86 design, but it's a problem, because backward compatibility is seen by most of their customers as the main strength of the processor line, as Intel found out with the Itanium, i860 and i960 lines of processors.
Re: Laws, sausages, vinyl records
Do you know it was a CD?
Could it not have been a lossless audio file at some stupidly high bit rate stored on a writable DVD or BluRay disk?
Studio master tapes have been digital for about 30 years (I remember Sky 2 coming out, and being proclaimed as one of the first records to be digitally recorded, mixed and mastered). Once mixed in a digital mixer, it is quite possible that the output may be put on a DVD.
Re: Tape this !
If you think that modern LTO tapes are like the C15s that you used on your Trash80, then you've got no right to comment on this story.
Massive tape libraries with well designed data management systems are fine for backing up or archiving large amounts of data. The only problem is that enterprise grade tape media is still too expensive (but still cheaper TB for TB than disk). Put in a data-management system with recent data stored in modest sized disk pools, and migrated to the massive tape pools as it ages. Index the data so that you know which tape it is on, and you can retrieve it remarkably rapidly, and with relatively little effort.
And if the data falls in a category where it is no longer required to be accessed quickly, you can actually remove the tapes from the library to make space for more. And you could easily store a replica of your data in an offsite store in case of disaster (try seeing what the cost of having Petabytes of data in 'the cloud' is).
No, tape is still useful. Just be careful that you keep the drives to read it working!
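The fast retrieval comes from the catalogue. A toy sketch of the idea (the names and structure are illustrative, not any real HSM product's schema):

```python
# Toy sketch: a catalogue mapping each archived object to the tape
# (and position on it) holding the data, so that a restore needs only
# one lookup and one tape mount.
from dataclasses import dataclass

@dataclass(frozen=True)
class TapeLocation:
    tape_label: str   # which cartridge, in the library or on a shelf
    offset: int       # position of the object on that tape

catalogue: dict[str, TapeLocation] = {}

def archive(path: str, tape_label: str, offset: int) -> None:
    """Record where a migrated object now lives."""
    catalogue[path] = TapeLocation(tape_label, offset)

def locate(path: str) -> TapeLocation:
    """One catalogue lookup tells the library robot which tape to mount."""
    return catalogue[path]
```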
Re: I forgot to mention
I've used that feature for many years. It's not news to me, nor does it alter anything I've said.
When my kids were younger, and we shared PCs, I gave them all normal user ids, kept the admin login to myself for infrequent use (I also used an ordinary account for my normal work), and created another administrator account to be used with runas, which I then made unable to log in directly through a registry hack. I gave my kids the password for the runas account for applications that were stupid enough to need administrator privilege to run. This worked fine for everything until I came across the game Blockland, which needed to actually be run from a logged-in administrator account.
But it did not take long for my kids to realise that they could actually run almost anything as the runas admin account. What it did do, though, was keep their default account unprivileged for browsing and mail, the most likely things to cause the system to be compromised.
I've never said that the security model of Windows NT based OSs is weak. In fact, on these forums, I've actually said that it is probably better than the default UNIX model. What I have said, though, is that it is set up on ordinary systems in a generally flawed manner, and this is compounded by application writers creating programs that need administrator rights to access certain parts of the filesystem needed by the application, but this is another story.
Re: Oh dear, not this again
Nigel11. Well, it is quite true that as long as the cables deliver a clean digital signal, you need nothing better.
But even digital signals are affected by analogue issues.
A digital signal is something approximating a square wave (obviously not a pure square wave). But when you transmit it down a wire, the capacitance and inductance of the cable make it act as a low-pass filter, and this takes its toll on those nice clean leading and trailing edges, tending to round them off.
When you receive a digital signal, especially an asynchronous one, you rely on the signal passing the high/low thresholds within a suitable time. Thus, a really bad cable, which may cause overall loss of signal and excessive rounding of the edges, could cause multiple single-bit errors because the signal does not reach the threshold in time.
No matter, you say. All digital signals are transmitted with error correction. True, but invoking the error correction algorithm may take time (even if in hardware), and may not actually reconstruct the packet correctly if there are too many errors. What to do then? Well, most systems when faced with potential missing data in real-time will repeat the last packet's data, which is clearly unacceptable.
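A toy sketch of the effect (the time constants and threshold are illustrative assumptions, with the cable modelled as a simple RC low-pass):

```python
import math

def rc_edge(t: float, tau: float) -> float:
    """Voltage (0..1) of an ideal 0-to-1 step edge after an RC low-pass
    filter with time constant tau, evaluated at time t after the edge."""
    return 1.0 - math.exp(-t / tau)

LOGIC_HIGH = 0.7    # assumed receiver threshold for reading a '1'
SAMPLE_TIME = 1.0   # receiver samples one time-unit after the edge

good_cable = rc_edge(SAMPLE_TIME, tau=0.3)  # fast edge: comfortably high
bad_cable = rc_edge(SAMPLE_TIME, tau=1.5)   # slow edge: still rising

# The good cable's edge crosses the threshold in time; the bad cable's
# does not, so the receiver reads a 0 where a 1 was sent: a bit error.
```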
I'm not saying that this happens frequently, but be aware that it can happen, and cannot be totally ignored.
Re: Refreshing ...
I think there is a basic problem with the current generation of music listeners.
I'm perfectly aware that music is subjective, and that many people may think that they are happy with current heavily mixed, sound-processed 'music' played through systems that are not ideal. This is made worse by the number of people who are unused to hearing music on anything other than earbuds, headphones or computer speakers. They just don't know any better!
But what does the current generation (or in fact anybody who learned how to listen to music on anything after a Sony Walkman in the early '80s) have to compare what they listen to now with? As a previous poster has commented, modern music is rarely listened to 'raw', even in a concert. It's all processed, mixed and amplified, so that what is heard is what the producer/sound engineer wants to be heard.
There is not enough live acoustic music available for people nowadays to actually have a reference to compare with. My modest audio system has cost me no more than about £600 over the 30 years I've collected and maintained it. I'm aware that the transistors and capacitors are aging, leading to more background hiss, and that the paper cones of the speakers are probably not as stiff as they used to be, but it is still quite good enough for my children's friends to listen in awe to good music played in a conducive environment on a good budget system (almost all of the components in my system at one time or another got 'best buy' awards in annual roundups of reviews in the HiFi press).
I play acoustic guitar, and they can hear how close to a real guitar John Martyn's Solid Air (on vinyl) can sound. The same with orchestral and choral works, where they can hear the individual sections separate out across the soundstage. They may not care for the music, but they can hear a difference from what they are used to. And this also extends to their music (mainly CDs - none of them have vinyl!) played on my system.
So I am quite prepared to go along with beauty is in the ears of the beholder, but it's a shame so many of those ears are uneducated.
Re: I forgot to mention
There is a distinction between an administrator account, an account that can run commands using something like UAC, and one who can log in, but cannot even run UAC.
Up to and including XP, most default users on Windows were in the first category. From Windows Vista on, the default is in the second category, as are most Linuxes. But it is possible to configure Linux users in the third category (i.e. they are not allowed to run anything using sudo or its ilk). Most UNIX systems are configured like this, and ordinary users do not have any ability to do anything damaging to the OS unless there is an actual defect in the security system (and note I am not saying that there are no defects in any OS).
I find it funny how UNIX, the oldest of all of the OSs mentioned, is the one that implements the least-risk model. Just shows that people don't learn from history.
Re: @Mr Torx
Linux, by its very nature, is open to inspection by anybody who wants. Whether this is done is a moot point, but at least you can do it. Previous Linux exploits (like buffer overruns) certainly have been discovered before being found in the wild (you can tell these because they are normally published as 'potential' buffer overruns). Windows does not have this level of openness, so although there are more systems to attack, there is less chance to spot an exploit before it is actually used (which is why zero-day exploits are so damaging to Windows).
The autorun is another matter entirely. If the underlying OS was secure, and the default user was not privileged, then it would be relatively safe (although of course, personal information would be available even if they were not privileged). But Windows has a reputation for being unsafe, and certainly in XP and earlier, most systems were configured so that the default user was an administrator. This makes autorun almost suicidal if users put untrusted media in their systems. It does not take a genius to see this.
Users on Linux and other UNIX-like operating systems can still be affected without privilege (I can think of several ways to add key-loggers to sessions on systems running X-Windows, for example), but in general, this is likely to affect the user and only that user, and the underlying OS and other users will be safe (significant, but less so if a Linux system is 'personal', i.e. only one user ever uses it - this is the problem Android has).
Because many users of commodity OSs do not really understand the differences in the security models and practice between different OSs, I see many challenges to Linux that are unfounded, and really should never be voiced if the person doing the challenging actually knew. I judge this to be one of them.
Whilst I agree with you about the personality differences between The Major in the original film, and the entirety of the rest of the franchise, my personal feeling is that they should not be regarded as different timelines. The difference could be down as much to the different English voice actresses and animators as anything else. I wish I could understand Japanese so that I could judge whether this is the case in the original soundtracks.
I have heard other people talk about this, but I can see nothing in the arc of SAC that would conflict with the original film happening later in the same timeline, and it is quite clear that what happens in the original film is necessary for the two following films, including SSS, which most people think is in the SAC line. I don't want to go into details, because some people here may not have watched them, and I would not wish to colour their experience (except to say you must watch them, especially if you think animation is just for kids - careful of the 1st Gig episode "Jungle Cruise" though).
And I believe that if you watch the retellings of "The Laughing Man" and "The Individual Eleven", there has been some re-working of the SAC animation (new/different scenes) and some re-voicing of the characters (I was quite shocked at the differences in the bits I have seen). Both my daughter and I think that these later re-packages are significantly worse than the original. These again change the feel of the series.
One thing I always find strange, and that is the way anime series are often re-cut to create a film, but the re-cutting, or re-imagination whatever you want to call it, results in a completely different telling of what looks like the same story. This started a long time ago, and the earliest I remember noticing it was Macross the series, and "Macross - Do You Remember Love". Wherever I can, I try to watch the original series in preference to the films, but that is entirely dependent on what is dubbed into English. This is why I sometimes resort to fansubs on P2P sites, as I don't speak Japanese (did I say that already?)
Re: DAT OST
Yoko Kanno, aka Gabriela Robin. Sometimes, music is credited to Yoko, and words to Gabriela. An elaborate ruse.
BTW, you can see her in videos performing with The Seatbelts and also in some of the Macross Frontier concerts on YouTube. She is generally seen behind a keyboard.
Other Anime she has written music for includes Cowboy Bebop, Macross Plus, The Vision of Escaflowne and Wolf's Rain (and more - these are just what comes to mind). Some of the best-regarded Anime series, and a wide variety of music, from full orchestral film scores to jazz, rock, techno, something akin to new-age, and j-pop.
Re: I found this film a bit disappointing, though ultimately enjoyable.
I do so wish you could edit posts. Trajic?? Tragic, as is my spelling!
I found this film a bit disappointing, though ultimately enjoyable.
The problem is that you can't really understand the entire storyline unless you watch it multiple times, and spot the various 'bodies' that the Major may be using throughout the film. And when you have worked this out, you have the final enigma of whether the superficial conclusion is actually what is meant (this is probably intentional, but it ultimately leaves you with the feeling of a lack of closure).
In many ways, the story expands on the question raised in the original 1995 film about identity and self, and as such is quite thought-provoking. This is the basic theme behind the whole story arc, and is also explored by the ultimately trajic Tachikoma story (in SAC) as well.
BTW. On the timeline business, SAC 1st and 2nd Gig must be set before the original film, bearing in mind what happens at the end of that film. Innocence follows the first film after a gap of many months, and SSS is over a year later than the original film. So it is axiomatic that Togusa is in GITS (and he is).
I don't really believe that the stories in the original film and SAC and SSS are incompatible with each other. I would definitely suggest that you need to be familiar with some (or even all) of the other films and series before embarking on this one.
The various GITS soundtracks are permanently loaded in my 'phone, and are probably listened to more than anything else in there. Brilliant writing, and the choice and quality of the artists is superb. Yoko Kanno (primary music for everything apart from the original film) needs more exposure outside of Japan and the Anime scene.
Re: Boot Loader Locking
Yes, but Microsoft will play on the security side of what this does, pointing out the exposure that all systems without it will have, how sophisticated exploit writers are becoming, and how little ordinary users understand about managing their systems (lots of stats about people who install firewalls or UAC and then ignore them).
Their view is that the collateral damage to other OSs (which aren't really important anyway in MS's view) is just unfortunate, and will only affect commodity systems, as specialist systems will be run by specialists who will not be using the type of hardware they are suggesting.
Quite honestly, it's only been a matter of time before this happened. Ross Anderson had it right all along.
It is a problem, but one that we will get around, either by ignoring Windows 8 on tablets or related devices, or by finding some way to break it. I favour making sure that hardware vendors are not penalised for selling systems that do not have Windows 8 on them (by legislation, if necessary), and then letting the market sort itself out. Discounts to vendors for *ONLY* installing Windows on their products should be illegal, and would eliminate this problem immediately.
Re: The Big Lie 2.0
But you don't say the 'first UNIX based phone', you qualify it with 'successful' and 'usable'. Thus it was not the first, so cannot claim patent or copyright. A failed product can still be prior art.
And I could be a pedant over your use of UNIX, and also ask why a smart phone needs to be running a UNIX like OS (think PalmOS, Nokia Communicator or Windows Mobile devices for other devices that were clearly smart before the iPhone). Apple produced a good product, but not one that was especially innovative.
Check your history. No computers, but maybe card sorters. 'Computers' in WW2 were people who did computations, not machines.
Re: I don't get it?
If the OP was an exception, then I must be really rare. Not only do I enjoy my job, it actually partly conditions my life as well.
I will often come home from fixing the work computers (with the associated buzz of a job well done) and open the laptop (or my new Android tablet), and spend time using computers to do other things, including reading about tech.
And before you ask, I am married, and have children. They might get annoyed by the amount of time I spend with computers, but as I was doing this before the family came along, they accept it.
But I know that things are changing; I just hope that my skillset remains sufficiently in demand that I can reach retirement before I struggle to find work. Only 14 years to go, unless they raise the retirement age again.
(spot the person whose pension provision is suffering)
Re: overclocked CPUs are more likely to make a Windows PC crash
Well, not strictly true.
Most CPUs are designed to run at a certain speed. When a particular member of a chip family is first spun, chances are only a small percentage of the silicon will run reliably at the full design speed, but many more will run at a fraction of that speed. So those are marked with the slower speed, and sold as slower chips. But they were still designed to run at the higher speed.
Manufacturers put pretty much every CPU through some testing, starting at lower speeds and increasing until the chip fails to execute something correctly. They then stamp the chip with the last speed that worked successfully, and move on.
What overclockers do is reason that when a chip runs above its tested and rated speed, the cause of failure is probably heat, so they put a better heatsink on the chip, ramp the speed up above the rated value until it fails, and run it at the highest speed at which it functioned correctly. The better the cooling, the higher the clock speed you can run it at (that is why some HPCs have direct water cooling of the CPUs, and why people like Amari [I believe] used to sell an actively refrigerated PC at one time).
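The manufacturer's binning and the overclocker's tuning are essentially the same search loop; a toy sketch, where the test function and the numbers are entirely made up for illustration:

```python
# Toy sketch of the speed search described above: step the clock up
# until the part fails validation, then keep the last speed that
# passed. passes_test() stands in for whatever stress test is used.

def highest_stable_mhz(passes_test, start_mhz=1000, step_mhz=50):
    mhz = start_mhz
    while passes_test(mhz + step_mhz):
        mhz += step_mhz
    return mhz

# Pretend this particular part only runs correctly up to 1800MHz:
print(highest_stable_mhz(lambda mhz: mhz <= 1800))  # 1800
```

Better cooling effectively moves the point at which passes_test() starts failing, which is all the overclocker is exploiting.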
Unfortunately, another aspect of heat damage is that it can be cumulative. This is, I believe, what Microsoft are trying to say. This aspect has a name: 'cooking' the CPU. Once you've cooked it, the chances of it running reliably at the same clock speed (or even at its rated speed) are seriously reduced.
The most obvious case of this I saw was Thoroughbred AMD Athlon XP 2600+s (that was the highest-speed Thoroughbred core with a 133MHz FSB; faster Athlon XPs were Barton cores with an FSB of 166MHz). These were actually clocked with a multiplier at something like 2.06GHz, but over time, even if you did not overclock them, they stopped performing at their rated speed. You had to gradually step down the speed to keep the PC stable. Replace the CPU, and you were back up to full speed, at least for a few months. I went through three or four before I realised what was going on, and this happened even with overspec'd heatsinks and fans.
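The multiplier arithmetic behind that figure is simple enough to sketch; the FSB and multiplier values below are illustrative guesses consistent with the ~2.06GHz from memory, not verified specs:

```python
# Effective core clock is simply the FSB frequency times the CPU
# multiplier; this is what you are stepping down when de-tuning a
# cooked chip.

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    return fsb_mhz * multiplier

# A 133MHz (strictly 133.33MHz) FSB with a 15.5x multiplier lands at
# roughly the 2.06GHz mentioned above:
print(core_clock_mhz(133.33, 15.5))  # about 2066.6 MHz
```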