Re: Brexit pushes Fusion Power back another 10 years
Which is incredibly restrained of it, especially given that it pushes national attitudes to racism back 60 years, and the economy back 40 years. I’d say Fusion is getting off remarkably lightly.
…must …not …step …away …from …European …Union. …Can’t …afford …to …do …this …on …our …own.
Walls? We’d be glad to have Walls. Without bricks or mortar. Or walls. I’m in the Arch flagship store. I haven’t worked out how to build the floor yet. It’s very…
…spacey here. As in lots of it. Free, that is. On my hard disk.
Ditto. This article might have given me pause had I read it before upgrading - but luckily I upgraded before reading it. Loving the iOS 11 goodness, and everything works well.
The Registry, a monolithic store of settings, is very bad design. Good design would be discrete settings files for each application that requires them, including the operating system itself. Critically, these settings files should be deletable, individually or en masse, in case of corruption - in which case they would be automatically replaced by default settings files, and the computer would remain functional.
Before anyone objects, and claims that such a scheme would be unworkable, I’d like to point out that there are operating systems available today which use exactly this functionality, with great success.
Microsoft has done great things recently and, I hope, will continue to do so. I look forward to the abolishment of The Registry - except as a stub to support legacy applications.
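To make the scheme concrete, here's a minimal sketch in Python of per-application settings with automatic fallback to defaults. Everything here (the filename, the settings keys) is hypothetical, invented purely for illustration - the point is just that a deleted or corrupt settings file never leaves the application unusable:

```python
import json

# Hypothetical defaults for an imaginary application.
DEFAULTS = {"theme": "light", "autosave": True}

def load_settings(path="myapp.settings.json"):
    """Load per-application settings; if the file is missing or corrupt,
    regenerate it from defaults so the application always stays usable."""
    try:
        with open(path) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        # Corrupt or deleted settings file: silently rebuild from defaults.
        with open(path, "w") as f:
            json.dump(DEFAULTS, f)
        return dict(DEFAULTS)

settings = load_settings()
```

Deleting the file (or trashing its contents) simply resets that one application to defaults - which is exactly the recoverability the Registry's monolithic design makes so hard.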
Haven't the Will It Blend people covered this already? Pretty sure that they can liquidise anything…
You need a freezer to make it solid again.
And a lot more research into how to make it cat again.
The share price is jumping on the news and, whether you care or not, whether you’ll buy the new gewgaw or not, the world + wife will still tune in to the announcement. It’s a media circus for a reason - you already know what to expect in the big top, but you’ll still attend - just in case one of the acrobats falls off the high wire.
Sure, a little of the surprise might have gone, but the leak will have no material impact whatsoever.
@marky_boi - the boss can't even give one away to convince us to change from Android to Apple for testing
Just noticed this gem. If this really is true (and this claim makes me doubt that it is), you’d be on a performance improvement plan, all of you, as quick as winking. If you’re employed in software development then you’re required to produce good, well-tested software, regardless of whether you like the platform you’re developing for or not. It shouldn’t be the boss’s job to convince you to do yours; a good software developer should be responsible enough, and take enough pride, to do the job well.
@AC - It would be rather boring if it was only Android or only iOS now, wouldn't it?
It would be worse than boring - it would be dangerous. Think back to the 1990s / early 2000s when Microsoft had a de facto monopoly on just about everything OS related. Linux was in its infancy and Apple was busy dying. You had precisely one choice - Windows - and it wasn't a happy one. You bought Windows, and you got something which was insecure and which rode roughshod through established standards (or tried to).
Android is a great OS, although it might not be the great OS for you. iOS is a great OS, although again it might not work for you. Even Windows for phones was a great OS - before Microsoft deep sixed it. Competition is great - without competition you'll get shit.
@Hans 1 - How is Apple less evil than Google?
Well perhaps Evil is in the eye of the beholder too. But Apple makes its money by selling hardware. Google makes its money by selling advertising. Because of these very different business models, Apple doesn't need to grep the contents of iCloud for any scavengable data which it can sell or use - which, from a privacy perspective, suits me fine. Google, on the other hand, needs to and does scavenge whatever it can from the data you store on its servers (which may suit you fine - it's a valid option).
This means, of course, that some of Google's facilities (OK Google, for example) are more advanced than the Apple counterparts - Apple spends time focussing on security, for example, and sandboxing processes (even Siri) so that your data doesn't leak out. Because Google's business model is predicated on not doing this, it can allow its services to communicate much more naturally and freely, providing an apparently more flexible and advanced service - provided, of course, that you don't have any secrets that you'd like to keep, well, secret.
@marky_boi - I know this tune, you’re a bot aren’t you? Either that or you’re just a wee bit clueless and you enjoy talking out of your hat (which I’m sure is very funky, and worn backwards as you hoon around in your bewinged Corsa).
The truth is that, as far as your data is concerned, Apple’s garden isn’t so much walled as fenced - with one of those cute little picket fences for flower borders that can be stepped over ever so easily. You can sync your appointments, contacts and email with Google (or any other service you care to mention), and export any documents you create in Apple’s own office suite in more portable formats - or just use tools from third parties in the first place.
Even software isn’t very walled. Sure, most people will have to use Apple’s various stores for buying software - but given that Apple curates its garden (not hugely well, given the amount of crap on there - but still) to ensure that malware doesn’t sneak in, and given the relative incidence of malware on Android compared with iOS, I’d say that that’s a bloody good thing. And if, as any good geek would, you want software from outside the App Store, then you can either install PastryKit framework apps (no need for the App Store, no curation), or install open-source apps that you build yourself using Xcode (which is free), of which there are a multitude. Jailbreaking isn’t required in either case.
So yes, I totally get that there are excellent reasons for choosing Android - but you can cite those without having to make up a load of fake bullcrap. Assuming that you’re knowledgeable enough, and actually have some good reasons…
Their phones might be very nice, and very good, but I have my doubts that they’re very secure - even without including the mandatory Google snoopage. So far, Microsoft and Apple have proven themselves concerned with privacy when it comes to looking after your data - Google notsomuch (which isn’t actually a criticism of Google - the OS is free, so they have to make their money somehow). Chinese designed kit, if it takes off in the west, is a golden opportunity for data mining which could go far beyond anything done today. That’s okay too - you pays your money and you makes your choice.
My point is just that quality, like beauty, is in the eye of the beholder - and sales volume is no indication of quality (or, indeed, of profit).
BMW, Audi, Mercedes, Volvo, Jaguar et al don’t outsell Ford or Chevrolet. But which would you rather have? (again, that isn’t to say that Ford don’t make some decent cars).
I invented the letterbox. And peas. But I’m a bit older than he is.
I think that there are 'flat' interfaces and undifferentiated interfaces. TWM, Windows 1, Windows 2, System 7 (in fact, all pre-MacOS 8 MacOS) are all Flat - but perfectly differentiated (and, to my eyes at least, very clear and elegant)
Windows 3 - Windows 7, macOS 8 - macOS 10.9, iOS 1 - iOS 6 are all differentiated and 3D (to my eyes they're very clear, but also a bit gaudy)
Anything recent, as far as I can tell, other than some special Android / Windows skins or Linux variants, are flat (which is elegant) and undifferentiated (which is sadly unclear).
So don't have a downer on flat UIs. Have a downer on undifferentiated UIs instead!
"This is very much the problem with business, they want it all for free"
How do you work that out? What I want, what business wants, is for schools and universities to build a solid foundation on which I can add business-specific skills and training. And given how much I pay, both as a taxpayer and in direct contributions and funding to my kids' education, and how much students pay in tuition fees, I'd say that that's the very least I can expect.
A graduate with a degree in computing should have a solid understanding of how computers and networks work, C programming (and not some trendy, mainly educational frippery like Haskell), and a solid grounding in mathematics. SQL would be nice, Linux is essential and everything else is a bonus.
With that foundation they can pick up pretty much everything else on the job.
…and, sad to say, university level IT / computing isn’t well taught these days either. The first thing that I have to do when employing graduates is teach them C - and, once they have C under their belts, then they can begin to become adept in the other languages that I may need them to use.
Other (basic) things that they don’t know / understand include:
how a computer actually works (to most it’s just a magic box for running software)
algorithmic efficiency (big O and so forth)
Fortunately, they’re all bloody clever - so they pick it up quickly - but that doesn’t alter the fact that, as an employer, I shouldn’t have to be the one who teaches them this stuff.
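As a toy illustration (my own example, not anything from the original post), this is the kind of big-O awareness meant above - two functions that answer the same question, with wildly different scaling behaviour:

```python
def has_duplicate_quadratic(items):
    # O(n^2): compares every pair. Fine for tiny inputs, hopeless at scale.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n): one pass with a set. The choice a graduate should make instinctively.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both are correct; on a million items, only one of them finishes before lunch. Spotting that difference without being told is exactly the foundation that should come from a computing degree.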
My guess is that another Carrington event would brick just about everything. I even doubt the ability of my 1964 GT car to survive it unscathed owing to induced currents in its primitive wiring. That said, I'm damn sure that my old car would be easier to get up and running again than my modern one…
32-bit cleanliness is not required for VM. 32-bit cleanliness is required to use more than 8MB RAM. To use virtual memory you need a 68030 CPU, or a 68020 CPU with an MMU fitted to your computer. In practice this means that all non-68000 Macs can use virtual memory, 32-bit clean or not, with the exception of the LC which, despite being 32-bit clean, had no MMU - and a memory multiplexer which limited it to 10MB RAM. Making the original LC a bit of a shit computer.
I think that it's fair to say that Woz was the genius behind the design, but a good design isn't enough to succeed in this world. You need business acumen as well - and without the business acumen of Steve Jobs there wouldn't be an Apple (or a Pixar) today. You may not like him, but Apple would have gone the way of Osborne and Sinclair without him.
And Woz wozn't (sic) the only hardware / software genius at Apple. Let's not forget the likes of Burrell Smith, Bill Atkinson, Andy Hertzfeld…
Stanford's NLS (by Doug Engelbart) can reasonably claim to be the first ever GUI. Xerox's Alto (http://toastytech.com/guis/salto.html) followed up on this groundbreaking work - but had no icons, or drop down menus, at all. Nevertheless, it was a huge step forward, but nothing like a GUI that you'd recognise today.
Star, also by Xerox, improved on Alto with icons and resizable, overlapping windows - but, in the case of Star, a huge amount of CPU power was required because the entire screen had to be redrawn whenever a window was moved. Oh, and it didn't have trash either - at least, not in its earliest incarnation - and by the time it did, Lisa and Macintosh had launched. Nor did it have drop-down menus - all its menus were in a ribbon-like bar at the top of the app, plainly on view at all times.
So yes, I think that Apple can reasonably claim to have innovated the first recognisably modern GUI. More importantly, I think that Apple (Bill Atkinson, to be more accurate) can reasonably claim to have invented the crazily complex maths required to do Quickdraw Regions - that clever functionality whereby only the parts of the screen which have changed get redrawn. Xerox were astonished by the regions functionality - they hadn't thought it possible - and it permitted Apple to run a full GUI on a 5MHz 68000 CPU (Lisa).
Furthermore, Apple built on the NLS work of Doug Engelbart (1968) and ENQUIRE by Tim Berners-Lee to develop HyperCard - the first mixed text and media hypertext system. HyperCard in turn influenced Tim Berners-Lee (very recursive) and Robert Cailliau to develop what we now call a web browser. Which is a very useful innovation.
But really, has anyone ever done anything that was genuinely 'New'? We're all just standing on the shoulders of giants. It's giants - all the way down.
More or less. Dragging a file to the wastebasket (as it was known on UK Macs back then) didn't result in the immediate deletion of the file. The file could be rescued from the trash until Finder quit. On System 6 or earlier this meant that the file(s) would be deleted whenever you launched a program (in single-tasking mode) or at shutdown (if MultiFinder was running for cooperative multitasking).
System 7 (1991, for the Mac Plus and above) fixed this so that deleting files worked more or less as it still does today, from a user perspective at least.
I’m pretty sure (and someone is bound to correct me if I’m wrong) that Apple hasn’t invented anything per se. What it has done is innovated (a lot) - and many of those innovations have since been copied by other computer manufacturers.
The beige plastic case. It might seem stupid, but this did a lot to make computers acceptable for home use. Before Apple? Heavy, pressed steel case full of bodged wires and unfriendliness. After Apple? Streamlined plastic, and finished circuit boards.
The floppy drive. Apple didn’t invent the floppy drive - but, before Apple, disk drives cost more than the computer - and contained their own CPUs, RAM and so forth to drive the, er, drive. After Apple, the computer’s own CPU drove the drive using software run on the computer itself. Thanks to Woz the price of disk drives dropped dramatically.
Colour graphics. Before Apple if your computer could even drive a display it was driven like a teletype. No moving graphics. Just text - and strictly black and white. Using some clever kludges based on the inadequacies of NTSC Apple gave us colour graphics.
Drop-down menus. The GUI existed before Apple, but it was very menu driven. No one thought of hiding the menus away so that they weren’t visible until clicked. In fact, I think that the icon representation of files and folders might be an Apple innovation too (Xerox used lists of filenames).
Regions. This is the cunning method by which only the parts of the screen which have changed get redrawn, rather than the entire visible area. It’s how Apple got away with using comparatively weedy CPUs and limited memory compared with the beast that was the Xerox Star.
ADB. Imagine a desktop bus through which you could daisy-chain keyboards, mice, joysticks - even slow handheld scanners. Sounds like USB? Actually, it’s ADB - and the year is 1986.
Desktop spanning multiple monitors. Apple may have been the first - but even if it wasn’t it was the first affordable (relatively) implementation. Yours since 1986 (with an addon board and monitor which clipped to the CPU of the Mac Plus).
CD-ROM. Again, Apple wasn’t the first - but it was the first to ship an optical drive as an integral part of the computer (Mac IIvi / IIvx).
No floppy drive (or CD-ROM). How everyone laughed. And then copied this usefully cost saving idea.
The Dockable Computer. If only they hadn’t abandoned this useful idea. I wish modern Macs had a dock connector - but that doesn’t alter the fact that the Duo did it first, and (even today) did it peerlessly well.
I could go on. The postscript laser printer, the swipeable touch screen smartphone, the ‘intelligent’ PDA, the modern tablet computer and many more besides. None of these products was, strictly speaking, the first - all had ancestors - but all did it in a way that made them significantly more useful than anything that went before.
Ultimately, you might not want to use an Apple product (whether for good reasons (there’s another system which fits your use-case better) or stupid ones (I hate Apple and I’ll never buy an Apple product)), but if you use a computer then you have no choice but to use Apple’s innovations.
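The "Regions" entry above deserves a sketch. This toy Python code is purely illustrative - it has nothing to do with Apple's actual QuickDraw implementation, which tracked arbitrarily-shaped regions rather than simple rectangles - but it shows the core idea: accumulate the areas invalidated since the last redraw, then repaint only those, never the whole screen:

```python
def union(rects):
    """Bounding box of a list of (x1, y1, x2, y2) dirty rectangles."""
    if not rects:
        return None
    return (min(r[0] for r in rects), min(r[1] for r in rects),
            max(r[2] for r in rects), max(r[3] for r in rects))

class Screen:
    def __init__(self):
        self.dirty = []          # rectangles invalidated since last redraw

    def invalidate(self, rect):
        # e.g. the old and the new bounds of a window being dragged
        self.dirty.append(rect)

    def redraw(self):
        region = union(self.dirty)
        self.dirty = []
        return region            # only this area needs repainting
```

Moving a window invalidates its old position and its new one; the redraw then touches only their union. On a 5MHz 68000, the difference between that and repainting the full framebuffer every time is the difference between usable and unusable.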
Have you ever considered citing yourself - just to see if anyone notices?
I submit to no-one in my love for the Scotch Egg - with a dab of tomato sauce. But this isn't a Scotch Egg - it's a Dragon's Egg, which is something different and wonderful. I have a place in my heart for both, just as I enjoy stews and curries, hot dogs and sausage rolls.
They all pale into insignificance when compared to the mighty Dragon's Egg by Monty Pieman: a Scotch Egg with the egg replaced by a delicious mash of chillies. I don’t know exactly which varieties they use - but I’m a bit of a chilli-head and, when I first had one (last year), it definitely gave me a bit of the chilli-sweats. Superb effort. I’ve had quite a few since, but sadly they haven’t made them in a while, and I miss them. I doubt that this footlong will make an acceptable substitute.
You wouldn't like the VT220 if you had to use one today. I had to use one yesterday - and it gave me a cracking headache. Up till then I'd forgotten all about the CRT-induced headache engendered by CRTs with low refresh rates - and the ghosting.
My first monitor was a natty amber job. I can't remember the manufacturer - I always think of it as an Elephant, despite the fact that Elephant Memory Systems made disks not Monitors, because I stuck an elephant sticker on the casing - the sticker came free in a box of disks. Happy days.
But, to answer your question, it's a kind of very dark aubergine purple with white text - the current Ubuntu terminal colour scheme. I can't remember a mud or turd coloured version.
I know - I’m still pissed off at the changes made to the bios font when IBM released VGA. CGA for the win - who needs more than 16 shades in text mode? Or more than 640*200 resolution, black and white, in ‘high resolution’ graphics mode?
Grumble grumble, I remember when this was all fields. Kids today. When I were a lad, I had to walk 50 miles to school in cardboard shoes with a laptop abacus. And we were grateful.
Alternatively, and this works on older versions of Windows too, just install ‘Console’ (https://sourceforge.net/projects/console/) and get this benefit and many more besides. And if Console isn’t your cup of tea, there are plenty of other alternatives available just a quick Google away.
My preferred colour scheme is Ubuntu’s auberginey palette (and typeface, for that matter). Using Console (and the settings for Terminal on my Mac), I’ve given all my computers a little Ubuntu makeover!
It seems like a rather anachronistic way of measuring Operating System market share - especially now that most internet use is on mobile devices (tablets and smartphones). I'll bet if the analysis included the likes of Android and iOS you'd see that Android has by far the largest market share. I'd further guess that Windows and iOS have roughly the same share of the market.
So, do you have these (I suspect more representative) figures el Reg?
It's not difficult. Just add everything.
Fried egg, runny yolk x2
Black pudding x2
And all washed down with strong black coffee.
And don't get religious about the sauce either - save the prejudice for whether to eat veggie or pork sausages.
Greater Anglia used to piss me off every day until I got so fed up that I moved house to Chiltern Railways' patch. The service is now excellent, but I concede that my solution was rather extreme and not remotely practical for most.
Here. Have an upvote from me. I wouldn't worry about all the down votes though (look at the reaction to my comments on this thread - very negative, and I can't understand why either (although, since there's no explanation either, I don't care too much))
I can think of some very good reasons to abandon Paint (and any other software extraneous to the OS), not least for reasons of security and developer time required to maintain the software. But I suspect that many of the commentards downvoting you are newbies or have only a passing familiarity with IT (although, doubtless, they'd claim great expertise). The Register used to attract Programmers, Sand Benders, Ops and Sys Admins. Now there also seem to be noobs and gamerz here for the lulz (whatever that means).
Like Viz, the Register just isn't as good as it used to be! Won't stop you coming though, will it? Me neither.
I'm not saying that they always behave well, or do the right thing (although, in my experience, they can generally be trusted) - just that they aren't a monopoly any more.
As you say, you had a choice - and you made it. While it lasted, at least, I hope that it was the right one for you. Of course, you could have stuck with Windows 7 - and, presumably, a still working copy of Photoshop.
You could also have chosen to buy a Mac (with Photoshop) or Linux (with something else). So badly behaved, perhaps, but not a monopoly.
Cheaper than what? Cheaper than a Chromebook? Cheaper than a Raspberry Pi?
The thing is that Microsoft’s tactics only work if there’s an advantage to the computer manufacturer playing along. In the past there was an almost unassailable advantage - favourable pricing on Windows. Currently there may well still be an advantage - but that advantage is dwindling. Which brings me to:
Time for some TRUST BUSTING and anti-MONOPOLY actions. You can't just have SOME vendors not playing Micro-shaft's game. It has to become ILLEGAL for them to do it at ALL.
I disagree - I don’t think that the law is necessary here (well, except for lawyers keen on earning another fat fee). I think that market forces will do this - the tide is turning.
It’s fashionable to hate Microsoft, just as it’s fashionable to hate Apple (Google, Facebook, Amazon - insert whipping boy of choice here). The truth is that they all have their advantages and disadvantages. Personally, I quite like Satya Nadella’s vision for Microsoft - I find it infinitely preferable to the cock-up that Ballmer was making of it. Especially now that Linux is being brought into the fold, and I can imagine the day when Windows is a shell on top of Linux. I even like Windows 10. But I’m not a huge fan of masses of bundled software - it leads to laziness and lowest common denominator applications. After all, why install a competitor to MS Paint - even a better one - if MS Paint is bundled? Similarly, I know people who use Notepad even though Notepad++ is infinitely better (and still free).
I’ve said it before, and I’ll say it again. Pick the one that you like best and then enjoy it. No need to get religious about it, or worry about what might or might not be wrong over the fence. These things all seem to sort themselves out over time.
I know. I know. I’m an eternal optimist. But I can't imagine a day when there’ll be no free software on the Windows Store. In fact, other than the registration fee, I can’t imagine a day when Microsoft will charge per app submission (perhaps I’m not very imaginative). At a time when ChromeOS (and its successors) are breathing down Microsoft’s neck in the low cost space, and Apple are squeezing them at the high end, I can’t see Microsoft doing anything which might lessen Windows’ desirability.
As to “Mom & Pop”, I agree that they probably don’t have the know-how to install Linux. But there are businesses which will happily supply a pre-built Linux machine (and, once installed, it’s perfectly easy to use - my mum, a confirmed technophobe, happily uses her Linux laptop without even knowing that it’s Linux - I built it, she uses it. End of.), and it’s just as easy as buying a Windows machine to buy a Mac or a Chromebook - you don’t even have to go to a specialist store, John Lewis will sell you one!
Not being a hardware vendor, perhaps I have an overly simplistic view of the world. But, the way I see it (which I concede may be based on false assumptions and a whole stack of optimism), the hardware vendors don’t need to play along with Microsoft. A vendor could conceivably say ‘actually, we’re not going to make any Windows computers - we’ll supply our machines pre-installed with Linux or ChromeOS’. Of course, if a buyer wants Windows then they can still buy, at full cost, a boxed copy of Windows - but otherwise they can save a bob or two and have an OS which may be just as functional for what they need to do.
In fact, this is pretty much exactly what Eight Virtues, System76 and ZaReason do. I imagine that this model will become more prevalent in the future, especially now that Linux has proven itself on the desktop.
Actually, now I come to think of it, this is also exactly what Apple, Raspberry Pi (the latest models being perfectly capable machines for what most people need to do), BeagleBone, Udoo, Parallella and PixiePro do too. In fact, I guess you could add all the members of the Open Source Hardware Association to that list.
In fairness, I don’t think that Windows can be described as a monopoly any more, or Microsoft’s practices as monopolistic. You have a choice, in a way that there hasn’t been a choice since the halcyon days of the 1980s. You can vote with your feet and buy a Chromebook. You can install Linux (or, if you must have something Windowsy, ReactOS). You can have a Mac - or eschew desktop OSs altogether and buy a tablet with iOS or Android.
When Windows was installed on more than 99% of computers then yes, it was a monopoly. But now, as its market share has collapsed below 20% when all personal computing devices are taken into account, no. I don’t think you can reasonably call it a monopoly.
As for MS Paint, it was originally released with Windows 1 in 1985 as a ‘competitor' to MacPaint. Apple quietly put a bullet into MacPaint in 1989 - its useful life long outlived as more capable alternatives became available. The world is awash with simple, capable, paint programs - the only surprise is that Microsoft has taken this long to follow Apple’s example.
I know others who like it too. Maybe if it had been my first I’d feel the same way, but I was a C programmer, and I got tasked with working on an APL system because of my aptitude for quickly picking up new languages. I might be good at learning new languages - doesn’t necessarily mean that I enjoy using them!
APL isn’t the only language, incidentally, that can use the real mathematical divide symbol for maths operations. AppleScript (and, IIRC, HyperTalk) can too - but only because it’s very flexible as to syntax (which can, in fairness, be A Bad Thing, if only because no two developers will write code in the same way).
For example, in AppleScript, for this sum, these are synonymous:
display dialog 10 ÷ 2
display dialog 10 / 2
display dialog 10 div 2 -- div performs integer division only
I like Perl for short bits of text processing. I use it like a more readable version of sed when I need to share code with a non-programmer. Anything more than that and Perl falls down badly - I had to maintain an application written in tens of thousands of lines of (badly written) Perl code. The original developer had left out the "use strict" pragma because in his words "it didn't run when he put that in". I fixed that, and improved overall reliability somewhat - but it still wasn't as good, or as fast, as it could have been if it had been written in a language which was up to the task in the first place.
As for Java, that's a sad tale. So much potential - and ruined by Oracle. You have to admit* though that Microsoft really ran with it and has, latterly at least, come up with a real gem in C#.
*you don't have to admit of course. You could spew coffee over your keyboard and disagree vehemently. There are some strange idioms in English.
I love Python. It’s a great language - the new ‘Basic’. It’s great for teaching kids how to program, and it’s great for doing real work in as well but…
…for me my one true love is C. It’s powerful (and, yes, dangerous if abused). It doesn’t hide anything or do anything automagically. Memory is yours to play with as you will. Even my C++ looks like C (which I realise makes it bad C++ - except, sometimes, to other C programmers).
I quite like Objective C and Swift. I’ve been paid to develop in Pascal (which was my favourite teaching-kids-to-code language until I discovered Python) and APL (which was a vile experience). But, in my experience, if you can do C then you can pick up most modern programming languages quite easily. If you can do C well then even Assembly comes fairly naturally.
But if you wait 10 years you can have all this power, and more besides, in your mobile phone running iOS 21 or Android 'Banoffee Pie'
Given the quality of Radiohead's musical output, you'd have thought they could have paid a developer to write them a decent megademo for their Easter Egg, rather than this sub-school-playground nonsense. Bad effort, Thom. Bad effort!
Or "The Gullibles" https://www.youtube.com/watch?v=NdOT9CEjQC8
You see, that's the thing about conversations. They evolve. They disappear off into little curious sidetracks and eventually come back to the original subject via a circuitous route. Or not at all. That's the fun of it.
@Martin an gof brought up the matter of Hi-Fi and I ran with it. I'm a geek - and yes, I love listening to my stereo. And playing on my computers, with my bikes and my cars*. But your central point was pretty much mine too - "Essentially all that matters is this - if it sounds good to you, great. Nobody else needs to know, or gives a shit, what set up you use. In fact this applies to most other things people discuss on this site - if you're happy using your custom made Linux PC then great for you, if you use a Mac and it works for you, that's fine." Two thumbs up. No point having a fight about this, or much of anything else.
However, as to "why do you think they care what you specifically use or do?" er… because some people do. Not anyone's kit specifically - but some people are interested in the minutiae of other people's interests. If you discussed your stereo, I'd be interested in what it was, because that tells me a little about what informed your opinion. If you're not interested, that's cool. Scoot on to the next comment, maybe (if you're lucky) typed by someone less geeky than me.
You'll note by the way, that I didn't say that my HiFi / Car / Computer etc is the best and that anyone who has or uses something different is an idiot. As you say, if "it works for you, that's fine."
The problem, as mentioned elsewhere, is when kids get a thorough education in bullshit and not in actual fact.
* just messing with you. I know you don't like going off on a tangent - accept my apologies.
It would seem that we are entering a new Dark Age, where facts* are deprecated in favour of a faddy idea or 'alternative fact' which props up the loudmouth of the day.
@TonyJ and @AC make a good point about how this is harmful when kids are taught this bullshit.
As far as Monster cables et al go though, I just laugh - especially when the music being played is from a digital source**, and especially a compressed one.
I use cheap cable (£7 for 30m), decent speakers (Mordaunt Short), decent separates (mostly NAD, with a little Sony and some Technics for good measure) and keep the cable runs as short as possible. It's a setup which, to my ears, is brutal on bad MP3 rips (better off with a cheap stereo - a good one will throw the compression artefacts into sharp relief!) and sublime with lossless and good quality CD or better recordings.
* in the sense of something which is demonstrably provable or can be shown to have happened.
** not that digital audio is bad - my ears certainly aren't good enough to tell the difference between a good MP3 and a CD, for example - but technically, a good analogue recording on a format like reel-to-reel should be better than CD because the CD has been sampled (at 44.1kHz) and the analogue recording hasn't. It's just that, even if the marketing lies of super expensive cables are true, the sample rate will have a much larger impact on the quality of the sound than the deficiencies of the lower quality cable.
This really gets on my tits. There’s no miracle to wellness - it’s all well known science. Eat healthily (well, most of the time), drink in moderation, exercise hard and do something to keep yourself mentally fit (chess, reading, programming…) There are no shortcuts, and putting stickers all over yourself to rebalance your chakra (?) is just as nuts as rearranging your workstation to promote the flow of ch’i instead of rearranging it so that it’s more comfortable to work at.
Still, I suppose, if the only people getting conned are the people who are into this shit, and if they feel better once their wallet has been suitably lightened, where’s the harm?
I've lost count of the times that I've told 'partners' from HPE that I don't give a shit about The Machine. They never seem to believe me though. Apollo, I like. The Machine is boring - it's just a future-washing of today's technology; there's nothing really futuristic about it.
@AC - I’m sorry. You’ll have to explain. What does the sign have to do with LGBT?
There's an adult shop on the Cowley Road in Oxford which encourages its customers to "Use Rear Entrance". Not a mistaken sign - but I thought I'd mention it anyway.
By my calculations, that’s 937,236,841,784 Brontosauruses. A very long way indeed.
Irv Gordon will have to travel 3,200 times further than he already has in order to beat that.
@Alister - by the way, I disagree about greaseproof paper. Izal was no way that strong!
Biting the hand that feeds IT © 1998–2017