Re: What a shame
... and hear it fly, those engines roar out a sweet tune.
176 posts • joined 26 Feb 2008
> So just how the fuck are we supposed to be experts?
You're not, but it would be reasonable to do 5 minutes of reading and apply some common sense rather than just believing what is written on the internets, and then spouting off about how terrible the world is ...
[You in the general sense, not you specifically]
"Access granted, Warden William Smithers. Thank you, and be well."
> In my experience DRM often taints a product's entire design
I love the smell of hyperbole in the morning. Anti-DRM is really turning into a religion of the first order. It doesn't help the argument.
> I simply don't see what the problem is.
Ignoring the civil liberties aspect for one moment, there are a number of other issues which are a problem.
(1) Electronic intercepts of this nature are notoriously unreliable, and just getting a lot more data doesn't mean you actually get better "intelligence". For most cases of terrorism which this is supposed to help "fix" we actually already had solid leads and intel on most of the bombers, but lacked the manpower to actually do anything about it. Drag-netting a huge amount of low quality data makes the problems of data analysis harder, not easier, so it actually risks making it less likely you find the bad guys.
(2) Actually doing something with all of the data requires investment in both technology and/or people. Given the relative expense of large scale government IT projects, and all of the administrators that go along with it, and all of the agents you would need to hire to actually follow up data leads, it is highly likely that you would save far more lives by plowing that money into the NHS or some other social scheme than have ever been lost to "terrorism".
That final point is the nail in the coffin for me - no politician ever looks at the opportunity cost of what the money could do elsewhere. The downside is that if the government doesn't do anything, and something does happen, you just know that the press will slaughter them for not doing anything. They can't win - we really need the public to say "you know what, liberties are more important than a few people dying in very rare circumstances which you probably wouldn't be able to stop anyway". But that's not likely to happen.
Which would be all well and dandy if this proposal had anything whatsoever to do with reducing bandwidth. It doesn't - at best you would save some control headers which are tiny - so if your network is bandwidth congested you're still screwed.
> So, tell me, WHY should I go out of my way to use something that isn't better?
Did you actually read the article at all? Being good at something is not a monopoly. Being good at something and then using that as leverage to force people to use something which you are not so good at is abusive.
If they are good at search, fine, but they are not allowed to use that "good at search" to also force Google+, Maps, Google Shopping, analytics, etc, etc, or whatever other "beta software" they have invented down your throat - that stifles competition. Doubly so in Google's case because they subsidize all of the peripheral stuff on the back of a huge ad network, which makes it almost impossible to compete against on a commercial footing.
The general risk is that you end up in a situation where users can't use something better because it doesn't exist, but it should have existed in a well functioning market. Mega corps are nearly always "bad".
That would be Amazon. It has very good download rates, but latency can be a bit high - around one to two days I find.
> Why wouldn't a console do?
Keyboard. Mouse. That is all.
> I mean, something like Civilization wouldn't work on console, I see that. And I'm told that big FPS games don't translate well either.
OK, it wasn't all. Didn't you just answer your own question? I want a machine which can play ALL games, not just the subset which happen to be available in walled garden #1 with a stupid interface form factor.
For sake of disclosure, I do own a console (or two), but also heavily use a decent Windows gaming PC. With Linux in a VM for "Real Work".
> That will come when the advertisers understand that they haven't got a god given right to shove their crap in your face 24/7 whether you want it or not
Which will come when "users" realise they also have a right to pay for the content they are consuming on the internet. It all has to get paid for somehow, and the general trend so far is that users are a bunch of tight fisted folks who would rather put up with adverts (indirect costs) rather than pay up front (direct costs).
Not just web - happening in gaming too with "free to play" - people just don't like paying for software.
> It's all a bit cyberpunk, isn't it, when Corporations can have a direct influence on political policy.
Who do you think pays for the billion dollar election campaigns? If you think US politics is not entirely "corporate" you need your head scanned. Companies do not stump up a billion dollars in total unless they think they get something out of it.
I get the impression this is more about "state" vs "federal" turf war than anything to do with providing good and competitive internet to the good citizens.
How can a company with not much history (it's a young company), which is losing $60M a year, at an apparently increasing rate, be worth $700M? *Boggle*. If it is "just software" it's not even like they have a monopoly on "analytics in the cloud".
Tech Bubble 2.0.
High Dynamic Range
That was my point - the bus is still there, you are still moving data over it. If processor A has the data and processor B needs the data the only way to get it is to copy it over a bus (on chip or off chip, it makes no difference - there are always wires linking things together, it is just a question of scale).
All this spec seemingly changes is the need for manually keeping the different memory pools in sync - the "clever bit" is the reliance of system level hardware cache coherency - not the removal of the need to move data around the system (which is impossible).
Sorry if I didn't state the point in the most "smooth" way ...
> without having to copy chunks of data over buses, for example
Unless they have just invented a chip design which works on quantum entanglement of particles, so data can magically jump from the CPU to the GPU caches or to main memory as required, I'm pretty sure that moving data over buses is still firmly in the picture.
> I think you may be making some incorrect assumptions about what these certificates mean. They are purely and simply there to certify that a given web site belongs to a given organisation.
Yes, but they don't do this directly. The entire system is based on the fact that there is a "trusted authority" which can attest that the entire certificate chain from root to signer is valid. You implicitly have to trust everyone in that chain not to have leaked a key, or the security of the chain is DOA.
The entire scheme is trust based, and the OP is correct - most of those certificate signers I wouldn't trust at all (I have never heard of them, so why would I trust them or their processes to keep their signing keys safe?). However, as most websites are only signed via one CA you don't get a choice - you either accept the certificate or go elsewhere (which is a choice, just not a very useful one).
Personally I would say that there isn't enough visibility of how any of the certificate vendors or their processes actually work, so the security processes involved are totally opaque. That's not really trust - that's just taking everything on faith - and that's no real way to build a security system.
It'll be fine on American roads without them ...
It's entirely possible to get very small parts of a design running VERY hot. The average chip temp will be much lower, but hot spots can cause massive problems, as dissipating heat out of an area only a few hundred nm across is hard. I seem to remember some Intel slides on the Pentium 4 claiming there were parts of the design around the same thermal density as a Saturn V rocket nozzle. We all saw how well that turned out for that microarchitecture - hot spot problems were so bad it got shot.
> Combine that with a single flat rate tax system (possibly with a tax exempt threshold), and suddenly the cost of administering the system becomes tiny compared to what it is at the moment.
It doesn't work for a sizeable chunk of the system. There are a large number of exceptions - people who are severely ill and/or disabled - where £130 doesn't cover expenses (i.e. needing 24 hour care). Hell, around here it wouldn't even cover rent in social housing, so those out of a job would still need support.
Assuming you are giving the parents the money for the kids (it was cradle to grave, right?), it also encourages parents who shouldn't be having kids to have more of them just to get the £££. If you don't pay parents for the kids, then you get an even worse poverty trap than today, with parents on benefits with 5 kids. Unless you start wanting to forcefully sterilize portions of the population, this is an "interesting" route to go down.
Society is full of corner cases - the problem with policies like this is that they try and treat everyone as if they were "average" - it simply isn't true.
... and neither is my wife.
> "I only had S3 keys on my GitHub and they were gone within five minutes!"
Assuming they got committed to a git repo, I wonder if they were still in the version history. Seen that happen a few times on internal repos - people delete the file from HEAD, but nuking the history totally is much harder.
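As an aside, scanning history blobs for leaked keys is mostly pattern matching - a minimal sketch, assuming the well-known `AKIA` prefix shape of AWS access key IDs (the sample value below is Amazon's documented placeholder key, not a real secret):

```python
import re

# AWS access key IDs have a recognisable shape: "AKIA" followed by 16
# upper-case alphanumerics. (The matching *secret* keys are ~40 chars of
# base64-ish text and are much harder to match without false positives.)
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_key_ids(text):
    """Return any AWS access key IDs found in a blob of text."""
    return AWS_KEY_ID.findall(text)

# In practice you'd run this over every blob in the history, not just
# HEAD - deleting the file from the working tree does not remove it
# from old commits.
sample = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"'
print(find_key_ids(sample))
```

Crude, but it's basically what the automated scrapers trawling public GitHub are doing - hence the five-minute turnaround.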
> I think the argument against IE is relevant only in that it controls the complete stack and allows (allowed) MS to pretend (or insist) that IE is essential to the proper operation of the OS.
The real issue was that Microsoft deliberately did not follow the defined standards, so only their browser worked for the majority of sites - it was a wilful abuse of a dominant market position, and made compatibility for other software difficult.
As a result of the legal ruling, and increased competition, Microsoft were basically forced to start paying more attention to the standards. Still not perfect, but a lot better than they were ...
> WHY THE FUCKING FUCKITY FUCK WOULD I WANT A COMMAND LINE!!!!?!?!?!!
TortoiseGIT for Windows?
EGit for Eclipse?
> And it has no Windows support.
I mean, "Git Windows" on Google throws up about 3 or 4 different download sites in the first few results. "No support", sure.
> with no documentation
Sometimes I think organizations roll out tools without teaching their staff the first thing about how to use them. WHAT THE FUCK HAS HAPPENED TO THE FUCKING WORLD???
> I do hope they come up with a new flavour of the month to replace git soon, as its a total stinking pile of shite.
Out of curiosity, what's wrong with it? I've used most of the "free" ones over the years (CVS, SVN, Mercurial, git), and managed to avoid the uber systems (Perforce, etc), and TBH git seems to be the best of those that I have tried.
It's fast, flexible, and generally comes with a whole pile of features which I can't get in the others (although mercurial has some of them). Once I embraced the concept of a local repo with local commits for my working copy I suddenly became a whole load more productive. Just don't try to use it like a drop-in swap for a traditional CVS/SVN - it will drive you crazy and you'll never actually get any of the benefits.
Is it perfect? No - I miss having a global incrementing version for trunk (my only feature gripe - and one which I have "emulated" simply by tracking the hash history over time for each major branch I maintain), and it does have a learning curve, but it is one hell of a powerful tool.
> Once your TV picture has a dynamic range that exceeds that of our eyes, without them dilating all the excess DR is wasted.
Not all the dynamic range in HDR needs to be "brighter" - you generally get more quantization levels available between the existing "LDR" min and max luma levels. In dark scenes where your eye is already dilated this allows the HDR scene to show subtle detail which is not visible in an LDR encoding.
Yes, ambient light doesn't help this (eye dilation isn't only determined by the scene), but TBH many people do watch films (the most likely HDR candidates people will actually care about) with the lights off ...
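As a rough illustration of the quantization point - assuming a plain 2.2 power-law transfer curve for simplicity (real HDR uses the PQ/HLG curves, which push even more codes into the dark end) - you can count how many code values land in a dark luminance band for 8-bit vs 10-bit:

```python
# Count code values falling between 0.1% and 1% of peak luminance for
# 8-bit vs 10-bit encodings, assuming a simple 2.2 gamma transfer.
# Figures are illustrative, not taken from any real TV standard.
GAMMA = 2.2

def codes_in_band(bits, lo, hi):
    levels = 2 ** bits

    def encode(lum):
        # linear luminance (0..1) -> integer code value
        return round(lum ** (1.0 / GAMMA) * (levels - 1))

    return encode(hi) - encode(lo)

for bits in (8, 10):
    print(bits, "bits:", codes_in_band(bits, 0.001, 0.01),
          "codes between 0.1% and 1% of peak")
```

Roughly 4x the code values in the same dark band - which is where the extra shadow detail comes from.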
> So far their plan has been ... <snip> ... 4k ...
I'm not complaining about 4K - not because I want a better telly (hardly ever watch the one I have), but because it finally means decent resolution panels are being produced in volume, so it may finally kick 1080p out of PC and laptop displays. PC monitors have been static in terms of resolution for 10 years thanks to TV determining the "economics", and TV moving to something better is the only way to unblock that.
4k2k monitor - yes please. 4k2k TV - whooo ultra high definition HDR amplified compression artefacts on freeview - wonderful.
> It's really a minority of people who have that though - otherwise they wouldn't be able to have darkened cinemas.
You generally don't move your head in a cinema - it's a different problem. If all the VR environments provide is a 3D scene without head tracking, then you'd be right, but IIRC half the point of the head-mounted display is that you can look around.
I'd be happy with a full 1080p screen in a notebook ... 1366x768 is still all too common ...
Money owed > Money in bank.
Chapter 11 is designed to let companies restructure before they implode into an unrecoverable mess. If you only try it when you have $0 in the bank then you're unlikely to have enough time to recover anything.
> And gosh they are tiny chips, 0.1mm square is nothing for such a useful chip.
Worth noting that's just the CPU area _in_ the chip - not the whole chip. Someone will have to bolt on peripherals, bus, internal memories, pin out pads, etc to form the whole chip which an end developer would buy.
That said, in reality at the "small end" the logic gate area is significantly smaller than the area needed for pin out (I think the smallest I've seen is 2mm x 2mm, which was achieved by eschewing packaging and just dipping the silicon in a ceramic paint to insulate and protect it).
> A process that could be easily devolved to a 3D Printer - which are hardly like rocking horse shit.
Most 3D printers I've seen can't capture the level of detail required for a finger print - not the mass-market squirty plastic kind anyway.
As above - on many sensors blutack or plasticine works just fine. Low tech and far easier ...
> If there's an incentive, things will be made simple. Look at the grunts who skim cards, using quite high tech.
True, but that's a create-once, use-many kind of operation. Getting one fingerprint off one guy unlocks one phone - it's not scalable to quite the same extent.
That said, if you really really want to get into one guy's phone, hacking is a lot of bother - just threaten to hit him with a wrench until he gives you the PIN, or cut off his finger. Attack the fleshy part - not the technology - it's normally simpler.
> attackers shouldn't be able to determine the password even if they know what method you use to create it
Yes, if every user applies a cryptographically sound key generation technique this is indeed the theory. I think Bruce's general point is that if you ask the average punter to string 3 words together the amount of entropy is far from what it looks like, and so the algorithm is not cryptographically sound - you either need longer key lengths (more words), or a new algorithm.
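A back-of-envelope sketch of that entropy gap (the vocabulary sizes are illustrative assumptions, not measured figures):

```python
import math

# Entropy of n words drawn uniformly from a vocabulary, vs a random
# character password. The "2000 word" figure is a guess at the pool
# people actually pick from; 170k is the order of a big dictionary.
def entropy_words(vocab, n_words):
    return n_words * math.log2(vocab)

def entropy_chars(alphabet, length):
    return length * math.log2(alphabet)

print("3 words, 2k vocab:   %.1f bits" % entropy_words(2000, 3))
print("3 words, 170k vocab: %.1f bits" % entropy_words(170000, 3))
print("10 random printable chars: %.1f bits" % entropy_chars(95, 10))
```

Three words from a small everyday vocabulary is only ~33 bits - much closer to a weak 5-character random password than to the ~52 bits the full dictionary would suggest.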
> Why, that's less than a week!
I assume he's talking about rainbow table attacks working backwards from a website which has leaked a hash. The issue is not processing time (you only have to generate the rainbow table once per salt), but the amount of space it takes to store the tables.
Assuming most users pick from a relatively small pool of words in common use, it's not an insurmountable search space. Yes, the total search space in theory is pretty enormous, but I suspect many people have a password with "Cat" in, and many more with "Dog", and more than a few with "Password". Hackers only have to skim off the 1% of the easiest passwords to get into a system - not the 1% of the geeks with the uber difficult ones which are hard to remember ;P
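A rough sketch of why the small-pool assumption matters for storage (entry size and vocabulary figures are illustrative assumptions only):

```python
# Storage estimate for a precomputed table of all three-word passwords:
# entries = vocab**3, with ~36 bytes per entry assumed (a 16-byte hash
# prefix plus ~20 bytes of password text).
def table_size_gb(vocab, bytes_per_entry=36):
    entries = vocab ** 3
    return entries * bytes_per_entry / 1e9

print("2k word pool:   %.0f GB" % table_size_gb(2000))
print("170k word pool: %.2e GB" % table_size_gb(170000))
```

A couple of hundred GB for the common-word pool is well within reach of a hobbyist, while the full-dictionary table is hopeless - which is exactly why skimming the easy passwords is the attack that happens.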
I read it more as "there is no point investing in really expensive locks (long passwords) if everyone makes the safe out of paper (i.e. the other security is implemented wrong)".
That said, their approach is a race to the bottom - let's make everything as weak as the weakest part - rather than trying to improve industry best practice to reduce the number of "implemented wrong" instances in the wild.
I'm still using my iPad 1 - it gets used daily on my commute, and is basically as good as new and still has a week-long battery life.
The only reason I may consider an upgrade is that they've stopped providing iOS updates for it, so some new apps don't work if they require iOS-latest. However, given the price of a new one I really can't justify it, and I don't want to encourage companies to stop supporting perfectly functional hardware =) Given that most apps work on the current one just fine and the hardware is basically in perfect condition, an upgrade just seems like spending for the sake of it.
> You never had an economics class did you?
Yep, I did GCSE economics - a long time ago ...
Modern macro-economics is all about macro-scale models and data mining - it would have made the course far more interesting =) Micro economics (i.e. business level) is pretty much accounting, albeit not spun that way, but the basic principles are the same.
If you can't find some means to get some "computing" into at least one aspect of an economics course you're not trying hard enough - there are lots of applications for "big data" and statistical trend and correlation analysis in real world economics (insurance, crime, risk analysis, etc - it's all data mining).
I'd much rather that "coding" was to some extent introduced as part of the syllabus for other subjects.
Welcome to biology - today we're going to data mine some gene sequence information from a database. Welcome to chemistry - we're going to use some image processing tools to look at some slow-motion reaction images, or to automate some process. Economics 101 - accounting using spreadsheets, and setting up some monthly report generation scripts.
You teach coding to kids who don't like computers (who would normally hate CS on principle), and you teach some non-coding skills to the guys who love to code (who would hate subject X on principle). Done right it should be "win-win". You obviously can't do this in every lesson - but it teaches what coding is really for in industry - that CS is a tool which solves problems in other subjects, and not a solution in its own right (in general).
> Teaching how to break down a problem into ever smaller parts until each becomes manageable is a skill that I dare say should be valuable for *everyone*. I believe that coding is a very good vehicle for that.
The entirety of the education system _should_ be geared towards that. What's the point of all of the "knowledge" which all of the existing subjects are trying to cram into our students' heads if they can't actually apply it to a problem and do something useful with it?
Unfortunately we have politicians who seem to equate "education" with learning as much general knowledge trivia as possible, followed by a testing methodology which is to regurgitate as much of it verbatim in 2 hours as possible. Very little of our education system encourages application of existing knowledge to new problems - indeed most of the time any "outside of the box" thinking is actively punished because the answers are not on the prescribed list of tick boxes which give marks.
You don't need to teach coding to teach "original thought" and "problem solving". My nephew "likes computers" (i.e. he plays games) - but he's not interested in coding - it's too abstract. If you want to teach something in schools, and teach it to everyone, you need to make it widely accessible to > 80% of the class. Coding won't give that, unless you wrap it up so much (logo turtles anyone) that it isn't really coding any more - so don't pretend that it is (or that it needs to be).
> who don't let you get away with submitting half-baked papers and essays.
What, you mean, actually _fail_ a student? ... but, but, ... that would ruin the statistics.
My Dad ran his own company for years, and ran a very similar "aptitude" test which wasn't particularly sensitive to what course you did at university, etc. He was basically looking for the one skill you can't teach per se - common sense - and the right attitude - a work ethic. Unless you are working in a very "top right" industry, you can teach most of the "skills" people need on the job, but they have to want to learn, which can be a harder thing to find.
> The Qualcomm Snapdragon, based on an ARM Cortex-A15,
It's not based on Cortex-A15. Krait is Qualcomm's own CPU implementation (instruction set compatible, different microarchitecture).
> and the programmer whose error caused all the ruckus says there just aren't enough people scrutinizing the OpenSSL code to spot difficult-to-find bugs
... and forking it and making it OpenBSD-only is going to improve the number of eyes looking at the code how?
> At 3Ghz that's about 30 flop/cycle. How? If that was a multiply-accumulate (fmac) heavily pipelined over 2 vectors of effectively infinite length then that's 2 cycles/flop, say you have 2 such units that's 4...
Dual issue vec4 SSE = 8 per core per clock, add in a couple of non-SSE fp32 instructions = 10 per core per clock. 4 cores = 40 per clock.
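For what it's worth, that back-of-envelope arithmetic is easy to check mechanically - a tiny sketch (the lane/issue/core counts are the figures assumed above, not measured for any specific chip):

```python
# Peak-FLOPS arithmetic: vec4 SSE (4 x fp32 lanes), dual issue,
# plus a couple of scalar fp32 ops, on a 4-core part at 3 GHz.
lanes = 4    # fp32 lanes per SSE vector
issue = 2    # SSE instructions issued per core per clock
extra = 2    # assumed extra scalar fp32 ops per core per clock
cores = 4
ghz = 3.0

per_core = lanes * issue + extra   # 10 flops per core per clock
per_clock = per_core * cores       # 40 flops per clock across the chip
print(per_clock, "flops/clock ->", per_clock * ghz, "GFLOPS")
```

Which comfortably covers the ~30 flops/cycle the quoted figure requires - peak numbers are always "every unit busy every cycle" arithmetic like this.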
> Good advice: my only improvement on that is to get a cashback credit card if you can. You can easily make 3 figures a year just from funnelling payment for things you buy anyway through the card(s).
Oh dear, another one who thinks this money grows on trees.
What really happens is that the CC company screws the merchant via transaction fees, who then increases the price of goods you were buying in the first place to cover it. Nothing banks do is ever designed to actually give you money which they haven't managed to screw out of someone else first.
Personally I'd rather the world banned these "freebies" and actually forced banks to compete on their ability to deliver a banking service, and nothing else, in particular for credit cards where the actual cost is invisible to the punter (and therefore these not-so-freebies actually look "free", unless you understand how the model actually works).
... and in principle there is nothing wrong with that goal, and from their business point of view it seems to make sense. They just made a mess of execution - the devil was in the detail for both.
I don't agree with the point that they should have started with "a new OS". In general people were happy with the old OS (it was faster than its predecessor, more stable than ever before, finally fixed their driver model, and generally worked). Not liking a few aspects of the UI is not a reason to bin the whole operating system ...
What it needed was continued incremental improvement on performance, memory, and stability, with _optional_ compatibility with RT/Phone, not a wholesale redesign crammed down your throat in the name of compatibility when it wasn't actually compatible at all ...
> Big companies (DISNEY and others) get the copyright term extended whenever one of their moneymakers is nearing the end of its copyright period (the 1936 Mickey Mouse film is still in copyright!!
One thing I never understood is why people think this is such a "bad thing".
There is little public good at stake (it's not a cure for cancer), it's hardly a cultural game changer, and nothing is stopping people developing another mouse-alike cartoon, provided they don't slavishly copy it or infringe trademarks.
Why should people have a right to copy (i.e. get for free) what someone else spent time producing, just because it is digital?
We seem to be entering an age where youngsters are far more interested in getting other people's old stuff for free rather than learning how to create something themselves - cultural stagnation starts here ...
> Once they go feet dry, though, some sort of self-destruct/auto-immolation might be desired, but might conflict with safe handling and storage/stowage aboard a sub.
As opposed to the warheads in a MK48, or a cruise missile? A small auto-destruct on the electronics of a drone is no problem to stow ...
I've bought one and TBH it's a really nice console. Sure, there are a few teething bugs (random UI elements like dropdown menus lose all text, and a couple of system reboots playing Ryse), but after a weekend gaming on it it seems like a nice improvement over the XBox 360.
The most noticeable change for me is noise - it's practically silent, making it a viable platform for watching films - on the 360 you had to increase the volume to drown out fans.
Is it the best thing ever - no, of course not. A big enough improvement to justify moving over - probably not at its current price point for most people. However, I just like toys, and I'm happy ...
I have a couple of these ...
... the CPU's not as powerful (only an ARM Cortex-M microcontroller), but it is trivial to whack it in a breadboard and hack together some fun projects interacting with the "real world" - no soldering required.
The whole toolchain is web-based and you download over USB from the browser (just drag and drop the program to the flashdisk). Not as flexible as some of its bigger brothers, but for teaching kids it hides all of the grotty bits they don't need to know about (yet) and lets them focus on the fun stuff.
> i5, 8gb ram, 128gb ssd and a 9inch screen. No fancy graphics, just a portable workhorse. Few usb 3 ports, an SD slot and hdmi out(and in?) Oh, and not stupidly expensive!
So basically you want everything a laptop has - except the optical drive and GPU - but smaller? The cost of a higher density battery to give the same battery life in a smaller form factor probably costs as much as the bits you ripped out.