Knob head or bell end.
The original commenter would appear to have merged the two.
Nob end has plenty of precedent. It refers to a football fan from Preston.
If you do an actual risk assessment instead of talking complete nonsense, there was no danger whatsoever to myself or anyone else. No laws were broken whatsoever, and if by remote chance the drone had managed to get out of control I would have been impressed to see how it would have escaped the building to hunt someone down three miles away!
If there was no risk, what was the point of the experiment? Why is it worth reporting here? The hazard is what lends it notability. Since it is clearly there, you acted recklessly and yes, you broke the law. The statute book does not alter what it says according to the ego of the pilot.
Actually, with FPV flying in the UK, you must have a competent observer who maintains unaided visual line of sight of the aircraft.
Air Navigation Order 2009 article 166:
(3) The person in charge of a small unmanned aircraft must maintain direct, unaided visual contact with the aircraft sufficient to monitor its flight path in relation to other aircraft, persons, vehicles, vessels and structures for the purpose of avoiding collisions.
How can they prove he didn't have a spotter who kept line of sight for him?
The pilot has to keep direct line of sight. You can have a helper (for example the flag marshal in pylon racing, who indicates when your craft has passed the furthest reach of the course) but you always have to maintain direct line of sight on your own behalf. Saying "I couldn't see it but my spotter could" is actually evidence of your guilt.
I have flown a typical consumer drone into myself at speed to demonstrate how safe they are. The scratches were not any worse than falling into a thorny bush. These horror stories that they are going to knock people out or decapitate them are simply ridiculous and not remotely realistic.
It is not often a post here actually gets me angry but this has managed it. I've had an electric shock on a few occasions and always gotten away with it - usually just a tingle, once a jolt, but never any injuries. From this limited anecdote and using your logic we can conclude that the entire electrical safety code is superfluous and can be dispensed with.
Widespread use of drones may be fairly new, but model aircraft are not, and they share most of the legislation. There is also a track record involving property damage, personal injury and occasional deaths - only every few years on average, but yes, they have happened and will continue to do so. This is why most official model flying sites demand insurance and a BMFA certificate showing competence on the part of the pilot.
You ignore all this, break the existing law (which, surprise, surprise, doesn't allow you to fly into people) and endanger not just yourself but anyone within a three-mile radius - after all, you could have been incapacitated by this stunt, leaving an out-of-control model on the loose endangering everyone in the neighbourhood.
This does not make the case for deregulation, quite the opposite. The established model flying community shit themselves when they hear of antics such as this. They have spent decades promoting responsibility and working with regulators to get the law to the relatively unencumbered form that it is today. That is always at risk whenever stories of mindless yobs pulling reckless stunts (like your own actions) receive attention.
Why is a collection agency needed?
YouTube could easily set up a system of paying authors automatically when their video is watched - or adding adverts around it and paying a share of the advertising revenue. Oh, they already do.
It isn't automatic - you have to approach YouTube, agree terms, supply bank account details etc. After which you have an agreement with YouTube. You then have to do the same thing with Dailymotion. Then with Vimeo. Then with 1,001 sites that you've never heard of, many of which are foreign and don't deal in English. Even if collectively those amount to a decent revenue stream, individually they may literally amount to beer money over the course of a year. That means your negotiating position with each individual site is weakened. This is precisely why YouTube are able to throw their weight around with take-it-or-leave-it offers - take this money for your work, or receive nothing. Oh, and your work will still be on our site anyway...
No one wants to spend hours negotiating a contract that is worth the price of a pint a year. No one wants to pay for translation and legal advice on such a contract. No one wants the hassle of reviewing a thousand such contracts for quirks that subtly affect the tax treatment of each one. No one wants to hand over their bank account details to a thousand unfamiliar foreign websites. This is precisely where collective bargaining arrangements come in, which do serve both sides of the deal - the content creator has a single sizeable revenue stream that it is worth considering properly and the content buyer has an entity that they may meaningfully negotiate with.
I am not advocating compulsion to use these arrangements, but arguing that such arrangements have no value at all is straightforward ignorance of reality. If in your desire to stick it to the man you want to spend £10,000 on lawyers and accountants for £1,000 of revenue, that is of course your right. People who are trying to make a living are likely to take a different view.
It is correct: the article doesn't say the probe couldn't do any work, just that it couldn't do that work from solar cells.
What's the sun done to deserve that? Send it back to its masters ....
It's already engulfed SCO once if you follow the corporate shenanigans for long enough. After the "old" SCO sold its Unix biz it renamed itself to Tarantella, which was then bought up by Sun, who themselves were then bought up by Oracle.
The type of coolant is the secret recipe.
What, you mean water or meths?
I have no idea how this particular item works. It's conceivably the same technology but with separate pipes rather than concentric.
Heat pipes have been around a lot longer and have far greater application than their comparatively recent use in desktop computers. Yes, there are always two paths between the heat source and the radiator, one for the vapour and one for the liquid. How those are arranged physically is irrelevant: you are confusing the packaging of a specific implementation with the general principle.
difference between this and heat pipe ?
Not a lot, but that's hardly surprising in that it is a heat pipe. It presumably (you would hope) is not dependent on a specific orientation, which differs from some desktop heat pipes, but that isn't new in and of itself. The primary innovation here seems to be getting it so thin, especially in contrast to the desktop heat pipes which invariably won't fit in even a compact desktop case.
Mind you, I think that's simply another expression of the endless ability to screw over the gamers. Nothing opens the wallet faster than stroking the ego, and agreeing that a £600 machine made of commodity components is some kind of supercomputer is a sure way of doing that. Cue the order of a £100 heatsink the size of a brick...
Define the mathlib with '-l' (man bc).
No, the problem there isn't that the maths library isn't loaded but that the scale factor is left at its default (i.e. integer). You need to set the scale register to some positive value to get that many decimal places. This is a backstop against recurring decimals and irrational numbers - since it is arbitrary precision, even something as simple as 1/3 would carry on forever as it tries to calculate an exact decimal value (0.333333...) unless some limit were in place. See the very man page you reference.
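A quick illustration at the shell (this assumes GNU bc; output shown is from that implementation):

```shell
# With scale at its default of 0, division truncates to an integer
echo "1/3" | bc
# -> 0

# Setting the scale register fixes how many decimal places are computed
echo "scale=10; 1/3" | bc
# -> .3333333333

# -l loads the maths library, which as a side effect sets scale=20
echo "1/3" | bc -l
# -> .33333333333333333333
```

So -l does change the result, but only because loading mathlib bumps scale to 20 - setting scale directly achieves the same thing without the library.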
Date/time format is a human convention - so it doesn't mean a thing.
But so is the decimal number system.
The BBC should have bought a stockpile of Pis and given them out, and tailored their documentation for the device. That's all they needed to do. But no, someone at the BBC decided "We're not spending enough money on this, let's release our own hardware too!". But then, why would it be any different? They haven't had to work for their money, they just use the threat of court to get money from people who want to watch a bit of TV in the evenings.
They are not paying for them though - they are being stumped up for by their partners, and yes, even the RasPi is probably too expensive. Let's say there's roughly a million year 7 pupils - giving them all Pis would cost £25m+ - yes, you'll get a bit of a bulk discount but not much given the volumes Pis are produced in anyway, and the costs of distributing via schools will be fairly high. That's a big chunk of change to come out of even multinational community or corporate social responsibility budgets. When you factor in the support stuff it needs to make a complete workstation - case, PSU, SD card, leads, keyboard, mouse etc - it no longer even looks particularly cheap compared to some other alternatives.
I also think this is better focussed on their specific stated aim to get kids coding. There is no doubt the Pi has been spectacularly successful, but in that respect it hasn't really worked out. Look at all the attention it gets - it is focussed squarely on being a media server or some other commercially available device. Hardware work similarly revolves around plugging in some commercial module as opposed to building something yourself.
If you give out a more limited device that can't be used as a free general purpose computer and doesn't need a lot of extras (because it can never be a workstation) you can focus the spend on the skills you are trying to develop, rather than simply providing a free gadget to every 11 year old in the country.
So how does this work if you have a raft of machines with the same id?
I wouldn't have thought it would make any difference. The presumable intention is to ensure that only one copy is running at a time on any given system, to prevent the infection crippling the machine and indeed itself. If two different machines compute the same value it doesn't make any difference since the mutex is local to the system, so the one instance per machine relationship still holds.
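A minimal sketch of that machine-local single-instance idea, using an atomic mkdir as the "mutex" (the lock path is purely illustrative and stands in for the computed mutex name; since it only exists on this machine, two machines deriving the identical name never conflict):

```shell
#!/bin/sh
# The directory name plays the role of the computed mutex name.
# mkdir either atomically succeeds (no instance running here) or
# fails (another local instance already holds the lock).
LOCKDIR="${TMPDIR:-/tmp}/demo-instance.lock"

if mkdir "$LOCKDIR" 2>/dev/null; then
    # We created the lock: we are the sole instance on this machine.
    trap 'rmdir "$LOCKDIR"' EXIT
    echo "running as the sole instance"
else
    # A local instance already exists: bail out quietly.
    echo "already running; exiting"
    exit 0
fi
```

Run it twice concurrently on one machine and the second copy exits; run it on a thousand machines that all derive the same lock name and every machine still gets exactly one instance.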
£100/year for insurance? Think you need to shop around more.
You don't even need to do that most of the time: watches are generally included under the personal possessions cover of your home insurance even out of the house. You may need to check the level of cover for a particularly fancy watch but my £800 O+W is covered without me even needing to declare it.
That makes perfect sense to you and me because we understand the basic mechanism of competition. Unfortunately, Adobe has never heard of it. Perhaps they will soon.
They can work if the level of prestige is high enough. Donald Knuth stopped handing out bug report cheques a few years ago because of fraud but no one ever cashed them - far better to frame it and hang it on the wall.
Which was in fact the problem - too many scanned images of the cheques online, complete with valid bank details, which is why he had to stop it.
However, that's a personal reward from a world renowned expert whose prestige is inflated on that basis. This is another utterly pointless metric to go along with your Facebook friends, Wikipedia edit count, Stack Overflow rep, Yahoo Answers points and so on almost ad infinitum. All of them essentially boil down to how much time you are willing to waste on something for no gain instead of any actual skill.
UK is the name of the alliance of England, NI, Wales and Scotland.
Great Britain is the name of the actual islands.
I'm amazed at how many people still get this wrong, even when correcting others. Britain is an island - the British mainland, hence the Great prefix denoting it as the principal island in the archipelago, c.f. Gran Canaria for instance. The islands as a whole are the British Isles, consisting of the UK, ROI and Isle of Man (but not the Channel Islands which are not considered part of the group).
I always thought that Kickstarter projects were providing funding for possibilities rather than concrete results. Did I misunderstand? Genuine question - I don't know.
But the fundraiser does have a responsibility to provide what is promised. There are no guarantees, but only in the sense that the agreement is not underwritten - the promised benefits are not notional goodwill gestures but have the force of contract behind them - they are guaranteed by the fundraising entity if not by Kickstarter. This is why there is a walk-away clause built in - if what is promised turns out to be undeliverable you refund the money - all of it - and that releases you from your obligations under the contract. Yes, that means the developer takes a financial hit, but it does get them out of a situation of e.g. providing something that costs tens of thousands to provide for a few hundred. The other alternative is that the company folds, in which case yes, you lose out from investing, but that is the risk people understand and accept.
What we have here instead is a going concern that presumably is making money that has funded an R&D programme via pre-orders. The programme failed but the customers are still being asked to pick up the tab. Meanwhile the company continues to operate without penalty and carries on making money. That is not the risk associated with failure of the company but a straightforward breach of contract. They have ripped off their customers who do on the face of it have a valid legal claim against them - you can't simply wash your hands of the affair and say "Yes, we've spent your money but our gamble didn't pay off. You have to pay for our mistake."
How does this stand up to anti age discrimination legislation, e.g. the Equality Acts here in the UK?
Some have been PXE booting diskless servers for decades now.
And? How is that remotely relevant? Did you even read the article?
This is modularising Windows and allowing you to ditch the crap you don't need and don't want. That is something altogether more substantial than where the boot image comes from.
If they can do it properly so that you can cut it back quite dramatically (e.g. run a few undemanding services in 128MB memory and a 1GB disk image) then that would be quite welcome. I suspect what we will actually end up with is a dependency graph that goes "You need feature X. That needs features A, B, C, D, E, F, G, H, I, J, K, and L installing as prerequisites." If it goes too far along those lines it loses its usefulness and becomes more marketing fluff, so it'll be interesting to see how it actually works in practice.
How much of a mess has been left on the Moon as of now?
Serious question. Descent stages for six Apollos, the buggies for the later ones, Lunokhod, Ranger impact, etc. Any more?
There's actually a surprising amount of stuff up there. Much as I hate quoting Wikipedia, they do have this list which covers some of them, although for many there are in fact several objects per listing - random pieces of wreckage for the crash landings and assorted litter and equipment from the Apollo missions.
Currently NO ISP is charging content providers off their networks with interconnect fees. Market forces are stopping that. If the market already stops that, why does the government have to step in to "stop" something not happening?
Careful there, you are answering the wrong question and doing so incorrectly. Netflix are not themselves a Tier 1 network, indeed most of their "network" is actually Amazon's.
You need to look into how inter-network routing actually works which is as much by negotiation and contract as it is technical. NN is nothing to do with mandating some form of universal peering arrangement, per-byte charges already exist, have always existed, and will continue to exist even if this has been passed in the form people are assuming it has.
they have been awarded a patent but it is worth nothing until it is tested in court.
They have applied for a patent.
The mobile manufacture element is only a tiny part of the claim, what is actually being claimed here is essentially a 3D printing bureau service of the kind that has been around for 20 years. The only difference I can see is the introduction of a library of third party designs that can be selected by the user, instead of needing to supply a design at time of order. Hardly a groundbreaking step, indeed it could be argued that making a request "Send me one of your demo models" to a 3D printer vendor is prior art here.
Yes, I know the US patent system is hopelessly broken, but even so I'm surprised they would think it worth the time even trying to patent this.
Not necessarily - it might be that they'll do the electronics later, or are doing it in a different class.
Not in school. People seem to massively overestimate the standard of anything taught in school. Admittedly this was 20 years ago, but when I was there calculating using Ohm's law was at level 9 on the curriculum - i.e. the standard expected of an 'A' student at GCSE. In that context even simple things such as determining the value of a base resistor (part research, part calculation, part judgement) are completely off the menu.
You can blame that squarely on the National Curriculum - if everyone has to cover everything the coverage becomes so wafer-thin that it is of little practical use. When what became CDT and then D&T was half a dozen separate subjects, e.g. woodwork, metalwork, needlework, cooking, tech drawing and systems - of which you might do perhaps two - you could develop skills to a reasonable level. Now it seems the entire subject is classroom based and the pupils never pick up so much as a junior hacksaw.
Not that that is bad in and of itself - I remember when I was at school logic circuits were on a ready made board and wired with banana plugs. That is enough to teach the principles of what is being taught even if not component level design. It was only at A level electronics the breadboards and soldering irons came out.
thats the problem - people want to overclock without accepting the risks
And also that people don't recognise the problems that they themselves have caused. People try an overclock and initially it appears to work. They e.g. try a couple of games on it, oh, it still works, forget about the overclock and move on.
Six months later a new game executes a perfectly valid sequence of instructions under conditions that make the timing of said instruction sequence close to the wire on even stock hardware. With the overclock in place the results are still being computed come the relevant clock edge, errors result and e.g. the computer crashes.
Cue lots of ranting online about how the game, or Windows, or the GPU is a buggy POS because it keeps crashing. Never any mention of the fact that they broke their own computer - they don't even recognise that fact for themselves.
The endless coverage in the gaming rags has caused overclocking to become viewed as a risk-free method of extracting the very best from a machine, and a "cool" thing to do to show how clever you are with computers. To the extent that in some quarters you get labelled a mug who wastes his money if you haven't clocked your system to the very edge of stability under even moderate load. After all, it's how fast your machine is that counts, who cares if it can be guaranteed to work properly?
While I hate being fair to Apple, they do not actually make the things - not an excuse, but part of the explanation. Sadly, when the real assembler of the bits runs them through the soldering process it is not exactly a craft operation. It is run down to a cost limit, which is bad; a time limit, which is necessary to protect the bits; and a quality control limit, which is supposed to be good.
Yes, but the onus is still on Apple to specify exactly what they want, how it is to be made and what materials will be used, and they will do so in exhaustive detail. Hell, this is a company that goes to exacting lengths even for the cardboard boxes it sells its phones in - that vacuum-induced whoosh when you open an iPhone box is not an accident, it is a deliberate feature to subconsciously persuade the buyer they really have bought something of quality.
Selection of soldering processes used will be a far more significant factor than the mere box. Indeed if you look at pretty much any semiconductor data sheet you'll see a section on soldering towards the end. This is something the designers will pay close attention to even at the schematic design stage to ensure all the components needed can be physically assembled on the same board without the involvement of mutually-incompatible processes.
All of the manufacturers do this - after all it is their reputation on the line - even for components. This is why for example if you compare a Dell desktop with an equivalent white box the Dell seems to have too small a power supply - perhaps 400W as opposed to 600W. However if you test the two PSUs side by side you see that the Dell delivers more power, their specifications state how that power rating is to be determined which is more conservative than the optimistic ratings so prevalent on the open market.
Apple are big spenders, not some two-bit operation, and can afford to specify anything they want and their suppliers will bend over backwards to deliver it. If the process used is the wrong one, it is because Apple said to do it that way. If QA is not up to scratch, it is because they cut corners in that part of their demands. Note that Apple are not blaming their suppliers here - they realise they have to stand by the quality of their own products. They would take production back in house in a heartbeat if they thought poor quality outsourced work was going to damage their reputation.
No, demanding bundling of the entire package effectively prevents third party subsidies for the inclusion of added services, much like the effective giving away of Windows in the form of Windows with Bing. If you could have had a subsidy but can't you are paying more.
The fact that it might not actually be a good trade-off doesn't enter into the calculation - it is for the open market rather than the courts to decide what is a "fair" price for a product or service, inclusive of intangibles such as personal data.
Sadly most manufacturers go for the "supported" option, rather than having the balls to do it themselves. Back in the day they would be doing all of it from scratch, now we have the usual lazy can't be arsed to manage our own operating system cobblers, even when it is detrimental to their product.
No, the really sad thing is that this is what customers expect and demand. If a device has no Google layered services on top, or indeed Google Play, it gets labelled as cheap or proprietary. The reviewers are no different: you don't need to go any further than this very site for e.g. this review from Alun Taylor for an example:
The elephant in the room is that all these improvements only go so far to compensate for the fact that the Amazon App Store is still short on content compared to the Google Play Store and that you have to forgo Google’s own apps.
Since when is not given true details fraud if there's no financial transaction involved?
The laws on fraud are defined in terms of material gain obtained by deception. Financial transactions are the common form that fraud takes but it can be and is applied much more broadly than that. By giving false info you are receiving a material benefit (the update) which cost the provider real money to supply (power, bandwidth, hardware, etc) on the basis of a false representation. That is not a matter of interpretation - it is clear and outright fraud according to the law.
You've never really used or set up an OSX machine, have you? You need an Apple ID for updates, but it doesn't check if your details are real or not and the associated T&Cs are actually decent.
I was thinking more of the iOS devices there but the point still stands. By your own admission you either have to give over your personal data or commit fraud. Some choice.
You wouldn't find this on an Apple computer, because a single company controls both the hardware and the software. Microsoft's reputation is being undermined by crap like this. They need to copy Apple and start shipping their own hardware.
You wouldn't. You would simply find that an Apple device is all but unusable if you deny it the chance to phone home with a far more comprehensive set of personal data. Sadly, the average punter doesn't seem to care.
Heads should roll over this. Literally, as in detached from the bodies that they used to be part of. It isn't going to happen, it'll be a mistake or a bug or something.
As Steve Rambam said at least ten years ago, "Privacy is dead. Get over it." You might not like it but as long as somebody else is willing to lap up this kind of shit it is an economic impossibility to avoid.
Any term widely used and understood for two decades is proper usage. Original meaning becomes irrelevant after a certain period, such is the nature of language.
But in this case it is some elements of one community - everyone else has been using it correctly for the last thirty years. Loadable module or not, drivers are still in kernel space, run in kernel mode, and a single errant driver can and does take down the entire system. Even if that wasn't the case it still wouldn't qualify as a true microkernel, since it still includes many systems that reside in user mode under the true microkernel model, e.g. the process scheduler.
Those elements of the Linux community misusing the term are not alone; this kind of mislabeling is quite common. The classic example is Windows' use of the term "virtual memory" to mean disk paging, which gets so deeply ingrained that people refuse to believe you when you point out the term does not by definition refer to hard drives at all. Just like that example, a piece of hyped mislabeling does not alter the accepted definition.
Which joker sent the Face On Mars a pack of cigars, huh?!?
No. It's actually a million office workers following management advice given when they asked "OK, so where can we smoke now?"
It's very old hat now to set priority low for IP data, high for IP phone, and top priority for internal network management, you know. Then somebody downloading 450 Mb of manuals from a vendor is not going to hammer your conference call with market analysts.
I was thinking exactly the same thing. On site you say "This VLAN is high priority" - that's CCNA stuff. Externally use MPLS and designate the appropriate LSPs as priority. In both cases the primary driver for their introduction was mixed data on a unified network.
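On a Linux box the same idea can be sketched with tc (a rough illustration only - eth0, the SIP port and the band layout are assumptions, and a real deployment would classify on DSCP markings carried end to end rather than port numbers):

```shell
# A three-band strict-priority qdisc on the egress interface:
# band 0 is always drained first, band 2 last
tc qdisc add dev eth0 root handle 1: prio bands 3

# Classify VoIP signalling (SIP on port 5060 here) into the top band,
# so a large background download cannot starve the call.
# flowid 1:1 corresponds to band 0, 1:2 to band 1, 1:3 to band 2.
tc filter add dev eth0 parent 1: protocol ip u32 \
    match ip dport 5060 0xffff flowid 1:1
```

Anything not matched by a filter falls through to the default priomap, so bulk transfers only get serviced once the priority bands are empty.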
Well, when the BBC allows one person, Roger Harrabin, to decide that the scientific consensus for man-made global warming is so strong that the BBC can be absolved from its responsibility of balanced reporting, what hope does any real science have on the channel? We can only look forward to the BBC Trust backing to the hilt more sensationalist drivel like the mega-tsunami, and disregarding any evidence to the contrary.
Demanding consensus is far too high a threshold for science coverage with any currency to it at all - if that is the threshold you have strictly educational programming and that is it. If consensus is required entire categories go straight out of the window. Can an informative and valuable program be made covering string theory? Of course it can, the fact that string theory is very much a work in progress and has nothing like consensus behind it should not be a bar on such a programme being commissioned.
Can the news report the findings of a peer-reviewed paper in e.g. the Lancet? If you demand consensus then no it can't - the fact that three or four academics were unable to rubbish it at review stage does not amount to consensus by itself.
Consider The Sky at Night and they are discussing some space probe. Before launch the discussion is couched in terms of "We expect..." or "We hope to see...". When the results are back it is "We think this is probably..." or "We interpret this as...". Both are mere speculation rather than reporting of the consensus. Does it mean it is not legitimate science programming? Of course not.
Far too many of those with a false sense of their scientific awareness have a mistaken belief in some magical threshold level of consensus, where everything outside that box is junk. That attitude is itself pseudoscience; in any particular area of progress there are different ideas with different levels of support behind them. Any new proposal lacks acceptance at first - it has to be considered and tested, and there are always counter-theories and interpretations. If you proceed from the starting position of "This has no consensus, therefore it is junk and therefore it doesn't even need to be explored" you are not promoting science but arguing for a complete cessation of scientific advancement.
I have not seen the programme in question but the BBC's arguments seem valid enough to me starting from that position. Consider the very premise of the programme - its title is not "Are we going to be killed by a mega-tsunami?" but "Could we survive a mega-tsunami?" - i.e. the emphasis is not on how likely it is to happen but on what the impact would be if it does happen. Remember that the arguments that it can't happen are themselves shaky - note the appeal to recorded (i.e. written) history, which is so brief as to be meaningless in a geological context, and as is acknowledged the longer geological record does indicate prior examples. Even if the tsunami couldn't possibly be triggered by this one volcano, the programme remains valid if anywhere else (possibly somewhere we know nothing about) could cause similar effects. I think that's enough on which to base a What If? scenario that doesn't tackle the contentious point head-on in any event.
It's not beyond the wit of man to build drive bays that accept bare drives, and they would probably be cheaper as well as more convenient for the user.
Personally, for things like this I'd much rather screw the drives in place. With lowish-end hardware like this I've had too many bad experiences with clip-in trays - they may work well enough on day one but invariably there are plastic components that are turning brittle come year five. I'd expect to run an array such as this for at least ten years (demoted to backup store towards the end) so I'd happily accept a moment's inconvenience for those years of extra life.
...given that there is a virtually infinite list of provocative domain names, many on very similar themes and this will appeal to those who like to stir things up. So, ITaughtTaylorSwiftHowToGiveHead.com is taken. Did they think of e.g. ITaughtTaylorSwiftHowToSuppressTheGagReflex.com?
They will be expensive to start with and not so further down the track.
I wouldn't count on it. I've been using DVD-RAM for perhaps the last ten years for archival and blank media still costs perhaps 5x similar DVD+/-RW.
When I built this new workstation I'm using now a couple of months ago, I did pay over the odds for an M-DISC capable Blu-ray drive with the single-layer discs in mind, but when I looked at the cost of media I decided to stick with DVD-RAM for the time being. It doesn't help that there seems to be a pitiful lack of competition at the distributor and retail level, so the most attractive cost-per-disc prices have ridiculous shipping fees attached to them.
The savings come with volume and it seems there is little mass market interest in long term integrity - you can see some of that even here, with all the attention focusing on the potential problems of reading in centuries time, but completely failing to grasp that 99% of the archival market is more bothered with readability after perhaps 20-30 years, 50 at the outside.
What a load of guff! No units on the y-axis, arbitrary threshold - FAIL indeed!
In this case a failure in basic comprehension. The article explicitly specifies this is according to ISO/IEC 16963, which defines what it means for a medium to have failed when determining storage lifetimes - that is the scale. If that isn't good enough for you, obviously you have much greater insight into this area than that international panel of experts who spent years considering the issue.
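For anyone curious how accelerated-aging standards of this kind turn stress-test data into a lifetime figure, here is a rough illustrative sketch of an Eyring-style temperature/humidity acceleration model. This is my own simplification with invented parameter values, not the actual procedure or coefficients from ISO/IEC 16963:

```python
import math

def eyring_lifetime_hours(temp_c, rh_pct, ln_a, delta_h_ev, b):
    """Predicted time-to-failure under a simplified Eyring model:
    ln(t) = ln(A) + dH/(k*T) + B*RH, with T in kelvin.
    All parameter values used below are made-up illustrative numbers,
    NOT taken from ISO/IEC 16963."""
    k_ev = 8.617e-5  # Boltzmann constant in eV/K
    t_kelvin = temp_c + 273.15
    return math.exp(ln_a + delta_h_ev / (k_ev * t_kelvin) + b * rh_pct)

# In practice the parameters would be fitted to failures observed in
# stressed test cells (e.g. 85C / 85% RH); here we just show that the
# fitted model extrapolates a much longer life at ambient conditions.
stress = eyring_lifetime_hours(85, 85, ln_a=-30.0, delta_h_ev=1.0, b=-0.02)
ambient = eyring_lifetime_hours(25, 50, ln_a=-30.0, delta_h_ev=1.0, b=-0.02)
print(ambient > stress)  # lower temperature and humidity, longer predicted life
```

The point is only that the "lifetime" headline number is an extrapolation from a defined failure criterion under stress, which is exactly why the standard has to pin down what "failed" means.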
He^6, not He...
Microsoft trying to succeed outside x86 = a zebra trying to change its stripes.
Sure. MIPS, Alpha, Power, Itanium, now ARM. (Have I missed any out?) All supported at one time or another, all have fallen by the wayside. That track record is pretty damning and becoming a self-fulfilling prophecy: no one wants to adopt a new platform today that won't exist in five years' time, condemning MS to continued failure away from their x86 home turf.
This is so open to abuse. As an extreme example, if you were to upload a picture of a woman without her "headscarf", there are a number of ethnic minorities that would consider that sexual.
As has already been noted I think that's stretching things a little. Juries are expected to weigh things like this according to their understanding of the broader community's values rather than their own insular views.
The problem I see is at the other end of the debatable spectrum. Take it as read the photos under consideration are of full frontal nudity. Are they covered? It's easy to give a knee-jerk answer of "yes" without considering the context or that nudity does not by itself imply sexuality.
What if this is a photo taken for medical purposes? An artistic nude? Holiday snaps on a nudist beach? None of these are overtly sexual in nature so my personal interpretation would be "no". However I can see many people would exempt the medical photos but include the others or indeed include all three sets of photos. The balance of people holding each of those interpretations is difficult to forecast in advance and will vary between juries, so you will not get a consistent interpretation of what constitutes sexual material.
If the phone is bent that way, clearly it has to be flexible, because exactly no one would want a smartphone that is bent like that all the time.
Bent like what? It is a deliberately obfuscated photo that allows you to extrapolate several possible shapes from the information available. I can see three straight away: the most literal reading is that it is doubly bent and this is a straight side-on view, which seems improbable and doesn't fit in well with the shading.
The second is that this is an edge-on three quarters view of the side and end of a shape that angles up at one end, kind of like how some calculators angle up the display portion of the case. That fits well with the photo but I don't see why you would want that shape in a phone.
The third and my preferred interpretation is that this is a straightforward curved phone with the screen side facing downward, taken from an angle to mostly show the top or bottom end and dramatically foreshorten the long side. I can see some people wanting that and can see some practical benefits in that it would allow you to more easily reach the top and bottom of the screen with your thumb.
In any event I'm now long past the point at which this descends into the idle speculation the author was careful to avoid. My essential point is that you don't know anything about the shape from that photo.
One final point that is easily missed - if you load it up in an image editor and start distorting the brightness curve quite dramatically, you do bring what appears to be an edge button into visibility below the rightward portion of the horizontal bit; at least I don't think it's a compression artifact. That doesn't tell you much by itself but it does seem to eliminate interpretations that require a face-on angle of the phone.
On a related note they keep using another photo, sorry don't have a link to it right now, of a woman with red-brown hair, hand up to her head on what appears to be a beach...
Found it, it's used on this article among plenty of others. I haven't been able to track down the "other" photos though.
It's a woman who has just found your secret porn stash and is holding it in her hands and with that disapproving look, she doesn't look best pleased...
On a related note they keep using another photo, sorry don't have a link to it right now, of a woman with red-brown hair, hand up to her head on what appears to be a beach... it always looks strangely familiar. I tried a Google image search on it once and only found it on the stock image sites but I could swear I've seen the same model in the same location on one of the porn sites.
Someone's making the wrong comparison. You need to look at the cost of replacing the disc versus the value of the data on the disc. I suspect the disc is tiny in value, compared to that of the data it holds.
No, that is the wrong comparison. If you have data that you can't afford to lose on only one device (or even one array) that is your problem - if you have a backup of the data on another drive, the value of the data on the dead one is moot.
However, that still isn't the point they are making. It is being taken as read that the data must be protected and in that sense your point is the very opening premise of the study. They are not arguing over whether data should be protected but the most cost effective way of assuring that.
Having said that, I'm still not convinced the comparison is valid. I'll admit my experience is at the lower end of the scale, only going up to a few tens of terabytes, but in my experience the cost of the drives is usually around half of even the capital cost of the array. You have semi-fixed costs such as computer smarts and software on top, but the extra per-unit costs are not inconsiderable either - physical enclosures, controllers and power supplies all inevitably scale with the number of drives.
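To make that split concrete, here is a back-of-envelope sketch of why the bare drives end up being only around half the capital cost. Every figure below is an invented placeholder, not a real price:

```python
# Back-of-envelope array cost model; every number is an invented placeholder.
DRIVE_COST = 250     # cost per bare drive
BAY_OVERHEAD = 120   # per-drive share of enclosure, backplane, cabling, PSU
FIXED_COST = 1500    # controller, chassis, software: roughly constant

def array_cost(n_drives):
    """Total capital cost: fixed overhead plus per-drive costs."""
    return FIXED_COST + n_drives * (DRIVE_COST + BAY_OVERHEAD)

def drive_share(n_drives):
    """Fraction of the capital cost accounted for by the bare drives."""
    return n_drives * DRIVE_COST / array_cost(n_drives)

for n in (4, 8, 16):
    print(n, round(drive_share(n), 2))
```

With these placeholder numbers the drives are roughly a third of the cost at 4 bays and only just over half at 16, because the per-drive overheads (bays, controllers, power) scale right alongside the drives themselves, which is the point being made above.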