> I don't use Adblock on The Reg
I didn't until they started sliding things all over the screen. Hopefully I've managed to target just that one thing.
I think the issue is more that what should be just a civil wrong was met by several government agents, apparently under the control of the MPAA, with the man being detained and questioned for an hour before they could be bothered to do even the most cursory inspection of the evidence.
Agreed in principle. What American*, no matter how frightened of terrorism, actually wants their tax dollars spent on getting Homeland Security to rough up people in cinemas? No politician of any party is going to stand up and defend this.
That said, supposing the man had started filming as security approached him then the likely outcome would have been that (i) since he has now taken video footage without permission in a cinema, obviously he's a terrorist and can go straight to jail; and (ii) his device would have been confiscated before he had a chance to send footage anywhere.
(* or anyone else, anywhere else — this just happens to be an American story)
I don't mind letting a security firm raise its profile if it helps to create the narrative that smart appliances have more negative qualities than positive.
My understanding of the legal position is this:
Per Factortame, some acts are of constitutional significance. The European Communities Act is the one that case is about but it's far from being the only one. Such acts are special because they are not subject to implied repeal. If a later act wants to contradict a constitutional act then it has to do so explicitly.
Subsequent acts like the Human Rights Act (which incorporates the European Convention on Human Rights) have adopted some of the logic of this line of thinking: all acts are to be interpreted compatibly with the HRA unless they state explicitly that they're incompatible. Were there no recognition of the idea of a set of elevated acts, such a provision would be void since there's an underlying rule that no parliament can dictate which laws a future parliament may make or unmake.
So if a barrister could find a suitably significant act then he or she could argue that the subsequent one is not to be applied literally as written. Similarly judges could try to finagle some sort of unintended meaning out of the literal words if they really put their minds to it.
But in England and Wales courts cannot strike down legislation in any broad sense. This is something the Americans explicitly did differently, as a check and balance intended to ensure an ongoing separation of powers. See e.g. the role of the Lord Chancellor prior to 2005 for an idea of how much the British system has ever been bothered about technical separation of powers. We like strong competing interests but have historically not been especially bothered about whether powers technically may flow from one body to another.
[Mobile] Safari is the same, for the record. Zooms never reflow. I can't help feeling Google is throwing away a usability advantage. At worst they should, in classic Android style, make it optional.
I make less than $1,250 a day. I'm living the dream!
The facts don't show "that his products have been involved in a number of fires that should not be expected nor tolerated for these vehicles" at all.
The facts are that three Tesla vehicle fires have been reported, all following a collision. So the [US] National Highway Traffic Safety Administration is investigating whether there's a problem. It may find that there is and it may find that there isn't.
In most cases it'll mean recoding and recompiling to take full advantage. Probably only OpenCL apps will just work more quickly, and theoretically DirectCompute apps too, but Microsoft's inexplicable decision to bury that in DirectX makes it somewhat obtuse and hence obscure.
Right now Nvidia seems to dominate the market for compute languages with its proprietary CUDA, which isn't going to work on an AMD product. But CUDA and OpenCL aren't even as different as, say, C++ and Java, as they're built on fundamentally the same concepts. Think more like mid-'90s C++ and Ada 95.
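The concept the two share is the data-parallel kernel: you write a function that runs once per element, identified by a global index (thread/block indices in CUDA, `get_global_id` in OpenCL). A toy Python emulation of that shared model (not real CUDA or OpenCL syntax, just the shape of the idea):

```python
# Toy emulation of the kernel model common to CUDA and OpenCL: the
# "kernel" runs once per global index, and each invocation touches
# only its own slot in the buffers.

def launch(kernel, global_size, *args):
    """Mimic a 1-D grid dispatch: invoke the kernel once per index."""
    for gid in range(global_size):
        kernel(gid, *args)

def saxpy(gid, a, x, y, out):
    # The classic introductory kernel: out[i] = a * x[i] + y[i]
    out[gid] = a * x[gid] + y[gid]

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * 3
launch(saxpy, 3, 2.0, x, y, out)
# out is now [12.0, 24.0, 36.0]
```

Porting between the two is largely a matter of renaming the index lookups and the launch plumbing, which is why they feel closer than C++ versus Java.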
They might not buy it in preference to other options, but if it were de-Apple'd then, other than the price, there'd be nothing particularly to dissuade them.
The HP Z420 has the same RAM, processor, SSD size and essentially the same GPU but with twice as much GPU memory. It costs 13% more.
The Lenovo ThinkStation S30 has the same RAM and processor, half the SSD, 50% more GPU RAM but an Nvidia rather than an AMD (fantastic if you're a CUDA person but it benchmarks more slowly for OpenGL stuff), and costs almost 35% more.
That said, both of those are traditional desktops so they have the internal slots and drive bays. Also they are both computers that are already on the market versus the Mac Pro which isn't quite yet, so their pricing will have been set when components were more expensive than they are now.
Wait a few months and somebody will be undercutting Apple if they're not already, if only because Apple adjusts price and spec only maybe once a year and prices to be profitable from day one.
Sony launched a MiniDisc data drive but those geniuses decided to make the drives incompatible with the normal discs, creating specific MD Data media — it's identical to a normal MiniDisc except that the plastic case has an angled corner in the top-right. Presumably the record label had a word about music piracy and thereby lost Sony however many billions over the course of the '90s.
I'd prefer XP with a compositing window manager and a smarter start bar. Or Windows 7 as I like to think of it.
I actually think I'd probably be fine with Windows 8 as I've noticed that I tend to launch a few applications at the start of the day and then just use those, but there seems to be no benefit in upgrading. It's a shame there's no obvious commercial model that just makes the OS updates free, without locking down the hardware and introducing planned obsolescence.
I can't be the only person to have had one of those?
Also, as a niggle: I'd argue that the 68008 is either 8-bit or 32-bit as it's an 8-bit bus with a 32-bit instruction set architecture. Which I guess means it's an 8-bit machine in context, given that the hardware engineering seems to have been the primary goal of the project, Sinclair being a company that made money through selling hardware.
On the contrary, ARM is a proven architecture in a way Itanium definitely wasn't. Itanium bet on very long instruction words (VLIW): each instruction was a compound built of the instruction for every individual unit in the CPU.
Famously the Pentium has separate integer and floating point units. If it had used VLIW then each instruction would have specified both what the integer unit should do and what the floating point unit should do to satisfy that operation.
So the bet was that compilers are better placed to decide scheduling between multiple units in advance than a traditional superscalar scheduler is as the program runs. The job of organising superscalar dispatches has become quite complicated as CPUs have gained extra execution pipes so why not do it in advance?
The idea of VLIW had been popular in academia but had never succeeded in the market. So: the Itanium architecture was unproven in a way that ARM most definitely isn't.
In the real world the solution to the problem VLIW tackles has become to cap single cores at a certain amount of complexity and just put multiple cores onto each die.
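The VLIW bet can be caricatured in a few lines of Python (everything here is a made-up illustration, not any real ISA): each "instruction" is a bundle with one slot per execution unit, filled in by the compiler ahead of time, with no-ops padding the slots it couldn't use.

```python
# Caricature of VLIW execution: each instruction is a bundle with one
# slot per execution unit, assigned by the compiler in advance.
# None in a slot is a no-op: that unit idles for the cycle.

def run_vliw(bundles, state):
    """Execute (int_op, fp_op) bundles; both slots fire each cycle."""
    cycles = 0
    for int_op, fp_op in bundles:
        if int_op:
            int_op(state)
        if fp_op:
            fp_op(state)
        cycles += 1
    return cycles

def int_add1(s):  s['i'] += 1        # integer unit: i += 1
def int_mul3(s):  s['i'] *= 3        # integer unit: i *= 3
def fp_double(s): s['f'] *= 2.0      # floating point unit: f *= 2

# The compiler paired an int op with an fp op in the first bundle, but
# found nothing for the FP unit in the second, so it padded with a no-op.
program = [
    (int_add1, fp_double),
    (int_mul3, None),
]

state = {'i': 1, 'f': 1.5}
cycles = run_vliw(program, state)
# state['i'] == 6, state['f'] == 3.0, after 2 cycles
```

The padded slot in the second bundle is the weakness in miniature: whenever the compiler can't find independent work for every unit, issue width is simply wasted, whereas a superscalar scheduler gets to hunt for fill-in work at run time.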
I heard that if you can give it 1.21 gigawatts then it can go almost three full days without needing to be plugged back in. It's Apple's version of wireless charging.
I think 4K is here to stay on the basis that it's very useful for computers and economies of scale make it more efficient to manufacture one type of panel for all uses.
I know you might think that antialiasing solves the resolution problem because you can no longer see the pixels, but it's effectively a low-pass filter. You lose detail and things don't look sharp. Compare any of the first-generation tablets to the modern lot, even at a distance.
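Smoothing away the pixels necessarily smooths away fine detail too. A one-dimensional sketch makes the trade-off visible: averaging neighbouring samples (the essence of such filtering) softens a hard edge, but it flattens one-pixel detail just as readily.

```python
# Toy illustration: averaging neighbouring samples removes jaggies,
# but also flattens fine detail in exactly the same stroke.

def box_filter(signal):
    """Average each sample with its immediate neighbours (edges clamped)."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - 1), min(n - 1, i + 1)
        window = signal[lo:hi + 1]
        out.append(sum(window) / len(window))
    return out

# A hard black/white edge, like a pixel boundary...
edge = [0, 0, 0, 10, 10, 10]
smoothed = box_filter(edge)
# ...becomes a gradual ramp: the stair-step is gone, but so is the
# sharp transition.

# A fine alternating pattern (one-pixel detail) is crushed badly:
detail = [0, 10, 0, 10, 0, 10]
flattened = box_filter(detail)
```

High pixel density avoids the trade entirely, which is why the "retina"-class panels look sharp where the antialiased first-generation ones look soft.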
I think it's because chose and choose versus lose and loose tend to confuse people. If you were a bad speller, which of those would you think should rhyme?
Admittedly I own none of the modern consoles but to an outsider the overwhelming impression is that Sony and Microsoft will sell you games where army men direct you where to stand on screen in order to see the next story-boarded explosion whereas Nintendo prefers to promote the 25th iteration of Mario Kart.
I'd dare imagine Samsung can spend a little less on advertising now that it's accelerated away from the pack. It now just needs to make sure it isn't being outdone. I also don't think the "just another Android handset" label necessarily applies, as people show strong brand loyalty following positive experiences even in markets where the products are essentially interchangeable — think of things like food, clothing, etc.
I feel quite sad for HTC. The One should have been the hit of last year but ended up not even reaching HTC's expectations.
I don't think it's safe to assume Apple will respond as they're happy to ignore so many other segments that would be an easy segue — think the upgradeable desktop, the phablet, the netbook and more. They also often don't do very well when they do fill an apparently obvious incremental need, so I suspect that sucks some of the motivation out of the room. See the market performance of the headless Mac Mini.
The evolution of the iPhone screen is also a relevant case study. They quadrupled the pixels. They switched from 35mm-film style 3:2 to cinematic 16:9. Those are the two changes in seven models.
Good on Samsung for offering a range of devices for a range of consumers but surely it's going to be overly bulky? I think laptops get away with it because you put them on a surface to use and then ordinarily sit at arm's length away.
On the plus side, the weight doesn't sound that bad. It's about 750g, apparently, which although 57% heavier than this year's iPad is just 13% heavier than last year's. It's 158% heavier than a Nexus 7 though.
"That's utter tosh."
No, it was poorly phrased. By "full-fat Qt was considered too much hassle on a handset" I meant "the full Qt API was considered to add too many unnecessary complexities when developing for a handset". No performance issue — just a question of how precisely you tailor an API for use in a specific domain to the exclusion of other uses.
"You're moaning about an API you've clearly never tried based on a brief description of an early Alpha that explained how to create a custom "button class", and ignoring the features of the beta and released."
I'm moaning about an API based on the description of it given to me by the pitchmen Nokia paid to present it, a week before they abandoned Symbian, Maemo and Qt.
If you're saying I should instead judge a later version then my judgment is this: it's completely useless because Nokia phones use Windows Phone.
"The GPU claim is just a property of being newer - is Windows RT better than Linux, because it was written in a time when GPUs can be assumed? No. Newer OSs that come out will in turn be able to assume things that IOS could not, but that doesn't automatically make them better, let alone an argument for being revolutionary."
Revolutionary is a straw man. Nobody here has said revolutionary.
Launching when GPUs had become cheap enough is part of what allowed Apple to be first to commercialise the direct manipulation metaphor. The direct manipulation metaphor is a huge advance because it removes one more abstraction. If I want to scroll the web content, I just push it around with my finger, in any direction I like.
Compare and contrast with having mentally to link the position of the content to two individual sliders, and being able to scroll on one axis at a time only. You think nothing of it because you're a long-time computer user. Most of the world isn't.
So, yes, being able to design around having a GPU on hand is part of what allowed Apple to advance the state of the art (though, again, by _commercialising_ an idea, not by inventing it).
I maintain that direct manipulation is a huge advance and is one of the reasons Android now rules the roost.
"I found touchscreen Symbian perfectly fine compared to anything else of the time,"
That's very difficult to believe. I was given an N8 for free by Nokia as part of a developer relations effort, so that's nearly the final touch-screen Symbian phone. To pick one fault at random, from the limited set I recall at this distance: because Symbian was designed around the screen area being known and constant, most applications couldn't deal with a virtual keyboard that would, effectively, dynamically change the screen area. So Nokia hacked around that with a system whereby tapping on a text field took you to a completely separate screen, without any context, where text could be entered and then submitted back to the original text box, returning you to where you previously were.
I saw that in several places, without installing a jot of third-party software.
"If multitouch is the revolution you claim, then how come the complaint today Apple fans make of Android is that the phones are too big to use one handed - how do you use multitouch one handed?"
I can't actually answer for every Apple fan you've ever heard say something on the internet, but I would assume it's about options: use it with a single thumb or multitouch it. While my personal preference is smaller phones, the market has spoken fairly clearly: the 5–6" Android phones are where the action is. I guess it's all about hand size too, even in the opposite direction: a lot of people already can't reach the entire iPhone screen with a single hand, so what advantage is a 4" screen to them?
"Your comments about what Nokia couldn't do with development tools also make no sense - they switched to Qt, one of the best toolkits I've used, and development was as good as Android's for the time (some things were not as good, but some things were better)."
You're looking through rose-tinted glasses. Nokia was in the process of developing and trying to push Qt Quick because full-fat Qt was considered too much hassle on a handset. Even then, we had a thirty-minute presentation on how 'easy' it was to create a push button. Apparently all you need to do is create a graphic of the button, create a depressed graphic, add a touch-sensitive area, position it exactly over the graphic, then catch the appropriate finger up/down callbacks and push appropriate messages to show and hide your two graphics.
Android and iOS, of course, just have built-in classes for buttons.
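For a sense of scale, the manual recipe described above comes out something like this sketch (every class and method here is hypothetical, illustrating the shape of the work rather than any real Qt Quick API):

```python
# Hypothetical sketch of the manual button recipe: two graphics, a
# touch-sensitive region positioned over them, and hand-rolled
# press/release handling. None of this is real Qt Quick API; it just
# shows how much plumbing the recipe involved.

class ManualButton:
    def __init__(self, up_image, down_image, x, y, w, h, on_click):
        self.up_image = up_image        # graphic for the idle state
        self.down_image = down_image    # graphic for the pressed state
        self.rect = (x, y, w, h)        # touch-sensitive area
        self.on_click = on_click
        self.visible_image = up_image   # which graphic is shown now

    def _hit(self, px, py):
        x, y, w, h = self.rect
        return x <= px < x + w and y <= py < y + h

    def finger_down(self, px, py):
        if self._hit(px, py):
            self.visible_image = self.down_image  # swap in pressed art

    def finger_up(self, px, py):
        if self.visible_image is self.down_image:
            self.visible_image = self.up_image    # restore idle art
            if self._hit(px, py):
                self.on_click()

# Versus the built-in equivalent on Android/iOS, roughly one line:
#   button = Button(label="OK", on_click=handler)

clicks = []
btn = ManualButton("ok.png", "ok_pressed.png", 0, 0, 100, 40,
                   on_click=lambda: clicks.append(1))
btn.finger_down(10, 10)
btn.finger_up(10, 10)
# clicks == [1]: the tap was registered
```

And that sketch still skips the edge cases a real toolkit handles for you: drag-off-then-release, focus, disabled states, accessibility and so on.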
"As for AT&T, if they offered unlimited data to the minority of Apple users and not others, that's not something to praise."
AT&T went from — in line with the rest of the industry — not offering unlimited data to anyone, to offering it to some people. That's not something to praise?
If we're discounting (i) the interface (Apple: first to market with multitouch; first mobile OS built from day one to assume a GPU); and (ii) the cost of ownership (Apple strong-armed AT&T into unlimited data when everyone else was charging 50p/megabyte), then I guess you can write off the iPhone. And more or less everything else to happen in computing since at least 1980. Who needs the direct manipulation metaphor if a Computer Science graduate can learn to navigate a highly modal, nested interface on a phone with exactly one font and two levels of zoom using a combination of key presses, dials and a stylus?
Ignoring the good things Apple did for the market just weakens discussion of the many bad. It also implicitly attacks Android. Android has conquered the market by being a million times better than the near-unusable pre-iPhone rubbish. That's why throwing some gestures on top of Symbian and trying to do with development tools what they couldn't do with the OS didn't work for Nokia.
Not if you live in an American city other than New York, where public transport is anathema.
You'd be crazy to want to own a car in most of London.
i.e. for the people who don't live on the ground floor and/or who use on-street parking. I assume there are quite a lot of us, since so many of us fit into each building.
Fine, we also don't have anywhere to install the Fresnel lens and therefore can't charge it in a day but it's the only hybrid with an electric part that at least may sometimes be useful.
In America most employees get only ten days of holiday a year (versus twenty-eight in the UK and similar quantities across Europe), so the average Joe wouldn't gain that much from working the extra days — 252 rather than 242 is only about a 4% increase.
Though that's more than 1.9% so your point stands.
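For anyone checking the arithmetic, the two working-day figures above give:

```python
# Check of the working-days arithmetic above.
uk_days = 242   # working days with the longer UK holiday allowance
us_days = 252   # working days with the shorter US allowance
increase = (us_days - uk_days) / uk_days
# increase is roughly 0.041, i.e. about a 4% gain from the extra days,
# which is indeed more than the 1.9% figure being discussed.
```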
I completely agree. They put a web view into an application and didn't even make sure it sourced "retina"-resolution content, so the display was blocky and the user interface slow and obtuse, then washed their hands of the platform. Given Apple's mapping fumble, there was a fantastic opportunity to grow their user base but Nokia screwed it up.
Google's approach, of taking a few weeks more to come up with something fast and effective, was much more intelligent.
Microsoft: 64.1% of commercial channel computing devices that aren't phones
I'll bet Microsoft has a long way further to drop — all those years focussing on corporate hearts and minds is paying off, but probably not in the way intended. Though they still seem not quite to get the difference: I think shoehorning the phrase "with full Microsoft Office" into every consumer-targeted Surface advert isn't having the effect Microsoft seems to intend.
I think everything is well on the issue of homosexuality in the UK now, isn't it? Or, it will be from next March. Maybe I'm living a blinkered existence?
By including a single male with a Gear in the advert, they could not only have reflected their device's demographic in the abstract but have included every single person so far to buy one.
Smart watches are a solution without a problem.
If we're going to stick to America versus the UK, highlighting the existence of the constitution doesn't make a lot of sense because the UK is almost unique in being designed around not having one — the UK has a series of distinct documents that are considered to be of constitutional significance (i.e. are not subject to implied repeal per Factortame) but everything else is up for grabs or dictated by case law. For example, the protection against arbitrary imprisonment in England is the court's ability to issue a writ of habeas corpus. Does that make it any less of a protection? The only thing that stops the monarch from grabbing back absolute power is the certainty of what the public would do as a result. Does that make a sudden switch from democracy to dictatorship somehow more likely?
History further shows that the USA has been no faster to gift liberties in practice.
In the British Empire, trading in slaves became illegal in 1807 and keeping them at all became illegal in 1833. The USA banned the import of slaves in 1807 but didn't manage to abolish slavery until 1865. So which nation was ahead in liberty?
Nowadays the UK doesn't execute prisoners because it learnt the hard way that criminal systems are fallible no matter how many rights of appeal are offered. It has free comprehensive healthcare not just on humanitarian grounds — the belief is that healthcare is a fundamental right — but on purely functional ones: if a significant proportion of your population has worse health then that means you're likely to have worse health too, since many types of bad health are contagious and public spaces are shared.
Those are things a European mindset would suggest are advances but with which a USA mindset wouldn't agree. But that's just more evidence that most of the world does not recognise the American norms as some sort of ideal.
"I got a few thumbs down for saying it, but the American constitution and system of government is a historically amazing human achievement. It was an isolated event drawing much inspiration from the French revolution (via B.Franklin ), and coming "hot on the heels" of the English civil war. But the ideal of "life, liberty and the pursuit of happiness" is simply an amazing observation."
The American constitution and system of government is nothing like an isolated event. It is explicitly a fork of the British system, run locally so as to be answerable to local needs. It was designed to be bicameral, with one house that lots of people are elected to and one house that a small number of people are chosen for by other important people. It uses an adversarial, precedential legal system based on the premise that everything is legal unless explicitly proscribed. It explicitly adopted all British case law up to the cut-off. Major political party for the first half century? The Whigs.
The British system, of course, directly descends from that imported from Normandy by William the Conqueror in 1066. Ever wondered why we have mortgages, a few of which are puisne, or why civil wrongs other than those arising due to contracts are called torts?
Converting Locke's "life, liberty, and estate" to "life, liberty and the pursuit of happiness" does not an isolated event make.
Jailbreaking isn't piracy because it doesn't involve distributing something for free that the publishers desire to be reimbursed for.
Jailbreaking is also legal in most places. It's even explicitly legal in the USA thanks to an exemption to the DMCA — presumably because there's no MPAA/RIAA-style body opposed to it.
Bill Gates isn't Jewish.
He does look quite a bit older than his actual age — he's only 58 per Wikipedia. Given that he spent his professional career near Seattle, I don't think he can blame sun damage.
Apple's SDK supplies you with only the incoming stream of typed characters. So ordinary Bluetooth keyboards are useful for email, browsing and general productivity but not for games — it's like you get key down notifications but not key up.
Products like the iCade work around that by typing one character for button down and another for button up. This keyboard presumably does the same thing.
So it won't be usable with anything other than the Elite Systems app. It's an expensive workaround for an iOS-exclusive problem.
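The workaround is easy to sketch: give every physical control a press character and a distinct release character, and let the receiving app reconstruct up/down state from the character stream alone. (The mapping below is illustrative; I wouldn't vouch for it matching the real iCade character table.)

```python
# Sketch of the iCade-style workaround: the hardware "types" one
# character on button press and a different one on release, so an app
# that only receives typed characters can still track held buttons.
# Illustrative mapping only, not necessarily the real iCade table.

PRESS   = {'w': 'UP', 'a': 'LEFT', 'd': 'RIGHT', 'x': 'DOWN'}
RELEASE = {'e': 'UP', 'q': 'LEFT', 'c': 'RIGHT', 'z': 'DOWN'}

def track_buttons(char_stream):
    """Reconstruct which controls are held from a stream of characters."""
    held = set()
    for ch in char_stream:
        if ch in PRESS:
            held.add(PRESS[ch])
        elif ch in RELEASE:
            held.discard(RELEASE[ch])
    return held

# 'w' presses UP, 'd' presses RIGHT, 'e' releases UP:
state = track_buttons("wde")
# state == {'RIGHT'}: only RIGHT is still held
```

The cost of the trick is exactly what the comment above describes: the "keyboard" now emits gibberish characters, so it only works with apps that know the mapping.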
It's not particularly secret that digital cameras can see infrared, is it? That's why you can use your phone to test whether the batteries have run out in your remote control.
An oblique reference to the classic Gran Turismo tactic of bouncing off your opponents to get round corners?
Admittedly it's been years since I played one. Do they model car damage yet?
"Jag-yar", if memory serves. Surely you guys didn't do more than one thing to upset Apple?
Yes, they would. See VP8 or SPDY.
I think the difference versus classic 90s-era Microsoft is that what's good for the Internet is good for Google, so Google's proposed standards tend to be good ideas which Google genuinely intends to be open for all. One gets the impression that the point of Dart isn't that they can shut down Firefox et al, it's that the various Internet-connected devices from all manufacturers will be better at running web applications from all providers because then people will spend even more time on the web and Google's advertising revenue will increase.
The other difference is that people seem to be a bit more aware of the malign influence that even a benevolent single sponsor can produce. So Google's good intentions may still not be enough to win people around. But at least they're trying something.
It falls within your list of GPUs that required reverse engineering but I think pretty much everything about the Raspberry Pi's Broadcom is known now: see http://www.geomerics.com/media/presentations.html
Samsung didn't technically lose against Apple — being the claimant it merely failed to win. The standard might be only balance of probabilities but the burden was entirely on Samsung. It's much the same distinction that causes criminal defendants to be found not guilty rather than innocent.
My first silver metal laptop with black keys was a c.2001 Titanium PowerBook. I'm glad we can talk about this topic as it is exceedingly relevant to the story posted.
I think the argument is that Microsoft should pivot from profiting on every release to profiting only on the initial install — the copy of Windows your hardware manufacturer licenses. Exactly the same model as Microsoft already uses for Windows Phone, effectively, with the assumption that the rise in tablets and other Secure UEFI devices will more than offset the money lost to traditional PCs as they retreat into a business niche.
Giving everyone the latest version for free has other benefits too: Apple has used it effectively to take control of its platform's development tools and to strong arm all developers into supporting the latest technologies. If 75% of your user base is running the latest thing and 90% are running either the latest thing or the thing before then there's a pretty solid case for developers to use the latest tools and frameworks.
That would give Microsoft more leverage with which to push developers towards WinRT and store-bundled releases, and therefore give a huge boost to their tablet efforts. Based on the Windows Phone trajectory, Microsoft's tablets will probably be competing adequately with Android devices on build quality and cost within a few years, so if they can just find a way to get the application developers on board then they'll have a reasonably healthy pitch for consumers.
But that's the starting salary. So you get that straight out of university.
If you've at least a year or two's experience, you're pretty much guaranteed six figures in American money. There aren't enough employees to fill the vacancies and only a very limited number can be imported every year. That's why you always see the big tech firms backing any lobby group in favour of immigration reform.
I think you're conflating issues.
The Hour of Code is a national scheme. Participants offering a free one-hour in-store lesson include both of the companies that have the space to do so: Apple and Microsoft. The article seems more interested in the Apple angle but you can take that up with El Reg.
Neither Apple nor Microsoft allow you to develop for their ~$500 tablets without buying an additional computer. Does that mean neither of them should be allowed to take part in the scheme?
They are if you modify the gun and then resell it through the original maker.
Yes, I think that was the problem. It remained an AppWork product even after changes. Hence the company assumed responsibility and the CEO ended up being liable.
So it's broadly similar to going after a hosting company for propagating an infringing website.
How about Oakland West?
Yes, it's a smug, annoying name for the customer service counter.
However, somewhere in the 12 years since Apple introduced them, I think most people who are likely to hear about them will have made peace with the name.
For your next comment, maybe you could explain what this "Windows XP" thing is?