113 posts • joined Monday 28th September 2009 10:45 GMT
Re: Flashing news!
I suspect they're sniffing the browser agent string and sending different content to different browsers.
Indeed - and since IE11 is no longer detected as IE-family, the sniffer doesn't know what the heck it is, so it gets dumped in the 'must be some previously unknown variant of Netscape 2.0' bucket. Which is exactly what ASP.NET's default browser capability files do, up to .NET 4.5. There are hotfixes available for .NET 4.0 and 2.0-3.5, but I'm not entirely sure whether they just fixed the detection files, removed the detection feature, or defaulted to assuming maximum capabilities rather than fewest.
IE11's User-Agent string is substantially changed from older versions of IE, *because* it is a much more compliant browser and newer websites were sending incompatible, or fallback, content. The change makes it look a lot like Chrome, which means most sites will send it their latest content version, which should be mostly compatible in IE11.
Microsoft Dynamics will have to come up with an update that actually detects IE11 as a 'capable' browser before it will work without selecting Compatibility Mode.
The Compatibility Mode button - which is only available on the desktop browser, it's not in the 'Metro' version - tells IE to send an IE7 User-Agent string to the server (nearly - it sends ';Trident/7.0' in the string as a tell-tale that this is really IE11, not 7). The browser then defaults to its IE7 rendering mode, unless the site sends an X-UA-Compatible HTTP header (or META tag) telling it to use a newer mode.
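For reference, the override looks something like this - a minimal example showing both the META-tag form and the equivalent HTTP header ('IE=edge' asks for the newest mode; values like 'IE=8' or 'IE=EmulateIE8' pin older ones):

```html
<!-- META form: place early in <head>, before scripts and stylesheets,
     or IE may have already committed to a document mode -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />

<!-- Equivalent HTTP response header, sent by the server instead:
     X-UA-Compatible: IE=edge -->
```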
If IE decides that the server you're connecting to is on your Intranet, it will use the Intranet Zone settings. The default for the Intranet Zone is to always pretend to be IE7, which can of course cause problems for applications developed for IE8 and up. The Intranet Zone is, by default, only enabled for domain-joined computers, and the default detection rule is basically 'if the hostname in the URL doesn't contain any dots, it's Intranet'. The Intranet Zone rules are configured on the Security tab of Internet Options - click the Local Intranet icon, then click Sites to set up the rules for what is considered Intranet. To disable compatibility for intranet sites, press Alt+T to get the old Tools menu, select Compatibility View Settings, then uncheck "Display intranet sites in Compatibility View". All of these settings can be set through Group Policy.
Microsoft's Compatibility View list also gives them the ability to send a custom User-Agent string for specific domains. This is what went wrong with IE11 against Google's websites when it was first released: Google's code didn't work with IE11 originally, so Microsoft added their domains to the compat view list, indicating IE10 (but using the Trident/7.0 token rather than Trident/6.0 as IE10 would send). Then, just around the time that IE11 was released, Google fixed their code to work with both IE11's real User-Agent string and IE10's real User-Agent string - but it broke when IE11 sent its pseudo-IE10 string. MS then took Google's domains out of the Compatibility View list, but it takes a little while for the browser to download a new list.
As I recall, there was never a public version of 64-bit Windows (beta or Gold) for Alpha. NT 4.0 supported Alpha, using the 32-bit instruction set, and Windows 2000 supported it right up to release candidate 1. Then Compaq pulled the plug on support. MS press release: http://web.archive.org/web/19991012214337/http://microsoft.com/NTServer/nts/news/msnw/compaq.asp
Why did it matter for Compaq to support it? Windows on Alpha was never a retail product, only available with a new Alpha-based system (OEM product), and MS require the OEM to provide front-line support for OEM Windows. (I think they'd do better by standing behind their product, regardless of how acquired, but it's their decision, and a large part of why OEM Windows is substantially cheaper than Retail editions.)
I believe MS continued to work on 64-bit Windows using Alpha hardware until IA-64 hardware became available in moderate volume. WOW64's origins - of running 32-bit x86 Windows programs on 64-bit Alpha 'native' operating system - explain a lot of the oddities in the handling of 32-bit programs on x86-64, such as dual views of the registry, inability to load 32-bit code in a 64-bit process, completely separate 32- and 64-bit copies of most libraries, segregated Program Files folders, etc.
Re: IE 11 User-Agent string
Yes, it is:
Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko
And Google didn't work with that, so Microsoft set up the compatibility view list so that IE sent a very-nearly IE10 string to it:
Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.3; Trident/7.0)
The real IE10 sends:
Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; Trident/6.0)
At some point - presumably in the last week - Google changed their code. Now the real IE11 User-Agent string works, and the real IE10 User-Agent string works. The faux-IE10 string set by the Compatibility View list, however, doesn't. (I've just tested this out with IE10 on Windows 7, using Fiddler to change the requests before sending - sending 'Trident/7.0' causes it to break in exactly the way described.)
So now, Microsoft have changed the Compatibility View list so that IE11 sends its native User-Agent string.
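The failure modes above can be illustrated with a toy sniffer (the detection patterns here are my own illustration, not Google's actual code):

```python
import re

# The three User-Agent strings quoted above
IE11_REAL  = "Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko"
IE11_AS_10 = "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.3; Trident/7.0)"
IE10_REAL  = "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; Trident/6.0)"

def sniff_legacy(ua):
    """Old-style sniffer: IE is whatever carries an 'MSIE n' token."""
    m = re.search(r"MSIE (\d+)", ua)
    return ("IE", int(m.group(1))) if m else ("unknown", None)

def sniff_updated(ua):
    """Updated sniffer: trusts the Trident token, as IE11 detection requires."""
    m = re.search(r"Trident/(\d+)", ua)
    if m:
        # Trident/4.0 = IE8 ... Trident/6.0 = IE10, Trident/7.0 = IE11
        return ("IE", int(m.group(1)) + 4)
    return sniff_legacy(ua)

# The legacy sniffer can't classify IE11 at all...
print(sniff_legacy(IE11_REAL))    # ('unknown', None)
# ...while an updated sniffer sees the faux-IE10 string as IE11, and may
# then send content that IE's emulated IE10 mode can't actually handle.
print(sniff_updated(IE11_AS_10))  # ('IE', 11)
print(sniff_updated(IE10_REAL))   # ('IE', 10)
```

The pseudo-IE10 string is self-contradictory - MSIE 10.0 but Trident/7.0 - so whichever token a sniffer trusts, it gets a different answer.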
Microsoft are warning that they intend to remove the feature in future versions:
"Starting with IE11, document modes are deprecated and should no longer be used, except on a temporary basis. Make sure to update sites that rely on legacy features and document modes to reflect modern standards.
"If you must target a specific document mode so that your site functions while you rework it to support modern standards and features, be aware that you're using a transitional feature, one that may not be available in future versions."
Re: Doesn't necessarily mean it's Microsoft's fault
I believe it was a combination of the two:
- Google's code in March didn't detect the new IE User-Agent string properly...
- ...so Microsoft added google.com (etc) to the IE compatibility view list, telling IE11 to pretend to be IE10...
- ...then Google changed their code last week to work properly with IE11's correct User-Agent string, but break with the IE10 string (only when the IE11-specific 'Trident/7.0' appears, and therefore doesn't break in actual IE10)...
- ...now Microsoft have removed the CV-list entry so IE11 reports as itself
The current 'ttl' element in the CV-list is set to 1, presumably meaning cache for one day before fetching it again.
Information on IE's User-Agent string and Compatibility View list can be found at http://blogs.msdn.com/b/ieinternals/archive/2013/09/21/internet-explorer-11-user-agent-string-ua-string-sniffing-compatibility-with-gecko-webkit.aspx
@ Tom 13
The problem is, or was, Google not sending standards-compliant code to IE11, when IE11 sends its latest User-Agent string. Therefore Microsoft added Google's domains to its Compatibility View list.
This list does not necessarily do the same as clicking the Compatibility View button. The button forces IE to emulate IE 7 (which is useless, in my opinion - it should emulate IE 6). The Compatibility View list can cause a custom User-Agent string to be selected for a given site, it can turn other features on or off such as back-forward caching, it also lists domains that are known to require ActiveX controls (and therefore have to load in the desktop browser rather than the 'immersive' mode), and which GPUs and drivers are known to have problems with hardware acceleration.
IE11's User-Agent string is deliberately very different from IE10's, in order to cause more sites to send it standards-compliant code rather than code designed for IE 6. Google's code must have been detecting it incorrectly. In the current version of the compatibility list that I just retrieved, the only feature disabled for Google's domains is the back-forward cache; the same cache is also disabled for microsoft.com.
Switchover done, not tablets
In my view, the last decade's rise in TV sales was for two reasons:
1. Thin, light, and slightly less power-hungry flat-screen HD TVs (plasma, LCD, LED backlight) became sufficiently affordable to replace bulky, heavy, very power-hungry SD CRTs;
2. Practically the entire world went through a digital switchover.
Both of the above reasons fed on one another and led to a boom in TV sales.
Both reasons will have petered out. The digital switchover is complete in the major economies and well underway in the rest of the world, with deadlines in the next few years. Those people who were going to replace their TV for one with an integrated digital tuner have done so; those who added an external box are no more likely to replace their TV than they would have before switchover.
The trouble for TV manufacturers is that they really haven't come up with a new must-have beyond HD, which for many viewers is still a marginal benefit. People might be buying TVs with 3D, they might be 'Smart' TVs, they might even have 4K resolution, but in most cases that's simply because those features were bundled with a TV that had the desired size, picture and sound quality on normal 2D, 1080p and SD broadcasts.
Has anyone done an analysis of the relative sales of *real* PCs over the last five years? Those that are actually powerful enough to do more than web browsing on?
Asus and Acer were predominantly netbook manufacturers. I wouldn't be surprised to find that the market for netbooks running Windows has been essentially replaced by iOS, Android and (to an extent) Windows tablets. But the real question is whether the *rest* of the PC market is actually holding up, once the brief fad for netbooks is factored out.
X-UA-Compatible not a long-term solution
Microsoft are planning to withdraw compatibility modes from IE in future versions:
"Starting with IE11 Preview, document modes are deprecated and should no longer be used, except on a temporary basis. Make sure to update sites that rely on legacy features and document modes to reflect modern standards.
"If you must target a specific document mode so that your site functions while you rework it to support modern standards and features, be aware that you're using a transitional feature, one that may not be available in future versions."
Re: Obsessed with consumers
iPhone and Android devices support (or at least support*ed*, in the case of Android) Microsoft's Exchange ActiveSync protocol, which does email 'push' over HTTP/S - so you can hook them straight up to an Exchange server (2003 SP2 or higher) or any other server that implements EAS, and you just need a normal data plan rather than Blackberry-specific plans.
(Technically, EAS Direct Push is actually client pull - the server just doesn't reply to the client's request until it has something to send or the connection is about to time out.)
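That client-pull mechanism can be sketched as a long-poll loop. This is a toy in-process model, not the real EAS Ping command - the names and timings are illustrative only:

```python
import queue
import threading

# Toy model of EAS Direct Push: the 'server' holds each request open until
# mail arrives or the heartbeat interval expires; the 'push' is really the
# client asking again and again.

mailbox = queue.Queue()

def server_handle_ping(heartbeat):
    """Block until something is in the mailbox, or the heartbeat expires."""
    try:
        return ("changes", mailbox.get(timeout=heartbeat))
    except queue.Empty:
        return ("no-changes", None)   # client should simply re-issue the ping

def client_sync_loop(rounds, heartbeat=0.2):
    received = []
    for _ in range(rounds):
        status, payload = server_handle_ping(heartbeat)
        if status == "changes":
            received.append(payload)  # fetch the new item, then ping again
    return received

# Deliver a message while the client is sitting in an open ping
threading.Timer(0.05, lambda: mailbox.put("new mail")).start()
result = client_sync_loop(rounds=3)
print(result)   # ['new mail']
```

The point is that the client is blocked, not polling: between the arrival of mail and the timeout, no traffic flows at all, which is what makes it battery- and bandwidth-friendly on a phone.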
The integration between Exchange and the Blackberry Enterprise Server was always one of the big pain points, so I heard. Many Exchange admins would probably be glad to bury BES and BB in a ditch somewhere.
It appears that BB10 synchronizes email using the EAS protocol - this allows a BB10 device to be used for push email without BES. It's not clear whether BES10 wraps its tentacles round the heart of the Exchange server when it is installed, though, or merely acts as a man-in-the-middle extending the server's responses.
Module certification, not product
All the FIPS 140-2 certification does is say that if you use the crypto facilities in Windows (these modules are common across all implementations of the NT kernel), they will implement the approved algorithms properly, and not leak information outside the module to other parts of the application or to other applications. It's not a high bar.
This certification absolutely does not mean that data stored on the device is secure against external attacks.
Earlier versions of the Windows and Windows Phone crypto modules were also certified - the Windows Phone 7 ones were certified under Windows CE. I'm not sure what the threshold for needing a new certification is, but all that's happened here is that NIST's wheels have finished turning and the new certifications for Windows 8 have been signed off - just in time to start the process all over again for Windows 8.1. If, that is, whatever changed in 8.1 requires a new round of certification rather than just adding approval for the new version.
Re: Possibly stupid question
Yes. It does mean that. Not because of frequencies, but because the limited spectrum available after release of '700 MHz' won't be enough to reconstruct the services we have now, at their current coverage levels, unless the newer DVB-T2 standard and AVC/H.264 compression is used for many more of the services.
If you have Freeview HD equipment, you'll be fine. Any non-HD gear quite possibly won't work, or won't receive all SD services, after this band is released. You should check that any new equipment has the 'Freeview HD' logo (YouView equipment is fully DVB-T2/AVC compatible, but doesn't fully implement IPTV in the same way as the Freeview HD logo now requires, so can't have the logo).
If the decision to switch to DVB-T2/H.264 isn't taken, then it should just be a case of retuning the box. It'll still scan channels 49 to 69, it just won't find anything up there.
Personally, I don't think the case has been made for release of this band. It seems to suffer from circular reasoning: the predicted demand for bandwidth comes from demand for linear TV on mobile devices, so we have to turn off the current linear TV broadcast in order to make space to send it over an inferior protocol?
Re: It's not about addressable memory
*Microsoft* implemented PAE just fine, from Windows 2000 onwards. Manufacturers of commodity hardware didn't - most hardware and drivers in the PC world could not handle being presented with 64-bit physical addresses. So when introducing Execute Disable/No Execute in Windows XP SP2 - which requires PAE to be turned on, on x86 processors - Microsoft deliberately capped the physical address space at 4 GB for compatibility with the cheap hardware and bad drivers.
Server editions of Windows on 32-bit processors, both before and after XP SP2 / Windows Server 2003 SP1, can access however much RAM is fitted, up to whatever the limit is for that edition of Windows. Windows Server 2003 and 2008 Standard Editions are also limited to 4 GB, but for market segmentation reasons, not technical ones (i.e. want access to more than 4 GB of RAM? Pay more).
Re: Pesky paper trails
There is no proof that what the computer has recorded internally is the same as what the voter selected, and what is printed on the audit trail. That fundamental lack of ability to see how the machine is operating means that it cannot ever be trustworthy.
Re: Modern MIPS isn't as RISC as it used to be...
ARM doesn't have a delay slot, but if you perform any computations using the Program Counter register (e.g. retrieving literal pool data - immediate data that's too big or complex to fit in the immediate part of a MOV instruction) you find that it's actually pointing two instructions (8 bytes) beyond the instruction that does the computation. That's a bit mind-bending for anyone who grew up on a CISC processor.
Refarming permission already granted
Ofcom granted permission for O2, Vodafone and EE to refarm their 2G spectrum for first 3G and then 4G services. http://www.telecoms.com/161582/ofcom-approves-2g-and-3g-spectrum-refarming/
Right now, you can't make phone calls on LTE - we're still waiting for the networks to implement Voice-over-LTE. The phone falls back to 2G or 3G to make phone calls. That means 2G can't be completely switched off yet, as even 3G coverage isn't up to it.
Frankly, I think the telcos should be required to sort out their coverage, deploy VoLTE, and shut down 2G before they get any more spectrum. We're more than 13 years on from the 3G auction and coverage is still pretty atrocious. I particularly object to the idea that broadcast TV would have to go through yet another technology upgrade in order to keep its current coverage level and range of content, to make space for telcos to continue to run three generations of incompatible networks, the oldest of which was obsolete more than 10 years ago. A sunset date for 2G would *make* the telcos improve 3G coverage.
Stuck on WP 7.x
I suspect the OEMs weren't willing to put in the effort - if it was even possible - to bring up WP8 on their old hardware.
Windows CE does not have a standard boot loader. It is up to the OEM to write their own boot loader, which calls directly into the OS 'Startup' function once it has located the image to run. The kernel is mostly supplied as shared source code and Platform Builder will build your code and link it to produce an image. See http://msdn.microsoft.com/en-us/library/aa446905.aspx for details (that's CE 5.0 rather than 6.0 but it's much the same on 6.0 and later versions). Dealing with interrupts, timers, power management and other basic hardware resources is a job for the OEM Adaptation Layer, often written by the processor manufacturer (as a Board Support Package) but the OEM can customize it. http://msdn.microsoft.com/en-us/library/ee479387(v=winembedded.60).aspx
The Windows 8 kernel expects to run in a PC-like environment. For ARM devices, it uses UEFI to boot and ACPI to describe the system hardware in a way that Windows can use to configure itself to the system. I can't find anything explicitly saying that this is how Windows Phone 8 does it, but the intro for Windows RT is here: http://blogs.msdn.com/b/b8/archive/2012/02/09/building-windows-for-the-arm-processor-architecture.aspx .
I can easily imagine that the UEFI and ACPI implementation is larger than the space available for the CE boot loader. It's likely that the various hardware in the device doesn't conform to the Windows-on-ARM models that would allow generic function drivers supplied by MS to be used, meaning that the OEM would have to write new drivers (the driver models are completely incompatible). It's a vast amount of effort that would mostly be wasted if new devices conformed to the Windows-on-ARM hardware model, and probably running a huge risk of bricking the old phones even if it could be achieved.
I can't see this happening again: I think it is very unlikely that any technical changes will now obsolete Windows Phone 8 hardware. The kernel is the same as on the desktop, the server and on Windows RT devices, and it boots and talks to hardware in the same way. Microsoft don't have a third kernel stream to use (excepting research projects like Singularity, or the .NET Micro Framework which is smaller still than CE). There's a much clearer and cleaner demarcation between MS-supplied code and OEM-supplied. The runtime is the same as the full .NET Framework, with the server pieces removed but otherwise the same.
The main programming difference between Windows Store for desktop/tablet and Windows Phone apps is that Windows Phone Runtime (WinPRT) still wraps up the Silverlight/WP7 UI controls, rather than using the UI controls developed for Windows Runtime. Windows Phone 8.1 'Blue' is basically held up waiting for that. My suspicion is that the Windows Phone 8 SDK was so late because they were trying to get it done for WP8, but couldn't make it work in the space/speed/time available and cut it at the last minute. There's not much point investing heavily in the apps with a shifting base underneath - or maybe all the changes to the apps were already done for proper-WinRT-on-WP and thus can't readily be back-ported to the old UI components?
Non-DRM music stores
iTunes, Amazon, Play.com - that's three to be going on with
Amazon's only 'DRM' is that the MP3 file is watermarked in a way that can tie it back to your account. See http://www.amazon.com/gp/help/customer/display.html/ref=dm_adp_uits?ie=UTF8&nodeId=200422000 (some files don't even have this stamping). I think being able to trace where an unauthorized copy came from is a reasonable step.
The files themselves should play on any conforming MP3 player, so the usual complaints about having to repurchase, etc, simply don't apply.
The movie and TV industries really need to get a clue and follow suit. They're still stuck where the record industry were five years ago. The other thing the movie and TV industries really need to do is stop making exclusives and openly distribute all content through all stores: I'm unsure of which subscription service to join, because I have no guarantee that the content I might want to watch will be available through my choice. They don't have long-term exclusive deals for DVDs, why is downloading or streaming any different?
Time slicing is no problem at all if the majority of the threads on your system are blocked, waiting for something to happen (e.g. user input, a network request to complete). The battery killers are the apps that poll to find out if something's happened, rather than subscribing to an event that tells them something has happened. It's down to the OS to provide such a notification system, and for developers to use it rather than polling (the OS typically has to provide a way for the app to find out information when it starts up, or to make decisions in response to another notification - it can't *just* have events).
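The difference can be demonstrated with two threads waiting for the same event - one polling, one blocking. This is a desktop-Python sketch of the principle, not any phone OS's actual API:

```python
import threading
import time

# Two ways for an app to learn that 'something happened'.
# Timings here are illustrative only.

def wasteful_poll(flag, counter):
    """Spin asking 'has it happened yet?' - every iteration costs CPU."""
    while not flag["done"]:
        counter[0] += 1          # each check is a wakeup that burns battery
        time.sleep(0.001)        # even 'polite' polling wakes the CPU constantly

def efficient_wait(event, counter):
    """Block on a notification - no CPU used until the OS signals us."""
    event.wait()                 # thread is descheduled; no wakeups, no polling
    counter[0] += 1              # exactly one wakeup, when there is work to do

flag, ev = {"done": False}, threading.Event()
polls, waits = [0], [0]
t1 = threading.Thread(target=wasteful_poll, args=(flag, polls))
t2 = threading.Thread(target=efficient_wait, args=(ev, waits))
t1.start(); t2.start()

time.sleep(0.1)                  # nothing happens for a while...
flag["done"] = True; ev.set()    # ...then the event finally fires
t1.join(); t2.join()

print(polls[0], waits[0])        # many wakeups versus exactly one
```

Both threads learn the same fact at roughly the same time; the poller just paid for dozens of pointless wakeups to do it.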
Also, apps should not waste CPU time (hence power) calculating things that the user cannot currently see. On iOS, Windows Phone, and Windows Runtime ('Metro') apps on Windows 8, if an app is not in the foreground, all its threads are suspended. You have to specifically register distinct code to be able to run in the background. The OS only gives these background tasks a limited amount of time to run before killing them, to prevent runaway code killing the battery. Audio players, turn-by-turn navigation or location-tracking apps need to register so they aren't suspended, and Apple and Microsoft check that these permissions/capabilities aren't requested by apps that shouldn't have them when verifying them for store inclusion.
Android allows apps to create as many threads as they like, and doesn't suspend background apps. You might 'need' more cores on Android simply because background apps are unnecessarily wasting CPU time (and battery power).
The rule of thumb is that you need more cores if you constantly see more than 90% usage across all the cores. It's very unlikely that a single active app plus the OS rendering, and a few background tasks (that are throttled anyway) can actually saturate that many cores. Windows Phone and iOS devices top out at dual-core.
Cars in need of extra sponsorship often perform significantly better in pre-season testing than in race trim. In pre-season, they don't need to pass scrutineering, so can be below minimum weight, and don't have to provide a fuel sample after qualifying, so can run on just barely enough fumes to get round a lap.
The licence fee funds the BBC
I'm sorry, but you're wrong: ITV does not receive any part of the licence fee. It should not: it is an entirely commercial organisation.
I am trying to find a documentary source for you, but I'm struggling. The law (Communications Act 2003 section 365) requires the BBC to collect the licence fee, but to pay all money collected (less any refunds to be paid) into the government's main bank account, known as the Consolidated Fund. The government then decide how to allocate whatever they receive.
The Consolidated Fund accounts for 2012-13 show "BBC Licence Fee Revenue" as £3,122m. The BBC's Annual Report shows £3,091.7m income plus £16.8m 'premium' from the quarterly payment scheme.
The government, from 2008 to 2012, did top-slice the licence fee to fund Digital UK and the Switchover Help Scheme. Since the BBC was a large shareholder in Digital UK, its accounts were consolidated in the BBC's accounts, and SHS was also arranged under the BBC. Now that switchover is complete, that money is heading to the government's Broadband Delivery UK scheme.
The problem with real-world currencies is not governments
It's banks. Banks create money when they create loans, and they now create the vast majority of money (estimates range from 95-98%). Unless you are prepared to have 100% reserve banking, preventing banks from creating money, you cannot base an economy on BitCoin without completely debasing the currency. There are many sources for this, here's one: http://www.webofdebt.com/articles/dollar-deception.php
The fact is, fixed currency standards don't work: they cannot scale with the growth of the economy without mining an ever greater amount of whatever commodity you pegged the currency to, and then having to store it. Gold was useful here because it doesn't obviously degrade and has few uses (its use in electronics, providing a tarnish-free and reasonably conductive coating to ensure good contact for connections made and broken repeatedly, was decades away when we went off the gold standard). Failing to keep up with economic growth causes deflation, which is generally considered a bad thing: http://krugman.blogs.nytimes.com/2010/08/02/why-is-deflation-bad/
Fiat currency allows the supply of money to approximately match the aggregate demand of the economy, without uselessly mining a resource that you're not going to use. Central banks can wield a few levers to try to keep the supply slightly ahead of demand, in order to get a little inflation, which helps devalue debts as well as savings. The problem we've had for a decade or so is that the economy is very imbalanced, with consumer electronics largely in deflation, cancelling out some very high inflation in house prices (not measured in the favoured 'consumer price inflation' metric) and other commodities.
Really, money is just a medium of exchange: something that has wide acceptance in exchange for other things. It's just our IOUs to each other: I owe you a day of software development, you owe me an Xbox. By assigning numbers to these IOUs, I can transfer your IOUs so that Samsung owe me a TV. You have to think of money's value being in terms of what it can buy. Instead of thinking that a sandwich costs £2.50, you say that a pound is worth 4/10ths of a sandwich.
The major problem for government is that politicians do not understand how money is created, and how differently it behaves under a fiat currency system compared to a pegged system ('gold standard'). Too many economists - and, unfortunately, the ones that the politicians are listening to - still make their predictions based on ideas from the gold standard era - that there is a finite amount of money.
Re: What doesn't help with Adobe..
Apparently there is a way to bundle third-party applications in a way that WSUS can consume: see http://wsuspackagepublisher.codeplex.com/
Microsoft have not bothered to make it possible to update third-party applications through Windows Update because the vendors all want to have control over the updating experience, and won't produce proper MSI installers that actually use Windows Installer properly (rather than just wrapping a script, for example). Windows Update does support driver updates, but when did you last see a timely update for your graphics card on WU? Never, because nVidia and ATI insist on shovelling additional control panels and other shovelware along with the driver, and don't package the install properly.
Adobe get kickbacks from Intel for bundling McAfee AntiVirus with Flash, Oracle get kickbacks from Ask for bundling their toolbar. I'm sure one of them tries to bundle Chrome as well. If Ninite are allowed to install without offering the prompt, Adobe and Oracle don't get their kickbacks.
Re: Thought it said Free Software Foundation on the door
No, it's simpler than that. Any DRM system *must* have the decryption key on the user's system as well as the encrypted content. The only way that the key can be protected is by some form of obfuscation. Even if protected by other system or application keys, the application has to be able to unbundle the key.
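A toy scheme makes the structural problem obvious - the player must contain both the encrypted content and the means to decrypt it. (XOR is used purely for illustration; no real DRM scheme is this naive, but the key-distribution problem is identical.)

```python
# Toy 'DRM': the decryption key ships inside the player itself.

SECRET_KEY = 0x5A          # 'hidden' in the binary - obfuscation at best

def encrypt(plaintext):
    return bytes(b ^ SECRET_KEY for b in plaintext.encode())

def player_decrypt(blob):
    # Any user with a debugger can watch this run and read SECRET_KEY
    return bytes(b ^ SECRET_KEY for b in blob).decode()

blob = encrypt("premium content")
print(player_decrypt(blob))          # the legitimate player works...

# ...but so does anyone who simply lifts the key out of the player:
stolen_key = SECRET_KEY
print(bytes(b ^ stolen_key for b in blob).decode())  # identical output
```

With open source, the 'lifting the key out' step doesn't even need a debugger - the code that hides the key is published, which is the whole problem.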
Open source software can never be certified for implementing a DRM system, because there is no way to hide the mechanism that hides the key without massively obfuscating the code - something that would simply never be accepted into the source tree. There would have to be a binary blob somewhere implementing the DRM, and that is not compatible with the GPL. It is compatible with *other* open source licences, but the FSF's purpose is to promote the GPL.
So we have an impasse. Hollywood won't release its content officially without DRM, but GPL software cannot implement DRM, and it offends the sensibilities of other contributors to the W3C.
Re: RE: thus proving taxation systems are broken
I don't see why corporate taxes should not be assessed on revenues rather than profits. My income tax is assessed on, well, my income, less a personal allowance. In fact my personal allowance is slightly *reduced* because my employer pays for private health insurance - which I don't expect to use, but haven't opted out of.
I'd have no problem with allowing a 'corporate allowance' of something like number of employees registered in PAYE, multiplied by some reasonable wage level, to ensure that the company can always pay its employees.
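The scheme I'm suggesting is trivially simple to compute - all figures below are made up for illustration:

```python
# Sketch of a revenue-based assessment with a 'corporate allowance' of
# employees x a reasonable wage level. All figures are hypothetical.

def revenue_tax(revenue, employees, wage_allowance=25_000, rate=0.05):
    """Tax revenue, not profit - but always leave enough to pay the payroll."""
    allowance = employees * wage_allowance
    taxable = max(revenue - allowance, 0)
    return taxable * rate

# A company turning over 10m with 100 staff: the allowance shields 2.5m,
# leaving 7.5m taxable at 5% = 375,000.
print(revenue_tax(10_000_000, 100))

# A labour-intensive business below its allowance pays nothing:
print(revenue_tax(1_000_000, 100))   # 0.0
```

Because the assessment is on revenue, there's nothing to shift offshore: the sale happened where the customer is, full stop.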
Re: "introduction of a new rendering engine can have significant implications for the web"
The problem is that standards are very difficult to specify precisely using English. The specifications for HTML 4, CSS level 1 and CSS level 2 have not changed in 15 years. They were sufficiently ambiguous that even though browser manufacturers were doing their best to test to the specifications - even Microsoft for IE 6.0 - there were incompatibilities between the results. There were no rigorous, shared, conformance tests for any of those until really the last couple of years, so the required behaviour was not nailed down - still isn't, really. Even different versions of WebKit - that is, current builds of Chrome and Safari - could, and do, produce different behaviour.
CSS level 2 was found to be so ambiguous, and have so many underspecified features, that it led to a revision 2.1 which nailed more stuff down and removed a lot of the underspecified stuff.
A lot of the effort in the HTML5 and HTML v.Next, and related, specifications has gone into nailing down precisely what was actually meant in earlier versions. There's now a serious effort to write shared conformance tests, and to actually run them automatically for each browser build, checking for regressions. IE has quite a lead in the official conformance tests, because Microsoft have been submitting the most tests to the suite - not without debate as to whether the test actually tests the behaviour it claims to test, and whether it comes up with the right answer.
Re: thus proving taxation systems are broken
The source of these problems is actually very simple: countries have agreed to write their tax systems so that multinational companies are not taxed twice on the same profit - called 'double taxation'. This is supposed to be fairer to the company.
The problem occurs when small jurisdictions that don't need a lot of revenue - in absolute terms - set their tax rates very low - in percentage terms. (Or when places set corporate taxes to zero for overseas corporations and raise all their revenue from residents.) By assigning some income to such tax havens, or exaggerating the costs of some required resource whose supply is routed through the tax haven, the corporation can reduce its tax bill in the high-tax countries that are actually providing the revenue. This is referred to as 'double non-taxation'.
Google, I believe, has assigned the copyright to its logos to a subsidiary in a tax haven, then that subsidiary charges a ridiculously large amount to each national subsidiary for use of those logos. Starbucks did something similar, and also routed all buying of coffee beans via Switzerland, for which the Swiss subsidiary extracted very high management fees, so each national subsidiary is paying far more than open market price for coffee.
Amazon UK's servers are actually hosted in Luxembourg, and all purchases from amazon.co.uk are therefore reported as being made in Luxembourg, meaning they pay Luxembourg's very low rate of VAT rather than the UK's much higher rate. VAT-bearing goods were formerly routed via Guernsey - as in, shipped from a UK warehouse to a Guernsey subsidiary, and back to the customer in the UK - in order to avoid VAT, but HMRC have closed that one (Low Value Consignment Relief was a special feature for the Channel Islands, intended for small businesses actually based on the Islands selling small amounts of stuff to the UK, but it was abused, and so small Guernsey businesses don't get the relief any more.)
Microsoft have set up their patent licensing subsidiary Microsoft Open Technologies Inc in a tax haven, and Microsoft Corp will pay MOT Inc royalties for use of those patents. (You didn't think it was really about making the interoperability groups arms-length from Redmond, did you?)
The answer is also quite simple. Strike out the double taxation rules. All revenue raised in the country that the end customer lives in is taxed at the prevailing rate in that country. Multinationals are then playing by the same rules as corporations that do business solely in one jurisdiction.
However, that is considered bad for business, so the suggestion from the Tax Justice Network is to employ country-by-country reporting. That is, change the global accounting standards so that multinationals are forced to report accurately how much revenue was raised from each country. The group profits are then apportioned to each country according to the proportion of revenue, and tax assessed in each country according to the corresponding part of the profit.
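As a toy illustration of how that country-by-country apportionment might work - all figures, rates and country names below are invented for illustration, not real accounts:

```python
# Sketch of unitary taxation with formulary apportionment by revenue:
# split the group's profit in proportion to where revenue was raised,
# then tax each slice at that country's prevailing rate.

def apportion_tax(group_profit, revenue_by_country, rates):
    """Return tax due per country under revenue-based apportionment."""
    total_revenue = sum(revenue_by_country.values())
    tax_due = {}
    for country, revenue in revenue_by_country.items():
        share = revenue / total_revenue
        tax_due[country] = group_profit * share * rates[country]
    return tax_due

# Invented example: £100m group profit, 80% of revenue raised in the UK.
revenue = {"UK": 400.0, "Luxembourg": 50.0, "Ireland": 50.0}  # £m
rates = {"UK": 0.23, "Luxembourg": 0.29, "Ireland": 0.125}
print(apportion_tax(100.0, revenue, rates))
# The UK taxes 80% of the profit at its own rate, regardless of
# where the group chooses to book the income.
```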
Re: Internet Explorer 6 staggers on?
Microsoft's own upgrade-from-IE6 website http://www.ie6countdown.com/ (which uses statistics from http://netmarketshare.com/ ) indicates that the Far East is really the only outpost left where IE6 has significant usage share on the open web. Well, let's be honest: China. In the UK it's well below 1%.
NetMarketShare weight their statistics - gathered from tracking bugs on websites using HitsLink, I believe - by overall internet traffic from each country, to rebalance the distribution of users of their customers' websites. StatCounter do not do this. It does mean there could be big sampling errors if relatively few users from China are browsing sites that use HitsLink.
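A toy model of that reweighting - the numbers and function are invented purely to show the mechanism:

```python
# Rebalance sampled browser shares by each country's share of overall
# internet traffic, so an under-sampled country isn't under-represented.

def reweight(sample_hits, traffic_share):
    """sample_hits: {country: {browser: hits}};
    traffic_share: {country: fraction of global traffic}.
    Returns the weighted global share per browser."""
    weighted = {}
    for country, hits in sample_hits.items():
        total = sum(hits.values())
        for browser, n in hits.items():
            weighted[browser] = (weighted.get(browser, 0.0)
                                 + (n / total) * traffic_share[country])
    return weighted

# Invented sample: IE6 is common in the Chinese sample, rare in the UK one.
sample = {"CN": {"IE6": 30, "Chrome": 70},
          "UK": {"IE6": 1, "Chrome": 99}}
share = {"CN": 0.5, "UK": 0.5}
print(reweight(sample, share))  # IE6 ends up near 15.5% globally
```

Note the sampling-error risk mentioned above: if the Chinese sample is tiny, that 50% traffic weight amplifies whatever noise is in it.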
I'm still not sure how well these companies deal with Network Address Translation, where multiple computers share a single public IP address. The Far East notoriously also has very few public IPv4 addresses, with NATs being widely deployed. If the counter cannot see through the NAT, it will record a count of 1 for each browser used behind the NAT regardless of whether there is one instance or a million, heavily distorting the results.
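A toy model of that distortion, assuming the counter keys unique visitors on (public IP, browser) - the addresses and counts below are invented:

```python
# If visitors are counted by public IP address, everyone behind one
# NAT running the same browser collapses to a single count.

def count_by_ip(hits):
    """hits: iterable of (public_ip, browser) pairs.
    Counts distinct (ip, browser) combinations."""
    return len({(ip, browser) for ip, browser in hits})

behind_nat = [("203.0.113.1", "IE6")] * 1000   # 1,000 users, one NAT
public_ips = [(f"198.51.100.{i}", "Chrome") for i in range(10)]

print(count_by_ip(behind_nat + public_ips))  # 11, not 1010
```

So a million IE6 users behind carrier-grade NAT could look like a handful of visitors, while ten directly-connected Chrome users count in full.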
Re: TV is in the way
Apparently I can't subtract today. Three have 2x15 MHz at 2.1 GHz. They would still have to turn off some 3G to get some 4G in this band, if any phones even support LTE in this band.
TV is in the way
'Three' cannot launch their LTE service yet because they only got 2x5 MHz in the auction, and they got the lowest-frequency block, which will still be occupied by TV services in some parts of the country until the end of July. Their only other licensed spectrum is 2x5 MHz in the 2.1 GHz band (well, and 1x5 MHz intended for time-division duplexing, which has never been used). That spectrum is used for their UMTS (3G) services, and UMTS cells would, I think, have to be turned off to repurpose them for LTE.
"I like the free open standard better than the H.264, thanks."
@Mikel: You have that backwards. H.264 is the open standard, developed by the Moving Picture Experts Group under the joint auspices of ISO, IEC and ITU. H.264 is the ITU-T Recommendation number - it is also known as MPEG-4 Part 10 Advanced Video Coding, and published as ISO/IEC 14496-10. In order to be published by these organizations, contributors have to sign up to the organizations' patent policy, which says that patents covering the specification must be available on fair, reasonable and non-discriminatory terms - though it does not define what those words actually mean. Due to the wide membership of MPEG and of the standards organizations, it is less likely that someone who hasn't signed up to FRAND terms will later claim that a patent of theirs is essential to implementation and hold implementers hostage.
US courts have prevented Qualcomm from blocking Broadcom's use of Qualcomm-patented technology in an implementation of H.264, because Qualcomm signed up to the patent policy.
MPEG LA's role is that some of those patent holders have employed MPEG LA to look after their interests, regarding patents considered essential to various MPEG standards. MPEG LA extracts an administration fee before divvying up the royalties among the various patent holders. MPEG LA would *like* to be a one-stop shop for licensing all patents essential to H.264 (and MPEG-2 Visual, and a number of others) but there is no compulsion for other patent holders to join. When they talked about 'forming a patent pool' they were inviting patent holders to make similar arrangements.
VP8's *reference implementation* is published under an open source licence. The *specification* is published on the WebM project's website, and Google provide a royalty-free licence to all patents that Google owns, or has obtained the authority to sub-licence. Google have recently agreed such authority with MPEG LA for some patents that are part of MPEG LA's other patent pools (and MPEG LA have agreed to stop trying to form a pool for VP8). However, *other* companies could still hold VP8 implementers hostage if they have patents essential to VP8 implementation.
We cannot know whether there are such patents. The national patent offices simply do not organize their patent databases in a way that you can properly search, and there is a disincentive to searching: in the USA, you can get triple damages awarded if you have 'wilfully' infringed, and wilful infringement has been decided if the implementer read the patent and decided that it didn't apply. The exact wording of the patent will only be interpreted in a court case, and courts have frequently applied the widest possible interpretation of the wording. For example, Toyota have to pay Paice Technologies royalties on the Prius and other hybrid cars, even though the patent in question specifically mentions how their implementation is different from the mechanical design used in the Prius, itself taken from an expired TRW patent; the claims were read so widely as to apply to any car that combines a petrol engine and an AC motor, AC provided by inversion from a battery.
However, we do know that Nokia believe they hold such patents, essential to implementing VP8, and therefore the IETF cannot publish the RFC as Nokia refuse to licence them.
I'm not defending patents as they currently stand. I think the issues we see largely represent a failure of imagination of the patent office staff, that they are granting the most obvious patents, combining known techniques in a not-particularly-novel way, and one that would be or was discovered totally independently, with no real exposure to the original implementation. The patent *system* makes it unbelievably difficult to actually find out if the problem you're facing *has* already been solved - if we could look up a solution and know it's going to cost us a dollar per device, rather than spending years on finding a solution, we might pay it. What's galling is when you do spend those years finding the solution, only to have someone say 'no, we invented that - pay $$$ per device' when they actually contributed *nothing* to your solution.
Re: heavily weighted towards Labour MPs
The faces presenting the policies may change with an election, but the people writing the policies don't. "Yes, Minister" is heavily fact-based: ministers 'go native' with alarming speed, though perhaps not surprisingly considering they usually have no knowledge or experience in the portfolio they have been assigned, and also no experience in managing staff.
Microsoft's Support Lifecycle policy for Windows is to support a service pack - or the original release, where only one service pack has shipped - for two years after the release of the following service pack. The actual end date is aligned to the next Patch Tuesday (the second Tuesday of the month), which is 9 April. Future updates will only be installable with Windows 7 SP1 as the baseline.
All this means is that if you reinstall Windows 7 from a disc or image without SP1 applied, Windows Update will first offer all the security and critical updates from RTM to this month, then it will offer SP1, then any updates released after SP1.
Windows 7 *itself* is in mainstream support until 13 January 2015, and extended support until 14 January 2020. In the mainstream support period, you can call up for paid support, you can use any free incidents that you got when buying the product, you can get non-security hotfixes and if you really want to, you can make change requests. In extended support you still get paid support but the free incidents are no longer valid; you still get security hotfixes but other fixes require an extended support contract, which you have to take out within 90 days of the end of mainstream support; warranty claims and design change requests are no longer accepted.
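The Patch Tuesday alignment mentioned above - support end dates land on the second Tuesday of a month - is easy to compute; a minimal sketch:

```python
# Compute Patch Tuesday: the second Tuesday of a given month,
# which Microsoft aligns support end dates to.
import calendar
import datetime

def patch_tuesday(year, month):
    c = calendar.Calendar()
    tuesdays = [d for d in c.itermonthdates(year, month)
                if d.weekday() == calendar.TUESDAY and d.month == month]
    return tuesdays[1]  # the second Tuesday

print(patch_tuesday(2013, 4))  # 2013-04-09, the 9 April date quoted above
```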
There is no incentive if the technology is mandated
Firstly, there are six national multiplexes, not five. There are five SD multiplexes and one HD. At the moment. Ofcom are running a competitive process to launch two new ones.
The idea of the incentive pricing is to encourage the spectrum to be used efficiently. However, there is no point applying an additional tax if the broadcasters' hands are tied on becoming more 'efficient'. The spectrum plan and technology for Freeview was set in stone by government: the public service broadcasters had to achieve 98.5% population coverage, the BBC had to free up its second multiplex to convert it to HD mode, the majority of viewers had to be able to use existing aerials fitted for analogue reception, and we had to fit into the internationally-co-ordinated frequency plans. That really meant a requirement to use the 64QAM, FEC 2/3, 1/32 guard interval mode that the BBC and ITV/C4 are using. If they change that mode, to get more capacity and become more efficient, coverage will be reduced. The limits of what can be crammed into the 24 Mbps available have been pretty much reached, without reducing quality any further. There are already criticisms from many viewers that many channels are unacceptably low-quality, running 16:9 broadcasts at a resolution intended only for 4:3 pictures (544 x 576 pixels) and at a bitrate low enough to prevent the normal smoothing of macroblock edges from working properly.
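That 24 Mbps figure follows directly from the mode parameters; a quick sanity check using the standard DVB-T parameters for an 8 MHz channel in 8K mode:

```python
# Useful bitrate of a DVB-T multiplex in the UK's chosen mode:
# 8K carriers, 64QAM, FEC 2/3, 1/32 guard interval, 8 MHz channel.

DATA_CARRIERS = 6048        # payload-carrying carriers in 8K mode
BITS_PER_CARRIER = 6        # 64QAM = 6 bits per carrier per symbol
FEC = 2 / 3                 # inner convolutional code rate
RS = 188 / 204              # Reed-Solomon outer code overhead
T_ELEM = 7 / 64e6           # elementary period for an 8 MHz channel (s)
T_USEFUL = 8192 * T_ELEM    # useful symbol duration in 8K mode (896 us)
GUARD = 1 / 32
T_SYMBOL = T_USEFUL * (1 + GUARD)  # total symbol duration (924 us)

bits_per_symbol = DATA_CARRIERS * BITS_PER_CARRIER * FEC * RS
bitrate = bits_per_symbol / T_SYMBOL
print(f"{bitrate / 1e6:.2f} Mbit/s")  # ~24.13, the figure quoted above
```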
The HD technology - DVB-T2 and AVC/H.264 encoding - can also be used for SD services, but is only viewable on Freeview HD receivers. The majority of viewers don't have one. The two new multiplexes - to run in this mode, and give four or five extra HD channels each - are intended as an additional incentive for viewers to go and buy a new receiver. If a majority of viewers haven't done that, it won't be politically acceptable to turn off DVB-T/MPEG-2 support, and that will make the release of 700 MHz very difficult as there really isn't space for six national multiplexes in what remains. Viewers will be seriously angry if they lose services because of this - as it is, there are many people upset by the fact that they can't get three of those multiplexes if their local relay is PSB-only.
It won't be politically acceptable as people will expect the government to fund replacement equipment. For switchover, enough people had voluntarily switched that the government could get away with only subsidising equipment for pensioners over 75, the disabled, and other groups on long-term welfare. It was funded by increasing and top-slicing the TV licence fee, but only by a small amount as so few people were covered.
Meanwhile, the mobile phone networks are now running three generations of technology concurrently, with no end date for 2G announced or even considered. Phones still rely heavily on the 2G network for basic communications, as the promises of 3G coverage were broken and the coverage requirements were eventually removed. O2's block of 800 MHz spectrum comes with coverage obligations - 90% of the population, if I recall - but the rest of the recent 4G auction has no obligations attached at all. It's still unclear if Voice-over-LTE even works, leaving voice services dependent on 2G in much of the country.
Re: Whinging Cambridge
Cambridge's local TV service has had a frequency reserved for it which will not be available to white space devices. It is still considered a 'white space' because it isn't used to cover Cambridge from current TV services, but isn't available to run a full-power service as it would interfere. Cambridge is normally covered, for TV services, by the Sandy Heath transmitter in Bedfordshire: the local TV service will come from the Madingley site formerly used by Channel 5, on UHF Channel 40. This frequency is, or soon will be, used by the Welwyn relay and three relays near High Wycombe, so is unavailable at Sandy Heath.
Regarding white space devices, a BBC/Arqiva joint report for Ofcom basically says that the TV spectrum is so densely used that only about a quarter of UK households could use a white space networking device. This will drop to only 3% if the 700 MHz band is reallocated to mobile phone networks and the TV spectrum is replanned, which Ofcom seem keen on doing in around 2018. See http://stakeholders.ofcom.org.uk/binaries/consultations/uhf-strategy/statement/BBC_Arqiva_preliminary.pdf for the report. I'm counting scenario 3 - where the 600 MHz band is used by two new TV multiplexes from 25 sites - as this is the model Ofcom subsequently chose from that consultation.
Re: Bands not used for existing broadcasts locally?
The BBC were required to get BBC Alba onto Freeview in Scotland, but weren't given any extra money to do so, nor allocated any more spectrum. That meant having to carry it on their SD multiplex, the second multiplex having switched to the incompatible second-generation DVB-T2 standard to make enough space for four or five HD services. (The BBC are required to carry STV HD and 4hd, and it was expected they would have to carry Channel 5 HD as well, until C5 pulled out yet again.)
So the choice was basically make picture quality terrible on all SD services while BBC Alba is running, or turn off the radio stations.
The local TV services have been granted a multiplex of their own, on frequencies that are generally close enough to the existing multiplexes that existing aerials should pick them up. The multiplex has space for the local TV service, and one or two extra slots that will be sold nationally by the multiplex operator Comux.
Re: Why no bigger cities?
Those cities were part of phase 1, a programme supplier has already been selected, and they are due to launch over the next year. This is phase 2.
Re: Yer not strange
You need to get a Windows Phone. Settings > Website preference > desktop version in WP8. That changes the User-Agent from:
Mozilla/5.0 (compatible; MSIE 10.0; Windows Phone 8.0; Trident/6.0; IEMobile/10.0; ARM; Touch; <manufacturer>; <model>)
to:
Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; Trident/6.0; ARM; Touch; WPDesktop)
The only difference between that and a Windows RT tablet is the 'WPDesktop' token.
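A sketch of how a server-side sniffer might tell these apart - the function name and category labels are invented for illustration, and the ordering matters because the desktop-mode WP8 string otherwise looks exactly like Windows RT:

```python
# Classify the three User-Agent variants discussed above.
# 'WPDesktop' must be checked before the generic NT 6.2 + ARM match,
# since the WP8 desktop-mode string is otherwise identical to RT's.
import re

def classify(user_agent):
    if "IEMobile" in user_agent:
        return "Windows Phone (mobile mode)"
    if "WPDesktop" in user_agent:
        return "Windows Phone (desktop mode)"
    if re.search(r"Windows NT 6\.2.*ARM", user_agent):
        return "Windows RT"
    return "other"

ua = ("Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; "
      "Trident/6.0; ARM; Touch; WPDesktop)")
print(classify(ua))  # Windows Phone (desktop mode)
```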
Re: hmmm...what's the real story here
Rubbish. The 4G auction was structured in the same way as the 3G one. It didn't raise as much money because we're in the depths of a recession (not triple-dip - in my book we haven't had enough sustained growth ever to have been considered out of it), rather than at the peak of a tech bubble with ludicrous expectations of video calling.
The UK auction actually raised 33% less money than the Treasury had put into their books for this financial year, a whole £1.16bn short.
"they had to concede defeat that flash is still more popular than Silverlight"
Silverlight is completely blocked in IE10 on Windows 8 in the 'immersive' environment (TIFKAM), and in both environments on Windows RT. Not even Microsoft websites are allowed to use Silverlight. So that argument really doesn't fly. The comments on the IE team's blog post are full of people complaining about Silverlight not being available.
The reason for supporting Flash is exactly as stated: because many websites simply do not work without it. The trend is to provide an "HTML5" website for iOS, that usually doesn't work on anything else, but particularly not on IE 9 or 10.
Re: What is MPEG-LA's take-away?
It means that Google have acknowledged that it *does* (potentially) infringe on a number of patents that MPEG-LA administer for their respective holders, and that they have agreed a royalty to be paid by Google for all copies. The patent owners aren't going to give it away for free, and I imagine MPEG LA have inserted themselves in between Google and the patent owners and will get administration fees.
The bit about 'MPEG LA will discontinue its effort to form a VP8 patent pool' just means that they have agreed that Google will manage the licensing of these patents. MPEG LA try to be a one-stop shop for patents for other video- and audio-encoding technologies, but there is no guarantee that they have signed up all holders of patents essential to those technologies, and no guarantee that Google will manage it for VP8 either.
The WebM project site license is very careful to state that it only grants a royalty-free licence to Google's patents, and patents acquired by Google, that are licensable by Google. There is still a possibility that someone else out there, not a company whose patents are managed by MPEG LA, will claim that a patent of theirs is infringed by VP8. As, indeed, Google are doing with H.264.
Because VP8 is not a product of a large standards organization, patents are not required to be licensed on Fair, Reasonable and Non-Discriminatory terms. IEC, ISO and ITU's patent policies require that any contributor to a standard licences any patents on FRAND terms, though the meaning of FRAND is left undefined. W3C has a stronger requirement, that all patents covering a W3C specification must be royalty-free. That leaves them with a conundrum: they can't mandate support for any video encoding in HTML5 because no-one contributing to the standard will - or can - make that guarantee.
Re: Confused about the bars diagram
Shorter is better - less time taken. The latest versions of Safari still outperform IE 10 on this particular microbenchmark. Microsoft continue to claim that this microbenchmark is not really representative of anything much, and that they focus their optimization efforts more on whole scenarios than on these microbenchmarks.
Re: Am I missing something about the price of RT devices
Yes, it is real, actual Office. The following are missing according to http://blogs.office.com/b/office-next/archive/2012/09/13/building-office-for-windows-rt.aspx :
•Macros, add-ins, and features that rely on ActiveX controls or 3rd party code such as the PowerPoint Slide Library ActiveX control and Flash Video Playback
•Certain legacy features such as playing older media formats in PowerPoint (upgrade to modern formats and they will play) and editing equations written in Equation Editor 3.0, which was used in older versions of Office (viewing works fine)
•Certain email sending features, since Windows RT does not support Outlook or other desktop mail applications (opening a mail app, such as the mail app that comes with Windows RT devices, and inserting your Office content works fine)
•Creating a Data Model in Excel 2013 RT (PivotTables, QueryTables, Pivot Charts work fine)
•Recording narrations in PowerPoint 2013 RT
•Searching embedded audio/video files, recording audio/video notes, and importing from an attached scanner with OneNote 2013 RT (inserting audio/video notes or scanned images from another program works fine)
That's compared to the x86 version of Office Home & Student 2013. The common theme is largely code that was written in x86 assembly - VBA macros had to be cut from Office:Mac x86 originally, for exactly this reason. See http://www.schwieb.com/blog/2006/08/08/saying-goodbye-to-visual-basic/ for more on that and the technical challenges they faced. I would anticipate that, just as happened on the Mac, VBA will be back in a later release of Office RT.
"Remember, the claimed "up to 3Mbps" is for everyone connected to the same switch."
No. The advertised rate is the ATM line rate negotiated for your link to the hardware in your phone exchange (the Multi-Service Access Node, MSAN).
Upstream from the MSAN, ISPs rent backhaul capacity from BT on a VPN link to their internal network, handing over at a Core Node. Physically, BT have to install enough backhaul capacity to handle the total rented capacity from all ISPs from that exchange, up to its parent exchange, and so on aggregating up to the Core Nodes.
Your download speed from any given website depends thereafter on your ISP's peering capacity at Internet Exchange nodes, the capacity of those nodes, the website's ISP's peering capacity, and the amount of capacity the website rents from its ISP.
The article author's problem is that there is too much noise on his line, or devices multiplexing the line to multiple properties, meaning that the higher frequencies needed by ADSL are filtered off. Assuming that the problem is reproducible when all extension wiring is disconnected, he needs a new fully independent cable installed. However, I think BT's Universal Service Obligation only requires that a speed of 28.8 kbit/s is achievable for the Fundamental Internet Access requirement (last reviewed by Ofcom in 2006).
Christian Berger is talking about satellite, where each version of BBC One is a full-time independent stream simulcast on the same transponder, not Freeview, where each transmitter only transmits one variant of BBC One.
My understanding of the reason for simulcasting is simply that the BBC want to remain compatible with as much free-to-air satellite equipment as possible, and many do not support the idea of switching streams on and off and redirecting to a different PID. You have to have the peak capacity available anyway, there isn't something else that can be switched off when local news comes on, so it's a trade-off between having the streams on constantly and sending megabytes of NULL packets.
You could argue that you could use better compression for the regional content, to pack all the regional services into the space used by the single sustaining stream, but a five-fold compression ratio would be unwatchable (the BBC run four or five versions of BBC One per transponder, mixed in with other non-regional services or versions of BBC Two).
Not a jailbreak
Come on, you have to connect with the kernel debugger and insert code to modify a byte to remove the certificate check? That's really not a practical jailbreak. In order to attach a kernel debugger, you have to boot into a kernel-debugging mode anyway. Microsoft's support threads say that you have to contact your 'ecosystem program manager' to do it on RT - Windows RT is not available to OEMs generally - as you can't modify the boot configuration data to enable kernel debugging. I'd be interested to know how he managed to enable kernel debugging in the first place!
Microsoft common controls patent
Microsoft *did* patent the common controls introduced in 1995, and they're among the patents MS are successfully enforcing against Android device makers. Here's one, related to tab controls, cited in the case against Barnes & Noble's Nook reader: http://www.google.com/patents/US5889522
Re: Perhaps this is a bit mad but
A malicious attacker can construct a file that causes an overflow in the font parser. It's nothing about fixing known fonts. This is a security vulnerability.
You *should* install this patch
The issue is that someone malicious could create a specially structured OpenType font file (using Adobe Compact Font Format [CFF or Type 2] font outlines - OTF can contain either CFF or TrueType outlines), presumably where some field indicates a larger size than it should. They can then use that file from a web page, for example with Web Open Font Format (WOFF) download. It doesn't have to be a genuine font, it could be used for one letter on the page, all that matters is that the browser tries to render it.
Because this only affects the Adobe CFF parser, any bugs won't affect most fonts on most people's systems - the Windows- and Office-supplied fonts are either TrueType or OpenType using TrueType outlines. However, most graphics professionals use one or more OpenType fonts, for their advanced features. The fonts using advanced OpenType features usually use CFF outlines rather than TrueType.
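To illustrate the general shape of this bug class - this is not the actual Windows font parser, just a sketch of a length field trusted without validation:

```python
# Sketch of the vulnerability class: a length field read from the
# file is trusted when extracting data. In C, the unchecked length
# would drive a copy past the end of the buffer; Python slicing just
# truncates, so the missing check is shown explicitly here.

def parse_record_unsafe(data, offset):
    length = int.from_bytes(data[offset:offset + 2], "big")
    # BUG: no check that 'length' fits within the remaining buffer.
    return data[offset + 2:offset + 2 + length]

def parse_record_safe(data, offset):
    length = int.from_bytes(data[offset:offset + 2], "big")
    if offset + 2 + length > len(data):
        raise ValueError("declared length exceeds remaining data")
    return data[offset + 2:offset + 2 + length]

print(parse_record_safe(b"\x00\x03abc", 0))  # b'abc'
```

A malicious font file simply declares a length far larger than the data it actually carries, and an unchecked parser reads or writes out of bounds.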