Re: Good Article
91 posts • joined 17 Jun 2011
There is probably some truth to the idea that Google sees the world through cloud-covered glasses. But in general Android itself doesn't encourage or discourage SD cards; it's just the Nexus line.
I suspect that as soon as the FAT32 patent expires, meaning that having removable storage doesn't come with a hefty Redmond tax, Google will be just fine with a microSD slot.
As you said, some permissions are essential to the app's function. How do you deal with the inevitable moron who denies net access to their mail app? And how many apps in practice respond gracefully when random system facilities fail? I don't think Android necessarily strikes the right balance here, but the matrix is large and users are stupid.
Polonium captures a NEUTRON, and "then following quasi-science" should be "THE following quasi-science".
Believe the commentard was referring to then following quasi-science:
However, of course, nucleosynthesis cannot possibly happen in nature. (209Bi can capture a neutron to produce 210Bi, which then decays to 210Po.) Which can be quickly looked up here:
But then fundamentalists were never known for their critical thinking abilities... or their ability to understand math. By the same logic, He-3 couldn't exist either.
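For the record, the chain in question is standard textbook nuclear physics (neutron capture on bismuth, then beta decay), nothing exotic:

```latex
{}^{209}\mathrm{Bi} + n \;\longrightarrow\; {}^{210}\mathrm{Bi}
\;\xrightarrow{\;\beta^{-},\; t_{1/2} \approx 5\ \text{days}\;}\; {}^{210}\mathrm{Po}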
Nuclear explosion 'cos this is all about nuclei...
Well, at least MS had the excuse that the processors it ran on, the 8086 and 8088, didn't have memory protection. Apple COULD have wired in protection for the 68000 (original Mac onward) but elected not to due to cost. OS/2 used memory protection from the 286 onwards. NT, based on the 386, did too.
The consumer OSes didn't, because:
1. RAM was expensive, and the memory-protected OSes (OS/2 and NT) both needed more of it.
2. Many apps didn't respect process boundaries and crashed anyway, e.g. directly patching interrupt tables to hook things. So, to keep compatibility, you couldn't protect memory.
But rewrite history all you like.
Go back to remedial physics, and I mean Galilean relativity; forget Newton or Einstein. I guess that means every rocket engine we fire in space can't work, because its exhaust doesn't exceed the velocity of the Earth around the Sun?
This article is a little revisionist in that it fails to detail WHY some of these standards were chosen.
1. The film frame rate was chosen because early emulsions reacted slowly, so the shutter had to stay open longer to capture each image. Also, a higher frame rate means physically more film to lug around for the same movie.
2. Interlacing was chosen to minimize flicker on televisions. Phosphors decay, so if you just scanned every line in sequence you got a pretty annoying flashing/strobing experience. By writing alternate lines in each field, you mitigate this.
When there is enough existing footage that is interlaced or at a lower frame rate, the standard pretty much has to have a way to represent it (for a while), since this reduces the cost of transcoding. Then marketers take over, and you get silliness like recording interlaced on modern devices.
On the subject of interlace: good video decoders don't just do bob and/or comb de-interlacing, they do a certain amount of motion tracking between scan lines to prevent irritating vertical edge effects.
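For the curious, "bob" in its crudest form is just line doubling within a field; a toy Python sketch (illustrative only, not any real decoder's code), which also shows why it bounces on moving vertical detail:

```python
import numpy as np

def bob_deinterlace(field, parity):
    """Toy 'bob' deinterlace: expand one field (every other scan line)
    back to full height by repeating each line. Simple and cheap, but
    moving vertical edges jump between fields, hence the motion-adaptive
    tricks better decoders use."""
    h = field.shape[0] * 2
    frame = np.zeros((h, field.shape[1]), dtype=field.dtype)
    frame[parity::2] = field        # the field's own scan lines
    frame[1 - parity::2] = field    # duplicated into the missing lines
    return frame

# A progressive source: 8 scan lines, each line holding its own index.
full = np.arange(8).reshape(8, 1) * np.ones((8, 4))
even_field = full[0::2]             # lines 0, 2, 4, 6 only
frame = bob_deinterlace(even_field, parity=0)
# Odd lines now repeat the even ones: half the vertical detail is gone.
```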
I just hate it when people neglect to mention the reasonably good engineering decisions that get in the way of their pet projects.
Not normally one to defend apple, but for *patent* lawsuits, willful infringement automatically incurs triple damages. This isn't a patent lawsuit.
It does appear that, at a minimum, Jobs knew he was walking a fine enough line that his original wording would be interpreted as price fixing.
The magic hand of the market brings ALL prices down, including labor. Seriously, read about working conditions in the early 19th and early 20th centuries, before the labor movements, before opening your Ayn Rand fantasy-inspired pie-hole.
You could just hover over the icon and see what it does, in your own language. And as far as I remember there is a key-press that actually shows all of the text at once.
Then the rest of the time the icons actually save space. Dunno, doesn't seem all that crazy to me.
TIFKAM's lack of integration into the rest of Windows, though... stupid.
Paris because she is integrated into many functions.
Ignoring the energy required (an engineering problem, solvable at some scale) and getting blasted to bits by high-frequency photons (a bit harder, though a bunch of ions helps), you can get anywhere you want, because your own clock slows down as you approach the speed of light. To an external observer you're a blue-shifted or red-shifted, strangely shaped thing that still can't get anywhere faster than light.
So long as you aren't trying to build an empire and run it, this is all OK.
The database gets out of sync with the filesystem state really easily; it prevents seamless install of apps by just copying them in; it prevents partitioning a user's space over different file systems; and it can't be cleaned up.
Your eye is a high-pass filter. That is, the extra effort to increase pixel density comes to nothing the moment the pixels move around or contain diagonal lines and curves.
Then you have to anti-alias them. So at some point there is a trade-off between physical display density, how your eye works, and computation, with a point of diminishing returns on making the pixels smaller.
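The computation side of that trade-off is easy to make concrete with a toy supersampling sketch (illustrative Python; function names are mine, and real renderers are far cleverer, but the cost really does grow with the square of the oversampling factor):

```python
import numpy as np

def supersample_aa(render, out_size, factor=4):
    """Render at factor x the resolution, then box-average each
    factor x factor tile down to one output pixel. Cost grows with
    factor**2: that's the computation in the trade-off above."""
    hi = render(out_size * factor)
    return hi.reshape(out_size, factor, out_size, factor).mean(axis=(1, 3))

def diagonal(n):
    """A hard-edged diagonal: 1 below the main diagonal, 0 above."""
    y, x = np.mgrid[0:n, 0:n]
    return (y > x).astype(float)

aliased = diagonal(8)                   # binary staircase edge
smoothed = supersample_aa(diagonal, 8)  # edge pixels take fractional values
```

The aliased render only contains 0s and 1s (the staircase your eye locks onto when it moves); the anti-aliased one gives edge pixels intermediate coverage values.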
OK, I'll feed the fellow South African troll. Well, I won't, but I will observe that more effluent comes out of your mouth in the average post than CO2 from your white entitled SUV's tailpipe.
Believe you are wrong. Each AD instance actually runs Jet as the underlying store and then builds replication on top of that. There are a few variants of Jet lying around, but they derive from a common source, AFAIK.
There's no market for Notepad and Calc? Actually, the bundling wasn't that big a deal; it was the OEM contracts.
Not so much gibberish, actually. USER32 is right at the bottom of everything before Win8, and USER32 is pants. (E.g. a window move requires the application to respond to a move event before the move happens, resulting in freezes.)
That said, there was probably a hybrid they could have chosen which ran the risk of blowing some apps up (or, gasp, having them render incorrectly in some interesting cases), but kept old apps and new apps in one place and still allowed an evolution toward touch.
Pure is poor....
This. Patents shouldn't be accepted without a realization and a documented best mode.
Because Perl is executable line noise, used by admins to cobble together small helpers. When the starting language has no decent syntax or rules, why bother with the code?
It can only be a planet if it's roughly spherical (big enough to collapse to a sphere under its own gravity) AND it's a significant enough mass in its own orbit to have (largely) cleared it of debris. (Hence Pluto and Ceres don't count.)
So an exoplanet wouldn't actually be a planet by this definition. Perhaps they'll have to come up with a formal definition of exoplanet vs. dwarf planet. The good news is no kiddies are going to have to memorize exoplanets, so no-one much will have to care, or notice.
Yeah, because I am sure that Microsoft totally dominates in data centers and online services.... oh wait...!
The faster you go, the slower your clock seems TO AN EXTERNAL OBSERVER; to you, you just go faster. Yes, and things get thinner relative to you, external time speeds up, and in the other reference frame you get heavier. And visible light turns to X-rays; I like how people forget that. Absent some serious field tech, the main problem is the stuff you are flying through. Oh, and going home: you don't get to do that.
Even with relativity intact, you can get anywhere you like as fast as you want (well, not counting pesky problems like getting bombarded with highly energetic particles caused by their relative blue-shift and the like). You just can't come back without massive time dilation. (See for example http://en.wikipedia.org/wiki/Twin_Paradox.)
So, kind of hard to run a galactic empire, but hey, maybe that's for the best.
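The dilation in question is just the standard special-relativity result, for anyone who wants the numbers:

```latex
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad
\tau = \frac{t}{\gamma} = t\,\sqrt{1 - v^2/c^2}
```

where \(t\) is the time elapsed for the stay-at-home observer and \(\tau\) the traveller's proper time. At \(v = 0.99c\), \(\gamma \approx 7.1\): a decade on the ship's clock is roughly seventy years back home.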
You forced me to read this to the end out of sensationalism? Editors, please ban this author; this was a waste of everyone's time.
Most NoSQL databases exist for a simple reason: the ability to blast the data across a very large number of nodes. It's a conscious trade-off, but at those kinds of scale (say a billion users) the relational model breaks down and the nice clean tables can't be efficiently joined regardless. That doesn't mean that if you are writing the database for even a pretty large company you can't go relational.
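The "blast across nodes" trick is, at its core, hash sharding; a minimal Python sketch (function names are mine and purely illustrative):

```python
import hashlib

def shard_for(key: str, n_nodes: int) -> int:
    """Map a key to a node with a stable hash -- the basic mechanism
    that lets a store spread rows over many machines. What you give up
    is cheap joins: related rows may land on different nodes."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % n_nodes

# Users scatter across 1000 hypothetical nodes; a per-user lookup hits
# exactly one node, but joining users against orders would hit them all.
nodes = {shard_for(f"user:{i}", 1000) for i in range(10_000)}
```

Real systems layer consistent hashing and replication on top so nodes can join and leave, but the join problem stays: once rows are scattered by key, a relational-style join becomes an all-nodes scatter/gather.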
Isn't it enough just to say that the gravitational attraction within a galaxy is large enough to overpower expansion? There might be some net difference relative to a non-expanding system, but the galaxy as a whole would still not expand.
Paris, because like gravity, she sucks....
I said NT (as in 3.51, 4.0) HAD ACLs (as do all successor NT OSes). At the time almost all *NIXes did not. Also, I should distinguish between the Windows security *model*, which was pretty good, and the actual security of the OS (buffer overruns and other exploits), which was somewhat bad (and of course the myriad apps written for Win95 that never bothered with this security nonsense, and hence had to run as Administrator, didn't help much either).
Windows 95 kept the Win16 subsystem pretty much intact and then built a 32-bit kernel around it (using semaphores to guard access to the non-reentrant 16-bit code). It also had to be very careful how it called into the old DOS core, which was kept largely intact. It truly was an unholy mess, but seriously, adding threading and a somewhat haphazardly protected 32-bit address space is hardly a minor enhancement.
Windows NT was a complete kernel rewrite, SMP and 32-bit (and later 64-bit) from the ground up, largely platform-neutral except for the HAL, and it initially ran on a wide variety of chips (including MIPS and Alpha). Win32 is just a subsystem inside this, with a kernel32.dll export to keep compatibility with Win95's Win32 APIs. Win16 and virtual DOS support were also just subsystems hosted on the kernel. Windows NT always had a security model actually somewhat in advance of *NIX (ACLs, not just group/user/universe), and it's only grown more sophisticated as the need to protect users from themselves has become more important. (No, grandma doesn't care that the root account wasn't compromised if she lost all her data.)
If you are going to open an orifice, try the one in front of your face rather than the one on your posterior.
Point of order.
MS was found to hold a legally acquired monopoly in PC operating systems.
MS was found guilty of illegally trying to extend this monopoly to browsers by Judge Penfield Jackson, but this verdict was remanded to the lower court.
Subsequently MS settled with the DoJ meaning that no US court has found that MS did in fact illegally attempt to extend its legally acquired monopoly to browsers.
Not sure about exact EU rulings, but get your facts straight.
It's platform-neutral: one app to deploy to any client. Any other option you might name is platform-specific.
Paris, 'cos she's not very picky either.
The EU is in a position to ensure fair competition among ANY businesses within its jurisdiction. I'm not taking a position on whether Google is a monopoly and/or has abused its position. But your statement is violently parochial and seems to match what the commission is really up to.
We (as a county) have 1/3 of the US's deployed nuclear weapons:
As to political affiliation: Bainbridge Island is actually probably Green (hence more Blue than even Seattle). The rest of the county leans quite red. Almost like a microcosm of Washington State. As anyone who has seen Islanders let off fireworks on the beach on 4 July can tell you, we can amass quite a stockpile.
"Plan B is for an interim, Windows Phone Classic (aka 7.99999) to take advantage of multicore processors and (perhaps) higher resolution screens."
Oh joy, yet another author who doesn't understand that the WinCE kernel can't do SMP, and that that was the whole point of moving to a shared kernel.
Win7 was Vista stabilized and made efficient, with some time for the driver/app ecosystem to catch up. Saying it was based on XP is basically just stupid and wrong; otherwise, as a trivial example, why are Vista drivers compatible with 7 (as a rule) while XP drivers are not? I normally don't get involved with drivel like this, but would it kill you to think before you started typing?
As an engineer, that means anything up to $499 million. Or did you mean $0.00000000 billion?
Yeah, I know I'm being a pedant, so were you....
WinPho 7.5 runs on a single core because it's based on the WinCE kernel, which doesn't do SMP. Android, based on Linux, does, so it uses them. This really says nothing about how WELL each one uses multiple cores, or whether they are required.
The US Supreme Court, as opposed to the (Washington) State Supreme Court. But most appeals get rejected way before they reach the highest court.
Don't see why a GPU couldn't directly consume the result of a DMA transfer after the host CPU set it up. File systems would pretty much remain a CPU thing, though.
Not: "Do no evil."
Just saying. Getting your facts straight makes you more credible.
The comparison between Fire-vs-Android and Linux-vs-BSD is just silly. Linux and BSD are completely different kernels with a common POSIX abstraction layer and some shared user-space libraries. All Android variants (including Fire) share a custom Linux kernel and a large amount of common user-space code, including the VM (Dalvik), UI controls, etc. The amount of UI customization and vendor-specific extension on top varies.
How is this relevant to a cloud offering? The reason you use a cloud solution has to do with improved collaboration and less client state to manage. It's not obvious how something modeled on a local, heavy client solution factors in (except to add extra commoditization pressure on full client office).
I think everyone is missing the point of why a new shell was invented. Basically, it was an attempt to define a richer way to pipe operators together than input and output streams of text. The idea was that if you exchange objects, then a human can always view them (through appropriate formatting) and code can operate on them in a more predictable way. Hence kludges like grunging through human-readable output with sed and awk could be avoided.
That at least is the idea, and motivates why other shells weren't deemed sufficient. How well this all really works is more open to debate.
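A rough Python model of the object-pipeline idea (an analogy only, not PowerShell syntax; the types and names are made up):

```python
from dataclasses import dataclass

@dataclass
class Process:
    pid: int
    name: str
    rss_kb: int

# Each stage receives typed records, not text it has to re-parse.
def where(pred, items):
    """Filter stage: like a shell pipe, but over objects."""
    return (p for p in items if pred(p))

def sort_by(key, items):
    """Sort stage, descending: no awk field-splitting required."""
    return sorted(items, key=key, reverse=True)

procs = [Process(1, "init", 1024), Process(42, "browser", 512000),
         Process(99, "editor", 204800)]

# "List the memory hogs, biggest first" -- p.rss_kb is already an int,
# so no stage ever greps columns out of human-readable text.
hogs = sort_by(lambda p: p.rss_kb, where(lambda p: p.rss_kb > 100_000, procs))
```

The contrast with a text pipeline is that a format change in one stage's output (an extra column, a renamed header) can't silently break the stages downstream.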
Java - because Sun and then Oracle really messed up, locking the language down to the extent that no-one can really leverage it well for a client platform. If you think MS were alone on this, witness Dalvik and Android.
C# - because it's from MS, basically. Not that I have an inherent bias against MS technology, and certainly C# is what Java would be if Sun had bothered to continue developing it, but the rest of the planet doesn't necessarily hold this view.
Except that Dart cross-compiles to JS, so you don't really have to implement everything twice. If the Dart VM does end up being significantly faster than JS, then there might be a significant win in getting the performance benefit where Dart is available.
No ripping of JS out of Chrome required. Either JS VMs catch up in performance, or eventually other browsers adopt Dart because it really is faster and they don't want web apps to suck on their browsers compared to Chrome.
Biting the hand that feeds IT © 1998–2018