17 posts • joined 25 Oct 2010
Re: Lack of integrated email/contacts/calendar?
Hmm. My smartphone connects totally well and easily to a FOSS mailserver. Well, it does not connect nicely to an Exchange server, but that's not what is mentioned in the article.
You do know what integration means don't you?
So you can access your email. What about your calendar and contacts from the same app? Oh, right, that isn't going to happen.
So how is it integrated? If you're faced with a question that is tough to answer, replacing it with a different question that you can answer is generally not helpful.
Re: 23 Years
Most people have a TV, that almost certainly runs Linux, I don't know of any that don't. The only popular set-top box that doesn't run Linux is the Apple TV. Most routers run Linux.
Sources please. If these devices are all running Linux you'll be able to point to e.g. the source for them. That simple measure instantly excludes most routers, TVs etc. I know for a fact my router doesn't and of five TVs only one runs Linux.
Re: 23 Years
Over 85% of smartphones sold in the last quarter are running (a kernel which was forked from) Linux.
And what proportion of households bought a smartphone in the last quarter? Remember it has to be "pretty hard" NOT to find a Linux device in a household, so even if ALL smartphones EVER made ran Linux it wouldn't by itself fit the bill.
I've not really looked into this myself but yes, the Unix = Linux assumption seems very prevalent among the Linux community, even to the extent that if a given tool is available on, say, Ubuntu, Debian and Fedora it can be considered portable and even a Unix standard. Yes, I've seen that exact claim made on these very fora. Seeing a Linux-style filesystem (or even a tool such as OpenSSH) could easily be enough for some of that contingent to make false claims.
Re: Didn't we do this already?
While the technology stack is compelling (192 core GPUs in an under $300 package) I think the industry shut the door on this type of setup a while ago.
It's probably a lot more common than you might imagine. Look in call centres, large offices etc and you'll see that kind of set up fairly frequently, often a nice silent machine on the Vesa mount. I've even seen it for programmers where it has additional attractions - programmers generally don't use a lot of processor power except when compiling when they need as much as they can get. A single beefy machine serving a dozen or so users gives them that without costly, overpowered machines sat on every desk running at 1% utilisation.
Re: C'mon Moore's Law, Hit The Wall.
... higher definition images when 2.2MP is the most that the human eye can see. 48-bit color when 27-bit tests the limit of the best eyesight. 4K TV's where there is nothing wrong with 1080 displays.
24-bit colour is all that is needed in the final result but you do lose at least a bit per channel with a lot of post-processing. Adding a layer on top of the image? No longer true colour. Adjusting brightness and contrast? No longer true colour, and so on almost ad infinitum. That's without even considering that not everyone is simply taking and viewing pictures of their cat - some might be doing something productive. I have a friend who does astrophotography and he commented a while back that his CCD is supposedly rated for 16 bits per channel but in practice it is more like 12 bits plus noise, even with all the cooling and advanced trickery needed for the very best images. He's NOTICED that and shown me - are you really going to tell him or me that he's imagining it?
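To see how quickly precision evaporates, here is a minimal Python sketch (all values illustrative, not taken from any real imaging pipeline): halve the brightness of an 8-bit channel, double it again, and count how many of the original 256 levels survive the round trip.

```python
# Illustrative sketch: quantisation loss from a single brightness
# adjustment on 8-bit data. The 0.5/2.0 factors are arbitrary.

def adjust(levels, factor):
    """Scale each 8-bit level by factor, rounding and clamping to 0-255."""
    return [min(255, max(0, round(v * factor))) for v in levels]

original = list(range(256))        # every possible 8-bit level
darkened = adjust(original, 0.5)   # halve the brightness
restored = adjust(darkened, 2.0)   # double it again

# Distinct levels that survive the round trip:
survivors = len(set(restored))
print(survivors)  # 129 - roughly half the original 256 levels are gone
```

One halving costs about a bit of precision; chain a few adjustments together and the loss compounds, which is exactly why editing pipelines want more than 8 bits per channel to work in.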
As for coming up with a supposed resolution of the human eye, it's a mug's game that shows complete ignorance of how the eye actually works. The overall resolution is fairly low in pixel-count terms but nowhere near uniform - i.e. you have comparatively high resolution in the very centre of view and very low resolution in the extreme periphery. Since you don't know where the viewer is looking, ALL of the image needs to be good enough for that very high central resolution.
"Predictability" (by which I would assume you mean deterministic behaviour) and randomness are two completely different qualities... To qualify as random any value in the target domain must be as likely an output as any other - if there is any weighting or bias in the output it is not random. A lot of real-world systems have been compromised by this very implied assumption - it's unpredictable, therefore it's random.
Right on. I've long believed that most modern security flaws are down not to lack of thought or effort but to lack of study or knowledge. People spend 10 minutes studying this stuff, imagine themselves to be some kind of expert and begin to spout fundamentally flawed premises as if they were absolute truths. History is littered with examples of how non-random systems have been broken - it was essentially this very issue of a slight bias (a letter never encoding to its original value) that allowed even Enigma to be broken.
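For anyone who wants to see the kind of bias in question, here is a toy Python sketch (purely illustrative, nothing like real Enigma internals): a substitution table constrained, as Enigma's reflector was, never to map a letter to itself. A uniformly random substitution would produce the occasional self-mapping; this one never can, and that is a measurable, exploitable bias.

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def no_fixed_point_table(rng):
    """Random substitution with no letter mapping to itself
    (a derangement), mimicking Enigma's reflector constraint."""
    while True:
        perm = list(ALPHABET)
        rng.shuffle(perm)
        if all(p != a for p, a in zip(perm, ALPHABET)):
            return dict(zip(ALPHABET, perm))

table = no_fixed_point_table(random.Random(42))

# A uniformly random substitution maps a letter to itself roughly
# 1/26 of the time; this cipher never does.
self_maps = sum(1 for a in ALPHABET if table[a] == a)
print(self_maps)  # always 0, by construction
```

An attacker who knows a crib can slide it along the ciphertext and discard every alignment where any letter "encrypts" to itself - which is essentially how Bletchley narrowed the search space.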
And they came with the bundled 50/50 chance of getting the keyboard and mouse the right way around when trying to plug them, arm twisted like a SCO lawyer's soul, into the back of the computer.
I never understood why they didn't simply make the ports identical in the first place - they used a six-pin mini-DIN: two pins for power, two for signalling, and two unused. The keyboard and mouse used the same two pins for signalling even though they were not automatically compatible with each other. It would have been a trivial matter to put the signals for one or the other on the unused pins and wire both signals to both ports, as many laptops actually ended up doing. Then you would simply have two interchangeable keyboard/mouse ports with no possibility of connecting them up the wrong way round.
Re: I used to have the same problem
The upper side of the socket is the one furthest from the floor. Where sockets are mounted vertically, it's the side that would be nearest to the floor if you laid the device on its side with the wrong side facing up. You may find it easier to think of it as the left-hand side (right-hand side for left-handed people). But then you will have to decide whether you're in front of the computer or behind it.
This would be great if it were true, but it isn't. Indeed, it isn't even what the standard says. The standard gives orientation in terms of (to paraphrase) the side facing the user, but does not specify how that is determined. For many applications it is obvious, but e.g. rack equipment could conceivably be mounted above head height, in which case the orientation is reversed. I've seen plenty of cases that make this very assumption, which can be bloody inconvenient when the indicator lights on directly mounted equipment shine downwards on a box that was mounted at knee height to start with.
That's always assuming any attention is paid to the standards in the first instance. I've seen plenty of examples where they are simply ignored with no justifiable defence under the spec. Cheap flash MP3 players are a favourite - for some reason the screen always seems to point downwards when they are plugged into a computer, in violation of the spec.
Re: The vi thing
Actually Edlin was ditched from DOS 6.x...
Re: P. Lee
I don't even see those. Precisely how much does a computer being an inch or two THINNER help in most space-constrained environments? The limiting factor is going to be the larger dimensions, i.e. its width or height.
As for regularly travelling with a desktop, well, it's only a couple of years since I completed my doctorate and was flying between Dublin (uni) and Manchester (home) eight or ten times a year, invariably with desktop in tow. I had a flat-screen telly, keyboard and mouse at both locations so it was just a mini-ITX base unit that I could stuff in a relatively small suitcase and stash clothes, paperwork, toiletries etc immediately around and on top of it. I wouldn't even consider chancing that with something like this - even if you treat your baggage with kid gloves it's only a matter of time before Ryanair smash something through the large, vulnerable, expensive screen.
Re: The fundamental things apply
Can I suggest reading up even a little before spouting this kind of mock-intellectual mumbo jumbo? Modulate the amplitude and you introduce sidebands. Modulate the frequency and you introduce sidebands. Both increase the signal's bandwidth without any need to resort to quantum theory. Indeed, there is ultimately little that can't be explained using traditional wave theory, Shannon and the Fourier transform. Introducing things like quantum theory when they are complete irrelevances is like those idiots who try (and fail) to apply GR to phenomena that can be satisfactorily explained with Newtonian mechanics.
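For the amplitude-modulation case the extra spectral components fall straight out of a product-to-sum identity - classical wave theory, nothing more. A quick sketch, with carrier frequency $\omega_c$, modulating frequency $\omega_m$ and modulation index $m$:

```latex
\begin{aligned}
s(t) &= \bigl(1 + m\cos\omega_m t\bigr)\cos\omega_c t \\
     &= \cos\omega_c t
        + \tfrac{m}{2}\cos(\omega_c + \omega_m)t
        + \tfrac{m}{2}\cos(\omega_c - \omega_m)t
\end{aligned}
```

The two sideband terms at $\omega_c \pm \omega_m$ are exactly why the modulated signal occupies a bandwidth of $2\omega_m$ rather than a single spectral line, with no quantum theory in sight.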
Ultimately it is this one research team that is making outlandish claims and they don't have the facts to back up their case. I was dubious on first hearing about this, even without reading their reports in full, so frankly I'm not surprised now. When your claims fly in the face of received wisdom it doesn't necessarily make you wrong, but the onus is very much on you to prove your assertions rather than on others to disprove them.
Re: Spectacularly Refined Chap
IEEE Std 1541-2002 is relevant here given the whole standard is on the 10^9/2^30 issue. Paragraph 4.1 is cut and dried:
The SI prefixes shall not be used to denote multiplication by powers of two.
I see you've already been thumbed down. I'm not surprised. After all, of course some anonymous nobody commenting on a discussion forum knows better than a recognized international committee of experts.
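For anyone wondering why the standard bothers making the distinction, the size of the gap is simple arithmetic; a quick Python sketch:

```python
# The gap between SI (powers of ten) and binary (powers of two)
# prefixes, per IEEE 1541 terminology: GB vs GiB, TB vs TiB.

GB  = 10 ** 9    # gigabyte (SI)
GiB = 2 ** 30    # gibibyte (binary)
TB  = 10 ** 12   # terabyte (SI)
TiB = 2 ** 40    # tebibyte (binary)

# A "1 TB" drive shown by an OS that divides by 2^40 but labels it "TB":
print(round(TB / TiB, 4))          # 0.9095 - the familiar "missing" space
print(round((GiB - GB) / GB, 4))   # 0.0737 - a 7.4% gap at the giga- level
```

The discrepancy grows with each prefix step (about 2.4% at kilo-, 4.9% at mega-, 7.4% at giga-, 10% at tera-), which is precisely why conflating the two is not a harmless convention.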
Re: Pre-95 & NT Windows weren't OSes!
Not that old chestnut again. Consider the tasks performed by an operating system: managing processes and memory, controlling devices, managing the filesystem... Windows 3.1 did all of these. DOS was little more than a boot loader. You may as well say that Linux isn't an OS - it's dependent on GRUB after all.
Don't be too obsessed by home users
Home users do not fund the industry. Home users tend to stick with the commercial software installed on the machine when they bought it, at massive OEM discounts. Anything else is either a free download, pirated, or not bothered with. The industry seems obsessed with home users now that the commercial market is basically saturated, but you need at least ten installed home systems to provide the same income as one commercial system. For many or even most commercial environments the PC form factor - monitor, keyboard, mouse - is still the most general, productive, and cost-effective. How many PCs are primarily used for data entry, word processing and email? Keyboards are still pretty central to most of the users who are actually paying the bills.
If companies forget that core market they will be abandoned and they will lose the income to fund the loss-leading trendier stuff. We moved to thin clients a long time ago for flexibility and ease of management, so it ultimately makes no difference if an app runs on Windows, Solaris, Linux or whatever. Microsoft can't provide a decent word processor? Fine, they get one on Linux or whatever.
That is precisely what he said and precisely what is relevant here. The commentator remarked that we are facing a shortage because no plans are on the table now - not just detailed architectural plans, but even simple aspirations of the "we will need a new plant in 2013" variety. He makes clear no such plans are in the pipeline. Even if every flash company in the world did an about-turn now it would still take those new plants five years to come on line, which is where the problem cited actually lies. How long it takes to physically build a building and populate it with pre-ordered plant is an irrelevance.
Cable select works well until it doesn't. As soon as you encounter even a handful of instances where particular devices refuse to play ball you switch over almost instantly to never trusting it again.
Yes, another myth that is repeated so often it is accepted without question. For bread-and-butter stuff none of the free Unix systems are really lacking. Linux may have the edge on random consumer tat but nothing more than that. If you want hardware support look at NetBSD, with support for dozens of different architectures in a single kernel (Linux builds a different kernel for each new system) as well as support for devices that simply don't exist in the world of x86.
It's still a tiny fraction of the size of the Linux kernel. So what was that point about it all being hardware support?
Like it or not, as Linux has matured it has grown disproportionately. Yes, it is now huge and still growing, for comparatively little in terms of substantial benefit. This has only considered the kernel since that is what is most relevant to server apps, but the user interface stuff is if anything even worse. I agree with the "Spectacularly Refined Chap": there is a bloat problem there, but pointing anything like that out will be instantly shot down by the religious Linux advocates instead of measures being taken to actually address what are growing into serious issues.