causes of crime are so opaque?
Hardly.
if (opportunity(crime) && chanceOfGettingAwayWith(crime) > 0.97)
    commit(crime);
else
    expressInnocenceAndLawAbidingNature(self);
There should be no need for error correction, and what matters in terms of crosstalk is the nearness of the intended receiver relative to the farness of the unintended one.
It is simply a (chip) cored transformer.
What it can't do, of course, is transmit DC, but that needn't be an issue - a simple 'in phase with master clock' is a one state and 'out of phase' a zero state, etc.
Plenty of encoding technologies like NRZ or Manchester can use an 'AC' transmission medium to transfer 'DC' states.
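As a minimal illustration of that last point, here is a Python sketch of the Manchester idea (the function names are mine, not from any standard library): every bit becomes a transition, so even a long run of identical bits keeps the line toggling, and a transformer-coupled, DC-blocking link carries it happily.

```python
def manchester_encode(bits):
    """Each data bit becomes a pair of half-bit levels, so the line toggles
    at least once per bit: 0 -> high-then-low, 1 -> low-then-high
    (the IEEE 802.3 convention). A run of constant bits has no DC problem."""
    return [level for b in bits for level in ((0, 1) if b else (1, 0))]


def manchester_decode(levels):
    """A mid-bit low-to-high transition is a 1, high-to-low is a 0."""
    return [1 if levels[i] < levels[i + 1] else 0
            for i in range(0, len(levels), 2)]


data = [1, 1, 1, 0, 0, 0]        # a 'DC-ish' run of constant bits...
line = manchester_encode(data)   # ...still toggles on the wire
assert manchester_decode(line) == data
```

The average line level stays at one half regardless of the data, which is exactly why a cored transformer can pass it.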
In a perfect world with perfect employees, that is true. You may take their 100% accurate information and use it to make only those decisions appropriate to your level of management.
However, in a world of human beings, if you know nothing about their jobs, how will you know whether they are simply lying to you?
Years ago, when Britain still had a manufacturing industry, one of my PCBs was going into production and the production manager said 'the girls can't stuff that board in less than 45 minutes'. I said 'rubbish, I bet you I could stuff it in under 15 minutes'.
I did it in 12 minutes sitting down at the carousel.
We set the rate at 20.
Sigh. Back in the day there was the same issue with 8-bit micros. No compatibility, unless you used CP/M and a BIOS. Then along came IBM and defined a standard hardware, and the rest is history.
If Dell do the same thing here, it will be just another example of a big company enforcing a de facto but open-ish standard on the community.
BUT the real point is this: the operating system itself is the interface these days, not the BIOS. Not the hardware.
So long as each manufacturer has a published hardware standard so that Linux kernel hackers can easily port their Linux to it, the user won't see any difference. Code compiled on one ARM system will run on another, even with different hardware, in the same way it doesn't matter what graphics or Ethernet or sound card you have in a PC: Linux examines it and selects an appropriate driver for it.
Mmm. Now whilst I see where you are coming from, you need to ask yourself one question.
Is X86 the future of the small server market?
Unless I want to run some ghastly Microsoft server thing, why do I care what the CPU architecture is at all?
If it runs LAMP and Samba, that's all I do care about. And if it runs it at one half the power and two thirds the price, that's what I will be using.
Dell presumably knows what its customer base is doing with its servers. If they are almost universally running Linux on 'em, then this is a straight one-for-one replacement.
Intel has to show it can do better MIPS per watt with the Atom derivatives at a lower price.
And there is another point too. I run an Atom server in a SOHO application. It's more than fast enough. Where it needs more go is in the networking and disk access areas. If some of that functionality gets integrated onto the ARM core chip - in a way that Intel cannot do - then it's a lower chip count and cost.
Today's server is an intelligent connection between storage and network. It should be possible to make that a one chip solution almost, bar the RAM. OK maybe one serial or USB port for a console ...
Intel is still stuck selling CPU AND peripheral chips.
yeah well no fine. It happens to be blowing a gale.
Look again at gridwatch. In particular look at the interconnects for early December. Suddenly the imports switch to exports.
Why? Because it was cold and there was no sun and no wind, and all of Germany and Denmark's renewable energy wasn't worth a chocolate teapot, and neither was ours.
You would have seen figures of under 100 MW for wind then.
Kill birds and bats?
Destroy the profitability of competing generation, which is not allowed to compete on level terms with their more expensive output?
Add extra grid instability?
Stymie investment in any energy source due to regulatory uncertainty? (The government giveth, and the government taketh away).
What they don't do, of course, in a mixed grid that uses fossil plant to back them up, is appreciably lower carbon dioxide emissions.
Two chemists on a train heading to a conference on organic polymers.
One notices the other playing with a lump of flexible material "That looks interesting: may I see?"
"Sure, here you are"
"Hmm, flexible, elastic... How did you make it?"
"Oh I just wiped my hand across my nose, and there it was ...."
There are more exceptions than that.
DTP ain't up to the Adobe/Quark level yet, and 2D and 3D CAD and creative drawing ain't up to Corel, Autocad, Rhino3D etc etc.
Then there are the millions of little apps that people use. Sage accounts for example.
If you are smart enough you use virtualisation or Wine, if not you buy windows.
...either C or PHP..
I'd have zero trouble spending half a day recompiling.
And I would, like as not.
If you have code written for Linux on an x86, it will port to Linux on ARM.
Only MS-excrescence code won't.
So what? Legacy apps can run on x86 as long as Intel is still in business.
It may be that professionals use Jquery, but an awful lot of 'creatives' who are only interested in a flashy looking page use it as well with absolutely zero understanding of what they are doing.
Here at Itzman Towers, the creative designs the page and the geek implements it using as little code as possible and none of it third party.
the problem is that so many sites are now using vast JavaScript frameworks that they have little idea of the integrity of and even less control over.
"Include my JavaScript library with this link, and then put this line in your code, and magic happens"
Most of these sites simply stop working when JavaScript is disabled.
Leaving the user between a rock and a hard place.
The fact of the matter is that 'single x86' is no longer an appropriate technological solution except on the desktop or small office server, and IBM ain't in those markets.
Machine room servers are moving to massively more economical virtual machine technology based on high-end rack and blade type solutions.
And as more and more server-side applications move from being operating system dependent to network dependent, serving their users via network sockets only, and because high-end applications are massive on design effort, it makes sense to go the extra half yard and provide them on whatever platform is the most cost effective. If that happens to be ARM, or some other processor, so be it. It is no big deal.
Microsoft is tied to Intel, and Intel owns the medium computer space because Microsoft owned that space. The medium computer space is increasingly irrelevant, and so is Microsoft and so is Intel.
The so-called cloud model - essentially smart clients running standardised internet-based applications on cheap end-user devices coupled to massive centralised storage and application power - is the current cost-effective way to deliver IT to end users.
The personal computer is no longer a personal computer: it's no more than a smart graphics terminal hooked into someone else's mainframe running time-shared applications.
The wheel turns full circle, and what IBM envisaged as the PC is now what every user-level device is: a terminal.
So IBM simply acknowledges that and gets out of a market that it simply isn't interested in. IBM has already been replacing servers with virtual machines for its client base; it's not interested in delaying a change it probably feels is not only inevitable, but one it actually welcomes.
There will always be a niche market for small single-OS servers, for privacy and geographical security reasons. Not everyone will want to use public clouds, but a huge number of people will. There will always be a need for proper desktop workstations with local processing power, but again it's a niche market for the very few people who actually use computers to generate real content, rather than merely enter data or consume content.
But those are not mainstream.
Y'know that is so weird that it may be true.
I was playing an online game and generally expressing some dissatisfaction with some players when the guy I was talking to said 'you shouldn't do that: in real life they are a gang in LA and they could find out who you are'.
My character (a gorgeous enchantress) replied: 'In real life I am a sixty-year-old bloke living in England. Do you really think so?'
I think a lot of kids who would otherwise be on the streets are into their consoles. It probably isn't the best thing they could be doing, but it's probably a long way from the worst as well.
No, you don't need to use Adobe products to create the 'best' PDFs, nor have you for many years.
Don't confuse a page description, with the tools needed to create a detailed and flexible page in the first place.
Quark was for many years the de facto standard for typesetting. It now exports PDFs as an alternative way to define the pages it creates, because the PDF format finally incorporated all the tools needed to do that comprehensively.
Adobe tools (creative suite) are still regarded as inferior by many in that business.
You seem to want to actively lock out Linux.
In fact your whole demeanour suggests a mindset where 'software by decree' is how you see organisations as necessarily being run.
I don't see it that way at all. It is simply allowing departments the flexibility to choose what suits them the best, without having that choice enforced on them by a de facto proprietary standard.
Shall I tell you about the time a circular letter went out to a group containing a Word document unreadable to most people? I managed to read it with the latest LibreOffice version and send a PDF back to the grateful originator, so that all those MICROSOFT WORD users who could not read it now could.
When even a proprietary vendor is no guarantee of interoperability, small wonder that sane people are looking for an open standard of document distribution.
One of the reasons PDFs work is that the standard at least was published and adhered to, and readers were made free.
Everyone has a PDF reader.
Hmm. It is qualitatively different from many other boards, and this excites me. I've been looking for something with plenty of RAM and a decent basic kernel and analog I/O plus audio A<->D to do audio processing with.
This might fit the bill, actually. Arduinos lack the RAM, code space and general interfacing; the Pi doesn't have the right I/O mix.
Hard to tell from the review that isn't a review - more a product announcement.
No it was never the best, but it was the best you could afford without selling the dog into slavery.
Had many CRT TVs (two left) and still have an audio setup with Sony on the front, and it's still very, very good. Still got two Sony STBs - again, brilliant kit.
But LCD TVs? Samsung or Panasonic are simply better value. Don't play games, and frankly when it comes to cameras it's Canon or Nikon - not Sony.
I am not sure where the price/quality went. Maybe the Japanese exchange rate, but go it did. I mourn their passing as I mourned Hewlett-Packard test instruments. In a long life of unreliable crap there are a few brands that stood against the tide: Nikon lenses and cameras, HP test gear, Sony TVs and audio gear, Kodachrome slide film. Most are gone now...
Fry isn't an idiot, but he doesn't know the difference between being an actor and an authority.
He relies on researchers and they are plain crap.
His job is to add gravitas to their ramblings. Not his fault if they are sloppy and biased.
They are after all generally BBC employees.
It is what drives the whole industry.
The proposition that next year's code will be so bloated that only next year's machines will run it ensures a smooth planned obsolescence, aided by the concept that next year's code will also produce data incompatible with this year's code, thus ensuring that users cannot hold onto machines for more than a couple of years and must continually upgrade.
Also known as MS TE
More sh**t than ever
Android on a desktop = linux.
In all but name.
What is Android beyond a pretty face on Linux, adapted to the exigencies of running on 'small' hardware with touch screens?
Take away the small hardware and the touch screens and it looks very like Linux in one of its more popular incarnations.
It is that, but it's more. These are gross generalisations, forgive me, but they are 'approximately true':
1/. Domestic users aren't buying uprated PCs. They are overwhelmingly buying tablets, e-readers, smart TVs and smartphones. PCs are dull, boring and old hat.
2/. Corporations are not buying new PCs either. This is an interesting point I will expand on, because I used to be one of those business owners who did buy new PCs every year. With Windows on.
I bought new because the old ones either wore out (4 years was good going for a PC) or simply were not up to the task of running the new and better versions of the applications we needed to run a business.
But at a given point, both the PCs and the applications were simply 'good enough' to run, essentially forever. I have bought the 'budget motherboard' from my favourite clone supplier for almost 20 years now, both as a corporate buyer and as an end user, and what I have noticed in the last few years is that the same money no longer buys me more than it did 4-5 years ago. In fact I can't get a machine quite as cheap as I used to, and it is very little faster. OK, an SSD and a bit more RAM help, but program load times and the ability to have many programs alive at once are not an overwhelming issue.
So we are running to the end of Moore's law, it seems. No longer is there an implicit guarantee that today's PC will be 4 times faster than a four-year-old one.
And now look at the applications in the enterprise. Overwhelmingly what people were using in my company were:
- An interface to a database. That used to be telnet into a Unix box; today it's a browser into an SQL server.
- Mail, contacts and meetings, word processing, the spreadsheet for some. In other words, MS Office.
- Specialist job-related apps. Graphics for the creatives, IDEs for the techies, and so on.
Now it is probably true to say that all these applications are fully mature. There is no NEED to develop them beyond bug fixes. And that is why millions of copies of XP are still running with a 7-year-old or later office suite, and running old and still very usable versions of mature specialist apps. It's good enough.
In short, in the corporate marketplace, computerisation has at the applications level reached full maturity. There is no need for more 'features'.
Likewise the hardware has reached a plateau.
Students of marketing theory realise this is symptomatic of a given market sector moving from the 'rising star' to the 'cash cow' phase, and then perhaps even beyond it to a 'maintenance, replacement and support' phase.
Corporate buyers are ultimately driven by ROI. If buying this year's PC with this year's MS offering offers nothing they can't do on XP, they will stay with XP until the hardware gasps and dies. I am still running an ageing copy of XP in a VM together with three Windows apps, all equally venerable, simply because they are GOOD ENOUGH for my needs.
And money is tight.
This is, I think, the key to understanding the issues that face Microsoft. The hardware and the software are mature in the corporate environment. There are no more productivity gains to be had. My brother-in-law, who manages big IT transitions, says that the activity is overwhelmingly happening in the server room, where multiple divisional servers are now virtual machines in a few large blade-type platforms, and the new server-side apps are Linux-based.
Microsoft and Windows have to transition from a 'more sales every year, guaranteed' mentality to a 'fewer sales every year, plus replacement and support' one.
Or find a new pond. They are too late to dominate the new consumer devices, and there is no new pond in corporate sales; its emphasis has shifted from machines on desktops to network and server technology - an area they have always been weak in anyway.
That is, Microsoft has done the job we wanted it to do and needed it to do: provided a standardised operating system interface on cheap hardware to allow the proliferation of single-user applications throughout business. Thanks, but now that's done we don't actually need them to do it any more.
They are sitting on a cash pile of epic proportions, but they have nothing left to spend it on.
If I were they, I'd be buying back shares as fast as possible and getting ready to privatise the thing in order to be able to take it in a new risky direction, if that's what the directors wanted.
Otherwise what faces Microsoft is a slow decline in significance until it's basically broken up and sold for scrap, as is the fate of so many players in the game.
And not only are they, apart from the Xbox, a pure enterprise player, but enterprises themselves are moving away from desktops a lot too, with portable devices and cloud servers offering the corporate platforms.
I make a point, when I go into a bank or an office, to notice not what operating system the back office is using, but what they are doing with it: spreadsheets, word processing, the odd custom application dedicated to their business, usually written for Windows, BUT which could actually be written in JavaScript/HTML just as easily against a web-interfaced server... or even delivered as a full 'cloud' service.
I note that old bastion of the PC, Sage accounts, now offers an online package...
Oh, it won't be a rapid collapse, just the death of a thousand cuts. And the tablet/smartphone revolution has put the idea in the minds of everybody who isn't especially tech-savvy that there is a world beyond Microsoft, and when you come to sell a desktop system based on Linux, you will merely have to say 'oh, it's the same as Android really' and suddenly everyone will relax...
"Until they accept they no longer a monopoly and the market is swimming away from them and that its now MS that are the small fish in a big pond they will never regain respect or a decent share at the middle and top end."
If I were to search for a definition of Microsoft's woes I could not better that sentence.
Way back, when CP/M was king and PC-DOS was a POS that survived solely because IBM put its weight behind it, Microsoft was a lean, unethical marketing company, hungry for financial success and able to use aggressive marketing tactics in a world infested by naive techies, such as myself.
But that was decades ago. Having achieved a de facto monopoly of operating systems in the desktop arena, Microsoft entirely forgot how it had done that thing, and its failure to diversify away from that market, apart from the Xbox, marks both the secret of its success and its ultimate likelihood of failure.
It hasn't got the creative genius of an Apple, the dedicated engineering expertise to churn out volume low-cost hardware, nor the courage to abandon its 'Windows' brand and embrace Android.
Nor yet the courage that Apple showed when OS-X was introduced, based firmly on a *nix core. A move which invalidated all of its legacy OS9 software at a stroke.
Now it has found a market in which it was not the early mover, but late to the game. It cannot respond, because it has none of the tools to respond with. Its tools are those of a dominant organisation crushing inferior competition, not those needed to do battle with a superior and established player like Android or iOS.
Users don't want desktop PCs much any more. Tablets don't do word processing or spreadsheets - the applications that arguably drove the PC onto the corporate desktop. Dedicated consoles drive the games. And the internet IS the operating system now. A browser, a mail client and half a dozen other net-enabled apps don't care what operating system they have. The internet protocols become the new API for app developers, not the operating system.
Linux and Android are by and large free of licence fees, and adequate to the job in hand, so who needs Microsoft?
A dwindling number of corporate desktops that actually do more than read data and add a very small content to it.
And how long before the applications they run are 'cloud services' rather than loadable pieces of software? At which point the desktop workstation operating system also becomes supremely irrelevant.
Or the code writers start to get together and say 'well actually, if we pick a single Linux distribution, and port our applications to that we can still make money out of the application, even if Microsoft doesn't'?
Windows has long been an operating system nobody really WANTED; it was just that they had no choice if they wanted to write Word documents and PowerPoint shows, or use a spreadsheet... or run some heavyweight desktop application.
Now, increasingly, they don't want any of these things; they have realised that not only do they not really want Windows, they don't actually need it either.
The moment a PC vendor or a big apps company breaks ranks and offers a Linux version either of a basic workstation or of Corel, Photoshop, Quark, AutoCAD, SolidWorks*... is the moment that Microsoft's whole existing business model really starts to collapse, not just decline.
Because inertia and the 'de facto office suite' are simply not enough to justify spending money with Microsoft when Linux is cheaper and Libre Office is arguably as good as, if not better.
*These are merely examples of heavyweight apps that the author knows need Windows or OS-X to run on. Readers will know of others.
No science can be proved conclusively: that's not what science does.
Science proceeds by coming up with models that fit the data as data is currently understood.
If they provide predictions as to what the data should be in areas other theories do not, and that data fits the theory, then they gain weight.
They are never conclusive: they are always just 'better pictures of reality' and even there, first of all you have to have a metaphysical concept of what 'reality' is.
But they are - or can be - very USEFUL.
The photoelectric effect was one of the few unexplained things that dented the 19th-century classical world: that and radioactivity led to the idea of subatomic structure, which 'explained' the arbitrary nature of the elements, and led eventually to quantum theory, whose main obvious everyday effects were the transistor and the laser, without which computers would have been hard to make.
So black hole theory is not random bollocks: it is an attempt to see what happens at extremes of mass within the universe, and predict what we should be seeing where it exists.
If it calls into question our very understanding of what space, time, energy and mass are, so much the better. Perhaps these are all manifestations - or could be represented as manifestations - of some more abstract thing.
That wouldn't prove the abstract thing existed, merely that it was a good way to look at things we take for granted DO exist.
Microsoft is becoming increasingly IRRELEVANT.
And I think that is the greatest thing to be understood.
Apps, even commercial apps, are no longer things that talk to an operating system. They are things that talk to a cloud or a device.
Or they are server based and talk Linux increasingly to the hardware, and the above to the user interface.
And I think that is why people are hanging on to XP, because they don't want to spend money upgrading to the sort of platform that mobile devices and chrome OS style kit may make completely pointless in the future.
If word processing and spreadsheets and mail move to a cloud, private or otherwise, what is left for the desktop workstation to do that cannot be done by almost any OS with a JavaScript-enabled browser? Your database is now accessed by web forms as well.
Really - and I would be interested to hear from others - in my direct experience it's only 'creative' stuff:
Design and drawing, technical and artistic - and that includes coding.
Picture, movie and music editing.
These are the only things I can offhand think of that REQUIRE high-bandwidth, fast video interaction and a lot of local storage and processing power, and that still can't be done with a client/server model.
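The 'database accessed by web forms' point above is easy to see in miniature. A Python sketch (the table, column names and helper functions here are all hypothetical, purely for illustration): once the server side is reduced to 'form fields in, JSON out', any OS with a browser will do as the client.

```python
import json
import sqlite3


def make_db():
    # Hypothetical in-memory table standing in for the corporate database.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (name TEXT, town TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [("Smith", "Leeds"), ("Jones", "Cardiff")])
    return db


def handle_form(db, form):
    """What used to be a telnet session into a Unix box: take the submitted
    web-form fields, run a parameterised query, return JSON for the browser."""
    rows = db.execute("SELECT name FROM customers WHERE town = ?",
                      (form["town"],)).fetchall()
    return json.dumps({"names": [r[0] for r in rows]})


print(handle_form(make_db(), {"town": "Leeds"}))  # {"names": ["Smith"]}
```

The workstation's job in this picture is rendering the form and the JSON, which is exactly what any JavaScript-enabled browser already does.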
Yes, there are legacy Windows apps, but who is writing new apps exclusively for Windows, rather than some sort of cloud-based stuff?
I may be somewhat biased, but it does seem to me that Microsoft is stuck as a company geared to delivering something no one needs, and not that many people actually want, any more.
When Windows came out it was, despite the hype, simply a cheap way to get a GUI on a desktop to do simple, boring stuff. Only creatives really NEEDED a GUI at all, and they ran Macs anyway. OK, games were fun, but consoles blew that away.
Now there's a friggin' GUI on everything and the API is HTML/JavaScript.
The question has to be 'why do we need windows at ALL?'
In my case there is a very simple answer: because Linux freeware can't match two specific design applications that exist on Windows. Nearly, and getting better, but no cigar yet. And VirtualBox suffices.
I'd be interested in other people's answers to this very serious question.