Re: MY EYES!!
Christ that Panamera is ugly.
Not exactly ugly. But it does make me imagine that a Porsche 911 once indulged in some anabolic steroid abuse, and then gave up on both the steroids and the working out in its middle age.
With a hybrid, there is a complex efficiency trade-off between the power and charge capacity of the electric components, and the cost of dragging them around with you when the vehicle is running on its internal combustion engine. The designer takes a (hopefully well-informed) view of the likely spectrum of journey distances and road types that the average owner will require, and then optimises the electrical system for overall vehicle lifetime efficiency. (Note: if you are very far from that average owner profile, a hybrid may not be the right vehicle for you).
I'd guess that the electrical system is aimed at use in stop-start urban traffic, and for shorter commutes. Internal combustion engines are at their worst in stop-start conditions. Lighter weight trumps blistering e-acceleration (needs heavier motors) or long electrical range (heavier batteries). Those requirements are satisfied by the other engine. You have the option to engage it even in city traffic, if that's your (energy-wasting) style.
You can always buy an all-electric Tesla ... but you have to be sure about being able to recharge it before its batteries run empty, and even if there's a charging station en route, recharging is slower than refuelling. It'll be a fair while before an all-electric car is any good in rural parts (where the locals will tell you that the electricity supply is less than totally reliable, and many of them own a petrol generator, just in case. Thought -- a multi-kilowatt inverter accessory for a hybrid car might open up the rural market?)
I thought the addiction danger was that these drugs become less effective with repeated use. At which stage you may be tempted to take a larger dose. Which then becomes ineffective with repeated use, and you stumble down the path to total addiction to doses that would kill a non-addict, and ultimately to doses dangerous even to an addict.
On the other hand I'm also aware of quite a bit of literature suggesting that if you do not have an "addictive personality", you won't have any trouble keeping your intake of opiates at the effective therapeutic dose and no higher. There's a lot not yet understood about opiates, even after centuries of use and abuse thereof.
Oh, and there were many Victorian addicts who lived quite long and productive lives. Perhaps medics shouldn't care about people becoming addicted, if they have nothing else that can deal with a chronic source of severe pain. It's not as if opiates are expensive.
Ibuprofen, Naproxen, Diclofenac and dozens of other prescription drugs are NSAIDs: non-steroidal anti-inflammatory drugs. They relieve pain caused by inflammation, by reducing the inflammation. As a gout sufferer I know that large doses of these drugs are spectacularly successful on the right sort of problem. Crippling agony to mild ache in six hours. Magic that works!
But if you have back pain, it may very well be neurological: a trapped or pinched nerve. In which case an anti-inflammatory drug is useless. Which leaves Paracetamol, Aspirin and the all-too-addictive opiates.
Personally, the only thing I've found Paracetamol any use for is flu / bad colds (and I do wonder whether palliative interference with the natural self-curing of these ailments is wise). I used to find Aspirin much more effective, except it's contra-indicated for gout sufferers like me. Aspirin has a direct effect at the neurological level, as well as being an anti-inflammatory drug.
Another reason to avoid NSAIDs for back pain is that it's likely to be chronic, and long-term use of anti-inflammatories is rather bad for you. They appear to block your body's natural repair mechanisms, and if you take them long-term your risk of suffering a stroke or heart attack slowly rises. In some people they also cause stomach bleeding, which long-term can lead to ulcers or anaemia.
I'd add "generally inappropriate use of variable speed limits".
For example, on the M1 in the morning, approaching Luton airport from the South, there is often a queue on the slip-road and occasionally back onto the carriageway. So what do they do?
Five miles ahead they slow you down to 60. Four miles ahead, to 50. Three miles ahead, to 40. About a mile ahead, you can see that there is actually no queue.
Wouldn't road markings telling people who aren't exiting to stay in the right-hand three lanes be a better idea? (Going the other way approaching the M25, that's exactly what they have done: marked the inside lane as an exit lane for about two miles prior to the exit).
Now I know what the retard driving in the 4th (outside) lane of the M1 at 60mph was listening to.
LIFE IN THE FAST LANE (surely make you lose your mind)
a melted PINGO
that'll be a pingone, then.
A water world is another interesting possibility. Liquid H2O in far greater abundance than Earth, 200km or so deep i.e. down to the point where the pressure causes it to crystallize into a denser-than-liquid high-pressure phase of ice. Earth's ~2km of water would be stripped by the solar wind in about 100My without our geomagnetic field. A water world might well last until its sun boiled it, even if losing water to space at 2km per 100My.
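That 200km figure can be sanity-checked with a back-of-envelope hydrostatic calculation. This is only a sketch: the density, gravity and ice transition pressure are all assumed round numbers (Earth-like gravity, and roughly 2 GPa for the onset of dense high-pressure ice).

```python
# Depth at which a water-world ocean crystallizes into high-pressure ice.
# Hydrostatic approximation P = rho * g * h, ignoring compression of the
# water column. All figures are assumed, order-of-magnitude values.
RHO_WATER = 1000.0   # kg/m^3
G = 9.8              # m/s^2 (assumes Earth-like surface gravity)
P_ICE = 2.0e9        # Pa: rough onset of dense high-pressure ice at ~2 GPa

def pressure_at_depth(depth_m: float) -> float:
    """Hydrostatic pressure in pascals at the given depth."""
    return RHO_WATER * G * depth_m

freeze_depth_km = P_ICE / (RHO_WATER * G) / 1000.0
print(f"ice forms at roughly {freeze_depth_km:.0f} km")   # ~204 km
```

Which lands satisfyingly close to the 200km mentioned above, for what is only an order-of-magnitude estimate.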
Radiation may be a problem only for life that evolved on this planet with good radiation protection in place. Protection against a non-existent threat doesn't evolve. Even here, blue-green algae will thrive in the cooling jacket of a laser discharge tube (intense UV) and live off the Cherenkov glow inside a nuclear reactor (huge neutron flux and lots of other nasties). It's probable that they are descended directly from the first life on the planet, which evolved before we had an oxygen atmosphere or an ozone layer, and so needed good radiation tolerance.
One datum allows no conclusions to be drawn about the population it is taken from. Maybe we are alone. Maybe as many as one in twenty stars carries life on one of its planets. The evidence so far can rule out neither, nor anything between.
Oh, and what do we know of the possibilities for life in the various phases of degenerate nuclear matter on the surface of a neutron star with a strong magnetic field? NOTHING! Apart from the obvious facts that we'd never get to shake hands or share biospheres, and that given the relative speed of nuclear chemistry compared to the everyday sort, one of our months might be long enough for the complete rise and fall of a civilisation.
We will never find a "new Earth" -- there will be too many chemicals, bacteria, viruses that will be deadly for humans.
Stated on what basis? We have zero observations to choose between these possibilities:
1. Life in general is poorly able to attack alien life. Our defences will overwhelm their bacterial attackers and vice versa. The Space-opera writer's choice.
2. The reverse: their bacteria would rapidly reduce our higher life to smelly slime and vice versa, followed by a long battle between two different clades of single-celled life for supremacy or symbiosis.
3. A one-way knock-out victory: our bacteria overwhelm their biosystem in short order, or their bacteria overwhelm ours. There can be only one ....
4. Panspermia: all life in our galaxy has a common origin, so it's only invaders from Andromeda we have to fear.
5. There's only one way to do life with the physics and chemistry of this universe that's not a thermodynamic impossibility, so convergent evolution means it'll look like panspermia even though it isn't.
All of which is ignoring the extreme unlikelihood of mankind ever reaching another star. We are too fragile and too short-lived to endure interstellar travel at a realistic (small) fraction of the speed of light.
Our Von Neumann machines have a better chance, but the last place in the universe that Silicon-based A-life would want to conquer would be planets with moist oxidizing atmospheres. Alien A-life might even be established in our Solar system, and if they don't use radio to communicate (or if they use VERY efficient coding indistinguishable from noise) we might not have noticed it!
Yes, Intel will eventually have to follow plan B. Use their world-leading process technology to fab the world's best-performing ARM chips for whoever pays them to do so. They won't make so much money this way, since royalties will flow to ARM, but a profit is a profit.
IMO ARM trying to break into the server room is much like Intel trying to break into the mobile space. For various reasons, I expect Intel to keep its hold on servers for a long time to come.
Now that XP is EOLed, a good portion of new PC sales are for replacement of old XP kit.
Very much my thought too. The trend for all computer vendors will be down, as the replacement cycle stretches from three years up to five, even eight years. Hardware doesn't wear out fast. Today's hardware is fast enough for most uses that people find for PCs, with huge amounts left in reserve for future software bloat.
Everyone who wants a Mac has got a Mac. Same for PC. Probably same for tablets soon if not already. There's a bulge in the PC replacement market caused by the EOL of XP and consequential replacement of many PCs 5+ years old that can't run more modern Windows well. It's distorting the figures.
In short, computer hardware is now a mature market.
BTW If you run Linux desktops in your business / school / home, you can acquire adequate hardware for free right now (i.e. ex-XP systems). You might even get paid a few quid to take it away.
Sapphire is impure ... if it didn't have trace impurities it would be transparent and colourless, not blue. Yes, I know that the initial topic was sapphire glass not sapphire (a gemstone)....
Brittle materials are ... brittle. You can break even a diamond into pieces using a hard steel chisel and a small hammer. How do you think they divide a large natural diamond of irregular shape into pieces that can be ground into jewels? (And how do you think the expert gem-cutter feels, when after much planning the multimillion-dollar uncut diamond cleaves in a different way to the plan? )
As for phones ... when a mobile meets a flint-gravel drive, the phone will lose. Sod's law says display down, onto a sharp flint point, every time, guaranteed. Heck, it even works for buttered toast onto carpet.
Now when you drop it onto a gravel drive, it'll be a write-off whether it lands display side up or display side down or even sideways.
Power buttons are easy to find, they are on the front panel of the PC
yes, under the desk, behind someone's handbag.
Or two PCs stacked on top of a filing cabinet. You press a button and realize 0.1 seconds too late you've just shut down someone else's PC.
Or four PCs connected to a KVM switch. Tip: coloured cable ties are very useful. Cable ID you can see from any direction.
The trouble is that Microsoft has a direct or indirect almost-monopoly on certain widely used small enterprise application classes. There's no good equivalent of Autocad on Linux. Nor is there any equivalent of Sage Accounting. You can probably think of others. It's a scandal that MS is allowed to maintain such monopoly power, but until our legislators catch up there will continue to be applications where using Windows is essential (even if that's Windows in a VM on your Linux desktop).
Pretend that the hard problem has been solved, and a computer contains a weak AI that is capable of disambiguating natural language, that can filter your voice from others in the background, and work out the difference between input and meta-input (commands), and handle all the contextual mappings that are an instinctive part of natural language. (Personally I think that's fifty years to infinity away). But anyway,...
There's a subtle but significant difference between spoken / informal written communication (texts, memo pads), and formal written documents. There's also a not-so-subtle difference in how they are created. The former are linear, rarely revised, read once and thrown away, subject to question-and-answer clarification if unclear (conversation). Formal documents are usually not created in a linear fashion. They used to be written as drafts, with crossings out, arrows and boxes showing text relocations or insertions, etc. Then typed. With a computer one types and reads it back, and can do the editing as one goes. It's probably an improvement. But the key thing is, "how do I know what I think until I've read what I've written"? (A quote, possibly mangled). Speech has no part in this process - it would completely get in the way. Unless it's a play. In which case, the editing process likely involves listening to (and possibly watching) a rough and ready performance, and deciding what worked and what didn't.
BTW the reason why e-mail causes so many office embarrassments, arguments, grudges and bust-ups is that it straddles the line between these two forms of communication, and what was intended as a conversation gets interpreted as a formal document or vice versa. A genuinely useful AI would be capable of doing the same as a PA or a PR person - "do you really want to say that, because ..." Like I said, it'll be a long time coming.
What we need is "upwards-compatible". Windows 8 has a kernel that's probably an improvement on Windows 7, but only techies ever notice it. It could have had much the same UI as Windows 7 as "legacy mode" and that ghastly not-Metro interface as "new mode" with a choice between the two made every time one logs in (one click). But oh no - they had to tear up everything that went before, and force everyone to start over. F*** them.
It's not just a user issue either. Talk to someone who writes programs with Windows GUIs about it. If Microsoft cared about its customers, anything that prevented an old MS windows GUI program displaying on a new Windows platform would be called a BUG, and fixed asap.
In the Linux world, things work differently. The Gnome team actually did the same as Microsoft - foisted a radical new UI on their "customers" that they didn't like or want. But it's open source, so someone forked the old source and gave it a new name (Mate) and someone else took the newer version and re-skinned it to be less unlike the old version (Cinnamon - which is now also a complete fork). And there were several longstanding alternative UIs out there in the first place - no monopoly on our desktops, thank you!
It's the self-repair stuff that does the trick.
Sadly that's not likely to be as good for the next 50 years as it was for the previous 50! (Unless you're a tree or a koi carp).
Do we know where "there" is?
The Voyagers will evaporate, eventually. Probably long before they again come within 1AU of another star, and long after the human race is run.
It needs to keep its antenna pointed at Earth. Interstellar space is not a perfect vacuum, and there's doubtless a torque created by passing through that medium at high relative speed.
My Philips TV.
True, it's merely sitting in my lounge. On the other hand it's sitting in a moist oxidizing atmosphere being shaken up and down by vehicles speeding over the speed-bump outside, rather than coasting in a vacuum - hardly an improvement. It's particularly impressive given that old vacuum display tube technology involves twenty-five kilovolts of EHT. Were that ever to spark somewhere it shouldn't, that would be curtains.
I keep telling myself not to be sentimental, but it's no good ... throwing away something that well engineered would be criminal.
It is pretty much impossible to design a switch mode power supply that is efficient at both low and high powers.
Not being an electronics engineer I won't say "bollocks"
But surely it's not beyond the wit of man to design a power supply that is integrated with a small rechargeable battery pack. The power supply would turn off leaving the standby electronics running off the battery. They'd be capable of commanding the power supply back on to recharge the battery as fast as possible when it was close to empty, and then off again. In normal operation that wouldn't happen, because the device would get used before the battery was close to empty.
You'd need a real switch to get it full-on if the battery had gone flat because of a long period without a mains connection. It might work better with an ultracapacitor instead of a battery (but NiMH cells are very reliable and could be user-replaceable).
Methinks it's a cost issue not a technological issue, and it needs legislation to outlaw devices with inefficient standby. Otherwise there's always an incentive to save pennies (straight to the bottom line!) by shipping inefficient devices.
after 3.6 billion years very little has managed to adapt itself to live in very salty water
Alternative explanation: very salty water on earth is a short-lived and unstable ecological niche. It normally arises when salt in a body of water is concentrated by evaporation because it's cut off from the planet's oceans. Over geological time it will prove short-lived. Its limited water input will fail and it will dry into a salt pan and then become a stratum of rock salt. Or a narrow channel will open wider and dilute the very salty water down much closer to the norm of Earth's oceans.
Life has certainly adapted as the oceans gradually became saltier without huge short-term fluctuation. Most of the salt ever released from rock by weathering, resides in today's oceans. When life was new, they were almost freshwater.
Another alternative explanation. Once life works out how to do something fairly well, that mechanism tends to persist. It's rare for a second way to do the same thing to evolve. For example, the DNA/RNA codes are much the same in the weirdest and oldest archaea and in all plants and animals. Maybe because life here evolved in (almost) fresh water, it is not well suited to very saline water and struggles to adapt sufficiently.
On Titan, where the "ocean" is all highly saline, evolution may have taken different paths of which we know nothing.
I mean, how can the market share of XP be increasing unless people are doing new XP installs?
How? If Microsoft's market share is decreasing, and an increasing percentage of Microsoft's home users are those who have an old system running XP and don't intend ever to change it. (Businesses ought to have migrated to Windows 7, or even 8, before XP EOL'ed, though we know that there are a fair number that haven't finished their migration yet.)
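The arithmetic is easy to sketch. With invented numbers (these are illustrative, not real market data), XP's share rises even though its installed base shrinks, because the total installed base shrinks faster:

```python
# Toy numbers (invented, millions of active PCs) showing how a shrinking
# XP base can still gain market *share* if the whole market shrinks faster.
total_then, xp_then = 1500.0, 450.0
total_now, xp_now = 1300.0, 420.0    # fewer XP boxes, far fewer PCs overall

share_then = 100.0 * xp_then / total_then
share_now = 100.0 * xp_now / total_now
print(f"{share_then:.1f}% -> {share_now:.1f}%")   # 30.0% -> 32.3%
```

So "share of installed base increasing" and "new installs happening" are entirely independent claims.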
Which feels right. An ever increasing percentage of the students at the uni where I work arrive with Macbooks rather than notebook PCs. Then there are the many domestic users who don't work with a computer but just consume media and web-browse. They'll be scrapping their Microsoft PC without replacing it, buying tablet devices (Apple or Android) instead. They bought PCs in the past only because there was no alternative.
It just confirms what someone working in electronics-store retail will tell you, that there are lots of customers saying "I want a new computer, but I absolutely don't want Windows 8". If the shop is clued up it can offer them Windows 7. If it's not, I suspect that they buy an iMac, or in a few cases get steered to Linux by a clueful relative. (Are the figures for desktop iMac vs PC available, or do iPads and iPhones muddy the waters? )
Well, if you have to preserve an XP environment to preserve various people's sanity ...
Get a modern desktop system, preferably with an SSD. Install Centos (or other Linux of choice). Install XP and all apps into a KVM VM (using a Linux LV as the XP system's "hard disk").
Advantages: you can make backups of the VM with all apps installed, so recovery after it borks itself is straightforward. (Using LVM snapshot you can do this remotely or automatically, while the XP VM is running). You can use a "network" share for the user's data, and set Linux to work safeguarding the data in it. You can configure firewalling for the poor old XP using Linux. You can be sure it'll never stop working for lack of compatible hardware. Lots of other smaller advantages.
It'll still be much faster than XP native on the old box.
BTW VMware player is slicker and easier and free as in beer but not libre ... and probably not high on VMware's list of things to maintain support for. Which is why I'd recommend KVM, if you would rather put in more effort now than risk handling a problem years down the line when your elderly relative is even less able to adapt to using anything other than XP.
Edit - on second thoughts, probably not an SSD. Lots of RAM so Linux can cache loads without starving the XP VM, and software-mirrored hard disks, so your elderly relative isn't one disk device failure away from losing his sanity. (With smartd sending you regular reports, so you can turn up with a replacement disk drive when it's needed or soon will be).
I swear western civilisation would crumble to dust if anything ever happened to all the Excel spreadsheets that appear to run most businesses...
But it already has! (for pretty much all possible values of "Anything" )
There's a theory that if anybody ever manages to understand the universe, it will abruptly end and be replaced by something less understandable. There is another theory that this has already happened, many times. My personal theory is that this explains hangovers.
The price of the hardware components has continued to fall, so why has nobody decided to build a bigger one, and why a lack of enthusiasm for upgrades?
I'd guess that the problem is that we're close to the limits of what can be done in parallel with the types of hardware we have got. Single node speed has hit the physics limits, large multiple node counts run into interconnect bandwidth limits. Energy consumed scales with the number of nodes, useful work output does not. The %marginal value of an x% upgrade diminishes as the size of the supercomputer increases. What's needed is either a hardware breakthrough on the interconnect front (much more bandwidth), or a software breakthrough that can automatically generate more efficient parallel codes than a human programmer can (if that's possible).
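The diminishing returns can be sketched with Amdahl's law. The serial fraction below is an assumed figure, standing in for interconnect and synchronisation overhead:

```python
# Amdahl's law: with a fixed serial fraction s of the work, the speedup
# on n nodes is 1 / (s + (1 - s) / n), which flattens as n grows.
def speedup(n_nodes: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_nodes)

S = 0.001   # assume 0.1% of the work is serial / communication-bound
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} nodes -> speedup {speedup(n, S):.0f}x")
# A 10x node upgrade from 10k to 100k nodes yields barely 1.09x more speed.
```

Which is exactly the "%marginal value of an x% upgrade diminishes" effect: a tenfold increase in nodes (and energy bill) buys almost nothing once you approach the 1/s ceiling.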
Nature's answer to the problem of using vast numbers of low-power processors (a brain) is interconnection much closer to a fractal dimension of three than anything we can do today.
So you would prefer paedos don't get caught
If that is the price of maintaining one's right to privacy, of not living in a goldfish bowl where the powerful can find out everything about the rest of us without us knowing until they use what they know against us ... then YES.
In practice once they know that an illegal image has been downloaded, that'll be all the justification they need for a warrant to find out who downloaded it. So what you are arguing, is that they should have warrantless access to a massive database of everything that everyone has ever browsed, so (official reason) they can go trawling for criminals. Do you really believe that is all they will ever go looking for?
Oh yes, the security services already have this access. (Snowden disclosures). Today, they have to keep that access secret and can't use it except within a fairly narrow "state security" remit. They're well down the slippery slope, though. I fear that Orwell's 1984 is coming true, just 30-40 years later than he thought. (To say nothing of the Vingean nightmare of a society pushed over the edge of chaos by omnipresent surveillance, crashing back to the dark ages if not the stone age).
With respect to cyber-bullying, what's the problem? The police have a complainant who is being bullied, and an internet service provider who can tell them where the bully is. They'll just have to get a warrant for that disclosure in future. Is it being suggested that a warrant would not be granted in these circumstances?
In a city-commuter environment, is anyone really going to run them off the petrol bit unless the worst has happened and the daily charge has run out?
So give them the same urban-area advantages as pure electric cars. After all, if someone with a pure-electric car needs to make a long journey, he's going to use his other car, or hire a conventional car, so the CO2 emission will also be the same or worse. (Worse, if two cars have to be manufactured instead of one).
Back in the old days when a disk drive was the size of a washing machine and required an engineer from the computer company to install one ...
He turned up, un-crated it, and asked to borrow a phone. "Not working?" "No, and it won't. I need to call the shipping company and our loss adjusters ... go and take a look at it ..."
I did. There was a neat rectangular hole in the side. The exact size of a fork-lift-truck's prong ... right through the controller (about ten foot-square circuit boards).
Perhaps worth commenting that some insects can also raise their body temperature above ambient. Bumble bees are furry for the same reason that mammals are furry: to avoid excessive loss of body heat to a colder atmosphere.
Aren't dinosaurs the link between reptiles and birds? From the completely cold-blooded, to the creatures with (probably) the highest metabolic rate on the planet?
Dinosaurs were around for a long time. Plenty of time for evolution to get from cold blood, through cool, to hot. Pretty straightforward, compared to evolving feathers (a true miracle of biology, and perhaps the only radically new bio-structure since plants evolved wood? )
There's surely a good case here for an IPv4 address tax! Someone with a /8 block is likely to rapidly relinquish most of it, when a tax demand for 2^24 pounds/dollars/euros per annum arrives (about 17 million). Whereas the /30 block I have at home would cost me (via my ISP) an extra £4 per annum, which I'd happily pay. Heck, several times that wouldn't hurt much for any addresses actually being used.
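The sums above, at an assumed flat rate of one pound per address per annum (any flat rate scales the same way):

```python
# IPv4 address tax sketch: a flat per-address rate makes a /8 hoarder's
# bill 2^24 times a home user's per-address cost. The GBP 1 rate is an
# assumption for illustration.
RATE = 1   # pounds per IPv4 address per annum

def annual_tax(prefix_len: int) -> int:
    """Tax bill for a single IPv4 block of the given prefix length."""
    return (2 ** (32 - prefix_len)) * RATE

print(annual_tax(8))    # /8 block: 16,777,216 addresses -> ~GBP 17M
print(annual_tax(30))   # /30 block: 4 addresses -> GBP 4
```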
Might even make ipv6 popular. People used to live in darkness rather than pay a windows tax. (NB, small W, 17th century).
My understanding is that it's only long (50km+) wires that are seriously vulnerable. Your rooftop solar panels are OK, and your inverter isn't directly vulnerable. I think you may be confusing a solar storm with an atomic-weapon-induced EMP.
What happens in a solar storm is that a large DC current is induced in long wires. Conventional 50Hz or 60Hz AC mains transformers can't transform that DC current; instead they dissipate it resistively, meaning they heat up. If the circuit is not made open-circuit pretty soon, the transformers then melt down and catch fire. HVDC transmission is immune - the solar storm either adds a bit more DC juice or subtracts a bit. Short urban-grid-scale wires do suffer induced currents, but less so in proportion to their shorter length. The risk to them is a disorderly shut-down or melt-down of the long-distance grid, causing voltage surges, local overload conditions, etc.
Telecomms is similar, except that it's rarely copper and even more rarely DC-coupled these days. Most of the long-distance internet is optical fibre. Long-distance copper is probably found only in very rural parts, connecting one farmhouse or hamlet to the nearest town's telephone exchange many miles down the road in the old-fashioned way.
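A rough sketch of why only the long wires matter. The geoelectric field strength, line resistance and winding resistance below are all assumed round figures for illustration:

```python
# Geomagnetically induced current (GIC) sketch: the storm-time geoelectric
# field drives a quasi-DC EMF proportional to line length, while the
# transformer winding resistance at each end is fixed. Figures assumed.
E_FIELD = 6.0     # V/km, severe-storm geoelectric field (assumed)
R_LINE = 0.03     # ohm/km, transmission conductor resistance (assumed)
R_WINDING = 0.5   # ohm, one transformer winding at quasi-DC (assumed)

def gic_amps(length_km: float) -> float:
    """Quasi-DC current through the line and the windings at both ends."""
    emf = E_FIELD * length_km
    return emf / (R_LINE * length_km + 2.0 * R_WINDING)

for km in (5, 50, 500):
    print(f"{km:>3} km line -> {gic_amps(km):.0f} A of quasi-DC")
```

The fixed winding resistance dominates for short runs, so a 5km urban feeder sees tens of amps while a 500km trunk line pushes getting on for two hundred amps of DC through transformer windings designed for none at all.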
It's probably fair to say that if we'd been hit by a Carrington event in the 1920s through 1990s, our civilisation would have crashed.
Two things have changed / are changing. Firstly, the danger is recognised, and we now have satellites watching the sun that would give us a few hours' warning. That's long enough to prepare the grid. Controlled shutdown instead of fatal melt-down. Of course, whether they do enough "solar safety" drills to actually avoid getting the electricity grid fried, is an unknown until it happens.
Secondly, we are moving from a synchronous transformer-coupled HVAC grid (vulnerable) to HVDC long-distance electricity transmission (more efficient and not vulnerable). Likewise telecomms are moving from copper wires (vulnerable) to optical fibres (not vulnerable).
If we don't get hit by another X-unprecedented flare in the next couple of decades, we're probably OK. Except, we don't know what is the biggest flare our nearest star is capable of! The upper limit is only that it was never powerful enough to wipe out all land-based life (and there have been a few extinction events when something wiped out *most* land-based life ...).
A wimp, as these things go. The Carrington Event is estimated to have been X22 (on a scale that only officially goes to 20). So that's almost 1.5^20 times this one (around 3300 times bigger).
Good Platinum ores are graded at grammes per tonne. You have to dig up, crush and chemically process a tonne of rock to get a few grammes of Platinum. This is one reason why it's so darned expensive.
Bad management gets the unions and workforce it deserves. (Those who can leave, have left).
As an organisation at the other extreme, I'd cite the John Lewis partnership. Unions? Why? Everyone has an equity stake in the business, and it goes from strength to strength in a very competitive sector.
I think that a very solid line should be drawn between those who are paid a salary regardless of whether they perform excellently, adequately or badly
And those who founded a business and own some or all of the equity in that business.
Frankly, I don't see much evidence that many (any?) of the fat cats paid six- or seven-figure salaries are worth any more than the employees several levels below them. Indeed, it's usually the lower levels that do the real work, and can see how the self-perpetuating clique of fat cats more often than not have zero or negative value. They give themselves 10% or 20% pay rises, while the staff that do the work get 0%. They aren't working for a living, they are parasitising those who do!
In contrast, someone who put his own money and time into a business that is now thriving, should be allowed to enjoy whatever degree of success he is able to achieve, just as long as it is by way of dividends paid equally to all equity-holders, or sale of shares in that equity.
So (for example) I'd be in favour of higher levels of income tax on very large salaries, but not the same levels on capital gains (especially not on long-term capital gains, and especially not capital gains made by founders of businesses on equity that was worth nothing at all when they started). Anti-avoidance rules would clearly be needed to stop the fat cats playing the system.
Also there should be an outright ban on any salary greater than the Prime Minister's salary in any part of the public sector (including universities, quangos and suchlike -- not just the civil service). If a corporation wastes its money, it will sooner or later go bust. That's a crude self-correcting mechanism that eliminates the very worst excesses. Whereas if an organisation is funded by the taxpayer, its fat cats can and will carry on leeching off society effectively forever. (In the rare cases where such an organisation needs a specialist who really can command such a large salary in a free market, it should obtain that service by competitive tender, with payment under the laws governing commercial contracts, including appropriate penalty clauses. Never by employing that specialist on a salary. )
Controversial, I know. Asbestos jacket in place ....
Anyone under 35 years old won't remember. Those who do remember, can be intimidated into not talking to anyone who doesn't. Those who won't shut up, are in jail or exiled.
Never under-estimate the human ability to ignore anything that isn't aimed directly at oneself. Remember Pastor Neimoller:
"First they came for the Socialists, and I did not speak out—Because I was not a Socialist.
"Then they came for the Trade Unionists, and I did not speak out—Because I was not a Trade Unionist.
"Then they came for the Jews, and I did not speak out—Because I was not a Jew.
"Then they came for me—and there was no one left to speak for me."
The technology of censorship and repression is probably as well ensconced here in the West as in China. For now the sinister "they" lack the will to use it to completely destroy our freedom. Post-Snowden, I fear the question is, "for how much longer?"
TDP is important. It determines the type and size of heatsink the system needs. It's especially important if you are trying to build a passively cooled system (no fan, no moving parts (SSD), no noise).
It's nice to know your system can turbo for a few seconds, relying on thermal inertia to avoid meltdown, and then down-clocking itself when not busy or in thermal distress to allow the heat to dissipate. But TDP, as the maximum long-term-average amount of power that a busy system will ever need to dissipate, is a critical parameter for designing it.
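To make that concrete: given a TDP and a maximum allowed temperature, you can work out the worst thermal resistance your heatsink is allowed to have. A back-of-envelope sketch (the numbers below are my own illustrative assumptions, not from any datasheet):

```python
# Illustrative figures, not from any real datasheet:
TDP_W = 15.0        # sustained power the chip may dissipate (watts)
T_AMBIENT_C = 25.0  # air temperature inside the case
T_MAX_C = 95.0      # maximum allowed case/junction temperature

# The heatsink (plus thermal interface material) must present a total
# thermal resistance no worse than this, in degrees C per watt:
max_thermal_resistance = (T_MAX_C - T_AMBIENT_C) / TDP_W
print(f"Heatsink must be <= {max_thermal_resistance:.2f} C/W to ambient")
```

For a passively cooled build, that single C/W figure is what you shop for: a fanless heatsink rated at 4 C/W or better would cover the 15 W part above with a little margin.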
Of course, no reason it actually has to BE glass topped.
Marble or basalt (if heavy, hard and shiny floats your boat). Easier to source in any size than tempered glass (and you really don't want to think about untempered).
I've just run into the annoyance of a worktop so black that an optical mouse cannot "see" it, and for the first time in my life I had to find a mouse mat (OK, a sheet of A4 paper). Give me wood or wood-effect laminate any day.
The assumption here is that if these companies didn't buy gold known to come from NK, NK would not be able to sell its gold, or would have to sell it at a huge discount.
A moment's thought should tell you that's not the case. A minuscule discount will suffice to sell their gold to a country or person who doesn't care, and once their gold is melted with gold from scrap jewellery or scrap computer parts, nobody will have the faintest idea where it came from. (Not that the Chinese even care).
Sanctions can only work for lower-value commodities, especially those where the refineries are few and specialized. We can probably avoid buying tantalum ores from war zones where they are dug out by slaves. We can track tankers full of Iranian oil and force Iran to sell it to customers further away than Europe, causing it a small percentage loss (at the cost of increasing global CO2 emissions!) For gold, there's no chance of anything like this working.
No - C is not the simplest or sparsest language. That honour surely goes to LISP, and B was simpler than C. Simplicity and sparseness are not the reason C is so popular in some programming communities. Like most successful languages, C has a niche, which is the writing of operating systems and realtime systems. I'll also grant that until computers became fast enough that interpreted languages weren't "too inefficient", C was probably the best general-purpose compiled language. (FORTRAN was and remains better for numerical coding, but only for numerical coding. Pascal, PL/I, and Ada never really caught on. I'll let someone else talk about C++ if they want to, because personally I loathe it).
I think it's a mistake.
Dealing with a small set of foreign glyphs that are universal across the global programming community is far better than the fragmentation that arises if every programmer uses their own script for their variables. The code will still compile elsewhere, but it might as well be object code for all the use the source will be outside that linguistic domain. I'll add that anyone who studies mathematics gets to learn the Greek alphabet, a few letters from the Hebrew one, and a handful of symbols not taken from any alphabet (eg union, infinity, ...). It doesn't give Greeks or Israelis any mathematical edge.
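To see what that fragmentation looks like in practice, here's a throwaway sketch in Python 3, which does allow Unicode identifiers. It runs perfectly well, but unless you read all three scripts the source tells you nothing:

```python
# Perfectly legal Python 3 -- identifiers may come from almost any script:
π = 3.141592653589793
радиус = 2.0             # "radius", in Cyrillic
面积 = π * радиус ** 2    # "area", in Chinese

print(面积)
```

The interpreter is happy; the next maintainer, possibly less so. That's the core of the objection: the code compiles everywhere, but it's only readable inside one linguistic domain.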
I can imagine an alternative universe in which North America was settled by Russians. In that universe, the Cyrillic alphabet might be used globally by programmers. I'd be able to go along with that: learning to recognise a handful of new glyphs isn't hard.
But learning 6000+ traditional Chinese glyphs in order to code: no way. I'd rebel and create a programming language based on the Latin alphabet. As for those in the Far East ... well, China, Japan and Korea have all adopted ways to map their languages onto the Latin alphabet. Because we got to IT first, or because there are intrinsic advantages to our small alphabet over their huge ones? Don't know, but in China this happened under Mao, when the West was the Enemy and before IT arrived there.
C is one of the most simple and sparse languages there has ever been. That's why it works.
Oh really? So why hasn't it been universally trumped by LISP? (And for that matter why did they ever do C, given B? )