Re: Engineering 101
Sapphire is impure ... if it didn't have trace impurities it would be transparent and colourless, not blue. Yes, I know that the initial topic was sapphire glass not sapphire (a gemstone)....
Brittle materials are ... brittle. You can break even a diamond into pieces using a hard steel chisel and a small hammer. How do you think they divide a large natural diamond of irregular shape into pieces that can be ground into jewels? (And how do you think the expert gem-cutter feels, when after much planning the multimillion-dollar uncut diamond cleaves in a different way to the plan? )
As for phones ... when a mobile meets a flint-gravel drive, the phone will lose. Sod's law says display down, onto a sharp flint point, every time, guaranteed. Heck, it even works for buttered toast onto carpet.
Now when you drop it onto a gravel drive, it'll be a write-off whether it lands display side up or display side down or even sideways.
Power buttons are easy to find, they are on the front panel of the PC
yes, under the desk, behind someone's handbag.
Or two PCs stacked on top of a filing cabinet. You press a button and realize 0.1 seconds too late you've just shut down someone else's PC.
Or four PCs connected to a KVM switch. Tip: coloured cable ties are very useful. Cable ID you can see from any direction.
The trouble is that Microsoft has a direct or indirect almost-monopoly on certain widely used small enterprise application classes. There's no good equivalent of Autocad on Linux. Nor is there any equivalent of Sage Accounting. You can probably think of others. It's a scandal that MS is allowed to maintain such monopoly power, but until our legislators catch up there will continue to be applications where using Windows is essential (even if that's Windows in a VM on your Linux desktop).
Pretend that the hard problem has been solved, and a computer contains a weak AI that is capable of disambiguating natural language, that can filter your voice from others in the background, work out the difference between input and meta-input (commands), and handle all the contextual mappings that are an instinctive part of natural language. (Personally I think that's fifty years to infinity away). But anyway,...
There's a subtle but significant difference between spoken / informal written communication (texts, memo pads), and formal written documents. There's also a not-so-subtle difference in how they are created. The former are linear, rarely revised, read once and thrown away, subject to question-and-answer clarification if unclear (conversation). Formal documents are usually not created in a linear fashion. They used to be written as drafts, with crossings out, arrows and boxes showing text relocations or insertions, etc. Then typed. With a computer one types and reads it back, and can do the editing as one goes. It's probably an improvement. But the key thing is, "how do I know what I think until I've read what I've written"? (A quote, possibly mangled). Speech has no part in this process - it would completely get in the way. Unless it's a play. In which case, the editing process likely involves listening to (and possibly watching) a rough and ready performance, and deciding what worked and what didn't.
BTW the reason why e-mail causes so many office embarrassments, arguments, grudges and bust-ups is that it straddles the line between these two forms of communication, and what was intended as a conversation gets interpreted as a formal document or vice versa. A genuinely useful AI would be capable of doing the same as a PA or a PR person - "do you really want to say that, because ..." Like I said, it'll be a long time coming.
What we need is "upwards-compatible". Windows 8 has a kernel that's probably an improvement on Windows 7, but only techies ever notice it. It could have had much the same UI as Windows 7 as "legacy mode" and that ghastly not-Metro interface as "new mode" with a choice between the two made every time one logs in (one click). But oh no - they had to tear up everything that went before, and force everyone to start over. F*** them.
It's not just a user issue either. Talk to someone who writes programs with Windows GUIs about it. If Microsoft cared about its customers, anything that prevented an old MS windows GUI program displaying on a new Windows platform would be called a BUG, and fixed asap.
In the Linux world, things work differently. The Gnome team actually did the same as Microsoft - foisted a radical new UI on their "customers" that they didn't like or want. But it's open source, so someone forked the old source and gave it a new name (Mate) and someone else took the newer version and re-skinned it to be less unlike the old version (Cinnamon - which is now also a complete fork). And there were several longstanding alternative UIs out there in the first place - no monopoly on our desktops, thank you!
It's the self-repair stuff that does the trick.
Sadly that's not likely to be as good for the next 50 years as it was for the previous 50! (Unless you're a tree or a koi carp).
Do we know where "there" is?
The Voyagers will evaporate, eventually. Probably long before they again come within 1AU of another star, and long after the human race is run.
It needs to keep its antenna pointed at Earth. Interstellar space is not a perfect vacuum, and there's doubtless a torque created by passing through that medium at high relative speed.
My Philips TV.
True, it's merely sitting in my lounge. On the other hand it's sitting in a moist oxidizing atmosphere being shaken up and down by vehicles speeding over the speed-bump outside, rather than coasting in a vacuum - hardly an improvement. It's particularly impressive given that old vacuum display tube technology involves twenty-five kilovolts of EHT. Were that ever to spark somewhere it shouldn't, that would be curtains.
I keep telling myself not to be sentimental, but it's no good ... throwing away something that well engineered would be criminal.
It is pretty much impossible to design a switch mode power supply that is efficient at both low and high powers.
Not being an electronics engineer I won't say "bollocks"
But surely it's not beyond the wit of man to design a power supply that is integrated with a small rechargeable battery pack. The power supply would turn off leaving the standby electronics running off the battery. They'd be capable of commanding the power supply back on to recharge the battery as fast as possible when it was close to empty, and then off again. In normal operation that wouldn't happen, because the device would get used before the battery was close to empty.
You'd need a real switch to get it full-on if the battery had gone flat because of a long period without a mains connection. It might work better with an ultracapacitor instead of a battery (but NiMH cells are very reliable and could be user-replaceable).
Methinks it's a cost issue not a technological issue, and it needs legislation to outlaw devices with inefficient standby. Otherwise there's always an incentive to save pennies (straight to the bottom line!) by shipping inefficient devices.
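To put rough numbers on the idea, here's a toy simulation in Python. Every figure (pack capacity, drain, charge rate, thresholds) is invented purely for illustration:

```python
def simulate(hours):
    """Toy model of the proposed scheme: a small battery runs the standby
    electronics, and the main PSU only wakes to recharge it.
    All numbers are invented for illustration."""
    capacity_wh = 2.0          # hypothetical NiMH pack capacity
    battery_wh = capacity_wh
    standby_drain_w = 0.05     # standby electronics load
    charge_rate_w = 5.0        # PSU recharging "as fast as possible"
    psu_on = False
    psu_on_hours = 0
    for _ in range(hours):
        if psu_on:
            battery_wh = min(capacity_wh,
                             battery_wh + charge_rate_w - standby_drain_w)
            psu_on_hours += 1
            if battery_wh >= capacity_wh:
                psu_on = False                 # full again: PSU back off
        else:
            battery_wh -= standby_drain_w
            if battery_wh <= 0.1 * capacity_wh:
                psu_on = True                  # nearly empty: wake the PSU
    return psu_on_hours / hours

duty_cycle = simulate(24 * 30)   # fraction of a month the PSU is powered
```

With those made-up numbers the PSU is on for under 3% of the time, so its poor efficiency at light load simply stops mattering.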
after 3.6 billion years very little has managed to adapt itself to live in very salty water
Alternative explanation: very salty water on Earth is a short-lived and unstable ecological niche. It normally arises when salt in a body of water is concentrated by evaporation because it's cut off from the planet's oceans. Over geological time such a niche doesn't last: its limited water input will fail and it will dry into a salt pan and then become a stratum of rock salt, or a narrow channel will open wider and dilute the very salty water back down towards the norm of Earth's oceans.
Life has certainly adapted as the oceans gradually became saltier without huge short-term fluctuation. Most of the salt ever released from rock by weathering resides in today's oceans. When life was new, they were almost freshwater.
Another alternative explanation. Once life works out how to do something fairly well, that mechanism tends to persist. It's rare for a second way to do the same thing to evolve. For example, the DNA/RNA codes are much the same in the weirdest and oldest archaea and in all plants and animals. Maybe because life here evolved in (almost) fresh water, it is not well suited to very saline water and struggles to adapt sufficiently.
On Titan, where the "ocean" is all highly saline, evolution may have taken different paths of which we know nothing.
I mean, how can the market share of XP be increasing unless people are doing new XP installs?
How? Because Microsoft's overall market share is decreasing, while an increasing percentage of Microsoft's home users are those who have an old system running XP and don't intend ever to change it. (Businesses ought to have migrated to Windows 7, or even 8, before XP EOL'ed, though we know that there are a fair number that haven't finished their migration yet.)
Which feels right. An ever increasing percentage of the students at the uni where I work arrive with Macbooks rather than notebook PCs. Then there are the many domestic users who don't work with a computer but just consume media and web-browse. They'll be scrapping their Microsoft PC without replacing it, buying tablet devices (Apple or Android) instead. They bought PCs in the past only because there was no alternative.
It just confirms what someone working in electronics-store retail will tell you, that there are lots of customers saying "I want a new computer, but I absolutely don't want Windows 8". If the shop is clued up it can offer them Windows 7. If it's not, I suspect that they buy an iMac, or in a few cases get steered to Linux by a clueful relative. (Are the figures for desktop iMac vs PC available, or do iPads and iPhones muddy the waters? )
Well, if you have to preserve an XP environment to preserve various people's sanity ...
Get a modern desktop system, preferably with an SSD. Install Centos (or other Linux of choice). Install XP and all apps into a KVM VM (using a Linux LV as the XP system's "hard disk").
Advantages: you can make backups of the VM with all apps installed, so recovery after it borks itself is straightforward. (Using LVM snapshot you can do this remotely or automatically, while the XP VM is running). You can use a "network" share for the user's data, and set Linux to work safeguarding the data in it. You can configure firewalling for the poor old XP using Linux. You can be sure it'll never stop working for lack of compatible hardware. Lots of other smaller advantages.
It'll still be much faster than XP native on the old box.
BTW VMware player is slicker and easier and free as in beer but not libre ... and probably not high on VMware's list of things to maintain support for. Which is why I'd recommend KVM, if you would rather put in more effort now than risk handling a problem years down the line when your elderly relative is even less able to adapt to using anything other than XP.
Edit - on second thoughts, probably not an SSD. Lots of RAM so Linux can cache loads without starving the XP VM, and software-mirrored hard disks, so your elderly relative isn't one disk device failure away from losing his sanity. (With smartd sending you regular reports, so you can turn up with a replacement disk drive when it's needed or soon will be).
I swear western civilisation would crumble to dust if anything ever happened to all the Excel spreadsheets that appear to run most businesses...
But it already has! (for pretty much all possible values of "Anything" )
There's a theory that if anybody ever manages to understand the universe, it will abruptly end and be replaced by something less understandable. There is another theory that this has already happened, many times. My personal theory is that this explains hangovers.
The price of the hardware components has continued to fall, so why has nobody decided to build a bigger one, and why a lack of enthusiasm for upgrades?
I'd guess that the problem is that we're close to the limits of what can be done in parallel with the types of hardware we have got. Single node speed has hit the physics limits, large multiple node counts run into interconnect bandwidth limits. Energy consumed scales with the number of nodes, useful work output does not. The %marginal value of an x% upgrade diminishes as the size of the supercomputer increases. What's needed is either a hardware breakthrough on the interconnect front (much more bandwidth), or a software breakthrough that can automatically generate more efficient parallel codes than a human programmer can (if that's possible).
Nature's answer to the problem of using vast numbers of low-power processors (a brain) is interconnection much closer to a fractal dimension of three than anything we can do today.
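The diminishing-returns argument can be sketched with a toy model. The communication-cost constant is invented; only the shape of the curve matters:

```python
def throughput(nodes, comm_cost=0.001):
    """Useful work per unit time: each node loses an interconnect-traffic
    fraction that grows with node count. comm_cost is an invented constant;
    only the shape of the curve is the point."""
    efficiency = 1.0 / (1.0 + comm_cost * nodes)
    return nodes * efficiency

def marginal_gain(nodes, extra=1000):
    """Extra throughput from bolting `extra` nodes onto an existing machine."""
    return throughput(nodes + extra) - throughput(nodes)

# Energy scales linearly with nodes; throughput saturates. Doubling a small
# machine buys a lot, doubling a huge one buys almost nothing.
```

Which is the upgrade-enthusiasm problem in two functions: the bigger the machine, the less each extra node is worth, while its power bill is worth exactly as much as the last one's.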
So you would prefer paedos don't get caught
If that is the price of maintaining one's right to privacy, of not living in a goldfish bowl where the powerful can find out everything about the rest of us without us knowing until they use what they know against us ... then YES.
In practice once they know that an illegal image has been downloaded, that'll be all the justification they need for a warrant to find out who downloaded it. So what you are arguing is that they should have warrantless access to a massive database of everything that everyone has ever browsed, so (official reason) they can go trawling for criminals. Do you really believe that is all they will ever go looking for?
Oh yes, the security services already have this access. (Snowden disclosures). Today, they have to keep that access secret and can't use it except within a fairly narrow "state security" remit. They're well down the slippery slope, though. I fear that Orwell's 1984 is coming true, just 30-40 years later than he thought. (To say nothing of the Vingean nightmare of a society pushed over the edge of chaos by omnipresent surveillance, crashing back to the dark ages if not the stone age).
With respect to cyber-bullying, what's the problem? The police have a complainant who is being bullied, and an internet service provider who can tell them where the bully is. They'll just have to get a warrant for that disclosure in future. Is it being suggested that a warrant would not be granted in these circumstances?
In a city-commuter environment, is anyone really going to run them off the petrol bit unless the worst has happened and the daily charge has run out?
So give them the same urban-area advantages as pure electric cars. After all, if someone with a pure-electric car needs to make a long journey, he's going to use his other car, or hire a conventional car, so the CO2 emission will also be the same or worse. (Worse, if two cars have to be manufactured instead of one).
Back in the old days when a disk drive was the size of a washing machine and required an engineer from the computer company to install one ...
He turned up, un-crated it, and asked to borrow a phone. "Not working?" "No, and it won't. I need to call the shipping company and our loss adjusters ... go and take a look at it ..."
I did. There was a neat rectangular hole in the side. The exact size of a fork-lift-truck's prong ... right through the controller (about ten circuit boards, each a foot square).
Perhaps worth commenting that some insects can also raise their body temperature above ambient. Bumble bees are furry for the same reason that mammals are furry: to avoid excessive loss of body heat to a colder atmosphere.
Aren't dinosaurs the link between reptiles and birds? From the completely cold-blooded, to the creatures with (probably) the highest metabolic rate on the planet?
Dinosaurs were around for a long time. Plenty of time for evolution to get from cold blood, through cool, to hot. Pretty straightforward, compared to evolving feathers (a true miracle of biology, and perhaps the only radically new bio-structure since plants evolved wood? )
There's surely a good case here for an IPv4 address tax! Someone with a /8 block is likely to rapidly relinquish most of it, when a tax demand for 2^24 pounds/dollars/euros per annum arrives (about 17 million). Whereas the /30 block I have at home would cost me (via my ISP) an extra £4 per annum, which I'd happily pay. Heck, several times that wouldn't hurt much for any addresses actually being used.
Might even make ipv6 popular. People used to live in darkness rather than pay a windows tax. (NB, small W, 17th century).
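For what it's worth, the arithmetic behind those figures, assuming a purely hypothetical flat rate of one pound per address per year:

```python
def annual_tax_pounds(prefix_length):
    """Hypothetical IPv4 tax at 1 pound per address per year.
    A /N prefix contains 2^(32-N) addresses."""
    return 2 ** (32 - prefix_length)

legacy_a_block = annual_tax_pounds(8)    # ~16.8 million a year: hand most back
home_block = annual_tax_pounds(30)       # 4 pounds a year: happily paid
```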
My understanding is that it's only long (50km+) wires that are seriously vulnerable. Your rooftop solar panels are OK, and your inverter isn't directly vulnerable. I think you may be confusing a solar storm with an atomic-weapon-induced EMP.
What happens in a solar storm is that a large DC current is induced in long wires. Conventional 50Hz or 60Hz AC mains transformers can't transform that DC current; instead they dissipate it resistively, meaning they heat up. If the circuit is not made open-circuit pretty soon, the transformers then melt down and catch fire. HVDC transmission is immune - the solar storm either adds a bit more DC juice or subtracts a bit. Short urban-grid-scale wires do suffer induced currents, but less so in proportion to their shorter length. The risk to them is a disorderly shut-down or melt-down of the long-distance grid, causing voltage surges, local overload conditions, etc.
Telecomms is similar, except that it's rarely copper and even more rarely DC coupled these days. Most of the long-distance internet is optical fiber. Long-distance copper is probably found only in very rural parts, connecting one farmhouse or hamlet to the nearest town's telephone exchange many miles down the road in the old-fashioned way.
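A back-of-envelope sketch of why line length is what matters. The geoelectric field strength and loop resistance here are assumed round numbers, not measured values:

```python
def induced_dc_amps(line_km, field_v_per_km=2.0, loop_resistance_ohms=1.0):
    """Quasi-DC current driven through a grounded transmission line by a
    storm's geoelectric field. Driving voltage is field times length;
    both the field and the loop resistance are assumed round numbers."""
    return (field_v_per_km * line_km) / loop_resistance_ohms

long_haul = induced_dc_amps(500)   # 1000 A of DC: transformer-cooking territory
urban = induced_dc_amps(20)        # 40 A: proportionally milder
```

Same storm, same field, but the 500 km line collects 25 times the driving voltage of the 20 km urban feeder - hence only the long wires are seriously vulnerable.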
It's probably fair to say that if we'd been hit by a Carrington event in the 1920s through 1990s, our civilisation would have crashed.
Two things have changed / are changing. Firstly, the danger is recognised, and we now have satellites watching the sun that would give us a few hours' warning. That's long enough to prepare the grid. Controlled shutdown instead of fatal melt-down. Of course, whether they do enough "solar safety" drills to actually avoid getting the electricity grid fried, is an unknown until it happens.
Secondly, we are moving from a synchronous transformer-coupled HVAC grid (vulnerable) to HVDC long-distance electricity transmission (more efficient and not vulnerable). Likewise telecomms are moving from copper wires (vulnerable) to optical fibres (not vulnerable).
If we don't get hit by another X-unprecedented flare in the next couple of decades, we're probably OK. Except, we don't know what is the biggest flare our nearest star is capable of! The upper limit is only that it was never powerful enough to wipe out all land-based life (and there have been a few extinction events when something wiped out *most* land-based life ...).
A wimp, as these things go. The Carrington Event is estimated to have been X22 (on a scale that only officially goes to 20). So that's almost 1.5^20 times this one (around 3300 times bigger).
Good platinum ores are graded at grammes per tonne. You have to dig up, crush and chemically process a tonne of rock to get a few grammes of platinum. This is one reason why it's so darned expensive.
Bad management gets the unions and workforce it deserves. (Those who can leave, have left).
As an organisation at the other extreme, I'd cite the John Lewis partnership. Unions? Why? Everyone has an equity stake in the business, and it goes from strength to strength in a very competitive sector.
I think that a very solid line should be drawn between those who are paid a salary regardless of whether they perform excellently, adequately or badly
And those who founded a business and own some or all of the equity in that business.
Frankly, I don't see much evidence that many (any?) of the fat cats paid six- or seven-figure salaries are worth any more than the employees several levels below them. Indeed, it's usually the lower levels that do the real work, and can see how the self-perpetuating clique of fat cats more often than not have zero or negative value. They give themselves 10% or 20% pay rises, while the staff that do the work get 0%. They aren't working for a living, they are parasitising those who do!
In contrast, someone who put his own money and time into a business that is now thriving, should be allowed to enjoy whatever degree of success he is able to achieve, just as long as it is by way of dividends paid equally to all equity-holders, or sale of shares in that equity.
So (for example) I'd be in favour of higher levels of income tax on very large salaries, but not the same levels on capital gains (especially not on long-term capital gains, and especially not capital gains made by founders of businesses on equity that was worth nothing at all when they started). Anti-avoidance rules would clearly be needed to stop the fat cats playing the system.
Also there should be an outright ban on any salary greater than the Prime Minister's salary in any part of the public sector (including universities, quangos and suchlike -- not just the civil service). If a corporation wastes its money, it will sooner or later go bust. That's a crude self-correcting mechanism that eliminates the very worst excesses. Whereas if an organisation is funded by the taxpayer, its fat cats can and will carry on leeching off society effectively forever. (In the rare cases where such an organisation needs a specialist who really can command such a large salary in a free market, it should obtain that service by competitive tender, with payment under the laws governing commercial contracts, including appropriate penalty clauses. Never by employing that specialist on a salary. )
Controversial, I know. Asbestos jacket in place ....
Anyone under 35 years old won't remember. Those who do remember, can be intimidated into not talking to anyone who doesn't. Those who won't shut up, are in jail or exiled.
Never under-estimate the human ability to ignore anything that isn't aimed directly at oneself. Remember Pastor Niemöller:
"First they came for the Socialists, and I did not speak out—Because I was not a Socialist.
"Then they came for the Trade Unionists, and I did not speak out—Because I was not a Trade Unionist.
"Then they came for the Jews, and I did not speak out—Because I was not a Jew.
"Then they came for me—and there was no one left to speak for me."
The technology of censorship and repression is probably as well ensconced here in the West as in China. For now the sinister "they" lack the will to use it to completely destroy our freedom. Post-Snowden, I fear the question is, "for how much longer?"
TDP is important. It determines the type and size of heatsink the system needs. It's especially important if you are trying to build a passively cooled system (no fan, no moving parts (SSD), no noise).
It's nice to know your system can turbo for a few seconds, relying on thermal inertia to avoid meltdown, and then down-clocking itself when not busy or in thermal distress to allow the heat to dissipate. But TDP, as the maximum long-term-average amount of power that a busy system will ever need to dissipate, is a critical parameter for designing it.
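The sizing arithmetic is simple enough to sketch. The thermal resistance figures below are illustrative round numbers, not from any datasheet:

```python
def steady_state_rise_c(tdp_watts, heatsink_c_per_watt):
    """Long-term die temperature rise above ambient: TDP times the
    heatsink's thermal resistance. Figures below are illustrative."""
    return tdp_watts * heatsink_c_per_watt

fanless = steady_state_rise_c(15, 2.0)   # 30 C rise: fine for passive cooling
toasty = steady_state_rise_c(90, 2.0)    # 180 C rise: needs forced air
```

Turbo bursts ride on thermal inertia and don't enter into it; TDP is the number the heatsink must handle indefinitely, which is why it's the one that matters for a fanless build.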
Of course, no reason it actually has to BE glass topped.
Marble or basalt (if heavy, hard and shiny floats your boat). Easier to source in any size than tempered glass (and you really don't want to think about untempered).
I've just run into the annoyance of a worktop so black that an optical mouse cannot "see" it, and for the first time in my life I had to find a mouse mat (OK, a sheet of A4 paper). Give me wood or wood-effect laminate any day.
The assumption here is that if these companies didn't buy gold known to come from NK, NK would not be able to sell its gold, or would have to sell it at a huge discount.
A moment's thought should tell you that's not the case. A minuscule discount will suffice to sell their gold to a country or person who doesn't care, and once their gold is melted with gold from scrap jewellery or scrap computer parts, nobody will have the faintest idea where it came from. (Not that the Chinese even care).
Sanctions can only work for lower-value stuff, especially commodities where the refineries are few and specialized. We can probably avoid buying tantalum ores from war zones where it is dug out by slaves. We can track tankers full of Iranian oil and force them to sell it to customers further away than Europe, causing Iran a small percentage loss (at the cost of increasing global CO2 emissions!) For gold, there's no chance of anything like this working.
No - saying C is not the simplest or sparsest language. That honour surely goes to LISP, and B was simpler than C. Simplest and sparsest is not the reason C is very popular in some programming communities. Like most successful languages, C has a niche, which is the writing of operating systems and realtime systems. I'll also grant that until computers became fast enough that interpreted languages weren't "too inefficient", C was probably the best general-purpose compiled language. (FORTRAN was and remains better for numerical coding, but only for numerical coding. Pascal, PL/I, and Ada never really caught on. I'll let someone else talk about C++ if they want to, because personally I loathe it).
I think it's a mistake.
Dealing with a small set of foreign glyphs that are universal in a global programming community is far better than the fragmentation that arises if every programmer uses their own script for their variables. It'll compile elsewhere, but it might as well be object code for all the use that the source will be outside that linguistic domain. I'll add, anyone who studies mathematics gets to learn the Greek alphabet, a few letters from the Hebrew one, and a handful of symbols not taken from any alphabet (eg union, infinity, ...). It doesn't give Greeks or Israelis any mathematical edge.
I can imagine an alternative universe in which North America was settled by Russians. In that universe, the Cyrillic alphabet might be used globally by programmers. I'd be able to go along with that: learning to recognise a handful of new glyphs isn't hard.
But learning 6000+ traditional Chinese glyphs in order to code: no way. I'd rebel and create a programming language based on the Latin alphabet. As for those in the far East ... well, China, Japan and Korea have all chosen to map their languages onto the Latin alphabet. Because we got to IT first, or because there are intrinsic advantages to our small alphabet over their huge ones? Don't know, but in China, this happened under Mao when the West was the Enemy, and before IT arrived there.
C is one of the most simple and sparse languages there has ever been. That's why it works.
Oh really? So why hasn't it been universally trumped by LISP? (And for that matter why did they ever do C, given B? )
C is one of a small set of languages in which it's possible to write a useful operating system kernel. Don't knock it. But also don't use it, if you're not writing something that requires OS-like control over the fine detail of the generated code. And for heaven's sake don't teach it as a first language.
Works out pretty well in Python. Given tuple assignment you don't often need semicolons, but you can put multiple statements on a line if you want to.
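For anyone who hasn't met it, both features in a few lines:

```python
# Tuple assignment covers most jobs people reach for semicolons for:
a, b = 1, 2
a, b = b, a              # swap with no temporary variable

# But multiple statements on one line are still allowed when wanted:
x = 10; y = 20; z = x + y
```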
As for Swift, I lost interest the moment I noticed that variable names are unicode strings not ASCII-alphanumeric strings. Bleugh. Immediate fragmentation of the programming world into human-written-language-script communities. I can process code written by (say) a Frenchman or a Finn. The variable names may be less helpful than ones created by a Canadian or an Aussie, but at least the necessary processing skill is there in my visual cortex. Which it is not, for a string of Chinese, Japanese, Korean, Tamil, or umpteen other possibilities.
To say nothing of the fact that there are multiple unicode strings that generate the same visual representation (such as an e with an acute accent). It's bad enough dealing with O and 0, 1 and l and I. FAlL.
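The e-acute case is easy to demonstrate with Python's standard library:

```python
import unicodedata

precomposed = "caf\u00e9"      # e-acute as a single code point (U+00E9)
combining = "cafe\u0301"       # plain 'e' plus combining acute (U+0301)

# Identical on screen, but different strings - so different identifiers:
visually_same_but_unequal = precomposed != combining

# Only after normalisation (NFC composes the pair into U+00E9) do they match:
normalised_equal = unicodedata.normalize("NFC", combining) == precomposed
```

A compiler that doesn't normalise identifiers will happily treat those as two distinct variables, which is the O-versus-0 problem with the volume turned up.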
Anyone notice that one of this beastie's pins is designated PORN?
Thanks. That's what I wanted to know. They have cracked vertical stacking on a single piece of silicon. Call that stage 1: gaining access to the third dimension.
So the stage is now set for stage two: work out how to stack more than 32 deep, and how to stack smaller cells. Neither is up against physical limits, so Moore's law ought to get a second chance, and the Terabit or Terabyte SSD chip may be only a few years away.
Unless HAMR comes to the rescue (i.e. 100TB drives), it looks as if spinning rust drives may go the way of the horse and cart in not much more than a decade.
On the other hand -- if they've really cracked fabricating 32 devices stacked vertically (as opposed to making 32 separate devices and merely assembling them into a vertical stack), the price of large SSDs may be set to fall something like vertically?
Which is why I'd also like to know more. One thing for sure, I wouldn't much like to be in the hard disk business these days.
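If capacity really does scale with layer count, a quick extrapolation, starting from a hypothetical 128 Gbit, 32-layer die (illustrative figures, not any real part's spec):

```python
import math

def layers_needed(target_gbit, base_gbit=128, base_layers=32):
    """Layers required for a given die capacity, assuming capacity scales
    linearly with layer count from a hypothetical 128 Gbit / 32-layer part."""
    gbit_per_layer = base_gbit / base_layers
    return math.ceil(target_gbit / gbit_per_layer)

terabit_die = layers_needed(1024)   # layers for a 1 Tbit chip
```

On those assumptions a terabit die needs 256 layers - three layer-doublings away, with cell shrinks on top of that. Hence "only a few years away" isn't crazy talk.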
No - these stats are based upon browser agent strings visiting a broad range of sites.
Someone, please, please write a benign virus that alters these strings! Wouldn't it be fun to watch Windows for Workgroups rising from the grave?
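For the curious, this is roughly how such stats fall out of agent strings. The NT version tokens are Microsoft's real ones; the sample strings are made up:

```python
import re

# Windows NT version tokens as they appear in browser user-agent strings.
NT_VERSIONS = {"5.1": "Windows XP", "6.1": "Windows 7", "6.2": "Windows 8"}

def detect_windows(user_agent):
    """Return the Windows flavour a user-agent string advertises, or None."""
    match = re.search(r"Windows NT (\d+\.\d+)", user_agent)
    if not match:
        return None
    return NT_VERSIONS.get(match.group(1), "unknown Windows")

xp_hit = detect_windows("Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36")
```

Which also shows why the stats are trivially spoofable: the "OS" being counted is whatever string the browser chooses to send.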
Microsoft doing that would create an opening for "Business Linux" (possibly hidden behind a non-Linux name, just as Linux's conquest of the mobile world goes by the name "Android").
Microsoft has jettisoned its CEO just in time, now to see if it can also jettison the business plan made out of FAIL.
almost every device i think of already existed in some form or another before it became popular with the masses.
The original Sony Walkman? (Yes, miniature tape recorders pre-existed, but not for playing music to consumers as they went about their lives).
When these things hit the road, the "drivers" won't need to see out. So maybe they'll go for privacy instead, and paint over all the transparent bits. Blue, maybe.
Giving "BSOD" a whole new meaning?
While the elephants duel, it's the authors who are getting trampled.
Not really embarrassing for Red Hat. They do servers. They don't really claim desktop Linux (although personally I'm happier with my Linux desktop atop a Red Hat clone like Centos, than atop Ubuntu or SuSE).
BTW if you do run Redhat or similar on desktops, when you migrate them to 7, I'd recommend overriding the new default (XFS) and keeping ext4 as your root FS. I wouldn't entirely trust XFS in an environment where the electricity supply is unreliable (ie, where lusers have fingers on the power buttons).