Re: The Ghost of Tommy Cooper
Or Terry Pratchett's
Light a man a fire and you'll keep him warm for a night. Set a man on fire and you'll keep him warm for the rest of his life
>The issue with USB-C is power and signalling on the same pin
Can you expand upon that, Horridbloke? On every Type C pin-out diagram I can find, power and data are on separate pins.
There are some further pins called 'USB power delivery communication' but they are just data pins dedicated to communicating power draw and the like.
> Nice idea but from the article: "Once USB-C becomes ubiquitous and makes a single wire responsible for carrying power and data..."
The article means *cable*, not *wire*. USB-C still has dedicated power pins, discrete from its data pins. The 'short length of cable' I referred to would be one modified so that only its power pins were still connected. Such a solution would give the cautious / paranoid user more peace of mind than any software approach.
A USB C pin-out diagram is here:
>What was wrong with those chargers with just a pin connector...
Not a lot - they were very ergonomic, much easier for people with limited dexterity or eyesight to use than microUSB. However, such people would benefit even more from charging docks or wireless charging solutions.
Ultimately, phones have got smaller, so designers have looked at ways to save space. Phones needed a data connection anyway, and then the EU mandated microUSB.
Most of the pin connectors were hard-cabled to the older, inefficient sort of power adaptor - the kind that was heavy and got warm during use.
Ha, I even remember a mate's Nokia that had a pin connector for charging and a mini USB socket for data - but it wouldn't charge over miniUSB, which was just frustrating.
Some Sony Xperia phones had a similar feature - two external nubbins mounted on the side of the phone, for charging from docks. Of course they required a non-standard cable or dock to use, so they aren't directly applicable to the scenario sketched out here (i.e. you want to use an untrusted but common power plug).
Another possible method:
- Carry a short length of USB male-to-female cable that only has the power pins connected. For the next few years, this would be a handy cable anyway, because it could be microUSB female to USB-C male, thus allowing owners of new phones to use a common microUSB charger.
A method similar to Doug S's has been implemented before - I've had gadgets that connect for power only (at a higher draw in mA) when turned off, and connect with data when turned on. It used to be (in USB 2) that many devices would charge more quickly on cables with the data pins shorted (AFAIK the thinking was to limit the draw gadgets would make on a host PC's USB bus).
I am slightly wary of not being able to access a device's storage by USB if a hardware button is broken, but TBH that is the situation at the moment (to access the internal storage of an Android phone with a damaged digitiser you need to use USB OTG to unlock it - if you haven't previously turned on USB debugging).
Don't worry, new phones are usually available cheaper than their list price. Often by a big chunk, like £150. Do check real retail prices (from reputable sites, of course!) when drawing up your final short-list.
>No removable battery = dead brick after 2 years that can't hold charge
No, it means you pop down your local electronics emporium (or flea market stall) and pay someone £20 to fit a new battery. Against the cost of a phone, it's not a king's ransom.
>No removable battery = can't go away for a weekend without power
Use an external battery pack or two. Probably no pricier than buying a dedicated phone battery (which itself is only a good investment for the original phone, which you might choose to upgrade for other reasons). Also handy for other camping gear, such as mp3 players, cameras, speakers, kindles etc.
>No removable battery = a guarantee to kill the phone if it gets wet
Just buy a waterproof phone. Any of the well-regarded Xperia Z models, some of the recent Samsung flagships, some other big names also use a limited form of internal waterproofing but don't advertise it. Or buy a waterproof case. I can't think of any non-waterproof Android phone that can do what its aquatic cousins can't. The best screens and cameras on Android handsets are all available on water-resistant models.
I have a Z3 Compact, and the battery life is excellent. However, the two days' battery life largely comes from 'Stamina Mode', which means the phone isn't using its data connections all the time. The side effect, which doesn't bother me, is that you might not receive social media notifications until you manually turn the screen on. (And turning the screen on can be done with a double tap - one of those features you only appreciate when using a handset that doesn't have it.) Such a Stamina Mode has come to stock Android, I believe.
I imagine that the smaller, lower resolution screen of the Z* Compact phones helps too.
One of the HTC phones from a couple of years back was offered in a Google Silver edition (ODM-branded phones but sold through Google with vanilla Android and a guarantee of a certain period of prompt updates), so its HTC-only sister-handset also received prompt updates.
>you remember that you don't have to pay for the privilege of using cash,
We do pay to use cash, but just not directly. There are costs involved with minting, regulation, sorting, handling... those metalsmiths, accountants and security guards still need to be paid.
>Actually, the really rich buy good land.
>>Why only the really rich?
In addition to our summer and winter estate, [my father] owned a valuable piece of land. True, it was a small piece, but he carried it with him wherever he went.
- Woody Allen, Love and Death
>Im not entirely sure its possible to calculate how many bitcoins are technically out of circulation because of this.
I don't think it is - Bitcoins might be sat there doing nothing for all to see, but we won't know if the owner has lost the key or if the owner is saving them for a rainy day.
Your comment did make me think of how (or if) minters of real cash account for inaccessibly lost or destroyed coins or notes. Anyone here have any ideas?
>That blockchain is currently approaching 65GB in size. ... This makes getting into Bitcoin in a serious way rather a PITA,
That's another interesting issue. I'm sure it's been discussed by people who know what they are talking about, unlike me. As a layman, I imagined that a solution could involve torrents, and hashes made so that you know you're not downloading a doctored segment. Upon googling it just now, it seems that some proposed solutions do involve Bittorrent-like systems and hashing, but with some extra cleverness that starts to make my brain hurt:
I'm no expert, and my interest in this is only idle.
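For what it's worth, the per-segment hashing part of that idea is easy to sketch. This is only a toy illustration (the segment contents and indices are invented), not how Bitcoin actually distributes its blockchain:

```python
import hashlib

# Published hashes for each segment, obtained from a trusted source
# separately from the (untrusted) segment data itself.
expected = {
    0: hashlib.sha256(b"segment zero bytes").hexdigest(),
    1: hashlib.sha256(b"segment one bytes").hexdigest(),
}

def verify_segment(index, data):
    """Accept a downloaded segment only if it matches its published hash."""
    return hashlib.sha256(data).hexdigest() == expected[index]

assert verify_segment(0, b"segment zero bytes")    # genuine segment accepted
assert not verify_segment(1, b"doctored segment")  # tampered segment rejected
```

The same principle is what lets Bittorrent clients download pieces from strangers without trusting them.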
>Why do you believe it is particularly difficult to control over 50% of the computing resources in the community?
I don't, especially in a nascent community.
The point I was trying to make is that a blockchain - properly designed and audited - can't be commandeered by an individual with some sort of magic master key.
I might have sounded dismissive of the 50% weakness, but then it doesn't bother me personally - I've never mined or bought Bitcoin - and discussing it doesn't help the OP at this stage, because his misunderstanding was a little more fundamental. (It's also worth noting that Bitcoin is but one implementation of the blockchain concept, and whilst Bitcoin gives a massive advantage to ASICs for mining, blockchains based on other algorithms still allow CPUs to be competitive. GPUs haven't been worthwhile for a while now.)
> if the solution has a part of it encrypted, someone has to have the key and that someone is therefor a weak link.
Um, no. Er, it can be a bit hard to wrap one's head around the blockchain concept, but I found most helpful the explanations that take the form of talking through the steps needed for Alice to send Bob a digital credit note (without sending the same note to Charlie, Dan, Edwina, et al). Try this one: http://www.coindesk.com/bitcoin-explained-five-year-old/ (and please, genuinely, forgive the title, I don't mean to be condescending)
In answer to your specific point, the blockchain can't be changed by a single person with a key because the blockchain is stored by every participant. A hash of the existing chain is then used in generating the next block, every few minutes. To make a retrospective illegitimate edit to the ledger, you would have to brute-force many successive generations of hashes, all within minutes. And even if you managed that impossible feat, the ledger that everyone is storing would look dodgy, so you would have to control over 50% of the computing resources in the community in order to force a consensus.
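That chaining can be sketched in a few lines of Python. This is a toy model (SHA-256 over concatenated strings, with invented transactions), not Bitcoin's actual block format, but it shows why a retrospective edit invalidates every later block:

```python
import hashlib

def block_hash(prev_hash, payload):
    """Each block's hash covers the previous block's hash plus its own payload."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

# Build a tiny three-block ledger.
genesis = block_hash("0" * 64, "Alice pays Bob 5")
second = block_hash(genesis, "Bob pays Carol 2")
third = block_hash(second, "Carol pays Dan 1")

# Tamper with the first transaction: its hash changes...
forged = block_hash("0" * 64, "Alice pays Bob 500")
assert forged != genesis
# ...so every subsequent block's hash changes too, and the forged
# chain no longer matches the copy everyone else is storing.
assert block_hash(forged, "Bob pays Carol 2") != second
```

Real chains add proof-of-work on top of this, which is what makes recomputing the forged blocks computationally infeasible rather than merely detectable.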
You can make an alloy where the electrons exceed the speed of light in the alloy*, but you can't make an alloy where the electrons exceed the speed of light in a vacuum.
You didn't notice that the example use-cases were all situations in which goggles and hard-hats, or other pieces of PPE, are commonly worn anyway?
> it strikes me that someone would need to code up the displays you describe.
It would simply be a case of selecting the two physical vertices of interest (either by touching them with a finger or pen, or by highlighting them on a superimposed wireframe display), and 'pasting' them onto the material you want to cut, then dragging the sketch until it 'snaps' to the corner or edge of the board.
All of this operation could be done with a UI similar to that already used in CAD. The conventions of selection, filtering, sketching, constraints, snapping etc are still the same. It just means that real, physical objects in front of you can be used as reference objects.
You're confused, Mage.
Input to output lag is not the same thing as raw computational power.
Input to output lag is a function of the system design as a whole. In this case, the input is head motion and the output is rendered content - 2D desktops and 3D polygons. What the Hololens device is not trying to do is render millions of polygons to provide eye candy for a gamer ('console-level' graphics), as Oculus or Sony's PSVR efforts seek to do. Rendering CAD models (which typically contain far fewer polygons than a virtual environment in a video game) at 90 fps is easily doable with tablet-grade silicon. CAD software has for years been designed to simplify the display of large assemblies to match the hardware, and the video game world has other tricks that can be used - see http://www.engadget.com/2015/09/15/halo-5-frame-rate-resolution/. For sure, it will be custom silicon designed for minimal latency, but its power consumption and thermal design parameters will be comparable to what's currently available on tablets.
So input latency won't be an issue due to the GPU. Also, you've failed to note that the Hololens display doesn't fill your field of view - it is equivalent to a 15" monitor - so nausea is very unlikely.
I haven't mentioned the Kinect-like object tracking systems on the Hololens (again, specialist silicon with a very high data throughput) with regard to nausea, because this work only needs to be done to build a scene - it is the head-tracking which keeps things where they should be, so the depth-sensing isn't as crucial. Real-time object tracking has also been done by Google's Project Tango, and is about to be shipped in a phone by Lenovo. Intel have been at it for a few years now, and made an acquisition in this area only last week. nVidia too have set their sights on machine vision processing.
>How would a phone or tablet have the computational power to avoid lag in movement?
Very easily. It only takes a lot of grunt if you want to shift 4K prettily rendered pixels around at 90 frames per second. If you just want to display more simply rendered polygons at a lower res - or even 2D schematics marking out the location of electrical cabling, say - the hardware required is far more modest.
'Lag' is not directly related to computational power per se - in the same way that bandwidth isn't the same thing as ping time.
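To put rough numbers on the point above (the per-second polygon figure is made up purely for illustration, not a measured spec of any device): the frame-time budget is fixed by the refresh rate, and what the hardware determines is how much rendering work fits inside it.

```python
FPS = 90
frame_budget_ms = 1000 / FPS  # ~11.1 ms per frame, regardless of GPU power

# Illustrative throughput for a modest mobile GPU (invented number):
polys_per_second = 10_000_000
polys_per_frame = polys_per_second / FPS  # ~111,000 polygons per frame

assert round(frame_budget_ms, 1) == 11.1
# Plenty for a simplified CAD view, far short of a AAA game scene.
assert polys_per_frame < 200_000
```

So a low-latency system is about consistently hitting that ~11 ms deadline with a modest workload, not about maximising raw throughput.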
If the equipment was needed to keep production going, then knocking down a wall - and charging the cost to the supplier - would be the quickest and most cost effective solution. Regardless of who made the cock-up, the situation needs to be fixed. But hey, you're right, it's a pretty miserable Monday morning!
It was only an example - the point remains the same, that it is a tool that can aid collaboration between (in the example) a structural engineer, an architectural systems engineer and a contractor. So they can discuss which wall will be the least headache to remove, given the location of services like electrical cabling and water, the height of lifting gear, that sort of stuff.
All modern buildings are designed in CAD anyway, and the Unity engine is already commonly used for architectural walk-throughs. The expected price of AR kit like this is tiny compared to the CAD display systems of the past, and it offers some demonstrable advantages. And that is just one sector, architecture and building.
>I'd be a little sceptical about the claimed half-life.
I'd rather look at the evidence myself - and an internet search isn't that bothersome, is it? A half-life of 500 years has been observed in DNA from the bones of moa - extinct birds - dated at between 600 and 8,000 years old, preserved in similar conditions.
Were this DNA archival process ever to be used, there is no reason why the archive couldn't be based somewhere cold - much like the Svalbard Global Seed Vault. (https://en.wikipedia.org/wiki/Svalbard_Global_Seed_Vault#Construction )
Then of course error correction methods and redundancy can be built into any DNA-archival process.
The mutations you mention are those seen in living cells, and usually occur during the copying stage (and yeah, our cells have several error-correcting mechanisms) - but this is very different from inert strands of DNA that have been removed from the molecular machinery.
>It also sounds like it's not random-access either.
Read the article again!
>We propose a method for random access that uses a polymerase chain reaction (PCR) to amplify only the desired data, biasing sequencing towards that data. This design both accelerates reads and ensures that an entire DNA pool need not be sequenced.
>We demonstrate the feasibility of our system design with a series of wet lab experiments, in which we successfully stored data in DNA and performed random access to read back only selected values.
Yes, Musk deliberately called the drone barges in honour of the late Iain M Banks.
From the horse's mouth: https://twitter.com/elonmusk/status/558665515351019520/
Yep, they were both the same person. His novel Transition is sometimes credited to Iain M. Banks, sometimes to plain Iain Banks, depending on where it was published. Iain was not related to Rosie M. Banks, a fictional romance author who appears in P.G. Wodehouse stories. Though fictional, her name was borrowed as a pseudonym for real romance novels, and Iain's publishers initially didn't want to risk any confusion (not that anyone would confuse The Wasp Factory with Barbara Cartland-esque books). For a bit of fun, the two desk sergeants in the film Hot Fuzz, who are both played by Bill Bailey, can be distinguished by noting that one reads Iain Banks books, the other Iain M. Banks.
There are quite a few design decisions made for cost reasons. Yes, reusing design elements and modules across stages reduces cost. Another example is that instead of machining material out of the tank walls (time consuming and expensive), friction stir welding is used to add material.
>Remix OS is not FLOSS
The 4.6" Xperia Z3 Compact is significantly wider than the iPhone 5 S/SE
(I have the Z3 Compact, a couple of friends have the iPhone 5 S)
I seem to recall that the PS4 is more or less a standard x64 PC and GPU, but with a bit of AMD's shared memory gubbins or something...
I have read Linux users lambasting AMD on forums for not releasing great graphics drivers, I don't know what the current state of affairs is in this regard.
>This was obvious at the time, but for some reason, not to journalists.
Hehehe! I remember reading at the time, in a dead-tree edition of Wired, "Twenty Things Apple Must Do To Survive" or somesuch title. Jobs then did the opposite of damn-near everything Wired recommended, and the bottom line has vindicated him.
( Recently Wired.com has blocked articles unless I turn off my adblocker - I stopped visiting Wired, which is a shame cos it's good for a laugh from time to time. Curiously, I didn't have a complete adblocker extension running - I see all the Reg ads - but I do have Ghostery installed. )
And yet Woz ended up with so much money that he was giving it away.
There are various ways of looking at morality, but if you start throwing figures at the question then we might be inclined to look at it in a pragmatic fashion.
If someone steals my wallet containing £20 but after a few years gives me a suitcase full of cash, I'm not sure that I would be too negative towards them. I guess it depends on the context, such as if I was going to use that £20 to take a lovely lady on a date.
It's just a MK I thing.
- First iPod: FireWire only, Mac only. 'Only' 5GB. Three times the price of an MD player.
- First iPhone: no 3G.
- First iPad: didn't receive nearly as many OS updates as the Mk II product.
It isn't just an Apple thing. Sony, internally, saw product range lifespans as being like a day, from sunrise to sunset. Mk I was 'build it any way you can'. Mk II was the 'iconic' version, more refined, and bought by more people than just the first adopters. Sony would then produce themes and variations. And yes, Steve Jobs had a friendly relationship with the top executives at Sony. In turn, the father of the PlayStation and VAIO was a fan of Esslinger's Mac design language, and Sony's design team would use Macs. (Source: Digital Dreams. The Work of the Sony Design Centre - ISBN: Google it yourself)
What we forget in the UK is that Apple only sold the iPhone through one network, and insisted that they didn't take the piss on data charges.
>I'd much rather thank Acorn for that since - as you mention in the article - it was their baby.
The company was founded in November 1990 as Advanced RISC Machines Ltd and structured as a joint venture between Acorn Computers, Apple Computer (now Apple Inc.) and VLSI Technology. The new company intended to further the development of the Acorn RISC Machine processor, which was originally used in the Acorn Archimedes and had been selected by Apple for their Newton project. Its first profitable year was 1993.
And as the article notes, Acorn weren't the only RISC designers in town at the time, but were cheaper than the competition.
> What about refusing to let developers build apps for the iPhone for the first year or two?
What about it? It didn't hurt the adoption of the iPhone by the people Apple wanted to buy it. Remember that the 1st gen iPhone was not great compared to the second version with 3G - the first iPhone had too many compromises, but it acted as a good statement of intent.
>Or that crap "music phone" from Motorola that ran "iTunes" in some f'ed up Java mobile environment?
Again, what damage did it do Apple? Not many people bought them, and if anything it might have been useful to Apple in muddying the waters around the 'Apple phone' rumours. This was at a time when most phones from Sony Ericsson, Samsung and Nokia didn't even provide a 3.5mm headphone socket, cos you were supposed to use whatever headset the phone shipped with. Urgh. Samsung were competing on the 'world's thinnest' slider and candy bar, Motorola were competing on fancy materials, SE were using their Walkman and Cybershot branding, Nokia seemed a bit confused...
Jobs killed off the Mac clone program, but he was willing to make an exception for Sony VAIOs. Sony, however, had already invested too much time in Windows VAIOs to switch to OSX.
>And another example is the amazing Magic Trackpads and the new interface it brings to the desktop.
That is a good example of how to bring some ideas from a touch OS (iOS) to a desktop OS. Bringing gestures to OSX didn't stop anyone from using OSX in the traditional way with a conventional mouse - if they so wanted.
I use Windows on my PC with a 'Hyperglide' Logitech mouse, and it works well for me. However, I find it very frustrating trying to scroll using a cheap Windows laptop's trackpad. The Logitech software also emulates what on OSX is called 'Expose' - a press of one of the mouse's many buttons, and all my open windows are presented in a grid. I've grown very used to it.
Maybe a detached hand, à la 'Thing' from the Addams Family, is the 'next big thing'. It could type for you, fetch items, and in times of privacy perform more intimate functions...
Homer: What's this Bear Tax?! Let the bears pay the Bear Tax, I pay the Homer Tax!
Lisa: Er, Dad, that's the Home Owner Tax
>Isn't ground breaking technology usually call a shovel?
I'd call it a spade. A shovel is used for, er, shovelling material that is already loose like sand, snow or gravel, whereas a spade will cut into mud, clay, turf and the like.
Harder substrates - rocks - will call for picks, bars, explosives and other handy tools.
I don't need any of the current smart-watches, but I might consider one in the future if it offers some basic functions without any of the current downsides.
The Apple Watch does too much for my tastes, and its most interesting feature - ApplePay - doesn't require the power-hungry colour screen. It is not ApplePay itself I find interesting, but the concept of a device I can potentially use in place of physical keys, cash-cards and passwords (I prefer to carry cash, but it's reassuring to have a backup).
The simple functions that I would like on a watch include being able to 'page' my phone (for when it drops down gaps in sofas), receive notifications, and use the watch to control media playback on another device. These features can be implemented on watches that boast a battery life of over a year - as Casio and Citizen have demonstrated.
My other criteria are more my personal taste: a small watch, simple dial, waterproof, stainless steel, sapphire crystal, rotating bezel. A bezel could function as both rotary input (volume control etc), and also as Direction Pad (up down left right - or Fwd, Back, Pause etc) without one's fingers obscuring the display.
They may be seen as being slow to innovate, but really there is no point in 'innovating' for innovation's sake - iPods, iPhones and iPads were all refinements of existing products. Indeed, I get the impression that Apple didn't really want to release a watch until batteries and SoCs improve, but felt they had to stake their claim on a nascent market.
Again, it should be self-evident why anyone who thinks they know what the Next Big Thing is will be keeping it to themselves. Our home computers are fast enough, and have been for some time. Our mobile devices are almost as fast as our computers. There are some new forms of man/machine interaction around - LeapMotion, RealSense, Kinect, Project Tango - but none that yet feel like a must-have.
Agreed, Cook isn't trying to remake himself in Jobs's image. Whilst Jobs didn't get involved in social issues, Cook has been outspoken on issues like gay rights. Cook's Apple is already different to Jobs's Apple.
>I fell for hype and bought one. I soon got a better computer.
I would have thought that most people who bought a computer in that era soon bought a better computer! :)
Heck, it's only in the last few years that I haven't felt the need (have seen no huge benefit to my use of productivity software) to upgrade. For most of the last twenty-five years, none of my computers has quite seemed quick enough, but the five-year-old model I'm using now is alright!
>Steve Jobs wasn't interested in the user experience. He was interested in the perceived value of his products.
The two are not mutually exclusive. In fact, the easiest way to up the perceived value of your products is to make sure they have some value to begin with.
Felix Dennis had the same model of microwave in each of his homes around the world. This was because he couldn't be arsed with relearning how to heat some food. Being wealthy, he could afford to remove such minor annoyances from his life. Some microwaves are easy to use, some are just unaccountably awkward.
Jobs did care about the user experience - in the products he used himself (as Mercedes-Benz and Sony will testify), as well as those his company sold. If you are going to differentiate your products, it makes sense to differentiate them in an area you care and think about. If you are overly sensitive to shitty, careless product design, then use it as an asset. This is no less true just because Jobs also wanted to make lots of money (though his first billion came about by accident, because he financially supported the animation side of Pixar when really he wanted their hardware to be adopted by hospitals).
Of course, the PCs I steered my dad towards buying in the nineties were for gaming, where the more MHz and MBs the better the user experience (in this case, the user experience was shooting hellspawn in Doom at a decent framerate) - you'd want them to be as high as possible for the £. So most PCs were sold on those numbers, and money was saved everywhere else - there was simply no motive for a company to invest money in smoothing off the rough edges. Were these 486-era PCs user friendly? Hell no. And whilst I learnt some skills and aptitudes as a teenager which have since been useful to me, I would have had sympathy for someone who just wanted to write and print a letter, for example. I also used Acorn Archimedes and Macs from LC IIs to PowerPC models in school, and consoles from Sega and Nintendo, and there was plenty to appreciate in them.
There is plenty that Apple has done that isn't mere fairy dust, and that offers tangible benefits to the user experience. Would Jobs then (maybe over-) sell it? Yeah, that was his job. That should be the job of anyone in his position.
Good design costs time and money, and for a company to make that investment it has to see a return.
I've never owned a Mac, iPod, iPhone or iPad, so perhaps I'm more familiar (and familiarity breeds contempt) with the occasional problems and rough edges of competing products - DOS and Windows PCs, iRivers, Androids. I've encountered so many clumsy and arbitrary design choices I've lost track. Like many people here, I have the experience to skip over many of these issues, but for many laypeople they appear more like hurdles.
>MS manage to patch Windows on an even more diverse range of hardware.
Desktop PCs have a BIOS, and were always designed to run a choice of OS. That was, and remains, the norm - PCs made up of different bits of hardware. You can get Windows running, then go looking for dedicated hardware drivers.
NT OS/2 (then NT 3.1 > 4 > 2K > XP > poo > 7. I stopped at 7) was designed to run across different architectures, too (MS were feeling threatened by network-capable OSs and RISC chips).
>I dont understand why Google cant send out updates themselves?
It's because of the nature of Android and the hardware it runs on. There is no equivalent to a BIOS on Android hardware, and each version of Android has to be crafted for an individual device and its components.
For a new version of Android, Google release the code to the chipset manufacturers, eg Qualcomm. They in turn, if they decide to support the new version of Android on a particular SoC, release a binary blob, a 'Board Support Package', to the handset OEMs, such as Samsung. The OEMs, if they can be arsed, then build the new version, test it, send it to any relevant carrier partners (yep, carriers are still faffing around with phones) and regulatory authorities for testing. Rinse, repeat etc.
If Google were to create Android today from scratch they would do it differently, as they have with ChromeOS. As it was at the time, Google were racing to deliver a competitor to iOS.
Google have implemented a bit of a half-way house - they have brought more services and APIs into their Google Play Services, which can be updated just like any other app.
>Modern "smartphones" are designed for a business case that is incompatible with security. They are built to sell apps.
iPhones are built to sell apps and high-margin hardware. Android is built to sell advertisements.
Apple make their money from hardware, and a 30% cut of app store sales, and a cut of 'virtual magazines', music, videos and other content. Google make their money from advertising. Plenty of studies have shown that iOS users are far more likely to buy apps compared to Android users - which is really what you would expect: People who pay £600 for a phone instead of £300 tend to be those with spare money, thus are more likely to spend money on an app without adverts.
Each coloured block represents a handset/OS. MotoG is shown several times, but the MotoG name was given to each of a succession of phones.
The colour represents the OS. The size of the block represents market share.
>Like on this one, for example: https://en.wikipedia.org/wiki/Tupolev_Tu-114
Re the noise cancelling: wouldn't it be much easier to use loudspeakers?
Part of the patent is the idea that if we can't eliminate all the noise, we can at least modulate the noise to convey a message or otherwise sound less annoyingly like a mosquito. If the fans can produce a noise similar to 'Watch out!', less powerful, lighter speakers can be used to fill out the rest of the frequencies (in effect they are adding to the propeller sound, not fighting against it).
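The underlying trick is just phase inversion. A toy Python sketch (with a made-up tone frequency and sample instant) shows that a perfect anti-phase copy cancels a tone, while a weaker one merely attenuates it - and that residual headroom is what lets you shape the remaining sound rather than fight it:

```python
import math

def sample(freq_hz, t, amp=1.0, phase=0.0):
    """One sample of a sine tone at time t (seconds)."""
    return amp * math.sin(2 * math.pi * freq_hz * t + phase)

t = 0.00125  # an arbitrary instant, chosen so the 200 Hz tone is at its peak
noise = sample(200, t)                     # the 'propeller' tone
antinoise = sample(200, t, phase=math.pi)  # inverted copy from a speaker

# A perfect anti-phase copy sums to (near) silence...
assert abs(noise + antinoise) < 1e-9

# ...while a weaker anti-phase copy only attenuates the tone, leaving a
# quieter residual that can be shaped rather than eliminated.
partial = sample(200, t, amp=0.8, phase=math.pi)
assert abs(noise + partial) < abs(noise)
```

Real active noise control has to track phase across a whole spectrum in real time, which is where it gets hard; this only illustrates the single-tone principle.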
I don't like noisy PCs. A few years ago I had the idea that if the CPU cooler couldn't be made silent, then it could at least be made to sound more relaxing. The idea was that if it sounded like a purring cat instead of an out-of-breath asthmatic, I would be more relaxed for the same amount of airflow. (Since CPUs today generally have a lower TDP than they used to, I haven't bothered pursuing it.)
The concept is very plausible. You only have to look at YouTube videos of people making music with stepper motors to see so.
I don't know - drones are already used to deliver small items to prisoners! The unregulated free market (drug dealing) doesn't lie etc etc
/tongue in cheek
Biting the hand that feeds IT © 1998–2017