Actually, dolphins are pretty thick. (Compared to humans. Maybe not compared to chavs...)
https://www.sciencedirect.com/science/article/pii/S0306452213006234?np=y
Incidentally, I almost bought the old version earlier today, but I was dismayed by the sheer number of one-star reviews on Amazon (http://www.amazon.com/Personal-Cloud-Storage-Share-Photos/product-reviews/B0047FL85U/ref=cm_cr_dp_qt_hist_one?ie=UTF8&filterBy=addOneStar&showViewpoints=0)
Apparently /some/ people experience serious performance problems.
Another question:
- Does it support the rsync protocol out of the box?
Well... You don't always get open-topped blondes in my neck of the woods....
But I assume that this thing can export the resulting .gpx files, which can then be uploaded to Strava without any problems. (I wish the review went into these details. Also, how does it connect to the PC? Does it show up as 'mass storage', or does it need special drivers that won't be available for Linux?)
(I typically use my phone for bike logging, but it's harsh on the batteries. And this thing does have the advantage of being mostly waterproof.)
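If it does spit out standard GPX, sanity-checking a file before upload is trivial, since GPX is just XML. A quick sketch, assuming GPX 1.1 ('ride.gpx' is a made-up filename):

```python
# Pull the trackpoints out of an exported GPX file (GPX 1.1 assumed).
import xml.etree.ElementTree as ET

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}
root = ET.parse("ride.gpx").getroot()  # hypothetical export from the device

points = [(float(p.get("lat")), float(p.get("lon")))
          for p in root.iterfind(".//gpx:trkpt", NS)]
print(f"{len(points)} trackpoints, first: {points[0] if points else 'none'}")
```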
Rant #1: Be Objective!
There are many parameters of tech products that can be objectively measured, but they are typically just waffled about by people with very limited knowledge of the subject.
Colour rendition of screens and response curves of speakers are just two examples that spring to mind. Both are frequently described as 'warm', 'harsh' or something equally fluffy when they could be captured by a detailed plot or measurement. If you don't have the means to properly test equipment, then don't bother trying!
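To give one concrete example of the kind of objective metric I mean: colour rendition can be boiled down to a delta-E figure between the target colour and what the screen actually shows. A minimal sketch, assuming you have colorimeter readings in CIELAB (the patch values below are made up):

```python
# CIE76 delta-E: Euclidean distance between two colours in CIELAB space.
import math

def delta_e_cie76(ref, measured):
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(ref, measured)))

target = (50.0, 0.0, 0.0)    # hypothetical mid-grey test patch (L*, a*, b*)
reading = (48.7, 1.2, -2.1)  # hypothetical colorimeter reading

print(f"delta-E: {delta_e_cie76(target, reading):.2f}")
# Rule of thumb: below ~1 is imperceptible, below ~3 barely visible.
```

One number per patch, and suddenly 'warm' has a value attached to it.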
Rant #2: Know what you're talking about!
Two recent e-reader reviews spring to mind.
In the Kobo Mini review (http://www.theregister.co.uk/2012/11/14/kobo_mini_5in_e_reader_review/) it was claimed that the display comes from Vizplex rather than E Ink, when in fact Vizplex is a brand of E Ink.
The Kindle Paperwhite review (http://www.reghardware.com/2012/11/13/review_amazon_kindle_paperwhite_e_book_reader/page2.html) claimed that it's possible to switch off the light, which is also patently untrue (at least with the current firmware, 5.2 and 5.3).
Both of these errors were pointed out in the comments, but the articles were never corrected.
But facts and figures need to be backed up by text, and that's where we have come to expect humorous and occasionally vitriolic prose in the above-mentioned Vulture style.
"The 600 x 800 panel actually comes from Vizplex not the usual suspect, E Ink"
Nonsense!
Vizplex is an internal code-name at E Ink for all their electrophoretic displays.
The Kobo Mini apparently uses a slightly older version. (I'm not 100% sure how the 'Vizplex 110' relates to the 'Pearl' display that's used in all other current units.)
Up until about 2003 there was a proliferation of single-use pieces that made 'free construction' almost impossible. If you followed the five easy steps of the instructions you ended up with a great model, but the pieces were useless for anything else.
However, as described in this interview (http://www.monocle.com/sections/business/Web-Articles/QA-with-the-CEO-of-Lego/), they realised that they had gone too far, and responded by drastically reducing the number of different types of pieces. As a result, a lot of current 'LEGO City' models use pieces that can be used to build just about anything. (Ask my kids!)
However, 'LEGO Technic' hasn't followed the same course. A current model like the 8110 Unimog is a brilliant working copy of the legendary off-road vehicle, but it's far too complicated to build anything else from. (At least for an 8-year-old.) When I was a kid, you could buy sets with just gears (e.g. 9610), but today there's nothing like that on offer.
Please learn to use a spell-checker, and learn how to take screenshots.
And try to find an in-house person to proof-read articles. (Regular papers used to have editors; for some reason many online-only organs believe that they don't need one. WRONG.)
I agree with Phil that the overall standard is slipping. The number of articles appears to have increased, but most of them are just re-hashed press releases, rants or slapdash 'reviews' devoid of in-depth analysis.
There are some excellent objective and measurable criteria when evaluating a speaker:
- linearity
- efficiency
- loudness
It all comes down to parameters that can be measured and quantified in a frequency-response curve, rather than fuzzy adjectives like 'warmth' and 'ambience'.
If you insist on testing analogue peripherals, such as speakers and displays, please set up a lab with proper equipment to actually measure the results!
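Crunching the resulting capture into a number is the easy part. A rough sketch of the idea, assuming you have a mic recording of a white-noise test signal (the random data below is only a stand-in for a real capture):

```python
# Estimate the deviation from a flat response, in dB, over 20 Hz - 20 kHz.
import numpy as np

fs = 48000
rng = np.random.default_rng(0)
recording = rng.standard_normal(fs * 4)   # stand-in for a real mic capture

# Averaged periodogram (crude Welch estimate) to tame bin-to-bin noise.
n_seg = 64
seg_len = len(recording) // n_seg
psd = sum(np.abs(np.fft.rfft(recording[i * seg_len:(i + 1) * seg_len])) ** 2
          for i in range(n_seg)) / n_seg
freqs = np.fft.rfftfreq(seg_len, d=1 / fs)

band = (freqs >= 20) & (freqs <= 20000)
level_db = 10 * np.log10(psd[band] / psd[band].mean())
print(f"deviation from flat: +{level_db.max():.1f} / {level_db.min():.1f} dB")
```

A '±3 dB across the audible band' figure tells you a lot more than 'ambience' ever will.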
It's not the first off-line map browser for iOS. I have tried both oMaps and OffMaps, and they both work OK-ish. Neither is as smooth as the built-in Maps app, but with the current competition something good is bound to pop up eventually.
The biggest problem is the datasets used. OpenStreetMap data quality varies a lot. In some places it's better than Google Maps (e.g. Holland and much of the UK), but in others it's laughable (parts of Sweden, parts of Switzerland).
(The T&C for the Google map API explicitly forbid using the data off-line, so all these off-line map browsers use the OpenStreetMap dataset.)
(Or rather Gran Sasso, but it's close enough for government work.)
And they fire neutrinos, which rarely interact with anything, but that shouldn't stop an enterprising journalist from making up some neat headlines...
http://proj-cngs.web.cern.ch/proj-cngs/ProjetOverview/projetoverview2002.htm
That's only one of several competing proposals for an updated definition.
One idea that I tend to like is to simply count the number of atoms of a certain isotope, defining Avogadro's constant at the same time. A kilogram would then simply be defined as the mass of a certain number of carbon-12 atoms.
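Roughly, the arithmetic behind that proposal: fix Avogadro's constant exactly, and the kilogram falls out of carbon-12.

```latex
% One mole of carbon-12 is 12 g, so a kilogram is 1000/12 moles of it:
\[
  1\,\mathrm{kg}
  = \frac{1000}{12}\,N_A \ \text{atoms of } {}^{12}\mathrm{C}
  \approx 5.018 \times 10^{25} \ \text{atoms}.
\]
```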
There are two important parameters for HDDs: how much noise they make and how much power they consume.
Presumably all these drives are bus-powered through a single USB cable (or do they come with Y-cables?), but it would still be interesting to see exactly how much power they draw and what voltage they require. As you no doubt know, not all USB ports are created equal, and the voltage and current available vary considerably.
Especially as the power will invariably end up as heat, and some of those devices will get warm to the touch.
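The sums that make this interesting (spec-sheet figures, not measurements): a USB 2.0 port only guarantees 500 mA at 5 V, while a typical 2.5-inch drive can want roughly twice that at spin-up, which is presumably why Y-cables exist.

```latex
\[
  P_{\mathrm{USB2}} = 5\,\mathrm{V} \times 0.5\,\mathrm{A} = 2.5\,\mathrm{W},
  \qquad
  P_{\mathrm{spin\text{-}up}} \approx 5\,\mathrm{V} \times 1\,\mathrm{A} = 5\,\mathrm{W}.
\]
```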
And as Danny14 points out, any spin-down would be a great boon.
Finally, noise is generally not a big issue with 2.5-inchers, but some kind of vibration damping is always welcome. And even though it's difficult to measure, it ought to be possible to establish some kind of objective metric.
The Nexus One uses an AMOLED display with Nouvoyance's famous PenTile RGBG pixel arrangement. That means that even though there are 800x480 pixels, not every pixel has a full set of RGB subpixels. Not that it really matters as long as you stick to showing off photos, but if you want to compare resolutions you ought to reduce the figure by 33%, as explained by Ars Technica:
http://arstechnica.com/gadgets/news/2010/03/secrets-of-the-nexus-ones-screen-science-color-and-hacks.ars/2
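The arithmetic behind the 33% is simple: RGBG spends two subpixels per addressable pixel where an RGB stripe spends three, so matching subpixel budgets gives

```latex
\[
  \frac{800 \times 480 \times 2}{3} = 256{,}000
  \approx \tfrac{2}{3} \times 384{,}000 \ \text{full RGB pixels},
\]
```

hence the one-third haircut.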
Personally I wish that they would have gone for the PenTile RGBW array instead, but for some reason they never asked me...
I agree with J(ohn) Locke that it's time for a 'tabula rasa', a clean slate on which Google can make their play. I would by no means be surprised to see Google pick up this project (let's face it, Harbinger is a finance company and would probably be happy to sell the project off at a profit).
After all, Google made news some time ago about wanting to build a fibre backbone, and it would make sense to add a wireless part to it. And LTE meshes very well with IP.
I think we'll see Google-branded (as well as third-party) Android devices that use this IP-over-LTE (with or without voice) within a year or so. I think it's even conceivable that they will throw in connectivity for free, a bit like Amazon with their 'Whispernet' for the Kindle, all in order to create ad opportunities.
Of course it requires atmospheric pressure!
There is no way for the water in the downhill leg of the siphon to 'pull' water upwards in the initial uphill leg; for that you NEED the external atmospheric pressure to establish a pressure gradient in the first part of the siphon.
Sure, it's gravity in the downhill leg that creates a partial vacuum at the top of the siphon, but without the external pressure it wouldn't work.
The linked document is also full of rubbish. It insinuates that the cohesion required to 'pull' the water up the hill comes from hydrogen bonds! (snort!)
It would be very easy to install a manometer in the topmost part of the siphon and show how the pressure changes.
Also, it's child's play to show that you can't suck water more than 10 m uphill, a figure that maybe just happens to coincide with the barometric height of water at atmospheric pressure? Please!?!
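The numbers are not hard to produce, either: the pressure at the crown of the siphon, and the maximum lift for water, follow directly from hydrostatics.

```latex
\[
  P_{\mathrm{top}} = P_{\mathrm{atm}} - \rho g h,
  \qquad
  h_{\max} = \frac{P_{\mathrm{atm}}}{\rho g}
  = \frac{101{,}325\,\mathrm{Pa}}{1000\,\mathrm{kg/m^3} \times 9.81\,\mathrm{m/s^2}}
  \approx 10.3\,\mathrm{m}.
\]
```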
Even Wikipedia gets it right, unlike this Oz PhD. (I wonder what mail-order university he got his certificate from.)
I also ordered one from the States, and have been playing with it for a couple of days. The initial impression is, to put it succinctly, WOW!
The screen is fabulous, and the GUI is very responsive.
HOWEVER, after playing with it for some time, I put it aside, reached for my iPod touch, and had a sudden insight:
If this device had come out first, and the iPod touch/iPhone a couple of years later, I would have been much more impressed. They managed to put the entire functionality of an iPad into a pocketable format!
This is by no means the first robotic hoover. The Electrolux Trilobite came out in 2001 (and was ridiculously expensive), and the iRobot Roomba somewhat later. (I got my first in 2003, and by then they had been around for some time.)
This looks a bit like a copy of the Roomba 500, but it would indeed be interesting to see the differences.
Or, to get an IT angle, take it apart and look at the actual hardware, compare the algorithms etc.
In fact, if you look at component cost rather than US street price, it makes slightly more sense.
According to iSuppli, the 16/32/64GB versions cost USD 260/290/350 to produce. With the Newtonian prices, that gives a price-to-cost ratio of 3.5/3.6/3.2. Not altogether unlikely, even though the margins are typically higher on the top-of-the-line models.
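For what it's worth, the ratios imply the following retail prices (just the quoted figures multiplied out):

```latex
\[
  3.5 \times 260 = \$910,\quad
  3.6 \times 290 = \$1044,\quad
  3.2 \times 350 = \$1120.
\]
```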
I can see why they want to put up roadblocks for people trying to create multi-platform applications, as long as the other target platforms are Symbian/Android/BlackBerry OS. But what if you're trying to develop an application that can run on different versions of iPhone OS? When OS 4 is released this summer there will be three versions in circulation at the same time: 3.1 for the 1st-gen iPhone/iPod touch, 3.2 for the iPad and 4.0 for newish iPhones.
The most reasonable way to manage this would be using some kind of wrapper that emulates the missing new APIs for 3.x hardware (or does Jobs not want people to develop for the iPad?)
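The shape of the wrapper I have in mind, sketched in Python for brevity (the real thing would obviously be Objective-C, and every name below is hypothetical): probe for the newer API at runtime, and emulate it where it's missing.

```python
# Compatibility shim sketch: use the native API when present, emulate otherwise.
class OldPlatform:
    """Stand-in for a 3.x-era API surface: blocking save only."""
    def save(self, doc):
        print(f"blocking save of {doc!r}")

class NewPlatform(OldPlatform):
    """Stand-in for a 4.0-era API surface with an extra async call."""
    def save_async(self, doc, callback):
        print(f"async save of {doc!r}")
        callback(doc)

def save_async_compat(platform, doc, callback):
    """Wrapper: prefer the native async API, fall back to emulation."""
    native = getattr(platform, "save_async", None)
    if native is not None:
        return native(doc, callback)
    platform.save(doc)   # emulate: do the work synchronously...
    callback(doc)        # ...then invoke the callback ourselves

save_async_compat(OldPlatform(), "notes.txt", lambda d: print("done:", d))
save_async_compat(NewPlatform(), "notes.txt", lambda d: print("done:", d))
```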
So, this is a bit like the first SDHC cards that came out before the standard had stabilised. I'll wait a couple of months for the SD4.0 standard to be ratified.
And, -tim, SDXC has a very specific raison d'être, namely that it is physically compatible with SD/SDHC (and to a large extent all the flavours of MMC).
Oh, and you're not forced to use exFAT; the card is still just a block device, so in theory you can put ext4 or whatever you want on it. It's still a shame that the SD consortium recommend a proprietary file system.
This might be a silly question, but what's really the problem with h.264?
They claimed patents, but I thought that the x264 implementation is fully patent-free.
It's true that there are patents involved in MPEG-4, but there are free reference encoders as well as GPL'ed implementations.
Or have I missed something?
"Building materials can significantly affect signal strength. In particular, more energy-efficient, heavily insulated buildings can be real signal-suckers."
Have you got any figures for different insulating materials?
I'm asking because I'm about to insulate the loft, and I've already got lousy coverage; I wouldn't want to reduce it further.
At the same time, I can't imagine how fibreglass or expanded polyurethane could absorb radiation in the GHz spectrum, so any explanation of the physical mechanism would be very welcome.
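If anybody does have figures, the bookkeeping itself is trivial, since losses in dB simply add. A sketch with made-up numbers (the per-material values are pure assumptions, precisely the data I'm asking for):

```python
# Crude link budget: subtract each obstacle's loss (in dB) from the signal.
losses_db = {
    "brick outer wall": 8.0,      # assumed value
    "plasterboard ceiling": 3.0,  # assumed value
    "loft insulation": 1.0,       # assumed; plain fibreglass should be
                                  # near-transparent at GHz, though a
                                  # foil-backed board might be another story
}
signal_at_wall_dbm = -60.0        # assumed outdoor signal level

indoors_dbm = signal_at_wall_dbm - sum(losses_db.values())
print(f"estimated indoor signal: {indoors_dbm:.0f} dBm")
```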
The USB vs FireWire dispute comes up every time anyone mentions transfer speeds. Yes, FireWire is better at utilising the available bandwidth, but it is a much more complex protocol, being basically a fully-fledged network rather than a simple point-to-point connection.
The advantage of USB is that the slave side is very basic (you can get USB chips for less than a quid), whereas FireWire requires a fairly complex stack at either end.
Sure, USB 3 will increase the complexity, but most likely the sheer size of the market will push prices down fairly quickly.
(I just wish they'd made the USB-A connector a little less symmetric. I always try to force the bloody thing in upside-down. But that would ruin backward compatibility.)