Re: I used to own all the roads
I experienced the same sort of Moses-like parting of the traffic back in the days when I owned Vauxhall Omegas. Ahh, happy days...
Whilst the total number of genuinely "live" contactless payments generated across the TfL network each day is still a pretty big number and could still quite possibly overwhelm the BTC network, it's nowhere near 90% of 31 million - bear in mind that many users are paying for their journeys either with a PAYG Oyster or with an Oyster containing some variety of season ticket.
And even taking this into consideration, the overall point you're making is still entirely valid - even if TfL alone wouldn't be capable of overwhelming the BTC network, as you point out they're just one of many, many, transaction generators just in London, let alone the rest of the UK, Europe, the world...
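The scale mismatch is easy to show with some back-of-envelope arithmetic. A minimal sketch, assuming the commonly cited ~7 transactions/second ceiling for on-chain Bitcoin and the 31 million daily TfL journeys mentioned above (the 10% contactless share is an illustrative guess, deliberately far below the 90% figure being disputed):

```python
# Back-of-envelope: could on-chain Bitcoin absorb even a slice of TfL's
# daily journeys? All figures are rough illustrative assumptions.

BTC_TX_PER_SEC = 7                  # commonly cited on-chain ceiling
SECONDS_PER_DAY = 24 * 60 * 60

btc_daily_capacity = BTC_TX_PER_SEC * SECONDS_PER_DAY   # ~604,800 tx/day

tfl_daily_journeys = 31_000_000
contactless_share = 0.10            # deliberately conservative guess

contactless_journeys = tfl_daily_journeys * contactless_share
print(f"BTC on-chain capacity: {btc_daily_capacity:,} tx/day")
print(f"10% of TfL journeys:   {contactless_journeys:,.0f}/day")
print(f"Overload factor:       {contactless_journeys / btc_daily_capacity:.1f}x")
```

Even the deliberately lowballed 10% figure comes out at roughly five times the entire daily on-chain capacity, before anyone else in London gets a look-in.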
1. Additional cost is more relevant on devices aimed at the price-conscious end of the market. For a device like this where the price is already high enough to push it more into the "don't give a crap how much it costs, just remind me what it can do" end of the market, omitting a feature present on several of its rivals may cause it to be dismissed.
2. Gobs of internal storage is great, but you do need to be damn careful with your data backup strategy to avoid losing it all when (not if) something causes you to lose the ability to read from the internal storage. At times like that, you really appreciate the ability to simply pop out the SD card from your non-responsive device and access its data via any other SD-capable device instead...
So I'd prefer to have the right balance of enough internal memory to cope with 3-4 years' worth of OS updates, app installs/upgrades, log files, caches etc, PLUS the option of using an SD card to cope with stuff I'd want to be able to easily recover (photos, documents etc.) in the event of a device error.
Also, for a device like this with a large screen and decent audio capabilities, some people may want to use it to view media generated on SD cards by other devices (e.g. dashcams) rather than relying on the less capable UI provided by those devices.
3. I already carry around a perfectly decent set of Sennheiser wired earbuds, which I can use directly with my work PC, my home PC, my iPod and my current phone. Having just shelled out *HOW MUCH* on a new handset, I really wouldn't feel too happy about then having to shell out even more on a USB/3.5mm adapter or something BT-enabled just to be compatible with this one device.
At some point in the future (near or far I wouldn't like to say, just somewhere out there) the critical mass of wireless head/earphones will be high enough to make the 3.5mm socket an irrelevance for most people, but right now I guarantee you that most handset/phablet owners will NOT also own wireless head/earphones, but WILL own at least one set of perfectly useable wired phones, and in many cases would very much like to continue using them with their new devices.
"For some so called technical users, the option of "auto-update" seems to be lost."
For *this* technical user, it came as an unpleasant surprise to see FF updating itself earlier today DESPITE my having previously set the "Check for updates but let you choose to install them" option, only to then be given NO choice whatsoever.
That's not a reason for *needing* cloud access, though, nor sufficient justification for causing the Link to turn into a brick once its cloud service is shut down - even if this was the only way to add *new* IR codes to the Link, existing customers really ought to at least be able to continue using their Links with their existing products for which it'd already learned the codes...
"Fully agree on hardware buttons.
They still work when a bit of your touch screen dies."
But hardware buttons can also fail if the phone is damaged, and flipping the phone from portrait to landscape or vice versa isn't then likely to work around the problem in the same way as you were able to do with your touchscreen issue.
"frame rates of about 1ps on very high end hardware"
Yes, achieving frame rates of 1 picosecond would require some pretty high end hardware, I guess...
The idea that the maintenance teams are so in the dark about how to fix problems that they're resorting to swapping parts out at random until they get the result they're after, does rather have a feel of back-street garage about it...
This is just one of the reasons why I consider the VC#2008 installer one of the most valuable bits of data in my personal archive - not only is it pretty much the last version that provided a decent UI of its own before things started going downhill towards the monochromatic flatness hell we currently inhabit in Windows-land, it's also old enough to not have the ability to even suggest that I ought to try creating anything that might resemble a TIFKAM UI.
Free Pascal + Lazarus is also a good bet if you want to remain firmly old-school when it comes to developing desktop code, particularly if your path to developing Windows applications began with dabbling in Delphi, as well as if you really don't want to be dealing with any of the stuff in .net that tries to protect you from yourself and just want to hack out a quick and dirty bit of code in 5 minutes to do one simple task.
Thanks, I hadn't appreciated just how close to non-existent the Martian atmosphere actually is...
Wouldn't the mere fact that there is at least *some* atmosphere on the Mars-side of your suit/vehicle/etc, compared with the vacuum encountered on the Moon or in orbit, make the design of said suits/vehicles a bit easier thanks to them not needing to cope with such a steep pressure differential?
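The answer is "barely", as a quick calculation shows. The figures below are rough illustrative values, not engineering data: ~600 Pa average Martian surface pressure, and ~30 kPa for a low-pressure pure-oxygen suit (roughly Apollo territory):

```python
# How much easier does Mars's thin atmosphere make life for a suit designer,
# compared with the effective vacuum of the Moon? Rough illustrative values.

SUIT_INTERNAL_PA = 30_000   # ~30 kPa, a low-pressure pure-O2 suit
MARS_AMBIENT_PA = 600       # ~0.6 kPa average Martian surface pressure
MOON_AMBIENT_PA = 0         # effectively vacuum

diff_moon = SUIT_INTERNAL_PA - MOON_AMBIENT_PA
diff_mars = SUIT_INTERNAL_PA - MARS_AMBIENT_PA

print(f"Moon differential: {diff_moon} Pa")
print(f"Mars differential: {diff_mars} Pa "
      f"({100 * diff_mars / diff_moon:.0f}% of the lunar case)")
```

With the Martian atmosphere at well under 1% of the suit's internal pressure, the suit still has to hold ~98% of the differential it would on the Moon, so the design barely gets easier.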
...or is it merely rendered temporarily useless until the owner recodes it to the car using the procedure documented in the user manual?
Nope. At the front of every DLR carriage, tucked away under a locked cover, is a set of manual controls which can be used if required - see https://en.wikipedia.org/wiki/Docklands_Light_Railway_rolling_stock#Passenger_stock_overview
“Nothing,” says Cave-Ayland. “You’re too big an impurity.”
Ouch... Given its scope for reminding people of their place in the universe, if this whole fusion power business doesn't work out as planned, they could always adapt the designs into the first prototype Total Perspective Vortex instead.
"Or you could just pay the <big evil vendor> and get a solution that works out of the box"
Provided that the <big evil vendor> solution works exactly the way you'd like it to work... Count me in as another fan of the old Windows Mobile devices - my first three smartphones all ran various iterations of WM, and I absolutely loved how open the OS was to allowing the end user to tweak stuff to their heart's content if the default way of doing things wasn't quite to their liking.
As someone who suffers from somewhat iffy colour vision, having the ability to knock up a custom colour scheme which was then respected by pretty much every part of the OS and third party apps, as opposed to the rather feeble lip-service usually paid to this sort of thing by many other OSs (and depressingly growing ever more feeble across ever more OSs as time goes on - don't get me started on how hostile "modern" UI design can be to people with less than perfect vision...) was an absolute godsend, and something I've missed ever since moving away from WM into the Android world. And that was just one of the countless things you could do with WM if you so desired.
So whilst I'll readily admit that WM wasn't a brilliant choice for the average user who just wanted a simple-to-use smartphone, and whilst the early iPhones genuinely did shake up the market in terms of making smartphones accessible from the moment you took them out of the box, it does frustrate me how many people seem to equate "needs to be customised for your own personal preferences" with "can't do any of this stuff at all".
It might not have any 3D styling, but it does critically provide a very high level of contrast between it and its surrounding page area, making it absurdly simple to find and identify as something that can probably be clicked on, hovered over or otherwise interacted with in some way.
Let's not get too hung up over "flat" here - whilst the move to the more minimalistic flat styling was a bit of a shake-up after decades of 3D-styled UIs, at least those early flat UIs still provided clear delineation between their elements. The problem is *now* that UI design has progressed even further down the road of minimalisation, stripping away pretty much anything that lets you know which part of the UI does what. In some cases the designers didn't even stop there, and continued on to remove *every* visual hint as to where the active parts of the UI were in the expanse of seemingly anonymous whitespace they so graciously decided to dump in front of our eyeballs.
So I think when people are, quite rightly IMO, complaining about "flat" UI design today, many of us are really complaining about minimalistic UI design which is still promoted by some as "flat" design, if only because those few bits of UI styling which do still exist *are* just as flat as they were in the early days of "flat" UI design...
"Oooh, lovely text only interface."
Yes, it is quite lovely actually. Clean, simple, no nonsense, allowing the visitor to find the information they want with the minimum of effort.
And why go delving into the web archives for this particular iteration of his site anyway? If you want to criticise his current views on UI design, it feels a bit like clutching at straws to use a 5 year old version of his site instead of the current version, the UI for which looks nothing like the old one...
Ideally a UI would both look good AND be useable, but if one of those two has to suffer then let it be looks, because whilst users can learn to ignore something that looks a bit iffy if it's supremely easy to use, they'll almost certainly never learn to love something which looks gorgeous but constantly hinders them from doing whatever it is they're trying to do.
You're right, good design is invisible, the user should never need to think about it. Flat UI design however seems to fall more onto the conspicuously absent side of the fence - taking away pretty much every bit of visual guidance to show users where to click, drag, type etc. and leaving them to guess at which parts of the UI do what really isn't good *UI* design, even though a static snapshot of the UI might look like really good graphic design.
"i5-4440, 12GB RAM, SSD etc"
Just what sort of business sector do you work in if you think that's a low/mid-spec PC???
Since having W10 foisted on us at work here, there have been several instances where I've switched my PC on in the morning, and half an hour later it's still not finished chuntering through whatever the hell it's decided to update this time...
And random reboots, oh yes, those too. Despite having my active hours set such that Windows shouldn't be doing anything during the times when I'm in the office, I recently left the PC running a data capture session while I popped out to grab some lunch. Half an hour later I got back to the office to find the bloody thing had done an update & reboot about 5 minutes after I walked out the door. There were a few barely concealed expletives hurled in the direction of the W10 development team at that point.
Well yes, but don't forget that at the bottom of the registration form will be the question "Are you now, or have you ever been, a member of a terrorist organisation", so that'll stop the bad guys in their tracks right there. Won't it?
Quite. I'd always been in the "must have a SD slot" camp, but as time has gone by I've now found myself (not entirely willingly) moving over to the "must have as much onboard storage as affordable" camp instead.
I'd *love* it if Android actually did support SD cards as if they were nothing more than a second partition onto which you could do anything that it lets you do with the default internal partition, but since Android either continues to treat SD cards as a second-class citizen (by default) or as a fully tied-in part of the storage scheme (via adoptable storage), I'm finding that on my phone at least the benefits of having even a moderately large SD card are diminished by the inability to use the spare capacity on it to augment the rather more limited internal memory - how many times have I cursed the Android devs whenever the "insufficient memory to install this app" message pops up, when there's more free space on the SD card than the phone would be able to provide even if the entire internal memory was wiped clean... And don't get me started on how little love I had for them when they rolled out whichever version of Android it was that messed around with the SD access permissions for third-party apps.
I can't even remember the last time I pulled the card out of my phone, so other than the continued somewhat obscene markup on buying additional storage space internally vs how much the same capacity would cost in a decent quality SD card, it might as well just all be internal...
For the routes the OH works on, there are two types of timekeeping employed - reporting points and headways.
On routes using reporting points, there are a handful of places along the route (typically the start and end points plus a few selected stops at regular intervals) at which the bus is expected to turn up at a specific time. At other points along the route, *including any bus stops not designated as reporting points*, the bus is free to arrive/depart at any time as required in order to ensure it arrives at the next reporting point as close to the required time as possible.
On headway routes, buses are instead expected to maintain a fixed time offset to the bus ahead of them, so that whilst the exact time of arrival at any given stop isn't defined, the interval between buses at each stop ought to be consistent.
Bus operating companies take this quite seriously, and drivers can and do get hauled up in front of their line management if they're regularly seen to be ignoring the reporting point or headway timings without a damn good reason (e.g. emergency roadworks throwing the whole schedule into disarray). The amount of monitoring of London bus and driver performance that goes on is quite something to see if you're still of the belief that buses are these antiquated means of getting from A to B, little more than giant tin boxes on wheels still pottering around in blissful ignorance of the march of technology all around them - realtime position tracking to enable the onboard stop announcements and remote monitoring of timings/headways back at the control centre, onboard acceleration/braking/cornering force logging (which are also used as a way of beating up the drivers if they're seen to be driving in a manner not in keeping with whatever policy their company has - e.g. driving for maximum passenger comfort, driving to maximise fuel economy etc), multi-camera interior and exterior CCTV setups, system diagnostics/fault logging (particularly so on the hybrids/other alternative fuel buses)...
So if a London bus doesn't arrive within the time interval stated on the bus stop, it generally means something has gone wrong further up the route, and only rarely will it mean that the bus is being driven by someone who really doesn't care about maintaining correct timings. There certainly are some drivers like that, they just tend not to last very long in the job these days given how quickly their employer will come down on them like a large quantity of rectangular fired clay building blocks - most drivers are only too keen to avoid the risk of disciplinary action, so do attempt to maintain the required timings if possible.
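The two timekeeping schemes described above boil down to two simple checks. A minimal sketch (purely illustrative, not any operator's real monitoring system - the function names and example times are my own invention):

```python
# Two styles of bus timekeeping: deviation from a scheduled reporting point
# time, and deviation from a target headway to the bus in front.
# Times are minutes past midnight for simplicity.

def reporting_point_deviation(scheduled_min, actual_min):
    """Positive = late, negative = early, in minutes."""
    return actual_min - scheduled_min

def headway_deviation(target_headway_min, bus_ahead_min, this_bus_min):
    """How far the gap to the bus ahead differs from the target headway."""
    actual_headway = this_bus_min - bus_ahead_min
    return actual_headway - target_headway_min

# A bus due at a reporting point at 10:15 (615 min) actually arrives 10:18:
print(reporting_point_deviation(615, 618))   # 3 minutes late

# On an 8-minute headway route, the bus ahead passed this stop at 10:00
# (600 min) and this bus passes at 10:11:
print(headway_deviation(8, 600, 611))        # gap is 3 minutes too wide
```

Note that on the headway scheme the second bus could be "on time" at wildly different clock times from one day to the next, as long as the gap to the bus in front stays right - which is exactly why those routes only advertise an interval at the stop rather than a timetable.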
You asked: "Where did they get that 55 metres at 60 mph from and what is it meant stand for ?"
Article said: "As per the data from UK Department for Transport, 55m is the braking distance for a car driving at 60mph."
You asked: "Is that meant to be a reaction time or a stopping distance ?"
Article said: "Because the braking distance is the distance required solely for braking,"
See also the braking distance chart at https://assets.publishing.service.gov.uk/media/559afb11ed915d1595000017/the-highway-code-typical-stopping-distances.pdf
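The figures in that chart follow the Highway Code's well-known rule of thumb, worked in feet: thinking distance equals the speed in mph, and braking distance equals the speed squared divided by 20. A quick sketch showing where the article's 55 m comes from:

```python
# Highway Code typical stopping distances, from the rule of thumb (in feet):
#   thinking distance = speed in mph
#   braking distance  = speed^2 / 20
# converted to metres to match the quoted 55 m figure.

FEET_TO_METRES = 0.3048

def thinking_distance_m(speed_mph):
    return speed_mph * FEET_TO_METRES

def braking_distance_m(speed_mph):
    return (speed_mph ** 2 / 20) * FEET_TO_METRES

def stopping_distance_m(speed_mph):
    return thinking_distance_m(speed_mph) + braking_distance_m(speed_mph)

print(f"{braking_distance_m(60):.0f} m braking at 60 mph")    # 55 m
print(f"{stopping_distance_m(60):.0f} m stopping at 60 mph")  # 73 m
```

So the article's 55 m is the braking component alone; the Highway Code's full typical stopping distance at 60 mph, with thinking time included, is 73 m.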
However did we survive before electricity, mains water, the internal combustion engine, written and spoken language...
It's not that people are generally unable to survive without something which didn't exist x years ago, more that once the value of x starts to get large enough, that something has probably now become such an integral part of their lifestyle that to lose access to it, particularly with no warning at all, can cause significant problems.
Another disadvantage is that with a monitor like this, you've put all your display eggs into a single basket. When that monitor fails, and sooner or later it will, what do you do then?
Meanwhile, those of us with just as much overall desktop real estate composed from physically independent monitors sat side by side, top to bottom, or whatever particular combination floats our boats, will just have to cope with a fraction of that real estate being out of action for however long it takes a replacement to arrive, whilst still being able to continue working with what's left in the meantime.
Then there's the question over how well it'll fit on your desk compared with two side by side monitors where you've got the ability to adjust the angle between them as required to get them both tucked neatly into whatever free space is available. And then is the curvature of this one big screen just right for your personal preferences, or would you really have liked it to be a little bit flatter/more curved?
I mean, yes, I can see the appeal of having a single seamless display which is presumably nicely colour/brightness/sharpness/etc matched across the whole area, as opposed to a bunch of display areas separated by bezels of varying widths, and where consistency can't even be assumed if you're using multiple identical monitors all bought from the same batch, let alone if you've cobbled together a multi-monitor setup in a more piecemeal fashion. And having just one physical display would also stop Windows from randomly deciding to reallocate your primary and secondary screens on startup. Oh how I laugh when that happens, how I giggle with mirth at the jolly prank the MS coders have just pulled on me...
But I just couldn't see myself feeling happy about spending that much money (particularly in comparison to how much it'd cost to get two identical 1920x1080 monitors of comparable quality to this behemoth) on something which, whilst stunning to look at, also brings with it a bunch of compromises of its own.
And as others have mentioned, at the end of the day it's only a 1080 display, no matter how many pixels are available in total... Given the abundance of 16:9 1080 panels available due to their use in LCD TVs, I can understand why it's now so difficult to find a 16:10 1200 monitor, but for a screen like this I'm not sure that explanation holds much water.
Mostly a mix of foreign language artists - Rammstein, Megaherz, Skalmold and Terasbetoni for when I just want a wall of noise to block out everything else to let me focus on the code in front of me, or Ruslana, Clannad, Mor ve Otesi and a handful of one-off tracks from some other performers if I just want to keep my ears entertained without simulating the speech processing parts of my brain - and instrumental/orchestral/soundtrack for times when I'm in a more relaxed mood but still want something that can be played loud.
A really nice hot cup of tea, excellent. Now, where did I put my Bambleweeny 57...
Start up your PC and let it boot into the OS. Now, without manually starting up *any* apps, games or whatever else you might use your PC for, open up the task manager and see just how much stuff is already running in the background...
Your favourite game might be so badly coded that it genuinely can only use a single core, but even then your gaming experience will be enhanced by having additional cores available to handle all the other crap that a modern PC will want to be running at the same time. Oh sure, for each specific workload there'll always be a question over whether x cores at y GHz vs (n*x) cores at (y / m) GHz gives the best performance, but the long term trend seems to be heading straight down the road signposted "More Cores Please".
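That "x cores at y GHz vs (n*x) cores at (y/m) GHz" question is essentially Amdahl's law: the speedup from extra cores is capped by whatever fraction of the workload can actually run in parallel. A quick illustrative sketch:

```python
# Amdahl's law: overall speedup from running the parallelisable fraction of
# a workload across multiple cores. Purely illustrative numbers.

def speedup(parallel_fraction, cores):
    """Speedup vs one core, for a given parallelisable fraction."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

# A badly coded game that's 95% serial barely benefits from 8 cores...
print(f"{speedup(0.05, 8):.2f}x")   # ~1.05x
# ...whereas a well-threaded workload scales far better:
print(f"{speedup(0.95, 8):.2f}x")   # ~5.93x
```

Which is also why the background crap matters: even if the game itself is stuck on one core, shoving the OS and everything else onto the other cores effectively raises the parallel fraction of the machine's total workload.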
Personally speaking, I can't wait to see these multi-core beasts hit the market, so long as the renewed level of competition between Intel and AMD keeps prices at a sane level - I could really do with refreshing my desktop system at some point in the next year...
"Most usefully, the camera is able to pan and tilt, following someone around the room – something that is done through the software rather than moving the camera physically."
Hmm... Not really what I'd call panning/tilting, particularly given the limitations it places on where you can physically locate the cameras whilst still providing the fields of view required.
"They are landing planes..problaby with a 100ms lag...that is unacceptable."
Umm, you do realise that the controllers in the tower (real or virtual) don't actually *control* the aircraft, and if it gets to the point where having a fraction of a second of lag in the virtual view leads to an incident, then things had already gone pretty badly wrong some time ago...
Now, I'm not saying I don't have some reservations about this idea, however aircraft can and do land and take off quite safely without any assistance from the tower controllers, and there will already be procedures in place to cope with loss of comms with a locally situated tower. So if things were to go completely T.I.T.S.U.P. with the virtual tower then it might make for an interesting few minutes elsewhere in Swanwick as the area controllers shuffle stuff around to cope with the diversions away from City, but it isn't going to cause aircraft on final approach to suddenly drop out of the sky.
"Others have also vandalised Antarctica in Waze, it appears"
Given the lack of real roads or motorists in that part of the world, it's a handy location for running tests on how the map editing tools and/or route calculations behave, so I suspect that most of what you think is vandalism down there is anything but.
The subject of this article OTOH... And from a level 4 editor too, who really ought to know better - wouldn't surprise me if their editing rights end up being a bit curtailed as a result of this.
In which case, he needs to have a chat with the occupants of 62 West Wallaby Street, as the whole design and test process for their successful cheesemoon landing and return to Earth was documented for posterity some years ago...
“I don't want anyone to think that the next version of Windows has a dramatic look and feel difference,” said Gallo.
That said, right now I'd happily put up with the continued eye-gougingly bad UI if they'd at least let us control exactly when updates got installed and, more importantly, when the resultant reboot then occurred. It's now got to the point at work where I've had to leave one of my personal Win7 laptops permanently in the lab just so I've got access to a PC that I know won't decide to spontaneously restart itself overnight, at the weekend, or, from time to time even during the middle of the working day, and which therefore is suitable to use as part of a long-ish duration data capture test.
Someone drag me away from the keyboard before I get onto the hardware compatibility issues with some esoteric (and not so esoteric) development kit which was absolutely rock-solid on the same PC running Win7, but which either now doesn't work at all, or does so at a level of flakiness that risks it being reclassified as an item of chocolate-based confectionery and stuck into a big dollop of ice cream...
To then have MS rub salt into the wound by labelling these latest builds of 10 as "Creators Updates" is really taking the proverbial. For at least some people who use (or try to use) their Win10 PCs to create stuff, each update to the OS takes us ever further away from where we'd like the OS to be, and where we quite happily would have remained if only corporate IT hadn't decreed that we all needed to switch over to 10 from whatever older but wonderfully reliable versions of Windows we happened to be using.
Enjoy it whilst it lasts... in my experience over the last decade it seems like insurers are happy to dish out competitive renewal quotes for a couple of years, but then sooner or later will whack you with a quote that's so far out of the ballpark it should be taken as a clear sign that they really, genuinely, no longer want your business.
In the context of your reply to the earlier poster, then you're right - the A340 has yet to suffer a fatal accident. It isn't however accident-free...
Even more remarkable than the longevity of the 737 family is that of another Boeing creation, the B-52 Stratofortress. At present the USAF is still expecting to be flying these for another 20-odd years, which will not only take it up to near on 100 years since the maiden flight, but will also mean that the last flying examples will be around 80 years old by the time they retire - unlike the 737 story where its longevity is being helped along by newly built airframes, the last B-52 airframes were produced in the early 1960s...
If you think Airbus should be avoided due to their earliest design being unable to withstand abuse from the pilots, then presumably you also think Boeing should be avoided due to the 737 (yes, the darling of this very article) having had a rudder design flaw of its own which caused two of them (United 585 and USAir 427) to crash without the pilots needing to do anything, let alone anything like repeatedly mashing the rudder pedals back and forth for 20 seconds, which is actually quite a long period of time in this context.
As much as I admire Airbus for having achieved so much success as they have in the cut-throat airliner business, and for being a pan-European collaboration we (at least those of us on the right side of the pond) ought to generally be proud of, I also see much to admire in the long and distinguished history of Boeing. They both make mistakes, but they also both produce some truly world-class pieces of aviation engineering that I personally am only too happy to trust my life to.
So whenever I then hear stuff like this, or the TL;DR "if it ain't Boeing I ain't going" variant, it makes me think the person saying it really doesn't have a clue what they're talking about.
"what are they good for?"
Cores, what are they good for, absolutely nothing...
...unless you're running a whole bunch of different things that can make use of all the available thread processing power at hand ;-)
"I can download a film 10x faster than it takes to watch it."
You might be happy watching stuff with an average encoded bitrate of 170Kbps, but some of us have slightly higher standards than that ;-)
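The snark above is just arithmetic: if a film downloads in a tenth of its running time, its average bitrate can be at most a tenth of your line speed. A minimal sketch - the 1.7 Mbps line speed is an assumption chosen purely to reproduce the 170 Kbps figure:

```python
# If download time is (watch time / speed_ratio), then the film's average
# bitrate can be at most (line speed / speed_ratio). Illustrative figures.

def max_avg_bitrate_kbps(line_speed_kbps, speed_ratio):
    """Highest average encode bitrate downloadable speed_ratio x realtime."""
    return line_speed_kbps / speed_ratio

# Assumed ~1.7 Mbps line, downloading 10x faster than realtime:
print(max_avg_bitrate_kbps(1700, 10))   # 170.0 Kbps
```

Run the same sum against the bitrates of a decent HD encode and you'll see why "10x faster than realtime" stops being a boast on slower connections.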
In all seriousness though, who defines what "normal" requirements are? One person might only ever use t'internet for the occasional email or spot of online shopping, whereas another person might live their entire life online, taking full advantage of all the services available (streaming media, VOIP telephony/video calling, cloud storage/applications etc. etc). Both sets of user requirements may well be entirely "normal" from the perspective of anyone else who has a similar lifestyle to the users in question, but would seem completely abnormal to pretty much anyone else.
Full disclosure time: I've always been of the opinion that there's no such thing as a "fast enough" internet connection (*) - I switched from V.90 dialup to ADSL pretty much as soon as it became commercially available in the UK (and if it hadn't launched when it did, I was seriously considering getting a bonded ISDN connection instead), then switched to VM cable getting on for 12 years ago after moving house. My home connection is currently a VM 200Mb/12Mb link, and I'm awaiting further news of their 300Mbps rollout plans with eager anticipation... So from your perspective, I definitely don't have normal requirements, but from my perspective (and from that of many other people who live in highly-connected multi-user households and/or have jobs/hobbies which are made easier with a decent network connection) they seem quite normal.
(*) although if I could get a symmetric gigabit link to the outside world, I might concede that this would probably be good enough, for now at least...
I have fond memories of Warp too - back then I was doing some research work on robotic equations of motion, which had eventually evolved into a hideously complex Matlab script to do all the hard work for me. I'd just define the system geometry parameters at the start, click Go, twiddle my thumbs for an hour or so, and then get a complete set of optimised motion equations out the other end.
Unfortunately this was all being done in the Win3.1 version of Matlab, and as bad as the co-operative multitasking was in 3.1 generally, it was a shining beacon of excellence compared to how it behaved once Matlab started up - I'm pretty sure the Matlab devteam must have misread the Windows documentation and thought it featured "un-cooperative multitasking", because once you let Matlab loose on a script it was game over as far as being able to do anything else on that PC was concerned.
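The failure mode is easy to demonstrate with a toy cooperative scheduler (purely illustrative - nothing like Win3.1's actual message loop, but the principle is the same): tasks are generators, each `yield` hands control back, and one task that does all its work before yielding freezes everyone else out.

```python
# Toy cooperative scheduler: why one app that never yields control (as
# Win3.1-era Matlab effectively didn't) freezes everything else.

def polite_task(name, steps):
    for i in range(steps):
        # do a little work, then hand control back to the scheduler
        yield f"{name} step {i}"

def hog_task(iterations):
    # Does ALL its work before its first yield - nobody else runs meanwhile.
    total = sum(range(iterations))
    yield f"hog finished (total={total})"

def run(tasks):
    log = []
    while tasks:
        task = tasks.pop(0)
        try:
            log.append(next(task))   # give the task its "timeslice"
            tasks.append(task)       # re-queue it for another go
        except StopIteration:
            pass                     # task complete, drop it
    return log

print(run([polite_task("A", 2), polite_task("B", 2)]))  # A and B interleave
print(run([hog_task(1_000_000), polite_task("C", 2)]))  # C waits for the hog
```

Preemptive multitaskers (AmigaOS, OS/2, and eventually Win95-onwards) sidestep this by forcibly interrupting the hog, which is why the rest of this story plays out the way it does.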
As a hardcore Amiga user at the time, I knew that multitasking didn't have to be this godawful, and I was convinced that the PC I had in front of me, which at the time had roughly twice the raw processing power of the fastest Amiga in my collection, really ought to be able to multitask at least as well as the slowest Amiga in my collection...
I can't recall how I stumbled upon OS/2 as the solution, all I do remember is that having learned of its existence and its claimed abilities to do stuff that Windows could only dream of doing, I dashed into town and bought my own copy of Warp, and once I got over the hurdle of getting it installed as a multi-boot setup with my existing fine-tuned DOS/Win3.1 setup (having expended god knows how many hours tweaking it to run all my games nicely - yes, even those that expected to have almost all of the base memory available, but still also needed to have CDROM *and* mouse drivers shoe-horned in there somewhere too - I didn't want to mess that up) I fired it up, installed Matlab, and tentatively clicked Go... Umm, is it running? This can't be right, the OS is still perfectly responsive, I can launch other Win3.1 applications without any signs of hesitation, and yet my Matlab script really does claim to be churning its way through its calculations about as quickly as it did hogging Win3.1 all to itself.
From that day on, Warp became my go-to OS for anything work-related until the day I finally ditched Win3.1 and made the switch to 95.
So yes, count me in as another one of those people who, despite the problems OS/2 had (I'll readily admit that it could be a bit flakey or just a bit obtuse when trying to get it to do what you wanted it to do) will still quite happily wax lyrical about just how bloody amazing it was in comparison to a DOS/Win16 based setup for anyone wanting to unlock the true potential of the hardware in front of them. Even today I still don't think the Windows dev team *really* understand how multitasking ought to behave, and I do wonder just how much productivity is lost globally due to those annoying random slowdowns and temporary hangs which remain part and parcel of everyday life as a Windows user, despite the underlying hardware being orders of magnitude more powerful than anything we could dream of having sat on our desks back in the 90's.
One wonders if the residents of places like La Paz also suffer higher than average rates of battery fires, given their similarly lofty altitude...
Since when have we been unable to take bottles of water onto an aircraft? I appreciate you were trying to make a funny out of the whole "no (*) liquids through security" thing, but it'd be a rare international airport that didn't either have airside shops selling water, or airside facilities for getting drinking water (water fountains, dedicated drinking water taps etc.) from which you could refill an empty bottle taken through security.
(*) certain exemptions aside, please read the small print for details before travelling, E&OE etc.
It depends on when the clock is required by the system. We know it's definitely required when the system starts up, but it's less clear if it's also then still required once the system has started up and the other clock sources have been initialised OK.
So as Richard 12 suggests, *if* this failing clock is only being used to get the system off the ground from a restart, then the fault may well remain hidden for however long the system can remain up and running. And if this is the case, it'd then raise the question of just how many of these Atoms have *already* gone into this knackered state without anyone being aware of it...
I still remember the day someone at work discovered the terraserver site (around 1999-2000 IIRC) and the whole R&D team stopped work for about an hour as we all crowded round their PC looking at the fairly low-res black&white imagery available around Slough (no jokes please, it might not have been the most salubrious of places to live, but the sheer number of companies based there made it a damn good place to kick off my engineering career).
17 years later, and I find myself grumbling if the aerial imagery in Google Maps is more than a couple of years old, under/over exposed, or just slightly too blurry to be able to see the road markings clearly... how quickly we forget just how much of a revolution it is to freely have access to this (and so much more) data at our fingertips 24/7.
Going slightly off-topic here, but as Waze is something I have a particular interest in...
The Google buy-out of Waze was done in a way that maintained an arms-length separation between the two companies. Other than the occasional traffic/incident alert appearing in Google Maps tagged with Waze as its source, and the slightly better integration of some Google products within the Waze environment, the two pretty much run independently of one another.
So yes, Google generate their own realtime traffic flow data via a combination of third party feeds where available, plus the location data returned from Android phones where such data hasn't been switched off by the user - it's this latter data which gives Google Maps such good quality traffic flow data on side roads where the likes of Trafficmaster et al pay no attention.
However, Waze do pretty much the exact same thing. Every phone running the Waze app is sending back realtime data to the Waze servers, allowing them to build up the same sort of dynamic traffic flow picture as Google have. The main difference is that in the Waze app, traffic is generally only highlighted if it's moving slower than usual for that section of road at that time of day. So if Waze is showing no traffic highlights, it doesn't mean the road ahead is clear - it only means the road ahead is flowing at least as well as Waze knows it usually flows. It could be completely stationary, but if that's normal for the time of day then it won't warrant a highlight...
Not the most human-friendly bit of UI design (something I've mentioned to the Waze devs on more than one occasion over the years), but from the perspective of the routing algorithms it does make sense. If you're using Waze as the devs intend it to be used (i.e. always following a suggested route), then you do start to learn to trust that it's already taken all of the traffic it's aware of into account when deciding which route to offer you. And if it still ends up directing you into the mother and father of all jams that wasn't shown onscreen, it's more likely that it really was the least worst option available, as opposed to it doing so because it really had no idea the jam was there.
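To illustrate the "slower than usual" behaviour described above, here's a minimal sketch of baseline-relative highlighting. All the names, data and the 70% threshold are my own illustrative assumptions - this is not Waze's actual implementation, just the general idea of comparing current speed against a historical norm for that segment and time of day:

```python
from statistics import mean

# Hypothetical historical speeds (mph) per (segment, hour-of-day) bucket,
# built up from past probe reports sent back by phones running the app.
HISTORY = {
    ("A4_westbound", 8): [12, 10, 14, 11],   # always crawls at 8am
    ("A4_westbound", 14): [38, 42, 40, 39],  # free-flowing mid-afternoon
}

def should_highlight(segment, hour, current_speed, threshold=0.7):
    """Flag the segment only if current speed has dropped below
    `threshold` times the usual speed for this segment at this hour."""
    usual = HISTORY.get((segment, hour))
    if not usual:
        return False  # no baseline yet, so nothing to compare against
    return current_speed < threshold * mean(usual)

# 9 mph during the 8am crawl is normal for this road, so no highlight...
print(should_highlight("A4_westbound", 8, 9.0))   # False
# ...but the same 9 mph mid-afternoon is well below the ~40 mph norm.
print(should_highlight("A4_westbound", 14, 9.0))  # True
```

The point of the usage example is exactly the counter-intuitive case from the post: an identical measured speed gets highlighted at one time of day and not at another, because the comparison is against the segment's own norm rather than any absolute speed.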
"(and it can display on an OS 1:25,000 and 1:50,000 map)"
As can Bing Maps, with a rather more user-friendly UI... As with several other commenters, I was a big user of Streetmap back in the days when it was *the* go-to site for free detailed maps of the UK, but they've done themselves no favours at all by clinging onto their old-school UI design long after it ought to have been put out to pasture.
Indeed, multiple screens are increasingly the norm in those parts of the workplace where PCs are seen as more than just a means of getting access to your calendar and emails. Even with the higher resolution afforded by a 4K screen, it can be more convenient to simply have separate screens for separate windows, rather than trying to juggle multiple windows within the same effective viewport area on a single higher-resolution screen. And unless the single screen is also physically larger, then you're going to have trouble reading the contents of those windows, given they now occupy a far smaller area of your retina assuming you're still sitting a comfortable distance away from the screen...
Whilst I don't have a personal need for a multi-screen laptop like this - I'm fortunate enough to work for an employer who still gets the concept of providing suitable desktop kit for desk-bound R&D employees, rather than assuming everyone can do their jobs with just a bog standard corporate laptop - I can easily imagine that quite a few engineers, designers and anyone else with a genuine need to display lots of data at the same time would be getting very excited at the prospect of a product like this making it to market.
Naah, can't be - a) it's far too wordy for a trump from Trump, and b) it's only semi-controversial...
"Holy cow! I couldn't imagine anything worse than the Microtome used during LASIK until I read the description (and saw photos) of a Vitrectomy. I hope you were asleep."
Dunno about Dave, but during mine (as part of a retinal reattachment) I was quite happily wide awake (aside from the area around my right eye, which was well and truly under the control of whatever local anaesthetic they use for this sort of procedure) and thoroughly enjoying every fascinating minute of it all - as someone with the typically inquisitive mind of an engineer, being able to experience something like that first hand was pretty amazing.
Especially since, being rather terrified of needles, there's probably no way in hell I'd be able to watch such a procedure being performed on someone else, but when it's your own eye that's being worked on, the needles are conveniently out of sight... The follow-up cataract removal op and laser clean-up procedures a couple of years later were almost as much fun too.
And on a more practical note, the surgeon who did my original op did say they prefer it if people are able to go through the procedure with just local anaesthetic, as it makes the post-op recovery process easier when patients aren't needing to be brought back around from being under.
There are many words I could use to describe the Win10 UI, friendly not being one of them...
Biting the hand that feeds IT © 1998–2018