reddit usually is, at least in my experience.
If people start being uncivil, they get down-voted, and so their posts tend to vanish from the site.
Perhaps I frequent nicer channels? (Usually tech, gaming or development related).
Personally I think 'default' passwords, admin and WiFi (and SSIDs) shouldn't even exist.
Part of the initial set up should be to force the user to log into the router/modem and put these details in themselves, with minimum standards on the complexity etc.
Even with the issue these days that not everyone has an Ethernet-enabled device, that could still be handled.
A possible option could be to have an initial, default but restricted Wifi SSID and password (and possibly a restricted Ethernet), restricted to a DMZ that only allows access to the routers admin page, and not the Internet itself.
So the user connects to the shiny new router with <any device with WiFi/Ethernet and a web browser>, and if via WiFi, uses the initial 'temp' SSID and password.
User is presented with a simple configuration web page (irrespective of what URL they typed in), that forces the user to set up a new admin password for the router, and then a new WiFi SSID and password (or to disable the WiFi if they don't want to use it).
The router doesn't enable Internet access until these steps have been completed.
If you only put in the new admin password, and don't change the WiFi SSID and password, then only Ethernet gets Internet access, with any WiFi connections still being DMZ-restricted to the router admin page.
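The gating logic described above is simple enough to sketch. This is purely a hypothetical illustration of the idea (the state fields and function names are made up, not from any real router firmware):

```python
from dataclasses import dataclass

@dataclass
class RouterState:
    admin_password_set: bool = False
    wifi_configured: bool = False  # new SSID/password set, or WiFi explicitly disabled

def allowed_destination(state: RouterState, via_wifi: bool) -> str:
    """Decide what a client can reach, per the setup-gating idea above."""
    if not state.admin_password_set:
        return "admin-page-only"   # everything stays DMZ-restricted until setup is done
    if via_wifi and not state.wifi_configured:
        return "admin-page-only"   # WiFi clients stay restricted until WiFi is set up
    return "internet"

# Fresh out of the box: even an Ethernet client only reaches the setup page.
assert allowed_destination(RouterState(), via_wifi=False) == "admin-page-only"

# Admin password set, WiFi untouched: Ethernet gets out, WiFi doesn't.
configured = RouterState(admin_password_set=True)
assert allowed_destination(configured, via_wifi=False) == "internet"
assert allowed_destination(configured, via_wifi=True) == "admin-page-only"
```

The nice property of modelling it this way is that the "safe" state is the default one: a router that forgets to flip a flag fails closed, not open.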
When I first saw the 'Compute Card', the first thing that popped into my mind was how is this any different than basically a top end smart phone without a touch screen?
Quote: "So go to 'Choose what the power button does', then 'Change settings that are currently unavailable' and turn OFF 'Fast start-up'. Then it doesn't part-hibernate, and takes only a few seconds longer to boot. Power pack still lights the motherboard, though."
Yup, what Steve said.
Also it's not recommended to use Fast start-up on an SSD, as it increases the writes to the drive quite a bit (dumping GBs of memory to the SSD on every shutdown), and it's not really needed on an SSD anyway (you might shave a couple of seconds off boot).
And unless you want to use hibernate, running 'powercfg /h off' from the CMD gets rid of the old hibernate file, so saves a bit of space too. (Turning fast startup off, and even turning off hibernate in Windows settings, doesn't seem to actually switch it off fully, and leaves the hiberfil.sys file behind, which can be several GB in size!)
Quote: "My windows 10 work pc has an item on the srat menu called power \ shut down , and when i click it the pc appears to switch off .
I'm not sure though i think its just pretending , in order to show off how quick it can "boot" up.
pstools say its been awake for weeks!"
Win 10 shut down is some form of hybrid hibernate. It shuts down apps, but as far as I know the OS itself is basically hibernated. It only gets a real full shutdown, when re-booting after updates etc.
There are some gloves that have actuators on the back. As you grasp objects in the VR world, the gloves provide variable feedback, per finger, so you actually get tactile feedback to each finger individually, as if really holding something.
Apparently it's good enough for people to be able to recognise different objects by their 'feel', i.e. recognise if something is hard or soft, what its shape is etc.
Early days yet though.
I'm curious as to who these 'users' are.
Are these people that are still using IE6 on XP or something?
The BBC's HTML5 support is still in beta, so it's opt-in currently.
Although if you don't have Flash installed/enabled in your browser, it will automatically switch to HTML5 anyway, without the need to opt-in.
As above, just go here to opt-in : http://www.bbc.co.uk/html5
There are also lots of technical details and other info on that page.
Think I'll stick with PuTTY thanks.
First, as people have mentioned, names like GTS, GTX etc are quite generic amongst performance devices.
nVidia also seem to have prior art, as they've been using GTS since the 'GeForce2 GTS' in 2000, and GTX since at least 2005.
Hardware Labs, as far as I can tell from some searching (please correct me if anyone knows different), were founded in 2005, and made car parts back then!
At some point since then (and presumably after nVidia had been using the GTS and GTX names), they switched from producing car parts, to PC cooling radiators, and it's these radiators that have the GTS/GTX name.
So the only commonality, other than the name, is they are both for use in a PC, but otherwise are completely different product categories.
How exactly is anyone going to confuse a liquid cooling radiator, with a GFX Card?
Especially as these will be PC enthusiast components, i.e. people who build their own systems, and so can easily figure out what's what!
I can't see how Hardware Labs have a leg to stand on here, and I'd think this could even damage their own market: any nVidia fanboy out there (and there are lots), on hearing about this, is likely to add Hardware Labs to their black-list and get their cooling solution from someone else!
Wouldn't a better analogy be more like a fire sale?
From other comments, it seems Pebble were circling the drain already, with layoffs earlier in the year, and basically surviving on investor money (i.e. not actually making enough income to cover costs).
All they've done is sell off what bits they could (i.e. some/all of the IP), and try to protect a few jobs, before closing the doors on the business for good.
Court confirms patents "were not patentable"
Step 1. Patents are rescinded.
Step 2. Any existing court cases referencing the same patents are amended to remove those claims (if other patents are still in dispute), or the case is dismissed fully if only rescinded patents were involved, with any costs automatically going against the organisation that raised the case.
Step 3. Any members of staff at the Patent Office who were involved in approving the patents, are re-trained, disciplined and/or sacked, as appropriate.
Step 4. All existing, and any new, patent requests from the same organisation (or any related to it) have additional scrutiny applied, for a period of time determined by how serious the issue was (e.g. for the next 5 years, with the 5 years restarting if any further stupid requests are made).
You'd have thought they could have thrown a few benchmarks in there to make it at least look like they did something with the drive!
You seem to be stating that Jackson is failing simply because of the use of UTF-16, and that using UTF-8 would 'fix' the issue!
Try reading the actual specs (or the referenced article).
Quote: "JSON text SHALL be encoded in UTF-8, UTF-16, or UTF-32. The default encoding is UTF-8".
If Jackson is failing due to the encoding alone, then the failure is with Jackson, not the encoding.
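As a quick sanity check that the spec-quoted behaviour is achievable, Python's json module (3.6+) accepts bytes in any of the three encodings the RFC allows and auto-detects which one is in use. (Python is used purely for illustration here; the thread is about Jackson, a Java library.)

```python
import json

doc = '{"name": "caf\u00e9", "id": 42}'

# The same JSON text, in each of the three encodings the RFC permits.
for encoding in ("utf-8", "utf-16", "utf-32"):
    parsed = json.loads(doc.encode(encoding))  # encoding auto-detected from the bytes
    assert parsed == {"name": "café", "id": 42}

print("all three encodings parse identically")
```

The detection is cheap: the pattern of zero bytes in the first four bytes of any valid JSON text uniquely identifies the encoding, which is exactly why the spec can allow all three.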
I suspect this would have been a management decision, and the people in IT likely tried to tell them why it was a bad idea, and would simply have been ignored.
That's why any decent company should have a dedicated, suitably qualified security expert, whom management are not allowed to overrule.
I like Amazons approach.
Not just an email warning like some sites (which no doubt many users would simply ignore), but a forced reset, hopefully blocking anyone accessing your account without authorisation, and forcing the legitimate user to actually do something to regain access to the services.
HTC already updated the cables used for the Vive, from the rather heavy and stiff 3-cable ribbon they started with, to a lighter more flexible single cable (direct replacement part, just plugs in to existing headsets).
Plus they are working on wireless, but that automatically gives you bandwidth, latency, and of course power issues.
There are also back-pack PCs available now, that are basically a high end laptop, with extra large batteries and cooling vents. The Vive has an advantage there, as the tracking stations (the lighthouses), don't need to connect to the PC, leaving you completely un-tethered.
Quote: "The fact that Oculus has gone to the trouble of creating a custom audio socket is just another sign of the organization's desire to create an entirely walled garden."
Compare that with the HTC Vive, which has both a standard 3.5mm headphone socket and an unused USB socket, both on the headset itself, leaving the user free to use just about any headset type they want!
+ 1 for the OnePlus, wait.... I'll get my coat
I've had a few Nexus devices, last one being the Nexus 5 (original). They always seemed to be a reasonable spec, at a reasonable price, without being messed with by carriers and manufacturers.
But now, the last Nexus devices, and the new Pixels, just seem expensive to me!
So last July (when my Nexus 5 finally died on me) I decided to go somewhere else, and bought a OnePlus 3 instead for £328.99, and have been very happy with it.
The OnePlus 3 seems very comparable with the Pixel XL for screen size, CPU etc. With many things being better on the OnePlus.
The only real plus for the Pixel is that the OnePlus 3 is 'only' a 1080p screen, compared to the Pixel XL's 1440p, but to me that's irrelevant. I've done direct comparisons with other 1440p screens and, personally, I really can't tell the difference on a screen this size! But then I don't have perfect (not even close) eyesight.
OnePlus also don't seem to mess with the OS much: a few tweaks, such as enabling customisation of the pull-down area (data/WiFi/NFC on/off etc.) and controlling the LED colour (for notifications, battery state etc.), and the few apps they pre-load can be ignored; nothing is forced on you. It also gets the monthly security patches from Google.
Oh yes, it's dual-SIM as well, which is handy for some people.
Quote: "You still need to log into a Google device; android tablet or chromebook..
Not true for Android devices, at least not most I've used.
I think Nexus devices do force you to log in with a Google account before you can use them, (and possibly some other manufacturers), but the non Nexus devices I've used (and Nexus devices where I've put stock Android on, or Cyanogenmod), let you skip that step during initial set-up, as it's not needed by Android itself, only by the Google services pre-installed on the device (Gmail, Play store etc.).
You of course still need a Google account to use play store, sync contacts with GMail etc. But it's not needed to access the device itself.
Different with Chromebooks of course, as that's intrinsically tied into the Google services (Drive, Docs etc.), and is basically a brick without an account (unless you replace the OS of course).
Unfortunately dave bates is correct, the gen 1 Nexus 7s slow down over time, no matter what you do.
I've wiped mine several times over the years, with fresh stock ROMs installed over USB rather than OTA, in order to really factory reset the device, but it was never the same after 5.0 was rolled out.
I replaced the stock OS with Cyanogenmod a few months back, which gave it a little bit of a speed boost compared to the stock OS, but it's still quite sluggish to use and isn't really a nice experience anymore.
I no longer use the Nexus 7 as an actual tablet (I have a newer device), so these days it just sits in a dock in the kitchen, with the audio output in the dock connected to a hi-fi system. So is basically just used as a music jukebox now.
Whilst I agree with you on the business side, I think for gaming, VR rather than AR is going to dominate.
AR isn't immersive and that's by design, so for most games, VR will always trump AR (imho). In fact I suspect most genres of games wouldn't be playable in AR at all, or at least not very well.
I could see AR being good for table-top type games, and games with a 3rd person, top-down type view, such as strategy games etc. But that's a fairly small market. For anything first-person, or any role-playing type game, which covers a huge chunk of the gaming market, I can't see how AR could be used effectively; being able to see you're still in your living room would just break any immersion.
But all this being said, VR (other than for cockpit type games), needs to be room-scale. Sitting in a seat with an XBox controller in hand, is doing VR a disservice currently, even potentially damaging a new market by imposing restrictions that shouldn't really be there.
From your comment (cockpits, and moving characters in-game when you are not moving), I'm guessing your experience is limited to the Rift, so seated or standing, using an XBox controller? If so, I suggest you go to a PC World or similar, where they have the HTC Vive on display, and give room-scale with motion controllers a go. It changes VR completely, and makes the Rift look positively dated in comparison (although I will admit the Rift's headset does look better than the Vive's!).
I went through a very similar trek a few weeks back.
Had a bit of a move around at home, which meant having a larger desk space. I realised I could fit three monitors on the desk now (from two previously), and I had an old spare 1920x1200 one sat on a shelf doing nothing, so I figured why not!
I too ended up buying new cables online, as no local shops (Maplins, PC World, large Tescos, Asda Living etc.) had anything of any use to me.
In fact I'd go so far as to say that most of these shops were essentially only stocking what I'd class as 'legacy' cables; not one of them stocked anything that could be used to connect one 'modern' device to another.
Same was true for USB-C.
I get the feeling that whoever is in charge of stocking things like cables in these shops is working off a list that's about 5+ years out of date, and so they simply don't cover current 'standard' connectors, like DisplayPort etc.
Would that still work?
Don't VMs access the CPU (essentially) directly anyway?
i.e. if your host is a 3rd gen i7, then your VM also sees a 3rd gen i7. All the 'user' can usually do is manage things like how many cores are available to the VM, not what type of CPU the VM gets to see.
Therefore wouldn't trying to run Win 7 in a VM, on a host that was running on a new CPU, still have the same compatibility issue?
Genuine question, as the various VM environments I've used (desktop, not server) don't allow you to change the 'type' of CPU available to the VM; it always sees whatever the host has installed.
Long time AMD CPU fan here, I've built a few AMD gaming rigs over the years (late 90s onward).
But my current rig, built in 2012, is an i7 (3770k), as unfortunately for gaming it's single core speed, not number of cores, that rules, and AMD just were not there!
I really hope this 3GHz speed is just due to it being a test, and that this isn't indicative of the real world speeds we can expect from this new chip set in the final production version.
Hopefully, if they can match proper desktop speeds, as in 4.5GHz (and upwards), then AMD could become a contender against the top end i7 chips, and a viable option in time for my next build.
Wouldn't the EFF be better lobbying for legal changes instead of targeting specific vendors? (Or do both?).
i.e. try to get the same things that the EFF are asking for here with Windows, but within a legally binding set of privacy laws, that cover all software, applications, devices etc.
That way all operating systems (desktop or mobile), applications (including phone apps etc), and anything else that can capture any metrics, has to abide by this legal framework.
For example, declaring what is captured and why, and clearly identifying what is necessary for the service to work (e.g. GPS data for a navigation system) and what is not. With mandatory means to manage this 'additional' data, or at least allowing for a more informed decision on whether to use that OS/app if you can't turn it off.
Plus also making sure this snooping isn't just some obscure bit of text hidden inside the T&Cs/EULA, but rather something that explicitly informs you during installation, before it actually starts capturing the data, with a mandatory means to decline, or at least back out.
I can just imagine, say 10 years from now when auto-braking and collision-avoidance systems are likely to be commonplace, some miscreants deciding to have some fun kicking footballs across a busy motorway, or dropping cardboard boxes from a bridge, just to see what happens!
Quote: 'I didn't say it should do anything different, I just commented on the apparent lack of differentiation in the display.'
Surely in an emergency situation, a speedy response and warning is the critical path?
Spending CPU cycles on deciding if it's a car/lamppost/meat-bag before warning the driver is just going to be wasting time, literally.
Perhaps they should just change the warning to something more generic?
i.e. just the standard stuff in any disrupted market?
We sell x number of doodads per year globally.
Someone launches a thingamajig and lots of doodad users decide to switch as it does what 'they' need.
After x amount of time, the new 'norm' settles in, with doodads and thingamajigs now sharing/splitting that once single market.
I suspect they'd probably lose radio comms for a short while, but would otherwise go (relatively) unharmed outside of the primary blast/heat area.
They look to be the same format as the 40-bit password generator in KeePass 2.
Perhaps there is a password generating tool out there that just isn't very good, or someone has pre-generated a few passwords, and have hardcoded them into something? (Malware etc).
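For comparison, a 40-bit hex password (10 lowercase hex characters, the format being discussed) is trivial to generate properly. A minimal sketch using Python's secrets module; the function name is just for illustration:

```python
import secrets

def hex_password(bits: int = 40) -> str:
    """Generate a random lowercase-hex password with the given entropy (multiple of 8 bits)."""
    return secrets.token_hex(bits // 8)  # each random byte becomes two hex characters

pw = hex_password()
print(pw, len(pw))  # 10 hex characters; the value varies every run
```

The point being: done correctly, each run is independent and the full 40-bit space is covered, so seeing the *same* passwords recur across unrelated systems suggests a broken (or hard-coded) generator, not a weak format.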
I wonder if this phone can get through Titans of Space without overheating?
Tried it on a Nexus 5, and the phone just switches off about 5 mins in!
I'm curious, what would you use 128GB (or more) of storage for on a phone?
The only time I've ever wanted more than the stock 32GB I have in my (soon to be replaced) Nexus 5 was when taking videos on holiday, as I'd stuck a few movies and some music on there for use in the airport/plane etc., which didn't leave much space for recording videos.
But anyone serious about taking videos ought to be using either a proper video camera, or a phone with an SD slot, where it becomes a non-issue.
I suspect that 64GB is probably more than adequate for the vast majority of typical users. (There will always be fringe cases of course).
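To put rough numbers on that claim (back-of-envelope only; the bitrate is an assumption and real figures vary a lot by codec and resolution):

```python
def recording_hours(storage_gb: float, bitrate_mbps: float) -> float:
    """Rough hours of video that fit in a given amount of storage (decimal GB)."""
    total_megabits = storage_gb * 8 * 1000  # GB -> megabits
    return total_megabits / bitrate_mbps / 3600

# Assuming ~17 Mbps for typical 1080p phone video (an assumed figure):
print(round(recording_hours(64, 17), 1))  # roughly 8.4 hours on a 64GB phone
```

Even allowing half the storage for the OS and apps, that still leaves several hours of footage, which supports the "64GB is adequate for most" argument.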
So is this going to result in a two-tier console gaming platform?
i.e. CoD 49 comes out...
"Now with enhanced GFX [*]"
[*] XBox One Scorpio or PS4.5 required.
Or even some (non VR) games only being available on the updated platforms?
If this happens, then it could drop console players into an upgrade cycle!
I can also imagine some future console gamer, who prefers higher frame-rates to resolution, having to start a game, then go into Game --> Settings, and select 1080p rather than 2160p to drop the res down, in order to bump the frame rate up.
All very PC like!
"That along with an Intel Core i7 6-core CPU indeed."
Games are generally not CPU bound (even 4k or VR ones), hence why most PC gamers (without an unlimited budget) still tend to stick with an i5. It's almost always the GPU that is the bottleneck (on a non-budget system).
I've got a venerable old i7 3770K from 2012 (clocked to 4.3GHz) and the only games it ever gets busy on are turn based strategy games as it calculates all the AI moves for a few seconds between turns.
For a typical GFX heavy AAA title, you're looking at ~40-60% CPU utilisation. That's running an AMP Extreme 980 Ti (faster than the Titan X) on a 3440 x 1440 monitor.
The CPU utilisation doesn't change all that much between resolutions, or GFX settings. i.e. going from 1080p to 1440p ultra-wide might add 5% to the CPU (as the heavy lifting is still being done by the GPU), so even running 4k would likely keep the i7 running at under 65%, so still easily within the bounds of a more current mid range CPU.
@ frank ly
Quote : "If you want precise positional information about a distant object, surely you'd need the individual sensors to be as far apart as possible...".
The galaxies are too far away to use trigonometry to measure distance. We can use that for neighbouring stars, but once you get past a certain distance, it becomes very inaccurate.
So this is about grabbing the spectrum of each galaxy, which gives us its red-shift, which at these distances tells us how far away the galaxy is far more accurately than trigonometry would.
We've done this already on a smaller scale, so this is about doing it en masse.
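The red-shift-to-distance step is straightforward at low red-shift, where Hubble's law applies. A hedged sketch (the Hubble constant here is an assumed round value, and for truly distant galaxies a full cosmological model replaces this simple linear relation):

```python
C_KM_S = 299_792.458  # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s per Mpc (assumed round value)

def distance_mpc(redshift: float) -> float:
    """Low-redshift approximation of Hubble's law: d ~ c*z / H0."""
    return C_KM_S * redshift / H0

# A galaxy at z = 0.1 comes out at about 428 Mpc under these assumptions.
print(round(distance_mpc(0.1)))
```

The survey hardware's job is simply to collect enough of each galaxy's spectrum to measure that z reliably, thousands of galaxies at a time.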
Quote: "So why are these fibre optic heads placed so close together that they have to be careful not to make them collide with each other when adjusting their aiming line?"
Don't think aiming, think more filter. It's one fibre per galaxy.
Imagine a disc (e.g. a disc of aluminium about a metre across), now drill holes in that disc that precisely match the relative positions of galaxies in a specific area of the sky. Stick this plate at the end of a telescope (where the camera normally is), and point it at that section of the sky, so that the light from each galaxy lines up exactly with the holes in your drilled plate.
Now direct via fibre optics the light from each hole to a sensor, and you can measure the spectrum of each galaxy, one fibre being the light from one galaxy.
This was basically the Sloan Digital Sky Survey, check out wikipedia etc for some pics.
This new work is to automate the process, so rather than drilling metal plates, and fitting the fibres, a plate for each section of sky, you just move the fibres around via the 'robots'.
I remember watching a documentary about the SDSS a few years ago; it took them days to set up a single plate, so automating the process with the robots should ramp up the numbers massively.
A couple of pics and some further links/reading here.
Quote: "But surely a discrete toggle switch to bypass all the smart components might be possible?"
Why do you feel the need to bypass the smart stuff? It's not like it sits in-between anything!
The smart part of a TV is basically just an App that you launch, don't want to use it, don't press the corresponding button on the remote (and like mentioned above, leave the network unconnected).
Perhaps in the past, but certainly not in recent years.
Trying to run a bat file with a .exe extension, either from the command line or by double-clicking in Explorer, fails with an error. (I just tested on Win 7 out of curiosity.)
Do you mean ART (Android Runtime)?
Dalvik was superseded by ART back in 2014 when 5.0 came out, and was available in 4.4 (KitKat) before then (as an optional setting).
Quote: "it's going to stop being a viable option sooner-or-later."
Bear in mind that WiGig uses the 60 GHz band for the high speed part (video to TVs/monitors etc.), which won't penetrate walls and is very short range, effectively limiting you to gadgets/devices in the same immediate area, or at least the same room. So your finite bandwidth is only being shared by a few devices in one room (or one part of a room if it's a large area).
You still have your normal 2.4 and 5 GHz bands for house/building coverage itself, which work alongside the 60 GHz.
I would suspect for most consumer gear (TVs, Blu-ray players, set top boxes etc.), and for normal business use, even in an open plan office, this bandwidth issue is likely to be a non-issue.
Personally, I doubt my desktop at home will ever move away from hardwired network and monitors (latency etc.). But for my TV, satellite box etc., I welcome the day I can get rid of the huge rat's nest of network, power, HDMI, DisplayPort and optical audio cables I have crammed behind the TV cabinet!
I keep getting messages that are apparently from WhatsApp, telling me I have deferred messages waiting for me to read.
One of these days I must get round to signing up for an account to see what they are about!
Especially when the film only came out a few months back!
Oops, NFC of course!
Been setting up a NAS, me thinks the TLA got stuck!
Pebbles (other than the ultra thin versions) last 7+ days on one charge, and only take about 30 mins to fully charge again. (Even the thin ones last around 2+ days).
But they do try to keep the Pebble simple, no NFS, no GPS, no speaker etc. (Although it does have a mic for dictation). If they added all that, you'd likely be down to charging every day or two again.
From what I understand, these figures are generated from active Internet browser use. i.e. page access data etc.
So this 'should' be active systems, if a user has rolled back, their stats would be back under whatever OS they were previously on (7 or 8.1 presumably).
The only thing I've found that seems to be a genuine 'upgrade' is DirectX 12, but that's only really relevant to gamers.
So far I've only met one person who actually likes Windows 10, but he also liked Windows 8.0 (yes .0 not even .1!), so it takes all sorts :-/
I'm happy with my venerable Win 7 for now (and Mint on the 2nd SSD).
Some places also don't seem to know what VR actually is.
I've found a few sites, including estate agents, that claim to be using Virtual Reality, that are just using 360° pictures.
For me, it must have depth, i.e. be 3D/stereoscopic, to be VR. A picture you can scroll around on a normal phone/tablet/PC is not VR!
Biting the hand that feeds IT © 1998–2017