Re: "We thus present a new technique based on WebAssembly fingerprinting to identify miners..."
No, he just wanted to flaunt his superiority that he uses NoScript. NoScript users are a bit like vegans: they always want everyone to know.
I have a USB-C port on my phone but nothing else I own does, and even the cable for the phone is regular USB at the other end!
How widely has USB-C been adopted so far? The devices I bought recently (Fire devices, mostly) didn't use it, and it seems micro-USB is still the de facto standard for everyone but Apple - or am I just unlucky in what I've chosen to buy?
My other gripe with USB-C is that the cable keeps falling out of my phone - often I wake up to find my phone hasn't charged because the act of putting it down after plugging it in dislodged the plug. But with only one device and multiple cables, I can't tell whether this is an issue with USB-C or just my phone's socket.
Given that probably 1% of tablet owners bother with a keyboard, I don't think that's true.
Tablets are seemingly used primarily for content consumption in the domestic market, and data entry in the business world (but this is a much smaller market), whereas smartphones are used more as general mini-computers.
Of course a tablet can DO vastly more than this, especially a higher-end one, but that's more niche. I can (and do) use mine as a modelling amp for my guitar, a music creation tool, etc, etc
Is El Reg uncommon in being a technology news site which is pretty uniformly pessimistic about technology? And is that conservatism, cynicism or realism?
Are there any good games that don't rely on APM and click-speed, but more on considered strategy? Then we could remove the arguments about advantages inherent to being a CPU.
I can't imagine anyone buys a tablet type device thinking about upgrading it.
I did some searching based on the responses and found you can get what is essentially a monocular-based clip-on from about £20. Some apparently come in a handy case including a fold-up stand.
To clarify I'm not trying to make my phone into a pro camera here. Just something better than digital zoom so I can actually capture something further away.
Good points made, thanks.
It seems every phone can take great shots these days, provided you don't want to zoom and have good lighting.
Lighting is being resolved through clever tricks, but in most cases I still can't take a photo of anything that isn't right in front of me. Granted, a phone isn't a DSLR, but my Lumia 1020 had proper optics.
My question is, can one buy an after-market clip-on optic to get 5X zoom? Don't laugh, but in the past I've stuck my phone up to the eyepiece of my binoculars and been really quite surprised how well it worked, letting me take photos of a bunny across a field or whatever.
>Because MS insists on shoveling features they haven't fully tested (if at all) at their hapless victims (otherwise known as "users") in a desperate bid to prove their OS is ready for primetime.
MS do several orders of magnitude more testing than Linux distros do. They are obsessed with it.
The reason MS updates make the news more when they have issues is that they affect 100X as many users and businesses. If Mint balls up a new version, you're not going to hear about it outside tech forums.
For most users, if W10 didn't keep doing unscheduled updates I'm sure they'd love it. It has a more W7 style with some of the good bits of W8. People are wary of security slurping, but if it doesn't stop them using the thing they'll be OK; an erratic, power-crazed update system, on the other hand, is something even the most PC-ignorant user struggles with.
I had to try several times to get it to install on Parallels, and when it did, things looked different - but five minutes later I couldn't remember how, such is my view of the OS (something to be ignored while you get work done).
I know it was a big 'un but what (sarcasm aside) were the changes us users were supposed to enjoy?
It seemed pretty clear this article was aimed at businesses not homes.
> let's face it, anyone who has gone to such lengths to avoid Redmond's wares is unlikely to be shovelling Word or Excel on to their pristine, and slurp-free, open-source installation.
Really? Surely the whole point of wanting Windows over another OS is to run the Windows applications you can't run on Linux? Otherwise you might as well just get Linux with a windows-esque GUI?
I was surprised they didn't find that using actual musical score (or similar) works... it's the language of music, encoded quite strictly. Training AI on "just the noise" is counterintuitive to me.
What was the actual output format? Surely the AI didn't hear piano WAV and emit a WAV with individual sounds exactly like piano keys?!
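For what it's worth, a symbolic encoding doesn't have to be fancy. Here's a minimal sketch, purely illustrative (whatever representation the researchers actually used may well differ), of a score as discrete note events rather than raw audio samples:

```python
# Toy symbolic score: each note as (MIDI pitch, start beat, duration in beats).
# Purely illustrative - not the representation from the article.
score = [
    (60, 0.0, 1.0),  # middle C on beat 0, lasting one beat
    (64, 1.0, 1.0),  # E
    (67, 2.0, 2.0),  # G, held for two beats
]

# A model trained on events like these predicts the next discrete note,
# instead of having to synthesise a WAV sample by sample.
for pitch, start, duration in score:
    print(f"note {pitch}: beat {start}, {duration} beat(s)")
```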
El Reg included information about why it's relevant in the article. Did you read it? Did you even search for "why is HTTP relevant..." yourself?
1. Nobody is ever told that "the image of a man nailed to a cross is good for them". They're told that the image is a useful reminder of what their religion holds as central.
2. "Kids see nudity as no problem - until they learn that it gets them into trouble with the sex-obsessed dogma of their parents' religion. That is how they acquire their subsequent prurient fascination with the "forbidden fruit"."
This is just bollocks. Most kids in the UK don't have religious parents these days, nor much exposure to religion, yet they still become fascinated with sex and nudity. ALL teenagers have a "prurient fascination with the 'forbidden fruit'".
Religious (and other) people see that nudity naturally leads to lust, which doesn't seem controversial; lusting at the sight of (normally hidden) naked flesh seems to be how we work. The issue is that some people consider that lust an obstacle to moral behaviour, while others see it as natural, proper behaviour.
What we consider 'nudity' appears relative: whatever you don't normally get to see, or are not 'supposed' to be able to see. E.g. a woman in a bikini on the beach isn't particularly tantalising; if she wears a dress over the bikini and the wind gives you a glimpse up her skirt, it is.
A.I. systems used to help triage the wounded would also be "facilitating conflict".
Disagree. Not every long comment is interesting even if it is cogently argued - a sub-thread on some particular point perhaps. How much effort is it to click a button to read a long comment?
Perhaps get a job at a less gulag-esque company.
At least that's the feedback you're going to get. Every SINGLE community website makeover I've ever been involved with has been thus - users up in arms that "it's literally blinding me", threatening to boycott the site, even creating custom scripts to recreate the old look.
It doesn't matter how brilliant the site is, in my experience. Better to set up a feedback email piped into nul.
Their main area of complaint seems to be customer service, but most people rarely need to contact CS after they get it running.
For lots of people, getting where they're going is the only consideration; every pound spent on making the journey more comfortable is a pound they don't have to spend once they arrive. I think the average Brit would accept the offer: "will you sit in this cramped box for 2 hours for £100?"
I guess it's similar if you just want internet that does Netflix: all options look the same to a non-techy. Though I know IT people who are with TT on the basis that "it pretty much is the same".
I recall a certain surly bending unit was partial to a beer or 101011
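For the non-robots: 101011 in binary is 43. A quick sanity check:

```python
# Bender's beer count: binary 101011 converted to decimal
print(int("101011", 2))  # 43
```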
The home-brewing community I'm in certainly wouldn't agree that lager is inherently bad. You're just talking nonsense. But then you seem like a proper snob, looking down at 'chavs' for liking things they 'shouldn't' like.
Each to their own.
The people making the beer don't know what you ate, how hot the day will be when you drink it, etc., either. And yet they still try to come up with something. That's what the drunken robots are going to be trying to do.
Wow, what a comedian. Tossing insults at mass-market lager... what do you do for your encore, some observational humour about how Donald Trump isn't especially popular?
Flavour of the month for 15 years or so. It's been a widely used niche language for that time, widely espoused by developers as something that "should" take off more widely.
In the meantime, Ruby came as a competitor... and seemingly went.
One imagines MS have data gathered to tell them objectively how helpful it is.
It is a legitimate question. On the one hand, even our modern advances only highlight just how stupendously amazing the natural world is, which can invite disbelief that such things could occur "at random". On the other, one can argue that our endeavours have been going on for just under 100 years whilst natural selection has been at work for seemingly over a billion years - 10 million times longer. We simply can't think on that time-scale.
Personally I still can't comprehend it being random (disclaimer: I'm a Christian), but as a scientist I realise that not being able to comprehend something doesn't imply it can't be true. The physical and temporal scales of the universe don't suit the human brain well!
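The back-of-envelope ratio above checks out, for anyone who wants to see it:

```python
# Natural selection (~1 billion years) vs. human engineering (~100 years),
# using the round figures from the comment above.
ratio = 1_000_000_000 / 100
print(f"{ratio:,.0f}x longer")  # 10,000,000x
```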
I'm sure one of the later Ender books (Children of the Mind?) features an alien civilisation who communicate by sending digitally encoded DNA sequences, which the receivers are supposed to 'read' by incorporating them into their own DNA.
Tangent, but that's what this reminded me of.
Apple haters have been calling it a fad since iPhone 1. There's always someone who sees their mates leaving Apple as a sign of End Times for Apple.
Did you miss the part where Apple sells both the #1 AND #3 most popular phones in the world? Hardly "on the way out" now is it?
I don't think anyone, even Apple, would expect their more expensive, premium version to outsell the regular version. They launched the 8 for regular users and the X for rich users.
You might as well complain the iPad Pro undersells the iPad, as an indication it's a failure.
If you get the replacements free it's ok!
You know, unless you move around for work instead of being a 9-5 permie. Lots of contractors prefer to use their own machines when on site, consultants or sales people might want a way to demo an entire system on the move, etc.
A client of mine spent a couple of grand a few years ago doing just that; they needed a workstation-and-server "system in a box".
I really hope you don't work in IT, because in 1998 VMware was still being founded, and Parallels only came on the scene in 2006.
Even in 2018, GPU-related things don't work well in virtualisation (Parallels doesn't support modern OpenGL), and these days GPUs are no longer just used for gaming.
So no, dual-booting may be defunct in certain areas of IT but even outside gaming, it is required in others.
>Ellison's main involvement in B5 was in the role of the person Straczynski trusted to tell him if he thought something was shit.
Was he not on set often then?
They didn't rename Skype. Go find some actual news to report instead of pandering to the Penguintards
Setting up the production line to build the phone without fitting the camera costs money. Setting it up to produce different case variants costs money. Getting the "imbecile programmers" to write code that checks whether the camera is there costs money. Making sure your £2/hr workers put the right cases on the right phones and in the right boxes costs money. Maintaining two versions of the phone and marketing them costs money. Etc.
If you don't want a camera, put a piece of duct tape inside your case covering it.
I'm assuming with your cutting analytical skills you've the £millions for such projects but are sensible enough not to waste them. After all, someone who knows better than multi-billion tech companies must be in high demand BY those companies or their competitors.
How much does the dock cost, smart guy?
A mouse? A keyboard? A headset? You can get wireless versions of all of these, or wired versions for a tiny fraction of the price.
Not to mention an external hard-drive.
In the modern world lots of us work remotely or in distributed companies. Even if you're in the same building as other developers, the moment more than 2 of you need to talk it gets time-consuming to get in the same place physically.
So I'm fairly scuppered this afternoon. Although the football is on...
Because everyone else is using Slack. It's a bit like asking why nobody uses Google+ even though it's better than Facebook.
It's not a simple game, you lame troll. Chess is a simple game. Go is a simple(ish) game. ML trounces human players and is on a par (at least) with traditional non-AI software representing man-centuries of work.
In a complex game requiring substantial strategic thinking and planning, mouse-click reactions and clicks-per-minute really shouldn't be the dominating factor. I don't know whether this IS the case in Dota 2, but it sounds very different to StarCraft.
This is a major undertaking, not to be done at the drop of a hat. Also, do you think MS would move their platform away from their own servers just to win points from anti-MS people? I bet most people never even knew GH was using MS servers (I didn't).
We're really off to a flying start with the idiotic comments today. This is a lab for noob AI developers wanting to get some understanding, and you're complaining it's not solving formerly unsolved problems?
>If this is AI then we have AI since the 80's.
Seriously? If a problem has already been solved by non-AI/ML means then solving it a new way is meaningless?
I hope that when your kids get into coding and proudly show you their version of Pong, you will tell them, kindly, how useless their efforts are.
That is rather the point - they are speculative R&D projects that may throw up some interesting results even if the project itself is never marketable. Like concept cars.
>The idea of writing a large scale system with a modern distributed architecture in C++ is ludicrous. Even if you could, what would be the point? And where would you get the developers.
What do you mean, where would you get the developers? C++ was the dominant applications language in the recent past, so there is a vast number of experienced C++ developers out there, many of them still relatively young (in their 30s).