* Posts by zanshin

42 posts • joined 2 Nov 2010

Trolls have DARK TETRAD of personality defects, say trickcyclists

zanshin

I "knew" a Troll that fit this bill...

...but I tend to agree with a lot of the other commenters that most folks many would readily consider trolls aren't particularly sadistic, or even clearly narcissists. Now, none of the people I'm thinking of were people I truly knew well, nor am I an expert in psychology, so perhaps these traits were there but subtle. Some of them were quite bright, though, and were capable of Machiavellian twisting of a thread of discussion, though I am not at all sure any of them were doing so with any specific plan in mind beyond messing with people.

One, though, the one I refer to in my reply title, did seem a fit for most of the characteristics mentioned in this poll/study. He was extremely narcissistic - he simply could not ever admit to being wrong, even when faced with incontrovertible proof that something he posted was false. He spent a lot of his time talking down to people, posting mainly to disagree with folks and always in a disagreeable way, which variously fits the psychopathic and sadistic labels. And, my God, actually engaging in debate with him was like falling down a rabbit hole designed by Daedalus. He would constantly dance and twist and subvert, and if you weren't careful, after about 10 posts you were arguing about something totally different. This was often related to his refusal to accept lines of logic that undercut his argument - he would drag the discussion away from such things as a red-herring debate tactic. He was quite good at it, using an ever-evolving, subtle shift through sidebar arguments that seemed related to the topic at the time, but that accumulated to pull things ever further into a different area entirely.

To this day, I don't know if he was a fairly brilliant troll or someone who was deeply disturbed and lashing out online because he could. I lean towards the latter, though.

1
0

Google+ GOING, GOING ... ? Newbie Gmailers no longer forced into mandatory ID slurp

zanshin

Interesting to see if other services follow

When I first got an Android phone, signing it up for Google Play also signed me up for cross-sell services like GMail, Drive and, yes, Google+. I do use Drive, but not the rest.

I don't use social media much to start with, and for what I do use it for, Facebook is sufficient and is also where everyone I want to interact with in that way is found. Google+ thus served no purpose, and I considered it more unwanted "attack surface" for my online presence than anything, though I don't mean to suggest I was deeply paranoid about it.

Thus it was that I was pleased to learn how to disable my public Google+ profile. If anyone didn't know that was possible, ironically the best way to find the instructions is probably to Google "delete Google+ profile". You do want to read the instructions - some Google products are intimately linked and deleting your profile can delete other data, too, but most of the mainstream services like GMail and Drive are unaffected.

So I do wonder if they'll stop automatically creating a Google+ account when you sign up for any of their other offerings, since it's clearly not deeply integrated.

1
0

Siri: Helpful personal assistant or SERIAL APP KILLER?

zanshin
Meh

No...

How do you think the assistant is going to *do* all those things, and get all that data?

When I need directions, I still want to *look* at a map. I can figure out what I need to do much faster by doing that than by having a virtual conversation with my device. Having the map at hand calls for an app, even if it's an app the assistant-maker also built and the assistant integrates with directly.

And when I have had an instant messaging chat with someone and am trying to remember the name of the person they told me to ask for at the local office? Well, I suppose it's possible a *really* damn impressive assistant could get that from the chat, though at that point I'd start to wonder what they'd need me for. But where are the chats stored? Sounds like the chats are stored in an app, to me. Did I have an IM chat with my friend via the assistant? Sounds like the chat needs an app interface of some kind.

How is an assistant going to act as a remote for my movie streaming? Do we all really want to *have* to talk to the device to look at a bit of action frame by frame?

If I want to connect to a new, walled-garden media streaming service à la Netflix or Spotify, how is my assistant going to achieve that without some sort of app?

No, assistants are not going to be the death of apps. Apps may be replaced by something else someday, if web-rendered apps, exposed APIs, cloud-based screen rendering and so forth keep advancing. But even then, the assistant is going to need to integrate with whatever the apps become in order to do new things. The assistant won't have killed the apps - they'll have changed on their own for largely unrelated reasons.

0
0

Trick-cyclists defend Facebook emoto-furtling experiment

zanshin

I'm sure you were being sarcastic here, but...

"And El Reg can't help but wonder why informed consent is a concept that requires scare quotes."

Simple. If they have to act responsibly, that's a barrier to arbitrary action, usually action intended to benefit them financially in some way, even if indirectly. I very much doubt Facebook worked with these folks purely out of interest in science, even if they received no or minimal compensation. Knowing these sorts of things helps them figure out how best to monetize their users.

If companies, and/or the academics working with them, had to request consent - or might possibly be barred from such research entirely - then they would get to investigate how to monetize more slowly, or not at all, so they resort to exactly this sort of red-herring argument to try and hedge against that risk.

It's really quite disgusting.

6
0

We're ALL Winston Smith now - and our common enemy is the Big Brother State

zanshin

The separation of concerns seems very thin

Other posters have mentioned this, but I'll pile in. If some company like Google has a wide-ranging amount of information about my interests, my communications and my movements, it's not much consolation to me that these private companies don't want to abuse that power the way the state might. The reason is that the government has the power to demand that information from the company (or to take it without the company's knowledge) for the sake of whatever it is the government might want to investigate me over.

As we've seen with the Snowden releases in the US in particular, the very act of the government tapping corporate intelligence stores can be contrived to occur in such a way that almost no one outside the channels that make it possible knows about it, and anyone involved who would like to make it public is under threat of severe criminal prosecution should they try.

It's fine and well that our governments have not, seemingly and so far, meaningfully abused the civil rights of their citizens using the information they now have access to. That is not a sufficient defense of the practice. The reason democratic nations have historically sought to rein in the knowledge freely available to a government's apparatus about the people governed is to limit the *possibility* of government abuse.

Quite simply, if a system that can be abused is left in place long enough, two things happen. One is that many of the governed people become inured to it, assuming it's OK because "it's always been like that". The other thing, which often comes only after the first is established, is that someone *does* abuse the system. It's human nature - either someone eventually won't be able to resist committing abuse, or someone will seek a position of power *specifically* because they recognize the potential for abuse they can execute.

As a species, we humans like to live under the conceit that conditions we enjoy now will persist into the future without bound - that a government which has historically been decent will never change to be otherwise. I think this is eminently foolish.

I'm hardly a doom-sayer, but it's hardly impossible for me to imagine future situations of civil disorder, most believably due to some natural disaster or resource constraint (water, power, food, etc.), where the governments of what are today democratic and free societies might resort to more totalitarian means simply to try and keep things under control. (Martial law.) In situations like this, I believe you very much would not want to mix in abuse-prone tools such as a way to track basically everyone all the time (pervasive cameras, facial recognition, cell phones, centrally managed driverless cars, etc.). It's unwise to trust leaders with such tools to do the right thing in situations where civil rights are so specifically curtailed. History does not offer good precedent.

On that note, one thing I'll disagree with in the original article is the notion that we owe Orwell for the caution of people my generation and older. While 1984 certainly stood out for some time and doubtless influenced many readers, for cautious people I know it is real, historical events that serve as more sobering reminders of what abusive governments can do with the power of extensive information about the people they govern. The examples set by the Soviet communist party, Nazi Germany, and the Red Scare in the US are much more frightening to me than any fiction. Imagine those regimes or movements with access to the information they could gain on their citizens today, especially if those citizens were raised to use the internet with limited caution.

Hope for the best, but plan for the worst. Enabling pervasive surveillance is unwise, even if it is not the government who directly surveils us.

4
0

Job for IT generalist ...

zanshin

The jobs do exist, but I've no idea how common they are

I am in the US, but I work for an international company in which the UK arm has a strong leadership presence. My boss's boss is in the UK.

I'm a generalist. I know something about a lot of different things, and can use that to solve lots of problems or create lots of solutions. And I've got a job where that's basically what I do professionally, where the breadth of my skills is specifically why I'm valued, and I'm paid very well. I've been where I'm at for some time, though, so I can't speak to how easy it is to find a job like mine, and it's something I do worry about should this job disappear or become unsavory. I *can* tell you my team wishes we could find more people with a breadth of skills.

Where I fit in best is in a place where specialists exist in their own silos. You have developers, DBAs, sysadmins, storage teams, and networking gurus. In places that divide specialties up like that, you often benefit from someone who is a bit like a business analyst, except instead of being the interface between developers and customers, they face the other direction, interfacing between developers and infrastructure / middleware.

What we find is that the developers are often wildly ignorant of the implications of the system's (virtual) physical design. The infrastructure teams often have no time to learn the ins and outs of the applications in order to tune their systems for them. I help the developers create software that won't be rubbish given the systems on which it runs, and help the infrastructure folks design hardware that won't be rubbish for the needs of the application.

The challenge is in finding an organization that values this role. Not everyone does, and that's clear even within my company. What seems to make the tuning and problem-solving skills valuable to people is when they're strapped for budget and need to expand their system, or make their existing scale of system run better. Tuning things can increase concurrent users on the existing footprint, or reduce infrastructure for the same performance. And even in a cloudy context, the ability to achieve those things can be valued. But I fear that may be rare.

I would never do project management. It has nothing to do with why I'm in IT, and requires primarily the exercise of people skills, not technical ones. If I lost this position, I would look for a job as a systems architect - someone who looks at the big picture of software, infrastructure, APIs and whatnot and assembles it into a solution. I see a PM as someone who drives all the people involved to implement that vision. I would want to be the person creating the vision itself.

2
0

Ubuntu desktop is so 2013... All hail 2014 Ubuntu mobile

zanshin

Canonical feels like the Apple of Linux Distros

I would like to see a Linux OS take off and get real market share and mind share in the broader device market, but as a power user, I would never personally want that Linux flavor to be Ubuntu. Of late their party line is that they know what the users want better than the users do. And *maybe* they do for users in general, or for new users that (somehow) find themselves using a device running Linux. But they sure as hell don't know what I like, because I don't like a lot of high-profile things they've done in recent Ubuntu releases.

That attitude of knowing what I want/need better than I do myself has been classic Apple for ages now, and it has always kept me away from them. That attitude was more recently adopted, to much angst, by Microsoft with TIFKAM, and it kept me away from Windows 8. And it keeps me away from things Canonical in the same way.

You can be pioneering without immediately throwing old paradigms under the bus. You can be progressive without remaking *everything* from scratch without a smooth transition. Does it take longer? Most likely. Does it cost more? Possibly. Will it give you a happy base of users who you help through the transition? I think it would.

It's great to be looking at the long road where today's 6-year-olds will be the device consumers of the future, but the fact is that right now we've got a ton of users bridging the desktop and phone/tablet paradigms. Making *them* want to use your product would get you significant market share which those 6-year-olds would grow up seeing. I don't understand the strategies that seem to decide those transitional people are irrelevant and/or can be made to change their opinions en masse. Given how poorly it seems to have gone in general, I think I'm probably not alone in feeling that way.

8
1

Facebook to BLAST the web with AUTO-PLAYING VIDEO ads

zanshin

I'll never see these on my PC, or if I do, it'll be a brief situation.

I'll likely just use my mobile client less. It's getting to the point that I use its update notifications as a prompt to go look at FB on my PC, heh.

0
0

Inside Steve Ballmer’s fondleslab rear-guard action

zanshin

Re: At the risk of downvotes

I'm not sure the comparison with what happened to minis is 100% apt. That seems to compare better with predicting that in 5-10 years those using PCs might have them based around low-power ARM chips, which is certainly not impossible. The comparison between slabs and PCs (and notebooks) is as much about form factor (keyboard+mouse, etc. vs pure touch) and user interface as it is about the architecture of the system.

Now, the shift between on-client and remote processing by your application is possibly apt in that comparison, but that is actually not wholly integral to the PC vs slab debate, at least IMO. Modern slabs come with enough grunt to run certain things locally rather well *if you buy a high-end one*.

The migration back to the "cloud" version of a client-server model for various compute is partially about keeping the price of slabs low for growth into developing markets but also about control (vendor lock-in) by large corporations, the attraction of locking people into subscriptions versus one-off purchases, benefits of economies of scale, and the idea that it's nigh impossible to pirate stuff that doesn't execute locally.

2
0
zanshin

Eh

I don't really buy into the full spin of the article, but I do think some aspects of the impact of slabs will hit PCs as we know them.

The modern PC has become a fairly durable good - something you buy and keep for 5+ years rather than something you replace every 1-3. Why has that happened? Partially maturity, partially because the incremental performance gain from buying a new one has been dropping for a while - the bang for the buck on an upgrade has decreased such that it doesn't make sense to do very often for all but the most hard-core users. I'm a pretty serious power user who builds his own (and his family's) PCs from parts, and I expect my current PC to last me 5 years before I really want a new rig.

For branded PC makers, this means growth is slumping. Windows 8 and its idiotic handling of what was probably a generally wise paradigm shift didn't help, but it didn't cause this growth slump. Market saturation with longer-lasting goods did. In this case "market" means "all the people who can afford and make use of a desktop/laptop PC". Most of the people who could/would own a PC already have one and have no compelling need for a new one.

Yes, there are lots of *other* potential customers out there in the world, but for a variety of reasons they weren't going to be able to afford a PC (or have a place to hook it up) for the foreseeable future.

And then along came a sort of PC-like thing that people don't have to "hook up" in the same sense. All they need is a cellular network capable of decent-ish data rates and a place to plug it in overnight for charging. Those are still high demands in some parts of the world, but vastly more common than the home-office niche a lot of us have in the developed world. And it's a lot cheaper than the cheapest PC or laptop.

Companies as a rule chase growth - if they're public they basically are obligated to do so, because their shareholders and often the law demand it. If growth in PC sales is collapsing, PC makers *have* to chase the next growth market, even if it's not PCs.

Does that lead to a "post-PC era"? I doubt it in the sense that some people seem to like to claim, but the reality is that if enough of the big companies wither their PC production, that has a knock-on effect on price and availability of the things that go into PCs. As they exist today their parts are cheap and generally modular because umpteen different companies make them. What happens when that changes?

I think that market group-think leads to this notion that PCs will completely disappear, and I think that's dumb - even this article doesn't claim that. I do think this shift in where growth leads makers will impact who makes the PC of the future, how easy it is to modify or build bespoke, what OSes are available to run on it, and (perhaps most importantly) what applications are made available for those OSes. Personally, I think it looks a bit grim, but not completely awful.

4
0

Autogyro legend Ken Wallis hangs up wings at 97

zanshin

Here's hoping this means he's off flying somewhere better. :)

I've only ever experienced the man through TV interviews, but he was rather inspirational, and indeed seemed to me to have had a fascinating life. Sounds like he would have been a terribly interesting person to meet.

RIP.

0
0

Can't agree on a coding style? Maybe the NEW YORK TIMES can help

zanshin

Re: 4 SPACES?!

You use spaces if you have multiple people editing the same files in mixed environments, like having some people using various Windows- or X-based editors, some in terminal vi or emacs, etc. Tabs have arbitrary visual width, and you can tell people to set whatever they use to display a certain tab width, but you cannot always enforce that they all do it. So then you get people mixing spaces and tabs (often unintentionally) in ways that make the code look aligned in one editor but not in someone else's.

Spaces avoid this ambiguity. No editor displays five spaces as anything other than five character positions, unless, God help everyone involved, someone is editing code in a proportional font.
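
As a contrived sketch of the failure mode (assuming C-style source, with "<TAB>" standing in for one literal tab character):

    /* In an editor displaying tabs as 4 columns, these two lines
       appear perfectly aligned: */
    <TAB>int width  = 80;   /* indented with one tab */
        int height = 24;   /* indented with four spaces */
    /* Reopen the file with tabs displayed as 8 columns: the first
       line jumps right while the second stays put, and the apparent
       alignment is gone. */

With spaces only, both lines look the same everywhere, which is the whole point.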

1
0

You're 30 years old and your PIN is '1983'. DAMMIT, biz mobe user

zanshin

Not reasonable

I just don't find it practical to enter a complex password on a mobile device that locks frequently (as many corporate-issued ones do, per pushed policy). I have to do this on my corporate Blackberry, and that's barely tolerable, because I still have one with actual keys and because my employer doesn't enforce very onerous password requirements and has a low-frequency change policy for such phones. Entering a complex password on a phone with a screen keyboard, given how often a good password requires mixed case and punctuation (and how most soft keyboards implement those), would be a nightmare for me. I would come close to spending as much time unlocking the phone as I did actually using it to reply to mails and text messages.

1
0

Windows 8.1: So it's, er, half-speed ahead for Microsoft's Plan A

zanshin

Re: Search as primary means of navigation?

I'm also not a fan of search spanning my machine and Bing. I either want to search my machine, or the internet. I cannot remember searching both at the same time ever being a use case I actually needed.

I assume, perhaps incorrectly, that it can be disabled, but still.

(The author's use case of wanting it to search in calendars seems pretty wild to me too.)

37
0

Girls, beer and C++: How to choose the right Comp-Sci degree for you

zanshin

Re: Pascal had a use (for me at least)

As learning languages go, I believe excess focus on Pascal, Delphi and even Java hobbles you with respect to what's going on under the covers. That may not seem important to many people - too many, in my opinion. But in my experience, the really excellent programmers are the ones who understand at least some of the low-level implementation details of how languages, and indeed computers overall, get done what you ask of them. Languages like Java work hard to hide that from you by design.

Learning C and C++ forces you, as a matter of the language basics, to learn about things like referencing memory more or less directly via pointers and low-level arrays. Naturally, using these tools can be risky, and poor use leads to all manner of awful bugs. Indeed, that C/C++ can permit such horrific bugs is the main reason we have languages like Java, whose core design philosophy is to "protect" the programmer from these things. But by using languages like Pascal and Java as your sole teaching tools, students don't learn nearly as much about what's actually being done by the resulting program as they would have writing in, say, C/C++.
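
As a trivial C sketch of the sort of exposure I mean (the array and names are purely illustrative):

    #include <stdio.h>

    int main(void) {
        int nums[4] = {10, 20, 30, 40};
        int *p = nums;              /* a pointer is just an address in memory   */

        printf("%d\n", *(p + 2));   /* pointer arithmetic: prints 30            */
        printf("%d\n", p[3]);       /* the same access spelled as indexing: 40  */

        p[10] = 99;                 /* compiles cleanly, but writes well past
                                       the end of the array - undefined
                                       behaviour, and exactly the class of bug
                                       Java's design exists to make
                                       inexpressible */
        return 0;
    }

Java simply won't let you write that last line; a student who has never seen it fail doesn't really learn why the guard rails exist.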

I believe this deprives them of all manner of useful knowledge they could later bring to bear to debug deeply buried issues or improve application performance. Not because they're going to hack pointers into a Java program, but because they often have a better grasp of things like stack and heap maintenance, why things can go wrong when loading in foreign libraries (via JNI and the like), and so on.

These things can of course be covered by courses in assembly or compiler design, but not many CS students I know take those if they can avoid it when their goal is to get marketable skills, as opposed to pursuing computer science proper. They usually focus on programming classes, which usually focus on the higher-level languages used in business.

Learning algorithms, design patterns and the like is essential, and I hardly think anyone would argue otherwise.

0
0

Obama weighs in on NSA surveillance imbroglio

zanshin
FAIL

He's right. We need to make a choice.

We choose less surveillance.

14
0

Dialog Bluetooth chip boasts battery life of four YEARS

zanshin

Re: Beggars belief that TVs & remotes don't use bluetooth

"And yet the PS3 had a bluetooth remote control 6 years ago"

Which is frequently considered *a detriment* because nothing else can control it, because everything else is still designed around IR. That perspective is *exactly* why this hasn't happened yet.

It seems to me the idea of non-IR controls has been gaining steam the last few years, but it also seems a bit more likely to me that this will manifest over full-blown WiFi than over Bluetooth. Every controllable device I have but one is now attached to my home network, even when I didn't specifically aim to have that functionality. Granted, not everyone has a home WiFi network, so that still might let Bluetooth fill that slot.

(The one un-networked device? A flat-screen TV that was a 3-ish year old model when I bought it new, specifically chosen because it didn't need smart features and was thus far cheaper. It doesn't need to be smart - every single thing I have that can feed it a picture is an internet-connected smart device - even the receiver that sends A/V from my other sources to the TV.)

0
0

Oracle and SAP are Big Software, but for how long?

zanshin
Meh

A lot more is needed from SaaS offerings

I can see SaaS as very compelling for small and medium-size businesses. For large, complex organizations with a lot of IT already in place, there are frequent needs I don't often see mentioned that, so far, limit how useful a move to SaaS would be. For example, where I work, a common thing with on-premise solutions (licensed or home-grown) is that they are integrated with one another. From what I've seen of most SaaS offerings, integrations and customizations are usually extremely limited, if on offer at all.

While integrations like this are sometimes a pain to create and maintain, they usually exist for a reason - reducing duplication of data entry, improving automation, etc. These are real needs that still have to be met in big organizations, but most SaaS offerings seem to exist on a pedestal inside a walled garden, making it awfully challenging to use them this way. Yet in a world where SaaS exists, it seems overkill to me that an on-premise solution should be the only way to address these kinds of needs.

I think this is a problem of product maturity that will eventually need to be met. I expect some future wave of cloud marketing to make a big deal of how various SaaS products expose APIs (possibly for added cost) that allow this sort of integration with other products - even SaaS products from other providers. At least, I hope to see that some day.
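
To be concrete about what I'm wishing for, here's a sketch of the kind of integration call I'd like SaaS vendors to support - the host, path and token are all invented for illustration, using plain libcurl C:

    #include <stdio.h>
    #include <curl/curl.h>   /* build with: gcc fetch.c -lcurl */

    int main(void) {
        /* Hypothetical: pull one customer record out of "SaaS product A"
           so it can be reconciled with, or pushed into, another system. */
        CURL *curl = curl_easy_init();
        if (!curl) return 1;

        struct curl_slist *hdrs = NULL;
        hdrs = curl_slist_append(hdrs, "Authorization: Bearer EXAMPLE_TOKEN");

        curl_easy_setopt(curl, CURLOPT_URL,
                         "https://api.saas-vendor-a.example/v1/customers/42");
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);

        /* With no write callback set, the response body simply goes to
           stdout, which is enough for a sketch like this. */
        CURLcode rc = curl_easy_perform(curl);
        if (rc != CURLE_OK)
            fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));

        curl_slist_free_all(hdrs);
        curl_easy_cleanup(curl);
        return (rc == CURLE_OK) ? 0 : 1;
    }

The point isn't the plumbing - it's that a documented, stable endpoint like that would let the integration live in *my* code rather than inside the vendor's walled garden.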

3
0

Forget choice: 50% of firms will demand you BYOD by 2017

zanshin

Re: Not interested, personally

No, really, it's not that simple. I know there are plenty of people who *think* it's going to work like that, but they won't be hiring me, and I'm not at all concerned about that. I recognize that not everyone has that luxury.

This isn't quite as bad as people insisting on access to personal social media log-ins, but it's certainly in that general area for me. With the social media access thing, if I'm asked or told about it in an interview, it means I'm done with the interview and I'll keep shopping. Mandatory BYOD means I'm going to have a lot of questions, and *might* keep looking depending on the answer.

1
0
zanshin
Thumb Down

Not interested, personally

The issue at the end of the article works in reverse, especially in light of "BYOD" really meaning "Buy your own device from a list of certified options". If I buy a device for a job and don't end up staying, I'm stuck with an out-of-pocket investment in a device that might not be something I would have chosen without that job's likely restrictions on what I can buy.

On the happier note that I stay at the job, this doesn't save me any money, because I'm not going to use a work device for personal things. Sure, if it's a phone I might be OK making calls on it, but at this point I don't have any interest in using a work device to chat with the gang, post my personal opinion on forums under a pseudonym, or exchange racy texts with a girlfriend. If I want to do any of those things, I'm going to want a truly personal device anyway, so BYOD becomes pure cost overhead for me.

Maybe eventually I'll be mollified by the internal work/life firewalls of the sort Blackberry and Samsung are working on, but at this point they are too new and untested for me to trust very much to truly keep my prospective employers out of my personal life.

3
0

Dubai splurges on 700hp, 217mph Lamborghini police cruiser

zanshin

If the cops actually need to go 150+ MPH to chase someone down, I rather wonder what they intend to do once they catch up to them in this particular car. I can't imagine they plan to ram or PIT maneuver anyone with a 250-300k vehicle. (That's ignoring the question of what harm might result from making a car spin out at those speeds.)

I suppose this could just be used to follow the speeder doggedly, and thus ensure the speeder actually can't get away, to be bagged (one way or the other) when they finally give up or eventually crash. In that case, the cops better hope whatever they're chasing doesn't have dramatically better MPG than their 12-cylinder beast. :)

0
0

Public cloud will grow when experienced IT folks DIE

zanshin

Re: So who will run the servers in the various 'clouds'?

Totally agree. We're where we are currently because we've had a few generations of people who grew up learning PC-related tech, in large part because it was abundantly available to everyday people. While I'm not so dense as to claim that standing on the shoulders of a nearly totally cloud-centric world is impossible, it leaves me wondering very hard where the innovations that expand those cloud providers into the next big thing after that will come from. We already went down this road with "wizards" who were the only people who could manage proprietary systems, in the form of things like mainframes. The PC era did a pretty good job of sweeping that away. What will be the cloud equivalent?

4
0

Congress plans to make computer crime law much, much worse

zanshin

Insanity

There's a preposterous disconnect between the potential penalties for things like this and those for actually horrific crimes like murder or rape, or things like selling hard-core drugs. The claim that they're some sort of prosecutorial "bargaining position" is small comfort, since the very existence of the penalty means a prosecutor has the right to ask for it and a judge is free to apply it. The idea of actually scaling the penalty to the harm done (or threatened) seems right out the window. This, sadly, should be no surprise, as I feel we have been seeing similar problems in copyright infringement cases as well.

23
0

Gnome cofounder: Desktop Linux is a CHERNOBYL of FAIL

zanshin

Windows isn't all that if you install it yourself on a laptop

For folks pointing out issues with drivers on laptops in particular: doing an OS install of Windows on a laptop from base media is rarely a walk in the park. In my experience, laptops have historically been the biggest offenders for having odd-ball, even completely model-specific hardware for which you need to obtain the driver *from the laptop vendor*. On such systems, Windows only works "out of the box" when it comes pre-installed (or primed in a way that it will complete the install off a disk that includes the needed drivers). Otherwise, you need to go trawling the internet looking for the right driver download.

Yes, in such cases you'll often struggle to find a Linux driver for the same devices. You'll often also struggle to find drivers for older Windows OSes (even XP at this point, for new laptops), as well as Mac OS compatible drivers if it's not actually a Mac laptop.

3
0
zanshin

So commercial interfaces don't change, eh?

I don't want to ride this too hard, but I'm pretty curious about some of the complaints about how Linux distros apparently constantly change paradigms like how to copy/paste, drag and drop, and the "File" menu bar. If you're talking about a user who struggles with mapping those things across new OS variants, surely you're not using that as a defense of commercial OSes? Yes, their change cycle is a lot slower than that of Linux in general, but Apple has radically changed its OS's user interface several times. They're also mildly notorious for making new OS versions incompatible with software from previous versions. Microsoft has tended to be more stable in terms of both interface and software backwards compatibility, but they aren't completely unknown for interface shifts that are crippling to folks who struggle with basic computer operations. Windows 8's interface-formerly-known-as-Metro is a wild departure for folks who know how to use Windows XP or 7 based mostly on rote. And even before that, things like the Office Ribbon interface were severely disruptive to non-computer-savvy folks I know. (Hell, it was disruptive to savvy folks, too.)

If you're dealing with folks like this, you don't change their OS unless you absolutely have to. Odds are, they aren't installing their own OS regardless - they're getting it shipped on a PC or having friends or family do it for them. That means the glut of variants of Linux and the deep vagaries of configuring the OS (any OS, not just Linux) are basically non-issues. If it breaks, someone else is likely going to fix it anyhow.

And if they're actually more capable computer users than that, major distro Linux installs from, say, 2008 and on are not that different from the one you get with Windows, and for the last few years they actually come with good device drivers for mainstream video and sound devices.

Is Linux awesome for "normal users", if such a thing exists? Probably not. Are the major distros nearly as bad as some commentators here seem to suggest? Not by a long shot.

1
0
zanshin

He's happy because he wants something that isn't what Linux has been about

Desktop Linux could probably stand some more end-user-friendly distros, but ultimately, what MdI is happy about is that he got his hands on an OS that someone else made water-tight and locked the rest of the world out of seriously tinkering with.

That's pretty much the polar opposite of what Linux has been for ages. The fact that it's wide open to tinkerers who think they can build a better mousetrap is *why* it has a bunch of variant distros. Sure, that also leads to some annoying schizophrenia among its tools even within a given distro, and yes, that all contributes to it not always being that user-friendly. But, honestly, if you want an OS that you're never going to tinker with and that you want to "just work", and (most importantly) you want to be able to call someone for help if it doesn't "just work", you really are probably going to be happiest with a commercial OS. Not because they're better, but because they're closer to what you want.

Apple's OS isn't intrinsically superior. It's just aimed at a different market segment.

8
1

Apple confirms 128GB iPad. A hundred bucks for an extra 64GB

zanshin

I agree it's not a PC replacement for most of us, but that's what makes the comment about using it for something like CAD work pretty laughable.

Current touch interfaces would be a truly horrific way to try and do serious 3D CAD work. A workstation-class PC is also likely to completely blow away any tablet on the market (iDevice or not) in the raw grunt needed for things like 3D rendering or even video encoding.

Maybe someday, but right now the suggestion that we use a tablet for anything other than social and multimedia consumption seems pretty silly. I don't even particularly like using them for office productivity stuff like spreadsheets, but that's down to either having to use touch or at least lacking a mouse, rather than tablets lacking the power for most of that.

2
0

Facebook's sexy pick 'n' mix OCP model is great... for Facebook

zanshin

"That's why the OCP IT model will fail in the real world outside the hyperscale data centres of Amazon, Facebook, Google, Yahoo and the other enormous cloud IT service operators."

It seems somewhat likely to me that they are envisioning a future where that's where almost all compute hardware actually lives, and thus where the economies of scale end up focusing the market. Personally, I don't think such a one-sided outcome is going to happen for a very long time, if ever, but a lot of hype seems to assume it's inevitable.

0
0

The year GNOMES, Ubuntu sufferers forked off to Mint Linux

zanshin

Re: Wants A and B

"Might I recommend Windows 8?"

This may have been intentional sarcasm, but for Windows users, going to Windows 8 is not totally unlike the switch from Gnome 2 to 3. The interface-formerly-known-as-Metro is anathema to power users who want to heavily multitask, and one of the first things such power users do is disable as much of it as possible - which is mostly achieved with 3rd-party software, as actual end-user-facing options to disable it do not exist. And the result is still less productive for a multitasking power user, IMO, than prior versions of the OS.

9
0

Facebook to debut auto-play video ads in 2013

zanshin
FAIL

Good luck with that

I'm sure a lot of people will be stuck sucking up those ads, but I bet it will generate a lot of ill will. I can't say I blame Facebook for trying - they're literally obligated to try and monetize their users - but I don't expect users to like this particular method. It'll be interesting to see if networking effects keep users stuck there, or if such blatant ad peddling drives them elsewhere, perhaps to Google+. Google, to its somewhat underhanded credit, is generally more subtle in how it monetizes users through cross-product tracking. They show us ad videos on YouTube, but hey, at least you went there looking for a video.

Personally, given that video is not fundamental to Facebook's core use, I'm confident I'll be able to strip out such nonsense in my desktop browser. Since I live under a miserly cap on my mobile data usage, if they try to force it on me as a smartphone user I just won't access Facebook from there. (I really think that forcing video viewing on people using a mobile data connection would be an outrageous PR fail for them, though. A "video ads on wi-fi only" setting would probably go a long way to mitigating that.)

2
0

Big Data in creepy hook-up with big-game whales

zanshin

Re: Games addiction

It's hard to say there's been no evidence, but based on my own reading, there seems to be no accepted scientific/medical evidence that indicates gaming can rise to the "addiction" classification. There are apparently some clinical studies claiming games can be formally addictive, but all are apparently considered anecdotal for various reasons, and not firm evidence one way or the other.

On the other hand, there are a fair number of papers or write-ups on how to tap into the impulsive tendencies of people in general which, when incorporated into game design, have been shown to increase the time players spend playing, how long they stick with a game, or both. Again, these are surely mere anecdotes from a hard-science standpoint, but it gets a bit hard to ignore them when game makers talk about using them successfully. Add in that you occasionally get people who die in (usually East Asian) gaming parlors because they would not leave a game to go get a drink or use the facilities for three days straight, and it makes you wonder.

Consider this article: http://www.gamasutra.com/view/feature/3085/behavioral_game_design.php?page=1, which is written by this guy: http://www.gamasutra.com/view/authors/205411/John_Hopson.php . This kind of writeup certainly makes it easy to draw comparisons between games and a virtual Skinner Box for humans, though that direct comparison is often challenged.

When you consider that gambling presses many of the same psychological levers, and addiction to gambling is considered a real disorder, I'd say "gaming addiction" seems fairly plausible, if not formally "proven".

0
0
zanshin

A Pandora's Box of sorts

I'm neither a would-be Luddite nor, typically, a doom-sayer, but topics like this do suggest the risks that come with our accelerating progress in computing power and data collection. We are armed with ever-increasing knowledge of how the human mind works, both biologically and psychologically, partially thanks to access to massive computing power and data collection, and that same computing power promises new and sometimes scary ways to use or abuse that knowledge. There are certainly potentially gloomy side paths to consider off of this topic, such as how all this computing power and data collection can be used for inappropriate levels of surveillance (government or not), or how governments or corporations could use similar kinds of behavior-shaping to make us more compliant, particularly when talking about our school-age kids.

It certainly seems possible to me that right now our capabilities are outstripping our foresight and thus our ethical and legal frameworks about how to limit their use. That's probably always held true - I can't think of a time when legal frameworks didn't seem to lag cutting edge processes. Still, with the vast computing power and number crunching we see not just present but accelerating, the gap seems to be growing larger, faster. There might be some structural way to try and control that, but I'm certainly not sure what it is, or, honestly, who could be trusted to adhere to it.

Anyway, on-topic, Matt, I don't envy you the challenge you face, but it sounds like you're talking with your son about it, and trying to let him help drive change rather than dictating it to him. I applaud all of that, and am impressed that you'd share this rather personal story here. I wish your family the best of luck.

0
0

Phone users favour Wi-Fi for dataslurp

zanshin

This makes sense to me. Despite being a self-proclaimed technophile, I pay for a skimpy data plan from AT&T in the US, and mostly avoid heavy data usage (such as installing apps, streaming media, OTT VoIP, or viewing lots of images) when I am on a cellular link. When I'm out and about I mostly use my phone just to text or talk to people (in that order), or for things that don't need connectivity, like shopping lists.

Now, when I plunk down in a restaurant or some such, especially when alone, I'll probably check my Facebook, email, etc, so I do consume bits and pieces of data doing so. But if there's WiFi available, I will use that instead, though I might be a tad more cautious about what I use it for. I probably wouldn't manage my billing or bank account on a public hot-spot, for example, but I mostly tend not to do that stuff over cellular, either.

I don't presently game on my phone, so that doesn't enter the picture.

Both at work and at home I have good WiFi coverage, so basically my phone only needs to consume cellular air time when I'm going to/from work, running errands, or out on the town.

0
0

Watch out, PC disk drive floggers: Cloud will rust up those spinners

zanshin
Meh

Not yet, anyway

I see this as a far-off outcome, with other pressures, like solid-state storage, likely to be of more immediate concern to hard disk manufacturers. If hard drive demand plummets, I think it will be because the technology of storage itself shifts, not because of *where* the storage units live.

I see three primary barriers to a radical shift of consumer storage into "the cloud". The first is convenience. If it's not fast or reliable, then average consumers will not use it as a complete replacement for local storage, no matter how carefree they are about the data itself. Anything that I would put in the cloud but want frequent access to needs to be small enough that I can download it in a trivial amount of time. The less often I need to access it, the bigger it can be, but there are limits. If anything is so big it takes me hours to upload, forget it - I'm not going to fool with it. Bear in mind that most people have asymmetric upload/download speeds - I can download nearly 10x faster than I can upload.
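
To put rough, purely illustrative numbers on that (assuming a 100 GB archive, a 5 Mbit/s uplink and a 50 Mbit/s downlink, in line with the ~10x asymmetry above):

    100 GB x 8 bits per byte             = 800,000 Mbit to move
    800,000 Mbit / 5 Mbit/s (uplink)     = 160,000 s, roughly 44 hours up
    800,000 Mbit / 50 Mbit/s (downlink)  =  16,000 s, roughly 4.5 hours down

An archive of that size is exactly the kind of thing I'm not going to fool with over a link like that.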

The next barrier is trust that the provider is a long-term partner. When I put money in a bank, I don't expect that institution to shut down any time soon. Setting aside whether or not that's a safe assumption about banks in today's world, it's still one that I think most people make. I don't yet have that sense of stability with remote storage providers, with the possible exception of Amazon. (While some storage companies use Amazon as their underlying storage platform, that doesn't mean I would benefit fully from Amazon's corporate stability.) Also, unlike banks, storage companies aren't insured in ways that ensure our data will be returned to us or moved to another storage company should our chosen provider fail. With time, and as the industry matures and consolidates, I'm sure this will be less of a concern, but my own feeling is that there is no company out there with the perceived stability for me to treat it as a permanent solution. A local solution has uncertainties associated with it too, but some can be mitigated through design, as with Drobo or other NAS products. Even if the companies that make such products go bust, I would already have their hardware on-premises, and could replace or upgrade it at a time of my choosing (barring failure), rather than be at the mercy of a distant board of directors, the economy, etc.

Finally, there is the matter of trust around the content itself, and how secure it is on a remote host. I have data on my personal systems that, if I have my way, will never be stored on any media that leaves my home - or at worst, perhaps a safety deposit box at a local bank. While I realize that not everyone feels this way about any or all of their data, the fact that I do means I struggle with the idea of explicitly putting its storage in hands I cannot know. At a minimum, I would want such content encrypted end-to-end, which gets back to the point about convenience. If applying security that makes me comfortable with remote storage makes dealing with it too slow or inconvenient, then local storage is going to be more attractive.

Most users are consumers of data, and less so producers of it. Many consumers are also willing to rely on external providers to keep the things they like readily available. This is the basis of streaming media - few copies, many views/listens. But some of us both produce a lot of media (I toy with 3D graphics) and also dislike that external media providers may fold or otherwise remove something we like to listen to or watch. Having my own copy locally means I control my destiny, but it means I must account for that storage myself, and risk loss if I have a catastrophic loss of my home. Everything has trade-offs. My preferences and choices lead me to want local storage. I suspect I'm not alone.

0
0

Owning a cloud means learning to love the business

zanshin

I agree!

Buzzwords aside, there are some very good observations here. Whether one lays it at the feet of "the cloud" or what-have-you, the combination of greater compute power (faster CPUs, networks, more memory, etc.) and improving IT efficiency in deploying that power (increased automation, virtualization, etc.) means that IT departments need to be delivering more "oomph" in the same timeframe, or there's no return on the investment in actually having all those new capabilities. And my experience does agree that taking advantage of this is usually going to be disruptive to existing IT culture, all the way up the IT delivery chain from VMs and network to middleware stack to the main application developers.

One thing, though, doesn't change - the need for quality requirements. All the fancy IT infrastructure in the world is only going to let someone with poor requirements deliver the wrong product faster. That may seem like a good thing, but if your business customer is choosing between you and another provider (internal or external) that matches your turn-around speed (and eventually, someone will), then the best investment for the customer is the IT delivery organization that gives them the product they need (or at least want), and not the one with the coolest IT toys.

1
0

Clouds gathering on horizon for software devs, say wise men

zanshin

Where do they think these services people hook together will come from?

There's going to be change, and *maybe* that change will disrupt monolithic software providers. But developers? Someone has to create, maintain and extend the components being discussed that people would access via APIs. Heck, someone has to design the APIs themselves, let alone implement them. Developers and designers will have a place doing just those things.

And I think the claim that "the world" will be writing tomorrow's software is a bit overblown. The vast majority of people don't want to fiddle with APIs, and the use of even highly graphical tools for creating programs has never really taken off.

I don't see the change as all that, personally.

0
0

Nvidia drops veil on game-changing might of VGX

zanshin

I think it's a game changer

It's an early but important step. I think it's likely to be the first real step onto a slope that leads not only to enterprise VDI, but also consumer VDI, where people who want more than a browser-enabled media device (like a tablet) *still* need just a browser-enabled media device, using it to reach compute resources that run in remote data centers. More than any other tech we've seen yet, it allows gaming and other locally resource-intensive programs to become remote apps.

That has the potential to further the current trend of consumers shifting to lower-power, more portable devices, which has implications for the cost of heavier-duty kit. If almost no one is buying full-on desktop PCs, they'll become expensive niche products, assuming anyone sells them at all. (I do assume there will be some legitimate need for them, and someone will meet it.)

0
0

Charge of the Metro brigade: Did Microsoft execs plan to take a hit?

zanshin

I think this is going to be the next Vista...

... which is unfortunate, because it does have some significant improvements in the guts. But the interface is key for a desktop user, and I can't see business and technical users finding anything but net pain from this interface.

I work for a multinational, and we're just now getting around to deploying Win 7 on user desktops. I won't be surprised if there's room to skip a generation here, the way many did with XP -> Win 7, so we can get to a more mature interface bridging touch and keyboards in whatever Windows 9 will be. Barring some major upgrade to Win 8 before then, of course.

As a home user, I see no reason to upgrade. If I buy a Windows-based tablet, I might use it there, but not on a desktop or laptop.

4
0

Metro breakdown! Windows 8 UI is little gain for lots of pain

zanshin
Unhappy

There's still time to change, but I'll be passing on this as it stands

This review hits home for me. It's not how Metro looks, and it's not just that it's different. If this review is accurate, what I dislike is that it takes more motion/clicking/typing to do the same things. I can deal with that, here and there, but as a standard approach to the interface? No thanks.

I'm not a "typical" computer user. I do lots of different things with my PCs, I frequently tinker with settings, and I like the things I do often to be simple, readily accessible and happen with minimal fuss. So far, the Metro approach looks set to fail me unless I'm checking Facebook or the like. Making basic things simpler is good. Making everything else more complicated at the same time is not so good.

Sadly, I see this sort of direction as an outgrowth of the buzz suggesting that smartphones and tablets are going to be the death of personal computing on full-on computers. That view seems to be creating a rush to "dumb down" *all* interfaces (not just at MS) to those appropriate to such devices. The issue is that those devices are oriented very strongly at content consumption. There are things about making content consumption simpler that are well and good - after all, that's what *most* people will do, and we certainly consume things on PCs as well. But somewhere, someone needs to be building the content that those smartphones and tablets consume. Making the OS interfaces for such people a pain in the derriere isn't a very compelling way to get them to produce goods for your OS or app store.

In his opinion pieces, Matt Asay likes to say that developers are the new kingmakers. It might be wise for folks like MS to build a castle the kingmakers will actually like to visit, lest they plant their kings somewhere else.

4
0

Vint Cerf: 'The internet is not a human right'

zanshin

I have to agree that it's silly to consider internet access itself a human right, but I do think laws or what have you should state that such access - as a powerful, modern enabler of other, more truly fundamental rights - deserves protection within some attempt at applying sane limits. I do think the UN proclamation rather missed the point of what is truly a human right, and that's a valid complaint, to be sure. That said, I also think there's something sensible in what it was (probably) trying to achieve.

1
0

Supercomputer and superboffins spot rare baby supernova

zanshin

Floating Point Accelerators!

I, too, was pretty amused at the occurrences of 'flops/s'. From a lot of news sources I would mostly just roll my eyes a bit, but I do rather expect El Reg to not do such things, if only because it's the sort of thing they would mercilessly point out if done by others. :)

Picking nits, I know!

2
0

The terabyte iPad is coming

zanshin
Stop

Not great initial assumptions

You start off early with the assumption SSDs don't fail or need backing up. That's a path to pain. They fail for different reasons than platter drives, and theoretically less often, but they do fail. If you don't make redundant backups of data on SSDs, you stand to lose it just as you might lose it on a traditional drive.

Perhaps someday not too far off we'll all be using SSDs or something like them instead of platters, but we're still going to need backups and/or mirrors of some kind.

0
0
