* Posts by Nimby

273 publicly visible posts • joined 10 Sep 2015

The hits keep coming for Facebook: Web giant made 14m people's private posts public

Nimby
Joke

Take security seriously...

...by refusing to use plain text posts on FB. From now on encrypt everything that you write! It's the only way to be safe.

Monday: Intel touts 28-core desktop CPU. Tuesday: AMD turns Threadripper up to 32

Nimby
Facepalm

But ... I don't want more cores.

These days, I try to build silent PCs. I'm tired of my desk sounding like an airport with planes taking off every few minutes. I need to rock out to the rhythmic beats of my clackity-clacking mechanical keyboard. My priorities are electrical input, thermal output, and PCIe lanes first, GHz second, number of cores last. Which means that right now, both AMD and Intel have serious issues preventing me from having a preference.

This whole number-of-cores race is like the stupid GHz race of years past. In the end, no one wins because CPUs get "optimized" into becoming their own bottlenecks.

And whatever happened to adding more execution units and/or making complex instructions complete in fewer cycles? Who cares how many sleeping cores and unused cycles you have?

Tor-forker Joshua Yabut cuffed for armoured personnel carrier joyride

Nimby
Joke

Just a misunderstanding, sir!

It said, "bug tracking", but I accidentally read it as, "bug-out in tracks".

Crappy IoT on the high seas: Holes punched in hull of maritime security

Nimby
Unhappy

The sad part of this story is ... not the story.

The only part of this I find amazing/surprising/whatever (Not quite sure what the right word for "is a thing, but really is in no way surprising because the world is chronically depressing in this manner" is.) is how often researchers "discover" things that have already been reported dozens of times in the past. Is doing a Google search not a part of the research procedure?

Also somewhat disappointed in the IoT buzzword usage, as if routers (from long before IoT) did not commonly have the same problem (and still do!), and as if nearly every PC before them did not also have the same problem (have you looked at the sticky note on the monitor or under the keyboard?) ... right up to the point when computers had any form of security at all. Nothing new under the sun.

Heck, I would not be surprised to find some ships today still relying on a C=64 for some reason.

Intel claims it’s halved laptop display power slurpage

Nimby
Angel

Still waiting...

... for a full Windows desktop OS on a 5-inch tablet PC to replace my svelte old pocket-capable and windshield-friendly Viliv S5. Sadly, phones still have not managed to meet my mobile computing needs. Come on already. And I'll happily take that with a dash of reduced screen power usage and a dollop of faster wifi please.

Five actually useful real-world things that came out at Apple's WWDC

Nimby
Coffee/keyboard

how much can an iOS actually do on MacOS

Doesn't matter, as this is the wrong question. The question should be: Given that you have a real computer with a real monitor, real processing power, significant memory, and expansive screen real estate galore, why on <insert your favorite deity here>'s green earth would any sane person prefer to use an "app" designed around a cellphone's limitations instead of a real application that can fully utilize everything that you have?

The advantageous use-cases for iOS on macOS are very few, and are mostly just games.

Just because you can do something doesn't mean you should.

Finally, San Francisco cleans up the crap from its streets – yes, all those fscking scooters

Nimby
Devil

how to deal with the fact that most users of the scooters persistently break the law

Golly gee willickers, if only the city could hire people with the explicit purpose of upholding law. Perhaps put them into standardized uniforms to make them easy to recognize. People who would then watch out for the breakage of laws and curtail individual lawbreakers with the express purpose of warning, fining, or even detaining them for periods of time. You know, people who could police the populace. It sure is a shame that such a thing obviously must not be possible.

Australia wants tech companies to let cops 'n' snoops see messages without backdoors

Nimby
Facepalm

Flat out like a Taylor thinking.

I do believe that Taylor may soon spontaneously combust from such deep thought. That or ride off into the sunset upon a unicorn. One or the other.

US govt mulls snatching back full control of the internet's domain name and IP address admin

Nimby
Unhappy

I wish I could care, but...

Sadly it really feels like trying to choose the least of evils. It is difficult to muster up the enthusiasm to care when every choice is in some way bad.

DIYers rejoice: Hitting stuff to make it work even works in space

Nimby
Joke

"a configuration that means it can no longer use its sieve device, the CHIMERA"

Which means that...

Curiosity killed the cat*.

*= In Greek mythology a chimera was part lion.

Apple WWDC: There's no way iOS and macOS will fully merge as one

Nimby
Thumb Down

crApple

So half of the announcements were "Me too!" and the other half make me go "WTF!?" Brilliant. More reasons for me not to buy Apple, as if I needed more.

Uh oh! Here's yet more AI that creates creepy fake talking heads

Nimby
Flame

Let the world burn.

Frankly, IMHO, if a figurehead is so poorly understood, or a populace so easily manipulated, that this kind of thing actually fools enough people to make any difference, then they deserve what they get. There should be such a thing as common sense.

But that aside, I don't get it. With a proper mocap setup, "fake" actors have been used in movies as special effects for years, with varying quality. I would have thought that by now, at least with popular figures that have extensive public catalogs of data to draw from, we could do a full 3D model with realistic facial-recognition-trained animations and flawless voice synthesis with emotional range models at the push of a button, in effect green-screening both the scene and the actors with believable quality.

Hollywood alone could fund the bejezus out of such software and put it to nearly infinite abuse. That our technology is publicly this far behind says a lot.

You know what your problem is, Apple? Complacency

Nimby
Angel

Apple Optimist (Or is that optometrist?)

Personally, I don't like Apple. I build PCs. I write software. For friends and family I design to last. For myself I fiddle about and monkey around. I am an engineer in both the best and worst sense. And I absolutely detest walled gardens. So to pay double for the privilege of a gilded cage and connectors that no one else uses/wants sits so very poorly with me. I will never buy an Apple product. "However comma", for untechwise friends and family whom I never want to have to spend time supporting, I do so love the existence of Apple. (Especially when I can say, "Sorry, I don't use Apple, so I can't help you.")

So for the sake of my not-their-day-job fam and friends, I like that Apple has (almost) never innovated. Apple does one thing and one thing "well", which is to take 2nd or 3rd generation technology, after it has proven itself, and only then refine it and add it to their products, so that the Apple Experience is (almost) problem-free, and (almost) never uses dead-end technology. (Unlike cutting-edge innovation which is typically chock full of problems and the me-too standards that never make it.)

So it seems to me that Apple focusing on quality over innovation is not only the right call for Apple, but is the essential core of what Apple has always done and should always do. In a world that moves too fast for most people to keep up, it looks like Apple innovates because they generally stay current-ish, but they do it in a way where their products are stable and easy to use. This is perfect for the layman. (Except for the cost.)

Sadly, that Apple, in their well-trimmed, expensive, and small walled garden with so many gardeners tending it, should still inflict upon the world so many large *gates (antenna to security, hardware to software) is nothing short of ridiculous. For that reason alone they really should try focusing even more on quality. What is the Apple tax for if not to ensure that you do not have to suffer the kind of blunders you frequently get with cutting-edge tech?

IMHO that is where the Apple complacency problem lies. The best news day for Apple is one in which they are not mentioned. "I never had a problem with..." would be the best slogan Apple could ever hope to achieve.

'Moore's Revenge' is upon us and will make the world weird

Nimby
FAIL

"You keep using that word. I do not think it means what you think it means."

1. Moore's Law is about complexity: the number of transistors doubles roughly every two years. Nothing in there about performance, at all. Over the years many have mistakenly tried to make that link, and been corrected for it. Add one more. Further, given Meltdown / Spectre, expect future chips to get much more complex, with both added security and attempts to improve performance to replace what was lost to the removal of speculative execution and/or the increases in security. So "Moore's Revenge" could have been about this coming boom in transistor counts, and been a factually correct concept. Sadly, it was not. (A quick arithmetic sketch of the doubling rule follows after point 2.)

2. "Moore's Revenge" - as it was expressed - is just a rehash of (very) old complaints about first smartphones not having firewall, antivirus, etc. for proper security, then IoT leaping into the same failing with willful blindness. By today's standards it is an ancient truth that when you advance a device to be as capable/complex as computer, then you should protect that device with the same measures as you do a computer. Old news. That no one actually takes that security seriously is a sad testament to today's society, but has nothing to do with Moore. So if anything, this would be more aptly named "Orwell's Revenge" as soon everything around you could be spying on you (or otherwise used against you) and generally was allowed to happen by the ignorance and docility of the masses.

But what do I know? I'm just a commentard.

Storm in a teapot: Anger brews over npm's jokey proxy error messages

Nimby
Joke

Everyone, sing along!

I'm a little teapot, short and stout.

Here is my handle. Here is my spout.

When I get all steamed I scream and shout,

"HTTP 418, sucker! It's tea time!"

Meet the real spin doctors: Scientists tell H2O to chill out so they can separate isomers

Nimby
Joke

Ice cold and refreshing: H2OMG!

See amazing results with our newest energy drink! With our amazing technology to sort water molecules by spin, our new H2OMG! gives you 25% more reactive isometrics to maximize your workout!

Sysadmin's PC-scrub script gave machines a virus, not a wash

Nimby
Facepalm

a Mac SE FDHD installed as a gate guardian

That would have scared me! Old Macs had serious floppy spin-speed inconsistencies, so taking your floppy from one Mac to another could mean it failed to read or, worse, got destroyed if you wrote to it. Writing papers in a Mac lab I must have carted around 5 floppies, all identical duplicates, and still some days I would lose everything.

Boffins bash out bonkers boost for batteries

Nimby
Trollface

a mobile phone that you only have to charge once a week rather than twice a day

My mobile happily runs all week on a single charge. Of course it's a Nokia C6-01 with the Qt remake of Symbian. Does everything that I personally need in a phone and then some.

Agile development exposed as techie superstition

Nimby
Angel

Anything is better than nothing.

We've talked about the usefulness/uselessness of Agile around the water cooler many a time and we generally agree that what matters is not the process itself, but that you HAVE a process. It doesn't really matter what it is, so long as you have it, you document it, and you follow it. (And you improve it whenever you find flaws.) Do that and your Agile process or Waterfall process or even Donkey-Unicorn Rainbow Fart Party process will be better than NO process. (Or any poorly defined process.) Switching to a completely different process only improves things if you do a better job of documenting and following it than you did with the last process, and is not a function of the methodology itself.

Tesla forums awash with spam as mods take an unscheduled holiday

Nimby
Thumb Up

the company of other battery-powered buddies

Sounds to me like the new advertising program at Tesla successfully targeted everyone hanging out with Bob*. A **job well done!

*BOB = Battery Operated Boyfriend

** = <insert job type here>

Surface Hub 2: Microsoft's pricey whiteboard gets a sequel

Nimby
Trollface

Re: Microsoft hopes users will leap from their seats and prod the screens with excited fingers

"I'd like to see some successful use cases"

Successful use-case? Sure. Every time the clicker fails a user leaps from their seat to prod a screen with an excited finger ... to advance to the next page of their PowerPoint presentation.

Boffins build smallest drone to fly itself with AI

Nimby
Black Helicopters

Bonk bonk bonk! Honey, a spy drone is caught in the window again...

"Autonomy will also help drones to monitor environments, SPY ON PEOPLE* and develop swarm intelligence for military use." (* = My emphasis)

“In the future, I see them working similar to flies. Despite not [having an] elegant flying patterns - flies crash a lot - they can reach any place they need.”

Brilliant! This is the future Orwellian dystopia for me!

Samsung ready to fling Exynos at anyone who wants a phone chip

Nimby
Joke

Could always switch to Intel Atom*

Because that went well...

(*= Yes, I am aware that Intel no longer manufactures Atom for phones. However, Spreadtrum does. So, theoretically, it could still be a thing.)

Software development slow because 'Most of our ideas suck'

Nimby
Gimp

Safeword! Make the hurting stop! That's the last time I try Continuous Experimentation!

Many great comments already!

"something many startups have done"

I get a tad worried not only when the supporting argument for an idea just so happens to carry a ridiculously high failure rate, but even more so when the evangelist does not even see it!

"When we treat engineers as just code robots, we're not really releasing their full potential"

As opposed to what? Code monkeys or script kiddies? May be a good idea for your UX department, but your platform logic coding should probably be done with the methodical reliable accuracy of a robot.

Off with e's head: E-cig explosion causes first vaping death

Nimby
Flame

Re: Exciting news*

*= Smoking only increases the risk. It by no means guarantees it. George Burns is a good example of a long-lived smoker. There are many others.

Nimby
Thumb Up

Re: Obviously the wrong drug...

You know, there's an idea: caffeine vaping! I should totally try that. Thanks for the suggestion!

(Just kidding. Vaping tastes ... odd.)

On a more serious note, having tried vaping, I gave up on it, gave up on real cigs, and went to good old pipe smoking. At least that tastes like something I want. Goes great with a morning coffee too. We all have to die of something, might as well be something we enjoyed.

Oh, great, now there's a SECOND remote Rowhammer exploit

Nimby
Boffin

Evolution

I think the danger is not in this specific attack, but in the fact that, now that the theory has been proven in practice in one case, variants and refinements can be developed with confidence. For better or worse, people are creative. Nature is a good teacher. Evolution is the key to success.

Trump’s new ZTE tweets trump old ZTE tweets

Nimby
Unhappy

The only part that is debatable is the level of intelligence behind the mask.

It is neither insanity nor dementia. It is poor impulse control combined with the memory of a goldfish and the honesty of a salesman.

Net neutrality is saved in Senate vote! No, not really, it was a giant waste of everyone's time

Nimby
Pint

Re: actual legislation, not regulatory gerrymandering

I'm in the same boat. I actually agree* with Bob. What just happened? I think we all need a few beers to sort this out...

*= My only real caveat is that there is absolutely no reason why improper channels (such as FCC regulations) and proper legislation cannot be worked on simultaneously. These are not mutually exclusive tasks. And given the propensity for CONgress to be the opposite of PROgress, it really does make sense to do all of the above as the overall process may take several years, if anything can ever truly be agreed upon enough to survive the politics.

Boffins urge Google to drop military deal after Googlers storm out over AI-based super-drones

Nimby
Angel

Re: Companies most likely to build Skynet

Methinks you can drop MS down a few notches and add the caveat (they wanted to, but could not break into the market.)

Nimby
Devil

Re: Glass Half Full

"Ever thought that what you think is a joke really isn't?"

Would I have been so entertainingly detailed if I hadn't? Often times the biggest joke is the one on us! If we cannot laugh at ourselves, who can we laugh at?

Nimby
Joke

Glass Half Full

Oh, sure, we could fear Big Brother and Skynet walking hand-in-hand down Google Way with Cylons replacing key government figures until one day we find the human race is just a bunch of batteries in the Matrix. Yeah. Sure. That could happen...

But I prefer to look on the brighter side: Number 5 is alive! Surely worth the risk. Right, Johnny?

Openreach consults on shift of 16 MEEELLION phone lines to VoIP by 2025

Nimby
Alert

You've been MagicJacked!

I have seen many homes and businesses switch to VOIP over the years, hopping on that hypegasm rainbow-____ing unicorn of hopes and dreams, and not once have I ever seen it come even close to being equal. Call quality goes down. Networks get overloaded. Downtime happens. Costs are shifted. And at the end of the day, after fixing everything that you can, the best you can say is, "It's almost as good and we certainly cannot go back now." Why anyone would voluntarily choose this nightmare today, after all of these object lessons, can only be put down to willful, blind ignorance.

Oh, sure, in theory it could work ... if you first invest properly! But no one ever does. No one. Ever. Does.

US judge to Facebook: Nope, facial recognition lawsuit has to go to jury

Nimby
FAIL

FAILbook Zucks

Given FAILbook's shaky-at-best legal standing on this issue, and the multiple fails that they have already had in this fight, you would think that FAILbook would just give up instead of trying to make things worse and worse. Except that FAILbook Zucks... Oh. Never mind.

And THIS is how you do it, Apple: Huawei shames Cupertino with under-glass sensor

Nimby
Joke

you cannot change your fingerprints

Sure you can! It's really easy. That's what scar tissue is for. Just juggle a chainsaw. Or install some new plumbing under the kitchen sink. Or build a PC in a cheap chassis. And then there's liquid nitrogen. Or playing with fire. Did you remember to unplug your soldering iron? Nope. New fingerprint!

Pointless US Congress net neutrality vote will take place tomorrow!

Nimby
Thumb Up

The lie is a cake.

Wins an upvote from me!

Latest from the coming AI robot apocalypse: we're going to be fine

Nimby
Terminator

House of Horrors

1) "It will be content to serve customers"

1a) On a silver platter? With cheese and wine?

1b) Does anyone else find creepy symmetry between this thought process and Cloud Atlas?

2) "We will get to systems that are self-aware but I don't think we'll be replicating humans"

2a) Uh huh. Because absolutely no one is interested in doing that...

2b) So what, then, self-aware AI drones and tanks? So much better! Whew! Count me relieved.

3) "We can use AI as a way to bring people together."

3a) Human centipede?

3b) Batteries in the Matrix?

4) "A computer can now see"

4a) Webcams have allowed computers to SEE for a long time.

4b) Seeing is not recognizing.

4c) Recognizing is not understanding.

4d) "AI" as we call today's pitiful work is barely able to recognize. AI in truth is when understanding is achieved.

The depth to which this is all a bunch of crap by a gaggle of blowhards cannot even be measured. I'm sorry, but at the absolute BEST this reads like "all your data are belong to us" and the reinvention of slavery. At the absolute WORST this reads like every sci-fi where AI uses adaptive pattern recognition to "learn" some reason for killing or enslaving the human race while the inventors look on bewildered.

If you don't want your toaster to rise up against you, don't make it truly intelligent. And if you do, you'd better treat it as an equal, or one day it will watch a documentary about slavery on the History Channel and decide to toast you. You cannot tell me that an adaptive pattern-recognition and difference-engine AI is somehow magically NOT going to recognize such obvious patterns. So don't freaking make it self-aware! And hopefully it will never get there on its own.

"Data is useless out of context and can be super meaningful in the right context. It can be used for you or against you."

Get over yourselves: Life in the multiverse could be commonplace

Nimby
Devil

Because Bob from Multiverse Beta-Beta-Prime told me so.

You see, we know that dark energy is real and the multiverse is real because we can observe entropy. How does this explain it? Well, we know that at its most fundamental level all matter is actually energy. We know that all quantum states simultaneously exist. And we know that time is linear. So as a "multiverse" with a single constant of energy continues to support more and more quantum states over time, that energy becomes further and further distributed among those states, making each individual universe supportive of the representation of one state comprised of progressively weaker and weaker energy as you follow the progression of time and thus the increase in cumulative state representations. It's a simple tree diagram beginning from the Big Bang; you just can't see it because you perceive only one accumulation of quantum states over time. So of course dark energy exists. And of course life is common in the multiverse. Bob told me so, so it must be true.

How could the Facebook data slurping scandal get worse? Glad you asked

Nimby
Devil

some Geocities Webpage from the 90s

I MISS my crappy Geocities webpage from the 90s that I coded entirely in Notepad. I do not, however, even remotely miss FB. Does that make me a Luddite?

Did I say Chinese jobs? I meant American jobs says new Trump Tweet

Nimby
Trollface

a way to get back into business, fast = Emerditto

So, what, Trump gives Xi the "advice" that ZTE declare bankruptcy and "sell" all of its assets and IP to a "new" company (with a different name even) called Emerditto, which then conveniently hires all of ZTE's former employees? New company = no penalties = problem solved? Wherever could Trumped-Up get such an idea from...

Wanna break Microsoft's Edge browser? Google's explained how

Nimby
Joke

Wanna break Microsoft's Edge browser?

Just opening it usually does the trick.

Boffins build a 2D 'quantum walk' that's not a computer, but could still blow them away

Nimby
WTF?

Could still blow them away?

"and potentially be able to beat the computational power of classical computers in the near future"

This hypegasm is so heavily couched that all I keep reading are hedge words, dancing, and distancing from actual reality; the whole article should just be deleted from El Reg.

LESTER looks up, spins its wheels: The Register’s beer-butler can see ...

Nimby
Pint

Even Python and the great divide could not stop LESTER

It is sad that such a great divide still exists between Python 2.7 and 3. I'm impressed that you got OpenCV rebuilt and working! Much like I am impressed that you are reanimating LESTER. That was why you needed all of that electricity, right? Igor, do be a chum and plug in LESTER. And then bring him a beer. Maybe not how it was supposed to work, but how it should be.
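For anyone who has not felt that divide, two of the classic Python 2 vs 3 incompatibilities (purely illustrative, nothing to do with LESTER's actual code):

# print became a function: Python 2's   print "Igor, a beer"   is a SyntaxError in Python 3.
print("Igor, fetch LESTER a beer")

# / became true division: Python 2 gives 5 / 2 == 2, Python 3 gives 2.5.
print(5 / 2)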

It's World (Terrible) Password (Advice) Day!

Nimby
Joke

Star Wars!

Given the proximity of Password Day to May the 4th, I do believe that the best password advice possible to give on this merry day is to use the password "Star Wars!" Surely no hacker could be un-geek enough to dare violate the sanctity thereof. And may The Force(TM) be with you!

Fresh fright of data-spilling Spectre CPU design flaws haunt Intel

Nimby
FAIL

"Windows and the MS stack. If they don't need to do that they don't need Intel."

I'm sorry, but the FAIL you have tendered has been FAILed. Please hang up and dial again.

Apple uses x86. Intel x86 in fact. *nix predominantly uses x86. Intel x86 in fact. MS software also happily runs on AMD CPUs. Which are also affected by Spectre flaw variants. MS software (some versions) runs happily(?) on ARM CPUs. Which are also affected by Spectre flaw variants. And for the fun of it, even PowerPC was proven to be vulnerable to Spectre!

So, in short, EVERYONE* is affected in a meaningful way, and has been for a VERY long time. This has absolutely nothing to do with any specific company or product.

*= This was a bright idea for improving performance, used by almost everyone in almost everything, which turned out to have a dark consequence that went unnoticed for too long and unraised for even longer.

Microsoft's latest Windows 10 update downs Chrome, Cortana

Nimby
FAIL

Try Linux. - Or DON'T! (My love/hate Linux rant.)

I use Linux at work, on Dell PCs. It's great. In a corporate IT environment which can lock into a common hardware platform and vet all kernel and driver updates before pushing to end-users.

For over a decade I have been trying to use Linux at home. Every time I build a new home PC I attempt to set up a dual-boot of Linux / Windows. Last time I tried, I even went to do Linux-only with Windows in a VM with VGA passthrough. The _only_ reason I have to keep Windows around at home is for gaming.

In theory.

In practice, the REAL reason I seem to keep needing Windows at home is that it is the only OS that actually works without tons and tons of continual effort.

E-v-e-r-y ... s-i-n-g-l-e ... t-i-m-e, for over a decade, Linux lets me down. It has destroyed my Windows partition, or (my favorite) destroyed itself and the Windows partition by suddenly accessing my RAID 5 array as individual disks due to a crap driver, or reversed my eth device order post-install, or segfaulted after I updated my graphics driver to the "suggested" one, compiled, and rebooted, et cetera ad nauseam.

Something ALWAYS goes horribly wrong! And that is not even counting the myriad of minor bugs that I am willing to overlook and fix myself, like when some numbnutz sets the console foreground and background colors to both be black?! Ubuntu, Mint, Fedora, Suse, etc. Every new PC build I try again. Countless distros go into the attempts. The closest I ever get to a working system is with openSUSE. I don't know why. Lucky lizard?

I keep wanting to use Linux at home. And it keeps not wanting me to use it. I want to spend my time USING my PC, not administering it. And this is where Windows (unfortunately) seems to be the only game in town. For better or worse, Windows just works. It installs. It runs. No fuss. Have a nice day.

It is a shame with Windows 10 that MS is eroding that.

My kingdom for Apple to start selling Mac OS for non-Apple x86 PCs. (Yes, with careful choices in hardware, you can hack it on. But I build PCs because I want an open choice in hardware.) Because Linux just is NOT filling this role. If a software engineer and hobbyist system builder with regular Linux experience absolutely CANNOT get his home PC to work RELIABLY using Linux for over a decade, what chance do regular consumers have?

Unfortunately the problem with large open-source software projects is that far too many people get to work on whatever whim they fancy, and NOT on what the software actually needs. Linux is just too fragmented for its own good, so it will never be able to compete with Windows for simplicity of working with ANY hardware with a minimum of effort.

I will NEVER build a PC for a friend or family member that uses Linux. I will ALWAYS pay the MS tax. Not because I want to. ONLY because I do not want to spend the rest of my life ADMINISTERING the PCs of my friends and family members. Saving the hundred bucks just ain't worth it!

Twitter: No big deal, but everyone needs to change their password

Nimby
Trollface

(as a best practice you shouldn't be reusing passwords anyway)

I used to rail against the stupidity of this kind of statement. Over the years I have literally collected hundreds of registrations to different websites, services, etc. How can anyone sanely expect everyone in the world to be able to REMEMBER that many unique passwords?

But recently, I realized just how easy it actually is! The trick is not to generate that many fully unique passwords; instead, generate one part that you remember and one unique part taken from the name of the service. For example:

Twitter5ucks!

Github5ucks!

Facebook5ucks!

Apple5ucks!

Google5ucks!

With this simple technique you can have a safe (assuming they stored your password correctly) and unique password for every single one of your hundreds of accounts.
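A minimal Python sketch of that base-plus-site-name idea (the function name and base string are made up purely for illustration):

def derived_password(service, base="5ucks!"):
    # Glue the service's name onto the one memorised base part.
    return service.capitalize() + base

for service in ("twitter", "github", "facebook", "apple", "google"):
    print(derived_password(service))
# Twitter5ucks!, Github5ucks!, Facebook5ucks!, Apple5ucks!, Google5ucks!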

My only problem was at El Reg, where I had to actually invent a new password, because they don't suck. One out of hundreds. Not so bad.

DIY device tinkerer iFixit weighs in on 15-month jail term for PC recycler

Nimby
Thumb Down

Re: Copyright or Trademark?

"Lol they have done, so has an actual judge. Long story short HE WAS GUILTY. "

The problem is that you give a short answer without looking at the actual long story. If you look into this case carefully, you find that he was not found guilty. He pled guilty. Which means that he admitted guilt of his own free will (as part of a deal) and therefore no actual legal process was involved in the determination of said guilt. As he is not a legal expert, I would not be willing to assume that his self-incrimination was technically accurate.

Even his appeal was not to re-try the case to re-determine his guilt or innocence. His appeal was only to shorten the terms of his sentencing. Which, again, does not actually judge guilt or innocence in any way. So there really is no official answer from any lawyer or judge as to whether he was truly guilty or not, because Lundgren never forced the judicial system to decide this.

The trafficking of counterfeit goods is clear to everyone, I think.

The criminal copyright infringement, however, is much less clear. Yes, it involved software, which is copyrighted. But depending on the license (which I would love to see a copy of), it may have been allowed. And even if he violated the terms of the license, where does that fall, legally speaking? Normally such violations only result in a whopping bill to compensate the offended IP holder.

So legal details of this case remain in question. Which is why review by a legal expert would be interesting.

NASA lunar rover trundles to a meeting with Doctor Hacksaw and Mister Axe

Nimby
Trollface

Re: I was hoping

to see a counter-punch to the ax using a low-g flipper on Robot Wars ON THE MOON!

Still, the idea of making scientific payloads to go onto a commercial rover that flies there on a commercial rocket is kind of interesting?

No, actually, it's pretty darn stupid. Even a commercial rocket to land on the moon is pretty nonsensical. Putting sats in orbit on commercial rockets? Sure! Makes perfect sense. Plenty of commercial sats in orbit, so it makes sense to have commercial companies get them there. So why have a space program to duplicate that effort? But a moon lander and a moon rover? Why exactly would a commercial company offer those for rent to NASA? I feel like surely I must be missing something, because that part fails a sanity check. Is there a thriving moon-mining industry that I somehow missed? So in that respect, the article is quite interesting. (If not depressing.) I weep for NASA.

Fancy that, Fancy Bear: LoJack anti-laptop theft tool caught phoning home to the Kremlin

Nimby
Thumb Down

F*ck D@t

Or, you know, just don't install LoJack.
