* Posts by Lou Gosselin

487 publicly visible posts • joined 1 May 2007


Gates predicts 'significant' US recession

Lou Gosselin

Unemployment figures

The reg should have pointed out that it is widely accepted that the official US unemployment figures quoted from the BLS are completely misleading at face value. I'm sure some of the audience may not be aware of this.

The 6% figure represents those who have been laid off in the past 6 months, qualify for, and are receiving government unemployment checks. If one takes any job, even fixing the neighbor's computer for pay or selling something on ebay, then one loses unemployed status (at least if the income is reported). Most of those who've been fired, left voluntarily, just graduated, finished a contract, etc. do not qualify for government unemployment and are not counted as unemployed, even when in fact they are.

What this means is that official unemployment figures are greatly under-reported, some say by as much as 100%; 10% is closer to the truth than the 6% quoted in this article.

As for Bill Gates, it's a little late in the game to be making novel claims.

Google's IP anonymization fails to anonymize

Lou Gosselin

Re: Who are they trying to kid?

If one does erase cookies, the old and new cookies could be associated by IP address, browser fingerprints, and common login ids. Knowing google, these associations are probably recorded and will not be included in the scrubbing.

For google's claims to be genuine, they must not only wipe the IP address but also wipe all records generated by IP address matching algorithms. Otherwise this is nothing more than a publicity stunt.
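
To make the point concrete, here's a toy sketch (cookie ids and log entries entirely made up) of how a previously derived association table survives an IP-only scrub:

    # Hypothetical: the raw logs get their IP column wiped, but this
    # derived table, built beforehand, still links old and new cookies.
    old_logs = [("cookie_A", "203.0.113.5")]
    new_logs = [("cookie_B", "203.0.113.5")]

    associations = {(a, b) for a, ip1 in old_logs
                           for b, ip2 in new_logs if ip1 == ip2}
    print(associations)  # {('cookie_A', 'cookie_B')} -- outlives the scrub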

adnim is right, any average sysadmin could figure out how to remove the IP address data. I wouldn't be surprised if google's delay is due to them tweaking the stored data such that they will still be able to follow the trails even after removing IP addresses.

Does anyone know how long google plans on keeping the data without IP addresses?

Exploit code targets Mac OS X, iTunes, Java, Winzip...

Lou Gosselin

Of course, it's http

All http traffic should always be considered insecure as it is all vulnerable to man in the middle attacks.

Any updater that uses unvalidated binaries from http will be vulnerable. Though the developers can be accused of bad decisions in choosing to rely on plain http in the first place, the vulnerability doesn't technically lie with the updater per se.

Even if the updater used https or other key management, an initial download over http would still be potentially suspect. And virtually all downloads these days are over plain http.
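
At a minimum an updater could refuse binaries that fail an integrity check against a digest shipped over a trusted channel (baked into the installer, say). A sketch, with a placeholder digest:

    import hashlib

    # Placeholder value; the vendor would have to publish the real digest
    # somewhere a man in the middle cannot rewrite it.
    EXPECTED_SHA256 = "0" * 64

    def verify_update(path):
        digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
        if digest != EXPECTED_SHA256:
            raise ValueError("update failed integrity check; possible MITM")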

Boeing to build combo airship-copter flying cranes

Lou Gosselin

Here's an obvious thought

Surprised that no one has mentioned it yet.

Since the problems occur mostly during dynamic loads (at the construction site), why not tie the thing down to the ground near the site? With a few cables, the blimp could be supported by ground vehicles or tankers, which could also provide electrical power and/or gas for fuel and buoyancy.

Time to dismount the hamster security wheel of pain

Lou Gosselin

Re: It is an infomercial

>But the truth of the matter is, it is really hard to sell security to a client, they just don't want to pay much for it. Unless they have just had a compromise, it is so far off the radar for them.

I take issue with this statement only because security shouldn't be an afterthought, especially when some of the vulnerabilities are so trivial to avoid by being consistent in the first place. Encoding fields may require a little more typing, but it hardly requires any additional thought. Unless one is looking to get a new contract to fix one's own bugs, the cost really should not factor in.

>I do sort of agree with the idea though, but the problem is not the coders, it is not the developers, it is the clients.

We disagree here as well; bad security is usually caused by incompetent developers. Some of the glaring holes are unbelievable. Why did these people get hired? They should have known better. Even experienced developers might not be aware that storing database connection strings in viewstate is evil for the same reason that passing security credentials in the URL is. Homegrown string encryption is very common and, as stated in the article, is usually flawed.

Admittedly some attacks are quite involved, intricate, even clever. It's just disappointing to see so many IT shops failing at such a basic level. If the jobs only go to shops with the lowest bid, there is really no incentive to improve.

Lou Gosselin

Security problems caused by clueless programmers

There isn't always much we can do unless security is a project in and of itself. Every faulty line of code needs to be fixed and regression tested. I have even had a client ask me to revert back to vulnerable code since their clients depended on it. I can hear it: "That's a feature not a bug".

Some of the most notorious attacks such as cross site scripting and SQL injection attacks are trivial to fix: convert all data strings into the encoding used by the medium. This shouldn't be more than a few lines in any modern language, and that's only if the escape/encode functions are not already defined by the language.
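
To illustrate, a minimal Python sketch of "encode for the medium", with an HTML page and a SQL database as the two media:

    import html
    import sqlite3

    user_input = "O'Brien <script>alert(1)</script>"

    # HTML context: encode the string rather than rejecting it.
    safe_html = html.escape(user_input)  # renders harmlessly as text

    # SQL context: let the driver quote it via a bound parameter.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES (?)", (user_input,))

Note the apostrophe and the angle brackets go through untouched; only the representation changes per medium.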

Problem is, amateur programmers tend to make the same mistake hundreds of times, and by the time they're discovered, the damage is done and the product is already in production.

And then when it's time to fix the vulnerabilities, people go about solving it the wrong way, like rejecting the data (via regex or other means) instead of properly encoding the fields. Even Asp.Net pages display a server error when they see a less-than sign. Why *can't* I use "<" in a text box? And why *can't* I use an apostrophe in that name field? Many names have an apostrophe. Should we do away with these keys on the keyboard since developers don't have a clue how to handle them properly?

The problem is, employers just think about cheap labor and not so much about quality when choosing among candidates. Until that changes, the vulnerabilities will remain common.

Fellow from AMD ridicules Cell as accelerator weakling

Lou Gosselin

Too many assumptions, not enough understanding

It sounds like many commenters are making too many assumptions about the nature of parallel programming. In reality many problems are serial in nature, in that they require the output of one step before starting the next. Parallelism WILL NOT HELP in these cases.

Specialized processors are faster not (necessarily) because they run better/parallel algorithms, but because they can eliminate the millions of gates required to make a CPU generic, and therefore shorten the electrons' path from the input pins to the output pins.

I believe that Moore's law will continue for some time to produce results by converting software into hardware. Even without any parallelism, optimal graphics chips would be faster than any software running on optimal generic hardware. They would be far more energy efficient as well. The problem is that hardware is expensive and difficult to build compared to software.

The solution in my opinion is Field Programmable Gate Array (FPGA) technology. This might make it possible to virtually construct specialized processors on the fly. A game requires an advanced AI processor? Flip a few million bits in the dynamic FPGA and voila! Now your software is running almost literally at the transistor level. Current FPGA technologies that I am aware of need to improve by two or three orders of magnitude to be able to represent larger problem sets, but there is a lot of potential.

Now back to parallel programming. For those problems that CAN be solved by a divide and conquer approach, simply multiplying the hardware (no matter how inefficient it may be) could significantly benefit the computational ability. This is especially true in cases where one processing unit has very little need to communicate with other processing units.

However we must watch out for problems where each processing unit requires data and synchronization from other units before every parallel processing step. This data must travel over a slow network between units. For this reason an optimal serial algorithm can sometimes run faster than an optimal parallel one.
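
A back-of-the-envelope sketch (the serial fraction and per-unit sync cost below are invented, illustrative numbers) shows how synchronization overhead eventually erases the gains:

    # Amdahl's law with a crude linear synchronization penalty.
    def speedup(n_units, serial_fraction=0.2, sync_cost=0.01):
        parallel_time = serial_fraction + (1 - serial_fraction) / n_units
        return 1.0 / (parallel_time + sync_cost * n_units)

    for n in (1, 2, 8, 32, 128):
        print(n, round(speedup(n), 2))
    # Speedup peaks and then *falls* as synchronization dominates.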

Shared memory architectures can help mitigate IO overhead, but shared memory is not scalable: every doubling of units requires more transistors and a longer data path between each processor and its memory, not to mention bus contention, caching issues, and exponential expense.

That is not to say that parallel clusters are not useful. I think they are cool, but they're not the magic bullet some people are claiming them to be. That said, the industry does need better tools in order to handle the hurdles between serial and parallel programming.

World realizes Google home page is 'illegal'

Lou Gosselin

google innovation=ms innovation=hardly

"Google maps first (first pan and zoom + satellite + street view) Show me another."

Actually the first poster was right, google didn't invent google maps, it just pulled a microsoft: it rebranded an acquisition and removed the original product from the market.

Google bought out "KeyHole" which as I recall had a product developed by NVidia which did almost exactly what google maps does today.

I remember seeing the map panning/zooming on a news station covering some war news (I think this was back in '02). The graphics displayed had the KeyHole logo, which I looked up and was impressed to see that KeyHole had free trial accounts with which anyone could start exploring the earth from above.

And, though I admit they were a bit later, tools such as mapquest do aerial imagery as well as google.

It may be tempting for some to give google the credit for all "its technologies", however a closer look will probably reveal that they only deserve credit for popularizing the technology and not developing it.

In case you are wondering, I dislike google because 1) I hate ads, 2) I like my privacy, both of which are exploited by their business model. Before anyone replies "if you don't like it, then go somewhere else", well, I am quite content using non-google services thank you very much.

I just wish it was easier to remove all the google search backdoors in firefox (see about:config).

Microsoft seeds HP PCs with Live Search

Lou Gosselin

How google advertises.

I don't like underhanded moves by MS or others, however it is not surprising that computers come pre-loaded with sponsors' services. It's not actually so bad as long as knowledgeable users CAN change those defaults easily.

Have any of you noticed how difficult it is to remove google.com from firefox? The average user would be baffled to find google.com still coming up for no apparent reason, because google is guilty of coding several search backdoors into firefox. Try any of the following with firefox 2 and up to prove it to yourselves:

Try changing the quick search to yahoo or hotbot or whatever, and you'll notice that bad domains still go to google, random text in the url box goes to google, and setting no home page and hitting the home button goes to google. A power user can use the hidden about:config settings to remove at least some of the back doors on a per-user basis (see the sketch below). To remove all the google backdoors requires recompiling the code or remapping google.com via the hosts file to the search engine of choice, neither of which is reasonable.
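
For the per-user route, a user.js along these lines changes the obvious prefs (names recalled from memory for the firefox 2 era; verify them in about:config before trusting me):

    // user.js sketch - swap the bundled google defaults
    user_pref("keyword.URL", "http://search.yahoo.com/search?p=");  // url box text
    user_pref("browser.search.defaultenginename", "Yahoo");         // quick search
    user_pref("browser.startup.homepage", "about:blank");           // home button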

It's really a shame that firefox has become corrupted in this commercial way; did anyone really believe that google was donating money out of generosity? Nope, this is how *google* advertises.

Control your PC, with a lemon wedge

Lou Gosselin

Prior Art

Wow, this looks exactly like a college project I did about 7 years ago for a computer vision class. I was able to use colored objects to control the cursor on the screen in real time. At the time I had a PII 266MHz with a Matrox video card capturing frames at 30fps. Another effect was holding objects in front of the camera and displaying them on screen as sparklers. The work and algorithms were presented in class. I remember the professor trying to discourage me from the project because he thought it was too ambitious and would fail. It wasn't as difficult as he suggested.
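
For the curious, the core of the color-tracking idea now fits in a few lines of Python/OpenCV (nothing like my original Matrox pipeline, and the HSV bounds are ones you'd tune to your own marker):

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Threshold for a bright green object.
        mask = cv2.inRange(hsv, np.array([45, 100, 100]), np.array([75, 255, 255]))
        m = cv2.moments(mask)
        if m["m00"] > 0:
            x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]  # blob centroid
            print("cursor ->", int(x), int(y))  # feed to a pointer API
        if cv2.waitKey(1) == 27:  # Esc quits
            break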

Out of curiosity I tried to locate the referenced patent but didn't find it. Does anyone know anything more about it? US or Euro?

I probably could have filed this same patent, as I did the work. However, even at that time, I seriously doubt I was the first to think of it. It's one of the reasons I believe all software patents are evil.

How scanners and PCs will choose London's mayor

Lou Gosselin

Solutions

> "...a hidden piece of code to be activated, ..."

> Why is the software not open-source, in that case?

Two problems with this. Just because the manufacturer opens the source doesn't actually mean the machine is running that source.

Secondly, the manufacturer claims the machines would not be reprogrammable, so there may not be a way for a third party to verify the code on the machines.

I still agree they should demand open source; at least people will catch genuine coding errors. Better than security by obscurity.

I've long thought that all these issues could be solved the "NASA" way - tons of redundancy. Why not get 3 different vendor implementations to count the ballots?

All the votes would be counted by machine A, then go to B for recount, then C for another. Then look for statistical discrepancies and investigate those. You'll quickly find out which systems have bias - deliberate or accidental.

If the ballots are stamped with serial number before going into machines, then it would be possible to track exactly which ballots were disputed.

Heck, it would be possible to have one scanning machine that is incapable of anything but scanning. All the images would go into a feed bound for each vendor's software. This would bring down the cost of duplicating the scanning hardware between vendors.
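
A toy sketch of the discrepancy check (ballot serials and readings invented):

    # serial number -> reading reported by each vendor's software
    readings = {
        "B0001": {"A": "smith", "B": "smith", "C": "smith"},
        "B0002": {"A": "jones", "B": "smith", "C": "jones"},
    }

    disputed = {serial: votes for serial, votes in readings.items()
                if len(set(votes.values())) > 1}
    print(disputed)  # only B0002 needs human investigation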

Why is this so hard to solve? Most of us have to solve much more difficult problems on a daily basis.

Build a 14.5 watt data center in a shoebox

Lou Gosselin

@Any of those does any kind of encryption

Any of the solutions mentioned should handle encryption if it is possible to upgrade to an unlocked linux kernel; then you could set up the file system any way you like. None of these are encrypted out of the box, however. In my opinion client-side encryption would be more beneficial: the server could hold the encrypted files without a key.
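
For instance, a minimal client-side sketch with the Python cryptography package (assuming you can get it onto the client; the NAS then only ever sees ciphertext):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # stays on the client, never on the server
    f = Fernet(key)
    ciphertext = f.encrypt(b"contents of a sensitive file")
    # upload `ciphertext` to the NAS; decrypt locally with f.decrypt(...)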

Remember NAS devices are not created equally:

Linksys NSLU2 = 133MHz ARM, 32MB

WD myBook = 200MHz ARM, ?MB

Linkstation HGLAN = 266MHz PPC, 128MB (faster than the ARM)

Linkstation Pro/Live 1 & 2 = 400MHz ARM9, 128MB

Lou Gosselin

They are awesome toys and tools.

"The LinkStation Mini uses a pair of 5,400RPM 2.5 inch notebook drives to perform its magic, making it the only Buffalo storage unit not to run on SATA drives"

Not true: many of the kuroboxes support PATA/IDE, as does the previous line of Linkstations (HGLAN), which ran on PPC. I have no preference between SATA and PATA, as neither is the bottleneck.

Unfortunately the newer ARM9 models running at 400MHz are much slower at applications than the older 266MHz PPC models. Nevertheless, I have mine running Apache, SSH, Asterisk (the Callweaver branch), OpenVPN, and Samba for file sharing. When I only had one Linkstation, I connected USB hard disks to it for backup; now I have an rsync job between multiple Linkstations.
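
The rsync job is a one-line crontab entry (paths and hostname are from my setup, adjust to taste):

    # nightly mirror at 03:00
    0 3 * * * rsync -az --delete /mnt/share/ backup-linkstation:/mnt/share/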

I have looked at many products for low end server platforms for running trivial daemons on a LAN. I bought the Linkstations explicitly to install Debian. In my opinion the Linkstations are the best value I've seen for this purpose.

For instance the Linksys NSLU units mentioned above only have 32MB of RAM, while most Linkstations have 128MB. It might not affect their intended purpose, but when you configure a GraphicsMagick thumbnail cron job, that RAM helps.

I have heard bad things about Buffalo's customer service, and unlocking them isn't "supported", so it's not for those who don't know what they're doing. That said, the unlocked Linkstations are so cool. Check out linkstationwiki.net for ideas/code.

Data pimping catches ISP on the hop

Lou Gosselin

The ISP Trojan Horse

Doesn't this resemble Jim Carrey's character in Batman Forever? "I'm sucking everyone's data". Cue maniacal laugh.

I really hope all the ISPs involved get in big trouble for this, who thought it was a good idea?

American ISPs already sharing data with outside ad firms

Lou Gosselin

This has gone too far.

As a professional, I had absolutely no idea that any traffic monitoring was occurring in the US. 10% is a staggering number if true. I really am not sure if my own data has been sold now. Is there a list of these ISPs somewhere?

I know very well that the "there is no privacy issue" arguments being pushed are total nonsense. That is what makes this infuriating. Both the user, and don't forget the web site owners too, have a reasonable expectation that the data between the browser and the web site is not being routinely sold to 3rd parties.

This is very analogous to AT&T monitoring our telephone calls and saying "don't worry we've removed identifying information".

Whether or not they claim it is personally identifiable is irrelevant.

1. It isn't always possible to remove identifying information. Even hashed data can point a finger: if the data rules out everyone else, it was obviously you.

2. My data is mine with or without identifying information. They claim the user is acknowledging their "services"; I guess that's one thing. However I suspect many users are being duped into monitoring and are unaware.

3. Web site owners have their own rights. The users themselves may not have the right to have this traffic monitored even if they approve. The monitoring and analyzing of copyrighted material directly leading to profit could breach US copyright law.

To expand on point 3 - if the user downloads GPL content (for example), then any permutations of that code must be published. In analyzing traffic, the ad agencies must build a database to analyze the content and choose ads. The information contained in this DB is directly derived from the copyrighted material (if the content were to change, so would the database). The DB may add knowledge, but it is still an extension of the original work even if its ultimate purpose has changed. Under the terms of the GPL, this information must be freely available, thereby breaking the business model of selling it.

Some readers may find my argument far-fetched. I'd like to hear other opinions.

New banking code cracks down on out-of-date software

Lou Gosselin

Full security, clarify patents

I feel I must pre-emptively clarify my stance on patents.

A company should make a profit for building security devices, after all the existence of safe (hopefully open source) security devices *is* a public benefit. They should not make a profit off of the mere idea and algorithms of security devices, which is what such a patent would grant to patent trolls.

Some people may complain that it is impossible to compete/innovate without patents. However, without patents, nothing stops them from building something better than the competition - actually building on their competitors' ideas without worry of a patent lawsuit. This would be a huge boost to innovation and to the public.

Lou Gosselin

Full security

The one-time keys + password might be able to prove to the secure site (bank) that the user is who he says he is and has the FOB. However, as mentioned by Anton Ivanov, an attack would still be possible using a man-in-the-middle or by hijacking the authenticated session by some other method.

Obviously the man in the middle could have his own valid SSL cert, so the user sees the "key" and thinks he's secure even if it is the wrong site. This even happens with legitimate banks that use multiple domains across subsystems; some pop-up windows may even hide the address bar completely.

Or worse, if the client desktop has already been infected, then anything is possible, as the machine can show the user what he wants to see while doing something else. Even a compromised *user* account is powerful, because it can fool the user into providing root credentials by displaying a false login screen; I bet even most pros would be caught off guard.

There is a physical solution to all these things without resorting to out-of-band communication. The FOB's weakness is that it can be used to authorize *any* action, even if not intentional. A better FOB would have a small LCD display so the user could review the transaction(s) on it and then click 'Approve'. The device would then sign the transaction (or a hash of it), and this signature would be verified by the bank. So long as this "digital signing FOB" is kept physically secure and cannot reprogram itself, so that it is not digitally exploitable, the user has very high confidence that the signature can only be used to approve a specific series of transactions.
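
In sketch form (symmetric signing for brevity; the secret and the transaction format are hypothetical):

    import hashlib
    import hmac

    fob_secret = b"per-device secret provisioned by the bank"

    def approve(transaction: str) -> str:
        # User reads `transaction` on the LCD, presses Approve, device signs.
        return hmac.new(fob_secret, transaction.encode(),
                        hashlib.sha256).hexdigest()

    sig = approve("pay 250.00 GBP to ACME Ltd, ref 1234")
    # The bank recomputes the HMAC; a middleman cannot alter the payee or
    # amount without invalidating the signature.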

Such a device would be used at the *end* of a session/transaction instead of at the beginning. An attacker / middle man could go through the steps, but the user would have to approve the transaction. If he doesn't look at it, then it really ought to be his fault.

I wasn't going to write this in, seeing as this thread already has many comments. However, I am worried about someone patenting/profiting from the idea instead of it benefiting the public. So hopefully this thread can be used to claim prior art on the idea.

How an app called WarmTouch nailed a grenade-stockpiling cyber extortionist

Lou Gosselin

An unusual suspect...

I wonder what it would say about amanfromMars?

Creative threatens developer over home-brewed Vista drivers

Lou Gosselin

He probably did not write his own drivers...

I am *guessing* that this developer reverse engineered the drivers (nothing illegal btw), and then patched the drivers with bug fixes. Then he distributed the modified drivers, against creative's copyright license agreement. As opposed to say, releasing his own proper drivers for the sound card.

If it really is the case that this developer released his own "home brewed" drivers as the article implies/states, then creative would not have a leg to stand on and their actions are shameful. This goes much further than not publicizing the interface specs; it is actively preventing others from interfacing with the device using their own implementations (think linux drivers). In which case creative really deserves a beating here.

Asterisk mauled by buffer overflow bug

Lou Gosselin

VOIP via SIP

SIP was designed to be very flexible, and can handle much more than just voice.

In theory it's a great protocol, as the actual RTP data traffic only needs to travel between the active parties, and not through a central server as required by other protocols (IAX). In practice SIP has a lot of faults.

In practice SIP fails to function natively with the most popular/common NAT routers found in homes.

This is due to SIP/RTP's need for a huge range of incoming ports. Some applications can attempt to use an external STUN service (which is not part of the SIP protocol standard) to learn external IP addresses and ports, but this isn't 100% reliable even when configured properly.

More likely someone needs to configure the router to map ports internally. Even this isn't straightforward, because the SIP packets themselves need to contain the public IP and ports, which will not match the IP and ports the device is using internally, so sometimes the public IP address is statically configured in the VOIP device. Alternatively, the other end often ignores the IP address in the data portion of the SIP packet and uses the IP address from the IP stack, thus reducing the flexibility SIP was designed to offer.
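
To see why, this is roughly what the SDP body of a SIP INVITE carries (addresses and ports made up):

    v=0
    o=alice 2890844526 2890844526 IN IP4 192.168.1.10
    c=IN IP4 192.168.1.10    <-- private address, useless to the far end
    m=audio 49170 RTP/AVP 0  <-- RTP port the router never mapped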

SIP is transmitted in plain text, which is bad for security, and consumes more bandwidth.

All these complexities make it difficult to roll out SIP securely as many are simply added to the router's DMZ.

There is no question about it, SIP will be the target of numerous attacks once hackers become motivated.

As much as I like to support open protocols, I think SIP users will continue to suffer from its design problems.

Russian serfs paid $3 a day to break CAPTCHAs

Lou Gosselin

Captchas are already being broken

Captchas are at best a temporary solution to a long term problem.

I've followed this topic for a long time.

Do a search and find hundreds of links about breaking captchas.

Some engines are generic enough to work with many captcha generators.

http://www.cs.sfu.ca/~mori/research/gimpy/

http://www.botmaster.net/pictocod/

To respond to this threat, sites make the captchas more and more difficult to recognize, to the point where I personally get them wrong. I imagine an automated program may have a better chance at solving some of these than I do.

In any case it was long speculated that it would always be possible to pay people to manually solve the captchas.

I would like to see more effort to block spammers at the point of origination.

For instance, apply spam heuristics on a sender's email and take action earlier.

If thousands of accounts are sending permutations of the same email, then there is a very high chance that it is spam.
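
A naive sketch of catching such permutations (the threshold is invented, and real systems would use smarter fingerprints than pairwise diffing):

    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, a, b).ratio()

    sent = "Buy cheap meds now, click hxxp://spam.example/1"
    new = "Buy cheap meds today, click hxxp://spam.example/7"
    if similarity(sent, new) > 0.8:
        print("likely the same campaign; throttle or challenge the sender")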

Google claims 'non-existent' Android beats everything but the Jesus Phone

Lou Gosselin

Re: Java already won on mobiles.

Agreed, Google is just forking java binaries, making them incompatible for no obvious reason (other than taking control). I'm sure the APIs will also be incompatible at the source level. All in all it's additional fragmentation.

That said, I am very aggravated by how manufacturers/operators keep the platforms completely locked down. "Yes, MIDP can do that, but only if you can obtain the manufacturer's digital code signing certificates". On many handsets it isn't even possible to access the full API on one's *own* device without such a certificate, not even for debugging. For instance on Nokia Series 40 handsets, the bluetooth RFCOMM API is "supported" but remains inaccessible to developers like myself, meaning real developers have to resort to ugly hacks - such as opening communications channels through the file system.

So while fragmentation is evil, I hope that the additional competition forces the entire market to open up, because the status-quo is completely anti-innovation.

Tool makes mincemeat of Windows passwords

Lou Gosselin

re: Hardware/specification/dma/busmastering fault? Rubbish!

>Yes, Lou Gosselin, it is exactly like "installing an externally readable memory probe". So is plugging in a PCI card.

Obviously PCI cards have access to the bus; this should not surprise or disappoint anyone. DMA is not intrinsically faulty: sound cards use DMA, so do network cards, and nobody expects these to be vulnerable. These devices are given memory addresses by the CPU and NOT by the external cable. The problem is that the Firewire architecture is apparently designed to allow Firewire nodes to make memory requests directly, exposing DMA to the outside. This design has significant ramifications for security beyond PCI bus sniffers.

So the comparison between Firewire (an external bus) and PCI (an internal bus) is not really fair. As for USB, it doesn't work like Firewire does: devices do not instruct the USB host controller as to where to read/write memory; DMA is used as intended and the range is determined by the OS, not the device. You may have a valid point with PCMCIA on laptops, although that's a different story.

Even if it is possible to disable the flaw in software, some Firewire devices will cease to function, because this is how they work by design. The article itself said that certain device profiles open up the hole.

That being said, it is very likely that many USB (and Firewire) drivers have buffer overflow bugs which could be exploited without this design flaw. Unlike the flaw, these would have to be OS specific and could be fixed once discovered.

Lou Gosselin

That really sucks

So basically, it seems in this case that installing a Firewire port is equivalent to installing an externally readable (writable?) memory probe. And then somehow blaming MS/Apple for it... Hmm, yes Apple...

Software clearly isn't the culprit here, neither are drivers. This is a hardware glitch (aka feature).

No, the fault lies with the designers of Firewire. There is no justification for allowing a Firewire device to probe memory beyond the range given to it by the OS (possibly for debugging, but then it should default to off). All other DMA devices must be told by the OS where to transfer data to.

This is such poor design that I wouldn't be surprised if some fraction of Firewire ports already do not adhere to the spec (and are not vulnerable).

Google mounts Chewbacca defense in EU privacy debate

Lou Gosselin

Some day ISPs will sell *dynamic* ip addresses, seriously

The fact is that the IP address is the most effective UID for the household.

It's more effective than cookies, since those can be deleted and can't track users with more than one computer. The two in conjunction reveal even more information and become more reliable.

All web sites are capable of this and it is nothing new. The scary part comes when someone comes along and collects all this information into a giant central database, such as google's, so that users can be identified across distinct sites.

The privacy concerns come not (so much) from data collected by google through explicit submission, like 'search' or 'gmail', as these are optional services. Instead the real concerns are from collecting this information across millions of sites (think 'google analytics' and 'doubleclick') without user knowledge or consent, even from those deliberately avoiding google.

The google maps service can give away the user's geography if a merchant site has a "find a store" feature that passes the user's location back to google. The user probably did not intend or realize that google got that information when locating the store.

The pressure is clearly there for google to merge data from all these services (by IP and cookies). It paints a very detailed profile of the household, which is obviously personally identifiable; there is no point in refuting that unless it's a PR stunt. Some people don't care one bit even if google can read their email (gmail), and that's fine. However, what about the people who never signed up for the Google-Boat(tm)?

IPv6 roots planted on the net

Lou Gosselin

The Transition from IP4 to IP6 isn't exactly graceful.

Sure, many vendors have future-proofed their products by implementing IP6.

But its practical use is limited to internal networks or tunnels.

Unfortunately first adopters will realize that having a pure IP6 address (not the IP4 subset) will make it impossible to establish a link with IP4-only services or peers, unless the IP6 address gets NATed to a public IP4 address.

Which means it will become mandatory for any ISP offering IP6 addresses to NAT them back to a public IP4 address - but then what's the point, since ISPs already do that today with IP4 by assigning private addresses (ie 10.*.*.*)?

If some company were to start disappearing from the IP4 address space in favor of IP6, then users would be forced to upgrade to IP6, or just look for an alternative.

Where is the incentive to be a first adopter?

Governments should just make File Sharing legal in IP6 for the greater good of the internet.

Google Android - a sneak preview

Lou Gosselin

Byte Code

I would also be a little put off by the lack of low level support for programming directly atop linux. Although limited access is something we're all very accustomed to on phones, and this phone's not going to change that.

Also, I'm not convinced that there is a legitimate need to replace the java byte code with google byte code. They've created a new tool chain that's incompatible with the old; there had better be a good technical reason for it.

With just-in-time compilation (java byte code or otherwise), the byte code gets compiled down to native machine code, so in theory optimized byte code compilers would convert either down to the same machine code. However, a register based byte code is probably going to favor one register architecture (both register count and bit size) and be at odds with all others - not to mention architectures like Itanium with its register stack. A stack based byte code doesn't make this assumption.
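
The distinction is easy to see from Python, whose byte code is stack based like the JVM's (Dalvik, by contrast, names registers):

    import dis

    # Operands are pushed, the add pops them: no register numbers in sight.
    dis.dis(lambda a, b: a + b)
    #   LOAD_FAST    a
    #   LOAD_FAST    b
    #   BINARY_ADD   (BINARY_OP on newer CPythons)
    #   RETURN_VALUE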

Quantum computing firm D-Wave bags $17m more in funding

Lou Gosselin

Article accuracy?

I'm no quantum computing expert, but I thought one qubit can be in two states at the same time, letting it explore two possibilities at once. This article says one qubit represents 4 states; which is correct?

From the article:

1 qubit = 4 states

2 qubits = 8 states

4 qubits = 16 states

6 qubits = 24 states

This is clearly wrong, as the author is just multiplying the number of qubits by four. I'm not going to lecture readers on how to properly compute exponential growth; suffice it to say this is not it. How did that slip?
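
For the record, n qubits span 2**n basis states, not 4*n:

    print([2 ** n for n in (1, 2, 4, 6)])  # [2, 4, 16, 64]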

Line up for full-windscreen satnav

Lou Gosselin

Some questions

Not completely novel, but I'd be a fan if it worked well.

Some questions though:

1. Would the projection need to be calibrated to the driver?

By shifting position, won't the image become misaligned? The image would be completely distorted for the other passengers.

2. The "screen shot" looks good, but in the real world, won't the virtual line overlap physicial objects that ought to be occluding it. (the line should disappear around a bridge or corner).

As someone else posted, perhaps this would be more useful as just a direction/distance indicator without attempting to paint into the scene.

'100% accurate' face recognition algorithm announced

Lou Gosselin

100% accuracy?

This technology may work well for several visually distinct faces, however any talk of facial recognition must consider that it will be applied to millions of faces. Frankly, there are not *that* many visually distinct faces.

Take any given set of facial parameters, let's say some eye-nose-ear spacing ratio. The more people a system (or even a human being) needs to distinguish between, the more resolution is needed in measurements. At any given resolution (ie 0.05mm), the system can mathematically differentiate between only so many possibilities.

So the (only) mathematical solution using the same ratios is to increase the resolution (ie 0.001mm). However this will not yield more accuracy, due to the impossibility of capturing such small facial details consistently. A small rotation would throw the measurement off, or even a pulse could change it.
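
Toy numbers to illustrate (both the range and the resolution are invented):

    # A 40mm feature range at 0.05mm resolution gives 800 values per
    # parameter; three such parameters give ~5.1e8 combinations, and the
    # usable number is far smaller once measurement noise is factored in.
    per_param = 40 / 0.05
    print(per_param ** 3)  # 512000000.0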

A computer could be at least as good as a human given the same input, but 100%? I remain skeptical.

I'd be interested in learning how many distinct faces a human being can recognize.

Rogue trader blows sox off control systems

Lou Gosselin

implicit trust in employees

Someone has to be implicitly trusted to build the databases, write applications, do maintenance work, etc. Sure, someone can come in to do an audit, but a malicious employee could trick them into auditing a false copy of production without anyone catching on until it's too late.

It is a serious problem with no easy technical fix. Employers must realize this and do their best to find honest individuals as well as removing incentives to hack into systems. This is why many DBAs command a high wage relative to the difficulty of the job; giving them less would be a risky move.

Personal data for 650,000 customers vanishes into thin air

Lou Gosselin

responsibility

The statement from GE Money President Brent P. Wallace reads in part that J.C. Penney "was in no way responsible for this incident."

Except for choosing GE Money in the first place!

Network Solutions games net domain biz

Lou Gosselin

Just a thought

What kind of fee does NS itself have to pay if it registers a domain?

It seems to me that if enough people register millions of long nonsensical domains, then NS would become crippled by this behavior.

How about registering millions of "NetworkSolutionsSuck1234.com"

Xbox Live account takeovers put users at risk

Lou Gosselin

Hold on, there isn't a way to track someone's MAC address!

FYI, one's MAC address is generally never transmitted over the internet; that's just not how IP routing works. Ethernet devices use MAC addresses within the local subnet only. And anyway, MAC addresses are trivial to alter in that context.
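
On Linux, for instance (iproute2 commands; the address is of course made up):

    ip link set dev eth0 down
    ip link set dev eth0 address 02:11:22:33:44:55
    ip link set dev eth0 up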

Sure, the IP can be pulled up, but unless MS can compel the ISP to release the user's information it won't help.

They'll have to do like the RIAA and file a John Doe lawsuit. The ISP may not even be within US jurisdiction. Furthermore the IP might be tracked to a proxy, or the owner may have been victim to a trojan.

This being said, I'd be surprised if the XBOX units don't transmit a unique serial number somewhere in the data stream to microsoft (is this what you guys meant by "mac" address?)

Revealed: USB 3.0 jacks and sockets

Lou Gosselin

USB is limited

I also wish that USB was a peer to peer network without any of the "host" nonsense.

Also, why doesn't one of these standards incorporate the future-proof design of allowing arbitrary data bandwidths? That way a manufacturer could ship faster and faster parts without requiring a new standard every generation like "USB1 lo=?, hi=12mbps, USB2=480mbps, USB3=...".

Both peers should detect each other and negotiate a max speed and then just work.

"Hi, I am a webcam, I can talk at 500mbps"

"Really, I'd like to see your pic, but I talk at 300mbps"

"No problem, lets use 300mbps"

The old serial ports were more flexible in this regard (only they lacked a standard protocol for establishing the channels).

Of course like so many others, the USB consortium has no incentive in developing a future proof technology.

Feds to probe Comcast's BitTorrent busting

Lou Gosselin

ISPs blocking and throttling

"If they are inserting additional packet's into a customers data stream, with the customers IP address etc... it sounds more like forgery or identity fraud than reasonable behaviour"

I'm not sure I'd go that far; I'd just call it port blocking. It's still a major headache though.

All of the cable ISPs I've dealt with (in US) do it.

Time Warner's Road Runner service was horrible about blocking thousands of incoming ports rather arbitrarily, such as the entire 6000-7000 range outright, which sucked for VNC, remote X displays, and several games (ironically port 80 was open).

Cable Vision's Optimum Online blocks several incoming as well as outgoing ports, which sometimes makes it difficult to use legitimate services.

As for throttling, at times I have noticed network bandwidth going down to modem speeds, but I've always just attributed it to a "slow internet", although the bandwidth speed tests would show no bottleneck at all. It wouldn't surprise me if it was due to a cap.

The major ISPs love to throw out "security" as the reasoning for blocking traffic, even if we know better.

If they won't change, I think the only solution will be to use random ports with encryption for everything.
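
For example, tunneling everything through one innocuous encrypted port (hostname hypothetical):

    # SOCKS proxy on localhost:1080, carried over port 443
    ssh -D 1080 -p 443 user@myserver.example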

Mozilla pulls offensive viral campaign

Lou Gosselin

google logs

"Yep, i just configured my apache server to store users state of health, what diseases they have, their religion and also what they are likely to vote in the next election in their country... apache extended logs are great!"

You laugh, but google probably does collect a great deal of this information and could pull these stats from their web logs.


New Jersey bans sex offenders from the web

Lou Gosselin

Ban the internet? How is this acceptable in a free country?

The government should be forced to put such draconian measures to a public vote before being allowed to impose such laws.

Not that the public is particularly good at policy making, but at least they'd only have themselves to blame.

Things that would consequently be banned:

Search Engines

Online Banking

Online shopping

Online auctions

Blogging

Itunes

Email

VOIP

Windows Updates

WarCraft/UT

the register

...not a comprehensive list of course

Skipton in lost laptop security woes

Lou Gosselin

How to fix?

Many people here suggest that there is gross negligence by the company losing the data, as opposed to the consultants who failed to encrypt or protect it.

From a technical point of view what exactly is the solution, then?

1. Have policy and equipment requiring consultants to use a remote desktop so data always remains on site.

This has its own technical problems: it requires high connectivity, can be expensive, limits the software/OS available to the consultants, and is potentially vulnerable to exploitation in itself.

2. Require consultants to always store data on remotely mounted drive located at company via VPN.

Difficult to enforce, and requires high connectivity.

3. Require consultants to keep data encrypted.

They should already have been doing this, but it is difficult to enforce.

4. Prevent them from having access to all the data (select *) so they can't lose it.

I've heard people say this, but what exactly are you talking about? The consultants may in fact need the data. SQL is by nature an ad hoc mechanism; how would one impose restrictions without simultaneously hampering the ability to do one's job?

The company could have DBA to create and grant restrictive views to the consultants. However if every query needed approval, efficiency would drop like a rock. And if the DBA knew which queries to grant, then they probably wouldn't need the consultants in the first place. So this still wouldn't necessarily fill the security hole.

I'm really interested in knowing how you guys would go about solving this. Clearly there are things that the consultants can do, but what about the company whose data is at risk?

Microsoft loses battle of the piggybacking passwords

Lou Gosselin

Response to @Greg and @Steve

GREG: "WHY an inferior and more expensive product should be protected is quite simple: if I have to spend 1.5 billion dollars in R&D, like is the average currently, to create a new drug, I NEED to have a protection against generics companies who will, oe month after I get the product out, have identified the molecule in their facilities for $5 millions, and will then start churning it out of their factory for the same basic cost."

I specifically said software patents, I don't hold an opinion on other patents one way or the other because it is not in my field of expertise. I don't regard your comments relevant whatsoever to the software field.

Just like you say, patents in general were intended to compensate the R&D process; however, with software this is completely unnecessary, as R&D costs are minimal (compared to your example).

For about the cost of filing a patent on a "software invention", one could probably hire hackers to implement the algorithms of said invention from scratch within a week, without the benefit of prior work. Assuming that this is true, what then is the public benefit/justification of software patents?

STEVE: "If the developers can make it better and faster than Company X then they are surely doing it differently, and so won't infringe the patent held by Company X?"

Should developers be wasting time and resources comparing their home grown algorithms to those vaguely described in the patent registry? It is a pointless legal endeavor with no technical merit. It should not matter if they happen to develop and use the SAME software algorithms; they should be entitled to.

You may not be a developer, but you are missing the fact that there are mathematically only so many efficient ways to code things; patents restrict developers from using the best implementations. So even if there was no significant R&D behind the work covered by the patent, there sure will be for the developers trying to avoid the patent.

STEVE: "Then they patent their improvements, license the basic technology from Company X, and get rich selling the improved version..."

Even if a developer chooses to apply for a patent on an improvement of an existing software patent and succeeds, Company X is not legally required to license the original software patent to the developer under Reasonable and Non-Discriminatory terms. So the improvement may be worthless. Additionally, unless the developer files a broad patent or several patents to protect the first, it may be possible to side step his patent.

Neither of you addressed my base point that patents create a huge source of completely unnecessary overhead, draining development funds and ultimately raising costs.

We see the patent lawsuits in the news all the time, who do you think is paying for it??

I reassert that those who understand software patents objectively would do away with them.

Lou Gosselin

No software patents.

In today's technological world where there are millions of coders, it is crazy to hold the notion that the government should/could maintain a registry of all software inventions and their inventors. It is too monumental a task to seriously justify the enormous costs of doing it right, while doing it wrong can be (and is) devastating.

There are those who staunchly defend "IP" rights and patents without even understanding the differences between copyrights and patents. These people should not be making policy or voicing their opinions; please go read up on what a patent is.

If you are a lawyer or a troll, well go find a job that actually benefits society, I don't care about you and neither should the government.

As for the rest of the pro-software-patent crowd, please realize that patents raise the overhead costs of doing business for both the patent holders and the licensees: lawsuits, application fees, the constant uncertainty, the eternal delays.

Why should developers pay $ to Company X instead of developing the technology internally for a fraction of $?

The answer: IDEALLY Company X WOULD MAKE the SALE, but only because they produce a better product for a better cost in a shorter time frame than the developers could have achieved on their own; otherwise Company X doesn't deserve the sale, patent or not.

Why does Company X need a government backed monopoly on a technology unless it's making inferior software, charging too much, or is a troll?

Think about this very carefully, and I think you'll agree that software patents should be obliterated.

World of Warcraft spykit gets encrypted

Lou Gosselin

@Morely Dotes

Err...not quite.

The controversy is over Blizzard running spyware on *your* machine. Said another way, they are taking away your right to not have them snooping on your machine.

"It's *their* program, which you *rent* on a monthly, quarterly, or annual basis."

*License* would be more accurate, but regardless.

The fact is that they bundle unwanted additional spyware to monitor *your* software and operating system installation; Blizzard has no intrinsic right to do that. This latest encryption change means that users can't see what information is being collected about them.

Blizzard shouldn't get off the hook just because MS may be less trustworthy. In any case Microsoft has taken a lot of heat over privacy concerns and Blizzard should too.

BTW I am not a gamer; it's just that I've seen spyware in other applications and I'm not thrilled about the prospect of it being integrated into every application running on my machine, regardless of the justifications. Can you imagine that? What a waste!

Raytheon to deliver 'paging system' for submarines

Lou Gosselin

@Silas Humphreys & AC

Why would you assume that anything would be left unencrypted?

I am sure they will be using proper encryption and time encoding on messages.

Transmission by acoustics is no different than airwaves as far as security goes.

Chances are the device will not even be capable of decrypting the messages, just relaying them.

The enemy couldn't intercept or forge messages, however jamming the radio or acoustic waves would be possible if they are nearby.

My question is why they wouldn't implement a bidirectional link - or maybe they are, but aren't talking about it.

Obviously it would give away the location of the sub, but it'd be nice to have the capability.

Mass. firm sues Google over 1997 patent

Lou Gosselin

Software patents must stop. They are obsolete.

How can anyone know that an idea hasn't been used before? It is extremely unlikely for the patenter to actually be the first to use an algorithm. Many go undocumented as trade secrets, or because the true developers don't care to spend significant resources on a faulty patent system. It is impossible to verify that a patent is novel, period.

It is especially ridiculous for person X to claim ownership of an algorithm when persons A, B, and C actually come up with the same idea independently without knowledge of X, possibly even before X.

Then the government grants X exclusive monopoly rights to an algorithm, the problem is that this adds very little value to the community at very great cost. Even if the patent turns out to be invalid, it can cause a great deal of harm to developers.

Why grant software algorithm patents when someone else is perfectly willing to implement a custom solution at a lower cost without using any of X's work. The basic idea behind patents is to spread ideas and knowledge. This is noble, however anyone who's read a patent knows how cryptic and useless they are to actual developers. It's a failed cause and everyone knows it except those who profit by them by draining "intellectual property" out of the developer's domain. When the patent expires, nobody's going to gain any value whatsoever from these now public domain obsolete legal documents.

These days we have far better resources than patents for learning about software technologies: books, technical magazines, the internet. In fact I don't know any developers who ever consider patents as a source of information (the "value" they were intended to provide).

These artificial monopolies on software algorithms never made any sense, but with the viable alternatives available today, there has never been a better time to ditch software patents.

Task force aims to improve US cybersecurity

Lou Gosselin

In related news...

Microsoft sets aside a couple million in bonuses for employees of the Center for Strategic and International Studies task force, to be paid at the conclusion of the study.

Google funds hold Firefox fate (for sure)

Lou Gosselin

Re: except that...

Incorrect, google ads are blocked on affiliate networks through adblock. They are just not blocked on google's own site.

This is about right, because it means that users voluntarily using google.com will see the ads, but those browsing other sites don't get plagued by google's ads.

Those who want to avoid ads seen on google.com but want the same results can access it through a meta search such as http://scroogle.org/

Cafe Latte attack steals credentials from Wi-Fi clients

Lou Gosselin

Many problems besides wep

Some people realize that WEP is inherently insecure these days.

However even if WEP were secure there are many other security issues with WIFI hotspots. These apply even to the more secure WPA.

1. You have to trust the operator of the hotspot to not be snooping traffic or stealing your data (as well as their ISP). And that they haven't been hacked.

2. You can't be positive that you've actually connected to the hotspot you intended to; it could be a hacker's access point falsely advertising the same ESSID, and you or your laptop software might connect to it because of a stronger signal.

3. Attacks that affect traditional wired networks still apply to wireless despite WEP/WPA which only aim to secure the link to the AP. For instance ARP attacks, etc. Although with the right hardware it should be possible to separate clients from each other.

The solution is, if possible, to use a VPN for all your sensitive traffic. This way data would be safe even when going over public channels.

Google's proposed global privacy standard slammed

Lou Gosselin

Who gets to decide these things?

It is unfortunate that the fate of privacy rights hanging in the balance will be determined by large corporate lobbyists instead of the public, who are most affected.

One might say it's Google's data; it collected it from its own (and affiliate) websites. Any of us would expect to be able to do as we please with our server logs. I would say this is OK until it gets to the point where Google is ubiquitous and unavoidable; then there is a real public stake involved, because people don't have the choice to "just avoid google".

Google launches YouTube video-blocking contraption

Lou Gosselin

Video fingerprinting is a challenging endeavor

How successful will this be against transcoding or mutations? (aka using a camcorder to create a new digital stream).

How fine grained is the fingerprinting? Every second of video? The entire frame or also translations of the picture?

I've studied some image watermarking tools. Even the more advanced algorithms could be defeated quite easily using mutations that leave the contents visually unaffected.

Fingerprinting will probably have similar faults.
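
For a taste of how such fingerprints work, and why mutations break them, here is a minimal perceptual-hash (dHash) sketch for single frames:

    from PIL import Image

    # Resize, compare neighboring pixels, pack the comparisons into bits.
    # A recompressed copy usually keeps the same hash; a cropped or
    # camcorder-mutated copy usually does not.
    def dhash(path: str, size: int = 8) -> int:
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits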

I suppose if enough users become frustrated they will migrate to other sites (such as megavideo.com, a superset of youtube).

Personally I gave up being entertained by the entertainment industry, so this won't affect me.

(if reg is keeping tabs, I vote no to icons!)

EU privacy verdict on Google set for new year

Lou Gosselin

Will this legislation be effective?

Will the warehousing limits apply to just search (aka google.com) or to the entire google network?

AdSense, Google Maps, YouTube, DoubleClick (acquisition pending), Google's office apps, Gmail, ...

Does google really have to delete the data, or just "anonymize" it by setting cookies to self destruct in a year? Google will obviously issue new cookies, so this approach is totally ineffective for anonymizing purposes.

I hope this legislation goes a bit more technical than just placing a duration clause in.
