28 posts • joined 8 Sep 2011
Re: Rendering on PC:s
Speaking as someone who was using Lightwave and NewTek's Video Toaster at the time... The Toaster had nothing to do with Lightwave. Lightwave could be used without the Toaster (although until '94 you could only get it by buying a Toaster). The Video Toaster purely handled switching, keying, and character generation. It was in no way involved in rendering in Lightwave. That was handled by the 680x0 chip in the Amiga.
Waiting for full trace renders in Imagine or Lightwave on a 68040 could mean hours for a single frame.
"Any notions that the Arduino platform is completely wedded to the Atmel ATmega family of microcontrollers have been shattered"
/me looks at Arduino Due currently hooked up to laptop, looks at article, looks at Due. *facepalm*
The ATmega family is perfect for real-time, embedded, and low-power work. The Due and Tre step it up with ARM processors (the Tre with a Sitara) for when you need more power (faster, finer-grained sampling, etc).
The Tre isn't here yet, but the Galileo struck me as a strange beast when it was announced. Rather expensive and power hungry. The I2C communication isn't a no-go for me. When you consider the speed of an AVR processor, I2C isn't terribly slow. Plus the Quark processor can do other things while waiting for responses.
The Due isn't compatible with many shields built for the Uno, since they use 5V signaling and the Due is limited to 3.3V (which is why I have a bunch of 74LVC245 level shifters here as well). That's one advantage the Galileo definitely has.
I've still got Amazon gift cards and $60 for the Galileo is tempting. But, when you read the product page for the Tre... I think I'll wait. Hopefully the price point is friendly, but that does look like a sexy beast.
*edit* - Is that a header for an XBee? Oh, my it is. Super nice!
It's not a site
"Moreover, the warning wasn't raised by the kitchen planning tool. The Register only spotted it because Chrome raised the dialog. No such warning appeared when we accessed the same site on Firefox, for example."
The warning is raised because it is trying to install a Chrome Extension. All Chrome Extensions must declare the permissions they wish to use (or optionally use). Lazy developers request them all because they can't be bothered to look at the docs and see which ones are required by the API they are using.
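For illustration, here's what a narrowly scoped manifest might look like (all names here are hypothetical; the point is that `permissions` can be limited to exactly what the extension's APIs require rather than requesting everything):

```json
{
  "manifest_version": 2,
  "name": "Kitchen Planner Helper",
  "version": "1.0",
  "permissions": [
    "storage",
    "https://example-kitchen-planner.com/*"
  ]
}
```

The install-time warning Chrome shows is generated directly from that list, so every unnecessary entry a lazy developer leaves in makes the dialog scarier.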
For the record, I am an Add-on/Extension developer and I don't find the permission model of these extensions entirely satisfactory. I wish all of the browsers had much better/finer access control, but there will also always be bugs that allow the permissions to fail. If you can't live with the state of things, don't install extensions, or get a VBox/VMWare image that you can browse with and always roll back to the last good snapshot.
> Where's the sentence "we'll fire our executive who weren't able to understand the business they were paid for."?
I worked for IBM for over 12 years and I'll probably have a heart attack from shock the day that happens. As it is, I'm having palpitations that they've forgone *their* bonuses. We had quarters and years where, even as revenue declined, our division managed to increase profit by being more efficient. Didn't matter. We'd still get told how we sucked and that everyone needed to sell. The only thing IBM and the execs care about is revenue. They'd crow about Global Services increasing their revenue; when you looked at the numbers, you found their profit had actually gone down because of the costs incurred to earn that revenue.
The thing that keeps IBM in business is that it is huge. Plenty of managers doing the right thing and silently subverting upper management's direction. Those two things will keep any IBM board from being able to plow the company into the ground any time soon.
First off, I've been writing software for 30 years. While I've studied a lot about security methods and encryption, I would never consider myself an expert. I would consider myself "suitably skilled in the art."
It makes more sense to me to use an NFC connection to have the two devices identify each other. The user verifies on their device where they are (Joe's Coffee Shack on W. 3rd St). Your device then contacts the payment servers over the data network (or text message in a pinch), as does their device. Both devices have created a shared secret that they have each encrypted with their own key. The payment server verifies with each public key it has that the secret agrees, as does the charge amount. The user's device is contacted again and, once the user verifies the transaction, it goes through and the merchant's system is notified.
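As a toy sketch of that flow (all names and keys here are made up, and I'm using HMACs as a stand-in for "encrypted with their own key"; a real design would use asymmetric signatures): each device binds the NFC-exchanged secret to the merchant and amount using a key only it and the payment server share, and the server approves only when both bindings agree.

```python
import hmac
import hashlib

def tag(device_key: bytes, shared_secret: bytes, merchant: str, amount_cents: int) -> bytes:
    """Bind the NFC-exchanged secret to the transaction details with this device's key."""
    msg = shared_secret + merchant.encode() + amount_cents.to_bytes(8, "big")
    return hmac.new(device_key, msg, hashlib.sha256).digest()

# Keys the payment server already holds for each party (hypothetical values).
USER_KEY, MERCHANT_KEY = b"user-device-key", b"merchant-terminal-key"

secret = b"nfc-exchanged-nonce"  # created during the NFC tap
user_tag = tag(USER_KEY, secret, "Joe's Coffee Shack", 450)
merchant_tag = tag(MERCHANT_KEY, secret, "Joe's Coffee Shack", 450)

def server_approves(u: bytes, m: bytes) -> bool:
    """Server-side check: both parties saw the same secret, merchant, and amount."""
    expect_u = tag(USER_KEY, secret, "Joe's Coffee Shack", 450)
    expect_m = tag(MERCHANT_KEY, secret, "Joe's Coffee Shack", 450)
    return hmac.compare_digest(u, expect_u) and hmac.compare_digest(m, expect_m)

print(server_approves(user_tag, merchant_tag))  # True
```

The point is that neither device alone can fabricate a transaction: a tampered amount or secret on either side makes its tag disagree with what the server expects.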
Trying to protect the payment system from the app path is ludicrous: if it runs in the same system as the OS and the app environment, it is not safe and cannot be made so. Instead, make it harder by doing essentially the same thing as the baseband modem (but don't even think of running it there, or as a uJava app on the SIM, because those are insecure as hell). You place a third, independent ASIC that is the payment processor. It would only communicate over specific and protected channels with the OS and the baseband modem. That again would be impossible to make 100% secure, but it would definitely raise the difficulty level significantly.
The other point that is insane is to trust their Bluetooth or WiFi as the second channel. Can we please move to making banking more secure, not less?
And as for "obvious": the above scenario is me thinking about how I would design a secure payment system for roughly 2 minutes, after only reading the first two paragraphs of the article and not looking at the patent. Please don't nit-pick it; I know I've probably missed things here and there. So, with that in mind, back to the article.
SIM card?!? Are you ****ing kidding me? Secure? Not hardly. That was pretty well dispelled at DefCon 21.
Now to the patent. Ah, sure. I knew no one would specify Bluetooth or WiFi in a patent. You don't fence off things with specificity anymore. You fence off the whole continent as in "a second air interface different from the first air interface includes identifying an air interface having properties more desirable than the first air interface for communication of data to a user over a time period longer than the time used to establish the first secure link."
Shared secret, check. Not sure that I agree with the shared secret being the encryption key. Maybe if each device signed the message first with their own key before encrypting with the shared secret. That sounds like they really mean Diffie-Hellman and that means you end up with a key weaker than the keys used to create it. Again, we don't know the real implementation. This is modern patent law so be as broad and cryptic as you can.
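For reference, the textbook Diffie-Hellman exchange with toy numbers (illustration only; real implementations use large primes or elliptic curves, and these particular values are just for demonstration):

```python
# Toy Diffie-Hellman: both sides derive the same secret without ever sending it.
p, g = 23, 5                       # public parameters: prime modulus and generator
a, b = 6, 15                       # each side's private exponent (kept secret)

A = pow(g, a, p)                   # user's public value:     5^6  mod 23 = 8
B = pow(g, b, p)                   # merchant's public value: 5^15 mod 23 = 19

shared_user = pow(B, a, p)         # user combines B with their private a
shared_merchant = pow(A, b, p)     # merchant combines A with their private b

print(shared_user, shared_merchant)  # 2 2 -- identical shared secret
```

Note that the derived secret (2 here) lives in the same group as the public values, which is the sense in which the resulting key is no stronger than the parameters used to create it.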
I'm sure within 5 years we'll hear about this patent being granted. It pretty well shouldn't be, as it is overly broad and obvious. They really haven't added anything new to the game.
As for my idea: I would never implement it myself. I'd definitely want a team of security experts to go over it and improve it, as well as have the code and any hardware peer reviewed. That would never happen. Once the big players get involved, everything gets watered down and made easier, and you get the least-common-denominator solution. Fraud is considered just a cost of business to the banks; they charge it back to the customers however they can.
I never get tired of watching DefCon or C3 presentations where developers or engineers have decided to roll their own security or encryption solution. They're hysterical.
Hoping for a good show
I've been amused by all the comments I've seen around the web that are variations of "But won't the sun just burn it up?"
No, it won't, for a few reasons. First, ISON is big; I've heard size estimates of between 500 m and 1 km. Second, it gets a bit of shielding from radiative heating by its coma. The coma is being pulled away by the solar wind, but it is still enough to slow the sun's heating. Third, while the sun's corona is very hot (hotter than its surface), it is also very thin; it's really a vacuum, just a slightly thicker vacuum than the one out by Earth's orbit. Fourth, it's not going to be near the sun for very long. It's currently moving at 115 km/s and will be moving even faster on the 28th. It is currently 23 million km from the sun. It'll pass the sun at 750,000 km on the 28th, and by the 30th it'll be 25 million km away again (heading roughly towards us and north out of the ecliptic). In 4 days it will manage to cover almost 50 million km, roughly 1/3 of the distance from the Earth to the sun.
What might happen is either ISON comes through in one piece (admittedly smaller than it was) or tidal forces from the sun's gravity will cause it to break apart. If it does break up, it should have a larger tail than if it stayed in one piece. Comet Lovejoy (2011), for a quick comparison, was only 500 m in diameter; at perihelion it was 5x closer to the sun and moving at 500+ km/s. It survived intact and gave folks in the southern hemisphere a great show.
The Quran agrees?!? Well of course it does. Judaism, Christianity, and Islam are all Abrahamic religions. The texts all evolved from the same source: Torah -> Bible -> Quran.
Now tell me that the Vedas agree and I might raise an eyebrow.
I started with a C64 and taught myself 6502/10 assembler. I had to buy and borrow tons of books and magazines. I loved it, but it was brutal.
The Raspberry Pi is great, I have one. However, it's no better than the Linux laptop I'm currently typing this out on. I'm not saying either is worthless, quite the opposite. They both come installed with pretty full stacks for Java/LAMP/Rails/Python/Perl/C programming. As for getting an understanding of hardware, they are worthless compared to the C64. Bank switching, working with the zero page, interrupts, registers, double buffering, etc. In a modern system that sort of direct manipulation is left to the kernel hackers.
But, there are great resources today that I would have killed for as a kid. Sitting on my desk is a stack of Arduinos (Micro, Uno, Due, Mega ADK) and shields. There are tons of sensors and parts from Adafruit, Parallax, and others.
Lots of example code and tutorials on YouTube and the web. I have my own blog and channel where I occasionally post lessons as well as some of my own work. Built my 7 year old nephew an awesome robot costume this year.
The reason kids can't program is not the tools.
*** THE PROBLEM IS THAT IT IS NOT RESPECTED ***
We still have it being taught by teachers who can barely operate a computer, and that's only when the school actually uses its computers for more than MS Office skills.
In industry it is not a career path. 40 year old programmers cost too much. Management doesn't see why a kid just out of college isn't just as good. That's if the kid is lucky enough to get the job and it isn't just sent overseas. I'd love to be able to teach programming at the local schools (community college and the high school), I just don't see that I'd be doing those kids any favors.
Re: More than just doing the job
Between open source software projects and my 8+ years of working on multi-national teams, I'm not sure I could disagree with you more strongly on your second point. There are definitely jobs that cannot be done remotely, but the tech sector has far fewer of them.
For a professional, the tools have become ubiquitous: e-meetings, instant messaging, video conferencing, and even the good old telephone and email. I've worked with people from Ireland, the UK, India, China, and Japan (I'm in the mid-Atlantic US). The only difficulty was dealing with the increasingly large timezone differences.
When I did work in the office, the drop by your cube meeting was to see if you wanted to go play foosball. If it was a work question it would be an instant message, even if it was from one aisle over.
I'm sure he thought it was just a typo when the dedication plate read ISS Enterprise.
Re: The core solidifying removed the magnetic field
"The core needs to spin in order to generate a magnetic field. Specifically in relation to the solid/liquid core boundary.
'Tis the Dynamo Effect wot does it."
Exactly. Mars, being smaller, cooled enough that its core stopped spinning. When the magnetic field stopped, the solar wind started stripping away the atmosphere.
For what it's worth, in the next one to two billion years the same will happen to Earth. At that point it will very quickly turn into Mars 2.
I think this says all that needs to be said on this.....
"Because our goal here, of course, is to meet the requirements, number one. But, also do so as inexpensively as possible, keeping in mind our goal. And our goal is, clearly, not to find a qualified and interested US worker."
Re: Coming to a wallet near you...
Credit cards get their money from two sources. One is the user, who pays interest, fees, etc. The other is the merchant. To accept credit cards you need to pay fees for your merchant account. Then you need to pay your monthly fee to your processor (this is why PayPal and Square are popular with craft fair folks, as they don't charge you $15-$75 a month just to say you accept credit cards). Then, when you accept a credit card, you pay a base per-transaction fee of 28¢ to $1 plus 1.8% to 4% of the transaction amount.
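To put numbers on that, here's the merchant's per-transaction cost on a $100 sale at the two endpoints of the ranges quoted above (actual rates vary by processor and card type; working in cents avoids float rounding surprises):

```python
def card_fee_cents(amount_cents: int, per_txn_cents: int, pct: float) -> int:
    """Merchant's cost for one card transaction: flat fee plus a percentage cut."""
    return per_txn_cents + round(amount_cents * pct)

sale = 10_000                                  # a $100.00 sale, in cents

low = card_fee_cents(sale, 28, 0.018)          # cheapest case: 28c + 1.8%
high = card_fee_cents(sale, 100, 0.04)         # priciest case: $1 + 4%

print(low, high)  # 208 500 -> $2.08 to $5.00 per $100 sale
```

And that's on top of the monthly account fees, which is exactly why low-volume sellers gravitate to flat-rate processors.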
PayPal/Google could create an immensely popular card by making the merchants pay just enough to cover what the banks will charge to get the money to the merchant, then giving users low interest rates and reasonable fees. Of course, the regulators will then act at the behest of the big banks and make sure they have no end of trouble getting this all up and flying.
Nice solar array
Pity about all of the trees that were cut down for it.
Doing it in the first place in this day and age is
Unfortunately, it still happens with alarming frequency.
Re: Yes, it's about time, but…
Sure it is. Using it is still better than sending the data over the wire in plain text.
For DigiNotar to work, your victim would need to be using very old software. Its certificate as a root CA was revoked by pretty much everyone. As a cautionary tale for the whole CA system, it is definitely a loud and clear example of what everyone knew was an issue. Unfortunately, there is no panacea when it comes to security; you do what you can. The only true security is a one-time pad, but that is practically impossible to achieve in reality.
If you look at the mitmproxy tickets, you'd find that Apple has pinned its certificate (at least in iOS 6), which is exactly what should be done everywhere it is possible. Since the pinned certificate is checked on the device rather than trusted from the wire, you'd need access to the device and the ability to replace the certificate on it with your own.
So what's left? Well, if you want to target specific sites that serve mixed content (some SSL and some HTTP [preferably JS files, but CSS would also work]), you can proxy the traffic and inject your own JS code into the HTTP stream. SSL uses public/private keys to set up the connection; after that it is simple symmetric encryption. Your code would make repeated connections to the server with a block of text that you know. Against a weak cipher or mode, that kind of known-plaintext attack can help work out the symmetric key, and if you've been caching the SSL packets, you can go back and decrypt the stream.
You've got good points, but just shouting the sky is falling on a forum is perhaps not the best thing to do. I mean, what if some PHB is reading the site and gets the idea that they can just stop using SSL on their services. ;)
It's better to point out that security is hard and SSL is not a panacea because it needs to be implemented correctly and carefully. When I was a SysAdmin, I used to tell my colleagues that if you wanted true security, you'd cut the cords off of your system, send it through an industrial wood chipper, embed that in a block of cement, and then drop that into the Mariana Trench. Then I'd be 99.999% sure you couldn't be hacked.
The process is tiresome
1. Install new update after it bugs me incessantly for several days in a row.
2. Tell it that I do not want the farking Ask.com toolbar for the 17th time.
3. Re-disable the plugin in my browser, again for the 17th time.
If I didn't need Eclipse and the Android SDK, this piece of trash would be banned from my systems. :P
In all, I rather expect it will bump Windows Phone to #4. I wouldn't actually take a money bet on that, but.... It has a much better interface due to its borrowing all of the good ideas from WebOS. It does have that Android compatibility layer and as a result will go to market with a much bigger app store. As stated in another comment Blackberry hardware is generally very good. There is still a fan base for Blackberry.
I haven't bothered to look at the new API. Hopefully folks writing native apps will get to deal with a much better API than the Charlie Foxtrot the old API had become.
I'll stop in a store to give one a try when they are released (just like I did with the Torch), but I'll stick with my Droid.
OAuth is problematic, especially 2.0. There is really nothing stopping me from asking you for your FB/Twitter/Dropbox credentials and storing them. At that point I can do the whole sign-in, authorize, and obtain-access-token dance from my server without you ever knowing what permissions you just granted me. I can also get access to your account at any point until you change your password. Deauthorize the app and I just reauthorize myself.
I'm sorry, there is something stopping me: I'm ethical and I take the ToS that I accepted seriously. There are a lot of people who would not worry about that if they could make a quick buck.
Obviously most tech savvy people are going to know that they should be on the Twitter page to log in. Then you get people like my wife, who won't do anything or will call me in to see if it's legit. Then you get people like my brother-in-law, who will just go ahead and log in (my sister banned him from her computer and I'm constantly pulling malware off of his). These are also the people who won't know how to check what permissions the app has or where to deauthorize it, and will complain profusely when told to change their password.
Re: 8080, bloody Hell!
Actually, the C128 had an MOS 8502 and a Zilog Z80. The C64 started out using the MOS 6510 and later used the 8500. Everything except the Z80 was 6502-compatible, but each had its own additional capabilities.
I prefer the more descriptive name of Dumfukistan.
A few months before they de-orbited it, I saw a simultaneous transit of Mir and the ISS. It was definitely a very cool thing.
I always remember this about my fellow citizens: I know how stupid the average American is, and that half of them are dumber than that.
Re: 2 Gig?
They need to use memory that is hardened against cosmic rays. Here on Earth we've got a magnetosphere to block them out and it still doesn't much matter if a bit gets flipped here or there on your phone. On spacecraft an SD card would probably get fried by the rads.
Little-known actor/radio personality decides to be a director and uses a gimmick to get attention. The fundraising bit on his website is a hoot.
I'm not saying that it is a bad thing to use cameras, lenses, and filters to achieve an effect. I knew someone who did several short films using a Fisher Price Pixelvision camera because of the look of the final product. He also did quite a bit of post in After Effects.
Hell, some real directors have overused filters. I wanted to walk out of Star Trek Generations because of the heavy-handed use of colored lens filters. I seem to remember the director apologizing at a later date for that.
Simpler solution. Google can close all Italian offices and tell the employees to move to another country or lose their jobs.
Cruel? Heartless? Perhaps, but all countries need to learn that their laws end at their borders.
They didn't get rid of WebOS
They dropped and canceled the hardware. WebOS, the software portion, was transferred to another division. Without hardware, its limbo state is in essence the same as being dead outright; the difference is that they could revive it or license it to another company.
My Pre is getting replaced with a Thunderbolt here in the next few weeks. HP has no credibility. Any HP branded WebOS device will not have a chance. HTC would be one of the companies who they could license it to and it might work. It would be a very slim chance with Android and iOS out there. I'd lay higher odds on WP7 or 8 at this point.
Yes you are missing something
Amazon is not a California business or corporation. Amazon has no physical location in California.
Why should Amazon be subject to California's laws?
If I owned a shop in North Carolina and you walked in to buy something, I would be required to collect the sales tax. I as the business do NOT pay the sales tax; you do. The purchaser pays it and is the one required to pay it, but the state requires that I act as its agent in collecting it from you. That's fine. I'm in their jurisdiction and beholden to their laws.
If you order it from California, I am not their agent, nor in their jurisdiction. Legally I am not subject to their laws, but you are; you still legally owe what they say you owe. If they passed a law saying anyone sending a letter owes them a dime, should I send a dime to the state of California every time I mail a CA resident a letter? Hell, no.
This is just CA and the other states pulling these shenanigans trying to get illegal laws enforced because they know their residents won't be honest.
In fact let's flip the idea. What if NC passes a law saying I must collect sales tax from anyone making a purchase regardless of where they are. Would you be fine with paying NC a sales tax for an online purchase if you lived in France?
NC has no Amazon affiliates because of NC pulling this shit. I know two people who were making $10k+ a year as affiliates and paying income tax on that. Now NC gets no sales tax and no income tax on any of those purchases. Bravo to the folks in Raleigh.