229 posts • joined Friday 3rd July 2009 08:24 GMT
Re: Uh ... computer says no.
Last time I programmed for CoreAudio on Mac (a long while back), the audio driver had a fixed sampling rate of 48kHz and the audio card's crystal had an interesting drift.
That would be 2.4 samples per bit. This is certainly suitable for sampling a sine wave, and the .4 makes it likely that you wouldn't even have to be in phase. Of course, you'd need to run some form of digital signal processing to reproduce the peaks. You'd need additional DSP high-pass filters to extract any actual data from the signal. You'd probably also need some additional settling time so that a PLL could kick in. Of course, there are other modulation methods for transmitting data over sound waves, but they're going to be SLOW!!!!
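To put rough numbers on that (a toy sketch of my own, nothing from any real driver):

```python
import numpy as np

# A 20 kHz tone sampled at 48 kHz really does give 48000/20000 = 2.4
# samples per cycle (toy figures assumed from the numbers above).
fs, f = 48_000, 20_000
print(fs / f)  # 2.4

# Because the ratio is fractional, the sampling phase precesses: 12
# consecutive samples land on 12 distinct phases of the wave. That's the
# ".4" doing the work -- you don't need to be phase-locked, but you DO
# need DSP (interpolation/filtering) to reconstruct actual peak values.
phases = (np.arange(12) * f / fs) % 1.0
print(np.sort(phases))  # multiples of 1/12: the samples walk the whole cycle
```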
Now, in order to make this work, you'd have to have code running at all times in all audio cards and/or drivers and/or OS kernel audio implementations and/or BIOSes, running what I'd imagine would need to be at least a 20-point filter for everything to work.
I just love this nonsense.
Oh.... but you said square waves... that makes it more realistic.
Can someone please get Zack Brown from the Linux kernel over here. He needs to make some comments to bitch slap some people around.
Re: Uh ... computer says no.
Well, to be fair, I do have a microphone connected to my video workstation which does actually have a 100-22,000Hz frequency response range. I've tested it on a scope as well.
What I love is the suggestion that there would be some special code running a filter to extract ultrasonic data from an "ultrasonic signal". I'm pretty damn sure that Nyquist would have a blast with this. Next we'll hear there's a DSP PLL to compensate for sampling rate issues on these sound cards. :)
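Here's Nyquist having his blast, in toy form (my own sketch, not anything from a real sound card):

```python
import numpy as np

# At fs = 48 kHz, nothing above 24 kHz survives sampling. A 30 kHz
# "ultrasonic" tone produces exactly the same samples (up to sign) as an
# audible 18 kHz tone, so there's nothing left for a magic filter to extract.
fs = 48_000
t = np.arange(64) / fs
ultrasonic = np.sin(2 * np.pi * 30_000 * t)
alias = np.sin(2 * np.pi * 18_000 * t)
print(np.allclose(ultrasonic, -alias))  # True: indistinguishable after sampling
```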
Re: Just too possible!
I heard from a respected homeopathic doctor that drinking water with the essence of gold will change your DNA to make you appear to be a direct descendant of King Midas himself.
Respected security researcher ... that's too damn good. :) I love this stuff.
William, do you drive around in a van with your name and photo on the side and a nifty slogan like "PC problems? Call Dr. Data!"?
Thanks, I needed a great laugh.
Damn it, that was my line!
Re: I call bullshit
Come on now... next you'll tell me the Tooth Fairy, Santa Claus and intelligent business grads are all fake too.
Get real :)
No it's not
No, it's not technically plausible.
Next dumb comment please?
How about giving people an excuse to upgrade?
I know more than a few people who still use the iPhone 4 because they're waiting for something new.
I bought two 128GB Surface Pros and a 64GB Surface. I didn't even notice the price. It wasn't important. I understood when buying it that if it broke, it was a goner.
I think you need to see who the target audience of a machine like this is. It's a well-engineered machine which looks awesome, weighs very little, has a replaceable keyboard, has an awesome screen, has good battery life relative to the specs and size, and is really versatile in general. It has made my life amazingly better. I tossed my MacBook Air 11", iPad 3 and Samsung Series 7 Slate because now I have one machine which does what I needed three machines for earlier. If it cost $3000 a machine and wasn't repairable, who would care?
I already ordered a 512GB Surface Pro 2. Can't wait to get it. Better battery and more storage... it's like Christmas.... in fact, it'll probably be Christmas.
I guess some of us prefer to pay a bit extra for something that improves our lives.
An alternative hack... but in the spirit
If you take 10,000 files (or fewer, I'd need a proper sample set to work with) and make them sequential patterns, then given that you have g sub x and intend to recover g sub y when in possession of G sub X and G sub Y, you encrypt the large data set using g sub x and G sub XY and factor characteristics of the common exponent of the logarithms... I'm not conveying this right. I see it mentally, but am not good at wording. I read part of the paper you linked which takes a similar approach and might actually even shorten the brute force attack remaining.
Using my method, you construct a tree of common traits of possible key values based on the fact that you're actually in possession of a single private key and both public keys. It's something I came up with when Diffie identified another weakness in the keys.
The main idea is that the Diffie Hellman Problem is called a "hard problem" not an "impossible problem". We already have more information available if we have the client's private key than the algorithm accounts for. We also have the ability to encode known sequential or patternistic data sets. This means we should be able to attack the algorithm by identifying common traits of the cipher when comparing the algorithm, the data sets and the outputs produced. This of course would be infeasible without the private key used for encoding.
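For anyone rusty on the notation above, this is the textbook Diffie-Hellman exchange, with toy-sized numbers chosen purely for readability (real deployments use 2048+ bit primes):

```python
# g^x mod p is party A's public key, g^y mod p is party B's; the shared
# secret is g^(xy) mod p. Recovering it from the public keys alone is the
# Diffie-Hellman Problem. Toy values below, illustrative only.
p, g = 0xFFFFFFFB, 5          # a small prime modulus and generator
x, y = 123_456, 654_321       # the two private keys

gx = pow(g, x, p)             # "g sub x" -- A's public key
gy = pow(g, y, p)             # "g sub y" -- B's public key

shared_a = pow(gy, x, p)      # A combines its private key with B's public key
shared_b = pow(gx, y, p)      # B does the mirror image
assert shared_a == shared_b   # both arrive at the same shared secret
```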
I've always had issues coping with the DHP when the encrypting private key is included in the algorithm. After all, it should be theoretically possible to reverse much of it; unless you specifically drop data (making it useless to begin with), you should be able to work backwards through it.
I'm guessing someone smarter than I am can probably hack more of it algorithmically. I have major limitations in that field, but I am pretty damn good at factoring based on producing tweaked data sets to build search trees or sets to brute force.
Let's face it, there's a reason we key cycle 3072 bit keys... it's because they should be recoverable by someone somewhere as their sample sets grow... in fact Diffie makes direct reference to this in the original paper and later articles. We're simply expanding the known sample set and exploiting the inherent weaknesses.
If they're using DH (likely) and they're using the same keypairs to encrypt and decrypt all the files, pause the machine, back up, and copy a crapload of small Word files to the machine and let it run its course. Once you have enough sample data with both source and scrambled versions, and you have the local keypair and the remote public key, tree-search the key bits and factor down to a brute-forceable length. Then GPU-farm the remaining bits of the missing private key. Then decrypt.
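To be fair, the tree-search narrowing is the speculative part of that recipe. The only uncontroversial piece is the tail end: IF you were somehow left with just k unknown bits of the private key, checking candidates against the known public key is trivial (toy sketch, same toy parameters as above):

```python
p, g = 0xFFFFFFFB, 5
x_true = 123_456
gx = pow(g, x_true, p)               # the known public key

k = 16                               # bits assumed still unknown
x_known = x_true & ~((1 << k) - 1)   # pretend the high bits were recovered

for low in range(1 << k):            # the part you'd hand to a GPU farm
    if pow(g, x_known | low, p) == gx:
        print("recovered:", x_known | low)
        break
```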
What's the issue?
Uhhh... and even now he doesn't get it.
There is no room for a handset maker in the smart phone market. Never was.
Samsung is a strange bird because, so far as I know, they have absolute control over every component in the telephone. They make the flash, RAM, CPUs, batteries, screens, etc. This means that unlike other handset sellers, they can make a huge profit per handset. But even then, Samsung is pretty much THE Android phone. Even though Google bought out Motorola's phone division, it seems that Google is happy letting Samsung make the initial profit from the phone.
Apple sells their phone as an accessory to their more profitable streams of apps and media. They are making fantastic amounts of money off of add-ons. I'd love to find out what percentage of Mac and accessory sales comes just from selling PCs to developers of iPhone and iPad apps. What about the tons of money made from licensing chargers and accessories?
Google makes money from search, the store and more. Not the phone.
So, how does a handset maker with a handset maker mindset compete against companies who just don't care how many phones you buy as long as you buy from the stores?
The only place for a handset maker today is in selling feature phones to the non-smartphone people. Things like throwaways and plan freebies.
Could Nokia have developed a competing platform to take on Apple and Google? I highly doubt it. They always come out strong and get cold feet if it doesn't show returns on day one. They have never had any proper long term foresight. They always made it about the phone, not the platform.
The Windows Phone cock-up is probably more about Nokia's inability to establish an image. Their "coolest" advertisements seem to focus on a demographic of people who watch stock tickers. Even their phone launches have the two fat Steves on stage wearing ties. Then, when they should be targeting teenagers and young adults and American welfare/food stamp recipients who have iPhones, they talk about selling the budget model to developing nations.
It's amazing how they never once understood that it simply wasn't about the phone anymore.
Re: Fashion company to fashion company... duh..
So, you're saying that:
Burberry woman understands fashion, i.e. branding. Make it look special, milk the customer on the accessories, get them to come back for more and more.
Dixons guy doesn't. He figures all that matters is what happens in his part of the store, because that's where his results are measured. Who cares if you buy all the add-ons? It doesn't show up on his quarterly reports.
Yeh... tech vs. fashion :)
Might have some valid points
I don't think the paper says straight out "open source is bad". It seems to focus on the idea that private companies may already be able to support open source better than the DoD running its own division to do so.
It is questionable whether the DoD can in fact run any large-scale software projects successfully internally. I don't care whether it's based on open source or not. I am 100% sure they can do it better than Lockheed, Boeing, Honeywell or other contractors. Oracle may well be better suited to manage projects than the DoD. I think it's a toss-up really.
Whatever the case, the DoD would at least be able to handle revisioning and have a staff auditing incoming software changes, which up to this point they probably haven't had. So even if it's a room full of DoD employees maintaining a distribution based on CentOS or Ubuntu and only letting in changes which are verified at a code level, it could be a good thing.
Now the question is, where would the military get the coders who are skilled enough to audit as such?
Fashion company to fashion company... duh..
Dixons guy was crap because he was a tech guy. Apple is a fashion company first, tech second. She's probably better suited for the job than Tim Cook. She probably gets it.
This is such a dumb ass method
I developed a watermarking algorithm two years ago which survived re-encoding, functioned over the broadcast network and actually managed to allow geographic location of whoever leaked a stream, from broadcast down to the local exchange. Best thing is, the viewers in the test group we ran it against couldn't tell the difference between the original sources and the watermarked video. We re-encoded the files up to 10 generations and could still always identify the markings. It even survived HD to SD conversion and back. Even better... it survived QCIF rescaling.
This design was stupid since it's so easy to bypass. You don't need to compare against the original. What you need instead is to confuse the detection software. How? Get two copies of the same film from two different accounts, compare them, find the differences and obfuscate them further.
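Something like this bare-bones sketch of the collusion idea (my own toy, not a real tool):

```python
import numpy as np

def scrub(frame_a: np.ndarray, frame_b: np.ndarray, noise: int = 3) -> np.ndarray:
    """Given the same frame from two differently watermarked copies,
    average away every pixel where they disagree and add a little dither,
    destroying whichever per-account mark lived there."""
    diff = frame_a.astype(int) != frame_b.astype(int)
    avg = (frame_a.astype(int) + frame_b.astype(int)) // 2
    jitter = np.random.randint(-noise, noise + 1, frame_a.shape)
    out = frame_a.copy()
    out[diff] = np.clip(avg + jitter, 0, 255)[diff].astype(np.uint8)
    return out
```

Run every frame through that and the per-account differences are gone... unless, as I said, the marking pattern itself is designed to survive mixing.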
This is not rocket science... it's just math. Could my system be bypassed? Sure... but the way I marked the video made it almost impossible to do so without understanding the pattern of picture alterations I made. So even if you started mixing from multiple different sources, each source it was mixed from would be identifiable.
I am a huge fan of watermarking. I hate DRM. I like to be able to get content and use it however I'd like. I have no problem with people sharing videos with their friends and family. It's when people rent or borrow a movie and then put it on The Pirate Bay that I don't care for.
That said... the Pirate Bay has evolved into something we all need to some extent. It's a video archive. It's a place where almost all video and music can be found. I love that. Too bad there's no good official source for the material.
Yes they did
I met many people who loved their BlackBerrys and wouldn't have anything else. They were sure that new BlackBerrys would catch on.
On the other hand, they also wear suits... and actually think that's cool
Re: I always viewed the stock market as "intelligent high stakes gambling"
I disagree... an intelligent stock market investor would identify means of altering the value of a share and not gamble at all. I know it is supposed to be illegal, but that's only an issue if you get caught doing it.
Sad to hear about your confidence problems
It seems strange that you wouldn't take a great deal more pride in being a female who "beats the odds", as it seems. My wife spent the past 15 years at the service desk of a major national newspaper here in Norway, and how she dressed never came into question. She was confident in what she did, she solved problems, and people treated her as well (or as poorly) as they treated her male peers.
You are obviously in a really bad place if your gender plays a role in your career. This should only be an issue where being able to urinate standing up or giving birth matters. A competent person is a competent person, and it has nothing to do with gender. When I was in my early 20s, I dressed in three-piece suits to convince people to take me seriously at work. That was really sad. Now, I wear jeans and a t-shirt to work every day. My words and actions speak for me. I could easily (as a man) show up for work dressed as a woman and people would depend on me precisely as they do now. In fact, if anything, it would make people see me as being even more confident in my ability to accomplish tasks.
People who think the way they dress matters at work are precisely the people who aren't that good at what they do. If you're a problem solver, your value is based on your ability to solve problems. If you blow sunshine up your boss's ass, gender doesn't matter; you'll get those raises and gain their misguided respect. I hate that I spent years having to read about sports to convince idiots who rate your value based on your tie that I'm of any value.
Now, at age 38, I've signed a contract with an IT firm which gave me 5% of the company, salary plus bonuses and the money I need to do my work... without kissing ass or talking sports. It came from years of 18-hour work days, constant studying, huge numbers of certifications and more. My work paid off. Gender had absolutely nothing to do with it. My knowledge, connections, skills and friendliness were all that mattered.
Want to know something scary? If you showed up for an IT related job dressed like a man, I'd see you as lacking confidence in yourself. I'd see you as a person who puts more value on appearance than on productivity.
Wow!! Norton! There's a name from the past!
First... Facebook is pretty much evil. We all use it because the only real alternative is Google+, which is from the company who brings us 8.8.8.8 to track all our DNS queries. So, when you basically give 99% of your life to a company which is competing with Google to see who can be Orwell's Big Brother first, you can't really bitch when they try to get that last 1%.
Second... Norton! Wow! I remember those guys. They're the ones who took over from Microsoft for a while with regards to making computers slower so you'd have to upgrade, right? I mean, when MS started doing stupid things like making Windows use less RAM and CPU with each version, Norton Antivirus used more and more so Intel would sell more chips :)
Windows 8 is only for efficient people
Let's be honest, Windows 8 is only for intelligent and efficient people. Windows 8 is perfect for people who prefer to press 4 keys on a keyboard to start an application, as opposed to people who want to move a mouse all over the damn place. It's kinda like how some people will install emacs or vim on Windows to have editors which are just really efficient, as opposed to ones which are dog slow and sloppy. Windows 8 made it possible to move at the speed of light compared to just about any other UI. By integrating with a search engine which works very well, documents and applications are far more readily accessible than before.
I understand why many of you are confused though. It's a slow-witted thing. You need to go click, click, click, etc... to do anything. It's ok, Microsoft has learned their lesson; they'll dumb down Windows 8.1 for you.
Good!!! Nokia is killing Windows Phone.
Let's start with Nokia. THEY ARE A PHONE COMPANY. They make their money selling phone hardware. Once the phone is sold, if they want to make more money from a customer, they have to sell another phone. Nokia also never understood smartphones. Buying Qt was the smartest move they ever made, and then they screwed it up. Nokia has a long reputation of selling phones and saying screw you to their customers. For some stupid reason, they think making bunches of different phones is a good idea. That worked before. Now, we want phones that get love from their makers afterwards. People complain about iPhone software updates, but in reality, people know they're getting love when they get updates.
Microsoft needs to make their own damn phone and do it right. Spend a year and focus on industrial design. Then, whatever you do, don't let some loser like Stephen Elop, looking fat and sweaty in a suit and tie, get caught calling it cool. Steve Ballmer should never be seen in public with one. Give him a Nokia. Hire someone to make it cool. Not a pop star, but an artist with a gift for industrial design. People don't realize that Jonathan Ive is far more important than most think. He brings fashion to the devices.
I love it when a company tries to make a new version of an obsolete technology. Last night I was helping to spec out a new FCoE network with four 40GbE connections to each blade. Seems kinda funny anyone would want to invest heavily in FC when it's so insanely expensive and incredibly slow.
Good to hear
I was burned more than a few times by using AMD chips over the past 25 years. I have to admit, I haven't bothered with them in a while. This is mainly because they're great at consumer and great at data center, but somewhere in between, they're lacking when it comes to supporting small-scale developers. Their development tools have always been lacking, and their pathetic support for GPU code developers really chased me away. That said, I am happy to hear that a real CPU vendor is taking on ARM.
There are a few major problems I see here:
- 10GbE support on die. This is just amazing, but to be honest, I don't see any mention of FCoE/DCB here. To make the 10GbE controller useful, it needs to be more than an Ethernet controller or it's just a waste of die space. A modern 10GbE controller needs to support priority flow control and enhanced transmission selection. This is a minimum requirement for supporting FCoE. In addition, added support for RDMA over Ethernet is a major requirement in today's environment. Add to that VN-Tag support and you have a controller worth using. Yes, I know only a handful of vendors have those features today. Broadcom, Cisco and, it looks like, soon Intel will all have them. But two 10GbE controllers with those features are a minimum requirement in a data center network adapter for 2014.
- Virtualization support. ARM has it, but it's borderline crap at the moment. This is really not ARM's fault. They're just noobs to this category of computing and it'll take some time to get it worked out. They should be actively working with VMware, Microsoft, Red Hat and Citrix to make this happen. They should get silicon out there as soon as possible so those companies can get bare-metal hypervisors ready for the ARM processor in the real world.
- Compiler support. ARM/AMD need to stop screwing around with the ARM compiler, which has always been a pain in the ass except when developing boot loaders or code close to the hardware. They need to put together a team of real engineers to take LLVM seriously. Thankfully, the guys over at Apple take ARM seriously, but they're interested in the 32-bit core from ARM. I haven't heard a single rumor of a chip from Apple which will employ 64-bit instructions. In fact, in a smartphone, it's probably almost a disadvantage to waste space on wider word width. What's the point of adding a huge amount of CPU power using 64-bit in a phone when 99% of what you want to accelerate would profit more from custom cores and better GPUs? In server land, it's all about CPU, and therefore neither ARM nor AMD can bank on someone like Apple taking on the optimization of back-end code generation for 64-bit ARM, as there's just no profit in it for them. AMD needs to invest to make this happen.
- Wide buses. Most people simply believe that PCIe buses come for free. The fact however is that the PCIe controller of the CPU takes a tremendous amount of bandwidth and requires direct memory access to the CPU cache as well as system memory, causing major issues. This increases the amount of multiplexing that has to occur within the CPU cache, especially when trying to support cache coherency. This is generally accomplished by placing a much larger burden on the cache logic itself; the effect is either to slow down the cache and the speed at which the CPU cores can access it, OR to increase transistor count and power consumption. Want to see all those great benefits of ARM technology go bye-bye? Add more PCIe lanes.
- Microsoft support. Yeh... I know... we all hate Microsoft, but to be fair, unless Microsoft throws some server and data center love at ARM, there's little hope of this being a really useful technology outside of corner cases. Sure, you can do more web serving. You can maybe run some monster apps like Hadoop, but in the end, companies run on Windows Server, Exchange and others. I doubt there's a whole lot of processor-specific code in Exchange, but there's bound to be some. In addition, for developer workstations, there should be a version of Windows which runs on ARM with desktop support and Visual Studio, for example. Most developers really don't enjoy debugging their apps remotely. There's something more natural about debugging on the machine you're coding on.
Guys, it could happen... I hope it does, but unless ARM and AMD take the real-world problems seriously, I don't see this being more than a niche market item. And worse, if AMD comes to market with a half-assed solution, it'll become a problem like Surface vs. Surface Pro. People still go to the store and buy computers with either Intel or AMD chips in them. If they get a machine with an AMD chip and it's an ARM, and they end up running Windows RT or, worse, an Android variant, people will stop looking for AMD systems because it's too confusing.
x86 is definitely not the ideal instruction set, but RISC vs. CISC or VLIW was never really what it was made out to be. There were just as many disadvantages to RISC as there were to CISC. Code size on RISC was huge. Then we ended up with the bastard stepchild of RISC being Thumb, which was somewhere in between.
These days, the instruction set means nothing in reality. It's all about efficiency in the processing itself. It's about things like how the CPU handles cache coherency, how the CPU manages passing code between cores, and how it handles multiple ring-0 contexts (effectively making ring 0 the new ring 0.5). It's about handling SLATs. These are all things which matter. Then of course what matters is the ability to power down major parts of the chip. This is something which doesn't work well in a single-die environment where 99% of the chip is synthesized from a common VHDL/Verilog code base which doesn't allow for the analog nature needed between units.
Intel's chips make use of the x86 and x64 instruction sets, but no decent processor today will use them directly when executing code. Now the next generation of Atom is also doing away with x86 and x64 in the core and replacing them with an instruction-set-agnostic architecture. The CPU will instead attempt to recompile the code when it receives it in order to handle tasks out of order. In fact, to a certain extent, the nature of x86 and x64 lends better to this design, since RISC groups everything into a single instruction wherever possible. Intel's nature is relatively granular and will be easier to recompile on the way in and manage dependencies for. I can very easily design algorithms in my head for out-of-order execution of x86 instructions, where ARM instructions require a second phase altogether to manage the instruction dependencies... though it's not particularly difficult either... it just takes more transistors.
If you also give me a chip with AVX2 instructions, then I'm really happy. AVX2 is just damn sexy in everything regarding mobile phones. It would allow me to vectorize my code and make use of two-in-one-out instructions. If they added a new instruction for a 16x16 16-bit hardware transpose operation, or an extra flag to access registers vertically instead of horizontally, I'd be in love. At the moment, a 16x16 transpose is the last missing instruction in AVX2, in my opinion.
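To show why that one instruction matters, here's a toy model (Python standing in for the intrinsics): without it, a 16x16 transpose decomposes into log2(16) = 4 dependent rounds of pairwise swaps, which is essentially the unpack/shuffle ladder you end up hand-coding in AVX2 today:

```python
import numpy as np

def transpose16(m: np.ndarray) -> np.ndarray:
    # 16x16 transpose as 4 rounds of block swaps (block sizes 1, 2, 4, 8),
    # mirroring the unpack/shuffle stages you'd write with SIMD intrinsics.
    m = m.copy()
    step = 1
    while step < 16:
        for i in range(16):
            for j in range(16):
                if (i & step) and not (j & step):
                    m[i, j], m[i ^ step, j ^ step] = m[i ^ step, j ^ step], m[i, j]
        step <<= 1
    return m

a = np.arange(256, dtype=np.int16).reshape(16, 16)
assert (transpose16(a) == a.T).all()  # 4 dependent rounds vs. the 1 wished-for op
```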
I agree, he was a bit trollish, but HH has been far more active in things like genetic research during most of that period than involved Internet-wise. This isn't to discredit HH, and frankly his predictions are probably about as sane as anyone else's... well, maybe not John C. Dvorak, who has successfully predicted the exact opposite of everything in the industry for nearly 30 years.
But to be honest... there are some issues here. For example, you can't help but feel that, as what could be considered one of the fathers of ARM, he might be a tad biased. Let's not ignore that all of his computer companies did get their asses whipped by companies like Apple, Intel and Microsoft in the long run. ARM is really his only computing legacy that I could Google which has survived, and impressively so. So discounting all the places where his ventures fell on their rears, he did an amazing job in the case of ARM.
I can't help but personally dislike ARM, and it comes from trying to write compilers and assemblers for the platform. I actually found it to be the only platform ever made that I considered less elegant than PIC. It was aggravating as hell, and I wished they could just pick a damn instruction set and stick to it. That said, if Intel loses its crown, I sure as hell hope it's not to ARM but instead to a company which actually cares about developers and isn't so hackish.
For a sixth generation of computers, I really hope that someone creates something new. I felt a great deal of hope for XMOS for a while, but they're pretty much stagnated into boring crap now.
My two cents
Larry should revive the Sun name. Oracle has a worldwide reputation for things that are VERY EXPENSIVE. Sun wasn't cheap, but Sun's name was more closely tied to innovative and powerful. I know that since the buyout, almost none of my customers (the biggest IT companies in Norway) have looked twice at what was once Sun.
Also, Sun did a much much better job on keeping OS documentation available to all users. People even bought Sun machines because the documentation wasn't buried like it is now.
Oh. Solaris? Who's going to use an operating system if they're not even sure what it's called anymore?
Re: With twice the users...
Strange, I bought a Surface and two Surface Pros. My wife and I haven't even charged our iPads in weeks.
Sure... There are a handful of bugs, but less than on my iPad 3.
Only true bummer is I have to use the stylus from my Samsung Series 7 Slate because the Stylus from Microsoft doesn't work well when held at a slant. But the Samsung pen is absolutely amazing on there.
Almost exactly the reason I won't buy Nokia, HTC or Samsung
It appears that other people may be happy with a $400-$1000 investment in a phone which no one at the company that makes it will care about a few months later. I personally demand that whoever I buy a phone from makes only one or two models per year. That's why I have an iPhone 4 and am anxiously awaiting the Surface Phone.
Re: actually no
I would say this card is an excellent example of way too little, too late. Besides the points you made, let's focus on the fact that this card is of minimal advantage without making fabric upgrades for 8Gb/s FC. After all, you buy switches and directors when you build the cluster, so you don't generally have a pile of 8Gb/s ports lying around unused. When you upgrade the HBAs, you will also upgrade the fabric. There's no real advantage to replacing HBAs unless you're upgrading hosts and maybe directors as well.
So, if you're going to upgrade, who the heck would even care about pure FC when CNAs are far more cost-effective at this time? I can't even begin to think about who the target market for these is.
First... YES, FCoE DOES require special hardware to function properly. This is why we have converged Ethernet adapters.
So if we leave aside the fact that the author basically has less than the slightest clue about modern network environments, maybe it's a good article to sell $500 network cards to other people who lack a clue.
Re: Give it time
Microsoft gave away Samsung Series 7 Slates at their Build conference. An AWESOME machine. I've used the one I bought with Windows 8 for over a year now. Once the Surface Pro comes out, I'll even stop carrying my ebook reader with me (iPad 3). I only need two tablets with me when I'm working: one for work and one for ebooks and watching videos. I use the Series 7 Slate for work, as the iPad is useless for that. I use the iPad for the high-res screen. 206dpi will be fine when I get the Surface Pro. Then I can have useful computers with me. ;)
First to clarify, the profit I made was from porting libraries to be used by others. No released app yet. So there's nothing to see here.
The application I'm developing is an SSH client integrated with Visio which also has thumb oriented macros targeting Cisco and Juniper configuration.
I did present this disjointedly. My apps take longer to write. My customers' apps take longer to write. It is because we're all waiting for the infrastructure to establish itself. This takes time.
On iOS, it took years for libraries to all make their way over, and even now it can often be problematic. Android is an absolute nightmare for libraries at times, since you have to make the judgement call of whether to diddle around writing Java code which is useless for you on other platforms, or whether you'd prefer to use native code instead, which is also a half-assed option. BlackBerry is semi-OK, since you can port half-assed Android apps to be half-assed BlackBerry apps using their Android compatibility layer.
When MS finally permits native code on Windows RT, and it almost certainly will happen once they feel there are enough Metro apps, Windows RT and Windows 8 will be the same tool. Windows RT was strategically amazing. It's like iOS and OS X. You have a great deal more limitations on Windows RT (like with iOS), but instead of the Apple stupidity of making it so you have to choose to develop for either iOS or OS X, you can write one app which runs on both Windows RT and Windows 8.
Then you have Google making ChromeBooks with a full OS which runs Java, but they don't even have Eclipse up and running on it. That is SOoooo lame. Like, "Oh... you want to make an app for Android? Yeh, go get one of our competitors' products, we can't do that."
Re: Give it time
Uhh... I paid $800, invested two days and made $9000 on it. Where did I lose money on it?
Same reason everyone else has problems competing with the Apple iTunes Store: they make half-assed Windows and Mac solutions for interoperating with their core products.
Step 1) Make a cross platform library for audio and video communication that compiles on Windows, Mac and UNIX systems.
Step 2) replace software codecs with platform specific hardware codecs
Step 3) build an awesome UI for Windows AND Mac. Make sure you have a good Metro UI as well these days.
Step 4) make sure there is encryption... No SRTP crap. ZRTP or something better.
Step 5) make it integrate with PBX systems via SIP or Cisco ATA boxes.
Step 6) add cheap or free calls to classic phone networks
Step 7) spend a bunch of time figuring out why the heck you just invested $10,000,000 and two years in the development of a product that has no reasonable means of producing revenue.
Step 8) recognize that Skype technology is core to Microsoft Lync and FaceTime is core to iOS, and get a grip and realize that there is no practical market share left for another player
Step 9) laugh your ass off at Cisco for pissing away a ton of money on Tandberg, only to be almost entirely obsoleted by Skype and Lync and tablet devices.
Re: why should I only be able to talk to people who are signed up to MS
Give it time
Let's pretend for a moment that Microsoft isn't as dumb as everyone thinks when it comes to business.
1) a high price tag on Surface limits the number of consumers who will spring for it. Nearly every developer I know preparing apps for Windows 8 has bought one and we consider it the reference platform for Windows RT development.
2) the API for the Windows Store is a much bigger pain in the ass than we expected. After 20+ years of using almost the same APIs for Windows development, it's actually slapped some of us silly. Many developers by nature are in fact procrastinators with cause. We prefer to wait for the final version of the OS to ship before we start porting to the new platform. That way we can benefit from the stabilization that comes with other developers submitting bugs and Microsoft adding APIs which are needed. In my case, it wasn't until the last month before RTM that the API for keyboard handling I needed came about. In addition, some of us old dogs are trying to stick to old tricks, and it isn't always working.
3) a bunch of us are waiting for third-party libraries to be released before we move forward. I, for example, have developed my app as a Windows desktop app with WPF while waiting for the libraries I use to compile against the new sockets layer.
4) text rendering is a big problem and documentation is still not good enough. If all you want is the same crap apps as iPad or Android, get one of those. But when we develop for Windows, we deal with a more fickle audience and it's extra important to get it right. Don't believe me? Ask Corel. They've developed half-assed applications on Windows for years and the current generation doesn't even know their name. I'll spend an extra three months getting my Windows app right as opposed to releasing iPhone- or Android-quality crap.
5) it's a tricky change for developers who want to code apps for both desktop and Metro. It's quite hard to do. Often you have to write your code twice. As much as Microsoft has made it easier, it seems impossible to maintain a single copy of a library for both Windows Store AND desktop in the same project file. This really makes it more complex.
So, a higher priced tablet which seems to sell mostly to developers makes a lot of sense. Windows RT will fly when developers catch on. The restriction on desktop apps was a great idea as it forces many of us to code for Metro.
All I can say is, give it time. It will happen. I think once we get our heads around how to develop both desktop and Windows Store apps, it will make a huge difference.
As an algorithm specialist who for years has been designing RAID algorithms and has designed software-based wear-leveling algorithms for portable devices, when I come across an article about a patent verdict like this, I like to know what it is that is being talked about.
What patents were violated? The money is a big number, and it will be appealed for 10 years, and Marvell will file for bankruptcy and sell its assets to a new company run by the same people before paying. So frankly, it's meaningless other than to say "Wow, big number!".
What people (and probably the courts) don't understand is that flash controller algorithms are typically quite trivial. 99.9% of the algorithms can be found in Donald Knuth's TAOCP and are just a mixture of what is already known. The implementation is what is more interesting, and frankly, I doubt there is anything past basic triviality involved in that. That being said, developing chips is rocket science, not because of the algorithms, but because of the art and time involved in designing and simulating before paying a ton of money to prototype. It's not like software development, where if you make an oopsie, you just fix it and recompile. In a chip, you have to design not only the initial logic, but a means to implement patches as well. It's HARD!
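To back up the "mostly textbook" claim, here's roughly how small a greedy wear-leveler is (my own toy sketch, emphatically NOT whatever was patented in this case):

```python
# Minimal greedy wear leveling: remap every logical write to the
# least-erased free physical block. Real controllers add garbage
# collection, bad-block handling, etc., but the core idea is this small.
class WearLeveler:
    def __init__(self, n_physical: int):
        self.erase_count = [0] * n_physical
        self.free = set(range(n_physical))
        self.map = {}                      # logical block -> physical block

    def write(self, logical: int) -> int:
        old = self.map.get(logical)
        phys = min(self.free, key=lambda b: self.erase_count[b])
        self.free.remove(phys)
        self.map[logical] = phys
        if old is not None:                # retire and recycle the old copy
            self.erase_count[old] += 1
            self.free.add(old)
        return phys

wl = WearLeveler(8)
for _ in range(100):
    wl.write(0)                            # hammer a single logical block
print(max(wl.erase_count) - min(wl.erase_count))  # wear stays spread evenly
```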
I would love to review the patent in question and dissect it to see if there was anything Marvell implemented which could be considered more than just gluing together a pile of 40-year-old algorithms.
Re: So sad...
Had me practically choking on my tongue laughing when I saw that.
Re: C++ put me off programming
I don't know... I am not a fan of virtual machines, but I adore JIT, as a JIT implemented correctly would do per-processor optimization on the fly, either at startup or after performing tracing. C++ is my baby; I grew up with it and have programmed in it since I pirated my first copy of Glockenspiel C++ when I was 14. It is truly a thing of beauty.
I have, however, begun investigating compiler and operating system design and development using C++ derivatives, since C++ doesn't have enough ++ for me anymore. I hate the static architecture of ahead-of-time compiling. I don't believe good static code can be produced anymore for the modern world of processors. There are exceptions, the x264 guys being the primary one. But they code in C and blatantly force incompatibility with C++... often out of their own interesting principles.
I'll always love C++, but for now, I think JIT oriented languages are much better technically than C++.
Re: Chrimbo Quizz
Ummm... buying two 128GB Surface Pros and a 64GB Surface when I go to the States next month. I have had a Series 7 Slate for a year. I use my iPad for PDFs and sometimes watching cartoons, but use the Windows tablet when I have to do something meaningful. The Surface (not Pro) is just another iPad clone, mostly a waste of money. I program on a slate and find it to be the most versatile computer (battery life excepted) that I've ever touched.
I have four iPads, a pile of iPhones and a couple of Android tablets... They're all just tablets and therefore nothing useful. Now Surface Pro... it's a laptop, a tablet, an entertainment center... it does it all.
So before you assume that people aren't buying Surface, consider the more likely issue: all the people in the stores selling them are saying "Wait until January and spend a few bucks more." Check the reality.
iPad/Android = toy. Surface Pro = super toy!
I don't mean to be rude, but...
Is it at all possible you're an idiot? This type of thinking is what gets the U.S. into the position it is in. It is counterproductive and spreads conspiratorial nonsense around, making unemployed workers believe their "misfortune" is the result of easily identifiable groups, as opposed to accepting responsibility for their own inadequacies. Of course, I'll assume you're one of the inadequate, so for your sake, I'll say it's obviously someone else's fault.
Foreigners willing to leave house, home and country behind to work their asses off to prove themselves, while risking everything, are simply more attractive than workers who are simply sitting on their asses waiting for the jobs to come to their home towns. Pack the bags, risk it all and move, and you too can be part of the employed. I did, and it pays off... I make far more than if I lived in America, because the companies I work with can feel that I emanate motivation.
Or you can sit on your ass and bitch about how far more motivated people are stealing jobs from your lazy ass!
Facebook is for nonsense. I would hate to have to start being well behaved on there.
LinkedIn is where I pretend like I'm a decent person.
Wasn't this more meaningful when...
Apple made better products?
Really, this year, Microsoft... yes, Microsoft has outdone Apple on cool... with the exception of Nokia phones. When Microsoft has their phone and the Surface Pro out, I just don't see what will make Apple interesting anymore. I have bought 4 iPads, 4 iPhones, 2 Mac Minis, 2 Apple TVs, 2 MacBooks, etc... and now that Microsoft has finally gotten in the game, and frankly, Windows 8 is just about 10,000 times better than iOS (and Mac OS X), I just can't see this mattering.
Apple went from being a monster early on to being almost nothing when Steve Jobs left. Somehow, I feel that Tim Cook just doesn't get it either and there will be no Steve Jobs coming back this time.
Re: There is no reaction image for this
I read both patents, and as a developer of similar technology at companies like Cisco and Opera, I can only say this: I would have implemented the infringing technology without knowledge of the patent, because it's just common sense.
Gambler loses and sues to win instead?
Wow!!! I need to get into this gambling thing. If I understand correctly, a woman bet all her chips on one square, and when the ball didn't land on her number, she decided to sue the guy spinning the wheel because he failed to tell her that the ball might not land on her number?
I'm sorry, but I'm busting a gut here. Gotta love this mentality :)
Re: WL killed off?
You mean the top left corner of the other service Microsoft had already announced killing off in favor of outlook.com?
Re: Static typing is not the right solution
I practically memorized the dragon book and the LCC book and some other good ones. I also developed a browser for a living for 5 years. So please don't simply discount me for chiming in.
Now all we really need is tight integration between TypeScript and OpenCL. That would be amazing.
Re: ActiveX Comes to mind
Haha!!! Best comment I've seen in a long ass time!
Seriously, I was reading comments just to find the moron who would bash the technology because of lame-assed anti-Microsoft sentiment.
Personally, I've been looking for something like TypeScript for ages, so I'm happy to see it come. It would be fantastic to have a real programming language in the browser. I was just so pleased to see your response to this guy :)
Nokia fails because of Elop and Ballmer, not Windows
Ballmer and Elop make the Nokia handsets look like something for sales people. It's pretty sad.
Personally, I'm itching for a Windows 8 phone, but I can't find anything but the two Steves' phone. I wouldn't be caught dead with something that looks as shitty as the Lumia. It looks like something to be sold at Toys R Us in the little girls' play-makeup section.
Ativ S looks ok, but I won't spend that kind of money knowing Intel based full HD devices are soon to come.
So without free advertising, Firefox can't get downloads?
Funny. It seems crappy to me that Microsoft has to give away free advertising.
What about Apple? Last I checked, not only do you have to search for web browsers yourself, but when you do, Apple posts warnings about them being dangerous.