Don't know how popular Xamarin is, but Unity is built on top of Mono and supports C# on just about every platform out there.
They raised $60m in 2013, a further $500m last year, and are now looking for yet another $500m.
Maybe it's time to consider a business plan that makes money from people other than investors, guys....
How does one go about hiding the fact that one is now very very rich...
That's actually the easy part. Register a company somewhere, and quietly buy property all over the world. Wait a year or so and then take a little vacation somewhere with a nice climate. It's pretty easy to buy a residency permit more or less anywhere in the world after you arrive (in most cases you just need to invest a certain amount in the local economy) and then you're an "international property investor".
Li-Fi hackers are now targeting your area! Protect yourself before it's too late. Our patented curtain rail system with multi-dimensional curved corners ensures no Li-Fi signals can be read by hackers from outside your home.
By connecting these rails with our specially woven curtains that incorporate the latest reflective technologies, not only can you rest easy knowing that your data can not be read but you'll also benefit from faster connection speeds and better network coverage within your home.
Order now, and receive a free Li-Fi wall protector panel. Research sponsored by the Tin Foil Association of America has shown that up to 20% of Li-Fi signals are absorbed by the walls of your home. Our unique, specially designed wall panels can reduce this loss to a staggeringly low 1%.
Don't be a victim. Call our sales team today!
Yes, it's a great thing. No more shitty anti-aliased text. No more ClearType bullshit - the bastard love child of George Lucas and 1970s red/blue 3D - ghosting your text and looking shit on almost-horizontal lines.
Windows 8 scales just fine. It even scales legacy applications that are not high-DPI aware. (Admittedly that could be done in a nicer way, but at least it does something - which is more than can be said for Windows 7.)
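For what it's worth, here's a minimal sketch of how an application opts out of that legacy scaling and tells Windows it handles DPI itself (the manifest entry is the preferred route; SetProcessDPIAware() is just the simplest way to show the mechanism):

    #include <windows.h>

    int main()
    {
        // Opt this process out of DPI virtualisation. Without this (or
        // the equivalent manifest entry), Windows renders the app at
        // 96 DPI and bitmap-stretches it on high-DPI displays.
        SetProcessDPIAware();

        MessageBoxA(NULL, "Rendered at native DPI", "DPI demo", MB_OK);
        return 0;
    }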
Design flaws in Linux? ...because it uses industry standard methodologies that have been tried and tested for the past 40 years...
fopen, strcpy, memcpy, et al. You mean those kinds of "industry standard" methodologies?
I haven't lived in an English speaking country for almost 15 years. I am used to watching movies that have been either dubbed or subtitled - or are in a foreign language without subtitles or dubbing.
About an hour into Cloud Atlas, I found myself thinking that I will have to watch it again in English so that I can understand what the fuck is going on.
About an hour after that, I realised that I *was* watching it in English, and that no number of subsequent viewings were going to make the plot any clearer.
Microsoft started dictating what level of functionality had to be supported for a video card to claim to be DirectX version X compliant a long time ago. I think it began around the launch of Windows Vista. That was the point at which GPU manufacturers stopped putting in what they wanted to put in, and started putting in the functionality required to be compliant with DirectX feature levels.
And yes, I'm deadly serious about the DirectX API being an open standard. Lots of Microsoft technologies are open standards. Why not this API?
I have always wondered why Microsoft never created an open standard from DirectX, to be honest. It's had its problems in the past, for sure, but these days it is a surprisingly nice, well designed API.
...we refer you to its clean sweep of the 32nd Razzies, where Sandler deservedly picked up both Worst Actor and Worst Actress awards, and the "twaddle-fest" was further honoured with Worst Picture, Worst Supporting Actor (Al Pacino), Worst Supporting Actress (David Spade as "Monica"), Worst Screen Ensemble (The Entire Cast of Jack and Jill), Worst Director (Dennis Dugan), Worst Remake, Rip-Off or Sequel (for its debt to Glen or Glenda), Worst Screen Couple (Adam Sandler and EITHER Katie Holmes, Al Pacino OR Adam Sandler) and Worst Screenplay (Steve Koren and Adam Sandler).
I have to say, I'm kinda intrigued. It's a bit like looking a bit too hard at a road accident - you know you shouldn't, but you just can't help yourself.
As opposed to Google who are openly slurping as much information about you as possible to sell on to advertisers.
I suggest Microsoft add similar warnings to Windows for when Chrome starts.
I agree that the design goals of C didn't include security. Back then I'm sure people were too busy being excited about every little thing they invented. I totally get that.
But that brings me back to my original comment. Technology has moved on. Times have changed. We've grown up and matured as an industry and as individuals - well most of us have. Anyone can write software today, not just people who spend their days in labs wearing white coats and smoking pipes. And because of this, our tools and technologies need to mature as well. No longer should we be making it so easy for programmers to make what are, essentially, simple mistakes. Sure, the languages of today can allow access to hardware and allow programmers to overrun buffers, but these things should not be considered the norm. There should be better alternatives in place for 99.99% of the tasks developers need to perform.
A lot of C/C++ compilers these days can issue warnings if you use unsafe/legacy functions. Turn those warnings into errors and let's move forward together as an industry.
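To illustrate (a minimal sketch; the buffer and input string are made up for the example), this is exactly the class of mistake those warnings catch. MSVC flags strcpy with warning C4996 unless you define _CRT_SECURE_NO_WARNINGS; add /WX (or -Werror on gcc/clang) and it becomes a build-breaking error:

    #include <cstdio>

    int main()
    {
        char buf[8];
        const char* input = "far too long for the buffer";

        // strcpy(buf, input);  // classic overrun: no length check at all

        // snprintf never writes past the buffer and always NUL-terminates.
        std::snprintf(buf, sizeof(buf), "%s", input);
        std::printf("%s\n", buf);  // prints "far too" - truncated, not corrupted
        return 0;
    }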
Shoddy programmers who don't check length parameters are the fault here.
You can blame the programmers all you want, but at the end of the day, your language/library of choice either allows programmers to make such mistakes, or it doesn't.
The design goals of any language (or coding standard for that matter) should include "make it as hard as possible for people to fuck up", because at some point in time if someone can fuck up they will, and the easier it is for someone to fuck up, the more often it will happen.
You could be the best programmer on the planet, and you'll still make mistakes - probably on a regular basis. The more mistakes that can be caught before you even attempt to run your code the better.
To be honest, still using C today (or the C runtime library in C++) has always baffled me. It's obviously not designed to be secure (yes, there are more secure variants of most functions these days), so I don't understand why people act so surprised when exploits like this are discovered.
I finally got fed up with Opera 12. It's falling apart these days as the web moves forward.
I plumped for Firefox with the "Tree Style Tab" extension (which is far superior to Opera's tab grouping IMO) for tabs down the left side of the screen.
I customised the address bar a bit too, so I can 'g...', 'w...' for Google and Wikipedia respectively. I'm just missing Paste & Go on a single keyboard shortcut.
I'd like to find an extension which lets me block content selectively (and show me the URLs so I can add them to the filter on my router), but that's all I'm missing at the moment, I think.
Other than that, I have to admit I'm reasonably happy. Which is definitely different to when I tried Firefox many years ago.
Actually, he's given us a kernel that runs on an immeasurable number of devices, from embedded systems and phones right up to the majority of supercomputers. There are very likely far more running instances of Linux in total today than there are Windows PCs (and certainly phones).
Popularity is a terrible measure of quality.
Seagate sounds more like a political scandal involving half a kilo of Colombian nose dust, a few scantily clad women of negotiable affection, three overweight gibbons, and a couple of Panamanian-registered boats.
I'd certainly prefer that over having to use one of their hard drives any day. Except maybe for the bit involving the gibbons.
I don't know how much effort is required to build Win7 / 8 / 10 compatible apps out of an older XP codebase...
Building can be a hassle, but more from using an updated compiler than an updated OS. The STL that shipped with Visual C++ 6 and earlier was vastly inferior when it came to obeying the C++ standard. If you use anything from there, expect to have to change a lot. There have also been major changes and conformance clean-ups in the C-Runtimes too.
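The most famous of those clean-ups, sketched from memory: VC6 used the pre-standard 'for' scoping rules, so loop variables leaked into the enclosing block, and plenty of old code relied on it:

    #include <cstdio>

    int main()
    {
        // This compiled happily on VC6, but fails on any conforming
        // compiler, because 'i' no longer outlives the first loop:
        //
        //   for (int i = 0; i < 10; ++i) { }
        //   for (i = 5; i < 10; ++i)     { }   // error: 'i' undeclared
        //
        // The conforming fix is simply to declare it again (or hoist it):
        int sum = 0;
        for (int i = 0; i < 10; ++i) sum += i;
        for (int i = 5; i < 10; ++i) sum += i;
        std::printf("%d\n", sum);   // 45 + 35 = 80
        return 0;
    }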
You generally have to add a manifest file to your project, and link that in too, but that's no more than a few hours' work to look up how to do it online and add the bits you need (they're just XML files).
If you use third party libraries, you'll obviously have to find or build conformant versions of those too, which can be the biggest problem if you're tied to particularly obscure ones.
Once you've managed to get things building, the main issues arise when the application "assumes" things about the system: where it can save configuration/temporary files, where the Program Files folder is, trying to do things only an administrator can do, etc. I don't think many kernel functions have been removed since XP (that said, if you use an old version of DirectX - specifically D3DX - you're in for a world of hurt).
Ironically, moving to a 64-bit OS and a 64-bit process (more pain if you think sizeof(size_t) still equals the sizeof of everything else) can actually reduce application instability, simply because you're less likely to run out of memory if you treat memory as there to be leaked.
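A quick sketch of what I mean, for anyone who hasn't been bitten yet (the values shown are for Windows' LLP64 model; other platforms differ):

    #include <cstddef>
    #include <cstdio>

    int main()
    {
        // On 64-bit Windows, int and long stay at 4 bytes, but
        // pointers and size_t grow to 8.
        std::printf("sizeof(int)    = %zu\n", sizeof(int));          // 4
        std::printf("sizeof(long)   = %zu\n", sizeof(long));         // 4 on Windows
        std::printf("sizeof(void*)  = %zu\n", sizeof(void*));        // 8
        std::printf("sizeof(size_t) = %zu\n", sizeof(std::size_t));  // 8

        // The classic port-breaker: stuffing a pointer into an int
        // silently loses the top 32 bits on x64.
        //
        //   int handle = (int)somePointer;   // "worked" on 32-bit XP
        //
        // Use intptr_t/uintptr_t - or better, just keep it a pointer.
        return 0;
    }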
As I do for my development; the Win7 machine is too flaky compared to XP.
Or perhaps Win7 doesn't let your dodgy apps run rampantly through the system raping whatever they like along the way? ;)
Seriously though, I don't think anyone can really call Windows XP more stable than Windows 7. All my Windows 7 systems have been rock solid since the day I built them. Most of them are usually up for weeks - if not months - at a time with no problems at all.
Good God, man, that was 13 years ago. Almost 14 if you can count properly right off the bat. (Which I clearly can't.)
I have trouble remembering last Wednesday's breakfast. (But putting White Russians on your Cornflakes can take you that way sometimes.)
I guess my argument would have made more sense if The Register had a tongue-in-cheek icon, huh? ;)
Substitute "Facebook" with "The Internet", and you might be on to something.
As much as the Internet has helped us, it has also reduced productivity - especially in the office. Back when I first started working (early 90s) there was no real Internet (technically there was, but we didn't really know about it), and email rarely happened (I vaguely recall some convoluted DOS prompt login process which allowed me to be informed I had no new messages). When I sat at my desk, I worked - or looked at and discussed the cool things people around me were working on (I worked in the games industry at a well respected developer - the stuff going on around me was cooler than cool).
It's not even cordless.
Bit of a pain having to stop every few miles to clear off the cat fur and stray pubes, though...
If you regularly drive through an area with enough stray pubes to bring down a Tesla, I think you probably have more important things to worry about.
All you actually need to do is stick a couple of metal brushes on the bottom of the car, and turn all pedestrian-free roads into giant Scalextric tracks.
By all means release PoC code after the patch has been released, to show what was done, but making code available to exploit the bug before the patch has been released?
Even releasing source code after a patch is still pretty irresponsible IMO. There are millions of PCs out there that either won't be patched at all, or will be patched days or weeks later. (Either by lazy fucks like me who restart their PCs once in a blue moon, or by IT administrators who like to try stuff out for a while to make sure it doesn't break anything else.)
My take has always been this:
Agile development is project management for managers who can't manage projects.
Four cars an hour isn't much slower than a single pump in a petrol station. At best, a single pump can't serve more than 15 cars an hour - that's four minutes per car once you've filled, paid, taken a quick selfie, and posted an update to Facebook.
So let me get this right...
You can start using the technology in the patents for free now, but in 2020 Toyota will come a-knocking looking for their cut?
What I want is a telly that can stream from a NAS box with a client/application that respects account/folder permission structures (unlike DLNA unless it's changed recently).
Maybe I'm missing something, but when I'm streaming video from my Windows 7 "server" to my iPad, I see DLNA servers for every local account on the PC. I would assume that which server I connect to controls what I can watch from it.
That's not something your TV should care about, that's a server configuration issue. Just create a DLNA TV account and set permissions accordingly.
How about this 5120x2880 Dell monitor? http://www1.euro.dell.com/content/products/productdetails.aspx/dell-up2715k-monitor?c=uk&cs=ukdhs1&l=en&s=dhs
Paris, because 19 inches just isn't enough.
Actually, while reading the article I was wondering what Seagate drives are like these days. I remember when Seagate was a synonym for garbage, and I tended to avoid their drives like the plague. I still avoid them now simply because I have so many bad memories (mostly of stories that happened to other people, I might add). I only ever bought one Seagate drive, and that was more than enough.
Anyone care to share their opinions on Seagate these days?
In other news, crass generalisation made on Internet forum found to be not true.
"If [Pirate Bay's] code wouldn't be so shitty we would make it public for everyone to use, so that everyone could start their own bay."
That's never stopped anyone from open sourcing code in the past.
It's like commercial television: programs are a necessary cost centre - without them, nobody watches the ads.
Does anyone actually watch ads on TV? I generally do one or more of the following when they come on:
Mute the TV.
Head to my PC and check email/messages/etc.
Go make a snack.
Go get a drink.
And to keep this about Google:
Has anyone *not* skipped the ads when watching a video on YouTube? Apart from being incredibly annoying, they are the lamest implementation of adverts ever - I just love the way they start playing adverts mid-sentence, halfway through a scene.
Unknown is more than a bit worrying. I have twice encountered "unknown" as a gender in HL7 messages (the "standard" messaging format for health data). Both times it was an unidentified emergency patient involved in an accident of sufficient severity that their gender could not be determined.
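For the curious, gender lives in PID-8 (the "administrative sex" field, HL7 table 0001: M, F, O, U and friends). A toy sketch of pulling it out - the message below is entirely made up:

    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main()
    {
        // Hypothetical HL7 v2 PID segment; fields are separated by '|'.
        std::string pid = "PID|1||12345||DOE^JOHN||19700101|U";

        std::vector<std::string> fields;
        std::stringstream ss(pid);
        std::string field;
        while (std::getline(ss, field, '|'))
            fields.push_back(field);

        // fields[0] is the segment name, so PID-8 lands at index 8.
        if (fields.size() > 8 && fields[8] == "U")
            std::cout << "Administrative sex: unknown\n";
        return 0;
    }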
Google probably don't need to worry about this one then.
I picked Norway because it was mentioned in the article, and happens to be where I currently live. If I could be bothered, I'm sure I could make the same point with many other countries.
The point I was making (or trying to make) is that while the US may spend a larger percentage of its GDP on social welfare, the actual amount isn't proportional to the size of its population (or the percentage of that population that relies on it). In fact the number of unemployed people in the US is more than triple the entire population of Norway. Putting those numbers into the equation implies that the US spends even less per capita than its #2 ranking would suggest.
As an aside, sucking oil out of the ground accounts for less than half the oil industry revenue in Norway; more money is made from services these days. (For which I am grateful every pay day. :)
The US GDP ($16.25 trillion) is 32 times larger than that of Norway ($500 billion), and the two countries spend 30% and 20% of that respectively (according to the article).
By my maths, that works out to the US spending approximately $4.9 trillion a year on social welfare, and Norway about $100 billion a year.
The population of the US (320 million) is 61.5 times larger than that of Norway, which currently stands at just under 5.2 million.
So per person, the US spends just over 15,000 dollars a year on social welfare, while Norway spends just over 19,000 dollars per person.
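A quick back-of-envelope check, using only the figures assumed above:

    #include <cstdio>

    int main()
    {
        // GDP, share spent on social welfare, and population,
        // all as quoted in the posts above.
        const double us_gdp = 16.25e12, us_share = 0.30, us_pop = 320e6;
        const double no_gdp = 0.50e12,  no_share = 0.20, no_pop = 5.2e6;

        std::printf("US: $%.0f per person\n", us_gdp * us_share / us_pop);  // ~15,234
        std::printf("NO: $%.0f per person\n", no_gdp * no_share / no_pop);  // ~19,231
        return 0;
    }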
I have to agree though. Uber have a service that involves customers actually paying for something.
Ultimately, I suspect they'll just register themselves as a bona fide taxi company where they've been banned, and their business model will adjust accordingly in those areas.
A friend tried this once. He just used regular ice trays though.
After taking the frozen water off the top, he ended up with really strong, cold whiskey, which was much nicer than the watered down crap they sell you in the shops. So he did it again, and again, and again.
Eventually he acknowledged he had a problem and finally stopped doing it. Which was just as well because it was starting to eat through his coffee table.
Did someone forget to tell the USA that we have been in the Digital Age for quite some years already?
I think someone sent them an email. Maybe it's still downloading...
...will be perfect in the upcoming Jedi Vampire Hunters spin off show.
"but then it occurred that there are people who --if they were empowered financially-- might do greater good"
Or go out and spend it on booze, drugs and "ladies of negotiable affection" - and then waste the rest.
Interestingly, nearly every experiment involving giving people a monthly salary, or basic income, just because it's the right thing to do, has found that most recipients actually wasted less money on trivial things and spent it on bettering their lives and the lives of those around them.
Barbie Does Paris?
Bit of an odd name for a messaging app, isn't it?
I believe it to be a play on the words "What's up?", a phrase commonly used by the young people when greeting each other. Because it's an application running on a mobile device, it cunningly combines the aforementioned phrase with the increasingly accepted shortened form of the word "application", viz "app".
But yes, it's a bit odd, as you so eloquently put it.
Do you have any idea how hard I resisted making some pun about Maunder being late for tea time? :p
We might be due another "Minimum", but it won't be Maunder's. The Maunder Minimum was very specifically the period of low sunspot activity between approximately 1645 and 1715.