Re: 52 weeks per year
GPS doesn’t have years. It just counts 1024 weeks.
If I were the developer, I would note that a timestamp can never be older than the device itself (if you buy a TomTom today, it will never legitimately receive dates from Jan 2019 or earlier), so we can just use a sliding window and hope the device doesn't last 1023 weeks :-) Updating the ROMs going into manufacturing once every year, or even every five years, would probably be fine.
But even without this, you'd only get into trouble if some satellites are in week 1023 and others in the next week (week 1024 but transmitted as 0). But that state lasts for less than a tenth of a second.
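A minimal sketch of that windowing idea in Python, assuming the receiver knows a lower bound on the current week (for example, the week its firmware was built); the function name and epoch arithmetic are illustrative, not any vendor's actual code:

```python
GPS_ROLLOVER = 1024  # the GPS week field is only 10 bits wide

def resolve_gps_week(raw_week, min_week):
    """Map a 10-bit week number onto the first full week number
    that is >= min_week (e.g. the device's firmware build week)."""
    assert 0 <= raw_week < GPS_ROLLOVER
    # Start from the epoch containing min_week, then bump up one
    # rollover if the candidate would lie in the past.
    candidate = (min_week // GPS_ROLLOVER) * GPS_ROLLOVER + raw_week
    if candidate < min_week:
        candidate += GPS_ROLLOVER
    return candidate
```

The result is always within 1024 weeks above `min_week`, which is exactly the "window" in the comment: the scheme only breaks once the device outlives a full rollover period.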
"You gambled with Crypto and got burned. I have no sympathy for you since it was us gamers that got screwed in all of this."
NVidia didn't get burnt. They made a ton of money from crypto. Five times more than from gaming (just guessing). And now crypto is mostly over, so they make less money. The crypto money is still in their pocket. Now people who bought shares when profits went up because of crypto, they will be in pain. But NVidia isn't.
But Nvidia? Imagine someone offered you four times your current salary for three years, and after three years you are back to your old salary. Did you get burnt? No, you didn't. You made lots of money.
I needed a DUNS ID for my hobby app in the app store, and got it for free, with no problems, so this isn't exactly what you'd call a "hurdle".
And we are talking about an "enterprise license" here: a license to distribute apps within your enterprise. Which is pointless for companies with fewer than 100 employees, because you can do that as an ordinary developer.
There was the story of one insurance company that was _mostly_ Y2K ready. The only problem was that they couldn't handle people whose lives spanned _three_ centuries, like someone born in 1899 and still getting a pension in 2000. They found they had 14 such cases, so instead of changing the software, someone got the responsibility of handling these 14 people manually.
"Shamir Secret Sharing". The first time I read about it was in Donald Knuth's "Art of Computer Programming", printed some time in the 1980s. Of course you might tell Adi Shamir not to write his own crypto, but he'd probably just say "and who are you?"
And it works perfectly fine for k out of m people: just a polynomial of degree k-1 evaluated at m points, so any k results reconstruct the polynomial, and doing everything over a finite field ensures that k-1 results leak ZERO information.
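A compact sketch of that scheme, with the prime, the share coordinates, and the function names all chosen here for illustration:

```python
import random

# The finite field: 2**127 - 1 is a Mersenne prime, large enough for a demo.
PRIME = 2**127 - 1

def make_shares(secret, k, m, prime=PRIME):
    """Split `secret` into m shares such that any k reconstruct it."""
    assert 0 <= secret < prime and 1 <= k <= m
    # Random polynomial of degree k-1 with the secret as constant term.
    # (Demo only: use a CSPRNG such as `secrets` for real key material.)
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod prime
            acc = (acc * x + c) % prime
        return acc
    return [(x, poly(x)) for x in range(1, m + 1)]

def recover(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        # pow(den, -1, prime) is the modular inverse (Python 3.8+)
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret
```

Any k of the m shares reconstruct the secret; with k-1 shares, every possible secret is still equally consistent with what you hold.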
"For me, a core on a multi-core CPU has always meant that I can run an additional thread without impacting performance. "
Doesn't work like that on modern Intel processors. The cores share a very important resource - the processor cooling. So if you run an additional thread, temperature goes up, and you need to reduce the clock rate.
Doesn't work for any modern processor. The cores share a very important resource - RAM. If you add more cores, your performance per core goes down when the cores start fighting over who can access RAM first. (L2 and L3 cache are also often shared).
"It was also astonishingly expensive hardware/software combo very few can afford. The only real one I saw was at CERN, an organization not known to procure cheap hardware... far better than the Lisa, but too niche to sell in quantities enough to survive."
Long after the Lisa.
"Try systems like the Zerox Star, ICL Perq, the LMI and Symbolics lisp machines. You might also look at the embyonic days of Sun, and just what a Sun-1 was."
I saw a live demonstration of the Lisa vs. the Xerox Star at CeBIT. The Xerox Star cost about five times as much as the Lisa, and the Lisa ran circles around it. Yes, the Lisa was expensive, but what it did hadn't been seen anywhere at that price.
Well, clever boy, the German court in question has decided nothing except that Qualcomm made a claim that isn’t so unreasonable that it needs to be thrown out immediately, and that Qualcomm put up a bond to pay Apple’s damages if or when the claim is eventually turned down.
"Samsung has been using PenTile for 10 years now, since they first started shipping AMOLED screens. Yet now that Apple used it, SUE?"
Of course. The basis for the lawsuit is not so much the facts, but whether it embarrasses the company. Nothing in the world could possibly embarrass Samsung, so no chance for a payoff from them. Very little chance of a payoff from Apple either.
If Ticketmaster included scripts on their website, then they are fully responsible to their users for the actions of those scripts, even if they didn't make the scripts malicious themselves. The only way they'd be off the hook is if they were "not in any way responsible". So being even a little bit responsible is enough to put them on it.
Sure, if Ticketmaster has to pay out damages, then they are absolutely entitled to recover their money from the creator or distributor of the malware. But that's their problem, not the problem of people visiting Ticketmaster's website.
"If Citrix did indeed run a comparison of their sign-ins against publicly known compromised credentials..." then I would be very worried, because Citrix is not supposed to know their users' passwords and not supposed to be able to run such a comparison easily, and certainly not faster than any hacker could.
"I just write it down on a post-it note and stick it under the keyboard"
You want to be protected from evil hackers on the internet, and from nosy colleagues at your workplace. So if you use this method, pick a part that is memorable to you but not guessable by your colleagues, append a longish random part, and only write the random part on the note. Evil hackers on the internet can't read the note, and nosy colleagues usually can't guess the memorable part that goes with it.
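A minimal sketch of generating the written-down random part (the function name, length, and alphabet are my own choices, not a recommendation of any particular tool):

```python
import secrets
import string

def random_suffix(length=12):
    """The longish random part that goes on the note under the keyboard."""
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))

# Kept in your head:   the memorable part
# Written on the note: the random part
password = "mangled-marmoset" + random_suffix()
```

`secrets` rather than `random` matters here: the suffix is doing the cryptographic heavy lifting against online attackers.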
"Not clear they can sue unless they can prove some sort of conspiracy. "
Of course they can sue. It is unlikely they would win a libel suit, since it would be very hard to prove malice. However, if a court ruled "there is not one bit of evidence that Bloomberg's facts are true, but Apple / Amazon etc. cannot prove malice", the companies would be happy with that.
Not quite what he said would happen. If he is being charged now (and we really don't know that at all), it is many, many years later. He could have gone to Sweden, perhaps served a short jail term, and flown back to New Zealand, all before these charges were brought. Now it may be too late.
The jail time wasn't due to the severity. He got jail because he was no longer an employee and had no right to access the old employer's computers at all, so he fell under computer hacking laws. If a current employee had done this (one with permission to access the data, but obviously not permission to send it to a competitor), it would only have been a data protection violation.
Of course the company can sue him and the receiving company for damages in any case.
"My careful study of Apple's computer division over the past few years has indicated that making stuff people want to buy has been off the table for some time."
My careful study shows they are not selling the stuff _you_ want to buy, but they are selling the stuff _people_ want to buy.
"on XE.com the current conversion rate says it should be £941.21"
Oh well. Have you ever heard of VAT? Of the £1,199.00, Apple takes about £200 straight away and sends it to the nice Mr. Hammond, who is going to look after it.
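The arithmetic, assuming the UK's standard 20% VAT rate: the VAT portion of a VAT-inclusive price is price * rate / (1 + rate).

```python
price_inc_vat = 1199.00
vat_rate = 0.20

# VAT portion of a gross (VAT-inclusive) price
vat = price_inc_vat * vat_rate / (1 + vat_rate)
net = price_inc_vat - vat
```

That works out to roughly £199.83 of VAT on a £999.17 net price, which is the "about £200" above.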
Another major difference is UK consumer protection law, which means that Apple will have to fix things that stop working within two years. That doesn't come for free either.
There have been rumours of Macs with ARM processors for a long time.
I'd say: there _will_ be Macs with Intel processors for a very long time. Current ARM processors hold up fine against a current quad-core laptop chip, but they don't come near the high-end Macs (currently up to 18 cores).
But given that new iPhones and iPads come with half a terabyte or a terabyte of storage, Apple could just let users run MacOS X on an iOS device, possibly as an app. Take an iPhone XR, run the "macOS" app, attach a keyboard and monitor.
"The only world that changed was the world of the aimless slurping their Starbucks. 99.99999% of 'the world' has little to do with staring and dabbing at a small slab of glass for hours every day."
Wait a second... Are you telling us there are only 700 iPhone users in the world?
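Taking the quoted figure literally, with a round seven-billion world population (both numbers illustrative):

```python
world_population = 7_000_000_000
share_using_iphones = 1 - 0.9999999   # the remaining 0.00001%
iphone_users = world_population * share_using_iphones
```

Which is where the 700 comes from.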
"If Apple did add extra instructions to an ARM architecture it would fundamentally invalidate every ARM compatible compiler."
Existing compilers would just not use the new instructions. And Apple was always the driving force behind Clang, so you can be sure that any new additions would be supported by Clang - which is what people use to compile MacOS X code. Not that I expect any additions.
"You'd've thought so, but I've recently contacted the developers of two Mac applications I'm rather fond of (both of them first came out on 68k Macs). They're currently only available in a 32 bit version.
According to the developers, the only way they can convert them to 64 bit is to completely re-write them in Apple's Swift language which they are having to learn. And that's going from Intel to Intel...
On the other hand, I'd guess that if you've got a fully 64 bit application written using Swift and XCode, then Apple will arrange things so that converting an Intel application to ARM will be low-effort, assuming that Macs are switched to ARM CPUs."
Going from 32 bit to 64 bit can produce all kinds of problems, especially if the code was written at a time when nobody imagined that 64 bit might ever exist. As you said, the applications were built for 68k Macs initially, so that would have been before 1990.
What these developers probably have is _ancient_ code ported to MacOS X using the Carbon framework, which itself was never ported to 64 bit. That port would have happened around 2002 or so, when MacOS X was introduced, and Carbon stopped being supported about six years ago. There is no requirement to switch to Swift; Objective-C would do just fine. But 32-bit applications won't run on MacOS 10.15 anyway (that's the next version after the just-released one).
Intel 64-bit vs ARM 64-bit using modern frameworks is no problem. The capabilities of both processors are the same, so any code not using Intel assembly will compile and run identically.
Biting the hand that feeds IT © 1998–2019