You sound like those photographers who didn't think camera phones could do real pictures. Technology moves on, especially for government agencies :)
"I guess only disposal fees will make this crap less desirable"
The WEEE directive is part of the reason they do cost more these days, so probably not.
Not sure what you mean by "very little coming out of running a space station". Do you think the people up there are on holiday?
This is kind of the point: you think architectures struggle to scale to a billion customers, but modern cloud architectures don't struggle at all. I never said it couldn't virtualise; I said I don't see why I would run containers or Linux VMs on such overpriced hardware. The only reason I can think of is higher clock speeds, but it's very rare these days to have something so single-threaded and latency-sensitive. My point about the architecture is that it is designed for scale-up solutions, and hence costs a ton of money. For scale-out you're far better off using lots of very cheap systems.
On the point about cloud: it has nothing at all to do with on-prem vs off-prem. Cloud is a very specific solution and Mainframe is not it, whether they've slapped that badge on in the software or not. It just doesn't meet the requirements as set out by NIST, and even IBM's "cloud" isn't a cloud solution; it's just rebadged colo for the most part. All of the platforms I work with also have end-to-end encryption; this isn't a new thing. We also have homomorphic encryption, so data can be used more meaningfully while it stays encrypted in a database.
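To illustrate what "used while encrypted" means, here's a toy sketch of an additively homomorphic scheme (Paillier-style): multiplying two ciphertexts produces an encryption of the sum of the plaintexts, so a database could total encrypted values without ever decrypting them. The key sizes here are deliberately tiny for demonstration; real systems use 2048-bit-plus moduli.

```python
from math import gcd
from random import randrange

# Toy Paillier cryptosystem. Demo-sized primes only; never use in production.
p, q = 1009, 1013
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)                            # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    r = randrange(2, n)
    while gcd(r, n) != 1:
        r = randrange(2, n)
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    return (pow(c, lam, n2) - 1) // n * mu % n

# Homomorphic addition: multiply ciphertexts, decrypt the sum.
c = encrypt(20) * encrypt(22) % n2
print(decrypt(c))   # 42
```

Note that each encryption uses fresh randomness, so two encryptions of the same value look different on the wire.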
I agree, you can be agile on a Mainframe. It's just that nobody is, and every change costs millions. Of course, that can be solved with Linux VMs, but then why have a Mainframe? Every argument comes down to why would you choose to go to the more expensive hardware to implement the same solution, and there are no good answers to that other than 1. you work for IBM or 2. your knowledge is decades out of date.
I can't tell what your argument for moving to Mainframe is though? Lintel does everything you describe there and it does it in a cheaper, more scale out manner. It also exists as a real cloud offering, whereas you just used the term cloud on one rack of hardware that you have to buy (Thanks to IBM Marketing, no doubt!). Cloud is a very well defined term these days, and IBM do not participate in that industry in any meaningful way, especially not with Mainframe hardware.
Your concept of moving to Mainframe once you start to scale is extremely out of date too. Mainframe was designed as a scale up solution which had very real limits. Modern cloud architectures are designed to use massive scale out with multiple containers or VMs to avoid the limitations of older architectures. As a result we no longer need bigger hosts. I'm sure you could use IBM systems to host your containers, but the question would be why, given the high costs?
If you're relying on hardware/software to keep you up and available in 2018 you've missed a lot of things at architect school. These days we plan for failure, expect failure, and plan for a retry of the process. The banks have more than proven how reliable Mainframe is in reality in a modern agile world. Reliability came from the fact that in the 1980s people weren't changing things and adding new features daily. They also weren't connected to a billion customers.
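The "plan for failure, expect failure, retry" pattern above is usually implemented as retries with exponential backoff and jitter. A minimal sketch (function and parameter names are my own, purely illustrative):

```python
import random
import time

def call_with_retry(op, attempts=5, base_delay=0.1):
    """Retry a flaky zero-argument callable with exponential backoff
    and random jitter, surfacing the error only when retries run out."""
    for attempt in range(attempts):
        try:
            return op()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: let the failure propagate
            # back off 0.1s, 0.2s, 0.4s... scaled by random jitter
            time.sleep(base_delay * (2 ** attempt) * random.random())
```

The jitter matters at scale: without it, a thousand clients that failed together all retry together, hammering the service in lockstep.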
Unfortunately you also need to be a billionaire to afford the hardware on which to run it by lunchtime. My sentient cyber brain is at the angry toddler stage after a year of training on an old Pentium 4...mwahahahaha
"Many people think they need to spend years studying advanced math first [to learn AI]"
Not true. ML and AI are already commoditised to the point that a toddler could train a model. Many people do, however, think that you need to understand maths to know if your model is doing what it's supposed to. If the "what it's supposed to" is any more detailed than producing an output, any output, then you'll probably want a keen understanding of maths.
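To show how little maths the commoditised path hides, here's ordinary least squares for a straight line written out from first principles; this is roughly what a one-line `fit()` call does for you:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, from first principles:
    slope is the covariance of x and y over the variance of x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx  # slope, intercept

print(fit_line([0, 1, 2, 3], [1, 3, 5, 7]))  # (2.0, 1.0): a perfect y = 2x + 1
```

Running it is easy; knowing when the answer is rubbish (outliers, non-linear data) is where the "keen understanding of maths" comes in.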
A/B testing is what led to Vista. Do you really want the majority to win in a vote about OS choices again? Technical people are a tiny tiny minority which is why Linux is really popular here but the year of Linux on the desktop will never come. The normals don’t like it.
@ForthIsNotDead that's a pretty intolerant view you have there.
Aspergers is a mental condition and is not something you can choose to do or not do. It's like any other disability except it is not visible. What you are suggesting is akin to demanding those lazy wheelchair users just get up and walk. They aren't lazy, they just can't walk.
If he truly does have Aspergers then I'll be surprised if what he's trying to do will make a real long-term difference. The issue isn't just acting differently; it's perceiving the world differently. As an Aspie myself (I have a diagnosis) I've been in many situations where people I trust are telling me I'm in the wrong, yet when I review all the facts I still don't see an issue. Linus is likely the same, as he said in the note that he's sometimes gone for years without reading the situation the way others do. His reactions are perfectly reasonable for his perception of the situation, and that's the part that's almost impossible to fix.
He will ALWAYS see these people as incompetent morons. Some of them probably are. Others just aren't communicating in a way that an Aspie can understand.
"If you spoke to people at work the way he does you'd get fired."
Not if you have a diagnosis, that would be illegal in most countries as disability discrimination. More likely your company would have to help you in day to day life, for instance by providing a mentor to discuss reactions before responding. We can recognise inflammatory wording, so it's possible to train ourselves to get a third party perspective before hitting send. Of course, just because we can spot it, doesn't mean we think we need another opinion - another issue!
"You're fighting the largest company in the world with that."
Walmart? Why would they stop you improving Windows?
Read it yourself..."in the latest Windows Insider builds"
I don't think it's unreasonable in preview software for the vendor to expect you to fully test all features and ask you to use their programs. If this is in post release then we can rightly kick up some fuss and I'll be front of the queue.
I know, not a popular viewpoint; downvote away.
"At the OS level they do not have any meaning, they way they do in DOS and Windows."
DOS doesn't use extensions either; it works identically to Linux here, in that you need to launch a program to use a file. On Windows this works the same way as on Linux: it'll joyfully launch the program and try to open the file, then fail if the program is incapable. I'm not sure what you all think is so baked into Windows around file extensions, but there's nothing there other than a list of extensions and default programs, just like in Linux DEs. These exist purely for convenience when you double-click a file. The only exception is that DOS and Windows will only try to execute files with a few extensions such as EXE and BAT, while Linux will run anything that has the executable bit set.
Also, no need to get personal and start name calling. You may be a Linux developer, but that's no excuse to act like you're Linus!
@jake that's horseshit; every Linux distro I've used has had a config to recognise file extensions and launch apps accordingly, and my first Linux was installed from floppy disk. The difference is that Linux doesn't use a file extension to determine executability. It does use extensions to determine how to use files though, as do all modern operating systems. Double click an image file with html as an extension and see what happens next. Does your OS launch GIMP or Chrome?
Could be worse, on Linux you could have a .txt or .jpg extension which the OS will happily execute as a script based on permissions. Extensions aren't a good security feature on any OS, but if you want to show them it's blindingly easy in Windows Explorer.
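The executable-bit point is easy to demonstrate in a shell: give a "text" file the x bit and Linux runs it regardless of the extension (the filename here is made up for the demo):

```shell
# Linux decides executability from the permission bit, not the extension
printf '#!/bin/sh\necho hello from a .txt file\n' > notes.txt
chmod +x notes.txt
./notes.txt   # prints: hello from a .txt file
```

The same file renamed to `notes.jpg` would run just as happily, which is exactly why extensions make a poor security boundary on any OS.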
Nope, this is fully documented. Azure AD has always been a global service where data is not guaranteed to be in region.
Also, there are disaster recovery procedures in place which would have been used if recovery were not underway, which it was. What I think you mean is business continuity, and I agree it's disappointing that AAD isn't designed for availability across regions.
@Dvd if there is room for screws you have a shitty big old-fashioned laptop. Not a fair comparison. Find another ultra-slim or convertible that lets you replace components and maybe there's some discussion to be had, but the Surface range hits each niche perfectly well and MS replaces hardware with issues for free, so you don't need to repair it. iFixit have a vested interest in bashing this stuff, and getting nerds all worked up about lack of screws is their business model. My department of 30 all had new Surface Book or Pro a year ago - zero issues. I don't even know anyone who has had issues and the business has thousands of these. Personally I swapped my Macbook Pro for a Surface Laptop and it's the best move I've made. Better hardware and better software, and the blue laptop even trumps Apple for style.
No, no, no, no NO! US government funding for CVE will remove any and all trust I have in the system. Instead, why not require commercial software companies to pay for the system? It's their bugs being reported anyway. The US Gov is too eager to have access to backdoors and exploits already, and this kind of funding would allow them a way to retain exclusive access to these for longer while the rest of us are spied on. Not OK.
Yes it's well paid for picking up and disposing of crap and needles. What they fail to mention is phase two when you'll have to start disposing of the bodies...
Write what you like in the terms, US lawyers; some of us live in free countries where your contract isn't worth wiping an arse with anyway. The only thing that bothers me with EULAs and similar is that I lose 0.2 seconds clicking "I agree" while laughing and having not read your unenforceable bullshit.
Time for a fork then. In a year nobody will remember Redis and AWS/Azure will switch to a forked version which will become the new default. Fighting Amazon isn't generally a worthwhile endeavor, and closing source that people (probably including Amazon and MS employees) have contributed is just a dick move.
Ah, developer misunderstandings, my favourite joke:
"Could you please go shopping for me and buy one carton of milk, and if they have eggs, get 6?"
"Reportedly MS have done MSSQL on Linux by doing a Windows kernel interface shim for Linux. They can put win32.dll and everything else on that, and software that uses these DLLs has no idea that there's no NT kernel underneath. So with that installed, a Linux can now also be a Windows too. At least to some extent."
That's not how SQL Server on Linux works. SQL Server was already very agnostic to OS and didn't need the Windows kernel. The "shim" only includes a couple of calls which weren't already inside SQL Server to make it compatible. It's actually very good engineering when you look closely at it.
"how about their NTFS implementation"
Why do you want the source to that? MS has been actively contributing to Samba for about a decade. Opening up closed source takes far more resources than you think, with many, many lawyers getting involved too. Contributing to the OSS projects seems an easier solution all around.
"go cloud or go away tactic Microsoft has been dancing round for a while"
In what way? MS still offer all of their on prem software and have not adversely changed the pricing model as far as I can tell. Naturally all of the marketing is going into the new stuff, but on prem is still actively developed and released on a slightly slower cadence than the cloud (at customers request!). I'm not trying to big them up here but I'd be curious to know how you think MS are doing this?
Lots of western organisations use Alibaba. They do this because they need to have a presence in China and the choices in China have been Alibaba and Azure, with Azure run by a local company in that region. I believe AWS are in the process of following suit.
If you want to be truly global, you can't not address China. If you address China you have to do so locally and on their terms. That means inside the "great firewall", on WeChat, run by a Chinese company following Chinese rules and regulations.
It's not shoving your data in to China, it's keeping Chinese data in China. Not massively different to GDPR just a bit more controlling. If I were China I'd want to keep data out of the US too.
You have to wonder too, how much of your distrust of Chinese companies is down to western propaganda (I'm not saying China have no issues, but come on, look at the US!). Read real security reviews of Aliexpress for instance and it comes out better than Amazon in terms of trust and security when compared fairly and objectively. It's also 1000x cheaper and sells knock off goods, but the security and trust is there!
Quite a few very large customers doing core business on Azure and more being added all the time https://customers.microsoft.com/en-us
What does this achieve? All they've done is show how laughably easy it is to get pictures of women without bikinis on*. From experience I know that it's even easier to get pictures of women with bikinis on, so why bother burning CPU cycles on this? If people want either kind of picture they will just go and get them; all we have here is something that creates twice as many images as there used to be (three times, looking at the sample!), thus making it even easier to find pictures of women. Producing more porn, however soft, is not a solution to porn. I'm ignoring the fact that porn doesn't even need a solution; right now these people don't even understand how pointless their approach is.
*interestingly, this shows how poor Google are at AI and ML. Type "women without bikinis on" into their search engine and it gives you nothing but women with bikinis on (safesearch was off).
Even more interesting is that for a laugh I asked Bing the same question, given its reputation for finding nothing useful. After turning off the safe search which gave me a warning Bing happily showed me a massive page full of boobs and more. I may have to give Bing one more shot ;)
Not sure they count as moguls but Risuale is quite tasty from Risual!
"It's actually £600 which is quite a way off ~£1000 "
The Ti one is nearly £1000. The point was that people are happily paying this over and above an Apple watch because the Garmin is a useful device that has a purpose. Garmin have sold Fenix watches in the millions, and many other models in large numbers too. That's not a failed category of device.
"Unless - or until - a wearable device provides a unique feature"
For instance, tracking activities while not carrying a phone? Or mapping without a phone? Or music while running, without a phone? Scuba diving (Garmin Descent), without a phone?
Your phone isn't a part of you, you will survive leaving it at home. Without it, many activities become considerably more pleasant, especially given the bulk of modern phones.
That would be a Garmin. Ticks all your boxes.
+1 Garmin have just released a Fenix 5 Plus costing ~£1000 and even that is selling well. After Apple Garmin is probably the largest install base of smart watches, and they aren't dumbed down smart watches either so clearly there is a market here or the little guy couldn't be charging 3x what Apple are. Also my Fenix 5 battery lasts for weeks!
You probably would if you wanted to run an equivalent number of VMs. Also, Stack isn't the only way MS does hybrid. Also that's not the minimum spend on Stack.
@Blockchain Commentard you make a good point in a legacy network. In cloud world though, ports don't need to be locked down since they only access a public network anyway. Services are secured at the service, so all this cloak-and-dagger security becomes unnecessary. If your main security requires keeping people off the subnet you're probably already compromised. Proper authentication, encryption etc. is more than enough for normal use-cases, and for abnormal use-cases port locking is laughably ineffective so doesn't really contribute. Most devices have their MAC printed on them, and most NICs can spoof a MAC address - can you see the problem here? Even if the MAC isn't printed on, all you'd need to do would be to power up the device and plug it into your own switch - you're on the network with a spoofed MAC in seconds!
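On how trivial MAC spoofing is: the only structure a spoofed address needs is the locally-administered bit set and the multicast bit clear in the first octet. A quick generator (function name is my own, purely illustrative):

```python
import random

def random_laa_mac() -> str:
    """Build a random locally-administered unicast MAC address,
    the kind a NIC can be told to present instead of its burned-in one."""
    # first octet: set bit 1 (locally administered), clear bit 0 (unicast)
    first = (random.randrange(256) & 0b11111100) | 0b00000010
    rest = [random.randrange(256) for _ in range(5)]
    return ":".join(f"{b:02x}" for b in [first] + rest)

print(random_laa_mac())  # e.g. 02:9f:1c:4a:77:d3
```

Feeding an address like this to `ip link set dev <nic> address ...` on Linux is all an attacker needs, which is why MAC-based port locking buys so little.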
The problem they may face is that the question isn't whether they can do the same as AWS it's why someone may choose to also implement Cisco in an AWS environment. If they achieve the same or similar things then why pay another vendor and complicate support operations? AWS or Azure will be a given once cloud adoption begins to build, but traditional network and security vendors are very much optional. Cisco has a large and loyal following from people who got their CCNA and made a career from it, but will that loyalty continue in a world moving much faster than Cisco have ever managed? My feeling is that we'll see one generation (refresh cycle) where loyalty wins and the very next cycle will see everyone realise they are paying for something they already have. We're already seeing this with firewall vendors in the cloud where people realise they can block ports natively and weren't using the "advanced security features" they'd been paying for. I like Cisco, but I'm not sure they'll win against a cloud giant.
You paid £2k for a laptop and your LAPTOP VENDOR chose to put an inappropriate OS on which limits your use? How is that Microsoft's problem? My £1k Surface Laptop has Pro included, so maybe you need to rethink your laptop vendor of choice next time.
@AC I must have missed the part of the article where they said Visual Studio no longer supports writing and compiling code!
This doesn't hide things behind the scenes, it automates things for you. You still write and debug code, all it does is place some components in a Docker container for debugging and then strips them out when you set it to Release.
It's really not scary at all, and having seen a demo at developerdeveloperdeveloper recently I'm surprised at the issues el Reg had, but as mentioned it's all preview so subject to troubles occasionally.
Are people still using Xen? I only saw the AWS thing today and they were one of the main backers. Just interested to see a quick poll of Reg readers using it in the enterprise, I'm not saying don't use it :)
I can't tell if this is marketing for a phone or a series of Silicon Valley. This was the plot last series and they seem to be trying to make it into a product?
“My experience allows me to solve problems in minutes that take my millennial colleagues all day. ”
Agreed, but you solve it in the same ways you always have so realistically we could automate that. Inexperience brings fresh ideas which are at least as valuable as experience if not more so now we’re architecting for cloud which when done properly is completely different to standard architectures.
The number of experienced network people I see “solving” the network design by making one huge network on Azure is astonishing. Not one of them considers individual networks with public endpoints for each service, yet the inexperienced often consider that first.
I think it’s a useful thread as a lot of people misunderstand the pay gap stuff and this is often the cause. Kudos for owning up though :)
"The median woman's salary is (naturally) £25,000, the median men's salary is £35,000 (20,000+50,000)/2."
@Peter I'm afraid we'll have to give you an F for this comment. The median (the middle value) in your example for the men's salary is, in fact, £20,000. This is entirely the point of the exercise: to show that more women are in lower-paid jobs. To look at it another way, the jobs women traditionally do have been assigned lower wages by the people in the jobs men traditionally do.
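The mean-versus-median confusion is easy to see with hypothetical numbers (these salaries are my own illustration, not Peter's exact example): when most people sit on the lower wage, the median stays there while the mean gets dragged up by the high earners.

```python
from statistics import mean, median

# Hypothetical salaries: three staff on £20,000, one on £50,000
salaries = [20_000, 20_000, 20_000, 50_000]
print(mean(salaries))    # 27500.0, pulled up by the one high earner
print(median(salaries))  # 20000.0, the middle value, i.e. the typical salary
```

Averaging the lowest and highest values, as in the quoted comment, computes neither of these.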
"In my time at the coal-face of IBM, I saw maternity leave have an effect quite a few times"
Because at IBM those were legacy skills where experience did make a difference. Elsewhere in 2018 people are doing new and interesting things where experience counts for very little. In fact, I see more poor architecture from people with infrastructure experience in the cloud than those with none. This is because those with no experience aren't trying to use the wrong tools for the job at hand.
"@lusty Am I wrong or are you saying that seniority, loyalty and experience shouldn't be rewarded in companies"
It's not just me saying it, some of it is the law! Also, most people think this way when they actually think about it and stop assuming that the old system is right. And yes, I do believe that being somewhere for a long time doesn't make you automatically better than someone else. Seniority is about position, and that position should be awarded on merit. Loyalty is earned by the company, and rewarding it (aka bribing people to stay) is ridiculous because it necessarily means that, if your job hasn't changed, they used to underpay you for your talents! As I said, if a 22 year old is doing the same job, at the same or better level as a 45 year old, then why wouldn't you pay them the same money? There are a lot of technologies around right now that NOBODY has experience in...
Next you'll be telling us older people need the money because they have families! lol
@Peter Gathercole your assumption that higher pay should come with time served is nonsense. EQUAL PAY FOR EQUAL WORK. Age discrimination is just as bad as sex discrimination or gender discrimination or even racial discrimination.
As you say, it's a fast moving industry. As a result, the required skills are the ability to update skills, not the skills themselves. This means I could miss a year and still be better than colleagues who aren't as good at upskilling. I've also seen 22 year olds who run circles around 45 year olds - this is in fact becoming more common given the rapid change in skill-sets and a reluctance to move with the times. The phrase "I've been doing this for 20 years!" isn't usually a good thing any more, especially in IT. It's not specifically bad, but if the thing you're doing is server installs and you've not learned cloud then it's certainly not good.
You're also assuming that she would need to "relearn". Also nonsense as any woman in a sufficiently senior position for this to matter would likely keep themselves up to date throughout maternity leave. I manage it while on holiday so I'm quite certain the girls can do it too for as long as needed.
Why do you think the climb to the board takes 20-30 years? Tech in particular has extremely young exec-level staff. Bill Gates certainly hadn't been at it 30 years when he became CEO of Microsoft!
My Wheelmouse Optical 1.1A also still works, and is old enough to have the old MS logo on the back and the button texture has worn down to shiny and smooth.
Got to say my Surface Arc Mouse is a thing of beauty though, and lives in my laptop bag for work.
We’re talking about laptops, mostly. What are you talking about?
Is "Le Reg" the French subsidiary of El Reg?
Biting the hand that feeds IT © 1998–2018