I'd be more concerned about your other APs that are burping if I were you?
Have you lost any family pets in odd circumstances?
1126 publicly visible posts • joined 12 Jun 2009
You buy an Apple, you know what you're getting.
You're getting a device they make hard to repair, but not a device that they will arbitrarily brick if you take it to another repair shop.
This would be like Apple using the GPS in the phone to brick it if it's spent time in a non-Apple repair shop.
I don't mean charge by usage; in broadband the actual cost of shuttling the electrons about is probably negligible (it's 2023, not 2003: traffic shaping and throttling technologies are well developed, so a congested network can still be managed).
I mean build a network, then as demand increases, increase the capacity.
Don't charge 'per usage', but just charge what it costs to build, run and maintain, plus some profit.
Whilst there's no real logic or punishment in one public sector body imposing financial penalties on another, the problem is that without them there's no incentive to improve.
Make people accountable. Not just in the public sector, but in general. The idea is the increased remuneration for these roles reflects the increased responsibility. So ensure that responsibility is enforced. Start actually penalising people for doing a bad job, and holding those responsible accountable for negligence.
The problem is (basically) nepotism though: the ever smaller pool of people in the higher paid jobs acts as a disincentive, as shitting on one of your friends is hard, and one day it may be you.
Making the trust's CEO walk round for a month in a hat carrying a prominent "Mr Bumhead" logo would probably be an effective deterrent.
I like that, and let's be honest, it's as likely to happen as my suggestion. :D
A criminal is still going to check whether it's a bluff, and they can keep an employee hostage until someone who can disable the CCTV does it.
Which criminals? People aren't either criminals or 100% law-abiding, with no middle ground. Violent crime is a whole different ball game to non-violent crime.
This isn't TV... if you're taking hostages you're in a whole different world, not just when it comes to sentencing but to mental attitude. Most criminals want a quick in and out; if the staff can't shut off the CCTV they'll run, not re-enact a scene from 24. They aren't generally highly trained martial artists or ex-soldiers with a cool backstory; they're just as likely not to want a confrontation as the next person.
There seem to have been around 7,400 kidnapping offences in the UK in 2022/23 (the nearest I could find to hostage-taking), versus 250,690 reports of burglary. I would also guess you're much more likely not to bother reporting a burglary than a kidnapping. (Also bear in mind that kidnapping includes things like divorced parents taking their kids from their ex without permission, not just bundling someone into a van.)
The world is a lot safer than it appears through the lens of the media, even now.
For example, I chuckled at a recent story about the recent rise in crime in the UK. The figures were alarming, until it got pointed out they were about the level they'd been at in the 1990s, a time when we used to bang on about how much safer things were than in the 60s and 70s.
How do you explain that, almost always, when CCTV is needed it seems to be scrambled or missing?
It often isn't. You don't hear about the routine cases of a grey blob on a shit camera feed not being recognisable. You hear about the extreme instances where someone has fucked up more, like in this case.
Also, like with anything in life, people are shit at assessing risk versus reward.
"I would like a CCTV system, my insurance company says it's needed"
"Okay, we can do this hi-def 5K system, with night vision for about 80 quid a camera, and you'll need a DVR too"
".... got anything cheaper...?"
"Well, a full HD system with 4 cameras and a DVR is about 170 quid?"
It's nothing to do with some fanciful imagined scenario from late night TV, it's to do with cash.
Plus, in the above scenario, the insurance would still pay out; the cheaper cameras won't help catch the criminals, but the net effect on the person who's been robbed/vandalised or whatever is still pretty much the same.
Yes it 'should' work, but meanwhile back in the real world...
"...that thing sounds complicated so we best not try, it'll not work anyway."
Yup, worked at places like that. Probably best avoided. Over time this attitude distils, until you get a workforce not only averse to risk, but to even risking doing their jobs with any degree of competence, as the warm comfort of gentle monotony is much much easier.
I was going through some personal stuff which meant changing jobs wasn't on the cards. The main frustration about the whole thing was that the one upcoming project that sounded difficult (and therefore interesting) was never, ever started, as it was deemed too much risk... just dangled like some future development carrot... whilst the existing creaking, poor quality, hard to use system just plodded along.
why not just put the goddam error message on the screen in the first place?
Or why not both?
That your system has been hacked in some way. We see those stories here almost every day of the week.
Those hackers could easily hack the QR generator so that the resulting QR code sends you to a Child Pron site or some other NSFW embarrassment?
Are you going to risk going to jail and being on the sex offenders register for life because you followed a QR code?
I... erm... what?
I'm not sure that's how the law works; you don't accidentally follow a link and go straight to paedo jail?
In the unlikely event the above situation ever happens to you, close the browser, then take a picture of the screen as evidence. (Quickly mind, the rozzers are probably already on their way...)
but I still can't quite grasp how a *language* can suddenly become "safe". What makes its compiler (which in itself has to be trusted) so much better?
Time.
Rust was designed in the late 2000s, when computers were running at multi-gigahertz speeds and had gigabytes of memory. This means you can write much more complex compilers, and knowing this you can specify the language with much tighter restrictions than a language designed for the PDP-11 in the 70s.
Remember, in C a prototype exists so the compiler knows what function call to generate without having seen the function itself. Nowadays it would be trivial for the tooling to infer that information; in the 70s, however, requiring the prototype was a sensible choice.
...but that was true back then too.
It's also true of every industry... ever.
I also do think all the basic ideas in computing have in general been done before, most things now are just re-packaging of older ideas.
However, the attitude that anything 'new' is by definition wrong, or that trying to improve what are, after all, the 'tools' of our trade is a bad thing, really does feel counterproductive to me.
I expect Rust would've been invented in the 70s if they'd had the processing power to write compilers for it. A lot of the advances in tooling and languages are due to more powerful machines. You can now realistically have a compiler enforce things that would've been computationally unfeasible in the 70s.
Also... hammers still exist... C/C++ will still exist; getting territorial over programming languages doesn't really help. I used to be very much in the 'if you can't manage to program in C properly you probably shouldn't be programming' camp, then I made one too many accidental buffer overflow mistakes and I grew up.
To last a pair of shoes onto their soles is a highly skilled art, but a machine will do in about 30 seconds what would take you half a day. That doesn't detract from the skill of the hand-finished way, it just gives you another option.
If your tooling allows you to not worry about a whole class of 'problem' then you'll get quicker as a consequence.
You're forgetting that no new stuff can ever be any good in the field of IT. If it hadn't at least been considered by 1973 then it's no use.
All the things you mention are your fault, because you're either not trying hard enough, not paying full attention at all times, or, frankly, just too stupid.
(Okay, sure, you may actually be writing software that people use, who might appreciate you putting in more effort, attention, or simply being cleverer, but what does that matter, it's their fault if they lose their work or are more susceptible to malware, they should've bought software from someone who tried harder).
It's the same with heavy industry really, that peaked around the 1700s, everything since is just 'health and safety bullshit', sure maiming is less frequent, but how will workers learn if they get to keep all of their limbs?
It is mainly driven by high energy costs and taxes
Surely it's mainly high energy costs (as a result of the Ukraine war) and high food costs (as a result of the perfect storm of the war in Ukraine plus the increased costs and reduced access to cheap labour from leaving the EU) that are driving the current round of inflation?
How do 'high taxes' contribute to this?
We have a below OECD average tax burden, although it is split with an above average on personal income and a below average on corporate income and gains. Surely this should reduce personal income relatively, acting as a deflationary pressure?
Which taxes would you reduce?
Reducing the personal tax burden would act to increase inflation (greater spending power). Your other option is corporation tax, and as trickle down economics has been shown to be a fallacy before now, reducing that will only enrich the shareholders, many of whom invest for the long term, so would have little net short term inflationary effect.
You would then either have worse public services, or have to increase personal taxation to pay for it. (which would help inflation, but would also be considered 'higher taxes').
Or have I misunderstood, and you just mean 'taxes in general' contribute to inflation? But then I'm really confused??
Au contraire. It's fascinating that some people can live without it.
I don't understand? Spotify not paying what you consider 'fair' won't make musicians stop writing music, or artists stop, erm... arting?? It never has, and it never will, as generally artistic people express themselves regardless.
You could say it about any profession.
Yes, in general that's how it works isn't it?
Surgeons should be able to support themselves until they fix the patient big time.
That's how it works... yes... Often higher skilled professions will incur a much greater 'no decent income' time at the start too. That's why students work in bars or restaurants.
Warehouse workers should keep themselves afloat until they put their 1000th OLED TV on the shelf without breaking.
That doesn't make sense, but warehouse workers should be paid for their work, yes. That pay will keep them afloat.
It's not anybody's fault bricklayers don't have a viable business model (if there were such a thing as Spotify for Bricklayers).
Most bricklayers lay bricks. That's been a fairly viable business model for a good while now.
Appeal to tradition. Remember that at some point one could have said that to someone saying slavery is wrong. "We always had slaves".
???? And people are being forced into writing music? I'm not sure comparing a badly paid (in general) profession someone enters by choice to human trafficking and enslavement is helpful, or accurate.
Workers are entitled to a minimum wage for their time, at very least if they work at a for profit business.
And any musician can go and work for a 'for profit' business and get paid that minimum wage. However if you work for yourself you need to convince enough people as to the merits of your output, be that a lovely tune or a fine oak table, to pay for it.
The problem isn't really Spotify, it's people.
Throughout history artistic people have been exploited by business people. It's a weird symbiotic relationship that I'm not sure you can ever really fix. How many now famous painters were poor in their lifetime? There'll always be someone with a mythical pot of gold, and there'll always be someone talented but naive to fall for that.
I have an MX Master 3 that's 3 and a half years old, it's had water spilled over it (...and been disassembled and cleaned), the rubberised coating where my thumb rests is worn smooth and the buttons are nice and shiny, but other than that it's going strong.
The battery isn't as good as it initially was, but still holds a charge for a few weeks of heavy daily use.
I'm expecting that'll be the thing that eventually needs replacing, but don't plan on changing the mouse any time in the next few years.
Bought to replace an MX-700 of some 20 years vintage, which was on its 3rd set of rechargeables but no longer actually charged them, even when placed in the charging dock 'just so' as it'd needed for the last few years.
Unfortunately, it does not – a membrane operates behind the scenes, and while that might not bode well for longevity, it does mean that less effort is required to press the keys.
What are you doing with your keyboards people??
I'm not sure I've ever had a membrane keyboard fail. I replaced my last one simply because it had become far too disgusting after 10+ years.
Sure, I replaced it with a very nice gaskety mechanical one, and now typing on a membrane is a much worse experience, but lifetime wise, they should pretty much last.
(I'm a coder, tend to type many hours a day...)
I mean, I'm sure Borkzilla will bury it,
They've just put the code on GitHub, and according to the Eclipse Foundation will subsequently be re-licensing it under the MIT licence.
The current licence forbids you from creating a competing product, which you will be allowed to do once it's re-licensed.
That doesn't seem like burying it?
You remember Microsoft's Monkey Boy ? Ballmer. The one who said Open Source was a cancer.
To be fair to fester, he said the GPL was a cancer, due to its requirement to license code it touches under the GPL.
It was a pejorative term, but the point is valid.
Today, code that's not open source is the rare exception.
Is there any actual evidence for that?
I'd be very surprised if that was true. Most of the software people use isn't running computers, it's running 'things' or 'companies'. Most of that isn't open unless I've really missed something.
I've also worked as a developer for 25+ years now, have never worked on open source software as part of my work. I've worked with open source software, sure, and a large portion of the work I do is built using it, but the code I write isn't, and at work, never has been.
Always be polite to PAs ...
Always be polite... especially to 'prickly' people.
I've worked with a few 'difficult' people over the years, more often than not simply being a little more polite and a little less dismissive makes one hell of a difference. One day, you'll need something they control and then your patience will be rewarded.
At a previous job we had a software repository, controlled in another country. A dev needed something adding, I suggested they email and ask for it. They complained as that particular department never responded and always took ages.
I fired off an email and had the software in the repo in about an hour.
My colleague was genuinely shocked; the only thing I can put it down to is that whenever I had to email this dept and they did something, I would always take 10 seconds to hit 'Reply' and say thank you.
Sometimes people forget, especially software people, that work is as much about programming the people as it is the machines.
Or do you mean his personal stuff? Maybe he doesn't want to?
I would also question the author's assertion that FFmpeg is used by the streaming services. They may well use it in some areas, but only when they can't use hardware compression, which depending on OS and hardware isn't always available.
Isn't that how they use it?
It seems to decode/encode pretty much anything, and has good hardware support, so chuck it at FFmpeg and let it decide to use hardware if available? It may not be doing the actual lift and shift in many cases, but I bet it's the 'director'. :)
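For what it's worth, that "let FFmpeg decide" idea can be sketched from the command line. This is illustrative only: the filenames are placeholders, and h264_nvenc (NVIDIA's GPU encoder) is just one example of a hardware encoder a given build might expose.

```shell
# Sketch only: ask this ffmpeg build what encoders it has, then prefer
# a hardware one if listed. If ffmpeg isn't installed or has no NVENC
# support, we fall back to the software libx264 encoder.
pick_encoder() {
    if ffmpeg -hide_banner -encoders 2>/dev/null | grep -q h264_nvenc; then
        echo h264_nvenc       # GPU encode available
    else
        echo libx264          # software fallback
    fi
}

ENC=$(pick_encoder)
# "-hwaccel auto" additionally lets ffmpeg pick a hardware *decoder* if
# one is present, so both directions can be offloaded when possible.
echo "would run: ffmpeg -hwaccel auto -i in.mp4 -c:v ${ENC} out.mp4"
```

In practice the streaming services' pipelines are far more elaborate, but this is the shape of the decision FFmpeg lets a caller delegate.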
Sorry, I was replying to the comment 'fingerprint works <25% of time', not the article in general, just pointing out that the leading fingerprint sensor became vendor specific, and that is why the Apple stuff tends to be better. :)
Lets be fair, realistically, fingerprints aren't that secure if you're an 'important target'.
If I've nicked your device I've more than likely nicked a device covered in the fingerprints I need to unlock it.
Also, as this thread illustrates, consumer grade FP readers tend towards convenience rather than overall accuracy. I've worked with industrial applications of the technology for building access (slightly different, as you're trying to identify someone against a database of hundreds or thousands of fingerprints rather than verifying that a print belongs to one person), and for the false positive rate not to be shocking it often takes a few goes to get a successful read.
To be fair, Apple did also buy the leading fingerprint sensor manufacturer. The TouchID stuff was at the time streets ahead of the competition.
But if you're on ZFS you might as well just use snapshots.
Then zfs send the fs elsewhere periodically for redundancy.
This is what I do at home: rotating hourly/daily/monthly snapshots, then some Heath Robinson-esque scripts to send it elsewhere every hour.
The remote stuff is a bit fragile, and I assume there's paid-for tooling that's a bit better than my shonky shell scripts, but it does mean I can basically restore my home server's root FS from an off-site backup if needed.
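The rotation-plus-send idea could be sketched roughly like this. To be clear, these are not my actual scripts: the dataset, host, and snapshot names ("tank/home", "backuphost", "backup/home") are invented for illustration, and DRY_RUN=1 (the default) prints the zfs commands rather than executing them, so the flow can be followed without a real pool.

```shell
#!/bin/sh
# Minimal sketch: hourly snapshot plus incremental off-site zfs send.
# All names are illustrative. DRY_RUN=1 (default) echoes the commands.

DATASET=${DATASET:-tank/home}
REMOTE=${REMOTE:-backuphost}
DRY_RUN=${DRY_RUN:-1}

run() { if [ "$DRY_RUN" = 1 ]; then echo "$*"; else sh -c "$*"; fi; }

STAMP=$(date +%Y-%m-%d-%H)
NEW="${DATASET}@hourly-${STAMP}"
PREV="${DATASET}@hourly-last"   # assumed: newest snapshot the remote already has

# Take the hourly snapshot, then ship only the delta since PREV;
# "zfs send -i" sends an incremental stream, "receive -F" applies it.
run "zfs snapshot ${NEW}"
run "zfs send -i ${PREV} ${NEW} | ssh ${REMOTE} zfs receive -F backup/home"
```

The fragile part in real life is tracking which snapshot the remote actually has; bookmarks (`zfs bookmark`) help, as the incremental base then doesn't need to be kept locally as a full snapshot.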
The Boeing 737 MAX flew thousands of times without crashing due to flawed sensor data and a recalcitrant MCAS system. . . until it did. . . so a "success rate" is somewhat irrelevant.
...
I certainly hope that there's nothing overlooked and that the SpaceX human flights continue safely, but given the safety track record of some of Mr Musk's enterprises, well documented in this very journal... well, I'll just have to continue to hope, I guess.
So 'success rate' doesn't matter, but 'safety track record' does?
How does one measure the second without the first?
What other metrics do you suggest, as it seems like 'not liking musk' is the one you're using?
Put it another way: if I was going to put myself or something else into space, would I choose the rocket company that has more successful launches than all other rocket companies (state or private) combined, or does that not matter because... Elon?
My side comment wrt Mr Musk still stands. That's a fact. Whether it's salient, well, I suppose that depends on where you're sitting. . . on the business end of a rocket or on the sidelines.
It is a fact, and so long as you take it in the context that Blue Origin and Virgin both offer low altitude space tourism flights, while SpaceX can take you into orbit and dock with the ISS, then that's fine; but to imply that all 3 crew vehicles are in the same ballpark is disingenuous.
Also, I might point out that New Shepard is currently grounded due to a booster failure on its last flight, but that doesn't seem to matter because it's not Elon?? Or was it not the 'business end' that failed? ;) Personally, I'll trust a rocket that's flown well over 200 times without issue over one that has flown 30 times, the last of which ended in failure.
Looking at the history of, for instance, the Boeing 737 MAX, I suspect that the "losing government control to private entities" ship sailed quite a while ago.
Per Wikipedia "During the certification process, the FAA delegated many evaluations to Boeing, allowing the manufacturer to review their own product. It was widely reported that Boeing pushed to expedite approval of the 737 MAX to compete with the Airbus A320neo, which hit the market nine months ahead of Boeing's model."
You could also look at the history of SpaceX: the Falcon 9 currently has a 99.3% success rate over its 281 missions.
The current 'Block 5' variant makes up 221 of those, with a 100% success rate.
How much such "self-certification" is going on at SpaceX is at this point probably anyone's guess.
One would hope the FAA has learnt from the Boeing fiasco?
There may be a point to ponder that while Jeff Bezos and Richard Branson have both flown in their rocket contraptions, Elon Musk hasn't flown in any of his.
Other people have though, to orbit.
Just sayin'.
SpaceX is, statistically, by far the most experienced and successful rocket company out there.
Elon Musk is a narcissistic dick.
Both these things can be true. :)
Of course, if you remember the ZX81, you also remember systems with just 1K.
I do, although as I got a second hand ZX81 in the Spectrum/C64/Amiga era it also came with a Memotech Memopak 64K. This meant wobbly RAM upgrades were just a thing of legend.
It also came with a good stack of old Sinclair Users too, I used to covet the various keyboard expansion options available.
... It sometimes consumes up to 40GB for just the filesystem cache.
But that's because you have 40G of free memory. Any decent OS should use all available spare memory (and possibly more) for its filesystem cache; that doesn't mean it's needed.
(My personal server has gone from 96GB to 32GB, and uses much less memory for a FS cache as a consequence... I've not noticed the difference...)
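On Linux, at least, you can see this distinction directly in /proc/meminfo: MemFree looks alarmingly small on a busy box, while MemAvailable includes the cache the kernel will happily give back on demand.

```shell
# Cache counts as "used" memory, but it's reclaimable, which is why
# MemAvailable (not MemFree) is the number that actually matters.
grep -E '^(MemTotal|MemFree|MemAvailable|Cached):' /proc/meminfo
```

BSDs report the equivalent split through sysctl/top rather than /proc, but the principle is the same: unused RAM is wasted RAM.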
Why does every purchasing decision have to be made solely on price and perceived value?
Why isn't it okay to buy something simply because you like it?
Why are people who make different choices from you 'idiots'?
I'm not badly paid, and I can't afford a lot of the high priced apple stuff, so the idiots must be doing something right.
How have we got to the point where 8GB is "usable for basic browsing / word-processing etc."?
The boring answer... media.
That's the one thing that was, is and always will be 'large'.
I can word process a nice colourful document in as much memory as I could in the 90s, but once I start to add media to it, then the memory usage rises.
Pretty pixels cost memory. (As pointed out above somewhere, a 4K frame is around 30 MB... back in the 90s I had 2MB of video memory...)
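The back-of-envelope sum behind that 4K figure, assuming an uncompressed frame at 4 bytes per pixel (RGBA):

```shell
# 3840 x 2160 pixels x 4 bytes/pixel (RGBA)
echo $(( 3840 * 2160 * 4 ))             # bytes per frame -> 33177600
echo $(( 3840 * 2160 * 4 / 1048576 ))   # in MiB (integer) -> 31
```

So a single uncompressed 4K frame is around 32 MB, i.e. sixteen times my entire 90s video memory.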
There's a lot to be said for fat binaries, especially in the Linux world.
A few years ago I had a (Windows) Qt app I wanted to distribute cross-platform. This was in the days before Snap et al.
Windows and Mac were fine, NSIS and macdeployqt respectively. However, for Linux (which I didn't really use) I concluded that a 'fat' binary was the way to go.
I didn't wish to spend time learning the various package managers for OSs I didn't actually use, so just ended up statically compiling the binary and shipping that, along with the object files and instructions needed to re-compile against a modified Qt for LGPL compliance.
(This was a hobby freeware app just to be clear, not a commercial or work thing).
That would probably be one Burrell Smith, another Woz level Apple genius.
FreeBSD is quite different from Linux, and even experienced Linux users will find themselves lost sometimes.
I'd say that was the other way round. FreeBSD is still very much a traditional Unix descendant, whereas GNU/Linux/systemd does seem to be forging its own path, i.e. it's Linux that's becoming increasingly different.
I'm not sure that this is a bad or good thing, it's just a thing. Although as a BSD user I do find my odd forays into Linux land increasingly unfamiliar. (and I still find man pages telling me to go read something in a browser... Grrrrr... )
I also may be old, but I'm still not 100% sure why ifconfig isn't fit for purpose any more; in BSD land it works as well as it ever did. (Or more generally, I sometimes find Linux feels like it changes things because someone feels the need to change them, rather than the new thing being materially better.)