Do you think it'll handle relational databases?
That's not directly possible. DNA by itself does nothing - you need the whole array of cellular components (ribosomes and mitochondria and all that stuff) to go from DNA to an organism.
Basically, all DNA is is a recipe list for proteins. A cookbook cannot become a McDonald's and give everyone heart disease.
There's an outside chance that a bacterium might get in there and read the database like it was its own DNA (maybe? I guess? Seems unlikely, given that bacteria don't spontaneously become other bacteria when they eat 'em, as far as I know) but as already pointed out this stuff will need to be kept in suuuuper clean conditions to stop micro-scale wildlife eating it anyway.
Meet YouTube-linked games-streaming Stadia, yet another thing Google will axe in two years (unless it kills Twitch)
You're not wrong! But neither were they. All the serious calculations were being done with serious, built-for-purpose, statistical simulation software on dedicated hardware.
Excel was mostly just mucking around with summary data being prepped for powerpoint slides, and various hacky fudges that high command wanted but wouldn't budget for them to be built into the models properly.
Last I heard (I left that team a few years back now) they are moving towards a proper end-to-end database solution that eats text files and shits slide decks, but the whole environment is something like 40% compensation for inflexible legacy systems and 20% working around flaws in previous attempts so it's been slow going.
Sometimes it's worthwhile, though.
I was working operations support - not hardware, but various flavours of Excel-derived hell, deep in actuarial hell where they work out how dead you are to five decimal places... for each month for the next 100 years. Lots of people with very good degrees and years of further training in deep statistical alchemy.
I had a ticket from the Annuities team about something *dumb*. Like "When I open the spreadsheet there's a yellow bar at the top and the macros don't work" dumb. Read the message, click the "enable macros" button, carry on. But I don't talk to that team often enough, so it's worth occasionally doing a desk visit just to make sure they know we exist.
When I get there, you can feel the tension in the air. People are visibly stressed, staring at screens full of numbers that refuse to be the right numbers. More than a few are muttering swear words under their breaths. There is a definite sense of a Deadline lurking overhead. So... I start with my ticket, and fix it with a smile and a few words like "anyone can do it, I've done it myself". The next desk over says "can you help with...?", and then so does the desk opposite. I fix a few simple blockers with smiles. Be the second pair of eyes that spots a formula error the author couldn't see after an hour. Mention to one that the same thing happened to that guy over there, ask him how he fixed it. Un-hide a shared folder someone had accidentally Hidden. Easily a dozen small fixes, took maybe an hour, all on the back of a super-stressed maths graduate completely forgetting how "Enable Macros" works, but knowing enough to email IT for help.
You could genuinely feel the mood lift in the room. Days like that make it worth dealing with customers - the machines might be better behaved than people, but even an introvert like me can recognise the value of genuine gratitude.
Re: misunderstood that encryption, hashing and encoding are different things.
It's simple at the core:
Encryption is reversible - meant to be undone. You encrypt things that need to be read by both parties.
Hashing is one way. You hash things that only one party needs to know (such as passwords).
Generally speaking with hashing you hash a new input and see if it matches previous hashes. It's how duplicate images are detected, f'rex: turn the entire jpg file into a 256-bit hash, and it's trivial to find matching items and very unlikely to be coincidence.
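A minimal sketch of that duplicate-detection idea in Python (the function names here are mine, not from any particular product): hash every file once, then any two paths that land on the same digest are almost certainly identical.

```python
# Hypothetical sketch: find duplicate files by hashing their full contents.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so huge files don't need to fit in RAM.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(paths):
    """Group paths by digest; any group with more than one entry is a duplicate set."""
    seen = {}
    for p in paths:
        seen.setdefault(file_digest(p), []).append(p)
    return [group for group in seen.values() if len(group) > 1]
```

Note this only catches byte-identical files - a re-encoded or resized copy of the same picture gets a completely different hash, which is why "similar image" detection uses perceptual hashes instead.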
In overly simplified security context: I type my username and password. Client side those get encrypted and sent to the server so no-one in the middle knows what I typed. The password is then decrypted, hashed using the same algorithm as when my account was set up so that if I typed the same string, it gets the same hash and matches - and the server never stores my unhashed password, and the client can't be reverse engineered into giving up the hash function.
(broadly. There are many subtle variations in how this can be done, naturally)
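The store-the-hash-not-the-password part can be sketched like so - this is a toy illustration of the scheme, not any particular server's code. It uses the stdlib's PBKDF2 with a random salt; real deployments pick a deliberately slow KDF (bcrypt/scrypt/argon2) and tune the work factor.

```python
# Toy sketch of salted password hashing: the server stores (salt, digest),
# never the password itself, and verifies by re-hashing the login attempt.
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor, not a recommendation

def hash_password(password, salt=None):
    """Hash a password with a random salt; store the returned (salt, digest)."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-hash the submitted password with the stored salt and compare safely."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # compare_digest avoids leaking where the comparison failed via timing
    return hmac.compare_digest(candidate, stored_digest)
```

Same string in, same salt, same hash out - which is exactly the property the login flow above relies on.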
On holiday to Japan about five years ago, I decided to get one of those local 4G boxes that create a wifi hotspot for your phone. However, as I had to kill some time at the airport before I found the place to buy said magical box, I thought I'd make use of my UK provider's kind offer of 25MB of data for £15.
I mean, it sounds a bit much, but after a 12-hour, £400 flight you don't care. I figured "it'll be handy. Why would they even offer the service if it's useless?"
Turned the phone on and by the time it'd finished reconnecting to the network and catching up on notifications, 20MB was gone.
...so I switched it back off and went to find lunch.
"And how do you then account for me, the idiotic luser, setting the system date 20 years into the future and then back to the current time? Not exactly your business as to why I've done that, but because I have, your software is now broken and you're in breach of contract."
That's not how 'breach of contract' works, though. Contract law is largely focused on not taking up any court time if at all possible, and strongly encourages the parties to take reasonable steps to put things right.
In this case, where you've done something and the software has stopped working - you contact the vendor, and they should then fix it for you. This is pretty much the same story as home licenses tied to a limited number of activations - they'll almost always reset that count if you just call them up.
It's not "breach of contract" if you're not attempting to take up the remedies already in place.
The biggest uptick in demand for software devs by bosses is for... *rubs eyes* blockchain engineers?!?
Re: WTF is a...
Search Engineers make search boxes work. In a lot of cases this is trivial, but if you've got a lot of data and/or don't want to just use google's code, you do want a specialist to come in and optimise things to fit your environment (especially if there's legacy data involved that you can't just jam indexes into, or security levels where some results mustn't show up at all for the wrong people).
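The core data structure behind most of that work is the inverted index - a toy version (my own illustrative names, nothing from a real engine) maps each term to the set of documents containing it, so a query becomes a set intersection instead of a scan:

```python
# Toy inverted index: real engines (Lucene etc.) add ranking, stemming and
# per-user permission filtering on top of this basic term -> documents map.
from collections import defaultdict

def build_index(docs):
    """Map each lowercase word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for w in words[1:]:
        results &= index.get(w, set())  # AND-semantics: all words must match
    return results
```

The "security levels" point above is why you can't always bolt a generic index on: the filtering has to happen before results reach the wrong user, not after.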
New claim dogs Oracle: After $11m of sales, I was unfairly axed before next big deal – because I am a 64yo woman
I used to manage a "Beowulf cluster" that was actually 80-odd Pentium 4s independently running Windows and some specialist actuarial software, each taking a segment of the customer data we needed to model (using a master-slaves type deal). We had plenty of licenses, and helpfully the license check was only done at startup by the master.
Unhelpfully the licenses were done with hardware dongles and the actuaries (who developed the models and ran local tests to confirm results) needed them too. Over the years, more and more of the dongles had gotten lost (or obsolete - we had a mix of USB and parallel-port dongles) and since the cluster only needed dongles for the "masters" we gave some of them out and slowly reduced the pool of potential master machines. All the ones left were using the older style dongles screwed into the back of the machines.
Then, one day, we had an upgrade, and every machine needed to relicense. Not wanting to get behind the racks, I managed to score a few spare USB dongles someone had hidden in their desk "because it's so hard to find them when we need them", and spent most of a morning plugging them into four machines at a time to do the startup processes.
Apple: You can't sue us for slowing down your iPhones because you, er, invited us into, uh, your home... we can explain
Re: Well that was an invisible problem
I was once working in an office that was moving from having all the servers on-premise to having them all in a nice IBM blue barn somewhere on the other side of England.
One day we came in and none of the IBM bods working on the migration could log in. They'd accidentally migrated the servers that all the wireless connections ran through, so they got 120 miles of round-trip lag on everything from their shiny laptops to their home servers, and their logins had to fight through the same bottleneck at the end of it.
Re: Not impressed
Isn't Firefox 52 ESR out of support with Mozilla? I thought (and correct me if I'm wrong) they dropped that back in August last year.
I think you may have run into Microsoft's other hobby: trying to get people to keep their browsers on the latest versions for security reasons. Somewhere along the line, MS worked out that they get a lot of flak for all the virii in their ecosystem, so they've sunk resources into fighting that from as many angles as they can.
Re: A file so large...
"Sure the address space could be higher, but no one will ever need two freaking gigs for emails: it's basically plain text, FFS."
I think my personal favourite example of "no-one will ever need that much" coding was in Deluxe Paint back on Ye Olde Amiga. If the RAM used by the picture you were working on exceeded 512KB, it threw an out-of-memory warning, because, well... when they wrote it, no-one had more than that, nor did anyone sell more than that, so they set an upper bound on the test.
However, it was just a warning (not an error or a crash), so if you'd gotten one of the 1MB expansions Commodore later introduced, the program would carry on happily after, letting you do all the fancy gradient fills you could want.
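In pseudocode-ish Python, the behaviour described would look something like this - a hypothetical reconstruction of the logic, not the actual Deluxe Paint source:

```python
# Hypothetical reconstruction: the check warns at a hard-coded ceiling but
# deliberately carries on, which is why it kept working once bigger RAM
# expansions appeared - a warning, not an error or a crash.
ASSUMED_MAX_KB = 512  # the "no-one has more than this" constant

def check_image_memory(image_kb):
    """Warn (but don't abort) when an image exceeds the assumed RAM limit."""
    if image_kb > ASSUMED_MAX_KB:
        return "warning: out of memory"  # shown to the user; work continues
    return "ok"
```

Setting the bound as a soft warning rather than a hard stop is what accidentally future-proofed it.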
A while back (2010 or so), and this is probably the single nerdiest thing I've ever done, I used to play EVE Online using a 3G dongle. Only for a short while waiting for proper internet, but, well, it's an MMO. They pay all the bandwidth costs too, so they make them remarkably low-traffic in normal use and I used hardly any data at all.
Anyway, every now and then, I'd get booted because it lost internet connection for a few milliseconds, and the game was incredibly twitchy about that. I eventually worked out it was because I was living near East Croydon - every time the Gatwick Express shot by, hundreds of active mobile calls were being transferred to and from each base station, and they took priority over data connections (in theory, data connections should handle a momentary disconnection).
Anyway, that's how I ended up playing a fully featured MMO on a _2G_ data connection, since that remained stable and largely unused.
I think a large chunk of it is that early games (especially during the Amiga's shareware boom, when one-man shops were almost the norm) were playtested primarily by the developers themselves, and their sheer familiarity with their own project led to the difficulty creeping up.
This created a market full of difficult games, convincing everyone else that the market wanted games to be hard - and as a small child at the time, I was happy to invest hours and hours into becoming good at them.
Then in the late 90s devs started saying things like "I want players to see the end of this game", and larger studios began to seek wider audiences (including older gamers with less free time), so they slowly dropped the difficulty down or, more commonly, introduced variable difficulties so that the players could choose their poison.
I think my favourite example of that was the original X-Com game, where a bug silently reset the game difficulty to easy on first save. A bug that was discovered by a fan porting the game over a _decade_ later, by which point the famously super-hard sequel "Terror From the Deep" had already been made based on feedback that the original was "too easy".
Re: Seems a bit odd
Looks like classic Menu pricing to me. If you can sell only at £30, you can't get any of the cash from people with a tenner spare.
This way, they get access to more customers, with an upgrade option for people who go in cheap for whatever reason who later decide to go with the shinier option.
Silicon Valley CEO thrown in the cooler for three years, ordered to pay back $1.5m for bullsh*tting investors
It's like that famous quote about advertising. "I know 50% of my advertising money is wasted, the trouble is I don't know which half".
Except with middle management, the ones you most want rid of are basically only doing one thing: whatever convinces their superiors that they're the Good Ones. The ones who fail the metrics, who lack 'visibility'... those are either utterly useless (which does happen!) or too busy doing important stuff to make sure they look like they're doing important stuff.
Re: RF Jammer?
Collateral damage, as it were. Airports use a lot of radio across a lot of spectra, so it's hard to jam all possible drone frequencies and still be able to fly planes.
Plus there's a chance it's pre-programmed and only needs GPS, and you really can't fly dozens of airliners if GPS is being jammed.
Re: Rather a lot of unanswered questions there
Our current understanding of low-density planets is that they're a *lot* of gas around a smallish rocky core, though at Jupiter-ish sizes there's likely some crazy pressure-related shit going on with metallic liquid hydrogen. This one isn't anywhere near Jupiter scale so it's probably just rocks.
So it's losing all the gas and as the article states it'll eventually end up with just the rocky core.
Re: Explains the riddle...?
I *think* what they mean is that this is the first old galaxy we've been able to measure in this way (because of the angle), and previous results suggesting low levels of dark matter in old galaxies were using less precise methods (at a guess that'd be something like gravitational lensing, the rotation at the outer edges only, or trying to infer the effects on nearby galaxies).