Re: Time to find another solution
I've been looking into this, and I think this is the best solution I've seen yet.
That's fine if you have a static IP to point to. I've had my own hostnames for many years now, but aside from my sites intended for public use (which are properly hosted, have dedicated IPs, and for which I don't use Dyn), I don't.
I've been looking at NoIP, but I'm also considering setting up my own dynamic DNS service to run from one of my hosted installations. That way I can be sure that my service won't be sold off and that it's as secure as possible.
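The core of a self-hosted dynamic DNS service is actually quite small. Here's a minimal sketch of the update endpoint; the hostname, port, and record format are placeholder choices of mine, and a real deployment would need authentication plus a zone rewrite and nameserver reload where noted:

```python
# Sketch of a self-hosted dynamic-DNS update endpoint. Hostname,
# port, and TTL below are illustrative placeholders, not a real API.
from http.server import BaseHTTPRequestHandler, HTTPServer

def a_record(hostname, ip, ttl=60):
    """Render a BIND-style A record line for the zone file."""
    return f"{hostname}. {ttl} IN A {ip}"

class UpdateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The client's apparent source address is its new dynamic IP.
        ip = self.client_address[0]
        record = a_record("home.example.net", ip)
        # A real service would authenticate the caller, rewrite the
        # zone file with `record`, and reload the nameserver here.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(record.encode())

# To run the endpoint (blocks):
# HTTPServer(("", 8053), UpdateHandler).serve_forever()
```

The home router then just needs a cron job or hook that fetches that URL whenever its WAN address changes, same as it would for NoIP.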
"Unless, of course, Microsoft admits defeat and drops the whole six month cycle"
I'm hoping that all software, not just Microsoft's, drops this insane idea soon. Rapid release has been borderline disastrous, leading to decreased software quality, greater pain for users, and (worst of all) reduced uptake of the actually important security updates, as people increasingly block updating across the board.
I hope that companies go back to something more resembling the old way. At the very least, there needs to be a clear differentiation between security/bug fixes and feature changes, with the feature updates being optional.
"So, in short, Canonical will try to keep some older legacy i386 software running in some shape or form in the future, but it really, really doesn't want to, and thus you're better off just running Ubuntu on 64-bit x86, or one of its other supported CPU architectures."
A better solution is to switch to a more reasonable distro.
"Today our society is practically based on the Internet"
Not entirely, yet, fortunately. Personally, I literally can't think of a single thing that I need the internet to accomplish. The internet is more convenient, but I can still do every critical function I need the old-fashioned way if I choose to.
"This sounds exactly like something software engineers come up with, then stare blankly when told no one outside their orbit will be able to use it."
That's neither true nor fair. I'm a software engineer, and there's no way in hell I'd be willing to go through that nonsense either. Nor would I ever have been OK with implementing that scheme. I'd die of embarrassment first.
"stacking boxes on the kitchen counter and using them to raid the biscuit tin."
My eldest is in his late 20s now, and I only recently heard this story from when he was a toddler. A good friend was staying with us for a while, and while I was at work, he walked into the kitchen to find my kid had stacked up boxes and chairs, and managed to climb on top of the refrigerator to get access to the cookie jar in the cupboard above it. My kid looked down on my friend from his perch, munching cookies, and with an evil glint in his eyes said sternly "you WON'T tell dad."
Never underestimate a toddler.
"or add some amusement (I hate boring presentations)"
In my experience, there is no slide so amusing that it can make a boring presentation not-boring. At best, it can only distract you from the boring presentation for a brief moment.
"which means that slides usually only contain 10 words at most"
The only words that should appear on any slide are labels for the data visualization being presented. If a slide is neither presenting a data visualization nor purely decorative, it shouldn't exist.
> There is nothing worse than having read the entire slide, getting the point, but being held to that wait-for-the-clicker moment to read the next one while people waffle on.
This, so very much. Also, it drives home the fact that these presentations almost always take an hour to say what it would take a normal human being about 10 minutes to say.
"Not if the individuals pay in cash"
True, which is why I pay with cash about 99% of the time.
"But honestly, you can't blame the merchant for recording the fact that you bought something, nor can you blame the city for recording the fact that you used its transport system."
"Blame" is too strong, but I can, and do, strongly disapprove of them doing that for any purpose beyond payment processing.
"What I mind very much is somebody going all Big Data on the two different data sets"
We are in complete agreement here.
To me, this sounds like a "better than nothing" approach to the underlying problem.
"The city's rider data set and the point-of-sale data set from merchants can be processed using Private Join and Compute in a way that allows the city to determine the total number of train riders who made a purchase at a local store without revealing any identifying information."
In my view, the real problem is that the merchants can identify which individuals purchased what, and the city can identify which individuals traveled where and when. Sure, it's great to give them the chance to compare notes in a way that is a bit less invasive, but the privacy incursion has already occurred before that happens.
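For what it's worth, the primitive underneath this can be illustrated with commutative blinding: each party raises hashed identifiers to its own secret exponent, so shared items collide after both exponents are applied, revealing only the overlap count. This is a toy sketch of the general idea (tiny modulus, made-up names), not Google's actual protocol or secure parameters:

```python
# Toy sketch of private set intersection cardinality via commutative
# exponentiation. The modulus and names are illustrative only; this
# is NOT a secure or faithful implementation of Private Join and Compute.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; far too small for real use

def h(item):
    """Hash an identifier into the group."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def blind(items, key):
    """Raise each hashed item to a secret exponent mod P."""
    return {pow(h(x), key, P) for x in items}

def reblind(values, key):
    """Apply a second secret exponent; exponentiation commutes."""
    return {pow(v, key, P) for v in values}

# Each party picks a private exponent it never shares.
city_key = secrets.randbelow(P - 2) + 1
merchant_key = secrets.randbelow(P - 2) + 1

riders = {"alice", "bob", "carol"}   # city's data set
buyers = {"bob", "carol", "dave"}    # merchant's data set

# Each side blinds its own set, exchanges it, and re-blinds the
# other's. Since h(x)^(ab) == h(x)^(ba), shared identifiers collide.
both = reblind(blind(riders, city_key), merchant_key) \
     & reblind(blind(buyers, merchant_key), city_key)
shared = len(both)
print(shared)
```

Each side learns only the overlap count (here, 2), not which blinded values map to which people. But as I said: both raw data sets already existed before the protocol ran.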
"protocols even a MITMing firewall may not be able to detect without constant updates and analysis."
This can be overcome by using deep packet inspection. I haven't gone that far, but if people start engaging in this sort of activity, I'll have to decide between setting up a DPI system or ceasing to use the web entirely.
The usual way -- glossing over technicalities, I have the cert for my proxy installed in everything that needs to use HTTPS. All HTTPS traffic gets routed to the proxy, and an HTTPS connection is established between the client and the proxy using that cert. The proxy establishes an HTTPS connection to the real destination, using the appropriate cert for that (just like a browser would do). At that point, the proxy is just relaying the datastream between the client and the real destination and has complete access to the decrypted datastream without allowing any unencrypted traffic over the network.
The downside of this is that you can't make any HTTPS connections until you have installed the proxy's cert. But it's a tiny downside, as installing the cert is simple.
"Any strategy that depends on blocking a connection is doomed to fail."
True, if that's the only thing you're doing. It is a very valuable piece of a larger security stance, though.
"As is one that tries to snoop the content to determine if it is ad related, since like everyone else they will use HTTPS for everything in the future."
That problem is why I've set up a man-in-the-middle proxy specifically to retain visibility into my data streams.
"A surprising statistic is that "like last year, about 30 per cent of developers still don't have unit tests in their projects", or so the survey said."
I don't think that's terribly surprising, personally. It's in line with my own observations, and I'm still struggling to get the dev team I work on to start writing unit tests.
I think the main reason is that the quality and productivity benefits of unit tests all arrive late in a project's life. Up front, unit tests look and feel like a time-sucking pain in the ass that makes hitting your deadlines harder. This is made worse by unit-test extremists, who tend to be obnoxious and insist on things that make no practical sense (such as that every line of code must be exercised by a unit test).
'I don't need it so no one else does either'
Except this is accurate. Both FTP and Telnet present fairly serious security risks, and there are more secure substitutes readily available. Even devices that require telnet often don't need such access from the internet at large, and if they do, then it's worth the effort of setting up a relay so the telnet exposure is limited to your LAN.
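As a concrete illustration of that relay idea (the hostnames here are made-up placeholders), an SSH local forward does the job, so the device's telnet port is never exposed beyond the LAN:

```shell
# Forward local port 2323 over SSH to the telnet port (23) of a
# LAN-only device, via a gateway host reachable from outside.
# "gateway.example.net" and "device.lan" are placeholder names.
ssh -N -L 2323:device.lan:23 user@gateway.example.net &

# Then, from the same machine, telnet rides inside the SSH tunnel:
telnet localhost 2323
```

The telnet protocol itself only ever travels over the LAN segment between the gateway and the device; everything crossing the internet is encrypted SSH.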