That should be illegal
After all, it's intentionally manipulating other people's data. It should be punished just as harshly as if I were to go to my bank and add a few extra digits to my account balance.
4850 publicly visible posts • joined 9 Mar 2007
Well, as usual here, the vitriolic comments come from people who don't understand IP and think NA(P)T is a security feature; just ignore them. Yes, IPv6 will have its initial problems, but it's far from the hellhole that IPv4 with carrier-grade NAT or NDN would be.
I mean, just look at ATM, X.25 and ISDN. All of those networks could in principle do everything we ever wanted, but at a much higher cost. IP took off because it's so incredibly cheap and easy to implement.
It also took off because it makes no difference between different services. If I wanted to transmit smells, I wouldn't have to go through ten years of standardisation; I could just transmit the data. And if I did want a standard, I'd just publish my protocols.
Also, there is no concept of "client" or "server" in IP; that lives in TCP instead. Therefore there is no difference between client and server connections to the Internet: every connection is the same. This is what enabled rapid growth and a vibrant culture.
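A minimal sketch of that symmetry in Python: the "server" and "client" ends of a TCP connection are just two sockets using the same API, distinguished only by which one called listen() and which one called connect().

```python
import socket
import threading

def echo_once(listener):
    # The "server" end: accept one connection and echo what arrives.
    conn, _ = listener.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

# Both ends are created with the exact same call.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

t = threading.Thread(target=echo_once, args=(listener,))
t.start()

peer = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # the "client" end
peer.connect(("127.0.0.1", port))
peer.sendall(b"hello")
reply = peer.recv(1024)
t.join()
peer.close()
listener.close()
print(reply)  # b'hello'
```

The client/server roles are purely a convention of who listens and who connects; the packets underneath are symmetrical.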
An "intelligent network" would turn back the clock to online services like Compuserve or AOL, where you have more or less walled gardens.
The only problem facing IP(v6) is ISPs which put their customers' money into ad campaigns instead of upgrading their networks as they are paid to do. This is why some ISPs have congestion, and this is why we now have widespread outages. The Internet is too important to be left to companies that are legally obliged to maximise profit.
I mean let's ignore the prejudices about the weather and the food, because frankly nobody cares if it's raining all day long, in fact that actually reduces the glare from the window on the screen. ;)
The UK is a country where you can end up in prison for not handing over your password when _you_ are the one accused of wrongdoing. The UK even has key escrow. The UK has the highest number of surveillance cameras per capita in the world. The UK has ISPs which have actively messed around with everyone's Internet data (BT and Phorm!).
Sure, compared to the US the UK probably still is great; just think of the medical care system, which is worthy of an industrialised country, and yes, the UK has the BBC and Channel 4. But on an international scale that doesn't quite cut it.
As far as I know, Windows now ships with a TCP/IP stack in all versions. In newer versions it's even partially IPv6-capable. Plus you can get PuTTY for Win32, so you should be set.
Plus newer versions even support USB Ethernet adapters.
Both factors combined should make it theoretically possible to connect this machine to the Internet. For mobile use, I'd recommend one of the USB-sticks which emulate a network card. Since they are automatically configuring, you should be set.
In an ideal world, we'd have laws against "secure" boot, since it artificially shortens the lifespan of a device, creating incredible amounts of waste. After all, the whole point of it is to prevent the second part of its lifetime, when people buy those devices used and install a different operating system on them. So instead of just installing a modern Linux on your laptop to replace the unsupported five-year-old version of Windows, your only option is to stop using it altogether.
"Not it isn't, anymore than understanding how a steam engine worked was an essential part of being a member of victorian society."
Actually, if people in Victorian society had understood as little about steam engines as people understand today about computers, it would have been a problem.
I mean, there are people out there who know so little about computers that they believe voting computers can somehow be made to conform to democratic standards. There are people out there who believe that computers can somehow prevent unauthorised copying of data they show to the user. The problem is that such insane ideas get put into laws and contracts... with lots of negative side effects for all of us.
Today, more than ever, we actually cut children off from the ability to learn about computing. If you bought a computer in the 1980s, chances were that it booted into a BASIC interpreter... today you need to root most mobile phones even to get to a shell.
The smallest of the issues arising from that will be a certain "brain drain": fewer and fewer people will get into IT and understand what they are doing. In the long term, this will mean worse and worse IT.
If we don't start educating young people now, there will be no one left to design, build and maintain the exoskeleton you need to get around when you are old. We already have one lost generation.
... but it is a nice marketing coup. It's not as if Apple could do anything about it without giving up basic principles.
It'll just change the business model of those companies slightly.
Instead of creating hype in order to be bought by some larger company for insane amounts of money, startups will now hope to be bought by an insurance company for insane amounts of money, because that way their data gets bought with them.
There are now cheap ESP8266-based WiFi boards around which cost around 5 euros apiece and have a nice serial port. They contain a minimal TCP/IP stack and can easily be controlled from just about any microcontroller you want. Plus there's a fairly obscure SDK available which should allow you to do everything on the board itself.
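The serial control amounts to writing "AT" commands to the port. A small sketch of what that looks like, using the command names from Espressif's stock AT firmware (AT+CWJAP to join an access point, AT+CIPSTART to open a TCP connection); the helper function names are my own invention, and the actual serial write (e.g. via pyserial) is left out:

```python
# Sketch: format ESP8266 "AT" firmware commands as bytes ready to be
# written to a serial port. Commands are terminated with CR+LF.

def at_join_ap(ssid: str, password: str) -> bytes:
    # Join a WiFi access point.
    return f'AT+CWJAP="{ssid}","{password}"\r\n'.encode()

def at_tcp_connect(host: str, port: int) -> bytes:
    # Open a TCP connection to host:port.
    return f'AT+CIPSTART="TCP","{host}",{port}\r\n'.encode()

print(at_join_ap("mynet", "secret"))
print(at_tcp_connect("example.com", 80))
```

From a microcontroller you'd send exactly these byte strings over the UART and wait for the board's "OK" response.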
I mean, those systems are probably cheaper than actual workstations, and the operating system could probably easily be replaced by something more suited to professional use (like some Linux or Windows Server instead of Windows 8 or whatever they ship this with).
How reliable are those systems? Is this typical "if it left the shop it's already half broken" quality, or is this something decent?
Well, those are shortcomings of current browsers. Browsers have become a gigantic mess. However, let's imagine for a moment we had something much simpler than a browser bringing you the same functionality: essentially a simple protocol to let your mobile device be a client to a server of your choosing.
BTW access to cameras and local storage is something JavaScript has in modern browsers. :)
Well, it's just like the "multimedia CD-ROMs" of the 1990s. You bought an "online encyclopedia" which had a couple of thousand articles of dubious quality. Or you bought image archives where you got someone's holiday snapshots and POV-Ray experiments.
This passed with fast Internet, and as soon as decent mobile Internet is available it'll pass in the mobile world, too. Now, what Siri has introduced is something very much like a command line: you literally tell your computer what you want, and it obeys your command. Maybe one day there will be a simple voice terminal, encoding what you say into the 4800 bit/s stream used by such services and giving you back the results in a form that can be spoken and shown.
"But Dell offered Linux (do they still?) as an option so I'd have expected more consumer penetration by now."
Yes, they offer it on the intersection of their models that are either unsuited to Linux (because they use overly exotic hardware) or utterly undesirable (because they have glossy displays, non-replaceable batteries and/or no Ethernet).
Companies used to make products for specialist markets. For example a home computer required you to actually learn about how it works to some degree. Of course there were devices for people who didn't want to do that, those were games consoles or television sets.
Today it costs more and more to develop and build a "smart"phone, so much so that companies increasingly won't dare to experiment. Apple brought out a "device" which was rather bad by the standards of the time. But since Apple had developed a cult following with the iPod, and since the device didn't require you to think, it was successful. Being the only such device from Apple, in a market where companies like Nokia had hundreds of models, also made it look good in the sales rankings. Most of the competition being utter shit probably helped, too.
What companies don't do any more is experiment and take bets. Nokia did that with their "Maemo" series. Despite not being advertised and not having any GSM connectivity, those devices were very popular.
I mean, surely this seems very dated now, but calling it "Medieval terror bastards" seems harsh. It can't be worse than "Sapphire and Steel". And what does that even mean in the context of a 1970s children's TV series?
https://www.youtube.com/watch?v=eYmbt2RVqCg
Ohh, you mean that organisation in Iraq? That's called "IS", not "ISIS"; they had a rebrand recently.
The goal was to have independent systems apart from GPS. Because of political problems, Galileo will never operate without the consent of the US. That's why Galileo cannot do this directly...
However, the announcement of Galileo has prompted other countries to start their own, truly independent, systems. Glonass is just one example. In fact, many "smart" phones already have combined GPS/Glonass receivers.
... and how much effort it takes to remove that DRM again on the customer side. All of that would be so much simpler if it wasn't for idiotic DRM which neither protects content nor helps anybody in any way.
If you want to see streaming without such idiocy, look at the streaming at the Chaos Communication Congress in late December. There they have a fairly well-scaling streaming infrastructure which is simpler and works more reliably... unless the network there fails.
You will know that you will have to harm the people around you and all over the world. You will be responsible for opposition forces in some country being tracked down and killed because your prime minister likes that country's dictator. You will have to find security holes and will not be allowed to get them fixed. If you take such a job, you will make the world a worse place... and if you want to quit, they have more than enough information on you to blackmail you into staying.
If secret services acted for the common good, they would do so publicly, or at least disclose what they did after a sensible amount of time. What we see instead are secret services fighting off every little bit of democratic oversight they have. The sensible thing would be to close them down and maybe, if we find parts of them useful, to recreate those parts.
Today most mail servers already use TLS for all their connections, so only the servers involved see the headers. Of course those are self-signed certificates... but for governmental attackers that's no less of a problem than real ones: in both cases you need to mount an active attack, which is potentially visible.
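In terms of Python's ssl module, that kind of opportunistic server-to-server encryption amounts to a context like the following sketch: encrypt the channel, but don't verify the peer's certificate, which defeats passive eavesdropping while remaining open to an active man-in-the-middle. (Real MTAs such as Postfix implement this policy internally; this is just the shape of it.)

```python
import ssl

# "Opportunistic TLS" sketch: most mail-server certificates are
# self-signed, so verification is switched off and only passive
# attackers are defeated.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False        # self-signed certs carry no trusted name
ctx.verify_mode = ssl.CERT_NONE   # accept any certificate at all

print(ctx.verify_mode == ssl.CERT_NONE)
```

An active attacker presenting their own self-signed certificate would be accepted, but would also have to sit in the path continuously, which is exactly the "potentially visible" part.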
Same goes for any sort of "encrypted webmail service". Even if the browser was a secure environment, once you can break TLS you can send any Javascript you want over that connection.
So what shall we do? I believe we should make GPG more user-friendly while keeping it compatible with what we have. For example, the default configuration of Enigmail could always attach the currently active key for the sending address, and it could automatically store public keys it received in signed e-mails. In the default setting it would then try to make smart decisions about which keys to use when: if it recently got a signed e-mail from someone, it would send an encrypted one back to that address.
Of course you should still be able to do everything manually, if you choose to do so. Also for mobile devices you could do key exchange via QR-codes.
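The policy sketched above could look something like the following. The names ("Keyring", "saw_signed_mail", "should_encrypt") are invented for illustration; real GnuPG/Enigmail internals differ.

```python
# Sketch of an opportunistic-encryption policy: remember public keys
# seen on signed mail, and encrypt replies to any address we hold a
# key for.

class Keyring:
    def __init__(self):
        self._keys = {}               # address -> key id

    def saw_signed_mail(self, address, key_id):
        # Called whenever a validly signed message arrives.
        self._keys[address] = key_id

    def should_encrypt(self, address):
        # Encrypt replies exactly when we hold a key for the recipient.
        return address in self._keys

ring = Keyring()
ring.saw_signed_mail("alice@example.org", "0xDEADBEEF")
print(ring.should_encrypt("alice@example.org"))  # True
print(ring.should_encrypt("bob@example.org"))    # False
```

The point of such a default is that users who never touch a setting still end up encrypting to everyone who signs their mail, while manual overrides stay possible.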
The point is, we already have good infrastructure which was not designed by idiots. Redoing it now risks having it done by the current flood of idiots who think that earning their money writing shitty apps for mobile devices and reading the Wikipedia page on cryptography makes them suitable for designing systems that should protect people's lives.
"Smart TVs are the rage. but for some reason, clunky and really needing some improvements. My point, there is room to grow in that area. MS could really hit a home run in this evolution if they could just push aside the 800 pound gorilla. I dont want Windows on TV, I want an interactive TV menu that makes me wonder "How did I get along without it"."
Well for that Microsoft would have to:
1. have a clue on how to do it, which is much harder than you'd imagine
2. be able to have that clue somehow survive through the company and reach the people who are in charge
Particularly point 2 is not likely to happen at any time. Microsoft is just far too large for that.
No that should be:
Linux has fans and users
Windows has mostly sufferers
So far most of the Windows users I've seen seem to suffer from it. They are constrained by the arbitrary limits it imposes and more or less fight with it over trivial problems. Just read Trevor Pott's articles where he fights to do trivial things like getting e-mail out of an e-mail server. Things which on any other platform just require a single line typed into the command line... or dragging and dropping a folder in the GUI.
Of course there is also a group of genuine Windows fans. Those people actually know Windows and do things like porting Windows CE onto the Raspberry Pi (at least they claim to do that) or bypassing the Win32 API and directly talking to the kernel.
Then of course there are the Windows fanbois: people who have no idea about Windows but just irrationally like it very much. That seems to be a much larger group than the genuine fans. They may have tried to install some ten-year-old Linux distribution on overly exotic hardware... and failed, which they then use as justification for thinking Windows is the best thing EVAR.
I'd say the vast majority of software for Windows was long abandoned before 64 bit Windows came around. And the software that's not yet abandoned couldn't afford to cut Windows XP users out.
On the other hand, only very few types of applications actually profit from the larger 64 bit address space.
Additionally, Microsoft removed 16-bit application support from their 64-bit versions, so your old 16-bit applications won't run any more. So I can understand large parts of the Windows market still being on 32-bit, particularly in the business sector.
...and a severely broken copyright system which forces/allows TV stations to limit their broadcast area.
Just think how different radio on the Internet is. Today you can tune into virtually every radio station in the world from wherever you are. It's like shortwave, only often in better-than-FM quality.
We could have the same with television. The step from 128 kbit/s audio to 1024 kbit/s video isn't big enough to make it infeasible.
Television used to be different. Back in the 1990s you just started your TV station and put it on a satellite, and everybody in Europe could receive it. Television was a lot more European; it didn't know as many boundaries as it does now. Today, when I order "Cartoon Network" in English from my cable company, I get a monstrosity known as "Cartoon Network Deutschland", which has very little to do with the real "Cartoon Network": it only shows programmes which have been dubbed into German, which means shows that have already run on other channels for years and are continuously repeated. The result is something more akin to "Pop" than to Cartoon Network.
At least that's according to what the people working there actually complain about. That's something that can, unfortunately, now be found for every platform.
One bizarre complaint was that people couldn't get "mobile devices" quickly... which is actually more a sign of decent administration, since it's good practice not to let every insecure device onto your network.
People usually take that for granted on Windows.
Well, first of all, most of today's computers have MMUs. Virtually every halfway modern mobile phone has one, and certainly everything that runs Linux. And yes, I think we all understand how MMUs work, and what vital work has been done in recent years to improve on them.
"License plates" or other identity schemes where you have to show your passport to get to the net, won't help anything. Just look at malware like WhatsApp, you can find out who made it, but that's of no use, you still cannot get the malware aspects out of it. The only thing this helps with is make it easier for governments and companies to track the opposition. There are people whose life depends on anonymity.
What we need to do instead is make computers more secure. We are already much better at this than we were in the 1990s, except on Windows and mobile phones. Instead of misinformed grumbling, we should continue down that path and make our systems even more secure. We need to understand where our current weaknesses are and find ways to fix them.
Like sci-fi series where people live in stations on the moon, but only have one computer on that station and communicate via videophones with monochrome CRTs.
Come on, we shouldn't need this by now.
Even ad companies can now build self-driving cars. Trains and buses have reached a point where they are far more reliable and safe than cars. We shouldn't have large parts of our population driving around regularly in cars. It's 2014 not 1964.
Plus, you should design battery-operated devices to work over a larger range of voltages. Such parking meters wouldn't work reliably with rechargeable "9V" batteries, since those often supply just 8.4V. Ideally you'd put a little step-up converter inside and make them run from a single cell. That way you avoid the problem of one bad cell among several good ones bringing down the whole battery.
Well, you can always use self-signed certificates. They have little security disadvantage compared to ones from a CA... unless of course you believe ALL the CAs you trust are trustworthy and totally secure. If just one of them is compromised, you are back to the security level of self-signed certificates.
Most governments already run their own CAs, which means they can easily issue fake certificates that the browser will accept. So if you can already do man-in-the-middle, it's trivial for a government to intercept the connection. We already see that in some countries.
The big problem is the CA system in SSL/TLS: it relies on hundreds of organisations all being trustworthy. Therefore I'd go for a system like ssh's, where you see the fingerprint of the host's public key, and once you've connected to it your computer remembers it. That way an attacker would have to mount a continuous and reliable man-in-the-middle.
SSL/TLS is only really good against passive attacks, and for that you don't even need the CA system. Perhaps we should make browsers display the host's public key as some kind of graphic. Sure, it would just be a pattern of dots or something, but it could be done in a way that's easily recognisable. Or maybe we extend the URL standard to include some public key information; such longer URLs could then be promoted via QR codes.
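The ssh-style "trust on first use" idea above is simple enough to sketch in a few lines: pin the fingerprint of a host's key the first time you see it, and reject any later change. The key bytes here stand in for a real DER-encoded public key.

```python
import hashlib

pinned = {}                           # hostname -> pinned fingerprint

def fingerprint(key_bytes: bytes) -> str:
    # SHA-256 over the raw public key, like modern ssh fingerprints.
    return hashlib.sha256(key_bytes).hexdigest()

def check_host(host: str, key_bytes: bytes) -> bool:
    fp = fingerprint(key_bytes)
    if host not in pinned:
        pinned[host] = fp             # first contact: pin it
        return True
    return pinned[host] == fp         # later contacts must match

print(check_host("example.com", b"key-A"))  # True  (first use: pinned)
print(check_host("example.com", b"key-A"))  # True  (same key again)
print(check_host("example.com", b"key-B"))  # False (key changed: MITM?)
```

An attacker would have to be in the path on the very first contact and on every contact thereafter, instead of just once with a single rogue CA certificate.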
Yes, but apparently his point is that if you have software where you can "remove features you don't want", i.e. open source software, you aren't liable. That makes liability very feasible, as there is no reason not to distribute your source code any more, except for malware.
Actually, the thing with the stack pointer won't help against stack overflows, and it is in fact done by the compiler. In short, it guarantees that pushes and pops to the stack will be symmetrical.
C also does things like automatically making sure that if you add a float to an integer, the correct numeric conversion is performed, instead of just treating the bits of the float as an integer or vice versa.
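The difference between proper conversion and raw bit reinterpretation can be shown in a few lines (sketched here in Python, using struct to expose the IEEE 754 bits a C compiler would otherwise hide):

```python
import struct

# Proper numeric conversion: the int is promoted to float before adding.
x = 2 + 1.5                        # 3.5

# Raw reinterpretation: the same 32 bits of the float 1.5, read as an
# unsigned integer, give a wildly different value.
bits = struct.unpack("<I", struct.pack("<f", 1.5))[0]

print(x)     # 3.5
print(bits)  # 1069547520 (0x3FC00000), the bit pattern of 1.5
```

This is exactly the kind of mistake typed languages rule out automatically, and untyped assembly lets you make silently.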
Well, we currently have the problem that compiling software is slow and error-prone, which is why installing Gentoo will typically increase your power bill noticeably. But let's imagine we lived in a world where compiling is fast and easy, so fast that your operating system and applications could actually be compiled from source while starting. Sure, that sounds crazy, but it's what the Forth and JavaScript people are doing.
Now, what if whenever you did an update, you actually got to see the changes? Just as in Eclipse, you could simply see the diff between the old and the new version. Such changes are much easier to understand than the rest of the software. Of course, 99.99% of all people would just accept them without even looking. However, with millions of computer users, the remaining 0.01% still amounts to hundreds or thousands of people. Compare that to the few people who look at patches today.
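Producing such a view is trivial with existing tooling; a sketch using Python's standard difflib, with two made-up source versions standing in for an update:

```python
import difflib

# Two versions of the same (hypothetical) source file.
old = ["def greet():", "    print('hello')", ""]
new = ["def greet():", "    print('hello, world')", ""]

# A unified diff, the same format patch reviewers read every day.
diff = list(difflib.unified_diff(old, new, "v1.0", "v1.1", lineterm=""))
print("\n".join(diff))
```

An updater that showed this before applying anything would cost almost nothing to build; the hard part is distributing source alongside binaries at all.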
There are lots of people who, while not proficient enough to write their own code, know enough to spot code they may not want. Those people could then simply refuse to accept certain parts of the code.
For other people this may be an introduction to reading code. If it's just a click away people might start looking at it, and it'll gradually make sense to them. Computing would change from some "magic box" to something we all can take part in.
> We've known how to prove code for decades. The trouble is - it's ruinously expensive.
Well, not necessarily. Certainly hand-proving your code, which is done routinely in some areas, is rather expensive. However, there are small properties for which automatic proofs are already extremely common. Many languages, for example, will make sure that your stack pointer has the same value after a function call as before it (unless you do some really weird things). This may seem trivial, but it helps prevent certain problems.
There is research going on into how to make proving more complex things easy. One idea, for example, is to have useful types. Your compiler and system could know that a certain memory location contains an integer which is a prime number. The compiler could check the code path and insert checks for this condition where necessary. It could even throw a compile error if you try to store the return value of a function that does not produce primes.
Furthermore, you could attach tags to your types indicating how the data may be moved. For example, every word of memory could carry a type tag allowing you to mark that word as "private". A "private" word would stay private through normal operations: add 5 to it and the result is still private. Your network card would refuse to transmit words marked private. However, a special privileged function could, for example, encrypt it and thereby turn the information public. That way you could guarantee that no information marked "private" ever leaves the machine unencrypted.
Essentially, the current attempts boil down to giving your compiler hints on how to check whether the code is right. An early start in this direction is the "const" attribute on variables in C.
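The "integer which is a prime number" idea above can be sketched as a type whose invariant is checked once, at the boundary, so all later code can rely on it. A real system would have the compiler insert these checks statically; this is just a runtime wrapper illustrating the shape of the idea.

```python
class Prime(int):
    # An int subtype whose values are guaranteed prime: the check runs
    # once at construction, the "boundary" of the type.
    def __new__(cls, n: int):
        if n < 2 or any(n % d == 0 for d in range(2, int(n ** 0.5) + 1)):
            raise ValueError(f"{n} is not prime")
        return super().__new__(cls, n)

p = Prime(13)                      # fine: 13 is prime
try:
    Prime(12)                      # rejected at the boundary
except ValueError as e:
    print(e)                       # "12 is not prime"
```

Any function typed to accept a Prime never needs to re-check primality; the guarantee travels with the value.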
"It doesn't work like that, because that would be moronic. Instead the encrypted blob is sent to your PC, where the password is used to decrypt it in-app."
Yes, and if you are unlucky that "app" is just some JavaScript on a web page, loaded anew each time you visit and potentially tailored to you specifically. Since the CA model is broken, more sophisticated attackers can even replace it without the developer's knowledge.
If you are a bit luckier, you have an actual app; however, that can still be updated by the developer... or by whoever else has access to the chain of trust bringing you that code. Updates on today's operating systems are delivered in binary form, making it extremely hard to see what has actually changed. So it's completely plausible that you, as a target, got a special version which sends your master password, along with the encrypted blob, to some third-party server.
Software distribution is, unfortunately, severely broken on commercial systems. Even having a list of the source files that changed between versions could make a big difference to the security-conscious end user. Having access to the diffs could bring actual security, at least to educated users: it's comparatively easy to look at a patch in code.
Just think about it. For a foreign party (e.g. your local secret service) to compromise that unencrypted file, they need to compromise your local computer, either remotely or via hardware access. If they can do that, it's trivial to sniff the master password you enter into one of those services and get at all the other passwords...
Additionally, there is the threat of the service delivering malware. While JavaScript usually cannot break out of the browser, it can certainly send the password you enter back to the service, as well as decrypt your passwords locally. All of this can be done selectively for particular users, and US law can probably even force services to do this without telling their users. If it is only done to a select few, chances are it'll never be detected.
So seriously, instead of using such a service, it's far better to use a plain text file of passwords.
Ohh you've missed a generation.
In between there was OPC: OLE for Process Control, a grand plan to make everything interoperable... based on OLE and DCOM. Of course it didn't actually work, and now there are dozens of companies adding trivial features like logging to those systems. Oh, and guess what: DCOM has few security features, and the few it has are typically deactivated... meaning that you can control not just the special SCADA software, but probably also any other OLE software on the system. OLE was one of the backbones of Windows back in the 1990s.
So sure, text-based SCADA kit at 9600 baud would be much more secure; in fact you could even hang it off a small Linux box running SSH for network access... but that won't give you flashy graphics to watch on your iPad.