The Cameras ...
Can they easily be "updated" and backend SW do illegal facial recognition too?
Not really, they could be keeping a copy of everything uploaded? I don't know.
BUT a frigging WEB-ONLY utility you have to UPLOAD to isn't a free application. It's potentially the evil of the cloud, and it only works online.
MuseScore IS a free application.
" batteries got about 20% extra capacity per year while the price dropped 20%. "
Not Lithium, Alkaline or Lead Acid.
(I agree broadly with the rest. It's unlikely we can get 1/2 as good as the theoretical limit of battery density so you are generous.)
Alkaline: similar to 1965. The x5 capacity gain vs Zinc Carbon is a lie for anything at a 1/10C discharge rate; more like x2.
Lead Acid, Zinc Carbon: similar to 1950s
NiCd increased a bit from the 1940s to 1970, then peaked at 450mAh to 500mAh.
Never really. What happened is that different battery technologies were developed and then later became economic.
Lithium cells have hardly increased in capacity or life in 10 years. The number of cycles is still an issue.
By volume NiMH now match Lithium, but weigh very much more. They are now about x5 the capacity of NiCd; originally NiMH were about x2 to x3 the capacity of NiCd.
"Then microprocessor controllers started to supplant designs using discrete logic devices"
As small computers, running programs simulating slow discrete logic. Internally a quite different design. I've implemented decoders, counters, generators etc using PIC micros. They ARE small cheap computers, (from 30c).
FPGAs are fast, power hungry and expensive. They ONLY execute code, AFTER loading the configuration, if the design implements a CPU or the part has a ready-made CPU core.
Field Programmable Gate Array. They replace discrete ready-made VLSI with a custom design in a production chip.
You won't get much FPGA for $10.
"Xilinx etc have tools that take C programs (not designs like SystemC etc) and turn it into something that will "run" on an FPGA."
Then someone is designing a custom CPU implemented using an FPGA.
Unless they are implementing a CPU.
The FPGA "source" is really a description (not a program), "compiled" into a logic array configuration loaded at power on. No run time code at all, unless part of the FPGA is implementing a CPU. It's pretty wasteful of FPGA resources to implement multipliers, so most FPGAs have pre-designed multipliers. Similarly there are a variety of FPGAs with an ACTUAL CPU core (not one wired out of FPGA gates), to allow efficient execution of run time code, or to prototype the HW around a CPU core on an SoC, as implementing a CPU uses too much of the FPGA.
A raw FPGA is useless to a programmer unless that programmer can design the HW of a hypothetical CPU from scratch. They are not programmed in the CPU sense, but designed by people that design hardware. Verilog and VHDL only look like programming languages; they are hardware description languages for the physical definition and interconnection of gates (thanks to ultra fast static RAM, some logic functions are truth-table style look-up tables in RAM rather than interconnected NAND and NOR gates as in the earliest designs). The hardware design is LOADED from a ROM, EPROM, Flash memory or even USB or JTAG at power on. After that, it's functionally an ASIC or logic board, only actually executing ANY code if there is a CPU (FPGA based, or a pre-designed core element) as part of the design.
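To illustrate the truth-table / look-up-table point above, here's a minimal Python sketch (nothing to do with real FPGA tooling): any 4-input boolean function is "compiled" to a 16-entry table, then evaluated by pure indexing, which is essentially what an SRAM LUT does.

```python
# Minimal sketch of how an FPGA LUT works: any 4-input boolean
# function is stored as a 16-entry truth table, then "evaluated"
# by a pure memory lookup -- no gates at run time.

def compile_to_lut(fn, n_inputs=4):
    """Tabulate a boolean function over all 2**n input combinations."""
    table = []
    for i in range(2 ** n_inputs):
        bits = [(i >> b) & 1 for b in range(n_inputs)]
        table.append(fn(*bits))
    return table

def lut_eval(table, *bits):
    """Evaluate by indexing the table, like an SRAM LUT."""
    index = sum(bit << pos for pos, bit in enumerate(bits))
    return table[index]

# Example function: a 1-bit full adder's sum output (a XOR b XOR cin),
# ignoring the 4th input.
def sum_bit(a, b, cin, _unused):
    return a ^ b ^ cin

lut = compile_to_lut(sum_bit)
print(lut_eval(lut, 1, 0, 1, 0))  # 1 XOR 0 XOR 1 = 0
```

The point is that after "compilation" no trace of the original function remains; only the table does, which is why the loaded configuration behaves like fixed hardware rather than a running program.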
Above about 10K parts there is no point to an FPGA.
Only for proof of concept Prototypes. Any IoT chip will be an SoC / ASIC. An FPGA is simply too power hungry.
FPGA are only for volume too low for ASIC or prototypes. The FPGA "design tools" can target a specific FPGA for a prototype or specialist lower volume application or an ASIC.
I wish I could afford a priv.
Android but with Added security
My last "smart phone" with a keyboard was the Nokia E65; stupid Nokia killed the widgets and created useless Ovi. The Sony Xperia Z1 is OK, but no updates, and the Android apps and Play Store are so much like spyware I only use Camera, Phone, FM Radio, Music Player, etc. Bluetooth, WiFi and Mobile Data are disabled. I had to uninstall the Amazon Kindle Reader as it was trying to connect even when I wasn't using it, and even though WiFi and Mobile Data were disabled. I can't afford Mobile Data, so the Priv would have to work on my WiFi.
If the economy goes cashless, then WE WILL MOSTLY DIE, (food riots in less than 2 days) if Banks don't reverse their outsourcing.
ALL mission critical stuff, stuff that defines the core of your business, needs to be in house. It needs to keep going via at least 2 geographically separated data centres, in case one has dodgy SW added (never update both at the same time), or a PSU fault, bomb, fire, flood, terrorist, aircraft falling on it, etc.
What happens to the public and your business if you and all the similar businesses have outsourced to the same megacorp and it falls over (for WHATEVER reason; Amazon, Oracle, Google and Microsoft never have outages?)
So don't just consider the money!
They use UPS all the time. As a courier.
I thus will not be surprised if Amazon takes over the local petrol stations / midi-markets and uses them also as parcel depots (to ship and receive, to aid selling on Amazon and buying), with a coffee shop offering not just WiFi but the Amazon voice controlled ordering terminal, as a touch screen for the menu, local shopping and online ordering.
2015 was the year it was finally obvious to everyone that whatever disease Nokia had in 2002-2003 (killing good stuff, insisting on S60, losing direction, culminating in the sabotage of Maemo and Trolltech's Qt), MS has caught the same disease. Were VS6, SQL 7, Windows XP, Server 2003 and Office XP/2003 the peak, and has it been downhill since, with GUI stupidity and bloat?
Win 10 is a train wreck.
Indeed people might purchase the full version with the lyrics. I'd have thought that on balance this boosts sales if these lyric / voice free versions are not available. Seems yet again the publishers are missing a trick.
The creation is a copyright violation, which requires a civil suit. If there is monetisation, then perhaps there is unpaid VAT, unpaid tax on profits, and possible criminal cases for forgery etc ...
Amazon has a creepy thing that lurks in your living room.
Are these things gimmicks for people that:
a) Can't type
b) Can't use search properly
Do they actually manage as well as an 8 year old with a laptop and Google Search?
"Wee Jimmy, find me a deep fried pizza"
1) It was a good idea, badly done: a sort of C-syntax Visual Basic for every platform. You really can do a GUI based application that looks like the OS native style and runs on all the common desktops. Why, though, do people use the default stupid eye candy builds?
2) The insane licence terms for Desktop Java are one reason for Dalvik etc on Android.
3) Sun made a mess of it long before Oracle took over.
4) Amazing how many people don't realise that by default the applications have all the source in a manner easily extracted.
It could be fixed in the future; it's not impossible even to make one version to rule them all that works with old applications. Better GUI development tools with an easier choice of window style etc, and distribution only of bytecode.
I remember secretaries calling those new fangled things, hard disks.
Obviously the 360K 5.25" disks were floppy.
I agree, Mozilla has to include it in Firefox. They have to include it on all platforms of Firefox too, somehow. Better than messing up the GUI again...
Not cheap. It's a huge amount a year. Also pointless unless you have top end speed DSL with no cap, Cable or Fibre.
It's a service for the affluent and privileged.
Smart TVs are a bad idea: likely provider X uses codec Z, management Q and DRM R next week/month/year, and the TV "maker" (or real OEM) never releases new firmware, or if they do it's only OTA on one platform in another country, or impossible for Joe Soap to install.
At least a laptop as a final resort can be dual booted into unwanted OS U to get X, Z, Q and R.
Yes, Adobe DRM added to ePub open standard: Books that tell someone what page you are on, and in the clear too!
Adobe DRM on PDFs, often scans of Public domain documents.
Purchased downloads and media should NEVER have DRM. It's immoral.
DRM on Streaming? Well, a decent HD camera and 42" HD monitor defeats it.
DRM is pointless and immoral. It only makes life more awkward for ordinary users; it never stops commercial pirates and only briefly stops casual home "pirates". Yet it removes consumer rights, adds restrictions not in the Berne Convention on copyright, and allows corporations to "landgrab" public domain works and retain restrictions when copyright should have expired.
Some contracts and so called licences on Digital Downloads (not streaming) may not even be enforceable.
Did I mention DRM is pointless and immoral?
Well. I boycott pay TV and Subscription Streaming.
We've seen it. Nothing new here at all. Even the marketing buzz words are getting stale.
It's not even proper SF, but Science Fantasy with wishful thinking!
So did they previously use an analogue hydrological computer programmed by dials?
We have made no progress at all on AI since 1946, all progress has been narrow areas of simulation and so called "Expert Systems". Much language in the AI field has been redefined. Computer Neural Networks are nothing of the sort. "Learning" and "Adaptive" is nearly a misuse of English. Successful language translation has almost abandoned grammar / parsing / semantics / context to use a brute force "Rosetta Stone" approach.
Speech recognition is nowhere near as good as an audio typist, never mind a personal assistant. It just needs less "training" than it used to.
We don't yet have an adequate definition of natural Intelligence, so how can we define the program requirements much less write one? True natural language interaction rather than the Artificial Stupidity of phone response "robots", Siri and Cortana is very far away and probably needs true AI. It's not a question of computer power, or it could be done slowly.
Neal Stephenson in "The Diamond Age" asks whether real AI is in fact even possible with a "Turing machine" (i.e. ANY computer). The "book" in the story certainly isn't possible today [ignoring issues of communication and charging] without at least a real time team of humans, rather than the one person in the story. The hardware and software of the "Book" is certainly feasible, though it's more like something implemented with eInk plastic paper than OLED or LCD.
In a way, Project Xanadu was the first attempt at the "book": it is earlier than Apple's HyperCard and HTTP/the Web, and it tries to solve some basic limitations of web pages.
The hardware of C3PO or R2D2 was possible even in 1977, though power supply and balance were a problem for C3PO then; that's now solved by e.g. Honda's robot (though I suspect power / running time is an issue).
Various generations of iPod / iPhone docks
The Apple Watch so called "wireless charger" plate*
Inability to use USB storage mode: you have to use iTunes or the "cloud" to move data from an iDevice with a USB port, even if you have a Mac.
[* So called wireless chargers are connector-less, the pad still needs a cable. Even madder is a Cordless kettle. It's a corded kettle with a really big plug!]
Before 1926 they only did light bulbs. Now they only do light bulbs and health products. They licensed the TV and HiFi badges to two different Asian companies, a step beyond outsourcing. Philips Electronics that did wonderful valves (Mullard from 1928) and later Transistors and ICs (inc Valvo and Mullard) was spun off ages ago as NXP.
They were the last serious European competitor to Asian Consumer Electronics.
MS (after losing an EU case) is pretty open. Even before that, very much was interoperable. Nor generally do MS create SW that only works with their HW, or HW that only works with their accessories.
There is a good reason why retail OS X is a lot cheaper than retail boxed Windows.
Another good reason to avoid outsourcing or using the so called "Cloud".
Phones are too skinny, batteries too small for the consumption. My phone used to last all week.
I'd happily have a phone twice as thick as my Sony Xperia Z1 if the extra space was all battery.
You joke I hope.
99.99% of tweets are irrelevant (a guess).
If Facebook and Twitter were closed, the Internet would be improved.
The media take far too much notice of tweets. The BBC has plumbed new depths by not only having a feature article on their website about something trending on Twitter, but having a BBC R4 programme promoting whatever trendy thing on Twitter they picked.
We certainly don't want undemocratic, self selecting, minority activists using twitter to create laws.
However the US politicians' ignorance of the Internet, or indeed life outside the USA or the "West", and the sense of entitlement to >70% of resources for <16% of the population, is astounding.
No ISP should EVER be using any Google supplied service, especially eMail.
If they can't run their own email system why be an ISP?
There is no evidence yet that
a) We need this even if done properly
b) That anyone can do it properly
c) That, on the security aspect alone, it can be got right without both an expert supplier and a resident security expert (I've seen otherwise secure PCs and routers with STUPID PWN-ME settings enabled by a user trying to get a game working).
Huawei are not actually the Chinese State. They have even complained about the state owned Chinese companies getting preferential treatment.
Besides if you are NOT Chinese / Tibetan / Taiwanese / North Korean etc which is worst:
3) Chinese Government
4) Israelis (10% of security products shipped!)
Aren't they wonderful?
When is Boffin Day so I can toast them?
The iPhone used commodity parts; it was basically Samsung chips glued to an iPod, and the FingerWorks GUI added to a cut down port of OS X (as MS did to get WinCE for PDAs, and later phones, from NT).
The original iPhone used a Samsung stock 6400 family ARM SoC.
Other people used resistive screens because "handwriting" and annotation were a holy grail (from early Palm and Apple Newton devices). Capacitive screens existed from the 1980s but were ignored due to poor resolution.
Using a finger orientated GUI rather than a miniaturised standard WIMP GUI with a stylus was the ONLY "innovation", and not IMO patentable as it wasn't new, just not the fashion. Nothing else at all in the original iPhone was innovative; all stock parts. Apple bought their own ARM SoC designers later.
It's crazy that such a pre-existing design style that is so generic and not at all distinctive was allowed to be registered by Apple.
The Fluted Coca-Cola bottle is a good example that deserves it. A rounded rectangle with touch screen and icons does not.
It shows how broken and partisan the US system is that Apple should be awarded this so called patent, that it should be enforced, and that the penalty is set at 100% by Congress rather than a realistic assessment of damages. If the "design patent" is really valid (which is stupid!) then Apple should only be awarded a symbolic 1c in damages. It's not an original concept, nor is it likely anyone anywhere intending to buy an Apple phone buys an alternative because it's a similar shape.
Stereo is mostly irrelevant to effects, really, and was developed in the 1930s (Alan Blumlein at EMI). The BBC was using stereo on radio earlier than Star Wars, and stereo radio development (late 1950s, though the BBC roll out was much later) was long after stereo records were first produced.
However Lucas did make use of the effects channel, the .1 of 5.1, though Surround sound also predates Star Wars.
Many Lucas films have well done 5.1 sound tracks.
What is the life of these magical helium drives?
Maybe Seagate is smart.
How do you keep the helium in?
a) The majority of XP systems were not regularly patched anyway, so it makes not much difference.
b) Many vulnerabilities are blocked by firewall/router
c) Most bad stuff is now from the Internet, not shared disks; people click and install it. Or it can be mitigated by NoScript etc (which I run on Linux, as malware isn't the only issue it solves).
d) People with Win10 and 3rd party AV can be as easily infected if clueless and keep clicking on OK, installing toolbars, fake codecs, opening attachments, etc.
ActiveX in a browser. A stupid idea, in an ideal world who ever signed off that in MS should be in prison. Might as well send all kids past puberty to a brothel for every birthday party.
Probably more stuff runs XP than Win 10, as nearly all Win 10 is on the Internet (it's broken otherwise) and some computer users and applications can't get, don't want, or must not have Internet. Meanwhile some people are telling their XP workstation and Win 2003 server that it's a POS terminal and getting Server 2003 and XP patches for free.
Relying on the IP address to validate anything is nuts.
Paying bogus invoices is incompetence.
No native voice on 4G. 3G and 2G voice is native; early data modems simply used the voice modulation, hence 245kbps on 3G and 14.4kbps on GSM. HSPA on 3G adds non-voice modulation modes to the CDMA hopping carrier to increase data rate, depending on link S/N. GSM EDGE is also a trick, using different modulation from original GSM to get 200kbps+. There is a version of GSM (not used) for data only, called ERMES+, that can do 2.4Mbps and is native IP only (no voice). Superior to 3G HSPA as there is no cell breathing and the speed is consistent. But bad for marketing, as it can't compete on peak speed.
3G has no native IP, IP is native on 4G
3G is a wider band version of CDMA-1, typically 5MHz CDMA coded channels. 4G (on the downlink at least) uses COFDM or even COFDMA: loads of separate carriers, in 1MHz, 2MHz, 5MHz, 10MHz or 20MHz channels. GSM uses a 0.2MHz carrier. CDMA's 1.25MHz, or 3G's 5MHz, is a single carrier frequency hopping in a known pseudo random fashion. Each link has a different key. If all keys are used up, then the data is multiplexed; voice and data are handled differently for 3G and the HSPA version of 3G.
Flash-OFDM, Wimax and LTE are 4G systems.
WiFi can be CDMA or COFDM depending on version.
There are real 4G specs, though the important ones for 4G mobile are all LTE, which is certainly NOT a tweaked 3G; it's as different as 3G is from GSM (2G). 3G is however based on USA 2G (CDMA-1); I guess Qualcomm wanted more money.
" IPv6 was designed from the ground up to cope with mobile devices."
I've never ever seen that claimed before, other than the addressing size and giving the same IMEI the same v6 IP every time (which might be a bad idea for privacy!).
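On the "same hardware ID, same v6 IP" worry: the analogous mechanism for Ethernet is SLAAC's Modified EUI-64 interface identifier, derived mechanically from the MAC address (RFC 4291), which is exactly the privacy concern (and why privacy extensions were later added in RFC 4941). A minimal sketch of the derivation; the MAC below is a made-up example:

```python
# Sketch of the classic Modified EUI-64 derivation (RFC 4291) that
# SLAAC used for Ethernet MACs: insert ff:fe in the middle and flip
# the universal/local bit. A stable hardware ID baked into every
# address is the privacy worry: the host is trackable across networks.
def mac_to_eui64(mac):
    b = [int(x, 16) for x in mac.split(":")]
    b = b[:3] + [0xFF, 0xFE] + b[3:]   # insert ff:fe in the middle
    b[0] ^= 0x02                        # flip the universal/local bit
    return ":".join(f"{b[i]:02x}{b[i+1]:02x}" for i in range(0, 8, 2))

print(mac_to_eui64("00:11:22:33:44:55"))  # 0211:22ff:fe33:4455
```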
"the need to maintain a static IP address while moving from mast to mast and (potentially) roaming between different subnets."
That's a completely separate issue. It's still not very reliable. Switching between 3G, 4G and WiFi without losing a session is really hard. It's possible, and IPv6 is nothing to do with it. Switching between mobile 4G sites with IPv4 works fine; I was testing it in 2007.
3G has "HD voice". It has a choice of codecs. 4G can't use some of the 3G and 2G codecs as they are less than optimal for TCP/IP and UDP type networks. VOIP needs codecs optimised for TCP/IP unless you know the latency and packet loss are very low; even then the native "wireless" codecs have wasteful frame sizes for IP traffic.
You'd care if other people's voice calls were only carried on 4G. The data speeds might be 1/4 for you.
No-one is "going for 5G" already. It's still an ill defined concept, possibly more about integrating infrastructure. There is no 5G standard yet, nor any 5G base stations or software or hardware.
No, IPv6 mainly addresses the addressing issues. It does have some other "fixes", but it's still inherently not good for wireless. If it was, then satellite systems would use it natively on the links (nearly 90,000km each way, ground station to user); they don't.
Mobile / wireless needs a protocol designed for massively variable speed, packet loss and latency, which can vary from excellent to really bad in minutes for a given link. Also the number of user connections per sector can vary from 1 (20Mbps) to 20+ (0.12Mbps) during a transaction, without even counting weather, movement, interference (even from other cells) or whatever affecting the speed, latency and packet loss. IPv6 doesn't make a huge improvement over IPv4 for that scenario. Also mobile has the philosophy that individual users get all of the bandwidth available, rather than throttling back power (reducing inter cell interference and increasing battery life) when there are 1 or 2 users rather than 5 users in a sector. It's pretty garbage anyway for more than 10 serious users of data (c.f. 3G, where about 100 native voice calls are possible).
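One way to put a number on "TCP/IP is poor for wireless": the well known Mathis et al. rule of thumb bounds steady state TCP throughput at roughly MSS / (RTT x sqrt(loss)), so wireless-level packet loss cripples it regardless of link speed. A quick sketch with illustrative numbers:

```python
# Rough illustration of why loss hurts TCP so much on wireless: the
# Mathis et al. rule of thumb bounds steady-state TCP throughput at
# about MSS / (RTT * sqrt(loss)). The RTT/loss figures below are
# just illustrative examples, not measurements.
from math import sqrt

def tcp_throughput_bps(mss_bytes, rtt_s, loss):
    return (mss_bytes * 8) / (rtt_s * sqrt(loss))

mss = 1460  # typical Ethernet-derived MSS, in bytes
for rtt_ms, loss in [(50, 0.0001), (50, 0.01), (200, 0.01)]:
    bps = tcp_throughput_bps(mss, rtt_ms / 1000, loss)
    print(f"RTT {rtt_ms}ms, loss {loss:>7}: ~{bps / 1e6:.2f} Mbit/s")
```

A 100x increase in loss (0.01% to 1%) costs a factor of 10 in throughput, which is why lossy radio links need retransmission/proxying tricks below the IP layer.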
4G is really, really inefficient for voice calls compared to 2G and 3G native voice frames, as voice is carried as VOIP using TCP/IP to set up and control, then UDP streams. Hence most operators with 4G & 3G make your voice calls use 3G or even 2G!
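A back-of-envelope sketch of that inefficiency: every small 20ms voice frame drags a full IPv4/UDP/RTP header (about 40 bytes) along, so for a low rate codec a large fraction of the bits are overhead. The codec payload sizes below are illustrative round numbers, not any specific 3GPP codec:

```python
# Back-of-envelope sketch of why VoIP framing is wasteful for small
# voice codec frames: each frame is one packet, and each packet
# carries full IPv4 + UDP + RTP headers. Payload sizes below are
# illustrative round numbers.
IP, UDP, RTP = 20, 8, 12   # IPv4/UDP/RTP header sizes, in bytes
HEADERS = IP + UDP + RTP   # 40 bytes of headers per packet

def efficiency(payload_bytes):
    """Fraction of each packet that is actual voice payload."""
    return payload_bytes / (payload_bytes + HEADERS)

for name, payload in [("~12kbps codec, 20ms frame", 32),
                      ("G.711, 20ms frame", 160)]:
    print(f"{name}: {efficiency(payload):.0%} of bits are voice")
```

With a 32-byte frame, well under half the transmitted bits are voice, versus a 2G/3G native bearer that carries the codec frame with almost no per-frame addressing.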
" optimized for the massively varied use cases of the next mobile generation, for cloud services, and for virtualization and software-defined networking (SDN)."
Only wireless is different from point of view of TCP/IP stack and protocol compared to the others.
"Cloud Services" = Remotely connected servers, nothing new here.
"Virtualization" is irrelevant.
" software-defined networking (SDN)" isn't really about TCP/IP at all. It's infrastructure management.
Radio, unlike cable*, fibre and in house networking, suffers from variable packet loss (fixed cable/optical links have almost none), which can be high. Wireless has unpredictable variable latency and speed too. The other technologies at the link level are pretty much fixed latency and speed. So TCP/IP is really poor for wireless, especially outdoor wireless like mobile. The ultimate "wireless" is satellite. It doesn't bother with TCP/IP at all over the link. Best to imagine each end of the two way link as a pair of proxy servers with a special protocol between them. That's why a VPN is garbage performance on satellite unless the satellite ISP has your VPN endpoint in your modem and recreates the VPN at their ground station.
This is why 4G stuck with TCP/IP, though it's really a poor solution for mobile: without being more clever than the satellite modem folks, lots would be broken.
So this is really ONLY about mobile, and how to carry something other than TCP/IP over the wireless link transparently to all existing traffic. Look at IPv6: is the end user or average business going to change to an alternative to TCP/IP?
Anyone today can do their own design of TCP/IP implementation on their OS (virtualised, cloudy or not), as long as it meets the spec at the external network port!
[* DSL, VDSL etc and other schemes over Cat 3 (phone wires) are more like radio than cable or fibre. Unlike actual outdoor mobile, the latency and link speed are constant for a given pair of wires, but the packet loss / interference issues can be like WiFi or mobile. A powerline Ethernet adaptor or an electric fence can disrupt DSL.]