Re: The old days of Skype
The original selling point of 3G, back when it was only 240 kbps with no HSPA or data plans, was video chat. But it was too expensive, so it sort of died off.
Email, IRC etc for text & files
QQ for voice, video, text, images and files. Though it only seems to work on Windows, Android and Windows Mobile.
It's not fatal, so not Darwin related
Long range and low power, so very slow. Even if the gadgets only do a 1:1000 duty cycle, what if a city of 100,000 has 10 per person?
So how many Things, at what speed? Is it scalable at all beyond traffic lights?
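Rough sums, using the figures above and assuming uncoordinated transmissions, show the problem:

```python
# Back-of-envelope airtime arithmetic for a shared low-power band.
# Figures from the comment above; everything else is assumption.
people = 100_000
devices_per_person = 10
duty_cycle = 1 / 1000          # each gadget transmits 0.1% of the time

devices = people * devices_per_person
concurrent = devices * duty_cycle   # expected devices transmitting at once
print(f"{devices:,} devices -> ~{concurrent:,.0f} on air at any instant")
# 1,000,000 devices -> ~1,000 on air at once, all contending for a few
# narrow channels: collisions, not raw bit rate, are what kill it.
```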
"On 10 February 2011, Dialog semiconductor announced that it had completed a transaction to acquire SiTel Semiconductor"
They indeed are not very British
Dialog Semiconductor originated from the European operations of International Microelectronic Products, Inc. – a Silicon Valley company founded in 1981. Then in 1989 Dialog Semiconductor was formed as part of Messerschmitt-Bölkow-Blohm (part of Daimler-Benz).
According to Wikipedia, which doesn't entirely make sense.
September 2009 – Relocates to new Swindon facility (from Germany?), hence the "UK" connection.
December 2012 – New Design Centers in Livorno, Italy and Istanbul, Turkey
July 2013 – Acquires iWatt Inc.
They seem to have an Edinburgh centre since 2007.
I guess the Arduino folks will be nervous for a while.
Content, Content, Content. Lots of it, relevant, of good quality, and more each day.
Otherwise folk are just annoyed when they land.
I'm increasingly of the opinion that most SEO is snake oil. Certainly most spam selling it is.
I use QQ now on Windows; I'm more comfortable being spied on by China than by the NSA and GCHQ, as I'm not Chinese. Sadly it doesn't work on Linux (ironically, Skype works better on Linux than on Windows!). There is always email and IRC if it's just text chat rather than voice or video. QQ video has better features too.
Worstall does sort of believe in a minimum wage.
But not one that will interfere (much) with the market, so the difference between that and a "living" wage has to be made up by welfare payments. Welfare payments ultimately have to come from taxation; any other governmental income will be minuscule.
You need means testing.
Obviously this sort of scheme, using the same means testing, ultimately also applies to the actually unemployed, whether traditionally so or due to a Robot Apocalypse. While most people made unemployed by robotics/automation (or their children) find other activities, perhaps the underlying trend will be upwards.
Royalty income from Copyright for people deciding to be creative (IMO less likely to be replaced by AI than most AI enthusiasts think) is important. So the push by Google and friends to have only their Intellectual Property count has to be squashed.
It's important to see the big picture.
EDIT: PS: to Tim Worstall
Keep doing these good thoughtful articles. It didn't seem long at all. Even if we don't all agree with all of it. What's the point anyway of posting stuff everyone agrees with, that would be stuff like "rain is wet"?
I like your engaging comments in the comments.
Will Apple Corp sue them a third time?
I'm glad it's not just me, Mr Dabbs.
Tesco's F&F brand now has its labels printed with a helpful scissors icon and dotted line. Though nothing happens when you touch or tap it.
I see there is even an icon of someone removing a label.
Unfortunately there's still far too much stuff that has prior art, is obvious to an expert in the field, is too broad, or otherwise totally lacks merit. The USPTO should get paid x2 for rejected applications; then what got granted would be proper patents.
At least this patent is "mostly harmless" even if totally pointless, or is the real target something else?
Bah. Any £1 / $2 / €1.50 analogue-face quartz alarm clock would take about 2 min to wire the alarm contacts to a suitable battery and detonator instead of the buzzer. The contacts are mechanical; a detonator needs more than the 1.5V of the clock battery. An LCD digital alarm clock is about x4 the price. LED clocks take too much power; all the ones I know run off the mains and count the 50Hz or 60Hz. There is usually a low-accuracy RC oscillator and a 9V battery, but in that state the display is off and the alarm doesn't work. They start at about x5 the price of an analogue-face quartz alarm, or x10 if there is a radio.
I can't see why you'd use an Apple Watch. If you want remote operation you just use a cheap phone with an anonymous SIM and a simple one-transistor adaptor to fire the detonator from the signal on the ringer's piezo speaker. Make sure you untick "receive marketing texts / calls" when buying the SIM, or it will go off early.
Even 10GHz is really only line of sight. 60GHz will really be limited to ceiling-mounted units in open-plan offices. Like LED networking.
Turn off DAB and use it for FM Radio.
76–90 MHz is used in Japan.
The OIRT band in Eastern Europe is from 65.8 to 74.0 MHz.
Some countries are considering extending FM below 87.5MHz.
Existing sets (unlike with DAB) can be converted using a less-than-£5 adaptor.
Maybe add part of 175–220 MHz too (where DAB lives).
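For what it's worth, the sub-£5 adaptor mentioned above only has to do a frequency shift (a mixer plus local oscillator). A minimal sketch of the band arithmetic; the LO values are my own illustrative picks, not from any real product:

```python
# Block up-converter arithmetic: output band = input band + LO frequency.
# LO values below are illustrative assumptions, not real product specs.
def shifted(band_mhz, lo_mhz):
    low, high = band_mhz
    return (low + lo_mhz, high + lo_mhz)

oirt = (65.8, 74.0)    # Eastern European OIRT band, MHz
japan = (76.0, 90.0)   # Japanese FM band, MHz

print("OIRT  ->", shifted(oirt, 21.7))   # (87.5, 95.7) MHz
print("Japan ->", shifted(japan, 11.5))  # (87.5, 101.5) MHz
# Both land inside the 87.5-108 MHz range an existing set can already
# tune, which is why a cheap converter works where DAB needs a new radio.
```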
Been demonstrated some while ago at 1Gbps. Of course that was Line Of Sight only.
This is purely developmental stuff; no research or invention involved. 1 kbps comms over LEDs has been done since the 1970s, though not with TCP/IP until the late 1990s.
To actually control LED lamps, without Ethernet over the power wiring, surely Bluetooth, WiFi, IrDA or even an RC5 IR remote would be better.
I've implemented encryption on a carrier compatible with RC5 IR receivers, using Manchester encoding over OOK at 38 kHz. That's all you need for a light controller.
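To illustrate the Manchester-over-OOK part (just the line coding; the encryption layer and the 38 kHz carrier keying are out of scope, and the bit convention here is one common choice, not necessarily the one I used):

```python
# Manchester encoding over OOK: each data bit becomes two half-bit
# cells with a guaranteed mid-bit transition, so the receiver can
# recover the clock. 1 = carrier on, 0 = carrier off, per half-bit.
def manchester_encode(bits):
    cells = []
    for b in bits:
        cells += [1, 0] if b else [0, 1]   # 1 -> on/off, 0 -> off/on
    return cells

def manchester_decode(cells):
    return [1 if pair == (1, 0) else 0
            for pair in zip(cells[::2], cells[1::2])]

payload = [1, 0, 1, 1, 0]
assert manchester_decode(manchester_encode(payload)) == payload
```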
No, it's the Popular Judean People's Front today.
Really? There is still such a thing?
Well ANY software can be compromised by 3rd party addons. Like Programs on any OS. So they have a point.
I boot three laptops once a day. Even a 13-year-old laptop boots in 45 seconds from a mechanical HDD.
I never use sleep or hibernate.
New batteries can be bought and I only see ARM tablets with significantly longer battery life. Four hours versus 2.5 hours isn't a reason to upgrade. 8 hours would be.
I don't want a new MS OS unless it's a re-imagined NT 3.51 with a decent desktop, or maybe a new version of XP. MS can stick their too-mobile, too-touchy, too-privacy-breaking, too-broadband-dependent, cloud- and rental-orientated W10 junk. I thought they couldn't get worse after the Office Ribbon. Then they did Vista. Win 10 is a rushed attempt to fix Win 8, unlike Win 7 (which ought to have been free to Vista owners), which was a more considered patch of Vista.
Migrating to Linux Mint using MATE and WINE. Bye bye MS. Intel is trapped. The Itanium was a fail; x64 was really AMD's idea. Instead of using their ARM licence they STILL fiddle with "low power" x86-64 for an OS no-one wants anymore. They could do a brilliant ARM SoC for phones, tablets and entry-level laptops, and do some decent chunky workstation laptops for legacy x86-64, if MS bothered to do a sensible version of Windows. Of course there are still servers and Mac OS (for now), and there will always be Linux for those that want it on a workstation.
This launch lacks vision and focus.
Actually maybe two thirds of the loss is due to losing the patent case over "silicon for a declining device", at about 50c a chip.
Apricot, Apple and others.
I forget when Amiga and AtariST came out.
IBM was late to the 3.5" party. The sole reason to use the 3" was cost. I heard the prototype 3" drives used cassette tape heads and were developed in Eastern Europe, which always seemed unlikely.
The only differences were the formats used by the OS. The actual 8", 5.25", 3.5" and 3" drives all used the same control bus; only a dumb cable adaptor was needed. A major exception was the Apple II's 100K flip-over 5.25" floppy; I'm thinking it had different, cost-reduced electronics.
The 3" vs 3.5" v 5.25" is bonkers. They are interchangeable. Actually even the 8" drives with work same controller with a dumb interface cable.
The reason for the 3" drive was it was cheap.
I swapped a 3.5" 720K drive in as drive B and fitted the 3" drive as drive B in an IBM AT. Just "dumb" cable adaptors. I used a DOS program called "Nice 22" to read/write 3" or 3.5" CP/M discs on the PC. I used a WordStar clone (NewWord, from NewStar?) and the Cracker spreadsheet on PCW CP/M and later on PC DOS.
I had Modula-2, Pascal, Forth, Prolog on it.
Also a bizarre bitmapped DTP package.
A box with an RTC, a parallel port and 2 x serial. The Internet didn't really exist yet, but Prestel and X.25 worked on it. I accessed an X.25 PAD via dialup and sent/received Telex and email (with "Bitnet Users" via BT Gold), circa 1986. I had a mouse and a one-pixel scanner that clipped onto the dot-matrix printhead.
Websites came AFTER the Internet, and not until about 1992?
It's in the attic someplace.
"Only after Apple showed the way on how to create a usable interface on a mobile phone (because smartphones we already had) did the smartphone take off,"
Actually yes, it was better than Symbian S60. But the success was due to having the first decent data plan and subsidised sales on mobile contracts. The GUI was bought in from FingerWorks, and other people you've mostly never heard of had good phone GUIs before Apple. They didn't innovate anything to do with phones.
They will not sell many* of these, but more than the Surface Pro sells, and at a massive profit margin.
I wonder how the iFans will store and edit video on it?
[* I might be wrong, perhaps these will outsell Macs.]
How many years ago did I say iOS would eventually replace OS X?
*IF* this is a success you can wave bye-bye to the regular skinny Mac, or Macs generally. Apple have no loyalty to customers. They have more control and income after the sale with an iOS gadget than with an OS X Mac.
Apple is less locked into backward compatibility than Microsoft (the Mac has been 68000, PowerPC, x86-64, and the OS X series is quite different to the earlier OS 9 etc.). Apple users are just supposed to buy the new version of the software.
No, you can't. The so-called "Apple TV" isn't a TV. It plugs into a TV or monitor. It doesn't even have a tuner, so it's just an Internet video-streaming box.
Siri isn't going to adjust anything on my TV.
See title and later post!
Great Universal Stores is the same company. Argos & Homebase were originally part of a mail-order catalogue company.
I've seen one GUS catalogue dated 1932
They'd be mad to do so seeing as they make no money from it, apart from the mess their GUI strategy has got into for their core profitable markets.
Amazon acquired Mobipocket and closed it (that's the format Kindle is based on).
Sony eBook readers (the PRS series) are gone.
Even if a company doesn't go bust, it might simply "lose interest". But there are a load of other reasons why I'll only use cloudy docs to collaborate in real time, then scrap the cloud copy after download. A nice peer-to-peer encrypted collaboration tool would be good.
I thought it was about Carlsberg.
If Carlsberg was in the fossil business ...
I don't mind if they can be reformatted. More use than malicious emails.
Proof the human race isn't descended from a common ancestor with the bonobo, but from Golgafrinchans or something.
Then US products are dead.
But not one is real A.I. They are just specialist databases with specialist interfaces.
No, more like 70 years ago. But so far the only progress has been in redefining the jargon to make it look like progress. We have better databases and ways to interrogate the data. It's not real A.I., and no-one is actually researching real A.I., partly because over the last 70 years we have realised that we don't know what biological intelligence is, and partly because we have a better idea of how to write programs.
Computer "Neural Networks", AI, Cognitive Computing, Machine Learning, algorithmic "evolution" etc have almost zero connection to the similarly named things in biology. It's Humpty Dumpty jargon.
Google Translate abandoned 30 years of effort on natural-language work to use the "Rosetta Stone" approach: a big database of known, matched human translations. Zero AI in it.
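To make the "Rosetta Stone" point concrete, here's a toy sketch of the idea: retrieval over human-translated pairs, with nothing resembling understanding anywhere in it. The phrase table is invented for illustration:

```python
# Toy phrase-table "translation": statistical MT at its crudest.
# Real systems score millions of aligned human translations; the
# point is that the core operation is lookup, not comprehension.
phrase_table = {
    ("rain", "is", "wet"): ("la", "pluie", "est", "mouillée"),
    ("good", "morning"): ("bonjour",),
}

def translate(words):
    out, i = [], 0
    while i < len(words):
        for j in range(len(words), i, -1):      # longest match first
            if tuple(words[i:j]) in phrase_table:
                out += phrase_table[tuple(words[i:j])]
                i = j
                break
        else:
            out.append(words[i])  # unknown words pass through untranslated
            i += 1
    return out

print(translate(["good", "morning"]))             # ['bonjour']
print(translate(["rain", "is", "wet", "today"]))  # last word untranslated
```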
I wonder if in a few years any Apple gear will run OS X. Unless of course MS alienate their x86-64 Windows users so much that all the business application vendors port to OS X.
I'd have expected at least x10 more dodgy authors scamming businesses etc.
WiFi isn't private.
It's irresponsible to host insecure WiFi.
The silly Nintendo DS and some other stuff with no updates only uses WEP, or no security at all. So no, we haven't allowed the DS on WiFi for years.
But running Debian. Not evil Win 10 (Win XP or Win 7 would have been OK), paywalled iOS, or Google spyware Android/Chrome.
If it's Windows on ARM, it's nearly pointless. Edit: It's Snapdragon, so fine for Debian, Android etc. and POINTLESS for Windows, as it can't replace an x86 laptop.
I can hook my Sony Z1 Android phone to a USB hub, then a USB mouse + keyboard, and plug it into an HDMI monitor. But it's a kludge. A Win 10 ARM device with a nice dock won't run x86-64 code etc., so it's just a gimmick if you need a real x86-64 Windows workstation.
Terrestrial TV has given up too much spectrum already.
Greed by regulators thinking of increased licence fees.
There has been a huge increase in allocation to Mobile Spectrum above 900MHz too.
They are not building enough base stations (more base stations add capacity via frequency re-use, and up to x20 higher speed, because halving the distance gives roughly x4 the speed). Spectrum is being used inefficiently. If there were a single physical provider and ALL mobile networks were virtual, that would more than double average speed/capacity overnight. The regulators/governments oppose this on spurious competition grounds, but the real reason is that their licence income would be 1/4. Licences should be performance-based, on meeting KPIs, with revenue then coming from use, not up-front sales to the highest bidders after chopping up the spectrum. Some people "sit" on spectrum for years.
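The "half distance is about x4 speed" figure can be sanity-checked with Shannon capacity under a typical urban path-loss exponent; the numbers below are illustrative assumptions, not measurements:

```python
import math

# Sanity check: halve the cell radius, what happens to edge speed?
path_loss_exp = 3.5      # typical urban path-loss exponent (assumed)
edge_snr = 1.0           # 0 dB SNR at the old cell edge (assumed)

snr_gain = 2 ** path_loss_exp            # half distance -> ~11x signal
c_old = math.log2(1 + edge_snr)
c_new = math.log2(1 + edge_snr * snr_gain)
print(f"capacity ratio ~ x{c_new / c_old:.1f}")   # ~x3.6, call it x4
# Halving the radius also roughly quadruples the number of cells, so
# the same spectrum gets re-used 4x as often: capacity stacks up fast,
# which is why more base stations beat more spectrum.
```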
The regulators want to kill terrestrial TV and have only cable & satellite broadcast, or mobile operators selling a broadcast version of LTE. Certainly Ofcom and ComReg have already decided this is policy, without any consultation.
Which is not just an EU thing. Countries as diverse as Switzerland, Germany, China and Russia are concerned by Google, Facebook and now Windows 10 / Microsoft.
I trust it's properly labelled in French too.
It's not a scam unless it's not really air from the Rockies.
The design is meant to be as open source (SW & HW) as feasible. It's basically a now-obsolete ARM phone chip on a breakout board. Any complete non-trivial computer design, even if open source, is going to have some (or mostly) proprietary chips; probably the USB/Ethernet chip is proprietary. Being fully documented, so you can write drivers from scratch, is the real issue. Almost all chips are proprietary.
No, the FPGA is for prototyping. The same FPGA (or better, an ASIC) programmed specifically for mining is going to be better than this GPU implemented on an FPGA.
An FPGA is programmable HW, a standard part anyone can buy for prototypes or low volume (a custom chip needs 10K to 1M pieces, and you check the design works by doing an FPGA version first; an ASIC can cost 100K to 1M for NRE, or even more with multiple layers, a large die, small geometry etc.).
FPGAs are very power hungry compared to ASICs, but they are real HW. As the Verilog can be compiled to an ASIC design instead, it now just needs the IP and money sorted to have a user chip, a chip for a mobo or card.
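A crude break-even sketch of those FPGA-vs-ASIC economics; all prices below are invented for illustration:

```python
# Break-even volume for ASIC vs FPGA. All figures are illustrative
# guesses; real NRE and unit prices vary enormously with process node.
fpga_unit = 80.0        # $ per FPGA, no NRE needed
asic_nre = 500_000.0    # $ one-off mask/engineering (NRE) cost
asic_unit = 5.0         # $ per ASIC once the masks exist

# ASIC wins once: asic_nre + n * asic_unit < n * fpga_unit
break_even = asic_nre / (fpga_unit - asic_unit)
print(f"ASIC is cheaper beyond ~{break_even:,.0f} units")  # ~6,667
# Below that volume you eat the FPGA's unit cost and power draw;
# above it, the ASIC's NRE amortises away, hence "10K to 1M pieces".
```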
This is quite exciting. So anyone with a suitable FPGA dev kit and the knowledge to use such can test out and adapt this?
Considering how hard it is for legitimate companies to get paid ...
That explains Google's lackadaisical approach.
I smell collaboration.
Bitcoin isn't a currency. It's a speculation medium.
The more complex it is, the harder it is to maintain. So-called A.I. is exceptionally fragile compared to a lathe controller, a set-top-box GUI or an accounts program, because it's impossible to predict all eventual inputs and situations. Unlike a GUI, an autonomous car can't ignore input until it gets "valid" input.
Autonomous vehicles need their own dedicated pedestrian and cyclist free road ways, the equivalent of railways without tracks.
I'd not call the sound output transducers in almost all current TVs "speakers"; they're glorified headset drivers. Most "sound bars" have worse audio than any 1950-to-1990 TV, also due to having driver units that are too small and a case that's too skinny and too plastic.
The newer TVs also no longer have enough I/O ports. Some only have one, many only two. Four ought to be a minimum, ideally six, plus 2 x SCART (supporting RGB), FireWire, Y/C (S-Video) and VGA. Otherwise how do you view your older equipment on the new TV?