Chopping up spectrum may create different suppliers, but does that really create competition when there are so few?
A single spectrum gives the best efficiency and performance for users. There should be a single wholesale infrastructure reselling a simple pipe to MVNOs.
Beatles' Apple Corps. Steve and Steve DELIBERATELY chose Apple because of the Beatles!
NEC or Fujitsu iPad
How many other names?
Most Apple patents are either not really original or novel enough, or copies of prior art.
Refused apps on store and then Apple copies
Apple deserve to lose.
In 1987 I argued that not supporting concurrency was a flaw, and that the language had learnt nothing from Modula-2 about simple interfaces for building large systems. Modula-2 and Occam are both older and both have concurrency support as part of the language.
Since the 1970s we have known that a new language would need concurrency, multi-CPU support etc.
Also, large C++ projects still depend too much on tools like Make. You should not need a separate file in a scripting language to build a large project.
Strings and buffer over-runs: the source of most security vulnerabilities and bugs.
Again, by the late 1970s it was known that the default should be array bounds checking. At the outset of C++ in the late 1980s the shortcomings of strings in C were known, and half-baked string classes were produced. It should have been a bit like a VB-style BSTR: not just null-terminated, but with the size stored as well if dynamic; if static, the compiler knows the size. Then "count" to return the maximum number of array elements (string or otherwise) and "length" for the size in use (or the characters in the string before the null terminator).
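A minimal sketch of that idea, purely illustrative (the names count/length are just the ones suggested above, not from any real library):

```
#include <cstddef>
#include <cstring>

// Illustrative only: a string that stores its capacity and its in-use size,
// rather than relying on the null terminator alone (roughly the BSTR idea).
class SizedString {
    char*       data_;     // buffer, still null-terminated for C interop
    std::size_t capacity_; // maximum number of characters the buffer can hold
    std::size_t length_;   // characters actually in use (before the '\0')
public:
    explicit SizedString(std::size_t capacity)
        : data_(new char[capacity + 1]), capacity_(capacity), length_(0)
    { data_[0] = '\0'; }

    ~SizedString() { delete[] data_; }
    SizedString(const SizedString&) = delete;            // keep the sketch short:
    SizedString& operator=(const SizedString&) = delete; // no copying

    std::size_t count()  const { return capacity_; } // max elements, as argued above
    std::size_t length() const { return length_; }   // characters before the terminator

    // Bounds-checked append: truncates instead of overrunning the buffer.
    void append(const char* s) {
        std::size_t n = std::strlen(s);
        if (n > capacity_ - length_) n = capacity_ - length_;
        std::memcpy(data_ + length_, s, n);
        length_ += n;
        data_[length_] = '\0';
    }

    const char* c_str() const { return data_; }
};
```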
C++ was born deformed by having to be as backward-compatible as possible with C, which after all wasn't really developed as an HLL at all but as a portable macro assembler to port UNIX and UNIX applications. Any form of template, macro or define has no place in a safe language with human-readable programs.
Obviously any natively compiled language with a decent compiler is going to beat "p-code on virtual machines" (VB6, .Net, Java) or scripting languages. So it's no surprise C++ wins that competition.
Stratus approach is so Pre 1990s.
First we had VAX Cluster.
Then in 1999, I think, a cluster with 2 x NT servers sharing at least two external storage shelves on 2 x separate SCSI buses (3 gave better performance).
Kill power randomly on one server or one storage shelf and there is no interruption of service. A cheap commodity server with a single PSU is fine.
Obviously each shelf and server needs its own PSU and separate UPS. But no expensive on-board redundancy at the server or shelf level.
Turn it off?
Umm. You only need well-defined states to build a digital computer, not an actual "off". Though power consumption might be a problem.
An analogue computer is a model of a system. It uses analogue summing, log, multiply, square and square-root circuits. Multiply by a constant is an amplifier; divide by a constant is a pair of resistors. The number resolution is poor, between 1% and 0.1% of full scale, thus roughly 7 bits to 10 bits. The wiring decides the "model", i.e. the problem to be solved. So an analogue computer is not the answer for transistors that don't switch off so well; it's not a computer in the same sense at all, as it has to be physically wired for a particular task. It's also inherently a "data flow" processing device, with parallelism and serial pipelines depending on the process to be modelled.
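Rough arithmetic behind the 7-to-10-bit figure (just log2 of the precision; the numbers here are my own back-of-envelope check, nothing more):

```
#include <cmath>
#include <cstdio>

// Equivalent digital resolution of an analogue value accurate to a given
// fraction of full scale: bits = log2(full_scale / error).
int main() {
    const double precisions[] = {0.01, 0.001};   // 1% and 0.1% of full scale
    for (double p : precisions)
        std::printf("%.1f%% of full scale ~ %.1f bits\n",
                    p * 100.0, std::log2(1.0 / p));
    // Prints roughly 6.6 bits for 1% and 10.0 bits for 0.1%.
}
```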
There have been digital logic gates made with devices that don't fully turn on.
It's very power-hungry, but it was much faster than the original TTL.
A few late-1950s computers used it.
By the 1980s the ECL type of design was mostly just used for dividers at UHF or microwave frequencies.
"The latest version of Microsoft's browser, released this year, does not – and will never – run on Windows XP. Microsoft is trying to steer people towards Windows 7. And rightfully so."
Win7 is a fix of Vista.
Vista and Win7 are mainly about eye candy. They offer no significant advantage to XP users; Win7 appears to be a marketing effort to compete with Mac OS.
MS has lost the plot. Many high-end computers even up to 9 years old are powerful enough to run Win7, apart from the graphics. The graphics architecture is stupidity. DirectX was bad, but at least it was really only there to make migrating DOS-like games easy (rather than OpenGL or Win GDI based). Direct3D is worse.
No-one needs IE 9, so that's not important.
Apple don't care about the 40%+ of people running XP because they think people should run OS X (which is priced cheap because it's only licensed for overpriced and locked Apple appliances).
What sort of semiconductors are they counting?
Which companies did they check?
The ideal PVR records EVERY channel for rolling 7 days... Browse entire EPG from now till a week ago and watch or "save".
About 0.3 TByte per channel for a rolling week (rough sum below); more for HD.
1 TByte and larger drives are common now in PVRs.
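A quick sanity check of the 0.3 TByte figure, assuming a typical SD channel bitrate of around 4 Mbit/s (the bitrate is my assumption):

```
#include <cstdio>

// Back-of-envelope storage for one channel recorded continuously for 7 days.
int main() {
    const double bitrate_mbps = 4.0;              // assumed SD channel, Mbit/s
    const double seconds      = 7.0 * 24 * 3600;  // one rolling week
    const double bytes        = bitrate_mbps * 1e6 / 8.0 * seconds;
    std::printf("%.2f TByte per channel per week\n", bytes / 1e12);
    // ~0.30 TByte; an HD channel at ~10 Mbit/s would be roughly 0.75 TByte.
}
```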
IPTV? Only as a pay-TV service from your ISP. Real HD in real time over the public Internet isn't feasible; it costs about 10,000 times more than terrestrial broadcast.
FTA TV at broadcast HD quality will only be broadcast. Until there is true broadcast IP (which by definition can't be VOD) on fibre over the public Internet, real TV on "broadband" will come only from your ISP, and cost a lot more than FTA satellite or DTT.
iCloud and Google Docs
So called Cloud computing is the mainframe and terminal by another name.
People who need a PC or laptop to do the kind of work they have always done, the work that isn't media consumption, email and web browsing, need to store specialist applications and data.
The "Cloudy" folk don't want your data. They want to be your application provider and then the Cloud Admins really own your data.
Back to the Central control of Mainframe era where someone else decides what applications you can have.
Also back to 1980s network speeds. Most people's cloud uplink speed is less than 0.5Mbps. Pretty much like a single shared 10Mbps cheaper-net / coax hub-based network in a medium office with no switches, only hubs. Except with worse reliability and latency.
Get the facts right.
"Freesat HD" gear that can't do DVB-S2 isn't compliant. over four years ago I bought DVB-S2 sat cards for HD, because ALL HD was going to be MPEG4 H.264 L4 DVB-S2.
"Freesat" (non-HD) gear can't do HD no matter if it's on DVB-S or DVB-S2.
Unscrupulous dealers, makers and wholesalers are the problem, not the BBC.
C4 HD was already DVB-S2.
Many sites are producing apps for iPhone and Android. It's WAP and walled gardens with fancier technology, and more lock-in to the vendor instead of the carrier.
These things have web browsers... So do many gadgets outside the iTunes and Android marketplaces.
Publishers: code for web browsers, maybe even with a user menu for screen size. Don't rely on user agents.
A plague on sites with:
No usability at all without Flash (which can be bad even on Flash-enabled kit, or people may not be able to install the "latest" Flash) or Silverlight.
Any requirement to use Java, except for special applications.
Any requirement to use .Net or Active X at all.
Fixed screen width; if you must have it, then no more than 720 pixels wide.
Can we stop
Why call these LED TVs?
They are LCD TVs with LED backlights. There are real LED HD displays (usually 10x bigger) and real OLED displays (usually 1/10th the size).
Why call these 3D TVs?
They are stereoscopic TVs. Nothing 3D about them. There are (small and expensive) non-holographic yet true 3D displays.
People manage, not applications.
However if someone is clueless they COULD put it all in Excel. Trivial for "someone experienced in databases" to export it and set up a real database before their 2nd coffee break*
(* Obviously they have their 1st break while waiting for the PC to boot and then browsing El Reg before getting down to less important stuff)
Freesat isn't Pay TV. So why are most boxes so awkward for non-Freesat?
How many button presses to go to Sky News (or whatever non-Freesat channel) and then go back to Freesat mode and get BBC 1 HD?
Why can't you have a favourites list with Freesat and non-Freesat?
Does MHEG5 work on non-Freesat channels? (I know there are none in the UK, but there is one in Germany and there will be some in Ireland shortly: http://www.saortv.info/satellite-saorsat/saorsat-reception/ )
Has it got DiSEqC?
Does 28E have to be on port one of a multiswitch distribution or a 4-way DiSEqC switch?
"Controlling what PC makers can and can't put in their machines that run Windows goes against the entire ethos of the Windows ecosystem, and makes Microsoft more like Apple."
This is important to start with.
Oddly it gives MS an advantage over Apple (because there are nearly 10x more devices, software and users).
Linux is as open as it gets, yet on x86 FEWER peripherals and variations are *WELL* supported than on Windows XP. Of course more specialised Linux/GNU distributions support MIPS (no longer supported on Windows) and ARM (so far only on CE-based Windows, i.e. Phone 7, and not on NT-based Windows until Win8).
They are right to do this for first release and then add more HALs and Drivers later.
Quote "That isn't the case any more, it is just hyper critical of everything and I don't undertstand why."
We need it.
Because frankly all the big companies are arrogant, only concerned with Image.
Glossy screens, animated eye candy and beautiful multimedia all look lovely in the showroom. But they are more interested in that and 1st impressions than in improving the "real" user experience, reliability, productivity, or reducing the price.
Form is replacing function.
Then at the other extreme we have Nokia, who at the time of the N9100 brick phone with Symbian still had a good future, and then threw it all away: they made too many conflicting GUI choices, did nothing to make Symbian development easier, and along the way destroyed Trolltech and Maemo.
There were Linux smartphones before Android, but not much investment, because of RIM, Palm, Windows and, beating all the rest at #1, Symbian.
There is a brand new DELL Win7 laptop with a super-shiny screen. It's totally inferior in speed and usability to XP on a NINE-year-old Dell Inspiron with a 1600x1200 UltraSharp screen.
1366x768 @ 60p may look cool for USA BD and DVD; not so good for 25i or 50p DVD or BD. You can't get a laptop with a 50Hz screen.
768 lines is totally inferior to 1200 lines for documents or web pages.
Even 1920x1080 is no use, as it makes the laptop screen too wide compared to 1600x1200, and 1080 is just slightly too short for A4 PDFs and Word documents.
We have LOADS of cool gadgets and toys. Technomage stuff.
But how much stuff actually works well to do your job?
* Keep up the good work El Reg *
Though many HW reviews are not critical enough.
Quote: "It's good we're moving into an age where user interfaces are moving beyond the ideas from the early-1980s at last."
Actually the idea of WIMP is from the early 1970s, first implemented in the mid 1970s.
It's still pretty much the best interface for large screen, keyboard entry and CAD / Graphics work.
The styles and the ability to have contextual, floating menus and such are minor tweaks. The original idea was a pen (on the desk, not the screen), but they couldn't miniaturise it, so they turned a trackball upside down to make a "mouse". Light pens, working from the light emitted by the CRT and calculating position from the time within the line and frame scan, pre-existed the trackball and mouse. But working all day with your arm raised to a large screen to gesture, select and draw is very tiring compared to a desk surface.
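For anyone curious how a light pen got a position from timing alone, here's a rough sketch using PAL-ish line timings; the numbers and resolution are illustrative assumptions, not from any particular machine:

```
#include <cstdio>

// A light pen just notes the instant the electron beam passes under it.
// Given the time since the start of the frame, the line and horizontal
// position fall straight out of the scan timings.
int main() {
    const double line_period_us  = 64.0;  // PAL-ish: 64 us per line
    const double blanking_us     = 12.0;  // ~12 us of the line is blanking
    const double active_line_us  = 52.0;  // visible part of each line
    const int    visible_columns = 640;   // assumed display resolution

    double t_us = 12345.0;  // example: pen fired 12.345 ms into the frame

    int    line    = static_cast<int>(t_us / line_period_us);
    double in_line = t_us - line * line_period_us;
    int    column  = static_cast<int>((in_line - blanking_us)
                                      / active_line_us * visible_columns);

    std::printf("beam was on line %d, column %d\n", line, column);
    // Blanking intervals and interlace are ignored here for brevity.
}
```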
Perhaps a matt surface tablet that's A3 or A2 size would allow a true "desktop" metaphor, but current tablets are too small.
There has been work on "surface" and Table computing with idea that the surface recognises objects and is even big enough to set a real wireless keyboard and mouse (for those workers that need them).
It's a masochist that would write a book or manual on a touch screen rather than tactile keyboard.
Tablets and phones need new ideas. Actual big screen + keyboard information creation tools just need to be quick, reliable and affordable. Without distracting animated eye candy.
One touch doesn't rule them all.
Different form factors need different GUIs.
With the original Win CE (which Win Phone is actually based on), MS made the mistake of putting the "classic" WIMP GUI menus on small-screen devices.
Are they now going to make the mistake of putting the small touch-panel Zune/WinPh GUI on tablets (which need a different GUI, a combination of thumbs at each side and WIMP + gesture touch) and on netbooks/PCs/laptops, which are still best with a "WIMP"-type interface with keyboard shortcuts?
You could argue that about 4 or 5 different "User Interfaces" are needed
TVs etc with Remote controls
Servers (no candy and optimised for remote desktop and consoles)
Small gadgets with media navigation pad and small screen (players, phones, cameras, games)
Smartphones: variants that are purely touch-centric and ones with more real buttons. Must use a keyboard and pointing device if attached.
Tablets: 4" to 7" and 7" to 12". Maybe 3 variations according to size and resolution. A 4.3" 800x480 is very different to an 11" 1920x1080.
Dedicated readers with eInk type displays: Don't suit fancy Tablet GUIs
Small netbooks: 6", 7".
10" to 19" Netbooks/Laptops/PCs/Notebooks
If you are mainly creating content rather than browsing and consuming it a touch screen is very very tiring.
Each class of device has its own advantages. You are going to fail imposing the same look & feel and paradigms on them all.
Why does it sound like something out of a Terry Pratchett novel?
Or at least what an 18-rated Discworld novel might be like.
The regulator is the best, and only, body qualified to run the databases.
The FCC and Ofcom are abdicating their responsibility. They issue licences and allocate spectrum. They should ALREADY have had the needed databases since the 1970s, or WHAT ON EARTH are they doing all day?
Credit the junior as lead
If you think the research is very dubious?
In 5MHz, LTE delivers barely more than 3G/HSPA+.
"it maxes out at a trifling 100Mb/s down and 50Mb/s upstream. If anything, it's 3.9G."
Actually that is only if the signal is perfect, no-one else is using the mast and you have a 20MHz channel.
Real-world average throughput is very much less, maybe 4Mbps to 12Mbps, as the folk at the cell edge only get 600kbps. Then divide by the number of simultaneous users.
So a 3Mbps DSL line that actually delivers 3Mbps can beat 100Mbps LTE.
HSPA & LTE are close to the Shannon boundary, so the only way to get more speed, i.e. 1Gbps peak, is a 200MHz channel.
Your loaded user speed could easily be 10Mbps on a 1Gbps-peak mast, as it's a shared resource.
How many people can get 250Mbps on 250Mbps WiFi at the same time? In a typical setup with 10 users you might get 10Mbps each, or 1Mbps if everyone tries to watch a different YouTube channel. (Rough numbers sketched below.)
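To put some hedged numbers on that: the headline rate is the whole cell's capacity, and the Shannon limit C = B * log2(1 + SNR) is why only a wider channel, not cleverer modulation, buys much more. A sketch where the SNR, cell average and user counts are my own illustrative assumptions:

```
#include <cmath>
#include <cstdio>

// Two back-of-envelope calculations:
// 1) Shannon capacity of a 20 MHz channel at a decent (assumed) SNR.
// 2) What each user actually sees when the cell's throughput is shared.
int main() {
    const double bandwidth_hz = 20e6;   // 20 MHz LTE channel
    const double snr_db       = 15.0;   // assumed good, but not perfect, signal
    const double snr          = std::pow(10.0, snr_db / 10.0);
    const double shannon_bps  = bandwidth_hz * std::log2(1.0 + snr);
    std::printf("Shannon limit at %.0f dB SNR: %.0f Mbit/s\n",
                snr_db, shannon_bps / 1e6);   // ~100 Mbit/s, the headline figure

    const double cell_throughput_mbps = 40.0; // assumed realistic mixed-signal average
    const int    user_counts[] = {1, 4, 10, 40};
    for (int users : user_counts)
        std::printf("%2d simultaneous users -> ~%.1f Mbit/s each\n",
                    users, cell_throughput_mbps / users);
}
```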
Lies, Damned Lies, Statistics
And then Mobile vendor speeds.
Proper testing has been done.
There is no evidence that so-called "electrosensitive" people are any more accurate at deciding whether a transmitter is on or off than a tossed coin.
See also "random walks" for why a tossed coin doesn't come out at exactly 50% and why a large number of trials are needed.
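A hedged illustration of the point: for n fair coin tosses the number of heads wanders around n/2 with a standard deviation of sqrt(n)/2, so a small trial looks "wrong" even when nothing is going on.

```
#include <cmath>
#include <cstdio>

// Expected spread of heads around 50% for a fair coin: sigma = sqrt(n)/2.
// This is why a handful of "correct guesses" from a test subject means nothing,
// and why provocation studies need many trials.
int main() {
    const int trials[] = {10, 100, 1000, 10000};
    for (int n : trials) {
        double sigma = std::sqrt(static_cast<double>(n)) / 2.0;
        std::printf("%5d tosses: expect %d +/- %.1f heads (%.1f%% of n)\n",
                    n, n / 2, sigma, 100.0 * sigma / n);
    }
    // 10 tosses: +/- 1.6 heads (16%); 10000 tosses: +/- 50 heads (0.5%).
}
```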
Are people "electrosensitive"? No.
Are magnets any better than placebo? No.
Do masts have any more effect than nocebo? No.
Does it make any difference if the mast is live or off? No.
What about all the scareware ads they serve too?
They are arrogant and irresponsible.
Subtractive vs Additive
LCD cells are either clear or black, so they don't work as three stacked layers of CYM. Hence they have to have additive dots side by side. Each dot has to entirely block cyan (leaving red), magenta (leaving green) or yellow (leaving blue), wasting at least 2/3rds of the backlight. Real LED displays (the kind that fill a wall) are red, green and blue LED dots. AMOLED are really yellow or UV/violet (with phosphor to get white); most have LCD-style RGB filters.
To use CYM you need a display where each element is coloured or transparent. The mono eInk in the Kindle etc uses tiny balls that are black on one pair of faces and white on the other, so they have to use additive side-by-side RGB filters for colour, which makes them VERY dim.
They must be using a liquid-dye type of display, where each cell is empty (clear) or filled with cyan, yellow or magenta liquid. Parallax is an issue unless the cells are very much thinner than they are wide and high. OTOH each cell can be 3x bigger than with the RGB additive technique, or twice as big for additive RGB arranged as a 2x2 subpixel instead of the older 3x1 RGB stripes.
Most of these things are disguises for Ponzi schemes and people doing naked shorts.
Some people will make a lot. The majority at the end of the day will lose their shirt.
Apple are an example of a company deliberately encouraging speculation as they do not pay a dividend.
The only long term model which is not destructive is investment to get a return from real profits.
Anything else leads to Ponzi schemes, speculation, naked shorts and asset stripping. Look at eircom in Ireland, Anglo Irish Bank or Enron.
Not at the research but that Vulture Central took so long to notice it (or print it). :)
Of course what other icon is possible?
Intel: Forget about Transistors
It's not packing 40 billion transistors rather than 10 billion into a CPU that counts any more. It's packing everything into a cheap, low-power chip that counts.
Intel needs a CPU with 500,000 transistors instead of 1 billion to 40 billion.
Then they have 1/500th of the core's power consumption and space for RAM, Flash, GPU and I/O on the chip.
Moore's law is pointless beyond a certain point for the CPU; it's the RAM, Flash and everything else that matter.
Let's see a graph of how many CPUs shipped, not transistors, compared with ARM, MIPS, Microchip, Atmel, Samsung, Texas etc.
300dpi is fine. Laser printers historically only printed a dot or nothing. Even with only 4 levels of grey you can make 300dpi look as good as 720dpi.
Also, most people don't have sharp enough vision at normal reading distance to see much more than 200dpi if the image is anti-aliased with a few grey levels.
My LCD is about 133dpi and almost no-one has ever noticed the pixels.
I look forward to getting an eReader with 9.7" screen and 300dpi. Great for A4 PDFs.
PDFs reformat badly, so a 9.7" 300dpi screen will be very nice for reading and documents.
It allows a nice wide margin around "paperback" format text rather than text up to edge of screen.
2nd worst idiot Regulators in Europe
Irish Regulators (of any kind, as any fule noes) are worse.
"and assumes that wireless microphones won't go outdoors and that all TV aerials are more than 10 metres from the ground."
There are so many things wrong I don't know where to start.
But then they think so-called "Ethernet" power plugs are OK (in reality wide-band transmitters).
They gave up on protecting licensed users of spectrum ages ago.
Ovi was a disaster
It killed the rather handy (on conventional non-touch smart phone) widgets.
There already was a web based eco-system of symbian Apps.
It killed what was there without replacing it. The only advantage Nokia had over Android and iTunes was complete, if random, openness; Ovi threw that away.
It would be as if MS decided only MSN could deliver anything for Windows.
Useless for HD video
Don't bother adding a USB DTT stick or a PCI-e satellite card.
The Intel GMA950 + Atom can't do MPEG4 AVC H.264 L4.
Also overpriced for netbook hardware (similar spec at half the price?) without a screen.
Kodak Portfolio CDs: an interactive menu using content from Audio CD and Photo CD tracks on one disc (the real Photo CD format, not the JPG Picture CD). (These are not to be confused with putting your portfolio on a CD.)
The Portfolio CD format never took off (I have a Kodak demo) as the tools to author it were far too expensive. Even Photo CD was never popular, due to the high cost of Kodak tools and the closed specification.
A few dedicated CD Audio/Photo CD players and a free Windows program can play Portfolio CDs. Basically menus with slide shows that have a synchronised CD Audio sound track, but quite complex interactivity with a cursor and hot spots. Obviously "Portfolio" is a proprietary Kodak format derived from Red Book (CD Audio) and Beige Book (Photo CD), but I've never heard of a Rainbow Book for it.
Wikipedia knows nothing of Portfolio CDs.
1984 publicly announced in 1986
DVD menus are pretty much based on "portfolio CD" and CDi.
Appropriate Icon for these kind of "IP holders".
Taking the Tablets
Very often MS do have the right idea but fail on the execution.
x86 second Tablet interface
I worked on a non-x86 "Tablet" project in 1989. The reasons everything from then on used a stylus rather than a finger were:
1) The holy grail of handwriting recognition
2) Send handwritten faxes
3) Deemed faster for notes and annotation
4) Xerox only made a mouse because they couldn't fit a tracker ball into a pen.
Capacitive-type touch screens DID exist, but no-one wanted the low resolution. Everyone chose resistive to allow high-resolution input (ideally, to avoid "jiggling", the input resolution should be higher than the screen resolution; with light pens it could only ever equal the display, and they only worked if blacks were a bit grey, well, dark green).
Intel better make ARMs
Because Samsung, Texas, Qualcomm etc will make ARM FinGate FETs... At the moment there is no value in putting FinGate FETs in an ARM SoC, as it adds to the cost.
The other aspect of x86 vs ARM SoC apart from Power and SoC Integration is much lower ARM prices.
LCD 3D glasses hacked and hooked up to Stress sensor (heart rate, sweat, Adrenalin)?
One of the most feasible things on the list: about 100%, not 10%.
Some of the 80%+ items listed are not more than 1% likely.
Intel Mobile Strategy?
They lost that one by
1) Not developing the i960 into a Mobile part
2) Flogging off their ARM stuff
3) Sticking to the recipe of ONLY aiming for more transistors, clock and benchmarks on the CPU
4) Belatedly looking at SoCs. ARM's success is partly that you get a whole system (many different ones) that simply happens to have an ARM as only one part.
x86 is a dead-end architecture that is holding back computing, and has done for 30 years.
They sold most of it to Marvell, but I think they kept a communications processor.
It was mostly the later versions of StrongARM, the PXA series, that they sold. Marvell is doing well with ARM stuff; even some PC mobo Ethernet/SATA combo chips are actually Marvell ARM.
1) Too expensive. A USB DTT stick is from £10
2) Since it connects only via WiFi, why is it only for iOS?
3) Most locations need an external aerial. The telescopic aerial should either be on a connector, or there should be a separate connector.
4) Streaming video over WiFi is very demanding. A dongle directly on the iThing, with the option of an external aerial, would work FAR better, give far better battery life and cost a fraction of the price.
It's basically a $6 DTT chip and a portable WiFi point. It's a nearly pointless shiny toy.
Though Elgato do make the only decent Apple Mac OSX TV tuners. But then for Windows or Linux you can use many more (almost any on Linux).
Chinese suing Chinese?
Bit like a scorpion trying to sting itself. Maybe they are trying to prove how "free market" and "not run by the Government" they are.
I've used OSX on a real Mac, though I'd not buy either.
I'm still mystified as to why people
1) Buy a real Mac and install Linux as many "serious" Computer people do. Seems like paying a HW premium for a more limited platform that may or may not look prettier.
2) Install OSX on a non-Apple box rather than Ubuntu, Fedora etc which is more open, runs more applications. Seems like paying a SW premium for a more limited platform that may or may not look prettier.
Can someone enlighten me?