Re: Why VR is doomed to be nothing more than a Niche within a Niche
If you're referring to my comment - It was a reference to the last line of I Like Heckling's post.
I tend to agree, except to say, Apple does kind of well out of the people who buy the latest i devices!
The same debate has already been had over MD5... What's the point in repeating it? Move to SHA-2/3, get on with life.
Really? This is the most serious WTF moment you've had this year? I envy you.
How is Intel's board an ARM killer if it is taking aim at an AVR-based Arduino?
I also don't see what any of this has to do with IoT given neither board has any sort of networking capability as standard.
If the ARM-based C.H.I.P. delivers what is promised, then that'll be vastly better suited to IoT applications (built-in Wi-Fi and Bluetooth 4.0) and will supposedly only cost $9. Even the BBC micro:bit is more relevant, in the IoT space, than the Arduino due to its built-in Bluetooth - another ARM based board.
I don't have anything against Intel - Very happy with all my i7 machines, but Intel blew their chances in the embedded market when they sold off the XScale range - I fully expect this Quark nonsense to go the way of the i960.
I think it is pretty likely that the development tools will involve OpenCL. Altera already provides an OpenCL SDK for its FPGAs.
I don't think time-slicing an FPGA is a particularly efficient way of sharing its capabilities across processes - Way too much state to save/restore during context switches. Given that FPGA fabric tends to be highly homogeneous, I don't think it is far-fetched to imagine a scheme where processes could request blocks of gates from the OS, not dissimilar to the way they request RAM (i.e. sbrk/mmap) albeit through some sort of driver interface.
In my mind, Storage + Spectrum = ZX Microdrive... and we all know how that turned out.
How is this different to what OSs already do with file I/O? I don't know what other OSs do, but Linux uses all available RAM that hasn't been specifically allocated for anything else as cache and buffer space. What am I missing?
Did 100Mbit ethernet or 54Mbit wi-fi turn out to be adequate? No.
Are consumers moving away from shifting data about in the LAN to shifting it in and out of the cloud? Yes.
No, consumers aren't likely to utilize a 1Gbit connection 100% of the time, but when they upload or download something, the preference is likely to be: the quicker the better.
The basis for your argument seems to be about what people need. But the Internet isn't about what people need - Unlikely as it might sound, life can exist without the Internet! It is about what people want.
"no idea why anyone needs gigabit internet connectivity at home..."
Which is a bit like the quote "640K ought to be enough for anybody."
Sure, you can have wider registers for SIMD operations (e.g. NEON), but you've got to load and store them somehow. A 64-bit data path throughout your architecture means you can move more data per operation which will benefit SIMD performance, so it is relevant. Incidentally, one of the major features of the ARMv8 architecture is enhanced SIMD support, making it standard and increasing the number of available 128-bit registers.
As mobile apps are generally audio/graphics DSP intensive, an increase in throughput is likely to be more useful to developers than the ability to address a vast and sparsely populated address space.
On a mobile device, I would have thought the 64-bit advantage is more about SIMD than address space.
Whether a language is a scripting language or not is related to the level of operations programs written in the language are intended to perform - It has nothing to with the paradigm(s) a language supports. A scripting language can be procedural, object-oriented and/or functional just as much as a systems language can be.
Closures are its main bloody asset! It's a functional language, and the moment you treat it as such instead of trying to bend it into something it isn't (like Java), it actually becomes quite pleasant to work with.
Who came up with the name 'Xbox One' anyway? They knew it would be going up against something called the PlayStation 4. Makes it sound like a 1st generation device. Anyone up for a game of pong?
"There would need to be a software layer, like Memcached, to present storage memory as pseudo-RAM to applications."
Is this not the job of the operating system? Why shouldn't this type of storage just be treated like a block device? Applications needing raw speed can just mmap.
Sounds a bit like what ARM set out to achieve with their Thumb/Thumb2 instruction sets, but taking things one step further with proper data compression rather than just shrinking the instruction format.
The article doesn't mention what architecture this is targeting, although as it's Intel I'm guessing x86 because I can't imagine them getting back into smaller embedded stuff having dumped MCS-51, i960 and XScale a long time ago. If that's the case, it makes sense for the compression/decompression to be on chip so that code compatibility can be maintained. If compression had to be done at compile time, compatibility would go out of the window.
As for 'intelligent drapes, coffee machines, toothbrushes, baby monitors, stereos, alarm clocks, supermarket shelves, air-quality sensors, and more', surely that has been ARM's bread and butter since almost forever? It's hardly new for microcontrollers to be embedded in that kind of stuff and you don't need vast amounts of processing power or memory to achieve Internet connectivity.
I require more information on this subject.
Brian Butterfield has gone too far this time!
Will someone please just take a brick and put Yahoo out of its misery?
..or the new HQ for a certain computer company?
verb (used with object)
1. to form by heating and hammering; beat into shape.
2. to form or make, especially by concentrated effort: to forge a friendship through mutual trust.
3. to imitate (handwriting, a signature, etc.) fraudulently; fabricate a forgery.
'What am I missing here?'
It's a Tuesday?
Have you ever tried writing any significant amount of code in C using GTK+'s bloated piece of crap GObject system? I shudder to think how many hours of my life I've wasted manually specifying vtables and handling GObject properties. Glad I jumped that ship 20 months ago.
Yes, C is a vastly easier language to understand compared to C++ and can even be a more productive language in some circumstances... But not when you use it to manually implement everything C++ gives you out of the box minus the syntactic sugar.
Guessing you're aware of the vast array of rep-rap variants out there though right?
Well of course it was overpriced - That's how retail works! You think I don't know that a camera I pay £400 for costs bugger all for Samsung to manufacture?
There is barely anything inside a tablet computer. In the quantity Samsung makes stuff, the mark up they put on wholesale would cover the production cost of a 7" tablet several times over.
Televisions, phones, tablets and cameras are all essentially the same thing in different form factors these days and are becoming better at inter-operating. Samsung giving away tablets with purchases of other Samsung gear makes it more likely that a consumer would choose a Samsung product over some other make - Especially at Christmas... and it also means that a consumer with two inter-operating Samsung products (e.g. a tablet and camera) is likely to buy another Samsung product (e.g. a television) later on or maybe even at the same time.
I didn't have to trade in anything. Bought a new camera from Amazon, got a Tab 2.0 with it for free + £50 cash back. Looks okay on paper to me.
Giving away Tab 2s with everything probably helped. I took the bait.
For all 3 minutes before it burnt down to the bone.
Jonathan Ive a one trick pony?
Oh please no! One Christmas Day is quite enough.
How do you make teaching attractive to the talented? Those who are talented in a particular field want to get on with exploring it and pushing boundaries, because they can. Would money really be enough to compensate people of that mindset for the dying inside that they would experience, regurgitating the same old shit, year in year out?
There always seems to have been this misconception amongst embedded systems programmers (of which I am one, although seemingly more enlightened) that C++ is somehow inherently more resource hungry than C. This simply is not true. There are actually very few language features that impose memory or execution overhead (basically just exceptions and RTTI) and only do so if used.
I can't tell if you're being serious or not.
The only time raw pointers have any place in C++ is when you're wrapping up legacy C code. In some ways, I wish C++ had never attempted to be backwards compatible with C - So many programmers treat C++ as C with classes (aka structs with functions), which doesn't even come close to utilising the language properly.
It also probably doesn't help that there are very few well written C++ libraries/frameworks out there to at least serve as an example of how C++ should be done properly. The STL is fantastic these days, especially with all that C++11 has added. Boost libraries aren't bad, but even having been a C++ programmer for 12 years, I sometimes find the complexity rather irritating. The problem is that, for the most part, the STL and Boost are libraries of generic classes and functions. Generics are absolutely right for containers, algorithms and such, but do nothing to serve as examples of how framework level or end application code should be structured. That isn't a criticism of the STL or Boost by the way - Certainly, as far as the STL is concerned, it is entirely appropriate that it be a library of generics and not much more, because C++ is a systems language and its standard library must minimize its dependencies on any particular underlying platform.
At least with languages like Java, Ruby, Python etc. there are plenty of libraries (for better or worse) which beginners can both use and draw inspiration from when designing their own code. C++ is always going to be bewildering to the beginner without good real world examples to illustrate the language. Unfortunately, some of the more prevalent C++ frameworks, such as Qt and wxWidgets, treat C++ like it's still 1998 (arguably for legitimate reasons) and are, I think, examples of how things should NOT be done in modern C++.
Yes. I was just observing that the USB interface was the one in use and the MIDI ports were being left redundant. Geez you're a hard crowd sometimes.
Nice picture on the BBC article of a device using USB instead of MIDI.
So now, what are the chances of the USPTO granting a patent on an icon illustrating the cross-sectional view of an egg in an egg cup?
A colony of people in their mid-40s is not going to be a colony for very long.
There's probably about as much chance of that happening as them basing IE on WebKit!
Didn't spot the 'Joke Alert' icon huh?
Intel has already been there and got the t-shirt. They inherited StrongARM from DEC, then went on to develop their XScale implementation of the ARMv5 architecture before flogging it to Marvell so that they could focus fully on x86. Can't see them doing a u-turn any time soon (even though they still hold some sort of ARM license apparently).
What about 64-bit pop stars?
Who is that little man sitting inside the ChemCam? What is he doing and how did he get there?
How have I not seen an EPSON HX-20 before? That's incredible for 1981.
AMD CEO Rory Read: "We're the first company to offer both 64-bit ARM and x86 server processors"
So the article's assumption is perfectly valid I'd say.
AMD should have started pushing ARM chips a long time ago. I wonder what their approach to 'ultra-low power' will be.
It has got nothing to do with sexism. There is nothing more important to a baby than the bond it shares with its mother (something which is supposed to be reciprocated) - Nature just doesn't do the whole 'equal opportunities in the workplace' thing.
Biting the hand that feeds IT © 1998–2018