Re: Bluetooth is not baked into around a quarter of the world's mobiles
and Bluetooth is not needed, nor is WiFi or GPS.
Try this - https://en.wikipedia.org/wiki/Location-based_services
You're comparing a legacy power connector with a multi-function connector that has improved on the original USB connector based on experience.
All (four) of my portable devices have USB-C charging, and I've just bought a couple of USB-C memory sticks.
It's already here, accept it.
Teams allows you to have subtitles. That's particularly valuable in one meeting we have at work where a participant is deaf. The AI/ML engine (a descendant of [Tay], no doubt) is being put to good use because it also does a good job of interpreting this person's speech.
Now then, if only the Indian politicians had known about Kuhn's [structure of scientific revolutions]. They would have realised that they need to navigate to the [blue ocean]. Not to compete but to leapfrog.
We have speech recognition, we have still and video backdrops, we need total avatar synthesis. With translation. What a coup that would be for India.
I want speech to be voiced by [Peter Sellers]. Or perhaps like [this].
It's impossible to stop because most hirers or their agents want (or pretend there is legislation that requires) proof of identity.
My driving licence and passport both have my date of birth on them. Ironically my free senior bus pass doesn't have DoB.
With age should come experience, and as far as I am concerned the most important is the experience of failed projects. Actually, in IT, most people older than 25 will have plenty of those experiences. The older you get, the less you get fazed by crashes caused by off-by-one errors, out-of-bounds accesses, uninitialized variables, and permission errors. You will smell problems coming, and you know KISS works.
There are disadvantages to age, which can be compensated for by better use of practices and tools like...
What was I talking about?
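The crash classes that comment lists can be boiled down to a couple of lines. A minimal sketch (Python, purely illustrative — the values are invented):

```python
# Off-by-one and out-of-bounds bugs in miniature.
values = [10, 20, 30]

# The last valid index is len(values) - 1, not len(values).
last_ok = values[len(values) - 1]        # 30

off_by_one_error = None
try:
    values[len(values)]                  # one element past the end
except IndexError as exc:
    off_by_one_error = exc               # the crash an old hand smells coming

print(last_ok, type(off_by_one_error).__name__)
```

After a few failed projects you stop indexing by hand at all and iterate the values directly, which is the KISS fix.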
Read about WASM as much as you can. It is the future of mobile devices. Look at this bit from the article:
"Without Apple, Google, and Microsoft on board yet, it's difficult to guess whether wasm will soar or sink for lack of support."
They all make proprietary mobile operating systems, or pretend they are open source, or have failed to make a dent in the market. They know that a micro-environment like this is a replacement for their OS. And WASM's sandbox is safer than simply trusting the dominant mobile device makers when they tell us they are policing their app stores.
I agree. HTML and the DOM are legacy artefacts that should now be allowed to die. They are holding back another step change in society's adoption of technology. It's like high priests of the SQL monsters have done all they can to block enterprise adoption of simpler and more efficient data storage and retrieval systems.
YAML is better than XML (an exercise left to the student, but it's to do with functional decomposition), but let's not stop there. Scene description languages will allow us to describe all audio-visual actions, from static pages to full-motion immersive three-dimensional imagery suitable for Augmented Reality displays.
Even if we get the specifications wrong the first six or so times, we can fudge it with edge code. We have WebAssembly, which provides an order-of-magnitude improvement in client processing performance. Why let a pedestrian interpretation of a simplistic marks-on-a-page description language bind us?
Drop HTML, drop CSS and XSL, and most importantly drop any pretence that we live in a two-dimensional world.
Let's face it. The reviewer was never going to *read* the whole spec. I did it for version 1 and it took 10 days of two hours a day. I remain deeply scarred by the experience.
As originally specified, Bluetooth was meant to be a point-to-point solution. It was only when the specification was merged with Motorola's "Piano" piconet protocol that the problems began. It adopted "big" networking and was only saved from implementing IBM's SNA by adopting OSI.
Actually, removing some terminology might be useful for changing design thinking, in a nod to Mel Conway's seminal article "How Do Committees Invent?". If we can remove any reference to "master", "parent" or "manager", we could avoid command-and-control thinking pervading designs that should be as autonomous as possible.
> clunky web interface with no simple export
There’s a reason for that. The suppliers know NHS IT procurement is naive, so they only sell a web front end. They can then charge more for “integration” with other systems.
The main reason NHS IT is a mess is that patient data is under the control of primary healthcare providers: the general practitioners. Until patient data is wholly owned and controlled by patients themselves we can never have multiple operating models, such as GP->consultant, independent service provision, or hospital@home.
Anyway, hospitals are just hotels with attending physicians. Best to separate out the services.
You had to be reasonable otherwise you would appear needy:
"you can only reasonably do so via an up-to-date XCode, which usually needs an up-to-date MacOS"
Macs are the only machines you can work on to develop software for any device.
Admit it, deep down, you would love to own a real Mac. Not a hackintosh.
Look how much your well meaning comment has been voted down.
The commentards here wouldn't know what team working is. They are, by their own proud admission, the loner in the corner. Not wishing to pair up with less experienced developers to help them with insights. Not wishing to participate in model storming sessions with customers. Refusing to refactor other people's code. Hoarding code until they have crafted an over-engineered monolithic lump that only they can maintain.
Agile is a way to classify a number of development frameworks or methodologies that mostly exhibit the principles of the Agile Manifesto. They are usually founded on the theory of constraints, and software is developed using XP techniques.
There is not one specific Agile methodology.
Just like there is not one specific Waterfall methodology. Just a set of specific, clearly defined phases.
Unlike the Agile haters here, I know when waterfall phases are needed and when Scrum will be ambushed.
There's a success story bubbling away with Fuchsia.
Get a touchscreen Chromebook that runs android apps and you'll glimpse the future.
Android apps on Chromebook give you low-footprint yet functional and offline-capable improvements over browser-based services like email, office apps etc.
If you look carefully at the Fuchsia documentation you'll find an OS that is all about data flow.
Ledger allows data to be offline with sync.
Flutter, according to the Fuchsia glossary:
"... is a functional-reactive user interface framework optimized for Fuchsia and is used by many system components. Flutter also runs on a variety of other platforms, including Android and iOS. Fuchsia itself does not require you to use any particular language or user interface framework."
Fuchsia, providing some simple Android porting tools, will replace Chromium.
I agree. Eventually there will be one. Interactive add-ons are a disaster.
Each mobile platform has its own way of describing the page and then its own way of updating the current display when things change.
Eventually we'll recognise HTML/XML to be the descriptor equivalent of a programming language without sub-routines, functions, or even gotos. How insane are we? There are better descriptor languages, even now. YAML has the descriptor equivalent of functions.
This kludging of HTML is just the last death throes of an irrelevant UI framework.
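The "descriptor equivalent of functions" mentioned above is YAML's anchor/alias mechanism: define a fragment once, reuse it by reference. A minimal sketch (the keys and values are invented; the `<<` merge key is a widely supported extension rather than core YAML 1.2):

```yaml
# A reusable fragment, defined once with an anchor (&) ...
defaults: &button-style
  font: sans-serif
  padding: 4
  colour: blue

# ... and reused by alias (*), with <<: merging it in and
# later keys overriding individual fields.
ok-button:
  <<: *button-style
  label: OK

cancel-button:
  <<: *button-style
  label: Cancel
  colour: grey
```

HTML has no equivalent way to name and reuse a fragment within the document itself; every repeated element must be written out (or generated by script).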
A crap implementation!
The light won't come on if the fridge is not up to temperature.
Most OSes have a "big loop". All the events get tested in that loop. Trust me, even interrupt-driven systems use an event loop to dynamically change priorities.
This article is about UIs. They need high update rates so screen interactivity appears smooth. Sliders for example. Or how I'm writing on this tablet using a swiping keyboard.
With embedded stuff it's all about the non-functional or system qualities.
It's not only easy to get into machine learning now; it has been easy for years.
Even the name, the teachable machine, has been around for years.
Purr Puss could recognise you had put up two fingers to it, too.
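The teachable-machine style of learning is, at heart, nearest-neighbour matching: "teach" a few labelled examples, then classify new input by whichever example it sits closest to. A minimal sketch (Python; the feature vectors and labels are invented stand-ins for, say, image embeddings):

```python
import math

# Teachable-machine-style classifier: store labelled examples,
# classify by nearest neighbour in feature space.
examples = []   # list of (feature vector, label) pairs

def teach(features, label):
    examples.append((features, label))

def classify(features):
    # Pick the taught example with the smallest Euclidean distance.
    return min(examples, key=lambda ex: math.dist(ex[0], features))[1]

teach([0.9, 0.1], "two fingers up")
teach([0.1, 0.9], "open palm")
print(classify([0.8, 0.2]))   # → two fingers up
```

Modern demos bolt this onto a pretrained network that turns webcam frames into feature vectors, but the "teaching" step really is this simple.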
It's purposeful obfuscation.
The mobile networks want to maintain control of 5G when the distributed nature of nodes plopped on t'internet means it can be controlled by anyone who groks OpenStack. So that'll be the cloud suppliers like Amazon.
To be honest, all these different network function virtualization 'nuances' introduced by the protagonists are unimportant. Those of us used to the internet know the power of gateways to join dissimilar segments together.
There's only one thing important to everyone and that's billing. And no-one can stop the inevitable where money is concerned.
Yes, the blockchain is where all billing (rating records) will go. Monolithic network operators are going the same way as banks. Totally unnecessary.
Obviously you lot only looked at the consumer end. For developers there is a lot.
Swift 4 is available. Yes, it's open source and that gives many eyes to make sure quality is high. Xcode 8.3 runs it now.
Latest iPads can run the Swift Playgrounds and there is a developer kit to create packaged playgrounds.
If you want to develop for future hardware, that's not a problem as you can sling an external GPU on via Thunderbolt.
CoreML and ARKit on iOS point to a future outdoors, and heads-up.
And what do you bitch about? That your old apple kit is not as shiny as a surface studio. Good luck lounging on your couches.
I agree about physical keyboards. As this phone has a high speed connector I expect there'll be an add-on appearing. Otherwise pray the Gemini sees the light of day.
Speaking of which, I'm bemused about daylight-readable screens. Some phones had transflective screens that gave you a monochrome or sometimes even colour display in bright ambient light.
It's probably dropped because most of humanity are living indoors. Perhaps some bean counter decided it wasn't worth putting into products.
Why not look at mobiles sold in a country which doesn't have mostly cloud cover?
This is why I like reading the comments on the Reg: we explore the problems and suggest changes.
If only some of those managers in the PMO carefully read these comments instead of 'socialising' requirements or 'managing expectations' then more of these chaotic and complex system evolutions would actually get better. Dare I say become antifragile?
My suggestion is an AI based system monitoring tool would have worked wonders.
https://www.moogsoft.com for example.
My two penny worth.
Google already have the NDK that allows development of android components in C/C++.
IMHO Google wants to walk away from perennial battles with Oracle over implementing Java APIs.
As soon as they can, Java will become just another way to write client apps, in the same way that Java is just one of several languages you can use server-side on GAE.
Java and C# should join COBOL as museum pieces worthy only of morbid interest.
Biting the hand that feeds IT © 1998–2020