APIs are being developed to enable a browser and browser-based applications to ingest and store database records over the net for offline database access. In an apparently paradoxical development, the World Wide Web Consortium (W3C) is developing sets of APIs to enable browser-integrated applications, such as e-mail clients and …
Which geniarse came up with this?
"The use of a standard Indexed Database API would enable browsers to do this in a standard way for all such applications."
Wow a totally standardised way of storing data from a webapp on the local machine...Not heard something so useful since XML came along...oh wait...
And how do you propose the webapp saves this XML file to the local machine, exactly?
XML is not an "indexed database"
XML doesn't scale: it needs to be loaded into an in-memory structure to be searched efficiently, and for a large enough data set that would consume a lot of memory. It's no replacement for a proper database with indexes.
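To make the point concrete, here's a minimal sketch using Python's built-in sqlite3 as a stand-in for any indexed client-side store (the table and index names are made up for illustration): an index lets the engine jump straight to matching rows, whereas searching an XML file means parsing and scanning the whole thing.

```python
import sqlite3

# In-memory database as a stand-in for client-side storage.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE mail (id INTEGER PRIMARY KEY, sender TEXT, subject TEXT)")
con.execute("CREATE INDEX idx_sender ON mail (sender)")
con.executemany(
    "INSERT INTO mail (sender, subject) VALUES (?, ?)",
    [("alice", f"msg {i}") if i % 2 else ("bob", f"msg {i}") for i in range(10000)],
)

# Ask SQLite how it will execute the query: the plan shows an index
# search on idx_sender rather than a scan of all 10,000 rows.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM mail WHERE sender = 'alice'"
).fetchone()
print(plan[-1])  # mentions idx_sender, i.e. an index search, not a full scan
```

An equivalent lookup over a 10,000-record XML file would have to deserialise every record before it could answer.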
Client Side Scripting
If you have a web application that requires you to download a 'database' large enough to cause a problem retrieving datasets from a large XML file (response greater than 1 sec), then you have got totally the wrong idea with your development and need to step away and think things through...
My point is that this is a totally useless idea as you shouldn't be allowing people to download entire databases for offline use. It's a piss poor practice that was stopped years ago.
"It's a piss poor practice that was stopped years ago."
Perhaps you should look at the offline features of Google Mail, Google Calendar and Google Docs, all of which use an "entire" database for offline use.
The idea is that you only download the database (or even just the schema) once, then update it intermittently.
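That sync pattern can be sketched in a few lines; this is a hypothetical illustration (the table, column and function names are invented, again using sqlite3 as the stand-in local store): download once, then later pull only rows changed since the last sync instead of the whole database.

```python
import sqlite3

# Hypothetical local replica of a server-side calendar table.
local = sqlite3.connect(":memory:")
local.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, title TEXT, updated_at INTEGER)"
)

def apply_delta(rows, last_sync):
    """Upsert only the rows the server changed after last_sync."""
    changed = [r for r in rows if r[2] > last_sync]
    local.executemany(
        "INSERT OR REPLACE INTO events (id, title, updated_at) VALUES (?, ?, ?)",
        changed,
    )
    return len(changed)

server_rows = [(1, "standup", 100), (2, "lunch", 250), (3, "review", 300)]
apply_delta(server_rows, last_sync=0)                      # initial full download
n = apply_delta([(2, "team lunch", 400)], last_sync=300)   # later: only the delta
```

The intermittent updates then cost bandwidth proportional to what changed, not to the size of the database.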
That would be using the new HTML 5 API
Isn't this out there already?
Google Gears does it, for one, and I've a feeling Safari might have a SimpleDB-type API built in. Let's hope that if there is a similar standard, they actually take a look at what has been done already instead of developing a totally different set of standards.
So, cookies, then?
The title is the key, this is merely a value.
Values can only be strings with cookies; a database can have multiple types (int, char, string, etc.). Ironically, most browsers already store their cookies in a database.
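A small sketch of that difference (illustrative only, using sqlite3): a cookie value is one flat string, while a database row carries typed fields that come back as real types, not text to be re-parsed.

```python
import sqlite3

# A cookie value is just one string; a database row can carry typed fields.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE prefs (name TEXT, count INTEGER, ratio REAL, raw BLOB)")
con.execute("INSERT INTO prefs VALUES (?, ?, ?, ?)", ("volume", 11, 0.5, b"\x00\x01"))

row = con.execute("SELECT name, count, ratio, raw FROM prefs").fetchone()
types = [type(v).__name__ for v in row]
print(types)  # each column comes back as its own type, not one flat string
```

With cookies, every one of those values would have to be serialised into the string and parsed back out by hand.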
@Which geniarse came up with this?
"Wow a totally standardised way of storing data from a webapp on the local machine...Not heard something so useful since XML came along...oh wait..."
Except that traversing XML looking for individual datasets is a particularly poor and inefficient method of targeted data retrieval, great point!
For that matter, why use databases on servers either? Why not just do everything with XML and XSLT? You're not a real programmer unless you're using hundreds of lines of code that one of those toy database n00bs would knock out in 30, after all.
They already host a database
Firefox and Chrome (I think) already have SQLite baked into them to deal with history, settings and bookmarks. This is Google's Gears project sneaking into the latest HTML spec through the back door.
This is probably a good thing, given that most of your future browsing will be done on your mobile, which might not always have a data connection.
I thought we had a standard database API
It was called SQL last time I looked. Some database products even use it! It might need some judicious subsetting but still.
Not so much cookie monster, as monster cookie?
I believe cookies are limited to no more than 4KB each. Also, some people deny cookies by default, so a different mechanism is necessary (you could fudge it perhaps, but it would be messy. Dunno). A DB fits the requirements.
However, I do hope they consider the abuse potential: tracking records instead of cookies, SQL injection on the client if they go down that road, etc. Yuck.
XML is for documents really
It is not great as a data store where you are looking for instant results; for that, an RDBMS or LDAP system is a lot better.
An offline database is useful; there are many times I wish this had been available, so yeah, good idea. Don't care about the SQL injection problems, a good implementation will avoid them, and at base the data doesn't really matter: you can download it again. This is about better access and reduced load.
Too much faith there.
"...don't care about the SQL injection problems, a good implementation will avoid them..."
I think you've just identified the contents and reason for IE10 Service Pack 1.
I remember reading about this
Back in 2007: http://webkit.org/blog/126/webkit-does-html5-client-side-database-storage/
I think there are iPhone webapps that have been doing this (sans trumpeting), for a long time.
Stop this insanity
Seriously, I don't need Firefox taking up even more of my system's resources....
Firefox already does it....
Look for yourself in your 'Local Settings\Application Data\Mozilla\Firefox\Profiles\$usr\' directory for the *.sqlite files. They're databases, and they use fewer resources than what previous versions of Firefox used to do the same thing.
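If you want to see the pattern without hunting for your real profile (the path above is Windows-specific; on Linux it's under ~/.mozilla/firefox/), here's an illustrative snippet that fakes a profile directory with some of Firefox's actual database file names and globs for them:

```python
import glob
import os
import tempfile

# Fake a profile directory just to demonstrate the glob pattern;
# places.sqlite, cookies.sqlite and formhistory.sqlite are real
# Firefox database names (history/bookmarks, cookies, form history).
profile = tempfile.mkdtemp()
for name in ("places.sqlite", "cookies.sqlite", "formhistory.sqlite"):
    open(os.path.join(profile, name), "w").close()

dbs = sorted(os.path.basename(p) for p in glob.glob(os.path.join(profile, "*.sqlite")))
print(dbs)
```

Point the same glob at your real profile directory and you'll see the actual databases Firefox keeps.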
Except that there's no API available to a client-side app. Particularly a standard API available across multiple browsers.
Since the SQLite engine is already there, providing an API shouldn't be a big problem. Better to have everyone provide their input up front than have Microsoft head off in their own direction (not that they won't already). And have some smart people point out all the gaping security holes in the proposals up front.
XML is for documents
That's the way I understand XML too.
XML can present data well, but the data is much better off being stored in a database.
I'm reminded of Unidata, an old multi-value db owned by IBM now.
I can't say I enjoyed working with MV records but storing data was fast and efficient.
Files were small and text based regardless of datatype.
Opera already has a database
Opera has a database in-built for its mail storage too, it's just that it's disabled by default and the older 'files and folders' storage systems are used. There's nothing stopping any user from going into opera:config and changing it.
Database servers are clearly going to be part of all proper browsers in the future. As the Web evolves from a simple document platform to an application platform, databases are going to be required to speed things up. No, it won't be as fast as Opera 3 loading the original BBC News site, but that's no longer the age we live in, and in the same way that modern cars are hideously over-complicated in order to function *like* modern cars, so too must browsers get bloated.
Ultimately the proof is this: Firefox, Chrome and Safari are open source. If the market doesn't want the bloat then someone will download the source and release a bare-bones browser version of one or more of those apps. People were motivated enough to release a version of Chrome without Google's spying, yet there is no version of Firefox or Chrome without the bloat. Sorry if you don't want to hear this, but that means only one thing - No one wants a bare-bones browser, they actually like or at least don't mind the bloat because they have quad-core processors and 4GB of RAM loading apps from a SATA drive, and it all runs perfectly well for them.
Database storage visible in Safari Preferences
As one or two other people have pointed out, Safari seems to have support for this already. You can even see your databases and set the maximum size allocated to them in the Security tab of the Preferences window (Safari version 4).
XML and cookies
"If you have a web application that requires you to download a 'database' large enough to cause a problem retrieving datasets from a large XML file (response greater than 1 sec), then you have got totally the wrong idea with your development and need to step away and think things through..."
Fail. Think of it UNIX-style -- there can be multiple users on one box, and there's no saying HOW FAST the box is. Aiming just to make something acceptable, more or less, for a single user on a fast box, is inefficient and lazy. An XML file is not a database and storing anything that could be described as a database in an XML file is terrible.
Second (not at you in particular), don't cookies do exactly this? Rather than coming up with a totally new mechanism, just extend them into some local storage mechanism.
Going full circle, are we ?
The merry-go-round continues.
We've gone from local databases to networked databases, to remotely-hosted databases, to databases hosted on the Internet, and now somebody wants to find a way to locally store the remotely-hosted Internet databases. And update them.
I'm sure there's a way to find all this funny, only I can't see it.
This is perfectly feasible
This is no different from the caching that browsers already do.
HTML 5 File API + XML/RDF + SPARQL
Would keep the old-fashioned relational crowd happy and allow the new graph crowd to query pages directly.
this is nothing new
Part of my website design course (circa 1998) said that, with ActiveX and IE 3.02, you could read the contents of a text file to do something like a league table. You would upload your (CSV) text file into the directory. If I remember rightly, Liverpool,1,0,0,1,0,4,0,liverpool.gif would be a good format for your table and your nice webpage. Needless to say it worked a treat in IE, but not in others. Still, this was a time when the internet was shiny and new.