For very large datasets or images I developed a Node.js interface to HDF5: https://github.com/HDF-NI/hdf5.node. To bring the view-finder hyperslab to the client visualization before the whole image or dataset ever gets transported, it would be nice to request the hyperslab dimensions and retrieve only that much data, then tile out from there as the user pans a map, a medical image, another slice, or scientific data such as electron densities or AFM surfaces. That makes for a much better user experience.
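As a rough sketch of the idea, here is how the client's viewport could be turned into a hyperslab selection (start offsets plus counts per dimension, the shape HDF5 selections use). The function and field names are illustrative, not hdf5.node's actual API; the resulting `start`/`count` pair is what you would feed to whatever read call performs the selection:

```javascript
// Compute the 2-D hyperslab (start + count per dimension) covering a
// viewport, clamped to the dataset extent so a pan at the edge never
// selects outside the image. All names here are hypothetical.
function viewportHyperslab(viewport, extent) {
  const startRow = Math.max(0, Math.floor(viewport.top));
  const startCol = Math.max(0, Math.floor(viewport.left));
  const endRow = Math.min(extent.rows, Math.ceil(viewport.top + viewport.height));
  const endCol = Math.min(extent.cols, Math.ceil(viewport.left + viewport.width));
  return {
    start: [startRow, startCol],
    count: [endRow - startRow, endCol - startCol],
  };
}

// Example: a 512x512 view finder panned to (1000, 2000) on a 100k x 100k image.
const slab = viewportHyperslab(
  { top: 1000, left: 2000, height: 512, width: 512 },
  { rows: 100000, cols: 100000 }
);
// slab.start -> [1000, 2000], slab.count -> [512, 512]
```

Only the few hundred kilobytes under the view finder move over the wire; as the user pans, the same computation yields the neighbouring tiles to fetch next.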
Actually, I think this is part of a mounting case for doing WebID, authentication, and authorization on a proxy, so you can get those benefits with servers other than LDP behind it.
Exactly! I want to make my projects work with Koa, Egg, Sails, etc., i.e. enterprise-grade and faster frameworks. It would be fantastic if https://github.com/solid/webid-oidc-spec became pluggable into any framework.
I’m experimenting now with https://github.com/uNetworking, which sends data to browser websockets from native C, C++, or Node, where I can send tiles/hyperslabs of data (e.g. a portion of a very large image) so the user experience, starting from the view finder, is blazing fast.
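Since websocket messages are just binary frames, each tile needs to carry its own position so the client knows where to paint it. A minimal sketch of such framing, using Node's built-in `Buffer` (the 16-byte header layout is my own assumption, not part of any protocol):

```javascript
// Pack a tile's position/shape and its raw pixel payload into one binary
// frame for a websocket send. Assumed layout: four big-endian uint32s
// [row, col, height, width] followed by the payload bytes.
function packTile(row, col, height, width, payload) {
  const header = Buffer.alloc(16);
  header.writeUInt32BE(row, 0);
  header.writeUInt32BE(col, 4);
  header.writeUInt32BE(height, 8);
  header.writeUInt32BE(width, 12);
  return Buffer.concat([header, payload]);
}

// Client side: recover position, shape, and pixels from a received frame.
function unpackTile(frame) {
  return {
    row: frame.readUInt32BE(0),
    col: frame.readUInt32BE(4),
    height: frame.readUInt32BE(8),
    width: frame.readUInt32BE(12),
    payload: frame.subarray(16),
  };
}
```

On the browser side the same header would be read with a `DataView` over the received `ArrayBuffer`; the point is just that each tile is self-describing, so tiles can arrive in any order as the user pans.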
The community also really needs guidance with https://github.com/solid/solid-spec#social-web-app-protocols
We have just started to gather user stories around the server. It would be nice if you could formulate something generic around this:
It turned out that the Wiki was unsuitable for user stories, so they are now here.
It would be really great if you submitted something around this, because I think it is a critical decision point for the future server architecture.
Ah, I was away this past weekend. Let me pull together some info.