Choosing an authentication mechanism is the most impactful decision to make when running Datashare in server mode. It can be one of the following:
basic authentication with credentials stored in database (PostgreSQL)
basic authentication with credentials stored in Redis
OAuth2 with credentials provided by an identity provider (Keycloak, for example)
dummy basic auth that accepts any user (⚠️ if the service is exposed to the internet, it will leak your documents)
Basic authentication with Redis
Basic authentication is a simple protocol that uses HTTP headers and the browser to authenticate users. User credentials are sent to the server in the Authorization header, as user:password base64 encoded:
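As a sketch (the user name jsmith and password secret below are placeholders), the encoded value can be produced like this:

```shell
# Base64-encode "user:password"; printf avoids encoding a trailing newline
printf 'jsmith:secret' | base64
# → anNtaXRoOnNlY3JldA==
# The browser then sends the header: Authorization: Basic anNtaXRoOnNlY3JldA==
```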
It is secure as long as the communication with the server is encrypted (with SSL, for example).
On the server side, you have to provide a user store for Datashare. For now we are using a Redis data store.
So you have to provision users. The passwords are sha256 hex encoded. For example, using bash:
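A minimal sketch (the password secret is a placeholder):

```shell
# sha256 hex digest of the password; -n avoids hashing a trailing newline
echo -n "secret" | sha256sum | cut -d' ' -f1
# → 2bb80d537b1da3e38bd30361aa855686bde0eacd7162fef6a25fe97bf527a25b
```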
Then insert the user like this in Redis:
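A sketch of such an insertion, assuming the user record is a JSON document keyed by the user id (the user id, password hash, and Redis address are placeholders; the exact JSON layout may vary with your Datashare version):

```bash
redis-cli SET jsmith '{"uid":"jsmith","password":"2bb80d537b1da3e38bd30361aa855686bde0eacd7162fef6a25fe97bf527a25b","group_by_applications":{"datashare":["local-datashare"]}}'
```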
If you use other indices, you'll have to include them in the group_by_applications field, but local-datashare should remain. For example, if you use myindex:
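The group_by_applications value would then look like this (a sketch; myindex is the example index name from above):

```json
{"datashare": ["local-datashare", "myindex"]}
```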
Then you should see this popup:
Here is an example of launching Datashare with Docker, with the basic auth provider filter backed by Redis:
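A minimal sketch, assuming a Redis container reachable as redis on the same Docker network; the option and class names below are indicative and should be checked against datashare --help for your version:

```bash
docker run -p 8080:8080 icij/datashare \
  --mode SERVER \
  --redisAddress redis://redis:6379 \
  --authFilter org.icij.datashare.session.BasicAuthAdaptorFilter \
  --sessionStoreType REDIS
```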
OAuth2 authentication with a third-party id service
This is the default authentication mode: if no auth mode is provided on the CLI, it will be selected. With OAuth2 you will need a third-party authorization service. The diagram below describes the workflow:
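As a sketch, assuming a Keycloak realm exposing standard OpenID Connect endpoints — the option names and URLs below are placeholders to verify against datashare --help for your version:

```bash
datashare --mode SERVER \
  --oauthClientId my-client-id \
  --oauthClientSecret my-client-secret \
  --oauthAuthorizeUrl https://keycloak.example.org/realms/demo/protocol/openid-connect/auth \
  --oauthTokenUrl https://keycloak.example.org/realms/demo/protocol/openid-connect/token \
  --oauthApiUrl https://keycloak.example.org/realms/demo/protocol/openid-connect/userinfo
```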
We made a small demo repository to show how it could be set up.
Basic authentication with a database
Basic authentication is a simple protocol that uses HTTP headers and the browser to authenticate users. User credentials are sent to the server in the Authorization header, as user:password base64 encoded:
It is secure as long as the communication to the server is encrypted (with SSL for example).
On the server side, you have to provide a user inventory in the database. You can first launch Datashare with the full database URL; Datashare will then automatically migrate your database schema. Datashare supports SQLite and PostgreSQL as back-end databases. SQLite is not recommended for a multi-user server because it does not support multithreaded access, so it will introduce contention on the users' SQL requests.
Then you have to provision users. The passwords are sha256 hex encoded (for example with bash):
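As in the Redis section, a minimal sketch (the password secret is a placeholder):

```shell
# sha256 hex digest of the password; -n avoids hashing a trailing newline
echo -n "secret" | sha256sum | cut -d' ' -f1
# → 2bb80d537b1da3e38bd30361aa855686bde0eacd7162fef6a25fe97bf527a25b
```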
Then you can insert the user like this in your database:
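A sketch of such an insertion, assuming the table created by Datashare's migrations is named user_inventory with the columns shown (check the actual schema of your version; the user id, email, and hash are placeholders):

```sql
INSERT INTO user_inventory (id, email, name, provider, details)
VALUES ('jsmith', 'jsmith@example.org', 'John Smith', 'local',
        '{"uid":"jsmith","password":"2bb80d537b1da3e38bd30361aa855686bde0eacd7162fef6a25fe97bf527a25b","group_by_applications":{"datashare":["local-datashare"]}}');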
If you use other indices, you'll have to include them in the group_by_applications field, but local-datashare should remain. For example, if you use myindex:
Or you can use the PostgreSQL COPY statement to import a CSV file if you want to create them all at once.
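For example (a sketch; the CSV file path is a placeholder and its columns must match the table's):

```sql
COPY user_inventory (id, email, name, provider, details)
FROM '/tmp/users.csv' WITH (FORMAT csv);
```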
Then when accessing Datashare, you should see this popup:
Here is an example of launching Datashare with Docker, with the basic auth provider filter backed by the database:
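A minimal sketch, assuming a PostgreSQL container reachable as db on the same Docker network; the option and class names below are indicative and should be checked against datashare --help for your version:

```bash
docker run -p 8080:8080 icij/datashare \
  --mode SERVER \
  --authFilter org.icij.datashare.session.BasicAuthAdaptorFilter \
  --dataSourceUrl "jdbc:postgresql://db/datashare?user=datashare&password=password"
```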