- 25 Jun, 2020 1 commit
Ivan Tyagov authored
Be explicit and set the desired portal_type, as otherwise a Data Stream Bucket will be created by default and tests will fail due to the missing API.
- 24 Jun, 2020 1 commit
Ivan Tyagov authored
See merge request nexedi/wendelin!45
- 22 Jun, 2020 9 commits
Roque authored
Roque authored
erp5_wendelin_data_lake_ingestion: validate a split ingestion right after the last chunk (EOF) is ingested, instead of waiting for the alarm; better handling of data stream hash calculation and publication.
Roque authored
Roque authored
Roque authored
erp5_wendelin_data_lake_ingestion:
- split files are no longer appended or post-processed, and no data streams are removed anymore; all ingestions and data streams corresponding to split parts are kept
- the client will receive the list of all data streams and will be in charge of merging the parts during the download
- validate chunk data streams only when the full file has been ingested
- only process split ingestions when the full file has been ingested
- calculate the full split-file size
- calculate the hash and add state control during data stream validation
- stop invalidated ingestions
Roque authored
- do not return invalid data streams (filter by reference and state)
- group split-file data streams by reference
- return the full file size and the large hash
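The two commits above describe the new split-file model: the server keeps one data stream per uploaded part, grouped by file reference, and the client reassembles them on download. A minimal sketch of that client-side merge, assuming hypothetical names (`parts`, `merge_split_parts`) that are illustrative only and not the actual Wendelin/ebulk API:

```python
# Hedged sketch of the client-side merge of split-file parts.
# The server is assumed to return the ordered chunks of one logical
# file (one per data stream); the client concatenates them and can
# recompute the full size and hash reported at validation time.
import hashlib

def merge_split_parts(parts):
    """Merge split-file parts into the full byte string.

    `parts` maps a part index to its chunk of bytes. Returns the
    merged bytes, the full file size, and a hash of the whole file
    (md5 here purely for illustration).
    """
    ordered = [parts[index] for index in sorted(parts)]
    full = b"".join(ordered)
    return full, len(full), hashlib.md5(full).hexdigest()

# Example: three chunks belonging to one logical file.
chunks = {0: b"abc", 1: b"def", 2: b"gh"}
data, size, digest = merge_split_parts(chunks)
```

The design choice mirrored here is the one the commit states: the server never appends parts itself anymore, so ordering and merging are entirely the downloader's responsibility.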
Roque authored
erp5_wendelin_data_lake_ui: fix UI queries to fetch documents and update renamed server scripts in URLs
Roque authored
Roque authored
- 19 Jun, 2020 2 commits
Ivan Tyagov authored
See merge request nexedi/wendelin!44
Eteri authored
- 18 Jun, 2020 1 commit
Ivan Tyagov authored
categories. This is usually needed for Wendelin applications where, for some reason, data is stored in different Data Streams which must form a logical and sorted "link". Add a test.
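The commit above links Data Streams into a logical, sorted chain via categories. A tiny sketch of what such a traversal looks like, assuming a hypothetical `successor` attribute standing in for the real category relation (the actual ERP5 category id and API are not shown in the log):

```python
# Hedged sketch: Data Streams chained into a sorted logical "link".
# `successor` is an illustrative stand-in for the category-based
# relation the commit adds; real ERP5 documents resolve such links
# through the category system, not a plain attribute.
class DataStream:
    def __init__(self, reference, successor=None):
        self.reference = reference
        self.successor = successor  # next stream in the logical chain

def iterate_chain(stream):
    """Yield stream references by following successor links in order."""
    while stream is not None:
        yield stream.reference
        stream = stream.successor

# Build a three-element chain and walk it from the head.
last = DataStream("part-3")
middle = DataStream("part-2", successor=last)
first = DataStream("part-1", successor=middle)
order = list(iterate_chain(first))
```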
- 04 Jun, 2020 2 commits
Ivan Tyagov authored
See merge request !42
Eteri authored
- 29 May, 2020 1 commit
Ivan Tyagov authored
See merge request !41
- 14 May, 2020 2 commits
- 11 May, 2020 1 commit
Ivan Tyagov authored
- 08 May, 2020 1 commit
Ivan Tyagov authored
Allow the owner of a Data Stream to edit it even though it is published. In standard ERP5 this is not normal, but with the Data Lake functionality validation is done on the ebulk side (which also takes care of automatically increasing the Data Set's version). This is a quite simplified model of ebulk-only validation of a Data Set, but it is sufficient for most cases.
- 06 May, 2020 1 commit
Ivan Tyagov authored
- 05 May, 2020 1 commit
Ivan Tyagov authored
- 04 May, 2020 3 commits
Ivan Tyagov authored
Ivan Tyagov authored
Ivan Tyagov authored
- 30 Apr, 2020 2 commits
Ivan Tyagov authored
Ivan Tyagov authored
- 29 Apr, 2020 2 commits
Ivan Tyagov authored
Ivan Tyagov authored
- 24 Apr, 2020 3 commits
Ivan Tyagov authored
Ivan Tyagov authored
Ivan Tyagov authored
- 23 Apr, 2020 3 commits
Ivan Tyagov authored
Ivan Tyagov authored
Rename validation_workflow (which duplicates erp5_core's one) to data_validation_workflow, which has only one new state, 'Published'. Clean up worklists.
Ivan Tyagov authored
This reverts commit ab6250a1.
- 22 Apr, 2020 1 commit
Ivan Tyagov authored
- 21 Apr, 2020 2 commits
Ivan Tyagov authored
Ivan Tyagov authored
The 'Validated' state's meaning is that a document has passed a validation of its content, either on the ebulk client side or on the Wendelin side. It is not meant to be part of the publication context.
- 20 Apr, 2020 1 commit
Ivan Tyagov authored
Make it possible to configure whether the default (open) security model is to be installed for Wendelin's Data Lake. The default is not to install it.