- 30 Mar, 2016 1 commit
Tatuya Kamada authored
- 29 Mar, 2016 1 commit
Tatuya Kamada authored
Also use find_load_module instead of try/except.
- 28 Mar, 2016 3 commits
Tatuya Kamada authored
Tatuya Kamada authored
Tatuya Kamada authored
- 25 Mar, 2016 1 commit
Tatuya Kamada authored
- 24 Mar, 2016 1 commit
Douglas authored
- 18 Mar, 2016 1 commit
Douglas authored
The implementation relies on the Data Array Module. It imports data from the stocks table through a ZSQL Method. Category information is added later, column by column, so it can easily be done in parallel, querying the Portal Catalog once for each category column in the array. This category processing needs to be done only once, when the array is created, and again for new data as it is added. But there is a catch: each entity that belongs to a movement can have many categories. So either the row is duplicated for each of the entity's categories and searched by equality, or the categories are stored as comma-separated values and searched with a regular expression. The regular expression approach seems faster for datasets up to 1M rows; a small sketch of this idea follows below. Some unit tests were also added. These are the external methods created and their purposes:
- Base_filterInventoryDataFrame just parses keyword arguments and forwards them to Base_getInventoryDataFrame. It is used for the non-programmer interface of the Pandas-based getMovementHistoryList implementation and can also be used as an external method in other scripts.
- Base_convertResultsToBigArray converts results of Portal Catalog and ZSQL Method queries to a Data Array, with a proper transformation of the schema to a compatible NumPy data type.
- Base_extendBigArray extends a Data Array with a Portal Catalog query or ZSQL Method result. It raises errors when the extension data type doesn't match the source.
- Base_fillPandasInventoryCategoryList fills category information into a Data Array that holds stock movement information.
- Base_zGetStockByResource is used in a test case as the source to create a Data Array with stock data.
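Below is a minimal, hypothetical pandas sketch of the comma-separated category storage and regular-expression search described above; the column names, category values and data are invented, and this is not the actual ERP5 code.

```python
import pandas as pd

# Hypothetical movement rows: each row keeps all categories of its entity
# as one comma-separated string instead of being duplicated per category.
movements = pd.DataFrame({
    'uid': [1, 2, 3],
    'quantity': [10.0, -4.0, 7.5],
    'section_category': [
        'group/nexedi,region/europe',
        'group/client_a',
        'group/nexedi,region/asia',
    ],
})

# An equality search would need one duplicated row per category; with
# comma-separated values a regular expression selects the same rows.
mask = movements['section_category'].str.contains(r'(?:^|,)group/nexedi(?:,|$)')
print(movements[mask])  # rows with uid 1 and 3
```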
- 25 Feb, 2016 1 commit
Ivan Tyagov authored
subset of a bigger one.
- 22 Feb, 2016 1 commit
Ivan Tyagov authored
- 17 Feb, 2016 3 commits
Ivan Tyagov authored
Ivan Tyagov authored
Ivan Tyagov authored
- 12 Feb, 2016 2 commits
Ivan Tyagov authored
Ivan Tyagov authored
- 02 Feb, 2016 1 commit
Ivan Tyagov authored
- 13 Jan, 2016 1 commit
Ivan Tyagov authored
Master @Tyagov This merge request adds:
* Data Array Line -> to get a view into a data array in any dimension, defined by numpy indexing syntax
* Data Event Module -> to store user-entered information about monitoring, for example about missing data
The merge request also fixes getSize on an empty data array and fixes HTTP range requests for some arrays. See merge request !9
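As an illustration only (plain NumPy, not the actual Data Array Line API), a view "in any dimension defined by numpy indexing syntax" behaves like this:

```python
import numpy as np

array = np.arange(24).reshape(4, 6)  # stand-in for the ndarray behind a Data Array
line = array[:, 2]                   # a 1-d view along the second dimension
line[0] = -1                         # NumPy views share memory with the parent array
assert array[0, 2] == -1
```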
- 12 Jan, 2016 5 commits
Klaus Wölfel authored
Klaus Wölfel authored
Klaus Wölfel authored
Klaus Wölfel authored
Klaus Wölfel authored
Reason: the previous way did not work for all kinds of arrays.
- 06 Jan, 2016 3 commits
Ivan Tyagov authored
To avoid random test failures due to test execution timing, we set the start date one day earlier. This commit only fixes the test.
Ivan Tyagov authored
Ivan Tyagov authored
- 05 Jan, 2016 2 commits
Ivan Tyagov authored
Add TODO comments.
Ivan Tyagov authored
This script should only append data to the stream and NOT do any transformation on it, keeping a simple ERP5 rule: save exactly what was entered by the user or passed to us by a remote agent (fluentd).
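A minimal sketch of that rule, assuming a wendelin-style Data Stream with an appendData method; the function and parameter names are invented, so treat the snippet as an assumption rather than the actual ingestion script:

```python
def DataStream_ingestChunk(data_stream, data_chunk):
    # Append the raw bytes exactly as received from the remote agent (fluentd):
    # no decoding, filtering or other transformation happens at ingestion time.
    data_stream.appendData(data_chunk)
```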
- 18 Nov, 2015 1 commit
Klaus Wölfel authored
- 12 Nov, 2015 1 commit
Klaus Wölfel authored
- 11 Nov, 2015 2 commits
Ivan Tyagov authored
Add array preview listbox to Data Array View. The listbox shows lines for all indexes in the first dimension of the ndarray and up to 100 columns for the second dimension. See merge request !8
Klaus Wölfel authored
The listbox shows lines for all indexes in the first dimension of the ndarray and up to 100 columns for the second dimension.
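In NumPy terms the preview shown by the listbox is roughly the slice below (assuming a 2-d ndarray; an illustration, not the listbox code):

```python
import numpy as np

data = np.random.rand(500, 250)  # hypothetical array behind the Data Array
preview = data[:, :100]          # every first-dimension index, at most 100 columns
```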
- 09 Nov, 2015 1 commit
Ivan Tyagov authored
- 29 Oct, 2015 1 commit
Ivan Tyagov authored
- 08 Oct, 2015 1 commit
Ivan Tyagov authored
- 06 Oct, 2015 2 commits
Ivan Tyagov authored
Call the pure transformation in an activity rather than executing it in the current transaction. This way we split the ingestion part from the transformation part. Note: this commit serializes argument_list into the activity's MySQL table; for big packets this can be slow, while for small appends it should be acceptable. Instead we should call with start and end offsets only, and the data should be read from the Data Stream itself (WIP).
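A rough sketch of the offset-based call suggested in the note above; only the generic ERP5 activate() pattern is real API here, while the script name, the getSize call and the way the offsets are obtained are assumptions:

```python
# Current approach (sketched): the raw chunk is serialized into the activity table.
data_stream.activate(activity='SQLQueue').DataStream_transformChunk(data_chunk)

# Suggested approach (sketched): pass only offsets and let the activity
# re-read the bytes from the Data Stream itself.
end = data_stream.getSize()
start = end - len(data_chunk)
data_stream.activate(activity='SQLQueue').DataStream_transformChunk(start, end)
```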
Ivan Tyagov authored
Raise in case of improper server-side configuration so the sender is aware, rather than silently returning nothing and fooling the sender into thinking everything is OK. Also pylint fixes.
- 29 Sep, 2015 1 commit
Ivan Tyagov authored
- 28 Sep, 2015 2 commits
Klaus Wölfel authored
Klaus Wölfel authored
- 25 Sep, 2015 1 commit
Ivan Tyagov authored