Commit 14897249 authored Jul 28, 2020 by Ivan Tyagov
Add needed Data Operation used in testWendelin and remove entire...

See merge request nexedi/wendelin!59

Parents: 1dbcd53e ebe7cd72
Showing 10 changed files with 330 additions and 13 deletions (+330 -13)
bt5/erp5_wendelin/PathTemplateItem/data_operation_module/wendelin_ingest_data.xml  +93 -1
bt5/erp5_wendelin/PathTemplateItem/data_operation_module/wendelin_ingest_data_conversion.xml  +219 -0
bt5/erp5_wendelin/PathTemplateItem/portal_callables/DataIngestionLine_writeFluentdIngestionToDataStream.py  +8 -3
bt5/erp5_wendelin/PathTemplateItem/portal_callables/DataIngestionLine_writeIngestionToDataStream.py  +4 -3
bt5/erp5_wendelin/SkinTemplateItem/portal_skins/erp5_wendelin/IngestionPolicyTool_addIngestionPolicy.py  +3 -4
bt5/erp5_wendelin/bt/template_path_list  +1 -0
bt5/erp5_wendelin/bt/test_dependency_list  +0 -1
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_setupWendelinStandardBT5.py  +0 -1
bt5/erp5_wendelin_data/bt/description  +1 -0
bt5/erp5_wendelin_data_lake_ingestion_default_security_model/bt/dependency_list  +1 -0
bt5/erp5_wendelin/PathTemplateItem/data_operation_module/wendelin_ingest_data.xml
...
@@ -169,7 +169,9 @@
       </item>
       <item>
           <key> <string>description</string> </key>
-          <value> <string>This is the standard data operation used for ingestion. It just appends everything to a data stream.</string> </value>
+          <value> <string>This data operation can be used to ingest data with fluentd. It assumes data comes in msgpack format.\n
+It just appends everything to a data stream without any conversion whatsoever.\n
+</string> </value>
       </item>
       <item>
           <key> <string>id</string> </key>
...
@@ -579,6 +581,96 @@
             </value>
         </item>
       </dictionary>
+      <dictionary>
+        <item>
+            <key> <string>action</string> </key>
+            <value> <string>edit</string> </value>
+        </item>
+        <item>
+            <key> <string>actor</string> </key>
+            <value> <string>zope</string> </value>
+        </item>
+        <item>
+            <key> <string>comment</string> </key>
+            <value> <none/> </value>
+        </item>
+        <item>
+            <key> <string>error_message</string> </key>
+            <value> <string></string> </value>
+        </item>
+        <item>
+            <key> <string>serial</string> </key>
+            <value> <string>985.34625.23629.44987</string> </value>
+        </item>
+        <item>
+            <key> <string>state</string> </key>
+            <value> <string>current</string> </value>
+        </item>
+        <item>
+            <key> <string>time</string> </key>
+            <value>
+              <object>
+                <klass> <reference id="3.1"/> </klass>
+                <tuple>
+                  <none/>
+                </tuple>
+                <state>
+                  <tuple>
+                    <float>1595851194.16</float>
+                    <string>UTC</string>
+                  </tuple>
+                </state>
+              </object>
+            </value>
+        </item>
+      </dictionary>
+      <dictionary>
+        <item>
+            <key> <string>action</string> </key>
+            <value> <string>edit</string> </value>
+        </item>
+        <item>
+            <key> <string>actor</string> </key>
+            <value> <string>zope</string> </value>
+        </item>
+        <item>
+            <key> <string>comment</string> </key>
+            <value> <none/> </value>
+        </item>
+        <item>
+            <key> <string>error_message</string> </key>
+            <value> <string></string> </value>
+        </item>
+        <item>
+            <key> <string>serial</string> </key>
+            <value> <string>985.34639.59167.4556</string> </value>
+        </item>
+        <item>
+            <key> <string>state</string> </key>
+            <value> <string>current</string> </value>
+        </item>
+        <item>
+            <key> <string>time</string> </key>
+            <value>
+              <object>
+                <klass> <reference id="3.1"/> </klass>
+                <tuple>
+                  <none/>
+                </tuple>
+                <state>
+                  <tuple>
+                    <float>1595930396.08</float>
+                    <string>UTC</string>
+                  </tuple>
+                </state>
+              </object>
+            </value>
+        </item>
+      </dictionary>
     </list>
   </value>
 </item>
...
bt5/erp5_wendelin/PathTemplateItem/data_operation_module/wendelin_ingest_data_conversion.xml
(new file, mode 100644)
<?xml version="1.0"?>
<ZopeData>
  <record id="1" aka="AAAAAAAAAAE=">
    <pickle>
      <global name="Data Operation" module="erp5.portal_type"/>
    </pickle>
    <pickle>
      <dictionary>
        <item>
            <key> <string>_Access_contents_information_Permission</string> </key>
            <value>
              <tuple>
                <string>Assignee</string>
                <string>Assignor</string>
                <string>Associate</string>
                <string>Auditor</string>
                <string>Manager</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>_Add_portal_content_Permission</string> </key>
            <value>
              <tuple>
                <string>Assignee</string>
                <string>Assignor</string>
                <string>Associate</string>
                <string>Manager</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>_Modify_portal_content_Permission</string> </key>
            <value>
              <tuple>
                <string>Assignee</string>
                <string>Assignor</string>
                <string>Associate</string>
                <string>Manager</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>_View_Permission</string> </key>
            <value>
              <tuple>
                <string>Assignee</string>
                <string>Assignor</string>
                <string>Associate</string>
                <string>Auditor</string>
                <string>Manager</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>_local_properties</string> </key>
            <value>
              <tuple>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>reference</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>string</string> </value>
                  </item>
                </dictionary>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>version</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>string</string> </value>
                  </item>
                </dictionary>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>data_operation_script_id</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>string</string> </value>
                  </item>
                </dictionary>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>use_list</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>lines</string> </value>
                  </item>
                </dictionary>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>quantity_unit_list</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>lines</string> </value>
                  </item>
                </dictionary>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>aggregated_portal_type_list</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>lines</string> </value>
                  </item>
                </dictionary>
                <dictionary>
                  <item>
                      <key> <string>id</string> </key>
                      <value> <string>base_contribution_list</string> </value>
                  </item>
                  <item>
                      <key> <string>type</string> </key>
                      <value> <string>lines</string> </value>
                  </item>
                </dictionary>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>aggregated_portal_type</string> </key>
            <value>
              <tuple>
                <string>Data Acquisition Unit</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>aggregated_portal_type_list</string> </key>
            <value>
              <tuple>
                <string>Data Stream</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>base_contribution_list</string> </key>
            <value>
              <tuple/>
            </value>
        </item>
        <item>
            <key> <string>categories</string> </key>
            <value>
              <tuple>
                <string>quantity_unit/unit/piece</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>default_reference</string> </key>
            <value> <string>ingest-fluent-data</string> </value>
        </item>
        <item>
            <key> <string>description</string> </key>
            <value> <string>This data operation can be used to ingest data with fluentd. It assumes data comes in msgpack format.\n
It will first unpack the msgpack, then remove the timestamp, and convert the data part (usually a dictionary)\n
to string and append it to Data Stream.\n
\n
Note that what is saved to Data Stream might be different from what fluentd was reading initially,\n
depending on fluentd plugin configuration. For example fluentd might convert json to msgpack then\n
what is saved in Data Stream might be a string representation of a python dictionary and not json.</string> </value>
        </item>
        <item>
            <key> <string>id</string> </key>
            <value> <string>wendelin_ingest_data_conversion</string> </value>
        </item>
        <item>
            <key> <string>portal_type</string> </key>
            <value> <string>Data Operation</string> </value>
        </item>
        <item>
            <key> <string>quantity_unit_list</string> </key>
            <value>
              <tuple>
                <string>information/byte</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>reference</string> </key>
            <value> <string>FOURIER-MAX</string> </value>
        </item>
        <item>
            <key> <string>script_id</string> </key>
            <value> <string>DataIngestionLine_writeFluentdIngestionToDataStream</string> </value>
        </item>
        <item>
            <key> <string>title</string> </key>
            <value> <string>Ingest Fluentd Data</string> </value>
        </item>
        <item>
            <key> <string>use_list</string> </key>
            <value>
              <tuple>
                <string>big_data/analysis</string>
              </tuple>
            </value>
        </item>
        <item>
            <key> <string>version</string> </key>
            <value> <string>001</string> </value>
        </item>
      </dictionary>
    </pickle>
  </record>
</ZopeData>
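The description above warns that what ends up in the Data Stream may no longer be the original JSON text once fluentd has re-encoded the event as msgpack. A minimal standalone Python sketch of that caveat (the field names and values are invented for illustration and are not part of this commit):

    import json
    import msgpack

    original_json = '{"sensor": "A1", "value": 42}'   # what fluentd originally read
    # Round-trip through msgpack, as a fluentd plugin configured for msgpack would do.
    decoded = msgpack.unpackb(msgpack.packb(json.loads(original_json)), raw=False)

    print(str(decoded))   # {'sensor': 'A1', 'value': 42}  <- Python dict repr, what would land in the Data Stream
    print(original_json)  # {"sensor": "A1", "value": 42}  <- the original JSON text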
bt5/erp5_wendelin/PathTemplateItem/portal_callables/DataIngestionLine_writeFluentdIngestionToDataStream.py
"""
This script is used during fluentd ingestion.
It will write data sent from fluentd by unpacking it first and then appending
as a string to respective "Data Stream".
This script is used during fluentd ingestion.
It assumes data comes in msgpack encoded in the following format: mspack(timestamp, data).
It will first unpack the msgpack, then remove the first item of the tuple (timestamp) and
append str(data) to "Data Stream".
Note that what is saved to Data Stream might be different from what fluentd was reading
initially, depending on fluentd plugin configuration. For example fluentd might convert
json to msgpack, then what is saved in Data Stream might be str(python_dict) and not json.
"""
out_stream
[
"Data Stream"
].
appendData
(
''
.
join
([
str
(
c
[
1
])
for
c
in
context
.
unpack
(
data_chunk
)]))
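The one-liner above relies on context.unpack(), which is provided by the surrounding Wendelin/ERP5 environment. As a rough standalone sketch of the same unpack-then-drop-timestamp idea using the msgpack-python package directly (the helper name and the sample events are invented; the (timestamp, data) record layout mirrors the docstring):

    import msgpack

    def extract_payloads(data_chunk):
        """Yield str(data) for every (timestamp, data) record packed into data_chunk."""
        unpacker = msgpack.Unpacker(raw=False)
        unpacker.feed(data_chunk)
        for record in unpacker:      # each record comes back as [timestamp, data]
            yield str(record[1])     # drop the timestamp, keep str(data)

    # Two fluentd-style events packed back to back, standing in for data_chunk.
    chunk = msgpack.packb((1595851194, {"temperature": 21.5})) + \
            msgpack.packb((1595851195, {"temperature": 21.7}))
    print(''.join(extract_payloads(chunk)))   # {'temperature': 21.5}{'temperature': 21.7}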
bt5/erp5_wendelin/PathTemplateItem/portal_callables/DataIngestionLine_writeIngestionToDataStream.py
"""
This script is used during fluentd ingestion.
It will append data sent from fluentd to Wendelin 'as it is' to respective "Data Stream".
By default data will be encoded in MsgPack format.
This script is a general ingestion script which can be used with fluentd or with other http based ingestion tools.
It will append data sent to Wendelin 'as it is' to respective "Data Stream".
Note that by default fluentd data is encoded in msgpack format, this script will not unpack it.
"""
out_stream
[
"Data Stream"
].
appendData
(
data_chunk
)
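Since this generic script appends whatever bytes arrive, a client can push data over HTTP without any msgpack packaging. A hypothetical client-side sketch using the requests library; the instance URL, ingestion policy id, reference parameter and credentials below are assumptions for illustration, not part of this commit:

    import requests

    payload = b'{"sensor": "A1", "value": 42}\n'   # arbitrary raw bytes; stored as-is
    response = requests.post(
        "https://wendelin.example.com/erp5/portal_ingestion_policies/default/ingest",  # assumed endpoint
        params={"reference": "example.sensor.a1"},                                     # assumed parameter
        data=payload,
        auth=("ingestion_user", "secret"),
    )
    response.raise_for_status()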
bt5/erp5_wendelin/SkinTemplateItem/portal_skins/erp5_wendelin/IngestionPolicyTool_addIngestionPolicy.py
...
@@ -13,11 +13,10 @@ ingestion_policy = context.newContent( \
  script_id = 'IngestionPolicy_parseSimpleFluentdTag')
ingestion_policy.validate()
use_category = context.restrictedTraverse("portal_categories/use/big_data/ingestion")
quantity_category = context.restrictedTraverse("portal_categories/quantity_unit/unit/piece")
-# XXX: hard-coded dependency to object from erp5_wendelin_data, remove!
-data_operation = context.restrictedTraverse("data_operation_module/wendelin_1")
+# use by default a Data Operation which will convert data sent from fleuntd
+data_operation = context.restrictedTraverse("data_operation_module/wendelin_ingest_data_conversion")
# create Data Product
data_product = context.data_product_module.newContent(
...
bt5/erp5_wendelin/bt/template_path_list
data_operation_module/wendelin_ingest_data
+data_operation_module/wendelin_ingest_data_conversion
data_product_module/default_http_json
data_product_module/default_http_json/**
data_supply_module/default_http_json
...
bt5/erp5_wendelin/bt/test_dependency_list
erp5_full_text_mroonga_catalog
-erp5_wendelin_data
erp5_wendelin_examples
\ No newline at end of file
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_setupWendelinStandardBT5.py
...
@@ -28,7 +28,6 @@ bt5_installation_list = ('erp5_full_text_mroonga_catalog',
                          'erp5_web_renderjs_ui',
                          'erp5_wendelin',
                          'erp5_wendelin_examples',
-                         'erp5_wendelin_data',
                          'erp5_wendelin_development',
                          'erp5_notebook')
...
bt5/erp5_wendelin_data/bt/description
[OBSOLETE - please do not install, kept for historical reasons only!]
Sample Wendelin data model.
\ No newline at end of file
bt5/erp5_wendelin_data_lake_ingestion_default_security_model/bt/dependency_list
(new file, mode 100644)
erp5_wendelin_data_lake_ingestion
\ No newline at end of file