Paul Graydon / wendelin / Commits / eb371277

Commit eb371277, authored Apr 20, 2023 by Martin Manchev, committed by Ivan Tyagov, Apr 20, 2023

Revert "Changes in 'erp5_wendelin_data_lake' ..."

This reverts commit 004be34a.

Parent: 40ee4d19

Showing 30 changed files with 225 additions and 207 deletions (+225 −207)
Files changed:

  +6  −2  bt5/erp5_wendelin_configurator/PathTemplateItem/business_configuration_module/wendelin_configuration.xml
  +4  −4  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_afterWendelinConfiguration.py
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_afterWendelinConfiguration.xml
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_configureWendelinCategories.xml
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_configureWendelinOrganisation.xml
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_configureWendelinSetupDataNotebook.xml
  +6  −2  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_displayWendelinDownloadForm.xml
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_getPrettyCategoriesSpreadsheetConfiguratorItem.xml
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_setupWendelinStandardBT5.xml
  +6  −2  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_viewConfigureWendelinCategoriesDialog.xml
  +6  −2  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_viewWendelinConfigureOrganisationDialog.xml
  +6  −6  bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/ERP5Site_createDefaultKnowledgeBox.xml
  +4  −5  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/DataLake_stopIngestionList.py
  +1  −1  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_checkIngestionReferenceExists.py
  +1  −1  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_createDataAnalysisList.py
  +0  −0  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_executeDataAnalysisList.py
  +0  −0  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_executeDataOperation.py
  +1  −1  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_getDescriptorHTMLContent.py
  +2  −2  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_getIngestionConstantsJson.py
  +0  −1  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_increaseDatasetVersion.py
  +2  −2  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_invalidateIngestionObjects.py
  +2  −2  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_invalidateOldDatasets.py
  +1  −1  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_newCredentialRequest.py
  +3  −4  bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_setDatasetDescription.py
  +1  −1  bt5/erp5_wendelin_data_lake_ingestion/TestTemplateItem/portal_components/test.erp5.testDataLakeIngestion.py
  +2  −2  bt5/erp5_wendelin_data_lake_ingestion/WorkflowTemplateItem/portal_workflow/data_validation_workflow/scripts/checkConsistency.py
  +1  −1  bt5/erp5_wendelin_data_lake_ingestion/bt/dependency_list
  +6  −7  bt5/erp5_wendelin_examples_keras/ExtensionTemplateItem/portal_components/keras_save_load.py
  +126 −120  bt5/erp5_wendelin_examples_keras/ExtensionTemplateItem/portal_components/keras_train_model.py
  +2  −2  bt5/erp5_wendelin_examples_keras/ExtensionTemplateItem/portal_components/keras_vgg16_predict.py
bt5/erp5_wendelin_configurator/PathTemplateItem/business_configuration_module/wendelin_configuration.xml

...
@@ -115,9 +115,9 @@
        <item>
          <key> <string>categories</string> </key>
          <value>
            <tuple>
              <string>specialise/portal_templates/54</string>
              <string>resource/portal_workflow/erp5_wendelin_workflow</string>
              <string>current_state/portal_workflow/erp5_wendelin_workflow/state_start</string>
              <string>specialise/portal_templates/54</string>
            </tuple>
          </value>
        </item>
...
@@ -136,9 +136,11 @@
        <value>
          <object>
            <klass>
-             <global id="1.1" name="DateTime" module="DateTime.DateTime"/>
+             <global id="1.1" name="_reconstructor" module="copy_reg"/>
            </klass>
+           <tuple>
+             <global id="1.2" name="DateTime" module="DateTime.DateTime"/>
+             <global id="1.3" name="object" module="__builtin__"/>
+             <none/>
+           </tuple>
            <state>
...
@@ -170,6 +172,8 @@
          <object>
            <klass>
              <reference id="1.1"/>
            </klass>
+           <tuple>
+             <reference id="1.2"/>
+             <reference id="1.3"/>
+             <none/>
+           </tuple>
            <state>
...
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_afterWendelinConfiguration.py

"""
This script will be called to apply the customization.
"""
from erp5.component.module.Log import log

# Activate the knowledge pads on portal home to enable later the Wendelin
# Information gadget.
portal = context.getPortalObject()
default_site_preference = getattr(portal.portal_preferences,
...
@@ -39,7 +39,7 @@ if default_security_model_business_template is not None:
    portal_type_instance = getattr(portal.portal_types, portal_type)
    print "Updated Role Mappings for %s" % portal_type
    portal_type_instance.updateRoleMapping()

# updata local roles (if any)
business_template = context.getSpecialiseValue()
...
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_afterWendelinConfiguration.xml

...
@@ -6,18 +6,18 @@
  </pickle>
  <pickle>
    <dictionary>
      <item>
        <key> <string>Script_magic</string> </key>
        <value> <int>3</int> </value>
      </item>
      <item>
        <key> <string>_bind_names</string> </key>
        <value>
          <object>
            <klass>
-             <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/>
+             <global name="_reconstructor" module="copy_reg"/>
            </klass>
-           <tuple/>
+           <tuple>
+             <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/>
+             <global name="object" module="__builtin__"/>
+             <none/>
+           </tuple>
            <state>
              <dictionary>
                <item>
...
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_configureWendelinCategories.xml

@@ -6,18 +6,18 @@ — the same _bind_names pickle hunk as in the other configurator scripts: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_configureWendelinOrganisation.xml

@@ -6,18 +6,18 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_configureWendelinSetupDataNotebook.xml

@@ -6,18 +6,18 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_displayWendelinDownloadForm.xml

...
@@ -11,9 +11,13 @@
        <value>
          <object>
            <klass>
-             <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/>
+             <global name="_reconstructor" module="copy_reg"/>
            </klass>
-           <tuple/>
+           <tuple>
+             <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/>
+             <global name="object" module="__builtin__"/>
+             <none/>
+           </tuple>
            <state>
              <dictionary>
                <item>
...
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_getPrettyCategoriesSpreadsheetConfiguratorItem.xml

@@ -6,18 +6,18 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_setupWendelinStandardBT5.xml

@@ -6,18 +6,18 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_viewConfigureWendelinCategoriesDialog.xml

@@ -11,9 +11,13 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/BusinessConfiguration_viewWendelinConfigureOrganisationDialog.xml

@@ -11,9 +11,13 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_configurator/SkinTemplateItem/portal_skins/erp5_configurator_wendelin/ERP5Site_createDefaultKnowledgeBox.xml

@@ -6,18 +6,18 @@ — the same _bind_names pickle hunk: the klass <global name="NameAssignments" module="Shared.DC.Scripts.Bindings"/> is rewritten to <global name="_reconstructor" module="copy_reg"/> with an argument tuple of (NameAssignments, object, None).
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/DataLake_stopIngestionList.py

 from erp5.component.module.Log import log
-from Products.ZSQLCatalog.SQLCatalog import Query, SimpleQuery
+from Products.ZSQLCatalog.SQLCatalog import Query
 import hashlib

 CHUNK_SIZE = 200000
...
@@ -14,7 +14,7 @@ def getHash(data_stream):
     end_offset = n_chunk * chunk_size + chunk_size
     try:
       data_stream_chunk = ''.join(data_stream.readChunkList(start_offset, end_offset))
-    except:
+    except Exception:
       # data stream is empty
       data_stream_chunk = ""
     hash_md5.update(data_stream_chunk)
...
@@ -24,7 +24,6 @@ def getHash(data_stream):
 def isInterruptedAbandonedSplitIngestion(reference):
   from DateTime import DateTime
   now = DateTime()
   day_hours = 1.0 / 24 / 60 * 60 * 24
   # started split data ingestions for reference
   catalog_kw = {'portal_type': 'Data Ingestion',
...
@@ -90,8 +89,8 @@ for data_ingestion in portal_catalog(portal_type = "Data Ingestion",
       portal.data_stream_module.deleteContent(data_stream.getId())
     if last_data_stream_id.endswith(reference_end_split):
       portal.ERP5Site_invalidateSplitIngestions(data_ingestion.getReference(), success=True)
-      hash = getHash(full_data_stream)
-      full_data_stream.setVersion(hash)
+      full_data_stream_hash = getHash(full_data_stream)
+      full_data_stream.setVersion(full_data_stream_hash)
       if full_data_stream.getValidationState() != "validated":
         full_data_stream.validate()
   related_split_ingestions = portal_catalog(portal_type="Data Ingestion",
...
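The `getHash` routine in `DataLake_stopIngestionList.py` computes an MD5 digest by feeding the stream to the hash in fixed-size chunks. A minimal standalone sketch of the same pattern, with a plain bytes buffer standing in for the ERP5 Data Stream API (`readChunkList` is ERP5-specific and not used here):

```python
import hashlib

CHUNK_SIZE = 200000  # same constant as in DataLake_stopIngestionList.py

def get_hash(data, chunk_size=CHUNK_SIZE):
    """Compute an MD5 hex digest by feeding the buffer in chunks."""
    hash_md5 = hashlib.md5()
    for start in range(0, len(data), chunk_size):
        # update() accumulates, so chunked hashing equals one-shot hashing
        hash_md5.update(data[start:start + chunk_size])
    return hash_md5.hexdigest()

# chunked and one-shot digests agree
data = b"x" * (CHUNK_SIZE * 2 + 123)
assert get_hash(data) == hashlib.md5(data).hexdigest()
```

The incremental `update()` API is what makes this work without loading arbitrarily large streams into one string.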
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_checkIngestionReferenceExists.py

...
@@ -44,7 +44,7 @@ if data_ingestion != None:
       # previous ingestion was interrumped
       log(''.join(["[WARNING] User has restarted an interrumpted ingestion for reference ",
                    data_ingestion.getReference(),
                    ". Previous split ingestions will be discarted and full ingestion restarted."]))
       portal.ERP5Site_invalidateSplitIngestions(data_ingestion.getReference(), success=False)
-    except:
+    except Exception:
       pass
   # the ingestion attemp corresponds to a split ingestion in course, accept
   return FALSE
...
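Several hunks in this commit narrow a bare `except:` to `except Exception:`. The difference matters: a bare `except` catches `BaseException`, which includes `SystemExit` and `KeyboardInterrupt`, so it can silently swallow an interpreter shutdown or a Ctrl-C. A small illustration:

```python
def swallow_all():
    try:
        raise SystemExit
    except:  # bare except: catches BaseException, including SystemExit
        return "caught"

def swallow_errors_only():
    try:
        raise SystemExit
    except Exception:  # SystemExit does not derive from Exception
        return "caught"

assert swallow_all() == "caught"

# SystemExit escapes the narrower handler
try:
    swallow_errors_only()
    reached = False
except SystemExit:
    reached = True
assert reached
```

`except Exception:` still catches ordinary errors, which is all these scripts intend to ignore.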
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/DataLake_createDataAnalysisList.py → bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_createDataAnalysisList.py

from DateTime import DateTime
portal = context.getPortalObject()
portal_catalog = portal.portal_catalog
from erp5.component.module.Log import log
now = DateTime()
query_dict = {"portal_type": "Data Ingestion Line",
...
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/DataLake_executeDataAnalysisList.py → bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_executeDataAnalysisList.py

File moved

bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/DataLake_executeDataOperation.py → bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_executeDataOperation.py

File moved
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5_getDescriptorHTMLContent.py → bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_getDescriptorHTMLContent.py

-from Products.ZSQLCatalog.SQLCatalog import Query, SimpleQuery, ComplexQuery
+from Products.ZSQLCatalog.SQLCatalog import Query, ComplexQuery
 portal = context.getPortalObject()
 portal_catalog = portal.portal_catalog
...
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_getIngestionConstantsJson.py

 import json
 portal = context.getPortalObject()
-dict = {'invalid_suffix': portal.ERP5Site_getIngestionReferenceDictionary()['invalid_suffix'],
+dictionary = {'invalid_suffix': portal.ERP5Site_getIngestionReferenceDictionary()['invalid_suffix'],
         'split_end_suffix': portal.ERP5Site_getIngestionReferenceDictionary()['split_end_suffix'],
         'single_end_suffix': portal.ERP5Site_getIngestionReferenceDictionary()['single_end_suffix'],
         'split_first_suffix': portal.ERP5Site_getIngestionReferenceDictionary()['split_first_suffix'],
...
@@ -10,4 +10,4 @@ dict = {'invalid_suffix':portal.ERP5Site_getIngestionReferenceDictionary()['inva
         'reference_length': portal.ERP5Site_getIngestionReferenceDictionary()['reference_length'],
         'invalid_chars': portal.ERP5Site_getIngestionReferenceDictionary()['invalid_chars'],
        }
-return json.dumps(dict)
+return json.dumps(dictionary)
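The rename from `dict` to `dictionary` in `ERP5Site_getIngestionConstantsJson.py` avoids shadowing the built-in: once a local named `dict` exists, any later `dict(...)` call in the same scope fails. A minimal sketch of the failure mode (the keys here are illustrative, not the full ERP5 constant set):

```python
import json

def build_constants_bad():
    dict = {'invalid_suffix': '_invalid'}  # shadows the builtin in this scope
    try:
        dict(split_end_suffix='EOS')  # TypeError: 'dict' object is not callable
    except TypeError:
        pass
    return json.dumps(dict)

def build_constants_good():
    dictionary = {'invalid_suffix': '_invalid'}
    dictionary.update(dict(split_end_suffix='EOS'))  # builtin dict still usable
    return json.dumps(dictionary, sort_keys=True)

assert json.loads(build_constants_good()) == {
    'invalid_suffix': '_invalid', 'split_end_suffix': 'EOS'}
```

The script worked before only because it never needed the builtin after the assignment; the rename removes the trap.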
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_increaseDatasetVersion.py

 portal = context.getPortalObject()
-portal_catalog = portal.portal_catalog
 data_set = portal.data_set_module.get(reference)
 if data_set is not None:
...
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_invalidateIngestionObjects.py

-from Products.ZSQLCatalog.SQLCatalog import Query, SimpleQuery, ComplexQuery
+from Products.ZSQLCatalog.SQLCatalog import Query, ComplexQuery
 portal = context.getPortalObject()
 portal_catalog = portal.portal_catalog
...
@@ -16,5 +16,5 @@ for document in portal_catalog(**kw_dict):
   portal.ERP5Site_invalidateReference(document)
   try:
     document.invalidate()
-  except:
+  except Exception:
     pass  # fails if it's already invalidated, draft or if it doens't allow invalidation (e.g. DI)
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_invalidateOldDatasets.py

...
@@ -20,11 +20,11 @@ for data_set in portal_catalog(portal_type="Data Set", **catalog_kw):
       portal.ERP5Site_invalidateIngestionObjects(data_stream.getReference())
       try:
         data_stream.invalidate()
-      except:
+      except Exception:
         pass  # fails if it's already invalidated, draft or if it doens't allow invalidation (e.g. DI)
     portal.ERP5Site_invalidateReference(data_set)
     try:
       data_set.invalidate()
-    except:
+    except Exception:
       pass  # fails if it's already invalidated, draft or if it doens't allow invalidation (e.g. DI)
 return printed
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_newCredentialRequest.py

...
@@ -79,5 +79,5 @@ if not batch_mode:
   return portal.Base_redirect(form_id='login_form',
            keep_items=dict(portal_status_message=context.Base_translateString(message_str)))
 else:
   return json.dumps({'msg': message_str,
                      'code': 0})
bt5/erp5_wendelin_data_lake_ingestion/SkinTemplateItem/portal_skins/erp5_wendelin_data_lake/ERP5Site_setDatasetDescription.py

...
@@ -2,7 +2,6 @@ import base64
 import json
 portal = context.getPortalObject()
-portal_catalog = portal.portal_catalog
 request = context.REQUEST
 data_chunk = request.get('data_chunk')
...
@@ -11,10 +10,10 @@ data_set = portal.data_set_module.get(dataset)
 if data_set is not None:
   decoded = base64.b64decode(data_chunk)
   data_set.setDescription(decoded)
-  dict = {'status_code': 0, 'message': 'Dataset description successfully set.'}
+  response = {'status_code': 0, 'message': 'Dataset description successfully set.'}
 else:
   message = "No remote dataset found for reference '%s'" % (dataset)
-  dict = {'status_code': 1, 'message': message}
+  response = {'status_code': 1, 'message': message}
   context.logEntry(message)
-return json.dumps(dict)
+return json.dumps(response)
bt5/erp5_wendelin_data_lake_ingestion/TestTemplateItem/portal_components/test.erp5.testDataLakeIngestion.py

...
@@ -103,7 +103,7 @@ class TestDataIngestion(SecurityTestCase):
   def stepIngest(self, extension, delimiter, randomize_ingestion_reference=False, data_set_reference=False):
     file_name = "file_name.csv"
     reference = self.getRandomReference()
-    array = [[random.random() for i in range(self.CHUNK_SIZE_CSV + 10)] for j in range(self.CHUNK_SIZE_CSV + 10)]
+    array = [[random.random() for _ in range(self.CHUNK_SIZE_CSV + 10)] for _ in range(self.CHUNK_SIZE_CSV + 10)]
     np.savetxt(file_name, array, delimiter=delimiter)
     chunk = []
     with open(file_name, 'r') as csv_file:
...
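The test change only swaps the unused loop variables `i`/`j` for the conventional `_`; behavior is identical, `_` just signals the index is never read. A standalone sketch of the same comprehension (with a small size standing in for `self.CHUNK_SIZE_CSV + 10`):

```python
import random

random.seed(0)
size = 3  # hypothetical stand-in for self.CHUNK_SIZE_CSV + 10

# unused loop variables named `_` by convention
array = [[random.random() for _ in range(size)] for _ in range(size)]

# a size x size grid of independent draws in [0, 1)
assert len(array) == size
assert all(len(row) == size for row in array)
assert all(0.0 <= v < 1.0 for row in array for v in row)
```

Linters such as pylint stop flagging the variable as unused once it is named `_`, which is presumably why the revert keeps this cosmetic change.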
bt5/erp5_wendelin_data_lake_ingestion/WorkflowTemplateItem/portal_workflow/data_validation_workflow/scripts/checkConsistency.py

-object = state_change['object']
-object.Base_checkConsistency()
+state_change_object = state_change["object"]
+state_change_object.Base_checkConsistency()
bt5/erp5_wendelin_data_lake_ingestion/bt/dependency_list

 erp5_wendelin
-erp5_credential
+erp5_credential
\ No newline at end of file
bt5/erp5_wendelin_examples_keras/ExtensionTemplateItem/portal_components/keras_save_load.py

 import warnings
 import numpy as np
-from keras import backend as K
-from keras import __version__ as keras_version
-from keras.models import Sequential
-from keras.models import model_from_config
-from keras.optimizers import optimizer_from_config
-from keras import optimizers
+from keras import backend as K  # pylint:disable=import-error
+from keras import __version__ as keras_version  # pylint:disable=import-error
+from keras.models import Sequential  # pylint:disable=import-error
+from keras.models import model_from_config  # pylint:disable=import-error
+from keras.optimizers import optimizer_from_config  # pylint:disable=import-error
+from keras import optimizers  # pylint:disable=import-error

 def save_model(model, model_store=None):
   data = {}
...
@@ -179,7 +179,6 @@ def load_model(data):
   else:
     model._make_train_function()
     optimizer_weights_dict = data['optimizer_weights']
     optimizer_weight_names = optimizer_weights_dict['weight_names']
     optimizer_weight_values = optimizer_weights_dict['weight_values']
     model.optimizer.set_weights(optimizer_weight_values)
   return model
\ No newline at end of file
bt5/erp5_wendelin_examples_keras/ExtensionTemplateItem/portal_components/keras_train_model.py

 import numpy as np
 import time
 import sys
 import transaction

 class Progbar(object):

     def output(self, data):
         self.output1(str(data))

     def __init__(self, target, width=30, verbose=1, interval=0.01, output=None):
         """Dislays a progress bar.
         # Arguments:
             target: Total number of steps expected.
             interval: Minimum visual progress update interval (in seconds).
         """
         self.width = width
         self.target = target
         self.sum_values = {}
         self.unique_values = []
         self.start = time.time()
         self.last_update = 0
         self.interval = interval
         self.total_width = 0
         self.seen_so_far = 0
         self.verbose = verbose
         self.output1 = output

-    def update(self, current, values=[], force=False):
+    def update(self, current, values=None, force=False):
         """Updates the progress bar.
         # Arguments
             current: Index of current step.
             values: List of tuples (name, value_for_last_step).
                 The progress bar will display averages for these values.
             force: Whether to force visual progress update.
         """
+        if values is None:
+            values = []
         for k, v in values:
             if k not in self.sum_values:
                 self.sum_values[k] = [v * (current - self.seen_so_far),
                                       current - self.seen_so_far]
                 self.unique_values.append(k)
             else:
                 self.sum_values[k][0] += v * (current - self.seen_so_far)
                 self.sum_values[k][1] += (current - self.seen_so_far)
         self.seen_so_far = current

         now = time.time()
         if self.verbose == 1:
             if not force and (now - self.last_update) < self.interval:
                 return
             prev_total_width = self.total_width
             #self.output('\b' * prev_total_width)
             self.output('\r')
             numdigits = int(np.floor(np.log10(self.target))) + 1
             barstr = '%%%dd/%%%dd [' % (numdigits, numdigits)
             bar = barstr % (current, self.target)
             prog = float(current) / self.target
             prog_width = int(self.width * prog)
             if prog_width > 0:
                 bar += ('=' * (prog_width - 1))
                 if current < self.target:
                     bar += '>'
                 else:
                     bar += '='
             bar += ('.' * (self.width - prog_width))
             bar += ']'
             self.output(bar)
             self.total_width = len(bar)

             if current:
                 time_per_unit = (now - self.start) / current
             else:
                 time_per_unit = 0
             eta = time_per_unit * (self.target - current)
             info = ''
             if current < self.target:
                 info += ' - ETA: %ds' % eta
             else:
                 info += ' - %ds' % (now - self.start)
             for k in self.unique_values:
                 info += ' - %s:' % k
                 if isinstance(self.sum_values[k], list):
                     avg = self.sum_values[k][0] / max(1, self.sum_values[k][1])
                     if abs(avg) > 1e-3:
                         info += ' %.4f' % avg
                     else:
                         info += ' %.4e' % avg
                 else:
                     info += ' %s' % self.sum_values[k]

             self.total_width += len(info)
             if prev_total_width > self.total_width:
                 info += ((prev_total_width - self.total_width) * ' ')
             self.output(info)
             if current >= self.target:
                 self.output('\r\n')

         if self.verbose == 2:
             if current >= self.target:
                 info = '%ds' % (now - self.start)
                 for k in self.unique_values:
                     info += ' - %s:' % k
                     avg = self.sum_values[k][0] / max(1, self.sum_values[k][1])
                     if avg > 1e-3:
                         info += ' %.4f' % avg
                     else:
                         info += ' %.4e' % avg
                 self.output(info + "\r\n")

         self.last_update = now

-    def add(self, n, values=[]):
+    def add(self, n, values=None):
+        if values is None:
+            values = []
         self.update(self.seen_so_far + n, values)

-from keras.callbacks import ProgbarLogger as OriginalProgbarLogger
+from keras.callbacks import ProgbarLogger as OriginalProgbarLogger  # pylint:disable=import-error

 class ProgbarLogger(OriginalProgbarLogger):
...
@@ -162,10 +168,10 @@ def train(portal):
     # 1. you can use keras.
     # 2. you can save trained model.
     # 3. you can load trained model.
-    from cStringIO import StringIO
-    import tensorflow as tf
+    # from cStringIO import StringIO
+    import tensorflow as tf  # pylint:disable=import-error
     sess = tf.Session()
-    from keras import backend as K
+    from keras import backend as K  # pylint:disable=import-error
     K.set_session(sess)
     stream = portal.data_stream_module.wendelin_examples_keras_log
...
@@ -176,8 +182,8 @@ def train(portal):
     if saved_model_data is not None:
         model = portal.keras_load_model(saved_model_data)
     else:
-        from keras.models import Sequential
-        from keras.layers import Dense
+        from keras.models import Sequential  # pylint:disable=import-error
+        from keras.layers import Dense  # pylint:disable=import-error
         model = Sequential()
         model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
         model.add(Dense(8, init='uniform', activation='relu'))
...
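The `values=[]` → `values=None` change in `Progbar.update()` and `Progbar.add()` fixes the classic mutable-default-argument pitfall: a default list is created once at function definition time and then shared across every call that omits the argument. A minimal demonstration:

```python
def add_bad(item, values=[]):
    # the default list is created once and reused on every call
    values.append(item)
    return values

def add_good(item, values=None):
    if values is None:
        values = []  # fresh list per call
    values.append(item)
    return values

assert add_bad(1) == [1]
assert add_bad(2) == [1, 2]   # state leaked from the previous call
assert add_good(1) == [1]
assert add_good(2) == [2]     # no leakage
```

The `None` sentinel plus an in-body `values = []` is the standard idiom whenever a default needs to be a fresh mutable object.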
bt5/erp5_wendelin_examples_keras/ExtensionTemplateItem/portal_components/keras_vgg16_predict.py

-from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
-from keras.preprocessing import image
+from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions  # pylint:disable=import-error
+from keras.preprocessing import image  # pylint:disable=import-error
 import numpy as np
 from cStringIO import StringIO
 import PIL.Image
...