nexedi / slapos · Merge request !1882
Open
Created Sep 12, 2025 by Xiaowu Zhang (@xiaowu.zhang), Maintainer

Draft: fluentd/template/fluentd-agent.conf.in: limit max chunk size


If the chunk size is too large, Wendelin can't handle it.

Unfortunately, we have such a case in production: a chunk of 2.5 MB

2.5M buffer.q63e88e7e04f2709babd2a7c856c6590c.log

and ingestion of that data always fails:

Traceback (most recent call last):
  File "erp5/component/document/erp5_version.py", line 80, in ingest
    self.REQUEST.form['data_chunk'] = self.REQUEST.get('BODY')
  File "ZPublisher/HTTPRequest.py", line 1058, in get
    v = self.other[key] = self._fs.value
  File "ZPublisher/HTTPRequest.py", line 1371, in __get__
    raise BadRequest("data exceeds memory limit")
zExceptions.BadRequest: data exceeds memory limit
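
For context, fluentd buffer sections accept a chunk_limit_size parameter that caps how large a single chunk may grow before it is flushed. Below is a minimal sketch of the kind of change this MR applies to fluentd-agent.conf.in; the match pattern, output plugin, destination, and the 2m value are illustrative assumptions, not the actual diff:

<match wendelin.**>
  # output plugin and destination are placeholders, not the real template values
  @type forward
  <server>
    host wendelin.example.invalid
    port 24224
  </server>
  <buffer>
    @type file
    path /srv/slapos/var/fluentd/buffer
    # cap each chunk so a single ingestion request stays within the memory
    # limit enforced by ZPublisher on the Wendelin side; 2m is an
    # illustrative value, not necessarily the one chosen in this MR
    chunk_limit_size 2m
    flush_interval 10s
  </buffer>
</match>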
Source branch: fix/fluentd_chunk_size