Draft: fluentd/template/fluentd-agent.conf.in: limit max chunk size
If the chunk size is too large, Wendelin can't handle it.
Unfortunately, we have such a case in production — a chunk of 2.5 MB:
2.5M buffer.q63e88e7e04f2709babd2a7c856c6590c.log
and ingestion of this data always fails:
Traceback (most recent call last):
  File "erp5/component/document/erp5_version.py", line 80, in ingest
    self.REQUEST.form['data_chunk'] = self.REQUEST.get('BODY')
  File "ZPublisher/HTTPRequest.py", line 1058, in get
    v = self.other[key] = self._fs.value
  File "ZPublisher/HTTPRequest.py", line 1371, in __get__
    raise BadRequest("data exceeds memory limit")
zExceptions.BadRequest: data exceeds memory limit
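
The change amounts to capping the buffer chunk size in the fluentd agent template. A minimal sketch of what such a buffer section could look like (the 1MB cap and the buffer placement are assumptions; the actual limit and match block in fluentd-agent.conf.in may differ):

```
<match **>
  @type forward
  <buffer>
    # chunk_limit_size caps the size of each buffer chunk fluentd emits,
    # so downstream Wendelin ingestion never receives an oversized request.
    chunk_limit_size 1MB
  </buffer>
</match>
```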