Thanks for your input on the pipeline setup. To better address the issue, could you provide more detailed information about the pipeline configuration? Specifically:
Transforms Used:
Could you list the specific transforms and their configurations that are currently used in the pipeline?
CSV Input:
Is there a CSV input in the pipeline? If so, it would be helpful to check whether the 'lazy conversion' checkbox is marked. This option helps manage memory more efficiently by delaying data type conversions.
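If a CSV input is present, the option can be checked in the transform dialog or directly in the pipeline XML. Below is a minimal sketch of the relevant fragment, assuming the Hop 2.x `.hpl` format; the transform name and filename are placeholders, and the tag names are illustrative of the CSV File Input transform:

```xml
<transform>
  <name>Read input CSV</name>
  <type>CsvInput</type>
  <filename>${PROJECT_HOME}/input/data.csv</filename>
  <!-- 'Y' keeps field data as raw bytes and defers type conversion
       until a downstream transform actually needs the typed value,
       which reduces memory and CPU pressure on large files -->
  <lazy_conversion>Y</lazy_conversion>
</transform>
```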
Data Carrying Between Pipelines:
Please verify whether the pipeline carries large data sets from one stage to another excessively. This practice can lead to memory overflow and introduce bugs, especially when combined with database lookup transforms.
Your detailed feedback on these points will help us identify and fix the issue more effectively.
The pipeline is very simple; the output is only ~2 million rows:
The Avro target folder is set on GS (Google Cloud Storage).
By the way, in parallel I'm running similar pipelines (at different times) on other software and another server, and uploading large CSVs to GS via the `gcloud storage cp` command, and that has never failed, so I would rule out network-related issues.
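For reference, those manual uploads follow the pattern below; the local file and bucket names are placeholders:

```shell
# Upload a local CSV to a GCS bucket (names here are illustrative);
# this path has been reliable, which points away from network problems
gcloud storage cp ./export.csv gs://example-bucket/exports/
```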
Apache Hop version?
2.8
Java version?
openjdk 21.0.3 2024-04-16 LTS
Operating system
Linux
What happened?
Status in Web UI:
Log lines in Web UI:
Server log:
Issue Priority
Priority: 2
Issue Component
Component: Hop Server