[...] If I run "dbt docs generate", there is little useful information. [...] If docs generation continues to be a multi-minute query, some information could be added there.
On the other hand, when running "dbt docs generate --debug", there is a line in the log which for me reads:
SQL status: SUCCESS 24005 in 239.0 seconds
That line is perhaps the most important one in the log file, since it accounts for much of the total run time.
One option: make the line clearer about which query it refers to.
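As a rough illustration of that option, the sketch below builds a status line that carries a short preview of the SQL it refers to. The helper name and format are hypothetical, not dbt's actual logging code:

```python
def log_sql_status(status: str, rows: int, elapsed: float, sql: str) -> str:
    """Build a status line that names the query it refers to.

    Hypothetical sketch; dbt's real logging API differs.
    """
    # Collapse whitespace and keep only the first 60 characters as a preview
    preview = " ".join(sql.split())[:60]
    return f"SQL status: {status} {rows} in {elapsed:.1f} seconds -- {preview}"


print(log_sql_status("SUCCESS", 24005, 239.0,
                     "select * from dbt_db.information_schema.columns"))
```

With a preview attached, the slow line in the debug log identifies its query directly, even when parallel connections interleave their output.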
Three seconds later in the log file is what is presumably the connection that caused it:
13:32:56 SQL status: SUCCESS 24005 in 239.0 seconds
13:32:59 On dbt_db.information_schema: Close
If there is parallelization, there is no guarantee that nothing else will appear in between, so I would prefer a more descriptive first line.
And if the problem cannot be made to "go away", I could imagine accumulating critical timing information for these important queries and, at the end of the run (with --debug), flushing that summary to the log so it stands out from the 43K-line file.
Describe alternatives you've considered
docs generate is faster
Who will this benefit?
all users of dbt docs
Are you interested in contributing this feature?
No response
Anything else?
No response
Is this your first time submitting a feature request?
Describe the feature
The user wants the following debug log line to be reported as info, but with more added context about which query was run:

SQL status: SUCCESS 24005 in 239.0 seconds

Originally posted by @extrospective in dbt-labs/dbt-snowflake#1034 (comment)
Describe alternatives you've considered
Who will this benefit?
all users of dbt docs
Are you interested in contributing this feature?
No response
Anything else?
No response