
Distributed Logger #33

Open · 2 tasks
xrsrke opened this issue Nov 14, 2023 · 1 comment
Assignees
Labels
good first issue Good for newcomers

Comments

xrsrke (Owner) commented Nov 14, 2023

Print log messages from a specific rank or ParallelMode neatly to the terminal, and save them to a local file. Let the user configure the file path and file name; by default, save the log under the name the user passed in.

APIs

from pipegoose.distributed import ParallelMode
from pipegoose.distributed.logger import DistributedLogger

logger = DistributedLogger("latency_logger", parallel_context)

logger.info("hello", parallel_mode=ParallelMode.GLOBAL)
logger.warning("hello", parallel_mode=ParallelMode.GLOBAL)
logger.debug("hello", parallel_mode=ParallelMode.GLOBAL)
logger.error("hello", parallel_mode=ParallelMode.GLOBAL)

# other arguments
logger.info("hello", rank=0, parallel_mode=ParallelMode.GLOBAL)
logger.info("hello", rank=0, parallel_mode=ParallelMode.TENSOR)

TODO

  • Save log message by a specific rank in ParallelMode
  • Save log message by all ranks in a ParallelMode
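A minimal sketch of how the API above could be implemented on top of the standard `logging` module. The `ParallelMode` enum and the `parallel_context` interface (here a `StubParallelContext` with a `get_local_rank` method) are stand-ins for illustration, not pipegoose's actual API:

```python
# Hypothetical sketch of a DistributedLogger; ParallelMode and the
# parallel_context API are stubbed so the example is self-contained.
import enum
import logging
from pathlib import Path


class ParallelMode(enum.Enum):
    GLOBAL = "global"
    TENSOR = "tensor"


class StubParallelContext:
    """Stand-in for pipegoose's parallel_context (hypothetical API)."""

    def get_local_rank(self, parallel_mode: ParallelMode) -> int:
        return 0  # single-process stub: this process is always rank 0


class DistributedLogger:
    def __init__(self, name, parallel_context, log_dir=".", filename=None):
        self.parallel_context = parallel_context
        self.logger = logging.getLogger(name)
        self.logger.setLevel(logging.DEBUG)
        # By default, save the log under the name the user passed in.
        path = Path(log_dir) / (filename or f"{name}.log")
        handler = logging.FileHandler(path)
        handler.setFormatter(
            logging.Formatter("%(name)s:%(levelname)s: %(message)s")
        )
        self.logger.addHandler(handler)

    def _should_log(self, rank, parallel_mode):
        # rank=None means every rank logs; otherwise only the matching
        # rank within the given parallel mode writes the message.
        if rank is None:
            return True
        return self.parallel_context.get_local_rank(parallel_mode) == rank

    def _log(self, level, msg, rank, parallel_mode):
        if self._should_log(rank, parallel_mode):
            self.logger.log(level, msg)

    def info(self, msg, rank=None, parallel_mode=ParallelMode.GLOBAL):
        self._log(logging.INFO, msg, rank, parallel_mode)

    def warning(self, msg, rank=None, parallel_mode=ParallelMode.GLOBAL):
        self._log(logging.WARNING, msg, rank, parallel_mode)

    def debug(self, msg, rank=None, parallel_mode=ParallelMode.GLOBAL):
        self._log(logging.DEBUG, msg, rank, parallel_mode)

    def error(self, msg, rank=None, parallel_mode=ParallelMode.GLOBAL):
        self._log(logging.ERROR, msg, rank, parallel_mode)
```

With this design, both TODO items reduce to the `rank` argument: passing a rank saves the message only on that rank in the ParallelMode, while omitting it saves the message on all ranks.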
@xrsrke xrsrke added help wanted Extra attention is needed good first issue Good for newcomers labels Nov 14, 2023
KevorkSulahian commented Nov 15, 2023

Made a PR on #35 @xrsrke

@xrsrke xrsrke removed the help wanted Extra attention is needed label Nov 27, 2023
Labels
good first issue Good for newcomers
Projects
Status: In Progress
2 participants