
Disable prefix tuning and limit llama adapter #482

Merged
5 commits merged on May 6, 2024

Conversation


@mreso (Contributor) commented on May 2, 2024

What does this PR do?

This PR disables prefix tuning and limits llama_adapter to the non-FSDP use case; a sketch of the kind of guard involved follows below.

Fixes #359
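
For context, the change boils down to a validation guard of roughly this shape. This is a hypothetical sketch, not the repo's actual code: `validate_peft_config`, `PeftConfig`, and `TrainConfig` below are illustrative stand-ins for llama-recipes' real config objects.

```python
from dataclasses import dataclass

@dataclass
class PeftConfig:   # illustrative stand-in for the repo's PEFT config
    peft_method: str = "lora"

@dataclass
class TrainConfig:  # illustrative stand-in for the training config
    enable_fsdp: bool = False

def validate_peft_config(peft_config: PeftConfig, train_config: TrainConfig) -> None:
    """Reject the PEFT setups this PR disables before training starts."""
    if peft_config.peft_method == "prefix":
        # Prefix tuning is disabled outright.
        raise ValueError("Prefix tuning is currently not supported.")
    if peft_config.peft_method == "llama_adapter" and train_config.enable_fsdp:
        # llama_adapter is limited to the non-FSDP use case.
        raise ValueError("llama_adapter is not supported together with FSDP.")
```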

Feature/Issue validation/testing

Please describe the tests that you ran to verify your changes and summarize the results. Provide instructions so they can be reproduced, and list any relevant details of your test configuration.

  • pytest tests/test_finetuning.py
    Logs for Test A
```
====================== test session starts ======================
platform linux -- Python 3.11.8, pytest-8.1.1, pluggy-1.4.0
rootdir: /home/mreso/llama-recipes
configfile: pyproject.toml
plugins: anyio-4.3.0, mock-3.14.0
collected 9 items

tests/test_finetuning.py .........                        [100%]

======================= warnings summary ========================
../.conda/envs/llama3/lib/python3.11/site-packages/fire/core.py:59
  /home/mreso/.conda/envs/llama3/lib/python3.11/site-packages/fire/core.py:59: DeprecationWarning: 'pipes' is deprecated and slated for removal in Python 3.13
    import pipes

src/llama_recipes/utils/train_utils.py:9
  /home/mreso/llama-recipes/src/llama_recipes/utils/train_utils.py:9: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import packaging

../.conda/envs/llama3/lib/python3.11/site-packages/torch/distributed/_shard/checkpoint/__init__.py:8
  /home/mreso/.conda/envs/llama3/lib/python3.11/site-packages/torch/distributed/_shard/checkpoint/__init__.py:8: DeprecationWarning: torch.distributed._shard.checkpoint will be deprecated, use torch.distributed.checkpoint instead
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================= 9 passed, 3 warnings in 3.77s =================
```
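
A test for these guards might take roughly the following shape. This is hypothetical: the real assertions live in tests/test_finetuning.py and target the repo's actual entry points, while this sketch reuses the illustrative `validate_peft_config`, `PeftConfig`, and `TrainConfig` names from above.

```python
import pytest

def test_prefix_tuning_rejected():
    # Selecting prefix tuning should fail fast with a clear error.
    with pytest.raises(ValueError, match="not supported"):
        validate_peft_config(PeftConfig(peft_method="prefix"), TrainConfig())

def test_llama_adapter_rejected_with_fsdp():
    # llama_adapter should be rejected only when FSDP is enabled.
    with pytest.raises(ValueError, match="FSDP"):
        validate_peft_config(
            PeftConfig(peft_method="llama_adapter"),
            TrainConfig(enable_fsdp=True),
        )
```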

Before submitting

  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Thanks for contributing 🎉!

@mreso mentioned this pull request on May 2, 2024