Issues: Lightning-AI/lightning-thunder
#541: forward_and_backward_from_trace is not handling NumberProxy properly in saved_for_backward
Labels: bug, grad transform
Opened Jun 6, 2024 by jjsjann123

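A minimal repro-style sketch (my construction, not from the issue) of the kind of program this touches: a plain Python number enters the computation, so the grad transform may need to carry it in saved_for_backward.

```python
import torch
import thunder

def fn(x, scale: float):
    # `scale` becomes a NumberProxy under thunder's tracing; if it is
    # saved for backward, #541 reports it is not handled properly.
    return (x * scale).sum()

jfn = thunder.jit(fn)
out = jfn(torch.randn(4, requires_grad=True), 2.0)
out.backward()
```
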
#540: [distributed][Tensor Parallelism] Support meta device params
Labels: tensor parallel
Opened Jun 6, 2024 by crcrpar

#539: Applying Thunder to a torch.fx.GraphModule from Dynamo fails
Labels: bug, jit, triage review
Opened Jun 5, 2024 by IvanYashchuk

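A hedged sketch of the reported path, using the standard torch.compile custom-backend hook; thunder_backend is a hypothetical name, and the failure is as reported in the title, not something this snippet fixes.

```python
import torch
import thunder

def thunder_backend(gm: torch.fx.GraphModule, example_inputs):
    # Hand the Dynamo-produced GraphModule to Thunder; #539 reports
    # that thunder.jit fails on such inputs.
    return thunder.jit(gm)

def fn(x):
    return torch.relu(x) + 1

compiled = torch.compile(fn, backend=thunder_backend)
compiled(torch.randn(3))
```
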
#533: Add post_optimization_transform to thunder.jit
Labels: enhancement, transforms
Opened Jun 5, 2024 by kshitij12345

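The keyword below is the proposal from the title, not an existing thunder.jit parameter; a hypothetical sketch of how it might look:

```python
import torch
import thunder

def my_transform(trace):
    # Hypothetical hook body: inspect or rewrite the optimized trace here.
    return trace

model = torch.nn.Linear(4, 4)
# Proposed (not yet existing) keyword from #533:
jmodel = thunder.jit(model, post_optimization_transform=my_transform)
```
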
#532: Missing slice check in prologue
Labels: bug, jit
Opened Jun 5, 2024 by jjsjann123

#531: Fusion pass should drop unused args from parent symbols
Labels: bug, fusion logic
Opened Jun 5, 2024 by jjsjann123

#526: math.xxx calls on NumberProxy are not being traced
Labels: bug, dynamic constraints, symbolic values
Opened Jun 5, 2024 by jjsjann123

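A minimal sketch of an assumed trigger: a math-module call applied to a value that becomes a NumberProxy during tracing (here, a dimension size under symbolic values).

```python
import math
import torch
import thunder

def fn(x):
    # With symbolic values, x.shape[-1] can be a NumberProxy; #526 reports
    # that math.* calls on it are not recorded in the trace.
    scale = 1.0 / math.sqrt(x.shape[-1])
    return x * scale

jfn = thunder.jit(fn)
jfn(torch.randn(2, 8))
```
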
#525: NVFuser error when adding thunder.jit to the UNet model of NeMo Stable Diffusion
Labels: bug, nemo
Opened Jun 4, 2024 by athitten

#510: Thunder objects' __repr__ should indicate what object they are (TensorProxy and others)
Labels: enhancement, tracing architecture
Opened Jun 3, 2024 by t-vi

#507: [RFC] Option to make a trace easier to interpret
Labels: enhancement
Opened Jun 2, 2024 by crcrpar

#500: FutureWarning: torch.cuda.amp.autocast(args...) is deprecated; use torch.amp.autocast('cuda', args...) instead
Labels: bug, ci / tests
Opened May 31, 2024 by xwang233

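The migration the warning asks for is mechanical; both spellings below are real PyTorch APIs:

```python
import torch

x = torch.randn(2, 2, device='cuda')

# Deprecated spelling that triggers the FutureWarning:
with torch.cuda.amp.autocast():
    y = x @ x

# Replacement suggested by the warning:
with torch.amp.autocast('cuda'):
    y = x @ x
```
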
#497: Distill an API for module transformations from the distributed/quantization uses of ThunderModule attributes
Labels: enhancement, module
Opened May 31, 2024 by t-vi

#487: Support RN50 BatchNorm fusions with cudnn
Labels: cudnn, enhancement
Opened May 29, 2024 by vedaanta

#486: FP8 linear and conv with cudnn
Labels: cudnn, enhancement
Opened May 29, 2024 by vedaanta

#483: load/save_state_dict hooks for early transforms
Labels: enhancement, module
Opened May 29, 2024 by t-vi

#478: fsdp(jit(...)) can use more memory than jit(fsdp(...))
Labels: bug, distributed
Opened May 29, 2024 by kshitij12345

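A sketch of the two composition orders being compared, assuming thunder.distributed.fsdp and an initialized process group (e.g. launched via torchrun); the memory gap is the reported bug, not something the snippet demonstrates by itself.

```python
import torch
import thunder
from thunder.distributed import fsdp

# Assumes torch.distributed.init_process_group() has already run
# (e.g. the script was launched with torchrun).
model = torch.nn.Linear(1024, 1024)

jm_a = thunder.jit(fsdp(model))   # jit(fsdp(...))
jm_b = fsdp(thunder.jit(model))   # fsdp(jit(...)): reported to use more memory
```
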
#474: OOM errors for Gemma-7, pythia-12b, Llama-2-13b-hf and Nous-Hermes-13b with FSDP ZeRO-3 on 2x8 H100
Labels: bug, memory use
Opened May 29, 2024 by mpatel31415

#471: Dynamic shapes need to be modeled in the trace (8 tasks)
Labels: bug, dynamic constraints
Opened May 29, 2024 by jjsjann123

#468: Implement GroupNorm to invoke APEX GroupNorm for NeMo Stable Diffusion AutoEncoder performance
Labels: bug, nemo, performance
Opened May 29, 2024 by athitten

#467: dtype inconsistencies when dividing/rounding tensors
Labels: bug
Opened May 29, 2024 by k223kim

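A hedged repro-style sketch (my construction) comparing result dtypes between eager PyTorch and thunder.jit for a divide-then-round computation:

```python
import torch
import thunder

def fn(x):
    # In eager PyTorch, int64 / int promotes to float32 (true division).
    return torch.round(x / 3)

jfn = thunder.jit(fn)
x = torch.arange(6)  # int64
print(fn(x).dtype, jfn(x).dtype)  # #467 reports these can disagree
```
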
#465: CI: Re-enable the torchrun call in the Zero to Thunder notebook
Labels: bug, ci / tests, ci
Opened May 27, 2024 by t-vi

#463: Constraints to insert static numbers
Labels: enhancement
Opened May 26, 2024 by jjsjann123

#462: Hang using thunder.jit with the tokenizer in NeMo Stable Diffusion
Labels: bug, nemo
Opened May 26, 2024 by athitten