
Support out_dim argument for Attention block #7877

Open
tigerlittle1 opened this issue May 7, 2024 · 0 comments
tigerlittle1 commented May 7, 2024

Is your feature request related to a problem? Please describe.
When I pass the out_dim argument to __init__ of the Attention block, a shape error is raised because query_dim != out_dim. In that case, the following line tries to restore the original channel count of hidden_states:

hidden_states = hidden_states.transpose(-1, -2).reshape(batch_size, channel, height, width)

But the channel count should follow the output of hidden_states = attn.to_out[0](hidden_states), which has out_dim channels.
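
For context, a minimal sketch of how the error shows up, assuming diffusers' Attention from diffusers.models.attention_processor (exact constructor arguments and processor internals may vary across diffusers versions):

```python
import torch
from diffusers.models.attention_processor import Attention

# out_dim deliberately different from query_dim
attn = Attention(query_dim=64, out_dim=128)

# Image-like 4D input: (batch, channel, height, width), channel == query_dim
hidden_states = torch.randn(2, 64, 16, 16)

# The default processor flattens to (batch, height * width, channel),
# projects through to_out to 128 channels, then tries to reshape back
# using the original channel (64), which raises a shape error.
out = attn(hidden_states)
```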

Describe the solution you'd like.
I suggest changing the code base from:

hidden_states = hidden_states.transpose(-1, -2).reshape(batch_size, channel, height, width)

to hidden_states = hidden_states.transpose(-1, -2).reshape(batch_size, -1, height, width), which respects the actual channel count of hidden_states.
I may open a PR later.
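
For reference, a small sketch (shapes chosen arbitrarily for illustration) showing that inferring the channel dimension with -1 keeps whatever channel count to_out produced:

```python
import torch

batch_size, height, width = 2, 16, 16
out_dim = 128  # channel count produced by attn.to_out[0] when out_dim is set

# Attention output right before the reshape back to 4D:
# (batch, height * width, out_dim)
hidden_states = torch.randn(batch_size, height * width, out_dim)

# With -1 the channel dimension is inferred from the tensor itself, so the
# result is (batch, out_dim, height, width) even when out_dim differs from
# the input channel count.
restored = hidden_states.transpose(-1, -2).reshape(batch_size, -1, height, width)
assert restored.shape == (batch_size, out_dim, height, width)
```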

Describe alternatives you've considered.
None.

Additional context.
None.
