
Gradio-Lite : zero-shot-classification pipeline returns only 1 score #8317

Open
1 task done
xavierbarbier opened this issue May 17, 2024 · 2 comments
Assignees
Labels
bug Something isn't working

Comments

@xavierbarbier

Describe the bug

Hello,
I'm trying to use Gradio-Lite with zero-shot-classification Transformer.js.py pipeline and providing potential classes (as with "classic" Transformer and Transformer.js pipelines). But it outputs only one probability.

Using gr.Interface.from_pipeline(pipe) works but user needs to provide potential classes manually.

Am'I missing some arguments here ?

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

<html>
<head>
<script type="module" crossorigin src="https://cdn.jsdelivr.net/npm/@gradio/lite/dist/lite.js"></script>
</head>
<body>
<gradio-lite>
<gradio-requirements>
transformers_js_py
</gradio-requirements>

<gradio-file name="app.py" entrypoint>
from transformers_js import import_transformers_js
import gradio as gr

labels = ['politics', 'music', 'police']

transformers = await import_transformers_js()
pipeline = transformers.pipeline
model_path = 'Xenova/mobilebert-uncased-mnli'
pipe = await pipeline('zero-shot-classification', model_path, labels)

async def classify(text):
    pred = await pipe(text)
    return pred["scores"]

demo = gr.Interface(classify, "textbox", "textbox")
demo.launch()
</gradio-file>
</gradio-lite>
</body>
</html>

Screenshot

No response

Logs

No response

System Info

Gradio-Lite (So I guess it's up to date!)

Severity

Blocking usage of gradio

@xavierbarbier xavierbarbier added the bug Something isn't working label May 17, 2024
@whitphx
Member

whitphx commented May 20, 2024

It's about Transformers.js' API spec: you have to pass the labels at prediction time, not at model initialization (see https://huggingface.co/docs/transformers.js/api/pipelines#module_pipelines.ZeroShotClassificationPipeline).

So your code should be modified like this:

from transformers_js import import_transformers_js
import gradio as gr

labels=['politics', 'music','police']
transformers = await import_transformers_js()
pipeline = transformers.pipeline
model_path = 'Xenova/mobilebert-uncased-mnli'
pipe = await pipeline('zero-shot-classification', model_path) # Not here.

async def classify(text):
	pred = await pipe(text, labels) # Pass `labels` here.

	return pred["scores"]


demo = gr.Interface(classify, "textbox", "textbox")
demo.launch()
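Since `pred["scores"]` is just a list of floats, the output is easier to read if each score is paired with its label. A minimal sketch of such a formatter (pure Python, no pipeline required; `format_prediction` is a hypothetical helper name, and the `{"labels": [...], "scores": [...]}` result shape is assumed from the Transformers.js docs linked above):

```python
def format_prediction(pred):
    """Pair each candidate label with its score, highest first."""
    pairs = sorted(zip(pred["labels"], pred["scores"]),
                   key=lambda p: p[1], reverse=True)
    return {label: round(score, 4) for label, score in pairs}

# Example result in the assumed Transformers.js output shape:
pred = {"labels": ["music", "politics", "police"],
        "scores": [0.91, 0.06, 0.03]}
print(format_prediction(pred))
# → {'music': 0.91, 'politics': 0.06, 'police': 0.03}
```

If I recall correctly, a dict of label-to-confidence like this can also be fed straight to a `"label"` output component instead of `"textbox"`, which renders the scores as a bar list.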

@xavierbarbier
Author

I first tried defining the labels inline in the pipe call at inference time (as with the "classic" Transformers pipeline); that didn't work. Then I tried passing them at model initialization; that didn't work either, as expected.
It seems you have to define them separately from the inference call and pass the variable in.
It's working now.
Arigato!

from transformers_js import import_transformers_js
import gradio as gr

labels = ['politics', 'music', 'police']  # defining labels here works

transformers = await import_transformers_js()
pipeline = transformers.pipeline
model_path = 'Xenova/mobilebert-uncased-mnli'
pipe = await pipeline('zero-shot-classification', model_path)  # not here

async def classify(text):
    # Pass the `labels` variable here.
    # Inlining labels=['politics', 'music', 'police'] in the call doesn't work.
    pred = await pipe(text, labels)
    return pred["scores"]

demo = gr.Interface(classify, "textbox", "textbox")
demo.launch()
