Actions: Mozilla-Ocho/llamafile

Showing runs from all workflows
296 workflow runs

Fix tinyBLAS accuracy
CI #142: Commit 31330d0 pushed by jart
March 4, 2024 08:21 · 6m 24s · main

Synchronize with llama.cpp upstream
CI #139: Commit 8c4d7af pushed by jart
February 27, 2024 18:55 · 6m 45s · main

Upgrade to cosmocc 3.3.2
CI #138: Commit 86ee0a8 pushed by jart
February 27, 2024 17:53 · 5m 4s · main

Upgrade to cosmocc v3.3.1
CI #137: Commit c011235 pushed by jart
February 27, 2024 10:00 · 4m 50s · main

Cherry pick gemma fix from upstream
CI #136: Commit b2f240e pushed by jart
February 23, 2024 20:13 · 4m 35s · main

Synchronize with llama.cpp upstream
CI #135: Commit 5447ca8 pushed by jart
February 23, 2024 20:03 · 4m 42s · main

Fixup commit for 84490a7bca53 (#265)
CI #134: Commit 725eca9 pushed by jart
February 22, 2024 07:21 · 5m 42s · main

Upgrade to Cosmopolitan 3.3
CI #133: Commit 5b109da pushed by jart
February 21, 2024 03:07 · 10m 45s · main

Include ngl parameter in README.md (#267)
CI #132: Commit 6d8790b pushed by jart
February 20, 2024 15:03 · 5m 49s · main

Have less checks in check_args (#241)
CI #129: Commit 84490a7 pushed by jart
February 19, 2024 21:56 · 5m 9s · main

Add sandboxing for the server on Apple Silicon Macs (#261)
CI #128: Commit d8e58ed pushed by jart
February 19, 2024 21:55 · 5m 36s · main

Always enable --embedding mode on server
CI #127: Commit 97eca06 pushed by jart
February 19, 2024 20:20 · 6m 1s · main

Add sandboxing for the server on Apple Silicon Macs
CI #126: Pull request #261 opened by hafta
February 16, 2024 21:57 · 6m 43s · hafta:main

Improve prompt eval time on x86 CPUs
CI #125: Commit 9f5002a pushed by jart
February 12, 2024 21:40 · 8m 48s · main

Remove -fopenmp flag
CI #124: Commit 176f089 pushed by jart
February 12, 2024 19:41 · 6m 59s · main

have less checks in check_args
CI #123: Pull request #241 opened by ahgamut
February 6, 2024 05:43 · 2m 40s · ahgamut:less-checks

Make clickable HTTP server links easier to see
CI #122: Commit 7a8d5ee pushed by jart
February 5, 2024 18:38 · 5m 0s · main

Make AVX mandatory and support VNNI
CI #121: Commit cdd7458 pushed by jart
February 1, 2024 17:42 · 7m 25s · main

Fix issue with recent llamafile-convert change
CI #120: Commit 4fd603e pushed by jart
January 30, 2024 19:34 · 6m 39s · main

Improve llamafile-convert command
CI #119: Commit 703e03a pushed by jart
January 30, 2024 02:01 · 7m 34s · main

Add missing build rule
CI #118: Commit fb150ca pushed by jart
January 27, 2024 21:38 · 7m 4s · main

Release llamafile v0.6.2
CI #117: Commit d4c602d pushed by jart
January 27, 2024 21:19 · 5m 21s · main

Synchronize with llama.cpp 2024-01-27
CI #116: Commit dfd3335 pushed by jart
January 27, 2024 20:40 · 6m 46s · main

Synchronize with llama.cpp 2024-01-26
CI #115: Commit c008e43 pushed by jart
January 27, 2024 19:32 · 5m 53s · main

Synchronize with llama.cpp 2024-01-26
CI #114: Commit b5f245a pushed by jart
January 27, 2024 19:31 · 9m 29s · main