
Optimum neuron LLM inference cache builder #29

Triggered via schedule: September 29, 2024, 00:26
Status: Failure
Total duration: 4h 13m 45s
Matrix: Create optimum-neuron inference cache

Annotations

1 error
Create optimum-neuron inference cache (mixtral)
The self-hosted runner: aws-inf2-48xlarge-use1-public-80-xgb8v-runner-28s4b lost communication with the server. Verify the machine is running and has a healthy network connection. Anything in your workflow that terminates the runner process, starves it for CPU/Memory, or blocks its network access can cause this error.
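For context, a workflow of this shape is typically a scheduled GitHub Actions job that fans out over a matrix of models and runs each entry on a self-hosted AWS Inferentia (inf2) runner. The sketch below is an assumption-based illustration, not the actual optimum-neuron workflow definition: the cron schedule, runner labels, model list, timeout, and cache-building script are placeholders inferred only from the run metadata and error above.

```yaml
# Hypothetical sketch of a scheduled matrix cache-builder workflow.
# All names, labels, and commands below are assumptions, not the real
# optimum-neuron workflow file.
name: Optimum neuron LLM inference cache builder

on:
  schedule:
    - cron: "0 0 * * 0"   # weekly trigger (actual schedule assumed)

jobs:
  cache:
    name: Create optimum-neuron inference cache
    strategy:
      fail-fast: false
      matrix:
        model: [mixtral]   # only the failing entry is shown; other models omitted
    runs-on: [self-hosted, aws-inf2-48xlarge]   # runner labels assumed from the error message
    timeout-minutes: 300
    steps:
      - uses: actions/checkout@v4
      - name: Build inference cache
        # Placeholder step: the real cache-building command lives in the repository.
        run: python tools/build_inference_cache.py --model ${{ matrix.model }}
```

In a setup like this, a "lost communication with the server" failure on a large-model matrix entry such as mixtral usually points at the causes named in the annotation itself: the job starving the runner process of CPU or memory, terminating it, or cutting off its network access partway through the multi-hour build.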