Issue running service using onnx models
caelinsutch
PRO · OP

2 years ago

Getting the following error when deploying a Fastify server that uses ONNX - a quick Google search brings up a few GitHub threads where folks are running into this issue on Vercel (i.e. here and here) due to memory issues. Since this isn't serverless, I don't think the memory issues mentioned are applicable. The code works completely fine locally.

Error: Failed to load model because protobuf parsing failed.

at OnnxruntimeSessionHandler (/app/node_modules/@xenova/transformers/node_modules/onnxruntime-node/lib/backend.ts:16:30)

2 Replies

caelinsutch
PRO · OP

2 years ago

93bdb69d-3975-4e76-bc29-f940b0028664


caelinsutch
PRO · OP

2 years ago

Actually, this can be marked as closed. I was in a monorepo and didn't explicitly include ONNX in my server package. After adding it explicitly and upgrading versions, the issue was fixed.
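For anyone hitting the same protobuf parsing error in a monorepo: the problem here was that the ONNX runtime was only being resolved through hoisting rather than being declared by the server package itself. A sketch of what the fix might look like in the server package's package.json (version ranges are illustrative, not the exact ones used):

```json
{
  "name": "server",
  "dependencies": {
    "fastify": "^4.0.0",
    "@xenova/transformers": "^2.0.0",
    "onnxruntime-node": "^1.14.0"
  }
}
```

With the dependency declared directly, the package manager installs a consistent copy of onnxruntime-node alongside the server code instead of relying on a possibly stale or mismatched hoisted copy at the workspace root.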

