9 months ago
Hello Railway Support Team,
I'm encountering two primary issues with deploying my Python/FastAPI application and would greatly appreciate your assistance.
Summary of Issues:
Docker Image Size Exceeds Limit: Recent deployments are failing due to the Docker image size reaching 7.5GB, which exceeds my plan's 4GB limit.
My local project folder size is approximately 1GB.
I've configured a .dockerignore file to exclude unnecessary files. (Can share .dockerignore content if needed).
Build logs indicate Nixpacks is using the ghcr.io/railwayapp/nixpacks:ubuntu-1745885067 base image and running apt-get install -y ffmpeg followed by pip install -r requirements.txt. (Can share requirements.txt content if needed).
(Intermittent or Previous Issue) Nixpacks Checksum/Cache Related Error: Prior to the image size issue, or intermittently, I've also encountered Nixpacks build errors similar to this:
ERROR: failed to solve: failed to compute cache key: failed to calculate checksum of ref ... "/.nixpacks/nixpkgs-...nix": not found
Key Troubleshooting Steps Taken:
Configured and attempted to optimize the backend/.dockerignore file.
Attempted various methods for setting environment variables via both the Railway CLI and the web UI (Raw Editor).
Set the NO_CACHE=1 environment variable in the service to disable build layer caching (however, the image size issue persists).
Reviewed and adjusted environment variable loading logic in main.py.
Removed unnecessary backup/temporary Python files from the project.
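For reference, the environment variable loading in main.py now follows a pattern like this (a simplified sketch; the helper name and variables are illustrative, not my exact code):

```python
import os
from typing import Optional

try:
    # python-dotenv==1.0.0 is in requirements.txt; locally it loads a .env file.
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    # On Railway, variables come from the service environment instead.
    pass

def get_setting(name: str, default: Optional[str] = None) -> str:
    """Read a setting from the environment; raise if required and missing."""
    value = os.getenv(name, default)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

This way a missing required variable fails loudly at startup instead of surfacing later as a confusing runtime error.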
Specific Questions and Assistance Requested:
Regarding Image Size Issue:
What is the approximate base size of the ghcr.io/railwayapp/nixpacks:ubuntu-1745885067 builder image?
Considering my requirements.txt and the ffmpeg installation, is an image size of 7.5GB on an Ubuntu base typical, or is there likely another factor contributing significantly to this size?
Could you provide information on the main system packages (apt packages, etc.) that Nixpacks might be installing by default or based on my requirements.txt dependencies, apart from ffmpeg?
Are there any other Nixpacks-level configurations or optimization strategies I can try to reduce the image size (e.g., options to select a lighter base image, exclude specific system packages)?
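If there is no Nixpacks-level fix, I am also considering switching to a custom Dockerfile. A minimal sketch I drafted is below (python:3.11-slim as the base is my assumption and untested against my dependencies; the apt cache cleanup is the standard pattern for keeping layers small):

```dockerfile
# Hypothetical slim-base Dockerfile; python:3.11-slim compatibility is unverified.
FROM python:3.11-slim

# Install ffmpeg without recommended extras, then remove the apt package lists
# so they do not bloat the image layer.
RUN apt-get update \
    && apt-get install -y --no-install-recommends ffmpeg \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Would this kind of approach be recommended on Railway if Nixpacks options are insufficient?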
Regarding Nixpacks Checksum/Cache Error:
What are the common causes and solutions for this type of error? Is there a definitive platform-level way to achieve the equivalent of "clear build cache and redeploy"?
My project's requirements.txt is as follows:
fastapi==0.104.1
pydantic==2.4.2
yt-dlp==2023.11.16
python-dotenv==1.0.0
google-cloud-texttospeech==2.16.2
google-auth==2.23.4
openai==1.2.4
requests==2.31.0
uvicorn==0.24.0
psycopg2-binary==2.9.9
Any advice or assistance you can provide to resolve these issues would be highly appreciated. I can provide any additional information needed (full build logs, file contents, etc.).
Thank you.
1 Reply
9 months ago
This thread has been marked as public for community involvement, as it does not contain any sensitive or personal information. Any further activity in this thread will be visible to everyone.
Status changed to Open brody • 9 months ago
9 months ago
I think your image is getting huge because ffmpeg plus all the system packages Nixpacks installs add up, and the apt cache probably isn't cleaned up after the install. Some big files might also be sneaking in from your project folder if your .dockerignore isn't excluding everything unnecessary (like .git, __pycache__, temp files, etc).
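As a starting point, a .dockerignore along these lines usually covers the common offenders (adjust paths to your repo; the media patterns are just a guess since you use yt-dlp/ffmpeg and may have downloaded files lying around):

```
.git
__pycache__/
*.pyc
*.pyo
.venv/
venv/
.env
*.log
*.mp4
*.mp3
tmp/
backups/
```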
For the checksum error, it's usually a caching issue with Nixpacks or files going missing during the build. Setting NO_CACHE=1 helps sometimes, but you might also try forcing a rebuild by tweaking something minor (like adding a dummy environment variable) or, if it persists, deleting and recreating the service.
If you can share your .dockerignore and build logs, it would help me help you.
Status changed to Closed brody • 9 months ago