8 months ago
Hey Railway team!
I’m in a bit of a bind here 😅
I urgently need to recover some critical data from my deploy logs, specifically from June 27th to July 4th (today).
I already exported logs via the platform and CLI, but I’m hitting the 5,000 line limit, and a lot of what I need is getting cut off.
Would it be possible for you to help me export the full build logs for that date range? It’s really important and time-sensitive for me.
Here are the project details:
Project ID: 50109696-c31e-4c90-98cf-3e57adc510f6
Service ID: 831d51dc-f949-45b4-9bab-c61215e84b73
Environment ID: 2a6928bc-f486-4bca-859e-73f39a480254
Let me know if you need anything else from me — this would help a ton. Appreciate it! 🙏
Thanks,
João
13 Replies
That’s the only way I can recover some of the leads I lost, if you guys could provide it, that’d be awesome!
8 months ago
A team member (Brody) has this tool for downloading the full build logs (or at least I believe the line limit is a lot higher). Maybe this will work for you?
8 months ago
^ That's limited to 5000 log lines.
A team member (Brody) has this tool for downloading logs without a fixed 5000 log line limit.
Maybe this will work for you?
> This will download all the logs for the given deployment or service until one of the following conditions is met:
> - You reach the deployment/service's creation date
> - You reach the log retention limit (7/30/90 days depending on your account's plan)
> - You hit the API rate limit
> - You cancel the operation (Ctrl/Cmd + C)
> In any case, all the logs that have been downloaded will be saved to a file called deployment-.jsonl or service-.jsonl.
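If your pretty-printed JSON ended up split across rows in the downloaded `.jsonl` file, a best-effort rejoin is sometimes possible client-side. This is only a sketch, not official tooling: the `message` field name is an assumption about the export format, brace counting breaks on braces inside string values, and interleaved logs from concurrent requests can't be untangled this way.

```javascript
// Best-effort sketch (not official tooling): rejoin pretty-printed JSON
// that was split across rows in a downloaded .jsonl log file.
// Assumption: each row is a JSON object whose "message" field holds one
// original log line -- check your export, the field name may differ.
// Limitations: braces inside string values confuse the depth counter,
// and interleaved logs from concurrent requests cannot be untangled.
function rebuildObjects(jsonlText) {
  const objects = [];
  let buffer = "";
  let depth = 0;
  for (const row of jsonlText.split("\n").filter(Boolean)) {
    const line = JSON.parse(row).message;
    for (const ch of line) {
      if (ch === "{") depth += 1;
      else if (ch === "}") depth -= 1;
    }
    buffer += line + "\n";
    if (depth === 0) {
      const candidate = buffer.trim();
      if (candidate.startsWith("{")) {
        try {
          objects.push(JSON.parse(candidate));
        } catch (_) {
          // Not a complete JSON object (cut off or interleaved); skip it.
        }
      }
      buffer = ""; // reset for the next event
    }
  }
  return objects;
}
```

Feed it the contents of the downloaded file, e.g. `rebuildObjects(fs.readFileSync("service-.jsonl", "utf8"))`, and it returns whatever complete objects it could recover.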
8 months ago
oh neat, i didn't know you had another tool for that, thanks for sharing
8 months ago
If Railway doesn't build it, someone has to
@Brody
Thanks for the tool. I actually tried it earlier.
The issue isn’t just the 5k line limit. The real problem is how Railway is structuring the logs, probably because of the way I wrote my console.log() statements in the code.
Each part of the JSON payload ends up being logged as a separate line. So instead of getting one complete object per event, I get separate entries like "em": […], "ph": […], "fn": […], and so on. That makes it really hard to reconstruct a full lead submission, especially when logs are interleaved or cut off.
What I actually need is access to the raw output of each console.log() call as it was originally sent. Ideally, full JSON blobs or at least grouped by event.
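For context, the split happens because logging with `JSON.stringify(obj, null, 2)` emits one text line per key, and a line-based log store records each line as its own row. A minimal illustration; the `lead` object and its keys here are made up to mirror the "em"/"ph"/"fn" fields above:

```javascript
// Illustration only -- the "lead" object mirrors the em/ph/fn keys above.
const lead = { em: "user@example.com", ph: "+55 11 99999", fn: "João" };

// Pretty-printed: every key becomes its own text line, so a per-line
// log store records them as separate, unrelated rows.
const pretty = JSON.stringify(lead, null, 2);
console.log(pretty.split("\n").length); // 5 lines for this object

// Compact: one line per event, so each log row is a complete object
// that can be re-parsed later with JSON.parse.
const compact = JSON.stringify(lead);
console.log(compact.split("\n").length); // 1
```

Going forward, logging the compact form keeps each event recoverable from a single log row.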
Would it be possible for the team to help me extract that raw log data from internal storage for the service?
This is time-sensitive — I’m trying to recover real user form submissions tied to payment operations. I’d really appreciate any help you can give.
Let me know if you need timestamps or specific keys I’m looking for.
Note: AI-crafted response for swiftness and clarity, love you guys ❤️
8 months ago
Does that mean you logged JSON in pretty-printed format?
8 months ago
Yep, you logged pretty-printed JSON, so there's nothing we can do to reconstruct that for you: we store each individual line as a row in a columnar database, not each print as a whole.
But I'm glad you found a solution. Did it involve my CLI? Please share the details, I'd love to know!
8 months ago
!s
Status changed to Solved brody • 8 months ago
