Logs Shown as Error (Celery, Beat, FastAPI)

fehmisener
HOBBY

a month ago

Hello lads,

As you can see from the image below, my Celery Beat and FastAPI worker logs appear as error logs. I'm not sure whether this is a common issue or something specific to my setup.

This is my logging.py implementation

import logging
import sys
from typing import Dict

from src.config import settings  # application settings object (import path assumed)


def setup_logging() -> None:
    """Configure application logging."""
    
    # Create a shared formatter: timestamp, logger name, level, message
    formatter = logging.Formatter(
        fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S",
    )
    
    # Create console handler
    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setFormatter(formatter)
    
    # Get the root logger; guard against stacking duplicate handlers
    # if setup_logging() is called more than once
    root_logger = logging.getLogger()
    root_logger.setLevel(getattr(logging, settings.log_level.upper()))
    if not root_logger.handlers:
        root_logger.addHandler(console_handler)
    
    # Configure specific loggers
    configure_loggers()


def configure_loggers() -> None:
    """Configure specific logger levels."""
    
    # Also set logging level for some problematic root modules
    logging.getLogger("azure").setLevel(logging.WARNING)
    logging.getLogger("msal").setLevel(logging.WARNING)
    logging.getLogger("urllib3").setLevel(logging.WARNING)
    logging.getLogger("httpx").setLevel(logging.WARNING)
    
    logger_configs: Dict[str, str] = {
        "uvicorn": "INFO",
        "uvicorn.error": "INFO",
        "uvicorn.access": "INFO" if settings.debug else "WARNING",
    }
    
    for logger_name, level in logger_configs.items():
        logger = logging.getLogger(logger_name)
        logger.setLevel(getattr(logging, level))


def get_logger(name: str) -> logging.Logger:
    """
    Get a logger instance.
    
    Args:
        name: Logger name
        
    Returns:
        logging.Logger: Logger instance
    """
    return logging.getLogger(name)


def configure_worker_logging() -> None:
    """
    Configure logging specifically for worker processes.
    This should be called in worker initialization to ensure
    external library logs are suppressed in workers.
    """
    # Force configuration of external loggers in worker processes
    external_loggers = ["..."]  # placeholder: fill in the noisy library logger names
    
    for logger_name in external_loggers:
        logger = logging.getLogger(logger_name)
        logger.setLevel(logging.WARNING)
        logger.propagate = True  # Ensure it respects parent logger settings
        
        # Also disable handlers to prevent duplicate logging
        logger.handlers = []
    
    # Set a more aggressive filter on the root logger for worker processes
    # This catches any loggers we might have missed
    root_logger = logging.getLogger()
    
    class ExternalLibraryFilter(logging.Filter):
        """Filter to suppress verbose logs from external libraries."""
        
        def filter(self, record):
            # Allow all logs from our own modules
            if record.name.startswith("src."):
                return True
            
            # Allow critical and error logs from external libraries    
            if record.levelno >= logging.ERROR:
                return True
                
            # Suppress info/debug logs from known verbose external libraries
            external_prefixes = [
                "azure", "msal", "msgraph", "httpx", "httpcore", "urllib3", 
                "requests", "openai", "anthropic", "botbuilder", "kiota"
            ]
            
            for prefix in external_prefixes:
                if record.name.startswith(prefix):
                    return False
                    
            return True
    
    # Add the filter to existing handlers
    for handler in root_logger.handlers:
        handler.addFilter(ExternalLibraryFilter())


def enable_debug_logging_for_library(library_name: str) -> None:
    """
    Temporarily enable debug logging for a specific external library.
    Useful for troubleshooting integration issues.
    
    Args:
        library_name: Name of the library (e.g., 'azure', 'httpx', 'msgraph')
    """
    logger = logging.getLogger(library_name)
    logger.setLevel(logging.DEBUG)
    
    # Remove any filters that might suppress the logs
    # (iterate over a copy, since removeFilter mutates handler.filters)
    for handler in logging.getLogger().handlers:
        for filter_obj in list(handler.filters):
            if type(filter_obj).__name__ == "ExternalLibraryFilter":
                handler.removeFilter(filter_obj)
                
    print(f"DEBUG: Enabled debug logging for {library_name}")


def reset_logging_filters() -> None:
    """Reset logging filters and re-apply the external library suppression."""
    configure_worker_logging()

Awaiting User Response · $10 Bounty

4 Replies

Railway
BOT

a month ago

Our team is working on getting back to you as soon as possible. In the meantime, we've found the following might help you get unlocked faster:

If you find the answer from one of these, please let us know by solving the thread! Otherwise, we’ll be in touch.


idiegea21
HOBBYTop 10% Contributor

23 days ago

Hi man,

To fix this issue of Railway marking your Celery or FastAPI logs as "errors" despite them being at INFO level:

- Route all logs explicitly to stdout (not stderr) by using StreamHandler(sys.stdout).
- Apply your logging setup in Celery with the @setup_logging.connect signal, so Celery does not install its own handlers over yours.
- Simplify your log format so the level appears early (e.g., "%(levelname)s: %(message)s").
- Avoid logging complex structures like tuples or dictionaries that could confuse Railway's log parser.

These changes help prevent benign logs from being misclassified as errors in the Railway UI; a minimal sketch of the Celery wiring follows.
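For reference, a minimal sketch of hooking your own setup_logging() into Celery via the setup_logging signal, so Celery doesn't install its default stderr-bound handlers. The module path src.logging and the app name "worker" are assumptions, not from the original post:

from celery import Celery
from celery.signals import setup_logging as celery_setup_logging

from src.logging import setup_logging  # assumed path to the setup_logging() above

app = Celery("worker")  # app name is an assumption


@celery_setup_logging.connect
def on_celery_setup_logging(**kwargs) -> None:
    """Replace Celery's default logging (which writes to stderr)
    with the application's stdout-based configuration."""
    setup_logging()

Connecting any receiver to this signal tells Celery to skip its own logging configuration entirely, which is what keeps the INFO logs on stdout.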


idiegea21
(quoted reply above)

fehmisener
HOBBY

23 days ago

I'll try, thanks a lot


Status changed to Solved · brody · 18 days ago


idiegea21
(quoted reply above)

18 days ago

For future reference, our logging system does not take the level printed in plaintext into account.

We only parse the level from stdout vs. stderr, or from JSON logging with a level attribute. And if you are logging JSON, nothing you log can confuse the parser as long as it's valid JSON.
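Building on that, here is a minimal stdlib-only sketch of emitting one JSON object per line to stdout with a level attribute. Only "level" is confirmed by the reply above; the other field names are assumptions:

import json
import logging
import sys


class JsonFormatter(logging.Formatter):
    """Serialize each log record as one JSON object per line."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname.lower(),  # the attribute the parser reads
            "message": record.getMessage(),
            "logger": record.name,
            "time": self.formatTime(record),
        })


handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logging.getLogger().addHandler(handler)
logging.getLogger().setLevel(logging.INFO)

logging.getLogger("src.demo").info("shows up as info, not error")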


Status changed to Awaiting User Response · Railway · 18 days ago