Server Logging
The server logger can be configured to adjust the amount of detail included in the server logs, which is helpful for debugging or troubleshooting server issues. You can change the verbosity level of the logger by setting the SERVER_VERBOSITY environment variable.
The SERVER_VERBOSITY variable accepts the following values: DEBUG, INFO, WARNING, ERROR and CRITICAL.
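For example, to get the most detailed logs during a debugging session, you could export the variable before starting the server (a shell sketch; only the variable name and values come from the docs above):

```shell
# Raise server log verbosity for a debugging session.
export SERVER_VERBOSITY=DEBUG

# Confirm the value the server process will inherit.
echo "$SERVER_VERBOSITY"
```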
Server logging format
You can customize the format by setting the SERVER_LOGGING_FORMAT environment variable. By default it is set as SERVER_LOGGING_FORMAT=plaintext. If it is set to an unknown format, the default value is used instead.
Plaintext format
The plaintext logging format is used by default. It can also be set explicitly with the SERVER_LOGGING_FORMAT=plaintext environment variable.
By default, the server logs messages using a built-in plaintext template. You can customize this template by setting the SERVER_LOGGING_TEMPLATE environment variable.
Example:
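For instance, a template using Python logging format specifiers can be exported like this (the template value is illustrative):

```shell
# Illustrative logging template built from Python logging format specifiers.
# The $'…' syntax lets the shell expand \t into a real tab character.
export SERVER_LOGGING_TEMPLATE=$'%(asctime)s\t[%(name)25.25s]\t%(levelname)5s: %(message)s'

echo "$SERVER_LOGGING_TEMPLATE"
```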
Important: Format strings with special characters like \t, \n, etc. can't be passed through a .env file, as they get escaped. To pass such a string, you can:
- Use export SERVER_LOGGING_TEMPLATE=$'%(asctime)s\t[%(name)25.25s]\t%(levelname)5s: %(message)s' syntax for a local Python deployment, or for Docker Compose with exported env variables
- Use a combination of a .env file and the -e flag for a plain Docker deployment: docker run -d --env-file .env -e SERVER_LOGGING_TEMPLATE=$'%(asctime)s\t\t[%(name)25.25s]\t%(levelname)5s: %(message)s' mageai/mageai:latest
JSON format
To use the JSON logging format, set SERVER_LOGGING_FORMAT=json.
JSON formatted logs will have the following fields:
Pipelines logging
By default, logs for each pipeline run are stored in the <path_to_project>/pipelines/<pipeline_name>/.logs folder.
Learn more about logs here.
Set logging level
The default logging level is INFO. To customize the logging level for logs of block runs and pipeline runs, set the level of the logging_config in your project's metadata.yaml file.
Example logging config to only log errors:
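A minimal sketch of such a config in metadata.yaml (assuming the top-level logging_config key described above; check your Mage version's docs for the exact schema):

```yaml
logging_config:
  level: ERROR
```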
Set log retention period
To delete old logs in your persistent volume, you can specify the retention_period in the logging_config. Valid period units are 'h', 'd', and 'w'.
Example logging config:
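A sketch of such a config in metadata.yaml (the 15d value, meaning 15 days, is illustrative):

```yaml
logging_config:
  retention_period: '15d'
```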
You can also run the mage clean-old-logs [project_path] command to clean up old logs.
Block logging
A logger is available in the block's context. You can use it to log messages to the block's log file. The logger is passed into each block as a keyword argument; you can retrieve it by calling kwargs.get('logger') within the block.
Example:
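A minimal sketch of the pattern, assuming a block with a kwargs-style signature as described above (outside of Mage, we pass a logger ourselves to mimic the framework's injection; the block name and data are illustrative):

```python
import logging

logging.basicConfig(level=logging.INFO)

# Sketch of a Mage block body: the framework injects a logger via kwargs,
# so inside a real block only the kwargs.get('logger') call is needed.
def transform(data, **kwargs):
    logger = kwargs.get('logger')
    logger.info('Received %d rows', len(data))
    doubled = [x * 2 for x in data]
    logger.info('Produced %d rows', len(doubled))
    return doubled

# Simulate the framework passing a logger as a keyword argument.
result = transform([1, 2, 3], logger=logging.getLogger('block'))
print(result)  # [2, 4, 6]
```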
Logging to external destination
S3
To store logs in S3, you need to set the logging_config in your project's metadata.yaml file.
Example S3 logging config:
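A sketch of what such a config might look like (the field names under destination_config are assumptions; angle-bracket values are placeholders, and your Mage version's docs define the exact schema):

```yaml
logging_config:
  type: s3
  level: INFO
  destination_config:
    bucket: <bucket name>
    prefix: <prefix path>
```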
You can authenticate with S3 in one of the following ways:
- Configure credentials in the logging_config.
- Configure credentials in environment variables: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
- Authenticate with an IAM role: grant S3 access to the IAM role of the cloud instance.
endpoint_url can be configured to connect to S3 compatible services (e.g. MinIO).
Google Cloud Storage
To store logs in GCS, you need to set the logging_config in your project's metadata.yaml file.
Example GCS logging config:
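A sketch of what such a config might look like (field names under destination_config are assumptions; angle-bracket values are placeholders, and your Mage version's docs define the exact schema):

```yaml
logging_config:
  type: gcs
  level: INFO
  destination_config:
    path_to_credentials: <path to GCP credentials JSON file>
    bucket: <bucket name>
    prefix: <prefix path>
```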
More destinations coming…
Edit pipeline logging
When you're editing a pipeline (e.g. /pipelines/[uuid]/edit), you can execute the code for an individual block and see the output. Any print statements in the block of code are displayed in the block's output. However, you can redirect those print statements to the logs instead.
To toggle this feature, go to the pipeline settings page (e.g. /pipelines/[uuid]/settings) and check the box labeled "When running a block while editing a pipeline, output the block messages to the logs."