I am creating an application that can run either locally or on Google Cloud. For the cloud side I've set up Google Cloud Logging: I create a cloud logger and basically log through the class below.
import logging

from google.cloud import logging as gcp_logging
from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging import Resource  # lives in google.cloud.logging.resource in older releases
from google.oauth2 import service_account


class CloudLogger:
    def __init__(self, instance_id: str = LOGGER_INSTANCE_ID, instance_zone: str = LOGGER_INSTANCE_ZONE) -> None:
        self.instance_id = instance_id
        self.instance_zone = instance_zone
        # Authenticate with a service-account key file and build the Cloud Logging client
        self.cred = service_account.Credentials.from_service_account_file(CREDENTIAL_FILE)
        self.client = gcp_logging.Client(project=PROJECT, credentials=self.cred)
        # Attach log entries to the GCE instance resource
        self.res = Resource(
            type="gce_instance",
            labels={
                "instance_id": self.instance_id,
                "zone": self.instance_zone,
            },
        )
        self.hdlr = CloudLoggingHandler(self.client, resource=self.res)
        self.logger = logging.getLogger("my_gcp_logger")
        self.hdlr.setFormatter(logging.Formatter("%(message)s"))
        if not self.logger.handlers:  # avoid attaching duplicate handlers
            self.logger.addHandler(self.hdlr)
            self.logger.setLevel(logging.INFO)

    def info(self, log_this):
        self.logger.setLevel(logging.INFO)
        self.logger.info(log_this)
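Usage is basically just this (assuming the module-level constants like PROJECT and CREDENTIAL_FILE are defined):

logger = CloudLogger()
logger.info("Deployment started")  # shows up under the gce_instance resource in Cloud Logging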
I want this so that when it runs on the cloud it uses the GCP logger, and when it runs locally it uses plain Python logging. I could either pass in an argument ("Cloud" / "Local") or make it intelligent enough to work it out on its own. Either way, I want the underlying logic to be the same so that I can log to cloud or local seamlessly. How would I go about doing this?

I'm also wondering whether there is (maybe) some way to create a local logger and have those local logs forwarded to GCP when running on the cloud.
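Roughly what I have in mind is something like the sketch below (untested; running_on_gcp and get_logger are made-up names, and probing the metadata server is just one heuristic for detecting GCP): a small factory that attaches either a CloudLoggingHandler or a plain StreamHandler to the same logging.Logger, so the calling code never changes.

import logging
import urllib.request


def running_on_gcp(timeout: float = 1.0) -> bool:
    # The GCE/GKE/Cloud Run metadata server only answers from inside Google Cloud.
    req = urllib.request.Request(
        "http://metadata.google.internal/computeMetadata/v1/instance/id",
        headers={"Metadata-Flavor": "Google"},
    )
    try:
        urllib.request.urlopen(req, timeout=timeout)
        return True
    except Exception:
        return False


def get_logger(name: str = "my_gcp_logger") -> logging.Logger:
    logger = logging.getLogger(name)
    if logger.handlers:  # already configured, reuse it
        return logger
    if running_on_gcp():
        from google.cloud import logging as gcp_logging
        from google.cloud.logging.handlers import CloudLoggingHandler
        client = gcp_logging.Client()  # Application Default Credentials exist on GCP, no key file needed
        handler = CloudLoggingHandler(client)
    else:
        handler = logging.StreamHandler()  # plain console logging for local runs
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger

With that, get_logger().info("...") behaves identically in both environments; only where the records end up differs.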
With a plain logging.debug, no severity level is detected in Stackdriver Logging: the logs show up in grey because they are written as plain text. If you use the Cloud Logging formatter it works well, but locally your logs come out in FluentD format (JSON, hardly readable). On Cloud Functions it works well out of the box, because the logger is set up correctly during the function build step.
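For illustration, a minimal sketch of that split (assuming the google-cloud-logging package and Application Default Credentials on the cloud side; configure_logging is just an example name): Client.setup_logging() attaches the Cloud Logging handler to the root logger so standard logging calls keep their severity, while a plain basicConfig keeps local output readable.

import logging


def configure_logging(on_gcp: bool) -> None:
    if on_gcp:
        from google.cloud import logging as gcp_logging
        client = gcp_logging.Client()
        # Routes the standard logging module to Cloud Logging, preserving severities
        client.setup_logging(log_level=logging.DEBUG)
    else:
        # Locally, plain console output is easier to read than the FluentD/JSON format
        logging.basicConfig(
            level=logging.DEBUG,
            format="%(asctime)s %(levelname)s %(name)s: %(message)s",
        )

# After this, the same calls work in both environments:
# logging.debug("detail")  -> DEBUG severity in Cloud Logging, readable line locally
# logging.info("event")    -> INFO severity in Cloud Logging, readable line locally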