Analysing Alarms using OCI Logging Analytics

Karthic
3 min read · Dec 19, 2024


In Oracle Cloud you can create alarms to get notified when a metric threshold is breached or when a pattern is found in the logs.

There is no central place to analyse the history of all alarms. You can only view the history of an individual alarm by navigating to that alarm, and the history is retained for 90 days.

You have to rely on third-party tools like PagerDuty to analyse alarm history across the tenancy.

In this blog I will show how you can use Logging Analytics as a central tool within OCI to analyse alarms and their history.

To achieve this you need Logging Analytics enabled and a serverless function that sends the alarm data to Logging Analytics.

Step 1: Create a function and deploy it in OCI


    import io
    import oci

    def handler(ctx, data: io.BytesIO = None):
        print("Entering AIOPS function", flush=True)
        try:
            # Authenticate as the function's resource principal
            signer = oci.auth.signers.get_resource_principals_signer()
            log_analytics_client = oci.log_analytics.LogAnalyticsClient({}, signer=signer)
            # Upload the raw alarm payload to Logging Analytics
            log_analytics_client.upload_log_file(
                namespace_name="<logginganalytics_namespace>",
                upload_name="Alarm",
                log_source_name="Alarm",
                opc_meta_loggrpid="<loganalytics_loggroup>",
                filename="alarm.json",
                content_type="application/octet-stream",
                upload_log_file_body=data)
        except Exception as ex:
            print(str(ex), flush=True)

The above sample function code sends the alarm JSON to Logging Analytics. Replace the placeholders for namespace_name and opc_meta_loggrpid with real values. If you prefer not to hardcode these values, you can pass them in through the function configuration (available via the function context) instead.
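As a sketch of that configuration-driven approach, a small helper could resolve the two values from the function configuration with an environment-variable fallback. The key names LA_NAMESPACE and LA_LOG_GROUP are my own illustrative choices, not an OCI convention; inside an OCI function you would pass ctx.Config() where a plain dict is shown here.

```python
import os

def resolve_la_config(fn_config: dict) -> dict:
    """Resolve Logging Analytics settings from function configuration.

    LA_NAMESPACE / LA_LOG_GROUP are hypothetical key names; inside an
    OCI function you would pass ctx.Config() here instead of a dict.
    Falls back to environment variables, then to empty strings.
    """
    return {
        "namespace": fn_config.get("LA_NAMESPACE") or os.environ.get("LA_NAMESPACE", ""),
        "log_group": fn_config.get("LA_LOG_GROUP") or os.environ.get("LA_LOG_GROUP", ""),
    }
```

The handler would then call resolve_la_config once and pass the resulting values into upload_log_file instead of the hardcoded placeholders.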

Step 2: Create a Notifications topic with the function as a subscription. You can add an email subscription to the same topic as well.

Step 3: Configure the alarm notification with the topic you created, so you receive emails and the function is triggered as well. This sends the alarm JSON to Logging Analytics. Set the alarm message format to raw messages.
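To give a sense of what the function receives, here is a rough sketch of parsing one raw alarm message. The field names (title, type, severity, timestampEpochMillis) follow the general shape of OCI alarm messages, but treat them as illustrative and verify against an actual payload from your tenancy.

```python
import io
import json

def parse_alarm_message(data: io.BytesIO) -> dict:
    """Parse the raw alarm JSON delivered to the function.

    Field names are illustrative of the OCI alarm message shape;
    check them against a real payload from your tenancy.
    """
    msg = json.loads(data.getvalue())
    return {
        "title": msg.get("title", ""),
        "status": msg.get("type", ""),           # e.g. OK_TO_FIRING / FIRING_TO_OK
        "severity": msg.get("severity", ""),
        "fired_at_ms": msg.get("timestampEpochMillis"),
    }

sample = io.BytesIO(json.dumps({
    "title": "High CPU",
    "type": "OK_TO_FIRING",
    "severity": "CRITICAL",
    "timestampEpochMillis": 1734600000000,
}).encode())
print(parse_alarm_message(sample)["status"])  # OK_TO_FIRING
```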

I have created a JSON parser and a log source for the alarm JSON. Once an alarm fires, you can see the data in the Log Explorer like this.

The alarm JSON does not include the compartment details. To know which compartment an alarm fired in, you can include the compartment name in the alarm name or alarm summary. We can then use extended fields to extract the compartment name later if it is part of the alarm summary.
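As a sketch of that naming convention, suppose the compartment name is prefixed in square brackets in the alarm summary. The extraction (whether done in the function or expressed as an extended-field pattern in Logging Analytics) could then look like this; the bracket convention itself is my assumption, not an OCI requirement.

```python
import re

def extract_compartment(summary: str) -> str:
    """Pull a compartment name out of an alarm summary.

    Assumes the convention "[<compartment>] <rest of summary>";
    returns an empty string when the summary does not follow it.
    """
    m = re.match(r"\[([^\]]+)\]", summary)
    return m.group(1) if m else ""

print(extract_compartment("[prod-apps] High CPU on web tier"))  # prod-apps
```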

You can create dashboards to get daily, weekly, and monthly views of alarms, and to track how quickly alarms are getting resolved.
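The kind of rollups those dashboards show can be sketched in plain Python: count firing events per day, and pair each firing with the next OK transition to estimate resolution time. The event shape below is illustrative, not the exact alarm schema.

```python
from collections import Counter
from datetime import datetime, timezone

def daily_alarm_counts(events: list[dict]) -> Counter:
    """Count OK_TO_FIRING events per UTC day (event shape is illustrative)."""
    days = [
        datetime.fromtimestamp(e["ts_ms"] / 1000, tz=timezone.utc).date().isoformat()
        for e in events
        if e["type"] == "OK_TO_FIRING"
    ]
    return Counter(days)

def mean_resolution_minutes(events: list[dict]) -> float:
    """Average minutes between an alarm firing and its next OK, per title."""
    open_at: dict[str, int] = {}
    durations = []
    for e in sorted(events, key=lambda e: e["ts_ms"]):
        if e["type"] == "OK_TO_FIRING":
            open_at[e["title"]] = e["ts_ms"]
        elif e["type"] == "FIRING_TO_OK" and e["title"] in open_at:
            durations.append((e["ts_ms"] - open_at.pop(e["title"])) / 60000)
    return sum(durations) / len(durations) if durations else 0.0
```

In practice you would express these rollups as Logging Analytics queries and pin them to a dashboard; the Python above just makes the calculation explicit.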

There are other ways to achieve this as well, such as sending the data to OpenSearch or to an Autonomous Database. I picked Logging Analytics because the first 10 GB of log data per month is free (so this solution is completely free), most observability customers already use Logging Analytics, and it is easier to implement.
