OCI MySQL database log collection using Logging Analytics

Karthic
4 min read · Nov 2, 2023


This blog will show how you can set up data collection from MySQL tables into Logging Analytics for analysis.

Prerequisites: a MySQL DB System already created in OCI (Oracle Cloud Infrastructure), and a compute instance that is allowed to connect to that MySQL instance. On that compute instance, enable the Management Agent plugin under the Oracle Cloud Agent tab if it is not already enabled.

Connect to MySQL as the admin user, create a new user, and grant SELECT on the databases you want to collect data from. I am using Cloud Shell with the ephemeral private network setup to connect to the database and run MySQL queries.

mysqlsh -h <host> -u <adminuser>

Enter the password when prompted and switch to SQL mode by typing \sql.

show databases; (To list all the databases)
use <databasename>; (To use a specific database)
\q (To quit)

The host details can be found under Endpoints on the MySQL DB System page.
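If you prefer the CLI, the same endpoint details can be retrieved there. A minimal sketch, assuming the OCI CLI is configured and <dbsystem_ocid> is the OCID of your DB System:

# Print the DB System's endpoint IP address (key names follow the CLI's JSON output)
oci mysql db-system get --db-system-id <dbsystem_ocid> \
  --query 'data.endpoints[0]."ip-address"' --raw-output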


CREATE USER '<mysql_user>'@'<host_name>' IDENTIFIED BY '<password>';
GRANT SELECT ON *.* TO '<mysql_user>'@'<host_name>' WITH GRANT OPTION;
# To grant SELECT on a specific database instead
GRANT SELECT ON <database_name>.* TO '<mysql_user>'@'<host_name>' WITH GRANT OPTION;
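You can verify the grants took effect before moving on:

SHOW GRANTS FOR '<mysql_user>'@'<host_name>';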

Create an entity in Logging Analytics for the MySQL database instance. Navigate to Logging Analytics → Administration → Entities.

Fill in the property details required for the entity.
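The entity can also be created with the OCI CLI instead of the console. A minimal sketch, assuming the CLI is configured; the entity type name shown here (omc_mysql_db) is an assumption, so confirm the exact type name on the Entities page:

oci log-analytics entity create \
  --namespace-name <loganalytics_namespace> \
  --compartment-id <compartment_ocid> \
  --name mysqldev \
  --entity-type-name omc_mysql_db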

SSH to the instance where the Management Agent plugin is enabled and which is allowed to connect to the MySQL instance. You can configure the network security group / security list to allow the connection.
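Before configuring the agent, it is worth confirming the host can actually reach the database. A quick check, assuming the default MySQL port 3306 and that mysqlsh is installed on the instance:

# Confirm the MySQL port is reachable from the agent host
nc -zv <host> 3306
# Confirm the collection user can log in and run a query
mysqlsh -h <host> -u loguser --sql -e "SELECT 1;"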

Create a JSON file with the credentials to connect to the database, using the user created earlier with the SELECT grant. For example, the entity created is mysqldev, the user is loguser, and the password is XXX1234xxx:

{
  "source": "lacollector.la_database_sql",
  "name": "LCAgentDBCreds.mysqldev",
  "type": "DBCreds",
  "usage": "LOGANALYTICS",
  "disabled": "false",
  "properties": [
    {
      "name": "DBUserName",
      "value": "CLEAR[loguser]"
    },
    {
      "name": "DBPassword",
      "value": "CLEAR[XXX1234xxx]"
    }
  ]
}

Once the file is created, run the following command to add the credential to the agent.

cat <filename>.json | sudo -u oracle-cloud-agent /var/lib/oracle-cloud-agent/plugins/oci-managementagent/polaris/agent_inst/bin/credential_mgmt.sh -o upsertCredentials -s logan

If the password was entered incorrectly, you can delete the credential and add it again. Create a JSON file containing just the source and name:

{
  "source": "lacollector.la_database_sql",
  "name": "LCAgentDBCreds.mysqldev"
}

cat <filename>.json | sudo -u oracle-cloud-agent /var/lib/oracle-cloud-agent/plugins/oci-managementagent/polaris/agent_inst/bin/credential_mgmt.sh -o deleteCredentials -s logan

Once the credential is added, you can either enable auto-association or manually associate the log sources with the entity.

Auto-Association Enable or Disable
Manual Association

Check the agent collection warnings if there is any issue, or check the agent logs for troubleshooting. If there are no issues, you should see logs associated with your log source. As an example, I have associated the MySQL Error Logs Stored in Database source.

MySQL Audit logs integration with Logging Analytics:

You need to have auditing enabled in MySQL using the audit plugin. Please refer to this doc to enable auditing in MySQL.
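You can confirm the plugin is active before building on top of it:

SELECT PLUGIN_NAME, PLUGIN_STATUS
FROM information_schema.PLUGINS
WHERE PLUGIN_NAME LIKE 'audit%';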

Create a new database and a view to read the audit logs.

CREATE DATABASE LogAnalytics;
USE LogAnalytics;

CREATE VIEW `aud_view` AS
SELECT
  `audit_log`.`ts` AS `ts`,
  `audit_log`.`tsid` AS `tsid`,
  `audit_log`.`class` AS `class`,
  `audit_log`.`event` AS `event`,
  `audit_log`.`login_ip` AS `login_ip`,
  `audit_log`.`login_user` AS `login_user`,
  `audit_log`.`connection_id` AS `connection_id`,
  `audit_log`.`status` AS `status`,
  `audit_log`.`connection_type` AS `connection_type`,
  `audit_log`.`c_client_name` AS `c_client_name`,
  `audit_log`.`c_client_version` AS `c_client_version`,
  `audit_log`.`command` AS `command`,
  `audit_log`.`sql_command` AS `sql_command`,
  `audit_log`.`command_status` AS `command_status`,
  `audit_log`.`query` AS `query`
FROM json_table(
  audit_log_read(
    concat('{ "start": { "timestamp": "',
           convert(date_format((utc_timestamp() - interval 90 second), '%Y-%m-%d %H:%i:%s') using utf8mb4),
           '"}, "max_array_length": 1000 } ')
  ),
  '$[*]' COLUMNS (
    `ts` timestamp PATH '$.timestamp',
    `tsid` int PATH '$.id',
    `class` varchar(20) CHARACTER SET utf8mb4 PATH '$.class',
    `event` varchar(80) CHARACTER SET utf8mb4 PATH '$.event',
    `login_ip` varchar(200) CHARACTER SET utf8mb4 PATH '$.login.ip',
    `login_user` varchar(200) CHARACTER SET utf8mb4 PATH '$.login.user',
    `connection_id` varchar(80) CHARACTER SET utf8mb4 PATH '$.connection_id',
    `status` int PATH '$.status',
    `connection_type` varchar(40) CHARACTER SET utf8mb4 PATH '$.general_data.connection_type',
    `c_client_name` varchar(80) CHARACTER SET utf8mb4 PATH '$.general_data.c_client_name',
    `c_client_version` varchar(80) CHARACTER SET utf8mb4 PATH '$.general_data.c_client_version',
    `command` varchar(40) CHARACTER SET utf8mb4 PATH '$.general_data.command',
    `sql_command` varchar(40) CHARACTER SET utf8mb4 PATH '$.general_data.sql_command',
    `command_status` varchar(40) CHARACTER SET utf8mb4 PATH '$.general_data.command_status',
    `query` text CHARACTER SET utf8mb4 PATH '$.general_data.query'
  )
) AS `audit_log`;

You can use a SELECT query to fetch data from the view you created.

select ts, tsid, class, event, login_ip, login_user, connection_id, status, connection_type, c_client_name, c_client_version, command, sql_command, command_status, query from LogAnalytics.aud_view;
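The view can also be filtered directly, which is handy for spot checks. For example, assuming a non-zero status indicates a failed event:

SELECT ts, login_user, login_ip, command, status
FROM LogAnalytics.aud_view
WHERE status <> 0;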
