How to Import Audit Logs from SonarQube Cloud to Splunk SIEM
A practical tutorial on exporting SonarQube Cloud audit logs into Splunk SIEM, enabling centralized security monitoring and compliance tracking across your software delivery pipeline.
Understanding SonarQube Cloud Audit Logs
SonarQube Cloud introduced audit log functionality to help organizations meet compliance and security requirements. Announced on October 1st in the SonarCloud community forum, these logs provide detailed records of significant events within the enterprise environment. The audit logs are accessible exclusively through the SonarQube Cloud API rather than the user interface, so programmatic access is essential for integration with security information and event management (SIEM) tools such as Splunk.
Accessing Audit Logs Through the API
To retrieve audit logs, users must first authenticate against the web API with a bearer token passed in the Authorization header. The process begins by identifying the enterprise UID, which can be obtained through the List Enterprises API endpoint; the enterprise key also appears in the URL of the SonarCloud.io enterprise dashboard. Once authenticated and equipped with the enterprise UID, users can call the audit logs API, which is documented in SonarQube's Swagger reference. The API accepts a customizable date range, though each query is limited to a 30-day window. Optional page-number and page-size parameters let users control how much data each request returns.
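The authentication and paging flow above can be sketched in Python using only the standard library. Note that the exact endpoint path and parameter names below are illustrative assumptions; confirm them against the Swagger documentation for your enterprise before use.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Base URL and endpoint path are placeholders -- verify the real values
# in SonarQube Cloud's Swagger documentation.
API_BASE = "https://api.sonarcloud.io"

def build_audit_log_request(token, enterprise_id, from_date, to_date,
                            page=1, page_size=100):
    """Build an authenticated GET request for one page of audit logs.

    The from/to range must span at most 30 days, per the API's limit.
    """
    params = urlencode({
        "enterpriseId": enterprise_id,   # UID from the List Enterprises endpoint
        "from": from_date,               # e.g. "2024-01-01"
        "to": to_date,                   # e.g. "2024-01-30"
        "pageIndex": page,
        "pageSize": page_size,
    })
    url = f"{API_BASE}/audit-logs?{params}"  # hypothetical path
    return Request(url, headers={"Authorization": f"Bearer {token}"})

req = build_audit_log_request("my-token", "uid-123", "2024-01-01", "2024-01-30")
# Send with urllib.request.urlopen(req) and parse the JSON response,
# incrementing `page` until no more records come back.
```

Looping over `page` until the response is empty is one simple way to stay within the per-request record limit while still collecting a full 30-day window.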
Integration with Splunk SIEM
To integrate audit logs with Splunk Cloud, users have two primary approaches. The simplest is to download the audit logs as JSON directly from the API using a tool like Postman, then upload the file manually through Splunk's settings interface. For a more streamlined and better-formatted integration, users can instead reshape the audit logs with a custom script (in Python, for example) so the data fits Splunk's HTTP Event Collector. This means setting the metadata fields Splunk expects, including the event host, source type, and source. Users must also generate an HTTP Event Collector token in Splunk, which serves as the authentication mechanism for data submission.
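The reshaping step might look like the following sketch, which wraps each audit entry in the JSON envelope the HTTP Event Collector expects (keys such as "event", "time", "host", "source", and "sourcetype"). The default host/source/sourcetype values here are illustrative; set them to whatever your Splunk conventions require.

```python
import json
import time

def to_hec_event(audit_entry, host="sonarcloud",
                 source="sonarqube-cloud-audit", sourcetype="_json"):
    """Wrap one audit log entry in Splunk HEC's event envelope."""
    return {
        # Reuse the entry's own timestamp when present, else "now".
        "time": audit_entry.get("time", time.time()),
        "host": host,
        "source": source,
        "sourcetype": sourcetype,
        "event": audit_entry,  # the original audit record, unchanged
    }

# Example: one (hypothetical) audit entry becomes one newline-delimited
# HEC event; HEC accepts multiple such events in a single POST body.
entries = [{"action": "USER_LOGIN", "time": 1700000000}]
payload = "\n".join(json.dumps(to_hec_event(e)) for e in entries)
```

Keeping the raw record under the "event" key means no audit detail is lost, while the metadata fields drive Splunk's indexing and search filters.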
Optimizing Data Ingestion
Once the HTTP Event Collector token is created, users can submit the formatted audit logs to Splunk with a command-line tool such as curl, posting to the services/collector/event endpoint. This approach yields cleaner, more searchable data than raw JSON uploads. Formatted events let security teams query and analyze audit activity more effectively, making compliance monitoring and security investigations significantly easier, and the structured format enables better filtering and pattern detection across the audit logs.
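The same POST can be scripted in Python instead of curl, which makes it easy to chain directly after the formatting step. HEC authenticates with an "Authorization: Splunk <token>" header; the Splunk URL and token below are placeholders.

```python
from urllib.request import Request

def build_hec_request(splunk_url, hec_token, payload):
    """Build a POST of formatted events to Splunk's HTTP Event Collector.

    splunk_url is your HEC base, e.g. "https://your-stack.splunkcloud.com:8088"
    (placeholder -- use your own stack's HEC endpoint and port).
    """
    return Request(
        f"{splunk_url}/services/collector/event",
        data=payload.encode("utf-8"),
        headers={
            # HEC uses the "Splunk" auth scheme, not "Bearer".
            "Authorization": f"Splunk {hec_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_hec_request(
    "https://example.splunkcloud.com:8088",            # placeholder URL
    "00000000-0000-0000-0000-000000000000",            # placeholder token
    '{"event": {"action": "USER_LOGIN"}, "sourcetype": "_json"}',
)
# Send with urllib.request.urlopen(req); a successful ingest returns
# a small JSON acknowledgement from HEC.
```

The equivalent curl invocation simply supplies the same header and body to the same endpoint, so either tool can be used interchangeably in automation.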
Key Takeaways
- SonarQube Cloud audit logs are accessed exclusively through the API and require bearer token authentication with an enterprise UID
- The audit logs API supports customizable date ranges of up to 30 days and can be queried using tools like Postman or curl
- Splunk integration can be accomplished through direct JSON file uploads or via the HTTP Event Collector for enhanced formatting
- Formatting audit logs with metadata fields through custom scripts improves searchability and visualization in Splunk
- Properly structured audit log integration enables organizations to meet compliance requirements and streamline security monitoring workflows