December 20th, 2021
Cloud function dashboard with LimaCharlie
Christopher Luft
LimaCharlie provides everything you need to run modern cybersecurity operations, and it is also a great tool for builders.
Our powerful web application is built using the publicly accessible API. There are no magic functions: we put the full power of the platform into the hands of those capable of wielding it.
To demonstrate the kind of thing a user can easily build with LimaCharlie, we have put together an interactive, embeddable dashboard. The idea is that this project can serve as a template for anyone wanting to create customized reports, or widgets to embed into websites or SOC content management systems.
This example leverages LimaCharlie webhook outputs, Google Cloud, and Google Data Studio. Full project details, explicit instructions, and sample code can be found here.
Create Your Own
Create a Google Cloud Function
If you don’t have a project in GCP, go ahead and create one. Next, create a Google Cloud Function and give it a name.
Once the function is configured, you will want to enable the Cloud Build API and use the small piece of Node.js code shared in this repo.
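For reference, here is a minimal sketch of what such a webhook handler might look like; the actual code in the repo may differ, and the function name used here is just a placeholder. It accepts the detection LimaCharlie POSTs to the Trigger URL and writes it to the function's logs.

```javascript
// index.js — a minimal sketch of a webhook handler (not the exact repo code).
exports.handleDetection = (req, res) => {
  // The Functions Framework parses JSON request bodies automatically.
  const detection = req.body;

  // Anything written to stdout shows up in this function's Cloud Logging entries.
  console.log(JSON.stringify(detection));

  res.status(200).send('ok');
};
```

Whatever the function writes to standard output lands in Cloud Logging, which is what the Log Router sink described below will route into BigQuery.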
Once you have created your Cloud Function, click into it and navigate to the TRIGGER tab. Copy the Trigger URL to your clipboard and head over to the LimaCharlie web application.
Create an Output in LimaCharlie
Inside the web application, click Outputs and then Add Output. From here, select the Detection stream, which forwards only events associated with detections inside your organization.
Next, select Webhook as the Output Destination. In the Output Destination field, paste the Trigger URL of your Google Cloud Function.
In the Advanced Options you can choose to:
Wrap JSON Event with Event Type
Flatten JSON to single level
Once you have saved the output, all detections made by the rules governing that organization will be forwarded to, and processed by, the Google Cloud Function.
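Before real detections start flowing, you can smoke-test the function by posting a sample payload to the Trigger URL yourself. The snippet below is a rough sketch for Node 18+ (which ships a global fetch); the URL and the detection fields are placeholders rather than an exact LimaCharlie detection.

```javascript
// test-webhook.js — post a fake detection to the Cloud Function (Node 18+).
const TRIGGER_URL = 'https://REGION-PROJECT.cloudfunctions.net/YOUR_FUNCTION';

// Placeholder payload; real LimaCharlie detections will carry more fields.
const sampleDetection = {
  cat: 'suspicious-powershell',
  detect: { event: { COMMAND_LINE: 'powershell.exe -enc ...' } },
  routing: { hostname: 'test-host' },
};

fetch(TRIGGER_URL, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(sampleDetection),
})
  .then((res) => console.log('Function responded with status', res.status))
  .catch((err) => console.error('Request failed:', err));
```

If the function logs the payload, you should see it in the function's logs in the Cloud Console, which is exactly what the Log Router in the next step keys off of.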
Google Cloud Platform → Log Router
Navigate to the Log Router and click CREATE SINK. Work through the wizard using the following configuration.
First, select the Sink Service and choose BigQuery.
Next, choose to create a new BigQuery dataset.
Create your inclusion filter. For this example we are simply using “powershell”.
With the sink created, all new log entries matching your inclusion criteria will automatically populate the new BigQuery dataset.
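If you prefer the command line, an equivalent sink can be created with gcloud; the sink, project, and dataset names below are placeholders, and the dataset is assumed to already exist.

```bash
gcloud logging sinks create lc-detections-sink \
  bigquery.googleapis.com/projects/YOUR_PROJECT_ID/datasets/YOUR_DATASET \
  --log-filter='powershell'
```

A sink created this way writes through its own service account, which may still need to be granted access to the dataset; the console wizard typically handles that grant for you.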
Google Data Studio
Now that the BigQuery dataset is part of our Google Cloud project, we can easily connect it to Google Data Studio as a data source. When the Connect to data window appears, select the BigQuery option and follow the swim-lane (Project → Dataset → Table → Configuration) until you reach the desired data source.
With the data source connected we can create a new report and insert the desired components of our visualization.