Cloud Visibility For Your Security Data Lake

So you’ve decided to build a security data lake. You’re probably looking forward to the scale and price-performance of storing security data in cloud storage. $23 a month per terabyte is sweet! But that’s just storage. How will you collect and analyze the data so that you’re cutting spending without cutting corners?

Snowflake Data Exchange

Security teams that are building their data lake on Snowflake can take advantage of data shared on the Data Exchange. This is a new marketplace where vendors share live data with the community or with individual customers.

For example, Zillow makes its Zillow Home Value Index (ZHVI) data available on the Data Exchange and describes a typical use case where anyone on Snowflake can “Use the Home Value Index to understand where you might want to buy your next home or rental property using historical valuation trends.”

While Zillow’s share is a “Standard share,” meaning the data is openly available, Braze has listed a “Personalized share,” where each customer’s data is shared exclusively with them. This listing enables Braze customers to access raw events from the Braze marketing engagement platform. Data-driven marketing teams don’t need to download or copy data from Braze into their own database before analyzing it for trends and insights.

A growing roster of vendors is now listing data shares of both types in the Cybersecurity section of the Snowflake Data Exchange. The first provider to list is Lacework, a solution that provides visibility, anomaly detection, and compliance across a range of clouds and cloud-based workloads.

Lacework’s integration with the Data Exchange can help you to quickly stand up a security data lake with broad cloud coverage.

Lacework Handles Cloud Collection and Detection

Ask anyone who’s had to aggregate Azure log data and they’ll show you their scars. Azure has three different log types and separate APIs for collecting inventory and configuration data. These also change whenever the Azure team fixes issues and adds fields as security teams push to meet visibility requirements. Data collection is one of the top challenges in multi-cloud security programs.

For Lacework, broad integration with cloud infrastructure APIs is a core business requirement. These APIs provide the fuel for Lacework’s anomaly detection and compliance services. Lacework customers, most of which would gladly outsource API integration and stream ingestion to a dedicated vendor, have an opportunity to bypass the need for cloud data collection using Lacework’s Data Exchange listing.

When Lacework collects logs from their point of origin, it loads them into Lacework’s Snowflake database for analysis and investigation. Once these datasets are loaded into Snowflake, they can be shared live with the customer. A customer that has requested access to their data via the Exchange can query these logs, configurations, and findings as if they were loaded into their own Snowflake account. No API calls or copy commands required.
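Consuming a share like this is a one-time setup. As a sketch (the share and table names below are hypothetical, not Lacework's actual listing), the consumer creates a database from the share and then queries it like any local database:

```sql
-- Sketch only: share, database, and table names are assumptions.
-- Create a read-only database on top of the provider's share.
CREATE DATABASE lacework FROM SHARE lacework_provider.security_share;

-- Query the shared logs directly -- no ingestion pipeline required.
SELECT event_time, source, action
FROM lacework.public.cloudtrail_logs
WHERE event_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
LIMIT 100;
```

Because the share is live, queries always see the provider's latest data without any copy step.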

Raw Data for Compliance and Flexibility

By providing cloud logs and asset details to the customer’s security data lake, Lacework delivers more value without burning additional calories. Log data available over data sharing can be used to satisfy compliance requirements such as the PCI-DSS “implement audit trails” control. This log data, which is kept by Lacework for 90 days, can also be automatically copied into the customer’s Snowflake and kept for years.
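One way to extend retention beyond the vendor's 90-day window is a scheduled Snowflake task that appends new rows from the shared database into a table the customer owns. This is a minimal sketch, assuming the hypothetical share and column names from above:

```sql
-- Hypothetical retention job: names and schedule are assumptions.
-- Local table that you control, mirroring the shared table's shape.
CREATE TABLE IF NOT EXISTS audit.long_term.cloudtrail_logs
  LIKE lacework.public.cloudtrail_logs;

-- Nightly task that copies rows newer than what we've already kept.
CREATE TASK copy_lacework_logs
  WAREHOUSE = etl_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  INSERT INTO audit.long_term.cloudtrail_logs
  SELECT *
  FROM lacework.public.cloudtrail_logs s
  WHERE s.event_time > (SELECT COALESCE(MAX(event_time), '1970-01-01'::TIMESTAMP)
                        FROM audit.long_term.cloudtrail_logs);

ALTER TASK copy_lacework_logs RESUME;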

Access to raw visibility data also provides flexibility in reporting and applying custom business logic. Every organization has its own crown jewels, priorities and SLAs, so customers can turbocharge their security data lake by crafting SQL views and BI dashboards for internal stakeholders. These custom windows into the company’s cloud security can include alert tables, asset investigation forms, and configuration remediation graphs.
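A custom window of that kind can be as simple as a SQL view over the shared findings, which a BI tool then renders for stakeholders. A sketch, assuming hypothetical table and column names:

```sql
-- Hypothetical stakeholder view: open critical findings per cloud
-- account (table and column names are assumptions, not Lacework's schema).
CREATE VIEW reporting.security.open_critical_findings AS
SELECT account_id,
       COUNT(*)        AS open_findings,
       MIN(created_at) AS oldest_finding
FROM lacework.public.compliance_findings
WHERE severity = 'CRITICAL'
  AND status = 'OPEN'
GROUP BY account_id;
```

Views like this let each team see the slice of cloud security posture it owns, without granting direct access to the raw shared tables.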

A security data lake built on Snowflake can use data sharing to extend its boundaries into the vendor’s database. With the backing of a vendor like Lacework and raw data available live in your security data lake, you can quickly meet cloud security requirements and then build a data-driven security program at your own pace.

I believe that better data is the key to better security. These are personal posts that don’t represent Snowflake.