Cloud Visibility For Your Security Data Lake

So you’ve decided to build a security data lake. You’re probably looking forward to the scale and price performance of storing security data in cloud storage. $23 a month per terabyte is sweet! But that’s just storage. How will you collect and analyze the data so that you’re cutting spending without cutting corners?

Snowflake Data Exchange

Security teams building their data lake on Snowflake can take advantage of data shared on the Data Exchange, a new marketplace where vendors share live data with the community or with individual customers.
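Here is a minimal consumer-side sketch of what that looks like in practice, using snowflake-connector-python. The share, database, table, and credential names below are placeholders rather than an actual Exchange listing:

```python
# Sketch: mounting a vendor's shared data as a read-only database and querying it.
# All names (account, share, database, table) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="SECURITY_ANALYST",       # placeholder credentials
    password="...",
    account="my_org-my_account",
    role="ACCOUNTADMIN",           # needs the IMPORT SHARE privilege
)
cur = conn.cursor()

# Mount the vendor's share as a local, read-only database. No data is copied.
cur.execute(
    "CREATE DATABASE IF NOT EXISTS VENDOR_SECURITY_DATA "
    "FROM SHARE VENDOR_ACCOUNT.SECURITY_SHARE"
)

# Shared tables can then be queried like any other database in the account.
cur.execute("""
    SELECT event_time, event_name, source_ip
    FROM VENDOR_SECURITY_DATA.PUBLIC.CLOUDTRAIL_EVENTS
    WHERE event_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```

Because the share is mounted rather than copied, the vendor’s updates show up in the consumer’s queries immediately, and the consumer pays nothing for storing the shared data.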

Image from Zillow’s Data Exchange Listing

Lacework Handles Cloud Collection and Detection

Ask anyone who’s had to aggregate Azure log data and they’ll show you their scars. Azure has three different log types, plus separate APIs for collecting inventory and configuration data, and those APIs change whenever the Azure team fixes issues or adds fields in response to security teams pushing for more visibility. Data collection is one of the top challenges in multi-cloud security programs.
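For a sense of the moving parts, here is a hedged sketch of just one of those collection paths: pulling the Activity Log over the Azure Monitor REST API with Python’s requests library. The subscription ID and bearer token are placeholders, and the other log types each need a different endpoint on top of this one:

```python
# Sketch: listing Azure Activity Log events for the last day via the REST API.
# Subscription ID and token are placeholders; authentication is out of scope here.
import datetime
import requests

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"   # placeholder
TOKEN = "<bearer token from azure.identity or the az CLI>"  # placeholder

start = (datetime.datetime.utcnow() - datetime.timedelta(days=1)).isoformat() + "Z"
url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    "/providers/Microsoft.Insights/eventtypes/management/values"
)
params = {
    "api-version": "2015-04-01",
    "$filter": f"eventTimestamp ge '{start}'",
}

resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"}, params=params)
resp.raise_for_status()

# Print a couple of fields from each returned event.
for event in resp.json().get("value", []):
    print(event.get("eventTimestamp"), event.get("operationName", {}).get("value"))
```

Multiply that by every log type, every cloud, and every schema change, and the appeal of having a vendor deliver the data already collected and normalized becomes obvious.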

Image: the magic of read-only, zero-copy data sharing within Snowflake

Raw Data for Compliance and Flexibility

By providing cloud logs and asset details to the customer’s security data lake, Lacework delivers more value without burning additional calories. Log data available over data sharing can be used to satisfy compliance requirements such as the PCI DSS “implement audit trails” control. This log data, which Lacework retains for 90 days, can also be automatically copied into the customer’s own Snowflake account and kept for years.
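Here is a sketch of that retention pattern, assuming the shared database is already mounted. The LACEWORK_SHARE_DB, table, column, and warehouse names are illustrative, not Lacework’s actual share layout: a scheduled Snowflake task appends each day’s shared rows into a customer-owned table that outlives the 90-day window.

```python
# Sketch: copying shared log data into a customer-owned table for long-term retention.
# Database, schema, table, and column names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="SECURITY_ANALYST", password="...", account="my_org-my_account",
    warehouse="SECURITY_WH", database="SECURITY_LAKE", schema="RAW",
)
cur = conn.cursor()

# Customer-owned archive table with the same columns as the shared table
# (WHERE 1 = 0 copies the structure only, no rows).
cur.execute("""
    CREATE TABLE IF NOT EXISTS CLOUDTRAIL_ARCHIVE AS
    SELECT * FROM LACEWORK_SHARE_DB.PUBLIC.CLOUDTRAIL_EVENTS WHERE 1 = 0
""")

# Nightly task that appends the previous day's rows from the share.
cur.execute("""
    CREATE OR REPLACE TASK COPY_CLOUDTRAIL_DAILY
      WAREHOUSE = SECURITY_WH
      SCHEDULE = 'USING CRON 0 2 * * * UTC'
    AS
      INSERT INTO CLOUDTRAIL_ARCHIVE
      SELECT * FROM LACEWORK_SHARE_DB.PUBLIC.CLOUDTRAIL_EVENTS
      WHERE event_time >= DATEADD('day', -1, CURRENT_DATE())
""")
cur.execute("ALTER TASK COPY_CLOUDTRAIL_DAILY RESUME")

cur.close()
conn.close()
```

The archived copy lives in the customer’s account, so its retention, access controls, and cost are entirely under the customer’s control.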

I believe that better data is the key to better security. These are personal posts that don’t represent Snowflake.