Level Up - Automatically tokenize sensitive data with DLP and Dataflow

Welcome to Level Up: From Zero - the show where we build solutions with Google Cloud Platform, hands-on.
In this episode, Solution Architect Anant Damle shows how to use Dataflow and the Data Loss Prevention (DLP) API to automatically tokenize and encrypt sensitive data. One of the daunting challenges of migrating data to the cloud is managing sensitive data, which can live in structured sources such as analytics tables or in unstructured sources such as chat history and transcription records. You can use Cloud DLP to identify sensitive data in both kinds of sources and then tokenize the sensitive parts.
00:35 - Introduction and explanation of tokenization
01:35 - Using encryption with tokenized data
02:45 - Automatic tokenization architecture overview
04:26 - Step 1: Flatten & sample
06:26 - Step 2: Batch & identify
09:20 - Step 3: Tokenize
09:42 - Hands-on demo
13:03 - Wrap-up and resource links
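To give a feel for what the tokenize step above looks like in code, here is a minimal sketch using the google-cloud-dlp Python client. This is not from the episode: the project ID, info type, and raw demo key are placeholder assumptions, and a real pipeline would run this call from inside Dataflow with a KMS-wrapped key rather than an in-memory one.

    # Minimal sketch of DLP-based tokenization (deterministic crypto tokens).
    # Assumes a GCP project with the DLP API enabled and application-default
    # credentials; the raw demo key stands in for a KMS-wrapped key.
    import os

    from google.cloud import dlp_v2

    PROJECT_ID = "your-project-id"  # hypothetical placeholder

    dlp = dlp_v2.DlpServiceClient()
    parent = f"projects/{PROJECT_ID}/locations/global"

    # Detect email addresses in free text.
    inspect_config = {"info_types": [{"name": "EMAIL_ADDRESS"}]}

    # Replace each finding with a deterministic, reversible crypto token.
    deidentify_config = {
        "info_type_transformations": {
            "transformations": [
                {
                    "primitive_transformation": {
                        "crypto_deterministic_config": {
                            "crypto_key": {"unwrapped": {"key": os.urandom(32)}},
                            "surrogate_info_type": {"name": "EMAIL_TOKEN"},
                        }
                    }
                }
            ]
        }
    }

    response = dlp.deidentify_content(
        request={
            "parent": parent,
            "inspect_config": inspect_config,
            "deidentify_config": deidentify_config,
            "item": {"value": "Contact someone@example.com for details."},
        }
    )
    print(response.item.value)  # original text with the email replaced by a token

Because the transformation is deterministic, the same input always produces the same token, which preserves joins across tables; holders of the same key can later reverse the tokens with a reidentify call.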
Hands-on training with Qwiklabs
Follow us on Twitter
Follow us on Facebook