Advancing Spark - Learning Databricks with DBDemos


Whether you're getting buy-in from clients and stakeholders or teaching yourself with worked examples, there's a lot in DBDemos for everyone. In this video, Simon takes a look at the various demos currently available and builds out a couple of examples!
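For anyone wanting to follow along, dbdemos is a pip-installable package driven from inside a Databricks notebook. A minimal sketch of the typical flow (the demo name shown is just one example from the catalog):

```python
# Run inside a Databricks notebook cell.
%pip install dbdemos

import dbdemos

# Browse the catalog of demos currently available.
dbdemos.list_demos()

# Install one demo into the workspace; depending on the demo this creates
# notebooks and may also set up clusters, DLT pipelines and dashboards.
dbdemos.install('lakehouse-retail-c360')
```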

Comments

This is AWESOME for learning and demo purposes!

krishnakoirala

Amazing. Heard about it recently and will definitely try it.

allthingsdata

Thanks Simon for your videos; they are informative and helpful for keeping up with Databricks' latest updates. One request from me: it would be nice if you could make a video on this use case: "In a traditional data lake we used to store all transformed data in HBase and expose that data to the external world via Phoenix through info and partner APIs." I'm not sure whether we can achieve this with the Databricks toolset alone, and I could not find any straight answer. Can you shed some light on this in your videos? Thanks again for your videos.

rxshiva
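One common pattern for the API-serving part of that question is to put a Databricks SQL warehouse behind the service and query Delta tables with the databricks-sql-connector package. A minimal sketch, assuming a running SQL warehouse; the connection details and the product_views table are placeholders:

```python
# pip install databricks-sql-connector
import os

from databricks import sql

# Hostname, HTTP path and token are placeholders for your workspace; in
# practice an info/partner API handler would reuse a pooled connection.
with sql.connect(
    server_hostname=os.environ["DATABRICKS_HOST"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],  # SQL warehouse endpoint
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        # Query the Delta table directly, instead of copying it out
        # to HBase and exposing it through Phoenix.
        cursor.execute(
            "SELECT product_id, view_count FROM analytics.product_views LIMIT 10"
        )
        for row in cursor.fetchall():
            print(row[0], row[1])
```

Whether this fully replaces HBase + Phoenix depends on latency requirements: a SQL warehouse handles interactive analytical queries well, but it is not a millisecond key-value store.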

Hello,

Your videos have been very helpful to me. I recently ran into the question of how best to serve data generated in Databricks to the applications that consume it.
Consider a hypothetical scenario: Databricks computes, in real time, the number of times each product is viewed by users, and the application needs this data when a user places an order.
I have read many documents, examples, and tutorials, but most of them focus on ETL processes that produce Delta tables for consumption by business intelligence (BI) tools. They do not cover best practices for supplying the cleansed data directly to websites or applications.
Currently, my approach is to periodically write a copy of the generated data to an OLAP store such as Azure SQL Data Warehouse (Azure SQL DW). However, this approach is not elegant, for two reasons:
1. I need to maintain two copies of the result data: one in Databricks and another in Azure SQL DW.
2. I need custom code to implement the merge, because Databricks does not support a MERGE operation when writing to Azure SQL DW.
I would like to know if there is a more elegant solution, whether that means smarter data synchronization or accessing Delta table data directly through REST APIs.
I would greatly appreciate it if you could provide me with some guidance.

rengarlee
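On the merge point specifically, the usual workaround is a two-step staging pattern: land the batch in a staging table with Spark's JDBC writer, then have the warehouse run the MERGE itself. A minimal sketch, assuming `updates_df` holds the freshly computed view counts; the server, database, secret scope and table names are all placeholders:

```python
# Step 1: overwrite a staging table in Azure SQL DW with the latest batch.
# (updates_df, the JDBC URL and the secret scope/keys are assumptions.)
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net;database=<db>"

(updates_df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.product_views_staging")
    .option("user", dbutils.secrets.get("scope", "sql-user"))
    .option("password", dbutils.secrets.get("scope", "sql-password"))
    .mode("overwrite")
    .save())

# Step 2: let the warehouse reconcile staging into the serving table, so no
# custom merge logic lives in Spark. Any SQL client works; pyodbc shown here.
import pyodbc

odbc_connection_string = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<db>;"
    "UID=<user>;PWD=<password>"
)
conn = pyodbc.connect(odbc_connection_string)
conn.execute("""
    MERGE dbo.product_views AS t
    USING dbo.product_views_staging AS s
      ON t.product_id = s.product_id
    WHEN MATCHED THEN UPDATE SET t.view_count = s.view_count
    WHEN NOT MATCHED THEN INSERT (product_id, view_count)
         VALUES (s.product_id, s.view_count);
""")
conn.commit()
conn.close()
```

If the warehouse tier in use does not support T-SQL MERGE, the same step can be expressed as an UPDATE followed by an INSERT from the staging table; the dual-copy concern remains either way unless the application can query Delta directly.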

Great tip! Thanks for the video. Does it work with Community Edition? Is there something similar for data transformation, such as a beginners' demo on Community Edition?

AlDamara-xj