![NextGen Learning](/img/default-banner.jpg)
- 411
- 2 178 191
NextGen Learning
India
Joined 25 Jan 2017
We can discuss common Hadoop administration, cloud, and DevOps tasks. I will be uploading more videos on Hadoop administration.
For training inquiries please send email to me at hadoopengineering@gmail.com
Adding users, groups, and service principals to Databricks
Add users, groups, and service principals to Azure Databricks
Azure Databricks
Databricks user management
Databricks user on-boarding
Adding service principal to Databricks
Views: 33
Videos
Databricks Connect to ADLS using service principal
83 views · 21 hours ago
This video explains how to connect to an ADLS storage container using an Azure Entra ID service principal. Azure documentation - learn.microsoft.com/en-us/azure/databricks/connect/storage/tutorial-azure-storage #Azure Databricks #Connect Azure ADLS from Databricks
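The description above can be sketched as the Spark configuration such a connection typically uses. The storage account, tenant ID, and credentials below are hypothetical placeholders; the conf key names follow Azure Databricks' documented ADLS Gen2 OAuth pattern:

```python
def adls_oauth_conf(storage_account: str, tenant_id: str,
                    client_id: str, client_secret: str) -> dict:
    """Spark conf entries for service-principal (OAuth) access to ADLS Gen2."""
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a Databricks notebook you would then apply the settings, e.g.:
# for key, value in adls_oauth_conf("mystorage", tenant_id,
#                                   app_client_id, app_client_secret).items():
#     spark.conf.set(key, value)
```

In practice the client secret should come from a secret scope (`dbutils.secrets.get`) rather than being hard-coded in the notebook.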
Manage Azure Databricks using API
50 views · 1 day ago
Manage Databricks using the API. Databricks REST API - docs.databricks.com/api/azure/workspace/introduction
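A minimal sketch of calling that REST API with only the standard library — the workspace host and token below are hypothetical placeholders, while `/api/2.0/clusters/list` is a documented endpoint:

```python
import urllib.request

def databricks_get(host: str, token: str, path: str) -> urllib.request.Request:
    """Build an authenticated GET request against a Databricks workspace."""
    return urllib.request.Request(
        url=f"https://{host}{path}",
        headers={"Authorization": f"Bearer {token}"},  # personal access token
    )

# Against a real workspace (hypothetical host shown):
# req = databricks_get("adb-1234567890123456.7.azuredatabricks.net",
#                      token, "/api/2.0/clusters/list")
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```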
Databricks Credential Pass through
58 views · 14 days ago
Credential passthrough is deprecated starting with Databricks Runtime 15.0 and will be removed in future Databricks Runtime versions. Databricks recommends that you upgrade to Unity Catalog.
Stream processing with Apache Kafka and Azure Databricks
1.6K views · 7 months ago
This video describes how you can use Apache Kafka as a source when running Structured Streaming workloads on Azure Databricks. The video explains how to set up an Apache Kafka service and connect from Azure Databricks. Timestamps: 00:00 Create Resource Group, 01:16 Deploy Virtual Network, 06:09 Deploy Azure Databricks, 12:30 Deploy Virtual Machine, 15:17 Install and Configure Kafka, 21:20 Start Zookeeper ...
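The Kafka source in a setup like this can be sketched in plain Python. The broker address and topic name are hypothetical; the option names are Structured Streaming's documented Kafka source options:

```python
def kafka_stream_options(bootstrap_servers: str, topic: str) -> dict:
    """Options for Spark Structured Streaming's Kafka source."""
    return {
        "kafka.bootstrap.servers": bootstrap_servers,  # host:port of the broker(s)
        "subscribe": topic,                            # topic to consume
        "startingOffsets": "earliest",                 # read from the beginning
    }

# On a Databricks cluster you would use these options like:
# df = (spark.readStream.format("kafka")
#         .options(**kafka_stream_options("10.0.0.4:9092", "demo-topic"))
#         .load())
# df.selectExpr("CAST(value AS STRING) AS value")  # Kafka values arrive as bytes
```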
15 - Unity Catalog - Column Masking
1.7K views · 7 months ago
This video explains how to do column masking in a Unity Catalog-enabled Databricks workspace. More Unity Catalog videos: 01. Unity Catalog Introduction - ua-cam.com/video/yc5BHW149hs/v-deo.html 02. Unity Catalog Configuration prerequisites - ua-cam.com/video/FQnb1-kbbig/v-deo.html 03. Creating Metastore - ua-cam.com/video/vAab07QrLZk/v-deo.html 04. Create catalog schema and tables - ua-cam.com/video/Z13...
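Column masking of this kind can be sketched as two DDL statements. The table, function, and group names below are hypothetical; the `is_account_group_member` function and `SET MASK` clause follow Unity Catalog syntax:

```python
def column_mask_ddl(table: str, column: str, mask_fn: str, admin_group: str) -> list:
    """DDL that masks `column` for everyone outside `admin_group`."""
    create_fn = (
        f"CREATE OR REPLACE FUNCTION {mask_fn}({column} STRING) "
        f"RETURN CASE WHEN is_account_group_member('{admin_group}') "
        f"THEN {column} ELSE '****' END"
    )
    attach = f"ALTER TABLE {table} ALTER COLUMN {column} SET MASK {mask_fn}"
    return [create_fn, attach]

# In a notebook (hypothetical names):
# for stmt in column_mask_ddl("main.hr.employees", "ssn",
#                             "main.hr.ssn_mask", "hr_admins"):
#     spark.sql(stmt)
```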
14 - Unity Catalog - Row Level Access Control
2K views · 7 months ago
This video walks you through the configuration of row-level access control for tables. More Unity Catalog videos: 01. Unity Catalog Introduction - ua-cam.com/video/yc5BHW149hs/v-deo.html 02. Unity Catalog Configuration prerequisites - ua-cam.com/video/FQnb1-kbbig/v-deo.html 03. Creating Metastore - ua-cam.com/video/vAab07QrLZk/v-deo.html 04. Create catalog schema and tables - ua-cam.com/vid...
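A row filter of this kind is a boolean UDF bound to the table. The names and the filtering rule below are hypothetical; the `SET ROW FILTER ... ON (...)` clause follows Unity Catalog syntax:

```python
def row_filter_ddl(table: str, filter_fn: str, column: str,
                   admin_group: str, allowed_value: str) -> list:
    """DDL: admins see every row; everyone else sees only rows
    where `column` equals `allowed_value`."""
    create_fn = (
        f"CREATE OR REPLACE FUNCTION {filter_fn}({column} STRING) "
        f"RETURN is_account_group_member('{admin_group}') "
        f"OR {column} = '{allowed_value}'"
    )
    attach = f"ALTER TABLE {table} SET ROW FILTER {filter_fn} ON ({column})"
    return [create_fn, attach]

# e.g. spark.sql(stmt) for each statement returned by
# row_filter_ddl("main.sales.orders", "main.sales.region_filter",
#                "region", "sales_admins", "US")
```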
13 - Unity Catalog Lakehouse Federation
1.8K views · 8 months ago
This video introduces Lakehouse Federation, the query federation platform that enables you to use Azure Databricks to run queries against multiple external data sources. We will see how to set up Lakehouse Federation connections and create foreign catalogs in our Unity Catalog metastore. More Unity Catalog videos: 01. Unity Catalog Introduction - ua-cam.com/video/yc5BHW149hs/v-deo.html 02. Unity ...
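The two setup steps described above — a connection, then a foreign catalog over it — can be sketched as DDL. Host, credentials, and names are hypothetical placeholders; the statements follow the documented `CREATE CONNECTION` / `CREATE FOREIGN CATALOG` syntax, shown here for a PostgreSQL source:

```python
def federation_ddl(connection: str, host: str, user: str, password: str,
                   catalog: str, database: str) -> list:
    """DDL: a Lakehouse Federation connection plus a foreign catalog over it."""
    create_conn = (
        f"CREATE CONNECTION IF NOT EXISTS {connection} TYPE postgresql "
        f"OPTIONS (host '{host}', port '5432', user '{user}', password '{password}')"
    )
    create_cat = (
        f"CREATE FOREIGN CATALOG IF NOT EXISTS {catalog} "
        f"USING CONNECTION {connection} OPTIONS (database '{database}')"
    )
    return [create_conn, create_cat]

# e.g. spark.sql(stmt) for each statement returned by
# federation_ddl("pg_conn", "pg.example.com", "reader", pw, "pg_catalog", "salesdb")
```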
12 - Unity Catalog Change default Data location of Managed Table
1.8K views · 8 months ago
This video explains how to change the default location of managed tables in Unity Catalog. More Unity Catalog videos: 01. Unity Catalog Introduction - ua-cam.com/video/yc5BHW149hs/v-deo.html 02. Unity Catalog Configuration prerequisites - ua-cam.com/video/FQnb1-kbbig/v-deo.html 03. Creating Metastore - ua-cam.com/video/vAab07QrLZk/v-deo.html 04. Create catalog schema and tables - ua-cam.com/v...
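One way to achieve this, per the `MANAGED LOCATION` clause in Unity Catalog's `CREATE CATALOG` syntax, is sketched below (catalog name and ADLS URL are hypothetical):

```python
def managed_location_ddl(catalog: str, location_url: str) -> str:
    """Create a catalog whose managed tables are stored under `location_url`
    instead of the metastore's root storage."""
    return f"CREATE CATALOG IF NOT EXISTS {catalog} MANAGED LOCATION '{location_url}'"

# e.g.
# spark.sql(managed_location_ddl(
#     "finance", "abfss://finance@mystorage.dfs.core.windows.net/managed"))
```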
09- Unity Catalog - Create compute resources
1.6K views · 9 months ago
08- Databricks Unity Catalog - Azure AD as identity provider
2.6K views · 9 months ago
07- Databricks Unity Catalog Create external Location
6K views · 9 months ago
06- Unity Catalog Delta Sharing (non-Databricks consumers)
4.4K views · 1 year ago
05- Unity Catalog Capture and view data lineage
6K views · 1 year ago
04- Unity Catalog Create Catalog, Schema and Tables
11K views · 1 year ago
03- Unity Catalog Creating a metastore in Azure
12K views · 1 year ago
02-Unity Catalog Configuration - Prerequisites
12K views · 1 year ago
03 Deploy Resource Group in Azure using Terraform
1.7K views · 1 year ago
Project 02 - Process image files using Cognitive Services and store the output in Cosmos DB
301 views · 1 year ago
Copy data from ADLS to Azure SQL using Azure Data Factory
4.3K views · 1 year ago
YARN Service in Cloudera Data Platform
1.9K views · 2 years ago
Is it possible to create a new Azure subscription using Terraform?
Is the keytab file used to authenticate against the KDC? Thanks
Sir, what is the size of the monitor screen you use? Just asking for reference 🙂. I am also looking to buy a monitor for ease of my work.
I am using an LG monitor, purchased in 2021, still running without any issues. I paid 12K INR at that time; now it is down to 9K INR. LG Electronics 60 cm/24 inch Full HD IPS 1920 x 1080 LCD monitor, inbuilt speaker, HDMI x 2, VGA port, 75 Hz refresh rate, AMD FreeSync, 3-side borderless slim design - 24ML600S-W (White).
Need help: could you please add what permissions are required? I am getting the error "invalid stream on failed message".
How to create an entity in Atlas? Please do it with the latest version.
nice explanation with demo
You are awesome
Good job
Hello Sir, which course should I do to become a Databricks administrator?
Please email me. I am going to start a new training batch.
What is credential passthrough?
It is one of the ways to access a storage container. I will be publishing a video on credential passthrough.
Thank you for this simple and helpful video. Just enough information to learn the basics and build up from here.
Great video. Thanks. I noticed in your video that you published a message with name = "Jon", then next sent a message with name = "Raj". However, in your query result, row 1 is "Raj" whereas row 2 is "Jon". I was expecting the first message sent to be row 1, which would be "Jon". Any thoughts?
Audio, please
Where did you learn it, please? Is it from a book or Databricks live demos?
I did all the same, but I'm getting the below error. When I test the connection it works fine, but when I run the pipeline it immediately terminates with this error: Input Payload is invalid, validation result - '["Connector or activity name: ADLS_GEN2_LINKEDSERVICE, connector or activity type: AzureBlobFS, error: Calling partner RP EvaluatePolicyAsync returned an invalid status code 'Conflict', ReasonPhrase Conflict","Connector or activity name: AzureSqlDatabaseScalez, connector or activity type: AzureSqlDatabase, error: Calling partner RP EvaluatePolicyAsync returned an invalid status code 'Conflict', ReasonPhrase Conflict"]'
Can someone explain to me how to access files stored in the ADLS container of the storage account that is managed by Databricks? It has some files that I want to fetch and load into a Unity Catalog table.
Very informative!! Appreciate the work..
Hi, just a quick query. When I run "kadmin -p", I am getting the following error: kadmin: Cannot contact any KDC for realm, while initializing kadmin interface. I have ensured I can telnet to port 88, and I do not have an IP address or hostname issue. I have ensured my /etc/krb5.conf and kadm5.acl conf files are duly updated. Where else can I look?
Amazing series. Extremely helpful and well-put demos. Thank you, my friend.
Much appreciated!
Is it possible to use this in a production environment?
Umm, you forgot the required software in order to download Docker images. Like, what is this at 1:17?
My job failed to load; please suggest. It is asking for a service account, network, and subnetwork, and asking to fill in all the parameter details.
Great video, Thanks for sharing your knowledge !
Thank you, but I am trying to export the SCC findings to BigQuery, so I created a Pub/Sub topic and subscription, a BigQuery dataset, and tables. Pub/Sub does not push data to BigQuery (I created the table schema manually), and I could not find the auto-detect schema option in the configuration tool. The issue is that Pub/Sub data is not exporting to BigQuery and I could not figure it out. Any help would be greatly appreciated.
Hi bro, please make a video on creating a multi-node Confluent Kafka cluster on RHEL.
How to connect the API key to Pub/Sub? Why do you show half-baked stuff?
One of the best videos!!
I am very thankful to you for providing such a wonderful series of knowledge. Thank you very much.
Thank you
Thanks for sharing videos on UC. I have quick clarifications: 1. Once I share UC data with recipients, will the data processing happen at the provider or the recipient? 2. Let's say I have a view with RLS applied to one of the AAD groups, and it is Delta shared. Will RLS be applicable on the shared view?
Bring more!
I have a question here: although you changed the default location of the catalog in this scenario, will there be any change in our original storage account which is connected to the metastore?
My question is: what is the purpose of providing the location to the catalog and schema?
A really nice video. Thank you!
good one
Thank you !! Clear and easy to understand
Kindly share your email ID or phone number.
How to write 3 or 4 records? You have just entered one record or one row. Can you explain how to add more rows in the same publish message?
Can we also use a shared access signature (SAS) or service principal from the storage account to connect Azure Databricks?
Well explained!
Thank you for providing this informative series. It is greatly appreciated and has been explained thoroughly. I am eagerly anticipating sessions on DLT and autoloader from your end
Is this template capable of loading multiple CSV files from GCS, or do we need to develop a custom template for this task?
Very nicely done
Hello, where could I find the syntax for checking user status? Like here you wrote is_account_group_member. But what other functions are available for me to check things?
thank you for this valuable playlist
How to load data into Unity Catalog schema tables from Azure Storage Gen2 using Spark in Azure Databricks?
Create an external location in Databricks (which lets you connect ADLS), then create an external table using that location.
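The two steps in that answer can be sketched as DDL. The location, credential, and table names below are hypothetical; the statements follow Unity Catalog's `CREATE EXTERNAL LOCATION` and external-table syntax:

```python
def external_table_ddl(location_name: str, url: str, credential: str,
                       table: str, columns: str) -> list:
    """DDL: register an ADLS path as an external location, then create
    an external Delta table on top of it."""
    create_loc = (
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS {location_name} "
        f"URL '{url}' WITH (STORAGE CREDENTIAL {credential})"
    )
    create_tbl = (
        f"CREATE TABLE IF NOT EXISTS {table} ({columns}) "
        f"USING DELTA LOCATION '{url}'"
    )
    return [create_loc, create_tbl]

# e.g. spark.sql(stmt) for each statement returned by
# external_table_ddl("raw_loc", "abfss://raw@mystorage.dfs.core.windows.net/events",
#                    "my_cred", "main.raw.events", "id STRING, ts TIMESTAMP")
```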
Bring real-time scenarios on PySpark and Databricks, and make videos on incremental data load of batch data. Some people use ADF, some use ADB; which one to choose for better performance? How to achieve SCDs?
Same doubt here
Hi! I am a newbie in the data engineering world, and your video has been super helpful. Please can you provide the link to your dataset? Thanks.
Can we register a storage credential and an external storage account which is on another tenant? Will it work cross-tenant?
Please help: I am using SUSE 15 SP5, and whenever I try to start slapd it gives me an error stating that slapd.service does not exist.