Load data from Azure Blob Storage into Python

Code below:

from datetime import datetime, timedelta
import pandas as pd
from azure.storage.blob import BlobServiceClient, generate_blob_sas, BlobSasPermissions

#enter credentials
account_name = 'ACCOUNT NAME'
account_key = 'ACCOUNT KEY'
container_name = 'CONTAINER NAME'

#create a client to interact with blob storage
connect_str = ('DefaultEndpointsProtocol=https;AccountName=' + account_name
               + ';AccountKey=' + account_key + ';EndpointSuffix=core.windows.net')
blob_service_client = BlobServiceClient.from_connection_string(connect_str)

#use the client to connect to the container
container_client = blob_service_client.get_container_client(container_name)

#get a list of all blob files in the container
blob_list = [blob.name for blob in container_client.list_blobs()]

df_list = []
#generate a shared access signature for each file and load it into Python
for blob_i in blob_list:
    #generate a shared access signature for each blob file
    sas_i = generate_blob_sas(account_name=account_name,
                              container_name=container_name,
                              blob_name=blob_i,
                              account_key=account_key,
                              permission=BlobSasPermissions(read=True),
                              expiry=datetime.utcnow() + timedelta(hours=1))

    #build a SAS URL and read the blob (assumed here to be a CSV) into pandas
    sas_url = ('https://' + account_name + '.blob.core.windows.net/'
               + container_name + '/' + blob_i + '?' + sas_i)
    df_list.append(pd.read_csv(sas_url))
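
After the loop, df_list holds one DataFrame per blob file. A common next step (not shown in the video) is to stack them into a single table:

#combine the per-file DataFrames into one (assumes the files share columns)
df = pd.concat(df_list, ignore_index=True)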

Comments

Thanks a lot for this very clear video. I spent hours trying to do this until I luckily stumbled across your video. I agree that this video should definitely have more views!!

EwaneGigga

Videos like yours should have way more views. Thank you for what you do.

CapitanFeeder

Fantastic video. Very clear explanation and clean code for us to follow. Thank you!

kline

What if I have multiple directories inside the container and the blob files are inside those directories?

thisissuvv
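
On nested paths: a blob container is flat, and list_blobs already walks "directories", which are just name prefixes. A minimal sketch reusing container_client from the code above, where 'subfolder/' is a hypothetical prefix:

#list_blobs returns every blob in the container, including ones whose
#names contain virtual folders such as 'subfolder/file.csv'
blob_list = [blob.name for blob in container_client.list_blobs()]

#to restrict the listing to one "directory", pass its prefix
sub_list = [blob.name for blob in
            container_client.list_blobs(name_starts_with='subfolder/')]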

What happens when you already have a SAS token on hand? Can it be used in place of the account key?

investing
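
If a SAS token is already on hand, generate_blob_sas is unnecessary; the token itself can be passed as the credential. A minimal sketch, assuming sas_token grants read and list permissions on the container:

from azure.storage.blob import BlobServiceClient

sas_token = 'SAS TOKEN'  #assumed to grant read/list on the container
account_url = 'https://' + account_name + '.blob.core.windows.net'

#authenticate with the SAS token instead of the account key
blob_service_client = BlobServiceClient(account_url=account_url, credential=sas_token)
container_client = blob_service_client.get_container_client(container_name)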

Ooh. I support a SaaS app and _hate_ Azure Storage Explorer with a burning passion. If I can access logs etc. from Python instead of ASE, that would be a very happy rabbit hole to go down. I suspect I don't have access to those keys, though.

RoamingAdhocrat

I have a Python script that reads an EDI file and, from there, creates unique data tags and elements (basically a CSV file with one tag and data field per line). I need a process to load this into Azure and, for the outbound, to extract into the same tags and data. This looks close. Anyone interested in giving me a quote for this (can you show it working)? Thanks.

remorabay

I have an image dataset stored in the Azure datastores file storage. I have a model in Azure ML Studio. So how do I access the dataset?

ArbaazShaikh-yt

Getting HTTP Error 403: This request is not authorized to perform this operation using this resource type.

satyakipradhan

Do you have any suggestions for how to then write a file in a similar fashion to the storage blob?

ericbixby
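
Writing back is the mirror image of reading: get a blob client and call upload_blob. A minimal sketch reusing blob_service_client from the code above, where 'output.csv' is a hypothetical file name:

#upload a local file into the container
blob_client = blob_service_client.get_blob_client(container=container_name,
                                                  blob='output.csv')
with open('output.csv', 'rb') as f:
    blob_client.upload_blob(f, overwrite=True)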

@dotpi5907 Can we load a saved Hugging Face model like this? If so, can you guide me on how? Or is there an alternate solution? Many thanks in advance. Cheers

nikk
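
One way to approach the Hugging Face question (a sketch, not covered in the video): download the saved model files to a local folder, then point from_pretrained at that folder. This reuses container_client from above; the 'model/' prefix is hypothetical and transformers is assumed to be installed:

import os
from transformers import AutoModel  #pip install transformers

local_dir = 'model'
os.makedirs(local_dir, exist_ok=True)

#copy every blob under the (hypothetical) 'model/' prefix to disk
for blob in container_client.list_blobs(name_starts_with='model/'):
    local_path = os.path.join(local_dir, os.path.basename(blob.name))
    with open(local_path, 'wb') as f:
        f.write(container_client.download_blob(blob.name).readall())

#load the model from the local copy
model = AutoModel.from_pretrained(local_dir)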

Can we do the same for JSON files stored in blob storage?

yuthikashekhar
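
The SAS-URL approach is format-agnostic; only the read call changes. A minimal sketch, reusing sas_url, blob_i, and container_client from the code above and assuming pandas-readable JSON:

#same SAS URL as for CSVs, just a different reader
df = pd.read_json(sas_url)

#or parse it manually with the standard library
import json
data = json.loads(container_client.download_blob(blob_i).readall())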

What if you did not want to bring the files down to the local machine? How would you process the files up on Azure, and run the Python code on Azure? For instance, the files were placed in blob storage and now you want to process them, clean them up, and then save the results out in blob storage. The Python code is not complicated; just what are the pieces/configuration up on Azure?

kevinduffy
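
On running entirely in Azure: the SDK calls are identical whether the script runs locally or on Azure compute (a VM, an Azure Function, Azure ML, Databricks); only the hosting changes. A sketch of the read-clean-write-back loop, reusing container_client and blob_list from above, with a hypothetical 'cleaned/' output prefix and dropna standing in for the real cleaning:

import io

for blob_i in blob_list:
    #read the blob straight into pandas, no local file needed
    raw = container_client.download_blob(blob_i).readall()
    df = pd.read_csv(io.BytesIO(raw))

    #...clean the DataFrame here...
    df = df.dropna()

    #write the result back under the (hypothetical) 'cleaned/' prefix
    out_client = container_client.get_blob_client('cleaned/' + blob_i)
    out_client.upload_blob(df.to_csv(index=False), overwrite=True)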

Well explained! Actually, I want to read a ".docx" file from a blob. How can I do that?

_the.equalizer_
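
Reading .docx needs a binary download plus a parser such as python-docx (an assumption; it is not used in the video). A minimal sketch reusing container_client from above, where 'report.docx' is a hypothetical blob name:

import io
from docx import Document  #pip install python-docx

#download the blob into memory and parse it
raw = container_client.download_blob('report.docx').readall()
doc = Document(io.BytesIO(raw))
text = '\n'.join(p.text for p in doc.paragraphs)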

Hi Sir,
Which version of pandas did you use? And can we load into a PySpark DataFrame instead of a pandas DataFrame? If yes, please share the syntax ASAP.

learner-dfns
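
For PySpark, one option is to bypass pandas and point Spark at the wasbs:// path after handing it the account key. A sketch that assumes the hadoop-azure (WASB) connector is available on the cluster, as it is on Databricks:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

#give Spark the storage key (works as-is on Databricks; plain Spark may need
#the same key set on the Hadoop configuration instead)
spark.conf.set('fs.azure.account.key.' + account_name + '.blob.core.windows.net',
               account_key)

#read every CSV in the container into one Spark DataFrame
path = ('wasbs://' + container_name + '@' + account_name
        + '.blob.core.windows.net/*.csv')
sdf = spark.read.csv(path, header=True)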

Can we do a similar thing to load a video file from Azure Blob Storage using libraries like OpenCV?
I want to load and analyze videos from blob storage inside Azure Machine Learning Studio.

sumitsp
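
OpenCV reads from file paths rather than Python streams, so one approach is to download the video to a temporary file first. A sketch reusing container_client from above (cv2 is an assumption, not used in the video; 'video.mp4' is a hypothetical blob name):

import tempfile
import cv2  #pip install opencv-python

#download the video blob to a temporary local file
with tempfile.NamedTemporaryFile(suffix='.mp4', delete=False) as tmp:
    tmp.write(container_client.download_blob('video.mp4').readall())
    local_path = tmp.name

#open it with OpenCV as usual
cap = cv2.VideoCapture(local_path)
ok, frame = cap.read()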

Thank you for this video! It saved me time.

dhruvajmeri

Is there a way to select files/blobs from an Azure container using a Flask application for further use in the application, just like the file picker helps in selecting files from the local directory? Can someone help me with this?

surajbhu
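
One pattern for the Flask question (a sketch; the route names are hypothetical and container_client is reused from the code above): expose the container listing from one route so the front end can offer a picker, and serve the chosen blob from another:

from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/blobs')
def list_blob_names():
    #let the front end present the available files
    names = [b.name for b in container_client.list_blobs()]
    return jsonify(names)

@app.route('/blobs/<path:name>')
def get_blob(name):
    #return the chosen blob's bytes to the application
    return container_client.download_blob(name).readall()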

Can we set the expiry time to be infinite?

nirajmodh
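
On infinite expiry: a SAS signed with the account key must carry an expiry time, so there is no infinite option; the usual workaround is a far-future date. Note that rotating the account key revokes such tokens:

#not infinite, but effectively long-lived
sas_i = generate_blob_sas(account_name=account_name,
                          container_name=container_name,
                          blob_name=blob_i,
                          account_key=account_key,
                          permission=BlobSasPermissions(read=True),
                          expiry=datetime.utcnow() + timedelta(days=365 * 10))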