How I save my Scraped Data to a Database with Python! Beginners sqlite3 tutorial

We've focused on how to scrape content, but not on how to save it persistently. In this video I'll show you how I save my scraped data to a database in its most basic form: setting up and connecting to an sqlite3 database, creating a table, and inserting data.

This Python tutorial is aimed at beginners who haven't yet tried using a database in their own projects, and includes some examples of how a database can be added to existing code with just a few extra lines.
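
To make that concrete, here is a minimal sketch of the flow described above - connect, create a table, insert, commit - assuming a scraped product name and price (the table and values are illustrative, not the exact code from the video):

import sqlite3

# Connect (this creates the database file if it doesn't exist yet)
conn = sqlite3.connect("products.db")
cur = conn.cursor()

# Create a simple table for the scraped items
cur.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price TEXT)")

# Values you would normally get from your scraping code (illustrative here)
name, price = "Example Widget", "19.99"
cur.execute("INSERT INTO products VALUES (?, ?)", (name, price))

conn.commit()
conn.close()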
Comments

Great video! Just an FYI, there is a CREATE TABLE IF NOT EXISTS TABLE_NAME (column_name datatype, column_name datatype); command, so you don't have to comment the statement out to rerun your script without errors. There is also a DROP TABLE IF EXISTS TABLE_NAME; if you want to recreate the table with fresh data over and over.

mauisam
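
For reference, the two statements mentioned in the comment above look like this in sqlite3 (table and column names are placeholders):

import sqlite3

conn = sqlite3.connect("products.db")
cur = conn.cursor()

# Recreate the table with fresh data on every run
cur.execute("DROP TABLE IF EXISTS products")
# Only creates the table if it isn't already there, so reruns don't error
cur.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price TEXT)")

conn.commit()
conn.close()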

Thank you for adding pandas to this video. It was exactly what I needed to learn.

aejack
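
For anyone skimming, the pandas route generally comes down to DataFrame.to_sql - a rough sketch with illustrative data, not necessarily the video's exact code:

import sqlite3
import pandas as pd

conn = sqlite3.connect("products.db")

# A DataFrame built from scraped rows (illustrative data)
df = pd.DataFrame([{"name": "Example Widget", "price": "19.99"}])

# Write it to a table, appending if the table already exists
df.to_sql("products", conn, if_exists="append", index=False)

conn.close()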

I don't follow many channels, but this one is gold. Thanks for your extremely well-explained tutorials. Keep doing them, it helps a lot.

eka

Hello everyone, I am new to Python.
I am trying to run the code, but I get an AttributeError: 'NoneType' object has no attribute 'text'. Does anyone know how I can fix it? (I thought it was related to the URL, but I updated it to another product on the same website with no success.)
Peace.

eduardomatsumoto
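
A common cause of that error is BeautifulSoup's find() returning None when the selector doesn't match anything on the page; a defensive check along these lines (the markup and selector are placeholders) keeps the script running:

from bs4 import BeautifulSoup

html = "<div><span class='price'>£19.99</span></div>"  # illustrative markup
soup = BeautifulSoup(html, "html.parser")

# find() returns None when nothing matches, and None has no .text attribute
price_tag = soup.find("span", class_="price")  # placeholder selector
if price_tag is not None:
    price = price_tag.text.strip()
else:
    price = None  # skip or log the item instead of crashing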

Can this be done with a MySQL database? I'm thinking of web scraping into a MySQL database and using PHP to view it on a web page.

aogunnaike
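
The same pattern translates to MySQL; a rough sketch using the mysql-connector-python package (credentials, database and table names are all placeholders):

import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="localhost", user="scraper", password="secret", database="scraped"
)
cur = conn.cursor()

# MySQL uses %s placeholders instead of sqlite3's ?
cur.execute(
    "INSERT INTO products (name, price) VALUES (%s, %s)",
    ("Example Widget", "19.99"),
)

conn.commit()
conn.close()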

How do you append a dictionary with different keys each time in a for loop to the same SQLite database in Python?

zeeshantaj
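
One possible approach, assuming pandas is acceptable: collect the dictionaries in a list and let DataFrame/to_sql turn missing keys into NULLs (the data here is illustrative):

import sqlite3
import pandas as pd

# Illustrative dicts with different keys on each iteration
scraped_items = [
    {"name": "Widget A", "price": "9.99"},
    {"name": "Widget B", "rating": "4.5"},
]

# Missing keys become NaN in the DataFrame and NULL in the table
df = pd.DataFrame(scraped_items)

conn = sqlite3.connect("products.db")
df.to_sql("products", conn, if_exists="append", index=False)
conn.close()

Note that with if_exists="append", an existing table needs columns for every key that can appear.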

Around 4:35 I like that you're using variables here to emulate real-life code instead of teaching it like a textbook. A lot of people learn the same way I do: we need to be taught with applicable examples, thank you for that!

omerthebear

Great work! I wonder, will it work the same way if we use find_all() instead of find()? I think it would be trickier.

emilseyfi
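
If find_all() is used, each match simply becomes one row to insert; a minimal sketch with placeholder markup and table names:

import sqlite3
from bs4 import BeautifulSoup

html = "<div class='item'>A</div><div class='item'>B</div>"  # illustrative markup
soup = BeautifulSoup(html, "html.parser")

conn = sqlite3.connect("products.db")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS items (name TEXT)")

# One row per element returned by find_all()
rows = [(tag.text,) for tag in soup.find_all("div", class_="item")]
cur.executemany("INSERT INTO items VALUES (?)", rows)

conn.commit()
conn.close()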

Your videos are genuinely helpful and always to the point. I hope you never stop delivering this kind of content and always stay motivated to make it. These are very good... thanks John. 🙏

sounakchatterjee

What do you think, should I learn SQL to store web-scraped data in a database?

siraj.udlla_

Great job, brother… you have a solution to every problem ❤ Love from India… keep growing.

prashantbhosale

I have two functions: one scrapes a series code, episode title and URL (from Family Guy transcripts), and the other scrapes the actual transcript text. How can I add these all to the same database when the variables are defined in different functions? Thanks!

alexcrowley
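
One common pattern for this: have each function return its data (or take the connection as a parameter) and do all the inserts in one place; a rough sketch with hypothetical placeholder functions:

import sqlite3

def scrape_episodes():
    # Placeholder: would scrape (series_code, title, url) tuples from the index page
    return [("S01E01", "Example Episode", "https://example.com/s01e01")]

def scrape_transcript(url):
    # Placeholder: would scrape the transcript text from one episode page
    return "Transcript text..."

conn = sqlite3.connect("transcripts.db")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE IF NOT EXISTS episodes (code TEXT, title TEXT, url TEXT, transcript TEXT)"
)

for code, title, url in scrape_episodes():
    transcript = scrape_transcript(url)
    cur.execute("INSERT INTO episodes VALUES (?, ?, ?, ?)", (code, title, url, transcript))

conn.commit()
conn.close()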

I do have an urgent question:

I'm currently working on a technical assessment for a job interview. Everything seems quite simple so far - they need me to build a data pipeline using Python and SQL. Python is to be used for the raw data pull and data quality checks, and the rest of the pipeline is done in SQL (normalisation, dimensions, etc.). My question is about the step between Python and SQL. According to the requirements, I need to make a mock data mart where I can store the tables created by the Python code; those tables are then queried again using SQL, which does the normalisation and analysis. What is a data mart and how can I make a mock version of it? Are they simply asking me to make a database or data warehouse? I've heard of data marts before but never used them at university, at a job, or even when coding in my own free time.

hmak
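
For what it's worth, a data mart is usually just a subject-specific slice of a warehouse, so a mock one for an assessment can be as simple as a SQLite (or similar) database file holding the tables Python produces, which SQL then queries. The sketch below is an illustrative guess at that setup, not a statement of their requirements:

import sqlite3
import pandas as pd

# A "mock data mart" here is just a SQLite file acting as the landing zone
conn = sqlite3.connect("mock_datamart.db")

# Illustrative raw table produced by the Python part of the pipeline
raw = pd.DataFrame([{"order_id": 1, "customer": "A", "amount": 10.0}])
raw.to_sql("raw_orders", conn, if_exists="replace", index=False)

conn.close()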

WOW. Words cannot do justice to how well this has all been explained. :O Subscribed, please teach me more! :D The only sad thing I noticed is that you didn't say why it is important to close the connection at the end.

karolkleckovski
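
On the closing point: an open connection can keep the database file locked and uncommitted writes can be lost, so one common pattern is contextlib.closing (or try/finally) so close() runs even if an insert fails - a small sketch:

import sqlite3
from contextlib import closing

# closing() guarantees conn.close() runs even if an exception is raised
with closing(sqlite3.connect("products.db")) as conn:
    with conn:  # used as a context manager, the connection commits or rolls back
        conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price TEXT)")
        conn.execute("INSERT INTO products VALUES (?, ?)", ("Example Widget", "19.99"))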

Good job brother,
Thanks so much.
I subbed.

Greetings from Tanzania 🇹🇿

raymondmichael

Thanks John, it's a really useful tutorial.

tubelessHuma

I'm planning to make a stock market bot that checks for price updates and sends notifications to users via Telegram. Is it worth it, or is it a stupid idea?

Grinwa

How do you introduce randomness so they don't think you are web scraping them?

armanwirawan
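
A common starting point is randomising the delay between requests and varying the User-Agent; a minimal sketch with example values only:

import random
import time

import requests

# A couple of example User-Agent strings to rotate through (illustrative)
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]

urls = ["https://example.com/page1", "https://example.com/page2"]  # placeholder URLs
for url in urls:
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    response = requests.get(url, headers=headers)
    # Wait a random 2-6 seconds so requests don't arrive at a fixed interval
    time.sleep(random.uniform(2, 6))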

Thanks JWR for the excellent vids. I'm in NA and this website doesn't load up. I used Selenium to see what was happening and could get it to load with my VPN set to the UK. I'm fine following this video with a different site; however, I also ran into the issue of accepting cookies and wondered if there's a method using BS4 to deal with the prompt and set cookies so I can actually scrape?
I assume this wasn't an issue when you first released the video. Thanks again.

stewart
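
BeautifulSoup itself only parses HTML, so with requests the usual workaround is to send the consent cookie along with the request; the cookie name and value below are pure placeholders you would copy from your browser's dev tools:

import requests
from bs4 import BeautifulSoup

# Placeholder cookie: copy the real name/value from the browser's dev tools
cookies = {"cookie_consent": "accepted"}
headers = {"User-Agent": "Mozilla/5.0"}

response = requests.get("https://example.com/product", headers=headers, cookies=cookies)
soup = BeautifulSoup(response.text, "html.parser")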

Great videos John!! Any suggestions on pulling dynamic data from APIs (a data set that is updated maybe weekly) and updating the existing records in the database?

jasond
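
One option for updating existing records is SQLite's UPSERT syntax (SQLite 3.24+), keyed on a unique column; the table and column names below are placeholders:

import sqlite3

conn = sqlite3.connect("api_data.db")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS records (id TEXT PRIMARY KEY, value TEXT)")

# Insert a new row, or update the existing one if the id is already present
cur.execute(
    """INSERT INTO records (id, value) VALUES (?, ?)
       ON CONFLICT(id) DO UPDATE SET value = excluded.value""",
    ("abc123", "latest value from the API"),
)

conn.commit()
conn.close()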