Designing a data pipeline for a crypto trading system (bot) using Python and Google Cloud

Detecting patterns takes data, and these days the more data you have, the better your models tend to perform.

I'm working on detecting one hell of a pattern: cryptocurrency prices. When I first started tinkering with this project, I built something that pulled data on-demand from the Gemini Exchange API. Having invested more time into the project, I've realized that the best way to improve the whole system is to build a data pipeline that supports easy access to a rich history of cryptocurrency prices.

That's the core of this video: how is this data pipeline going to come together? What even is a data pipeline, within the context of this independent project to build a trading system?

Let's find out, and let's make this thing happen.

Relevant tools:
Gemini Exchange REST API
Python
Pycharm (Python IDE)
Google Cloud Platform
-- BigQuery
-- Cloud Functions
-- Cloud Scheduler
Tableau Desktop
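
The tools above suggest one way the pieces could fit together: Cloud Scheduler fires a cron job that hits an HTTP-triggered Cloud Function, which pulls fresh candles from the Gemini REST API and appends them to a BigQuery table. Here's a minimal sketch of that flow; the table id, function name, and symbol are placeholder assumptions, not details confirmed in the video.

```python
import json
import urllib.request

TABLE_ID = "my-project.crypto.btcusd_candles_1m"  # placeholder project.dataset.table

def candles_url(symbol="btcusd", timeframe="1m"):
    """Public Gemini v2 candles endpoint (no API key required for market data)."""
    return f"https://api.gemini.com/v2/candles/{symbol}/{timeframe}"

def load_candles(request):
    """HTTP entry point for the Cloud Function; Cloud Scheduler calls its URL."""
    # Gemini returns [[ms_timestamp, open, high, low, close, volume], ...]
    with urllib.request.urlopen(candles_url()) as resp:
        candles = json.load(resp)
    keys = ("timestamp_ms", "open", "high", "low", "close", "volume")
    rows = [dict(zip(keys, c)) for c in candles]

    from google.cloud import bigquery  # lazy import; installed in the GCP runtime
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)  # streaming insert, [] on success
    return (f"insert errors: {errors}", 500) if errors else (f"inserted {len(rows)} rows", 200)
```

From there, Tableau Desktop can connect straight to the BigQuery table for charting, which is presumably why it appears in the list.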
Comments

A data pipeline is exactly what I'm trying to build right now. Loved your video and how you went over your motivation. It pointed me in the right direction; now I have to figure out how to save the data to BigQuery.
Thanks for the great video!

afaf

Thank you @Devyx. I'll have to try this for myself.

dawnokem

Hi, awesome tutorial. What can I do if I have already trained a model on a dataset of 1-minute candles from the last 2000 days (BTC-USDT only)? I would like to use that model to analyze the new data coming from the API that is being stored in BigQuery.
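
One hypothetical way to do that: query the newest candles out of BigQuery and feed them to the already-trained model. The table name, window size, and the `model` object (anything with a scikit-learn-style `.predict`) are assumptions for illustration, not details from the video.

```python
def make_windows(closes, window=60):
    """Turn a series of close prices into overlapping fixed-length feature rows,
    matching whatever shape the model was trained on (an assumption here)."""
    return [closes[i:i + window] for i in range(len(closes) - window + 1)]

def predict_latest(model, table_id="my-project.crypto.btcusd_candles_1m", window=60):
    """Fetch the most recent `window` closes from BigQuery and score them."""
    from google.cloud import bigquery  # lazy import; needs GCP credentials to run
    client = bigquery.Client()
    query = f"""
        SELECT close FROM `{table_id}`
        ORDER BY timestamp_ms DESC
        LIMIT {window}
    """
    closes = [row["close"] for row in client.query(query).result()]
    closes.reverse()  # oldest first, matching training order
    return model.predict([closes])
```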

Betegfos

Hi, how do you write/transform the output of the Cloud Function into a Google BigQuery table?
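
One common pattern (an assumption about the setup, not necessarily what the video uses): shape the function's output into dicts whose keys match the table's columns, then stream them in with `insert_rows_json` from the `google-cloud-bigquery` client. The table must already exist with a matching schema; the names below are placeholders.

```python
def to_bq_rows(candles):
    """Shape Gemini-style [[ms_timestamp, open, high, low, close, volume], ...]
    arrays into JSON-serializable dicts keyed by the table's column names."""
    keys = ("timestamp_ms", "open", "high", "low", "close", "volume")
    return [dict(zip(keys, c)) for c in candles]

def write_rows(rows, table_id="my-project.crypto.btcusd_candles_1m"):
    from google.cloud import bigquery  # lazy import; runs inside the Cloud Function
    client = bigquery.Client()
    errors = client.insert_rows_json(table_id, rows)  # returns [] on success
    if errors:
        raise RuntimeError(f"BigQuery streaming insert failed: {errors}")
```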

cliviakong