Hi everybody, I’m learning to code and I’d like to know what I need to study to extract data from internet sites in real time, for example to download sports scores or financial data. Can I access their database directly? I took the basic SQL course, but it doesn’t cover this. Do I have to study web scraping techniques? In what language? Thank you all.
Yes, accessing APIs and web scraping.
I recommend Python (requests library).
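As a minimal sketch of what that looks like with `requests`, here's a hedged example. The endpoint URL and the JSON field names (`games`, `home`, `away`, `score`) are made up for illustration; a real API documents its own structure.

```python
import requests  # third-party: pip install requests

def parse_scores(payload):
    """Pull (home, away, score) tuples out of a JSON payload.
    These key names are hypothetical -- check the real API's docs
    for its actual response structure."""
    return [(g["home"], g["away"], g["score"]) for g in payload["games"]]

def fetch_scores(url):
    """GET the URL and return the parsed JSON body."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()  # raise an error on 4xx/5xx status codes
    return response.json()

if __name__ == "__main__":
    # Hypothetical endpoint -- substitute an API you're actually allowed to use.
    payload = fetch_scores("https://api.example.com/v1/scores")
    for home, away, score in parse_scores(payload):
        print(home, away, score)
```

For a real-time feed you'd typically call something like this in a loop with a polite delay, or use a streaming/websocket API if the site offers one.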
I’d also suggest first reading a website’s terms of service and documentation about accessing its data. Most sites have very specific and strict rules about whether and how you’re allowed to do so.
Thanks for the reply. Sorry, but I’m really a beginner: what do you mean by accessing APIs? And for Python, which framework should I use? Thank you.
No worries! We’re all beginners at something and at some point.
An API is an application programming interface. APIs let companies open up their application’s or website’s data and functionality to third parties (that’s an open API) or to other teams within the company. Each API has specific rules for how to access and download its data; those rules “tell” the computers how to interact with each other. It all starts with a basic HTTP request (just google it for more).
Try YouTube for tutorials; there are also plenty of good written Python explanations to start with.
Also, there’s a TON of free data out there. For example, many cities have open data portals where you can download data about the city, e.g. NYC Open Data. Browsing a dataset there, you’ll see that one way to grab the data is with an API. That’s just one example; there are tons worldwide.
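As a hedged sketch: NYC Open Data exposes its datasets through the Socrata SODA API, where each dataset has a resource URL of the form `https://data.cityofnewyork.us/resource/<dataset-id>.json`. The dataset id below is just a placeholder; look up a real one on the portal.

```python
import requests  # third-party: pip install requests

def soda_url(domain, dataset_id):
    """Build a Socrata SODA resource URL -- the pattern NYC Open Data uses."""
    return f"https://{domain}/resource/{dataset_id}.json"

if __name__ == "__main__":
    # "abcd-1234" is a placeholder; find a real dataset id on the portal.
    url = soda_url("data.cityofnewyork.us", "abcd-1234")
    # $limit is a standard SODA query parameter capping the number of rows.
    rows = requests.get(url, params={"$limit": 5}, timeout=10).json()
    print(rows)
```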
Try googling “open/free APIs” to see what’s out there as well.
No, you don’t create them. Some sites will require users to request an API key in order to access their data.
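Once you have a key, you typically attach it to each request, either as a query parameter or as an HTTP header, depending on the site. A sketch under that assumption (the `X-Api-Key` header name and the endpoint here are illustrative; every API documents its own scheme):

```python
import requests  # third-party: pip install requests

def auth_headers(api_key):
    """Build request headers carrying the API key.
    'X-Api-Key' is a common convention, but each API documents
    its own required header or parameter name."""
    return {"X-Api-Key": api_key}

if __name__ == "__main__":
    # Hypothetical endpoint; the key usually comes from the site's developer page.
    resp = requests.get(
        "https://api.example.com/v1/data",
        headers=auth_headers("YOUR_KEY_HERE"),
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json())
```

Keep the key out of your source code in real projects, e.g. load it from an environment variable.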
Thank you, now I understand.