Executing Twitch Exercise In CLI: Importing CSVs into SQLite3

I’m attempting to execute the SQL portion of the Twitch project in the Terminal on my computer, but have reached an impasse. I have successfully installed SQLite and have been able to run queries on .sqlite files, but am struggling to work on the .csv files provided for the project. Any help would be appreciated!



Welcome to the forums!

So the SQLite CLI isn’t the most user-friendly for data exploration, but if you want to use it, follow these steps:

  • create a new database with the command .open DB_NAME (so for a database file called twitch, you would run .open twitch.db)
  • switch SQLite over to CSV mode with the command .mode csv
  • import your CSV in as a table by using the following syntax:
    • .import PATH/TO/CSV/FILE.csv TABLE_NAME

So let's say you open SQLite in the same folder as your CSV files and you want to create your twitch database there, with a table called streams. You would use these commands:

>>> .open twitch.db
>>> .mode csv
>>> .import ./streams.csv streams

Now you have a table called streams, and you can write queries just like you normally would. You can also add the other CSV to the same database as another table (views, I think?) with the same syntax (.import ./views.csv views), as long as the database is still open and you are still in csv mode. You can also switch to other output modes to view the data differently; the options are listed in the documentation here.
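If you'd rather script the import than type CLI commands, the same result can be had with Python's built-in csv and sqlite3 modules. This is just a minimal sketch: unlike .import, you have to create the table yourself, and the helper name import_csv is my own invention, not a library function.

```python
import csv
import sqlite3

def import_csv(csv_path, db_path, table):
    """Load a CSV (with a header row) into an SQLite table,
    roughly mimicking the CLI's .mode csv + .import."""
    conn = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first row is assumed to be column names
        # .import infers the table from the header; here we build it
        # ourselves, storing every column as TEXT for simplicity
        cols = ", ".join(f'"{c}" TEXT' for c in header)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        marks = ", ".join("?" for _ in header)
        conn.executemany(f'INSERT INTO "{table}" VALUES ({marks})', reader)
    conn.commit()
    conn.close()

# Equivalent of: .open twitch.db / .mode csv / .import ./streams.csv streams
# (uncomment once streams.csv is in your working directory)
# import_csv("streams.csv", "twitch.db", "streams")
```

Note that everything lands as TEXT this way; the CLI's .import behaves the same for a fresh table, so cast to numbers in your queries where needed.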

Happy coding!

Thank you for the response; it’s very helpful! However, this raises another question for me regarding the practicality of moving between SQL and Python.

If the SQLite CLI isn’t user-friendly for data exploration, what would be a better option? Additionally, what is the most efficient way to take SQL query results and translate them into a format that Python can work with?

Great question.

It isn’t that SQLite itself is not user-friendly for data exploration; it’s the command-line interface that is.
The reason I say this is that when you are exploring data, you want to see it in a way that makes patterns easy to notice. On the command line, query output is often not formatted in a way that is easy to read. You can change the mode to column, which is fine for smaller result sets, but I believe column mode also has a per-column character limit, so any value longer than that limit will be cut off.

I like to use SQLite on the command line when I don’t have to actually look at the data. When I am looking through it I usually gravitate toward one of three tools:

  • DB Browser (codecademy video here)
    • when I am only doing SQL and I want to explore/create/edit a new database
  • VS Code
    • when I am only running SQL queries and doing mild database exploration
  • Jupyter Notebooks
    • when I am doing data exploration and analysis that will involve Python

If you are doing a project that involves querying data with SQL and then doing analysis, etc., in Python, then Jupyter notebooks are the way to go. You can use Python’s sqlite3 module to query the database, then save the query results as pandas DataFrames, which render nicely in Jupyter notebooks. Check it out and see if it fits your desired workflow!
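That workflow can be sketched in a few lines. To keep the example self-contained and runnable I build a tiny in-memory table here; in the Twitch project you would connect to your twitch.db file instead, and the channel/game column names are just placeholders I made up:

```python
import sqlite3
import pandas as pd

# For the real project: conn = sqlite3.connect("twitch.db")
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE streams (channel TEXT, game TEXT)")
conn.executemany(
    "INSERT INTO streams VALUES (?, ?)",
    [("frank", "League of Legends"), ("estelle", "Dota 2")],
)

# Any SQL query lands directly in a DataFrame, which renders
# as a readable table in a Jupyter notebook
df = pd.read_sql_query(
    "SELECT game, COUNT(*) AS n FROM streams GROUP BY game", conn
)
conn.close()
print(df)
```

From there the DataFrame is ordinary pandas: sort it, plot it, merge it with results from your other queries, and so on.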
