
Relational Database in Python-SQLite

Another post starts with you beautiful people!
Hope you have enjoyed and learnt from my previous post about file importing.
Today we will learn, step by step, how we as data scientists connect to a database.

Before starting with the code, you must first read about the very powerful library SQLAlchemy, which we will use in our exercises; you can find the details here- SQLAlchemy documentation

How To Create a Database Engine-
Here, we're going to fire up our very first SQL engine. We'll create an engine to connect to the SQLite database 'Chinook.db'. You can find the download and installation steps for this sample database here- download sqlite sample database

An engine is just a common interface to a database, and the information it requires to connect to one is contained in a connection string, such as sqlite:///C:\sqlite\db\chinook.db. Here, sqlite is the database driver, while chinook.db is the SQLite database file located at that path.
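For reference, a minimal sketch of building such an engine with SQLAlchemy's create_engine (forward slashes in the path also work on Windows; adjust it to wherever your chinook.db lives):

```python
from sqlalchemy import create_engine

# Connection string: the sqlite driver followed by the path to the database file
engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')
```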

A little bit of background on the Chinook database: it describes a semi-fictional digital media store in which the media data is real, while the customer, employee and sales data has been manually created.

We can inspect the database structure with the sqlite3 command-line tool (for example, using its .tables and .schema commands).
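If you prefer to stay in Python rather than the sqlite3 command-line tool, a rough equivalent using the standard-library sqlite3 module (path assumed as above) looks like this:

```python
import sqlite3

# Connect with the standard-library driver and list the tables in the file
con = sqlite3.connect('C:/sqlite/db/chinook.db')
cur = con.cursor()

# sqlite_master stores the schema of every table in the database
for (name,) in cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'"):
    print(name)

con.close()
```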

Now let's create an engine and see what tables the database contains-
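A minimal sketch of this step, assuming the same chinook.db path as above (older SQLAlchemy releases exposed engine.table_names() for this; newer ones use the inspect() API shown here):

```python
from sqlalchemy import create_engine, inspect

# Create the engine (adjust the path to your local chinook.db)
engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

# Ask the engine which tables the database contains
print(inspect(engine).get_table_names())
```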


Now, it's time for liftoff! In this exercise, we'll perform the Hello World of SQL queries- SELECT, in order to retrieve all columns of the employees table in the Chinook database.
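A sketch of how such a query is typically run with SQLAlchemy (the table name employees is taken from the sqlitetutorial.net copy of Chinook; on SQLAlchemy 2.x raw SQL has to be wrapped in text()):

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

with engine.connect() as con:
    # The Hello World of SQL: fetch every column of every row
    rs = con.execute(text("SELECT * FROM employees"))
    df = pd.DataFrame(rs.fetchall())
    df.columns = rs.keys()

print(df.head())
```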

Run the above query in your notebook and see what the employees table contains!

Customizing the Hello World of SQL Queries-
Congratulations on executing your first SQL query! Now we're going to figure out how to customize your query in order to:
Select specified columns from a table;
Select a specified number of rows;
Import column names from the database table.
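A hedged sketch covering all three customizations (the columns LastName and Title are just an illustrative pick from the employees table):

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

with engine.connect() as con:
    # 1. Select only the columns we care about
    rs = con.execute(text("SELECT LastName, Title FROM employees"))
    # 2. Fetch only the first three rows instead of everything
    df = pd.DataFrame(rs.fetchmany(size=3))
    # 3. Reuse the result's keys as the DataFrame column names
    df.columns = rs.keys()

print(df)
```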


Filtering our database records using SQL's WHERE-
We can now execute a basic SQL query to select records from any table in our database, and we can also perform simple customizations to select particular columns and a specific number of rows.
We can filter any SELECT statement by any condition using a WHERE clause. This is called filtering our records.
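As an illustration, here is the same pattern with an assumed WHERE condition on EmployeeId:

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

with engine.connect() as con:
    # Keep only the employees whose EmployeeId is 6 or higher
    rs = con.execute(text("SELECT * FROM employees WHERE EmployeeId >= 6"))
    df = pd.DataFrame(rs.fetchall())
    df.columns = rs.keys()

print(df)
```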


Ordering our SQL records with ORDER BY-
We can also order our SQL query results. For example, we can get all records from the employees table of the Chinook database and order them in increasing order by the column BirthDate.
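A sketch of that query (same engine and table-name assumptions as before):

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

with engine.connect() as con:
    # ORDER BY sorts the result set; ascending order is the default
    rs = con.execute(text("SELECT * FROM employees ORDER BY BirthDate"))
    df = pd.DataFrame(rs.fetchall())
    df.columns = rs.keys()

print(df.head())
```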

Pandas and The Hello World of SQL Queries-
Here, we'll take advantage of the power of pandas to write the results of our SQL query to a DataFrame in one swift line of Python code!
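A minimal sketch using pandas.read_sql_query(), which takes the query string and the engine and hands back a DataFrame in one line:

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

# pandas runs the query against the engine and returns the result as a DataFrame
df = pd.read_sql_query("SELECT * FROM employees", engine)
print(df.head())
```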


Pandas for more complex querying-
Here, we'll become more familiar with the pandas function read_sql_query() by using it to execute a more complex query:
a SELECT statement followed by both a WHERE clause AND an ORDER BY clause.
We'll build a DataFrame that contains the rows of the Employee table for which the EmployeeId is greater than or equal to 6 and we'll order these entries by BirthDate.
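A sketch of that query (table name assumed as employees, per the sqlitetutorial.net copy of Chinook):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

# WHERE filters the rows, ORDER BY sorts what is left
df = pd.read_sql_query(
    "SELECT * FROM employees WHERE EmployeeId >= 6 ORDER BY BirthDate",
    engine
)
print(df.head())
```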


Relationships between tables: INNER JOIN
Here, we'll perform our first INNER JOIN!
For each record in the Albums table, we'll extract the Title along with the Name of the Artist. The latter will come from the Artist table and so we will need to INNER JOIN these two tables on the ArtistID column of both.
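A sketch of that join (in the sqlitetutorial.net copy of Chinook the tables are named albums and artists, and the shared key column is ArtistId):

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

with engine.connect() as con:
    # Join each album to its artist on the shared ArtistId column
    rs = con.execute(text(
        "SELECT Title, Name FROM albums "
        "INNER JOIN artists ON albums.ArtistId = artists.ArtistId"
    ))
    df = pd.DataFrame(rs.fetchall())
    df.columns = rs.keys()

print(df.head())
```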


Filtering our INNER JOIN-
Congrats on performing your first INNER JOIN! You're now going to finish this chapter with one final exercise in which you perform an INNER JOIN and filter the result using a WHERE clause.
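As an illustration, here is the same join restricted by an assumed WHERE condition (the filter used in the original exercise may differ):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('sqlite:///C:/sqlite/db/chinook.db')

# Same INNER JOIN as before, but keep only the first few artists (assumed filter)
df = pd.read_sql_query(
    "SELECT Title, Name FROM albums "
    "INNER JOIN artists ON albums.ArtistId = artists.ArtistId "
    "WHERE albums.ArtistId <= 10",
    engine
)
print(df.head())
```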


Try the above examples in your notebook and explore other databases too!
In the next post we will learn CRUD operations in a relational database.
