
Relational Database in Python - CRUD Operations

Another post starts with you beautiful people!
Hope you have enjoyed and learned from my previous post about Relational Database In Python-sqlite.
Previously, we used the Table object to reflect a table from an existing database, but what if you wanted to create a new table?
We'd still use the Table object; however, we'd need to replace the autoload and autoload_with parameters with Column objects.
The Column object takes a name, a SQLAlchemy type with an optional format, and optional keyword arguments for different constraints.

After defining the table, we can create the table in the database by using the .create_all() method on metadata and supplying the engine as the only parameter. You can find more details about metadata here: tell me more about metadata!
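Here is a minimal sketch of what that looks like, assuming a SQLite database file called census.sqlite; the table name, column names, and types are purely illustrative:

from sqlalchemy import create_engine, MetaData, Table, Column, String, Integer, Float, Boolean

engine = create_engine('sqlite:///census.sqlite')   # assumed SQLite database file
connection = engine.connect()
metadata = MetaData()

# Define a new 'data' table: each Column gets a name, a type (with optional length) and,
# optionally, constraint keyword arguments (shown in the next section)
data = Table('data', metadata,
             Column('name', String(255)),
             Column('count', Integer()),
             Column('amount', Float()),
             Column('valid', Boolean()))

# Create the table in the actual database; existing tables are left untouched
metadata.create_all(engine)
print(repr(data))   # prints the Table definition so we can confirm the columns and types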

Well done! When creating a table, it's important to carefully think about what data types each column should be.

Constraints and Data Defaults-
We're now going to practice creating a table with some constraints! Often, we'll need to make sure that a column is unique, nullable, a positive value, or related to a column in another table. This is where constraints come in.
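A minimal sketch of such a definition follows; it uses its own throwaway in-memory engine so it does not clash with the 'data' table created above, and the constraint choices are illustrative:

from sqlalchemy import create_engine, MetaData, Table, Column, String, Integer, Float, Boolean

engine_demo = create_engine('sqlite:///:memory:')   # separate throwaway database for this example
metadata_demo = MetaData()

# The same 'data' table as above, now with constraints and data defaults
data_demo = Table('data', metadata_demo,
                  Column('name', String(255), unique=True),       # no duplicate names allowed
                  Column('count', Integer(), default=1),          # used when no value is supplied
                  Column('amount', Float()),
                  Column('valid', Boolean(), default=False))      # defaults to False

metadata_demo.create_all(engine_demo)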

Inserting a single row with an insert() statement-
There are several ways to perform an insert with SQLAlchemy; however, we are going to focus on the one that follows the same pattern as the select statement.

It uses an insert statement where we specify the table as an argument, and supply the data we wish to insert into the table via the .values() method as keyword arguments.
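Continuing with the data table and connection from the first sketch above (and written against the SQLAlchemy 1.x-style API; SQLAlchemy 2.0 additionally needs an explicit connection.commit()), a hedged example might look like this:

from sqlalchemy import insert

# Build an insert statement for the 'data' table; the values are made up for illustration
stmt = insert(data).values(name='Anna', count=1, amount=1000.00, valid=True)

# Execute it on the open connection and check how many rows were inserted
results = connection.execute(stmt)
print(results.rowcount)   # 1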

Inserting Multiple Records at Once-
It's time to practice inserting multiple records at once!
We'll want to first build a list of dictionaries that represents the data we want to insert.
Then, in the .execute() method, you can pair this list of dictionaries with an insert statement, which will insert all the records in your list of dictionaries.
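A sketch of that pattern, reusing the data table and connection from above (values are illustrative):

from sqlalchemy import insert

# One dictionary per row we want to insert
values_list = [
    {'name': 'Anna',   'count': 1, 'amount': 1000.00, 'valid': True},
    {'name': 'Taylor', 'count': 2, 'amount': 750.00,  'valid': False}
]

# Pairing the bare insert statement with the list executes it once per dictionary
stmt = insert(data)
results = connection.execute(stmt, values_list)
print(results.rowcount)   # 2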

Loading a CSV into a Table-
We've done a great job so far at inserting data into tables! We're now going to learn how to load the contents of a CSV file into a table.

We have used the csv module to set up a csv_reader, which is just a reader object that can iterate over the lines in a given CSV file -
in this case, a census CSV file. Using the enumerate() function, we can loop over the csv_reader to handle the results one at a time.
Run the code below in your notebook and find out how many rows end up in the table.
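A sketch of the whole loop follows. It assumes census is a Table object reflected from the database (as in the previous post) and that census.csv holds the columns state, sex, age, pop2000, and pop2008 in that order - both assumptions about your own data:

import csv
from sqlalchemy import insert

stmt = insert(census)
values_list = []

with open('census.csv', newline='') as csvfile:
    csv_reader = csv.reader(csvfile)
    # enumerate() gives us a running index alongside each parsed row
    for idx, row in enumerate(csv_reader):
        data_dict = {'state': row[0], 'sex': row[1], 'age': row[2],
                     'pop2000': row[3], 'pop2008': row[4]}
        values_list.append(data_dict)

results = connection.execute(stmt, values_list)
print(results.rowcount)   # total number of rows loaded from the CSV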

Updating individual records-
The update statement is very similar to an insert statement, except that it also typically uses a where clause to help us determine what data to update. We'll be using the FIPS state code here, which is used by the U.S. government to identify U.S. states and certain other associated areas.
The names of the tables and columns of interest in this exercise are: state_fact (Table), name (Column), and fips_state (Column).
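A minimal sketch, assuming state_fact has already been reflected as a Table object and that the row we want is New York (whose FIPS code happens to be 36):

from sqlalchemy import update

# Set fips_state to 36, but only for the row whose name is 'New York'
stmt = update(state_fact).values(fips_state=36)
stmt = stmt.where(state_fact.columns.name == 'New York')

results = connection.execute(stmt)
print(results.rowcount)   # number of rows matched by the where clause and updated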

After executing the update statement, the fips_state code gets updated to 36.

Updating Multiple Records-
By using a where clause that selects more records, we can update multiple records at once.
It's time now to practice this!
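Here is a hedged sketch: it assumes the state_fact table also has notes and census_region_name columns, which may differ in your database:

from sqlalchemy import update

# Every state in the 'West' census region gets the same note in a single statement
stmt = update(state_fact).values(notes='The Wild West')
stmt = stmt.where(state_fact.columns.census_region_name == 'West')

results = connection.execute(stmt)
print(results.rowcount)   # all matching rows are updated at once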


Correlated Updates-
We can also update records with data from a select statement. This is called a correlated update.
It works by defining a select statement that returns the value you want to update the record with and assigning that as the value in an update statement.
We'll be using a flat_census in this exercise as the target of our correlated update. The flat_census table is a summarized copy of our census table.
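A sketch of the idea, using the 1.x-style select([...]) API; the fips_code and state_name columns are assumptions about the flat_census table:

from sqlalchemy import select, update

# Select statement that returns the proper state name for a given fips_code
fips_stmt = select([state_fact.columns.name])
fips_stmt = fips_stmt.where(
    state_fact.columns.fips_state == flat_census.columns.fips_code)

# Use that select as the value: each flat_census row gets the name matching its own fips_code
update_stmt = update(flat_census).values(state_name=fips_stmt)
results = connection.execute(update_stmt)
print(results.rowcount)

On newer SQLAlchemy versions you may want to pass fips_stmt.scalar_subquery() instead of the bare select to make the correlated subquery explicit.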

Well done! Having learned how to create tables in a database, insert data into them, and update the data,
it's time to learn how to remove data from a database!

Deleting all the records from a table-
Often, we'll need to empty a table of all of its records so we can reload the data.
We can do this with a delete statement with just the table as an argument.
Do be careful, though, as deleting cannot be undone!
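A minimal sketch, again assuming census is the reflected Table object and the connection from earlier is still open:

from sqlalchemy import delete

# Remove every record from the census table - this cannot be undone!
delete_stmt = delete(census)
results = connection.execute(delete_stmt)
print(results.rowcount)   # number of rows that were removed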

After executing the delete statement, you can see there are no records left in the census table.

Deleting specific records-
By using a where() clause, we can target the delete statement to remove only certain records.
For example, I deleted all rows from the employees table that had id 3 with the following delete statement:
delete(employees).where(employees.columns.id == 3)

Here we'll delete ALL rows which have 'M' in the sex column and 36 in the age column. We first compute the total number of such rows (see the sketch below), because it is important to make sure that this is the number of rows you actually delete.
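A sketch of that check-then-delete pattern, assuming the census table has been reloaded with data after the previous section:

from sqlalchemy import and_, delete, select, func

# First count how many rows have sex 'M' and age 36...
count_stmt = select([func.count(census.columns.sex)]).where(
    and_(census.columns.sex == 'M', census.columns.age == 36))
to_delete = connection.execute(count_stmt).scalar()

# ...then delete exactly those rows and compare the two numbers
delete_stmt = delete(census).where(
    and_(census.columns.sex == 'M', census.columns.age == 36))
results = connection.execute(delete_stmt)

print(results.rowcount == to_delete)   # True if we deleted only the rows we expected to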

Deleting a Table Completely-
We're now going to practice dropping individual tables from a database with the .drop() method,
as well as all tables in a database with the .drop_all() method!

As Spider-Man's Uncle Ben said: with great power comes great responsibility!
Do be careful when deleting tables, as it's not simple or fast to restore large databases! Remember, we can check to see if a table exists with the .exists() method.
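A sketch of both operations follows; it assumes the tables were created or reflected on the same metadata object. Note that the Table.exists() check shown here comes from older SQLAlchemy releases - newer versions use sqlalchemy.inspect(engine).has_table('state_fact') instead:

# Drop a single table, then every table tracked by metadata - both are destructive!
state_fact.drop(engine)
print(state_fact.exists(engine))   # False once the table is gone

metadata.drop_all(engine)
print(census.exists(engine))       # False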

Brilliant work, and congratulations! We have learnt CRUD operations in a relational database with Python.
Try the above code in your notebook with different databases and explore further!
