
Python Advanced - Introduction to NumPy

NumPy, or Numerical Python, is the fundamental package for scientific computing and data analysis in Python.
Many other packages, such as pandas, are built on top of it, which makes it an important package to know and learn about.
At the heart of NumPy is a data structure called ndarray. Using ndarray, you can store and work with large multidimensional datasets efficiently in Python.

To use NumPy, first import it with an import statement:
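A minimal example, using the conventional np alias:

    import numpy as np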

If you are doing performance-intensive work, then saving space matters. In such cases, you can import only the specific NumPy submodules you need:
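For instance, a sketch that pulls in only a couple of submodules (the submodules chosen here are just illustrative):

    # Import only the pieces of NumPy you actually need
    from numpy import random          # the random-number submodule
    from numpy.linalg import inv      # just the matrix-inverse function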

Let's understand why we need NumPy with the code snippet below.
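The original snippet is not reproduced on this page, so here is a sketch of the kind of code the post has in mind: applying arithmetic directly to a plain Python list (the heights data is made up for illustration).

    # A plain Python list of heights in inches
    heights_in = [71, 65, 68, 74]

    # Trying to convert every element to centimetres in one go
    heights_cm = heights_in * 2.54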

If you run the above code, you will get the following error:
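With the sketch above, Python raises a TypeError because arithmetic is not defined element-wise for plain lists:

    TypeError: can't multiply sequence by non-int of type 'float'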
To get the expected result, we need to convert the list into a NumPy array first, as shown below.
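A sketch of the fix, continuing the same made-up heights example:

    import numpy as np

    heights_in = [71, 65, 68, 74]

    # Convert the list into a NumPy array first
    np_heights_in = np.array(heights_in)

    # Element-wise arithmetic now works as expected
    np_heights_cm = np_heights_in * 2.54
    print(np_heights_cm)    # [180.34 165.1  172.72 187.96]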

I hope the above example makes it clear why NumPy is such an important part of the Python ecosystem and why it is widely used for the mathematical operations required in Data Science.

We can easily find out the shape, size, number of dimensions, and data type of an array with the code snippet below:
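For example, with a small 2-D array (the array itself is just an illustration):

    import numpy as np

    arr = np.array([[1, 2, 3], [4, 5, 6]])

    print(arr.shape)    # (2, 3)    -> rows and columns
    print(arr.size)     # 6         -> total number of elements
    print(arr.ndim)     # 2         -> number of dimensions
    print(arr.dtype)    # e.g. int64 -> data type of the elements (platform dependent)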


Suppose you want to change the shape or size of a given array; you can do it as below:
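The original snippet is not shown here; assuming the post means reshaping or resizing, a sketch would be:

    import numpy as np

    arr = np.array([1, 2, 3, 4, 5, 6])

    # reshape arranges the same data in a new shape (total size must match)
    matrix = arr.reshape(2, 3)
    print(matrix)
    # [[1 2 3]
    #  [4 5 6]]

    # np.resize returns a new array of the requested size,
    # repeating the data if the new size is larger
    bigger = np.resize(arr, (2, 4))
    print(bigger)
    # [[1 2 3 4]
    #  [5 6 1 2]]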

For more details about NumPy operations, please see the NumPy documentation.

So keep practicing on your own with the above examples in your notebook, and comment if you face any issue.
