
Python Advanced - Introduction to NumPy

NumPy, or Numerical Python, is the most fundamental package for scientific computing and data analysis in Python.
Many other packages, such as pandas, are built on top of it, which makes NumPy an important package to know and learn about.
At the heart of NumPy is a data structure called ndarray. Using ndarray, you can store large multidimensional datasets in Python.

To use NumPy, first import it using an import statement:
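A minimal sketch of the import, using the conventional np alias:

import numpy as np   # "np" is the community-standard alias for NumPy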

If you are doing performance-intensive work, keeping your imports and namespace lean can matter. In such cases, you can import specific names from NumPy instead of the whole package:
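A sketch of such a targeted import (array and linspace are just illustrative choices, not from the original post):

from numpy import array, linspace

a = array([1, 2, 3])       # same constructor as np.array
grid = linspace(0, 1, 5)   # 5 evenly spaced points between 0 and 1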

Let's understand why we need NumPy with the code snippet below.
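The original snippet is not preserved on this page, so here is a minimal sketch of the kind of code that triggers the problem, with hypothetical height and weight data:

heights = [1.73, 1.68, 1.71]   # plain Python lists
weights = [65.4, 59.2, 63.6]

bmi = weights / (heights ** 2)  # element-wise math on plain lists fails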

If you run the above code, you will get the following error:
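With the list-based sketch above, Python raises:

TypeError: unsupported operand type(s) for ** or pow(): 'list' and 'int'

Plain Python lists simply do not support element-wise arithmetic.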
To get the expected result, we need to convert the lists into NumPy arrays first, as below:
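Continuing the sketch above:

import numpy as np

np_heights = np.array(heights)   # convert the lists into NumPy arrays
np_weights = np.array(weights)

bmi = np_weights / (np_heights ** 2)   # element-wise operations now work
print(bmi)                             # approximately [21.85 20.98 21.75]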

I hope the above example makes it clear that NumPy is an important Python package, widely used for the mathematical operations required in data science.

We can easily find out the shape, size, dimension, and type of an array with the code snippet below:
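A sketch using the attributes NumPy exposes for this (the sample array is illustrative):

arr = np.array([[1, 2, 3], [4, 5, 6]])

print(arr.shape)   # (2, 3) -> 2 rows, 3 columns
print(arr.size)    # 6 -> total number of elements
print(arr.ndim)    # 2 -> number of dimensions
print(arr.dtype)   # int64 -> element data type (platform dependent)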


Suppose you want to change the shape or size of a given array; you can do it as below:
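The original snippet is lost, so here is a sketch assuming the post demonstrated reshape (same elements, new layout) and np.resize (which can actually change the number of elements):

reshaped = arr.reshape(3, 2)     # reorganize the 6 elements into 3 rows x 2 columns
print(reshaped.shape)            # (3, 2)

bigger = np.resize(arr, (3, 3))  # repeats the data to fill the larger 3x3 shape
print(bigger)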

For more details about NumPy operations, please see the NumPy documentation.

So keep practicing the above examples on your own in your notebook, and comment if you face any issues.
