🔷 Introduction

Python becomes truly powerful when you start using external libraries. These are packages developed by the community that allow you to perform complex tasks without writing everything from scratch.

From making HTTP requests to analyzing data or scraping websites, external libraries are essential for real-world development.

In this lesson, you will learn:

✔ How to install and manage libraries using pip
✔ How to use virtual environments (venv)
✔ How to work with popular libraries: requests, BeautifulSoup, and pandas

🟩 1. Installing Libraries with pip

🔹 Install a package

pip install requests

🔹 Install a specific version

pip install requests==2.31.0

🔹 List installed packages

pip list

🔹 Uninstall a package

pip uninstall requests
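Two more subcommands worth knowing. They are run here against pip itself, so they work even in a brand-new environment:

```shell
pip show pip    # metadata for one installed package: version, location, dependencies
pip freeze      # every installed package pinned as name==version, one per line
```

pip freeze output is exactly the format a requirements.txt file expects.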

🟩 2. Virtual Environments (venv)

A virtual environment isolates your project dependencies.

🔹 Create a virtual environment

python -m venv venv

🔹 Activate it

Windows:

venv\Scripts\activate

Linux / Mac:

source venv/bin/activate

🔹 Why use venv?

✔ Avoid version conflicts
✔ Keep projects independent
✔ Professional development practice
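Putting the commands above together, a typical per-project session looks like this (Linux/macOS paths shown):

```shell
python3 -m venv venv            # create the environment (use "python" on Windows)
source venv/bin/activate        # activate it (Windows: venv\Scripts\activate)
pip freeze > requirements.txt   # snapshot installed versions (empty on a fresh env)
deactivate                      # return to the system Python
```

While the environment is active, pip installs go into venv/ instead of your system Python.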

🟩 3. Using the requests Library

Used for making HTTP requests.

🔹 Example: GET request

import requests

response = requests.get("https://api.github.com")

print(response.status_code)
print(response.text)

🔹 JSON response

data = response.json()
print(data)
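requests can also build query strings for you via the params argument. A small sketch using a prepared request, which shows the final URL without sending anything over the network (the GitHub search endpoint here is just an illustration):

```python
import requests

# Prepare a GET request without sending it, to inspect the final URL.
req = requests.Request(
    "GET",
    "https://api.github.com/search/repositories",
    params={"q": "pandas"},
)
prepared = req.prepare()
print(prepared.url)  # https://api.github.com/search/repositories?q=pandas
```

In a real call you would pass params=... directly to requests.get(), and call response.raise_for_status() before using the body.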

🟩 4. Web Scraping with BeautifulSoup

Used to extract data from HTML pages.

🔹 Installation

pip install beautifulsoup4

🔹 Example

from bs4 import BeautifulSoup
import requests

html = requests.get("https://example.com").text

soup = BeautifulSoup(html, "html.parser")

print(soup.title.text)
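Beyond soup.title, find_all collects every matching element. This sketch parses an inline HTML string, so it runs without any network access:

```python
from bs4 import BeautifulSoup

html = """
<ul>
  <li class="book">Clean Code</li>
  <li class="book">Fluent Python</li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")
# class_ (with a trailing underscore) is used because "class" is a Python keyword
titles = [li.text for li in soup.find_all("li", class_="book")]
print(titles)  # ['Clean Code', 'Fluent Python']
```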

🟩 5. Data Handling with Pandas

Pandas is used for data analysis.

🔹 Installation

pip install pandas

🔹 Example

import pandas as pd

data = {
    "Name": ["Ali", "Sara"],
    "Age": [22, 25]
}

df = pd.DataFrame(data)
print(df)

🔹 Read CSV with pandas

df = pd.read_csv("data.csv")
print(df.head())
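Once data is in a DataFrame, summary statistics and filtering are one-liners. Continuing the small example above:

```python
import pandas as pd

df = pd.DataFrame({"Name": ["Ali", "Sara"], "Age": [22, 25]})

print(df["Age"].mean())          # 23.5 (column-wise statistic)
over_23 = df[df["Age"] > 23]     # boolean-mask filtering keeps matching rows
print(over_23["Name"].tolist())  # ['Sara']
```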

🟩 6. Best Practices

✔ Use virtual environments
✔ Pin versions (requirements.txt)
✔ Avoid installing unnecessary packages
✔ Read documentation
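A pinned requirements.txt is just a text file of name==version lines (the versions below are illustrative):

```
requests==2.31.0
beautifulsoup4==4.12.3
pandas==2.2.2
```

Generate it from a working environment with pip freeze > requirements.txt, and recreate the environment anywhere with pip install -r requirements.txt.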

🟧 7. Exercises

Exercise 1 — Install and use requests

Exercise 2 — Parse HTML title

Exercise 3 — Create a pandas DataFrame

Exercise 4 — Read CSV with pandas

Exercise 5 — Create a virtual environment

🟦 Conclusion

In this lesson, you learned how to:

✔ Install and manage external libraries
✔ Use virtual environments
✔ Make HTTP requests
✔ Scrape websites
✔ Work with data using pandas

These skills are essential for real-world Python development, and the mini project below puts all of them into practice.

🚀 Mini Project — Web Scraper + Data Analyzer (Books Website)

🔷 Project Idea

Build a Python program that:

  1. Scrapes book data from a website
  2. Extracts:
    • Title
    • Price
    • Rating
  3. Stores the data in a pandas DataFrame
  4. Saves it as a CSV file
  5. Performs simple analysis

🌐 Target Website (safe for practice)

👉 http://books.toscrape.com/

🧩 Part 1 — What You Will Learn

✔ Real web scraping workflow
✔ HTML parsing
✔ Data extraction
✔ Data cleaning
✔ Data analysis with pandas

🧪 Part 2 — Install Required Libraries

pip install requests beautifulsoup4 pandas

🧨 Part 3 — Full Solution
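Below is a sketch of one possible solution. The CSS classes used (product_pod, price_color, star-rating) reflect books.toscrape.com's markup at the time of writing; check the page source if the site changes.

```python
import requests
import pandas as pd
from bs4 import BeautifulSoup

# The site encodes ratings as a CSS class, e.g. <p class="star-rating Three">
RATING_WORDS = {"One": 1, "Two": 2, "Three": 3, "Four": 4, "Five": 5}

def parse_books(html):
    """Extract title, price, and rating from one listing page of HTML."""
    soup = BeautifulSoup(html, "html.parser")
    books = []
    for article in soup.find_all("article", class_="product_pod"):
        title = article.h3.a["title"]  # full title lives in the link's title attribute
        price_text = article.find("p", class_="price_color").text
        # Strip the currency symbol (and a common mis-decoding artifact) before converting
        price = float(price_text.replace("£", "").replace("Â", ""))
        rating_word = article.find("p", class_="star-rating")["class"][1]
        books.append({"Title": title, "Price": price,
                      "Rating": RATING_WORDS.get(rating_word)})
    return books

def main():
    html = requests.get("http://books.toscrape.com/", timeout=10).text
    df = pd.DataFrame(parse_books(html))   # step 3: build the DataFrame
    df.to_csv("books.csv", index=False)    # step 4: save as CSV
    print(df.head())
    print("Average price:", df["Price"].mean())        # step 5: simple analysis
    print("Five-star books:", (df["Rating"] == 5).sum())

# main()  # uncomment to run the scrape (requires network access)
```

Separating parse_books() from main() lets you test the parsing logic on saved HTML without hitting the network.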

๐Ÿ” Part 4 โ€” Explanation

๐Ÿง  What You Practiced

✔ HTTP requests
✔ HTML parsing
✔ Data extraction
✔ Data cleaning
✔ Data analysis
✔ File export

🎯 Bonus Challenges

👉 Extend the project:

  1. Scrape multiple pages
  2. Sort books by price
  3. Find cheapest book
  4. Filter books with rating ≥ 4
  5. Create a simple chart (matplotlib)
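For bonus challenge 5, a minimal charting sketch (the Agg backend renders straight to a file, so no display window is needed; the data here is made up):

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # render to a file instead of opening a window
import matplotlib.pyplot as plt

df = pd.DataFrame({"Title": ["Book A", "Book B", "Book C"],
                   "Price": [10.0, 25.5, 17.3]})

df.plot.bar(x="Title", y="Price", legend=False)  # pandas' built-in plotting accessor
plt.ylabel("Price (£)")
plt.tight_layout()
plt.savefig("prices.png")  # writes the chart next to the script
```

In the full project, replace the made-up DataFrame with the one built from the scraped data.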