Making Requests With asyncio in Python (78/100 Days of Python)

Martin Mirakyan
3 min read · Mar 20, 2023


Day 78 of the “100 Days of Python” blog post series covering async requests

Making network requests is a common task in many Python applications. However, the process can become time-consuming, especially when dealing with a large number of requests. This is where asyncio comes into play: asyncio is Python’s built-in library for asynchronous programming, and it can make network requests more efficient by allowing many requests to be in flight at once, in a non-blocking manner. In this tutorial, we will explore how to make network requests using asyncio in Python.

Benefits of Using asyncio for Network Requests

Asyncio provides several benefits over traditional synchronous programming when it comes to network requests:

  1. Non-blocking: Requests are made in a non-blocking manner, meaning the program can keep performing other work while waiting for a request to complete. This is especially useful when dealing with a large number of requests, as it reduces the total time needed to complete them (a minimal demonstration follows this list).
  2. Concurrency: Multiple requests can be in flight at the same time, rather than one after another, which improves the overall throughput of the application.
  3. Scalability: Because requests are non-blocking and concurrent, a single process can handle a large number of requests without becoming overwhelmed.
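
To see what “non-blocking” buys you in practice, here is a minimal, self-contained sketch. It uses asyncio.sleep to stand in for slow network calls (no real requests are made), so the delays and task names are purely illustrative:

import asyncio
import time


async def task(name, seconds):
    # asyncio.sleep yields control to the event loop, simulating
    # a slow network call without blocking other tasks
    await asyncio.sleep(seconds)
    print(f'{name} finished after {seconds}s')


async def main():
    start = time.perf_counter()
    # All three "requests" wait at the same time, so the total
    # runtime is roughly 2 seconds instead of 4.5
    await asyncio.gather(task('a', 2), task('b', 1), task('c', 1.5))
    print(f'Total: {time.perf_counter() - start:.1f}s')


asyncio.run(main())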

How to Make an HTTP Request With asyncio in Python

To make network requests using asyncio in Python, we need the asyncio and aiohttp libraries. The aiohttp library provides an asynchronous HTTP client and server implementation.
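
Note that aiohttp is a third-party package, so install it first (for example with pip):

pip install aiohttp

With aiohttp installed, a minimal request looks like this: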

import asyncio
import aiohttp


async def main():
    # A ClientSession manages connection pooling and should be
    # created once and reused across requests
    async with aiohttp.ClientSession() as session:
        async with session.get('https://www.google.com') as response:
            html = await response.text()
            print(html)


asyncio.run(main())

In the main coroutine, we create an instance of aiohttp.ClientSession and use it to make a GET request to the URL with the session.get method. Both the session and the response are asynchronous context managers, so we use async with to ensure they are closed properly. We then await response.text() and print the HTML of the response.

To run the code, we pass the main coroutine to asyncio.run, which creates an event loop, runs the coroutine until it completes, and then closes the loop.
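
On Python versions older than 3.7, where asyncio.run is not available, the same program can be driven by managing the event loop directly:

loop = asyncio.get_event_loop()
loop.run_until_complete(main())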

Making Multiple Requests With asyncio Concurrently

Now let’s look at an example of making multiple requests concurrently using asyncio:

import asyncio
import aiohttp

urls = [
    'https://www.google.com',
    'https://www.facebook.com',
    'https://www.twitter.com',
]


async def fetch(session, url):
    # Reuse the shared session to fetch a single URL
    async with session.get(url) as response:
        return await response.text()


async def main():
    async with aiohttp.ClientSession() as session:
        # Schedule one task per URL so the requests run concurrently
        tasks = [asyncio.create_task(fetch(session, url)) for url in urls]
        responses = await asyncio.gather(*tasks)
        for response in responses:
            print(response)


asyncio.run(main())

In this example, we define a list of URLs that we want to fetch. In the main coroutine, we create one asyncio.Task per URL using asyncio.create_task, which schedules each fetch coroutine to run concurrently on the event loop.

We then use asyncio.gather to wait for all the tasks to complete. gather returns a list with the result of each task, in the order the tasks were passed in, regardless of which finished first. Finally, we loop through the responses and print the HTML of each one.
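
One thing this example glosses over is failure: if any single request raises an exception, asyncio.gather propagates it and the other results are lost. Below is a sketch of a more defensive variant; the timeout value and error handling shown are illustrative assumptions, not part of the original example:

import asyncio
import aiohttp

urls = [
    'https://www.google.com',
    'https://www.facebook.com',
    'https://www.twitter.com',
]


async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()


async def main():
    # Give up on any single request after 10 seconds (illustrative value)
    timeout = aiohttp.ClientTimeout(total=10)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        tasks = [asyncio.create_task(fetch(session, url)) for url in urls]
        # return_exceptions=True collects errors as results instead of
        # raising, so one failed URL does not discard the others
        results = await asyncio.gather(*tasks, return_exceptions=True)
        for url, result in zip(urls, results):
            if isinstance(result, Exception):
                print(f'{url} failed: {result!r}')
            else:
                print(f'{url} returned {len(result)} characters')


asyncio.run(main())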

Real-World Examples of When asyncio Requests Might Be Helpful

There are several real-world scenarios where asyncio can be helpful when making network requests:

  1. Web scraping: Scraping data from websites often involves making a large number of requests. Asyncio makes this process more efficient by letting the requests run concurrently (see the sketch after this list).
  2. API requests: Retrieving all the necessary data from an API often requires multiple requests. Asyncio improves the efficiency of this process by issuing the requests concurrently.
  3. Distributed systems: Working with distributed systems often means making network requests to many different nodes; asyncio lets those calls proceed concurrently instead of one node at a time.
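
In scenarios like these you usually don’t want hundreds of connections open at once. A common pattern is to bound the concurrency with an asyncio.Semaphore; in the sketch below, the URL list and the limit of 10 concurrent requests are assumptions made for illustration:

import asyncio
import aiohttp

# Hypothetical list of many pages to fetch
urls = [f'https://example.com/page/{i}' for i in range(100)]


async def fetch(session, semaphore, url):
    # The semaphore allows at most 10 requests to be in flight at once
    async with semaphore:
        async with session.get(url) as response:
            return await response.text()


async def main():
    semaphore = asyncio.Semaphore(10)
    async with aiohttp.ClientSession() as session:
        tasks = [asyncio.create_task(fetch(session, semaphore, url)) for url in urls]
        pages = await asyncio.gather(*tasks)
        print(f'Fetched {len(pages)} pages')


asyncio.run(main())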

What’s next?
