FastAPI lets you build these APIs in plain Python, which makes it super easy to pick up if you already know the language.
Now, when we talk about performance benchmarks, what does that mean? Well, it’s basically a way of measuring two things: how many requests per second your API can handle (throughput) and how long each request takes (latency). And FastAPI is known for being one of the fastest Python frameworks out there!
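Before measuring anything, it helps to see how those two numbers relate. A rough rule of thumb (Little’s law) says throughput is roughly concurrency divided by average latency. Here is a back-of-the-envelope sketch with illustrative numbers, not measurements:

```python
# Rough throughput estimate via Little's law: throughput ≈ concurrency / latency.
# The numbers below are illustrative, not measured.
def estimated_rps(concurrent_connections: int, avg_latency_seconds: float) -> float:
    """Approximate requests per second a server sustains at a given concurrency."""
    return concurrent_connections / avg_latency_seconds

# e.g. 400 open connections with a 30 ms average latency:
print(estimated_rps(400, 0.030))  # ≈ 13333 requests/second
```

This is only an approximation, but it gives you a sanity check for the benchmark numbers you’ll see later.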
So let me give you an example to help illustrate this point. Let’s say we have an API that returns some information about movies based on a user’s search query. With FastAPI, we can create this API in just a few lines of code:
```python
from fastapi import FastAPI
import pandas as pd

app = FastAPI()

# Load the movie data once at startup; re-reading the CSV on every
# request would dominate any benchmark we run later.
df = pd.read_csv("data/movies.csv")

@app.get("/movies")
def get_movie(query: str):
    # Case-insensitive substring match on the title column
    filtered = df[df["title"].str.contains(query, case=False, na=False)]
    # Return plain Python objects; FastAPI serializes them to JSON itself
    return filtered.to_dict(orient="records")
```
In this example, we’re using the `FastAPI` class to create an instance of our API (called `app`) and registering a route handler that filters the movie data based on the user’s search query. The `query: str` type annotation is what lets FastAPI validate the parameter and document it automatically.
Now let’s say we want to test how quickly FastAPI can respond to these requests. We could use a tool like `wrk` or `ab` to send multiple concurrent requests to our API and measure the response time. Here’s an example using `wrk`:
```bash
# 10 threads (-t) and 400 open connections (-c); wrk runs for
# 10 seconds by default (change that with -d). Adjust these values
# to match the load you want to simulate.
wrk -t10 -c400 "http://localhost:8000/movies?query=the"
```
This command hammers our API with 400 concurrent connections, all searching for `the`. The `-t10` option runs the load generator with 10 worker threads, `-c400` keeps 400 connections open, and by default `wrk` runs for 10 seconds (use `-d` to change the duration).
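If you don’t have `wrk` installed, you can sketch a rough latency measurement with nothing but the standard library. This hypothetical helper times an arbitrary callable; pointing it at a real HTTP request (for example via `urllib.request.urlopen`) is left to the reader:

```python
import time
from statistics import mean, stdev

def measure(fn, runs: int = 100):
    """Call fn `runs` times and return (avg, stdev, max) latency in seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return mean(samples), stdev(samples), max(samples)

# Timing a stand-in workload rather than a live HTTP request:
avg, sd, worst = measure(lambda: sum(range(1000)))
print(f"avg={avg:.6f}s stdev={sd:.6f}s max={worst:.6f}s")
```

It won’t generate concurrent load the way `wrk` does, but it’s enough to spot gross latency regressions.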
So what kind of performance can we expect from FastAPI? Its own documentation puts it like this:
> “Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.”
That’s pretty impressive! And if you want independent numbers, benchmark comparisons such as the TechEmpower Framework Benchmarks consistently place FastAPI ahead of Django and Flask. You can also run your own comparison: point the same `wrk` command at your FastAPI server and at an equivalent endpoint in another framework, then compare the output, which reports average latency, its standard deviation, maximum latency, requests per second, and transfer rate:

```bash
wrk -t10 -c400 "http://localhost:8000/movies?query=the"
```
In runs like that, FastAPI has been able to handle over 14,000 requests per second with an average latency of just 3.7 milliseconds. Your exact numbers will vary with your hardware and with what the endpoint actually does, but that’s pretty damn fast if you ask me.