Stockton SEO Agency Company Articles
A Rust API

Part of SEO is ensuring you have a fast-loading website. Part of what we do here is serve up websites. So today I wanted to review a simple API built using Rust, which should give fast response times to help a site load faster. Of course, an API by itself doesn't help all that much for SEO, unless you properly implement a frontend that knows how to handle the situation.

(Side note: you can build websites using Rust and a templating system (similar to Django's Jinja). I've done this before, and had a simple solution deployed for a short time)

I've had this idea for a while now: if you don't use Django, you might as well use Rust to build an API. So the other day when I tested out FastAPI, I felt like this may be true. Django gives a developer so many "batteries" to jumpstart a project that, in my view, it's a clear winner.

So I figured I would go ahead and test out building an API using Rust, to see if I was able to accomplish the task in a similar timeframe as using FastAPI.

When I used FastAPI there was a bit of a learning curve. Also, the data modeling doesn't feel as slick as Django's. And along with data modeling, you must create Pydantic models to match situations such as creating, updating, or listing a model. It is because of these specific hurdles that I felt like building in Rust would be a comparable task.

So I decided to use Actix Web, a Rust web framework, and plow forward building a really simple API. Another side note: I have tested out the Rocket framework in the past, which I somewhat liked.

Genesis

Not only is Genesis the name of the first book of our English bibles, but it is also a descriptive word coming from the Greek, meaning "origin, creation, generation."

So the origin of this mission starts with Rust's packaging tool: Cargo. From the command line:

$ cargo new rustapi
    Creating binary (application) `rustapi` package
$ cd rustapi

Then I add the Actix Web dependency in the Cargo.toml file:

$ echo 'actix-web = "4"' >> Cargo.toml

And this sets up the basic pieces of what we'll need.

The Datumbase

As the song goes, it's "all about that [base], 'bout that [base]." So we'll connect this to a SQLite Database, just like my FastAPI project was doing.

The Actix Web documentation does provide some info about database connectivity. They seem to recommend using Diesel, so we'll use their docs for help with our database interactions. First we'll add that dependency, and we'll throw in the recommended dotenvy crate for .env configuration as well:

$ echo 'diesel = { version = "2.2.0", features = ["sqlite", "returning_clauses_for_sqlite_3_35"] }' >> Cargo.toml
$ echo 'dotenvy = "0.15"' >> Cargo.toml

And we're going to need serde eventually, so we'll throw that in there as well:

$ echo 'serde = { version = "1.0.210", features = ["derive"] }' >> Cargo.toml
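After those commands, the [dependencies] section of Cargo.toml should look roughly like this (version numbers as of this writing):

```toml
[dependencies]
actix-web = "4"
diesel = { version = "2.2.0", features = ["sqlite", "returning_clauses_for_sqlite_3_35"] }
dotenvy = "0.15"
serde = { version = "1.0.210", features = ["derive"] }
```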

I've already installed the Diesel CLI tool, so I'll skip that step here. But next we specify our SQLite database in the .env file:

$ echo DATABASE_URL=./db.sqlite3 > .env
$ diesel setup

Note: in my case, this database already exists and has some data in it, so we won't be running any actual migrations here. But I would like the schema file that Diesel creates. So I start with this:

$ diesel migration generate oranges

Then I edit the up.sql file in the migration folder, and add my SQL:

CREATE TABLE oranges (id integer primary key autoincrement, species varchar, origin varchar);
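The matching down.sql in the same migration folder, which Diesel uses to revert the migration, would simply drop the table:

```sql
DROP TABLE oranges;
```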

After that, I can use Diesel again to get the schema, and store that in my source:

$ diesel print-schema >> src/schema.rs

Here is what that schema looks like:

diesel::table! {
    oranges (id) {
        id -> Integer,
        species -> Nullable<Text>,
        origin -> Nullable<Text>,
    }
}

As generated, the "id" column also came out "Nullable", so I edited the schema (shown above post-edit) so the "id" wouldn't have that attribute within my Rust code. This made some things easier down the line.

At this point, I need to define some database connectivity, which I have done using the code here and stuffed in the file src/db.rs.
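For reference, a minimal src/db.rs along the lines of Diesel's getting-started guide would look something like this (my actual file may differ slightly):

```rust
use diesel::prelude::*;
use diesel::sqlite::SqliteConnection;
use dotenvy::dotenv;
use std::env;

// Open a new SQLite connection using the DATABASE_URL from .env.
// Note: this creates a fresh connection per call; fine for a demo,
// but a connection pool would be better in production.
pub fn establish_connection() -> SqliteConnection {
    dotenv().ok();
    let database_url = env::var("DATABASE_URL")
        .expect("DATABASE_URL must be set");
    SqliteConnection::establish(&database_url)
        .unwrap_or_else(|_| panic!("Error connecting to {}", database_url))
}
```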

Now to create the model in Rust. In the file src/models.rs I actually created a few pieces, similar to what you would do with FastAPI:

use diesel::prelude::*;
use crate::schema::oranges;
use serde::{Serialize, Deserialize};


#[derive(Deserialize)]
pub struct Orange {
    pub species: String,
    pub origin: String,
}


#[derive(Serialize)]
#[derive(Queryable, Selectable)]
#[diesel(table_name = crate::schema::oranges)]
#[diesel(check_for_backend(diesel::sqlite::Sqlite))]
pub struct SqlOrange {
    pub id: i32,
    pub species: Option<String>,
    pub origin: Option<String>,
}


#[derive(Insertable)]
#[diesel(table_name = oranges)]
pub struct NewOrange<'a> {
    pub species: &'a str,
    pub origin: &'a str,
}

I later use "Orange" as a parameter for the Actix route, while the struct named "SqlOrange" represents the data that actually comes from the database. And "NewOrange" is used by Diesel for inserting data.

The End...points

When I built a simple API with FastAPI, I had 3 endpoints:

The first endpoint listed my types of oranges (with the option for offset/limit), the second one pulled just one specific orange, and the third one created a new orange.

So we'll build the same endpoints here. First I need the main endpoint to simply list out the oranges from the database. To make things simple, my src/main.rs file will look like the following. Note, this isn't using any particular database session or persistent connection. It's just a quick implementation:

use actix_web::{web, get, post, App, HttpResponse, HttpServer, Responder};
use schema::oranges::dsl::*;
use diesel::prelude::*;
pub mod db;
pub mod schema;
pub mod models;
use serde::Deserialize;


#[derive(Deserialize)]
struct Pagination {
    offset: Option<i64>,
    limit: Option<i64>,
}

#[get("/")]
async fn index(info: web::Query<Pagination>) -> impl Responder {
    let offset = info.offset.unwrap_or(0);
    let limit = info.limit.unwrap_or(100);
    let conn = &mut db::establish_connection();
    let results = oranges
        .offset(offset)
        .limit(limit)
        .select(models::SqlOrange::as_select())
        .load(conn)
        .expect("Error loading oranges");
    web::Json(results)
}


#[derive(Deserialize)]
struct GetOrange { id: i32 }

#[get("/oranges/{id}")]
async fn orange(data: web::Path<GetOrange>) -> impl Responder {
    let conn = &mut db::establish_connection();
    match oranges.find(data.id).select(models::SqlOrange::as_select()).first(conn) {
        Ok(orange) => web::Json(orange),
        // If the id isn't found, fall back to a placeholder record
        // (a fuller API would return a 404 here instead).
        Err(_) => web::Json(models::SqlOrange{id:0,species:None,origin:None}),
    }
}


#[post("/create")]
async fn orange_create(data: web::Json<models::Orange>) -> impl Responder {
    use schema::oranges;
    let conn = &mut db::establish_connection();
    let new_orange = models::NewOrange {
        species: &data.species,
        origin: &data.origin,
    };
    diesel::insert_into(oranges::table)
        .values(&new_orange)
        .execute(conn)
        .expect("Error saving new orange");
    HttpResponse::Ok()
}


#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| {
        App::new()
            .service(index)
            .service(orange)
            .service(orange_create)
    }).bind(("127.0.0.1", 8080))?.run().await
}

With these 3 registered endpoints I get the same functionality I had with FastAPI.
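With the server started via cargo run, the endpoints can be exercised with curl (the example data here is hypothetical):

```shell
# List oranges, with optional pagination
curl "http://127.0.0.1:8080/?offset=0&limit=10"

# Fetch a single orange by id
curl "http://127.0.0.1:8080/oranges/1"

# Create a new orange
curl -X POST "http://127.0.0.1:8080/create" \
     -H "Content-Type: application/json" \
     -d '{"species": "Valencia", "origin": "Spain"}'
```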

Conclusion

Interestingly, this didn't feel like it took much more time than developing with FastAPI, although it probably was a little longer. Of course, there was a learning curve with both frameworks for me, so there is that to consider. And I suspect that over time FastAPI would be the quicker development solution, since with Rust you must get the types right. Getting the correct types and traits set up took some additional time to figure out.

Now let's look at some pros and cons.

Pros

Fast response times
Development time comparable to FastAPI

Cons

A learning curve, as with any framework
Strict typing: getting the correct types and traits set up takes extra time

Written by Jon


Hi, I'm Jon. My family and I live in Utah, where we enjoy sports and raising our chickens. I have a bachelor's degree in Software Development, various computer & project management certifications, and I've worked for web hosting and other dev/online companies for over a decade.