So, today I made some serious progress with my Forum API. I’m honestly super proud of myself. While most of the code is a modified version of the example code from Diesel’s getting-started guide, I was able to come up with some small stuff on my own.

I think the biggest takeaway today is that I learned a lot more about how Rust programs are structured into modules: when to use paths like self and super, and so on.
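
As a quick illustration of what I mean (a toy example, not code from the forum project), here’s how self and super behave in module paths:

mod models {
    pub struct Thread {
        pub title: String,
    }
}

mod api {
    // `super` is the parent module (here, the crate root), so this path
    // reaches over to the sibling `models` module.
    use super::models::Thread;
    // `self` is the current module, so this pulls in our own submodule.
    use self::util::default_title;

    mod util {
        pub fn default_title() -> String {
            String::from("untitled")
        }
    }

    pub fn demo_thread() -> Thread {
        Thread { title: default_title() }
    }
}

fn main() {
    println!("{}", api::demo_thread().title);
}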

Prior to today, there wasn’t really much in the project. I brought in the crates I wanted in my Cargo.toml file, but beyond that I hadn’t done much.

I’ve established a lot today: primarily the DB connection, basic structs, and some utility functions for starting to read and write data.

Here’s the code; I’ll explain each piece.

#[macro_use]
extern crate diesel;
extern crate dotenv;

mod schema;
mod models;

use dotenv::dotenv;
use std::env;
use diesel::prelude::*;
use diesel::pg::PgConnection;
use std::io::{stdin, Read};

For this bit, we are bringing in the crates we need to start. For Diesel, we’re using some of its macros below, so we need the #[macro_use] line right above the import.

After the crates, I’m declaring my local modules (schema and models), and then below that pulling in the specific functions, types, and traits that I need from each crate.
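
I’m not going to paste schema.rs and models.rs in full here, but based on how they’re used below, they look roughly like this (treat the exact column types, fields, and derives as a sketch; schema.rs is normally generated for you by the Diesel CLI):

// schema.rs (generated by the Diesel CLI) -- a sketch
table! {
    threads (id) {
        id -> Int4,
        title -> Varchar,
        body -> Text,
        published -> Bool,
    }
}

// models.rs -- a sketch of the structs the functions below rely on
use super::schema::threads;

#[derive(Queryable)]
pub struct Thread {
    pub id: i32,
    pub title: String,
    pub body: String,
    pub published: bool,
}

#[derive(Insertable)]
#[table_name = "threads"]
pub struct NewThread<'a> {
    pub title: &'a str,
    pub body: &'a str,
}

OK, on to the actual functions in main.rs.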

fn establish_connection() -> PgConnection {
    dotenv().ok();
    
    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    
    PgConnection::establish(&database_url).expect(&format!("Error connecting to {}", database_url))
}

This function is pretty simple. It loads the .env file with dotenv, reads DATABASE_URL, and returns a connection to our Postgres DB (provided by the docker-compose file).

fn get_threads() -> Vec<models::Thread> {
    use schema::threads::dsl::*;

    let connection = establish_connection();
    let results = threads.filter(published.eq(true))
        .limit(5)
        .load::<models::Thread>(&connection)
        .expect("Error loading posts");
    results
}

With this function, we’re grabbing the threads. We establish a connection, query the threads table (the threads and published names come from the use statement at the top of the fn), filter down to only published threads, and take the first five.

fn create_thread<'a>(conn: &PgConnection, title: &'a str, body: &'a str) -> models::Thread {
    use schema::threads;

    let new_thread = models::NewThread {
        title: title,
        body: body
    };

    diesel::insert_into(threads::table)
        .values(&new_thread)
        .get_result(conn)
        .expect("Error saving new thread")
}

This function accepts a connection to our Postgres DB, a title, and a body. It uses the params to create a NewThread struct, inserts that into our threads table, and returns the saved Thread (that’s what get_result gives back).

fn publish_thread(id: i32) {
    use schema::threads::dsl::{threads, published};

    let connection = establish_connection();

    let thread = diesel::update(threads.find(id))
        .set(published.eq(true))
        .get_result::<models::Thread>(&connection)
        .expect(&format!("Unable to find post {}", id));

    println!("Published thread {}", thread.title);
}

This function is pretty simple too. It accepts an ID (the ID of the thread you want to publish), then sets its published value to true in the DB.

fn cli_generate_thread() {
    let connection = establish_connection();

    println!("What would you like your title to be?");
    let mut title = String::new();
    stdin().read_line(&mut title).unwrap();
    let title = &title[..(title.len() - 1)]; // Drop the newline
    println!("\nOk! Let's write {} (Press {} when finished)\n", title, EOF);
    let mut body = String::new();
    stdin().read_to_string(&mut body).unwrap();

    let thread = create_thread(&connection, title, &body);
    println!("\nSaved  draft {} with id {}", title, thread.id);
}

This is a function I’ll more than likely delete. Because I haven’t set up any endpoints yet, I needed a way to generate threads to test functionality like create_thread, get_threads, and publish_thread. That’s the purpose of this function.

It establishes a connection, asks for user input for the title, then asks for user input for the body. The EOF constant comes from down below; it’s just a string naming the key combo that sends end-of-file on each platform, which is how read_to_string knows we’re done typing. Here are those EOF constants:

#[cfg(not(windows))]
const EOF: &'static str = "CTRL+D";

#[cfg(windows)]
const EOF: &'static str = "CTRL+Z";

And finally, here’s the main function:

fn main() {
    let threads = get_threads();

    for thread in threads {
        println!("Title: {}", thread.title);
        println!("\n-------");
        println!("Body: {}", thread.body);
    }
}

This last function is obviously the main one, and for now it’s being used to get the threads and display them. As I was coding I kept changing what was in here: first it called cli_generate_thread, then publish_thread, and now get_threads.
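
For example, a version of main that exercised the create-and-publish flow looked roughly like this (a sketch; the hard-coded id is just whatever id the draft ended up with):

fn main() {
    // Create a draft thread from the terminal...
    cli_generate_thread();

    // ...then publish it, using the id that cli_generate_thread printed out.
    publish_thread(1);
}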

And that’s quite a lot of progress. This is literally the majority of the work for now. The next part is going to be setting up some endpoints (a REST API via actix-web), and then hooking each of these functions up to one of those endpoints.
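
None of this exists yet, but the rough shape of an actix-web endpoint wrapping get_threads would be something like the sketch below. It assumes a recent actix-web, and that models::Thread also derives serde::Serialize so it can be returned as JSON; the route and port are placeholders.

use actix_web::{web, App, HttpResponse, HttpServer, Responder};

// GET /threads -> a JSON list of published threads.
// get_threads() does blocking Diesel I/O, so a real handler would wrap it in
// web::block (or use an r2d2 connection pool) rather than calling it directly.
async fn threads_index() -> impl Responder {
    let threads = get_threads();
    HttpResponse::Ok().json(threads)
}

// This would replace the current main once the API work starts.
#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| App::new().route("/threads", web::get().to(threads_index)))
        .bind(("127.0.0.1", 8080))?
        .run()
        .await
}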

All the final code can be found here: [Source Hut](https://git.sr.ht/~sneakycrow/rust-forum-api)

#rust

So, just like a lot of people I know, I have a bad habit of starting projects without finishing others.

After the recent release of Mozilla’s file-sharing service Send, I felt a little defeated about my own file-sharing software. So I’ve decided to switch back to a similar project I’ve wanted to do for a while: a Rust forum API.

When I was younger, I used to go on forums a lot. Proboards was my jam; it’s arguably one of my entry points into web software. Forums are something I’ve always wanted to make myself. So, in that spirit, I’m starting a new project to help grow my Rust knowledge: a forum API.

It will also include a frontend in React, but the focus of the project will be the API. I’ll be using actix_web and diesel with a Postgres DB. It’s going to be a fun project (I think).

I’ll keep semi-regular updates here. I’ve also added issue tracking, and I’m trying to be good about closing at least one issue per week.

Repo Issue Tracking

GLHF :)

#rust