rockyourcode: Issue #47
Hello 👋! Thanks for subscribing.
Here are my latest articles:
Nextjs 13 With Prisma (MySQL) Planetscale and Vercel
In the last two weeks I’ve been hacking away on a Next.js 13 application to learn more about React server components.
I have used Prisma with a local Postgres database for development. I normally spin up my databases via Docker.
When it came to deployment, it was easy enough to throw my app on Vercel, but I had trouble finding a good free offering for Postgres.
In the past, I have used Heroku for such toy projects, but they’ve stopped their free tier. There have been new companies stepping up, for example, Railway, Render.com, Fly.io or Supabase.
Unfortunately, most of them have also placed limits on their free usage. Take Supabase: their free Postgres instance is paused after 90 days if the database doesn’t receive any requests. For a hobby project, that was not a good option for me. I don’t want to resume the Postgres database manually via the Supabase web interface.
I had high hopes for Fly.io. They offer a usage-based plan with a generous free threshold.
Turns out that you need a dedicated IPv4 address for the database for it to play nicely with Vercel’s edge functions.
This costs USD $2/month, which I don’t want to pay for a pet project. Sorry, folks.
In the end, I switched my database to MySQL and used Planetscale as my production database. The experience was seamless.
In the following article I’ll explain how I would set up a new project with Next.js and MySQL.
This guide assumes that you have Node.js, Docker and docker compose installed and working.
Local Development
- Create a new Next.js application
npx create-next-app
Add prisma:
npm i -D prisma
- Set up MySQL via docker compose & Docker
Create a folder for the database:
mkdir db
Add a Dockerfile to it (db/Dockerfile):
FROM bitnami/mysql:8.0.34
ADD local-init.sql /docker-entrypoint-initdb.d
We need an initialization script, db/local-init.sql:
CREATE USER 'user'@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'user'@'%';
Why?
We want to reach the database on localhost from our machine. By default, MySQL and MariaDB reject connections from anywhere other than the local machine, and the Docker container runs on a separate network. To connect from your local machine, you need the % wildcard as the host part of the user.
Now we need a docker-compose.yml in the root folder of our project:
services:
  db:
    build:
      context: ./db
      dockerfile: Dockerfile
    ports:
      - 33306:3306
    environment:
      - MYSQL_DATABASE=<database name>
    volumes:
      - db-data:/var/lib/mysql:delegated
volumes:
  db-data:
Replace <database name> with the name you want for your project.
Now you should be able to use the database for local development via Docker and docker compose:
docker compose up -d
The connection string for MySQL would be:
DATABASE_URL="mysql://user:password@localhost:33306/<database-name>?schema=public"
(Replace the database name.)
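The pieces of the connection string map directly onto the docker-compose values above. As a quick sanity check (not part of the original setup; `mydb` is a placeholder database name), you can parse the string with Node’s built-in URL class:

```javascript
// Parse the local connection string and check each component.
// 'mydb' stands in for your actual database name.
const url = new URL('mysql://user:password@localhost:33306/mydb')

console.log(url.username) // user
console.log(url.hostname) // localhost
console.log(url.port)     // 33306
console.log(url.pathname) // /mydb
```

The same check works later for the Planetscale connection string.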
Prisma Schema
Create a database schema. If you don’t have one yet, running npx prisma init scaffolds a prisma/schema.prisma file and an .env file for the connection string.
Planetscale differs from other databases in ways we need to account for. For instance, you cannot use foreign key constraints, so Prisma has to emulate relations.
Here is what an example database schema looks like:
datasource db {
  provider     = "mysql"
  url          = env("DATABASE_URL")
  relationMode = "prisma" // use this to emulate relations
}

generator client {
  provider = "prisma-client-js"
}

model User {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now())
  email     String   @unique
  name      String?
  role      Role     @default(USER)
  posts     Post[]
}

model Post {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
  published Boolean  @default(false)
  title     String   @db.VarChar(255)
  author    User     @relation(fields: [authorId], references: [id])
  authorId  Int

  @@unique([authorId, title]) // important
  @@index([authorId]) // important: index the relation scalar field
}

enum Role {
  USER
  ADMIN
}
It’s important to add these indexes yourself: with relationMode = "prisma" there are no foreign keys in the database, so Prisma won’t create the index for you implicitly.
Deployment
Planetscale
Create a new database and click the “Get connection strings” button.
There’s an option for Prisma which you can use.
You need one admin account to run the migrations and one read-write account for the actual application. I simply used the “main” branch for my production database.
Planetscale offers branching strategies (like git) for your database. I didn’t need those, as my development only happens locally, and I need Planetscale only for the final production deployment.
For that, I use a hacky technique.
I temporarily change the connection string in my local .env file to the admin connection string of my Planetscale DB:
DATABASE_URL="mysql://xxxxx:xxxxxx@aws.connect.psdb.cloud/<database-name>?schema=public&sslaccept=strict"
Please replace this with your own admin connection string. Important: add sslaccept=strict.
Then run migrations:
npx prisma migrate deploy
For an alternative solution, you can read the blog post by shadcn.
Vercel
If you use GitHub or GitLab, it’s easy to deploy your Next.js application to Vercel.
You can import the project and are good to go.
As we’re using Prisma, there’s a little workaround needed for caching the Prisma client. Add the following postinstall script to package.json:
{
  "scripts": {
    "postinstall": "prisma generate"
  }
}
Add the database connection string for your normal read/write account to the Vercel environment variables:
DATABASE_URL="mysql://xxxxx:xxxxxx@aws.connect.psdb.cloud/<database-name>?sslaccept=strict"
Deploy and done.
Recap
Using Planetscale with Vercel is a dream combo. The integration is very easy. I was pleasantly surprised by Planetscale.
Especially in comparison to Fly.io, it was very straightforward to spin up a database and connect it to external services (my local machine and Vercel). They don’t even require a credit card for their hobby plan.
Working With C# on macOS M1 (Helix, Rider)
I want to get into C# development, because my future project will likely be in C#.
I wanted to use Rider, a JetBrains product. I am using IntelliJ (the flagship JetBrains IDE) at my day job, and I find it more powerful than VS Code.
I had some difficulty getting Rider working on macOS.
My alternative editor right now is Helix, a terminal editor.
There were also some small hiccups with setting up language server support for Helix.
So here’s how I solved my problems with C# on macOS.
Install .NET & Mono
I normally use Homebrew to install my dependencies. Unfortunately, this doesn’t seem to play well with Rider.
So, instead, I manually downloaded the official installers for both .NET and Mono.
If you know how to get Rider working with Homebrew, please let me know.
Rider
Follow this guide to set up your toolchain.
Rider seems to have problems finding the installation folder, so head over to the Rider settings under “Build, Execution, Deployment > Toolset and Build”.
You can find out your installation directory by running the following commands in your terminal:
which dotnet
which mono
Helix
The documentation has a guide on how to add OmniSharp as the language server for C#.
But first we need to install OmniSharp.
Find the newest release on the official website.
I install my user packages into ~/.local/bin:
# download & extract archive into its own folder
mkdir -p ~/.local/bin/omnisharp
curl -sSL https://github.com/OmniSharp/omnisharp-roslyn/releases/download/v1.39.8/omnisharp-osx-arm64-net6.0.tar.gz | tar xvzf - -C ~/.local/bin/omnisharp/
# make sure that we have the correct permissions
chmod 744 ~/.local/bin/omnisharp/*
Now, add the language server in ~/.config/helix/languages.toml:
[[language]]
name = "c-sharp"
language-server = { command = "dotnet", args = [ "/Users/me/.local/bin/omnisharp/OmniSharp.dll", "--languageserver" ] }
You need to use the absolute path, as Helix can’t yet expand ~ to your home folder.
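Since Helix needs the absolute path, you can print yours and paste it into languages.toml. A small convenience, assuming the archive was extracted to ~/.local/bin/omnisharp and the file is named OmniSharp.dll:

```shell
# Print the absolute OmniSharp path to paste into languages.toml
echo "$HOME/.local/bin/omnisharp/OmniSharp.dll"
```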
Recap
The biggest hurdle was the Rider setup: I had to sift through the JetBrains support forum to find out why Rider couldn’t find mono or dotnet.
There were some issues for Linux users, but nothing about homebrew.
I think the underlying issue is that Rider just doesn’t work well with installation folders that aren’t “standard”.
And it doesn’t seem to recognize packages installed via Homebrew into /opt/homebrew/bin.
Update Gulp to Use ESM
I built my personal website in Eleventy a few years ago.
For this, I followed a course called Learn Eleventy From Scratch. A couple of years ago, Eleventy made a big splash as a simple JavaScript framework to build static websites.
Nowadays, Astro is probably more popular, but I have been happy with Eleventy so far.
My Eleventy site has a Gulp pipeline to optimize images, get fonts, and transform sass.
I used CommonJS syntax:
const { src, dest } = require('gulp')
const images = () => {
// more code
}
module.exports = images
One of the packages in the pipeline is gulp-imagemin, which helps to minify PNG, JPG and other images.
The author has made the package pure ESM in its latest version, so you can no longer use it in a gulp pipeline that uses CommonJS.
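If you’d rather stay on CommonJS, the usual escape hatch is a dynamic import(), which is allowed in CJS files. A sketch, using the built-in node:path as a stand-in for the pure-ESM package:

```javascript
// CommonJS cannot require() a pure-ESM package, but it can load one
// lazily via dynamic import(). node:path stands in here for the real
// pure-ESM package (gulp-imagemin).
const loadLazily = async () => {
  const mod = await import('node:path')
  return typeof mod.join // named exports arrive as module properties
}

loadLazily().then(console.log) // function
```

This works, but it forces every consuming task to become async, which is why moving the whole pipeline to ESM is the cleaner fix.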
In the following blog post, I’ll write down how to migrate the gulp pipeline to ESM.
This article follows the excellent guide Moving gulpfile from CommonJS (CJS) to ECMAScript Modules (ESM).
How I updated my gulp pipeline to ESM
1. Update packages
I updated my packages to the latest versions. Here is the excerpt from package.json:
{
  "devDependencies": {
    "get-google-fonts": "^1.2.2",
    "gulp": "^4.0.2",
    "gulp-clean-css": "^4.3.0",
    "gulp-imagemin": "^8.0.0",
    "gulp-sass": "^5.1.0",
    "sass": "^1.64.1"
  }
}
2. Update gulp to use ESM
I changed the file endings of all gulp files to .mjs:
- gulp-tasks/fonts.mjs
- gulp-tasks/images.mjs
- gulp-tasks/sass.mjs
- gulpfile.mjs
3. Change syntax
Now comes the tedious part where you have to adjust how to import and export your modules correctly.
gulpfile.mjs:
import gulp from 'gulp'
const { parallel, watch: gulpWatch } = gulp
// Pull in each task
import fonts from './gulp-tasks/fonts.mjs'
import images from './gulp-tasks/images.mjs'
import sass from './gulp-tasks/sass.mjs'
// Set each directory and contents that we want to watch and
// assign the relevant task. `ignoreInitial` set to true will
// prevent the task being run when we run `gulp watch`, but it
// will run when a file changes.
const watcher = () => {
  gulpWatch('./src/images/**/*', { ignoreInitial: true }, images)
  gulpWatch('./src/scss/**/*.scss', { ignoreInitial: true }, sass)
}
// The default (if someone just runs `gulp`) is to run each task in parallel
export default parallel(fonts, images, sass)
// This is our watcher task that instructs gulp to watch directories and
// act accordingly
export const watch = watcher
gulp-tasks/fonts.mjs:
import GetGoogleFonts from 'get-google-fonts'
const fonts = async () => {
  // Setup of the library instance by setting where we want
  // the output to go. CSS is relative to output font directory
  const instance = new GetGoogleFonts({
    outputDir: './dist/fonts',
    cssFile: './fonts.css',
  })
  // Grabs fonts and CSS from google and puts in the dist folder
  const result = await instance.download(
    'https://fonts.googleapis.com/css2?family=Literata:ital,wght@0,400;0,700;1,400&family=Red+Hat+Display:wght@400;900'
  )
  return result
}
export default fonts
gulp-tasks/images.mjs:
import gulp from 'gulp'
import imagemin, { mozjpeg, optipng } from 'gulp-imagemin'
// Grabs all images, runs them through imagemin
// and plops them in the dist folder
const images = () => {
  // We have specific configs for jpeg and png files to try
  // to really pull down asset sizes
  return gulp
    .src('./src/assets/images/**/*')
    .pipe(
      imagemin(
        [
          mozjpeg({ quality: 60, progressive: true }),
          optipng({ optimizationLevel: 5, interlaced: null }),
        ],
        {
          silent: true,
        }
      )
    )
    .pipe(gulp.dest('./dist/assets/images'))
}
export default images
gulp-tasks/sass.mjs:
import gulp from 'gulp'
import cleanCSS from 'gulp-clean-css'
import * as dartSass from 'sass'
import gulpSass from 'gulp-sass'
const sassProcessor = gulpSass(dartSass)
// Flags whether we compress the output etc
const __prod__ = process.env.NODE_ENV === 'production'
// An array of outputs that should be sent over to includes
const criticalStyles = [
  'critical.scss',
  'home.scss',
  'page.scss',
  'project.scss',
]
// Takes the arguments passed by `dest` and determines where the output file goes
const calculateOutput = ({ history }) => {
  // By default, we want a CSS file in our dist directory, so the
  // HTML can grab it with a <link />
  let response = './dist/css'
  // Get everything after the last slash
  const sourceFileName = /[^/]*$/.exec(history[0])[0]
  // If this is critical CSS though, we want it to go
  // to the css directory, so nunjucks can include it
  // directly in a <style>
  if (criticalStyles.includes(sourceFileName)) {
    response = './src/css'
  }
  return response
}
// The main Sass method grabs all root Sass files,
// processes them, then sends them to the output calculator
const sass = () => {
  return gulp
    .src('./src/scss/*.scss')
    .pipe(sassProcessor().on('error', sassProcessor.logError))
    .pipe(
      cleanCSS(
        __prod__
          ? {
              level: 2,
            }
          : {}
      )
    )
    .pipe(gulp.dest(calculateOutput, { sourcemaps: !__prod__ }))
}
export default sass
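One nice side effect of calculateOutput being a plain function is that its decision logic is easy to replay outside of gulp. This standalone sketch mirrors the regex and lookup from above:

```javascript
// Standalone version of calculateOutput's decision logic:
// critical styles go to src/css, everything else to dist/css.
const criticalStyles = ['critical.scss', 'home.scss', 'page.scss', 'project.scss']

const outputFor = (filePath) => {
  // Get everything after the last slash, i.e. the file name
  const sourceFileName = /[^/]*$/.exec(filePath)[0]
  return criticalStyles.includes(sourceFileName) ? './src/css' : './dist/css'
}

console.log(outputFor('./src/scss/critical.scss')) // ./src/css
console.log(outputFor('./src/scss/global.scss'))   // ./dist/css
```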
Recap
While it’s a boring task, moving gulp to ESM is actually pretty straightforward, and it works fine with Eleventy as a build pipeline.